Sample records for normaler verlauf und

  1. B-cell lymphomas of the skin - pathogenesis, diagnostics, and treatment.

    PubMed

    Nicolay, Jan P; Wobser, Marion

    2016-12-01

    Primary cutaneous B-cell lymphomas (PCBCL) are mature lymphoproliferative disorders of the B-cell lineage that primarily affect the skin. The biology and clinical course of the individual PCBCL subtypes vary considerably among one another and differ fundamentally from primary nodal and systemic B-cell lymphomas. Primary cutaneous marginal zone lymphomas (PCMZL) and primary cutaneous follicle center lymphomas (PCFCL) are classified as indolent PCBCL on account of their uncomplicated course and excellent prognosis. By contrast, diffuse large B-cell lymphomas, mainly of the leg type (DLBCL, LT), represent the more aggressive PCBCL variants with a poorer prognosis. Accurate histological and immunohistochemical classification, as well as exclusion of systemic involvement to distinguish PCBCL from nodal or systemic lymphomas, is required for staging and treatment decisions. The diagnostic workup should be supported by molecular biological studies. Therapeutically, surgical and radio-oncological measures are the first-line options for indolent PCBCL, with systemic therapy using the CD20 antibody rituximab in cases of disseminated disease. The more aggressive variants should primarily be treated with combinations of rituximab and polychemotherapy regimens, e.g. the CHOP regimen or modifications thereof. Since the pathogenesis and biology of PCBCL are not yet fully understood in all their details, and given the limited therapeutic spectrum, there is still considerable need for research, especially regarding DLBCL, LT. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  2. Tinea capitis: pathogen spectrum and epidemiology changing over time.

    PubMed

    Ziegler, Wiebke; Lempert, Sigrid; Goebeler, Matthias; Kolb-Mäurer, Annette

    2016-08-01

    Tinea capitis is the most common dermatomycosis of childhood. The pathogen profile shows different geographic distribution patterns and varies over time. Data on 150 patients with mycologically confirmed tinea capitis, collected between 1990 and 2014 at the Würzburg university department of dermatology, were analyzed with respect to age, sex, and pathogen spectrum, and compared across two periods of 12.5 years each. Although tinea capitis was most frequently diagnosed in children aged 0 to 5 years, the proportion of affected adults, at 16 %, was higher than previously reported. The zoophilic dermatophyte Microsporum canis was the most frequently identified causative agent of tinea capitis, but an increase in infections with the anthropophilic fungi Trichophyton tonsurans and Trichophyton rubrum was recorded. The proportion of zoophilic relative to anthropophilic pathogens tended to decrease. Over time, the pathogen spectrum grew increasingly heterogeneous: dermatophytes such as Trichophyton soudanense and Trichophyton violaceum, Trichophyton anamorph of Arthroderma benhamiae, as well as Trichophyton schoenleinii and Microsporum audouinii were isolated for the first time or again after a long interval. Although Microsporum canis infections still dominate, anthropophilic pathogens are increasingly being detected. In view of the unexpectedly high proportion of adults, tinea capitis should be considered in the differential diagnosis in all age groups. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  3. Ustekinumab in the treatment of palmoplantar pustulosis - a case series of nine patients.

    PubMed

    Buder, Valeska; Herberger, Katharina; Jacobi, Arnd; Augustin, Matthias; Radtke, Marc Alexander

    2016-11-01

    Palmoplantar pustulosis is a chronic inflammatory skin disease associated with considerable impairment of quality of life and physical resilience. Treatment options are limited owing to licensing restrictions and a frequently treatment-refractory course. After previously unsuccessful therapies, and after latent tuberculosis had been ruled out, nine patients with palmoplantar pustulosis received ustekinumab (45 mg for body weight [BW] < 100 kg, 90 mg for BW > 100 kg) at weeks 0, 4, 12, and 24. Regular visits took place after 4 and 12 weeks, and every 12 weeks thereafter. Mean age at the start of therapy was 48 years. Three patients were male. In n = 4 patients (44.4 %), a 75 % improvement in the Palmoplantar Psoriasis Area and Severity Index (PPPASI) was achieved. Overall, the PPPASI improved by an average of 71.6 % after 24 weeks. Complete clearance was seen in n = 2 patients after 24 weeks. Apart from local injection-site reactions and mild infections, no adverse effects were observed. This case series provides further evidence of the efficacy and tolerability of ustekinumab in the treatment of palmoplantar pustulosis. Controlled trial data and observations from patient registries are needed to assess long-term efficacy and safety as well as the effectiveness of intermittent therapy. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  4. Intelligent metering systems - added value for different stages of the value chain

    NASA Astrophysics Data System (ADS)

    Deppe, Benjamin

    The transformation of the energy industry is progressing continuously - and will not be complete with the Act on the Digitization of the Energy Transition. Rather, the pace of innovation and customer expectations keep rising. This chapter describes how intelligent metering systems create the basis for new possibilities and how these changes affect the value chain. The focus is on the transformation of the individual stages of the value chain in the course of the liberalization of the metering sector and under the Metering Point Operation Act (Messstellenbetriebsgesetz), binding since September 2, 2016. The chapter shows how previously separate stages of the value chain now directly touch and interact with one another. This makes clear how information can lead to added value - but also which questions remain to be answered and which hurdles remain to be overcome.

  5. Time course of avascular necrosis of the femoral head in patients with pemphigus vulgaris.

    PubMed

    Balighi, Kamran; Daneshpazhooh, Maryam; Aghazadeh, Nessa; Saeidi, Vahide; Shahpouri, Farzam; Hejazi, Pardis; Chams-Davatchi, Cheyda

    2016-10-01

    Pemphigus vulgaris (PV) is usually treated with systemic corticosteroids and immunosuppressants. Avascular necrosis (AVN) of the femoral head is a well-known severe complication of corticosteroid therapy. The characteristics of this serious complication in PV remain unknown. This was an uncontrolled, retrospective review of all cases of PV-associated AVN diagnosed at an Iranian clinic for autoimmune bullous diseases between 1985 and 2013. Based on the medical records of 2,321 PV patients examined, 45 cases (1.93 %) of femoral AVN were identified. Thirty of these were men. Mean age at AVN diagnosis was 47.4 ± 14.2 years. The mean interval between the diagnosis of PV and the onset of AVN was 25.3 ± 18.3 months. With the exception of eight cases (17.8 %), AVN set in within three years of the PV diagnosis in the majority of patients. The mean cumulative prednisolone dose in patients with AVN was 13,115.8 ± 7,041.1 mg. There was a strong correlation between the total prednisolone dose and the time to onset of AVN (p = 0.001). In patients with a history of alendronate intake, this interval was significantly shorter (p = 0.01). AVN is a severe complication of corticosteroid treatment in patients with PV. It is observed in 2 % of patients and occurs predominantly within the first three years of treatment. In patients receiving higher doses of prednisolone, AVN tends to set in earlier. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  6. New lasers and radiation sources - old and new risks?

    PubMed

    Paasch, Uwe; Schwandt, Antje; Seeber, Nikolaus; Kautz, Gerd; Grunewald, Sonja; Haedersdal, Merete

    2017-05-01

    Developments in recent years in the field of dermatologic lasers, intense pulsed light sources, LEDs, and new energy and radiation sources have shown that new wavelengths, concepts, and combinations open up additional therapeutic options for the dermatologist, some of which extend beyond the aesthetic field. Whereas fractional lasers, for example, were previously used to treat wrinkles, these same systems, in combination with drugs, are today important tools in the treatment of scars, field cancerization, and epithelial tumors. The demands on the physician who establishes the indication and preferably performs the treatment are rising along with the ever more complex technology and the increasing comorbidities and comedications of an aging patient population. In parallel, devices for home use have become established, initially for a few indications; these are characterized by low power and special safety precautions to prevent accidents, risks, and side effects. Despite the reduced efficacy of such self-treatment measures, the likelihood of misuse increases, because the basic prerequisite for correct therapy - an exact diagnosis and indication - cannot be assumed. During hair removal, pigmented tumors may thus be targeted, and during wrinkle treatment, neoplastic skin lesions, inducing expected, unforeseen, and novel side effects and complications. In this scenario, it is important to qualify all potential users of these new technologies before they are deployed, so that those treated are guaranteed maximum therapeutic safety with the highest efficacy, under the guiding principle diagnosis certa - ullae therapiae fundamentum. © 2017 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  7. Pyropheophorbide a as a catabolite of ethylene-induced chlorophyll a degradation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shimokawa, Keishi; Hashizume, Akihito; Shioi, Yuzo

    1990-05-01

    An enzyme extract prepared from ethylene-induced degreening Citrus fruits contains chlorophyll (Chl) degrading enzymes. The fate of Chl carbons during enzymatic degradation was investigated using Chl a-¹⁴C. Accompanying the disappearance of labelled Chl a, pheophorbide a and pyropheophorbide a appeared, and accumulation of pyropheophorbide a was observed. Hydroxy-Chl a was also detected, but this is thought to be an artifact of chromatography. Unlike in ethylene-induced Citrus fruits (in vivo), further degradation of pyropheophorbide a did not occur in the in vitro enzyme system. This suggests a lack of enzyme(s) and/or cofactor(s) for further degradation. It is concluded that Chl a is degraded enzymatically in the following order: Chl a, chlorophyllide a, pheophorbide a, and pyropheophorbide a.

  8. White and brown dwarfs prove inhospitable

    NASA Astrophysics Data System (ADS)

    Heller, René

    2013-02-01

    Astronomers have detected more than 850 exoplanets since 1992. Most of them orbit normal main-sequence stars, but companions in orbit around brown or white dwarfs have also been found. Now Rory Barnes of the University of Washington in Seattle and René Heller at the Leibniz-Institut für Astrophysik Potsdam have investigated the habitability of possible planets around such objects and found that they are unsuitable for life as we know it.

  9. Constellations and their myths

    NASA Astrophysics Data System (ADS)

    Fasching, Gerhard

    The constellations and the myths associated with them help one find one's way around the night sky and convey the rich imagery of mythology and star legends. Star charts and old copper engravings from the holdings of the Austrian National Library show how the night sky was imagined in earlier centuries. Detailed indexes with more than 3,000 entries ease access to star and constellation names and to the mythology. The third, expanded edition provides information for the coming decades on where and when the planets, as well as star clusters, gaseous nebulae, and galaxies, can easily be found in the sky.

  10. Microdata and statistical analysis methods

    NASA Astrophysics Data System (ADS)

    Hujer, Reinhard

    With the increasing availability of ever larger cross-sectional and longitudinal data sets for individuals, households, and firms, as well as of linkages between them, microeconometric research has advanced rapidly in recent years, both methodologically and from an empirical, application-oriented perspective. Microdata and microeconometric approaches serve to take up current, policy-relevant questions, to analyze them, and to provide well-founded policy recommendations, for example in labor market and social policy, financial analysis, and marketing research. The German Statistical Society (DStatG) and its members have continuously engaged with advances in microeconometric methodology and their empirical applications in the society's committees and at its annual meetings. Numerous publications by DStatG members have contributed decisively to critical discourse and scientific progress in this field.

  11. [Vegetarian and vegan diets in children - state of research and research needs].

    PubMed

    Keller, Markus; Müller, Stine

    2016-01-01

    The practice of vegetarian diets has increased markedly in Germany over the past decade, although the proportion of vegetarian and vegan children is unknown. Studies in adults show the preventive potential, but also the potential weaknesses, of plant-based diets. The benefits and risks of vegetarian and vegan diets in childhood, however, have so far been studied relatively rarely. Moreover, the varying ages of the children, the heterogeneous study designs, and the sometimes small numbers of participants preclude definitive conclusions. This review summarizes the results of the few studies on vegetarian and vegan children (< 12 years) in North America and Europe. According to these, the intake of food energy and macronutrients among vegetarian and vegan children was usually closer to the recommendations of the professional societies than the diet of omnivorous children of the same age. Likewise, vegetarian and vegan children showed a higher intake of, and better supply with, various vitamins and minerals. Deficits, however, were more frequent for vitamin B12, zinc, calcium, iron, and vitamin D. The growth and development of vegetarian and vegan children largely corresponded to reference standards, although they tended to be lighter, leaner, and (< 5 years) also shorter. Given the insufficient body of studies, there is a considerable need for research into the effects of vegetarian and vegan diets in childhood. © 2016 S. Karger GmbH, Freiburg.

  12. Constellations and their myths

    NASA Astrophysics Data System (ADS)

    Fasching, Gerhard

    The constellations, which have captivated people since ancient times, and the myths associated with them are presented in two ways. First, the intention is to help readers find their way around the night sky; second, to convey to them the wealth of imagery associated with it. The book opens with the splendid tales from Ovid's Metamorphoses. It then treats the night sky through the course of the year, encouraging readers to open up this almost inexhaustible wealth of imagery for themselves through their own observations. An extensive section covers the individual constellations and the lore handed down about them. Star charts and old copper engravings from the holdings of the Austrian National Library show how the night sky was imagined in earlier centuries. Star legends and myths are recounted, and the Ptolemaic and Copernican world systems are contrasted. Detailed indexes with more than 3,000 entries ease access to star and constellation names and to the mythology.

  13. S1 guideline on lipedema.

    PubMed

    Reich-Schupke, Stefanie; Schmeller, Wilfried; Brauer, Wolfgang Justus; Cornely, Manuel E; Faerber, Gabriele; Ludwig, Malte; Lulay, Gerd; Miller, Anya; Rapprich, Stefan; Richter, Dirk Frank; Schacht, Vivien; Schrader, Klaus; Stücker, Markus; Ure, Christian

    2017-07-01

    This revised guideline on lipedema was prepared and funded under the lead of the German Society of Phlebology (DGP). Its contents are based on a systematic literature search and the consensus of eight medical societies and professional associations. The guideline contains recommendations on the diagnosis and treatment of lipedema. The diagnosis is to be made on the basis of history and clinical findings. Characteristic is a circumscribed, symmetrically localized increase in subcutaneous adipose tissue of the extremities, with marked disproportion to the trunk. In addition, there are edema, a tendency to hematoma formation, and increased tenderness of the affected body regions. Further instrument-based examinations are so far reserved for special questions. The disease is chronically progressive, with a course that varies between individuals and cannot be predicted. Treatment consists of four pillars that should be combined individually and adapted to the current symptoms: complex decongestive therapy (manual lymphatic drainage, compression therapy, exercise therapy, skin care), liposuction and plastic-surgical interventions, nutrition and physical activity, and, where appropriate, adjunctive psychotherapy. Surgical measures are indicated in particular if symptoms persist despite consistently performed conservative therapy, or if the findings and/or symptoms progress. Morbid obesity coexisting with lipedema should be addressed therapeutically before liposuction. © 2017 The Authors | Journal compilation © Blackwell Verlag GmbH, Berlin.

  14. Realignment and consolidation

    NASA Astrophysics Data System (ADS)

    Grohmann, Heinz

    With the election of Wolfgang Wetzel as chairman of the German Statistical Society in 1972, a 32-year era began in which practical and theoretical statistics were cultivated in a balanced relationship. A regular four-year rotation of the chairmanship strengthened the community and practical as well as scientific work in equal measure. The annual general meetings addressed both socially topical and forward-looking themes, and the committees and further events provided opportunities to promote and cultivate a wide range of fields of statistics. This is reported not only in this chapter but also in Parts II and III of the volume.

  15. Spiritual well-being and coping in scleroderma, lupus erythematosus, and malignant melanoma.

    PubMed

    Pilch, Michaela; Scharf, Sabina Nadine; Lukanz, Martin; Wutte, Nora Johanna; Fink-Puches, Regina; Glawischnig-Goschnik, Monika; Unterrainer, Human-Friedrich; Aberer, Elisabeth

    2016-07-01

    Religious-spiritual well-being is associated with greater vitality and a reduced tendency toward depression. In our study, we investigated coping strategies and the role of religiosity-spirituality (R-S) in improving subjective well-being. 149 patients (107 women) - 44 with systemic scleroderma (SCL), 48 with lupus erythematosus (LE), and 57 with malignant melanoma (MM), stage I-II - were surveyed using a self-developed questionnaire on subjective well-being and the circumstances accompanying the disease, as well as the Multidimensional Inventory (MI-RSB) on R-S. LE patients are under greater strain at the time of diagnosis than SCL and MM patients. SCL and LE patients are only able to accept their disease after years. The total score for religious-spiritual well-being in LE patients lies significantly below the value for the general population. In LE patients, photosensitivity and joint pain are negatively associated with the ability to forgive. SCL patients with facial changes and lung involvement show higher general religiosity. MM patients have higher scores for transcendent hope. Lectures about the disease and psychological support are the most important needs that patients with SCL, LE, and MM have of their caregivers. Religious-spiritual offerings for coping with illness currently appear to play a subordinate role, but could be an important resource that deserves more attention in the future. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  16. Introduction

    NASA Astrophysics Data System (ADS)

    Ha, Suk-Woo

    Implants are used to support or replace cell or tissue functions in the human body. The choice of material for these implants depends on the type and function of the tissue to be replaced. The requirements placed on the implant material with regard to properties and structure can differ greatly depending on the implantation site and functionality. Implants that perform load-introduction and load-transfer functions in bone tissue are subject to high mechanical demands (optimal component stiffness, fatigue strength), whereas for vascular implants the material surface, primarily its chemical composition, must be designed to yield minimal thrombogenicity. Three factors are relevant to the success of an implant material or component: (a) biocompatibility, (b) the patient's state of health, and (c) the course of the operation and of the subsequent therapy. In the presence of a condition such as allergic sensitization to metal ions (nickel allergy), or osteoporosis in the case of hip prosthesis anchoring, the implant material is subject to higher biocompatibility requirements than in organically healthy patients.

  17. Statistics in science and technology

    NASA Astrophysics Data System (ADS)

    Wilrich, Peter-Theodor

    With the rise of science and technology at the beginning of the 19th century, mathematical statistics emerged, stimulated by geodesy (such as the method of least squares) and anthropology (such as the statistical analysis of multidimensional observations and their stochastic dependencies). In contrast to economics and the social sciences, where data are obtained predominantly from observational studies, experiments predominate in the natural sciences. Statistical design of experiments therefore belongs to the statistical methods applied particularly in science and technology, as do extreme value statistics and lifetime analyses, along with the methods of spatial statistics (particularly in environmental research). In the 20th century, sampling plans and control charts for process control were developed as tools of statistical quality assurance. These bundles of methods, with which the committee Statistics in Science and Technology repeatedly deals, are presented below.

  18. Milk, milk products, analogues, and ice cream

    NASA Astrophysics Data System (ADS)

    Coors, Ursula

    The product range of milk and products made from milk comprises drinking milk; milk products manufactured from milk or milk constituents, such as soured milk, yogurt, kefir, buttermilk, cream, condensed milk, dried milk, and whey products; mixed milk and mixed whey products (products with added foods); lactose; milk protein products; milk fats; and cheese.

  19. Tasks and objectives

    NASA Astrophysics Data System (ADS)

    Bauer, Jürgen

    Technically oriented business administration supports technicians and engineers in the planning and implementation of economic processes (manufacturing processes, development processes in the R&D domain, sales processes, procurement processes),

  20. A look into light. Insights into the nature of light and vision, into color and photography.

    NASA Astrophysics Data System (ADS)

    Falk, D. S.; Brill, D. R.; Stork, D. G.

    This book is a German translation, by A. Ehlers, of the American original "Seeing the light. Optics in nature, photography, color, vision, and holography", published in 1986. Contents: 1. The principal properties of light. 2. Fundamentals of geometrical optics. 3. Mirrors and lenses. 4. The camera and photography. 5. The human eye and vision. I: How the image is formed. 6. Optical instruments. 7. The human eye and vision. II: Image processing. 8. Spatial vision and depth perception. 9. Color. 10. Color perception. 11. Color photography. 12. Wave optics. 13. Scattering and polarization. 14. Holography. 15. A look into modern physics.

  1. Dynamic engine measurement with various methods and models

    NASA Astrophysics Data System (ADS)

    Schreiber, Alexander

    The sharply increasing legal and economic requirements for reducing fuel consumption and exhaust emissions place great demands on the further development of gasoline and diesel engines. Fundamental progress can be achieved here through design and layout measures in the areas of injection, mixture formation, charging, combustion processes, and exhaust aftertreatment. A substantial part of these improvements, however, is achieved through an increase in variabilities, such as adjustable pilot, main, and post injections, variable rail pressure, variable camshaft timing angles, valve lifts, and swirl/tumble flaps, as well as adjustable exhaust turbines, exhaust gas recirculation flows, and exhaust aftertreatment systems. As a result, the number of actuators rises sharply. Added to this are further sensors, e.g. for air-fuel ratio, NOx, combustion chamber pressure, exhaust gas temperature, and exhaust gas pressure. The scope of the open-loop control, closed-loop control, and diagnostic functions in the engine electronics (ECU) is therefore growing strongly. As an example, Figure 7-1 shows the signal flow for the controlled and regulated variables of a diesel engine in an exemplary test-bench environment.

  2. Opportunities and visions of modern mechanics

    NASA Astrophysics Data System (ADS)

    Ehlers, Wolfgang; Wriggers, Peter

    Twenty-first-century society will continually raise its demands on quality of life and environmental standards. This also concerns the demands on quality and predictive accuracy in the computation of complex problems, and it encompasses in particular the overall design of products in our immediate surroundings, such as architectural and civil engineering structures, but also of industrial products that we use in our daily lives.

  3. Questions of correspondence between the energy system and telecommunications

    NASA Astrophysics Data System (ADS)

    Lehmann, Heiko

    Telecommunications networks and power grids share many features - both with regard to general topological and hierarchical properties and with regard to the concrete techno-economics under regulatory conditions. The following chapter analyzes these properties and answers the question of how the mutual coupling of these two system-spanning infrastructures can become a driver of the energy transition in developed economies.

  4. Network configurations and VDE regulations

    NASA Astrophysics Data System (ADS)

    Bernstein, Herbert

    The Association of German Electrical Engineers (VDE) has drawn up a series of regulations that serve to protect life and property when handling electrical energy. Particularly important are the rules laid down in VDE specifications 0100 and 0411.

  5. Image analysis in medicine and biology

    NASA Astrophysics Data System (ADS)

    Athelogou, Maria; Schönmeyer, Ralf; Schmidt, Günther; Schäpe, Arno; Baatz, Martin; Binnig, Gerd

    Today, imaging techniques are indispensable in medical examinations. Diverse methods - based on the use of ultrasound waves, X-rays, magnetic fields, or light - are employed for specific purposes and supply extensive data about the body and its interior. In addition, microscopy images from biopsies can yield data on the morphological properties of body tissues. From the analysis of all these different kinds of information, in consultation with further clinical examinations from various medical disciplines and taking medical history data into account, an "overall picture" of a patient's state of health can be constructed. Owing to the flood of image data generated, image processing in general and image analysis in particular are playing an ever more important role. Especially in diagnostic support, therapy planning, and image-guided surgery, they constitute key technologies that are decisively driving progress, and not only in these fields.

  6. Fundamentals and basic concepts of measurement technology

    NASA Astrophysics Data System (ADS)

    Plaßmann, Wilfried

    An essential task of measurement technology is to record technical processes quantitatively and to control functional sequences on the basis of the measured quantities. Take, as an example, a power plant for energy generation: only by measuring temperatures, power outputs, pressures, and other quantities can statements be made about its current state, and only then can suitable interventions in the system be made in the event of deviations from the setpoint. To enable unambiguous communication, the terms, measurement methods, and units of measurement used in measurement technology are laid down in corresponding standards and regulations.

  7. From industry and operations. Biomass: harvesting and processing with chipping headrigs

    Treesearch

    P. Koch

    1977-01-01

    In 1963, only 30% of the above- and below-ground biomass of the harvested so-called southern pines in the southern United States was utilized for lumber and pulp, i.e. sold as dried, planed, cut-to-length lumber or as kraft paper. None of the hardwood species occurring together with the pines had so far been utilized to any appreciable extent. Even today...

  8. Active substances, drugs, and mathematical image processing

    NASA Astrophysics Data System (ADS)

    Bauer, Günter J.; Lorenz, Dirk A.; Maaß, Peter; Preckel, Hartwig; Trede, Dennis

    Die Entwicklung neuer Medikamente ist langwierig und teuer. Der erste Schritt ist hierbei die Suche nach neuen Wirkstoffkandidaten, die für die Behandlung bislang schwer therapierbarer Krankheiten geeignet sind. Hierfür stehen der Pharma- und Biotechnologieindustrie riesige Substanzbibliotheken zur Verfügung. In diesen Bibliotheken werden die unterschiedlichsten Substanzen gesammelt, die entweder synthetisch hergestellt oder aus Pilzen, Bakterienkulturen und anderen Lebewesen gewonnen werden können.

  9. Quantitative Analyse und Visualisierung der Herzfunktionen

    NASA Astrophysics Data System (ADS)

    Sauer, Anne; Schwarz, Tobias; Engel, Nicole; Seitel, Mathias; Kenngott, Hannes; Mohrhardt, Carsten; Loßnitzer, Dirk; Giannitsis, Evangelos; Katus, Hugo A.; Meinzer, Hans-Peter

    Die computergestützte bildbasierte Analyse der Herzfunktionen ist mittlerweile Standard in der Kardiologie. Die verfügbaren Produkte erfordern meist ein hohes Maß an Benutzerinteraktion und somit einen erhöhten Zeitaufwand. In dieser Arbeit wird ein Ansatz vorgestellt, der dem Kardiologen eine größtenteils automatische Analyse der Herzfunktionen mittels MRT-Bilddaten ermöglicht und damit Zeitersparnis schafft. Hierbei werden alle relevanten herzphysiologischen Parameter berechnet und mithilfe von Diagrammen und Graphen visualisiert. Diese Berechnungen werden evaluiert, indem die ermittelten Werte mit manuell vermessenen verglichen werden. Der hierbei berechnete mittlere Fehler liegt mit 2,85 mm für die Wanddicke und 1,61 mm für die Wanddickenzunahme immer noch im Bereich einer Pixelgröße der verwendeten Bilder.
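    Der genannte mittlere Fehler entspricht der mittleren absoluten Abweichung zwischen automatisch und manuell gemessenen Wanddicken. Eine minimale Python-Skizze dieses Vergleichs (Funktionsname und Messwerte sind frei erfunden und stammen nicht aus der Arbeit):

```python
# Mittlere absolute Abweichung (MAE) zwischen automatisch und manuell
# gemessenen Wanddicken; alle Zahlenwerte sind erfundene Beispielwerte.

def mittlerer_fehler(automatisch, manuell):
    """Mittlere absolute Abweichung zweier gleich langer Messreihen (in mm)."""
    if len(automatisch) != len(manuell):
        raise ValueError("Messreihen müssen gleich lang sein")
    return sum(abs(a - m) for a, m in zip(automatisch, manuell)) / len(automatisch)

# Hypothetische Wanddicken (mm) an vier Messpunkten
auto_mm = [8.1, 9.4, 10.2, 7.8]
hand_mm = [8.0, 9.0, 10.5, 8.2]
print(round(mittlerer_fehler(auto_mm, hand_mm), 3))
```

    Liegt eine solche Abweichung, wie im Beitrag berichtet, in der Größenordnung einer Pixelgröße, ist das automatische Verfahren praktisch so genau wie die manuelle Vermessung.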

  10. Aufbau und Belastung tribologischer Systeme

    NASA Astrophysics Data System (ADS)

    Schumacher, Jan; Murrenhoff, Hubertus

    Die Tribologie ist laut DIN 50323 die Wissenschaft und Technik von aufeinander einwirkenden Oberflächen in Relativbewegung. Sie behandelt die Teilgebiete Reibung, Verschleiß und Schmierung.

  11. Mittelwert- und Arbeitstaktsynchrone Simulation von Dieselmotoren

    NASA Astrophysics Data System (ADS)

    Zahn, Sebastian

    Getrieben durch die immer restriktiveren Anforderungen an das Emissions- und Verbrauchsverhalten moderner Verbrennungsmotoren steigt die Komplexität von Motormanagementsystemen mit jeder Modellgeneration an. Damit geht nicht nur eine Zunahme des Softwareumfangs von Steuergeräten sondern zugleich ein deutlicher Anstieg des Applikations-, Vermessungs- und Testaufwandes einher. Zur Effizienzsteigerung des Software- und Funktionsentwicklungsprozesses haben sich daher in der Automobilindustrie sowie in Forschungsinstituten verschiedene modell- und simulationsbasierte Methoden wie die Model-in-the-Loop (MiL) Simulation, die Software-in-the-Loop (SiL) Simulation, das Rapid Control Prototyping (RCP) sowie die Hardware-in-the-Loop (HiL) Simulation etabliert.

  12. Strukturen und Geschäftsmodelle eines neuen Energiemarkts

    NASA Astrophysics Data System (ADS)

    Mildebrath, Bernhard

    Die Energiewende fördert nicht nur technologische, sondern auch organisatorische und kommerzielle Innovationen. Die Strukturen und Geschäftsmodelle eines neuen Energiemarktes entwickeln sich bereits. Im Spagat zwischen Versuch und Irrtum werden sie völlig neue Lösungen für eine altbekannte Forderung schaffen: Strom soll preiswert, sicher und umweltverträglich sein. Der nachstehende Artikel präsentiert dafür - teils zugespitzte - Überlegungen zu den Strukturen und Geschäftsmodellen eines neuen Energiemarkts.

  13. Personen- und Güterverkehr

    NASA Astrophysics Data System (ADS)

    Flämig, Heike; Gertz, Carsten; Mühlhausen, Thorsten

    Im Jahr 2010 war in Deutschland der Verkehrssektor für fast 20 % der energiebedingten Treibhausgase verantwortlich. Das Klima hat sich bereits so weit verändert, dass zur Sicherung der Funktion der Verkehrssysteme auch Anpassungsmaßnahmen notwendig sind. Diese Maßnahmen müssen durch verkehrsreduzierende bzw. -beeinflussende Maßnahmen ergänzt werden. Ausgehend von den Emissionen im Verkehrssektor und möglichen Minderungen schlägt das Kapitel den Bogen zu Optionen der Anpassung an den Klimawandel, die ausführlich und konkret dargestellt werden. Besonders betrachtet werden die mannigfaltigen Gefahren der Rückkopplung sowie die vielfältigen Wechselwirkungen mit anderen Themengebieten und Sektoren.

  14. Frühe Stresserfahrungen und Krankheitsvulnerabilität

    PubMed Central

    Entringer, Sonja; Buss, Claudia; Heim, Christine

    2016-01-01

    Zusammenfassung Hintergrund Das stetig wachsende Forschungsgebiet der “Frühe[n] Programmierung von Krankheit und Gesundheit” untersucht, inwieweit die individuelle Vulnerabilität für die Entstehung verschiedenster Erkrankungen über die Lebensspanne bereits während der frühen Entwicklung beeinflusst wird. Ziele der Arbeit In der vorliegenden Übersichtsarbeit werden das Konzept der frühen Programmierung von Krankheitsvulnerabilität erläutert sowie Befunde zu den Folgen frühkindlicher Traumatisierung und pränataler Stressexposition zusammenfassend dargestellt. Es werden außerdem biologische Mechanismen diskutiert, die das erhöhte Krankheitsrisiko nach lebensgeschichtlich frühen Stresserfahrungen vermitteln. Die Möglichkeit der transgenerationalen Transmission frühkindlicher Erfahrungen an die nächste Generation und die zugrundeliegenden Mechanismen dieser Übertragung werden ebenfalls vorgestellt. Fazit Die Befundlage zu Stresserfahrungen im frühen Leben und der Entstehung von psychischen und körperlichen Störungen über die Lebensspanne wächst stetig. Die Mechanismen werden derzeit weiter bis hin zur molekularbiologischen und epigenetischen Ebene erforscht. Hier ergeben sich ganz neue Perspektiven, welche die Präzision klinischer Diagnostik und den Erfolg von Interventionen erheblich verbessern könnten. Momentan existiert jedoch noch ein erheblicher Mangel an Translation zwischen diesen Forschungserkenntnissen und deren Anwendung in der klinischen Versorgung. PMID:27604117

  15. Darwinische Kulturtheorie - Evolutionistische und „evolutionistische“ Theorien sozialen Wandels

    NASA Astrophysics Data System (ADS)

    Antweiler, Christoph

    Evolutionistische Argumentationen außerhalb der Biologie sind weit verbreitet. Wenn sie vertreten werden, heißt das mitnichten, dass sie notwendigerweise von darwinischen Argumenten geprägt sind. Wenn man Evolution und Kultur aus explizit darwinischer Perspektive zusammen bringt, bedeutet das noch lange nicht unbedingt Soziobiologie. Und es bedeutet sicherlich nicht Sozialdarwinismus. Dieser Beitrag soll einen Überblick der so genannten evolutionären Ansätze bzw. evolutionistischen Ansätze zu menschlichen Gesellschaften bzw. Kulturen geben. Es soll gezeigt werden, was in den Ansätzen analytisch zu trennen ist und was synthetisch zusammen gehört. Mein Beitrag ist nicht wissenschaftsgeschichtlich angelegt, sondern systematisch ausgerichtet und hat zwei Schwerpunkte (Antweiler 2008; Antweiler 2009b). Zum einen geht es um kausale Zusammenhänge von organischer Evolution und gesellschaftlichem Wandel. Auf der anderen Seite werden Analogien zwischen biotischer und kultureller Evolution erläutert, die als spezifische Ähnlichkeiten dieser beiden als grundsätzlich verschieden gesehenen Prozesse aufgefasst werden. Dadurch wird die Frage aufgeworfen, ob die Evolution von Organismen einerseits und die Transformation von Gesellschaften bzw. Kulturen andererseits, spezielle Fälle eines allgemeinen Modells von Evolution darstellen.

  16. Zahlen und Rechenvorgänge auf unterschiedlichen Abstraktionsniveaus

    NASA Astrophysics Data System (ADS)

    Rödler, Klaus

    "Das Verständnis geht langsam vor sich!" Diesen wichtigen Satz hörte ich bei einem Vortrag von Martin Lowsky. Auf die hier behandelte Fragestellung übertragen heißt das: Was eine Zahl ist und wie ich sie im Rechenvorgang einsetzen und interpretieren kann, das erschließt sich erst allmählich. Die Zahl des Rechenanfängers ist nicht dieselbe wie die des kompetenten Rechners, und es ist nicht die Zahl des Lehrers oder der Lehrerin. Die Zahlen sind nur auf der Oberfläche der Worte und Zeichen gleich. Im Innern, im Verständnis, sind sie völlig verschieden! Ich glaube, dass die Missachtung dieser Divergenz dazu führt, dass manche Kinder in für Lehrer und Lehrerin nicht nachvollziehbaren Routinen stecken bleiben und einfachste Informationen nicht wirklich integrieren. Die auf beiden Seiten wachsende Verunsicherung durch die nicht erkannte und daher nicht kommunizierbare Diskrepanz im inneren Zahlkonzept stört den allmählichen Aufbau strukturierter Zahlvorstellungen.

  17. Symmetriebrechung und Emergenz in der Kosmologie.

    NASA Astrophysics Data System (ADS)

    Mainzer, K.

    Seit der Antike wird der Aufbau des Universums mit einfachen und regulären (symmetrischen) Grundstrukturen verbunden. Diese Annahme liegt selbst noch den Standardmodellen der relativistischen Kosmologie zugrunde. Demgegenüber läßt sich die Emergenz neuer Strukturen von den Elementarteilchen über Moleküle bis zu den komplexen Systemen des Lebens als Symmetriebrechung verstehen. Symmetriebrechung und strukturelle Komplexität bestimmen die kosmische Evolution. Damit zeichnet sich ein fachübergreifendes Forschungsprogramm von Physik, Chemie und Biologie ab, in dem die Evolution des Universums untersucht werden kann.

  18. Kostenüberwachung und Wirtschaftlichkeitsrechnung

    NASA Astrophysics Data System (ADS)

    Bauer, Jürgen

    Die ERP-Produktkalkulation erfolgt auf der Basis des Mengen- und Wertgerüsts der Produktionsprozesse. Sie greift dabei auf die Stammdaten (Materialstamm, Arbeitsplätze, Arbeitspläne, Stücklisten) zu. Basis ist die übliche Industriekalkulation in der Form einer Zuschlagskalkulation, ergänzt durch Platzkostensätze der Maschinen und Arbeitsplätze (siehe Teil ).
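    Das Grundprinzip einer solchen Zuschlagskalkulation mit Platzkostensätzen lässt sich kompakt skizzieren (eine vereinfachte Rechenskizze; Zuschlagssätze, Funktionsname und Beispielwerte sind frei gewählt und bilden kein konkretes ERP-System ab):

```python
# Vereinfachte Zuschlagskalkulation: Gemeinkosten werden als prozentuale
# Zuschläge auf die Einzelkosten verrechnet, Maschinenzeiten über einen
# Platzkostensatz (EUR/h). Alle Sätze sind frei gewählte Beispielwerte.

def herstellkosten(material_ek, fertigung_ek, maschinen_h,
                   mgk_satz=0.10, fgk_satz=0.25, platzkostensatz=60.0):
    materialkosten = material_ek * (1 + mgk_satz)     # Materialeinzel- plus Materialgemeinkosten
    fertigungskosten = fertigung_ek * (1 + fgk_satz)  # Fertigungslohn plus Fertigungsgemeinkosten
    maschinenkosten = maschinen_h * platzkostensatz   # Maschinenzeit zum Platzkostensatz
    return materialkosten + fertigungskosten + maschinenkosten

# Beispiel: 200 EUR Material, 100 EUR Fertigungslohn, 1,5 h Maschinenzeit
print(round(herstellkosten(200.0, 100.0, 1.5), 2))
```

    Im ERP-System liefern Materialstamm, Arbeitspläne und Stücklisten das Mengengerüst für genau diese Rechnung.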

  19. Ruhende Flüssigkeiten und Gase

    NASA Astrophysics Data System (ADS)

    Heintze, Joachim

    Das mechanische Verhalten von Flüssigkeiten und Gasen ist dadurch gekennzeichnet, dass sie keine statische Schubfestigkeit besitzen, andernfalls würden sie nicht beginnen, zu fließen. In ruhenden Flüssigkeiten und Gasen können daher keine Schubspannungen bestehen:

  20. Stenting und technische Stentumgebung

    NASA Astrophysics Data System (ADS)

    Hoffstetter, Marc; Pfeifer, Stefan; Schratzenstaller, Thomas; Wintermantel, Erich

    In hoch entwickelten Industrieländern stehen laut Weltgesundheitsorganisation (WHO) Herz-Kreislauf-Erkrankungen und speziell die Koronare Herzkrankheit (KHK) an erster Stelle der Todesursachen. In Deutschland betrug die Zahl der erfassten, an KHK erkrankten Personen ohne Berücksichtigung der Dunkelziffer allein im Jahre 2001 über 473.000. Die KHK war im Jahre 2003 mit 92.673 erfassten Todesfällen immer noch die häufigste Todesursache, obgleich in Deutschland die Häufigkeit der Koronarinterventionen zur Behandlung der KHK zwischen 1984 und 2003 um fast das 80fache von 2.809 auf 221.867 Eingriffe pro Jahr gestiegen ist [1]. Neben der hohen Zahl an Todesfällen haben die betroffenen Personen durch chronische Schmerzen und eingeschränkte körperliche Leistungsfähigkeit zusätzlich eine starke Beeinträchtigung der Lebensqualität [2]. Infolgedessen wird die erkrankte Person häufig zum Pflegefall, was neben den gesundheitlichen Aspekten auch eine sozioökonomische Komponente in Form der fehlenden Arbeitskraft und der auftretenden Pflegekosten nach sich zieht. Die Kosten für die Behandlung der KHK in Deutschland beliefen sich im Jahre 2002 laut Statistischem Bundesamt auf rund 6,9 Mrd. €. Verglichen mit ähnlichen Zahlen der USA dürfte sich der entstandene Schaden für die deutsche Volkswirtschaft im zwei- bis dreistelligen Milliardenbereich bewegen [3].

  1. Methodik und Qualität statistischer Erhebungen

    NASA Astrophysics Data System (ADS)

    Krug, Walter; Schmidt, Jürgen; Wiegert, Rolf

    Kapitel 8 wirft einen Blick hinter die Kulissen statistischer Arbeit und ihrer Methoden, insbesondere auch hinter die der amtlichen Statistik: Wie kommen die Myriaden von Zahlen zustande, die heute aus statistischen Quellenwerken aller Art und aus Datenbanken abgerufen werden können? Dabei wird deutlich, welche Schwierigkeiten bei Erhebungen, insbesondere bei Stichprobenerhebungen, zu überwinden sind, wie man Antwortverweigerer kooperativer stimmt, wie sich auch aus kleinen Stichproben auf intelligente Weise verlässliche Ergebnisse erzielen lassen und wie Großstichproben auf europäischer Ebene harmonisiert werden. Am Beispiel des Zensus 2011 wird gezeigt, wie sich eine Kombination von Stichproben und Registerauswertungen als Ersatz für eine Volkszählung nutzen lässt. Mitglieder der Deutschen Statistischen Gesellschaft waren daran kooperativ beteiligt.

  2. Aufnahme, Analyse und Visualisierung von Bewegungen nativer Herzklappen in-vitro

    NASA Astrophysics Data System (ADS)

    Weiß, Oliver; Friedl, Sven; Kondruweit, Markus; Wittenberg, Thomas

    Die hohe Zahl an Transplantationen von Herzklappen und viele nötige Re-Operationen machen eine detaillierte Analyse der Strömungen und Klappenbewegungen klinisch interessant. Ein neuer Ansatz ist hierbei der Einsatz von Hochgeschwindigkeitskameras, um Bewegungsabläufe der Herzklappen beobachten und auswerten zu können. Die hohen Datenraten erfordern allerdings eine möglichst automatisierte Analyse und möglichst komprimierte Darstellung des Schwingungsverhaltens. In dieser Arbeit wird ein Ansatz vorgestellt, bei dem Bewegungen nativer Herzklappen in-vitro aufgenommen, analysiert und kompakt visualisiert werden.

  3. Kontinuierliche Wanddickenbestimmung und Visualisierung des linken Herzventrikels

    NASA Astrophysics Data System (ADS)

    Dornheim, Lars; Hahn, Peter; Oeltze, Steffen; Preim, Bernhard; Tönnies, Klaus D.

    Zur Bestimmung von Defekten in der Herztätigkeit kann die Veränderung der Wanddicke des linken Ventrikels in zeitlichen MRT-Aufnahmesequenzen gemessen werden. Derzeit werden für diese Bestimmung im Allgemeinen nur die aufwändig manuell erstellten Segmentierungen der Endsystole und Enddiastole benutzt. Wir stellen ein bis auf die Startpunktinitialisierung automatisches Verfahren zur Bestimmung der Wanddicke des linken Ventrikels und ihrer Veränderung vor, das auf einer vollständigen Segmentierung der Herzwand in allen Zeitschritten durch ein dynamisches dreidimensionales Formmodell (Stabiles Feder-Masse-Modell) basiert. Dieses Modell nutzt bei der Segmentierung neben der Grauwertinformation eines Zeitschrittes auch die Segmentierungen der anderen Zeitschritte und ist so aufgebaut, dass die Wanddicken direkt gemessen und visualisiert werden können. Auf diese Weise werden die lokalen Wanddickenextrema über den gesamten Aufnahmezeitraum detektiert, auch wenn sie nicht in die Endsystole bzw. -diastole fallen. Das Verfahren wurde auf sechs 4D-Kardio-MRT-Datensätzen evaluiert und stellte sich als sehr robust bzgl. der einzig nötigen Interaktion heraus.

  4. Prä- und perioperative Aspekte der Versorgung dermatochirurgischer Patienten.

    PubMed

    Müller, Cornelia S L; Hubner, Wakiko; Thieme-Ruffing, Sigrid; Pföhler, Claudia; Vogt, Thomas; Volk, Thomas; Gärtner, Barbara C; Bialas, Patric

    2017-02-01

    Die Dermatochirurgie nimmt hinsichtlich vieler Punkte eine Sonderstellung unter den operativen Fächern ein. Hierzu gehört in erster Linie die Tatsache, dass bis auf wenige Ausnahmen fast alle Eingriffe traditionell in Lokal- bzw. Regionalanästhesie und oft auch in räumlich-infrastruktureller Trennung von den großen Zentral-Operationssälen stattfinden können. Die peri- und postoperative Überwachung obliegt dabei dem dermatochirurgischen Operationsteam. Das sui generis kleinere OP-Team hat somit eine ganze Reihe perioperativer Notwendigkeiten zu beachten, um die sich in den "großen" chirurgischen Fächern eine Vielzahl verschiedener beteiligter Fachgruppen gemeinsam kümmern. Hierzu gehören neben Hygieneaspekten, Kenntnissen in der Überwachung der Patienten sowie dem Aspekt der surgical site infections auch Fragen zur postoperativen Schmerztherapie sowie detailliertes pharmakologisches Wissen über die zur Anwendung kommenden Lokalanästhetika und das Handling der damit assoziierten toxischen und allergischen Reaktionen. Eine interdisziplinäre Zusammenarbeit und Verantwortung für den Patienten ist notwendig und erfordert die Erarbeitung und Umsetzung qualitätsorientierter und evidenzbasierter Handlungsanweisungen, die im dermatochirurgischen OP-Setting meist weit über das eigentliche Fach hinausgehen. Ziel dieses Weiterbildungsartikels soll die komprimierte Darstellung der genannten fachübergreifenden Standpunkte bezüglich der wichtigsten perioperativen Aspekte sein. © 2017 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  5. Kommunale Energieversorger als wesentliche Akteure der Digitalisierung - Strategien und Handlungsoptionen

    NASA Astrophysics Data System (ADS)

    Reiche, Katherina

    Die Digitalisierung erfasst sämtliche Bereiche des Lebens und Wirtschaftens. Auch die Kommunalwirtschaft - insbesondere die kommunale Energiewirtschaft - sieht sich perspektivisch disruptiven Entwicklungen gegenüber. Stadtwerke haben bereits viele Herausforderungen erfolgreich gemeistert und stehen auch der Digitalisierung positiv gegenüber. Vielerorts gestalten kommunale Unternehmen den digitalen Wandel bereits aktiv mit. Dieser Artikel arbeitet die Assets kommunaler Unternehmen heraus und zeigt Strategien und Handlungsoptionen zum Umgang mit der digitalen Transformation für kommunale Energieversorgungsunternehmen auf. Dabei zeigt sich, dass das politische und regulatorische Umfeld für das Gelingen der Digitalisierung entscheidend ist. Kommunale Unternehmen benötigen die gleichen Marktzugangsbedingungen wie andere Akteure. Ferner profitieren kommunale Unternehmen von einigen Wettbewerbsvorteilen, etwa hohen Vertrauenswerten ihrer Kunden und umfangreichem Know-how im Datenmanagement.

  6. Satellitenbewegung, Band III: Natürliche und gesteuerte Bewegung.

    NASA Astrophysics Data System (ADS)

    Jochim, E. F.

    2014-12-01

    Im dritten Band der Satellitenbewegung werden in fortlaufender Nummerierung einige für Untersuchungen der Bewegung der künstlichen Satelliten wichtige Grundlagen der Astrodynamik mit ausführlichen mathematischen Formelsystemen behandelt. Dazu zählen die unterschiedlichen Aspekte der Bewegung der natürlichen Himmelskörper, die Steuerung und Kontrolle von künstlichen Objekten, und insbesondere die für eine Satellitenbahnanalyse wichtigen physikalischen Beeinflussungen einer Satellitenbewegung. Mathematisch entscheidend ist die Wahl geeigneter Bahnparameter, die ein bestimmtes Bewegungsproblem widerspruchsfrei und singularitätenfrei zu behandeln gestatten. Für die Behandlung routinemäßiger Aufgabenstellungen der Satellitenbewegung, in erster Linie einer präzisen Bahnbestimmung und Bahnverbesserung, kann auf eine Fülle von lehrbuchartigen Monographien verwiesen werden, so dass diese Problematik in der vorliegenden Arbeit nur angedeutet werden soll.

  7. Sticktechnologie für medizinische Textilien und Tissue Engineering

    NASA Astrophysics Data System (ADS)

    Karamuk, Erdal; Mayer, Jörg; Wintermantel, Erich

    Textile Strukturen werden in grossem Ausmass als medizinische Implantate eingesetzt, um Weich- und Hartgewebe zu unterstützen oder zu ersetzen. Im Tissue Engineering gewinnen sie an Bedeutung als scaffolds, um biologische Gewebe in vitro für die anschliessende Implantation oder extrakorporale Anwendungen zu züchten. Textilien sind gewöhnlich anisotrope zweidimensionale Strukturen mit hoher Steifigkeit in der Ebene und geringer Biegesteifigkeit. Durch eine Vielzahl textiler Prozesse und durch entsprechende Wahl des Fasermaterials ist es möglich, Oberfläche, Porosität und mechanische Anisotropie in hohem Masse zu variieren. Wegen ihrer einzigartigen strukturellen und mechanischen Eigenschaften kann mit faserbasierten Materialien biologisches Gewebe in weitem Masse nachgeahmt werden [1]. Gesticke erweitern das Feld der technischen und besonders der medizinischen Textilien, denn sie vereinen sehr hohe strukturelle Variabilität mit der Möglichkeit, mechanische Eigenschaften in einem grossen Bereich einzustellen, um so die mechanischen Anforderungen des Empfängergewebes zu erfüllen (Abb. 42.1).

  8. Konsistente Verknüpfung von Aktivitäts-, Sequenz- und Zustandsdiagrammen

    NASA Astrophysics Data System (ADS)

    Ebrecht, Lars; Lemmer, Karsten

    Der folgende Beitrag stellt eine formale, generische Verhaltensstruktur und Semantik vor, die die Basis für die konsistente Verknüpfung der drei wichtigsten UML Verhaltensdiagramme bildet. Es wird gezeigt, wie sich das grobe und übersichtliche Verhalten in Aktivitätsdiagrammen, das detaillierte Schnittstellenverhalten in Sequenzdiagrammen und detaillierte Verhaltensmodelle in Zustandsdiagrammen mit Hilfe der Semantik konsistent miteinander verknüpfen lassen und die Inhalte der drei genannten Verhaltensdiagramme eindeutig miteinander in Beziehung gebracht werden können. Als Anwendungsbeispiel wird die komplexe, sicherheits- und echtzeitkritische zugseitige Komponente des Europäischen Leit- und Sicherungstechnik Systems (ETCS) verwendet.

  9. Dürre, Waldbrände, gravitative Massenbewegungen und andere klimarelevante Naturgefahren

    NASA Astrophysics Data System (ADS)

    Glade, Thomas; Hoffmann, Peter; Thonicke, Kirsten

    Klimarelevante Naturgefahren sind auf vielfältige Faktoren zurückzuführen, deren Zusammenwirken in der Gesamtheit betrachtet werden muss. Die vorbereitenden, auslösenden und kontrollierenden Faktoren werden in unterschiedlichster Weise vom Klimawandel beeinflusst. Die Autoren beschreiben beobachtete Trends und Projektionen zu Dürre, Waldbränden, gravitativen Massenbewegungen (Muren, Fels- und Bergstürze, Hangrutschungen) und Schneelawinen sowie das Zusammenspiel der unterschiedlichen Ursachen. Sie diskutieren darüber hinaus, welche der Veränderungen in der Häufigkeit oder Stärke von Naturgefahren tatsächlich ausschließlich dem Klimawandel zuzuschreiben sein könnten und welche Anteile hierbei der direkte menschliche Einfluss hat und konstatieren, dass eine eindeutige Trennung häufig nicht vollzogen werden kann.

  10. Biokompatible Implantate und Neuentwicklungen in der Gynäkologie

    NASA Astrophysics Data System (ADS)

    Jacobs, Volker R.; Kiechle, Marion

    Für den Einsatz in der Gynäkologie stehen heute eine Vielzahl unterschiedlicher, biokompatibler Materialien und Implantate zur Verfügung. Auf eine Auswahl soll hier näher eingegangen werden, die die verschiedenen Materialien und Bauweisen repräsentieren. So sind Brustimplantate seit fast vier Jahrzehnten im Gebrauch für die Brustvergrösserung und den Brustwiederaufbau. Material, Bauweisen und medizinische Aspekte einschliesslich der kontroversen Diskussion um Silikon werden im folgenden erläutert. Neuere Entwicklungen von Verhütungstechniken für permanente Sterilisation wie den Filshie ClipTM für transabdominalen und den STOPTM für intraluminalen Verschluss der Eileiter oder die intrauterin plazierte Hormonspirale MirenaTM für zeitlich begrenzte Verhütung werden beschrieben. Eine neue Perspektive zur Verhinderung postoperativer intraabdominaler Adhäsionen stellt Spray-GelTM, ein Zweikomponenten-Hydrogel aus Polyethylenglykol, dar.

  11. Tissue Engineering in der Hals-Nasen-Ohrenheilkunde, Kopf- und Halschirurgie

    NASA Astrophysics Data System (ADS)

    Bücheler, Markus; Bootz, Friedrich

    Tissue Engineering ist eine Schlüsseltechnologie für den Gewebeersatz der Zukunft. Am Beispiel der Hals-Nasen-Ohrenheilkunde, Kopf- und Halschirurgie werden klinisch etablierte Gewebeersatzmethoden und aktuelle Entwicklungen des Tissue Engineering gegenübergestellt. Die Besonderheiten der zu ersetzenden Gewebe im Kopf- und Halsbereich erfordern vielfältige Ersatzverfahren. Im klinischen Alltag werden heute vor allem autogene Transplantate und Implantate für den Gewebeersatz verwendet [1]. In vitro hergestellte Gewebe werden abgesehen von Einzelanwendungen zur Zeit noch nicht am Patienten eingesetzt.

  12. Erfassung tribologischer Zusammenhänge und Erkenntnisse in einer Datenbank

    NASA Astrophysics Data System (ADS)

    Gold, Peter Werner; Jacobs, Georg; Loos, J.; Rombach, Volker; Kurutas, Savas; Fröde, Astrid

    Die Datenbank enthält die tribologischen und stofflichen Daten, die innerhalb des Sonderforschungsbereichs 442 ermittelt wurden. Sie wird genutzt, um Informationen über die Eigenschaften von Schmierstoffen und Werkstoffverbunden, sowie über Prüfstände und Versuchsergebnisse zu erhalten. Diese Wissensbasis kann dazu beitragen, ökologisch verträgliche Tribosysteme mit Hilfe der Auswerte-Systeme (Module) auszulegen. Dabei handelt es sich um Berechnungswerkzeuge und Expertenwissen z. B. in Bezug auf die Auswahl von Schichtsystemen. Eine nähere Beschreibung der Auswerte-Systeme kann den folgenden Abschnitten entnommen werden.

  13. Der deutsche Röntgensatellit ABRIXAS: Mission und wissenschaftliche Zielsetzung.

    NASA Astrophysics Data System (ADS)

    Predehl, P.

    ABRIXAS (A BRoad band Imaging X-ray All-sky Survey) ist ein Röntgensatellit mit sieben 27-fach genesteten Wolterteleskopen, die sich in ihren Brennpunkten eine pn-CCD Kamera teilen. ABRIXAS soll im Frühjahr 1999 auf einer russischen Cosmos-Rakete gestartet werden und die erste abbildende Himmelsdurchmusterung oberhalb von 2.4 keV durchführen. Man erwartet während der dreijährigen Mission wenigstens 10.000 neue Röntgenquellen zu entdecken. Dies sind vor allem solche Quellen, die durch vorgelagerte Staub- und Gasschichten für den ABRIXAS-Vorgänger ROSAT unsichtbar blieben. Darüber hinaus wird ABRIXAS hervorragend geeignet sein, ausgedehnte, diffuse Quellen spektroskopisch zu studieren und Intensitätsvariationen von Röntgenquellen auf sehr unterschiedlichen Zeitskalen zu untersuchen. Das Projekt ist eine wissenschaftliche Zusammenarbeit zwischen dem Astrophysikalischen Institut Potsdam (AIP), dem Institut für Astronomie und Astrophysik der Universität Tübingen (IAAT) und dem Max-Planck-Institut für extraterrestrische Physik (MPE).

  14. Von Start-ups lernen - Methoden und Entwicklungsprozesse, die Jungunternehmen erfolgreich machen

    NASA Astrophysics Data System (ADS)

    Böhme, Eckhart

    Die Start-up-Bewegung bringt beständig sog. Disruptoren hervor, die jede Branche betreffen und so gut wie keinen Lebensbereich auslassen. Diese Jungunternehmen, insbesondere aus der Softwarebranche, verfügen zwar nicht über Ressourcen wie etablierte Unternehmen, sie sind jedoch agil, "hungrig", können frei von "Ballast" agieren und treiben die Digitalisierung aller Branchen voran. Aber auch Start-ups können nicht einfach ungetestete Ideen in erfolgreiche Produkte oder Dienstleistungen umwandeln und ihren Erfolg dem Zufall überlassen. Erfolgreiche Jungunternehmen folgen vielmehr einem strukturierten Prozess, um marktgetestete Nutzenversprechen und Geschäftsmodelle zu entwickeln. Zunehmend adaptieren etablierte Unternehmen innovative Entwicklungsprozesse und Methoden. Die Fragestellung für Energieversorgungsunternehmen (EVUs) lautet, welche der Methoden, Werkzeuge und Entwicklungsprozesse, die heute bei vielen Start-ups eingesetzt werden, sie aufgreifen können, um das Unternehmen gegenüber Disruptoren robust zu machen.

  15. Einstellung und Wissen von Lehramtsstudierenden zur Evolution - ein Vergleich zwischen Deutschland und der Türkei

    NASA Astrophysics Data System (ADS)

    Graf, Dittmar; Soran, Haluk

    Es wird eine Untersuchung vorgestellt, in der Wissen und Überzeugungen von Lehramtsstudierenden aller Fächer zum Thema Evolution an zwei Universitäten in Deutschland und der Türkei erhoben worden sind. Die Befragung wurde in Dortmund und in Ankara durchgeführt. Es stellte sich heraus, dass ausgeprägte Defizite im Verständnis der Evolutionsmechanismen herrschen. Viele Studierende, insbesondere aus der Türkei, sind nicht von der Faktizität der Evolution überzeugt. Dies gilt sowohl für Studierende mit Fach Biologie als auch für Studierende mit anderen Fächern. Näher untersucht worden sind die Faktoren, die die Überzeugungen zur Evolution beeinflussen können, was ja in Anbetracht der hohen Ablehnungsrate der Evolution von besonderem Interesse ist. Das Vertrauen in die Wissenschaft spielt hierbei eine besondere Rolle: Wer der Wissenschaft vertraut, ist auch eher von der Evolution überzeugt, als diejenigen, die skeptisch gegenüber der Wissenschaft sind.

  16. Smart Meter Rollout: Intelligente Messsysteme als Schnittstelle zum Kunden im Smart Grid und Smart Market

    NASA Astrophysics Data System (ADS)

    Vortanz, Karsten; Zayer, Peter

    Das Gesetz zur Digitalisierung der Energiewende ist verabschiedet. Ab 2017 sind moderne Messeinrichtungen (mME) und intelligente Messsysteme (iMSys) zu verbauen und zu betreiben. Der "deutsche Weg" für die Einführung von Smart Metern sieht einen stufenweisen Rollout sowie ein Höchstmaß an Informations- und Datensicherheit vor. Dabei spielen iMSys und mME eine wichtige Rolle bei der Neugestaltung der intelligenten Netze (Smart Grids) und des neuen Marktmodells (Smart Market). Dieser Beitrag beschäftigt sich mit den neuen Gesetzen, den Marktrollen und ihren Aufgaben, Datenschutz und Datensicherheit, dem iMSys als sichere Lösung, dem sicheren Betrieb von Smart Meter Gateways, Smart Grid - Smart Market, dem Zusammenspiel zwischen reguliertem Bereich und Markt, den Einsatzbereichen der iMSys sowie den Auswirkungen auf Prozesse und Systeme und gibt Handlungsempfehlungen.

  17. Zu einer inhaltsorientierten Theorie des Lernens und Lehrens der biologischen Evolution

    NASA Astrophysics Data System (ADS)

    Wallin, Anita

    Der Zweck dieser Studie (zwecks Überblick siehe dazu Abb. 9.1) war zu untersuchen, wie die Schüler der Sekundarstufe II ein Verständnis von der Theorie der biologischen Evolution entwickeln. Ausgehend von den Vorurteilen der Schüler wurden Unterrichtssequenzen entwickelt und drei verschiedene Lernexperimente in einem zyklischen Prozess durchgeführt. Das Wissen der Schüler wurde vor, während und nach den Unterrichtssequenzen mit Hilfe von schriftlichen Tests, Interviews und Diskussionsrunden in kleinen Gruppen abgefragt. Etwa 80 % der Schüler hatten vor dem Unterricht alternative Vorstellungen von Evolution, und in dem Nachfolgetest erreichten circa 75 % ein wissenschaftliches Niveau. Die Argumentation der Schüler in den verschiedenen Tests wurde sorgfältig unter Berücksichtigung der Vorurteile, der konzeptionellen Struktur der Theorie der Evolution und der Ziele des Unterrichts analysiert. Daraus konnten Einsichten in solche Anforderungen an Lehren und Lernen gewonnen werden, die Herausforderungen an Schüler und Lehrer darstellen, wenn sie anfangen, evolutionäre Biologie zu lernen oder zu lehren. Ein wichtiges Ergebnis war, dass das Verständnis existierender Variation in einer Population der Schlüssel zum Verständnis von natürlicher Selektion ist. Die Ergebnisse sind in einer inhaltsorientierten Theorie zusammengefasst, welche aus drei verschiedenen Aspekten besteht: 1) den inhaltsspezifischen Aspekten, die einzigartig für jedes wissenschaftliche Feld sind; 2) den Aspekten, die die Natur der Wissenschaft betreffen; und 3) den allgemeinen Aspekten. Diese Theorie kann in neuen Experimenten getestet und weiterentwickelt werden.

  18. Einfluss des Internets auf das Informations-, Einkaufs- und Verkehrsverhalten

    NASA Astrophysics Data System (ADS)

    Nerlich, Mark R.; Schiffner, Felix; Vogt, Walter

    Mit Daten aus eigenen Erhebungen können das einkaufsbezogene Informations- und Einkaufsverhalten im Zusammenhang mit den verkehrlichen Aspekten (Distanzen, Verkehrsmittel, Wegekopplungen) dargestellt werden. Die Differenzierung in die drei Produktkategorien des täglichen, mittelfristigen und des langfristigen Bedarfs berücksichtigt in erster Linie die Wertigkeit eines Gutes, die seine Erwerbshäufigkeit unmittelbar bestimmt. Der Einsatz moderner IKT wie das Internet eröffnet dem Endverbraucher neue Möglichkeiten bei Information und Einkauf. Die verkehrliche Relevanz von Online-Shopping wird deutlich, wenn man berücksichtigt, dass im Mittel rund 17% aller Online-Einkäufe, die die Probanden durchgeführt haben, Einkäufe in Ladengeschäften ersetzen. Dies gilt in verstärktem Maße für Online-Informationen: etwa die Hälfte hätte alternativ im stationären Einzelhandel stattgefunden. Da der Erwerb von Gütern des täglichen Bedarfs häufig nahräumlich und in relevantem Anteil nicht-motorisiert erfolgen kann, sind in diesem Segment - im Gegensatz zum mittel- und langfristigen Bedarf - nur geringe Substitutionseffekte zu beobachten.

  19. XML-basierte Produkt- und Prozessdaten für die Leittechnik-Projektierung

    NASA Astrophysics Data System (ADS)

    Schleipen, Miriam

    Für die Überwachung und Steuerung hochkomplexer Produktionsprozesse werden Prozessleitsysteme eingesetzt. Ständige Veränderungen zwingen Produktionsbetriebe, wandlungsfähig zu sein. Entsprechend muss auch die Technik diese Flexibilität unterstützen. Jede Veränderung des Produktionsprozesses muss eingeplant, die Anlagen müssen neu konfiguriert und projektiert werden. Dabei müssen auch neue Prozessbilder für die Bedien- und Steuerungssysteme erstellt werden. Am Fraunhofer IITB wurde ein Engineering-Framework entwickelt, das das Leitsystem automatisch projektiert und die zugehörige Prozessvisualisierung generiert. In diesem Beitrag wird das Modul vorgestellt, das die Prozessabbilder erstellt. Neben der Visualisierung von Anlagen werden auch laufende Prozesse und bearbeitete Produkte dargestellt. So können beispielsweise Identsysteme mit der Leittechnik gekoppelt werden.
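    Der beschriebene Datenfluss - XML-Anlagenbeschreibung hinein, Prozessbild-Elemente heraus - lässt sich grob so skizzieren. Schema, Element- und Funktionsnamen sind hier frei erfunden, da das tatsächliche Datenformat des IITB-Frameworks in diesem Beitrag nicht wiedergegeben ist:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetisches Minimalschema einer Anlagenbeschreibung;
    # das reale Framework nutzt ein deutlich umfangreicheres Format.
    ANLAGEN_XML = """
    <Anlage name="Linie1">
      <Komponente id="P1" typ="Pumpe" x="10" y="20"/>
      <Komponente id="V1" typ="Ventil" x="40" y="20"/>
    </Anlage>
    """

    def erzeuge_prozessbild(xml_text):
        """Leitet aus der XML-Anlagenbeschreibung eine einfache Liste
        von Bildelementen (Symboltyp + Position) fuer die Visualisierung ab."""
        wurzel = ET.fromstring(xml_text)
        bild = []
        for komponente in wurzel.iter("Komponente"):
            bild.append({
                "id": komponente.get("id"),
                "symbol": komponente.get("typ").lower(),
                "position": (int(komponente.get("x")), int(komponente.get("y"))),
            })
        return bild
    ```

    Eine solche generierte Elementliste kann anschließend von einem Renderer in konkrete Prozessbilder des Bedien- und Steuerungssystems übersetzt werden.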

  20. Kometen und Asteroiden

    NASA Astrophysics Data System (ADS)

    Borgeest, Ulf; Staude, Jakob; Hahn, Gerhard; Harris, Alan W.; Jaumann, Ralf; Köhler, Ulrich; Kührt, Ekkehard; Schulz, Rita; Neukum, Gerhard; Arnold, Gabriele; Keller, Horst Uwe; Denk, Tilmann; Müller, Thomas; Wulff, André; Maruhn, Nicolaus; Fischer, Daniel; Trieloff, Mario; Althaus, Tilmann

    Contents: Die Kleinkörper des Sonnensystems. Kern, Koma und Schweife. Ziele der Planetenforschung. ROSETTA: Naherkundung von Kometen. Asteroiden: Trümmer aus planetarer Urzeit. Kleinkörper im Infrarotweltall. Selbst beobachten! Apocalypse - not now! Meteoriten - Boten aus der Urzeit des Sonnensystems.

  1. Herstellung von Chitosan und einige Anwendungen

    NASA Astrophysics Data System (ADS)

    Struszczyk, Marcin Henryk

    2001-05-01

    1. Die Deacetylierung von crabshell-Chitosan führte gleichzeitig zu einem drastischen Abfall der mittleren viscosimetrischen Molmasse (Mv), insbesondere wenn Temperatur und NaOH-Konzentration erhöht wurden. Diese Parameter beeinflussten jedoch nicht den Grad der Deacetylierung (DD). Wichtig ist jedoch die Quelle des Ausgangsmaterials: Chitin aus Pandalus borealis ist ein guter Rohstoff für die Herstellung von Chitosan mit niedrigem DD und gleichzeitig hoher mittlerer Mv, während Krill-Chitin (Euphausia superba) ein gutes Ausgangsmaterial zur Herstellung von Chitosan mit hohem DD und niedriger Mv ist. Chitosan, das aus Insekten (Calliphora erythrocephala) unter milden Bedingungen (Temperatur: 100 °C, NaOH-Konzentration: 40 %, Zeit: 1-2 h) hergestellt wurde, hatte hinsichtlich DD und Mv die gleichen Eigenschaften wie das aus Krill hergestellte Chitosan. Der Bedarf an Zeit, Energie und NaOH ist für die Herstellung von Insekten-Chitosan bei vergleichbaren Resultaten für DD und Mv geringer als für crabshell-Chitosan. 2. Chitosan wurde durch den Schimmelpilz Aspergillus fumigatus zu Chitooligomeren fermentiert; die Ausbeute betrug 25 %. Die Chitooligomere wurden mit Hilfe von HPLC und MALDI-TOF-Massenspektrometrie identifiziert. Die Fermentationsmischung fördert die Immunität von Pflanzen gegen Bakterien- und Virusinfektionen. Die Zunahme der Immunität schwankt jedoch je nach System Pflanze-Pathogen. Die Fermentation von Chitosan durch Aspergillus fumigatus könnte eine schnelle und billige Methode zur Herstellung von Chitooligomeren mit guter Reinheit und Ausbeute sein. Eine partiell aufgereinigte Fermentationsmischung dieser Art könnte in der Landwirtschaft als Pathogeninhibitor genutzt werden. Durch kontrollierte Fermentation, die Chitooligomere in definierter Zusammensetzung (d. h. definierter Verteilung des Depolymerisationsgrades) liefert, könnte man zu Mischungen kommen, die für die jeweilige Anwendung eine optimale Bioaktivität besitzen. 3

  2. Sternbilder und ihre Mythen

    NASA Astrophysics Data System (ADS)

    Fasching, Gerhard

    Aus den Besprechungen: "... Wem bei seinen philosophischen Höhenflügen allerdings die einfachsten Grundlagen fehlen, wer sich am Himmel ähnlich zurechtfindet wie ein Amazonasindianer im Großstadtverkehr, dem seien die Sternbilder und ihre Mythen ans Herz gelegt, die der Wiener Universitätsprofessor Gerhard Fasching zusammengestellt hat... Da werden Wegweiser-Sternkarten für das ganze Jahr gezeigt, die auch einem astronomischen Ignoranten die nächtliche Orientierung ermöglichen. Daneben werden die Sternsagen des Ovid opulent ausgebreitet, das überlieferte Wissen aus verschiedenen Kulturkreisen zitiert und wissenschaftliche Erklärungsmodelle zusammengetragen. Die moderne Weltsicht erscheint dabei nicht als der Weisheit letzter Schluß, sondern nur als derzeit anerkanntes Abbild der Wirklichkeit..." #Ulrich Schnabel/Die Zeit#

  3. Strahlen-und kinetische Waffen: Neue Waffentechniken und Rüstungskontrolle

    NASA Astrophysics Data System (ADS)

    Neuneck, Götz

    Laserstrahlen, Mikrowellen oder elektromagnetische Beschleuniger lassen sich nicht nur für zivile, sondern für militärische Zwecke einsetzen. Die Aufgabe einer vorbeugenden Rüstungskontrolle wäre es, diese wie andere künftige Waffentechnologien auf ihren destabilisierenden Charakter hin zu untersuchen und ihre Stationierung zu beschränken oder zu verhindern.

  4. Messen, Kalibrieren, Eichen in der Radiologie: Prinzipien und Praxis

    NASA Astrophysics Data System (ADS)

    Wagner, Siegfried R.

    Nach einleitender Erläuterung der unterschiedlichen Meßbedingungen in der Strahlentherapie und im Strahlenschutz werden die metrologischen Probleme am Beispiel der Größenkategorie Äquivalentdosis diskutiert. Als spezielle Größen werden effektive Äquivalentdosis und Umgebungs-Äquivalentdosis eingeführt. Es wird gezeigt, wie richtiges Messen durch ein konsistentes System von Bauartanforderungen an Meßgeräte, durch Kalibrieren und durch Eichen gewährleistet werden kann. Die Bedeutung von Meßunsicherheiten und Fehlergrenzen wird erläutert und ihre Auswirkung auf die Interpretation von Meßergebnissen behandelt. Translated Abstract: Measurements, Calibration, Verification in Radiology: Principles and Practice. The different measuring conditions in radiotherapy and in radiation protection are discussed in the introduction. Then, the metrological problems are discussed exemplarily with the dose equivalent as a category of quantity. Effective dose equivalent and ambient dose equivalent are introduced as special quantities. It is demonstrated how correct measurements can be secured by a consistent system of instrument pattern requirements, by calibration and verification. The importance of uncertainties of measurements and of error limits is illustrated and their influence on the interpretation of the results of measurements is treated.
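    Das Zusammenspiel von Kalibrieren und Fehlergrenzen bei der Interpretation eines Meßergebnisses lässt sich an einer minimalen Rechenskizze verdeutlichen (frei gewählte Zahlen und Funktionsnamen, keine Wiedergabe des Originalbeitrags):

    ```python
    def kalibrierter_wert(anzeige, kalibrierfaktor, eichfehlergrenze_rel):
        """Korrigiert den angezeigten Messwert mit dem aus der Kalibrierung
        gewonnenen Faktor und gibt zusaetzlich das durch die relative
        Eichfehlergrenze begrenzte Intervall zurueck, in dem der richtige
        Wert liegen muss."""
        wert = anzeige * kalibrierfaktor
        intervall = (wert * (1 - eichfehlergrenze_rel),
                     wert * (1 + eichfehlergrenze_rel))
        return wert, intervall
    ```

    Ein Meßergebnis ist damit erst zusammen mit seinem Fehlergrenzen-Intervall interpretierbar, nicht als nackter Zahlenwert.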

  5. Rückwärtsintegration - Zu den Verhältnissen Gymnasium, Hochschule und Arbeitswelt

    NASA Astrophysics Data System (ADS)

    Schmid, Gerhard; Heppner, Winfried; Focht, Eva

    In seiner 2007 erschienenen Sammlung von Vorträgen und Essays beschäftigt sich Wolfgang Frühwald mit der Frage "Wieviel Wissen brauchen wir?“ [1] Die Kernproblematik moderner Wissenschaft und Forschung sieht der Autor, emeritierter Ordinarius für Neuere Deutsche Literaturwissenschaft und von 1992 bis 1997 Präsident der Deutschen Forschungsgemeinschaft, einerseits in der zunehmenden Spezialisierung der Wissenschaftsbereiche, andererseits in der Gefahr der Abkoppelung der Naturwissenschaften von den Geisteswissenschaften. Wiederholt plädiert er dafür, über der rasanten Entwicklung beispielsweise in der Biologie und Physik die historische, gesellschaftliche und besonders die ethische Dimension der Forschung nicht zu übersehen, und fordert eine übergeordnete Theorie der Wissenschaft, die nur im Dialog zwischen den einzelnen Fachgebieten zu entwickeln sei.

  6. Arbeitsgestaltung und Mitarbeiterqualifizierung

    NASA Astrophysics Data System (ADS)

    Weiss-Oberdorfer, Werner; Hörner, Barbara; Holm, Ruth; Pirner, Evelin

    "Die Wertkette gliedert ein Unternehmen in strategisch relevante Tätigkeiten, um dadurch Kostenverhalten sowie vorhandene und potenzielle Differenzierungsquellen zu verstehen. Wenn ein Unternehmen diese strategisch wichtigen Aktivitäten billiger oder besser als seine Konkurrenten erledigt, verschafft es sich einen Wettbewerbsvorteil." (Michael Porter, 1985)

  7. Gebändigtes Knallgas: Brennstoffzellen im mobilen und stationären Einsatz

    NASA Astrophysics Data System (ADS)

    Waidhas, Manfred; Landes, Harald

    2001-07-01

    Die Brennstoffzelle hat aus technischer Sicht einen hohen Stand erreicht. Die PEMFC konnte ihre Zuverlässigkeit in einer Reihe von Nischenanwendungen, aber auch in Form erster mobiler und dezentraler Prototypen beweisen. Die SOFC und die MCFC konnten bereits in Anlagen von 100 kW und mehr in Erprobung gehen. Um jedoch wirtschaftlich konkurrenzfähig zu den etablierten Technologien der mobilen und dezentralen Energiewandlung zu werden, muss noch eine drastische Kostenreduktion sowohl beim Brennstoffzellen-Stack als auch bei den zu seinem Betrieb notwendigen Hilfsaggregaten erreicht werden. Für Fahrzeugantriebe muss außerdem eine Antwort auf die noch offene Treibstofffrage (Infrastruktur, H2-Erzeugung und H2-Speicherung) gefunden werden.

  8. IQM-Reifegradmodell für die Bewertung und Verbesserung des Information Lifecycle Management Prozesses

    NASA Astrophysics Data System (ADS)

    Baškarada, Saša; Gebauer, Marcus; Koronios, Andy; Gao, Jing

    Heutige Organisationen produzieren und speichern mehr Informationen als je zuvor. Der resultierende Informationsüberfluss, zusammen mit einem Mangel an Qualitätssicherung für das Information Lifecycle Management, führt zu einem unsicheren Status der Informationsqualität in vielen Organisationen. Weiterhin hat sich herausgestellt, dass das Bewerten, Verbessern und Steuern der Informationsqualität ein offenkundig schwieriges Unterfangen ist. Dieses Kapitel stellt ein Modell zur Bewertung und Verbesserung der Information Quality Management Capability Maturity (IQM-Reifegrad) vor. Es wird ein Satz von Kriterien vorgestellt, der aus Literaturrecherche und Fallstudien abgeleitet wurde. Die Reifegradindikatoren werden validiert und in einem mehrstufigen Reifegradmodell durch eine Delphi-Studie gruppiert. Das abgeleitete IQM-Reifegradmodell hilft Organisationen ihre bestehenden Praktiken im IQM zu bewerten und potentielle Lücken und Verbesserungsstrategien zu ermitteln.
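    Die Grundidee eines mehrstufigen Reifegradmodells - eine Stufe gilt als erreicht, wenn alle ihr zugeordneten Kriterien erfüllt sind - lässt sich stark vereinfacht skizzieren. Die Stufen- und Kriteriennamen sind hier frei erfunden; die tatsächlichen Kriterien des IQM-Reifegradmodells stammen aus der im Beitrag beschriebenen Delphi-Studie:

    ```python
    # Hypothetische Nachbildung eines gestuften Reifegradmodells.
    STUFEN = [
        ("Stufe 1 - chaotisch", set()),
        ("Stufe 2 - reaktiv", {"dq_probleme_erfasst"}),
        ("Stufe 3 - messend", {"dq_probleme_erfasst", "dq_metriken_definiert"}),
        ("Stufe 4 - gesteuert", {"dq_probleme_erfasst", "dq_metriken_definiert",
                                 "dq_verantwortliche_benannt"}),
    ]

    def reifegrad(erfuellte_kriterien):
        """Hoechste Stufe, deren Kriterien vollstaendig erfuellt sind."""
        erreicht = STUFEN[0][0]
        for name, kriterien in STUFEN:
            if kriterien <= set(erfuellte_kriterien):
                erreicht = name
        return erreicht
    ```

    Die Differenz zwischen erreichter und angestrebter Stufe ergibt dann unmittelbar die im Text genannten "Lücken und Verbesserungsstrategien".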

  9. Molekulare Diagnostik von Hautinfektionen am Paraffinmaterial - Übersicht und interdisziplinärer Konsensus.

    PubMed

    Sunderkötter, Cord; Becker, Karsten; Kutzner, Heinz; Meyer, Thomas; Blödorn-Schlicht, Norbert; Reischl, Udo; Nenoff, Pietro; Geißdörfer, Walter; Gräser, Yvonne; Herrmann, Mathias; Kühn, Joachim; Bogdan, Christian

    2018-02-01

    Nukleinsäure-Amplifikations-Techniken (NAT) wie die PCR sind hochsensitiv sowie selektiv und stellen in der mikrobiologischen Diagnostik wertvolle Ergänzungen zur kulturellen Anzucht und Serologie dar. Sie bergen aber gerade bei formalinfixiertem und in Paraffin eingebettetem Gewebe ein Risiko für sowohl falsch negative als auch falsch positive Resultate, welches nicht immer richtig eingeschätzt wird. Daher haben Vertreter der Deutschen Gesellschaft für Hygiene und Mikrobiologie (DGHM) und der Deutschen Dermatologischen Gesellschaft (DDG) einen Konsensus in Form einer Übersichtsarbeit erarbeitet, wann eine NAT am Paraffinschnitt angezeigt und sinnvoll ist und welche Punkte dabei in der Präanalytik und Befundinterpretation beachtet werden müssen. Da bei Verdacht auf eine Infektion grundsätzlich Nativgewebe genutzt werden soll, ist die PCR am Paraffinschnitt ein Sonderfall, wenn beispielsweise bei erst nachträglich aufgekommenem Verdacht auf eine Infektion kein Nativmaterial zur Verfügung steht und nicht mehr gewonnen werden kann. Mögliche Indikationen sind der histologisch erhobene Verdacht auf eine Leishmaniose, eine Infektion durch Bartonellen oder Rickettsien oder ein Ecthyma contagiosum. Nicht sinnvoll ist oder kritisch gesehen wird eine NAT am Paraffinschnitt zum Beispiel bei Infektionen mit Mykobakterien oder RNA-Viren. Die Konstellation für eine NAT aus Paraffingewebe sollte jeweils benannt werden; die erforderliche Präanalytik, die jeweiligen Grenzen des Verfahrens und die diagnostischen Alternativen sollten bekannt sein. Der PCR-Befund sollte entsprechend kommentiert werden, um Fehleinschätzungen zu vermeiden. © 2018 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  10. Digitalisierung in der Energiewirtschaft - empirische Untersuchung und Wertschöpfungskette

    NASA Astrophysics Data System (ADS)

    Dell, Timo

    Die Energiewirtschaft nutzt seit jeher digitale Strukturen zur Umsetzung ihrer Prozesse. Durch den neu verabschiedeten politischen Ordnungsrahmen - das Gesetz zur Digitalisierung der Energiewende - und durch die rasante Fortentwicklung technologischer Strukturen ergeben sich jedoch die Wertschöpfungsstufen erweiternde, diversifizierende und innovative Möglichkeiten für Energieversorger (EVU), Geschäftsfelder auszubauen bzw. neue zu erschließen. Dabei ist die digitale (R)Evolution keine rein technische Umsetzung, sondern insbesondere auch eine unternehmensinterne, strategische und intern-kulturelle Herausforderung.

  11. Ökophysik: Plaudereien über das Leben auf dem Land, im Wasser und in der Luft

    NASA Astrophysics Data System (ADS)

    Nachtigall, W.

    Prof. em. Dr. rer. nat. Werner Nachtigall, geb. 1934, war als Zoophysiologe und Biophysiker Leiter des Zoologischen Instituts der Universität des Saarlandes in Saarbrücken. In Forschung und Ausbildung hat er sich insbesondere mit Aspekten der Technischen Biologie und Bionik befasst und mit seinen Forschergruppen viele Basisdaten insbesondere zur Ökologie, Physiologie und Physik des Fliegens und Schwimmens aber auch zur Stabilität beispielsweise der Gräser erarbeitet. Lebewesen überraschen immer wieder durch ihre "Biodiversität", ihre hochspezifischen Ausgestaltungen und Anpassungen.

  12. Digitalisierung und Energie 4.0 - Wie schaffen wir die digitale Energiewende?

    NASA Astrophysics Data System (ADS)

    Irlbeck, Maximilian

    Die digitale Energiewende verändert nachhaltig die Systeme der "alten" Energiewelt. Ein Zusammenwachsen verschiedener Domänen im Energiesystem, die durch digitale Technologie möglich wird, birgt enorme Herausforderungen, ist aber notwendig, um die Energiewende und ihre Ziele zu meistern. Dieser Beitrag beschreibt die Wirkung der Digitalisierung auf das Energiesystem, listet Charakteristika der digitalen Energiewende auf und schildert für verschiedene Domänen mögliche Zielvorstellungen, die durch digitale Technologie umsetzbar sind. Am Ende erläutert der Beitrag Handlungsschritte, die auf dem Weg zu einem erneuerbaren Energiesystem gegangen werden sollten und zeigt Probleme und Risiken einer Fehlentwicklung auf.

  13. Zwischen Commonsense und Wissenschaft Mathematik in der Erziehungsphilosophie A. N. Whiteheads

    NASA Astrophysics Data System (ADS)

    Sölch, Dennis

    Obwohl Whitehead heute wie selbstverständlich als Philosoph rezipiert wird, so hat er seine wissenschaftliche Laufbahn doch als Mathematiker begonnen. Lange Zeit war er gemeinsam mit Bertrand Russell als Autor der Principia Mathematica unter Mathematikern und mathematischen Logikern deutlich besser bekannt als unter Philosophen. Doch selbst von denjenigen, die sich mit Whiteheads Überlegungen zur Metaphysik, zur Wissenschaftsgeschichte und zur Theologie befassen, werden seine Schriften zur Philosophie von Erziehung und Bildung häufig kaum beachtet. So entgeht es leicht, dass Whitehead nicht nur ein auf theoretischem Gebiet brillanter Mathematiker war, sondern sein theoretisches Fachwissen im Hinblick auf pädagogische und didaktische Relevanz fortwährend reflektiert hat.

  14. Digital Transformation Canvas - Übersicht behalten und Handlungsfelder gestalten

    NASA Astrophysics Data System (ADS)

    Köster, Michael; Mache, Tobias

    Im Beitrag "Digital Transformation Canvas - Übersicht behalten und Handlungsfelder gestalten" wird zunächst grob auf die wesentlichen Herausforderungen, die mit der zunehmenden Digitalisierung einhergehen, eingegangen. Anschließend werden ausgewählte Konzepte des Business Transformation Management vorgestellt, die sich mit der grundlegenden Weiterentwicklung von Organisationen - wie es die Digitalisierung erfordert - auseinandersetzen. Eine detaillierte Einführung in die Methodik des Business Transformation Canvas, der sich mit den unterschiedlichsten Gestaltungsfeldern der Transformation auseinandersetzt und ein Framework für Transformationsprojekte darstellt, rundet den Beitrag ab. Er schließt mit einem Fazit und Ausblick.

  15. Wie verstehen Schülerinnen und Schüler den Begriff der Unendlichkeit?

    NASA Astrophysics Data System (ADS)

    Schimmöller, Tabea

    Wie Hilbert bereits feststellte, wirkt die Idee der Unendlichkeit, wie keine andere, schon seit Zeiten sehr anregend und fruchtbar auf den Verstand und bewegt das Gemüt der Menschen. Der Begriff der Unendlichkeit bedarf aber auch, wie kein anderer, der Aufklärung, denn mit ihm eröffnet sich ein weites Feld, welches nicht nur aus vielen verschiedenen Definitionen besteht, sondern auch aus völlig unterschiedlichen Disziplinen. Physiker suchen immer dringender nach einer "Theorie für Alles" oder einer "Weltformel", Kosmologen beschäftigen sich unter anderem mit der Ewigkeit des Universums, Theologen interessiert eher die Unendlichkeit Gottes, Philosophen diskutieren unter anderem Grenzfragen zwischen Naturwissenschaft und Philosophie und die Mathematiker versuchen den Paradoxien des Unendlichen einen Sinn zu geben. Und so wird ersichtlich, dass nichts abstrakter ist als das Unendliche: Obwohl die Unendlichkeit für die unterschiedlichsten Wissenschaften von großer Bedeutung ist, "[ist] in der Wirklichkeit das Unendliche nirgends zu finden, [egal] was für Erfahrungen und Beobachtungen und welcherlei Wissenschaft wir auch heranziehen".

  16. Epidemiologie, Prävention und Früherkennung des Zervixkarzinoms

    PubMed Central

    Wentzensen, Nicolas

    2016-01-01

    Zusammenfassung Hintergrund Persistierende Infektionen mit humanen Papillomviren sind die notwendige Ursache des Zervixkarzinoms. Die Entwicklung von HPV-basierten Präventionsverfahren, der HPV-Impfung und der HPV-Testung, führt derzeit zu umfangreichen Veränderungen von Zervixkarzinom-Vorsorgeprogrammen. Eine Dekade nach Einführung der HPV-Impfung in vielen Ländern werden bereits Reduktionen von HPV-Infektionen und Krebsvorstufen bei jungen Frauen beobachtet. Der Fokus liegt jetzt auf der Integration von neuen Testverfahren im Screening von Populationen mit zunehmenden Impfraten. Ergebnisse und Schlussfolgerung Ein erfolgreiches Zervixkarzinom-Vorsorgeprogramm besteht aus verschiedenen Komponenten, vom primären Screening über die Triage bis zur Kolposkopie mit Biopsie, um Frauen mit Krebsvorstufen zu identifizieren, die eine therapeutische Intervention benötigen. Im primären Screening wird eine kleine Gruppe von Frauen mit erhöhtem Risiko für eine Krebsvorstufe identifiziert, während die große Mehrheit kein erhöhtes Risiko hat. Je nach primärem Testverfahren werden bei screen-positiven Frauen zusätzliche Triage-Tests durchgeführt, um zu entscheiden, wer zur Kolposkopie überwiesen werden sollte. Derzeit gibt es drei verschiedene Ansätze für das primäre Zervixkarzinomscreening: die Zervix-Zytologie, die HPV-Testung und die HPV-Zytologie-Ko-Testung. Zahlreiche Triage-Tests für HPV-positive Frauen werden derzeit untersucht, darunter die Zytologie, HPV-Genotypisierung, p16/Ki-67-Zytologie und diverse Methylierungstests. Die steigende Anzahl an Optionen für die Früherkennung des Zervixkarzinoms stellt eine Herausforderung für klinische Leitlinien dar. Die zunehmende Komplexität von Vorsorgeprogrammen kann zur Verunsicherung von Ärzten und von am Screening teilnehmenden Frauen führen. Die Präzisions-Prävention des Zervixkarzinoms ist ein neuer Ansatz, der umfangreiche Risikodaten basierend auf der individuellen Vorgeschichte und von

  17. Schönheit und andere Provokationen - Eine neue evolutionsbiologische Theorie der Kunst

    NASA Astrophysics Data System (ADS)

    Junker, Thomas

    Die Evolution hat viele spektakuläre Phänomene hervorgebracht - von der Eleganz des Vogelflugs über die gigantischen Körper der Dinosaurier und die farbenprächtige Vielfalt der Korallenriffe bis hin zu ihrem jüngsten Geniestreich - der menschlichen Kunst. Die schönen Künste - Malerei, Bildhauerei und Architektur, Theater, Tanz, Oper und Filmkunst, Musik und Literatur - Produkte der Evolution? Diese Vorstellung mutet vielen Menschen fremd an, aber wie könnte es anders sein? Denn wenn Charles Darwin recht hat, dann sind nicht nur die körperlichen Merkmale der Menschen als Antworten auf die Erfordernisse des Lebens entstanden, sondern auch ihre geistigen Fähigkeiten und Verhaltensweisen. Im Jahr 1859 hatte er auf den letzten Seiten seines berühmten Buches über die Entstehung der Arten eine kühne Prophezeiung gemacht: Durch die Evolutionstheorie werde es "zu einer bemerkenswerten Revolution in der Naturwissenschaft kommen […]. Die Psychologie wird auf die neue Grundlage gestellt, dass jede geistige Kraft und Fähigkeit notwendigerweise durch graduelle Übergänge erworben wird“ (Darwin 1859, S. 484, 488; Junker 2008).

  18. Von Donuts und Zucker: Mit Neutronen biologische Makromoleküle erforschen

    NASA Astrophysics Data System (ADS)

    May, Roland P.

    2003-05-01

    Für die Erforschung von Biomolekülen bieten Neutronen einzigartige Eigenschaften. Vor allem ihre unterschiedliche Wechselwirkung mit dem natürlichen Wasserstoff und seinem schweren Isotop Deuterium ermöglicht tiefe Einblicke in Struktur, Funktion und Dynamik von Proteinen, Nukleinsäuren und Biomembranen. Bei vielen Fragestellungen zur Strukturaufklärung gibt es kaum oder keine Alternative zum Neutron. Das Institut Laue-Langevin trägt Bahnbrechendes zum Erfolg der Neutronen-Methoden in der Biologie bei.

  19. Kraft-Wärmekopplung und Blockheiz-Kraftwerke BHKW

    NASA Astrophysics Data System (ADS)

    Zahoransky, Richard; Allelein, Hans-Josef; Bollin, Elmar; Oehler, Helmut; Schelling, Udo

    Die thermischen Wirkungsgrade von Kraftwerken zur Stromerzeugung sind relativ gering. Beispielsweise erreichen moderne Kohlekraftwerke heute bis etwa 45 %, Gasturbinen maximal 40 % und Diesel-Motoren nicht über 50 %. Kombinations-Kraftwerke (Gas- und Dampfturbinen-Prozesse) können an die 60 % thermischen Wirkungsgrad bei der Umwandlung der zugeführten Wärme in mechanische bzw. elektrische Energie erzielen. Ein ähnlich hoher Wert wird in Zukunft von den Brennstoffzellen erwartet. Der nicht in Arbeit umgewandelte Anteil der zugeführten Wärme fällt als Abwärme an und geht ungenutzt in die Umgebung. Ein Teil dieser Abwärme lässt sich durch entsprechende Installationen bei allen Kraftwerksprozessen zur Wassererwärmung oder zur Dampferzeugung für industrielle Zwecke nutzen. Für Heizzwecke genügt eine Temperatur der Abwärme von 60 °C bis 80 °C, während die Erzeugung von Industriedampf deutlich höhere Temperaturen voraussetzt.
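    Die genannten Wirkungsgrade lassen sich zu einer einfachen Überschlagsrechnung für den Gesamtnutzungsgrad einer KWK-Anlage kombinieren (Skizze mit frei gewählten Zahlen, kein Rechenbeispiel aus dem Beitrag):

    ```python
    def brennstoffausnutzung(eta_elektrisch, abwaerme_nutzanteil):
        """Gesamtnutzungsgrad einer KWK-Anlage: elektrischer Wirkungsgrad
        plus der als Heizwaerme genutzte Anteil der anfallenden Abwaerme
        (1 - eta_elektrisch)."""
        return eta_elektrisch + (1.0 - eta_elektrisch) * abwaerme_nutzanteil

    # Beispiel: GuD-Prozess mit 60 % elektrischem Wirkungsgrad, von der
    # Abwaerme werden 70 % fuer Heizzwecke genutzt.
    gesamt = brennstoffausnutzung(0.60, 0.70)
    ```

    Mit diesen angenommenen Zahlen ergibt sich ein Gesamtnutzungsgrad von 88 % - deutlich über dem rein elektrischen Wirkungsgrad, was den Grundgedanken der Kraft-Wärme-Kopplung illustriert.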

  20. Transplantate und Implantate im Mittelohrbereich - Teil 1 (Stand 2002)

    NASA Astrophysics Data System (ADS)

    Kempf, Hans-Georg; Lenarz, Thomas; Eckert, Karl-Ludwig

    In Deutschland leben ungefähr 12 Millionen Menschen, die an einer ein- oder beidseitigen Schwerhörigkeit leiden. Diese kann angeboren oder im Laufe des Lebens erworben sein. Klinisch und therapeutisch wichtig ist die Unterscheidung, ob die Ursache der Schwerhörigkeit im Bereich des Mittelohres, d. h. der Schallübertragung, oder im Bereich des Innenohres, der Hörnerven und der zentralen Hörbahnabschnitte, d. h. der Schallempfindung, liegt. 2,5 Millionen Schwerhörige haben dabei das Problem der Schallübertragung, d. h. die Störung liegt im Mittelohrbereich, und hier kann man in der Regel mit operativen, mikrochirurgisch durchgeführten Massnahmen helfen [1, 2]. Im Vordergrund steht als Ursache hier die chronische Mittelohrentzündung, die sich als Perforation des Trommelfells, als Defekt oder Unterbrechung der Gehörknöchelchen oder auch als Cholesteatom, eine sogenannte Knocheneiterung, äussern kann [3]. Therapeutisch und damit als Prinzip der operativen Hörverbesserung steht primär der Verschluss des Trommelfells oder eine Rekonstruktion der Gehörknöchelchen an.

  1. [COPD und Klangtherapie: Pilotstudie zur Wirksamkeit einer Behandlung mit Körpertambura bei COPD-Patienten].

    PubMed

    Hartwig, Bernhard; Schmidt, Stefan; Hartwig, Isabella

    2016-01-01

    Hintergrund: Erkrankungen der Atemorgane treten mit steigendem Alter öfter auf, nehmen weltweit zu und sind häufige Ursachen für Morbidität und Mortalität. In dieser Pilotstudie wurde der Frage nachgegangen, ob eine einmalige 10-minütige Behandlung mit einer Körpertambura eine signifikante und effektive Verbesserung der Lungenfunktion von Patienten mit chronisch-obstruktiver Lungenerkrankung (COPD; GOLD-Stadium A oder B) erbringen kann. Patienten und Methoden: 54 Probanden konnten je zur Hälfte in eine Behandlungsgruppe (Körpertambura) und eine aktive Kontrollgruppe (Atemtherapie) randomisiert werden. Die Lungenfunktionsmessparameter «Einsekundenkapazität» (FEV1) und «inspiratorische Vitalkapazität» (IVC) wurden zu den Zeitpunkten T1 (Baseline), T2 (direkt nach Behandlung) und als Follow-up etwa 3 Wochen nach T1 (T3) bestimmt. Ergebnisse: Die Behandlungsgruppe zeigte sich der Kontrollgruppe in beiden Werten signifikant überlegen. Die Zeit-×-Gruppe-Interaktion (Varianzanalyse) ergab p = 0,001 (FEV1) bzw. p = 0,04 (IVC). Die Behandlungsgruppe zeigte bei beiden Werten eine Verbesserung von klinischer Relevanz. Schlussfolgerung: Diese Ergebnisse zeigen, dass die Klangbehandlung mittels einer Körpertambura - neben den schulmedizinischen, leitliniengerechten Therapien - eine zusätzliche, nebenwirkungsarme, aber durchaus klinisch wirksame Option für die Behandlung von COPD-Patienten darstellen kann, um deren Lebensqualität zu stabilisieren und zu verbessern. © 2016 S. Karger GmbH, Freiburg.

  2. Das Prinzip Bewegung - Herz und Gehirn als Metaphern des menschlichen Lebens

    NASA Astrophysics Data System (ADS)

    Otis, Laura

    In diesem Jahr, in dem wir Charles Darwins gedenken, möchte ich etwas riskieren und eine Frage erörtern, die für die Literatur ebenso wie für die Biologie zentral ist: Was ist das Leben? Die Antwort auf diese Frage finden wir nicht in der Bibliothek und nicht im Labor, zumindest nicht an diesen erkenntnisproduzierenden Stellen allein. Als Literaturwissenschaftlerin und ehemalige Naturwissenschaftlerin glaube ich, dass wir das Leben nur verstehen werden, wenn wir seinen Wirkungen überall nachforschen, inklusive in der Literatur.

  3. Energiewende 4.0 - Chancen, Erfolgsfaktoren, Herausforderungen, Barrieren für Stadtwerke und Verteilnetzbetreiber

    NASA Astrophysics Data System (ADS)

    Rieger, Volker; Weber, Sven

    Energiewende und Digitalisierung transformieren die Energiewirtschaft in noch nicht da gewesenem Maße. Durch den Wandel des linearen, vertikalen Geschäftsmodells in ein horizontales und vernetztes entstehen neue Geschäftsmodelle, in die vermehrt neue Anbieter aus anderen Branchen und Start-ups eintreten. Auf Basis langjähriger Beratungserfahrung erläutern die Autoren die zukünftige Geschäftslogik der Energiewelt 4.0. Anhand von Beispielen aus anderen Branchen zeigen sie dabei wesentliche Handlungsfelder speziell für regionale Energieunternehmen auf. Um in der neuen Energiewelt relevant zu bleiben, müssen Energieversorger ihre Kunden in den Fokus rücken, sich für Partnerschaften öffnen, in die Leistungsfähigkeit ihrer Infrastruktur investieren und v. a. einen Kulturwandel hin zu mehr Agilität und Offenheit vollführen.

  4. Trizentrische Analyse von Kofaktoren und Komorbidität des Pyoderma gangraenosum.

    PubMed

    Jockenhöfer, Finja; Herberger, Katharina; Schaller, Jörg; Hohaus, Katja Christina; Stoffels-Weindorf, Maren; Ghazal, Philipp Al; Augustin, Matthias; Dissemond, Joachim

    2016-10-01

    Das Pyoderma gangraenosum (PG) ist eine seltene, inflammatorische, destruktiv-ulzerierende neutrophile Erkrankung mit weitgehend unklarer Pathophysiologie. In dieser Studie wurden die potenziell relevanten Kofaktoren und Begleiterkrankungen von Patienten mit PG aus drei dermatologischen Wundzentren in Deutschland differenziert ausgewertet. Von den insgesamt 121 analysierten Patienten waren Frauen (66,9 %) häufiger betroffen als Männer. Das Alter der Patienten lag zwischen 18 und 96 Jahren (Mittelwert [MW]: 59,8); die Wunden hatten eine Größe von 1-600 cm² (MW: 65,6 cm²) und waren überwiegend sehr schmerzhaft (VAS 1-10, MW: 7). Die Unterschenkel waren am häufigsten (71,9 %) betroffen. Bei 12 (9,9 %) Patienten bestanden chronisch-entzündliche Darmerkrankungen (5,8 % Colitis ulcerosa; 4,1 % Morbus Crohn), bei 14,1 % der Patienten wurde eine Begleiterkrankung aus dem rheumatischen Formenkreis beschrieben. Neoplasien bestanden bei 20,6 % der Patienten, von denen 6,6 % als hämatologische und 14,1 % als solide Neoplasien klassifiziert wurden. Aus dem Kreis des metabolischen Syndroms wurde bei 69,4 % der Patienten eine Adipositas, bei 57,9 % eine arterielle Hypertonie und bei 33,9 % ein Diabetes mellitus diagnostiziert. Diese Datenanalyse bestätigt Assoziationen des PG mit dem metabolischen Syndrom und mit Neoplasien, die zukünftig frühzeitig bei einer zielgerichteten Diagnostik der Patienten beachtet und behandelt werden sollten. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  5. [Leben im Eismeer - Tauchuntersuchungen zur Biologie arktischer Meerespflanzen und Meerestiere]

    PubMed

    Lippert; Karsten; Wiencke

    2000-01-01

    Die Maske wird nochmals auf Dichtigkeit überprüft, der Knoten der Sicherungsleine mit zwei halben Schlägen fixiert, dann rutscht die Taucherin von der Eiskante in das kalte Wasser. Eine halbe Stunde vergeht, bevor ihr Kopf wieder aus dem Eisloch auftaucht und sie ein großes Sammelnetz nach oben reicht, gefüllt mit verschiedenen Arten von Makroalgen. Obwohl noch große Flächen des Kongsfjordes im arktischen Spitzbergen zugefroren sind und das Festland von einer dicken Schneedecke bedeckt ist, hat unter Wasser in den Algenwäldern bereits der Sommer und damit die Saison der Meeresbiologen begonnen.

  6. Haptische Modellierung und Deformation einer Kugelzelle

    NASA Astrophysics Data System (ADS)

    Schippritt, Darius; Wiemann, Martin; Lipinski, Hans-Gerd

    Haptische Simulationsmodelle dienen in der Medizin in erster Linie dem Training operativer Eingriffe. Sie basieren zumeist auf physikalischen Gewebemodellen, welche eine sehr genaue Simulation der biomechanischen Eigenschaften des betreffenden Gewebes erlauben, aber gleichzeitig sehr rechenintensiv und damit zeitaufwändig in der Ausführung sind. Die menschliche Wahrnehmung kann allerdings auch eine ungenaue haptische Modellierung psychooptisch ausgleichen. Daher kann es sinnvoll sein, haptische Simulationen auch mit nicht vollständig physikalisch definierten Deformationsmodellen durchzuführen. Am Beispiel der haptischen Simulation einer In-vitro-Fertilisation wird gezeigt, dass durch die Anwendung eines geometrischen Deformationsmodells eine künstliche Befruchtung unter realistischen experimentellen Bedingungen in Echtzeit haptisch simuliert und damit trainiert werden kann.
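    Wie ein rein geometrisches (statt physikalisches) Deformationsmodell aussehen kann, zeigt folgende hypothetische Skizze: Membranpunkte einer Kugelzelle werden radial zum Zellzentrum verschoben, gewichtet mit einer Gauß-Funktion ihres Abstands zur Pipettenspitze. Modell und Parameter sind frei gewählt und geben nicht das konkrete Verfahren des Beitrags wieder:

    ```python
    import math

    def deformiere_membran(punkte, kontakt, tiefe, einflussradius):
        """Geometrisches Deformationsmodell ohne Physik: jeder Punkt
        (x, y, z) auf der Membran wird radial zum Zellzentrum (Ursprung)
        verschoben; die Verschiebung klingt mit einer Gauss-Gewichtung
        des Abstands zur Pipettenspitze (kontakt) ab."""
        neue_punkte = []
        for p in punkte:
            gewicht = math.exp(-(math.dist(p, kontakt) / einflussradius) ** 2)
            radius = math.dist(p, (0.0, 0.0, 0.0)) or 1.0
            faktor = 1.0 - tiefe * gewicht / radius
            neue_punkte.append(tuple(koord * faktor for koord in p))
        return neue_punkte
    ```

    Da pro Punkt nur wenige Grundrechenoperationen anfallen, bleibt eine solche Auswertung auch bei haptischen Raten (ca. 1 kHz) echtzeitfähig - das ist der im Text genannte Vorteil gegenüber physikalischen Gewebemodellen.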

  7. Biologie statt Philosophie? Evolutionäre Kulturerklärungen und ihre Grenzen

    NASA Astrophysics Data System (ADS)

    Illies, Christian

    Vor über siebzig Jahren fand man in einer Höhle nahe Hohlenstein-Stadel, im heutigen Baden-Württemberg, eine Frau, die keiner bekannten Spezies und nicht einmal eindeutig den Hominiden zugeordnet werden konnte. Wegen ihres Aussehens wurde sie schon bald als "Löwenfrau“ bekannt (unterdessen wird sie als "Löwenmensch“ bezeichnet, da die in solchen Fragen Klarheit schaffenden Geschlechtsteile bei der Figur fehlen und in Zeiten von gender mainstreaming derartige Festlegungen gerne vermieden werden), denn sie hatte eine menschlich-aufrechte, unbehaarte Gestalt mit weiblichen Rundungen, aber zugleich eine Mähne, sowie Augen, Ohren und Schnauze eines Löwen. Eine sehr weitläufige Verwandte des Minotaurus, so schien es, und doch wesentlich älter als alle Bewohner des Olymps, denn vermutlich wurde die knapp 30 cm große Skulptur bereits in der Altsteinzeit vor etwa 32.000 Jahren aus Mammut-Elfenbein geschnitzt. Wir wissen nicht, ob sie kultischen Zwecken diente oder ein Kind mit ihr spielte, ob sie als Glücksbringer für die Jagd oder als Schamanin mit Löwenmaske verehrt und gefürchtet wurde. Aber die Löwenfrau legt nahe, dass der Mensch schon im Morgendämmern seiner Kultur über die eigene Nähe, aber auch Distanz zum Tier nachgedacht haben muss. Die Frage nach der menschlichen Selbstverortung begegnet uns in dieser Figur, und sie bestimmt viele Zeugnisse menschlichen Nachdenkens, welche uns die Altertumswissenschaften vorlegen. Mit dem Begriff "animal rationale“, wie er unter Bezug auf Aristoteles geprägt wurde, findet sie schließlich ihre klassische, für das Abendland lange Zeit maßgebliche Antwort: Der Mensch als Tier, dessen spezifisches Merkmal die Vernunftbegabtheit ist, die ihn zugleich von allen anderen Tieren abgrenzt und über sie stellt. Aber wo genau verläuft die Grenze? Und wie kann der Mensch beides zugleich sein? Die aristotelische Definition beantwortet diese Fragen nach der Doppelnatur nicht, sondern erhebt das offene R

  8. Use and effectiveness of systemic therapies in adults with severe atopic dermatitis: first results of the German atopic dermatitis registry TREATgermany.

    PubMed

    Schmitt, Jochen; Abraham, Susanne; Trautmann, Freya; Stephan, Victoria; Fölster-Holst, Regina; Homey, Bernhard; Bieber, Thomas; Novak, Natalija; Sticherling, Michael; Augustin, Matthias; Kleinheinz, Andreas; Elsner, Peter; Weidinger, Stephan; Werfel, Thomas

    2017-01-01

    Clinical care registries record the use and effectiveness of therapies under real-world conditions and are indispensable as a basis for evidence-based health care. The German atopic dermatitis registry TREATgermany, initiated in 2011, was the world's first registry for patients with severe atopic dermatitis. Adults with severe atopic dermatitis (current/previous anti-inflammatory systemic therapy and/or objective SCORAD ≥ 40) are observed prospectively over a period of 24 months. Validated instruments are used to record clinical disease severity (EASI, SCORAD), quality of life (DLQI), symptoms, global disease severity, and patient satisfaction, and the therapies administered are documented. The present analysis describes the characteristics, choice of therapy, and effectiveness of the anti-inflammatory systemic therapies for the patients enrolled up to October 2014. A total of 78 patients (mean age 39 years, 61 % male) were enrolled at five centers. The patients show high utilization of outpatient and inpatient services. Ciclosporin was the most frequently used systemic agent and showed the highest clinical effectiveness (EASI-50 response rate 51 %; EASI-75 response rate 34 % after twelve weeks of therapy). Azathioprine, methotrexate (MTX), oral prednisolone, mycophenolate, alitretinoin, and leflunomide were also used in individual patients. This registry analysis provides important insights into the current care of adults with severe atopic dermatitis in Germany and documents the high disease burden, the benefit of available therapies, and the need for additional therapeutic options that are effective and safe in long-term use. © 2017 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  9. Physics yesterday and today: from the metal rod to the high-energy laser

    NASA Astrophysics Data System (ADS)

    Heering, Peter

    2002-05-01

    In May 1752, at Marly near Paris, the electrical nature of lightning was demonstrated for the first time, at the suggestion of the American scientist and politician Benjamin Franklin. At that time, Franklin also described a technical device intended to protect buildings from lightning strikes: the lightning rod. From today's perspective this seemingly trivial device was by no means immediately accepted. And to this day, research on protecting installations from lightning strikes is not complete.

  10. Technical systems for heart replacement and cardiac assistance

    NASA Astrophysics Data System (ADS)

    Schöb, Reto; Loree, Howard M.

    Heart disease causes more than 700,000 deaths each year in the United States alone. According to the American Heart Association (AHA) and the National Heart, Lung and Blood Institute (NHLBI), approximately 3 million patients in the U.S. suffer from congestive heart failure (CHF), a chronic, severely debilitating, and degenerative disease in which the heart is unable to pump sufficient blood to the organs of the body. More than 400,000 cases of CHF are diagnosed every year, and similar numbers are estimated for Europe and Japan combined. Based on data from the AHA and NHLBI, the five-year survival rate for CHF patients is only about 50 % [1]. 70,000-120,000 of these patients could benefit from a heart transplant, yet in 1999 only 2,185 heart transplantations were performed in the USA, while the waiting list comprises more than 4,000 patients [2]. An acute shortage of donor hearts and the enormous costs (250,000-400,000 USD per patient) are the limiting factors for heart transplantation [3]. This means that a vast number of patients could be saved by a reliable, wear-free, non-thrombotic, fully implantable artificial heart. To date, however, no such implant is commercially available.

  11. Modular and consistent product models as a success factor for serving an omni-channel architecture - PLM 4.0

    NASA Astrophysics Data System (ADS)

    Golovatchev, Julius; Felsmann, Marcus

    With the transformation of value-creation structures from Utility 1.0 to Utility 4.0, the product itself is evidently changing as well. Against the backdrop of disruptive technologies (IoT, big data, cloud, robotics, etc.) and of societal change, new business models and products are constantly emerging that go beyond the pure supply service (e.g., electricity). In the process, product data, a valuable raw material for smart products, must be made usable more consistently and more quickly. Modular, end-to-end product structures help master this complexity and are thus a key lever for successful product development and product management. This contribution describes approaches by which companies facing the Utility 4.0 challenge can model smart-energy products in such a way that the interoperability of the individual product modules is ensured and end-to-end management becomes possible.

  12. German "National Cancer Aid Monitoring" 2015-2019 - study protocol and first results.

    PubMed

    Schneider, Sven; Görig, Tatiana; Schilling, Laura; Breitbart, Eckhard W; Greinert, Rüdiger; Diehl, Katharina

    2017-09-01

    The project "National Cancer Aid Monitoring of Tanning Bed Use" (NCAM) is a large German study aimed at monitoring the most important risk factors for skin cancer: natural sunlight and artificial UV radiation. NCAM is a nationwide cross-sectional study with an initial four rounds of data collection (so-called waves) between 2015 and 2018. Each year, a nationally representative sample of 3,000 individuals aged 14 to 45 years is surveyed. The cross-sectional survey is supplemented by a cohort of n = 450 current tanning bed users. The first wave in 2015 found an overall prevalence of tanning bed use of 29.5 %; eleven percent of all participants had used a tanning bed within the previous twelve months. Determinants of current tanning bed use included younger age, female gender, and full-time/part-time employment. The main motives reported for using a tanning bed were relaxation and enhanced attractiveness. NCAM is the first study worldwide to monitor skin cancer risk factors at annual intervals in a large, nationally representative sample. First results indicate that millions of Germans use tanning beds despite WHO warnings, and that many of these users are adolescents - despite legal restrictions intended to prevent the use of tanning beds by minors. © 2017 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  13. Uniform National Discharge Standards (UNDS): Outreach

    EPA Pesticide Factsheets

    Describes the Federalism and Tribal consultation efforts related to the Uniform National Discharge Standards (UNDS) and links to copies of each presentation, both to state and local representatives, as well as federally-recognized tribes.

  14. Sentinel lymph node biopsy of melanoma using indocyanine green and the "FOVIS" system.

    PubMed

    Göppner, Daniela; Nekwasil, Stephan; Jellestad, Anne; Sachse, Alexander; Schönborn, Karl-Heinz; Gollnick, Harald

    2017-02-01

    The detection of metastatic infiltrates in the sentinel lymph node (SLN) is considered a key prognostic factor in melanoma. As an alternative to the dye method with patent blue, complementing the gold standard of radiocolloid-based SLN biopsy (SLNB), fluorescence-optical imaging using indocyanine green (ICG) and a near-infrared (NIR) camera system has been reported. The value of the ICG/NIR procedure was examined in comparison with the conventional method, as a function of the patient's body mass index (BMI) and of the ICG concentration, with respect to the visualization of lymphatic drainage and the SLN. SLNB was performed in ten patients using technetium-99m, patent blue, and ICG. Fluorescence imaging of lymphatic vessels and SLN was carried out in real time with the "FOVIS" NIR camera system. Depending on the image quality achieved, ICG was injected intracutaneously at doses of 0.25 mg to 2.5 mg. Nine of the ten SLNs were identified by fluorescence (90 %), all ten radioactively (100 %), and only eight (80 %) by ICG green staining or patent blue marking, respectively. One SLN was visualized transdermally (10 %). In correlation with BMI, higher ICG doses, up to an absolute 2.5 mg intracutaneously, were advantageous for visualizing the lymphatic vessels. SLN fluorescence marking with the ICG/NIR camera system "FOVIS" is a safe alternative to the patent blue dye method, complementary to the technetium-99m radiocolloid method. Further studies on the optimal ICG dose and on transdermal imaging in relation to BMI are needed. © 2017 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  15. Obstacle detection and tracking with a PMD camera in the automobile

    NASA Astrophysics Data System (ADS)

    Schamm, Thomas; Vacek, Stefan; Natroshvilli, Koba; Marius Zöllner, J.; Dillmann, Rüdiger

    The detection of obstacles in front of the automobile is a core requirement for modern driver assistance systems (DAS). This work presents a system that uses a PMD (photonic mixer device) camera to detect obstacles in the lane and determine their relevant parameters. The PMD camera first generates 3D depth images of the vehicle's surroundings. After an initial filtering step, obstacles are sought in the depth image using a region-growing procedure. A Kalman filter is used to stabilize the procedure and to compute the parameters. The result is a list of all obstacles in the vehicle's driving corridor.
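The region-growing step described in this abstract can be sketched as a simple flood fill over a depth image: neighboring pixels are merged into one obstacle when they lie within sensor range and their depths differ only slightly. This is a hypothetical minimal illustration, not the system from the paper; the `find_obstacles` function, the grid, and all thresholds are invented for the example, and the Kalman-filter tracking stage is omitted.

```python
from collections import deque

def find_obstacles(depth, max_range=20.0, tol=0.5, min_size=2):
    """Group adjacent pixels of similar depth into obstacle clusters
    (4-connected region growing), ignoring pixels beyond max_range."""
    rows, cols = len(depth), len(depth[0])
    seen = [[False] * cols for _ in range(rows)]
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c] or depth[r][c] >= max_range:
                continue
            # grow a region from this seed pixel (breadth-first)
            queue, region = deque([(r, c)]), []
            seen[r][c] = True
            while queue:
                y, x = queue.popleft()
                region.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx]
                            and depth[ny][nx] < max_range
                            and abs(depth[ny][nx] - depth[y][x]) <= tol):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if len(region) >= min_size:
                # report pixel count and mean distance per obstacle
                mean_d = sum(depth[y][x] for y, x in region) / len(region)
                clusters.append({"pixels": len(region),
                                 "distance": round(mean_d, 2)})
    return clusters
```

On a toy 4 × 6 depth grid with two nearby objects against a distant background, the function returns two clusters with their mean distances; a real system would then feed these cluster parameters into the tracking filter.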

  16. Implants and procedures in ophthalmology

    NASA Astrophysics Data System (ADS)

    Neuhann, Tobias H.

    The most frequently used implant in medicine worldwide is the intraocular lens (IOL). The reasons for this are manifold: on the one hand, surgical techniques have become substantially more consistent, successful, and efficient over the past 30 years; on the other hand, the increased demands of everyday and professional life in industrialized nations place higher demands on vision. When the human lens is the cause of poor vision, there is usually a clouding of the lens protein. This clouding is colloquially called "Grauer Star" and scientifically cataract (cataracta). There are different forms, such as congenital (congenita) or acquired, traumatic, disease-related, or age-related forms [45]. Once the clouded lens has been removed by modern surgical procedures, a replacement for this refractive medium must be provided [2].

  17. The German Statistical Society in the Weimar Republic and under the Nazi dictatorship

    NASA Astrophysics Data System (ADS)

    Wilke, Jürgen

    After initial difficulties caused by World War I, the German Statistical Society (DStatG) attained high standing through a wide range of activities under the renowned statistician and DStatG chairman Friedrich Zahn. There were efforts to integrate statisticians from all fields of statistics into the DStatG, although "mathematical statistics" was accepted only hesitantly (business-cycle research, time series analysis). After Adolf Hitler's seizure of power in 1933, the DStatG was drawn into the wake of National Socialist ideology and politics (Führer principle, forced alignment of associations), which entailed a personnel restructuring of the DStatG. The politically undesirable and the racially persecuted had to leave the DStatG (Bernstein, Freudenberg, Gumbel, among others). Among statisticians there were all gradations of behavior toward the regime, from rejection and forced conformity through willing fellow-traveling to deliberate perpetration. Population statistics in particular was discredited in the long term by Nazi racial policy. In the context of economic planning and rearmament, new, forward-looking statistical models were developed (Grünig, Bramstedt, Leisse).

  18. Terror with nuclear weapons: a real danger? Nuclear and radiological weapons

    NASA Astrophysics Data System (ADS)

    Harigel, Gert G.

    2006-01-01

    Could terrorists obtain nuclear weapons of mass destruction? To do so, they would have to steal sufficient quantities of weapons-grade fissile material. Even building a primitive atomic bomb requires great technical effort and specialists. The theft of a small tactical nuclear weapon is therefore more likely. Alternatively, terrorists could obtain radioactive material from civilian sources and build a dirty bomb from it. Such a radiological weapon would not be a true weapon of mass destruction, but its psychological impact could be severe. That makes it attractive to terrorists, which is why this danger must be taken seriously.

  19. Rosacea management: update on general measures and topical therapy options.

    PubMed

    Schaller, M; Schöfer, H; Homey, B; Hofmann, M; Gieler, U; Lehmann, P; Luger, T A; Ruzicka, T; Steinhoff, M

    2016-12-01

    Although no curative therapy for rosacea exists to date, various options can be recommended to treat the symptoms and prevent exacerbations. In addition to self-help measures such as avoiding trigger factors and using suitable skin care, rosacea management in patients with erythematous and mild to severe papulopustular rosacea should include topical agents as first-line therapy. Since overlap of the characteristic rosacea symptoms is the rule in everyday clinical practice, drug therapy should be tailored to the individual symptoms; combination therapy may also be required. Agents approved for the treatment of the main symptoms of rosacea include brimonidine for erythema as well as ivermectin, metronidazole, or azelaic acid for inflammatory lesions. Their efficacy has been demonstrated in numerous valid, well-controlled trials. In addition, various unapproved topical treatment options exist whose efficacy and safety remain to be examined in larger, controlled studies. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  20. Spitz nevi: distinct clinical, dermoscopic, and histopathological features in childhood.

    PubMed

    Dika, Emi; Neri, Iria; Fanti, Pier Alessandro; Barisani, Alessia; Ravaioli, Giulia Maria; Patrizi, Annalisa

    2017-01-01

    The characterization of the clinical features and biological potential of Spitz nevi has attracted broad interest in recent decades. The aim of this work is to describe the clinical and dermoscopic features of Spitz nevi as well as the clinical outcome after surgical excision of Spitz nevi in three pediatric age groups. Retrospective study analyzing clinical features, videodermoscopic images, histopathological diagnoses, and treatment outcomes. The degree of pigmentation was assessed both clinically and histopathologically. In 71 patients, 72 spitzoid neoplasms were excised. Videodermoscopic images were available for 41 patients. The pattern of pigmentation correlated significantly with patient age: hyperpigmented lesions were rare in preschool children but more frequent in patients aged 7 to 12 years and 13 years and older. A histopathological diagnosis of atypical Spitz nevus was rarely made. None of the patients originally diagnosed with an atypical Spitz nevus developed local recurrence or metastases during subsequent follow-up. Pigmented Spitz nevi occurred more frequently from age 13 onward. The study confirms other reports on the age distribution of pigmentation patterns and highlights the small number of atypical Spitz nevi in pediatric patients as well as the absence of recurrence during long-term follow-up. © 2017 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  1. Image-based navigation of a mobile robot using an omnidirectional and pannable camera

    NASA Astrophysics Data System (ADS)

    Nierobisch, Thomas; Hoffmann, Frank; Krettek, Johannes; Bertram, Torsten

    This contribution presents a novel approach to the decoupled control of camera gaze direction and the motion of a mobile robot in the context of image-based navigation. A pannable monocular camera keeps the features relevant for navigation in the field of view, independently of the robot's motion. The decoupling of the camera gaze direction from the actual robot motion is realized by projecting the features onto a virtual image plane. In the virtual image plane, the appearance of the visual features used for image-based control depends only on the robot's position and is invariant to the camera's actual gaze direction. The pannability of the monocular camera significantly enlarges, compared with a static camera, the workspace over which a single reference image is suitable for image-based control. This enables navigation even in environments that offer few usable texture and structure features.

  2. The interhalogen cations [Br2F5]+ and [Br3F8]+.

    PubMed

    Ivlev, Sergei; Karttunen, Antti; Buchner, Magnus; Conrad, Matthias; Kraus, Florian

    2018-05-02

    We report the synthesis and characterization of the only polyhalogen cations known to date that contain bridging fluorine atoms. The [Br2F5]+ cation contains a symmetric [F2Br-µ-F-BrF2] bridge; the [Br3F8]+ cation contains asymmetric µ-F bridges. The fluoronium ions were obtained as their [SbF6]- salts and investigated by Raman and 19F NMR spectroscopy as well as by single-crystal X-ray diffraction. Quantum-chemical calculations were carried out both for the isolated cations in the gas phase and for the solids themselves. Population analyses show that the µ-F atoms carry the most negative partial charges of all atoms in the cations. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Tannins from Potentilla officinalis have an anti-inflammatory effect in the UV erythema test and when applied to atopic skin.

    PubMed

    Hoffmann, Julia; Wölfle, Ute; Schempp, Christoph M; Casetti, Federica

    2016-09-01

    The rhizome of Potentilla officinalis (PO) is rich in tannins and is traditionally used for the external treatment of inflammation of the skin and mucous membranes. The aim of the present work was to confirm the anti-inflammatory properties of PO by means of a UV erythema test and a clinical application study on atopic skin. The anti-inflammatory effect of a PO extract (standardized to 2 % dry matter) was examined in a prospective, randomized, placebo-controlled, double-blind study with 40 healthy adults in the UV erythema test, in comparison with 1 % hydrocortisone acetate. In a prospective, uncontrolled study, the efficacy and tolerability of the 2 % PO cream were assessed in twelve adults and twelve children with atopic skin after application for two weeks to a defined test area, using a partial SCORAD. In addition, the effect on skin redness in the test area was measured photometrically. In the UV erythema test, the PO cream showed a significant reduction of the erythema index compared with the vehicle. The anti-inflammatory effect of the verum was equivalent to that of the 1 % hydrocortisone acetate cream. The clinical study in atopic patients showed a significant decrease in the partial SCORAD and in erythema in the test area. No intolerance reactions were observed. PO as a 2 % preparation has anti-inflammatory properties and is effective and well tolerated on atopic skin. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  4. Potential drug interactions and adverse drug reactions in dermatological inpatients.

    PubMed

    Koch, Lukas; Kränke, Birger; Aberer, Werner

    2016-11-01

    To present information on the frequency of drug interactions and adverse drug reactions, and to provide guidance on how these important problems in the pharmacological treatment of dermatological inpatients can be minimized. The medication of 1,099 dermatological inpatients was retrospectively analyzed for drug interactions and adverse drug reactions using internet-based drug interaction software (Diagnosia® Check). We describe an overall frequency of relevant drug interactions of 51.7 %, with an average of 3.2 interactions per affected inpatient. Drug combinations that should be avoided were identified in 5.7 % of the study population. The most important risk factor was the total number of drugs administered. The drug classes involved in the majority of interactions were analgesics, cardiovascular drugs, and anticoagulants, as well as antidepressants. The risk of developing adverse drug reactions was rated "high" in 53.1 % of inpatients. The five most important adverse reactions in this patient group were bleeding, constipation, anticholinergic effects, sedation, and orthostatic effects. Potential drug interactions and adverse drug reactions are alarmingly frequent in dermatological inpatients. Every second patient is at risk of such interactions or adverse reactions, and every twentieth patient receives a drug combination that should not be administered. Increased vigilance is required to identify the patients at risk. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  5. Isotropic and homogeneous matter-universes: on the dynamics and thermodynamics of isotropic matter-universes

    NASA Astrophysics Data System (ADS)

    Treder, H.-J.

    The dynamics and thermodynamics of large cosmic systems are nearly independent of the particular theories of gravitation; only the fine structure of cosmology and cosmogony reflects the special hypotheses. This neutrality with respect to the concrete gravodynamics is a consequence of the fundamental properties of gravitation: the principles of the equivalence of inertia and gravity.

  6. The foundations of television technology: system theory and technology of image transmission

    NASA Astrophysics Data System (ADS)

    Mahler, Gerhard

    A comprehensive introduction to the fundamentals of moving-image transmission, from the beginnings to the current state of digital television, with a system-theoretical analysis grounded in practice. The compact and vividly illustrated presentation, with elementary mathematical descriptions, makes it easy for the reader to work into image transmission technology. Thematic units extend the material - including visual perception, multidimensional signal representation, colorimetry, digitization, and electron optics - and show its application to electronic image transmission.

  7. Smart home, smart grid, smart meter - digital concepts and the right to data

    NASA Astrophysics Data System (ADS)

    Spiecker genannt Döhmann, Indra

    Modern energy management relies on an intelligently controlled energy information network, the smart grid. Within it, the smart meter - the intelligent metering point at the user's premises - is a central instrument for the mutual exchange of information. However, the information flows mandated by various laws raise substantial data protection questions. This contribution presents central data protection guidelines and problems, and also addresses open questions.

  8. The digitization of the energy industry: potential and challenges of the ICT sector for Utility 4.0

    NASA Astrophysics Data System (ADS)

    Aichele, Christian; Schönberger, Marius

    Energy companies still face many challenges on the path to digital transformation. A particular focus currently lies on modernizing IT systems. The starting point is that mobile applications, smartphones, tablet PCs, and smart TVs enjoy immense popularity among end consumers. Through these technologies, the physical and virtual worlds are being linked to an ever greater extent. A mobile application can trigger genuine hype and lastingly change behavior (one example is Pokémon Go, an app that combines a virtual game with the real environment and that, for the first time, managed to lure even die-hard gamers out of the anonymity of their homes and got them moving outdoors).

  9. Sound reinforcement technology, sound system planning, and simulation

    NASA Astrophysics Data System (ADS)

    Ahnert, Wolfgang; Goertz, Anselm

    The primary task of a loudspeaker system is to reproduce music, speech, or signal tones and sounds. These may come from a recording medium (CD, voice memory), be transmitted from another location (feeds via radio, TV, telephone), or be generated on site. The latter includes concerts, speeches, announcements, or artistic performances, where the goal is usually to make an already existing source accessible to a larger or more widely distributed audience.

  10. Interactive visualization of distances and extents of anatomical structures for intervention planning

    NASA Astrophysics Data System (ADS)

    Rössling, Ivo; Cyrus, Christian; Dornheim, Lars; Hahn, Peter; Preim, Bernhard; Boehm, Andreas

    In intervention planning, the surgeon must make therapy-relevant decisions based on the spatial relations of anatomical structures. Interactive 3D visualizations support this process qualitatively. Quantitative questions (tumor extent, infiltration depth, etc.) require the integration of measurements, whose usefulness depends substantially on a suitable presentation. In this work, we developed general criteria for the suitability of measurement visualizations in interactive 3D scenes. Guided by these, we examined various ways of displaying distances and extents of anatomical structures and parametrized their appearance accordingly. The values of these display parameters were evaluated with surgeons in a study of their visual effect. It emerged that the physicians surveyed placed the highest value on coherence and clear assignment of the measurements and, surprisingly, were even willing to accept compromises in direct readability in return.

  11. Accident recording and data collection

    NASA Astrophysics Data System (ADS)

    Brösdorf, Klaus-Dieter; Moser, Andreas; Burg, Jürgen

    Accidents occur with varying degrees of severity; a distinction is made between accidents involving only property damage and accidents involving personal injury. According to statistics [1], accidents with personal injury (336,619) accounted for about 15 % of the total number of police-recorded accidents (2,253,992) in Germany in 2005. The official statistics include only police-recorded accidents; a larger number of accidents, especially minor ones, is evidently not reported to the police. Based on data from the insurance industry, the number of motor vehicle damage claims per year in Germany is put at 8,673,000 [2].
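The roughly 15 % share quoted in this abstract follows directly from the two figures given; a quick arithmetic check (not part of the original text):

```python
# Share of injury accidents among all police-recorded accidents
# in Germany in 2005, using the figures quoted above.
injury_accidents = 336_619
all_recorded = 2_253_992
share = injury_accidents / all_recorded
print(f"{share:.1%}")  # 14.9%, i.e. roughly 15 %
```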

  12. "Überholen ohne einzuholen" ("overtaking without catching up"): the development of technologies for the day after tomorrow in the GDR's nuclear energy and microelectronics

    NASA Astrophysics Data System (ADS)

    Barkleit, Gerhard

    According to the largely unanimous view of politics and science, it is thanks to the nuclear stalemate between the Eastern Bloc and the Western alliance that the "Cold War" did not escalate into a global conflagration in the second half of the 20th century. Two Dresden physicists played a major role in the rapid establishment of this stalemate; one of them had worked on the Manhattan Project in the USA and was later convicted in England of espionage for the Soviet Union and of betraying the know-how of the atomic bomb.

  13. Prognostic value of lymphatic vessel area and density in cutaneous squamous cell carcinoma.

    PubMed

    Krediet, Jorien Tannette; Kanitakis, Jean; Bob, Adrienne; Schmitter, Julia; Carine Krediet, Annelot; Röwert, Joachim; Stockfleth, Eggert; Painsi, Clemens; Hügel, Rainer; Terhorst, Dorothea; Lange-Asschenfeldt, Bernhard

    2016-11-01

    Cutaneous squamous cell carcinomas (SCC) are known for their ability to metastasize via lymphatic vessels. Recent studies cite the extent of lymphangiogenesis as a possible prognostic factor in some skin tumors. The aim of this study was to quantify lymphangiogenesis in SCC either by computer-assisted image analysis or by the Chalkley counting method. Vascular parameters were assessed and compared with respect to their predictive power for tumor metastasis. In this case-control study, the clinical and histological data of 15 SCC patients each, with and without metastases, were analyzed retrospectively. In the SCC samples, the lymphatic endothelium-specific marker D2-40 and the pan-vascular marker CD31 were stained immunohistochemically and analyzed by computer-assisted morphometric image analysis in hotspots as well as by the digitized Chalkley counting method. Lymphatic vessel density, relative lymphatic vessel area, and the Chalkley count of lymphatic vessels were significantly increased in metastatic SCC. Tumor thickness was significantly greater in metastatic SCC and had the highest predictive power for metastasis; it was also a significant predictor of the lymphangiogenesis parameters. Lymphangiogenesis is increased in metastatic SCC, but its extent is influenced by tumor thickness, which remains the most reliable predictive factor for metastasis. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  14. Direct Calculation of Short-Circuit Reactances, Winding Forces, and Eigenmodes of Power Transformers (Direkte Berechnung der Kurzschlussreaktanzen, Wicklungskräfte und Eigenformen von Leistungstransformatoren)

    DTIC Science & Technology

    2001-01-01

    …the electrical conductivity σ and the relative permeability μr. Currents with a harmonic time dependence are impressed in the conductors, which… The relations (5.15) are then summarized by… With the mesh and node equations, the dependencies of the currents and voltages… consists of t columns and identifies the ports at which currents are impressed. It is therefore given by [TnT] J…, if a current is impressed at port b

  15. Does Quantum Mechanics Force a Drastic Change of Our World View? Thoughts and Experiments after Einstein, Podolsky and Rosen

    NASA Astrophysics Data System (ADS)

    Frodl, Peter

    From the beginnings of quantum mechanics until today there have been attempts to interpret it as a statistical theory about ensembles of individual classical systems. The conditions under which hidden-variable theories providing deterministic descriptions of these individual systems may be regarded as classical were formulated by Einstein, Podolsky and Rosen in 1935: 1. Physical systems are in principle separable. 2. For every physical quantity whose value can be predicted with certainty without disturbing the system under consideration, there exists a corresponding element of physical reality. Taken together, as Bell showed in 1964, these conditions are fundamentally incompatible with quantum mechanics and untenable in the light of recent experiments, which once again confirm quantum mechanics as the correct theory. To understand their results, we must either abandon the assumption, taken for granted in classical physics, that physical systems are separable, or revise our concept of physical reality. An examination of the concept of separability and some considerations on the problem of measuring observables show that a revision of the concept of physical reality is unavoidable. The revised concept of reality should be compatible with both classical physics and quantum mechanics, so as to allow a unified view of the world.

  16. The Physical Environmental Sciences and the Military: On the Exploration of Greenland in the Cold War

    NASA Astrophysics Data System (ADS)

    Heymann, Matthias

    Today, the modern environmental sciences stand at the center of research funding and public attention. In the wake of the interest in global environmental change and its associated problems that awoke in the 1970s, their importance has grown rapidly. Many roots of the modern environmental sciences, however, lie in the Cold War.

  17. [Hepatitis E - More than a Rare Travel-Associated Infectious Disease!]

    PubMed

    Wedemeyer, Heiner

    2017-06-01

    Transmission routes: In Germany, more than 300,000 people probably become infected with the hepatitis E virus (HEV) every year. In Central Europe, hepatitis E is usually an autochthonous, i.e. locally acquired, infectious disease caused by the zoonotic HEV genotype 3. Consumption of insufficiently cooked pork or game is a major risk factor for HEV infection, but transmission of the virus through blood transfusions is also possible. Diagnosis: In immunocompetent individuals, acute hepatitis E can be diagnosed by the detection of anti-HEV IgM. In immunosuppressed patients, however, serological tests may be false-negative, so in these cases an HEV infection should only be diagnosed by direct detection of the pathogen by PCR in blood or stool. Natural course: In patients with other chronic liver diseases, acute hepatitis E can lead to liver failure. Chronic courses, defined by viremia persisting for at least 3 months, have been described in organ-transplant recipients on immunosuppressive medication but can also occur in other immunodeficiencies. Chronic hepatitis E can lead to advanced liver fibrosis or cirrhosis within months. Extrahepatic manifestations: Extrahepatic manifestations can occur during and after an HEV infection; in particular, Guillain-Barré syndrome and neuralgic amyotrophy of the shoulder have been associated with hepatitis E. Therapy: Ribavirin has antiviral activity against HEV. In chronic hepatitis E, treatment should be given for 3-6 months. Treatment failure and relapse after the end of treatment are possible. A vaccine against HEV has so far been licensed only in China.

  18. Historical Riddle: Physics with a Rifle and Eggs

    NASA Astrophysics Data System (ADS)

    Loos, Andreas

    2003-11-01

    It all began promisingly: at the age of ten, the gifted boy was already sitting in university lectures, taught by no less a figure than his own father. The father thus killed two birds with one stone: his son learned something useful and was at the same time in safe keeping.

  19. Mean flow generation mechanism by inertial waves and normal modes

    NASA Astrophysics Data System (ADS)

    Will, Andreas; Ghasemi, Abouzar

    2016-04-01

    , respectively. The former is used to find the analytical solution of the normal modes (Borcia 2012). Plugging two independent solutions into the latter, we investigate the generation mechanism of INMMF. We found R_1^1 = \overline{\partial_z(u_r^1 u_z^1)} and R_2^1 = \overline{\partial_r(u_r^1 u_r^1)} as the source terms responsible for the generation of INMMF. The helical structure of the inertial waves causes the nonlinear terms R_1 and R_2 to be nonzero, contributing to the generation of INMMF. We used u_r^a and u_z^a obtained from the analytical solution (Borcia 2012), computed the source terms R_1^a and R_2^a, and found a structural correspondence with the corresponding fields computed from the DNS solution for the three normal modes investigated. The sum of R_1^1 and R_2^1 exhibits a good structural correspondence with INMMF. Interestingly, the INMMF magnitude depends on the inertial wave beams and normal modes. For instance, we found that INMMF is generated more efficiently for the libration frequency ω = 1.58, although the resonant frequency of normal mode (2,1) is predicted by the analytical solution to be at ω = 1.576. Separating the inertial wave beams from the flow field obtained by DNS, using the analytical normal-mode solution, we explored the phase lag between inertial wave beams and normal mode. We inferred that the normal-mode amplitude is high only if the phase lag between the inertial wave beam and the normal mode is predominantly positive; in this case a high INMMF amplitude can be found. This supports the hypothesis that the normal modes are generated by the inertial wave beam, in analogy to resonant forcing in classical mechanics. Interestingly, the 'optimum' phase lag found is much smaller than π/2. Acknowledgement: This work is part of the project "Mischung und Grundstromanregung durch propagierende Trägheitswellen: Theorie, Experiment und Simulation" supported by the German Science Foundation (DFG). We would like to thank M. Klein, U. Harlander, I. Borcia and E. Schaller for

  20. Introduction of a Diagnosis Related Groups' Case Flat Rate System: Hopes and Fears (Einführung eines DRG-Fallpauschalensystems - Hoffnungen und Ängste)

    DTIC Science & Technology

    2000-06-01

    …is meant to serve the application and maintenance of the new system when the decisive targets have already been set by the budget cap. Or consider the… not only the software and the associated license fees, but subsequently also the training of staff as well as the maintenance and further development… speaks of "adaptation to medical developments" and of "procedures for the ongoing maintenance of the remuneration system". However, clear

  1. Society, Biotic Community, Ecosystem - On the Congruence of Political and Ecological Theories of Development

    NASA Astrophysics Data System (ADS)

    Voigt, Annette

    In 1859, Charles Darwin published "On the Origin of Species". His theory of evolution is probably the most spectacular example of a scientific theory of great societal relevance. Its various facets were controversially discussed in public, among them its application to explaining the states and processes of human societies. In part, the way nature supposedly is - seemingly independent of societal interests - was invoked to explain and legitimize social conditions or to legitimize political ideologies (social Darwinism): society, it was argued, functions just as Darwin had explained nature, governed by competition, selection, and division of labor, with success going to those who adapt best to the prevailing conditions.

  2. Combined Therapy of Septicemia with Ofloxacin and/or Synthetic Trehalose Dicorynomycolate (S-TDCM) in Irradiated and Wounded Mice (Die kombinierte Therapie der Septikämie mit Ofloxacin und/oder synthetischem Trehalose-Dicorynomycolat (S-TDCM) bei bestrahlten und verwundeten Mäusen)

    DTIC Science & Technology

    1989-01-01

    GARY S. MADONNA, MARY… …susceptibility to bacterial infection from either endogenous or exogenous origin. Treatment with ofloxacin or synthetic trehalose dicorynomycolate (S-TDCM)…

  3. Efficacy and Safety of Fumaric Acid Esters in Combination with Phototherapy in Patients with Moderate-to-Severe Plaque Psoriasis (FAST).

    PubMed

    Weisenseel, Peter; Reich, Kristian; Griemberg, Wiebke; Merten, Katharina; Gröschel, Christine; Gomez, Natalie Nunez; Taipale, Kirsi; Bräu, Beate; Zschocke, Ina

    2017-02-01

    Treating psoriasis patients with a combination of fumaric acid esters (FAE, Fumaderm®) and phototherapy (UV) is common practice, but has rarely been investigated in clinical studies; to date, data from only one small pilot study are available. The aim of this study was to investigate FAE/UV combination treatment in a larger cohort of patients with moderate-to-severe psoriasis. In this prospective, multicenter, non-interventional study, data from patients on FAE/UV combination therapy were recorded over a period of twelve months with regard to efficacy (PGA, PASI, DLQI, EQ-5D), safety, and dosing, and compared with data from a retrospective study of FAE monotherapy. Data from 363 patients were evaluated. Under combination therapy, all efficacy parameters improved markedly. Compared with FAE monotherapy, the combination with UV achieved a faster onset of action, although after twelve months there was no difference in efficacy. Neither the duration nor the type of phototherapy influenced the efficacy parameters. Overall, the combination therapy was well tolerated; adverse events were reported in 7% of patients. FAE/UV combination therapy shows good efficacy and tolerability and can lead to a faster onset of action. Combination therapy appears particularly useful during the first three months of FAE treatment. © 2017 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  4. The Lectin from the Pea Pisum sativum: Binding Studies, Monomer-Dimer Equilibrium, and Refolding from Fragments

    NASA Astrophysics Data System (ADS)

    Küster, Frank

    2002-11-01

    The lectin from Pisum sativum, the garden pea, belongs to the family of legume lectins. These proteins share a high sequence homology, and the structure of their monomers, an all-β motif, is highly conserved. Within the family, however, there is a great variety of different quaternary structures, which have been the subject of crystallographic and theoretical studies. Pea lectin is a dimeric legume lectin with a structural peculiarity: after folding in the cell, a short amino acid sequence is excised from a loop, so that each subunit contains two independent polypeptide chains. The two chains are, however, strongly intertwined and form a single structural domain. Like all lectins, pea lectin binds complex oligosaccharides, but its physiological role and natural ligand are unknown. In this work, attempts were made to develop a functional assay for pea lectin, and its folding, stability, and monomer-dimer equilibrium were characterized. To investigate the specific role of the processing for stability and folding, an unprocessed construct was expressed in E. coli and compared with the processed form. Both proteins show the same kinetic stability towards chemical denaturation. They denature extremely slowly, because only the isolated subunits can unfold and the monomer-dimer equilibrium lies on the side of the dimers at intermediate denaturant concentrations. Owing to this extremely slow unfolding, both proteins show an apparent hysteresis in the equilibrium transition, and it is not possible to determine their thermodynamic stability. The stability and the rates of association and dissociation into the processed and unprocessed subunits, respectively, are the same for both proteins. Furthermore, it could be shown that even under

  5. Increasing Pragmatic Awareness: The Vagueness of Language ("und so")

    ERIC Educational Resources Information Center

    Overstreet, Maryann; Tran, Jennie; Zietze, Sylvia

    2006-01-01

    This article presents a description of some pragmatic expressions ("oder so," "und so," "oder wie") rarely found in textbooks, but common in everyday conversation. Though often treated as vague or superfluous, these expressions perform important functions in interpersonal communication. Focusing on these easily identifiable phrases can help…

  6. Tokamak and Stellarator - Two Routes to Fusion Energy: Fusion Research

    NASA Astrophysics Data System (ADS)

    Milch, Isabella

    2006-07-01

    In the course of fusion research, two designs for a future power plant have proven particularly promising: the tokamak and the stellarator. With the planned experimental tokamak reactor ITER, international fusion research stands on the verge of demonstrating an energy-yielding plasma. In parallel, the Wendelstein 7-X research device under construction in Greifswald is to demonstrate the power-plant suitability of the alternative stellarator design.

  7. Campus Single Sign-On and Cross-Institution Identity Management

    NASA Astrophysics Data System (ADS)

    Hommel, Wolfgang

    The identity & access management system created for the TUM within the IntegraTUM project implements the unified-login paradigm: a user can access all services relevant to him or her within the university with the same login name/password combination. This article shows how, on the basis of the Shibboleth software and the Germany-wide higher-education federation DFN-AAI, campus-wide web single sign-on and the seamless use of numerous external web applications are achieved as additional benefits. As an example of the procedure for opening up new services to cross-institution use, the integration of learning management systems based on the DFN-AAI e-learning profile is discussed. Finally, the comprehensive advantages are contrasted with the current technical limits of implementing cross-institution identity management.

  8. [Relevance of the "Wunsch- und Wahlrecht" of § 9 social code book 9 in medical rehabilitation from the patients' perspective].

    PubMed

    Pohontsch, N; Raspe, H; Welti, F; Meyer, T

    2011-08-01

    Everyone applying for medical rehabilitation (and other benefits to support participation) has a "Wunsch- und Wahlrecht" (meaning the right to individual wishes and choice relative to assessments, services and institutions as well as to the various benefits) according to § 9 of Book 9 of the German Social Code (SGB 9) concerning every aspect of the implementation of these services. This study was aimed at exploring the wishes of rehabilitants, their attitudes towards and experiences with the various aspects of the "Wunsch- und Wahlrecht" as well as their criteria in choosing a rehabilitation centre. A total of 10 open guided focus groups were conducted with 71 male and female participants from 5 different indications and aged between 26 and 80 years. Transcripts were analyzed by means of a summary content analysis. Persons applying for medical rehabilitation benefits did not as a rule get information about their "Wunsch- und Wahlrecht" during the application process. Applying for post-hospital rehabilitation often meant being faced with an only allegedly existing choice ("pseudo Wunsch- und Wahlrecht"). The participants rarely objected to this missing share in decision-making. Most of them did not care about their right to choose a rehab centre as long as the application for rehabilitation was approved. Various arguments were brought forward against the "Wunsch- und Wahlrecht", especially insufficient information about, and time for, its enforcement and implementation. Despite an explicit stipulation in § 9 SGB 9, notices of approval rarely stated reasons for ignoring the wishes expressed by the applicants. Many participants had given little thought to choosing a specific rehab centre when applying for rehabilitation. Accordingly, most of the participants had difficulty naming possible selection criteria. On the whole, applicants have only little knowledge about the "Wunsch- und Wahlrecht". This complicates its implementation

  9. A Comparison of Anthropogenic Carbon Dioxide Emissions Datasets: UND and CDIAC

    NASA Astrophysics Data System (ADS)

    Gregg, J. S.; Andres, R. J.

    2005-05-01

    Using data from the Department of Energy's Energy Information Administration (EIA), a technique is developed to estimate the monthly consumption of solid, liquid and gaseous fossil fuels for each state in the union. This technique employs monthly sales data to estimate the relative monthly proportions of the total annual carbon dioxide emissions from fossil-fuel use for all states in the union. The University of North Dakota (UND) results are compared to those published by Carbon Dioxide Information Analysis Center (CDIAC) at the Oak Ridge National Laboratory (ORNL). Recently, annual emissions per U.S. state (Blasing, Broniak, Marland, 2004a) as well as monthly CO2 emissions for the United States (Blasing, Broniak, Marland, 2004b) have been added to the CDIAC website. To determine the success of this technique, the individual state results are compared to the annual state totals calculated by CDIAC. In addition, the monthly country totals are compared with those produced by CDIAC. In general, the UND technique produces estimates that are consistent with those available on the CDIAC Trends website. Comparing the results from these two methods permits an improved understanding of the strengths and shortcomings of both estimation techniques. The primary advantages of the UND approach are its ease of implementation, the improved spatial and temporal resolution it can produce, and its universal applicability.
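    The apportionment idea described above can be sketched in a few lines. This is a hypothetical illustration of the general technique, not the authors' actual UND code; the function name and the sample figures are invented. An annual state-level emissions total is split across months in proportion to that state's monthly fuel-sales shares.

```python
# Hypothetical sketch (invented names/numbers): distribute an annual
# CO2 emissions total over twelve months using monthly sales shares.

def monthly_emissions(annual_total, monthly_sales):
    """Split an annual emissions total into 12 monthly estimates.

    annual_total  -- annual CO2 emissions for one state
    monthly_sales -- 12 monthly fuel-sales figures for the same state
    """
    total_sales = sum(monthly_sales)
    if total_sales == 0:
        raise ValueError("sales data must not sum to zero")
    # Each month receives the annual total weighted by its sales share.
    return [annual_total * s / total_sales for s in monthly_sales]

# Example with made-up numbers: heating-season sales peak in winter.
sales = [12, 11, 10, 8, 6, 5, 5, 6, 7, 9, 10, 11]
est = monthly_emissions(1200.0, sales)
print(round(est[0], 1))   # January estimate
print(round(sum(est), 1)) # monthly estimates sum back to the annual total
```

    By construction the monthly estimates always sum back to the annual total, which is why such a disaggregation stays consistent with annual inventories like CDIAC's.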

  10. Physician and Amateur Astronomer in Stormy Times: The Book Bequest of Doctor Johannes Häringshauser, District Physician in Mistelbach (1630-1641), in the Melk Abbey Library.

    NASA Astrophysics Data System (ADS)

    Davison, Giles; Glaßner, Gottfried

    2009-06-01

    While searching for astronomical literature in the Melk Abbey library, Giles Davison came across the name "Doctor Johannes Häringshauser" as the owner of rare and interesting astronomical works by, among others, Johannes Regiomontanus, Georg von Peuerbach, Michael Mästlin, Johannes Kepler, and Daniel Sennert. Further research carried out in the years 2007-2009 showed that this was the father of the Melk conventual and librarian Sigismund Häringshauser (1631-1698), active from 1630-1641 as district physician in Mistelbach, Lower Austria. He was born in 1603 as the son of the apothecary Johannes Häringshauser, a native of Magdeburg, and died in Mistelbach in 1642. Johannes Häringshauser Sen. held a series of important offices in the Viennese city government from 1613-1640 and died in 1647. Dr. Johannes Häringshauser Jun.'s period of study in Padua (1624-1626) probably awakened the interest in astronomy that is reflected in his private library, later absorbed into the holdings of the Melk Abbey library. Most of the ten titles attributable to astronomy and astrology were acquired by him in the years 1636 and 1637.

  11. The Structure of Lean Material Flow with Lean Production, Kanban, and Innovations

    NASA Astrophysics Data System (ADS)

    Scheid, Wolf-Michael

    In the literature, material flow is mostly treated within specialized disciplines such as control logic, logistics technology, or supply chain management. A defining characteristic of material flow, however, is that it is composed of many individual building blocks, all of which must be harmoniously coordinated. The maximum achievable efficiency is determined not by top performance in one specialty or another, but by the weakest link in the entire complex network. The interfaces between the departments involved in a company are therefore of particular importance: only harmonious interplay enables high effectiveness. This presupposes a comprehensive understanding of interdisciplinary necessities, a high degree of coordination with the operational processes, and, ultimately, a cooperative attitude and respect for the problems of the other side.

  12. CARNOT's Paradigm and its Epistemological Implications

    NASA Astrophysics Data System (ADS)

    Schöpf, Hans-Georg

    The present historic-critical essay traces the peculiarities of classical phenomenological thermodynamics back to the paradigm created by CARNOT and takes up some questions to which this paradigm gives rise.

  13. Dancing Animal or Eccentric Positionality - Philosophical Anthropology between Darwinism and Culturalism

    NASA Astrophysics Data System (ADS)

    Fischer, Joachim

    First, briefly, the formulas in the title: "eccentric positionality" is the category proposed by philosophical anthropology (more precisely, by Helmuth Plessner) for the human being and his "special position" among living creatures; I will explain this concept below. This much can be said: the term is no more difficult than "transcendentality", the "a priori", or "autopoiesis", concepts whose orientation value is already in play in the intellectual public sphere, but it may offer more disclosive power than the technical terms of, say, Kant, Maturana, or Luhmann. And "dancing animal" is a felicitous intuitive notion, a kind of translation of "eccentric positionality": a "displaced" creature, a displacement within evolutionary life, which by its very nature compels this creature to a particular kind of life conduct, namely culture. The aim of this contribution is to present philosophical anthropology as a specific theoretical technique for arriving at an adequate concept of the human being, namely a theoretical strategy in the face of Cartesian dualism, that is, the dualism between naturalism and culturalism.

  14. Linking Data Quality Indicators with KPIs and the Impact on Return on Investment

    NASA Astrophysics Data System (ADS)

    Block, Frank

    It is often unclear what relationships exist between the data quality indicators (DQI, defined below) and the key performance indicators (KPI, see Section 1.3 for details) of a company or organization. This matters because knowledge of these relationships decisively shapes the design of a data quality project. It is indispensable as a basis for decision-making and answers questions such as: What does poor data quality cost our company or organization? Can we afford it?
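    As a hypothetical illustration of how such a DQI-KPI relationship might be quantified (all indicator names and figures below are invented, not taken from the article), the strength of the association can be estimated with a simple Pearson correlation over matched reporting periods:

```python
# Hypothetical sketch: correlate a data-quality indicator (DQI) with a
# business KPI over matched reporting periods via Pearson's r.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented monthly data: address-completeness (DQI, %) vs.
# successful-delivery rate (KPI, %).
dqi = [91, 93, 90, 95, 97, 96]
kpi = [89, 88, 87, 93, 95, 92]
r = pearson(dqi, kpi)
print(round(r, 2))
```

    A strong positive r would suggest that improving the data quality indicator is likely to move the KPI, which is exactly the kind of evidence needed to argue the return on investment of a data quality project; correlation alone, of course, does not establish causation.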

  15. Mercapturic Acid and Nucleoside Adduct in Urine as Biomarkers in 1-Hydroxymethylpyrene-Exposed Rats

    NASA Astrophysics Data System (ADS)

    Ma, Lan

    2002-01-01

    1-Methylpyrene (MP) is hepatocarcinogenic in newborn male mice. Through hydroxylation at the benzylic position and subsequent sulfonation, MP is activated to the DNA-reactive 1-sulfooxymethylpyrene (SMP). In the rat, exposure to the benzylic alcohol, 1-hydroxymethylpyrene (HMP), leads to DNA adduct formation in various tissues. A possible consequence of this toxification is the urinary excretion of the corresponding mercapturic acid and nucleoside adduct, which, owing to their origin, could serve as biomarkers. In this work, the excretion of the mercapturic acid (MPMA) and of the N2-deoxyguanosine adduct (MPdG) in HMP-exposed rats was investigated. After administration of HMP or MP, less than 1% of the dose was excreted as MPMA via urine and feces (0-48 h), mainly within the first 24 h after administration. MPdG could not be identified in either the urine or the feces of the HMP-treated animals. After direct administration of SMP, only a very small amount of MPdG (less than 0.9 ppm in 12 h) was found in the urine; owing to this small amount, MPdG is not suitable as a biomarker. MPMA, in contrast, can be measured well analytically. It should therefore be investigated whether MPMA reflects the toxification of HMP, which presupposes knowledge of the metabolic profile of HMP. Comprehensive studies of the metabolism of HMP were therefore carried out. The results showed that more than 80% of the metabolites were excreted in their oxidized form (PCS, its glucuronic acid conjugates, and phenolic sulfate esters of PCS). Accordingly, the oxidation of HMP to PCS plays a very important role in the detoxification and excretion of HMP. Furthermore, it could be shown that the enzymes alcohol dehydrogenase and aldehyde dehydrogenase are involved in the oxidation of HMP. The inhibitors of these enzymes, disulfiram and ethanol, were therefore used to modulate the detoxification in vivo

  16. Azelaic Acid 20% Cream: Effect on Quality of Life and Disease Activity in Adult Female Patients with Acne Vulgaris.

    PubMed

    Kainz, Julius Thomas; Berghammer, Gabriele; Auer-Grumbach, Piet; Lackner, Verena; Perl-Convalexius, Sylvia; Popa, Rodica; Wolfesberger, Barbara

    2016-12-01

    Hardly any data are available on the efficacy of acne therapies and their effect on the quality of life of adult patients. Objective: To assess the effect of azelaic acid 20% cream (Skinoren®) on acne severity and disease-related quality of life. Non-interventional study in adult female patients with mild-to-moderate acne. Efficacy parameters were the DLQI and acne severity on the face, décolleté, and back in the investigator's global assessment (IGA scale: grade 1 = almost clear skin; 2 = mild acne; 3 = moderate acne). Visits were scheduled at baseline and after 4-8 and twelve weeks. Of the 251 patients enrolled, 59%, 31%, and 10% had an IGA grade of 1, 2, and 3 at baseline, respectively; the most frequently affected skin area was the face (IGA grade 2 or 3: 79%). After twelve weeks of treatment, a significant improvement of acne vulgaris was observed on the face (IGA grade 0 or 1: 82%) as well as on the décolleté and back. The median DLQI score fell from nine at baseline to five after twelve weeks of treatment. Ninety percent of the treating physicians and patients rated the tolerability of the treatment as very good or good. The use of 20% azelaic acid cream leads to a significant improvement of acne vulgaris and of disease-related quality of life in adult women. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  17. Disease Course, Medical Care, and Quality of Life of Patients with Congenital Melanocytic Nevi - Evaluation of the German-Language CMN Registry.

    PubMed

    Elisabeth Wramp, Maria; Langenbruch, Anna; Augustin, Matthias; Zillikens, Detlef; Krengel, Sven

    2017-02-01

    Congenital melanocytic nevi (CMN) are a psychological burden for patients and their families and also carry medical risks. The German-language CMN registry, founded in 2005, has now undergone an interim evaluation with regard to disease course, medical care, and quality of life. In early 2013, as part of a prospective cohort study, a follow-up questionnaire was sent to 100 patients who had registered with an initial report form between 2005 and 2012. In addition, data on quality of life (Dermatology Life Quality Index, DLQI) and experiences of stigmatization (Perceived Stigmatization Questionnaire, PSQ; Social Comfort Questionnaire, SCQ) were collected using standardized questionnaires. 83% of the patients or their parents responded (mean age 11.2 years, median 6 years; mean follow-up 4.4 years). In the entire cohort, four melanomas were diagnosed: two cerebral melanomas in childhood, one cutaneous melanoma in adulthood, and one that proved to be a proliferating nodule. Neurocutaneous melanocytosis was found in four children, three of them with neurological symptoms. 88% (73/83) were treated surgically. Seventy-eight percent of the respondents reported little or no impairment of quality of life; perceived stigmatization and impairment of social comfort were generally also low. The results provide an overview of the situation of patients with CMN in Germany, Austria, and Switzerland. A melanoma developed in 3%, and CNS involvement was present in 4%, of the cases. © 2017 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  18. Comparison of recombinant vaccinia and DNA vectors for tumor immunotherapy in the C57BL/6 mouse model

    NASA Astrophysics Data System (ADS)

    Johnen, Heiko

    2002-10-01

    In this work, tumor vaccines based on the plasmid vector pCI, modified vaccinia virus Ankara (MVA), and MVA-infected dendritic cells were developed and verified by sequencing, Western blotting, and flow cytometric analysis. The in vivo efficacy of the vaccines was compared in various tumor models in C57BL/6 mice. DNA vaccination based on the eukaryotic expression vector pCI induced very effective, antigen-specific, and long-lasting protection against tumors expressing mucin, CEA, or beta-galactosidase. MVA vaccination provided no significant protection against mucin- or beta-galactosidase-expressing tumors in the tumor models used in this work. Both human and murine dendritic cells generated in vitro can be infected very efficiently with MVA compared with other viral vectors. However, the expression rate of the inserted genes is low compared with expression in permissive host cells of the virus (embryonic chicken fibroblasts). It was shown that MVA infection of dendritic cells affects the maturation state of human and murine dendritic cells in a manner similar to infection with replication-competent vaccinia strains, and that it additionally inhibits the upregulation of CD40 during terminal maturation of murine dendritic cells. The deletions in the MVA genome that arose during long-term in vitro culture on CEF cells led to strong attenuation and to the loss of several genes encoding immunomodulatory proteins, but not to a reduction of the cytopathic effect in dendritic cells. The low expression rate and the observed inhibition of the expression of costimulatory molecules on dendritic cells may account for a rather ineffective induction of an immune response in MVA-vaccinated animals by cross-priming or direct infection

  19. [Grief in Children and Adolescents as a Result of Acute Traumatization].

    PubMed

    Juen, Barbara; Werth, Manuela; Warger, Ruth; Nindl, Sandra

    2017-01-01

    The specific features of grief in children after trauma are presented, along with potential reactions to acute traumatization and the resulting needs of children and adolescents, in order to discuss adequate interventions.

  20. Europeanization of the Hospital Markets - Opportunities and Risks for German Hospitals (Europaeisierung des Krankenhausmarkets - Chancen und Risiken fuer deutsche Krankenhaeuser)

    DTIC Science & Technology

    2004-07-06

    Information Services Institut (3M HIS Institut) has, among other things, been involved in developing a tool for maintaining and further developing the German DRG system for... care (hospital care): a. costs grow faster than general inflation; b. health-insurance contributions are practically stable; c. there are 9... health insurance funds; d. care is (theoretically) fully reimbursed by the health insurance; e. mismatch between insurance contributions and

  1. Einheit von Forschung und Lehre: Implications for State Funding of Universities

    ERIC Educational Resources Information Center

    Frolich, Nicoline; Coate, Kelly; Mignot-Gerard, Stephanie; Knill, Christoph

    2010-01-01

    The Humboldtian educational ideal is based on the idea of the unity of teaching and research in universities ("Einheit von Forschung und Lehre"). The role of the state, according to Humboldt, was to fund universities in such a way that their autonomy was maintained. Much has changed in the funding mechanisms of higher education systems…

  2. In vitro antibacterial efficacy of essential oils against veterinary-relevant pathogens from clinical isolates of dogs, cats, and horses.

    PubMed

    Bismarck, Doris; Schneider, Marianne; Müller, Elisabeth

    Introduction: Essential oils are the basis of aromatherapy. Among other effects, they are credited with antibacterial activity. This study tested the in vitro efficacy of essential oils against a broad spectrum of veterinary-relevant pathogens. Methods: The antibacterial activity of 16 essential oils was determined by agar diffusion test. Gram-positive and gram-negative pathogens from clinical isolates of dogs, cats, and horses from routine veterinary diagnostics were tested. Efficacy was classified as none, low, moderate, or high based on the radius of the zone of inhibition of bacterial growth. Results: In general, both gram-positive and gram-negative pathogens were sensitive to some of the tested essential oils. The oils showed a non-negligible in vitro effect not only against staphylococci but also against methicillin-resistant staphylococcal strains. Pasteurella multocida turned out to be a rather sensitive organism, whereas Pseudomonas aeruginosa was the exception as a completely resistant one. Tea tree, oregano, and winter savory oil were the most potent oils. In addition, lemongrass oil proved effective against gram-positive and thyme oil against gram-negative pathogens. Conclusion: Essential oils possess in vitro antibacterial activity against clinical isolates from dogs, cats, and horses. This study provides a basis for the use of essential oils in veterinary medicine. Tendencies emerged in the spectrum and degree of efficacy of individual essential oils with respect to individual pathogen species; however, no reliable prediction can be made about their efficacy against a specific organism in an individual patient. Therefore, before therapy with
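    The grading scheme described in the Methods (none, low, moderate, or high efficacy by inhibition-zone radius) can be sketched as a simple classifier. The threshold values and readings below are invented for illustration; the study's actual cut-offs are not given here.

```python
def grade_inhibition(radius_mm):
    """Map an inhibition-zone radius (mm) to an efficacy grade.
    Thresholds are illustrative assumptions, not the study's cut-offs."""
    if radius_mm < 1:
        return "none"
    elif radius_mm < 5:
        return "low"
    elif radius_mm < 10:
        return "moderate"
    return "high"

# Invented example readings for a single test organism
readings = {"tea tree": 12.0, "oregano": 11.5, "thyme": 7.0,
            "P. aeruginosa control": 0.0}
grades = {oil: grade_inhibition(r) for oil, r in readings.items()}
```

    With these assumed thresholds, tea tree oil would be graded highly effective and the fully resistant control would be graded as having no effect.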

  3. Effect of an ad libitum low-fat diet rich in fruit, vegetables, and dairy products on blood pressure in borderline hypertensives

    NASA Astrophysics Data System (ADS)

    Möseneder, Jutta M.

    2002-01-01

    In the randomized, multicenter DASH study (Dietary Approaches to Stop Hypertension), conducted under controlled conditions, a low-fat mixed diet rich in fruit, vegetables, and dairy products led to a significant blood-pressure reduction in borderline hypertensives. During the study phase, body mass, sodium intake, and alcohol consumption were kept constant because of their known influence on blood pressure. This pilot study investigated whether the DASH result could be confirmed (i) in German hypertensives and (ii) under habitual dietary and living conditions, with regular nutritional counseling and ad libitum consumption instead of the strictly controlled study design. Constancy of body mass, urinary sodium excretion (more valid than intake under this design), and alcohol consumption was required. The study population comprised 53 overweight subjects with borderline hypertension not treated with medication and without metabolic disease. Participants were randomized either to the ideal group, with a low-fat diet rich in dairy products, fruit, and vegetables (similar to the DASH ideal group), or to the control group with their habitual diet. Over a period of five weeks, about 50% of the subjects' daily food requirement was provided free of charge according to group assignment. Casual and 24-h blood-pressure measurements, diet and activity records, blood and urine samples, and anthropometric measurements were performed before, during, and five weeks after the intervention phase. The results show that no significant blood-pressure reduction was observed in the ideal group. This can be explained by the fact that the food and nutrient intake of the German

  4. Empirical methods for deriving different types of porosity from the hydraulic conductivity and the coefficient of uniformity - an overview

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Ziesche, Michael; Nillert, Peter

    2017-06-01

    This paper reviews 13 methods that have been proposed for deriving porosity, effective porosity and/or specific yield from grain size distributions (Lejbenson 1947; Istomina 1957; Beyer 1964; Hennig 1966; Golf 1966; Marotz 1968; Beyer and Schweiger 1969; Seiler 1973; Bureau of Reclamation 1984; Helmbold 1988; Beims and Luckner 1999; Balke et al. 2000; Helmbold 2002). Experimental design, limitations, and application boundaries are discussed, and the methods are compared against each other. The quality of the predictive methods strongly depends on the experimental design and the sample type.
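    For orientation, the two grain-size quantities named in the title, the hydraulic conductivity K and the coefficient of uniformity Cu = d60/d10, can be illustrated with Hazen's classical textbook estimate K ≈ C·d10² (d10 in cm, C ≈ 100 for K in cm/s). This generic relation is shown only for context; it is not one of the 13 reviewed methods.

```python
def uniformity_coefficient(d60_mm, d10_mm):
    # Cu = d60 / d10: the coefficient of uniformity ("Ungleichkoernigkeitszahl")
    return d60_mm / d10_mm

def hazen_conductivity_cm_s(d10_cm, c=100.0):
    # Hazen's textbook estimate: K [cm/s] ~= C * d10^2, with d10 in cm
    return c * d10_cm ** 2

cu = uniformity_coefficient(d60_mm=0.6, d10_mm=0.2)  # a fairly uniform sand
k = hazen_conductivity_cm_s(d10_cm=0.02)             # K for d10 = 0.2 mm
```

    A low Cu (here 3.0) indicates a well-sorted sediment, the regime in which such grain-size correlations work best.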

  5. [Neuropathological research on organs of patients of the "Heil- und pflegeanstalt" (state hospital) Günzburg].

    PubMed

    Steger, F; Strube, W; Becker, T

    2011-03-31

    The two Kaiser Wilhelm Institutes (KWI) in Berlin (1914, new building 1931) and in Munich (1917, new building 1926-28), specialized in pathological-anatomical as well as psychiatric-genetic research, were established before the National Socialist era. Data evaluation is based on patient documents and annual reports from the archive of today's district hospital Günzburg and on patient documents (copies) from the historical archive of today's Max Planck Institute of Psychiatry. The KWI in Munich was indirectly provided with brain material by Bavarian "Heil- und Pflegeanstalten" (state hospitals), including the state hospital Günzburg. During National Socialism, patients' organs were sent from the "Heil- und Pflegeanstalt" (state hospital) Günzburg to the KWI in Munich for research purposes. Commemorating patients' fates and clarifying what happened defines a place of remembrance.

  6. Albrecht Penck: preparer and pioneer of Nazi Lebensraum policy?

    NASA Astrophysics Data System (ADS)

    Schultz, Hans Dietrich

    2018-01-01

    Albrecht Penck was one of the eminent representatives of Quaternary research in the first half of the twentieth century. But apart from this, there was a political-geographical side to Penck, which, since 1945, has long been ignored or downplayed by geographers. Today, given his concept of Volks- und Kulturboden, he is considered as having ushered in German geography the völkisch (ethno-nationalistic) turn. Thus, critics say, he paved the way for Nazi Lebensraum policies and became an accomplice in the resulting crimes. The present contribution examines Penck's political-geographical worldview and reaches an ambivalent conclusion regarding the accusations.

  7. Management of antithrombotic agents in skin surgery before and after publication of the corresponding S3 guideline.

    PubMed

    Gaskins, Matthew; Dittmann, Martin; Eisert, Lisa; Werner, Ricardo Niklas; Dressler, Corinna; Löser, Christoph; Nast, Alexander

    2018-03-01

    According to a survey conducted in 2012, the management of antithrombotic agents during dermatologic surgery in Germany was very heterogeneous. In 2014, an evidence-based guideline on this topic was published for the first time. An anonymous survey of the same sample was conducted on the management of antithrombotics and on familiarity with the guideline. The results were reported as relative frequencies and compared with those from 2012. 208 questionnaires were evaluated (response rate: 36.6%). The large majority of dermatologists stated that they perform minor procedures while continuing therapy with phenprocoumon, low-dose acetylsalicylic acid (≤ 100 mg), and clopidogrel, as well as with direct oral anticoagulants. For major procedures, however, management remained heterogeneous, particularly among office-based dermatologists. The proportion of dermatologists using phenprocoumon, acetylsalicylic acid, and clopidogrel in accordance with the guideline increased overall. For example, whereas in 2012 53.8% of hospital physicians and 36.3% of office-based dermatologists performed a large excision while continuing low-dose acetylsalicylic acid, in 2017 90.2% and 57.8%, respectively, did so (phenprocoumon: from 33.8% and 11.9% to 63.9% and 29.9%; clopidogrel: from 36.9% and 23.2% to 63.9% and 30.6%). Among hospital physicians, a high proportion were familiar with the guideline and found it helpful. An increase in guideline-compliant behavior was recorded for all procedures. For major procedures, despite clear improvement, greater efforts are needed to implement the guideline and to identify barriers to implementation. © 2018 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  8. 77 FR 16968 - Airworthiness Directives; Burkhart GROB Luft- und Raumfahrt GmbH Powered Sailplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-23

    ... verbal contact we receive about this proposed AD. Discussion The European Aviation Safety Agency (EASA...-200. (h) Related Information Refer to MCAI European Aviation Safety Agency (EASA) AD No. 2012-0027...- und Raumfahrt GmbH Powered Sailplanes AGENCY: Federal Aviation Administration (FAA), Department of...

  9. 77 FR 32887 - Airworthiness Directives; Burkhart GROB Luft- und Raumfahrt GmbH Powered Sailplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-04

    ... Airworthiness Directives; Burkhart GROB Luft- und Raumfahrt GmbH Powered Sailplanes AGENCY: Federal Aviation... seaside and therefore exposed to a salty environment, causing the excessive corrosion. This condition, if... Programs,'' describes in more detail the scope of the Agency's authority. We are issuing this rulemaking...

  10. What does an athlete perform? Force, power, and energy in the muscle

    NASA Astrophysics Data System (ADS)

    Thaller, Sigrid; Mathelitsch, Leopold

    2006-01-01

    The concept of power is broader in sport than in physics. In both cases, however, the focus is on the energy converted per unit of time. Yet purely physical power does not always reflect the energy turnover of the muscles. Muscle force depends on the contraction velocity of the muscle; a muscle therefore behaves differently from a spring. For high-performance sport, the energy turnover of the muscles must be optimized through special training and nutrition programs.
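    The observation that muscle force falls with contraction velocity, so that a muscle behaves unlike a spring, is commonly modeled by Hill's force-velocity relation (F + a)(v + b) = (F0 + a)b. The sketch below uses that textbook relation with illustrative parameter values, not data from the article.

```python
def hill_force(v, f0=1000.0, a=250.0, b=0.3):
    """Hill's force-velocity relation: (F + a)(v + b) = (F0 + a) * b.
    f0: isometric force [N]; v: shortening velocity [m/s].
    The values of f0, a, b here are illustrative assumptions."""
    return (f0 + a) * b / (v + b) - a

def power(v, **kw):
    # Mechanical power P = F(v) * v: zero both at v = 0 and near maximal velocity
    return hill_force(v, **kw) * v
```

    At v = 0 the model returns the full isometric force but zero power, which is exactly the point of the abstract: a high force output need not mean a high physical power.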

  11. Teaching Normal Birth, Normally

    PubMed Central

    Hotelling, Barbara A

    2009-01-01

    Teaching normal-birth Lamaze classes normally involves considering the qualities that make birth normal and structuring classes to embrace those qualities. In this column, teaching strategies are suggested for classes that unfold naturally, free from unnecessary interventions. PMID:19436595

  12. [Forced sterilisation based on the Law for the Prevention of Hereditarily Diseased Offspring. The role of the Heil- und Pflegeanstalt (State Hospital) Günzburg].

    PubMed

    Steger, F; Schmer, B; Strube, W; Becker, T

    2012-03-01

    From 1934 to 1945, 350,000-400,000 human beings were sterilised by force in the German Reich. Forced sterilisation was based on the Gesetz zur Verhütung erbkranken Nachwuchses (Law for the Prevention of Hereditarily Diseased Offspring). The Heil- und Pflegeanstalt (State Hospital) Günzburg was one of the institutions where compulsory sterilisation was practised. Data evaluation was based on patient documents and annual reports of the archives of today's district hospital at Günzburg. Patient records were analysed with respect to predefined criteria. The municipal archives of Günzburg provided further historical sources and data. Between 1934 and 1943, 366 patients were sterilised in the Heil- und Pflegeanstalt (State Hospital) Günzburg. Age, sex and diagnosis were found to be criteria relevant for selection of patients for sterilisation. The study was able to show the active involvement of the Heil- und Pflegeanstalt (State Hospital) Günzburg in the compulsory sterilisation programme.

  13. Zeitschrift fur erziehungs- und sozialwissenschaftliche Forschung (Journal for Education and Social Sciences Research), 1984-1988 (11 issues).

    ERIC Educational Resources Information Center

    Zeitschrift fur erziehungs- und sozialwissenschaftliche Forschung (Journal for Education and Social Sciences Research), 1984

    1984-01-01

    Recognizing a growing globalization of nations and cultures, "Zeitschrift fur erziehungs- und sozialwissenschaftliche Forschung" brings together educational and social science research topics that address the interactions between education and society in their pedagogical, social, physical, economic, legal, and administrative dimensions.…

  14. Flowing liquids and gases

    NASA Astrophysics Data System (ADS)

    Heintze, Joachim

    The remark we made at the beginning of Chapter 1 about the difficulty of a universally valid approach applies to an even greater degree to the mechanics of flowing liquids; there one in fact reaches rather quickly the limits of what present-day mathematics can accomplish. That is, starting from Newton's laws (Vol. I/3) we can set up a differential equation for the flow of liquids, the so-called Navier-Stokes equation, but no generally applicable methods for solving this equation are known. A glance at nature and its manifold flow phenomena shows that this fact is not surprising.
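    For reference, the Navier-Stokes equation mentioned above has, for an incompressible Newtonian fluid, the standard textbook form:

```latex
\rho\left(\frac{\partial \vec{v}}{\partial t} + (\vec{v}\cdot\nabla)\vec{v}\right)
  = -\nabla p + \eta\,\Delta\vec{v} + \vec{f},
\qquad \nabla\cdot\vec{v} = 0,
```

    with density ρ, velocity field v, pressure p, dynamic viscosity η, and body-force density f. The nonlinear convective term (v·∇)v is what defeats general solution methods, as the text notes.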

  15. [Combined use of radiotherapy and adjuvant therapy with a mistletoe extract (Viscum album L.) for the treatment of oral malignant melanoma in dogs: a retrospective study].

    PubMed

    von Bodungen, Uta; Ruess, Katja; Reif, Marcus; Biegel, Ulrike

    2017-01-01

    Background: Canine oral malignant melanomas (OMM) are characterized by rapid growth, local invasion, and high metastasis rates. Extracts based on Viscum album L. (VAE) are increasingly used in cancer therapy in both human and veterinary medicine. The aim of our study was to investigate to what extent adjuvant VAE therapy is a therapeutic option for the treatment of OMM, with particular attention to survival time and possible adverse effects. Animals and methods: 26 dogs with OMM, all of which received radiotherapy (in some cases after surgical tumor resection) at one of the largest veterinary oncology centers in Switzerland, were included in the retrospective study: 18 dogs were treated with VAE (1 ml VAE (Iscador®) subcutaneously three times per week in increasing concentrations from 0.1 to 20 mg/ml; VAE group), 8 received no adjuvant treatment (comparison group). We compared the development of OMM size as well as survival time. Results: Patients receiving radiotherapy plus adjuvant VAE therapy showed a significantly longer median survival of 236 days compared with patients receiving radiotherapy without adjuvant VAE therapy (49 days; log-rank test: p = 0.0047). VAE therapy reduced the hazard of death by more than two thirds (hazard ratio (HR) = 0.30, 95% confidence interval (CI) 0.11-0.86; p = 0.024), whereas a higher tumor stage according to UICC (Union internationale contre le cancer) showed a statistical trend toward a doubling of the risk of death (UICC stage III/IV vs. I/II: HR = 2.12, 95% CI 0.88-5.12; p = 0.095). Two patients showed mild adverse effects during VAE treatment: one had a self-limiting fever for one day; in the other, we reduced the dose from a more concentrated to a less concentrated VAE (series 0) because of fatigue, which then disappeared.

  16. Linking data-quality indicators to KPIs and the effects on return on investment

    NASA Astrophysics Data System (ADS)

    Block, Frank

    It is often unclear what relationships exist between data-quality indicators (DQI; definition follows further below) and the key performance indicators (KPI; see Section 1.3 for further details) of a company or organization. This is important in particular because knowledge of these relationships substantially shapes how a data-quality project is designed.
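    A minimal way to make such a DQI-KPI relationship concrete is to correlate a data-quality indicator with a KPI over time. The series below are invented for illustration and imply nothing about any real project; the point is only the mechanics of quantifying the link.

```python
def pearson_r(xs, ys):
    # Plain-Python Pearson correlation between a DQI series and a KPI series
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented monthly values: completeness of customer records (DQI, %)
# versus campaign conversion rate (KPI, %)
dqi = [82, 85, 88, 91, 94]
kpi = [2.1, 2.4, 2.5, 2.9, 3.1]
r = pearson_r(dqi, kpi)  # strongly positive here by construction
```

    A strong, stable correlation of this kind is what would justify attributing part of a KPI improvement, and hence part of the return on investment, to a data-quality project.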

  17. The organism of mathematics - viewed micro-, macro-, and mesoscopically

    NASA Astrophysics Data System (ADS)

    Winkler, Reinhard

    Similar conversations about mathematics usually end at about this point, without the non-mathematician having been convinced of the meaningfulness of mathematical research, or indeed of mathematical activity in general. I do not believe that the layperson should be accused of blindness to the greatness of our science when no more satisfying communication comes about here. I see the cause rather in a severely abridged picture of mathematics, one that even specialists often paint because a more adequate presentation of their field would take too much effort, and this although only someone who otherwise by no means shies away from intellectual effort can do mathematics at all. I will try to get to the bottom of the causes of this peculiar phenomenon.

  18. Versuche zur Gewinnung von katalytischen Antikörpern zur Hydrolyse von Arylcarbamaten und Arylharnstoffen. (English Title: Attempts to produce catalytic antibodies for hydrolysis of arylcarbamates and arylureas)

    NASA Astrophysics Data System (ADS)

    Werner, Deljana

    2002-05-01

    In this work, catalytic antibodies for the hydrolysis of benzyl phenyl carbamates, as well as numerous monoclonal antibodies against haptens, were successfully produced. Various hapten-protein conjugates were prepared using different coupling methods and characterized. To generate the hydrolytically active antibodies, inbred mice were immunized with KLH conjugates of 4 transition-state analogs (TSA). Using hybridoma technology, various monoclonal antibodies against these TSAs were obtained, employing different immunization schemes as well as different inbred mouse strains and fusion techniques. In total, 32 monoclonal antibodies against the TSAs used were selected. These antibodies were produced in large quantities and purified. Various methods were developed and employed to detect antibody-mediated catalysis, including immunological detection methods with anti-substrate and anti-product antibodies and a photometric method with dimethylaminocinnamaldehyde. Hydrolytic activity was detected using an enzyme sensor based on immobilized tyrosinase. The antibodies N1-BC1-D11, N1-FA7-C4, N1-FA7-D12, and R3-LG2-F9 hydrolyzed the benzyl phenyl carbamates POCc18 and POCc19 as well as compound 27. The hydrolytic activity of these antibodies was also detected by HPLC. The catalytic antibody N1-BC1-D11 was characterized kinetically and thermodynamically. Michaelis-Menten kinetics were observed, with a Km of 210 µM, a vmax of 3 mM/min, and a kcat of 222 min-1. These values correlate with those of the few known diphenyl-carbamate-cleaving abzymes. The acceleration factor of antibody N1-BC1-D11 was 10. The TSA Hei3 inhibited the hydrolytic activity, which proves that hydrolysis takes place in the antigen-binding site. Furthermore, between the antibody concentration and the
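    The kinetic constants reported for N1-BC1-D11 (Km = 210 µM, vmax = 3 mM/min) plug directly into the Michaelis-Menten rate law v = vmax·[S]/(Km + [S]); the sketch below only restates that standard relation with the abstract's values.

```python
def michaelis_menten(s_um, vmax_mm_min=3.0, km_um=210.0):
    """Reaction rate [mM/min] at substrate concentration s_um [uM],
    using the Km and vmax reported for antibody N1-BC1-D11."""
    return vmax_mm_min * s_um / (km_um + s_um)

v_half = michaelis_menten(210.0)  # at [S] = Km the rate is vmax / 2
```

    As a consistency check, the rate approaches vmax at saturating substrate and is exactly half of vmax at [S] = Km, as the definition of Km requires.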

  19. Peter Andreas Hansen und die astronomische Gemeinschaft - eine erste Auswertung des Hansen-Nachlasses.

    NASA Astrophysics Data System (ADS)

    Schwarz, O.; Strumpf, M.

    The literary assets of Peter Andreas Hansen are deposited in the Staatsarchiv Hamburg, the Forschungs- und Landesbibliothek Gotha and the Thüringer Staatsarchiv Gotha. They were never systematically investigated. The authors present here some results of a first evaluation. It was possible to reconstruct the historical events with regard to the maintenance of the Astronomische Nachrichten and the Altona observatory in 1854. Hansen was a successful teacher for many young astronomers. His way of stimulating the evolution of astronomy followed Zach's tradition.

  20. Development of environmentally and nature-conservation-compatible agricultural land-use practices for the Schorfheide-Chorin Biosphere Reserve

    NASA Astrophysics Data System (ADS)

    Meyer-Aurich, Andreas

    1999-11-01

    This work exemplifies the opportunities and limits of integrating environmental protection and nature conservation into arable land-use practices. Translating environmental and conservation goals into land-use practices involves several difficulties: on the one hand, making the goals concrete enough to be implementable; on the other, the often inadequate knowledge of the relationship between different forms of land use and, in particular, biotic conservation goals. The problem of defining and concretizing goals is discussed first. The environmental quality target concept of Fürst et al. (1992) represents one attempt to make environmental and conservation goals concrete. Heidt et al. (1997) applied this concept to a landscape section of about 6,000 ha in the Schorfheide-Chorin Biosphere Reserve in northeastern Brandenburg. A selection of the environmental quality targets formulated by Heidt et al. (1997) forms the basis of this work. For the selected targets, the main land-use factors influencing them were identified and an assessment system was developed with which the effects of agricultural cultivation practices on these targets can be mapped. The land use actually practiced by 20 farms in the Schorfheide-Chorin Biosphere Reserve was analyzed from 1994 to 1997 with respect to its effects on the environmental quality targets. The analysis yielded a highly differentiated picture, in part showing differences in the effects on the targets for the cultivation of individual crops or for particular farm types. It also showed that there were large differences in how individual crops were grown, differences that matter for the environmental quality targets. In addition to the analysis of land use in the Schorfheide-Chorin Biosphere Reserve, a system was developed with which the model-based

  1. Investigations of the evolution of satellite galaxies

    NASA Astrophysics Data System (ADS)

    Seidel, Björn

    2002-01-01

    takes place at apogalacticon. (4) The velocity dispersions of the components in the core are briefly strongly elevated along the line of sight at perigalacticon; like the densities and their ratio, they decrease in steps with each perigalactic passage; some of this is merely a projection effect. (5) The radial profile of the velocity dispersion depends on the position along the orbit; it was examined in detail. (6) The satellite disintegrates through a cycle of mass loss, a flattening potential, and a shrinking tidal radius. Equilibrium models of one- and two-component satellite galaxies are created. The tidal influence of the Milky Way, which is modeled as a rigid external potential, is examined. A first set of simulations adopting originally spherically-symmetric, one-component satellites shows that after elliptical deformation, a bar and tails of unequal length arise which change their morphology cyclically. By considering comparative simulations, the following phenomena are discovered: (1) High-density regions in the tails, (2) low-density zones around the core or bar, (3) an often concealed bar. The overall morphology as a function of time is analysed. The particles are lost from the core via the bar and move along certain morphologically characteristic structures into the tails. After deducing general quantities of the multi-component King model, three stable standard models of two-component satellite galaxies with different distributions of dark and visible matter are found. These models have a mass ratio of 1:10 for baryonic to dark matter. Without restriction to the universality of the results, the Large Magellanic Cloud was taken as basic prototype for these models. To choose the proper orbits, the tidal radius of the three-component model of the potential of the Galaxy is calculated both analytically and numerically. Then the simulations are analysed with the different behaviour of the components being the main focus of interest.
Main results: (1) It is possible to detach a large

  2. Variability of the territorial song of the chaffinch (Fringilla coelebs) for the spatio-temporal description of metapopulations

    NASA Astrophysics Data System (ADS)

    Nolte, Björn

    2003-10-01

    , whereas inter-individual variation was significant in two cases. In one case there was a trend, and in another the differences in variation were not significant. - The course of the breeding season can be traced in the annual song activity. Chaffinch song was recorded in Potsdam in two major populations of chaffinches over a period of three years. Each male was identified unambiguously because of their individual song type repertoires. These are usually easy to distinguish from sonagrams as the variation is discontinuous. A further point for individual recognition is the fixed territorial behaviour of adult males. The described method is employed to examine whole populations and to observe changes with space and time in the song of a population. The major findings of the study are: - The total number of basic song types in each population is constant over years. - The quantity of each basic song type is different and varies from year to year and from population to population. - Song copying is extremely accurate on at least 96% of occasions. - Song-type sharing is high within populations. Discussed mechanisms for song neighbourhoods are: expectation of life, semi-migratory behaviour, learning skills, establishment of song types, female choice and male vs male interaction. Furthermore a model of cultural evolution of chaffinch song was programmed to determine the role of factors like error rate, rate of emigration and running time. The changes are gradual in space and time. Hence the dialect borders are smooth. Despite this fact established song types mark the population. As every second juvenile bird settles in the population of its birth, inbreeding is avoided and the dialect structure is retained. - Analysing the repertoires of neighbouring males (“next door neighbours”) in isolated avenues to examine mutual influences suggests that these have the same number of song types in common as would be expected by
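    The cultural-evolution model of chaffinch song mentioned above (copy error rate, emigration rate, running time) can be sketched as a toy simulation. The update scheme and all parameter values here are assumptions for illustration, not the model actually used in the thesis.

```python
import random

def evolve_song_types(pop_size=100, n_types=5, error_rate=0.04,
                      emigration=0.1, generations=50, seed=1):
    """Toy model: each generation, every juvenile copies a random tutor's
    song type; with probability error_rate it instead produces a novel type
    (copy error), and with probability emigration it brings in a type from
    a neighbouring population. All parameters are illustrative."""
    rng = random.Random(seed)
    pop = [rng.randrange(n_types) for _ in range(pop_size)]
    next_type = n_types  # label for the next never-seen song type
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            r = rng.random()
            if r < error_rate + emigration:   # novel type: error or immigrant
                new_pop.append(next_type)
                next_type += 1
            else:                             # accurate copy from a tutor
                new_pop.append(rng.choice(pop))
        pop = new_pop
    return pop

final = evolve_song_types()
repertoire = set(final)  # song types present after the run
```

    Varying error_rate, emigration, and generations in such a model shows how gradual turnover of song types can produce smooth dialect borders while shared types still mark a population.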

  3. Normalized modes at selected points without normalization

    NASA Astrophysics Data System (ADS)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem | K - λM | = 0 involving two real, symmetric, positive definite matrices K , M satisfy some well-defined orthogonality conditions. Equally well-known is the fact that those eigenvectors can be normalized so that their modal mass μ = ϕᵀMϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but that they are actually intrinsic properties of the pair of matrices K , M; that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked until now, and it has interesting theoretical implications.
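    The classical normalization step described above, dividing each unscaled mode by the square root of its modal mass μ = ϕᵀMϕ, can be sketched numerically. This is a generic NumPy illustration of the textbook procedure on a small invented (K, M) pair, not of the paper's residue-theorem result.

```python
import numpy as np

K = np.array([[4.0, -1.0], [-1.0, 3.0]])  # stiffness: symmetric positive definite
M = np.array([[2.0, 0.0], [0.0, 1.0]])    # mass: symmetric positive definite

# Reduce K phi = lambda M phi to a standard symmetric problem via Cholesky M = L L^T
L = np.linalg.cholesky(M)
Linv = np.linalg.inv(L)
A = Linv @ K @ Linv.T
lam, Y = np.linalg.eigh(A)
Phi = Linv.T @ Y  # back-transformed generalized eigenvectors (columns)

# Mass-normalize: divide each mode by sqrt of its modal mass mu = phi^T M phi
# (with this Cholesky route the modes come out mass-normalized already,
# so the division is a no-op that makes the formula explicit)
mu = np.einsum('ij,jk,ki->i', Phi.T, M, Phi)
Phi_n = Phi / np.sqrt(mu)

modal_mass = Phi_n.T @ M @ Phi_n  # should equal the identity matrix
```

    The checks that matter are that Phi_n.T @ M @ Phi_n is the identity (unit modal masses) and that each column still satisfies K ϕ = λ M ϕ.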

  4. Clarifying Normalization

    ERIC Educational Resources Information Center

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…

  5. Pre Normal Science and the Transition to Post-Normal Policy

    NASA Astrophysics Data System (ADS)

    Halpern, J. B.

    2015-12-01

    Post-Normal Science as formulated by Funtowicz and Ravetz describes cases where "facts are uncertain, values in dispute, stakes high, and decisions urgent". However, Post-Normal Science is better described as Pre-Normal Science, the stage at which something has been observed but no one quite knows where it came from, what it means (science), or what to do about it (policy). The initial flailing about to reach a useful understanding is later used by those who oppose action to obfuscate, by insisting that nothing is yet known, that what is known is wrong, or at best that more research is needed. Consider AIDS/HIV, stratospheric ozone, tobacco, acid rain, climate change, etc. As these issues gained attention, we entered the Pre-Normal Science stage. What was the cause? How could they be dealt with? Every idea could be proposed, and was. Normal science sorted through them. Many proposers of the discarded theories still clutch them strongly, but these ideas are mostly dismissed within the scientific community. Post-Normal Policy ensues when normal science has reached a consensus and it is clear that action is needed, but it is economically or philosophically impossible for some to accept that. The response is to deny the utility of science and scientific judgment, hence the attacks on scientists and scientific panels that provide policy makers with their best scientific advice. Recognizing the division between Pre-Normal Science and Post-Normal Policy, and the uses of the former to block action by the latter, is useful for understanding the course of controversies that require normal science to influence policy.

  6. Advocating for Normal Birth With Normal Clothes

    PubMed Central

    Waller-Wise, Renece

    2007-01-01

    Childbirth educators need to be aware that the clothes they wear when teaching classes send a nonverbal message to class participants. Regardless of who wears the clothing or what is worn, clothes send a message; thus, both the advantages and disadvantages related to clothing choice should be considered. Ultimately, the message should reflect the values of supporting normal birth. For childbirth educators who are allowed to choose their own apparel to wear in their classes, street clothes may be the benchmark for which to strive. This article discusses the many nonverbal messages that clothes convey and provides support for the choice of street clothes as the dress for the professional childbirth educator; thus, “normal clothes to promote normal birth.” PMID:18408807

  7. Wissenschaft, die unsere Kultur verändert. Tiefenschichten des Streits um die Evolutionstheorie

    NASA Astrophysics Data System (ADS)

    Patzelt, Werner J.

    The theory of evolution is one of the most successful scientific theories. It allows us to understand our origins and to grasp risky traits of the human species in particular. At the same time, it is one of the most contested theories. This is due not to its empirical soundness but to its subject matter: unlike hundreds of other scientific theories, it deals not only with the "world out there" but above all with ourselves and our place in this world. To some it is, moreover, the conqueror of religious superstition; to others, a new approach to God and his workings in the world. Furthermore, some see in evolution an indubitable fact on a par with gravity or the Holocaust, while others see a hypothesis that is still, or permanently, unproven, or even a false doctrine of creation. And while most disputes of this kind are settled according to mutually accepted rules of 'normal science', on the question of where our species and culture come from, the intellectual competence of science itself is sometimes disputed altogether. Evidently, when the theory of evolution has been fought over, as it has been for 150 years, quite deep layers of our culture are at stake, and not only of scientific culture. What do these layers look like?

  8. Power of tests of normality for detecting contaminated normal samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thode, H.C. Jr.; Smith, L.A.; Finch, S.J.

    1981-01-01

    Seventeen tests of normality or goodness of fit were evaluated for power at detecting a contaminated normal sample. This study used 1000 replications each of samples of size 12, 17, 25, 33, 50, and 100 from six different contaminated normal distributions. The kurtosis test was the most powerful over all sample sizes and contaminations. The Hogg and weighted Kolmogorov-Smirnov tests were second. The Kolmogorov-Smirnov, chi-squared, Anderson-Darling, and Cramer-von-Mises tests had very low power at detecting contaminated normal random variables. Tables of the power of the tests and the power curves of certain tests are given.
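    The two extremes of this comparison are easy to reproduce with SciPy (a minimal sketch; the 10% contamination with a five-fold standard deviation is an illustrative assumption, not one of the study's six distributions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Contaminated normal sample: roughly 90% N(0, 1) plus 10% N(0, 5^2)
n = 100
mask = rng.random(n) < 0.1
sample = np.where(mask, rng.normal(0.0, 5.0, n), rng.normal(0.0, 1.0, n))

# Kurtosis test (the most powerful test in the study)
k_stat, k_p = stats.kurtosistest(sample)

# Kolmogorov-Smirnov test against a fitted normal (low power in the study)
ks_stat, ks_p = stats.kstest(sample, 'norm',
                             args=(sample.mean(), sample.std(ddof=1)))

print(f"kurtosis test p = {k_p:.4f}, KS test p = {ks_p:.4f}")
```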

  9. Interactions between Polygonal Normal Faults and Larger Normal Faults, Offshore Nova Scotia, Canada

    NASA Astrophysics Data System (ADS)

    Pham, T. Q. H.; Withjack, M. O.; Hanafi, B. R.

    2017-12-01

    Polygonal faults, small normal faults with polygonal arrangements that form in fine-grained sedimentary rocks, can influence ground-water flow and hydrocarbon migration. Using well and 3D seismic-reflection data, we have examined the interactions between polygonal faults and larger normal faults on the passive margin of offshore Nova Scotia, Canada. The larger normal faults strike approximately E-W to NE-SW. Growth strata indicate that the larger normal faults were active in the Late Cretaceous (i.e., during the deposition of the Wyandot Formation) and during the Cenozoic. The polygonal faults were also active during the Cenozoic because they affect the top of the Wyandot Formation, a fine-grained carbonate sedimentary rock, and the overlying Cenozoic strata. Thus, the larger normal faults and the polygonal faults were both active during the Cenozoic. The polygonal faults far from the larger normal faults have a wide range of orientations. Near the larger normal faults, however, most polygonal faults have preferred orientations, either striking parallel or perpendicular to the larger normal faults. Some polygonal faults nucleated at the tip of a larger normal fault, propagated outward, and linked with a second larger normal fault. The strike of these polygonal faults changed as they propagated outward, ranging from parallel to the strike of the original larger normal fault to orthogonal to the strike of the second larger normal fault. These polygonal faults hard-linked the larger normal faults at and above the level of the Wyandot Formation but not below it. We argue that the larger normal faults created stress-enhancement and stress-reorientation zones for the polygonal faults. Numerous small, polygonal faults formed in the stress-enhancement zones near the tips of larger normal faults. Stress-reorientation zones surrounded the larger normal faults far from their tips. 
Fewer polygonal faults are present in these zones, and, more importantly, most polygonal faults

  10. 25 Jahre - Institut fuer Geodaesie, Teil 2: Forschungsarbeiten und Veroeffentlichungen (25 Years - Institute of Geodesy, Part 2: Research Areas and Publications)

    DTIC Science & Technology

    2000-01-01

    ...the "Georg von Neumayer" [station]. The geodetic contribution to the interdisciplinary research project "Massenhaushalt und Dynamik" (mass balance and dynamics) essentially comprises... to robustify. Two lines of development can be distinguished here. One attempts to apply the theory developed in statistics to geodetic... Hampel, F.R.: Contributions to the theory of robust estimation. Ph.D. Thesis, University of California, Berkeley, 1968; Hampel, F.R.: A general qualitative

  11. Smooth quantile normalization.

    PubMed

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
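    The classic quantile normalization that qsmooth generalizes forces every sample to share one reference distribution, the mean of the sorted columns; a minimal sketch on a toy matrix (not the qsmooth implementation itself):

```python
import numpy as np

def quantile_normalize(X):
    """Map each sample (column) onto the mean of the sorted columns,
    so that all samples share exactly the same distribution."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # per-column ranks
    reference = np.sort(X, axis=0).mean(axis=1)         # reference distribution
    return reference[ranks]

# Toy matrix: 5 features x 3 samples with different scales
X = np.array([[5., 4., 3.],
              [2., 1., 4.],
              [3., 4., 6.],
              [4., 2., 8.],
              [1., 3., 5.]])
Xn = quantile_normalize(X)

# After normalization every column has identical sorted values
S = np.sort(Xn, axis=0)
print(np.allclose(S, S[:, [0]]))  # True
```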

  12. Cortical Thinning in Network-Associated Regions in Cognitively Normal and Below-Normal Range Schizophrenia

    PubMed Central

    Pinnock, Farena; Parlar, Melissa; Hawco, Colin; Hanford, Lindsay; Hall, Geoffrey B.

    2017-01-01

    This study assessed whether cortical thickness across the brain and regionally in terms of the default mode, salience, and central executive networks differentiates schizophrenia patients and healthy controls with normal range or below-normal range cognitive performance. Cognitive normality was defined using the MATRICS Consensus Cognitive Battery (MCCB) composite score (T = 50 ± 10) and structural magnetic resonance imaging was used to generate cortical thickness data. Whole brain analysis revealed that cognitively normal range controls (n = 39) had greater cortical thickness than both cognitively normal (n = 17) and below-normal range (n = 49) patients. Cognitively normal controls also demonstrated greater thickness than patients in regions associated with the default mode and salience, but not central executive networks. No differences on any thickness measure were found between cognitively normal range and below-normal range controls (n = 24) or between cognitively normal and below-normal range patients. In addition, structural covariance between network regions was high and similar across subgroups. Positive and negative symptom severity did not correlate with thickness values. Cortical thinning across the brain and regionally in relation to the default and salience networks may index shared aspects of the psychotic psychopathology that defines schizophrenia with no relation to cognitive impairment. PMID:28348889

  13. Multivariate normality

    NASA Technical Reports Server (NTRS)

    Crutcher, H. L.; Falls, L. W.

    1976-01-01

    Sets of experimentally determined or routinely observed data provide information about the past, present and, hopefully, future sets of similarly produced data. An infinite set of statistical models exists which may be used to describe the data sets. The normal distribution is one model. If it serves at all, it serves well. If a data set, or a transformation of the set, representative of a larger population can be described by the normal distribution, then valid statistical inferences can be drawn. There are several tests which may be applied to a data set to determine whether the univariate normal model adequately describes the set. The chi-square test based on Pearson's work in the late nineteenth and early twentieth centuries is often used. Like all tests, it has some weaknesses which are discussed in elementary texts. Extension of the chi-square test to the multivariate normal model is provided. Tables and graphs permit easier application of the test in the higher dimensions. Several examples, using recorded data, illustrate the procedures. Tests of maximum absolute differences, mean sum of squares of residuals, runs and changes of sign are included in these tests. Dimensions one through five with selected sample sizes 11 to 101 are used to illustrate the statistical tests developed.
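    The chi-square goodness-of-fit test mentioned above can be sketched for the univariate case (a generic equal-probability-bin construction on simulated data; the bin count and sample size are illustrative assumptions, not the paper's tables or its multivariate extension):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.normal(10.0, 2.0, 200)          # simulated observations

# Fit the normal model, then bin into k classes of equal probability
mu, sigma = x.mean(), x.std(ddof=1)
k = 10
edges = stats.norm.ppf(np.linspace(0.0, 1.0, k + 1), mu, sigma)
observed, _ = np.histogram(x, bins=edges)
expected = np.full(k, len(x) / k)

# Pearson chi-square statistic; 2 fitted parameters cost 2 degrees of freedom
chi2 = ((observed - expected) ** 2 / expected).sum()
p = stats.chi2.sf(chi2, df=k - 1 - 2)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```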

  14. Normal Coagulation

    DTIC Science & Technology

    2014-09-04

    Garbled OCR excerpt from Chapter 34, "Normal Coagulation" (Section 7, "Bleeding and Clotting"): recoverable fragments mention clotting with vitamin K antagonists, Table 34-1 ("Procoagulant ..."), Figure 34-4 ("Vitamin K-dependent ..."), and a note that the material is confidential until formal publication.

  15. New spatial upscaling methods for multi-point measurements: From normal to p-normal

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Li, Xin

    2017-12-01

    Careful attention must be given to determining whether the geophysical variables of interest are normally distributed, since the assumption of a normal distribution may not accurately reflect the probability distribution of some variables. As a generalization of the normal distribution, the p-normal distribution and its corresponding maximum likelihood estimation (the least power estimation, LPE) were introduced in upscaling methods for multi-point measurements. Six methods, including three normal-based methods, i.e., arithmetic average, least square estimation, block kriging, and three p-normal-based methods, i.e., LPE, geostatistics LPE and inverse distance weighted LPE are compared in two types of experiments: a synthetic experiment to evaluate the performance of the upscaling methods in terms of accuracy, stability and robustness, and a real-world experiment to produce real-world upscaling estimates using soil moisture data obtained from multi-scale observations. The results show that the p-normal-based methods produced lower mean absolute errors and outperformed the other techniques due to their universality and robustness. We conclude that introducing appropriate statistical parameters into an upscaling strategy can substantially improve the estimation, especially if the raw measurements are disorganized; however, further investigation is required to determine which parameter is the most effective among variance, spatial correlation information and parameter p.
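    The least power estimation (LPE) behind the p-normal methods chooses the location m that minimizes sum_i |x_i - m|^p; a minimal sketch (the bounded scalar search and the toy readings are illustrative assumptions, not the paper's upscaling experiments):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lpe(x, p):
    """Least power estimate: argmin_m sum_i |x_i - m|**p, the
    maximum-likelihood location under a p-normal distribution."""
    res = minimize_scalar(lambda m: np.sum(np.abs(x - m) ** p),
                          bounds=(x.min(), x.max()), method='bounded')
    return res.x

# Toy multi-point soil-moisture readings with one disorganized measurement
x = np.array([0.21, 0.23, 0.22, 0.24, 0.20, 0.60])

print(lpe(x, 2.0))   # equals the arithmetic mean, pulled toward the outlier
print(lpe(x, 1.2))   # stays closer to the bulk of the readings
```

    For p = 2 the estimate reduces to the arithmetic mean; lowering p toward 1 moves it toward the median, which is why the p-normal methods are more robust to disorganized measurements.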

  16. Biosynthesis of UDP-GlcNAc, UndPP-GlcNAc and UDP-GlcNAcA Involves Three Easily Distinguished 4-Epimerase Enzymes, Gne, Gnu and GnaB

    PubMed Central

    Cunneen, Monica M.; Liu, Bin; Wang, Lei; Reeves, Peter R.

    2013-01-01

    We have undertaken an extensive survey of a group of epimerases originally named Gne, that were thought to be responsible for inter-conversion of UDP-N-acetylglucosamine (UDP-GlcNAc) and UDP-N-acetylgalactosamine (UDP-GalNAc). The analysis builds on recent work clarifying the specificity of some of these epimerases. We find three well defined clades responsible for inter-conversion of the gluco- and galacto-configuration at C4 of different N-acetylhexosamines. Their major biological roles are the formation of UDP-GalNAc, UDP-N-acetylgalactosaminuronic acid (UDP-GalNAcA) and undecaprenyl pyrophosphate-N-acetylgalactosamine (UndPP-GalNAc) from the corresponding glucose forms. We propose that the clade of UDP-GlcNAcA epimerase genes be named gnaB and the clade of UndPP-GlcNAc epimerase genes be named gnu, while the UDP-GlcNAc epimerase genes retain the name gne. The Gne epimerases, as now defined after exclusion of those to be named GnaB or Gnu, are in the same clade as the GalE 4-epimerases for inter-conversion of UDP-glucose (UDP-Glc) and UDP-galactose (UDP-Gal). This work brings clarity to an area that had become quite confusing. The identification of distinct enzymes for epimerisation of UDP-GlcNAc, UDP-GlcNAcA and UndPP-GlcNAc will greatly facilitate allocation of gene function in polysaccharide gene clusters, including those found in bacterial genome sequences. A table of the accession numbers for the 295 proteins used in the analysis is provided to enable the major tree to be regenerated with the inclusion of additional proteins of interest. This and other suggestions for annotation of 4-epimerase genes will facilitate annotation. PMID:23799153

  17. Visual Memories Bypass Normalization.

    PubMed

    Bloem, Ilona M; Watanabe, Yurika L; Kibbe, Melissa M; Ling, Sam

    2018-05-01

    How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores-neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation.

  18. Group normalization for genomic data.

    PubMed

    Ghandi, Mahmoud; Beer, Michael A

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.
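    The core idea, normalizing each probe against a reference set of probes with similar responses, can be sketched on simulated data (a deliberately simplified toy version, not the published Group Normalization algorithm):

```python
import numpy as np

def group_normalize(treatment, control, k=10):
    """For each probe, pick the k probes whose control responses are most
    similar and subtract their mean treatment signal (simplified sketch)."""
    normalized = np.empty_like(treatment)
    for i in range(len(treatment)):
        # k most similar probes in the control, excluding probe i itself
        idx = np.argsort(np.abs(control - control[i]))[1:k + 1]
        normalized[i] = treatment[i] - treatment[idx].mean()
    return normalized

rng = np.random.default_rng(1)
n = 200
probe_effect = rng.normal(0.0, 1.0, n)            # local sequence bias
control = probe_effect + rng.normal(0.0, 0.1, n)  # control channel
signal = np.zeros(n)
signal[:10] = 2.0                                 # true enrichment at 10 probes
treatment = probe_effect + signal + rng.normal(0.0, 0.1, n)

gn = group_normalize(treatment, control)
print(gn[:10].mean(), gn[10:].mean())  # enriched probes stand out near 2
```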

  19. 25 Jahre - Institut fuer Geodaesie, Teil 1: Wissenschaftliche Beitraege und Berichte (25 Years - Institute of Geodesy, Part 1: Scientific Contributions and Reports)

    DTIC Science & Technology

    2000-01-01

    R.: Contribution to the Theory of Robust Estimation. PhD Thesis, Univ. of California, Berkeley, 1968. HEISTER, H., HOLLMANN, R. and LANG, M.: Multipath... AVN 106, pp. 128-133, 1999. RCHRNOSSL, H., BRUNNER, F. and ROTHACHER, M.: Modelling of the tropospheric correction for deformation measurements with GPS... reports on the developments in theory and practice that justify the expectation that geodetic astronomy will in future once again play a

  20. Visual Memories Bypass Normalization

    PubMed Central

    Bloem, Ilona M.; Watanabe, Yurika L.; Kibbe, Melissa M.; Ling, Sam

    2018-01-01

    How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores—neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation. PMID:29596038

  1. Group Normalization for Genomic Data

    PubMed Central

    Ghandi, Mahmoud; Beer, Michael A.

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets. PMID:22912661

  2. Is this the right normalization? A diagnostic tool for ChIP-seq normalization.

    PubMed

    Angelini, Claudia; Heller, Ruth; Volkinshtein, Rita; Yekutieli, Daniel

    2015-05-09

    ChIP-seq experiments are becoming a standard approach for genome-wide profiling of protein-DNA interactions, such as detecting transcription factor binding sites, histone modification marks and RNA Polymerase II occupancy. However, when comparing a ChIP sample versus a control sample, such as Input DNA, normalization procedures have to be applied in order to remove experimental sources of bias. Despite the substantial impact that the choice of normalization method can have on the results of a ChIP-seq data analysis, the assessment of such methods is not fully explored in the literature. In particular, there are no diagnostic tools that show whether the applied normalization is indeed appropriate for the data being analyzed. In this work we propose a novel diagnostic tool to examine the appropriateness of the estimated normalization procedure. By plotting the empirical densities of log relative risks in bins of equal read count, along with the estimated normalization constant, after logarithmic transformation, the researcher is able to assess the appropriateness of the estimated normalization constant. We use the diagnostic plot to evaluate the appropriateness of the estimates obtained by CisGenome, NCIS and CCAT on several real data examples. Moreover, we show the impact that the choice of the normalization constant can have on standard tools for peak calling such as MACS or SICER. Finally, we propose a novel procedure for controlling the FDR using sample swapping. This procedure makes use of the estimated normalization constant in order to gain power over the naive choice of constant (used in MACS and SICER), which is the ratio of the total number of reads in the ChIP and Input samples. Linear normalization approaches aim to estimate a scale factor, r, to adjust for different sequencing depths when comparing ChIP versus Input samples. The estimated scaling factor can easily be incorporated in many peak caller algorithms to improve the accuracy of the peak identification.
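    The naive scaling constant described above, the ratio of total read counts in the ChIP and Input samples, can be sketched on toy per-bin counts (illustrative numbers, not data from the paper):

```python
import numpy as np

# Toy per-bin read counts for a ChIP sample and its Input control
chip = np.array([12, 8, 45, 7, 9, 60, 10, 11], dtype=float)
inp = np.array([6, 5, 7, 4, 5, 8, 6, 5], dtype=float)

# Naive linear normalization constant: ratio of total sequencing depths
r = chip.sum() / inp.sum()

# Scale the Input to ChIP depth, then inspect per-bin log relative risks
log_relative_risk = np.log2(chip / (r * inp))
print(round(r, 3), np.round(log_relative_risk, 2))
```

    Bins whose log relative risk remains large after this depth adjustment (here the third and sixth bins) are the candidates a peak caller would flag; the paper's point is that better estimates of r than this naive ratio improve that call.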

  3. Self-image and perception of mother and father in psychotic and borderline patients.

    PubMed

    Armelius, K; Granberg

    2000-02-01

    Psychotic and borderline patients rated their self-image and their perception of their mother and father using the Structural Analysis of Social Behavior model (SASB). The borderline patients had more negative images of themselves and their parents, especially their fathers, than did the psychotic patients and the normal subjects, while the psychotic patients' ratings did not differ much from those of the normal subjects. The self-image was related to the images of both parents for borderline patients and normal subjects, while for the psychotic patients only the image of the mother was important for the self-image. In addition, the psychotic patients did not differentiate between the poles of control and autonomy in the introjected self-image. It was concluded that borderline patients are characterized by negative attachment, while psychotic patients are characterized by poor separation from the mother and poor differentiation between autonomy and control. The paper also discusses how this may influence the patients' relations to others.

  4. Statokinesigram normalization method.

    PubMed

    de Oliveira, José Magalhães

    2017-02-01

    Stabilometry is a technique that aims to study the body sway of human subjects, employing a force platform. The signal obtained from this technique refers to the position of the foot base ground-reaction vector, known as the center of pressure (CoP). The parameters calculated from the signal are used to quantify the displacement of the CoP over time; there is a large variability, both between and within subjects, which prevents the definition of normative values. The intersubject variability is related to differences between subjects in terms of their anthropometry, in conjunction with their muscle activation patterns (biomechanics); and the intrasubject variability can be caused by a learning effect or fatigue. Age and foot placement on the platform are also known to influence variability. Normalization is the main method used to decrease this variability and to bring distributions of adjusted values into alignment. In 1996, O'Malley proposed three normalization techniques to eliminate the effect of age and anthropometric factors from temporal-distance parameters of gait. These techniques were adopted to normalize the stabilometric signal by some authors. This paper proposes a new method of normalization of stabilometric signals to be applied in balance studies. The method was applied to a data set collected in a previous study, and the results of normalized and nonnormalized signals were compared. The results showed that the new method, if used in a well-designed experiment, can eliminate undesirable correlations between the analyzed parameters and the subjects' characteristics and show only the experimental conditions' effects.
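    A generic way to achieve what such normalization aims for, eliminating correlations between an analyzed parameter and subject characteristics, is to regress the parameter on the characteristic and keep the residuals; a sketch on assumed toy data (not the paper's specific procedure or O'Malley's techniques):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: a CoP sway parameter that grows with subject height (bias)
height = rng.uniform(1.50, 1.95, 40)              # m
sway = 20.0 * height + rng.normal(0.0, 1.0, 40)   # cm, assumed linear bias

# Regress the parameter on the characteristic; keep the residuals
slope, intercept = np.polyfit(height, sway, 1)
normalized = sway - (slope * height + intercept)

# The normalized parameter no longer correlates with height
print(abs(np.corrcoef(height, normalized)[0, 1]) < 1e-6)  # True
```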

  5. Die nuklearen Anlagen von Hanford (1943-1987) Eine Fallstudie über die Schnittstellen von Physik, Biologie und die US-amerikanische Gesellschaft zur Zeit des Kalten Krieges

    NASA Astrophysics Data System (ADS)

    Macuglia, Daniele

    The history of the Cold War opens up many opportunities to engage more closely with the interfaces of physics and biology during the 20th century. Not only the 1986 Chernobyl disaster but also the example of the Hanford nuclear site in the United States shows the biological consequences of nuclear physics.

  6. NOAA predicts near-normal or below-normal 2014 Atlantic hurricane season

    Science.gov Websites

    NOAA predicts a near-normal or below-normal 2014 Atlantic hurricane season, with the anticipated development of El Niño this summer as the main driver of this year's outlook. Related link: Atlantic Basin Hurricane Season Outlook Discussion (El Niño/Southern Oscillation, ENSO).

  7. Normal people working in normal organizations with normal equipment: system safety and cognition in a mid-air collision.

    PubMed

    de Carvalho, Paulo Victor Rodrigues; Gomes, José Orlando; Huber, Gilbert Jacob; Vidal, Mario Cesar

    2009-05-01

    A fundamental challenge in improving the safety of complex systems is to understand how accidents emerge in normal working situations, with equipment functioning normally in normally structured organizations. We present a field study, illustrating one response to this challenge, of the en-route mid-air collision between a commercial carrier and an executive jet in the clear afternoon Amazon sky, in which 154 people lost their lives. Our focus was on how and why the several safety barriers of a well-structured air traffic system melted down, enabling this tragedy to occur without any catastrophic component failure and in a situation where everything was functioning normally. We identify strong consistencies and feedbacks regarding factors of day-to-day system functioning that made monitoring and awareness difficult, and the cognitive strategies that operators have developed to deal with overall system behavior. These findings emphasize the active problem-solving behavior needed in air traffic control work, and highlight how the day-to-day functioning of the system can jeopardize such behavior. An immediate consequence is that safety managers and engineers should review their traditional safety approach and accident models based on equipment failure probability, linear combinations of failures, rules and procedures, and human errors, in order to deal with complex patterns of coincidence possibilities, unexpected links, resonance among system functions and activities, and system cognition.

  8. Sympathetic nerve traffic and baroreflex function in optimal, normal, and high-normal blood pressure states.

    PubMed

    Seravalle, Gino; Lonati, Laura; Buzzi, Silvia; Cairo, Matteo; Quarti Trevano, Fosca; Dell'Oro, Raffaella; Facchetti, Rita; Mancia, Giuseppe; Grassi, Guido

    2015-07-01

    Adrenergic activation and baroreflex dysfunction are common in established essential hypertension, elderly hypertension, masked and white-coat hypertension, resistant hypertension, and obesity-related hypertension. Whether this autonomic behavior is peculiar to established hypertension or is also detectable in the earlier clinical phases of the disease, that is, the high-normal blood pressure (BP) state, is still largely undefined, however. In 24 individuals with optimal BP (age: 37.1 ± 2.1 years, mean ± SEM), in 27 with normal BP, and in 38 with high-normal BP, age-matched with the optimal BP group, we measured clinic, 24-h, and beat-to-beat BP, heart rate (HR), and muscle sympathetic nerve activity (MSNA) at rest and during baroreceptor stimulation and deactivation. Measurements also included anthropometric values as well as echocardiographic parameters and the homeostasis model assessment (HOMA) index. For similar anthropometric values, clinic, 24-h ambulatory, and beat-to-beat BPs were significantly greater in normal BP than in optimal BP. This was also the case when the high-normal BP group was compared to the normal and optimal BP groups. MSNA (but not HR) was also significantly greater in high-normal BP than in normal BP and optimal BP (51.3 ± 2.0 vs. 40.3 ± 2.3 and 41.1 ± 2.6 bursts per 100 heartbeats, respectively, P < 0.01). The sympathetic activation seen in high-normal BP was coupled with an impairment of baroreflex HR control (but not MSNA) and with a significant increase in the HOMA index, which showed a significant direct relationship with MSNA. Thus, independently of which BP measure the diagnosis is based on, high-normal BP is a condition characterized by sympathetic activation. This neurogenic alteration, which is likely to be triggered by metabolic rather than reflex alterations, might be involved, together with other factors, in the progression of the condition to established hypertension.

  9. Upper-normal waist circumference is a risk marker for metabolic syndrome in normal-weight subjects.

    PubMed

    Okada, R; Yasuda, Y; Tsushita, K; Wakai, K; Hamajima, N; Matsuo, S

    2016-01-01

    To elucidate the implications of upper-normal waist circumference (WC), we examined whether the normal range of WC still represents a risk of metabolic syndrome (MetS) or non-adipose MetS components among normal-weight subjects. A total of 173,510 persons (100,386 men and 73,124 women) with normal WC (<90/80 cm in men/women) and body mass index (BMI) of 18.5-24.9 were included. Subjects were categorized as having low, moderate, and upper-normal WC for WC < 80, 80-84, and 85-89 cm in men and <70, 70-74, and 75-79 cm in women, respectively. The prevalence of all the non-adipose MetS components (e.g. prediabetes and borderline dyslipidemia) was significantly higher in subjects with upper-normal WC compared with those with low WC. Overall, the prevalence of MetS (having three or more of four non-adipose MetS components) gradually increased with increasing WC (12%, 21%, and 27% in men and 11%, 14%, and 19% in women for low, moderate, and upper-normal WC, respectively). Moreover, the risk of having a greater number of MetS components increased in subjects with upper-normal WC compared with those with low WC (odds ratios for the number of one, two, three, and four MetS components: 1.29, 1.81, 2.53, and 2.47 in men and 1.16, 1.55, 1.49, and 2.20 in women, respectively). Upper-normal WC represents a risk for acquiring a greater number of MetS components and the early stage of MetS components (prediabetes and borderline dyslipidemia), after adjusting for BMI, in a large general population with normal WC and BMI. Copyright © 2015 The Italian Society of Diabetology, the Italian Society for the Study of Atherosclerosis, the Italian Society of Human Nutrition, and the Department of Clinical Medicine and Surgery, Federico II University. Published by Elsevier B.V. All rights reserved.
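    The study's waist-circumference categories can be expressed as a small helper. A minimal sketch in Python, assuming the cut-offs quoted above; the function name and interface are our own, not the authors':

```python
def wc_category(wc_cm, sex):
    """Classify a normal-range waist circumference using the study's
    cut-offs: men <80 / 80-84 / 85-89 cm, women <70 / 70-74 / 75-79 cm.
    sex is "M" or "F"; values at or above 90/80 cm fall outside the
    study's normal range."""
    low_cut, mod_cut, upper_limit = {"M": (80, 85, 90), "F": (70, 75, 80)}[sex]
    if wc_cm >= upper_limit:
        raise ValueError("WC outside the study's normal range")
    if wc_cm < low_cut:
        return "low"
    if wc_cm < mod_cut:
        return "moderate"
    return "upper-normal"
```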

  10. Supervised normalization of microarrays

    PubMed Central

    Mecham, Brigham H.; Nelson, Peter S.; Storey, John D.

    2010-01-01

    Motivation: A major challenge in utilizing microarray technologies to measure nucleic acid abundances is ‘normalization’, the goal of which is to separate biologically meaningful signal from other confounding sources of signal, often due to unavoidable technical factors. It is intuitively clear that true biological signal and confounding factors need to be simultaneously considered when performing normalization. However, the most popular normalization approaches do not utilize what is known about the study, both in terms of the biological variables of interest and the known technical factors in the study, such as batch or array processing date. Results: We show here that failing to include all study-specific biological and technical variables when performing normalization leads to biased downstream analyses. We propose a general normalization framework that fits a study-specific model employing every known variable that is relevant to the expression study. The proposed method is generally applicable to the full range of existing probe designs, as well as to both single-channel and dual-channel arrays. We show through real and simulated examples that the method has favorable operating characteristics in comparison to some of the most highly used normalization methods. Availability: An R package called snm implementing the methodology will be made available from Bioconductor (http://bioconductor.org). Contact: jstorey@princeton.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20363728
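    The core idea, fitting a per-probe model that contains every known biological and technical variable and then subtracting only the technical part, can be sketched as follows. This is a simplified least-squares illustration, not the snm package's actual algorithm (snm is an R package); all names below are our own:

```python
import numpy as np

def supervised_normalize(Y, bio, tech):
    """Fit intensity = intercept + biology + technical factors per probe
    by least squares, then subtract only the technical component.
    Y: probes x samples; bio, tech: samples x covariates."""
    n_samples = Y.shape[1]
    X = np.column_stack([np.ones(n_samples), bio, tech])
    beta, *_ = np.linalg.lstsq(X, Y.T, rcond=None)   # coefficients per probe
    k = 1 + bio.shape[1]                             # skip intercept + biology
    return Y - (X[:, k:] @ beta[k:, :]).T            # remove technical fit only
```

    On a noise-free toy dataset with a group effect and a batch effect, this removes the batch shift exactly while leaving the biological difference intact, which is the behavior the abstract argues popular unsupervised methods cannot guarantee.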

  11. Laser-scanned fluorescence of nonlased/normal, lased/normal, nonlased/carious, and lased/carious enamel

    NASA Astrophysics Data System (ADS)

    Zakariasen, Kenneth L.; Barron, Joseph R.; Paton, Barry E.

    1992-06-01

    Research has shown that low levels of CO2 laser irradiation raise enamel resistance to sub-surface demineralization. Additionally, laser-scanned fluorescence analysis of enamel, as well as laser and white light reflection studies, have potential for both clinical diagnosis and comparative research investigations of the caries process. This study was designed to compare laser fluorescence and laser/white light reflection of (1) non-lased/normal with lased/normal enamel and (2) non-lased/normal with non-lased/carious and lased/carious enamel. Specimens were buccal surfaces of extracted third molars, coated with acid-resistant varnish except for either two or three 2.25 mm2 windows (two-window specimens: non-lased/normal, lased/normal; three-window specimens: non-lased/normal, non-lased/carious, lased/carious). Teeth exhibiting carious windows were immersed in a demineralizing solution for twelve days. Non-carious windows were covered with wax during immersion. Following immersion, the wax was removed, and fluorescence and laser/white light reflection analyses were performed on all windows utilizing a custom scanning laser fluorescence spectrometer which focuses light from a 25 mW He-Cd laser at 442 nm through an objective lens onto a cross-section ≥ 3 µm in diameter. For laser/white light reflection analyses, reflected light intensities were measured. A HeNe laser was used for laser light reflection studies. Following these analyses, the teeth were sectioned bucco-lingually into 80 µm sections, examined under polarized light microscopy, and the lesions photographed. This permits comparison between fluorescence/reflected light values and the visualized decalcification areas for each section, and thus comparisons between various enamel treatments and normal enamel. The enamel specimens are currently being analyzed.

  12. Bewehrte Betonbauteile unter Betriebsbedingungen: Forschungsbericht

    NASA Astrophysics Data System (ADS)

    Eligehausen, Rolf; Kordina, Karl; Schießl, Peter

    2000-09-01

    1.2 Uncracked state. 1.3 Cracked state. 1.4 FEM calculations. 1.5 Summary. 1.6 References. 2 Effects of differing deformation behavior under loading and unloading on stresses in the serviceability state (Jochen Keysberg). 2.1 Introduction and objectives. 2.2 Models for the moment-curvature relationship. 2.3 Program for nonlinear analysis. 2.4 Influence of load cycles on nonlinear analyses. 2.5 Summary. 2.6 References. 3 3D analysis of beam-column connections of normal- and high-strength concrete under cyclic loading (Josko Ozbolt, Yijun Li and Rolf Eligehausen). 3.1 Introduction. 3.2 Material model and FE discretization. 3.3 Numerical analysis. 3.4 Conclusions. 3.5 Summary. 3.6 References. 4 The influence of free vibrations due to dynamic loading on the deterioration of a structure (Manfred Specht and Michael Kramp). 4.1 Motivation for the research project. 4.2 Research objectives. 4.3 Test girders, test procedure, and test results. 4.4 Evaluation. 4.5 Results for the system identification of reinforced concrete structures. 4.6 References. 5 Local shrinkage and temperature gradients in reinforced near-surface zones of concrete structures (Josef Eibl and Stephan Kranz). 5.1 Problem statement. 5.2 Calculation of temperature and moisture fields. 5.3 Numerical model for stress analysis in concrete. 5.4 Tests performed. 5.5 Computational investigations. 5.6 Summary. 5.7 References. 6 Water penetration behavior of liquids at flexural cracks (Gert König and Christian Brunsch). 6.1 Problem statement. 6.2 Experimental investigations. 6.3 Development of a model for estimating the time-dependent penetration of water into flexural cracks of reinforced concrete members. 6.4 Summary and discussion of the test series. 6.5 References. 7 Durability problems of open basins (György Iványi, Wilhelm Buschmeyer and Udo Paas). 7.1 Introduction. 7

  13. Spinal cord normalization in multiple sclerosis.

    PubMed

    Oh, Jiwon; Seigo, Michaela; Saidha, Shiv; Sotirchos, Elias; Zackowski, Kathy; Chen, Min; Prince, Jerry; Diener-West, Marie; Calabresi, Peter A; Reich, Daniel S

    2014-01-01

    Spinal cord (SC) pathology is common in multiple sclerosis (MS), and measures of SC-atrophy are increasingly utilized. Normalization reduces biological variation of structural measurements unrelated to disease, but optimal parameters for SC volume (SCV)-normalization remain unclear. Using a variety of normalization factors and clinical measures, we assessed the effect of SCV normalization on detecting group differences and clarifying clinical-radiological correlations in MS. 3T cervical SC-MRI was performed in 133 MS cases and 11 healthy controls (HC). Clinical assessment included expanded disability status scale (EDSS), MS functional composite (MSFC), quantitative hip-flexion strength ("strength"), and vibration sensation threshold ("vibration"). SCV between C3 and C4 was measured and normalized individually by subject height, SC-length, and intracranial volume (ICV). There were group differences in raw-SCV and after normalization by height and length (MS vs. HC; progressive vs. relapsing MS-subtypes, P < .05). There were correlations between clinical measures and raw-SCV (EDSS:r = -.20; MSFC:r = .16; strength:r = .35; vibration:r = -.19). Correlations consistently strengthened with normalization by length (EDSS:r = -.43; MSFC:r = .33; strength:r = .38; vibration:r = -.40), and height (EDSS:r = -.26; MSFC:r = .28; strength:r = .22; vibration:r = -.29), but diminished with normalization by ICV (EDSS:r = -.23; MSFC:r = -.10; strength:r = .23; vibration:r = -.35). In relapsing MS, normalization by length allowed statistical detection of correlations that were not apparent with raw-SCV. SCV-normalization by length improves the ability to detect group differences, strengthens clinical-radiological correlations, and is particularly relevant in settings of subtle disease-related SC-atrophy in MS. SCV-normalization by length may enhance the clinical utility of measures of SC-atrophy. Copyright © 2014 by the American Society of Neuroimaging.

  14. Normal Stress or Adjustment Disorder?

    MedlinePlus

    What's the difference between normal stress and an adjustment disorder? Answers from Daniel K. Hall-Flavin, M.D. Stress is a normal psychological and physical reaction to ...

  15. Understanding a Normal Distribution of Data.

    PubMed

    Maltenfort, Mitchell G

    2015-12-01

    Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?
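    In practice, the questions raised above are often answered with a formal normality test plus a variance-stabilizing transform for non-normal data. A brief sketch of that general workflow using SciPy; this is an illustration, not code from the article:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
normal_data = rng.normal(loc=5.0, scale=2.0, size=500)
skewed_data = rng.exponential(scale=2.0, size=500)   # clearly non-normal

# Shapiro-Wilk test: the null hypothesis is that the sample is normal
_, p_normal = stats.shapiro(normal_data)
_, p_skewed = stats.shapiro(skewed_data)             # tiny p-value: reject normality

# One option for right-skewed, strictly positive data is to analyze
# log-transformed values (or to fall back to rank-based tests).
log_data = np.log(skewed_data)
```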

  16. Bereits nach Ablauf der Halbwertszeit droht der vollständige Zerfall. Die britische Atomic Scientists’ Association, die Ideologie der „objektiven” Wissenschaft und die H-Bombe

    NASA Astrophysics Data System (ADS)

    Laucht, Christoph

    President Harry Truman's announcement of January 31, 1950, that his administration would press ahead with the development of the hydrogen bomb attracted wide attention in the British media. The illustrated magazine Picture Post devoted an article to the H-bomb which included, among other things, brief statements by the British atomic scientists Eric Burhop, Kathleen Lonsdale, Harrie Massey, Rudolf Peierls and Maurice Pryce, all of whom were members of the Atomic Scientists' Association (ASA).

  17. Cell proliferation in normal epidermis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weinstein, G.D.; McCullough, J.L.; Ross, P.

    1984-06-01

    A detailed examination of cell proliferation kinetics in normal human epidermis is presented. Using tritiated thymidine with autoradiographic techniques, proliferative and differentiated cell kinetics are defined and interrelated. The proliferative compartment of normal epidermis has a cell cycle duration (Tc) of 311 h derived from 3 components: the germinative labeling index (LI), the duration of DNA synthesis (ts), and the growth fraction (GF). The germinative LI is 2.7% +/- 1.2 and ts is 14 h, the latter obtained from a composite fraction of labeled mitoses curve obtained from 11 normal subjects. The GF, obtained from the literature and from human skin xenografts to nude mice, is estimated to be 60%. Normal-appearing epidermis from patients with psoriasis appears to have a higher proliferation rate. The mean LI is 4.2% +/- 0.9, approximately 50% greater than in normal epidermis. Absolute cell kinetic values for this tissue, however, cannot yet be calculated for lack of other information on ts and GF. A kinetic model for epidermal cell renewal in normal epidermis is described that interrelates the rate of birth/entry, transit, and/or loss of keratinocytes in the 3 epidermal compartments: proliferative, viable differentiated (stratum malpighii), and stratum corneum. Expected kinetic homeostasis in the epidermis is confirmed by the very similar "turnover" rates in each of the compartments, which are, respectively, 1246, 1417, and 1490 cells/day/mm2 surface area. The mean epidermal turnover time of the entire tissue is 39 days. The Tc of 311 h in normal cells is 8-fold longer than the psoriatic Tc of 36 h and is necessary for understanding the hyperproliferative pathophysiologic process in psoriasis.
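    The reported figures are mutually consistent under the standard relation Tc = ts × GF / LI; a quick check using the abstract's values (the arithmetic is ours):

```python
ts = 14.0      # duration of DNA synthesis, hours
LI = 0.027     # germinative labeling index (2.7%)
GF = 0.60      # growth fraction
Tc = ts * GF / LI
print(round(Tc))   # 311 h, matching the reported cell cycle duration
```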

  18. Normal Weight Obesity: A Hidden Health Risk?

    MedlinePlus

    Normal weight obesity: A hidden health risk? Can you be considered obese if you have a normal body weight? Answers from ... considered obese — a condition known as normal weight obesity. Normal weight obesity means you may have the ...

  19. CT of Normal Developmental and Variant Anatomy of the Pediatric Skull: Distinguishing Trauma from Normality.

    PubMed

    Idriz, Sanjin; Patel, Jaymin H; Ameli Renani, Seyed; Allan, Rosemary; Vlahos, Ioannis

    2015-01-01

    The use of computed tomography (CT) in clinical practice has been increasing rapidly, with the number of CT examinations performed in adults and children rising by 10% per year in England. Because the radiology community strives to reduce the radiation dose associated with pediatric examinations, external factors, including guidelines for pediatric head injury, are raising expectations for use of cranial CT in the pediatric population. Thus, radiologists are increasingly likely to encounter pediatric head CT examinations in daily practice. The variable appearance of cranial sutures at different ages can be confusing for inexperienced readers of radiologic images. The evolution of multidetector CT with thin-section acquisition increases the clarity of some of these sutures, which may be misinterpreted as fractures. Familiarity with the normal anatomy of the pediatric skull, how it changes with age, and normal variants can assist in translating the increased resolution of multidetector CT into more accurate detection of fractures and confident determination of normality, thereby reducing prolonged hospitalization of children with normal developmental structures that have been misinterpreted as fractures. More important, the potential morbidity and mortality related to false-negative interpretation of fractures as normal sutures may be avoided. The authors describe the normal anatomy of all standard pediatric sutures, common variants, and sutural mimics, thereby providing an accurate and safe framework for CT evaluation of skull trauma in pediatric patients. (©)RSNA, 2015.

  20. Comprehensive non-dimensional normalization of gait data.

    PubMed

    Pinzone, Ornella; Schwartz, Michael H; Baker, Richard

    2016-02-01

    Normalizing clinical gait analysis data is required to remove variability due to physical characteristics such as leg length and weight. This is particularly important for children, where both are associated with age. In most clinical centres conventional normalization (by mass only) is used, whereas there is a stronger biomechanical argument for non-dimensional normalization. This study used data from 82 typically developing children to compare how the two schemes performed over a wide range of temporal-spatial and kinetic parameters by calculating the coefficients of determination with leg length, weight and height. 81% of the conventionally normalized parameters had a coefficient of determination above the threshold for a statistical association (p<0.05), compared to 23% of those normalized non-dimensionally. All the conventionally normalized parameters exceeding this threshold showed a reduced association with non-dimensional normalization. In conclusion, non-dimensional normalization is more effective than conventional normalization in reducing the effects of height, weight and age in a comprehensive range of temporal-spatial and kinetic parameters. Copyright © 2015 Elsevier B.V. All rights reserved.
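    Non-dimensional normalization divides each parameter by the combination of body mass, leg length, and gravitational acceleration g that carries the same physical dimensions (the scheme popularized by Hof). A sketch assuming that scheme; the parameter set and all names are our own:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def nondimensionalize(mass, leg_len, *, speed=None, cadence=None,
                      step_length=None, moment=None, power=None):
    """Divide each gait parameter by the dimensionally matching
    combination of body mass (kg), leg length (m), and g."""
    out = {}
    if speed is not None:        # m/s
        out["speed"] = speed / math.sqrt(G * leg_len)
    if cadence is not None:      # steps/s
        out["cadence"] = cadence / math.sqrt(G / leg_len)
    if step_length is not None:  # m
        out["step_length"] = step_length / leg_len
    if moment is not None:       # N*m
        out["moment"] = moment / (mass * G * leg_len)
    if power is not None:        # W
        out["power"] = power / (mass * G ** 1.5 * leg_len ** 0.5)
    return out
```

    Conventional normalization would instead divide kinetic quantities by mass only, leaving the leg-length and height dependence that the study measures.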

  1. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
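    As a building block, pointwise intervals for each plotted point follow from the fact that the i-th order statistic of n uniform variates is Beta(i, n−i+1); the paper's contribution is to widen such bands so that all points are covered simultaneously with probability 1−α. A sketch of the pointwise construction only, not the authors' simultaneous intervals:

```python
import numpy as np
from scipy import stats

n = 50
i = np.arange(1, n + 1)

# Plotting positions: expected standard-normal quantiles for each point
theor = stats.norm.ppf((i - 0.5) / n)

# The i-th order statistic of n uniforms is Beta(i, n - i + 1); mapping
# its 2.5% and 97.5% quantiles through the normal inverse CDF gives a
# pointwise 95% band for each plotted point.
lo = stats.norm.ppf(stats.beta.ppf(0.025, i, n - i + 1))
hi = stats.norm.ppf(stats.beta.ppf(0.975, i, n - i + 1))
```

    For the plot itself, the sorted, standardized observations would be drawn against `theor` and compared with the band; simultaneous coverage requires wider limits than these.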

  2. Rock friction under variable normal stress

    USGS Publications Warehouse

    Kilgore, Brian D.; Beeler, Nicholas M.; Lozos, Julian C.; Oglesby, David

    2017-01-01

    The purpose of this study is to determine the detailed response of shear strength and other fault properties to changes in normal stress at room temperature, using dry, initially bare rock surfaces of granite at normal stresses between 5 and 7 MPa. Rapid normal stress changes result in gradual, approximately exponential changes in shear resistance with fault slip. The characteristic length of the exponential change is similar for both increases and decreases in normal stress. In contrast, changes in fault normal displacement and the amplitude of small high-frequency elastic waves transmitted across the surface follow a two-stage response consisting of a large immediate and a smaller gradual response with slip. The characteristic slip distance of the small gradual response is significantly smaller than that of shear resistance. The stability of sliding in response to large step decreases in normal stress is well predicted using the shear resistance slip length observed in step increases. Analysis of the shear resistance and slip-time histories suggests that nearly immediate changes in strength occur in response to rapid changes in normal stress; these are manifested as an immediate change in slip speed. These changes in slip speed can be qualitatively accounted for using a rate-independent strength model. Collectively, the observations and model show that acceleration or deceleration in response to a normal stress change depends on the size of the change, the frictional characteristics of the fault surface, and the elastic properties of the loading system.
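    A gradual, approximately exponential change in shear resistance with slip can be written as a relaxation toward the new steady state over a characteristic slip distance. An illustrative form only; this parameterization is ours, not the authors' fitted model:

```python
import math

def shear_resistance(slip, tau_step, tau_ss, d_c):
    """Shear resistance as a function of slip after a rapid normal stress
    change: tau_step is the value immediately after the step, tau_ss the
    new steady-state value, and d_c the characteristic slip distance."""
    return tau_ss + (tau_step - tau_ss) * math.exp(-slip / d_c)
```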

  3. Evaluation of the ASOS impact on climatic normals and assessment of variable-length time periods in calculation of normals

    NASA Astrophysics Data System (ADS)

    Kauffman, Chad Matthew

    The temperature and precipitation values that describe the norm of daily, monthly, and seasonal climate conditions are "climate normals." They are usually calculated from climate data covering a 30-year period and updated every 10 years; the next update will take place in the year 2001. Because of the advent of the Automated Surface Observing System (ASOS) beginning in the early 1990s, and the recognized temperature bias between ASOS and the conventional temperature sensors, there is uncertainty about how the ASOS data should be used to calculate the 1971-2000 temperature normal. This study examined that uncertainty and offered a method to minimize it. It showed that the ASOS bias has a measurable impact on the new 30-year temperature normal. The impact varies among stations and climate regions. Some stations with a cooling trend in ASOS temperature have a cooler normal for their temperature, while others with a warming trend have a warmer normal. These quantitative evaluations of the ASOS effect for stations and regions can be used to reduce ASOS bias in temperature normals. This study also evaluated temperature normals for periods of different lengths and compared them to the 30-year normal. It showed that the difference between the normals is smaller in maritime climates than in continental temperate climates. In the former, the six-year normal describes a similar temperature variation as the 30-year normal does. In the latter, the 18-year normal starts to resemble the temperature variation that the 30-year normal describes. These results provide a theoretical basis for applying different normals in different regions. The study further compared temperature normals for different periods and identified a seasonal shift in climate change in the southwestern U.S., where the summer maximum temperature has shifted to a late summer month and the winter minimum temperature has shifted to an early winter month over the past 30 years.
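    The effect of window length on a normal is easy to see with a trailing mean; a toy example with hypothetical station data (illustration only, not the study's data or method):

```python
def trailing_normal(annual_values, years):
    """'Normal' computed as the mean of the trailing `years` values."""
    window = annual_values[-years:]
    return sum(window) / len(window)

# Hypothetical 30 annual mean temperatures (deg C) with a slight
# warming trend, as at a continental station
temps = [11.0 + 0.02 * year for year in range(30)]

n30 = trailing_normal(temps, 30)   # the conventional 30-year normal
n6 = trailing_normal(temps, 6)     # a short normal over-weights recent years
```

    Under a warming trend the short normal is biased warm relative to the 30-year normal, which is why short windows suffice only where interannual variation is small, such as in maritime climates.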

  4. A Statistical Selection Strategy for Normalization Procedures in LC-MS Proteomics Experiments through Dataset Dependent Ranking of Normalization Scaling Factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Jacobs, Jon M.

    2011-12-01

    Quantification of LC-MS peak intensities assigned during peptide identification in a typical comparative proteomics experiment will deviate from run to run of the instrument due to both technical and biological variation. Thus, normalization of peak intensities across an LC-MS proteomics dataset is a fundamental pre-processing step. However, the downstream analysis of LC-MS proteomics data can be dramatically affected by the normalization method selected. Current normalization procedures for LC-MS proteomics data are presented in the context of normalization values derived from subsets of the full collection of identified peptides. The distribution of these normalization values is unknown a priori. If they are not independent from the biological factors associated with the experiment, the normalization process can introduce bias into the data, which will affect downstream statistical biomarker discovery. We present a novel approach to evaluate normalization strategies, where a normalization strategy includes the peptide selection component associated with the derivation of normalization values. Our approach evaluates the effect of normalization on the between-group variance structure in order to identify candidate normalization strategies that improve the structure of the data without introducing bias into the normalized peak intensities.
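    One of the baseline strategies such a framework would rank is global median scaling, in which each run's log intensities are shifted so all run medians agree. A minimal sketch; this is our own illustration, not the authors' code:

```python
import numpy as np

def median_scale(log_intensities):
    """Shift each run (column) of a peptides x runs matrix of log
    intensities so that every run median equals the overall median;
    missing peptides (NaN) are ignored when computing medians."""
    run_medians = np.nanmedian(log_intensities, axis=0)
    grand_median = np.nanmedian(log_intensities)
    return log_intensities - (run_medians - grand_median)
```

    The paper's point is that the scaling factors here are derived from all peptides indiscriminately; if their distribution tracks a biological factor, this step quietly removes signal of interest.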

  5. Quaternion normalization in spacecraft attitude determination

    NASA Technical Reports Server (NTRS)

    Deutschmann, Julie; Bar-Itzhack, Itzhack; Galal, Ken

    1992-01-01

    Methods are presented to normalize the attitude quaternion in two extended Kalman filters (EKF), namely, the multiplicative EKF (MEKF) and the additive EKF (AEKF). It is concluded that all the normalization methods work well and yield comparable results. In the AEKF, normalization is not essential, since the data chosen for the test do not have a rapidly varying attitude. In the MEKF, normalization is necessary to avoid divergence of the attitude estimate. All of the methods behave similarly when the spacecraft experiences low angular rates.
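    The simplest such scheme rescales the estimated quaternion to unit norm by dividing by its Euclidean magnitude; a minimal sketch of that step (illustrative only; the paper compares several variants within the MEKF and AEKF):

```python
import numpy as np

def normalize_quaternion(q):
    """Rescale an attitude quaternion to unit norm so that it again
    represents a pure rotation (brute-force normalization)."""
    magnitude = np.linalg.norm(q)
    if magnitude == 0.0:
        raise ValueError("cannot normalize a zero quaternion")
    return q / magnitude
```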

  6. Relating normalization to neuronal populations across cortical areas

    PubMed Central

    Alberts, Joshua J.; Cohen, Marlene R.

    2016-01-01

    Normalization, which divisively scales neuronal responses to multiple stimuli, is thought to underlie many sensory, motor, and cognitive processes. In every study where it has been investigated, neurons measured in the same brain area under identical conditions exhibit a range of normalization, ranging from suppression by nonpreferred stimuli (strong normalization) to additive responses to combinations of stimuli (no normalization). Normalization has been hypothesized to arise from interactions between neuronal populations, either in the same or different brain areas, but current models of normalization are not mechanistic and focus on trial-averaged responses. To gain insight into the mechanisms underlying normalization, we examined interactions between neurons that exhibit different degrees of normalization. We recorded from multiple neurons in three cortical areas while rhesus monkeys viewed superimposed drifting gratings. We found that neurons showing strong normalization shared less trial-to-trial variability with other neurons in the same cortical area and more variability with neurons in other cortical areas than did units with weak normalization. Furthermore, the cortical organization of normalization was not random: neurons recorded on nearby electrodes tended to exhibit similar amounts of normalization. Together, our results suggest that normalization reflects a neuron's role in its local network and that modulatory factors like normalization share the topographic organization typical of sensory tuning properties. PMID:27358313
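    Divisive normalization is commonly modeled as R_i = d_i / (σ + α Σ_j d_j), where α sets the strength of the suppressive pool. A sketch of this standard textbook form (not the authors' fitted model) showing the range from additive responses (no normalization) to strong suppression:

```python
def divisive_normalization(drives, sigma=1.0, alpha=1.0):
    """Responses to a set of stimulus drives under the standard divisive
    normalization model R_i = d_i / (sigma + alpha * sum_j d_j)."""
    pool = sum(drives)
    return [d / (sigma + alpha * pool) for d in drives]

# alpha = 0: no normalization, responses to two stimuli simply add
additive = sum(divisive_normalization([10.0, 10.0], alpha=0.0))
# alpha = 1: strong normalization, the same pair is divisively suppressed
suppressed = sum(divisive_normalization([10.0, 10.0], alpha=1.0))
```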

  7. Relating normalization to neuronal populations across cortical areas.

    PubMed

    Ruff, Douglas A; Alberts, Joshua J; Cohen, Marlene R

    2016-09-01

    Normalization, which divisively scales neuronal responses to multiple stimuli, is thought to underlie many sensory, motor, and cognitive processes. In every study where it has been investigated, neurons measured in the same brain area under identical conditions exhibit a range of normalization, ranging from suppression by nonpreferred stimuli (strong normalization) to additive responses to combinations of stimuli (no normalization). Normalization has been hypothesized to arise from interactions between neuronal populations, either in the same or different brain areas, but current models of normalization are not mechanistic and focus on trial-averaged responses. To gain insight into the mechanisms underlying normalization, we examined interactions between neurons that exhibit different degrees of normalization. We recorded from multiple neurons in three cortical areas while rhesus monkeys viewed superimposed drifting gratings. We found that neurons showing strong normalization shared less trial-to-trial variability with other neurons in the same cortical area and more variability with neurons in other cortical areas than did units with weak normalization. Furthermore, the cortical organization of normalization was not random: neurons recorded on nearby electrodes tended to exhibit similar amounts of normalization. Together, our results suggest that normalization reflects a neuron's role in its local network and that modulatory factors like normalization share the topographic organization typical of sensory tuning properties. Copyright © 2016 the American Physiological Society.

  8. Normal evaporation of binary alloys

    NASA Technical Reports Server (NTRS)

    Li, C. H.

    1972-01-01

    In the study of normal evaporation, it is assumed that the evaporating alloy is homogeneous, that the vapor is instantly removed, and that the alloy follows Raoult's law. The differential equation of normal evaporation relating the evaporating time to the final solute concentration is given and solved for several important special cases. Uses of the derived equations are exemplified with a Ni-Al alloy and some binary iron alloys. The accuracy of the predicted results is checked by analyses of actual experimental data on Fe-Ni and Ni-Cr alloys evaporated at 1600 C, and also on the vacuum purification of beryllium. These analyses suggest that the normal evaporation equations presented here give satisfactory results that are accurate to within an order of magnitude of the correct values, even for some highly concentrated solutions. Limited diffusion and the resultant surface solute depletion or enrichment appear important in the extension of this normal evaporation approach.
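    The stated assumptions (homogeneous melt, instant vapor removal, Raoult's law) can be explored numerically; the following is an illustrative Euler integration of a generic two-species evaporation model under those assumptions, with purely hypothetical rate constants, and is not the paper's differential equation or its closed-form solutions:

```python
def evaporate(x_solute, k_solute, k_solvent, dt=1e-3, steps=200):
    """Euler integration of 'normal evaporation' of a binary melt:
    homogeneous alloy, instant vapor removal, Raoult's law (each
    species evaporates in proportion to its mole fraction times a
    pure-component rate constant). Returns the final solute fraction."""
    n_solute, n_solvent = x_solute, 1.0 - x_solute   # moles, normalized
    for _ in range(steps):
        total = n_solute + n_solvent
        n_solute -= k_solute * (n_solute / total) * dt
        n_solvent -= k_solvent * (n_solvent / total) * dt
    return n_solute / (n_solute + n_solvent)
```

    As expected, a solute more volatile than the solvent is depleted, while a less volatile solute is enriched, which mirrors the surface depletion/enrichment effects the abstract mentions.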

  9. Fluid involvement in normal faulting

    NASA Astrophysics Data System (ADS)

    Sibson, Richard H.

    2000-04-01

    Evidence of fluid interaction with normal faults comes from their varied role as flow barriers or conduits in hydrocarbon basins and as hosting structures for hydrothermal mineralisation, and from fault-rock assemblages in exhumed footwalls of steep active normal faults and metamorphic core complexes. These last suggest involvement of predominantly aqueous fluids over a broad depth range, with implications for fault shear resistance and the mechanics of normal fault reactivation. A general downwards progression in fault rock assemblages (high-level breccia-gouge (often clay-rich) → cataclasites → phyllonites → mylonite → mylonitic gneiss with the onset of greenschist phyllonites occurring near the base of the seismogenic crust) is inferred for normal fault zones developed in quartzo-feldspathic continental crust. Fluid inclusion studies in hydrothermal veining from some footwall assemblages suggest a transition from hydrostatic to suprahydrostatic fluid pressures over the depth range 3-5 km, with some evidence for near-lithostatic to hydrostatic pressure cycling towards the base of the seismogenic zone in the phyllonitic assemblages. Development of fault-fracture meshes through mixed-mode brittle failure in rock-masses with strong competence layering is promoted by low effective stress in the absence of thoroughgoing cohesionless faults that are favourably oriented for reactivation. Meshes may develop around normal faults in the near-surface under hydrostatic fluid pressures to depths determined by rock tensile strength, and at greater depths in overpressured portions of normal fault zones and at stress heterogeneities, especially dilational jogs. Overpressures localised within developing normal fault zones also determine the extent to which they may reutilise existing discontinuities (for example, low-angle thrust faults). Brittle failure mode plots demonstrate that reactivation of existing low-angle faults under vertical σ1 trajectories is only likely if

  10. The Normal Fetal Pancreas.

    PubMed

    Kivilevitch, Zvi; Achiron, Reuven; Perlman, Sharon; Gilboa, Yinon

    2017-10-01

The aim of the study was to assess the sonographic feasibility of measuring the fetal pancreas and its normal development throughout pregnancy. We conducted a cross-sectional prospective study between 19 and 36 weeks' gestation. The study included singleton pregnancies with normal pregnancy follow-up. The pancreas circumference was measured. The first 90 cases were tested to assess feasibility. Two hundred ninety-seven fetuses of nondiabetic mothers were recruited during a 3-year period. The overall satisfactory visualization rate was 61.6%. The intraobserver and interobserver variability had high intraclass correlation coefficients of 0.964 and 0.967, respectively. A cubic polynomial regression best described the correlation of pancreas circumference with gestational age (r = 0.744; P < .001), with significant correlations also with abdominal circumference and estimated fetal weight (Pearson r = 0.829 and 0.812, respectively; P < .001). Modeled pancreas circumference percentiles for each week of gestation were calculated. During the study period, we detected 2 cases with overgrowth syndrome and 1 case with an annular pancreas. In this study, we assessed the feasibility of sonography for measuring the fetal pancreas and established a normal reference range for the fetal pancreas circumference throughout pregnancy. This database can be helpful when investigating fetomaternal disorders that can involve its normal development. © 2017 by the American Institute of Ultrasound in Medicine.

  11. Normal pressure hydrocephalus

    MedlinePlus

    Ferri FF. Normal pressure hydrocephalus. In: Ferri FF, ed. Ferri's Clinical Advisor 2016 . Philadelphia, PA: Elsevier; 2016:chap 648. Rosenberg GA. Brain edema and disorders of cerebrospinal fluid circulation. ...

  12. Inheritance of Properties of Normal and Non-Normal Distributions after Transformation of Scores to Ranks

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.

    2011-01-01

    This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…

  13. 3j Symbols: To Normalize or Not to Normalize?

    ERIC Educational Resources Information Center

    van Veenendaal, Michel

    2011-01-01

    The systematic use of alternative normalization constants for 3j symbols can lead to a more natural expression of quantities, such as vector products and spherical tensor operators. The redefined coupling constants directly equate tensor products to the inner and outer products without any additional square roots. The approach is extended to…

  14. Normal Aging and Linguistic Decrement.

    ERIC Educational Resources Information Center

    Emery, Olga B.

    A study investigated language patterning, as an indication of synthetic mental activity, in comparison groups of normal pre-middle-aged adults (30-42 years), normal elderly adults (75-93), and elderly adults (71-91) with Alzheimer's dementia. Semiotic theory was used as the conceptual context. Linguistic measures included the Token Test, the…

  15. Quantiles for Finite Mixtures of Normal Distributions

    ERIC Educational Resources Information Center

    Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.

    2006-01-01

    Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
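The distinction this abstract emphasizes can be made concrete with a short sketch (all parameter values illustrative): the CDF of a finite mixture of normal densities is the weighted sum of the component CDFs, so a quantile is found by numerically inverting that monotone CDF, whereas a linear combination of independent normal random variables is itself a single normal variable with a closed-form quantile.

```python
import math

def norm_cdf(x, mu, sigma):
    # Standard normal CDF expressed via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def mixture_cdf(x, weights, mus, sigmas):
    # CDF of a finite mixture is the weighted sum of component CDFs
    return sum(w * norm_cdf(x, m, s) for w, m, s in zip(weights, mus, sigmas))

def mixture_quantile(p, weights, mus, sigmas, lo=-50.0, hi=50.0, tol=1e-9):
    # Invert the mixture CDF by bisection (it is monotone increasing)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mixture_cdf(mid, weights, mus, sigmas) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# 50/50 mixture of N(0,1) and N(4,1): bimodal, with median 2 by symmetry.
w, mu, sd = [0.5, 0.5], [0.0, 4.0], [1.0, 1.0]
print(round(mixture_quantile(0.5, w, mu, sd), 4))  # → 2.0

# Contrast: the combination 0.5*X1 + 0.5*X2 of independent normals is a
# single unimodal normal variable, not the bimodal mixture above.
```

The bisection works because a mixture CDF is always monotone; no closed-form inverse exists in general.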

  16. Metabolic Cost, Mechanical Work, and Efficiency during Normal Walking in Obese and Normal-Weight Children

    ERIC Educational Resources Information Center

    Huang, Liang; Chen, Peijie; Zhuang, Jie; Zhang, Yanxin; Walt, Sharon

    2013-01-01

    Purpose: This study aimed to investigate the influence of childhood obesity on energetic cost during normal walking and to determine if obese children choose a walking strategy optimizing their gait pattern. Method: Sixteen obese children with no functional abnormalities were matched by age and gender with 16 normal-weight children. All…

  17. Normalizing Catastrophe: An Educational Response

    ERIC Educational Resources Information Center

    Jickling, Bob

    2013-01-01

    Processes of normalizing assumptions and values have been the subjects of theoretical framing and critique for several decades now. Critique has often been tied to issues of environmental sustainability and social justice. Now, in an era of global warming, there is a rising concern that the results of normalizing of present values could be…

  18. A Normalized Direct Approach for Estimating the Parameters of the Normal Ogive Three-Parameter Model for Ability Tests.

    ERIC Educational Resources Information Center

    Gugel, John F.

    A new method for estimating the parameters of the normal ogive three-parameter model for multiple-choice test items--the normalized direct (NDIR) procedure--is examined. The procedure is compared to a more commonly used estimation procedure, Lord's LOGIST, using computer simulations. The NDIR procedure uses the normalized (mid-percentile)…
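The normal ogive three-parameter item response function that both estimation procedures target can be written down directly. The sketch below shows the model itself with illustrative parameter values; it is not the NDIR estimation procedure:

```python
import math

def normal_ogive_3pl(theta, a, b, c):
    """Three-parameter normal ogive IRT model:
    P(correct | theta) = c + (1 - c) * Phi(a * (theta - b)),
    where a is discrimination, b difficulty, c the guessing floor."""
    phi = 0.5 * (1.0 + math.erf(a * (theta - b) / math.sqrt(2.0)))
    return c + (1.0 - c) * phi

# At theta == b the probability is c + (1 - c)/2, halfway between the
# guessing floor and 1 (illustrative values: a=1, b=0, c=0.2).
print(round(normal_ogive_3pl(0.0, 1.0, 0.0, 0.2), 4))  # → 0.6
```

Estimation procedures such as NDIR or LOGIST fit a, b, and c per item from observed response patterns.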

  19. Genomic Changes in Normal Breast Tissue in Women at Normal Risk or at High Risk for Breast Cancer

    PubMed Central

    Danforth, David N.

    2016-01-01

    Sporadic breast cancer develops through the accumulation of molecular abnormalities in normal breast tissue, resulting from exposure to estrogens and other carcinogens beginning at adolescence and continuing throughout life. These molecular changes may take a variety of forms, including numerical and structural chromosomal abnormalities, epigenetic changes, and gene expression alterations. To characterize these abnormalities, a review of the literature has been conducted to define the molecular changes in each of the above major genomic categories in normal breast tissue considered to be either at normal risk or at high risk for sporadic breast cancer. This review indicates that normal risk breast tissues (such as reduction mammoplasty) contain evidence of early breast carcinogenesis including loss of heterozygosity, DNA methylation of tumor suppressor and other genes, and telomere shortening. In normal tissues at high risk for breast cancer (such as normal breast tissue adjacent to breast cancer or the contralateral breast), these changes persist, and are increased and accompanied by aneuploidy, increased genomic instability, a wide range of gene expression differences, development of large cancerized fields, and increased proliferation. These changes are consistent with early and long-standing exposure to carcinogens, especially estrogens. A model for the breast carcinogenic pathway in normal risk and high-risk breast tissues is proposed. These findings should clarify our understanding of breast carcinogenesis in normal breast tissue and promote development of improved methods for risk assessment and breast cancer prevention in women. PMID:27559297

  20. Normalized cDNA libraries

    DOEpatents

    Soares, Marcelo B.; Efstratiadis, Argiris

    1997-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.

  1. Normalized cDNA libraries

    DOEpatents

    Soares, M.B.; Efstratiadis, A.

    1997-06-10

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3{prime} noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. 4 figs.

  2. Confirmed viral meningitis with normal CSF findings.

    PubMed

    Dawood, Naghum; Desjobert, Edouard; Lumley, Janine; Webster, Daniel; Jacobs, Michael

    2014-07-17

An 18-year-old woman presented with a progressively worsening headache, photophobia, feverishness and vomiting. Three weeks previously she had returned to the UK from a trip to Peru. At presentation, she had clinical signs of meningism. On admission, blood tests showed a mild lymphopenia, with a normal C reactive protein and white cell count. Chest X-ray and CT of the head were normal. Cerebrospinal fluid (CSF) microscopy was normal. CSF protein and glucose were in the normal range. MRI of the head and cerebral angiography were also normal. Subsequent molecular testing of CSF detected enterovirus RNA by reverse transcriptase PCR. The patient's clinical syndrome correlated with her virological diagnosis and no other cause of her symptoms was found. Her symptoms were self-limiting and improved with supportive management. This case illustrates an important example of viral central nervous system infection presenting clinically as meningitis but with normal CSF microscopy. © 2014 BMJ Publishing Group Ltd.

  3. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Normal water fluctuations. 230.24... Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem § 230.24 Normal water fluctuations. (a) Normal water fluctuations in a natural aquatic system consist of daily, seasonal, and annual...

  4. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Normal water fluctuations. 230.24... Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem § 230.24 Normal water fluctuations. (a) Normal water fluctuations in a natural aquatic system consist of daily, seasonal, and annual...

  5. What's normal? Influencing women's perceptions of normal genitalia: an experiment involving exposure to modified and nonmodified images.

    PubMed

    Moran, C; Lee, C

    2014-05-01

Examine women's perceptions of what is 'normal' and 'desirable' in female genital appearance. Experiment with random allocation across three conditions. Community. A total of 97 women aged 18-30 years. Women were randomly assigned to view a series of images of (1) surgically modified vulvas or (2) nonmodified vulvas, or (3) no images. They then viewed and rated ten target images of surgically modified vulvas and ten of unmodified vulvas. Women used a four-point Likert scale ('strongly agree' to 'strongly disagree') to rate each target image for 'looks normal' and 'represents society's ideal'. For each woman, we created two summary scores that represented the extent to which she rated the unmodified vulvas as more 'normal' and more 'society's ideal' than the modified vulvas. For ratings of 'normality', there was a significant effect for condition (F(2,94) = 2.75, P = 0.007, adjusted r² = 0.082): women who had first viewed the modified images rated the modified target vulvas as more normal than the nonmodified vulvas, significantly different from the control group, who rated them as less normal. For ratings of 'society's ideal', there was again a significant effect for condition (F(2,92) = 7.72, P < 0.001, adjusted r² = 0.125); all three groups rated modified target vulvas as more like society's ideal than the nonmodified target vulvas, with the effect significantly strongest for the women who had viewed the modified images. Exposure to images of modified vulvas may change women's perceptions of what is normal and desirable. This may explain why some healthy women seek labiaplasty. © 2013 Royal College of Obstetricians and Gynaecologists.

  6. Valuation of Normal Range of Ankle Systolic Blood Pressure in Subjects with Normal Arm Systolic Blood Pressure.

    PubMed

    Gong, Yi; Cao, Kai-wu; Xu, Jin-song; Li, Ju-xiang; Hong, Kui; Cheng, Xiao-shu; Su, Hai

    2015-01-01

This study aimed to establish a normal range for ankle systolic blood pressure (SBP). A total of 948 subjects who had normal brachial SBP (90-139 mmHg) at investigation were enrolled. Supine BP of all four limbs was measured simultaneously using four automatic BP measurement devices. The ankle-arm difference (An-a) in SBP on both sides was calculated. Two methods were used to establish the normal range of ankle SBP: the 99% method took the 99% reference range of the measured ankle SBP, and the An-a method added the An-a difference to the lower and upper limits of normal arm SBP (90-139 mmHg). On both sides, ankle SBP was significantly higher than arm SBP (right: 137.1 ± 16.9 vs 119.7 ± 11.4 mmHg, P < 0.05). Based on the 99% method, the normal range of ankle SBP was 94-181 mmHg for the total population, 84-166 mmHg for the young (18-44 y), 107-176 mmHg for the middle-aged (45-59 y), and 113-179 mmHg for the elderly (≥ 60 y) group. As the An-a difference in SBP was 13 mmHg in the young group and 20 mmHg in both the middle-aged and elderly groups, the normal range of ankle SBP by the An-a method was 103-153 mmHg for young and 110-160 mmHg for middle-aged and elderly subjects. A primary reference for normal ankle SBP of 100-165 mmHg in young and 110-170 mmHg in middle-aged and elderly subjects was suggested.
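Both range-setting methods are simple enough to sketch. The data below are synthetic (generated to roughly mimic the reported means), and `reference_range_99` and `an_a_range` are hypothetical helper names, not from the study:

```python
import random
import statistics

def reference_range_99(values):
    """99% method: 0.5th to 99.5th percentile of the observed ankle SBP."""
    s = sorted(values)
    n = len(s)
    return s[max(0, int(0.005 * n))], s[min(n - 1, int(0.995 * n))]

def an_a_range(ankle, arm, arm_lo=90, arm_hi=139):
    """An-a method: shift the normal arm SBP limits (90-139 mmHg)
    by the mean ankle-arm difference."""
    d = statistics.mean(a - b for a, b in zip(ankle, arm))
    return arm_lo + round(d), arm_hi + round(d)

# Synthetic cohort: arm SBP ~ N(120, 11), ankle about 17 mmHg higher,
# loosely matching the reported right-side means (137.1 vs 119.7 mmHg).
random.seed(0)
arm = [random.gauss(120, 11) for _ in range(948)]
ankle = [a + random.gauss(17, 8) for a in arm]
print(reference_range_99(ankle))
print(an_a_range(ankle, arm))
```

By construction the An-a range always spans the same 49 mmHg width as the normal arm range, merely shifted upward.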

  7. Bruno Braunerde und die Bodentypen - Learning about soil diversity and soil functions with cartoon characters

    NASA Astrophysics Data System (ADS)

    Hofmann, Anett

    2015-04-01

    "Bruno Braunerde und die Bodentypen" is a German-language learning material that fosters discovery of soil diversity and soil functions in kids, teens and adults who enjoy interactive learning activities. The learning material consists of (i) a large poster (dimensions 200 x 120 cm) showing an imaginative illustrated landscape that could be situated in Austria, Switzerland or southern Germany and (ii) a set of 15 magnetic cards that show different soil cartoon characters, e.g. Bruno Braunerde (Cambisol), Stauni Pseudogley (Stagnic Luvisol) or Heidi Podsol (Podzol) on the front and a fun profession and address (linked to the respective soil functions) on the back side. The task is to place the soil cartoon characters to their 'home' in the landscape. This learning material was developed as a contribution to the International Year of Soils 2015 and is supported by the German, Austrian and Swiss Soil Sciences Societies and the Swiss Federal Office for the Environment. The soil cartoon characters are an adaptation of the original concept by the James Hutton Institute, Aberdeen, Scotland (www.hutton.ac.uk/learning/dirt-doctor).

  8. Theoretical construction in physics - The role of Leibniz for Weyl's 'Philosophie der Mathematik und Naturwissenschaft'

    NASA Astrophysics Data System (ADS)

    Sieroka, Norman

    2018-02-01

    This paper aims at closing a gap in recent Weyl research by investigating the role played by Leibniz for the development and consolidation of Weyl's notion of theoretical (symbolic) construction. For Weyl, just as for Leibniz, mathematics was not simply an accompanying tool when doing physics-for him it meant the ability to engage in well-guided speculations about a general framework of reality and experience. The paper first introduces some of the background of Weyl's notion of theoretical construction and then discusses particular Leibnizian inheritances in Weyl's 'Philosophie der Mathematik und Naturwissenschaft', such as the general appreciation of the principles of sufficient reason and of continuity. Afterwards the paper focuses on three themes: first, Leibniz's primary quality phenomenalism, which according to Weyl marked the decisive step in realizing that physical qualities are never apprehended directly; second, the conceptual relation between continuity and freedom; and third, Leibniz's notion of 'expression', which allows for a certain type of (surrogative) reasoning by structural analogy and which gave rise to Weyl's optimism regarding the scope of theoretical construction.

  9. Strength of Gamma Rhythm Depends on Normalization

    PubMed Central

    Ray, Supratim; Ni, Amy M.; Maunsell, John H. R.

    2013-01-01

    Neuronal assemblies often exhibit stimulus-induced rhythmic activity in the gamma range (30–80 Hz), whose magnitude depends on the attentional load. This has led to the suggestion that gamma rhythms form dynamic communication channels across cortical areas processing the features of behaviorally relevant stimuli. Recently, attention has been linked to a normalization mechanism, in which the response of a neuron is suppressed (normalized) by the overall activity of a large pool of neighboring neurons. In this model, attention increases the excitatory drive received by the neuron, which in turn also increases the strength of normalization, thereby changing the balance of excitation and inhibition. Recent studies have shown that gamma power also depends on such excitatory–inhibitory interactions. Could modulation in gamma power during an attention task be a reflection of the changes in the underlying excitation–inhibition interactions? By manipulating the normalization strength independent of attentional load in macaque monkeys, we show that gamma power increases with increasing normalization, even when the attentional load is fixed. Further, manipulations of attention that increase normalization increase gamma power, even when they decrease the firing rate. Thus, gamma rhythms could be a reflection of changes in the relative strengths of excitation and normalization rather than playing a functional role in communication or control. PMID:23393427

  10. [Normal aging of frontal lobe functions].

    PubMed

    Calso, Cristina; Besnard, Jérémy; Allain, Philippe

    2016-03-01

Normal aging in individuals is often associated with morphological, metabolic and cognitive changes, which particularly concern the cerebral frontal regions. Starting from the "frontal lobe hypothesis of cognitive aging" (West, 1996), the present review is based on the neuroanatomical model developed by Stuss (2008), introducing four categories of frontal lobe functions: executive control, behavioural and emotional self-regulation and decision-making, energization, and meta-cognitive functions. The selected studies each address changes in at least one of these functions. The results suggest a deterioration of several frontal cognitive abilities in normal aging: flexibility, inhibition, planning, verbal fluency, implicit decision-making, and second-order and affective theory of mind. Normal aging also seems to be characterised by a general reduction in processing speed observed during neuropsychological assessment (Salthouse, 1996). Nevertheless, many cognitive functions remain preserved, such as automatic or non-conscious inhibition, specific capacities of flexibility, and first-order theory of mind. Normal aging therefore does not seem to be associated with a global cognitive decline but rather with selective change in some frontal systems, a conclusion that should be taken into account when designing care programs for normal aging.

  11. Normal metal - insulator - superconductor thermometers and coolers with titanium-gold bilayer as the normal metal

    NASA Astrophysics Data System (ADS)

    Räisänen, I. M. W.; Geng, Z.; Kinnunen, K. M.; Maasilta, I. J.

    2018-03-01

    We have fabricated superconductor - insulator - normal metal - insulator - superconductor (SINIS) tunnel junctions in which Al acts as the superconductor, AlOx is the insulator, and the normal metal consists of a thin Ti layer (5 nm) covered with a thicker Au layer (40 nm). We have characterized the junctions by measuring their current-voltage curves between 60 mK and 750 mK. For comparison, the same measurements have been performed for a SINIS junction pair whose normal metal is Cu. The Ti-Au bilayer decreases the SINIS tunneling resistance by an order of magnitude compared to junctions where Cu is used as normal metal, made with the same oxidation parameters. The Ti-Au devices are much more robust against chemical attacks, and their lower tunneling resistance makes them more robust against static charge. More significantly, they exhibit significantly stronger electron cooling than Cu devices with identical fabrication steps, when biased close to the energy gap of the superconducting Al. By using a self-consistent thermal model, we can fit the current-voltage characteristics well, and show an electron cooling from 200 mK to 110 mK, with a non-optimized device.

  12. a Recursive Approach to Compute Normal Forms

    NASA Astrophysics Data System (ADS)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  13. Neither Hematocrit Normalization nor Exercise Training Restores Oxygen Consumption to Normal Levels in Hemodialysis Patients

    PubMed Central

    Stray-Gundersen, James; Parsons, Dora Beth; Thompson, Jeffrey R.

    2016-01-01

    Patients treated with hemodialysis develop severely reduced functional capacity, which can be partially ameliorated by correcting anemia and through exercise training. In this study, we determined perturbations of an erythroid-stimulating agent and exercise training to examine if and where limitation to oxygen transport exists in patients on hemodialysis. Twenty-seven patients on hemodialysis completed a crossover study consisting of two exercise training phases at two hematocrit (Hct) values: 30% (anemic) and 42% (physiologic; normalized by treatment with erythroid-stimulating agent). To determine primary outcome measures of peak power and oxygen consumption (VO2) and secondary measures related to components of oxygen transport and utilization, all patients underwent numerous tests at five time points: baseline, untrained at Hct of 30%, after training at Hct of 30%, untrained at Hct of 42%, and after training at Hct of 42%. Hct normalization, exercise training, or the combination thereof significantly improved peak power and VO2 relative to values in the untrained anemic phase. Hct normalization increased peak arterial oxygen and arteriovenous oxygen difference, whereas exercise training improved cardiac output, citrate synthase activity, and peak tissue diffusing capacity. However, although the increase in arterial oxygen observed in the combination phase reached a value similar to that in healthy sedentary controls, the increase in peak arteriovenous oxygen difference did not. Muscle biopsy specimens showed markedly thickened endothelium and electron–dense interstitial deposits. In conclusion, exercise and Hct normalization had positive effects but failed to normalize exercise capacity in patients on hemodialysis. This effect may be caused by abnormalities identified within skeletal muscle. PMID:27153927

  14. Muscular hypertrophy and atrophy in normal rats provoked by the administration of normal and denervated muscle extracts.

    PubMed

    Agüera, Eduardo; Castilla, Salvador; Luque, Evelio; Jimena, Ignacio; Leiva-Cepas, Fernando; Ruz-Caracuel, Ignacio; Peña, José

    2016-12-01

This study was conducted to determine the effects of extracts obtained from both normal and denervated muscles on different muscle types. Wistar rats were divided into a control group and four experimental groups. Each experimental group was treated intraperitoneally for 10 consecutive days with a different extract, obtained from normal soleus muscle, denervated soleus, normal extensor digitorum longus, or denervated extensor digitorum longus. Following treatment, the soleus and extensor digitorum longus muscles were obtained for study under light and transmission electron microscopy; morphometric parameters and myogenic responses were also analyzed. The results demonstrated that treatment with normal soleus muscle and denervated soleus muscle extracts provoked hypertrophy and increased myogenic activity. In contrast, treatment with extracts from the normal and denervated EDL had a different effect depending on the muscle analyzed: in the soleus muscle it provoked hypertrophy of type I fibers and increased myogenic activity, while in the extensor digitorum longus atrophy of the type II fibers was observed without changes in myogenic activity. This suggests that the muscular responses of atrophy and hypertrophy may depend on factors specific to the muscle type, which could be related to its innervation.

  15. Normalization of satellite imagery

    NASA Technical Reports Server (NTRS)

    Kim, Hongsuk H.; Elman, Gregory C.

    1990-01-01

    Sets of Thematic Mapper (TM) imagery taken over the Washington, DC metropolitan area during the months of November, March and May were converted into a form of ground reflectance imagery. This conversion was accomplished by adjusting the incident sunlight and view angles and by applying a pixel-by-pixel correction for atmospheric effects. Seasonal color changes of the area can be better observed when such normalization is applied to space imagery taken in time series. In normalized imagery, the grey scale depicts variations in surface reflectance and tonal signature of multi-band color imagery can be directly interpreted for quantitative information of the target.
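The radiometric normalization described, adjusting for solar geometry and applying a per-pixel atmospheric term, can be sketched as a radiance-to-reflectance conversion. The dark-object path-radiance subtraction below is a crude stand-in for the paper's atmospheric correction, and all numbers are illustrative:

```python
import math

def toa_reflectance(radiance, esun, sun_elev_deg, d_au=1.0, path_radiance=0.0):
    """rho = pi * (L - Lp) * d^2 / (E0 * cos(solar zenith)).
    Simple sun-angle normalization with dark-object path-radiance removal."""
    theta = math.radians(90.0 - sun_elev_deg)  # solar zenith angle
    return math.pi * (radiance - path_radiance) * d_au ** 2 / (esun * math.cos(theta))

# A band with exo-atmospheric irradiance 1000 (illustrative units),
# sun at 30 degrees elevation, estimated path radiance 10:
band = [[80.0, 90.0], [100.0, 110.0]]  # at-sensor radiance image
normalized = [[toa_reflectance(v, 1000.0, 30.0, path_radiance=10.0) for v in row]
              for row in band]
```

After this conversion, grey levels in scenes acquired in different seasons express surface reflectance rather than illumination, so a time series becomes directly comparable.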

  16. Evaluation of CT-based SUV normalization

    NASA Astrophysics Data System (ADS)

    Devriese, Joke; Beels, Laurence; Maes, Alex; Van de Wiele, Christophe; Pottel, Hans

    2016-09-01

The purpose of this study was to determine patients' lean body mass (LBM) and lean tissue (LT) mass using a computed tomography (CT)-based method, and to compare standardized uptake value (SUV) normalized by these parameters to conventionally normalized SUVs. Head-to-toe positron emission tomography (PET)/CT examinations were retrospectively retrieved and semi-automatically segmented into tissue types based on thresholding of CT Hounsfield units (HU). The following HU ranges were used for determination of CT-estimated LBM and LT (LBM_CT and LT_CT): -180 to -7 for adipose tissue (AT), -6 to 142 for LT, and 143 to 3010 for bone tissue (BT). Formula-estimated LBMs were calculated using the formulas of James (1976 Research on Obesity: a Report of the DHSS/MRC Group (London: HMSO)) and Janmahasatian et al (2005 Clin. Pharmacokinet. 44 1051-65), and body surface area (BSA) was calculated using the DuBois formula (Dubois and Dubois 1989 Nutrition 5 303-11). The CT segmentation method was validated by comparing total patient body weight (BW) to CT-estimated BW (BW_CT). LBM_CT was compared to the formula-based estimates (LBM_James and LBM_Janma). SUVs in two healthy reference tissues, liver and mediastinum, were normalized for the aforementioned parameters and compared to each other in terms of variability and dependence on normalization factors and BW. Comparison of actual BW to BW_CT shows a non-significant difference of 0.8 kg. LBM_James estimates are significantly higher than LBM_Janma, with differences of 4.7 kg for female and 1.0 kg for male patients. Formula-based LBM estimates do not differ significantly from LBM_CT, neither for men nor for women. The coefficient of variation (CV) of SUV normalized for LBM_James (SUV_LBM-James) (12.3%) was significantly reduced in liver compared to SUV_BW (15.4%). All SUV variances in mediastinum were significantly reduced (CVs were 11.1-12.2%) compared to SUV_BW (15.5%), except SUV_BSA (15.2%). Only SUV_BW and SUV_LBM-James show
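The CT-based segmentation idea is straightforward to sketch: voxels are binned by the HU windows given in the abstract and converted to masses, and the lean compartments supply the SUV normalizer. The nominal tissue densities and helper names below are assumptions for illustration, not values from the study:

```python
# HU windows from the abstract: AT -180..-7, LT -6..142, BT 143..3010
DENSITY = {"AT": 0.95, "LT": 1.05, "BT": 1.92}  # g/cm^3, nominal (assumption)

def segment_masses(hu_values, voxel_vol_cm3):
    """Classify voxels by HU window and return per-tissue mass in kg."""
    counts = {"AT": 0, "LT": 0, "BT": 0}
    for hu in hu_values:
        if -180 <= hu <= -7:
            counts["AT"] += 1
        elif -6 <= hu <= 142:
            counts["LT"] += 1
        elif 143 <= hu <= 3010:
            counts["BT"] += 1
    return {t: counts[t] * voxel_vol_cm3 * DENSITY[t] / 1000.0 for t in counts}

def suv(act_kbq_per_ml, injected_mbq, normalizer_kg):
    """SUV = tissue activity concentration / (injected dose / normalizing mass)."""
    return act_kbq_per_ml / (injected_mbq * 1000.0 / (normalizer_kg * 1000.0))

# Toy voxel list: two adipose, three lean-tissue, one bone voxel.
masses = segment_masses([-100, -50, 30, 60, 100, 500], voxel_vol_cm3=1.0)
lbm_ct = masses["LT"] + masses["BT"]  # lean body mass excludes adipose tissue
```

Swapping `normalizer_kg` between BW, LBM, or BSA-derived mass is what produces the competing SUV variants compared in the study.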

  17. Normal Psychosexual Development

    ERIC Educational Resources Information Center

    Rutter, Michael

    1971-01-01

    Normal sexual development is reviewed with respect to physical maturation, sexual interests, sex drive", psychosexual competence and maturity, gender role, object choice, children's concepts of sexual differences, sex role preference and standards, and psychosexual stages. Biologic, psychoanalytic and psychosocial theories are briefly considered.…

  18. Visual attention and flexible normalization pools

    PubMed Central

    Schwartz, Odelia; Coen-Cagli, Ruben

    2013-01-01

    Attention to a spatial location or feature in a visual scene can modulate the responses of cortical neurons and affect perceptual biases in illusions. We add attention to a cortical model of spatial context based on a well-founded account of natural scene statistics. The cortical model amounts to a generalized form of divisive normalization, in which the surround is in the normalization pool of the center target only if they are considered statistically dependent. Here we propose that attention influences this computation by accentuating the neural unit activations at the attended location, and that the amount of attentional influence of the surround on the center thus depends on whether center and surround are deemed in the same normalization pool. The resulting form of model extends a recent divisive normalization model of attention (Reynolds & Heeger, 2009). We simulate cortical surround orientation experiments with attention and show that the flexible model is suitable for capturing additional data and makes nontrivial testable predictions. PMID:23345413
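The mechanism described, attention entering a divisive normalization computation, reduces to a minimal sketch (a toy one-layer version with made-up numbers, not the authors' full model): each unit's attentionally weighted drive is divided by a constant plus the pooled weighted drive of its neighbors.

```python
def normalized_response(drives, attn_gains, sigma=1.0):
    """Divisive normalization with attentional gain: attention scales the
    excitatory drive, which also strengthens the shared normalization pool."""
    attended = [d * g for d, g in zip(drives, attn_gains)]
    pool = sum(attended)
    return [a / (sigma + pool) for a in attended]

# Unattended baseline vs. attending to unit 0 (gain 2 at that unit):
base = normalized_response([10.0, 10.0], [1.0, 1.0])
attn = normalized_response([10.0, 10.0], [2.0, 1.0])
# Attention boosts the attended unit's response, but the stronger pool
# suppresses the unattended unit, shifting the excitation-inhibition balance.
```

This captures the abstract's point: changing the normalization strength alone shifts the excitation-inhibition balance even when the stimulus is fixed.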

  19. A normalization strategy for comparing tag count data

    PubMed Central

    2012-01-01

    Background High-throughput sequencing, such as ribonucleic acid sequencing (RNA-seq) and chromatin immunoprecipitation sequencing (ChIP-seq) analyses, enables various features of organisms to be compared through tag counts. Recent studies have demonstrated that the normalization step for RNA-seq data is critical for a more accurate subsequent analysis of differential gene expression. Development of a more robust normalization method is desirable for identifying the true difference in tag count data. Results We describe a strategy for normalizing tag count data, focusing on RNA-seq. The key concept is to remove data assigned as potential differentially expressed genes (DEGs) before calculating the normalization factor. Several R packages for identifying DEGs are currently available, and each package uses its own normalization method and gene ranking algorithm. We compared a total of eight package combinations: four R packages (edgeR, DESeq, baySeq, and NBPSeq) with their default normalization settings and with our normalization strategy. Many synthetic datasets under various scenarios were evaluated on the basis of the area under the curve (AUC) as a measure for both sensitivity and specificity. We found that packages using our strategy in the data normalization step overall performed well. This result was also observed for a real experimental dataset. Conclusion Our results showed that the elimination of potential DEGs is essential for more accurate normalization of RNA-seq data. The concept of this normalization strategy can widely be applied to other types of tag count data and to microarray data. PMID:22475125
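The key concept above, excluding potential DEGs before computing scaling factors, can be sketched with a toy total-count normalizer (not the edgeR/DESeq implementations; in practice the DEG flags would come from a first-pass ranking step):

```python
def norm_factors(counts, deg_flags):
    """Per-sample scaling factors from the total counts of genes NOT flagged
    as potential DEGs. counts: list of per-gene rows, one count per sample."""
    n_samples = len(counts[0])
    kept_totals = [0] * n_samples
    for row, is_deg in zip(counts, deg_flags):
        if not is_deg:  # drop putative DEGs before computing factors
            for j, c in enumerate(row):
                kept_totals[j] += c
    mean_total = sum(kept_totals) / n_samples
    return [t / mean_total for t in kept_totals]

# Gene 0 is a huge DEG in sample 1; excluding it keeps both factors near 1,
# whereas naive total-count normalization would be badly skewed by it.
counts = [[5, 500], [100, 100], [200, 210], [50, 45]]
flags = [True, False, False, False]
print(norm_factors(counts, flags))
```

The same idea transfers to other tag-count data: a few strongly differential features should not be allowed to dominate the scaling estimate.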

  20. Parental Perceptions of the Outcome and Meaning of Normalization

    PubMed Central

    Knafl, Kathleen A.; Darney, Blair G.; Gallo, Agatha M.; Angst, Denise B.

    2010-01-01

    The purpose of this secondary analysis was to identify the meaning of normalization for parents of a child with a chronic genetic condition. The sample was comprised of 28 families (48 parents), selected to reflect two groups: Normalization Present (NP) and Normalization Absent (NA). Constant comparison analysis was used to identify themes characterizing parents' perceptions of the meaning of normalization. The meanings parents attributed to normalization reflected their evaluation of condition management, parenting role, and condition impact, with parents in the NP and NA groups demonstrating distinct patterns of meaning. These meaning patterns are discussed as an outcome of normalization. Providers can play a pivotal role in helping families achieve normalization by providing guidance on how to balance condition management with normal family life. PMID:20108258

  1. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  2. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  3. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  4. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  5. A normality bias in legal decision making.

    PubMed

    Prentice, Robert A; Koehler, Jonathan J

    2003-03-01

    It is important to understand how legal fact finders determine causation and assign blame. However, this process is poorly understood. Among the psychological factors that affect decision makers are an omission bias (a tendency to blame actions more than inactions [omissions] for bad results), and a normality bias (a tendency to react more strongly to bad outcomes that spring from abnormal rather than normal circumstances). The omission and normality biases often reinforce one another when inaction preserves the normal state and when action creates an abnormal state. But what happens when these biases push in opposite directions as they would when inaction promotes an abnormal state or when action promotes a normal state? Which bias exerts the stronger influence on the judgments and behaviors of legal decision makers? The authors address this issue in two controlled experiments. One experiment involves medical malpractice and the other involves stockbroker negligence. They find that jurors pay much more attention to the normality of conditions than to whether those conditions arose through acts or omissions. Defendants who followed a nontraditional medical treatment regime or who chose a nontraditional stock portfolio received more blame and more punishment for bad outcomes than did defendants who obtained equally poor results after recommending a traditional medical regime or a traditional stock portfolio. Whether these recommendations entailed an action or an omission was essentially irrelevant. The Article concludes with a discussion of the implications of a robust normality bias for American jurisprudence.

  6. Normal peer models and autistic children's learning.

    PubMed Central

    Egel, A L; Richman, G S; Koegel, R L

    1981-01-01

    Present research and legislation regarding mainstreaming autistic children into normal classrooms have raised the importance of studying whether autistic children can benefit from observing normal peer models. The present investigation systematically assessed whether autistic children's learning of discrimination tasks could be improved if they observed normal children perform the tasks correctly. In the context of a multiple baseline design, four autistic children worked on five discrimination tasks that their teachers reported were posing difficulty. Throughout the baseline condition the children evidenced very low levels of correct responding on all five tasks. In the subsequent treatment condition, when normal peers modeled correct responses, the autistic children's correct responding increased dramatically. In each case, the peer modeling procedure produced rapid achievement of the acquisition which was maintained after the peer models were removed. These results are discussed in relation to issues concerning observational learning and in relation to the implications for mainstreaming autistic children into normal classrooms. PMID:7216930

  7. CNN-based ranking for biomedical entity normalization.

    PubMed

    Li, Haodi; Chen, Qingcai; Tang, Buzhou; Wang, Xiaolong; Xu, Hua; Wang, Baohua; Huang, Dong

    2017-10-03

    Most state-of-the-art biomedical entity normalization systems, such as rule-based systems, merely rely on morphological information of entity mentions, but rarely consider their semantic information. In this paper, we introduce a novel convolutional neural network (CNN) architecture that regards biomedical entity normalization as a ranking problem and benefits from semantic information of biomedical entities. The CNN-based ranking method first generates candidates using handcrafted rules, and then ranks the candidates according to their semantic information modeled by the CNN as well as their morphological information. Experiments on two benchmark datasets for biomedical entity normalization show that our proposed CNN-based ranking method outperforms the traditional rule-based method, achieving state-of-the-art performance. We propose a CNN architecture that regards biomedical entity normalization as a ranking problem. Comparison results show that semantic information is beneficial to biomedical entity normalization and can be well combined with morphological information in our CNN architecture for further improvement.

  8. Normal stresses in shear thickening granular suspensions.

    PubMed

    Pan, Zhongcheng; de Cagny, Henri; Habibi, Mehdi; Bonn, Daniel

    2017-05-24

    When subjected to shear, granular suspensions exhibit normal stresses perpendicular to the shear plane but the magnitude and sign of the different components of the normal stresses are still under debate. By performing both oscillatory and rotational rheology measurements on shear thickening granular suspensions and systematically varying the particle diameters and the gap sizes between two parallel-plates, we show that a transition from a positive to a negative normal stress can be observed. We find that frictional interactions which determine the shear thickening behavior of suspensions contribute to the positive normal stresses. Increasing the particle diameters or decreasing the gap sizes leads to a growing importance of hydrodynamic interactions, which results in negative normal stresses. We determine a relaxation time for the system, set by both the pore and the gap sizes, that governs the fluid flow through the inter-particle space. Finally, using a two-fluid model we determine the relative contributions from the particle phase and the liquid phase.

  9. Statistical normalization techniques for magnetic resonance imaging.

    PubMed

    Shinohara, Russell T; Sweeney, Elizabeth M; Goldsmith, Jeff; Shiee, Navid; Mateen, Farrah J; Calabresi, Peter A; Jarso, Samson; Pham, Dzung L; Reich, Daniel S; Crainiceanu, Ciprian M

    2014-01-01

    While computed tomography and other imaging techniques are measured in absolute units with physical meaning, magnetic resonance images are expressed in arbitrary units that are difficult to interpret and differ between study visits and subjects. Much work in the image processing literature on intensity normalization has focused on histogram matching and other histogram mapping techniques, with little emphasis on normalizing images to have biologically interpretable units. Furthermore, there are no formalized principles or goals for the crucial comparability of image intensities within and across subjects. To address this, we propose a set of criteria necessary for the normalization of images. We further propose simple and robust biologically motivated normalization techniques for multisequence brain imaging that have the same interpretation across acquisitions and satisfy the proposed criteria. We compare the performance of different normalization methods in thousands of images of patients with Alzheimer's disease, hundreds of patients with multiple sclerosis, and hundreds of healthy subjects obtained in several different studies at dozens of imaging centers.
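A minimal example of statistical intensity normalization is simple z-scoring over a brain mask, which gives voxel intensities a common interpretation across scans (this is a sketch of the general idea, not the authors' specific techniques):

```python
import numpy as np

def zscore_normalize(image, mask):
    """Rescale an MR image so intensities within the brain mask have
    mean 0 and standard deviation 1, making arbitrary scanner units
    comparable across study visits and subjects."""
    vals = image[mask]
    return (image - vals.mean()) / vals.std()
```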

  10. Is My Penis Normal? (For Teens)

    MedlinePlus

    ... worried about whether his penis is a normal size. There's a fairly wide range of normal penis sizes — just as there is for every other body part. And just like other parts of the body, how a penis appears at different stages of a guy's life varies quite a ...

  11. A Skew-Normal Mixture Regression Model

    ERIC Educational Resources Information Center

    Liu, Min; Lin, Tsung-I

    2014-01-01

    A challenge associated with traditional mixture regression models (MRMs), which rest on the assumption of normally distributed errors, is determining the number of unobserved groups. Specifically, even slight deviations from normality can lead to the detection of spurious classes. The current work aims to (a) examine how sensitive the commonly…

  12. Correcting the Normalized Gain for Guessing

    ERIC Educational Resources Information Center

    Stewart, John; Stewart, Gay

    2010-01-01

    The normalized gain, "g", has been an important tool for the characterization of conceptual improvement in physics courses since its use in Hake's extensive study on conceptual learning in introductory physics. The normalized gain is calculated from the score on a pre-test administered before instruction and a post-test administered…
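Hake's normalized gain itself is simple to compute: the raw gain divided by the maximum gain available given the pre-test score.

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain g: the fraction of the available
    improvement (100 - pre) actually realized on the post-test.
    Scores are percentages in [0, 100)."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# A class averaging 40% on the pre-test and 70% on the post-test
# realized half of the possible improvement:
g = normalized_gain(40.0, 70.0)  # → 0.5
```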

  13. Joint Ordnance Test Procedure (JOTP)-010 Safety and Suitability for Service Assessment Testing for Shoulder Launched Munitions

    DTIC Science & Technology

    2013-01-08

    Referenced documents include Richtlinien zur Registrierung und Auswertung von Waffen- und Detonationsknallen (guidelines for the recording and evaluation of weapon and detonation noise), ARO Report 75-2 (Motor Case Burst Probability), SMC-S-001, Def Stan ..., the French directives relatives à l'exploitation des bruits d'armes et des bruits de détonation (directives on the processing of weapon and detonation noise), and STANAG 4569.

  14. Safety and Suitability for Service Assessment Testing for Shoulder Launched Munitions

    DTIC Science & Technology

    2014-11-07

    Referenced documents include Vorschriften und Richtlinien zur Registrierung und Auswertung von Waffen- und Detonationsknallen (regulations and guidelines for the recording and evaluation of weapon and detonation noise), ARO Report 75-2 (Motor Case Burst Probability), SMC-S-001, Def Stan 07-85, the French directives relatives à l'exploitation des bruits d'armes et des bruits de détonation (directives on the processing of weapon and detonation noise), and STANAG 4569.

  15. Normalized Temperature Contrast Processing in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods for computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, along with methods for converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data-acquisition setup with a high-reflectivity foil and high-emissivity tape such that the foil, tape, and test object are imaged simultaneously. Methods of assessing other quantitative parameters, such as the emissivity of the object, afterglow heat flux, reflection temperature change, and surface temperature during flash thermography, are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.

  16. Evaluation of normalization methods in mammalian microRNA-Seq data

    PubMed Central

    Garmire, Lana Xia; Subramaniam, Shankar

    2012-01-01

    Simple total tag count normalization is inadequate for microRNA sequencing data generated by next-generation sequencing technology. However, a systematic evaluation of normalization methods on microRNA sequencing data has so far been lacking. We comprehensively evaluate seven commonly used normalization methods: global normalization, Lowess normalization, the Trimmed Mean Method (TMM), quantile normalization, scaling normalization, variance stabilization, and the invariant method. We assess these methods on two individual experimental data sets with the empirical statistical metrics of mean square error (MSE) and the Kolmogorov-Smirnov (K-S) statistic. Additionally, we evaluate the methods with results from quantitative PCR validation. Our results consistently show that Lowess normalization and quantile normalization perform the best, whereas TMM, a method developed for RNA-Seq normalization, performs the worst. The poor performance of TMM normalization is further evidenced by abnormal results from the test of differential expression (DE) of microRNA-Seq data. Compared with the choice of model used for DE testing, the choice of normalization method is the primary factor affecting the DE results. In summary, Lowess normalization and quantile normalization are recommended for normalizing microRNA-Seq data, whereas the TMM method should be used with caution. PMID:22532701
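Of the best-performing methods above, quantile normalization is easy to sketch: every sample is forced to share the same empirical distribution, namely the mean of the sorted columns (a minimal version that ignores tie handling):

```python
import numpy as np

def quantile_normalize(x):
    """Quantile-normalize a features x samples matrix so each column
    has the same distribution (the mean sorted profile)."""
    # Rank of each entry within its column (ties broken arbitrarily)
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)
    # Reference distribution: average of the sorted columns
    reference = np.sort(x, axis=0).mean(axis=1)
    return reference[ranks]
```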

  17. Peter Andreas Hansen and the astronomical community - a first investigation of the Hansen papers. (German Title: Peter Andreas Hansen und die astronomische Gemeinschaft - eine erste Auswertung des Hansen-Nachlasses. )

    NASA Astrophysics Data System (ADS)

    Schwarz, Oliver; Strumpf, Manfred

    The literary estate of Peter Andreas Hansen is deposited in the Staatsarchiv Hamburg, the Forschungs- und Landesbibliothek Gotha and the Thüringer Staatsarchiv Gotha. It has never been systematically investigated. We present here some results of a first evaluation. It was possible to reconstruct the historical events surrounding the maintenance of the Astronomische Nachrichten and the Altona observatory in 1854. Hansen was a successful teacher of many young astronomers. His way of stimulating the evolution of astronomy followed Zach's tradition.

  18. Economic values under inappropriate normal distribution assumptions.

    PubMed

    Sadeghi-Sefidmazgi, A; Nejati-Javaremi, A; Moradi-Shahrbabak, M; Miraei-Ashtiani, S R; Amer, P R

    2012-08-01

    The objectives of this study were to quantify the errors in economic values (EVs) for traits affected by cost or price thresholds when skewed or kurtotic distributions of varying degree are assumed to be normal and when data with a normal distribution is subject to censoring. EVs were estimated for a continuous trait with dichotomous economic implications because of a price premium or penalty arising from a threshold ranging between -4 and 4 standard deviations from the mean. In order to evaluate the impacts of skewness, positive and negative excess kurtosis, standard skew normal, Pearson and the raised cosine distributions were used, respectively. For the various evaluable levels of skewness and kurtosis, the results showed that EVs can be underestimated or overestimated by more than 100% when price determining thresholds fall within a range from the mean that might be expected in practice. Estimates of EVs were very sensitive to censoring or missing data. In contrast to practical genetic evaluation, economic evaluation is very sensitive to lack of normality and missing data. Although in some special situations, the presence of multiple thresholds may attenuate the combined effect of errors at each threshold point, in practical situations there is a tendency for a few key thresholds to dominate the EV, and there are many situations where errors could be compounded across multiple thresholds. In the development of breeding objectives for non-normal continuous traits influenced by value thresholds, it is necessary to select a transformation that will resolve problems of non-normality or consider alternative methods that are less sensitive to non-normality.

  19. Univariate normalization of bispectrum using Hölder's inequality.

    PubMed

    Shahbazi, Forooz; Ewald, Arne; Nolte, Guido

    2014-08-15

    Considering that many biological systems including the brain are complex non-linear systems, suitable methods capable of detecting these non-linearities are required to study the dynamical properties of these systems. One of these tools is the third-order cumulant or cross-bispectrum, which is a measure of interfrequency interactions between three signals. For convenient interpretation, interaction measures are most commonly normalized to be independent of constant scales of the signals such that their absolute values are bounded by one, with this limit reflecting perfect coupling. Although many different normalization factors for cross-bispectra were suggested in the literature, these either do not lead to bounded measures or are themselves dependent on the coupling and not only on the scale of the signals. In this paper we suggest a normalization factor which is univariate, i.e., dependent only on the amplitude of each signal and not on the interactions between signals. Using a generalization of Hölder's inequality it is proven that the absolute value of this univariate bicoherence is bounded between zero and one. We compared three widely used normalizations to the univariate normalization concerning the significance of bicoherence values gained from resampling tests. Bicoherence values are calculated from real EEG data recorded in an eyes-closed experiment from 10 subjects. The results show slightly more significant values for the univariate normalization but in general, the differences are very small or even vanishing in some subjects. Therefore, we conclude that the normalization factor does not play an important role in the bicoherence values with regard to statistical power, although a univariate normalization is the only normalization factor which fulfills all the required conditions of a proper normalization. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Score Normalization for Keyword Search

    DTIC Science & Technology

    2016-06-23

    Score Normalization for Keyword Search (Turkish title: Anahtar Sözcük Arama için Skor Düzgeleme). Leda Sarı, Murat Saraçlar, Department of Electrical and Electronics Engineering (Elektrik ve Elektronik Mühendisliği Bölümü)... Abstract—In this work, keyword search (KWS) is based on a symbolic index that uses a posteriorgram representation of the speech data. For each query, sum-to-one normalization or keyword-specific thresholding is applied to the search results. The effect of these methods on the proposed

  1. Normal forms of Hopf-zero singularity

    NASA Astrophysics Data System (ADS)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative-nonconservative decomposition for the normal form systems. There exists a Lie-subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov-Takens singularity. This gives rise to a conclusion that the local dynamics of formal Hopf-zero singularities is well-understood by the study of Bogdanov-Takens singularities. Despite this, the normal form computations of Bogdanov-Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative-nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied on the Rössler and Kuramoto-Sivashinsky equations to demonstrate the applicability of our results.

  2. Normal stresses in semiflexible polymer hydrogels

    NASA Astrophysics Data System (ADS)

    Vahabi, M.; Vos, Bart E.; de Cagny, Henri C. G.; Bonn, Daniel; Koenderink, Gijsje H.; MacKintosh, F. C.

    2018-03-01

    Biopolymer gels such as fibrin and collagen networks are known to develop tensile axial stress when subject to torsion. This negative normal stress is opposite to the classical Poynting effect observed for most elastic solids including synthetic polymer gels, where torsion provokes a positive normal stress. As shown recently, this anomalous behavior in fibrin gels depends on the open, porous network structure of biopolymer gels, which facilitates interstitial fluid flow during shear and can be described by a phenomenological two-fluid model with viscous coupling between network and solvent. Here we extend this model and develop a microscopic model for the individual diagonal components of the stress tensor that determine the axial response of semiflexible polymer hydrogels. This microscopic model predicts that the magnitude of these stress components depends inversely on the characteristic strain for the onset of nonlinear shear stress, which we confirm experimentally by shear rheometry on fibrin gels. Moreover, our model predicts a transient behavior of the normal stress, which is in excellent agreement with the full time-dependent normal stress we measure.

  3. Forced Normalization: Antagonism Between Epilepsy and Psychosis.

    PubMed

    Kawakami, Yasuhiko; Itoh, Yasuhiko

    2017-05-01

    The antagonism between epilepsy and psychosis has been discussed for a long time. Landolt coined the term "forced normalization" in the 1950s to describe psychotic episodes associated with the remission of seizures and disappearance of epileptiform activity on electroencephalograms in individuals with epilepsy. Since then, neurologists and psychiatrists have been intrigued by this phenomenon. However, although collaborative clinical studies and basic experimental researches have been performed, the mechanism of forced normalization remains unknown. In this review article, we present a historical overview of the concept of forced normalization, and discuss potential pathogenic mechanisms and clinical diagnosis. We also discuss the role of dopamine, which appears to be a key factor in the mechanism of forced normalization. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Cultured normal mammalian tissue and process

    NASA Technical Reports Server (NTRS)

    Goodwin, Thomas J. (Inventor); Prewett, Tacey L. (Inventor); Wolf, David A. (Inventor); Spaulding, Glenn F. (Inventor)

    1993-01-01

    Normal mammalian tissue and the culturing process have been developed for the three groups of organ, structural and blood tissue. The cells are grown in vitro under microgravity culture conditions and form three-dimensional cell aggregates with normal cell function. The microgravity culture conditions may be actual microgravity or simulated microgravity created in a horizontal rotating-wall culture vessel.

  5. Quaternion normalization in spacecraft attitude determination

    NASA Technical Reports Server (NTRS)

    Deutschmann, J.; Markley, F. L.; Bar-Itzhack, Itzhack Y.

    1993-01-01

    Attitude determination of spacecraft usually utilizes vector measurements such as Sun, center of Earth, star, and magnetic field direction to update the quaternion which determines the spacecraft orientation with respect to some reference coordinates in the three dimensional space. These measurements are usually processed by an extended Kalman filter (EKF) which yields an estimate of the attitude quaternion. Two EKF versions for quaternion estimation were presented in the literature; namely, the multiplicative EKF (MEKF) and the additive EKF (AEKF). In the multiplicative EKF, it is assumed that the error between the correct quaternion and its a-priori estimate is, by itself, a quaternion that represents the rotation necessary to bring the attitude which corresponds to the a-priori estimate of the quaternion into coincidence with the correct attitude. The EKF basically estimates this quotient quaternion and then the updated quaternion estimate is obtained by the product of the a-priori quaternion estimate and the estimate of the difference quaternion. In the additive EKF, it is assumed that the error between the a-priori quaternion estimate and the correct one is an algebraic difference between two four-tuple elements and thus the EKF is set to estimate this difference. The updated quaternion is then computed by adding the estimate of the difference to the a-priori quaternion estimate. If the quaternion estimate converges to the correct quaternion, then, naturally, the quaternion estimate has unity norm. This fact was utilized in the past to obtain superior filter performance by applying normalization to the filter measurement update of the quaternion. It was observed for the AEKF that when the attitude changed very slowly between measurements, normalization merely resulted in a faster convergence; however, when the attitude changed considerably between measurements, without filter tuning or normalization, the quaternion estimate diverged. However, when the
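The normalization referred to, projecting the additively updated four-tuple back onto the unit sphere before it is used as an attitude, is a one-line operation (an illustrative sketch, not the flight filter code):

```python
import numpy as np

def normalize_quaternion(q):
    """Brute-force normalization applied after an AEKF measurement
    update: the algebraic (additive) update can leave the quaternion
    estimate off the unit sphere, so it is rescaled to unit norm before
    being interpreted as a rotation."""
    return q / np.linalg.norm(q)
```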

  6. Normalization vs. Social Role Valorization: Similar or Different?

    ERIC Educational Resources Information Center

    Kumar, Akhilesh; Singh, Rajani Ranjan; Thressiakutty, A. T.

    2015-01-01

    The radical changes in services for persons with disabilities were brought about by the Principle of Normalization, which originated in 1969. As a consequence of Normalization, disability as a whole, and intellectual disability in particular, received the attention of the masses, and the intelligentsia began advocating normalization ideologies which became…

  7. Flow derivatives and curvatures for a normal shock

    NASA Astrophysics Data System (ADS)

    Emanuel, G.

    2018-03-01

    A detached bow shock wave is strongest where it is normal to the upstream velocity. While the jump conditions across the shock are straightforward, many properties, such as the shock's curvatures and derivatives of the pressure, along and normal to a normal shock, are indeterminate. A novel procedure is introduced for resolving the indeterminacy when the unsteady flow is three-dimensional and the upstream velocity may be nonuniform. Utilizing this procedure, normal shock relations are provided for the nonunique orientation of the flow plane and the corresponding shock's curvatures and, e.g., the downstream normal derivatives of the pressure and the velocity components. These algebraic relations explicitly show the dependence of these parameters on the shock's shape and the upstream velocity gradient. A simple relation, valid only for a normal shock, is obtained for the average curvatures. Results are also obtained when the shock is an elliptic paraboloid shock. These derivatives are both simple and proportional to the average curvature.

  8. GC-Content Normalization for RNA-Seq Data

    PubMed Central

    2011-01-01

    Background Transcriptome sequencing (RNA-Seq) has become the assay of choice for high-throughput studies of gene expression. However, as is the case with microarrays, major technology-related artifacts and biases affect the resulting expression measures. Normalization is therefore essential to ensure accurate inference of expression levels and subsequent analyses thereof. Results We focus on biases related to GC-content and demonstrate the existence of strong sample-specific GC-content effects on RNA-Seq read counts, which can substantially bias differential expression analysis. We propose three simple within-lane gene-level GC-content normalization approaches and assess their performance on two different RNA-Seq datasets, involving different species and experimental designs. Our methods are compared to state-of-the-art normalization procedures in terms of bias and mean squared error for expression fold-change estimation and in terms of Type I error and p-value distributions for tests of differential expression. The exploratory data analysis and normalization methods proposed in this article are implemented in the open-source Bioconductor R package EDASeq. Conclusions Our within-lane normalization procedures, followed by between-lane normalization, reduce GC-content bias and lead to more accurate estimates of expression fold-changes and tests of differential expression. Such results are crucial for the biological interpretation of RNA-Seq experiments, where downstream analyses can be sensitive to the supplied lists of genes. PMID:22177264
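A much-simplified within-lane GC-content correction can be sketched by binning genes by GC-content and median-matching the bins (this is in the spirit of, but not identical to, EDASeq's regression-based methods; the function name, bin count, and median-scaling rule are assumptions):

```python
import numpy as np

def gc_median_normalize(counts, gc, n_bins=5):
    """Rescale gene-level read counts within one lane so that each
    GC-content bin has the same median count as the lane overall,
    removing a monotone GC bias.

    counts : per-gene read counts for one lane
    gc     : per-gene GC-content fractions in [0, 1]
    """
    counts = counts.astype(float)
    overall = np.median(counts)
    # Equal-occupancy GC bins via quantiles of the GC distribution
    edges = np.quantile(gc, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.digitize(gc, edges[1:-1]), 0, n_bins - 1)
    out = counts.copy()
    for b in range(n_bins):
        sel = idx == b
        med = np.median(counts[sel])
        if med > 0:
            out[sel] *= overall / med  # match bin median to lane median
    return out
```

On counts with a smooth GC-dependent trend, low- and high-GC bins end up with equal medians after correction.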

  9. Normal IQ is possible in Smith-Lemli-Opitz syndrome.

    PubMed

    Eroglu, Yasemen; Nguyen-Driver, Mina; Steiner, Robert D; Merkens, Louise; Merkens, Mark; Roullet, Jean-Baptiste; Elias, Ellen; Sarphare, Geeta; Porter, Forbes D; Li, Chumei; Tierney, Elaine; Nowaczyk, Małgorzata J; Freeman, Kurt A

    2017-08-01

    Children with Smith-Lemli-Opitz syndrome (SLOS) are typically reported to have moderate to severe intellectual disability. This study aims to determine whether normal cognitive function is possible in this population and to describe the clinical, biochemical and molecular characteristics of children with SLOS and normal intelligence quotient (IQ). The study included children with SLOS who underwent cognitive testing in four centers. All children with at least one IQ composite score above 80 were included in the study. Six girls and three boys with SLOS were found to have normal or low-normal IQ in a cohort of 145 children with SLOS. Major/multiple organ anomalies and low serum cholesterol levels were uncommon. No correlation between IQ and genotype was evident, and no specific developmental profile was observed. Thus, normal or low-normal cognitive function is possible in SLOS. Further studies are needed to elucidate factors contributing to normal or low-normal cognitive function in children with SLOS. © 2017 Wiley Periodicals, Inc.

  10. Normal gravity field in relativistic geodesy

    NASA Astrophysics Data System (ADS)

    Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao

    2018-02-01

    Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in development of quantum sensors for applications in geodesy including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of the Earth's gravitational field are referred is a normal gravity field, represented in Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of the Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field that is sufficient for current and near-future practical applications. We show that in general relativity the level surface of homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are
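    For reference, the Newtonian normal gravity on the level ellipsoid that serves as the paper's starting point has a closed-form expression (Somigliana's formula). A sketch using standard GRS80 constants, which are conventional geodesy values rather than quantities from this paper:

```python
import math

# GRS80 derived constants for normal gravity on the level ellipsoid.
GAMMA_E = 9.7803267715   # equatorial normal gravity, m/s^2
K = 0.001931851353       # Somigliana's constant
E2 = 0.00669438002290    # first eccentricity squared

def somigliana(lat_deg):
    """Newtonian normal gravity (m/s^2) on the GRS80 level ellipsoid at
    geodetic latitude lat_deg, via Somigliana's closed formula."""
    s2 = math.sin(math.radians(lat_deg)) ** 2
    return GAMMA_E * (1.0 + K * s2) / math.sqrt(1.0 - E2 * s2)
```

    The relativistic treatment in the paper replaces this closed-form Newtonian benchmark with post-Newtonian corrections to the level surface and field.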

  11. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  12. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  13. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  14. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  15. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  16. Professionelles Learning Service Management an Hochschulen

    NASA Astrophysics Data System (ADS)

    Baume, Matthias; Rathmayer, Sabine; Gergintchev, Ivan; Schulze, Elvira

    Building on the eLearning infrastructures that have largely already been put in place for a modern organization, nearly all universities face the task of realizing suitable learning and knowledge management concepts with regard to service quality and user orientation. One possible approach is to develop and implement a framework for improving and evolving the teaching and learning processes of instructors and students, modeled on existing, established service management concepts. Transferred to university organization and teaching, such a framework for planning, delivering, and supporting teaching and learning services, incorporating the most important teaching and learning processes, would be an urgently needed and fundamental step toward the gradual professionalization and improvement of university teaching. This contribution outlines a concept for professional learning service management at universities and gives an outlook on a possible procedure for its implementation and evaluation.

  17. FORCED NORMALIZATION: Epilepsy and Psychosis Interaction

    PubMed Central

    Loganathan, Muruga A.; Enja, Manasa

    2015-01-01

    Forced normalization is the emergence of psychosis following the establishment of seizure control in a patient with previously uncontrolled epilepsy. Two illustrative clinical vignettes are provided of people whose epilepsy was newly controlled and who subsequently developed psychosis; symptoms appeared only after ictal control was attained. For purposes of recognition and differential diagnosis, understanding forced normalization is important in clinical practice. PMID:26155377

  18. Corticocortical feedback increases the spatial extent of normalization.

    PubMed

    Nassi, Jonathan J; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T

    2014-01-01

    Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a "normalization pool." Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing.
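    The divisive-normalization account of area summation can be illustrated with a toy model: the response is the driving input divided by the pooled activity of a normalization pool of larger spatial extent. The Gaussian spatial profiles, parameter values, and function names below are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np

def area_summation(radius, pool_radius, center_radius=1.0, sigma=0.1):
    """Toy divisive-normalization model of area summation: response =
    drive / (sigma + pool), where the drive saturates over the receptive
    field center and the normalization pool integrates activity over the
    larger extent pool_radius."""
    drive = 1.0 - np.exp(-(radius / center_radius) ** 2)
    pool = 1.0 - np.exp(-(radius / pool_radius) ** 2)
    return drive / (sigma + pool)

def suppression_index(response):
    """Fractional drop from the peak response to the largest stimulus."""
    return 1.0 - response[-1] / response.max()

radii = np.linspace(0.01, 8.0, 400)
feedback_intact = area_summation(radii, pool_radius=2.0)  # wide pool
feedback_off = area_summation(radii, pool_radius=1.0)     # pool shrunk
```

    Shrinking `pool_radius` toward the center's size flattens the curve and reduces the suppression index, mirroring the reported effect of feedback inactivation.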

  19. Normalization of Gravitational Acceleration Models

    NASA Technical Reports Server (NTRS)

    Eckman, Randy A.; Brown, Aaron J.; Adamo, Daniel R.

    2011-01-01

    Unlike the uniform density spherical shell approximations of Newton, the consequence of spaceflight in the real universe is that gravitational fields are sensitive to the nonsphericity of their generating central bodies. The gravitational potential of a nonspherical central body is typically resolved using spherical harmonic approximations. However, attempting to directly calculate the spherical harmonic approximations results in at least two singularities which must be removed in order to generalize the method and solve for any possible orbit, including polar orbits. Three unique algorithms have been developed to eliminate these singularities, by Samuel Pines [1], Bill Lear [2], and Robert Gottlieb [3]. This paper documents the methodical normalization of two of the three known formulations for singularity-free gravitational acceleration (namely, the Lear [2] and Gottlieb [3] algorithms) and formulates a general method for defining normalization parameters used to generate normalized Legendre polynomials and associated Legendre functions (ALFs) for any algorithm. A treatment of the conventional formulation of the gravitational potential and acceleration is also provided, in addition to a brief overview of the philosophical differences between the three known singularity-free algorithms.
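    To illustrate what normalization buys here, a sketch of the standard forward-column recursion for fully normalized ALFs, whose values stay O(1) even at high degree where unnormalized values overflow. This is the generic textbook recursion, not Pines', Lear's, or Gottlieb's specific singularity-free formulation.

```python
import numpy as np

def normalized_alf(nmax, t):
    """Fully normalized associated Legendre functions Pbar[n, m](t), for
    t = cos(colatitude), up to degree/order nmax, via the standard
    forward-column recursion."""
    u = np.sqrt(1.0 - t * t)          # sin(colatitude)
    P = np.zeros((nmax + 1, nmax + 1))
    P[0, 0] = 1.0
    if nmax >= 1:
        P[1, 0] = np.sqrt(3.0) * t
        P[1, 1] = np.sqrt(3.0) * u
    for m in range(2, nmax + 1):      # sectoral seeds Pbar[m, m]
        P[m, m] = u * np.sqrt((2.0 * m + 1.0) / (2.0 * m)) * P[m - 1, m - 1]
    for m in range(nmax + 1):         # march up in degree for each order
        for n in range(max(m + 1, 2), nmax + 1):
            a = np.sqrt((2.0 * n - 1.0) * (2.0 * n + 1.0)
                        / ((n - m) * (n + m)))
            b = np.sqrt((2.0 * n + 1.0) * (n + m - 1.0) * (n - m - 1.0)
                        / ((n - m) * (n + m) * (2.0 * n - 3.0)))
            P[n, m] = a * t * P[n - 1, m] - b * P[n - 2, m]
    return P
```

    The `a` and `b` factors are exactly the normalization parameters whose systematic derivation the paper generalizes across algorithms.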

  20. Normalized Temperature Contrast Processing in Flash Infrared Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development of the normalized-contrast processing for the flash infrared thermography method given by the author in US 8,577,120 B1. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, including converting one into the other. Methods of assessing emissivity of the object, afterglow heat flux, reflection temperature change, and temperature video imaging during flash thermography are provided. Temperature imaging and normalized temperature contrast imaging provide certain advantages over normalized pixel intensity contrast processing by reducing the effect of reflected energy in images and measurements, providing better quantitative data. The subject matter of this paper mostly comes from US 9,066,028 B1 by the author. Examples of normalized image processing video images and normalized temperature processing video images are provided. Examples of surface temperature video images, surface temperature rise video images, and simple contrast video images are also provided. Temperature video imaging in flash infrared thermography allows better comparison with flash thermography simulation using commercial software, which provides temperature video as the output. Temperature imaging also allows easy comparison of surface temperature change to camera temperature sensitivity or noise equivalent temperature difference (NETD) to assess the probability of detection (POD) of anomalies.
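    One simple form of temperature contrast can be sketched as follows. This is an illustrative definition (the pixel's post-flash temperature rise relative to that of a defect-free reference region), not the exact quantities and calibrations defined in the cited patents; the function and parameter names are invented.

```python
import numpy as np

def normalized_temperature_contrast(pixel_T, reference_T, ambient_T):
    """Illustrative normalized temperature contrast: the pixel's
    temperature rise over ambient relative to the rise of a defect-free
    reference region. Zero where the pixel tracks the reference,
    positive over heat-trapping anomalies."""
    rise_pixel = np.asarray(pixel_T, dtype=float) - ambient_T
    rise_ref = np.asarray(reference_T, dtype=float) - ambient_T
    return rise_pixel / rise_ref - 1.0
```

    Working in temperature rather than raw pixel intensity is what suppresses the reflected-energy component the abstract mentions, since reflection affects intensity but is removed in calibrated temperature.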

  1. Understanding a Normal Distribution of Data (Part 2).

    PubMed

    Maltenfort, Mitchell

    2016-02-01

    Completing the discussion of data normality, advanced techniques for analysis of non-normal data are discussed including data transformation, Generalized Linear Modeling, and bootstrapping. Relative strengths and weaknesses of each technique are helpful in choosing a strategy, but help from a statistician is usually necessary to analyze non-normal data using these methods.
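    Of the strategies named above, bootstrapping is the easiest to sketch: resample the data with replacement and read a confidence interval off the empirical distribution of the statistic, with no normality assumption. A minimal percentile-bootstrap sketch (function name and defaults are illustrative):

```python
import numpy as np

def bootstrap_ci(data, stat=np.median, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic: draw
    n_boot resamples with replacement, compute the statistic on each,
    and take the empirical alpha/2 and 1 - alpha/2 quantiles."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    boots = np.array([stat(rng.choice(data, size=data.size, replace=True))
                      for _ in range(n_boot)])
    return np.quantile(boots, [alpha / 2, 1 - alpha / 2])
```

    As the abstract cautions, choosing between transformation, a Generalized Linear Model, and the bootstrap still benefits from statistical advice; the bootstrap is merely the most assumption-free of the three.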

  2. High-speed digital signal normalization for feature identification

    NASA Technical Reports Server (NTRS)

    Ortiz, J. A.; Meredith, B. D.

    1983-01-01

    A design approach for high-speed normalization of digital signals was developed. A reciprocal look-up table technique is employed, where a digital value is mapped to its reciprocal via a high-speed memory. This reciprocal is then multiplied with an input signal to obtain the normalized result. Normalization considerably improves the accuracy of certain feature identification algorithms. By using the concept of pipelining, the multispectral sensor data processing rate is limited only by the speed of the multiplier. The breadboard system was found to operate at an execution rate of five million normalizations per second. This design features high precision, reduced hardware complexity, high flexibility, and expandability, which are very important considerations for spaceborne applications. It also achieves the high-speed normalization rate essential for real-time data processing.
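    The look-up-table trick replaces each division with one memory fetch plus one multiply, which pipelines well in hardware. A software sketch of the idea (the 8-bit fixed-point width is an assumption for illustration; the paper targeted custom hardware, not any particular word size):

```python
import numpy as np

BITS = 8
SCALE = 1 << BITS  # fixed-point scale factor for the stored reciprocals

# Table maps each possible divisor value to a scaled integer reciprocal;
# index 0 is a sentinel, since 1/0 is undefined.
RECIP_LUT = np.array([0] + [round(SCALE / d) for d in range(1, 256)],
                     dtype=np.int32)

def normalize(signal, divisor):
    """Normalize signal by divisor via the reciprocal LUT: one table
    lookup, one multiply, one shift -- no division in the data path."""
    return (signal.astype(np.int32) * RECIP_LUT[divisor]) >> BITS
```

    The fixed-point rounding costs a little precision relative to true division, the classic speed/accuracy trade of the approach.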

  3. Antitissue Transglutaminase Normalization Postdiagnosis in Children With Celiac Disease.

    PubMed

    Isaac, Daniela Migliarese; Rajani, Seema; Yaskina, Maryna; Huynh, Hien Q; Turner, Justine M

    2017-08-01

    Limited pediatric data exist examining the trend and predictors of antitissue transglutaminase (atTG) normalization over time in children with celiac disease (CD). We aimed to evaluate time to normalization of atTG in children after CD diagnosis, and to assess for independent predictors affecting this duration. A retrospective chart review was completed in pediatric patients with CD diagnosed from 2007 to 2014 at the Stollery Children's Hospital Celiac Clinic (Edmonton, Alberta, Canada). The clinical predictors assessed for impact on time to atTG normalization were initial atTG, Marsh score at diagnosis, gluten-free diet compliance (GFDC), age at diagnosis, sex, ethnicity, medical comorbidities, and family history of CD. Kaplan-Meier survival analysis was completed to assess time to atTG normalization, and Cox regression to assess for independent predictors of this time. A total of 487 patients met inclusion criteria. Approximately 80.5% of patients normalized atTG levels. Median normalization time was 407 days for all patients (95% confidence interval [CI: 361-453]), and 364 days for gluten-free diet compliant patients (95% CI [335-393]). Type 1 diabetes mellitus (T1DM) patients took significantly longer to normalize at 1204 days (95% CI [199-2209], P < 0.001). Cox regression demonstrated T1DM (hazard ratio = 0.36 [0.24-0.55], P < 0.001) and higher baseline atTG (hazard ratio = 0.52 [0.43-0.63], P < 0.001) were significant predictors of longer atTG normalization time. GFDC was a significant predictor of earlier normalization (OR = 13.91 [7.86-24.62], P < 0.001). GFDC and lower atTG at diagnosis are predictors of earlier normalization. Patients with T1DM are less likely to normalize atTG levels, with longer normalization time. Additional research and education for higher-risk populations are needed.
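    The Kaplan-Meier estimate used in the study can be sketched in a few lines: at each distinct event time, survival is multiplied by (1 - d/n), where d is the number of events and n the number still at risk. A generic product-limit implementation (not the authors' statistical software):

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the survival function.
    times: follow-up times; events: 1 if the event (here, atTG
    normalization) occurred, 0 if censored. Returns event times and the
    survival estimate after each."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=bool)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n_at_risk = len(times)
    out_t, surv, s = [], [], 1.0
    for t in np.unique(times):
        at_t = times == t
        d = int(np.sum(events & at_t))
        if d > 0:
            s *= 1.0 - d / n_at_risk
            out_t.append(t)
            surv.append(s)
        n_at_risk -= int(np.sum(at_t))   # events and censorings leave the risk set
    return np.array(out_t), np.array(surv)
```

    The study's Cox regression then models how covariates (T1DM, baseline atTG, diet compliance) shift the hazard underlying such curves.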

  4. Normalization of energy-dependent gamma survey data.

    PubMed

    Whicker, Randy; Chambers, Douglas

    2015-05-01

    Instruments and methods for normalization of energy-dependent gamma radiation survey data to a less energy-dependent basis of measurement are evaluated based on relevant field data collected at 15 different sites across the western United States along with a site in Mongolia. Normalization performance is assessed relative to measurements with a high-pressure ionization chamber (HPIC) due to its "flat" energy response and accurate measurement of the true exposure rate from both cosmic and terrestrial radiation. While analytically ideal for normalization applications, cost and practicality disadvantages have increased demand for alternatives to the HPIC. Regression analysis on paired measurements between energy-dependent sodium iodide (NaI) scintillation detectors (5-cm by 5-cm crystal dimensions) and the HPIC revealed highly consistent relationships among sites not previously impacted by radiological contamination (natural sites). A resulting generalized data normalization factor based on the average sensitivity of NaI detectors to naturally occurring terrestrial radiation (0.56 nGy/h HPIC per nGy/h NaI), combined with the calculated site-specific estimate of cosmic radiation, produced reasonably accurate predictions of HPIC readings at natural sites. Normalization against two potential alternative instruments (a tissue-equivalent plastic scintillator and an energy-compensated NaI detector) did not perform better than the sensitivity-adjustment approach at natural sites. Each approach produced unreliable estimates of HPIC readings at radiologically impacted sites, though normalization against the plastic scintillator or energy-compensated NaI detector can address incompatibilities between different energy-dependent instruments with respect to estimation of soil radionuclide levels. The appropriate data normalization method depends on the nature of the site, expected duration of the project, survey objectives, and considerations of cost and practicality.
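    The sensitivity-adjustment approach reduces to a one-line calculation. The 0.56 factor is from the abstract; how the cosmic term enters (simple addition of the site-specific estimate) is an assumption of this sketch, as are the function and parameter names.

```python
def hpic_estimate(nai_terrestrial_ngy_h, cosmic_ngy_h, sensitivity=0.56):
    """Sketch of the generalized normalization: scale the NaI reading of
    terrestrial radiation by the average sensitivity factor (0.56 nGy/h
    HPIC per nGy/h NaI) and add the site-specific cosmic estimate, to
    which the energy-dependent NaI detector under-responds."""
    return sensitivity * nai_terrestrial_ngy_h + cosmic_ngy_h
```

    At radiologically impacted sites the altered gamma energy spectrum breaks the fixed sensitivity factor, which is why the abstract reports the approach fails there.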

  5. Speaker normalization for chinese vowel recognition in cochlear implants.

    PubMed

    Luo, Xin; Fu, Qian-Jie

    2005-07-01

    Because of the limited spectro-temporal resolution associated with cochlear implants, implant patients often have greater difficulty with multitalker speech recognition. The present study investigated whether multitalker speech recognition can be improved by applying speaker normalization techniques to cochlear implant speech processing. Multitalker Chinese vowel recognition was tested with normal-hearing Chinese-speaking subjects listening to a 4-channel cochlear implant simulation, with and without speaker normalization. For each subject, speaker normalization was referenced to the speaker that produced the best recognition performance under conditions without speaker normalization. To match the remaining speakers to this "optimal" output pattern, the overall frequency range of the analysis filter bank was adjusted for each speaker according to the ratio of the mean third formant frequency values between the specific speaker and the reference speaker. Results showed that speaker normalization provided a small but significant improvement in subjects' overall recognition performance. After speaker normalization, subjects' patterns of recognition performance across speakers changed, demonstrating the potential for speaker-dependent effects with the proposed normalization technique.
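    The normalization rule described above can be sketched in one function: rescale the analysis filter bank's frequency range by the speakers' F3 ratio. The function and parameter names, and the direction of the ratio, are illustrative assumptions.

```python
def normalized_filter_range(f_low_hz, f_high_hz,
                            f3_speaker_hz, f3_reference_hz):
    """Sketch of F3-ratio speaker normalization: rescale the analysis
    filter bank's overall frequency range by the ratio of the speaker's
    mean third-formant (F3) frequency to the reference speaker's, so
    each speaker's vowel space maps onto the reference pattern."""
    ratio = f3_speaker_hz / f3_reference_hz
    return f_low_hz * ratio, f_high_hz * ratio
```

    F3 is a convenient anchor because it tracks vocal-tract length fairly directly, varying across talkers but comparatively little across vowels.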

  6. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected farmer's normal marketings which, for the purposes of this subpart, shall be the sum of the quantities of...

  7. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 7 2013-01-01 2013-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected farmer's normal marketings which, for the purposes of this subpart, shall be the sum of the quantities of...

  8. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 7 2012-01-01 2012-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected farmer's normal marketings which, for the purposes of this subpart, shall be the sum of the quantities of...

  9. Rhythm-based heartbeat duration normalization for atrial fibrillation detection.

    PubMed

    Islam, Md Saiful; Ammour, Nassim; Alajlan, Naif; Aboalsamh, Hatim

    2016-05-01

    Screening of atrial fibrillation (AF) for high-risk patients, including all patients aged 65 years and older, is important for prevention of risk of stroke. Different technologies such as modified blood pressure monitors, single-lead ECG-based finger probes, and smartphones using plethysmogram signals have been emerging for this purpose. All these technologies use irregularity of heartbeat duration as a feature for AF detection. We have investigated a normalization method of heartbeat duration for improved AF detection. AF is an arrhythmia in which heartbeat duration generally becomes irregularly irregular. From a window of heartbeat durations, we estimate the likely rhythm of the majority of heartbeats and normalize the duration of all heartbeats in the window based on that rhythm, so that we can measure the irregularity of heartbeats for both AF and non-AF rhythms on the same scale. Irregularity is measured by the entropy of the distribution of the normalized durations. We then classify a window of heartbeats as AF or non-AF by thresholding the measured irregularity. The effect of this normalization is evaluated by comparing AF detection performance using durations with the normalization, without normalization, and with other existing normalizations. Sensitivity and specificity of AF detection using normalized heartbeat duration were tested on two landmark databases available online and compared with the results of other methods (with/without normalization) by receiver operating characteristic (ROC) curves. ROC analysis showed that the normalization was able to improve the performance of AF detection, and that the improvement was consistent over a wide range of sensitivity and specificity obtained with different thresholds. Detection accuracy was also computed for equal rates of sensitivity and specificity for different methods. Using normalized heartbeat duration, we obtained 96.38% accuracy, more than a 4% improvement compared to AF detection without normalization. The proposed normalization
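    The pipeline described above can be sketched as: estimate the window's dominant rhythm, normalize each RR interval by it, then threshold the entropy of the normalized durations. Using the median as the rhythm estimate, a 16-bin histogram, and the threshold value are all illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def rr_irregularity(rr, n_bins=16):
    """Rhythm-based irregularity sketch: divide each RR interval in the
    window by the estimated dominant rhythm (median here), then measure
    irregularity as the Shannon entropy (bits) of the histogram of
    normalized durations."""
    rr = np.asarray(rr, dtype=float)
    rhythm = np.median(rr)            # estimate of the majority rhythm
    normalized = rr / rhythm
    hist, _ = np.histogram(normalized, bins=n_bins, range=(0.0, 2.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def is_af(rr, threshold=1.5):
    """Classify a window as AF when irregularity exceeds a threshold
    (the value here is illustrative, not from the paper)."""
    return rr_irregularity(rr) > threshold
```

    A perfectly regular window collapses into a single histogram bin (entropy 0), while irregularly irregular AF spreads across many bins, regardless of the underlying heart rate; that rate-independence is the point of the normalization.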

  10. Normal modes of weak colloidal gels

    NASA Astrophysics Data System (ADS)

    Varga, Zsigmond; Swan, James W.

    2018-01-01

    The normal modes and relaxation rates of weak colloidal gels are investigated in calculations using different models of the hydrodynamic interactions between suspended particles. The relaxation spectrum is computed for freely draining, Rotne-Prager-Yamakawa, and accelerated Stokesian dynamics approximations of the hydrodynamic mobility in a normal mode analysis of a harmonic network representing several colloidal gels. We find that the density of states and spatial structure of the normal modes are fundamentally altered by long-ranged hydrodynamic coupling among the particles. Short-ranged coupling due to hydrodynamic lubrication affects only the relaxation rates of short-wavelength modes. Hydrodynamic models accounting for long-ranged coupling exhibit a microscopic relaxation rate for each normal mode, λ, that scales as l^(-2), where l is the spatial correlation length of the normal mode. For the freely draining approximation, which neglects long-ranged coupling, the microscopic relaxation rate scales as l^(-γ), where γ varies between three and two with increasing particle volume fraction. A simple phenomenological model of the internal elastic response to normal mode fluctuations is developed, which shows that long-ranged hydrodynamic interactions play a central role in the viscoelasticity of the gel network. Dynamic simulations of hard spheres that gel in response to short-ranged depletion attractions are used to test the applicability of the density of states predictions. For particle concentrations up to 30% by volume, the power law decay of the relaxation modulus in simulations accounting for long-ranged hydrodynamic interactions agrees with predictions generated by the density of states of the corresponding harmonic networks as well as experimental measurements. For higher volume fractions, excluded volume interactions dominate the stress response, and the prediction from the harmonic network density of states fails. Analogous to the Zimm model in polymer

  12. Plasma Electrolyte Distributions in Humans-Normal or Skewed?

    PubMed

    Feldman, Mark; Dickson, Beverly

    2017-11-01

    It is widely believed that plasma electrolyte levels are normally distributed. Statistical tests and calculations using plasma electrolyte data are often reported based on this assumption of normality. Examples include t tests, analysis of variance, correlations and confidence intervals. The purpose of our study was to determine whether plasma sodium (Na+), potassium (K+), chloride (Cl-) and bicarbonate (HCO3-) distributions are indeed normally distributed. We analyzed plasma electrolyte data from 237 consecutive adults (137 women and 100 men) who had normal results on a standard basic metabolic panel which included plasma electrolyte measurements. The skewness of each distribution (as a measure of its asymmetry) was compared to the zero skewness of a normal (Gaussian) distribution. The plasma Na+ distribution was skewed slightly to the right, but the skew was not significantly different from zero skew. The plasma Cl- distribution was skewed slightly to the left, but again the skew was not significantly different from zero skew. On the contrary, both the plasma K+ and HCO3- distributions were significantly skewed to the right (P < 0.01 vs. zero skew). There was also a suggestion from examining frequency distribution curves that the K+ and HCO3- distributions were bimodal. In adults with a normal basic metabolic panel, plasma potassium and bicarbonate levels are not normally distributed and may be bimodal. Thus, statistical methods used to evaluate these 2 plasma electrolytes should be nonparametric tests and not parametric ones that require a normal distribution. Copyright © 2017 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.
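    The kind of skewness check the study performs can be sketched with simulated panels. The electrolyte values below are invented for illustration (a symmetric distribution for sodium, a right-skewed one for potassium, mimicking the reported finding); `scipy.stats.skewtest` is the D'Agostino test of the null hypothesis of zero skewness.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated plasma panels (illustrative values, not patient data).
sodium = rng.normal(140.0, 2.0, 237)                    # mmol/L, symmetric
potassium = np.exp(rng.normal(np.log(4.2), 0.25, 237))  # mmol/L, right-skewed

for name, x in [("Na+", sodium), ("K+", potassium)]:
    stat, p = stats.skewtest(x)  # D'Agostino test of zero skewness
    print(f"{name}: skew = {stats.skew(x):+.3f}, P = {p:.4f}")
```

    When such a test rejects zero skewness, the study's recommendation follows: compare groups with nonparametric methods rather than t tests or ANOVA.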

  13. Applying the log-normal distribution to target detection

    NASA Astrophysics Data System (ADS)

    Holst, Gerald C.

    1992-09-01

    Holst and Pickard experimentally determined that MRT responses tend to follow a log-normal distribution. The log-normal distribution appeared reasonable because nearly all visual psychophysical data are plotted on a logarithmic scale. It has the additional advantage that it is bounded to positive values, an important consideration since probability of detection is often plotted in linear coordinates. Review of published data suggests that the log-normal distribution may have universal applicability. Specifically, the log-normal distribution obtained from MRT tests appears to fit the target transfer function and the probability of detection of rectangular targets.
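    A log-normal probability-of-detection curve has only two parameters, a median and a log-scale spread, and can be fitted directly. The detection data below are invented for illustration, and the optimizer-free grid search is just one simple way to fit the two parameters.

```python
import numpy as np
from scipy import stats

# Illustrative data: probability of detection versus target contrast.
contrast = np.array([0.1, 0.2, 0.4, 0.8, 1.6, 3.2])
p_detect = np.array([0.02, 0.10, 0.30, 0.62, 0.88, 0.98])

def lognormal_pod(c, median, log_sd):
    """Log-normal detection curve: P(detect) = Phi(ln(c / median) / log_sd),
    zero-bounded in c and linear in probability, as the abstract notes."""
    return stats.norm.cdf(np.log(c / median) / log_sd)

# Least-squares fit of (median, log_sd) over a coarse parameter grid.
m_fit, s_fit = min(
    ((m, s) for m in np.linspace(0.2, 2.0, 91)
            for s in np.linspace(0.2, 2.0, 91)),
    key=lambda ms: np.sum((lognormal_pod(contrast, *ms) - p_detect) ** 2))
```

    The fitted median is the 50%-detection contrast; plotted against log contrast, the curve is the familiar symmetric ogive of psychophysical data.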

  14. Is Coefficient Alpha Robust to Non-Normal Data?

    PubMed Central

    Sheng, Yanyan; Sheng, Zhaohui

    2011-01-01

    Coefficient alpha has been a widely used measure by which internal consistency reliability is assessed. In addition to essential tau-equivalence and uncorrelated errors, normality has been noted as another important assumption for alpha. Earlier work on evaluating this assumption considered either exclusively non-normal error score distributions, or limited conditions. In view of this and the availability of advanced methods for generating univariate non-normal data, Monte Carlo simulations were conducted to show that non-normal distributions for true or error scores do create problems for using alpha to estimate the internal consistency reliability. The sample coefficient alpha is affected by leptokurtic true score distributions, or skewed and/or kurtotic error score distributions. Increased sample sizes, not test lengths, help improve the accuracy, bias, or precision of using it with non-normal data. PMID:22363306
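    For reference, the statistic under study is computed directly from the item-score matrix: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal implementation:

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for an (n_subjects, k_items) score matrix:
    k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)
```

    The simulations in the paper ask how this sample quantity behaves when the true- or error-score distributions feeding the matrix are skewed or kurtotic rather than normal.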

  15. Mapping of quantitative trait loci using the skew-normal distribution.

    PubMed

    Fernandes, Elisabete; Pacheco, António; Penha-Gonçalves, Carlos

    2007-11-01

    In standard interval mapping (IM) of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. When this assumption of normality is violated, the most commonly adopted strategy is to use the previous model after data transformation. However, an appropriate transformation may not exist or may be difficult to find. Also this approach can raise interpretation issues. An interesting alternative is to consider a skew-normal mixture model in standard IM, and the resulting method is here denoted as skew-normal IM. This flexible model that includes the usual symmetric normal distribution as a special case is important, allowing continuous variation from normality to non-normality. In this paper we briefly introduce the main peculiarities of the skew-normal distribution. The maximum likelihood estimates of parameters of the skew-normal distribution are obtained by the expectation-maximization (EM) algorithm. The proposed model is illustrated with real data from an intercross experiment that shows a significant departure from the normality assumption. The performance of the skew-normal IM is assessed via stochastic simulation. The results indicate that the skew-normal IM has higher power for QTL detection and better precision of QTL location as compared to standard IM and nonparametric IM.
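    The key property exploited here, that the skew-normal family contains the normal as a special case, is easy to verify numerically with `scipy.stats.skewnorm` (a generic implementation of the distribution, not the authors' EM code):

```python
import numpy as np
from scipy import stats

# Skew-normal density: f(x; a) = 2 * phi(x) * Phi(a * x),
# so shape a = 0 recovers the standard normal phi(x) exactly.
x = np.linspace(-4.0, 4.0, 9)
assert np.allclose(stats.skewnorm.pdf(x, 0.0), stats.norm.pdf(x))

# A positive shape parameter skews the density to the right -- the kind
# of departure from normality the skew-normal mixture can absorb
# without transforming the trait data.
sample = stats.skewnorm.rvs(5.0, size=2000, random_state=0)
```

    Because the shape parameter varies continuously through zero, the skew-normal mixture in IM nests the standard normal-mixture model, which is what makes the comparison in the paper well posed.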

  16. Attention and normalization circuits in macaque V1

    PubMed Central

    Sanayei, M; Herrero, J L; Distler, C; Thiele, A

    2015-01-01

    Attention affects neuronal processing and improves behavioural performance. In extrastriate visual cortex these effects have been explained by normalization models, which assume that attention influences the circuit that mediates surround suppression. While normalization models have been able to explain attentional effects, their validity has rarely been tested against alternative models. Here we investigate how attention and surround/mask stimuli affect neuronal firing rates and orientation tuning in macaque V1. Surround/mask stimuli provide an estimate to what extent V1 neurons are affected by normalization, which was compared against effects of spatial top down attention. For some attention/surround effect comparisons, the strength of attentional modulation was correlated with the strength of surround modulation, suggesting that attention and surround/mask stimulation (i.e. normalization) might use a common mechanism. To explore this in detail, we fitted multiplicative and additive models of attention to our data. In one class of models, attention contributed to normalization mechanisms, whereas in a different class of models it did not. Model selection based on Akaike's and on Bayesian information criteria demonstrated that in most cells the effects of attention were best described by models where attention did not contribute to normalization mechanisms. This demonstrates that attentional influences on neuronal responses in primary visual cortex often bypass normalization mechanisms. PMID:25757941
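    The model-selection step described above uses two standard penalized-likelihood scores; a minimal sketch (the per-cell log-likelihoods would come from fitting the attention models, which is not reproduced here):

```python
import numpy as np

def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln L (lower is better)."""
    return 2.0 * n_params - 2.0 * log_likelihood

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion: k ln n - 2 ln L (lower is better)."""
    return n_params * np.log(n_obs) - 2.0 * log_likelihood

def select_model(fits, n_obs):
    """Index of the model with the lowest BIC, given (ln L, k) pairs."""
    return int(np.argmin([bic(ll, k, n_obs) for ll, k in fits]))
```

    BIC penalizes extra parameters more heavily than AIC for any reasonable n, so agreement between the two criteria, as reported here, makes the selection more convincing.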

  17. Dependence of normal brain integral dose and normal tissue complication probability on the prescription isodose values for γ-knife radiosurgery

    NASA Astrophysics Data System (ADS)

    Ma, Lijun

    2001-11-01

    A recent multi-institutional clinical study suggested possible benefits of lowering the prescription isodose lines for stereotactic radiosurgery procedures. In this study, we investigate the dependence of the normal brain integral dose and the normal tissue complication probability (NTCP) on the prescription isodose values for γ-knife radiosurgery. An analytical dose model was developed for γ-knife treatment planning. The dose model was commissioned by fitting the measured dose profiles for each helmet size and validated by comparing its results with Leksell gamma plan (LGP, version 5.30) calculations. The normal brain integral dose and the NTCP were computed and analysed for an ensemble of treatment cases, and their functional dependence on the prescription isodose values was studied. We found that the normal brain integral dose and the NTCP increase significantly when the prescription isodose lines are lowered from 50% to 35% of the maximum tumour dose and, conversely, decrease significantly when the prescription isodose lines are raised from 50% to 65% of the maximum tumour dose. The results may be used as a guideline for designing future dose escalation studies for γ-knife applications.

  18. [From the library of the Netherlands Journal of Medicine. Rudolf Virchow: Die Cellularpathologie in ihrer Begründung auf physiologische und pathologische Gewebelehre; 1858].

    PubMed

    Molenaar, J C

    2003-11-08

    With the publication of Die Cellularpathologie in ihrer Begründung auf physiologische und pathologische Gewebelehre in 1858, the author Rudolf Virchow (1821-1902) originated the idea that each cell in each living organism, both plant and animal, originates from another cell and that the origin of disease can only be located in the cell. The book laid the foundations for cell pathology as a scientific discipline and was the most important publication by Virchow, who as doctor and statesman gathered so much fame that he became almost a mythical figure in his own time. The finding that every cell originates from another cell and does not develop from amorphous interstitium is actually attributable to Robert Remak.

  19. Spatially tuned normalization explains attention modulation variance within neurons.

    PubMed

    Ni, Amy M; Maunsell, John H R

    2017-09-01

    Spatial attention improves perception of attended parts of a scene, a behavioral enhancement accompanied by modulations of neuronal firing rates. These modulations vary in size across neurons in the same brain area. Models of normalization explain much of this variance in attention modulation with differences in tuned normalization across neurons (Lee J, Maunsell JHR. PLoS One 4: e4651, 2009; Ni AM, Ray S, Maunsell JHR. Neuron 73: 803-813, 2012). However, recent studies suggest that normalization tuning varies with spatial location both across and within neurons (Ruff DA, Alberts JJ, Cohen MR. J Neurophysiol 116: 1375-1386, 2016; Verhoef BE, Maunsell JHR. eLife 5: e17256, 2016). Here we show directly that attention modulation and normalization tuning do in fact covary within individual neurons, in addition to across neurons as previously demonstrated. We recorded the activity of isolated neurons in the middle temporal area of two rhesus monkeys as they performed a change-detection task that controlled the focus of spatial attention. Using the same two drifting Gabor stimuli and the same two receptive field locations for each neuron, we found that switching which stimulus was presented at which location affected both attention modulation and normalization in a correlated way within neurons. We present an equal-maximum-suppression spatially tuned normalization model that explains this covariance both across and within neurons: each stimulus generates equally strong suppression of its own excitatory drive, but its suppression of distant stimuli is typically less. This new model specifies how the tuned normalization associated with each stimulus location varies across space both within and across neurons, changing our understanding of the normalization mechanism and how attention modulations depend on this mechanism. 
NEW & NOTEWORTHY Tuned normalization studies have demonstrated that the variance in attention modulation size seen across neurons from the same cortical

  20. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Fragmentary excerpt from 40 CFR 230.24 (Normal water fluctuations), Subpart C, Potential Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem: "... change salinity patterns, alter erosion or sedimentation rates, aggravate water temperature extremes, and ..."

  1. Facial-Attractiveness Choices Are Predicted by Divisive Normalization.

    PubMed

    Furl, Nicholas

    2016-10-01

    Do people appear more attractive or less attractive depending on the company they keep? A divisive-normalization account, in which representation of stimulus intensity is normalized (divided) by concurrent stimulus intensities, predicts that choice preferences among options increase with the range of option values. In the first experiment reported here, I manipulated the range of attractiveness of the faces presented on each trial by varying the attractiveness of an undesirable distractor face that was presented simultaneously with two attractive targets, and participants were asked to choose the most attractive face. I used normalization models to predict the context dependence of preferences regarding facial attractiveness. The more unattractive the distractor, the more one of the targets was preferred over the other target, which suggests that divisive normalization (a potential canonical computation in the brain) influences social evaluations. I obtained the same result when I manipulated faces' averageness and participants chose the most average face. This finding suggests that divisive normalization is not restricted to value-based decisions (e.g., attractiveness). This new application of normalization, a classic theory, to social evaluation opens possibilities for predicting social decisions in naturalistic contexts such as advertising or dating.
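
    The divisive-normalization account can be made concrete: each option's value is divided by a constant plus the summed value of all options on screen, so a weaker distractor shrinks the denominator and widens the normalized gap between the two targets. A minimal sketch (the values and σ below are invented for illustration):

```python
def divisive_normalize(values, sigma=1.0):
    """Each option's value divided by sigma plus the summed value of all options shown."""
    total = sum(values)
    return [v / (sigma + total) for v in values]

# two attractive targets (8 and 7) paired with an unattractive vs. a moderate distractor
weak = divisive_normalize([8.0, 7.0, 1.0])
strong = divisive_normalize([8.0, 7.0, 6.0])
gap_weak = weak[0] - weak[1]        # normalized gap between targets, weak distractor
gap_strong = strong[0] - strong[1]  # smaller gap when the distractor is stronger
```

    The larger normalized gap with the weaker distractor is the range effect the experiment tests: preferences between the two targets sharpen as the distractor's attractiveness drops.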

  2. Normal mode analysis and applications in biological physics.

    PubMed

    Dykeman, Eric C; Sankey, Otto F

    2010-10-27

    Normal mode analysis has become a popular and often used theoretical tool in the study of functional motions in enzymes, viruses, and large protein assemblies. The use of normal modes in the study of these motions is often extremely fruitful since many of the functional motions of large proteins can be described using just a few normal modes which are intimately related to the overall structure of the protein. In this review, we present a broad overview of several popular methods used in the study of normal modes in biological physics including continuum elastic theory, the elastic network model, and a new all-atom method, recently developed, which is capable of computing a subset of the low frequency vibrational modes exactly. After a review of the various methods, we present several examples of applications of normal modes in the study of functional motions, with an emphasis on viral capsids.
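
    The core computation behind the methods above is an eigendecomposition of the force-constant (Hessian) matrix: eigenvalues give squared mode frequencies and eigenvectors give the mode shapes. A toy sketch for three unit masses joined by two unit springs (a deliberately tiny stand-in, not one of the protein-scale methods discussed in the review):

```python
import numpy as np

# Force-constant (Hessian) matrix for three unit masses joined by two unit springs
k = 1.0
H = np.array([[ k,  -k,     0.0],
              [-k,   2 * k, -k],
              [ 0.0, -k,     k]])

evals, evecs = np.linalg.eigh(H)            # eigenvalues in ascending order
freqs = np.sqrt(np.clip(evals, 0.0, None))  # mode frequencies: omega = sqrt(lambda)
# the lowest mode is the zero-frequency rigid translation; the others are internal vibrations
```

    The zero eigenvalue corresponds to rigid-body translation of the free chain; in the elastic network model the same structure appears, just with thousands of coordinates.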

  3. Meissner effect in normal-superconducting proximity-contact double layers

    NASA Astrophysics Data System (ADS)

    Higashitani, Seiji; Nagai, Katsuhiko

    1995-02-01

    The Meissner effect in normal-superconducting proximity-contact double layers is discussed in the clean limit. The diamagnetic current is calculated using the quasi-classical Green's function, which we obtain to linear order in the vector potential for proximity-contact double layers with a finite reflection coefficient at the interface. It is found that the diamagnetic current in the clean normal layer is constant in space; therefore, the magnetic field decreases linearly in the clean normal layer. We give an explicit expression for the screening length in the clean normal layer and study its temperature dependence. We show that the temperature dependence in the clean normal layer is considerably different from that in the dirty normal layer and agrees with a recent experiment on an Au-Nb system.

  4. Helicon normal modes in Proto-MPEX

    NASA Astrophysics Data System (ADS)

    Piotrowicz, P. A.; Caneses, J. F.; Green, D. L.; Goulding, R. H.; Lau, C.; Caughman, J. B. O.; Rapp, J.; Ruzic, D. N.

    2018-05-01

    The Proto-MPEX helicon source has been operating in a high electron density 'helicon-mode'. Establishing plasma densities and magnetic field strengths under the antenna that allow for the formation of normal modes of the fast wave is believed to be responsible for the 'helicon-mode'. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that, in the regions of operation in which core power deposition is maximum, the slow wave does not deposit significant power except directly under the antenna. In a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.

  5. Helicon normal modes in Proto-MPEX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piotrowicz, Pawel A.; Caneses, Juan F.; Green, David L.

    Here, the Proto-MPEX helicon source has been operating in a high electron density 'helicon-mode'. Establishing plasma densities and magnetic field strengths under the antenna that allow for the formation of normal modes of the fast wave is believed to be responsible for the 'helicon-mode'. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that, in the regions of operation in which core power deposition is maximum, the slow wave does not deposit significant power except directly under the antenna. In a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.

  6. Helicon normal modes in Proto-MPEX

    DOE PAGES

    Piotrowicz, Pawel A.; Caneses, Juan F.; Green, David L.; ...

    2018-05-22

    Here, the Proto-MPEX helicon source has been operating in a high electron density 'helicon-mode'. Establishing plasma densities and magnetic field strengths under the antenna that allow for the formation of normal modes of the fast wave is believed to be responsible for the 'helicon-mode'. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that, in the regions of operation in which core power deposition is maximum, the slow wave does not deposit significant power except directly under the antenna. In a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.

  7. Normal fault earthquakes or graviquakes

    PubMed Central

    Doglioni, C.; Carminati, E.; Petricca, P.; Riguzzi, F.

    2015-01-01

    Earthquakes dissipate energy through elastic waves. Canonically, this elastic energy is accumulated during the interseismic period. In crustal extensional settings, however, gravity is the main energy source for the collapse of the hanging wall along the fault. The available gravitational potential energy is about 100 times larger than the energy implied by the observed magnitude, far more than enough to explain the earthquake. Normal faults therefore have a different mechanism of energy accumulation and dissipation (graviquakes) than other tectonic settings (strike-slip and contractional), where elastic energy allows motion even against gravity. The bigger the involved volume, the larger the magnitude. The steeper the normal fault, the larger the vertical displacement and the larger the seismic energy released. Normal faults activate preferentially at about 60°, but they can be shallower in low-friction rocks. In rocks with low static friction, the fault may partly creep, dissipating gravitational energy without releasing a great amount of seismic energy. The maximum volume involved in graviquakes is smaller than in other tectonic settings, the activated fault being at most about three times the hypocentre depth; this explains their higher b-value and the lower magnitude of the largest recorded events. Having a different phenomenology, graviquakes show peculiar precursors. PMID:26169163

  8. On the efficacy of procedures to normalize Ex-Gaussian distributions

    PubMed Central

    Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío

    2015-01-01

    Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed; it is widely acknowledged that this normality assumption is not met for RT data. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier-elimination and transformation procedures were tested against the level of normality they provide. The results suggest that the transformation methods are better than elimination methods at normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are. Specifically, transformation with parameter lambda = -1 leads to the best results. PMID:25709588

  9. On the efficacy of procedures to normalize Ex-Gaussian distributions.

    PubMed

    Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío

    2014-01-01

    Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed; it is widely acknowledged that this normality assumption is not met for RT data. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier-elimination and transformation procedures were tested against the level of normality they provide. The results suggest that the transformation methods are better than elimination methods at normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are. Specifically, transformation with parameter lambda = -1 leads to the best results.
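
    The λ = -1 transformation singled out above is, up to sign and shift, the reciprocal transform of the Box-Cox family. A hedged sketch of its effect on simulated Ex-Gaussian reaction times (the μ, σ, τ values are invented for illustration):

```python
import random
import statistics

def skewness(xs):
    """Population skewness: mean cubed z-score."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

random.seed(0)
# Ex-Gaussian RTs: Gaussian component (mu=300 ms, sigma=20) plus exponential tail (tau=100)
rts = [random.gauss(300, 20) + random.expovariate(1 / 100) for _ in range(5000)]
inv = [1.0 / x for x in rts]  # reciprocal transform, i.e. Box-Cox with lambda = -1

# the raw RTs are strongly right-skewed; the transformed values are far closer to symmetric
```

    Compressing the long right tail is why the reciprocal (speed) transform performs well on positively skewed RT-like data in the simulations described above.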

  10. Reproductive and some peri-natal variables in a mixed breed beef cattle herd.

    PubMed

    Ponzoni, R W; Gifford, D R

    1994-01-12

    Calving success (CS), days to calving (DC), birth weight (BW) and calving ease (CE) were studied in a mixed breed (Hereford, Jersey × Hereford and Simmental × Hereford) beef cattle herd. DC was not normally distributed and a number of transformations failed to normalize it. Repeatabilities were estimated by analysis of variance. Inclusion (or exclusion) of non-calvers and the transformations studied had little effect on the repeatability of DC, which ranged from 0.10 to 0.12. The repeatabilities for CS, BW and CE were 0.08, 0.26 and 0.03, respectively. The residual correlations of CS with DC and functions of DC were high (-0.68 or greater), whereas the correlations among DC and functions of DC were close to one. The correlations of DC with BW and CE varied little with the transformation applied to DC, ranging from 0.26 to 0.28 and 0.10 to 0.12, respectively. The correlation between BW and CE was 0.06. The study points to a number of problems associated with the use of DC as a reproductive variable in beef cattle. It is concluded that although DC is currently a useful field reproductive variable, the search for appropriate female reproductive traits should continue.

  11. Normalized Legal Drafting and the Query Method.

    ERIC Educational Resources Information Center

    Allen, Layman E.; Engholm, C. Rudy

    1978-01-01

    Normalized legal drafting, a mode of expressing ideas in legal documents so that the syntax that relates the constituent propositions is simplified and standardized, and the query method, a question-asking activity that teaches normalized drafting and provides practice, are examined. Some examples are presented. (JMD)

  12. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Fragmentary excerpt from 40 CFR 230.24 (Normal water fluctuations), Subpart C, Potential Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem: "... either attuned to or characterized by these periodic water fluctuations. (b) Possible loss of ..."

  13. Attention and normalization circuits in macaque V1.

    PubMed

    Sanayei, M; Herrero, J L; Distler, C; Thiele, A

    2015-04-01

    Attention affects neuronal processing and improves behavioural performance. In extrastriate visual cortex these effects have been explained by normalization models, which assume that attention influences the circuit that mediates surround suppression. While normalization models have been able to explain attentional effects, their validity has rarely been tested against alternative models. Here we investigate how attention and surround/mask stimuli affect neuronal firing rates and orientation tuning in macaque V1. Surround/mask stimuli provide an estimate of the extent to which V1 neurons are affected by normalization, which was compared against the effects of top-down spatial attention. For some attention/surround effect comparisons, the strength of attentional modulation was correlated with the strength of surround modulation, suggesting that attention and surround/mask stimulation (i.e. normalization) might use a common mechanism. To explore this in detail, we fitted multiplicative and additive models of attention to our data. In one class of models, attention contributed to normalization mechanisms, whereas in a different class of models it did not. Model selection based on Akaike's and Bayesian information criteria demonstrated that in most cells the effects of attention were best described by models in which attention did not contribute to normalization mechanisms. This demonstrates that attentional influences on neuronal responses in primary visual cortex often bypass normalization mechanisms. © 2015 The Authors. European Journal of Neuroscience published by Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  14. Compressed normalized block difference for object tracking

    NASA Astrophysics Data System (ADS)

    Gao, Yun; Zhang, Dengzhuo; Cai, Donglan; Zhou, Hao; Lan, Ge

    2018-04-01

    Feature extraction is very important for robust, real-time tracking, and compressive sensing provides technical support for real-time feature extraction. However, existing compressive trackers are based on the compressed Haar-like feature, and how to compress other, richer high-dimensional features is worth researching. In this paper, a novel compressed normalized block difference (CNBD) feature is proposed. To resist noise more effectively than the high-dimensional normalized pixel difference (NPD) feature, the normalized block difference feature extends the two pixels in the original NPD formula to two blocks. A CNBD feature is obtained by compressing a normalized block difference feature based on compressive sensing theory, with a sparse random Gaussian matrix as the measurement matrix. Comparative experiments with 7 trackers on 20 challenging sequences showed that the tracker based on the CNBD feature performs better than the other trackers, especially the FCT tracker based on the compressed Haar-like feature, in terms of AUC, SR and precision.
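
    The underlying NPD feature is f(x, y) = (x - y)/(x + y); the block variant replaces the two pixels with block means, and compression multiplies the high-dimensional feature vector by a random Gaussian measurement matrix. A toy sketch (block positions and dimensions are invented; the paper's measurement matrix is sparse, a dense one is used here for brevity):

```python
import numpy as np

def npd(a, b, eps=1e-8):
    """Normalized pixel (or block-mean) difference: (a - b) / (a + b)."""
    return (a - b) / (a + b + eps)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(32, 32)).astype(float)

# normalized block difference between the mean intensities of two 4x4 blocks
feature = npd(img[0:4, 0:4].mean(), img[8:12, 8:12].mean())

# compress a high-dimensional feature vector with a random Gaussian measurement matrix
hi_dim = rng.normal(size=1000)                  # stand-in for the full NBD feature vector
M = rng.normal(size=(50, 1000)) / np.sqrt(50)   # dense Gaussian; the paper uses a sparse one
low_dim = M @ hi_dim
```

    For non-negative intensities the feature is bounded in [-1, 1], which is what makes it robust to illumination scaling; the random projection preserves pairwise distances with high probability while cutting the dimension by 20x.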

  15. Auf dem Weg zur digitalen Fakultät - moderne IT Infrastruktur am Beispiel des Physik-Departments der TU München

    NASA Astrophysics Data System (ADS)

    Homolka, Josef

    The day-to-day business of a university is characterized by increasing digitalization and use of electronic media. The introduction of ever more powerful central IT systems leads to a complex variety of heterogeneous user and administration interfaces. Creating a comprehensive, user-friendly and seamless IT infrastructure requires the participation of all organizational units and levels. At the Physik-Department of the Technische Universität München, existing services were further developed and new offerings built up by integrating local resources with the central resources developed and provided within the IntegraTUM project. The system, comprising the components network, workstations, server infrastructure, e-mail service, WWW service, data storage and software, was optimized for the user groups of students and staff with regard to user-friendliness and seamless access.

  16. Normal Databases for the Relative Quantification of Myocardial Perfusion

    PubMed Central

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.

    2016-01-01

    Purpose of review Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated quantitative perfusion measures that are used. Recent findings New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of a normal MPI scan may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354

  17. Tumor vessel normalization after aerobic exercise enhances chemotherapeutic efficacy.

    PubMed

    Schadler, Keri L; Thomas, Nicholas J; Galie, Peter A; Bhang, Dong Ha; Roby, Kerry C; Addai, Prince; Till, Jacob E; Sturgeon, Kathleen; Zaslavsky, Alexander; Chen, Christopher S; Ryeom, Sandra

    2016-10-04

    Targeted therapies aimed at tumor vasculature are utilized in combination with chemotherapy to improve drug delivery and efficacy after tumor vascular normalization. Tumor vessels are highly disorganized with disrupted blood flow impeding drug delivery to cancer cells. Although pharmacologic anti-angiogenic therapy can remodel and normalize tumor vessels, there is a limited window of efficacy and these drugs are associated with severe side effects necessitating alternatives for vascular normalization. Recently, moderate aerobic exercise has been shown to induce vascular normalization in mouse models. Here, we provide a mechanistic explanation for the tumor vascular normalization induced by exercise. Shear stress, the mechanical stimuli exerted on endothelial cells by blood flow, modulates vascular integrity. Increasing vascular shear stress through aerobic exercise can alter and remodel blood vessels in normal tissues. Our data in mouse models indicate that activation of calcineurin-NFAT-TSP1 signaling in endothelial cells plays a critical role in exercise-induced shear stress mediated tumor vessel remodeling. We show that moderate aerobic exercise with chemotherapy caused a significantly greater decrease in tumor growth than chemotherapy alone through improved chemotherapy delivery after tumor vascular normalization. Our work suggests that the vascular normalizing effects of aerobic exercise can be an effective chemotherapy adjuvant.

  18. A Review of Depth and Normal Fusion Algorithms

    PubMed Central

    Štolc, Svorad; Pock, Thomas

    2018-01-01

    Geometric surface information such as depth maps and surface normals can be acquired by various methods such as stereo light fields, shape from shading and photometric stereo techniques. We compare several algorithms which deal with the combination of depth with surface normal information in order to reconstruct a refined depth map. The reasons for performance differences are examined from the perspective of alternative formulations of surface normals for depth reconstruction. We review and analyze methods in a systematic way. Based on our findings, we introduce a new generalized fusion method, which is formulated as a least squares problem and outperforms previous methods in the depth error domain by introducing a novel normal weighting that performs closer to the geodesic distance measure. Furthermore, a novel method is introduced based on Total Generalized Variation (TGV) which further outperforms previous approaches in terms of the geodesic normal distance error and maintains comparable quality in the depth error domain. PMID:29389903

  19. Systemic sclerosis with normal or nonspecific nailfold capillaroscopy.

    PubMed

    Fichel, Fanny; Baudot, Nathalie; Gaitz, Jean-Pierre; Trad, Salim; Barbe, Coralie; Francès, Camille; Senet, Patricia

    2014-01-01

    In systemic sclerosis (SSc), a specific nailfold videocapillaroscopy (NVC) pattern is observed in 90% of cases and seems to be associated with severity and progression of the disease. To describe the characteristics of SSc patients with normal or nonspecific (normal/nonspecific) NVC. In a retrospective cohort study, clinical features and visceral involvements of 25 SSc cases with normal/nonspecific NVC were compared to 63 SSc controls with the SSc-specific NVC pattern. Normal/nonspecific NVC versus SSc-specific NVC pattern was significantly associated with absence of skin sclerosis (32 vs. 6.3%, p = 0.004), absence of telangiectasia (47.8 vs. 17.3%, p = 0.006) and absence of sclerodactyly (60 vs. 25.4%, p = 0.002), and less frequent severe pulmonary involvement (26.3 vs. 58.2%, p = 0.017). Normal/nonspecific NVC in SSc patients appears to be associated with less severe skin involvement and less frequent severe pulmonary involvement. © 2014 S. Karger AG, Basel.

  20. Non-Normality and Testing that a Correlation Equals Zero

    ERIC Educational Resources Information Center

    Levy, Kenneth J.

    1977-01-01

    The importance of the assumption of normality for testing that a bivariate normal correlation equals zero is examined. Both empirical and theoretical evidence suggest that such tests are robust with respect to violation of the normality assumption. (Author/JKS)

  1. Deformation associated with continental normal faults

    NASA Astrophysics Data System (ADS)

    Resor, Phillip G.

    Deformation associated with normal fault earthquakes and geologic structures provides insights into the seismic cycle as it unfolds over time scales from seconds to millions of years. Improved understanding of normal faulting will lead to more accurate seismic hazard assessments and prediction of associated structures. High-precision aftershock locations for the 1995 Kozani-Grevena earthquake (Mw 6.5), Greece, image a segmented master fault and antithetic faults. This three-dimensional fault geometry is typical of normal fault systems mapped from outcrop or interpreted from reflection seismic data and illustrates the importance of incorporating three-dimensional fault geometry in mechanical models. Subsurface fault slip associated with the Kozani-Grevena and 1999 Hector Mine (Mw 7.1) earthquakes is modeled using a new method for slip inversion on three-dimensional fault surfaces. Incorporation of three-dimensional fault geometry improves the fit to the geodetic data while honoring aftershock distributions and surface ruptures. GPS surveying of deformed bedding surfaces associated with normal faulting in the western Grand Canyon reveals patterns of deformation similar to those observed by interferometric synthetic aperture radar (InSAR) for the Kozani-Grevena earthquake, with a prominent down-warp in the hanging wall and a lesser up-warp in the footwall. However, deformation associated with the Kozani-Grevena earthquake extends ~20 km from the fault surface trace, while the folds in the western Grand Canyon only extend 500 m into the footwall and 1500 m into the hanging wall. A comparison of mechanical and kinematic models illustrates advantages of mechanical models in exploring normal faulting processes, including incorporation of both deformation and causative forces, and the opportunity to incorporate more complex fault geometry and constitutive properties. Elastic models with antithetic or synthetic faults or joints in association with a master

  2. Zeit im Wandel der Zeit.

    NASA Astrophysics Data System (ADS)

    Aichelburg, P. C.

    Contents: Einleitung (P. C. Aichelburg). 1. Über Zeit, Bewegung und Veränderung (Aristoteles). 2. Ewigkeit und Zeit (Plotin). 3. Was ist die Zeit? (Augustinus). 4. Von der Zeit (Immanuel Kant). 5. Newtons Ansichten über Zeit, Raum und Bewegung (Ernst Mach). 6. Über die mechanische Erklärung irreversibler Vorgänge (Ludwig Boltzmann). 7. Das Maß der Zeit (Henri Poincaré). 8. Dauer und Intuition (Henri Bergson). 9. Die Geschichte des Unendlichkeitsproblems (Bertrand Russell). 10. Raum und Zeit (Hermann Minkowski). 11. Der Unterschied von Zeit und Raum (Hans Reichenbach). 12. Newtonscher und Bergsonscher Zeitbegriff (Norbert Wiener). 13. Die Bildung des Zeitbegriffs beim Kinde (Jean Piaget). 14. Eine Bemerkung über die Beziehungen zwischen Relativitätstheorie und der idealistischen Philosophie (Kurt Gödel). 15. Der zweite Hauptsatz und der Unterschied von Vergangenheit und Zukunft (Carl Friedrich v. Weizsäcker). 16. Zeit als physikalischer Begriff (Friedrich Hund). 17. Zeitmessung und Zeitbegriff in der Astronomie (Otto Heckmann). 18. Kann die Zeit rückwärts gehen? (Martin Gardner). 19. Zeit und Zeiten (Ilya Prigogine, Isabelle Stengers). 20. Zeit als dynamische Größe in der Relativitätstheorie (P. C. Aichelburg).

  3. Normalization as a canonical neural computation

    PubMed Central

    Carandini, Matteo; Heeger, David J.

    2012-01-01

    There is increasing evidence that the brain relies on a set of canonical neural computations, repeating them across brain regions and modalities to apply similar operations to different problems. A promising candidate for such a computation is normalization, in which the responses of neurons are divided by a common factor that typically includes the summed activity of a pool of neurons. Normalization was developed to explain responses in the primary visual cortex and is now thought to operate throughout the visual system, and in many other sensory modalities and brain regions. Normalization may underlie operations such as the representation of odours, the modulatory effects of visual attention, the encoding of value and the integration of multisensory information. Its presence in such a diversity of neural systems in multiple species, from invertebrates to mammals, suggests that it serves as a canonical neural computation. PMID:22108672
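
    The normalization computation described above has a standard general form: each neuron's driving input is raised to a power and divided by a semi-saturation constant plus the summed activity of the normalization pool. A minimal sketch with illustrative parameter values (the exponents and constants vary by system):

```python
import numpy as np

def divisive_normalization(drives, sigma=1.0, n=2.0, gamma=1.0):
    """Canonical divisive normalization: each response is the driving input
    raised to a power n, divided by a semi-saturation constant sigma plus
    the summed (pooled) activity of all neurons."""
    drives = np.asarray(drives, dtype=float)
    pooled = np.sum(drives ** n)
    return gamma * drives ** n / (sigma ** n + pooled)

# Three neurons with increasing drive; responses are rescaled by the pool.
responses = divisive_normalization([1.0, 2.0, 3.0])
```

Because every response shares the same pooled denominator, stronger inputs suppress the relative responses of weaker ones, which is the mechanism invoked for attention, multisensory integration, and the other operations listed above.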

  4. Resonance Raman of BCC and normal skin

    NASA Astrophysics Data System (ADS)

    Liu, Cheng-hui; Sriramoju, Vidyasagar; Boydston-White, Susie; Wu, Binlin; Zhang, Chunyuan; Pei, Zhe; Sordillo, Laura; Beckman, Hugh; Alfano, Robert R.

    2017-02-01

    The resonance Raman (RR) spectra of basal cell carcinoma (BCC) and normal human skin tissues were analyzed using 532 nm laser excitation. Differences in the RR vibrational fingerprints distinguished normal from cancerous skin tissue. Diagnostic criteria for BCC tissue were established from native RR biomarkers and changes in their peak intensities. Diagnostic algorithms for classifying BCC versus normal tissue were generated using a support vector machine (SVM) classifier and principal component analysis (PCA). Applied to the RR spectral data collected from skin tissues, these statistical methods yielded a diagnostic sensitivity of 98.7% and specificity of 79% compared with pathological reports.
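
    A PCA-plus-SVM classification pipeline of the kind described can be sketched with scikit-learn on synthetic stand-in spectra; the real spectral data, preprocessing, and validation scheme are not reproduced here, and the band position and class separation below are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for RR spectra: the "BCC" class has an elevated
# intensity at one hypothetical Raman band (channel 50).
rng = np.random.default_rng(1)
n_per_class, n_channels = 40, 100
normal = rng.normal(0.0, 1.0, size=(n_per_class, n_channels))
bcc = rng.normal(0.0, 1.0, size=(n_per_class, n_channels))
bcc[:, 50] += 6.0
X = np.vstack([normal, bcc])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Dimensionality reduction with PCA, then an SVM decision boundary.
clf = make_pipeline(PCA(n_components=5), SVC(kernel="linear"))
clf.fit(X, y)
train_accuracy = clf.score(X, y)
```

On real spectra, sensitivity and specificity would be estimated on held-out data (e.g. cross-validation) rather than training accuracy.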

  5. Bulimia nervosa in overweight and normal-weight women.

    PubMed

    Masheb, Robin; White, Marney A

    2012-02-01

    The aim of the present study was to examine overweight bulimia nervosa (BN) in a community sample of women. Volunteers (n = 1964) completed self-report questionnaires of weight, binge eating, purging, and cognitive features. Participants were classified as overweight (body mass index ≥25) or normal weight (body mass index <25). Rates of BN within the overweight and normal-weight classes did not differ (6.4% vs 7.9%). Of the 131 participants identified as BN, 64% (n = 84) were classified as overweight BN and 36% (n = 47) as normal-weight BN. The overweight BN group had a greater proportion of ethnic minorities and reported significantly less restraint than the normal-weight BN group. Otherwise, the 2 groups reported similarly, even in terms of purging and depression. In summary, rates of BN did not differ between overweight and normal-weight women. Among BN participants, the majority (two thirds) were overweight. Differences in ethnicity and restraint, but little else, were found between overweight and normal-weight BN. Findings from the present study should serve to increase awareness of the weight range and ethnic diversity of BN, and highlight the need to address weight and cultural sensitivity in the identification and treatment of eating disorders. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Normal force and drag force in magnetorheological finishing

    NASA Astrophysics Data System (ADS)

    Miao, Chunlin; Shafrir, Shai N.; Lambropoulos, John C.; Jacobs, Stephen D.

    2009-08-01

    The material removal in magnetorheological finishing (MRF) is known to be controlled by shear stress, τ, which equals drag force, Fd, divided by spot area, As. However, it is unclear how the normal force, Fn, affects material removal in MRF and how the measured ratio of drag force to normal force, Fd/Fn, equivalent to a coefficient of friction, is related to material removal. This work studies, for the first time for MRF, the normal force and the measured ratio Fd/Fn as a function of material mechanical properties. Experimental data were obtained by taking spots on a variety of materials, including optical glasses and hard ceramics, with a spot-taking machine (STM). Drag force and normal force were measured with a dual load cell. Drag force decreases linearly with increasing material hardness. In contrast, normal force increases with hardness for glasses, saturating at high hardness values for ceramics. Volumetric removal rate decreases with normal force across all materials. The measured ratio Fd/Fn shows a strong negative linear correlation with material hardness. Hard materials exhibit a low "coefficient of friction". The volumetric removal rate increases with the measured ratio Fd/Fn, which is also correlated with shear stress, indicating that the measured ratio Fd/Fn is a useful measure of material removal in MRF.

  7. High-Frequency Normal Mode Propagation in Aluminum Cylinders

    USGS Publications Warehouse

    Lee, Myung W.; Waite, William F.

    2009-01-01

    Acoustic measurements made using compressional-wave (P-wave) and shear-wave (S-wave) transducers in aluminum cylinders reveal waveform features with high amplitudes and with velocities that depend on the feature's dominant frequency. In a given waveform, high-frequency features generally arrive earlier than low-frequency features, typical for normal mode propagation. To analyze these waveforms, the elastic equation is solved in a cylindrical coordinate system for the high-frequency case in which the acoustic wavelength is small compared to the cylinder geometry, and the surrounding medium is air. Dispersive P- and S-wave normal mode propagations are predicted to exist, but owing to complex interference patterns inside a cylinder, the phase and group velocities are not smooth functions of frequency. To assess the normal mode group velocities and relative amplitudes, approximate dispersion relations are derived using Bessel functions. The utility of the normal mode theory and approximations from a theoretical and experimental standpoint are demonstrated by showing how the sequence of P- and S-wave normal mode arrivals can vary between samples of different size, and how fundamental normal modes can be mistaken for the faster, but significantly smaller amplitude, P- and S-body waves from which P- and S-wave speeds are calculated.

  8. ["Normal pressure" hydrocephalus].

    PubMed

    Philippon, Jacques

    2005-03-01

    Normal pressure hydrocephalus (NPH) or, more precisely, chronic adult hydrocephalus, is a complex condition. Even if the basic mechanism is found in an impediment to CSF absorption, the underlying pathology is heterogeneous. In secondary NPH, the disruption of normal CSF pathways following meningitis or subarachnoid haemorrhage is responsible for ventricular dilatation. However, in about half of the cases the etiology remains obscure. NPH is more frequently found in elderly people, probably in relation to the increased incidence of cerebrovascular disease. The diagnosis of NPH is based upon a triad of clinical symptoms. The main symptom is gait disturbance, followed by urinary incontinence and varying degrees of cognitive change. The latter two symptoms are not prerequisites for the diagnosis. Radiological ventricular dilatation without cortical sulcal enlargement is a key factor, as is substantial clinical improvement after CSF withdrawal (CSF tap test). Other CSF dynamic studies and various imaging investigations have been proposed to improve diagnostic accuracy, but no simple test can predict the results of CSF drainage. The current treatment is ventriculoperitoneal shunting, ideally using an adjustable valve. Results are directly dependent upon the accuracy of the preoperative diagnosis. Post-surgical complications may be observed in about 10% of cases.

  9. Self-Monitoring of Listening Abilities in Normal-Hearing Children, Normal-Hearing Adults, and Children with Cochlear Implants

    PubMed Central

    Rothpletz, Ann M.; Wightman, Frederic L.; Kistler, Doris J.

    2012-01-01

    Background Self-monitoring has been shown to be an essential skill for various aspects of our lives, including our health, education, and interpersonal relationships. Likewise, the ability to monitor one’s speech reception in noisy environments may be a fundamental skill for communication, particularly for those who are often confronted with challenging listening environments, such as students and children with hearing loss. Purpose The purpose of this project was to determine if normal-hearing children, normal-hearing adults, and children with cochlear implants can monitor their listening ability in noise and recognize when they are not able to perceive spoken messages. Research Design Participants were administered an Objective-Subjective listening task in which their subjective judgments of their ability to understand sentences from the Coordinate Response Measure corpus presented in speech spectrum noise were compared to their objective performance on the same task. Study Sample Participants included 41 normal-hearing children, 35 normal-hearing adults, and 10 children with cochlear implants. Data Collection and Analysis On the Objective-Subjective listening task, the level of the masker noise remained constant at 63 dB SPL, while the level of the target sentences varied over a 12 dB range in a block of trials. Psychometric functions, relating proportion correct (Objective condition) and proportion perceived as intelligible (Subjective condition) to target/masker ratio (T/M), were estimated for each participant. Thresholds were defined as the T/M required to produce 51% correct (Objective condition) and 51% perceived as intelligible (Subjective condition). Discrepancy scores between listeners’ threshold estimates in the Objective and Subjective conditions served as an index of self-monitoring ability. In addition, the normal-hearing children were administered tests of cognitive skills and academic achievement, and results from these measures were compared
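
    Thresholds of the kind described (the T/M ratio producing 51% correct, or 51% perceived as intelligible) are obtained by fitting a psychometric function to the proportion data and inverting the fit. A sketch with synthetic noise-free data and an assumed logistic form (the study's exact fitting procedure is not specified here):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, mu, s):
    """Psychometric function: proportion correct vs. target/masker ratio."""
    return 1.0 / (1.0 + np.exp(-(x - mu) / s))

# Hypothetical data spanning the 12 dB T/M range described in the abstract.
tm_db = np.linspace(-12.0, 0.0, 7)
p_correct = logistic(tm_db, -6.0, 1.5)   # synthetic, noise-free responses

popt, _ = curve_fit(logistic, tm_db, p_correct, p0=(-5.0, 1.0))
mu, s = popt
# Threshold: the T/M giving 51% correct (inverse of the fitted logistic).
threshold_db = mu + s * np.log(0.51 / 0.49)
```

The Objective-Subjective discrepancy score in the study is then the difference between two such thresholds, one from each condition.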

  10. Quantitative computed tomography determined regional lung mechanics in normal nonsmokers, normal smokers and metastatic sarcoma subjects.

    PubMed

    Choi, Jiwoong; Hoffman, Eric A; Lin, Ching-Long; Milhem, Mohammed M; Tessier, Jean; Newell, John D

    2017-01-01

    Extra-thoracic tumors send out pilot cells that attach to the pulmonary endothelium. We hypothesized that this could alter regional lung mechanics (tissue stiffening or accumulation of fluid and inflammatory cells) through interactions with host cells. We explored this with serial inspiratory computed tomography (CT) and image matching to assess regional changes in lung expansion. We retrospectively assessed 44 pairs of serial CT scans from 21 sarcoma patients: 12 without lung metastases and 9 with lung metastases. For each subject, two or more serial inspiratory clinically derived CT scans were retrospectively collected. Two research-derived control groups were included: 7 normal nonsmokers and 12 asymptomatic smokers with two inspiratory scans taken the same day or one year apart, respectively. We performed image registration for local-to-local matching of scans to baseline and derived local expansion and density changes at an acinar scale. Welch's two-sample t-test was used for comparisons between groups. Statistical significance was determined with a p value < 0.05. Lung regions of metastatic sarcoma patients (but not the normal control group) demonstrated an increased proportion of normalized lung expansion between the first and second CT. These hyper-expanded regions were associated with, but not limited to, visible metastatic lung lesions. Compared with the normal control group, the percent of increased normalized hyper-expanded lung in sarcoma subjects was significantly increased (p < 0.05). There was also evidence of increased lung "tissue" volume (non-air components) in the hyper-expanded regions of the cancer subjects relative to non-hyper-expanded regions. "Tissue" volume increase was present in the hyper-expanded regions of metastatic and non-metastatic sarcoma subjects. This putatively could represent regional inflammation related to the presence of tumor pilot cell-host related interactions. This new quantitative CT (QCT) method for linking serial
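
    Local expansion from image registration is commonly quantified as the Jacobian determinant of the deformation, det(I + ∂u/∂x), where u is the displacement field; values above 1 mark hyper-expanded regions. A minimal numpy sketch on a synthetic uniform-expansion field (illustrative only, not the paper's registration pipeline):

```python
import numpy as np

# Synthetic displacement field u(x) for a uniform 5% expansion in all
# directions on an n^3 grid; registration software would supply a real u.
n = 8
x, y, z = np.meshgrid(np.arange(n), np.arange(n), np.arange(n), indexing="ij")
e = 0.05
u = np.stack([e * x, e * y, e * z], axis=0).astype(float)

# Deformation-gradient Jacobian: J[i, j] = du_i / dx_j at each voxel.
grads = [np.gradient(u[i]) for i in range(3)]
J = np.empty((n, n, n, 3, 3))
for i in range(3):
    for j in range(3):
        J[..., i, j] = grads[i][j]

# Local volume change per voxel: det(I + J); here 1.05^3 everywhere.
expansion = np.linalg.det(np.eye(3) + J)
```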

  11. About normal distribution on SO(3) group in texture analysis

    NASA Astrophysics Data System (ADS)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group that are used in texture analysis: the Fisher normal distribution (FND), Bunge normal distribution (BND), central normal distribution (CND) and wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. The CND is a subcase of the normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). The WND is motivated by the CLT in R^3 and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both the CND and the WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.
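
    The CLT-in-R^3 construction behind the WND suggests a simple Monte Carlo scheme: draw a rotation vector from a normal distribution in R^3 and push it onto SO(3) via the matrix exponential (Rodrigues' formula). A sketch under that assumption, not the authors' exact algorithm:

```python
import numpy as np

def rodrigues(omega):
    """Map a rotation vector in R^3 to a rotation matrix in SO(3)."""
    theta = np.linalg.norm(omega)
    if theta < 1e-12:
        return np.eye(3)
    k = omega / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    # Rodrigues' formula: R = I + sin(theta) K + (1 - cos(theta)) K^2.
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * K @ K

def sample_wrapped_normal_so3(sigma, rng):
    """One Monte Carlo draw from a wrapped-normal-style distribution on
    SO(3): a normal rotation vector in R^3 mapped through the exponential."""
    return rodrigues(rng.normal(0.0, sigma, size=3))

rng = np.random.default_rng(0)
R = sample_wrapped_normal_so3(0.2, rng)   # a rotation near the identity
```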

  12. Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.

    PubMed

    Sznitman, Sharon R; Taubman, Danielle S

    2016-09-01

    Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.

  13. Normal Birth: Two Stories

    PubMed Central

    Scaer, Roberta M.

    2002-01-01

    The author shares two stories: one of a normal birth that took place in a hospital with a nurse-midwife in attendance and another of a home birth unexpectedly shared by many colleagues. Both are told with the goal to inform, inspire, and educate. PMID:17273292

  14. Mutual regulation of tumour vessel normalization and immunostimulatory reprogramming.

    PubMed

    Tian, Lin; Goldstein, Amit; Wang, Hai; Ching Lo, Hin; Sun Kim, Ik; Welte, Thomas; Sheng, Kuanwei; Dobrolecki, Lacey E; Zhang, Xiaomei; Putluri, Nagireddy; Phung, Thuy L; Mani, Sendurai A; Stossi, Fabio; Sreekumar, Arun; Mancini, Michael A; Decker, William K; Zong, Chenghang; Lewis, Michael T; Zhang, Xiang H-F

    2017-04-13

    Blockade of angiogenesis can retard tumour growth, but may also paradoxically increase metastasis. This paradox may be resolved by vessel normalization, which involves increased pericyte coverage, improved tumour vessel perfusion, reduced vascular permeability, and consequently mitigated hypoxia. Although these processes alter tumour progression, their regulation is poorly understood. Here we show that type 1 T helper (TH1) cells play a crucial role in vessel normalization. Bioinformatic analyses revealed that gene expression features related to vessel normalization correlate with immunostimulatory pathways, especially T lymphocyte infiltration or activity. To delineate the causal relationship, we used various mouse models with vessel normalization or T lymphocyte deficiencies. Although disruption of vessel normalization reduced T lymphocyte infiltration as expected, reciprocal depletion or inactivation of CD4+ T lymphocytes decreased vessel normalization, indicating a mutually regulatory loop. In addition, activation of CD4+ T lymphocytes by immune checkpoint blockade increased vessel normalization. TH1 cells that secrete interferon-γ are a major population of cells associated with vessel normalization. Patient-derived xenograft tumours growing in immunodeficient mice exhibited enhanced hypoxia compared to the original tumours in immunocompetent humans, and hypoxia was reduced by adoptive TH1 transfer. Our findings elucidate an unexpected role of TH1 cells in vasculature and immune reprogramming. TH1 cells may be a marker and a determinant of both immune checkpoint blockade and anti-angiogenesis efficacy.

  15. Miocene climate as recorded on slope carbonates : examples from Malta (Central Mediterranean) and Northeastern Australia (Marion Plateau, ODP LEG 194)

    NASA Astrophysics Data System (ADS)

    John, Cédric Michaël

    2003-08-01

    land mass (Malta) and the absence of a barrier to shelter from the effects of open ocean (Marion Plateau). This doctoral thesis examined the slope carbonates of two Miocene heterozoan carbonate systems: the Maltese Islands (central Mediterranean) and the Marion Plateau (northeastern Australia, ODP Leg 194). Its aim was to determine how the middle Miocene cooling event (Mi3), dated to 13.6 Ma and strongly expressed in the oxygen-isotope curve, affected these shallow-water systems. This cooling event also strongly influenced the oceanographic and climatic patterns that eventually led to the modern icehouse climate; in particular, the glaciation of East Antarctica is linked to it. The thesis examines the event's influence on shallow-water systems in order to complement existing studies of deep-water systems and thus contribute to a global understanding of Miocene climate change. The sections on the Maltese Islands were investigated using bulk-rock carbon and oxygen isotope analyses, bulk-rock mineralogy, clay-mineral analysis, and organic geochemistry. The middle Miocene cooling event strongly affected sedimentation in this area through a shift from more carbonate-rich to more clay-rich sediments. Furthermore, every phase of Antarctic glaciation, not only the main middle Miocene event, led to increased terrigenous input into the slope sediments of the Maltese Islands. Accumulation rates show that this increased terrigenous input is tied to the individual glaciation periods and that the carbonate sediments were "contaminated" by clay-rich sediments. The model developed from these observations explains the increased terrigenous input by a northward shift of the Intertropical Convergence Zone through the formation of cold

  16. Phenformin-induced Hypoglycaemia in Normal Subjects*

    PubMed Central

    Lyngsøe, J.; Trap-Jensen, J.

    1969-01-01

    Study of the effect of phenformin on the blood glucose level in normal subjects before and during 70 hours of starvation showed a statistically significant hypoglycaemic effect after 40 hours of starvation. This effect was not due to increased glucose utilization. Another finding in this study was a statistically significant decrease in total urinary nitrogen excretion during starvation in subjects given phenformin. These findings show that the hypoglycaemic effect of phenformin in starved normal subjects is due to inhibition of gluconeogenesis. PMID:5780431

  17. Generalized approach for using unbiased symmetric metrics with negative values: normalized mean bias factor and normalized mean absolute error factor

    EPA Science Inventory

    Unbiased symmetric metrics provide a useful measure to quickly compare two datasets, with similar interpretations for both under and overestimations. Two examples include the normalized mean bias factor and normalized mean absolute error factor. However, the original formulations...
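
    For reference, one common formulation of these two metrics (following Yu et al. 2006, on which the EPA work builds; the generalized variants for negative values that the abstract introduces are not reproduced here):

```python
import numpy as np

def nmbf(model, obs):
    """Normalized mean bias factor: symmetric in interpretation for over-
    and under-prediction (one common formulation, assuming positive data)."""
    m, o = np.mean(model), np.mean(obs)
    return m / o - 1.0 if m >= o else 1.0 - o / m

def nmaef(model, obs):
    """Normalized mean absolute error factor (same convention)."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    s = np.sum(np.abs(model - obs))
    return s / np.sum(obs) if model.mean() >= obs.mean() else s / np.sum(model)

obs = [1.0, 2.0, 3.0]
model = [2.0, 4.0, 6.0]        # doubles every observation
bias_over = nmbf(model, obs)   # overprediction by a factor of 2 -> +1.0
bias_under = nmbf(obs, model)  # the reversed case -> -1.0, same magnitude
err = nmaef(model, obs)
```

The symmetry is the selling point: doubling and halving produce bias factors of equal magnitude and opposite sign, unlike the ordinary normalized mean bias.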

  18. Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?

    ERIC Educational Resources Information Center

    Thompson, Bruce

    Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…

  19. [Primary culture of human normal epithelial cells].

    PubMed

    Tang, Yu; Xu, Wenji; Guo, Wanbei; Xie, Ming; Fang, Huilong; Chen, Chen; Zhou, Jun

    2017-11-28

    The traditional primary culture methods for normal human epithelial cells suffer from low cell viability, low culture success rates, and complicated procedures. To address these problems, researchers have studied the culture process for normal primary human epithelial cells extensively. In this paper, we mainly introduce methods used to separate and purify normal human epithelial cells, such as tissue separation, enzyme digestion, mechanical brushing, red blood cell lysis, and Percoll density-gradient separation. We also review methods used in culture and subculture, including serum-free medium combined with low-serum culture, mouse-tail collagen coating, and glass culture bottles combined with plastic culture dishes. The biological characteristics of normal human epithelial cells and the methods of immunocytochemical staining and trypan blue exclusion are described. Moreover, the factors affecting aseptic operation, the conditions of the extracellular environment during culture, the number of differential adhesion steps, and the selection and dosage of additives are summarized.

  20. Neuropathological and neuropsychological changes in "normal" aging: evidence for preclinical Alzheimer disease in cognitively normal individuals.

    PubMed

    Hulette, C M; Welsh-Bohmer, K A; Murray, M G; Saunders, A M; Mash, D C; McIntyre, L M

    1998-12-01

    The presence of diffuse or primitive senile plaques in the neocortex of cognitively normal elderly at autopsy has been presumed to represent normal aging. Alternatively, these patients may have developed dementia and clinical Alzheimer disease (AD) if they had survived. In this setting, these patients could be subjects for cognitive or pharmacologic intervention to delay disease onset. We have thus followed a cohort of cognitively normal elderly subjects with a Clinical Dementia Rating (CDR) of 0 at autopsy. Thirty-one brains were examined at postmortem according to Consortium to Establish a Registry for Alzheimer Disease (CERAD) criteria and staged according to Braak. Ten patients were pathologically normal according to CERAD criteria (1a). Two of these patients were Braak Stage II. Seven very elderly subjects exhibited a few primitive neuritic plaques in the cortex and thus represented CERAD 1b. These individuals ranged in age from 85 to 105 years and were thus older than the CERAD 1a group that ranged in age from 72 to 93. Fourteen patients displayed Possible AD according to CERAD with ages ranging from 66 to 95. Three of these were Braak Stage I, 4 were Braak Stage II, and 7 were Braak Stage III. The Apolipoprotein E4 allele was over-represented in this possible AD group. Neuropsychological data were available on 12 individuals. In these 12 individuals, Possible AD at autopsy could be predicted by cognitive deficits in 1 or more areas including savings scores on memory testing and overall performance on some measures of frontal executive function.

  1. Laser-induced differential normalized fluorescence method for cancer diagnosis

    DOEpatents

    Vo-Dinh, Tuan; Panjehpour, Masoud; Overholt, Bergein F.

    1996-01-01

    An apparatus and method for cancer diagnosis are disclosed. The diagnostic method includes the steps of irradiating a tissue sample with monochromatic excitation light, producing a laser-induced fluorescence spectrum from emission radiation generated by interaction of the excitation light with the tissue sample, and dividing the intensity at each wavelength of the laser-induced fluorescence spectrum by the integrated area under the laser-induced fluorescence spectrum to produce a normalized spectrum. A mathematical difference between the normalized spectrum and an average value of a reference set of normalized spectra which correspond to normal tissues is calculated, which provides for amplifying small changes in weak signals from malignant tissues for improved analysis. The calculated differential normalized spectrum is correlated to a specific condition of a tissue sample.
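
    The normalization and differencing steps described in the patent can be sketched directly: divide each spectrum by its integrated area, then subtract the average of a set of area-normalized normal-tissue references. A minimal sketch with synthetic Gaussian "spectra" (the band positions and shift below are invented for illustration; the area integral is approximated by a simple sum over samples):

```python
import numpy as np

def differential_normalized_spectrum(spectrum, normal_references):
    """Area-normalize a fluorescence spectrum, then subtract the mean of
    area-normalized normal-tissue reference spectra."""
    def area_normalize(s):
        s = np.asarray(s, dtype=float)
        return s / s.sum()   # integrated area approximated by the sum
    norm = area_normalize(spectrum)
    refs = np.array([area_normalize(r) for r in normal_references])
    return norm - refs.mean(axis=0)

# Synthetic example: a hypothetical malignant spectrum whose emission band
# is shifted 10 nm from the normal-tissue references.
wl = np.linspace(400.0, 700.0, 301)
ref = np.exp(-((wl - 550.0) / 40.0) ** 2)
shifted = np.exp(-((wl - 560.0) / 40.0) ** 2)
diff = differential_normalized_spectrum(shifted, [ref, ref])
```

Because the area normalization removes overall intensity differences, the residual amplifies small shape changes in weak malignant-tissue signals, as the abstract describes.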

  2. Normal Force and Drag Force in Magnetorheological Finishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miao, C.; Shafrir, S.N.; Lambropoulos, J.C.

    2010-01-13

    The material removal in magnetorheological finishing (MRF) is known to be controlled by shear stress, τ, which equals drag force, Fd, divided by spot area, As. However, it is unclear how the normal force, Fn, affects the material removal in MRF and how the measured ratio of drag force to normal force Fd/Fn, equivalent to coefficient of friction, is related to material removal. This work studies, for the first time for MRF, the normal force and the measured ratio Fd/Fn as a function of material mechanical properties. Experimental data were obtained by taking spots on a variety of materials including optical glasses and hard ceramics with a spot-taking machine (STM). Drag force and normal force were measured with a dual load cell. Drag force decreases linearly with increasing material hardness. In contrast, normal force increases with hardness for glasses, saturating at high hardness values for ceramics. Volumetric removal rate decreases with normal force across all materials. The measured ratio Fd/Fn shows a strong negative linear correlation with material hardness. Hard materials exhibit a low "coefficient of friction". The volumetric removal rate increases with the measured ratio Fd/Fn which is also correlated with shear stress, indicating that the measured ratio Fd/Fn is a useful measure of material removal in MRF.

  3. Ultraviolet Spectra of Normal Spiral Galaxies

    NASA Technical Reports Server (NTRS)

    Kinney, Anne

    1997-01-01

    The data related to this grant on the Ultraviolet Spectra of Normal Spiral Galaxies have been entirely reduced and analyzed. They are incorporated into templates of spiral galaxies used in the calculation of K-corrections toward the understanding of high-redshift galaxies. The main paper was published in the Astrophysical Journal, August 1996, Volume 467, page 38. The data were also used in another publication, The Spectral Energy Distribution of Normal Starburst and Active Galaxies, June 1997, preprint series No. 1158. Copies of both have been attached.

  4. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalization and the normal form of finite-dimensional complete analytic integrable dynamical systems. More precisely, we prove that any complete analytic integrable diffeomorphism F(x)=Bx+f(x) in (C^n,0), with B having no eigenvalue of modulus 1 and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. Meanwhile, we also prove that any complete analytic integrable differential system ẋ=Ax+f(x) in (C^n,0), with A having nonzero eigenvalues and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. Furthermore we prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve the ones in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in the way that our paper presents the concrete expression of the normal form in a restricted case.
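
    In symbols, the conjugacy being established can be sketched as follows (notation assumed from standard normal-form theory, not taken from the paper itself):

```latex
\Phi^{-1} \circ F \circ \Phi = N, \qquad
\Phi(x) = x + \phi(x), \quad \phi(x) = O(|x|^{2}),
```

where Φ is the analytic normalizing change of coordinates and the normal form N(x) = Bx + g(x) retains only resonant terms determined by the eigenvalues of B.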

  5. Normal-range verbal-declarative memory in schizophrenia.

    PubMed

    Heinrichs, R Walter; Parlar, Melissa; Pinnock, Farena

    2017-10-01

    Cognitive impairment is prevalent and related to functional outcome in schizophrenia, but a significant minority of the patient population overlaps with healthy controls on many performance measures, including declarative-verbal-memory tasks. In this study, we assessed the validity, clinical, and functional implications of normal-range (NR), verbal-declarative memory in schizophrenia. Performance normality was defined using normative data for 8 basic California Verbal Learning Test (CVLT-II; Delis, Kramer, Kaplan, & Ober, 2000) recall and recognition trials. Schizophrenia patients (n = 155) and healthy control participants (n = 74) were assessed for performance normality, defined as scores within 1 SD of the normative mean on all 8 trials, and assigned to normal- and below-NR memory groups. NR schizophrenia patients (n = 26) and control participants (n = 51) did not differ in general verbal ability, on a reading-based estimate of premorbid ability, across all 8 CVLT-II-score comparisons or in terms of intrusion and false-positive errors and auditory working memory. NR memory patients did not differ from memory-impaired patients (n = 129) in symptom severity, and both patient groups were significantly and similarly disabled in terms of functional status in the community. These results confirm a subpopulation of schizophrenia patients with normal, verbal-declarative-memory performance and no evidence of decline from higher premorbid ability levels. However, NR patients did not experience less severe psychopathology, nor did they show advantage in community adjustment relative to impaired patients. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Normalization of a chromosomal contact map.

    PubMed

    Cournac, Axel; Marie-Nelly, Hervé; Marbouty, Martial; Koszul, Romain; Mozziconacci, Julien

    2012-08-30

    Chromatin organization has been increasingly studied in relation with its important influence on DNA-related metabolic processes such as replication or regulation of gene expression. Since its original design ten years ago, capture of chromosome conformation (3C) has become an essential tool to investigate the overall conformation of chromosomes. It relies on the capture of long-range trans and cis interactions of chromosomal segments whose relative proportions in the final bank reflect their frequencies of interactions, hence their spatial proximity in a population of cells. The recent coupling of 3C with deep sequencing approaches now allows the generation of high resolution genome-wide chromosomal contact maps. Different protocols have been used to generate such maps in various organisms. This includes mammals, drosophila and yeast. The massive amount of raw data generated by the genomic 3C has to be carefully processed to alleviate the various biases and byproducts generated by the experiments. Our study aims at proposing a simple normalization procedure to minimize the influence of these unwanted but inevitable events on the final results. Careful analysis of the raw data generated previously for budding yeast S. cerevisiae led to the identification of three main biases affecting the final datasets, including a previously unknown bias resulting from the circularization of DNA molecules. We then developed a simple normalization procedure to process the data and allow the generation of a normalized, highly contrasted, chromosomal contact map for S. cerevisiae. The same method was then extended to the first human genome contact map. Using the normalized data, we revisited the preferential interactions originally described between subsets of discrete chromosomal features. Notably, the detection of preferential interactions between tRNA in yeast and CTCF, PolII binding sites in human can vary with the normalization procedure used. 

  7. Normal forms for Hopf-Zero singularities with nonconservative nonlinear part

    NASA Astrophysics Data System (ADS)

    Gazor, Majid; Mokhtari, Fahimeh; Sanders, Jan A.

    In this paper we are concerned with the simplest normal form computation of the systems ẋ = 2x f(x, y² + z²), ẏ = z + y f(x, y² + z²), ż = -y + z f(x, y² + z²), where f is a formal function with real coefficients and without any constant term. These are the classical normal forms of a larger family of systems with Hopf-Zero singularity. Indeed, these are defined such that this family would be a Lie subalgebra for the space of all classical normal form vector fields with Hopf-Zero singularity. The simplest normal forms and simplest orbital normal forms of this family with nonzero quadratic part are computed. We also obtain the simplest parametric normal form of any non-degenerate perturbation of this family within the Lie subalgebra. The symmetry group of the simplest normal forms is also discussed. This is a part of our results in decomposing the normal forms of Hopf-Zero singular systems into systems with a first integral and nonconservative systems.

  8. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
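
    As a concrete illustration of correcting for total-sample-amount variation, the sketch below applies total-sum (constant-sum) normalization, one simple strategy of the kind discussed in the review; the function name and sample values are illustrative, not a method the review prescribes.

```python
# Total-sum (constant-sum) normalization: one simple way to correct for
# differences in total sample amount before comparing metabolite levels.
# This is a generic illustration, not a method prescribed by the review.

def total_sum_normalize(intensities):
    """Scale a sample's metabolite intensities so they sum to 1."""
    total = sum(intensities)
    if total == 0:
        raise ValueError("sample has zero total intensity")
    return [x / total for x in intensities]

# Two hypothetical samples of the same biological material, one measured
# at twice the concentration of the other:
sample_a = [10.0, 30.0, 60.0]
sample_b = [20.0, 60.0, 120.0]

norm_a = total_sum_normalize(sample_a)
norm_b = total_sum_normalize(sample_b)
# After normalization the two profiles coincide, so the apparent
# concentration difference between the samples is removed.
```

After this step, remaining differences between samples reflect relative metabolite composition rather than the amount of material injected.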

  9. Role of the normal gut microbiota.

    PubMed

    Jandhyala, Sai Manasa; Talukdar, Rupjyoti; Subramanyam, Chivkula; Vuyyuru, Harish; Sasikala, Mitnala; Nageshwar Reddy, D

    2015-08-07

    The relation between the gut microbiota and human health is increasingly recognised. It is now well established that a healthy gut flora is largely responsible for the overall health of the host. The normal human gut microbiota comprises two major phyla, namely Bacteroidetes and Firmicutes. Though the gut microbiota in an infant appears haphazard, it starts resembling the adult flora by the age of 3 years. Nevertheless, there exist temporal and spatial variations in the microbial distribution from the esophagus to the rectum along the individual's life span. Developments in genome sequencing technologies and bioinformatics have now enabled scientists to study these microorganisms, their functions, and microbe-host interactions in an elaborate manner both in health and disease. The normal gut microbiota imparts specific functions in host nutrient metabolism, xenobiotic and drug metabolism, maintenance of the structural integrity of the gut mucosal barrier, immunomodulation, and protection against pathogens. Several factors play a role in shaping the normal gut microbiota. They include (1) the mode of delivery (vaginal or caesarean); (2) diet during infancy (breast milk or formula feeds) and adulthood (vegan or meat based); and (3) use of antibiotics or antibiotic-like molecules derived from the environment or the gut commensal community. A major concern of antibiotic use is the long-term alteration of the normal healthy gut microbiota and the horizontal transfer of resistance genes, which could result in a reservoir of organisms with a multidrug-resistant gene pool.

  10. Laser-induced differential normalized fluorescence method for cancer diagnosis

    DOEpatents

    Vo-Dinh, T.; Panjehpour, M.; Overholt, B.F.

    1996-12-03

    An apparatus and method for cancer diagnosis are disclosed. The diagnostic method includes the steps of irradiating a tissue sample with monochromatic excitation light, producing a laser-induced fluorescence spectrum from emission radiation generated by interaction of the excitation light with the tissue sample, and dividing the intensity at each wavelength of the laser-induced fluorescence spectrum by the integrated area under the laser-induced fluorescence spectrum to produce a normalized spectrum. A mathematical difference between the normalized spectrum and an average value of a reference set of normalized spectra which correspond to normal tissues is calculated, which provides for amplifying small changes in weak signals from malignant tissues for improved analysis. The calculated differential normalized spectrum is correlated to a specific condition of a tissue sample. 5 figs.
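
    The normalization and differencing steps described above can be sketched as follows; the spectra are illustrative, and unit wavelength spacing is assumed so the integrated area reduces to a sum.

```python
# Sketch of the differential-normalized-fluorescence computation: divide
# each spectral intensity by the integrated area, then subtract the
# average normalized spectrum of a normal-tissue reference set.
# Spectra and the unit bin spacing are illustrative assumptions.

def area_normalize(spectrum):
    """Divide each intensity by the integrated area (unit spacing)."""
    area = sum(spectrum)
    return [v / area for v in spectrum]

def differential_normalized(spectrum, reference_set):
    """Normalized spectrum minus the mean of normalized references."""
    norm = area_normalize(spectrum)
    refs = [area_normalize(r) for r in reference_set]
    avg_ref = [sum(vals) / len(refs) for vals in zip(*refs)]
    # Small shape deviations from normal tissue stand out even when the
    # raw fluorescence signal is weak, since overall intensity cancels.
    return [a - b for a, b in zip(norm, avg_ref)]

sample_spectrum = [1.0, 1.0, 2.0]
normal_refs = [[1.0, 2.0, 1.0], [2.0, 4.0, 2.0]]
diff = differential_normalized(sample_spectrum, normal_refs)
```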

  11. A general approach to double-moment normalization of drop size distributions

    NASA Astrophysics Data System (ADS)

    Lee, G. W.; Sempere-Torres, D.; Uijlenhoet, R.; Zawadzki, I.

    2003-04-01

    Normalization of drop size distributions (DSDs) is re-examined here. First, we present an extension of the scaling normalization that uses one moment of the DSD as a parameter (as introduced by Sempere-Torres et al., 1994) to a scaling normalization that uses two moments as parameters. It is shown that the normalization of Testud et al. (2001) is a particular case of the two-moment scaling normalization. This provides a unified view of the question of DSD normalization and a good model representation of DSDs. Data analysis shows that, from the point of view of moment estimation, least-squares regression is slightly more effective than moment estimation from the normalized average DSD.
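
    A two-moment scaling normalization of this kind can be sketched as follows, assuming the standard form in which a characteristic diameter Dc = (M_j/M_i)^(1/(j-i)) is built from two reference moments M_i and M_j, and the scaled distribution h(x) = N(D) M_i^(-(j+1)/(j-i)) M_j^((i+1)/(j-i)) with x = D/Dc is sample-independent for DSDs obeying the scaling law. Bin values and moment orders below are illustrative.

```python
import math

def moment(diams, nd, dD, k):
    """k-th moment of a binned DSD N(D) with bin width dD."""
    return sum(n * d**k * dD for d, n in zip(diams, nd))

def double_moment_normalize(diams, nd, dD, i=3, j=4):
    """Normalize a binned DSD by its i-th and j-th moments."""
    Mi = moment(diams, nd, dD, i)
    Mj = moment(diams, nd, dD, j)
    Dc = (Mj / Mi) ** (1.0 / (j - i))           # characteristic diameter
    scale = Mi ** (-(j + 1) / (j - i)) * Mj ** ((i + 1) / (j - i))
    x = [d / Dc for d in diams]                 # scaled diameter
    h = [n * scale for n in nd]                 # scaled distribution
    return x, h

# Illustrative exponential DSD N(D) = N0 * exp(-Lambda * D):
diams = [0.05 + 0.1 * k for k in range(80)]     # bin centers [mm]
nd = [8000.0 * math.exp(-2.0 * d) for d in diams]
x, h = double_moment_normalize(diams, nd, 0.1)
```

    By construction, the i-th and j-th moments of h(x) in the scaled variable are both exactly 1, which is what makes h a sample-independent shape function.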

  12. Toward the optimization of normalized graph Laplacian.

    PubMed

    Xie, Bo; Wang, Meng; Tao, Dacheng

    2011-04-01

    Normalized graph Laplacian has been widely used in many practical machine learning algorithms, e.g., spectral clustering and semisupervised learning. However, all of them use the Euclidean distance to construct the graph Laplacian, which does not necessarily reflect the inherent distribution of the data. In this brief, we propose a method to directly optimize the normalized graph Laplacian by using pairwise constraints. The learned graph is consistent with equivalence and nonequivalence pairwise relationships, and thus it can better represent similarity between samples. Meanwhile, our approach, unlike metric learning, automatically determines the scale factor during the optimization. The learned normalized Laplacian matrix can be directly applied in spectral clustering and semisupervised learning algorithms. Comprehensive experiments demonstrate the effectiveness of the proposed approach.
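
    The normalized graph Laplacian that these algorithms share, L = I - D^(-1/2) W D^(-1/2), can be constructed as below; the pairwise-constraint optimization that is the brief's contribution is not reproduced, and the similarity matrix is hypothetical.

```python
# Symmetric normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2},
# as used by spectral clustering and semi-supervised learning.
# The brief learns W from pairwise constraints; here W is simply given.

def normalized_laplacian(W):
    n = len(W)
    deg = [sum(row) for row in W]                      # node degrees
    d_inv_sqrt = [d ** -0.5 if d > 0 else 0.0 for d in deg]
    return [[(1.0 if i == j else 0.0)
             - d_inv_sqrt[i] * W[i][j] * d_inv_sqrt[j]
             for j in range(n)] for i in range(n)]

# Hypothetical 3-node similarity graph (symmetric, zero diagonal):
W = [[0.0, 1.0, 1.0],
     [1.0, 0.0, 0.0],
     [1.0, 0.0, 0.0]]
L = normalized_laplacian(W)
```

    The eigenvectors of L (or equivalently of the normalized affinity D^(-1/2) W D^(-1/2)) are then fed to k-means in standard spectral clustering.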

  13. The Effect of Normalization in Violence Video Classification Performance

    NASA Astrophysics Data System (ADS)

    Ali, Ashikin; Senan, Norhalina

    2017-08-01

    Data pre-processing is an important part of data mining, and normalization is a pre-processing stage for many problem statements, especially video classification. Video classification is challenging because of heterogeneous content, large variations in video quality, and the complex semantic meanings of the concepts involved. A thorough pre-processing stage that includes normalization therefore aids the robustness of classification performance. Normalization scales all numeric variables into a certain range to make them more meaningful for later phases of the data-mining pipeline. This paper examines the effect of two normalization techniques, min-max normalization and z-score, on the classification rate of violence video classification using a multi-layer perceptron (MLP) classifier. With min-max normalization to the range [0, 1] the result shows almost 98% accuracy, while with min-max normalization to [-1, 1] accuracy is 59%, and with z-score accuracy is 50%.
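
    The two techniques compared can be sketched for a single numeric feature vector as follows; the values are illustrative, not the paper's video features.

```python
# Min-max normalization (to an arbitrary target range) and z-score
# standardization, the two pre-processing schemes compared in the study.

def min_max(xs, lo=0.0, hi=1.0):
    """Scale values linearly into the range [lo, hi]."""
    x_min, x_max = min(xs), max(xs)
    span = x_max - x_min
    return [lo + (hi - lo) * (x - x_min) / span for x in xs]

def z_score(xs):
    """Center to zero mean and scale to unit (population) std dev."""
    n = len(xs)
    mean = sum(xs) / n
    std = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return [(x - mean) / std for x in xs]

feature = [2.0, 4.0, 6.0, 10.0]          # illustrative feature values
scaled_01 = min_max(feature)             # range [0, 1]
scaled_11 = min_max(feature, -1.0, 1.0)  # range [-1, 1]
standardized = z_score(feature)          # zero mean, unit variance
```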

  14. Mitochondrial dysfunction in myocardium obtained from clinically normal dogs, clinically normal anesthetized dogs, and dogs with dilated cardiomyopathy.

    PubMed

    Sleeper, Meg M; Rosato, Bradley P; Bansal, Seema; Avadhani, Narayan G

    2012-11-01

    To compare mitochondrial complex I and complex IV activity in myocardial mitochondria of clinically normal dogs, clinically normal dogs exposed to inhalation anesthesia, and dogs affected with dilated cardiomyopathy. Myocardial samples obtained from 21 euthanized dogs (6 clinically normal [control] dogs, 5 clinically normal dogs subjected to inhalation anesthesia with isoflurane prior to euthanasia, 5 dogs with juvenile-onset dilated cardiomyopathy, and 5 dogs with adult-onset dilated cardiomyopathy). Activity of mitochondrial complex I and complex IV was assayed spectrophotometrically in isolated mitochondria from left ventricular tissue obtained from the 4 groups of dogs. Activity of complex I and complex IV was significantly decreased in anesthetized dogs, compared with activities in the control dogs and dogs with juvenile-onset or adult-onset dilated cardiomyopathy. Inhalation anesthesia disrupted the electron transport chain in the dogs, which potentially led to an outburst of reactive oxygen species that caused mitochondrial dysfunction. Inhalation anesthesia depressed mitochondrial function in dogs, similar to results reported in other species. This effect is important to consider when anesthetizing animals with myocardial disease and suggested that antioxidant treatments may be beneficial in some animals. Additionally, this effect should be considered when designing studies in which mitochondrial enzyme activity will be measured. Additional studies that include a larger number of animals are warranted.

  15. A systematic evaluation of normalization methods in quantitative label-free proteomics.

    PubMed

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2018-01-01

    To date, mass spectrometry (MS) data remain inherently biased as a result of reasons ranging from sample handling to differences caused by the instrumentation. Normalization is the process that aims to account for the bias and make samples more comparable. The selection of a proper normalization method is a pivotal task for the reliability of the downstream analysis and results. Many normalization methods commonly used in proteomics have been adapted from the DNA microarray techniques. Previous studies comparing normalization methods in proteomics have focused mainly on intragroup variation. In this study, several popular and widely used normalization methods representing different strategies in normalization are evaluated using three spike-in and one experimental mouse label-free proteomic data sets. The normalization methods are evaluated in terms of their ability to reduce variation between technical replicates, their effect on differential expression analysis and their effect on the estimation of logarithmic fold changes. Additionally, we examined whether normalizing the whole data globally or in segments for the differential expression analysis has an effect on the performance of the normalization methods. We found that variance stabilization normalization (Vsn) reduced variation the most between technical replicates in all examined data sets. Vsn also performed consistently well in the differential expression analysis. Linear regression normalization and local regression normalization performed also systematically well. Finally, we discuss the choice of a normalization method and some qualities of a suitable normalization method in the light of the results of our evaluation. © The Author 2016. Published by Oxford University Press.
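
    The best-performing methods in the evaluation (Vsn and the regression-based normalizations) involve model fitting and are not reproduced here; as a simpler baseline that illustrates what global normalization does in this setting, the sketch below equalizes sample medians, a global-scaling strategy also common in label-free proteomics. Function names and intensity values are illustrative.

```python
# Median normalization: scale every sample (MS run) so that all sample
# medians match the global median. A baseline illustration only -- not
# one of the top methods (Vsn, regression) evaluated in the study.
import statistics

def median_normalize(samples):
    """samples: list of runs, each a list of protein intensities."""
    meds = [statistics.median(s) for s in samples]
    target = statistics.median(meds)        # global reference median
    return [[x * target / m for x in s] for s, m in zip(samples, meds)]

# Three hypothetical technical replicates differing only in loading:
runs = [[10.0, 20.0, 30.0], [20.0, 40.0, 60.0], [5.0, 10.0, 15.0]]
normalized = median_normalize(runs)
# The purely technical 2x / 0.5x scaling between replicates is removed.
```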

  16. Verbal Processing Reaction Times in "Normal" and "Poor" Readers.

    ERIC Educational Resources Information Center

    Culbertson, Jack; And Others

    After it had been determined that reaction time (RT) was a sensitive measure of hemispheric dominance in a verbal task performed by normal adult readers, the reaction times of three groups of subjects (20 normal reading college students, 12 normal reading third graders and 11 poor reading grade school students) were compared. Ss were exposed to…

  17. 7 CFR 1794.23 - Proposals normally requiring an EA.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 12 2013-01-01 2013-01-01 false Proposals normally requiring an EA. 1794.23 Section... § 1794.23 Proposals normally requiring an EA. RUS will normally prepare an EA for all proposed actions... require an EA and shall be subject to the requirements of §§ 1794.40 through 1794.44. (a) General...

  18. 7 CFR 1794.23 - Proposals normally requiring an EA.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 12 2014-01-01 2013-01-01 true Proposals normally requiring an EA. 1794.23 Section... § 1794.23 Proposals normally requiring an EA. RUS will normally prepare an EA for all proposed actions... require an EA and shall be subject to the requirements of §§ 1794.40 through 1794.44. (a) General...

  19. 7 CFR 1794.23 - Proposals normally requiring an EA.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 12 2012-01-01 2012-01-01 false Proposals normally requiring an EA. 1794.23 Section... § 1794.23 Proposals normally requiring an EA. RUS will normally prepare an EA for all proposed actions... require an EA and shall be subject to the requirements of §§ 1794.40 through 1794.44. (a) General...

  20. Volume-preserving normal forms of Hopf-zero singularity

    NASA Astrophysics Data System (ADS)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-10-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularity. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is whence our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals are derived. The symmetry group of the infinite level normal forms are also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto-Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple.

  1. Towards Automatic Threat Recognition

    DTIC Science & Technology

    2006-12-01

    Forschungsinstitut für Kommunikation, Informationsverarbeitung und Ergonomie (FGAN), Informationstechnik und Führungssysteme (KIE). Contents: Preliminaries about Information Fusion; The System Ontology; Unification as Processing Principle; Back to the Example; Conclusion and Outlook.

  2. An Integrated Approach for RNA-seq Data Normalization.

    PubMed

    Yang, Shengping; Mercante, Donald E; Zhang, Kun; Fang, Zhide

    2016-01-01

    DNA copy number alteration is common in many cancers. Studies have shown that insertion or deletion of DNA sequences can directly alter gene expression, and significant correlation exists between DNA copy number and gene expression. Data normalization is a critical step in the analysis of gene expression generated by RNA-seq technology. Successful normalization reduces/removes unwanted nonbiological variations in the data, while keeping meaningful information intact. However, as far as we know, no attempt has been made to adjust for the variation due to DNA copy number changes in RNA-seq data normalization. In this article, we propose an integrated approach for RNA-seq data normalization. Comparisons show that the proposed normalization can improve power for downstream differentially expressed gene detection and generate more biologically meaningful results in gene profiling. In addition, our findings show that due to the effects of copy number changes, some housekeeping genes are not always suitable internal controls for studying gene expression. Using information from DNA copy number, integrated approach is successful in reducing noises due to both biological and nonbiological causes in RNA-seq data, thus increasing the accuracy of gene profiling.

  3. Pattern Adaptation and Normalization Reweighting.

    PubMed

    Westrick, Zachary M; Heeger, David J; Landy, Michael S

    2016-09-21

    Adaptation to an oriented stimulus changes both the gain and preferred orientation of neural responses in V1. Neurons tuned near the adapted orientation are suppressed, and their preferred orientations shift away from the adapter. We propose a model in which weights of divisive normalization are dynamically adjusted to homeostatically maintain response products between pairs of neurons. We demonstrate that this adjustment can be performed by a very simple learning rule. Simulations of this model closely match existing data from visual adaptation experiments. We consider several alternative models, including variants based on homeostatic maintenance of response correlations or covariance, as well as feedforward gain-control models with multiple layers, and we demonstrate that homeostatic maintenance of response products provides the best account of the physiological data. Adaptation is a phenomenon throughout the nervous system in which neural tuning properties change in response to changes in environmental statistics. We developed a model of adaptation that combines normalization (in which a neuron's gain is reduced by the summed responses of its neighbors) and Hebbian learning (in which synaptic strength, in this case divisive normalization, is increased by correlated firing). The model is shown to account for several properties of adaptation in primary visual cortex in response to changes in the statistics of contour orientation. Copyright © 2016 the authors 0270-6474/16/369805-12$15.00/0.
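
    A toy sketch of the proposed mechanism, under illustrative parameters and a simple clamped update rule (not the authors' exact implementation): responses follow divisive normalization, and the normalization weights are nudged up or down as pairwise response products exceed or fall below homeostatic targets, so a strongly driven ("adapted") unit accumulates suppression.

```python
# Divisive normalization with Hebbian-style homeostatic reweighting.
# All parameters (sigma2, eta, targets, drives) are illustrative.

def responses(drive, W, sigma2=1.0):
    """Each unit's drive is divided by a weighted normalization pool."""
    pools = [sigma2 + sum(w * d for w, d in zip(row, drive)) for row in W]
    return [d / p for d, p in zip(drive, pools)]

def update_weights(W, r, targets, eta=0.5):
    """Raise w[i][j] when r_i * r_j exceeds its target, else lower it."""
    n = len(W)
    return [[max(0.0, W[i][j] + eta * (r[i] * r[j] - targets[i][j]))
             for j in range(n)] for i in range(n)]

# Unit 0 is 'adapted' by a stronger drive; its response products grow,
# its normalization weights are raised, and its responses are suppressed:
W = [[1.0, 1.0], [1.0, 1.0]]
drive = [4.0, 1.0]
targets = [[0.1, 0.1], [0.1, 0.1]]
r0 = responses(drive, W)
for _ in range(20):
    r = responses(drive, W)
    W = update_weights(W, r, targets)
r_adapted = responses(drive, W)
# r_adapted[0] ends up below its pre-adaptation value r0[0].
```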

  4. Fault stability under conditions of variable normal stress

    USGS Publications Warehouse

    Dieterich, J.H.; Linker, M.F.

    1992-01-01

    The stability of fault slip under conditions of varying normal stress is modelled as a spring-and-slider system with rate- and state-dependent friction. Coupling of normal stress to shear stress is achieved by inclining the spring at an angle, α, to the sliding surface. Linear analysis yields two conditions for unstable slip. The first, of a type previously identified for constant-normal-stress systems, results in instability if stiffness is below a critical value. The critical stiffness depends on normal stress, constitutive parameters, characteristic sliding distance and the spring angle. Instability of the first type is possible only for velocity-weakening friction. The second condition yields instability if the spring angle α < -cot⁻¹ μss, where μss is the steady-state sliding friction. The second condition can arise under conditions of velocity strengthening or weakening. Stability fields for finite perturbations are investigated by numerical simulation. -Authors

  5. Bacterial microflora of normal and telangiectatic livers in cattle.

    PubMed

    Stotland, E I; Edwards, J F; Roussel, A J; Simpson, R B

    2001-07-01

    To identify potential bacterial pathogens in normal and telangiectatic livers of mature cattle at slaughter and to identify consumer risk associated with hepatic telangiectasia. 50 normal livers and 50 severely telangiectatic livers. Normal and telangiectatic livers were collected at slaughter for aerobic and anaerobic bacterial culture. Isolates were identified, and patterns of isolation were analyzed. Histologic examination of all livers was performed. Human pathogens isolated from normal and telangiectatic livers included Escherichia coli O157:H7 and group-D streptococci. Most livers in both groups contained bacteria in low numbers; however, more normal livers yielded negative culture results. More group-D streptococci were isolated from the right lobes of telangiectatic livers than from the left lobes, and more gram-negative anaerobic bacteria were isolated from left lobes of telangiectatic livers than from right lobes. All telangiectatic lesions were free of fibrosis, active necrotizing processes, and inflammation. The USDA regulation condemning telangiectatic livers is justified insofar as these livers contain more bacteria than normal livers do; however, normal livers contain similar species of microflora. Development of telangiectasia could not be linked to an infectious process. The finding of E coli O157:H7 in bovine livers suggests that information regarding bacterial content of other offal and muscle may identify sources of this and other potential foodborne pathogens and assist in establishing critical control points for the meat industry.

  6. CEC-normalized clay-water sorption isotherm

    NASA Astrophysics Data System (ADS)

    Woodruff, W. F.; Revil, A.

    2011-11-01

    A normalized clay-water isotherm model, based on BET theory and describing the sorption and desorption of the bound water in clays, sand-clay mixtures, and shales, is presented. Clay-water sorption isotherms (sorption and desorption) of clayey materials are normalized by their cation exchange capacity (CEC), accounting for a correction factor that depends on the type of counterion sorbed on the mineral surface in the so-called Stern layer. With such normalizations, all the data collapse onto two master curves, one for sorption and one for desorption, independent of clay mineralogy, crystallographic considerations, and bound-cation type, thereby neglecting the true heterogeneity of water sorption/desorption in smectite. The two master curves show the general hysteretic behavior of the capillary pressure curve at low relative humidity (below 70%). The model is validated against several data sets obtained from the literature comprising a broad range of clay types and clay mineralogies. The CEC values, derived by inverting the sorption/desorption curves using a Markov chain Monte Carlo approach, are consistent with the CEC associated with the clay mineralogy.

  7. Vagina: What's Normal, What's Not

    MedlinePlus

    ... some antibiotics increases the risk of a vaginal yeast infection. Birth control and feminine-hygiene products. Barrier ... or change in the normal balance of vaginal yeast and bacteria can cause inflammation of the vagina ( ...

  8. Graph-based normalization and whitening for non-linear data analysis.

    PubMed

    Aaron, Catherine

    2006-01-01

    In this paper we construct a graph-based normalization algorithm for non-linear data analysis. The principle of this algorithm is to get a spherical average neighborhood with unit radius. First we present a class of global dispersion measures used for "global normalization"; we then adapt these measures using a weighted graph to build a local normalization called "graph-based" normalization. Then we give details of the graph-based normalization algorithm and illustrate some results. In the second part we present a graph-based whitening algorithm built by analogy between the "global" and the "local" problem.
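
    A minimal sketch of the stated goal, a spherical average neighborhood with unit radius, under the simplifying assumption of a single global scale derived from a k-nearest-neighbor graph; the paper's weighted-graph local normalization is more refined than this global variant.

```python
# Rescale data so the average k-NN neighborhood radius equals 1.
# Global-scale simplification of the graph-based idea; k and the
# example point set are illustrative.
import math

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def knn_mean_radius(points, k=2):
    """Average distance from each point to its k nearest neighbors."""
    radii = []
    for i, p in enumerate(points):
        ds = sorted(dist(p, q) for j, q in enumerate(points) if j != i)
        radii.append(sum(ds[:k]) / k)
    return sum(radii) / len(points)

def graph_normalize(points, k=2):
    """Divide all coordinates by the mean k-NN radius."""
    r = knn_mean_radius(points, k)
    return [[a / r for a in p] for p in points]

pts = [[0.0, 0.0], [0.0, 2.0], [2.0, 0.0], [2.0, 2.0]]
normed = graph_normalize(pts)
# The normalized point set has mean k-NN radius 1.
```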

  9. Normal modes of a small gamelan gong.

    PubMed

    Perrin, Robert; Elford, Daniel P; Chalmers, Luke; Swallowe, Gerry M; Moore, Thomas R; Hamdan, Sinin; Halkon, Benjamin J

    2014-10-01

    Studies have been made of the normal modes of a 20.7 cm diameter steel gamelan gong. A finite-element model has been constructed and its predictions for normal modes compared with experimental results obtained using electronic speckle pattern interferometry. Agreement was reasonable in view of the lack of precision in the manufacture of the instrument. The results agree with expectations for an axially symmetric system subject to small symmetry breaking. The extent to which the results obey Chladni's law is discussed. Comparison with vibrational and acoustical spectra enabled the identification of the small number of modes responsible for the sound output when played normally. Evidence of non-linear behavior was found, mainly in the form of subharmonics of true modes. Experiments using scanning laser Doppler vibrometry gave satisfactory agreement with the other methods.

  10. Werkstoffe

    NASA Astrophysics Data System (ADS)

    Hornbogen, Erhard; Eggeler, Gunther; Werner, Ewald

    Learning objective: This chapter gives a first impression of materials, which must possess certain technical properties, should at the same time be easy to manufacture, and must meet the demand for economic viability. We discuss materials in simple, general, and specific contexts and become acquainted with the field of materials science and engineering (Werkstoffkunde), which comprises materials science (Werkstoffwissenschaft) and materials engineering (Werkstofftechnik). We gain a first impression of the microscopic structure of the four materials groups: metals, glasses/ceramics, polymers, and composites. We learn about several important material properties. We then turn to reliable data on material properties and, in this context, consider the testing, standardization, and designation of materials. Finally, we briefly address the historical development of materials and introduce the concept of sustainability.

  11. Anwendungsorientierte Funktionswerkstoffe mittels Walzplattieren

    NASA Astrophysics Data System (ADS)

    Reichelt, Stephan; Schmidt, J.-F.; Neubauer, M.; Schade, A.; Andler, G.; Buerkle, G.; Hansen, H.; Hofmann, L.; Stiehler, J.; Janisch, S. D.

    Although roll-bonded (clad) materials receive little public attention, they are already of great importance for realizing and producing a wide variety of everyday goods. Thanks to high productivity and extensive technological know-how, very demanding and sophisticated functional materials can already be produced by cladding processes, guaranteeing tailor-made product properties for many semi-finished and finished products. Given the design trends in the automotive and aerospace sectors and in lightweight construction generally, the importance of tailor-made functional materials for specific fields of application can be expected to keep growing in the near future, as will the importance of individual cladding processes for an economical production chain for various products.

  12. Der neue Kosmos

    NASA Astrophysics Data System (ADS)

    Unsöld, Albrecht; Baschek, Bodo

    Astronomy, astrophysics, and space research have undergone an almost explosive development within a few decades. The new observational possibilities opened up by spaceflight, the development of highly sensitive light detectors, and the use of powerful computers have revealed new aspects of the fascinating world of galaxies and quasars, stars and planets. With the third edition out of print, the present fourth, completely revised edition of DER NEUE KOSMOS takes this rapid development into account. In a manageable volume, and with modest demands on the reader's mathematical and scientific background, it offers a coherent introduction to the entire field of astronomy and astrophysics that gives equal weight to the observations and to the basic ideas of their theoretical interpretation. In its new form, DER NEUE KOSMOS will again bring much that is new, and much pleasure, to students and researchers in astronomy, physics, and the geosciences, as well as to a wide circle of seriously interested amateurs.

  13. Konzeption, Entwicklung und Evaluierung eines Messsystems zur sortenreinen Klassifikation von fluoreszenzcodierten Kunststoffen im Rahmen des Kunststoff-Recyclings(Conception, development and evaluation of a measuring system for the classification of fluorescence coded plastics within the framework of plastic recycling)

    DTIC Science & Technology

    2016-06-13

    …separation of complex plastic mixtures in the form of typical plastic regrind ("flakes"), and in particular of dark or black plastics, … eliminated

  14. Normalization methods in time series of platelet function assays

    PubMed Central

    Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham

    2016-01-01

    Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide and viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional data spaces in temporal multivariate clinical data represented in multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, discussing the most suited approach for platelet function data series. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation as calculated by the Spearman correlation test, when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be abstracted from the charts as was the case when using all data as 1 dataset for normalization. PMID:27428217
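The four transformations named above can be sketched in a few lines. The assay values below are invented for illustration, and the implementations follow the standard textbook definitions, not necessarily the exact variants used in the study.

```python
# Four common normalization methods for a single assay's time series.
from statistics import mean, stdev, median, quantiles

def z_transform(xs):
    """Center on the mean and scale by the sample standard deviation."""
    m, s = mean(xs), stdev(xs)
    return [(x - m) / s for x in xs]

def range_transform(xs):
    """Map the observed range onto [0, 1] (min-max scaling)."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def proportion_transform(xs):
    """Express each value as a fraction of the series total."""
    total = sum(xs)
    return [x / total for x in xs]

def iqr_transform(xs):
    """Center on the median and scale by the interquartile range."""
    q1, _, q3 = quantiles(xs, n=4)
    med = median(xs)
    return [(x - med) / (q3 - q1) for x in xs]

adp_response = [42.0, 55.0, 61.0, 58.0, 47.0]  # hypothetical assay values
print(z_transform(adp_response))
print(range_transform(adp_response))
```

The z- and IQR-based transforms preserve the shape of the series, which is why they retain the Spearman correlations the abstract mentions; min-max and proportion scaling are sensitive to outliers and series totals, respectively.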

  15. Normal Language Skills and Normal Intelligence in a Child with de Lange Syndrome.

    ERIC Educational Resources Information Center

    Cameron, Thomas H.; Kelly, Desmond P.

    1988-01-01

    The subject of this case report is a two-year, seven-month-old girl with de Lange syndrome, normal intelligence, and age-appropriate language skills. She demonstrated initial delays in gross motor skills and in receptive and expressive language but responded well to intensive speech and language intervention, as well as to physical therapy.…

  16. The experience of weight management in normal weight adults.

    PubMed

    Hernandez, Cheri Ann; Hernandez, David A; Wellington, Christine M; Kidd, Art

    2016-11-01

    No prior research has been done with normal weight persons specific to their experience of weight management. The purpose of this research was to discover the experience of weight management in normal weight individuals. Glaserian grounded theory was used. Qualitative data (focus group) and quantitative data (food diary, study questionnaire, and anthropometric measures) were collected. Weight management was an ongoing process of trying to focus on living (family, work, and social), while maintaining their normal weight targets through five consciously and unconsciously used strategies. Despite maintaining normal weights, the nutritional composition of foods eaten was grossly inadequate. These five strategies can be used to develop new weight management strategies that could be integrated into existing weight management programs, or could be developed into novel weight management interventions. Surprisingly, normal weight individuals require dietary assessment and nutrition education to prevent future negative health consequences. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Bayes classification of terrain cover using normalized polarimetric data

    NASA Technical Reports Server (NTRS)

    Yueh, H. A.; Swartz, A. A.; Kong, J. A.; Shin, R. T.; Novak, L. M.

    1988-01-01

    The normalized polarimetric classifier (NPC) which uses only the relative magnitudes and phases of the polarimetric data is proposed for discrimination of terrain elements. The probability density functions (PDFs) of polarimetric data are assumed to have a complex Gaussian distribution, and the marginal PDF of the normalized polarimetric data is derived by adopting the Euclidean norm as the normalization function. The general form of the distance measure for the NPC is also obtained. It is demonstrated that for polarimetric data with an arbitrary PDF, the distance measure of NPC will be independent of the normalization function selected even when the classifier is mistrained. A complex Gaussian distribution is assumed for the polarimetric data consisting of grass and tree regions. The probability of error for the NPC is compared with those of several other single-feature classifiers. The classification error of NPCs is shown to be independent of the normalization function.
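A minimal sketch of the idea behind the NPC, using a toy squared-error distance and invented class means (the paper's actual Gaussian-derived distance measure is more involved): normalizing each complex polarimetric vector by its Euclidean norm makes classification depend only on relative magnitudes and phases, so the result is invariant to overall power.

```python
import math

def euclidean_norm(x):
    """Euclidean norm of a complex feature vector, e.g. (HH, HV, VV)."""
    return math.sqrt(sum(abs(c) ** 2 for c in x))

def normalize(x):
    """Keep only relative magnitudes and phases by dividing out total power."""
    n = euclidean_norm(x)
    return tuple(c / n for c in x)

def distance(x, mean):
    """Toy distance: squared error between normalized feature vectors."""
    return sum(abs(a - b) ** 2 for a, b in zip(normalize(x), normalize(mean)))

def classify(x, class_means):
    """Assign x to the class whose normalized mean is nearest."""
    return min(class_means, key=lambda label: distance(x, class_means[label]))

# Illustrative class means (hypothetical, not from the paper's data).
means = {
    "grass": (1.0 + 0j, 0.1 + 0.05j, 0.8 + 0j),
    "tree":  (1.0 + 0j, 0.4 + 0.2j,  0.3 + 0.1j),
}
sample = (2.0 + 0j, 0.9 + 0.4j, 0.5 + 0.2j)  # roughly a scaled "tree" vector
print(classify(sample, means))  # → tree
```

Because `normalize` divides out absolute power, scaling `sample` by any constant leaves the classification unchanged, which is the property the abstract attributes to the NPC.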

  18. Telomere length in normal and neoplastic canine tissues.

    PubMed

    Cadile, Casey D; Kitchell, Barbara E; Newman, Rebecca G; Biller, Barbara J; Hetler, Elizabeth R

    2007-12-01

    To determine the mean telomere restriction fragment (TRF) length in normal and neoplastic canine tissues. 57 solid-tissue tumor specimens collected from client-owned dogs, 40 samples of normal tissue collected from 12 clinically normal dogs, and blood samples collected from 4 healthy blood donor dogs. Tumor specimens were collected from client-owned dogs during diagnostic or therapeutic procedures at the University of Illinois Veterinary Medical Teaching Hospital, whereas 40 normal tissue samples were collected from 12 control dogs. Telomere restriction fragment length was determined by use of an assay kit. A histologic diagnosis was provided for each tumor by personnel at the Veterinary Diagnostic Laboratory at the University of Illinois. Mean of the mean TRF length for 44 normal samples was 19.0 kilobases (kb; range, 15.4 to 21.4 kb), and the mean of the mean TRF length for 57 malignant tumors was 19.0 kb (range, 12.9 to 23.5 kb). Although the mean of the mean TRF length for tumors and normal tissues was identical, tumor samples had more variability in TRF length. Telomerase, which represents the main mechanism by which cancer cells achieve immortality, is an attractive therapeutic target. The ability to measure telomere length is crucial to monitoring the efficacy of telomerase inhibition. In contrast to many other mammalian species, the length of canine telomeres and the rate of telomeric DNA loss are similar to those reported in humans, making dogs a compelling choice for use in the study of human anti-telomerase strategies.

  19. Normal mode-guided transition pathway generation in proteins

    PubMed Central

    Lee, Byung Ho; Seo, Sangjae; Kim, Min Hyeok; Kim, Youngjin; Jo, Soojin; Choi, Moon-ki; Lee, Hoomin; Choi, Jae Boong

    2017-01-01

    The biological function of proteins is closely related to their structural motion. For instance, structurally misfolded proteins do not function properly. Although we are able to experimentally obtain structural information on proteins, it is still challenging to capture their dynamics, such as transition processes. Therefore, we need a simulation method to predict the transition pathways of a protein in order to understand and study large functional deformations. Here, we present a new simulation method called normal mode-guided elastic network interpolation (NGENI) that performs normal mode analysis iteratively to predict transition pathways of proteins. To be more specific, NGENI obtains displacement vectors that determine intermediate structures by interpolating the distance between two end-point conformations, similar to a morphing method called elastic network interpolation. However, the displacement vector is regarded as a linear combination of the normal mode vectors of each intermediate structure, in order to enhance the physical sense of the proposed pathways. As a result, we can generate transition pathways that are more reasonable geometrically and thermodynamically. By using not only all normal modes but also, in part, only the lowest normal modes, NGENI can still generate reasonable pathways for large deformations in proteins. This study shows that global protein transitions are dominated by collective motion, which means that a few of the lowest normal modes play an important role in this process. NGENI has considerable merit in terms of computational cost because it can generate transition pathways using only partial degrees of freedom, which conventional methods cannot. PMID:29020017
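NGENI expresses each interpolation step as a combination of normal-mode vectors; as a point of reference, the naive straight-line morph that such methods refine can be sketched as follows (the coordinates and step count are illustrative, and this baseline is not NGENI itself):

```python
# Naive straight-line interpolation between two protein conformations,
# each given as a list of (x, y, z) atom coordinates. NGENI replaces each
# such displacement with a linear combination of low-frequency normal-mode
# vectors; this sketch shows only the plain linear morph it improves upon.
def linear_morph(start, end, n_steps):
    path = []
    for k in range(n_steps + 1):
        t = k / n_steps  # interpolation parameter from 0 (start) to 1 (end)
        path.append([
            tuple((1 - t) * a + t * b for a, b in zip(p, q))
            for p, q in zip(start, end)
        ])
    return path

# Two-atom toy structure morphed in 4 steps (coordinates are invented).
start = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
end = [(0.0, 2.0, 0.0), (1.0, 2.0, 1.0)]
for frame in linear_morph(start, end, 4):
    print(frame)
```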

  20. The Influence of Normalization Weight in Population Pharmacokinetic Covariate Models.

    PubMed

    Goulooze, Sebastiaan C; Völler, Swantje; Välitalo, Pyry A J; Calvier, Elisa A M; Aarons, Leon; Krekels, Elke H J; Knibbe, Catherijne A J

    2018-03-23

    In covariate (sub)models of population pharmacokinetic models, most covariates are normalized to the median value; however, for body weight, normalization to 70 kg or 1 kg is often applied. In this article, we illustrate the impact of normalization weight on the precision of population clearance (CL_pop) parameter estimates. The influence of normalization weight (70 kg, 1 kg or median weight) on the precision of the CL_pop estimate, expressed as relative standard error (RSE), was illustrated using data from a pharmacokinetic study in neonates with a median weight of 2.7 kg. In addition, a simulation study was performed to show the impact of normalization to 70 kg in pharmacokinetic studies with paediatric or obese patients. The RSE of the CL_pop parameter estimate in the neonatal dataset was lowest with normalization to median weight (8.1%), compared with normalization to 1 kg (10.5%) or 70 kg (48.8%). Typical clearance (CL) predictions were independent of the normalization weight used. Simulations showed that the increase in RSE of the CL_pop estimate with 70 kg normalization was highest in studies with a narrow weight range and a geometric mean weight away from 70 kg. When, instead of normalizing with median weight, a weight outside the observed range is used, the RSE of the CL_pop estimate will be inflated, and should therefore not be used for model selection. Instead, established mathematical principles can be used to calculate the RSE of the typical CL (CL_TV) at a relevant weight to evaluate the precision of CL predictions.
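The independence of typical clearance predictions from the normalization weight can be checked with a standard allometric covariate model; the exponent 0.75 and all numbers below are illustrative, not the study's estimates.

```python
# Allometric covariate model: CL = CL_pop * (WT / WT_norm) ** exponent.
# The exponent and the numbers are illustrative, not the study's estimates.
def typical_cl(cl_pop, wt, wt_norm, exponent=0.75):
    """Typical clearance for a patient of weight wt (kg)."""
    return cl_pop * (wt / wt_norm) ** exponent

# A model normalized to 70 kg with CL_pop = 5 L/h is equivalent to a model
# normalized to the median weight (2.7 kg) with a rescaled CL_pop:
cl_pop_70 = 5.0
cl_pop_median = typical_cl(cl_pop_70, 2.7, 70)

# Predictions agree for any weight, regardless of the normalization weight;
# only the precision (RSE) of the CL_pop estimate differs between fits.
wt = 3.5
a = typical_cl(cl_pop_70, wt, 70)
b = typical_cl(cl_pop_median, wt, 2.7)
print(a, b)  # identical up to floating-point rounding
```

This is exactly why the choice of normalization weight affects the RSE of the CL_pop estimate but not the typical CL predictions themselves.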

  1. Decorin and biglycan of normal and pathologic human corneas

    NASA Technical Reports Server (NTRS)

    Funderburgh, J. L.; Hevelone, N. D.; Roth, M. R.; Funderburgh, M. L.; Rodrigues, M. R.; Nirankari, V. S.; Conrad, G. W.

    1998-01-01

    PURPOSE: Corneas with scars and certain chronic pathologic conditions contain highly sulfated dermatan sulfate, but little is known of the core proteins that carry these atypical glycosaminoglycans. In this study the proteoglycan proteins attached to dermatan sulfate in normal and pathologic human corneas were examined to identify primary genes involved in the pathobiology of corneal scarring. METHODS: Proteoglycans from human corneas with chronic edema, bullous keratopathy, and keratoconus and from normal corneas were analyzed using sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE), quantitative immunoblotting, and immunohistology with peptide antibodies to decorin and biglycan. RESULTS: Proteoglycans from pathologic corneas exhibit increased size heterogeneity and binding of the cationic dye alcian blue compared with those in normal corneas. Decorin and biglycan extracted from normal and diseased corneas exhibited similar molecular size distribution patterns. In approximately half of the pathologic corneas, the level of biglycan was elevated an average of seven times above normal, and decorin was elevated approximately three times above normal. The increases were associated with highly charged molecular forms of decorin and biglycan, indicating modification of the proteins with dermatan sulfate chains of increased sulfation. Immunostaining of corneal sections showed an abnormal stromal localization of biglycan in pathologic corneas. CONCLUSIONS: The increased dermatan sulfate associated with chronic corneal pathologic conditions results from stromal accumulation of decorin and particularly of biglycan in the affected corneas. These proteins bear dermatan sulfate chains with increased sulfation compared with normal stromal proteoglycans.

  2. The Normalization of Cannabis Use Among Bangladeshi and Pakistani Youth: A New Frontier for the Normalization Thesis?

    PubMed

    Williams, Lisa; Ralphs, Rob; Gray, Paul

    2017-03-21

    The Asian population in Britain has grown, representing the second largest ethnic group; Bangladeshi, Pakistani, and Indian nationalities are prevalent (Jivraj, 2012; Office for National Statistics, 2013). Yet, we know relatively little about the nature and extent of their substance use. Jayakody et al. (2006) argue ethnic minority groups may be influenced by the norms and values of the dominant culture. Given recreational drug use has undergone a process of normalization in Britain (Aldridge et al., 2011; Parker et al., 1998, 2002), we explore the degree to which this is occurring in a Bangladeshi and Pakistani community of Muslim faith in Northern England; a group typically assumed to reject substance use because of robust religious and cultural values. To examine the extent, frequency, and nature of substance use, and associated attitudes. A cross-sectional study collecting qualitative data from a sample (N = 43) of adolescents accessing a drug service and a range of professionals working with them during 2014. We also present analyses of routinely collected quantitative client data. Adolescent interviewees reported extensive personal experience smoking skunk cannabis, and professionals working in the community confirmed many young Asians smoked it. Its consumption appeared to be accommodated into the daily lives of young people and the supply of it also showed signs of acceptance. Skunk cannabis may be undergoing a process of normalization within some Asian communities in Britain. Our study has significant implications for the normalization thesis, finding evidence for normalization within a subpopulation that is typically perceived to resist this trend.

  3. Cortical thickness in neuropsychologically near-normal schizophrenia.

    PubMed

    Cobia, Derin J; Csernansky, John G; Wang, Lei

    2011-12-01

    Schizophrenia is a severe psychiatric illness with widespread impairments of cognitive functioning; however, a certain percentage of subjects are known to perform in the normal range on neuropsychological measures. While the cognitive profiles of these individuals have been examined, there has been relatively little attention to the neuroanatomical characteristics of this important subgroup. The aims of this study were to statistically identify schizophrenia subjects with relatively normal cognition, examine their neuroanatomical characteristics relative to their more impaired counterparts using cortical thickness mapping, and to investigate relationships between these characteristics and demographic variables to better understand the nature of cognitive heterogeneity in schizophrenia. Clinical, neuropsychological, and MRI data were collected from schizophrenia (n = 79) and healthy subjects (n = 65). A series of clustering algorithms on neuropsychological scores was examined, and a 2-cluster solution that separated subjects into neuropsychologically near-normal (NPNN) and neuropsychologically impaired (NPI) groups was determined most appropriate. Surface-based cortical thickness mapping was utilized to examine differences in thinning among schizophrenia subtypes compared with the healthy participants. A widespread cortical thinning pattern characteristic of schizophrenia emerged in the NPI group, while NPNN subjects demonstrated very limited thinning relative to healthy comparison subjects. Analysis of illness duration indicated minimal effects on subtype classification and cortical thickness results. Findings suggest a strong link between cognitive impairment and cortical thinning in schizophrenia, where subjects with near-normal cognitive abilities also demonstrate near-normal cortical thickness patterns. While generally supportive of distinct etiological processes for cognitive subtypes, results provide direction for further examination of additional

  4. COMS normal operation for Earth Observation mission

    NASA Astrophysics Data System (ADS)

    Cho, Young-Min

    2012-09-01

    The Communication, Ocean and Meteorological Satellite (COMS), a hybrid mission for meteorological observation, ocean monitoring, and telecommunication service, was launched into geostationary Earth orbit on June 27, 2010 and has been in normal operation service since April 2011. COMS is located at 128.2° East in the geostationary orbit. To perform the three missions, COMS carries three separate payloads, the Meteorological Imager (MI), the Geostationary Ocean Color Imager (GOCI), and the Ka-band antenna, each dedicated to one of the three missions. The MI and GOCI perform the Earth observation missions of meteorological observation and ocean monitoring, respectively. For this Earth observation mission, COMS requires daily mission commands from the satellite control ground station, and the daily mission is affected by satellite control activities; daily mission planning is therefore required. The Earth observation mission operation of COMS is described in terms of mission operation characteristics and mission planning for the normal operation services of meteorological observation and ocean monitoring. The first year of normal operation results after the In-Orbit Test (IOT) is also investigated through a statistical approach to present the COMS normal operation status achieved for the Earth observation mission.

  5. Normals to a Parabola

    ERIC Educational Resources Information Center

    Srinivasan, V. K.

    2013-01-01

    Given a parabola in the standard form y^2 = 4ax, corresponding to three points on the parabola, such that the normals at these three points P, Q, R concur at a point M = (h, k), the equation of the circumscribing circle through the three points P, Q, and R provides a tremendous opportunity to illustrate "The Art of Algebraic…

  6. Measurements of normal joint angles by goniometry in calves.

    PubMed

    Sengöz Şirin, O; Timuçin Celik, M; Ozmen, A; Avki, S

    2014-01-01

    The aim of this study was to establish normal reference values for the forelimb and hindlimb joint angles in normal Holstein calves. Thirty clinically normal Holstein calves that were free of any detectable musculoskeletal abnormalities were included in the study. A standard transparent plastic goniometer was used to measure maximum flexion, maximum extension, and range of motion of the shoulder, elbow, carpal, hip, stifle, and tarsal joints. The goniometric measurements were done on awake calves positioned in lateral recumbency and were recorded by two independent investigators. Goniometric values obtained from awake calves in lateral recumbency were highly consistent and accurate between investigators (p < 0.05). This study provides objective and useful information on the normal forelimb and hindlimb joint angles in normal Holstein calves. Further studies can be done to determine goniometric values in different diseases and to compare them with these normal values.

  7. Effects of normalization on quantitative traits in association test

    PubMed Central

    2009-01-01

    Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that rank-based transformation gives generally the best and consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample size or genetic effects, the improvement in sensitivity for rank transformation outweighs the slight increase in false positive rate. However, for large sample size and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest. PMID:20003414
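A widely used rank-based transformation is the rank-based inverse normal transform; the sketch below uses the Blom-type offset c = 0.375 as an assumed variant (the abstract does not specify one), ignores ties for simplicity, and maps ranks to standard normal quantiles:

```python
# Rank-based inverse normal transform (Blom-type offset, an assumed
# variant; ties are not handled in this sketch).
from statistics import NormalDist

def rank_based_transform(xs, c=0.375):
    n = len(xs)
    # Rank each value (1 = smallest), keeping the original ordering of xs.
    order = sorted(range(n), key=lambda i: xs[i])
    ranks = [0] * n
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    # Map rank r to the normal quantile at probability (r - c) / (n - 2c + 1).
    nd = NormalDist()
    return [nd.inv_cdf((r - c) / (n - 2 * c + 1)) for r in ranks]

trait = [3.1, 120.0, 0.4, 15.0, 7.7]  # skewed trait values (invented)
print(rank_based_transform(trait))    # approximately normal scores
```

Because only ranks enter the transform, the result is insensitive to the skew of the raw trait, which is consistent with the robustness the simulations report.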

  8. Normal keratinized mucosa transplants in nude mice.

    PubMed

    Holmstrup, P; Dabelsteen, E; Reibel, J; Harder, F

    1981-01-01

    Two types of normal keratinized mucosa were transplanted to subcutaneous sites of nude mice of two different strains. 24 intact specimens of clinically normal human palatal mucosa were transplanted to nude mice of the strain nu/nu NC. The transplants were recovered after 42 d with a recovery rate of 96%. Moreover, 22 intact specimens of normal rat forestomach mucosa were transplanted to nude mice of the strain nu/nu BALB/c/BOM. These transplants were recovered after 21 d with a recovery rate of 63%. The histologic features of the transplants were essentially the same as those of the original tissues. However, epithelial outgrowths from the transplants differed with respect to the pattern of keratinization. The outgrowths of human palatal mucosa transplants were essentially unkeratinized, while the outgrowths of the rat forestomach transplants showed continued keratinization.

  9. Masturbation, sexuality, and adaptation: normalization in adolescence.

    PubMed

    Shapiro, Theodore

    2008-03-01

    During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.

  10. Are cancer cells really softer than normal cells?

    PubMed

    Alibert, Charlotte; Goud, Bruno; Manneville, Jean-Baptiste

    2017-05-01

    Solid tumours are often first diagnosed by palpation, suggesting that the tumour is more rigid than its surrounding environment. Paradoxically, individual cancer cells appear to be softer than their healthy counterparts. In this review, we first list the physiological reasons indicating that cancer cells may be more deformable than normal cells. Next, we describe the biophysical tools that have been developed in recent years to characterise and model cancer cell mechanics. By reviewing the experimental studies that compared the mechanics of individual normal and cancer cells, we argue that cancer cells can indeed be considered as softer than normal cells. We then focus on the intracellular elements that could be responsible for the softening of cancer cells. Finally, we ask whether the mechanical differences between normal and cancer cells can be used as diagnostic or prognostic markers of cancer progression. © 2017 Société Française des Microscopies and Société de Biologie Cellulaire de France. Published by John Wiley & Sons Ltd.

  11. Spectra of normal and nutrient-deficient maize leaves

    NASA Technical Reports Server (NTRS)

    Al-Abbas, A. H.; Barr, R.; Hall, J. D.; Crane, F. L.; Baumgardner, M. F.

    1973-01-01

    Reflectance, transmittance and absorptance spectra of normal and six types of nutrient-deficient (N, P, K, S, Mg, and Ca) maize (Zea mays L.) leaves were analyzed at 30 selected wavelengths from 500 to 2600 nm. The analysis of variance showed significant differences in reflectance, transmittance and absorptance in the visible wavelengths among leaf numbers 3, 4, and 5, among the seven treatments, and among the interactions of leaf number and treatments. In the infrared wavelengths only treatments produced significant differences. The chlorophyll content of leaves was reduced in all nutrient-deficient treatments. Percent moisture was increased in S-, Mg-, and N-deficiencies. Polynomial regression analysis of leaf thickness and leaf moisture content showed that these two variables were significantly and directly related. Leaves from the P- and Ca-deficient plants absorbed less energy in the near infrared than the normal plants; S-, Mg-, K-, and N-deficient leaves absorbed more than the normal. Both S- and N-deficient leaves had higher temperatures than normal maize leaves.

  12. Craniota (Vertebrates, or Skull-Bearing Animals)

    NASA Astrophysics Data System (ADS)

    Schultze, Hans-Peter

    The Craniota comprise all chordates whose body is regionalized into three parts: head, trunk, and tail. The head comprises (1) the neurocranium, housing the brain and the complex sense organs for perceiving the environment, (2) the viscerocranium, serving food intake and, in the primarily aquatic Craniota, gill ventilation, and (3) the dermatocranium (p. 38). The latter arises from ossifications in the connective tissue of the integument; it protects the head and bears the teeth in the mouth region. Together, these three skeletal structures form the functional unit of the skull (cranium). Besides the (somatic) trunk musculature and the axial skeleton, the trunk contains the circulatory, respiratory, digestive, excretory, and reproductive organs. The tail, the section behind the anal opening that marks the end of the body cavity, serves locomotion with its muscles and caudal fin.

  13. Innovative BI Solutions as the Basis for a Successful Transformation to Utility 4.0

    NASA Astrophysics Data System (ADS)

    Phillipp, Daniel; Ebert, Sebastian

    For a successful transformation from a pure energy supplier to an energy service provider, innovative business intelligence solutions will be necessary and will play a central role. It is first essential to know the challenges and to address them with suitable analyses. The basis for this is an architecture and methodology that is coordinated with, and aligned to, the company's strategic goals. Two examples illustrate how a holistic approach can optimize operational processes even with diverse data and high complexity, and how advanced analytics can contribute to business success in the future.

  14. Funkmesstechnik (Radio Measurement Technology)

    NASA Astrophysics Data System (ADS)

    Plaßmann, Wilfried

    A principal area of radio measurement technology is characterized by the term RADAR (radio detection and ranging). In this method, pulsed electromagnetic waves are emitted by an antenna and reflected by objects or by distributions of matter (clouds). The transmitting antenna is switched to receive, and the echo allows conclusions about the position and nature of the objects or matter distributions. Radar technology is applied in the control and safeguarding of land, water, and air traffic, in meteorology for weather forecasting, in astronomy, and in the military domain.

  15. Proteoglycans in Leiomyoma and Normal Myometrium

    PubMed Central

    Barker, Nichole M.; Carrino, David A.; Caplan, Arnold I.; Hurd, William W.; Liu, James H.; Tan, Huiqing; Mesiano, Sam

    2015-01-01

    Uterine leiomyomas are common benign pelvic tumors composed of modified smooth muscle cells and a large amount of extracellular matrix (ECM). The proteoglycan composition of the leiomyoma ECM is thought to affect the pathophysiology of the disease. To test this hypothesis, we examined the abundance (by immunoblotting) and expression (by quantitative real-time polymerase chain reaction) of the proteoglycans biglycan, decorin, and versican in leiomyoma and normal myometrium and determined whether expression is affected by steroid hormones and menstrual phase. Leiomyoma and normal myometrium were collected from women (n = 17) undergoing hysterectomy or myomectomy. In vitro studies were performed on immortalized leiomyoma (UtLM) and normal myometrial (hTERT-HM) cells with and without exposure to estradiol and progesterone. In leiomyoma tissue, abundance of decorin messenger RNA (mRNA) and protein were 2.6-fold and 1.4-fold lower, respectively, compared with normal myometrium. Abundance of versican mRNA was not different between matched samples, whereas versican protein was increased 1.8-fold in leiomyoma compared with myometrium. Decorin mRNA was 2.4-fold lower in secretory phase leiomyoma compared with proliferative phase tissue. In UtLM cells, progesterone decreased the abundance of decorin mRNA by 1.3-fold. Lower decorin expression in leiomyoma compared with myometrium may contribute to disease growth and progression. As decorin inhibits the activity of specific growth factors, its reduced level in the leiomyoma cell microenvironment may promote cell proliferation and ECM deposition. Our data suggest that decorin expression in leiomyoma is inhibited by progesterone, which may be a mechanism by which the ovarian steroids affect leiomyoma growth and disease progression. PMID:26423601

  16. The classification of normal screening mammograms

    NASA Astrophysics Data System (ADS)

    Ang, Zoey Z. Y.; Rawashdeh, Mohammad A.; Heard, Robert; Brennan, Patrick C.; Lee, Warwick; Lewis, Sarah J.

    2016-03-01

    Rationale and objectives: To understand how breast screen readers classify the difficulty of normal screening mammograms using common lexicon describing normal appearances. Cases were also assessed on their suitability for a single reader strategy. Materials and Methods: 15 breast readers were asked to interpret a test set of 29 normal screening mammogram cases and classify them by rating the difficulty of the case on a five-point Likert scale, identifying the salient features and assessing their suitability for single reading. Using the False Positive Fractions from a previous study, the 29 cases were classified into 10 "low", 10 "medium" and nine "high" difficulties. Data were analyzed with descriptive statistics. Spearman's correlation was used to test the strength of association between the difficulty of the cases and the readers' recommendation for single reading strategy. Results: The ratings from readers in this study corresponded to the known difficulty level of cases for the "low" and "high" difficulty cases. Uniform ductal pattern and density, symmetrical mammographic features and the absence of micro-calcifications were the main reasons associated with "low" difficulty cases. The "high" difficulty cases were described as having "dense breasts". There was a statistically significant negative correlation between the difficulty of the cases and readers' recommendation for single reading (r = -0.475, P = 0.009). Conclusion: The findings demonstrated potential relationships between certain mammographic features and the difficulty for readers to classify mammograms as "normal". The standard Australian practice of double reading was deemed more suitable for most cases. There was an inverse moderate association between the difficulty of the cases and the recommendations for single reading.

  17. The COBE normalization for standard cold dark matter

    NASA Technical Reports Server (NTRS)

    Bunn, Emory F.; Scott, Douglas; White, Martin

    1995-01-01

    The Cosmic Background Explorer Satellite (COBE) detection of microwave anisotropies provides the best way of fixing the amplitude of cosmological fluctuations on the largest scales. This normalization is usually given for an n = 1 spectrum, including only the anisotropy caused by the Sachs-Wolfe effect. This is certainly not a good approximation for a model containing any reasonable amount of baryonic matter. In fact, even tilted Sachs-Wolfe spectra are not a good fit to models like cold dark matter (CDM). Here, we normalize standard CDM (sCDM) to the two-year COBE data and quote the best amplitude in terms of the conventionally used measures of power. We also give normalizations for some specific variants of this standard model, and we indicate how the normalization depends on the assumed values of n, Omega(sub B) and H(sub 0). For sCDM we find a mean value of Q = 19.9 +/- 1.5 micro-K, corresponding to sigma(sub 8) = 1.34 +/- 0.10, with the normalization at large scales being B = (8.16 +/- 1.04) x 10(exp 5)(Mpc/h)(exp 4), and other numbers given in the table. The measured rms temperature fluctuation smoothed on 10 deg is a little low relative to this normalization. This is mainly due to the low quadrupole in the data: when the quadrupole is removed, the measured value of sigma(10 deg) is quite consistent with the best-fitting mean value of Q. The use of the mean value of Q should be preferred over sigma(10 deg), when its value can be determined for a particular theory, since it makes full use of the data.

  18. Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization

    NASA Astrophysics Data System (ADS)

    Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad

    2017-02-01

    Malay Twitter messages deviate markedly from the standard language. Malay Tweets are widely used by Twitter users, especially in the Malay Archipelago. It is therefore important to build a normalization system that can translate Malay Tweet language into standard Malay. Much research in natural language processing has focused on normalizing English Twitter messages, while few studies have addressed the normalization of Malay Tweets. This paper proposes an approach to normalizing Malay Twitter messages based on hybrid dictionary modelling methods. The approach normalizes noisy Malay Twitter messages such as colloquial language, novel words, and interjections into standard Malay. The research will use a language model and N-gram models.
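    A minimal sketch of what a dictionary-plus-language-model normalizer of this kind might look like. The shorthand lexicon and unigram counts below are hypothetical toy values for illustration, not the authors' actual resources:

```python
# Hypothetical dictionary-based tweet normalization with a unigram language
# model as tie-breaker for ambiguous shorthand. Lexicon and counts are toy data.
NOISY_TO_STANDARD = {
    "x": ["tidak"],             # common Malay texting shorthand
    "sy": ["saya"],
    "yg": ["yang"],
    "mkn": ["makan", "makin"],  # ambiguous: resolved by the language model
}
UNIGRAM_FREQ = {"makan": 120, "makin": 30, "tidak": 500, "saya": 400, "yang": 600}

def normalize(tweet: str) -> str:
    out = []
    for tok in tweet.lower().split():
        candidates = NOISY_TO_STANDARD.get(tok, [tok])
        # keep the candidate the unigram model scores highest
        out.append(max(candidates, key=lambda w: UNIGRAM_FREQ.get(w, 0)))
    return " ".join(out)

print(normalize("sy x mkn"))  # saya tidak makan
```

    A real system would replace the unigram tie-break with an N-gram score over the surrounding context, as the proposal describes.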

  19. Normal stress effects on Knudsen flow

    NASA Astrophysics Data System (ADS)

    Eu, Byung Chan

    2018-01-01

    Normal stress effects are investigated on tube flow of a single-component non-Newtonian fluid under a constant pressure gradient in a constant temperature field. The generalized hydrodynamic equations are employed, which are consistent with the laws of thermodynamics. In the cylindrical tube flow configuration, the solutions of generalized hydrodynamic equations are exactly solvable and the flow velocity is obtained in a simple one-dimensional integral quadrature. Unlike the case of flow in the absence of normal stresses, the flow develops an anomaly in that the flow in the boundary layer becomes stagnant and the thickness of such a stagnant velocity boundary layer depends on the pressure gradient, the aspect ratio of the radius to the length of the tube, and the pressure (or density and temperature) at the entrance of the tube. The volume flow rate formula through the tube is derived for the flow. It generalizes the Knudsen flow rate formula to the case of a non-Newtonian stress tensor in the presence of normal stress differences. It also reduces to the Navier-Stokes theory formula in the low shear rate limit near equilibrium.

  20. Chirurgie angeborener Herzfehler

    NASA Astrophysics Data System (ADS)

    Schreiber, Christian; Libera, Paul; Lange, Rüdiger

    Disturbances of embryonic development in the early phase of pregnancy can lead to malformations of the heart and vascular system. The incidence is 0.8-1% of all live-born children. In Germany, about 6,000 children are born with a heart defect each year (source: http://www.kompetenznetzahf.de). The spectrum ranges from simple defects that impair the cardiovascular system only slightly to very severe heart diseases that lead to death if untreated. Advances in pediatric cardiology, cardiac surgery and anesthesia now enable survival in over 90% of patients. Specialized prenatal diagnostics also allow the course to be set early for possible treatment options. With regard to surgical therapy, however, it must be noted that a heart defect is either treated correctively or can only be "palliated". In the latter case, a medical procedure is performed that does not aim to restore normal body function but, adapted to the patient's physiological particularities, merely stabilizes and optimizes the patient's condition. This can be necessary, for example, in a non-correctable congenital malformation in which only one functional ventricle is present (e.g. hypoplastic left heart). Here, a prosthetic connection to the pulmonary circulation must subsequently be removed.

  1. Glymphatic MRI in idiopathic normal pressure hydrocephalus

    PubMed Central

    Ringstad, Geir; Vatnehol, Svein Are Sirirud; Eide, Per Kristian

    2017-01-01

    Abstract The glymphatic system has in previous studies been shown as fundamental to clearance of waste metabolites from the brain interstitial space, and is proposed to be instrumental in normal ageing and brain pathology such as Alzheimer’s disease and brain trauma. Assessment of glymphatic function using magnetic resonance imaging with intrathecal contrast agent as a cerebrospinal fluid tracer has so far been limited to rodents. We aimed to image cerebrospinal fluid flow characteristics and glymphatic function in humans, and applied the methodology in a prospective study of 15 idiopathic normal pressure hydrocephalus patients (mean age 71.3 ± 8.1 years, three female and 12 male) and eight reference subjects (mean age 41.1 ± 13.0 years, six female and two male) with suspected cerebrospinal fluid leakage (seven) and intracranial cyst (one). The imaging protocol included T1-weighted magnetic resonance imaging with equal sequence parameters before and at multiple time points through 24 h after intrathecal injection of the contrast agent gadobutrol at the lumbar level. All study subjects were kept in the supine position between examinations during the first day. Gadobutrol enhancement was measured at all imaging time points from regions of interest placed at predefined locations in brain parenchyma, the subarachnoid and intraventricular space, and inside the sagittal sinus. Parameters demonstrating gadobutrol enhancement and clearance in different locations were compared between idiopathic normal pressure hydrocephalus and reference subjects. A characteristic flow pattern in idiopathic normal pressure hydrocephalus was ventricular reflux of gadobutrol from the subarachnoid space followed by transependymal gadobutrol migration. At the brain surfaces, gadobutrol propagated antegradely along large leptomeningeal arteries in all study subjects, and preceded glymphatic enhancement in adjacent brain tissue, indicating a pivotal role of intracranial pulsations for glymphatic

  2. Glymphatic MRI in idiopathic normal pressure hydrocephalus.

    PubMed

    Ringstad, Geir; Vatnehol, Svein Are Sirirud; Eide, Per Kristian

    2017-10-01

    The glymphatic system has in previous studies been shown as fundamental to clearance of waste metabolites from the brain interstitial space, and is proposed to be instrumental in normal ageing and brain pathology such as Alzheimer's disease and brain trauma. Assessment of glymphatic function using magnetic resonance imaging with intrathecal contrast agent as a cerebrospinal fluid tracer has so far been limited to rodents. We aimed to image cerebrospinal fluid flow characteristics and glymphatic function in humans, and applied the methodology in a prospective study of 15 idiopathic normal pressure hydrocephalus patients (mean age 71.3 ± 8.1 years, three female and 12 male) and eight reference subjects (mean age 41.1 ± 13.0 years, six female and two male) with suspected cerebrospinal fluid leakage (seven) and intracranial cyst (one). The imaging protocol included T1-weighted magnetic resonance imaging with equal sequence parameters before and at multiple time points through 24 h after intrathecal injection of the contrast agent gadobutrol at the lumbar level. All study subjects were kept in the supine position between examinations during the first day. Gadobutrol enhancement was measured at all imaging time points from regions of interest placed at predefined locations in brain parenchyma, the subarachnoid and intraventricular space, and inside the sagittal sinus. Parameters demonstrating gadobutrol enhancement and clearance in different locations were compared between idiopathic normal pressure hydrocephalus and reference subjects. A characteristic flow pattern in idiopathic normal pressure hydrocephalus was ventricular reflux of gadobutrol from the subarachnoid space followed by transependymal gadobutrol migration. At the brain surfaces, gadobutrol propagated antegradely along large leptomeningeal arteries in all study subjects, and preceded glymphatic enhancement in adjacent brain tissue, indicating a pivotal role of intracranial pulsations for glymphatic function. In

  3. Smartes System für die Energiewende - der Übertragungsnetzbetreiber in der digitalen Zukunft

    NASA Astrophysics Data System (ADS)

    Pflaum, Rainer; Egeler, Tobias

    The transmission grids ensure a reliable supply of electrical energy to households, commerce and industry, and are thus the foundation of a modern economy and society. The now irreversible developments of the national and European energy transition confront the transmission system operator (Übertragungsnetzbetreiber) with new and major challenges in its core tasks: building and operating grids, providing market and grid access, and integrating renewable energies. Decentralized generation close to consumption, as well as centralized generation far from consumption, must be managed and reconciled with demand in order to guarantee system stability. In such a system, renewable energies must also contribute to system and market integration. All of this requires more data in order to guarantee dynamic response capabilities within an overall system. Only "digitalization" creates the preconditions needed to master this complexity. Digitalization is therefore a core element of this transformation of the transmission system operator: on the one hand it contributes to the emergence of the new challenges, on the other it helps provide the tools to meet them. The following contribution shows how digitalization is changing the tasks and instruments of the transmission system operator. Starting from the current tasks of a transmission system operator and the applicable legal framework, smart elements and tools that are already in use today or will become necessary in the coming years are described under the heading "Notwendiges Set für morgen" ("necessary set for tomorrow"). Several examples from different areas then make the purposes of digitalization at the transmission system operator concrete. A brief outlook focusing on the further transformation process rounds off the contribution.

  4. Method for construction of normalized cDNA libraries

    DOEpatents

    Soares, Marcelo B.; Efstratiadis, Argiris

    1998-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to appropriate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries.

  5. Method for construction of normalized cDNA libraries

    DOEpatents

    Soares, M.B.; Efstratiadis, A.

    1998-11-03

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to appropriate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries. 19 figs.

  6. Normal Perceptual Sensitivity Arising From Weakly Reflective Cone Photoreceptors

    PubMed Central

    Bruce, Kady S.; Harmening, Wolf M.; Langston, Bradley R.; Tuten, William S.; Roorda, Austin; Sincich, Lawrence C.

    2015-01-01

    Purpose To determine the light sensitivity of poorly reflective cones observed in retinas of normal subjects, and to establish a relationship between cone reflectivity and perceptual threshold. Methods Five subjects (four male, one female) with normal vision were imaged longitudinally (7–26 imaging sessions, representing 82–896 days) using adaptive optics scanning laser ophthalmoscopy (AOSLO) to monitor cone reflectance. Ten cones with unusually low reflectivity, as well as 10 normally reflective cones serving as controls, were targeted for perceptual testing. Cone-sized stimuli were delivered to the targeted cones and luminance increment thresholds were quantified. Thresholds were measured three to five times per session for each cone in the 10 pairs, all located 2.2 to 3.3° from the center of gaze. Results Compared with other cones in the same retinal area, three of 10 monitored dark cones were persistently poorly reflective, while seven occasionally manifested normal reflectance. Tested psychophysically, all 10 dark cones had thresholds comparable with those from normally reflecting cones measured concurrently (P = 0.49). The variation observed in dark cone thresholds also matched the wide variation seen in a large population (n = 56 cone pairs, six subjects) of normal cones; in the latter, no correlation was found between cone reflectivity and threshold (P = 0.0502). Conclusions Low cone reflectance cannot be used as a reliable indicator of cone sensitivity to light in normal retinas. To improve assessment of early retinal pathology, other diagnostic criteria should be employed along with imaging and cone-based microperimetry. PMID:26193919

  7. Microarray expression profiling in adhesion and normal peritoneal tissues.

    PubMed

    Ambler, Dana R; Golden, Alicia M; Gell, Jennifer S; Saed, Ghassan M; Carey, David J; Diamond, Michael P

    2012-05-01

    To identify molecular markers associated with adhesion and normal peritoneal tissue using microarray expression profiling. Comparative study. University hospital. Five premenopausal women. Adhesion and normal peritoneal tissue samples were obtained from premenopausal women. Ribonucleic acid was extracted using standard protocols and processed for hybridization to Affymetrix Whole Transcript Human Gene Expression Chips. Microarray data were obtained from five different patients, each with adhesion tissue and normal peritoneal samples. Real-time polymerase chain reaction was performed for confirmation using standard protocols. Gene expression in postoperative adhesion and normal peritoneal tissues. A total of 1,263 genes were differentially expressed between adhesion and normal tissues. One hundred seventy-three genes were found to be up-regulated and 56 genes were down-regulated in the adhesion tissues compared with normal peritoneal tissues. The genes were sorted into functional categories according to Gene Ontology annotations. Twenty-six up-regulated genes and 11 down-regulated genes were identified with functions potentially relevant to the pathophysiology of postoperative adhesions. We evaluated and confirmed expression of 12 of these specific genes via polymerase chain reaction. The pathogenesis, natural history, and optimal treatment of postoperative adhesive disease remain incompletely understood. Microarray analysis of adhesions identified specific genes with increased and decreased expression when compared with normal peritoneum. Knowledge of these genes and ontologic pathways with altered expression provide targets for new therapies to treat patients who have or are at risk for postoperative adhesions. Copyright © 2012 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  8. Resistance to antibiotics in the normal flora of animals.

    PubMed

    Sørum, H; Sunde, M

    2001-01-01

    The normal bacterial flora contains antibiotic resistance genes to various degrees, even in individuals with no history of exposure to commercially prepared antibiotics. Several factors seem to increase the number of antibiotic-resistant bacteria in feces. One important factor is the exposure of the intestinal flora to antibacterial drugs. Antibiotics used as feed additives seem to play an important role in the development of antibiotic resistance in normal flora bacteria. The use of avoparcin as a feed additive has demonstrated that an antibiotic considered "safe" is responsible for increased levels of antibiotic resistance in the normal flora enterococci of animals fed with avoparcin and possibly in humans consuming products from these animals. However, other factors like stress from temperature, crowding, and management also seem to contribute to the occurrence of antibiotic resistance in normal flora bacteria. The normal flora of animals has been studied with respect to the development of antibiotic resistance over four decades, but there are few studies with the intestinal flora as the main focus. The results of earlier studies are valuable when viewed against the recent understanding of the mobile genetic elements responsible for bacterial antibiotic resistance. New studies should be undertaken to assess whether the development of antibiotic resistance in the normal flora is directly linked to the dramatic increase in antibiotic resistance of bacterial pathogens. Bacteria of the normal flora, often disregarded scientifically, should be studied with the intention of using them as active protection against infectious diseases and thereby contributing to the overall reduction of the use of antibiotics in both animals and humans.

  9. The significance of early post-exercise ST segment normalization.

    PubMed

    Chow, Rudy; Fordyce, Christopher B; Gao, Min; Chan, Sammy; Gin, Kenneth; Bennett, Matthew

    2015-01-01

    The persistence of ST segment depression in recovery signifies a strongly positive exercise treadmill test (ETT). However, it is unclear if early recovery of ST segments portends a similar prognosis. We sought to determine if persistence of ST depression into recovery correlates with ischemic burden based on myocardial perfusion imaging (MPI). This was a retrospective analysis of 853 consecutive patients referred for exercise MPI at a tertiary academic center over a 24-month period. Patients were stratified into three groups based on the results of the ETT: normal (negative ETT), persistence (positive ETT with >1 mm ST segment depression at 1 minute in recovery) and early normalization (positive ETT with <1 mm ST segment depression at 1 minute in recovery). Summed stress scores (SSSs) were then calculated for each patient, and coronary anatomy was reported for the subset of patients who received coronary angiograms. A total of 513 patients had a negative ETT, 235 patients met criteria for early normalization, while 105 patients met criteria for persistence. The persistence group had a significantly greater SSS (8.48±7.77) than both the early normalization (4.34±4.98, p<0.001) and normal (4.47±5.31, p<0.001) groups. The SSSs of the early normalization and normal groups were not statistically different and met the prespecified non-inferiority margin (mean difference 0.12, lower 95% CI -0.66, p<0.001). Among the 87 patients who underwent an angiogram, significant three-vessel or left main disease was seen in 39.3% of the persistence group compared with 5.9% of normal and 7.4% of early normalization groups. Among patients with an electrically positive ETT, recovery of ST segment depression within 1 minute was associated with a lower SSS than patients with persistence of ST depression beyond 1 minute. Furthermore, early ST segment recovery conferred a similar SSS to patients with a negative ETT. These results suggest that among patients evaluated for chest pain with

  10. Regionalstatistik

    NASA Astrophysics Data System (ADS)

    Eppmann, Helmut; Fürnrohr, Michael

    Many tasks in politics, business and society require not only global but also regional solutions. Regional statistics are therefore indispensable for many planning and decision-making processes. The Committee for Regional Statistics of the Deutsche Statistische Gesellschaft has set itself the goal of promoting their expansion and use. This chapter first presents some foundations of regional statistics and the tasks of the committee. It then covers the extensive regional statistical data offering of the Statistische Ämter des Bundes und der Länder (statistical offices of the federal government and the federal states) and its use. A supplementary section is devoted to the work of the Institut für Bau-, Stadt- und Raumforschung. The chapter closes with an outlook on the further development of the regional statistical data offering from the perspective of official statistics.

  11. Improved Algorithm For Finite-Field Normal-Basis Multipliers

    NASA Technical Reports Server (NTRS)

    Wang, C. C.

    1989-01-01

    Improved algorithm reduces complexity of calculations that must precede design of Massey-Omura finite-field normal-basis multipliers, used in error-correcting-code equipment and cryptographic devices. Algorithm represents an extension of development reported in "Algorithm To Design Finite-Field Normal-Basis Multipliers" (NPO-17109), NASA Tech Briefs, Vol. 12, No. 5, page 82.
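    The abstract does not spell out why normal bases matter for Massey-Omura multipliers; the structural property they exploit is that squaring in a normal-basis representation is just a cyclic shift of the coordinate vector. A small sketch verifying this in GF(2^4) (the field polynomial and the normal element alpha^3 are chosen here for illustration and are not taken from the cited work):

```python
# Verify that squaring in a normal basis of GF(2^4) is a cyclic coordinate shift.
# Field: GF(2)[x]/(x^4 + x + 1); normal element: alpha^3 (alpha = class of x).
from itertools import product

MOD = 0b10011  # x^4 + x + 1, irreducible over GF(2)

def gf_mul(a: int, b: int) -> int:
    """Carry-less multiplication modulo MOD (GF(16) product)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:
            a ^= MOD
        b >>= 1
    return r

def gf_pow(a: int, e: int) -> int:
    r = 1
    while e:
        if e & 1:
            r = gf_mul(r, a)
        a = gf_mul(a, a)
        e >>= 1
    return r

beta = gf_pow(0b0010, 3)                           # alpha^3, a normal element
basis = [gf_pow(beta, 2 ** i) for i in range(4)]   # {b, b^2, b^4, b^8}

def coords(u: int):
    """Coordinates of u in the normal basis, found by brute-force search."""
    for bits in product((0, 1), repeat=4):
        acc = 0
        for bit, b in zip(bits, basis):
            if bit:
                acc ^= b
        if acc == u:
            return bits
    raise ValueError("basis does not span GF(16)")

# u = sum c_i * b^(2^i)  implies  u^2 = sum c_i * b^(2^(i+1)), i.e. a rotation.
ok = all(coords(gf_mul(u, u)) == (coords(u)[-1],) + coords(u)[:-1]
         for u in range(16))
print(ok)  # True
```

    This free squaring is what makes normal-basis (Massey-Omura) multipliers attractive in error-correcting-code and cryptographic hardware, where repeated squarings dominate exponentiation.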

  12. Gang Youth, Substance Use Patterns, and Drug Normalization

    ERIC Educational Resources Information Center

    Sanders, Bill

    2012-01-01

    Gang membership is an indicator of chronic illicit substance use and such patterns of use may have a normalized character. Using epidemiological and qualitative data collected between 2006 and 2007, this manuscript examines the drug normalization thesis among a small sample (n=60) of gang youth aged 16-25 years from Los Angeles. Overall, while…

  13. Invasive forstliche schad-organismen in Nordamerika

    Treesearch

    Christopher J. Fettig; Horst E. Delb

    2017-01-01

    In North America, forests have traditionally been heavily affected by introductions of non-native invasive pests as a result of immigration, international trade, and tourism. In the wake of globalization and climate change, and given the similarities in climate, flora, and fauna between Europe and North America, there are many parallels.

  14. Analytische Geometrie

    NASA Astrophysics Data System (ADS)

    Kemnitz, Arnfried

    The basic idea of analytic geometry is that geometric investigations are carried out by computational means. Geometric objects are described by equations and studied with algebraic methods. The following topics are covered: coordinate systems: Cartesian coordinate systems of the plane and of space, polar coordinate system of the plane, relationship between Cartesian and polar coordinates; straight lines: line equations, distances from lines; circles: circle equations, circle computations; spheres; conic sections; ellipses; hyperbolas; parabolas; applications of conic sections in engineering and mathematics; vectors: definitions, addition, multiplication, component representation in the plane and in space, scalar product, vector product. Examples are given for the individual topics. Important rules and laws are highlighted by boxes.
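    The relationship between Cartesian and polar coordinates mentioned in the chapter summary can be sketched briefly (a minimal illustration using the standard formulas x = r cos(phi), y = r sin(phi) and their inverse, not an excerpt from the chapter):

```python
import math

def to_polar(x: float, y: float) -> tuple[float, float]:
    """Cartesian -> polar: r = sqrt(x^2 + y^2), phi = atan2(y, x)."""
    return math.hypot(x, y), math.atan2(y, x)

def to_cartesian(r: float, phi: float) -> tuple[float, float]:
    """Polar -> Cartesian: x = r*cos(phi), y = r*sin(phi)."""
    return r * math.cos(phi), r * math.sin(phi)

r, phi = to_polar(3.0, 4.0)
print(round(r, 3))                    # 5.0
x, y = to_cartesian(r, phi)
print(round(x, 3), round(y, 3))       # 3.0 4.0
```

    Using atan2 rather than atan(y/x) keeps the angle in the correct quadrant for all four sign combinations of x and y.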

  15. Empirical evaluation of data normalization methods for molecular classification.

    PubMed

    Huang, Huei-Chung; Qin, Li-Xuan

    2018-01-01

    Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers-an increasingly important application of microarrays in the era of personalized medicine. In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages at GitHub. In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated in independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy.
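    As a toy illustration of the kind of post hoc adjustment discussed, the sketch below applies per-array median centering to remove a simulated location-shift "handling effect". The simulated data and shift size are invented for illustration; this is not the paper's pipeline:

```python
# Per-array median normalization against a location-shift handling effect.
# Simulated toy data only; the real study used paired microRNA array datasets.
import random

random.seed(0)
n_features = 200

def simulate_array(shift: float) -> list[float]:
    """Feature intensities around 0, plus an array-wide handling shift."""
    return [random.gauss(0, 1) + shift for _ in range(n_features)]

def median(xs):
    s = sorted(xs)
    m = len(s) // 2
    return s[m] if len(s) % 2 else (s[m - 1] + s[m]) / 2

def median_center(arr):
    """Subtract the array's own median from every feature."""
    m = median(arr)
    return [v - m for v in arr]

batch_a = simulate_array(shift=0.0)
batch_b = simulate_array(shift=2.0)   # batch confounded by a handling shift
gap_before = abs(median(batch_a) - median(batch_b))
gap_after = abs(median(median_center(batch_a)) - median(median_center(batch_b)))
print(gap_before > 1.0, gap_after < 1e-9)  # True True
```

    A location shift like this is exactly the case median centering handles; a scale-change handling effect, one of the patterns the study varies, would need a different adjustment.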

  16. Empirical evaluation of data normalization methods for molecular classification

    PubMed Central

    Huang, Huei-Chung

    2018-01-01

    Background Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers - an increasingly important application of microarrays in the era of personalized medicine. Methods In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages at GitHub. Results In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated in independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Conclusion Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy. PMID:29666754

  17. Supplement to: Astronomical ephemerides, navigation and war. The astonishing cooperation of the ephemeris institutes of Germany, England, France and the USA during the Second World War based on documents in the archives of the Astronomisches Rechen-Institut. Scans of the documents. (German Title: Supplement zu: Astronomische Ephemeriden, Navigation und Krieg. Die erstaunliche Zusammenarbeit der Ephemeriden-Institute von Deutschland, England, Frankreich und den USA im Zweiten Weltkrieg nach Dokumenten im Archiv des Astronomischen Rechen-Instituts. Scans der Dokumente.)

    NASA Astrophysics Data System (ADS)

    Wielen, Roland; Wielen, Ute

    In a previous paper (Wielen R. und Wielen U. 2016a: Astronomical Ephemerides, Navigation and War), we have presented the astonishing cooperation of the ephemeris institutes of Germany, England, France and the USA during the Second World War. We were able to use numerous archivalia which we also describe and comment in that paper. In the present paper, we publish colour scans of these archivalia. All the documents shown here are held in the archives of the Astronomisches Rechen-Institut in Heidelberg.

  18. Indentation stiffness does not discriminate between normal and degraded articular cartilage.

    PubMed

    Brown, Cameron P; Crawford, Ross W; Oloyede, Adekunle

    2007-08-01

    Relative indentation characteristics are commonly used for distinguishing between normal healthy and degraded cartilage. The application of this parameter in surgical decision making and an appreciation of articular cartilage biomechanics prompted us to hypothesise that it is difficult to define a reference stiffness to characterise normal articular cartilage. This hypothesis was tested by carrying out biomechanical indentation of articular cartilage samples characterised as visually normal or degraded with respect to proteoglycan depletion and collagen disruption. Compressive loading was applied at known strain rates to visually normal, artificially degraded and naturally osteoarthritic articular cartilage, and the trends of their stress-strain and stiffness characteristics were observed. While our results demonstrated a 25% depreciation in the stiffness of individual samples after proteoglycan depletion, they also showed that when compared to the stiffness of normal samples only 17% lie outside the range of the stress-strain behaviour of normal samples. We conclude that the extent of the variability in the properties of normal samples, and the degree of overlap (81%) of the biomechanical properties of normal and degraded matrices, demonstrate that indentation data cannot form an accurate basis for distinguishing normal from abnormal articular cartilage samples, with consequences for the application of this mechanical process in the clinical environment.

  19. Quasi-normal modes from non-commutative matrix dynamics

    NASA Astrophysics Data System (ADS)

    Aprile, Francesco; Sanfilippo, Francesco

    2017-09-01

    We explore similarities between the process of relaxation in the BMN matrix model and the physics of black holes in AdS/CFT. Focusing on Dyson-fluid solutions of the matrix model, we perform numerical simulations of the real-time dynamics of the system. By quenching the equilibrium distribution we study quasi-normal oscillations of scalar single-trace observables, we isolate the lowest quasi-normal mode, and we determine its frequencies as a function of the energy. Considering the BMN matrix model as a truncation of N=4 SYM, we also compute the frequencies of the quasi-normal modes of the dual scalar fields in the AdS5-Schwarzschild background. We compare the results, and we find a surprising similarity.

  20. "3D augmented reality" visualization for navigated osteosynthesis of pelvic fractures

    PubMed Central

    Befrui, N.; Fischer, M.; Fuerst, B.; Lee, S.-C.; Fotouhi, J.; Weidert, S.; Johnson, A.; Euler, E.; Osgood, G.; Navab, N.; Böcker, W.

    2018-01-01

    Abstract Background Despite major advances in the hardware and software of navigation systems, they are still rarely used in today's operating rooms, owing to their perceived complexity, cumbersome integration into clinical workflows, and questionable advantages over conventional imaging. Objective Development of an augmented reality (AR) visualization for surgical navigation without infrared (IR) tracking markers, and comparison with conventional X-ray imaging in a simulated procedure. Materials and methods A navigation system consisting of a cone-beam CT (CBCT)-capable C-arm and a red-green-blue-depth (RGBD) camera. Evaluation by Kirschner (K) wire placement in models, considering the time required, the radiation dose, and the usability of the systems. Results A significant reduction in the time required, the number of X-ray images, and the total radiation dose with AR navigation compared with conventional X-ray imaging, at equal precision. Conclusion AR navigation using the RGBD camera offers flexible and intuitive visualization of the surgical site for navigated osteosynthesis without tracking markers. It thereby allows operations to be performed faster, more easily, and with lower radiation exposure for both patient and operating room staff. PMID:29500506

  1. Normalization of urinary drug concentrations with specific gravity and creatinine.

    PubMed

    Cone, Edward J; Caplan, Yale H; Moser, Frank; Robert, Tim; Shelby, Melinda K; Black, David L

    2009-01-01

    Excessive fluid intake can substantially dilute urinary drug concentrations and result in false-negative reports for drug users. Methods for correction ("normalization") of drug/metabolite concentrations in urine have been utilized by anti-doping laboratories, pain monitoring programs, and environmental monitoring programs to compensate for excessive hydration, but such procedures have not been used routinely in workplace, legal, and treatment settings. We evaluated two drug normalization procedures based on specific gravity and creatinine. These corrections were applied to urine specimens collected from three distinct groups (pain patients, heroin users, and marijuana/cocaine users). Each group was unique in characteristics, study design, and dosing conditions. The results of the two normalization procedures were highly correlated (r=0.94; range, 0.78-0.99). Increases in percent positives by specific gravity and creatinine normalization were small (0.3% and -1.0%, respectively) for heroin users (normally hydrated subjects), modest (4.2-9.8%) for pain patients (unknown hydration state), and substantial (2- to 38-fold increases) for marijuana/cocaine users (excessively hydrated subjects). Despite some limitations, these normalization procedures provide alternative means of dealing with highly dilute, dilute, and concentrated urine specimens. Drug/metabolite concentration normalization by these procedures is recommended for urine testing programs, especially as a means of coping with dilute specimens.
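The two corrections evaluated above follow standard forms: specific-gravity normalization rescales a concentration by the ratio of reference to observed specific-gravity excess, and creatinine normalization rescales to a reference creatinine. A minimal sketch — the reference values 1.020 and 100 mg/dL are common laboratory conventions, not figures taken from this abstract:

```python
def normalize_specific_gravity(conc, sg_measured, sg_ref=1.020):
    """Scale a urinary drug concentration to a reference specific gravity
    using the common correction C_norm = C * (sg_ref - 1) / (sg_meas - 1)."""
    return conc * (sg_ref - 1.0) / (sg_measured - 1.0)

def normalize_creatinine(conc, creatinine_mg_dl, creatinine_ref=100.0):
    """Scale a concentration to a reference creatinine of 100 mg/dL."""
    return conc * creatinine_ref / creatinine_mg_dl

# A dilute specimen: low specific gravity and low creatinine both
# quadruple the reported concentration after normalization.
print(normalize_specific_gravity(50.0, 1.005))
print(normalize_creatinine(50.0, 25.0))
```

For an excessively hydrated donor, both corrections inflate the raw concentration, which is why the abstract reports the largest gains in percent positives for the marijuana/cocaine group.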

  2. Protein Degradation in Normal and Beige (Chediak-Higashi) Mice

    PubMed Central

    Lyons, Robert T.; Pitot, Henry C.

    1978-01-01

    The beige mouse, C57BL/6 (bg/bg), is an animal model for the Chediak-Higashi syndrome in man, a disease characterized morphologically by giant lysosomes in most cell types. Half-lives for the turnover of [14C]bicarbonate-labeled total soluble liver protein were determined in normal and beige mice. No significant differences were observed between the normal and mutant strain for both rapidly and slowly turning-over classes of proteins. Glucagon treatment during the time-course of protein degradation had similar effects on both normal and mutant strains and led to the conclusion that the rate of turnover of endogenous intracellular protein in the beige mouse liver does not differ from normal. The rates of uptake and degradation of an exogenous protein were determined in normal and beige mice by intravenously injecting 125I-bovine serum albumin and following, in peripheral blood, the loss with time of phosphotungstic acid-insoluble bovine serum albumin and the parallel appearance of phosphotungstic acid-soluble (degraded) material. No significant differences were observed between beige and normal mice in the uptake by liver lysosomes of 125I-bovine serum albumin (t½ = 3.9 and 2.8 h, respectively). However, it was found that lysosomes from livers of beige mice released phosphotungstic acid-soluble radioactivity at a rate significantly slower than normal (t½ = 6.8 and 3.1 h, respectively). This defect in beige mice could be corrected by chronic administration of carbamyl choline (t½ = 3.5 h), a cholinergic agonist which raises intracellular cyclic GMP levels. However, no significant differences between normal and beige mice were observed either in the ability of soluble extracts of liver and kidney to bind [3H]cyclic GMP in vitro or in the basal levels of cyclic AMP in both tissues. The relevance of these observations to the presumed biochemical defect underlying the Chediak-Higashi syndrome is discussed. PMID:202611

  3. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 7 2011-01-01 2011-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected... whole milk which such farmer would have sold in the commercial market in each of the pay periods in the...

  4. Physics of collisionless scrape-off-layer plasma during normal and off-normal Tokamak operating conditions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassanein, A.; Konkashbaev, I.

    1999-03-15

    The structure of a collisionless scrape-off-layer (SOL) plasma in tokamak reactors is being studied to define the electron distribution function and the corresponding sheath potential between the divertor plate and the edge plasma. The collisionless model is shown to be valid during the thermal phase of a plasma disruption, as well as during the newly desired low-recycling normal phase of operation with low-density, high-temperature edge plasma conditions. An analytical solution is developed by solving the Fokker-Planck equation for electron distribution and balance in the SOL. The solution is in good agreement with numerical studies using Monte-Carlo methods. The analytical solutions provide insight into the role of different physical and geometrical processes in a collisionless SOL during disruptions and during the enhanced phase of normal operation over a wide range of parameters.

  5. A comparison of vowel normalization procedures for language variation research

    NASA Astrophysics Data System (ADS)

    Adank, Patti; Smits, Roel; van Hout, Roeland

    2004-11-01

    An evaluation of vowel normalization procedures for the purpose of studying language variation is presented. The procedures were compared on how effectively they (a) preserve phonemic information, (b) preserve information about the talker's regional background (or sociolinguistic information), and (c) minimize anatomical/physiological variation in acoustic representations of vowels. Recordings were made for 80 female talkers and 80 male talkers of Dutch. These talkers were stratified according to their gender and regional background. The normalization procedures were applied to measurements of the fundamental frequency and the first three formant frequencies for a large set of vowel tokens. The normalization procedures were evaluated through statistical pattern analysis. The results show that normalization procedures that use information across multiple vowels ("vowel-extrinsic" information) to normalize a single vowel token performed better than those that include only information contained in the vowel token itself ("vowel-intrinsic" information). Furthermore, the results show that normalization procedures that operate on individual formants performed better than those that use information across multiple formants (e.g., "formant-extrinsic" F2-F1).

  6. A comparison of vowel normalization procedures for language variation research.

    PubMed

    Adank, Patti; Smits, Roel; van Hout, Roeland

    2004-11-01

    An evaluation of vowel normalization procedures for the purpose of studying language variation is presented. The procedures were compared on how effectively they (a) preserve phonemic information, (b) preserve information about the talker's regional background (or sociolinguistic information), and (c) minimize anatomical/physiological variation in acoustic representations of vowels. Recordings were made for 80 female talkers and 80 male talkers of Dutch. These talkers were stratified according to their gender and regional background. The normalization procedures were applied to measurements of the fundamental frequency and the first three formant frequencies for a large set of vowel tokens. The normalization procedures were evaluated through statistical pattern analysis. The results show that normalization procedures that use information across multiple vowels ("vowel-extrinsic" information) to normalize a single vowel token performed better than those that include only information contained in the vowel token itself ("vowel-intrinsic" information). Furthermore, the results show that normalization procedures that operate on individual formants performed better than those that use information across multiple formants (e.g., "formant-extrinsic" F2-F1).
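The best-performing family identified here — vowel-extrinsic, formant-intrinsic procedures — includes the classic Lobanov method, which z-scores each formant separately across all of a talker's vowel tokens. A minimal sketch; the formant values are invented for illustration:

```python
import statistics

def lobanov(formant_values):
    """Vowel-extrinsic, formant-intrinsic normalization (Lobanov):
    z-score one formant across all vowel tokens of a single talker,
    removing that talker's mean and spread for the formant."""
    mean = statistics.mean(formant_values)
    sd = statistics.stdev(formant_values)
    return [(f - mean) / sd for f in formant_values]

# F1 (Hz) for a handful of vowel tokens from one hypothetical talker.
f1 = [300.0, 450.0, 600.0, 750.0, 900.0]
print([round(z, 2) for z in lobanov(f1)])
```

Because the mean and standard deviation are estimated per talker, anatomical differences between talkers are largely removed while the relative positions of that talker's vowels (the phonemic information) are preserved.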

  7. Normalized methodology for medical infrared imaging

    NASA Astrophysics Data System (ADS)

    Vargas, J. V. C.; Brioschi, M. L.; Dias, F. G.; Parolin, M. B.; Mulinari-Brenner, F. A.; Ordonez, J. C.; Colman, D.

    2009-01-01

    A normalized procedure for medical infrared imaging is suggested, and illustrated by a leprosy and hepatitis C treatment follow-up, in order to investigate the effect of concurrent treatment which has not been reported before. A 50-year-old man with indeterminate leprosy and a 20-year history of hepatitis C was monitored for 587 days, starting from the day the patient received treatment for leprosy. Standard therapy for hepatitis C started 30 days later. Both visual observations and normalized infrared imaging were conducted periodically to assess the response to leprosy treatment. The primary end points were effectiveness of the method under different boundary conditions over the period, and rapid assessment of the response to leprosy treatment. The patient achieved sustained hepatitis C virological response 6 months after the end of the treatment. The normalized infrared results demonstrate the leprosy treatment success in spite of the concurrent hepatitis C treatment, since day 87, whereas repigmentation was visually assessed only after day 182, and corroborated with a skin biopsy on day 390. The method detected the effectiveness of the leprosy treatment in 87 days, whereas repigmentation started only in 182 days. Hepatitis C and leprosy treatment did not affect each other.

  8. A Late Babylonian Normal and ziqpu star text

    NASA Astrophysics Data System (ADS)

    Roughton, N. A.; Steele, J. M.; Walker, C. B. F.

    2004-09-01

    The Late Babylonian tablet BM 36609+ is a substantial rejoined fragment of an important and previously unknown compendium of short texts dealing with the use of stars in astronomy. Three of the fragments which constitute BM 36609+ were first identified as containing a catalogue of Babylonian "Normal Stars" (stars used as reference points in the sky to track the movement of the moon and planets) by N. A. Roughton. C. B. F. Walker has been able to join several more fragments to the tablet, which have revealed that other sections of the compendium concern a group of stars whose culminations are used for keeping time, known as ziqpu-stars after the Akkadian term for culmination, ziqpu. All the preserved sections on the obverse of BM 36609+ concern ziqpu-stars. On the reverse of the tablet we find several sections concerning Normal Stars. This side begins with a catalogue of Normal Stars giving their positions within zodiacal signs. The catalogue is apparently related to the only other Normal Star catalogue previously known, BM 46083, published by Sachs. In the following we present an edition of BM 36609+ based upon Walker's transliteration of the tablet. Since Sachs' edition of BM 46083, the Normal Star catalogue related to BM 36609+, was based upon a photograph and is incomplete, we include a fresh edition of that tablet. A list of Akkadian and translated star names with identifications is given.

  9. Modeling and simulation of normal and hemiparetic gait

    NASA Astrophysics Data System (ADS)

    Luengas, Lely A.; Camargo, Esperanza; Sanchez, Giovanni

    2015-09-01

    Gait is the collective term for the two types of bipedal locomotion, walking and running. This paper is focused on walking. The analysis of human gait is of interest to many different disciplines, including biomechanics, human-movement science, rehabilitation and medicine in general. Here we present a new model that is capable of reproducing the properties of walking, both normal and pathological. The aim of this paper is to establish the biomechanical principles that underlie human walking by using the Lagrange method. Dissipative constraint forces are included through a Rayleigh dissipation function, which accounts for the effect of gait on the tissues. Depending on the value of the factor in the Rayleigh dissipation function, both normal and pathological gait can be simulated. We first apply the model to normal gait and then to permanent hemiparetic gait. Anthropometric data of an adult are used in the simulation; anthropometric data for children can also be used, provided the relevant anthropometric tables are consulted. Validation of these models includes simulations of passive dynamic gait on level ground. The dynamic walking approach provides a new perspective on gait analysis, focusing on the kinematics and kinetics of gait. There have been studies and simulations of normal human gait, but few have focused on abnormal, especially hemiparetic, gait. Quantitative comparisons of the model predictions with gait measurements show that the model can reproduce the significant characteristics of normal gait.
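The Lagrange formulation with a Rayleigh dissipation function described above takes the standard form (a generic statement; the paper's specific coordinates and dissipation factors are not given in the abstract):

```latex
\frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}_i}\right)
  - \frac{\partial L}{\partial q_i}
  + \frac{\partial \mathcal{R}}{\partial \dot{q}_i} = Q_i ,
\qquad
\mathcal{R} = \frac{1}{2}\sum_j c_j \, \dot{q}_j^{2} ,
```

where $L$ is the Lagrangian of the segmental model, $q_i$ are the generalized coordinates, $Q_i$ the generalized applied forces, and the damping factors $c_j$ are the tunable parameters whose values, per the abstract, distinguish normal from hemiparetic gait.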

  10. Not Quite Normal: Consequences of Violating the Assumption of Normality in Regression Mixture Models

    ERIC Educational Resources Information Center

    Van Horn, M. Lee; Smith, Jessalyn; Fagan, Abigail A.; Jaki, Thomas; Feaster, Daniel J.; Masyn, Katherine; Hawkins, J. David; Howe, George

    2012-01-01

    Regression mixture models, which have only recently begun to be used in applied research, are a new approach for finding differential effects. This approach comes at the cost of the assumption that error terms are normally distributed within classes. This study uses Monte Carlo simulations to explore the effects of relatively minor violations of…

  11. Notes on power of normality tests of error terms in regression models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Střelec, Luboš

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to draw valid, non-misleading inferences, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
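As a concrete instance of a classical moment-based normality check on regression residuals, the Jarque-Bera statistic combines sample skewness and excess kurtosis; this sketch (a textbook test, not one of the paper's RT-class tests) shows how a single outlying error term inflates the statistic:

```python
def jarque_bera(residuals):
    """Classical moment-based normality test statistic for residuals.
    Under normality the statistic is approximately chi-square with 2 df,
    so large values signal non-normal error terms."""
    n = len(residuals)
    mean = sum(residuals) / n
    m2 = sum((r - mean) ** 2 for r in residuals) / n
    m3 = sum((r - mean) ** 3 for r in residuals) / n
    m4 = sum((r - mean) ** 4 for r in residuals) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)

# Symmetric, light-tailed residuals give a small statistic;
# one heavy outlier inflates it sharply.
clean = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]
outlier = clean + [15.0]
print(jarque_bera(clean), jarque_bera(outlier))
```

The sensitivity of such moment-based tests to single outliers is precisely the robustness issue that motivates the robust alternatives discussed in the contribution.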

  12. A brain imaging repository of normal structural MRI across the life course: Brain Images of Normal Subjects (BRAINS).

    PubMed

    Job, Dominic E; Dickie, David Alexander; Rodriguez, David; Robson, Andrew; Danso, Sammy; Pernet, Cyril; Bastin, Mark E; Boardman, James P; Murray, Alison D; Ahearn, Trevor; Waiter, Gordon D; Staff, Roger T; Deary, Ian J; Shenkin, Susan D; Wardlaw, Joanna M

    2017-01-01

    The Brain Images of Normal Subjects (BRAINS) Imagebank (http://www.brainsimagebank.ac.uk) is an integrated repository project hosted by the University of Edinburgh and sponsored by the Scottish Imaging Network: A Platform for Scientific Excellence (SINAPSE) collaborators. BRAINS provides sharing and archiving of detailed normal human brain imaging and relevant phenotypic data already collected in studies of healthy volunteers across the life course. It particularly focusses on the extremes of age (currently older age, and in future perinatal), where variability is largest and which are under-represented in existing databanks. BRAINS is a living imagebank to which new data will be added when available. Currently BRAINS contains data from 808 healthy volunteers, from 15 to 81 years of age, from 7 projects in 3 centres. Additional completed and ongoing studies of normal individuals from the 1st to 10th decades are in preparation and will be included as they become available. BRAINS holds several MRI structural sequences, including T1, T2, T2* and fluid attenuated inversion recovery (FLAIR), available in DICOM (http://dicom.nema.org/); in future, Diffusion Tensor Imaging (DTI) will be added where available. Images are linked to a wide range of 'textual data', such as age, medical history, physiological measures (e.g. blood pressure), medication use, cognitive ability, and perinatal information for pre/post-natal subjects. The imagebank can be searched to include or exclude ranges of these variables to create better estimates of 'what is normal' at different ages. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Daring to Use More Mathematics in Medicine

    NASA Astrophysics Data System (ADS)

    Deuflhard, Peter; Dössel, Olaf; Louis, Alfred K.; Zachow, Stefan

    Using three success stories, this article shows how the interplay of mathematics and medicine has triggered a development toward patient-specific models based on modern medical imaging, a development that will dynamically gain further ground in the near future. The interests of medicine and mathematics are aligned: both disciplines want results that are fast and reliable. For the clinic, this means that the necessary computations must run in as short a time as possible, and on a PC, and that the results must be accurate and robust enough for medical decisions to be based on them. For mathematics, it follows that the highest demands must be placed on the efficiency of the algorithms used and on the numerical and visualization software built upon them. However, there is still a long way to go before anatomically and medically useful functional models become available even for the most important parts of the body and the most common diseases. Leading university hospitals, as centers of interdisciplinary cooperation between physicians, engineers and mathematicians, could take on a pioneering role in daring to use more mathematics in medicine. This would undoubtedly be an important step toward individualized quantitative medicine, for which Germany has the best prerequisites to assume the role of "pacemaker".

  14. Normal central retinal function and structure preserved in retinitis pigmentosa.

    PubMed

    Jacobson, Samuel G; Roman, Alejandro J; Aleman, Tomas S; Sumaroka, Alexander; Herrera, Waldo; Windsor, Elizabeth A M; Atkinson, Lori A; Schwartz, Sharon B; Steinberg, Janet D; Cideciyan, Artur V

    2010-02-01

    To determine whether normal function and structure, as recently found in forms of Usher syndrome, also occur in a population of patients with nonsyndromic retinitis pigmentosa (RP). Patients with simplex, multiplex, or autosomal recessive RP (n = 238; ages 9-82 years) were studied with static chromatic perimetry. A subset was evaluated with optical coherence tomography (OCT). Co-localized visual sensitivity and photoreceptor nuclear layer thickness were measured across the central retina to establish the relationship of function and structure. Comparisons were made to patients with Usher syndrome (n = 83, ages 10-69 years). Cross-sectional psychophysical data identified patients with RP who had normal rod- and cone-mediated function in the central retina. There were two other patterns with greater dysfunction, and longitudinal data confirmed that progression can occur from normal rod and cone function to cone-only central islands. The retinal extent of normal laminar architecture by OCT corresponded to the extent of normal visual function in patients with RP. Central retinal preservation of normal function and structure did not show a relationship with age or retained peripheral function. Usher syndrome results were like those in nonsyndromic RP. Regional disease variation is a well-known finding in RP. Unexpected was the observation that patients with presumed recessive RP can have regions with functionally and structurally normal retina. Such patients will require special consideration in future clinical trials of either focal or systemic treatment. Whether there is a common molecular mechanism shared by forms of RP with normal regions of retina warrants further study.

  15. Amniotic fluid cortisol and alpha-fetoprotein in normal and aneuploid pregnancies.

    PubMed

    Drugan, A; Subramanian, M G; Johnson, M P; Evans, M I

    1988-01-01

    Cortisol and alpha-fetoprotein (AFP) levels were measured in amniotic fluid (AF) samples at 15-20 weeks of gestation from 125 normal pregnancies and 29 pregnancies affected by aneuploidy. The normal pregnancy group was further subdivided into 'low' AF-AFP (< 0.6 MOM, n = 60) and 'normal' AF-AFP (0.6 < AFP < 1.4 MOM, n = 65). A significant, inverse, linear correlation was found between cortisol and AF-AFP for both the normal AFP and low AFP groups (r = -0.26 and r = -0.4, respectively; p < 0.05). Gestational age was significantly correlated with both cortisol and AFP levels in the normal pregnancy groups. No difference was found when cortisol levels were compared between the low and normal AFP groups. The correlation between cortisol and AFP in aneuploid pregnancies was not significant (p = 0.37). The strong association between cortisol or AFP and gestational age in normal pregnancy (p < 0.00001) was lost in trisomic gestation. We conclude that higher cortisol levels do not seem to be the cause of low AFP in normal or aneuploid pregnancies.

  16. Physical Properties of Normal Grade Biodiesel and Winter Grade Biodiesel

    PubMed Central

    Sadrolhosseini, Amir Reza; Moksin, Mohd Maarof; Nang, Harrison Lau Lik; Norozi, Monir; Yunus, W. Mahmood Mat; Zakaria, Azmi

    2011-01-01

    In this study, optical and thermal properties of normal grade and winter grade palm oil biodiesel were investigated. Surface plasmon resonance and the photopyroelectric technique were used to evaluate the samples. The dispersion curve and thermal diffusivity were obtained. The refractive index varies with wavelength faster in normal grade biodiesel than in winter grade palm oil biodiesel, and the thermal diffusivity of winter grade biodiesel is higher than that of normal grade biodiesel. This is attributed to the higher palmitic acid C16:0 content in normal grade than in winter grade palm oil biodiesel. PMID:21731429

  17. EMG normalization method based on grade 3 of manual muscle testing: Within- and between-day reliability of normalization tasks and application to gait analysis.

    PubMed

    Tabard-Fougère, Anne; Rose-Dulcina, Kevin; Pittet, Vincent; Dayer, Romain; Vuillerme, Nicolas; Armand, Stéphane

    2018-02-01

    Electromyography (EMG) is an important parameter in Clinical Gait Analysis (CGA), and is generally interpreted together with timing of activation. EMG amplitude comparisons between individuals, muscles or days need normalization. There is no consensus on existing methods. The gold standard, maximum voluntary isometric contraction (MVIC), is not adapted to pathological populations because patients are often unable to perform an MVIC. The normalization method inspired by the isometric grade 3 of manual muscle testing (isoMMT3), which is the ability of a muscle to maintain a position against gravity, could be an interesting alternative. The aim of this study was to evaluate the within- and between-day reliability of the isoMMT3 EMG normalization method during gait compared with the conventional MVIC method. EMG of lower limb muscles (gluteus medius, rectus femoris, tibialis anterior, semitendinosus) was recorded bilaterally in nine healthy participants (five males, aged 29.7±6.2 years, BMI 22.7±3.3 kg m⁻²), giving a total of 18 independent legs. Three repeated measurements of the isoMMT3 and MVIC exercises were performed with EMG recording. EMG amplitude of the muscles during gait was normalized by these two methods. This protocol was repeated one week later. Within- and between-day reliability of the normalization tasks was similar for the isoMMT3 and MVIC methods. Within- and between-day reliability of gait EMG normalized by isoMMT3 was higher than with MVIC normalization. These results indicate that EMG normalization using isoMMT3 is a reliable method requiring no special equipment and will support CGA interpretation. The next step will be to evaluate this method in pathological populations. Copyright © 2017 Elsevier B.V. All rights reserved.
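Whatever the reference task, the normalization step itself is simply a rescaling of the gait EMG envelope to percent of the reference amplitude. A minimal sketch — the names and values are illustrative, not taken from the study:

```python
def normalize_emg(envelope, reference_amplitude):
    """Express a gait EMG envelope as a percentage of a reference
    contraction amplitude (e.g. the MVIC peak, or the mean activity
    held during an isoMMT3 against-gravity task)."""
    return [100.0 * v / reference_amplitude for v in envelope]

gait_envelope = [0.05, 0.12, 0.30, 0.18]   # arbitrary units (e.g. mV)
iso_mmt3_mean = 0.60                        # same units, same muscle, same session
print(normalize_emg(gait_envelope, iso_mmt3_mean))
```

Because both methods divide by a session-specific reference, their reliability hinges on how repeatable that reference contraction is within and between days, which is exactly what the study compares.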

  18. Normality of different orders for Cantor series expansions

    NASA Astrophysics Data System (ADS)

    Airey, Dylan; Mance, Bill

    2017-10-01

    Let S \\subseteq {N} have the property that for each k \\in S the set (S - k) \\cap {N} \\setminus S has asymptotic density 0. We prove that there exists a basic sequence Q where the set of numbers Q-normal of all orders in S but not Q-normal of all orders not in S has full Hausdorff dimension. If the function \

  19. Bas-Relief Modeling from Normal Images with Intuitive Styles.

    PubMed

    Ji, Zhongping; Ma, Weiyin; Sun, Xianfang

    2014-05-01

    Traditional 3D model-based bas-relief modeling methods are often limited to model-dependent and monotonic relief styles. This paper presents a novel method for digital bas-relief modeling with intuitive style control. Given a composite normal image, the problem discussed in this paper involves generating a discontinuity-free depth field with high compression of depth data while preserving or even enhancing fine details. In our framework, several layers of normal images are composed into a single normal image. The original normal image on each layer is usually generated from 3D models or through other techniques as described in this paper. The bas-relief style is controlled by choosing a parameter and setting a targeted height for them. Bas-relief modeling and stylization are achieved simultaneously by solving a sparse linear system. Different from previous work, our method can be used to freely design bas-reliefs in normal image space instead of in object space, which makes it possible to use any popular image editing tools for bas-relief modeling. Experiments with a wide range of 3D models and scenes show that our method can effectively generate digital bas-reliefs.
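The core trade-off described here — compressing the overall depth range while preserving fine detail — can be illustrated in one dimension by attenuating the large slopes recovered from surface normals before integrating them into a height profile (a toy sketch of the general idea, not the paper's sparse linear system formulation):

```python
def bas_relief_profile(normals, alpha=5.0):
    """1D bas-relief sketch: recover slopes dz/dx from unit normals
    (nx, nz), compress large slopes (shrinking the depth range while
    barely affecting small detail slopes), then integrate to heights."""
    slopes = [-nx / nz for nx, nz in normals]
    compressed = [g / (1.0 + alpha * abs(g)) for g in slopes]
    heights = [0.0]
    for g in compressed:
        heights.append(heights[-1] + g)  # cumulative integration
    return heights

# Flat, steeply tilted, then flat again: the steep step is flattened.
profile = bas_relief_profile([(0.0, 1.0), (0.6, 0.8), (0.0, 1.0)])
print(profile)
```

The attenuation function `g / (1 + alpha*|g|)` is one common choice: it is nearly linear for small gradients (details survive) but saturates for large ones (overall depth is compressed), mirroring the "high compression of depth data while preserving fine details" goal stated above.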

  20. Variance-reduction normalization technique for a Compton camera system

    NASA Astrophysics Data System (ADS)

    Kim, S. M.; Lee, J. S.; Kim, J. H.; Seo, H.; Kim, C. H.; Lee, C. S.; Lee, S. J.; Lee, M. C.; Lee, D. S.

    2011-01-01

    For an artifact-free dataset, pre-processing (known as normalization) is needed to correct the inherent non-uniformity of detection properties in the Compton camera, which consists of scattering and absorbing detectors. The detection efficiency depends on the non-uniform detection efficiencies of the scattering and absorbing detectors, the different incidence angles onto the detector surfaces, and the geometry of the two detectors. The correction factor for each detected position pair, referred to as the normalization coefficient, is expressed as a product of factors representing the various variations. The variance-reduction technique (VRT), a normalization method, was studied for a Compton camera. For the VRT, Compton list-mode data of a planar uniform source of 140 keV were generated with the GATE simulation tool. The projection data of a cylindrical software phantom were normalized with normalization coefficients determined from the non-uniformity map, and then reconstructed by an ordered-subset expectation-maximization algorithm. The coefficients of variation and percent errors of the 3-D reconstructed images showed that the VRT applied to the Compton camera provides enhanced image quality and an increased recovery rate of uniformity in the reconstructed image.
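The idea of expressing each detected pair's normalization coefficient as a product of per-detector factors, each estimated by averaging over many bins of a uniform-source scan, can be sketched on toy data (a simplified two-factor component model; the paper's actual factorization and geometry terms are not given in the abstract):

```python
def variance_reduced_norm(counts):
    """Component-based normalization sketch: model the efficiency of the
    detected pair (i, j) as eps_i * eta_j and estimate each factor by
    averaging over the other index. Averaging reduces the variance of the
    coefficients compared with inverting each noisy bin directly."""
    n_rows, n_cols = len(counts), len(counts[0])
    mean = sum(sum(row) for row in counts) / (n_rows * n_cols)
    # scattering-detector factors: row averages relative to the grand mean
    eps = [sum(row) / (n_cols * mean) for row in counts]
    # absorbing-detector factors: column averages relative to the grand mean
    eta = [sum(counts[i][j] for i in range(n_rows)) / (n_rows * mean)
           for j in range(n_cols)]
    # normalization coefficient = 1 / modeled efficiency
    return [[1.0 / (eps[i] * eta[j] * mean) for j in range(n_cols)]
            for i in range(n_rows)]

# Toy uniform-source acquisition (2 scatter x 2 absorber elements).
uniform_scan = [[90.0, 110.0], [45.0, 55.0]]
print(variance_reduced_norm(uniform_scan))
```

For exactly separable data like this toy example, multiplying each bin's coefficient by its count returns 1 everywhere — the normalized uniform source is flat, which is the defining property of a correct normalization map.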

  1. Effective normalization for copy number variation detection from whole genome sequencing.

    PubMed

    Janevski, Angel; Varadan, Vinay; Kamalakaran, Sitharthan; Banerjee, Nilanjana; Dimitrova, Nevenka

    2012-01-01

    Whole genome sequencing enables a high resolution view of the human genome and provides unique insights into genome structure at an unprecedented scale. There are a number of tools to infer copy number variation in the genome. These tools, while validated, also include a number of parameters that are configurable to the genome data being analyzed. These algorithms allow for normalization to account for individual and population-specific effects on individual genome CNV estimates, but the impact of these choices on the estimated CNVs is not well characterized. We evaluate in detail the effect of normalization methodologies in two CNV algorithms, FREEC and CNV-seq, using whole genome sequencing data from 8 individuals spanning four populations. We apply FREEC and CNV-seq to a sequencing data set consisting of 8 genomes. We use multiple configurations corresponding to different read-count normalization methodologies in FREEC, and statistically characterize the concordance of the CNV calls between FREEC configurations and the analogous output from CNV-seq. The normalization methodologies evaluated in FREEC are: GC content, mappability and control genome. We further stratify the concordance analysis within genic, non-genic, and a collection of validated variant regions. The GC content normalization methodology generates the highest number of altered copy number regions. Both mappability and control genome normalization reduce the total number and length of copy number regions. Mappability normalization yields Jaccard indices in the 0.07 - 0.3 range, whereas using a control genome normalization yields Jaccard index values around 0.4 with normalization based on GC content. The most critical impact of using mappability as a normalization factor is a substantial reduction of deletion CNV calls. The output of another method based on control genome normalization, CNV-seq, resulted in comparable CNV call profiles and substantial agreement in variable gene and CNV region calls.
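GC-content normalization, the first of the evaluated methodologies, is commonly implemented by dividing each window's read count by the typical count among windows of similar GC fraction, so that a copy-neutral window normalizes to about 1. A hedged sketch — the binning scheme and data are invented for illustration and are not FREEC's actual implementation:

```python
from collections import defaultdict
from statistics import median

def gc_normalize(read_counts, gc_fractions, bin_width=0.05):
    """GC-content normalization sketch: divide each window's read count
    by the median count of all windows with similar GC fraction, so a
    copy-neutral window has a normalized value near 1 regardless of its
    GC-driven coverage bias."""
    bins = defaultdict(list)
    for count, gc in zip(read_counts, gc_fractions):
        bins[round(gc / bin_width)].append(count)
    medians = {b: median(v) for b, v in bins.items()}
    return [count / medians[round(gc / bin_width)]
            for count, gc in zip(read_counts, gc_fractions)]

# Two GC strata with different baseline coverage; one window looks like a
# duplication (~2x) and one like a deletion (~0.5x) after normalization.
counts = [100, 104, 96, 200, 50, 52, 48, 25]
gcs    = [0.40, 0.40, 0.40, 0.40, 0.60, 0.60, 0.60, 0.60]
print([round(r, 2) for r in gc_normalize(counts, gcs)])
```

Note how the raw counts 200 and 50 are indistinguishable without GC correction, yet after normalization one stands out as a gain and the other as perfectly neutral — this is why the choice of normalization methodology changes the CNV calls so markedly.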

  2. "I Treat Him as a Normal Patient": Unveiling the Normalization Coping Strategy Among Formal Caregivers of Persons With Dementia and Its Implications for Person-Centered Care.

    PubMed

    Bentwich, Miriam Ethel; Dickman, Nomy; Oberman, Amitai; Bokek-Cohen, Ya'arit

    2017-11-01

    Currently, 47 million people worldwide have dementia, often requiring paid care by formal caregivers. Research regarding family caregivers suggests normalization as a model for coping with negative emotional outcomes in caring for a person with dementia (PWD). The study aims to explore whether a normalization coping mechanism exists among formal caregivers, reveal differences in its application among cross-cultural caregivers, and examine how this coping mechanism may be related to implementing person-centered care for PWDs. We performed content analysis of interviews with 20 formal caregivers attending to PWDs, drawn from three cultural groups (Jews born in Israel [JI], Arabs born in Israel [AI], Russian immigrants [RI]). We extracted five normalization modes, revealing that AI caregivers had substantially more utterances of normalization expressions than their colleagues. The normalization modes most commonly expressed by AI caregivers relate to the personhood of PWDs. These normalization modes may enhance formal caregivers' ability to employ person-centered care.

  3. Color normalization for robust evaluation of microscopy images

    NASA Astrophysics Data System (ADS)

    Švihlík, Jan; Kybic, Jan; Habart, David

    2015-09-01

    This paper deals with color normalization of microscopy images of Langerhans islets in order to increase robustness of the islet segmentation to illumination changes. The main application is automatic quantitative evaluation of the islet parameters, useful for determining the feasibility of islet transplantation in diabetes. First, background illumination inhomogeneity is compensated and a preliminary foreground/background segmentation is performed. The color normalization itself is done in either lαβ or logarithmic RGB color spaces, by comparison with a reference image. The color-normalized images are segmented using color-based features and pixel-wise logistic regression, trained on manually labeled images. Finally, relevant statistics such as the total islet area are evaluated in order to determine the success likelihood of the transplantation.
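The color normalization step described above compares each image with a reference image in lαβ or log-RGB space. One common way to realize such a comparison is a Reinhard-style per-channel mean/standard-deviation transfer; the abstract does not state the exact transfer used, so the sketch below is an assumption, and conversion to and from the color space is omitted:

```python
import numpy as np

def normalize_to_reference(image, reference):
    """Match each channel's mean and standard deviation to a reference image.

    Both inputs are float arrays of shape (H, W, C), already converted to the
    working color space (e.g. lalphabeta or log-RGB). Each channel of `image`
    is centered, rescaled to the reference channel's spread, and shifted to
    the reference channel's mean.
    """
    out = np.empty_like(image, dtype=float)
    for c in range(image.shape[2]):
        src = image[..., c]
        ref = reference[..., c]
        scale = ref.std() / (src.std() + 1e-12)  # guard against zero spread
        out[..., c] = (src - src.mean()) * scale + ref.mean()
    return out
```

After this step, color-based features extracted from different acquisitions share a common scale, which is what makes the downstream logistic-regression segmentation robust to illumination changes.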

  4. Selective attention in normal and impaired hearing.

    PubMed

    Shinn-Cunningham, Barbara G; Best, Virginia

    2008-12-01

    A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention.

  5. Kompressionstherapie - Versorgungspraxis: Informationsstand von Patienten mit Ulcus cruris venosum.

    PubMed

    Protz, Kerstin; Heyer, Kristina; Dissemond, Joachim; Temme, Barbara; Münter, Karl-Christian; Verheyen-Cronau, Ida; Klose, Katharina; Hampel-Kalthoff, Carsten; Augustin, Matthias

    2016-12-01

    Compression therapy is one pillar of the causal treatment of patients with venous leg ulcers (ulcus cruris venosum). It supports healing, reduces pain and recurrences, and increases quality of life. To date, hardly any scientific data exist on the state of care and the condition-specific knowledge of patients with venous leg ulcers. Standardized questionnaires were completed anonymously by patients with venous leg ulcers at first presentation in 55 nursing services, 32 medical practices, four wound centers and wound clinics, and one nursing therapy center across Germany. In total, 177 patients participated (mean age 69.4 years; 75.1 % women). A florid venous leg ulcer had been present for a mean of 17 months. 31.1 % had no compression therapy, 40.1 % bandages, and 28.8 % stockings. Among stocking wearers, 13.7 % had compression class III, 64.7 % class II, and 19.6 % class I. 70.6 % put the stockings on after getting up; 21.1 % wore them day and night. The stockings caused discomfort in 39.2 %. Only 11.7 % had a donning aid. Bandages were worn for a mean of 40.7 weeks and were applied without padding in 69 %. In only 2.8 % were ankle and calf circumference measured to monitor treatment success. 45.9 % performed venous exercises. One third had no compression therapy at all, although it is a basic measure in the treatment of venous leg ulcers. Moreover, given the long duration of the ulcerations, the correct selection and application of compression must be questioned. Deeper expertise among users and prescribers, as well as patient education, are required. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  6. Normal mode study of the earth's rigid body motions

    NASA Technical Reports Server (NTRS)

    Chao, B. F.

    1983-01-01

    In this paper it is shown that the earth's rigid body (rb) motions can be represented by an analytical set of eigensolutions to the equation of motion for elastic-gravitational free oscillations. Thus each degree of freedom in the rb motion is associated with a rb normal mode. Cases of both nonrotating and rotating earth models are studied, and it is shown that the rb modes are incorporated neatly into the earth's system of normal modes of free oscillation. The excitation formulas for the rb modes are also obtained, based on normal mode theory. Physical implications of the results are summarized and the fundamental differences between rb modes and seismic modes are emphasized. In particular, it is ascertained that the Chandler wobble, being one of the rb modes belonging to the rotating earth, can be studied using the established theory of normal modes.

  7. Normal mode Rossby waves observed in the upper stratosphere

    NASA Technical Reports Server (NTRS)

    Hirooka, T.; Hirota, I.

    1985-01-01

    In recent years, observational evidence has been obtained for westward traveling planetary waves in the middle atmosphere with the aid of global data from satellites. There is no doubt that a fair portion of the observed traveling waves can be understood as the manifestation of the normal mode Rossby waves that are theoretically derived from tidal theory. Some observational aspects of the structure and behavior of the normal mode Rossby waves in the upper stratosphere are reported. The data used are the global stratospheric geopotential thickness and height analyses, which are derived mainly from the Stratospheric Sounding Units (SSUs) on board TIROS-N and NOAA satellites. A clear example of the influence of a normal mode Rossby wave on the mean flow is reported. The mechanism considered is interference between the normal mode Rossby wave and the quasi-stationary wave.

  8. Advanced Very High Resolution Radiometer Normalized Difference Vegetation Index Composites

    USGS Publications Warehouse

    2005-01-01

    The Advanced Very High Resolution Radiometer (AVHRR) is a broad-band scanner with four to six bands, depending on the model. The AVHRR senses in the visible, near-, middle-, and thermal-infrared portions of the electromagnetic spectrum. This sensor is carried on a series of National Oceanic and Atmospheric Administration (NOAA) Polar Orbiting Environmental Satellites (POES), beginning with the Television InfraRed Observation Satellite (TIROS-N) in 1978. Since 1989, the United States Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS) has been mapping the vegetation condition of the United States and Alaska using satellite information from the AVHRR sensor. The vegetation condition composites, more commonly called greenness maps, are produced every week using the latest information on the growth and condition of the vegetation. One of the most important aspects of USGS greenness mapping is the historical archive of information dating back to 1989. This historical record has allowed the USGS to determine a 'normal' vegetation condition. As a result, it is possible to compare the current week's vegetation condition with normal vegetation conditions. An above-normal condition could indicate wetter or warmer than normal conditions, while a below-normal condition could indicate colder or drier than normal conditions. The interpretation of departure from normal will depend on the season and geography of a region.
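A greenness composite of this kind rests on the standard Normalized Difference Vegetation Index, NDVI = (NIR − Red)/(NIR + Red), computed from AVHRR channel 2 (near-infrared) and channel 1 (red); the departure product then compares the current composite with the historical normal. A minimal sketch (function names are illustrative, not the USGS processing chain):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectance (AVHRR channels 2 and 1, respectively). Ranges from -1 to 1;
    dense green vegetation gives high positive values."""
    return (nir - red) / (nir + red)

def departure_from_normal(current_ndvi, normal_ndvi):
    """Signed departure of this week's NDVI composite from the historical
    'normal' for the same week; positive suggests greener than normal."""
    return current_ndvi - normal_ndvi
```

Applied per pixel over a weekly composite, the sign and magnitude of the departure are what the greenness maps visualize.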

  9. Teaching Normal Birth Interactively

    PubMed Central

    Hotelling, Barbara A.

    2004-01-01

    In this column, the author provides examples of teaching strategies that childbirth educators may utilize to illustrate each of the six care practices supported by Lamaze International to promote normal birth: labor begins on its own, freedom of movement throughout labor, continuous labor support, no routine interventions, non-supine (e.g., upright or side-lying) positions for birth, and no separation of mother and baby with unlimited opportunity for breastfeeding. PMID:17273389

  10. Cross Correlation versus Normalized Mutual Information on Image Registration

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Tilton, James C.; Lin, Guoqing

    2016-01-01

    This is the first study to quantitatively assess and compare cross correlation and normalized mutual information methods used to register images at subpixel scale. The study shows that the normalized mutual information method is less sensitive than cross correlation to unaligned edges caused by spectral response differences. This characteristic makes normalized mutual information a better candidate for band-to-band registration. Improved band-to-band registration in the data from satellite-borne instruments will result in improved retrievals of key science measurements such as cloud properties, vegetation, snow, and fire.
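Normalized mutual information itself has a standard definition: with marginal entropies H(A), H(B) and joint entropy H(A, B), NMI = (H(A) + H(B)) / H(A, B), which equals 2 for identical images and approaches 1 as they become independent. A histogram-based sketch (the bin count and implementation details are assumptions, not the study's configuration):

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    """NMI of two images: (H(A) + H(B)) / H(A, B), estimated from a joint
    intensity histogram. `a` and `b` are arrays of equal size."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()          # joint probability estimate
    px = pxy.sum(axis=1)               # marginal of A
    py = pxy.sum(axis=0)               # marginal of B

    def entropy(p):
        p = p[p > 0]                   # 0 * log 0 := 0
        return -np.sum(p * np.log(p))

    return (entropy(px) + entropy(py)) / entropy(pxy)
```

In a registration loop, one band is shifted over a grid of subpixel offsets and the offset maximizing NMI (or, for the competing method, cross correlation) is taken as the band-to-band misregistration.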

  11. Berechnung verkehrlicher Substitutionseffekte im Personenverkehr bei Online-Shopping

    NASA Astrophysics Data System (ADS)

    Nerlich, Mark R.; Schiffner, Felix; Vogt, Walter; Rauh, Jürgen; Breidenbach, Petra

    For goods for daily, medium-term, and long-term needs, as well as for the example of DIY-store articles, the potential passenger transport demand of shopping activities is estimated quantitatively. The algorithms developed treat pre-purchase information gathering and the actual purchase, i.e. the acquisition of a good, separately. Information activities are particularly important for higher-value goods and are therefore also relevant for transport. As the calculations show, online shopping saves car travel for both information gathering and shopping. The necessary input data, such as differentiated information and shopping frequencies as well as transport parameters on mode choice, distances, and trip chaining, were obtained from the authors' own surveys.

  12. Quadratic Finite Element Method for 1D Deterministic Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tolar, Jr., D R; Ferguson, J M

    2004-01-06

    In the discrete ordinates, or SN, numerical solution of the transport equation, both the spatial (r) and angular (Ω) dependences of the angular flux ψ(r, Ω) are modeled discretely. While significant effort has been devoted toward improving the spatial discretization of the angular flux, we focus on improving the angular discretization of ψ(r, Ω). Specifically, we employ a Petrov-Galerkin quadratic finite element approximation for the differencing of the angular variable (μ) in developing the one-dimensional (1D) spherical geometry SN equations. We develop an algorithm that shows faster convergence with angular resolution than conventional SN algorithms.

  13. Organisationsaspekte in der Umsetzung

    NASA Astrophysics Data System (ADS)

    Balck, Henning; Bungard, Walter; Hofmann, Karsten; Ganz, Walter; Schwenker, Burkhard; Hanßen, Dirk; Meindl, Rudolf; Schloske, Alexander; Thieme, Paul; Teufel, Peter

    Structural discontinuities are one of the main reasons why many companies find it difficult to modernize their organizational form and, above all, to adapt to turbulent market conditions. Classic examples of such discontinuities are the split between structural and process organization, between production and services, or between planning and execution. An effective approach to overcoming these splits is a kind of reconciliation pattern: the polar organization. The essential elements of this organizational form are its network character, cooperative interaction, a high intensity of communication, and a polar coupling of the communicating partners or, in more abstract terms, the organized balance of opposites critical to success, such as cost and quality.

  14. Return to normality after a radiological emergency.

    PubMed

    Lochard, J; Prêtre, S

    1995-01-01

    Some preliminary considerations are presented from the management of post-accident situations involving large-scale, heavily contaminated land. The return to normal, or at least acceptable, living conditions as soon as reasonably achievable, and the prevention of the possible emergence of a post-accident crisis, are of key importance. A scheme is proposed for understanding the dynamics of the various phases after an accident. An attempt is made to characterize some of the parameters driving the acceptability of post-accident situations. Strategies for returning to normal living conditions in contaminated areas are considered.

  15. The normalization of deviance in healthcare delivery

    PubMed Central

    Banja, John

    2009-01-01

    Many serious medical errors result from violations of recognized standards of practice. Over time, even egregious violations of standards of practice may become “normalized” in healthcare delivery systems. This article describes what leads to this normalization and explains why flagrant practice deviations can persist for years, despite the importance of the standards at issue. This article also provides recommendations to aid healthcare organizations in identifying and managing unsafe practice deviations before they become normalized and pose genuine risks to patient safety, quality care, and employee morale. PMID:20161685

  16. Reversible grasp reflexes in normal pressure hydrocephalus.

    PubMed

    Thomas, Rhys H; Bennetto, Luke; Silva, Mark T

    2009-05-01

    We present two cases of normal pressure hydrocephalus in combination with grasp reflexes. In both cases the grasp reflexes disappeared following high volume cerebrospinal fluid removal. In one of the cases the grasp reflexes returned over a period of weeks but again resolved following definitive cerebrospinal fluid shunting surgery, and remained absent until final follow up at 9 months. We hypothesise that resolving grasp reflexes following high volume CSF removal has both diagnostic and prognostic value in normal pressure hydrocephalus, encouraging larger studies on the relevance of primitive reflexes in NPH.

  17. Normal and abnormal human vestibular ocular function

    NASA Technical Reports Server (NTRS)

    Peterka, R. J.; Black, F. O.

    1986-01-01

    The major motivation of this research is to understand the role the vestibular system plays in sensorimotor interactions which result in spatial disorientation and motion sickness. A second goal was to explore the range of abnormality as it is reflected in quantitative measures of vestibular reflex responses. The results of a study of vestibular reflex measurements in normal subjects and preliminary results in abnormal subjects are presented in this report. Statistical methods were used to define the range of normal responses, and determine age related changes in function.

  18. Quantitative RNFL attenuation coefficient measurements by RPE-normalized OCT data

    NASA Astrophysics Data System (ADS)

    Vermeer, K. A.; van der Schoot, J.; Lemij, H. G.; de Boer, J. F.

    2012-03-01

    We demonstrate significantly different scattering coefficients of the retinal nerve fiber layer (RNFL) between normal and glaucoma subjects. In clinical care, SD-OCT is routinely used to assess the RNFL thickness for glaucoma management. In this way, the full OCT data set is conveniently reduced to an easy-to-interpret output, matching results from older (non-OCT) instruments. However, OCT provides more data, such as the signal strength itself, which is due to backscattering in the retinal layers. For quantitative analysis, this signal should be normalized to adjust for local differences in the intensity of the beam that reaches the retina. In this paper, we introduce a model that relates the OCT signal to the attenuation coefficient of the tissue. The average RNFL signal (within an A-line) was then normalized based on the observed RPE signal, resulting in normalized RNFL attenuation coefficient maps. These maps showed local defects matching those found in thickness data. The average (normalized) RNFL attenuation coefficient of a fixed band around the optic nerve head was significantly lower in glaucomatous eyes than in normal eyes (3.0 mm⁻¹ vs. 4.9 mm⁻¹, P < 0.01, Mann-Whitney test).

  19. Evaluating acoustic speaker normalization algorithms: evidence from longitudinal child data.

    PubMed

    Kohn, Mary Elizabeth; Farrington, Charlie

    2012-03-01

    Speaker vowel formant normalization, a technique that controls for variation introduced by physical differences between speakers, is necessary in variationist studies to compare speakers of different ages, genders, and physiological makeup in order to understand non-physiological variation patterns within populations. Many algorithms have been established to reduce variation introduced into vocalic data from physiological sources. The lack of real-time studies tracking the effectiveness of these normalization algorithms from childhood through adolescence inhibits exploration of child participation in vowel shifts. This analysis compares normalization techniques applied to data collected from ten African American children across five time points. Linear regressions compare the reduction in variation attributable to age and gender for each speaker for the vowels BEET, BAT, BOT, BUT, and BOAR. A normalization technique is successful if it maintains variation attributable to a reference sociolinguistic variable, while reducing variation attributable to age. Results indicate that normalization techniques which rely on both a measure of central tendency and range of the vowel space perform best at reducing variation attributable to age, although some variation attributable to age persists after normalization for some sections of the vowel space. © 2012 Acoustical Society of America
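One well-known member of the family the abstract describes, techniques relying on a speaker's own measure of central tendency and range, is Lobanov z-score normalization. Whether Lobanov's method is among the algorithms tested here is not stated in the abstract, so the sketch below is illustrative only:

```python
import numpy as np

def lobanov_normalize(formants):
    """Lobanov (z-score) speaker normalization of vowel formant values.

    `formants` maps a formant name ("F1", "F2", ...) to raw Hz measurements
    for one speaker at one time point. Each formant is centered and scaled by
    that speaker's own mean and standard deviation, removing physiological
    (vocal-tract size) differences while preserving the relative positions of
    vowels within the speaker's vowel space.
    """
    return {
        name: (np.asarray(values, dtype=float) - np.mean(values)) / np.std(values)
        for name, values in formants.items()
    }
```

In a longitudinal design like the one described, the normalization is applied per speaker per time point; a technique succeeds if, after normalization, regression coefficients for age shrink while those for the reference sociolinguistic variable survive.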

  20. Betriebsführung multimodaler Energiesysteme

    NASA Astrophysics Data System (ADS)

    Mackensen, Reinhard

    The transformation of the energy system from a centrally structured, unidirectionally oriented landscape separated into distinct sectors toward a comprehensive, multimodal, decentralized, and flexible landscape of producers and consumers is taking place on several levels. A constant boundary condition of this upheaval is compliance with the sub-goals of security of supply, economic viability, and efficiency. In detail, the transformation manifests itself in a diversification of the actor landscape through the mechanisms of unbundling. Furthermore, generation is being decentralized, with large, mostly fossil-fired power plant technology being replaced by a multitude of decentralized, mostly renewable generators. This change has two main consequences. First, the decentralized, spatially distributed generators impose new requirements on energy exchange, for example the extension of the power grids for bidirectional energy flows; second, coordination mechanisms become necessary that balance fluctuating generation against consumption in such a way that electrical grid restrictions, the quality of supply, and aspects of energy efficiency, and thus economic viability, are all taken into account. Possible answers to the questions raised by this view lie in the conception of a multimodal energy system, i.e. in considering the electricity, heat, and transport sectors as a whole. This chapter sets out mechanisms and ways in which such a concept can be designed and how such complex systems can be operated in practice.

  1. Developing Normal Turns-Amplitude Clouds for Upper and Lower Limbs.

    PubMed

    Jabre, Joe F; Nikolayev, Sergey G; Babayev, Michael B; Chindilov, Denis V; Muravyov, Anatoly Y

    2016-10-01

    Turns and amplitude analysis (T&A) is a frequently used method for automatic EMG interference pattern analysis. T&A normal values have only been developed for a limited number of muscles. Our objective was to obtain normal T&A clouds for upper and lower extremity muscles for which no normal values exist in the literature. The T&A normative data were obtained using concentric needle electrodes from 68 men and 56 women aged 20 to 60 years. Normal upper and lower extremity T&A clouds were obtained and are presented in this article. The T&A normal values collected in this study may be used to detect neurogenic and myopathic abnormalities in men and women at low-to-moderate muscle contractions. The effect of turns-amplitude data obtained at high levels of muscle contraction, and its potential to falsely show neurogenic abnormalities, are discussed.

  2. Informative graphing of continuous safety variables relative to normal reference limits.

    PubMed

    Breder, Christopher D

    2018-05-16

    Interpreting graphs of continuous safety variables can be complicated because differences in age, gender, and testing-site methodologies may give rise to multiple reference limits. Furthermore, data below the lower limit of normal are compressed relative to points above the upper limit of normal. The objective of this study is to develop a graphing technique that addresses these issues and is visually intuitive. A mock dataset with multiple reference ranges is initially used to develop the graphing technique. Formulas are developed for conditions where data are above the upper limit of normal, normal, below the lower limit of normal, and below the lower limit of normal when the data value equals zero. After the formulae are developed, an anonymized dataset from an actual set of trials for an approved drug is evaluated, comparing the technique developed in this study to standard graphical methods. Formulas are derived for the novel graphing method based on multiples of the normal limits. The formula for values scaled between the upper and lower limits of normal is a novel application of a readily available scaling formula. The formula for the lower limit of normal is novel and addresses the issue of this value potentially being indeterminate when the result to be scaled as a multiple is zero. The formulae and graphing method described in this study provide a visually intuitive way to graph continuous safety data, including laboratory values and vital sign data.
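The study's actual formulas are not reproduced in the abstract, so the following is a hypothetical piecewise scaling that only illustrates the general idea: values above the upper limit of normal (ULN) plotted as multiples of the ULN, in-range values min-max scaled into [0, 1], and values below the lower limit of normal (LLN) expressed as a negative fraction of the LLN so that a value of zero maps to a finite point rather than an indeterminate ratio:

```python
def scale_to_normal_limits(x, lln, uln):
    """Map a lab value onto one 'multiples of normal' axis (hypothetical sketch,
    not the paper's published formulas).

    * x >= ULN : positive multiples of the ULN (>= 1)
    * LLN..ULN : min-max scaled into [0, 1]
    * x <  LLN : shortfall as a negative fraction of the LLN; x == 0 -> -1,
                 avoiding the indeterminate ratio LLN / 0
    The three pieces agree at the limits, so the mapping is continuous.
    """
    if x >= uln:
        return x / uln
    if x >= lln:
        return (x - lln) / (uln - lln)
    return (x - lln) / lln
```

Because each patient's value is scaled against that patient's own reference limits, results from sites with different reference ranges land on a single shared axis.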

  3. Online irrigation service for fruit and vegetable crops at the farmer's site

    NASA Astrophysics Data System (ADS)

    Janssen, W.

    2009-09-01

    Agrowetter irrigation advice, offered by the German Weather Service (Offenbach), is a product that calculates the present soil moisture, as well as the soil moisture to be expected over the next 5 days, for over 30 different crops. It is based on a water balance model and provides targeted recommendations for irrigation matched to the soil, in order to avoid infiltration and, as a consequence, the undesired movement of nitrate and plant-protection agents into the groundwater. This interactive online system takes into account the user's individual circumstances, such as crop and soil characteristics and the precipitation and irrigation amounts at the user's site. Each user may run up to 16 different enquiries simultaneously (different crops or different emergence dates) and can calculate the individual soil moisture for each field with a maximum effort of only 5 minutes per week. The sources of water are precipitation and irrigation, whereas water losses occur through evapotranspiration and infiltration of water into the ground. Evapotranspiration is calculated by multiplying a reference evapotranspiration (maximum evapotranspiration over grass) by crop coefficients (kc values) developed by the Geisenheim Research Centre, Vegetable Crops Branch; the kc values depend on the crop and its development stage. The reference evapotranspiration is calculated with the Penman method, based on daily values from a base weather station the user has chosen (out of around 500 weather stations). After choosing a crop and soil type, the user must manually enter the precipitation measured at the site, the irrigation water inputs, and the dates of a few phenological stages. Economic aspects can be considered by changing the soil moisture values at which irrigation recommendations start, from optimal to merely sufficient plant supply.
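The water balance model described above (gains from precipitation and irrigation; losses from crop evapotranspiration, computed as reference ET times a Geisenheim kc value, and from infiltration of surplus water) can be sketched as a single daily update step. The function, its parameters, and the field-capacity threshold are illustrative assumptions, not Agrowetter's internals:

```python
def update_soil_moisture(soil_moisture, precipitation, irrigation,
                         reference_et, kc, field_capacity):
    """One daily step of a simple soil water balance (all values in mm).

    Crop evapotranspiration is reference ET (over grass) scaled by the crop
    coefficient kc for the current development stage. Water beyond field
    capacity is treated as infiltration past the root zone and lost, which
    is exactly the loss the advisory service tries to avoid.
    """
    crop_et = reference_et * kc
    balance = soil_moisture + precipitation + irrigation - crop_et
    infiltration = max(0.0, balance - field_capacity)  # drains past the root zone
    return max(0.0, balance - infiltration)
```

Iterating this step over forecast days, and flagging days on which the balance falls below a crop-specific trigger level, reproduces the shape of an irrigation recommendation.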

  4. A normal ano-genital exam: sexual abuse or not?

    PubMed

    Hornor, Gail

    2010-01-01

    Sexual abuse is a problem of epidemic proportions in the United States. Pediatric nurse practitioners (PNPs) are at the forefront of providing care to children and families. The PNP is in a unique position to educate patients and families regarding sexual abuse and dispel common myths associated with sexual abuse. One such myth is that a normal ano-genital examination is synonymous with the absence of sexual abuse. This article will provide primary care providers, including PNPs, with a framework for understanding why a normal ano-genital examination does not negate the possibility of sexual abuse/assault. Normal ano-genital anatomy, changes that occur with puberty, and physical properties related to the genitalia and anus will be discussed. Photos will provide visualization of both normal variants of the pre-pubertal hymen and genitalia as well as changes that occur with puberty. Implications for practice for PNPs will be discussed.

  5. TÜV - Zertifizierungen in der Life Science Branche

    NASA Astrophysics Data System (ADS)

    Schaff, Peter; Gerbl-Rieger, Susanne; Kloth, Sabine; Schübel, Christian; Daxenberger, Andreas; Engler, Claus

    Life sciences [1] are a global field of innovation with applications in the biological and medical sciences and in the pharmaceutical, chemical, cosmetics, and food industries. The sector is characterized by a strongly interdisciplinary orientation, applying scientific findings and using source materials from modern biology, chemistry, and human medicine, together with targeted, market-oriented work.

  6. Doktor Johannes Häringshauser - Was seine Bücher über ihn erzählen.

    NASA Astrophysics Data System (ADS)

    Feola, Vittoria

    2009-06-01

    The library of Dr. Johannes Häringshauser (1603-1642) identifies its owner as a physician and scholar of broad intellectual horizons. His interest in astronomy and astrology deserves particular mention. Besides works directly connected with his studies in Vienna and Padua and with the needs of a physician (classics of medicine as well as then-current medical publications), his book collection covers a rich range of subjects: theology, philosophy, philology, politics, history, and geography.

  7. Quasi-Normal Modes of Stars and Black Holes.

    PubMed

    Kokkotas, Kostas D; Schmidt, Bernd G

    1999-01-01

    Perturbations of stars and black holes have been one of the main topics of relativistic astrophysics for the last few decades. They are of particular importance today, because of their relevance to gravitational wave astronomy. In this review we present the theory of quasi-normal modes of compact objects from both the mathematical and astrophysical points of view. The discussion includes perturbations of black holes (Schwarzschild, Reissner-Nordström, Kerr and Kerr-Newman) and relativistic stars (non-rotating and slowly-rotating). The properties of the various families of quasi-normal modes are described, and numerical techniques for calculating quasi-normal modes reviewed. The successes, as well as the limits, of perturbation theory are presented, and its role in the emerging era of numerical relativity and supercomputers is discussed.

  8. IRAS far-infrared colours of normal stars

    NASA Technical Reports Server (NTRS)

    Waters, L. B. F. M.; Cote, J.; Aumann, H. H.

    1987-01-01

    The analysis of IRAS observations at 12, 25, 60 and 100 microns of bright stars of spectral type O to M is presented. The objective is to identify the 'normal' stellar population and to characterize it in terms of the relationships between (B-V) and (V-[12]), between (R-I) and (V-[12]), and as a function of spectral type and luminosity class. A well-defined relation is found between the color of normal stars in the visual (B-V), (R-I) and in the IR, which does not depend on luminosity class. Using the (B-V), (V-[12]) relation for normal stars, it is found that B and M type stars show a large fraction of deviating stars, mostly with IR excess that is probably caused by circumstellar material. A comparison of IRAS colors with the Johnson colors as a function of spectral type shows good agreement except for the K0 to M5 type stars. The results will be useful in identifying the deviating stars detected with IRAS.

  9. Effect of transforming growth factor-beta1 on embryonic and posthatch muscle growth and development in normal and low score normal chicken.

    PubMed

    Li, X; Velleman, S G

    2009-02-01

    During skeletal muscle development, transforming growth factor-beta1 (TGF-beta1) is a potent inhibitor of muscle cell proliferation and differentiation. The TGF-beta1 signal is carried by Smad proteins into the cell nucleus, inhibiting the expression of key myogenic regulatory factors including MyoD and myogenin. However, the molecular mechanism by which TGF-beta1 inhibits muscle cell proliferation and differentiation has not been well documented in vivo. The present study investigated the effect of TGF-beta1 on in vivo skeletal muscle growth and development. A chicken line, Low Score Normal (LSN) with reduced muscling and upregulated TGF-beta1 expression, was used and compared to a normal chicken line. The injection of TGF-beta1 at embryonic day (ED) 3 significantly reduced the pectoralis major (p. major) muscle weight in the normal birds at 1 wk posthatch, whereas no significant difference was observed in the LSN birds. The difference between normal and LSN birds in response to TGF-beta1 is likely due to different levels of endogenous TGF-beta1 where the LSN birds have increased TGF-beta1 expression in their p. major muscle at both 17 ED and 6 wk posthatch. Smad3 expression was reduced by TGF-beta1 from 10 ED to 1 wk posthatch in normal p. major muscle. Unlike Smad3, Smad7 expression was not significantly affected by TGF-beta1 until posthatch in both normal and LSN p. major muscle. Expression of MyoD was reduced 35% by TGF-beta1 during embryonic development in normal p. major muscle, whereas LSN p. major muscle showed a delayed decrease at 1 d posthatch in MyoD expression in response to the TGF-beta1 treatment. Myogenin expression was reduced 29% by TGF-beta1 after hatch in normal p. major muscle. In LSN p. major muscle, TGF-beta1 treatment significantly decreased myogenin expression by 43% at 1 d posthatch and 32% at 1 wk posthatch. These data suggested that TGF-beta1 reduced p. 
major muscle growth by inhibiting MyoD and myogenin expression during both embryonic and posthatch development.

  10. What is normal in normal aging? Effects of Aging, Amyloid and Alzheimer’s Disease on the Cerebral Cortex and the Hippocampus

    PubMed Central

    Fjell, Anders M.; McEvoy, Linda; Holland, Dominic; Dale, Anders M.; Walhovd, Kristine B

    2015-01-01

    What can be expected in normal aging, and where does normal aging stop and pathological neurodegeneration begin? With the slow progression of age-related dementias such as Alzheimer’s Disease (AD), it is difficult to distinguish age-related changes from effects of undetected disease. We review recent research on changes of the cerebral cortex and the hippocampus in aging and the borders between normal aging and AD. We argue that prominent cortical reductions are evident in fronto-temporal regions in elderly even with low probability of AD, including regions overlapping the default mode network. Importantly, these regions show high levels of amyloid deposition in AD, and are both structurally and functionally vulnerable early in the disease. This normalcy-pathology homology is critical to understand, since aging itself is the major risk factor for sporadic AD. Thus, rather than necessarily reflecting early signs of disease, these changes may be part of normal aging, and may inform on why the aging brain is so much more susceptible to AD than is the younger brain. We suggest that regions characterized by a high degree of life-long plasticity are vulnerable to detrimental effects of normal aging, and that this age-vulnerability renders them more susceptible to additional, pathological AD-related changes. We conclude that it will be difficult to understand AD without understanding why it preferentially affects older brains, and that we need a model that accounts for age-related changes in AD-vulnerable regions independently of AD-pathology. PMID:24548606

  11. The use of normal forms for analysing nonlinear mechanical vibrations

    PubMed Central

    Neild, Simon A.; Champneys, Alan R.; Wagg, David J.; Hill, Thomas L.; Cammarano, Andrea

    2015-01-01

    A historical introduction is given of the theory of normal forms for simplifying nonlinear dynamical systems close to resonances or bifurcation points. The specific focus is on mechanical vibration problems, described by finite degree-of-freedom second-order-in-time differential equations. A recent variant of the normal form method, which respects the specific structure of such models, is recalled. It is shown how this method can be placed within the context of the general theory of normal forms provided the damping and forcing terms are treated as unfolding parameters. The approach is contrasted with the alternative theory of nonlinear normal modes (NNMs), which is argued to be problematic in the presence of damping. The efficacy of the normal form method is illustrated on a model of the vibration of a taut cable, which is geometrically nonlinear. It is shown how the method is able to accurately predict NNM shapes and their bifurcations. PMID:26303917

  12. Normal forms for reduced stochastic climate models

    PubMed Central

    Majda, Andrew J.; Franzke, Christian; Crommelin, Daan

    2009-01-01

    The systematic development of reduced low-dimensional stochastic climate models from observations or comprehensive high-dimensional climate models is an important topic for atmospheric low-frequency variability, climate sensitivity, and improved extended range forecasting. Here techniques from applied mathematics are utilized to systematically derive normal forms for reduced stochastic climate models for low-frequency variables. The use of a few Empirical Orthogonal Functions (EOFs) (also known as Principal Component Analysis, Karhunen–Loève and Proper Orthogonal Decomposition) depending on observational data to span the low-frequency subspace requires the assessment of dyad interactions besides the more familiar triads in the interaction between the low- and high-frequency subspaces of the dynamics. It is shown below that the dyad and multiplicative triad interactions combine with the climatological linear operator interactions to simultaneously produce both strong nonlinear dissipation and Correlated Additive and Multiplicative (CAM) stochastic noise. For a single low-frequency variable the dyad interactions and climatological linear operator alone produce a normal form with CAM noise from advection of the large scales by the small scales and simultaneously strong cubic damping. These normal forms should prove useful for developing systematic strategies for the estimation of stochastic models from climate data. As an illustrative example the one-dimensional normal form is applied below to low-frequency patterns such as the North Atlantic Oscillation (NAO) in a climate model. The results here also illustrate the shortcomings of a recent linear scalar CAM noise model proposed elsewhere for low-frequency variability. PMID:19228943
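
    The one-dimensional normal form described above combines strong cubic damping with correlated additive and multiplicative (CAM) noise. The Euler-Maruyama sketch below simulates a generic scalar SDE of that qualitative type; the equation and all coefficients are illustrative assumptions, not the paper's derived normal form or fitted climate parameters.

```python
import numpy as np

# Illustrative scalar SDE with cubic damping and CAM-style noise:
#   dx = (F + a*x + b*x^2 - c*x^3) dt + (A - B*x) dW1 + sigma dW2
# (hypothetical coefficients; (A - B*x) is the state-dependent,
#  i.e. multiplicative, noise term; sigma dW2 is purely additive)
rng = np.random.default_rng(1)
F, a, b, c = 0.0, -0.5, 0.2, 1.0
A, B, sigma = 0.3, 0.4, 0.2

dt, nsteps = 1e-3, 100_000
x = np.empty(nsteps + 1)
x[0] = 0.0
dW = rng.normal(scale=np.sqrt(dt), size=(nsteps, 2))  # two Wiener increments
for n in range(nsteps):
    drift = F + a * x[n] + b * x[n] ** 2 - c * x[n] ** 3
    x[n + 1] = x[n] + drift * dt + (A - B * x[n]) * dW[n, 0] + sigma * dW[n, 1]

# The cubic damping keeps trajectories bounded despite the noise.
print(np.isfinite(x).all(), abs(x).max() < 10)
```

The same loop with `c = 0` (no cubic damping) would let rare large excursions grow, which is why the strong nonlinear dissipation produced by the normal form matters.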

  13. Selective Attention in Normal and Impaired Hearing

    PubMed Central

    Shinn-Cunningham, Barbara G.; Best, Virginia

    2008-01-01

    A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention. PMID:18974202

  14. Data Science Bowl Launched to Improve Lung Cancer Screening | Division of Cancer Prevention

    Cancer.gov


  15. CUILESS2016: a clinical corpus applying compositional normalization of text mentions.

    PubMed

    Osborne, John D; Neu, Matthew B; Danila, Maria I; Solorio, Thamar; Bethard, Steven J

    2018-01-10

    Traditionally text mention normalization corpora have normalized concepts to single ontology identifiers ("pre-coordinated concepts"). Less frequently, normalization corpora have used concepts with multiple identifiers ("post-coordinated concepts") but the additional identifiers have been restricted to a defined set of relationships to the core concept. This approach limits the ability of the normalization process to express semantic meaning. We generated a freely available corpus using post-coordinated concepts without a defined set of relationships that we term "compositional concepts" to evaluate their use in clinical text. We annotated 5397 disorder mentions from the ShARe corpus to SNOMED CT that were previously normalized as "CUI-less" in the "SemEval-2015 Task 14" shared task because they lacked a pre-coordinated mapping. Unlike the previous normalization method, we do not restrict concept mappings to a particular set of the Unified Medical Language System (UMLS) semantic types and allow normalization to occur to multiple UMLS Concept Unique Identifiers (CUIs). We computed annotator agreement and assessed semantic coverage with this method. We generated the largest clinical text normalization corpus to date with mappings to multiple identifiers and made it freely available. All but 8 of the 5397 disorder mentions were normalized using this methodology. Annotator agreement ranged from 52.4% using the strictest metric (exact matching) to 78.2% using a hierarchical agreement that measures the overlap of shared ancestral nodes. Our results provide evidence that compositional concepts can increase semantic coverage in clinical text. To our knowledge we provide the first freely available corpus of compositional concept annotation in clinical text.
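
    The two agreement metrics described above (strict exact matching vs. hierarchical agreement over shared ancestral nodes) can be illustrated on a toy hierarchy. This is an assumed reading of the metrics, not the corpus's actual scoring code: `ancestor_overlap` below computes a Jaccard score over ancestor closures of two code sets, and the mini-hierarchy `ANC` is hypothetical.

```python
def exact_match(a, b):
    """Strict agreement: the two annotators chose identical code sets."""
    return set(a) == set(b)

def ancestor_overlap(a, b, ancestors):
    """Jaccard overlap of the ancestor closures of two code sets."""
    close = lambda s: set().union(*({c} | ancestors.get(c, set()) for c in s))
    ca, cb = close(a), close(b)
    return len(ca & cb) / len(ca | cb)

# Hypothetical mini-hierarchy: concepts B and C are both children of A.
ANC = {"B": {"A"}, "C": {"A"}}

print(exact_match({"B"}, {"C"}))                      # siblings: no exact match
print(round(ancestor_overlap({"B"}, {"C"}, ANC), 2))  # but partial credit via shared ancestor A
```

Sibling concepts score 0 under exact matching but 1/3 under the hierarchical measure ({A} shared out of {A, B, C}), which mirrors why the reported agreement rises from 52.4% to 78.2%.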

  16. 14 CFR 1216.306 - Actions normally requiring an EIS.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... normally requiring an EIS. (a) NASA will prepare an EIS for actions with the potential to significantly... action or mitigation of its potentially significant impacts. (b) Typical NASA actions normally requiring... material greater than the quantity for which the NASA Nuclear Flight Safety Assurance Manager may grant...

  17. Normalizing Catastrophe: Sustainability and Scientism

    ERIC Educational Resources Information Center

    Bonnett, Michael

    2013-01-01

    Making an adequate response to our deteriorating environmental situation is a matter of ever increasing urgency. It is argued that a central obstacle to achieving this is the way that scientism has become normalized in our thinking about environmental issues. This is taken to reflect on an underlying "metaphysics of mastery" that vitiates proper…

  18. Metabolic differences between short children with GH peak levels in the lower normal range and healthy children of normal height.

    PubMed

    Tidblad, Anders; Gustafsson, Jan; Marcus, Claude; Ritzén, Martin; Ekström, Klas

    2017-06-01

    Severe growth hormone deficiency (GHD) leads to several metabolic effects in the body, ranging from abnormal body composition to biochemical disturbances. However, less is known regarding these parameters in short children with GH peak levels in the lower normal range during provocation tests. Our aim was to study the metabolic profile of this group and compare it with that of healthy children of normal height. Thirty-five pre-pubertal short children (<-2.5 SDS) aged between 7 and 10 years, with peak levels of GH between 7 and 14 μg/L in an arginine insulin tolerance test (AITT), were compared with twelve age- and sex-matched children of normal height. The metabolic profile of the subjects was analysed by blood samples, DEXA, a frequently sampled intravenous glucose tolerance test, microdialysis, and stable isotope examinations of rates of glucose production and lipolysis. There were no overall significant metabolic differences between the groups. However, in the subgroup analysis, the short children with GH peaks <10 μg/L had significantly lower fasting insulin levels, which also correlated with other metabolic parameters. The short pre-pubertal children with GH peak levels between 7 and 14 μg/L did not differ significantly from healthy children of normal height, but subpopulations within this group show significant metabolic differences. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Luftqualität

    NASA Astrophysics Data System (ADS)

    Schultz, Martin G.; Klemp, Dieter; Wahner, Andreas

    The quality of the air has a particular bearing on human health and also affects agriculture and ecosystems. Many air pollutants additionally absorb or scatter solar or thermal radiation and are therefore climate-relevant. Atmospheric-chemistry processes, like emissions, depend on climatic factors such as solar irradiation, temperature, and precipitation. It is therefore to be expected that the climate changes projected for Germany will also influence air pollutant concentrations, even though this relationship has not yet been well studied. This chapter gives an overview of these interrelations and points, at least qualitatively, to possible future developments. The focus is on developments in particulate matter and ozone.

  20. Magnetoseed - Vasculäres Tissue Engineering

    NASA Astrophysics Data System (ADS)

    Perea Saavedra, Héctor; Methe, Heiko; Wintermantel, Erich

    At present, cardiovascular diseases, above all atherosclerosis of the coronary and cerebral vessels, account for 38% of all deaths in North America and are the most frequent cause of death among European men under 65 years of age and the second most frequent among women [4]. It is projected that within the next 10-15 years cardiovascular diseases and their complications will become the most frequent cause of death worldwide. This is in part a consequence of the rising prevalence of cardiovascular disease in Eastern Europe and, increasingly, in the developing countries, and in part a consequence of the continuously rising incidence of obesity and diabetes mellitus in the Western countries.

  1. Normalization in Lie algebras via mould calculus and applications

    NASA Astrophysics Data System (ADS)

    Paul, Thierry; Sauzin, David

    2017-11-01

    We establish Écalle's mould calculus in an abstract Lie-theoretic setting and use it to solve a normalization problem, which covers several formal normal form problems in the theory of dynamical systems. The mould formalism allows us to reduce the Lie-theoretic problem to a mould equation, the solutions of which are remarkably explicit and can be fully described by means of a gauge transformation group. The dynamical applications include the construction of Poincaré-Dulac formal normal forms for a vector field around an equilibrium point, a formal infinite-order multiphase averaging procedure for vector fields with fast angular variables (Hamiltonian or not), or the construction of Birkhoff normal forms both in classical and quantum situations. As a by-product we obtain, in the case of harmonic oscillators, the convergence of the quantum Birkhoff form to the classical one, without any Diophantine hypothesis on the frequencies of the unperturbed Hamiltonians.

  2. Measuring and Estimating Normalized Contrast in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2013-01-01

    Infrared flash thermography (IRFT) is used to detect void-like flaws in a test object. The IRFT technique involves heating up the part surface using a flash of flash lamps. The post-flash evolution of the part surface temperature is sensed by an IR camera in terms of pixel intensity of image pixels. The IR technique involves recording of the IR video image data and analysis of the data using the normalized pixel intensity and temperature contrast analysis method for characterization of void-like flaws for depth and width. This work introduces a new definition of the normalized IR pixel intensity contrast and normalized surface temperature contrast. A procedure is provided to compute the pixel intensity contrast from the camera pixel intensity evolution data. The pixel intensity contrast and the corresponding surface temperature contrast differ but are related. This work provides a method to estimate the temperature evolution and the normalized temperature contrast from the measured pixel intensity evolution data and some additional measurements during data acquisition.
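
    The abstract introduces a normalized pixel-intensity contrast without stating its formula. A minimal sketch, assuming one common definition (the flaw-minus-reference intensity difference divided by the reference evolution); the paper's actual definition may differ, and the cooling curves below are synthetic:

```python
import numpy as np

def normalized_contrast(flaw, ref):
    """Assumed contrast definition: (I_flaw(t) - I_ref(t)) / I_ref(t)."""
    flaw, ref = np.asarray(flaw, float), np.asarray(ref, float)
    return (flaw - ref) / ref

t = np.array([1.0, 2.0, 4.0])           # frames after the flash
ref = 100.0 / np.sqrt(t)                # 1-D cooling over sound material
flaw = ref * np.array([1.0, 1.1, 1.3])  # region over a void retains heat

print(normalized_contrast(flaw, ref))   # ≈ [0, 0.1, 0.3]: contrast grows in time
```

Because the ratio cancels flash nonuniformity and camera gain, a contrast evolution like this can be compared across pixels, which is what allows flaw depth and width to be characterized from it.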

  3. Himmelsfotografie MIT Schmidt-Teleskopen

    NASA Astrophysics Data System (ADS)

    Marx, Siegfried; Pfau, Werner

    At the height of the spread and astronomical use of Schmidt telescopes, S. Marx and W. Pfau present a richly illustrated volume on this type of telescope. Its thematic scope ranges from telescope technology and its history, through the life of Bernhard Schmidt, to the most beautiful celestial photographs, reproduced here in outstanding quality, together with their scientific interpretation. Practical advice for the reader's own photographic work and a glossary make the book useful for every friend of astronomy.

  4. Grundlagen der Nachrichtenübertragung

    NASA Astrophysics Data System (ADS)

    Plaßmann, Wilfried

    Communications engineering has the task of exchanging messages; messages are, for example, questions, observations, and commands. By "message transmission" we here mean electrical message transmission, since voltages and currents as well as electric and magnetic fields are used to convey messages. The transmission begins with the message source, which emits the messages, and ends with the message sink, which receives them. Between the two lies the electrical message-transmission system, whose functional units are presented and explained.

  5. Trojan dynamics well approximated by a new Hamiltonian normal form

    NASA Astrophysics Data System (ADS)

    Páez, Rocío Isabel; Locatelli, Ugo

    2015-10-01

    We revisit a classical perturbative approach to the Hamiltonian related to the motions of Trojan bodies, in the framework of the planar circular restricted three-body problem, by introducing a number of key new ideas in the formulation. In some sense, we adapt the approach of Garfinkel to the context of the normal form theory and its modern techniques. First, we make use of Delaunay variables for a physically accurate representation of the system. Therefore, we introduce a novel manipulation of the variables so as to respect the natural behaviour of the model. We develop a normalization procedure over the fast angle which exploits the fact that singularities in this model are essentially related to the slow angle. Thus, we produce a new normal form, i.e. an integrable approximation to the Hamiltonian. We emphasize some practical examples of the applicability of our normalizing scheme, e.g. the estimation of the stable libration region. Finally, we compare the level curves produced by our normal form with surfaces of section provided by the integration of the non-normalized Hamiltonian, with very good agreement. Further precision tests are also provided. In addition, we give a step-by-step description of the algorithm, allowing for extensions to more complicated models.

  6. Normal modes of the shallow water system on the cubed sphere

    NASA Astrophysics Data System (ADS)

    Kang, H. G.; Cheong, H. B.; Lee, C. H.

    2017-12-01

    Spherical harmonics, expressed as the Rossby-Haurwitz waves, are the normal modes of the non-divergent barotropic model. Among the normal modes in a numerical model, the most unstable mode will contaminate the numerical results, so investigating the normal modes for a given grid system and discretization method is important. The cubed-sphere grid, which consists of six identical faces, has been widely adopted in many atmospheric models. This grid system is non-orthogonal, so that calculating the normal modes is quite a challenging problem. In the present study, the normal modes of the shallow water system on the cubed sphere, discretized by the spectral element method employing the Gauss-Lobatto Lagrange interpolating polynomials as orthogonal basis functions, are investigated. The algebraic equations for the shallow water equations on the cubed sphere are derived, and the large global matrix is constructed. The linear system representing the eigenvalue-eigenvector relations is solved by numerical libraries. The normal modes calculated for several horizontal resolutions and Lamb parameters will be discussed and compared to the normal modes from the spherical harmonics spectral method.
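
    The eigenvalue step described above, assembling the discretized linear operator into a global matrix and handing it to a numerical library, can be sketched on a much smaller system. The matrix below is a toy 1-D periodic gravity-wave operator (centered differences), not the cubed-sphere spectral-element operator of the study; it only illustrates the mechanics of extracting normal modes from a discretization.

```python
import numpy as np

def gravity_wave_matrix(n, g=9.81, H=1.0, dx=1.0):
    """Toy linear operator for (u, h) gravity waves on a periodic 1-D grid.

    D is a periodic centered-difference approximation of the spatial
    derivative (sign convention immaterial for the spectrum), and the
    block matrix couples velocity u and height h.
    """
    D = (np.roll(np.eye(n), -1, axis=1) - np.roll(np.eye(n), 1, axis=1)) / (2 * dx)
    Z = np.zeros((n, n))
    return np.block([[Z, -g * D], [-H * D, Z]])

A = gravity_wave_matrix(8)
freqs = np.linalg.eigvals(A)           # normal-mode frequencies (times i)
print(np.allclose(freqs.real, 0.0))    # neutral wave modes: purely imaginary spectrum
```

For the full shallow-water problem the matrix is far larger and non-normal, but the workflow is the same: discretize, assemble, solve the eigenvalue-eigenvector problem, and inspect the modes.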

  7. A Compendium of Canine Normal Tissue Gene Expression

    PubMed Central

    Chen, Qing-Rong; Wen, Xinyu; Khan, Javed; Khanna, Chand

    2011-01-01

    Background Our understanding of disease is increasingly informed by changes in gene expression between normal and abnormal tissues. The release of the canine genome sequence in 2005 provided an opportunity to better understand human health and disease using the dog as clinically relevant model. Accordingly, we now present the first genome-wide, canine normal tissue gene expression compendium with corresponding human cross-species analysis. Methodology/Principal Findings The Affymetrix platform was utilized to catalogue gene expression signatures of 10 normal canine tissues including: liver, kidney, heart, lung, cerebrum, lymph node, spleen, jejunum, pancreas and skeletal muscle. The quality of the database was assessed in several ways. Organ defining gene sets were identified for each tissue and functional enrichment analysis revealed themes consistent with known physio-anatomic functions for each organ. In addition, a comparison of orthologous gene expression between matched canine and human normal tissues uncovered remarkable similarity. To demonstrate the utility of this dataset, novel canine gene annotations were established based on comparative analysis of dog and human tissue selective gene expression and manual curation of canine probeset mapping. Public access, using infrastructure identical to that currently in use for human normal tissues, has been established and allows for additional comparisons across species. Conclusions/Significance These data advance our understanding of the canine genome through a comprehensive analysis of gene expression in a diverse set of tissues, contributing to improved functional annotation that has been lacking. Importantly, it will be used to inform future studies of disease in the dog as a model for human translational research and provides a novel resource to the community at large. PMID:21655323

  8. Early Detection | Division of Cancer Prevention

    Cancer.gov


  9. Log-Normal Turbulence Dissipation in Global Ocean Models

    NASA Astrophysics Data System (ADS)

    Pearson, Brodie; Fox-Kemper, Baylor

    2018-03-01

    Data from turbulent numerical simulations of the global ocean demonstrate that the dissipation of kinetic energy obeys a nearly log-normal distribution even at large horizontal scales O (10 km ) . As the horizontal scales of resolved turbulence are larger than the ocean is deep, the Kolmogorov-Yaglom theory for intermittency in 3D homogeneous, isotropic turbulence cannot apply; instead, the down-scale potential enstrophy cascade of quasigeostrophic turbulence should. Yet, energy dissipation obeys approximate log-normality—robustly across depths, seasons, regions, and subgrid schemes. The distribution parameters, skewness and kurtosis, show small systematic departures from log-normality with depth and subgrid friction schemes. Log-normality suggests that a few high-dissipation locations dominate the integrated energy and enstrophy budgets, which should be taken into account when making inferences from simplified models and inferring global energy budgets from sparse observations.
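
    The log-normality claim is easy to probe numerically: if dissipation is log-normal, its logarithm is Gaussian, so the skewness and excess kurtosis of the log should be near zero. A sketch on synthetic samples (not the model output analyzed in the study):

```python
import numpy as np

# Draw synthetic "dissipation" values from a log-normal distribution,
# then check that log(eps) has the moments of a Gaussian.
rng = np.random.default_rng(0)
eps = rng.lognormal(mean=-2.0, sigma=1.5, size=200_000)

z = (np.log(eps) - np.log(eps).mean()) / np.log(eps).std()
skew = np.mean(z**3)        # 0 for a Gaussian
kurt = np.mean(z**4) - 3.0  # excess kurtosis, 0 for a Gaussian
print(abs(skew) < 0.05, abs(kurt) < 0.05)
```

Applied to model output, the same two moments are what quantify the "small systematic departures from log-normality" with depth and subgrid scheme that the abstract reports.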

  10. The Snail Family in Normal and Malignant Haematopoiesis.

    PubMed

    Carmichael, Catherine L; Haigh, Jody J

    2017-01-01

    Snail family proteins are key inducers of the epithelial-mesenchymal transition (EMT), a critical process required for normal embryonic development. They have also been strongly implicated in regulating the EMT-like processes required for tumour cell invasion, migration, and metastasis. Whether these proteins also contribute to normal blood cell development, however, remains to be clearly defined. Increasing evidence supports a role for the Snail family in regulating cell survival, migration, and differentiation within the haematopoietic system, as well as potentially an oncogenic role in the malignant transformation of haematopoietic stem cells. This review will provide a broad overview of the Snail family, including key aspects of their involvement in the regulation and development of solid organ cancer, as well as a discussion on our current understanding of Snail family function during normal and malignant haematopoiesis. © 2017 S. Karger AG, Basel.

  11. Developing Visualization Support System for Teaching/Learning Database Normalization

    ERIC Educational Resources Information Center

    Folorunso, Olusegun; Akinwale, AdioTaofeek

    2010-01-01

    Purpose: In tertiary institution, some students find it hard to learn database design theory, in particular, database normalization. The purpose of this paper is to develop a visualization tool to give students an interactive hands-on experience in database normalization process. Design/methodology/approach: The model-view-controller architecture…
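
    A teaching tool for normalization ultimately rests on checking functional dependencies. The paper does not describe its implementation, but a minimal sketch of such a check, on a hypothetical enrollment relation, might look like:

```python
def fd_holds(rows, X, Y):
    """A functional dependency X -> Y holds iff rows agreeing on X agree on Y."""
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in X)
        val = tuple(row[a] for a in Y)
        if seen.setdefault(key, val) != val:
            return False  # two rows share X-values but differ on Y
    return True

orders = [  # hypothetical relation with the transitive dependency course -> room
    {"student": "ann", "course": "db", "room": "101"},
    {"student": "bob", "course": "db", "room": "101"},
    {"student": "ann", "course": "os", "room": "105"},
]
print(fd_holds(orders, ["course"], ["room"]))   # holds: candidate for decomposition
print(fd_holds(orders, ["student"], ["room"]))  # does not hold
```

Detecting that `course -> room` holds is exactly the evidence a student needs to split the relation into (student, course) and (course, room), which is the hands-on step such a visualization tool walks through.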

  12. "Das Konkrete ist das Abstrakte, an das man sich schließlich gewöhnt hat." (Laurent Schwartz) Über den Ablauf des mathematischen Verstehens

    NASA Astrophysics Data System (ADS)

    Lowsky, Martin

    The statement quoted in the title is found in the memoirs of Laurent Schwartz (1915-2002), one of the most prolific mathematicians and a member of the Bourbaki group. In the original the statement reads: "un objet concret est un objet abstrait auquel on a fini par s'habituer." Schwartz illustrates it with the integral of e^{-x²/2}, which has the value √(2π) and in which the numbers e and π are thus linked. What Schwartz wants to express above all, however, is this: mathematical understanding proceeds slowly and requires effort. "It is a question of time and energy," says Schwartz, and precisely this makes it so hard to bring higher mathematics to the general public. The learning and teaching of mathematics simply proceeds laboriously and slowly.
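
    Schwartz's example can be checked numerically: the trapezoidal rule converges extremely fast for the Gaussian, so a modest grid already reproduces √(2π) essentially to machine precision.

```python
import math

# Trapezoidal-rule check of  ∫ exp(-x²/2) dx = √(2π) ≈ 2.5066
# over [-10, 10]; the tails beyond ±10 contribute less than 2e-22.
f = lambda x: math.exp(-x * x / 2)
a, b, n = -10.0, 10.0, 10_000
h = (b - a) / n
integral = h * ((f(a) + f(b)) / 2 + sum(f(a + i * h) for i in range(1, n)))

print(abs(integral - math.sqrt(2 * math.pi)) < 1e-8)  # True
```

That e and π meet in such a short computation is the "concrete" face of the identity; the abstract face, why the trapezoidal rule is spectrally accurate for rapidly decaying smooth functions, is the part one "finally gets used to".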

  13. Morphological Differences Between Seyfert Hosts and Normal Galaxies

    NASA Astrophysics Data System (ADS)

    Shlosman, Isaac

    Using new sub-arcsecond resolution imaging, we compare the large-scale stellar bar fraction in the CfA sample of Seyferts with that in a closely matched control sample of normal galaxies. We find a difference between the samples at the 2.5σ level. We further compare the axial ratios of bars in all available samples quoted in the literature and find a deficiency of small-axial-ratio bars in Seyferts compared to normal galaxies.

  14. Comparison of Social Interaction between Cochlear-Implanted Children with Normal Intelligence Undergoing Auditory Verbal Therapy and Normal-Hearing Children: A Pilot Study.

    PubMed

    Monshizadeh, Leila; Vameghi, Roshanak; Sajedi, Firoozeh; Yadegari, Fariba; Hashemi, Seyed Basir; Kirchem, Petra; Kasbi, Fatemeh

    2018-04-01

    A cochlear implant is a device that helps hearing-impaired children by transmitting sound signals to the brain and helping them improve their speech, language, and social interaction. Although various studies have investigated the different aspects of speech perception and language acquisition in cochlear-implanted children, little is known about their social skills, particularly among Persian-speaking cochlear-implanted children. Considering the growing number of cochlear implants being performed in Iran and the increasing importance of developing near-normal social skills as one of the ultimate goals of cochlear implantation, this study was performed to compare social interaction between Iranian cochlear-implanted children who have undergone rehabilitation (auditory verbal therapy) after surgery and normal-hearing children. This descriptive-analytical study compared the social interaction level of 30 children with normal hearing and 30 with cochlear implants, selected by convenience sampling. The Raven test was administered to both groups to ensure a normal intelligence quotient. The social interaction status of both groups was evaluated using the Vineland Adaptive Behavior Scale, and statistical analysis was performed using the Statistical Package for the Social Sciences (SPSS) version 21. After controlling for age as a covariate, no significant difference was observed between the social interaction scores of the two groups (p > 0.05). In addition, social interaction had no correlation with sex in either group. Cochlear implantation followed by auditory verbal rehabilitation helps children with sensorineural hearing loss to have normal social interactions, regardless of their sex.

  15. Speech Rate Normalization and Phonemic Boundary Perception in Cochlear-Implant Users

    ERIC Educational Resources Information Center

    Jaekel, Brittany N.; Newman, Rochelle S.; Goupell, Matthew J.

    2017-01-01

    Purpose: Normal-hearing (NH) listeners rate normalize, temporarily remapping phonemic category boundaries to account for a talker's speech rate. It is unknown if adults who use auditory prostheses called cochlear implants (CI) can rate normalize, as CIs transmit degraded speech signals to the auditory nerve. Ineffective adjustment to rate…

  16. 20 CFR 336.11 - Exhaustion of rights to normal unemployment benefits.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Exhaustion of rights to normal unemployment... RAILROAD UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Extended Benefits § 336.11 Exhaustion of rights to normal unemployment benefits. For the purposes of this part, the Board considers that...

  17. 20 CFR 336.11 - Exhaustion of rights to normal unemployment benefits.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Exhaustion of rights to normal unemployment... RAILROAD UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Extended Benefits § 336.11 Exhaustion of rights to normal unemployment benefits. For the purposes of this part, the Board considers that...

  18. 20 CFR 336.11 - Exhaustion of rights to normal unemployment benefits.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Exhaustion of rights to normal unemployment... RAILROAD UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Extended Benefits § 336.11 Exhaustion of rights to normal unemployment benefits. For the purposes of this part, the Board considers that...

  19. 20 CFR 336.11 - Exhaustion of rights to normal unemployment benefits.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Exhaustion of rights to normal unemployment... RAILROAD UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Extended Benefits § 336.11 Exhaustion of rights to normal unemployment benefits. For the purposes of this part, the Board considers that...

  20. 20 CFR 336.11 - Exhaustion of rights to normal unemployment benefits.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Exhaustion of rights to normal unemployment... RAILROAD UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Extended Benefits § 336.11 Exhaustion of rights to normal unemployment benefits. For the purposes of this part, the Board considers that...