Sample records for valores normales con

  1. Social Role Valorization: A Proposed New Term for the Principle of Normalization

    ERIC Educational Resources Information Center

    Wolfensberger, Wolf

    2011-01-01

    The highest goal of the principle of normalization has recently been clarified to be the establishment, enhancement, or defense of the social role(s) of a person or group, via the enhancement of people's social images and personal competencies. In consequence, it is proposed that normalization be henceforth called "social role valorization."

  2. Normalization vs. Social Role Valorization: Similar or Different?

    ERIC Educational Resources Information Center

    Kumar, Akhilesh; Singh, Rajani Ranjan; Thressiakutty, A. T.

    2015-01-01

    Radical changes in services for persons with disabilities were brought about by the Principle of Normalization, which originated in 1969. As a consequence of Normalization, disability as a whole, and intellectual disability in particular, received the attention of the masses, and the intelligentsia began advocating normalization ideologies, which became…

  3. Bio-refinery approach for spent coffee grounds valorization.

    PubMed

    Mata, Teresa M; Martins, António A; Caetano, Nídia S

    2018-01-01

    Although normally seen as a problem, current policies and strategic plans concur that, if adequately managed, waste can be a source of highly interesting and valuable products (metals, oils and fats, lignin, cellulose and hemicelluloses, tannins, antioxidants, caffeine, polyphenols, pigments, flavonoids) obtained through recycling, compound recovery, or energy valorization, following the waste hierarchy. Besides contributing to more sustainable and circular economies, these products also have high commercial value compared to those obtained by currently used waste treatment methods. This paper shows how the bio-refinery framework can be used to obtain high-value products from organic waste. With spent coffee grounds as a case study, a sequential process is used to obtain first the most valuable products and then the others, allowing proper valorization of residues and increased sustainability of the whole process. Challenges facing the full development and implementation of waste-based bio-refineries are highlighted. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Accuracy of the Radial Velocities Obtained with the REOSC (Precisión de las velocidades radiales obtenidas con el REOSC)

    NASA Astrophysics Data System (ADS)

    González, J. F.; Lapasset, E.

    Complementing a line of work begun earlier, we discuss the stability of the REOSC spectrograph at CASLEO in DC mode for radial-velocity measurements, based on the analysis of observations carried out in January and April 1997. On those occasions we obtained 26 spectra of standard stars and 27 spectra of 3 stars used as reference stars in our open-cluster program. We also took 26 twilight spectra with the telescope at positions covering the range H = -4, +4 and δ = -90, +30. By cross-correlation we derived the velocity of 19 orders in each of these spectra. Based on a statistical analysis of the data obtained, we discuss the contribution of the different factors that affect the observed reading dispersion. In particular, instrument flexure would not introduce significant errors when observing at air masses below 2.0. The dispersion of the velocity values measured for high-S/N spectra of the same star was of the order of 0.5 km/s. Comparison with the velocity values published by different authors for the standard stars does not reveal any appreciable systematic difference from the CASLEO velocities, the rms of the residuals being of the order of 1.0 km/s.
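
    The cross-correlation technique named in this abstract can be sketched in a few lines. This is a toy illustration, not the authors' pipeline: the line positions, grid spacing, and line depths below are invented, and a real reduction would fit the correlation peak at sub-pixel precision across the 19 echelle orders.

```python
import numpy as np

# Toy sketch of velocity measurement by cross-correlation.
# All numbers here are illustrative, not CASLEO data.
c = 299792.458  # speed of light, km/s

# On a log-wavelength grid a Doppler shift is a constant pixel shift,
# which is why cross-correlation is performed in log(lambda).
n = 4096
dv = 1.0                        # velocity step per pixel, km/s
loglam = np.arange(n) * (dv / c)

def fake_spectrum(shift_kms):
    """Continuum-normalized spectrum with Gaussian absorption lines,
    Doppler-shifted by shift_kms (hypothetical line list)."""
    centers = np.array([500.0, 1500.0, 2500.0, 3500.0]) * (dv / c)
    spec = np.ones(n)
    for mu in centers + shift_kms / c:
        spec -= 0.5 * np.exp(-0.5 * ((loglam - mu) / (5 * dv / c)) ** 2)
    return spec

template = fake_spectrum(0.0)   # standard-star spectrum
observed = fake_spectrum(12.0)  # program star, true shift 12 km/s

# Cross-correlate mean-subtracted spectra and read off the peak lag.
t = template - template.mean()
o = observed - observed.mean()
ccf = np.correlate(o, t, mode="full")
lag = int(np.argmax(ccf)) - (n - 1)
velocity = lag * dv
print(velocity)  # 12.0 km/s
```

    A real pipeline would additionally interpolate around the CCF maximum to reach the ~0.5 km/s dispersion quoted in the abstract.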

  5. A comprehensive dairy valorization model.

    PubMed

    Banaszewska, A; Cruijssen, F; van der Vorst, J G A J; Claassen, G D H; Kampman, J L

    2013-02-01

    Dairy processors face numerous challenges resulting from both unsteady dairy markets and some specific characteristics of dairy supply chains. To maintain a competitive position on the market, companies must look beyond standard solutions currently used in practice. This paper presents a comprehensive dairy valorization model that serves as a decision support tool for mid-term allocation of raw milk to end products and production planning. The developed model was used to identify the optimal product portfolio composition. The model allocates raw milk to the most profitable dairy products while accounting for important constraints (i.e., recipes, composition variations, dairy production interdependencies, seasonality, demand, supply, capacities, and transportation flows). The inclusion of all relevant constraints and the ease of understanding dairy production dynamics make the model comprehensive. The developed model was tested at the international dairy processor FrieslandCampina (Amersfoort, the Netherlands). The structure of the model and its output were discussed in multiple sessions with and approved by relevant FrieslandCampina employees. The elements included in the model were considered necessary to optimally valorize raw milk. To illustrate the comprehensiveness and functionality of the model, we analyzed the effect of seasonality on milk valorization. A large difference in profit and a shift in the allocation of milk showed that seasonality has a considerable impact on the valorization of raw milk. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
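
    The allocation problem at the core of such a valorization model can be sketched as a linear program. The toy version below is a deliberately small stand-in (three hypothetical products, one raw-milk balance, demand ceilings); all product names, margins, and coefficients are invented, and the paper's model adds recipes, composition variations, production interdependencies, seasonality, and transport flows.

```python
# Toy milk-allocation LP: maximize profit subject to a raw-milk supply
# constraint and per-product demand ceilings. Figures are fabricated.
from scipy.optimize import linprog

profit = [900.0, 600.0, 400.0]   # EUR per tonne of cheese, butter, powder
milk_use = [10.0, 20.0, 11.0]    # tonnes raw milk per tonne of product
milk_supply = 1000.0             # tonnes raw milk available
demand_cap = [60.0, 25.0, 80.0]  # market demand ceilings, tonnes

# linprog minimizes, so negate profit to maximize it.
res = linprog(
    c=[-p for p in profit],
    A_ub=[milk_use],             # total milk used <= supply
    b_ub=[milk_supply],
    bounds=list(zip([0.0] * 3, demand_cap)),
)
assert res.success
cheese, butter, powder = res.x
print(f"profit: {-res.fun:.0f} EUR")
print(f"allocation (t): cheese={cheese:.1f}, butter={butter:.1f}, powder={powder:.1f}")
```

    The solver routes milk first to the product with the highest profit per tonne of milk (cheese) up to its demand cap, then to the next-best use, which is exactly the kind of portfolio shift the paper analyzes under seasonality.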

  6. Valor Collegiate Academies

    ERIC Educational Resources Information Center

    EDUCAUSE, 2015

    2015-01-01

    The four guiding principles behind the blended, competency-based, personalized learning model of Valor Collegiate Academies, a charter organization serving grades 5-12 in Nashville, TN: (1) Reflect the diversity of both our country and local community; (2) Personalize a student's experience to meet his/her unique academic and non-academic needs;…

  7. Lignin Valorization: Emerging Approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckham, Gregg T

    Lignin, an aromatic biopolymer found in plant cell walls, is a key component of lignocellulosic biomass and is generally utilized for heat and power. However, lignin's chemical composition makes it an attractive source for biological and catalytic conversion to fuels and chemicals. Bringing together experts from biology, catalysis, engineering, analytical chemistry, and techno-economic/life-cycle analysis, Lignin Valorization presents a comprehensive, interdisciplinary picture of how lignocellulosic biorefineries could potentially employ lignin valorization technologies. Chapters focus specifically on the production of fuels and chemicals from lignin, and topics covered include (i) methods for isolating lignin in the context of the lignocellulosic biorefinery, (ii) thermal, chemo-catalytic, and biological methods for lignin depolymerization, (iii) chemo-catalytic and biological methods for upgrading lignin, (iv) characterization of lignin, and (v) techno-economic and life-cycle analysis of integrated processes to utilize lignin in an integrated biorefinery. The book provides the latest breakthroughs and challenges in upgrading lignin to fuels and chemicals for graduate students and researchers in academia, governmental laboratories, and industry interested in biomass conversion.

  8. Effect and key factors of byproducts valorization: the case of dairy industry.

    PubMed

    Banaszewska, A; Cruijssen, F; Claassen, G D H; van der Vorst, J G A J

    2014-01-01

    Production of many consumer products results in byproducts that contain a considerable share of the nutrients originating from the input materials. High production volumes, environmental impact, and the nutritional content of byproducts make them an important subject for careful valorization. Valorization allows us to explore the possibility of reusing nutrients in the production of main products, and thus highlights the potential gains that can be achieved. The main aim of this study was to evaluate the added value of cheese whey valorization, and to determine the effect of integral valorization of main products and byproducts on the profit of a dairy producer. Several scenarios and cases were implemented and analyzed using a decision support tool, the integral dairy valorization model. Data originated from the international dairy processor FrieslandCampina (Amersfoort, the Netherlands). The outcomes of the scenarios were analyzed with regard to profit and shifts in the production of nonwhey end products, and were validated by company experts. Modeling results showed that the valorization of byproducts is very profitable (24.3% more profit). Furthermore, additional profit can be achieved when the 2 valorization processes (main products and byproducts) are integrated. This effect is, however, considerably affected by current capacity and market demand limitations. Significant benefits can be created if demand for whey-based products is increased by 25%. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  9. New Observations of 3C10 with the VLA: A Study of the Expansion (Nuevas observaciones de 3C10 con el VLA: estudio de la expansión)

    NASA Astrophysics Data System (ADS)

    Reynoso, E. M.; Moffett, D. A.; Dubner, G. M.; Giacani, E. B.; Reynolds, S. P.; Goss, W. M.; Dickel, J.

    We present new results on the expansion of the Tycho supernova remnant over a 10.9-year interval, comparing new observations taken with the VLA at 1375 and 1635 MHz during 1994 and 1995 with previous observations made between 1983 and 1984 (Dickel et al. 1991, AJ 101, 2151), using the same configurations, bandwidths, calibrators, and integration times. The expansion coefficient is calculated for radial sectors 4° wide, by fitting the cross-correlation of the derivatives of the averaged profiles for each epoch. From the measured expansion, the index (expansion parameter) of the power law R ∝ t^m is estimated as m ≡ d ln R/d ln t. This value is compared with theoretical coefficients for different evolutionary phases of supernova remnants.
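
    The expansion parameter defined in this abstract, m ≡ d ln R/d ln t for R ∝ t^m, reduces for two epochs to a simple ratio of logarithms. A minimal numerical illustration, with invented numbers rather than the Tycho measurements:

```python
import numpy as np

# For a power law R ∝ t^m, two epochs (t1, R1) and (t2, R2) give
# m ≈ ln(R2/R1) / ln(t2/t1). All values below are fabricated.
t1, t2 = 400.0, 410.9           # remnant ages at the two epochs (yr)
R1 = 240.0                      # radius at epoch 1 (arbitrary units)
m_true = 0.5                    # assumed expansion index
R2 = R1 * (t2 / t1) ** m_true   # radius this index implies at epoch 2

m_est = np.log(R2 / R1) / np.log(t2 / t1)
print(round(m_est, 3))  # recovers 0.5
```

    In the paper the same quantity is estimated per 4°-wide radial sector, which is what lets the measured m be compared against theoretical values for different evolutionary phases.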

  10. Valorization of Cereal Based Biorefinery Byproducts: Reality and Expectations

    PubMed Central

    2013-01-01

    The growth of the biobased economy will lead to an increase in new biorefinery activities. All biorefineries face the regular challenges of efficiently and economically treating their effluent to be compatible with local discharge requirements and to minimize net water consumption. The amount of waste resulting from the biorefinery industry is growing exponentially. The valorization of such wastes has drawn considerable attention with respect to resource recovery, given evident economic and environmental concerns. This has been a promising field that shows great potential for byproduct usage and for increasing the value obtained from the biorefinery. However, full-scale realization of biorefinery waste valorization is not straightforward, because several microbiological, technological, and economic challenges need to be resolved. In this review we consider valorization options for cereal-based biorefinery wastes, identifying their challenges and exploring the opportunities for future processes. PMID:23931701

  11. Valorization of cereal based biorefinery byproducts: reality and expectations.

    PubMed

    Elmekawy, Ahmed; Diels, Ludo; De Wever, Heleen; Pant, Deepak

    2013-08-20

    The growth of the biobased economy will lead to an increase in new biorefinery activities. All biorefineries face the regular challenges of efficiently and economically treating their effluent to be compatible with local discharge requirements and to minimize net water consumption. The amount of waste resulting from the biorefinery industry is growing exponentially. The valorization of such wastes has drawn considerable attention with respect to resource recovery, given evident economic and environmental concerns. This has been a promising field that shows great potential for byproduct usage and for increasing the value obtained from the biorefinery. However, full-scale realization of biorefinery waste valorization is not straightforward, because several microbiological, technological, and economic challenges need to be resolved. In this review we consider valorization options for cereal-based biorefinery wastes, identifying their challenges and exploring the opportunities for future processes.

  12. [In Process Citation].

    PubMed

    Santos, Carla Adriana; Fonseca, Jorge; Carolino, Elisabete; Lopes, Teresa; Sousa Guerreiro, António

    2016-03-25

    Introduction and objectives: copper (Cu) is a widely studied trace element, but little is known about its evolution in patients fed through percutaneous endoscopic gastrostomy (PEG). We aimed to evaluate the evolution of serum Cu from gastrostomy until 12 weeks after the procedure in these patients fed with homemade preparations. Methods: we conducted a prospective observational study to evaluate serum Cu, albumin, transferrin, and body mass index (BMI) at the time of PEG, 4 weeks after, and 12 weeks after the procedure. Data included age, gender, NRS 2002, and underlying disease: head and neck cancers (HNC) and neurological dysphagia (ND). After the procedure, these patients were fed with homemade preparations. Results: 146 patients (89 men), aged 21-95 years: HNC, 56; ND, 90. Cu values ranged from 42-160 μg/dl (normal: 70-140 μg/dl); normal in 89% (n = 130) and low in 11% (n = 16); low albumin: 53% (n = 77); low transferrin: 65% (n = 94); low BMI: 53% (n = 78). After 4 weeks: normal Cu values in 93% and low in 7%; low albumin in 34%; low transferrin in 52%. After 12 weeks: normal Cu values in 95% and low in 5%; low albumin in 25%; low transferrin in 32%. We found no significant differences in serum Cu when compared by age, gender, underlying disease, BMI, albumin, or transferrin. Conclusions: most patients had normal serum Cu at the time of gastrostomy. For patients with low serum Cu before the procedure, feeding with homemade preparations appears sufficient for its progressive normalization.

  13. Recent advances in oxidative valorization of lignin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Ruoshui; Guo, Mond; Zhang, Xiao

    Lignin, an aromatic macromolecule synthesized by all higher plants, is one of the most intriguing natural materials for utilization across a wide range of applications. Depolymerization and fragmentation of lignin into small chemical constituents, which can either replace current market products or be used as building blocks for new material synthesis, is a focus of current lignin valorization strategies. Among the variety of lignin degradation chemistries, catalytic oxidation of lignin presents an energy-efficient means of depolymerizing lignin and generating selective reaction products. Our review provides a summary of recent advancements in oxidative lignin valorization, couched in a discussion of how these chemistries may contribute to the degradation of the lignin macromolecule through three major approaches: (1) cleavage of inter-unit linkages; (2) oxidative modification of the propanyl side chain; and (3) oxidation of the aromatic ring and ring-cleavage reactions.

  14. Recent advances in oxidative valorization of lignin

    DOE PAGES

    Ma, Ruoshui; Guo, Mond; Zhang, Xiao

    2017-07-21

    Lignin, an aromatic macromolecule synthesized by all higher plants, is one of the most intriguing natural materials for utilization across a wide range of applications. Depolymerization and fragmentation of lignin into small chemical constituents, which can either replace current market products or be used as building blocks for new material synthesis, is a focus of current lignin valorization strategies. Among the variety of lignin degradation chemistries, catalytic oxidation of lignin presents an energy-efficient means of depolymerizing lignin and generating selective reaction products. Our review provides a summary of recent advancements in oxidative lignin valorization, couched in a discussion of how these chemistries may contribute to the degradation of the lignin macromolecule through three major approaches: (1) cleavage of inter-unit linkages; (2) oxidative modification of the propanyl side chain; and (3) oxidation of the aromatic ring and ring-cleavage reactions.

  15. Opportunities and challenges in biological lignin valorization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckham, Gregg T.; Johnson, Christopher W.; Karp, Eric M.

    Lignin is a primary component of lignocellulosic biomass that is an underutilized feedstock in the growing biofuels industry. Although lignin depolymerization has long been studied, the intrinsic heterogeneity of lignin typically leads to heterogeneous streams of aromatic compounds, which in turn present significant technical challenges when attempting to produce lignin-derived chemicals where purity is often a concern. In Nature, microorganisms often encounter this same problem during biomass turnover, wherein powerful oxidative enzymes produce heterogeneous slates of aromatic compounds. Some microbes have evolved metabolic pathways to convert these aromatic species via ‘upper pathways’ into central intermediates, which can then be funneled through ‘lower pathways’ into central carbon metabolism in a process we dubbed ‘biological funneling’. This funneling approach offers a direct, biological solution to overcome heterogeneity problems in lignin valorization for the modern biorefinery. Coupled to targeted separations and downstream chemical catalysis, this concept offers the ability to produce a wide range of molecules from lignin. This perspective describes research opportunities and challenges ahead for this new field of research, which holds significant promise towards a biorefinery concept wherein polysaccharides and lignin are treated as equally valuable feedstocks. In particular, we discuss tailoring the lignin substrate for microbial utilization, host selection for biological funneling, ligninolytic enzyme-microbe synergy, metabolic engineering, expanding substrate specificity for biological funneling, and process integration, each of which presents key challenges. Ultimately, for biological solutions to lignin valorization to be viable, multiple questions in each of these areas will need to be addressed, making biological lignin valorization a multidisciplinary, co-design problem.

  16. Application of the New Propulsion Theory to the Design of Propellers. Comparison with the Lifting Line Theory (Aplicacion de la Nueva Teoria de la Impulsion al Diseno de Propulsores. Comparacion con la Teoria de las Lineas Sustentadoras),

    DTIC Science & Technology

    1983-11-07

    obtained with the lifting line theory is very sensitive to the type of radial distribution law of circulation. In the past, especially for theoretical...I. Bacquerizo Briones, "Utilidad de la ecuación de Poincaré para el proyecto y análisis de propulsores con valores finitos de la circulación en el" (Usefulness of the Poincaré equation for the design and analysis of propellers with finite values of circulation at the…)

  17. Electric field-based technologies for valorization of bioresources.

    PubMed

    Rocha, Cristina M R; Genisheva, Zlatina; Ferreira-Santos, Pedro; Rodrigues, Rui; Vicente, António A; Teixeira, José A; Pereira, Ricardo N

    2018-04-01

    This review provides an overview of recent research on electrotechnologies applied to the valorization of bioresources. Following a comprehensive summary of the current status of well-known electric-field-based processing technologies, such as pulsed electric fields (PEF) and high voltage electrical discharges (HVED), the application of moderate electric fields (MEF) as an extraction or valorization technology is considered in detail. MEF, known for its improved energy efficiency and claimed electroporation effects (allowing enhanced extraction yields), may also produce high heating rates (the ohmic heating, OH, effect), allowing thermal stabilization of waste streams for other added-value applications. MEF is a simple technology that mostly makes use of green solvents (mainly water) and can be used for the functionalization of compounds of biological origin, broadening their application range. The substantial increase in MEF-based plants installed in industries worldwide suggests its straightforward application to waste recovery. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Valorization of winery waste vs. the costs of not recycling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devesa-Rey, R., E-mail: rosa.devesa.rey@uvigo.es; Vecino, X.; Varela-Alende, J.L.

    Highlights:
    - Lactic acid, biosurfactants, xylitol, or ethanol may be obtained from wine residues.
    - By-product valorization turns wine wastes into products with industrial applications.
    - The costs of waste disposal drive the search for economically viable solutions for valorizing residues.

    Abstract: Wine production generates huge amounts of waste. Before the 1990s, the most economical option for waste removal was the payment of a disposal fee, usually around 3,000 Euros. In recent years, however, disposal fees and fines for unauthorized discharges have increased considerably, often reaching 30,000-40,000 Euros, and a prison sentence is sometimes also imposed. Some environmentally friendly technologies have been proposed for the valorization of winery waste products. Fermentation of grape marc, trimming vine shoots, or vinification lees has been reported to produce lactic acid, biosurfactants, xylitol, ethanol, and other compounds. Furthermore, grape marc and seeds are rich in phenolic compounds, which have antioxidant properties, and vinasse contains tartaric acid that can be extracted and commercialized. Companies must therefore invest in new technologies to decrease the impact of agro-industrial residues on the environment and to establish new processes that will provide additional sources of income.

  19. Protein Supplements: Pros and Cons.

    PubMed

    Samal, Jay Rabindra Kumar; Samal, Indira R

    2018-05-04

    To provide a comprehensive analysis of the literature examining the pros and cons of protein supplementation, various articles on protein supplementation were obtained from Google Scholar, PubMed, and the National Center for Biotechnology Information. Over the past few years, protein supplementation has become commonplace for gym-goers as well as for the general public. A large segment of the general population relies on protein supplementation for meal replacement, weight reduction, and purported health benefits. These protein supplements have varying pros and cons associated with them, which are often overlooked by the public. This review aims to assimilate existing studies and form a consensus regarding the benefits and disadvantages of protein supplementation. The purported health benefits of protein supplementation have led to overuse by both adults and adolescents. Although the pros and cons of protein supplementation are a widely debated topic, not many studies have been conducted on the subject. The few studies that exist either provide insufficient evidence or have not employed proper conditions for the conduct of the tests. It should be considered that protein supplements are processed materials and often do not contain other essential nutrients required for the sustenance of a healthy lifestyle. It is suggested that the required protein intake should be obtained from natural food sources, and that protein supplementation should be resorted to only if sufficient protein is not available in the normal diet.

  20. Experience of valorization projects ISTC for laser technologies

    NASA Astrophysics Data System (ADS)

    Sartory, A. V.; Stepennov, D. B.; Vlasova, E. Y.; Pokrovsky, K. K.

    2002-04-01

    Applying the achievements of ISTC projects is one of the main problems being solved in pursuit of one of the basic goals of the ISTC, namely, adapting Russian scientists to the conditions of Russia's developing market economy. The present report is aimed at rendering promotional services for ISTC project teams in the context of the program of project-outcome valorization.

  1. The valorization of the plastic waste to the rheological characteristics of bituminous mixtures

    NASA Astrophysics Data System (ADS)

    Boucherba, Mohammed; Kriker, Abdelouahed; Kebaili, Nabil

    2017-02-01

    The valorization of end-of-life materials currently constitutes one of the major challenges facing the state in safeguarding the environment. Indeed, plastic waste, owing to its persistence and weak biodegradability, often constitutes a threat to health, nature, and the environment. The present study addresses a method for exploiting and valorizing these wastes in roads, in which the waste is incorporated into the pure bitumen of asphalt concretes using the dry process. The central objective of this work is to assess their impact on the mechanical behavior of these concretes using the Marshall test and the NAT.

  2. Normalization of Gravitational Acceleration Models

    NASA Technical Reports Server (NTRS)

    Eckman, Randy A.; Brown, Aaron J.; Adamo, Daniel R.

    2011-01-01

    Unlike the uniform density spherical shell approximations of Newton, the consequence of spaceflight in the real universe is that gravitational fields are sensitive to the nonsphericity of their generating central bodies. The gravitational potential of a nonspherical central body is typically resolved using spherical harmonic approximations. However, attempting to directly calculate the spherical harmonic approximations results in at least two singularities, which must be removed in order to generalize the method and solve for any possible orbit, including polar orbits. Three unique algorithms have been developed to eliminate these singularities, by Samuel Pines [1], Bill Lear [2], and Robert Gottlieb [3]. This paper documents the methodical normalization of two of the three known formulations for singularity-free gravitational acceleration (namely, the Lear [2] and Gottlieb [3] algorithms) and formulates a general method for defining the normalization parameters used to generate normalized Legendre polynomials and associated Legendre functions (ALFs) for any algorithm. A treatment of the conventional formulation of the gravitational potential and acceleration is also provided, in addition to a brief overview of the philosophical differences between the three known singularity-free algorithms.
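
    As a small illustration of what normalization means here: fully normalized associated Legendre functions carry a factor chosen so that their mean-square over the sphere is unity, which keeps high-degree gravity coefficients numerically well scaled. The sketch below uses the standard geodesy (4π) normalization factor as an assumption for illustration; it is not the specific Pines, Lear, or Gottlieb recursion from the paper.

```python
import math
import numpy as np
from scipy.special import lpmv

def normalized_alf(n, m, x):
    """Fully normalized ALF (geodesy 4π convention):
    P̄_nm(x) = sqrt((2 - δ_m0)(2n+1)(n-m)!/(n+m)!) · P_nm(x)."""
    delta = 1.0 if m == 0 else 0.0
    norm = math.sqrt((2.0 - delta) * (2 * n + 1)
                     * math.factorial(n - m) / math.factorial(n + m))
    return norm * lpmv(m, n, x)

# Numerical check of the normalization property:
# ∫_{-1}^{1} P̄_nm(x)² dx = 2(2 - δ_m0), i.e. 2 for m = 0 and 4 otherwise.
x = np.linspace(-1.0, 1.0, 200001)
h = x[1] - x[0]
for n, m in [(2, 0), (3, 1), (4, 4)]:
    y = normalized_alf(n, m, x) ** 2
    integral = h * (y.sum() - 0.5 * (y[0] + y[-1]))  # trapezoidal rule
    print(n, m, round(integral, 4))
```

    Unnormalized P_nm values grow factorially with degree and order, so gravity models published in normalized coefficients paired with P̄_nm recursions (as in the Lear and Gottlieb formulations) avoid overflow at high degree.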

  3. Knowledge and Valorization of Historical Sites Through 3d Documentation and Modeling

    NASA Astrophysics Data System (ADS)

    Farella, E.; Menna, F.; Nocerino, E.; Morabito, D.; Remondino, F.; Campi, M.

    2016-06-01

    The paper presents the first results of an interdisciplinary project related to the 3D documentation, dissemination, valorization, and digital access of archaeological sites. Beyond the 3D documentation itself, the project has two goals: (i) to easily explore and share via the web the references and results of the interdisciplinary work, including the interpretative process and the final reconstruction of the remains; and (ii) to promote and valorize archaeological areas using reality-based 3D data and Virtual Reality devices. This method has been verified on the ruins of the archaeological site of Pausilypon, a maritime villa of the Roman period (Naples, Italy). Using Unity3D, the virtual tour of the heritage site was integrated and enriched with the surveyed 3D data, text documents, CAAD reconstruction hypotheses, drawings, photos, etc. In this way, starting from the actual appearance of the ruins (panoramic images), passing through the 3D digital surveying models and several other historical sources, the user is able to access virtual contents and reconstructed scenarios, all in a single virtual, interactive, and immersive environment. These contents and scenarios allow the user to derive documentation and geometrical information, understand the site, perform analyses, follow interpretative processes, communicate historical information, and valorize the heritage location.

  4. Development of green extraction processes for Nannochloropsis gaditana biomass valorization.

    PubMed

    Sánchez-Camargo, Andrea Del Pilar; Pleite, Natalia; Mendiola, José Antonio; Cifuentes, Alejandro; Herrero, Miguel; Gilbert-López, Bienvenida; Ibáñez, Elena

    2018-04-23

    In the present work, the valorization of Nannochloropsis gaditana biomass is proposed within the concept of a biorefinery. To this aim, high-pressure homogenization (HPH) was used to break down the strong cell wall, and supercritical fluid extraction (SFE) with pure CO2 was applied as a first step to extract valuable compounds (such as non-polar lipids and pigments). Extraction of the remaining residue for the recovery of bioactive compounds was studied by means of an experimental design based on response surface methodology (RSM), employing pressurized liquid extraction (PLE) with green solvents such as water and ethanol. The optimum extract was achieved with pure ethanol at 170 °C for 20 min, providing an important antioxidant capacity (0.72 ± 0.03 mmol Trolox eq g-1 extract). Complete chemical characterization of the optimum extract was carried out using different chromatographic methods, such as reverse-phase high-performance liquid chromatography with diode array detection (RP-HPLC-DAD), normal-phase HPLC with evaporative light scattering detection (NP-HPLC-ELSD), and gas chromatography coupled to mass spectrometry (GC-MS); carotenoids (e.g., violaxanthin), chlorophylls, and polar lipids were the main compounds observed, while palmitoleic, palmitic, and myristic acids and the polyunsaturated eicosapentaenoic acid (EPA) were the predominant fatty acids in all PLE extracts. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
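
    The response-surface step mentioned above boils down to fitting a low-order polynomial model to the measured response and locating its stationary point. A minimal single-factor sketch with fabricated yield data (the study used a proper multi-factor design over temperature and solvent composition):

```python
import numpy as np

# Toy RSM-style fit: quadratic model of extraction response vs. temperature,
# then locate the stationary point. Data points are fabricated, not the
# paper's measurements.
temps = np.array([60.0, 100.0, 140.0, 170.0, 200.0])   # °C
yields = np.array([0.35, 0.55, 0.68, 0.72, 0.65])      # response (invented)

b2, b1, b0 = np.polyfit(temps, yields, 2)   # y ≈ b2·T² + b1·T + b0
t_opt = -b1 / (2.0 * b2)                    # vertex of the fitted parabola
print(round(t_opt, 1))                      # predicted optimum temperature
```

    With a concave fit (b2 < 0) the vertex is a maximum; a real RSM design would also include replicate center points and lack-of-fit checks before trusting the optimum.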

  5. Application of life-cycle assessment (LCA) methodology for valorization of building demolition materials and products

    NASA Astrophysics Data System (ADS)

    Sara, Balazs; Antonini, Ernesto; Tarantini, Mario

    2001-02-01

    The VAMP project (VAlorization of building demolition Materials and Products, LIFE 98/ENV/IT/33) aims to build an effective and innovative information system to support decision making in selective demolition activity and to manage the valorization (recovery-reuse-recycling) of waste flows produced by the construction and demolition (C&D) sector. The VAMP information system will be tested in Italy in several case studies of selective demolition. In this paper the proposed demolition-valorization system is compared to the traditional one in a life-cycle perspective, applying LCA methodology to highlight the advantages of the VAMP system from an eco-sustainability point of view. Within the system boundaries, demolition processes, transport of demolition waste, and its recovery/treatment or disposal in landfill were included. Processes avoided due to reuse-recycling activities, such as the extraction of natural resources and the manufacture of building materials and components, were considered as well. The data-collection procedure applied in the inventory and impact-assessment phases is presented, along with a general overview of data availability for LCA studies in this sector. Results of the application of the VAMP methodology to a case study are discussed and compared with a simulated traditional demolition of the same building. The environmental advantages of the VAMP demolition-valorization system are demonstrated quantitatively, emphasizing the special importance of reusing building components whose manufacture demands large amounts of energy.

  6. An "If This, Then that" Formulation of Decisions Related to Social Role Valorization as a Better Way of Interpreting It to People

    ERIC Educational Resources Information Center

    Wolfensberger, Wolf

    2011-01-01

    Social Role Valorization is interpreted as a high-order empirical social science theory that informs people about the relation between the social roles that people hold and what happens to them as a result, and how to valorize (improve or defend) the social roles of people at risk of social devaluation. Because Social Role Valorization is not a…

  7. Critical design of heterogeneous catalysts for biomass valorization: current thrust and emerging prospects

    DOE PAGES

    De, Sudipta; Dutta, Saikat; Saha, Basudeb

    2016-01-01

    Catalysis in the heterogeneous phase plays a crucial role in the valorization of biorenewable substrates with controlled reactivity, efficient mechanical process separation, greater recyclability and minimization of environmental effects.

  8. Toward engineering E. coli with an autoregulatory system for lignin valorization.

    PubMed

    Wu, Weihua; Liu, Fang; Singh, Seema

    2018-03-20

    Efficient lignin valorization could add more than 10-fold the value gained from burning it for energy and is critical for the economic viability of future biorefineries. However, lignin-derived aromatics from biomass pretreatment are known to be potent fermentation inhibitors in microbial production of fuels and other value-added chemicals. In addition, isopropyl-β-d-1-thiogalactopyranoside and other inducers are routinely added to fermentation broth to induce the expression of pathway enzymes, which further adds to the overall process cost. An autoregulatory system that can diminish the aromatics' toxicity as well as be substrate-inducible can be key to successful integration of lignin valorization into future lignocellulosic biorefineries. Toward that goal, in this study an autoregulatory system is demonstrated that alleviates the toxicity issue and eliminates the cost of an external inducer. Specifically, this system is composed of a catechol biosynthesis pathway coexpressed with an active aromatic transporter, CouP, under induction by a vanillin self-inducible promoter, ADH7, to effectively convert lignin-derived aromatics into value-added chemicals using Escherichia coli as a host. The constructed autoregulatory system can efficiently transport vanillin across the cell membrane and convert it to catechol. Compared with the system without CouP expression, expression of the catechol biosynthesis pathway with the transporter CouP significantly improved catechol yields by about 30% and 40% under the pTrc and ADH7 promoters, respectively. This study demonstrated an aromatic-induced autoregulatory system that enabled conversion of lignin-derived aromatics into catechol without the addition of any costly external inducers, providing a promising and economically viable route for lignin valorization. Copyright © 2018 the Author(s). Published by PNAS.

  9. Toward engineering E. coli with an autoregulatory system for lignin valorization

    PubMed Central

    Wu, Weihua; Liu, Fang; Singh, Seema

    2018-01-01

    Efficient lignin valorization could add more than 10-fold the value gained from burning it for energy and is critical for the economic viability of future biorefineries. However, lignin-derived aromatics from biomass pretreatment are known to be potent fermentation inhibitors in microbial production of fuels and other value-added chemicals. In addition, isopropyl-β-d-1-thiogalactopyranoside and other inducers are routinely added to fermentation broth to induce the expression of pathway enzymes, which further adds to the overall process cost. An autoregulatory system that can diminish the aromatics’ toxicity as well as be substrate-inducible can be key to successful integration of lignin valorization into future lignocellulosic biorefineries. Toward that goal, in this study an autoregulatory system is demonstrated that alleviates the toxicity issue and eliminates the cost of an external inducer. Specifically, this system is composed of a catechol biosynthesis pathway coexpressed with an active aromatic transporter, CouP, under induction by a vanillin self-inducible promoter, ADH7, to effectively convert lignin-derived aromatics into value-added chemicals using Escherichia coli as a host. The constructed autoregulatory system can efficiently transport vanillin across the cell membrane and convert it to catechol. Compared with the system without CouP expression, expression of the catechol biosynthesis pathway with the transporter CouP significantly improved catechol yields by about 30% and 40% under the pTrc and ADH7 promoters, respectively. This study demonstrated an aromatic-induced autoregulatory system that enabled conversion of lignin-derived aromatics into catechol without the addition of any costly external inducers, providing a promising and economically viable route for lignin valorization. PMID:29500185

  10. Interleukin-induced increase in Ia expression by normal mouse B cells.

    PubMed

    Roehm, N W; Leibson, H J; Zlotnik, A; Kappler, J; Marrack, P; Cambier, J C

    1984-09-01

    The constitutive culture supernatant (SN) of the macrophage tumor line P388D1 (P388 SN) and the concanavalin A (Con A)-induced culture supernatant of the T cell hybridoma FS6-14.13 (FS6 Con A SN) were shown to contain nonspecific factors capable of inducing increased Ia expression by normal resting B cells in a dose-dependent manner. In six consecutive experiments, the relative increase in Ia expression induced by P388 SN was 4.9 +/- 0.9, with FS6 Con A SN 10.7 +/- 1.5, and with a combination of both preparations 13.0 +/- 1.7. This increase in Ia expression occurred in virtually all the B cells, reaching maximum levels within 24 h of culture. The interleukin-induced increase in B cell Ia expression occurred in the absence of ancillary signals provided by ligand-receptor Ig cross-linking, and despite the fact that virtually all the control B cells, cultured in the absence of factors, remained in G0. These results suggest that functional receptors for at least some interleukins are expressed on normal resting B cells and that their effects can be manifest in the absence of additional activating signals. The increased Ia expression induced by the nonspecific factor preparations was shown to correlate with enhanced antigen-presenting capacity of the B cells to T cell hybridomas. The nature of the interleukins responsible for these effects remains to be definitively determined; however, the activity of FS6 Con A SN was shown to correlate with B cell growth factor activity, and increased B cell Ia expression was not observed using interleukin 2 (IL-2) or interferon-gamma prepared by recombinant DNA technology.

  11. Value-added beef products (Productos Carnicos con Valor Agregado)

    Treesearch

    Mac Donaldson; Will Holder; Jan Holder

    2006-01-01

    I'm speaking for Will and Jan Holder, who couldn't be here. I happen to be familiar with Will and Jan's company, Ervin's Natural Beef, and its program because I've sold them cattle. Will and Jan's value-added beef program is based on their family ranch in the area known as The Blue, in the mountains of eastern Arizona.

  12. Study on the valorization routes of ashes from thermoelectric power plants working under mono-and co-combustion regimes

    NASA Astrophysics Data System (ADS)

    Barbosa, Rui Pedro Fernandes

    The main objective of this thesis was to study new valorization routes for ashes produced in combustion and co-combustion processes. Three main valorization pathways were analyzed: (i) production of cement mortars, (ii) production of concretes, and (iii) use as chemical agents to remove contaminants from wastewaters. Firstly, the ashes produced during the mono-combustion of coal, the co-combustion of coal and meat and bone meal (MBM), and the mono-combustion of MBM were characterized. The aim of this study was to understand the ashes' properties at extreme levels of substitution of coal by a residue with a high contamination of specific metals. The substitution of coal by MBM produced ashes with a higher content of heavy metals. Secondly, the ashes coming from an industrial power plant working under mono-combustion (coal) and co-combustion conditions (coal + sewage sludge + MBM) were studied. The use of co-fuels did not promote significant changes in the chemical and ecotoxicological properties of the ashes. Fly ashes were successfully stabilized/solidified in cement mortar, and bottom and circulating ashes were successfully used as raw materials in concrete. The third step involved the characterization and valorization of biomass ashes resulting from the combustion of forestry residues. The highest concentrations of metals/metalloids were found in the lowest particle size fractions of the ashes. Biomass ashes successfully substituted cement and natural aggregates in concretes, without compromising their mechanical, chemical, and ecotoxicological properties. Finally, the biomass ashes were tested as chemical agents to remove contaminants from wastewaters. The removal of P, mainly phosphates, and of Pb from wastewaters was assayed. Biomass ashes presented a high capacity to remove phosphates. As fly ashes were more efficient in removing phosphates, they were further used to remove Pb from wastewaters. Again, they presented a high efficiency in Pb removal. New potential valorization routes for

  13. Social Role Theory and Social Role Valorization for Care Management Practice.

    PubMed

    Blakely, Thomas J; Dziadosz, Gregory M

    2015-01-01

    This article proposes that social role theory (SRT) and social role valorization (SRV) be established as organizing theories for care managers. SRT is a recognized sociological theory that has a distinctive place in care management practice. SRV is an adjunct for SRT that focuses on people who are devalued by being in a negative social position and supports behavior change and movement to a valued social position.

  14. High-Throughput Screening Assay for Laccase Engineering toward Lignosulfonate Valorization

    PubMed Central

    Rodríguez-Escribano, David; de Salas, Felipe; Camarero, Susana

    2017-01-01

    Lignin valorization is a pending issue for the integrated conversion of lignocellulose in consumer goods. Lignosulfonates (LS) are the main technical lignins commercialized today. However, their molecular weight should be enlarged to meet application requirements as additives or dispersing agents. Oxidation of lignosulfonates with fungal oxidoreductases, such as laccases, can increase the molecular weight of lignosulfonates by the cross-linking of lignin phenols. To advance in this direction, we describe here the development of a high-throughput screening (HTS) assay for the directed evolution of laccases, with lignosulfonate as substrate and the Folin–Ciocalteau reagent (FCR), to detect the decrease in phenolic content produced upon polymerization of lignosulfonate by the enzyme. Once the reaction conditions were adjusted to the 96-well-plate format, the enzyme for validating the assay was selected from a battery of high-redox-potential laccase variants functionally expressed in S. cerevisiae (the preferred host for the directed evolution of fungal oxidoreductases). The colorimetric response (absorbance at 760 nm) correlated with laccase activity secreted by the yeast. The HTS assay was reproducible (coefficient of variation (CV) = 15%) and sensitive enough to detect subtle differences in activity among yeast clones expressing a laccase mutant library obtained by error-prone PCR (epPCR). The method is therefore feasible for screening thousands of clones during the precise engineering of laccases toward valorization of lignosulfonates. PMID:28820431
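
    The reproducibility figure quoted above, a coefficient of variation (CV) of 15%, is simply the replicate standard deviation divided by the mean. A minimal sketch in Python (the absorbance-at-760-nm readings below are invented for illustration, not data from the study):

    ```python
    import statistics

    # Hypothetical A760 plate-reader readings for replicate wells of the
    # same laccase clone (values invented for illustration).
    a760 = [0.41, 0.38, 0.44, 0.40, 0.36, 0.43]

    def coefficient_of_variation(values):
        """CV (%) = 100 * sample standard deviation / mean."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    print(f"CV = {coefficient_of_variation(a760):.1f}%")
    ```

    In an HTS campaign, clones whose replicate CV exceeds the assay's reproducibility threshold would typically be re-screened rather than ranked.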

  15. An Investigation of the Valorization of Durian Biomass

    NASA Astrophysics Data System (ADS)

    Ng, C.

    2016-12-01

    The unsustainable exploitation of limited resources has made the valorization of biomass to obtain higher value from waste a particular area of interest in green chemistry. Much research has been done on the conversion of food waste to valuable chemicals. This study investigates the conversion of the biomass of durian (Durio zibethinus), a fruit widely consumed particularly in Southeast Asia, to gamma-valerolactone (GVL). In the presence of a sulfuric acid catalyst, the process occurs via four consecutive reactions: the dehydration of carbohydrates such as fructose (C6H12O6) and cellulose ((C6H10O5)n) to 5-(hydroxymethyl)furfural (HMF), the hydration of HMF to levulinic acid (LA) and formic acid (FA), the hydrogenation of LA to 4-hydroxyvaleric acid (4-HVA), and ultimately the dehydration of 4-HVA to gamma-valerolactone (GVL). It is hypothesized that, over an 8-hour period, there will be an initial peak in HMF concentration, followed by a steady decrease due to the hydration of HMF to LA and FA. Concentrations of HMF, LA, FA, and ammonium ion will be measured by NMR analyses of durian skin, meat, and seed samples taken at 0, 1, 2, 4, and 8 hours. Many of the impressive physical and chemical properties of GVL, including its nontoxicity, miscibility with water, and low vapor pressure, make it highly suitable as a sustainable liquid for use as a solvent, a transportation fuel, and a versatile feedstock for further derivatization. For example, the addition of GVL to a diesel-biodiesel mixture results in a significant reduction in smoke and carbon monoxide emissions. Therefore, our aim in this study is to identify the concentrations of various valuable compounds in durian waste and thereby assess the viability of the valorization of durian biomass.
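
    The hypothesized rise-then-fall of HMF is the classic behavior of an intermediate in consecutive reactions. The sketch below integrates a deliberately simplified first-order carbohydrate -> HMF -> LA scheme in Python; the rate constants and the lumping of steps are assumptions for illustration only, not values from the study:

    ```python
    # Consecutive first-order kinetics, carbohydrate -> HMF -> LA, showing
    # the transient intermediate (HMF) peak. k1, k2 and the 8 h horizon
    # are invented for illustration; simple forward-Euler integration.
    k1, k2 = 1.2, 0.6               # rate constants, 1/h (assumed)
    dt, hours = 0.01, 8.0
    carb, hmf, la = 1.0, 0.0, 0.0   # normalized concentrations

    trace = []
    t = 0.0
    while t <= hours:
        trace.append((round(t, 2), hmf))
        d_carb = -k1 * carb
        d_hmf = k1 * carb - k2 * hmf
        d_la = k2 * hmf
        carb += d_carb * dt
        hmf += d_hmf * dt
        la += d_la * dt
        t += dt

    peak_t, peak_hmf = max(trace, key=lambda p: p[1])
    print(f"HMF peaks at ~{peak_t} h, then declines as LA accumulates")
    ```

    For first-order steps the peak time is ln(k1/k2)/(k1 - k2), so sampling at 1, 2, 4, and 8 h, as the study plans, brackets the expected maximum.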

  16. Multimodal Imaging of the Normal Eye.

    PubMed

    Kawali, Ankush; Pichi, Francesco; Avadhani, Kavitha; Invernizzi, Alessandro; Hashimoto, Yuki; Mahendradas, Padmamalini

    2017-10-01

    Multimodal imaging is the concept of "bundling" images obtained from various imaging modalities, viz. fundus photography, fundus autofluorescence imaging, infrared (IR) imaging, simultaneous fluorescein and indocyanine green angiography, optical coherence tomography (OCT), and, more recently, OCT angiography. Each modality has its pros and cons as well as its limitations. Combining multiple imaging techniques overcomes their individual weaknesses and gives a comprehensive picture. Such an approach helps in accurately localizing a lesion and understanding pathology in the posterior segment. It is important to know the imaging appearance of the normal eye before evaluating pathology. This article describes these multimodal imaging modalities in detail and discusses the features of the healthy eye as seen on the various modalities mentioned above.

  17. High-Throughput Screening Assay for Laccase Engineering toward Lignosulfonate Valorization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez-Escribano, David; de Salas, Felipe; Pardo, Isabel

    Lignin valorization is a pending issue for the integrated conversion of lignocellulose in consumer goods. Lignosulfonates (LS) are the main technical lignins commercialized today. However, their molecular weight should be enlarged to meet application requirements as additives or dispersing agents. Oxidation of lignosulfonates with fungal oxidoreductases, such as laccases, can increase the molecular weight of lignosulfonates by the cross-linking of lignin phenols. To advance in this direction, we describe here the development of a high-throughput screening (HTS) assay for the directed evolution of laccases, with lignosulfonate as substrate and the Folin-Ciocalteau reagent (FCR), to detect the decrease in phenolic content produced upon polymerization of lignosulfonate by the enzyme. Once the reaction conditions were adjusted to the 96-well-plate format, the enzyme for validating the assay was selected from a battery of high-redox-potential laccase variants functionally expressed in S. cerevisiae (the preferred host for the directed evolution of fungal oxidoreductases). The colorimetric response (absorbance at 760 nm) correlated with laccase activity secreted by the yeast. The HTS assay was reproducible (coefficient of variation (CV) = 15%) and sensitive enough to detect subtle differences in activity among yeast clones expressing a laccase mutant library obtained by error-prone PCR (epPCR). The method is therefore feasible for screening thousands of clones during the precise engineering of laccases toward valorization of lignosulfonates.

  18. High-Throughput Screening Assay for Laccase Engineering toward Lignosulfonate Valorization

    DOE PAGES

    Rodriguez-Escribano, David; de Salas, Felipe; Pardo, Isabel; ...

    2017-08-18

    Lignin valorization is a pending issue for the integrated conversion of lignocellulose in consumer goods. Lignosulfonates (LS) are the main technical lignins commercialized today. However, their molecular weight should be enlarged to meet application requirements as additives or dispersing agents. Oxidation of lignosulfonates with fungal oxidoreductases, such as laccases, can increase the molecular weight of lignosulfonates by the cross-linking of lignin phenols. To advance in this direction, we describe here the development of a high-throughput screening (HTS) assay for the directed evolution of laccases, with lignosulfonate as substrate and the Folin-Ciocalteau reagent (FCR), to detect the decrease in phenolic content produced upon polymerization of lignosulfonate by the enzyme. Once the reaction conditions were adjusted to the 96-well-plate format, the enzyme for validating the assay was selected from a battery of high-redox-potential laccase variants functionally expressed in S. cerevisiae (the preferred host for the directed evolution of fungal oxidoreductases). The colorimetric response (absorbance at 760 nm) correlated with laccase activity secreted by the yeast. The HTS assay was reproducible (coefficient of variation (CV) = 15%) and sensitive enough to detect subtle differences in activity among yeast clones expressing a laccase mutant library obtained by error-prone PCR (epPCR). The method is therefore feasible for screening thousands of clones during the precise engineering of laccases toward valorization of lignosulfonates.

  19. VALORATE: fast and accurate log-rank test in balanced and unbalanced comparisons of survival curves and cancer genomics.

    PubMed

    Treviño, Victor; Tamez-Pena, Jose

    2017-06-15

    The association of genomic alterations with outcomes in cancer is affected by the problem of unbalanced groups generated by the low frequency of alterations. To address this, an R package (VALORATE) that estimates the null distribution and the P-value of the log-rank test based on a recent reformulation is presented. For a given number of alterations that define the size of the survival groups, the log-rank density is estimated by a weighted sum of conditional distributions depending on a co-occurrence term of mutations and events. The estimations are accelerated, without loss of accuracy, by sampling across co-occurrences, allowing the analysis of large genomic datasets in a few minutes. In conclusion, the proposed VALORATE R package is a valuable tool for survival analysis. The R package is available in CRAN at https://cran.r-project.org and in http://bioinformatica.mty.itesm.mx/valorateR . vtrevino@itesm.mx. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
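
    For intuition about the statistic involved, the two-group log-rank chi-square can be computed directly. A minimal sketch in Python (VALORATE itself is an R package and additionally estimates the exact null distribution for unbalanced groups, which this sketch does not attempt; the survival data below are invented):

    ```python
    # Two-group log-rank statistic from first principles (illustration only).
    # Each sample is (time, event_flag, group); data are invented.
    samples = [
        (5, 1, 0), (8, 1, 0), (12, 0, 0), (14, 1, 0), (20, 0, 0),
        (3, 1, 1), (6, 1, 1), (9, 1, 1), (11, 0, 1), (15, 1, 1),
    ]

    def logrank_statistic(samples):
        """Chi-square log-rank statistic comparing group 1 vs group 0."""
        event_times = sorted({t for t, e, _ in samples if e == 1})
        observed1 = expected1 = variance = 0.0
        for t in event_times:
            at_risk = [s for s in samples if s[0] >= t]
            n = len(at_risk)                                  # total at risk
            n1 = sum(1 for _, _, g in at_risk if g == 1)      # group 1 at risk
            d = sum(e for ti, e, _ in at_risk if ti == t)     # deaths at t
            d1 = sum(e for ti, e, g in at_risk if ti == t and g == 1)
            observed1 += d1
            expected1 += d * n1 / n
            if n > 1:
                variance += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
        return (observed1 - expected1) ** 2 / variance

    print(round(logrank_statistic(samples), 3))
    ```

    The P-value then comes from a chi-square distribution with one degree of freedom, which is exactly the approximation that becomes unreliable for the highly unbalanced groups VALORATE is designed to handle.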

  20. Trends in food waste valorization for the production of chemicals, materials and fuels: Case study South and Southeast Asia.

    PubMed

    Ong, Khai Lun; Kaur, Guneet; Pensupa, Nattha; Uisan, Kristiadi; Lin, Carol Sze Ki

    2018-01-01

    Staggering amounts of food waste are generated in Asia by agricultural processing, food transportation and storage, and human food consumption activities. This, together with the recent sustainable development goals of food security, environmental protection, and energy efficiency, is a key driver for food waste valorization. The aim of this review is to provide insight into the latest trends in food waste valorization in Asian countries such as India, Thailand, Singapore, Malaysia and Indonesia. Landfilling, incineration, and composting are the first-generation food waste processing technologies. The advancement of valorization alternatives to tackle the food waste issue is the focus of this review. Furthermore, a series of key food waste valorization schemes in this Asian region are described as case studies to demonstrate the advancement of bioconversions in these countries. Finally, important legislative aspects of food waste disposal in these Asian countries are also reported. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. From gene to biorefinery: microbial β-etherases as promising biocatalysts for lignin valorization.

    PubMed

    Picart, Pere; de María, Pablo Domínguez; Schallmey, Anett

    2015-01-01

    The set-up of biorefineries for the valorization of lignocellulosic biomass will be central to reaching future sustainability targets. In this area, biomass-degrading enzymes are attracting significant research interest for their potential in the production of chemicals and biofuels from renewable feedstocks. Glutathione-dependent β-etherases are emerging enzymes for the biocatalytic depolymerization of lignin, a heterogeneous aromatic polymer abundant in nature. They selectively catalyze the reductive cleavage of β-O-4 aryl-ether bonds, which account for 45-60% of the linkages present in lignin. Hence, application of β-etherases in lignin depolymerization would enable a specific lignin breakdown, selectively yielding (valuable) low-molecular-mass aromatics. Although β-etherases have been known biochemically for decades, only very recently have novel β-etherases been identified and thoroughly characterized for lignin valorization, expanding the enzyme toolbox for efficient β-O-4 aryl-ether bond cleavage. Given their emerging importance and potential, this mini-review discusses recent developments in the field of β-etherase biocatalysis, covering all aspects from enzyme identification to biocatalytic applications with real lignin samples.

  2. From gene to biorefinery: microbial β-etherases as promising biocatalysts for lignin valorization

    PubMed Central

    Picart, Pere; de María, Pablo Domínguez; Schallmey, Anett

    2015-01-01

    The set-up of biorefineries for the valorization of lignocellulosic biomass will be central to reaching future sustainability targets. In this area, biomass-degrading enzymes are attracting significant research interest for their potential in the production of chemicals and biofuels from renewable feedstocks. Glutathione-dependent β-etherases are emerging enzymes for the biocatalytic depolymerization of lignin, a heterogeneous aromatic polymer abundant in nature. They selectively catalyze the reductive cleavage of β-O-4 aryl-ether bonds, which account for 45–60% of the linkages present in lignin. Hence, application of β-etherases in lignin depolymerization would enable a specific lignin breakdown, selectively yielding (valuable) low-molecular-mass aromatics. Although β-etherases have been known biochemically for decades, only very recently have novel β-etherases been identified and thoroughly characterized for lignin valorization, expanding the enzyme toolbox for efficient β-O-4 aryl-ether bond cleavage. Given their emerging importance and potential, this mini-review discusses recent developments in the field of β-etherase biocatalysis, covering all aspects from enzyme identification to biocatalytic applications with real lignin samples. PMID:26388858

  3. 76 FR 37375 - Meeting of the Public Safety Officer Medal of Valor Review Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-27

    ... INFORMATION: The Public Safety Officer Medal of Valor Review Board carries out those advisory functions... who wish to participate must register at least seven (7) days in advance of the meeting/conference... registration. Anyone requiring special accommodations should contact Mr. Joy at least seven (7) days in advance...

  4. Is phytoremediation without biomass valorization sustainable? - comparative LCA of landfilling vs. anaerobic co-digestion.

    PubMed

    Vigil, Miguel; Marey-Pérez, Manuel F; Martinez Huerta, Gemma; Álvarez Cabal, Valeriano

    2015-02-01

    This study examines the sustainability of phytoremediation for soils contaminated with heavy metals, especially the influence of the management of the produced metal-enriched biomass on the environmental performance of the complete system. We examine a case study in Asturias (northern Spain), where the land was polluted with Pb by diffuse emissions from an adjacent steelmaking factory. A phytoremediation scenario based on this case was assessed by performing a comparative life cycle assessment and applying the multi-impact assessment method ReCiPe. Our Baseline scenario used the produced biomass as feedstock for an anaerobic digester that produces biogas, which is later upgraded cryogenically. The Baseline scenario was compared with two alternative scenarios: one considers depositing the produced biomass into landfill, and the other considers excavating the contaminated soil, disposing of it in a landfill, and refilling the site with pristine soil. A sensitivity analysis was performed using different yields of biomass and biogas, and different distances between the site and the biomass valorization/disposal center. Our results show that the impacts caused during agricultural activities and biomass valorization were compensated by the production of synthetic natural gas and the avoided impact of natural gas production. In addition, it was found that if the produced biomass is not valorized, the sustainability of phytoremediation is questionable. The distance between the site and the biomass processing center is not a major factor in determining the technology's sustainability, provided distances are less than 200-300 km. However, the distance to landfill or to the source of pristine soil is a key factor when deciding between phytoremediation and other ex-situ conventional remediation techniques. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. A proposed new framework for valorization of geoheritage in Norway

    NASA Astrophysics Data System (ADS)

    Dahl, Rolv; Bergengren, Anna; Heldal, Tom

    2015-04-01

    The geological history of Norway is a complex one, and the exploitation of geological resources of different kinds has always provided a backbone of the Norwegian community. Nevertheless, geology and the geological processes that created the landscape are little appreciated compared with biodiversity and cultural heritage. Some geological localities play an important role in our perception and scientific understanding of the landscape. Others are, or could be, important tourist destinations; others can be important for geoscience education at all levels; and others play a major role in the understanding of geodiversity and geoheritage and should be protected as natural monuments. A database of old registrations has been compiled, and a web mapping server based on old and new registrations was recently launched. However, no systematic classification and identification of important sites has been done in the last thirty years. We are now calling for a crowdsourcing process in the geological community to validate and valorize the registrations, as well as to define new points and areas of interest. Furthermore, we are developing a valorization system for these localities. The framework for this system is based on studies of inventories in other countries, as well as suggestions from ProGEO. The aim is to raise awareness of important sites and how they are treated and utilized for scientific or educational purposes, as tourist destinations, or as heritage sites. Our presentation will focus on the development of the framework and its implications.

  6. 75 FR 30859 - Meeting of the Public Safety Officer Medal of Valor Review Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-02

    ... INFORMATION: The Public Safety Officer Medal of Valor Review Board carries out those advisory functions... participate must register at least seven (7) days in advance of the meeting/conference call by contacting Mr.... Anyone requiring special accommodations should contact Mr. Joy at least seven (7) days in advance of the...

  7. A Randomized Phase 2 Study of Long-Acting TransCon GH vs Daily GH in Childhood GH Deficiency.

    PubMed

    Chatelain, Pierre; Malievskiy, Oleg; Radziuk, Klaudziya; Senatorova, Ganna; Abdou, Magdy O; Vlachopapadopoulou, Elpis; Skorodok, Yulia; Peterkova, Valentina; Leff, Jonathan A; Beckert, Michael

    2017-05-01

    TransCon Growth Hormone (GH) (Ascendis Pharma) is a long-acting recombinant sustained-release human GH prodrug in development for children with GH deficiency (GHD). To compare the pharmacokinetics, pharmacodynamics, safety, and efficacy of weekly TransCon GH to that of daily GH in prepubertal children with GHD. Randomized, open-label, active-controlled study of three doses of weekly TransCon GH versus daily Genotropin (Pfizer). Thirty-eight centers in 14 European countries and Egypt. Prepubertal male and female treatment-naïve children with GHD (n = 53). Subjects received one of three TransCon GH doses (0.14, 0.21, or 0.30 mg GH/kg/wk) or Genotropin 0.03 mg GH/kg/d for 26 weeks. GH and insulinlike growth factor-1 (IGF-1) levels, growth, adverse events, and immunogenicity. Both GH maximum concentration and area under the curve were similar following TransCon GH or Genotropin administration at comparable doses. A dose response was observed, with IGF-1 standard deviation scores increasing into the normal range for all three TransCon GH doses. Annualized mean height velocity for the three TransCon GH doses ranged from 11.9 cm to 13.9 cm, which was not statistically different from 11.6 cm for Genotropin. Adverse events were mild to moderate, and most were unrelated to the study drug. Injection site tolerance was good. One TransCon GH subject developed a low-titer, nonneutralizing antibody response to GH. The results suggest that long-acting TransCon GH is comparable to daily Genotropin for GH (pharmacokinetics) and IGF-1 (pharmacodynamics) levels, safety, and efficacy and support advancement into phase 3 development. Copyright © 2017 Endocrine Society

  8. Teaching Normal Birth, Normally

    PubMed Central

    Hotelling, Barbara A

    2009-01-01

    Teaching normal-birth Lamaze classes normally involves considering the qualities that make birth normal and structuring classes to embrace those qualities. In this column, teaching strategies are suggested for classes that unfold naturally, free from unnecessary interventions. PMID:19436595

  9. The protein fraction from wheat-based dried distiller's grain with solubles (DDGS): extraction and valorization

    PubMed Central

    Villegas-Torres, M.F.; Ward, J.M.; Lye, G.J.

    2015-01-01

    Nowadays there is worldwide interest in developing a sustainable economy where biobased chemicals are the lead actors. Various potential feedstocks are available including glycerol, rapeseed meal and municipal solid waste (MSW). For biorefinery applications the byproduct streams from distilleries and bioethanol plants, such as wheat-based dried distiller's grain with solubles (DDGS), are particularly attractive, as they do not compete for land use. Wheat DDGS is rich in polymeric sugars, proteins and oils, making it ideal as a current animal feed, but also a future substrate for the synthesis of fine and commodity chemicals. This review focuses on the extraction and valorization of the protein fraction of wheat DDGS as this has received comparatively little attention to date. Since wheat DDGS production is expected to increase greatly in the near future, as a consequence of expansion of the bioethanol industry in the UK, strategies to valorize the component fractions of DDGS are urgently needed. PMID:25644639

  10. The Pros and Cons of Army Automation

    DTIC Science & Technology

    2007-11-13

    [DTIC report-documentation-form residue; recoverable details: title "The Pros and Cons of Army Automation"; outline begins "I. Introduction (MSG (P) Dostie); II. Manual skills (MSG (P" (truncated).]

  11. Geodiversity of Georgia: valorization of geotouristic potential

    NASA Astrophysics Data System (ADS)

    Abramowicz, Anna

    2017-04-01

    Georgia, as a country with high geodiversity, boasts a great variety of landscapes, a wealth of geological formations, and extensive surface water systems. These attributes strongly influence the development of its geotouristic potential, and the growth of geotourism could improve the situation in the country. Unfortunately, many interesting places are insufficiently developed and difficult to access, and information about them is not provided or disseminated to visitors. Various sources describe numerous locations, but none of them carries a full inventory or database of categorized objects. An inventory based on desk studies and field work made it possible to categorize geosites in Georgia (including the occupied territories). Evidence cards with detailed descriptions were prepared for every cataloged object. The categorized geosites were used to carry out a valorization of geotouristic objects and a geodiversity evaluation using QGIS and ArcGIS. The valorization of geotouristic potential identified two regions of exceptionally high attractiveness and geodiversity on a national scale. The results of the evaluation and valorization were visualised and presented as an application on the ArcGIS Online platform.

  12. Experimental and Computational Studies of Carbonyl Diazide (CON6) as a Precursor to Diazirinone (CON2)

    NASA Astrophysics Data System (ADS)

    Esselman, Brian J.; Amberger, Brent K.; Nolan, Alex M.; Woods, R. Claude; McMahon, R. J.

    2011-10-01

    Intrigued by the reported 2005 synthesis of diazirinone (1), we carried out further experimental and theoretical studies aimed at detailed matrix-isolation and millimeter-wave spectroscopic characterization of 1. Diazirinone (1) is a peculiar isoconjugate of two very stable molecules and may be of astrochemical interest. Unfortunately, the originally reported methods of diazirinone (1) generation did not yield this species, but rather its decomposition products. Inspired by a more recent gas-phase pyrolysis of CON6 (2) to yield CON2 (1), we proposed a new method of generating CON6 (2) in solution as a precursor of diazirinone (1). This new synthesis may allow us to generate larger quantities of both CON6 and CON2 for investigation by millimeter-wave spectroscopy. We are able to safely generate carbonyl diazide (2) in sufficient yield from the reaction of triphosgene (3) and tetrabutylammonium azide in diethyl ether. This has allowed us to obtain both matrix-isolation and gas-phase IR spectra of carbonyl diazide (2). After purification, it has a gas-phase lifetime that allows samples to remain usable for up to several weeks. However, it is a shock-sensitive material that must be handled with care to prevent violent decomposition. To provide better mechanistic insight into the decomposition of carbonyl diazide (2) to diazirinone (1), we have undertaken DFT and ab initio computational studies. We have found a pathway between the two species via the triplet acylnitrene, CON4, and an oxaziridine CON2 species, but not at sufficiently low energies to allow for the trapping and detection of diazirinone (1). Preliminary millimeter-wave spectra have been obtained from several synthesized and purified samples of CON6 (2). However, the assignment of the spectral lines has been unexpectedly problematic. We have placed several CON6 (2) samples, confirmed by IR spectroscopy at the time of sample loading, into our instrument and obtained two different sets of rotational lines

  13. Presentation of the "Links" study of men who have sex with men in Buenos Aires, Argentina.

    PubMed

    Carballo-Diéguez, Alex; Avila, María M; Balán, Iván C; Marone, Rubén; Pando, María A; Barreda, Victoria

    2011-03-01

    Previous studies in Buenos Aires reported high HIV prevalences among MSM, with values ranging between 9 and 14% over nearly 10 years of continuous testing. The main objective of this study was to evaluate factors related to high-risk behavior for HIV transmission among MSM, including knowledge as well as emotional, sociocultural, and environmental factors. In addition, HIV prevalence and incidence were estimated using RDS (Respondent Driven Sampling), along with the presence of other sexually transmitted infections. Finally, HIV testing habits were assessed, probing which factors facilitate or hinder testing. The study consisted of two phases, first a qualitative phase and then a quantitative phase, with a total duration of four and a half years. During the qualitative phase, 44 individual in-depth interviews, 8 focus groups, and 10 ethnographic observations were carried out (hotels, public restrooms ("teteras"), pornographic cinemas, private parties, dark rooms, and discotheques). During the quantitative phase of the study, 500 participants were recruited from the Autonomous City of Buenos Aires as well as Greater Buenos Aires. Recruitment began with 16 participants called seeds. Diagnostic testing was performed for HIV, hepatitis B and C (HBV and HCV), Treponema pallidum, human papillomavirus (HPV), and Chlamydia. The collaboration established among working groups focused on diverse areas enabled the joint pursuit of new research strategies not previously explored in our country. The most relevant results of this research will be published progressively in successive issues of Actualizaciones en SIDA.

  14. Presentation of the "Links" study of men who have sex with men in Buenos Aires, Argentina

    PubMed Central

    Carballo-Diéguez, Alex; Ávila, María M; Balán, Iván C.; Marone, Rubén; Pando, María A.; Barreda, Victoria

    2011-01-01

    Previous studies in Buenos Aires reported high HIV prevalences among MSM, with values ranging between 9 and 14% over nearly 10 years of continuous testing. The main objective of this study was to evaluate factors related to high-risk behavior for HIV transmission among MSM, including knowledge as well as emotional, sociocultural, and environmental factors. In addition, HIV prevalence and incidence were estimated using RDS (Respondent Driven Sampling), along with the presence of other sexually transmitted infections. Finally, HIV testing habits were assessed, probing which factors facilitate or hinder testing. The study consisted of two phases, first a qualitative phase and then a quantitative phase, with a total duration of four and a half years. During the qualitative phase, 44 individual in-depth interviews, 8 focus groups, and 10 ethnographic observations were carried out (hotels, public restrooms ("teteras"), pornographic cinemas, private parties, dark rooms, and discotheques). During the quantitative phase of the study, 500 participants were recruited from the Autonomous City of Buenos Aires as well as Greater Buenos Aires. Recruitment began with 16 participants called seeds. Diagnostic testing was performed for HIV, hepatitis B and C (HBV and HCV), Treponema pallidum, human papillomavirus (HPV), and Chlamydia. The collaboration established among working groups focused on diverse areas enabled the joint pursuit of new research strategies not previously explored in our country. The most relevant results of this research will be published progressively in successive issues of Actualizaciones en SIDA. PMID:25264397

  15. 9 CFR 319.300 - Chili con carne.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    9 CFR § 319.300 (2012), Animals and Animal Products, FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE: Chili con carne. "Chili con carne" shall contain not less than 40 percent of meat...

  16. Citrus waste as feedstock for bio-based products recovery: Review on limonene case study and energy valorization.

    PubMed

    Negro, Viviana; Mancini, Giuseppe; Ruggeri, Bernardo; Fino, Debora

    2016-08-01

    The citrus peels and residues of fruit juice production are rich in d-limonene, a cyclic terpene with antimicrobial activity that can hamper energy-valorization bioprocesses. Considering that limonene is used in the nutritional, pharmaceutical, and cosmetic fields, citrus by-product processing appears to provide a suitable feedstock either for high-value product recovery or for energy bioprocesses. This waste stream, which exceeded 10 MTon in the European Union in 2013 (AIJN, 2014), is appealing as a case study on limonene recovery, since it contains about 1% w/w of this high-value-added molecule. Different processes are currently being studied to recover or remove limonene from citrus peel, both to prevent pollution and to recover energy resources. The present review aims to highlight the pros and cons of different approaches, suggesting an energy-sustainability criterion to select the most effective one for materials and energy valorization. Copyright © 2016 Elsevier Ltd. All rights reserved.
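
    The scale of the opportunity can be sized with back-of-the-envelope arithmetic from the figures quoted above (the recovery efficiency below is a hypothetical assumption, not a value from the review):

```python
# Rough sizing of the limonene recovery opportunity described above.
waste_kton = 10_000        # EU citrus by-product stream, ~10 MTon/yr (from the review)
limonene_frac = 0.01       # ~1% w/w d-limonene content (from the review)
recovery_eff = 0.7         # hypothetical process recovery efficiency (assumption)

theoretical_kton = waste_kton * limonene_frac      # upper bound on recoverable limonene
recovered_kton = theoretical_kton * recovery_eff   # with the assumed process efficiency
print(f"theoretical: {theoretical_kton:.0f} kTon/yr, recovered: {recovered_kton:.0f} kTon/yr")
```

    Even a modest recovery efficiency would yield tens of kilotonnes of limonene per year at this scale, which is why the review treats it as a key case study.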

  17. The use of concept maps in biology teaching and their effect on university students' mastery of the photosynthesis process

    NASA Astrophysics Data System (ADS)

    Gonzalez Rivera, Maria M.

    The effect of concept maps on university students' mastery of the photosynthesis process was investigated. The research used two strategies, individual concept maps and collaborative concept maps, to determine whether significant differences exist in mastery of the photosynthesis process. Data analysis included qualitative and quantitative aspects. The study shows that 80% of the students described the use of concept maps as a beneficial experience. Seventy percent of the students stated that concept maps are useful for learning the photosynthesis process, and 61% indicated that they facilitate understanding of the concepts. The most important findings of the quantitative analysis indicate that students who used concept maps significantly improved their performance on the overall post-test. The Mann-Whitney test was used to investigate whether significant differences existed between the overall post-test and pre-test; the value W = 1945.0, with a p value of 0.00, establishes significant differences. To determine whether significant differences existed between the post-test and pre-test of the individual group, the test was run again. The value of W was 490.5, which is significant, with a p value of 0.00. It is concluded that significant differences exist between post-test and pre-test performance in the individual group. The data provide sufficient evidence to hold that students who used the individual concept map strategy significantly improved their mastery of the photosynthesis process. The test was run again for the post-test and pre-test results of the collaborative group. The value of W was 446, with a p value of 0.00. It was concluded that significant differences exist between post-test and pre-test performance in the collaborative group. Finally, a
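
    The W statistics above come from the Mann-Whitney test; as a minimal illustration of how that statistic is formed (the scores below are invented for illustration, not the study's data), the U value is simply the count of pairwise wins plus half of the ties:

```python
def mann_whitney_u(a, b):
    # U statistic for sample a against sample b:
    # one point per pair where a's value wins, half a point per tie.
    return sum((x > y) + 0.5 * (x == y) for x in a for y in b)

# Invented pre/post test scores (the study's raw data are not in the abstract).
pre = [12, 15, 14, 10, 13, 11, 16, 12]
post = [20, 22, 19, 24, 21, 23, 18, 25]

print(mann_whitney_u(post, pre))  # every post score beats every pre score: 64.0
```

    In practice one would use a library routine such as SciPy's mannwhitneyu, which also computes the p value from the U distribution.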

  18. Ergonomical valorization of working spaces in multipurpose ships.

    PubMed

    Seif, Mehdi; Degiuli, Nastija; Muftić, Osman

    2003-06-01

    In this work it is shown that anthropological data are among the most needed factors in the ergonomic valorization of crew working spaces. A ship's working and living environment involves many unique human factors, which in our case deserve special consideration because of the limited crew space. Ships from different years of construction were chosen to demonstrate this tendency. As a micro study, a work-posture analysis using a pulling-force experiment was performed to determine the lumbar moment and intra-abdominal pressure as measures for evaluating and comparing different crew work positions. As a macro study, a "crew work posture analysis" was carried out using data collected from real cases. The most probable work postures in different spaces of a ship were classified, and after some corrections of the workplace, its profile and grade were determined. A "statistical analysis for real ship's spaces" was also performed as another macro study, in order to examine some real ship-space designs from the point of view of the allocated volume.

  19. Energetic and biochemical valorization of cork boiling wastewater by anaerobic digestion.

    PubMed

    Marques, Isabel Paula; Gil, Luís; La Cara, Francesco

    2014-01-01

    In addition to energy benefits, anaerobic digestion offers other interesting advantages. The cork industry is of great environmental, economic and social significance in the western Mediterranean region, with Portugal being the world-leading producer and exporter. Cork boiling wastewater (CBW) is a toxic and recalcitrant organic effluent produced by this sector, which constitutes a serious environmental hazard. However, there is no documented research on anaerobic treatment/valorization performed with this effluent. The work presented here was developed with the aim of using the anaerobic digestion process to convert the CBW polluting organic load into an energy-carrier gas and valuable molecules for industry. No lag phases were observed, and a methane yield of 0.126 to 0.142 m³ kg⁻¹ of chemical oxygen demand (COD) added was registered in the mesophilic consortium experiments carried out in batch flasks at 37 ± 1°C. Anaerobic digestion can be advantageously connected to ultrafiltration or electrochemical processes, due to the following: 1) reduction of the ellagic acid content and the consequent decrease of CBW viscosity; and 2) an increase in conductivity after the anaerobic process, avoiding the electrolyte addition required by the electrochemical process. The improvement of several CBW biochemical features shows that anaerobic digestion may additionally provide useful molecules. The rise in concentration of some of these compounds, belonging to the benzoic acid family (gallic, protocatechuic, vanillic and syringic acids), is responsible for the increase in antiradical activity of the phenolic fraction. Additionally, some enzymatic activity was also observed: while laccase activity increased in the anaerobically digested effluent, xylanase was formed during the process. The multidisciplinary approach adopted allowed the valorization of CBW in terms of energy and valuable biomolecules. By exploiting the anaerobic digestion process potential, a novel methodology to toxic

  20. Energetic and biochemical valorization of cork boiling wastewater by anaerobic digestion

    PubMed Central

    2014-01-01

    Background In addition to energy benefits, anaerobic digestion offers other interesting advantages. The cork industry is of great environmental, economic and social significance in the western Mediterranean region, with Portugal being the world-leading producer and exporter. Cork boiling wastewater (CBW) is a toxic and recalcitrant organic effluent produced by this sector, which constitutes a serious environmental hazard. However, there is no documented research on anaerobic treatment/valorization performed with this effluent. The work presented here was developed with the aim to use the anaerobic digestion process to convert the CBW polluting organic load into an energy carrier gas and valuable molecules for industry. Results No lag phases were observed and a methane yield of 0.126 to 0.142 m³ kg⁻¹ of chemical oxygen demand (COD) added was registered in the mesophilic consortium experiments carried out in batch flasks at 37 ± 1°C. Anaerobic digestion can be advantageously connected to ultrafiltration or electrochemical processes, due to the following: 1) reduction of ellagic acid content and consequent decrease of CBW viscosity; and 2) increase in conductivity after the anaerobic process, avoiding the electrolyte application of the electrochemical process. The improvement of several CBW biochemical features shows that anaerobic digestion may provide additionally useful molecules. The rise in concentration of some of these compounds, belonging to the benzoic acid family (gallic, protocatechuic, vanillic and syringic acids), is responsible for the increase of antiradical activity of the phenolic fraction. Additionally, some enzymatic activity was also observed and while the laccase activity increased in the digested effluent by anaerobiosis, xylanase was formed in the process. Conclusions The multidisciplinary approach adopted allowed the valorization of CBW in terms of energy and valuable biomolecules. By exploiting the anaerobic digestion process potential, a
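
    The methane yield reported in these cork-wastewater records (0.126 to 0.142 m3 of CH4 per kg of COD added) translates directly into an energy figure; a rough sketch, assuming the standard lower heating value of methane (about 35.8 MJ/m3) and a hypothetical COD load:

```python
# Energy recoverable from CBW digestion at the reported methane yields.
yield_low, yield_high = 0.126, 0.142   # m3 CH4 per kg COD added (from the study)
LHV_CH4 = 35.8                          # MJ per m3 CH4, standard lower heating value
cod_kg = 1000.0                         # hypothetical COD load treated, in kg

energy_low_gj = yield_low * LHV_CH4 * cod_kg / 1000    # MJ -> GJ
energy_high_gj = yield_high * LHV_CH4 * cod_kg / 1000
print(f"{energy_low_gj:.2f} to {energy_high_gj:.2f} GJ per tonne of COD")
```

    On these assumptions, each tonne of COD digested returns roughly 4.5 to 5.1 GJ as methane, which is the "energy carrier gas" benefit the authors describe.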

  1. Mincle Signaling Promotes Con-A Hepatitis

    PubMed Central

    Greco, Stephanie H.; Torres-Hernandez, Alejandro; Kalabin, Aleksandr; Whiteman, Clint; Rokosh, Rae; Ravirala, Sushma; Ochi, Atsuo; Gutierrez, Johana; Salyana, Muhammad Atif; Mani, Vishnu R.; Nagaraj, Savitha V.; Deutsch, Michael; Seifert, Lena; Daley, Donnele; Barilla, Rocky; Hundeyin, Mautin; Nikifrov, Yuriy; Tejada, Karla; Gelb, Bruce E.; Katz, Steven C.; Miller, George

    2016-01-01

    Concanavalin-A (Con-A) hepatitis is regarded as a T cell-mediated model of acute liver injury. Mincle is a C-type lectin receptor (CLR) that is critical in the immune response to mycobacteria and fungi, but does not have a well-defined role in pre-clinical models of non-pathogen mediated inflammation. Since Mincle can ligate the cell death ligand SAP130, we postulated that Mincle signaling drives intrahepatic inflammation and liver injury in Con-A hepatitis. Acute liver injury was assessed in the murine Con-A hepatitis model using C57BL/6, Mincle−/−, and Dectin-1−/− mice. The role of C/EBPβ and HIF-1α signaling was assessed using selective inhibitors. We found that Mincle was highly expressed in hepatic innate inflammatory cells and endothelial cells in both mice and humans. Furthermore, sterile Mincle ligands and Mincle signaling intermediates were increased in the murine liver in Con-A hepatitis. Most significantly, Mincle deletion or blockade protected against Con-A hepatitis whereas Mincle ligation exacerbated disease. Bone marrow chimeric and adoptive transfer experiments suggested that Mincle signaling in infiltrating myeloid cells dictates disease phenotype. Conversely, signaling via other CLRs did not alter disease course. Mechanistically, we found that Mincle blockade decreased the NF-κβ related signaling intermediates, C/EBPβ and HIF-1α, both of which are necessary in macrophage-mediated inflammatory responses. Accordingly, Mincle deletion lowered production of nitrites in Con-A hepatitis and inhibition of both C/EBPβ and HIF1-α reduced the severity of liver disease. Our work implicates a novel innate immune driver of Con-A hepatitis and, more broadly, suggests a potential role for Mincle in diseases governed by sterile inflammation. PMID:27559045

  2. Mincle Signaling Promotes Con A Hepatitis.

    PubMed

    Greco, Stephanie H; Torres-Hernandez, Alejandro; Kalabin, Aleksandr; Whiteman, Clint; Rokosh, Rae; Ravirala, Sushma; Ochi, Atsuo; Gutierrez, Johana; Salyana, Muhammad Atif; Mani, Vishnu R; Nagaraj, Savitha V; Deutsch, Michael; Seifert, Lena; Daley, Donnele; Barilla, Rocky; Hundeyin, Mautin; Nikifrov, Yuriy; Tejada, Karla; Gelb, Bruce E; Katz, Steven C; Miller, George

    2016-10-01

    Con A hepatitis is regarded as a T cell-mediated model of acute liver injury. Mincle is a C-type lectin receptor that is critical in the immune response to mycobacteria and fungi but does not have a well-defined role in preclinical models of non-pathogen-mediated inflammation. Because Mincle can ligate the cell death ligand SAP130, we postulated that Mincle signaling drives intrahepatic inflammation and liver injury in Con A hepatitis. Acute liver injury was assessed in the murine Con A hepatitis model using C57BL/6, Mincle(-/-), and Dectin-1(-/-) mice. The role of C/EBPβ and hypoxia-inducible factor-1α (HIF-1α) signaling was assessed using selective inhibitors. We found that Mincle was highly expressed in hepatic innate inflammatory cells and endothelial cells in both mice and humans. Furthermore, sterile Mincle ligands and Mincle signaling intermediates were increased in the murine liver in Con A hepatitis. Most significantly, Mincle deletion or blockade protected against Con A hepatitis, whereas Mincle ligation exacerbated disease. Bone marrow chimeric and adoptive transfer experiments suggested that Mincle signaling in infiltrating myeloid cells dictates disease phenotype. Conversely, signaling via other C-type lectin receptors did not alter disease course. Mechanistically, we found that Mincle blockade decreased the NF-κβ-related signaling intermediates C/EBPβ and HIF-1α, both of which are necessary in macrophage-mediated inflammatory responses. Accordingly, Mincle deletion lowered production of nitrites in Con A hepatitis and inhibition of both C/EBPβ and HIF-1α reduced the severity of liver disease. Our work implicates a novel innate immune driver of Con A hepatitis and, more broadly, suggests a potential role for Mincle in diseases governed by sterile inflammation. Copyright © 2016 by The American Association of Immunologists, Inc.

  3. Lignolytic-consortium omics analyses reveal novel genomes and pathways involved in lignin modification and valorization.

    PubMed

    Moraes, Eduardo C; Alvarez, Thabata M; Persinoti, Gabriela F; Tomazetto, Geizecler; Brenelli, Livia B; Paixão, Douglas A A; Ematsu, Gabriela C; Aricetti, Juliana A; Caldana, Camila; Dixon, Neil; Bugg, Timothy D H; Squina, Fabio M

    2018-01-01

    Lignin is a heterogeneous polymer representing a renewable source of aromatic and phenolic bio-derived products for the chemical industry. However, the inherent structural complexity and recalcitrance of lignin makes its conversion into valuable chemicals a challenge. Natural microbial communities produce biocatalysts derived from a large number of microorganisms, including those considered unculturable, which operate synergistically to perform a variety of bioconversion processes. Thus, metagenomic approaches are a powerful tool to reveal novel optimized metabolic pathways for lignin conversion and valorization. The lignin-degrading consortium (LigMet) was obtained from a sugarcane plantation soil sample. The LigMet taxonomical analyses (based on 16S rRNA) indicated prevalence of Proteobacteria, Actinobacteria, and Firmicutes members, including the Alcaligenaceae and Micrococcaceae families, which were enriched in the LigMet compared to sugarcane soil. Analysis of global DNA sequencing revealed around 240,000 gene models, and 65 draft bacterial genomes were predicted. Along with depicting several peroxidases, dye-decolorizing peroxidases, laccases, carbohydrate esterases, and lignocellulosic auxiliary (redox) activities, the major pathways related to aromatic degradation were identified, including benzoate (or methylbenzoate) degradation to catechol (or methylcatechol), catechol ortho-cleavage, catechol meta-cleavage, and phthalate degradation. A novel Paenarthrobacter strain harboring eight gene clusters related to aromatic degradation was isolated from LigMet and was able to grow on lignin as major carbon source. Furthermore, a recombinant pathway for vanillin production was designed based on novel gene sequences coding for a feruloyl-CoA synthetase and an enoyl-CoA hydratase/aldolase retrieved from the metagenomic data set. The enrichment protocol described in the present study was successful for a microbial consortium establishment towards the lignin and

  4. [Not Available].

    PubMed

    Roman B-C, Michela; Gonzales-Huamán, Flor; Maguiña, Jorge L

    2016-07-19

    We read with interest the article entitled "Valores del índice cintura/cadera en la población escolar de Bogotá, Colombia. Estudio FUPRECOL," which was carried out in 5,921 subjects aged 9 to 17.9 years and whose objective was to determine reference values for the waist-to-hip ratio (índice cintura-cadera, ICC).

  5. Deep eutectic solvent-based valorization of spent coffee grounds.

    PubMed

    Yoo, Da Eun; Jeong, Kyung Min; Han, Se Young; Kim, Eun Mi; Jin, Yan; Lee, Jeongmi

    2018-07-30

    Spent coffee grounds (SCGs) are viewed as a valuable resource for useful bioactive compounds, such as chlorogenic acids and flavonoids, and we suggest an eco-friendly and efficient valorization method. A series of choline chloride-based deep eutectic solvents (DESs) were tested as green extraction solvents for use with ultrasound-assisted extraction. Extraction efficiency was evaluated based on total phenolic content (TPC), total flavonoid content, total chlorogenic acids, and/or anti-oxidant activity. A binary DES named HC-6, which was composed of 1,6-hexanediol:choline chloride (molar ratio 7:1) was designed to produce the highest efficiency. Experimental conditions were screened and optimized for maximized efficiency using a two-level fractional factorial design and a central composite design, respectively. As a result, the proposed method presented significantly enhanced TPC and anti-oxidant activity. In addition, phenolic compounds could be easily recovered from extracts at high recovery yields (>90%) by adsorption chromatography. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Savior siblings, parenting and the moral valorization of children.

    PubMed

    Strong, Kimberly; Kerridge, Ian; Little, Miles

    2014-05-01

    Philosophy has long been concerned with 'moral status'. Discussions about the moral status of children, however, seem often to promote confusion rather than clarity. Using the creation of 'savior siblings' as an example, this paper provides a philosophical critique of the moral status of children and the moral relevance of parenting and the role that formative experience, regret and relational autonomy play in parental decisions. We suggest that parents make moral decisions that are guided by the moral significance they attach to children, to sick children and most importantly, to a specific sick child (theirs). This moral valorization is rarely made explicit and has generally been ignored by both philosophers and clinicians in previous critiques. Recognizing this, however, may transform not only the focus of bioethical discourse but also the policies and practices surrounding the care of children requiring bone marrow or cord blood transplantation by better understanding the values at stake behind parental decision making. © 2012 John Wiley & Sons Ltd.

  7. Women in Combat Pros and Cons

    DTIC Science & Technology

    1988-04-01

    [DTIC report-documentation-form residue; recoverable details: report number 88-0490; title "Women in Combat--Pros and Cons"; author Major Thomas H. Cecil, USAF; faculty advisor Lt Col David W. (surname truncated).]

  8. VALOR joint oscillation analysis using multiple LAr-TPCs in the Booster Neutrino Beam at Fermilab

    NASA Astrophysics Data System (ADS)

    Andreopoulos, C.; Barry, C.; Bench, F.; Chappell, A.; Dealtry, T.; Dennis, S.; Escudero, L.; Jones, R.; Grant, N.; Roda, M.; Sgalaberna, D.; Shah, R.

    2017-09-01

    Anomalies observed by different experiments suggest the existence of sterile neutrinos; the most significant are the ∼3.8 sigma νe appearance in a ∼50 MeV νµ beam from muon decay at rest observed by the LSND experiment, and the ∼3.8 sigma νe and ν̄e appearance in a ∼1 GeV neutrino beam from pion decay in flight observed by MiniBooNE. The Short Baseline Neutrino (SBN) program at Fermilab aims to perform a sensitive search for sterile neutrinos through analyses of νe appearance and νµ disappearance employing three Liquid Argon Time Projection Chambers (LAr-TPCs) at different baselines. The VALOR neutrino fitting group was established within the T2K experiment, has led numerous flagship T2K oscillation analyses, and has provided sensitivity and detector-optimisation studies for DUNE and Hyper-K. The neutrino oscillation framework developed by this group can perform fits of several samples and systematic parameters within different neutrino models and experiments. Thus, VALOR is an ideal environment for neutrino oscillation fits using multiple LAr-TPC detectors with the proper treatment of correlated systematic uncertainties necessary for the SBN analyses.
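
    Short-baseline appearance searches of this kind are commonly described with the standard two-flavor approximation P = sin²(2θ)·sin²(1.27 Δm²[eV²] L[km]/E[GeV]); a minimal sketch of that formula (the sterile-neutrino parameters below are illustrative assumptions, not SBN or VALOR fit results):

```python
import math

def p_appearance(sin2_2theta, dm2_ev2, L_km, E_GeV):
    # Two-flavor appearance probability:
    # P = sin^2(2*theta) * sin^2(1.27 * dm2 [eV^2] * L [km] / E [GeV])
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

# Illustrative short-baseline point (assumed values):
# ~600 m baseline, 0.7 GeV beam energy, small mixing, dm2 ~ 1 eV^2.
print(p_appearance(0.003, 1.2, 0.6, 0.7))
```

    The amplitude sin²(2θ) bounds the probability, while the Δm²·L/E phase is what distinguishes the three SBN baselines in a joint fit.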

  9. Pros and cons of phage therapy

    PubMed Central

    Loc-Carrillo, Catherine

    2011-01-01

    Many publications list advantages and disadvantages associated with phage therapy, which is the use of bacterial viruses to combat populations of nuisance or pathogenic bacteria. The goal of this commentary is to discuss many of those issues in a single location. In terms of “Pros,” for example, phages can be bactericidal, can increase in number over the course of treatment, tend to only minimally disrupt normal flora, are equally effective against antibiotic-sensitive and antibiotic-resistant bacteria, often are easily discovered, seem to be capable of disrupting bacterial biofilms, and can have low inherent toxicities. In addition to these assets, we consider aspects of phage therapy that can contribute to its safety, economics, or convenience, but in ways that are perhaps less essential to the phage potential to combat bacteria. For example, autonomous phage transfer between animals during veterinary application could provide convenience or economic advantages by decreasing the need for repeated phage application, but is not necessarily crucial to therapeutic success. We also consider possible disadvantages to phage use as antibacterial agents. These “Cons,” however, tend to be relatively minor. PMID:22334867

  10. EXERCISE AND THE DETECTION OF SEVERE ACUTE MOUNTAIN SICKNESS

    PubMed Central

    Garófoli, Adrián; Montoya, Paola; Elías, Carlos; Benzo, Roberto

    2012-01-01

    Acute Mountain Sickness (Mal Agudo de Montaña, MAM) is a set of nonspecific symptoms suffered by subjects who ascend rapidly from low to high altitude without adequate acclimatization. It is usually self-limited, but the severe forms (pulmonary and cerebral edema) can cause death. Exaggerated hypoxemia at rest is related to the development of MAM, but its predictive value is limited. Since exercise at altitude is accompanied by greater hypoxemia and more symptoms, we postulated the predictive value of a simple exercise test for forecasting severe MAM. The predictive value of oxygen saturation at rest and during submaximal exercise at 2,700 m and 4,300 m was studied in 63 subjects ascending Cerro Aconcagua (6,962 m). Oxygen desaturation with exercise was defined as a decrease >=5% relative to rest. The Lake Louise score was used to establish the presence of severe MAM. Six subjects developed severe MAM (9.5%) and required evacuation. Oxygen saturation at rest at 2,700 m was not significant for classifying subjects who later developed severe MAM. In contrast, the combination of desaturation during exercise at 2,700 m plus inappropriate saturation at rest at 4,300 m was significant for classifying subjects who developed severe MAM, with a positive predictive value of 80% and a negative predictive value of 97%. Our results are relevant for mountaineering and suggest adding a simple exercise test for the prediction of severe MAM. PMID:20228017
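
    The 80% positive and 97% negative predictive values follow from the usual confusion-matrix definitions; a small sketch (the cell counts below are hypothetical, chosen only to be roughly consistent with the abstract's 63 subjects and 6 severe cases):

```python
def ppv(tp, fp):
    # Positive predictive value: fraction of positive tests that are true cases.
    return tp / (tp + fp)

def npv(tn, fn):
    # Negative predictive value: fraction of negative tests that are true non-cases.
    return tn / (tn + fn)

# Hypothetical counts (not reported in the abstract): 63 climbers, 6 severe AMS.
tp, fp, fn, tn = 4, 1, 2, 56
print(f"PPV = {ppv(tp, fp):.0%}, NPV = {npv(tn, fn):.0%}")
```

    With such a low disease prevalence (6/63), a high NPV is easy to achieve; the 80% PPV is the more demanding figure for a screening test.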

  11. Giant sacral myxopapillary ependymoma with osteolysis

    PubMed Central

    Ajler, Pablo; Landriel, Federico; Goldschmidt, Ezequiel; Campero, Álvaro; Yampolsky, Claudio

    2014-01-01

    Objective: to present the case of a patient with a sacral ependymoma with extensive local bone infiltration and destruction. Case description: a 53-year-old woman presented with lumbosacral pain and perineal sensory and sphincter disturbances. Magnetic resonance imaging (MRI) and computed tomography (CT) showed a giant expansive lesion at the S2-S4 level with extensive osteolysis and invasion of adjacent tissues. Complete tumor excision was performed, with improvement in functional status. Pathology reported a myxopapillary ependymoma. Discussion: the extent of surgical resection is the best predictor of a good prognosis. Radiation therapy is reserved as an adjuvant option for incomplete resections and tumor recurrence. Chemotherapy should be used only in cases where surgery and radiotherapy are contraindicated. Conclusion: sacral myxopapillary ependymomas with bone destruction and intra- and extradural presentation are very infrequent and must be considered among the preoperative differential diagnoses. Total resection, whenever possible, is the best therapeutic alternative. PMID:25165615

  12. Ultrasound and photochemical procedures for nanocatalysts preparation: application in photocatalytic biomass valorization.

    PubMed

    Colmenares, Juan Carlos

    2013-07-01

    Nano-photocatalysis is becoming increasingly important due to its multiple applications and multidisciplinary aspects. Applications such as water/air purification, solar energy storage, chemicals production and optoelectronics are among the most promising. In recent years, there has been demand for novel, environmentally friendly and cost-efficient materials-preparation methods that could replace older ones. Unconventional and "soft" techniques such as sonication and photochemistry offer huge possibilities for the synthesis of a broad spectrum of nanostructured materials (e.g., nano-photocatalysts). In the present study, I focus on ultrasound and photochemical procedures for the preparation of nanostructured photocatalysts (e.g., supported metals, metal oxides) and their application in the valorization of organic food wastes.

  13. Evaluation of a new automated microscopy urine sediment analyser - sediMAX conTRUST®.

    PubMed

    Bogaert, Laura; Peeters, Bart; Billen, Jaak

    2017-04-01

    This study evaluated the performance of the stand-alone sediMAX conTRUST (77Elektronika, Budapest, Hungary) analyser as an alternative to microscopic analysis of urine. The validation included precision, carry-over, categorical correlation and diagnostic performance studies, with manual phase-contrast microscopy as the reference method. A total of 260 routine urine samples were assessed. The within-run precision was much better at higher concentrations than at very low concentrations. The precision met our predefined limits for all elements at the different concentrations, with the exception of the lowest RBC, WBC, pathological cast and crystal counts. There was no sample carry-over. The analyser showed good categorical agreement with manual microscopy for RBC and WBC counts, moderate agreement for yeast cells, crystals and squamous epithelial cells, and poor agreement for non-squamous epithelial cells, bacteria and casts. Diagnostic performance was satisfactory only for RBC, WBC and yeast cells. The number of false negative results was acceptable (≤4%) for all elements after connecting the sediMAX conTRUST with an automatic strip reader (AutionMAX) and after implementation of review rules. We conclude that the sediMAX conTRUST should be used as a screening tool, in combination with an automatic strip reader, for the identification of normal samples. To this end, adequate review rules should be defined. Manual microscopy is still required for 'flagged' pathological samples. Despite the poor analytical performance on pathological samples, the images on the screen can be used for interpretation without the microscope and can be stored as PDF documents for archiving the results.
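The screening workflow described above can be sketched as a simple gating rule: a sample is auto-validated as normal only when both the strip reader and the sediment analyser are negative, and any flag routes it to manual microscopy. The function name, sample records and flag labels below are illustrative assumptions, not the laboratory's actual review rules.

```python
# Illustrative sketch of a two-instrument screening rule (hypothetical names):
# auto-validate only fully negative samples; flag everything else for review.
def needs_manual_review(strip_positive: bool, sediment_flags: set) -> bool:
    """Route a sample to manual phase-contrast microscopy if any flag fires."""
    return strip_positive or bool(sediment_flags)

samples = [
    {"id": "A", "strip": False, "flags": set()},       # auto-validated as normal
    {"id": "B", "strip": True,  "flags": set()},       # strip reader positive
    {"id": "C", "strip": False, "flags": {"casts"}},   # sediment analyser flag
]
to_microscope = [s["id"] for s in samples
                 if needs_manual_review(s["strip"], s["flags"])]
print(to_microscope)   # ['B', 'C']
```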

  14. Neurosurgical Planning with OsiriX Software

    PubMed Central

    Jaimovich, Sebastián Gastón; Guevara, Martin; Pampin, Sergio; Jaimovich, Roberto; Gardella, Javier Luis

    2014-01-01

    Introduction: Anatomical individuality is key to reducing surgical trauma and obtaining a better outcome. The advance of neuroimaging now makes it possible to characterize this anatomical individuality and to plan the surgical intervention accordingly. With this aim, we present our experience with the OsiriX software. Description of the technique: Three illustrative cases are presented out of 40 performed. Case 1: a patient with a left parasagittal convexity meningioma in the premotor area; Case 2: a patient with a pituitary macroadenoma, previously operated on via a transseptosphenoidal approach at another institution with partial resection; Case 3: a patient with bilateral lesions of the middle cerebellar peduncle. Presurgical planning was carried out with the OsiriX software by fusing CT and MRI images and reconstructing them in 3D, in order to analyze anatomical relationships and to measure distances, coordinates and trajectories, among other functions. Discussion: The free, open-access OsiriX software allows the surgeon, through image fusion and 3D reconstruction, to analyze the patient's individual anatomy and to plan highly complex surgeries quickly, simply, safely and economically. In Case 1 it allowed analysis of the tumor's relationships with adjacent structures so as to minimize the approach. In Case 2 it made it possible to understand the patient's previous postoperative anatomy, to determine the trajectory of the endoscopic transnasal approach and the need to widen its exposure, achieving complete tumor resection. In Case 3 it provided the stereotactic coordinates and trajectory of a lesion with no tomographic representation. Conclusion: When costly neuronavigation or stereotactic systems are not available, the OsiriX software is an alternative for planning surgery, with the aim of reducing operative trauma and morbidity. PMID:25165617

  15. Pros and Cons of International Weapons Procurement Collaboration.

    DTIC Science & Technology

    1995-01-01

    ...advanced U.S. industry. Greater risk of cost growth and schedule slippage. Pro: U.S. and partners share common equipment. Con: U.S. require... Pros and cons of international weapons procurement collaboration / Mark Lorell, Julia Lowell. National Defense Research Institute. Prepared for the Office...

  16. Selective recovery of silver from waste low-temperature co-fired ceramic and valorization through silver nanoparticle synthesis.

    PubMed

    Swain, Basudev; Shin, Dongyoon; Joo, So Yeong; Ahn, Nak Kyoon; Lee, Chan Gi; Yoon, Jin-Ho

    2017-11-01

    Considering the value of silver metal and silver nanoparticles, the waste generated during the manufacturing of low-temperature co-fired ceramic (LTCC) was recycled through a simple yet cost-effective chemical-metallurgical process. After leaching optimization, silver was selectively recovered by precipitation. The precipitated silver chloride was then valorized through silver nanoparticle synthesis by a simple one-pot green synthesis route. Through leaching-precipitation optimization, quantitative selective recovery of silver chloride was achieved, and homogeneous pure silver nanoparticles of about 100 nm were subsequently synthesized. The reported recycling process is simple, versatile and easy to implement, and it requires minimal facilities and no specialty chemicals; with it, the semiconductor manufacturing industry can treat the waste generated during LTCC manufacturing and reutilize the valorized silver nanoparticles in a closed-loop process. The process can simultaneously address issues such as: (i) waste disposal as well as value-added silver recovery; (ii) returning material to the production stream, supporting the circular economy; and (iii) contributing to a lower-carbon, cradle-to-cradle approach to technology management. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Valorization of Rhizoclonium sp. algae via pyrolysis and catalytic pyrolysis.

    PubMed

    Casoni, Andrés I; Zunino, Josefina; Piccolo, María C; Volpe, María A

    2016-09-01

    The valorization of Rhizoclonium sp. algae through pyrolysis for obtaining bio-oils is studied in this work. The reaction is carried out at 400 °C at high contact time. The bio-oil has a practical yield of 35% and is rich in phytol. It is also simpler than that obtained from lignocellulosic biomass, owing to the absence of phenolic compounds, which makes the bio-oil relatively stable in storage. In addition, heterogeneous catalysts (Al-Fe/MCM-41, SBA-15 and Cu/SBA-15) in contact with the algae during pyrolysis are analyzed. The general trend is that the catalysts decrease the concentration of fatty alcohols and other high-molecular-weight products, since their mildly acidic sites promote degradation reactions. Thus, the amount of light products increases upon use of the catalysts. In particular, the acetol concentration in the bio-oils obtained from catalytic pyrolysis with SBA-15 and Cu/SBA-15 is notably high. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Valorization of winery waste vs. the costs of not recycling.

    PubMed

    Devesa-Rey, R; Vecino, X; Varela-Alende, J L; Barral, M T; Cruz, J M; Moldes, A B

    2011-11-01

    Wine production generates huge amounts of waste. Before the 1990s, the most economical option for waste removal was the payment of a disposal fee, usually around 3,000 euros. In recent years, however, disposal fees and fines for unauthorized discharges have increased considerably, often reaching 30,000-40,000 euros, and a prison sentence is sometimes also imposed. Several environmentally friendly technologies have been proposed for the valorization of winery waste products. Fermentation of grape marc, trimming vine shoots or vinification lees has been reported to produce lactic acid, biosurfactants, xylitol, ethanol and other compounds. Furthermore, grape marc and seeds are rich in phenolic compounds, which have antioxidant properties, and vinasse contains tartaric acid that can be extracted and commercialized. Companies must therefore invest in new technologies to decrease the impact of agro-industrial residues on the environment and to establish new processes that will provide additional sources of income. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Valorization of starchy, cellulosic, and sugary food waste into hydroxymethylfurfural by one-pot catalysis.

    PubMed

    Yu, Iris K M; Tsang, Daniel C W; Yip, Alex C K; Chen, Season S; Ok, Yong Sik; Poon, Chi Sun

    2017-10-01

    This study aimed to produce a high-value platform chemical, hydroxymethylfurfural (HMF), from food waste and to evaluate the catalytic performance of trivalent and tetravalent metal chlorides (AlCl3, CrCl3, FeCl3, Zr(O)Cl2, and SnCl4) for one-pot conversion. Starchy food waste, e.g., cooked rice and penne, produced 4.0-8.1 wt% HMF and 46.0-64.8 wt% glucose over SnCl4 after microwave heating at 140 °C for 20 min. This indicated that starch hydrolysis was effectively catalyzed but that the subsequent glucose isomerization was rate-limiting during food waste valorization; extending the reaction to 40 min raised the HMF yield from cooked rice to 22.7 wt%. Sugary food waste, e.g., kiwifruit and watermelon, yielded up to 13 wt% HMF over the Sn catalyst, mainly from naturally present fructose. Yet organic acids in fruits may hinder Fe-catalyzed dehydration by competing for the Lewis acid sites. In contrast, conversion of raw mixed vegetables as cellulosic food waste was limited by marginal hydrolysis under the studied conditions (120-160 °C and 20-40 min). It is interesting to note that tetravalent metals enabled HMF production at a lower temperature and shorter reaction time, while trivalent metals could achieve a higher HMF selectivity at an elevated temperature. Further studies on the kinetics, thermodynamics, and reaction pathways of food waste valorization are recommended. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. 9 CFR 319.301 - Chili con carne with beans.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Chili con carne with beans. 319.301 Section 319.301 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... Dehydrated Meat Food Products § 319.301 Chili con carne with beans. Chili con carne with beans shall contain...

  1. 9 CFR 319.301 - Chili con carne with beans.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Chili con carne with beans. 319.301 Section 319.301 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... Dehydrated Meat Food Products § 319.301 Chili con carne with beans. Chili con carne with beans shall contain...

  2. 9 CFR 319.301 - Chili con carne with beans.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Chili con carne with beans. 319.301 Section 319.301 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... Dehydrated Meat Food Products § 319.301 Chili con carne with beans. Chili con carne with beans shall contain...

  3. 9 CFR 319.301 - Chili con carne with beans.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Chili con carne with beans. 319.301 Section 319.301 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... Dehydrated Meat Food Products § 319.301 Chili con carne with beans. Chili con carne with beans shall contain...

  4. 9 CFR 319.301 - Chili con carne with beans.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Chili con carne with beans. 319.301 Section 319.301 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... Dehydrated Meat Food Products § 319.301 Chili con carne with beans. Chili con carne with beans shall contain...

  5. Chemical abundances of ψ Octantis

    NASA Astrophysics Data System (ADS)

    Medina, M. C.; Pintado, O. I.

    The chemical abundances of ψ Oct are determined using spectra obtained with EBASIM at CASLEO. Initial values of effective temperature and surface gravity are calculated from uvbyβ photometry. This star was previously studied by Pintado and Adelman (1996) using REOSC spectra, and by Adelman et al. (1993), the latter based on echelle spectra obtained with the Anglo-Australian Telescope. We compare our results with those of the aforementioned works, allowing an assessment of the quality of the EBASIM spectra.

  6. Normal Postprandial Nonesterified Fatty Acid Uptake in Muscles Despite Increased Circulating Fatty Acids in Type 2 Diabetes

    PubMed Central

    Labbé, Sébastien M.; Croteau, Etienne; Grenier-Larouche, Thomas; Frisch, Frédérique; Ouellet, René; Langlois, Réjean; Guérin, Brigitte; Turcotte, Eric E.; Carpentier, André C.

    2011-01-01

    OBJECTIVE Postprandial plasma nonesterified fatty acid (NEFA) appearance is increased in type 2 diabetes. Our objective was to determine whether skeletal muscle uptake of plasma NEFA is abnormal during the postprandial state in type 2 diabetes. RESEARCH DESIGN AND METHODS Thigh muscle blood flow and oxidative metabolism indexes and NEFA uptake were determined using positron emission tomography coupled with computed tomography (PET/CT) with [11C]acetate and 14(R,S)-[18F]fluoro-6-thia-heptadecanoic acid (18FTHA) in seven healthy control subjects (CON) and seven subjects with type 2 diabetes during continuous oral intake of a liquid meal to achieve steady postprandial NEFA levels with insulin infusion to maintain similar plasma glucose levels in both groups. RESULTS In the postprandial state, plasma NEFA level was higher in type 2 diabetic subjects versus CON (P < 0.01), whereas plasma glucose was at the same level in both groups. Muscle NEFA fractional extraction and blood flow index levels were 56% (P < 0.05) and 24% (P = 0.27) lower in type 2 diabetes, respectively. However, muscle NEFA uptake was similar to that of CON (quadriceps femoris [QF] 1.47 ± 0.23 vs. 1.37 ± 0.24 nmol ⋅ g−1 ⋅ min−1, P = 0.77; biceps femoris [BF] 1.54 ± 0.26 vs. 1.46 ± 0.28 nmol ⋅ g−1 ⋅ min−1, P = 0.85). Muscle oxidative metabolism was similar in both groups. Muscle NEFA fractional extraction and blood flow index were strongly and positively correlated (r = 0.79, P < 0.005). CONCLUSIONS Postprandial muscle NEFA uptake is normal despite elevated systemic NEFA levels and acute normalization of plasma glucose in type 2 diabetes. Lower postprandial muscle blood flow with resulting reduction in muscle NEFA fractional extraction may explain this phenomenon. PMID:21228312

  7. Normal postprandial nonesterified fatty acid uptake in muscles despite increased circulating fatty acids in type 2 diabetes.

    PubMed

    Labbé, Sébastien M; Croteau, Etienne; Grenier-Larouche, Thomas; Frisch, Frédérique; Ouellet, René; Langlois, Réjean; Guérin, Brigitte; Turcotte, Eric E; Carpentier, André C

    2011-02-01

    Postprandial plasma nonesterified fatty acid (NEFA) appearance is increased in type 2 diabetes. Our objective was to determine whether skeletal muscle uptake of plasma NEFA is abnormal during the postprandial state in type 2 diabetes. Thigh muscle blood flow and oxidative metabolism indexes and NEFA uptake were determined using positron emission tomography coupled with computed tomography (PET/CT) with [(11)C]acetate and 14(R,S)-[(18)F]fluoro-6-thia-heptadecanoic acid ((18)FTHA) in seven healthy control subjects (CON) and seven subjects with type 2 diabetes during continuous oral intake of a liquid meal to achieve steady postprandial NEFA levels with insulin infusion to maintain similar plasma glucose levels in both groups. In the postprandial state, plasma NEFA level was higher in type 2 diabetic subjects versus CON (P < 0.01), whereas plasma glucose was at the same level in both groups. Muscle NEFA fractional extraction and blood flow index levels were 56% (P < 0.05) and 24% (P = 0.27) lower in type 2 diabetes, respectively. However, muscle NEFA uptake was similar to that of CON (quadriceps femoris [QF] 1.47 ± 0.23 vs. 1.37 ± 0.24 nmol·g(-1)·min(-1), P = 0.77; biceps femoris [BF] 1.54 ± 0.26 vs. 1.46 ± 0.28 nmol·g(-1)·min(-1), P = 0.85). Muscle oxidative metabolism was similar in both groups. Muscle NEFA fractional extraction and blood flow index were strongly and positively correlated (r = 0.79, P < 0.005). Postprandial muscle NEFA uptake is normal despite elevated systemic NEFA levels and acute normalization of plasma glucose in type 2 diabetes. Lower postprandial muscle blood flow with resulting reduction in muscle NEFA fractional extraction may explain this phenomenon.

  8. Discretion vs. Valor: The Development and Evaluation of a Simulation Game about Being a Believer in the Soviet Union.

    ERIC Educational Resources Information Center

    Blackstone, Barbara

    A study was conducted to determine the effectiveness of "Discretion vs. Valor," a simulation game designed to give North American players a chance to: (1) identify with "believers" (Christians) in the Soviet Union in order to form new images of these persons; (2) gain empathy for Christians by understanding the dilemmas they…

  9. Innovative approach for the valorization of useful metals from waste electric and electronic equipment (WEEE)

    NASA Astrophysics Data System (ADS)

    Soare, V.; Burada, M.; Dumitrescu, D. V.; Constantin, I.; Soare, V.; Popescu, A.-M. J.; Carcea, I.

    2016-08-01

    Waste electric and electronic equipment (WEEE) is an important secondary source of rare and precious metals, and its processing through ecological technologies is a major concern in the European Union, contributing significantly to the reduction of environmental pollution and to the preservation of valuable nonferrous metal resources. This paper presents an innovative approach for the complex valorization of the useful metals contained in WEEE. The method consists of melting WEEE in a microwave-field furnace at temperatures of 1000-1200 °C for the complete separation of the metallic fraction from the organic components. The gases resulting from the melting process were also treated/neutralized in a microwave environment, and the obtained metallic bulk (a multi-component alloy) was processed through combined hydrometallurgical and electrochemical methods. The major elements in the metallic bulk (Cu, Sn, Zn, Pb) were separated/recovered by anodic dissolution, by leaching in nitric acid followed by cementation with various agents, or by electrodeposition. Depending on the electrochemical parameters, cathodic deposits consisting of Cu with a purity higher than 99.9%, or of Cu-Sn and Cu-Sn-Zn alloys, were obtained. Silver was valorized by leaching/precipitation with NaCl, and the gold concentrated in the anodic slime will be recovered by thiourea extraction. The experiments performed demonstrate the possibility of ecological and efficient processing of WEEE in a microwave field and the recovery of nonferrous and precious metals through combined hydrometallurgical and electrochemical methods.

  10. The binary LSS 3074 and its surroundings: a new OB association?

    NASA Astrophysics Data System (ADS)

    Niemela, V.; Morrell, N.; Corti, M.

    In this work we present a new orbital analysis of LSS 3074, together with spectral types and radial velocities of stars that could form a new OB association with it. The O4f star LSS 3074 was discovered to be a short-period, double-lined spectroscopic binary by Morrell & Niemela (1990, ASP Conf. Ser. 7, 57). Later, Haefner et al. (1994, IBVS 3969) found photometric variations and estimated an orbital inclination between 50° and 55°. Given the importance of obtaining empirical values for the masses of early O stars, and considering the large scatter among the observed values and their discrepancy with the predictions of theoretical models, we have obtained new spectroscopic observations of this system in order to improve the orbital elements derived in the preliminary solution. In addition, since early O stars usually belong to clusters and OB associations, we have carried out a spectroscopic investigation of several early-type stars that could be physically related to LSS 3074.

  11. Solid state fermentation (SSF): diversity of applications to valorize waste and biomass.

    PubMed

    Lizardi-Jiménez, M A; Hernández-Martínez, R

    2017-05-01

    Solid state fermentation is currently used in a range of applications, including classical ones such as enzyme or antibiotic production, recently developed products such as bioactive compounds and organic acids, new trends in bioethanol and biodiesel as alternative energy sources, and biosurfactant molecules for environmental purposes, all valorizing otherwise unexploited biomass. This work summarizes the diversity of applications of solid state fermentation to valorize biomass for alternative energy and environmental purposes. The success of applying solid state fermentation to a specific process depends on the nature of the specific microorganisms and substrates. A large number of microorganisms able to grow in a solid matrix are presented, from fungi such as Aspergillus or Penicillium for antibiotics, Rhizopus for bioactive compounds and Mortierella for biodiesel, to bacteria such as Bacillus for biosurfactant production, and yeasts for bioethanol.

  12. Southern galaxies with double nuclei

    NASA Astrophysics Data System (ADS)

    Gimeno, G.; Díaz, R.; Carranza, G.

    A sample of southern galaxies with double nuclei is studied, based on an extensive search of the literature. The morphological, photometric and spectroscopic characteristics of the sample are analyzed. For some galaxies, observations were made with the multifunction spectrograph (EMF) of the Bosque Alegre Astrophysical Station, from which kinematic parameters were determined.

  13. Normalized modes at selected points without normalization

    NASA Astrophysics Data System (ADS)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem | K - λM | = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well-known is the fact that those eigenvectors can be normalized so that their modal mass μ = ϕT Mϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but that they are actually intrinsic properties of the pair of matrices K, M, that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, and without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked up until now, but which has in turn interesting theoretical implications.
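A minimal numerical sketch of the property described above (my own illustration using a Cholesky reduction, not the paper's residue-theorem machinery): the mass-normalized modes of the pair (K, M) satisfy the resolvent expansion (K - λM)^{-1} = Σ_j φ_j φ_j^T / (λ_j - λ), so components of the normalized modes fall out of K and M alone, as residues of the resolvent, with no explicit normalization step.

```python
# Sketch: mass-normalized modes as residues of the resolvent (K - λM)^{-1}.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
K = A @ A.T + 4.0 * np.eye(4)       # real, symmetric, positive definite
B = rng.standard_normal((4, 4))
M = B @ B.T + 4.0 * np.eye(4)

# Reduce K φ = λ M φ to a standard symmetric problem via Cholesky M = L L^T;
# mapping back gives mass-normalized modes: Phi^T M Phi = I by construction.
L = np.linalg.cholesky(M)
Linv = np.linalg.inv(L)
lam, Y = np.linalg.eigh(Linv @ K @ Linv.T)
Phi = Linv.T @ Y

# Resolvent expansion (K - λM)^{-1} = sum_j outer(φ_j, φ_j) / (λ_j - λ),
# checked at a non-eigenvalue point z ...
z = 0.5 * (lam[0] + lam[1])
resolvent = np.linalg.inv(K - z * M)
series = sum(np.outer(Phi[:, j], Phi[:, j]) / (lam[j] - z) for j in range(4))

# ... so the residue at λ_0, lim (λ - λ_0)(K - λM)^{-1}, equals -φ_0 φ_0^T:
# a normalized-mode outer product obtained without any normalization step.
eps = 1e-7
residue = eps * np.linalg.inv(K - (lam[0] + eps) * M)
print(np.allclose(-residue, np.outer(Phi[:, 0], Phi[:, 0]), atol=1e-4))
```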

  14. Use of a GnRH vaccine, GonaCon, for prevention and treatment of adrenocortical disease (ACD) in domestic ferrets.

    PubMed

    Miller, Lowell A; Fagerstone, Kathleen A; Wagner, Robert A; Finkler, Mark

    2013-09-23

    Adrenocortical disease (ACD) is a common problem in surgically sterilized, middle-aged to old ferrets (Mustela putorius furo). The adrenal tissues of these ferrets develop hyperplasia, adenomas, or adenocarcinomas, which produce steroid hormones including estradiol, 17-hydroxyprogesterone, and androstenedione. Major clinical signs attributable to overproduction of these hormones are alopecia (hair loss) in both sexes and a swollen vulva in females. Pruritus, muscle atrophy, hind limb weakness, and sexual activity or aggression are also observed in both sexes. Males can develop prostatic cysts, prostatitis, and urethral obstruction. ACD is thought to be linked to continuous and increased LH secretion, due to lack of gonadal hormone feedback in neutered ferrets. This continuously elevated LH acts on adrenal cortex LH receptors, resulting in adrenal hyperplasia or adrenal tumor. This study investigated whether the immunocontraceptive vaccine GonaCon, a GnRH vaccine developed to reduce the fertility of wildlife species and the spread of disease, could prevent or delay onset of ACD and treat alopecia in ferrets with existing ACD. Results showed that GonaCon provided relief from ACD by causing production of antibodies to GnRH, probably suppressing production and/or release of LH. Treatment caused many ACD symptoms to disappear, allowing the ferrets to return to a normal life. The study also found that the probability of developing ACD was significantly reduced in ferrets treated with GonaCon when young (1-3 years old) compared to untreated control animals. GonaCon caused injection site reactions in some animals when administered as an intramuscular injection but caused few side effects when administered subcutaneously. Both intramuscular and subcutaneous vaccination resulted in similar levels of GnRH antibody titers. Subcutaneous vaccination with GonaCon is thus recommended to prevent the onset of ACD and as a possible treatment for ACD signs in domestic ferrets.

  15. Clarifying Normalization

    ERIC Educational Resources Information Center

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…

  16. The environmental sustainability of anaerobic digestion as a biomass valorization technology.

    PubMed

    De Meester, Steven; Demeyer, Jens; Velghe, Filip; Peene, Andy; Van Langenhove, Herman; Dewulf, Jo

    2012-10-01

    This paper studies the environmental sustainability of anaerobic digestion from three perspectives. First, reference electricity is compared to electricity production from domestic organic waste and energy crop digestion. Second, different digester feed possibilities in an agricultural context are studied. Third, the influence of applying digestate as fertilizer is investigated. Results highlight that biomass is converted at a rational exergy (energy) efficiency ranging from 15.3% (22.6%) to 33.3% (36.0%). From a life cycle perspective, a saving of over 90% resources is achieved in most categories when comparing biobased electricity to conventional electricity. However, operation without heat valorization results in a 32% loss of this performance, while using organic waste (domestic and agricultural residues) as feedstock avoids land resources. The use of digestate as a fertilizer is beneficial from a resource perspective, but causes increased nitrogen and methane emissions, which can be reduced by 50%, making anaerobic digestion an environmentally competitive bioenergy technology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Micro-scale energy valorization of grape marcs in winery production plants.

    PubMed

    Fabbri, Andrea; Bonifazi, Giuseppe; Serranti, Silvia

    2015-02-01

    The Biochemical Methane Potential (BMP) of winery organic waste, with reference to the by-products of two Italian red and white grapes (Nero Buono and Greco), was investigated. The study was carried out to verify the possibility of reducing the production impact from a green waste-management-chain perspective, and to assess whether wine-related by-products can be used efficiently for energy production at micro scale (i.e., in a small-to-medium winery production plant). Results showed that a good correlation can be established between the percentage of COD removal and the biogas production: from the methanization of its waste, the winery can produce about 7800 kWh year(-1) electrical and 8900 kWh year(-1) thermal. A critical evaluation was performed of the possibility of using the proposed approach to achieve optimal biomass waste management and energy valorization from a local-energy-production perspective. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. The Procurement of Non Developmental Items: Pros and Cons

    DTIC Science & Technology

    1994-09-01

    commercial way of procurement will bring more advantages than disadvantages to DOD. The list of the pros greatly outweighs the cons. There is practically... (Thesis by Giorgio Scappaticci, Lt. Col., Italian Air Force, Air Force Institute of Technology.)

  19. Automobile Shredder Residues in Italy: characterization and valorization opportunities.

    PubMed

    Fiore, S; Ruffino, B; Zanetti, M C

    2012-08-01

    At the moment, Automobile Shredder Residue (ASR) is usually landfilled worldwide, but European Directive 2000/53/EC forces the development of alternative solutions, requiring that 95 wt% of an End-of-Life Vehicle (ELV) be recovered by 2015. This work describes two industrial tests, each involving 250-300 t of ELVs, in which different pre-shredding operations were performed. The produced ASR materials underwent an extended characterization, and some post-shredding processes, consisting of dimensional, magnetic, electrostatic and densimetric separation phases, were tested at laboratory scale, with the main purpose of enhancing ASR recovery/recycling and minimizing the landfilled fraction. The gathered results show that accurate depollution and dismantling operations are mandatory to obtain a high-quality ASR material which may be recycled/recovered and partially landfilled according to the current European Union regulations, with particular concern for Lower Heating Value (LHV), heavy metals content and Dissolved Organic Carbon (DOC) as critical parameters. Moreover, post-shredding technical solutions requiring minimal economic and engineering effort, and therefore realizable in common European ELV shredding plants, may lead to multi-purpose (material recovery and thermal valorization) opportunities for ASR reuse/recovery. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. ConKit: a python interface to contact predictions.

    PubMed

    Simkovic, Felix; Thomas, Jens M H; Rigden, Daniel J

    2017-07-15

    Recent advances in protein residue contact prediction algorithms have led to the emergence of many new methods and a variety of file formats. We present ConKit, an open source, modular and extensible Python interface which allows facile conversion between formats and provides an interface to analyses of sequence alignments and sets of contact predictions. ConKit is available via the Python Package Index. The documentation can be found at http://www.conkit.org. ConKit is licensed under the BSD 3-Clause license. hlfsimko@liverpool.ac.uk or drigden@liverpool.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
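
The format conversion that ConKit automates can be illustrated with a toy example. Both minimal "formats" below are hypothetical, and the code deliberately does not use ConKit's actual API; it only sketches the parse-then-rewrite pattern such a library wraps.

```python
# Toy illustration of contact-file format conversion, the kind of task ConKit
# automates. Both "formats" here are hypothetical minimal examples; none of
# this code calls ConKit's real API.

def parse_simple(text):
    """Parse lines of 'res_i res_j confidence' into (i, j, score) tuples."""
    contacts = []
    for line in text.strip().splitlines():
        i, j, score = line.split()
        contacts.append((int(i), int(j), float(score)))
    return contacts

def write_casp_like(contacts):
    """Emit contacts in a CASP-RR-like 'i j 0 8 score' layout (illustrative)."""
    return "\n".join(f"{i} {j} 0 8 {score:.3f}" for i, j, score in contacts)

converted = write_casp_like(parse_simple("1 9 0.77\n2 14 0.64"))
```

In practice a library like ConKit additionally validates residue numbering and handles the many real formats; this sketch only shows why a common in-memory representation makes N-to-N conversion tractable.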

  1. ComSciCon: The Communicating Science Workshop for Graduate Students

    NASA Astrophysics Data System (ADS)

    Sanders, Nathan; Drout, Maria; Kohler, Susanna; Cook, Ben; ComSciCon Leadership Team

    2018-01-01

    ComSciCon (comscicon.com) is a national workshop series organized by graduate students, for graduate students, focused on leadership and training in science communication. Our goal is to empower young scientists to become leaders in their field, propagating appreciation and understanding of research results to broad and diverse audiences. ComSciCon attendees meet and interact with professional communicators, build lasting networks with graduate students in all fields of science and engineering from around the country, and write and publish original works. ComSciCon consists of both a flagship national conference series run annually for future leaders in science communication, and a series of regional and specialized workshops organized by ComSciCon alumni nationwide. We routinely receive over 1000 applications for 50 spots in our national workshop. Since its founding in 2012, over 300 STEM graduate students have participated in the national workshop, and 23 local spin-off workshops have been organized in 10 different locations throughout the country. This year, ComSciCon is working to grow as a self-sustaining organization by launching as an independent 501(c)(3) non-profit. In this poster we will discuss the ComSciCon program and methods, our results to date, potential future collaborations between ComSciCon and AAS, and how you can become involved.

  2. Tracing Dark Matter with Globular Clusters

    NASA Astrophysics Data System (ADS)

    Forte, J. C.

    We describe the strategy adopted to map the distribution of dark and baryonic matter in elliptical galaxies whose globular clusters are being observed with the VLT and Gemini telescopes. The results are illustrated with data obtained in the Fornax cluster.

  3. Pre Normal Science and the Transition to Post-Normal Policy

    NASA Astrophysics Data System (ADS)

    Halpern, J. B.

    2015-12-01

    Post-Normal Science as formulated by Funtowicz and Ravetz describes cases where "facts are uncertain, values in dispute, stakes high, and decisions urgent". However, Post-Normal Science is better described as Pre-Normal Science, the stage at which something has been observed, but no one quite knows where it came from, what it means (science) or what to do about it (policy). The initial flailing about to reach a useful understanding is later used by those who oppose action to obfuscate, by insisting that still nothing is known, that what is known is wrong, or at best that more research is needed. Consider AIDS/HIV, stratospheric ozone, tobacco, acid rain, climate change, etc. As these issues gained attention, we entered the Pre-Normal Science stage. What was the cause? How could they be dealt with? Every idea could be proposed, and was. Normal science sorted through them. Many proposers of the discarded theories still clutched them strongly, but these theories were mostly dismissed within the scientific community. Post-Normal Policy ensues when normal science has reached a consensus and it is clear that action is needed, but it is economically or philosophically impossible for some to accept that. The response is to deny the utility of science and scientific judgment, hence the attacks on scientists and scientific panels that provide policy makers with their best scientific advice. Recognizing the division between Pre-Normal Science and Post-Normal Policy, and the uses of the former to block action by the latter, is useful for understanding the course of controversies that require normal science to influence policy.

  4. ConSpeciFix: Classifying prokaryotic species based on gene flow.

    PubMed

    Bobay, Louis-Marie; Ellis, Brian Shin-Hua; Ochman, Howard

    2018-05-16

    Classification of prokaryotic species is usually based on sequence similarity thresholds, which are easy to apply but lack a biologically-relevant foundation. Here, we present ConSpeciFix, a program that classifies prokaryotes into species using criteria set forth by the Biological Species Concept, thereby unifying species definition in all domains of life. ConSpeciFix's webserver is freely available at www.conspecifix.com. The local version of the program can be freely downloaded from https://github.com/Bobay-Ochman/ConSpeciFix. ConSpeciFix is written in Python 2.7 and requires the following dependencies: Usearch, MCL, MAFFT and RAxML. ljbobay@uncg.edu.

  5. Valorization of biogas into liquid hydrocarbons in plasma-catalyst reactor

    NASA Astrophysics Data System (ADS)

    Nikravech, Mehrdad; Rahmani, Abdelkader; Labidi, Sana; Saintini, Noiric

    2016-09-01

    Biogas represents an important source of renewable energy issued from the biological degradation of biomass. Europe plans to produce, by 2020, an amount of biogas equivalent to 6400 kWh of electricity and 4500 ktoe (kilotons of oil equivalent). Currently, biogas is used in cogeneration engines to produce heat and electricity directly on farms, or it is injected into gas networks after purification and odorisation. The aim of this work is to propose a third option that consists of the valorization of biogas by transformation into liquid compounds like acetone, methanol, ethanol, acetic acid, etc. These chemicals, among the most important feed materials for chemical industries, retain CO2 molecules, helping to reduce greenhouse gas emissions, and have a high energy storage capacity. We developed a low-temperature atmospheric plasma-catalyst reactor (surface dielectric barrier discharge) to transform biogas into chemicals. The conversion rates of CH4 and CO2 are about 50% and 30%, respectively, depending on operational conditions. The energetic cost is 25 eV/molecule. The yield of liquid products currently reaches 10 wt%. More than 11 chemicals are observed in the liquid fraction. Acknowledgements are due to the SPC Programme Energie de demain.

  6. Advocating for Normal Birth With Normal Clothes

    PubMed Central

    Waller-Wise, Renece

    2007-01-01

    Childbirth educators need to be aware that the clothes they wear when teaching classes send a nonverbal message to class participants. Regardless of who wears the clothing or what is worn, clothes send a message; thus, both the advantages and disadvantages related to clothing choice should be considered. Ultimately, the message should reflect the values of supporting normal birth. For childbirth educators who are allowed to choose their own apparel to wear in their classes, street clothes may be the benchmark for which to strive. This article discusses the many nonverbal messages that clothes convey and provides support for the choice of street clothes as the dress for the professional childbirth educator; thus, “normal clothes to promote normal birth.” PMID:18408807

  7. Thermochemical valorization and characterization of household biowaste.

    PubMed

    Vakalis, S; Sotiropoulos, A; Moustakas, K; Malamis, D; Vekkos, K; Baratieri, M

    2017-12-01

    Valorization of municipal solid waste (MSW), by means of energy and material recovery, is considered to be a crucial step for sustainable waste management. A significant fraction of MSW consists of food waste, the treatment of which is still a challenge. Therefore, the conventional disposal of food waste in landfills is being gradually replaced by recycling, aerobic treatment, anaerobic digestion and waste-to-energy. In principle, thermal processes like combustion and gasification are preferred for the recovery of energy due to their higher electrical efficiency and the significantly shorter time required for the process to be completed when compared to biological processes, i.e. composting, anaerobic digestion and transesterification. Nonetheless, the high water content and the molecular structure of biowaste are constraining factors with regard to the application of thermal conversion pathways. Investigating alternative solutions for the pre-treatment and more energy-efficient handling of this waste fraction may provide pathways for the optimization of the whole process. In this study, by utilizing drying/milling as an intermediate step, thermal treatment of household biowaste has become possible. Household biowaste was thermally processed in a bench-scale reactor by means of torrefaction, carbonization and high-temperature pyrolysis. Depending on the operational conditions, varying fractions of biochar, bio-oil (tar) and syngas were recovered. The thermochemical properties of the feedstock and products were analyzed by means of Simultaneous Thermal Analysis (STA), ultimate and proximate analysis and Attenuated Total Reflectance (ATR). The analysis of the products shows that torrefaction of dried household biowaste produces an energy-dense fuel, and high-temperature pyrolysis produces a graphite-like material with relatively high yield. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Games Con Men Play: The Semiosis of Deceptive Interaction.

    ERIC Educational Resources Information Center

    Hankiss, Agnes

    1980-01-01

    Analyzes some of the most frequent deceptive interactions as rendered through case histories of male con artists and their victims taken from police records. Discusses the recurrent elements in both the con-games strategies and victims' way of interpreting those strategies. (JMF)

  9. Adsorptive removal of organics from aqueous phase by acid-activated coal fly ash: preparation, adsorption, and Fenton regenerative valorization of "spent" adsorbent.

    PubMed

    Wang, Nannan; Hao, Linlin; Chen, Jiaqing; Zhao, Qiang; Xu, Han

    2018-05-01

    Raw coal fly ash was activated into an adsorbent by sulfuric acid impregnation. The activation conditions, the adsorption capacity, and the regenerative valorization of the adsorbent were studied. The results show that the optimal preparation conditions of the adsorbent are [H2SO4] = 1 mol L⁻¹, activation time = 30 min, ratio of coal fly ash to acid = 1:20 (g:mL), and calcination temperature = 100 °C. The adsorption of p-nitrophenol on the adsorbent follows the pseudo-second-order kinetic equation, with an adsorption rate constant of 0.089 g mg⁻¹ min⁻¹. Adsorption on this adsorbent can be considered complete after 35 min, when the corresponding adsorption capacity is 1.07 mg g⁻¹ (85.6% removal of p-nitrophenol). Compared with raw coal fly ash, the adsorbent has a stable adsorption performance over a low pH range (pH = 1-6), and the adsorption of p-nitrophenol is an exothermic process. Ninety minutes is required for the regenerative valorization of the saturated adsorbent by the Fenton process. Regeneration of the saturated adsorbent can reach 89% under the optimal proposed conditions (30 °C, pH = 3, [H2O2] = 5.0 mmol L⁻¹, [Fe²⁺] = 5.5 mmol L⁻¹). Over 15 experimental runs, the stability of the adsorbent improved with the number of runs. Finally, a mechanism for the activation of coal fly ash is proposed and verified by the results of SEM and BET tests.
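
The pseudo-second-order (PSO) model behind the reported rate constant has the integrated form q(t) = k·qe²·t / (1 + k·qe·t). The abstract gives k = 0.089 g mg⁻¹ min⁻¹ but not the equilibrium capacity qe; the value below is a hypothetical fit chosen so that the model reproduces the reported ~1.07 mg g⁻¹ uptake at 35 min, and is stated purely for illustration.

```python
# Pseudo-second-order adsorption kinetics: q(t) = k*qe^2*t / (1 + k*qe*t).
# k is the rate constant reported in the abstract; qe is NOT reported, so the
# value below is an assumed illustrative fit (qe ~ 1.33 mg/g reproduces the
# reported ~1.07 mg/g uptake at 35 min).

K = 0.089   # g mg^-1 min^-1, rate constant from the study
QE = 1.33   # mg g^-1, assumed equilibrium capacity (hypothetical)

def q_t(t_min, k=K, qe=QE):
    """Adsorbed amount (mg/g) after t_min minutes under the PSO model."""
    return k * qe ** 2 * t_min / (1.0 + k * qe * t_min)
```

The model approaches qe asymptotically, so any finite-time uptake stays below the assumed equilibrium capacity.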

  10. Breath Analysis Science at PittCon 2012, Orlando, Florida

    EPA Science Inventory

    Breath analysis science was featured in three organized sessions at this year’s Pittsburgh Conference and Exposition, or ‘PittCon 2012’ (http://www.pittcon.org/). As described in previous meeting reports, PittCon is one of the largest international conferences for analytical chem...

  11. Preparation of sustainable photocatalytic materials through the valorization of industrial wastes.

    PubMed

    Sugrañez, Rafael; Cruz-Yusta, Manuel; Mármol, Isabel; Morales, Julián; Sánchez, Luis

    2013-12-01

    A new value-added material was developed from wastes, aiming at appropriate waste management and sustainable development. This paper reports the valorization of industrial sandblasting operation wastes (SOWs) as new photocatalytic materials. This waste is composed of Fe2O3 (60.7%), SiO2 (29.1%), and Al2O3 (3.9%) as the main components. The high iron oxide content was used to develop photocatalytic properties through thermal transformation into α-Fe2O3. The new product, SOW-T, exhibited good behavior towards the photochemical degradation of organic dyes. The preparation of advanced photocatalytic materials that exhibit self-cleaning and depolluting properties was possible by the inclusion of SOW-T and TiO2 in a cement-based mortar. The synergy observed between both materials enhanced their photocatalytic action. To the best of our knowledge, this is the first report that describes the use of transformed wastes based on iron oxide for the photochemical oxidation of NOx gases. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Enhancement technology improves palatability of normal and callipyge lambs.

    PubMed

    Everts, A K R; Wulf, D M; Wheeler, T L; Everts, A J; Weaver, A D; Daniel, J A

    2010-12-01

    The objective of this research was to determine if BPI Processing Technology (BPT) improved palatability of normal (NN) and callipyge (CN) lamb meat and to determine the mechanism by which palatability was improved. Ten ewe and 10 wether lambs of each phenotype were slaughtered, and carcass traits were assessed by a trained evaluator. The LM was removed at 2 d postmortem. Alternating sides served as controls (CON) or were treated with BPT. Muscles designated BPT were injected to a target 120% by weight with a patented solution containing water, ammonium hydroxide, carbon monoxide, and salt. Muscle pH, cooking loss, Warner-Bratzler shear force (WBS), sarcomere length, cooked moisture retention, and desmin degradation were measured. A trained sensory panel and a take-home consumer panel evaluated LM chops. Callipyge had a heavier BW and HCW, less adjusted fat thickness, reduced yield grades, and greater conformation scores than NN (P < 0.05). For LM, NN had shorter sarcomeres, smaller WBS values, greater juiciness ratings, more off-flavors, reduced consumer ratings for raw characteristics (like of portion size, like of color, like of leanness, overall like of appearance) and greater consumer ratings for eating characteristics (like of juiciness, like of flavor) than CN (P < 0.05). For LM, BPT had greater cooked moisture retention, smaller WBS values, greater juiciness ratings, less off-flavors, and greater consumer ratings for raw characteristics (like of portion size, like of color, overall like of appearance) and eating characteristics (like of juiciness, like of flavor) than CON (P < 0.05). Significant phenotype × treatment interactions occurred for LM muscle pH, desmin degradation, tenderness, consumer like of texture/tenderness, and consumer overall like of eating quality (P < 0.05). For LM, BPT increased muscle pH more for NN than CN (P < 0.01) and increased desmin degradation for NN but decreased desmin degradation for CN (P < 0.01). The BPT enhancement

  13. Sources of variability in the diagnosis of multifocal atrophic gastritis associated with Helicobacter pylori infection

    PubMed Central

    Bravo, Luis Eduardo; Bravo, Juan Carlos; Realpe, José Luis; Zarama, Guillermo; Piazuelo, MarÍa Blanca; Correa, Pelayo

    2014-01-01

    ABSTRACT Introduction: The mapping of the different regions of the stomach and the number of gastric mucosa fragments available for histopathological evaluation are important sources of variation when classifying and grading chronic gastritis. Objectives: To estimate the sensitivity of the number of gastric mucosa fragments needed to establish diagnoses of atrophic gastritis with intestinal metaplasia (IM), dysplasia, and Helicobacter pylori infection status, and to evaluate intra-observer variability in the classification of these precursor lesions of gastric cancer. Materials and methods: In a cohort with 6 years of follow-up, 1,958 endoscopic procedures performed by two gastroenterologists were evaluated. In each procedure, 5 gastric mucosa biopsies representing the antrum, incisura angularis and corpus were obtained from each participant. A single pathologist performed the histological interpretation of the 5 biopsies and provided a definitive global diagnosis, which was used as the reference standard. Each examined fragment yielded an individual diagnosis that was compared with the reference standard. Intra-observer variability was evaluated in 127 persons, corresponding to a random sample of 20% of all endoscopies performed at 72 months of follow-up. Results: The sensitivity of the diagnosis of IM and gastric dysplasia increased significantly with the number of gastric mucosa fragments evaluated. The anatomical site with the highest sensitivity for the diagnosis of IM and dysplasia was the incisura angularis. For detecting H. pylori, high sensitivity was achieved with the study of a single gastric mucosa fragment (95.9%), independent of the biopsy site. Intra-observer agreement for the diagnosis of chronic gastritis was 86.1%, with a kappa value of 0.79 (95% CI: 0.76-0.85).
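
The intra-observer agreement statistics quoted in the record (86.1% raw agreement, kappa = 0.79) can be illustrated with a minimal Cohen's kappa computation; the confusion matrix below is invented for illustration and is not the study's data.

```python
# Cohen's kappa for rater agreement: kappa = (p_o - p_e) / (1 - p_e), where
# p_o is observed agreement and p_e is chance agreement from the marginals.
# The 2x2 confusion matrix below is illustrative, not the study's data.

def cohens_kappa(matrix):
    """Compute Cohen's kappa for a square confusion matrix (rows = rating 1,
    columns = rating 2, entries = counts)."""
    total = sum(sum(row) for row in matrix)
    p_o = sum(matrix[i][i] for i in range(len(matrix))) / total
    p_e = sum(
        sum(matrix[i]) * sum(row[i] for row in matrix)
        for i in range(len(matrix))
    ) / total ** 2
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical re-reading of 100 biopsies: 90 agreements on the diagonal.
kappa = cohens_kappa([[40, 5], [5, 50]])
```

Here raw agreement is 90% but kappa is lower (~0.80) because some agreement is expected by chance, which is exactly why the study reports both figures.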

  14. Power of tests of normality for detecting contaminated normal samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thode, H.C. Jr.; Smith, L.A.; Finch, S.J.

    1981-01-01

    Seventeen tests of normality or goodness of fit were evaluated for power at detecting a contaminated normal sample. This study used 1000 replications each of samples of size 12, 17, 25, 33, 50, and 100 from six different contaminated normal distributions. The kurtosis test was the most powerful over all sample sizes and contaminations. The Hogg and weighted Kolmogorov-Smirnov tests were second. The Kolmogorov-Smirnov, chi-squared, Anderson-Darling, and Cramer-von-Mises tests had very low power at detecting contaminated normal random variables. Tables of the power of the tests and the power curves of certain tests are given.
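
The power-study design described above can be sketched as a small Monte Carlo simulation of a kurtosis-based test. The sample size, contamination parameters and replication counts below are illustrative assumptions, not the study's exact settings.

```python
import random

# Monte Carlo sketch of a power study: estimate the power of a simple kurtosis
# test at detecting a contaminated normal sample. All parameters (n, eps,
# scale, reps) are illustrative, not the study's design.
random.seed(1)

def kurtosis(xs):
    """Sample kurtosis m4 / m2^2 (a normal sample gives values near 3)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / m2 ** 2

def sample(n, contaminated=False, eps=0.1, scale=3.0):
    """Draw n values from N(0,1); in the contaminated case each value has
    probability eps of coming from the wider N(0, scale^2) instead."""
    return [
        random.gauss(0.0, scale if (contaminated and random.random() < eps) else 1.0)
        for _ in range(n)
    ]

def power_estimate(n=25, reps=200, alpha=0.05):
    """Calibrate a one-sided critical value under the null by simulation,
    then count rejections under contamination."""
    null_stats = sorted(kurtosis(sample(n)) for _ in range(reps))
    crit = null_stats[int((1 - alpha) * reps)]
    rejections = sum(kurtosis(sample(n, contaminated=True)) > crit for _ in range(reps))
    return rejections / reps

power = power_estimate()
```

Heavy-tailed contamination inflates kurtosis, so a right-tailed kurtosis test rejects more often than the nominal alpha, which is the pattern the study quantified across seventeen tests.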

  15. Metabolic and inflammatory impact of a meal rich in saturated fats and its relationship with abdominal obesity.

    PubMed

    Alayón, Alicia Norma; Rivadeneira, Ana Patricia; Herrera, Carlos; Guzmán, Heidy; Arellano, Dioneris; Echeverri, Isabella

    2018-05-01

    Introduction: The postprandial phase is associated with increases in markers related to cardiovascular risk, the magnitude of which depends on metabolic state. Objective: To determine the impact of ingesting a meal rich in saturated fats on the metabolic and inflammatory profile and its relationship with abdominal obesity. Materials and methods: A clinical trial was conducted in 42 individuals (21 with abdominal obesity). Blood glucose, insulin, lipid profile, C-reactive protein, lipopolysaccharides and interleukin 6 were measured fasting and after ingestion. Results: In addition to obesity, insulin resistance and elevated fasting levels of triacylglycerols and C-reactive protein were present. Higher postprandial levels of glucose, insulin and triacylglycerols were also detected. Interleukin 6 decreased in the non-obese group, and lipopolysaccharides increased in both groups. Conclusion: Ingesting a meal rich in saturated fats had a greater impact on glycemic variables in the obese group and, although it affected lipids similarly in both groups, the increase in triacylglycerols was greater in the presence of an elevated baseline concentration and promoted an increase in lipopolysaccharides. Baseline and postprandial inflammation affected the obese group to a greater extent. The postprandial period reflects the most frequent state of individuals during a normal day and reveals their metabolic response capacity to food intake, as well as early states of metabolic risk.

  16. Normalizing and scaling of data to derive human response corridors from impact tests.

    PubMed

    Yoganandan, Narayan; Arun, Mike W J; Pintar, Frank A

    2014-06-03

    It is well known that variability is inherent in any biological experiment. Human cadavers (Post-Mortem Human Subjects, PMHS) are routinely used to determine responses to impact loading for crashworthiness applications in both civilian (motor vehicle) and military environments. It is important to transform measured variables from PMHS tests (accelerations, forces and deflections) to a standard or reference population, a process termed normalization. The transformation should account for inter-specimen variations, with some underlying assumptions used during normalization. Scaling is a process by which normalized responses are converted from one standard to another (for example, from the mid-size adult male to the large male and small female adult, and to pediatric populations). These responses are used to derive corridors to assess the biofidelity of anthropomorphic test devices (crash dummies) used to predict injury in impact environments and to design injury-mitigating devices. This survey examines the pros and cons of different approaches for obtaining normalized and scaled responses and corridors used in biomechanical studies over four decades. Specifically, the equal-stress equal-velocity and impulse-momentum methods, along with their variations, are discussed. Approaches ranging from subjective methods to quasi-static loading are discussed for deriving temporal mean and ±1 standard deviation human corridors of time-varying fundamental responses and cross variables (e.g., force-deflection). The survey offers some insights into the potential efficacy of these approaches with examples from recent impact tests, and concludes with recommendations for future studies. The importance of considering various parameters during the experimental design of human impact tests is stressed. Published by Elsevier Ltd.
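
One common variant of the equal-stress equal-velocity normalization mentioned above maps a subject's responses to a reference population through a single geometric scale factor derived from the mass ratio. The 76 kg reference mass and the exponents below are assumptions of this sketch, not the paper's prescription.

```python
# Equal-stress equal-velocity scaling, one common variant (an assumption of
# this sketch, not necessarily the surveyed papers' exact formulation):
# lam = (m_ref / m_subject)**(1/3); deflection and time scale by lam,
# force by lam**2, acceleration by 1/lam.

REF_MASS_KG = 76.0  # assumed mid-size adult male reference mass

def scale_factors(m_subject_kg, m_ref_kg=REF_MASS_KG):
    """Multiplicative factors that map a subject's measured responses
    to the reference-mass standard under geometric similarity."""
    lam = (m_ref_kg / m_subject_kg) ** (1.0 / 3.0)
    return {
        "deflection_time": lam,
        "force": lam ** 2,
        "acceleration": 1.0 / lam,
    }

# Example: normalize a lighter 61 kg subject's responses to the 76 kg reference.
f = scale_factors(61.0)
```

Because the subject is lighter than the reference, lam exceeds 1, so forces and deflections are scaled up while accelerations are scaled down.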

  17. Valorization of glycerol through the production of biopolymers: the PHB case using Bacillus megaterium.

    PubMed

    Naranjo, Javier M; Posada, John A; Higuita, Juan C; Cardona, Carlos A

    2013-04-01

    In this work, technical and economic analyses were performed to evaluate the transformation of glycerol into polyhydroxybutyrate (PHB) using Bacillus megaterium. The production of PHB was compared using glycerol or glucose as substrate, and similar yields were obtained. The total production costs for PHB with both substrates were estimated at an industrial scale. Compared to glucose, glycerol showed 10% and 20% decreases in PHB production costs using two different separation schemes, respectively. Moreover, a 20% profit margin in the PHB sales price using glycerol as substrate resulted in a 166% valorization of crude glycerol. In this work, the feasibility of glycerol as a feedstock for the production of PHB at laboratory (up to 60% PHB accumulation) and industrial (2.6 US$/kg PHB) scales is demonstrated. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Thermochemical Wastewater Valorization via Enhanced Microbial Toxicity Tolerance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckham, Gregg T; Thelhawadigedara, Lahiru Niroshan Jayakody; Johnson, Christopher W

    Thermochemical (TC) biomass conversion processes such as pyrolysis and liquefaction generate considerable amounts of wastewater, which often contains highly toxic compounds that are incredibly challenging to convert via standard wastewater treatment approaches such as anaerobic digestion. These streams represent a cost for TC biorefineries, and a potential valorization opportunity, if effective conversion methods are developed. The primary challenge hindering microbial conversion of TC wastewater is toxicity. In this study, we employ a robust bacterium, Pseudomonas putida, with TC wastewater streams to demonstrate that aldehydes are the most inhibitory compounds in these streams. Proteomics, transcriptomics, and fluorescence-based immunoassays of P. putida grown in a representative wastewater stream indicate that stress results from protein damage, which we hypothesize is a primary toxicity mechanism. Constitutive overexpression of the chaperone genes groEL, groES, and clpB in a genome-reduced P. putida strain improves the tolerance towards multiple TC wastewater samples up to 200-fold. Moreover, the concentration ranges of TC wastewater are industrially relevant for further bioprocess development for all wastewater streams examined here, representing different TC process configurations. Furthermore, we demonstrate proof-of-concept polyhydroxyalkanoate production from the usable carbon in an exemplary TC wastewater stream. Overall, this study demonstrates that protein quality control machinery and repair mechanisms can enable substantial gains in microbial tolerance to highly toxic substrates, including heterogeneous waste streams. When coupled to other metabolic engineering advances such as expanded substrate utilization and enhanced product accumulation, this study generally enables new strategies for biological conversion of highly toxic, organic-rich wastewater via engineered aerobic monocultures or designer consortia.

  19. Interactions between Polygonal Normal Faults and Larger Normal Faults, Offshore Nova Scotia, Canada

    NASA Astrophysics Data System (ADS)

    Pham, T. Q. H.; Withjack, M. O.; Hanafi, B. R.

    2017-12-01

    Polygonal faults, small normal faults with polygonal arrangements that form in fine-grained sedimentary rocks, can influence ground-water flow and hydrocarbon migration. Using well and 3D seismic-reflection data, we have examined the interactions between polygonal faults and larger normal faults on the passive margin of offshore Nova Scotia, Canada. The larger normal faults strike approximately E-W to NE-SW. Growth strata indicate that the larger normal faults were active in the Late Cretaceous (i.e., during the deposition of the Wyandot Formation) and during the Cenozoic. The polygonal faults were also active during the Cenozoic because they affect the top of the Wyandot Formation, a fine-grained carbonate sedimentary rock, and the overlying Cenozoic strata. Thus, the larger normal faults and the polygonal faults were both active during the Cenozoic. The polygonal faults far from the larger normal faults have a wide range of orientations. Near the larger normal faults, however, most polygonal faults have preferred orientations, either striking parallel or perpendicular to the larger normal faults. Some polygonal faults nucleated at the tip of a larger normal fault, propagated outward, and linked with a second larger normal fault. The strike of these polygonal faults changed as they propagated outward, ranging from parallel to the strike of the original larger normal fault to orthogonal to the strike of the second larger normal fault. These polygonal faults hard-linked the larger normal faults at and above the level of the Wyandot Formation but not below it. We argue that the larger normal faults created stress-enhancement and stress-reorientation zones for the polygonal faults. Numerous small, polygonal faults formed in the stress-enhancement zones near the tips of larger normal faults. Stress-reorientation zones surrounded the larger normal faults far from their tips. Fewer polygonal faults are present in these zones, and, more importantly, most polygonal faults

  20. Determination of the Optical Characteristics of the OAN150 Telescope

    NASA Astrophysics Data System (ADS)

    Galan, M. J.; Cobos, F. J.

    1987-05-01

    At the Calar Alto Observatory in Almería, Spain, there is a telescope of 150 cm diameter, built by REOSC, belonging to the Observatorio Astronómico Nacional (OAN), headquartered in Madrid, Spain. The technical infrastructure of the OAN has traditionally been weak, and an effort is currently being made to strengthen it. Only very limited information exists about the telescope in general; for its optics in particular, the values of the main parameters were known, but it was not known whether they correspond to theoretical or as-built values. It was therefore considered necessary to undertake an investigation to determine in detail the actual values of the telescope's optical components, and some results of interest were obtained. The primary of the OAN150 telescope is approximately F/3, and the system as a whole is F/8.2 with its field-corrector system. In general terms, the image is satisfactory across the whole field and, without the corrector system, the axial image is also good. In the near future, additional instrumentation is planned to be designed for this telescope. Knowing its characteristics more precisely can be of great use for that purpose, since calculations must consider the telescope and the instrument together.
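
The focal-ratio figures in the abstract determine the telescope's focal lengths directly (focal length = aperture × f-ratio). The small sketch below also derives the plate scale, a standard small-angle computation added here for illustration rather than taken from the abstract.

```python
# Focal-length arithmetic for the figures in the abstract: a 150 cm aperture
# at F/3 (primary mirror) and F/8.2 (full system with field corrector).

def focal_length_cm(aperture_cm, f_ratio):
    """Focal length from aperture and focal ratio."""
    return aperture_cm * f_ratio

def plate_scale_arcsec_per_mm(focal_length_mm):
    """Plate scale via the small-angle approximation (206265 arcsec/radian)."""
    return 206265.0 / focal_length_mm

f_primary = focal_length_cm(150.0, 3.0)   # 450 cm primary focal length
f_system = focal_length_cm(150.0, 8.2)    # 1230 cm system focal length
scale = plate_scale_arcsec_per_mm(f_system * 10.0)  # ~16.8 arcsec/mm
```

The plate scale is the kind of derived quantity instrument designers need, which is presumably why the authors emphasize knowing the as-built parameters precisely.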

  1. Valorization of Cheese and Tofu Whey through Enzymatic Synthesis of Lactosucrose.

    PubMed

    Corzo-Martinez, Marta; Luscher, Alice; de Las Rivas, Blanca; Muñoz, Rosario; Moreno, F Javier

    2015-01-01

    This work deals with the development of a new bioprocess for the efficient synthesis of lactosucrose, a potential prebiotic oligosaccharide with high added value, from two important and inexpensive agro-industrial by-products: tofu whey and cheese whey permeate. The bioconversion is driven by the ability of the enzyme levansucrase SacB from Bacillus subtilis CECT 39 to transfructosylate lactose contained in the cheese whey permeate by using not only sucrose but also raffinose and stachyose, which are present in considerable amounts in tofu whey, as suitable donors of fructosyl moieties. The maximum lactosucrose concentration obtained from both by-products was 80.1 g L-1 after a short reaction time (120 min) at 37°C, leading to productivity and specific productivity values of 40.1 g lactosucrose L-1 h-1 and 80.1 mg lactosucrose U enzyme-1 h-1, respectively. The findings in this work could provide a new strategy to valorize agro-industrial by-products such as cheese whey permeate and, especially, tofu whey by using them as renewable resources in the enzymatic synthesis of bioactive oligosaccharides.

  3. Thermal valorization of post-consumer film waste in a bubbling bed gasifier.

    PubMed

    Martínez-Lera, S; Torrico, J; Pallarés, J; Gil, A

    2013-07-01

    Plastic bags and film packaging are used widely across many sectors, and film waste is present in several municipal and industrial waste streams. A significant fraction of it is not suitable for mechanical recycling but could be safely transformed into a valuable gas by means of thermal valorization. In this research, the gasification of film waste was investigated experimentally in a fluidized bed reactor using two reference polymers, polyethylene and polypropylene, and actual post-consumer film waste. After a complete experimental characterization of the three materials, several gasification experiments were performed to analyze the influence of the fuel and of the equivalence ratio on gas production and composition, on tar generation, and on efficiency. The experiments prove that film waste and analogous polymer-derived wastes can be successfully gasified in a fluidized bed reactor, yielding a gas with a higher heating value in the range of 3.6 to 5.6 MJ/m3 and cold gas efficiencies of up to 60%. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Smooth quantile normalization.

    PubMed

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
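The qsmooth idea can be sketched in a few lines of NumPy. This is a simplified illustration, not the package's implementation: the published method estimates the blend weight per quantile from a between-group to total sum-of-squares ratio, whereas here `w` is a fixed, user-supplied parameter:

```python
import numpy as np

def qsmooth(X, groups, w=0.5):
    """Simplified smooth quantile normalization (sketch only).

    X: features x samples matrix; groups: per-sample group labels.
    w blends the overall reference quantiles (w=1 reduces to plain
    quantile normalization) with the group-specific quantiles (w=0),
    so distributions may differ between biological groups.
    """
    X = np.asarray(X, dtype=float)
    order = np.argsort(X, axis=0)       # sort order within each sample
    ranks = np.argsort(order, axis=0)   # rank of each entry in its sample
    Xsort = np.sort(X, axis=0)
    overall = Xsort.mean(axis=1)        # overall reference quantiles
    out = np.empty_like(X)
    for g in np.unique(groups):
        cols = np.flatnonzero(np.asarray(groups) == g)
        group_q = Xsort[:, cols].mean(axis=1)      # group-level quantiles
        target = w * overall + (1.0 - w) * group_q # smoothed target
        for c in cols:
            out[:, c] = target[ranks[:, c]]        # map values by rank
    return out
```

At `w=1` every sample is forced onto the same distribution (the global-assumption case the abstract criticizes); at `w=0` only samples within a biological group share a distribution.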

  5. [Not Available].

    PubMed

    Pacheco-Herrera, Javier Darío; Ramírez-Vélez, Robinson; Correa-Bautista, Jorge Enrique

    2016-06-30

    Objective: the aims of this study were: a) to establish reference values for muscular fitness by means of a general strength index (GSI); and b) to examine whether the GSI is associated with adiposity indicators in schoolchildren and adolescents from Bogotá, Colombia. Methods: of the 7,268 children and adolescents (9-17.9 years) assessed in the FUPRECOL study, 4,139 (57%) were girls. The GSI was computed as a marker of muscular performance by standardizing the handgrip strength (HG) and standing long jump (LJ) tests. The GSI was recoded into quartiles (Q), with Q4 representing the best GSI. Body mass index (BMI), waist circumference (WC), waist-to-height ratio (WHtR), and body fat percentage (BF%) by bioelectrical impedance were measured as adiposity markers. Results: the mean age of the participants was 12.8 ± 2.3 years. Muscular fitness tended to increase with age in boys and to remain stable, or increase slightly, in girls. The GSI was inversely related to WHtR and BF% in boys (r = -0.280 and r = -0.327, respectively; p < 0.01). Schoolchildren in Q4 of the GSI showed lower values of the adiposity markers BMI, WC, WHtR and BF% (p < 0.01) than their Q1 counterparts. Conclusion: reference values for the GSI are presented, based on the standardized results of the HG and LJ tests. Assessing muscular strength at early ages will allow the implementation of programs to prevent future cardiovascular and metabolic risk.

  6. Cortical Thinning in Network-Associated Regions in Cognitively Normal and Below-Normal Range Schizophrenia

    PubMed Central

    Pinnock, Farena; Parlar, Melissa; Hawco, Colin; Hanford, Lindsay; Hall, Geoffrey B.

    2017-01-01

    This study assessed whether cortical thickness across the brain and regionally in terms of the default mode, salience, and central executive networks differentiates schizophrenia patients and healthy controls with normal range or below-normal range cognitive performance. Cognitive normality was defined using the MATRICS Consensus Cognitive Battery (MCCB) composite score (T = 50 ± 10) and structural magnetic resonance imaging was used to generate cortical thickness data. Whole brain analysis revealed that cognitively normal range controls (n = 39) had greater cortical thickness than both cognitively normal (n = 17) and below-normal range (n = 49) patients. Cognitively normal controls also demonstrated greater thickness than patients in regions associated with the default mode and salience, but not central executive networks. No differences on any thickness measure were found between cognitively normal range and below-normal range controls (n = 24) or between cognitively normal and below-normal range patients. In addition, structural covariance between network regions was high and similar across subgroups. Positive and negative symptom severity did not correlate with thickness values. Cortical thinning across the brain and regionally in relation to the default and salience networks may index shared aspects of the psychotic psychopathology that defines schizophrenia with no relation to cognitive impairment. PMID:28348889

  7. A protocol for eliciting nonmaterial values through a cultural ecosystem services frame

    PubMed Central

    Gould, Rachelle K; Klain, Sarah C; Ardoin, Nicole M; Satterfield, Terre; Woodside, Ulalia; Hannahs, Neil; Daily, Gretchen C; Chan, Kai M

    2015-01-01

    Cultural Ecosystem Services. Abstract: Stakeholders' nonmaterial desires, needs, and values frequently influence the success of conservation projects. These considerations are difficult to articulate and characterize, which results in limited understanding in management and policy. We devised an interview protocol designed to improve understanding of cultural ecosystem services (CES). The protocol begins with a discussion of ecosystem-related activities (e.g., recreation, hunting) and management, then turns to CES, prompting for values embodying concepts identified in the Millennium Ecosystem Assessment (2005) and explored in other CES research. We piloted the protocol in Hawaii and British Columbia. At each site we interviewed 30 individuals of diverse backgrounds. We analyzed results from the two sites to determine the effectiveness of the interview protocol in eliciting nonmaterial values. The qualitative and spatial components of the protocol helped us characterize cultural, social, and ethical values associated with the ecosystem in multiple ways. Maps and situational, or vignette-style, questions helped respondents articulate values that are difficult to discuss. Open-ended questions allowed respondents to express a diversity of environmental values and proved flexible enough for respondents to communicate values the protocol did not explicitly target. Finally, the results suggest that certain values, those mentioned frequently in interviews, are particularly salient for particular populations. The protocol can provide efficient, contextual, and place-based data on the importance of particular environmental attributes for human well-being. The qualitative data are complementary to quantitative assessments and

  8. [Not Available].

    PubMed

    Zubiaga, Lorea; Ruiz-Tovar, Jaime; Giner, Lorena; González, Juan; Aguilar, María Del Mar; García, Alejandro; Calpena, Rafael; Durán, Manuel

    2016-07-19

    Introduction and objective: BMI can be misleading for certain body builds, so other parameters have been proposed as predictors of cardiovascular risk, such as adiposity (estimated with the CUN-BAE equation), the Framingham cardiovascular risk index (FI), and the atherogenic index (AI) (total cholesterol/HDL-c ratio). We propose to compare these factors as markers of therapeutic success after surgery in obese patients undergoing laparoscopic sleeve gastrectomy (LSG) as a bariatric procedure. Material and methods: we performed a prospective observational study of patients undergoing LSG with a minimum follow-up of 1 year. We analyzed the evolution of BMI, adiposity, FI, and AI. Results: 140 patients were analyzed. Preoperative BMI was 49.1 kg/m2, with 54.8% adiposity, an FI of 7.54%, and an AI of 4.2. At 12 months, BMI was 28.4 kg/m2, with 39.4% adiposity, an FI of 3.7%, and an AI of 1.64. Based on these results, at 12 months the mean BMI was in the overweight range, adiposity remained at obesity levels (obesity: > 25% in men and > 35% in women), the FI was in the low cardiovascular-risk range (< 5%), and the AI was within the normal range (< 3). Correlating these parameters, BMI correlated with adiposity in preoperative values (Pearson 0.486; p = 0.004), postoperative values (Pearson 0.957; p < 0.001), and the difference between the two (Pearson 0.606; p = 0.017), which is expected because BMI enters the CUN-BAE equation used to compute adiposity. In postoperative values, adiposity correlated with the FI (Pearson 0.814; p = 0.036) and with the AI (Pearson 0.517; p = 0.049). These correlations were not observed in preoperative values. BMI did not correlate with the risk indices. Conclusion: adiposity correlates with cardiovascular risk indices, such as the

  9. Multivariate normality

    NASA Technical Reports Server (NTRS)

    Crutcher, H. L.; Falls, L. W.

    1976-01-01

    Sets of experimentally determined or routinely observed data provide information about the past, present and, hopefully, future sets of similarly produced data. An infinite set of statistical models exists which may be used to describe the data sets. The normal distribution is one model. If it serves at all, it serves well. If a data set, or a transformation of the set, representative of a larger population can be described by the normal distribution, then valid statistical inferences can be drawn. There are several tests which may be applied to a data set to determine whether the univariate normal model adequately describes the set. The chi-square test based on Pearson's work in the late nineteenth and early twentieth centuries is often used. Like all tests, it has some weaknesses which are discussed in elementary texts. Extension of the chi-square test to the multivariate normal model is provided. Tables and graphs permit easier application of the test in the higher dimensions. Several examples, using recorded data, illustrate the procedures. Tests of maximum absolute differences, mean sum of squares of residuals, runs and changes of sign are included in these tests. Dimensions one through five with selected sample sizes 11 to 101 are used to illustrate the statistical tests developed.
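The univariate chi-square test described above is straightforward to sketch. The following is a textbook illustration, not the paper's tabulated multivariate extension: bin the data into classes that are equiprobable under the fitted normal, compare observed and expected counts, and drop two extra degrees of freedom for the two estimated parameters:

```python
import numpy as np
from scipy import stats

def chi_square_normality(x, bins=10):
    """Pearson chi-square goodness-of-fit test of a univariate normal model.

    Bins are equiprobable under N(mean, std) fitted to the data, so each
    bin's expected count is n/bins; ddof=2 accounts for the estimated
    mean and standard deviation.
    """
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    edges = stats.norm.ppf(np.linspace(0.0, 1.0, bins + 1), loc=mu, scale=sigma)
    edges[0], edges[-1] = x.min() - 1.0, x.max() + 1.0  # make outer bins finite
    observed, _ = np.histogram(x, bins=edges)
    expected = np.full(bins, x.size / bins)
    chi2, p = stats.chisquare(observed, expected, ddof=2)
    return float(chi2), float(p)
```

A small p-value indicates that the normal model does not adequately describe the data set; as the abstract notes, a transformation of the data may still be well described by it.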

  10. Microbial utilization of lignin: available biotechnologies for its degradation and valorization.

    PubMed

    Palazzolo, Martín A; Kurina-Sanz, Marcela

    2016-10-01

    Lignocellulosic biomasses, whether from non-edible plants or from agricultural residues, store biomacromolecules that can be processed to produce both energy and bioproducts. They are therefore major candidates to replace petroleum as the main source of energy. However, to shift the fossil-based economy to a bio-based one, it is imperative to develop robust biotechnologies to efficiently convert lignocellulosic streams into power and platform chemicals. Although most biomass processing facilities use celluloses and hemicelluloses to produce bioethanol and paper, no consolidated bioprocess is currently available to produce valuable compounds from lignin at industrial scale. Usually, lignin is burned to provide heat, or it remains as a by-product in different streams, raising environmental concerns. In this way, the biorefinery concept is not carried through to completion. Because nature offers an arsenal of biotechnological tools, through microorganisms, to accomplish lignin valorization or degradation, an increasing number of projects addressing these tasks has been described recently. In this review, outstanding reports from the last 6 years are described, comprising the microbial utilization of lignin to produce a variety of valuable compounds as well as to diminish its ecological impact. Furthermore, perspectives on these topics are given.

  11. Normal Coagulation

    DTIC Science & Technology

    2014-09-04

    [OCR fragments only: Chapter 34, "Normal Coagulation" (Section 7, Bleeding and Clotting), including Table 34-1 (procoagulant factors), Figure 34-4 (vitamin K-dependent components), clotting with vitamin K antagonists, and a "confidential until formal publication" notice.]

  12. [Reference levels of osteocalcin in a healthy Mexican population].

    PubMed

    Nieto-Flores, Jesús; Villafán-Bernal, José Rafael; Rivera-León, Edgar Alfonso; Llamas-Covarrubias, Iris Monserrat; González-Hita, Mercedes Elvira; Alcalá-Zermeno, Juan Luis; Sánchez-Enríquez, Sergio

    2018-01-01

    Osteocalcin has been shown to have an inverse relationship with blood glucose, insulin resistance, and adiposity. Objective: to determine the normal serum concentration of osteocalcin in healthy Mexican adults and compare it with values reported for other populations. Carboxylated and undercarboxylated osteocalcin serum concentrations were determined in 100 healthy adults by enzyme immunoassay, and total osteocalcin concentration was calculated. A descriptive comparison was made with other populations' values reported in the literature. Median carboxylated and undercarboxylated osteocalcin concentrations were 3.22 ng/mL and 1.61 ng/mL, respectively; mean total osteocalcin was 7.40 ± 5.11 ng/mL. There was no significant difference between total osteocalcin values in our population and those of populations measured with quantification methods similar to ours. Subtle variations between populations are attributable to genetic and population factors; however, the quantification method was the only factor shown to significantly influence osteocalcin levels in healthy populations.

  13. Omega-3 fatty acid deficiency selectively up-regulates delta6-desaturase expression and activity indices in rat liver: prevention by normalization of omega-3 fatty acid status.

    PubMed

    Hofacer, Rylon; Jandacek, Ronald; Rider, Therese; Tso, Patrick; Magrisso, I Jack; Benoit, Stephen C; McNamara, Robert K

    2011-09-01

    This study investigated the effects of perinatal dietary omega-3 (n-3) fatty acid depletion and subsequent repletion on the expression of genes that regulate long-chain (LC) polyunsaturated fatty acid biosynthesis in rat liver and brain. It was hypothesized that chronic n-3 fatty acid deficiency would increase liver Fads1 and Fads2 messenger RNA (mRNA) expression/activity and that n-3 fatty acid repletion would normalize this response. Adult rats fed the n-3-free diet during perinatal development exhibited significantly lower erythrocyte, liver, and frontal cortex LCn-3 fatty acid composition and reciprocal elevations in LC omega-6 (n-6) fatty acid composition compared with controls (CONs) and repleted rats. Liver Fads2, but not Fads1, Elovl2, or Elovl5, mRNA expression was significantly greater in n-3-deficient (DEF) rats compared with CONs and was partially normalized in repleted rats. The liver 18:3n-6/18:2n-6 ratio, an index of delta6-desaturase activity, was significantly greater in DEF rats compared with CON and repleted rats and was positively correlated with Fads2 mRNA expression among all rats. The liver 18:3n-6/18:2n-6 ratio, but not Fads2 mRNA expression, was also positively correlated with erythrocyte and frontal cortex LCn-6 fatty acid compositions. Neither Fads1 nor Fads2 mRNA expression was altered in brain cortex of DEF rats. These results confirm previous findings that liver, but not brain, delta6-desaturase expression and activity indices are negatively regulated by dietary n-3 fatty acids. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. New spatial upscaling methods for multi-point measurements: From normal to p-normal

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Li, Xin

    2017-12-01

    Careful attention must be given to determining whether the geophysical variables of interest are normally distributed, since the assumption of a normal distribution may not accurately reflect the probability distribution of some variables. As a generalization of the normal distribution, the p-normal distribution and its corresponding maximum likelihood estimation (the least power estimation, LPE) were introduced in upscaling methods for multi-point measurements. Six methods, including three normal-based methods, i.e., arithmetic average, least square estimation, block kriging, and three p-normal-based methods, i.e., LPE, geostatistics LPE and inverse distance weighted LPE are compared in two types of experiments: a synthetic experiment to evaluate the performance of the upscaling methods in terms of accuracy, stability and robustness, and a real-world experiment to produce real-world upscaling estimates using soil moisture data obtained from multi-scale observations. The results show that the p-normal-based methods produced lower mean absolute errors and outperformed the other techniques due to their universality and robustness. We conclude that introducing appropriate statistical parameters into an upscaling strategy can substantially improve the estimation, especially if the raw measurements are disorganized; however, further investigation is required to determine which parameter is the most effective among variance, spatial correlation information and parameter p.
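The least power estimation (LPE) at the core of the p-normal methods can be illustrated for a single location parameter. A minimal sketch, assuming a one-dimensional sample and a given shape parameter p:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lpe_location(x, p):
    """Least power estimation (LPE) of a location parameter (sketch).

    Under a p-normal (generalized normal) model, the maximum-likelihood
    location estimate minimizes sum(|x_i - m| ** p).  p = 2 recovers the
    arithmetic mean; p = 1 recovers a median, which is why smaller p is
    more robust to outliers in disorganized measurements.
    """
    x = np.asarray(x, dtype=float)
    res = minimize_scalar(lambda m: np.sum(np.abs(x - m) ** p),
                          bounds=(float(x.min()), float(x.max())),
                          method="bounded")
    return float(res.x)
```

With an outlier-laden sample such as [1, 2, 3, 10], p = 2 is pulled toward the outlier while p = 1 stays near the bulk of the data, mirroring the robustness the abstract reports for the p-normal-based upscaling methods.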

  15. ConsPred: a rule-based (re-)annotation framework for prokaryotic genomes.

    PubMed

    Weinmaier, Thomas; Platzer, Alexander; Frank, Jeroen; Hellinger, Hans-Jörg; Tischler, Patrick; Rattei, Thomas

    2016-11-01

    The rapidly growing number of available prokaryotic genome sequences requires fully automated, high-quality software solutions for their initial annotation and re-annotation. Here we present ConsPred, a prokaryotic genome annotation framework that performs intrinsic gene predictions, homology searches, and predictions of non-coding genes as well as CRISPR repeats, and integrates all evidence into a consensus annotation. ConsPred achieves comprehensive, high-quality annotations based on rules and priorities, similar to decision-making in manual curation, and avoids conflicting predictions. Parameters controlling the annotation process are configurable by the user. ConsPred has been used in the authors' institutions for more than 5 years and can easily be extended and adapted to specific needs. The ConsPred algorithm for producing a consensus from the varying scores of multiple gene prediction programs approaches manual curation in accuracy. Its rule-based approach for choosing final predictions avoids overriding previous manual curations. ConsPred is implemented in Java, Perl and Shell and is freely available under the Creative Commons license as a stand-alone in-house pipeline or as an Amazon Machine Image for cloud computing; see https://sourceforge.net/projects/conspred/. Contact: thomas.rattei@univie.ac.at. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. Deconstructing Normalisation: Clearing the Way for Inclusion.

    ERIC Educational Resources Information Center

    Culham, Andrew; Nind, Melanie

    2003-01-01

    This paper considers two major movements affecting the lives of people with intellectual disabilities: normalization and inclusion. It reviews the aims, processes, and outcomes of the normalization and social role valorization movement and explores its compatibility with inclusion. Lessons from normalization are applied to the inclusion movement.…

  17. Visual Memories Bypass Normalization.

    PubMed

    Bloem, Ilona M; Watanabe, Yurika L; Kibbe, Melissa M; Ling, Sam

    2018-05-01

    How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores: neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation.
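The "canonical visual computation" invoked here is divisive normalization, usually written in the Carandini-Heeger form. A minimal sketch of that standard model (the specific drive values below are illustrative, not the study's stimuli):

```python
import numpy as np

def divisive_normalization(drives, sigma=1.0, n=2.0):
    """Divisive normalization sketch (standard Carandini-Heeger form).

    Each unit's response is its powered input drive divided by a
    semi-saturation constant plus the summed powered drives of the
    normalization pool, so adding a second stimulus suppresses the
    response to the first -- the suppression probed in this study.
    """
    d = np.asarray(drives, dtype=float) ** n
    return d / (sigma ** n + d.sum())

r_alone = divisive_normalization([1.0])[0]        # 1 / (1 + 1) = 0.5
r_paired = divisive_normalization([1.0, 1.0])[0]  # 1 / (1 + 2) ~ 0.33
```

The study's finding is that this paired-stimulus suppression appears between perceptual representations but not between items held in working memory.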

  18. Group normalization for genomic data.

    PubMed

    Ghandi, Mahmoud; Beer, Michael A

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.
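The reference-group idea behind GN can be sketched as follows. This is a hypothetical simplification for illustration only; the published algorithm differs in detail (it uses sets of control experiments to match probe responses):

```python
import numpy as np

def group_normalize(signal, reference, k=5):
    """Hypothetical sketch of reference-group normalization.

    For each probe, take the k probes whose reference-channel signal is
    most similar and divide the probe's observed signal by the mean
    signal of that matched group.  Probes behaving like their matched
    group come out near 1; enriched probes come out above 1.  This
    corrects local probe effects without assuming treatment and control
    share one global distribution.
    """
    signal = np.asarray(signal, dtype=float)
    reference = np.asarray(reference, dtype=float)
    out = np.empty_like(signal)
    for i in range(signal.size):
        idx = np.argsort(np.abs(reference - reference[i]))[:k]  # matched group
        out[i] = signal[i] / signal[idx].mean()
    return out
```

Because each probe is compared only against probes with a similar reference response, the correction can follow nonlinear probe effects that a single global transform would miss.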

  20. InterCon Travel Health: Case B

    ERIC Educational Resources Information Center

    Truman, Gregory E.; Pachamanova, Dessislava A.; Goldstein, Michael A.

    2010-01-01

    InterCon provides services to health insurers of foreign tourists who travel to the United States and Canada. Management wants to implement a new information system that will deal with several operational problems, but it is having difficulty securing the capital resources to fund the system's development. After an initial failure, the chief…

  1. Impact of the ConRed program on different cyberbullying roles.

    PubMed

    Del Rey, Rosario; Casas, José A; Ortega, Rosario

    2016-01-01

    This article presents results from an evaluation of the ConRed cyberbullying intervention program. The program's impacts were separately determined for the different roles within cyberbullying that students can take, i.e., cyber-victims, cyber-bullies, cyber-bully/victims, and bystanders. The ConRed program is a theory-driven program designed to prevent cyberbullying and improve cyberbullying coping skills. It involves students, teachers, and families. During a 3-month period, external experts conducted eight training sessions with students, two with teachers and one with families. ConRed was evaluated through a quasi-experimental design, in which students from three secondary schools were separated into experimental and control groups. The sample comprised 875 students, aged between 11 and 19 years. More students (n = 586) were allocated to the experimental groups at the specific insistence of the management of all schools; the remainder (n = 289) formed the control. Repeated measures MANOVA showed that cyber victims, cyber aggressors and cyberbully/victims reduced their involvement in cyberbullying. Moreover, cyber-victims and bystanders adjusted their perceptions about their control of personal information on the Internet, and cyber aggressors and bystanders reduced their Internet dependence. The ConRed program had stronger effects on male participants, especially in heightening their affective empathy. © 2015 Wiley Periodicals, Inc.

  3. [The effect of panretinal laser photocoagulation on diabetic macular edema with the Pascal® photocoagulator versus the conventional argon laser].

    PubMed

    Mahgoub, Mohamed M; Macky, Tamer A

    2017-07-11

    Objective: The aim of this study was to compare the effect of panretinal photocoagulation (PRP) on diabetic macular edema (DME) in patients with proliferative diabetic retinopathy (PDR) using the Pascal® photocoagulator (PP) versus a conventional argon laser photocoagulator (CALP). Methods: Eighty eyes with PDR and DME with central macular involvement were randomized to PP or CALP. Both groups underwent a baseline assessment of best-corrected visual acuity (BCVA) and were examined with optical coherence tomography and fluorescein angiography. Results: The mean number of laser shots in the PP and CALP groups was 1,726.10 and 752.00 in session 1, and 1,589.00 and 830.00 (p < 0.001) in session 2, respectively. Mean central foveal thickness (CFT) at baseline was 306 ± 100 and 314 ± 98 in the PP and CALP groups, respectively. At 8 weeks, mean CFT was 332 ± 116 and 347 ± 111 in the PP and CALP groups, respectively (p > 0.05). Mean BCVA was similar throughout the study period, with no significant difference between the groups (p > 0.05). Conclusions: PP and CALP showed similar effects on DME in eyes with PDR and were equally safe, with no significant increase in CFT. © 2017 S. Karger AG, Basel.

  4. Meeting Report: Breath Biomarkers Networking Sessions at PittCon 2010, Orlando, Florida

    EPA Science Inventory

    The Pittsburgh Conference and Exposition, or "PittCon" (www.pittcon.org/), is one of the largest international conferences for analytical chemistry and instrumentation typically attracting about 25,000 attendees and 1,000 commercial exhibitors. PittCon began in 1950 as a small sp...

  5. Is this the right normalization? A diagnostic tool for ChIP-seq normalization.

    PubMed

    Angelini, Claudia; Heller, Ruth; Volkinshtein, Rita; Yekutieli, Daniel

    2015-05-09

    ChIP-seq experiments are becoming a standard approach for genome-wide profiling of protein-DNA interactions, such as detecting transcription factor binding sites, histone modification marks, and RNA Polymerase II occupancy. However, when comparing a ChIP sample versus a control sample, such as Input DNA, normalization procedures have to be applied in order to remove experimental sources of bias. Despite the substantial impact that the choice of normalization method can have on the results of a ChIP-seq data analysis, its assessment is not fully explored in the literature. In particular, there are no diagnostic tools that show whether the applied normalization is indeed appropriate for the data being analyzed. In this work we propose a novel diagnostic tool to examine the appropriateness of the estimated normalization procedure. By plotting the empirical densities of log relative risks in bins of equal read count, along with the estimated normalization constant after logarithmic transformation, the researcher is able to assess the appropriateness of the estimated normalization constant. We use the diagnostic plot to evaluate the appropriateness of the estimates obtained by CisGenome, NCIS, and CCAT on several real data examples. Moreover, we show the impact that the choice of the normalization constant can have on standard peak-calling tools such as MACS or SICER. Finally, we propose a novel procedure for controlling the FDR using sample swapping. This procedure makes use of the estimated normalization constant in order to gain power over the naive choice of constant (used in MACS and SICER), which is the ratio of the total number of reads in the ChIP and Input samples. Linear normalization approaches aim to estimate a scale factor, r, to adjust for different sequencing depths when comparing ChIP versus Input samples. The estimated scaling factor can easily be incorporated in many peak caller algorithms to improve the accuracy of the peak identification.
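The naive normalization constant mentioned above (the ratio of total ChIP to Input reads, as used by MACS and SICER) is easy to state in code. The background-based variant below is an illustrative sketch only, in the spirit of methods such as NCIS rather than their exact algorithm; treating low-count bins as "background" is an assumption of this sketch.

```python
import numpy as np

def naive_scale_factor(chip_counts, input_counts):
    """Naive normalization constant: ratio of total ChIP to Input reads."""
    return chip_counts.sum() / input_counts.sum()

def background_scale_factor(chip_counts, input_counts, q=0.5):
    """Sketch of a background-based estimate: compute the ratio only over
    low-count bins, where ChIP and Input are assumed to differ only by
    sequencing depth (enriched regions are excluded)."""
    total = chip_counts + input_counts
    background = total <= np.quantile(total, q)  # assumed background bins
    return chip_counts[background].sum() / input_counts[background].sum()
```

On data containing enriched regions, the naive ratio overestimates the depth ratio because enrichment inflates the ChIP total, while an estimate restricted to background bins stays close to the true value.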

  6. ConA-based glucose sensing using the long-lifetime azadioxatriangulenium fluorophore

    NASA Astrophysics Data System (ADS)

    Cummins, Brian; Simpson, Jonathan; Gryczynski, Zygmunt; Sørensen, Thomas Just; Laursen, Bo W.; Graham, Duncan; Birch, David; Coté, Gerard

    2014-02-01

    Fluorescent glucose sensing technologies have been identified as possible alternatives to current continuous glucose monitoring approaches. We have recently introduced a new, smart fluorescent ligand to overcome the traditional problems of ConA-based glucose sensors. For this assay to be translated into a continuous glucose monitoring device where both components are free in solution, the molecular weight of the smart fluorescent ligand must be increased. We have identified ovalbumin as a naturally occurring glycoprotein that could serve as the core component of a 2nd-generation smart fluorescent ligand. It has a single asparagine residue capable of displaying an N-linked glycan, and an isoelectric point similar to that of ConA. Thus, binding between ConA and ovalbumin can potentially be monovalent and sugar specific. This work is a preliminary implementation of fluorescently labeled ovalbumin in the ConA-based assay. We conjugate the red-emitting, long-lifetime azadioxatriangulenium (ADOTA+) dye to ovalbumin, as ADOTA has many properties advantageous for tracking the equilibrium binding of the assay. The ADOTA-labeled ovalbumin is paired with Alexa Fluor 647-labeled ConA to create a Förster Resonance Energy Transfer (FRET) assay that is glucose dependent. The assay responds across the physiologically relevant glucose range (0-500 mg/dL) with increasing intensity from the ADOTA-ovalbumin, showing that the strategy may allow for the translation of the smart fluorescent ligand concept into a continuous glucose monitoring device.

  7. ConTour: Data-Driven Exploration of Multi-Relational Datasets for Drug Discovery.

    PubMed

    Partl, Christian; Lex, Alexander; Streit, Marc; Strobelt, Hendrik; Wassermann, Anne-Mai; Pfister, Hanspeter; Schmalstieg, Dieter

    2014-12-01

    Large scale data analysis is nowadays a crucial part of drug discovery. Biologists and chemists need to quickly explore and evaluate potentially effective yet safe compounds based on many datasets that are in relationship with each other. However, there is a lack of tools that support them in these processes. To remedy this, we developed ConTour, an interactive visual analytics technique that enables the exploration of these complex, multi-relational datasets. At its core ConTour lists all items of each dataset in a column. Relationships between the columns are revealed through interaction: selecting one or multiple items in one column highlights and re-sorts the items in other columns. Filters based on relationships enable drilling down into the large data space. To identify interesting items in the first place, ConTour employs advanced sorting strategies, including strategies based on connectivity strength and uniqueness, as well as sorting based on item attributes. ConTour also introduces interactive nesting of columns, a powerful method to show the related items of a child column for each item in the parent column. Within the columns, ConTour shows rich attribute data about the items as well as information about the connection strengths to other datasets. Finally, ConTour provides a number of detail views, which can show items from multiple datasets and their associated data at the same time. We demonstrate the utility of our system in case studies conducted with a team of chemical biologists, who investigate the effects of chemical compounds on cells and need to understand the underlying mechanisms.

  8. SLUDGE PARTICLE SEPARATION EFFICIENCIES DURING SETTLER TANK RETRIEVAL INTO SCS-CON-230

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DEARING JI; EPSTEIN M; PLYS MG

    2009-07-16

    The purpose of this document is to release, into the Hanford Document Control System, FAI/09-91, Sludge Particle Separation Efficiencies for the Rectangular SCS-CON-230 Container, by M. Epstein and M. G. Plys, Fauske & Associates, LLC, June 2009. The Sludge Treatment Project (STP) will retrieve sludge from the 105-K West Integrated Water Treatment System (IWTS) Settler Tanks and transfer it to container SCS-CON-230 using the Settler Tank Retrieval System (STRS). The sludge will enter the container through two distributors. The container will have a filtration system that is designed to minimize the overflow of sludge fines from the container to the basin. FAI/09-91 was performed to quantify the effect of the STRS on sludge distribution inside of, and overflow out of, SCS-CON-230. Selected results of the analysis and a system description are discussed. The principal result of the analysis is that the STRS filtration system reduces the overflow of sludge from SCS-CON-230 to the basin by roughly a factor of 10. Some turbidity can be expected in the center bay where the container is located. The exact amount of overflow and subsequent turbidity depends on the density of the sludge (which will vary with location in the Settler Tanks) and the thermal gradient between SCS-CON-230 and the basin. Attachment A presents the full analytical results. These results are applicable specifically to SCS-CON-230 and the STRS filtration system's expected operating duty cycles.

  9. [Not Available].

    PubMed

    De Arriba Muñoz, Antonio; López Úbeda, Marta; Rueda Caballero, Carmen; Labarta Aizpún, José Ignacio; Ferrández Longás, Ángel

    2016-07-19

    Introduction: Diagnosing and treating obesity has become one of the major challenges of the 21st century, owing to its rising prevalence. Objectives: To determine normal values of waist circumference (WC) and body mass index (BMI) by age and sex in a healthy Spanish population. Methods: Longitudinal observational study conducted between 1980 and 2014. A total of 165 newborn boys and 169 newborn girls were included, with data collected annually up to age 18 (74 males and 92 females) and again at age 28 (42 males and 45 females). Weight, length/height, and waist circumference were measured. Percentiles (P3, P10, P25, P50, P75, P90, P97) of BMI and WC were calculated by age and sex. Results: Longitudinal BMI and WC data throughout childhood are presented, highlighting the increase between ages 18 and 28 in the percentiles above P50, especially in women. For WC there is a positive correlation between the value at age 3 and the values at ages 18 and 28, in both males (r = 0.722 and r = 0.605, p = 0.000, respectively) and females (r = 0.922 and r = 0.857, p = 0.000, respectively), and between ages 18 and 28 (r = 0.731, p = 0.000 for males and r = 0.961, p = 0.000 for females). Conclusion: Normal values of WC and BMI by age and sex are presented, which can be used as a reference tool for identifying individuals at risk of developing cardiovascular disease or diabetes.

  10. Virulence, Speciation and Antibiotic Susceptibility of Ocular Coagulase Negative Staphylococci (CoNS)

    PubMed Central

    Priya, Ravindran; Mythili, Arumugam; Singh, Yendremban Randhir Babu; Sreekumar, Haridas; Manikandan, Palanisamy; Panneerselvam, Kanesan

    2014-01-01

    Background: Coagulase negative Staphylococci (CoNS) are common inhabitants of human skin and mucous membranes. With the emergence of these organisms as prominent pathogens in patients with ocular infections, investigation has intensified in an effort to identify important virulence factors and to inform new approaches to treatment and prevention. Aim: To isolate CoNS from ocular specimens, study their possible virulence factors, identify the isolates to species level, and test their antibiotic susceptibility. Materials and Methods: The specimens were collected from patients who attended the Microbiology Laboratory of a tertiary care eye hospital in Coimbatore, Tamil Nadu state, India. The isolates were subjected to tube and slide coagulase tests for the identification of CoNS. All the isolates were screened for lipase and protease activities, and for other virulence factors, viz. slime production on Congo red agar medium and haemagglutination assayed in 96-well microtitre plates. The isolates were identified up to species level by performing biochemical tests such as the phosphatase test, arginine test, maltose and trehalose fermentation tests, and novobiocin sensitivity test. The isolates were subjected to antibiotic susceptibility studies based on the revised standards of the Clinical and Laboratory Standards Institute (CLSI). Results: During the one year of study, among the total of 260 individuals who were screened, 100 isolates of CoNS were obtained. Lipolytic activity was seen in all the isolates, whereas 38 isolates showed a positive result for protease. A total of 63 isolates showed slime production. Of the 100 isolates, 30 were analyzed for haemagglutination, of which 4 showed the capacity to agglutinate erythrocytes. The biochemical analysis revealed that, of the 100 CoNS isolates, 43% were Staphylococcus epidermidis.

  11. Valorization of Flue Gas by Combining Photocatalytic Gas Pretreatment with Microalgae Production.

    PubMed

    Eynde, Erik Van; Lenaerts, Britt; Tytgat, Tom; Blust, Ronny; Lenaerts, Silvia

    2016-03-01

    Utilization of flue gas for algae cultivation seems to be a promising route because flue gas from fossil-fuel combustion processes contains the high amounts of carbon (CO2) and nitrogen (NO) that are required for algae growth. NO is a poor nitrogen source for algae cultivation because of its low reactivity and solubility in water and its toxicity for algae at high concentrations. Here, we present a novel strategy to valorize NO from flue gas as feedstock for algae production by combining a photocatalytic gas pretreatment unit with a microalgal photobioreactor. The photocatalytic air pretreatment transforms NO gas into NO2 gas and thereby enhances the absorption of NOx in the cultivation broth. The absorbed NOx forms NO2(-) and NO3(-), which can be used as a nitrogen source by algae. The effect of photocatalytic air pretreatment on the growth and biomass productivity of the alga Thalassiosira weissflogii in a semicontinuous system aerated with a model flue gas (1% CO2 and 50 ppm of NO) is investigated during a long-term experiment. The integrated system makes it possible to produce algae with NO from flue gas as the sole nitrogen source and reduces the NOx content in the exhaust gas by 84%.

  12. Statokinesigram normalization method.

    PubMed

    de Oliveira, José Magalhães

    2017-02-01

    Stabilometry is a technique that aims to study the body sway of human subjects, employing a force platform. The signal obtained from this technique refers to the position of the foot base ground-reaction vector, known as the center of pressure (CoP). The parameters calculated from the signal are used to quantify the displacement of the CoP over time; there is a large variability, both between and within subjects, which prevents the definition of normative values. The intersubject variability is related to differences between subjects in terms of their anthropometry, in conjunction with their muscle activation patterns (biomechanics); and the intrasubject variability can be caused by a learning effect or fatigue. Age and foot placement on the platform are also known to influence variability. Normalization is the main method used to decrease this variability and to bring distributions of adjusted values into alignment. In 1996, O'Malley proposed three normalization techniques to eliminate the effect of age and anthropometric factors from temporal-distance parameters of gait. These techniques were adopted to normalize the stabilometric signal by some authors. This paper proposes a new method of normalization of stabilometric signals to be applied in balance studies. The method was applied to a data set collected in a previous study, and the results of normalized and nonnormalized signals were compared. The results showed that the new method, if used in a well-designed experiment, can eliminate undesirable correlations between the analyzed parameters and the subjects' characteristics and show only the experimental conditions' effects.
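As one concrete example of the regression-style normalization the abstract alludes to (a hypothetical sketch, not the specific method proposed in this paper or by O'Malley): a stabilometric parameter can be divided by its value predicted from a subject covariate, removing the linear dependence on that covariate.

```python
import numpy as np

def detrend_parameter(values, covariate):
    """Illustrative regression-based normalization: fit a linear model of
    the parameter against a subject covariate (e.g. height), then divide
    each value by its prediction so the result is dimensionless (~1) and
    no longer correlated with the covariate."""
    slope, intercept = np.polyfit(covariate, values, deg=1)
    predicted = slope * covariate + intercept
    return values / predicted
```

After this kind of detrending, group comparisons reflect experimental conditions rather than anthropometric differences, which is the stated goal of the proposed method.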

  13. Valorization of MSWI bottom ash for biogas desulfurization: Influence of biogas water content.

    PubMed

    Fontseré Obis, Marta; Germain, Patrick; Troesch, Olivier; Spillemaecker, Michel; Benbelkacem, Hassen

    2017-02-01

    In this study an alternative valorization of Municipal Solid Waste Incineration (MSWI) Bottom Ash (BA) for H2S elimination from landfill biogas was evaluated. Emphasis was given to the influence of the water content of the biogas on the H2S removal efficiency of BA. A small-scale pilot was developed and implemented at a landfill site in France. A new biogas analyzer allowed real-time continuous measurement of CH4, CO2, O2, H2S and H2O in raw and treated biogas. The H2S removal efficiency of bottom ash was evaluated for different inlet biogas humidities, from 4 to 24 g water/m3. The biogas water content was found to greatly affect bottom ash efficiency with regard to H2S removal: with humid inlet biogas, H2S removal was almost 3 times higher than with dry inlet biogas. The best removal capacity obtained was 56 g H2S/kg dry BA. A humid inlet biogas conserves the bottom ash moisture content for maximum H2S retention. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Fed-batch anaerobic valorization of slaughterhouse by-products with mesophilic microbial consortia without methane production.

    PubMed

    Pessiot, J; Nouaille, R; Jobard, M; Singhania, R R; Bournilhas, A; Christophe, G; Fontanille, P; Peyret, P; Fonty, G; Larroche, C

    2012-07-01

    This work aimed at setting up a fully instrumented, laboratory-scale bioreactor enabling anaerobic valorization of solid substrates through hydrogen and/or volatile fatty acid (VFA) production using mixed microbial populations (consortia). The substrate used was made of meat-based wastes, especially from slaughterhouses, which are becoming available in large amounts as a consequence of the growing constraints for waste disposal from meat industry. A reconstituted microbial mesophilic consortium without Archaebacteria (methanogens), named PBr, was cultivated in a 5-L anaerobic bioreactor on slaughterhouse wastes. The experiments were carried out with sequential fed-batch operations, including liquid medium removal from the bioreactor and addition of fresh substrate. VFAs and nitrogen were the main metabolites observed, while hydrogen accumulation was very low and no methane production was evidenced. After 1,300 h of culture, yields obtained for VFAs reached 0.38 g/g dry matter. Strain composition of the microbial consortium was also characterized using molecular tools (temporal temperature gradient gel electrophoresis and gene sequencing).

  15. Compositional insights and valorization pathways for carbonaceous material deposited during bio-oil thermal treatment.

    PubMed

    Ochoa, Aitor; Aramburu, Borja; Ibáñez, María; Valle, Beatriz; Bilbao, Javier; Gayubo, Ana G; Castaño, Pedro

    2014-09-01

    This work analyses the composition, morphology, and thermal behavior of the carbonaceous materials deposited during the thermal treatment of bio-oil (thermal pyrolytic lignin, TPL). The bio-oil was obtained by flash pyrolysis of lignocellulosic biomass (pine sawdust), and the TPLs were obtained in the 400-700 °C range. The TPLs were characterized by performing elemental analysis; 13C NMR, Raman, FTIR, and X-ray photoelectron spectroscopy; SEM; and temperature-programmed oxidation analyzed by differential thermogravimetry and differential scanning calorimetry. The results are compared to a commercial lignin (CL). The TPLs have lower oxygen and hydrogen contents and a greater aromaticity and structural order than the CL material. Based on these features, different valorization routes are proposed: the TPL obtained at 500 °C is suitable for use as a fuel, and the TPL obtained at 700 °C has a suitable morphology and composition for use as an adsorbent or catalyst support. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. NOAA predicts near-normal or below-normal 2014 Atlantic hurricane season

    Science.gov Websites

    NOAA predicts a near-normal or below-normal 2014 Atlantic hurricane season, with El Niño expected to develop. The main driver of this year's outlook is the anticipated development of El Niño this summer. Related link: Atlantic Basin Hurricane Season Outlook Discussion (El Niño/Southern Oscillation, ENSO).

  17. [Presentation of the Editor of Gaceta Médica de México].

    PubMed

    Treviño-Becerra, Alejandro

    La Gaceta Médica de México (GMM) is our official organ of dissemination and reflects the values of the Academia Nacional de Medicina de México. It serves as a point of identity between Mexican physicians and the Academy's members, and it disseminates the scientific foundations of national medical practice.

  18. Normal people working in normal organizations with normal equipment: system safety and cognition in a mid-air collision.

    PubMed

    de Carvalho, Paulo Victor Rodrigues; Gomes, José Orlando; Huber, Gilbert Jacob; Vidal, Mario Cesar

    2009-05-01

    A fundamental challenge in improving the safety of complex systems is to understand how accidents emerge in normal working situations, with equipment functioning normally in normally structured organizations. We present a field study of the en route mid-air collision between a commercial carrier and an executive jet, in the clear afternoon Amazon sky in which 154 people lost their lives, that illustrates one response to this challenge. Our focus was on how and why the several safety barriers of a well structured air traffic system melted down enabling the occurrence of this tragedy, without any catastrophic component failure, and in a situation where everything was functioning normally. We identify strong consistencies and feedbacks regarding factors of system day-to-day functioning that made monitoring and awareness difficult, and the cognitive strategies that operators have developed to deal with overall system behavior. These findings emphasize the active problem-solving behavior needed in air traffic control work, and highlight how the day-to-day functioning of the system can jeopardize such behavior. An immediate consequence is that safety managers and engineers should review their traditional safety approach and accident models based on equipment failure probability, linear combinations of failures, rules and procedures, and human errors, to deal with complex patterns of coincidence possibilities, unexpected links, resonance among system functions and activities, and system cognition.

  19. Enrichment of Glycoproteins using Nano-scale Chelating Con A Monolithic Capillary Chromatography

    PubMed Central

    Feng, Shun; Yang, Na; Pennathur, Subramaniam; Goodison, Steve; Lubman, David M.

    2009-01-01

    Immobilized lectin chromatography can be employed for glycoprotein enrichment, but commonly used columns have limitations of yield and resolution. In order to improve efficiency and to make the technique applicable to minimal sample material, we have developed a nano-scale chelating Concanavalin A (Con A) monolithic capillary prepared using GMA-EDMA (glycidyl methacrylate–co-ethylene dimethacrylate) as polymeric support. Con A was immobilized on Cu(II)-charged iminodiacetic acid (IDA) regenerable sorbents by forming a IDA:Cu(II):Con A sandwich affinity structure that has high column capacity as well as stability. When compared with conventional Con A lectin chromatography, the monolithic capillary enabled the better reproducible detection of over double the number of unique N-glycoproteins in human urine samples. Utility for analysis of minimal biological samples was confirmed by the successful elucidation of glycoprotein profiles in mouse urine samples at the microliter scale. The improved efficiency of the nano-scale monolithic capillary will impact the analysis of glycoproteins in complex biological samples, especially where only limited material may be available. PMID:19366252

  20. 9 CFR 319.300 - Chili con carne.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 9, Animals and Animal Products; Food Safety and Inspection Service, Department of Agriculture; Definitions and Standards of Identity or Composition; Canned, Frozen, or Dehydrated Meat Food Products; § 319.300 Chili con carne.

  1. 9 CFR 319.300 - Chili con carne.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 9, Animals and Animal Products; Food Safety and Inspection Service, Department of Agriculture; Definitions and Standards of Identity or Composition; Canned, Frozen, or Dehydrated Meat Food Products; § 319.300 Chili con carne.

  2. Optimizing Electrospray Interfaces Using Slowly Diverging Conical Duct (ConDuct) Electrodes

    PubMed Central

    Krutchinsky, Andrew N.; Padovan, Júlio C.; Cohen, Herbert; Chait, Brian T.

    2015-01-01

    We demonstrate that the efficiency of ion transmission from atmosphere to vacuum through stainless steel electrodes that contain slowly divergent conical duct (ConDuct) channels can be close to 100%. Here, we explore the properties of 2.5 cm long electrodes with angles of divergence of 0°, 1°, 2°, 3°, 5°, 8°, 13°, and 21°, respectively. The ion transmission efficiency was observed to jump from 10–20% for the 0° (straight) channels to 90–95% for channels with an angle of divergence as small as 1°. Furthermore, the 2–3° ConDuct electrodes produced extraordinarily low divergence ion beams that propagated in a laser-like fashion over long distances in vacuum. To take advantage of these newly discovered properties, we constructed a novel atmosphere-to-vacuum ion interface utilizing a 2° ConDuct as an inlet electrode and compared its ion transmission efficiency with that of the interface used in the commercial (Thermo) Velos Orbitrap and Q Exactive mass spectrometers. We observed that the ConDuct interface transmitted up to 17 times more ions than the commercial reference interface and also yielded improved signal-to-noise mass spectra of peptides. We infer from these results that the performance of many current atmosphere-to-vacuum interfaces utilizing metal capillaries can be substantially improved by replacing them with 1° or 2° metal ConDuct electrodes, which should preserve the convenience of supplying ion desolvation energy by heating the electrode while greatly increasing the efficiency of ion transmission into the mass spectrometer. PMID:25667060

  3. Optimizing Electrospray Interfaces Using Slowly Diverging Conical Duct (ConDuct) Electrodes

    NASA Astrophysics Data System (ADS)

    Krutchinsky, Andrew N.; Padovan, Júlio C.; Cohen, Herbert; Chait, Brian T.

    2015-04-01

    We demonstrate that the efficiency of ion transmission from atmosphere to vacuum through stainless steel electrodes that contain slowly divergent conical duct (ConDuct) channels can be close to 100%. Here, we explore the properties of 2.5-cm-long electrodes with angles of divergence of 0°, 1°, 2°, 3°, 5°, 8°, 13°, and 21°, respectively. The ion transmission efficiency was observed to jump from 10-20% for the 0° (straight) channels to 90-95% for channels with an angle of divergence as small as 1°. Furthermore, the 2-3° ConDuct electrodes produced extraordinarily low divergence ion beams that propagated in a laser-like fashion over long distances in vacuum. To take advantage of these newly discovered properties, we constructed a novel atmosphere-to-vacuum ion interface utilizing a 2° ConDuct as an inlet electrode and compared its ion transmission efficiency with that of the interface used in the commercial (Thermo Fisher Scientific, San Jose, CA, USA) Velos Orbitrap and Q Exactive mass spectrometers. We observed that the ConDuct interface transmitted up to 17 times more ions than the commercial reference interface and also yielded improved signal-to-noise mass spectra of peptides. We infer from these results that the performance of many current atmosphere-to-vacuum interfaces utilizing metal capillaries can be substantially improved by replacing them with 1° or 2° metal ConDuct electrodes, which should preserve the convenience of supplying ion desolvation energy by heating the electrode while greatly increasing the efficiency of ion transmission into the mass spectrometer.

  4. Simultaneous valorization and biocatalytic upgrading of heavy vacuum gas oil by the biosurfactant-producing Pseudomonas aeruginosa AK6U.

    PubMed

    Ismail, Wael Ahmed; Mohamed, Magdy El-Said; Awadh, Maysoon N; Obuekwe, Christian; El Nayal, Ashraf M

    2017-11-01

    Heavy vacuum gas oil (HVGO) is a complex and viscous hydrocarbon stream produced as the bottom side product of the vacuum distillation units in petroleum refineries. HVGO is conventionally treated by thermochemical processes, which are costly and environmentally polluting. Here, we investigate two petroleum biotechnology applications, valorization and biocatalytic upgrading, as green approaches for treating HVGO. The Pseudomonas aeruginosa AK6U strain grew on 20% v/v HVGO as a sole carbon and sulfur source. It produced rhamnolipid biosurfactants in a growth-associated mode with a maximum crude biosurfactant yield of 10.1 g/l, which reduced the surface tension of the cell-free culture supernatant to 30.6 mN/m within 1 week of incubation. The rarely occurring dirhamnolipid Rha-Rha-C12-C12 dominated the congener profile of the biosurfactants produced from HVGO. Heavy vacuum gas oil was recovered from the cultures and abiotic controls, and the maltene fraction was extracted for further analysis. Fractional distillation (SimDist) of the biotreated maltene fraction showed a relative decrease in the high-boiling heavy fuel fraction (BP 426-565 °C) concomitant with an increase in the lighter distillate diesel fraction (BP 315-426 °C). Analysis of the maltene fraction revealed compositional changes. The number-average (Mn) and weight-average (Mw) molecular weights, as well as the absolute number of hydrocarbons and sulfur heterocycles, were higher in the biotreated maltene fraction of HVGO. These findings suggest that HVGO can potentially be exploited as a carbon-rich substrate for production of high-value biosurfactants by P. aeruginosa AK6U while concomitantly improving/upgrading its chemical composition. © 2017 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  5. SeleCon: Scalable IoT Device Selection and Control Using Hand Gestures.

    PubMed

    Alanwar, Amr; Alzantot, Moustafa; Ho, Bo-Jhang; Martin, Paul; Srivastava, Mani

    2017-04-01

    Although different interaction modalities have been proposed in the field of human-computer interfaces (HCI), only a few of these techniques have reached end users because of scalability and usability issues. Given the popularity and the growing number of IoT devices, selecting one out of many devices becomes a hurdle in a typical smart-home environment. Therefore, an easy-to-learn, scalable, and non-intrusive interaction modality has to be explored. In this paper, we propose a pointing approach to interact with devices, as pointing is arguably a natural way for device selection. We introduce SeleCon for device selection and control, which uses an ultra-wideband (UWB) equipped smartwatch. To interact with a device in our system, people can point to the device to select it and then draw a hand gesture in the air to specify a control action. To this end, SeleCon employs inertial sensors for pointing gesture detection and a UWB transceiver for identifying the selected device from ranging measurements. Furthermore, SeleCon supports an alphabet of gestures that can be used for controlling the selected devices. We performed our experiment in a 9 m by 10 m lab space with eight deployed devices. The results demonstrate that SeleCon can achieve 84.5% accuracy for device selection and 97% accuracy for hand gesture recognition. We also show that SeleCon is power efficient enough to sustain daily use by turning off the UWB transceiver when the user's wrist is stationary.

  6. Biomechanical properties of the internal limiting membrane after intravitreal treatment with ocriplasmin.

    PubMed

    Vielmuth, Franziska; Schumann, Ricarda G; Spindler, Volker; Wolf, Armin; Scheler, Renate; Mayer, Wolfgang J; Henrich, Paul B; Haritoglou, Christos

    2017-01-01

    Purpose: To evaluate the stiffness of the human internal limiting membrane (ILM) and to assess possible changes in its mechanical properties after an intravitreal injection of ocriplasmin for the treatment of vitreomacular traction. Methods: This study comprises a comparative, interventional case series of 12 surgically excised ILM specimens obtained consecutively from 9 eyes of 9 patients after unsuccessful pharmacologic vitreolysis with ocriplasmin. During the same period, 16 specimens from another 13 eyes without ocriplasmin treatment were obtained by vitrectomy and served as controls. All patients presented with macular holes or vitreomacular traction and underwent vitrectomy with ILM peeling, either with or without brilliant blue (BB) staining. All specimens were analyzed by atomic force microscopy, imaging regions of 25 × 25 μm. In all specimens, both the retinal and the vitreal side of the ILM were analyzed. Results: Atomic force microscopy revealed no significant differences in elasticity between ILM specimens excised from eyes with and without ocriplasmin treatment. Undulated areas of the retinal side showed a greater stiffness than the vitreal side of the ILM. Topographic mapping of both the vitreal and retinal sides of the ILM showed no apparent alteration of morphology in ocriplasmin-treated eyes compared with untreated eyes. Brilliant blue staining led to an increase in tissue stiffness. Conclusions: Intravitreal ocriplasmin injections do not alter the biomechanical properties of the human ILM. There is no evidence of a possible enzymatic effect interfering with the stiffness of this basement membrane. © 2017 S. Karger AG, Basel.

  7. Sympathetic nerve traffic and baroreflex function in optimal, normal, and high-normal blood pressure states.

    PubMed

    Seravalle, Gino; Lonati, Laura; Buzzi, Silvia; Cairo, Matteo; Quarti Trevano, Fosca; Dell'Oro, Raffaella; Facchetti, Rita; Mancia, Giuseppe; Grassi, Guido

    2015-07-01

    Adrenergic activation and baroreflex dysfunction are common in established essential hypertension, elderly hypertension, masked and white-coat hypertension, resistant hypertension, and obesity-related hypertension. Whether this autonomic behavior is peculiar to established hypertension or is also detectable in the earlier clinical phases of the disease, that is, the high-normal blood pressure (BP) state, is still largely undefined, however. In 24 individuals with optimal BP (age: 37.1 ± 2.1 years, mean ± SEM) and in 27 with normal BP and 38 with high-normal BP, age-matched with the optimal BP group, we measured clinic, 24-h, and beat-to-beat BP, heart rate (HR), and muscle sympathetic nerve activity (MSNA) at rest and during baroreceptor stimulation and deactivation. Measurements also included anthropometric values as well as echocardiographic parameters and the homeostasis model assessment (HOMA) index. For similar anthropometric values, clinic, 24-h ambulatory, and beat-to-beat BPs were significantly greater in normal BP than in optimal BP. This was also the case when the high-normal BP group was compared with the normal and optimal BP groups. MSNA (but not HR) was also significantly greater in high-normal BP than in normal BP and optimal BP (51.3 ± 2.0 vs. 40.3 ± 2.3 and 41.1 ± 2.6 bursts per 100 heartbeats, respectively, P < 0.01). The sympathetic activation seen in high-normal BP was coupled with an impairment of baroreflex HR control (but not of MSNA control) and with a significant increase in the HOMA index, which showed a significant direct relationship with MSNA. Thus, independently of which BP value the diagnosis is based on, high-normal BP is a condition characterized by sympathetic activation. This neurogenic alteration, which is likely to be triggered by metabolic rather than reflex alterations, might be involved, together with other factors, in the progression of the condition to established hypertension.
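    For readers unfamiliar with the HOMA index used above, the standard HOMA-IR estimate of insulin resistance (the formula is not stated in the abstract; it is the widely used Matthews definition) can be computed as:

```python
# Standard HOMA-IR formula: fasting glucose (mmol/L) x fasting insulin
# (µU/mL) / 22.5. The example values below are invented for illustration.

def homa_ir(fasting_glucose_mmol_l, fasting_insulin_uU_ml):
    """Homeostasis model assessment of insulin resistance (HOMA-IR)."""
    return (fasting_glucose_mmol_l * fasting_insulin_uU_ml) / 22.5

# Hypothetical subject: glucose 5.0 mmol/L, insulin 9.0 µU/mL
example = homa_ir(5.0, 9.0)
```

    Higher values indicate greater insulin resistance; the study above reports that this index rose with MSNA in the high-normal BP group.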

  8. Upper-normal waist circumference is a risk marker for metabolic syndrome in normal-weight subjects.

    PubMed

    Okada, R; Yasuda, Y; Tsushita, K; Wakai, K; Hamajima, N; Matsuo, S

    2016-01-01

    To elucidate the implications of upper-normal waist circumference (WC), we examined whether the upper end of the normal WC range still represents a risk of metabolic syndrome (MetS) or of non-adipose MetS components among normal-weight subjects. A total of 173,510 persons (100,386 men and 73,124 women) with normal WC (<90/80 cm in men/women) and a body mass index (BMI) of 18.5-24.9 were included. Subjects were categorized as having low, moderate, or upper-normal WC for WC < 80, 80-84, and 85-89 cm in men and <70, 70-74, and 75-79 cm in women, respectively. The prevalence of all non-adipose MetS components (e.g. prediabetes and borderline dyslipidemia) was significantly higher in subjects with upper-normal WC compared with those with low WC. Overall, the prevalence of MetS (having three or more of the four non-adipose MetS components) gradually increased with increasing WC (12%, 21%, and 27% in men and 11%, 14%, and 19% in women for low, moderate, and upper-normal WC, respectively). Moreover, the risk of having a greater number of MetS components increased in subjects with upper-normal WC compared with those with low WC (odds ratios for one, two, three, and four MetS components: 1.29, 1.81, 2.53, and 2.47 in men and 1.16, 1.55, 1.49, and 2.20 in women, respectively). Upper-normal WC therefore represents a risk of acquiring a greater number of MetS components and the early stage of MetS components (prediabetes and borderline dyslipidemia), after adjusting for BMI, in a large general population with normal WC and BMI. Copyright © 2015 The Italian Society of Diabetology, the Italian Society for the Study of Atherosclerosis, the Italian Society of Human Nutrition, and the Department of Clinical Medicine and Surgery, Federico II University. Published by Elsevier B.V. All rights reserved.
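    The study's WC categories can be encoded directly from the cut-offs quoted above. The helper below is a sketch (the cut-offs are from the abstract; the function and label names are mine):

```python
# Categorize waist circumference per the study's sex-specific cut-offs.
# Subjects are assumed to already satisfy "normal WC" (<90/80 cm in
# men/women) and BMI 18.5-24.9, as in the study population.

def wc_category(wc_cm, sex):
    """Return 'low', 'moderate', or 'upper-normal' for a normal-range WC."""
    if sex == "male":
        low, mid, upper = 80, 85, 90    # <80 low, 80-84 moderate, 85-89 upper
    elif sex == "female":
        low, mid, upper = 70, 75, 80    # <70 low, 70-74 moderate, 75-79 upper
    else:
        raise ValueError("sex must be 'male' or 'female'")
    if wc_cm < low:
        return "low"
    if wc_cm < mid:
        return "moderate"
    if wc_cm < upper:
        return "upper-normal"
    raise ValueError("WC outside the study's normal range")
```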

  9. B cells as accessory cells in a Con A response of a T cell clone.

    PubMed

    Takeuchi, M; Kakiuchi, T; Taira, S; Nariuchi, H

    1987-12-01

    Accessory cell (AC) function of B cells was examined in the Con A response of a cloned T cell line, 22-9D, which is Thy-1+, L3T4+, Lyt-2-, H-2KbDb+, and I-Ab-. 22-9D cells produced IL-2 in the presence of Con A without the participation of AC. For the initiation of a proliferative response to Con A, the addition of spleen cells or spleen adherent cells was required. B cells as AC were unable to induce the proliferative response. In the presence of culture supernatant of spleen cells stimulated with Con A (CAS), 22-9D cells showed a proliferative response to Con A with B cell AC. The response was inhibited by a relevant monoclonal anti-I-A antibody. Although irradiated spleen cells as AC induced IL-2 receptor expression on 22-9D cells in the presence of Con A, B cells were shown to require the addition of unknown factor(s) in CAS, suggested to be different from IL-1, IL-2, IL-3, or IFN-gamma, for the induction of receptor expression on 22-9D cells.

  10. Wildland Fire Induced Heating of Dome 375 Perma-Con®

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flores, Eugene Michael

    AET-1 was tasked by ADEM with determining the temperature rise in the contents of drums stored in the Dome 375 Perma-Con® at TA-54 given a wildland fire. The wildland fire causes radiative and convective heating on the Perma-Con® exterior. The wildland fire time histories for the radiative and convective heating environment were provided to AET-1 by EES-16. If the calculated temperature rise results in a drum content temperature over 40 °C, then ADEM desires a design solution to ensure the peak temperature remains below 40 °C. An axisymmetric FE simulation was completed to determine the peak temperature of the contents of a drum stored in the Dome 375 Perma-Con® during a wildland fire event. Three wildland fire time histories for the radiative and convective heat transfer were provided by EES-16 and were inputs for the FE simulation. The maximum drum content temperature reached was found to be 110 °C when using inputs from the SiteG_2ms_4ign_wind_from_west.xlsx time history and not including the SWB in the model. Including the SWB in the model results in a peak drum content temperature of 61 °C for the SiteG_2ms_4ign_wind_from_west.xlsx inputs. EES-16 decided that by using fuel mitigation efforts, such as mowing the grass and shrubs near the Perma-Con®, they could reduce the shrub/grass fuel loading near the Perma-Con® from 1.46 kg/m² to 0.146 kg/m², and by using a less conservative fuel loading for the debris field inside the Dome 375 perimeter, reduce it from 0.58 kg/m² to 0.058 kg/m² in their model. They also greatly increased the resolution of their radiation model and tightened their model's required convergence value. Using this refined input, the maximum drum content temperature was found to be 28 °C with no SWB present in the model. Additionally, this refined input model was modified to include worst-case emissivity values for the concrete, drum, and Perma-Con® interior, along with adding a 91 second long
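    As rough intuition for why drum contents lag the fire transient, here is a toy one-dimensional explicit finite-difference conduction model. It is emphatically NOT the report's axisymmetric FE simulation; every material value below is an invented placeholder.

```python
# Toy 1D explicit finite-difference heat conduction: a slab driven on one
# face by a prescribed exterior (fire-side) temperature, insulated on the
# interior face. Illustrative only; all parameter values are invented.

def simulate_slab(t_exterior, n_nodes=10, alpha=1e-6, dx=0.01, dt=10.0,
                  t_init=20.0):
    """Return the interior-face temperature history (°C) of a 1D slab.

    t_exterior: exterior-face temperature (°C) per time step of size dt (s).
    alpha: thermal diffusivity (m^2/s); dx: node spacing (m).
    """
    r = alpha * dt / dx ** 2          # stability requires r <= 0.5
    assert r <= 0.5, "explicit scheme unstable; reduce dt or increase dx"
    T = [t_init] * n_nodes
    history = []
    for t_ext in t_exterior:
        T[0] = t_ext                  # Dirichlet fire-side boundary
        new = T[:]
        for i in range(1, n_nodes - 1):
            new[i] = T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
        new[-1] = new[-2]             # insulated interior face
        T = new
        history.append(T[-1])
    return history

# 500 steps of a constant 300 °C exterior: the interior lags well behind
hist = simulate_slab([300.0] * 500)
```

    The interior face warms slowly and stays far below the exterior temperature over the transient, which is the qualitative behavior the FE study quantifies for the actual drum and enclosure geometry.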

  11. Potential of sustainable hierarchical zeolites in the valorization of α-pinene.

    PubMed

    Nuttens, Nicolas; Verboekend, Danny; Deneyer, Aron; Van Aelst, Joost; Sels, Bert F

    2015-04-13

    In the valorization of α-pinene, which is an important biomass intermediate derived from turpentine oil, hierarchical (mesoporous) zeolites represent a superior class of catalysts. Hierarchical USY, ZSM-5, and beta zeolites have been prepared, characterized, and catalytically evaluated, with the aim of combining the highest catalytic performance with the most sustainable synthetic protocol. These zeolites are prepared by alkaline treatment in aqueous solutions of NH4OH, NaOH, diethylamine, and NaOH complemented with tetrapropylammonium bromide. The hierarchical USY zeolite is the most attractive catalyst of the tested series, and is able to combine an overall organic-free synthesis with an up to sixfold activity enhancement and comparable selectivity over the conventional USY zeolite. This superior performance relates to a threefold greater activity than that of the commercial standard, namely, H2SO4/TiO2. Correlation of the obtained benefits to the amount of solid lost during the postsynthetic modifications highlights that the highest activity gains are obtained with minor leaching. Furthermore, a highly zeolitic character, as determined by bulk XRD, is beneficial, but not crucial, in the conversion of α-pinene. The alkaline treatments not only result in a higher overall activity, but also a more functional external surface area, attaining up to four times the pinene conversions per square nanometer. The efficiency of the hierarchical USY zeolite is concomitantly demonstrated in the conversion of limonene and turpentine oil, which emphasizes its industrial potential. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Supervised normalization of microarrays

    PubMed Central

    Mecham, Brigham H.; Nelson, Peter S.; Storey, John D.

    2010-01-01

    Motivation: A major challenge in utilizing microarray technologies to measure nucleic acid abundances is ‘normalization’, the goal of which is to separate biologically meaningful signal from other confounding sources of signal, often due to unavoidable technical factors. It is intuitively clear that true biological signal and confounding factors need to be simultaneously considered when performing normalization. However, the most popular normalization approaches do not utilize what is known about the study, both in terms of the biological variables of interest and the known technical factors in the study, such as batch or array processing date. Results: We show here that failing to include all study-specific biological and technical variables when performing normalization leads to biased downstream analyses. We propose a general normalization framework that fits a study-specific model employing every known variable that is relevant to the expression study. The proposed method is generally applicable to the full range of existing probe designs, as well as to both single-channel and dual-channel arrays. We show through real and simulated examples that the method has favorable operating characteristics in comparison to some of the most highly used normalization methods. Availability: An R package called snm implementing the methodology will be made available from Bioconductor (http://bioconductor.org). Contact: jstorey@princeton.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20363728
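    The core idea of supervised normalization, removing only the known technical effect while preserving the biological signal, can be sketched in a few lines. This is a minimal illustration under a balanced-design assumption, not the snm package's actual model-fitting algorithm:

```python
# Minimal sketch of the supervised-normalization idea: with a design in
# which the biological variable is balanced across batches, the
# between-batch mean difference estimates the purely technical (batch)
# effect, which can then be subtracted. Data below are invented.

def remove_batch_effect(exprs, batches):
    """exprs: expression values; batches: 0/1 batch label per sample.
    Assumes the biological variable of interest is balanced across
    batches, so the between-batch mean difference is purely technical."""
    g0 = [e for e, b in zip(exprs, batches) if b == 0]
    g1 = [e for e, b in zip(exprs, batches) if b == 1]
    batch_effect = sum(g1) / len(g1) - sum(g0) / len(g0)
    return [e - batch_effect if b == 1 else e for e, b in zip(exprs, batches)]

# Biological signal: +2 for treated samples; batch 1 adds a +3 shift.
exprs   = [0.0, 2.0, 3.0, 5.0]   # control/treated in batch 0, then batch 1
batches = [0,   0,   1,   1]
normalized = remove_batch_effect(exprs, batches)
```

    Unsupervised normalization, by contrast, would operate blind to the treatment labels and could remove part of the biological difference along with the batch shift — which is exactly the bias the paper warns about.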

  13. Health in people with intellectual disability in Spain: the European POMONA-II study

    PubMed Central

    Martínez-Leal, Rafael; Salvador-Carulla, Luis; Gutiérrez-Colosía, Mencía Ruiz; Nadal, Margarida; Novell-Alsina, Ramón; Martorell, Almudena; González-Gordón, Rodrigo G.; Mérida-Gutiérrez, M. Reyes; Ángel, Silvia; Milagrosa-Tejonero, Luisa; Rodríguez, Alicia; García-Gutiérrez, Juan C.; Pérez-Vicente, Amado; García-Ibáñez, José; Aguilera-Inés, Francisco

    2011-01-01

    Introduction: International studies show that there is a distinct health pattern and a disparity in health care between people with intellectual disability (ID) and the general population. Objective: To obtain data on the health status of people with ID and compare them with data for the general population. Patients and methods: The P15 set of health indicators was applied to a sample of 111 subjects with ID. The health data obtained were compared according to the subjects' type of residence, and the 2006 Spanish National Health Survey was used to compare these data with those of the general population. Results: The ID sample showed 25 times more cases of epilepsy and twice the rate of obesity. Twenty percent presented oral pain, and there was a high prevalence of sensory and mobility problems and of psychosis. However, we found a low prevalence of conditions such as diabetes, hypertension, osteoarthritis, and osteoporosis. Subjects with ID also showed lower participation in prevention and health-promotion programs, a higher number of hospital admissions, and lower use of emergency services. Conclusions: The health pattern of people with ID differs from that of the general population, and they make different use of health services. It is important to develop health-promotion programs and professional training specifically designed for the care of people with ID, as well as to implement health surveys that include data on this population. PMID:21948011

  14. Energy Star program benefits Con Edison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Impressed with savings in energy costs achieved after upgrading the lighting and air conditioning systems at its Manhattan headquarters, Home Box Office (HBO) wanted to do more. James Flock, vice president for computer and office systems, contacted Con Edison Co. of New York in March 1991 to determine what the company could do to save money by reducing the energy consumed by personal computers. Arthur Kressner, Con Edison Research and Development manager, contacted industry organizations and manufacturers for advice, but was told only to shut off computers at night and on weekends. Kressner arranged a series of meetings with IBM and the Electric Power Research Institute (EPRI) to discuss the issue, then approached the U.S. Environmental Protection Agency (EPA), which was designing a program to promote the introduction and use of energy-efficient office equipment. In 1992, the EPA announced the Energy Star program for PCs, enabling manufacturers to display the Energy Star logo on machines meeting the program criteria, including the ability to enter a sleep mode in which neither the computer nor the monitor consumes more than 30 W of electricity. Industry experts estimate national energy consumption by office equipment could double by the year 2000, but Energy Star equipment is expected to improve efficiency and help maintain electric loads.

  15. Laser-scanned fluorescence of nonlased/normal, lased/normal, nonlased/carious, and lased/carious enamel

    NASA Astrophysics Data System (ADS)

    Zakariasen, Kenneth L.; Barron, Joseph R.; Paton, Barry E.

    1992-06-01

    Research has shown that low levels of CO2 laser irradiation raise enamel resistance to sub-surface demineralization. Additionally, laser-scanned fluorescence analysis of enamel, as well as laser and white light reflection studies, has potential for both clinical diagnosis and comparative research investigations of the caries process. This study was designed to compare laser fluorescence and laser/white light reflection of (1) non-lased/normal with lased/normal enamel and (2) non-lased/normal with non-lased/carious and lased/carious enamel. Specimens were buccal surfaces of extracted third molars, coated with acid-resistant varnish except for either two or three 2.25 mm2 windows (two-window specimens: non-lased/normal, lased/normal; three-window specimens: non-lased/normal, non-lased/carious, lased/carious). Teeth exhibiting carious windows were immersed in a demineralizing solution for twelve days. Non-carious windows were covered with wax during immersion. Following immersion, the wax was removed, and fluorescence and laser/white light reflection analyses were performed on all windows utilizing a custom scanning laser fluorescence spectrometer, which focuses light from a 25 mW He-Cd laser at 442 nm through an objective lens onto a cross-section ≥3 μm in diameter. For laser/white light reflection analyses, reflected light intensities were measured. A HeNe laser was used for the laser light reflection studies. Following analyses, the teeth are sectioned bucco-lingually into 80-μm sections, examined under polarized light microscopy, and the lesions photographed. This permits comparison between fluorescence/reflected-light values and the visualized decalcification areas for each section, and thus comparisons between various enamel treatments and normal enamel. The enamel specimens are currently being analyzed.

  16. Study of the photoabsorption and photoionization of the atmospherically highly relevant NO molecule through its Rydberg states with the MQDO methodology

    NASA Astrophysics Data System (ADS)

    Bustos, E.; Velasco, A. M.; Martín, I.; Lavín, C.

    Photoionization processes are of fundamental importance [1] and find application in a great number of scientific contexts: astrophysics [2], radiation chemistry, biology. Researchers in those fields need reliable values of cross sections for partial photoionization, photoabsorption, and photofragmentation processes over wide spectral ranges, particularly in modelling studies [3-5]. This work focuses on nitric oxide, considered appropriate and relevant for several reasons: the crucial role it plays in the physics and chemistry of the upper atmosphere [6], apart from being closely linked to pollution problems. Dissociative recombination processes [7] of NO, in which Rydberg states are directly involved, are relevant, for example, in the E and F regions of the ionosphere [7]. In this work the photoionization of NO from the ground state is studied with the molecular version of the quantum defect orbital method (MQDO). To this end, the differential partial oscillator strengths that constitute the photoionization channels of NO from the ground state are calculated. Given the scarcity of comparative data [8], the continuity of the calculated oscillator-strength differential across the photoionization threshold, that is, between the discrete and continuum regions of the spectrum, is adopted as a quality criterion.

  17. SeleCon: Scalable IoT Device Selection and Control Using Hand Gestures

    PubMed Central

    Alanwar, Amr; Alzantot, Moustafa; Ho, Bo-Jhang; Martin, Paul; Srivastava, Mani

    2018-01-01

    Although different interaction modalities have been proposed in the field of human-computer interaction (HCI), only a few of these techniques have reached end users because of scalability and usability issues. Given the popularity and growing number of IoT devices, selecting one out of many devices becomes a hurdle in a typical smart-home environment. Therefore, an easy-to-learn, scalable, and non-intrusive interaction modality has to be explored. In this paper, we propose a pointing approach to interact with devices, as pointing is arguably a natural way for device selection. We introduce SeleCon, a system for device selection and control that uses an ultra-wideband (UWB)-equipped smartwatch. To interact with a device in our system, people can point to the device to select it and then draw a hand gesture in the air to specify a control action. To this end, SeleCon employs inertial sensors for pointing-gesture detection and a UWB transceiver for identifying the selected device from ranging measurements. Furthermore, SeleCon supports an alphabet of gestures that can be used for controlling the selected devices. We performed our experiment in a 9 m × 10 m lab space with eight deployed devices. The results demonstrate that SeleCon achieves 84.5% accuracy for device selection and 97% accuracy for hand-gesture recognition. We also show that SeleCon is power-efficient enough to sustain daily use by turning off the UWB transceiver when the user's wrist is stationary. PMID:29683151

  18. Sport Concussion Management Using Facebook: A Feasibility Study of an Innovative Adjunct "iCon".

    PubMed

    Ahmed, Osman Hassan; Schneiders, Anthony G; McCrory, Paul R; Sullivan, S John

    2017-04-01

    Context: Sport concussion is currently the focus of much international attention. Innovative methods to assist athletic trainers in facilitating management after this injury need to be investigated. Objective: To investigate the feasibility of using a Facebook concussion-management program termed iCon (interactive concussion management) to facilitate the safe return to play (RTP) of young persons after sport concussion. Design: Observational study. Setting: Facebook group containing interactive elements, with moderation and support from trained health care professionals. Patients or Other Participants: Eleven participants (n = 9 men, n = 2 women; range, 18 to 28 years old) completed the study. Main Outcome Measure(s): The study was conducted over a 3-month period, with participant questionnaires administered preintervention and postintervention. The primary focus was on the qualitative experiences of the participants and the effect of iCon on their RTP. Usage data were also collected. Results: At the completion of the study, all participants (100%) stated that they would recommend an intervention such as iCon to others. Their supporting quotes all indicated that iCon has the potential to improve the management of concussion among this cohort. Most participants (n = 9, 82%) stated they were better informed with regard to their RTP due to participating in iCon. Conclusions: This interactive adjunct to traditional concussion management was appreciated among this participant group, which indicates the feasibility of a future, larger study of iCon. Athletic trainers should consider the role that multimedia technologies may play in assisting with the management of sport concussion.

  19. Influence of protic ionic liquids on the structure and stability of succinylated Con A.

    PubMed

    Attri, Pankaj; Venkatesu, Pannuru

    2012-01-01

    We report the synthesis of a series of ionic liquids (ILs) from various ions of differing kosmotropicity, including dihydrogen phosphate (H₂PO₄⁻), hydrogen sulfate (HSO₄⁻), and acetate (CH₃COO⁻) as anions and a chaotropic trialkylammonium cation. To characterize the biomolecular interactions of ILs with protein, we have explored the stability of succinylated Con A (S Con A) in the presence of these aqueous ILs, which are varied combinations of a kosmotropic anion with a chaotropic cation: triethylammonium dihydrogen phosphate [(CH₃CH₂)₃NH][H₂PO₄] (TEAP), trimethylammonium acetate [(CH₃)₃NH][CH₃COO] (TMAA), trimethylammonium dihydrogen phosphate [(CH₃)₃NH][H₂PO₄] (TMAP), and trimethylammonium hydrogen sulfate [(CH₃)₃NH][HSO₄] (TMAS). Circular dichroism (CD) and fluorescence experiments were used to characterize the stabilization of S Con A by the ILs. Our data distinctly demonstrate that the longer-alkyl-chain IL TEAP is a strong stabilizer for S Con A. Further, our experimental results reveal that TEAP is an effective refolding enhancer for S Con A from a thermally denatured protein structure. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. Support for Geodynamic Studies with GPS in Guatemala

    NASA Astrophysics Data System (ADS)

    Robles, V. R.

    2013-05-01

    The Instituto Geografico Nacional of Guatemala deployed 17 GNSS stations in 2009 as a mixed-credit project with equipment donated by the Government of Switzerland. This network of CORS GNSS stations is a system for receiving and transmitting raw GPS RINEX data that uses Leica Spider Web technology, and it is also serving to establish a national geodetic frame of official geodetic coordinates, from which the velocities of the 17 CORS stations are computed at scheduled intervals. The infrastructure of the Guatemalan geodetic frame is serving as the basis for geodynamic applications such as monitoring the displacement of tectonic plates, through a study begun in 1999 called "GPS measurement of the Polochic-Motagua river fault system of Guatemala", and also for a study of local crustal deformation at an active Guatemalan volcano called Pacaya. For the GPS measurement study of the Polochic-Motagua river fault system, 16 points to be measured with dual-frequency GPS were established in 1999, of which three are the CORS IGS geodetic stations GUAT, ELEN, and HUEH; then in 2003 another measurement campaign was carried out over a total of 20 points, which made it possible to compute the displacement velocities of those points, using the NUVEL-1A model of DeMets for the North American plate as a reference. This study was carried out in international cooperation with the University of Nice and the IGN of France. For the GPS monitoring study of the active volcano, four points were established around the volcano, at which four measurement campaigns are made per year; these allow the distances between the points to be determined axially and the behavior of the distances as a function of time to be reviewed statistically, whether
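    The two-epoch campaign velocity estimate described above (1999 and 2003 measurements) reduces to a simple difference quotient. The sketch below is illustrative only; station coordinates and values are invented.

```python
# Estimate a station's horizontal velocity from positions measured at two
# GPS campaign epochs. Positions are local east/north offsets in mm;
# epochs are decimal years. Example numbers are invented.

def station_velocity(e1, n1, year1, e2, n2, year2):
    """Return (east, north) velocity in mm/yr between two epochs."""
    dt = year2 - year1
    if dt <= 0:
        raise ValueError("second epoch must be later than the first")
    return ((e2 - e1) / dt, (n2 - n1) / dt)

# Hypothetical station: moved 80 mm east and 20 mm south over 4 years
ve, vn = station_velocity(0.0, 0.0, 1999.0, 80.0, -20.0, 2003.0)
```

    In practice such velocities are then compared against a plate-motion model (here NUVEL-1A for the North American plate) to isolate relative motion across the fault system.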

  1. Protected natural areas of Puerto Rico

    Treesearch

    William A. Gould; Maya Quinones; Mariano Solórzano; Waldemar Alcobas; Caryl Alarcon

    2011-01-01

    In this map we show the natural areas designated for the conservation of natural resources in Puerto Rico, together with the regions regulated by law that have the potential to improve their conservation. The designation of protected areas is a dynamic process, as it is the result of the constant evolution of values in the different sectors of the...

  2. Spinal cord normalization in multiple sclerosis.

    PubMed

    Oh, Jiwon; Seigo, Michaela; Saidha, Shiv; Sotirchos, Elias; Zackowski, Kathy; Chen, Min; Prince, Jerry; Diener-West, Marie; Calabresi, Peter A; Reich, Daniel S

    2014-01-01

    Spinal cord (SC) pathology is common in multiple sclerosis (MS), and measures of SC-atrophy are increasingly utilized. Normalization reduces biological variation of structural measurements unrelated to disease, but optimal parameters for SC volume (SCV)-normalization remain unclear. Using a variety of normalization factors and clinical measures, we assessed the effect of SCV normalization on detecting group differences and clarifying clinical-radiological correlations in MS. 3T cervical SC-MRI was performed in 133 MS cases and 11 healthy controls (HC). Clinical assessment included expanded disability status scale (EDSS), MS functional composite (MSFC), quantitative hip-flexion strength ("strength"), and vibration sensation threshold ("vibration"). SCV between C3 and C4 was measured and normalized individually by subject height, SC-length, and intracranial volume (ICV). There were group differences in raw-SCV and after normalization by height and length (MS vs. HC; progressive vs. relapsing MS-subtypes, P < .05). There were correlations between clinical measures and raw-SCV (EDSS:r = -.20; MSFC:r = .16; strength:r = .35; vibration:r = -.19). Correlations consistently strengthened with normalization by length (EDSS:r = -.43; MSFC:r = .33; strength:r = .38; vibration:r = -.40), and height (EDSS:r = -.26; MSFC:r = .28; strength:r = .22; vibration:r = -.29), but diminished with normalization by ICV (EDSS:r = -.23; MSFC:r = -.10; strength:r = .23; vibration:r = -.35). In relapsing MS, normalization by length allowed statistical detection of correlations that were not apparent with raw-SCV. SCV-normalization by length improves the ability to detect group differences, strengthens clinical-radiological correlations, and is particularly relevant in settings of subtle disease-related SC-atrophy in MS. SCV-normalization by length may enhance the clinical utility of measures of SC-atrophy. Copyright © 2014 by the American Society of Neuroimaging.
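    The normalization compared in the study is, in essence, a ratio of the measured cord volume to a subject-level scaling factor (height, SC-length, or ICV). A minimal sketch, with variable names of my own choosing:

```python
# Normalize a C3-C4 spinal cord volume by a subject scaling factor, as in
# the comparisons above (raw SCV vs. SCV/height vs. SCV/length vs. SCV/ICV).
# Example values are invented.

def normalize_scv(scv_mm3, factor):
    """Normalize spinal cord volume by a positive subject-level factor
    (e.g. SC-length in mm, height in cm, or ICV in mm^3)."""
    if factor <= 0:
        raise ValueError("scaling factor must be positive")
    return scv_mm3 / factor

# Hypothetical subject: 800 mm^3 of cord volume over a 100 mm cord segment
per_length = normalize_scv(800.0, 100.0)
```

    The study's finding is that the choice of denominator matters: length-based scaling strengthened clinical correlations, while ICV-based scaling weakened them.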

  3. Normal Stress or Adjustment Disorder?

    MedlinePlus

    What's the difference between normal stress and an adjustment disorder? Answers from Daniel K. Hall-Flavin, M.D. Stress is a normal psychological and physical reaction to ...

  4. Understanding a Normal Distribution of Data.

    PubMed

    Maltenfort, Mitchell G

    2015-12-01

    Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?
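    As a practical complement to the questions above, here is a rule-of-thumb normality check based on sample skewness and excess kurtosis, both of which are near zero for normal data. The thresholds are common heuristics, not from this paper; formal tests (e.g. Shapiro-Wilk) would be used in practice.

```python
# Rough normality screen: compute sample skewness and excess kurtosis and
# flag the data as "roughly normal" if both are small. Threshold is a
# common heuristic, not a formal test.

import random

def skew_kurtosis(xs):
    """Return (skewness, excess kurtosis) of a sample."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0

def roughly_normal(xs, tol=1.0):
    skew, kurt = skew_kurtosis(xs)
    return abs(skew) < tol and abs(kurt) < tol

random.seed(0)
normal_sample = [random.gauss(0.0, 1.0) for _ in range(2000)]
skewed_sample = [random.expovariate(1.0) for _ in range(2000)]  # skew ≈ 2
```

    When data fail such a screen, options include transformation (e.g. log), non-parametric tests, or methods that do not assume normality, which is the question the paper addresses.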

  5. Valorization of pyrolysis water: a biorefinery side stream, for 1,2-propanediol production with engineered Corynebacterium glutamicum.

    PubMed

    Lange, Julian; Müller, Felix; Bernecker, Kerstin; Dahmen, Nicolaus; Takors, Ralf; Blombach, Bastian

    2017-01-01

    A future bioeconomy relies on the efficient use of renewable resources for energy and material product supply. In this context, biorefineries have been developed and play a key role in converting lignocellulosic residues. Although a holistic use of the biomass feed is desired, side streams arise in current biorefinery approaches. To ensure profitability, efficiency, and sustainability of the overall conversion process, a meaningful valorization of these materials is needed. Here, a so far unexploited side stream derived from fast pyrolysis of wheat straw, pyrolysis water, was used for production of 1,2-propanediol in microbial fermentation with engineered Corynebacterium glutamicum. A protocol for pretreatment of pyrolysis water was established and enabled growth on its major constituents, acetate and acetol, at rates up to 0.36 ± 0.04 h⁻¹. To convert acetol to 1,2-propanediol, the plasmid pJULgldA expressing the glycerol dehydrogenase from Escherichia coli was introduced into C. glutamicum. 1,2-Propanediol was formed in a growth-coupled biotransformation, and production was further increased by construction of C. glutamicum Δpqo ΔaceE ΔldhA Δmdh pJULgldA. In a two-phase aerobic/microaerobic fed-batch process with pyrolysis water as substrate, this strain produced 18.3 ± 1.2 mM 1,2-propanediol with a yield of 0.96 ± 0.05 mol 1,2-propanediol per mol acetol and showed an overall volumetric productivity of 1.4 ± 0.1 mmol 1,2-propanediol L⁻¹ h⁻¹. This study implements microbial fermentation into a biorefinery based on pyrolytic liquefaction of lignocellulosic biomass and accesses a novel value chain by valorizing the side stream pyrolysis water for 1,2-PDO production with engineered C. glutamicum. The established bioprocess operated at maximal product yield and accomplished the so far highest overall volumetric productivity for microbial 1,2-PDO production with an engineered producer strain. Besides, the results highlight the

  6. Olive pomace valorization by Aspergillus species: lipase production using solid-state fermentation.

    PubMed

    Oliveira, Felisbela; Moreira, Cláudia; Salgado, José Manuel; Abrunhosa, Luís; Venâncio, Armando; Belo, Isabel

    2016-08-01

    Pollution by olive mill wastes is an important problem in the Mediterranean area, and novel solutions for their proper management and valorization are needed. The aim of this work was to optimize a solid-state fermentation (SSF) process for lipase production by several Aspergillus spp. using olive pomace (OP) as the main source of nutrients. The variables optimized in two different designs were: the ratio between olive pomace and wheat bran (OP:WB), NaNO3, Czapek nutrients, fermentation time, moisture content (MC) and temperature. Results showed that the OP:WB mixture and MC were the most significant factors affecting lipase production for all fungal strains tested. With MC and temperature optimization, a 4.4-fold increase in A. ibericus lipase was achieved (90.5 ± 1.5 U g(-1)), using a mixture of OP and WB at a 1:1 ratio, 0.02 g NaNO3 g(-1) dry substrate, absence of Czapek nutrients, 60% MC and incubation at 30 °C for 7 days. For A. niger and A. tubingensis, the highest lipase activities obtained were 56.6 ± 5.4 and 7.6 ± 0.6 U g(-1), respectively. Aspergillus ibericus was found to be the most promising microorganism for lipase production using mixtures of OP and WB. © 2015 Society of Chemical Industry.

  7. A Review of Staphylococcal Cassette Chromosome mec (SCCmec) Types in Coagulase-Negative Staphylococci (CoNS) Species.

    PubMed

    Saber, Huda; Jasni, Azmiza Syawani; Jamaluddin, Tengku Zetty Maztura Tengku; Ibrahim, Rosni

    2017-10-01

    Coagulase-negative staphylococci (CoNS) are considered low-pathogenicity organisms. However, they are progressively causing more serious infections over time because they have adapted well to various antibiotics, owing in part to their ability to form biofilms. Few studies have been conducted on CoNS in both hospital and community-acquired settings, especially in Malaysia. Thus, it is important to study their species and gene distributions. A mobile genetic element, the staphylococcal cassette chromosome mec (SCCmec), plays an important role in staphylococcal pathogenesis. SCCmec has been studied less frequently in CoNS than in Staphylococcus aureus (coagulase-positive staphylococci). A recent study (8) conducted in Malaysia successfully detected SCCmec types I to VIII, as well as several new combination patterns, in CoNS species, particularly Staphylococcus epidermidis. However, data are still limited, and further research is warranted. This paper provides a review of SCCmec types among CoNS species.

  8. Cell proliferation in normal epidermis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weinstein, G.D.; McCullough, J.L.; Ross, P.

    1984-06-01

    A detailed examination of cell proliferation kinetics in normal human epidermis is presented. Using tritiated thymidine with autoradiographic techniques, proliferative and differentiated cell kinetics are defined and interrelated. The proliferative compartment of normal epidermis has a cell cycle duration (Tc) of 311 h, derived from 3 components: the germinative labeling index (LI), the duration of DNA synthesis (ts), and the growth fraction (GF). The germinative LI is 2.7% +/- 1.2 and ts is 14 h, the latter obtained from a composite fraction-of-labeled-mitoses curve from 11 normal subjects. The GF, obtained from the literature and from human skin xenografts to nude mice, is estimated to be 60%. Normal-appearing epidermis from patients with psoriasis appears to have a higher proliferation rate. The mean LI is 4.2% +/- 0.9, approximately 50% greater than in normal epidermis. Absolute cell kinetic values for this tissue, however, cannot yet be calculated for lack of other information on ts and GF. A kinetic model for epidermal cell renewal in normal epidermis is described that interrelates the rate of birth/entry, transit, and/or loss of keratinocytes in the 3 epidermal compartments: proliferative, viable differentiated (stratum malpighii), and stratum corneum. Expected kinetic homeostasis in the epidermis is confirmed by the very similar "turnover" rates in each of the compartments, which are, respectively, 1246, 1417, and 1490 cells/day/mm² of surface area. The mean epidermal turnover time of the entire tissue is 39 days. The Tc of 311 h in normal cells is 8-fold longer than the psoriatic Tc of 36 h and is necessary for understanding the hyperproliferative pathophysiologic process in psoriasis.
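
    The cell cycle duration quoted above follows arithmetically from the three measured components, Tc = GF × ts / LI. A minimal sketch of that calculation, using only the values reported in the abstract:

```python
# Cell cycle duration (Tc) of the proliferative compartment from the
# germinative labeling index (LI), the duration of DNA synthesis (ts),
# and the growth fraction (GF): Tc = GF * ts / LI.
def cell_cycle_duration(li, ts_hours, gf):
    """Return Tc in hours; LI and GF are fractions, ts is in hours."""
    return gf * ts_hours / li

# Values reported in the abstract: LI = 2.7%, ts = 14 h, GF = 60%.
tc = cell_cycle_duration(li=0.027, ts_hours=14.0, gf=0.60)
print(round(tc))  # 311 h, the reported cell cycle duration
```

    The same relation shows why a higher LI at comparable ts and GF, as in psoriatic tissue, implies a much shorter Tc.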

  9. Normal Weight Obesity: A Hidden Health Risk?

    MedlinePlus

    Normal weight obesity: A hidden health risk? Can you be considered obese if you have a normal body weight? Answers from ... considered obese — a condition known as normal weight obesity. Normal weight obesity means you may have the ...

  10. CT of Normal Developmental and Variant Anatomy of the Pediatric Skull: Distinguishing Trauma from Normality.

    PubMed

    Idriz, Sanjin; Patel, Jaymin H; Ameli Renani, Seyed; Allan, Rosemary; Vlahos, Ioannis

    2015-01-01

    The use of computed tomography (CT) in clinical practice has been increasing rapidly, with the number of CT examinations performed in adults and children rising by 10% per year in England. Even as the radiology community strives to reduce the radiation dose associated with pediatric examinations, external factors, including guidelines for pediatric head injury, are raising expectations for the use of cranial CT in the pediatric population. Thus, radiologists are increasingly likely to encounter pediatric head CT examinations in daily practice. The variable appearance of cranial sutures at different ages can be confusing for inexperienced readers of radiologic images. The evolution of multidetector CT with thin-section acquisition increases the clarity of some of these sutures, which may be misinterpreted as fractures. Familiarity with the normal anatomy of the pediatric skull, how it changes with age, and its normal variants can help translate the increased resolution of multidetector CT into more accurate detection of fractures and confident determination of normality, thereby reducing prolonged hospitalization of children whose normal developmental structures have been misinterpreted as fractures. More important, the potential morbidity and mortality related to false-negative interpretation of fractures as normal sutures may be avoided. The authors describe the normal anatomy of all standard pediatric sutures, common variants, and sutural mimics, thereby providing an accurate and safe framework for CT evaluation of skull trauma in pediatric patients. ©RSNA, 2015.

  11. Comprehensive non-dimensional normalization of gait data.

    PubMed

    Pinzone, Ornella; Schwartz, Michael H; Baker, Richard

    2016-02-01

    Normalizing clinical gait analysis data is required to remove variability due to physical characteristics such as leg length and weight. This is particularly important for children, where both are associated with age. Most clinical centres use conventional normalization (by mass only), whereas there is a stronger biomechanical argument for non-dimensional normalization. This study used data from 82 typically developing children to compare how the two schemes performed over a wide range of temporal-spatial and kinetic parameters by calculating the coefficients of determination with leg length, weight and height. 81% of the conventionally normalized parameters had a coefficient of determination above the threshold for a statistical association (p<0.05), compared to 23% of those normalized non-dimensionally. All the conventionally normalized parameters exceeding this threshold showed a reduced association with non-dimensional normalization. In conclusion, non-dimensional normalization is more effective than conventional normalization in reducing the effects of height, weight and age across a comprehensive range of temporal-spatial and kinetic parameters. Copyright © 2015 Elsevier B.V. All rights reserved.
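
    The abstract does not spell out the non-dimensional scheme it uses; a common choice in the gait literature scales each quantity by combinations of body mass m, leg length l, and g. The helper below is a hypothetical sketch of that style of scaling, with illustrative example values:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def nondimensionalize(mass_kg, leg_length_m, *, speed=None, moment=None, power=None):
    """Non-dimensional gait parameters in the Hof style (hypothetical helper;
    the paper's exact scaling scheme may differ in detail)."""
    out = {}
    if speed is not None:    # v / sqrt(g * l)
        out["speed"] = speed / math.sqrt(G * leg_length_m)
    if moment is not None:   # M / (m * g * l)
        out["moment"] = moment / (mass_kg * G * leg_length_m)
    if power is not None:    # P / (m * g**1.5 * l**0.5)
        out["power"] = power / (mass_kg * G ** 1.5 * math.sqrt(leg_length_m))
    return out

# Illustrative child-sized values: 30 kg, 0.7 m leg length, 1.2 m/s walking speed
print(nondimensionalize(30.0, 0.7, speed=1.2, moment=100.0))
```

    Because each scaled quantity is dimensionless, any residual correlation with leg length or weight, as measured by the coefficients of determination above, indicates normalization that has not fully done its job.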

  12. Connected vehicle pilot deployment program phase 1, concept of operations (ConOps) – Tampa (THEA).

    DOT National Transportation Integrated Search

    2016-02-01

    This document describes the Concept of Operations (ConOps) for the Tampa Hillsborough Expressway Authority (THEA) Connected Vehicle (CV) Pilot Deployment. This ConOps describes the current state of operations, establishes the reasons for change, and ...

  13. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: they do so if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and of the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
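
    A rough, stdlib-only sketch of how such a plot and its intervals can be built. Note this simulates pointwise, not simultaneous, intervals; the paper's simultaneous intervals would be widened until all n points are covered jointly with probability 1-α:

```python
import random
import statistics

def probability_plot_points(sample):
    """Theoretical normal quantiles (Blom plotting positions) vs. ordered data."""
    n = len(sample)
    nd = statistics.NormalDist()
    xs = [nd.inv_cdf((i - 0.375) / (n + 0.25)) for i in range(1, n + 1)]
    return xs, sorted(sample)

def order_statistic_bands(n, alpha=0.05, sims=2000, seed=0):
    """Simulated per-point (pointwise, NOT simultaneous) 1-alpha intervals
    for standard-normal order statistics."""
    rng = random.Random(seed)
    draws = [sorted(rng.gauss(0.0, 1.0) for _ in range(n)) for _ in range(sims)]
    k = int(sims * alpha / 2)
    lo, hi = [], []
    for i in range(n):
        col = sorted(d[i] for d in draws)  # empirical dist. of i-th order stat
        lo.append(col[k])
        hi.append(col[sims - 1 - k])
    return lo, hi

sample = [0.5, -1.2, 0.3, 2.1, -0.4]
xs, ys = probability_plot_points(sample)
lo, hi = order_statistic_bands(len(sample))
print(all(a <= y <= b for y, a, b in zip(ys, lo, hi)))
```

    The objective rule then reads exactly as in the abstract: accept normality if and only if every ordered point lies inside its interval.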

  14. Connected vehicle pilot deployment program phase 1, concept of operations (ConOps) - New York City.

    DOT National Transportation Integrated Search

    2016-04-08

    This document describes the Concept of Operations (ConOps) for the New York City Department of Transportation (NYC) Connected Vehicle Pilot Deployment (CVPD) Project. This ConOps describes the current state of operations, establishes the reasons for ...

  15. Rock friction under variable normal stress

    USGS Publications Warehouse

    Kilgore, Brian D.; Beeler, Nicholas M.; Lozos, Julian C.; Oglesby, David

    2017-01-01

    The aim of this study is to determine the detailed response of shear strength and other fault properties to changes in normal stress at room temperature, using dry, initially bare rock surfaces of granite at normal stresses between 5 and 7 MPa. Rapid normal stress changes result in gradual, approximately exponential changes in shear resistance with fault slip. The characteristic length of the exponential change is similar for both increases and decreases in normal stress. In contrast, changes in fault normal displacement and in the amplitude of small high-frequency elastic waves transmitted across the surface follow a two-stage response consisting of a large immediate response and a smaller gradual response with slip. The characteristic slip distance of the small gradual response is significantly smaller than that of shear resistance. The stability of sliding in response to large step decreases in normal stress is well predicted using the shear resistance slip length observed in step increases. Analysis of the shear resistance and slip-time histories suggests that nearly immediate changes in strength occur in response to rapid changes in normal stress; these are manifested as an immediate change in slip speed. These changes in slip speed can be qualitatively accounted for using a rate-independent strength model. Collectively, the observations and model show that acceleration or deceleration in response to a normal stress change depends on the size of the change, the frictional characteristics of the fault surface, and the elastic properties of the loading system.

  16. Evaluation of the ASOS impact on climatic normals and assessment of variable-length time periods in calculation of normals

    NASA Astrophysics Data System (ADS)

    Kauffman, Chad Matthew

    The temperature and precipitation values that describe the norm of daily, monthly, and seasonal climate conditions are "climate normals." They are usually calculated from climate data covering a 30-year period and updated every 10 years; the next update will take place in 2001. Because of the advent of the Automated Surface Observing System (ASOS) in the early 1990s and the recognized temperature bias between ASOS and conventional temperature sensors, there is uncertainty about how ASOS data should be used to calculate the 1971-2000 temperature normal. This study examined that uncertainty and offered a method to minimize it. It showed that the ASOS bias has a measurable impact on the new 30-year temperature normal. The impact varies among stations and climate regions. Some stations with a cooling trend in ASOS temperature have a cooler temperature normal, while others with a warming trend have a warmer one. These quantitative evaluations of the ASOS effect for stations and regions can be used to reduce ASOS bias in temperature normals. This study also evaluated temperature normals for periods of different lengths and compared them to the 30-year normal. It showed that the difference between the normals is smaller in maritime climates than in continental temperate climates. In the former, the six-year normal describes a temperature variation similar to that of the 30-year normal. In the latter, the 18-year normal starts to resemble the temperature variation that the 30-year normal describes. These results provide a theoretical basis for applying different normals in different regions. The study further compared temperature normals for different periods and identified a seasonal shift in climate change in the southwestern U.S., where the summer maximum temperature has shifted to a late summer month and the winter minimum temperature to an early winter month over the past 30 years.
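
    As a toy illustration of the normals being compared (synthetic numbers, not the study's data), a short-period normal tracks a trend more closely than the 30-year normal does:

```python
from statistics import mean

def climate_normal(annual_means, period=30):
    """Mean of the most recent `period` complete years (e.g. the 1971-2000 normal)."""
    if len(annual_means) < period:
        raise ValueError("need at least one full averaging period")
    return mean(annual_means[-period:])

# Toy 30-year series with a steady warming trend (illustrative only):
temps = [10.0 + 0.02 * year for year in range(30)]
print(climate_normal(temps))      # 30-year normal
print(climate_normal(temps, 6))   # 6-year normal: warmer, tracks recent years
```

    Under a trend the two normals diverge, which is why the abstract finds short normals adequate only where interannual variation, not trend, dominates (maritime climates).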

  17. A Statistical Selection Strategy for Normalization Procedures in LC-MS Proteomics Experiments through Dataset Dependent Ranking of Normalization Scaling Factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Jacobs, Jon M.

    2011-12-01

    Quantification of LC-MS peak intensities assigned during peptide identification in a typical comparative proteomics experiment will deviate from run to run of the instrument due to both technical and biological variation. Thus, normalization of peak intensities across an LC-MS proteomics dataset is a fundamental pre-processing step. However, the downstream analysis of LC-MS proteomics data can be dramatically affected by the normalization method selected. Current normalization procedures for LC-MS proteomics data are presented in the context of normalization values derived from subsets of the full collection of identified peptides. The distribution of these normalization values is unknown a priori. If they are not independent of the biological factors associated with the experiment, the normalization process can introduce bias into the data, which will affect downstream statistical biomarker discovery. We present a novel approach to evaluating normalization strategies, where a normalization strategy includes the peptide selection component associated with the derivation of normalization values. Our approach evaluates the effect of normalization on the between-group variance structure in order to identify candidate normalization strategies that improve the structure of the data without introducing bias into the normalized peak intensities.
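
    As a minimal illustration of deriving normalization values from peptide intensities, the sketch below applies global median scaling, one common strategy (not the paper's ranking procedure):

```python
from statistics import median

def median_scale(runs):
    """Divide every intensity in each run by that run's median intensity,
    so the normalization value is the per-run median (one common choice;
    the paper evaluates many such peptide-subset-based strategies)."""
    normalized = []
    for run in runs:
        factor = median(run.values())
        normalized.append({pep: v / factor for pep, v in run.items()})
    return normalized

runs = [
    {"pepA": 10.0, "pepB": 20.0, "pepC": 40.0},
    {"pepA": 5.0, "pepB": 10.0, "pepC": 20.0},  # same profile at half intensity
]
norm = median_scale(runs)
print(norm[0] == norm[1])  # True: the run-level scale difference is removed
```

    The paper's point is that the subset of peptides used to compute `factor` matters: if that subset is itself differentially abundant between biological groups, the scaling injects bias that this toy example does not exhibit.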

  18. Quaternion normalization in spacecraft attitude determination

    NASA Technical Reports Server (NTRS)

    Deutschmann, Julie; Bar-Itzhack, Itzhack; Galal, Ken

    1992-01-01

    Methods are presented to normalize the attitude quaternion in two extended Kalman filters (EKF), namely, the multiplicative EKF (MEKF) and the additive EKF (AEKF). It is concluded that all the normalization methods work well and yield comparable results. In the AEKF, normalization is not essential, since the data chosen for the test do not have a rapidly varying attitude. In the MEKF, normalization is necessary to avoid divergence of the attitude estimate. All of the methods behave similarly when the spacecraft experiences low angular rates.
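
    For reference, the normalization operation itself is simple; the filters above differ in where in the EKF update it is applied, not in the operation. A minimal sketch:

```python
import math

def normalize_quaternion(q):
    """Divide a quaternion (4-tuple) by its Euclidean norm so that |q| = 1.
    EKF codes sometimes use the first-order approximation q * (1.5 - 0.5*|q|**2)
    instead, which is cheap and accurate when |q| is already near 1."""
    n = math.sqrt(sum(c * c for c in q))
    if n == 0.0:
        raise ValueError("zero quaternion has no defined direction")
    return tuple(c / n for c in q)

q = normalize_quaternion((0.9, 0.1, 0.05, 0.02))
print(math.isclose(sum(c * c for c in q), 1.0))  # True
```

    Keeping |q| = 1 is what makes the quaternion a valid rotation; in the AEKF the additive state update drifts off the unit sphere, which is why the step matters there.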

  19. Inclusion: The Pros and Cons--A Critical Review

    ERIC Educational Resources Information Center

    Savich, Carl

    2008-01-01

    The purpose of this paper was to review, analyze, and critique the pros and cons, the advantages and disadvantages, of inclusion. The methodology consisted of analyzing and comparing research findings on the benefits and costs of inclusion. Federal legislation and regulations on inclusion were examined, analyzed, and discussed. The results showed…

  20. Tailoring a ConOps for NASA LSP Integrated Operations

    NASA Technical Reports Server (NTRS)

    Owens, Skip Clark V., III

    2017-01-01

    An integral part of the systems engineering process is the creation of a Concept of Operations (ConOps) for a given system, with the ConOps initially established early in the system design process and evolved as the system definition and design mature. As Integration Engineers in NASA's Launch Services Program (LSP) at Kennedy Space Center (KSC), our job is to manage the interface requirements for all the robotic space missions that come to our Program for a launch service. LSP procures and manages a launch service from one of our many commercial Launch Vehicle Contractors (LVCs), and these commercial companies are then responsible for developing the Interface Control Document (ICD), the verification of the requirements in that document, and all the services pertaining to integrating the spacecraft and launching it into orbit. However, one systems engineering tool that has not been employed within LSP to date is a Concept of Operations. The goal of this paper is to research the format and content of various aerospace industry ConOps and tailor that format and content into template form, so the template may be used as an engineering tool for spacecraft integration with future LSP-procured launch services. This tailoring effort was performed as the author's final Master's Project in the spring of 2016 for the Stevens Institute of Technology and modified for publication with INCOSE (Owens, 2016).

  1. Learning curve for the placement of minimally invasive percutaneous pedicle screws

    PubMed Central

    Landriel, Federico; Hem, Santiago; Rasmussen, Jorge; Vecchi, Eduardo; Yampolsky, Claudio

    2018-01-01

    Abstract Objective: The aim of this study was to estimate the learning curve required for correct placement of percutaneous transpedicular screws (PTS). Introduction: PTS are the most widely used form of instrumentation in the surgical treatment of spinal lesions requiring stabilization. Methods: We retrospectively evaluated the insertion of 422 PTS (T5 to S1) in 75 patients operated on between 2013 and 2016 under two-dimensional fluoroscopic guidance. Surgeon 1 always placed the right-side screws, and Surgeon 2 all of the left-side screws. Screw position and pedicle breach were determined with the Gertzbein CT classification. The accuracy of PTS placement in our series was compared against a breach rate of 8.08% (range 0.67-20.83%), a reference value obtained from our own meta-analysis. Results: Of the 422 PTS, 395 were inserted into the pedicle without cortical violation (Grade 1 = 93.6%); 27 (6.4%) breached the pedicle wall, of which 3.8% were Grade 2, 1.65% Grade 3, and only 0.9% Grade 4. Surgeon 1 had an overall breach rate of 6.6%, reaching standard accuracy values after placing 74 PTS; Surgeon 2 had a breach rate of 6.1%, reaching reference values at 64 PTS; the difference between the two was not statistically significant (P = 0.9009). Conclusion: In the series evaluated, approximately 70 PTS must be placed to achieve intrapedicular accuracy comparable to that reported by surgeons experienced in this minimally invasive technique. PMID:29900033

  2. Relating normalization to neuronal populations across cortical areas

    PubMed Central

    Alberts, Joshua J.; Cohen, Marlene R.

    2016-01-01

    Normalization, which divisively scales neuronal responses to multiple stimuli, is thought to underlie many sensory, motor, and cognitive processes. In every study where it has been investigated, neurons measured in the same brain area under identical conditions exhibit a range of normalization, from suppression by nonpreferred stimuli (strong normalization) to additive responses to combinations of stimuli (no normalization). Normalization has been hypothesized to arise from interactions between neuronal populations, either in the same or different brain areas, but current models of normalization are not mechanistic and focus on trial-averaged responses. To gain insight into the mechanisms underlying normalization, we examined interactions between neurons that exhibit different degrees of normalization. We recorded from multiple neurons in three cortical areas while rhesus monkeys viewed superimposed drifting gratings. We found that neurons showing strong normalization shared less trial-to-trial variability with other neurons in the same cortical area and more variability with neurons in other cortical areas than did units with weak normalization. Furthermore, the cortical organization of normalization was not random: neurons recorded on nearby electrodes tended to exhibit similar amounts of normalization. Together, our results suggest that normalization reflects a neuron's role in its local network and that modulatory factors like normalization share the topographic organization typical of sensory tuning properties. PMID:27358313

  3. Canceling Some d-CON Mouse and Rat Control Products

    EPA Pesticide Factsheets

    EPA has reached agreement with the manufacturer to cancel 12 d-CON products that do not meet our testing protocols, which better protect children, pets and non-target wildlife from accidental exposure to the pesticide. These products will be phased out.

  4. The ConSurf-DB: pre-calculated evolutionary conservation profiles of protein structures.

    PubMed

    Goldenberg, Ofir; Erez, Elana; Nimrod, Guy; Ben-Tal, Nir

    2009-01-01

    ConSurf-DB is a repository for evolutionary conservation analysis of the proteins of known structures in the Protein Data Bank (PDB). Sequence homologues of each of the PDB entries were collected and aligned using standard methods. The evolutionary conservation of each amino acid position in the alignment was calculated using the Rate4Site algorithm, implemented in the ConSurf web server. The algorithm takes into account the phylogenetic relations between the aligned proteins and the stochastic nature of the evolutionary process explicitly. Rate4Site assigns a conservation level for each position in the multiple sequence alignment using an empirical Bayesian inference. Visual inspection of the conservation patterns on the 3D structure often enables the identification of key residues that comprise the functionally important regions of the protein. The repository is updated with the latest PDB entries on a monthly basis and will be rebuilt annually. ConSurf-DB is available online at http://consurfdb.tau.ac.il/

  5. The ConSurf-DB: pre-calculated evolutionary conservation profiles of protein structures

    PubMed Central

    Goldenberg, Ofir; Erez, Elana; Nimrod, Guy; Ben-Tal, Nir

    2009-01-01

    ConSurf-DB is a repository for evolutionary conservation analysis of the proteins of known structures in the Protein Data Bank (PDB). Sequence homologues of each of the PDB entries were collected and aligned using standard methods. The evolutionary conservation of each amino acid position in the alignment was calculated using the Rate4Site algorithm, implemented in the ConSurf web server. The algorithm takes into account the phylogenetic relations between the aligned proteins and the stochastic nature of the evolutionary process explicitly. Rate4Site assigns a conservation level for each position in the multiple sequence alignment using an empirical Bayesian inference. Visual inspection of the conservation patterns on the 3D structure often enables the identification of key residues that comprise the functionally important regions of the protein. The repository is updated with the latest PDB entries on a monthly basis and will be rebuilt annually. ConSurf-DB is available online at http://consurfdb.tau.ac.il/ PMID:18971256

  6. B cell helper factors. II. Synergy among three helper factors in the response of T cell- and macrophage-depleted B cells.

    PubMed

    Liebson, H J; Marrack, P; Kappler, J

    1982-10-01

    The concanavalin A- (Con A) stimulated supernatant of normal spleen cells (normal Con A SN) was shown to contain a set of helper factors sufficient to allow T cell- and macrophage- (M phi) depleted murine splenic B cells to produce a plaque-forming cell response to the antigen sheep red blood cells (SRBC). The activity of normal Con A SN could be reconstituted by a mixture of three helper factor preparations. The first was the interleukin 2- (IL 2) containing Con A SN of the T cell hybridoma, FS6-14.13. The second was a normal Con A SN depleted of IL 2 by extended culture with T cell blasts from which the 30,000 to 50,000 m.w. factors were isolated (interleukin X, IL X). The third was a SN either from the M phi tumor cell line P388D1 or from normal M phi taken from Corynebacterium parvum-immune mice. The combination of all three helper factor preparations was required to equal the activity of normal Con A SN; however, the M phi SN had the least overall effect. The M phi SN and IL 2 had to be added at the initiation of the culture period for a maximal effect, but the IL X preparation was most effective when added 24 hr after the initiation of culture. These results indicate that at least three nonspecific helper factors contribute to the helper activity in normal Con A SN.

  7. Thermal valorization of post-consumer film waste in a bubbling bed gasifier

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martínez-Lera, S., E-mail: susanamartinezlera@gmail.com; Torrico, J.; Pallarés, J.

    2013-07-15

    Highlights: • Film waste from packaging is a common waste, a fraction of which is not recyclable. • Gasification can make use of the high energy value of the non-recyclable fraction. • This waste and two reference polymers were gasified in a bubbling bed reactor. • This experimental research proves the technical feasibility of the process. • It also analyzes the impact of composition and ER on the performance of the plant. - Abstract: The use of plastic bags and film packaging is very frequent in many sectors, and film waste is usually present in different sources of municipal and industrial wastes. A significant part of it is not suitable for mechanical recycling but could be safely transformed into a valuable gas by means of thermal valorization. In this research, the gasification of film wastes has been experimentally investigated in a fluidized bed reactor with two reference polymers, polyethylene and polypropylene, and actual post-consumer film waste. After a complete experimental characterization of the three materials, several gasification experiments were performed to analyze the influence of the fuel and of the equivalence ratio on gas production and composition, on tar generation and on efficiency. The experiments prove that film waste and analogous polymer-derived wastes can be successfully gasified in a fluidized bed reactor, yielding a gas with a higher heating value in the range of 3.6 to 5.6 MJ/m³ and cold gas efficiencies of up to 60%.
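
    The cold gas efficiency quoted above relates the chemical energy in the product gas to that in the feed; a sketch with illustrative numbers (all three inputs below are assumptions, not values measured in this study):

```python
def cold_gas_efficiency(gas_hhv_mj_per_m3, gas_yield_m3_per_kg, fuel_hhv_mj_per_kg):
    """Cold gas efficiency: chemical energy in the cold product gas divided
    by the chemical energy in the fuel fed to the gasifier."""
    return gas_hhv_mj_per_m3 * gas_yield_m3_per_kg / fuel_hhv_mj_per_kg

# e.g. a 5.0 MJ/m3 gas produced at 5.2 m3 per kg of a 43 MJ/kg plastic feed:
print(f"{cold_gas_efficiency(5.0, 5.2, 43.0):.0%}")
```

    Because both gas heating value and gas yield respond to the equivalence ratio, CGE is the natural single figure of merit for the ER sweep the abstract describes.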

  8. Valorization of rendering industry wastes and co-products for industrial chemicals, materials and energy: review.

    PubMed

    Mekonnen, Tizazu; Mussone, Paolo; Bressler, David

    2016-01-01

    Over the past decades, strong global demand for industrial chemicals, raw materials and energy has been driven by rapid industrialization and population growth across the world. In this context, long-term environmental sustainability demands the development of sustainable strategies of resource utilization. The agricultural sector is a major source of underutilized or low-value streams that accompany the production of food and other biomass commodities. Animal agriculture in particular constitutes a substantial portion of the overall agricultural sector, with wastes being generated along the supply chain of slaughtering, handling, catering and rendering. The recent emergence of bovine spongiform encephalopathy (BSE) resulted in the elimination of most of the traditional uses of rendered animal meals such as blood meal, meat and bone meal (MBM) as animal feed with significant economic losses for the entire sector. The focus of this review is on the valorization progress achieved on converting protein feedstock into bio-based plastics, flocculants, surfactants and adhesives. The utilization of other rendering streams such as fat and ash rich biomass for the production of renewable fuels, solvents, drop-in chemicals, minerals and fertilizers is also critically reviewed.

  9. Relating normalization to neuronal populations across cortical areas.

    PubMed

    Ruff, Douglas A; Alberts, Joshua J; Cohen, Marlene R

    2016-09-01

    Normalization, which divisively scales neuronal responses to multiple stimuli, is thought to underlie many sensory, motor, and cognitive processes. In every study where it has been investigated, neurons measured in the same brain area under identical conditions exhibit a range of normalization, from suppression by nonpreferred stimuli (strong normalization) to additive responses to combinations of stimuli (no normalization). Normalization has been hypothesized to arise from interactions between neuronal populations, either in the same or different brain areas, but current models of normalization are not mechanistic and focus on trial-averaged responses. To gain insight into the mechanisms underlying normalization, we examined interactions between neurons that exhibit different degrees of normalization. We recorded from multiple neurons in three cortical areas while rhesus monkeys viewed superimposed drifting gratings. We found that neurons showing strong normalization shared less trial-to-trial variability with other neurons in the same cortical area and more variability with neurons in other cortical areas than did units with weak normalization. Furthermore, the cortical organization of normalization was not random: neurons recorded on nearby electrodes tended to exhibit similar amounts of normalization. Together, our results suggest that normalization reflects a neuron's role in its local network and that modulatory factors like normalization share the topographic organization typical of sensory tuning properties. Copyright © 2016 the American Physiological Society.

  10. Catalytic valorization of starch-rich food waste into hydroxymethylfurfural (HMF): Controlling relative kinetics for high productivity.

    PubMed

    Yu, Iris K M; Tsang, Daniel C W; Yip, Alex C K; Chen, Season S; Wang, Lei; Ok, Yong Sik; Poon, Chi Sun

    2017-08-01

    This study aimed to maximize the valorization of bread waste, a typical food waste stream, into hydroxymethylfurfural (HMF) by improving our kinetic understanding. The highest HMF yield (30 mol%) was achieved using SnCl4 as catalyst, which offered strong derived Brønsted acidity and moderate Lewis acidity. We evaluated the kinetic balance between these acidities to facilitate faster desirable reactions (i.e., hydrolysis, isomerization, and dehydration) relative to undesirable reactions (i.e., rehydration and polymerization). Such catalyst selectivity of SnCl4, AlCl3, and FeCl3 was critical in maximizing HMF yield. Higher temperatures brought only marginal improvement, because they accelerated the undesirable reactions to a similar extent as the desirable pathways. The polymerization-induced, metal-impregnated, high-porosity carbon was a possible precursor of a biochar-based catalyst, further increasing the economic potential. Preliminary economic analysis indicated a net gain of USD 43-236 per kilogram of bread waste, considering the thermochemical-conversion cost and chemical-trading revenue. Copyright © 2017 Elsevier Ltd. All rights reserved.
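    The role of relative kinetics can be sketched with a generic consecutive first-order model (an illustration of the kinetic balance the abstract discusses, not the authors' scheme): the substrate forms HMF at a desirable rate k_d, while HMF is consumed by rehydration/polymerization at an undesirable rate k_u, so raising k_d relative to k_u raises the attainable yield.

```python
import math

# Consecutive first-order reactions: S --k_d--> HMF --k_u--> byproducts.
# Generic kinetics sketch; the scheme and rate constants are assumptions.

def hmf_yield(k_d, k_u, t):
    """Fraction of initial substrate present as HMF at time t (k_d != k_u)."""
    return k_d / (k_u - k_d) * (math.exp(-k_d * t) - math.exp(-k_u * t))
```

    With k_d = 1.0 and k_u = 0.1 (desirable path ten times faster), the yield peaks near 0.77; raising k_u toward k_d lowers the peak, which is why accelerating both pathways equally, as higher temperature does here, gains little.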

  11. Normal evaporation of binary alloys

    NASA Technical Reports Server (NTRS)

    Li, C. H.

    1972-01-01

    In the study of normal evaporation, it is assumed that the evaporating alloy is homogeneous, that the vapor is instantly removed, and that the alloy follows Raoult's law. The differential equation of normal evaporation relating the evaporating time to the final solute concentration is given and solved for several important special cases. Uses of the derived equations are exemplified with a Ni-Al alloy and some binary iron alloys. The accuracy of the predicted results is checked by analyses of actual experimental data on Fe-Ni and Ni-Cr alloys evaporated at 1600 °C, and also on the vacuum purification of beryllium. These analyses suggest that the normal evaporation equations presented here give satisfactory results that are accurate to within an order of magnitude of the correct values, even for some highly concentrated solutions. Limited diffusion and the resultant surface solute depletion or enrichment appear important in the extension of this normal evaporation approach.
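    The governing relation can be sketched under the standard Langmuir free-surface evaporation treatment with Raoult's law (a hedged reconstruction; the symbols and the exact form are illustrative, not necessarily the paper's notation):

```latex
% Molar Langmuir flux of component $i$ from an ideal (Raoultian) solution,
% where $x_i$ is the mole fraction, $P_i^{0}$ the pure-component vapor
% pressure, $M_i$ the molar mass, and $\alpha_i$ an evaporation coefficient:
J_i = \frac{\alpha_i \, x_i \, P_i^{0}}{\sqrt{2\pi M_i R T}}
% For a homogeneous binary melt of $n$ moles with evaporating area $A$,
% solute fraction $x_s$ and solvent $m$, the solute mole balance gives
\frac{dx_s}{dt} = -\frac{A}{n}\left[\, J_s - x_s \,(J_s + J_m) \,\right]
% Separating variables and integrating from the initial to the final
% solute fraction yields the evaporating-time relation the abstract solves.
```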

  12. Surgical animal models of neuropathic pain: Pros and Cons.

    PubMed

    Challa, Siva Reddy

    2015-03-01

    One of the biggest challenges for discovering more efficacious drugs for the control of neuropathic pain has been the diversity of chronic pain states in humans. It is now accepted that different mechanisms contribute to normal physiologic pain, pain arising from tissue damage and pain arising from injury to the nervous system. To study pain transmission, spot novel pain targets and characterize the potential analgesic profile of new chemical entities, numerous experimental animal pain models have been developed that attempt to simulate the many human pain conditions. Among the neuropathic pain models, surgical models have paramount importance in the induction of pain states. Many surgical animal models exist, such as the chronic constriction injury (CCI) to the sciatic nerve, partial sciatic nerve ligation (pSNL), spinal nerve ligation (SNL), spared nerve injury (SNI), brachial plexus avulsion (BPA), sciatic nerve transection (SNT) and sciatic nerve trisection. Most of these models induce responses similar to those found in causalgia, a syndrome of sustained burning pain often seen in the distal extremity after partial peripheral nerve injury in humans. Researchers most commonly use these surgical models in both rats and mice during drug discovery to screen new chemical entities for efficacy in the area of neuropathic pain. However, there is scant literature that provides a comparative discussion of all these surgical models. Each surgical model has its own benefits and limitations, and it is very difficult for a researcher to choose a surgical animal model to suit a particular experimental set-up. Therefore, this review gives particular attention to comparing the pros and cons of each model of surgically induced neuropathic pain.

  13. Teaching after Retirement: The Pros and the Cons

    ERIC Educational Resources Information Center

    Sommer, Robert

    2014-01-01

    Having enjoyed teaching during my active career, I continued to teach summer school following retirement. Self-observed sensory and cognitive impairments, although not mentioned by students in their evaluations, induced me to consider the pros and cons of continuing to teach. My hope is that this list of benefits and problems will be of assistance…

  14. 40 CFR 180.1213 - Coniothyrium minitans strain CON/M/91-08; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Coniothyrium minitans strain CON/M/91... PESTICIDE CHEMICAL RESIDUES IN FOOD Exemptions From Tolerances § 180.1213 Coniothyrium minitans strain CON/M... tolerance is established for residues of the microbial pesticide Coniothyrium minitans strain CON/M/91-08...

  15. 40 CFR 180.1213 - Coniothyrium minitans strain CON/M/91-08; exemption from the requirement of a tolerance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Coniothyrium minitans strain CON/M/91... PESTICIDE CHEMICAL RESIDUES IN FOOD Exemptions From Tolerances § 180.1213 Coniothyrium minitans strain CON/M... tolerance is established for residues of the microbial pesticide Coniothyrium minitans strain CON/M/91-08...

  16. Fluid involvement in normal faulting

    NASA Astrophysics Data System (ADS)

    Sibson, Richard H.

    2000-04-01

    Evidence of fluid interaction with normal faults comes from their varied role as flow barriers or conduits in hydrocarbon basins and as hosting structures for hydrothermal mineralisation, and from fault-rock assemblages in exhumed footwalls of steep active normal faults and metamorphic core complexes. These last suggest involvement of predominantly aqueous fluids over a broad depth range, with implications for fault shear resistance and the mechanics of normal fault reactivation. A general downwards progression in fault rock assemblages (high-level breccia-gouge (often clay-rich) → cataclasites → phyllonites → mylonite → mylonitic gneiss with the onset of greenschist phyllonites occurring near the base of the seismogenic crust) is inferred for normal fault zones developed in quartzo-feldspathic continental crust. Fluid inclusion studies in hydrothermal veining from some footwall assemblages suggest a transition from hydrostatic to suprahydrostatic fluid pressures over the depth range 3-5 km, with some evidence for near-lithostatic to hydrostatic pressure cycling towards the base of the seismogenic zone in the phyllonitic assemblages. Development of fault-fracture meshes through mixed-mode brittle failure in rock-masses with strong competence layering is promoted by low effective stress in the absence of thoroughgoing cohesionless faults that are favourably oriented for reactivation. Meshes may develop around normal faults in the near-surface under hydrostatic fluid pressures to depths determined by rock tensile strength, and at greater depths in overpressured portions of normal fault zones and at stress heterogeneities, especially dilational jogs. Overpressures localised within developing normal fault zones also determine the extent to which they may reutilise existing discontinuities (for example, low-angle thrust faults). Brittle failure mode plots demonstrate that reactivation of existing low-angle faults under vertical σ1 trajectories is only likely if

  17. The Normal Fetal Pancreas.

    PubMed

    Kivilevitch, Zvi; Achiron, Reuven; Perlman, Sharon; Gilboa, Yinon

    2017-10-01

    The aim of the study was to assess the sonographic feasibility of measuring the fetal pancreas and its normal development throughout pregnancy. We conducted a cross-sectional prospective study between 19 and 36 weeks' gestation. The study included singleton pregnancies with normal pregnancy follow-up. The pancreas circumference was measured. The first 90 cases were tested to assess feasibility. Two hundred ninety-seven fetuses of nondiabetic mothers were recruited during a 3-year period. The overall satisfactory visualization rate was 61.6%. The intraobserver and interobserver variability had high intraclass correlation coefficients of 0.964 and 0.967, respectively. A cubic polynomial regression best described the correlation of pancreas circumference with gestational age (r = 0.744; P < .001), with significant correlations also with abdominal circumference and estimated fetal weight (Pearson r = 0.829 and 0.812, respectively; P < .001). Modeled pancreas circumference percentiles for each week of gestation were calculated. During the study period, we detected 2 cases with overgrowth syndrome and 1 case with an annular pancreas. In this study, we assessed the feasibility of sonography for measuring the fetal pancreas and established a normal reference range for the fetal pancreas circumference throughout pregnancy. This database can be helpful when investigating fetomaternal disorders that can involve its normal development. © 2017 by the American Institute of Ultrasound in Medicine.
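    The curve-fitting step described above can be sketched with an ordinary cubic polynomial fit. The data below are synthetic placeholders, not the study's measurements; only the method, a degree-3 least-squares fit of circumference against gestational age, mirrors the abstract.

```python
import numpy as np

# Synthetic placeholder data (NOT the study's measurements).
ga_weeks = np.array([20.0, 24.0, 28.0, 32.0, 36.0])  # gestational age (weeks)
circ_mm = np.array([30.0, 45.0, 62.0, 80.0, 100.0])  # pancreas circumference (mm)

# Cubic polynomial regression, as the abstract describes.
coeffs = np.polyfit(ga_weeks, circ_mm, deg=3)  # 4 coefficients, highest power first
predict = np.poly1d(coeffs)                    # callable mean growth curve
```

    Week-by-week percentile curves would then be modeled around this fitted mean curve.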

  18. Normal pressure hydrocephalus

    MedlinePlus

    Ferri FF. Normal pressure hydrocephalus. In: Ferri FF, ed. Ferri's Clinical Advisor 2016 . Philadelphia, PA: Elsevier; 2016:chap 648. Rosenberg GA. Brain edema and disorders of cerebrospinal fluid circulation. ...

  19. Inheritance of Properties of Normal and Non-Normal Distributions after Transformation of Scores to Ranks

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.

    2011-01-01

    This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…

  20. 3j Symbols: To Normalize or Not to Normalize?

    ERIC Educational Resources Information Center

    van Veenendaal, Michel

    2011-01-01

    The systematic use of alternative normalization constants for 3j symbols can lead to a more natural expression of quantities, such as vector products and spherical tensor operators. The redefined coupling constants directly equate tensor products to the inner and outer products without any additional square roots. The approach is extended to…

  1. The SubCons webserver: A user friendly web interface for state-of-the-art subcellular localization prediction.

    PubMed

    Salvatore, M; Shu, N; Elofsson, A

    2018-01-01

    SubCons is a recently developed method that predicts the subcellular localization of a protein. It combines predictions from four predictors using a Random Forest classifier. Here, we present the user-friendly web-interface implementation of SubCons. Starting from a protein sequence, the server rapidly predicts the subcellular localizations of an individual protein. In addition, the server accepts the submission of sets of proteins, either by uploading files or programmatically using command line WSDL API scripts. This makes SubCons ideal for proteome-wide analyses, allowing the user to scan a whole proteome in a few days. From the web page, it is also possible to download precalculated predictions for several eukaryotic organisms. To evaluate the performance of SubCons, we present a benchmark of LocTree3 and SubCons using two recent mass-spectrometry-based datasets of mouse and Drosophila proteins. The server is available at http://subcons.bioinfo.se/. © 2017 The Protein Society.

  2. Somatic cell nuclear transfer: pros and cons.

    PubMed

    Sumer, Huseyin; Liu, Jun; Tat, Pollyanna; Heffernan, Corey; Jones, Karen L; Verma, Paul J

    2009-01-01

    Even though the technique of mammalian SCNT is just over a decade old, it has already resulted in numerous significant advances. Despite recent advances in the reprogramming field, SCNT remains the benchmark both for the generation of genetically unmodified autologous pluripotent stem cells for transplantation and for the production of cloned animals. In this review we discuss the pros and cons of SCNT, drawing comparisons with other reprogramming methods.

  3. Conserva a Puerto Rico con bosques maderables

    Treesearch

    Frank H. Wadsworth

    2009-01-01

    [article in Spanish] Puerto Rico consumes many forest products that are costly to import. It also has extensive forests with harvestable timber, and physical conditions are favorable for producing useful wood. Nevertheless, the timber of the existing forests goes unused today, while deforestation proceeds for other ends. Productive forests of...

  4. Normal Aging and Linguistic Decrement.

    ERIC Educational Resources Information Center

    Emery, Olga B.

    A study investigated language patterning, as an indication of synthetic mental activity, in comparison groups of normal pre-middle-aged adults (30-42 years), normal elderly adults (75-93), and elderly adults (71-91) with Alzheimer's dementia. Semiotic theory was used as the conceptual context. Linguistic measures included the Token Test, the…

  5. Quantiles for Finite Mixtures of Normal Distributions

    ERIC Educational Resources Information Center

    Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.

    2006-01-01

    Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
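    A common way to compute such quantiles, consistent with the distinction the abstract draws (a mixture of normal densities, not the distribution of a linear combination of normal variables), is root-finding on the mixture CDF, which has no closed-form inverse. A minimal sketch (function names are mine, not the paper's):

```python
import math

def norm_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def mixture_quantile(p, weights, mus, sigmas, tol=1e-10):
    """p-quantile of sum_k weights[k] * N(mus[k], sigmas[k]^2), by bisection.

    The mixture CDF is the weighted sum of component CDFs and is
    monotone in x, so bisection on a wide bracket converges.
    """
    lo = min(m - 10.0 * s for m, s in zip(mus, sigmas))
    hi = max(m + 10.0 * s for m, s in zip(mus, sigmas))
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        cdf = sum(w * norm_cdf(mid, m, s)
                  for w, m, s in zip(weights, mus, sigmas))
        if cdf < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    By symmetry, the median of an equal-weight mixture of N(-1, 1) and N(1, 1) is 0, which the bisection recovers.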

  6. Pros and cons of rotating ground motion records to fault-normal/parallel directions for response history analysis of buildings

    USGS Publications Warehouse

    Kalkan, Erol; Kwong, Neal S.

    2014-01-01

    According to the regulatory building codes in the United States (e.g., 2010 California Building Code), at least two horizontal ground motion components are required for three-dimensional (3D) response history analysis (RHA) of building structures. For sites within 5 km of an active fault, these records should be rotated to fault-normal/fault-parallel (FN/FP) directions, and two RHAs should be performed separately (when FN and then FP are aligned with the transverse direction of the structural axes). It is assumed that this approach will lead to two sets of responses that envelope the range of possible responses over all nonredundant rotation angles. This assumption is examined here, for the first time, using a 3D computer model of a six-story reinforced-concrete instrumented building subjected to an ensemble of bidirectional near-fault ground motions. Peak values of engineering demand parameters (EDPs) were computed for rotation angles ranging from 0 through 180° to quantify the difference between peak values of EDPs over all rotation angles and those due to FN/FP direction rotated motions. It is demonstrated that rotating ground motions to FN/FP directions (1) does not always lead to the maximum responses over all angles, (2) does not always envelope the range of possible responses, and (3) does not provide maximum responses for all EDPs simultaneously even if it provides a maximum response for a specific EDP.
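    The rotation at issue can be sketched as follows: the two recorded horizontal components are combined through a rotation matrix, and a peak response is scanned over all nonredundant angles (0-180°). This generic sketch uses peak acceleration of the rotated component as a stand-in demand parameter; the study's EDPs come from full response history analyses.

```python
import math

def rotate(ax, ay, theta_deg):
    """Rotate paired horizontal acceleration histories by theta degrees."""
    t = math.radians(theta_deg)
    c, s = math.cos(t), math.sin(t)
    a1 = [c * x + s * y for x, y in zip(ax, ay)]
    a2 = [-s * x + c * y for x, y in zip(ax, ay)]
    return a1, a2

def peak_over_angles(ax, ay, step=1.0):
    """Peak |a1| over rotation angles 0 <= theta < 180 degrees."""
    best = 0.0
    ang = 0.0
    while ang < 180.0:
        a1, _ = rotate(ax, ay, ang)
        best = max(best, max(abs(v) for v in a1))
        ang += step
    return best
```

    Comparing this all-angle peak against the response at the single FN/FP orientation is the essence of the check the study performs; as the abstract reports, the FN/FP pair need not produce the maximum.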

  7. DESAFÍOS ÉTICOS DE LA INVESTIGACIÓN CON ANIMALES, MANIPULACIÓN GENÉTICA

    PubMed Central

    Yunta, Eduardo Rodríguez

    2012-01-01

    Animal research raises ethical questions both about the use of animals as models of human disease and as a prerequisite for human trials, and about the introduction of genetic modifications. Some of these questions are: animal models do not exactly reproduce the human condition; toxicity testing causes serious harm to the animals; genetic modification alters their nature; and the release of genetically modified organisms carries risks. The use of animals in research for human benefit imposes on humans the moral responsibility to respect them and not make them suffer unnecessarily, since the work involves living, sentient beings. PMID:23338641

  8. Metabolic Cost, Mechanical Work, and Efficiency during Normal Walking in Obese and Normal-Weight Children

    ERIC Educational Resources Information Center

    Huang, Liang; Chen, Peijie; Zhuang, Jie; Zhang, Yanxin; Walt, Sharon

    2013-01-01

    Purpose: This study aimed to investigate the influence of childhood obesity on energetic cost during normal walking and to determine if obese children choose a walking strategy optimizing their gait pattern. Method: Sixteen obese children with no functional abnormalities were matched by age and gender with 16 normal-weight children. All…

  9. Normalizing Catastrophe: An Educational Response

    ERIC Educational Resources Information Center

    Jickling, Bob

    2013-01-01

    Processes of normalizing assumptions and values have been the subjects of theoretical framing and critique for several decades now. Critique has often been tied to issues of environmental sustainability and social justice. Now, in an era of global warming, there is a rising concern that the results of normalizing of present values could be…

  10. A Normalized Direct Approach for Estimating the Parameters of the Normal Ogive Three-Parameter Model for Ability Tests.

    ERIC Educational Resources Information Center

    Gugel, John F.

    A new method for estimating the parameters of the normal ogive three-parameter model for multiple-choice test items--the normalized direct (NDIR) procedure--is examined. The procedure is compared to a more commonly used estimation procedure, Lord's LOGIST, using computer simulations. The NDIR procedure uses the normalized (mid-percentile)…

  11. Genomic Changes in Normal Breast Tissue in Women at Normal Risk or at High Risk for Breast Cancer

    PubMed Central

    Danforth, David N.

    2016-01-01

    Sporadic breast cancer develops through the accumulation of molecular abnormalities in normal breast tissue, resulting from exposure to estrogens and other carcinogens beginning at adolescence and continuing throughout life. These molecular changes may take a variety of forms, including numerical and structural chromosomal abnormalities, epigenetic changes, and gene expression alterations. To characterize these abnormalities, a review of the literature has been conducted to define the molecular changes in each of the above major genomic categories in normal breast tissue considered to be either at normal risk or at high risk for sporadic breast cancer. This review indicates that normal risk breast tissues (such as reduction mammoplasty) contain evidence of early breast carcinogenesis including loss of heterozygosity, DNA methylation of tumor suppressor and other genes, and telomere shortening. In normal tissues at high risk for breast cancer (such as normal breast tissue adjacent to breast cancer or the contralateral breast), these changes persist, and are increased and accompanied by aneuploidy, increased genomic instability, a wide range of gene expression differences, development of large cancerized fields, and increased proliferation. These changes are consistent with early and long-standing exposure to carcinogens, especially estrogens. A model for the breast carcinogenic pathway in normal risk and high-risk breast tissues is proposed. These findings should clarify our understanding of breast carcinogenesis in normal breast tissue and promote development of improved methods for risk assessment and breast cancer prevention in women. PMID:27559297

  12. Proyecto para la medición sistemática de seeing en CASLEO

    NASA Astrophysics Data System (ADS)

    Fernández Lajus, E.; Forte, J. C.

    The quality of astronomical seeing is certainly one of the most important parameters characterizing an observatory site. We therefore want to determine whether the high seeing values observed with the 2.15 m telescope are due to effects internal to and/or around the dome, or simply reflect the intrinsic seeing of the site. The current cooling system for the 2.15 m primary mirror appears to have improved the seeing quality considerably. Nevertheless, it remains to be determined how much the seeing can be improved. The first stage of the project consisted of commissioning the telescope installed for this purpose and acquiring the first tentative seeing measurements.

  13. Normalized cDNA libraries

    DOEpatents

    Soares, Marcelo B.; Efstratiadis, Argiris

    1997-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.

  14. Normalized cDNA libraries

    DOEpatents

    Soares, M.B.; Efstratiadis, A.

    1997-06-10

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. 4 figs.

  15. Valorization of residual Empty Palm Fruit Bunch Fibers (EPFBF) by microfluidization: production of nanofibrillated cellulose and EPFBF nanopaper.

    PubMed

    Ferrer, Ana; Filpponen, Ilari; Rodríguez, Alejandro; Laine, Janne; Rojas, Orlando J

    2012-12-01

    Different cellulose pulps were produced from sulfur-free chemical treatments of Empty Palm Fruit Bunch Fibers (EPFBF), a by-product from palm oil processing. The pulps were microfluidized for deconstruction into nanofibrillated cellulose (NFC) and nanopaper was manufactured by using an overpressure device. The morphological and structural features of the obtained NFCs were characterized via atomic force and scanning electron microscopies. The physical properties, as well as the interactions with water, of sheets from three different pulps were compared with those of nanopaper obtained from the corresponding NFC. Distinctive chemical and morphological characteristics and ensuing nanopaper properties were generated by the EPFBF fibers. The NFC grades obtained compared favorably with associated materials typically produced from bleached wood fibers. Lower water absorption, higher tensile strengths (107-137 MPa) and elastic modulus (12-18 GPa) were measured, which opens the possibility for valorization of this widely available bioresource. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Confirmed viral meningitis with normal CSF findings.

    PubMed

    Dawood, Naghum; Desjobert, Edouard; Lumley, Janine; Webster, Daniel; Jacobs, Michael

    2014-07-17

    An 18-year-old woman presented with a progressively worsening headache, photophobia, feverishness and vomiting. Three weeks previously she had returned to the UK from a trip to Peru. At presentation, she had clinical signs of meningism. On admission, blood tests showed a mild lymphopenia, with a normal C reactive protein and white cell count. Chest X-ray and CT of the head were normal. Cerebrospinal fluid (CSF) microscopy was normal. CSF protein and glucose were in the normal range. MRI of the head and cerebral angiography were also normal. Subsequent molecular testing of CSF detected enterovirus RNA by reverse transcriptase PCR. The patient's clinical syndrome correlated with her virological diagnosis and no other cause of her symptoms was found. Her symptoms were self-limiting and improved with supportive management. This case illustrates an important example of viral central nervous system infection presenting clinically as meningitis but with normal CSF microscopy. © 2014 BMJ Publishing Group Ltd.

  17. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Normal water fluctuations. 230.24... Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem § 230.24 Normal water fluctuations. (a) Normal water fluctuations in a natural aquatic system consist of daily, seasonal, and annual...

  18. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Normal water fluctuations. 230.24... Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem § 230.24 Normal water fluctuations. (a) Normal water fluctuations in a natural aquatic system consist of daily, seasonal, and annual...

  19. Con Edison power failure of July 13 and 14, 1977. Final staff report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1978-06-01

    On July 13, 1977 the entire electric load of the Con Edison system was lost, plunging New York City and Westchester County into darkness. The collapse resulted from a combination of natural events, equipment malfunctions, questionable system-design features, and operating errors. An attempt is made in this report to answer the following: what were the specific causes of the failure; if equipment malfunctions and operator errors contributed, could they have been prevented; to what extent was Con Edison prepared to handle such an emergency; and did Con Edison plan prudently for reserve generation, for reserve transmission capability, for automatic equipment to protect its system, and for proper operator response to a critical situation. Following the introductory and summary section, additional sections include: the Consolidated Edison system; prevention of bulk power-supply interruptions; the sequence of failure and restoration; analysis of the July 1977 power failure; restoration sequence and equipment damage assessment; and other investigations of the blackout. (MCW)

  20. Estructura espacial de las órbitas caóticas en un modelo autoconsistente de galaxia elíptica

    NASA Astrophysics Data System (ADS)

    Muzzio, J. C.

    We have constructed self-consistent models of stellar systems using a quadrupolar approximation for the potential. This allows us to determine orbits and Lyapunov exponents for objects whose positions and velocities are drawn from the system's distribution function. The spatial distribution of the chaotic orbits exhibits considerable structure and, more importantly, the Lyapunov exponents computed over finite time intervals correlate strongly with the orbital behavior over those same intervals, making it possible to identify distinct subsystems with different spatial distributions.

  1. What's normal? Influencing women's perceptions of normal genitalia: an experiment involving exposure to modified and nonmodified images.

    PubMed

    Moran, C; Lee, C

    2014-05-01

    To examine women's perceptions of what is 'normal' and 'desirable' in female genital appearance. Experiment with random allocation across three conditions. Community setting. A total of 97 women aged 18-30 years. Women were randomly assigned to view a series of images of (1) surgically modified vulvas, (2) nonmodified vulvas, or (3) no images. They then viewed and rated ten target images of surgically modified vulvas and ten of unmodified vulvas. Women used a four-point Likert scale ('strongly agree' to 'strongly disagree') to rate each target image for 'looks normal' and 'represents society's ideal'. For each woman, we created two summary scores representing the extent to which she rated the unmodified vulvas as more 'normal' and more 'society's ideal' than the modified vulvas. For ratings of 'normality', there was a significant effect of condition (F2,94 = 2.75, P = 0.007, adjusted r2 = 0.082): women who had first viewed the modified images rated the modified target vulvas as more normal than the nonmodified vulvas, significantly different from the control group, who rated them as less normal. For ratings of 'society's ideal', there was again a significant effect of condition (F2,92 = 7.72, P < 0.001, adjusted r2 = 0.125); all three groups rated modified target vulvas as more like society's ideal than the nonmodified target vulvas, with the effect strongest among the women who had viewed the modified images. Exposure to images of modified vulvas may change women's perceptions of what is normal and desirable. This may explain why some healthy women seek labiaplasty. © 2013 Royal College of Obstetricians and Gynaecologists.

  2. Valuation of Normal Range of Ankle Systolic Blood Pressure in Subjects with Normal Arm Systolic Blood Pressure.

    PubMed

    Gong, Yi; Cao, Kai-wu; Xu, Jin-song; Li, Ju-xiang; Hong, Kui; Cheng, Xiao-shu; Su, Hai

    2015-01-01

    This study aimed to establish a normal range for ankle systolic blood pressure (SBP). A total of 948 subjects who had normal brachial SBP (90-139 mmHg) at investigation were enrolled. Supine BP of all four limbs was measured simultaneously using four automatic BP measurement devices. The ankle-arm difference (An-a) in SBP was calculated for both sides. Two methods were used to establish the normal range of ankle SBP: the 99% method, based on the 99% reference range of the measured ankle BP, and the An-a method, which added the An-a difference to the lower and upper limits of normal arm SBP (90-139 mmHg). On both sides, ankle SBP was significantly higher than arm SBP (right: 137.1 ± 16.9 vs 119.7 ± 11.4 mmHg, P<0.05). Based on the 99% method, the normal range of ankle SBP was 94~181 mmHg for the total population, 84~166 mmHg for the young (18-44 y), 107~176 mmHg for the middle-aged (45-59 y), and 113~179 mmHg for the elderly (≥ 60 y) group. As the An-a in SBP was 13 mmHg in the young group and 20 mmHg in both the middle-aged and elderly groups, the normal range of ankle SBP by the An-a method was 103-153 mmHg for young subjects and 110-160 mmHg for middle-aged and elderly subjects. A primary reference for normal ankle SBP was suggested as 100-165 mmHg in the young and 110-170 mmHg in middle-aged and elderly subjects.

  3. Strength of Gamma Rhythm Depends on Normalization

    PubMed Central

    Ray, Supratim; Ni, Amy M.; Maunsell, John H. R.

    2013-01-01

    Neuronal assemblies often exhibit stimulus-induced rhythmic activity in the gamma range (30–80 Hz), whose magnitude depends on the attentional load. This has led to the suggestion that gamma rhythms form dynamic communication channels across cortical areas processing the features of behaviorally relevant stimuli. Recently, attention has been linked to a normalization mechanism, in which the response of a neuron is suppressed (normalized) by the overall activity of a large pool of neighboring neurons. In this model, attention increases the excitatory drive received by the neuron, which in turn also increases the strength of normalization, thereby changing the balance of excitation and inhibition. Recent studies have shown that gamma power also depends on such excitatory–inhibitory interactions. Could modulation in gamma power during an attention task be a reflection of the changes in the underlying excitation–inhibition interactions? By manipulating the normalization strength independent of attentional load in macaque monkeys, we show that gamma power increases with increasing normalization, even when the attentional load is fixed. Further, manipulations of attention that increase normalization increase gamma power, even when they decrease the firing rate. Thus, gamma rhythms could be a reflection of changes in the relative strengths of excitation and normalization rather than playing a functional role in communication or control. PMID:23393427

  4. [Normal aging of frontal lobe functions].

    PubMed

    Calso, Cristina; Besnard, Jérémy; Allain, Philippe

    2016-03-01

    Normal aging in individuals is often associated with morphological, metabolic and cognitive changes, which particularly concern the cerebral frontal regions. Starting from the "frontal lobe hypothesis of cognitive aging" (West, 1996), the present review is based on the neuroanatomical model developed by Stuss (2008), which introduces four categories of frontal lobe functions: executive control, behavioural and emotional self-regulation and decision-making, energization, and meta-cognitive functions. The selected studies each address changes in at least one of these functions. The results suggest a deterioration of several frontal cognitive abilities in normal aging: flexibility, inhibition, planning, verbal fluency, implicit decision-making, and second-order and affective theory of mind. Normal aging also seems to be characterised by a general reduction in processing speed observed during neuropsychological assessment (Salthouse, 1996). Nevertheless, many cognitive functions remain preserved, such as automatic or non-conscious inhibition, specific capacities of flexibility, and first-order theory of mind. Normal aging therefore does not seem to be associated with a global cognitive decline but rather with selective changes in some frontal systems, a conclusion that should be taken into account when designing care programs for normal aging.

  5. [A photodensitometer built with a microscope].

    NASA Astrophysics Data System (ADS)

    di Giovanni, G.

    1989-02-01

    This work describes the characteristics and use of a microphotometer suitable for measuring the photographic density of star images. The optics are those of an ordinary microscope, which projects the small photographic image of a star onto a photodiode. This high-sensitivity instrument is easy to build.

  6. Normal metal - insulator - superconductor thermometers and coolers with titanium-gold bilayer as the normal metal

    NASA Astrophysics Data System (ADS)

    Räisänen, I. M. W.; Geng, Z.; Kinnunen, K. M.; Maasilta, I. J.

    2018-03-01

    We have fabricated superconductor - insulator - normal metal - insulator - superconductor (SINIS) tunnel junctions in which Al acts as the superconductor, AlOx is the insulator, and the normal metal consists of a thin Ti layer (5 nm) covered with a thicker Au layer (40 nm). We have characterized the junctions by measuring their current-voltage curves between 60 mK and 750 mK. For comparison, the same measurements have been performed for a SINIS junction pair whose normal metal is Cu. The Ti-Au bilayer decreases the SINIS tunneling resistance by an order of magnitude compared to junctions where Cu is used as normal metal, made with the same oxidation parameters. The Ti-Au devices are much more robust against chemical attacks, and their lower tunneling resistance makes them more robust against static charge. More significantly, they exhibit significantly stronger electron cooling than Cu devices with identical fabrication steps, when biased close to the energy gap of the superconducting Al. By using a self-consistent thermal model, we can fit the current-voltage characteristics well, and show an electron cooling from 200 mK to 110 mK, with a non-optimized device.

  7. Providers debate pros and cons of pneumonia vaccination at discharge.

    PubMed

    2001-02-01

    When to vaccinate against pneumonia? Does it make sense when patients are in the hospital? Or should patients wait for the first post-op visit with the PCP? Office-based and hospital-based physicians weigh the pros and cons of each.

  8. a Recursive Approach to Compute Normal Forms

    NASA Astrophysics Data System (ADS)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  9. Neither Hematocrit Normalization nor Exercise Training Restores Oxygen Consumption to Normal Levels in Hemodialysis Patients

    PubMed Central

    Stray-Gundersen, James; Parsons, Dora Beth; Thompson, Jeffrey R.

    2016-01-01

    Patients treated with hemodialysis develop severely reduced functional capacity, which can be partially ameliorated by correcting anemia and through exercise training. In this study, we used perturbations of an erythroid-stimulating agent and exercise training to examine whether and where a limitation to oxygen transport exists in patients on hemodialysis. Twenty-seven patients on hemodialysis completed a crossover study consisting of two exercise training phases at two hematocrit (Hct) values: 30% (anemic) and 42% (physiologic; normalized by treatment with erythroid-stimulating agent). To determine primary outcome measures of peak power and oxygen consumption (VO2) and secondary measures related to components of oxygen transport and utilization, all patients underwent numerous tests at five time points: baseline, untrained at Hct of 30%, after training at Hct of 30%, untrained at Hct of 42%, and after training at Hct of 42%. Hct normalization, exercise training, or the combination thereof significantly improved peak power and VO2 relative to values in the untrained anemic phase. Hct normalization increased peak arterial oxygen and arteriovenous oxygen difference, whereas exercise training improved cardiac output, citrate synthase activity, and peak tissue diffusing capacity. However, although the increase in arterial oxygen observed in the combination phase reached a value similar to that in healthy sedentary controls, the increase in peak arteriovenous oxygen difference did not. Muscle biopsy specimens showed markedly thickened endothelium and electron-dense interstitial deposits. In conclusion, exercise and Hct normalization had positive effects but failed to normalize exercise capacity in patients on hemodialysis. This effect may be caused by abnormalities identified within skeletal muscle. PMID:27153927

  10. Cue-based assertion classification for Swedish clinical text – developing a lexicon for pyConTextSwe

    PubMed Central

    Velupillai, Sumithra; Skeppstedt, Maria; Kvist, Maria; Mowery, Danielle; Chapman, Brian E.; Dalianis, Hercules; Chapman, Wendy W.

    2014-01-01

    Objective: The ability of a cue-based system to accurately assert whether a disorder is affirmed, negated, or uncertain is dependent, in part, on its cue lexicon. In this paper, we continue our study of porting an assertion system (pyConTextNLP) from English to Swedish (pyConTextSwe) by creating an optimized assertion lexicon for clinical Swedish. Methods and material: We integrated cues from four external lexicons, along with generated inflections and combinations, and used subsets of a clinical corpus in Swedish. We applied four assertion classes (definite existence, probable existence, probable negated existence and definite negated existence) and two binary classes (existence yes/no and uncertainty yes/no) to pyConTextSwe. We compared pyConTextSwe’s performance with and without the added cues on a development set, and improved the lexicon further after an error analysis. On a separate evaluation set, we calculated the system’s final performance. Results: Following the integration steps, we added 454 cues to pyConTextSwe. The optimized lexicon developed after the error analysis resulted in statistically significant improvements on the development set (83% F-score, overall). The system’s final F-score on the evaluation set was 81% (overall). For the individual assertion classes, F-score results were 88% (definite existence), 81% (probable existence), 55% (probable negated existence), and 63% (definite negated existence). For the binary classifications existence yes/no and uncertainty yes/no, final system performance was 97%/87% and 78%/86% F-score, respectively. Conclusions: We have successfully ported pyConTextNLP to Swedish (pyConTextSwe). We have created an extensive and useful assertion lexicon for Swedish clinical text, which could form a valuable resource for similar studies and which is publicly available. PMID:24556644
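    As a rough illustration of the cue-based assertion idea behind pyConTextNLP/pyConTextSwe, the sketch below maps a sentence to the four assertion classes named in the abstract. The cues are toy English examples (not the Swedish lexicon the paper builds), and the real system additionally handles cue scope and direction, which this sketch ignores:

    ```python
    import re

    # Toy cue lexicon; pyConTextSwe's actual lexicon adds 454 Swedish cues
    # plus generated inflections and combinations.
    NEGATION_CUES = {"no", "denies", "without"}
    UNCERTAINTY_CUES = {"possible", "suspected", "probable"}


    def assert_disorder(sentence: str) -> str:
        """Map a sentence mentioning a disorder to one of the four
        assertion classes used in the paper."""
        tokens = set(re.findall(r"[\wåäö]+", sentence.lower()))
        negated = bool(tokens & NEGATION_CUES)
        uncertain = bool(tokens & UNCERTAINTY_CUES)
        if negated and uncertain:
            return "probable negated existence"
        if negated:
            return "definite negated existence"
        if uncertain:
            return "probable existence"
        return "definite existence"


    print(assert_disorder("Patient denies chest pain."))  # definite negated existence
    print(assert_disorder("Possible pneumonia."))         # probable existence
    ```

    The two binary classes in the abstract (existence yes/no, uncertainty yes/no) fall out of the same two flags before they are combined.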

  11. Muscular hypertrophy and atrophy in normal rats provoked by the administration of normal and denervated muscle extracts.

    PubMed

    Agüera, Eduardo; Castilla, Salvador; Luque, Evelio; Jimena, Ignacio; Leiva-Cepas, Fernando; Ruz-Caracuel, Ignacio; Peña, José

    2016-12-01

    This study was conducted to determine the effects of extracts obtained from both normal and denervated muscles on different muscle types. Wistar rats were divided into a control group and four experimental groups. Each experimental group was treated intraperitoneally for 10 consecutive days with a different extract, obtained from normal soleus muscle, denervated soleus, normal extensor digitorum longus, or denervated extensor digitorum longus. Following treatment, the soleus and extensor digitorum longus muscles were obtained for study under light and transmission electron microscopy; morphometric parameters and myogenic responses were also analyzed. The results demonstrated that treatment with normal and denervated soleus muscle extracts provoked hypertrophy and increased myogenic activity. In contrast, treatment with extracts from the normal and denervated extensor digitorum longus (EDL) had different effects depending on the muscle analyzed: in the soleus muscle it provoked hypertrophy of type I fibers and increased myogenic activity, while in the EDL, atrophy of type II fibers was observed without changes in myogenic activity. This suggests that the muscular responses of atrophy and hypertrophy may depend on different factors related to muscle type, which could be related to innervation.

  12. An Application of Con-Resistant Trust to Improve the Reliability of Special Protection Systems within the Smart Grid

    DTIC Science & Technology

    2012-06-01

    …in an effort to be more reliable and efficient. However, with the benefits of this new technology comes added risk. This AFIT thesis (Crystal M. Shipman, AFIT/GCO/ENG/12-22) utilizes a con-resistant trust mechanism to improve the reliability of special protection systems within the smart grid.

  13. Normalization of satellite imagery

    NASA Technical Reports Server (NTRS)

    Kim, Hongsuk H.; Elman, Gregory C.

    1990-01-01

    Sets of Thematic Mapper (TM) imagery taken over the Washington, DC metropolitan area during the months of November, March and May were converted into a form of ground reflectance imagery. This conversion was accomplished by adjusting the incident sunlight and view angles and by applying a pixel-by-pixel correction for atmospheric effects. Seasonal color changes of the area can be better observed when such normalization is applied to space imagery taken in time series. In normalized imagery, the grey scale depicts variations in surface reflectance and tonal signature of multi-band color imagery can be directly interpreted for quantitative information of the target.
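    The normalization described above combines an illumination-geometry correction with an atmospheric correction. A toy per-pixel sketch of that idea is below; the simple path-radiance/transmittance/cosine model and all parameter names are generic remote-sensing conventions assumed for illustration, not the actual TM processing chain used in the study:

    ```python
    import numpy as np


    def normalize_to_reflectance(radiance, solar_zenith_deg,
                                 path_radiance=0.0, transmittance=1.0):
        """Toy per-pixel normalization: subtract atmospheric path radiance,
        divide out atmospheric transmittance, and correct for the solar
        illumination angle (cosine of the solar zenith angle)."""
        cos_theta = np.cos(np.radians(solar_zenith_deg))
        return (radiance - path_radiance) / (transmittance * cos_theta)


    # Works elementwise on a whole band as well as on a single pixel value.
    band = np.array([[50.0, 80.0], [120.0, 200.0]])
    print(normalize_to_reflectance(band, 60.0, path_radiance=10.0, transmittance=0.8))
    ```

    After such a correction, pixel values from different acquisition dates are on a common reflectance-like scale, which is what makes the seasonal comparison in the abstract meaningful.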

  14. Evaluation of CT-based SUV normalization

    NASA Astrophysics Data System (ADS)

    Devriese, Joke; Beels, Laurence; Maes, Alex; Van de Wiele, Christophe; Pottel, Hans

    2016-09-01

    The purpose of this study was to determine patients’ lean body mass (LBM) and lean tissue (LT) mass using a computed tomography (CT)-based method, and to compare standardized uptake values (SUVs) normalized by these parameters to conventionally normalized SUVs. Head-to-toe positron emission tomography (PET)/CT examinations were retrospectively retrieved and semi-automatically segmented into tissue types based on thresholding of CT Hounsfield units (HU). The following HU ranges were used for determination of CT-estimated LBM and LT (LBM_CT and LT_CT): -180 to -7 for adipose tissue (AT), -6 to 142 for LT, and 143 to 3010 for bone tissue (BT). Formula-estimated LBMs were calculated using the formulas of James (1976 Research on Obesity: a Report of the DHSS/MRC Group (London: HMSO)) and Janmahasatian et al (2005 Clin. Pharmacokinet. 44 1051-65), and body surface area (BSA) was calculated using the DuBois formula (DuBois and DuBois 1989 Nutrition 5 303-11). The CT segmentation method was validated by comparing total patient body weight (BW) to CT-estimated BW (BW_CT). LBM_CT was compared to formula-based estimates (LBM_James and LBM_Janma). SUVs in two healthy reference tissues, liver and mediastinum, were normalized for the aforementioned parameters and compared to each other in terms of variability and dependence on normalization factors and BW. Comparison of actual BW to BW_CT shows a non-significant difference of 0.8 kg. LBM_James estimates are significantly higher than LBM_Janma, with differences of 4.7 kg for female and 1.0 kg for male patients. Formula-based LBM estimates do not significantly differ from LBM_CT, neither for men nor for women. The coefficient of variation (CV) of SUV normalized for LBM_James (SUV_LBM-James) (12.3%) was significantly reduced in liver compared to SUV_BW (15.4%). All SUV variances in mediastinum were significantly reduced (CVs were 11.1-12.2%) compared to SUV_BW (15.5%), except SUV_BSA (15.2%). Only SUV_BW and SUV_LBM-James show
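    The HU-thresholding step described above is easy to sketch. The HU ranges are those quoted in the abstract; the function names, the toy voxel values, and the constant-density mass approximation are our own illustrative assumptions:

    ```python
    import numpy as np

    # HU ranges from the abstract (inclusive bounds); dict keys are ours.
    HU_RANGES = {
        "adipose": (-180, -7),
        "lean":    (-6, 142),
        "bone":    (143, 3010),
    }


    def tissue_masks(hu: np.ndarray) -> dict:
        """Boolean mask per tissue class from a CT volume in Hounsfield units."""
        return {name: (hu >= lo) & (hu <= hi) for name, (lo, hi) in HU_RANGES.items()}


    def tissue_mass_g(mask: np.ndarray, voxel_volume_ml: float,
                      density_g_per_ml: float) -> float:
        """Approximate tissue mass from a voxel count; the constant density
        per tissue class is an assumption of this sketch."""
        return mask.sum() * voxel_volume_ml * density_g_per_ml


    hu = np.array([-100, 50, 800, -500])  # toy voxels; -500 (air) stays unclassified
    masks = tissue_masks(hu)
    print(masks["lean"])  # [False  True False False]
    ```

    Summing the per-class masses over the whole scan gives the CT-estimated LBM (AT excluded) and LT used as SUV normalization factors in the study.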

  15. Gestational surrogacy: could it be a way to reproduction? Pros and cons.

    PubMed

    Clementina, Peris

    2011-06-01

    The aim of this article was to address the pros and cons of gestational surrogacy and the social and psychological issues involved in the surrogate motherhood triad. The pros and cons of surrogacy, the possible emergence of a hematologic disease in the fetus, hemolytic disease of the newborn, naturally acquired microchimerism in surrogacy cases, and the ethical, medical, psychological, legal and religious aspects of the problem are discussed.

  16. Normal Psychosexual Development

    ERIC Educational Resources Information Center

    Rutter, Michael

    1971-01-01

    Normal sexual development is reviewed with respect to physical maturation, sexual interests, sex drive, psychosexual competence and maturity, gender role, object choice, children's concepts of sexual differences, sex role preference and standards, and psychosexual stages. Biologic, psychoanalytic and psychosocial theories are briefly considered.…

  17. Young women's perspective of the pros and cons to seeking screening for chlamydia and gonorrhea: an exploratory study.

    PubMed

    Chacko, Mariam R; von Sternberg, Kirk; Velasquez, Mary M; Wiemann, Constance M; Smith, Peggy B; DiClemente, Ralph

    2008-08-01

    To identify young women's pros and cons (decisional balance) of seeking chlamydia (CT) and gonorrhea (NGC) screening, a prospective, cross-sectional study was conducted at a community-based reproductive health clinic with 192 young women (66% African American; mean age 18.9 years). Content analysis of responses obtained during a decisional balance exercise (pros and cons) promoting CT and NGC screening was conducted. Thematic categories were developed through a coding process, and each response was assigned to one thematic category. The frequency of pro and con responses for each category and the frequency of participants endorsing each category were calculated. The ten thematic categories relating to the pros and cons of seeking CT and NGC screening were: being healthy; awareness of the body; systemic factors around the clinic visit and testing procedures; benefits and aversions around treatment; partner trust issues; confidentiality; prevention of long-term adverse effects and protection of the body; concern for others; fear of results/aversion to testing; and logistical barriers. The three most often cited pros were awareness of the body, being healthy, and treatment issues; the three most often cited cons were logistical barriers (time/transportation), fear of/aversion to testing, and systemic factors. A variety of pros and cons of seeking CT and NGC screening were identified at a community-based clinic. Providers in clinical settings can use this information when encouraging patients to seek regular STI screening by elucidating and emphasizing the pros and cons that have the most influence on a young woman's decision to seek screening.

  18. Valorization of pellets from municipal WWTP sludge in lightweight clay ceramics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cusido, Joan A., E-mail: joan.antoni.cusido@upc.edu; Soriano, Cecilia

    2011-06-15

    for conventional clay ceramics, industrial production would require the implementation of some type of air-depuration system. The results showed that the ceramization of sludge pellets is a promising valorization technique worth considering from both the economic and technological perspectives.

  19. Valorization of pellets from municipal WWTP sludge in lightweight clay ceramics.

    PubMed

    Cusidó, Joan A; Soriano, Cecilia

    2011-06-01

    conventional clay ceramics, industrial production would require the implementation of some type of air-depuration system. The results showed that the ceramization of sludge pellets is a promising valorization technique worth considering from both the economic and technological perspectives. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. [Gas production of Comet Halley: outbursts and photodissociation of the OH radical]

    NASA Astrophysics Data System (ADS)

    Silva, A. M.; Mirabel, I. F.

    1990-11-01

    ABSTRACT: In this work we report the detection of 20 outbursts in the λ = 18 cm (1667 MHz) line of the OH radical in Comet Halley. The observations include all existing monitoring campaigns and extend from 120 days before perihelion to 90 days after. Sharp increases in the measured flux, up to a factor of 10, are detected, followed by slow decays associated with the photodissociation of OH. Values for the photochemical lifetimes of OH and H2O were obtained based on the model previously developed by Silva (1988). These lifetimes agree with theoretical predictions and with ultraviolet observations, and the results, which are strongly dependent on the comet's heliocentric velocity (varying by up to a factor of 6), were calculated for several velocity ranges between +28 and -28 km/s.

  1. Valorization of titanium metal wastes as tanning agent used in leather industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crudu, Marian, E-mail: mariancrudu@yahoo.com; Deselnicu, Viorica, E-mail: viorica.deselnicu@icpi.ro; Deselnicu, Dana Corina, E-mail: d_deselnicu@yahoo.com

    2014-10-15

    Highlights: • Valorization of titanium wastes which cannot be recycled in the metallurgical industry. • Transferring Ti waste into raw materials for obtaining a Ti-based tanning agent. • Characterization of new Ti-based tanning agents and leather tanned with them. • Characterization of sewage waste water and sludge resulting from leather manufacture. • Analysis of the impact of the main metal component of Ti waste. - Abstract: The development of new tanning agents and new technologies in the leather sector is required to cope with the increasingly higher environmental pressure on current tanning materials and processes, such as tanning with chromium salts. In this paper, the use of titanium wastes (cuttings) resulting from the process of obtaining highly pure titanium (ingots) for the synthesis of a new tanning agent, and the tanning of bovine hides with this new agent as an alternative to tanning with chromium salts, are investigated. For this purpose, the Ti waste and the Ti-based tanning agent were characterized for metal content by inductively coupled plasma mass spectrometry (ICP-MS) and chemical analysis. The tanned leather (wet white leather) was characterized by scanning electron microscopy with energy-dispersive X-ray analysis (SEM/EDX) for metal content; by differential scanning calorimetry (DSC), Micro Hot Table and standard shrinkage temperature measurements, showing hydrothermal stability (shrinkage temperature ranged from 75.3 to 77 °C); and by chemical analysis, showing that the leather is tanned and can be processed through the subsequent mechanical operations (splitting, shaving). In addition, an analysis of major, minor and trace substances from the Ti waste (especially vanadium content) in the new tanning agent, the wet white leather (not detected) and the residue stream was performed, and showed that the leachability of vanadium is acceptable. The results obtained show that the new tanning agent obtained from Ti waste can be used for tanning bovine hides, as an eco-friendly alternative for chrome

  2. [Use rodent bait products safely]

    EPA Pesticide Factsheets

    If used improperly, rat and mouse poison products could harm you, your children, or your pets. Whenever you use pesticides, read the product label and follow all of its directions.

  3. Project VALOR: design and methods of a longitudinal registry of post-traumatic stress disorder (PTSD) in combat-exposed veterans in the Afghanistan and Iraqi military theaters of operations.

    PubMed

    Rosen, Raymond C; Marx, Brian P; Maserejian, Nancy N; Holowka, Darren W; Gates, Margaret A; Sleeper, Lynn A; Vasterling, Jennifer J; Kang, Han K; Keane, Terence M

    2012-03-01

    Few studies have investigated the natural history of post-traumatic stress disorder (PTSD). Project VALOR (Veterans' After-discharge Longitudinal Registry) was designed as a longitudinal patient registry assessing the course of combat-related PTSD among 1600 male and female Veterans who served in Operation Enduring Freedom (OEF) in Afghanistan or Operation Iraqi Freedom (OIF). Aims of the study include investigating patterns and predictors of progression or remission of PTSD and treatment utilization. The study design was based on recommendations from the Agency for Healthcare Quality and Research for longitudinal disease registries and used a pre-specified theoretical model to select the measurement domains for data collection and interpretation of forthcoming results. The registry will include 1200 male and female Veterans with a recent diagnosis of PTSD in the Department of Veterans Affairs (VA) electronic medical record and a comparison group of 400 Veterans without a medical record-based PTSD diagnosis, to also allow for case-control analyses. Data are collected from administrative databases, electronic medical records, a self-administered questionnaire, and a semi-structured diagnostic telephone interview. Project VALOR is a unique and timely registry study that will evaluate the clinical course of PTSD, psychosocial correlates, and health outcomes in a carefully selected cohort of returning OEF/OIF Veterans. Copyright © 2011 John Wiley & Sons, Ltd.

  4. Visual attention and flexible normalization pools

    PubMed Central

    Schwartz, Odelia; Coen-Cagli, Ruben

    2013-01-01

    Attention to a spatial location or feature in a visual scene can modulate the responses of cortical neurons and affect perceptual biases in illusions. We add attention to a cortical model of spatial context based on a well-founded account of natural scene statistics. The cortical model amounts to a generalized form of divisive normalization, in which the surround is in the normalization pool of the center target only if they are considered statistically dependent. Here we propose that attention influences this computation by accentuating the neural unit activations at the attended location, and that the amount of attentional influence of the surround on the center thus depends on whether center and surround are deemed in the same normalization pool. The resulting form of model extends a recent divisive normalization model of attention (Reynolds & Heeger, 2009). We simulate cortical surround orientation experiments with attention and show that the flexible model is suitable for capturing additional data and makes nontrivial testable predictions. PMID:23345413
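    A minimal sketch of divisive normalization with an attentional gain, in the spirit of the Reynolds & Heeger (2009) model cited above, is below. The gains, pool weights, and semisaturation constant sigma are illustrative numbers, and the flexible-pooling idea from this paper would correspond to making `pool_weights` depend on inferred statistical dependence between center and surround:

    ```python
    import numpy as np


    def normalized_response(drive, attention_gain, pool_weights, sigma=1.0):
        """Divisive normalization: each unit's attention-boosted excitatory
        drive is divided by its pooled (surround) activity plus a constant."""
        excitation = np.asarray(drive) * np.asarray(attention_gain)
        suppression = np.asarray(pool_weights) @ excitation  # pooled activity
        return excitation / (sigma + suppression)


    drive = np.array([10.0, 5.0])  # stimulus drive to two units
    attn = np.array([2.0, 1.0])    # attention boosts unit 0
    pool = np.full((2, 2), 0.1)    # each unit normalizes over both units
    print(normalized_response(drive, attn, pool))
    ```

    Setting a row of `pool_weights` to zero removes the surround from that unit's normalization pool, which is the "flexible pool" manipulation the model describes.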

  5. A normalization strategy for comparing tag count data

    PubMed Central

    2012-01-01

    Background: High-throughput sequencing, such as ribonucleic acid sequencing (RNA-seq) and chromatin immunoprecipitation sequencing (ChIP-seq) analyses, enables various features of organisms to be compared through tag counts. Recent studies have demonstrated that the normalization step for RNA-seq data is critical for a more accurate subsequent analysis of differential gene expression. Development of a more robust normalization method is desirable for identifying the true difference in tag count data. Results: We describe a strategy for normalizing tag count data, focusing on RNA-seq. The key concept is to remove data assigned as potential differentially expressed genes (DEGs) before calculating the normalization factor. Several R packages for identifying DEGs are currently available, and each package uses its own normalization method and gene ranking algorithm. We compared a total of eight package combinations: four R packages (edgeR, DESeq, baySeq, and NBPSeq) with their default normalization settings and with our normalization strategy. Many synthetic datasets under various scenarios were evaluated on the basis of the area under the curve (AUC) as a measure of both sensitivity and specificity. We found that packages using our strategy in the data normalization step performed well overall. This result was also observed for a real experimental dataset. Conclusion: Our results showed that the elimination of potential DEGs is essential for more accurate normalization of RNA-seq data. The concept of this normalization strategy can be widely applied to other types of tag count data and to microarray data. PMID:22475125
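    The key concept above (exclude putative DEGs before computing the scaling factor) can be sketched as follows. The trimming rule here, dropping the genes with the most extreme log-ratios, is a deliberately simplified stand-in for the DEG-identification step that the paper delegates to packages such as edgeR or DESeq:

    ```python
    import numpy as np


    def trimmed_norm_factor(counts_a, counts_b, trim_frac=0.3):
        """Scaling factor to put sample B on sample A's scale, computed after
        dropping the trim_frac of genes with the most extreme log-ratios
        (a crude proxy for removing potential DEGs first)."""
        a = np.asarray(counts_a, dtype=float)
        b = np.asarray(counts_b, dtype=float)
        expressed = (a > 0) & (b > 0)
        a, b = a[expressed], b[expressed]
        logratio = np.log2(b / a)
        # keep the genes whose log-ratio is closest to the median (stable genes)
        order = np.argsort(np.abs(logratio - np.median(logratio)))
        stable = order[: int(len(order) * (1 - trim_frac))]
        return a[stable].sum() / b[stable].sum()


    # With no differential expression the factor is exactly 1.
    print(trimmed_norm_factor([10, 20, 30, 40, 50], [10, 20, 30, 40, 50]))  # 1.0
    ```

    The point of the trimming is visible with one strongly up-regulated gene: it is excluded from the stable set, so it no longer drags the scaling factor away from 1.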

  6. Parental Perceptions of the Outcome and Meaning of Normalization

    PubMed Central

    Knafl, Kathleen A.; Darney, Blair G.; Gallo, Agatha M.; Angst, Denise B.

    2010-01-01

    The purpose of this secondary analysis was to identify the meaning of normalization for parents of a child with a chronic genetic condition. The sample was comprised of 28 families (48 parents), selected to reflect two groups: Normalization Present (NP) and Normalization Absent (NA). Constant comparison analysis was used to identify themes characterizing parents' perceptions of the meaning of normalization. The meanings parents attributed to normalization reflected their evaluation of condition management, parenting role, and condition impact, with parents in the NP and NA groups demonstrating distinct patterns of meaning. These meaning patterns are discussed as an outcome of normalization. Providers can play a pivotal role in helping families achieve normalization by providing guidance on how to balance condition management with normal family life. PMID:20108258

  7. Single step purification of concanavalin A (Con A) and bio-sugar production from jack bean using glucosylated magnetic nano matrix.

    PubMed

    Kim, Ho Myeong; Cho, Eun Jin; Bae, Hyeun-Jong

    2016-08-01

    Jack bean (JB, Canavalia ensiformis) is the source of bio-based products, such as proteins and bio-sugars that contribute to modern molecular biology and biomedical research. In this study, the use of jack bean was evaluated as a source for concanavalin A (Con A) and bio-sugar production. A novel method for purifying Con A from JBs was successfully developed using a glucosylated magnetic nano matrix (GMNM) as a physical support, which facilitated easy separation and purification of Con A. In addition, the enzymatic conversion rate of 2% (w/v) Con A extracted residue to bio-sugar was 98.4%. Therefore, this new approach for the production of Con A and bio-sugar is potentially useful for obtaining bio-based products from jack bean. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  9. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  10. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  11. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Tax normalization. 154.305 Section 154.305 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... Changes § 154.305 Tax normalization. (a) Applicability. An interstate pipeline must compute the income tax...

  12. A normality bias in legal decision making.

    PubMed

    Prentice, Robert A; Koehler, Jonathan J

    2003-03-01

    It is important to understand how legal fact finders determine causation and assign blame. However, this process is poorly understood. Among the psychological factors that affect decision makers are an omission bias (a tendency to blame actions more than inactions [omissions] for bad results), and a normality bias (a tendency to react more strongly to bad outcomes that spring from abnormal rather than normal circumstances). The omission and normality biases often reinforce one another when inaction preserves the normal state and when action creates an abnormal state. But what happens when these biases push in opposite directions as they would when inaction promotes an abnormal state or when action promotes a normal state? Which bias exerts the stronger influence on the judgments and behaviors of legal decision makers? The authors address this issue in two controlled experiments. One experiment involves medical malpractice and the other involves stockbroker negligence. They find that jurors pay much more attention to the normality of conditions than to whether those conditions arose through acts or omissions. Defendants who followed a nontraditional medical treatment regime or who chose a nontraditional stock portfolio received more blame and more punishment for bad outcomes than did defendants who obtained equally poor results after recommending a traditional medical regime or a traditional stock portfolio. Whether these recommendations entailed an action or an omission was essentially irrelevant. The Article concludes with a discussion of the implications of a robust normality bias for American jurisprudence.

  13. The role of the 2H4 molecule in the generation of suppressor function in Con A-activated T cells.

    PubMed

    Morimoto, C; Letvin, N L; Rudd, C E; Hagan, M; Takeuchi, T; Schlossman, S F

    1986-11-15

    The molecular basis for the suppression generated in a concanavalin A (Con A)-activated T cell culture remains unknown. In this study, we attempted to determine whether the 2H4 and 4B4 molecules on Con A-activated T cells play a role in the generation of suppression by such cells. We have shown that Con A-activated suppressor cells belong to the 2H4+ subset of T cells but not the 4B4+ (2H4-) subset. Con A-activated T cells exerted their optimal suppressor function on day 2 in culture, a time at which the expression of 2H4 on such cells was maximal and that of 4B4 was minimal. Furthermore, stimulation of T cells with higher concentrations of Con A generated stronger suppressor function. At the same time, both 2H4 expression and density were increased and 4B4 expression and density were decreased on such Con A-activated T cells. More importantly, treatment of Con A-activated T cells with anti-2H4 antibody, but not with anti-4B4, anti-TQ1, or anti-T4 antibodies, blocked the suppressor function of such cells. Taken together, these results strongly suggest that the 2H4 molecule itself may be involved in the generation of suppressor function in Con A-activated T cells. The 2H4 antigen on such cells was shown to be composed of 220,000 and 200,000 m.w. glycoproteins. Thus this study indicates that the 220,000 and 200,000 m.w. structure of the 2H4 molecule may itself play a crucial role in the generation of suppressor signals of Con A-activated cells.

  14. Normal peer models and autistic children's learning.

    PubMed Central

    Egel, A L; Richman, G S; Koegel, R L

    1981-01-01

    Present research and legislation regarding mainstreaming autistic children into normal classrooms have raised the importance of studying whether autistic children can benefit from observing normal peer models. The present investigation systematically assessed whether autistic children's learning of discrimination tasks could be improved if they observed normal children perform the tasks correctly. In the context of a multiple baseline design, four autistic children worked on five discrimination tasks that their teachers reported were posing difficulty. Throughout the baseline condition the children evidenced very low levels of correct responding on all five tasks. In the subsequent treatment condition, when normal peers modeled correct responses, the autistic children's correct responding increased dramatically. In each case, the peer modeling procedure produced rapid acquisition, which was maintained after the peer models were removed. These results are discussed in relation to issues concerning observational learning and in relation to the implications for mainstreaming autistic children into normal classrooms. PMID:7216930

  15. CNN-based ranking for biomedical entity normalization.

    PubMed

    Li, Haodi; Chen, Qingcai; Tang, Buzhou; Wang, Xiaolong; Xu, Hua; Wang, Baohua; Huang, Dong

    2017-10-03

    Most state-of-the-art biomedical entity normalization systems, such as rule-based systems, merely rely on morphological information of entity mentions and rarely consider their semantic information. In this paper, we introduce a novel convolutional neural network (CNN) architecture that regards biomedical entity normalization as a ranking problem and benefits from the semantic information of biomedical entities. The CNN-based ranking method first generates candidates using handcrafted rules, and then ranks the candidates according to their semantic information modeled by the CNN as well as their morphological information. Experiments on two benchmark datasets for biomedical entity normalization show that our proposed CNN-based ranking method outperforms the traditional rule-based method, achieving state-of-the-art performance. We propose a CNN architecture that regards biomedical entity normalization as a ranking problem. Comparison results show that semantic information is beneficial to biomedical entity normalization and can be well combined with morphological information in our CNN architecture for further improvement.
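    The generate-then-rank pipeline this abstract describes can be illustrated without the CNN itself. In the sketch below, plain string similarity stands in for the learned semantic score, and the dictionary, rules, and function names are invented for illustration — they are not from the paper:

```python
from difflib import SequenceMatcher

# Toy dictionary of normalized concept names (hypothetical entries).
DICTIONARY = ["myocardial infarction", "diabetes mellitus", "hypertension"]

def generate_candidates(mention, dictionary):
    """Rule-based candidate generation: keep dictionary entries sharing
    at least one token with the mention (a stand-in for the paper's
    handcrafted rules)."""
    tokens = set(mention.lower().split())
    return [c for c in dictionary if tokens & set(c.split())]

def rank(mention, candidates):
    """Rank candidates by a similarity score. The paper ranks with a CNN
    over semantic representations; string similarity stands in here."""
    m = mention.lower()
    return sorted(candidates,
                  key=lambda c: SequenceMatcher(None, m, c).ratio(),
                  reverse=True)

best = rank("heart infarction",
            generate_candidates("heart infarction", DICTIONARY))
```

The point of the two-stage design is that cheap rules prune the dictionary before the expensive scorer runs.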

  16. Normal stresses in shear thickening granular suspensions.

    PubMed

    Pan, Zhongcheng; de Cagny, Henri; Habibi, Mehdi; Bonn, Daniel

    2017-05-24

    When subjected to shear, granular suspensions exhibit normal stresses perpendicular to the shear plane, but the magnitude and sign of the different components of the normal stresses are still under debate. By performing both oscillatory and rotational rheology measurements on shear thickening granular suspensions and systematically varying the particle diameters and the gap sizes between two parallel plates, we show that a transition from a positive to a negative normal stress can be observed. We find that frictional interactions, which determine the shear thickening behavior of suspensions, contribute to the positive normal stresses. Increasing the particle diameters or decreasing the gap sizes leads to a growing importance of hydrodynamic interactions, which results in negative normal stresses. We determine a relaxation time for the system, set by both the pore and the gap sizes, that governs the fluid flow through the inter-particle space. Finally, using a two-fluid model we determine the relative contributions from the particle phase and the liquid phase.

  17. Statistical normalization techniques for magnetic resonance imaging.

    PubMed

    Shinohara, Russell T; Sweeney, Elizabeth M; Goldsmith, Jeff; Shiee, Navid; Mateen, Farrah J; Calabresi, Peter A; Jarso, Samson; Pham, Dzung L; Reich, Daniel S; Crainiceanu, Ciprian M

    2014-01-01

    While computed tomography and other imaging techniques are measured in absolute units with physical meaning, magnetic resonance images are expressed in arbitrary units that are difficult to interpret and differ between study visits and subjects. Much work in the image processing literature on intensity normalization has focused on histogram matching and other histogram mapping techniques, with little emphasis on normalizing images to have biologically interpretable units. Furthermore, there are no formalized principles or goals for the crucial comparability of image intensities within and across subjects. To address this, we propose a set of criteria necessary for the normalization of images. We further propose simple and robust biologically motivated normalization techniques for multisequence brain imaging that have the same interpretation across acquisitions and satisfy the proposed criteria. We compare the performance of different normalization methods in thousands of images of patients with Alzheimer's disease, hundreds of patients with multiple sclerosis, and hundreds of healthy subjects obtained in several different studies at dozens of imaging centers.
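    As a concrete instance of the family of techniques this abstract discusses, here is the simplest one, z-score intensity normalization within a brain mask. This is a generic illustration under my own naming, not the specific biologically motivated methods the paper proposes:

```python
import statistics

def zscore_normalize(intensities, brain_mask):
    """Express each voxel intensity in standard-deviation units relative
    to the mean intensity inside the brain mask, giving the arbitrary
    MR units a comparable interpretation across scans."""
    brain = [v for v, m in zip(intensities, brain_mask) if m]
    mu = statistics.mean(brain)
    sigma = statistics.pstdev(brain)
    return [(v - mu) / sigma for v in intensities]
```

After this step, an intensity of 1.0 means "one standard deviation brighter than the average brain voxel" in every image, which is the kind of cross-subject comparability the criteria aim at.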

  18. Is My Penis Normal? (For Teens)

    MedlinePlus

    ... worried about whether his penis is a normal size. There's a fairly wide range of normal penis sizes — just as there is for every other body part. And just like other parts of the body, how a penis appears at different stages of a guy's life varies quite a ...

  19. A Skew-Normal Mixture Regression Model

    ERIC Educational Resources Information Center

    Liu, Min; Lin, Tsung-I

    2014-01-01

    A challenge associated with traditional mixture regression models (MRMs), which rest on the assumption of normally distributed errors, is determining the number of unobserved groups. Specifically, even slight deviations from normality can lead to the detection of spurious classes. The current work aims to (a) examine how sensitive the commonly…

  20. Correcting the Normalized Gain for Guessing

    ERIC Educational Resources Information Center

    Stewart, John; Stewart, Gay

    2010-01-01

    The normalized gain, "g", has been an important tool for the characterization of conceptual improvement in physics courses since its use in Hake's extensive study on conceptual learning in introductory physics. The normalized gain is calculated from the score on a pre-test administered before instruction and a post-test administered…
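    The calculation this abstract refers to is a one-line formula: Hake's normalized gain is the realized fraction of the maximum possible improvement. A minimal sketch (the function name is mine, not from the paper):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain g = (post - pre) / (100 - pre),
    with pre- and post-test scores given as percentages."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

g = normalized_gain(40.0, 70.0)  # gains 30 of the 60 available points: g = 0.5
```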

  1. Waking Up from Four Decades' Long Dream of Valorizing Agro-Food Byproducts: Toward Practical Applications of the Gained Knowledge.

    PubMed

    Domínguez-Perles, Raúl; Moreno, Diego A; García-Viguera, Cristina

    2018-03-28

    The late 1970s saw the first research outputs on alternatives for creating added value from agro-food byproducts, focused on reducing the dependency on raw materials and, simultaneously, helping to reduce the environmental impacts of agricultural activities. This trend has grown over the years and, during the past decade, has been boosted by increasing concern about the socio-economic impact of wastes from agro-food activities, although the proposed applications have met with little success. Over four decades, an array of studies has emerged aimed at gathering evidence on the relevance of innovation in the agro-food industry and at overcoming this situation. To our knowledge, only a few cases, summarized in the present perspective, represent the main alternatives currently available for the valorization of agro-food byproducts; we also indicate some constraints that need to be addressed in the coming years to obtain a real profit from these products.

  2. Toward informed tobacco consumption in Mexico: the effect of pictorial health warnings on smokers

    PubMed Central

    Thrasher, James F; Pérez-Hernández, Rosaura; Arillo-Santillán, Edna; Barrientos-Gutiérrez, Inti

    2015-01-01

    Objective: To evaluate the effect of pictorial health warnings (HWs) on cigarette packs among adult smokers. Materials and methods: A population-representative cohort of smokers from seven Mexican cities, surveyed before (2010) and after (2011) the implementation of pictorial HWs. Bivariate and adjusted generalized estimating equation models were estimated to determine changes in variables measuring the cognitive and behavioral impact of the warnings. In the second wave (2011), models were estimated to determine the factors associated with reported recall of each warning that had entered the market, as well as the factors associated with the self-reported impact of each warning then in effect. Results: Substantial increases from 2010 to 2011 were observed in knowledge of the risks of smoking, the toxic components of tobacco, and the telephone quitline number for advice on quitting. Recall and impact of the first pictorial warnings appear to be broad and equitable across the smoking population. Compared with 2010, a greater share of ex-smokers interviewed in 2011 reported that the warnings had strongly influenced their decision to quit (OR=2.44, 95% CI 1.27–4.72). Conclusions: Pictorial HWs have achieved an important impact on knowledge and behavior, information that is relevant to the population and to decision makers. PMID:22689162

  3. Normalized Temperature Contrast Processing in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development in normalized contrast processing used in flash infrared thermography method. Method of computing normalized image or pixel intensity contrast, and normalized temperature contrast are provided. Methods of converting image contrast to temperature contrast and vice versa are provided. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves use of flash thermography data acquisition set-up with high reflectivity foil and high emissivity tape such that the foil, tape and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as emissivity of object, afterglow heat flux, reflection temperature change and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat-flux and reflection temperature evolution in flash thermography simulation are also discussed.
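    Normalized image contrast, one of the quantities this abstract discusses, is commonly defined relative to a sound (defect-free) reference region of the same frame. The exact definitions used in the paper may differ, so the following is only a generic sketch with invented names:

```python
def normalized_contrast(defect_intensity, reference_intensity):
    """One common definition of normalized contrast in flash
    thermography: the defect-area signal expressed relative to a
    sound reference area, so that 0 means "indistinguishable from
    sound material"."""
    return (defect_intensity - reference_intensity) / reference_intensity
```

Normalizing by the reference signal is what makes contrast curves from different shots, cameras, or flash energies directly comparable.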

  4. Evaluation of normalization methods in mammalian microRNA-Seq data

    PubMed Central

    Garmire, Lana Xia; Subramaniam, Shankar

    2012-01-01

    Simple total tag count normalization is inadequate for microRNA sequencing data generated from the next generation sequencing technology. However, so far systematic evaluation of normalization methods on microRNA sequencing data is lacking. We comprehensively evaluate seven commonly used normalization methods including global normalization, Lowess normalization, Trimmed Mean Method (TMM), quantile normalization, scaling normalization, variance stabilization, and invariant method. We assess these methods on two individual experimental data sets with the empirical statistical metrics of mean square error (MSE) and Kolmogorov-Smirnov (K-S) statistic. Additionally, we evaluate the methods with results from quantitative PCR validation. Our results consistently show that Lowess normalization and quantile normalization perform the best, whereas TMM, a method applied to the RNA-Sequencing normalization, performs the worst. The poor performance of TMM normalization is further evidenced by abnormal results from the test of differential expression (DE) of microRNA-Seq data. Comparing with the models used for DE, the choice of normalization method is the primary factor that affects the results of DE. In summary, Lowess normalization and quantile normalization are recommended for normalizing microRNA-Seq data, whereas the TMM method should be used with caution. PMID:22532701
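    One of the two methods this evaluation recommends, quantile normalization, can be sketched in a few lines. This is the generic textbook version (ties ignored for simplicity), not code from the evaluated packages:

```python
def quantile_normalize(matrix):
    """Quantile normalization of a genes x samples matrix (list of
    rows): every sample (column) is mapped onto the same reference
    distribution, the across-sample mean of sorted values."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    cols = [[matrix[i][j] for i in range(n_rows)] for j in range(n_cols)]
    sorted_cols = [sorted(c) for c in cols]
    # Reference distribution: mean of the r-th smallest value per sample.
    ref = [sum(sc[r] for sc in sorted_cols) / n_cols for r in range(n_rows)]
    out = [[0.0] * n_cols for _ in range(n_rows)]
    for j, col in enumerate(cols):
        ranks = sorted(range(n_rows), key=lambda i: col[i])
        for r, i in enumerate(ranks):
            out[i][j] = ref[r]  # replace each value by its rank's reference
    return out
```

After normalization every column has an identical empirical distribution, which removes sample-level distributional differences while preserving within-sample ranks.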

  5. Self-efficacy, pros, and cons as variables associated with adjacent stages of change for regular exercise in Japanese college students.

    PubMed

    Horiuchi, Satoshi; Tsuda, Akira; Kobayashi, Hisanori; Fallon, Elizabeth A; Sakano, Yuji

    2017-07-01

    This study examined self-efficacy (confidence to exercise), pros (exercise's advantages), and cons (exercise's disadvantages) as variables associated across the transtheoretical model's six stages of change in 403 Japanese college students. A series of logistic regression analyses were conducted. Results showed that higher pros and lower cons were associated with being in contemplation compared to precontemplation. Lower cons were associated with being in preparation compared to contemplation. Higher self-efficacy was associated with being in action compared to preparation as well as being in maintenance compared to action. Lower cons were associated with being in termination compared to maintenance.

  6. Changes to perceptions of the pros and cons of genetic susceptibility testing after APOE genotyping for Alzheimer disease risk

    PubMed Central

    Christensen, Kurt D.; Roberts, J. Scott; Uhlmann, Wendy R.; Green, Robert C.

    2011-01-01

    Purpose: Perceptions about the pros and cons of genetic susceptibility testing are among the best predictors of test utilization. How actual testing changes such perceptions has yet to be examined. Methods: In a clinical trial, first-degree relatives of patients with Alzheimer disease received genetic risk assessments for Alzheimer disease including APOE disclosure. Participants rated 11 possible benefits associated with genetic testing (pros) and 10 risks or limitations (cons) before genetic risk disclosure and again 12 months afterward. Results: Pros were rated higher than cons at baseline (3.53 vs. 1.83, P < 0.001) and at 12 months after risk disclosure (3.33 vs. 1.88, P < 0.001). Ratings of pros decreased during the 12-month period (3.33 vs. 3.53, P < 0.001). Ratings of cons did not change (1.88 vs. 1.83, P = 0.199) except for a three-item discrimination subscale which increased (2.07 vs. 1.92, P = 0.012). Among specific pros and cons, three items related to prevention and treatment changed the most. Conclusion: The process of APOE genetic risk assessment for Alzheimer disease sensitizes some to its limitations and the risks of discrimination; however, 1 year after disclosure, test recipients still consider the pros to strongly outweigh the cons. PMID:21270636

  7. Economic values under inappropriate normal distribution assumptions.

    PubMed

    Sadeghi-Sefidmazgi, A; Nejati-Javaremi, A; Moradi-Shahrbabak, M; Miraei-Ashtiani, S R; Amer, P R

    2012-08-01

    The objectives of this study were to quantify the errors in economic values (EVs) for traits affected by cost or price thresholds when skewed or kurtotic distributions of varying degree are assumed to be normal, and when data with a normal distribution are subject to censoring. EVs were estimated for a continuous trait with dichotomous economic implications because of a price premium or penalty arising from a threshold ranging between -4 and 4 standard deviations from the mean. To evaluate the impacts of skewness and of positive and negative excess kurtosis, the standard skew-normal, Pearson, and raised-cosine distributions were used, respectively. For the various evaluable levels of skewness and kurtosis, the results showed that EVs can be underestimated or overestimated by more than 100% when price-determining thresholds fall within a range from the mean that might be expected in practice. Estimates of EVs were very sensitive to censoring or missing data. In contrast to practical genetic evaluation, economic evaluation is very sensitive to lack of normality and missing data. Although in some special situations the presence of multiple thresholds may attenuate the combined effect of errors at each threshold point, in practical situations there is a tendency for a few key thresholds to dominate the EV, and there are many situations where errors could be compounded across multiple thresholds. In the development of breeding objectives for non-normal continuous traits influenced by value thresholds, it is necessary to select a transformation that will resolve problems of non-normality, or to consider alternative methods that are less sensitive to non-normality.

  8. Perceived pros and cons of smoking and quitting in hard-core smokers: a focus group study

    PubMed Central

    2014-01-01

    Background: In the last decade, so-called hard-core smokers have received increasing interest in the research literature. For smokers in general, the study of perceived costs and benefits (or ‘pros and cons’) of smoking and quitting is of particular importance in predicting motivation to quit and actual quitting attempts. Therefore, this study aims to gain insight into the perceived pros and cons of smoking and quitting in hard-core smokers. Methods: We conducted 11 focus group interviews among current hard-core smokers (n = 32) and former hard-core smokers (n = 31) in the Netherlands. Subsequently, each participant listed his or her main pros and cons in a questionnaire. We used a structured procedure to analyse the data obtained from the group interviews and from the questionnaires. Results: Using the qualitative data of both the questionnaires and the transcripts, the perceived pros and cons of smoking and smoking cessation were grouped into six main categories: Finance, Health, Intrapersonal Processes, Social Environment, Physical Environment, and Food and Weight. Conclusions: Although the perceived pros and cons of smoking in hard-core smokers largely mirror the perceived pros and cons of quitting, there are some major differences with respect to weight, social integration, health of children and stress reduction that should be taken into account in clinical settings and when developing interventions. Based on these findings we propose the ‘Distorted Mirror Hypothesis’. PMID:24548463

  9. Univariate normalization of bispectrum using Hölder's inequality.

    PubMed

    Shahbazi, Forooz; Ewald, Arne; Nolte, Guido

    2014-08-15

    Considering that many biological systems, including the brain, are complex non-linear systems, suitable methods capable of detecting these non-linearities are required to study their dynamical properties. One of these tools is the third-order cumulant or cross-bispectrum, which is a measure of interfrequency interactions between three signals. For convenient interpretation, interaction measures are most commonly normalized to be independent of the constant scales of the signals, such that their absolute values are bounded by one, with this limit reflecting perfect coupling. Although many different normalization factors for cross-bispectra have been suggested in the literature, these either do not lead to bounded measures or are themselves dependent on the coupling and not only on the scale of the signals. In this paper we suggest a normalization factor which is univariate, i.e., dependent only on the amplitude of each signal and not on the interactions between signals. Using a generalization of Hölder's inequality, it is proven that the absolute value of this univariate bicoherence is bounded between zero and one. We compared three widely used normalizations to the univariate normalization concerning the significance of bicoherence values gained from resampling tests. Bicoherence values are calculated from real EEG data recorded in an eyes-closed experiment from 10 subjects. The results show slightly more significant values for the univariate normalization, but in general the differences are very small or even vanish in some subjects. Therefore, we conclude that the normalization factor does not play an important role in the bicoherence values with regard to statistical power, although a univariate normalization is the only normalization factor which fulfills all the required conditions of a proper normalization. Copyright © 2014 Elsevier B.V. All rights reserved.
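    Written out, a univariate normalization of this kind can be sketched as follows; the exponents are chosen so that the three-factor Hölder inequality applies. This is a reconstruction from the abstract, not necessarily the paper's exact notation:

```latex
b(f_1,f_2) \;=\;
\frac{\bigl|\langle X(f_1)\,X(f_2)\,X^{*}(f_1{+}f_2)\rangle\bigr|}
     {\bigl(\langle |X(f_1)|^{3}\rangle\,
            \langle |X(f_2)|^{3}\rangle\,
            \langle |X(f_1{+}f_2)|^{3}\rangle\bigr)^{1/3}}
```

Each factor in the denominator involves only one signal, so the normalization is univariate; Hölder's inequality with exponents p = q = r = 3 bounds the numerator by the denominator, giving 0 ≤ b ≤ 1.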

  10. Score Normalization for Keyword Search

    DTIC Science & Technology

    2016-06-23

    Anahtar Sözcük Arama için Skor Düzgeleme (Score Normalization for Keyword Search). Leda Sarı, Murat Saraçlar, Elektrik ve Elektronik Mühendisliği Bölümü… Abstract—In this work, keyword search (KWS) is based on a symbolic index that uses a posteriorgram representation of the speech data. For each query, sum-to-one normalization or keyword-specific thresholding is applied to the search results. The effect of these methods on the proposed…
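    Sum-to-one normalization, mentioned in this abstract, rescales the detection scores of each query so that they sum to one, making score magnitudes comparable across queries. A minimal sketch (names invented for illustration):

```python
def sum_to_one(detections):
    """Sum-to-one score normalization for one keyword-search query:
    divide each detection's score by the total score mass of that
    query, so the scores form a distribution over its hits."""
    total = sum(score for _, score in detections)
    return [(hit, score / total) for hit, score in detections]

hits = sum_to_one([("utt1", 2.0), ("utt2", 1.0), ("utt3", 1.0)])
```

Per-query rescaling matters because a single global threshold would otherwise favor queries whose raw scores happen to run high.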

  11. Normal forms of Hopf-zero singularity

    NASA Astrophysics Data System (ADS)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative-nonconservative decomposition for the normal form systems. There exists a Lie-subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov-Takens singularity. This gives rise to a conclusion that the local dynamics of formal Hopf-zero singularities is well-understood by the study of Bogdanov-Takens singularities. Despite this, the normal form computations of Bogdanov-Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative-nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied on the Rössler and Kuramoto-Sivashinsky equations to demonstrate the applicability of our results.

  12. Normal stresses in semiflexible polymer hydrogels

    NASA Astrophysics Data System (ADS)

    Vahabi, M.; Vos, Bart E.; de Cagny, Henri C. G.; Bonn, Daniel; Koenderink, Gijsje H.; MacKintosh, F. C.

    2018-03-01

    Biopolymer gels such as fibrin and collagen networks are known to develop tensile axial stress when subject to torsion. This negative normal stress is opposite to the classical Poynting effect observed for most elastic solids including synthetic polymer gels, where torsion provokes a positive normal stress. As shown recently, this anomalous behavior in fibrin gels depends on the open, porous network structure of biopolymer gels, which facilitates interstitial fluid flow during shear and can be described by a phenomenological two-fluid model with viscous coupling between network and solvent. Here we extend this model and develop a microscopic model for the individual diagonal components of the stress tensor that determine the axial response of semiflexible polymer hydrogels. This microscopic model predicts that the magnitude of these stress components depends inversely on the characteristic strain for the onset of nonlinear shear stress, which we confirm experimentally by shear rheometry on fibrin gels. Moreover, our model predicts a transient behavior of the normal stress, which is in excellent agreement with the full time-dependent normal stress we measure.

  13. Young Women’s Perspective of the Pros and Cons to Seeking Screening for Chlamydia and Gonorrhea: An Exploratory Study

    PubMed Central

    Chacko, Mariam R.; von Sternberg, Kirk; Velasquez, Mary M.; Wiemann, Constance M.; Smith, Peggy B.; DiClemente, Ralph

    2008-01-01

    Study Objective: To identify young women’s pros and cons (decisional balance) to seeking chlamydia (CT) and gonorrhea (NGC) screening. Design: Prospective, cross-sectional study. Setting: Community-based reproductive health clinic. Participants: 192 young women (66% African American; mean age 18.9 years). Main Outcome Measure(s): Content analysis of responses obtained during a decisional balance exercise (pros and cons) promoting CT and NGC screening was conducted. Thematic categories were developed through a coding process, and each response was assigned to one thematic category. The frequency of pros and cons responses for each category and the frequency of participants endorsing each category were calculated. Results: Ten thematic categories in relation to the pros and cons of seeking CT and NGC screening were: being healthy; awareness of knowing the body; systemic factors around the clinic visit and testing procedures; benefits and aversions around treatment; partner relationship issues; confidentiality; prevention of long-term adverse effects, protection of the body; concern for others; fear of results/aversion to testing; and logistical barriers. The three most often cited pros were awareness, being healthy, and treatment issues; the three most often cited cons were logistical barriers (time/transportation), fear of/aversion to testing, and systemic issues. Conclusions: A variety of pros and cons to seeking CT and NGC screening were identified at a community-based clinic. Providers in clinical settings can use this information when encouraging patients to seek regular STI screening by elucidating and emphasizing those pros and cons that have the most influence on a young woman’s decision-making to seek screening. PMID:18656072

  14. Forced Normalization: Antagonism Between Epilepsy and Psychosis.

    PubMed

    Kawakami, Yasuhiko; Itoh, Yasuhiko

    2017-05-01

    The antagonism between epilepsy and psychosis has been discussed for a long time. Landolt coined the term "forced normalization" in the 1950s to describe psychotic episodes associated with the remission of seizures and disappearance of epileptiform activity on electroencephalograms in individuals with epilepsy. Since then, neurologists and psychiatrists have been intrigued by this phenomenon. However, although collaborative clinical studies and basic experimental researches have been performed, the mechanism of forced normalization remains unknown. In this review article, we present a historical overview of the concept of forced normalization, and discuss potential pathogenic mechanisms and clinical diagnosis. We also discuss the role of dopamine, which appears to be a key factor in the mechanism of forced normalization. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Capnography as a tool to detect metabolic changes in patients cared for in the emergency setting.

    PubMed

    Cereceda-Sánchez, Francisco José; Molina-Mula, Jesús

    2017-05-15

    …detection of metabolic alterations in spontaneously breathing patients in the emergency and critical care setting. An in-depth structured literature search was performed in the EBSCOhost, Biblioteca Virtual de la Salud, PubMed, and Cochrane Library databases, among others, identifying studies that evaluated the relationship between capnography values and variables involved in blood acid-base balance. Nineteen studies were retrieved; two were reviews and 17 were observational. Nine studies correlated capnographic values with carbon dioxide (CO2), eight with bicarbonate (HCO3), three with lactate, and four with blood pH. Most studies found an adequate correlation between capnographic values and blood biomarkers, suggesting that this parameter is useful for the rapid, economical, and accurate detection of patients at risk of a severe metabolic alteration.

  16. Energetic valorization of wood waste: estimation of the reduction in CO2 emissions.

    PubMed

    Vanneste, J; Van Gerven, T; Vander Putten, E; Van der Bruggen, B; Helsen, L

    2011-09-01

    This paper investigates the potential CO2 emission reductions related to a partial switch from fossil fuel-based heat and electricity generation to renewable wood waste-based systems in Flanders. The results show that valorization in large-scale CHP (combined heat and power) systems and co-firing in coal plants have the largest CO2 reduction per TJ of wood waste. However, at current co-firing rates of 10%, the CO2 reduction per GWh of electricity that can be achieved by co-firing in coal plants is five times lower than the CO2 reduction per GWh of large-scale CHP. Moreover, analysis of the effect of government support for co-firing of wood waste in coal-fired power plants on the marginal costs of electricity generation plants reveals that the effect of the European Emission Trading Scheme (EU ETS) is effectively counterbalanced. This is due to the fact that biomass integrated gasification combined cycles (BIGCC) are not yet commercially available. An increase of the fraction of coal-based electricity in total electricity generation from 8 to 10%, at the expense of the fraction of gas-based electricity, due to the government support for co-firing wood waste, would entirely compensate for the CO2 reduction achieved by substituting wood waste for coal. This clearly illustrates the possibility of a 'rebound' effect on the CO2 reduction due to government support for co-combustion of wood waste in an electricity generation system with a large installed capacity of coal- and gas-based power plants, such as the Belgian one. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Cultured normal mammalian tissue and process

    NASA Technical Reports Server (NTRS)

    Goodwin, Thomas J. (Inventor); Prewett, Tacey L. (Inventor); Wolf, David A. (Inventor); Spaulding, Glenn F. (Inventor)

    1993-01-01

Normal mammalian tissue and a culturing process have been developed for the three groups of organ, structural, and blood tissue. The cells are grown in vitro under microgravity culture conditions and form three-dimensional cell aggregates with normal cell function. The microgravity culture conditions may be actual microgravity or simulated microgravity created in a horizontal rotating-wall culture vessel.

  18. Quaternion normalization in spacecraft attitude determination

    NASA Technical Reports Server (NTRS)

    Deutschmann, J.; Markley, F. L.; Bar-Itzhack, Itzhack Y.

    1993-01-01

    Attitude determination of spacecraft usually utilizes vector measurements such as Sun, center of Earth, star, and magnetic field direction to update the quaternion which determines the spacecraft orientation with respect to some reference coordinates in the three dimensional space. These measurements are usually processed by an extended Kalman filter (EKF) which yields an estimate of the attitude quaternion. Two EKF versions for quaternion estimation were presented in the literature; namely, the multiplicative EKF (MEKF) and the additive EKF (AEKF). In the multiplicative EKF, it is assumed that the error between the correct quaternion and its a-priori estimate is, by itself, a quaternion that represents the rotation necessary to bring the attitude which corresponds to the a-priori estimate of the quaternion into coincidence with the correct attitude. The EKF basically estimates this quotient quaternion and then the updated quaternion estimate is obtained by the product of the a-priori quaternion estimate and the estimate of the difference quaternion. In the additive EKF, it is assumed that the error between the a-priori quaternion estimate and the correct one is an algebraic difference between two four-tuple elements and thus the EKF is set to estimate this difference. The updated quaternion is then computed by adding the estimate of the difference to the a-priori quaternion estimate. If the quaternion estimate converges to the correct quaternion, then, naturally, the quaternion estimate has unity norm. This fact was utilized in the past to obtain superior filter performance by applying normalization to the filter measurement update of the quaternion. It was observed for the AEKF that when the attitude changed very slowly between measurements, normalization merely resulted in a faster convergence; however, when the attitude changed considerably between measurements, without filter tuning or normalization, the quaternion estimate diverged. However, when the
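The unit-norm constraint exploited by the normalization step in the additive EKF can be sketched as follows; the additive correction values here are purely illustrative and are not taken from the cited filters.

```python
import math

def normalize_quaternion(q):
    """Rescale a four-tuple to unit norm, as applied after the AEKF update."""
    n = math.sqrt(sum(c * c for c in q))
    return [c / n for c in q]

# A-priori estimate plus an additive measurement-update correction
# (illustrative numbers only).
q_apriori = [0.0, 0.0, 0.0, 1.0]
delta = [0.01, -0.02, 0.005, 0.001]
q_updated = [a + d for a, d in zip(q_apriori, delta)]

# The raw additive update is generally not a unit quaternion, so normalize.
q_hat = normalize_quaternion(q_updated)
```

After normalization the estimate again represents a valid rotation, which is the property the filter-tuning discussion above relies on.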

  19. Valorization of date palm (Phoenix dactylifera) fruit processing by-products and wastes using bioprocess technology - Review.

    PubMed

    Chandrasekaran, M; Bahkali, Ali H

    2013-04-01

The date palm Phoenix dactylifera has played an important role in the day-to-day life of the people for the last 7000 years. Today worldwide production, utilization and industrialization of dates are continuously increasing, since date fruits have earned great importance in human nutrition owing to their rich content of essential nutrients. Tons of date palm fruit wastes are discarded daily by the date processing industries, leading to environmental problems. Wastes such as date pits represent an average of 10% of the date fruits. Thus, there is an urgent need to find suitable applications for this waste. In spite of several studies on date palm cultivation, utilization, and the scope for using date fruit in therapeutic applications, very few reviews are available, and they are limited to the chemistry and pharmacology of date fruits, their phytochemical composition, nutritional significance, and the potential health benefits of date fruit consumption. In this context, the present review discusses the prospects of valorizing these date fruit processing by-products and wastes through fermentation and enzyme processing technologies, towards total utilization of this valuable commodity for the production of biofuels, biopolymers, biosurfactants, organic acids, antibiotics, industrial enzymes and other possible industrial chemicals.

  20. Poblaciones de los niveles atómicos en condiciones de no equilibrio

    NASA Astrophysics Data System (ADS)

    Cruzado, A.

The aim of this work is to find the distribution of atoms among the different energy levels. In order to obtain general and widely applicable results, we have formulated the statistical equilibrium equations as a function of the atomic number of the element under consideration and of the physical conditions of the medium (temperature and density). We have also attempted to take all atomic levels into account, treating explicitly those with a principal quantum number smaller than a certain value n and computing an approximate expression to estimate the influence of the remaining ones.

  1. Low-frequency electrotherapy for female patients with detrusor underactivity due to neuromuscular deficiency.

    PubMed

    Xu, Dan-Feng; Zhang, Shen; Wang, Cun-Zhou; Li, Jun; Qu, Chuang-Yu; Cui, Xin-Gang; Zhao, Sheng-Jia

    2012-08-01

The aim of the study was to assess the efficacy of low-frequency electrotherapy (LFE) in female patients with early-stage detrusor underactivity (DUA) due to neuromuscular deficiency. A total of 102 female patients were divided randomly into four groups: LFE-NC (normal compliance), LFE-LC (low compliance), CON-NC (control, normal compliance) and CON-LC (control, low compliance). Patients in the LFE-NC and LFE-LC groups received LFE, and those in the CON-NC and CON-LC groups received conservative treatment. Urodynamic evaluation was performed before and after treatment. After treatment, 82% of the LFE-NC group regained detrusor contractility, whereas only 2 patients (8%) in the CON-NC group had normal detrusor contraction. No patient in the LFE-LC or CON-LC group regained detrusor contractility (p < 0.01). The percentage of LFE-NC patients who relied on catheterization for bladder emptying decreased by 43% (p < 0.01), whereas the corresponding decreases in the LFE-LC, CON-NC and CON-LC groups were only 4, 12 and 0%, respectively (p > 0.05). LFE was more effective for DUA patients with normal compliance; these patients benefited from LFE, but DUA patients with low compliance did not.

  2. Flow derivatives and curvatures for a normal shock

    NASA Astrophysics Data System (ADS)

    Emanuel, G.

    2018-03-01

A detached bow shock wave is strongest where it is normal to the upstream velocity. While the jump conditions across the shock are straightforward, many properties, such as the shock's curvatures and derivatives of the pressure, along and normal to a normal shock, are indeterminate. A novel procedure is introduced for resolving the indeterminacy when the unsteady flow is three-dimensional and the upstream velocity may be nonuniform. Utilizing this procedure, normal shock relations are provided for the nonunique orientation of the flow plane and the corresponding shock's curvatures and, e.g., the downstream normal derivatives of the pressure and the velocity components. These algebraic relations explicitly show the dependence of these parameters on the shock's shape and the upstream velocity gradient. A simple relation, valid only for a normal shock, is obtained for the average curvatures. Results are also obtained when the shock is an elliptic paraboloid; in this case the derivatives are both simple and proportional to the average curvature.

  3. GC-Content Normalization for RNA-Seq Data

    PubMed Central

    2011-01-01

    Background Transcriptome sequencing (RNA-Seq) has become the assay of choice for high-throughput studies of gene expression. However, as is the case with microarrays, major technology-related artifacts and biases affect the resulting expression measures. Normalization is therefore essential to ensure accurate inference of expression levels and subsequent analyses thereof. Results We focus on biases related to GC-content and demonstrate the existence of strong sample-specific GC-content effects on RNA-Seq read counts, which can substantially bias differential expression analysis. We propose three simple within-lane gene-level GC-content normalization approaches and assess their performance on two different RNA-Seq datasets, involving different species and experimental designs. Our methods are compared to state-of-the-art normalization procedures in terms of bias and mean squared error for expression fold-change estimation and in terms of Type I error and p-value distributions for tests of differential expression. The exploratory data analysis and normalization methods proposed in this article are implemented in the open-source Bioconductor R package EDASeq. Conclusions Our within-lane normalization procedures, followed by between-lane normalization, reduce GC-content bias and lead to more accurate estimates of expression fold-changes and tests of differential expression. Such results are crucial for the biological interpretation of RNA-Seq experiments, where downstream analyses can be sensitive to the supplied lists of genes. PMID:22177264
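A toy illustration of the within-lane idea described above, assuming a simple equal-width GC binning with median scaling; the actual EDASeq procedures are more sophisticated, and the counts and GC values below are invented for demonstration.

```python
from statistics import median

def gc_normalize(counts, gc_content, n_bins=2):
    """Scale each gene's count by its GC-content bin's median count,
    so that bins with different GC content end up on a common scale."""
    # Assign each gene to an equal-width GC bin.
    bins = [min(int(g * n_bins), n_bins - 1) for g in gc_content]
    bin_median = {
        b: median(c for c, bb in zip(counts, bins) if bb == b)
        for b in set(bins)
    }
    overall = median(counts)
    return [c * overall / bin_median[b] for c, b in zip(counts, bins)]

counts = [10, 20, 30, 100, 200, 300]       # low-GC genes, then high-GC genes
gc = [0.2, 0.25, 0.3, 0.7, 0.75, 0.8]      # fabricated GC fractions
adjusted = gc_normalize(counts, gc)
```

After adjustment both GC bins share the same median, removing the (simulated) lane-specific GC effect before any between-lane normalization.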

  4. CHILES Con Pol: An ultra-deep JVLA survey probing galaxy evolution and cosmic magnetism

    NASA Astrophysics Data System (ADS)

    Hales, Christopher A.; Momjian, Emmanuel; van Gorkom, Jacqueline; Rupen, Michael P.; Greiner, Maksim; Ensslin, Torsten A.; Bonzini, Margherita; Padovani, Paolo; Harrison, Ian; Brown, Michael L.; Gim, Hansung; Yun, Min S.; Maddox, Natasha; Stewart, Adam; Fender, Rob P.; Tremou, Evangelia; Chomiuk, Laura; Peters, Charee; Wilcots, Eric M.; Lazio, Joseph

    2015-08-01

    We are undertaking a 1000 hour campaign with the Karl G. Jansky VLA to survey 0.2 square degrees of the COSMOS field in full polarization continuum at 1.4 GHz. Our observations are part of a joint program with the spectral line COSMOS HI Large Extragalactic Survey (CHILES). When complete, we expect our CHILES Continuum Polarization (CHILES Con Pol) survey to reach an SKA-era sensitivity of 500 nJy per 4 arcsecond resolving beam, the deepest view of the radio sky yet. CHILES Con Pol will open new and fertile parameter space, with sensitivity to star formation rates of 10 Msun per year out to an unprecedented redshift of z=2, and ultra-luminous infrared galaxies and sub-millimeter galaxies out to redshifts of z=8 and beyond. This rich resource will extend the utility of radio band studies beyond the usual radio quasar and radio galaxy populations, opening sensitivity to the starforming and radio-quiet AGN populations that form the bulk of extragalactic sources detected in the optical, X-ray, and infrared bands. In this talk I will outline the key science of CHILES Con Pol, including galaxy evolution and novel measurements of intergalactic magnetic fields. I will present initial results from the first 180 hours of the survey and describe our forthcoming Data Release 1. I invite the astronomical community to consider unique science that can be pursued with CHILES Con Pol radio data.

  5. Normal IQ is possible in Smith-Lemli-Opitz syndrome.

    PubMed

    Eroglu, Yasemen; Nguyen-Driver, Mina; Steiner, Robert D; Merkens, Louise; Merkens, Mark; Roullet, Jean-Baptiste; Elias, Ellen; Sarphare, Geeta; Porter, Forbes D; Li, Chumei; Tierney, Elaine; Nowaczyk, Małgorzata J; Freeman, Kurt A

    2017-08-01

Children with Smith-Lemli-Opitz syndrome (SLOS) are typically reported to have moderate to severe intellectual disability. This study aims to determine whether normal cognitive function is possible in this population and to describe clinical, biochemical and molecular characteristics of children with SLOS and normal intelligence quotient (IQ). The study included children with SLOS who underwent cognitive testing in four centers. All children with at least one IQ composite score above 80 were included in the study. Six girls and three boys with SLOS were found to have normal or low-normal IQ in a cohort of 145 children with SLOS. Major/multiple organ anomalies and low serum cholesterol levels were uncommon. No correlation between IQ and genotype was evident and no specific developmental profile was observed. Thus, normal or low-normal cognitive function is possible in SLOS. Further studies are needed to elucidate factors contributing to normal or low-normal cognitive function in children with SLOS. © 2017 Wiley Periodicals, Inc.

  6. Housekeeping while brain's storming Validation of normalizing factors for gene expression studies in a murine model of traumatic brain injury

    PubMed Central

    Rhinn, Hervé; Marchand-Leroux, Catherine; Croci, Nicole; Plotkine, Michel; Scherman, Daniel; Escriou, Virginie

    2008-01-01

    -intention normalizing factor with a broad field of applications is highlighted. Pros and cons of the three methods of normalization factors selection are discussed. A generic time- and cost-effective procedure for normalization factor validation is proposed. PMID:18611280

  7. Normal gravity field in relativistic geodesy

    NASA Astrophysics Data System (ADS)

    Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao

    2018-02-01

Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in the development of quantum sensors for applications in geodesy, including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and the multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of the Earth's gravitational field are referred is a normal gravity field, represented in Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of the Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field that is sufficient for current and near-future practical applications. We show that in general relativity the level surface of a homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are

  8. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  9. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  10. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  11. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  12. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  13. NREL, NYSERDA, and Con Edison Partner on Home Energy Management Systems |

    Science.gov Websites

Deployed at large scale, the overall impact could be a win-win for both homeowners and utilities. Founded in 1823, Con Edison provides electric, gas, and steam service to 10 million people.

  14. FORCED NORMALIZATION: Epilepsy and Psychosis Interaction

    PubMed Central

    Loganathan, Muruga A.; Enja, Manasa

    2015-01-01

Forced normalization is the emergence of psychosis following the establishment of seizure control in a patient with previously uncontrolled epilepsy. Two illustrative clinical vignettes are provided of patients whose epilepsy was newly brought under control and who subsequently developed psychosis; symptoms appeared only after ictal control was attained. For recognition and differential diagnosis purposes, understanding forced normalization is important in clinical practice. PMID:26155377

  15. Corticocortical feedback increases the spatial extent of normalization.

    PubMed

    Nassi, Jonathan J; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T

    2014-01-01

    Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a "normalization pool." Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing.
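The divisive form of normalization referred to above can be sketched as follows; the drive values, pool activities, and the semi-saturation constant are arbitrary assumptions for illustration, not fitted values from the study.

```python
def divisive_normalization(drive, pool_activity, sigma=1.0):
    """Response = driving input / (sigma + summed normalization-pool activity).

    Shrinking the pool, as when corticocortical feedback is inactivated,
    weakens the suppression of the response.
    """
    return drive / (sigma + sum(pool_activity))

# Same driving input, two spatial extents of the normalization pool:
narrow_pool = divisive_normalization(10.0, [1.0, 2.0])           # feedback off
wide_pool = divisive_normalization(10.0, [1.0, 2.0, 3.0, 3.0])   # feedback on
```

With the wider pool the same drive is divided by more pooled activity, reproducing in miniature the stronger surround suppression reported when feedback is intact.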

  16. Angio-OCT de la zona avascular foveal en ojos con oclusión venosa de la retina.

    PubMed

    Wons, Juliana; Pfau, Maximilian; Wirth, Magdalena A; Freiberg, Florentina J; Becker, Matthias D; Michels, Stephan

    2017-07-11

Purpose: The aim of the study was to visualize and quantify pathological alterations of the foveal avascular zone (FAZ) by OCT angiography in eyes with retinal vein occlusion (RVO) compared with the healthy fellow eye. Procedures: OCT angiography was performed with the Avanti® RTVue 100 XR system (Optovue Inc., Fremont, Calif., USA). The boundaries of the superficial vascular layer (SVL) were defined as 3 μm below the internal limiting membrane and 15 μm below the inner plexiform layer, and those of the deep vascular layer (DVL) as 15 and 70 μm below the internal limiting membrane and the inner plexiform layer, respectively. The horizontal, vertical and maximum FAZ diameters of the SVL and DVL were measured manually in each eye. In addition, the angle between the maximum FAZ diameter and the papillomacular plane was measured. Results: OCT angiography depicted defects in the perifoveal vasculature in eyes with branch retinal vein occlusion (BRVO; n = 11) and central retinal vein occlusion (CRVO; n = 8). This resulted in an enlargement of the maximum FAZ diameter in eyes with RVO (n = 19) compared with the fellow eye (n = 19; 921 ± 213 vs. 724 ± 145 µm; p = 0.008). Moreover, a significant correlation was observed between best-corrected visual acuity (BCVA) and the maximum FAZ diameter in the DVL (Spearman's ρ = -0.423, p < 0.01). Finally, in eyes with RVO, the angle between the papillomacular plane and the maximum FAZ diameter lay at 0 ± 15° or 90 ± 15° in only 21.05% (SVL) and 15.79% (DVL) of cases, respectively. In healthy eyes, these angles (presumed to represent a regular FAZ configuration) were more prevalent (SVL 68.42 vs. 21.05%, p = 0.003; DVL 73.68 vs. 15.79%, p < 0.001). Conclusions: OCT angiography shows morphological alterations of the FAZ in eyes with

  17. The International Consortium for the Investigation of Renal Malignancies (I-ConFIRM)

    Cancer.gov

    The International Consortium for the Investigation of Renal Malignancies (I-ConFIRM) was formed to promote international, multidisciplinary collaborations to advance our understanding of the etiology and outcomes of kidney cancer.

  18. Detail of conning tower atop the submarine. Note the wire ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of conning tower atop the submarine. Note the wire rope wrapped around the base of the tower, which may have been used in an attempt to pull the submarine offshore. - Sub Marine Explorer, Located along the beach of Isla San Telmo, Pearl Islands, Isla San Telmo, Former Panama Canal Zone, CZ

  19. Normalized Temperature Contrast Processing in Flash Infrared Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

The paper presents further development of the normalized contrast processing for the flash infrared thermography method given by the author in US 8,577,120 B1. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, including converting one from the other. Methods of assessing the emissivity of the object, afterglow heat flux, reflection temperature change, and temperature video imaging during flash thermography are also provided. Temperature imaging and normalized temperature contrast imaging offer certain advantages over normalized pixel intensity contrast processing by reducing the effect of reflected energy in images and measurements, providing better quantitative data. The subject matter of this paper comes mostly from US 9,066,028 B1 by the author. Examples of normalized image processing video images and normalized temperature processing video images are provided, along with examples of surface temperature video images, surface temperature rise video images, and simple contrast video images. Temperature video imaging in flash infrared thermography allows better comparison with flash thermography simulations using commercial software, which provides temperature video as output. Temperature imaging also allows easy comparison of surface temperature change against camera temperature sensitivity, or noise equivalent temperature difference (NETD), to assess the probability of detection (POD) of anomalies.
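As a rough sketch of the normalized temperature contrast idea (the exact definitions in the cited patents may differ), a pixel's temperature rise can be referenced to the rise of a defect-free reference region at the same frame time; all temperatures below are illustrative values in °C.

```python
def normalized_temperature_contrast(t_pixel, t_ref, t_ambient):
    """Ratio of a pixel's temperature rise to a defect-free reference rise.

    Values above 1.0 indicate the pixel retains more heat than sound
    material at that instant, a classic flash-thermography defect cue.
    """
    return (t_pixel - t_ambient) / (t_ref - t_ambient)

# Illustrative post-flash frame: pixel over a void cools more slowly.
contrast = normalized_temperature_contrast(30.0, 25.0, 20.0)
```

Working in temperature rather than raw pixel intensity is what lets the method suppress reflected-energy contributions, as the abstract notes.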

  20. Understanding a Normal Distribution of Data (Part 2).

    PubMed

    Maltenfort, Mitchell

    2016-02-01

Completing the discussion of data normality, this article covers advanced techniques for the analysis of non-normal data, including data transformation, generalized linear modeling, and bootstrapping. The relative strengths and weaknesses of each technique are helpful in choosing a strategy, but help from a statistician is usually necessary to analyze non-normal data using these methods.
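A minimal sketch of the bootstrapping technique mentioned above, assuming a percentile confidence interval for the mean of an arbitrary skewed sample; the data and seed are invented for illustration.

```python
import random

def bootstrap_mean_ci(data, n_boot=5000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for the mean:
    resample with replacement, collect the resampled means, and
    read off the alpha/2 and 1 - alpha/2 quantiles."""
    rng = random.Random(seed)
    means = sorted(
        sum(rng.choices(data, k=len(data))) / len(data)
        for _ in range(n_boot)
    )
    low = means[int(n_boot * alpha / 2)]
    high = means[int(n_boot * (1 - alpha / 2)) - 1]
    return low, high

skewed = [1, 1, 1, 2, 2, 3, 5, 8, 13, 40]   # clearly non-normal sample
low, high = bootstrap_mean_ci(skewed)
```

No normality assumption enters anywhere: the interval comes entirely from the empirical resampling distribution, which is the appeal of the method for data like the sample above.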

  1. High-speed digital signal normalization for feature identification

    NASA Technical Reports Server (NTRS)

    Ortiz, J. A.; Meredith, B. D.

    1983-01-01

A design approach for high-speed normalization of digital signals was developed. A reciprocal look-up table technique is employed, in which a digital value is mapped to its reciprocal via a high-speed memory. This reciprocal is then multiplied with an input signal to obtain the normalized result. Normalization considerably improves the accuracy of certain feature identification algorithms. By using the concept of pipelining, the multispectral sensor data processing rate is limited only by the speed of the multiplier. The breadboard system was found to operate at an execution rate of five million normalizations per second. This design features high precision, reduced hardware complexity, high flexibility, and expandability, which are very important considerations for spaceborne applications. It also achieves the high-speed normalization rate essential for real-time data processing.
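The reciprocal look-up table scheme might be sketched as follows; the 8-bit table depth and Q16 fixed-point scaling are assumptions for illustration, not the breadboard's actual parameters.

```python
TABLE_BITS = 8            # assumed width of the divisor (table address)
SCALE = 1 << 16           # fixed-point scale of the stored reciprocals (Q16)

# Precompute the reciprocal table once, standing in for the high-speed
# memory; entry d holds round-down of SCALE / d (entry 0 is unused).
RECIP_LUT = [0] + [SCALE // d for d in range(1, 1 << TABLE_BITS)]

def normalize(signal: int, reference: int) -> int:
    """One table read plus one multiply, as in the pipelined design.
    The result keeps 8 fractional bits (255 ~ a ratio of 1.0)."""
    return (signal * RECIP_LUT[reference]) >> 8
```

Replacing the division with a memory read and a multiply is exactly what lets the pipeline's throughput be bounded by the multiplier alone.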

  2. Determinación astronómica de la Desviación de la Vertical

    NASA Astrophysics Data System (ADS)

    Pacheco, A. M.; Podestá, R. C.

Starting from the astronomical latitude and longitude coordinates determined at the Nikizanga geological fault, located in the Pie de Palo hills, and based on a reference datum point, the methodology for determining the deflection of the vertical is developed. It comprises the reduction of the astronomical observations, coordinate transformations, the application of corrections, and the final computation of the angular values of the vertical. These studies were initiated at the suggestion of the International Latitude Service, later the International Polar Motion Service (IPMS), with the aim of obtaining the deflection of the vertical and its variation at selected points on the Earth, within the new discipline called astrogeodynamics, and with the idea of correlating these variations with the prediction of large earthquakes.

  3. Antitissue Transglutaminase Normalization Postdiagnosis in Children With Celiac Disease.

    PubMed

    Isaac, Daniela Migliarese; Rajani, Seema; Yaskina, Maryna; Huynh, Hien Q; Turner, Justine M

    2017-08-01

    Limited pediatric data exist examining the trend and predictors of antitissue transglutaminase (atTG) normalization over time in children with celiac disease (CD). We aimed to evaluate time to normalization of atTG in children after CD diagnosis, and to assess for independent predictors affecting this duration. A retrospective chart review was completed in pediatric patients with CD diagnosed from 2007 to 2014 at the Stollery Children's Hospital Celiac Clinic (Edmonton, Alberta, Canada). The clinical predictors assessed for impact on time to atTG normalization were initial atTG, Marsh score at diagnosis, gluten-free diet compliance (GFDC), age at diagnosis, sex, ethnicity, medical comorbidities, and family history of CD. Kaplan-Meier survival analysis was completed to assess time to atTG normalization, and Cox regression to assess for independent predictors of this time. A total of 487 patients met inclusion criteria. Approximately 80.5% of patients normalized atTG levels. Median normalization time was 407 days for all patients (95% confidence interval [CI: 361-453]), and 364 days for gluten-free diet compliant patients (95% CI [335-393]). Type 1 diabetes mellitus (T1DM) patients took significantly longer to normalize at 1204 days (95% CI [199-2209], P < 0.001). Cox regression demonstrated T1DM (hazard ratio = 0.36 [0.24-0.55], P < 0.001) and higher baseline atTG (hazard ratio = 0.52 [0.43-0.63], P < 0.001) were significant predictors of longer atTG normalization time. GFDC was a significant predictor of earlier normalization (OR = 13.91 [7.86-24.62], P < 0.001). GFDC and lower atTG at diagnosis are predictors of earlier normalization. Patients with T1DM are less likely to normalize atTG levels, with longer normalization time. Additional research and education for higher-risk populations are needed.

  4. Normalization of energy-dependent gamma survey data.

    PubMed

    Whicker, Randy; Chambers, Douglas

    2015-05-01

Instruments and methods for normalization of energy-dependent gamma radiation survey data to a less energy-dependent basis of measurement are evaluated based on relevant field data collected at 15 different sites across the western United States along with a site in Mongolia. Normalization performance is assessed relative to measurements with a high-pressure ionization chamber (HPIC) due to its "flat" energy response and accurate measurement of the true exposure rate from both cosmic and terrestrial radiation. While analytically ideal for normalization applications, cost and practicality disadvantages have increased demand for alternatives to the HPIC. Regression analysis on paired measurements between energy-dependent sodium iodide (NaI) scintillation detectors (5-cm by 5-cm crystal dimensions) and the HPIC revealed highly consistent relationships among sites not previously impacted by radiological contamination (natural sites). A resulting generalized data normalization factor based on the average sensitivity of NaI detectors to naturally occurring terrestrial radiation (0.56 nGy/h HPIC per nGy/h NaI), combined with a calculated site-specific estimate of cosmic radiation, produced reasonably accurate predictions of HPIC readings at natural sites. Normalization against two potential alternative instruments (a tissue-equivalent plastic scintillator and an energy-compensated NaI detector) did not perform better than the sensitivity-adjustment approach at natural sites. Each approach produced unreliable estimates of HPIC readings at radiologically impacted sites, though normalization against the plastic scintillator or energy-compensated NaI detector can address incompatibilities between different energy-dependent instruments with respect to estimation of soil radionuclide levels. The appropriate data normalization method depends on the nature of the site, the expected duration of the project, survey objectives, and considerations of cost and practicality.
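Assuming the reported average sensitivity factor is applied directly to the terrestrial component of a NaI reading, with the site-specific cosmic estimate added back, the normalization might be sketched as below; the function name and example readings are invented for illustration.

```python
# Average terrestrial sensitivity factor reported in the study:
# 0.56 nGy/h (HPIC) per nGy/h (NaI).
NAI_TO_HPIC = 0.56

def predict_hpic(nai_terrestrial_ngy_h: float, cosmic_ngy_h: float) -> float:
    """HPIC-equivalent exposure rate from a NaI terrestrial reading
    plus a site-specific cosmic radiation estimate."""
    return NAI_TO_HPIC * nai_terrestrial_ngy_h + cosmic_ngy_h

# Hypothetical natural-site survey point: 100 nGy/h terrestrial (NaI),
# 35 nGy/h estimated cosmic component.
predicted = predict_hpic(100.0, 35.0)
```

The cosmic term is added separately because the NaI detector's energy dependence applies to the terrestrial spectrum, whereas the cosmic estimate is computed for the site rather than measured by the scintillator.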

  5. Speaker normalization for chinese vowel recognition in cochlear implants.

    PubMed

    Luo, Xin; Fu, Qian-Jie

    2005-07-01

Because of the limited spectro-temporal resolution associated with cochlear implants, implant patients often have greater difficulty with multitalker speech recognition. The present study investigated whether multitalker speech recognition can be improved by applying speaker normalization techniques to cochlear implant speech processing. Multitalker Chinese vowel recognition was tested with normal-hearing Chinese-speaking subjects listening to a 4-channel cochlear implant simulation, with and without speaker normalization. For each subject, speaker normalization was referenced to the speaker that produced the best recognition performance under conditions without speaker normalization. To match the remaining speakers to this "optimal" output pattern, the overall frequency range of the analysis filter bank was adjusted for each speaker according to the ratio of the mean third-formant (F3) frequency values between the specific speaker and the reference speaker. Results showed that speaker normalization provided a small but significant improvement in subjects' overall recognition performance. After speaker normalization, subjects' patterns of recognition performance across speakers changed, demonstrating the potential for speaker-dependent effects with the proposed normalization technique.
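A hedged sketch of the F3-ratio adjustment, assuming a log-spaced analysis filter bank (the simulation's actual bank design is not specified here); the frequency range, channel count, and F3 values are illustrative assumptions.

```python
import math

def filter_bank_edges(f_low, f_high, n_channels):
    """Log-spaced channel edge frequencies (Hz) between f_low and f_high."""
    step = (math.log(f_high) - math.log(f_low)) / n_channels
    return [f_low * math.exp(i * step) for i in range(n_channels + 1)]

def normalized_edges(f_low, f_high, n_channels, f3_speaker, f3_reference):
    """Rescale the overall analysis range by the ratio of mean F3 values
    between this speaker and the reference speaker, as in the study."""
    ratio = f3_reference / f3_speaker
    return filter_bank_edges(f_low * ratio, f_high * ratio, n_channels)

# Hypothetical 4-channel bank; speaker F3 = 2800 Hz, reference F3 = 2520 Hz,
# so the whole range shifts down by a factor of 0.9.
edges = normalized_edges(200.0, 7000.0, 4, 2800.0, 2520.0)
```

Shifting the entire analysis range by one scalar ratio is what makes the scheme speaker-dependent while leaving the channel structure, and hence the processing pipeline, unchanged.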

  6. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected farmer's normal marketings which, for the purposes of this subpart, shall be the sum of the quantities of...

  7. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 7 2013-01-01 2013-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected farmer's normal marketings which, for the purposes of this subpart, shall be the sum of the quantities of...

  8. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 7 2012-01-01 2012-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected farmer's normal marketings which, for the purposes of this subpart, shall be the sum of the quantities of...

  9. Integration of Waste Valorization for Sustainable Production of Chemicals and Materials via Algal Cultivation.

    PubMed

    Chen, Yong; Sun, Li-Ping; Liu, Zhi-Hui; Martin, Greg; Sun, Zheng

    2017-11-27

    Managing waste is an increasing problem globally. Microalgae have the potential to help remove contaminants from a range of waste streams and convert them into useful biomass. This article presents a critical review of recent technological developments in the production of chemicals and other materials from microalgae grown using different types of waste. A range of novel approaches are examined for efficiently capturing CO2 in flue gas via photosynthetic microalgal cultivation. Strategies for using microalgae to assimilate nitrogen, organic carbon, phosphorus, and metal ions from wastewater are considered in relation to modes of production. Generally, more economical open cultivation systems such as raceway ponds are better suited for waste conversion than more expensive closed photobioreactor systems, which might have use for higher-value products. The effect of cultivation methods and the properties of the waste streams on the composition of the microalgal biomass is discussed relative to its utilization. Possibilities include the production of biodiesel via lipid extraction, biocrude from hydrothermal liquefaction, and bioethanol or biogas from microbial conversion. Microalgal biomass produced from wastes may also find use in higher-value applications including protein feeds or the production of bioactive compounds such as astaxanthin or omega-3 fatty acids. However, for some waste streams, further consideration of how to manage potential microbial and chemical contaminants is needed for food or health applications. The use of microalgae for waste valorization holds promise. Widespread implementation of the available technologies will likely follow from further improvements to reduce costs, as well as the increasing pressure to effectively manage waste. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Rhythm-based heartbeat duration normalization for atrial fibrillation detection.

    PubMed

    Islam, Md Saiful; Ammour, Nassim; Alajlan, Naif; Aboalsamh, Hatim

    2016-05-01

    Screening for atrial fibrillation (AF) in high-risk patients, including all patients aged 65 years and older, is important for reducing the risk of stroke. Different technologies, such as modified blood pressure monitors, single-lead ECG-based finger probes, and smartphones using plethysmogram signals, have been emerging for this purpose. All these technologies use irregularity of heartbeat duration as a feature for AF detection. We have investigated a normalization method of heartbeat duration for improved AF detection. AF is an arrhythmia in which heartbeat duration generally becomes irregularly irregular. From a window of heartbeat durations, we estimate the likely rhythm of the majority of heartbeats and normalize the duration of all heartbeats in the window based on that rhythm, so that we can measure the irregularity of heartbeats for both AF and non-AF rhythms on the same scale. Irregularity is measured by the entropy of the distribution of the normalized durations. We then classify a window of heartbeats as AF or non-AF by thresholding the measured irregularity. The effect of this normalization is evaluated by comparing AF detection performance using durations with the normalization, without normalization, and with other existing normalizations. Sensitivity and specificity of AF detection using normalized heartbeat duration were tested on two landmark databases available online and compared with the results of other methods (with/without normalization) by receiver operating characteristic (ROC) curves. ROC analysis showed that the normalization improved the performance of AF detection, and that the improvement was consistent over a wide range of sensitivity and specificity for different thresholds. Detection accuracy was also computed at equal rates of sensitivity and specificity for the different methods. Using normalized heartbeat duration, we obtained 96.38% accuracy, a more than 4% improvement over AF detection without normalization. The proposed normalization
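
    A simplified sketch of the pipeline (illustrative only: the paper's rhythm-estimation and normalization steps are more involved, and the histogram bins and threshold below are invented): estimate a dominant rhythm, express each beat relative to it, and threshold the entropy of the resulting distribution:

```python
import math
from statistics import median

def rhythm_normalize(rr_intervals):
    # Estimate the dominant rhythm as the median beat duration,
    # then express each beat relative to it (1.0 = on-rhythm).
    rhythm = median(rr_intervals)
    return [rr / rhythm for rr in rr_intervals]

def irregularity_entropy(normalized_rr, n_bins=8, lo=0.5, hi=1.5):
    # Histogram the normalized durations and compute Shannon entropy.
    counts = [0] * n_bins
    for x in normalized_rr:
        i = min(n_bins - 1, max(0, int((x - lo) / (hi - lo) * n_bins)))
        counts[i] += 1
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

def detect_af(rr_intervals, threshold=1.0):
    # A regular rhythm concentrates in one bin (entropy ~0);
    # irregularly irregular beats spread out (high entropy).
    return irregularity_entropy(rhythm_normalize(rr_intervals)) > threshold
```

    Regular windows (e.g. all beats near 0.8 s) yield near-zero entropy and are classified non-AF, while highly variable windows exceed the threshold.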

  11. Normal modes of weak colloidal gels

    NASA Astrophysics Data System (ADS)

    Varga, Zsigmond; Swan, James W.

    2018-01-01

    The normal modes and relaxation rates of weak colloidal gels are investigated in calculations using different models of the hydrodynamic interactions between suspended particles. The relaxation spectrum is computed for freely draining, Rotne-Prager-Yamakawa, and accelerated Stokesian dynamics approximations of the hydrodynamic mobility in a normal mode analysis of a harmonic network representing several colloidal gels. We find that the density of states and spatial structure of the normal modes are fundamentally altered by long-ranged hydrodynamic coupling among the particles. Short-ranged coupling due to hydrodynamic lubrication affects only the relaxation rates of short-wavelength modes. Hydrodynamic models accounting for long-ranged coupling exhibit a microscopic relaxation rate for each normal mode, λ, that scales as l^-2, where l is the spatial correlation length of the normal mode. For the freely draining approximation, which neglects long-ranged coupling, the microscopic relaxation rate scales as l^-γ, where γ varies between three and two with increasing particle volume fraction. A simple phenomenological model of the internal elastic response to normal mode fluctuations is developed, which shows that long-ranged hydrodynamic interactions play a central role in the viscoelasticity of the gel network. Dynamic simulations of hard spheres that gel in response to short-ranged depletion attractions are used to test the applicability of the density of states predictions. For particle concentrations up to 30% by volume, the power law decay of the relaxation modulus in simulations accounting for long-ranged hydrodynamic interactions agrees with predictions generated by the density of states of the corresponding harmonic networks as well as experimental measurements. For higher volume fractions, excluded volume interactions dominate the stress response, and the prediction from the harmonic network density of states fails. Analogous to the Zimm model in polymer
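
    A normal-mode analysis of a harmonic network can be sketched in its simplest form, a freely draining 1-D bead-spring chain, where relaxation rates are the mobility times the eigenvalues of the stiffness matrix (a toy illustration, not the paper's gel-network model or its hydrodynamic approximations):

```python
import numpy as np

def chain_stiffness(n_beads, k=1.0):
    """Stiffness (Hessian) matrix of a 1-D harmonic chain with free ends."""
    K = np.zeros((n_beads, n_beads))
    for i in range(n_beads - 1):
        K[i, i] += k
        K[i + 1, i + 1] += k
        K[i, i + 1] -= k
        K[i + 1, i] -= k
    return K

def relaxation_rates(K, mobility=1.0):
    """Freely draining normal-mode relaxation rates: mobility times the
    eigenvalues of the stiffness matrix, in ascending order."""
    return mobility * np.linalg.eigvalsh(K)

rates = relaxation_rates(chain_stiffness(4))
# The softest mode is the zero-rate rigid translation of the whole chain;
# long-ranged hydrodynamic coupling would replace the scalar mobility
# with a configuration-dependent mobility matrix.
```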

  12. Corticocortical feedback increases the spatial extent of normalization

    PubMed Central

    Nassi, Jonathan J.; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T.

    2014-01-01

    Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a “normalization pool.” Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing. PMID:24910596
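
    The divisive computation described, a neuron's driving input divided by the spatially integrated activity of a normalization pool, can be sketched as follows (hypothetical names; the fitted model in the study also includes gain and saturation terms not shown here):

```python
def normalized_response(drive, pool_activity, sigma=1.0):
    """Divisive normalization: driving input over a semi-saturation
    constant plus the summed activity of the normalization pool."""
    return drive / (sigma + sum(pool_activity))

# Shrinking the pool's spatial extent (as reported for feedback
# inactivation) removes distant suppressive inputs and raises the response:
full_pool = normalized_response(10.0, [1.0, 1.0, 1.0])  # wider pool
shrunk_pool = normalized_response(10.0, [1.0])          # nearer inputs only
```

    This captures why a smaller normalization pool produces weaker surround suppression, as observed when V2/V3 feedback was inactivated.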

  13. Plasma Electrolyte Distributions in Humans: Normal or Skewed?

    PubMed

    Feldman, Mark; Dickson, Beverly

    2017-11-01

    It is widely believed that plasma electrolyte levels are normally distributed. Statistical tests and calculations using plasma electrolyte data are often reported based on this assumption of normality. Examples include t tests, analysis of variance, correlations and confidence intervals. The purpose of our study was to determine whether plasma sodium (Na+), potassium (K+), chloride (Cl-) and bicarbonate (HCO3-) distributions are indeed normally distributed. We analyzed plasma electrolyte data from 237 consecutive adults (137 women and 100 men) who had normal results on a standard basic metabolic panel which included plasma electrolyte measurements. The skewness of each distribution (as a measure of its asymmetry) was compared to the zero skewness of a normal (Gaussian) distribution. The plasma Na+ distribution was skewed slightly to the right, but the skew was not significantly different from zero. The plasma Cl- distribution was skewed slightly to the left, but again the skew was not significantly different from zero. In contrast, both the plasma K+ and HCO3- distributions were significantly skewed to the right (P < 0.01 vs. zero skew). There was also a suggestion from examining frequency distribution curves that the K+ and HCO3- distributions were bimodal. In adults with a normal basic metabolic panel, plasma potassium and bicarbonate levels are not normally distributed and may be bimodal. Thus, statistical methods used to evaluate these 2 plasma electrolytes should be nonparametric tests and not parametric ones that require a normal distribution. Copyright © 2017 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.
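
    The asymmetry measure used here is sample skewness: zero for a symmetric sample, positive for the right skew reported for K+ and HCO3-. A minimal sketch of the standard moment coefficient (the study's significance test against zero skew is not reproduced):

```python
def sample_skewness(xs):
    """Moment coefficient of skewness: m3 / m2**1.5, where m2 and m3
    are the second and third central moments of the sample."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

symmetric = sample_skewness([3.5, 4.0, 4.5])          # ~0: no skew
right_skewed = sample_skewness([3.5, 3.5, 3.5, 5.0])  # > 0: right tail
```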

  14. Applying the log-normal distribution to target detection

    NASA Astrophysics Data System (ADS)

    Holst, Gerald C.

    1992-09-01

    Holst and Pickard experimentally determined that MRT responses tend to follow a log-normal distribution. The log-normal distribution appeared reasonable because nearly all visual psychological data are plotted on a logarithmic scale. It has the additional advantage that it is bounded to positive values, an important consideration since probability of detection is often plotted in linear coordinates. Review of published data suggests that the log-normal distribution may have universal applicability. Specifically, the log-normal distribution obtained from MRT tests appears to fit the target transfer function and the probability of detection of rectangular targets.
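
    Under this account, the probability of detection plotted against the stimulus variable follows a log-normal CDF, which is defined only for positive arguments as noted above (a generic sketch; mu and sigma are free parameters, not values from the paper):

```python
import math

def lognormal_detection_probability(x, mu, sigma):
    """Log-normal CDF: the probability that a log-normally distributed
    detection threshold falls below stimulus level x (x > 0)."""
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

# Detection probability reaches 0.5 at the median of the distribution, exp(mu):
p_median = lognormal_detection_probability(math.exp(1.0), mu=1.0, sigma=0.5)
```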

  15. Public health and valorization of genome-based technologies: a new model

    PubMed Central

    2011-01-01

    Background The success rate of timely translation of genome-based technologies into commercially feasible products/services with applicability in health care systems is significantly low. We identified that both industry and scientists neglect health policy aspects when commercializing their technology, more specifically Public Health Assessment Tools (PHAT) and the early involvement of the decision makers on whom market authorization and reimbursement depend. While Technology Transfer (TT) aims to facilitate the translation of ideas into products, Health Technology Assessment, one component of PHAT, facilitates the translation of products/processes into healthcare services and ultimately produces recommendations for decision makers. We aim to propose a new model of valorization to optimize the integration of genome-based technologies into the healthcare system. Methods The method used to develop our model is an adapted version of the Fish Trap Model and the Basic Design Cycle. Results We found that, although different, TT and PHAT share similarities. Their potential for mutual benefit justified our proposal of their relative parallel initiation. We observed that the Public Health Genomics Wheel should be included in this relative parallel activity to ensure all societal/policy aspects are dealt with preemptively by both stakeholders. On further analysis, we found that this whole process depends on the Value of Information. As a result, we present our LAL (Learning Adapting Leveling) model, which proposes that, based on market demand, TT and PHAT should advocate for relevant technologies through consultation and bilateral communication. This can be achieved by public-private partnerships (PPPs). These widely defined PPPs create the innovation network, a developing, consultative/collaborative networking platform between TT and PHAT. This network has iterations and requires learning, assimilating and using the knowledge developed, and is called absorption capacity.

  16. Public health and valorization of genome-based technologies: a new model.

    PubMed

    Lal, Jonathan A; Schulte In den Bäumen, Tobias; Morré, Servaas A; Brand, Angela

    2011-12-05

    The success rate of timely translation of genome-based technologies into commercially feasible products/services with applicability in health care systems is significantly low. We identified that both industry and scientists neglect health policy aspects when commercializing their technology, more specifically Public Health Assessment Tools (PHAT) and the early involvement of the decision makers on whom market authorization and reimbursement depend. While Technology Transfer (TT) aims to facilitate the translation of ideas into products, Health Technology Assessment, one component of PHAT, facilitates the translation of products/processes into healthcare services and ultimately produces recommendations for decision makers. We aim to propose a new model of valorization to optimize the integration of genome-based technologies into the healthcare system. The method used to develop our model is an adapted version of the Fish Trap Model and the Basic Design Cycle. We found that, although different, TT and PHAT share similarities. Their potential for mutual benefit justified our proposal of their relative parallel initiation. We observed that the Public Health Genomics Wheel should be included in this relative parallel activity to ensure all societal/policy aspects are dealt with preemptively by both stakeholders. On further analysis, we found that this whole process depends on the Value of Information. As a result, we present our LAL (Learning Adapting Leveling) model, which proposes that, based on market demand, TT and PHAT should advocate for relevant technologies through consultation and bilateral communication. This can be achieved by public-private partnerships (PPPs). These widely defined PPPs create the innovation network, a developing, consultative/collaborative networking platform between TT and PHAT. This network has iterations and requires learning, assimilating and using the knowledge developed, and is called absorption capacity. We

  17. Is Coefficient Alpha Robust to Non-Normal Data?

    PubMed Central

    Sheng, Yanyan; Sheng, Zhaohui

    2011-01-01

    Coefficient alpha has been a widely used measure by which internal consistency reliability is assessed. In addition to essential tau-equivalence and uncorrelated errors, normality has been noted as another important assumption for alpha. Earlier work on evaluating this assumption considered either exclusively non-normal error score distributions, or limited conditions. In view of this and the availability of advanced methods for generating univariate non-normal data, Monte Carlo simulations were conducted to show that non-normal distributions for true or error scores do create problems for using alpha to estimate the internal consistency reliability. The sample coefficient alpha is affected by leptokurtic true score distributions, or skewed and/or kurtotic error score distributions. Increasing the sample size, not the test length, helps reduce the bias and improve the accuracy and precision of alpha with non-normal data. PMID:22363306
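
    Simulations like these rest on the sample coefficient alpha itself, which can be computed from an items-by-respondents score matrix with the standard formula (a sketch with illustrative data; the study's Monte Carlo design is not reproduced):

```python
import statistics

def cronbach_alpha(item_scores):
    """Coefficient alpha from a list of per-item score lists
    (all items scored for the same respondents, in the same order)."""
    k = len(item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]
    item_var_sum = sum(statistics.pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var_sum / statistics.pvariance(totals))

# Three perfectly parallel items give alpha = 1:
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
```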

  18. Mapping of quantitative trait loci using the skew-normal distribution.

    PubMed

    Fernandes, Elisabete; Pacheco, António; Penha-Gonçalves, Carlos

    2007-11-01

    In standard interval mapping (IM) of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. When this assumption of normality is violated, the most commonly adopted strategy is to use the previous model after data transformation. However, an appropriate transformation may not exist or may be difficult to find. Also this approach can raise interpretation issues. An interesting alternative is to consider a skew-normal mixture model in standard IM, and the resulting method is here denoted as skew-normal IM. This flexible model that includes the usual symmetric normal distribution as a special case is important, allowing continuous variation from normality to non-normality. In this paper we briefly introduce the main peculiarities of the skew-normal distribution. The maximum likelihood estimates of parameters of the skew-normal distribution are obtained by the expectation-maximization (EM) algorithm. The proposed model is illustrated with real data from an intercross experiment that shows a significant departure from the normality assumption. The performance of the skew-normal IM is assessed via stochastic simulation. The results indicate that the skew-normal IM has higher power for QTL detection and better precision of QTL location as compared to standard IM and nonparametric IM.
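
    The density at the heart of this model is Azzalini's skew-normal, 2·phi(z)·Phi(shape·z), which reduces to the usual normal when the shape parameter is zero, allowing the continuous variation from normality to non-normality described above (a sketch of the density only; the EM fitting procedure is not reproduced):

```python
import math

def skew_normal_pdf(x, loc=0.0, scale=1.0, shape=0.0):
    """Azzalini skew-normal density; shape=0 recovers the symmetric
    normal distribution as a special case."""
    z = (x - loc) / scale
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    big_phi = 0.5 * (1.0 + math.erf(shape * z / math.sqrt(2.0)))
    return 2.0 / scale * phi * big_phi

# With shape=0 this is the standard normal density at 0, i.e. 1/sqrt(2*pi):
center = skew_normal_pdf(0.0)
```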

  19. End-User Use of Data Base Query Language: Pros and Cons.

    ERIC Educational Resources Information Center

    Nicholes, Walter

    1988-01-01

    Man-machine interface, the concept of a computer "query," a review of database technology, and a description of the use of query languages at Brigham Young University are discussed. The pros and cons of end-user use of database query languages are explored. (Author/MLW)

  20. Attention and normalization circuits in macaque V1

    PubMed Central

    Sanayei, M; Herrero, J L; Distler, C; Thiele, A

    2015-01-01

    Attention affects neuronal processing and improves behavioural performance. In extrastriate visual cortex these effects have been explained by normalization models, which assume that attention influences the circuit that mediates surround suppression. While normalization models have been able to explain attentional effects, their validity has rarely been tested against alternative models. Here we investigate how attention and surround/mask stimuli affect neuronal firing rates and orientation tuning in macaque V1. Surround/mask stimuli provide an estimate of the extent to which V1 neurons are affected by normalization, which was compared against the effects of spatial top-down attention. For some attention/surround effect comparisons, the strength of attentional modulation was correlated with the strength of surround modulation, suggesting that attention and surround/mask stimulation (i.e. normalization) might use a common mechanism. To explore this in detail, we fitted multiplicative and additive models of attention to our data. In one class of models, attention contributed to normalization mechanisms, whereas in a different class of models it did not. Model selection based on Akaike's and Bayesian information criteria demonstrated that in most cells the effects of attention were best described by models where attention did not contribute to normalization mechanisms. This demonstrates that attentional influences on neuronal responses in primary visual cortex often bypass normalization mechanisms. PMID:25757941
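
    The model-selection criteria used here penalize a model's log-likelihood by its parameter count (AIC) or by its parameter count scaled with sample size (BIC); the model with the lower criterion value is preferred (generic formulas with invented numbers, not values from the study):

```python
import math

def aic(log_likelihood, n_params):
    """Akaike information criterion; lower is better."""
    return 2.0 * n_params - 2.0 * log_likelihood

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion; penalizes extra parameters more
    heavily than AIC once the number of observations exceeds about 8."""
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

# A model whose extra parameters fail to improve the fit enough is rejected:
simple = aic(-100.0, 3)   # fewer parameters
complex_ = aic(-99.5, 5)  # better fit, but not enough to justify 2 more
```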

  1. Dependence of normal brain integral dose and normal tissue complication probability on the prescription isodose values for γ-knife radiosurgery

    NASA Astrophysics Data System (ADS)

    Ma, Lijun

    2001-11-01

    A recent multi-institutional clinical study suggested possible benefits of lowering the prescription isodose lines for stereotactic radiosurgery procedures. In this study, we investigate the dependence of the normal brain integral dose and the normal tissue complication probability (NTCP) on the prescription isodose values for γ-knife radiosurgery. An analytical dose model was developed for γ-knife treatment planning. The dose model was commissioned by fitting the measured dose profiles for each helmet size. The dose model was validated by comparing its results with the Leksell gamma plan (LGP, version 5.30) calculations. The normal brain integral dose and the NTCP were computed and analysed for an ensemble of treatment cases. The functional dependence of the normal brain integral dose and the NTCP versus the prescription isodose values was studied for these cases. We found that the normal brain integral dose and the NTCP increase significantly when lowering the prescription isodose lines from 50% to 35% of the maximum tumour dose. Alternatively, the normal brain integral dose and the NTCP decrease significantly when raising the prescription isodose lines from 50% to 65% of the maximum tumour dose. The results may be used as a guideline for designing future dose escalation studies for γ-knife applications.

  2. Sinonasal inverted papilloma with intracranial invasion: case report and literature review

    PubMed Central

    Di Pietrantonio, Andrés; Asmus, Humberto; Ingratta, Christian; Brennan, Walter; Schulz, Javier; Carballo, Leandro

    2018-01-01

    Abstract Introduction: Inverted papilloma is a benign, locally aggressive neoplasm of the paranasal sinuses with a high potential for recurrence and malignant transformation. Intracranial extension is infrequent, and dural penetration even more so, often being associated with recurrence of the disease or its degeneration into squamous cell carcinoma. Clinical case: We present the case of a 32-year-old patient who consulted for an exophytic lesion in the right nasal fossa and exophthalmos, associated with headache, anosmia and dysgeusia. Work-up with brain and facial CT and brain MRI revealed a lesion in the right nasal fossa occupying the air sinuses, with osteolysis of the medial orbital wall and anterior skull base and right frontal intracranial invasion, with mass effect and compression of the adjacent brain parenchyma. Intervention: Nasofibroscopy was performed first, yielding a histopathological diagnosis of inverted papilloma, followed by resection of the lesion through a combined double approach plus reconstruction of the anterior cranial fossa. The definitive diagnosis was Schneiderian-type inverted papilloma with areas of atypical transformation in situ. The patient recovered favourably and without complications, with a patent upper airway and no signs of lesion recurrence after 4 years of follow-up. Conclusion: Intracranial invasion in this pathology is extremely infrequent. When present, it is an indicator of aggressiveness and potential recurrence, so complete excision of the lesion defines the prognosis of the disease. PMID:29430328

  3. Prevalence and screening of Attention-Deficit/Hyperactivity Disorder in Costa Rica

    PubMed Central

    Weiss, Nicholas T.; Schuler, Jovita; Monge, Silvia; McGough, James J.; Chavira, Denise; Bagnarello, Monica; Herrera, Luis Diego; Mathews, Carol A.

    2015-01-01

    Abstract The study aimed to estimate the prevalence of Attention-Deficit/Hyperactivity Disorder (ADHD) in Costa Rica and to determine whether the Spanish version of the Swanson Nolan and Pelham Scale IV (SNAP-IV) questionnaire is a useful screening instrument in a population of Costa Rican schoolchildren. The instrument was given to the parents and teachers of 425 children aged 5 to 13 years (mean = 8.8). All children were evaluated with the Swanson, Kotkin, Agler, M-Flynn and Pelham Scale (SKAMP). Their diagnoses were confirmed by clinical interview. The sensitivity and specificity of the SNAP-IV were evaluated as predictors of DSM-IV diagnostic criteria. The point prevalence of ADHD in the sample was 5%. The most accurate screening was obtained with the teacher-completed SNAP-IV at a 20% cutoff, with a sensitivity of 96% and a specificity of 82%. The sensitivity of the instruments completed by parents was lower than that of the teachers. The teacher-completed SNAP-IV, with a cutoff isolating the top 20% of scores, correctly classified 87% of the subjects. PMID:22432094

  4. Analytical solutions to the problem of jets with time-variable ejection velocity.

    NASA Astrophysics Data System (ADS)

    Canto, J.; Raga, A. C.; D'Alessio, P.

    1998-11-01

    A new method is presented that allows the equations describing a hypersonic jet with time-variable ejection velocity to be solved exactly and analytically. The method is based on simple momentum-conservation considerations for the working surfaces that form within the jet. As examples, solutions are presented for jets with a sinusoidal variation of the ejection velocity, and also for the case of a linear increase in time. These analytical solutions have a clear application to the interpretation of observations of jets associated with Herbig-Haro objects.

  5. Air Traffic Management Technology Demonstration-1 Concept of Operations (ATD-1 ConOps)

    NASA Technical Reports Server (NTRS)

    Baxley, Brian T.; Johnson, William C.; Swenson, Harry; Robinson, John E.; Prevot, Thomas; Callantine, Todd; Scardina, John; Greene, Michael

    2012-01-01

    The operational goal of the ATD-1 ConOps is to enable aircraft, using their onboard FMS capabilities, to fly Optimized Profile Descents (OPDs) from cruise to the runway threshold at a high-density airport, at a high throughput rate, using primarily speed control to maintain in-trail separation and the arrival schedule. The three technologies in the ATD-1 ConOps achieve this by calculating a precise arrival schedule, using controller decision support tools to provide terminal controllers with speeds for aircraft to fly to meet scheduled times at particular meter points, and using onboard software to provide flight crews with speeds to achieve a particular spacing behind preceding aircraft.

  6. Valorization of Tunisian alfa fibres and sumac tannins for the elaboration of biodegradable insulating panels

    NASA Astrophysics Data System (ADS)

    Saad, Houda; Charrier, Bertrand; Ayed, Naceur; Charrier-El-Bouhtoury, Fatima

    2017-10-01

    Alfa leaves are important renewable raw materials in Tunisia, where they are used mainly in handcrafts and the paper industry. Sumac is also an abundant species in Tunisia, known for its high tannin content and used mainly in traditional medicine. To valorize these natural resources, we studied, for the first time, the possibility of making insulating panels based on alfa fibres and a sumac tannin-based adhesive. Firstly, alfa leaves were treated with an alkali solution, one of the standard procedures commonly used in the paper industry to extract cellulosic fibres. Mercerization effects were studied by characterizing the thermal properties and surface morphology of the fibres. Secondly, the sumac tannin-based resin was formulated and characterized. Finally, the insulating panel was produced and characterized by determining its thermal conductivity. The thermogravimetric analysis results show an improvement in the thermal stability of the fibres after alkali treatment. Environmental Scanning Electron Microscopy showed changes on the treated alfa surface which could promote fibre-matrix adhesion. The formaldehyde reactivity test (Stiasny number) showed the possible use of sumac tannins in wood adhesive formulation. Thermomechanical analysis and strength analysis of the sumac tannin/hexamine-based resin highlighted acceptable bonding properties. The thermal conductivity measurement showed an average value of 0.110 W/(m K). Contribution to the topical issue "Materials for Energy harvesting, conversion and storage II (ICOME 2016)", edited by Jean-Michel Nunzi, Rachid Bennacer and Mohammed El Ganaoui

  7. Spatially tuned normalization explains attention modulation variance within neurons.

    PubMed

    Ni, Amy M; Maunsell, John H R

    2017-09-01

    Spatial attention improves perception of attended parts of a scene, a behavioral enhancement accompanied by modulations of neuronal firing rates. These modulations vary in size across neurons in the same brain area. Models of normalization explain much of this variance in attention modulation with differences in tuned normalization across neurons (Lee J, Maunsell JHR. PLoS One 4: e4651, 2009; Ni AM, Ray S, Maunsell JHR. Neuron 73: 803-813, 2012). However, recent studies suggest that normalization tuning varies with spatial location both across and within neurons (Ruff DA, Alberts JJ, Cohen MR. J Neurophysiol 116: 1375-1386, 2016; Verhoef BE, Maunsell JHR. eLife 5: e17256, 2016). Here we show directly that attention modulation and normalization tuning do in fact covary within individual neurons, in addition to across neurons as previously demonstrated. We recorded the activity of isolated neurons in the middle temporal area of two rhesus monkeys as they performed a change-detection task that controlled the focus of spatial attention. Using the same two drifting Gabor stimuli and the same two receptive field locations for each neuron, we found that switching which stimulus was presented at which location affected both attention modulation and normalization in a correlated way within neurons. We present an equal-maximum-suppression spatially tuned normalization model that explains this covariance both across and within neurons: each stimulus generates equally strong suppression of its own excitatory drive, but its suppression of distant stimuli is typically less. This new model specifies how the tuned normalization associated with each stimulus location varies across space both within and across neurons, changing our understanding of the normalization mechanism and how attention modulations depend on this mechanism. 
NEW & NOTEWORTHY Tuned normalization studies have demonstrated that the variance in attention modulation size seen across neurons from the same cortical

  8. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... change salinity patterns, alter erosion or sedimentation rates, aggravate water temperature extremes, and... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Normal water fluctuations. 230.24... Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem § 230.24 Normal water...

  9. Facial-Attractiveness Choices Are Predicted by Divisive Normalization.

    PubMed

    Furl, Nicholas

    2016-10-01

Do people appear more attractive or less attractive depending on the company they keep? A divisive-normalization account, in which the representation of stimulus intensity is normalized (divided) by concurrent stimulus intensities, predicts that choice preferences among options increase with the range of option values. In the first experiment reported here, I manipulated the range of attractiveness of the faces presented on each trial by varying the attractiveness of an undesirable distractor face that was presented simultaneously with two attractive targets, and participants were asked to choose the most attractive face. I used normalization models to predict the context dependence of preferences regarding facial attractiveness. The more unattractive the distractor, the more one of the targets was preferred over the other target, which suggests that divisive normalization (a potential canonical computation in the brain) influences social evaluations. I obtained the same result when I manipulated faces' averageness and participants chose the most average face. This finding suggests that divisive normalization is not restricted to value-based decisions (e.g., attractiveness). This new application of normalization, a classic theory, to social evaluation opens possibilities for predicting social decisions in naturalistic contexts such as advertising or dating.
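A minimal numerical sketch of the divisive-normalization account described above; the attractiveness values and semisaturation constant are made-up numbers, not the paper's fitted parameters.

```python
import numpy as np

def normalized_values(values, sigma=1.0):
    """Divisive normalization: each option's value is divided by the summed
    value of all options on the trial plus a semisaturation constant sigma."""
    v = np.asarray(values, dtype=float)
    return v / (sigma + v.sum())

# Two attractive targets (values 8 and 7) paired with a distractor whose
# attractiveness varies; the preference gap between the two targets grows
# as the distractor becomes less attractive.
for distractor in (6.0, 2.0):
    v = normalized_values([8.0, 7.0, distractor])
    print(distractor, v[0] - v[1])
```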

  10. Normal mode analysis and applications in biological physics.

    PubMed

    Dykeman, Eric C; Sankey, Otto F

    2010-10-27

    Normal mode analysis has become a popular and often used theoretical tool in the study of functional motions in enzymes, viruses, and large protein assemblies. The use of normal modes in the study of these motions is often extremely fruitful since many of the functional motions of large proteins can be described using just a few normal modes which are intimately related to the overall structure of the protein. In this review, we present a broad overview of several popular methods used in the study of normal modes in biological physics including continuum elastic theory, the elastic network model, and a new all-atom method, recently developed, which is capable of computing a subset of the low frequency vibrational modes exactly. After a review of the various methods, we present several examples of applications of normal modes in the study of functional motions, with an emphasis on viral capsids.
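As a concrete illustration of the general method, the normal modes of a toy two-mass, three-spring chain can be computed by diagonalizing the stiffness (Hessian) matrix; this is a textbook sketch, not the all-atom method the review describes.

```python
import numpy as np

# Toy normal-mode analysis: two unit masses coupled by three identical springs
# (wall-mass-mass-wall). Eigenvectors of the stiffness (Hessian) matrix are the
# normal modes; square roots of its eigenvalues are the angular frequencies.
k = 1.0
K = np.array([[2.0 * k, -k],
              [-k, 2.0 * k]])

eigvals, modes = np.linalg.eigh(K)   # eigenvalues ascend: k and 3k
freqs = np.sqrt(eigvals)

print(freqs)        # close to [1.0, 1.732] for k = 1
print(modes[:, 0])  # lowest-frequency mode: both masses move in phase
```

The same recipe scales to proteins, where the low-frequency modes of a much larger Hessian capture collective functional motions.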

  11. Meissner effect in normal-superconducting proximity-contact double layers

    NASA Astrophysics Data System (ADS)

    Higashitani, Seiji; Nagai, Katsuhiko

    1995-02-01

The Meissner effect in normal-superconducting proximity-contact double layers is discussed in the clean limit. The diamagnetic current is calculated using the quasi-classical Green's function. We obtain the quasi-classical Green's function linear in the vector potential in the proximity-contact double layers with a finite reflection coefficient at the interface. It is found that the diamagnetic current in the clean normal layer is constant in space; the magnetic field therefore decreases linearly in the clean normal layer. We give an explicit expression for the screening length in the clean normal layer and study its temperature dependence. We show that the temperature dependence in the clean normal layer is considerably different from that in the dirty normal layer and agrees with a recent experiment in the Au-Nb system.

  12. Helicon normal modes in Proto-MPEX

    NASA Astrophysics Data System (ADS)

    Piotrowicz, P. A.; Caneses, J. F.; Green, D. L.; Goulding, R. H.; Lau, C.; Caughman, J. B. O.; Rapp, J.; Ruzic, D. N.

    2018-05-01

The Proto-MPEX helicon source has been operating in a high electron density ‘helicon-mode’. Establishing plasma densities and magnetic field strengths under the antenna that allow for the formation of normal modes of the fast-wave is believed to be responsible for the ‘helicon-mode’. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that, in the regions of operation in which core power deposition is maximum, the slow-wave does not deposit significant power except directly under the antenna. In the case of a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.

  13. Helicon normal modes in Proto-MPEX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piotrowicz, Pawel A.; Caneses, Juan F.; Green, David L.

Here, the Proto-MPEX helicon source has been operating in a high electron density 'helicon-mode'. Establishing plasma densities and magnetic field strengths under the antenna that allow for the formation of normal modes of the fast-wave is believed to be responsible for the 'helicon-mode'. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that, in the regions of operation in which core power deposition is maximum, the slow-wave does not deposit significant power except directly under the antenna. In the case of a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.

  14. Helicon normal modes in Proto-MPEX

    DOE PAGES

    Piotrowicz, Pawel A.; Caneses, Juan F.; Green, David L.; ...

    2018-05-22

Here, the Proto-MPEX helicon source has been operating in a high electron density 'helicon-mode'. Establishing plasma densities and magnetic field strengths under the antenna that allow for the formation of normal modes of the fast-wave is believed to be responsible for the 'helicon-mode'. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that, in the regions of operation in which core power deposition is maximum, the slow-wave does not deposit significant power except directly under the antenna. In the case of a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.

  15. Normal fault earthquakes or graviquakes

    PubMed Central

    Doglioni, C.; Carminati, E.; Petricca, P.; Riguzzi, F.

    2015-01-01

Earthquakes dissipate energy through elastic waves. Canonically, this is elastic energy accumulated during the interseismic period. In crustal extensional settings, however, gravity is the main energy source, driving the collapse of the hanging wall along the fault. The available gravitational potential energy is about 100 times larger than the energy implied by the observed magnitude, far more than enough to power the earthquake. Normal faults therefore have a different mechanism of energy accumulation and dissipation (graviquakes) than other tectonic settings (strike-slip and contractional), where stored elastic energy allows motion even against gravity. The bigger the involved volume, the larger the magnitude; the steeper the normal fault, the larger the vertical displacement and the larger the seismic energy released. Normal faults activate preferentially at about 60° but can be shallower in low-friction rocks, where the fault may partly creep, dissipating gravitational energy without releasing a great amount of seismic energy. The maximum volume involved by graviquakes is smaller than in the other tectonic settings, the activated fault being at most about three times the hypocentre depth, which explains their higher b-value and the lower magnitude of the largest recorded events. Having a different phenomenology, graviquakes show peculiar precursors. PMID:26169163
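The energy argument can be illustrated with a rough back-of-envelope comparison using the standard Gutenberg-Richter energy-magnitude relation; the volume, drop, and density figures below are illustrative assumptions, not the paper's values.

```python
# Gutenberg-Richter energy-magnitude relation: log10(E[J]) = 1.5*M + 4.8
def seismic_energy_joules(magnitude):
    return 10.0 ** (1.5 * magnitude + 4.8)

# Gravitational potential energy released by a hanging-wall volume dropping dh
def gravitational_energy_joules(volume_m3, drop_m, density=2700.0, g=9.81):
    return density * g * volume_m3 * drop_m

E_seismic = seismic_energy_joules(6.0)              # an M 6 normal-fault event
E_gravity = gravitational_energy_joules(10e9, 0.5)  # 10 km^3 dropping 0.5 m

# The available gravitational budget comfortably exceeds the radiated energy
print(E_gravity / E_seismic)
```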

  16. Existing con el Lobo, Traversing la Frontera con Mis Nepantla Coyotes, y Buscando la Vida del Zorro: An Autoethnographic Exploration of a Chicano in Academia

    ERIC Educational Resources Information Center

    Ramirez, Ernesto Fidel

    2017-01-01

    This dissertation is the experience of my life, an evolution of platicas I have had con mis coyotes, my Nepantlero guides. I am one Chicano navigating through the mechanisms of a coercive and hegemonic system which limits our advancement in the academy. My ontology, epistemology, and axiology stem from my cultural and family foundations which I…

  17. Concanavalin A-mediated polyclonal helper assay of normal thymocytes and its use for the analysis of ontogeny.

    PubMed

    Kina, T; Nishikawa, S; Amagai, T; Katsura, Y

    1987-01-01

A concanavalin A (Con A)-mediated polyclonal helper assay system was established by using thymus cells or splenic T cells as a source of helper T cells. When splenic B cells were cocultured with thymus cells or splenic Lyt-2- T cells in the presence of an optimal dose of Con A, B cells were polyclonally activated and differentiated into immunoglobulin-secreting cells. This Con A-mediated helper activity was completely inhibited by the addition of alpha-methyl-D-mannoside and could not be substituted by culture supernatant of Con A-stimulated thymocytes or splenic T cells. Almost all the activity of the thymus cells was carried by the peanut agglutinin low-binding population. Genetic restriction between T and B cells was not observed in this helper function. In ontogeny, Con A-mediated helper activity in the thymus was first detected a few days after birth and reached the adult level at about 1 week of age. The polyclonal helper assay system developed in the present study provides a sensitive system to analyse the helper function of thymus cells and also to delineate the early phase of the differentiation of the helper T cell population.

  18. On the efficacy of procedures to normalize Ex-Gaussian distributions

    PubMed Central

    Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío

    2015-01-01

Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed; hence it is widely acknowledged that the normality assumption is often not met. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier-elimination and transformation procedures were tested against the level of normality they provide. The results suggest that the transformation methods are better than elimination methods in normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are in normalizing it. Specifically, transformation with parameter lambda = -1 leads to the best results. PMID:25709588

  19. On the efficacy of procedures to normalize Ex-Gaussian distributions.

    PubMed

    Marmolejo-Ramos, Fernando; Cousineau, Denis; Benites, Luis; Maehara, Rocío

    2014-01-01

Reaction time (RT) is one of the most common types of measure used in experimental psychology. Its distribution is not normal (Gaussian) but resembles a convolution of normal and exponential distributions (Ex-Gaussian). One of the major assumptions in parametric tests (such as ANOVAs) is that variables are normally distributed; hence it is widely acknowledged that the normality assumption is often not met. This paper presents different procedures to normalize data sampled from an Ex-Gaussian distribution in such a way that they are suitable for parametric tests based on the normality assumption. Using simulation studies, various outlier-elimination and transformation procedures were tested against the level of normality they provide. The results suggest that the transformation methods are better than elimination methods in normalizing positively skewed data, and that the more skewed the distribution, the more effective the transformation methods are in normalizing it. Specifically, transformation with parameter lambda = -1 leads to the best results.
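The lambda = -1 transformation mentioned above (the reciprocal, from the Box-Cox family) can be sketched on simulated Ex-Gaussian reaction times; the distribution parameters here are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def skewness(x):
    x = np.asarray(x, dtype=float)
    return float(np.mean((x - x.mean()) ** 3) / x.std() ** 3)

# Simulated reaction times: Ex-Gaussian = Normal(mu, sigma) + Exponential(tau)
rt = rng.normal(400.0, 40.0, 10_000) + rng.exponential(150.0, 10_000)

# Power transform with lambda = -1 (the reciprocal); negating preserves
# the original ordering of the scores.
rt_transformed = -1.0 / rt

# Skewness drops markedly after the transformation
print(skewness(rt), skewness(rt_transformed))
```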

  20. Normalized Legal Drafting and the Query Method.

    ERIC Educational Resources Information Center

    Allen, Layman E.; Engholm, C. Rudy

    1978-01-01

    Normalized legal drafting, a mode of expressing ideas in legal documents so that the syntax that relates the constituent propositions is simplified and standardized, and the query method, a question-asking activity that teaches normalized drafting and provides practice, are examined. Some examples are presented. (JMD)

  1. 40 CFR 230.24 - Normal water fluctuations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... either attuned to or characterized by these periodic water fluctuations. (b) Possible loss of... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Normal water fluctuations. 230.24... Impacts on Physical and Chemical Characteristics of the Aquatic Ecosystem § 230.24 Normal water...

  2. Attention and normalization circuits in macaque V1.

    PubMed

    Sanayei, M; Herrero, J L; Distler, C; Thiele, A

    2015-04-01

Attention affects neuronal processing and improves behavioural performance. In extrastriate visual cortex these effects have been explained by normalization models, which assume that attention influences the circuit that mediates surround suppression. While normalization models have been able to explain attentional effects, their validity has rarely been tested against alternative models. Here we investigate how attention and surround/mask stimuli affect neuronal firing rates and orientation tuning in macaque V1. Surround/mask stimuli provide an estimate of the extent to which V1 neurons are affected by normalization, which was compared against effects of spatial top-down attention. For some attention/surround effect comparisons, the strength of attentional modulation was correlated with the strength of surround modulation, suggesting that attention and surround/mask stimulation (i.e. normalization) might use a common mechanism. To explore this in detail, we fitted multiplicative and additive models of attention to our data. In one class of models, attention contributed to normalization mechanisms, whereas in a different class of models it did not. Model selection based on Akaike's and on Bayesian information criteria demonstrated that in most cells the effects of attention were best described by models where attention did not contribute to normalization mechanisms. This demonstrates that attentional influences on neuronal responses in primary visual cortex often bypass normalization mechanisms. © 2015 The Authors. European Journal of Neuroscience published by Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  3. Compressed normalized block difference for object tracking

    NASA Astrophysics Data System (ADS)

    Gao, Yun; Zhang, Dengzhuo; Cai, Donglan; Zhou, Hao; Lan, Ge

    2018-04-01

Feature extraction is very important for robust and real-time tracking. Compressive sensing provides technical support for real-time feature extraction. However, all existing compressive trackers are based on the compressed Haar-like feature, and how to compress other, more discriminative high-dimensional features is worth researching. In this paper, a novel compressed normalized block difference (CNBD) feature is proposed. To resist noise more effectively than the high-dimensional normalized pixel difference (NPD) feature, the normalized block difference feature extends the two pixels in the original NPD formula to two blocks. A CNBD feature can be obtained by compressing a normalized block difference feature based on compressive sensing theory, with a sparse random Gaussian matrix as the measurement matrix. Comparative experiments with 7 trackers on 20 challenging sequences showed that the tracker based on the CNBD feature performs better than the other trackers, especially the FCT tracker based on the compressed Haar-like feature, in terms of AUC, SR and precision.
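The compression step can be sketched as a random projection; for simplicity this uses a dense rather than sparse Gaussian measurement matrix, and the block values and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def normalized_block_difference(b1, b2):
    """NPD-style normalized difference, extended from two pixels to the mean
    intensities of two blocks; values fall in (-1, 1)."""
    return (b1 - b2) / (b1 + b2 + 1e-8)

# 1000 hypothetical block-pair mean intensities in [0, 255)
blocks = rng.uniform(0.0, 255.0, size=(1000, 2))
features = normalized_block_difference(blocks[:, 0], blocks[:, 1])

# Compress the 1000-dimensional feature vector to 50 measurements with a
# random Gaussian measurement matrix (the compressive sensing step)
n, m = features.size, 50
R = rng.normal(0.0, 1.0, size=(m, n)) / np.sqrt(m)
compressed = R @ features

print(compressed.shape)  # (50,)
```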

  4. Normal Databases for the Relative Quantification of Myocardial Perfusion

    PubMed Central

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.

    2016-01-01

Purpose of review Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated perfusion quantitative measures that are used. Recent findings New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of a normal MPI scan may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354
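The core of relative quantification against a normal database can be sketched as a per-segment z-score comparison; the segment counts, normal limits, and the -2.5 SD threshold below are illustrative numbers only, not values from any clinical database.

```python
import numpy as np

# Illustrative relative quantification: a patient's regional perfusion counts
# are expressed in standard deviations from the normal-database mean, and
# segments falling below a threshold are flagged as perfusion defects.
normal_mean = np.array([72.0, 75.0, 70.0, 68.0])  # per-segment normal limits
normal_sd   = np.array([ 6.0,  5.0,  7.0,  6.0])
patient     = np.array([70.0, 60.0, 69.0, 50.0])

z = (patient - normal_mean) / normal_sd
abnormal = z < -2.5  # hypothetical defect criterion

print(z.round(2), abnormal)
```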

  5. Tumor vessel normalization after aerobic exercise enhances chemotherapeutic efficacy.

    PubMed

    Schadler, Keri L; Thomas, Nicholas J; Galie, Peter A; Bhang, Dong Ha; Roby, Kerry C; Addai, Prince; Till, Jacob E; Sturgeon, Kathleen; Zaslavsky, Alexander; Chen, Christopher S; Ryeom, Sandra

    2016-10-04

    Targeted therapies aimed at tumor vasculature are utilized in combination with chemotherapy to improve drug delivery and efficacy after tumor vascular normalization. Tumor vessels are highly disorganized with disrupted blood flow impeding drug delivery to cancer cells. Although pharmacologic anti-angiogenic therapy can remodel and normalize tumor vessels, there is a limited window of efficacy and these drugs are associated with severe side effects necessitating alternatives for vascular normalization. Recently, moderate aerobic exercise has been shown to induce vascular normalization in mouse models. Here, we provide a mechanistic explanation for the tumor vascular normalization induced by exercise. Shear stress, the mechanical stimuli exerted on endothelial cells by blood flow, modulates vascular integrity. Increasing vascular shear stress through aerobic exercise can alter and remodel blood vessels in normal tissues. Our data in mouse models indicate that activation of calcineurin-NFAT-TSP1 signaling in endothelial cells plays a critical role in exercise-induced shear stress mediated tumor vessel remodeling. We show that moderate aerobic exercise with chemotherapy caused a significantly greater decrease in tumor growth than chemotherapy alone through improved chemotherapy delivery after tumor vascular normalization. Our work suggests that the vascular normalizing effects of aerobic exercise can be an effective chemotherapy adjuvant.

  6. A Review of Depth and Normal Fusion Algorithms

    PubMed Central

    Štolc, Svorad; Pock, Thomas

    2018-01-01

    Geometric surface information such as depth maps and surface normals can be acquired by various methods such as stereo light fields, shape from shading and photometric stereo techniques. We compare several algorithms which deal with the combination of depth with surface normal information in order to reconstruct a refined depth map. The reasons for performance differences are examined from the perspective of alternative formulations of surface normals for depth reconstruction. We review and analyze methods in a systematic way. Based on our findings, we introduce a new generalized fusion method, which is formulated as a least squares problem and outperforms previous methods in the depth error domain by introducing a novel normal weighting that performs closer to the geodesic distance measure. Furthermore, a novel method is introduced based on Total Generalized Variation (TGV) which further outperforms previous approaches in terms of the geodesic normal distance error and maintains comparable quality in the depth error domain. PMID:29389903
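A least-squares depth/normal fusion of the kind reviewed can be sketched in one dimension; the weighting, noise level, and depth profile are illustrative, and real methods operate on 2-D maps with normal-derived weights.

```python
import numpy as np

# 1-D sketch: fuse a noisy depth profile d with gradients g derived from
# surface normals by minimizing ||z - d||^2 + w * ||D z - g||^2, where D is
# a forward-difference operator. The normal equations give
#   (I + w * D^T D) z = d + w * D^T g.
n, w = 100, 10.0
x = np.linspace(0.0, 1.0, n)
true_depth = np.sin(2.0 * np.pi * x)

rng = np.random.default_rng(2)
d = true_depth + rng.normal(0.0, 0.2, n)  # noisy depth measurements
g = np.diff(true_depth)                   # clean gradients from normals

D = np.diff(np.eye(n), axis=0)            # (n-1, n) forward differences
z = np.linalg.solve(np.eye(n) + w * D.T @ D, d + w * D.T @ g)

# Mean absolute error before and after fusion
print(np.abs(d - true_depth).mean(), np.abs(z - true_depth).mean())
```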

  7. Systemic sclerosis with normal or nonspecific nailfold capillaroscopy.

    PubMed

    Fichel, Fanny; Baudot, Nathalie; Gaitz, Jean-Pierre; Trad, Salim; Barbe, Coralie; Francès, Camille; Senet, Patricia

    2014-01-01

In systemic sclerosis (SSc), a specific nailfold videocapillaroscopy (NVC) pattern is observed in 90% of cases and seems to be associated with severity and progression of the disease. Our aim was to describe the characteristics of SSc patients with normal or nonspecific (normal/nonspecific) NVC. In a retrospective cohort study, clinical features and visceral involvement of 25 SSc cases with normal/nonspecific NVC were compared to 63 SSc controls with the SSc-specific NVC pattern. Normal/nonspecific NVC versus the SSc-specific NVC pattern was significantly associated with absence of skin sclerosis (32 vs. 6.3%, p = 0.004), absence of telangiectasia (47.8 vs. 17.3%, p = 0.006), absence of sclerodactyly (60 vs. 25.4%, p = 0.002), and less frequent severe pulmonary involvement (26.3 vs. 58.2%, p = 0.017). Normal/nonspecific NVC in SSc patients appears to be associated with less severe skin involvement and less frequent severe pulmonary involvement. © 2014 S. Karger AG, Basel.

  8. Non-Normality and Testing that a Correlation Equals Zero

    ERIC Educational Resources Information Center

    Levy, Kenneth J.

    1977-01-01

    The importance of the assumption of normality for testing that a bivariate normal correlation equals zero is examined. Both empirical and theoretical evidence suggest that such tests are robust with respect to violation of the normality assumption. (Author/JKS)
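For reference, the standard test that a bivariate normal correlation equals zero uses the t statistic below; the r and n values are illustrative.

```python
import math

def corr_t_statistic(r, n):
    """t statistic for H0: rho = 0 under bivariate normality, with n - 2
    degrees of freedom: t = r * sqrt((n - 2) / (1 - r^2))."""
    return r * math.sqrt((n - 2) / (1.0 - r * r))

t = corr_t_statistic(0.5, 27)  # r = 0.5 observed in n = 27 pairs
print(round(t, 3))             # -> 2.887
```

The robustness result summarized above concerns how this statistic behaves when the bivariate normality assumption behind it is violated.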

  9. Deformation associated with continental normal faults

    NASA Astrophysics Data System (ADS)

    Resor, Phillip G.

Deformation associated with normal fault earthquakes and geologic structures provides insights into the seismic cycle as it unfolds over time scales from seconds to millions of years. Improved understanding of normal faulting will lead to more accurate seismic hazard assessments and prediction of associated structures. High-precision aftershock locations for the 1995 Kozani-Grevena earthquake (Mw 6.5), Greece, image a segmented master fault and antithetic faults. This three-dimensional fault geometry is typical of normal fault systems mapped from outcrop or interpreted from reflection seismic data and illustrates the importance of incorporating three-dimensional fault geometry in mechanical models. Subsurface fault slip associated with the Kozani-Grevena and 1999 Hector Mine (Mw 7.1) earthquakes is modeled using a new method for slip inversion on three-dimensional fault surfaces. Incorporation of three-dimensional fault geometry improves the fit to the geodetic data while honoring aftershock distributions and surface ruptures. GPS surveying of deformed bedding surfaces associated with normal faulting in the western Grand Canyon reveals patterns of deformation that are similar to those observed by satellite radar interferometry (InSAR) for the Kozani-Grevena earthquake, with a prominent down-warp in the hanging wall and a lesser up-warp in the footwall. However, deformation associated with the Kozani-Grevena earthquake extends ˜20 km from the fault surface trace, while the folds in the western Grand Canyon only extend 500 m into the footwall and 1500 m into the hanging wall. A comparison of mechanical and kinematic models illustrates advantages of mechanical models in exploring normal faulting processes, including incorporation of both deformation and causative forces, and the opportunity to incorporate more complex fault geometry and constitutive properties. Elastic models with antithetic or synthetic faults or joints in association with a master

  10. Normal personality traits, rumination and stress generation among early adolescent girls

    PubMed Central

    Stroud, Catherine B.; Sosoo, Effua E.; Wilson, Sylia

    2017-01-01

    This study examined associations between personality and stress generation. Expanding upon prior work, we examined (a) the role of Positive Emotionality (PE), Negative Emotionality (NE), and Constraint (CON), and their lower-order facets, as predictors of acute and chronic interpersonal stress generation; (b) whether personality moderated effects of rumination on stress generation; and (c) whether personality increased exposure to independent (uncontrollable) stress. These questions were examined in a one-year study of 126 adolescent girls (M age = 12.39 years) using contextual stress interviews. NE predicted increases in acute and chronic interpersonal stress generation, but not independent stress. NE, CON and affiliative PE each moderated the effect of rumination on chronic interpersonal stress generation. These effects were driven by particular lower-order traits. PMID:28845067

  11. Valorization of date palm (Phoenix dactylifera) fruit processing by-products and wastes using bioprocess technology – Review

    PubMed Central

    Chandrasekaran, M.; Bahkali, Ali H.

    2013-01-01

The date palm Phoenix dactylifera has played an important role in the day-to-day life of the people for the last 7000 years. Today worldwide production, utilization and industrialization of dates are continuously increasing, since date fruits have earned great importance in human nutrition owing to their rich content of essential nutrients. Tons of date palm fruit wastes are discarded daily by the date processing industries, leading to environmental problems. Wastes such as date pits represent an average of 10% of the date fruits. Thus, there is an urgent need to find suitable applications for this waste. In spite of several studies on date palm cultivation, their utilization and scope for utilizing date fruit in therapeutic applications, very few reviews are available, and they are limited to the chemistry and pharmacology of the date fruits and the phytochemical composition, nutritional significance and potential health benefits of date fruit consumption. In this context, the present review discusses the prospects of valorizing these date fruit processing by-products and wastes by employing fermentation and enzyme processing technologies towards total utilization of this valuable commodity for the production of biofuels, biopolymers, biosurfactants, organic acids, antibiotics, industrial enzymes and other possible industrial chemicals. PMID:23961227

  12. Normalization as a canonical neural computation

    PubMed Central

    Carandini, Matteo; Heeger, David J.

    2012-01-01

    There is increasing evidence that the brain relies on a set of canonical neural computations, repeating them across brain regions and modalities to apply similar operations to different problems. A promising candidate for such a computation is normalization, in which the responses of neurons are divided by a common factor that typically includes the summed activity of a pool of neurons. Normalization was developed to explain responses in the primary visual cortex and is now thought to operate throughout the visual system, and in many other sensory modalities and brain regions. Normalization may underlie operations such as the representation of odours, the modulatory effects of visual attention, the encoding of value and the integration of multisensory information. Its presence in such a diversity of neural systems in multiple species, from invertebrates to mammals, suggests that it serves as a canonical neural computation. PMID:22108672
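The canonical normalization computation described above can be written down directly; the exponent and semisaturation values here are illustrative.

```python
import numpy as np

def normalize(drives, sigma=1.0, n=2.0):
    """Canonical divisive normalization:
       R_i = d_i^n / (sigma^n + sum_j d_j^n)
    Each neuron's driving input is divided by the pooled activity of the
    population; exponent and semisaturation values are illustrative."""
    d = np.asarray(drives, dtype=float) ** n
    return d / (sigma ** n + d.sum())

weak = normalize([1.0, 2.0, 3.0])
strong = normalize([10.0, 20.0, 30.0])

# Relative responses are preserved while absolute responses saturate as the
# total drive grows (a signature of divisive normalization).
print(weak / weak.sum(), strong / strong.sum())
```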

  13. Resonance Raman of BCC and normal skin

    NASA Astrophysics Data System (ADS)

    Liu, Cheng-hui; Sriramoju, Vidyasagar; Boydston-White, Susie; Wu, Binlin; Zhang, Chunyuan; Pei, Zhe; Sordillo, Laura; Beckman, Hugh; Alfano, Robert R.

    2017-02-01

The resonance Raman (RR) spectra of basal cell carcinoma (BCC) and normal human skin tissues were analyzed using 532 nm laser excitation. Differences in the RR vibrational fingerprints distinguished normal from cancerous skin tissue. A diagnostic criterion for BCC was derived from native RR biomarkers and changes in their peak intensities. Diagnostic algorithms for classifying BCC versus normal tissue were generated using an SVM classifier and PCA. Applied to the RR spectral data collected from skin tissues, these statistical methods yielded a diagnostic sensitivity of 98.7% and a specificity of 79% compared with pathology reports.

  14. Thymic lymphocytes. III. Cooperative phenomenon in the proliferation of thymocytes under Con A stimulation.

    PubMed

    Papiernik, M; Jacobson, J B

    1986-01-01

In the present paper, the response of thymocytes to Con A is analyzed in terms of a cooperative phenomenon between medullary thymocytes, cortical thymocytes, thymic accessory cells, and interleukin 2. Medullary thymocytes respond spontaneously to Con A and produce IL-2. The addition of exogenously produced IL-2 enhances their proliferation. Small numbers of cortical (PNA+) thymocytes do not respond to Con A, even in the presence of IL-2-containing supernatant. By increasing the number of PNA+ cells per well, sensitivity to Con A and IL-2 appears. This response may be linked either to the increase in a minor PNA+-responding population and/or to the enhanced contamination by medullary thymocytes and macrophages in the non-responding PNA+ thymocyte population. In this hypothesis, the contaminating cells either respond by themselves and/or cooperate with PNA+ cells to induce their proliferation. Coculture of non-responding low numbers of PNA+ thymocytes with Con A- and IL-2-containing supernatant in the presence of PNA- cells containing thymic medullary thymocytes and macrophages always produces a higher response than that of each individual population. These results show that a cooperative phenomenon occurs in cocultures of PNA+ and PNA- thymic cells. Using PNA+ and PNA- thymocytes with different Thy 1 alleles, we can show that indeed both PNA+ and PNA- populations participate in the generation of proliferating cells. We can demonstrate, by lysis experiments with monoclonal antibodies and complement, that at the end of coculture most of the proliferating cells are Lyt 1+, and part are Lyt 2+ or L3T4+. We discuss the fact that the phenotype of the cells after activation does not allow us to deduce the phenotype of their precursors. Lysis of Ia+ cells prior to coculture reduces the level of the proliferative response but does not modify the percentage of cooperation produced

  15. Bulimia nervosa in overweight and normal-weight women.

    PubMed

    Masheb, Robin; White, Marney A

    2012-02-01

    The aim of the present study was to examine overweight bulimia nervosa (BN) in a community sample of women. Volunteers (n = 1964) completed self-report questionnaires of weight, binge eating, purging, and cognitive features. Participants were classified as overweight (body mass index ≥25) or normal weight (body mass index <25). Rates of BN within the overweight and normal-weight classes did not differ (6.4% vs 7.9%). Of the 131 participants identified as BN, 64% (n = 84) were classified as overweight BN and 36% (n = 47) as normal-weight BN. The overweight BN group had a greater proportion of ethnic minorities and reported significantly less restraint than the normal-weight BN group. Otherwise, the 2 groups reported similarly, even in terms of purging and depression. In summary, rates of BN did not differ between overweight and normal-weight women. Among BN participants, the majority (two thirds) were overweight. Differences in ethnicity and restraint, but little else, were found between overweight and normal-weight BN. Findings from the present study should serve to increase awareness of the weight range and ethnic diversity of BN, and highlight the need to address weight and cultural sensitivity in the identification and treatment of eating disorders. Copyright © 2012 Elsevier Inc. All rights reserved.
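The study's weight classification reduces to the standard BMI formula and a single cutoff. A minimal sketch (Python; the helper names are illustrative, not from the paper — only the BMI ≥ 25 cutoff comes from the abstract):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def classify_weight(bmi_value: float) -> str:
    """Apply the study's cutoff: BMI >= 25 is overweight, below 25 is normal weight."""
    return "overweight" if bmi_value >= 25 else "normal weight"
```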

  16. Normal force and drag force in magnetorheological finishing

    NASA Astrophysics Data System (ADS)

    Miao, Chunlin; Shafrir, Shai N.; Lambropoulos, John C.; Jacobs, Stephen D.

    2009-08-01

    The material removal in magnetorheological finishing (MRF) is known to be controlled by shear stress, τ, which equals drag force, Fd, divided by spot area, As. However, it is unclear how the normal force, Fn, affects the material removal in MRF and how the measured ratio of drag force to normal force Fd/Fn, equivalent to coefficient of friction, is related to material removal. This work studies, for the first time for MRF, the normal force and the measured ratio Fd/Fn as a function of material mechanical properties. Experimental data were obtained by taking spots on a variety of materials including optical glasses and hard ceramics with a spot-taking machine (STM). Drag force and normal force were measured with a dual load cell. Drag force decreases linearly with increasing material hardness. In contrast, normal force increases with hardness for glasses, saturating at high hardness values for ceramics. Volumetric removal rate decreases with normal force across all materials. The measured ratio Fd/Fn shows a strong negative linear correlation with material hardness. Hard materials exhibit a low "coefficient of friction". The volumetric removal rate increases with the measured ratio Fd/Fn which is also correlated with shear stress, indicating that the measured ratio Fd/Fn is a useful measure of material removal in MRF.
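The two quantities the abstract relates, shear stress τ = Fd/As and the friction-like ratio Fd/Fn, can be written down directly. A small sketch (function names and units are illustrative, not from the paper):

```python
def shear_stress(drag_force_n: float, spot_area_mm2: float) -> float:
    """Shear stress tau = Fd / As, the quantity reported to control removal in MRF."""
    return drag_force_n / spot_area_mm2

def force_ratio(drag_force_n: float, normal_force_n: float) -> float:
    """Measured ratio Fd/Fn, interpreted in the abstract as an effective coefficient of friction."""
    return drag_force_n / normal_force_n
```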

  17. High-Frequency Normal Mode Propagation in Aluminum Cylinders

    USGS Publications Warehouse

    Lee, Myung W.; Waite, William F.

    2009-01-01

    Acoustic measurements made using compressional-wave (P-wave) and shear-wave (S-wave) transducers in aluminum cylinders reveal waveform features with high amplitudes and with velocities that depend on the feature's dominant frequency. In a given waveform, high-frequency features generally arrive earlier than low-frequency features, typical for normal mode propagation. To analyze these waveforms, the elastic equation is solved in a cylindrical coordinate system for the high-frequency case in which the acoustic wavelength is small compared to the cylinder geometry, and the surrounding medium is air. Dispersive P- and S-wave normal mode propagations are predicted to exist, but owing to complex interference patterns inside a cylinder, the phase and group velocities are not smooth functions of frequency. To assess the normal mode group velocities and relative amplitudes, approximate dispersion relations are derived using Bessel functions. The utility of the normal mode theory and approximations from a theoretical and experimental standpoint are demonstrated by showing how the sequence of P- and S-wave normal mode arrivals can vary between samples of different size, and how fundamental normal modes can be mistaken for the faster, but significantly smaller amplitude, P- and S-body waves from which P- and S-wave speeds are calculated.

  18. ["Normal pressure" hydrocephalus].

    PubMed

    Philippon, Jacques

    2005-03-01

    Normal pressure hydrocephalus (NPH) or, more precisely, chronic adult hydrocephalus, is a complex condition. Even if the basic mechanism is an impediment to CSF absorption, the underlying pathology is heterogeneous. In secondary NPH, the disruption of normal CSF pathways following meningitis or sub-arachnoid haemorrhage is responsible for ventricular dilatation. However, in about half of the cases the etiology remains obscure. NPH is more frequently found in elderly people, probably in relation to the increased incidence of cerebrovascular disease. The diagnosis of NPH is based upon a triad of clinical symptoms. The main symptom is gait disturbance, followed by urinary incontinence and various degrees of cognitive change; the latter two symptoms are not prerequisites for the diagnosis. Radiological ventricular dilatation without cortical sulcal enlargement is a key factor, as is substantial clinical improvement after CSF withdrawal (CSF tap test). Other CSF dynamic studies and various imaging investigations have been proposed to improve diagnostic accuracy, but no simple test can predict the results of CSF drainage. The current treatment is ventriculo-peritoneal shunting, ideally using an adjustable valve. Results are directly dependent upon the accuracy of the preoperative diagnosis. Post-surgical complications may be observed in about 10% of cases.

  19. Self-Monitoring of Listening Abilities in Normal-Hearing Children, Normal-Hearing Adults, and Children with Cochlear Implants

    PubMed Central

    Rothpletz, Ann M.; Wightman, Frederic L.; Kistler, Doris J.

    2012-01-01

    Background Self-monitoring has been shown to be an essential skill for various aspects of our lives, including our health, education, and interpersonal relationships. Likewise, the ability to monitor one’s speech reception in noisy environments may be a fundamental skill for communication, particularly for those who are often confronted with challenging listening environments, such as students and children with hearing loss. Purpose The purpose of this project was to determine if normal-hearing children, normal-hearing adults, and children with cochlear implants can monitor their listening ability in noise and recognize when they are not able to perceive spoken messages. Research Design Participants were administered an Objective-Subjective listening task in which their subjective judgments of their ability to understand sentences from the Coordinate Response Measure corpus presented in speech spectrum noise were compared to their objective performance on the same task. Study Sample Participants included 41 normal-hearing children, 35 normal-hearing adults, and 10 children with cochlear implants. Data Collection and Analysis On the Objective-Subjective listening task, the level of the masker noise remained constant at 63 dB SPL, while the level of the target sentences varied over a 12 dB range in a block of trials. Psychometric functions, relating proportion correct (Objective condition) and proportion perceived as intelligible (Subjective condition) to target/masker ratio (T/M), were estimated for each participant. Thresholds were defined as the T/M required to produce 51% correct (Objective condition) and 51% perceived as intelligible (Subjective condition). Discrepancy scores between listeners’ threshold estimates in the Objective and Subjective conditions served as an index of self-monitoring ability. In addition, the normal-hearing children were administered tests of cognitive skills and academic achievement, and results from these measures were compared
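The thresholds described (the T/M ratio producing 51% correct, or 51% perceived as intelligible) can be estimated from a measured psychometric function. A minimal sketch using linear interpolation between measured points (the study presumably fit smooth psychometric functions, so this is a deliberate simplification; the discrepancy score is then just the difference between the Objective and Subjective threshold estimates):

```python
def threshold_at(levels, proportions, criterion=0.51):
    """Interpolate the target/masker ratio (dB) at which the proportion crosses the criterion."""
    pairs = sorted(zip(levels, proportions))
    for (x0, p0), (x1, p1) in zip(pairs, pairs[1:]):
        if p0 <= criterion <= p1:
            # Linear interpolation between the two bracketing measurements.
            return x0 + (criterion - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("criterion not bracketed by the measured proportions")
```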

  20. Quantitative computed tomography determined regional lung mechanics in normal nonsmokers, normal smokers and metastatic sarcoma subjects.

    PubMed

    Choi, Jiwoong; Hoffman, Eric A; Lin, Ching-Long; Milhem, Mohammed M; Tessier, Jean; Newell, John D

    2017-01-01

    Extra-thoracic tumors send out pilot cells that attach to the pulmonary endothelium. We hypothesized that this could alter regional lung mechanics (tissue stiffening or accumulation of fluid and inflammatory cells) through interactions with host cells. We explored this with serial inspiratory computed tomography (CT) and image matching to assess regional changes in lung expansion. We retrospectively assessed 44 pairs of serial CT scans on 21 sarcoma patients: 12 without lung metastases and 9 with lung metastases. For each subject, two or more serial inspiratory clinically derived CT scans were retrospectively collected. Two research-derived control groups were included: 7 normal nonsmokers and 12 asymptomatic smokers, with two inspiratory scans taken the same day or one year apart, respectively. We performed image registration for local-to-local matching of scans to baseline, and derived local expansion and density changes at an acinar scale. Welch's two-sample t test was used for comparison between groups. Statistical significance was determined with a p value < 0.05. Lung regions of metastatic sarcoma patients (but not the normal control group) demonstrated an increased proportion of normalized lung expansion between the first and second CT. These hyper-expanded regions were associated with, but not limited to, visible metastatic lung lesions. Compared with the normal control group, the percent of increased normalized hyper-expanded lung in sarcoma subjects was significantly increased (p < 0.05). There was also evidence of increased lung "tissue" volume (non-air components) in the hyper-expanded regions of the cancer subjects relative to non-hyper-expanded regions. "Tissue" volume increase was present in the hyper-expanded regions of metastatic and non-metastatic sarcoma subjects. This putatively could represent regional inflammation related to tumor pilot cell-host interactions. This new quantitative CT (QCT) method for linking serial

  1. About normal distribution on SO(3) group in texture analysis

    NASA Astrophysics Data System (ADS)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group, which are used in texture analysis. Those NDs are: Fisher normal distribution (FND), Bunge normal distribution (BND), central normal distribution (CND) and wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. CND is a subcase of normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). WND is motivated by the CLT in R^3 and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both CND and WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.
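The wrapped construction mentioned above (a normal distribution in R^3 mapped onto SO(3)) can be illustrated with the exponential map: draw an axis-angle vector from N(0, σ²I) and convert it to a rotation matrix with Rodrigues' formula. This is a generic sketch of that mapping, not the paper's exact Monte Carlo procedure:

```python
import math
import random

def rodrigues(v):
    """Map an axis-angle vector v in R^3 to a rotation matrix (exponential map on SO(3))."""
    theta = math.sqrt(sum(c * c for c in v))
    if theta < 1e-12:
        return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    kx, ky, kz = (c / theta for c in v)
    c, s = math.cos(theta), math.sin(theta)
    C = 1.0 - c
    return [
        [c + kx * kx * C,      kx * ky * C - kz * s,  kx * kz * C + ky * s],
        [ky * kx * C + kz * s, c + ky * ky * C,       ky * kz * C - kx * s],
        [kz * kx * C - ky * s, kz * ky * C + kx * s,  c + kz * kz * C],
    ]

def sample_wrapped_normal_rotation(sigma, rng=random):
    """Draw v ~ N(0, sigma^2 I) in R^3 and wrap it onto SO(3) via the exponential map."""
    v = [rng.gauss(0.0, sigma) for _ in range(3)]
    return rodrigues(v)
```

Any matrix produced this way is a proper rotation (orthogonal, determinant +1), which is what makes the wrapped construction usable for sampling orientations.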

  2. Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.

    PubMed

    Sznitman, Sharon R; Taubman, Danielle S

    2016-09-01

    Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.

  3. Normal Birth: Two Stories

    PubMed Central

    Scaer, Roberta M.

    2002-01-01

    The author shares two stories: one of a normal birth that took place in a hospital with a nurse-midwife in attendance and another of a home birth unexpectedly shared by many colleagues. Both are told with the goal to inform, inspire, and educate. PMID:17273292

  4. Oxacillin-resistant coagulase-negative staphylococci (CoNS) bacteremia in a general hospital at São Paulo city, Brasil

    PubMed Central

    d’Azevedo, P.A.; Secchi, C.; Antunes, A.L.S.; Sales, T.; Silva, F.M.; Tranchesi, R.; Pignatari, A.C.C.

    2008-01-01

    In the last decades, coagulase-negative staphylococci (CoNS), especially Staphylococcus epidermidis, have become an important cause of bloodstream infections. In addition, rates of methicillin resistance among CoNS have increased substantially, leading to the use of glycopeptides for therapy. The objective of this study was to evaluate eleven consecutive clinically relevant cases of oxacillin-resistant CoNS bacteremia in a general hospital located in São Paulo city, Brazil. Five different species were identified by different phenotypic methods, including S. epidermidis (5), S. haemolyticus (3), S. hominis (1), S. warneri (1) and S. cohnii subsp. urealyticus (1). A variety of Pulsed Field Gel Electrophoresis profiles was observed by macrorestriction DNA analysis in S. epidermidis isolates, but two of three S. haemolyticus isolates presented the same profile. These data indicate the heterogeneity of the CoNS isolates, suggesting that horizontal dissemination of these microorganisms in the investigated hospital was not frequent. One S. epidermidis and one S. haemolyticus isolate were resistant to teicoplanin and susceptible to vancomycin. The selective pressure due to the use of teicoplanin in this hospital is relevant. PMID:24031279

  5. Mutual regulation of tumour vessel normalization and immunostimulatory reprogramming.

    PubMed

    Tian, Lin; Goldstein, Amit; Wang, Hai; Ching Lo, Hin; Sun Kim, Ik; Welte, Thomas; Sheng, Kuanwei; Dobrolecki, Lacey E; Zhang, Xiaomei; Putluri, Nagireddy; Phung, Thuy L; Mani, Sendurai A; Stossi, Fabio; Sreekumar, Arun; Mancini, Michael A; Decker, William K; Zong, Chenghang; Lewis, Michael T; Zhang, Xiang H-F

    2017-04-13

    Blockade of angiogenesis can retard tumour growth, but may also paradoxically increase metastasis. This paradox may be resolved by vessel normalization, which involves increased pericyte coverage, improved tumour vessel perfusion, reduced vascular permeability, and consequently mitigated hypoxia. Although these processes alter tumour progression, their regulation is poorly understood. Here we show that type 1 T helper (TH1) cells play a crucial role in vessel normalization. Bioinformatic analyses revealed that gene expression features related to vessel normalization correlate with immunostimulatory pathways, especially T lymphocyte infiltration or activity. To delineate the causal relationship, we used various mouse models with vessel normalization or T lymphocyte deficiencies. Although disruption of vessel normalization reduced T lymphocyte infiltration as expected, reciprocal depletion or inactivation of CD4+ T lymphocytes decreased vessel normalization, indicating a mutually regulatory loop. In addition, activation of CD4+ T lymphocytes by immune checkpoint blockade increased vessel normalization. TH1 cells that secrete interferon-γ are a major population of cells associated with vessel normalization. Patient-derived xenograft tumours growing in immunodeficient mice exhibited enhanced hypoxia compared to the original tumours in immunocompetent humans, and hypoxia was reduced by adoptive TH1 transfer. Our findings elucidate an unexpected role of TH1 cells in vasculature and immune reprogramming. TH1 cells may be a marker and a determinant of both immune checkpoint blockade and anti-angiogenesis efficacy.

  6. 40 CFR 227.27 - Limiting permissible con-cen-tra-tion (LPC).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Limiting permissible con-cen-tra-tion... scientific literature or accepted by EPA as being reliable test organisms to determine the anticipated impact... for each type they represent, and that are documented in the scientific literature and accepted by EPA...

  7. Luminosity profiles in galaxies with Seyfert 1 type nuclei

    NASA Astrophysics Data System (ADS)

    Boris, N.; Rodriguez-Ardilla, A. A.; Pastoriza, M. G.

    We present CCD images in the BVI and Hα filters of a sample of 10 Seyfert 1 and Narrow Line Seyfert 1 galaxies. Recent observations show a significant difference in the optical spectral index between NLS1s and normal Sy1s, with the former being of order 2. Another important characteristic is that most NLS1s show FeII/Hβ ratios higher than those observed in other Sy1s. From the photometric point of view, these galaxies have not been studied before. We present total magnitudes, luminosity profiles and color maps, together with a detailed analysis of star formation in these objects. We find that a bulge + disk decomposition adequately represents the luminosity profiles of the sample galaxies. However, in all cases the disk must have a hole at its center. The radius of this hole ranges from 3 to 9 kpc. Although we do not yet have an explanation for this fact, the holes appear to be associated with circumnuclear rings of high obscuration, E(B-V) ~ 1. The profiles also show a strong color gradient, becoming notably bluer toward the nuclear region. The objects in the sample cover the full range of morphological types; nevertheless, we find no star-forming regions in the outer parts of the galaxies. Star formation is confined to the nuclear region and is dated at around 5 x 10^7 years.

  8. Phenformin-induced Hypoglycaemia in Normal Subjects*

    PubMed Central

    Lyngsøe, J.; Trap-Jensen, J.

    1969-01-01

    Study of the effect of phenformin on the blood glucose level in normal subjects before and during 70 hours of starvation showed a statistically significant hypoglycaemic effect after 40 hours of starvation. This effect was not due to increased glucose utilization. Another finding in this study was a statistically significant decrease in total urinary nitrogen excretion during starvation in subjects given phenformin. These findings show that the hypoglycaemic effect of phenformin in starved normal subjects is due to inhibition of gluconeogenesis. PMID:5780431

  9. Generalized approach for using unbiased symmetric metrics with negative values: normalized mean bias factor and normalized mean absolute error factor

    EPA Science Inventory

    Unbiased symmetric metrics provide a useful measure to quickly compare two datasets, with similar interpretations for both under and overestimations. Two examples include the normalized mean bias factor and normalized mean absolute error factor. However, the original formulations...
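One published formulation of these two metrics (following Yu et al.'s unbiased symmetric metrics) can be sketched as below. Treat the exact definitions as an assumption, since the abstract does not spell them out, and note that this sketch does not include the generalization for datasets containing negative values that the work proposes:

```python
def nmbf(model, obs):
    """Normalized mean bias factor: symmetric in magnitude for under- and overestimation."""
    sm, so = sum(model), sum(obs)
    return sm / so - 1.0 if sm >= so else 1.0 - so / sm

def nmaef(model, obs):
    """Normalized mean absolute error factor: absolute error normalized by the lower-mean dataset."""
    sm, so = sum(model), sum(obs)
    sae = sum(abs(m - o) for m, o in zip(model, obs))
    return sae / so if sm >= so else sae / sm
```

The symmetry is visible directly: overestimating by a factor of two and underestimating by a factor of two give NMBF values of +1 and -1 respectively.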

  10. Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?

    ERIC Educational Resources Information Center

    Thompson, Bruce

    Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…

  11. [Primary culture of human normal epithelial cells].

    PubMed

    Tang, Yu; Xu, Wenji; Guo, Wanbei; Xie, Ming; Fang, Huilong; Chen, Chen; Zhou, Jun

    2017-11-28

    Traditional primary culture methods for normal human epithelial cells have the disadvantages of low activity of the cultured cells, a low cultivation success rate, and complicated operation. To solve these problems, researchers have studied the culture process of normal human primary epithelial cells extensively. In this paper, we mainly introduce methods used in the separation and purification of normal human epithelial cells, such as tissue separation, enzyme digestion, mechanical brushing, red blood cell lysis, and Percoll density gradient separation. We also review methods used in culture and subculture, including serum-free medium combined with low-serum culture, mouse tail collagen coating, and glass culture bottles combined with plastic culture dishes. The biological characteristics of normal human epithelial cells and the methods of immunocytochemical staining and trypan blue exclusion are described. Moreover, the factors affecting culture are summarized, including aseptic operation, the conditions of the extracellular environment during culture, the number of differential adhesion steps, and the selection and dosage of additives.

  12. [Mental health in natural disasters: intervention strategies with older adults in rural areas of Chile].

    PubMed

    Osorio-Parraguez, Paulina; Espinoza, Adriana

    2016-06-01

    This article presents an intervention strategy carried out with older adults in the commune of Paredones, in Chile's sixth region, after the earthquake and tsunami of February 27, 2010, in Chile, in the context of a study of the strengths and vulnerabilities displayed by this age group in the aftermath of a natural disaster. We describe the methodological development of the intervention and the theoretical and conceptual foundations on which it is based. As a result of this process, we propose a strategy that works through the identification of the subjects' own experiences and strengths. In this way, the negative effects of the social determinants of health (such as age and place of residence) are minimized in a context of crisis, allowing older adults to strengthen their individual and collective resources in support of their psychosocial well-being. © The Author(s) 2015.

  13. Neuropathological and neuropsychological changes in "normal" aging: evidence for preclinical Alzheimer disease in cognitively normal individuals.

    PubMed

    Hulette, C M; Welsh-Bohmer, K A; Murray, M G; Saunders, A M; Mash, D C; McIntyre, L M

    1998-12-01

    The presence of diffuse or primitive senile plaques in the neocortex of cognitively normal elderly at autopsy has been presumed to represent normal aging. Alternatively, these patients may have developed dementia and clinical Alzheimer disease (AD) if they had survived. In this setting, these patients could be subjects for cognitive or pharmacologic intervention to delay disease onset. We have thus followed a cohort of cognitively normal elderly subjects with a Clinical Dementia Rating (CDR) of 0 at autopsy. Thirty-one brains were examined at postmortem according to Consortium to Establish a Registry for Alzheimer Disease (CERAD) criteria and staged according to Braak. Ten patients were pathologically normal according to CERAD criteria (1a). Two of these patients were Braak Stage II. Seven very elderly subjects exhibited a few primitive neuritic plaques in the cortex and thus represented CERAD 1b. These individuals ranged in age from 85 to 105 years and were thus older than the CERAD 1a group, which ranged in age from 72 to 93. Fourteen patients displayed Possible AD according to CERAD, with ages ranging from 66 to 95. Three of these were Braak Stage I, 4 were Braak Stage II, and 7 were Braak Stage III. The Apolipoprotein E4 allele was over-represented in this possible AD group. Neuropsychological data were available on 12 individuals. In these 12 individuals, Possible AD at autopsy could be predicted by cognitive deficits in 1 or more areas, including savings scores on memory testing and overall performance on some measures of frontal executive function.

  14. Laser-induced differential normalized fluorescence method for cancer diagnosis

    DOEpatents

    Vo-Dinh, Tuan; Panjehpour, Masoud; Overholt, Bergein F.

    1996-01-01

    An apparatus and method for cancer diagnosis are disclosed. The diagnostic method includes the steps of irradiating a tissue sample with monochromatic excitation light, producing a laser-induced fluorescence spectrum from emission radiation generated by interaction of the excitation light with the tissue sample, and dividing the intensity at each wavelength of the laser-induced fluorescence spectrum by the integrated area under the laser-induced fluorescence spectrum to produce a normalized spectrum. A mathematical difference between the normalized spectrum and an average value of a reference set of normalized spectra which correspond to normal tissues is calculated, which provides for amplifying small changes in weak signals from malignant tissues for improved analysis. The calculated differential normalized spectrum is correlated to a specific condition of a tissue sample.
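The normalization-and-difference steps described translate directly into a short sketch (pure Python, with a simple sum standing in for the integrated area under the spectrum; the function names are illustrative, not from the patent):

```python
def normalize_spectrum(intensities):
    """Divide each point by the integrated area (approximated here by a sum),
    so spectra of different absolute brightness become comparable."""
    area = sum(intensities)
    return [x / area for x in intensities]

def differential_normalized_spectrum(sample, reference_set):
    """Subtract the mean of the area-normalized reference (normal-tissue) spectra
    from the area-normalized sample spectrum."""
    ns = normalize_spectrum(sample)
    refs = [normalize_spectrum(r) for r in reference_set]
    mean_ref = [sum(col) / len(refs) for col in zip(*refs)]
    return [s - m for s, m in zip(ns, mean_ref)]
```

Because the area normalization removes overall intensity, a sample spectrum that is merely a scaled copy of the reference yields a zero differential spectrum; only shape differences survive, which is the amplification of weak malignant-tissue signals the abstract describes.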

  15. Normal Force and Drag Force in Magnetorheological Finishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miao, C.; Shafrir, S.N.; Lambropoulos, J.C.

    2010-01-13

    The material removal in magnetorheological finishing (MRF) is known to be controlled by shear stress, tau, which equals drag force, Fd, divided by spot area, As. However, it is unclear how the normal force, Fn, affects the material removal in MRF and how the measured ratio of drag force to normal force Fd/Fn, equivalent to coefficient of friction, is related to material removal. This work studies, for the first time for MRF, the normal force and the measured ratio Fd/Fn as a function of material mechanical properties. Experimental data were obtained by taking spots on a variety of materials including optical glasses and hard ceramics with a spot-taking machine (STM). Drag force and normal force were measured with a dual load cell. Drag force decreases linearly with increasing material hardness. In contrast, normal force increases with hardness for glasses, saturating at high hardness values for ceramics. Volumetric removal rate decreases with normal force across all materials. The measured ratio Fd/Fn shows a strong negative linear correlation with material hardness. Hard materials exhibit a low "coefficient of friction". The volumetric removal rate increases with the measured ratio Fd/Fn which is also correlated with shear stress, indicating that the measured ratio Fd/Fn is a useful measure of material removal in MRF.

  16. Ultraviolet Spectra of Normal Spiral Galaxies

    NASA Technical Reports Server (NTRS)

    Kinney, Anne

    1997-01-01

    The data related to this grant on the Ultraviolet Spectra of Normal Spiral Galaxies have been entirely reduced and analyzed. It is incorporated into templates of Spiral galaxies used in the calculation of K corrections towards the understanding of high redshift galaxies. The main paper was published in the Astrophysical Journal, August 1996, Volume 467, page 38. The data was also used in another publication, The Spectral Energy Distribution of Normal Starburst and Active Galaxies, June 1997, preprint series No. 1158. Copies of both have been attached.

  17. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalization and the normal form of finite-dimensional complete analytic integrable dynamical systems. In more detail, we will prove that any complete analytic integrable diffeomorphism F(x)=Bx+f(x) in (C^n,0), with B having eigenvalues not of modulus 1 and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. Meanwhile, we also prove that any complete analytic integrable differential system ẋ=Ax+f(x) in (C^n,0), with A having nonzero eigenvalues and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. Furthermore we will prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results improve on those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve the ones in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in that our paper presents the concrete expression of the normal form in a restricted case.

  18. Normal-range verbal-declarative memory in schizophrenia.

    PubMed

    Heinrichs, R Walter; Parlar, Melissa; Pinnock, Farena

    2017-10-01

    Cognitive impairment is prevalent and related to functional outcome in schizophrenia, but a significant minority of the patient population overlaps with healthy controls on many performance measures, including declarative-verbal-memory tasks. In this study, we assessed the validity, clinical, and functional implications of normal-range (NR), verbal-declarative memory in schizophrenia. Performance normality was defined using normative data for 8 basic California Verbal Learning Test (CVLT-II; Delis, Kramer, Kaplan, & Ober, 2000) recall and recognition trials. Schizophrenia patients (n = 155) and healthy control participants (n = 74) were assessed for performance normality, defined as scores within 1 SD of the normative mean on all 8 trials, and assigned to normal- and below-NR memory groups. NR schizophrenia patients (n = 26) and control participants (n = 51) did not differ in general verbal ability, on a reading-based estimate of premorbid ability, across all 8 CVLT-II-score comparisons or in terms of intrusion and false-positive errors and auditory working memory. NR memory patients did not differ from memory-impaired patients (n = 129) in symptom severity, and both patient groups were significantly and similarly disabled in terms of functional status in the community. These results confirm a subpopulation of schizophrenia patients with normal, verbal-declarative-memory performance and no evidence of decline from higher premorbid ability levels. However, NR patients did not experience less severe psychopathology, nor did they show advantage in community adjustment relative to impaired patients. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
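The performance-normality rule used above (scores within 1 SD of the normative mean on all 8 CVLT-II trials) is easy to state as code. A minimal sketch with illustrative parameter names:

```python
def is_normal_range(scores, norm_means, norm_sds):
    """True if every trial score falls within 1 SD of its normative mean
    (the study's normal-range criterion, applied across all trials)."""
    return all(abs(s - m) <= sd
               for s, m, sd in zip(scores, norm_means, norm_sds))
```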

  19. Normalization of a chromosomal contact map.

    PubMed

    Cournac, Axel; Marie-Nelly, Hervé; Marbouty, Martial; Koszul, Romain; Mozziconacci, Julien

    2012-08-30

    Chromatin organization has been increasingly studied in relation to its important influence on DNA-related metabolic processes such as replication or regulation of gene expression. Since its original design ten years ago, chromosome conformation capture (3C) has become an essential tool to investigate the overall conformation of chromosomes. It relies on the capture of long-range trans and cis interactions of chromosomal segments whose relative proportions in the final bank reflect their frequencies of interaction, hence their spatial proximity in a population of cells. The recent coupling of 3C with deep sequencing approaches now allows the generation of high-resolution genome-wide chromosomal contact maps. Different protocols have been used to generate such maps in various organisms, including mammals, drosophila and yeast. The massive amount of raw data generated by genomic 3C has to be carefully processed to alleviate the various biases and byproducts generated by the experiments. Our study aims at proposing a simple normalization procedure to minimize the influence of these unwanted but inevitable events on the final results. Careful analysis of the raw data generated previously for the budding yeast S. cerevisiae led to the identification of three main biases affecting the final datasets, including a previously unknown bias resulting from the circularization of DNA molecules. We then developed a simple normalization procedure to process the data and allow the generation of a normalized, highly contrasted chromosomal contact map for S. cerevisiae. The same method was then extended to the first human genome contact map. Using the normalized data, we revisited the preferential interactions originally described between subsets of discrete chromosomal features. Notably, the detection of preferential interactions between tRNA genes in yeast, and between CTCF and Pol II binding sites in human, can vary with the normalization procedure used.

  20. Normal forms for Hopf-Zero singularities with nonconservative nonlinear part

    NASA Astrophysics Data System (ADS)

    Gazor, Majid; Mokhtari, Fahimeh; Sanders, Jan A.

    In this paper we are concerned with the simplest normal form computation of the systems ẋ = 2xf(x, y² + z²), ẏ = z + yf(x, y² + z²), ż = -y + zf(x, y² + z²), where f is a formal function with real coefficients and without any constant term. These are the classical normal forms of a larger family of systems with Hopf-zero singularity. Indeed, they are defined such that this family is a Lie subalgebra of the space of all classical normal form vector fields with Hopf-zero singularity. The simplest normal forms and simplest orbital normal forms of this family with nonzero quadratic part are computed. We also obtain the simplest parametric normal form of any nondegenerate perturbation of this family within the Lie subalgebra. The symmetry group of the simplest normal forms is also discussed. This is part of our results on decomposing the normal forms of Hopf-zero singular systems into systems with a first integral and nonconservative systems.

  1. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each of them can influence the quantitative results significantly and thus should be performed with great care. Moreover, the total sample amount or concentration of metabolites can be significantly different from one sample to another, so it is critical to reduce or eliminate the effect of total sample amount variation on the quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
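
    One of the simplest strategies of the kind the review discusses, shown here only as an illustration (the review surveys more sophisticated MS-specific methods), is normalization to the total signal of each sample:

```python
import numpy as np

def total_sum_normalize(X):
    """Scale each sample (row) so its metabolite intensities sum to 1.

    A minimal illustration of sample normalization: it removes
    differences in total sample amount, so that remaining differences
    reflect relative metabolite composition.
    """
    X = np.asarray(X, dtype=float)
    totals = X.sum(axis=1, keepdims=True)
    return X / totals

# Two samples with identical composition, collected at different amounts:
samples = np.array([[10.0, 30.0, 60.0],
                    [20.0, 60.0, 120.0]])
normalized = total_sum_normalize(samples)  # rows become identical
```

    After this step, differences between samples reflect relative metabolite composition rather than total sample amount.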

  2. Role of the normal gut microbiota.

    PubMed

    Jandhyala, Sai Manasa; Talukdar, Rupjyoti; Subramanyam, Chivkula; Vuyyuru, Harish; Sasikala, Mitnala; Nageshwar Reddy, D

    2015-08-07

    The relation between the gut microbiota and human health is being increasingly recognised. It is now well established that a healthy gut flora is largely responsible for the overall health of the host. The normal human gut microbiota comprises two major phyla, namely Bacteroidetes and Firmicutes. Though the gut microbiota in an infant appears haphazard, it starts resembling the adult flora by the age of 3 years. Nevertheless, there exist temporal and spatial variations in the microbial distribution from the esophagus to the rectum throughout an individual's life span. Developments in genome sequencing technologies and bioinformatics have now enabled scientists to study these microorganisms, their functions, and microbe-host interactions in an elaborate manner, both in health and disease. The normal gut microbiota imparts specific functions in host nutrient metabolism, xenobiotic and drug metabolism, maintenance of the structural integrity of the gut mucosal barrier, immunomodulation, and protection against pathogens. Several factors play a role in shaping the normal gut microbiota. They include (1) the mode of delivery (vaginal or caesarean); (2) diet during infancy (breast milk or formula feeds) and adulthood (vegan based or meat based); and (3) use of antibiotics or antibiotic-like molecules that are derived from the environment or the gut commensal community. A major concern of antibiotic use is the long-term alteration of the normal healthy gut microbiota and horizontal transfer of resistance genes that could result in a reservoir of organisms with a multidrug-resistant gene pool.

  3. Laser-induced differential normalized fluorescence method for cancer diagnosis

    DOEpatents

    Vo-Dinh, T.; Panjehpour, M.; Overholt, B.F.

    1996-12-03

    An apparatus and method for cancer diagnosis are disclosed. The diagnostic method includes the steps of irradiating a tissue sample with monochromatic excitation light, producing a laser-induced fluorescence spectrum from emission radiation generated by interaction of the excitation light with the tissue sample, and dividing the intensity at each wavelength of the laser-induced fluorescence spectrum by the integrated area under the laser-induced fluorescence spectrum to produce a normalized spectrum. A mathematical difference between the normalized spectrum and an average value of a reference set of normalized spectra which correspond to normal tissues is calculated, which provides for amplifying small changes in weak signals from malignant tissues for improved analysis. The calculated differential normalized spectrum is correlated to a specific condition of a tissue sample. 5 figs.
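
    The two normalization steps described above can be sketched as follows; function and variable names are illustrative, not from the patent, and the integrated area is approximated by a simple sum over an equally spaced wavelength grid:

```python
import numpy as np

def differential_normalized_spectrum(spectrum, reference_spectra):
    """Sketch of the patented procedure (names are illustrative).

    1. Normalize: divide the intensity at each wavelength by the
       integrated area under the spectrum (approximated by the sum,
       assuming an equally spaced wavelength grid).
    2. Difference: subtract the mean of a reference set of normalized
       spectra obtained from normal tissue.
    """
    spectrum = np.asarray(spectrum, dtype=float)
    normalized = spectrum / spectrum.sum()
    refs = np.asarray(reference_spectra, dtype=float)
    refs_normalized = refs / refs.sum(axis=1, keepdims=True)
    return normalized - refs_normalized.mean(axis=0)
```

    Because both the sample and reference spectra are area-normalized first, overall intensity differences cancel and only spectral-shape changes remain, which is what amplifies the small changes in weak signals from malignant tissue.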

  4. A general approach to double-moment normalization of drop size distributions

    NASA Astrophysics Data System (ADS)

    Lee, G. W.; Sempere-Torres, D.; Uijlenhoet, R.; Zawadzki, I.

    2003-04-01

    Normalization of drop size distributions (DSDs) is re-examined here. First, we present an extension of the scaling normalization that uses one moment of the DSD as a parameter (as introduced by Sempere-Torres et al., 1994) to a scaling normalization that uses two moments as parameters. It is shown that the normalization of Testud et al. (2001) is a particular case of the two-moment scaling normalization. This provides a unified view of DSD normalization and a good model representation of DSDs. Data analysis shows that, from the point of view of moment estimation, least squares regression is slightly more effective than moment estimation from the normalized average DSD.

  5. Catálogo de Radio-Fuentes Opticas con Astrolabio Fotoeléctrico PAII

    NASA Astrophysics Data System (ADS)

    Manrique, W. T.; Podestá, R. C.; Alonso, E.; Actis, E. V.; Pacheco, A. M.; Bustos, G.; Lizhi, L.; Zezhi, W.; Fanmiao, Z.; Hongqi, W.; Perdomo, R.

    Using the observations made at San Juan with the Photoelectric Astrolabe PAII from February 1992 to March 1997, the San Juan Catalog of Optical Radio Sources, containing 69 stars, has been compiled. The observed positions of the radio sources are given for the epoch and equinox J2000.0, in a system close to that of the FK5. The mean precisions are ±2.2 ms in right ascension and ±0.035" in declination, respectively. The stellar magnitudes range from 0.9 to 10.7, and the declinations from -2.5 degrees to -60 degrees. The mean epoch is 1995.1. The results are also compared with the Hipparcos Catalogue.

  6. Toward the optimization of normalized graph Laplacian.

    PubMed

    Xie, Bo; Wang, Meng; Tao, Dacheng

    2011-04-01

    Normalized graph Laplacian has been widely used in many practical machine learning algorithms, e.g., spectral clustering and semisupervised learning. However, all of them use the Euclidean distance to construct the graph Laplacian, which does not necessarily reflect the inherent distribution of the data. In this brief, we propose a method to directly optimize the normalized graph Laplacian by using pairwise constraints. The learned graph is consistent with equivalence and nonequivalence pairwise relationships, and thus it can better represent similarity between samples. Meanwhile, our approach, unlike metric learning, automatically determines the scale factor during the optimization. The learned normalized Laplacian matrix can be directly applied in spectral clustering and semisupervised learning algorithms. Comprehensive experiments demonstrate the effectiveness of the proposed approach.
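
    For reference, the normalized graph Laplacian that the brief optimizes is the standard construction L = I - D^(-1/2) W D^(-1/2); a minimal sketch of that construction (not of the pairwise-constraint optimization itself):

```python
import numpy as np

def normalized_laplacian(W):
    """Symmetric normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2}
    for a nonnegative, symmetric affinity matrix W, with D the diagonal
    degree matrix. Isolated nodes (zero degree) are left untouched.
    """
    W = np.asarray(W, dtype=float)
    d = W.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(np.where(d > 0, d, 1.0)), 0.0)
    return np.eye(len(W)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]

# 3-node path graph as a small example
W = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
L = normalized_laplacian(W)
```

    Its eigenvalues lie in [0, 2] with 0 always attained, which is the property spectral clustering and graph-based semisupervised learning exploit.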

  7. The Effect of Normalization in Violence Video Classification Performance

    NASA Astrophysics Data System (ADS)

    Ali, Ashikin; Senan, Norhalina

    2017-08-01

    Data pre-processing is an important part of data mining, and normalization is a pre-processing stage for many problem types, particularly video classification. Video classification is challenging because of heterogeneous content, large variations in video quality, and the complex semantic meanings of the concepts involved. A thorough pre-processing stage that includes normalization therefore aids the robustness of classification performance. Normalization scales all numeric variables into a certain range to make them more meaningful for subsequent phases of the data mining process. This paper examines the effect of two normalization techniques, min-max normalization and Z-score, on the classification rate of violence video classification using a multi-layer perceptron (MLP) classifier. With min-max normalization to the range [0,1], accuracy reaches almost 98%; with min-max normalization to the range [-1,1], accuracy is 59%; and with Z-score normalization, accuracy is 50%.
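
    The two techniques compared in the paper are standard; a minimal sketch (the feature values are illustrative):

```python
import numpy as np

def min_max(x, lo=0.0, hi=1.0):
    """Min-max normalization: rescale values linearly to the range [lo, hi]."""
    x = np.asarray(x, dtype=float)
    return lo + (x - x.min()) * (hi - lo) / (x.max() - x.min())

def z_score(x):
    """Z-score normalization: shift and scale to zero mean, unit variance."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

features = np.array([2.0, 4.0, 6.0, 10.0])
scaled01 = min_max(features)              # range [0, 1]
scaled11 = min_max(features, -1.0, 1.0)   # range [-1, 1]
standardized = z_score(features)          # mean 0, std 1
```

    Both map features onto comparable scales before training the MLP; they differ in that min-max bounds the output range exactly, while Z-score centers on the data's mean and standard deviation.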

  8. Mitochondrial dysfunction in myocardium obtained from clinically normal dogs, clinically normal anesthetized dogs, and dogs with dilated cardiomyopathy.

    PubMed

    Sleeper, Meg M; Rosato, Bradley P; Bansal, Seema; Avadhani, Narayan G

    2012-11-01

    To compare mitochondrial complex I and complex IV activity in myocardial mitochondria of clinically normal dogs, clinically normal dogs exposed to inhalation anesthesia, and dogs affected with dilated cardiomyopathy. Myocardial samples obtained from 21 euthanized dogs (6 clinically normal [control] dogs, 5 clinically normal dogs subjected to inhalation anesthesia with isoflurane prior to euthanasia, 5 dogs with juvenile-onset dilated cardiomyopathy, and 5 dogs with adult-onset dilated cardiomyopathy). Activity of mitochondrial complex I and complex IV was assayed spectrophotometrically in isolated mitochondria from left ventricular tissue obtained from the 4 groups of dogs. Activity of complex I and complex IV was significantly decreased in anesthetized dogs, compared with activities in the control dogs and dogs with juvenile-onset or adult-onset dilated cardiomyopathy. Inhalation anesthesia disrupted the electron transport chain in the dogs, which potentially led to an outburst of reactive oxygen species that caused mitochondrial dysfunction. Inhalation anesthesia depressed mitochondrial function in dogs, similar to results reported in other species. This effect is important to consider when anesthetizing animals with myocardial disease and suggested that antioxidant treatments may be beneficial in some animals. Additionally, this effect should be considered when designing studies in which mitochondrial enzyme activity will be measured. Additional studies that include a larger number of animals are warranted.

  9. A systematic evaluation of normalization methods in quantitative label-free proteomics.

    PubMed

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2018-01-01

    To date, mass spectrometry (MS) data remain inherently biased as a result of factors ranging from sample handling to differences caused by the instrumentation. Normalization is the process that aims to account for the bias and make samples more comparable. The selection of a proper normalization method is a pivotal task for the reliability of the downstream analysis and results. Many normalization methods commonly used in proteomics have been adapted from DNA microarray analysis. Previous studies comparing normalization methods in proteomics have focused mainly on intragroup variation. In this study, several popular and widely used normalization methods representing different strategies are evaluated using three spike-in and one experimental mouse label-free proteomic data sets. The normalization methods are evaluated in terms of their ability to reduce variation between technical replicates, their effect on differential expression analysis and their effect on the estimation of logarithmic fold changes. Additionally, we examined whether normalizing the whole data globally or in segments for the differential expression analysis has an effect on the performance of the normalization methods. We found that variance stabilization normalization (Vsn) reduced variation the most between technical replicates in all examined data sets. Vsn also performed consistently well in the differential expression analysis. Linear regression normalization and local regression normalization also performed systematically well. Finally, we discuss the choice of a normalization method and some qualities of a suitable normalization method in the light of the results of our evaluation. © The Author 2016. Published by Oxford University Press.
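
    As a point of reference for the strategies evaluated, one of the simplest global approaches, median centering of log-intensities, can be sketched as follows (an illustration only, not the Vsn or regression-based methods highlighted in the study):

```python
import numpy as np

def median_normalize(log_intensities):
    """Median centering: shift each sample (column) of a log-intensity
    matrix so that all samples share the same median. This removes
    constant multiplicative biases between runs (additive on the log
    scale), the simplest kind of bias normalization targets.
    """
    X = np.asarray(log_intensities, dtype=float)
    medians = np.median(X, axis=0)
    return X - medians + medians.mean()

# Two technical replicates differing only by a constant log-scale offset:
X = np.array([[1.0, 2.0],
              [2.0, 3.0],
              [3.0, 4.0]])   # rows: proteins, columns: samples
X_norm = median_normalize(X)   # the two columns become identical
```

    Methods such as Vsn go further by also stabilizing the intensity-dependent variance, which is why they outperform simple centering on technical-replicate variation.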

  10. Verbal Processing Reaction Times in "Normal" and "Poor" Readers.

    ERIC Educational Resources Information Center

    Culbertson, Jack; And Others

    After it had been determined that reaction time (RT) was a sensitive measure of hemispheric dominance in a verbal task performed by normal adult readers, the reaction times of three groups of subjects (20 normal-reading college students, 12 normal-reading third graders, and 11 poor-reading grade school students) were compared. Ss were exposed to…

  11. 7 CFR 1794.23 - Proposals normally requiring an EA.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 12 2013-01-01 2013-01-01 false Proposals normally requiring an EA. 1794.23 Section... § 1794.23 Proposals normally requiring an EA. RUS will normally prepare an EA for all proposed actions... require an EA and shall be subject to the requirements of §§ 1794.40 through 1794.44. (a) General...

  12. 7 CFR 1794.23 - Proposals normally requiring an EA.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 12 2014-01-01 2013-01-01 true Proposals normally requiring an EA. 1794.23 Section... § 1794.23 Proposals normally requiring an EA. RUS will normally prepare an EA for all proposed actions... require an EA and shall be subject to the requirements of §§ 1794.40 through 1794.44. (a) General...

  13. 7 CFR 1794.23 - Proposals normally requiring an EA.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 12 2012-01-01 2012-01-01 false Proposals normally requiring an EA. 1794.23 Section... § 1794.23 Proposals normally requiring an EA. RUS will normally prepare an EA for all proposed actions... require an EA and shall be subject to the requirements of §§ 1794.40 through 1794.44. (a) General...

  14. Volume-preserving normal forms of Hopf-zero singularity

    NASA Astrophysics Data System (ADS)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-10-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularities. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with a first integral, which is why our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite-level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system are computed; these can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite-level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto-Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first-level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first-level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple.

  15. An Integrated Approach for RNA-seq Data Normalization.

    PubMed

    Yang, Shengping; Mercante, Donald E; Zhang, Kun; Fang, Zhide

    2016-01-01

    DNA copy number alteration is common in many cancers. Studies have shown that insertion or deletion of DNA sequences can directly alter gene expression, and significant correlation exists between DNA copy number and gene expression. Data normalization is a critical step in the analysis of gene expression generated by RNA-seq technology. Successful normalization reduces/removes unwanted nonbiological variations in the data, while keeping meaningful information intact. However, as far as we know, no attempt has been made to adjust for the variation due to DNA copy number changes in RNA-seq data normalization. In this article, we propose an integrated approach for RNA-seq data normalization. Comparisons show that the proposed normalization can improve power for downstream differentially expressed gene detection and generate more biologically meaningful results in gene profiling. In addition, our findings show that due to the effects of copy number changes, some housekeeping genes are not always suitable internal controls for studying gene expression. Using information from DNA copy number, the integrated approach is successful in reducing noise due to both biological and nonbiological causes in RNA-seq data, thus increasing the accuracy of gene profiling.

  16. Pattern Adaptation and Normalization Reweighting.

    PubMed

    Westrick, Zachary M; Heeger, David J; Landy, Michael S

    2016-09-21

    Adaptation to an oriented stimulus changes both the gain and preferred orientation of neural responses in V1. Neurons tuned near the adapted orientation are suppressed, and their preferred orientations shift away from the adapter. We propose a model in which weights of divisive normalization are dynamically adjusted to homeostatically maintain response products between pairs of neurons. We demonstrate that this adjustment can be performed by a very simple learning rule. Simulations of this model closely match existing data from visual adaptation experiments. We consider several alternative models, including variants based on homeostatic maintenance of response correlations or covariance, as well as feedforward gain-control models with multiple layers, and we demonstrate that homeostatic maintenance of response products provides the best account of the physiological data. Adaptation is a phenomenon throughout the nervous system in which neural tuning properties change in response to changes in environmental statistics. We developed a model of adaptation that combines normalization (in which a neuron's gain is reduced by the summed responses of its neighbors) and Hebbian learning (in which synaptic strength, in this case divisive normalization, is increased by correlated firing). The model is shown to account for several properties of adaptation in primary visual cortex in response to changes in the statistics of contour orientation. Copyright © 2016 the authors 0270-6474/16/369805-12$15.00/0.
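
    A minimal sketch of the divisive-normalization step the model builds on (the homeostatic reweighting rule itself is not reproduced; the functional form and parameter names below are a common textbook variant, not taken verbatim from the paper):

```python
import numpy as np

def divisive_normalization(drives, weights, sigma=0.1):
    """Divisive normalization: each neuron's response is its own input
    drive divided by a constant plus a weighted sum of the drives of
    all neurons. `weights[i, j]` is the strength with which neuron j
    suppresses neuron i; the paper's model adjusts these weights with a
    Hebbian-style rule to hold pairwise response products constant.
    """
    drives = np.asarray(drives, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return drives / (sigma**2 + weights @ drives)

# With uniform weights, equal drives give equal responses, and boosting
# one neuron's drive suppresses its neighbor's response.
w = np.ones((2, 2))
balanced = divisive_normalization([1.0, 1.0], w)
suppressed = divisive_normalization([1.0, 3.0], w)
```

    In this form, increasing any normalization weight reduces the corresponding pair of responses, which is the handle the proposed learning rule uses to restore the target response products after adaptation.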

  17. Fault stability under conditions of variable normal stress

    USGS Publications Warehouse

    Dieterich, J.H.; Linker, M.F.

    1992-01-01

    The stability of fault slip under conditions of varying normal stress is modelled as a spring and slider system with rate- and state-dependent friction. Coupling of normal stress to shear stress is achieved by inclining the spring at an angle, ψ, to the sliding surface. Linear analysis yields two conditions for unstable slip. The first, of a type previously identified for constant normal stress systems, results in instability if stiffness is below a critical value. The critical stiffness depends on normal stress, constitutive parameters, characteristic sliding distance and the spring angle. Instability of the first type is possible only for velocity-weakening friction. The second condition yields instability if the spring angle ψ < -cot⁻¹ μ_ss, where μ_ss is the steady-state sliding friction. The second condition can arise under conditions of velocity strengthening or weakening. Stability fields for finite perturbations are investigated by numerical simulation. -Authors

  18. ConSurf 2016: an improved methodology to estimate and visualize evolutionary conservation in macromolecules

    PubMed Central

    Ashkenazy, Haim; Abadi, Shiran; Martz, Eric; Chay, Ofer; Mayrose, Itay; Pupko, Tal; Ben-Tal, Nir

    2016-01-01

    The degree of evolutionary conservation of an amino acid in a protein or a nucleic acid in DNA/RNA reflects a balance between its natural tendency to mutate and the overall need to retain the structural integrity and function of the macromolecule. The ConSurf web server (http://consurf.tau.ac.il), established over 15 years ago, analyses the evolutionary pattern of the amino/nucleic acids of the macromolecule to reveal regions that are important for structure and/or function. Starting from a query sequence or structure, the server automatically collects homologues, infers their multiple sequence alignment and reconstructs a phylogenetic tree that reflects their evolutionary relations. These data are then used, within a probabilistic framework, to estimate the evolutionary rates of each sequence position. Here we introduce several new features into ConSurf, including automatic selection of the best evolutionary model used to infer the rates, the ability to homology-model query proteins, prediction of the secondary structure of query RNA molecules from sequence, the ability to view the biological assembly of a query (in addition to the single chain), mapping of the conservation grades onto 2D RNA models and an advanced view of the phylogenetic tree that enables interactively rerunning ConSurf with the taxa of a sub-tree. PMID:27166375

  19. A National Look at Postmodernism's Pros and Cons in Educational Leadership

    ERIC Educational Resources Information Center

    Townsell, Rhodena

    2007-01-01

    The purpose of this article is to take a look at the pros and cons of postmodernism. It is imperative for administrators to closely examine educational theories and practices prior to instituting changes. The ability to read and digest challenging material keeps one informed and prepared to lead effectively. This paper will list the pros and cons…

  20. 77 FR 3030 - Twenty-Eighth Meeting: RTCA Special Committee 206: Aeronautical Information and Meteorological...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-20

    ... and normal definitions Review proposed TOR changes ConUse Review February 7, 2012 ConUse Review... proposed TOR changes and new or modified Sub-groups' roles and responsibilities Decision to release ConUse...] BILLING CODE 4910-13-P ...

  1. Fragile X syndrome and an isodicentric X chromosome in a woman with multiple anomalies, developmental delay, and normal pubertal development.

    PubMed

    Freedenberg, D L; Gane, L W; Richards, C S; Lampe, M; Hills, J; O'Connor, R; Manchester, D; Taylor, A; Tassone, F; Hulseberg, D; Hagerman, R J; Patil, S R

    1999-07-30

    We report on an individual with developmental delays, short stature, skeletal abnormalities, normal pubertal development, expansion of the fragile X triplet repeat, as well as an isodicentric X chromosome. S is a 19-year-old woman who presented for evaluation of developmental delay. Pregnancy was complicated by a threatened miscarriage. She was a healthy child with intellectual impairment noted in infancy. Although she had global delays, speech was noted to be disproportionately delayed, with few words until age 3.5 years. Facial appearance was consistent with fragile X syndrome. Age of onset of menses was 11 years, with normal breast development. A maternal male second cousin had been identified with fragile X syndrome based on DNA studies. The mother of this child (S's maternal first cousin) and the grandfather (S's maternal uncle) were both intellectually normal but were identified as carrying triplet expansions in the premutation range. S's mother had some school difficulties but was not identified as having global delays. Molecular analysis of S's fragile X alleles noted an expansion of more than 400 CGG repeats in one allele. Routine cytogenetic studies of peripheral blood noted the presence of an isodicentric X in 81 of 86 cells scored. Five of 86 cells were noted to be 45,X. Cytogenetic fra(X) studies from peripheral blood showed that the structurally normal chromosome had the fragile site in approximately 16% of the cells. Analysis of maternal fragile X alleles identified an allele with an expansion to approximately 110 repeats. FMRP studies detected the expression of the protein in 24% of cells studied. To our knowledge, this is the first patient reported with an isodicentric X and fragile X syndrome. Whereas her clinical phenotype is suggestive of fragile X syndrome, her skeletal abnormalities may represent the presence of the isodicentric X. Treatment of S with 20 mg/day of Prozac improved her behavior.

  2. Bacterial microflora of normal and telangiectatic livers in cattle.

    PubMed

    Stotland, E I; Edwards, J F; Roussel, A J; Simpson, R B

    2001-07-01

    To identify potential bacterial pathogens in normal and telangiectatic livers of mature cattle at slaughter and to identify consumer risk associated with hepatic telangiectasia. 50 normal livers and 50 severely telangiectatic livers. Normal and telangiectatic livers were collected at slaughter for aerobic and anaerobic bacterial culture. Isolates were identified, and patterns of isolation were analyzed. Histologic examination of all livers was performed. Human pathogens isolated from normal and telangiectatic livers included Escherichia coli O157:H7 and group-D streptococci. Most livers in both groups contained bacteria in low numbers; however, more normal livers yielded negative culture results. More group-D streptococci were isolated from the right lobes of telangiectatic livers than from the left lobes, and more gram-negative anaerobic bacteria were isolated from left lobes of telangiectatic livers than from right lobes. All telangiectatic lesions were free of fibrosis, active necrotizing processes, and inflammation. The USDA regulation condemning telangiectatic livers is justified insofar as these livers contain more bacteria than normal livers do; however, normal livers contain similar species of microflora. Development of telangiectasia could not be linked to an infectious process. The finding of E coli O157:H7 in bovine livers suggests that information regarding bacterial content of other offal and muscle may identify sources of this and other potential foodborne pathogens and assist in establishing critical control points for the meat industry.

  3. CEC-normalized clay-water sorption isotherm

    NASA Astrophysics Data System (ADS)

    Woodruff, W. F.; Revil, A.

    2011-11-01

    A normalized clay-water isotherm model based on BET theory and describing the sorption and desorption of the bound water in clays, sand-clay mixtures, and shales is presented. Clay-water sorption isotherms (sorption and desorption) of clayey materials are normalized by their cation exchange capacity (CEC), accounting for a correction factor depending on the type of counterion sorbed on the mineral surface in the so-called Stern layer. With such normalizations, all the data collapse into two master curves, one for sorption and one for desorption, independent of the clay mineralogy, crystallographic considerations, and bound cation type, thereby neglecting the true heterogeneity of water sorption/desorption in smectite. The two master curves show the general hysteretic behavior of the capillary pressure curve at low relative humidity (below 70%). The model is validated against several data sets obtained from the literature comprising a broad range of clay types and clay mineralogies. The CEC values, derived by inverting the sorption/desorption curves using a Markov chain Monte Carlo approach, are consistent with the CEC associated with the clay mineralogy.
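
    For orientation, the classical BET isotherm on which the model is based can be sketched as follows (parameter values are illustrative; the paper's CEC normalization and counterion correction factor are not reproduced here):

```python
import numpy as np

def bet_sorption(rh, w_m, C):
    """Classical BET isotherm: sorbed water content w as a function of
    relative humidity rh (0 <= rh < 1), with monolayer capacity w_m and
    energy constant C. In the paper's model, w is further normalized by
    the CEC so that data from different clays collapse onto master curves.
    """
    rh = np.asarray(rh, dtype=float)
    return w_m * C * rh / ((1.0 - rh) * (1.0 + (C - 1.0) * rh))

# Illustrative sorption branch over the low-humidity range
rh = np.linspace(0.05, 0.9, 10)
w = bet_sorption(rh, w_m=0.05, C=20.0)
```

    Dividing w by the (counterion-corrected) CEC is what makes curves from different clay mineralogies comparable on a single pair of sorption/desorption master curves.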

  4. Vagina: What's Normal, What's Not

    MedlinePlus

    ... some antibiotics increases the risk of a vaginal yeast infection. Birth control and feminine-hygiene products. Barrier ... or change in the normal balance of vaginal yeast and bacteria can cause inflammation of the vagina ( ...

  5. Cervical disc arthroplasty: Pros and cons

    PubMed Central

    Moatz, Bradley; Tortolani, P. Justin

    2012-01-01

    Background: Cervical disc arthroplasty has emerged as a promising potential alternative to anterior cervical discectomy and fusion (ACDF) in appropriately selected patients. Despite a history of excellent outcomes after ACDF, the question as to whether a fusion leads to adjacent segment degeneration remains unanswered. Numerous US investigational device exemption trials comparing cervical arthroplasty to fusion have been conducted to answer this question. Methods: This study reviews the current research regarding cervical arthroplasty, and emphasizes both the pros and cons of arthroplasty as compared with ACDF. Results: Early clinical outcomes show that cervical arthroplasty is as effective as the standard ACDF. However, this new technology is also associated with an expanding list of novel complications. Conclusion: Although there is no definitive evidence that cervical disc replacement reduces the incidence of adjacent segment degeneration, it does show other advantages; for example, faster return to work, and reduced need for postoperative bracing. PMID:22905327

  6. Cervical disc arthroplasty: Pros and cons.

    PubMed

    Moatz, Bradley; Tortolani, P Justin

    2012-01-01

    Cervical disc arthroplasty has emerged as a promising potential alternative to anterior cervical discectomy and fusion (ACDF) in appropriately selected patients. Despite a history of excellent outcomes after ACDF, the question as to whether a fusion leads to adjacent segment degeneration remains unanswered. Numerous US investigational device exemption trials comparing cervical arthroplasty to fusion have been conducted to answer this question. This study reviews the current research regarding cervical arthroplasty, and emphasizes both the pros and cons of arthroplasty as compared with ACDF. Early clinical outcomes show that cervical arthroplasty is as effective as the standard ACDF. However, this new technology is also associated with an expanding list of novel complications. Although there is no definitive evidence that cervical disc replacement reduces the incidence of adjacent segment degeneration, it does show other advantages; for example, faster return to work and reduced need for postoperative bracing.

  7. Graph-based normalization and whitening for non-linear data analysis.

    PubMed

    Aaron, Catherine

    2006-01-01

    In this paper we construct a graph-based normalization algorithm for non-linear data analysis. The principle of this algorithm is to obtain a spherical average neighborhood with unit radius. First we present a class of global dispersion measures used for "global normalization"; we then adapt these measures, using a weighted graph, to build a local normalization called "graph-based" normalization. We then give details of the graph-based normalization algorithm and illustrate some results. In the second part we present a graph-based whitening algorithm built by analogy between the "global" and the "local" problem.
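
    As one possible reading of the "global normalization" step (an illustrative sketch only; the paper's actual dispersion measures are not reproduced here), the data can be rescaled so that the average distance from a point to its k nearest neighbours equals one, giving a spherical average neighbourhood of unit radius:

```python
import numpy as np

def global_normalize(X, k=5):
    # "Global" normalization sketch: rescale the data so that the average
    # distance from each point to its k nearest neighbours equals one.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)              # exclude self-distances
    avg_knn_dist = np.sort(D, axis=1)[:, :k].mean()
    return X / avg_knn_dist

# Hypothetical 2-D data with an arbitrary scale.
rng = np.random.default_rng(1)
X = rng.normal(scale=7.0, size=(100, 2))
Xn = global_normalize(X)
```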

  8. Normal modes of a small gamelan gong.

    PubMed

    Perrin, Robert; Elford, Daniel P; Chalmers, Luke; Swallowe, Gerry M; Moore, Thomas R; Hamdan, Sinin; Halkon, Benjamin J

    2014-10-01

    Studies have been made of the normal modes of a 20.7 cm diameter steel gamelan gong. A finite-element model has been constructed and its predictions for normal modes compared with experimental results obtained using electronic speckle pattern interferometry. Agreement was reasonable in view of the lack of precision in the manufacture of the instrument. The results agree with expectations for an axially symmetric system subject to small symmetry breaking. The extent to which the results obey Chladni's law is discussed. Comparison with vibrational and acoustical spectra enabled the identification of the small number of modes responsible for the sound output when played normally. Evidence of non-linear behavior was found, mainly in the form of subharmonics of true modes. Experiments using scanning laser Doppler vibrometry gave satisfactory agreement with the other methods.

  9. Environmental assessment of food waste valorization in producing biogas for various types of energy use based on LCA approach.

    PubMed

    Woon, Kok Sin; Lo, Irene M C; Chiu, Sam L H; Yan, Dickson Y S

    2016-04-01

    This paper aims to evaluate the environmental impacts of valorizing food waste for three types of energy use, namely electricity and heat, city gas, and biogas fuel as a petrol, diesel, and liquefied petroleum gas substitute for vehicle use, with reference to the Hong Kong scenario. The life cycle based environmental assessment is conducted with a bin-to-cradle system boundary via SimaPro 7.2.4 with ReCiPe 1.04. All inventory data for the included processes are based on reports from government and industrial sectors. The results show that biogas fuel as a petrol substitute for vehicle use is advantageous over the other types of energy use with regard to human health and ecosystems, and it is also the best option considering the government's future emission reduction targets set out for the power and transport sectors in Hong Kong. By turning 1080 tonnes per day of food waste into biogas vehicle fuel as a petrol substitute, greenhouse gas (GHG) emissions from the transport sector are reduced by 1.9%, a larger decrease than the mitigation achieved in Hong Kong from 2005 to 2010. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Normalization methods in time series of platelet function assays

    PubMed Central

    Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham

    2016-01-01

    Abstract Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from the high-dimensional data spaces of temporal multivariate clinical data represented in multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, and the most suitable approach for platelet function data series is discussed. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation, as calculated by the Spearman correlation test, when normalization was performed per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be extracted from the charts, as was the case when using all data as one dataset for normalization. PMID:27428217
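
    The four normalization methods named in this abstract can be sketched in a few lines (the aggregometry values below are hypothetical):

```python
import numpy as np

def z_transform(x):
    # Centre on the mean and scale by the sample standard deviation.
    return (x - x.mean()) / x.std(ddof=1)

def range_transform(x):
    # Rescale to the [0, 1] interval (min-max normalization).
    return (x - x.min()) / (x.max() - x.min())

def proportion_transform(x):
    # Express each value as a fraction of the series total.
    return x / x.sum()

def iqr_transform(x):
    # Centre on the median and scale by the interquartile range.
    q1, q3 = np.percentile(x, [25, 75])
    return (x - np.median(x)) / (q3 - q1)

# Hypothetical ADP-aggregometry series across five time points.
adp = np.array([62.0, 55.0, 48.0, 51.0, 58.0])
print(range_transform(adp))  # maximum maps to 1.0, minimum to 0.0
```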

  11. Conning the conmen: Intelligence and female desire in Dedh Ishqiya.

    PubMed

    Singh, Shailendra Kumar

    2018-01-02

    This article investigates the ostensibly paradoxical relationship that exists between the theme of excessive love, as suggested by the title of Abhishek Chaubey's film Dedh Ishqiya (2014), and the actual representation of it in the movie, which is not only restrained and disproportionate, but is also looked at with suspicion and contempt. It examines the logic of this seeming contradiction through the other two related themes that Chaubey's chef-d'œuvre foregrounds, namely those of intelligence and female desire. The quest for financial autonomy that the female protagonists of the movie are involved in, a necessary pre-condition for leading independent lives, is so inextricably intertwined with manipulation, dexterity, and subterfuge that any overt expression of homoerotic female desire can only jeopardize their existing possibilities of self-aggrandizement. The heteronormative arrangements of Begum Para's palace thus constitute the elaborate mise en scène behind which female desire is enacted through a politics of intelligence, resourcefulness, discretion, and anonymity. Through this strategic negotiation, which is also a tactical necessity, the female protagonists are not only able to con the con men in the movie, but also to imagine alternative subject positions that recognize the need for both pragmatism and expediency and to deconstruct heteropatriarchal economies of desire.

  12. Normal Language Skills and Normal Intelligence in a Child with de Lange Syndrome.

    ERIC Educational Resources Information Center

    Cameron, Thomas H.; Kelly, Desmond P.

    1988-01-01

    The subject of this case report is a two-year, seven-month-old girl with de Lange syndrome, normal intelligence, and age-appropriate language skills. She demonstrated initial delays in gross motor skills and in receptive and expressive language but responded well to intensive speech and language intervention, as well as to physical therapy.…

  13. The experience of weight management in normal weight adults.

    PubMed

    Hernandez, Cheri Ann; Hernandez, David A; Wellington, Christine M; Kidd, Art

    2016-11-01

    No prior research has specifically examined the experience of weight management in normal weight persons. The purpose of this research was to discover the experience of weight management in normal weight individuals. Glaserian grounded theory was used. Qualitative data (focus group) and quantitative data (food diary, study questionnaire, and anthropometric measures) were collected. Weight management was an ongoing process of trying to focus on living (family, work, and social life), while maintaining normal weight targets through five consciously and unconsciously used strategies. Despite the participants maintaining normal weights, the nutritional composition of the foods they ate was grossly inadequate. These five strategies can be used to develop new weight management strategies that could be integrated into existing weight management programs, or could be developed into novel weight management interventions. Surprisingly, normal weight individuals require dietary assessment and nutrition education to prevent future negative health consequences. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Bayes classification of terrain cover using normalized polarimetric data

    NASA Technical Reports Server (NTRS)

    Yueh, H. A.; Swartz, A. A.; Kong, J. A.; Shin, R. T.; Novak, L. M.

    1988-01-01

    The normalized polarimetric classifier (NPC) which uses only the relative magnitudes and phases of the polarimetric data is proposed for discrimination of terrain elements. The probability density functions (PDFs) of polarimetric data are assumed to have a complex Gaussian distribution, and the marginal PDF of the normalized polarimetric data is derived by adopting the Euclidean norm as the normalization function. The general form of the distance measure for the NPC is also obtained. It is demonstrated that for polarimetric data with an arbitrary PDF, the distance measure of NPC will be independent of the normalization function selected even when the classifier is mistrained. A complex Gaussian distribution is assumed for the polarimetric data consisting of grass and tree regions. The probability of error for the NPC is compared with those of several other single-feature classifiers. The classification error of NPCs is shown to be independent of the normalization function.
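
    The normalization step can be sketched as follows (a toy example with a hypothetical three-channel complex scattering vector, e.g. HH, HV, VV): dividing by the Euclidean norm keeps only relative magnitudes and phases, so two measurements that differ only by an overall gain become identical.

```python
import numpy as np

def normalize_polarimetric(x):
    # Keep only the relative magnitudes and phases of the complex
    # scattering vector by dividing by its Euclidean norm.
    return x / np.linalg.norm(x)

# Two hypothetical measurements of the same terrain element that differ
# only by an overall gain: after normalization they coincide.
x1 = np.array([3 + 4j, 1 - 2j, 0.5 + 0.1j])
x2 = 2.5 * x1
print(np.allclose(normalize_polarimetric(x1), normalize_polarimetric(x2)))  # True
```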

  15. Telomere length in normal and neoplastic canine tissues.

    PubMed

    Cadile, Casey D; Kitchell, Barbara E; Newman, Rebecca G; Biller, Barbara J; Hetler, Elizabeth R

    2007-12-01

    To determine the mean telomere restriction fragment (TRF) length in normal and neoplastic canine tissues. 57 solid-tissue tumor specimens collected from client-owned dogs, 40 samples of normal tissue collected from 12 clinically normal dogs, and blood samples collected from 4 healthy blood donor dogs. Tumor specimens were collected from client-owned dogs during diagnostic or therapeutic procedures at the University of Illinois Veterinary Medical Teaching Hospital, whereas 40 normal tissue samples were collected from 12 control dogs. Telomere restriction fragment length was determined by use of an assay kit. A histologic diagnosis was provided for each tumor by personnel at the Veterinary Diagnostic Laboratory at the University of Illinois. Mean of the mean TRF length for 44 normal samples was 19.0 kilobases (kb; range, 15.4 to 21.4 kb), and the mean of the mean TRF length for 57 malignant tumors was 19.0 kb (range, 12.9 to 23.5 kb). Although the mean of the mean TRF length for tumors and normal tissues was identical, tumor samples had more variability in TRF length. Telomerase, which represents the main mechanism by which cancer cells achieve immortality, is an attractive therapeutic target. The ability to measure telomere length is crucial to monitoring the efficacy of telomerase inhibition. In contrast to many other mammalian species, the length of canine telomeres and the rate of telomeric DNA loss are similar to those reported in humans, making dogs a compelling choice for use in the study of human anti-telomerase strategies.

  16. Normal mode-guided transition pathway generation in proteins

    PubMed Central

    Lee, Byung Ho; Seo, Sangjae; Kim, Min Hyeok; Kim, Youngjin; Jo, Soojin; Choi, Moon-ki; Lee, Hoomin; Choi, Jae Boong

    2017-01-01

    The biological function of proteins is closely related to their structural motion. For instance, structurally misfolded proteins do not function properly. Although we are able to experimentally obtain structural information on proteins, it is still challenging to capture their dynamics, such as transition processes. Therefore, we need a simulation method to predict the transition pathways of a protein in order to understand and study large functional deformations. Here, we present a new simulation method called normal mode-guided elastic network interpolation (NGENI) that performs normal mode analysis iteratively to predict transition pathways of proteins. To be more specific, NGENI obtains displacement vectors that determine intermediate structures by interpolating the distance between two end-point conformations, similar to a morphing method called elastic network interpolation. However, the displacement vector is regarded as a linear combination of the normal mode vectors of each intermediate structure, in order to enhance the physical sense of the proposed pathways. As a result, we can generate more reasonable transition pathways geometrically and thermodynamically. Using not only all normal modes but also, in part, only the lowest normal modes, NGENI can still generate reasonable pathways for large deformations in proteins. This study shows that global protein transitions are dominated by collective motion, which means that a few of the lowest normal modes play an important role in this process. NGENI also has considerable merit in terms of computational cost because it can generate transition pathways using only partial degrees of freedom, while conventional methods are not capable of this. PMID:29020017
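
    The core idea, a displacement step expressed as a linear combination of normal mode vectors, can be sketched in a toy example (random orthonormal vectors stand in for the lowest modes of an elastic network model; this is not the NGENI implementation):

```python
import numpy as np

# Toy sketch: restrict an interpolation step to the subspace spanned by a
# few "normal mode" vectors, here random orthonormal stand-ins.
rng = np.random.default_rng(2)
n_dof, n_modes = 30, 5
modes, _ = np.linalg.qr(rng.normal(size=(n_dof, n_modes)))  # orthonormal columns

start = rng.normal(size=n_dof)           # hypothetical start conformation
end = rng.normal(size=n_dof)             # hypothetical end conformation
step = (end - start) / 10.0              # plain linear interpolation step

coeffs = modes.T @ step                  # coefficients in the mode basis
guided_step = modes @ coeffs             # step projected onto the mode subspace
```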

  17. The Influence of Normalization Weight in Population Pharmacokinetic Covariate Models.

    PubMed

    Goulooze, Sebastiaan C; Völler, Swantje; Välitalo, Pyry A J; Calvier, Elisa A M; Aarons, Leon; Krekels, Elke H J; Knibbe, Catherijne A J

    2018-03-23

    In covariate (sub)models of population pharmacokinetic models, most covariates are normalized to the median value; however, for body weight, normalization to 70 kg or 1 kg is often applied. In this article, we illustrate the impact of normalization weight on the precision of population clearance (CLpop) parameter estimates. The influence of normalization weight (70 kg, 1 kg, or median weight) on the precision of the CLpop estimate, expressed as relative standard error (RSE), was illustrated using data from a pharmacokinetic study in neonates with a median weight of 2.7 kg. In addition, a simulation study was performed to show the impact of normalization to 70 kg in pharmacokinetic studies with paediatric or obese patients. The RSE of the CLpop parameter estimate in the neonatal dataset was lowest with normalization to median weight (8.1%), compared with normalization to 1 kg (10.5%) or 70 kg (48.8%). Typical clearance (CL) predictions were independent of the normalization weight used. Simulations showed that the increase in RSE of the CLpop estimate with 70 kg normalization was highest in studies with a narrow weight range and a geometric mean weight away from 70 kg. When, instead of normalizing with median weight, a weight outside the observed range is used, the RSE of the CLpop estimate will be inflated, and should therefore not be used for model selection. Instead, established mathematical principles can be used to calculate the RSE of the typical CL (CLTV) at a relevant weight to evaluate the precision of CL predictions.
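
    The reparameterization at the heart of this result can be sketched with a standard allometric power model (the 0.75 exponent and all values below are illustrative assumptions, not taken from the study): changing the normalization weight rescales the CLpop estimate but leaves typical clearance predictions unchanged.

```python
import numpy as np

def typical_clearance(wt, cl_pop, wt_norm, exponent=0.75):
    # Allometric covariate model: CL_TV = CL_pop * (WT / WT_norm) ** exponent.
    return cl_pop * (wt / wt_norm) ** exponent

# The same typical-clearance curve parameterized with two normalization
# weights; CL_pop is rescaled so the two models are mathematically identical.
wt = np.array([1.5, 2.7, 4.0])           # hypothetical neonatal weights (kg)
cl_median = 0.5                          # hypothetical CL_pop at 2.7 kg
cl_70 = cl_median * (70 / 2.7) ** 0.75   # equivalent CL_pop at 70 kg

pred_median = typical_clearance(wt, cl_median, 2.7)
pred_70 = typical_clearance(wt, cl_70, 70)
print(np.allclose(pred_median, pred_70))  # True: predictions are unchanged
```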

  18. Decorin and biglycan of normal and pathologic human corneas

    NASA Technical Reports Server (NTRS)

    Funderburgh, J. L.; Hevelone, N. D.; Roth, M. R.; Funderburgh, M. L.; Rodrigues, M. R.; Nirankari, V. S.; Conrad, G. W.

    1998-01-01

    PURPOSE: Corneas with scars and certain chronic pathologic conditions contain highly sulfated dermatan sulfate, but little is known of the core proteins that carry these atypical glycosaminoglycans. In this study the proteoglycan proteins attached to dermatan sulfate in normal and pathologic human corneas were examined to identify primary genes involved in the pathobiology of corneal scarring. METHODS: Proteoglycans from human corneas with chronic edema, bullous keratopathy, and keratoconus and from normal corneas were analyzed using sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE), quantitative immunoblotting, and immunohistology with peptide antibodies to decorin and biglycan. RESULTS: Proteoglycans from pathologic corneas exhibit increased size heterogeneity and binding of the cationic dye alcian blue compared with those in normal corneas. Decorin and biglycan extracted from normal and diseased corneas exhibited similar molecular size distribution patterns. In approximately half of the pathologic corneas, the level of biglycan was elevated an average of seven times above normal, and decorin was elevated approximately three times above normal. The increases were associated with highly charged molecular forms of decorin and biglycan, indicating modification of the proteins with dermatan sulfate chains of increased sulfation. Immunostaining of corneal sections showed an abnormal stromal localization of biglycan in pathologic corneas. CONCLUSIONS: The increased dermatan sulfate associated with chronic corneal pathologic conditions results from stromal accumulation of decorin and particularly of biglycan in the affected corneas. These proteins bear dermatan sulfate chains with increased sulfation compared with normal stromal proteoglycans.

  19. The Normalization of Cannabis Use Among Bangladeshi and Pakistani Youth: A New Frontier for the Normalization Thesis?

    PubMed

    Williams, Lisa; Ralphs, Rob; Gray, Paul

    2017-03-21

    The Asian population in Britain has grown, representing the second largest ethnic group; Bangladeshi, Pakistani, and Indian nationalities are prevalent (Jivraj, 2012; Office for National Statistics, 2013). Yet, we know relatively little about the nature and extent of their substance use. Jayakody et al. (2006) argue ethnic minority groups may be influenced by the norms and values of the dominant culture. Given recreational drug use has undergone a process of normalization in Britain (Aldridge et al., 2011; Parker et al., 1998, 2002), we explore the degree to which this is occurring in a Bangladeshi and Pakistani community of Muslim faith in Northern England, a group typically assumed to reject substance use because of robust religious and cultural values. To examine the extent, frequency, and nature of substance use, and associated attitudes. A cross-sectional study collecting qualitative data from a sample (N = 43) of adolescents accessing a drug service and a range of professionals working with them during 2014. We also present analyses of routinely collected quantitative client data. Adolescent interviewees reported extensive personal experience smoking skunk cannabis, and professionals working in the community confirmed many young Asians smoked it. Its consumption appeared to be accommodated into the daily lives of young people and the supply of it also showed signs of acceptance. Skunk cannabis may be undergoing a process of normalization within some Asian communities in Britain. Our study has significant implications for the normalization thesis, finding evidence for normalization within a subpopulation that is typically perceived to resist this trend.

  20. Pros and cons of prognostic disclosure to Japanese cancer patients and their families from the family's point of view.

    PubMed

    Yoshida, Saran; Shiozaki, Mariko; Sanjo, Makiko; Morita, Tatsuya; Hirai, Kei; Tsuneto, Satoru; Shima, Yasuo

    2012-12-01

    The primary goals of this analysis were to explore the pros and cons of prognostic disclosure to patients and their families from the bereaved family's point of view. Semistructured interviews were conducted with 60 bereaved family members of patients with cancer in Japan. There were eight categories of influence related to the disclosure of prognosis to the family, including pros (e.g., "Enabling mental preparedness for the patient's death") and cons (e.g., "Being distressed by acknowledging the patient's prognosis"); and seven categories of influence of not disclosing the prognosis to family, including pros (e.g., "Being able to maintain hope") and cons (e.g., "Being prevented from providing adequate care for the patient"). There were also nine categories of influence related to the disclosure of prognosis to patients (e.g., "Enabling various discussions regarding death with the patient"), and eight categories of influence related to not disclosing the prognosis to patients (e.g., "Maintaining the patient's hope"). Although prognostic disclosure to family members can contribute to psychological distress and hopelessness, at the same time, it has the potential to prepare them for the future both emotionally and practically, and also to make the time until the patient's death as meaningful as possible. It is useful for physicians to introduce pros and cons of prognostic disclosure to family members at the time of decision making, to understand the family members' psychological state, and to provide support considering pros and cons whether or not they disclosed prognosis.

  1. Cortical thickness in neuropsychologically near-normal schizophrenia.

    PubMed

    Cobia, Derin J; Csernansky, John G; Wang, Lei

    2011-12-01

    Schizophrenia is a severe psychiatric illness with widespread impairments of cognitive functioning; however, a certain percentage of subjects are known to perform in the normal range on neuropsychological measures. While the cognitive profiles of these individuals have been examined, there has been relatively little attention to the neuroanatomical characteristics of this important subgroup. The aims of this study were to statistically identify schizophrenia subjects with relatively normal cognition, examine their neuroanatomical characteristics relative to their more impaired counterparts using cortical thickness mapping, and to investigate relationships between these characteristics and demographic variables to better understand the nature of cognitive heterogeneity in schizophrenia. Clinical, neuropsychological, and MRI data were collected from schizophrenia (n = 79) and healthy subjects (n = 65). A series of clustering algorithms on neuropsychological scores was examined, and a 2-cluster solution that separated subjects into neuropsychologically near-normal (NPNN) and neuropsychologically impaired (NPI) groups was determined most appropriate. Surface-based cortical thickness mapping was utilized to examine differences in thinning among schizophrenia subtypes compared with the healthy participants. A widespread cortical thinning pattern characteristic of schizophrenia emerged in the NPI group, while NPNN subjects demonstrated very limited thinning relative to healthy comparison subjects. Analysis of illness duration indicated minimal effects on subtype classification and cortical thickness results. Findings suggest a strong link between cognitive impairment and cortical thinning in schizophrenia, where subjects with near-normal cognitive abilities also demonstrate near-normal cortical thickness patterns. While generally supportive of distinct etiological processes for cognitive subtypes, results provide direction for further examination of additional
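
    The 2-cluster separation of neuropsychological scores described above can be sketched with a generic k-means clustering (hypothetical composite scores; the study's actual clustering algorithms are not specified here):

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy sketch: split hypothetical neuropsychological composite scores into a
# near-normal and an impaired cluster with a 2-cluster k-means solution.
rng = np.random.default_rng(3)
scores = np.concatenate([rng.normal(0.0, 0.5, 40),    # near-normal performers
                         rng.normal(-2.0, 0.5, 39)])  # impaired performers
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    scores.reshape(-1, 1))
```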

  2. COMS normal operation for Earth Observation mission

    NASA Astrophysics Data System (ADS)

    Cho, Young-Min

    2012-09-01

    Communication Ocean Meteorological Satellite (COMS) for the hybrid mission of meteorological observation, ocean monitoring, and telecommunication service was launched into geostationary Earth orbit on June 27, 2010, and has been in normal operation service since April 2011. The COMS is located at 128.2° East on the geostationary orbit. In order to perform the three missions, the COMS has 3 separate payloads: the meteorological imager (MI), the Geostationary Ocean Color Imager (GOCI), and the Ka-band antenna, each dedicated to one of the three missions. The MI and GOCI perform the Earth observation missions of meteorological observation and ocean monitoring, respectively. For this Earth observation mission the COMS requires daily mission commands from the satellite control ground station, and the daily mission is affected by satellite control activities; for this reason daily mission planning is required. The Earth observation mission operation of COMS is described in terms of mission operation characteristics and mission planning for the normal operation services of meteorological observation and ocean monitoring. The first-year normal operation results after the In-Orbit Test (IOT) are also investigated through a statistical approach to present the achieved COMS normal operation status for the Earth observation mission.

  3. Normals to a Parabola

    ERIC Educational Resources Information Center

    Srinivasan, V. K.

    2013-01-01

    Given a parabola in the standard form y[superscript 2] = 4ax, corresponding to three points on the parabola, such that the normals at these three points P, Q, R concur at a point M = (h, k), the equation of the circumscribing circle through the three points P, Q, and R provides a tremendous opportunity to illustrate "The Art of Algebraic…

  4. Measurements of normal joint angles by goniometry in calves.

    PubMed

    Sengöz Şirin, O; Timuçin Celik, M; Ozmen, A; Avki, S

    2014-01-01

    The aim of this study was to establish normal reference values of the forelimb and hindlimb joint angles in normal Holstein calves. Thirty clinically normal Holstein calves that were free of any detectable musculoskeletal abnormalities were included in the study. A standard transparent plastic goniometer was used to measure maximum flexion, maximum extension, and range of motion of the shoulder, elbow, carpal, hip, stifle, and tarsal joints. The goniometric measurements were done on awake calves positioned in lateral recumbency, and the values were measured and recorded by two independent investigators. The study concluded that goniometric values obtained from awake calves in lateral recumbency were highly consistent and accurate between investigators (p < 0.05). These data provide objective and useful information on the normal forelimb and hindlimb joint angles in Holstein calves. Further studies could establish goniometric values in different diseases and compare them with these reference values.

  5. Effects of normalization on quantitative traits in association test

    PubMed Central

    2009-01-01

    Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that rank-based transformation gives generally the best and consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample size or genetic effects, the improvement in sensitivity for rank transformation outweighs the slight increase in false positive rate. However, for large sample size and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest. PMID:20003414
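
    The rank-based transformation evaluated in this study can be sketched as a rank-based inverse normal transformation (the Blom offset c = 3/8 is one common convention, assumed here rather than taken from the paper):

```python
import numpy as np
from scipy.stats import norm

def rank_inverse_normal(x, c=3.0 / 8.0):
    # Rank-based inverse normal transformation: map ranks to quantiles of
    # the standard normal distribution (Blom-type offset c).
    ranks = np.argsort(np.argsort(x)) + 1   # 1-based ranks (no ties assumed)
    return norm.ppf((ranks - c) / (len(x) + 1 - 2 * c))

# A skewed, hypothetical quantitative trait (log-normal-like).
rng = np.random.default_rng(0)
trait = rng.lognormal(mean=0.0, sigma=1.0, size=200)
transformed = rank_inverse_normal(trait)
```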

  6. Normal keratinized mucosa transplants in nude mice.

    PubMed

    Holmstrup, P; Dabelsteen, E; Reibel, J; Harder, F

    1981-01-01

    Two types of normal keratinized mucosa were transplanted to subcutaneous sites of nude mice of two different strains. 24 intact specimens of clinically normal human palatal mucosa were transplanted to nude mice of the strain nu/nu NC. The transplants were recovered after 42 d with a recovery rate of 96%. Moreover, 22 intact specimens of normal rat forestomach mucosa were transplanted to nude mice of the strain nu/nu BALB/c/BOM. These transplants were recovered after 21 d with a recovery rate of 63%. The histologic features of the transplants were essentially the same as those of the original tissues. However, epithelial outgrowths from the transplants differed with respect to the pattern of keratinization. The outgrowths of human palatal mucosa transplants were essentially unkeratinized, while the outgrowths of the rat forestomach transplants showed continued keratinization.

  7. Masturbation, sexuality, and adaptation: normalization in adolescence.

    PubMed

    Shapiro, Theodore

    2008-03-01

    During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.

  8. Are cancer cells really softer than normal cells?

    PubMed

    Alibert, Charlotte; Goud, Bruno; Manneville, Jean-Baptiste

    2017-05-01

    Solid tumours are often first diagnosed by palpation, suggesting that the tumour is more rigid than its surrounding environment. Paradoxically, individual cancer cells appear to be softer than their healthy counterparts. In this review, we first list the physiological reasons indicating that cancer cells may be more deformable than normal cells. Next, we describe the biophysical tools that have been developed in recent years to characterise and model cancer cell mechanics. By reviewing the experimental studies that compared the mechanics of individual normal and cancer cells, we argue that cancer cells can indeed be considered as softer than normal cells. We then focus on the intracellular elements that could be responsible for the softening of cancer cells. Finally, we ask whether the mechanical differences between normal and cancer cells can be used as diagnostic or prognostic markers of cancer progression. © 2017 Société Française des Microscopies and Société de Biologie Cellulaire de France. Published by John Wiley & Sons Ltd.

  9. Spectra of normal and nutrient-deficient maize leaves

    NASA Technical Reports Server (NTRS)

    Al-Abbas, A. H.; Barr, R.; Hall, J. D.; Crane, F. L.; Baumgardner, M. F.

    1973-01-01

    Reflectance, transmittance and absorptance spectra of normal and six types of nutrient-deficient (N, P, K, S, Mg, and Ca) maize (Zea mays L.) leaves were analyzed at 30 selected wavelengths from 500 to 2600 nm. The analysis of variance showed significant differences in reflectance, transmittance and absorptance in the visible wavelengths among leaf numbers 3, 4, and 5, among the seven treatments, and among the interactions of leaf number and treatments. In the infrared wavelengths only treatments produced significant differences. The chlorophyll content of leaves was reduced in all nutrient-deficient treatments. Percent moisture was increased in S-, Mg-, and N-deficiencies. Polynomial regression analysis of leaf thickness and leaf moisture content showed that these two variables were significantly and directly related. Leaves from the P- and Ca-deficient plants absorbed less energy in the near infrared than the normal plants; S-, Mg-, K-, and N-deficient leaves absorbed more than the normal. Both S- and N-deficient leaves had higher temperatures than normal maize leaves.

  10. Public mental hospital work: pros and cons for psychiatrists.

    PubMed

    Miller, R D

    1984-09-01

    The extensive literature concerning public mental hospitals has largely been written from the perspective of administrators and systems analysts; most of the reports emphasize the frustrations and problems of working in public mental hospitals and the continued exodus of psychiatrists from these facilities. The author addresses the pros and cons of such a career choice from the viewpoint of one who has been an "Indian" rather than a "chief" for a decade. He suggests that the current financial situation in both private practice and academia makes work in public mental hospitals increasingly attractive.

  11. Silence, Metaperformance, and Communication in Pedro Almodóvar's "Hable con ella"

    ERIC Educational Resources Information Center

    Fellie, Maria C.

    2016-01-01

    Many scenes in Pedro Almodóvar's "Hable con ella" (2002) include shots of metaperformances such as silent films, dances, television shows, concerts, and bullfights. Spectators often observe passive characters who are in turn observing. By presenting these performances within cinematic performance, Almodóvar highlights our role as viewers…

  12. U.S. Exploration EVA: ConOps, Interfaces and Test Objectives for Airlocks

    NASA Technical Reports Server (NTRS)

    Buffington, J.

    2017-01-01

    NASA is moving forward on defining the xEVA System Architecture and its implications for the spacecraft that host exploration EVA systems. This presentation provides an overview of the latest information for NASA's Concept of Operations (ConOps), Interfaces and corresponding Test Objectives for Airlocks hosting the xEVA System.

  13. Proteoglycans in Leiomyoma and Normal Myometrium

    PubMed Central

    Barker, Nichole M.; Carrino, David A.; Caplan, Arnold I.; Hurd, William W.; Liu, James H.; Tan, Huiqing; Mesiano, Sam

    2015-01-01

    Uterine leiomyomas are common benign pelvic tumors composed of modified smooth muscle cells and a large amount of extracellular matrix (ECM). The proteoglycan composition of the leiomyoma ECM is thought to affect the pathophysiology of the disease. To test this hypothesis, we examined the abundance (by immunoblotting) and expression (by quantitative real-time polymerase chain reaction) of the proteoglycans biglycan, decorin, and versican in leiomyoma and normal myometrium and determined whether expression is affected by steroid hormones and menstrual phase. Leiomyoma and normal myometrium were collected from women (n = 17) undergoing hysterectomy or myomectomy. In vitro studies were performed on immortalized leiomyoma (UtLM) and normal myometrial (hTERT-HM) cells with and without exposure to estradiol and progesterone. In leiomyoma tissue, abundance of decorin messenger RNA (mRNA) and protein were 2.6-fold and 1.4-fold lower, respectively, compared with normal myometrium. Abundance of versican mRNA was not different between matched samples, whereas versican protein was increased 1.8-fold in leiomyoma compared with myometrium. Decorin mRNA was 2.4-fold lower in secretory phase leiomyoma compared with proliferative phase tissue. In UtLM cells, progesterone decreased the abundance of decorin mRNA by 1.3-fold. Lower decorin expression in leiomyoma compared with myometrium may contribute to disease growth and progression. As decorin inhibits the activity of specific growth factors, its reduced level in the leiomyoma cell microenvironment may promote cell proliferation and ECM deposition. Our data suggest that decorin expression in leiomyoma is inhibited by progesterone, which may be a mechanism by which the ovarian steroids affect leiomyoma growth and disease progression. PMID:26423601

  14. The classification of normal screening mammograms

    NASA Astrophysics Data System (ADS)

    Ang, Zoey Z. Y.; Rawashdeh, Mohammad A.; Heard, Robert; Brennan, Patrick C.; Lee, Warwick; Lewis, Sarah J.

    2016-03-01

    Rationale and objectives: To understand how breast screen readers classify the difficulty of normal screening mammograms using common lexicon describing normal appearances. Cases were also assessed on their suitability for a single reader strategy. Materials and Methods: 15 breast readers were asked to interpret a test set of 29 normal screening mammogram cases and classify them by rating the difficulty of the case on a five-point Likert scale, identifying the salient features and assessing their suitability for single reading. Using the False Positive Fractions from a previous study, the 29 cases were classified into 10 'low', 10 'medium' and nine 'high' difficulties. Data were analyzed with descriptive statistics. Spearman's correlation was used to test the strength of association between the difficulty of the cases and the readers' recommendation for single reading strategy. Results: The ratings from readers in this study corresponded to the known difficulty level of cases for the 'low' and 'high' difficulty cases. Uniform ductal pattern and density, symmetrical mammographic features and the absence of micro-calcifications were the main reasons associated with 'low' difficulty cases. The 'high' difficulty cases were described as having 'dense breasts'. There was a statistically significant negative correlation between the difficulty of the cases and readers' recommendation for single reading (r = -0.475, P = 0.009). Conclusion: The findings demonstrated potential relationships between certain mammographic features and the difficulty for readers to classify mammograms as 'normal'. The standard Australian practice of double reading was deemed more suitable for most cases. There was an inverse moderate association between the difficulty of the cases and the recommendations for single reading.

  15. The COBE normalization for standard cold dark matter

    NASA Technical Reports Server (NTRS)

    Bunn, Emory F.; Scott, Douglas; White, Martin

    1995-01-01

    The Cosmic Background Explorer Satellite (COBE) detection of microwave anisotropies provides the best way of fixing the amplitude of cosmological fluctuations on the largest scales. This normalization is usually given for an n = 1 spectrum, including only the anisotropy caused by the Sachs-Wolfe effect. This is certainly not a good approximation for a model containing any reasonable amount of baryonic matter. In fact, even tilted Sachs-Wolfe spectra are not a good fit to models like cold dark matter (CDM). Here, we normalize standard CDM (sCDM) to the two-year COBE data and quote the best amplitude in terms of the conventionally used measures of power. We also give normalizations for some specific variants of this standard model, and we indicate how the normalization depends on the assumed values of n, Omega(sub B) and H(sub 0). For sCDM we find the mean value of Q = 19.9 +/- 1.5 micro-K, corresponding to sigma(sub 8) = 1.34 +/- 0.10, with the normalization at large scales being B = (8.16 +/- 1.04) x 10(exp 5)(Mpc/h)(exp 4), and other numbers given in the table. The measured rms temperature fluctuation smoothed on 10 deg is a little low relative to this normalization. This is mainly due to the low quadrupole in the data: when the quadrupole is removed, the measured value of sigma(10 deg) is quite consistent with the best-fitting mean value of Q. The use of the mean value of Q should be preferred over sigma(10 deg), when its value can be determined for a particular theory, since it makes full use of the data.

  16. Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization

    NASA Astrophysics Data System (ADS)

    Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad

    2017-02-01

    Malay Twitter messages deviate markedly from the standard language. Malay Tweets are widely used today by Twitter users, especially in the Malay archipelago. It is therefore important to build a normalization system that can translate Malay Tweet language into standard Malay. Most research in natural language processing has focused on normalizing English Twitter messages, while few studies have addressed the normalization of Malay Tweets. This paper proposes an approach to normalizing Malay Twitter messages based on hybrid dictionary modelling methods. The approach normalizes noisy Malay Twitter messages, such as colloquial language, novel words, and interjections, into standard Malay. The research will use a language model and an n-gram model.
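    The dictionary half of such a hybrid approach can be sketched as a lookup over a noisy-to-standard lexicon, with an n-gram language model left to rank candidates for unseen tokens. A minimal Python sketch; the lexicon entries and function name are illustrative assumptions, not taken from the paper:

```python
# Illustrative Malay texting shorthand; not the paper's dictionary.
NOISY_TO_STANDARD = {
    "sy": "saya",     # "I/me"
    "dgn": "dengan",  # "with"
    "x": "tidak",     # "no/not"
}

def normalize_tweet(tokens, lexicon=NOISY_TO_STANDARD):
    """Replace known noisy tokens with their standard Malay forms.

    Unknown tokens pass through unchanged, leaving them for a
    downstream n-gram language model to disambiguate among
    candidate expansions.
    """
    return [lexicon.get(t.lower(), t) for t in tokens]

print(normalize_tweet(["sy", "pergi", "dgn", "kawan"]))
# -> ['saya', 'pergi', 'dengan', 'kawan']
```

    An n-gram model would then score competing expansions of an out-of-vocabulary token, e.g. by choosing the candidate that maximizes bigram probability under a standard-Malay corpus.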

  17. Normal stress effects on Knudsen flow

    NASA Astrophysics Data System (ADS)

    Eu, Byung Chan

    2018-01-01

    Normal stress effects are investigated on tube flow of a single-component non-Newtonian fluid under a constant pressure gradient in a constant temperature field. The generalized hydrodynamic equations are employed, which are consistent with the laws of thermodynamics. In the cylindrical tube flow configuration, the solutions of generalized hydrodynamic equations are exactly solvable and the flow velocity is obtained in a simple one-dimensional integral quadrature. Unlike the case of flow in the absence of normal stresses, the flow develops an anomaly in that the flow in the boundary layer becomes stagnant and the thickness of such a stagnant velocity boundary layer depends on the pressure gradient, the aspect ratio of the radius to the length of the tube, and the pressure (or density and temperature) at the entrance of the tube. The volume flow rate formula through the tube is derived for the flow. It generalizes the Knudsen flow rate formula to the case of a non-Newtonian stress tensor in the presence of normal stress differences. It also reduces to the Navier-Stokes theory formula in the low shear rate limit near equilibrium.

  18. Glymphatic MRI in idiopathic normal pressure hydrocephalus

    PubMed Central

    Ringstad, Geir; Vatnehol, Svein Are Sirirud; Eide, Per Kristian

    2017-01-01

    The glymphatic system has in previous studies been shown as fundamental to clearance of waste metabolites from the brain interstitial space, and is proposed to be instrumental in normal ageing and brain pathology such as Alzheimer’s disease and brain trauma. Assessment of glymphatic function using magnetic resonance imaging with intrathecal contrast agent as a cerebrospinal fluid tracer has so far been limited to rodents. We aimed to image cerebrospinal fluid flow characteristics and glymphatic function in humans, and applied the methodology in a prospective study of 15 idiopathic normal pressure hydrocephalus patients (mean age 71.3 ± 8.1 years, three female and 12 male) and eight reference subjects (mean age 41.1 ± 13.0 years, six female and two male) with suspected cerebrospinal fluid leakage (seven) and intracranial cyst (one). The imaging protocol included T1-weighted magnetic resonance imaging with equal sequence parameters before and at multiple time points through 24 h after intrathecal injection of the contrast agent gadobutrol at the lumbar level. All study subjects were kept in the supine position between examinations during the first day. Gadobutrol enhancement was measured at all imaging time points from regions of interest placed at predefined locations in brain parenchyma, the subarachnoid and intraventricular space, and inside the sagittal sinus. Parameters demonstrating gadobutrol enhancement and clearance in different locations were compared between idiopathic normal pressure hydrocephalus and reference subjects. A characteristic flow pattern in idiopathic normal pressure hydrocephalus was ventricular reflux of gadobutrol from the subarachnoid space followed by transependymal gadobutrol migration. At the brain surfaces, gadobutrol propagated antegradely along large leptomeningeal arteries in all study subjects, and preceded glymphatic enhancement in adjacent brain tissue, indicating a pivotal role of intracranial pulsations for glymphatic

  19. Glymphatic MRI in idiopathic normal pressure hydrocephalus.

    PubMed

    Ringstad, Geir; Vatnehol, Svein Are Sirirud; Eide, Per Kristian

    2017-10-01

    The glymphatic system has in previous studies been shown as fundamental to clearance of waste metabolites from the brain interstitial space, and is proposed to be instrumental in normal ageing and brain pathology such as Alzheimer's disease and brain trauma. Assessment of glymphatic function using magnetic resonance imaging with intrathecal contrast agent as a cerebrospinal fluid tracer has so far been limited to rodents. We aimed to image cerebrospinal fluid flow characteristics and glymphatic function in humans, and applied the methodology in a prospective study of 15 idiopathic normal pressure hydrocephalus patients (mean age 71.3 ± 8.1 years, three female and 12 male) and eight reference subjects (mean age 41.1 ± 13.0 years, six female and two male) with suspected cerebrospinal fluid leakage (seven) and intracranial cyst (one). The imaging protocol included T1-weighted magnetic resonance imaging with equal sequence parameters before and at multiple time points through 24 h after intrathecal injection of the contrast agent gadobutrol at the lumbar level. All study subjects were kept in the supine position between examinations during the first day. Gadobutrol enhancement was measured at all imaging time points from regions of interest placed at predefined locations in brain parenchyma, the subarachnoid and intraventricular space, and inside the sagittal sinus. Parameters demonstrating gadobutrol enhancement and clearance in different locations were compared between idiopathic normal pressure hydrocephalus and reference subjects. A characteristic flow pattern in idiopathic normal pressure hydrocephalus was ventricular reflux of gadobutrol from the subarachnoid space followed by transependymal gadobutrol migration. At the brain surfaces, gadobutrol propagated antegradely along large leptomeningeal arteries in all study subjects, and preceded glymphatic enhancement in adjacent brain tissue, indicating a pivotal role of intracranial pulsations for glymphatic function. In

  20. Method for construction of normalized cDNA libraries

    DOEpatents

    Soares, Marcelo B.; Efstratiadis, Argiris

    1998-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to appropriate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries.

  1. Method for construction of normalized cDNA libraries

    DOEpatents

    Soares, M.B.; Efstratiadis, A.

    1998-11-03

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to appropriate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries. 19 figs.

  2. Normal Perceptual Sensitivity Arising From Weakly Reflective Cone Photoreceptors

    PubMed Central

    Bruce, Kady S.; Harmening, Wolf M.; Langston, Bradley R.; Tuten, William S.; Roorda, Austin; Sincich, Lawrence C.

    2015-01-01

    Purpose To determine the light sensitivity of poorly reflective cones observed in retinas of normal subjects, and to establish a relationship between cone reflectivity and perceptual threshold. Methods Five subjects (four male, one female) with normal vision were imaged longitudinally (7–26 imaging sessions, representing 82–896 days) using adaptive optics scanning laser ophthalmoscopy (AOSLO) to monitor cone reflectance. Ten cones with unusually low reflectivity, as well as 10 normally reflective cones serving as controls, were targeted for perceptual testing. Cone-sized stimuli were delivered to the targeted cones and luminance increment thresholds were quantified. Thresholds were measured three to five times per session for each cone in the 10 pairs, all located 2.2 to 3.3° from the center of gaze. Results Compared with other cones in the same retinal area, three of 10 monitored dark cones were persistently poorly reflective, while seven occasionally manifested normal reflectance. Tested psychophysically, all 10 dark cones had thresholds comparable with those from normally reflecting cones measured concurrently (P = 0.49). The variation observed in dark cone thresholds also matched the wide variation seen in a large population (n = 56 cone pairs, six subjects) of normal cones; in the latter, no correlation was found between cone reflectivity and threshold (P = 0.0502). Conclusions Low cone reflectance cannot be used as a reliable indicator of cone sensitivity to light in normal retinas. To improve assessment of early retinal pathology, other diagnostic criteria should be employed along with imaging and cone-based microperimetry. PMID:26193919

  3. Microarray expression profiling in adhesion and normal peritoneal tissues.

    PubMed

    Ambler, Dana R; Golden, Alicia M; Gell, Jennifer S; Saed, Ghassan M; Carey, David J; Diamond, Michael P

    2012-05-01

    To identify molecular markers associated with adhesion and normal peritoneal tissue using microarray expression profiling. Comparative study. University hospital. Five premenopausal women. Adhesion and normal peritoneal tissue samples were obtained from premenopausal women. Ribonucleic acid was extracted using standard protocols and processed for hybridization to Affymetrix Whole Transcript Human Gene Expression Chips. Microarray data were obtained from five different patients, each with adhesion tissue and normal peritoneal samples. Real-time polymerase chain reaction was performed for confirmation using standard protocols. Gene expression in postoperative adhesion and normal peritoneal tissues. A total of 1,263 genes were differentially expressed between adhesion and normal tissues. One hundred seventy-three genes were found to be up-regulated and 56 genes were down-regulated in the adhesion tissues compared with normal peritoneal tissues. The genes were sorted into functional categories according to Gene Ontology annotations. Twenty-six up-regulated genes and 11 down-regulated genes were identified with functions potentially relevant to the pathophysiology of postoperative adhesions. We evaluated and confirmed expression of 12 of these specific genes via polymerase chain reaction. The pathogenesis, natural history, and optimal treatment of postoperative adhesive disease remains unanswered. Microarray analysis of adhesions identified specific genes with increased and decreased expression when compared with normal peritoneum. Knowledge of these genes and ontologic pathways with altered expression provide targets for new therapies to treat patients who have or are at risk for postoperative adhesions. Copyright © 2012 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  4. Resistance to antibiotics in the normal flora of animals.

    PubMed

    Sørum, H; Sunde, M

    2001-01-01

    The normal bacterial flora contains antibiotic resistance genes to various degrees, even in individuals with no history of exposure to commercially prepared antibiotics. Several factors seem to increase the number of antibiotic-resistant bacteria in feces. One important factor is the exposure of the intestinal flora to antibacterial drugs. Antibiotics used as feed additives seem to play an important role in the development of antibiotic resistance in normal flora bacteria. The use of avoparcin as a feed additive has demonstrated that an antibiotic considered "safe" is responsible for increased levels of antibiotic resistance in the normal flora enterococci of animals fed with avoparcin and possibly in humans consuming products from these animals. However, other factors like stress from temperature, crowding, and management also seem to contribute to the occurrence of antibiotic resistance in normal flora bacteria. The normal flora of animals has been studied with respect to the development of antibiotic resistance over four decades, but there are few studies with the intestinal flora as the main focus. The results of earlier studies are valuable when focused against the recent understanding of mobile genetics responsible for bacterial antibiotic resistance. New studies should be undertaken to assess whether the development of antibiotic resistance in the normal flora is directly linked to the dramatic increase in antibiotic resistance of bacterial pathogens. Bacteria of the normal flora, often disregarded scientifically, should be studied with the intention of using them as active protection against infectious diseases and thereby contributing to the overall reduction of use of antibiotics in both animals and humans.

  5. Evaluation of ConPrim: A three-part model for continuing education in primary health care.

    PubMed

    Berggren, Erika; Strang, Peter; Orrevall, Ylva; Ödlund Olin, Ann; Sandelowsky, Hanna; Törnkvist, Lena

    2016-11-01

    To overcome the gap between existing knowledge and the application of this knowledge in practice, a three-part continuing educational model for primary health care professionals (ConPrim) was developed. It includes a web-based program, a practical exercise and a case seminar. To evaluate professionals' perceptions of the design, pedagogy and adaptation to primary health care of the ConPrim continuing educational model as applied in a subject-specific intervention. A total of 67 professionals (nurses and physicians) completed a computer-based questionnaire evaluating the model's design, pedagogy and adaptation to primary health care one week after the intervention. Descriptive statistics were used. Over 90% found the design of the web-based program and case seminar attractive; 86% found the design of the practical exercise attractive. The professionals agreed that the time spent on two of the three parts was acceptable. The exception was the practical exercise: 32% did not fully agree. Approximately 90% agreed that the contents of all parts were relevant to their work and promoted interactive and interprofessional learning. In response to the statements about the intervention as a whole, approximately 90% agreed that the intervention was suitable to primary health care, that it had increased their competence in the subject area, and that they would be able to use what they had learned in their work. ConPrim is a promising model for continuing educational interventions in primary health care. However, the time spent on the practical exercise should be adjusted and the instructions for the exercise clarified. ConPrim should be tested in other subject-specific interventions and its influence on clinical practice should be evaluated. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Advocacy in the Public Forum: The Pro/Con Program at Ohio State.

    ERIC Educational Resources Information Center

    Stegman, John D.

    The Pro/Con Campus Debate and Community Forum program at The Ohio State University serves the educational mission of the department of communication and contributes to the intellectual life of the student body and the larger community by emphasizing the needs of the audience. Eschewing jargon and rhetorical tricks, the program encourages the…

  7. The significance of early post-exercise ST segment normalization.

    PubMed

    Chow, Rudy; Fordyce, Christopher B; Gao, Min; Chan, Sammy; Gin, Kenneth; Bennett, Matthew

    2015-01-01

    The persistence of ST segment depression in recovery signifies a strongly positive exercise treadmill test (ETT). However, it is unclear if early recovery of ST segments portends a similar prognosis. We sought to determine if persistence of ST depression into recovery correlates with ischemic burden based on myocardial perfusion imaging (MPI). This was a retrospective analysis of 853 consecutive patients referred for exercise MPI at a tertiary academic center over a 24-month period. Patients were stratified into three groups based on the results of the ETT: normal (negative ETT), persistence (positive ETT with >1 mm ST segment depression at 1 minute in recovery) and early normalization (positive ETT with <1 mm ST segment depression at 1 minute in recovery). Summed stress scores (SSSs) were then calculated for each patient, while the coronary anatomy was reported for the subset of patients who received coronary angiograms. A total of 513 patients had a negative ETT, 235 patients met criteria for early normalization, while 105 patients met criteria for persistence. The persistence group had a significantly greater SSS (8.48±7.77) than both the early normalization (4.34±4.98, p<0.001) and normal (4.47±5.31, p<0.001) groups. The SSSs of the early normalization and normal groups were not statistically different and met the prespecified non-inferiority margin (mean difference 0.12, lower 95% CI -0.66, p<0.001). Among the 87 patients who underwent an angiogram, significant three-vessel or left main disease was seen in 39.3% of the persistence group compared with 5.9% of the normal and 7.4% of the early normalization groups. Among patients with an electrically positive ETT, recovery of ST segment depression within 1 minute was associated with a lower SSS than in patients with persistence of ST depression beyond 1 minute. Furthermore, early ST segment recovery conferred a similar SSS to patients with a negative ETT. These results suggest that among patients evaluated for chest pain with
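    The three-way stratification described in the abstract is a simple rule and can be sketched in a few lines. The function and parameter names are illustrative; the abstract specifies only >1 mm (persistence) and <1 mm (early normalization), so routing exactly 1 mm to early normalization here is an assumption:

```python
def classify_ett(positive_ett, st_depression_at_1min_mm):
    """Stratify an exercise treadmill test into the study's three groups.

    Assumption: a reading of exactly 1 mm falls into the early
    normalization group, since the abstract leaves that edge undefined.
    """
    if not positive_ett:
        return "normal"            # negative ETT
    if st_depression_at_1min_mm > 1.0:
        return "persistence"       # residual depression at 1 min of recovery
    return "early normalization"   # depression resolved within 1 min
```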

  8. Improved Algorithm For Finite-Field Normal-Basis Multipliers

    NASA Technical Reports Server (NTRS)

    Wang, C. C.

    1989-01-01

    Improved algorithm reduces complexity of calculations that must precede design of Massey-Omura finite-field normal-basis multipliers, used in error-correcting-code equipment and cryptographic devices. Algorithm represents an extension of development reported in "Algorithm To Design Finite-Field Normal-Basis Multipliers" (NPO-17109), NASA Tech Briefs, Vol. 12, No. 5, page 82.
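    The cited algorithm concerns efficiently *designing* the logic function of a Massey-Omura multiplier for large fields. The following toy Python sketch (not the paper's algorithm) illustrates only the normal-basis property those multipliers exploit: a single bilinear function, applied to cyclically shifted coordinate vectors, yields every coordinate of the product. The tiny field GF(4) and its normal basis are chosen purely for illustration:

```python
# GF(4) = GF(2^2) with normal basis {b, b^2}, where b^2 + b + 1 = 0.
# An element a0*b + a1*b^2 is stored as the bit tuple (a0, a1).

def f(u, v):
    # Bilinear form giving the first coordinate of the product
    # in this basis (derived from b*b = b^2, b*b^2 = b + b^2, etc.).
    u0, u1 = u
    v0, v1 = v
    return (u0 & v1) ^ (u1 & v0) ^ (u1 & v1)

def cyc(x):
    # Cyclic shift of coordinates = the Frobenius map (squaring),
    # which is free in a normal-basis representation.
    return (x[1], x[0])

def multiply(a, b):
    # Massey-Omura property: every product coordinate comes from
    # the SAME function f applied to shifted operands.
    return (f(a, b), f(cyc(a), cyc(b)))
```

    For example, `multiply((1, 0), (1, 0))` computes b*b = b^2, and (1, 1) acts as the multiplicative identity since b + b^2 = 1 in this field.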

  9. Gang Youth, Substance Use Patterns, and Drug Normalization

    ERIC Educational Resources Information Center

    Sanders, Bill

    2012-01-01

    Gang membership is an indicator of chronic illicit substance use and such patterns of use may have a normalized character. Using epidemiological and qualitative data collected between 2006 and 2007, this manuscript examines the drug normalization thesis among a small sample (n=60) of gang youth aged 16-25 years from Los Angeles. Overall, while…

  10. Newborns' temperature submitted to radiant heat and to the Top Maternal device at birth.

    PubMed

    Albuquerque, Rosemeire Sartori de; Mariani, Corintio; Bersusa, Ana Aparecida Sanches; Dias, Vanessa Macedo; Silva, Maria Izabel Mota da

    2016-08-08

    To compare the axillary temperatures of newborns placed, immediately after birth, in skin-to-skin contact under the Top Maternal device with those placed in a radiant heat crib. A comparative observational case-control study of the temperature of 60 babies born at the Obstetric Center and Normal Delivery Center of a public hospital in the municipality of São Paulo: 29 assisted in a heated crib and 31 in skin-to-skin contact, shielded by a cotton fabric placed on the mother's thorax, called the Top Maternal. The temperature of the babies in the skin-to-skin contact group was higher at most of the measurement times than that of the babies placed in a radiant heat crib, independently of the place of birth. Differences between the two groups were not statistically significant. The study contributes new knowledge supporting the practice of keeping babies with their mothers immediately after birth, protected with the Top Maternal, without harming their wellbeing, as it keeps the axillary temperature at recommended levels.

  11. Empirical evaluation of data normalization methods for molecular classification.

    PubMed

    Huang, Huei-Chung; Qin, Li-Xuan

    2018-01-01

    Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers, an increasingly important application of microarrays in the era of personalized medicine. In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages on GitHub. In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated in independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy.
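    The paper's simulation code is distributed as R packages; as an illustrative analogue, here is a minimal NumPy sketch of quantile normalization, a common post hoc adjustment of the kind compared in this literature (the specific methods the authors evaluated are not named in the abstract, and this simple version assumes no tied values within a column):

```python
import numpy as np

def quantile_normalize(X):
    """Force every column (sample/array) to share the same distribution.

    Rows are features (e.g. microRNAs), columns are arrays.
    Simplifying assumption: no ties within a column.
    """
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # per-column ranks
    mean_quantiles = np.sort(X, axis=0).mean(axis=1)   # average order statistics
    return mean_quantiles[ranks]

X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 7.0, 6.0],
              [4.0, 2.0, 8.0]])
Xn = quantile_normalize(X)
# After normalization every column has identical sorted values while
# preserving each column's original rank ordering.
```

    In the classification setting studied here, such an adjustment would be applied before training, and any cross-validated accuracy estimate would still carry the over-optimism caveat the authors report.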

  13. Indentation stiffness does not discriminate between normal and degraded articular cartilage.

    PubMed

    Brown, Cameron P; Crawford, Ross W; Oloyede, Adekunle

    2007-08-01

    Relative indentation characteristics are commonly used for distinguishing between normal healthy and degraded cartilage. The application of this parameter in surgical decision making, and an appreciation of articular cartilage biomechanics, prompted us to hypothesise that it is difficult to define a reference stiffness that characterises normal articular cartilage. This hypothesis was tested by biomechanical indentation of articular cartilage samples characterised as visually normal or as degraded with respect to proteoglycan depletion and collagen disruption. Compressive loading was applied at known strain rates to visually normal, artificially degraded and naturally osteoarthritic articular cartilage, and the trends of their stress-strain and stiffness characteristics were observed. While our results demonstrated a 25% depreciation in the stiffness of individual samples after proteoglycan depletion, they also showed that only 17% of the degraded samples lay outside the range of the stress-strain behaviour of normal samples. We conclude that the extent of the variability in the properties of normal samples, and the degree of overlap (81%) of the biomechanical properties of normal and degraded matrices, demonstrate that indentation data cannot form an accurate basis for distinguishing normal from abnormal articular cartilage samples, with consequences for the application of this mechanical process in the clinical environment.

  14. The global regulator of pathogenesis PnCon7 positively regulates Tox3 effector gene expression through direct interaction in the wheat pathogen Parastagonospora nodorum.

    PubMed

    Lin, Shao-Yu; Chooi, Yit-Heng; Solomon, Peter S

    2018-05-03

    To investigate effector gene regulation in the wheat pathogenic fungus Parastagonospora nodorum, the promoter and expression of Tox3 were characterised through a series of complementary approaches. Promoter deletion and DNase I footprinting experiments identified a 25 bp region in the Tox3 promoter as being required for transcription. Subsequent yeast one-hybrid analysis using the DNA sequence as bait identified the interacting partner as the C2H2 zinc finger transcription factor PnCon7, a putative master regulator of pathogenesis. Silencing of PnCon7 resulted in the down-regulation of Tox3, demonstrating that the transcription factor has a positive regulatory role in gene expression. Analysis of Tox3 expression in the PnCon7-silenced strains revealed a strong correlation with PnCon7 transcript levels, supportive of a direct regulatory role. Subsequent pathogenicity assays using PnCon7-silenced isolates revealed that the transcription factor was required for Tox3-mediated disease. The expression of two other necrotrophic effectors (ToxA and Tox1) was also affected, but in a non-dose-dependent manner, suggesting that the regulatory role of PnCon7 on these genes is indirect. Collectively, these data advance our fundamental understanding of the Con7 master regulator of pathogenesis by demonstrating its positive regulatory role on the Tox3 effector in P. nodorum through direct interaction. This article is protected by copyright. All rights reserved. © 2018 John Wiley & Sons Ltd.

  15. Quasi-normal modes from non-commutative matrix dynamics

    NASA Astrophysics Data System (ADS)

    Aprile, Francesco; Sanfilippo, Francesco

    2017-09-01

    We explore similarities between the process of relaxation in the BMN matrix model and the physics of black holes in AdS/CFT. Focusing on Dyson-fluid solutions of the matrix model, we perform numerical simulations of the real-time dynamics of the system. By quenching the equilibrium distribution we study quasi-normal oscillations of scalar single-trace observables, we isolate the lowest quasi-normal mode, and we determine its frequencies as a function of the energy. Considering the BMN matrix model as a truncation of N=4 SYM, we also compute the frequencies of the quasi-normal modes of the dual scalar fields in the AdS5-Schwarzschild background. We compare the results, and we find a surprising similarity.

  16. Pro/con a precessional geodynamo

    NASA Astrophysics Data System (ADS)

    Vanyo, J.

    2003-04-01

    The modest amount of research that exists on the ability, or lack of ability, of mantle precession to power a geodynamo developed mostly during the last half of the 1900s. Papers by Roberts and Stewartson (1965) and by Busse (1968) studied precession generally without a pro/con conclusion. Malkus in the late 1960s attempted to advance a positive role for precession through experiments and analysis. His experiments have survived criticism, but his analyses were discounted, especially by Rochester, Jacobs, Smylie, and Chong (1975) and by Loper (1975). Rochester et al. critiqued existing analyses of precession, including those of Malkus, but did not reach a strong position either pro or con a precessional geodynamo. Loper argued emphatically that precession was not capable of powering the geodynamo. Explicit analyses that either critique or support Loper's arguments have yet to appear in the literature. During the 1970s, Vanyo and associates studied energy dissipation during precession of satellite liquid fuels and its effect on satellite attitude stability. Engineers and scientists in every country that has launched satellites completed similar research. Some of it is published in the aerospace literature; more is available in company and government reports. Beginning in 1981, Vanyo and associates applied this knowledge to the very similar problem of energy dissipation and flow patterns in precessing mechanical models scaled geometrically and dynamically to the Earth's liquid core. Energy experiments indicate that massive amounts of mechanical energy are dissipated at the CMB, and flow experiments show complex motions within the boundary layer and axial flows with helicity throughout the interior. Analysis of Earth core precession also advanced, especially in several papers by Kerswell and by Tilgner in the late 1990s. Detailed numerical models have yet to appear. Although progress in understanding the role of precession in Earth core motions has advanced, there remains a

  17. Normalization of urinary drug concentrations with specific gravity and creatinine.

    PubMed

    Cone, Edward J; Caplan, Yale H; Moser, Frank; Robert, Tim; Shelby, Melinda K; Black, David L

    2009-01-01

    Excessive fluid intake can substantially dilute urinary drug concentrations and result in false-negative reports for drug users. Methods for correction ("normalization") of drug/metabolite concentrations in urine have been utilized by anti-doping laboratories, pain monitoring programs, and environmental monitoring programs to compensate for excessive hydration, but such procedures have not been used routinely in workplace, legal, and treatment settings. We evaluated two drug normalization procedures based on specific gravity and creatinine. These corrections were applied to urine specimens collected from three distinct groups (pain patients, heroin users, and marijuana/cocaine users). Each group was unique in characteristics, study design, and dosing conditions. The results of the two normalization procedures were highly correlated (r=0.94; range, 0.78-0.99). Increases in percent positives by specific gravity and creatinine normalization were small (0.3% and -1.0%, respectively) for heroin users (normally hydrated subjects), modest (4.2-9.8%) for pain patients (unknown hydration state), and substantial (2- to 38-fold increases) for marijuana/cocaine users (excessively hydrated subjects). Despite some limitations, these normalization procedures provide alternative means of dealing with highly dilute, dilute, and concentrated urine specimens. Drug/metabolite concentration normalization by these procedures is recommended for urine testing programs, especially as a means of coping with dilute specimens.
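
    Both corrections have simple closed forms. Below is a minimal sketch assuming the widely used reference specific gravity of 1.020 and ng-per-mg creatinine ratios; the paper's exact constants and units are not stated in the abstract, so these are assumptions for illustration:

```python
def sg_normalize(conc_ng_ml, sg_observed, sg_reference=1.020):
    """Specific-gravity correction: scale the measured concentration by
    the ratio of 'excess' density in a reference urine to that in the
    observed specimen. Dilute urine (SG near 1.000) is scaled up."""
    return conc_ng_ml * (sg_reference - 1.0) / (sg_observed - 1.0)

def creatinine_normalize(conc_ng_ml, creatinine_mg_dl):
    """Creatinine correction: express the analyte as ng per mg of
    creatinine, cancelling overall urine dilution."""
    return conc_ng_ml / (creatinine_mg_dl / 100.0)  # ng/mg creatinine

# a dilute specimen: 40 ng/mL analyte at SG 1.005, creatinine 20 mg/dL
print(sg_normalize(40.0, 1.005))         # roughly a 4x upward correction
print(creatinine_normalize(40.0, 20.0))  # ng analyte per mg creatinine
```

This shows why the correction matters for excessively hydrated subjects: a specimen diluted well below the reference specific gravity is scaled up sharply, which can move a dilute specimen back above a reporting cutoff.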

  18. Modelo de accesibilidad de conceptos matematicos aplicados en el curso de Astronomia Descriptiva para estudiantes con impedimentos visuales en la UPR

    NASA Astrophysics Data System (ADS)

    Isidro Villamizar, Gloria Maria

    This study uses qualitative research methodology to describe, analyze, and evaluate the design and development of an accessibility model consisting of mathematics teaching strategies for students with visual impairments enrolled in the Descriptive Astronomy course at the UPR. The following strategies were used to gather information: 1) the researcher's reflections during the design and development of the adapted lessons, recorded in a reflective journal; 2) a semi-structured interview conducted after working through the adapted learning lessons with the participants; 3) the researcher's observations and notes on the participants' work. To obtain information from the participants, the necessary institutional permissions were secured, the participants were selected, and the instruments were validated; the adapted lessons were then developed with the participants; and finally, the information obtained was analyzed. The adapted learning lessons were designed following the curricular recommendations for the mathematics topics applied in the Descriptive Astronomy course, compiled by the researcher during her internship semester. The testimony of the participants' voices was obtained from the process of developing the adapted learning lessons on selected topics of the mathematical concepts required in the Descriptive Astronomy course, and from the semi-structured interview with the participants after working through the learning lessons. For the development of the lessons, adapted tactile materials, custom-designed tactile materials, and commercially available materials were used. The lesson texts were printed in ink and in Braille. The design and development of accessible teaching strategies is encouraged, considering as resources for evaluating their effectiveness

  19. Protein Degradation in Normal and Beige (Chediak-Higashi) Mice

    PubMed Central

    Lyons, Robert T.; Pitot, Henry C.

    1978-01-01

    The beige mouse, C57BL/6 (bg/bg), is an animal model for the Chediak-Higashi syndrome in man, a disease characterized morphologically by giant lysosomes in most cell types. Half-lives for the turnover of [14C]bicarbonate-labeled total soluble liver protein were determined in normal and beige mice. No significant differences were observed between the normal and mutant strain for both rapidly and slowly turning-over classes of proteins. Glucagon treatment during the time-course of protein degradation had similar effects on both normal and mutant strains and led to the conclusion that the rate of turnover of endogenous intracellular protein in the beige mouse liver does not differ from normal. The rates of uptake and degradation of an exogenous protein were determined in normal and beige mice by intravenously injecting 125I-bovine serum albumin and following, in peripheral blood, the loss with time of phosphotungstic acid-insoluble bovine serum albumin and the parallel appearance of phosphotungstic acid-soluble (degraded) material. No significant differences were observed between beige and normal mice in the uptake by liver lysosomes of 125I-bovine serum albumin (t½ = 3.9 and 2.8 h, respectively). However, it was found that lysosomes from livers of beige mice released phosphotungstic acid-soluble radioactivity at a rate significantly slower than normal (t½ = 6.8 and 3.1 h, respectively). This defect in beige mice could be corrected by chronic administration of carbamyl choline (t½ = 3.5 h), a cholinergic agonist which raises intracellular cyclic GMP levels. However, no significant differences between normal and beige mice were observed either in the ability of soluble extracts of liver and kidney to bind [3H]cyclic GMP in vitro or in the basal levels of cyclic AMP in both tissues. The relevance of these observations to the presumed biochemical defect underlying the Chediak-Higashi syndrome is discussed. PMID:202611

  20. 7 CFR 760.4 - Normal marketings of milk.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 7 2011-01-01 2011-01-01 false Normal marketings of milk. 760.4 Section 760.4... Farmers for Milk § 760.4 Normal marketings of milk. (a) The county committee shall determine the affected... whole milk which such farmer would have sold in the commercial market in each of the pay periods in the...

  1. Physics of collisionless scrape-off-layer plasma during normal and off-normal Tokamak operating conditions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassanein, A.; Konkashbaev, I.

    1999-03-15

    The structure of a collisionless scrape-off-layer (SOL) plasma in tokamak reactors is being studied to define the electron distribution function and the corresponding sheath potential between the divertor plate and the edge plasma. The collisionless model is shown to be valid during the thermal phase of a plasma disruption, as well as during the newly desired low-recycling normal phase of operation with low-density, high-temperature edge plasma conditions. An analytical solution is developed by solving the Fokker-Planck equation for electron distribution and balance in the SOL. The solution is in good agreement with numerical studies using Monte Carlo methods. The analytical solutions provide insight into the role of different physical and geometrical processes in a collisionless SOL during disruptions and during the enhanced phase of normal operation over a wide range of parameters.

  2. A comparison of vowel normalization procedures for language variation research

    NASA Astrophysics Data System (ADS)

    Adank, Patti; Smits, Roel; van Hout, Roeland

    2004-11-01

    An evaluation of vowel normalization procedures for the purpose of studying language variation is presented. The procedures were compared on how effectively they (a) preserve phonemic information, (b) preserve information about the talker's regional background (or sociolinguistic information), and (c) minimize anatomical/physiological variation in acoustic representations of vowels. Recordings were made for 80 female talkers and 80 male talkers of Dutch. These talkers were stratified according to their gender and regional background. The normalization procedures were applied to measurements of the fundamental frequency and the first three formant frequencies for a large set of vowel tokens. The normalization procedures were evaluated through statistical pattern analysis. The results show that normalization procedures that use information across multiple vowels ("vowel-extrinsic" information) to normalize a single vowel token performed better than those that include only information contained in the vowel token itself ("vowel-intrinsic" information). Furthermore, the results show that normalization procedures that operate on individual formants performed better than those that use information across multiple formants (e.g., "formant-extrinsic" F2-F1).
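
    A minimal sketch of one classic vowel-extrinsic procedure of the kind evaluated here: Lobanov z-score normalization, computed per talker across all of that talker's vowel tokens. The formant values below are invented for illustration:

```python
import numpy as np

def lobanov(formants):
    """Vowel-extrinsic normalization (Lobanov): z-score each formant
    across all of one talker's vowel tokens, removing the talker's
    anatomical scale while keeping the vowels' relative positions.
    `formants` is an (n_tokens, n_formants) array in Hz."""
    F = np.asarray(formants, dtype=float)
    return (F - F.mean(axis=0)) / F.std(axis=0)

# two hypothetical talkers producing the same three vowels, one with a
# uniformly scaled (e.g. shorter) vocal tract
talker_a = np.array([[300.0, 2300.0], [700.0, 1200.0], [500.0, 1500.0]])
talker_b = 1.2 * talker_a
print(np.allclose(lobanov(talker_a), lobanov(talker_b)))  # True
```

Because the z-score is computed over many vowel tokens of a single talker, it uses vowel-extrinsic information, which is exactly the property the study found to help performance.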

  4. Normalized methodology for medical infrared imaging

    NASA Astrophysics Data System (ADS)

    Vargas, J. V. C.; Brioschi, M. L.; Dias, F. G.; Parolin, M. B.; Mulinari-Brenner, F. A.; Ordonez, J. C.; Colman, D.

    2009-01-01

    A normalized procedure for medical infrared imaging is suggested, and illustrated by a leprosy and hepatitis C treatment follow-up, in order to investigate the effect of concurrent treatment which has not been reported before. A 50-year-old man with indeterminate leprosy and a 20-year history of hepatitis C was monitored for 587 days, starting from the day the patient received treatment for leprosy. Standard therapy for hepatitis C started 30 days later. Both visual observations and normalized infrared imaging were conducted periodically to assess the response to leprosy treatment. The primary end points were effectiveness of the method under different boundary conditions over the period, and rapid assessment of the response to leprosy treatment. The patient achieved sustained hepatitis C virological response 6 months after the end of the treatment. The normalized infrared results demonstrate the leprosy treatment success in spite of the concurrent hepatitis C treatment, since day 87, whereas repigmentation was visually assessed only after day 182, and corroborated with a skin biopsy on day 390. The method detected the effectiveness of the leprosy treatment in 87 days, whereas repigmentation started only in 182 days. Hepatitis C and leprosy treatment did not affect each other.

  5. A Late Babylonian Normal and ziqpu star text

    NASA Astrophysics Data System (ADS)

    Roughton, N. A.; Steele, J. M.; Walker, C. B. F.

    2004-09-01

    The Late Babylonian tablet BM 36609+ is a substantial rejoined fragment of an important and previously unknown compendium of short texts dealing with the use of stars in astronomy. Three of the fragments which constitute BM 36609+ were first identified as containing a catalogue of Babylonian "Normal Stars" (stars used as reference points in the sky to track the movement of the moon and planets) by N. A. Roughton. C. B. F. Walker has been able to join several more fragments to the tablet, which have revealed that other sections of the compendium concern a group of stars whose culminations are used for keeping time, known as ziqpu-stars after the Akkadian term for culmination, ziqpu. All the preserved sections on the obverse of BM 36609+ concern ziqpu-stars. On the reverse of the tablet we find several sections concerning Normal Stars. This side begins with a catalogue of Normal Stars giving their positions within zodiacal signs. The catalogue is apparently related to the only other Normal Star catalogue previously known, BM 46083, published by Sachs. In the following we present an edition of BM 36609+ based upon Walker's transliteration of the tablet. Since Sachs' edition of BM 46083, the Normal Star catalogue related to BM 36609+, was based upon a photograph and is incomplete, we include a fresh edition of that tablet. A list of Akkadian and translated star names with identifications is given.

  6. Modeling and simulation of normal and hemiparetic gait

    NASA Astrophysics Data System (ADS)

    Luengas, Lely A.; Camargo, Esperanza; Sanchez, Giovanni

    2015-09-01

    Gait is the collective term for the two types of bipedal locomotion, walking and running. This paper is focused on walking. The analysis of human gait is of interest to many different disciplines, including biomechanics, human-movement science, rehabilitation and medicine in general. Here we present a new model that is capable of reproducing the properties of walking, both normal and pathological. The aim of this paper is to establish the biomechanical principles that underlie human walking by using the Lagrange method. The constraint forces of the Rayleigh dissipation function, through which the effect on the tissues during gait is considered, are included. Depending on the value of the factor in the Rayleigh dissipation function, both normal and pathological gait can be simulated. We first apply the model to normal gait and then to permanent hemiparetic gait. Anthropometric data of an adult person are used in the simulation; anthropometric data for children can also be used, but existing tables of anthropometric data must then be consulted. Validation of these models includes simulations of passive dynamic gait on level ground. The dynamic walking approach provides a new perspective on gait analysis, focusing on the kinematics and kinetics of gait. There have been studies and simulations of normal human gait, but few of them have focused on abnormal, especially hemiparetic, gait. Quantitative comparisons of the model predictions with gait measurements show that the model can reproduce the significant characteristics of normal gait.
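
    The role of the Rayleigh dissipation factor can be sketched with a toy single-pendulum leg, a deliberately simplified stand-in for the paper's full gait model: the Lagrange equation picks up a -b*theta_dot damping term, and raising b mimics the extra tissue losses of pathological gait. The masses, lengths, and the two dissipation values below are invented for illustration:

```python
import numpy as np

def simulate_leg(b, theta0=0.3, t_end=5.0, dt=1e-3, m=8.0, l=0.9, g=9.81):
    """Integrate m*l^2*theta'' = -b*theta' - m*g*l*sin(theta): a pendulum
    leg whose Lagrange equation includes a Rayleigh dissipation term
    b*theta_dot (semi-implicit Euler integration)."""
    theta, omega = theta0, 0.0
    for _ in range(int(t_end / dt)):
        omega += (-(b * omega + m * g * l * np.sin(theta)) / (m * l**2)) * dt
        theta += omega * dt
    return theta, omega

def energy(theta, omega, m=8.0, l=0.9, g=9.81):
    """Mechanical energy of the swing (kinetic + potential)."""
    return 0.5 * m * l**2 * omega**2 + m * g * l * (1.0 - np.cos(theta))

# a larger dissipation factor (toy 'hemiparetic' value) drains the
# swing's energy faster than the 'normal' value
e_normal = energy(*simulate_leg(b=0.5))
e_hemi = energy(*simulate_leg(b=5.0))
```

Changing a single scalar in the dissipation function switches the simulation between regimes, which is the mechanism the paper exploits to cover both normal and hemiparetic gait.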

  7. Accurate macromolecular crystallographic refinement: incorporation of the linear scaling, semiempirical quantum-mechanics program DivCon into the PHENIX refinement package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borbulevych, Oleg Y.; Plumley, Joshua A.; Martin, Roger I.

    2014-05-01

    Semiempirical quantum-chemical X-ray macromolecular refinement using the program DivCon integrated with PHENIX is described. Macromolecular crystallographic refinement relies on sometimes dubious stereochemical restraints and rudimentary energy functionals to ensure the correct geometry of the model of the macromolecule and any covalently bound ligand(s). The ligand stereochemical restraint file (CIF) requires a priori understanding of the ligand geometry within the active site, and creation of the CIF is often an error-prone process owing to the great variety of potential ligand chemistry and structure. Stereochemical restraints have been replaced with more robust functionals through the integration of the linear-scaling, semiempirical quantum-mechanics (SE-QM) program DivCon with the PHENIX X-ray refinement engine. The PHENIX/DivCon package has been thoroughly validated on a population of 50 protein-ligand Protein Data Bank (PDB) structures with a range of resolutions and chemistry. The PDB structures used for the validation were originally refined utilizing various refinement packages and were published within the past five years. PHENIX/DivCon does not utilize CIF(s), link restraints and other parameters for refinement, and hence it does not make as many a priori assumptions about the model. Across the entire population, the method results in reasonable ligand geometries and low ligand strains, even when the original refinement exhibited difficulties, indicating that PHENIX/DivCon is applicable to both single-structure and high-throughput crystallography.

  8. Not Quite Normal: Consequences of Violating the Assumption of Normality in Regression Mixture Models

    ERIC Educational Resources Information Center

    Van Horn, M. Lee; Smith, Jessalyn; Fagan, Abigail A.; Jaki, Thomas; Feaster, Daniel J.; Masyn, Katherine; Hawkins, J. David; Howe, George

    2012-01-01

    Regression mixture models, which have only recently begun to be used in applied research, are a new approach for finding differential effects. This approach comes at the cost of the assumption that error terms are normally distributed within classes. This study uses Monte Carlo simulations to explore the effects of relatively minor violations of…

  9. Notes on power of normality tests of error terms in regression models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Střelec, Luboš

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to draw inferences that are not misleading, which explains the necessity and importance of robust tests of normality. The aim of this contribution is therefore to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
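
    As a sketch of the kind of check the contribution discusses (the RT class itself is not reproduced here), the classical Jarque-Bera test can be applied directly to regression residuals; the simulated regression below is invented for illustration:

```python
import numpy as np

def jarque_bera(e):
    """Jarque-Bera statistic n/6 * (S^2 + (K - 3)^2 / 4) for a sample e,
    where S is the sample skewness and K the sample kurtosis; it is
    approximately chi-square with 2 degrees of freedom under normality,
    so large values signal non-normal error terms."""
    e = np.asarray(e, dtype=float)
    z = e - e.mean()
    m2, m3, m4 = (np.mean(z**k) for k in (2, 3, 4))
    S, K = m3 / m2**1.5, m4 / m2**2
    return e.size / 6.0 * (S**2 + (K - 3.0) ** 2 / 4.0)

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 10.0, 2000)

def residual_jb(errors):
    """Fit y = a + b*x by least squares and test the residuals."""
    y = 2.0 + 0.5 * x + errors
    slope, intercept = np.polyfit(x, y, 1)
    return jarque_bera(y - (intercept + slope * x))

jb_normal = residual_jb(rng.normal(0.0, 1.0, 2000))        # small
jb_skewed = residual_jb(rng.exponential(1.0, 2000) - 1.0)  # large
```

Comparing the statistic against the chi-square(2) critical value (about 5.99 at the 5% level) gives the usual accept/reject decision; robust variants such as the RT class modify the moment estimates to resist outliers.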

  10. A brain imaging repository of normal structural MRI across the life course: Brain Images of Normal Subjects (BRAINS).

    PubMed

    Job, Dominic E; Dickie, David Alexander; Rodriguez, David; Robson, Andrew; Danso, Sammy; Pernet, Cyril; Bastin, Mark E; Boardman, James P; Murray, Alison D; Ahearn, Trevor; Waiter, Gordon D; Staff, Roger T; Deary, Ian J; Shenkin, Susan D; Wardlaw, Joanna M

    2017-01-01

    The Brain Images of Normal Subjects (BRAINS) Imagebank (http://www.brainsimagebank.ac.uk) is an integrated repository project hosted by the University of Edinburgh and sponsored by the Scottish Imaging Network: A Platform for Scientific Excellence (SINAPSE) collaborators. BRAINS provides sharing and archiving of detailed normal human brain imaging and relevant phenotypic data already collected in studies of healthy volunteers across the life course. It particularly focusses on the extremes of age (currently older age and, in future, perinatal), where variability is largest and which are under-represented in existing databanks. BRAINS is a living imagebank to which new data will be added as they become available. Currently BRAINS contains data from 808 healthy volunteers, from 15 to 81 years of age, from 7 projects in 3 centres. Additional completed and ongoing studies of normal individuals from the 1st to 10th decades are in preparation and will be included as they become available. BRAINS holds several MRI structural sequences, including T1, T2, T2* and fluid attenuated inversion recovery (FLAIR), available in DICOM (http://dicom.nema.org/); in the future, Diffusion Tensor Imaging (DTI) will be added where available. Images are linked to a wide range of 'textual data', such as age, medical history, physiological measures (e.g. blood pressure), medication use, cognitive ability, and perinatal information for pre/post-natal subjects. The imagebank can be searched to include or exclude ranges of these variables to create better estimates of 'what is normal' at different ages. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Topical Oxygen for Chronic Wounds: A PRO/CON Debate

    PubMed Central

    Mutluoglu, Mesut; Cakkalkurt, Aslican; Uzun, Gunalp; Aktas, Samil

    2014-01-01

    The role of oxygen in wound healing is universally accepted and does not require any further evidence; however, the controversy as to whether oxygen delivery systems have the potential to improve wound healing remains to be resolved. Topical oxygen treatment (TOT) involves the delivery of 100% oxygen for a mean of 90 min, once a day, at an atmospheric pressure slightly above 1 atm abs. The use of TOT has gained increasing interest recently. The current manuscript summarizes the pros and cons of TOT in view of the available literature. PMID:26199891

  12. Normal central retinal function and structure preserved in retinitis pigmentosa.

    PubMed

    Jacobson, Samuel G; Roman, Alejandro J; Aleman, Tomas S; Sumaroka, Alexander; Herrera, Waldo; Windsor, Elizabeth A M; Atkinson, Lori A; Schwartz, Sharon B; Steinberg, Janet D; Cideciyan, Artur V

    2010-02-01

    To determine whether normal function and structure, as recently found in forms of Usher syndrome, also occur in a population of patients with nonsyndromic retinitis pigmentosa (RP). Patients with simplex, multiplex, or autosomal recessive RP (n = 238; ages 9-82 years) were studied with static chromatic perimetry. A subset was evaluated with optical coherence tomography (OCT). Co-localized visual sensitivity and photoreceptor nuclear layer thickness were measured across the central retina to establish the relationship of function and structure. Comparisons were made to patients with Usher syndrome (n = 83, ages 10-69 years). Cross-sectional psychophysical data identified patients with RP who had normal rod- and cone-mediated function in the central retina. There were two other patterns with greater dysfunction, and longitudinal data confirmed that progression can occur from normal rod and cone function to cone-only central islands. The retinal extent of normal laminar architecture by OCT corresponded to the extent of normal visual function in patients with RP. Central retinal preservation of normal function and structure did not show a relationship with age or retained peripheral function. Usher syndrome results were like those in nonsyndromic RP. Regional disease variation is a well-known finding in RP. Unexpected was the observation that patients with presumed recessive RP can have regions with functionally and structurally normal retina. Such patients will require special consideration in future clinical trials of either focal or systemic treatment. Whether there is a common molecular mechanism shared by forms of RP with normal regions of retina warrants further study.

  13. Non Invasive Biomedical Analysis - Breath Networking Session at PittCon 2011, Atlanta, Georgia

    EPA Science Inventory

    This was the second year that our breath colleagues organized a networking session at the Pittsburgh Conference and Exposition, or ''PittCon'' (http://www.pincon.org/). This time it was called "Non-invasive Biomedical Analysis" to broaden the scope a bit, but the primary focus rema...

  14. Gradient Optimization for Analytic conTrols - GOAT

    NASA Astrophysics Data System (ADS)

    Assémat, Elie; Machnes, Shai; Tannor, David; Wilhelm-Mauch, Frank

    Quantum optimal control has become a necessary step in a number of studies in the quantum realm. Recent experimental advances showed that superconducting qubits can be controlled with an impressive accuracy. However, most of the standard optimal control algorithms are not designed to manage such high accuracy. To tackle this issue, a novel quantum optimal control algorithm has been introduced: the Gradient Optimization for Analytic conTrols (GOAT). It avoids the piecewise constant approximation of the control pulse used by standard algorithms. This allows an efficient implementation of very-high-accuracy optimization. It also includes a novel method to compute the gradient that provides many advantages, e.g. the absence of backpropagation or the natural route to optimize the robustness of the control pulses. This talk will present the GOAT algorithm and a few applications to transmon systems.
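
The pulse-parametrization idea in this abstract can be illustrated with a deliberately simplified sketch. This is not the actual GOAT algorithm (which obtains exact gradients by propagating auxiliary equations of motion alongside the dynamics); here a single analytic parameter, the peak amplitude of a Gaussian pulse, is tuned with a finite-difference gradient so that a two-level system performs a pi rotation. All pulse shapes and numerical values are illustrative assumptions.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli X
target = np.array([[0, -1j], [-1j, 0]])          # exp(-i*pi*sx/2), an X gate

T, n = 1.0, 200
t = np.linspace(0, T, n)
dt = T / n

def propagate(A):
    """Evolve under H(t) = Omega(t)*sx/2 with an analytic Gaussian
    envelope Omega(t) = A*exp(-(t - T/2)^2 / 0.02)."""
    omega = A * np.exp(-(t - T / 2) ** 2 / 0.02)
    U = np.eye(2, dtype=complex)
    for w in omega:
        phi = w * dt / 2
        # exact step exp(-i*w*dt*sx/2), since H(t) commutes with itself
        step = np.cos(phi) * np.eye(2) - 1j * np.sin(phi) * sx
        U = step @ U
    return U

def infidelity(A):
    """Gate infidelity 1 - |tr(target^dag U)|/2."""
    return 1 - abs(np.trace(target.conj().T @ propagate(A))) / 2

# Gradient descent on the single pulse parameter A.
# GOAT computes this gradient exactly; a finite difference stands in here.
A, lr, eps = 5.0, 2.0, 1e-6
for _ in range(200):
    g = (infidelity(A + eps) - infidelity(A - eps)) / (2 * eps)
    A -= lr * g

print(f"optimized amplitude A = {A:.2f}, infidelity = {infidelity(A):.2e}")
```

The optimum sits where the pulse area equals pi; the descent converges to it smoothly because the infidelity is a simple function of one parameter. The real method optimizes many analytic parameters at once with exact gradients.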

  15. Amniotic fluid cortisol and alpha-fetoprotein in normal and aneuploid pregnancies.

    PubMed

    Drugan, A; Subramanian, M G; Johnson, M P; Evans, M I

    1988-01-01

    Cortisol and alpha-fetoprotein (AFP) levels were measured in amniotic fluid (AF) samples at 15-20 weeks of gestation from 125 normal pregnancies and 29 pregnancies affected by aneuploidy. The normal pregnancy group was further subdivided into 'low' AF-AFP (< 0.6 MOM, n = 60) and 'normal' AF-AFP (0.6 < AFP < 1.4 MOM, n = 65). A significant, inverse, linear correlation was found between cortisol and AF-AFP for both normal AFP and low AFP groups (r = -0.26 and r = -0.4, respectively, p < 0.05). Gestational age was significantly correlated with both cortisol and AFP levels in the normal pregnancy groups. No difference was found when cortisol levels were compared between the low and normal AFP groups. The correlation between cortisol and AFP in aneuploid pregnancies was not significant (p = 0.37). The strong association between cortisol or AFP and gestational age in normal pregnancy (p < 0.00001) was lost in trisomic gestation. We conclude that higher cortisol levels do not seem to be the cause of low AFP in normal or aneuploid pregnancies.
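
The inverse linear correlation reported above is a standard Pearson analysis; a minimal sketch, using entirely simulated data (the values below are invented and do not reproduce the study's measurements), shows how such an r value is computed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated AF-AFP in multiples of the median (MoM) and a cortisol level
# constructed with an inverse linear dependence plus noise -- purely
# illustrative, not the study's data.
afp_mom = rng.uniform(0.2, 1.4, size=60)
cortisol = 50.0 - 15.0 * afp_mom + rng.normal(0.0, 3.0, size=60)

def pearson_r(x, y):
    """Pearson correlation coefficient between two 1-D arrays."""
    return np.corrcoef(x, y)[0, 1]

r = pearson_r(afp_mom, cortisol)
print(f"r = {r:.2f}")  # negative, reflecting the built-in inverse relation
```

A negative r of this kind, with an associated p-value below the chosen threshold, is what the abstract summarizes as a "significant, inverse, linear correlation".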

  16. Comparison of the effectiveness of intravitreal ranibizumab for the treatment of diabetic macular edema in vitrectomized and non-vitrectomized eyes.

    PubMed

    Koyanagi, Yoshito; Yoshida, Shigeo; Kobayashi, Yoshiyuki; Kubo, Yuki; Yamaguchi, Muneo; Nakama, Takahito; Nakao, Shintaro; Ikeda, Yasuhiro; Ohshima, Yuji; Ishibashi, Tatsuro; Sonoda, Kohhei

    2017-07-11

    Aim: To compare the effectiveness of intravitreal ranibizumab (IVR) for the treatment of diabetic macular edema (DME) in eyes with and without prior vitrectomy. Procedures: We prospectively evaluated best-corrected visual acuity (BCVA) and central macular thickness (CMT) over 6 months of IVR treatment. Results: No significant differences in baseline BCVA or CMT were observed between the two groups. In the non-vitrectomized group (n = 15), the mean changes in BCVA and CMT from baseline to month 6 of treatment were significant (p < 0.01). In the vitrectomized group (n = 10), improvement was slower, and the mean improvement in BCVA was not significant (p = 0.5), although the mean decrease in CMT was (p < 0.05). No significant differences in the mean changes in BCVA and CMT were observed between the two groups at 6 months of treatment. Conclusions: The difference in the effectiveness of IVR between the two groups was not significant. Intravitreal ranibizumab may be a treatment option even in vitrectomized patients with DME. © 2017 S. Karger AG, Basel.

  17. Physical Properties of Normal Grade Biodiesel and Winter Grade Biodiesel

    PubMed Central

    Sadrolhosseini, Amir Reza; Moksin, Mohd Maarof; Nang, Harrison Lau Lik; Norozi, Monir; Yunus, W. Mahmood Mat; Zakaria, Azmi

    2011-01-01

    In this study, optical and thermal properties of normal grade and winter grade palm oil biodiesel were investigated. Surface Plasmon Resonance and Photopyroelectric techniques were used to evaluate the samples, yielding the dispersion curve and the thermal diffusivity. The refractive index varies more rapidly with wavelength in normal grade biodiesel than in winter grade palm oil biodiesel, and the thermal diffusivity of winter grade biodiesel is higher than that of normal grade biodiesel. This is attributed to the higher palmitic acid (C16:0) content in normal grade than in winter grade palm oil biodiesel. PMID:21731429

  18. PubMed

    Bueno Vargas, Pilar; Manzano Martín, Manuel; López-Aliaga, Inmaculada; López Pedrosa, José M.ª

    2016-09-20

    Introduction: Gestation and lactation are associated with temporary losses in maternal bone mineral density (BMD). Calcium supplementation could be beneficial in preventing bone mass loss from the maternal skeleton. Other nutrients, such as prebiotics, have been identified as responsible for increased mineral absorption and may influence bone mineralization. Objective: To study the effect of supplementing the maternal diet with the prebiotic oligofructose-enriched inulin during gestation and lactation on bone mineral content (BMC) and BMD at the end of the lactation period. Methods: Pregnant rats were fed a standard diet (CC group), a calcium-fortified diet (Ca group), or a diet enriched with the prebiotic oligofructose-enriched inulin (Pre group) until the end of the lactation period. BMC and BMD were then assessed by dual-energy X-ray absorptiometry (DEXA), along with the pH of the cecal content. Results: In general terms, the Pre group showed the highest absolute BMC and BMD values of the three groups; in the tibia, the CC and Pre groups differed significantly from the Ca group. The pH of the cecal content in the Pre group was significantly lower than in the CC and Ca groups. Conclusion: Supplementation with oligofructose-enriched inulin during gestation and lactation, under calcium-sufficient nutritional conditions, protects the maternal skeleton in rats and may be considered a nutritional strategy for protecting maternal bone mass in the perinatal period.

  19. EMG normalization method based on grade 3 of manual muscle testing: Within- and between-day reliability of normalization tasks and application to gait analysis.

    PubMed

    Tabard-Fougère, Anne; Rose-Dulcina, Kevin; Pittet, Vincent; Dayer, Romain; Vuillerme, Nicolas; Armand, Stéphane

    2018-02-01

    Electromyography (EMG) is an important parameter in Clinical Gait Analysis (CGA), and is generally interpreted with timing of activation. EMG amplitude comparisons between individuals, muscles or days need normalization. There is no consensus on existing methods. The gold standard, maximum voluntary isometric contraction (MVIC), is not adapted to pathological populations because patients are often unable to perform an MVIC. The normalization method inspired by the isometric grade 3 of manual muscle testing (isoMMT3), which is the ability of a muscle to maintain a position against gravity, could be an interesting alternative. The aim of this study was to evaluate the within- and between-day reliability of the isoMMT3 EMG normalizing method during gait compared with the conventional MVIC method. Lower limb muscle EMG (gluteus medius, rectus femoris, tibialis anterior, semitendinosus) was recorded bilaterally in nine healthy participants (five males, aged 29.7 ± 6.2 years, BMI 22.7 ± 3.3 kg m⁻²), giving a total of 18 independent legs. Three repeated measurements of the isoMMT3 and MVIC exercises were performed with an EMG recording. EMG amplitude of the muscles during gait was normalized by these two methods. This protocol was repeated one week later. Within- and between-day reliability of normalization tasks were similar for the isoMMT3 and MVIC methods. Within- and between-day reliability of gait EMG normalized by isoMMT3 was higher than with MVIC normalization. These results indicate that EMG normalization using isoMMT3 is a reliable method with no special equipment needed and will support CGA interpretation. The next step will be to evaluate this method in pathological populations. Copyright © 2017 Elsevier B.V. All rights reserved.
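
The amplitude-normalization step described in this abstract can be sketched in a few lines: gait EMG is expressed as a percentage of a reference amplitude taken from a normalization task (here, the mean rectified amplitude of a simulated reference contraction). The signals, sampling rate, and amplitudes below are invented for illustration; this is not the study's processing pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

fs = 1000                                    # sampling rate, Hz (assumed)
gait_emg = rng.normal(0.0, 0.12, fs * 2)     # 2 s of simulated raw gait EMG (mV)
ref_task_emg = rng.normal(0.0, 0.30, fs * 5) # 5 s simulated reference contraction

def mean_rectified(x):
    """Mean absolute value, a simple EMG amplitude estimate."""
    return np.mean(np.abs(x))

reference = mean_rectified(ref_task_emg)             # reference-task amplitude
normalized = 100.0 * np.abs(gait_emg) / reference    # gait EMG, % of reference

ratio = mean_rectified(gait_emg) / reference * 100
print(f"gait amplitude = {ratio:.0f}% of the reference task")
```

Whether the reference comes from an MVIC or an isoMMT3 trial, the arithmetic is the same division; the study's question is which reference task yields the more repeatable denominator across sessions.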

  20. Downstream valorization and comprehensive two-dimensional liquid chromatography-based chemical characterization of bioactives from black chokeberries (Aronia melanocarpa) pomace.

    PubMed

    Brazdauskas, T; Montero, L; Venskutonis, P R; Ibañez, E; Herrero, M

    2016-10-14

    In this work, a new alternative for the downstream processing and valorization of black chokeberry (Aronia melanocarpa) pomace, which could potentially be coupled to a biorefinery process, is proposed. This alternative is based on the application of pressurized liquid extraction (PLE) to the residue obtained after the supercritical fluid extraction of the berry pomace. An experimental design is employed to study and optimize the most relevant extraction conditions in order to attain extracts with high extraction yields, total phenol content and antioxidant activity. Moreover, the PLE extracts were characterized by using a new method based on comprehensive two-dimensional liquid chromatography in order to correlate their activity with their chemical composition. Thanks to this powerful analytical tool, 61 compounds could be separated, allowing the tentative identification of different anthocyanins, proanthocyanidins, flavonoids and phenolic acids. By using the optimized PLE approach (pressurized 46% ethanol in water at 165°C containing 1.8% formic acid), extracts with high total phenol content (236.6 mg GAE g⁻¹ extract) and high antioxidant activities (4.35 mmol TE g⁻¹ extract and EC50 5.92 μg mL⁻¹) could be obtained with high yields (72.5%). Copyright © 2016 Elsevier B.V. All rights reserved.