Science.gov

Sample records for mdct generally underestimates

  1. Have We Substantially Underestimated the Impact of Improved Sanitation Coverage on Child Health? A Generalized Additive Model Panel Analysis of Global Data on Child Mortality and Malnutrition

    PubMed Central

    Prüss-Ustün, Annette

    2016-01-01

    Background Although widely accepted as one of the most important public health advances of the past hundred years, the contribution that improving sanitation coverage can make to child health is still unclear, especially since the publication of two large studies of sanitation in India which found no effect on child morbidity. We hypothesise that the value of sanitation comes not directly from use of improved sanitation but from improving community coverage. If this is so, we further hypothesise that the relationship between sanitation coverage and child health will be non-linear and that most of any health improvement will accrue as sanitation becomes universal. Methods We report a fixed-effects panel analysis of country-level data using Generalized Additive Models in R. Outcome variables were under-5 childhood mortality, neonatal mortality, under-5 childhood mortality from diarrhoea, and the proportions of children under 5 with stunting and with underweight. Predictor variables were % coverage by improved sanitation, improved water source, Gross Domestic Product per capita and Health Expenditure per capita. We also identified three studies reporting incidence of diarrhoea in children under five alongside gains in community coverage of improved sanitation. Findings For each of the five outcome variables, sanitation coverage was independently associated with the outcome, but this association was highly non-linear. Improving sanitation coverage was very strongly associated with under-5 diarrhoea mortality, under-5 all-cause mortality, and all-cause neonatal mortality. There was a decline as sanitation coverage increased up to about 20%, but then no further decline was seen until about 70% (60% for diarrhoea mortality and 80% for neonatal mortality, respectively). The association was less strong for stunting and underweight, but a threshold at about 50% coverage was also seen. 
Three large trials of sanitation on diarrhoea morbidity gave results that were similar
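The study's central claim — that a linear coverage term would miss a threshold-shaped dose-response — can be illustrated without a full GAM. A minimal sketch on synthetic data (the sigmoid shape and the degree-5 polynomial standing in for a GAM smooth term are assumptions for illustration, not the paper's fitted model):

```python
import numpy as np

# Hypothetical threshold-shaped dose-response: little mortality benefit
# until sanitation coverage nears universality (shape is an assumption
# loosely mimicking the paper's finding, not its fitted curve).
coverage = np.linspace(0.0, 100.0, 201)                  # % coverage
mortality = 50.0 - 30.0 / (1.0 + np.exp(-(coverage - 80.0) / 5.0))

# A straight line misses the threshold; a flexible fit (degree-5
# polynomial as a crude stand-in for a GAM smooth term) captures it.
z = coverage / 100.0                  # rescale for numerical stability
linear = np.polyval(np.polyfit(z, mortality, 1), z)
smooth = np.polyval(np.polyfit(z, mortality, 5), z)

rmse_linear = float(np.sqrt(np.mean((mortality - linear) ** 2)))
rmse_smooth = float(np.sqrt(np.mean((mortality - smooth) ** 2)))
```

The flexible fit leaves a much smaller residual, which is why a GAM can detect an association that a purely linear panel model would understate.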

  2. Underestimation of Project Costs

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    Large projects almost always exceed their budgets. Estimating cost is difficult and estimated costs are usually too low. Three different reasons are suggested: bad luck, over-optimism, and deliberate underestimation. Project management can usually point to project difficulty and complexity, technical uncertainty, stakeholder conflicts, scope changes, unforeseen events, and other not really unpredictable bad luck. Project planning is usually over-optimistic, so the likelihood and impact of bad luck are systematically underestimated. Project plans reflect optimism and hope for success in a supposedly unique new effort rather than rational expectations based on historical data. Past project problems are claimed to be irrelevant because "This time it's different." Some bad luck is inevitable and reasonable optimism is understandable, but deliberate deception must be condemned. In a competitive environment, project planners and advocates often deliberately underestimate costs to help gain project approval and funding. Project benefits, cost savings, and probability of success are exaggerated and key risks ignored. Project advocates have incentives to distort information and conceal difficulties from project approvers. One naively suggested cure is more openness, honesty, and group adherence to shared overall goals. A more realistic alternative is threatening overrun projects with cancellation. Neither approach seems to solve the problem. A better method to avoid the delusions of over-optimism and the deceptions of biased advocacy is to base the project cost estimate on the actual costs of a large group of similar projects. Over-optimism and deception can continue beyond the planning phase and into project execution. Hard milestones based on verified tests and demonstrations can provide a reality check.
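The remedy the abstract endorses — basing the estimate on the actual costs of a large group of similar projects — is commonly called reference-class forecasting. A hedged sketch (the overrun ratios and the 80th-percentile uplift are illustrative assumptions, not data from the report):

```python
import numpy as np

# Hypothetical actual/estimated cost ratios for a reference class of
# completed, similar projects (values are illustrative only).
overrun_ratios = np.array([1.1, 1.3, 1.0, 1.8, 1.2, 2.4, 1.4, 1.6, 1.1, 1.9])

def reference_class_estimate(bottom_up_estimate, ratios, percentile=80):
    """Uplift a bottom-up estimate by a chosen percentile of historical
    overrun ratios instead of trusting the optimistic plan."""
    return bottom_up_estimate * np.percentile(ratios, percentile)

budget = reference_class_estimate(10.0, overrun_ratios)  # e.g. a $10M plan
```

Choosing a high percentile trades a larger up-front budget for a lower probability of overrun; the percentile itself is a policy decision, not a statistical one.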

  3. MDCT Versus MRI Assessment of Tumor Response After Transarterial Chemoembolization for the Treatment of Hepatocellular Carcinoma

    SciTech Connect

    Kloeckner, Roman; Otto, Gerd; Biesterfeld, Stefan; Oberholzer, Katja; Dueber, Christoph; Pitton, Michael B.

    2010-06-15

    The purpose of this study was to compare the ability of multidetector computed tomography (MDCT) and magnetic resonance imaging (MRI) to evaluate treatment results after transarterial chemoembolization (TACE), with a special focus on the influence of Lipiodol on calculation of tumor necrosis according to EASL criteria. A total of 115 nodules in 20 patients (17 males, 3 females; 69.5 ± 9.35 years) with biopsy-proven hepatocellular carcinoma were treated with TACE. Embolization was performed using a doxorubicin-Lipiodol emulsion (group I) or DC Beads loaded with doxorubicin (group II). Follow-up included triphasic contrast-enhanced 64-row MDCT (collimation, 0.625 mm; slice, 3 mm; contrast bolus, 120 ml iomeprol; delay by bolus trigger) and contrast-enhanced MRI (T1 native, T2 native; five dynamic contrast-enhanced phases; 0.1 mmol/kg body weight gadolinium-DTPA; slice thickness, 4 mm). Residual tumor and the extent of tumor necrosis were evaluated according to EASL. Contrast enhancement within tumor lesions was suspected to represent vital tumor. In the Lipiodol-based TACE protocol, MDCT underestimated residual viable tumor compared to MRI, due to Lipiodol artifacts (23.2% vs 47.7% after first, 11.9% vs 31.2% after second, and 11.4% vs 23.7% after third TACE; p = 0.0014, p < 0.001, and p < 0.001, respectively). In contrast to MDCT, MRI was completely free of any artifacts caused by Lipiodol. In the DC Bead-based Lipiodol-free TACE protocol, MRI and CT showed similar residual tumor and rating of treatment results (46.4% vs 41.2%, 31.9% vs 26.8%, and 26.0% vs 25.6%; n.s.). In conclusion, MRI is superior to MDCT for detection of viable tumor residuals after Lipiodol-based TACE. Since viable tumor tissue is superimposed by Lipiodol artifacts in MDCT, MRI is mandatory for reliable decision-making during follow-up after Lipiodol-based TACE protocols.

  4. Postmortem imaging: MDCT features of postmortem change and decomposition.

    PubMed

    Levy, Angela D; Harcke, Howard Theodore; Mallak, Craig T

    2010-03-01

    Multidetector computed tomography (MDCT) has emerged as an effective imaging technique to augment forensic autopsy. Postmortem change and decomposition are always present at autopsy and on postmortem MDCT because they begin to occur immediately upon death. Consequently, postmortem change and decomposition on postmortem MDCT should be recognized and not mistaken for a pathologic process or injury. Livor mortis increases the attenuation of vasculature and dependent tissues on MDCT. It may also produce a hematocrit effect, with fluid levels in large-caliber blood vessels and cardiac chambers from dependent layering of erythrocytes. Rigor mortis and algor mortis have no specific MDCT features. In contrast, decomposition through autolysis, putrefaction, and insect and animal predation produces dramatic alterations in the appearance of the body on MDCT. Autolysis alters the attenuation of organs. The most dramatic autolytic changes on MDCT are seen in the brain, where cerebral sulci and ventricles are effaced and gray-white matter differentiation is lost almost immediately after death. Putrefaction produces a pattern of gas that begins with intravascular gas and proceeds to gaseous distension of all anatomic spaces, organs, and soft tissues. Knowledge of the spectrum of postmortem change and decomposition is an important component of postmortem MDCT interpretation.

  5. State-of-the-art preoperative staging of gastric cancer by MDCT and magnetic resonance imaging

    PubMed Central

    Choi, Joon-Il; Joo, Ijin; Lee, Jeong Min

    2014-01-01

    Gastric cancer is one of the most common and fatal cancers. The importance of accurate staging for gastric cancer has become more critical due to the recent introduction of less invasive treatment options, such as endoscopic mucosal resection or laparoscopic surgery. The tumor-node-metastasis staging system is the generally accepted staging system for predicting the prognosis of patients with gastric cancer. Multidetector row computed tomography (MDCT) is a widely accepted imaging modality for the preoperative staging of gastric cancer that can simultaneously assess locoregional staging, including the gastric mass, regional lymph nodes, and distant metastasis. The diagnostic performance of MDCT for T- and N-staging has been improved by the technical development of isotropic imaging and 3D reformation. Although magnetic resonance imaging (MRI) was not previously used to evaluate gastric cancer due to the modality’s limitations, the development of high-speed sequences has made MRI a feasible tool for the staging of gastric cancer. PMID:24782607

  6. Is soil carbon storage underestimated?

    PubMed

    Díaz-Hernández, José Luis

    2010-06-01

    An accurate evaluation of the carbon stored in soils is essential to fully understand the role of soils as a source or sink of atmospheric CO2, as well as the feedback processes involved in soil-atmosphere CO2 exchange. Depth and strategies of sampling have been, and still are, sources of uncertainty, because most current estimates of carbon storage in soils are based on conventional soil surveys and data sets compiled primarily for agricultural purposes. In a study of the Guadix-Baza basin, a semiarid area of southern Spain, sizeable amounts of carbon have been found stored in the subsoil. Total carbon estimated within 2 m was 141.3 kg C m-2, compared to 36.1 kg C m-2 if estimates were based solely on conventional soil depths (e.g. 40 cm in Regosols and 100 cm in Fluvisols). Thus, insufficient sampling depth could lead to considerable underestimation of global soil carbon. In order to correctly evaluate the carbon content in world soils, more specific studies must be planned and carried out, especially in soils where caliche and other carbonate-cemented horizons are present.
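The effect of sampling depth can be illustrated by integrating a carbon-density profile to the conventional depth versus the full 2 m. A sketch with a hypothetical profile (the linear density function is an assumption for illustration, not the Guadix-Baza measurements):

```python
import numpy as np

# Illustrative carbon-density profile (kg C per m^3) that stays high at
# depth, as in carbonate-rich subsoils; the profile is hypothetical.
depth = np.linspace(0.0, 2.0, 201)     # m
density = 80.0 - 10.0 * depth          # kg C m^-3, mild decline with depth

def stock_to(limit):
    """Carbon stock (kg C m^-2): trapezoidal integral of the density
    profile from the surface down to the given depth limit (m)."""
    m = depth <= limit + 1e-9
    y, x = density[m], depth[m]
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

shallow = stock_to(0.4)   # conventional Regosol sampling depth
full = stock_to(2.0)      # the study's full 2 m profile
```

Even with a gently declining profile, truncating at 40 cm captures well under a quarter of the 2 m stock, the same order of underestimation the study reports.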

  7. Vascular involvement in periampullary tumors: MDCT, EUS, and CDU.

    PubMed

    Gusmini, S; Nicoletti, R; Martinenghi, C; Del Maschio, A

    2009-07-01

    In patients affected by periampullary tumors, surgical resection represents the only treatment with curative intent. Preoperative evaluation of vascular involvement is necessary to avoid surgery that cannot achieve a curative resection. The aim of our update article is to assess the performance of multidetector computed tomography (MDCT), endoscopic ultrasonography (EUS), and color Doppler ultrasonography (CDU) in the evaluation of vascular involvement of the major peripancreatic vessels in periampullary tumors, analyzing the current and past literature.

  8. Radiation dose measurement for various parameters in MDCT

    NASA Astrophysics Data System (ADS)

    Lee, Chang-Lae; Kim, Hee-Joung; Jeon, Seong Su; Cho, Hyo-Min; Nam, So Ra; Jung, Ji-Young

    2008-03-01

    The MDCT parameters affecting radiation dose include tube voltage, tube current, beam collimation, and the size of the human body. The purpose of this study was to measure and evaluate radiation dose for these MDCT parameters. Radiation dose was compared before and after calibration of the ionization chamber, which was used to measure dose in the MDCT and to determine CTDIw with temperature and pressure correction factors for the CT room. As a result, CTDIw values increased linearly as tube voltage and current were increased, and decreased nonlinearly as beam collimation was increased. The CTDIw value reflecting the calibration factor, together with the temperature and pressure correction factors, was greater than the uncorrected value by 0.479-3.162 mGy in effective radiation dose. Under the routine abdominal CT conditions used in hospitals, patient exposure dose differed by a maximum of 0.7 mSv before and after application of these factors. These results imply that calibration of the ion chamber and correction for the temperature and pressure of the CT room are crucial in measuring and calculating patient exposure dose.
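For a vented ionization chamber, the temperature-pressure correction mentioned above is the standard air-density factor k_TP. A minimal sketch (the reference conditions of 22 °C and 101.325 kPa, the calibration factor, and the raw reading are assumptions for illustration, not values from the study):

```python
def k_tp(temp_c, pressure_kpa, ref_temp_c=22.0, ref_pressure_kpa=101.325):
    """Temperature-pressure correction for a vented ionization chamber:
    k_TP = (273.15 + T) / (273.15 + T0) * (P0 / P).
    Reference conditions are assumed values, not from the paper."""
    return ((273.15 + temp_c) / (273.15 + ref_temp_c)
            * (ref_pressure_kpa / pressure_kpa))

# Corrected dose = raw chamber reading * calibration factor * k_TP
# (15.0 mGy and 1.02 are illustrative numbers).
corrected = 15.0 * 1.02 * k_tp(24.0, 100.0)
```

A warm room or low ambient pressure lowers the air density in the chamber, so k_TP rises above 1 and the uncorrected reading underestimates the dose.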

  9. MDCT imaging of the stomach: advances and applications.

    PubMed

    Nagpal, Prashant; Prakash, Anjali; Pradhan, Gaurav; Vidholia, Aditi; Nagpal, Nishant; Saboo, Sachin S; Kuehn, David M; Khandelwal, Ashish

    2017-01-01

    The stomach may be affected by a myriad of pathologies, ranging from benign aetiologies like inflammation to malignant aetiologies like carcinoma or lymphoma. Multidetector CT (MDCT) of the stomach is the first-line imaging for patients with suspected gastric pathologies. Conventionally, CT imaging had the advantage of simultaneous detection of mural and extramural disease extent, but advances in MDCT have also allowed mucosal assessment by virtual endoscopy (VE). Better three-dimensional (3D) post-processing techniques have enabled more robust and accurate pre-operative planning in patients undergoing gastrectomy, and even prediction of the response to surgery in patients undergoing laparoscopic sleeve gastrectomy for weight loss. The ability of CT to obtain stomach volume (for bariatric surgery patients) and 3D VE images depends on various patient and protocol factors that are important for a radiologist to understand. We review the appropriate CT imaging protocol for patients with suspected gastric pathologies and highlight the imaging pearls of various gastric pathologies on CT and VE.

  10. MDCT of hand and wrist infections: emphasis on compartmental anatomy.

    PubMed

    Ahlawat, S; Corl, F M; LaPorte, D M; Fishman, E K; Fayad, L M

    2017-04-01

    Hand and wrist infections can present with a spectrum of manifestations ranging from cellulitis to deep-space collections. The various infectious processes can be categorised as superficial or deep infections based on their respective locations relative to the tendons. Superficial hand infections are located superficial to the tendons and comprise cellulitis, lymphangitis, paronychia, pulp-space infections, herpetic whitlow, and volar as well as dorsal subcutaneous abscesses. Deep hand infections are located deep to the tendon sheaths and include synovial space infections, such as infectious tenosynovitis, deep fascial space infections, septic arthritis, necrotising fasciitis, and osteomyelitis. Knowledge of hand and wrist compartmental anatomy is essential for the accurate diagnosis and management of hand infections. Although early and superficial infections of the hand may respond to non-surgical management, most hand infections are surgical emergencies. Multidetector computed tomography (MDCT), with its multiplanar reformation (MPR) and three-dimensional (3D) capabilities, is a powerful tool in the emergency setting for the evaluation of acute hand and wrist pathology. The clinical and imaging features of hand and wrist infections as evident on MDCT are reviewed, with emphasis on contiguous and closed synovial and deep fascial spaces. Knowledge of hand compartmental anatomy enables accurate characterisation of the infectious process and localisation of the extent of disease in the acute setting.

  11. Mixed-radix Algorithm for the Computation of Forward and Inverse MDCT

    PubMed Central

    Wu, Jiasong; Shu, Huazhong; Senhadji, Lotfi; Luo, Limin

    2008-01-01

    The modified discrete cosine transform (MDCT) and inverse MDCT (IMDCT) are two of the most computationally intensive operations in the MPEG audio coding standards. A new mixed-radix algorithm for efficiently computing the MDCT/IMDCT is presented. The proposed mixed-radix MDCT algorithm is composed of two recursive algorithms. The first, called the radix-2 decimation-in-frequency (DIF) algorithm, is obtained by decomposing an N-point MDCT into two MDCTs of length N/2. The second, called the radix-3 decimation-in-time (DIT) algorithm, is obtained by decomposing an N-point MDCT into three MDCTs of length N/3. Since the proposed MDCT algorithm is also expressed in the form of a simple sparse matrix factorization, the corresponding IMDCT algorithm can be derived by simply transposing the factorization. Comparison with existing algorithms shows that the proposed algorithm is more suitable for parallel implementation and especially well suited to layer III of MPEG-1 and MPEG-2 audio encoding and decoding. Moreover, it extends easily to the multidimensional case using the vector-radix method. PMID:21258639
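For reference, the MDCT/IMDCT pair this algorithm accelerates can be written directly from its definition. A minimal O(N²) sketch (not the mixed-radix factorization itself), with a numerical check of the time-domain aliasing cancellation (TDAC) property that makes the transform useful for 50%-overlapped audio blocks:

```python
import numpy as np

def mdct(x):
    """Direct O(N^2) MDCT: 2N real samples -> N coefficients."""
    N = len(x) // 2
    n, k = np.arange(2 * N), np.arange(N)
    C = np.cos(np.pi / N * (n[None, :] + 0.5 + N / 2) * (k[:, None] + 0.5))
    return C @ x

def imdct(X):
    """Direct O(N^2) inverse MDCT: N coefficients -> 2N samples.
    The same cosine matrix transposed (and scaled by 1/N) gives the
    inverse, mirroring the matrix-transposition argument in the abstract."""
    N = len(X)
    n, k = np.arange(2 * N), np.arange(N)
    C = np.cos(np.pi / N * (n[:, None] + 0.5 + N / 2) * (k[None, :] + 0.5))
    return (C @ X) / N

# TDAC: with 50%-overlapped blocks, overlap-adding the IMDCT outputs
# reconstructs every interior sample exactly despite the 2:1 decimation.
rng = np.random.default_rng(0)
N = 8
sig = rng.standard_normal(6 * N)
recon = np.zeros_like(sig)
for start in range(0, len(sig) - 2 * N + 1, N):
    recon[start:start + 2 * N] += imdct(mdct(sig[start:start + 2 * N]))
```

Each block alone is time-aliased; it is only the overlap-add of adjacent blocks that cancels the aliasing, which is why the first and last half-blocks are not reconstructed.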

  12. Polyarteritis nodosa: MDCT as a 'One-Stop Shop' Modality for Whole-Body Arterial Evaluation

    SciTech Connect

    Tsai, W.-L.; Tsai, I-C.; Lee Tain; Hsieh, C.-W.

    2008-07-15

    Polyarteritis nodosa is a rare disease, which is characterized by aneurysm formation and occlusion in the arteries of multiple systems. Due to its extensive involvement, whole-body evaluation is necessary for diagnosis and treatment monitoring. We report a case of polyarteritis nodosa using multidetector-row computed tomography (MDCT) as a 'one-stop shop' modality for whole-body arterial evaluation. With precise protocol design, MDCT can be used as a reliable noninvasive modality providing comprehensive whole-body arterial evaluation.

  13. SDU: A Semidefinite Programming-Based Underestimation Method for Stochastic Global Optimization in Protein Docking.

    PubMed

    Paschalidis, Ioannis Ch; Shen, Yang; Vakili, Pirooz; Vajda, Sandor

    2007-04-01

    This paper introduces a new stochastic global optimization method targeting protein-protein docking problems, an important class of problems in computational structural biology. The method is based on finding general convex quadratic underestimators to the funnel-like binding energy function. Finding the optimal underestimator requires solving a semidefinite programming problem, hence the name semidefinite programming-based underestimation (SDU). The underestimator is used to bias sampling in the search region. It is established that under appropriate conditions SDU locates the global energy minimum with probability approaching one as the sample size grows. A detailed comparison of SDU with the related convex global underestimator (CGU) method, and computational results for protein-protein docking problems, are provided.
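The idea of fitting the tightest convex quadratic underestimator can be sketched in one dimension, where the convexity constraint reduces to a ≥ 0 and the fit becomes a linear program; in higher dimensions convexity requires the quadratic's matrix to be positive semidefinite, which is what makes the full problem semidefinite programming. A sketch under that 1D simplification (the sample "energy" function is illustrative, not a docking energy):

```python
import numpy as np
from scipy.optimize import linprog

# Funnel-like 1D "energy" samples (illustrative only).
x = np.linspace(-2.0, 2.0, 21)
f = x ** 4 - x ** 2 + 0.2 * np.cos(5.0 * x)

# Fit q(x) = a*x^2 + b*x + c with q(x_i) <= f(x_i) at every sample and
# a >= 0 (convexity), maximizing sum_i q(x_i) so the underestimator is
# as tight as possible. In R^n the convexity condition becomes Q PSD,
# turning this LP into an SDP.
A_ub = np.column_stack([x ** 2, x, np.ones_like(x)])
c_obj = -A_ub.sum(axis=0)          # maximize the sum => minimize its negative
res = linprog(c_obj, A_ub=A_ub, b_ub=f,
              bounds=[(0.0, None), (None, None), (None, None)])
a, b, c = res.x
```

The minimizer of the fitted quadratic then suggests where the funnel bottom lies, and sampling can be biased toward it, which is the role the underestimator plays in SDU.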

  14. Quantification of arterial plaque and lumen density with MDCT

    SciTech Connect

    Paul, Narinder S.; Blobel, Joerg; Kashani, Hany; Rice, Murray; Ursani, Ali

    2010-08-15

    Purpose: This study aimed to derive a mathematical correction function to normalize CT number measurements for small-volume arterial plaque and small vessel-mimicking objects imaged with multidetector CT (MDCT). Methods: A commercially available calcium plaque phantom (QRM GmbH, Moehrendorf, Germany) and a custom-built cardiovascular phantom were scanned with 320- and 64-MDCT scanners. The calcium hydroxyapatite plaque phantom contained objects 0.5-5.0 mm in diameter with known nominal CT attenuation values ranging from 50 to 800 HU. The cardiovascular phantom contained vessel-mimicking objects 1.0-5.0 mm in diameter with different contrast media. Both phantoms were scanned using clinical protocols for CT angiography and images were reconstructed with different filter kernels. The measured CT number (HU) and diameter of each object were analyzed on three clinical postprocessing workstations. From the resultant data, a mathematical formula based on the absorption function exp(-μ·d) was derived to describe the relation between measured CT numbers and object diameters. Results: The percentage reduction in measured CT number (HU) for the group of selected filter kernels, apparent during CT angiography, depends only on the object size (plaque or vessel diameter). The derived formula, of the form 1 - c·exp(-a·d^b), showed a reduction in CT number for objects between 0.5 and 5 mm in diameter, with an asymptote reaching background noise for small objects with diameters approaching the CT in-plane resolution (0.35 mm). No reduction was observed for objects with diameters of 5 mm or larger. Conclusions: A clear mathematical relationship exists between object diameter and the reduction in measured CT number in HU. This function is independent of exposure parameters and of the inherent attenuation properties of the objects studied. 
Future developments include the incorporation of this mathematical model function into quantification software in order to
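The reported functional form 1 - c·exp(-a·d^b) can be fitted to measured/nominal CT-number ratios with a standard least-squares routine. A sketch on synthetic data (the parameter values a=0.8, b=1.5, c=0.9 are placeholder assumptions, not the paper's fitted values):

```python
import numpy as np
from scipy.optimize import curve_fit

def cn_ratio(d, a, b, c):
    """Measured/nominal CT-number ratio vs object diameter d (mm),
    using the reported functional form 1 - c*exp(-a*d**b)."""
    return 1.0 - c * np.exp(-a * d ** b)

# Synthetic "measurements" generated from assumed parameters; real
# values would come from the phantom scans.
d = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])
ratio = cn_ratio(d, 0.8, 1.5, 0.9)

popt, _ = curve_fit(cn_ratio, d, ratio, p0=[1.0, 1.0, 0.5])
```

Once fitted, the inverse of the ratio at a given diameter acts as the correction factor, consistent with the observation that the ratio approaches 1 (no correction) for objects of 5 mm and larger.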

  15. Three-dimensional reconstruction of upper airways from MDCT

    NASA Astrophysics Data System (ADS)

    Perchet, Diane; Fetita, Catalin; Preteux, Francoise

    2005-03-01

    Under the framework of clinical respiratory investigation, providing accurate modalities for morpho-functional analysis is essential for diagnosis improvement, surgical planning and follow-up. This paper focuses on the upper airways investigation and develops an automated approach for 3D mesh reconstruction from MDCT acquisitions. In order to overcome the difficulties related to the complex morphology of the upper airways and to the image gray level heterogeneity of the airway lumens and thin bony septa, the proposed 3D reconstruction methodology combines 2D segmentation and 3D surface regularization approaches. The segmentation algorithm relies on mathematical morphology theory and provides airway lumen robust discrimination from the surrounding tissues, while preserving the connectivity relationship between the different anatomical structures. The 3D regularization step uses an energy-based modeling in order to achieve a smooth and well-fitted 3D surface of the upper airways. An accurate 3D mesh representation of the reconstructed airways makes it possible to develop specific clinical applications such as virtual endoscopy, surgical planning and computer assisted intervention. In addition, building up patient-specific 3D models of upper airways is highly valuable for the study and design of inhaled medication delivery via computational fluid dynamics (CFD) simulations.

  16. Accurate 3D quantification of the bronchial parameters in MDCT

    NASA Astrophysics Data System (ADS)

    Saragaglia, A.; Fetita, C.; Preteux, F.; Brillet, P. Y.; Grenier, P. A.

    2005-08-01

    The assessment of bronchial reactivity and wall remodeling in asthma plays a crucial role in better understanding such a disease and evaluating therapeutic responses. Today, multi-detector computed tomography (MDCT) makes it possible to perform an accurate estimation of bronchial parameters (lumen and wall areas) by allowing a quantitative analysis in a cross-section plane orthogonal to the bronchus axis. This paper provides the tools for such an analysis by developing a 3D investigation method which relies on 3D reconstruction of the bronchial lumen and central-axis computation. Cross-section images at bronchial locations interactively selected along the central axis are generated at appropriate spatial resolution. An automated approach is then developed for accurately segmenting the inner and outer bronchial contours on the cross-section images. It combines mathematical morphology operators, such as "connection cost", and energy-controlled propagation in order to overcome the difficulties raised by vessel adjacencies and wall irregularities. The segmentation accuracy was validated with respect to a 3D mathematically modeled phantom of a bronchus-vessel pair which mimics the characteristics of real data in terms of gray-level distribution, caliber and orientation. When applying the developed quantification approach to such a model with calibers ranging from 3 to 10 mm in diameter, the lumen area relative errors varied from 3.7% to 0.15%, while the bronchus area was estimated with a relative error of less than 5.1%.

  17. Quantitative analysis of the central-chest lymph nodes based on 3D MDCT image data

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Bascom, Rebecca; Mahraj, Rickhesvar P. M.; Higgins, William E.

    2009-02-01

    Lung cancer is the leading cause of cancer death in the United States. In lung-cancer staging, central-chest lymph nodes and associated nodal stations, as observed in three-dimensional (3D) multidetector CT (MDCT) scans, play a vital role. However, little work has been done in relation to lymph nodes, based on MDCT data, due to the complicated phenomena that give rise to them. Using our custom computer-based system for 3D MDCT-based pulmonary lymph-node analysis, we conduct a detailed study of lymph nodes as depicted in 3D MDCT scans. In this work, the Mountain lymph-node stations are automatically defined by the system. These defined stations, in conjunction with our system's image processing and visualization tools, facilitate lymph-node detection, classification, and segmentation. An expert pulmonologist, chest radiologist, and trained technician verified the accuracy of the automatically defined stations and indicated observable lymph nodes. Next, using semi-automatic tools in our system, we defined all indicated nodes. Finally, we performed a global quantitative analysis of the characteristics of the observed nodes and stations. This study drew upon a database of 32 human MDCT chest scans. 320 Mountain-based stations (10 per scan) and 852 pulmonary lymph nodes were defined overall from this database. Based on the numerical results, over 90% of the automatically defined stations were deemed accurate. This paper also presents a detailed summary of central-chest lymph-node characteristics for the first time.

  18. Feasibility of Free-breathing CCTA using 256-MDCT.

    PubMed

    Liu, Zhuo; Sun, Ye; Zhang, Zhuolu; Chen, Lei; Hong, Nan

    2016-07-01

    Coronary computed tomography angiography (CCTA) is usually performed during breath-holding to reduce artifacts caused by respiration. The objective of this study was to evaluate the feasibility of free-breathing CCTA, compared to breath-holding, on a wide-detector 256-MDCT scanner. Of 80 patients who underwent CCTA, 40 were scanned during breath-holding (group A) and the remaining 40 during free-breathing (group B). Quality scores for the coronary arteries were analyzed and defined as 3 (excellent), 2 (good), and 1 (poor). Image noise, signal-to-noise ratio, and effective radiation dose, as well as heart rate variation, were compared. Noise, signal-to-noise ratio, and effective radiation dose were not significantly different between the two groups. The mean heart rate variation between planning and scanning was 7 ± 7.6 bpm for group A, larger than the 3 ± 2.6 bpm for group B (P = 0.012). Quality scores of the free-breathing group were better than those of the breath-holding group (group A: 2.55 ± 0.64, group B: 2.85 ± 0.36, P = 0.018). Free-breathing CCTA on a wide-detector CT scanner is feasible, providing acceptable image quality with reduced heart rate variation and better images for certain patients.

  19. Robust extraction of the aorta and pulmonary artery from 3D MDCT image data

    NASA Astrophysics Data System (ADS)

    Taeprasartsit, Pinyo; Higgins, William E.

    2010-03-01

    Accurate definition of the aorta and pulmonary artery from three-dimensional (3D) multi-detector CT (MDCT) images is important for pulmonary applications. This work presents robust methods for defining the aorta and pulmonary artery in the central chest. The methods work on both contrast-enhanced and non-contrast 3D MDCT image data. The automatic methods use a common approach employing model fitting and selection and adaptive refinement. In the occasional event that more precise vascular extraction is desired or the method fails, an alternative semi-automatic fail-safe method is available. The semi-automatic method extracts the vasculature by extending the medial axes in a user-guided direction. A ground-truth study over a series of 40 human 3D MDCT images demonstrates the efficacy, accuracy, robustness, and efficiency of the methods.

  20. Segmentation of the central-chest lymph nodes in 3D MDCT images.

    PubMed

    Lu, Kongkuo; Higgins, William E

    2011-09-01

    Central-chest lymph nodes play a vital role in lung-cancer staging. The definition of lymph nodes from three-dimensional (3D) multidetector computed-tomography (MDCT) images, however, remains an open problem. We propose two methods for computer-based segmentation of the central-chest lymph nodes from a 3D MDCT scan: the single-section live wire and the single-click live wire. For the single-section live wire, the user first applies the standard live wire to a single two-dimensional (2D) section after which automated analysis completes the segmentation process. The single-click live wire is similar but is almost completely automatic. Ground-truth studies involving human 3D MDCT scans demonstrate the robustness, efficiency, and intra-observer and inter-observer reproducibility of the methods.

  21. Three-dimensional MDCT angiography of splanchnic arteries: pearls and pitfalls.

    PubMed

    Dohan, A; Dautry, R; Guerrache, Y; Fargeaudou, Y; Boudiaf, M; Le Dref, O; Sirol, M; Soyer, P

    2015-02-01

    Fast scanning along with high resolution of multidetector computed tomography (MDCT) have expanded the role of non-invasive imaging of splanchnic arteries. Advancements in both MDCT scanner technology and three-dimensional (3D) imaging software provide a unique opportunity for non-invasive investigation of splanchnic arteries. Although standard axial computed tomography (CT) images allow identification of splanchnic arteries, visualization of small or distal branches is often limited. Similarly, a comprehensive assessment of the complex anatomy of splanchnic arteries is often beyond the reach of axial images. However, the submillimeter collimation that can be achieved with MDCT scanners now allows the acquisition of true isotropic data so that a high spatial resolution is now maintained in any imaging plane and in 3D mode. This ability to visualize the complex network of splanchnic arteries using 3D rendering and multiplanar reconstruction is of major importance for an optimal analysis in many situations. The purpose of this review is to discuss and illustrate the role of 3D MDCT angiography in the detection and assessment of abnormalities of splanchnic arteries as well as the limitations of the different reconstruction techniques.

  22. US and MDCT diagnosis of a rare cause of haematuria in children: Posterior nutcracker syndrome.

    PubMed

    Ozel, A; Tufaner, O; Kaya, E; Maldur, V

    2011-06-01

    Posterior nutcracker syndrome is caused by compression of the left renal vein between the abdominal aorta and the vertebral column. We present the case of a 14-year-old girl with vague left loin pain, mild haematuria and proteinuria. Diagnosis of this rare syndrome was achieved using color Doppler US and multidetector computed tomography (MDCT) angiography.

  23. Venous thromboembolism in cancer patients: an underestimated major health problem.

    PubMed

    Khalil, Jihane; Bensaid, Badr; Elkacemi, Hanan; Afif, Mohamed; Bensaid, Younes; Kebdani, Tayeb; Benjaafar, Noureddine

    2015-06-20

    Venous thromboembolism (VTE) is a major health problem among patients with cancer, and its incidence in this population is increasing rapidly. Although VTE is associated with high rates of mortality and morbidity in cancer patients, its severity is still underestimated by many oncologists. Thromboprophylaxis for VTE, now considered a standard of care, is still not prescribed in many institutions; the appropriate treatment of an established VTE is not yet well known by many physicians and nurses in the cancer field. Patients are also not well informed about VTE and its consequences. Many studies and meta-analyses have addressed this question, as have many guidelines that dedicate a whole chapter to clarifying the different treatment strategies adapted to this particular population. There is a general belief that the prevention and treatment of VTE cannot be optimized without complete awareness on the part of oncologists and patients. The aim of this article is to make VTE a clearer and better understood subject.

  4. Has Northern Hemisphere Heat Flow Been Underestimated?

    NASA Astrophysics Data System (ADS)

    Gosnold, W. D.; Majorowicz, J.; Safanda, J.; Szewczyk, J.

    2005-05-01

    We present three lines of evidence to suggest the hypothesis that heat flow in the northern hemisphere may have been underestimated by 15 to 60 percent in shallow wells due to a large post-glacial warming signal. First, temperature vs. depth (T-z) measurements in parts of Europe and North America show a systematic increase in heat flow with depth. This phenomenon is best recognized in analyses of deep (greater than 2 km) boreholes in non-tectonic regions with normal to low background heat flow. In Europe, the increase in heat flow with depth has been observed by analysis of more than 1500 deep boreholes located throughout the Fennoscandian Shield, East European Platform, Danish Basin, Germany, Czech Republic, and Poland. There are significantly fewer deep boreholes in North America, but the increase in heat flow with depth appears in a suite of 759 sites in the IHFC Global Heat Flow Database for the region east of the Rocky Mountains and north of latitude 40°N. Second, surface heat flow values in southern hemisphere shields average approximately 50 mW/m², but surface heat flow values in northern hemisphere shields average 33 mW/m². Unless crustal radioactivity or mantle heat flow or both are greater in southern hemisphere continents, there is no reason for northern and southern shield areas of similar age to have different heat flow values. Third, two recently published surface heat flow maps show anomalously low heat flow in the Canadian Shield in a pattern coincident with the Wisconsinan ice sheet. The coincidence of low heat flow and ice accumulation has no geophysical basis, so it may indicate a transient signal caused by a warming event. Recent studies of heat flow in North America indicate that at several sites the ice base temperature was close to the pressure melting point. We hypothesize that there may have been cold ice-free periods during the Pleistocene that would account for the apparent colder
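
    The argument above rests on Fourier's law, q = k dT/dz: a warming signal that decays with depth flattens the shallow part of the T-z profile and lowers the apparent near-surface heat flow. The sketch below, with an invented conductivity and perturbation chosen only to mimic that pattern, reproduces the "heat flow increases with depth" signature:

```python
import numpy as np

# Hedged sketch: heat flow is conductivity times the geothermal gradient,
# q = k * dT/dz. All numbers here are synthetic.
k = 3.0                              # W/(m*K), typical crystalline rock
z = np.linspace(200.0, 2000.0, 10)   # depth, m
T = 5.0 + 0.015 * z                  # undisturbed profile: 15 K/km

# Hypothetical post-glacial warming perturbation decaying with depth,
# which flattens the shallow part of the T-z curve.
T_disturbed = T + 2.0 * np.exp(-z / 500.0)

def heat_flow(depths, temps, cond):
    """Estimate heat flow (mW/m^2) from a least-squares T-z gradient."""
    gradient = np.polyfit(depths, temps, 1)[0]   # K/m
    return cond * gradient * 1000.0              # -> mW/m^2

q_shallow = heat_flow(z[:4], T_disturbed[:4], k)    # upper ~800 m
q_deep    = heat_flow(z[-4:], T_disturbed[-4:], k)  # below ~1.4 km
# q_shallow comes out lower than q_deep: a shallow well alone would
# underestimate the background heat flow.
```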

  5. Automated diagnosis of interstitial lung diseases and emphysema in MDCT imaging

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Chang Chien, Kuang-Che; Brillet, Pierre-Yves; Prêteux, Françoise

    2007-09-01

    Diffuse lung diseases (DLD) comprise a heterogeneous group of non-neoplastic diseases resulting from damage to the lung parenchyma by varying patterns of inflammation. Characterization and quantification of DLD severity using MDCT, mainly in interstitial lung diseases and emphysema, is an important issue in clinical research for the evaluation of new therapies. This paper develops a 3D automated approach for the detection and diagnosis of diffuse lung diseases such as fibrosis/honeycombing, ground glass, and emphysema. The proposed methodology combines multi-resolution 3D morphological filtering (exploiting the sup-constrained connection cost operator) with graph-based classification for a full characterization of the parenchymal tissue. The morphological filtering performs a multi-level segmentation of the low- and medium-attenuated lung regions as well as their classification with respect to a granularity criterion (multi-resolution analysis). The original intensity range of the CT data volume is thus reduced in the segmented data to a number of levels equal to the resolution depth used (generally ten levels). The specificity of such morphological filtering is to extract tissue patterns that locally contrast with their neighborhood and are smaller than the resolution depth, while preserving their original shape. A multi-valued hierarchical graph describing the segmentation result is built up according to the resolution level and the adjacency of the different segmented components. The graph nodes are then enriched with the textural information carried by their associated components. A graph analysis and reorganization based on the node attributes delivers the final classification of the lung parenchyma into normal and ILD/emphysematous regions. It also makes it possible to discriminate between different types, or development stages, within the same class of diseases.
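
    As a rough illustration of the pipeline's structure (not the authors' method: the sup-constrained connection-cost operator is replaced here by plain intensity quantization, and all names are ours), the level-wise labelling and hierarchical adjacency graph might look like:

```python
import numpy as np
from scipy import ndimage

def build_level_graph(volume, n_levels=10):
    """Quantize a volume into n_levels, label connected components per
    level, and link components of consecutive levels that touch."""
    lo, hi = float(volume.min()), float(volume.max())
    levels = np.clip(((volume - lo) / (hi - lo + 1e-9) * n_levels).astype(int),
                     0, n_levels - 1)
    nodes, edges = [], set()
    prev_labels = None
    for lv in range(n_levels):
        labels, n = ndimage.label(levels == lv)
        for comp in range(1, n + 1):
            nodes.append((lv, comp))
            if prev_labels is not None:
                # a one-voxel dilation finds components of the previous
                # level that are spatially adjacent to this component
                grown = ndimage.binary_dilation(labels == comp)
                for pc in np.unique(prev_labels[grown]):
                    if pc:
                        edges.add(((lv - 1, int(pc)), (lv, comp)))
        prev_labels = labels
    return nodes, edges

# Toy volume: a lower-attenuation blob nested inside denser "parenchyma".
vol = np.zeros((8, 8, 8))
vol[2:6, 2:6, 2:6] = 1.0
vol[3:5, 3:5, 3:5] = 2.0
nodes, edges = build_level_graph(vol, n_levels=3)
```

    On this toy volume the graph has three nodes (background, shell, core) chained by two adjacency edges; the real method then attaches texture attributes to each node before classification.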

  6. Spectrum of Abdominal Aortic Disease in a Tertiary Health Care Setup: MDCT Based Observational Study

    PubMed Central

    Kumar, DG Santosh; Gadabanahalli, Karthik; Kalyanpur, Arjun

    2016-01-01

    Introduction Abdominal aortic disease is an important cause of clinical disability that requires early detection by imaging methods for prompt and effective management. Understanding regional disease pattern and prevalence has a bearing on healthcare management and resource planning. A non-invasive, conclusive imaging strategy plays an important role in the detection of disease. Multi-Detector Computed Tomography (MDCT), with its technological developments, provides an affordable, accurate and comprehensive imaging solution. Aim To evaluate the regional demography of the abdominal aortic disease spectrum detected using MDCT imaging data in a tertiary hospital. Materials and Methods A descriptive study was conducted based on MDCT imaging data of patients who were investigated with a clinical diagnosis of abdominal aortic disease from March 2008-2010, over a period of 24 months. Patients underwent contrast-enhanced MDCT examination. Morphological diagnosis of aortic disease was based on changes in relative aortic caliber, luminal irregularity, presence of wall calcification, dissection or thrombus, and evidence of major branch occlusion. Patients were categorized into four groups based on imaging findings. MDCT information and associated clinical parameters were examined and correlated with patient management. Descriptive statistics, namely mean, standard deviation and frequency of disease, were evaluated. Results A total of 90 out of 210 patients (43%) were detected with abdominal aortic abnormality as defined by the imaging criteria. Group I, comprising patients with atherosclerosis, including those with complications, constituted 65.5% of the patients. Group II represented patients with aneurysms (45.5%). Group III, consisting of 32.2% of the patients, contained those with dissections. The rest of the patients, including patients with aorto-arteritis, were classified as group IV. Eight patients with aneurysm and one patient with aorto-arteritis were

  7. Semi-automatic central-chest lymph-node definition from 3D MDCT images

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Higgins, William E.

    2010-03-01

    Central-chest lymph nodes play a vital role in lung-cancer staging. The three-dimensional (3D) definition of lymph nodes from multidetector computed-tomography (MDCT) images, however, remains an open problem. This is because of the limitations in the MDCT imaging of soft-tissue structures and the complicated phenomena that influence the appearance of a lymph node in an MDCT image. In the past, we have made significant efforts toward developing (1) live-wire-based segmentation methods for defining 2D and 3D chest structures and (2) a computer-based system for automatic definition and interactive visualization of the Mountain central-chest lymph-node stations. Based on these works, we propose new single-click and single-section live-wire methods for segmenting central-chest lymph nodes. The single-click live wire only requires the user to select an object pixel on one 2D MDCT section and is designed for typical lymph nodes. The single-section live wire requires the user to process one selected 2D section using standard 2D live wire, but it is more robust. We applied these methods to the segmentation of 20 lymph nodes from two human MDCT chest scans (10 per scan) drawn from our ground-truth database. The single-click live wire segmented 75% of the selected nodes successfully and reproducibly, while the success rate for the single-section live wire was 85%. We are able to segment the remaining nodes using our previously derived (but more interaction-intensive) 2D live-wire method incorporated in our lymph-node analysis system. Both proposed methods are reliable and applicable to a wide range of pulmonary lymph nodes.
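
    Live wire is conventionally built on a minimum-cost path search over an edge-cost image; the sketch below shows that core with Dijkstra's algorithm on a toy 2D grid. This is our reduction of the standard formulation, not the authors' implementation:

```python
import heapq
import numpy as np

def live_wire(cost, start, end):
    """Minimum-cost path on a 2D cost grid (4-connected Dijkstra)."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = cost[start]
    heap = [(cost[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == end:
            break
        if d > dist[r, c]:
            continue  # stale queue entry
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], end
    while node != start:   # walk predecessors back to the start
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

# A cheap "edge" corridor along row 1: the wire should snap to it.
cost = np.full((4, 4), 10.0)
cost[1, :] = 1.0
path = live_wire(cost, (1, 0), (1, 3))
```

    In the paper's single-click variant only one pixel is user-selected; how the boundary is then closed automatically is beyond this sketch.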

  8. Spectrum of MDCT Findings in Bowel Obstruction in a Tertiary Care Rural Hospital in Northern India

    PubMed Central

    Gupta, Ranjana; Mittal, Amit; Gupta, Sharad; Mittal, Kapish; Taneja, Arpit

    2016-01-01

    Introduction Multidetector Computed Tomography (MDCT) provides clinically and surgically important information in bowel obstruction. It can depict the severity, level and cause of obstruction. Aim To depict the spectrum of MDCT findings in cases of small and large bowel obstruction. Materials and Methods Contrast-enhanced MDCT examinations of 50 patients who had both clinical and MDCT evidence of bowel obstruction, and in whom surgical/clinical follow-up for final diagnosis was available, were retrospectively included in the study. CT was performed in all patients with an Ingenuity CT scanner (128-slice MDCT, Philips Medical Systems). The axial sections were reconstructed in coronal and sagittal planes to determine the site and cause of bowel obstruction. Results There were 34 male and 16 female patients in this study, with a mean age of 28.4 years. The level of obstruction was in the small bowel in 39 patients (76.67%) and the large bowel in 11 patients (23.33%). Adhesive bands were the cause of Small Bowel Obstruction (SBO) in 17 patients (43.5% of SBO patients). The most common CT signs in adhesive band SBO were the beak sign (seen in 70.6% of patients) and the fat notch sign (52.9% of patients). Five cases of SBO were secondary to benign stricture. Matted adhesions were the cause of obstruction in 3 patients. All these patients showed a transition zone in the pelvis with a positive small bowel faeces sign. Two patients with SBO due to adhesive band had evidence of closed loop obstruction, with gangrenous gut found at surgery. Large Bowel Obstruction (LBO) was seen in 11 patients. The most common cause of LBO was primary colonic malignancy, accounting for 7 patients (63.6%). In one patient, the cause was direct invasion of the hepatic flexure by carcinoma of the gall bladder. Other causes of LBO were pelvic adhesions, faecal impaction and ischaemic stricture. Conclusion SBO is more common than LBO, with adhesive bands being the most common cause of SBO. MDCT is very useful for depicting the site and cause of bowel obstruction.

  9. Ectopia cordis with tetralogy of Fallot in an infant with pentalogy of Cantrell: high-pitch MDCT exam.

    PubMed

    Santiago-Herrera, Rogerio; Ramirez-Carmona, Rocio; Criales-Vera, Sergio; Calderon-Colmenero, Juan; Kimura-Hayama, Eric

    2011-07-01

    We report the MDCT findings of a 17-month-old girl with Cantrell's pentalogy, a rare congenital disease characterized by several defects in the ventral thoracoabdominal wall, including ectopia cordis, and, in this patient, associated with tetralogy of Fallot. This case provides an example of the utility of wide volume coverage and a high-pitch MDCT scan in the evaluation of complex cardiovascular anatomy in infants with congenital heart disease, without the need for an ECG-gated acquisition.

  10. Congenital thoracic vascular anomalies: evaluation with state-of-the-art MR imaging and MDCT.

    PubMed

    Hellinger, Jeffrey C; Daubert, Melissa; Lee, Edward Y; Epelman, Monica

    2011-09-01

    Congenital thoracic vascular anomalies include embryologic developmental disorders of the thoracic aorta, aortic arch branch arteries, pulmonary arteries, thoracic systemic veins, and pulmonary veins. Diagnostic evaluation of these anomalies in pediatric patients has evolved with innovations in diagnostic imaging technology. State-of-the-art magnetic resonance (MR) imaging, MR angiography, multidetector-row computed tomographic (MDCT) angiography, and advanced postprocessing visualization techniques offer accurate and reliable high-resolution two-dimensional and three-dimensional noninvasive anatomic displays for interpretation and clinical management of congenital thoracic vascular anomalies. This article reviews vascular MR imaging, MR angiography, MDCT angiography, and advanced visualization techniques and applications for the assessment of congenital thoracic vascular anomalies, emphasizing clinical embryology and the characteristic imaging findings.

  11. MDCT Imaging Findings of Liver Cirrhosis: Spectrum of Hepatic and Extrahepatic Abdominal Complications

    PubMed Central

    Sangster, Guillermo P.; Previgliano, Carlos H.; Nader, Mathieu; Chwoschtschinsky, Elisa; Heldmann, Maureen G.

    2013-01-01

    Hepatic cirrhosis is the clinical and pathologic result of multifactorial chronic liver injury. It is well known that cirrhosis is the origin of multiple extrahepatic abdominal complications and a markedly increased risk of hepatocellular carcinoma (HCC). This tumor is the sixth most common malignancy worldwide and the third most common cause of cancer-related death. With the rising incidence of HCC worldwide, awareness of the evolution of cirrhotic nodules into malignancy is critical for early detection and treatment. Adequate imaging protocol selection with dynamic multiphase Multidetector Computed Tomography (MDCT) and reformatted images is crucial to differentiate and categorize hepatic nodular dysplasia. Knowledge of the typical and less common extrahepatic abdominal manifestations is essential for accurately assessing patients with known or suspected hepatic disease. The objective of this paper is to illustrate the imaging spectrum of intra- and extrahepatic abdominal manifestations of hepatic cirrhosis seen on MDCT. PMID:23986608

  12. Robust method for extracting the pulmonary vascular trees from 3D MDCT images

    NASA Astrophysics Data System (ADS)

    Taeprasartsit, Pinyo; Higgins, William E.

    2011-03-01

    Segmentation of pulmonary blood vessels from three-dimensional (3D) multi-detector CT (MDCT) images is important for pulmonary applications. This work presents a method for extracting the vascular trees of the pulmonary arteries and veins, applicable to both contrast-enhanced and unenhanced 3D MDCT image data. The method finds 2D elliptical cross-sections and evaluates the agreement of these cross-sections across consecutive slices to identify likely vessel cross-sections. It next employs morphological multiscale analysis to separate vessels from adjoining airway walls. The method then tracks the centers of the likely cross-sections to connect them to the pulmonary vessels in the mediastinum, forming connected vascular trees spanning both lungs. A ground-truth study indicates that the method was able to detect on the order of 98% of the vessel branches having diameter >= 3.0 mm. The extracted vascular trees can be utilized for the guidance of safe bronchoscopic biopsy.
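
    The slice-to-slice agreement step can be imagined as linking candidate cross-section centres across consecutive slices. The greedy sketch below (our simplification, with invented data and tolerance, not the authors' algorithm) keeps chains that persist over several slices and leaves isolated detections behind:

```python
import math

def link_cross_sections(per_slice_centres, tol=2.0):
    """Greedily chain nearby (x, y) centres across consecutive slices."""
    tracks = [[c] for c in per_slice_centres[0]]
    for centres in per_slice_centres[1:]:
        unused = list(centres)
        for track in tracks:
            tx, ty = track[-1]
            best = min(unused, default=None,
                       key=lambda p: math.hypot(p[0] - tx, p[1] - ty))
            if best and math.hypot(best[0] - tx, best[1] - ty) <= tol:
                track.append(best)
                unused.remove(best)
        tracks.extend([c] for c in unused)  # new vessels entering the stack
    return tracks

# Two vessels drifting slowly across slices, plus one spurious detection:
slices = [[(10.0, 10.0), (30.0, 12.0)],
          [(10.5, 10.2), (30.4, 12.1)],
          [(11.0, 10.5), (30.8, 12.3), (50.0, 50.0)]]
tracks = link_cross_sections(slices)
long_tracks = [t for t in tracks if len(t) >= 3]  # likely true vessels
```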

  13. Academic self-concept, learning motivation, and test anxiety of the underestimated student.

    PubMed

    Urhahne, Detlef; Chao, Sheng-Han; Florineth, Maria Luise; Luttenberger, Silke; Paechter, Manuela

    2011-03-01

    BACKGROUND. Teachers' judgments of student performance on a standardized achievement test often result in an overestimation of students' abilities. In the majority of cases, a larger group of overestimated students and a smaller group of underestimated students are formed by these judgments. AIMS. In this research study, the consequences of the underestimation of students' mathematical performance potential were examined. SAMPLE. Two hundred and thirty-five fourth grade students and their fourteen mathematics teachers took part in the investigation. METHOD. Students worked on a standardized mathematics achievement test and completed a self-description questionnaire about motivation and affect. Teachers estimated each individual student's potential with regard to mathematics test performance as well as students' expectancy for success, level of aspiration, academic self-concept, learning motivation, and test anxiety. The differences between teachers' judgments of students' test performance and students' actual performance were used to build groups of underestimated and overestimated students. RESULTS. Underestimated students displayed equal levels of test performance, learning motivation, and level of aspiration in comparison with overestimated students, but had lower expectancy for success, lower academic self-concept, and experienced more test anxiety. Teachers expected that underestimated students would receive lower grades on the next mathematics test, believed that these students were satisfied with lower grades, and assumed that they had weaker learning motivation than their overestimated classmates. CONCLUSION. Teachers' judgment error was not confined to test performance but generalized to motivational and affective traits of the students.
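
    The grouping step described under METHOD reduces to the sign of the judgment-minus-performance difference; a minimal sketch with invented scores:

```python
def classify(judged, actual):
    """Assign a student to a group by the sign of judged - actual score."""
    diff = judged - actual
    if diff > 0:
        return "overestimated"
    if diff < 0:
        return "underestimated"
    return "accurate"

# (name, teacher-judged score, actual test score) -- invented data
students = [("A", 28, 24), ("B", 18, 25), ("C", 22, 22)]
groups = {name: classify(judged, actual) for name, judged, actual in students}
```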

  14. Spontaneous Renal Artery Dissection as a Cause of Acute Renal Infarction: Clinical and MDCT Findings.

    PubMed

    Yoon, Kibo; Song, Soon Young; Lee, Chang Hwa; Ko, Byung Hee; Lee, Seunghun; Kang, Bo Kyeong; Kim, Mi Mi

    2017-04-01

    The purpose of this study was to assess the incidence of spontaneous renal artery dissection (SRAD) as a cause of acute renal infarction, and to evaluate the clinical and multidetector computed tomography (MDCT) findings of SRAD. From November 2011 to January 2014, 35 patients who were diagnosed with acute renal infarction by MDCT were included. We analyzed the 35 MDCT data sets and medical records retrospectively, and compared clinical and imaging features of SRAD with an embolism, using Fisher's exact test and the Mann-Whitney test. The most common cause of acute renal infarction was an embolism, and SRAD was the second most common cause. SRAD patients had new-onset hypertension more frequently than embolic patients. Embolic patients were found to have increased C-reactive protein (CRP) more often than SRAD patients. Laboratory results, including tests for lactate dehydrogenase (LDH) and blood urea nitrogen (BUN), and the BUN/creatinine ratio (BCR) were significantly higher in embolic patients than SRAD patients. Bilateral renal involvement was detected in embolic patients more often than in SRAD patients. MDCT images of SRAD patients showed stenosis of the true lumen due to compression by a thrombosed false lumen. None of the SRAD patients progressed to an estimated glomerular filtration rate < 60 mL/min/1.73 m² or to end-stage renal disease during the follow-up period. SRAD is not a rare cause of acute renal infarction, and it has a benign clinical course. It should be considered in a differential diagnosis of acute renal infarction, particularly in patients with new-onset hypertension, unilateral renal involvement, and normal ranges of CRP, LDH, BUN, and BCR.
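
    Both tests named in the abstract are standard in SciPy. The sketch below applies them to invented data shaped like the comparison described (categorical: new-onset hypertension by cause; continuous: LDH), not the study's actual numbers:

```python
import numpy as np
from scipy import stats

# Fisher's exact test on a 2x2 table: new-onset hypertension by cause.
# Counts are invented for illustration.
#                 hypertension  no hypertension
table = np.array([[6, 2],       # SRAD
                  [3, 24]])     # embolism
odds_ratio, p_fisher = stats.fisher_exact(table)

# Mann-Whitney U test on a continuous marker (e.g. LDH, U/L) per group.
ldh_srad = [210, 230, 250, 240, 220, 235, 215, 245]          # invented
ldh_embolic = [480, 520, 450, 610, 390, 505, 470, 560, 430]  # invented
u_stat, p_mw = stats.mannwhitneyu(ldh_srad, ldh_embolic,
                                  alternative="two-sided")
```

    Fisher's exact test suits the small group sizes involved, and Mann-Whitney makes no normality assumption about the lab values.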

  15. Evaluating the effect of two different anesthetic protocols on 64-MDCT coronary angiography in dogs

    PubMed Central

    Drees, Randi; Johnson, Rebecca A; Pinkerton, Marie; Del Rio, Alejandro Munoz; Saunders, Jimmy H; François, Christopher J

    2014-01-01

    Heart rate is a major factor influencing diagnostic image quality in computed tomographic coronary artery angiography (MDCT-CA), with an ideal heart rate of 60–65 beats/minute in humans. Using a standardized contrast bolus volume, two clinically applicable anesthetic protocols were compared for their effect on cardiovascular parameters and 64-MDCT-CA image quality in ten healthy dogs. The midazolam/fentanyl protocol (A) was hypothesized to reduce heart rate sufficiently for adequate image quality while having little impact on blood pressure, whereas the dexmedetomidine protocol (B) was expected to reduce heart rate to the target range, yielding excellent image quality while possibly having undesirable effects on the measured blood pressure values. Heart rate was 80.6 ± 7.5 bpm with protocol A and 79.2 ± 14.2 bpm with protocol B during image acquisition (P=1). The R-R intervals allowing the best depiction of the individual coronary artery segments were found in the end-diastolic period and varied between the 70–95% intervals. Diagnostic quality was rated excellent, good, or moderate in the majority of the segments evaluated, with higher scores for more proximal segments and lower scores for more distal segments. Blur was the most commonly observed artifact and most affected the distal segments. There was no significant difference between the two protocols in optimal reconstruction interval, diagnostic quality, measured length of individual segments, or proximal diameter of the coronary arteries (P=1). Both anesthetic protocols, with the standardized bolus volume, allow for diagnostic-quality coronary 64-MDCT-CA exams. PMID:25065815
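
    The reported phase windows translate into absolute delays via simple arithmetic: the R-R interval in milliseconds is 60000/bpm, and a phase percentage is that fraction of the interval after the R peak. For example, at protocol A's mean heart rate:

```python
def rr_interval_ms(bpm):
    """R-R interval in milliseconds at a given heart rate."""
    return 60000.0 / bpm

def phase_delay_ms(bpm, percent):
    """Absolute delay after the R peak for a given phase percentage."""
    return rr_interval_ms(bpm) * percent / 100.0

rr = rr_interval_ms(80.6)  # protocol A mean heart rate -> ~744 ms
window = (phase_delay_ms(80.6, 70), phase_delay_ms(80.6, 95))
```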

  17. Managing patient dose in multi-detector computed tomography (MDCT). ICRP Publication 102.

    PubMed

    Valentin, J

    2007-01-01

    Computed tomography (CT) technology has changed considerably in recent years with the introduction of increasing numbers of multiple detector arrays. There are several parameters specific to multi-detector computed tomography (MDCT) scanners that increase or decrease patient dose systematically compared to older single detector computed tomography (SDCT) scanners. This document briefly reviews MDCT technology and radiation dose in MDCT, including differences from SDCT, factors that affect dose, radiation risks, and the responsibilities for patient dose management. The document recommends that users understand the relationship between patient dose and image quality and be aware that image quality in CT is often higher than that necessary for diagnostic confidence. Automatic exposure control (AEC) does not totally free the operator from selection of scan parameters, and awareness of individual systems is important. Scanning protocols cannot simply be transferred between scanners from different manufacturers and should be determined for each MDCT. If the image quality is appropriately specified by the user, and suited to the clinical task, there will be a reduction in patient dose for most patients. Understanding of some parameters is not intuitive, and the selection of image quality parameter values in AEC systems is not straightforward. Examples of some clinical situations have been included to demonstrate dose management, e.g. CT examinations of the chest, the heart for coronary calcium quantification and non-invasive coronary angiography, colonography, the urinary tract, children, pregnant patients, trauma cases, and CT-guided interventions. CT is increasingly being used to replace conventional x-ray studies, and it is important that patient dose is given careful consideration, particularly with repeated or multiple examinations.

  18. Spectrum of imaging findings on MDCT enterography in patients with small bowel tuberculosis.

    PubMed

    Kalra, N; Agrawal, P; Mittal, V; Kochhar, R; Gupta, V; Nada, R; Singh, R; Khandelwal, N

    2014-03-01

    The abdomen is the sixth most common extrapulmonary site of tuberculosis (TB) involvement. The sites of involvement in abdominal tuberculosis, in descending order of frequency, are lymph nodes, genitourinary tract, peritoneal cavity, and gastrointestinal tract. The radiological armamentarium for evaluating tuberculosis of the small bowel (SBTB) includes barium studies (small bowel follow-through, SBFT), CT (multidetector CT, CT enterography, and CT enteroclysis), ultrasound (sonoenteroclysis), and magnetic resonance imaging (MRI; enterography and enteroclysis). In this review, we illustrate the abnormalities seen at MDCT enterography in 20 consecutive patients with SBTB and also describe the extraluminal findings in these patients. MDCT enterography allows non-invasive, good-quality assessment of well-distended bowel loops and the adjacent soft tissues. It displays the thickness and enhancement of the entire bowel wall in all three planes and allows examination of all bowel loops, especially the ileal loops, which are otherwise mostly superimposed. The terminal ileum and ileocaecal junction are the most common sites of small bowel involvement in intestinal TB. The most common abnormality is short-segment strictures with symmetrical concentric mural thickening and homogeneous mural enhancement. Other findings include lymphadenopathy, ascites, enteroliths, and peritoneal thickening and enhancement. In conclusion, MDCT enterography is a comprehensive technique for the evaluation of SBTB.

  19. [Food-drug interactions: an underestimated risk].

    PubMed

    Sönnichsen, A C; Donner-Banzhoff, N; Baum, E

    2005-11-03

    With only a few exceptions, medications should, in principle, be taken independently of food intake (at least half an hour before or two hours after eating). This ensures uniform and predictable bioavailability. However, it also entails the risk that the patient is more likely to forget a dose postponed to two hours after a meal than one coupled directly to a meal. Certain foodstuffs or food constituents, such as grapefruit, Seville orange juice, red wine, alcoholic drinks in general, or large quantities of caffeine and garlic, should be avoided during drug treatment. In addition, specific interactions with certain drugs must also be taken into account (e.g. MAO inhibitors and tyramine, coumarins and vitamin K).

  20. Coronary fly-through or virtual angioscopy using dual-source MDCT data.

    PubMed

    van Ooijen, Peter M A; de Jonge, Gonda; Oudkerk, Matthijs

    2007-11-01

    Coronary fly-through or virtual angioscopy (VA) has been studied ever since its invention in 2000. However, application was limited because it requires an optimal computed tomography (CT) scan and time-consuming post-processing. Recent advances in post-processing software facilitate easy construction of VA, but until now image quality was insufficient in most patients. The introduction of dual-source multidetector CT (MDCT) could enable VA in all patients. Twenty patients were scanned using a dual-source MDCT (Definition, Siemens, Forchheim, Germany) using a standard coronary artery protocol. Post-processing was performed on an Aquarius Workstation (TeraRecon, San Mateo, Calif.). Length travelled per major branch was recorded in millimetres, together with the time required in minutes. VA could be performed in every patient for each of the major coronary arteries. The mean (range) length of the automated fly-through was 80 (32-107) mm for the left anterior descending (LAD), 75 (21-116) mm for the left circumflex artery (LCx), and 109 (21-190) mm for the right coronary artery (RCA). Calcifications and stenoses were visualised, as well as most side branches. The mean time required was 3 min for LAD, 2.5 min for LCx, and 2 min for the RCA. Dual-source MDCT allows for high quality visualisation of the coronary arteries in every patient because scanning with this machine is independent of the heart rate. This is clearly shown by the successful VA in all patients. Potential clinical value of VA should be determined in the near future.

  1. Ileocaecal Intussusception with a Lead Point: Unusual MDCT Findings of Active Crohn's Disease Involving the Appendix

    PubMed Central

    Ozan, Ebru; Atac, Gokce Kaan; Akincioglu, Egemen; Keskin, Mete; Gulpinar, Kamil

    2015-01-01

    Adult intussusception is a rare entity accounting for 1% of all bowel obstructions. Unlike intussusceptions in children, which are idiopathic in 90% of cases, adult intussusceptions have an identifiable cause (lead point) in the majority of cases. Crohn's disease (CD) may affect any part of the gastrointestinal tract, including the appendix. It was shown to be a predisposing factor for intussusception. Here, we report a rare case of adult intussusception with a lead point, emphasizing diagnostic input of multidetector computed tomography (MDCT) in a patient with active CD that involves the appendix. PMID:26558130

  2. Esophagobronchial fistulae: Diagnosis by MDCT with oral contrast swallow examination of a benign and a malignant cause

    PubMed Central

    Hegde, Rahul G; Kalekar, Tushar M; Gajbhiye, Meenakshi I; Bandgar, Amol S; Pawar, Shephali S; Khadse, Gopal J

    2013-01-01

    We report two cases of esophagobronchial fistulae diagnosed by Multi-detector computed tomography (MDCT) oral contrast swallow examination. It is helpful to supplement the CT study with an oral contrast swallow as it aids in confirmation of a suspected fistula and also demonstrates the fistula tract better. We present the clinical details and the imaging findings on MDCT of two cases of esophagobronchial fistulae – one secondary to chronic chest tuberculosis and the other secondary to a squamous cell carcinoma of the upper esophagus – followed by discussion of the etiology, pathogenesis, and imaging of these fistulae. PMID:24082484

  3. Accuracy of Monte Carlo simulations compared to in-vivo MDCT dosimetry

    SciTech Connect

    Bostani, Maryam; McMillan, Kyle; Cagnon, Chris H.; McNitt-Gray, Michael F.; Mueller, Jonathon W.; Cody, Dianna D.; DeMarco, John J.

    2015-02-15

    Purpose: The purpose of this study was to assess the accuracy of a Monte Carlo simulation-based method for estimating radiation dose from multidetector computed tomography (MDCT) by comparing simulated doses in ten patients to in-vivo dose measurements. Methods: MD Anderson Cancer Center Institutional Review Board approved the acquisition of in-vivo rectal dose measurements in a pilot study of ten patients undergoing virtual colonoscopy. The dose measurements were obtained by affixing TLD capsules to the inner lumen of rectal catheters. Voxelized patient models were generated from the MDCT images of the ten patients, and the dose to the TLD for all exposures was estimated using Monte Carlo based simulations. The Monte Carlo simulation results were compared to the in-vivo dose measurements to determine accuracy. Results: The calculated mean percent difference between TLD measurements and Monte Carlo simulations was −4.9% with standard deviation of 8.7% and a range of −22.7% to 5.7%. Conclusions: The results of this study demonstrate very good agreement between simulated and measured doses in-vivo. Taken together with previous validation efforts, this work demonstrates that the Monte Carlo simulation methods can provide accurate estimates of radiation dose in patients undergoing CT examinations.
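
    The summary statistics reported above are straightforward to reproduce; the snippet below uses invented dose values purely to show the computation of per-patient percent differences and their mean, standard deviation, and range:

```python
import numpy as np

# Invented TLD measurements and Monte Carlo estimates (mGy), for illustration.
tld = np.array([12.0, 10.5, 11.2, 9.8, 13.1])
mc  = np.array([11.4, 10.9, 10.2, 9.9, 12.5])

# Percent difference of each simulation relative to its measurement.
pct_diff = (mc - tld) / tld * 100.0
summary = (pct_diff.mean(),           # mean percent difference
           pct_diff.std(ddof=1),      # sample standard deviation
           pct_diff.min(),            # range, lower end
           pct_diff.max())            # range, upper end
```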

  4. Effect of Low-Dose MDCT and Iterative Reconstruction on Trabecular Bone Microstructure Assessment

    PubMed Central

    Baum, Thomas; Nasirudin, Radin A.; Mei, Kai; Garcia, Eduardo G.; Burgkart, Rainer; Rummeny, Ernst J.; Kirschke, Jan S.; Noël, Peter B.

    2016-01-01

    We investigated the effects of low-dose multidetector computed tomography (MDCT) in combination with statistical iterative reconstruction algorithms on trabecular bone microstructure parameters. Twelve donated vertebrae were scanned with the routine radiation exposure used in our department (standard dose) and with a low-dose protocol. Reconstructions were performed with filtered backprojection (FBP) and maximum-likelihood-based statistical iterative reconstruction (SIR). Trabecular bone microstructure parameters were assessed and statistically compared for each reconstruction. Moreover, fracture loads of the vertebrae were determined biomechanically and correlated with the assessed microstructure parameters. Trabecular bone microstructure parameters based on low-dose MDCT and SIR correlated significantly with vertebral bone strength. There was no significant difference between microstructure parameters calculated on low-dose SIR and standard-dose FBP images. However, the results revealed a strong dependency on the regularization strength applied during SIR: stronger regularization can corrupt the microstructure analysis, because the trabecular structure is a very fine detail that may be lost during the regularization process. As a consequence, introducing SIR for trabecular bone microstructure analysis requires specific optimization of the regularization parameters. Moreover, in comparison to other approaches, superior noise-resolution trade-offs can be achieved with the proposed methods. PMID:27447827

  5. Comparison between a new reconstruction algorithm (OPED) and filtered backprojection (FBP) for MDCT data

    NASA Astrophysics Data System (ADS)

    Renger, Bernhard; Noël, Peter B.; Tischenko, Oleg; Rummeny, Ernst J.; Hoeschen, Christoph

    2012-03-01

    Previously, the Orthogonal Polynomial Expansion on the Disk (OPED) algorithm was presented, and prototype experiments in combination with the CT D'or geometry demonstrated its feasibility. In this study we implemented OPED on a clinical scanner and evaluated its potential using phantom studies. All studies were acquired on a Siemens Somatom 64 scanner (Erlangen, Germany), and the raw projection data were reconstructed with both conventional FBP and the OPED algorithm. When the CT D'or geometry is used, OPED can use fan-beam geometry directly, without additional procedures such as interpolation or rebinning. In particular, OPED approximates the image function as a sum of Chebyshev polynomials. For performance evaluation, the Catphan 600 phantom was imaged. OPED images were reconstructed using C++ and MATLAB®. We measured uniformity, MTF, and CNR at different dose levels and compared them to standard FBP reconstructions with different filter kernels. The integration and interpretation of the MDCT projection data for the OPED algorithm was accomplished. Reconstruction time is about 6 s on a quad-core 3 GHz Intel Xeon processor. Typical artifacts are reduced when applying OPED, and with OPED the MTF remains constant over the whole FOV. Uniformity and CNR are equal to those of FBP. The advantages of OPED were demonstrated by applying the algorithm to projection data from a clinical MDCT scanner. In the future, we see OPED applications in low-dose or limited-angle geometries, reducing radiation dose while improving the diagnostic quality of the reconstructed slices.
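    OPED proper reconstructs a 2-D image on the disk directly from fan-beam projections; as a much simpler hedged illustration of the underlying idea, approximating a function by a truncated Chebyshev expansion, here is a 1-D sketch using NumPy (the profile `exp(-x**2)` is an arbitrary stand-in, not data from the study):

```python
import numpy as np
from numpy.polynomial import chebyshev as cheb

x = np.linspace(-1.0, 1.0, 201)
profile = np.exp(-x**2)  # arbitrary smooth "image profile"

coeffs = cheb.chebfit(x, profile, deg=10)  # least-squares Chebyshev fit
approx = cheb.chebval(x, coeffs)           # evaluate the truncated expansion

# For smooth functions the truncation error decays rapidly with degree
max_err = np.max(np.abs(approx - profile))
```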

  6. MDCT evaluation of potential living renal donor, prior to laparoscopic donor nephrectomy: What the transplant surgeon wants to know?

    PubMed

    Ghonge, Nitin P; Gadanayak, Satyabrat; Rajakumari, Vijaya

    2014-10-01

    As laparoscopic donor nephrectomy (LDN) offers several advantages for the donor, such as less post-operative pain, fewer cosmetic concerns, and a faster recovery time, there is a growing global trend towards LDN as compared to open nephrectomy. Comprehensive pre-LDN donor evaluation includes assessment of renal morphology, including the pelvi-calyceal and vascular systems. Apart from donor selection, evaluation of the regional anatomy allows precise surgical planning. Because visualization is limited during laparoscopic renal harvesting, detailed pre-transplant evaluation of the regional anatomy, including the renal venous anatomy, is of utmost importance. MDCT is the modality of choice for pre-LDN evaluation of potential renal donors. Apart from an appropriate scan protocol and post-processing methods, a detailed understanding of the surgical techniques is essential for the radiologist to interpret images accurately during pre-LDN MDCT evaluation of potential renal donors. This review article describes the MDCT evaluation of potential living renal donors prior to LDN, with emphasis on the scan protocol, post-processing methods, and image interpretation. Special emphasis is laid on the surgical perspectives of pre-LDN MDCT evaluation and on the important points transplant surgeons want to know.

  7. Comparison of hepatic MDCT, MRI, and DSA to explant pathology for the detection and treatment planning of hepatocellular carcinoma

    PubMed Central

    Ladd, Lauren M.; Tirkes, Temel; Tann, Mark; Agarwal, David M.; Johnson, Matthew S.; Tahir, Bilal; Sandrasegaran, Kumaresan

    2016-01-01

    Background/Aims The diagnosis and treatment plan for hepatocellular carcinoma (HCC) can be made from radiologic imaging. However, lesion detection may vary depending on the imaging modality. This study aims to evaluate the sensitivities of hepatic multidetector computed tomography (MDCT), magnetic resonance imaging (MRI), and digital subtraction angiography (DSA) in the detection of HCC and the consequent management impact on potential liver transplant patients. Methods One hundred and sixteen HCC lesions were analyzed in 41 patients who received an orthotopic liver transplant (OLT). All of the patients underwent pretransplantation hepatic DSA, MDCT, and/or MRI. The imaging results were independently reviewed retrospectively in a blinded fashion by two interventional and two abdominal radiologists. The liver explant pathology was used as the gold standard for assessing each imaging modality. Results The sensitivity for overall HCC detection was higher for cross-sectional imaging using MRI (51.5%, 95% confidence interval [CI]=36.2-58.4%) and MDCT (49.8%, 95% CI=43.7-55.9%) than for DSA (41.7%, 95% CI=36.2-47.3%) (P=0.05). The difference in false-positive rate was not statistically significant between MRI (22%), MDCT (29%), and DSA (29%) (P=0.67). The sensitivity was significantly higher for detecting right lobe lesions than left lobe lesions for all modalities (MRI: 56.1% vs. 43.1%, MDCT: 55.0% vs. 42.0%, and DSA: 46.9% vs. 33.9%; all P<0.01). The sensitivities of the three imaging modalities were also higher for lesions ≥2 cm vs. <2 cm (MRI: 73.4% vs. 32.7%, MDCT: 66.9% vs. 33.8%, and DSA: 62.2% vs. 24.1%; all P<0.01). The interobserver correlation was rated as very good to excellent. Conclusion The sensitivity for detecting HCC is higher for MRI and MDCT than for DSA, and so cross-sectional imaging modalities should be used to evaluate OLT candidacy. PMID:27987537
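    Each sensitivity above is the fraction of explant-proven lesions detected, reported with a 95% confidence interval. A minimal sketch using the normal-approximation (Wald) interval (the paper does not state which interval method it used, and the counts below are an invented split for illustration):

```python
from math import sqrt

def sensitivity_with_ci(true_pos, false_neg, z=1.96):
    """Sensitivity = TP / (TP + FN), with a Wald 95% CI clipped to [0, 1]."""
    n = true_pos + false_neg
    sens = true_pos / n
    se = sqrt(sens * (1.0 - sens) / n)
    return sens, (max(0.0, sens - z * se), min(1.0, sens + z * se))

# e.g. 58 of 116 lesions detected (hypothetical)
sens, (ci_lo, ci_hi) = sensitivity_with_ci(58, 58)
```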

  8. People underestimate the value of persistence for creative performance.

    PubMed

    Lucas, Brian J; Nordgren, Loran F

    2015-08-01

    Across 7 studies, we investigated the prediction that people underestimate the value of persistence for creative performance. Across a range of creative tasks, people consistently underestimated how productive they would be while persisting (Studies 1-3). Study 3 found that the subjectively experienced difficulty, or disfluency, of creative thought accounted for persistence undervaluation. Alternative explanations based on idea quality (Studies 1-2B) and goal setting (Study 4) were considered and ruled out, and domain knowledge was explored as a boundary condition (Study 5). In Study 6, the disfluency of creative thought reduced people's willingness to invest in an opportunity to persist, resulting in lower financial performance. This research demonstrates that persistence is a critical determinant of creative performance and that people may undervalue and underutilize persistence in everyday creative problem solving.

  9. Oxygen utilization rate (OUR) underestimates ocean respiration: A model study

    NASA Astrophysics Data System (ADS)

    Koeve, W.; Kähler, P.

    2016-08-01

    We use a simple 1-D model representing an isolated density surface in the ocean, together with 3-D global ocean biogeochemical models, to evaluate the concept of computing the subsurface oceanic oxygen utilization rate (OUR) from apparent oxygen utilization (AOU) and water age. The distribution of AOU in the ocean is not only the imprint of respiration in the ocean's interior but is strongly influenced by transport processes and eventual loss at the ocean surface. Since AOU and water age are both subject to advection and diffusive mixing, OUR represents the correct rate of oxygen consumption only when both are affected in the same way. This is the case only when advection prevails or when respiration rates are uniform, so that transport does not change the proportions of AOU and age. In experiments with the 1-D tube model, OUR underestimates respiration when maximum respiration rates occur near the outcrops of isopycnals and overestimates it when the maxima occur far from the outcrops. Given the distribution of respiration in the ocean, i.e., elevated rates near high-latitude outcrops of isopycnals and low rates below the oligotrophic gyres, underestimates are the rule. Integrating these effects globally in three coupled ocean biogeochemical and circulation models, we find that AOU-over-age calculations underestimate true model respiration by a factor of 3. Most of this difference is observed in the upper 1000 m of the ocean, with the discrepancies increasing toward the surface, where OUR underestimates respiration by as much as a factor of 4.
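    The quantity under scrutiny, OUR, is simply apparent oxygen utilization divided by water age. A minimal sketch of that calculation (function names, units, and example values are illustrative, not from the models in the study):

```python
def apparent_oxygen_utilization(o2_saturation, o2_observed):
    """AOU (umol/kg): oxygen deficit relative to saturation."""
    return o2_saturation - o2_observed

def oxygen_utilization_rate(aou, water_age_years):
    """OUR (umol/kg/yr): mean apparent respiration rate over the water age."""
    return aou / water_age_years

# A water parcel 25 years from the surface, 50 umol/kg below saturation:
aou = apparent_oxygen_utilization(300.0, 250.0)  # 50 umol/kg
our = oxygen_utilization_rate(aou, 25.0)         # 2 umol/kg/yr
```

The paper's point is that transport mixes AOU and age differently, so this ratio systematically understates true respiration.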

  10. Underestimation and undertreatment of pain in HIV disease: multicentre study.

    PubMed Central

    Larue, F.; Fontaine, A.; Colleau, S. M.

    1997-01-01

    OBJECTIVE: To measure the prevalence, severity, and impact of pain on quality of life for HIV patients; to identify factors associated with undertreatment of pain. DESIGN: Multicentre cross sectional survey. SETTINGS: 34 HIV treatment facilities, including inpatient hospital wards, day hospitals, and ambulatory care clinics, in 13 cities throughout France. SUBJECTS: 315 HIV patients at different stages of the disease. MAIN OUTCOME MEASURES: Patients: recorded presence and severity of pain and rated quality of life. Doctors: reported disease status, estimate of pain severity, and analgesic treatment ordered. RESULTS: From 30% (17/56) of outpatients to 62% (73/118) of inpatients reported pain due to HIV disease. Pain severity significantly decreased patients' quality of life. Doctors underestimated pain severity in 52% (70/135) of HIV patients reporting pain. Underestimation of pain severity was more likely for patients who reported moderate (odds ratio 24) or severe pain (165) and less likely for patients whose pain source was identified or who were perceived as more depressed. Of the patients reporting moderate or severe pain, 57% (61/107) did not receive any analgesic treatment; only 22% (23/107) received at least weak opioids. Likelihood of analgesic prescription increased when doctors estimated pain to be more severe and regarded patients as sicker. CONCLUSIONS: Pain is a common and debilitating symptom of HIV disease which is gravely underestimated and undertreated. PMID:9001475

  11. Numerosity underestimation with item similarity in dynamic visual display.

    PubMed

    Au, Ricky K C; Watanabe, Katsumi

    2013-01-01

    The estimation of numerosity of a large number of objects in a static visual display is possible even at short durations. Such coarse approximations of numerosity are distinct from subitizing, in which the number of objects can be reported with high precision when a small number of objects are presented simultaneously. The present study examined numerosity estimation of visual objects in dynamic displays and the effect of object similarity on numerosity estimation. In the basic paradigm (Experiment 1), two streams of dots were presented and observers were asked to indicate which of the two streams contained more dots. Streams consisting of dots that were identical in color were judged as containing fewer dots than streams where the dots were different colors. This underestimation effect for identical visual items disappeared when the presentation rate was slower (Experiment 1) or the visual display was static (Experiment 2). In Experiments 3 and 4, in addition to the numerosity judgment task, observers performed an attention-demanding task at fixation. Task difficulty influenced observers' precision in the numerosity judgment task, but the underestimation effect remained evident irrespective of task difficulty. These results suggest that identical or similar visual objects presented in succession might induce substitution among themselves, leading to an illusion that there are few items overall and that exploiting attentional resources does not eliminate the underestimation effect.

  12. The electron donating capacity of biochar is dramatically underestimated

    NASA Astrophysics Data System (ADS)

    Prévoteau, Antonin; Ronsse, Frederik; Cid, Inés; Boeckx, Pascal; Rabaey, Korneel

    2016-09-01

    Biochars have gathered considerable interest for agronomic and engineering applications. In addition to their high sorption ability, biochars have been shown to accept or donate considerable amounts of electrons to/from their environment via abiotic or microbial processes. Here, we measured the electron accepting (EAC) and electron donating (EDC) capacities of wood-based biochars pyrolyzed at three different highest treatment temperatures (HTTs: 400, 500, and 600 °C) via hydrodynamic electrochemical techniques using a rotating disc electrode. EACs and EDCs varied with HTT in accordance with a previous report, with a maximal EAC at 500 °C (0.4 mmol(e−)·gchar−1) and a large decrease of EDC with HTT. However, while we measured EAC values similar to those in the preceding study, we show that EDCs have been underestimated by at least one order of magnitude, up to 7 mmol(e−)·gchar−1 for an HTT of 400 °C. We attribute this underestimation to previously unnoticed slow kinetics of electron transfer from biochars to the dissolved redox mediators used in the monitoring. The EDC of other soil organic constituents such as humic substances may also have been underestimated. These results imply that the redox properties of biochars may have a much bigger impact on soil biogeochemical processes than previously conjectured.
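    An EDC expressed in mmol(e−) per gram of char comes from converting the charge passed in the electrochemical cell via the Faraday constant. A hedged sketch of that unit conversion (the function and the example numbers are illustrative; the paper's rotating-disc protocol is not reproduced here):

```python
FARADAY = 96485.332  # C per mol of electrons

def edc_mmol_per_g(charge_coulombs, char_mass_g):
    """Electron donating capacity from total donated charge."""
    mol_electrons = charge_coulombs / FARADAY
    return 1000.0 * mol_electrons / char_mass_g

# e.g. 6.755 C donated by 10 mg of char -> about 7 mmol(e-)/g
edc = edc_mmol_per_g(6.755, 0.010)
```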

  13. The electron donating capacity of biochar is dramatically underestimated

    PubMed Central

    Prévoteau, Antonin; Ronsse, Frederik; Cid, Inés; Boeckx, Pascal; Rabaey, Korneel

    2016-01-01

    Biochars have gathered considerable interest for agronomic and engineering applications. In addition to their high sorption ability, biochars have been shown to accept or donate considerable amounts of electrons to/from their environment via abiotic or microbial processes. Here, we measured the electron accepting (EAC) and electron donating (EDC) capacities of wood-based biochars pyrolyzed at three different highest treatment temperatures (HTTs: 400, 500, and 600 °C) via hydrodynamic electrochemical techniques using a rotating disc electrode. EACs and EDCs varied with HTT in accordance with a previous report, with a maximal EAC at 500 °C (0.4 mmol(e−)·gchar−1) and a large decrease of EDC with HTT. However, while we measured EAC values similar to those in the preceding study, we show that EDCs have been underestimated by at least one order of magnitude, up to 7 mmol(e−)·gchar−1 for an HTT of 400 °C. We attribute this underestimation to previously unnoticed slow kinetics of electron transfer from biochars to the dissolved redox mediators used in the monitoring. The EDC of other soil organic constituents such as humic substances may also have been underestimated. These results imply that the redox properties of biochars may have a much bigger impact on soil biogeochemical processes than previously conjectured. PMID:27628746

  14. Data compression in wireless sensors network using MDCT and embedded harmonic coding.

    PubMed

    Alsalaet, Jaafar K; Ali, Abduladhem A

    2015-05-01

    One of the major applications of wireless sensor networks (WSNs) is vibration measurement for the purposes of structural health monitoring and machinery fault diagnosis. WSNs have many advantages over wired networks, such as low cost and reduced setup time. However, the useful bandwidth is limited compared to wired networks, resulting in relatively low sampling rates. One solution to this problem is data compression, which, in addition to enhancing the effective sampling rate, saves valuable power in the wireless nodes. In this work, a data compression scheme based on the Modified Discrete Cosine Transform (MDCT) followed by Embedded Harmonic Components Coding (EHCC) is proposed to compress vibration signals. EHCC is applied to exploit the harmonic redundancy present in most vibration signals, resulting in an improved compression ratio. The scheme is made suitable for the tiny hardware of wireless nodes and proves to be fast and effective. The efficiency of the proposed scheme is investigated through several experimental tests.
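    The MDCT at the heart of the scheme is a lapped transform: frames of 2N samples with 50% overlap map to N coefficients, and a window satisfying the Princen-Bradley condition makes overlap-add reconstruction exact. A minimal sketch with a sine window (the EHCC coding stage and the authors' node implementation are not shown; N = 8 and the direct matrix formulation are our choices for clarity, not the paper's):

```python
import numpy as np

N = 8  # half-frame length: frames are 2N samples, hopped by N (50% overlap)

def mdct(frame):
    """Forward MDCT: 2N windowed samples -> N coefficients."""
    n = np.arange(2 * N)
    k = np.arange(N)
    basis = np.cos(np.pi / N * (n[None, :] + 0.5 + N / 2) * (k[:, None] + 0.5))
    return basis @ frame

def imdct(coeffs):
    """Inverse MDCT: N coefficients -> 2N time-aliased samples."""
    n = np.arange(2 * N)
    k = np.arange(N)
    basis = np.cos(np.pi / N * (n[:, None] + 0.5 + N / 2) * (k[None, :] + 0.5))
    return (2.0 / N) * (basis @ coeffs)

# Sine window satisfies the Princen-Bradley condition w[n]^2 + w[n+N]^2 = 1,
# so the time-domain aliasing cancels in overlap-add
window = np.sin(np.pi / (2 * N) * (np.arange(2 * N) + 0.5))

def analyze(signal):
    """Signal (length a multiple of N) -> list of N-coefficient frames."""
    starts = range(0, len(signal) - N, N)
    return [mdct(window * signal[i:i + 2 * N]) for i in starts]

def synthesize(coeff_frames, length):
    """Windowed overlap-add reconstruction."""
    out = np.zeros(length)
    for t, coeffs in enumerate(coeff_frames):
        out[t * N:t * N + 2 * N] += window * imdct(coeffs)
    return out
```

Reconstruction is exact except for the first and last half-frames, which lack an overlapping partner; compression would then quantize or discard the coefficients before transmission.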

  15. A Rare Presentation of an Entrapment in a Liver Transplant Candidate Depicted by MDCT Angiography

    PubMed Central

    Kantarci, Mecit; Aydin, Unal; Doganay, Selim; Aydinli, Bulent; Yuce, Ihsan; Polat, Kamil Yalcin

    2010-01-01

    Hypertrophic caudate lobe veins can mimic a normal venous configuration. In cases of multiple vascular collaterals, Doppler evaluations must be conducted, and the flow direction of these veins as well as of the IVC should be evaluated. If the flow in the IVC is reversed, Budd-Chiari syndrome should be suspected; moreover, at the supradiaphragmatic level, which may be considered a blind spot, particularly for radiologists, a web should be sought in the area where the IVC opens into the right atrium. In this study, we present the unique findings of multidetector computed tomography (MDCT) angiography in a liver transplant candidate with Budd-Chiari syndrome caused by a web in the proximal IVC. PMID:25610132

  16. Diagnostic Value and Interreader Agreement of the Pancreaticolienal Gap in Pancreatic Cancer on MDCT

    PubMed Central

    Schawkat, Khoschy; Kühn, Wolfgang; Inderbitzin, Daniel; Gloor, Beat; Heverhagen, Johannes T.; Runge, Val Murray; Christe, Andreas

    2016-01-01

    Objective The aim of this retrospective study was to evaluate the diagnostic value and measure the interreader agreement of the pancreaticolienal gap (PLG) in the assessment of imaging features of pancreatic carcinoma (PC) on contrast-enhanced multi-detector computed tomography (CE-MDCT). Materials and Methods CE-MDCT studies in the portal venous phase were retrospectively reviewed for 66 patients with PC. The age- and gender-matched control group comprised 103 healthy individuals. Three radiologists with different levels of experience independently measured the PLG (the minimum distance from the pancreatic tail to the nearest border of the spleen) in the axial plane. Interreader agreement of the PLG and a receiver operating characteristic (ROC) curve were used to assess the accuracy of the technique. Results While the control group (n = 103) showed a median PLG of 3 mm (range: 0–39 mm), the PC patients had a significantly larger PLG of 15 mm (range: 0–53 mm) (p < 0.0001). The ROC curve yielded a cutoff value of >12 mm for PC, with a sensitivity of 58.2% (95% CI = 45.5–70.1), a specificity of 84.0% (95% CI = 75.6–90.4), and an area under the ROC curve of 0.714 (95% CI = 0.641–0.780). The mean interreader agreement showed a correlation coefficient r of 0.9159. The extent of the PLG did not correlate with tumor stage but did correlate with pancreatic density (fatty involution) and age: the density decreased by 4.1 HU and the PLG increased by 0.8 mm with every 10 years. Conclusion The significant interreader agreement supports the use of the PLG as a characterizing feature of pancreatic cancer, independent of tumor stage, on an axial plane. The increase in the PLG with age may represent physiological atrophy of the pancreatic tail. PMID:27893776
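    A cutoff of >12 mm trades sensitivity against specificity along the ROC curve. A hedged sketch of evaluating a cutoff on raw measurements (the paper does not state its cutoff-selection criterion; Youden's J, shown here, is one common choice, and the PLG values below are invented):

```python
import numpy as np

def sens_spec(cases, controls, cutoff):
    """The test is positive when the measurement exceeds `cutoff`."""
    sens = np.mean(np.asarray(cases) > cutoff)
    spec = np.mean(np.asarray(controls) <= cutoff)
    return sens, spec

def best_cutoff_youden(cases, controls):
    """Cutoff maximizing Youden's J = sensitivity + specificity - 1."""
    candidates = np.unique(np.concatenate([cases, controls]))
    return max(candidates, key=lambda c: sum(sens_spec(cases, controls, c)))

# Invented PLG values (mm) for illustration only:
plg_cancer = np.array([15.0, 20.0, 8.0, 30.0, 14.0])
plg_control = np.array([3.0, 0.0, 5.0, 14.0, 2.0])
sens, spec = sens_spec(plg_cancer, plg_control, 12.0)
```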

  17. Priapism and glucose-6-phosphate dehydrogenase deficiency: An underestimated correlation?

    PubMed

    De Rose, Aldo Franco; Mantica, Guglielmo; Tosi, Mattia; Bovio, Giulio; Terrone, Carlo

    2016-10-05

    Priapism is a rare clinical condition characterized by a persistent erection unrelated to sexual excitement; the etiology is often idiopathic. Three cases of priapism in patients with glucose-6-phosphate dehydrogenase (G6PD) deficiency have been described in the literature. We present the case of a 39-year-old man with G6PD deficiency who presented to our department with non-ischemic priapism without an arteriolacunar fistula. We suggest that G6PD deficiency could be an underestimated risk factor for priapism.

  18. Does WISC-IV Underestimate the Intelligence of Autistic Children?

    PubMed

    Nader, Anne-Marie; Courchesne, Valérie; Dawson, Michelle; Soulières, Isabelle

    2016-05-01

    Wechsler Intelligence Scale for Children (WISC) is widely used to estimate autistic intelligence (Joseph in The neuropsychology of autism. Oxford University Press, Oxford, 2011; Goldstein et al. in Assessment of autism spectrum disorders. Guilford Press, New York, 2008; Mottron in J Autism Dev Disord 34(1):19-27, 2004). However, previous studies suggest that while WISC-III and Raven's Progressive Matrices (RPM) provide similar estimates of non-autistic intelligence, autistic children perform significantly better on RPM (Dawson et al. in Psychol Sci 18(8):657-662, doi: 10.1111/j.1467-9280.2007.01954.x , 2007). The latest WISC version introduces substantial changes in subtests and index scores; thus, we asked whether WISC-IV still underestimates autistic intelligence. Twenty-five autistic and 22 typical children completed WISC-IV and RPM. Autistic children's RPM scores were significantly higher than their WISC-IV FSIQ, but there was no significant difference in typical children. Further, autistic children showed a distinctively uneven WISC-IV index profile, with a "peak" in the new Perceptual Reasoning Index. In spite of major changes, WISC-IV FSIQ continues to underestimate autistic intelligence.

  19. Consequences of Underestimating Impalement Bicycle Handlebar Injuries in Children.

    PubMed

    Ramos-Irizarry, Carmen T; Swain, Shakeva; Troncoso-Munoz, Samantha; Duncan, Malvina

    Impalement bicycle handlebar trauma injuries are rare; however, on initial assessment they have the potential to be underestimated. We reviewed our prospective trauma database of 3,894 patients for all bicycle injuries from January 2010 to May 2015. Isolated pedal bike injuries were reported in 2.6% (N = 101) of the patients who were admitted to the trauma service. Fifteen patients suffered direct handlebar trauma. Patients were grouped into blunt trauma (n = 12) and impalement trauma (n = 3). We examined gender, age, injury severity score (ISS), Glasgow Coma Scale score, use of protective devices, need for surgical intervention, need for intensive care (ICU), and hospital length of stay. Mean age was 9.6 years. All children with penetrating injuries were males. Mean ISS was less than 9 in both groups. None of the children were wearing bicycle helmets. Three patients who sustained blunt injuries required ICU care due to associated injuries. All of the children with impalement injuries required several surgical interventions. These injuries included a traumatic direct inguinal hernia, a medial groin and thigh laceration with a resultant femoral hernia, and a lateral deep thigh laceration. Impalement bicycle handlebar injuries must be evaluated as thoroughly as blunt injuries. A high index of suspicion must be maintained when examining children with handlebar impalement injuries, as they are at risk for missed injuries or underestimation of injury severity.

  20. Multidetector computed tomography (MDCT) evaluation of myocardial viability: intraindividual comparison of monomeric vs. dimeric contrast media in a rabbit model.

    PubMed

    Mahnken, Andreas H; Jost, Gregor; Bruners, Philipp; Sieber, Martin; Seidensticker, Peter R; Günther, Rolf W; Pietsch, Hubertus

    2009-02-01

    To evaluate the influence of different types of iodinated contrast media on the assessment of myocardial viability, acute myocardial infarction (MI) was surgically induced in six rabbits. Over a period of 45 min, contrast-enhanced cardiac MDCT (64 x 0.6 mm, 80 kV, 680 mAs(eff.)) was repeatedly performed using a contrast medium dose of 600 mg iodine/kg body weight. Animals received iopromide 300 and iodixanol 320 in randomized order. Attenuation values of healthy and infarcted myocardium were measured. The size of the MI was computed and compared with nitroblue tetrazolium (NBT)-stained specimens. The highest attenuation differences between infarcted and healthy myocardium occurred during the arterial phase, with 140.0+/-3.5 HU and 141.0+/-2.2 HU for iopromide and iodixanol, respectively. For iodixanol, the highest attenuation difference on delayed contrast-enhanced images was achieved 3 min post injection (73.5 HU). A slightly higher attenuation difference was observed for iopromide 6 min after contrast medium injection (82.2 HU), although this was not statistically significant (p=0.6437). Mean infarct volume as measured by NBT staining was 33.5%+/-13.6%. There was excellent agreement of infarct sizes among NBT-, iopromide-, and iodixanol-enhanced MDCT, with concordance-correlation coefficients ranging from rho(c)=0.9928 to 0.9982. Iopromide and iodixanol both allow reliable assessment of MI with delayed contrast-enhanced MDCT.

  1. Extramural venous invasion detected by MDCT as an adverse imaging feature for predicting synchronous metastases in T4 gastric cancer.

    PubMed

    Cheng, Jin; Wu, Jing; Ye, Yingjiang; Zhang, Chunfang; Zhang, Yinli; Wang, Yi

    2017-04-01

    Background Extramural venous invasion (EMVI) is defined histologically as the active invasion of tumor cells into the lumens of mesenteric vessels beyond the muscularis propria in advanced gastrointestinal cancer, resulting in distant metastases. Purpose To determine the association between synchronous metastatic disease in patients with T4 gastric cancer and EMVI detected on contrast-enhanced multidetector-row computed tomography (MDCT). Material and Methods A total of 152 patients with T4 gastric carcinoma were retrospectively reviewed and divided into EMVI-positive and EMVI-negative groups, where EMVI, as detected on MDCT, was defined as a tubular or nodular soft tissue thickening extending from the tumor along the vessels of the mesentery. Synchronous metastases were detected by MDCT and/or confirmed by postoperative diagnosis. Logistic regression analyses were performed to identify predictive factors for synchronous metastases in gastric cancer. Results Synchronous metastases were found in 47 of 152 (30.9%) patients with T4 gastric cancer. Thirty-one of 77 (40.3%) patients in the EMVI-positive group had evidence of metastases, compared to 16 of 75 (21.3%) patients in the EMVI-negative group (P = 0.019). Synchronous metastases were significantly associated with EMVI, with an odds ratio (OR) of 2.250 (95% CI, 1.072-4.724). Conclusion EMVI-positive status, as an adverse imaging feature, was significantly associated with synchronous metastases in patients with T4 gastric cancer.
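    The crude odds ratio behind an association like this comes straight from the 2×2 counts. A hedged sketch (the abstract's OR of 2.250 comes from its logistic regression, so the crude value computed from the reported counts differs slightly):

```python
from math import exp, log, sqrt

def odds_ratio_2x2(a, b, c, d, z=1.96):
    """Crude OR for exposed (a events / b non-events) vs.
    unexposed (c events / d non-events), with a Wald 95% CI."""
    or_ = (a * d) / (b * c)
    se_log = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    ci = (exp(log(or_) - z * se_log), exp(log(or_) + z * se_log))
    return or_, ci

# EMVI-positive: 31 with / 46 without metastases; EMVI-negative: 16 / 59
or_, (ci_lo, ci_hi) = odds_ratio_2x2(31, 46, 16, 59)  # crude OR ~ 2.49
```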

  2. Underestimating the frequency, strength and cost of antipredator responses with data from GPS collars: an example with wolves and elk

    PubMed Central

    Creel, Scott; Winnie, John A; Christianson, David

    2013-01-01

    Field studies that rely on fixes from GPS-collared predators to identify encounters with prey will often underestimate the frequency and strength of antipredator responses. These underestimation biases have several mechanistic causes. (1) Step bias: The distance between successive GPS fixes can be large, and encounters that occur during these intervals go undetected. This bias will generally be strongest for cursorial hunters that can rapidly cover large distances (e.g., wolves and African wild dogs) and when the interval between GPS fixes is long relative to the duration of a hunt. Step bias is amplified as the path travelled between successive GPS fixes deviates from a straight line. (2) Scatter bias: Only a small fraction of the predators in a population typically carry GPS collars, and prey encounters with uncollared predators go undetected unless a collared group-mate is present. This bias will generally be stronger for fission–fusion hunters (e.g., spotted hyenas, wolves, and lions) than for highly cohesive hunters (e.g., African wild dogs), particularly when their group sizes are large. Step bias and scatter bias both cause underestimation of the frequency of antipredator responses. (3) Strength bias: Observations of prey in the absence of a GPS fix from a collared predator will generally include a mixture of cases in which predators were truly absent and cases in which predators were present but not detected, which causes underestimation of the strength of antipredator responses. We quantified these biases with data from wolves and African wild dogs and found that fixes from GPS collars at 3-h intervals underestimated the frequency and strength of antipredator responses by a factor of >10. We reexamined the results of a recent study of the nonconsumptive effects of wolves on elk in light of these results and confirmed that predation risk has strong effects on elk dynamics by reducing the pregnancy rate. PMID:24455148
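    Step bias has a simple quantitative core: with fixes at regular interval I and random phase, an encounter lasting d is caught by at least one fix with probability min(1, d/I). A minimal sketch (our formalization of the mechanism the authors describe, not their analysis):

```python
def detection_probability(encounter_duration, fix_interval):
    """P(at least one GPS fix falls inside an encounter), assuming fixes
    every `fix_interval` time units with uniformly random phase."""
    return min(1.0, encounter_duration / fix_interval)

# A 20-minute hunt sampled by 3-hour (180-minute) fixes:
p = detection_probability(20.0, 180.0)  # ~0.11: roughly 9 in 10 hunts missed
```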

  3. Underestimating the frequency, strength and cost of antipredator responses with data from GPS collars: an example with wolves and elk.

    PubMed

    Creel, Scott; Winnie, John A; Christianson, David

    2013-12-01

    Field studies that rely on fixes from GPS-collared predators to identify encounters with prey will often underestimate the frequency and strength of antipredator responses. These underestimation biases have several mechanistic causes. (1) Step bias: The distance between successive GPS fixes can be large, and encounters that occur during these intervals go undetected. This bias will generally be strongest for cursorial hunters that can rapidly cover large distances (e.g., wolves and African wild dogs) and when the interval between GPS fixes is long relative to the duration of a hunt. Step bias is amplified as the path travelled between successive GPS fixes deviates from a straight line. (2) Scatter bias: Only a small fraction of the predators in a population typically carry GPS collars, and prey encounters with uncollared predators go undetected unless a collared group-mate is present. This bias will generally be stronger for fission-fusion hunters (e.g., spotted hyenas, wolves, and lions) than for highly cohesive hunters (e.g., African wild dogs), particularly when their group sizes are large. Step bias and scatter bias both cause underestimation of the frequency of antipredator responses. (3) Strength bias: Observations of prey in the absence of a GPS fix from a collared predator will generally include a mixture of cases in which predators were truly absent and cases in which predators were present but not detected, which causes underestimation of the strength of antipredator responses. We quantified these biases with data from wolves and African wild dogs and found that fixes from GPS collars at 3-h intervals underestimated the frequency and strength of antipredator responses by a factor of >10. We reexamined the results of a recent study of the nonconsumptive effects of wolves on elk in light of these results and confirmed that predation risk has strong effects on elk dynamics by reducing the pregnancy rate.

  4. Climate change velocity underestimates climate change exposure in mountainous regions

    PubMed Central

    Dobrowski, Solomon Z.; Parks, Sean A.

    2016-01-01

    Climate change velocity is a vector depiction of the rate of climate displacement used for assessing climate change impacts. Interpreting velocity requires an assumption that climate trajectory length is proportional to climate change exposure; longer paths suggest greater exposure. However, distance is an imperfect measure of exposure because it does not quantify the extent to which trajectories traverse areas of dissimilar climate. Here we calculate velocity and minimum cumulative exposure (MCE) in degrees Celsius along climate trajectories for North America. We find that velocity is weakly related to MCE; each metric identifies contrasting areas of vulnerability to climate change. Notably, velocity underestimates exposure in mountainous regions where climate trajectories traverse dissimilar climates, resulting in high MCE. In contrast, in flat regions velocity is high where MCE is low, as these areas have negligible climatic resistance to movement. Our results suggest that mountainous regions are more climatically isolated than previously reported. PMID:27476545

  5. Climate change velocity underestimates climate change exposure in mountainous regions

    NASA Astrophysics Data System (ADS)

    Dobrowski, Solomon Z.; Parks, Sean A.

    2016-08-01

    Climate change velocity is a vector depiction of the rate of climate displacement used for assessing climate change impacts. Interpreting velocity requires an assumption that climate trajectory length is proportional to climate change exposure; longer paths suggest greater exposure. However, distance is an imperfect measure of exposure because it does not quantify the extent to which trajectories traverse areas of dissimilar climate. Here we calculate velocity and minimum cumulative exposure (MCE) in degrees Celsius along climate trajectories for North America. We find that velocity is weakly related to MCE; each metric identifies contrasting areas of vulnerability to climate change. Notably, velocity underestimates exposure in mountainous regions where climate trajectories traverse dissimilar climates, resulting in high MCE. In contrast, in flat regions velocity is high where MCE is low, as these areas have negligible climatic resistance to movement. Our results suggest that mountainous regions are more climatically isolated than previously reported.

  6. Underestimating our influence over others' unethical behavior and decisions.

    PubMed

    Bohns, Vanessa K; Roghanizad, M Mahdi; Xu, Amy Z

    2014-03-01

    We examined the psychology of "instigators," people who surround an unethical act and influence the wrongdoer (the "actor") without directly committing the act themselves. In four studies, we found that instigators of unethical acts underestimated their influence over actors. In Studies 1 and 2, university students enlisted other students to tell a "white lie" (Study 1) or commit a small act of vandalism (Study 2) after making predictions about how easy it would be to get their fellow students to do so. In Studies 3 and 4, online samples of participants responded to hypothetical vignettes, for example, about buying children alcohol and taking office supplies home for personal use. In all four studies, instigators failed to recognize the social pressure they levied on actors through simple unethical suggestions, that is, the discomfort actors would experience by making a decision that was inconsistent with the instigator's suggestion.

  7. Underestimation of Monostatic Sodar Measurements in Complex Terrain

    NASA Astrophysics Data System (ADS)

    Behrens, Paul; O'Sullivan, J.; Archer, R.; Bradley, S.

    2012-04-01

    Recent investigations in complex terrain have found that remote-sensing instruments commonly report mean wind speeds that differ from those of cup anemometry. In many cases the difference is an underestimation, varying from 2 to 9% depending on topography. We describe these differences in a theoretical sense for a five-beam sodar. An investigation is conducted on a New Zealand ridge with a five-beam sodar and three computational models: a potential flow model and two computational fluid dynamics simulations, OpenFOAM and the industry-standard software WindSim. All models predict the difference to within 0.1-2.5%. A comparative assessment finds that, given the computing overheads, the potential flow model provides a good compromise in the prediction of the mean wind-speed difference.

  8. Glycerol, an underestimated flavor precursor in the Maillard reaction.

    PubMed

    Smarrito-Menozzi, Candice; Matthey-Doret, Walter; Devaud-Goumoens, Stéphanie; Viton, Florian

    2013-10-30

    The objective of the present work was to investigate in depth the role of glycerol in Maillard reactions and its potential to act as an active flavor precursor. Reactions using isotopically labeled compounds (various reducing sugars, proline, and glycerol) unambiguously demonstrated that, in addition to its role of solvent, glycerol actively contributes to the formation of proline-specific compounds in Maillard model systems. Additionally, rhamnose and fucose/proline/glycerol systems generated the 2-propionyl-1(3),4,5,6-tetrahydropyridines, known for their roasty, popcorn aroma. Their formation from such systems is unprecedented. The results presented here have direct implications for flavor generation during thermal processing of foods containing glycerol, which is a ubiquitous food ingredient and an underestimated flavor precursor.

  9. Reducing radiation dose to selected organs by selecting the tube start angle in MDCT helical scans: A Monte Carlo based study

    SciTech Connect

    Zhang Di; Zankl, Maria; DeMarco, John J.; Cagnon, Chris H.; Angel, Erin; Turner, Adam C.; McNitt-Gray, Michael F.

    2009-12-15

    Purpose: Previous work has demonstrated that there are significant dose variations with a sinusoidal pattern on the periphery of a 32 cm CTDI phantom or on the surface of an anthropomorphic phantom when helical CT scanning is performed, resulting in the creation of "hot" spots or "cold" spots. The purpose of this work was to perform preliminary investigations into the feasibility of exploiting these variations to reduce dose to selected radiosensitive organs solely by varying the tube start angle in CT scans. Methods: Radiation doses to several radiosensitive organs (including breasts, thyroid, uterus, gonads, and eye lenses) resulting from MDCT scans were estimated using Monte Carlo simulation methods on voxelized patient models, including GSF's Baby, Child, and Irene. Dose to the fetus was also estimated using four pregnant female models based on CT images of the pregnant patients. Whole-body scans were simulated using 120 kVp, 300 mAs, both 28.8 and 40 mm nominal collimations, and pitch values of 1.5, 1.0, and 0.75 under a wide range of start angles (0° to 340° in 20° increments). The relationship between tube start angle and organ dose was examined for each organ, and the potential dose reduction was calculated. Results: Some organs exhibit a strong dose variation depending on the tube start angle. For small peripheral organs (e.g., the eye lenses of the Baby phantom at pitch 1.5 with 40 mm collimation), the minimum dose can be 41% lower than the maximum dose, depending on the tube start angle. In general, larger dose reductions occur for smaller peripheral organs in smaller patients when wider collimation is used. Pitch 1.5 and pitch 0.75 scans have different mechanisms of dose reduction. For pitch 1.5 scans, the dose is usually lowest when the tube start angle is such that the x-ray tube is posterior to the patient when it passes the longitudinal location of the organ. For pitch 0.75 scans, the dose is lowest when the tube start angle is such that the x

  10. Morphological and functional MDCT: problem-solving tool and surrogate biomarker for hepatic disease clinical care and drug discovery in the era of personalized medicine.

    PubMed

    Wang, Liang

    2010-08-17

    This article explains the significant role of morphological and functional multidetector computed tomography (MDCT), in combination with image postprocessing algorithms, as a problem-solving tool and noninvasive surrogate biomarker for improving hepatic disease characterization, detection, tumor staging and prognosis, therapy response assessment, novel drug discovery programs, partial liver resection and transplantation, and MDCT-guided interventions in the era of personalized medicine. State-of-the-art MDCT improves on conventional CT by not only depicting lesion location, size, and extent but also detecting changes in tumor biologic behavior caused by therapy or tumor progression before morphologic changes appear. Color-encoded parameter displays provide important functional information on blood flow, permeability, leakage space, and blood volume. Together with other relevant biomarkers and genomics, the imaging modality is being developed and validated as a biomarker of early response to novel, targeted anti-VEGF(R)/PDGFR or antivascular/antiangiogenic agents, as its parameters correlate with immunohistochemical surrogates of tumor angiogenesis and molecular features of malignancies. MDCT adds incremental value to the World Health Organization response criteria and the Response Evaluation Criteria in Solid Tumors in liver disease management. MDCT volumetric measurement of the future remnant liver is the most important factor influencing the outcome of patients undergoing partial liver resection and transplantation. MDCT-guided interventional methods deliver personalized therapies locally in the human body. MDCT will have greater scientific impact when fused with other imaging probes to yield comprehensive information on changes in liver disease at different levels (anatomic, metabolic, molecular, histologic, and others).

  11. A Numerical Study of Water Loss Rate Distributions in MDCT-based Human Airway Models

    PubMed Central

    Wu, Dan; Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2015-01-01

    Both three-dimensional (3D) and one-dimensional (1D) computational fluid dynamics (CFD) methods are applied to study regional water loss in three multi-detector row computed-tomography (MDCT)-based human airway models at the minute ventilations of 6, 15 and 30 L/min. The overall water losses predicted by both 3D and 1D models in the entire respiratory tract agree with available experimental measurements. However, 3D and 1D models reveal different regional water loss rate distributions due to the 3D secondary flows formed at bifurcations. The secondary flows cause local skewed temperature and humidity distributions on inspiration acting to elevate the local water loss rate; and the secondary flow at the carina tends to distribute more cold air to the lower lobes. As a result, the 3D model predicts that the water loss rate first increases with increasing airway generation, and then decreases as the air approaches saturation, while the 1D model predicts a monotonic decrease of water loss rate with increasing airway generation. Moreover, the 3D (or 1D) model predicts relatively higher water loss rates in lower (or upper) lobes. The regional water loss rate can be related to the non-dimensional wall shear stress (τ*) by the non-dimensional mass transfer coefficient (h0*) as h0* = 1.15 τ*^0.272, R = 0.842. PMID:25869455
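    The power-law fit reported above can be evaluated directly. This sketch simply computes h0* = 1.15 τ*^0.272 for a given non-dimensional wall shear stress; the example input value is hypothetical:

```python
def mass_transfer_coefficient(tau_star):
    """Non-dimensional mass transfer coefficient h0* from the power-law
    fit reported in the abstract (h0* = 1.15 * tau*^0.272, R = 0.842)."""
    return 1.15 * tau_star ** 0.272

# At tau* = 1 the prefactor is recovered; doubling tau* raises h0* by 2^0.272.
h = mass_transfer_coefficient(1.0)
```

    Note the weak exponent: a large change in wall shear stress produces only a modest change in the local water loss rate.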

  12. Empathy gaps for social pain: why people underestimate the pain of social suffering.

    PubMed

    Nordgren, Loran F; Banas, Kasia; MacDonald, Geoff

    2011-01-01

    In 5 studies, the authors examined the hypothesis that people have systematically distorted beliefs about the pain of social suffering. By integrating research on empathy gaps for physical pain (Loewenstein, 1996) with social pain theory (MacDonald & Leary, 2005), the authors generated the hypothesis that people generally underestimate the severity of social pain (ostracism, shame, etc.)--a biased judgment that is only corrected when people actively experience social pain for themselves. Using a social exclusion manipulation, Studies 1-4 found that nonexcluded participants consistently underestimated the severity of social pain compared with excluded participants, who had a heightened appreciation for social pain. This empathy gap for social pain occurred when participants evaluated both the pain of others (interpersonal empathy gap) as well as the pain participants themselves experienced in the past (intrapersonal empathy gap). The authors argue that beliefs about social pain are important because they govern how people react to socially distressing events. In Study 5, middle school teachers were asked to evaluate policies regarding emotional bullying at school. This revealed that actively experiencing social pain heightened the estimated pain of emotional bullying, which in turn led teachers to recommend both more comprehensive treatment for bullied students and greater punishment for students who bully.

  13. Brighter galaxy bias: underestimating the velocity dispersions of galaxy clusters

    NASA Astrophysics Data System (ADS)

    Old, L.; Gray, M. E.; Pearce, F. R.

    2013-09-01

    We study the systematic bias introduced when selecting the spectroscopic redshifts of brighter cluster galaxies to estimate the velocity dispersion of galaxy clusters from both simulated and observational galaxy catalogues. We select clusters with Ngal ≥ 50 at five low-redshift snapshots from the publicly available De Lucia & Blaziot semi-analytic model galaxy catalogue. Clusters are also selected from the Tempel Sloan Digital Sky Survey Data Release 8 groups and clusters catalogue across the redshift range 0.021 ≤ z ≤ 0.098. We employ various selection techniques to explore whether the velocity dispersion bias is simply due to a lack of dynamical information or is the result of an underlying physical process occurring in the cluster, for example, dynamical friction experienced by the brighter cluster members. The velocity dispersions of the parent dark matter (DM) haloes are compared to the galaxy cluster dispersions and the stacked distribution of DM particle velocities is examined alongside the corresponding galaxy velocity distribution. We find a clear bias between the halo and the semi-analytic galaxy cluster velocity dispersion on the order of σgal/σDM ≈ 0.87-0.95 and a distinct difference in the stacked galaxy and DM particle velocity distributions. We identify a systematic underestimation of the velocity dispersions when imposing increasing absolute I-band magnitude limits. This underestimation is enhanced, on the order of 5-35 per cent, when using only the brighter cluster members for dynamical analysis, indicating that dynamical friction is a serious source of bias when using galaxy velocities as tracers of the underlying gravitational potential. In contrast to the literature we find that the resulting bias is not only halo mass dependent but also that the nature of the dependence changes according to the galaxy selection strategy. We make a recommendation that, in the realistic case of limited availability of spectral observations, a strictly

  14. Quantifying Greenland freshwater flux underestimates in climate models

    NASA Astrophysics Data System (ADS)

    Little, Christopher M.; Piecuch, Christopher G.; Chaudhuri, Ayan H.

    2016-05-01

    Key processes regulating the mass balance of the Greenland Ice Sheet (GIS) are not represented in current-generation climate models. Here using output from 19 different climate models forced with a high-end business-as-usual emissions pathway, we compare modeled freshwater fluxes (FWF) to a parameterization based on midtropospheric temperature. By the mid-21st century, parameterized GIS FWF is 478 ± 215 km³ yr⁻¹ larger than modeled—over 3 times the 1992-2011 rate of GIS mass loss. By the late 21st century, ensemble mean parameterized GIS FWF anomalies are comparable to FWF anomalies over the northern North Atlantic Ocean, equivalent to approximately 11 cm of global mean sea level rise. The magnitude and spread of these underestimates underscores the need for assessments of the coupled response of the ocean to increased FWF that recognize: (1) the widely varying freshwater budgets of each model and (2) uncertainty in the relationship between GIS FWF and atmospheric temperature.
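    The conversion from a freshwater flux anomaly to an equivalent rate of sea-level rise is simple arithmetic. This back-of-the-envelope sketch assumes the flux spreads uniformly over a global ocean area of about 3.61 × 10^14 m² (a standard approximation, not a value taken from the paper):

```python
OCEAN_AREA_M2 = 3.61e14  # approximate global ocean surface area (assumption)
KM3_TO_M3 = 1e9

def fwf_to_gmsl_rate_mm_per_yr(fwf_km3_per_yr):
    """Convert a freshwater flux anomaly (km^3/yr) into an equivalent rate
    of global mean sea level rise (mm/yr), assuming the water spreads
    uniformly over the ocean surface. Back-of-the-envelope only."""
    depth_m_per_yr = fwf_km3_per_yr * KM3_TO_M3 / OCEAN_AREA_M2
    return depth_m_per_yr * 1000.0  # m -> mm

rate = fwf_to_gmsl_rate_mm_per_yr(478.0)
```

    At this rate the 478 km³ yr⁻¹ underestimate corresponds to roughly 1.3 mm yr⁻¹ of global mean sea level, consistent with the cumulative ~11 cm figure over several decades cited in the abstract.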

  15. Massive yet grossly underestimated global costs of invasive insects

    PubMed Central

    Bradshaw, Corey J. A.; Leroy, Boris; Bellard, Céline; Roiz, David; Albert, Céline; Fournier, Alice; Barbet-Massin, Morgane; Salles, Jean-Michel; Simard, Frédéric; Courchamp, Franck

    2016-01-01

    Insects have presented human society with some of its greatest development challenges by spreading diseases, consuming crops and damaging infrastructure. Despite the massive human and financial toll of invasive insects, cost estimates of their impacts remain sporadic, spatially incomplete and of questionable quality. Here we compile a comprehensive database of economic costs of invasive insects. Taking all reported goods and service estimates, invasive insects cost a minimum of US$70.0 billion per year globally, while associated health costs exceed US$6.9 billion per year. Total costs rise as the number of estimates increases, although many of the worst costs have already been estimated (especially those related to human health). A lack of dedicated studies, especially for reproducible goods and service estimates, implies gross underestimation of global costs. Global warming as a consequence of climate change, rising human population densities and intensifying international trade will allow these costly insects to spread into new areas, but substantial savings could be achieved by increasing surveillance, containment and public awareness. PMID:27698460

  16. Massive yet grossly underestimated global costs of invasive insects.

    PubMed

    Bradshaw, Corey J A; Leroy, Boris; Bellard, Céline; Roiz, David; Albert, Céline; Fournier, Alice; Barbet-Massin, Morgane; Salles, Jean-Michel; Simard, Frédéric; Courchamp, Franck

    2016-10-04

    Insects have presented human society with some of its greatest development challenges by spreading diseases, consuming crops and damaging infrastructure. Despite the massive human and financial toll of invasive insects, cost estimates of their impacts remain sporadic, spatially incomplete and of questionable quality. Here we compile a comprehensive database of economic costs of invasive insects. Taking all reported goods and service estimates, invasive insects cost a minimum of US$70.0 billion per year globally, while associated health costs exceed US$6.9 billion per year. Total costs rise as the number of estimates increases, although many of the worst costs have already been estimated (especially those related to human health). A lack of dedicated studies, especially for reproducible goods and service estimates, implies gross underestimation of global costs. Global warming as a consequence of climate change, rising human population densities and intensifying international trade will allow these costly insects to spread into new areas, but substantial savings could be achieved by increasing surveillance, containment and public awareness.

  17. Massive yet grossly underestimated global costs of invasive insects

    NASA Astrophysics Data System (ADS)

    Bradshaw, Corey J. A.; Leroy, Boris; Bellard, Céline; Roiz, David; Albert, Céline; Fournier, Alice; Barbet-Massin, Morgane; Salles, Jean-Michel; Simard, Frédéric; Courchamp, Franck

    2016-10-01

    Insects have presented human society with some of its greatest development challenges by spreading diseases, consuming crops and damaging infrastructure. Despite the massive human and financial toll of invasive insects, cost estimates of their impacts remain sporadic, spatially incomplete and of questionable quality. Here we compile a comprehensive database of economic costs of invasive insects. Taking all reported goods and service estimates, invasive insects cost a minimum of US$70.0 billion per year globally, while associated health costs exceed US$6.9 billion per year. Total costs rise as the number of estimates increases, although many of the worst costs have already been estimated (especially those related to human health). A lack of dedicated studies, especially for reproducible goods and service estimates, implies gross underestimation of global costs. Global warming as a consequence of climate change, rising human population densities and intensifying international trade will allow these costly insects to spread into new areas, but substantial savings could be achieved by increasing surveillance, containment and public awareness.

  18. Satellite methods underestimate indirect climate forcing by aerosols

    PubMed Central

    Penner, Joyce E.; Xu, Li; Wang, Minghuai

    2011-01-01

    Satellite-based estimates of the aerosol indirect effect (AIE) are consistently smaller than the estimates from global aerosol models, and, partly as a result of these differences, the assessment of this climate forcing includes large uncertainties. Satellite estimates typically use the present-day (PD) relationship between observed cloud drop number concentrations (Nc) and aerosol optical depths (AODs) to determine the preindustrial (PI) values of Nc. These values are then used to determine the PD and PI cloud albedos and, thus, the effect of anthropogenic aerosols on top of the atmosphere radiative fluxes. Here, we use a model with realistic aerosol and cloud processes to show that empirical relationships for ln(Nc) versus ln(AOD) derived from PD results do not represent the atmospheric perturbation caused by the addition of anthropogenic aerosols to the preindustrial atmosphere. As a result, the model estimates based on satellite methods of the AIE are between a factor of 3 and more than a factor of 6 smaller than model estimates based on actual PD and PI values for Nc. Using ln(Nc) versus ln(AI) (Aerosol Index, or the optical depth times the Ångström exponent) to estimate preindustrial values for Nc provides estimates for Nc and forcing that are closer to the values predicted by the model. Nevertheless, the AIE using ln(Nc) versus ln(AI) may be substantially incorrect on a regional basis and may underestimate or overestimate the global average forcing by 25 to 35%. PMID:21808047
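    The present-day ln(Nc)-ln(AOD) relationship at the heart of the satellite method is an ordinary log-log regression. A minimal pure-Python sketch on synthetic data follows; the slope of 0.4, the AOD range, and the noise level are assumptions chosen for illustration, not values from the study:

```python
import math
import random

def fit_loglog_slope(aod, nc):
    """Ordinary least-squares slope of ln(Nc) on ln(AOD) -- the kind of
    empirical relationship satellite methods use to back out preindustrial
    cloud drop number concentrations. Pure-Python illustrative sketch."""
    x = [math.log(a) for a in aod]
    y = [math.log(n) for n in nc]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

# Synthetic "present-day" data with a known log-log slope of 0.4:
rng = random.Random(0)
aod = [rng.uniform(0.05, 0.5) for _ in range(200)]
nc = [50.0 * a ** 0.4 * math.exp(rng.gauss(0, 0.05)) for a in aod]
slope = fit_loglog_slope(aod, nc)  # recovers a value near 0.4
```

    The paper's point is that even a well-fit present-day slope like this one need not describe the perturbation from adding anthropogenic aerosols to the preindustrial atmosphere.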

  19. A multiscale MDCT image-based breathing lung model with time-varying regional ventilation

    SciTech Connect

    Yin, Youbing; Choi, Jiwoong; Hoffman, Eric A.; Tawhai, Merryn H.; Lin, Ching-Long

    2013-07-01

    A novel algorithm is presented that links local structural variables (regional ventilation and deforming central airways) to global function (total lung volume) in the lung over three imaged lung volumes, to derive a breathing lung model for computational fluid dynamics simulation. The algorithm constitutes the core of an integrative, image-based computational framework for subject-specific simulation of the breathing lung. For the first time, the algorithm is applied to three multi-detector row computed tomography (MDCT) volumetric lung images of the same individual. A key technique in linking global and local variables over multiple images is an in-house mass-preserving image registration method. Throughout breathing cycles, cubic interpolation is employed to ensure C1 continuity in constructing time-varying regional ventilation at the whole lung level, flow rate fractions exiting the terminal airways, and airway deformation. The imaged exit airway flow rate fractions are derived from regional ventilation with the aid of a three-dimensional (3D) and one-dimensional (1D) coupled airway tree that connects the airways to the alveolar tissue. An in-house parallel large-eddy simulation (LES) technique is adopted to capture turbulent-transitional-laminar flows in both normal and deep breathing conditions. The results obtained by the proposed algorithm when using three lung volume images are compared with those using only one or two volume images. The three-volume-based lung model produces physiologically-consistent time-varying pressure and ventilation distribution. The one-volume-based lung model under-predicts pressure drop and yields un-physiological lobar ventilation. The two-volume-based model can account for airway deformation and non-uniform regional ventilation to some extent, but does not capture the non-linear features of the lung.

  20. A multiscale MDCT image-based breathing lung model with time-varying regional ventilation

    NASA Astrophysics Data System (ADS)

    Yin, Youbing; Choi, Jiwoong; Hoffman, Eric A.; Tawhai, Merryn H.; Lin, Ching-Long

    2013-07-01

    A novel algorithm is presented that links local structural variables (regional ventilation and deforming central airways) to global function (total lung volume) in the lung over three imaged lung volumes, to derive a breathing lung model for computational fluid dynamics simulation. The algorithm constitutes the core of an integrative, image-based computational framework for subject-specific simulation of the breathing lung. For the first time, the algorithm is applied to three multi-detector row computed tomography (MDCT) volumetric lung images of the same individual. A key technique in linking global and local variables over multiple images is an in-house mass-preserving image registration method. Throughout breathing cycles, cubic interpolation is employed to ensure C1 continuity in constructing time-varying regional ventilation at the whole lung level, flow rate fractions exiting the terminal airways, and airway deformation. The imaged exit airway flow rate fractions are derived from regional ventilation with the aid of a three-dimensional (3D) and one-dimensional (1D) coupled airway tree that connects the airways to the alveolar tissue. An in-house parallel large-eddy simulation (LES) technique is adopted to capture turbulent-transitional-laminar flows in both normal and deep breathing conditions. The results obtained by the proposed algorithm when using three lung volume images are compared with those using only one or two volume images. The three-volume-based lung model produces physiologically-consistent time-varying pressure and ventilation distribution. The one-volume-based lung model under-predicts pressure drop and yields un-physiological lobar ventilation. The two-volume-based model can account for airway deformation and non-uniform regional ventilation to some extent, but does not capture the non-linear features of the lung.

  1. Dynamic left ventricular outflow tract obstruction: underestimated cause of hypotension and hemodynamic instability

    PubMed Central

    2014-01-01

    Left ventricular outflow tract obstruction, which is typically associated with hypertrophic cardiomyopathy, is the third most frequent cause of unexplained hypotension. This underestimated problem may temporarily accompany various diseases (it is found in fewer than 1% of patients with no tangible cardiac disease) and clinical situations (hypovolemia, general anesthesia). It is currently assumed that left ventricular outflow tract obstruction is a dynamic phenomenon, the occurrence of which requires the coexistence of predisposing anatomic factors and a physiological condition that induces it. The diagnosis of left ventricular outflow tract obstruction should entail immediate implementation of therapy to eliminate the factors that can potentially intensify the obstruction. Echocardiography is the basic modality in the diagnosis and treatment of left ventricular outflow tract obstruction. This paper presents four patients in whom the immediate implementation of bedside echocardiography enabled a rapid diagnosis of left ventricular outflow tract obstruction and implementation of proper treatment. PMID:26674265

  2. Underestimated Amoebic Appendicitis among HIV-1-Infected Individuals in Japan

    PubMed Central

    Kobayashi, Taiichiro; Yano, Hideaki; Murata, Yukinori; Igari, Toru; Nakada-Tsukui, Kumiko; Yagita, Kenji; Nozaki, Tomoyoshi; Kaku, Mitsuo; Tsukada, Kunihisa; Gatanaga, Hiroyuki; Kikuchi, Yoshimi; Oka, Shinichi

    2016-01-01

    Entamoeba histolytica is not a common causative agent of acute appendicitis. However, amoebic appendicitis can sometimes be severe and life threatening, mainly due to a lack of awareness. Also, its frequency, clinical features, and pathogenesis remain unclear. The study subjects were HIV-1-infected individuals who presented with acute appendicitis and later underwent appendectomy at our hospital between 1996 and 2014. Formalin-fixed paraffin-embedded preserved appendix specimens were reexamined by periodic acid-Schiff (PAS) staining and PCR to identify undiagnosed amoebic appendicitis. Appendectomies were performed in 57 patients with acute appendicitis. The seroprevalence of E. histolytica was 33% (14/43) from the available stored sera. Based on the medical records, only 3 cases were clinically diagnosed as amoebic appendicitis, including 2 diagnosed at the time of appendectomy and 1 case diagnosed by rereview of the appendix after the development of postoperative complications. Retrospective analyses using PAS staining and PCR identified 3 and 3 more cases, respectively. Thus, E. histolytica infection was confirmed in 9 cases (15.8%) in the present study. Apart from a significantly higher leukocyte count in E. histolytica-positive patients than in negative patients (median, 13,760 versus 10,385 cells/μl, respectively, P = 0.02), there were no other differences in the clinical features of the PCR-positive and -negative groups. In conclusion, E. histolytica infection was confirmed in 9 (15.8%) of the appendicitis cases. However, only 3, including one diagnosed after intestinal perforation, were diagnosed before the present analyses. These results strongly suggest there is frequently a failure to detect trophozoites in routine examination, resulting in an underestimation of the incidence of amoebic appendicitis. PMID:27847377

  3. Managing the underestimated risk of statin-associated myopathy.

    PubMed

    Rallidis, Loukianos S; Fountoulaki, Katerina; Anastasiou-Nana, Maria

    2012-09-06

    In clinical practice, 5-10% of patients receiving statins develop myopathy, a side effect that had been systematically underestimated in the randomized controlled trials of statins. The most common manifestation of myopathy is muscle pain (usually symmetrical, involving proximal muscles) without creatine kinase (CK) elevation or, less frequently, with mild CK elevation. Clinically significant rhabdomyolysis (muscle symptoms with CK elevation >10 times the upper limit of normal and with creatinine elevation) is extremely rare. Myopathy complicates the use of all statins (a class effect) and is dose-dependent. The pathophysiologic mechanism of statin-associated myopathy is unknown and probably multifactorial. The risk of statin-associated myopathy can be minimized by identifying vulnerable patients (i.e., patients with impaired renal or liver function, advanced age, hypothyroidism, etc.) and/or by eliminating or avoiding statin interactions with specific drugs (cytochrome P-450 3A4 inhibitors, gemfibrozil, etc.). In symptomatic patients, the severity of symptoms, the magnitude of CK elevation, and the risk/benefit ratio of statin continuation should be considered before statin treatment is discontinued. Potential strategies are the use of the same statin at a lower dose and, if symptoms recur, the initiation of fluvastatin XL 80 mg daily or rosuvastatin intermittently at a low dose (5-10 mg), usually combined with ezetimibe 10 mg daily. Failure of these approaches necessitates the use of non-statin lipid-lowering drugs (ezetimibe, colesevelam). In order to provide evidence-based recommendations for the appropriate management of statin-intolerant patients, we need randomized clinical trials directly comparing the myopathic potential of different lipid-lowering medications at comparable doses.

  4. Correcting eddy-covariance flux underestimates over a grassland.

    SciTech Connect

    Twine, T. E.; Kustas, W. P.; Norman, J. M.; Cook, D. R.; Houser, P. R.; Meyers, T. P.; Prueger, J. H.; Starks, P. J.; Wesely, M. L.; Environmental Research; Univ. of Wisconsin at Madison; DOE; National Aeronautics and Space Administration; National Oceanic and Atmospheric Administration

    2000-06-08

    Independent measurements of the major energy balance flux components are not often consistent with the principle of conservation of energy. This is referred to as a lack of closure of the surface energy balance. Most results in the literature have shown the sum of sensible and latent heat fluxes measured by eddy covariance to be less than the difference between net radiation and soil heat fluxes. This under-measurement of sensible and latent heat fluxes by eddy-covariance instruments has occurred in numerous field experiments and among many different manufacturers of instruments. Four eddy-covariance systems consisting of the same models of instruments were set up side-by-side during the Southern Great Plains 1997 Hydrology Experiment and all systems under-measured fluxes by similar amounts. One of these eddy-covariance systems was collocated with three other types of eddy-covariance systems at different sites; all of these systems under-measured the sensible and latent-heat fluxes. The net radiometers and soil heat flux plates used in conjunction with the eddy-covariance systems were calibrated independently and measurements of net radiation and soil heat flux showed little scatter for various sites. The 10% absolute uncertainty in available energy measurements was considerably smaller than the systematic closure problem in the surface energy budget, which varied from 10 to 30%. When available-energy measurement errors are known and modest, eddy-covariance measurements of sensible and latent heat fluxes should be adjusted for closure. Although the preferred method of energy balance closure is to maintain the Bowen-ratio, the method for obtaining closure appears to be less important than assuring that eddy-covariance measurements are consistent with conservation of energy. Based on numerous measurements over a sorghum canopy, carbon dioxide fluxes, which are measured by eddy covariance, are underestimated by the same factor as eddy covariance evaporation
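    The Bowen-ratio closure adjustment mentioned in this abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the available energy (Rn - G) is redistributed between H and LE so that their sum closes the energy balance while the measured Bowen ratio beta = H/LE is preserved.

```python
def bowen_ratio_closure(rn, g, h, le):
    """Adjust eddy-covariance fluxes for energy balance closure.

    The residual (Rn - G) - (H + LE) is distributed between the sensible
    (H) and latent (LE) heat fluxes while preserving the measured Bowen
    ratio beta = H / LE.  All fluxes in W m^-2.
    """
    available = rn - g          # available energy
    beta = h / le               # Bowen ratio from the raw fluxes
    le_adj = available / (1.0 + beta)
    h_adj = available - le_adj  # equals available * beta / (1 + beta)
    return h_adj, le_adj

# Hypothetical half-hourly values: Rn = 500, G = 50, raw H = 120, LE = 240,
# i.e. 80% closure (360 of 450 W m^-2 accounted for).
h_adj, le_adj = bowen_ratio_closure(500.0, 50.0, 120.0, 240.0)
# -> h_adj = 150.0, le_adj = 300.0, so h_adj + le_adj = Rn - G = 450.0
```

    Since the abstract notes that eddy-covariance CO2 fluxes appear to be underestimated by the same factor as evaporation, under that assumption the same multiplicative factor (here 450/360 = 1.25) would be applied to the measured CO2 flux.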

  5. The most characteristic lesions and radiologic signs of Crohn disease of the small bowel: air enteroclysis, MDCT, endoscopy, and pathology.

    PubMed

    Carbo, Alberto I; Reddy, Threta; Gates, Thomas; Vesa, Telciane; Thomas, Jaiyeola; Gonzalez, Enrique

    2014-02-01

    This pictorial essay describes the most characteristic lesions and radiologic signs of Crohn disease of the small bowel: nodular lymphoid hyperplasia, abnormal mucosal folds, villous pattern, aphthous ulcerations, linear ulcerations, cobblestone pattern, string sign, target sign, comb sign, creeping fat, sinus tracts, fistulas, and abscesses. Each description includes the definition, a correlation with the pathologic findings, an explanation of the possible physiopathologic mechanism, sample radiologic images with air enteroclysis or MDCT, the correspondence with the endoscopic findings when possible, and a list of differential diagnoses.

  6. Synchronous infection of the aorta and the testis: emphysematous epididymo-orchitis, abdominal aortic mycotic aneurysm, and testicular artery pseudoaneurysm diagnosed by use of MDCT.

    PubMed

    Hegde, Rahul G; Balani, Ankit; Merchant, Suleman A; Joshi, Anagha R

    2014-07-01

    We report clinical details and imaging findings for a case of emphysematous epididymo-orchitis with co-existing mycotic abdominal aortic aneurysm and a testicular artery pseudoaneurysm in a diabetic 65-year-old male. We report imaging findings from ultrasonography (USG) and contrast-enhanced multidetector computed tomography (MDCT). Use of MDCT to identify, confirm, and define the extent of the disease, and its utility in understanding the pathogenesis of this rare condition are highlighted. For such lethal infections, early diagnosis and intervention can be lifesaving; imaging can be of crucial importance in this.

  7. Do general circulation models underestimate the natural variability in the arctic climate?

    SciTech Connect

    Battisti, D.S.; Bitz, C.M.; Moritz, R.E.

    1997-08-01

    The authors examine the natural variability of the arctic climate system simulated by two very different models: the Geophysical Fluid Dynamics Laboratory (GFDL) global climate model, and an area-averaged model of the arctic atmosphere-sea ice-upper-ocean system called the polar cap climate model, the PCCM. A 1000-yr integration of the PCCM is performed in which the model is driven by a prescribed, stochastic atmospheric energy flux convergence (D), which has spectral characteristics that are identical to the spectra of the observed D. The standard deviation of the yearly mean sea ice thickness from this model is 0.85 m; the mean sea ice thickness is 3.1 m. In contrast, the standard deviation of the yearly averaged sea ice thickness in the GFDL climate model is found to be about 6% of the climatological mean thickness and only 24% of that simulated by the PCCM. A series of experiments is presented to determine the cause of these disparate results. First, after changing the treatment of sea ice and snow albedo in the (standard) PCCM model to be identical thermodynamically to that in the GFDL model, the PCCM is driven with D from the GFDL control integration to demonstrate that the PCCM model produces an arctic climate similar to that of the GFDL model. Integrations of the PCCM are then examined in which the different prescriptions of the sea ice treatment (GFDL vs standard PCCM) and D (GFDL vs observed) are permuted. The authors present calculations that indicate the variability in the sea ice thickness is extremely sensitive to the spectrum of the atmospheric energy flux convergence. A conservative best estimate for the amplitude of the natural variability in the arctic sea ice volume is presented. The results suggest that most of the global climate models that have been used to evaluate climate change may also have artificially quiescent variability in the Arctic. 24 refs., 6 figs., 3 tabs.

  8. VARIANCES MAY BE UNDERESTIMATED USING AVAILABLE SOFTWARE FOR GENERALIZED ADDITIVE MODELS. (R829213)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  9. MDCT Anatomic Assessment of Right Inferior Phrenic Artery Origin Related to Potential Supply to Hepatocellular Carcinoma and its Embolization

    SciTech Connect

    Basile, Antonio; Tsetis, Dimitrios; Montineri, Arturo; Puleo, Stefano; Massa Saluzzo, Cesare; Runza, Giuseppe; Coppolino, Francesco; Ettorre, Giovanni Carlo; Patti, Maria Teresa

    2008-03-15

    Purpose. To prospectively assess the anatomic variation of the right inferior phrenic artery (RIPA) origin with multidetector computed tomography (MDCT) scans in relation to the technical and angiographic findings during transcatheter arterial embolization of hepatocellular carcinoma (HCC). Methods. Two hundred patients with hepatocellular carcinomas were examined with 16-section CT during the arterial phase. The anatomy of the inferior phrenic arteries was recorded, with particular reference to their origin. All patients with subcapsular HCC located at segments VII and VIII underwent arteriography of the RIPA with subsequent embolization if neoplastic supply was detected. Results. The RIPA origin was detected in all cases (sensitivity 100%), while the left inferior phrenic artery origin was detected in 187 cases (sensitivity 93.5%). RIPAs originated from the aorta (49%), celiac trunk (41%), right renal artery (5.5%), left gastric artery (4%), and proper hepatic artery (0.5%), with 13 types of combinations with the left IPA. Twenty-nine patients showed subcapsular HCCs in segments VII and VIII and all but one underwent RIPA selective angiography, followed by embolization in 7 cases. Conclusion. MDCT assesses well the anatomy of RIPAs, which is fundamental for planning subsequent cannulation and embolization of extrahepatic RIPA supply to HCC.

  10. The impacts of open-mouth breathing on upper airway space in obstructive sleep apnea: 3-D MDCT analysis.

    PubMed

    Kim, Eun Joong; Choi, Ji Ho; Kim, Kang Woo; Kim, Tae Hoon; Lee, Sang Hag; Lee, Heung Man; Shin, Chol; Lee, Ki Yeol; Lee, Seung Hoon

    2011-04-01

    Open-mouth breathing during sleep is a risk factor for obstructive sleep apnea (OSA) and is associated with increased disease severity and upper airway collapsibility. The aim of this study was to investigate the effect of open-mouth breathing on the upper airway space in patients with OSA using three-dimensional multi-detector computed tomography (3-D MDCT). The study design included a case-control study with planned data collection. The study was performed at a tertiary medical center. 3-D MDCT analysis was conducted on 52 patients with OSA under two experimental conditions: mouth closed and mouth open. Under these conditions, we measured the minimal cross-sectional area of the retropalatal and retroglossal regions (mXSA-RP, mXSA-RG), as well as the upper airway length (UAL), defined as the vertical dimension from hard palate to hyoid. We also computed the volume of the upper airway space by 3-D reconstruction of both conditions. When the mouth was open, mXSA-RP and mXSA-RG significantly decreased and the UAL significantly increased, irrespective of the severity of OSA. However, between the closed- and open-mouth states, there was no significant change in upper airway volume at any severity of OSA. Results suggest that the more elongated and narrow upper airway during open-mouth breathing may aggravate the collapsibility of the upper airway and, thus, negatively affect OSA severity.

  11. Influence of radiation dose and reconstruction algorithm in MDCT assessment of airway wall thickness: A phantom study

    SciTech Connect

    Gomez-Cardona, Daniel; Nagle, Scott K.; Li, Ke; Chen, Guang-Hong; Robinson, Terry E.

    2015-10-15

    Purpose: Wall thickness (WT) is an airway feature of great interest for the assessment of morphological changes in the lung parenchyma. Multidetector computed tomography (MDCT) has recently been used to evaluate airway WT, but the potential risk of radiation-induced carcinogenesis—particularly in younger patients—might limit a wider use of this imaging method in clinical practice. The recent commercial implementation of the statistical model-based iterative reconstruction (MBIR) algorithm, instead of the conventional filtered back projection (FBP) algorithm, has enabled considerable radiation dose reduction in many other clinical applications of MDCT. The purpose of this work was to study the impact of radiation dose and MBIR in the MDCT assessment of airway WT. Methods: An airway phantom was scanned using a clinical MDCT system (Discovery CT750 HD, GE Healthcare) at 4 kV levels and 5 mAs levels. Both FBP and a commercial implementation of MBIR (Veo™, GE Healthcare) were used to reconstruct CT images of the airways. For each kV–mAs combination and each reconstruction algorithm, the contrast-to-noise ratio (CNR) of the airways was measured, and the WT of each airway was measured and compared with the nominal value; the relative bias and the angular standard deviation in the measured WT were calculated. For each airway and reconstruction algorithm, the overall performance of WT quantification across all of the 20 kV–mAs combinations was quantified by the sum of squares (SSQs) of the difference between the measured and nominal WT values. Finally, the particular kV–mAs combination and reconstruction algorithm that minimized radiation dose while still achieving a reference WT quantification accuracy level was chosen as the optimal acquisition and reconstruction settings. Results: The wall thicknesses of seven airways of different sizes were analyzed in the study. Compared with FBP, MBIR improved the CNR of the airways, particularly at low radiation dose
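    The per-airway performance metric described in this abstract, the sum of squares of the differences between measured and nominal wall thickness across the kV–mAs combinations, reduces to a one-line computation. The sketch below is ours, with hypothetical numbers, not the study's data:

```python
def wall_thickness_ssq(measured, nominal):
    """Sum of squared differences between measured wall thicknesses
    (one value per kV-mAs combination) and the nominal wall thickness."""
    return sum((m - nominal) ** 2 for m in measured)

# Hypothetical example: nominal WT of 1.0 mm, five measurements (mm)
ssq = wall_thickness_ssq([1.1, 0.9, 1.05, 0.95, 1.0], 1.0)
# -> 0.1^2 + 0.1^2 + 0.05^2 + 0.05^2 + 0 = 0.025
```

    A lower SSQ across all dose settings indicates a more dose-robust reconstruction algorithm for that airway size.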

  12. Shading correction for on-board cone-beam CT in radiation therapy using planning MDCT images

    SciTech Connect

    Niu Tianye; Sun, Mingshan; Star-Lack, Josh; Gao Hewei; Fan Qiyong; Zhu Lei

    2010-10-15

    Purpose: Applications of cone-beam CT (CBCT) to image-guided radiation therapy (IGRT) are hampered by shading artifacts in the reconstructed images. These artifacts are mainly due to scatter contamination in the projections but also can result from uncorrected beam hardening effects as well as nonlinearities in responses of the amorphous silicon flat panel detectors. While currently, CBCT is mainly used to provide patient geometry information for treatment setup, more demanding applications requiring high-quality CBCT images are under investigation. To tackle these challenges, many CBCT correction algorithms have been proposed; yet, a standard approach still remains unclear. In this work, we propose a shading correction method for CBCT that addresses artifacts from low-frequency projection errors. The method is consistent with the current workflow of radiation therapy. Methods: With much smaller inherent scatter signals and more accurate detectors, diagnostic multidetector CT (MDCT) provides high quality CT images that are routinely used for radiation treatment planning. Using the MDCT image as "free" prior information, we first estimate the primary projections in the CBCT scan via forward projection of the spatially registered MDCT data. Since most of the CBCT shading artifacts stem from low-frequency errors in the projections such as scatter, these errors can be accurately estimated by low-pass filtering the difference between the estimated and raw CBCT projections. The error estimates are then subtracted from the raw CBCT projections. Our method is distinct from other published correction methods that use the MDCT image as a prior because it is projection-based and uses limited patient anatomical information from the MDCT image. The merit of CBCT-based treatment monitoring is therefore retained. Results: The proposed method is evaluated using two phantom studies on tabletop systems. On the Catphan 600 phantom, our approach reduces the reconstruction error
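    The correction pipeline described in this abstract (forward-project the registered MDCT, low-pass filter the difference against the raw CBCT projection, subtract the error estimate) can be sketched as below. This is a minimal 2D illustration with a crude moving-average low-pass filter, not the authors' implementation; function names and the filter width are ours:

```python
import numpy as np

def lowpass(img, k=15):
    """Crude low-pass filter: a separable k-tap moving average."""
    kern = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 0, img)
    out = np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 1, out)
    return out

def shading_correct(raw_proj, mdct_forward_proj, k=15):
    """Estimate the low-frequency projection error (scatter, beam
    hardening) as the low-pass-filtered difference between the raw CBCT
    projection and the forward projection of the registered planning
    MDCT, then subtract that estimate from the raw projection."""
    error = lowpass(raw_proj - mdct_forward_proj, k)
    return raw_proj - error

# Demo: a flat "true" projection plus a smooth shading offset of +0.2
clean = np.zeros((64, 64))
raw = clean + 0.2
corrected = shading_correct(raw, clean)
# interior pixels return to ~0 once the smooth error is removed
```

    Because only the low-pass residual is subtracted, high-frequency anatomical detail in the CBCT projections is left intact, which is why the method retains the merit of CBCT-based treatment monitoring.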

  13. Uniform slip model underestimates tsunami hazard for probabilistic assessment: results from a case study in the South China Sea

    NASA Astrophysics Data System (ADS)

    Li, L.; Switzer, A.; Chan, C. H.; Wang, Y.; Weiss, R.; Qiu, Q.

    2015-12-01

    It has long been recognized that rupture complexity, typically in the form of a heterogeneous slip distribution, has a significant effect on the tsunami wave field. However, the effect of heterogeneous slip distributions is not commonly considered in probabilistic tsunami hazard assessment (PTHA), primarily because of its computational expense. To investigate the effect of heterogeneous slip distribution on PTHA, we incorporate a stochastic source model into a Monte Carlo-type method for PTHA. Using a hybrid kinematic k-squared source model, we generate a broad range of slip distribution patterns for large numbers of synthetic earthquake events and assess tsunami hazard, as an example, for the South China Sea (SCS). Our results suggest that, for a relatively small and confined region like the SCS, the commonly used approach based on uniform-slip fault models could significantly underestimate tsunami hazard, especially over longer time periods. For 500-year return periods, the expected wave height along the coasts of west Luzon, Taiwan, southeast China, and east Vietnam is generally underestimated by 20-50%. Notably, the underestimation is more pronounced (exceeding 50% at some locations) for the expected tsunami wave height with a 1000-year return period. Also of note, the probability of experiencing a 1 m tsunami wave in the next 100 years is underestimated by more than 40% at many coastal sites in southeast China and east Vietnam. As the results of PTHA commonly serve as the foundation for further risk assessments, this case study emphasizes how crucial it is to take the effect of rupture complexity into account.
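    The stochastic slip generation described in this abstract can be illustrated with a minimal k-squared sampler: random Fourier phases under an amplitude spectrum that rolls off as ~k^-2 beyond a corner wavenumber. This is our sketch, not the authors' hybrid kinematic model; the corner wavenumber, non-negativity enforcement, and mean-slip scaling are assumptions:

```python
import numpy as np

def ksquared_slip(nx, ny, mean_slip=1.0, seed=0):
    """Random slip distribution whose amplitude spectrum falls off as
    ~k^-2 beyond the corner wavenumber (here assumed to be the lowest
    nonzero mode), with uniformly random phases."""
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(nx)[:, None]
    ky = np.fft.fftfreq(ny)[None, :]
    k = np.hypot(kx, ky)
    kc = 1.0 / max(nx, ny)              # corner wavenumber (assumed)
    amp = 1.0 / (1.0 + (k / kc) ** 2)   # ~k^-2 roll-off at high k
    phase = np.exp(2j * np.pi * rng.random((nx, ny)))
    slip = np.real(np.fft.ifft2(amp * phase))
    slip -= slip.min()                  # enforce non-negative slip
    slip *= mean_slip / slip.mean()     # scale to the target mean slip
    return slip

slip = ksquared_slip(64, 32)  # one synthetic slip pattern on a 64x32 fault grid
```

    Drawing many such patterns for each synthetic event, instead of assigning uniform slip, is what lets a Monte Carlo PTHA capture the hazard contribution of rupture complexity.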

  14. Pelvic ultrasound immediately following MDCT in female patients with abdominal/pelvic pain: is it always necessary?

    PubMed

    Yitta, Silaja; Mausner, Elizabeth V; Kim, Alice; Kim, Danny; Babb, James S; Hecht, Elizabeth M; Bennett, Genevieve L

    2011-10-01

    To determine the added value of reimaging the female pelvis with ultrasound (US) immediately following multidetector CT (MDCT) in the emergent setting. CT and US exams of 70 patients who underwent MDCT for evaluation of abdominal/pelvic pain followed by pelvic ultrasound within 48 h were retrospectively reviewed by three readers. Initially, only the CT images were reviewed followed by evaluation of CT images in conjunction with US images. Diagnostic confidence was recorded for each reading and an exact Wilcoxon signed rank test was performed to compare the two. Changes in diagnosis based on combined CT and US readings versus CT readings alone were identified. Confidence intervals (95%) were derived for the percentage of times US reimaging can be expected to lead to a change in diagnosis relative to the diagnosis based on CT interpretation alone. Ultrasound changed the diagnosis for the ovaries/adnexa 8.1% of the time (three reader average); the majority being cases of a suspected CT abnormality found to be normal on US. Ultrasound changed the diagnosis for the uterus 11.9% of the time (three reader average); the majority related to the endometrial canal. The 95% confidence intervals for the ovaries/adnexa and uterus were 5-12.5% and 8-17%, respectively. Ten cases of a normal CT were followed by a normal US with 100% agreement across all three readers. Experienced readers correctly diagnosed ruptured ovarian cysts and tubo-ovarian abscesses (TOA) based on CT alone with 100% agreement. US reimaging after MDCT of the abdomen and pelvis is not helpful: (1) following a normal CT of the pelvic organs or (2) when CT findings are diagnostic and/or characteristic of certain entities such as ruptured cysts and TOA. Reimaging with ultrasound is warranted for (1) less-experienced readers to improve diagnostic confidence or when CT findings are not definitive, (2) further evaluation of suspected endometrial abnormalities. A distinction should be made between the need for

  15. Centauries as underestimated food additives: antioxidant and antimicrobial potential.

    PubMed

    Siler, Branislav; Zivković, Suzana; Banjanac, Tijana; Cvetković, Jelena; Nestorović Živković, Jasmina; Cirić, Ana; Soković, Marina; Mišić, Danijela

    2014-03-15

    Methanol extracts of aerial parts and roots of five centaury species (Centaurium erythraea, C. tenuiflorum, C. littorale ssp. uliginosum, C. pulchellum, and Schenkia spicata) were analysed for their main secondary metabolites: secoiridoid glycosides, a group of monoterpenoid compounds, and phenolics (xanthones and flavonoids), and further investigated for antioxidant capacity and antimicrobial activity. The results of ABTS, DPPH, and FRAP assays showed that above-ground parts generally displayed up to 13 times higher antioxidant activity than roots, which should be related to the higher phenolics content, especially flavonoids, in green plant organs. Secoiridoid glycosides showed no antioxidant activity. All the tested extracts demonstrated appreciable antibacterial (0.05-0.5 mg/ml) and strong antifungal activity (0.1-0.6 mg/ml). Our results imply that the above-ground parts of all centaury species studied could be recommended for human use as a rich source of natural antioxidants, and also in the food industry as strong antimicrobial agents for food preservation.

  16. Underestimation of Leptospirosis Incidence in the French West Indies

    PubMed Central

    Cassadou, Sylvie; Rosine, Jacques; Flamand, Claude; Escher, Martina; Ledrans, Martine; Bourhy, Pascale; Picardeau, Mathieu; Quénel, Philippe

    2016-01-01

    Background Leptospirosis is a neglected zoonosis affecting mainly tropical and subtropical regions worldwide, particularly South America and the Caribbean. As in many other countries, under-reporting of cases was suspected in the French West Indies because of inadequate access to diagnostic tests for the general population. Methodology/Principal findings In order to estimate the real incidence of leptospirosis in Guadeloupe and Martinique, a study was performed in 2011 using the three prevailing available biological tests for diagnosis: Microscopic Agglutination Test (MAT), IgM ELISA and PCR. The study investigated inpatients and outpatients and used active case ascertainment from data provided by a general practitioners’ sentinel network. The epidemiology of the disease was also described in terms of severity and demographic characteristics. Leptospirosis incidence was estimated at 69.4 (95%CI 47.6–91.1) and 60.6 (95%CI 36.3–85.0) annual cases per 100 000 inhabitants in Guadeloupe and Martinique, respectively, which was 3 and 4 times higher than previous estimations. Conclusion/Significance Inclusion of PCR and IgM ELISA tests for diagnosis of leptospirosis resulted in improved sensitivity in comparison with MAT alone. Our results highlighted the substantial health burden of the disease in these two territories and the importance of access to appropriate laboratory tests. Based on our results, PCR and IgM ELISA tests have now been included in the list of tests reimbursed by the national system of social security insurance in France. Our results also underline the relevance of implementing an integrated strategy for the surveillance, prevention and control of leptospirosis in the French West Indies. PMID:27128631

  17. Effects of computing parameters and measurement locations on the estimation of 3D NPS in non-stationary MDCT images.

    PubMed

    Miéville, Frédéric A; Bolard, Gregory; Bulling, Shelley; Gudinchet, François; Bochud, François O; Verdun, François R

    2013-11-01

    The goal of this study was to investigate the impact of computing parameters and the location of volumes of interest (VOIs) on the calculation of the 3D noise power spectrum (NPS), in order to determine an optimal set of computing parameters and propose a robust method for evaluating the noise properties of imaging systems. Noise stationarity in noise volumes acquired with a water phantom on a 128-MDCT and a 320-MDCT scanner was analyzed in the spatial domain in order to define locally stationary VOIs. The influence of the computing parameters on the 3D NPS measurement was investigated to minimize measurement errors: the sampling distances (bx, by, bz), the VOI lengths (Lx, Ly, Lz), the number of VOIs (NVOI), and the structured noise. The effect of the VOI locations on the NPS was also investigated. Results showed that the noise (standard deviation) varies more in the r-direction (phantom radius) than in the z-direction. A 25 × 25 × 40 mm³ VOI associated with DFOV = 200 mm (Lx,y,z = 64, bx,y = 0.391 mm with a 512 × 512 matrix) and a first-order detrending method to reduce structured noise led to an accurate NPS estimation. NPS estimated from off-centered small VOIs had a directional dependency, contrary to NPS obtained from large VOIs located in the center of the volume or from small VOIs located on a concentric circle. This showed that the VOI size and location play a major role in the determination of the NPS when images are not stationary. This study emphasizes the need for consistent measurement methods to assess and compare image quality in CT.
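    The ensemble 3D NPS estimate discussed in this abstract follows the standard form NPS = (bx·by·bz)/(Lx·Ly·Lz) · ⟨|DFT3D{VOI − trend}|²⟩, averaged over the VOIs. Below is a minimal sketch of that formula, not the study's code; for simplicity, per-VOI mean subtraction stands in for the first-order detrending used in the study:

```python
import numpy as np

def nps_3d(vois, bx, by, bz):
    """Ensemble 3D noise power spectrum from a stack of noise VOIs.

    vois: array of shape (NVOI, Lx, Ly, Lz); bx, by, bz: voxel sampling
    distances. Each VOI is detrended (mean subtraction here, standing in
    for first-order detrending), its 3D DFT periodogram is formed, and
    the periodograms are averaged:
        NPS = (bx*by*bz)/(Lx*Ly*Lz) * <|DFT(voi - trend)|^2>
    """
    vois = np.asarray(vois, dtype=float)
    n_voi, lx, ly, lz = vois.shape
    detrended = vois - vois.mean(axis=(1, 2, 3), keepdims=True)
    periodograms = np.abs(np.fft.fftn(detrended, axes=(1, 2, 3))) ** 2
    return (bx * by * bz) / (lx * ly * lz) * periodograms.mean(axis=0)

# Demo with synthetic white noise: ten 8x8x8 VOIs, unit voxel spacing
rng = np.random.default_rng(1)
vois = rng.normal(0.0, 1.0, (10, 8, 8, 8))
nps = nps_3d(vois, 1.0, 1.0, 1.0)
```

    For white noise, the NPS integrated over all frequency bins recovers the voxel variance (Parseval), which is a convenient sanity check on the normalization.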

  18. Two underestimated threats in food transportation: mould and acceleration

    PubMed Central

    Janssen, S.; Pankoke, I.; Klus, K.; Schmitt, K.; Stephan, U.; Wöllenstein, J.

    2014-01-01

    Two important parameters are often neglected in the monitoring of perishable goods during transport: mould contamination of fresh food and the influence of acceleration or vibration on the quality of a product. In this opinion paper we argue that research should focus on these two topics in the context of intelligent logistics. Further, the technical possibilities for future measurement systems are discussed. By measuring taste deviations, we verified the effect of different vibration frequencies on the quality of beer. The practical importance is shown by examining transport routes and market shares. The general feasibility of a mobile mould detection system is established by examining the measurement resolution of semiconductor sensors for mould-related gases. Furthermore, as an alternative solution, we present a concept for a miniaturized and automated culture-medium-based system. Although there is a lack of related research to date, new efforts can make a vital contribution to the reduction of losses in the logistic chains for several products. PMID:24797139

  19. Archaea associated with human surfaces: not to be underestimated.

    PubMed

    Bang, Corinna; Schmitz, Ruth A

    2015-09-01

    Over 40 years ago, Carl Woese and his colleagues discovered the existence of two distinctly different groups of prokaryotes: Bacteria and Archaea. In the meantime, extensive research has revealed that several hundred bacterial species are intimately associated with human health and disease. Archaea, originally identified and described as occurring mainly in extreme environments, have in recent years been shown to be ubiquitous, appearing frequently and in high numbers as part of the human microbiota. Despite the improvement in methodologies leading to increased detection, archaea are often still not considered in many studies focusing on the interdependency between members of the microbiota and components of the human immune system. As a consequence, knowledge of the functional role(s) of archaeal species within the human body is mainly limited to their contribution to nutrient degradation in the intestine, and evidence for immunogenic properties of archaea as part of the human microbiota is generally rare. In this review, the current knowledge of human mucosa-associated archaeal species, their interaction with the human immune system, and their potential contribution to human health and disease will be discussed.

  20. Underestimating the Toxicological Challenges Associated with the Use of Herbal Medicinal Products in Developing Countries

    PubMed Central

    Neergheen-Bhujun, Vidushi S.

    2013-01-01

    Various reports suggest a high contemporaneous prevalence of herb-drug use in both developed and developing countries. The World Health Organisation indicates that 80% of the Asian and African populations rely on traditional medicine as the primary method for their health care needs. Since time immemorial, and despite the beneficial and traditional roles of herbs in different communities, the toxicity and herb-drug interactions that emanate from this practice have led to severe adverse effects and fatalities. As a result of the perception that herbal medicinal products have low risk, consumers usually disregard any association between their use and any adverse reactions, hence leading to underreporting of adverse reactions. This is particularly common in developing countries and has led to a paucity of scientific data regarding the toxicity and interactions of locally used traditional herbal medicine. Other factors, like a general lack of compositional and toxicological information on herbs and the poor quality of adverse reaction case reports, present hurdles which are highly underestimated by the population in the developing world. This review paper addresses these toxicological challenges and calls for natural health product regulations as well as for protocols and guidance documents on safety and toxicity testing of herbal medicinal products. PMID:24163821

  1. Panoramic radiographs underestimate extensions of the anterior loop and mandibular incisive canal

    PubMed Central

    Nejaim, Yuri; de Freitas, Deborah Queiroz; de Oliveira Santos, Christiano

    2016-01-01

    Purpose The purpose of this study was to detect the anterior loop of the mental nerve and the mandibular incisive canal in panoramic radiographs (PAN) and cone-beam computed tomography (CBCT) images, as well as to determine the anterior/mesial extension of these structures in panoramic and cross-sectional reconstructions using PAN and CBCT images. Materials and Methods Images (both PAN and CBCT) from 90 patients were evaluated by 2 independent observers. Detection of the anterior loop and the incisive canal was compared between PAN and CBCT. The anterior/mesial extension of these structures was compared between PAN and both cross-sectional and panoramic CBCT reconstructions. Results In CBCT, the anterior loop and the incisive canal were observed in 7.7% and 24.4% of the hemimandibles, respectively. In PAN, the anterior loop and the incisive canal were detected in 15% and 5.5% of cases, respectively. PAN presented more difficulties in the visualization of structures. The anterior/mesial extensions ranged from 0.0 mm to 19.0 mm on CBCT. PAN underestimated the measurements by approximately 2.0 mm. Conclusion CBCT appears to be a more reliable imaging modality than PAN for preoperative workups of the anterior mandible. Individual variations in the anterior/mesial extensions of the anterior loop of the mental nerve and the mandibular incisive canal mean that it is not prudent to rely on a general safe zone for implant placement or bone surgery in the interforaminal region. PMID:27672611

  2. Underestimation of mid-Holocene Arctic warming in PMIP simulations

    NASA Astrophysics Data System (ADS)

    Zhang, Qiong; Muschitiello, Francesco

    2016-04-01

    Due to the orbital forcing, Arctic is warmer during mid-Holocene (~ 6 kyr BP) in summer because the region received more insolation and also warmer in winter because of strong feedbacks, leads to an annual mean temperature warming. Existing proxy reconstructions show that the Arctic can be two degrees warmer than pre-industrial. However, not all the climate models can capture the warming, and the amplitude is about 0.5 degree less than that seen from proxy data. One possible reason is that these simulations did not take into account a fact of 'Green Sahara', where the large area of Sahara region is covered by vegetation instead of desert as it is today. By using a fully coupled climate model EC-Earth with about 100 km resolution, we have run a series of sensitivity experiments by changing the surface type, as well as accompanied change in dust emission over the northern Sahara. The results show that a green sahara not only results in local climate response such as the northward extension and strengthening of African monsoon, but also affect the large scale circulation and corresponding meridional heat transport. The combination of green sahara and reduced dust entails a general strengthening of the mid-latitude Westerlies, results in a change to more positive North Atlantic Oscillation-like conditions, and more heat transport from lower latitudes to high latitudes both in atmosphere and ocean, eventually leads to a shift towards warmer conditions over the North Atlantic and Arctic regions. This mechanism would explain the sign of rapid hydro-climatic perturbations recorded in several reconstructions from high northern latitudes after the termination of the African Humid Period around 5.5 - 5.0 kyr BP, suggesting that these regions are sensitive to changes in Saharan land cover during the present interglacial. 
This is central to the debate surrounding Arctic climate amplification and future projections of subtropical precipitation changes.

  3. Radiological surveillance of formerly asbestos-exposed power industry workers: rates and risk factors of benign changes on chest X-ray and MDCT

    PubMed Central

    2014-01-01

    Background To determine the prevalence of asbestos-related changes on chest X-ray (CXR) and low-dose multidetector-row CT (MDCT) of the thorax in a cohort of formerly asbestos-exposed power industry workers, and to assess the importance of common risk factors associated with specific radiological changes. Methods To assess the influence of selected risk factors (age, time since first exposure, exposure duration, cumulative exposure and pack years) on typical asbestos-related radiographic changes, we employed multiple logistic regression and receiver operating characteristic (ROC) analysis. Results On CXR, pleural changes and asbestosis were strongly associated with age, years since first exposure and exposure duration. The MDCT results showed an association between asbestosis and age, and between plaques and exposure duration, years since first exposure and cumulative exposure. Parenchymal changes on CXR and MDCT, and diffuse pleural thickening on CXR, were both associated with smoking. Using cut-offs of 55 years for age, 17 years for exposure duration and 28 years for latency, benign radiological changes in the cohort with CXR could be predicted with a sensitivity of 82.0% for each of the three variables and specificities of 47.4%, 39.0% and 40.6%, respectively. Conclusions Participants aged 55 years and older, and those with an asbestos exposure of at least 17 years or at least 28 years since first exposure, should be regarded as having an increased risk of abnormal radiological findings. To implement a more focused approach, the routine use of low-dose MDCT rather than CXR, at least for initial examinations, would be justified. PMID:24808921
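
    The cut-offs reported above (age ≥ 55 years, exposure duration ≥ 17 years, latency ≥ 28 years) amount to a simple screening rule. A minimal sketch, with illustrative variable names and example workers that are not from the study:

```python
# Screening rule sketch using the cut-offs reported in the abstract:
# age >= 55 y, exposure duration >= 17 y, or latency >= 28 y flags a
# worker as being at increased risk of abnormal radiological findings.
def at_increased_risk(age, exposure_years, latency_years):
    """Return True if any reported cut-off is met."""
    return age >= 55 or exposure_years >= 17 or latency_years >= 28

# Hypothetical workers, for illustration only.
workers = [
    {"age": 48, "exposure_years": 20, "latency_years": 25},  # exposure cut-off met
    {"age": 60, "exposure_years": 10, "latency_years": 30},  # age and latency met
    {"age": 50, "exposure_years": 12, "latency_years": 20},  # no cut-off met
]
flags = [at_increased_risk(**w) for w in workers]
print(flags)  # [True, True, False]
```

    Each cut-off alone gave the reported sensitivity of 82.0%; combining them with `or` trades further specificity for sensitivity, which suits a screening setting.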

  4. Self-reported versus measured body height and weight in Polish adult men: the risk of underestimating obesity rates.

    PubMed

    Łopuszańska, Monika; Lipowicz, Anna; Kołodziej, Halina; Szklarska, Alicja; Bielicki, Tadeusz

    2015-01-01

    Background: In epidemiological studies, self-reported height and weight are often used to save time and money, and are commonly used to assess the prevalence of obesity. The aim of this study was to assess the differences between self-reported and measured height and weight in adult men, and to determine how the accuracy of self-reported data depends on age and education. The prevalence of obesity was also calculated based on both self-reported and measured data. Material and methods: Data were collected during two population studies carried out in Wroclaw in 2010. One study included 1,194 19-year-old males who reported for the health examination mandated by the National Conscription Board (younger group). The other included 355 men between 35 and 80 years old who reported for a ten-year follow-up (older group). Data were analyzed separately for the two age groups. Results: Younger and older subjects overestimated their height by 1.4 cm (95% CI: 1.26, 1.51) and 1.0 cm (95% CI: 0.85, 1.26), respectively. On average, younger subjects overestimated their weight by 0.7 kg (95% CI: 0.55, 0.92), whereas older subjects underestimated their weight by 0.9 kg (95% CI: –1.15, –0.48). The lower the level of education, the more the subjects overestimated their height. Conclusions: Adult men systematically overestimate their height and underestimate their weight. The magnitude of the inaccuracy depends on level of education. When self-reported data are used, the prevalence of obesity is generally underestimated: using self-reported data to calculate BMI can lead to a substantial underestimation of the proportion of underweight and obese individuals in a population. Finally, using self-reported height in studies on social inequality may lead to false conclusions.
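
    The direction of the bias can be illustrated with BMI arithmetic. The sketch below applies the reporting biases found for the older group (height overestimated by about 1.0 cm, weight underestimated by about 0.9 kg) to a hypothetical subject just above the obesity threshold; the subject's values are invented for illustration:

```python
def bmi(weight_kg, height_cm):
    """Body mass index: weight (kg) divided by height (m) squared."""
    h = height_cm / 100
    return weight_kg / (h * h)

# Hypothetical older respondent, measured just above the BMI >= 30 cut-off.
measured = bmi(95.5, 178.0)               # ~30.1 -> classified obese
reported = bmi(95.5 - 0.9, 178.0 + 1.0)   # ~29.5 -> misses the cut-off
print(round(measured, 1), round(reported, 1))
```

    A small, systematic shift in each direction is enough to move borderline subjects below the threshold, which is why obesity prevalence from self-reports is underestimated.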

  5. Systems for Lung Volume Standardization during Static and Dynamic MDCT-based Quantitative Assessment of Pulmonary Structure and Function

    PubMed Central

    Fuld, Matthew K.; Grout, Randall; Guo, Junfeng; Morgan, John H.; Hoffman, Eric A.

    2013-01-01

    Rationale and Objectives Multidetector-row computed tomography (MDCT) has emerged as a tool for quantitative assessment of parenchymal destruction, air trapping (density metrics) and airway remodeling (metrics relating airway wall and lumen geometry) in chronic obstructive pulmonary disease (COPD) and asthma. Critical to the accuracy and interpretability of these MDCT-derived metrics is the assurance that the lungs are scanned during a breath-hold at a standardized volume. Materials and Methods A computer-monitored, turbine-based flow meter system was developed to control patient breath-holds and facilitate static imaging at fixed percentages of the vital capacity. Because of calibration challenges with gas density changes during multi-breath xenon-CT, an alternative system was required; its design incorporated dual rolling-seal pistons. Both systems were tested in a laboratory environment and in human subject trials. Results The turbine-based system successfully controlled lung volumes in 32/37 subjects, with a linear relationship for CT-measured air volume between repeated scans: for all scans, the mean and confidence interval of the differences (scan1-scan2) was −9 ml (−169, 151); for TLC alone, 6 ml (−164, 177); for FRC alone, −23 ml (−172, 126). The dual-piston system successfully controlled lung volume in 31/41 subjects. Study failures related largely to subject non-compliance with verbal instruction and gas leaks around the mouthpiece. Conclusion We demonstrate the successful use of a turbine-based system for static lung volume control and demonstrate its inadequacies for dynamic xenon-CT studies. Implementation of a dual-rolling-seal spirometer has been shown to adequately control lung volume for multi-breath wash-in xenon-CT studies. These systems, coupled with proper patient coaching, provide the tools for the use of CT to quantitate regional lung structure and function. The wash-in xenon-CT method for assessing regional lung function, while not
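
    The repeated-scan agreement figures above are a mean difference with limits of agreement on the scan1-scan2 volume differences. A Bland-Altman-style sketch of that computation, on illustrative volumes rather than the study's raw data:

```python
import statistics

def limits_of_agreement(scan1, scan2):
    """Mean difference between repeated measurements and the
    mean +/- 1.96 SD limits of agreement (Bland-Altman style)."""
    diffs = [a - b for a, b in zip(scan1, scan2)]
    mean = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# Illustrative repeated CT air volumes (ml); not the study's data.
scan1 = [5200, 3100, 4550, 2980, 6010]
scan2 = [5180, 3150, 4500, 3000, 6055]
mean, lo, hi = limits_of_agreement(scan1, scan2)
print(round(mean, 1), round(lo, 1), round(hi, 1))
```

    A mean difference near zero with narrow limits, as reported above, indicates that the breath-hold control yields reproducible lung volumes between scans.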

  6. Risk Analysis of Underestimate Cost Offer to The Project Quality in Aceh Province

    NASA Astrophysics Data System (ADS)

    Rani, Hafnidar A.

    2016-11-01

    Errors in the process of determining an offer price can be substantial and may lead to an underestimated project cost, which in turn reduces profit during implementation. The Government Equipment/Service Procurement Policy Institution (LKPP) assesses that underpriced offers are still frequently found in government equipment/service procurement and can potentially decrease project quality. This study aimed to analyze the most dominant factors in the practice of underestimate cost offers, to analyze the relationship of underestimate cost offer risk factors to road construction project quality in Aceh Province, and to analyze the most potential underestimate cost offer risk factors affecting road construction project quality in Aceh Province. The road construction projects observed were those implemented in Aceh Province from 2013 to 2015. The study was conducted by interviewing Government Budget Authorities (KPA) and distributing a questionnaire to road construction contractors with the qualifications K1, K2, K3, M1, M2 and B1. Based on 2016 data from the Construction Service Development Institution (LPJK) of Aceh Province, the population comprised 2,717 contractors. Using the Slovin equation, a sample of 97 contractors was obtained. The most dominant factor in the underestimate cost offer risk of road construction projects in Aceh Province was the contingency cost factor, with a mean of 4.374.
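
    The sample size follows from Slovin's formula, n = N / (1 + N e²). A sketch assuming the customary margin of error e = 0.1, which is not stated in the abstract:

```python
import math

def slovin(population, margin_of_error):
    """Slovin's formula for sample size: n = N / (1 + N * e**2),
    rounded up to a whole number of respondents."""
    return math.ceil(population / (1 + population * margin_of_error ** 2))

print(slovin(2717, 0.1))  # 97, matching the reported sample of contractors
```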

  7. One-dimensional potential of mean force underestimates activation barrier for transport across flexible lipid membranes

    NASA Astrophysics Data System (ADS)

    Kopelevich, Dmitry I.

    2013-10-01

    Transport of a fullerene-like nanoparticle across a lipid bilayer is investigated by coarse-grained molecular dynamics (MD) simulations. Potentials of mean force (PMF) acting on the nanoparticle in a flexible bilayer suspended in water and a bilayer restrained to a flat surface are computed by constrained MD simulations. The rate of the nanoparticle transport into the bilayer interior is predicted using one-dimensional Langevin models based on these PMFs. The predictions are compared with the transport rates obtained from a series of direct (unconstrained) MD simulations of the solute transport into the flexible bilayer. It is observed that the PMF acting on the solute in the flexible membrane underestimates the transport rate by more than an order of magnitude while the PMF acting on the solute in the restrained membrane yields an accurate estimate of the activation energy for transport into the flexible membrane. This paradox is explained by a coexistence of metastable membrane configurations for a range of the solute positions inside and near the flexible membrane. This leads to a significant reduction of the contribution of the transition state to the mean force acting on the solute. Restraining the membrane shape ensures that there is only one stable membrane configuration corresponding to each solute position and thus the transition state is adequately represented in the PMF. This mechanism is quite general and thus this phenomenon is expected to occur in a wide range of interfacial systems. A simple model for the free energy landscape of the coupled solute-membrane system is proposed and validated. This model explicitly accounts for effects of the membrane deformations on the solute transport and yields an accurate prediction of the activation energy for the solute transport.
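
    The "more than an order of magnitude" rate discrepancy corresponds to a modest error in the activation barrier, since transition-state-type rate expressions scale as k ∝ exp(−ΔF‡/kBT). A sketch of that relation, with illustrative numbers rather than the paper's values:

```python
import math

def rate_ratio(delta_barrier_kT):
    """Ratio of two rates whose barriers differ by delta (in units of kT),
    assuming Arrhenius-like scaling k ~ exp(-barrier / kT)."""
    return math.exp(delta_barrier_kT)

# A barrier overestimated by ~2.3 kT slows the predicted rate ~10-fold,
# i.e. an order-of-magnitude error in the transport rate.
print(round(rate_ratio(2.3), 1))
```

    This sensitivity of the rate to the barrier is why a PMF that under-represents the transition state, as in the flexible-membrane case above, produces such a large error in the predicted transport rate.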

  8. Perceived risk of diabetes seriously underestimates actual diabetes risk: The KORA FF4 study

    PubMed Central

    Stang, Andreas; Bongaerts, Brenda; Kuss, Oliver; Herder, Christian; Roden, Michael; Quante, Anne; Holle, Rolf; Huth, Cornelia; Peters, Annette; Meisinger, Christa

    2017-01-01

    Objective Early detection of diabetes and prediabetic states is beneficial for patients, but may be delayed by patients' being overly optimistic about their own health. Therefore, we assessed how persons without known diabetes perceive their risk of having or developing diabetes, and we identified factors associated with perception of diabetes risk. Research design and methods 1,953 participants without previously known diabetes from the population-based, German KORA FF4 Study (59.1 years, 47.8% men) had an oral glucose tolerance test. They estimated their probability of having undiagnosed diabetes mellitus (UDM) on a six-category scale, and assessed whether they were at risk of developing diabetes in the future. We cross-tabulated glycemic status with risk perception, and fitted robust Poisson regression models to identify determinants of diabetes risk perception. Results 74% (95% CI: 65–82) of persons with UDM believed that their probability of having undetected diabetes was low or very low. 72% (95% CI: 69–75) of persons with prediabetes believed that they were not at risk of developing diabetes. In people with prediabetes, seeing oneself at risk of diabetes was associated with self-rated poor general health (prevalence ratio (PR) = 3.1, 95% CI: 1.4–6.8), parental diabetes (PR = 2.6, 1.9–3.4), high educational level (PR = 1.9, 1.4–2.5), lower age (PR = 0.7, 0.6–0.8, per 1 standard deviation increase), female sex (PR = 1.2, 0.9–1.5) and obesity (PR = 1.5, 1.2–2.0). Conclusions People with undiagnosed diabetes or prediabetes considerably underestimate their probability of having or developing diabetes. Contrary to associations with actual diabetes risk, perceived diabetes risk was lower in men, in less educated persons and in older persons. PMID:28141837
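
    The prevalence ratios above come from adjusted Poisson regression models; the unadjusted version of the same quantity is simply the ratio of proportions from a 2×2 table, (a/(a+b)) / (c/(c+d)). A sketch with hypothetical counts (not the study's data):

```python
def prevalence_ratio(exposed_with, exposed_total, unexposed_with, unexposed_total):
    """Unadjusted prevalence ratio: proportion with the outcome in the
    exposed group divided by the proportion in the unexposed group."""
    return (exposed_with / exposed_total) / (unexposed_with / unexposed_total)

# Hypothetical counts: 52 of 100 participants with parental diabetes
# perceive themselves at risk, vs 20 of 100 without.
print(round(prevalence_ratio(52, 100, 20, 100), 1))  # 2.6
```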

  9. Seabird bycatch in pelagic longline fisheries is grossly underestimated when using only haul data.

    PubMed

    Brothers, Nigel; Duckworth, Alan R; Safina, Carl; Gilman, Eric L

    2010-08-31

    Hundreds of thousands of seabirds are killed each year as bycatch in longline fisheries. Seabirds are predominantly caught during line setting but bycatch is generally recorded during line hauling, many hours after birds are caught. Bird loss during this interval may lead to inaccurate bycatch information. In this 15 year study, seabird bycatch was recorded during both line setting and line hauling from four fishing regions: Indian Ocean, Southern Ocean, Coral Sea and central Pacific Ocean. Over 43,000 albatrosses, petrels and skuas representing over 25 species were counted during line setting, of which almost 6,000 seabirds attempted to take the bait. Bait-taking interactions were placed into one of four categories. (i) The majority (57%) of bait-taking attempts were "unsuccessful", involving seabirds that did not take the bait nor get caught or hooked. (ii) One-third of attempts were "successful", with seabirds removing the bait while not getting caught. (iii) One hundred and seventy-six seabirds (3% of attempts) were observed being "caught" during line setting, with three albatross species - Laysan (Phoebastria immutabilis), black-footed (P. nigripes) and black-browed (Thalassarche melanophrys) - dominating this category. However, of these, only 85 (48%) seabird carcasses were retrieved during line hauling. Most caught seabirds were hooked through the bill. (iv) The remainder of seabird-bait interactions (7%) was not clearly observed, but likely involved more "caught" seabirds. Bait-taking attempts and percentage outcomes (e.g. successful, caught) varied between seabird species and were not always related to species abundance around fishing vessels. Using only haul data to calculate seabird bycatch grossly underestimates actual bycatch levels, with the level of seabird bycatch from pelagic longline fishing possibly double what was previously thought.
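
    The conclusion that bycatch may be roughly double the haul-based figure follows directly from the 48% carcass-retrieval rate: a haul count can be corrected by dividing by the retrieval fraction. A minimal sketch:

```python
def corrected_bycatch(haul_count, retrieval_rate):
    """Scale a haul-based bycatch count by the fraction of caught birds
    actually retrieved at hauling (0 < retrieval_rate <= 1)."""
    return haul_count / retrieval_rate

# 85 carcasses retrieved at hauling, 48% retrieval rate -> ~177 birds
# actually caught, close to the 176 observed during line setting.
print(round(corrected_bycatch(85, 0.48)))
```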

  10. One-dimensional potential of mean force underestimates activation barrier for transport across flexible lipid membranes.

    PubMed

    Kopelevich, Dmitry I

    2013-10-07

    Transport of a fullerene-like nanoparticle across a lipid bilayer is investigated by coarse-grained molecular dynamics (MD) simulations. Potentials of mean force (PMF) acting on the nanoparticle in a flexible bilayer suspended in water and a bilayer restrained to a flat surface are computed by constrained MD simulations. The rate of the nanoparticle transport into the bilayer interior is predicted using one-dimensional Langevin models based on these PMFs. The predictions are compared with the transport rates obtained from a series of direct (unconstrained) MD simulations of the solute transport into the flexible bilayer. It is observed that the PMF acting on the solute in the flexible membrane underestimates the transport rate by more than an order of magnitude while the PMF acting on the solute in the restrained membrane yields an accurate estimate of the activation energy for transport into the flexible membrane. This paradox is explained by a coexistence of metastable membrane configurations for a range of the solute positions inside and near the flexible membrane. This leads to a significant reduction of the contribution of the transition state to the mean force acting on the solute. Restraining the membrane shape ensures that there is only one stable membrane configuration corresponding to each solute position and thus the transition state is adequately represented in the PMF. This mechanism is quite general and thus this phenomenon is expected to occur in a wide range of interfacial systems. A simple model for the free energy landscape of the coupled solute-membrane system is proposed and validated. This model explicitly accounts for effects of the membrane deformations on the solute transport and yields an accurate prediction of the activation energy for the solute transport.

  11. Seabird Bycatch in Pelagic Longline Fisheries Is Grossly Underestimated when Using Only Haul Data

    PubMed Central

    Brothers, Nigel; Duckworth, Alan R.; Safina, Carl; Gilman, Eric L.

    2010-01-01

    Hundreds of thousands of seabirds are killed each year as bycatch in longline fisheries. Seabirds are predominantly caught during line setting but bycatch is generally recorded during line hauling, many hours after birds are caught. Bird loss during this interval may lead to inaccurate bycatch information. In this 15 year study, seabird bycatch was recorded during both line setting and line hauling from four fishing regions: Indian Ocean, Southern Ocean, Coral Sea and central Pacific Ocean. Over 43,000 albatrosses, petrels and skuas representing over 25 species were counted during line setting, of which almost 6,000 seabirds attempted to take the bait. Bait-taking interactions were placed into one of four categories. (i) The majority (57%) of bait-taking attempts were “unsuccessful”, involving seabirds that did not take the bait nor get caught or hooked. (ii) One-third of attempts were “successful”, with seabirds removing the bait while not getting caught. (iii) One hundred and seventy-six seabirds (3% of attempts) were observed being “caught” during line setting, with three albatross species – Laysan (Phoebastria immutabilis), black-footed (P. nigripes) and black-browed (Thalassarche melanophrys) – dominating this category. However, of these, only 85 (48%) seabird carcasses were retrieved during line hauling. Most caught seabirds were hooked through the bill. (iv) The remainder of seabird-bait interactions (7%) was not clearly observed, but likely involved more “caught” seabirds. Bait-taking attempts and percentage outcomes (e.g. successful, caught) varied between seabird species and were not always related to species abundance around fishing vessels. Using only haul data to calculate seabird bycatch grossly underestimates actual bycatch levels, with the level of seabird bycatch from pelagic longline fishing possibly double what was previously thought. PMID:20824163

  12. Misery has more company than people think: underestimating the prevalence of others' negative emotions.

    PubMed

    Jordan, Alexander H; Monin, Benoît; Dweck, Carol S; Lovett, Benjamin J; John, Oliver P; Gross, James J

    2011-01-01

    Four studies document underestimations of the prevalence of others' negative emotions and suggest causes and correlates of these erroneous perceptions. In Study 1a, participants reported that their negative emotions were more private or hidden than were their positive emotions; in Study 1b, participants underestimated the peer prevalence of common negative, but not positive, experiences described in Study 1a. In Study 2, people underestimated negative emotions and overestimated positive emotions even for well-known peers, and this effect was partially mediated by the degree to which those peers reported suppression of negative (vs. positive) emotions. Study 3 showed that lower estimations of the prevalence of negative emotional experiences predicted greater loneliness and rumination and lower life satisfaction and that higher estimations for positive emotional experiences predicted lower life satisfaction. Taken together, these studies suggest that people may think they are more alone in their emotional difficulties than they really are.

  13. Calorie Underestimation When Buying High-Calorie Beverages in Fast-Food Contexts.

    PubMed

    Franckle, Rebecca L; Block, Jason P; Roberto, Christina A

    2016-07-01

    We asked 1877 adults and 1178 adolescents visiting 89 fast-food restaurants in New England in 2010 and 2011 to estimate the calories they purchased. Calorie underestimation was greater among those purchasing a high-calorie beverage than among those who did not (adults: 324 ± 698 vs 102 ± 591 calories; adolescents: 360 ± 602 vs 198 ± 509 calories). This difference remained significant for adults, but not adolescents, after adjusting for total calories purchased. Purchasing high-calorie beverages may uniquely contribute to calorie underestimation among adults.

  14. Preferred child body size and parental underestimation of child weight in Mexican-American families

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Objective: To determine whether parents who prefer a heavier child would underestimate their child's weight more than those who prefer a leaner child. Methods: Participants were Mexican-American families (312 mothers, 173 fathers, and 312 children ages 8-10) who were interviewed and had height and w...

  15. A novel scheme for detection of diffuse lung disease in MDCT by use of statistical texture features

    NASA Astrophysics Data System (ADS)

    Wang, Jiahui; Li, Feng; Doi, Kunio; Li, Qiang

    2009-02-01

    The successful development of high-performance computer-aided diagnostic systems has the potential to assist radiologists in the detection and diagnosis of diffuse lung disease. In this study, we developed an automated scheme for the detection of diffuse lung disease on multi-detector computed tomography (MDCT). Our database consisted of 68 CT scans, comprising 31 normal and 37 abnormal cases with three kinds of abnormal patterns, i.e., ground-glass opacity, reticular, and honeycombing. Two radiologists first selected the CT scans with abnormal patterns based on clinical reports. The areas that included specific abnormal patterns in the selected CT images were then delineated as reference standards by an expert chest radiologist. To detect abnormal cases with diffuse lung disease, the lungs were first segmented from the background in each slice by use of a texture-analysis technique, and then divided into contiguous volumes of interest (VOIs) with a 64×64×64 matrix size. For each VOI, we calculated many statistical texture features, including the mean and standard deviation of CT values, features determined from the run-length matrix, and features from the co-occurrence matrix. A quadratic classifier was employed for distinguishing between normal and abnormal VOIs by use of a leave-one-case-out validation scheme. A rule-based criterion was then employed to determine whether a case was normal or abnormal. For the detection of abnormal VOIs, our CAD system achieved a sensitivity of 86% and a specificity of 90%. For the detection of abnormal cases, it achieved a sensitivity of 89% and a specificity of 90%. This preliminary study indicates that our CAD system would be useful for the detection of diffuse lung disease.
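
    A toy sketch of the co-occurrence-matrix texture features mentioned above, computing a horizontal gray-level co-occurrence matrix and two classic derived features. The feature choice, gray-level count and test patterns are illustrative; the study used many more features (including run-length features) and a quadratic classifier:

```python
def glcm_features(img, levels=4):
    """Horizontal distance-1 gray-level co-occurrence matrix and two
    classic texture features (energy, contrast). `img` is a 2-D list
    of integer gray levels in [0, levels)."""
    glcm = [[0.0] * levels for _ in range(levels)]
    pairs = 0
    for row in img:
        for a, b in zip(row, row[1:]):   # each horizontal neighbor pair
            glcm[a][b] += 1
            pairs += 1
    energy = sum((c / pairs) ** 2 for r in glcm for c in r)
    contrast = sum((c / pairs) * (i - j) ** 2
                   for i, r in enumerate(glcm) for j, c in enumerate(r))
    return energy, contrast

# Uniform texture: maximal energy, zero contrast.
print(glcm_features([[1, 1, 1, 1]] * 4))  # (1.0, 0.0)
# Alternating texture: lower energy, high contrast.
print(glcm_features([[0, 3, 0, 3], [3, 0, 3, 0]] * 2))
```

    Homogeneous lung parenchyma and patterns such as honeycombing separate on features like these, which is what lets a per-VOI classifier distinguish normal from abnormal tissue.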

  16. Arterial double-contrast dual-energy MDCT: in-vivo rabbit atherosclerosis with iodinated nanoparticles and gadolinium agents

    NASA Astrophysics Data System (ADS)

    Carmi, Raz; Kafri, Galit; Altman, Ami; Goshen, Liran; Planer, David; Sosna, Jacob

    2010-03-01

    An in-vivo feasibility study of potentially improved atherosclerosis CT imaging is presented. By administering two different contrast agents to rabbits with induced atherosclerotic plaques, we aim to identify both soft plaque and vessel lumen simultaneously. Initial injection of an iodinated nanoparticle (INP) contrast agent (N1177 - Nanoscan Imaging) two to four hours before the scan leads to its later accumulation in macrophage-rich soft plaque, while a second, gadolinium contrast agent (Magnevist) injected immediately prior to the scan blends with the aortic blood. The distinction between the two agents in a single scan is achieved with a double-layer dual-energy MDCT (Philips Healthcare), followed by material-separation analysis using the reconstructed images from the different x-ray spectra. A single-contrast-agent scan, where only INP was injected two hours prior to the scan, was compared to a double-contrast scan taken four hours after INP injection and immediately after gadolinium injection. On the single-contrast-agent scan we observed, along the aorta walls, localized iodine accumulation that can point to INP uptake by atherosclerotic plaque. In the double-contrast scan, the gadolinium contributes a clearer depiction of the vessel lumen in addition to the lasting INP presence. The material separation shows a good correlation to the pathologies inferred from the conventional CT images of the two different scans, while performing only a single scan prevents misregistration problems and reduces radiation dose. These results suggest that double-contrast dual-energy CT may be used for advanced clinical diagnostic applications.
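
    At its core, two-material separation from two spectra is a 2×2 linear system: each voxel's attenuation at the low and high spectra is modeled as a linear combination of the two agents' basis attenuations. A sketch with hypothetical basis values (the actual decomposition used by the scanner's analysis is more involved):

```python
def separate_materials(mu_low, mu_high, basis):
    """Two-material decomposition: solve
        mu_low  = a * m1_low  + b * m2_low
        mu_high = a * m1_high + b * m2_high
    for the material amounts (a, b). `basis` holds each material's
    attenuation per unit amount at the (low, high) spectra."""
    (m1_lo, m1_hi), (m2_lo, m2_hi) = basis
    det = m1_lo * m2_hi - m2_lo * m1_hi
    a = (mu_low * m2_hi - m2_lo * mu_high) / det
    b = (m1_lo * mu_high - mu_low * m1_hi) / det
    return a, b

# Hypothetical (low-kVp, high-kVp) basis attenuations per unit concentration.
basis = [(4.0, 2.0), (5.0, 4.0)]   # iodine, gadolinium (illustrative)
# Voxel synthesized from 1.5 units iodine + 0.5 units gadolinium:
mu_low, mu_high = 4.0 * 1.5 + 5.0 * 0.5, 2.0 * 1.5 + 4.0 * 0.5
print(separate_materials(mu_low, mu_high, basis))  # -> (1.5, 0.5)
```

    The decomposition is only well-posed when the two agents' attenuations differ sufficiently between the spectra (det far from zero), which is why an iodine/gadolinium pair suits dual-energy imaging.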

  17. Radiation dose from MDCT using Monte Carlo simulations: estimating fetal dose due to pulmonary embolism scans accounting for overscan

    NASA Astrophysics Data System (ADS)

    Angel, E.; Wellnitz, C.; Goodsitt, M.; DeMarco, J.; Cagnon, C.; Ghatali, M.; Cody, D.; Stevens, D.; McCollough, C.; Primak, A.; McNitt-Gray, M.

    2007-03-01

    Pregnant women with shortness of breath are increasingly referred for CT angiography to rule out pulmonary embolism (PE). While this exam is typically focused on the lungs, extending scan boundaries and overscan can add to the irradiated volume and have implications for fetal dose. The purpose of this work was to estimate radiation dose to the fetus when various levels of overscan were encountered. Two voxelized models of pregnant patients derived from actual patient anatomy were created based on image data. The models represent an early (<7 weeks) and a late-term pregnancy (36 weeks). A previously validated Monte Carlo model of an MDCT scanner was used that takes into account physical details of the scanner. Simulated helical scans used 120 kVp, 4×5 mm beam collimation, pitch 1, and varying beam-off locations (the edge of the irradiated volume) to represent different protocols plus overscan. Normalized dose (mGy/100 mAs) was calculated for each fetus. For the early-term and late-term pregnancy models, fetal dose estimates for a standard thoracic PE exam were 0.05 and 0.3 mGy/100 mAs, respectively, increasing to 9 mGy/100 mAs when the beam-off location was extended to encompass the fetus. When performing exams to rule out PE in pregnant patients, the beam-off location may have a large effect on fetal dose, especially for late-term pregnancies. Careful consideration of the ending location of the x-ray beam - and not the end of image data - could result in a significant reduction in radiation dose to the fetus.
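
    Because the doses are reported normalized to 100 mAs, converting to an absolute fetal dose is a linear scaling by the scan's tube current-time product. A sketch, assuming a hypothetical 200 mAs protocol (the mAs value is not from the abstract):

```python
def fetal_dose_mGy(normalized_dose_per_100mAs, mAs):
    """Scale a Monte Carlo dose normalized to 100 mAs to an actual
    scan's tube current-time product (dose is linear in mAs)."""
    return normalized_dose_per_100mAs * mAs / 100

# Hypothetical 200 mAs PE protocol, using the normalized doses above.
print(round(fetal_dose_mGy(0.05, 200), 2))  # early pregnancy: 0.1 mGy
print(round(fetal_dose_mGy(9, 200), 2))     # beam extended over fetus: 18 mGy
```

    The two-orders-of-magnitude spread between the standard protocol and the extended beam-off location is the abstract's central point: where the beam stops matters far more than modest mAs changes.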

  18. Longitudinal changes in structural abnormalities using MDCT in COPD: do the CT measurements of airway wall thickness and small pulmonary vessels change in parallel with emphysematous progression?

    PubMed Central

    Takayanagi, Shin; Kawata, Naoko; Tada, Yuji; Ikari, Jun; Matsuura, Yukiko; Matsuoka, Shin; Matsushita, Shoichiro; Yanagawa, Noriyuki; Kasahara, Yasunori; Tatsumi, Koichiro

    2017-01-01

    Background Recent advances in multidetector computed tomography (MDCT) facilitate acquiring important clinical information for managing patients with COPD. MDCT can detect the loss of lung tissue associated with emphysema as a low-attenuation area (LAA) and the thickness of airways as the wall area percentage (WA%). The percentage of small pulmonary vessels <5 mm2 (% cross-sectional area [CSA] <5) has been recently recognized as a parameter for expressing pulmonary perfusion. We aimed to analyze the longitudinal changes in structural abnormalities using these CT parameters and analyze the effect of exacerbation and smoking cessation on structural changes in COPD patients. Methods We performed pulmonary function tests (PFTs), an MDCT, and a COPD assessment test (CAT) in 58 patients with COPD at the time of their enrollment at the hospital and 2 years later. We analyzed the change in clinical parameters including CT indices and examined the effect of exacerbations and smoking cessation on the structural changes. Results The CAT score and forced expiratory volume in 1 second (FEV1) did not significantly change during the follow-up period. The parameters of emphysematous changes significantly increased. On the other hand, the WA% at the distal airways significantly decreased or tended to decrease, and the %CSA <5 slightly but significantly increased over the same period, especially in ex-smokers. The parameters of emphysematous change were greater in patients with exacerbations and continued to progress even after smoking cessation. In contrast, the WA% and %CSA <5 did not change in proportion to emphysema progression. Conclusion The WA% at the distal bronchi and the %CSA <5 did not change in parallel with parameters of LAA over the same period. We propose that airway disease and vascular remodeling may be reversible to some extent by smoking cessation and appropriate treatment. Optimal management may have a greater effect on pulmonary vascularity and airway disease

  19. In-Vivo Assessment of Femoral Bone Strength Using Finite Element Analysis (FEA) Based on Routine MDCT Imaging: A Preliminary Study on Patients with Vertebral Fractures

    PubMed Central

    Liebl, Hans; Garcia, Eduardo Grande; Holzner, Fabian; Noel, Peter B.; Burgkart, Rainer; Rummeny, Ernst J.; Baum, Thomas; Bauer, Jan S.

    2015-01-01

    Purpose To experimentally validate a non-linear finite element analysis (FEA) modeling approach assessing in-vitro fracture risk at the proximal femur, and to transfer the method to standard in-vivo multi-detector computed tomography (MDCT) data of the hip, aiming to predict additional hip fracture risk in subjects with and without osteoporosis-associated vertebral fractures, using bone mineral density (BMD) measurements as the gold standard. Methods One fresh-frozen human femur specimen was mechanically tested and fractured, simulating stance and clinically relevant fall loading configurations of the hip. After experimental in-vitro validation, the FEA simulation protocol was transferred to standard contrast-enhanced in-vivo MDCT images to calculate individual hip fracture risk for 4 subjects each with and without a history of osteoporotic vertebral fractures, matched by age and gender. In addition, FEA-based risk factor calculations were compared to manual femoral BMD measurements of all subjects. Results In-vitro simulations showed good correlation with the experimentally measured strains both in stance (R2 = 0.963) and fall configuration (R2 = 0.976). The simulated maximum stress overestimated the experimental failure load (4743 N) by 14.7% (5440 N), while the simulated maximum strain overestimated it by 4.7% (4968 N). The simulated failed elements coincided precisely with the experimentally determined fracture locations. BMD measurements in subjects with a history of osteoporotic vertebral fractures did not differ significantly from subjects without fragility fractures (femoral head: p = 0.989; femoral neck: p = 0.366), but showed higher FEA-based risk factors for additional incident hip fractures (p = 0.028). Conclusion FEA simulations were successfully validated by elastic and destructive in-vitro experiments. In the subsequent in-vivo analyses, MDCT-based FEA risk factor differences for additional hip fractures were not mirrored by corresponding BMD measurements.
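
    The validation percentages above are signed percent deviations of the simulated failure load from the experimental one; a minimal check of that arithmetic:

```python
def percent_error(predicted, measured):
    """Signed percent deviation of a simulated failure load from experiment."""
    return (predicted - measured) / measured * 100

# Values reported above: experimental failure load 4743 N.
print(round(percent_error(5440, 4743), 1))  # stress criterion: 14.7 %
print(round(percent_error(4968, 4743), 1))  # strain criterion: 4.7 %
```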

  20. Evaluation of multidecadal variability in CMIP5 surface solar radiation and inferred underestimation of aerosol direct effects over Europe, China, Japan, and India

    NASA Astrophysics Data System (ADS)

    Allen, R. J.; Norris, J. R.; Wild, M.

    2013-06-01

    Observations from the Global Energy Balance Archive indicate regional decreases in all-sky surface solar radiation from the ~1950s to the 1980s, followed by an increase during the 1990s. These periods are popularly called dimming and brightening, respectively. Removal of the radiative effects of cloud cover variability from all-sky surface solar radiation results in a quantity called "clear sky proxy" radiation, in which multidecadal trends can be seen more distinctly, suggesting aerosol radiative forcing as a likely cause. Prior work has shown that climate models from the Coupled Model Intercomparison Project 3 (CMIP3) generally underestimate the magnitude of these trends, particularly over China and India. Here we perform a similar analysis with 173 simulations from 42 climate models participating in the new CMIP5. Results show negligible improvement over CMIP3, as CMIP5 dimming trends over four regions - Europe, China, India, and Japan - are all underestimated. This bias is largest for both India and China, where the multimodel mean yields a decrease in clear sky proxy radiation of -1.3±0.3 and -1.2±0.2 W m-2 decade-1, respectively, compared to observed decreases of -6.5±0.9 and -8.2±1.3 W m-2 decade-1. Similar underestimation of the observed dimming over Japan exists, with the CMIP5 mean dimming ~20% as large as observed. Moreover, not a single simulation reproduces the magnitude of the observed dimming trend for these three regions. Relative to dimming, CMIP5 models better simulate the observed brightening, but significant underestimation exists for both China and Japan. Overall, no individual model performs particularly well for all four regions. Model biases do not appear to be related to the use of prescribed versus prognostic aerosols or to aerosol indirect effects. However, models exhibit significant correlations between clear sky proxy radiation and several aerosol-related fields, most notably aerosol optical depth (AOD) and absorption AOD. This suggests model
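
    The quoted underestimation can be summarized as the fraction of the observed dimming trend that the multimodel mean captures; a minimal check on the trends reported above:

```python
def simulated_fraction(modeled_trend, observed_trend):
    """Fraction of the observed dimming trend captured by the models
    (both trends in W m-2 per decade; dimming trends are negative)."""
    return modeled_trend / observed_trend

# Multimodel-mean vs observed clear-sky proxy trends quoted above.
for region, model, obs in [("India", -1.3, -6.5), ("China", -1.2, -8.2)]:
    print(region, round(100 * simulated_fraction(model, obs)), "%")
```

    The models thus capture only about 15-20% of the observed dimming over India and China, the same order as the ~20% quoted for Japan.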

  1. Evidence for link between modelled trends in Antarctic sea ice and underestimated westerly wind changes

    PubMed Central

    Purich, Ariaan; Cai, Wenju; England, Matthew H.; Cowan, Tim

    2016-01-01

    Despite global warming, total Antarctic sea ice coverage increased over 1979–2013. However, the majority of Coupled Model Intercomparison Project phase 5 models simulate a decline. Mechanisms causing this discrepancy have so far remained elusive. Here we show that weaker trends in the intensification of the Southern Hemisphere westerly wind jet simulated by the models may contribute to this disparity. During austral summer, a strengthened jet leads to increased upwelling of cooler subsurface water and strengthened equatorward transport, conducive to increased sea ice. As the majority of models underestimate summer jet trends, this cooling process is underestimated compared with observations and is insufficient to offset warming in the models. Through the sea ice-albedo feedback, models produce a high-latitude surface ocean warming and sea ice decline, contrasting the observed net cooling and sea ice increase. A realistic simulation of observed wind changes may be crucial for reproducing the recent observed sea ice increase. PMID:26842498

  2. Inferring Perspective Versus Getting Perspective: Underestimating the Value of Being in Another Person's Shoes.

    PubMed

    Zhou, Haotian; Majka, Elizabeth A; Epley, Nicholas

    2017-04-01

    People use at least two strategies to solve the challenge of understanding another person's mind: inferring that person's perspective by reading his or her behavior (theorization) and getting that person's perspective by experiencing his or her situation (simulation). The five experiments reported here demonstrate a strong tendency for people to underestimate the value of simulation. Predictors estimated a stranger's emotional reactions toward 50 pictures. They could either infer the stranger's perspective by reading his or her facial expressions or simulate the stranger's perspective by watching the pictures he or she viewed. Predictors were substantially more accurate when they got perspective through simulation, but overestimated the accuracy they had achieved by inferring perspective. Predictors' miscalibrated confidence stemmed from overestimating the information revealed through facial expressions and underestimating the similarity in people's reactions to a given situation. People seem to underappreciate a useful strategy for understanding the minds of others, even after they gain firsthand experience with both strategies.

  3. The Effect of Swelling Ratio on the Coulter Underestimation of Hydrogel Microsphere Diameters

    PubMed Central

    Pellegrini, Michael; Cherukupalli, Abhimanyu; Medini, Michael; Falkowski, Ron

    2015-01-01

    It has been demonstrated that the diameters of porous particles are underestimated by Coulter measurements. This phenomenon has also been observed in hydrogel particles, but had not been characterized. Since the Coulter principle uses the displacement of electrolyte to determine particle size, the electrolyte contained within swollen hydrogel microparticles results in an underestimate of the actual particle diameters. The increased use of hydrogel microspheres in biomedical applications has led to increased application of the Coulter principle to evaluate the size distribution of microparticles. A relationship between the swelling ratio of the particles and their reported Coulter diameters permits calculation of the actual diameters of these particles. Using polyethylene glycol diacrylate hydrogel microspheres, we determined a correction factor that relates the polymer swelling ratio and the reported Coulter diameters to the actual size. PMID:26414785
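    The geometric intuition can be sketched in a few lines of Python. This is a simplified model that assumes the instrument senses only the non-conducting polymer fraction of the particle (roughly 1/Q of its volume, for a volumetric swelling ratio Q); it is not the paper's fitted correction factor.

```python
# Simplified sketch: electrolyte inside a swollen hydrogel is "invisible" to
# a Coulter counter, which senses displaced (non-conducting) volume.
# Assumption: sensed volume ~ V_true / Q, where Q is the volumetric swelling
# ratio. The paper derives an empirical correction factor instead.

def actual_diameter(coulter_diameter_um, swelling_ratio):
    """Back out the true diameter if the sensed volume is V_true / Q."""
    return coulter_diameter_um * swelling_ratio ** (1.0 / 3.0)

# A hydrogel sphere reported at 50 um with Q = 8 would actually be ~100 um:
print(round(actual_diameter(50.0, 8.0), 6))  # 100.0
```

    Under this assumption the underestimation grows with the cube root of the swelling ratio, which is why highly swollen microspheres need the largest corrections.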

  4. Evidence for link between modelled trends in Antarctic sea ice and underestimated westerly wind changes

    NASA Astrophysics Data System (ADS)

    Purich, Ariaan; Cai, Wenju; England, Matthew H.; Cowan, Tim

    2016-02-01

    Despite global warming, total Antarctic sea ice coverage increased over 1979-2013. However, the majority of Coupled Model Intercomparison Project phase 5 models simulate a decline. Mechanisms causing this discrepancy have so far remained elusive. Here we show that weaker trends in the intensification of the Southern Hemisphere westerly wind jet simulated by the models may contribute to this disparity. During austral summer, a strengthened jet leads to increased upwelling of cooler subsurface water and strengthened equatorward transport, conducive to increased sea ice. As the majority of models underestimate summer jet trends, this cooling process is underestimated compared with observations and is insufficient to offset warming in the models. Through the sea ice-albedo feedback, models produce a high-latitude surface ocean warming and sea ice decline, contrasting the observed net cooling and sea ice increase. A realistic simulation of observed wind changes may be crucial for reproducing the recent observed sea ice increase.

  5. Evidence for link between modelled trends in Antarctic sea ice and underestimated westerly wind changes.

    PubMed

    Purich, Ariaan; Cai, Wenju; England, Matthew H; Cowan, Tim

    2016-02-04

    Despite global warming, total Antarctic sea ice coverage increased over 1979-2013. However, the majority of Coupled Model Intercomparison Project phase 5 models simulate a decline. Mechanisms causing this discrepancy have so far remained elusive. Here we show that weaker trends in the intensification of the Southern Hemisphere westerly wind jet simulated by the models may contribute to this disparity. During austral summer, a strengthened jet leads to increased upwelling of cooler subsurface water and strengthened equatorward transport, conducive to increased sea ice. As the majority of models underestimate summer jet trends, this cooling process is underestimated compared with observations and is insufficient to offset warming in the models. Through the sea ice-albedo feedback, models produce a high-latitude surface ocean warming and sea ice decline, contrasting the observed net cooling and sea ice increase. A realistic simulation of observed wind changes may be crucial for reproducing the recent observed sea ice increase.

  6. Sap flow is Underestimated by Thermal Dissipation Sensors due to Alterations of Wood Anatomy

    NASA Astrophysics Data System (ADS)

    Marañón-Jiménez, S.; Wiedemann, A.; van den Bulcke, J.; Cuntz, M.; Rebmann, C.; Steppe, K.

    2014-12-01

    The thermal dissipation technique (TD) is one of the most commonly adopted methods for sap flow measurements. However, underestimations of up to 60% of tree transpiration have been reported with this technique, although the causes are not known with certainty. The insertion of TD sensors into the stem damages the wood tissue and triggers subsequent healing reactions, changing wood anatomy and likely the sap flow path. However, the anatomical changes in response to the insertion of sap flow sensors, and their effects on the measured flow, have not yet been assessed. In this study, we investigate the alteration of vessel anatomy in the wounds formed around TD sensors. Our main objectives were to elucidate the anatomical causes of sap flow underestimation in ring-porous and diffuse-porous species and to relate these changes to sap flow underestimations. Successive sets of TD probes were installed early, mid-way through, and at the end of the growing season in Fagus sylvatica (diffuse-porous) and Quercus petraea (ring-porous) trees. The trees were logged after the growing season and additional sets of sensors were installed in the logged stems, with presumably no healing reaction. The wood tissue surrounding each sensor was then excised and analysed by X-ray computed microtomography (X-ray micro-CT). This technique allowed quantification of vessel anatomical characteristics and reconstruction of the 3-D internal microstructure of the xylem vessels, so that the extension and shape of the altered area could be determined. Gels and tyloses clogged the conductive vessels around the sensors in both beech and oak. The affected area was larger in beech, although these anatomical changes led to similar sap flow underestimations in both species. The larger vessel size in oak may explain this result and, therefore, the larger sap flow underestimation per area of affected conductive tissue. The wound healing reaction likely occurred within the first weeks after sensor installation, which

  7. Monte Carlo simulations in multi-detector CT (MDCT) for two PET/CT scanner models using MASH and FASH adult phantoms

    NASA Astrophysics Data System (ADS)

    Belinato, W.; Santos, W. S.; Paschoal, C. M. M.; Souza, D. N.

    2015-06-01

    The combination of positron emission tomography (PET) and computed tomography (CT) has been used extensively in oncology for the diagnosis and staging of tumors, radiotherapy planning and follow-up of patients with cancer, as well as in cardiology and neurology. This study uses the Monte Carlo method to determine the dose deposited in the internal organs of computational phantoms by the multidetector CT (MDCT) beams of two PET/CT devices operating with different parameters. The differences between the MDCT beam parameters were largely related to the total filtration, which changes the beam energy spectrum inside the gantry. This parameter was determined experimentally with the Accu-Gold Radcal measurement system. The experimental values of the total filtration were included in the simulations of two MCNPX code scenarios. The absorbed organ doses obtained in the MASH and FASH phantoms indicate that the bowtie filter geometry and the energy of the X-ray beam have a significant influence on the results, although this influence can be compensated for by adjusting other variables, such as the tube current-time product (mAs) and pitch, during PET/CT procedures.

  8. Evaluation of Lung MDCT Nodule Annotation Across Radiologists and Methods

    PubMed Central

    Meyer, Charles R.; Johnson, Timothy D.; McLennan, Geoffrey; Aberle, Denise R.; Kazerooni, Ella A.; MacMahon, Heber; Mullan, Brian F.; Yankelevitz, David F.; van Beek, Edwin J. R.; Armato, Samuel G.; McNitt-Gray, Michael F.; Reeves, Anthony P.; Gur, David; Henschke, Claudia I.; Hoffman, Eric A.; Bland, Peyton H.; Laderach, Gary; Pais, Richie; Qing, David; Piker, Chris; Guo, Junfeng; Starkey, Adam; Max, Daniel; Croft, Barbara Y.; Clarke, Laurence P.

    2007-01-01

    Rationale and Objectives Integral to the mission of the National Institutes of Health–sponsored Lung Imaging Database Consortium is the accurate definition of the spatial location of pulmonary nodules. Because the majority of small lung nodules are not resected, a reference standard from histopathology is generally unavailable. Thus assessing the source of variability in defining the spatial location of lung nodules by expert radiologists using different software tools as an alternative form of truth is necessary. Materials and Methods The relative differences in performance of six radiologists each applying three annotation methods to the task of defining the spatial extent of 23 different lung nodules were evaluated. The variability of radiologists’ spatial definitions for a nodule was measured using both volumes and probability maps (p-map). Results were analyzed using a linear mixed-effects model that included nested random effects. Results Across the combination of all nodules, volume and p-map model parameters were found to be significant at P < .05 for all methods, all radiologists, and all second-order interactions except one. The radiologist and methods variables accounted for 15% and 3.5% of the total p-map variance, respectively, and 40.4% and 31.1% of the total volume variance, respectively. Conclusion Radiologists represent the major source of variance as compared with drawing tools independent of drawing metric used. Although the random noise component is larger for the p-map analysis than for volume estimation, the p-map analysis appears to have more power to detect differences in radiologist-method combinations. The standard deviation of the volume measurement task appears to be proportional to nodule volume. PMID:16979075

  9. Celiac Axis, Common Hepatic and Hepatic Artery Variants as Evidenced on MDCT Angiography in South Indian Population

    PubMed Central

    Parthasarathy, Ramesh

    2016-01-01

    Introduction With the increase in hepatobiliary and pancreatic surgeries and liver transplantation, awareness of the anatomic variations of the celiac axis and the hepatic arteries is of paramount importance. Aim To illustrate the normal anatomy and variants of the celiac axis and the hepatic arteries on multidetector computed tomographic (MDCT) angiography in a South Indian population and to determine the potential variations in celiac axis and hepatic artery anatomy, thus assisting the hepatobiliary surgeon and the interventional radiologist in avoiding iatrogenic injury to the arteries. Materials and Methods Two hundred patients undergoing abdominal CT angiography from July 2014 to July 2015 were retrospectively studied for hepatic arterial and celiac axis anatomical variation. The anatomic variations in our study were correlated with other studies. Results The celiac axis (CA) and hepatic artery (HA) variations were analysed as per the criteria laid down by Song et al. and Michels. Out of 15 possible CA variations, 5 types of celiac artery variation were seen in 14 patients. A normal CA was seen in 179 (89.5%) of the 200 patients. In the remaining 7 patients, the CA anatomy was classified as ambiguous, since the right and left hepatic arteries arose separately from the CA with an absent common hepatic artery (CHA). The CHA originated normally from the celiac axis in 94% of the cases. Variation of CHA origin was seen in 5 patients. Normal HA anatomy was seen in 114 (57%) patients. Variation in HA anatomy was seen in 86 (43%) patients. Origin of the right hepatic artery (RHA) from the hepatic artery proper was seen in 182 (91%) patients and replaced origin of the RHA from the superior mesenteric artery (SMA) was seen in 18 (9%) of the cases. An accessory RHA was seen in 7 (3.5%) patients.
The left hepatic artery (LHA) originated from the hepatic artery proper in 186 (93%) patients and replaced origin of the LHA from the left gastric artery (LGA) was

  10. Underestimating the Alcohol Content of a Glass of Wine: The Implications for Estimates of Mortality Risk

    PubMed Central

    Britton, Annie; O’Neill, Darragh; Bell, Steven

    2016-01-01

    Aims Increases in glass sizes and wine strength over the last 25 years in the UK are likely to have led to an underestimation of alcohol intake in population studies. We explore whether this probable misclassification affects the association between average alcohol intake and risk of mortality from all causes, cardiovascular disease and cancer. Methods Self-reported alcohol consumption in 1997–1999 among 7010 men and women in the Whitehall II cohort of British civil servants was linked to the risk of mortality until mid-2015. A conversion factor of 8 g of alcohol per wine glass (1 unit) was compared with a conversion of 16 g per wine glass (2 units). Results When applying a higher alcohol content conversion for wine consumption, the proportion of heavy/very heavy drinkers increased from 28% to 41% for men and 15% to 28% for women. There was a significantly increased risk of very heavy drinking compared with moderate drinking for deaths from all causes and cancer before and after change in wine conversion; however, the hazard ratios were reduced when a higher wine conversion was used. Conclusions In this population-based study, assuming higher alcohol content in wine glasses changed the estimates of mortality risk. We propose that investigator-led cohorts need to revisit conversion factors based on more accurate estimates of alcohol content in wine glasses. Prospectively, researchers need to collect more detailed information on alcohol including serving sizes and strength. Short summary The alcohol content in a wine glass is likely to be underestimated in population surveys as wine strength and serving size have increased in recent years. We demonstrate that in a large cohort study, this underestimation affects estimates of mortality risk. Investigator-led cohorts need to revisit conversion factors based on more accurate estimates of alcohol content in wine glasses. PMID:27261472
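    The effect of the conversion factor on intake estimates is simple arithmetic. The sketch below uses an illustrative weekly "heavy drinking" threshold (112 g, i.e. 14 units of 8 g), which is an assumption and not the study's exact category definition.

```python
# Sketch: how doubling the grams-per-glass conversion reclassifies drinkers.
# The 112 g/week threshold (14 UK units) is an illustrative assumption.

def weekly_alcohol_g(glasses_per_week, grams_per_glass):
    """Total weekly ethanol intake in grams."""
    return glasses_per_week * grams_per_glass

def classify(grams_per_week, heavy_threshold_g=112):
    """Label intake above the assumed threshold as 'heavy'."""
    return "heavy" if grams_per_week > heavy_threshold_g else "moderate"

glasses = 10                          # self-reported glasses of wine per week
old = weekly_alcohol_g(glasses, 8)    # historical 1-unit (8 g) assumption
new = weekly_alcohol_g(glasses, 16)   # revised 2-unit (16 g) assumption

print(old, classify(old))   # prints: 80 moderate
print(new, classify(new))   # prints: 160 heavy
```

    The same self-report thus crosses a risk category purely through the conversion factor, which is the mechanism behind the shift from 28% to 41% heavy/very heavy drinkers reported above.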

  11. Underestimation of the Maximal Capacity of the Mitochondrial Electron Transport System in Oligomycin-Treated Cells

    PubMed Central

    Ruas, Juliana S.; Siqueira-Santos, Edilene S.; Amigo, Ignacio; Rodrigues-Silva, Erika; Kowaltowski, Alicia J.; Castilho, Roger F.

    2016-01-01

    The maximal capacity of the mitochondrial electron transport system (ETS) in intact cells is frequently estimated by promoting protonophore-induced maximal oxygen consumption preceded by inhibition of oxidative phosphorylation by oligomycin. In the present study, human glioma (T98G and U-87MG) and prostate cancer (PC-3) cells were titrated with different concentrations of the protonophore CCCP to induce the maximal oxygen consumption rate (OCR) within respirometers in a conventional growth medium. The results demonstrate that the presence of oligomycin or its A-isomer leads to underestimation of maximal ETS capacity. In the presence of oligomycin, the spare respiratory capacity (SRC), i.e., the difference between the maximal and basal cellular OCR, was underestimated by 25 to 45%. The inhibitory effect of oligomycin on SRC was more pronounced in T98G cells and was observed in both suspended and attached cells. Underestimation of SRC also occurred when oxidative phosphorylation was fully inhibited by the ATP synthase inhibitor citreoviridin. Further experiments indicated that oligomycin cannot be replaced by the adenine nucleotide translocase inhibitors bongkrekic acid or carboxyatractyloside because, although these compounds have effects in permeabilized cells, they do not inhibit oxidative phosphorylation in intact cells. We replaced CCCP with FCCP, another potent protonophore, and similar results were observed. Lower maximal OCR and SRC values were obtained with the weaker protonophore 2,4-dinitrophenol, and these parameters were not affected by the presence of oligomycin. In permeabilized cells or isolated brain mitochondria incubated with respiratory substrates, only a minor inhibitory effect of oligomycin on CCCP-induced maximal OCR was observed. We conclude that unless a previously validated protocol is employed, maximal ETS capacity in intact cells should be estimated without oligomycin. The inhibitory effect of an ATP synthase blocker on potent protonophore

  12. Underestimation of the Maximal Capacity of the Mitochondrial Electron Transport System in Oligomycin-Treated Cells.

    PubMed

    Ruas, Juliana S; Siqueira-Santos, Edilene S; Amigo, Ignacio; Rodrigues-Silva, Erika; Kowaltowski, Alicia J; Castilho, Roger F

    2016-01-01

    The maximal capacity of the mitochondrial electron transport system (ETS) in intact cells is frequently estimated by promoting protonophore-induced maximal oxygen consumption preceded by inhibition of oxidative phosphorylation by oligomycin. In the present study, human glioma (T98G and U-87MG) and prostate cancer (PC-3) cells were titrated with different concentrations of the protonophore CCCP to induce the maximal oxygen consumption rate (OCR) within respirometers in a conventional growth medium. The results demonstrate that the presence of oligomycin or its A-isomer leads to underestimation of maximal ETS capacity. In the presence of oligomycin, the spare respiratory capacity (SRC), i.e., the difference between the maximal and basal cellular OCR, was underestimated by 25 to 45%. The inhibitory effect of oligomycin on SRC was more pronounced in T98G cells and was observed in both suspended and attached cells. Underestimation of SRC also occurred when oxidative phosphorylation was fully inhibited by the ATP synthase inhibitor citreoviridin. Further experiments indicated that oligomycin cannot be replaced by the adenine nucleotide translocase inhibitors bongkrekic acid or carboxyatractyloside because, although these compounds have effects in permeabilized cells, they do not inhibit oxidative phosphorylation in intact cells. We replaced CCCP with FCCP, another potent protonophore, and similar results were observed. Lower maximal OCR and SRC values were obtained with the weaker protonophore 2,4-dinitrophenol, and these parameters were not affected by the presence of oligomycin. In permeabilized cells or isolated brain mitochondria incubated with respiratory substrates, only a minor inhibitory effect of oligomycin on CCCP-induced maximal OCR was observed. We conclude that unless a previously validated protocol is employed, maximal ETS capacity in intact cells should be estimated without oligomycin. The inhibitory effect of an ATP synthase blocker on potent protonophore

  13. Underestimation of oxygen deficiency hazard through use of linearized temperature profiles

    SciTech Connect

    Kerby, J.

    1989-06-15

    The failure mode analysis for any cryogenic system includes the effects of a large liquid spill due to vessel rupture or overfilling. The Oxygen Deficiency Hazard (ODH) analysis for this event is a strong function of the estimated heat flux entering the spilled liquid. A common method for estimating the heat flux is to treat the surface on which the liquid spills as a semi-infinite solid. This note addresses the effect of linearizing the temperature profile in this form of analysis, and shows it to cause the calculated flux to be underestimated by more than a factor of two. 3 refs., 2 figs.
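    For reference, the constant-property semi-infinite-solid solution mentioned above has a standard closed form, q(t) = k ΔT / sqrt(π α t). The sketch below evaluates it with illustrative floor properties (assumed values, not taken from the note) and does not attempt to reproduce the note's factor-of-two comparison.

```python
# Exact surface heat flux into a semi-infinite solid whose surface is suddenly
# brought to the cryogen temperature (constant thermal properties assumed):
#     q(t) = k * dT / sqrt(pi * alpha * t)
import math

def surface_flux(k, alpha, delta_T, t):
    """Heat flux (W/m^2) into the spilled liquid at time t (s)."""
    return k * delta_T / math.sqrt(math.pi * alpha * t)

# Illustrative assumed values: concrete-like floor (k = 1.4 W/m-K,
# alpha = 7e-7 m^2/s) with liquid nitrogen (77 K) spilled on a 300 K surface.
print(round(surface_flux(1.4, 7e-7, 300 - 77, 10.0)))  # flux 10 s after the spill
```

    Because the flux decays as 1/sqrt(t), most of the boil-off gas is generated in the first moments of the spill, which is when any underestimate matters most for the ODH analysis.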

  14. EPA underestimates, oversimplifies, miscommunicates, and mismanages cancer risks by ignoring human susceptibility.

    PubMed

    Finkel, Adam M

    2014-10-01

    If exposed to an identical concentration of a carcinogen, every human being would face a different level of risk, determined by his or her genetic, environmental, medical, and other uniquely individual characteristics. Various lines of evidence indicate that this susceptibility variable is distributed rather broadly in the human population, with perhaps a factor of 25- to 50-fold between the center of this distribution and either of its tails, but cancer risk assessment at the EPA and elsewhere has always treated every (adult) human as identically susceptible. The National Academy of Sciences "Silver Book" concluded that EPA and the other agencies should fundamentally correct their mis-computation of carcinogenic risk in two ways: (1) adjust individual risk estimates upward to provide information about the upper tail; and (2) adjust population risk estimates upward (by about sevenfold) to correct an underestimation due to a mathematical property of the interindividual distribution of human susceptibility, in which the susceptibility averaged over the entire (right-skewed) population exceeds the median value for the typical human. In this issue of Risk Analysis, Kenneth Bogen disputes the second adjustment and endorses the first, though he also relegates the problem of underestimated individual risks to the realm of "equity concerns" that he says should have little if any bearing on risk management policy. In this article, I show why the basis for the population risk adjustment that the NAS recommended is correct: that current population cancer risk estimates, whether they are derived from animal bioassays or from human epidemiologic studies, likely provide estimates of the median with respect to human variation, which in turn must be an underestimate of the mean. If cancer risk estimates have larger "conservative" biases embedded in them, a premise I have disputed in many previous writings, such a defect would not excuse ignoring this additional bias in the
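    The roughly sevenfold gap between median and mean follows from simple lognormal arithmetic. The sketch below assumes susceptibility is lognormally distributed with a 95th percentile about 25-fold above the median; both the distribution family and the tail ratio are illustrative assumptions consistent with the 25- to 50-fold range cited above.

```python
# Why the mean of a right-skewed susceptibility distribution exceeds the
# median: lognormal sketch. Assumes P95/median = 25 (illustrative).
import math

tail_ratio = 25.0                   # assumed P95 / median susceptibility
z95 = 1.645                         # standard-normal 95th percentile
sigma = math.log(tail_ratio) / z95  # implied lognormal shape parameter

# For a lognormal: mean = median * exp(sigma**2 / 2)
mean_over_median = math.exp(sigma ** 2 / 2)
print(round(mean_over_median, 1))   # 6.8 -- close to the NAS sevenfold factor
```

    A risk estimate calibrated to the median human therefore understates the population-average risk by roughly this factor, which is the second NAS adjustment described above.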

  15. A large underestimate of the pyrogenic source of formic acid inferred from space-borne measurements.

    NASA Astrophysics Data System (ADS)

    Chaliyakunnel, S.; Millet, D. B.; Wells, K. C.; Cady-Pereira, K.; Shephard, M.

    2015-12-01

    Formic acid (HCOOH) is one of the most abundant carboxylic acids in the atmosphere, and a dominant source of acidity in the global troposphere. Recent work has revealed a major gap in our present understanding of the atmospheric formic acid budget, with observed concentrations much larger than can be reconciled with current estimates of its sources. In this work, we employ new space-based observations from the Tropospheric Emission Spectrometer (TES) satellite instrument with the GEOS-Chem chemical transport model to better quantify the source of atmospheric formic acid from biomass burning, and assess the degree to which this source can help close the large budget gap for this species. The space-based formic acid data reveal a severe model underestimate for HCOOH that is most prominent over tropical biomass burning regions, indicating a major missing source of organic acids from fires. Based on two independent methods for inferring the fractional contribution of fires to the measured HCOOH abundance, we find that the pyrogenic HCOOH:CO enhancement ratio measured by TES (including direct emissions plus secondary production) is 5-10 times higher than current estimates of the direct emission ratio, providing evidence of substantial secondary production of HCOOH in fire plumes. We further show that current models significantly underestimate (by a factor of 2-6) the total primary and secondary source of HCOOH from tropical fires.

  16. Cancer stem cells are underestimated by standard experimental methods in clear cell renal cell carcinoma

    PubMed Central

    Gedye, Craig; Sirskyj, Danylo; Lobo, Nazleen C.; Meens, Jalna; Hyatt, Elzbieta; Robinette, Michael; Fleshner, Neil; Hamilton, Robert J; Kulkarni, Girish; Zlotta, Alexandre; Evans, Andrew; Finelli, Antonio; Jewett, Michael A. S.; Ailles, Laurie E.

    2016-01-01

    Rare cancer stem cells (CSC) are proposed to be responsible for tumour propagation and re-initiation and are functionally defined by identifying tumour-initiating cells (TICs) using the xenotransplantation limiting dilution assay (LDA). While TICs in clear cell renal cell carcinoma (ccRCC) appeared rare in NOD/SCID/IL2Rγ−/− (NSG) mice, xenografts formed more efficiently from small tumour fragments, indicating the LDA underestimated ccRCC TIC frequency. Mechanistic interrogation of the LDA identified multiple steps that influence ccRCC TIC quantitation. For example, tissue disaggregation destroys most ccRCC cells, common assays significantly overestimate tumour cell viability, and microenvironmental supplementation with human extracellular factors or pharmacological inhibition of anoikis increase clonogenicity and tumourigenicity of ccRCC cell lines and primary tumour cells. Identification of these previously uncharacterized concerns that cumulatively lead to substantial underestimation of TICs in ccRCC provides a framework for development of more accurate TIC assays in the future, both for this disease and for other cancers. PMID:27121191
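    The LDA that the authors interrogate rests on a single-hit Poisson model, which can be sketched in a few lines; the cell dose and take rate below are made-up illustrative numbers, not data from the paper.

```python
# Single-hit Poisson model behind the limiting dilution assay (LDA):
# P(no tumour | n cells injected) = exp(-f * n), where f is the TIC frequency.
# Rearranging one dilution level gives a point estimate of f.
import math

def tic_frequency(cells_per_mouse, fraction_negative):
    """Estimate TIC frequency f from the fraction of non-engrafting mice."""
    return -math.log(fraction_negative) / cells_per_mouse

# Illustrative: if 1e5 cells engraft in only 1 of 10 mice (90% negative),
# the LDA infers roughly one TIC per million cells.
print(tic_frequency(1e5, 0.9))
```

    Any step that kills or suppresses real TICs before engraftment (disaggregation losses, anoikis, missing extracellular factors) inflates the fraction of negative mice and drives the inferred f downward, which is exactly the underestimation mechanism the paper describes.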

  17. Drastic underestimation of amphipod biodiversity in the endangered Irano-Anatolian and Caucasus biodiversity hotspots.

    PubMed

    Katouzian, Ahmad-Reza; Sari, Alireza; Macher, Jan N; Weiss, Martina; Saboori, Alireza; Leese, Florian; Weigand, Alexander M

    2016-03-01

    Biodiversity hotspots are centers of biological diversity that are particularly threatened by anthropogenic activities. Their true magnitude of species diversity and endemism, however, is still largely unknown, as species diversity is traditionally assessed using morphological descriptions only, thereby ignoring cryptic species. This directly limits evidence-based monitoring and management strategies. Here we used molecular species delimitation methods to quantify cryptic diversity of the montane amphipods in the Irano-Anatolian and Caucasus biodiversity hotspots. Amphipods are ecosystem engineers in rivers and lakes. Species diversity was assessed by analysing two genetic markers (mitochondrial COI and nuclear 28S rDNA) and compared with morphological assignments. Our results unambiguously demonstrate that species diversity and endemism are dramatically underestimated, with 42 genetically identified freshwater species in only five reported morphospecies. Over 90% of the newly recovered species cluster inside Gammarus komareki and G. lacustris; 69% of the recovered species comprise narrow-range endemics. Amphipod biodiversity is drastically underestimated for the studied regions. Thus, the risk of biodiversity loss is significantly greater than currently inferred, as most endangered species remain unrecognized and/or are only found locally. Integrative application of genetic assessments in monitoring programs will help to understand the true magnitude of biodiversity and accurately evaluate its threat status.

  18. Underestimated risks of recurrent long-range ash dispersal from northern Pacific Arc volcanoes

    NASA Astrophysics Data System (ADS)

    Bourne, A. J.; Abbott, P. M.; Albert, P. G.; Cook, E.; Pearce, N. J. G.; Ponomareva, V.; Svensson, A.; Davies, S. M.

    2016-07-01

    Widespread ash dispersal poses a significant natural hazard to society, particularly in relation to disruption to aviation. Assessing the extent of the threat of far-travelled ash clouds on flight paths is substantially hindered by an incomplete volcanic history and an underestimation of the potential reach of distant eruptive centres. The risk of extensive ash clouds to aviation is thus poorly quantified. New evidence is presented of explosive Late Pleistocene eruptions in the Pacific Arc, currently undocumented in the proximal geological record, which dispersed ash up to 8000 km from source. Twelve microscopic ash deposits or cryptotephra, invisible to the naked eye, discovered within Greenland ice-cores, and ranging in age between 11.1 and 83.7 ka b2k, are compositionally matched to northern Pacific Arc sources including Japan, Kamchatka, Cascades and Alaska. Only two cryptotephra deposits are correlated to known high-magnitude eruptions (Towada-H, Japan, ca 15 ka BP and Mount St Helens Set M, ca 28 ka BP). For the remaining 10 deposits, there is no evidence of age- and compositionally-equivalent eruptive events in regional volcanic stratigraphies. This highlights the inherent problem of under-reporting eruptions and the dangers of underestimating the long-term risk of widespread ash dispersal for trans-Pacific and trans-Atlantic flight routes.

  19. Underestimated risks of recurrent long-range ash dispersal from northern Pacific Arc volcanoes

    PubMed Central

    Bourne, A. J.; Abbott, P. M.; Albert, P. G.; Cook, E.; Pearce, N. J. G.; Ponomareva, V.; Svensson, A.; Davies, S. M.

    2016-01-01

    Widespread ash dispersal poses a significant natural hazard to society, particularly in relation to disruption to aviation. Assessing the extent of the threat of far-travelled ash clouds on flight paths is substantially hindered by an incomplete volcanic history and an underestimation of the potential reach of distant eruptive centres. The risk of extensive ash clouds to aviation is thus poorly quantified. New evidence is presented of explosive Late Pleistocene eruptions in the Pacific Arc, currently undocumented in the proximal geological record, which dispersed ash up to 8000 km from source. Twelve microscopic ash deposits or cryptotephra, invisible to the naked eye, discovered within Greenland ice-cores, and ranging in age between 11.1 and 83.7 ka b2k, are compositionally matched to northern Pacific Arc sources including Japan, Kamchatka, Cascades and Alaska. Only two cryptotephra deposits are correlated to known high-magnitude eruptions (Towada-H, Japan, ca 15 ka BP and Mount St Helens Set M, ca 28 ka BP). For the remaining 10 deposits, there is no evidence of age- and compositionally-equivalent eruptive events in regional volcanic stratigraphies. This highlights the inherent problem of under-reporting eruptions and the dangers of underestimating the long-term risk of widespread ash dispersal for trans-Pacific and trans-Atlantic flight routes. PMID:27445233

  20. Narita Target Heart Rate Equation Underestimates the Predicted Adequate Exercise Level in Sedentary Young Boys

    PubMed Central

    Siahkouhian, Marefat; Khodadadi, Davar

    2013-01-01

Purpose Optimal training intensity and the adequate exercise level for physical fitness are among the most important interests of coaches and sports physiologists. The aim of this study was to investigate the validity of the Narita et al target heart rate equation for the adequate exercise training level in sedentary young boys. Methods Forty-two sedentary young boys (19.07±1.16 years) undertook a blood lactate transition threshold maximal treadmill test to volitional exhaustion with continuous respiratory gas measurements according to the Craig method. The anaerobic threshold (AT) of the participants was then calculated using the Narita target heart rate equation. Results Hopkins's spreadsheet, used to obtain the confidence limits and the chance of a true difference between the gas measurements and the Narita target heart rate equation, revealed that the Narita equation most likely underestimates the measured anaerobic threshold in sedentary young boys (168.76±15 vs. 130.08±14.36) (difference ±90% confidence limit: 38.1±18). The intraclass correlation coefficient (ICC) showed poor agreement between the criterion method and the Narita equation (ICC = 0.03). Conclusion According to the results, the Narita equation underestimates the measured AT; it appears to be a better predictor of the aerobic threshold than of the AT, which can be investigated in future studies. PMID:24427475

  1. Underestimated risks of recurrent long-range ash dispersal from northern Pacific Arc volcanoes.

    PubMed

    Bourne, A J; Abbott, P M; Albert, P G; Cook, E; Pearce, N J G; Ponomareva, V; Svensson, A; Davies, S M

    2016-07-21

    Widespread ash dispersal poses a significant natural hazard to society, particularly in relation to disruption to aviation. Assessing the extent of the threat of far-travelled ash clouds on flight paths is substantially hindered by an incomplete volcanic history and an underestimation of the potential reach of distant eruptive centres. The risk of extensive ash clouds to aviation is thus poorly quantified. New evidence is presented of explosive Late Pleistocene eruptions in the Pacific Arc, currently undocumented in the proximal geological record, which dispersed ash up to 8000 km from source. Twelve microscopic ash deposits or cryptotephra, invisible to the naked eye, discovered within Greenland ice-cores, and ranging in age between 11.1 and 83.7 ka b2k, are compositionally matched to northern Pacific Arc sources including Japan, Kamchatka, Cascades and Alaska. Only two cryptotephra deposits are correlated to known high-magnitude eruptions (Towada-H, Japan, ca 15 ka BP and Mount St Helens Set M, ca 28 ka BP). For the remaining 10 deposits, there is no evidence of age- and compositionally-equivalent eruptive events in regional volcanic stratigraphies. This highlights the inherent problem of under-reporting eruptions and the dangers of underestimating the long-term risk of widespread ash dispersal for trans-Pacific and trans-Atlantic flight routes.

  2. Drastic underestimation of amphipod biodiversity in the endangered Irano-Anatolian and Caucasus biodiversity hotspots

    PubMed Central

    Katouzian, Ahmad-Reza; Sari, Alireza; Macher, Jan N.; Weiss, Martina; Saboori, Alireza; Leese, Florian; Weigand, Alexander M.

    2016-01-01

Biodiversity hotspots are centers of biological diversity and particularly threatened by anthropogenic activities. Their true magnitude of species diversity and endemism, however, is still largely unknown as species diversity is traditionally assessed using morphological descriptions only, thereby ignoring cryptic species. This directly limits evidence-based monitoring and management strategies. Here we used molecular species delimitation methods to quantify cryptic diversity of the montane amphipods in the Irano-Anatolian and Caucasus biodiversity hotspots. Amphipods are ecosystem engineers in rivers and lakes. Species diversity was assessed by analysing two genetic markers (mitochondrial COI and nuclear 28S rDNA), compared with morphological assignments. Our results unambiguously demonstrate that species diversity and endemism are dramatically underestimated, with 42 genetically identified freshwater species in only five reported morphospecies. Over 90% of the newly recovered species cluster inside Gammarus komareki and G. lacustris; 69% of the recovered species comprise narrow range endemics. Amphipod biodiversity is drastically underestimated for the studied regions. Thus, the risk of biodiversity loss is significantly greater than currently inferred as most endangered species remain unrecognized and/or are only found locally. Integrative application of genetic assessments in monitoring programs will help to understand the true magnitude of biodiversity and accurately evaluate its threat status. PMID:26928527

  3. Climate reconstructions of the NH mean temperature: Can underestimation of trends and variability be avoided?

    NASA Astrophysics Data System (ADS)

    Christiansen, Bo

    2010-05-01

Knowledge about the climate in the period before instrumental records are available is based on climate proxies obtained from tree-rings, sediments, ice-cores etc. Reconstructing the climate from such proxies is therefore necessary for studies of climate variability and for placing recent climate change into a longer term perspective. More than a decade ago, pioneering attempts at using a multi-proxy dataset to reconstruct the Northern Hemisphere (NH) mean temperature resulted in the much-publicized "hockey stick"; a NH mean temperature that did not vary much before the rapid increase in the last century. Subsequent reconstructions show some differences but the overall "hockey-stick" structure seems to be a persistent feature. However, there has been an increasing awareness of the fact that the applied reconstruction methods underestimate the low-frequency variability and trends. The recognition of the inadequacies of the reconstruction methods has to a large degree originated from pseudo-proxy studies, i.e., from long climate model experiments where artificial proxies have been generated and reconstructions based on these have been compared to the known model climate. It has also been found that reconstructions contain a large element of stochasticity which is revealed as broad distributions of skills. This means that it is very difficult to draw conclusions from a single or a few realizations. Climate reconstruction methods are based on variants of linear regression models relating temperatures and proxies. In this contribution we review some of the theory of linear regression and error-in-variables models to identify the sources of the underestimation of variability. Based on the gained insight we formulate a reconstruction method supposed to minimize this underestimation. The method is tested by applying it to an ensemble of surrogate temperature fields based on two climate simulations covering the last 500 and 1000 years.
Compared to the RegEM TTLS method and a

  4. Whole-Chest 64-MDCT of Emergency Department Patients with Nonspecific Chest Pain: Radiation Dose and Coronary Artery Image Quality with Prospective ECG Triggering Versus Retrospective ECG Gating

    PubMed Central

    Shuman, William P.; Branch, Kelley R.; May, Janet M.; Mitsumori, Lee M.; Strote, Jared N.; Warren, Bill H.; Dubinsky, Theodore J.; Lockhart, David W.; Caldwell, James H.

    2012-01-01

    Objective The purpose of this study was to compare the patient radiation dose and coronary artery image quality of long-z-axis whole-chest 64-MDCT performed with retrospective ECG gating with those of CT performed with prospective ECG triggering in the evaluation of emergency department patients with nonspecific chest pain. Subjects and Methods Consecutively registered emergency department patients with nonspecific low-to-moderate-risk chest pain underwent whole-chest CT with retrospective gating (n = 41) or prospective triggering (n = 31). Effective patient radiation doses were estimated and compared by use of unpaired Student's t tests. Two reviewers independently scored the quality of images of the coronary arteries, and the scores were compared by use of ordinal logistic regression. Results Age, heart rate, body mass index, and z-axis coverage were not statistically different between the two groups. For retrospective gating, the mean effective radiation dose was 31.8 ± 5.1 mSv; for prospective triggering, the mean effective radiation dose was 9.2 ± 2.2 mSv (prospective triggering 71% lower, p < 0.001). Two of 512 segments imaged with retrospective gating were nonevaluable (0.4%), and two of 394 segments imaged with prospective triggering were nonevaluable (0.5%). Prospectively triggered images were 2.2 (95% CI, 1.1–4.5) times as likely as retrospectively gated images to receive a high image quality score for each segment after adjustment for segment differences (p < 0.05). Conclusion For long-z-axis whole-chest 64-MDCT of emergency department patients with nonspecific chest pain, use of prospective ECG triggering may result in substantially lower patient radiation doses and better coronary artery image quality than is achieved with retrospective ECG gating. PMID:19457832
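
    The abstract's dose comparison (31.8 ± 5.1 mSv vs. 9.2 ± 2.2 mSv, unpaired t test) can be sketched numerically. The snippet below simulates two dose samples matching the reported group sizes, means, and standard deviations (the simulated values are illustrative, not the study's raw data) and applies Welch's unpaired t-test:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Simulated effective doses (mSv), drawn to match the reported group
# means/SDs; these are illustrative values, not the study's raw data.
retro = rng.normal(31.8, 5.1, 41)   # retrospective ECG gating, n = 41
prosp = rng.normal(9.2, 2.2, 31)    # prospective ECG triggering, n = 31

# Welch's unpaired t-test statistic (unequal variances assumed).
m1, m2 = retro.mean(), prosp.mean()
v1, v2 = retro.var(ddof=1), prosp.var(ddof=1)
t_stat = (m1 - m2) / math.sqrt(v1 / len(retro) + v2 / len(prosp))

# Relative dose reduction of prospective triggering vs. retrospective gating.
reduction = 100.0 * (1.0 - m2 / m1)
print(f"t = {t_stat:.1f}, dose reduction = {reduction:.0f}%")
```

    With groups this far apart, the t statistic is very large and the relative dose reduction lands near the reported 71%.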

  5. Hepatic Arterial Configuration in Relation to the Segmental Anatomy of the Liver; Observations on MDCT and DSA Relevant to Radioembolization Treatment

    SciTech Connect

Hoven, Andor F. van den; Leeuwen, Maarten S. van; Lam, Marnix G. E. H.; Bosch, Maurice A. A. J. van den

    2015-02-15

Purpose Current anatomical classifications do not include all variants relevant for radioembolization (RE). The purpose of this study was to assess the individual hepatic arterial configuration and segmental vascularization pattern and to develop an individualized RE treatment strategy based on an extended classification. Methods The hepatic vascular anatomy was assessed on MDCT and DSA in patients who received a workup for RE between February 2009 and November 2012. Reconstructed MDCT studies were assessed to determine the hepatic arterial configuration (origin of every hepatic arterial branch, branching pattern and anatomical course) and the hepatic segmental vascularization territory of all branches. Aberrant hepatic arteries were defined as hepatic arterial branches that did not originate from the celiac axis/CHA/PHA. Early branching patterns were defined as hepatic arterial branches originating from the celiac axis/CHA. Results The hepatic arterial configuration and segmental vascularization pattern could be assessed in 110 of 133 patients. In 59 patients (54 %), no aberrant hepatic arteries or early branching was observed. Fourteen patients without aberrant hepatic arteries (13 %) had an early branching pattern. In the 37 patients (34 %) with aberrant hepatic arteries, five also had an early branching pattern. Sixteen different hepatic arterial segmental vascularization patterns were identified and described, differing by the presence of aberrant hepatic arteries, their respective vascular territory, and origin of the artery vascularizing segment four. Conclusions The hepatic arterial configuration and segmental vascularization pattern show marked individual variability beyond well-known classifications of anatomical variants. We developed an individualized RE treatment strategy based on an extended anatomical classification.

  6. Underestimating nearby nature: affective forecasting errors obscure the happy path to sustainability.

    PubMed

    Nisbet, Elizabeth K; Zelenski, John M

    2011-09-01

    Modern lifestyles disconnect people from nature, and this may have adverse consequences for the well-being of both humans and the environment. In two experiments, we found that although outdoor walks in nearby nature made participants much happier than indoor walks did, participants made affective forecasting errors, such that they systematically underestimated nature's hedonic benefit. The pleasant moods experienced on outdoor nature walks facilitated a subjective sense of connection with nature, a construct strongly linked with concern for the environment and environmentally sustainable behavior. To the extent that affective forecasts determine choices, our findings suggest that people fail to maximize their time in nearby nature and thus miss opportunities to increase their happiness and relatedness to nature. Our findings suggest a happy path to sustainability, whereby contact with nature fosters individual happiness and environmentally responsible behavior.

  7. Systematic Underestimation of Earthquake Magnitudes from Large Intracontinental Reverse Faults: Historical Ruptures Break Across Segment Boundaries

    NASA Technical Reports Server (NTRS)

    Rubin, C. M.

    1996-01-01

    Because most large-magnitude earthquakes along reverse faults have such irregular and complicated rupture patterns, reverse-fault segments defined on the basis of geometry alone may not be very useful for estimating sizes of future seismic sources. Most modern large ruptures of historical earthquakes generated by intracontinental reverse faults have involved geometrically complex rupture patterns. Ruptures across surficial discontinuities and complexities such as stepovers and cross-faults are common. Specifically, segment boundaries defined on the basis of discontinuities in surficial fault traces, pronounced changes in the geomorphology along strike, or the intersection of active faults commonly have not proven to be major impediments to rupture. Assuming that the seismic rupture will initiate and terminate at adjacent major geometric irregularities will commonly lead to underestimation of magnitudes of future large earthquakes.

  8. Individual differences in course choice result in underestimation of the validity of college admissions systems.

    PubMed

    Berry, Christopher M; Sackett, Paul R

    2009-07-01

    We demonstrate that the validity of SAT scores and high school grade point averages (GPAs) as predictors of academic performance has been underestimated because of previous studies' reliance on flawed performance indicators (i.e., college GPA) that are contaminated by the effects of individual differences in course choice. We controlled for this contamination by predicting individual course grades, instead of GPAs, in a data set containing more than 5 million college grades for 167,816 students. Percentage of variance accounted for by SAT scores and high school GPAs was 30 to 40% lower when the criteria were freshman and cumulative GPAs than when the criteria were individual course grades. SAT scores and high school GPAs together accounted for between 44 and 62% of the variance in college grades. This study provides new estimates of the criterion-related validity of SAT scores and high school GPAs, and highlights the care that must be taken in choosing appropriate criteria in validity studies.

  9. Virus mutation frequencies can be greatly underestimated by monoclonal antibody neutralization of virions.

    PubMed Central

    Holland, J J; de la Torre, J C; Steinhauer, D A; Clarke, D; Duarte, E; Domingo, E

    1989-01-01

Monoclonal antibody-resistant mutants have been widely used to estimate virus mutation frequencies. We demonstrate that standard virion neutralization inevitably underestimates monoclonal antibody-resistant mutant genome frequencies of vesicular stomatitis virus, due to phenotypic masking-mixing when wild-type (wt) virions are present in thousandfold greater numbers. We show that incorporation of antibody into the plaque overlay medium (after virus penetration at 37 degrees C) can provide accurate estimates of genome frequencies of neutral monoclonal antibody-resistant mutant viruses in wt clones. By using this method, we have observed two adjacent G→A base transition frequencies in the I3 epitope to be of the order of 10^-4 in a wt glycine codon. This appears to be slightly lower than the frequencies observed at other sites for total (viable and nonviable) virus genomes when using a direct sequence approach. PMID:2479770

  10. Maximum rates of climate change are systematically underestimated in the geological record

    NASA Astrophysics Data System (ADS)

    Kemp, David B.; Eichenseer, Kilian; Kiessling, Wolfgang

    2015-11-01

Recently observed rates of environmental change are typically much higher than those inferred for the geological past. At the same time, the magnitudes of ancient changes were often substantially greater than those established in recent history. The most pertinent disparity, however, between recent and geological rates is the timespan over which the rates are measured, which typically differs by several orders of magnitude. Here we show that rates of marked temperature changes inferred from proxy data in Earth history scale with measurement timespan as an approximate power law across nearly six orders of magnitude (10^2 to >10^7 years). This scaling reveals how climate signals measured in the geological record alias transient variability, even during the most pronounced climatic perturbations of the Phanerozoic. Our findings indicate that the true attainable pace of climate change on timescales of greatest societal relevance is underestimated in geological archives.
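
    The timespan dependence described above can be illustrated with a toy model. For a simple random walk (an assumption for illustration only, not the authors' proxy data or method), the mean measured rate |ΔT|/Δt falls off as Δt^(-1/2), so rates measured over longer timespans look systematically slower:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "temperature" series: a random walk with unit steps (arbitrary units).
n_steps = 1_000_000
temperature = np.cumsum(rng.normal(0.0, 1.0, n_steps))

def mean_abs_rate(series, timespan):
    """Mean absolute rate of change measured over a fixed timespan (in steps)."""
    diffs = series[timespan:] - series[:-timespan]
    return np.mean(np.abs(diffs)) / timespan

timespans = np.array([10, 100, 1_000, 10_000])
rates = np.array([mean_abs_rate(temperature, t) for t in timespans])

# For a random walk the measured rate scales as timespan**(-0.5),
# so the log-log slope should be close to -0.5.
slope = np.polyfit(np.log10(timespans), np.log10(rates), 1)[0]
print(f"log-log slope: {slope:.2f}")
```

    The same measurement-duration bias is what makes geologically averaged rates look slow compared with decadal observations.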

  11. Women may underestimate their partners' desires to use condoms: possible implications for behaviour.

    PubMed

    Edwards, Gaynor L; Barber, Bonnie L

    2010-01-01

    Australian young adults reported how often they wanted to use condoms in both romantic (n = 667) and casual relationship (n = 152) contexts and how often they thought their partners wanted to use condoms. Young adults wanted to use condoms more often than they perceived their partners to in both casual and romantic relationship contexts. Gender interactions showed that this pattern was especially strong among young women. Women seemed to underestimate the frequency at which their male partners wanted to use condoms. Furthermore, both the participants' condom use desires and perceptions of their partners' condom use desires predicted condom use behavior. Results suggest that gendered expectations may play a part in how often individuals perceive their partners to want to use condoms, which, in effect, may determine condom use behavior.

  12. Does verbatim sentence recall underestimate the language competence of near-native speakers?

    PubMed Central

    Schweppe, Judith; Barth, Sandra; Ketzer-Nöltge, Almut; Rummer, Ralf

    2015-01-01

Verbatim sentence recall is widely used to test the language competence of native and non-native speakers since it involves comprehension and production of connected speech. However, we assume that, to maintain surface information, sentence recall relies particularly on attentional resources, which differentially affects native and non-native speakers. Since even in near-natives language processing is less automatized than in native speakers, processing a sentence in a foreign language plus retaining its surface may result in a cognitive overload. We contrasted sentence recall performance of German native speakers with that of highly proficient non-natives. Non-natives recalled the sentences significantly more poorly than the natives, but performed equally well on a cloze test. This implies that sentence recall underestimates the language competence of good non-native speakers in mixed groups with native speakers. The findings also suggest that theories of sentence recall need to consider both its linguistic and its attentional aspects. PMID:25698996

  13. Does the Congressional Budget Office underestimate savings from reform? A review of the historical record.

    PubMed

    Gabel, Jon R

    2010-01-01

    When the Congressional Budget Office (CBO) "scores" legislation, or assesses the likely cost impact, it requires substantial evidence that a cost-saving initiative has historically achieved savings. The agency has difficulty addressing the impact of multiple changes made simultaneously without historical precedent where there is an interaction effect among proposed changes. This study examines CBO scoring of major reform legislation enacted during each of the past three decades, including the prospective payment system for hospitals in the 1980s, the Balanced Budget Act of the 1990s, and the Medicare Modernization Act of 2003. In contrasting actual spending with predicted spending, CBO, in all three cases, substantially underestimated savings from these reform measures.

  14. Yellow jackets may be an underestimated component of an ant-seed mutualism

    USGS Publications Warehouse

    Bale, M.T.; Zettler, J.A.; Robinson, B.A.; Spira, T.P.; Allen, C.R.

    2003-01-01

Yellow jackets (Hymenoptera: Vespidae) are attracted to the typically ant-dispersed seeds of trilliums and will take seeds from ants in the genus Aphaenogaster. To determine whether the presence of yellow jackets, Vespula maculifrons (Buysson), interferes with seed foraging by ants, we presented seeds of Trillium discolor Wray to three species (A. texana carolinensis Wheeler, Formica schaufussi Mayr, and Solenopsis invicta Buren) of seed-carrying ants in areas where vespids were present or excluded. We found that interspecific aggression between yellow jackets and ants is species specific. Vespid presence decreased average foraging time and increased foraging efficiency of two of the three ant species studied, a situation that might reflect competition for a limited food source. We also found that yellow jackets removed more seeds than ants, suggesting that vespids are important, albeit underestimated, components of ant-seed mutualisms.

  15. Does verbatim sentence recall underestimate the language competence of near-native speakers?

    PubMed

    Schweppe, Judith; Barth, Sandra; Ketzer-Nöltge, Almut; Rummer, Ralf

    2015-01-01

Verbatim sentence recall is widely used to test the language competence of native and non-native speakers since it involves comprehension and production of connected speech. However, we assume that, to maintain surface information, sentence recall relies particularly on attentional resources, which differentially affects native and non-native speakers. Since even in near-natives language processing is less automatized than in native speakers, processing a sentence in a foreign language plus retaining its surface may result in a cognitive overload. We contrasted sentence recall performance of German native speakers with that of highly proficient non-natives. Non-natives recalled the sentences significantly more poorly than the natives, but performed equally well on a cloze test. This implies that sentence recall underestimates the language competence of good non-native speakers in mixed groups with native speakers. The findings also suggest that theories of sentence recall need to consider both its linguistic and its attentional aspects.

  16. Maximum rates of climate change are systematically underestimated in the geological record

    PubMed Central

    Kemp, David B.; Eichenseer, Kilian; Kiessling, Wolfgang

    2015-01-01

Recently observed rates of environmental change are typically much higher than those inferred for the geological past. At the same time, the magnitudes of ancient changes were often substantially greater than those established in recent history. The most pertinent disparity, however, between recent and geological rates is the timespan over which the rates are measured, which typically differs by several orders of magnitude. Here we show that rates of marked temperature changes inferred from proxy data in Earth history scale with measurement timespan as an approximate power law across nearly six orders of magnitude (10^2 to >10^7 years). This scaling reveals how climate signals measured in the geological record alias transient variability, even during the most pronounced climatic perturbations of the Phanerozoic. Our findings indicate that the true attainable pace of climate change on timescales of greatest societal relevance is underestimated in geological archives. PMID:26555085

  17. Runners greatly underestimate sweat losses before and after a 1-hr summer run.

    PubMed

    O'Neal, Eric K; Davis, Brett A; Thigpen, Lauren K; Caufield, Christina R; Horton, Anthony D; McIntosh, Joyce R

    2012-10-01

The purpose of this study was to determine how accurately runners estimate their sweat losses. Male (n = 19) and female (n = 20) runners (41 ± 10 yr, VO2max 57 ± 9 ml·kg^-1·min^-1) from the southeastern U.S. completed an ~1-hr run during late summer on a challenging outdoor road course (wet bulb globe temperature 24.1 ± 1.5 °C). Runs began at ~6:45 a.m. or p.m. Before and after running, participants filled race-aid-station paper cups with a volume of fluid they felt would be equivalent to their sweat losses. Total sweat losses and losses by percent body weight differed (p < .01) between men (1,797 ± 449 ml, 2.3% ± 0.6%) and women (1,155 ± 258 ml, 1.9% ± 0.4%). Postrun estimates (738 ± 470 ml) were lower (p < .001) than sweat losses (1,468 ± 484 ml), equaling underestimations of 50% ± 23%, with no differences in estimation accuracy by percentage between genders. Runners who reported measuring changes in pre- and postrun weight to assess sweat losses within the previous month (n = 9, -54% ± 18%) were no more accurate (p = .55) than runners who had not (n = 30, -48% ± 24%). These results suggest that inadequate fluid intake during runs or between runs may stem from underestimations of sweat losses and that runners who do assess sweat-loss changes may be making sweat-loss calculation errors or do not accurately translate changes in body weight to physical volumes of water.
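
    The underlying arithmetic of field sweat-loss assessment is simple: body-mass change plus fluid intake, taking 1 kg of mass loss as roughly 1 L of water. A minimal sketch (function names and the example runner are hypothetical, not from the study):

```python
def sweat_loss_ml(pre_kg, post_kg, fluid_intake_ml=0.0, urine_ml=0.0):
    """Estimate sweat loss (ml) from pre/post-run body mass.

    Assumes 1 kg of body-mass change corresponds to ~1 L of water,
    a common field approximation.
    """
    mass_loss_ml = (pre_kg - post_kg) * 1000.0
    return mass_loss_ml + fluid_intake_ml - urine_ml

def percent_body_mass_lost(pre_kg, post_kg):
    """Sweat loss expressed as a percentage of starting body mass."""
    return 100.0 * (pre_kg - post_kg) / pre_kg

# Example: a 78 kg runner finishing at 76.5 kg after drinking 400 ml.
loss = sweat_loss_ml(78.0, 76.5, fluid_intake_ml=400.0)  # 1900.0 ml
pct = percent_body_mass_lost(78.0, 76.5)                 # ~1.9 %
print(loss, round(pct, 1))
```

    Forgetting to add fluid intake back in, or equating mass change directly with the volume to drink, is exactly the kind of calculation error the authors suggest runners may be making.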

  18. Modeling microelectrode biosensors: free-flow calibration can substantially underestimate tissue concentrations

    PubMed Central

    Wall, Mark J.

    2016-01-01

    Microelectrode amperometric biosensors are widely used to measure concentrations of analytes in solution and tissue including acetylcholine, adenosine, glucose, and glutamate. A great deal of experimental and modeling effort has been directed at quantifying the response of the biosensors themselves; however, the influence that the macroscopic tissue environment has on biosensor response has not been subjected to the same level of scrutiny. Here we identify an important issue in the way microelectrode biosensors are calibrated that is likely to have led to underestimations of analyte tissue concentrations. Concentration in tissue is typically determined by comparing the biosensor signal to that measured in free-flow calibration conditions. In a free-flow environment the concentration of the analyte at the outer surface of the biosensor can be considered constant. However, in tissue the analyte reaches the biosensor surface by diffusion through the extracellular space. Because the enzymes in the biosensor break down the analyte, a density gradient is set up resulting in a significantly lower concentration of analyte near the biosensor surface. This effect is compounded by the diminished volume fraction (porosity) and reduction in the diffusion coefficient due to obstructions (tortuosity) in tissue. We demonstrate this effect through modeling and experimentally verify our predictions in diffusive environments. NEW & NOTEWORTHY Microelectrode biosensors are typically calibrated in a free-flow environment where the concentrations at the biosensor surface are constant. However, when in tissue, the analyte reaches the biosensor via diffusion and so analyte breakdown by the biosensor results in a concentration gradient and consequently a lower concentration around the biosensor. This effect means that naive free-flow calibration will underestimate tissue concentration. We develop mathematical models to better quantify the discrepancy between the calibration and tissue

  19. Hydrogen peroxide efflux from muscle mitochondria underestimates matrix superoxide production: a correction using glutathione depletion

    PubMed Central

    TREBERG, Jason R.; QUINLAN, Casey L.; BRAND, Martin D.

    2010-01-01

Summary The production of H2O2 by isolated mitochondria is frequently used as a measure of mitochondrial superoxide formation. Matrix superoxide dismutase quantitatively converts matrix superoxide to H2O2. However, matrix enzymes such as the glutathione peroxidases can consume H2O2 and compete with efflux of H2O2, causing an underestimate of superoxide production. To assess this underestimate we depleted matrix glutathione in rat skeletal muscle mitochondria by more than 90% by pretreatment with 1-chloro-2,4-dinitrobenzene (CDNB). The pretreatment protocol strongly diminished the mitochondrial capacity to consume exogenous H2O2, consistent with decreased peroxidase capacity, but avoided direct stimulation of superoxide production from complex I. It elevated the observed rates of H2O2 formation from matrix-directed superoxide up to two-fold from several sites of production, defined by substrates and electron transport inhibitors, over a wide range of control rates, from 0.2 to 2.5 nmol H2O2 • min−1 • mg protein−1. Similar results were obtained when glutathione was depleted using monochlorobimane or when soluble matrix peroxidase activity was removed by preparation of submitochondrial particles. The data indicate that the increased H2O2 efflux observed with CDNB pretreatment was a result of glutathione depletion and compromised peroxidase activity. A hyperbolic correction curve was constructed, making H2O2 efflux a more quantitative measure of matrix superoxide production. For rat muscle mitochondria, the correction equation was: [CDNB pretreated rate = control rate + (1.43*(control rate))/(0.55+control rate)]. These results have significant ramifications for the rates and topology of superoxide production by isolated mitochondria. PMID:20491900
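
    The hyperbolic correction equation quoted at the end of the abstract can be applied directly to observed H2O2 efflux rates. A minimal sketch (the function name is ours; the constants 1.43 and 0.55 and the 0.2-2.5 rate range are taken from the abstract):

```python
def corrected_superoxide_rate(control_rate):
    """Apply the abstract's hyperbolic correction for rat muscle
    mitochondria, converting observed H2O2 efflux (control rate,
    nmol H2O2/min/mg protein) into an estimate of matrix superoxide
    production:

        corrected = control + 1.43 * control / (0.55 + control)
    """
    return control_rate + (1.43 * control_rate) / (0.55 + control_rate)

# Correction across the reported range of control rates.
for r in (0.2, 1.0, 2.5):
    print(f"observed {r:.2f} -> corrected {corrected_superoxide_rate(r):.2f}")
# observed 0.20 -> corrected 0.58
# observed 1.00 -> corrected 1.92
# observed 2.50 -> corrected 3.67
```

    Note that the relative underestimate is largest at low control rates, consistent with peroxidases competing most effectively when H2O2 production is slow.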

  20. Modeling microelectrode biosensors: free-flow calibration can substantially underestimate tissue concentrations.

    PubMed

    Newton, Adam J H; Wall, Mark J; Richardson, Magnus J E

    2017-03-01

Microelectrode amperometric biosensors are widely used to measure concentrations of analytes in solution and tissue including acetylcholine, adenosine, glucose, and glutamate. A great deal of experimental and modeling effort has been directed at quantifying the response of the biosensors themselves; however, the influence that the macroscopic tissue environment has on biosensor response has not been subjected to the same level of scrutiny. Here we identify an important issue in the way microelectrode biosensors are calibrated that is likely to have led to underestimations of analyte tissue concentrations. Concentration in tissue is typically determined by comparing the biosensor signal to that measured in free-flow calibration conditions. In a free-flow environment the concentration of the analyte at the outer surface of the biosensor can be considered constant. However, in tissue the analyte reaches the biosensor surface by diffusion through the extracellular space. Because the enzymes in the biosensor break down the analyte, a density gradient is set up resulting in a significantly lower concentration of analyte near the biosensor surface. This effect is compounded by the diminished volume fraction (porosity) and reduction in the diffusion coefficient due to obstructions (tortuosity) in tissue. We demonstrate this effect through modeling and experimentally verify our predictions in diffusive environments. NEW & NOTEWORTHY Microelectrode biosensors are typically calibrated in a free-flow environment where the concentrations at the biosensor surface are constant. However, when in tissue, the analyte reaches the biosensor via diffusion and so analyte breakdown by the biosensor results in a concentration gradient and consequently a lower concentration around the biosensor. This effect means that naive free-flow calibration will underestimate tissue concentration. We develop mathematical models to better quantify the discrepancy between the calibration and tissue

  1. Influenza Pneumonia Surveillance among Hospitalized Adults May Underestimate the Burden of Severe Influenza Disease

    PubMed Central

    Ortiz, Justin R.; Neuzil, Kathleen M.; Cooke, Colin R.; Neradilek, Moni B.; Goss, Christopher H.; Shay, David K.

    2014-01-01

    Background Studies seeking to estimate the burden of influenza among hospitalized adults often use case definitions that require presence of pneumonia. The goal of this study was to assess the extent to which restricting influenza testing to adults hospitalized with pneumonia could underestimate the total burden of hospitalized influenza disease. Methods We conducted a modelling study using the complete State Inpatient Databases from Arizona, California, and Washington and regional influenza surveillance data acquired from CDC from January 2003 through March 2009. The exposures of interest were positive laboratory tests for influenza A (H1N1), influenza A (H3N2), and influenza B from two contiguous US Federal Regions encompassing the study area. We identified the two outcomes of interest by ICD-9-CM code: respiratory and circulatory hospitalizations, as well as critical illness hospitalizations (acute respiratory failure, severe sepsis, and in-hospital death). We linked the hospitalization datasets with the virus surveillance datasets by geographic region and month of hospitalization. We used negative binomial regression models to estimate the number of influenza-associated events for the outcomes of interest. We sub-categorized these events to include all outcomes with or without pneumonia diagnosis codes. Results We estimated that there were 80,834 (95% CI 29,214–174,033) influenza-associated respiratory and circulatory hospitalizations and 26,760 (95% CI 14,541–47,464) influenza-associated critical illness hospitalizations. When a pneumonia diagnosis was excluded, the estimated number of influenza-associated respiratory and circulatory hospitalizations was 24,816 (95% CI 6,342–92,624). The estimated number of influenza-associated critical illness hospitalizations was 8,213 (95% CI 3,764–20,799). Around 30% of both influenza-associated respiratory and circulatory hospitalizations, as well as influenza-associated critical illness hospitalizations did not

  2. Trabecular bone structure analysis in the osteoporotic spine using a clinical in vivo setup for 64-slice MDCT imaging: comparison to microCT imaging and microFE modeling.

    PubMed

    Issever, Ahi S; Link, Thomas M; Kentenich, Marie; Rogalla, Patrik; Schwieger, Karsten; Huber, Markus B; Burghardt, Andrew J; Majumdar, Sharmila; Diederichs, Gerd

    2009-09-01

    Assessment of trabecular microarchitecture may improve estimation of biomechanical strength, but visualization of trabecular bone structure in vivo is challenging. We tested the feasibility of assessing trabecular microarchitecture in the spine using multidetector CT (MDCT) on intact human cadavers in an experimental in vivo-like setup. BMD, bone structure (e.g., bone volume/total volume = BV/TV; trabecular thickness = Tb.Th; structure model index = SMI) and bone texture parameters were evaluated in 45 lumbar vertebral bodies using MDCT (mean in-plane pixel size, 274 µm²; slice thickness, 500 µm). These measures were correlated with structure measures assessed with microCT at an isotropic spatial resolution of 16 µm and with microfinite element models (microFE) of apparent modulus and stiffness. MDCT-derived BMD and structure measures showed significant correlations with the density and structure obtained by microCT (BMD, R² = 0.86, p < 0.0001; BV/TV, R² = 0.64, p < 0.0001; Tb.Th, R² = 0.36, p < 0.01). When comparing microCT-derived measures with microFE models, the following correlations (p < 0.001) were found for apparent modulus and stiffness, respectively: BMD (R² = 0.58 and 0.66), BV/TV (R² = 0.44 and 0.58), and SMI (R² = 0.44 and 0.49). However, the overall highest correlations (p < 0.001) with microFE apparent modulus (R² = 0.75) and stiffness (R² = 0.76) were achieved by combining QCT-derived BMD with the bone texture measure Minkowski Dimension. In summary, although still limited by its spatial resolution, trabecular bone structure assessment using MDCT is feasible overall. However, when compared with microFE-derived bone properties, BMD is superior to single microarchitecture parameters, and correlations improve further when texture measures are added.

  3. Asymmetries of Poverty: Why Global Burden of Disease Valuations Underestimate the Burden of Neglected Tropical Diseases

    PubMed Central

    King, Charles H.; Bertino, Anne-Marie

    2008-01-01

    The disability-adjusted life year (DALY) initially appeared attractive as a health metric in the Global Burden of Disease (GBD) program, as it purports to be a comprehensive health assessment that encompassed premature mortality, morbidity, impairment, and disability. It was originally thought that the DALY would be useful in policy settings, reflecting normative valuations as a standardized unit of ill health. However, the design of the DALY and its use in policy estimates contain inherent flaws that result in systematic undervaluation of the importance of chronic diseases, such as many of the neglected tropical diseases (NTDs), in world health. The conceptual design of the DALY comes out of a perspective largely focused on the individual risk rather than the ecology of disease, thus failing to acknowledge the implications of context on the burden of disease for the poor. It is nonrepresentative of the impact of poverty on disability, which results in the significant underestimation of disability weights for chronic diseases such as the NTDs. Finally, the application of the DALY in policy estimates does not account for the nonlinear effects of poverty in the cost-utility analysis of disease control, effectively discounting the utility of comprehensively treating NTDs. The present DALY framework needs to be substantially revised if the GBD is to become a valid and useful system for determining health priorities. PMID:18365036

  4. PCR diagnostics underestimate the prevalence of avian malaria (Plasmodium relictum) in experimentally-infected passerines

    USGS Publications Warehouse

    Jarvi, Susan I.; Schultz, Jeffrey J.; Atkinson, Carter T.

    2002-01-01

    Several polymerase chain reaction (PCR)-based methods have recently been developed for diagnosing malarial infections in both birds and reptiles, but a critical evaluation of their sensitivity in experimentally-infected hosts has not been done. This study compares the sensitivity of several PCR-based methods for diagnosing avian malaria (Plasmodium relictum) in captive Hawaiian honeycreepers with that of microscopy and a recently developed immunoblotting technique. Sequential blood samples were collected over periods of up to 4.4 yr after experimental infection and rechallenge to determine both the duration and detectability of chronic infections. Two new nested PCR approaches for detecting circulating parasites, based on the P. relictum 18S rRNA genes and the thrombospondin-related anonymous protein (TRAP) gene, are described. Blood smears and the PCR tests were less sensitive than serological methods for detecting chronic malarial infections. Individually, none of the diagnostic methods was 100% accurate in detecting subpatent infections, although serological methods were significantly more sensitive (97%) than either nested PCR (61-84%) or microscopy (27%). Circulating parasites in chronically infected birds either disappear completely from circulation or drop to intensities below the detection limit of nested PCR. Thus, the use of PCR as the sole means of detecting circulating parasites may significantly underestimate true prevalence.

  5. Gastroesophageal reflux disease vs. Panayiotopoulos syndrome: an underestimated misdiagnosis in pediatric age?

    PubMed

    Parisi, Pasquale; Pacchiarotti, Claudia; Ferretti, Alessandro; Bianchi, Simona; Paolino, Maria Chiara; Barreto, Mario; Principessa, Luigi; Villa, Maria Pia

    2014-12-01

    Autonomic signs and symptoms may be of epileptic or nonepileptic origin, and the differential diagnosis depends on a number of factors, including the nature of the autonomic manifestations themselves, the occurrence of other nonictal autonomic signs/symptoms, and the age of the patient. Here, we describe twelve children (aged from ten months to six years at the onset of symptoms) with Panayiotopoulos syndrome misdiagnosed as gastroesophageal reflux disease. Gastroesophageal reflux disease and Panayiotopoulos syndrome may represent an underestimated diagnostic challenge. When the signs/symptoms occur mainly during sleep, a sleep EEG or, if available, a polysomnographic evaluation may be the most useful investigation for differentiating autonomic epileptic from nonepileptic disorders. Early detection can reduce both the high morbidity related to mismanagement and the high costs to the national health service arising from incorrect diagnostic and therapeutic approaches. To decide whether antiseizure therapy is required, one should take into account both the frequency and severity of the epileptic seizures and the tendency toward potentially lethal autonomic cardiorespiratory involvement. In conclusion, we would emphasize the need to differentiate gastroesophageal reflux disease from Panayiotopoulos syndrome in patients with "an unusual" late-onset picture of GERD and acid therapy-resistant gastroesophageal reflux, especially if associated with other autonomic signs and symptoms.

  6. Exposure limits: the underestimation of absorbed cell phone radiation, especially in children.

    PubMed

    Gandhi, Om P; Morgan, L Lloyd; de Salles, Alvaro Augusto; Han, Yueh-Ying; Herberman, Ronald B; Davis, Devra Lee

    2012-03-01

    The existing cell phone certification process uses a plastic model of the head called the Specific Anthropomorphic Mannequin (SAM), representing the top 10% of U.S. military recruits in 1989, which greatly underestimates the Specific Absorption Rate (SAR) for typical mobile phone users, especially children. A superior computer simulation certification process has been approved by the Federal Communications Commission (FCC) but is not employed to certify cell phones. In the United States, the FCC determines maximum allowed exposures. Many countries, especially European Union members, use the "guidelines" of the International Commission on Non-Ionizing Radiation Protection (ICNIRP), a non-governmental organization. A head smaller than SAM absorbs radiofrequency (RF) energy at a relatively higher SAR. Also, SAM uses a fluid having the average electrical properties of the head, which cannot capture differential absorption in specific brain tissues, nor absorption in children or smaller adults. The SAR for a 10-year-old is up to 153% higher than the SAR for the SAM model. When electrical properties are considered, absorption in a child's head can be over two times greater, and absorption in the skull's bone marrow can be ten times greater, than in adults. Therefore, a new certification process is needed that incorporates different modes of use, head sizes, and tissue properties. Anatomically based models should be employed in revising safety standards for these ubiquitous modern devices, and standards should be set by accountable, independent groups.

  7. Sequestration and bioavailability of perfluoroalkyl acids (PFAAs) in soils: Implications for their underestimated risk.

    PubMed

    Zhao, Lixia; Zhu, Lingyan; Zhao, Shuyan; Ma, Xinxin

    2016-12-01

    Different from typical hydrophobic organic contaminants (HOCs), perfluoroalkyl acids (PFAAs) are more soluble in water and partition less to soil than the HOCs. It remains unclear whether, and to what extent, PFAAs can be sequestrated in soil. In this study, sequential extraction of PFAAs from soil and bioaccumulation of PFAAs in earthworms were carried out to understand the sequestration and bioavailability of PFAAs in soils with different soil organic matter (SOM) contents, aged for different periods (7 and 47 d). Sequestration occurred to different degrees depending on the amount and composition of SOM in the soil, the structural properties of the PFAAs, and aging time. Surprisingly, in one peat soil with a high fraction of organic carbon (foc, 59%), the PFAAs were completely sequestrated in the soil. Aging might lead to further sequestration of PFAAs in soils with relatively lower foc. As a consequence of sequestration, the bioavailability of PFAAs in peat soils was reduced 3- to 10-fold compared with that in the plain farmland soil. However, the sequestrated PFAAs were still bioaccumulative in earthworms to some extent. The results indicate that the risk of PFAAs in field soils with a high content of SOM could be underestimated if only free PFAAs, recovered by mild solvent extraction, were monitored.

  8. Asbestos in Belgium: an underestimated health risk. The evolution of mesothelioma mortality rates (1969–2009)

    PubMed Central

    Van den Borre, Laura; Deboosere, Patrick

    2014-01-01

    Background: Although Belgium was once a major international manufacturer of asbestos products, asbestos-related diseases in the country have remained scarcely researched. Objectives: The aim of this study is to provide a descriptive analysis of Belgian mesothelioma mortality rates in order to improve the understanding of asbestos health hazards from an international perspective. Methods: Temporal and geographical analyses were performed on cause-specific mortality data (1969–2009) using quantitative demographic measures. Results were compared to recent findings on global mesothelioma deaths. Results: Belgium has one of the highest mesothelioma mortality rates in the world, following the UK, Australia, and Italy. With a progressive increase in male mesothelioma deaths since the mid-1980s, large differences in mortality rates between the sexes are apparent. Mesothelioma deaths are primarily concentrated in geographic areas close to former asbestos industries. Conclusions: Asbestos mortality in Belgium has been underestimated for decades. Our findings suggest that the location of asbestos industries is correlated with rates of mesothelioma, underlining the need to avert future asbestos exposure by thorough screening of potentially contaminated sites and by pursuing a global ban on asbestos. PMID:24999848

  9. Impact of SNR and Gain-Function Over- and Under-estimation on Speech Intelligibility.

    PubMed

    Chen, Fei; Loizou, Philipos C

    2012-02-01

    Most noise reduction algorithms rely on obtaining reliable estimates of the SNR of each frequency bin. For that reason, much work has been done in analyzing the behavior and performance of SNR estimation algorithms in the context of improving speech quality and reducing speech distortions (e.g., musical noise). Comparatively little work has been reported, however, regarding the analysis and investigation of the effect of errors in SNR estimation on speech intelligibility. It is not known, for instance, whether it is the errors in SNR overestimation, errors in SNR underestimation, or both that are harmful to speech intelligibility. Errors in SNR estimation produce concomitant errors in the computation of the gain (suppression) function, and the impact of gain estimation errors on speech intelligibility is unclear. The present study assesses the effect of SNR estimation errors on gain function estimation via sensitivity analysis. Intelligibility listening studies were conducted to validate the sensitivity analysis. Results indicated that speech intelligibility is severely compromised when SNR and gain over-estimation errors are introduced in spectral components with negative SNR. A theoretical upper bound on the gain function is derived that can be used to constrain the values of the gain function so as to ensure that SNR overestimation errors are minimized. Speech enhancement algorithms that can limit the values of the gain function to fall within this upper bound can improve speech intelligibility.
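    The asymmetry described above, that gain errors in negative-SNR bins are the damaging ones, can be illustrated with a toy calculation (a sketch under assumed numbers, not the authors' sensitivity analysis). Using the standard Wiener gain G = ξ/(1+ξ), a fixed overestimation in dB inflates the gain far more, in relative terms, where the true SNR is negative:

```python
import numpy as np

def wiener_gain(snr_linear):
    """Wiener suppression gain for a given a priori SNR (linear scale)."""
    return snr_linear / (1.0 + snr_linear)

# True per-bin SNRs in dB: the negative-SNR bins are noise-dominated.
true_snr_db = np.array([-10.0, -5.0, 0.0, 5.0, 10.0])
true_snr = 10.0 ** (true_snr_db / 10.0)

# Simulate a systematic 6 dB SNR overestimation by the estimator.
over_snr = 10.0 ** ((true_snr_db + 6.0) / 10.0)

g_true = wiener_gain(true_snr)
g_over = wiener_gain(over_snr)

# Relative gain inflation is largest in the negative-SNR bins, which is
# where noise leaks through and intelligibility suffers.
rel_error = (g_over - g_true) / g_true
for db, e in zip(true_snr_db, rel_error):
    print(f"SNR {db:+5.1f} dB: relative gain inflation {e:.2f}")
```

Running this shows the inflation shrinking monotonically as the true SNR rises, consistent with the finding that overestimation errors in negative-SNR components are the ones that compromise intelligibility.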

  10. Assessment of snowfall accumulation underestimation by tipping bucket gauges in the Spanish operational network

    NASA Astrophysics Data System (ADS)

    Buisán, Samuel T.; Earle, Michael E.; Luís Collado, José; Kochendorfer, John; Alastrué, Javier; Wolff, Mareile; Smith, Craig D.; López-Moreno, Juan I.

    2017-03-01

    Within the framework of the World Meteorological Organization Solid Precipitation Intercomparison Experiment (WMO-SPICE), the Thies tipping bucket precipitation gauge was assessed against the SPICE reference configuration at the Formigal-Sarrios test site located in the Pyrenees mountain range of Spain. The Thies gauge is the precipitation gauge most widely used by the Spanish Meteorological State Agency (AEMET) for the measurement of all precipitation types, including snow; it is therefore critical that its performance be characterized. The first objective of this study is to derive transfer functions based on the relationships between catch ratio and wind speed and temperature. Multiple linear regression was applied to 1 and 3 h accumulation periods, confirming that wind is the dominant environmental variable affecting the gauge catch efficiency, especially during snowfall events. At a wind speed of 1.5 m s-1 the tipping bucket recorded only 70 % of the reference precipitation. At 3 m s-1, the measured precipitation decreased to 50 % of the reference, was even lower for temperatures colder than -2 °C, and fell to 20 % or less at higher wind speeds. The implications of precipitation underestimation for areas in northern Spain are discussed within the context of the present analysis, by applying the transfer function developed at the Formigal-Sarrios site and using results from previous studies.
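    A transfer function of the kind described, a multiple linear regression of catch ratio on wind speed and temperature that is then inverted to correct gauge readings, can be sketched as follows. The data and fitted coefficients here are synthetic illustrations, not the values derived at Formigal-Sarrios.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic precipitation events: wind speed (m/s) and temperature (degC).
wind = rng.uniform(0.0, 6.0, 200)
temp = rng.uniform(-8.0, 2.0, 200)

# Illustrative "true" catch ratio: falls with wind, slightly with cold.
catch = np.clip(1.0 - 0.13 * wind + 0.01 * temp
                + rng.normal(0.0, 0.03, 200), 0.05, 1.0)

# Multiple linear regression: catch ~ b0 + b1*wind + b2*temp.
X = np.column_stack([np.ones_like(wind), wind, temp])
beta, *_ = np.linalg.lstsq(X, catch, rcond=None)

def corrected_precip(measured_mm, wind_speed, temperature):
    """Undo gauge undercatch by dividing by the predicted catch ratio
    (floored to avoid dividing by an implausibly small value)."""
    predicted = beta[0] + beta[1] * wind_speed + beta[2] * temperature
    return measured_mm / max(predicted, 0.05)

print("fitted coefficients:", beta)
```

For example, a gauge total of 5 mm recorded at 3 m/s and -1 degC is corrected upward, since the fitted catch ratio at those conditions is well below 1.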

  11. Age, risk assessment, and sanctioning: Overestimating the old, underestimating the young.

    PubMed

    Monahan, John; Skeem, Jennifer; Lowenkamp, Christopher

    2017-04-01

    While many extol the potential contribution of risk assessment to reducing the human and fiscal costs of mass incarceration without increasing crime, others adamantly oppose the incorporation of risk assessment in sanctioning. The principal concern is that any benefits in terms of reduced rates of incarceration achieved through the use of risk assessment will be offset by costs to social justice, which are claimed to be inherent in any risk assessment process that relies on variables for which offenders bear no responsibility, such as race, gender, and age. Previous research has addressed the variables of race and gender. Here, based on a sample of 7,350 federal offenders, we empirically test the predictive fairness of an instrument, the Post Conviction Risk Assessment (PCRA), that includes the variable of age. We found that the strength of association between PCRA scores and future arrests was similar across younger (i.e., 25 years and younger), middle (i.e., 26-40 years), and older (i.e., 41 years and older) age groups (AUC values .70 or higher). Nevertheless, rates of arrest within each PCRA risk category were consistently lower for older than for younger offenders. Despite its inclusion of age as a risk factor, PCRA scores overestimated rates of recidivism for older offenders and underestimated rates of recidivism for younger offenders.

  12. Congenital isolated adrenocorticotropin deficiency: an underestimated cause of neonatal death, explained by TPIT gene mutations.

    PubMed

    Vallette-Kasic, Sophie; Brue, Thierry; Pulichino, Anne-Marie; Gueydan, Magali; Barlier, Anne; David, Michel; Nicolino, Marc; Malpuech, Georges; Déchelotte, Pierre; Deal, Cheri; Van Vliet, Guy; De Vroede, Monique; Riepe, Felix G; Partsch, Carl-Joachim; Sippell, Wolfgang G; Berberoglu, Merih; Atasay, Begüm; de Zegher, Francis; Beckers, Dominique; Kyllo, Jennifer; Donohoue, Patricia; Fassnacht, Martin; Hahner, Stefanie; Allolio, Bruno; Noordam, C; Dunkel, Leo; Hero, Matti; Pigeon, B; Weill, Jacques; Yigit, Sevket; Brauner, Raja; Heinrich, Juan Jorge; Cummings, Elizabeth; Riddell, Christie; Enjalbert, Alain; Drouin, Jacques

    2005-03-01

    Tpit is a T box transcription factor important for terminal differentiation of pituitary proopiomelanocortin-expressing cells. We demonstrated that human and mouse mutations of the TPIT gene cause a neonatal-onset form of congenital isolated ACTH deficiency (IAD). In the absence of glucocorticoid replacement, IAD can lead to neonatal death by acute adrenal insufficiency. This clinical entity was not previously well characterized because of the small number of published cases. Since identification of the first TPIT mutations, we have enlarged our series of neonatal IAD patients to 27 patients from 21 unrelated families. We found TPIT mutations in 17 of 27 patients. We identified 10 different TPIT mutations, with one mutation found in five unrelated families. All patients appeared to be homozygous or compound heterozygous for TPIT mutations, and their unaffected parents are heterozygous carriers, confirming a recessive mode of transmission. We compared the clinical and biological phenotype of the 17 IAD patients carrying a TPIT mutation with the 10 IAD patients with normal TPIT-coding sequences. This series of neonatal IAD patients revealed a highly homogeneous clinical presentation, suggesting that this disease may be an underestimated cause of neonatal death. Identification of TPIT gene mutations as the principal molecular cause of neonatal IAD permits prenatal diagnosis for families at risk for the purpose of early glucocorticoid replacement therapy.

  13. New Zealand Joint Registry data underestimates the rate of prosthetic joint infection.

    PubMed

    Zhu, Mark; Ravi, Saiprasad; Frampton, Chris; Luey, Chris; Young, Simon

    2016-08-01

    Background and purpose - Recent studies have revealed deficiencies in the accuracy of data from joint registries when reoperations for prosthetic joint infections (PJIs) are reported, particularly when no components are changed. We compared the accuracy of data from the New Zealand Joint Registry (NZJR) to a multicenter audit of hospital records to establish the rate of capture for PJI reoperations. Methods - 4,009 cases undergoing total knee or hip arthroplasty performed at 3 tertiary referral hospitals over a 3-year period were audited using multiple hospital datasets and the NZJR. The number of reoperations for PJI that were performed within 2 years of the primary arthroplasty was obtained using both methods and the data were compared. Results - The NZJR reported a 2-year reoperation rate for PJI of 0.67%, as compared to 1.1% from the audit of hospital records, giving the NZJR a sensitivity of 63%. Only 4 of 11 debridement-in-situ-only procedures and 7 of 12 modular exchange procedures were captured in the NZJR. Interpretation - The national joint registry underestimated the rate of reoperation for PJI by one third. Strategies for improving the accuracy of data might include revising and clarifying the registry forms to include all reoperations for PJI and frequent validation of the registry data against other databases.

  14. Misdiagnosis of Sacral Stress Fracture: An Underestimated Cause of Low Back Pain in Pregnancy?

    PubMed Central

    Perdomo, Ambar Deschamps; Tomé-Bermejo, Félix; Piñera, Angel R.; Alvarez, Luis

    2015-01-01

    Patient: Female, 28 Final Diagnosis: Sacral stress fracture Symptoms: Lumbosacral pain during pregnancy Medication: — Clinical Procedure: Activity modification • conservative treatment Specialty: Orthopedics and Traumatology Objective: Challenging differential diagnosis Background: Sacral stress fracture during pregnancy is an uncommon condition with unclear pathophysiology, presenting with non-specific symptoms and clinical findings. To date, few cases have been published in the literature describing the occurrence of sacral stress fracture during pregnancy. Case Report: We report a 28-year-old primigravid patient who developed lumbosacral pain at the end of the second trimester. Symptoms were overlooked throughout pregnancy and the postpartum period, resulting in the development of secondary chronic gait and balance problems. Conclusions: Stress fracture of the sacrum should be included in the differential diagnosis of low back and sacral pain during pregnancy. Its prevalence is probably underestimated because of the lack of specificity of the symptoms. Plain radiographs are not appropriate because ionizing radiation should be avoided during pregnancy; magnetic resonance imaging is the only method that can be applied safely. There is limited information on the natural history, but many patients are expected to have a benign course. However, misdiagnosis may lead to prolonged morbidity and the development of secondary gait abnormalities. A high index of suspicion is necessary to establish an early diagnosis and appropriate treatment. PMID:25656418

  15. Bioavailability of lysine for kittens in overheated casein is underestimated by the rat growth assay method.

    PubMed

    Larsen, J A; Fascetti, A J; Calvert, C C; Rogers, Q R

    2010-10-01

    Growth assays were performed to determine lysine bioavailability for kittens and rats in untreated and heated casein; these values were compared with estimates obtained with an in vitro method. Body weight, food intake, nitrogen and dry matter digestibility, and plasma lysine were determined during an 80-day growth trial using kittens (n = 16). Body weight and food intake were determined during a 21-day growth trial using weanling rats (n = 80). The growth data showed bioavailable lysine to be 102.4% and 100.2% (for untreated casein) and 66.1% and 51.7% (for heated casein) for kittens and rats, respectively. There was no relationship between plasma lysine and dietary lysine concentrations for kittens. There were no significant differences in nitrogen or dry matter digestibility among diets for kittens. The chemically reactive lysine content of untreated casein was 99.6%, and of heated casein was 67.1%. Heat treatment of casein resulted in significantly decreased lysine bioavailability as estimated by all methods. For untreated casein, both growth assays showed good agreement with the in vitro method for available lysine. For heated casein, the rat growth assay significantly underestimated bioavailable lysine as determined in kittens while the in vitro method closely approximated this value for the cat.

  16. The Dst index underestimates the solar cycle variation of geomagnetic activity.

    PubMed

    Temerin, Michael; Li, Xinlin

    2015-07-01

    It is known that the correction of the Kyoto Dst index for the secular variation of the Earth's internal field produces a discontinuity in the Kyoto Dst index at the end of each year. We show that this secular correction also introduces a significant baseline error to the Kyoto Dst index that leads to an underestimate of the solar cycle variation of geomagnetic activity and of the strength of the ring current as measured by the Kyoto Dst index. Thus, the average value of the Kyoto Dst index would be approximately 13 nT more negative for the active year 2003 compared to quiet years 2006 and 2009 if the Kyoto Dst index properly measured the effects of the ring current and other currents that influence the Dst observatories. Discontinuities in the Kyoto Dst index at the end of each year have an average value of about 5 nT, but the discontinuity at the end of year 2002 was approximately 12 nT, and the discontinuity at the end of year 1982 may have been as large as 20 nT.

  17. Consumer underestimation of sodium in fast food restaurant meals: Results from a cross-sectional observational study.

    PubMed

    Moran, Alyssa J; Ramirez, Maricelle; Block, Jason P

    2017-02-21

    Restaurants are key venues for reducing sodium intake in the U.S., but little is known about consumer perceptions of sodium in restaurant foods. This study quantifies the difference between estimated and actual sodium content of restaurant meals and examines predictors of underestimation among adult and adolescent diners at fast food restaurants. In 2013 and 2014, meal receipts and questionnaires were collected from adults and adolescents dining at six restaurant chains in four New England cities. The sample included 993 adults surveyed during 229 dinnertime visits to 44 restaurants and 794 adolescents surveyed during 298 visits to 49 restaurants after school or at lunchtime. Diners were asked to estimate the amount of sodium (mg) in the meal they had just purchased. Sodium estimates were compared with the actual sodium in the meal, calculated by matching all items that the respondent purchased for personal consumption to sodium information on chain restaurant websites. Mean (SD) actual sodium content of meals was 1,292 mg (970) for adults and 1,128 mg (891) for adolescents. One-quarter of diners (176 (23%) adults, 155 (25%) adolescents) were unable or unwilling to estimate the sodium content of their meals. Of those who provided estimates, 90% of adults and 88% of adolescents underestimated the sodium in their meals, with adults underestimating by a mean (SD) of 1,013 mg (1,055) and adolescents by 876 mg (1,021). Respondents underestimated sodium content more for meals with greater sodium content. Education about sodium at the point of purchase, such as provision of sodium information on restaurant menu boards, may help correct consumer underestimation, particularly for meals of high sodium content.
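    The headline comparison in this kind of study reduces to a paired estimated-versus-actual calculation per receipt. A minimal sketch with hypothetical values (not the study's data):

```python
# Hypothetical diner responses: (estimated mg, actual mg) per meal receipt.
meals = [
    (600, 1450), (800, 1900), (1000, 950), (400, 1300),
    (1200, 2600), (300, 1100), (900, 800), (500, 2100),
]

# Positive error = diner underestimated the sodium in the meal.
errors = [actual - estimated for estimated, actual in meals]
underestimators = sum(1 for e in errors if e > 0)

mean_gap = sum(errors) / len(errors)
share = underestimators / len(meals)
print(f"{share:.0%} of diners underestimated; mean gap {mean_gap:.0f} mg")
```

Here 6 of 8 hypothetical diners underestimate, with a mean gap of about 812 mg; the real analysis additionally stratifies these errors by the actual sodium content of the meal.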

  18. Underestimation of Species Richness in Neotropical Frogs Revealed by mtDNA Analyses

    PubMed Central

    Fouquet, Antoine; Gilles, André; Vences, Miguel; Marty, Christian; Blanc, Michel; Gemmell, Neil J.

    2007-01-01

    Background Amphibians are rapidly vanishing. At the same time, it is most likely that the number of amphibian species is highly underestimated. Recent DNA barcoding work has attempted to define a threshold between intra- and inter-specific genetic distances to help identify candidate species. In groups with high extinction rates and poorly known species boundaries, like amphibians, such tools may provide a way to rapidly evaluate species richness. Methodology Here we analyse published and new 16S rDNA sequences from 60 frog species of Amazonia-Guianas to obtain a minimum estimate of the number of undescribed species in this region. We combined isolation by distance, phylogenetic analyses, and comparison of molecular distances to evaluate threshold values for the identification of candidate species among these frogs. Principal Findings In most cases, geographically distant populations belong to genetically highly distinct lineages that could be considered as candidate new species. This was not universal among the taxa studied and thus widespread species of Neotropical frogs really do exist, contrary to previous assumptions. Moreover, the many instances of paraphyly and the wide overlap between distributions of inter- and intra-specific distances reinforce the hypothesis that many cryptic species remain to be described. In our data set, pairwise genetic distances below 0.02 are strongly correlated with geographical distances. This correlation remains statistically significant until genetic distance is 0.05, with no such relation thereafter. This suggests that for higher distances allopatric and sympatric cryptic species prevail. Based on our analyses, we propose a more inclusive pairwise genetic distance of 0.03 between taxa to target lineages that could correspond to candidate species. Conclusions Using this approach, we identify 129 candidate species, two-fold greater than the 60 species included in the current study. This leads to estimates of around 170 to 460
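    Grouping lineages into candidate species with a pairwise genetic-distance threshold, as done above with the 0.03 cutoff, amounts to single-linkage clustering: any two sequences within the threshold are merged into the same candidate species. A self-contained sketch using union-find on hypothetical distances (the data here are illustrative, not the study's 16S matrix):

```python
def candidate_species(n, distances, threshold=0.03):
    """Count candidate species among n lineages by single-linkage
    clustering: lineages within `threshold` pairwise genetic distance
    are merged (union-find). `distances` maps (i, j) pairs to distances."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for (i, j), d in distances.items():
        if d <= threshold:
            parent[find(i)] = find(j)

    return len({find(i) for i in range(n)})

# Five hypothetical lineages: 0-1 and 2-3 are conspecific, 4 is distinct.
d = {(0, 1): 0.01, (0, 2): 0.08, (0, 3): 0.09, (0, 4): 0.12,
     (1, 2): 0.07, (1, 3): 0.08, (1, 4): 0.11,
     (2, 3): 0.02, (2, 4): 0.10, (3, 4): 0.09}
print(candidate_species(5, d))  # 3 candidate species
```

Lowering or raising the threshold changes the count, which is why the choice of 0.03 (calibrated against the breakdown of isolation-by-distance) is central to the 129-candidate-species estimate.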

  19. Neglecting rice milling yield and quality underestimates economic losses from high-temperature stress.

    PubMed

    Lyman, Nathaniel B; Jagadish, Krishna S V; Nalley, L Lanier; Dixon, Bruce L; Siebenmorgen, Terry

    2013-01-01

    Future increases in global surface temperature threaten those worldwide who depend on rice production for their livelihoods and food security. Past analyses of high-temperature stress on rice production have focused on paddy yield and have failed to account for the detrimental impact of high temperatures on milling quality outcomes, which ultimately determine edible (marketable) rice yield and market value. Using genotype specific rice yield and milling quality data on six common rice varieties from Arkansas, USA, combined with on-site, half-hourly and daily temperature observations, we show a nonlinear effect of high-temperature stress exposure on yield and milling quality. A 1 °C increase in average growing season temperature reduces paddy yield by 6.2%, total milled rice yield by 7.1% to 8.0%, head rice yield by 9.0% to 13.8%, and total milling revenue by 8.1% to 11.0%, across genotypes. Our results indicate that failure to account for changes in milling quality leads to understatement of the impacts of high temperatures on rice production outcomes. These dramatic losses result from reduced paddy yield and increased percentages of chalky and broken kernels, which together decrease the quantity and market value of milled rice. Recently published estimates show paddy yield reductions of up to 10% across the major rice-producing regions of South and Southeast Asia due to rising temperatures. The results of our study suggest that the often-cited 10% figure underestimates the economic implications of climate change for rice producers, thus potentially threatening future food security for global rice producers and consumers.
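The economic argument above, that quality losses compound yield losses, can be illustrated with a toy revenue calculation: head rice and broken kernels sell at different prices, so a heat-induced shift toward brokens cuts revenue beyond the paddy yield loss alone. The milled fraction, head rice fractions, and prices below are hypothetical placeholders, not values from the study.

```python
# Sketch: milling revenue from head rice vs. broken kernels.
# All fractions and prices are hypothetical.

def milling_revenue(paddy_kg, head_frac, price_head, price_broken,
                    milled_frac=0.70):
    """Revenue from milled rice, split into head rice and brokens."""
    milled = paddy_kg * milled_frac
    return (milled * head_frac * price_head
            + milled * (1 - head_frac) * price_broken)

base = milling_revenue(1000, head_frac=0.60, price_head=0.50,
                       price_broken=0.25)
# Heat stress: 6.2% less paddy AND a lower head rice fraction
hot = milling_revenue(1000 * (1 - 0.062), head_frac=0.50,
                      price_head=0.50, price_broken=0.25)

loss_pct = 100 * (base - hot) / base
print(round(loss_pct, 1))  # revenue loss exceeds the 6.2% yield loss
```

Even with these made-up numbers, the revenue loss is roughly double the paddy yield loss, which is the mechanism behind the understatement the abstract describes.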

  20. Optical Disector Counting in Cryosections and Vibratome Sections Underestimates Particle Numbers: Effects of Tissue Quality

    PubMed Central

    Ward, Tyson S.; Rosen, Glenn D.; Von Bartheld, Christopher S.

    2013-01-01

    Optical disector counting is currently applied most often to cryosections, followed in frequency by resin-embedded tissues, paraffin, and vibratome sections. The preservation quality of these embedding options differs considerably; yet, the effect of tissue morphology on numerical estimates is unknown. We tested whether different embedding media significantly influence numerical estimates in optical disector counting, using the previously calibrated trochlear motor nucleus of hatchling chickens. Animals were perfusion-fixed with paraformaldehyde (PFA) only or in addition with glutaraldehyde (GA), or by Methacarn immersion fixation. Brains were prepared for paraffin, cryo-, vibratome- or celloidin sectioning. Complete penetration of the thionin stain was verified by z-axis analysis. Neuronal nuclei were counted using an unbiased counting rule, numbers were averaged for each group and compared by ANOVA. In paraffin sections, 906 ± 12 (SEM) neurons were counted, similar to previous calibrated data series, and results obtained from fixation with Methacarn or PFA were statistically indistinguishable. In celloidin sections, 912 ± 28 neurons were counted—not statistically different from paraffin. In cryosections, 812 ± 12 neurons were counted (underestimate of 10.4%) when fixed with PFA only, but 867 ± 17 neurons were counted when fixed with PFA and GA. Vibratome sections had the most serious aberration with 729 ± 31 neurons—a deficit of 20%. Thus, our analysis shows that PFA-fixed cryosections and vibratome sections result in a substantial numerical deficit. The addition of GA to the PFA fixative significantly improved counts in cryosections. These results may explain, in part, the significant numerical differences reported from different labs and should help investigators select optimal conditions for quantitative morphological studies. PMID:17868132

  1. Concomitant spuriously elevated white blood cell count, a previously underestimated phenomenon in EDTA-dependent pseudothrombocytopenia.

    PubMed

    Xiao, Yufei; Xu, Yang

    2015-01-01

    The proportion and potential risk of concomitant spuriously elevated white blood cell count (SEWC) are underestimated in ethylenediaminetetraacetic acid (EDTA)-dependent pseudothrombocytopenia (PTCP). The proportion, kinetics and prevention of SEWC remain poorly understood. A total of 25 patients with EDTA-dependent PTCP were enrolled in this study. With the hematology analyzer Coulter LH 750, we determined the time courses of WBC count, WBC differential and platelet count in EDTA- and sodium citrate-anticoagulated blood, respectively. Blood smears were prepared to inspect the presence of platelet clumps using light microscopy. The effect of automatic instrumental correction on the extent of SEWC was evaluated. The proportion of SEWC was 92% in EDTA-dependent PTCP and 73.9% of SEWCs were within the normal range. The development of SEWC was time-dependent, and neutrophils and lymphocytes were the main subpopulations involved in SEWC. A strong and significant correlation (r = 0.9937, p < 0.001) was found between the increased WBC count and the decreased platelet count. Both corrected and uncorrected WBC counts at 15 minutes or later after blood collection in EDTA were significantly higher than their basal counts (p < 0.05). Interestingly, in citrated blood, WBC counts after blood collection were not significantly different from their basal counts (p > 0.05). A high proportion of concomitant SEWCs, which are mainly within the normal range, are present in patients with EDTA-dependent PTCP. Proper interpretation of SEWC is crucial to avoid clinical errors. SEWC develops in a time-dependent pattern; although the Coulter LH 750 only partly mitigates the extent of SEWC, sodium citrate effectively prevents it.

  2. Lactate minimum underestimates the maximal lactate steady-state in swimming mice.

    PubMed

    Rodrigues, Natalia Almeida; Torsoni, Adriana Souza; Fante, Thais; Dos Reis, Ivan Gustavo Masselli; Gobatto, Claudio Alexandre; Manchado-Gobatto, Fúlvia Barros

    2017-01-01

    The intensity of lactate minimum (LM) has presented a good estimate of the intensity of maximal lactate steady-state (MLSS); however, this relationship has not yet been verified in the mouse model. We proposed validating the LM protocol for swimming mice by investigating the relationship between the intensities of LM and MLSS as well as differences between sexes in terms of aerobic capacity. Nineteen mice (male: 10, female: 9) were submitted to the evaluation protocols for LM and MLSS. The LM protocol consisted of hyperlactatemia induction (30 s exercise (13% body mass (bm)), 30 s resting pause and exhaustive exercise (13% bm), 9 min resting pause and incremental test). The LM underestimated MLSS (all mice: 17.6%; male: 13.5%; female: 21.6%). Pearson's analysis showed a strong correlation between the intensities of MLSS and LM (male (r = 0.67, p = 0.033); female (r = 0.86, p = 0.003)), but without agreement between protocols. The Bland-Altman analysis showed that bias was higher for females (1.5 (0.98) % bm; mean (MLSS and LM): 4.4%-6.4% bm) as compared with males (0.84 (1.24) % bm; mean (MLSS and LM): 4.5%-7.5% bm). The error associated with the estimated intensity for males was low compared with the range of means for MLSS and LM. Therefore, the LM test could be used to determine individual aerobic intensity for males (considering the bias) but not for females. Furthermore, the females supported higher intensities than the males. The differences in body mass between sexes could not explain the higher intensities supported by the females.
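The Bland-Altman bias reported above is simply the mean MLSS minus LM difference, with limits of agreement at bias plus or minus 1.96 standard deviations. A minimal sketch, using hypothetical intensity values in % body mass rather than the study's data:

```python
# Sketch of a Bland-Altman comparison of two intensity measures.
# Intensity values (% body mass) are hypothetical.
from statistics import mean, stdev

def bland_altman(mlss, lm):
    """Return (bias, (lower LoA, upper LoA)) for paired measurements."""
    diffs = [a - b for a, b in zip(mlss, lm)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

mlss = [5.0, 5.5, 6.0, 4.8, 5.2]   # hypothetical MLSS intensities, % bm
lm   = [4.2, 4.6, 4.9, 4.1, 4.4]   # LM systematically underestimates MLSS

bias, (lo, hi) = bland_altman(mlss, lm)
print(round(bias, 2))  # 0.86
```

A positive bias with narrow limits of agreement, as for the males in the study, is what justifies using LM with a correction; a wide spread, as for the females, does not.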

  3. Citation Analysis May Severely Underestimate the Impact of Clinical Research as Compared to Basic Research

    PubMed Central

    van Eck, Nees Jan; Waltman, Ludo; van Raan, Anthony F. J.; Klautz, Robert J. M.; Peul, Wilco C.

    2013-01-01

    Background Citation analysis has become an important tool for research performance assessment in the medical sciences. However, different areas of medical research may have considerably different citation practices, even within the same medical field. Because of this, it is unclear to what extent citation-based bibliometric indicators allow for valid comparisons between research units active in different areas of medical research. Methodology A visualization methodology is introduced that reveals differences in citation practices between medical research areas. The methodology extracts terms from the titles and abstracts of a large collection of publications and uses these terms to visualize the structure of a medical field and to indicate how research areas within this field differ from each other in their average citation impact. Results Visualizations are provided for 32 medical fields, defined based on journal subject categories in the Web of Science database. The analysis focuses on three fields: Cardiac & cardiovascular systems, Clinical neurology, and Surgery. In each of these fields, there turn out to be large differences in citation practices between research areas. Low-impact research areas tend to focus on clinical intervention research, while high-impact research areas are often more oriented on basic and diagnostic research. Conclusions Popular bibliometric indicators, such as the h-index and the impact factor, do not correct for differences in citation practices between medical fields. These indicators therefore cannot be used to make accurate between-field comparisons. More sophisticated bibliometric indicators do correct for field differences but still fail to take into account within-field heterogeneity in citation practices. As a consequence, the citation impact of clinical intervention research may be substantially underestimated in comparison with basic and diagnostic research. PMID:23638064

  4. Comparing Two Epidemiologic Surveillance Methods to Assess Underestimation of Human Stampedes in India

    PubMed Central

    Ngai, Ka Ming; Lee, Wing Yan; Madan, Aditi; Sanyal, Saswata; Roy, Nobhojit; Burkle, Frederick M.; Hsu, Edbert B.

    2013-01-01

    Background: Two separate but complementary epidemiologic surveillance methods for human stampedes have emerged since the first publication on the topic in 2009. The objective of this study is to estimate the degree of underreporting in India. Method: The Ngai Search Method was compared to the Roy Search Method for human stampede events occurring in India between 2001 and 2010. Results: A total of 40 stampedes were identified by both search methods. Using the Ngai method, 34 human stampedes were identified. Using a previously defined stampede scale: 2 events were class I, 21 events were class II, 8 events were class III, and 3 events were class IV. The median deaths were 5.5 per event and median injuries were 13.5 per event. Using the Roy method, 27 events were identified, including 9 events that were not identified by the Ngai method. After excluding events based on exclusion criteria, six additional events identified by the Roy method had a median of 4 deaths and 30 injuries. In multivariate analysis using the Ngai method, religious (6.52, 95%CI 1.73-24.66, p=0.006) and political (277.09, 95%CI 5.12-15,001.96, p=0.006) events had higher relative number of deaths. Conclusion: Many causes accounting for the global increase in human stampede events can only be elucidated through systematic epidemiological investigation. Focusing on a country with a high recurrence of human stampedes, we compare two independent methods of data abstraction in an effort to improve the existing database and to identify pertinent risk factors. We concluded that our previous publication underestimated stampede events in India by approximately 18%, and that an international standardized database to systematically record the occurrence of human stampedes is needed to facilitate understanding of the epidemiology of human stampedes. PMID:24077300

  5. Underestimation of boreal soil carbon stocks by mathematical soil carbon models linked to soil nutrient status

    NASA Astrophysics Data System (ADS)

    Ťupek, Boris; Ortiz, Carina A.; Hashimoto, Shoji; Stendahl, Johan; Dahlgren, Jonas; Karltun, Erik; Lehtonen, Aleksi

    2016-08-01

    An inaccurate estimate of the largest terrestrial carbon pool, the soil organic carbon (SOC) stock, is the major source of uncertainty in simulating the feedback of climate warming on ecosystem-atmosphere carbon dioxide exchange by process-based ecosystem and soil carbon models. Although the models need to simplify complex environmental processes of soil carbon sequestration, in a large mosaic of environments a missing key driver could lead to a modeling bias in predictions of SOC stock change. We aimed to evaluate SOC stock estimates of process-based models (Yasso07, Q, and CENTURY soil sub-model v4) against a massive Swedish forest soil inventory data set (3230 samples) organized by a recursive partitioning method into distinct soil groups with underlying SOC stock development linked to physicochemical conditions. For two-thirds of the measurements, all models predicted accurate SOC stock levels regardless of the detail of input data, e.g., whether they ignored or included soil properties. However, in fertile sites with high N deposition, high cation exchange capacity, or moderately increased soil water content, the Yasso07 and Q models underestimated SOC stocks. In comparison to Yasso07 and Q, accounting for the site-specific soil characteristics (e.g., clay content and topsoil mineral N) by CENTURY improved SOC stock estimates for sites with high clay content, but not for sites with high N deposition. Our analysis suggested that the soils with poorly predicted SOC stocks, as characterized by the high nutrient status and well-sorted parent material, indeed have had other predominant drivers of SOC stabilization lacking in the models, presumably the mycorrhizal organic uptake and organo-mineral stabilization processes. Our results imply that the role of soil nutrient status as regulator of organic matter mineralization has to be re-evaluated, since correct SOC stocks are decisive for predicting future SOC change and soil CO2 efflux.

  6. Noncommunicable disease in rural India: Are we seriously underestimating the risk? The Nallampatti noncommunicable disease study

    PubMed Central

    Swaminathan, Krishnan; Veerasekar, Ganesh; Kuppusamy, Sujatha; Sundaresan, Mohanraj; Velmurugan, Ganesan; Palaniswami, Nalla G.

    2017-01-01

    Aim: To assess the prevalence of noncommunicable diseases in a true rural farming population in South India and compare the data with the landmark contemporary Indian Council of Medical Research-India Diabetes (ICMR-INDIAB) study. Methods: Local Ethics Committee approval and informed consent were obtained from all participants. Inclusion criteria were participants, aged ≥20 and ≤85 years, from Nallampatti, a classical farming village from Tamil Nadu state, India. All participants were administered a detailed questionnaire and had anthropometric measurements taken, including height, weight, and waist circumference. Bloods were drawn for random blood glucose, glycated hemoglobin (HbA1c), nonfasting lipid profile, Cystatin C, uric acid, and hemoglobin. All participants had carotid intima-media thickness (CIMT) done by high-resolution B-mode carotid ultrasound. Results: More than 50% of the population had either diabetes or prediabetes based on HbA1c. Nearly, 40% of the population had hypertension with suboptimal control in those with known hypertension. Nearly, a third of the population had dyslipidemia, elevated cystatin C levels, and abnormal CIMT. The burden was higher than the comparable ICMR-INDIAB study in rural Tamil Nadu. Conclusion: One-third to one-half of this rural farming population is at risk of cardiovascular disease, with poor control of preexisting cardiovascular risk factors. Current Indian data may underestimate the risk in different ethnic populations and regions of India. Long-term follow-up of this cohort for the incident cardiovascular disease will shed light on the true cardiovascular risk in a typical South Indian rural farming population. PMID:28217505

  7. Ceramic materials lead to underestimated DNA quantifications: a method for reliable measurements.

    PubMed

    Piccinini, E; Sadr, N; Martin, I

    2010-07-22

    In the context of investigating cell-material interactions or of material-guided generation of tissues, DNA quantification represents an elective method to precisely assess the number of cells attached or embedded within different substrates. Nonetheless, nucleic acids are known to electrostatically bind to ceramics, a class of materials commonly employed in orthopaedic implants and bone tissue engineering scaffolds. This phenomenon is expected to lead to a relevant underestimation of the DNA amount, resulting in erroneous experimental readouts. The present work aims at (i) investigating the effects of DNA-ceramic bond occurrence on DNA quantification, and (ii) developing a method to reliably extract and accurately quantify DNA in ceramic-containing specimens. A cell-free model was adopted to study DNA-ceramic binding, highlighting an evident DNA loss (up to 90%) over a wide range of DNA/ceramic ratios (w/w). A phosphate buffer-based (800 mM) enzymatic extraction protocol was developed and its efficacy in terms of reliable DNA extraction and measurement was confirmed with commonly used fluorometric assays, for various ceramic substrates. The proposed buffered DNA extraction technique was validated in a cell seeding experiment, showing 95% DNA retrieval and a 3.5-fold increase in measured DNA amount as compared to a conventional enzymatic extraction protocol. In conclusion, the proposed phosphate buffer method consistently improves the DNA extraction process, ensuring unbiased analysis of samples and allowing accurate and sensitive cell number quantification on ceramic-containing substrates.

  8. The development, validation and application of a multi-detector CT (MDCT) scanner model for assessing organ doses to the pregnant patient and the fetus using Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Gu, J.; Bednarz, B.; Caracappa, P. F.; Xu, X. G.

    2009-05-01

    The latest multiple-detector technologies have further increased the popularity of x-ray CT as a diagnostic imaging modality. There is a continuing need to assess the potential radiation risk associated with such rapidly evolving multi-detector CT (MDCT) modalities and scanning protocols. This need can be met by the use of CT source models that are integrated with patient computational phantoms for organ dose calculations. To this end, this work developed and validated an MDCT scanner model using the Monte Carlo method, and integrated pregnant patient phantoms into the scanner model to assess the dose to the fetus as well as doses to the organs and tissues of the pregnant patient. A Monte Carlo code, MCNPX, was used to simulate the x-ray source including the energy spectrum, filter and scan trajectory. Detailed CT scanner components were specified using an iterative trial-and-error procedure for a GE LightSpeed CT scanner. The scanner model was validated by comparing simulated results against measured CTDI values and dose profiles reported in the literature. The source movement along the helical trajectory was simulated using pitches of 0.9375 and 1.375. The validated scanner model was then integrated with phantoms of a pregnant patient in three different gestational periods to calculate organ doses. It was found that the dose to the fetus of the 3 month pregnant patient phantom was 0.13 mGy/100 mAs and 0.57 mGy/100 mAs from the chest and kidney scan, respectively. For the chest scan of the 6 month patient phantom and the 9 month patient phantom, the fetal doses were 0.21 mGy/100 mAs and 0.26 mGy/100 mAs, respectively. The paper also discusses how these fetal dose values can be used to evaluate imaging procedures and to assess risk using recommendations of the report from AAPM Task Group 36.
This work demonstrates the ability of modeling and validating an MDCT scanner by the Monte Carlo method, as well as
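The per-100 mAs fetal dose coefficients reported above scale linearly with the tube current-time product of a given protocol. A minimal sketch; the 200 mAs protocol value is hypothetical, while the coefficients are the chest scan values from the abstract:

```python
# Sketch: scaling simulated fetal dose coefficients (mGy per 100 mAs,
# chest scan, from the abstract above) to a protocol's mAs.
# The 200 mAs example protocol is hypothetical.
FETAL_DOSE_PER_100MAS = {
    "3 months": 0.13,
    "6 months": 0.21,
    "9 months": 0.26,
}

def fetal_dose(stage, mas):
    """Fetal dose in mGy for a chest scan at the given mAs."""
    return FETAL_DOSE_PER_100MAS[stage] * mas / 100.0

print(round(fetal_dose("9 months", 200), 2))  # 0.52
```

This kind of linear scaling is how Monte Carlo dose coefficients are typically applied to specific imaging procedures when assessing risk.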

  10. Disguised Distress in Children and Adolescents "Flying under the Radar": Why Psychological Problems Are Underestimated and How Schools Must Respond

    ERIC Educational Resources Information Center

    Flett, Gordon L.; Hewitt, Paul L.

    2013-01-01

    It is now recognized that there is a very high prevalence of psychological disorders among children and adolescents and relatively few receive psychological treatment. In the current article, we present the argument that levels of distress and dysfunction among young people are substantially underestimated and the prevalence of psychological…

  11. Underestimation of soil carbon stocks by Yasso07, Q, and CENTURY models in boreal forest linked to overlooking site fertility

    NASA Astrophysics Data System (ADS)

    Ťupek, Boris; Ortiz, Carina; Hashimoto, Shoji; Stendahl, Johan; Dahlgren, Jonas; Karltun, Erik; Lehtonen, Aleksi

    2016-04-01

    The soil organic carbon (SOC) stock changes estimated by most process-based soil carbon models (e.g., Yasso07, Q and CENTURY), which are needed for reporting changes in soil carbon amounts to the United Nations Framework Convention on Climate Change (UNFCCC) and for mitigating anthropogenic CO2 emissions through soil carbon management, can be biased if, in a large mosaic of environments, the models are missing a key factor driving SOC sequestration. To our knowledge, soil nutrient status had not been tested as a missing driver of these models in previous studies, although it is known that models fail to reconstruct spatial variation and that soil nutrient status drives ecosystem carbon use efficiency and soil carbon sequestration. We evaluated SOC stock estimates of the Yasso07, Q and CENTURY process-based models against field data from the Swedish Forest Soil National Inventories (3230 samples), organized by a recursive partitioning method (RPART) into distinct soil groups with underlying SOC stock development linked to physicochemical conditions. These models worked for most soils with approximately average SOC stocks, but could not reproduce the higher measured SOC stocks in our application. The Yasso07 and Q models, which used only climate and litterfall input data and ignored soil properties, generally agreed with two-thirds of the measurements. However, when the measurements were grouped along the gradient of soil nutrient status, we found that the models underestimated SOC stocks for the Swedish boreal forest soils with higher site fertility. Accounting for soil texture (clay, silt, and sand content) and structure (bulk density) in the CENTURY model showed no improvement in carbon stock estimates, as CENTURY deviated in a similar manner. We highlight the mechanisms by which the models deviate from the measurements and ways of considering soil nutrient status in further model development. Our analysis suggested that the models indeed lack other predominant drivers of SOC stabilization

  12. The underestimated challenges of burst-mode WDM transmission in TWDM-PON

    NASA Astrophysics Data System (ADS)

    Bonk, R.; Poehlmann, W.; van Veen, D.; Galaro, J.; Farah, R.; Schmuck, H.; Pfeiffer, Th.

    2015-12-01

    In this paper, the underestimated challenges of burst-mode operation in the upstream path of time-and-wavelength-division-multiplexed passive optical networks are analyzed. Various challenges are identified, the influence of the associated physical effects on signal quality is discussed, and mitigation proposals are described. Intra-channel cross-talk can arise from background amplified-spontaneous-emission noise induced by optical network units, and inter-channel cross-talk can be caused by non-ideal filter suppression between wavelength channels at the optical line termination receivers. Such cross-talk effects can be counteracted by simultaneously reducing the laser burst-signal output power and its rival noise power, i.e., by applying power leveling. A fast frequency drift in burst-mode operation is inherent to directly modulated lasers, but depends on the specific laser design, the laser output power and the burst length. Mitigation mechanisms are an optimized non-standard laser design or a specific mode of operation, e.g., an increase of the frequency drift during the preamble of the burst. The power dynamic range that an optical pre-amplifier, as part of the upstream signal receiver, needs to handle can be up to 40 dB, because of multi-wavelength channel operation and the differential path loss of the optical distribution network. This power dynamic range at the receiver can cause challenges not only for the optical amplifier but also for the burst-mode receivers. Additionally, the physical-layer operation and maintenance of the upstream path is also challenging. Each optical network unit laser needs to be either wavelength pre-calibrated, which adds undesired cost, or cross-channel synchronization of ranging windows has to be ensured. Otherwise, rogue optical network unit behavior in the wavelength and time domains will deteriorate system performance with every new optical network unit entering the network. Further, the operation of the optical

  13. Predicted Values for Spirometry may Underestimate Long-Standing Asthma Severity

    PubMed Central

    Sposato, Bruno

    2016-01-01

    Background: Asthma may show an accelerated lung function decline. Asthmatics, although having FEV1 and FEV1/VC (and z-scores) higher than the lower limit of normality, may show a significant FEV1 decline when compared to previous measurements. We assessed how many asymptomatic long-standing asthmatics (LSA) with normal lung function showed a significant FEV1 decline when an older FEV1 was taken as the reference point. Methods: 46 well-controlled LSA (age: 48.8±12.1; 23 females) with normal FEV1 and FEV1/VC according to GLI2012 references (FEV1: 94.8±10.1%, z-score: -0.38±0.79; FEV1/VC: 79.3±5.2, z-score: -0.15±0.77) were selected. We considered FEV1 decline, calculated by comparing the latest value to one at least five years older or to the highest predicted value measured at 21 years for females and 23 for males. A FEV1 decline >15% or >30 ml/year was regarded as pathological. Results: When comparing the latest FEV1 to an at least 5-year-older one (mean 8.1±1.4 years between 2 measurements), 14 subjects (30.4%) showed a FEV1 decline <5% (mean: -2.2±2.6%), 19 (41.3%) had a FEV1 5-15% change (mean: -9.2±2.5%) and 13 (28.3%) a FEV1 decrease >15% (mean: -18.3±2.4%). Subjects with a FEV1 decline >30 ml/year were 28 (60.8%). When using the highest predicted FEV1 as reference point and declines were corrected by subtracting the physiological decrease, 6 (13%) patients showed a FEV1 decline higher than 15%, whereas asthmatics with a FEV1 loss >30 ml/year were 17 (37%). Conclusion: Calculating FEV1 decline may show how severe asthma actually is, avoiding underestimation of bronchial obstruction and possible under-treatment in many apparently “well-controlled” LSA with GLI2012-normal-range lung function values. PMID:28144365
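The decline criteria described above reduce to simple arithmetic: flag a loss exceeding 15% of the reference FEV1 or 30 ml per year of follow-up. A minimal sketch with hypothetical volumes in ml:

```python
# Sketch of the FEV1 decline criteria from the abstract above:
# pathological if loss >15% of the reference value or >30 ml/year.
# Volumes (ml) and follow-up interval are hypothetical.

def fev1_decline_pathological(fev1_ref, fev1_now, years):
    """True if the FEV1 loss exceeds either threshold."""
    loss = fev1_ref - fev1_now
    loss_pct = 100.0 * loss / fev1_ref
    loss_ml_per_year = loss / years
    return loss_pct > 15.0 or loss_ml_per_year > 30.0

# 400 ml lost over 8 years is only ~11% but 50 ml/year: pathological
print(fev1_decline_pathological(3600, 3200, 8))  # True
# 100 ml over 8 years (12.5 ml/year, ~2.8%): not pathological
print(fev1_decline_pathological(3600, 3500, 8))  # False
```

The first example shows the abstract's point: a patient can stay inside the GLI2012 normal range yet still meet the ml/year criterion.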

  14. Pesticide Mixtures, Endocrine Disruption, and Amphibian Declines: Are We Underestimating the Impact?

    PubMed Central

    Hayes, Tyrone B.; Case, Paola; Chui, Sarah; Chung, Duc; Haeffele, Cathryn; Haston, Kelly; Lee, Melissa; Mai, Vien Phoung; Marjuoa, Youssra; Parker, John; Tsui, Mable

    2006-01-01

    increase in plasma levels of the stress hormone corticosterone. Although it cannot be determined whether all the pesticides in the mixture contribute to these adverse effects or whether some pesticides are effectors, some are enhancers, and some are neutral, the present study revealed that estimating ecological risk and the impact of pesticides on amphibians using studies that examine only single pesticides at high concentrations may lead to gross underestimations of the role of pesticides in amphibian declines. PMID:16818245

  15. Underestimated public health risks caused by overestimated VOC removal in wastewater treatment processes.

    PubMed

    Yang, Junchen; Wang, Kun; Zhao, Qingliang; Huang, Likun; Yuan, Chung-Shin; Chen, Wei-Hsiang; Yang, Wen-Bin

    2014-02-01

    The uncontrolled release of volatile organic compounds (VOCs) from wastewater treatment plants (WWTPs) and the adverse health effects on the public have been of increasing concern. In this study, a lab-scale bioreactor was prepared to analyze the mass distribution of three aromatic (benzene, toluene, and xylenes) and four chlorinated VOCs (chloroform, carbon tetrachloride, trichloroethylene, and tetrachloroethylene) among the air, water and sludge phases in wastewater treatment processes. The VOC distribution through a full-scale WWTP in northern China was further investigated with respect to the effects of seasonal temperature variations and treatment technologies, followed by a cancer risk assessment using a steady-state Gaussian plume model (Industrial Source Complex) to simulate the atmospheric behaviors of the VOCs emitted from the WWTP. It was found that the three aromatic hydrocarbons, notably benzene, were more readily released from the wastewater into the atmosphere, whereas the chlorinated compounds except chloroform were mainly present in the water phase through the treatment processes. The primary clarifier was the treatment unit releasing the highest levels of VOCs into the atmosphere from the wastewater. The extents of volatilization or biodegradation, two important mechanisms to remove VOCs from wastewater, appeared to be determined by the physicochemical characteristics of the compounds, as the influence of treatment technologies (e.g., aeration) and seasonal temperature variations was rather limited. More importantly, people living in areas more than 4 km away from the WWTP were still potentially exposed to cancer risks exceeding the regulatory threshold. The findings described the complex nature of VOC emissions from WWTPs and quantitatively indicated that the associated health impacts on the public near the WWTPs could be severely underestimated, whereas their removal efficiencies in wastewater treatment processes were overestimated.
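The Industrial Source Complex model mentioned above is a steady-state Gaussian plume model. A textbook ground-level plume concentration, not the ISC implementation itself, can be sketched as follows; the emission rate, wind speed, stack height, and dispersion coefficients are hypothetical placeholders, not values from the study:

```python
# Sketch of a ground-level Gaussian plume concentration with ground
# reflection, the kind of calculation underlying steady-state plume
# models such as ISC. All input values are hypothetical.
import math

def plume_conc(q, u, sigma_y, sigma_z, y, h):
    """Ground-level concentration (g/m^3) at crosswind offset y (m) for
    emission rate q (g/s), wind speed u (m/s), effective release height
    h (m), and dispersion parameters sigma_y, sigma_z (m)."""
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * math.exp(-h**2 / (2 * sigma_z**2)))

# Hypothetical: 1 g/s source, 3 m/s wind, dispersion at a few km downwind
c = plume_conc(q=1.0, u=3.0, sigma_y=300.0, sigma_z=150.0, y=0.0, h=10.0)
print(f"{c:.2e} g/m^3")
```

In a risk assessment, concentrations like this are combined with exposure duration and a unit risk factor for each VOC to estimate the cancer risk at each receptor location.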

  16. Large-scale movements in European badgers: has the tail of the movement kernel been underestimated?

    PubMed

    Byrne, Andrew W; Quinn, John L; O'Keeffe, James J; Green, Stuart; Sleeman, D Paddy; Martin, S Wayne; Davenport, John

    2014-07-01

    movement distribution is currently underestimated. The implications of this for understanding the spatial ecology of badger populations and for the design of disease intervention strategies are potentially significant.

  17. Cachexia as a major underestimated and unmet medical need: facts and numbers.

    PubMed

    von Haehling, Stephan; Anker, Stefan D

    2010-09-01

    Cachexia is a serious, yet underestimated and under-recognised, medical consequence of malignant cancer, chronic heart failure (CHF), chronic kidney disease (CKD), chronic obstructive pulmonary disease (COPD), cystic fibrosis, rheumatoid arthritis, Alzheimer's disease, infectious diseases, and many other chronic illnesses. The prevalence of cachexia is high, ranging from 5% to 15% in CHF or COPD to 60% to 80% in advanced cancer. By population prevalence, the most frequent cachexia subtypes are, in order: COPD cachexia, cardiac cachexia (in CHF), cancer cachexia, and CKD cachexia. In industrialized countries (North America, Europe, Japan), the overall prevalence of cachexia (due to any disease) is growing and is currently about 1%, i.e., about nine million patients. The relative prevalence of cachexia is somewhat lower in Asia, but it is a growing problem there as well. In absolute terms, cachexia is, in Asia (due to the larger population), at least as big a problem as in the Western world. Cachexia is also a major medical problem in South America and Africa, but data are scarce. A recent consensus statement proposed diagnosing cachexia in chronic diseases when there is weight loss exceeding 5% within the previous 3-12 months, combined with symptoms characteristic of cachexia (e.g., fatigue), loss of skeletal muscle, and biochemical abnormalities (e.g., anemia or inflammation). Treatment approaches using anabolics, anti-catabolic therapies, appetite stimulants, and nutritional interventions are under development. A more thorough understanding of the pathophysiology of cachexia development and progression is needed, which will likely lead to combination therapies being developed. These efforts are greatly needed, as the presence of cachexia is always associated with high mortality, poor symptom status, and dismal quality of life. It is thought that in cancer, more than 30% of patients die due to cachexia, and more than 50% of patients with cancer die with cachexia present.

  18. System wide analyses have underestimated protein abundances and the importance of transcription in mammals.

    PubMed

    Li, Jingyi Jessica; Bickel, Peter J; Biggin, Mark D

    2014-01-01

    Large scale surveys in mammalian tissue culture cells suggest that the protein expressed at the median abundance is present at 8,000-16,000 molecules per cell and that differences in mRNA expression between genes explain only 10-40% of the differences in protein levels. We find, however, that these surveys have significantly underestimated protein abundances and the relative importance of transcription. Using individual measurements for 61 housekeeping proteins to rescale whole proteome data from Schwanhausser et al. (2011), we find that the median protein detected is expressed at 170,000 molecules per cell and that our corrected protein abundance estimates show a higher correlation with mRNA abundances than do the uncorrected protein data. In addition, we estimated the impact of further errors in mRNA and protein abundances using direct experimental measurements of these errors. The resulting analysis suggests that mRNA levels explain at least 56% of the differences in protein abundance for the 4,212 genes detected by Schwanhausser et al. (2011), though because one major source of error could not be estimated the true percent contribution should be higher. We also employed a second, independent strategy to determine the contribution of mRNA levels to protein expression. We show that the variance in translation rates directly measured by ribosome profiling is only 12% of that inferred by Schwanhausser et al. (2011), and that the measured and inferred translation rates correlate poorly (R(2) = 0.13). Based on this, our second strategy suggests that mRNA levels explain ∼81% of the variance in protein levels. We also determined the percent contributions of transcription, RNA degradation, translation and protein degradation to the variance in protein abundances using both of our strategies. While the magnitudes of the two estimates vary, they both suggest that transcription plays a more important role than the earlier studies implied and translation a much smaller

  19. Black carbon in the Arctic: the underestimated role of gas flaring and residential combustion emissions

    NASA Astrophysics Data System (ADS)

    Stohl, A.; Klimont, Z.; Eckhardt, S.; Kupiainen, K.; Shevchenko, V. P.; Kopeikin, V. M.; Novigatsky, A. N.

    2013-09-01

    BC surface concentrations due to residential combustion by 68% when using daily emissions. A large part (93%) of this systematic increase can be captured also when using monthly emissions; the increase is compensated by a decreased BC burden at lower latitudes. In a comparison with BC measurements at six Arctic stations, we find that using daily-varying residential combustion emissions and introducing gas flaring emissions leads to large improvements of the simulated Arctic BC, both in terms of mean concentration levels and simulated seasonality. Case studies based on BC and carbon monoxide (CO) measurements from the Zeppelin observatory appear to confirm flaring as an important BC source that can produce pollution plumes in the Arctic with a high BC / CO enhancement ratio, as expected for this source type. BC measurements taken during a research ship cruise in the White, Barents and Kara seas north of the region with strong flaring emissions reveal very high concentrations of the order of 200-400 ng m-3. The model underestimates these concentrations substantially, which indicates that the flaring emissions (and probably also other emissions in northern Siberia) are rather under- than overestimated in our emission data set. Our results suggest that it may not be "vertical transport that is too strong or scavenging rates that are too low" and "opposite biases in these processes" in the Arctic and elsewhere in current aerosol models, as suggested in a recent review article (Bond et al., Bounding the role of black carbon in the climate system: a scientific assessment, J. Geophys. Res., 2013), but missing emission sources and lacking time resolution of the emission data that are causing opposite model biases in simulated BC concentrations in the Arctic and in the mid-latitudes.

  20. Impact of underestimating the effects of cold temperature on motor vehicle start emissions of air toxics in the United States.

    PubMed

    Cook, Richard; Touma, Jawad S; Fernandez, Antonio; Brzezinski, David; Bailey, Chad; Scarbro, Carl; Thurman, James; Strum, Madeleine; Ensley, Darrell; Baldauf, Richard

    2007-12-01

    Analyses of U.S. Environmental Protection Agency (EPA) certification data, California Air Resources Board surveillance testing data, and EPA research testing data indicated that EPA's MOBILE6.2 emission factor model substantially underestimates emissions of gaseous air toxics occurring during vehicle starts at cold temperatures for light-duty vehicles and trucks meeting EPA Tier 1 and later standards. An unofficial version of the MOBILE6.2 model was created to account for these underestimates. When this unofficial version of the model was used to project emissions into the future, emissions increased by almost 100% by calendar year 2030, and estimated modeled ambient air toxics concentrations increased by 6-84%, depending on the pollutant. To address these elevated emissions, EPA recently finalized standards requiring reductions of emissions when engines start at cold temperatures.

  1. A new approach to the assessment of lumen visibility of coronary artery stent at various heart rates using 64-slice MDCT

    PubMed Central

    Groen, J. M.; van Ooijen, P. M. A.; Oudkerk, M.

    2007-01-01

    Coronary artery stent lumen visibility was assessed as a function of cardiac movement and temporal resolution with an automated objective method using an anthropomorphic moving heart phantom. Nine different coronary stents filled with contrast fluid and surrounded by fat were scanned using 64-slice multi-detector computed tomography (MDCT) at 50–100 beats/min with the moving heart phantom. Image quality was assessed by measuring in-stent CT attenuation and by a dedicated tool in the longitudinal and axial plane. Images were scored by CT attenuation and lumen visibility and compared with theoretical scoring to analyse the effect of multi-segment reconstruction (MSR). An average increase in CT attenuation of 144 ± 59 HU and average diminished lumen visibility of 29 ± 12% was observed at higher heart rates in both planes. A negative correlation between image quality and heart rate was non-significant for the majority of measurements (P > 0.06). No improvement of image quality was observed in using MSR. In conclusion, in-stent CT attenuation increases and lumen visibility decreases at increasing heart rate. Results obtained with the automated tool show similar behaviour compared with attenuation measurements. Cardiac movement during data acquisition causes approximately twice as much blurring compared with the influence of temporal resolution on image quality. Electronic supplementary material The online version of this article (doi:10.1007/s00330-007-0568-8) contains supplementary material, which is available to authorized users. PMID:17429648

  2. Effects of underestimating the kinematics of trunk rotation on simultaneous reaching movements: predictions of a biomechanical model

    PubMed Central

    2013-01-01

    Background Rotation of the torso while reaching produces torques (e.g., Coriolis torque) that deviate the arm from its planned trajectory. To ensure an accurate reaching movement, the brain may take these perturbing torques into account during movement planning or, alternatively, it may correct hand trajectory during movement execution. Irrespective of the process selected, an underestimation of trunk rotation would likely induce inaccurate shoulder and elbow torques, resulting in hand deviation. Nonetheless, it is still undetermined to what extent a small error in the perception of trunk rotations, translating into an inappropriate selection of motor commands, would affect reaching accuracy. Methods To investigate this, we adapted a biomechanical model (J Neurophysiol 89: 276-289, 2003) to predict the consequences of underestimating trunk rotations on right hand reaching movements performed during either clockwise or counterclockwise torso rotations. Results The results revealed that regardless of the degree to which the torso rotation was underestimated, the amplitude of hand deviation was much larger for counterclockwise rotations than for clockwise rotations. This was attributed to the fact that the Coriolis and centripetal joint torques were acting in the same direction during counterclockwise rotation yet in opposite directions during clockwise rotations, effectively cancelling each other out. Conclusions These findings suggest that in order to anticipate and compensate for the interaction torques generated during torso rotation while reaching, the brain must have an accurate prediction of torso rotation kinematics. The present study proposes that when designing upper limb prostheses controllers, adding a sensor to monitor trunk kinematics may improve prostheses control and performance. PMID:23758968
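
    The interaction torques discussed above stem from the Coriolis term a_C = -2 ω × v experienced in the rotating trunk frame. A minimal sketch (toy vectors, not the paper's full biomechanical model) showing why reversing the rotation direction flips the perturbation:

    ```python
    def cross(a, b):
        """Cross product of two 3-vectors given as tuples."""
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def coriolis_accel(omega, v):
        """Coriolis acceleration a_C = -2 * (omega x v) in the rotating (trunk) frame.

        omega: trunk angular velocity (rad/s); v: hand velocity in the trunk frame (m/s).
        """
        return tuple(-2.0 * c for c in cross(omega, v))
    ```

    With ω along +z and the hand moving along +x, the Coriolis acceleration points along -y; negating ω flips it to +y, consistent with the deviation depending on the direction of torso rotation.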

  3. Do you know how I feel? Parents underestimate worry and overestimate optimism compared to child self-report.

    PubMed

    Lagattuta, Kristin Hansen; Sayfan, Liat; Bamford, Christi

    2012-10-01

    Three studies assessed parent-child agreement in perceptions of children's everyday emotions in typically developing 4- to 11-year-old children. Study 1 (N=228) and Study 2 (N=195) focused on children's worry and anxiety. Study 3 (N=90) examined children's optimism. Despite child and parent reporters providing internally consistent responses, their perceptions about children's emotional wellbeing consistently failed to correlate. Parents significantly underestimated child worry and anxiety and overestimated optimism compared to child self-report (suggesting a parental positivity bias). Moreover, parents' self-reported emotions correlated with how they reported their children's emotions (suggesting an egocentric bias). These findings have implications for developmental researchers, clinicians, and parents.

  4. [Underestimation of dermatology based on ignorance and its impact on patient's health].

    PubMed

    Fuentes-Suárez, Adán; Domínguez-Soto, Luciano

    2015-01-01

    With the emergence of medical specialties in different areas of medicine, the assessment of patients became narrow and specialized. There is also the perception that some specialties are more difficult than others. Dermatology has long been seen by most non-dermatologist physicians as a relaxed field, without real emergencies or the requirement of great intellectual effort. Some specialists erroneously think that everything can be cured with topical steroids and/or antifungal creams. Although several skin diseases are common complaints seen by the general practitioner, very little time and few credits are granted to cover these diseases during undergraduate training. Thus, primary care physicians and other medical specialists believe that skin diseases are not life-threatening and hence irrelevant. Nonetheless, they feel competent enough to prescribe a variety of treatments for skin diseases, which may lead to iatrogenesis.

  5. Umbilical Pilonidal Sinus, an Underestimated and Little-Known Clinical Entity: Report of Two Cases

    PubMed Central

    Kaplan, Mehmet; Kaplan, Elif Tugce; Kaplan, Tugba; Kaplan, Fatma Cigdem

    2017-01-01

    Case series Patient: Male, 26 • Female, 21 Final Diagnosis: Umbilical pilonidal sinus Symptoms: Hair tuft in the umbilicus • pain • periumbilical dermatitis • purulent discharge from the umbilicus • skin lesions • pruritus Medication: — Clinical Procedure: Umbilicus-preserving surgery Specialty: General Surgery • Dermatology • Plastic Surgery Objective: Rare disease Background: Umbilical pilonidal sinus (UPS) is a rare disease of young, hirsute, dark men with deep navels and poor personal hygiene. UPS can easily be misdiagnosed and mistreated due to its rarity and a lack of awareness of the condition among physicians. However, the diagnosis is easy to establish with a physical examination and a detailed history. Although it is being diagnosed and reported more frequently, there is still no consensus regarding the best treatment options. Case Report: In this report, we present two cases of UPS, one in a man and one in a woman, who had typical symptoms of pain, swelling, and intermittent malodorous discharge from the umbilicus. They had small sinus openings with hair protruding deep in the navel. Because these two patients had previous histories of failed conservative treatment, umbilicus-preserving surgery was performed in both cases. Wounds healed in 2–3 weeks with acceptable cosmetic results. During a follow-up period of more than 2 years, there were no signs of recurrence. Conclusions: In a patient presenting with a history of intermittent discharge, itching, pain, or bleeding from the umbilicus and the presence of granulation tissue with or without protruding hair and periumbilical dermatitis, UPS should be considered, even in female patients. Treatment generally depends on the severity of the disease, ranging from good personal hygiene to surgical excision of the umbilical complex. The treatment of choice for chronic intermittent cases is surgical removal of the affected portion, with special attention paid to cosmetic appearance.

  6. Radiofrequency Ablation of Liver Metastases-Software-Assisted Evaluation of the Ablation Zone in MDCT: Tumor-Free Follow-Up Versus Local Recurrent Disease

    SciTech Connect

    Keil, Sebastian Bruners, Philipp; Schiffl, Katharina; Sedlmair, Martin; Muehlenbruch, Georg; Guenther, Rolf W.; Das, Marco; Mahnken, Andreas H.

    2010-04-15

    The purpose of this study was to investigate differences in change of size and CT value between local recurrences and tumor-free areas after CT-guided radiofrequency ablation (RFA) of hepatic metastases during follow-up by means of dedicated software for automatic evaluation of hepatic lesions. Thirty-two patients with 54 liver metastases from breast or colorectal cancer underwent triphasic contrast-enhanced multidetector-row computed tomography (MDCT) to evaluate hepatic metastatic spread and localization before CT-guided RFA and for follow-up after intervention. Sixteen of these patients (65.1 ± 10.3 years) with 30 metastases stayed tumor-free (group 1), while the other group (n = 16 with 24 metastases; 62.0 ± 13.8 years) suffered from local recurrent disease (group 2). Applying an automated software tool (SyngoCT Oncology; Siemens Healthcare, Forchheim, Germany), size parameters (volume, RECIST, WHO) and attenuation were measured within the lesions before, 1 day after, and 28 days after RFA treatment. The natural logarithm (ln) of the quotient of the volume 1 day versus 28 days after RFA treatment was computed: lnQ1/28/0(volume). Analogously, ln ratios of RECIST, WHO, and attenuation were computed and statistically evaluated by repeated-measures ANOVA. One lesion in group 2 was excluded from further evaluation due to automated missegmentation. Statistically significant differences between the two groups were observed with respect to initial volume, RECIST, and WHO (p < 0.05). Furthermore, ln ratios corresponding to volume, RECIST, and WHO differed significantly between the two groups. Attenuation evaluations showed no significant differences, but there was a trend for the parameter lnQ28/0(attenuation) (p = 0.0527), showing higher values for group 1 (-0.4 ± 0.3) compared to group 2 (-0.2 ± 0.2). In conclusion, hepatic metastases and their zone of coagulation necrosis after RFA differed significantly between tumor
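
    The lnQ size metrics above are natural-log ratios of paired measurements; a minimal sketch with hypothetical lesion volumes (not values from the study):

    ```python
    import math

    def ln_ratio(numerator, denominator):
        """Natural-log ratio of two paired measurements,
        e.g. lesion volume at two follow-up time points."""
        return math.log(numerator / denominator)

    # Hypothetical ablation-zone volumes: 20 mL at day 1 vs 25 mL at day 28.
    # A shrinking zone yields a negative ln ratio; growth yields a positive one.
    shrinkage = ln_ratio(20.0, 25.0)
    ```

    Using log ratios symmetrizes growth and shrinkage around zero, which is what makes them suitable as inputs to the repeated-measures ANOVA described above.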

  7. Simultaneous screening for osteoporosis at CT colonography: bone mineral density assessment using MDCT attenuation techniques compared with the DXA reference standard.

    PubMed

    Pickhardt, Perry J; Lee, Lawrence J; del Rio, Alejandro Muñoz; Lauder, Travis; Bruce, Richard J; Summers, Ron M; Pooler, B Dustin; Binkley, Neil

    2011-09-01

    The purpose of this study was to evaluate the utility of lumbar spine attenuation measurement for bone mineral density (BMD) assessment at screening computed tomographic colonography (CTC) using central dual-energy X-ray absorptiometry (DXA) as the reference standard. Two-hundred and fifty-two adults (240 women and 12 men; mean age 58.9 years) underwent CTC screening and central DXA BMD measurement within 2 months (mean interval 25.0 days). The lowest DXA T-score between the spine and hip served as the reference standard, with low BMD defined per World Health Organization as osteoporosis (DXA T-score ≤ -2.5) or osteopenia (DXA T-score between -1.0 and -2.4). Both phantomless quantitative computed tomography (QCT) and simple nonangled region-of-interest (ROI) multi-detector CT (MDCT) attenuation measurements were applied to the T12-L5 levels. The ability to predict osteoporosis and low BMD (osteoporosis or osteopenia) by DXA was assessed. A BMD cut-off of 90 mg/mL at phantomless QCT yielded 100% sensitivity for osteoporosis (29 of 29) and a specificity of 63.8% (143 of 224); 87.2% (96 of 110) below this threshold had low BMD and 49.6% (69 of 139) above this threshold had normal BMD at DXA. At L1, a trabecular ROI attenuation cut-off of 160 HU was 100% sensitive for osteoporosis (29 of 29), with a specificity of 46.4% (104 of 224); 83.9% (125 of 149) below this threshold had low BMD and 57.5% (59 of 103) above had normal BMD at DXA. ROI performance was similar at all individual T12-L5 levels. At ROC analysis, AUC for osteoporosis was 0.888 for phantomless QCT [95% confidence interval (CI) 0.780-0.946] and ranged from 0.825 to 0.853 using trabecular ROIs at single lumbar levels (0.864; 95% CI 0.752-0.930 at multivariate analysis). Supine-prone reproducibility was better with the simple ROI method compared with QCT. It is concluded that both phantomless QCT and simple ROI attenuation measurements of the lumbar spine are effective for BMD screening at CTC
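
    The cut-off performance figures above follow from the standard 2×2 screening definitions; a small sketch using the counts quoted in the abstract (29 of 29 flagged, 143 of 224 correctly passed at the 90 mg/mL QCT cut-off):

    ```python
    def sensitivity(true_pos, false_neg):
        """Fraction of diseased cases correctly flagged by the screening cut-off."""
        return true_pos / (true_pos + false_neg)

    def specificity(true_neg, false_pos):
        """Fraction of disease-free cases correctly passed by the cut-off."""
        return true_neg / (true_neg + false_pos)

    # 90 mg/mL QCT cut-off: all 29 osteoporotic patients flagged (no false negatives),
    # 143 of the 224 non-osteoporotic patients passed (81 false positives).
    qct_sens = sensitivity(29, 0)       # 29/29 = 1.00
    qct_spec = specificity(143, 81)     # 143/224 ≈ 0.638
    ```

    Lowering a cut-off chosen for 100% sensitivity, as both thresholds here were, trades away specificity, which is why the ROI cut-off at 160 HU passes fewer disease-free patients (46.4%) than the QCT cut-off (63.8%).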

  8. Radiofrequency ablation of liver metastases-software-assisted evaluation of the ablation zone in MDCT: tumor-free follow-up versus local recurrent disease.

    PubMed

    Keil, Sebastian; Bruners, Philipp; Schiffl, Katharina; Sedlmair, Martin; Mühlenbruch, Georg; Günther, Rolf W; Das, Marco; Mahnken, Andreas H

    2010-04-01

    The purpose of this study was to investigate differences in change of size and CT value between local recurrences and tumor-free areas after CT-guided radiofrequency ablation (RFA) of hepatic metastases during follow-up by means of dedicated software for automatic evaluation of hepatic lesions. Thirty-two patients with 54 liver metastases from breast or colorectal cancer underwent triphasic contrast-enhanced multidetector-row computed tomography (MDCT) to evaluate hepatic metastatic spread and localization before CT-guided RFA and for follow-up after intervention. Sixteen of these patients (65.1 ± 10.3 years) with 30 metastases stayed tumor-free (group 1), while the other group (n = 16 with 24 metastases; 62.0 ± 13.8 years) suffered from local recurrent disease (group 2). Applying an automated software tool (SyngoCT Oncology; Siemens Healthcare, Forchheim, Germany), size parameters (volume, RECIST, WHO) and attenuation were measured within the lesions before, 1 day after, and 28 days after RFA treatment. The natural logarithm (ln) of the quotient of the volume 1 day versus 28 days after RFA treatment was computed: lnQ1/28/0(volume). Analogously, ln ratios of RECIST, WHO, and attenuation were computed and statistically evaluated by repeated-measures ANOVA. One lesion in group 2 was excluded from further evaluation due to automated missegmentation. Statistically significant differences between the two groups were observed with respect to initial volume, RECIST, and WHO (p < 0.05). Furthermore, ln ratios corresponding to volume, RECIST, and WHO differed significantly between the two groups. Attenuation evaluations showed no significant differences, but there was a trend for the parameter lnQ28/0(attenuation) (p = 0.0527), showing higher values for group 1 (-0.4 ± 0.3) compared to group 2 (-0.2 ± 0.2). In conclusion, hepatic metastases and their zone of coagulation necrosis after RFA differed significantly between tumor

  9. Recurrent rhabdomyolysis due to muscle β-enolase deficiency: very rare or underestimated?

    PubMed

    Musumeci, Olimpia; Brady, Stefen; Rodolico, Carmelo; Ciranni, Annamaria; Montagnese, Federica; Aguennouz, M'hammed; Kirk, Richard; Allen, Elizabeth; Godfrey, Richard; Romeo, Sara; Murphy, Elaine; Rahman, Shamima; Quinlivan, Ros; Toscano, Antonio

    2014-12-01

    Muscle β-enolase deficiency is a very rare inherited metabolic myopathy caused by an enzymatic defect of distal glycolysis. So far, the condition has been described in only one patient, with compound heterozygous mutations in ENO3, who presented with exercise intolerance, post-exercise myalgia, and mild hyperCKemia but no pigmenturia. We describe two men, one Italian and one Turkish, with consanguineous parents, who complained of several episodes of intense myalgia, cramps, generalized muscle tenderness, and dark urine. No other family members reported similar symptoms. In both cases, there was a very mild rise in lactate during a forearm exercise test. Muscle biopsy showed minimal changes with no lipid or glycogen accumulation. Biochemical studies on muscle tissue demonstrated a marked reduction of muscle β-enolase activity (20% and 10% residual activity, respectively). Molecular genetic analysis of the ENO3 gene revealed two novel homozygous missense mutations (p.Asn151Ser and p.Glu187Lys). Both mutations segregated as expected in the two families. Although quite rare, muscle β-enolase deficiency should be considered in the differential diagnosis of patients presenting with recurrent rhabdomyolysis. It may also present with a more severe phenotype than previously thought.

  10. Underestimating the effects of spatial heterogeneity due to individual movement and spatial scale: infectious disease as an example

    USGS Publications Warehouse

    Cross, Paul C.; Caillaud, Damien; Heisey, Dennis M.

    2013-01-01

    Many ecological and epidemiological studies occur in systems with mobile individuals and heterogeneous landscapes. Using a simulation model, we show that the accuracy of inferring an underlying biological process from observational data depends on movement and the spatial scale of the analysis. As an example, we focused on estimating the relationship between host density and pathogen transmission. Observational data can result in highly biased inference about the underlying process when individuals move among sampling areas. Even without sampling error, the effect of host density on disease transmission is underestimated by approximately 50% when one in ten hosts move among sampling areas per lifetime. Aggregating data across larger regions causes minimal bias when host movement is low, and results in less biased inference when movement rates are high. However, increasing data aggregation reduces the observed spatial variation, which would lead to the misperception that a spatially targeted control effort may not be very effective. In addition, averaging over the local heterogeneity will result in underestimating the importance of spatial covariates. Minimizing the bias due to movement is not just about choosing the best spatial scale for analysis, but also about reducing the error associated with using the sampling location as a proxy for an individual's spatial history. This error associated with the exposure covariate can be reduced by choosing sampling regions with less movement, including longitudinal information on individuals' movements, or reducing the window of exposure by using repeated sampling or younger individuals.

  11. Underestimating protection and overestimating risk: examining descriptive normative perceptions and their association with drinking and sexual behaviors.

    PubMed

    Lewis, Melissa A; Litt, Dana M; Cronce, Jessica M; Blayney, Jessica A; Gilmore, Amanda K

    2014-01-01

    Individuals who engage in risky sexual behavior face the possibility of experiencing negative consequences. One tenet of social learning theory is that individuals engage in behaviors partly based on observations or perceptions of others' engagement in those behaviors. The present study aimed to document these norms-behavior relationships for both risky and protective sexual behaviors, including alcohol-related sexual behavior. Gender was also examined as a possible moderator of the norms-behavior relationship. Undergraduate students (n = 759; 58.0% female) completed a Web-based survey, including various measures of drinking and sexual behavior. Results indicated that students underestimate sexual health-protective behaviors (e.g., condom use and birth control use) and overestimate the risky behaviors (e.g., frequency of drinking prior to sex, typical number of drinks prior to sex, and frequency of casual sex) of their same-sex peers. All norms were positively associated with behavior, with the exception of condom use. Furthermore, no gender differences were found when examining the relationship between normative perceptions and behavior. The present study adds to the existing literature on normative misperceptions as it indicates that college students overestimate risky sexual behavior while underestimating sexual health-protective behaviors. Implications for interventions using the social norm approach and future directions are discussed.

  12. Is osteonecrosis of the lunate bone an underestimated feature of systemic sclerosis? A case series of nine patients and review of literature.

    PubMed

    Frerix, Marc; Kröger, Kai; Szalay, Gabor; Müller-Ladner, Ulf; Tarner, Ingo Helmut

    2016-02-01

    Osteonecrosis of the lunate bone, also known as Kienböck's disease, is a very rare disease of unknown cause. To date, only six cases of osteonecrosis of the lunate bone in patients with systemic sclerosis (SSc) have been reported in the literature. It is unknown whether these few cases reflect only a coincidence of two rare diseases or whether osteonecrosis of the lunate bone is a potential, currently underestimated, disease-associated feature of SSc. In this study, we report the clinical course of nine SSc patients with magnetic resonance imaging-proven osteonecrosis of the lunate bone and discuss associated disease characteristics and potential underlying pathophysiological mechanisms. Overall, our observations suggest that osteonecrosis of the lunate bone is a frequent and so far under-recognized manifestation of SSc that might be linked to SSc-related vasculopathy. It is important to distinguish osteonecrosis of the lunate bone from wrist arthritis in SSc patients because the clinical treatment is different. In general, the clinical progression of osteonecrosis of the lunate bone seems to be slow in SSc patients. As most patients have only minor complaints, watchful waiting in combination with analgesic therapy seems to be a feasible treatment approach in most cases, although an operative intervention might be necessary in rapidly progressive cases.

  13. [New insights into the underestimated impairment of quality of life in age-related macular degeneration - a review of the literature].

    PubMed

    Meyer-Ruesenberg, B; Richard, G

    2010-08-01

    Different forms of age-related macular degeneration (AMD) can lead to massive visual impairment, up to blindness. AMD therefore has a high impact on patients' daily life and restricts their psychological well-being, autonomy, and mobility. These restrictions influence their quality of life. The term "quality of life" is difficult to define; it consists of different aspects that can be measured with psychometric tests. Besides general health status and psychological well-being, vision-specific functional status plays a central role in determining the quality of life of AMD patients. To compare the impact of different diseases on quality of life, utility analysis has been developed. It demonstrates the relevance of AMD for patients and shows that its impact on patients' lives is comparable to that of other systemic diseases such as carcinoma or HIV. The impairment of patients' quality of life by AMD is often underestimated by their attending ophthalmologists. Different medical treatments have influenced the quality of life of AMD patients; however, there is still a large group that cannot be treated. These patients benefit from rehabilitation with low-vision aids or psychosocial interventions. Further clinical trials at a high evidence level, using valid and comparable psychometric tests, are necessary to improve the therapy and rehabilitation of AMD patients and to increase their quality of life.

  14. Vitamin D deficiency in HIV infection: an underestimated and undertreated epidemic.

    PubMed

    Pinzone, M R; Di Rosa, M; Malaguarnera, M; Madeddu, G; Focà, E; Ceccarelli, G; d'Ettorre, G; Vullo, V; Fisichella, R; Cacopardo, B; Nunnari, G

    2013-05-01

    Hypovitaminosis D is a very common disorder in both Western and developing countries. A growing body of data over recent years has shown vitamin D deficiency to be highly prevalent among HIV-positive subjects. In addition to "classic" risk factors, such as female sex, low dietary intake, dark skin pigmentation and low sun exposure, HIV-related factors, including immune activation and antiretroviral adverse effects, may affect vitamin D status. Even if both protease inhibitors and non-nucleoside reverse transcriptase inhibitors have been associated with low vitamin D levels, the available evidence has failed to univocally associate hypovitaminosis D with specific antiretroviral class effects. Low vitamin D is known to have a negative impact not only on bone health, but also on neurocognitive, metabolic, cardiovascular and immune functions. As in the general population, several studies conducted in HIV-infected subjects have associated hypovitaminosis D with a greater risk of developing osteopenia/osteoporosis and fragility fractures. Analogously, vitamin D deficiency has been described as an independent risk factor for cardiovascular disease and metabolic disorders, such as insulin resistance and type 2 diabetes mellitus. The latest EACS guidelines suggest screening for hypovitaminosis D in every HIV-positive subject with a history of bone disease, chronic kidney disease or other known risk factors for vitamin D deficiency. Vitamin D repletion is recommended when 25-hydroxyvitamin D levels are below 10 ng/ml. Furthermore, it may be indicated for 25OHD values between 10 and 30 ng/ml if associated with osteoporosis, osteomalacia or increased parathyroid hormone levels. The optimal repletion and maintenance dosing regimens remain to be established, as does the impact of vitamin D supplementation in preventing comorbidities.

  15. Tidal marsh accretion processes in the San Francisco Bay-Delta - are our models underestimating the historic and future importance of plant-mediated organic accretion?

    NASA Astrophysics Data System (ADS)

    Windham-Myers, L.; Drexler, J. Z.; Byrd, K. B.; Schile, L. M.

    2012-12-01

    Peat-accreting coastal wetlands have the potential to keep elevational pace with sea-level rise, thus providing both adaptation and mitigation for expected rises in atmospheric concentrations of greenhouse gases (GHGs). Due to oxidation and sedimentation processes, marsh elevations are generally constrained by sea-level rise (1-2 mm yr-1). However, the relative importance of mineral vs. organic accretion remains poorly understood. At least four lines of evidence from the brackish-fresh region of California's SFBay-Delta suggest that potential rates of organic accretion may be underestimated in calibration datasets of the last century. First, tidal marsh elevations have been maintained under changing rates of SLR over the past 6700 years, even during periods of low sediment availability. Second, the presence of fibric remnants in historic peat cores suggests that millennial preservation of autochthonous material may be greater in the absence of mineral inputs. Third, an experimental restoration of emergent marsh on subsided peat soil has generated new "proto-peat" at average rates of 4 cm yr-1, nearly 40 times the mean rate of sea-level rise, storing an average of 1 kg C m-2 yr-1 since 1997. Fourth, annual measurements of root production of the dominant fresh-brackish marsh species tule (Schoenoplectus acutus) show high productivity and minimal sensitivity to variable tidal-range elevations and fresh-brackish salinities. Separating the relative importance of belowground productivity from decomposition in driving rates of organic accretion may be possible through assessment of fibric remnants as an index of organic "preservation". Using three distinct peat cores from a larger study with calibrated dating and geochemistry data, fibric remnants (particles >2 mm) were assessed at 10 cm intervals and compared with physical and associated geochemical down-core variability (n=230 segments). The presence of fibric remnants was reduced in the presence of sediment, as indicated by mineral content.

  16. Assessing the potential underestimation of sediment and nutrient loads to the Great Barrier Reef lagoon during floods.

    PubMed

    Wallace, Jim; Karim, Fazlul; Wilkinson, Scott

    2012-01-01

    Much of the sediment and nutrient load to the Great Barrier Reef (GBR) lagoon is delivered during overbank floods, when discharge can be significantly underestimated by standard river gauges. This paper assesses the potential need for a flood load correction for 28 coastal rivers that discharge into the GBR lagoon. For each river, daily discharge was divided into flows above and below a 'flood' threshold to calculate the mean annual percentage of flow above this threshold. Most GBR rivers potentially need a flood load correction, as over 15% of their mean annual flow occurs above the minor flood level; only seven rivers need little or no correction, as their flood flows were less than 5% of the mean annual flow. Improved assessment of the true load of materials to the GBR lagoon would be an important contribution to the monitoring and reporting of progress towards Reef Plan and associated marine load targets.
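    The threshold partitioning described above can be sketched as follows. This is a minimal sketch: the discharge series and threshold are illustrative, and counting the whole flow on above-threshold days (rather than only the excess above the threshold) is an assumed reading of the method, not a detail given in the abstract.

```python
def flood_flow_fraction(daily_discharge, threshold):
    """Fraction of total flow delivered on days when discharge exceeds
    the flood threshold (one possible reading of the method above)."""
    total = sum(daily_discharge)
    flood = sum(q for q in daily_discharge if q > threshold)
    return flood / total if total else 0.0

# Illustrative series (arbitrary units): mostly low flow, two flood days.
flows = [10, 12, 9, 11, 200, 8, 150, 10]
print(round(flood_flow_fraction(flows, 100), 3))  # 0.854
```

    A river like this, with ~85% of its flow above the threshold, would clearly warrant a flood load correction; under 5% would not.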

  17. Metabolic Power Method: Underestimation of Energy Expenditure in Field-Sport Movements Using a Global Positioning System Tracking System.

    PubMed

    Brown, Darcy M; Dwyer, Dan B; Robertson, Samuel J; Gastin, Paul B

    2016-11-01

    The purpose of this study was to assess the validity of a global positioning system (GPS) tracking system to estimate energy expenditure (EE) during exercise and field-sport locomotor movements. Twenty-seven participants each completed a 90-min exercise session on an outdoor synthetic futsal pitch. During the exercise session, they wore a 5-Hz GPS unit interpolated to 15 Hz and a portable gas analyzer that acted as the criterion measure of EE. The exercise session was composed of alternating 5-minute exercise bouts of randomized walking, jogging, running, or a field-sport circuit (×3), followed by 10 min of recovery. One-way analysis of variance showed significant (P < .01) and very large underestimations between GPS metabolic power-derived EE and oxygen-consumption (VO2)-derived EE for all field-sport circuits (% difference ≈ -44%). No differences in EE were observed for the jog (7.8%) and run (4.8%), whereas very large overestimations were found for the walk (43.0%). The GPS metabolic power EE over the entire 90-min session was significantly lower (P < .01) than the VO2 EE, resulting in a moderate underestimation overall (-19%). The results of this study suggest that a GPS tracking system using the metabolic power model of EE does not accurately estimate EE in field-sport movements or over an exercise session consisting of mixed locomotor activities interspersed with recovery periods; however, it is able to provide a reasonably accurate estimation of EE during continuous jogging and running.
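    The percentage differences quoted above follow the usual convention of expressing the estimate relative to the criterion measure; the kcal values below are purely illustrative, not data from the study.

```python
def percent_difference(estimate, criterion):
    """Signed percentage difference of an estimate relative to a
    criterion measure; negative values indicate underestimation."""
    return 100.0 * (estimate - criterion) / criterion

# Illustrative session totals in kcal (hypothetical numbers):
print(round(percent_difference(810.0, 1000.0), 1))  # -19.0
```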

  18. Underestimation of pyruvic acid concentrations by fructose and cysteine in 2,4-dinitrophenylhydrazine-mediated onion pungency test.

    PubMed

    Yoo, Kil Sun; Lee, Eun Jin; Patil, Bhimanagouda S

    2011-10-01

    Onion pungency has been routinely measured by determining the pyruvic acid concentration in onion juice through reaction with 2,4-dinitrophenylhydrazine (DNPH) since 1961. However, the absorbance of the color adduct of the reaction decreased rapidly in onion samples as compared to that of the pyruvic acid standards, resulting in underestimation of the pyruvic acid concentrations. By measuring the absorbance at 1 min, we have demonstrated that accuracy could be substantially improved. As a continuation, the causes of degradation of the color adduct after the reaction, and of pyruvic acid itself before the reaction, were examined in this study. Alliinase action in juice (fresh or cooked) and bulb colors did not influence the degradation. Some organic acids indigenous to onion, such as ascorbic acid, proline, and glutamic acid, did not reduce the absorbance. However, fructose, whether naturally present in the onion juice or supplemented, caused the degradation of the color adduct, whereas sucrose and glucose had a lesser effect. Degradation rates increased proportionally as fructose concentrations increased up to 70 mg/mL. Cysteine was found to degrade pyruvic acid itself before it could react with DNPH. Approximately 90% of the pyruvic acid was degraded after 60 min in samples of 7 mM pyruvic acid supplemented with 10 mg/mL cysteine. Spectral comparisons of onion juice containing fructose naturally and pyruvic acid solution with supplemented fructose showed identical patterns and confirmed that the color-adduct degradation was caused by fructose. Our study elucidated that fructose, a major sugar in onion juice, causes degradation of the color adduct in the onion pungency test and results in underestimation of the pyruvic acid concentration.

  19. Herpesvirus: an underestimated virus.

    PubMed

    Rechenchoski, Daniele Zendrini; Faccin-Galhardi, Ligia Carla; Linhares, Rosa Elisa Carvalho; Nozawa, Carlos

    2017-03-01

    Herpes simplex virus (HSV) infections are common and widespread; nevertheless, their outcome can be of unpredictable prognosis in neonates and in immunosuppressed patients. Anti-HSV therapy is effective, but the emergence of drug-resistant strains and the drug toxicity that hamper treatment are of great concern. Vaccines have not yet shown relevant benefit; therefore, palliative prophylactic measures have been adopted to prevent disease. This short review concisely presents the history of HSV, its taxonomy, physical structure, and replication, and explores the pathogenesis of the infection, clinical manifestations, laboratory diagnosis, treatment, prophylaxis and epidemiology of the diseases.

  20. Diagnostic underestimation of atypical ductal hyperplasia and ductal carcinoma in situ at percutaneous core needle and vacuum-assisted biopsies of the breast in a Brazilian reference institution*

    PubMed Central

    Badan, Gustavo Machado; Roveda Júnior, Decio; Piato, Sebastião; Fleury, Eduardo de Faria Castro; Campos, Mário Sérgio Dantas; Pecci, Carlos Alberto Ferreira; Ferreira, Felipe Augusto Trocoli; D'Ávila, Camila

    2016-01-01

    Objective To determine the rates of diagnostic underestimation at stereotactic percutaneous core needle biopsies (CNB) and vacuum-assisted biopsies (VABB) of nonpalpable breast lesions, with histopathological results of atypical ductal hyperplasia (ADH) or ductal carcinoma in situ (DCIS) subsequently submitted to surgical excision. As a secondary objective, the frequency of ADH and DCIS was determined for the cases submitted to biopsy. Materials and Methods Retrospective review of 40 cases with diagnosis of ADH or DCIS on the basis of biopsies performed between February 2011 and July 2013, subsequently submitted to surgery, whose histopathological reports were available in the internal information system. Biopsy results were compared with those observed at surgery and the underestimation rate was calculated by means of specific mathematical equations. Results The underestimation rate at CNB was 50% for ADH and 28.57% for DCIS, and at VABB it was 25% for ADH and 14.28% for DCIS. ADH represented 10.25% of all cases undergoing biopsy, whereas DCIS accounted for 23.91%. Conclusion The diagnostic underestimation rate at CNB is two times the rate at VABB. Certainty that the target has been achieved is not the sole determining factor for a reliable diagnosis. Removal of more than 50% of the target lesion should further reduce the risk of underestimation. PMID:26929454
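    The abstract refers to "specific mathematical equations" without stating them; a commonly used formulation of the diagnostic underestimation rate, sketched here as an assumption rather than the paper's exact method, is the proportion of biopsy diagnoses upgraded to a more severe lesion at surgical excision.

```python
def underestimation_rate(upgraded, total):
    """Percentage of biopsy diagnoses (e.g., ADH or DCIS) that were
    upgraded to a more severe lesion at surgical excision.
    A common formulation, assumed here; the paper's exact equations
    are not given in the abstract."""
    return 100.0 * upgraded / total

# Example: a 50% rate, as reported for ADH at CNB, would correspond
# to e.g. 2 upgrades among 4 ADH diagnoses (hypothetical counts).
print(underestimation_rate(2, 4))  # 50.0
```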

  1. Spot-mapping underestimates song-territory size and use of mature forest by breeding golden-winged warblers in Minnesota, USA

    USGS Publications Warehouse

    Streby, Henry M.; Loegering, John P.; Andersen, David E.

    2012-01-01

    Studies of songbird breeding habitat often compare habitat characteristics of used and unused areas. Although there is usually meticulous effort to precisely and consistently measure habitat characteristics, the accuracy of methods for estimating which areas are used versus unused by birds remains generally untested. To examine the accuracy of spot-mapping for identifying singing territories of golden-winged warblers (Vermivora chrysoptera), which are considered early-successional forest specialists, we used spot-mapping and radiotelemetry to record song perches and delineate song territories for breeding male golden-winged warblers in northwestern Minnesota, USA. We also used radiotelemetry to record locations (song and nonsong perches) of a subsample (n = 12) of males throughout the day to delineate home ranges. We found that telemetry-based estimates of song territories were 3 times larger and included more mature forest than those estimated from spot-mapping. In addition, home ranges estimated using radiotelemetry included more mature forest than spot-mapping- and telemetry-based song territories, with 75% of afternoon perches located in mature forest. Our results suggest that mature forest comprises a larger component of golden-winged warbler song territories and home ranges than is indicated by spot-mapping in Minnesota. Because it appears that standard observational methods can underestimate territory size and misidentify cover-type associations for golden-winged warblers, we caution that management and conservation plans may be misinformed, and that similar studies are needed for golden-winged warblers across their range and for other songbird species.

  2. Underestimating the safety benefits of a new vaccine: the impact of acellular pertussis vaccine versus whole-cell pertussis vaccine on health services utilization.

    PubMed

    Hawken, Steven; Manuel, Douglas G; Deeks, Shelley L; Kwong, Jeffrey C; Crowcroft, Natasha S; Wilson, Kumanan

    2012-12-01

    The population-level safety benefits of the acellular pertussis vaccine may have been underestimated because only specific adverse events were considered, not overall impact on health services utilization. Using the Vaccine and Immunization Surveillance in Ontario (VISION) system, the authors analyzed data on 567,378 children born between April 1994 and March 1996 (before introduction of acellular pertussis vaccine) and between April 1998 and March 2000 (after introduction of acellular pertussis vaccine) in Ontario, Canada. Using the self-controlled case series study design, they examined emergency room visits and hospital admissions occurring after routine pediatric vaccinations. The authors determined the relative incidence of events taking place before introduction of the acellular vaccine versus after introduction by calculating relative incidence ratios (RIRs). The observed RIRs demonstrated a highly statistically significant reduction in relative incidence after introduction of the acellular vaccine. RIRs for vaccine administered at ages 2, 4, 6, and 18 months were 1.82 (95% confidence interval (CI): 1.64, 2.01), 1.91 (95% CI: 1.71, 2.13), 1.54 (95% CI: 1.38, 1.72), and 1.51 (95% CI: 1.34, 1.69), respectively, comparing event rates before the introduction of acellular vaccine with those after introduction. The authors estimated that approximately 90 emergency room visits and 9 admissions per month were avoided by switching to the acellular vaccine, which is a 38-fold higher impact than when they considered only admissions for febrile and afebrile convulsions. Future analyses comparing vaccines for safety should examine specific endpoints and general health services utilization.

  3. Predicting the underestimation of the femoral offset in anteroposterior radiographs of the pelvis using 'lesser trochanter index': a 3D CT derived simulated radiographic analysis.

    PubMed

    Boddu, Krishna; Siebachmeyer, Martin; Lakkol, Sandesh; Rajayogeswaran, Brathaban; Kavarthapu, Venu; Li, Patrick L S

    2014-06-01

    We developed 'lesser trochanter index' (LTI) and estimated its accuracy in predicting the underestimation of offset in the anteroposterior (AP) pelvic radiographs. We reconstructed 320 simulated radiographs from the CT scans of 40 adult hips at different rotational projections of 10° increments from 30° internal rotation to 40° external rotation. Underestimation of femoral offset as a percentage was derived from the neck profile angle for all radiographs. Radiographs with an LTI value above 35 were 94% (95% CI, 89%-97%) likely to underestimate femoral offset by more than 5%. Radiographs with LTI between 0 and 30 demonstrated femoral offset within 5% of the true offset (predictive value 100%, CI 87%-100%). LTI could be a useful guide in preoperative templating of hip arthroplasty.

  4. Multilocus approaches reveal underestimated species diversity and inter-specific gene flow in pikas (Ochotona) from southwestern China.

    PubMed

    Koju, Narayan Prasad; He, Kai; Chalise, Mukesh Kumar; Ray, Chris; Chen, Zhongzheng; Zhang, Bin; Wan, Tao; Chen, Shunde; Jiang, Xuelong

    2017-02-01

    The phylogeny of living pikas (Ochotonidae, Ochotona) remains obscure, and pika species diversity in southwestern China has never been well explored. In this study, 96 tissue samples from 11 valid species in three classified subgenera (Pika, Ochotona and Conothoa) from 23 locations were characterized using multilocus sequences of 7031 bp. Two mitochondrial (CYT B and COI) and five nuclear gene segments (RAG1, RAG2, TTN, OXA1L and IL1RAPL1) were sequenced. We analysed evolutionary histories using maximum likelihood (RAxML) and Bayesian analyses (BEAST), and we also used molecular species delimitation analyses (BPP) to explore species diversity. Our study supported O. syrinx (O. huangensis) as a distinct clade from all named subgenera. Relationships among subgenera were not fully resolved, which may be due to a rapid diversification in the middle Miocene (~13.90 Ma). Conflicting gene trees implied mitochondrial introgression from O. cansus to O. curzoniae. We uncovered three cryptic species from Shaanxi, Sichuan and Yunnan with strong support, suggesting an underestimation of species diversity in the "sky-island" mountains of southwest China.

  5. The health impact of polyparasitism in humans: are we under-estimating the burden of parasitic diseases?

    PubMed

    Pullan, R; Brooker, S

    2008-06-01

    Parasitic infections are widespread throughout the tropics and sub-tropics, and infection with multiple parasite species is the norm rather than the exception. Despite the ubiquity of polyparasitism, its public health significance has been inadequately studied. Here we review available studies investigating the nutritional and pathological consequences of multiple infections with Plasmodium and helminths and, in doing so, encourage a reassessment of the disease burden caused by polyparasitism. The available evidence is conspicuously sparse but suggests that multiple human parasite species may have an additive and/or multiplicative impact on nutrition and organ pathology. Existing studies suffer from a number of methodological limitations, and adequately designed studies are clearly necessary. Current methods of estimating the potential global morbidity due to parasitic diseases underestimate the health impact of polyparasitism, and possible reasons for this are presented. As international strategies to control multiple parasite species are rolled out, there are a number of opportunities to investigate the complexity of polyparasitism, and it is hoped that the parasitological research community will grasp the opportunity to better understand the health impact of polyparasitism in humans.

  6. A method to compensate for the underestimation of collagen with polarized picrosirius red imaging in human artery atherosclerotic plaques

    NASA Astrophysics Data System (ADS)

    Greiner, C. A.; Grainger, S. J.; Su, J. L.; Madden, S. P.; Muller, J. E.

    2016-04-01

    Although picrosirius red (PSR) is known to be useful for quantifying collagen under polarized light (PL), the commonly used linearly PL can result in an underestimation of collagen, as some fibers may appear dark if aligned with the transmission axis of the polarizers. To address this, a sample may be imaged with circularly polarized light, at the expense of higher background intensity. However, the quality and alignment of the microscope illumination optics, polarizers and waveplates can still produce imaging variability with circular polarization. A simpler technique was tested that minimized variability and background intensity with linear polarization by acquiring images at multiple angles of histology-slide rotation to create a composite co-registered image, permitting optimal semi-quantitative visualization of collagen. Linear polarization imaging was performed on PSR-stained artery sections. By rotating the slide at 60° intervals while maintaining illumination, polarization and exposure parameters, 6 images were acquired for each section. A composite image was created from the 6 co-registered images, comprising the maximum pixel intensity at each point. Images from any single rotation position consistently showed variation in PSR signal; the composite image compensates for this variability without loss of spatial resolution. Additionally, grayscale analysis showed an increased intensity range of 15-50% for a linearly polarized composite image over a circularly polarized image after background correction, indicating better SNR. This proposed technique will be applied in the development of a near-infrared spectroscopy algorithm to detect vulnerable atherosclerotic plaques in vivo.
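    The pixel-wise maximum composite described above is straightforward to express; the sketch below assumes the rotation correction (co-registration) has already been done upstream, and the toy arrays stand in for the six rotated acquisitions.

```python
import numpy as np

def max_composite(registered_images):
    """Pixel-wise maximum across co-registered polarization images,
    as in the composite image described above. Inputs must already
    be co-registered; rotation correction is assumed done upstream."""
    stack = np.stack(registered_images, axis=0)
    return stack.max(axis=0)

# Toy 2x2 example with three 'rotation' images:
imgs = [np.array([[1, 5], [0, 2]]),
        np.array([[4, 1], [3, 2]]),
        np.array([[2, 2], [1, 9]])]
print(max_composite(imgs))
# prints:
# [[4 5]
#  [3 9]]
```

    Taking the maximum, rather than the mean, preserves fibers that are bright in any one rotation position, which is why the composite avoids the dark-fiber artifact of a single linear-polarization image.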

  7. Underestimation of urinary biomarker-to-creatinine ratio resulting from age-related gain in muscle mass in rats.

    PubMed

    Tonomura, Yutaka; Morikawa, Yuji; Takagi, Shingo; Torii, Mikinori; Matsubara, Mitsunobu

    2013-01-07

    Recent efforts have been made to identify useful urinary biomarkers of nephrotoxicity, and urine has recently been explored as a biomarker source for other toxicities as well. Meanwhile, correction of urinary biomarker concentrations for fluctuations in urine flow rate is required for adequate interpretation of any alteration. The urinary biomarker-to-creatinine ratio (UBCR) is widely used because of its convenience, whereas the urinary biomarker excretion rate is regarded as the gold-standard corrective method. Because creatinine is a catabolite of energy production in muscle, we hypothesized that altered muscle mass could affect creatinine kinetics, ultimately affecting UBCR. However, no study has examined this hypothesis. In this study, we examined the influence of muscle mass gain on UBCR using male Sprague-Dawley rats during the growth phase, 6-12 weeks old. Both plasma creatinine and excretion of urinary creatinine (Ucr excretion) increased with muscle mass gain, thereby lowering UBCR values. The renal mRNA level of organic cation transporter-2 (Oct2), a creatinine transporter, showed an age-related increase, whereas the mRNA level of multidrug and toxin extrusion-1 (Mate1) remained constant. Multiple regression analysis showed that the increase in creatinine clearance contributed more to the age-related increase in Ucr excretion than the mRNA levels of Oct2 and Mate1. This suggests that the age-related increase in Ucr excretion may be attributable to increased transglomerular passage of creatinine. In conclusion, the results suggest that muscle mass gain can affect creatinine kinetics, leading to underestimation of UBCR. It is therefore important to understand the characteristics of the corrective method when using urinary biomarkers, the failure of which can result in an incorrect diagnosis.

  8. Is the toxicity of adjuvant aromatase inhibitor therapy underestimated? Complementary information from patient-reported outcomes (PROs).

    PubMed

    Oberguggenberger, Anne; Hubalek, Michael; Sztankay, Monika; Meraner, Verena; Beer, Beate; Oberacher, Herbert; Giesinger, Johannes; Kemmler, Georg; Egle, Daniel; Gamper, Eva-Maria; Sperner-Unterweger, Barbara; Holzner, Bernhard

    2011-07-01

    Adjuvant endocrine treatment-related adverse effects have a strong impact on patients' quality of life and thereby limit the therapy's risk-benefit ratio, resulting in morbidity and treatment discontinuation. Still, many AI adverse effects remain untreated because they go unrecognized by conventional methods (e.g., proxy ratings). The ability of complementary patient-reported outcomes (PROs) to provide a more comprehensive assessment of side-effects is to be explored. A cross-sectional study sample of 280 postmenopausal, early-stage breast cancer patients underwent a comprehensive PRO assessment (FACT-B/+ES) at their after-care appointment. The prevalence and severity of patient-reported physical side-effects and psychosocial burden related to adjuvant AI therapy were compared with prevalences derived from pivotal phase IV trials (ATAC 2005, BIG 1-98 2005). Across all symptom categories, the highest prevalence rates were found for joint pain (59.6%), hot flushes (52%), lost interest in sexual intercourse (51.4%), and lack of energy (40.3%). Overall, PROs resulted in significantly higher prevalence rates than the physician ratings for all symptoms published in the pivotal clinical trials, except vaginal bleeding and nausea. Treatment duration exerted no significant impact on symptom frequency (P > 0.05). Established prevalence rates of endocrine treatment-related toxicity thus appear to be underestimated. The incorporation of PRO data should be mandatory, or at least highly recommended, in clinical treatment planning to arrive at a more accurate assessment of a patient's actual symptom burden, enabling improved individualized management of side-effects and helping preserve treatment adherence.

  9. Cross-over studies underestimate energy compensation: The example of sucrose- versus sucralose-containing drinks.

    PubMed

    Gadah, Nouf S; Brunstrom, Jeffrey M; Rogers, Peter J

    2016-12-01

    The vast majority of preload-test-meal studies that have investigated the effects on energy intake of disguised nutrient or other food/drink ingredient manipulations have used a cross-over design. We argue that this design may underestimate the effect of the manipulation due to carry-over effects. To test this we conducted comparable cross-over (n = 69) and parallel-groups (n = 48) studies testing the effects of sucrose versus low-calorie sweetener (sucralose) in a drink preload on test-meal energy intake. The parallel-groups study included a baseline day in which only the test meal was consumed. Energy intake in that meal was used to control for individual differences in energy intake in the analysis of the effects of sucrose versus sucralose on energy intake on the test day. Consistent with our prediction, the effect of consuming sucrose on subsequent energy intake was greater when measured in the parallel-groups study than in the cross-over study (respectively 64% versus 36% compensation for the 162 kcal difference in energy content of the sucrose and sucralose drinks). We also included a water comparison group in the parallel-groups study (n = 24) and found that test-meal energy intake did not differ significantly between the water and sucralose conditions. Together, these results confirm that consumption of sucrose in a drink reduces subsequent energy intake, but by less than the energy content of the drink, whilst drink sweetness does not increase food energy intake. Crucially, though, the studies demonstrate that study design affects estimated energy compensation.
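    The compensation percentages reported above follow the standard calculation: the reduction in test-meal intake after the higher-energy preload, expressed as a percentage of the extra preload energy (162 kcal here). The intake values in the sketch are illustrative, not data from the study.

```python
def percent_compensation(intake_lowcal, intake_highcal, preload_kcal_diff):
    """Energy compensation: reduction in test-meal intake after the
    higher-energy preload (e.g., sucrose) relative to the lower-energy
    one (e.g., sucralose), as a percentage of the extra preload energy.
    100% would mean intake fell by the full energy difference."""
    return 100.0 * (intake_lowcal - intake_highcal) / preload_kcal_diff

# Hypothetical mean intakes (kcal) yielding ~64% compensation,
# as reported for the parallel-groups design:
print(round(percent_compensation(900.0, 796.3, 162.0)))  # 64
```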

  10. Molecular outflows driven by low-mass protostars. I. Correcting for underestimates when measuring outflow masses and dynamical properties

    SciTech Connect

    Dunham, Michael M.; Arce, Héctor G.; Mardones, Diego; Lee, Jeong-Eun; Matthews, Brenda C.; Stutz, Amelia M.; Williams, Jonathan P.

    2014-03-01

    We present a survey of 28 molecular outflows driven by low-mass protostars, all of which are sufficiently isolated spatially and/or kinematically to fully separate into individual outflows. Using a combination of new and archival data from several single-dish telescopes, 17 outflows are mapped in ¹²CO (2-1) and 17 are mapped in ¹²CO (3-2), with 6 mapped in both transitions. For each outflow, we calculate and tabulate the mass (M_flow), momentum (P_flow), kinetic energy (E_flow), mechanical luminosity (L_flow), and force (F_flow) assuming optically thin emission in LTE at an excitation temperature, T_ex, of 50 K. We show that all of the calculated properties are underestimated when calculated under these assumptions. Taken together, the effects of opacity, outflow emission at low velocities confused with ambient cloud emission, and emission below the sensitivities of the observations increase outflow masses and dynamical properties by an order of magnitude, on average, and factors of 50-90 in the most extreme cases. Different (and non-uniform) excitation temperatures, inclination effects, and dissociation of molecular gas will all work to further increase outflow properties. Molecular outflows are thus almost certainly more massive and energetic than commonly reported. Additionally, outflow properties are lower, on average, by almost an order of magnitude when calculated from the ¹²CO (3-2) maps compared to the ¹²CO (2-1) maps, even after accounting for different opacities, map sensitivities, and possible excitation temperature variations. It has recently been argued in the literature that the ¹²CO (3-2) line is subthermally excited in outflows, and our results support this finding.
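    The tabulated dynamical quantities follow the standard kinematic definitions sketched below. This is a hedged reconstruction, not the paper's exact equations: M_j and v_j denote the mass and line-of-sight velocity (relative to the cloud) in each velocity channel, and t_dyn a characteristic dynamical time.

```latex
M_{\mathrm{flow}} = \sum_j M_j, \qquad
P_{\mathrm{flow}} = \sum_j M_j \, |v_j|, \qquad
E_{\mathrm{flow}} = \tfrac{1}{2} \sum_j M_j \, v_j^{2},
```
```latex
L_{\mathrm{flow}} = \frac{E_{\mathrm{flow}}}{t_{\mathrm{dyn}}}, \qquad
F_{\mathrm{flow}} = \frac{P_{\mathrm{flow}}}{t_{\mathrm{dyn}}}.
```

    Because each term scales linearly with the channel masses M_j, any opacity or sensitivity correction that raises the inferred masses propagates directly into all five quantities.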

  11. Look before You Leap: Underestimating Chinese Student History, Chinese University Setting and Chinese University Steering in Sino-British HE Joint Ventures?

    ERIC Educational Resources Information Center

    Dow, Ewan G.

    2010-01-01

    This article makes the case--in three parts--that many Anglo-Chinese university collaborations (joint ventures) to date have seriously underestimated Chinese (student) history, the Chinese university setting and Chinese national governmental steering as part of the process of "glocalisation". Recent turbulence in this particular HE…

  12. General Anesthesia

    MedlinePlus

    General anesthesia Overview By Mayo Clinic Staff Under general anesthesia, you are completely unconscious and unable to feel pain during medical procedures. General anesthesia usually uses a combination of intravenous drugs and ...

  13. Estimating trematode prevalence in snail hosts using a single-step duplex PCR: how badly does cercarial shedding underestimate infection rates?

    PubMed Central

    2014-01-01

    Background Trematode communities often consist of different species exploiting the same host population, with two or more trematodes sometimes co-occurring in the same host. A commonly used diagnostic method to detect larval trematode infections in snails has been based on cercarial shedding, though it is often criticized as inaccurate. In the present study we compare infection prevalences determined by cercarial emission with those determined, for the first time, by molecular methods, allowing us to quantify the underestimation of single and double infections based on cercarial emission. We thus developed a duplex PCR for two host-parasite systems to specifically differentiate between single and double infections. The Ebro samples include two morphologically similar opecoelids, whereas the Otago samples include two morphologically different larval trematodes. Methods Snails were screened for infections by incubating them individually to induce cercarial emission, thus determining infection following the "classical" detection method. Snail tissue was then removed and fixed for the duplex PCR. After obtaining ITS rDNA sequences, four species-specific primers were designed for each snail-trematode system, and duplex PCR prevalence was determined for each sample. Results from both methods were statistically compared using McNemar's chi-squared test and Cohen's kappa statistic for agreement between outcomes. Results Overall infection prevalences determined by duplex PCR were consistently and substantially higher than those based on cercarial shedding: among Ebro samples, between 17.9% and 60.1% more snails were found infected using the molecular method, whereas in the Otago samples the difference was between 9.9% and 20.6%. Kappa values generally indicated a fair to substantial agreement between the two detection methods, with lower agreement for the Ebro samples. Conclusions We demonstrate that molecular detection of single and double infections by
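    The two statistics used above can be sketched for paired detection outcomes. This is a minimal stdlib-only sketch (the authors presumably used a statistics package); the snail counts in the example are hypothetical, not the paper's data.

```python
import math

def mcnemar_chi2(b, c):
    """Continuity-corrected McNemar's test for paired outcomes:
    b = positives by shedding only, c = positives by PCR only.
    Returns (chi2, p) for 1 degree of freedom; the p-value uses the
    chi-square tail via the complementary error function."""
    chi2 = (abs(b - c) - 1) ** 2 / (b + c)
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table: a = positive by both
    methods, d = negative by both, b and c = discordant pairs."""
    n = a + b + c + d
    po = (a + d) / n                                  # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical counts for 200 snails: 40 positive by both methods,
# 1 by shedding only, 25 by PCR only, 134 negative by both.
chi2, p = mcnemar_chi2(1, 25)
kappa = cohens_kappa(40, 1, 25, 134)
print(round(chi2, 2), p < 0.001, round(kappa, 2))
```

    A significant McNemar result with c >> b is exactly the pattern reported: PCR finds many infections that shedding misses, while the kappa value still indicates fair-to-substantial overall agreement.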

  14. Substantial Underestimation of Post-harvest Burning Emissions in East China as Seen by Multi-species Space Observations

    NASA Astrophysics Data System (ADS)

    Stavrakou, T.; Muller, J. F.; Bauwens, M.; De Smedt, I.; Lerot, C.; Van Roozendael, M.

    2015-12-01

    Crop residue burning is an important contributor to global biomass burning. In the North China Plain, one of the largest and most densely populated plains in the world, post-harvest crop burning is a common agricultural management practice, allowing for land clearing of residual straw and preparation for the subsequent crop cultivation. The most extensive crop fires occur in the North China Plain in June, after the winter wheat comes to maturity, and have been blamed for spikes in air pollution leading to serious health problems. Estimating harvest-season burning emissions is therefore of primary importance to assess air quality and define the best policies for its improvement in this sensitive region. Bottom-up approaches, based either on crop production and emission factors or on satellite burned-area and fire radiative power products, have been adopted so far; however, these methods depend crucially, among other assumptions, on the ability of satellites to detect small fires, and can lead to underestimation of the actual emissions. The flux inversion of atmospheric observations is an alternative, independent approach for inferring the emissions from crop fires. Satellite column observations of formaldehyde (HCHO) exhibit a strong peak over the North China Plain in June, resulting from enhanced pyrogenic emissions of a large suite of volatile organic compounds (VOCs), precursors of HCHO. We use vertical columns of formaldehyde retrieved from the OMI instrument between 2005 and 2012 as constraints in an adjoint inversion scheme built on the IMAGESv2 CTM, and perform the optimization of biogenic, pyrogenic, and anthropogenic emission parameters at the model resolution. We investigate the interannual variability of the top-down source, quantify its importance for the atmospheric composition on the regional scale, and explore its uncertainties. The OMI-based crop burning source is compared with the corresponding anthropogenic flux in the North China Plain, and is evaluated against HCHO


  15. On underestimation of global vulnerability to tree mortality and forest die-off from hotter drought in the Anthropocene

    USGS Publications Warehouse

    Allen, Craig D.; Breshears, David D.; McDowell, Nathan G.

    2015-01-01

    hotter drought, consistent with fundamental physiology; (5) shorter droughts occur more frequently than longer droughts and can become lethal under warming, increasing the frequency of lethal drought nonlinearly; and (6) mortality happens rapidly relative to growth intervals needed for forest recovery. These high-confidence drivers, in concert with research supporting greater vulnerability perspectives, support an overall viewpoint of greater forest vulnerability globally. We surmise that mortality vulnerability is being discounted in part due to difficulties in predicting threshold responses to extreme climate events. Given the profound ecological and societal implications of underestimating global vulnerability to hotter drought, we highlight urgent challenges for research, management, and policy-making communities.

  16. How systematic age underestimation can impede understanding of fish population dynamics: Lessons learned from a Lake Superior cisco stock

    USGS Publications Warehouse

    Yule, D.L.; Stockwell, J.D.; Black, J.A.; Cullis, K.I.; Cholwek, G.A.; Myers, J.T.

    2008-01-01

    Systematic underestimation of fish age can impede understanding of recruitment variability and adaptive strategies (like longevity) and can bias estimates of survivorship. We suspected that previous estimates of annual survival (S; range = 0.20-0.44) for Lake Superior ciscoes Coregonus artedi developed from scale ages were biased low. To test this hypothesis, we estimated the total instantaneous mortality rate of adult ciscoes from the Thunder Bay, Ontario, stock by use of cohort-based catch curves developed from commercial gill-net catches and otolith-aged fish. Mean S based on otolith ages was greater for adult females (0.80) than for adult males (0.75), but these differences were not significant. Applying the results of a study of agreement between scale and otolith ages, we modeled a scale age for each otolith-aged fish to reconstruct catch curves. Using modeled scale ages, estimates of S (0.42 for females, 0.36 for males) were comparable with those reported in past studies. We conducted a November 2005 acoustic and midwater trawl survey to estimate the abundance of ciscoes when the fish were being harvested for roe. Estimated exploitation rates were 0.085 for females and 0.025 for males, and the instantaneous rates of fishing mortality were 0.089 for females and 0.025 for males. The instantaneous rates of natural mortality were 0.131 and 0.265 for females and males, respectively. Using otolith ages, we found that strong year-classes at large during November 2005 were caught in high numbers as age-1 fish in previous annual bottom trawl surveys, whereas weak or absent year-classes were not. For decades, large-scale fisheries on the Great Lakes were allowed to operate because ciscoes were assumed to be short lived and to have regular recruitment. We postulate that the collapse of these fisheries was linked in part to a misunderstanding of cisco biology driven by scale-ageing error. © Copyright by the American Fisheries Society 2008.
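
    The mortality bookkeeping in this abstract follows the standard fishery relations Z = F + M (instantaneous total = fishing + natural mortality) and S = exp(-Z) (annual survival). Plugging in the reported instantaneous rates recovers the otolith-based survival estimates:

```python
import math

def annual_survival(F, M):
    """Annual survival S = exp(-(F + M)) from instantaneous fishing (F)
    and natural (M) mortality rates."""
    return math.exp(-(F + M))

# Rates reported in the abstract for the Thunder Bay cisco stock:
print(round(annual_survival(0.089, 0.131), 2))  # females -> 0.8 (reported S = 0.80)
print(round(annual_survival(0.025, 0.265), 2))  # males   -> 0.75 (reported S = 0.75)
```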

  17. General anesthesia

    MedlinePlus

    ... page: //medlineplus.gov/ency/article/007410.htm General anesthesia is treatment with certain medicines that puts you ...

  18. A Novel Method Using Abstract Convex Underestimation in Ab-Initio Protein Structure Prediction for Guiding Search in Conformational Feature Space.

    PubMed

    Hao, Xiao-Hu; Zhang, Gui-Jun; Zhou, Xiao-Gen; Yu, Xu-Feng

    2016-01-01

    To address the problem of searching protein conformational space in ab-initio protein structure prediction, a novel method using abstract convex underestimation (ACUE) within the framework of an evolutionary algorithm was proposed. Computing such conformations, essential to associating structural and functional information with gene sequences, is challenging due to the high dimensionality and rugged energy surface of the protein conformational space. As a consequence, the dimension of the conformational space should be reduced to a proper level. In this paper, the original high-dimensional conformational space was converted into a feature space of considerably reduced dimension by a feature extraction technique, and the underestimate space was constructed according to abstract convex theory. The entropy effect caused by searching in the high-dimensional conformational space could thus be avoided. The tight lower-bound estimate was used to guide the search direction, and invalid search areas in which the global optimal solution is not located could be eliminated in advance. Moreover, instead of expensively calculating the energy of conformations in the original conformational space, the estimate value is employed to judge whether a conformation is worth exploring, reducing evaluation time and thereby making computation cheaper and the search more efficient. Additionally, fragment assembly and the Monte Carlo method are combined to generate a series of metastable conformations by sampling the conformational space. The proposed method provides a novel technique to solve the search problem of protein conformational space. Twenty small-to-medium structurally diverse proteins were tested, and the proposed ACUE method was compared with ItFix, HEA, Rosetta and the developed method LEDE without underestimate information. Test results show that the ACUE method can more rapidly and more

  19. Numerical study identifying the factors causing the significant underestimation of the specific discharge estimated using the modified integral pumping test method in a laboratory experiment.

    PubMed

    Sun, Kerang

    2015-09-01

    A three-dimensional finite element model is constructed to simulate the experimental conditions presented in a paper published in this journal [Goltz et al., 2009. Validation of two innovative methods to measure contaminant mass flux in groundwater. Journal of Contaminant Hydrology 106 (2009) 51-61], where the modified integral pumping test (MIPT) method was found to significantly underestimate the specific discharge in an artificial aquifer. The numerical model closely replicates the experimental configuration with explicit representation of the pumping well column and skin, allowing the model to simulate the wellbore flow in the pumping well as an integral part of the porous media flow in the aquifer using the equivalent hydraulic conductivity approach. The equivalent hydraulic conductivity is used to account for head losses due to friction within the wellbore of the pumping well. Applying the MIPT method to the model-simulated piezometric heads resulted in a specific discharge that underestimates the true specific discharge in the experimental aquifer by 18.8%, compared with the 57% underestimation of mass flux in the experiment reported by Goltz et al. (2009). An alternative simulation shows that the numerical model is capable of approximately replicating the experimental results when the equivalent hydraulic conductivity is reduced by an order of magnitude, suggesting that the accuracy of the MIPT estimation could be improved by expanding the physical meaning of the equivalent hydraulic conductivity to account for other factors, such as orifice losses, in addition to frictional losses within the wellbore. Numerical experiments also show that when applying the MIPT method to estimate hydraulic parameters, use of the depth-integrated piezometric head instead of the head near the pump intake can reduce the estimation error resulting from well losses, but not the error associated with the well not being fully screened.

  20. Liu et al. suspect that Zhu et al. (2015) may have underestimated dissolved organic nitrogen (N) but overestimated total particulate N in wet deposition in China.

    PubMed

    Liu, Xuejun; Xu, Wen; Pan, Yuepeng; Du, Enzai

    2015-07-01

    In a recent publication in the journal Science of the Total Environment, Zhu et al. (2015) reported the composition, spatial patterns, and factors influencing atmospheric wet nitrogen (N) deposition based on one year's data from 41 monitoring sites in China. We suspect their results may largely underestimate dissolved organic N (DON) but overestimate total particulate N (TPN) in wet deposition due to uncertainty arising from the sampling, storage and analysis methods in their study. Our suspicions are based mainly on our experience from earlier measurements and on the literature. We therefore suggest that enhanced data quality control on atmospheric N deposition measurements should be taken into account in future studies.

  1. General Conformity

    EPA Pesticide Factsheets

    The General Conformity requirements ensure that the actions taken by federal agencies in nonattainment and maintenance areas do not interfere with a state’s plans to meet national standards for air quality.

  2. General paresis

    MedlinePlus

    ... due to damage to the brain from untreated syphilis. Causes General paresis is one form of neurosyphilis. ... usually occurs in people who have had untreated syphilis for many years. Syphilis is a bacterial infection that ...

  3. General Dentist

    MedlinePlus

    ... information you need from the Academy of General Dentistry ... Instead of specializing in just one area of dentistry, they can provide plenty of different services for ...

  4. Haemoglobin J-Baltimore can be detected by HbA1c electropherogram but with underestimated HbA1c value.

    PubMed

    Brunel, Valéry; Lahary, Agnès; Chagraoui, Abdeslam; Thuillez, Christian

    2016-01-01

    Glycated haemoglobin (HbA1c) is considered the gold standard for assessing diabetes compensation and treatment. In addition, fortuitous detection of haemoglobin variants during HbA1c measurement is not rare. Recently, two publications reported different conclusions on the accuracy of HbA1c values measured by capillary electrophoresis in the presence of haemoglobin J-Baltimore (HbJ). Here we describe the fortuitous detection of unknown HbJ using capillary electrophoresis for measurement of HbA1c. A patient followed for gestational diabetes in our laboratory presented an unknown haemoglobin on a Capillarys 2 Flex Piercing analyser, which was identified as HbJ. HbJ is not associated with haematological abnormalities. High-performance liquid chromatography methods are known to possibly underestimate the HbA1c value in the presence of this variant. This variant and its glycated form are clearly distinguished on the electropherogram, but HbJ was responsible for underestimating the true area of HbA1c. Capillary electrophoresis is a good method for detecting HbJ but does not seem suitable for evaluation of the HbA1c value in patients in the presence of the HbJ variant.

  5. Causes of systematic over- or underestimation of low streamflows by use of index-streamgage approaches in the United States

    USGS Publications Warehouse

    Eng, K.; Kiang, J.E.; Chen, Y.-Y.; Carlisle, D.M.; Granato, G.E.

    2011-01-01

    Low-flow characteristics can be estimated by multiple linear regressions or the index-streamgage approach. The latter transfers streamflow information from a hydrologically similar, continuously gaged basin ('index streamgage') to one with a very limited streamflow record, but often results in biased estimates. The application of the index-streamgage approach can be generalized into three steps: (1) selection of streamflow information of interest, (2) definition of hydrologic similarity and selection of index streamgage, and (3) application of an information-transfer approach. Here, we explore the effects of (1) the range of streamflow values, (2) the areal density of streamgages, and (3) index-streamgage selection criteria on the bias of three information-transfer approaches on estimates of the 7-day, 10-year minimum streamflow (Q7, 10). The three information-transfer approaches considered are maintenance of variance extension, base-flow correlation, and ratio of measured to concurrent gaged streamflow (Q-ratio invariance). Our results for 1120 streamgages throughout the United States suggest that only a small portion of the total bias in estimated streamflow values is explained by the areal density of the streamgages and the hydrologic similarity between the two basins. However, restricting the range of streamflow values used in the index-streamgage approach reduces the bias of estimated Q7, 10 values substantially. Importantly, estimated Q7, 10 values are heavily biased when the observed Q7, 10 values are near zero. Results of the analysis also showed that Q7, 10 estimates from two of the three index-streamgage approaches have lower root-mean-square error values than estimates derived from multiple regressions for the large regions considered in this study.
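
    The Q-ratio invariance transfer mentioned above can be sketched minimally as follows. All flows and the variable names are hypothetical, purely for illustration; the method assumes the ratio of flows at the partial-record site to concurrent flows at the index streamgage is constant, so the index site's Q7,10 scales directly:

```python
# Spot measurements at the partial-record site and concurrent daily flows
# at the hydrologically similar index streamgage (hypothetical, m3/s).
spot_measurements = [1.8, 2.4, 0.9]
concurrent_index = [6.0, 8.0, 3.0]

# Mean ratio of measured to concurrent gaged streamflow.
ratios = [q / qi for q, qi in zip(spot_measurements, concurrent_index)]
mean_ratio = sum(ratios) / len(ratios)

# Transfer the index site's Q7,10 (7-day, 10-year minimum flow) to the
# partial-record site by scaling with the mean ratio.
q710_index = 2.5
q710_estimate = mean_ratio * q710_index
print(round(q710_estimate, 3))   # -> 0.75
```

The study's caution applies directly to this step: when the observed Q7,10 is near zero, small errors in the ratio produce heavily biased estimates.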

  6. By how much do we underestimate species diversity of liverworts using morphological evidence? An example from Australasian Plagiochila (Plagiochilaceae: Jungermanniopsida).

    PubMed

    Renner, Matt A M; Heslewood, Margaret M; Patzak, Simon D F; Schäfer-Verwimp, Alfons; Heinrichs, Jochen

    2017-02-01

    As a framework for revisionary study of the leafy liverwort Plagiochila in Australia, two methods for species delimitation from molecular sequence data, the General Mixed Yule Coalescent model (GMYC) and Automatic Barcode Gap Discovery (ABGD), were applied to a dataset including 265 individuals from Australia, New Zealand, and the Pacific. Groups returned by GMYC and ABGD were incongruent in some lineages, and ABGD tended to lump groups. This may reflect underlying heterogeneity in the history of diversification within different lineages of Plagiochila. GMYC results from trees calculated using three different molecular clocks were compared; in some lineages, different primary species hypotheses were returned by analyses of trees estimated under different clock models, suggesting that clock model selection should be a routine component of phylogeny reconstruction for tree-based species delimitation methods such as GMYC. Our results suggest that a minimum of 71 Plagiochilaceae species occur in Australasia, 16 more than currently accepted for the region, comprising 8 undetermined species and 8 synonyms requiring reinstatement. Despite modern taxonomic investigation over a four-decade period, (1) real diversity is 29% higher than currently recognized; and (2) 12 of 33, or 36%, of currently accepted and previously untested Australasian species have circumscription issues, including polyphyly, paraphyly, internal phylogenetic structure, or combinations of two or more of these issues. Both findings reflect the many challenges associated with grouping decisions based solely on morphological data in morphologically simple yet polymorphic plant lineages. Our results again highlight the critical need for combined molecular-morphological datasets as a basis for resolving robust species hypotheses in species-rich bryophyte lineages.
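
    The "barcode gap" idea behind ABGD can be illustrated with a toy example (all distances hypothetical; real ABGD is recursive and model-aware, this only shows the core intuition): sort all pairwise distances and split at the widest gap, which ideally separates intra- from interspecific divergence.

```python
# Hypothetical pairwise genetic distances: a tight intraspecific cluster
# and a distant interspecific cluster, separated by a "barcode gap".
dists = sorted([0.001, 0.002, 0.003, 0.004, 0.031, 0.034, 0.036, 0.040])

# Find the widest gap between consecutive sorted distances.
gaps = [(dists[i + 1] - dists[i], i) for i in range(len(dists) - 1)]
width, i = max(gaps)

# Place the delimitation threshold in the middle of that gap.
threshold = (dists[i] + dists[i + 1]) / 2
print(round(threshold, 4))   # -> 0.0175, between the two clusters
```

When lineages differ in their divergence histories, as the abstract suggests for Plagiochila, no single gap cleanly separates the clusters, which is one reason ABGD can lump groups.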

  7. Pitfalls of the MTT assay: Direct and off-target effects of inhibitors can result in over/underestimation of cell viability.

    PubMed

    Stepanenko, A A; Dmitrenko, V V

    2015-12-15

    The MTT assay (and to a lesser degree MTS, XTT or WST) is a widely exploited approach for measuring cell viability/drug cytotoxicity. MTT reduction occurs throughout a cell and can be significantly affected by a number of factors, including metabolic and energy perturbations, changes in the activity of oxidoreductases, endo-/exocytosis and intracellular trafficking. Over/underestimation of cell viability by the MTT assay may be due both to adaptive metabolic and mitochondrial reprogramming of cells subjected to drug treatment-mediated stress and to inhibitor off-target effects. Previously, imatinib, rottlerin, ursolic acid, verapamil, resveratrol, genistein nanoparticles and some polypeptides were shown to interfere with the MTT reduction rate, resulting in inconsistent results between the MTT assay and alternative assays. Here, to test for under/overestimation of viability by the MTT assay, we compared results derived from the MTT assay with the trypan blue exclusion assay after treatment of glioblastoma U251, T98G and C6 cells with three widely used inhibitors with known direct and side effects on energy and metabolic homeostasis: temozolomide (TMZ), a DNA-methylating agent; temsirolimus (TEM), an inhibitor of mTOR kinase; and U0126, an inhibitor of MEK1/2 kinases. Inhibitors were applied either short-term, as in IC50-evaluation studies, or long-term, as in studies focusing on acquisition of drug resistance. We showed that over/underestimation of cell viability by the MTT assay, and its significance, depends on the cell line, the time point of viability measurement and other experimental parameters. Furthermore, we provide a comprehensive survey of factors that should be accounted for in the MTT assay. To avoid misinterpretation of results, supplementation of the tetrazolium salt-based assays with other, non-metabolic assays is recommended.

  8. A trans-Amazonian screening of mtDNA reveals deep intraspecific divergence in forest birds and suggests a vast underestimation of species diversity.

    PubMed

    Milá, Borja; Tavares, Erika S; Muñoz Saldaña, Alberto; Karubian, Jordan; Smith, Thomas B; Baker, Allan J

    2012-01-01

    The Amazonian avifauna remains severely understudied relative to that of the temperate zone, and its species richness is thought to be underestimated by current taxonomy. Recent molecular systematic studies using mtDNA sequence reveal that traditionally accepted species-level taxa often conceal genetically divergent subspecific lineages found to represent new species upon close taxonomic scrutiny, suggesting that intraspecific mtDNA variation could be useful in species discovery. Surveys of mtDNA variation in Holarctic species have revealed patterns of variation that are largely congruent with species boundaries. However, little information exists on intraspecific divergence in most Amazonian species. Here we screen intraspecific mtDNA genetic variation in 41 Amazonian forest understory species belonging to 36 genera and 17 families in 6 orders, using 758 individual samples from Ecuador and French Guiana. For 13 of these species, we also analyzed trans-Andean populations from the Ecuadorian Chocó. A consistent pattern of deep intraspecific divergence among trans-Amazonian haplogroups was found for 33 of the 41 taxa, and genetic differentiation and genetic diversity among them was highly variable, suggesting a complex range of evolutionary histories. Mean sequence divergence within families was the same as that found in North American birds (13%), yet mean intraspecific divergence in Neotropical species was an order of magnitude larger (2.13% vs. 0.23%), with mean distance between intraspecific lineages reaching 3.56%. We found no clear relationship between genetic distances and differentiation in plumage color. Our results identify numerous genetically and phenotypically divergent lineages which may result in new species-level designations upon closer taxonomic scrutiny and thorough sampling, although lineages in the tropical region could be older than those in the temperate zone without necessarily representing separate species. In-depth phylogeographic surveys
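
    Divergence figures such as the 2.13% mean intraspecific distance are, at their simplest, proportions of differing sites between aligned sequences. A minimal uncorrected p-distance sketch (toy haplotypes, hypothetical; published distances typically apply substitution-model corrections on much longer sequences):

```python
def p_distance(seq1, seq2):
    """Uncorrected p-distance: proportion of differing sites between two
    aligned, equal-length sequences."""
    assert len(seq1) == len(seq2), "sequences must be aligned to equal length"
    diffs = sum(1 for a, b in zip(seq1, seq2) if a != b)
    return diffs / len(seq1)

# Toy 20-bp mtDNA haplotypes (hypothetical) differing at two sites.
hapA = "ACGTACGTACGTACGTACGT"
hapB = "ACGTACCTACGTACGAACGT"
print(p_distance(hapA, hapB))   # 2 differences / 20 sites = 0.1
```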

  9. Condition-dependent virulence of slow bee paralysis virus in Bombus terrestris: are the impacts of honeybee viruses in wild pollinators underestimated?

    PubMed

    Manley, Robyn; Boots, Mike; Wilfert, Lena

    2017-03-30

    Slow bee paralysis virus (SBPV)-previously considered an obligate honeybee disease-is now known to be prevalent in bumblebee species. SBPV is highly virulent in honeybees in association with Varroa mites, but has been considered relatively benign otherwise. However, condition-dependent pathogens can appear asymptomatic under good, resource abundant conditions, and negative impacts on host fitness may only become apparent when under stressful or resource-limited conditions. We tested whether SBPV expresses condition-dependent virulence in its bumblebee host, Bombus terrestris, by orally inoculating bees with SBPV and recording longevity under satiated and starvation conditions. SBPV infection resulted in significant virulence under starvation conditions, with infected bees 1.6 times more likely to die at any given time point (a median of 2.3 h earlier than uninfected bees), whereas there was no effect under satiated conditions. This demonstrates clear condition-dependent virulence for SBPV in B. terrestris. Infections that appear asymptomatic in non-stressful laboratory assays may nevertheless have significant impacts under natural conditions in the wild. For multi-host pathogens such as SBPV, the use of sentinel host species in laboratory assays may further lead to the underestimation of pathogen impacts on other species in nature. In this case the impact of 'honeybee viruses' on wild pollinators may be underestimated, with detrimental effects on conservation and food security. Our results highlight the importance of multiple assays and multiple host species when testing for virulence, in order for laboratory studies to accurately inform conservation policy and mitigate disease impacts in wild pollinators.

  10. A Trans-Amazonian Screening of mtDNA Reveals Deep Intraspecific Divergence in Forest Birds and Suggests a Vast Underestimation of Species Diversity

    PubMed Central

    Milá, Borja; Tavares, Erika S.; Muñoz Saldaña, Alberto; Karubian, Jordan; Smith, Thomas B.; Baker, Allan J.

    2012-01-01

    The Amazonian avifauna remains severely understudied relative to that of the temperate zone, and its species richness is thought to be underestimated by current taxonomy. Recent molecular systematic studies using mtDNA sequence reveal that traditionally accepted species-level taxa often conceal genetically divergent subspecific lineages found to represent new species upon close taxonomic scrutiny, suggesting that intraspecific mtDNA variation could be useful in species discovery. Surveys of mtDNA variation in Holarctic species have revealed patterns of variation that are largely congruent with species boundaries. However, little information exists on intraspecific divergence in most Amazonian species. Here we screen intraspecific mtDNA genetic variation in 41 Amazonian forest understory species belonging to 36 genera and 17 families in 6 orders, using 758 individual samples from Ecuador and French Guiana. For 13 of these species, we also analyzed trans-Andean populations from the Ecuadorian Chocó. A consistent pattern of deep intraspecific divergence among trans-Amazonian haplogroups was found for 33 of the 41 taxa, and genetic differentiation and genetic diversity among them was highly variable, suggesting a complex range of evolutionary histories. Mean sequence divergence within families was the same as that found in North American birds (13%), yet mean intraspecific divergence in Neotropical species was an order of magnitude larger (2.13% vs. 0.23%), with mean distance between intraspecific lineages reaching 3.56%. We found no clear relationship between genetic distances and differentiation in plumage color. Our results identify numerous genetically and phenotypically divergent lineages which may result in new species-level designations upon closer taxonomic scrutiny and thorough sampling, although lineages in the tropical region could be older than those in the temperate zone without necessarily representing separate species. In-depth phylogeographic surveys

  11. Severe scurvy: an underestimated disease.

    PubMed

    Levavasseur, M; Becquart, C; Pape, E; Pigeyre, M; Rousseaux, J; Staumont-Sallé, D; Delaporte, E

    2015-09-01

    Scurvy is one of the oldest diseases in human history. Although scurvy has become a largely forgotten disease in developed countries, rare cases still occur, especially in people following extreme diets, elderly people or children with poor diets, and patients with malabsorption. We describe three cases of scurvy: the first in a patient diagnosed with Crohn's disease, the second in a context of anorexia nervosa and drug addiction, and the third in a context of social isolation. Early recognition of scurvy can be difficult because symptoms may appear nonspecific and can mimic more common conditions. In any patient with spontaneous hematoma and purpura in the context of a nutritional disorder, scurvy should be systematically considered. As this disease can lead to severe complications, such as bone pain, heart failure or gastrointestinal symptoms, nothing should delay vitamin C supplementation, which is a simple and rapidly effective treatment.

  12. A Bayesian model to correct underestimated 3-D wind speeds from sonic anemometers increases turbulent components of the surface energy balance

    NASA Astrophysics Data System (ADS)

    Frank, John M.; Massman, William J.; Ewers, Brent E.

    2016-12-01

    Sonic anemometers are the principal instruments in micrometeorological studies of turbulence and ecosystem fluxes. Common designs underestimate vertical wind measurements because they lack a correction for transducer shadowing, with no consensus on a suitable correction. We reanalyze a subset of data collected during field experiments in 2011 and 2013 featuring two or four CSAT3 sonic anemometers. We introduce a Bayesian analysis to resolve the three-dimensional correction by optimizing differences between anemometers mounted both vertically and horizontally. A grid of 512 points (~±5° resolution in wind location) is defined on a sphere around the sonic anemometer, from which the shadow correction for each transducer pair is derived from a set of 138 unique state variables describing the quadrants and borders. Using the Markov chain Monte Carlo (MCMC) method, the Bayesian model proposes new values for each state variable, recalculates the fast-response data set, summarizes the 5 min wind statistics, and accepts the proposed new values based on the probability that they make measurements from vertical and horizontal anemometers more equivalent. MCMC chains were constructed for three different prior distributions describing the state variables: no shadow correction, the Kaimal correction for transducer shadowing, and double the Kaimal correction, all initialized with 10 % uncertainty. The final posterior correction did not depend on the prior distribution and revealed both self- and cross-shadowing effects from all transducers. After correction, the vertical wind velocity and sensible heat flux increased ~10 % with ~2 % uncertainty, which was significantly higher than the Kaimal correction. We applied the posterior correction to eddy-covariance data from various sites across North America and found that the turbulent components of the energy balance (sensible plus latent heat flux) increased on average between 8 and 12 %, with an average 95 % credible
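
    The accept/reject logic of the MCMC described above can be sketched with a generic one-parameter Metropolis sampler. This is not the authors' code: the single correction factor, toy observations and Gaussian likelihood are all hypothetical stand-ins for the 138-variable model.

```python
import math
import random

random.seed(42)

def log_likelihood(factor, obs_vertical=1.00, obs_horizontal=0.90, sigma=0.02):
    """Gaussian log-likelihood: the corrected horizontally-mounted estimate
    should match the vertical one (toy data: horizontal reads 10% low)."""
    resid = obs_vertical - factor * obs_horizontal
    return -0.5 * (resid / sigma) ** 2

factor = 1.0                  # start from "no correction", like the flat prior
samples = []
for _ in range(20000):
    # Propose a small random perturbation and accept with probability
    # min(1, likelihood ratio) -- the Metropolis rule.
    proposal = factor + random.gauss(0.0, 0.01)
    if math.log(random.random()) < log_likelihood(proposal) - log_likelihood(factor):
        factor = proposal
    samples.append(factor)

posterior = samples[5000:]    # discard burn-in
print(round(sum(posterior) / len(posterior), 2))   # near 1/0.9, i.e. ~1.11
```

In the real analysis each proposal also triggers recomputation of the fast-response data set and 5 min wind statistics, which is what makes the inference expensive.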

  13. MDCT evaluation of sternal variations: Pictorial essay

    PubMed Central

    Duraikannu, Chary; Noronha, Olma V; Sundarrajan, Pushparajan

    2016-01-01

    Sternal variations and anomalies have been identified in the past during autopsy or cadaveric studies. Recently, an increasing number of minor sternal variations have been reported with the advent of multidetector computed tomography (CT). Although there are many sternal variations that occur with varying appearance and prevalence, most of them are not recognized or are underreported during routine imaging of thorax. Identification of sternal variations is important to differentiate from pathological conditions and to prevent fatal complications prior to sternal interventions like marrow aspiration or acupuncture. This article aims to describe the minor and asymptomatic sternal variations by multidetector CT and their clinical significance. PMID:27413263

  14. Bladder carcinoma: MDCT cystography and virtual cystoscopy.

    PubMed

    Panebianco, Valeria; Sciarra, Alessandro; Di Martino, Michele; Bernardo, Silvia; Vergari, Valeria; Gentilucci, Alessandro; Catalano, Carlo; Passariello, Roberto

    2010-06-01

    Bladder carcinoma is the most common tumor of the lower urinary tract, accounting for 90% of cancer cases. Conventional cystoscopy represents the gold standard for diagnosis and local management of bladder carcinoma. As the prevalence of transitional cell carcinoma is four-fold greater in men than in women, the endoscopic procedure presents objective difficulties related to the length and bending of the male urethra. The most important problems are intense discomfort for the patient and bleeding; furthermore, the high cost, invasiveness, and local complications such as infections and mechanical lesions are well-known drawbacks. Additionally, conventional cystoscopy does not provide information about extravesical extension of the tumor. CT cystography, combined with virtual cystoscopy, is mandatory for TNM staging of the tumor and is also useful when conventional cystoscopy is inconclusive or cannot be performed. We present the CT cystography findings with virtual endoscopy correlation and the appearance of bladder carcinoma.

  15. Cardiac Computed Tomography (Multidetector CT, or MDCT)

    MedlinePlus


  16. A scaling theory for the size distribution of emitted dust aerosols suggests climate models underestimate the size of the global dust cycle.

    PubMed

    Kok, Jasper F

    2011-01-18

    Mineral dust aerosols impact Earth's radiation budget through interactions with clouds, ecosystems, and radiation, which constitutes a substantial uncertainty in understanding past and predicting future climate changes. One of the causes of this large uncertainty is that the size distribution of emitted dust aerosols is poorly understood. The present study shows that regional and global circulation models (GCMs) overestimate the emitted fraction of clay aerosols (< 2 μm diameter) by a factor of ∼2-8 relative to measurements. This discrepancy is resolved by deriving a simple theoretical expression of the emitted dust size distribution that is in excellent agreement with measurements. This expression is based on the physics of the scale-invariant fragmentation of brittle materials, which is shown to be applicable to dust emission. Because clay aerosols produce a strong radiative cooling, the overestimation of the clay fraction causes GCMs to also overestimate the radiative cooling of a given quantity of emitted dust. On local and regional scales, this affects the magnitude and possibly the sign of the dust radiative forcing, with implications for numerical weather forecasting and regional climate predictions in dusty regions. On a global scale, the dust cycle in most GCMs is tuned to match radiative measurements, such that the overestimation of the radiative cooling of a given quantity of emitted dust has likely caused GCMs to underestimate the global dust emission rate. This implies that the deposition flux of dust and its fertilizing effects on ecosystems may be substantially larger than thought.

  17. The potential impact of human papillomavirus vaccination in contemporary cytologically screened populations may be underestimated: an observational retrospective analysis of invasive cervical cancers.

    PubMed

    Powell, Ned; Boyde, Adam; Tristram, Amanda; Hibbitts, Sam; Fiander, Alison

    2009-11-15

    The aim of this study was to determine the proportion of invasive cervical cancers attributable to human papillomavirus (HPV) types 16 and 18 in a contemporary, cytologically well-screened UK population. This was achieved in a retrospective observational analysis by HPV typing 453 archival invasive cervical cancers diagnosed between January 1, 2000 and September 1, 2006. Pathological material was collected from 9 hospitals across Wales (UK), and HPV typing and pathology review were conducted at a central laboratory. Genotyping for high-risk HPV DNA was performed by PCR-enzyme immunoassay using the GP5+/6+ primer set. DNA was successfully extracted from 297 cases. Two hundred and eighty cases were included in the final analysis. The proportion of cases which had only HPV 16 and/or 18 was 219 of 280 (78.2%, 95% CI = 73.0-82.7); the proportion of cases which had HPV 16 or 18 and another HPV type was 230 of 280 (82.1%, 95% CI = 77.2-86.2). The proportion of cervical cancers associated with infection with HPV types 16 and 18 has previously been estimated at around 70%. The appropriate figure for a cytologically well-screened UK population appears to be approximately 80%. Hence, the potential impact of the current vaccination programme may be underestimated.

  18. National survey data for zoonotic schistosomiasis in the Philippines grossly underestimates the true burden of disease within endemic zones: implications for future control.

    PubMed

    Olveda, Remigio M; Tallo, Veronica; Olveda, David U; Inobaya, Marianette T; Chau, Thao N; Ross, Allen G

    2016-04-01

    Zoonotic schistosomiasis has a long endemic history in the Philippines. Human mass drug administration has been the cornerstone of schistosomiasis control in the country for the past three decades. Recent publications utilizing retrospective national survey data have indicated that the national human prevalence of the disease is <1%, hence the disease is now close to elimination. However, the evidence for such a claim is weak, given that less than a third of the human population is currently being treated annually within endemic zones and only a third of those treated actually swallow the tablets. For those who consume the drug at the single oral dose of 40 mg/kg, the estimated cure rate is 52% based on a recent meta-analysis. Thus, approximately 5% of the endemic human population is in reality receiving the appropriate treatment. To compound this public health problem, most of the bovines in the endemic communities are concurrently infected but are not treated under the current national control programme. Given this evidence, it is believed that the human prevalence of schistosomiasis within endemic regions has been grossly underestimated. Inherent flaws in the reporting of national schistosomiasis prevalence data are reported here, and the problems of utilizing national retrospective data in making geographic information system (GIS) risk maps and advising policy makers of the outcomes are highlighted.
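    The "approximately 5%" figure follows from chaining the three fractions quoted in the abstract; a quick arithmetic check (all numbers taken directly from the text above):

```python
coverage = 1 / 3     # at most a third of the endemic population is treated annually
adherence = 1 / 3    # only a third of those treated actually swallow the tablets
cure_rate = 0.52     # estimated cure rate at a single oral dose of 40 mg/kg

effectively_cured = coverage * adherence * cure_rate
print(f"{effectively_cured:.1%}")  # 5.8%, i.e. "approximately 5%"
```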

  19. The importance of associations with saprotrophic non-Rhizoctonia fungi among fully mycoheterotrophic orchids is currently under-estimated: novel evidence from sub-tropical Asia

    PubMed Central

    Lee, Yung-I; Yang, Chih-Kai; Gebauer, Gerhard

    2015-01-01

    Background and Aims Most fully mycoheterotrophic (MH) orchids investigated to date are mycorrhizal with fungi that simultaneously form ectomycorrhizas with forest trees. Only a few MH orchids are currently known to be mycorrhizal with saprotrophic, mostly wood-decomposing, fungi instead of ectomycorrhizal fungi. This study provides evidence that the importance of associations between MH orchids and saprotrophic non-Rhizoctonia fungi is currently under-estimated. Methods Using microscopic techniques and molecular approaches, mycorrhizal fungi were localized and identified for seven MH orchid species from four genera and two subfamilies, Vanilloideae and Epidendroideae, growing in four humid and warm sub-tropical forests in Taiwan. Carbon and nitrogen stable isotope natural abundances of MH orchids and autotrophic reference plants were used in order to elucidate the nutritional resources utilized by the orchids. Key Results Six out of the seven MH orchid species were mycorrhizal with either wood- or litter-decaying saprotrophic fungi. Only one orchid species was associated with ectomycorrhizal fungi. Stable isotope abundance patterns showed significant distinctions between orchids mycorrhizal with the three groups of fungal hosts. Conclusions Mycoheterotrophic orchids utilizing saprotrophic non-Rhizoctonia fungi as a carbon and nutrient source are clearly more frequent than hitherto assumed. On the basis of this kind of nutrition, orchids can thrive in deeply shaded, light-limiting forest understoreys even without support from ectomycorrhizal fungi. Sub-tropical East Asia appears to be a hotspot for orchids mycorrhizal with saprotrophic non-Rhizoctonia fungi. PMID:26113634

  20. A Generalization of Generalized Fibonacci and Generalized Pell Numbers

    ERIC Educational Resources Information Center

    Abd-Elhameed, W. M.; Zeyada, N. A.

    2017-01-01

    This paper is concerned with developing a new class of generalized numbers. The main advantage of this class is that it generalizes the two classes of generalized Fibonacci numbers and generalized Pell numbers. Some new identities involving these generalized numbers are obtained. In addition, the two well-known identities of Sury and Marques which…
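    The abstract does not state the paper's exact definition, but a common way to unify the two classes is a Horadam-type second-order recurrence whose coefficients recover the Fibonacci numbers with (a, b) = (1, 1) and the Pell numbers with (a, b) = (2, 1); a minimal sketch under that assumption:

```python
def horadam(n, a, b, g0=0, g1=1):
    """G(n) = a*G(n-1) + b*G(n-2), with G(0) = g0 and G(1) = g1.

    (a, b) = (1, 1) gives the Fibonacci numbers; (2, 1) gives the Pell numbers.
    """
    prev, cur = g0, g1
    for _ in range(n):
        prev, cur = cur, a * cur + b * prev
    return prev

print([horadam(n, 1, 1) for n in range(8)])  # Fibonacci: [0, 1, 1, 2, 3, 5, 8, 13]
print([horadam(n, 2, 1) for n in range(6)])  # Pell:      [0, 1, 2, 5, 12, 29]
```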

  1. The widely used small subunit 18S rDNA molecule greatly underestimates true diversity in biodiversity surveys of the meiofauna.

    PubMed

    Tang, Cuong Q; Leasi, Francesca; Obertegger, Ulrike; Kieneke, Alexander; Barraclough, Timothy G; Fontaneto, Diego

    2012-10-02

    Molecular tools have revolutionized the exploration of biodiversity, especially in organisms for which traditional taxonomy is difficult, such as microscopic animals (meiofauna). Environmental DNA (eDNA) metabarcode surveys of DNA extracted from sediment samples are increasingly popular for surveying biodiversity. Most eDNA surveys use the nuclear small-subunit rDNA gene (18S) as a marker; however, different markers and metrics used for delimiting species have not yet been evaluated against each other or against morphologically defined species (morphospecies). We assessed more than 12,000 meiofaunal sequences of 18S and of the main alternatively used marker [cytochrome c oxidase subunit I (COI) mtDNA] belonging to 55 datasets covering three taxonomic ranks. Our results show that 18S reduced diversity estimates by a factor of 0.4 relative to morphospecies, whereas COI increased diversity estimates by a factor of 7.6. Moreover, estimates of species richness using COI were robust among three of four commonly used delimitation metrics, whereas estimates using 18S varied widely with the different metrics. We show that meiofaunal diversity has been greatly underestimated by 18S eDNA surveys and that the use of COI provides a better estimate of diversity. The suitability of COI is supported by cross-mating experiments in the literature and evolutionary analyses of discreteness in patterns of genetic variation. Furthermore, its splitting of morphospecies is expected from documented levels of cryptic taxa in exemplar meiofauna. We recommend against using 18S as a marker for biodiversity surveys and suggest that use of COI for eDNA surveys could provide more accurate estimates of species richness in the future.

  2. Women Under-estimate the Age of their Partners during Survey Interviews: Implications for HIV Risk associated with Age Mixing in Northern Malawi

    PubMed Central

    Kohler, Hans-Peter; Mkandawire, James

    2011-01-01

    Background Age mixing may explain differences in HIV prevalence across populations in sub-Saharan countries, but the validity of survey data on age mixing is unknown. Methods Age differences between partners are frequently estimated indirectly, by asking respondents to report their partner’s age. Partner age can also be assessed directly, by tracing partners and asking them to report their own age. We use data from 519 relationships, collected in Likoma (Malawi), in which both partners were interviewed and tested for HIV. In these relationships age differences were assessed both indirectly and directly, and estimates could thus be compared. We calculate the specificity and sensitivity of the indirect method in identifying age-homogenous/age-disparate relationships in which the male partner is less/more than 5 or 10 years older than the respondent. Results Women were accurate in identifying age-homogenous relationships, but not in identifying age-disparate relationships (specificity ≈ 90%, sensitivity = 24.3%). The sensitivity of the indirect method was even lower in detecting partners older than the respondent by 10+ years (9.6%). Among 43 relationships with an HIV-infected partner included in this study, there were close to 3 times more age-disparate relationships according to direct measures of partner age than according to women’s reports of their partner’s age (17% vs. 46%). Conclusions Survey reports of partner age significantly under-estimate the extent of and the HIV risk associated with age mixing in this population. Future studies of the impact of sexual mixing patterns on HIV risk in sub-Saharan countries should take reporting biases into account. PMID:21992979
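    The sensitivity/specificity comparison described above reduces to a standard confusion-matrix computation over paired age gaps. A minimal sketch with made-up toy data (not the study's data), classifying a relationship as age-disparate when the male partner is at least 5 years older:

```python
def age_disparity_metrics(direct_gaps, reported_gaps, threshold=5):
    """Sensitivity and specificity of indirectly reported partner-age gaps
    (woman's report) against directly measured ones (partner's own report)
    for detecting age-disparate relationships (gap >= threshold years)."""
    tp = fp = tn = fn = 0
    for true_gap, reported_gap in zip(direct_gaps, reported_gaps):
        truth = true_gap >= threshold      # relationship really age-disparate
        flagged = reported_gap >= threshold  # indirect report says so
        if truth and flagged:
            tp += 1
        elif truth:
            fn += 1
        elif flagged:
            fp += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical age gaps in years: direct (partner-reported) vs. indirect (woman-reported)
direct = [2, 7, 10, 3, 6, 1, 8, 4]
reported = [2, 4, 6, 3, 3, 1, 5, 6]
sens, spec = age_disparity_metrics(direct, reported)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```

With these toy numbers the women's reports miss half of the truly age-disparate relationships, mirroring the direction of the bias the study reports.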

  3. Are food insecurity's health impacts underestimated in the U.S. population? Marginal food security also predicts adverse health outcomes in young U.S. children and mothers.

    PubMed

    Cook, John T; Black, Maureen; Chilton, Mariana; Cutts, Diana; Ettinger de Cuba, Stephanie; Heeren, Timothy C; Rose-Jacobs, Ruth; Sandel, Megan; Casey, Patrick H; Coleman, Sharon; Weiss, Ingrid; Frank, Deborah A

    2013-01-01

    This review addresses epidemiological, public health, and social policy implications of categorizing young children and their adult female caregivers in the United States as food secure when they live in households with "marginal food security," as indicated by the U.S. Household Food Security Survey Module. Existing literature shows that households in the US with marginal food security are more like food-insecure households than food-secure households. Similarities include socio-demographic characteristics, psychosocial profiles, and patterns of disease and health risk. Building on existing knowledge, we present new research on associations of marginal food security with health and developmental risks in young children (<48 mo) and health in their female caregivers. Marginal food security is positively associated with adverse health outcomes compared with food security, but the strength of the associations is weaker than that for food insecurity as usually defined in the US. Nonoverlapping CIs, when comparing odds of marginally food-secure children's fair/poor health and developmental risk and caregivers' depressive symptoms and fair/poor health with those in food-secure and -insecure families, indicate associations of marginal food security significantly and distinctly intermediate between those of food security and food insecurity. Evidence from reviewed research and the new research presented indicates that households with marginal food security should not be classified as food secure, as is the current practice, but should be reported in a separate discrete category. These findings highlight the potential underestimation of the prevalence of adverse health outcomes associated with exposure to lack of enough food for an active, healthy life in the US and indicate an even greater need for preventive action and policies to limit and reduce exposure among children and mothers.

  4. A scaling theory for the size distribution of emitted dust aerosols suggests climate models underestimate the size of the global dust cycle

    PubMed Central

    Kok, Jasper F.

    2011-01-01

    Mineral dust aerosols impact Earth’s radiation budget through interactions with clouds, ecosystems, and radiation, which constitutes a substantial uncertainty in understanding past and predicting future climate changes. One of the causes of this large uncertainty is that the size distribution of emitted dust aerosols is poorly understood. The present study shows that regional and global circulation models (GCMs) overestimate the emitted fraction of clay aerosols (< 2 μm diameter) by a factor of ∼2–8 relative to measurements. This discrepancy is resolved by deriving a simple theoretical expression of the emitted dust size distribution that is in excellent agreement with measurements. This expression is based on the physics of the scale-invariant fragmentation of brittle materials, which is shown to be applicable to dust emission. Because clay aerosols produce a strong radiative cooling, the overestimation of the clay fraction causes GCMs to also overestimate the radiative cooling of a given quantity of emitted dust. On local and regional scales, this affects the magnitude and possibly the sign of the dust radiative forcing, with implications for numerical weather forecasting and regional climate predictions in dusty regions. On a global scale, the dust cycle in most GCMs is tuned to match radiative measurements, such that the overestimation of the radiative cooling of a given quantity of emitted dust has likely caused GCMs to underestimate the global dust emission rate. This implies that the deposition flux of dust and its fertilizing effects on ecosystems may be substantially larger than thought. PMID:21189304

  5. Are We Under-Estimating the Association between Autism Symptoms?: The Importance of Considering Simultaneous Selection When Using Samples of Individuals Who Meet Diagnostic Criteria for an Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Murray, Aja Louise; McKenzie, Karen; Kuenssberg, Renate; O'Donnell, Michael

    2014-01-01

    The magnitude of symptom inter-correlations in diagnosed individuals has contributed to the evidence that autism spectrum disorders (ASD) is a fractionable disorder. Such correlations may substantially under-estimate the population correlations among symptoms due to simultaneous selection on the areas of deficit required for diagnosis. Using…

  6. A measurement-based generalized source model for Monte Carlo dose simulations of CT scans

    NASA Astrophysics Data System (ADS)

    Ming, Xin; Feng, Yuanming; Liu, Ransheng; Yang, Chengwen; Zhou, Li; Zhai, Hezheng; Deng, Jun

    2017-03-01

    The goal of this study is to develop a generalized source model for accurate Monte Carlo dose simulations of CT scans based solely on the measurement data without a priori knowledge of scanner specifications. The proposed generalized source model consists of an extended circular source located at x-ray target level with its energy spectrum, source distribution and fluence distribution derived from a set of measurement data conveniently available in the clinic. Specifically, the central axis percent depth dose (PDD) curves measured in water and the cone output factors measured in air were used to derive the energy spectrum and the source distribution respectively with a Levenberg–Marquardt algorithm. The in-air film measurement of fan-beam dose profiles at fixed gantry was back-projected to generate the fluence distribution of the source model. A benchmarked Monte Carlo user code was used to simulate the dose distributions in water with the developed source model as beam input. The feasibility and accuracy of the proposed source model was tested on a GE LightSpeed and a Philips Brilliance Big Bore multi-detector CT (MDCT) scanners available in our clinic. In general, the Monte Carlo simulations of the PDDs in water and dose profiles along lateral and longitudinal directions agreed with the measurements within 4%/1 mm for both CT scanners. The absolute dose comparison using two CTDI phantoms (16 cm and 32 cm in diameters) indicated a better than 5% agreement between the Monte Carlo-simulated and the ion chamber-measured doses at a variety of locations for the two scanners. Overall, this study demonstrated that a generalized source model can be constructed based only on a set of measurement data and used for accurate Monte Carlo dose simulations of patients’ CT scans, which would facilitate patient-specific CT organ dose estimation and cancer risk management in the diagnostic and therapeutic radiology.
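    As a sketch of the fitting step described above, the snippet below runs a hand-rolled Levenberg–Marquardt loop on a toy exponential depth-dose model PDD(d) = A·exp(−μd). The real method fits an energy spectrum and source distribution to measured PDDs and output factors; the model, parameter values, and synthetic data here are illustrative assumptions only:

```python
import math

def pdd_model(params, d):
    # Toy depth-dose model A * exp(-mu * d); a stand-in for the real spectrum fit
    A, mu = params
    return A * math.exp(-mu * d)

def levenberg_marquardt(depths, doses, params, lam=1e-3, iters=50, eps=1e-6):
    """Minimal 2-parameter Levenberg-Marquardt with a numerical Jacobian."""
    for _ in range(iters):
        resid = [y - pdd_model(params, d) for d, y in zip(depths, doses)]
        # Numerical Jacobian J[i][j] = d(pdd_model)/d(params[j]) at depths[i]
        J = []
        for d in depths:
            base = pdd_model(params, d)
            row = []
            for j in range(2):
                bumped = list(params)
                bumped[j] += eps
                row.append((pdd_model(bumped, d) - base) / eps)
            J.append(row)
        # Damped normal equations: (J^T J + lam * diag(J^T J)) delta = J^T r
        JtJ = [[sum(J[i][a] * J[i][b] for i in range(len(depths))) for b in range(2)]
               for a in range(2)]
        Jtr = [sum(J[i][a] * resid[i] for i in range(len(depths))) for a in range(2)]
        for a in range(2):
            JtJ[a][a] *= 1 + lam
        det = JtJ[0][0] * JtJ[1][1] - JtJ[0][1] * JtJ[1][0]
        delta = [(JtJ[1][1] * Jtr[0] - JtJ[0][1] * Jtr[1]) / det,
                 (JtJ[0][0] * Jtr[1] - JtJ[1][0] * Jtr[0]) / det]
        params = [p + dp for p, dp in zip(params, delta)]
    return params

# Synthetic "measured" PDD generated from A = 100, mu = 0.05
depths = [float(d) for d in range(0, 21, 2)]
doses = [100.0 * math.exp(-0.05 * d) for d in depths]
A_fit, mu_fit = levenberg_marquardt(depths, doses, [90.0, 0.04])
```

In practice a library implementation (e.g. an off-the-shelf least-squares solver) would replace this loop; the sketch only shows the damped Gauss–Newton structure the abstract's fitting step relies on.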

  7. Are we under-estimating the association between autism symptoms?: The importance of considering simultaneous selection when using samples of individuals who meet diagnostic criteria for an autism spectrum disorder.

    PubMed

    Murray, Aja Louise; McKenzie, Karen; Kuenssberg, Renate; O'Donnell, Michael

    2014-11-01

    The magnitude of symptom inter-correlations in diagnosed individuals has contributed to the evidence that autism spectrum disorders (ASD) is a fractionable disorder. Such correlations may substantially under-estimate the population correlations among symptoms due to simultaneous selection on the areas of deficit required for diagnosis. Using statistical simulations of this selection mechanism, we provide estimates of the extent of this bias, given different levels of population correlation between symptoms. We then use real data to compare domain inter-correlations in the Autism Spectrum Quotient, in those with ASD versus a combined ASD and non-ASD sample. Results from both studies indicate that samples restricted to individuals with a diagnosis of ASD potentially substantially under-estimate the magnitude of association between features of ASD.
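    The selection mechanism the authors simulate can be reproduced in a few lines: draw two correlated "symptom" scores, keep only cases exceeding a diagnostic cut-off on both, and compare the correlations. The distribution, cut-off, and correlation value below are illustrative assumptions, not the paper's:

```python
import math
import random

def pearson_r(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(42)
rho = 0.6      # assumed population correlation between two symptom domains
cutoff = 1.0   # assumed diagnostic threshold applied to both domains

x = [random.gauss(0, 1) for _ in range(100_000)]
y = [rho * xi + math.sqrt(1 - rho ** 2) * random.gauss(0, 1) for xi in x]

r_population = pearson_r(x, y)
selected = [(xi, yi) for xi, yi in zip(x, y) if xi > cutoff and yi > cutoff]
xs_sel, ys_sel = zip(*selected)
r_diagnosed = pearson_r(xs_sel, ys_sel)
# r_diagnosed comes out substantially smaller than r_population: selecting on
# both deficits restricts range and attenuates the observed correlation
```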

  8. GENERALIZED CONVEXITY CONES.

    DTIC Science & Technology

    Contents: Introduction; The dual cone of C(psi sub 1, ..., psi sub n); Extreme rays; The cone dual to an intersection of generalized convexity cones; Generalized difference quotients and multivariate convexity; Miscellaneous applications of generalized convexity.

  9. HIV AND POPULATION DYNAMICS: A GENERAL MODEL AND MAXIMUM-LIKELIHOOD STANDARDS FOR EAST AFRICA*

    PubMed Central

    HEUVELINE, PATRICK

    2014-01-01

    In high-prevalence populations, the HIV epidemic undermines the validity of past empirical models and related demographic techniques. A parsimonious model of HIV and population dynamics is presented here and fit to 46,000 observations, gathered from 11 East African populations. The fitted model simulates HIV and population dynamics with standard demographic inputs and only two additional parameters for the onset and scale of the epidemic. The underestimation of the general prevalence of HIV in samples of pregnant women and the fertility impact of HIV are examples of the dynamic interactions that demographic models must reproduce and are shown here to increase over time even with constant prevalence levels. As a result, the impact of HIV on population growth appears to have been underestimated by current population projections that ignore this dynamic. PMID:12846130

  10. A verified spider bite and a review of the literature confirm Indian ornamental tree spiders (Poecilotheria species) as underestimated theraphosids of medical importance.

    PubMed

    Fuchs, Joan; von Dechend, Margot; Mordasini, Raffaella; Ceschi, Alessandro; Nentwig, Wolfgang

    2014-01-01

    Literature on bird spider or tarantula bites (Theraphosidae) is scarce. This is surprising, as they are coveted pets and interaction with their keepers (feeding, cleaning the terrarium, or taking them out to hold) might increase the likelihood of bites. Yet bites seem to be rare events, which might be why most theraphosids are considered harmless, even though the urticating hairs of many American species can cause disagreeable allergic reactions. We describe a case of a verified bite by an Indian ornamental tree spider (Poecilotheria regalis) in which the patient developed severe, long-lasting muscle cramps several hours after the bite. We present a comprehensive review of the literature on bites by these beautiful spiders and conclude that a delayed onset of severe muscle cramps, lasting for days, is characteristic of Poecilotheria bites. We discuss Poecilotheria species as an exception to the general assumption that theraphosid bites are harmless to humans.

  11. General Aviation Propulsion

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Programs exploring and demonstrating new technologies in general aviation propulsion are considered. These programs are the quiet, clean, general aviation turbofan (QCGAT) program; the general aviation turbine engine (GATE) study program; the general aviation propeller technology program; and the advanced rotary, diesel, and reciprocating engine programs.

  12. Generalized holomorphic structures

    NASA Astrophysics Data System (ADS)

    Wang, Yicao

    2014-12-01

    We define the notion of generalized holomorphic principal bundles and establish that their associated vector bundles of holomorphic representations are generalized holomorphic vector bundles defined by M. Gualtieri. Motivated by our definition, several examples of generalized holomorphic structures are constructed. A reduction theorem of generalized holomorphic structures is also included.

  13. Phylogenomics and species delimitation in the knob-scaled lizards of the genus Xenosaurus (Squamata: Xenosauridae) using ddRADseq data reveal a substantial underestimation of diversity.

    PubMed

    Nieto-Montes de Oca, Adrián; Barley, Anthony J; Meza-Lázaro, Rubi N; García-Vázquez, Uri O; Zamora-Abrego, Joan G; Thomson, Robert C; Leaché, Adam D

    2017-01-01

    Middle American knob-scaled lizards of the genus Xenosaurus are a unique radiation of viviparous species that are generally characterized by a flattened body shape and a crevice-dwelling ecology. Only eight species of Xenosaurus, one of them with five subspecies (X. grandis), have been formally described. However, species limits within Xenosaurus have never been examined using molecular data, and no complete phylogeny of the genus has been published. Here, we used ddRADseq data from all of the described and potentially undescribed taxa of Xenosaurus to investigate species limits, and to obtain a phylogenetic hypothesis for the genus. We analyzed the data using a variety of phylogenetic models, and were able to reconstruct a well-resolved and generally well-supported phylogeny for this group. We found Xenosaurus to be composed of four major, allopatric clades concordant with geography. The first and second clades that branch off the tree are distributed on the Atlantic slopes of the Sierra Madre Oriental and are composed of X. mendozai, X. platyceps, and X. newmanorum, and X. tzacualtipantecus and an undescribed species from Puebla, respectively. The third clade is distributed from the Atlantic slopes of the Mexican Transvolcanic Belt in west-central Veracruz south to the Pacific slopes of the Sierra Madre del Sur in Guerrero and Oaxaca, and is composed of X. g. grandis, X. rectocollaris, X. phalaroanthereon, X. g. agrenon, X. penai, and four undescribed species from Oaxaca. The last clade is composed of the four taxa that are geographically closest to the Isthmus of Tehuantepec (X. g. arboreus, X. g. rackhami, X. g. sanmartinensis, and an undescribed species from Oaxaca). We also utilized a variety of molecular species delimitation approaches, including analyses with GMYC, PTP, BPP, and BFD*, which suggested that species diversity in Xenosaurus is at least 30% higher than currently estimated.

  14. Generalized Cartan Calculus in general dimension

    DOE PAGES

    Wang, Yi -Nan

    2015-07-22

    We develop the generalized Cartan Calculus for the groups G = SL(2,R) × R+, SL(5,R) and SO(5,5). They are the underlying algebraic structures of d=9,7,6 exceptional field theory, respectively. These algebraic identities are needed for the "tensor hierarchy" structure in exceptional field theory. The validity of Poincaré lemmas in this new differential geometry is also discussed. Lastly, we explore possible extensions of the generalized Cartan calculus beyond the exceptional series.

  15. Do we Underestimate the Importance of Leaf Size in Plant Economics? Disproportional Scaling of Support Costs Within the Spectrum of Leaf Physiognomy

    PubMed Central

    Niinemets, Ülo; Portsmuth, Angelika; Tena, David; Tobias, Mari; Matesanz, Silvia; Valladares, Fernando

    2007-01-01

    Background Broad scaling relationships between leaf size and function do not take into account that leaves of different size may contain different fractions of support in petiole and mid-rib. Methods The fractions of leaf biomass in petiole, mid-rib and lamina, and the differences in chemistry and structure among mid-ribs, petioles and laminas were investigated in 122 species of contrasting leaf size, life form and climatic distribution to determine the extent to which differences in support modify whole-lamina and whole-leaf structural and chemical characteristics, and the extent to which size-dependent support investments are affected by plant life form and site climate. Key Results For the entire data set, leaf fresh mass varied over five orders of magnitude. The percentage of dry mass in mid-rib increased strongly with lamina size, reaching more than 40 % in the largest laminas. The whole-leaf percentage of mid-rib and petiole increased with leaf size, and the overall support investment was more than 60 % in the largest leaves. Fractional support investments were generally larger in herbaceous than in woody species and tended to be lower in Mediterranean than in cool temperate and tropical plants. Mid-ribs and petioles had lower N and C percentages, and lower dry to fresh mass ratio, but greater density (mass per unit volume) than laminas. N percentage of lamina without mid-rib was up to 40 % higher in the largest leaves than the total-lamina (lamina and mid-rib) N percentage, and up to 60 % higher than whole-leaf N percentage, while lamina density calculated without mid-rib was up to 80 % less than that with the mid-rib. For all leaf compartments, N percentage was negatively associated with density and dry to fresh mass ratio, while C percentage was positively linked to these characteristics, reflecting the overall inverse scaling between structural and physiological characteristics. However, the correlations between N and C percentages and structural

  16. Total mesophilic counts underestimate in many cases the contamination levels of psychrotrophic lactic acid bacteria (LAB) in chilled-stored food products at the end of their shelf-life.

    PubMed

    Pothakos, Vasileios; Samapundo, Simbarashe; Devlieghere, Frank

    2012-12-01

    The major objective of this study was to determine the role of psychrotrophic lactic acid bacteria (LAB) in spoilage-associated phenomena at the end of the shelf-life of 86 various packaged (air, vacuum, modified-atmosphere) chilled-stored retail food products. The current microbiological standards, which are largely based on total viable mesophilic counts, lack the discriminatory capacity to detect psychrotrophic LAB. A comparison between the total viable counts on plates incubated at 30 °C (representing the mesophiles) and at 22 °C (indicating the psychrotrophs) for 86 food samples covering a wide range - ready-to-eat vegetable salads, fresh raw meat, cooked meat products and composite food - showed that a consistent underestimation of the microbial load occurs when total aerobic mesophilic counts are used as a shelf-life parameter. In 38% of the samples, the psychrotrophic counts were significantly higher (+0.5-3 log CFU/g) than the corresponding total aerobic mesophilic counts. A total of 154 lactic acid bacteria that were unable to proliferate at 30 °C were isolated, along with a further 43 that recovered poorly at this temperature. This study highlights the potential fallacy of the total aerobic mesophilic count as a reference shelf-life parameter for chilled food products, as it can often underestimate the contamination levels at the end of the shelf-life.

  17. Generalized anxiety disorder - children

    MedlinePlus

    Generalized anxiety disorder (GAD) is a mental disorder in which a ... (//medlineplus.gov/ency/article/007687.htm)

  18. Generalized anxiety disorder

    MedlinePlus

    Generalized anxiety disorder (GAD) is a mental disorder in which a ... (//medlineplus.gov/ency/article/000917.htm)

  19. General Medical Surveillance Program

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Background on the General Medical Surveillance Program at LeRC is presented. The purpose of the General Medical Surveillance Program at LeRC is outlined, and the specifics of the program are discussed.

  20. Generalized Cartan Calculus in general dimension

    SciTech Connect

    Wang, Yi -Nan

    2015-07-22

    We develop the generalized Cartan Calculus for the groups G = SL(2,R) × R+, SL(5,R) and SO(5,5). They are the underlying algebraic structures of d=9,7,6 exceptional field theory, respectively. These algebraic identities are needed for the "tensor hierarchy" structure in exceptional field theory. The validity of Poincaré lemmas in this new differential geometry is also discussed. Lastly, we explore possible extensions of the generalized Cartan calculus beyond the exceptional series.

  1. Forces in General Relativity

    ERIC Educational Resources Information Center

    Ridgely, Charles T.

    2010-01-01

    Many textbooks dealing with general relativity do not demonstrate the derivation of forces in enough detail. The analyses presented herein demonstrate straightforward methods for computing forces by way of general relativity. Covariant divergence of the stress-energy-momentum tensor is used to derive a general expression of the force experienced…

  2. The General Conference Mennonites.

    ERIC Educational Resources Information Center

    Ediger, Marlow

    General Conference Mennonites and Old Order Amish are compared and contrasted in the areas of physical appearance, religious beliefs, formal education, methods of farming, and home settings. General Conference Mennonites and Amish differ in physical appearance and especially in dress. The General Conference Mennonite men and women dress the same…

  3. General Music Today Yearbook

    ERIC Educational Resources Information Center

    Rowman & Littlefield Education, 2005

    2005-01-01

    The collected 2004-2005 issues of General Music Today, the online journal of MENC's Society for General Music includes articles, research, reviews and resources of interest to general music teachers of all levels. Topics covered include working with special-needs students; emphasizing early childhood environment to enhance musical growth;…

  4. [General anesthesia in ophthalmology].

    PubMed

    Mocanu, Carmen; Cernea, Daniela; Enache, Monalisa; Deca, Andreea Gabriela

    2012-01-01

    General anesthesia is less often used in ophthalmology. Several criteria lead to general anesthesia: patients with hearing impairment, elderly and senile patients, allergic patients, children and young patients, and subjects who completely refuse loco-regional procedures. General anesthesia uses as basic products narcotic, analgesic or neuroleptic substances, administered separately or in combination, with posology adapted to several factors: the patient's pulse, physical status, age, and the duration of the procedure. The choice of anesthetic substance also depends on the anesthetist's experience.

  5. Generalized quasi variational inequalities

    SciTech Connect

    Noor, M.A.

    1996-12-31

    In this paper, we establish the equivalence between the generalized quasi variational inequalities and the generalized implicit Wiener-Hopf equations, using essentially the projection technique. This equivalence is used to suggest and analyze a number of new iterative algorithms for solving generalized quasi variational inequalities and the related complementarity problems. The convergence criteria are also considered. The results proved in this paper represent a significant improvement and refinement of the previously known results.
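
    The projection technique underlying such algorithms can be illustrated with the classical fixed-point iteration x ← P_K(x − ρ T(x)) for a variational inequality VI(T, K). This is a sketch of the textbook scheme that Wiener-Hopf reformulations refine, not the paper's algorithm; the operator T, the set K and the step size ρ below are illustrative assumptions:

    ```python
    import numpy as np

    # VI(T, K): find x in K with <T(x), y - x> >= 0 for all y in K.
    # Here T(x) = Ax + b with A positive definite (strongly monotone),
    # and K is the nonnegative orthant, so the projection is clipping.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([-1.0, -2.0])

    def T(x):
        return A @ x + b

    def project_onto_nonneg(x):
        # Projection onto K = {x : x >= 0} is componentwise clipping.
        return np.maximum(x, 0.0)

    x = np.zeros(2)
    rho = 0.2  # step size; must be small enough for the map to contract
    for _ in range(200):
        x = project_onto_nonneg(x - rho * T(x))

    print("approximate VI solution:", x)
    ```

    For this choice of A and b the unconstrained solution of Ax + b = 0 is already nonnegative, so the iteration converges to it; when the unconstrained solution leaves K, the same iteration converges to the projected (complementarity) solution instead.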

  6. [Renaissance of general practice].

    PubMed

    Braun, R N; Fink, W; Kamenski, G

    2002-01-01

    Special research work has shown that general practice (family medicine) is a specialization in its own right. It requires specific education and vocational training. Where universities and administrative bodies have accepted this, no doctor can start practising family medicine without having completed vocational training of many years' duration. This fact, together with continued successful research into applied general practice, should initiate a true renaissance of general practice.

  7. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  8. General Machinists Course Outline.

    ERIC Educational Resources Information Center

    Wilcox, Chester; And Others

    This curriculum guide for a general machinists course is intended for use in a program combining vocational English as a second language (VESL) with bilingual vocational education. A description of the VESL program design appears first. The next section provides a format on developing lesson plans for teaching the technical and general vocational…

  9. Triangulating Speech Sound Generalization

    ERIC Educational Resources Information Center

    Miccio, Adele W.; Powell, Thomas W.

    2010-01-01

    Generalization refers to the extension of learned behaviours to novel conditions, and it is one of the criteria by which the effectiveness and efficiency of a remediation programme may be judged. This article extracts principles of generalization from the treatment literature, and provides examples of how this information may be used to help guide…

  10. General George C. Marshall

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This is a portrait of General George C. Marshall in Army uniform. The Marshall Space Flight Center, a NASA field installation, was established in Huntsville, Alabama, in 1960. The Center was named in honor of General George C. Marshall, the Army Chief of Staff during World War II, Secretary of State, and Nobel Prize winner for his world-renowned Marshall Plan.

  11. Certificateless Generalized Signcryption

    NASA Astrophysics Data System (ADS)

    Ji, Huifang; Han, Wenbao; Zhao, Long

    Generalized signcryption is a novel cryptographic primitive that can not only provide encryption and signature in a single operation, but also provide encryption or signature alone when needed. This paper proposes a formal definition of certificateless generalized signcryption (CLGSC) and then provides a security model for CLGSC. A concrete CLGSC scheme is also given.

  12. Acquired Idiopathic Generalized Anhidrosis.

    PubMed

    Gangadharan, Geethu; Criton, Sebastian; Surendran, Divya

    2015-01-01

    Acquired idiopathic generalized anhidrosis is a rare condition, where the exact pathomechanism is unknown. We report a case of acquired idiopathic generalized anhidrosis in a patient who later developed lichen planus. Here an autoimmune-mediated destruction of sweat glands may be the probable pathomechanism.

  13. OASIS General Introduction.

    ERIC Educational Resources Information Center

    Stanford Univ., CA.

    Recognizing the need to balance generality and economy in system costs, the Project INFO team at Stanford University developing OASIS has sought to provide generalized and powerful computer support within the normal range of operating and analytical requirements associated with university administration. The specific design objectives of the OASIS…

  14. Generalized Latent Trait Models.

    ERIC Educational Resources Information Center

    Moustaki, Irini; Knott, Martin

    2000-01-01

    Discusses a general model framework within which manifest variables with different distributions in the exponential family can be analyzed with a latent trait model. Presents a unified maximum likelihood method for estimating the parameters of the generalized latent trait model and discusses the scoring of individuals on the latent dimensions.…

  15. GENERAL BIOLOGY OF PHYSALIA

    DTIC Science & Technology

    The objective of this research was to investigate the general biology of Physalia. It was proposed to concentrate research efforts on two major aspects of the general problem: first, the secretion of gas into the float, and second, the nature and circulation of the gastrovascular fluid.

  16. NASA and general aviation

    NASA Technical Reports Server (NTRS)

    Ethell, J. L.

    1986-01-01

    General aviation remains the single most misunderstood sector of aeronautics in the United States. A detailed look at how general aviation functions and how NASA helps keep it on the cutting edge of technology in airfoils, airframes, commuter travel, environmental concerns, engines, propellers, air traffic control, agricultural development, electronics, and safety is given.

  17. Assessment of General Education

    ERIC Educational Resources Information Center

    Furman, Tanya

    2013-01-01

    The assessment of general education is often approached through broad surveys or standardized instruments that fail to capture the learning goals most faculty members desire for this portion of the curriculum. The challenge in remedying this situation lies, first, in defining general education in meaningful ways that can be both articulated and…

  18. The underestimated biodiversity of tropical grassy biomes.

    PubMed

    Murphy, Brett P; Andersen, Alan N; Parr, Catherine L

    2016-09-19

    For decades, there has been enormous scientific interest in tropical savannahs and grasslands, fuelled by the recognition that they are a dynamic and potentially unstable biome, requiring periodic disturbance for their maintenance. However, that scientific interest has not translated into widespread appreciation of, and concern about threats to, their biodiversity. In terms of biodiversity, grassy biomes are considered poor cousins of the other dominant biome of the tropics: forests. Simple notions of grassy biomes being species-poor cannot be supported; for some key taxa, such as vascular plants, this may be valid, but for others it is not. Here, we use an analysis of existing data to demonstrate that high-rainfall tropical grassy biomes (TGBs) have vertebrate species richness comparable with that of forests, despite having lower plant diversity. The Neotropics stand out in terms of both overall vertebrate species richness and the number of range-restricted vertebrate species in TGBs. Given high rates of land-cover conversion in Neotropical grassy biomes, they should be a high priority for conservation and greater inclusion in protected areas. Fire needs to be actively maintained in these systems, and in many cases re-introduced after decades of inappropriate fire exclusion. The relative intactness of TGBs in Africa and Australia makes them the least vulnerable to biodiversity loss in the immediate future. We argue that, like forests, TGBs should be recognized as a critical, but increasingly threatened, store of global biodiversity. This article is part of the themed issue 'Tropical grassy biomes: linking ecology, human use and conservation'.

  19. Lymphocytic hypophysitis: a rare or underestimated disease?

    PubMed

    Bellastella, Antonio; Bizzarro, Antonio; Coronella, Concetta; Bellastella, Giuseppe; Sinisi, Antonio Agostino; De Bellis, Annamaria

    2003-11-01

    Lymphocytic hypophysitis (LYH) is an uncommon autoimmune disease in which the pituitary gland is infiltrated by lymphocytes, plasma cells and macrophages and its function is usually impaired. It has to be suspected in pregnant women and in women with recent delivery presenting with hyperprolactinemia, headache, visual field alterations and changes of one or more pituitary hormone secretions with secondary impairment of related peripheral target glands, especially when associated with other autoimmune endocrine or non-endocrine disorders. It can also occur, less frequently, in prepubertal or post-menopausal women and in men. Headache, visual field impairment and, more rarely, diplopia are due to extrasellar pituitary enlargement with optic chiasma compression and/or to invasion of the cavernous sinuses. Among the 'isolated' pituitary hormone deficiencies, ACTH deficit is usually the earliest and most frequent hormonal impairment and in rare cases can induce acute secondary hypoadrenalism as the first sign of the disease, with high mortality in affected patients. Histopathological findings from pituitary biopsy show lymphoplasmacytic infiltrate with lymphoid aggregates surrounding atrophic acini of pituitary cells; immunohistochemical analysis shows numerous mast cells randomly distributed and also localized in the vicinity of capillaries, suggesting a possible influence on capillary permeability and angiogenesis, thus favoring the inflammatory and immunological aggression against pituitary cells. Magnetic resonance imaging shows uniform sellar floor depression and an extrasellar symmetrical pituitary enlargement, usually displacing the optic chiasma, which shows a rapid homogeneous enhancement after gadolinium also involving the adjacent dura (dural tail). Antipituitary antibodies have been detected in several patients with LYH but their role needs to be clarified. Since spontaneous remission can occur, careful follow-up is required in subclinical patients without severe hypoadrenalism or symptomatic extrasellar expansion. Medical (immunosuppressive, replacement and antiprolactinemic) and neurosurgical (decompression) treatments are needed in clinically symptomatic patients.

  20. America's Seniors: Marketers Are Underestimating Their Power.

    ERIC Educational Resources Information Center

    Clayton, Catherine

    Society has stereotyped the elderly as those who are unable, dependent, institutionalized, and handicapped in various other ways. Stereotyping older people in this manner allows them to be cast aside in the market as well. The marketing community should concentrate more on this thriving aggregate, for they have disposable income--some for the…

  1. [Focal infection in children - an underestimated problem].

    PubMed

    Pawlaczyk-Kamieńska, Tamara; Pawlaczyk-Wróblewska, Elżbieta

    2014-01-01

    The authors discuss the problem of joint inflammation based on the case of a 3-year-old girl with inflammation of the lower-limb joints. Detailed diagnostic investigations confirmed the existence of an active focus of infection in the oral cavity. After extraction of the affected teeth, a complete remission of inflammatory markers was achieved, which confirmed that the focus had indeed been active. The presented case demonstrates that oral health is a very important factor that should be considered in the diagnosis and treatment of systemic diseases, including rheumatoid arthritis. It also shows that prevention of dental caries and oral hygiene should be implemented from early childhood.

  2. Pulp size in molars: underestimation on radiographs.

    PubMed

    Chandler, N P; Ford, T R Pitt; Monteith, B D

    2004-08-01

    The aim was to determine whether radiographs provide a clinically useful indication of pulp size in diseased/restored human first molar teeth, and to investigate accessibility of pulp tissue for diagnostic testing using laser Doppler flowmetry (LDF). Extracted teeth of known age were collected. Restorative materials were removed and teeth with evidence of pulp exposures excluded. Fifty-six teeth were radiographed from buccal and mesial aspects, and then their crowns were sectioned axiobuccolingually and photographed. Images were digitally scanned and measurements made of the total pulp area (above a line across the most superior part of the pulpal floor) and the pulp area in the clinical crown (superior to a line between the amelocemental junctions). The pulp width at the cervix and the highest point of the pulp were also recorded. Data were analysed using Pearson correlations. Pulp areas within the clinical crowns were significantly larger than indicated by radiographs, by 23% in the case of the clinically attainable buccal view (P < 0.05). Pulps may be more accessible to flowmeter testing than they appear. Absence of pulp tissues in the crown was recorded in equal numbers of teeth on radiographs and sections, but with agreement for only one tooth. Sixteen per cent of the teeth had no pulp area in the clinical crown when sectioned, but might still be suitable for testing using LDF.

  3. Milestones of general relativity: Hubble’s law (1929) and the expansion of the universe

    NASA Astrophysics Data System (ADS)

    MacCallum, Malcolm A. H.

    2015-06-01

    Hubble’s announcement of the magnitude-redshift relation (Hubble 1929 Proc. Natl. Acad. Sci. USA 15 168-73) brought about a major change in our understanding of the Universe. After tracing the pre-history of Hubble’s work, and the hiatus in our understanding which his underestimate of distances led to, this review focuses on the development and success of our understanding of the expanding Universe up to the present day, and the part that general relativity plays in that success.

  4. Generalized Fibonacci photon sieves.

    PubMed

    Ke, Jie; Zhang, Junyong

    2015-08-20

    We successfully extend the standard Fibonacci zone plates with two on-axis foci to the generalized Fibonacci photon sieves (GFiPS) with multiple on-axis foci. We also propose direct and inverse design methods based on the characteristic roots of the recursion relation of the generalized Fibonacci sequences. By switching the transparent and opaque zones according to the generalized Fibonacci sequences, we not only realize adjustable multifocal distances but also achieve an adjustable compression ratio of the focal spots in different directions.
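
    The zone-switching idea can be sketched with a substitution rule that generates a binary transmission pattern. The specific rule and its parameter m below are illustrative assumptions (not the paper's exact construction); m = 1 reproduces the standard Fibonacci word used in Fibonacci zone plates:

    ```python
    # Build a binary transmission pattern from a generalized Fibonacci word
    # and use it to mark zones as transparent (1) or opaque (0).

    def generalized_fibonacci_word(n_iterations, m=1):
        """Iterate the substitution A -> A^m B, B -> A starting from 'A'."""
        word = "A"
        for _ in range(n_iterations):
            word = "".join("A" * m + "B" if ch == "A" else "A" for ch in word)
        return word

    def transmission_pattern(word):
        # Transparent zone for 'A', opaque for 'B'; swapping the roles is
        # the zone switch that retunes the multifocal behaviour.
        return [1 if ch == "A" else 0 for ch in word]

    word = generalized_fibonacci_word(5)  # standard Fibonacci word, m = 1
    print(word)                           # -> ABAABABAABAAB
    print(transmission_pattern(word))
    ```

    For m = 1 the word lengths follow the Fibonacci numbers (1, 2, 3, 5, 8, 13, ...); for m = 2 they follow the Pell-like recursion L(n) = 2 L(n-1) + L(n-2), which is the sense in which the recursion's characteristic roots control the design.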

  5. Science in General Education

    ERIC Educational Resources Information Center

    Read, Andrew F.

    2013-01-01

    General education must develop in students an appreciation of the power of science, how it works, why it is an effective knowledge generation tool, and what it can deliver. Knowing what science has discovered is desirable but less important.

  6. Generalizing twisted gauge invariance

    SciTech Connect

    Duenas-Vidal, Alvaro; Vazquez-Mozo, Miguel A.

    2009-05-01

    We discuss the twisting of gauge symmetry in noncommutative gauge theories and show how this can be generalized to a whole continuous family of twisted gauge invariances. The physical relevance of these twisted invariances is discussed.

  7. General Relativity and Energy

    ERIC Educational Resources Information Center

    Jackson, A. T.

    1973-01-01

    Reviews theoretical and experimental fundamentals of Einstein's theory of general relativity. Indicates that recent development of the theory of the continually expanding universe may lead to revision of the space-time continuum of the finite and unbounded universe. (CC)

  8. Generalized scale invariant theories

    NASA Astrophysics Data System (ADS)

    Padilla, Antonio; Stefanyszyn, David; Tsoukalas, Minas

    2014-03-01

    We present the most general actions of a single scalar field and two scalar fields coupled to gravity, consistent with second-order field equations in four dimensions, possessing local scale invariance. We apply two different methods to arrive at our results. One method, Ricci gauging, was known in the literature, and we find it to produce the same result for the case of one scalar field as the more efficient method presented here. However, we also find our more efficient method to be much more general when we consider two scalar fields. Locally scale invariant actions are also presented for theories with more than two scalar fields coupled to gravity, and we explain how one could construct the most general actions for any number of scalar fields. Our generalized scale invariant actions have obvious applications to early Universe cosmology and include, for example, the Bezrukov-Shaposhnikov action as a subset.

  9. Milestones of general relativity

    NASA Astrophysics Data System (ADS)

    Pullin, Jorge

    2017-02-01

    We present a summary for non-specialists of the special issue of the journal Classical and Quantum Gravity on ‘Milestones of general relativity’, commemorating the 100th anniversary of the theory.

  10. Tuberculosis: General Information

    MedlinePlus

    Tuberculosis (TB) is a disease caused by germs that are spread from person ...

  11. General Chemistry for Engineers.

    ERIC Educational Resources Information Center

    Kybett, B. D.

    1982-01-01

    Discusses the relationship between molecular structure, intermolecular forces, and tensile strengths of a polymer and suggests that this is a logical way to introduce polymers into a general chemistry course. (Author/JN)

  12. Symmetric generalized binomial distributions

    SciTech Connect

    Bergeron, H.; Curado, E. M. F.; Gazeau, J. P.; Rodrigues, Ligia M. C. S. E-mail: evaldo@cbpf.br E-mail: ligia@cbpf.br

    2013-12-15

    In two recent articles, we have examined a generalization of the binomial distribution associated with a sequence of positive numbers, involving asymmetric expressions of probabilities that break the win-loss symmetry. We present in this article another generalization (again associated with a sequence of positive numbers) that preserves the win-loss symmetry. This approach is also based on generating functions and presents non-negativity constraints similar to those encountered in our previous articles.
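
    To fix ideas, one common convention for such sequence-based generalizations (assumed here for illustration; not necessarily the authors' exact construction) replaces factorials by products of the sequence, so that the ordinary binomial distribution is recovered as a special case:

    ```latex
    % Given a sequence of positive numbers x_1, x_2, \dots, define
    % generalized factorials and binomial coefficients:
    x_n! = x_1 x_2 \cdots x_n, \qquad x_0! = 1, \qquad
    \binom{n}{k}_{\!x} = \frac{x_n!}{x_k!\, x_{n-k}!}.
    % For x_n = n this reduces to the ordinary binomial coefficient,
    % and the win-loss-symmetric distribution reduces to
    p_k^{(n)} = \binom{n}{k} q^k (1-q)^{n-k}, \qquad
    \sum_{k=0}^{n} p_k^{(n)} = 1.
    ```

    The symmetry referred to in the abstract is the invariance of p_k^{(n)} under the simultaneous exchange k ↔ n−k and q ↔ 1−q, which the earlier asymmetric generalization broke.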

  13. Matter in general relativity

    NASA Technical Reports Server (NTRS)

    Ray, J. R.

    1982-01-01

    Two theories of matter in general relativity, the fluid theory and the kinetic theory, were studied. Results include: (1) a discussion of various methods of completing the fluid equations; (2) a method of constructing charged general relativistic solutions in kinetic theory; and (3) a proof and discussion of the incompatibility of perfect fluid solutions in anisotropic cosmologies. Interpretations of NASA gravitational experiments using the above mentioned results were started. Two papers were prepared for publications based on this work.

  14. Securing General Aviation

    DTIC Science & Technology

    2009-03-03

    Although it noted that "major vulnerabilities still exist in ... general aviation security," the commission did not further elaborate on the nature of those vulnerabilities. Compared with commercial operations, general aviation may make an attractive alternative to terrorists seeking to identify and exploit vulnerabilities in aviation security. See also the Report of the Aviation Security Advisory Committee Working Group on General Aviation Airport Security (October 1, 2003).

  15. General aviation and community development

    NASA Technical Reports Server (NTRS)

    Sincoff, M. Z. (Editor); Dajani, J. S. (Editor)

    1975-01-01

    The summer program is summarized. The reports presented concern (1) general aviation components, (2) general aviation environment, (3) community perspective, and (4) transportation and general aviation in Virginia.

  16. [Generalized periarthritis calcarea (generalized hydroxyapatite disease)].

    PubMed

    Müller, W; Bahous, I

    1979-09-01

    The condition of generalized periarthritis calcarea (hydroxyapatite deposition disease) is characterised by multiple periarticular calcification, which can be localised around practically any joint and also in proximity to the spine. This calcification consists of hydroxyapatite crystals, which are responsible for the episodes of acute, subacute or chronic periarticular or articular inflammation so typical of the condition. Because of this, one can classify periarthritis calcarea along with gout and chondrocalcinosis in the group of crystal deposition diseases. The actual cause of the calcification remains unknown, but it is probable that, along with hereditary factors, disturbances in metabolism play an important role. The diagnosis of generalised periarthritis is made from the characteristic X-ray picture in conjunction with the clinical findings and, on occasion, the demonstration of hydroxyapatite crystals in the affected tissues. In the differential diagnosis, gout, chondrocalcinosis, various inflammatory rheumatic conditions and septic arthritis must be excluded, and various calcification processes, particularly interstitial calcinosis and lipocalcinogranulomatosis, must also be considered. Since the etiology of the calcification remains unknown, no specific treatment is available. Symptomatic treatment with colchicine is mostly inadequate, which is why one often has recourse to non-steroidal anti-inflammatory drugs and corticosteroids.

  17. Generalized classifier neural network.

    PubMed

    Ozyildirim, Buse Melis; Avci, Mutlu

    2013-03-01

    In this work a new radial basis function based classification neural network, named the generalized classifier neural network, is proposed. Unlike other radial basis function based neural networks such as the generalized regression neural network and the probabilistic neural network, the proposed network has five layers: input, pattern, summation, normalization and output. In addition to this topological difference, the proposed network adds gradient-descent-based optimization of the smoothing parameter and a "diverge effect" term; the latter is an improvement to the summation-layer calculation that supplies additional separation ability and flexibility. The performance of the generalized classifier neural network is compared with that of the probabilistic neural network, the multilayer perceptron and the radial basis function neural network on 9 data sets, and with that of the generalized regression neural network on 3 data sets containing only two classes, in the MATLAB environment. Classification performance of up to 89% is observed; the improved classification performance demonstrates the effectiveness of the proposed neural network.
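
    A minimal sketch of the layer structure described above, in the spirit of a probabilistic neural network: pattern layer (one Gaussian kernel per training point), class-wise summation, normalization and argmax output. The smoothing-parameter training and the diverge effect term of the paper are omitted; data and parameter values are illustrative assumptions:

    ```python
    import numpy as np

    def rbf_classify(X_train, y_train, X_test, sigma=0.5):
        preds = []
        classes = np.unique(y_train)
        for x in X_test:
            # Pattern layer: Gaussian kernel against every training sample.
            k = np.exp(-np.sum((X_train - x) ** 2, axis=1) / (2 * sigma ** 2))
            # Summation layer: accumulate kernel activations per class.
            scores = np.array([k[y_train == c].sum() for c in classes])
            # Normalization layer: convert scores to class probabilities.
            probs = scores / scores.sum()
            # Output layer: most probable class.
            preds.append(classes[np.argmax(probs)])
        return np.array(preds)

    # Toy two-class data: two well-separated clusters.
    X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
    y = np.array([0, 0, 1, 1])
    print(rbf_classify(X, y, np.array([[0.05, 0.1], [0.95, 1.0]])))
    ```

    The fixed smoothing parameter sigma is exactly what the proposed network instead tunes by gradient descent.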

  18. Constrained Generalized Supersymmetries

    NASA Astrophysics Data System (ADS)

    Toppan, Francesco; Kuznetsova, Zhanna

    2005-10-01

    We present a classification of admissible types of constraint (hermitian, holomorphic, with reality conditions on the bosonic sectors, etc.) for generalized supersymmetries in the presence of complex spinors. A generalized supersymmetry algebra involving n-component real spinors Q_a is given by the anticommutators {Q_a, Q_b} = Z_ab, where the matrix Z appearing on the r.h.s. is the most general symmetric matrix. A complex generalized supersymmetry algebra is expressed in terms of complex spinors Q_a and their complex conjugates Q*_ȧ. The most general (saturated) algebra is in this case given by {Q_a, Q_b} = P_ab, {Q*_ȧ, Q*_ḃ} = P*_ȧḃ and {Q_a, Q*_ḃ} = R_aḃ, where the matrix P_ab is symmetric while R_aḃ is hermitian. The bosonic right-hand side can be expressed in terms of the rank-k totally antisymmetric tensors: P_ab = Σ_k (CΓ^[μ1…μk])_ab P_[μ1…μk]. The decomposition in terms of antisymmetric tensors for any space-time up to dimension D = 13 is presented. Real-type, complex-type and quaternionic-type space-times are classified. Any restriction on the saturated bosonic generators that allows all possible combinations of these tensors is in principle admissible by a Lorentz-covariance requirement. We investigate division-algebra constraints and their influence on physical models. Higher-spin theory models are presented as examples of applications of such constraints.

  19. Generalized galilean genesis

    SciTech Connect

    Nishi, Sakine; Kobayashi, Tsutomu

    2015-03-31

    The galilean genesis scenario is an alternative to inflation in which the universe starts expanding from Minkowski in the asymptotic past by violating the null energy condition stably. Several concrete models of galilean genesis have been constructed so far within the context of galileon-type scalar-field theories. We give a generic, unified description of the galilean genesis scenario in terms of the Horndeski theory, i.e., the most general scalar-tensor theory with second-order field equations. In doing so we generalize the previous models to have a new parameter (denoted by α) which results in controlling the evolution of the Hubble rate. The background dynamics is investigated to show that the generalized galilean genesis solution is an attractor, similarly to the original model. We also study the nature of primordial perturbations in the generalized galilean genesis scenario. In all the models described by our generalized genesis Lagrangian, amplification of tensor perturbations does not occur as opposed to what happens in quasi-de Sitter inflation. We show that the spectral index of curvature perturbations is determined solely from the parameter α and does not depend on the other details of the model. In contrast to the original model, a nearly scale-invariant spectrum of curvature perturbations is obtained for a specific choice of α.

  1. Chemistry as General Education

    NASA Astrophysics Data System (ADS)

    Tro, Nivaldo J.

    2004-01-01

    Science courses are common in most general education requirements. This paper addresses the role of chemistry classes in meeting these requirements. Chemistry professors have for many years questioned the appropriateness of the standard introductory chemistry course as general education, resulting in the growing popularity of specialized non-majors courses. I suggest that current non-major chemistry courses cover too much consumer chemistry and ignore some of the big contributions of chemistry to human knowledge. Majors chemistry courses, while they prepare students for majoring in science, do not address these issues either. Consequently, chemistry courses are often an ineffective and unpopular way to meet general education science requirements. Part of the reason for this dilemma is the lack of chemists who address the contributions of chemistry to human knowledge in general. I propose that faculty at liberal arts colleges engage in this important task and that non-majors chemistry textbooks incorporate questions and issues that relate chemistry to a broader view of human knowledge. If these things happen, perhaps chemistry courses will become more effective as general education.

  2. Different physiological responses of cyanobacteria to ultraviolet-B radiation under iron-replete and iron-deficient conditions: Implications for underestimating the negative effects of UV-B radiation.

    PubMed

    Li, Zheng-Ke; Dai, Guo-Zheng; Juneau, Philippe; Qiu, Bao-Sheng

    2017-02-06

    Iron deficiency has been considered one of the main limiting factors of phytoplankton productivity in some aquatic systems including oceans and lakes. Concomitantly, solar ultraviolet-B radiation has been shown to have both deleterious and positive impacts on phytoplankton productivity. However, how iron-deficient cyanobacteria respond to UV-B radiation has been largely overlooked in aquatic systems. In this study, physiological responses of four cyanobacterial strains (Microcystis and Synechococcus), which are widely distributed in freshwater or marine systems, were investigated under different UV-B irradiances and iron conditions. The growth, photosynthetic pigment composition, photosynthetic activity, and nonphotochemical quenching of the different cyanobacterial strains were drastically altered by enhanced UV-B radiation under iron-deficient conditions, but were less affected under iron-replete conditions. Intracellular reactive oxygen species (ROS) and iron content increased and decreased, respectively, with increased UV-B radiation under iron-deficient conditions for both Microcystis aeruginosa FACHB 912 and Synechococcus sp. WH8102. On the contrary, intracellular ROS and iron content of these two strains remained constant and increased, respectively, with increased UV-B radiation under iron-replete conditions. These results indicate that iron-deficient cyanobacteria are more susceptible to enhanced UV-B radiation. Therefore, UV-B radiation probably plays an important role in influencing primary productivity in iron-deficient aquatic systems, suggesting that its effects on the phytoplankton productivity may be underestimated in iron-deficient regions around the world.

  3. Generalized energy failure criterion.

    PubMed

    Qu, R T; Zhang, Z J; Zhang, P; Liu, Z Q; Zhang, Z F

    2016-03-21

Discovering a generalized criterion that can predict the mechanical failure of diverse structural materials is one of the ultimate goals for scientists in both the materials and mechanics communities. About three centuries have passed since Galileo's first study of the failure criterion of materials. Here we present the "generalized energy criterion", which appears to be a universal law for many different kinds of materials. The validity of the energy criterion for quantitatively predicting failure is experimentally confirmed using a metallic glass. The generalized energy criterion reveals the competition and interaction between shear and cleavage, the two fundamental inherent failure mechanisms, and thus provides new physical insight into the failure prediction of materials and structural components.

  4. Generalized isometries in superspace

    NASA Astrophysics Data System (ADS)

    Ryb, Itai

    N = (2, 2) supersymmetric models are of interest to mathematicians and physicists and have been used extensively as a tool for the investigation of generalized Kähler geometry. In the sigma-model approach, it is convenient to formulate and manipulate sigma-models in superspace, where essential geometric properties are captured by the generalized Kähler potential, which gives rise to the bihermitian geometry description. Recent developments in differential geometry show that one can also characterize these targets using structures that interpolate between complex and symplectic geometry and are defined on the sum T ⊕ T*. The research work presented here extends the set of known superspace tools for the manipulation of bihermitian/generalized Kähler geometries, namely the gauging of isometries along directions that mix chiral and twisted-chiral or semichiral multiplets. Other results presented relate to possible N = (4, 4) supersymmetry in semichiral models and to sigma-model formulations on the sum T ⊕ T*.

  5. Generalized energy failure criterion

    PubMed Central

    Qu, R. T.; Zhang, Z. J.; Zhang, P.; Liu, Z. Q.; Zhang, Z. F.

    2016-01-01

    Discovering a generalized criterion that can predict the mechanical failure of diverse structural materials is one of the ultimate goals for scientists in both the materials and mechanics communities. About three centuries have passed since Galileo's first study of the failure criterion of materials. Here we present the “generalized energy criterion”, which appears to be a universal law for many different kinds of materials. The validity of the energy criterion for quantitatively predicting failure is experimentally confirmed using a metallic glass. The generalized energy criterion reveals the competition and interaction between shear and cleavage, the two fundamental inherent failure mechanisms, and thus provides new physical insight into the failure prediction of materials and structural components. PMID:26996781

  6. Generalized Deterministic Traffic Rules

    NASA Astrophysics Data System (ADS)

    Fuks, Henryk; Boccara, Nino

    We study a family of deterministic models for highway traffic flow which generalize cellular automaton rule 184. This family is parameterized by the speed limit m and another parameter k that represents a "degree of aggressiveness" in driving, strictly related to the distance between two consecutive cars. We compare two driving strategies with identical maximum throughput: "conservative" driving with high speed limit and "aggressive" driving with low speed limit. Those two strategies are evaluated in terms of accident probability. We also discuss fundamental diagrams of generalized traffic rules and examine limitations of maximum achievable throughput. Possible modifications of the model are considered.
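The base dynamics of this family, cellular automaton rule 184, can be sketched in a few lines. This is a minimal sketch of the simplest (speed limit m = 1) case only; the paper's speed-limit and aggressiveness parameters are not modeled here, and the sample road configuration is hypothetical:

```python
import numpy as np

def rule184_step(road):
    """One synchronous update of CA rule 184 on a ring: each car (True)
    advances into the cell ahead if and only if that cell is empty (False)."""
    ahead = np.roll(road, -1)   # state of the cell in front of each site
    behind = np.roll(road, 1)   # state of the cell behind each site
    # occupied next step: a blocked car stays put, or a car moves in from behind
    return (road & ahead) | (~road & behind)

road = np.array([1, 1, 0, 0, 1, 0, 1, 0], dtype=bool)
road = rule184_step(road)  # cars at sites 1, 4 and 6 advance; the car at 0 is blocked
```

Iterating this update and counting moves per step gives the flow used in fundamental diagrams; the number of cars is conserved by construction.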

  7. Generalized Communities in Networks

    NASA Astrophysics Data System (ADS)

    Newman, M. E. J.; Peixoto, Tiago P.

    2015-08-01

    A substantial volume of research is devoted to studies of community structure in networks, but communities are not the only possible form of large-scale network structure. Here, we describe a broad extension of community structure that encompasses traditional communities but includes a wide range of generalized structural patterns as well. We describe a principled method for detecting this generalized structure in empirical network data and demonstrate with real-world examples how it can be used to learn new things about the shape and meaning of networks.

  8. General aviation technology assessment

    NASA Technical Reports Server (NTRS)

    Jacobson, I. D.

    1975-01-01

    The existing problem areas in general aviation were investigated in order to identify those which can benefit from technological payoffs. The emphasis was placed on acceptance by the pilot/passenger in areas such as performance, safety, handling qualities, ride quality, etc. Inputs were obtained from three sectors: industry, government, and user, although slanted toward the user group. The results should be considered preliminary because of the small sample sizes; trends are evident, however, and a general methodology for allocating effort in future programs is proposed.

  9. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  10. Generalized Random Events

    NASA Astrophysics Data System (ADS)

    Skřivánek, Václav; Frič, Roman

    2015-12-01

    Phenomena in quantum physics motivate studies of mathematical quantum structures and generalized probability. We introduce a new domain of generalized probability which is equivalent to the difference posets (also to effect algebras) of fuzzy sets. It is in terms of a partial operation of addition and it has a natural interpretation as a disjunction in the original spirit of G. Boole. The extension to a binary operation leads to the Łukasiewicz operations, bold algebras and fuzzy probability. We study states on products and mention some applications.
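The Łukasiewicz (bold) operations mentioned above have a simple concrete form on fuzzy truth values in [0, 1]; a minimal numeric sketch (function names are our own):

```python
def luk_sum(a, b):
    """Łukasiewicz bold disjunction (truncated sum) on [0, 1]."""
    return min(1.0, a + b)

def luk_prod(a, b):
    """Łukasiewicz bold conjunction, the De Morgan dual of luk_sum."""
    return max(0.0, a + b - 1.0)

# The partial addition discussed in the abstract corresponds to restricting
# luk_sum to pairs with a + b <= 1, where it is just the ordinary sum.
```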

  11. Generalized compliant motion primitive

    NASA Astrophysics Data System (ADS)

    Backes, Paul G.

    1994-08-01

    This invention relates to a general primitive for controlling a telerobot with a set of input parameters. The primitive includes a trajectory generator, a teleoperation sensor, a joint limit generator, a force setpoint generator, and a dither function generator, all of which produce telerobot motion inputs in a common coordinate frame for simultaneous combination in sensor summers. Virtual return spring motion input is provided by a restoration spring subsystem. The novel features of this invention include the use of a single general motion primitive at a remote site to permit shared and supervisory control of the robot manipulator to perform tasks via a remotely transferred input parameter set.

  12. 75 FR 62879 - Individual Exemption Involving General Motors Company, General Motors Holdings LLC, and General...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-13

    ... Benefits Security Administration Individual Exemption Involving General Motors Company, General Motors Holdings LLC, and General Motors LLC, Located in Detroit, MI AGENCY: Employee Benefits Security... ERISA (the Notice).\\2\\ The proposed exemption was requested in an application filed by General...

  13. Interrogation: General vs. Local.

    ERIC Educational Resources Information Center

    Johnson, Jeannette

    This paper proposes a set of hypotheses on the nature of interrogation as a possible language universal. Examples and phrase structure rules and diagrams are given. Examining Tamazight and English, genetically unrelated languages with almost no contact, the author distinguishes two types of interrogation: (1) general, querying acceptability to…

  14. The General Teaching Model.

    ERIC Educational Resources Information Center

    Miles, David T.; Robinson, Roger E.

    The General Teaching Model is a procedural guide for the design, implementation, evaluation, and improvement of instruction. The Model is considered applicable to all levels of education, all subject matters, and any length of instructional unit. It consists of four components: 1) instructional objectives, 2) pre-assessment, 3) instructional…

  15. Generalized Structured Component Analysis

    ERIC Educational Resources Information Center

    Hwang, Heungsun; Takane, Yoshio

    2004-01-01

    We propose an alternative method to partial least squares for path analysis with components, called generalized structured component analysis. The proposed method replaces factors by exact linear combinations of observed variables. It employs a well-defined least squares criterion to estimate model parameters. As a result, the proposed method…

  16. General Drafting. Technical Manual.

    ERIC Educational Resources Information Center

    Department of the Army, Washington, DC.

    The manual provides instructional guidance and reference material in the principles and procedures of general drafting and constitutes the primary study text for personnel in drafting as a military occupational specialty. Included is information on drafting equipment and its use; line weights, conventions and formats; lettering; engineering charts…

  17. Work and General Education.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and the Pacific.

    Presentations and other materials are provided from the Asia and the Pacific Programme of Educational Innovation for Development (APEID) Planning and Review Meeting on Work as an Integral Part of General Education. The focus is on how education, through an orientation to work, could help to decrease the gravity of the problems of population…

  18. General George C. Marshall

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The Marshall Space Flight Center, a NASA field installation, was established at Huntsville, Alabama, in 1960. The Center was named in honor of General George C. Marshall, the Army Chief of Staff during World War II, Secretary of State, and Nobel Prize winner for his world-renowned Marshall Plan.

  19. Chemistry as General Education

    ERIC Educational Resources Information Center

    Tro, Nivaldo J.

    2004-01-01

    The efficacy of different science and chemistry courses for science majors and non-majors, and the question of chemistry's contribution to general education, are evaluated. Chemistry and science curricula are too profession- and consumer-oriented; to overcome this problem, it is advised that all disciplines incorporate the major…

  20. General Aviation Manpower Study.

    ERIC Educational Resources Information Center

    Feller, Richard

    1982-01-01

    Highlights a study examining manpower supply/demand in general aviation. Eight job categories were examined: pilots, flight instructors, engineers, machinists/toolers, and A&P, airframe, and avionics technicians. Findings among others indicate that shortages in indicated job categories exist because personnel are recruited by other industries.…

  1. General Aviation Avionics Statistics.

    DTIC Science & Technology

    1980-12-01

    Report No. FAA-MS-80-7, "General Aviation Avionics Statistics," December 1980. Only report-documentation metadata and a partial avionics equipment checklist are recoverable: 2. Altimeter; 3. Compass; 4. Tachometer; 5. Oil temperature; 6. Emergency locator; 8. Fuel gage; 9. Landing gear; 10. Belts; 11. Special equipment for over water.

  2. Annals of General Psychiatry.

    PubMed

    Fountoulakis, Konstantinos

    2005-02-01

    Our regular readers will notice that the title of our journal changed from Annals of General Hospital Psychiatry (AGHP) to Annals of General Psychiatry (AGP) on January 1st, 2005. This was judged necessary in order to better serve the aims of the journal. Our initial thought was that including the term 'General Hospital' in the journal's title would help us launch a journal dedicated to the idea of psychiatry as a medical specialty, but this was not borne out; so now the Annals of General Psychiatry (AGP) is born! It is still an Open Access, peer-reviewed, online journal covering the wider field of psychiatry, neurosciences and psychological medicine, and it aims to publish articles on all aspects of psychiatry. Primary research articles are the journal's priority, and both basic and clinical neuroscience contributions are encouraged. The AGP strongly supports and follows the principles of evidence-based medicine. AGP's articles are archived in PubMed Central, the US National Library of Medicine's full-text repository of life science literature, and also in repositories at the University of Potsdam in Germany, at INIST in France, and in e-Depot, the National Library of the Netherlands' digital archive of all electronic publications. We hope that the change in the journal's name will cure the confusion caused by its previous title and help achieve the journal's aims and scope, that is, to promote research and publishing in the mental health area worldwide.

  3. General Business 101.

    ERIC Educational Resources Information Center

    Manitoba Dept. of Education, Winnipeg.

    This teaching guide contains guidelines for conducting a secondary-level general business course. Intended to serve as an introduction to business and consumer fundamentals, the course provides socioeconomic background useful to students seeking vocational preparation for office and clerical occupations. The goals and objectives of the course are…

  4. General Construction. Instructor Manual.

    ERIC Educational Resources Information Center

    Laborers-AGC Education and Training Fund, Pomfret Center, CT.

    This guide contains materials for a general construction course. Introductory materials include a list of videos, schedule for the 10-day course, and tools and material list. The course is divided into 10 sections. Each section consists of some or all of these components: list of trainee objectives, instructor notes, instructor outline,…

  5. General Education for Engineers

    NASA Astrophysics Data System (ADS)

    Takeda, Kunihiko

    The basic program of general education of engineers is based on European culture from the times of ancient Greece to the 20th century. However, when considering its results, such as colonialism and the World Wars, this system can be said to lack the most important goal of “culture,” which is “to accept the existence of others.” In particular, the cooperation of European culture and engineering has ravaged the weaker cultures and is currently causing severe environmental problems in nature. Therefore, when considering the general education of engineers, it is indispensable to doubt European scholarship and to analyze what is lacking in current Japanese educational programs. Then, it is desirable that the relationship between the mind and the body, the characteristics of the Japanese climate, and the essence of Japanese artisanship be taken into consideration. It may also be beneficial to study the Ainu culture for its qualities as a peaceful culture.

  6. [Hypnosis in general medicine].

    PubMed

    Tozzi, A

    1981-10-27

    Organic pathologies with a psychoaffective aetiology are numerous, and in all diseases mental and rational elements are concomitant with the organic situation. Aware of the psychodynamic mechanisms he sets in motion by his actions, and appreciating those underlying the symptomatology he has to deal with, the hypnologist transcends the overspecialisation of medical science to restore the patient's psycho-physical unitary reality. This explains both the possibilities of hypnotherapy in general medicine and its impassable limits. The hypnotherapy that the general practitioner can (and must) implement should be perfected by a qualified psychotherapist when the symptomatology cannot be eradicated without tackling the aetiological factor through a serious depth analysis. Hypnosis is, therefore, an additional therapeutic possibility for the physician, and this is why it can only be used with science, awareness, and on an ethical basis.

  7. Generalized uncertainty relations

    NASA Astrophysics Data System (ADS)

    Herdegen, Andrzej; Ziobro, Piotr

    2017-04-01

    The standard uncertainty relations (UR) in quantum mechanics are typically used for unbounded operators (like the canonical pair). This implies the need for the control of the domain problems. On the other hand, the use of (possibly bounded) functions of basic observables usually leads to more complex and less readily interpretable relations. In addition, UR may turn trivial for certain states if the commutator of observables is not proportional to a positive operator. In this letter we consider a generalization of standard UR resulting from the use of two, instead of one, vector states. The possibility to link these states to each other in various ways adds additional flexibility to UR, which may compensate some of the above-mentioned drawbacks. We discuss applications of the general scheme, leading not only to technical improvements, but also to interesting new insight.

  8. Generalized random sequential adsorption

    NASA Astrophysics Data System (ADS)

    Tarjus, G.; Schaaf, P.; Talbot, J.

    1990-12-01

    Adsorption of hard spherical particles onto a flat uniform surface is analyzed by using generalized random sequential adsorption (RSA) models. These models are defined by releasing the condition of immobility present in the usual RSA rules to allow for desorption or surface diffusion. Contrary to the simple RSA case, generalized RSA processes are no longer irreversible and the system formed by the adsorbed particles on the surface may reach an equilibrium state. We show by using a distribution function approach that the kinetics of such processes can be described by means of an exact infinite hierarchy of equations reminiscent of the Kirkwood-Salsburg hierarchy for systems at equilibrium. We illustrate the way in which the systems produced by adsorption/desorption and by adsorption/diffusion evolve between the two limits represented by ``simple RSA'' and ``equilibrium'' by considering approximate solutions in terms of truncated density expansions.
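The "simple RSA" limit described above (irreversible adsorption, no desorption or diffusion) is easy to simulate in one dimension; a minimal sketch with hypothetical parameters, whose coverage approaches Rényi's car-parking limit of about 0.7476 as attempts accumulate:

```python
import random

def rsa_1d(L=100.0, rod=1.0, attempts=20000, seed=1):
    """Irreversible 1D RSA: propose unit rods at uniform random positions on
    [0, L]; keep a rod only if it overlaps no previously adsorbed rod.
    Adsorbed rods never move or desorb."""
    rng = random.Random(seed)
    placed = []
    for _ in range(attempts):
        x = rng.uniform(0.0, L - rod)
        if all(abs(x - y) >= rod for y in placed):
            placed.append(x)
    return placed

coverage = len(rsa_1d()) * 1.0 / 100.0  # fraction of the line covered
```

Allowing desorption or diffusion moves, as in the generalized models, would let the configuration relax further toward the equilibrium hard-rod state.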

  9. Generalized constructive tree weights

    SciTech Connect

    Rivasseau, Vincent E-mail: adrian.tanasa@ens-lyon.org; Tanasa, Adrian E-mail: adrian.tanasa@ens-lyon.org

    2014-04-15

    The Loop Vertex Expansion (LVE) is a quantum field theory (QFT) method which explicitly computes the Borel sum of Feynman perturbation series. This LVE relies in a crucial way on symmetric tree weights which define a measure on the set of spanning trees of any connected graph. In this paper we generalize this method by defining new tree weights. They depend on the choice of a partition of a set of vertices of the graph, and when the partition is non-trivial, they are no longer symmetric under permutation of vertices. Nevertheless we prove they have the required positivity property to lead to a convergent LVE; in fact we formulate this positivity property precisely for the first time. Our generalized tree weights are inspired by the Brydges-Battle-Federbush work on cluster expansions and could be particularly suited to the computation of connected functions in QFT. Several concrete examples are explicitly given.

  10. Tests of General Relativity

    SciTech Connect

    Kramer, Michael

    2011-09-22

    Recent years have seen continuing activity in exploring our understanding of gravity, motivated by results from precision cosmology and new precision astrophysical experiments. At the centre of attention lies the question as to whether general relativity is the correct theory of gravity. In answering this question, we work not only towards correctly interpreting the phenomenon of 'dark energy' but also towards the goal of achieving a quantum theory of gravity. In these efforts, observations of pulsars, especially those in binary systems, play an important role. Pulsars not only provide the only evidence for the existence of gravitational waves so far, but also provide precision tests of general relativity and alternative theories of gravity. This talk summarizes the current state of the art in these experiments and looks into the future.

  11. Are Food Insecurity’s Health Impacts Underestimated in the U.S. Population? Marginal Food Security Also Predicts Adverse Health Outcomes in Young U.S. Children and Mothers123

    PubMed Central

    Cook, John T.; Black, Maureen; Chilton, Mariana; Cutts, Diana; Ettinger de Cuba, Stephanie; Heeren, Timothy C.; Rose-Jacobs, Ruth; Sandel, Megan; Casey, Patrick H.; Coleman, Sharon; Weiss, Ingrid; Frank, Deborah A.

    2013-01-01

    This review addresses epidemiological, public health, and social policy implications of categorizing young children and their adult female caregivers in the United States as food secure when they live in households with “marginal food security,” as indicated by the U.S. Household Food Security Survey Module. Existing literature shows that households in the US with marginal food security are more like food-insecure households than food-secure households. Similarities include socio-demographic characteristics, psychosocial profiles, and patterns of disease and health risk. Building on existing knowledge, we present new research on associations of marginal food security with health and developmental risks in young children (<48 mo) and health in their female caregivers. Marginal food security is positively associated with adverse health outcomes compared with food security, but the strength of the associations is weaker than that for food insecurity as usually defined in the US. Nonoverlapping CIs, when comparing odds of marginally food-secure children’s fair/poor health and developmental risk and caregivers’ depressive symptoms and fair/poor health with those in food-secure and -insecure families, indicate associations of marginal food security significantly and distinctly intermediate between those of food security and food insecurity. Evidence from reviewed research and the new research presented indicates that households with marginal food security should not be classified as food secure, as is the current practice, but should be reported in a separate discrete category. These findings highlight the potential underestimation of the prevalence of adverse health outcomes associated with exposure to lack of enough food for an active, healthy life in the US and indicate an even greater need for preventive action and policies to limit and reduce exposure among children and mothers. PMID:23319123

  12. Limitations of clinical trials in chronic diseases: is the efficacy of methotrexate (MTX) underestimated in polyarticular psoriatic arthritis on the basis of limitations of clinical trials more than on limitations of MTX, as was seen in rheumatoid arthritis?

    PubMed

    Pincus, Theodore; Bergman, Martin J; Yazici, Yusuf

    2015-01-01

    Clinical trials are the optimal method to establish the efficacy of a drug versus placebo or another drug. Nonetheless, important limitations are seen, particularly in chronic diseases over long periods, although most are ignored. Pragmatic limitations of clinical trials include a relatively short observation period, suboptimal dosage schedules, suboptimal surrogate markers for long-term outcomes, and statistically significant results that may be clinically unimportant, and vice versa. Even ideal clinical trials have intrinsic limitations, including the influence of design on results, data reported for groups in ways that ignore individual variation, non-standard observer-dependent interpretation of the balance of efficacy and toxicity, and distortion by a "placebo effect." Limitations are seen in many clinical trials of methotrexate (MTX) in rheumatoid arthritis (RA) and psoriatic arthritis (PsA). The first MTX clinical trial in rheumatology, in 1964, documented excellent efficacy in PsA but frequent adverse events, explained by intravenous doses up to 150 kg. MTX was then abandoned for RA until the 1980s, while gold salts and penicillamine were termed "remission-inducing" on the basis of limitations of clinical trials. In the most recent trial of MTX in PsA (MIPA), all outcomes favoured MTX, but only patient and physician global estimates met the p<0.05 criterion. A conclusion of "no evidence for MTX improving synovitis" appears to be explained by insufficient statistical power, wide individual variation, absence of subset analyses, low doses, and other limitations. MTX appears less efficacious in PsA than in RA, but its efficacy may be underestimated in PsA, similar to historical problems in RA, resulting more from limitations of clinical trials than from limitations of MTX.

  13. Entrepreneurship within General Aviation

    NASA Technical Reports Server (NTRS)

    Ullmann, Brian M.

    1995-01-01

    Many modern economic theories place great importance upon entrepreneurship in the economy. Some see the entrepreneur as the individual who bears the risk of operating a business in the face of uncertainty about future conditions and who is rewarded through profits and losses. The 20th-century economist Joseph Schumpeter saw the entrepreneur as the medium by which advancing technology is incorporated into society, as businesses seek competitive advantages through more efficient product development processes. Because of the importance that capitalistic systems place upon entrepreneurship, it has become a well-studied subject, with many texts discussing how entrepreneurs can succeed in modern society. Many entrepreneurship and business management courses go so far as to discuss the characteristic phases and prominent challenges that fledgling companies face in their efforts to bring a new product into a competitive market. However, even with all of these aids, start-up companies fail at an enormous rate. Indeed, the odds of shepherding a new company through the travails of becoming a well-established company (as measured by the ability to reach Initial Public Offering (IPO)) have been estimated to be six in 1,000,000. Each niche industry has characteristic challenges which act as barriers to entry for new products into that industry. Thus, the applicability of broad generalizations is subject to limitations within niche markets. This paper will discuss entrepreneurship as it relates to general aviation. The goals of this paper are to: introduce general aviation; discuss the details of marrying entrepreneurship with general aviation; and present a sample business plan which would characterize a possible entrepreneurial venture.

  14. Laparoscopy in General Surgery

    PubMed Central

    O'Regan, Patrick J.; Anderson, Dawn L.

    1992-01-01

    After a period of rather slow initial acceptance by general surgeons, laparoscopy and video endoscopic surgery have suddenly burst on to the surgical scene. Almost overnight many of the surgical procedures once requiring a large incision are now being performed through small punctures. This article describes some of the more common procedures and discusses the merits and difficulties associated with these innovations. PMID:21221367

  15. TURBULENT GENERAL MAGNETIC RECONNECTION

    SciTech Connect

    Eyink, G. L.

    2015-07-10

    Plasma flows with a magnetohydrodynamic (MHD)-like turbulent inertial range, such as the solar wind, require a generalization of general magnetic reconnection (GMR) theory. We introduce the slip velocity source vector per unit arclength of field line: the ratio of the curl of the non-ideal electric field in the generalized Ohm's law to the magnetic field strength. It diverges at magnetic nulls, unifying GMR with null-point reconnection. Only under restrictive assumptions is the slip velocity related to the gradient of the quasi-potential (the integral of the parallel electric field along magnetic field lines). In a turbulent inertial range, the non-ideal field becomes tiny while its curl is large, so that line slippage occurs even while ideal MHD becomes accurate. The resolution is that ideal MHD is valid for a turbulent inertial range only in a weak sense that does not imply magnetic line freezing. The notion of a weak solution is explained in terms of renormalization group (RG) theory. The weak validity of the ideal Ohm's law in the inertial range is shown via rigorous estimates of the terms in the generalized Ohm's law. All non-ideal terms are irrelevant in the RG sense, and large-scale reconnection is thus governed solely by ideal dynamics. We discuss the implications for heliospheric reconnection, in particular for deviations from the Parker spiral model. Solar wind observations show that reconnection in a turbulence-broadened heliospheric current sheet, consistent with Lazarian–Vishniac theory, leads to slip velocities that cause field lines to lag relative to the spiral model.

  16. Generalized quasiperiodic Rauzy tilings

    NASA Astrophysics Data System (ADS)

    Vidal, Julien; Mosseri, Rémy

    2001-05-01

    We present a geometrical description of new canonical d-dimensional codimension-one quasiperiodic tilings based on generalized Fibonacci sequences. These tilings are made up of rhombi in 2D and rhombohedra in 3D, as in the usual Penrose and icosahedral tilings. Thanks to a natural indexing of the sites according to their local environment, we easily write down, for any approximant, the site coordinates and the connectivity matrix, and we compute the structure factor.
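Generalized Fibonacci sequences of the kind underlying such tilings can be generated by a two-letter substitution. As a sketch, we use the metallic-mean rule L → LⁿS, S → L (an assumed, standard choice, not necessarily the exact family used in the paper); n = 1 recovers the ordinary Fibonacci chain:

```python
def metallic_chain(n, steps):
    """Iterate the substitution L -> L^n S, S -> L, starting from 'L'.
    The word lengths obey the generalized Fibonacci recursion
    F(k) = n*F(k-1) + F(k-2), whose growth ratio is the n-th metallic mean."""
    word = "L"
    for _ in range(steps):
        word = "".join("L" * n + "S" if c == "L" else "L" for c in word)
    return word
```

For n = 1 the word lengths run 1, 2, 3, 5, 8, …; for n = 2 they run 1, 3, 7, 17, …, with length ratios approaching the silver mean 1 + √2.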

  17. Alopecia in general medicine.

    PubMed

    Nalluri, Rajani; Harries, Matthew

    2016-02-01

    Appreciation of different types of hair loss (alopecia) that may be encountered in hospital medicine is important to ensure accurate diagnosis and management, identify underlying medical conditions or treatments that may present with increased hair loss, recognise autoimmune alopecias and their associations, and understand the significant psychological impact of hair loss on an individual. This article discusses common causes of hair loss, as well as those conditions that may be associated with systemic disease, relevant to a general physician.

  18. Generalized Hampel Filters

    NASA Astrophysics Data System (ADS)

    Pearson, Ronald K.; Neuvo, Yrjö; Astola, Jaakko; Gabbouj, Moncef

    2016-12-01

    The standard median filter based on a symmetric moving window has only one tuning parameter: the window width. Despite this limitation, the filter has proven extremely useful and has motivated a number of extensions: weighted median filters, recursive median filters, and various cascade structures. The Hampel filter is a member of the class of decision filters that replace the central value in the data window with the median if it lies far enough from the median to be deemed an outlier. This filter depends on both the window width and an additional tuning parameter t, reducing to the median filter when t=0, so it may be regarded as another median filter extension. This paper adopts this view, defining and exploring the class of generalized Hampel filters obtained by applying the median filter extensions listed above: weighted Hampel filters, recursive Hampel filters, and their cascades. An important concept introduced here is that of an implosion sequence, a signal for which generalized Hampel filter performance is independent of the threshold parameter t. These sequences are important because the added flexibility of the generalized Hampel filters offers no practical advantage for implosion sequences. Partial characterization results are presented for these sequences, as are useful relationships between root sequences for generalized Hampel filters and their median-based counterparts. To illustrate the performance of this filter class, two examples are considered: one is simulation-based, providing a basis for quantitative evaluation of signal recovery performance as a function of t, while the other is a sequence of monthly Italian industrial production index values that exhibits glaring outliers.
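A minimal sketch of the basic (non-recursive, unweighted) Hampel filter described above; the half-width parameter, MAD-based outlier test, and example data are our own illustrative choices:

```python
import numpy as np

def hampel_filter(x, half_width=3, t=3.0):
    """Sliding-window Hampel filter: replace the central sample with the
    window median when it deviates from that median by more than t scaled
    median absolute deviations (MADs). Edge samples are left unchanged."""
    x = np.asarray(x, dtype=float)
    y = x.copy()
    k = 1.4826  # scale factor making the MAD consistent for Gaussian noise
    for i in range(half_width, len(x) - half_width):
        window = x[i - half_width:i + half_width + 1]
        med = np.median(window)
        mad = k * np.median(np.abs(window - med))
        if np.abs(x[i] - med) > t * mad:
            y[i] = med
    return y

y = hampel_filter([1, 1, 1, 1, 100, 1, 1, 1, 1])
```

With t = 0 every central sample fails the test, and the filter reduces to the running median filter, as the abstract notes.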

  19. Generalized fiber Fourier optics.

    PubMed

    Cincotti, Gabriella

    2011-06-15

    A twofold generalization of the optical schemes that perform the discrete Fourier transform (DFT) is given: new passive planar architectures are presented where the 2 × 2 3 dB couplers are replaced by M × M hybrids, reducing the number of required connections and phase shifters. Furthermore, the planar implementation of the discrete fractional Fourier transform (DFrFT) is also described, with a waveguide grating router (WGR) configuration and a properly modified slab coupler.

  20. Generalized FFT Beamsteering

    DTIC Science & Technology

    2008-01-01

    ...on a 2D lattice, offers an electronically controlled, agile beam or even multiple beams associated with multiple antenna outputs, a technology...versions both derive fundamentally from the same ideas in elementary group theory. Likewise, classic FFT algorithms and the generalized FFT

  1. Generalized Fisher matrices

    NASA Astrophysics Data System (ADS)

    Heavens, A. F.; Seikel, M.; Nord, B. D.; Aich, M.; Bouffanais, Y.; Bassett, B. A.; Hobson, M. P.

    2014-12-01

    The Fisher Information Matrix formalism (Fisher 1935) is extended to cases where the data are divided into two parts (X, Y), where the expectation value of Y depends on X according to some theoretical model, and X and Y both have errors with arbitrary covariance. In the simplest case, (X, Y) represent data pairs of abscissa and ordinate, in which case the analysis deals with the case of data pairs with errors in both coordinates, but X can be any measured quantities on which Y depends. The analysis applies for arbitrary covariance, provided all errors are Gaussian, and provided the errors in X are small, both in comparison with the scale over which the expected signal Y changes, and with the width of the prior distribution. This generalizes the Fisher Matrix approach, which normally only considers errors in the `ordinate' Y. In this work, we include errors in X by marginalizing over latent variables, effectively employing a Bayesian hierarchical model, and deriving the Fisher Matrix for this more general case. The methods here also extend to likelihood surfaces which are not Gaussian in the parameter space, and so techniques such as DALI (Derivative Approximation for Likelihoods) can be generalized straightforwardly to include arbitrary Gaussian data error covariances. For simple mock data and theoretical models, we compare to Markov Chain Monte Carlo experiments, illustrating the method with cosmological supernova data. We also include the new method in the FISHER4CAST software.
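
    The small-x-error regime described above can be illustrated numerically: to leading order, a Gaussian error sigma_x on the abscissa inflates the effective ordinate variance by (dY/dX)^2 sigma_x^2. Below is a minimal sketch for a straight-line model y = a + b*x (function names are illustrative; this is the effective-variance limit only, not the paper's full hierarchical marginalization):

```python
import math

def fisher_line(xs, b, sigma_y, sigma_x):
    """Fisher matrix for y = a + b*x with small Gaussian x-errors.

    Errors in x are absorbed as sigma_eff^2 = sigma_y^2 + (b*sigma_x)^2,
    the leading behaviour when sigma_x is small compared with the scale
    over which the expected signal changes. Parameter order is (a, b).
    """
    s2 = sigma_y ** 2 + (b * sigma_x) ** 2
    F_aa = sum(1.0 / s2 for _ in xs)
    F_ab = sum(x / s2 for x in xs)
    F_bb = sum(x * x / s2 for x in xs)
    return [[F_aa, F_ab], [F_ab, F_bb]]

def marginal_errors(F):
    """1-sigma marginalized errors: sqrt of the diagonal of F^-1 (2x2 case)."""
    det = F[0][0] * F[1][1] - F[0][1] ** 2
    return math.sqrt(F[1][1] / det), math.sqrt(F[0][0] / det)
```

    Turning on sigma_x weakens the constraints on both parameters, which is the qualitative effect of marginalizing over the latent abscissa values.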

  2. General Relativity and Gravitation

    NASA Astrophysics Data System (ADS)

    Ashtekar, Abhay; Berger, Beverly; Isenberg, James; MacCallum, Malcolm

    2015-07-01

    Part I. Einstein's Triumph: 1. 100 years of general relativity George F. R. Ellis; 2. Was Einstein right? Clifford M. Will; 3. Cosmology David Wands, Misao Sasaki, Eiichiro Komatsu, Roy Maartens and Malcolm A. H. MacCallum; 4. Relativistic astrophysics Peter Schneider, Ramesh Narayan, Jeffrey E. McClintock, Peter Mészáros and Martin J. Rees; Part II. New Window on the Universe: 5. Receiving gravitational waves Beverly K. Berger, Karsten Danzmann, Gabriela Gonzalez, Andrea Lommen, Guido Mueller, Albrecht Rüdiger and William Joseph Weber; 6. Sources of gravitational waves. Theory and observations Alessandra Buonanno and B. S. Sathyaprakash; Part III. Gravity is Geometry, After All: 7. Probing strong field gravity through numerical simulations Frans Pretorius, Matthew W. Choptuik and Luis Lehner; 8. The initial value problem of general relativity and its implications Gregory J. Galloway, Pengzi Miao and Richard Schoen; 9. Global behavior of solutions to Einstein's equations Stefanos Aretakis, James Isenberg, Vincent Moncrief and Igor Rodnianski; Part IV. Beyond Einstein: 10. Quantum fields in curved space-times Stefan Hollands and Robert M. Wald; 11. From general relativity to quantum gravity Abhay Ashtekar, Martin Reuter and Carlo Rovelli; 12. Quantum gravity via unification Henriette Elvang and Gary T. Horowitz.

  3. Generalized Maximum Entropy

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter; Stutz, John

    2005-01-01

    A long-standing mystery in using Maximum Entropy (MaxEnt) is how to deal with constraints whose values are uncertain. This situation arises when constraint values are estimated from data, because of finite sample sizes. One approach to this problem, advocated by E.T. Jaynes [1], is to ignore this uncertainty, and treat the empirically observed values as exact. We refer to this as the classic MaxEnt approach. Classic MaxEnt gives point probabilities (subject to the given constraints), rather than probability densities. We develop an alternative approach that assumes that the uncertain constraint values are represented by a probability density (e.g., a Gaussian), and this uncertainty yields a MaxEnt posterior probability density. That is, the classic MaxEnt point probabilities are regarded as a multidimensional function of the given constraint values, and uncertainty on these values is transmitted through the MaxEnt function to give uncertainty over the MaxEnt probabilities. We illustrate this approach by explicitly calculating the generalized MaxEnt density for a simple but common case, then show how this can be extended numerically to the general case. This paper expands the generalized MaxEnt concept introduced in a previous paper [3].
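
    The idea of treating the classic MaxEnt point probabilities as a function of the given constraint value can be made concrete with the standard six-sided-die example: p_i is proportional to exp(lam*i), with lam chosen so the mean matches the constraint. A minimal sketch of that point-valued map (names illustrative; the generalized approach propagates a density over the constraint value through this same map):

```python
import math

def maxent_die(mean_target, lo=-5.0, hi=5.0, iters=80):
    """Classic MaxEnt distribution on states 1..6 with a mean constraint.

    p_i proportional to exp(lam * i); lam found by bisection so that the
    distribution's mean equals mean_target (the map is monotone in lam).
    """
    states = range(1, 7)

    def mean_for(lam):
        w = [math.exp(lam * s) for s in states]
        return sum(s * wi for s, wi in zip(states, w)) / sum(w)

    for _ in range(iters):  # mean_for is increasing in lam
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < mean_target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * s) for s in states]
    z = sum(w)
    return [wi / z for wi in w]
```

    A mean constraint of 3.5 recovers the uniform distribution; shifting the constraint tilts the probabilities, and sampling the constraint from a density and pushing each draw through this map is the spirit of the generalized construction.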

  4. Beyond generalized Proca theories

    NASA Astrophysics Data System (ADS)

    Heisenberg, Lavinia; Kase, Ryotaro; Tsujikawa, Shinji

    2016-09-01

    We consider higher-order derivative interactions beyond second-order generalized Proca theories that propagate only the three desired polarizations of a massive vector field besides the two tensor polarizations from gravity. These new interactions follow construction criteria similar to those arising in the extension of scalar-tensor Horndeski theories to Gleyzes-Langlois-Piazza-Vernizzi (GLPV) theories. On the isotropic cosmological background, we show the existence of a constraint with a vanishing Hamiltonian that removes the would-be Ostrogradski ghost. We study the behavior of linear perturbations on top of the isotropic cosmological background in the presence of a matter perfect fluid and find the same number of propagating degrees of freedom as in generalized Proca theories (two tensor polarizations, two transverse vector modes, and two scalar modes). Moreover, we obtain the conditions for the avoidance of ghosts and Laplacian instabilities of tensor, vector, and scalar perturbations. We observe key differences in the scalar sound speed, which is mixed with the matter sound speed outside the domain of generalized Proca theories.

  5. General aviation in China

    NASA Astrophysics Data System (ADS)

    Hu, Xiaosi

    In the last four decades, China has accomplished economic reform successfully and grown to be a leading country in the world. As the "world factory", the country is able to manufacture a variety of industrial products from clothes and shoes to rockets and satellites. But the aviation industry has always been a weak spot and even the military relies on imported turbofan engines and jet fighters, not to mention the airlines. Recently China has launched programs such as ARJ21 and C919, and started reform to change the undeveloped situation of its aviation industry. As the foundation of the aviation industry, the development of general aviation is essential for the rise of commercial aviation. The primary goal of this study is to examine the general aviation industry and identify the issues that constrain its development. The research method used in this thesis is narrative research, a qualitative approach, since policy rather than statistical data is analyzed. It appears that the main constraint for the general aviation industry is government interference.

  6. Hyperuniformity and its generalizations

    NASA Astrophysics Data System (ADS)

    Torquato, Salvatore

    2016-08-01

    Disordered many-particle hyperuniform systems are exotic amorphous states of matter that lie between crystal and liquid: They are like perfect crystals in the way they suppress large-scale density fluctuations and yet are like liquids or glasses in that they are statistically isotropic with no Bragg peaks. These exotic states of matter play a vital role in a number of problems across the physical, mathematical as well as biological sciences and, because they are endowed with novel physical properties, have technological importance. Given the fundamental as well as practical importance of disordered hyperuniform systems elucidated thus far, it is natural to explore the generalizations of the hyperuniformity notion and its consequences. In this paper, we substantially broaden the hyperuniformity concept along four different directions. This includes generalizations to treat fluctuations in the interfacial area (one of the Minkowski functionals) in heterogeneous media and surface-area driven evolving microstructures, random scalar fields, divergence-free random vector fields, and statistically anisotropic many-particle systems and two-phase media. In all cases, the relevant mathematical underpinnings are formulated and illustrative calculations are provided. Interfacial-area fluctuations play a major role in characterizing the microstructure of two-phase systems (e.g., fluid-saturated porous media), physical properties that intimately depend on the geometry of the interface, and evolving two-phase microstructures that depend on interfacial energies (e.g., spinodal decomposition). In the instances of random vector fields and statistically anisotropic structures, we show that the standard definition of hyperuniformity must be generalized such that it accounts for the dependence of the relevant spectral functions on the direction in which the origin in Fourier space is approached (nonanalyticities at the origin). Using this analysis, we place some well-known energy

  7. Hyperuniformity and its generalizations.

    PubMed

    Torquato, Salvatore

    2016-08-01

    Disordered many-particle hyperuniform systems are exotic amorphous states of matter that lie between crystal and liquid: They are like perfect crystals in the way they suppress large-scale density fluctuations and yet are like liquids or glasses in that they are statistically isotropic with no Bragg peaks. These exotic states of matter play a vital role in a number of problems across the physical, mathematical as well as biological sciences and, because they are endowed with novel physical properties, have technological importance. Given the fundamental as well as practical importance of disordered hyperuniform systems elucidated thus far, it is natural to explore the generalizations of the hyperuniformity notion and its consequences. In this paper, we substantially broaden the hyperuniformity concept along four different directions. This includes generalizations to treat fluctuations in the interfacial area (one of the Minkowski functionals) in heterogeneous media and surface-area driven evolving microstructures, random scalar fields, divergence-free random vector fields, and statistically anisotropic many-particle systems and two-phase media. In all cases, the relevant mathematical underpinnings are formulated and illustrative calculations are provided. Interfacial-area fluctuations play a major role in characterizing the microstructure of two-phase systems (e.g., fluid-saturated porous media), physical properties that intimately depend on the geometry of the interface, and evolving two-phase microstructures that depend on interfacial energies (e.g., spinodal decomposition). In the instances of random vector fields and statistically anisotropic structures, we show that the standard definition of hyperuniformity must be generalized such that it accounts for the dependence of the relevant spectral functions on the direction in which the origin in Fourier space is approached (nonanalyticities at the origin). Using this analysis, we place some well-known energy
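
    The suppression of large-scale density fluctuations that defines hyperuniformity can be illustrated with a 1D toy comparison: for a Poisson pattern the variance of the number of points in a window grows like the window size, while for a hyperuniform pattern (here a jittered lattice) it stays bounded. A sketch using a window-based number variance rather than the spectral formulation of the paper (all names illustrative):

```python
import bisect
import random

def number_variance(points, window, domain, trials=4000, seed=0):
    """Variance of the count of points in a randomly placed interval."""
    pts = sorted(points)
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        a = rng.uniform(0, domain - window)
        counts.append(bisect.bisect_left(pts, a + window) - bisect.bisect_left(pts, a))
    m = sum(counts) / trials
    return sum((c - m) ** 2 for c in counts) / trials

rng = random.Random(1)
n, domain = 2000, 2000.0
poisson = [rng.uniform(0, domain) for _ in range(n)]
# jittered integer lattice: each point stays within its own unit cell,
# so long-wavelength density fluctuations are suppressed (hyperuniform)
lattice = [i + rng.uniform(0.0, 1.0) for i in range(n)]
```

    At unit density the Poisson variance is close to the window length, while the jittered lattice's variance remains of order one regardless of window size.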

  8. The sulfur cycle at high-southern latitudes in the LMD-ZT General Circulation Model

    NASA Astrophysics Data System (ADS)

    Cosme, E.; Genthon, C.; Martinerie, P.; Boucher, O.; Pham, M.

    2002-12-01

    This modeling study was motivated by the recent publication of year-round records of dimethylsulfide (DMS) and dimethylsulfoxide (DMSO) in Antarctica, completing the available series of sulfate and methanesulfonic acid (MSA). Sulfur chemistry has been incorporated in the Laboratoire de Météorologie Dynamique-Zoom Tracers (LMD-ZT) Atmospheric General Circulation Model (AGCM), with high-resolution and improved physics at high-southern latitudes. The model predicts the concentration of six major sulfur species through emissions, transport, wet and dry deposition, and chemistry in both gas and aqueous phases. Model results are broadly realistic when compared with measurements in air and snow or ice, as well as to results of other modeling studies, at high- and middle-southern latitudes. Atmospheric MSA concentrations are underestimated and DMSO concentrations are overestimated in summer, reflecting the lack of a DMSO heterogeneous sink leading to MSA. Experiments with various recently published estimates of the rate of this sink are reported. Although not corrected in this work, other defects are identified and discussed: DMS concentrations are underestimated in winter, MSA and non-sea-salt (nss) sulfate concentrations may be underestimated at the South Pole, the deposition scheme used in the model may not be adapted to polar regions, and the model does not adequately reproduce interannual variability. Oceanic DMS sources have a major contribution to the variability of sulfur in these regions. The model results suggest that in a large part of central Antarctica ground-level atmospheric DMS concentrations are larger in winter than in summer. At high-southern latitudes, high loads of DMS and DMSO are found and the main chemical sink of sulfur dioxide (SO2) is aqueous oxidation by ozone (O3), whereas oxidation by hydrogen peroxide (H2O2) dominates at the global scale. A comprehensive modeled sulfur budget of Antarctica is provided.

  9. Distribution and budget of O3 in the troposphere calculated with a chemistry general circulation model

    NASA Astrophysics Data System (ADS)

    Roelofs, Geert-Jan; Lelieveld, Jos

    1995-10-01

    We present results of global tropospheric chemistry simulations with the coupled chemistry/atmospheric general circulation model ECHAM. Ultimately, the model will be used to study climate changes induced by anthropogenic influences on the chemistry of the atmosphere; meteorological parameters that are important for the chemistry, such as temperature, humidity, air motions, cloud and rain characteristics, and mixing processes are calculated on-line. The chemical part of the model describes background tropospheric CH4-CO-NOx-HOx photochemistry. Emissions of NO and CO, surface concentrations of CH4, and stratospheric concentrations of O3 and NOy are prescribed as boundary conditions. Calculations of the tropospheric O3 budget indicate that seasonal variabilities of the photochemical production and of injection from the stratosphere are represented realistically, although some aspects of the model still need improvement. Comparisons of calculated O3 surface concentrations and O3 profiles with available measurements show that the model reproduces O3 distributions in remote tropical and midlatitudinal sites. Also, the model matches typical profiles connected with deep convection in the Intertropical Convergence Zone (ITCZ). However, the model tends to underestimate O3 concentrations at the poles and in relatively polluted regions. These underestimates are caused by the poor representation of tropopause foldings in midlatitudes, which form a significant source of tropospheric O3 from the stratosphere, too weak transport to the poles, and the neglect of higher hydrocarbon chemistry. Also, mixing of polluted continental boundary layer air into the free troposphere may be underestimated. We discuss how these model deficiencies will be improved in the future.

  10. General Permits for Ocean Dumping

    EPA Pesticide Factsheets

    General permits are issued by EPA for the ocean dumping of certain materials that will have a minimal adverse environmental impact and are generally disposed of in small quantities. Information includes examples and ocean disposal sites for general permits.

  11. Robotics and general surgery.

    PubMed

    Jacob, Brian P; Gagner, Michel

    2003-12-01

    Robotics are now being used in all surgical fields, including general surgery. By increasing intra-abdominal articulations while operating through small incisions, robotics are increasingly being used for a large number of visceral and solid organ operations, including those for the gallbladder, esophagus, stomach, intestines, colon, and rectum, as well as for the endocrine organs. Robotics and general surgery are blending for the first time in history and as a specialty field should continue to grow for many years to come. We continuously demand solutions to questions and limitations that are experienced in our daily work. Laparoscopy is laden with limitations such as fixed axis points at the trocar insertion sites, two-dimensional video monitors, limited dexterity at the instrument tips, lack of haptic sensation, and in some cases poor ergonomics. The creation of a surgical robot system with 3D visual capacity seems to deal with most of these limitations. Although some in the surgical community continue to test the feasibility of these surgical robots and to question the necessity of such an expensive venture, others are already postulating how to improve the next generation of telemanipulators, and in so doing are looking beyond today's horizon to find simpler solutions. As the robotic era enters the world of the general surgeon, more and more complex procedures will be able to be approached through small incisions. As technology catches up with our imaginations, robotic instruments (as opposed to robots) and 3D monitoring will become routine and continue to improve patient care by providing surgeons with the most precise, least traumatic ways of treating surgical disease.

  12. General curvilinear coordinate systems

    NASA Technical Reports Server (NTRS)

    Thompson, J. P.

    1982-01-01

    The basic ideas of the construction and use of numerically-generated boundary-fitted coordinate systems for the numerical solution of partial differential equations are discussed. With such coordinate systems, all computation can be done on a fixed square grid in the rectangular transformed region regardless of the shape or movement of the physical boundaries. A number of different types of configurations for the transformed region and the basic transformation relations from a cartesian system to a general curvilinear system are given. The material of this paper is applicable to all types of coordinate system generation.
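
    One concrete instance of a numerically generated boundary-fitted coordinate system is the algebraic (transfinite interpolation) construction, which blends four boundary curves into a full curvilinear grid on the unit computational square. A sketch, assuming the boundary curves agree at the corners (names illustrative; the article covers the general framework, not this formula alone):

```python
def transfinite_grid(bottom, top, left, right, nx, ny):
    """Boundary-fitted grid by bilinear transfinite interpolation.

    Each boundary is a function of a parameter in [0, 1] returning (x, y);
    corner compatibility (e.g. left(0) == bottom(0)) is assumed. Returns
    grid[j][i] = physical point at computational coords (i/nx, j/ny).
    """
    grid = []
    for j in range(ny + 1):
        eta = j / ny
        row = []
        for i in range(nx + 1):
            xi = i / nx
            pt = tuple(
                (1 - eta) * bottom(xi)[d] + eta * top(xi)[d]
                + (1 - xi) * left(eta)[d] + xi * right(eta)[d]
                # subtract the doubly counted corner contributions
                - (1 - xi) * (1 - eta) * bottom(0)[d]
                - xi * (1 - eta) * bottom(1)[d]
                - (1 - xi) * eta * top(0)[d]
                - xi * eta * top(1)[d]
                for d in range(2)
            )
            row.append(pt)
        grid.append(row)
    return grid
```

    With straight boundaries on the unit square the construction reproduces a uniform Cartesian grid; curved boundaries yield a grid whose lines follow the physical boundaries, so the computation can proceed on the fixed square in the transformed region.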

  13. Nonlocal General Relativity

    NASA Astrophysics Data System (ADS)

    Mashhoon, Bahram

    2014-12-01

    A brief account of the present status of the recent nonlocal generalization of Einstein's theory of gravitation is presented. The main physical assumptions that underlie this theory are described. We clarify the physical meaning and significance of Weitzenbock's torsion and emphasize its intimate relationship with the gravitational field, characterized by the Riemannian curvature of spacetime. In this theory, nonlocality can simulate dark matter; in fact, in the Newtonian regime, we recover the phenomenological Tohline-Kuhn approach to modified gravity. To account for the observational data regarding dark matter, nonlocality is associated with a characteristic length scale of order 1 kpc. The confrontation of nonlocal gravity with observation is briefly discussed.

  14. Tachyons in general relativity

    SciTech Connect

    Schwartz, Charles

    2011-05-15

    We consider the motion of tachyons (faster-than-light particles) in the framework of general relativity. An important feature is the large contribution of low energy tachyons to the energy-momentum tensor. We also calculate the gravitational field produced by tachyons in particular geometric arrangements; and it appears that there could be self-cohering bundles of such matter. This leads us to suggest that such theoretical ideas might be relevant to major problems (dark matter and dark energy) in current cosmological models.

  15. Sleep Misperception and Chronic Insomnia in the General Population: The Role of Objective Sleep Duration and Psychological Profiles

    PubMed Central

    Fernandez-Mendoza, Julio; Calhoun, Susan L.; Bixler, Edward O.; Karataraki, Maria; Liao, Duanping; Vela-Bueno, Antonio; Ramos-Platon, María Jose; Sauder, Katherine A.; Basta, Maria; Vgontzas, Alexandros N.

    2011-01-01

    Objective Sleep misperception is considered by some investigators a common characteristic of chronic insomnia, whereas others propose it as a separate diagnosis. The frequency and the determinants of sleep misperception in general population samples are unknown. In this study we examined the role of objective sleep duration, a novel marker in phenotyping insomnia, and psychological profiles on sleep misperception in a large, general population sample. Methods 142 insomniacs and 724 controls selected from a general random sample of 1,741 individuals (age ≥ 20 years) underwent a polysomnographic evaluation, completed the Minnesota Multiphasic Personality Inventory-2, and were split into two groups based on their objective sleep duration: “normal sleep duration” (≥ 6 hours) and “short sleep duration” (< 6 hours). Results The discrepancy between subjective and objective sleep duration was determined by two independent factors. Short sleepers reported more sleep than they objectively had and insomniacs reported less sleep than controls with similar objective sleep duration. The additive effect of these two factors resulted in underestimation only in insomniacs with normal sleep duration. Insomniacs with normal sleep duration showed a MMPI-2 profile of high depression and anxiety, and low ego strength, whereas insomniacs with short sleep duration showed a profile of a medical disorder. Conclusions Underestimation of sleep duration is prevalent among insomniacs with objective normal sleep duration. Anxious-ruminative traits and poor resources for coping with stress appear to mediate the underestimation of sleep duration. These data further support the validity and clinical utility of objective sleep measures in phenotyping insomnia. PMID:20978224

  16. Generalized tonic-clonic seizure

    MedlinePlus

    ... Seizure - grand mal; Grand mal seizure; Seizure - generalized; Epilepsy - generalized seizure ... occur as part of a repeated, chronic illness (epilepsy). Some seizures are due to psychological problems (psychogenic).

  17. Cyclic generalized projection MRI.

    PubMed

    Sarty, Gordon E

    2015-04-01

    Progress in the development of portable MRI hinges on the ability to use lightweight magnets that have non-uniform magnetic fields. An image encoding method and mathematical procedure for recovering the image from the NMR signal from non-uniform magnets with closed isomagnetic contours are given. Individual frequencies in an NMR signal from an object in a non-uniform magnetic field give rise to integrals of the object along contours of constant magnetic field: generalized projections. With closed isomagnetic field contours a simple, cyclic, direct reconstruction of the image from the generalized projections is possible when the magnet and RF transmit coil are held fixed relative to the imaged object while the RF receive coil moves. Numerical simulations, using the Shepp and Logan mathematical phantom, were completed to show that the mathematical method works and to illustrate numerical limitations. The method is numerically verified and exact reconstruction demonstrated for discrete mathematical image phantoms. Correct knowledge of the RF receive field is necessary or severe image distortions will result. The cyclic mathematical reconstruction method presented here will be useful for portable MRI schemes that use non-uniform magnets with closed isomagnetic contours along with mechanically or electronically moving the RF receive coils.

  18. General Aviation Data Framework

    NASA Technical Reports Server (NTRS)

    Blount, Elaine M.; Chung, Victoria I.

    2006-01-01

    The Flight Research Services Directorate at the NASA Langley Research Center (LaRC) provides development and operations services associated with three general aviation (GA) aircraft used for research experiments. The GA aircraft include a Cessna 206X Stationair, a Lancair Columbia 300X, and a Cirrus SR22X. Since 2004, the GA Data Framework software was designed and implemented to gather data from a varying set of hardware and software sources as well as enable transfer of the data to other computers or devices. The key requirements for the GA Data Framework software include platform independence, the ability to reuse the framework for different projects without changing the framework code, graphics display capabilities, and the ability to vary the interfaces and their performance. Data received from the various devices is stored in shared memory. This paper concentrates on the object oriented software design patterns within the General Aviation Data Framework, and how they enable the construction of project specific software without changing the base classes. The issues of platform independence and multi-threading which enable interfaces to run at different frame rates are also discussed in this paper.

  19. Generalized mosaicing: polarization panorama.

    PubMed

    Schechner, Yoav Y; Nayar, Shree K

    2005-04-01

    We present an approach to image the polarization state of object points in a wide field of view, while enhancing the radiometric dynamic range of imaging systems by generalizing image mosaicing. The approach is biologically-inspired, as it emulates spatially varying polarization sensitivity of some animals. In our method, a spatially varying polarization and attenuation filter is rigidly attached to a camera. As the system moves, it senses each scene point multiple times, each time filtering it through a different filter polarizing angle, polarizance, and transmittance. Polarization is an additional dimension of the generalized mosaicing paradigm, which has recently yielded high dynamic range images and multispectral images in a wide field of view using other kinds of filters. The image acquisition is as easy as in traditional image mosaics. The computational algorithm can easily handle nonideal polarization filters (partial polarizers), variable exposures, and saturation in a single framework. The resulting mosaic represents the polarization state at each scene point. Using data acquired by this method, we demonstrate attenuation and enhancement of specular reflections and semireflection separation in an image mosaic.
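
    The multi-filter estimation idea can be sketched for the ideal-polarizer special case, where the intensity seen through a linear polarizer at angle a is I = (S0 + S1*cos 2a + S2*sin 2a)/2, so three or more filter angles determine the linear polarization state by least squares. This is a simplified stand-in for the paper's algorithm, which additionally handles partial polarizers, variable exposures, and saturation (names illustrative):

```python
import math

def recover_stokes(angles, intensities):
    """Least-squares (S0, S1, S2) from intensities through ideal linear
    polarizers at the given angles, via the normal equations."""
    m = len(angles)
    # design matrix rows: [0.5, 0.5*cos(2a), 0.5*sin(2a)]
    A = [[0.5, 0.5 * math.cos(2 * a), 0.5 * math.sin(2 * a)] for a in angles]
    AtA = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(3)]
           for i in range(3)]
    Atb = [sum(A[k][i] * intensities[k] for k in range(m)) for i in range(3)]
    # Gauss-Jordan elimination with partial pivoting on the 3x3 system
    M = [row[:] + [rhs] for row, rhs in zip(AtA, Atb)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]
```

    Each scene point in the mosaic is seen through several effective filter angles as the camera moves, which is exactly the overdetermined system this least-squares step resolves.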

  20. General composite Higgs models

    NASA Astrophysics Data System (ADS)

    Marzocca, David; Serone, Marco; Shu, Jing

    2012-08-01

    We construct a general class of pseudo-Goldstone composite Higgs models, within the minimal SO(5)/SO(4) coset structure, that are not necessarily of moose-type. We characterize the main properties these models should have in order to give rise to a Higgs mass around 125 GeV. We assume the existence of relatively light and weakly coupled spin 1 and 1/2 resonances. In absence of a symmetry principle, we introduce the Minimal Higgs Potential (MHP) hypothesis: the Higgs potential is assumed to be one-loop dominated by the SM fields and the above resonances, with a contribution that is made calculable by imposing suitable generalizations of the first and second Weinberg sum rules. We show that a 125 GeV Higgs requires light, often sub-TeV, fermion resonances. Their presence can also be important for the models to successfully pass the electroweak precision tests. Interestingly enough, the latter can also be passed by models with a heavy Higgs around 320 GeV. The composite Higgs models of the moose-type considered in the literature can be seen as particular limits of our class of models.

  1. Generalized Nonlinear Yule Models

    NASA Astrophysics Data System (ADS)

    Lansky, Petr; Polito, Federico; Sacerdote, Laura

    2016-11-01

    With the aim of considering models related to random graphs growth exhibiting persistent memory, we propose a fractional nonlinear modification of the classical Yule model often studied in the context of macroevolution. Here the model is analyzed and interpreted in the framework of the development of networks such as the World Wide Web. Nonlinearity is introduced by replacing the linear birth process governing the growth of the in-links of each specific webpage with a fractional nonlinear birth process with completely general birth rates. Among the main results we derive the explicit distribution of the number of in-links of a webpage chosen uniformly at random recognizing the contribution to the asymptotics and the finite time correction. The mean value of the latter distribution is also calculated explicitly in the most general case. Furthermore, in order to show the usefulness of our results, we particularize them in the case of specific birth rates giving rise to a saturating behaviour, a property that is often observed in nature. The further specialization to the non-fractional case allows us to extend the Yule model accounting for a nonlinear growth.
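
    The non-fractional, linear special case mentioned at the end (classical Yule-type growth of webpage in-links) is straightforward to simulate. A sketch with an illustrative attachment weight of in-links + 1, so that new pages can also start receiving links (not the paper's fractional nonlinear process):

```python
import random

def grow_network(steps, seed=0):
    """Linear preferential-attachment growth of in-links.

    At each step a new page arrives and links to an existing page chosen
    with probability proportional to (its in-links + 1). Returns the
    in-link count of every page.
    """
    rng = random.Random(seed)
    in_links = [0]   # one initial page
    targets = [0]    # multiset: page i appears in_links[i] + 1 times
    for _ in range(steps):
        t = rng.choice(targets)
        in_links[t] += 1
        targets.append(t)                  # chosen page's weight grows by one
        in_links.append(0)                 # the newly arrived page
        targets.append(len(in_links) - 1)  # its initial weight of one
    return in_links
```

    Typical runs show the heavy-tailed in-link distribution the model is known for: a few old pages accumulate many links while most pages have none, the saturating and fractional variants in the paper modifying exactly this birth mechanism.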

  2. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t} where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus our model can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position, t, along a tract in the brain. In one example, the response is disease-status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online.
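
    The FGAM linear predictor, the integral over t of F{X(t), t}, can be evaluated on the observation grid by simple quadrature once a surface F is in hand. A sketch with a known F (in the paper F is instead unknown and estimated with penalized tensor-product B-splines; names here are illustrative):

```python
def fgam_linear_predictor(F, X, ts):
    """Trapezoid-rule evaluation of the integral of F(X(t), t) dt
    over the observation grid ts, for a given bivariate surface F
    and functional covariate X."""
    vals = [F(X(t), t) for t in ts]
    return sum(0.5 * (vals[k] + vals[k + 1]) * (ts[k + 1] - ts[k])
               for k in range(len(ts) - 1))
```

    Replacing F(x, t) with beta(t)*x recovers the ordinary functional linear model as a special case, which is the sense in which FGAM is its natural generalization.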

  3. GENERAL PURPOSE ADA PACKAGES

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    Ten families of subprograms are bundled together for the General-Purpose Ada Packages. The families bring to Ada many features from HAL/S, PL/I, FORTRAN, and other languages. These families are: string subprograms (INDEX, TRIM, LOAD, etc.); scalar subprograms (MAX, MIN, REM, etc.); array subprograms (MAX, MIN, PROD, SUM, GET, and PUT); numerical subprograms (EXP, CUBIC, etc.); service subprograms (DATE_TIME function, etc.); Linear Algebra II; Runge-Kutta integrators; and three text I/O families of packages. In two cases, a family consists of a single non-generic package. In all other cases, a family comprises a generic package and its instances for a selected group of scalar types. All generic packages are designed to be easily instantiated for the types declared in the user facility. The linear algebra package is LINRAG2. This package includes subprograms supplementing those in NPO-17985, An Ada Linear Algebra Package Modeled After HAL/S (LINRAG). Please note that LINRAG2 cannot be compiled without LINRAG. Most packages have widespread applicability, although some are oriented for avionics applications. All are designed to facilitate writing new software in Ada. Several of the packages use conventions introduced by other programming languages. A package of string subprograms is based on HAL/S (a language designed for the avionics software in the Space Shuttle) and PL/I. Packages of scalar and array subprograms are taken from HAL/S or generalized current Ada subprograms. A package of Runge-Kutta integrators is patterned after a built-in MAC (MIT Algebraic Compiler) integrator. Those packages modeled after HAL/S make it easy to translate existing HAL/S software to Ada. The General-Purpose Ada Packages program source code is available on two 360K 5.25" MS-DOS format diskettes. The software was developed using VAX Ada v1.5 under DEC VMS v4.5. It should be portable to any validated Ada compiler and it should execute either interactively or in batch. 
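As a rough illustration of the kind of routine the Runge-Kutta integrator family above provides, here is a minimal sketch of a classical fourth-order Runge-Kutta step in Python. The function names are illustrative, not the Ada package's actual identifiers.

```python
# Minimal sketch of one classical 4th-order Runge-Kutta (RK4) step.
# Illustrative only; not the Ada package's actual code.

def rk4_step(f, t, y, h):
    """Advance y' = f(t, y) from t to t + h with one RK4 step."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h * k1 / 2)
    k3 = f(t + h / 2, y + h * k2 / 2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, t0, y0, t1, steps):
    """Repeatedly apply rk4_step over [t0, t1]."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y = rk4_step(f, t, y, h)
        t += h
    return y
```

For example, integrating y' = y from 0 to 1 with 100 steps recovers e to well within the RK4 global error of O(h^4).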

  4. Generalized quantum secret sharing

    SciTech Connect

    Singh, Sudhir Kumar; Srikanth, R.

    2005-01-01

    We explore a generalization of quantum secret sharing (QSS) in which classical shares play a complementary role to quantum shares, exploring further consequences of an idea first studied by Nascimento, Mueller-Quade, and Imai [Phys. Rev. A 64, 042311 (2001)]. We examine three ways, termed inflation, compression, and twin thresholding, by which the proportion of classical shares can be augmented. This has the important application that it reduces the number of quantum (information processing) players by replacing them with their classical counterparts, thereby making quantum secret sharing considerably easier and less expensive to implement in a practical setting. In compression, a QSS scheme is turned into an equivalent scheme with fewer quantum players, compensated for by suitable classical shares. In inflation, a QSS scheme is enlarged by adding only classical shares and players. In a twin-threshold scheme, we invoke two separate thresholds for classical and quantum shares, based on the idea of information dilution.

  5. Generalized entanglement entropy

    NASA Astrophysics Data System (ADS)

    Taylor, Marika

    2016-07-01

    We discuss two measures of entanglement in quantum field theory and their holographic realizations. For field theories admitting a global symmetry, we introduce a global symmetry entanglement entropy, associated with the partitioning of the symmetry group. This quantity is proposed to be related to the generalized holographic entanglement entropy defined via the partitioning of the internal space of the bulk geometry. The second measure of quantum field theory entanglement is the field space entanglement entropy, obtained by integrating out a subset of the quantum fields. We argue that field space entanglement entropy cannot be precisely realised geometrically in a holographic dual. However, for holographic geometries with interior decoupling regions, the differential entropy provides a close analogue to the field space entanglement entropy. We derive generic descriptions of such inner throat regions in terms of gravity coupled to massive scalars and show how the differential entropy in the throat captures features of the field space entanglement entropy.

  6. Generalized teleportation protocol

    SciTech Connect

    Gordon, Goren; Rigolin, Gustavo

    2006-04-15

    A generalized teleportation protocol (GTP) for N qubits is presented, where the teleportation channels are nonmaximally entangled and all the free parameters of the protocol are considered: Alice's measurement basis, her sets of acceptable results, and Bob's unitary operations. The full range of fidelity (F) of the teleported state and the probability of success (P_suc) of obtaining a given fidelity are achieved by changing these free parameters. A channel efficiency bound is found, and one can determine how to divide it between F and P_suc. A one-qubit formulation is presented and then expanded to N qubits. A proposed experimental setup that implements the GTP is given using linear optics.

  7. Ocean General Circulation Models

    SciTech Connect

    Yoon, Jin-Ho; Ma, Po-Lun

    2012-09-30

    1. Definition of Subject. The purpose of this text is to provide an introduction to aspects of oceanic general circulation models (OGCMs), an important component of climate system or Earth System Models (ESMs). The role of the ocean in ESMs is described in Chapter XX (EDITOR: PLEASE FIND THE COUPLED CLIMATE or EARTH SYSTEM MODELING CHAPTERS). The emerging need for understanding the Earth’s climate system, and especially for projecting its future evolution, has encouraged scientists to explore the dynamical, physical, and biogeochemical processes in the ocean. Understanding the role of these processes in the climate system is an interesting and challenging scientific subject. For example, the research question of how much of the extra heat or CO2 generated by anthropogenic activities can be stored in the deep ocean is not only scientifically interesting but also important in projecting the future climate of the Earth. Thus, OGCMs have been developed and applied to investigate the various oceanic processes and their role in the climate system.

  8. Immunizations: vaccinations in general.

    PubMed

    Wiley, Catherine C

    2015-06-01

    The childhood immunization schedule is complex and nuanced. Although serious adverse reactions to immunizations are uncommon, clinicians must be well-versed in these reactions as well as the contraindications and precautions to each vaccine. • Conjugate vaccine technology links polysaccharide antigens to carrier proteins, triggering T-cell-dependent immunity to polysaccharides, thereby strengthening immune memory. • On the basis of some research evidence and consensus, live vaccines are generally contraindicated in immunocompromised patients and in pregnancy. Most live vaccines can be administered to household contacts of immunocompromised patients. • On the basis of some research and consensus, modified administration of meningococcal, pneumococcal, and less commonly, other vaccines may be indicated to protect immunocompromised patients. • On the basis of disease epidemiology and consensus, international travelers should be up-to-date with all routine immunizations; depending on destination, additional vaccines or immune globulin may be required.

  9. Generalized BEC in superconductivity

    NASA Astrophysics Data System (ADS)

    Grether, M.; de Llano, M.

    2007-09-01

    A generalized Bose-Einstein condensation (GBEC) statistical theory of superconductors accounts not for boson-boson (BB) interactions but rather for boson-fermion (BF) interactions. It extends the 1989 Friedberg-Lee BEC theory by including as bosons two-hole (2h) singlet Cooper pairs (CPs) in addition to the usual two-electron (2e) ones. It contains BCS theory when both kinds of pairs are equal in the BE condensate and in excited states, at least as far as identically reproducing the BCS gap equation for all temperatures T as well as the T = 0 BCS condensation energy for all couplings. As a ternary BF model with BF interactions, it yields Tcs one to three orders of magnitude higher than BCS theory with the same Cooper/BCS model electron-phonon interaction. These Tcs appear to be surprisingly insensitive to the BF interaction.

  10. Orthognathic Surgery: General Considerations

    PubMed Central

    Khechoyan, David Y.

    2013-01-01

    Orthognathic surgery is a unique endeavor in facial surgery: a patient's appearance and occlusal function can be improved significantly, impacting the patient's sense of self and well-being. Successful outcomes in modern orthognathic surgery rely on close collaboration between the surgeon and the orthodontist across all stages of treatment, from preoperative planning to finalization of occlusion. Virtual computer planning promotes a more accurate analysis of dentofacial deformity and preoperative planning. It is also an invaluable aid in providing comprehensive patient education. In this article, the author describes the general surgical principles that underlie orthognathic surgery, highlighting the sequence of treatment, preoperative analysis of dentofacial deformity, surgical execution of the treatment plan, and possible complications. PMID:24872758

  11. Generalized conjugate gradient squared

    SciTech Connect

    Fokkema, D.R.; Sleijpen, G.L.G.

    1994-12-31

    In order to solve non-symmetric linear systems of equations, the Conjugate Gradient Squared (CGS) method is well known and widely used. In practice the method converges fast, often twice as fast as the Bi-Conjugate Gradient (BiCG) method. This is to be expected, since CGS uses the square of the BiCG polynomial. However, CGS may suffer from erratic convergence behavior: the method may diverge, or the approximate solution may be inaccurate. BiCGSTAB uses the BiCG polynomial and a product of linear factors in an attempt to smooth the convergence. In many cases this has proven to be very effective. Unfortunately, the convergence of BiCGSTAB may stall when a linear factor (nearly) degenerates. BiCGstab(ℓ) is designed to overcome this degeneration of linear factors. It generalizes BiCGSTAB and uses both the BiCG polynomial and a product of higher-order factors. Still, CGS may converge faster than BiCGSTAB or BiCGstab(ℓ). So instead of using a product of linear or higher-order factors, it may be worthwhile to look for other polynomials. Since the BiCG polynomial is based on a three-term recursion, a natural choice would be a polynomial based on another three-term recursion. Possibly, a suitable choice of recursion coefficients would result in a method that converges as fast as or faster than CGS, but less erratically. It turns out that an algorithm for such a method can easily be formulated. One particular choice of recursion coefficients leads to CGS; one could therefore call this algorithm generalized CGS. Another choice of recursion coefficients leads to BiCGSTAB. It is therefore possible to mix linear factors and some polynomial based on a three-term recursion. This way one may get the best of both worlds. The authors will report on their findings.
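As a sketch of the baseline method under discussion, here is a minimal plain-Python version of the classical CGS iteration (following the standard Sonneveld formulation). It is illustrative only; the generalized three-term recursions proposed in the talk are not reproduced, and a production code would use numpy/scipy.

```python
# Sketch of the Conjugate Gradient Squared (CGS) iteration for a
# nonsymmetric system A x = b, in plain Python for illustration.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def axpy(a, u, v):
    """Return a*u + v for vectors u, v."""
    return [a * ui + vi for ui, vi in zip(u, v)]

def matvec(A, x):
    return [dot(row, x) for row in A]

def cgs(A, b, x0=None, tol=1e-12, maxiter=200):
    n = len(b)
    x = list(x0) if x0 is not None else [0.0] * n
    r = [bi - ri for bi, ri in zip(b, matvec(A, x))]
    rtilde = list(r)                 # fixed shadow residual
    rho_old = 1.0
    p = [0.0] * n
    q = [0.0] * n
    for _ in range(maxiter):
        rho = dot(rtilde, r)
        if rho == 0.0:
            break                    # breakdown
        beta = rho / rho_old         # harmless on the first pass (p = q = 0)
        u = axpy(beta, q, r)                   # u = r + beta*q
        p = axpy(beta, axpy(beta, p, q), u)    # p = u + beta*(q + beta*p)
        v = matvec(A, p)
        alpha = rho / dot(rtilde, v)
        q = axpy(-alpha, v, u)                 # q = u - alpha*v
        uq = [ui + qi for ui, qi in zip(u, q)]
        x = axpy(alpha, uq, x)
        r = axpy(-alpha, matvec(A, uq), r)
        rho_old = rho
        if dot(r, r) ** 0.5 < tol:
            break
    return x
```

On a small diagonally dominant system the iteration terminates in a handful of steps; the erratic behavior the abstract describes shows up on harder, ill-conditioned problems.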

  12. Identities for generalized hypergeometric coefficients

    SciTech Connect

    Biedenharn, L.C.; Louck, J.D.

    1991-01-01

    Generalizations of hypergeometric functions to arbitrarily many symmetric variables are discussed, along with their associated hypergeometric coefficients, and the setting within which these generalizations arose. Identities generalizing the Euler identity for 2F1, the Saalschuetz identity, and two generalizations of the 4F3 Bailey identity, among others, are given. 16 refs.
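The classical Euler identity for 2F1 that the abstract generalizes can be checked numerically. The sketch below (not from the paper) uses a simple truncated Gauss series, valid for |z| < 1:

```python
# Numerical check of Euler's identity for the Gauss hypergeometric
# function: 2F1(a, b; c; z) = (1-z)^(c-a-b) * 2F1(c-a, c-b; c; z).
# The series is a plain truncation of the Gauss series, valid for |z| < 1.

def hyp2f1(a, b, c, z, terms=200):
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        # Pochhammer-ratio recurrence for the next series term
        term *= (a + n) * (b + n) / ((c + n) * (n + 1)) * z
    return total

a, b, c, z = 0.5, 1.2, 2.3, 0.4
lhs = hyp2f1(a, b, c, z)
rhs = (1 - z) ** (c - a - b) * hyp2f1(c - a, c - b, c, z)
assert abs(lhs - rhs) < 1e-12
```

Since the identity is exact, the residual is limited only by truncation (here ~0.4^200) and floating-point rounding.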

  13. Generalized teleparallel theory

    NASA Astrophysics Data System (ADS)

    Junior, Ednaldo L. B.; Rodrigues, Manuel E.

    2016-07-01

    We construct a theory in which the gravitational interaction is described only by torsion, but which generalizes the teleparallel theory while still keeping the invariance of local Lorentz transformations in one particular case. We show that our theory falls, in a certain limit of a real parameter, under f(R̄) gravity or, in another limit of the same real parameter, under modified f(T) gravity; on interpolating between these two theories it can still fall under several other theories. We explicitly show the equivalence with f(R̄) gravity for the cases of a Friedmann-Lemaître-Robertson-Walker flat metric for diagonal tetrads, and a metric with spherical symmetry for diagonal and non-diagonal tetrads. We study four applications: the reconstruction of the de Sitter universe cosmological model, obtaining a static spherically symmetric solution of de Sitter type for a perfect fluid, the evolution of the state parameter ω_DE, and the thermodynamics of the apparent horizon.

  14. Natural generalized mirage mediation

    NASA Astrophysics Data System (ADS)

    Baer, Howard; Barger, Vernon; Serce, Hasan; Tata, Xerxes

    2016-12-01

    In the supersymmetric scenario known as mirage mediation (MM), the soft supersymmetry (SUSY) breaking terms receive comparable anomaly-mediation and moduli-mediation contributions, leading to the phenomenon of mirage unification. The simplest MM SUSY breaking models which are consistent with the measured Higgs mass and sparticle mass constraints are strongly disfavored by fine-tuning considerations. However, while MM makes robust predictions for gaugino masses, the scalar sector is quite sensitive to the specific mechanisms for moduli stabilization and potential uplifting. We suggest here a broader setup of generalized mirage mediation (GMM), in which heretofore discrete parameters are allowed to vary continuously in order to better parametrize these other schemes. We find that natural SUSY spectra consistent with both the measured value of mh and the LHC lower bounds on superpartner masses are then possible. We explicitly show that models generated from natural GMM may be beyond the reach of even high-luminosity LHC searches. In such a case, the proposed International Linear e+e- Collider will be required for natural SUSY discovery via higgsino pair production reactions. We also outline prospects for detection of higgsino-like WIMPs from natural GMM.

  15. Generalized Linear Covariance Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Markley, F. Landis

    2014-01-01

    This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work, and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic," and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.

  16. Beyond Einstein's General Relativity

    NASA Astrophysics Data System (ADS)

    Lobo, Francisco S. N.

    2015-04-01

    Modern astrophysical and cosmological models are plagued with two severe theoretical difficulties, namely, the dark energy and the dark matter problems. Relative to the former, high-precision observational data have confirmed with startling evidence that the Universe is undergoing a phase of accelerated expansion. This phase, one of the most important and challenging current problems in cosmology, represents a new imbalance in the governing gravitational equations. Several candidates, responsible for this expansion, have been proposed in the literature, in particular, dark energy models and modified gravity, amongst others. Outstanding questions are related to the nature of this so-called “dark energy” that is driving the acceleration of the universe, and whether it is due to the vacuum energy or a dynamical field. On the other hand, the late-time cosmic acceleration may be due to modifications of General Relativity, which introduce new degrees of freedom to the gravitational sector itself. We analyze some of the modified theories of gravity that address these intriguing and exciting problems facing modern physics, and explore the foundations of gravitation theory, essential for the construction of modified theories of gravity.

  17. 27. GENERAL VIEW OF ORE TRANSPORT AND GENERAL WORK AREA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    27. GENERAL VIEW OF ORE TRANSPORT AND GENERAL WORK AREA BETWEEN MARISCAL WORKS CONDENSER STACK AND VIVIANNA WORKS BUILDINGS LOOKING SOUTHWEST. - Mariscal Quicksilver Mine & Reduction Works, Terlingua, Brewster County, TX

  18. GENERAL VIEW OF ORE TRANSPORT AND GENERAL WORK AREA BETWEEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    GENERAL VIEW OF ORE TRANSPORT AND GENERAL WORK AREA BETWEEN MARISCAL WORKS CONDENSER STACK AND VIVIANNA WORKS BUILDINGS LOOKING SOUTHWEST. - Mariscal Quicksilver Mine & Reduction Works, Terlingua, Brewster County, TX

  19. 2. General view to southwest, showing general context at low ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. General view to southwest, showing general context at low tide, with Building 24 in background right. - Charlestown Navy Yard, Marine Railway, Between Piers 2 & 3, on Charlestown Waterfront at west end of Navy Yard, Boston, Suffolk County, MA

  20. 33. GENERAL HIGH ALTITUDE AERIAL VIEW OF COMPLEX AND GENERAL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    33. GENERAL HIGH ALTITUDE AERIAL VIEW OF COMPLEX AND GENERAL SETTING. October 1982 - Mississippi River 9-Foot Channel Project, Lock & Dam No. 15, Upper Mississippi River (Arsenal Island), Rock Island, Rock Island County, IL

  1. 13. GENERAL HIGH ALTITUDE AERIAL VIEW OF COMPLEX AND GENERAL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. GENERAL HIGH ALTITUDE AERIAL VIEW OF COMPLEX AND GENERAL SETTING. October 1982 - Mississippi River 9-Foot Channel Project, Lock & Dam No. 17, Upper Mississippi River, New Boston, Mercer County, IL

  2. Action principle for the generalized harmonic formulation of general relativity

    SciTech Connect

    Brown, J. David

    2011-10-15

    An action principle for the generalized harmonic formulation of general relativity is presented. The action is a functional of the spacetime metric and the gauge source vector. An action principle for the Z4 formulation of general relativity has been proposed recently by Bona, Bona-Casas, and Palenzuela. The relationship between the generalized harmonic action and the Bona, Bona-Casas, and Palenzuela action is discussed in detail.

  3. The generalized triangular decomposition

    NASA Astrophysics Data System (ADS)

    Jiang, Yi; Hager, William W.; Li, Jian

    2008-06-01

    Given a complex matrix H, we consider the decomposition H = QRP*, where R is upper triangular and Q and P have orthonormal columns. Special instances of this decomposition include the singular value decomposition (SVD) and the Schur decomposition, where R is an upper triangular matrix with the eigenvalues of H on the diagonal. We show that any diagonal for R can be achieved that satisfies Weyl's multiplicative majorization conditions: ∏_{i=1}^{k} |r_i| ≤ ∏_{i=1}^{k} σ_i for 1 ≤ k < K, and ∏_{i=1}^{K} |r_i| = ∏_{i=1}^{K} σ_i, where K is the rank of H, σ_i is the i-th largest singular value of H, and r_i is the i-th largest (in magnitude) diagonal element of R. Given a vector r which satisfies Weyl's conditions, we call the decomposition H = QRP*, where R is upper triangular with prescribed diagonal r, the generalized triangular decomposition (GTD). A direct (nonrecursive) algorithm is developed for computing the GTD. This algorithm starts with the SVD and applies a series of permutations and Givens rotations to obtain the GTD. The numerical stability of the GTD update step is established. The GTD can be used to optimize the power utilization of a communication channel while taking into account quality of service requirements for subchannels. Another application of the GTD is to inverse eigenvalue problems, where the goal is to construct matrices with prescribed eigenvalues and singular values.
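As a quick illustration (not part of the paper), Weyl's multiplicative majorization conditions can be checked numerically for a candidate diagonal r against the singular values:

```python
# Sketch: check Weyl's multiplicative majorization conditions, which
# characterize the diagonals r achievable in the generalized triangular
# decomposition H = Q R P* with singular values sigma.

def weyl_majorized(r, sigma, tol=1e-12):
    """True iff prod_{i<=k} |r_i| <= prod_{i<=k} sigma_i for k < K,
    with equality at k = K (entries compared in decreasing magnitude)."""
    r = sorted((abs(x) for x in r), reverse=True)
    sigma = sorted(sigma, reverse=True)
    pr, ps = 1.0, 1.0
    for k, (rk, sk) in enumerate(zip(r, sigma), start=1):
        pr *= rk
        ps *= sk
        if k < len(r) and pr > ps * (1 + tol):
            return False
    return abs(pr - ps) <= tol * max(pr, ps, 1.0)
```

For sigma = [3, 2, 1], an equal-magnitude diagonal with the same product (each entry 6^(1/3)) satisfies the conditions, while r = [3.5, 2, 6/7] preserves the total product but violates the first partial-product inequality.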

  4. Directions in General Relativity

    NASA Astrophysics Data System (ADS)

    Hu, B. L.; Ryan, M. P., Jr.; Vishveshwara, C. V.

    2005-10-01

    Preface; Dieter Brill: a spacetime perspective; 1. Thawing the frozen formalism: the difference between observables and what we observe A. Anderson; 2. Jacobi's action and the density of states J. D. Brown and J. W. York; 3. Decoherence of correlation histories E. Calzetta and B. L. Hu; 4. The initial value problem in light of Ashtekar's variables R. Capovilla, J. Dell and T. Jacobson; 5. Status report on an axiomatic basis for functional integration P. Cartier and C. DeWitt-Morette; 6. Solution of the coupled Einstein constraints on asymptotically Euclidean manifolds Y. Choquet-Bruhat; 7. Compact Cauchy horizons and Cauchy surfaces P. Chrusciel and J. Isenberg; 8. The classical electron J. M. Cohen and E. Mustafa; 9. Gauge (in)variance, mass and parity in D=3 revisited S. Deser; 10. Triality, exceptional Lie groups and Dirac operators F. Flaherty; 11. The reduction of the state vector and limitations on measurement in the quantum mechanics of closed systems J. B. Hartle; 12 Quantum linearization instabilities of de Sitter spacetime A. Higuchi; 13. What is the true description of charged black holes? G. T. Horowitz; 14. Limits on the adiabatic index in static stellar models L. Lindblom and A. K. M. Masood-ul-Alam; 15. On the relativity of rotation B. Mashhoon; 16. Recent progress and open problems in linearization stability V. E. Moncrief; 17. Brill waves N. Ó Murchadha; 18. You can't get there from here: constraints on topology change K. Schleich and D. M. Witt; 19. Time, measurement and information loss in quantum cosmology L. Smolin; 20. Impossible measurements on quantum fields R. Sorkin; 21. A new condition implying the existence of a constant mean curvature foliation F. J. Tipler; 22. Maximal slices in stationary spacetimes with ergoregions R. M. Wald; 23. (1 + 1) - Dimensional methods for general relativity J. H. Yoon; 24. Coalescence of primal gravity waves to make cosmological mass without matter D. E. Holz, W. A. Miller, M. Wakano and J. A. Wheeler.

  5. Quadratic Generalized Scale Invariance

    NASA Astrophysics Data System (ADS)

    Lovejoy, S.; Schertzer, D.; Addor, J. B.

    Nearly twenty years ago, two of us argued that in order to account for the scaling stratification of the atmosphere, an anisotropic "unified scaling model" of the atmosphere was required, with elliptical dimension 23/9 = 2.555... "in between" the standard 3-D (small-scale) and 2-D (large-scale) models. This model was based on the formalism of generalized scale invariance (GSI). Physically, GSI is justified by arguing that various conserved fluxes (energy, buoyancy force variance, etc.) should define the appropriate notion of scale. In a recent large-scale satellite cloud image analysis, we directly confirmed this model by studying the isotropic (angle-averaged) horizontal cloud statistics. Mathematically, GSI is based on a group of scale-changing operators and their generators, but to date both analyses (primarily of cloud images) and numerical (multifractal) simulations have been limited to the special case of linear GSI. This has shown that cloud texture can plausibly be associated with local linearizations. However, realistic morphologies involve spatially varying textures; the full nonlinear GSI is clearly necessary. In this talk, we first show that the observed angle-averaged (multi)scaling statistics give only a relatively weak constraint on the nonlinear generator: the latter can be expressed as a self-similar (isotropic) part and a deviatoric part described (in two dimensions) by an arbitrary scalar potential which contains all the information about the cloud morphology. We then show (using a theorem due to Poincaré) how to reduce nonlinear GSI to linear GSI plus a nonlinear coordinate transformation, numerically, using this to take multifractal GSI modelling to the next level of approximation: quadratic GSI. We show many examples of the corresponding simulations, which include transitions between various morphologies (including cyclones), and we discuss the results in relation to satellite cloud images.

  6. Update on the imaging diagnosis of otosclerosis.

    PubMed

    Gredilla Molinero, J; Mancheño Losa, M; Santamaría Guinea, N; Arévalo Galeano, N; Grande Bárez, M

    2016-01-01

    Otosclerosis is a primary osteodystrophy of the temporal bone that causes progressive conductive hearing loss. The diagnosis is generally clinical, but multidetector CT (MDCT), the imaging technique of choice, is sometimes necessary. The objective of this article is to systematically review the usefulness of imaging techniques for the diagnosis and postsurgical assessment of otosclerosis, particularly the role of MDCT, with the aim of decreasing surgical risk.

  7. GVS - GENERAL VISUALIZATION SYSTEM

    NASA Technical Reports Server (NTRS)

    Keith, S. R.

    1994-01-01

    The primary purpose of GVS (General Visualization System) is to support scientific visualization of data output by the panel method PMARC_12 (inventory number ARC-13362) on the Silicon Graphics Iris computer. GVS allows the user to view PMARC geometries and wakes as wire frames or as light-shaded objects. Additionally, geometries can be color shaded according to phenomena such as pressure coefficient or velocity. Screen objects can be interactively translated and/or rotated to permit easy viewing. Keyframe animation is also available for studying unsteady cases. The purpose of scientific visualization is to allow investigators to gain insight into the phenomena they are examining; GVS therefore emphasizes analysis, not artistic quality. GVS uses existing IRIX 4.0 image processing tools to allow for conversion of SGI RGB files to other formats. GVS is a self-contained program which contains all the necessary interfaces to control interaction with PMARC data. This includes 1) the GVS Tool Box, which supports color histogram analysis, lighting control, rendering control, animation, and positioning, 2) GVS on-line help, which allows the user to access control elements and get information about each control simultaneously, and 3) a limited set of basic GVS data conversion filters, which allows for the display of data requiring simpler data formats. Specialized controls for handling PMARC data include animation and wakes, and visualization of off-body scan volumes. GVS is written in C-language for use on SGI Iris series computers running IRIX. It requires 28Mb of RAM for execution. Two separate hardcopy documents are available for GVS. The basic document price for ARC-13361 includes only the GVS User's Manual, which outlines major features of the program and provides a tutorial on using GVS with PMARC_12 data. Programmers interested in modifying GVS for use with data in formats other than PMARC_12 format may purchase a copy of the draft GVS 3.1 Software Maintenance

  8. The Generalized DINA Model Framework

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2011-01-01

    The G-DINA ("generalized deterministic inputs, noisy and gate") model is a generalization of the DINA model with more relaxed assumptions. In its saturated form, the G-DINA model is equivalent to other general models for cognitive diagnosis based on alternative link functions. When appropriate constraints are applied, several commonly used…

  9. General Aviation Pilot Education Program.

    ERIC Educational Resources Information Center

    Cole, Warren L.

    General Aviation Pilot Education (GAPE) was a safety program designed to improve the aeronautical education of the general aviation pilot in anticipation that the national aircraft accident rate might be improved. GAPE PROGRAM attempted to reach the average general aviation pilot with specific and factual information regarding the pitfalls of his…

  10. Generalized Multilevel Structural Equation Modeling

    ERIC Educational Resources Information Center

    Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew

    2004-01-01

    A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…

  11. OLEDS FOR GENERAL LIGHTING

    SciTech Connect

    Anil Duggal; Don Foust; Chris Heller; Bill Nealon; Larry Turner; Joe Shiang; Nick Baynes; Tim Butler; Nalin Patel

    2004-02-29

    The goal of this program was to reduce the long term technical risks that were keeping the lighting industry from embracing and developing organic light-emitting diode (OLED) technology for general illumination. The specific goal was to develop OLEDs for lighting to the point where it was possible to demonstrate a large area white light panel with brightness and light quality comparable to a fluorescent source and with an efficacy comparable to that of an incandescent source. It was recognized that achieving this would require significant advances in three areas: (1) the improvement of white light quality for illumination, (2) the improvement of OLED energy efficiency at high brightness, and (3) the development of cost-effective large area fabrication techniques. The program was organized such that, each year, a "deliverable" device would be fabricated which demonstrated progress in one or more of the three critical research areas. In the first year (2001), effort concentrated on developing an OLED capable of generating high illumination-quality white light. Ultimately, a down-conversion method, in which a blue OLED was coupled with various down-conversion layers, was chosen. Various color and scattering models were developed to aid in material development and device optimization. The first year utilized this approach to deliver a 1 inch x 1 inch OLED with higher illumination-quality than available fluorescent sources. A picture of this device is shown and performance metrics are listed. To their knowledge, this was the first demonstration of true illumination-quality light from an OLED. During the second year, effort concentrated on developing a scalable approach to large area devices. A novel device architecture was developed, consisting of dividing the device area into smaller elements that are monolithically connected in series.
In the course of this development, it was realized that, in addition to being scalable, this approach made the device tolerant to the most

  12. General aviation avionics equipment maintenance

    NASA Technical Reports Server (NTRS)

    Parker, C. D.; Tommerdahl, J. B.

    1978-01-01

    Maintenance of general aviation avionics equipment was investigated, with emphasis on single-engine and light twin-engine general aviation aircraft. Factors considered include the regulatory agencies, avionics manufacturers, avionics repair stations, the statistical character of the general aviation community, and owners and operators. The maintenance environment, performance, repair costs, and reliability of avionics were defined. It is concluded that a significant economic stratification is reflected in the maintenance problems encountered, that careful attention to installations and use practices can have a very positive impact on maintenance problems, and that new technologies and general growth in general aviation will impact maintenance.

  13. Telephone triage in general practices: A written case scenario study in the Netherlands

    PubMed Central

    Smits, Marleen; Hanssen, Suzan; Huibers, Linda; Giesen, Paul

    2016-01-01

    Objective: General practices increasingly use telephone triage to manage patient flows. During triage, the urgency of the call and the required type of care are determined. This study examined the organization and adequacy of telephone triage in general practices in the Netherlands. Design: Cross-sectional observational study using a web-based survey among practice assistants, including questions on background characteristics and triage organization. Furthermore, practice assistants were asked to assess the required type of care for written case scenarios with varying health problems and levels of urgency. To determine the adequacy of the assessments, a comparison with a reference standard was made. In addition, the association between background characteristics and triage organization and the adequacy of triage was examined. Setting: Daytime general practices. Subjects: Practice assistants. Main outcome measures: Over- and under-estimation, sensitivity, specificity. Results: The response rate was 41.1% (n = 973). The required care was assessed adequately in 63.6% of cases, was over-estimated in 19.3%, and under-estimated in 17.1%. The sensitivity of identifying patients with a highly urgent problem was 76.7% and the specificity was 94.0%. The adequacy of the assessments of the required care was higher for more experienced assistants and for assistants with fixed daily work meetings with the GP. Triage training, use of a triage tool, and authorization of advice were not associated with adequacy of triage. Conclusion: Triage by practice assistants in general practices is efficient (high specificity) but potentially unsafe in highly urgent cases (suboptimal sensitivity). It is important to train practice assistants in identifying highly urgent cases. Key points: General practices increasingly use telephone triage to manage patient flows, but little is known about the organization and adequacy of triage in daytime practices. Telephone triage by general practice assistants is
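The sensitivity and specificity figures quoted above follow from a standard confusion-matrix calculation. The sketch below uses hypothetical counts chosen only to reproduce the reported rates; they are not the study's raw data.

```python
# Sensitivity and specificity from confusion-matrix counts, as used to
# evaluate triage of highly urgent cases. The counts below are made up
# for illustration; they are NOT the study's raw data.

def sensitivity(tp, fn):
    """Fraction of truly urgent cases the triage identified."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of non-urgent cases correctly not flagged as urgent."""
    return tn / (tn + fp)

# Hypothetical example: 23 of 30 urgent cases caught, 94 of 100
# non-urgent cases correctly handled.
assert round(100 * sensitivity(tp=23, fn=7), 1) == 76.7
assert round(100 * specificity(tn=94, fp=6), 1) == 94.0
```

High specificity with suboptimal sensitivity, as in the study, means few false alarms but a meaningful fraction of urgent cases missed.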

  14. Deeply Virtual Compton Scattering on nucleons and nuclei in generalized vector meson dominance model

    SciTech Connect

    Vadim Guzey; Klaus Goeke; Marat Siddikov

    2008-02-01

    We consider Deeply Virtual Compton Scattering (DVCS) on nucleons and nuclei in the framework of generalized vector meson dominance (GVMD) model. We demonstrate that the GVMD model provides a good description of the HERA data on the dependence of the proton DVCS cross section on $Q^2$, $W$ (at $Q^2=4$ GeV$^2$) and $t$. At $Q^2 = 8$ GeV$^2$, the soft $W$-behavior of the GVMD model somewhat underestimates the $W$-dependence of the DVCS cross section due to the hard contribution not present in the GVMD model. We estimate $1/Q^2$ power-suppressed corrections to the DVCS amplitude and the DVCS cross section and find them large. We also make predictions for the nuclear DVCS amplitude and cross section in the kinematics of the future Electron-Ion Collider. We predict significant nuclear shadowing, which matches well predictions of the leading-twist nuclear shadowing in DIS on nuclei.

  15. Conditioned Fear Acquisition and Generalization in Generalized Anxiety Disorder.

    PubMed

    Tinoco-González, Daniella; Fullana, Miquel Angel; Torrents-Rodas, David; Bonillo, Albert; Vervliet, Bram; Blasco, María Jesús; Farré, Magí; Torrubia, Rafael

    2015-09-01

    Abnormal fear conditioning processes (including fear acquisition and conditioned fear-generalization) have been implicated in the pathogenesis of anxiety disorders. Previous research has shown that individuals with panic disorder present enhanced conditioned fear-generalization in comparison to healthy controls. Enhanced conditioned fear-generalization could also characterize generalized anxiety disorder (GAD), but research so far is inconclusive. An important confounding factor in previous research is comorbidity. The present study examined conditioned fear-acquisition and fear-generalization in 28 patients with GAD and 30 healthy controls using a recently developed fear acquisition and generalization paradigm assessing fear-potentiated startle and online expectancies of the unconditioned stimulus. Analyses focused on GAD patients without comorbidity but also included patients with comorbid anxiety disorders. Patients and controls did not differ with regard to fear acquisition. However, contrary to our hypothesis, the two groups also did not differ on most indexes of conditioned fear-generalization. Moreover, dimensional measures of GAD symptoms were not correlated with conditioned fear-generalization indexes. Comorbidity did not have a significant impact on the results. Our data suggest that conditioned fear-generalization is not enhanced in GAD. Results are discussed with special attention to the possible effects of comorbidity on fear learning abnormalities.

  16. Generalized Hopf bifurcation and its dual generalized homoclinic bifurcation

    NASA Astrophysics Data System (ADS)

    Joyal, Pierre

    1988-06-01

    The duality between the generalized Hopf bifurcation (GHB) and the generalized homoclinic bifurcation has been demonstrated using the Poincare-Andronov theorem of Arnold (1983). In order to study the correspondence map F, two normal forms are given which are dual to the normal forms of the GHB and which provide standard algorithms for calculating various quantities characterizing the generalized homoclinic bifurcation. The duality is shown using the Poincare normal forms at a weak focus and at a weak saddle, and using the bifurcation diagrams of the GHB and the generalized homoclinic bifurcation.

  17. High-Southern Latitudes Sulfur Cycle in an Atmospheric General Circulation Model

    NASA Astrophysics Data System (ADS)

    Cosme, E.; Genthon, C.; Martinerie, P.; Boucher, O.; Pham, M.

    2002-05-01

    This modeling study (Cosme et al., Sulfur cycle in the high southern latitudes in the LMD-ZT General Circulation Model, submitted to JGR) was motivated by the recent publication of annual time-scale records of dimethylsulfide (DMS) and dimethylsulfoxide (DMSO) in Antarctica, completing the available series of sulfate and methanesulfonic acid (MSA). Sulfur chemistry has been incorporated in the Laboratoire de Météorologie Dynamique Atmospheric General Circulation Model (AGCM), LMD-ZT, with high resolution and improved physics in the high-southern latitudes. The model predicts the concentration of 6 major sulfur species through emissions, transport, wet and dry deposition and chemistry in both gaseous and aqueous phases. Model results are broadly realistic when compared with measurements in air and snow or ice, and with results of other modeling studies, at high- and mid-southern latitudes. Although not corrected in this work, defects are identified and discussed: Atmospheric MSA concentrations are underestimated and DMSO concentrations are overestimated in summer, reflecting the lack of a DMSO sink leading to MSA; the deposition scheme used in the model may not be adapted to polar regions; DMS concentrations are underestimated in winter, and the model does not adequately reproduce interannual variability. Oceanic DMS sources appear decisive for the description of the sulfur cycle in these regions. The model suggests that ground atmospheric DMS concentrations are higher in winter than in summer, in a large part of central Antarctica. In the high-southern latitudes, high loads of DMS and DMSO are found and the main chemical sink of sulfur dioxide (SO2) is aqueous oxidation by ozone (O3), whereas oxidation by hydrogen peroxide (H2O2) dominates at the global scale.

  18. Snow cover and snow mass intercomparisons of general circulation models and remotely sensed datasets

    SciTech Connect

    Foster, J.; Liston, G.; Koster, R.

    1996-02-01

    Confirmation of the ability of general circulation models (GCMs) to accurately represent snow cover and snow mass distributions is vital for climate studies. There must be a high degree of confidence that what is being predicted by the models is reliable. In this study, snow output from seven GCMs and passive-microwave snow data derived from the Nimbus-7 Scanning Multichannel Microwave Radiometer (SMMR) are intercompared. National Oceanic and Atmospheric Administration satellite data are used as the standard of reference for snow extent observations and the U.S. Air Force snow depth climatology is used as the standard for snow mass. The reliability of the SMMR snow data needs to be verified as well. Data for both North America and Eurasia are examined in an effort to assess the magnitude of spatial and temporal variations that exist between the standards of reference, the models, and the passive microwave data. Results indicate that both the models and SMMR represent seasonal and year-to-year snow distributions fairly well. The passive microwave data and several of the models, however, consistently underestimate snow mass, while other models overestimate the mass of snow on the ground. The models simulate winter and summer snow conditions better than those in the transition months. In general, the underestimation by SMMR is caused by absorption of microwave energy by vegetation. For the GCMs, differences from observed snow conditions can be ascribed to inaccuracies in simulating surface air temperatures and precipitation fields, especially during the spring and fall. 34 refs., 18 figs.

  19. Generalized Kähler Geometry

    NASA Astrophysics Data System (ADS)

    Gualtieri, Marco

    2014-10-01

    Generalized Kähler geometry is the natural analogue of Kähler geometry, in the context of generalized complex geometry. Just as we may require a complex structure to be compatible with a Riemannian metric in a way which gives rise to a symplectic form, we may require a generalized complex structure to be compatible with a metric so that it defines a second generalized complex structure. We prove that generalized Kähler geometry is equivalent to the bi-Hermitian geometry on the target of a 2-dimensional sigma model with (2, 2) supersymmetry. We also prove the existence of natural holomorphic Courant algebroids for each of the underlying complex structures, and that these split into a sum of transverse holomorphic Dirac structures. Finally, we explore the analogy between pre-quantum line bundles and gerbes in the context of generalized Kähler geometry.

  20. Generalized series of Bessel functions

    NASA Astrophysics Data System (ADS)

    Al-Jarrah, A.; Dempsey, K. M.; Glasser, M. L.

    2002-06-01

    Known series of Bessel functions, currently available in handbooks, and many of Neumann type, are generalized to arbitrary order. The underlying result is a Poisson formula due to Titchmarsh. This formula gives rise to a Neumann series involving modified Bessel functions of integral order. The latter is the basis of many of the generalized series that follow. Included are examples of generalized trigonometric identities. The paper concludes by indicating the wide range of results that can be obtained.

  1. Generalized synchronization via nonlinear control.

    PubMed

    Juan, Meng; Xingyuan, Wang

    2008-06-01

    In this paper, the generalized synchronization problem of drive-response systems is investigated. Using the drive-response concept and the nonlinear control theory, a control law is designed to achieve the generalized synchronization of chaotic systems. Based on the Lyapunov stability theory, a generalized synchronization condition is derived. Theoretical analyses and numerical simulations further demonstrate the feasibility and effectiveness of the proposed technique.

  2. Generalized Heisenberg algebras and k-generalized Fibonacci numbers

    NASA Astrophysics Data System (ADS)

    Schork, Matthias

    2007-04-01

    It is shown how some of the recent results of de Souza et al (2006 J. Phys. A: Math. Gen. 39 10415) can be generalized to describe Hamiltonians whose eigenvalues are given as k-generalized Fibonacci numbers. Here k is an arbitrary integer and the cases considered by de Souza et al correspond to k = 2.
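The k-generalized Fibonacci numbers discussed in this abstract can be computed directly: each term is the sum of the preceding k terms, reducing to the ordinary Fibonacci numbers for k = 2. A minimal sketch (the function name and the zero-padded seeding convention are illustrative choices, not taken from the paper):

```python
def k_fibonacci(k, n):
    """Return the first n terms of the k-generalized Fibonacci sequence.

    Each term is the sum of the preceding k terms; the sequence is
    seeded with k-1 zeros followed by a single 1 (a common convention).
    For k = 2 this reduces to the ordinary Fibonacci numbers.
    """
    seq = [0] * (k - 1) + [1]
    while len(seq) < n:
        seq.append(sum(seq[-k:]))
    return seq[:n]

print(k_fibonacci(2, 8))  # ordinary Fibonacci: [0, 1, 1, 2, 3, 5, 8, 13]
print(k_fibonacci(3, 8))  # tribonacci (k = 3): [0, 0, 1, 1, 2, 4, 7, 13]
```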

  3. Technical highlights in general aviation

    NASA Technical Reports Server (NTRS)

    Stickle, J. W.

    1977-01-01

    Improvements in performance, safety, efficiency, and emissions control in general aviation craft are reviewed. While change is slow, the U.S. industries still account for the bulk (90%) of the world's general aviation fleet. Advances in general aviation aerodynamics, structures and materials, acoustics, avionics, and propulsion are described. Supercritical airfoils, drag reduction design, stall/spin studies, crashworthiness and passenger safety, fiberglass materials, flight noise abatement, interior noise and vibration reduction, navigation systems, quieter and cleaner (reciprocating, turboprop, turbofan) engines, and possible benefits of the Global Position Satellite System to general aviation navigation are covered in the discussion. Some of the developments are illustrated.

  4. Modelling of filariasis in East Java with Poisson regression and generalized Poisson regression models

    NASA Astrophysics Data System (ADS)

    Darnah

    2016-04-01

    Poisson regression is used when the response variable is count data following the Poisson distribution. The Poisson distribution assumes equal dispersion, i.e. variance equal to the mean. In practice, count data are often over-dispersed or under-dispersed, making Poisson regression inappropriate: it may underestimate the standard errors and overstate the significance of the regression parameters, and consequently give misleading inferences about them. This paper suggests the generalized Poisson regression model to handle over-dispersion and under-dispersion in the Poisson regression model. Both the Poisson regression model and the generalized Poisson regression model are applied to the number of filariasis cases in East Java. Based on the Poisson regression model, the factors influencing filariasis are the percentage of families who do not practice clean and healthy living and the percentage of families who do not have a healthy house. The Poisson regression model exhibits over-dispersion, so we use generalized Poisson regression. The best generalized Poisson regression model shows that the factor influencing filariasis is the percentage of families who do not have a healthy house: each additional 1 percentage point of such families adds one filariasis patient.
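The over-dispersion that motivates generalized Poisson regression can be screened for with the sample variance-to-mean ratio of the counts. A rough illustration, not code from the paper, using hypothetical count data (the function name and the numbers are invented for the example):

```python
from statistics import mean, variance

def dispersion_ratio(counts):
    """Sample variance-to-mean ratio of count data.

    Under an ideal Poisson model the ratio is close to 1; values well
    above 1 suggest over-dispersion (and well below 1 under-dispersion),
    in which case a generalized Poisson model is a candidate.
    """
    return variance(counts) / mean(counts)

# Hypothetical district-level case counts; the ratio far exceeds 1,
# signalling over-dispersion.
cases = [0, 0, 1, 1, 2, 3, 8, 12]
print(round(dispersion_ratio(cases), 2))  # prints 5.58
```

In practice one would follow such a screen with a formal fit of both models and compare them, e.g. by a likelihood-ratio test or AIC.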

  5. Generally Covariant Hamiltonian Approach to the Generalized Harmonic Formulation of General Relativity

    NASA Astrophysics Data System (ADS)

    Cao, Meng

    The goal of this dissertation is to develop a generally covariant Hamiltonian approach to the generalized harmonic formulation of general relativity. As preliminary investigations, an important class of coordinate transformations in the context of the 3 + 1 decomposition, foliation preserving transformations, is defined; transformation rules of various 3 + 1 decomposition variables under this change of coordinates are investigated; the notion of covariant time derivative under foliation preserving transformations is defined; gauge conditions of various numerical relativity formulations are rewritten in generally covariant form. The Hamiltonian formulation of the generalized harmonic system is defined in the latter part of this dissertation. With the knowledge of covariant time derivative, the Hamiltonian formulation is extended to achieve general covariance. The Hamiltonian formulation is further proved to be symmetric hyperbolic.

  6. The Construct of General Intelligence.

    ERIC Educational Resources Information Center

    Humphreys, Lloyd G.

    1979-01-01

    The construct of general intelligence is discussed in the context of factor models, differential validity of tests, Piagetian tasks, heritability, social class, and race. The general factor is an abstraction resulting from genes, environmental pressures, and neural structures involved in cognitive or intellectual human behavior. (Author/RD)

  7. Personality, Intelligence and General Knowledge

    ERIC Educational Resources Information Center

    Furnham, Adrian; Chamorro-Premuzic, Tomas

    2006-01-01

    Three studies, all on student populations, looked at the relationship between a recently psychometrised measure of General Knowledge [Irwing, P., Cammock, T., & Lynn, R. (2001). Some evidence for the existence of a general factor of semantic memory and its components. "Personality and Individual Differences," 30, 857-871], both long…

  8. Using "Informances" in General Music

    ERIC Educational Resources Information Center

    Zaffini, Erin Dineen

    2015-01-01

    General music teachers might be faced with formal performance expectations placed on them by administrators. While performance expectations are common in performance-based ensembles, meeting these expectations can be cumbersome and daunting to those who serve in the realm of general music. Informances--informative presentations of student learning…

  9. Integrability and generalized monodromy matrix

    SciTech Connect

    Lhallabi, T.; Moujib, A.

    2007-09-15

    We construct the generalized monodromy matrix M-circumflex({omega}) of two-dimensional string effective action by introducing the T-duality group properties. The integrability conditions with general solutions depending on spectral parameter are given. This construction is investigated for the exactly solvable Wess-Zumino-Novikov-Witten (WZNW) model in the pp-wave limit when B=0.

  10. Student Views of General Education.

    ERIC Educational Resources Information Center

    Gaff, Jerry G.; Davis, Michael L.

    1981-01-01

    A student survey conducted at 10 colleges by the Project on General Education Models (GEM) demonstrates that students do value a broad general education, especially if that goal is seen in relation to other goals of specialized knowledge, self-knowledge, and preparation for a career. (Author/MLW)

  11. Sampling Assumptions in Inductive Generalization

    ERIC Educational Resources Information Center

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  12. Freshman General Studies Thematic. 1973-.

    ERIC Educational Resources Information Center

    California State Univ., Chico.

    The Freshman General Studies Thematic Program (GST) at California State University, Chico was established in 1973 to create a general education program for freshmen and to give faculty the opportunity to explore innovative teaching methods. What resulted was a 33-unit, year-long interdisciplinary course for 36 well-motivated, well-prepared…

  13. Alcohol-Sensitive Generalized Dystonia.

    PubMed

    Micheli, Federico; Uribe-Roca, Claudia; Saenz-Farret, Michel

    We report the case of a 29-year-old male patient with a generalized and progressive dystonia that left him unable to stand. Multiple antidystonic treatments were tried without benefit. An alcohol test was positive, with dramatic improvement. To the best of our knowledge, this is the first reported case of generalized dystonia, without other clinical manifestations, that is sensitive to alcohol.

  14. Generalizing Atoms in Constraint Logic

    NASA Technical Reports Server (NTRS)

    Page, C. David, Jr.; Frisch, Alan M.

    1991-01-01

    This paper studies the generalization of atomic formulas, or atoms, that are augmented with constraints on or among their terms. The atoms may also be viewed as definite clauses whose antecedents express the constraints. Atoms are generalized relative to a body of background information about the constraints. This paper first examines generalization of atoms with only monadic constraints. The paper develops an algorithm for the generalization task and discusses algorithm complexity. It then extends the algorithm to apply to atoms with constraints of arbitrary arity. The paper also presents semantic properties of the generalizations computed by the algorithms, making the algorithms applicable to such problems as abduction, induction, and knowledge base verification. The paper emphasizes the application to induction and presents a pac-learning result for constrained atoms.

  15. Generalized complex geometry, generalized branes and the Hitchin sigma model

    NASA Astrophysics Data System (ADS)

    Zucchini, Roberto

    2005-03-01

    Hitchin's generalized complex geometry has been shown to be relevant in compactifications of superstring theory with fluxes and is expected to lead to a deeper understanding of mirror symmetry. Gualtieri's notion of generalized complex submanifold seems to be a natural candidate for the description of branes in this context. Recently, we introduced a Batalin-Vilkovisky field theoretic realization of generalized complex geometry, the Hitchin sigma model, extending the well known Poisson sigma model. In this paper, exploiting Gualtieri's formalism, we incorporate branes into the model. A detailed study of the boundary conditions obeyed by the world sheet fields is provided. Finally, it is found that, when branes are present, the classical Batalin-Vilkovisky cohomology contains an extra sector that is related non trivially to a novel cohomology associated with the branes as generalized complex submanifolds.

  16. The evolution of general intelligence.

    PubMed

    Burkart, Judith M; Schubiger, Michèle N; van Schaik, Carel P

    2016-07-28

    The presence of general intelligence poses a major evolutionary puzzle, which has led to increased interest in its presence in nonhuman animals. The aim of this review is to critically evaluate this puzzle, and to explore the implications for current theories about the evolution of cognition. We first review domain-general and domain-specific accounts of human cognition in order to situate attempts to identify general intelligence in nonhuman animals. Recent studies are consistent with the presence of general intelligence in mammals (rodents and primates). However, the interpretation of a psychometric g-factor as general intelligence needs to be validated, in particular in primates, and we propose a range of such tests. We then evaluate the implications of general intelligence in nonhuman animals for current theories about its evolution and find support for the cultural intelligence approach, which stresses the critical importance of social inputs during the ontogenetic construction of survival-relevant skills. The presence of general intelligence in nonhumans implies that modular abilities can arise in two ways, primarily through automatic development with fixed content and secondarily through learning and automatization with more variable content. The currently best-supported model, for humans and nonhuman vertebrates alike, thus construes the mind as a mix of skills based on primary and secondary modules. The relative importance of these two components is expected to vary widely among species, and we formulate tests to quantify their strength.

  17. Louisiana Title V General Permits

    SciTech Connect

    Boyer, B.E.; Neal, T.L.

    1995-12-31

    Title V of the Federal Clean Air Act Amendments of 1990 requires federal operating permits for all major sources of air pollution. In 1992, Title 40, Part 70 of the Code of Federal Regulations (40 CFR Part 70) codified the law's requirements. These federal regulations, entitled Operating Permit Program, define the minimum requirements for state administered operating permit programs. The intent of Title V is to put into one document all requirements of an operating permit. General Permits for oil and gas facilities may be preferred if the facility can comply with all permit requirements. If greater flexibility than allowed by the General Permit is required, then the facility should apply for an individual Title V permit. General Permits are designed to streamline the permitting process and shorten the time it takes to obtain approval for initial and modified permits. The advantages of the General Permit include reduced paperwork and greater consistency because the permits are standardized. There should be less uncertainty because permit requirements will be known at the time of application. Approval times for initial and modified General Permits should be reduced. Lengthy public notice procedures (and possible hearings) will be required only for the initial approval of the General Permit and not for each applicant to the permit. A disadvantage of General Permits is reduced flexibility, since the facility must comply with the requirements of a standardized permit.

  18. General Aviation Task Force report

    NASA Technical Reports Server (NTRS)

    1993-01-01

    General aviation is officially defined as all aviation except scheduled airlines and the military. It is the only air transportation to many communities throughout the world. In order to reverse the recent decline in general aviation aircraft produced in the United States, the Task Force recommends that NASA provide the expertise and facilities such as wind tunnels and computer codes for aircraft design. General aviation manufacturers are receptive to NASA's innovations and technological leadership and are expected to be effective users of NASA-generated technologies.

  19. Generalized interaction-free evolutions

    NASA Astrophysics Data System (ADS)

    Militello, Benedetto; Chruściński, Dariusz; Messina, Antonino; Należyty, Paweł; Napoli, Anna

    2016-02-01

    A thorough analysis of the evolutions of bipartite systems characterized by the "effective absence" of interaction between the two subsystems is reported. First, the connection between the concepts underlying interaction-free evolutions (IFE) and decoherence-free subspaces (DFS) is explored, showing intricate relations between these concepts. Second, starting from this analysis and inspired by a generalization of DFS already known in the literature, we introduce the notion of generalized IFE (GIFE), also providing a useful characterization that allows one to develop a general scheme for finding GIFE states.

  20. Generalized Klein-Kramers equations.

    PubMed

    Fa, Kwok Sau

    2012-12-21

    A generalized Klein-Kramers equation for a particle interacting with an external field is proposed. The equation generalizes the fractional Klein-Kramers equation introduced by Barkai and Silbey [J. Phys. Chem. B 104, 3866 (2000)]. Besides, the generalized Klein-Kramers equation can also recover the integro-differential Klein-Kramers equation for continuous-time random walk; this means that it can describe the subdiffusive and superdiffusive regimes in the long-time limit. Moreover, analytic solutions for first two moments both in velocity and displacement (for force-free case) are obtained, and their dynamic behaviors are investigated.

  1. Shape Reconstruction from Generalized Projections

    NASA Astrophysics Data System (ADS)

    Viikinkoski, Matti

    2016-01-01

    In this thesis we develop methods for recovering the three-dimensional shape of an object from generalized projections. We particularly focus on the problems encountered when data are presented as discrete image fields. We demonstrate the usefulness of the Fourier transform in transferring the image data and shape model projections to a domain more suitable for gradient based optimization. To substantiate the general applicability of our methods to observational astronomy, we reconstruct shape models for several asteroids observed with adaptive optics, thermal infrared interferometry, or range-Doppler radar. The reconstructions are carried out with the ADAM software package that we have designed for general use.

  2. Computations of generalized Dolbeault cohomology

    SciTech Connect

    Cavalcanti, Gil R.

    2009-02-02

    We study the (generalized Dolbeault) cohomology of generalized complex manifolds in 4 real dimensions. We show that in 4 real dimensions, the first cohomology around a nondegenerate type change point is given by holomorphic (1,0) forms defined on the type change locus. We use this to compute the cohomology of a neighbourhood of a compact component of the type change locus as well as that of the blow-up of a type change point. Finally, we use these computations to determine the generalized cohomology of some concrete examples.

  3. Patch Test Negative Generalized Dermatitis.

    PubMed

    Spiker, Alison; Mowad, Christen

    2016-01-01

    Allergic contact dermatitis is a common condition in dermatology. Patch testing is the criterion standard for diagnosis. However, dermatitis is not always caused by an allergen, and patch testing does not identify a culprit in every patient. Generalized dermatitis, defined as eczematous dermatitis affecting greater than 3 body sites, is often encountered in dermatology practice, especially patch test referral centers. Management for patients with generalized dermatitis who are patch test negative is challenging. The purpose of this article is to outline an approach to this challenging scenario and summarize the paucity of existing literature on patch test negative generalized dermatitis.

  4. Cross-cultural comparisons of attitudes toward schizophrenia amongst the general population and physicians: a series of web-based surveys in Japan and the United States.

    PubMed

    Richards, Misty; Hori, Hiroaki; Sartorius, Norman; Kunugi, Hiroshi

    2014-02-28

    Cross-cultural differences in attitudes toward schizophrenia are suggested, while no studies have compared such attitudes between the United States and Japan. In our previous study in Japan (Hori et al., 2011), 197 subjects in the general population and 112 physicians (excluding psychiatrists) enrolled in a web-based survey using an Internet-based questionnaire format. Utilizing the identical web-based survey method in the United States, the present study enrolled 172 subjects in the general population and 45 physicians. Participants' attitudes toward schizophrenia were assessed with the English version of the 18-item questionnaire used in our previous Japanese survey. Using exploratory factor analysis, we identified four factors labeled "social distance," "belief of dangerousness," "underestimation of patients' abilities," and "skepticism regarding treatment." The two-way multivariate analysis of covariance on the four factors, with country and occupation as the between-subject factors and with potentially confounding demographic variables as the covariates, revealed that the general population in the US scored significantly lower than the Japanese counterparts on the factors "social distance" and "skepticism regarding treatment" and higher on "underestimation of patients' abilities." Our results suggest that culture may have an important role in shaping attitudes toward mental illness. Anti-stigma campaigns that target culture-specific biases are considered important.

  5. The Role of Spatial Disorientation in Fatal General Aviation Accidents

    NASA Technical Reports Server (NTRS)

    Scheuring, Richard

    2005-01-01

    In-flight Spatial Disorientation (SD) in pilots is a serious threat to aviation safety. Indeed, SD may play a much larger role in aviation accidents than the approximate 6-8% reported by the National Transportation Safety Board (NTSB) each year, because some accidents coded by the NTSB as aircraft control-not maintained (ACNM) may actually result from SD. The purpose of this study is to determine whether SD is underestimated as a cause of fatal general aviation (GA) accidents in the NTSB database. Fatal GA airplane accidents occurring between January 1995 and December 1999 were reviewed from the NTSB aviation accident database. Cases coded as ACNM or SD as the probable cause were selected for review by a panel of aerospace medicine specialists. Using a rating scale, each rater was instructed to determine if SD was the probable cause of the accident. Agreement between the raters and agreement between the raters and the NTSB were evaluated by Kappa statistics. The raters agreed that 11 out of 20 (55%) accidents coded by the NTSB as ACNM were probably caused by SD (p less than 0.05). Agreement between the raters and the NTSB did not reach significance (p greater than 0.05). The 95% C.I. for the sampling population estimated that between 33% and 77% of cases that the NTSB identified as ACNM could be identified by aerospace medicine experts as SD. Aerospace medicine specialists agreed that some cases coded by the NTSB as ACNM were probably caused by SD. Consequently, a larger number of accidents may be caused by the pilot succumbing to SD than indicated in the NTSB database. This new information should encourage regulating agencies to ensure that pilots receive SD recognition training, enabling them to take appropriate corrective actions during flight. This could lead to new training standards, ultimately saving lives among GA airplane pilots.
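The inter-rater agreement evaluated here uses Kappa statistics, which discount the agreement expected by chance. A minimal sketch of Cohen's kappa for two raters (the function name and the example labels are illustrative, not drawn from the study's data):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e the agreement expected by chance from each
    rater's marginal label frequencies. kappa = 1 means perfect
    agreement; kappa = 0 means agreement no better than chance.
    """
    n = len(ratings_a)
    labels = set(ratings_a) | set(ratings_b)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    p_e = sum(
        (ratings_a.count(lab) / n) * (ratings_b.count(lab) / n)
        for lab in labels
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical cause codes assigned by two raters.
rater1 = ["SD", "SD", "ACNM", "ACNM"]
rater2 = ["SD", "SD", "ACNM", "SD"]
print(cohens_kappa(rater1, rater2))  # prints 0.5
```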

  6. The generalized added mass revised

    NASA Astrophysics Data System (ADS)

    De Wilde, Juray

    2007-05-01

    The reformulation of the generalized or apparent added mass presented by De Wilde [Phys. Fluids 17, 113304 (2005)] neglects the presence of a drag-type force in the gas and solid phase momentum equations. Reformulating the generalized added mass accounting for the presence of a drag-type force, an apparent drag force appears next to the apparent distribution of the filtered gas phase pressure gradient over the phases already found by De Wilde in the above-cited reference. The reformulation of the generalized added mass and the evaluation of a linear wave propagation speed test then suggest a generalized added mass type closure approach to completely describe filtered gas-solid momentum transfer, that is, including both the filtered drag force and the correlation between the solid volume fraction and the gas phase pressure gradient.

  7. Problem Solving with General Semantics.

    ERIC Educational Resources Information Center

    Hewson, David

    1996-01-01

    Discusses how to use general semantics formulations to improve problem solving at home or at work--methods come from the areas of artificial intelligence/computer science, engineering, operations research, and psychology. (PA)

  8. Generalized teleportation and entanglement recycling.

    PubMed

    Strelchuk, Sergii; Horodecki, Michał; Oppenheim, Jonathan

    2013-01-04

    We introduce new teleportation protocols which are generalizations of the original teleportation protocols that use the Pauli group and the port-based teleportation protocols, introduced by Hiroshima and Ishizaka, that use the symmetric permutation group. We derive sufficient conditions for a set of operations, which in general need not form a group, to give rise to a teleportation protocol and provide examples of such schemes. This generalization leads to protocols with novel properties and is needed to push forward new schemes of computation based on them. Port-based teleportation protocols and our generalizations use a large resource state consisting of N singlets to teleport only a single qubit state reliably. We provide two distinct protocols which recycle the resource state to teleport multiple states with error linearly increasing with their number. The first protocol consists of sequentially teleporting qubit states, and the second teleports them in a bulk.

  9. General Information About Injection Wells

    EPA Pesticide Factsheets

    This webpage provides general background information on injection wells, which are used to place fluids in the subsurface. It also provides information on how the wells are used, their different categories, and how they are regulated. Information on the protection of underground sources of drinking water is also provided.

  10. Generalized Teleportation and Entanglement Recycling

    NASA Astrophysics Data System (ADS)

    Strelchuk, Sergii; Horodecki, Michał; Oppenheim, Jonathan

    2013-01-01

    We introduce new teleportation protocols which are generalizations of the original teleportation protocols that use the Pauli group and the port-based teleportation protocols, introduced by Hiroshima and Ishizaka, that use the symmetric permutation group. We derive sufficient conditions for a set of operations, which in general need not form a group, to give rise to a teleportation protocol and provide examples of such schemes. This generalization leads to protocols with novel properties and is needed to push forward new schemes of computation based on them. Port-based teleportation protocols and our generalizations use a large resource state consisting of N singlets to teleport only a single qubit state reliably. We provide two distinct protocols which recycle the resource state to teleport multiple states with error linearly increasing with their number. The first protocol consists of sequentially teleporting qubit states, and the second teleports them in a bulk.

  11. On superpotentials in general relativity

    NASA Astrophysics Data System (ADS)

    Stolín, Oldřich; Novotný, Jan

    2001-10-01

    It is shown that the Einstein-Freud, Landau-Lifshitz and Møller tetrad superpotentials represent special cases of a more general construction. The tetrad version of the Landau-Lifshitz superpotential is derived.

  12. The future of general medicine.

    PubMed

    Firth, John

    2014-08-01

    It is a truth universally acknowledged that there is a problem with general medicine. Physicians have become increasingly specialised over the past 30 years or so, and specialist care has produced increasingly better outcomes for some patients. The patients left behind are looked after by general medicine, where demand is increasing, operational priority within hospitals is low, there is little professional kudos and recruitment is suffering. Three recent reports - Hospitals on the Edge?, the Future Hospital Commission report, and the Shape of Training report - have described the problems, but not articulated compelling solutions. Here, I discuss what is good about general medicine, what is bad and make suggestions for improvement. These involve getting specialities to take responsibility for care of appropriate admissions automatically and without delay, giving general physicians control over the service that they provide, and using well-chosen financial drivers to support movement in the right direction.

  13. A generalized nonlocal vector calculus

    NASA Astrophysics Data System (ADS)

    Alali, Bacim; Liu, Kuo; Gunzburger, Max

    2015-10-01

    A nonlocal vector calculus was introduced in Du et al. (Math Model Meth Appl Sci 23:493-540, 2013) that has proved useful for the analysis of the peridynamics model of nonlocal mechanics and nonlocal diffusion models. A formulation is developed that provides a more general setting for the nonlocal vector calculus that is independent of particular nonlocal models. It is shown that general nonlocal calculus operators are integral operators with specific integral kernels. General nonlocal calculus properties are developed, including a nonlocal integration-by-parts formula and Green's identities. The nonlocal vector calculus introduced in Du et al. (Math Model Meth Appl Sci 23:493-540, 2013) is shown to be recoverable from the general formulation as a special example. This special nonlocal vector calculus is used to reformulate the peridynamics equation of motion in terms of the nonlocal gradient operator and its adjoint. A new example of nonlocal vector calculus operators is introduced, which shows the potential use of the general formulation for general nonlocal models.

  14. Testing approaches for overdispersion in poisson regression versus the generalized poisson model.

    PubMed

    Yang, Zhao; Hardin, James W; Addy, Cheryl L; Vuong, Quang H

    2007-08-01

    Overdispersion is a common phenomenon in Poisson modeling, and the negative binomial (NB) model is frequently used to account for overdispersion. Testing approaches (Wald test, likelihood ratio test (LRT), and score test) for overdispersion in the Poisson regression versus the NB model are available. Because the generalized Poisson (GP) model is similar to the NB model, we consider the former as an alternate model for overdispersed count data. The score test has an advantage over the LRT and the Wald test in that the score test only requires that the parameter of interest be estimated under the null hypothesis. This paper proposes a score test for overdispersion based on the GP model and compares the power of the test with the LRT and Wald tests. A simulation study indicates that the score test based on the asymptotic standard normal distribution is more appropriate in practical application because of its higher empirical power; however, it underestimates the nominal significance level, especially in small samples. Examples illustrate the results of comparing the candidate tests between the Poisson and GP models. A bootstrap test is also proposed to adjust for the underestimation of the nominal level in the score statistic when the sample size is small. The simulation study indicates that the bootstrap test has a significance level closer to the nominal size and uniformly greater power than the score test based on the asymptotic standard normal distribution. From a practical perspective, we suggest that if the score test gives even a weak indication that the Poisson model is inappropriate, say at the 0.10 significance level, the more accurate bootstrap procedure is a better test for deciding whether the GP model is more appropriate than the Poisson model. Finally, the Vuong test is illustrated as a way to choose between GP and NB2 models for the same dataset.
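
    The logic of a score test for overdispersion, and of the bootstrap calibration the abstract recommends for small samples, can be sketched as follows. This uses the classic Dean-Lawless-style statistic for an intercept-only Poisson model (the NB direction), not the GP-based statistic of the paper, and the data are simulated for illustration.

```python
# Minimal sketch, assuming an intercept-only model: a Dean-Lawless-type
# score statistic for overdispersion (approximately N(0,1) under the
# Poisson null), plus a parametric-bootstrap p-value as the abstract
# suggests for small samples. Not the paper's GP-based statistic.
import numpy as np

def overdispersion_score(y):
    """Score-type statistic T; approximately N(0,1) under the Poisson null."""
    mu = y.mean()                        # MLE of the Poisson mean
    num = np.sum((y - mu) ** 2 - y)      # excess of variance over mean
    return num / np.sqrt(2 * len(y) * mu ** 2)

def bootstrap_pvalue(y, n_boot=500, rng=None):
    """One-sided p-value: simulate the Poisson null and compare T."""
    if rng is None:
        rng = np.random.default_rng(0)
    t_obs = overdispersion_score(y)
    t_null = np.array([overdispersion_score(rng.poisson(y.mean(), len(y)))
                       for _ in range(n_boot)])
    return np.mean(t_null >= t_obs)

rng = np.random.default_rng(42)
y_pois = rng.poisson(4.0, 300)                     # equidispersed counts
lam = rng.gamma(shape=2.0, scale=2.0, size=300)    # mean 4, extra variation
y_nb = rng.poisson(lam)                            # overdispersed (NB) counts
print(overdispersion_score(y_pois))   # near 0 under the Poisson null
print(overdispersion_score(y_nb))     # large and positive under overdispersion
```

    Replacing the asymptotic N(0,1) reference with the simulated null distribution is what keeps the test's level close to nominal when n is small.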

  15. Future climate of the Caribbean from a super-high-resolution atmospheric general circulation model

    NASA Astrophysics Data System (ADS)

    Hall, Trevor C.; Sealy, Andrea M.; Stephenson, Tannecia S.; Kusunoki, Shoji; Taylor, Michael A.; Chen, A. Anthony; Kitoh, Akio

    2013-07-01

    Present-day (1979-2003) and future (2075-2099) simulations of mean and extreme rainfall and temperature are examined using data from the Meteorological Research Institute super-high-resolution atmospheric general circulation model. Analyses are performed over the 20-km model grid for (1) a main Caribbean basin, (2) sub-regional zones, and (3) specific Caribbean islands. Though the model's topography underestimates heights over the eastern Caribbean, it captures well the present-day spatial and temporal variations of seasonal and annual climates. Temperature underestimations range from 0.1 °C to 2 °C with respect to the Japanese Reanalysis and the Climatic Research Unit datasets. The model also captures fairly well sub-regional scale variations in the rainfall climatology. End-of-century projections under the Intergovernmental Panel on Climate Change SRES A1B scenario indicate declines in rainfall amounts by 10-20 % for most of the Caribbean during the early (May-July) and late (August-October) rainy seasons relative to the 1979-2003 baselines. The early dry season (November-January) is also projected to get wetter in the far north and south Caribbean by approximately 10 %. The model also projects a warming of 2-3 °C over the Caribbean region. Analyses of future climate extremes indicate a 5-10 % decrease in the simple daily precipitation intensity but no significant change in the number of consecutive dry days for Cuba, Jamaica, southern Bahamas, and Haiti. There is also indication that the number of hot days and nights will significantly increase over the main Caribbean basin.

  16. Power and sample size calculations for generalized regression models with covariate measurement error.

    PubMed

    Tosteson, Tor D; Buzas, Jeffrey S; Demidenko, Eugene; Karagas, Margaret

    2003-04-15

    Covariate measurement error is often a feature of scientific data used for regression modelling. The consequences of such errors include a loss of power of tests of significance for the regression parameters corresponding to the true covariates. Power and sample size calculations that ignore covariate measurement error tend to overestimate power and underestimate the actual sample size required to achieve a desired power. In this paper we derive a novel measurement error corrected power function for generalized linear models using a generalized score test based on quasi-likelihood methods. Our power function is flexible in that it is adaptable to designs with a discrete or continuous scalar covariate (exposure) that can be measured with or without error, allows for additional confounding variables and applies to a broad class of generalized regression and measurement error models. A program is described that provides sample size or power for a continuous exposure with a normal measurement error model and a single normal confounder variable in logistic regression. We demonstrate the improved properties of our power calculations with simulations and numerical studies. An example is given from an ongoing study of cancer and exposure to arsenic as measured by toenail concentrations and tap water samples.
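
    The core phenomenon the abstract addresses can be illustrated numerically. Under a classical error model W = X + U, the observed slope is attenuated by the reliability ratio lam = var(X)/(var(X)+var(U)), and the sample size needed to detect the association inflates by roughly 1/lam. This is a textbook sketch under simplifying assumptions (simple linear regression, unit residual variance), not the paper's quasi-likelihood score-test method.

```python
# Hypothetical numeric sketch (not the paper's method): naive vs.
# measurement-error-corrected sample size for detecting a slope.
import math

def required_n(beta, sigma_x):
    """Two-sided Wald-test sample size for a slope in simple linear
    regression with unit residual variance, alpha=0.05, power=0.80."""
    z_a = 1.959963984540054    # z_{0.975}
    z_b = 0.8416212335729143   # z_{0.80}
    return (z_a + z_b) ** 2 / (beta * sigma_x) ** 2

beta, sigma_x, sigma_u = 0.5, 1.0, 0.5
lam = sigma_x**2 / (sigma_x**2 + sigma_u**2)       # reliability ratio = 0.8

# Naive calculation ignores the error; the corrected one uses the
# attenuated slope beta*lam and the error-inflated predictor variance.
n_naive = required_n(beta, sigma_x)
n_corrected = required_n(beta * lam, math.sqrt(sigma_x**2 + sigma_u**2))
print(round(n_naive), round(n_corrected))   # corrected n is larger by 1/lam
```

    With reliability 0.8, the naive calculation understates the required sample size by 25%, matching the abstract's point that ignoring covariate error overestimates power.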

  17. Teachers' Understanding of Algebraic Generalization

    NASA Astrophysics Data System (ADS)

    Hawthorne, Casey Wayne

    Generalization has been identified as a cornerstone of algebraic thinking (e.g., Lee, 1996; Sfard, 1995) and is at the center of a rich conceptualization of K-8 algebra (Kaput, 2008; Smith, 2003). Moreover, mathematics teachers are being encouraged to use figural-pattern generalizing tasks as a basis of student-centered instruction, whereby teachers respond to and build upon the ideas that arise from students' explorations of these activities. Although more and more teachers are engaging their students in such generalizing tasks, little is known about teachers' understanding of generalization and their understanding of students' mathematical thinking in this domain. In this work, I addressed this gap, exploring the understanding of algebraic generalization of 4 exemplary 8th-grade teachers from multiple perspectives. A significant feature of this investigation is an examination of teachers' understanding of the generalization process, including the use of algebraic symbols. The research consisted of two phases. Phase I was an examination of the teachers' understandings of the underlying quantities and quantitative relationships represented by algebraic notation. In Phase II, I observed the instruction of 2 of these teachers. Using the lens of professional noticing of students' mathematical thinking, I explored the teachers' enacted knowledge of algebraic generalization, characterizing how it supported them to effectively respond to the needs and queries of their students. Results indicated that teachers predominantly see these figural patterns as enrichment activities, disconnected from course content. Furthermore, in my analysis, I identified conceptual difficulties teachers experienced when solving generalization tasks, in particular, connecting multiple symbolic representations with the quantities in the figures. 
Moreover, while the teachers strived to overcome the challenges of connecting different representations, they invoked both productive and unproductive

  18. [General adverse reactions to contrast agents. Classification and general concepts].

    PubMed

    Aguilar García, J J; Parada Blázquez, M J; Vargas Serrano, B; Rodríguez Romero, R

    2014-06-01

    General adverse reactions to intravenous contrast agents are uncommon, although relevant due to the growing number of radiologic tests that use iodinated or gadolinium-based contrast agents. Although most of these reactions are mild, some patients can experience significant reactions that radiologists should know how to prevent and treat.

  19. SOFIA general investigator science program

    NASA Astrophysics Data System (ADS)

    Young, Erick T.; Andersson, B.-G.; Becklin, Eric E.; Reach, William T.; Sankrit, Ravi; Zinnecker, Hans; Krabbe, Alfred

    2014-07-01

    SOFIA is a joint project between NASA and DLR, the German Aerospace Center, to provide the worldwide astronomical community with an observatory that offers unique capabilities from visible to far-infrared wavelengths. SOFIA consists of a 2.7-m telescope mounted in a highly modified Boeing 747-SP aircraft, a suite of instruments, and the scientific and operational infrastructure to support the observing program. This paper describes the current status of the observatory and details the General Investigator program. The observatory has recently completed major development activities, and it has transitioned into full operational status. Under the General Investigator program, astronomers submit proposals that are peer reviewed for observation on the facility. We describe the results from the first two cycles of the General Investigator program. We also describe some of the new observational capabilities that will be available for Cycle 3, which will begin in 2015.

  20. Generalization of the Euler Angles

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H. (Technical Monitor); Shuster, Malcolm D.; Markley, F. Landis

    2002-01-01

    It is shown that the Euler angles can be generalized to axes other than members of an orthonormal triad. As first shown by Davenport, the three generalized Euler axes, hereafter: Davenport axes, must still satisfy the constraint that the first two and the last two axes be mutually perpendicular if these axes are to define a universal set of attitude parameters. Expressions are given which relate the generalized Euler angles, hereafter: Davenport angles, to the 3-1-3 Euler angles of an associated direction-cosine matrix. The computation of the Davenport angles from the attitude matrix and their kinematic equation are presented. The present work offers a more direct development of the Davenport angles than Davenport's original publication and offers additional results.

  1. HIV and General Cardiovascular Risk

    PubMed Central

    Capili, Bernadette; Anastasi, Joyce K.; Ogedegbe, Olugbenga

    2011-01-01

    The incidence of cardiovascular disease (CVD) is increasing in HIV-infected people. Risk factors such as hyperlipidemia, impaired glucose tolerance, and insulin resistance have become common. CVD in HIV may also be related to non-traditional risk factors including accumulation of visceral fat, inflammation secondary to HIV, and effects of some antiretroviral drugs. This cross-sectional study described the CVD risk factors of 123 adults living with HIV and calculated the 10-year estimate for general cardiovascular risk score. Results showed that approximately 25% of the participants were considered to be at high risk for developing CVD in the next 10 years. Increased waist circumference and longer duration of smoking habit were associated with elevated general cardiovascular risk scores. Similar to the general population, most of the identified risks could be modified through lifestyle management. PMID:21277230

  2. Performance evaluation of generalized MSK

    NASA Astrophysics Data System (ADS)

    Galko, P.; Pasupathy, S.

    The computation of the performance of several optimal and suboptimal receivers for generalized MSK is discussed. The optimal receivers considered are Viterbi receivers and the optimal receivers based on a finite observation interval. Two suboptimal receivers, (1) optimized linear receivers based on finite observation intervals and (2) linear receivers based on approximating generalized MSK as an OQPSK signal, are considered as well. It is shown that the former receiver's performance may be computed exactly, while for the latter receiver it is possible to provide arbitrarily tight bounds on the performance. These analyses are illustrated by application to the two popular generalized MSK schemes of duobinary MSK (or FSOQ) and (1 + D)²/4 MSK.

  3. New roles for general practitioners.

    PubMed Central

    Handysides, S.

    1994-01-01

    General practice is likely to change greatly over the next few years. Increases in care in the community and day surgery will lead to more work, and the demand for better data on practice activity will mean the development of audit and epidemiological work. To make time general practitioners will have to learn to delegate work that does not require a doctor. Fundholding has already stimulated some practices to bring services to patients rather than send patients to hospital, and this trend seems set to continue. It is important to pool resources, not only within practices but among other practices in the area--joint action will increase the ability to improve the services for patients. If general practitioners take the opportunity to gain control of the changes the morale of the profession should improve. PMID:8136671

  4. CMAC with General Basis Functions.

    PubMed

    Lin, Chun-Shin; Chiang, Ching-Tsan

    1996-10-01

    The cerebellar model articulation controller (CMAC) is often used in learning control. It can be viewed as a basis function network (BFN). The conventional CMAC uses local constant basis functions. A disadvantage is that its output is constant within each quantized state and the derivative information is not preserved. If the constant basis functions are replaced by non-constant differentiable basis functions, the derivative information will be able to be stored into the structure as well. In this paper, the generalized scheme that uses general basis functions is investigated. The conventional CMAC is a special case of the generalized technique. The mathematical foundation for the modified scheme is derived and the convergence of learning is proved. Simulations for the CMAC with Gaussian basis functions (GBFs) are performed to demonstrate the improvement of accuracy in modeling, and the capability in providing derivative information. Copyright 1996 Elsevier Science Ltd
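
    The abstract's key point, that differentiable basis functions preserve derivative information that constant-basis CMAC discards, can be illustrated with a toy network of Gaussian basis functions trained by LMS to fit sin(x). This is an illustrative radial-basis sketch, not the CMAC table-addressing scheme itself, and all parameters below are arbitrary choices.

```python
# Toy sketch (not CMAC proper): 1-D Gaussian-basis network trained by LMS.
# Because the bases are differentiable, the model's derivative is
# available in closed form, unlike with constant basis functions.
import math

centers = [i * 0.5 for i in range(-1, 15)]   # Gaussian centers covering [0, 2*pi]
width = 0.5
w = [0.0] * len(centers)                     # trainable weights

def basis(x):
    return [math.exp(-((x - c) / width) ** 2) for c in centers]

def predict(x):
    return sum(wi * bi for wi, bi in zip(w, basis(x)))

def predict_deriv(x):
    """Closed-form derivative: differentiate each Gaussian basis."""
    return sum(wi * bi * (-2.0 * (x - c) / width ** 2)
               for wi, bi, c in zip(w, basis(x), centers))

# LMS training on samples of sin(x)
xs = [i * 2 * math.pi / 200 for i in range(201)]
for _ in range(200):
    for x in xs:
        err = math.sin(x) - predict(x)
        w = [wi + 0.05 * err * bi for wi, bi in zip(w, basis(x))]

print(abs(predict(1.0) - math.sin(1.0)))        # small fitting error
print(abs(predict_deriv(1.0) - math.cos(1.0)))  # derivative is also close
```

    With constant (rectangular) bases, predict_deriv would be zero almost everywhere; the Gaussian version recovers a usable derivative, which is the improvement the paper formalizes and proves convergent.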

  5. Generalized two-port elements

    NASA Astrophysics Data System (ADS)

    Tenreiro Machado, J.; Galhano, Alexandra M.

    2017-01-01

    The development of models constitutes a fundamental step in the study of natural and artificial systems. Present day science aims to address broader and more complex areas of application requiring, therefore, new concepts and models. This paper explores the concept of generalized two-port network by embedding the ideas of fractional calculus, memristor, transformer and gyrator. Each element represents separately one possible direction for generalizing the classical elements, but the cross-fertilization of the distinct topics has been overlooked. In this line of thought, the proposal of a novel element is a logical conjecture for obeying the symmetries that have been discovered in nature.

  6. Uniform acceleration in general relativity

    NASA Astrophysics Data System (ADS)

    Friedman, Yaakov; Scarr, Tzvi

    2015-10-01

    We extend de la Fuente and Romero's (Gen Relativ Gravit 47:33, 2015) defining equation for uniform acceleration in a general curved spacetime from linear acceleration to the full Lorentz covariant uniform acceleration. In a flat spacetime background, we have explicit solutions. We use generalized Fermi-Walker transport to parallel transport the Frenet basis along the trajectory. In flat spacetime, we obtain velocity and acceleration transformations from a uniformly accelerated system to an inertial system. We obtain the time dilation between accelerated clocks. We apply our acceleration transformations to the motion of a charged particle in a constant electromagnetic field and recover the Lorentz-Abraham-Dirac equation.

  7. General aviation IFR operational problems

    NASA Technical Reports Server (NTRS)

    Bolz, E. H.; Eisele, J. E.

    1979-01-01

    Operational problems of general aviation IFR operators (particularly single pilot operators) were studied. Several statistical bases were assembled and utilized to identify the more serious problems and to demonstrate their magnitude. These bases include official activity projections, historical accident data and delay data, among others. The GA operating environment and cockpit environment were analyzed in detail. Solutions proposed for each of the problem areas identified are based on direct consideration of currently planned enhancements to the ATC system, and on a realistic assessment of the present and future limitations of general aviation avionics. A coordinated set of research programs is suggested which would provide the developments necessary to implement the proposed solutions.

  8. Generalized Gradient Approximation Made Simple

    SciTech Connect

    Perdew, J.P.; Burke, K.; Ernzerhof, M.

    1996-10-01

    Generalized gradient approximations (GGA's) for the exchange-correlation energy improve upon the local spin density (LSD) description of atoms, molecules, and solids. We present a simple derivation of a simple GGA, in which all parameters (other than those in LSD) are fundamental constants. Only general features of the detailed construction underlying the Perdew-Wang 1991 (PW91) GGA are invoked. Improvements over PW91 include an accurate description of the linear response of the uniform electron gas, correct behavior under uniform scaling, and a smoother potential. © 1996 The American Physical Society.

  9. A generalization of Nekhoroshev's theorem

    NASA Astrophysics Data System (ADS)

    Bates, Larry; Cushman, Richard

    2016-11-01

    Nekhoroshev discovered a beautiful theorem in Hamiltonian systems that includes as special cases not only the Poincaré theorem on periodic orbits but also the theorem of Liouville-Arnol'd on completely integrable systems [7]. Sadly, his early death precluded him from publishing a full account of his proof. The aim of this paper is twofold: first, to provide a complete proof of his original theorem and, second, to generalize it to the noncommuting case. Our generalization of Nekhoroshev's theorem to the nonabelian case subsumes aspects of the theory of noncommutative complete integrability as found in Mishchenko and Fomenko [5] and is similar to what Nekhoroshev's theorem does in the abelian case.

  10. Quantization of general linear electrodynamics

    SciTech Connect

    Rivera, Sergio; Schuller, Frederic P.

    2011-03-15

    General linear electrodynamics allow for an arbitrary linear constitutive relation between the field strength 2-form and induction 2-form density if crucial hyperbolicity and energy conditions are satisfied, which render the theory predictive and physically interpretable. Taking into account the higher-order polynomial dispersion relation and associated causal structure of general linear electrodynamics, we carefully develop its Hamiltonian formulation from first principles. Canonical quantization of the resulting constrained system then results in a quantum vacuum which is sensitive to the constitutive tensor of the classical theory. As an application we calculate the Casimir effect in a birefringent linear optical medium.

  11. Quasilocal Hamiltonians in general relativity

    SciTech Connect

    Anderson, Michael T.

    2010-10-15

    We analyze the definition of quasilocal energy in general relativity based on a Hamiltonian analysis of the Einstein-Hilbert action initiated by Brown-York. The role of the constraint equations, in particular, the Hamiltonian constraint on the timelike boundary, neglected in previous studies, is emphasized here. We argue that a consistent definition of quasilocal energy in general relativity requires, at a minimum, a framework based on the (currently unknown) geometric well-posedness of the initial boundary value problem for the Einstein equations.

  12. Spinning fluids in general relativity

    NASA Technical Reports Server (NTRS)

    Ray, J. R.; Smalley, L. L.

    1982-01-01

    General relativity field equations are employed to examine a continuous medium with internal spin. A variational principle formerly applied in the special relativity case is extended to the general relativity case, using a tetrad to express the spin density and the four-velocity of the fluid. An energy-momentum tensor is subsequently defined for a spinning fluid. The equations of motion of the fluid are suggested to be useful in analytical studies of galaxies, for anisotropic Bianchi universes, and for turbulent eddies.

  13. The General English Proficiency Test

    ERIC Educational Resources Information Center

    Shih, Chih-Min

    2008-01-01

    Since 2000, the General English Proficiency Test, a newly developed test of English, has been phased in by the Language Training and Testing Center in Taiwan. It has become the most universally used test of English in Taiwan, a fact that can be evidenced by its aggregate number of registered test takers. This article first describes the…

  14. Information Society and General Education.

    ERIC Educational Resources Information Center

    Lariccia, Giovanni; Megarry, Jacquetta

    Prepared for a 1984 Organisation for Economic Cooperation and Development (OECD) conference, this report discusses the potential of the new information technologies for enhancing learning in school, vocational, and general education, and presents eight groups of case studies taken from member countries which illustrate what is currently being done…

  15. Archimedes' Principle in General Coordinates

    ERIC Educational Resources Information Center

    Ridgely, Charles T.

    2010-01-01

    Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is…

  16. Regularized Generalized Structured Component Analysis

    ERIC Educational Resources Information Center

    Hwang, Heungsun

    2009-01-01

    Generalized structured component analysis (GSCA) has been proposed as a component-based approach to structural equation modeling. In practice, GSCA may suffer from multi-collinearity, i.e., high correlations among exogenous variables. GSCA has yet no remedy for this problem. Thus, a regularized extension of GSCA is proposed that integrates a ridge…

  17. A General Introduction to Microcomputers.

    ERIC Educational Resources Information Center

    Muiznieks, Viktors

    This basic introduction to microcomputers provides the neophyte with the terminology, definitions, and concepts that explain the microcomputer and computing technology in general. Mathematical operations with binary numbers, computer storage, controlling logic, and the concepts of stack and interrupt are explained. (RAO)

  18. Assessing the General Education Elephant

    ERIC Educational Resources Information Center

    Eubanks, David A.

    2008-01-01

    Determining the success of a general education program can resemble the task of John Godfrey Saxe's six blind men in trying to acquaint themselves with an elephant. One man's approach is to feel the tusk and conclude that the elephant is a spear. Another approach leads to thinking that the broad side of the animal is a wall. The many ways of asking and…

  19. Homelessness: A General Information Packet.

    ERIC Educational Resources Information Center

    Homelessness Exchange, Washington, DC.

    This packet contains documents that provide general information about homelessness and the need for both Federal and local action to help the homeless people in America. Sections 1 and 2 contain the following articles released by the Homelessness Information Exchange: (1) "The Problem of Homelessness Nationwide"; and "Alternative Family Housing…

  20. Antidepressants in the general hospital.

    PubMed Central

    Gelenberg, A. J.

    1979-01-01

    An approach to the use of antidepressant medication in the general hospital is presented. The type of depression most likely to respond to chemotherapy is described, categories of available antidepressant agents are discussed, and relevant pharmacologic aspects are outlined. This paper suggests clinical guidelines for the use of these drugs, particularly in medical and surgical patients. PMID:455184

  1. Treatments for generalized anxiety disorder.

    PubMed

    Struzik, Lukasz; Vermani, Monica; Coonerty-Femiano, Aimee; Katzman, Martin A

    2004-03-01

    Generalized anxiety disorder is characterized by excessive chronic anxiety in association with many somatic symptoms. The disorder has pervasive effects on quality of life, including work, social and educational aspects, and requires long-term therapy. Available studies in patients meeting the criteria of the Diagnostic and Statistical Manual of Mental Disorders (third edition, revised, and fourth edition), which define generalized anxiety disorder, demonstrate the efficacy of benzodiazepines, azapirones, some antidepressants and psychotherapy. Benzodiazepines are effective anxiolytics for short-term use but are accompanied by many adverse events. The antidepressants paroxetine and venlafaxine (Efexor) have demonstrated efficacy in patients with generalized anxiety disorder with mild side-effect profiles. They have the additional benefit of efficacy in depression, which frequently occurs comorbidly in these patients. Long-term efficacy has been shown with venlafaxine in the treatment of this chronic condition, confirming that, as in depression, the goal must be not just remission beyond simple symptom resolution but also improved functioning and quality of life. Psychotherapy with applied relaxation, cognitive therapy and cognitive behavioral therapy shows the most promise in achieving and maintaining treatment gains in the long term. These approaches may be useful alone or in combination with adjunctive pharmacotherapy to achieve remission. Based on current evidence, the recommended approach to achieving long-term benefits for patients with generalized anxiety disorder is antidepressant therapy with paroxetine or venlafaxine in combination with cognitive behavioral therapy.

  2. General Relativity: Geometry Meets Physics

    ERIC Educational Resources Information Center

    Thomsen, Dietrick E.

    1975-01-01

    Observing the relationship of general relativity and the geometry of space-time, the author questions whether the rest of physics has geometrical explanations. As a partial answer he discusses current research on subatomic particles employing geometric transformations, and cites the existence of geometrical definitions of physical quantities such…

  3. General B factory design considerations

    SciTech Connect

    Zisman, M.S.

    1992-12-01

    We describe the general considerations that go into the design of an asymmetric B factory collider. Justification is given for the typical parameters of such a facility, and the physics and technology challenges that arise from these parameter choices are discussed. Cost and schedule issues for a B factory are discussed briefly. A summary of existing proposals is presented, noting their similarities and differences.

  4. Internet Resources for General Music.

    ERIC Educational Resources Information Center

    Thompson, Keith P.

    1999-01-01

    Describes Internet tools that can be incorporated into the general music class structure and are appropriate for upper elementary and middle school students: (1) electronic music dictionaries and encyclopedias; (2) Internet practice-and-drill web sites; and (3) virtual learning centers. Discusses the structure of Internet projects and the use of…

  5. Generalized no-broadcasting theorem.

    PubMed

    Barnum, Howard; Barrett, Jonathan; Leifer, Matthew; Wilce, Alexander

    2007-12-14

    We prove a generalized version of the no-broadcasting theorem, applicable to essentially any nonclassical finite-dimensional probabilistic model satisfying a no-signaling criterion, including ones with "superquantum" correlations. A strengthened version of the quantum no-broadcasting theorem follows, and its proof is significantly simpler than existing proofs of the no-broadcasting theorem.

  6. Geography's Role in General Education.

    ERIC Educational Resources Information Center

    Harper, Robert A.

    1990-01-01

    Reprinted from the April 1966 "Journal of Geography," this article questions what geography could contribute to general education, observes that new federal programs allow geography to demonstrate the potential contributions of the field, and presents a fundamental perspective for geography instruction that urges a world-system theme for curriculum organization.

  7. New Zealand's first general anaesthetic.

    PubMed

    Newson, A J

    1975-08-01

    The administration of New Zealand's first general anaesthetic took place at the Colonial Gaol, Wellington, on the morning of Monday, September 27th, 1847. The agent used was sulphuric ether which was administered by Mr. Marriot, the manufacturer of the Herapath-type inhaler used on this occasion. The operation, a dental extraction, was performed by the Colonial Surgeon, Dr. J. P. Fitzgerald.

  8. General Chemistry for Waste Handlers.

    ERIC Educational Resources Information Center

    Sixtus, Michael E.

    This manual is intended for use in presenting a course which provides the content-specific general chemistry education required for the safety awareness and job enhancement of persons employed as waste handlers. The course, which was designed to be delivered to technicians at job sites in a lecture/demonstration format with several hands-on…

  9. General purpose programmable accelerator board

    DOEpatents

    Robertson, Perry J.; Witzke, Edward L.

    2001-01-01

    A general purpose accelerator board and acceleration method comprising use of: one or more programmable logic devices; a plurality of memory blocks; bus interface for communicating data between the memory blocks and devices external to the board; and dynamic programming capabilities for providing logic to the programmable logic device to be executed on data in the memory blocks.

  10. Generalization of the Bernoulli ODE

    ERIC Educational Resources Information Center

    Azevedo, Douglas; Valentino, Michele C.

    2017-01-01

    In this note, we propose a generalization of the famous Bernoulli differential equation by introducing a class of nonlinear first-order ordinary differential equations (ODEs). We provide a family of solutions for this class of ODEs and present some examples illustrating the application of our result.
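    For context, the classical Bernoulli equation y' + p(t)y = q(t)yⁿ that this note generalizes is solved by the substitution v = y^(1-n), which turns it into a linear ODE. A minimal sketch of that classical technique (our own illustrative example, not one from the paper) using the logistic case y' = y - y²:

    ```python
    import math

    # Bernoulli form: y' + p*y = q*y^n.  The logistic equation y' = y - y^2
    # corresponds to p = -1, q = -1, n = 2.  The substitution v = y^(1-n) = 1/y
    # yields the linear equation v' + v = 1, which we integrate with forward Euler.
    def solve_via_substitution(t_end, steps, y0=0.5):
        dt = t_end / steps
        v = 1.0 / y0                  # transformed initial condition
        for _ in range(steps):
            v += dt * (1.0 - v)       # Euler step on the linearized equation v' = 1 - v
        return 1.0 / v                # transform back: y = 1/v

    def y_exact(t, y0=0.5):
        # Known logistic solution for y(0) = 0.5: y(t) = 1 / (1 + e^(-t))
        return 1.0 / (1.0 + math.exp(-t))
    ```

    With a fine step size, the result recovers the exact logistic solution; the paper's contribution is a family of solutions for a broader nonlinear class where this substitution alone does not suffice.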

  11. How do general anaesthetics work?

    NASA Astrophysics Data System (ADS)

    Antkowiak, Bernd

    2001-05-01

    Almost a century ago, Meyer and Overton discovered a linear relationship between the potency of anaesthetic agents to induce general anaesthesia and their ability to accumulate in olive oil. Similar correlations between anaesthetic potency and lipid solubility were later reported from investigations on various experimental model systems. However, exceptions to the Meyer-Overton correlation exist in all these systems, indicating that lipid solubility is an important, but not the sole determinant of anaesthetic action. In the mammalian central nervous system, most general anaesthetics act at multiple molecular sites. It seems likely that not all of these effects are involved in anaesthesia. GABAA- and NMDA-receptor/ion channels have already been identified as relevant targets. However, further mechanisms, such as a blockade of Na+ channels and an activation of K+ channels, also come into play. A comparison of different anaesthetics seems to show that each compound has its own spectrum of molecular actions and thus shows specific, fingerprint-like effects on different levels of neuronal activity. This may explain why there is no known compound that specifically antagonises general anaesthesia. General anaesthesia is a multidimensional phenomenon. Unconsciousness, amnesia, analgesia, loss of sensory processing and the depression of spinal motor reflexes are important components. It was not realised until very recently that different molecular mechanisms might underlie these different components. These findings challenge traditional views, such as the assumption that one anaesthetic can be freely replaced by another.

  12. Dimensional Analysis and General Relativity

    ERIC Educational Resources Information Center

    Lovatt, Ian

    2009-01-01

    Newton's law of gravitation is a central topic in the first-year physics curriculum. A lecturer can go beyond the physical details and use the history of gravitation to discuss the development of scientific ideas; unfortunately, the most recent chapter in this history, general relativity, is not covered in first-year courses. This paper discusses…

  13. A More General "U = mgh"

    ERIC Educational Resources Information Center

    Keeports, David

    2009-01-01

    In problems dealing with the Earth's gravity, students are frequently bewildered by the fact that the two common expressions for potential energy, "mgh" and "-Gm[subscript e]m/r", differ in sign and differ considerably in form. Some textbooks demonstrate that the more familiar first expression is a special case of the more general second equation…
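    The special-case relationship the abstract describes is easy to verify numerically: the change in U(r) = -GM_em/r between r = R and r = R + h is GM_em·h/(R(R+h)), which reduces to mgh when h << R because g = GM_e/R². A minimal sketch (our own illustration, not from the article):

    ```python
    G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
    Me = 5.972e24   # mass of the Earth, kg
    R = 6.371e6     # mean radius of the Earth, m

    def delta_u_general(m, h):
        """Exact potential-energy change from the general -G*Me*m/r expression."""
        return -G * Me * m / (R + h) - (-G * Me * m / R)

    def delta_u_mgh(m, h):
        """The familiar near-surface approximation m*g*h, with g = G*Me/R^2."""
        g = G * Me / R**2
        return m * g * h
    ```

    For h = 100 m the two expressions agree to a few parts in 10⁵, while for h comparable to R the mgh form overestimates the true change substantially, which is exactly the distinction the article asks students to appreciate.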

  14. Competency-Based General Education.

    ERIC Educational Resources Information Center

    Hunter, Walter E.

    1979-01-01

    Presents arguments in favor of competency-based general education within the context of a program developed at Central Technical Community College (Nebraska) which utilizes an open entry, open exit approach for all programs. Outlines suggested competencies in the areas of communications and oral communications. (MB)

  15. Generalized No-Broadcasting Theorem

    NASA Astrophysics Data System (ADS)

    Barnum, Howard; Barrett, Jonathan; Leifer, Matthew; Wilce, Alexander

    2007-12-01

    We prove a generalized version of the no-broadcasting theorem, applicable to essentially any nonclassical finite-dimensional probabilistic model satisfying a no-signaling criterion, including ones with “superquantum” correlations. A strengthened version of the quantum no-broadcasting theorem follows, and its proof is significantly simpler than existing proofs of the no-broadcasting theorem.

  16. General Requirements and Minimum Standards.

    ERIC Educational Resources Information Center

    2003

    This publication provides the General Requirements and Minimum Standards developed by the National Court Reporters Association's Council on Approved Student Education (CASE). They are the same for all court reporter education programs, whether an institution is applying for approval for the first time or for a new grant of approval. The first…

  17. Good Phonic Generalizations for Decoding

    ERIC Educational Resources Information Center

    Gates, Louis

    2007-01-01

    An exhaustive analysis of 88,641 individual letters and letter combinations within 16,928 words drawn from the Zeno, et al. word list unveiled remarkable phonic transparency. The individual letter and letter combinations sorted into just six general categories: three basic categories of vowels (single vowels, vowel digraphs, and final…

  18. General formulation of transverse hydrodynamics

    SciTech Connect

    Ryblewski, Radoslaw; Florkowski, Wojciech

    2008-06-15

    General formulation of hydrodynamics describing transversally thermalized matter created at the early stages of ultrarelativistic heavy-ion collisions is presented. Similarities and differences with the standard three-dimensionally thermalized relativistic hydrodynamics are discussed. The role of the conservation laws as well as the thermodynamic consistency of two-dimensional thermodynamic variables characterizing transversally thermalized matter is emphasized.

  19. Computing generalized Langevin equations and generalized Fokker-Planck equations.

    PubMed

    Darve, Eric; Solomon, Jose; Kia, Amirali

    2009-07-07

    The Mori-Zwanzig formalism is an effective tool to derive differential equations describing the evolution of a small number of resolved variables. In this paper we present its application to the derivation of generalized Langevin equations and generalized non-Markovian Fokker-Planck equations. We show how long-time-scale rates and metastable basins can be extracted from these equations. Numerical algorithms are proposed to discretize these equations. An important aspect is the numerical solution of the orthogonal dynamics equation, which is a partial differential equation in a high-dimensional space. We propose efficient numerical methods to solve this orthogonal dynamics equation. In addition, we present a projection formalism of the Mori-Zwanzig type that is applicable to discrete maps. Numerical applications are presented from the field of Hamiltonian systems.
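    To fix ideas about what such discretizations look like in the simplest limit, here is a toy sketch (not the paper's method) of an Euler-Maruyama discretization of the memoryless overdamped Langevin equation dx = -kx dt + √(2D) dW; the generalized equations in the paper add a memory kernel and colored noise on top of this structure. The stationary distribution here is Gaussian with variance D/k, which gives a simple consistency check:

    ```python
    import math
    import random

    def simulate_ou(k=1.0, D=1.0, dt=0.01, steps=200_000, seed=1):
        """Euler-Maruyama for dx = -k*x dt + sqrt(2*D) dW; returns sampled variance."""
        rng = random.Random(seed)
        x, samples = 0.0, []
        noise = math.sqrt(2.0 * D * dt)      # std of the per-step Wiener increment term
        for i in range(steps):
            x += -k * x * dt + noise * rng.gauss(0.0, 1.0)
            if i > steps // 10:              # discard burn-in before sampling
                samples.append(x)
        mean = sum(samples) / len(samples)
        return sum((s - mean) ** 2 for s in samples) / len(samples)
    ```

    The sampled variance should land close to the theoretical D/k = 1; for the generalized (non-Markovian) case, the memory integral makes both the update rule and such equilibrium checks considerably more involved.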

  20. Anisotropic Generalized Ghost Pilgrim Dark Energy Model in General Relativity

    NASA Astrophysics Data System (ADS)

    Santhi, M. Vijaya; Rao, V. U. M.; Aditya, Y.

    2017-02-01

    A spatially homogeneous and anisotropic locally rotationally symmetric (LRS) Bianchi type-I Universe filled with matter and generalized ghost pilgrim dark energy (GGPDE) has been studied in the general theory of relativity. To obtain a determinate solution of the field equations we have used a scalar expansion proportional to the shear scalar, which leads to a relation between the metric potentials. Some well-known cosmological parameters (the equation of state (EoS) parameter ω_Λ, the deceleration parameter q and the squared speed of sound v_s²) and planes (the ω_Λ versus ω̇_Λ plane and the statefinder plane) are constructed for the obtained model. The behaviour and significance of these parameters are discussed in terms of the pilgrim dark energy parameter (β) and cosmic time (t).