Background Skin diseases are common in clinical practice yet are underestimated and overlooked by most clinicians. Many patients are hospitalized with co-existing dermatological conditions that may go undetected and unmanaged by the attending physicians. The objective of this study was to determine the burden of co-existing and overlooked dermatological disorders among patients admitted to the medical wards of Muhimbili National Hospital in Dar es Salaam. Study design and settings A hospital-based descriptive cross-sectional study conducted at Muhimbili National Hospital in Dar es Salaam, Tanzania. Methods Patients were consecutively recruited from the medical wards. A detailed interview to obtain clinico-demographic characteristics was followed by a complete physical examination. Dermatological diagnoses were made mainly on clinical grounds; appropriate confirmatory laboratory investigations were performed where necessary. Data were analyzed using the Statistical Package for Social Sciences (SPSS), version 10.0. A p-value of < 0.05 was considered statistically significant. Results Three hundred and ninety patients admitted to medical wards were enrolled in the study, of whom 221 (56.7%) were female. The mean age was 36.7 ± 17.9 years (range 7-84 years). Overall, 232/390 patients (59.5%) had co-existing dermatological disorders: 191/390 (49%) had one, 36/390 (9%) had two, and 5 (1%) had three. A wide range of co-existing skin diseases was encountered, the most diverse being non-infectious conditions, which together accounted for 36.4% (142/390), while infectious dermatoses accounted for 31.5% (123/390). The leading infectious skin diseases were superficial fungal infections, accounting for 18%. Pruritic papular eruption of HIV/AIDS (PPE) and seborrheic eczema were the most common non-infectious conditions, each accounting for 4.3%. Of the 232 patients with dermatological disorders, 191/232 (82.3%) and 154/232 (66.3%) had been overlooked by their referring and admitting
Mutagaywa, Reuben Kato; Lutale, Janeth; Aboud, Muhsin; Kamala, Benjamin Anathory
Introduction The prevalence of erectile dysfunction (ED) has been increasing in the general population, especially among diabetic patients, yet the problem appears to be neglected in low-income countries. This study aimed to establish the prevalence of ED and associated risk factors among diabetic patients attending the diabetic clinic at Muhimbili National Hospital. Methods A cross-sectional hospital-based study was conducted among 312 diabetic patients attending the diabetic clinic at Muhimbili National Hospital between May and December 2011. Results More than half (55.1%) of the patients were found to have some form of ED (12.8% had mild dysfunction, 11.5% moderate and 27.9% severe dysfunction). The severity of ED correlated with increasing age. Multivariate logistic regression revealed that ED was significantly predicted by old age (odds ratio (OR) = 7.1, 95% CI 1.2-40.7), evidence of peripheral neuropathy (OR = 5.9, 95% CI 1.6-21.3), and evidence of peripheral vascular disease (OR = 2.5, 95% CI 1.2-5.3). Longer duration of DM was also marginally associated with ED (p = 0.056). Patients with ED were also more likely to have impairment in other sexual domains (p < 0.001). No lifestyle factor was associated with ED. Conclusion The prevalence of ED is high among DM patients. Interventions aimed at prevention, early diagnosis and detection of DM and its complications, and adherence to treatment to prevent complications should be implemented. Further studies should examine temporal variation to establish the causal effect of DM on erectile dysfunction. PMID:25170371
Sawe, H. R.; Mfinanga, J. A.; Ringo, F. H.; Mwafongo, V.; Reynolds, T. A.; Runyon, M. S.
Background. Traditional uvulectomy is performed as a cultural ritual or purported medical remedy. We describe the associated emergency department (ED) presentations and outcomes. Methods. This was a subgroup analysis of a retrospective review of all pediatric visits to our ED in 2012. Trained abstracters recorded demographics, clinical presentations, and outcomes. Results. Complete data were available for 5540/5774 (96%) visits and 56 (1.0%, 95% CI: 0.7–1.3%) were related to recent uvulectomy, median age 1.3 years (interquartile range: 7 months–2 years) and 30 (54%) were male. Presenting complaints included cough (82%), fever (46%), and hematemesis (38%). Clinical findings included fever (54%), tachypnea (30%), and tachycardia (25%). 35 patients (63%, 95% CI: 49–75%) received intravenous antibiotics, 11 (20%, 95% CI: 10–32%) required blood transfusion, and 3 (5%, 95% CI: 1–15%) had surgical intervention. All were admitted to the hospital and 12 (21%, 95% CI: 12–34%) died. By comparison, 498 (9.1%, 95% CI: 8–10%) of the 5484 children presenting for reasons unrelated to uvulectomy died (p = 0.003). Conclusion. In our cohort, traditional uvulectomy was associated with significant morbidity and mortality. Emergency care providers should advocate for legal and public health interventions to eliminate this dangerous practice. PMID:26161270
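The binomial proportion intervals quoted above (e.g., 56/5540 ≈ 1.0%, 95% CI 0.7–1.3%) can be reproduced with a standard normal-approximation (Wald) interval. This is a minimal sketch for illustration, not necessarily the authors' exact method (exact or Wilson intervals may have been used for the smaller counts):

```python
import math

def proportion_ci(k, n, z=1.96):
    """Wald 95% CI for a binomial proportion k/n (z=1.96 for 95%)."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# 56 of 5540 ED visits related to recent uvulectomy:
p, lo, hi = proportion_ci(56, 5540)   # ≈ 1.0% (0.7–1.3%)
```

For proportions near 0 or 1 with small counts, a Wilson or exact (Clopper–Pearson) interval is generally preferred over the Wald form shown here.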
Background Effective maternal and perinatal audits are associated with improved quality of care and a reduction in severe adverse outcomes. Although audits at the level of care were formally introduced in Tanzania around 25 years ago, little information is available about their existence, performance, and the practical barriers to their implementation. This study assessed the structure, process and impact of maternal and perinatal death audit systems in clinical practice and presents a detailed account of how they could be improved. Methods A cross-sectional descriptive study was conducted in eight major hospitals in Dar es Salaam in January 2009. An in-depth interview guide was used with 29 health managers and members of the audit committees to investigate the existence, structure, process and outcome of such audits in clinical practice. A semi-structured questionnaire was used to interview 30 health care providers in the maternity wards to assess their awareness of, attitudes towards and practice of audit systems. The 2007 institutional pregnancy outcome records were reviewed. Results The overall hospital-based maternal mortality ratio was 218/100,000 live births (range: 0 - 385) and the perinatal mortality rate was 44/1000 births (range: 17 - 147). Maternal and perinatal audit systems existed in only 4 and 3 hospitals respectively, and key decision makers did not take part in audit committees. Sixty percent of care providers were not aware of even a single action that had ever been implemented in their hospital as a result of audit recommendations. There were no records of key decision points, action plans, or regular analysis of the audit reports in any of the facilities where such audit systems existed. Conclusions Maternal and perinatal audit systems in these institutions are poorly established in structure and process, and are ineffective in improving the quality of care. Fundamental changes are urgently needed for successful audit systems in these institutions. PMID
Varga, Timothy A; Asner, Gregory P
Alien invasive grasses threaten to transform Hawaiian ecosystems through the alteration of ecosystem dynamics, especially the creation or intensification of a fire cycle. Across sub-montane ecosystems of Hawaii Volcanoes National Park on Hawaii Island, we quantified fine fuels and fire spread potential of invasive grasses using a combination of airborne hyperspectral and light detection and ranging (LiDAR) measurements. Across a gradient from forest to savanna to shrubland, automated mixture analysis of hyperspectral data provided spatially explicit fractional cover estimates of photosynthetic vegetation, non-photosynthetic vegetation, and bare substrate and shade. Small-footprint LiDAR provided measurements of vegetation height along this gradient of ecosystems. Through the fusion of hyperspectral and LiDAR data, a new fire fuel index (FFI) was developed to model the three-dimensional volume of grass fuels. Regionally, savanna ecosystems had the highest volumes of fire fuels, averaging 20% across the ecosystem and frequently filling all of the three-dimensional space represented by each image pixel. The forest and shrubland ecosystems had lower FFI values, averaging 4.4% and 8.4%, respectively. The results indicate that the fusion of hyperspectral and LiDAR remote sensing can provide unique information on the three-dimensional properties of ecosystems, their flammability, and the potential for fire spread. PMID:18488621
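The abstract does not give the exact formulation of the fire fuel index (FFI), but a fuel-volume index of this kind can be sketched as the product of photosynthetic (grass) fractional cover from the hyperspectral unmixing and LiDAR-derived vegetation height relative to a reference fuel height. The function below is a hypothetical illustration, not the published index; the 2 m reference height is an assumption:

```python
def fire_fuel_index(pv_fraction, veg_height_m, h_ref_m=2.0):
    """Hypothetical fuel-volume proxy in [0, 1]: the fraction of a pixel's
    reference volume occupied by photosynthetic (grass) fuels.

    pv_fraction  -- fractional cover of photosynthetic vegetation (0-1)
    veg_height_m -- LiDAR-derived vegetation height for the pixel (m)
    h_ref_m      -- assumed reference fuel height normalizing the volume
    """
    return pv_fraction * min(veg_height_m, h_ref_m) / h_ref_m

# A savanna pixel fully covered by 2 m grass fills its reference volume,
# consistent with pixels "frequently filling all of the 3-D space" above.
```

Averaging such a per-pixel index over each ecosystem would yield regional summaries like the 20%, 8.4%, and 4.4% figures reported for savanna, shrubland, and forest.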
Freeland, Mark S.; Anderson, Gerard; Schendler, Carol Ellen
The national community hospital input price index presented here isolates the effects of prices of goods and services required to produce hospital care and measures the average percent change in prices for a fixed market basket of hospital inputs. Using the methodology described in this article, weights for various expenditure categories were estimated and proxy price variables associated with each were selected. The index is calculated for the historical period 1970 through 1978 and forecast for 1979 through 1981. During the historical period, the input price index increased an average of 8.0 percent a year, compared with an average rate of increase of 6.6 percent for overall consumer prices. For the period 1979 through 1981, the average annual increase is forecast at between 8.5 and 9.0 percent. Using the index to deflate growth in expenses, the level of real growth in expenditures per inpatient day (net service intensity growth) averaged 4.5 percent per year with considerable annual variation related to government and hospital industry policies. PMID:10309052
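Deflating nominal expense growth by the input price index, as described above, is simple growth-rate arithmetic. The sketch below is illustrative; the 12.9% nominal growth figure is hypothetical, chosen only to reproduce the roughly 4.5 percent net service intensity growth reported:

```python
def real_growth(nominal_growth, price_growth):
    # Remove input-price inflation from nominal expense growth:
    # (1 + nominal) = (1 + price) * (1 + real)
    return (1 + nominal_growth) / (1 + price_growth) - 1

# e.g., 12.9% nominal growth in expenses per inpatient day against an
# 8.0% input price rise leaves roughly 4.5% real (intensity) growth
print(round(real_growth(0.129, 0.080) * 100, 1))  # → 4.5
```

Note that dividing growth factors, rather than subtracting growth rates, is the correct deflation; simple subtraction overstates real growth slightly at these rates.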
Hagen, S. C.
Scientists at Applied GeoSolutions, Jet Propulsion Laboratory, Winrock International, and the University of New Hampshire are working with the government of Indonesia to enhance the National Forest Monitoring System (NFMS) in Kalimantan, Indonesia. The establishment of a reliable, transparent, and comprehensive NFMS has been limited by a dearth of relevant data that are accurate, low-cost, and spatially resolved at subnational scales. In this NASA-funded project, we are developing, evaluating, and validating several critical components of a NFMS in Kalimantan, Indonesia, focusing on the use of LiDAR and radar imagery for improved carbon stock and forest degradation information. Applied GeoSolutions and the University of New Hampshire have developed an open source software package to process large amounts of LiDAR data quickly, easily, and accurately. The open source project is called lidar2dems and includes the classification of raw LAS point clouds and the creation of Digital Terrain Models (DTMs), Digital Surface Models (DSMs), and Canopy Height Models (CHMs). Preliminary estimates of forest structure and forest damage from logging derived from these data sets support the idea that comprehensive, well-documented, freely available software for processing LiDAR data can enable countries such as Indonesia to cost-effectively monitor their forests with high precision.
Wu, Z.; Weiner, J.; Kumar, J.; Norman, S. P.; Hargrove, W. W.; Collier, N.; Hoffman, F. M.
A major challenge in forest management is the inaccessibility of large swaths of land, which makes accurate monitoring of forest change difficult. Remote sensing methods can help address this issue by allowing investigators to monitor remote or inaccessible regions using aerial or satellite-based platforms. However, most remote sensing methods do not provide a full three-dimensional (3D) description of the area. Rather, they return only a single elevation point or landcover description. Multiple-return LiDAR (Light Detection and Ranging) gathers data in a 3D point cloud, which allows forest managers to more accurately characterize and monitor changes in canopy structure and vegetation-type distribution. Our project used high-resolution aerial multiple-return LiDAR data to determine vegetation canopy structures and their spatial distribution in Great Smoky Mountains National Park. To ensure sufficient data density and to match LANDSAT resolution, we gridded the data into 30m x 30m cells. The LiDAR data points within each cell were then used to generate the vertical canopy structure for that cell. After vertical profiles had been created, we used a k-means cluster analysis algorithm to classify the landscape based on the canopy structure. The spatial distribution of distinct and unique canopy structures was mapped across the park and compared to a vegetation-type map to determine the correlation of canopy structure to vegetation types. Preliminary analysis conducted at a number of phenology sites maintained by the Great Smoky Mountains Institute at Tremont shows strong correspondence between canopy structure and vegetation type. However, more validation is needed in other regions of the park to establish this method as a reliable tool. LiDAR data has a unique ability to map full 3D structures of vegetation and the methods developed in this project offer an extensible tool for forest mapping and monitoring.
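The gridding and clustering pipeline described above can be sketched in a few lines. This is an illustrative reconstruction, not the project's actual code: the 5 m height bins, Euclidean distance, and plain k-means implementation are all assumptions.

```python
import numpy as np

def grid_canopy_profiles(points, cell=30.0, zbins=None):
    """Grid LiDAR returns into cell x cell (m) columns and build, for each
    column, a normalized vertical histogram of return heights (the 'canopy
    structure' profile). `points` is an (N, 3) array of x, y, height."""
    if zbins is None:
        zbins = np.arange(0.0, 61.0, 5.0)   # assumed 5 m height bins to 60 m
    ij = np.floor(points[:, :2] / cell).astype(int)
    profiles = {}
    for key in set(map(tuple, ij)):
        mask = (ij[:, 0] == key[0]) & (ij[:, 1] == key[1])
        hist, _ = np.histogram(points[mask, 2], bins=zbins)
        profiles[key] = hist / max(hist.sum(), 1)  # normalize per cell
    return profiles

def kmeans_cluster(X, k, iters=50, seed=0):
    """Plain k-means on the profile vectors; returns (labels, centers)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # assign each profile to its nearest center, then update centers
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return labels, centers
```

Mapping the resulting labels back to the (i, j) cell indices yields the spatial distribution of canopy-structure classes that is then compared against the vegetation-type map.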
Bruster, S.; Jarman, B.; Bosanquet, N.; Weston, D.; Erens, R.; Delbanco, T. L.
OBJECTIVE--To survey patients' opinions of their experiences in hospital in order to produce data that can help managers and doctors to identify and solve problems. DESIGN--Random sample of 36 NHS hospitals, stratified by size of hospital (number of beds), area (north, midlands, south east, south west), and type of hospital (teaching or non-teaching, trust or directly managed). From each hospital a random sample of, on average, 143 patients was interviewed at home or the place of discharge two to four weeks after discharge by means of a structured questionnaire about their treatment in hospital. SUBJECTS--5150 randomly chosen NHS patients recently discharged from acute hospitals in England. Subjects had been patients on medical and surgical wards apart from paediatric, maternity, psychiatric, and geriatric wards. MAIN OUTCOME MEASURES--Patients' responses to direct questions about preadmission procedures, admission, communication with staff, physical care, tests and operations, help from staff, pain management, and discharge planning. Patients' responses to general questions about their degree of satisfaction in hospitals. RESULTS--Problems were reported by patients, particularly with regard to communication with staff (56% (2824/5020) had not been given written or printed information); pain management (33% (1042/3162) of those suffering pain were in pain all or most of the time); and discharge planning (70% (3599/5124) had not been told about warning signs and 62% (3177/5119) had not been told when to resume normal activities). Hospitals failed to reach the standards of the Patient's Charter--for example, in explaining the treatment proposed and giving patients the option of not taking part in student training. Answers to questions about patient satisfaction were, however, highly positive but of little use to managers. CONCLUSIONS--This survey has highlighted several problems with treatment in NHS hospitals. Asking patients direct questions about what happened
Kishimba, Rogath Saika; Mpembeni, Rose; Mghamba, Janneth M; Goodman, David; Valencia, Diana
Background 94% of all birth defects (BD) and 95% of deaths due to BD occur in low- and middle-income countries, and many of these defects are preventable. In Tanzania, there is currently a paucity of the BD data needed to develop data-informed prevention activities. Methods A cross-sectional analysis was conducted of deliveries identified with BD in the labor ward registers at four Dar es Salaam hospitals between October 2011 and February 2012. The birth prevalence of structural BD, the case fatality proportion, and the distribution of structural-defect-associated deaths within total deaths were calculated. Results A total of 28 217 resident births were recorded during the study period. The overall birth prevalence of the selected defects was 28.3/10 000 live births. Neural tube defects and indeterminate sex were the most and least common defects at birth (9.9 and 1.1/10 000 live births, respectively). Among stillbirths (66.7%) and deaths occurring within less than 5 days of an affected live birth (18.5%), neural tube defects were the most frequently associated structural defect. Conclusion Structural BD are common and contribute to perinatal mortality in Dar es Salaam. More than half of the perinatal deaths among the selected external structural BD studied were associated with neural tube defects, a birth defect with well-established, evidence-based prevention interventions. By establishing a population-based BD surveillance program, Tanzania would have the information about neural tube defects and other major structural BD needed to develop and monitor prevention activities. PMID:26361541
Methods used to process raw Light Detection and Ranging (LiDAR) data can sometimes obscure the digital signatures indicative of an archaeological site. This thesis explains the negative effects that certain LiDAR data processing procedures can have on the preservation of an archaeological site. This thesis also presents methods for effectively integrating LiDAR with other forms of mapping data in a Geographic Information Systems (GIS) environment in order to improve LiDAR archaeological signatures by examining several pre-Columbian Native American shell middens located in Canaveral National Seashore Park (CANA).
Kumar, Jitendra; Hargrove Jr., William Walter; Norman, Steven P; Hoffman, Forrest M; Newcomb, Doug
Vegetation canopy structure is a critically important habitat characteristic for many threatened and endangered birds and other animal species, and it is key information needed by forest and wildlife managers for monitoring and managing forest resources, conservation planning and fostering biodiversity. Advances in Light Detection and Ranging (LiDAR) technologies have enabled remote sensing-based studies of vegetation canopies by capturing three-dimensional structures, yielding information not available in the two-dimensional images of the landscape provided by traditional multi-spectral remote sensing platforms. However, the large-volume data sets produced by airborne LiDAR instruments pose a significant computational challenge, requiring algorithms to identify and analyze patterns of interest buried within LiDAR point clouds in a computationally efficient manner, utilizing state-of-the-art computing infrastructure. We developed and applied a computationally efficient approach to analyze a large volume of LiDAR data and to characterize and map the vegetation canopy structures for 139,859 hectares (540 sq. miles) in the Great Smoky Mountains National Park. This study helps improve our understanding of the distribution of vegetation and animal habitats in this extremely diverse ecosystem.
Joshi, N.; Fensholt, R.; Saatchi, S. S.; Mitchard, E. T.
The international Reducing Emissions from Deforestation and Degradation (REDD) program requires accurate and cost-effective techniques for national-level mapping of above-ground biomass (AGB) and ground-sampling strategies. This paper explores a multi-sensor (radar and low-density airborne LiDAR) integration approach for country-wide AGB estimation and mapping in Denmark, selected as a test country due to the unique availability of country-wide remote sensing and forest inventory data. We assess the potential use of ALOS PALSAR L-band radar and ENVISAT ASAR C-band radar for predicting and mapping AGB with accuracies similar to LiDAR-derived AGB estimates at different map scales. We start by creating a LiDAR-based 'ground truth' map, using the LiDAR-derived 95th percentile of heights >1 m weighted by the canopy density ratio, together with 113 AGB plots, to map AGB at a 0.25 ha resolution across the country. A leave-20%-out cross-validation indicates that the AGB estimates have a mean absolute error of 41 Mg ha-1 and a negative mean bias error of 1.7 Mg ha-1. Though the LiDAR model appears to have species-specific biases for conifers and broadleaves (-5.2 Mg ha-1 and +12.3 Mg ha-1, respectively), these are found to be insignificant (p>0.05) when accounting for species sampling bias and the under-prediction of high-biomass plots (>350 Mg ha-1). Using the LiDAR-derived biomass map as a 'truth map', biomass-backscatter relations will be quantified at three map scales (0.25 ha, 1 ha and 100 ha) using three spatial sampling frameworks (the full dataset; stratified random sampling equally representing low- and high-biomass pixels; and clustered sampling). The approach aims to derive a minimal sampling and mapping strategy for L- and C-band radar that achieves at least 20% accuracy in AGB estimation, along with quantified sources of error from ground AGB estimates, scaling and sampling. It is expected that mapping techniques, uncertainty quantification and
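A leave-20%-out (i.e., 5-fold) cross-validation reporting mean absolute error and mean bias error, as used above, can be sketched as follows. The `fit`/`predict` callables are placeholders: the abstract does not specify the form of the height-percentile regression, so the linear model in the usage example is purely hypothetical.

```python
import numpy as np

def cross_validate(X, y, fit, predict, folds=5, seed=0):
    """Leave-20%-out CV: returns (mean absolute error, mean bias error)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))          # shuffle plot indices once
    errors = []
    for f in range(folds):
        test = idx[f::folds]               # every 5th index -> 20% held out
        train = np.setdiff1d(idx, test)
        model = fit(X[train], y[train])
        errors.append(predict(model, X[test]) - y[test])
    errors = np.concatenate(errors)
    return np.abs(errors).mean(), errors.mean()

# usage with a simple 1-D linear model standing in for the AGB regression
x = np.linspace(0.0, 30.0, 120)   # e.g., weighted 95th-percentile height (m)
y = 12.0 * x + 5.0                # hypothetical plot AGB (Mg/ha)
mae, bias = cross_validate(x, y,
                           lambda a, b: np.polyfit(a, b, 1),
                           lambda m, a: np.polyval(m, a))
```

With real plot data, the returned pair corresponds to the 41 Mg ha-1 MAE and -1.7 Mg ha-1 mean bias reported for the LiDAR model.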
Objectives This paper provides information for decision making of the managers and the staff of national university hospitals. Methods In order to conduct a financial analysis of national university hospitals, this study uses reports on the final accounts of 10 university hospitals from 2008 to 2011. Results The results of comparing 2008 and 2011 showed that there was a general decrease in total assets, an increase in liabilities, and a decrease in total medical revenues, with a continuous deficit in many hospitals. Moreover, as national university hospitals have low debt dependence, their management conditions generally seem satisfactory. However, some individual hospitals suffer severe financial difficulties and thus depend on short-term debts, which generally aggravate the profit and loss structure. Various indicators show that the financial state and business performance of national university hospitals have been deteriorating. Conclusion These research findings will be used as important basic data for managers who make direct decisions in this uncertain business environment or by researchers who analyze the medical industry to enable informed decision-making and optimized execution. Furthermore, this study is expected to contribute to raising government awareness of the need to foster and support the national university hospital industry. PMID:26730356
Oates, B; Murray, J; Hindle, D
The costing of hospital outputs, and especially of acute admitted patients categorised by DRG, has been the focus of considerable attention in the last decade. Many individual hospitals now routinely estimate the costs of their main products, several State and Territory health authorities undertake periodic multi-site studies, and there have been a few one-off national studies. This paper summarises the methods and results of the most recent national study, which measured costs at a sample of public and private hospitals around Australia for the 1996-97 financial year. We briefly describe the main results and note some implications. PMID:10185689
Boscarino, J A; Steiber, S R
Today, hospitals are involved extensively in social marketing and promotional activities. Recently, investigators from the Centers for Disease Control and Prevention (CDC) estimated that routine testing of hospital patients for human immunodeficiency virus (HIV) could identify more than 100,000 patients with previously unrecognized HIV infections. Several issues are assessed in this paper. These include hospital support for voluntary HIV testing and AIDS education and the impact that treating AIDS patients has on the hospital's image. Also tested is the hypothesis that certain hospitals, such as for-profit institutions and those outside the AIDS epicenters, would be less supportive of hospital-based AIDS intervention strategies. To assess these issues, a national random sample of 193 executives in charge of hospital marketing and public relations were surveyed between December 1992 and January 1993. The survey was part of an ongoing annual survey of hospitals and included questions about AIDS, health education, marketing, patient satisfaction, and hospital planning. Altogether, 12.4 percent of executives indicated their hospital had a reputation for treating AIDS patients. Among hospitals without an AIDS reputation, 34.1 percent believed developing one would be harmful to the hospital's image, in contrast to none in hospitals that had such a reputation (chi 2 = 11.676, df = 1, P = .0006). Although 16.6 percent did not know if large-scale HIV testing should be implemented, a near majority (47.7 percent) expressed some support. In addition, 15 percent reported that HIV-positive physicians on the hospital's medical staff should not be allowed to practice medicine, but 32.1 percent indicated that they should.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:7638335
Yoo, Sooyoung; Lee, Kee Hyuck; Lee, Hak Jong; Ha, Kyooseob; Lim, Cheong; Chin, Ho Jun; Yun, Jonghoar; Cho, Eun-Young; Chung, Eunja; Baek, Rong-Min; Chung, Chin Youb; Wee, Won Ryang; Lee, Chul Hee; Lee, Hai-Seok; Byeon, Nam-Soo
Objectives Seoul National University Bundang Hospital (SNUBH), the first Stage 7 hospital outside of North America, has adopted and utilized an innovative and emerging information technology system to improve the efficiency and quality of patient care. The objective of this paper is to briefly introduce the major components of the SNUBH information system and to describe our progress toward a next-generation hospital information system (HIS). Methods SNUBH opened in 2003 as a fully digital hospital by successfully launching a new HIS named BESTCare, "Bundang hospital Electronic System for Total Care". Subsequently, the system has been continuously improved with new applications, including closed-loop medication administration (CLMA), clinical data warehouse (CDW), health information exchange (HIE), and disaster recovery (DR), which have resulted in the achievement of Stage 7 status. Results The BESTCare system is an integrated system for a university hospital setting. BESTCare is mainly composed of three application domains: the core applications, an information infrastructure, and channel domains. The most critical and unique applications of the system, such as the electronic medical record (EMR), computerized physician order entry (CPOE), clinical decision support system (CDSS), CLMA, CDW, HIE, and DR applications, are described in detail. Conclusions Beyond our achievement of Stage 7 hospital status, we are currently developing a next-generation HIS with new goals of implementing infrastructure that is flexible and innovative, implementing a patient-centered system, and strengthening the IT capability to maximize the hospital value. PMID:22844650
Mitchard, E. T. A.; Saatchi, S. S.; White, L. J. T.; Abernethy, K. A.; Jeffery, K. J.; Lewis, S. L.; Collins, M.; Lefsky, M. A.; Leal, M. E.; Woodhouse, I. H.; Meir, P.
Spatially-explicit maps of aboveground biomass are essential for calculating the losses and gains in forest carbon at a regional to national level. The production of such maps across wide areas will become increasingly necessary as international efforts to protect primary forests, such as the REDD+ (Reducing Emissions from Deforestation and forest Degradation) mechanism, come into effect, alongside their use for management and research more generally. However, mapping biomass over high-biomass tropical forest is challenging because (1) direct regressions with optical and radar data saturate, (2) much of the tropics is persistently cloud-covered, reducing the availability of optical data, (3) many regions include steep topography, making the use of radar data complex, and (4) while LiDAR data do not suffer from saturation, expensive aircraft-derived data are necessary for complete coverage. We present a solution to these problems, using a combination of terrain-corrected L-band radar data (ALOS PALSAR), spaceborne LiDAR data (ICESat GLAS) and ground-based data. We map Gabon's Lopé National Park (5000 km2) because it includes a range of vegetation types from savanna to closed-canopy tropical forest, is topographically complex, has no recent contiguous cloud-free high-resolution optical data, and its dense forest is above the saturation point for radar. Our 100 m resolution biomass map is derived by fusing spaceborne LiDAR (7142 ICESat GLAS footprints), 96 ground-based plots (average size 0.8 ha) and an unsupervised classification of terrain-corrected ALOS PALSAR radar data, from which we derive the aboveground biomass stocks of the park to be 78 Tg C (173 Mg C ha-1). This value is consistent with our field data average of 181 Mg C ha-1, from the field plots measured in 2009 covering a total of 78 ha, which are independent as they were not used for the GLAS-biomass estimation. We estimate an uncertainty of ±25% on our carbon stock value for the park. This error term
Makori, L.; Gikera, M.; Wafula, J.; Chakaya, J.; Edginton, M. E.; Kumar, A. M. V.
Setting: Kenyatta National Hospital (KNH), Nairobi, Kenya, a large referral and teaching hospital. Objective: 1) To document tuberculosis (TB) case notification rates and trends; 2) to describe demographic, clinical and workplace characteristics and treatment outcomes; and 3) to examine associations between demographic and clinical characteristics, HIV/AIDS (human immunodeficiency virus/acquired immune-deficiency syndrome) treatment and anti-tuberculosis treatment outcomes among hospital workers with TB at KNH during the period 2006-2011. Design: A retrospective cohort study involving a review of medical records. Results: The TB case notification rate among hospital staff ranged between 413 and 901 per 100 000 staff members per year; 51% of all cases were extra-pulmonary TB, and 74% of all cases were among medical, paramedical and support staff. The TB-HIV co-infection rate was 60%. Only 75% had a successful treatment outcome. Patients in the retreatment category, those with unknown HIV status and those who were support staff had a higher risk of poor treatment outcomes. Conclusion: The TB case rate among hospital workers was unacceptably high compared to that of the general population, and treatment outcomes were poor. Infection control in the hospital and management of staff with TB require urgent attention. PMID:26393055
Hung, Wen-Jiu; Lin, Lan-Ping; Wu, Chia-Ling; Lin, Jin-Ding
The present paper aims to describe hospitalization profiles, including medical expenses and lengths of stay, and to determine possible factors influencing hospital admission among persons with Down syndrome in Taiwan. We employed population-based, retrospective analyses of national health insurance hospital discharge data of the…
Spetz, Joanne; Dudley, Nancy; Trupin, Laura; Rogers, Maggie; Meier, Diane E; Dumanovsky, Tamara
The predominant model for palliative care delivery, outside of hospice care, is the hospital-based consultative team. Although a majority of US hospitals offer palliative care services, there has been little research on the staffing of their program teams and whether those teams meet national guidelines, such as the Joint Commission's standard of including at least one physician, an advanced practice or other registered nurse, a social worker, and a chaplain. Data from the 2012-13 annual surveys of the National Palliative Care Registry indicate that only 25 percent of participating programs met that standard based on funded positions, and even when unfunded positions were included, only 39 percent of programs met the standard. Larger palliative care programs were more likely than smaller ones to include a funded physician position, while smaller programs were more reliant upon advanced practice and registered nurses. To meet current and future palliative care needs, expanded and enhanced education, as well as supportive financing mechanisms for consultations, are needed. PMID:27605652
... Has the inpatient hospital death rate decreased for all patients and for those with selected first-listed ... 2010 differ from the length of stay for all hospitalizations? Inpatients who died in the hospital stayed ...
Robinson, Joel E.
Crater Lake partially fills the caldera that formed approximately 7,700 years ago during the eruption of a 12,000-foot volcano known as Mount Mazama. The caldera-forming or climactic eruption of Mount Mazama devastated the surrounding landscape, left a thick deposit of pumice and ash in adjacent valleys, and spread a blanket of volcanic ash as far away as southern Canada. Because the Crater Lake region is potentially volcanically active, knowledge of past events is important to understanding hazards from future eruptions. Similarly, because the area is seismically active, documenting and evaluating geologic faults is critical to assessing hazards from earthquakes. As part of the American Recovery and Reinvestment Act (ARRA) of 2009, the U.S. Geological Survey was awarded funding for high-precision airborne LiDAR (Light Detection And Ranging) data collection at several volcanoes in the Cascade Range through the Oregon LiDAR Consortium, administered by the Oregon Department of Geology and Mineral Industries (DOGAMI). The Oregon LiDAR Consortium contracted with Watershed Sciences, Inc., to conduct the data collection surveys. Collaborating agencies participating with the Oregon LiDAR Consortium for data collection in the Crater Lake region include Crater Lake National Park (National Park Service) and the Federal Highway Administration. In the immediate vicinity of Crater Lake National Park, 798 square kilometers of LiDAR data were collected, providing a digital elevation dataset of the ground surface beneath forest cover with an average resolution of 1.6 laser returns/m2 and both vertical and horizontal accuracies of ±5 cm. The LiDAR data were mosaicked in this report with bathymetry of the lake floor of Crater Lake, collected in 2000 using high-resolution multibeam sonar in a collaborative effort between the U.S. Geological Survey, Crater Lake National Park, and the Center for Coastal and Ocean Mapping at the University of New Hampshire. The bathymetric survey
Lim, HyunWoo; Kim, DongOok; Ahn, JinYoung; Lee, DongHyuk; Lee, JinHyung; Park, HeeJung; Kim, JongHyo; Han, Jungu
Seoul National University Hospital (SNUH) comprises two buildings and has more than 1,500 beds for patients needing hospitalization. This year, Marotech provided a full PACS to SNUH with complete HIS integration. In this paper, the installation process and seven months of management experience are presented. At SNUH, an average of 1,643.8 exams were performed per day during the seven months after PACS installation, amounting to about 40 gigabytes per day. Two acquisition servers (ACQ 1, 2), two database servers (DB 1, 2), two storage servers (LTA, network-attached storage (NAS)), and one backup server (DLT), a total of 8 servers, were installed. SNUH has 11 CRs, 4 CTs, 3 MRIs, 9 NMs, 4 RFs, 20 USs, 7 ESs, 4 SCs, 5 XAs, and 5 film digitizers, all of which were integrated with the PACS. Images conform to the DICOM 3.0 standard, and DICOM gateways were installed for modalities that do not support DICOM. Physicians can query and view endoscopic, pathologic, and anatomic data as well as radiological data. Any exam from the past five years can be accessed online in less than 10 seconds. Through the cooperation between SNUH and Marotech, the HIS and PACS work together stably. The systems were integrated using the HL7 standard and IHE.
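The throughput figures above lend themselves to simple capacity estimates. As a back-of-envelope sketch, using only the figures reported in the abstract (~1,643.8 exams/day and ~40 GB/day; the five-year horizon matches the online archive period mentioned), the implied average exam size and archive footprint are:

```python
# Back-of-envelope PACS storage estimate from the figures reported for SNUH.
# Both constants come from the abstract; everything derived below is a rough
# illustration (no compression, 365-day years), not the hospital's actual sizing.
EXAMS_PER_DAY = 1643.8
GB_PER_DAY = 40.0

def avg_exam_size_mb(gb_per_day: float, exams_per_day: float) -> float:
    """Average image volume per exam, in megabytes."""
    return gb_per_day * 1024 / exams_per_day

def archive_size_tb(gb_per_day: float, years: float) -> float:
    """Long-term archive footprint, in terabytes."""
    return gb_per_day * 365 * years / 1024

print(f"avg exam size: {avg_exam_size_mb(GB_PER_DAY, EXAMS_PER_DAY):.1f} MB")   # ~24.9 MB
print(f"5-year archive: {archive_size_tb(GB_PER_DAY, 5):.1f} TB")               # ~71.3 TB
```

At roughly 25 MB per exam and ~71 TB over five years, the split between fast online storage (NAS) and slower backup tiers (DLT) described above is the natural design choice.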
Kane, Van R.; North, Malcolm P.; Lutz, James A.; Churchill, Derek J.; Roberts, Susan L.; Smith, Douglas F.; McGaughey, Robert J.; Kane, Jonathan T.; Brooks, Matthew L.
Mosaics of tree clumps and openings are characteristic of forests dominated by frequent, low- and moderate-severity fires. When restoring these fire-suppressed forests, managers often try to reproduce these structures to increase ecosystem resilience. We examined unburned and burned forest structures for 1937 0.81 ha sample areas in Yosemite National Park, USA. We estimated severity for fires from 1984 to 2010 using the Landsat-derived Relativized differenced Normalized Burn Ratio (RdNBR) and measured openings and canopy clumps in five height strata using airborne LiDAR data. Because our study area lacked concurrent field data, we identified methods to allow structural analysis using LiDAR data alone. We found three spatial structures, canopy-gap, clump-open, and open, that differed in spatial arrangement and proportion of canopy and openings. As fire severity increased, the total area in canopy decreased while the number of clumps increased, creating a patchwork of openings and multistory tree clumps. The presence of openings > 0.3 ha, an approximate minimum gap size needed to favor shade-intolerant pine regeneration, increased rapidly with loss of canopy area. The range and variation of structures for a given fire severity were specific to each forest type. Low- to moderate-severity fires best replicated the historic clump-opening patterns that were common in forests with frequent fire regimes. Our results suggest that managers consider the following goals for their forest restoration: 1) reduce total canopy cover by breaking up large contiguous areas into variable-sized tree clumps and scattered large individual trees; 2) create a range of opening sizes and shapes, including ~ 50% of the open area in gaps > 0.3 ha; 3) create multistory clumps in addition to single story clumps; 4) retain historic densities of large trees; and 5) vary treatments to include canopy-gap, clump-open, and open mosaics across project areas to mimic the range of patterns found for each
Han, Jung Mi; Boo, Eun Hee; Kim, Jung A; Yoon, Soo Jin; Kim, Seong Woo
Objectives This study evaluated the qualitative and quantitative performances of the newly developed information system which was implemented on November 4, 2011 at the National Health Insurance Corporation Ilsan Hospital. Methods Registration waiting time and changes in the satisfaction scores for the key performance indicators (KPI) before and after the introduction of the system were compared; and the economic effects of the system were analyzed by using the information economics approach. Results After the introduction of the system, the waiting time for registration was reduced by 20%, and the waiting time at the internal medicine department was reduced by 15%. The benefit-to-cost ratio was increased to 1.34 when all intangible benefits were included in the economic analysis. Conclusions The economic impact and target satisfaction rates increased due to the introduction of the new system. The results were proven by the quantitative and qualitative analyses carried out in this study. This study was conducted only seven months after the introduction of the system. As such, a follow-up study should be carried out in the future when the system stabilizes. PMID:23115744
Naidoo, L.; Cho, M. A.; Mathieu, R.; Asner, G.
The accurate classification and mapping of individual trees at species level in the savanna ecosystem can provide numerous benefits for the managerial authorities. Such benefits include the mapping of economically useful tree species, which are a key source of food production and fuel wood for the local communities, and of problematic alien invasive and bush encroaching species, which can threaten the integrity of the environment and livelihoods of the local communities. Species-level mapping is particularly challenging in African savannas, which are complex, heterogeneous, and open environments with high intra-species spectral variability due to differences in geology, topography, rainfall, herbivory and human impacts within relatively short distances. Savanna vegetation is also highly irregular in canopy and crown shape, height and other structural dimensions, with a combination of open grassland patches and dense woody thicket - a stark contrast to the more homogeneous forest vegetation. This study classified eight common savanna tree species in the Greater Kruger National Park region, South Africa, using a combination of hyperspectral and Light Detection and Ranging (LiDAR)-derived structural parameters, in the form of seven predictor datasets, in an automated Random Forest modelling approach. The most important predictors, which were found to play an important role in the different classification models and contributed to the success of the hybrid dataset model when combined, were tree species height; NDVI; the chlorophyll b wavelength (466 nm); and a selection of raw, continuum-removed and Spectral Angle Mapper (SAM) bands. It was also concluded that the hybrid predictor dataset Random Forest model yielded the highest classification accuracy and prediction success for the eight savanna tree species, with an overall classification accuracy of 87.68% and a KHAT value of 0.843.
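The KHAT statistic reported above is Cohen's kappa, computed directly from a classification confusion matrix. A minimal sketch of that computation (the 2×2 matrix below is a hypothetical illustration, not the study's eight-species matrix):

```python
# Cohen's kappa (the "KHAT" statistic in remote-sensing accuracy assessment):
# agreement between classified and reference labels, corrected for chance.
def kappa(cm):
    """cm[i][j] = count of samples with reference class i classified as j."""
    n = sum(sum(row) for row in cm)
    po = sum(cm[i][i] for i in range(len(cm))) / n          # observed agreement
    pe = sum(                                                # chance agreement
        (sum(cm[i]) / n) * (sum(row[i] for row in cm) / n)
        for i in range(len(cm))
    )
    return (po - pe) / (1 - pe)

cm = [[45, 5],     # hypothetical two-class confusion matrix
      [10, 40]]
print(round(kappa(cm), 3))   # → 0.7
```

Here overall accuracy is 85% but kappa is only 0.7, which is why studies such as this one report both figures.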
University hospitals, bringing together the three divisions of education, research, and clinical medicine, could be said to represent the pinnacle of medicine. However, when compared with physicians working at public and private hospitals, physicians working at university hospitals and medical schools face extremely poor conditions. This is because physicians at national university hospitals are considered to be "educators." Meanwhile, even after the privatization of national hospitals, physicians working for these institutions continue to be perceived as "medical practitioners." A situation may arise in which physicians working at university hospitals-performing top-level medical work while also being involved with university and postgraduate education, as well as research-might leave their posts because they are unable to live on their current salaries, especially in comparison with physicians working at national hospitals, who focus solely on medical care. This situation would be a great loss for Japan. This potential loss can be prevented by amending the classification of physicians at national university hospitals from "educators" to "medical practitioners." In order to accomplish this, the Japan Medical Association, upon increasing its membership and achieving growth, should act as a mediator in negotiations between national university hospitals, medical schools, and the government. PMID:25842820
Griffith, John R; Pattullo, Andrew; Alexander, Jeffrey A; Jelinek, Richard C; Foster, David A
Nine standardized measures compiled from Medicare data show trends in the safety, quality, financial management, and efficiency for more than 2,500 community hospitals over five years ending in 2003. Although much public attention has been given to hospital performance, along with exhortations to improve, few measures show substantial positive trends, either in variance reduction or overall improvement. The authors conclude that environmental forces are not stimulating improvement and that the overall picture is one of randomness rather than management. PMID:17184003
Abortion is the foremost moral issue for the 626 Catholic hospitals nationwide, since church teachings prohibit the performance of elective abortions. This, and the fact that Catholic hospitals cannot perform voluntary sterilizations, can hinder their ability to obtain managed care contracts. In some cases a hospital will not join a network because abortions and sterilizations are performed in other hospitals in the network. In other cases they have participated in plans where abortions are performed in other contract facilities; this does not violate Catholic church policy since the abortions are not performed in their facility. When a Catholic and a secular hospital plan a merger, Catholic ideals seem to take precedence. A Catholic hospital in Philadelphia that went bankrupt was turned over to investors; although under no obligation to follow the Catholic church's directives, it did not perform abortions anyway. In Washington state, merger talks are under way between a secular facility and the Franciscan Health System. The cessation of abortion and sterilization services appears to be outweighed by the financial benefits; besides, these procedures can be performed through other providers in the area. In Michigan, similar merger talks may fail because of the abortion issue. The government justice system is investigating and is likely to challenge any merger there. PMID:10294510
Kozhimannil, Katy B.; Arcaya, Mariana C.; Subramanian, S. V.
Background Cesarean delivery is the most common inpatient surgery in the United States, where 1.3 million cesarean sections occur annually, and rates vary widely by hospital. Identifying sources of variation in cesarean use is crucial to improving the consistency and quality of obstetric care. We used hospital discharge records to examine the extent to which variability in the likelihood of cesarean section across US hospitals was attributable to individual women's clinical diagnoses. Methods and Findings Using data from the 2009 and 2010 Nationwide Inpatient Sample from the Healthcare Cost and Utilization Project—a 20% sample of US hospitals—we analyzed data for 1,475,457 births in 1,373 hospitals. We fitted multilevel logistic regression models (patients nested in hospitals). The outcome was cesarean (versus vaginal) delivery. Covariates included diagnosis of diabetes in pregnancy, hypertension in pregnancy, hemorrhage during pregnancy or placental complications, fetal distress, and fetal disproportion or obstructed labor; maternal age, race/ethnicity, and insurance status; and hospital size and location/teaching status. The cesarean section prevalence was 22.0% (95% confidence interval 22.0% to 22.1%) among women with no prior cesareans. In unadjusted models, the between-hospital variation in the individual risk of primary cesarean section was 0.14 (95% credible interval 0.12 to 0.15). The difference in the probability of having a cesarean delivery between hospitals was 25 percentage points. Hospital variability did not decrease after adjusting for patient diagnoses, socio-demographics, and hospital characteristics (0.16 [95% credible interval 0.14 to 0.18]). A limitation is that these data, while nationally representative, did not contain information on parity or gestational age. Conclusions Variability across hospitals in the individual risk of cesarean section is not decreased by accounting for differences in maternal diagnoses. These findings highlight
Kimata, Yoshihiro; Matsumoto, Hiroshi; Sugiyama, Narusi; Onoda, Satoshi; Sakuraba, Minoru
The risk of surgical site infection (SSI) remains high after major reconstructive surgery of the head and neck. Clinical data regarding SSI in microsurgical tongue reconstruction at the National Cancer Hospital in Japan are described, including discussions of unfavorable representative cases, the relationship between SSI and preoperative irradiation at Okayama University Hospital in Japan, and strategies for SSI control in head and neck reconstruction. Local complications are inevitable in patients undergoing reconstruction in the head and neck area. The frequency of major complications can be decreased, and late postoperative complications can be prevented, with appropriate methods. PMID:27601396
Greene, M Todd; Saint, Sanjay
Infection prevention practices vary across U.S. hospitals. Although the importance of leadership in infection prevention has been described, little is known about how followership influences such efforts. Our national survey found that hospitals with truly exemplary followers in infection control roles may be more likely to use recommended prevention practices. PMID:26698669
Feliciano, E. A.; Wdowinski, S.; Potts, M. D.; Fatoyinbo, T. E.; Lee, S. K.
The coastal mangroves forests of Everglades National Park (ENP) are well protected from development. Nevertheless, climate change, hurricanes and other anthropogenic disturbances have affected these intertidal ecosystems. Understanding and monitoring forest structural parameters such as canopy height and above-ground biomass (AGB) are important for the establishment of an historical database for past, present and future ecosystem comparison. Forest canopy height has a well understood and directly proportional correlation with AGB. It is possible to derive it using (1) airborne LiDAR/Laser Scanning (ALS) or (2) space-borne radar systems such as Shuttle Radar Topography Mission (SRTM) and TanDEM-X (TDX). A previous study of the mangrove canopy height and AGB in the ENP was conducted a decade ago based on ALS data acquired in 2004 in conjunction with SRTM data, which were acquired in 2000 (Simard et al. 2006). In this study we estimated canopy height and AGB using an ALS dataset acquired in 2012 and TDX data acquired during the years 2012-2014. The ALS dataset was acquired along a 16.5 x 1.5 km swath of mangrove forest with variable canopy height. The sampled areas were representative of mangrove stature and structure in the whole ENP. Analysis of the ALS dataset showed that mangrove canopy height can reach up to ~25 meters close to the coastal ENP waters. Additionally, by comparing our ALS results with those of a previous study by Simard et al. (2006) we identified areas where mangrove height changes greater than ± 3 meters occurred. To expand the study area to the full ENP mangrove ecosystem we processed single-polarization TDX data to obtain a Digital Canopy Model (DCM) that represents the mangrove canopy height. In order to obtain the true canopy height we calibrated the TDX phase center height with ALS true canopy height. Preliminary results of a corrected single-polarized (HH) TDX scene show that mangrove canopy height can reach up to ~25 meters in the western
McFarland, Daniel C; Shen, Megan Johnson; Holcombe, Randall F
Satisfactory pain management of hospitalized patients remains a national unmet need for the United States. Although prior research indicates that inpatient pain management may be improving nationally, not all populations of patients rate pain management as equally satisfactory. County-level predictors, such as demographics and population density, and hospital-level predictors (eg, hospital-bed number) are understudied determinants of pain management patient satisfaction. We created a multivariate regression model of pain management patient satisfaction scores as indicated by Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey results based on county- and hospital-level predictors. Number of hospital beds (β = -0.16), percent foreign-born (β = -0.16), and population density (β = -0.08) most strongly predicted unfavorable ratings, whereas African American (β = 0.23), white (β = 0.23), and younger population (β = 0.08) most strongly predicted favorable ratings. Greater attention should be placed on pain management in larger hospitals that serve foreign-born patients in population-dense areas. Journal of Hospital Medicine 2016;11:498-501. © 2016 Society of Hospital Medicine. PMID:26970075
Kim, Tae Hyun; Thompson, Jon M
Effective leadership in hospitals is widely recognized as the key to organizational performance. Clinical, financial, and operational performance is increasingly being linked to the leadership practices of hospital managers. Moreover, effective leadership has been described as a means to achieve competitive advantage. Recent environmental forces, including reimbursement changes and increased competition, have prompted many hospitals to focus on building leadership competencies to successfully address these challenges. Using the resource dependence theory as our conceptual framework, we present results from a national study of hospitals examining the association of organizational and market factors with the provision of leadership development program activities, including the presence of a leadership development program, a diversity plan, a program for succession planning, and career development resources. The data are taken from the American Hospital Association's (AHA) 2008 Survey of Hospitals, the Area Resource File, and the Centers for Medicare & Medicaid Services. The results of multilevel logistic regressions of each leadership development program activity on organizational and market factors indicate that hospital size, system and network affiliation, and accreditation are significantly and positively associated with all leadership development program activities. The market factors significantly associated with all leadership development activities include a positive odds ratio for metropolitan statistical area location and a negative odds ratio for the percentage of the hospital's service area population that is female and minority. For-profit hospitals are less likely to provide leadership development program activities. Additional findings are presented, and the implications for hospital management are discussed. PMID:22530292
Drees, Laurie Meijer
First Nations' perspectives on health and health care as delivered by doctors, nurses, and Canada's former Indian hospital system form a significant part of Canada's medical history, as well as a part of First Nations people's personal histories. Oral histories collected in Alberta and British Columbia suggest that First Nations people who experienced the Nanaimo and Charles Camsell Indian hospitals between 1945 and 1965 perceive the value of their experiences to be reflected in their survivance, a concept recalled through narratives emphasizing both humour and pain, as well as past and present personal resilience. PMID:21114087
Murray, Sara G.; Schmajuk, Gabriela; Trupin, Laura; Gensler, Lianne; Katz, Patricia P.; Yelin, Edward H.; Gansky, Stuart A.; Yazdany, Jinoos
Objective Infection is a leading cause of morbidity and mortality in systemic lupus erythematosus (SLE). Therapeutic practices have evolved over the past 15 years, but effects on infectious complications of SLE are unknown. We evaluated trends in hospitalizations for severe and opportunistic infections in a population-based SLE study. Methods Data derive from the 2000 to 2011 United States National Inpatient Sample, including individuals who met a validated administrative definition of SLE. Primary outcomes were diagnoses of bacteremia, pneumonia, opportunistic fungal infection, herpes zoster, cytomegalovirus, or pneumocystis pneumonia (PCP). We used Poisson regression to determine whether infection rates were changing in SLE hospitalizations and used predictive marginals to generate annual adjusted rates of specific infections. Results We identified 361,337 SLE hospitalizations from 2000 to 2011 meeting study inclusion criteria. Compared to non-SLE hospitalizations, SLE patients were younger (51 vs. 62 years), predominantly female (89% vs. 54%), and more likely to be racial/ethnic minorities. SLE diagnosis was significantly associated with all measured severe and opportunistic infections. From 2000 to 2011, adjusted SLE hospitalization rates for herpes zoster increased more than non-SLE rates: 54 to 79 per 10,000 SLE hospitalizations compared with 24 to 29 per 10,000 non-SLE hospitalizations. Conversely, SLE hospitalizations for PCP disproportionately decreased: 5.1 to 2.5 per 10,000 SLE hospitalizations compared with 0.9 to 1.3 per 10,000 non-SLE hospitalizations. Conclusions Among patients with SLE, herpes zoster hospitalizations are rising while PCP hospitalizations are declining. These trends likely reflect evolving SLE treatment strategies. Further research is needed to identify patients at greatest risk for infectious complications. PMID:26731012
Cowling, Thomas E; Harris, Matthew; Watt, Hilary; Soljak, Michael; Richards, Emma; Gunning, Elinor; Bottle, Alex; Macinko, James; Majeed, Azeem
Background The UK government is pursuing policies to improve primary care access, as many patients visit accident and emergency (A and E) departments after being unable to get suitable general practice appointments. Direct admission to hospital via a general practitioner (GP) averts A and E use, and may reduce total hospital costs. It could also enhance the continuity of information between GPs and hospital doctors, possibly improving healthcare outcomes. Objective To determine whether primary care access is associated with the route of emergency admission—via a GP versus via an A and E department. Methods Retrospective analysis of national administrative data from English hospitals for 2011–2012. Adults admitted in an emergency (unscheduled) for ≥1 night via a GP or an A and E department formed the study population. The measure of primary care access—the percentage of patients able to get a general practice appointment on their last attempt—was derived from a large, nationally representative patient survey. Multilevel logistic regression was used to estimate associations, adjusting for patient and admission characteristics. Results The analysis included 2 322 112 emergency admissions (81.9% via an A and E department). With a 5 unit increase in the percentage of patients able to get a general practice appointment on their last attempt, the adjusted odds of GP admission (vs A and E admission) was estimated to increase by 15% (OR 1.15, 95% CI 1.12 to 1.17). The probability of GP admission if ≥95% of appointment attempts were successful in each general practice was estimated to be 19.6%. This probability reduced to 13.6% when <80% of appointment attempts were successful. This equates to 139 673 fewer GP admissions (456 232 vs 316 559) assuming no change in the total number of admissions. Associations were consistent in direction across geographical regions of England. Conclusions Among hospital inpatients admitted as an emergency, patients
Kim, JongHyo; Yeon, Kyoung M.; Han, Man Chung; Lee, Dong Hyuk; Cho, Han I.
The SNUH has started a PACS project with three main goals: to develop a fully hospital-integrated PACS, to develop a cost-effective PACS using an open systems architecture, and to extend the PACS' role to advanced applications such as image-guided surgery and multimedia-assisted education and research. In order to achieve these goals, we have designed a PACS architecture which takes advantage of client-server computing, a high-speed communication network, the computing power of up-to-date high-end PCs, and advanced image compression methods. We have installed an ATM-based communication network in the radiology department and in-patient wards, and implemented DICOM-compliant acquisition modules, image storage and management servers, and high-resolution display workstations based on high-end PCs and the Microsoft Windows 95 and Windows NT operating systems. The SNUH PACS is in partial-scale operation now, and will be expanded to full scale by the end of 1998.
McNatt, Zahirah; Linnander, Erika; Endeshaw, Abraham; Tatek, Dawit; Conteh, David
Abstract Many countries struggle to develop and implement strategies to monitor hospitals nationally. The challenge is particularly acute in low-income countries where resources for measurement and reporting are scarce. We examined the experience of developing and implementing a national system for monitoring the performance of 130 government hospitals in Ethiopia. Using participatory observation, we found that the monitoring system resulted in more consistent hospital reporting of performance data to regional health bureaus and the federal government, increased transparency about hospital performance and the development of multiple quality-improvement projects. The development and implementation of the system, which required technical and political investment and support, would not have been possible without strong hospital-level management capacity. Thorough assessment of the health sector’s readiness to change and desire to prioritize hospital quality can be helpful in the early stages of design and implementation. This assessment may include interviews with key informants, collection of data about health facilities and human resources and discussion with academic partners. Aligning partners and donors with the government’s vision for quality improvement can enhance acceptability and political support. Such alignment can enable resources to be focused strategically towards one national effort – rather than be diluted across dozens of potentially competing projects. Initial stages benefit from having modest goals and the flexibility for continuous modification and improvement, through active engagement with all stakeholders. PMID:26600614
This study aims to explore the influence of national cultural differences on nurses' perceptions of their acceptance of hospital information systems. This study uses the perspective of Technology Acceptance Model; national cultural differences in terms of masculinity/femininity, individualism/collectivism, power distance, and uncertainty avoidance are incorporated into the Technology Acceptance Model as moderators, whereas time orientation is a control variable on hospital information system acceptance. A quantitative research design was used in this study; 261 participants, US and Taiwan RNs, all had hospital information system experience. Data were collected from November 2013 to February 2014 and analyzed using a t test to compare the coefficients for each moderator. The results show that individualism/collectivism, power distance, and uncertainty avoidance all exhibit significant difference on hospital information system acceptance; however, both masculinity/femininity and time orientation factors did not show significance. This study verifies that national cultural differences have significant influence on nurses' behavioral intention to use hospital information systems. Therefore, hospital information system providers should emphasize the way in which to integrate different technological functions to meet the needs of nurses from various cultural backgrounds. PMID:25899441
Kim, Jinkyung; Han, Woosok
Objectives To investigate predictors for specific dimensions of service quality perceived by hospital employees in long-term care hospitals. Methods Data collected from a survey of 298 hospital employees in 18 long-term care hospitals were analysed. Multivariate ordinary least squares regression analysis with hospital fixed effects was used to determine the predictors of service quality using respondents’ and organizational characteristics. Results The most significant predictors of employee-perceived service quality were job satisfaction and degree of consent on national evaluation criteria. National evaluation results on long-term care hospitals and work environment also had positive effects on service quality. Conclusion The findings of the study show that organizational characteristics are significant determinants of service quality in long-term care hospitals. Assessment of the extent to which hospitals address factors related to employee-perceived quality of services could be the first step in quality improvement activities. Results have implications for efforts to improve service quality in long-term care hospitals and designing more comprehensive national evaluation criteria. PMID:24159497
Ma, Yan; Mi, Fengling; Liu, Yuhong; Li, Liang
Background China is transitioning towards concentrating tuberculosis (TB) diagnostic and treatment services in hospitals, while the Centers of Disease Control and Prevention (CDC) system will retain important public health functions. Patient expenditure incurred through hospitalization may lead to barriers to TB care or interruption of treatment. Methodology/Principal Findings We conducted a national survey of TB specialized hospitals to determine hospitalization fees and hospital bed utilization in 1999, 2004, and 2009. Hospitalization of TB patients increased 185.3% from 1999 to 2009. While the average hospitalization fees also increased, the proportion of those fees in relation to GDP per capita decreased. Hospitalization fees differed across the three regions (eastern, central, and western). Using a least significant difference (LSD) paired analysis, in 2004, the difference in hospitalization fees was significant when comparing eastern and central provinces (p<0.001) as well as to western provinces (p<0.001). In 2009, the difference remained statistically significant when comparing eastern province hospitalization fees with central provinces (p<0.001) and western provinces (p = 0.008). In 2004 and 2009, the cost associated with hospitalization as a proportion of GDP per capita was highest in the western region. The average in-patient stay decreased from 33 days in 1999 to 26 and 27 days in 2004 and 2009 respectively. Finally, hospital bed utilization in all three regions increased over this period. Conclusions/Significance Our findings show that both the total number of in-patients and hospitalization fees increased from 1999 to 2009, though the proportion of hospitalization fees to GDP per capita decreased. As diagnostic services move to hospitals, regulatory and monitoring mechanisms should be established, and hospitals should make use of the experience garnered by the CDC system through continued strong collaborations. Infrastructure and social protection
Zhou, T.; Popescu, S. C.; Krause, K.
Successful classification of tree species with waveform LiDAR data would be of considerable value for estimating biomass stocks and changes in forests. Current approaches emphasize converting the full waveform data into discrete points to obtain a larger number of parameters and identify tree species using several discrete-point variables. However, this ignores intensity values and waveform shapes, which convey important structural characteristics. The overall goal of this study was to employ the intensity and waveform shape of individual trees as the waveform signature to detect tree species. The data were acquired by the National Ecological Observatory Network (NEON) within a 250*250 m study area located in the San Joaquin Experimental Range. Specific objectives were to: (1) segment individual trees using the smoothed canopy height model (CHM) derived from discrete LiDAR points; (2) link waveform LiDAR with the above individual tree boundaries to derive sample signatures of three tree species and use these signatures to discriminate tree species over a large area; and (3) compare tree species detection results from discrete LiDAR data and waveform LiDAR data. An overall accuracy of more than 80% was obtained for the segmented individual trees. The preliminary results show that, compared with the discrete LiDAR data, the waveform LiDAR signature has a higher potential for accurate tree species classification.
Linnander, Erika; McNatt, Zahirah; Sipsma, Heather; Tatek, Dawit; Abebe, Yigeremu; Endeshaw, Abraham; Bradley, Elizabeth H.
Background Quality improvement collaboratives are a widely used mechanism to improve hospital performance in high-income settings, but we lack evidence about their effectiveness in low-income settings. Methods We conducted cross-sectional and longitudinal analysis of data from the Ethiopian Hospital Alliance for Quality, a national collaborative sponsored by Ethiopia's Federal Ministry of Health. We identified hospital strategies associated with more positive patient satisfaction using linear regression and assessed changes in patient experience over a 3-year period (2012–2014) using matched t-tests. Results A total of 68 hospitals (response rate 68/120, 56.7%) were included in cross-sectional analysis. Four practices were significantly associated with more positive patient satisfaction (p<0.05): posting a record of cleaning activity in toilets and in patient wards, distributing leaflets in the local language with each prescription, and sharing ideas about patient experience across the hospital. Among hospitals that had complete data for longitudinal analysis (44/68, 65%), we found a 10% improvement in a 10-point measure of patient satisfaction (7.7 vs 8.4, p<0.01) from the start to the end of the study period. Conclusions Quality improvement collaboratives can be useful at scale in low-income settings in sub-Saharan Africa, particularly for hospitals that adopt strategies associated with patient satisfaction. PMID:26796023
... 42 Public Health 5 2013-10-01 2013-10-01 false Other national accreditation programs for hospitals and other providers and suppliers. 488.6 Section 488.6 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION SURVEY, CERTIFICATION, AND ENFORCEMENT PROCEDURES...
... 42 Public Health 5 2012-10-01 2012-10-01 false Other national accreditation programs for hospitals and other providers and suppliers. 488.6 Section 488.6 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION SURVEY, CERTIFICATION, AND ENFORCEMENT PROCEDURES...
Background International clinical trials are now rapidly expanding into Asia. However, the proportion of global trials is higher in South Korea than in Japan despite the implementation of similar governmental support in both countries. The difference in clinical trial environment might influence the respective physicians’ attitudes towards and experience with clinical trials. Therefore, we designed a questionnaire to explore how physicians perceive the issues surrounding clinical trials in both countries. Methods A questionnaire survey was conducted at Kyoto University Hospital (KUHP) and Seoul National University Hospital (SNUH) in 2008. The questionnaire consisted of 15 questions and 2 open-ended questions on broad key issues relating to clinical trials. Results The number of responders was 301 at KUHP and 398 at SNUH. Doctors with trial experience numbered 196 at KUHP and 150 at SNUH. Among them, 12% (24/196) at KUHP and 41% (61/150) at SNUH had global trial experience. Most respondents at both institutions viewed clinical trials favorably and thought that conducting clinical trials contributed to medical advances, which would ultimately lead to new and better treatments. The main hindrance raised to conducting clinical trials was the lack of personnel support and time. Doctors at both university hospitals thought that more clinical research coordinators were required to conduct clinical trials more efficiently. KUHP doctors were driven mainly by pure academic interest or by their desire to find new treatments, while obtaining credits for board certification and co-authorship on manuscripts also served as motivating factors for doctors at SNUH. Conclusions Our results revealed that there might be two different approaches to increasing clinical trial activity. One is a social-level approach to establish clinical trial infrastructure providing sufficient clinical research professionals. The other is an individual-level approach that would provide incentives to
Ogasawara, Shu; Tsutaya, Shoji; Akimoto, Hiroyuki; Kojima, Keiya; Yabaka, Hiroyuki
Skills and knowledge regarding many different types of test are required for medical technologists (MTs) to provide accurate information to help doctors and other medical specialists. In order to become an efficient MT, specialized training programs are required. Certification in specialized areas of clinical laboratory sciences or a doctoral degree in medical sciences may help MTs realize career advancement and a higher earning potential, and expand their career options. However, most young MTs in national university hospitals are employed as part-time workers on a three-year contract, which is too short to obtain certifications or a doctoral degree. We have to leave the hospital without having expanded our prospects. We need to take control of our own development in order to enhance our employability within that period. As teaching and training hospitals, national university hospitals in Japan face a difficult dilemma in nurturing MTs. I hope, as a novice medical technologist, that at least university hospitals in Japan create an appropriate workplace environment for novice MTs. PMID:23427696
Adetiba, E; Eleanya, M; Fatumo, S A; Matthews, V O
Health information represents the main basis for the health decision-making process, and there have been some efforts to increase access to health information in developing countries. However, most of these efforts are based on the internet, which has minimal penetration, especially in the rural and sub-urban parts of developing countries. In this work, a platform for medical record acquisition via the ubiquitous 2.5G/3G wireless communications technologies is presented. The National Hospital Management Portal (NHMP) platform has a central database at each country's national hospital which can be updated and accessed from hosts at health centres, clinics, medical laboratories, teaching hospitals, private hospitals and specialist hospitals across the country. With this, doctors can access patients' medical records more easily, get immediate access to test results from laboratories, and deliver prescriptions directly to pharmacists. If a particular treatment can be provided to a patient more effectively in another country, NHMP makes it simpler to organise and carry out such treatment abroad. PMID:20643641
Himmelstein, David U; Jun, Miraya; Busse, Reinhard; Chevreul, Karine; Geissler, Alexander; Jeurissen, Patrick; Thomson, Sarah; Vinet, Marie-Amelie; Woolhandler, Steffie
A few studies have noted the outsize administrative costs of US hospitals, but no research has compared these costs across multiple nations with various types of health care systems. We assembled a team of international health policy experts to conduct just such a challenging analysis of hospital administrative costs across eight nations: Canada, England, Scotland, Wales, France, Germany, the Netherlands, and the United States. We found that administrative costs accounted for 25.3 percent of total US hospital expenditures--a percentage that is increasing. Next highest were the Netherlands (19.8 percent) and England (15.5 percent), both of which are transitioning to market-oriented payment systems. Scotland and Canada, whose single-payer systems pay hospitals global operating budgets, with separate grants for capital, had the lowest administrative costs. Costs were intermediate in France and Germany (which bill per patient but pay separately for capital projects) and in Wales. Reducing US per capita spending for hospital administration to Scottish or Canadian levels would have saved more than $150 billion in 2011. This study suggests that the reduction of US administrative costs would best be accomplished through the use of a simpler and less market-oriented payment scheme. PMID:25201663
Ron, Pnina; Shamai, Michal
The main goal of this study was to explore the connections between the exposure of nurses in Israel to national terror and the levels of distress experienced due to ongoing terror attacks. The data were collected from 214 nurses from various parts of Israel who work in three types of health services (mainly hospital departments) and provide help to victims of terror. The nurses reported very high levels of burnout, high levels of stress and medium-to-high levels of intrusive memories. Levels of exposure were associated with burnout, intrusive memories and level of stress. More professional attention should be given to hospital nurses who provide care for trauma patients. PMID:23982180
Kim, Dong Gyu; Park, Chul-Kee; Paek, Sun Ha; Kim, Jeong Eun; Kim, Chi Heon; Phi, Ji Hoon
Established in 1957, the Department of Neurosurgery at Seoul National University College of Medicine is the one of the oldest neurosurgical departments in Korea. The seven past Chairmen (Bo Sung Sim, Kil Soo Choi, Dae Hee Han, Byung-Kyu Cho, Hyun Jib Kim, Hee-Won Jung, and Dong Gyu Kim) have devoted themselves to the development of the department. The current chair, Chun Kee Chung, assumed the position in July 2010. The current department comprises several clinical programs that encompass the entire spectrum of neurosurgical disorders, with 29 specialized faculty members and care teams in three hospitals: Seoul National University Hospital (SNUH), Boramae Medical Center (BMC), and Seoul National University Bundang Hospital (SNUBH). The remarkable growth of the department during the last half century made it possible to perform 5,666 operations (3,299 at SNUH, 411 at BMC and 1,860 at SNUBH) during 2009. A total of 1,201 articles authored by faculty members were published in scientific journals between 1958 and 2009, approximately 32% of which were published in international journals. The department is regarded as the "Mecca" of neurosurgery in Korea because of its outstanding achievement and the many distinguished alumni with leadership roles in the academic field. This article traces the clinical, academic, and scientific development of the department, its present activities, and its future direction. PMID:21600472
Sawe, Hendry R; Mfinanga, Juma A; Ringo, Faith H; Mwafongo, Victor; Reynolds, Teri A; Runyon, Michael S
Objectives To describe the HIV counselling and testing practices for children presenting to an emergency department (ED) in a low-income country. Setting The ED of a large east African national referral hospital. Participants This retrospective review of all paediatric (<18 years old) ED visits in 2012 enrolled patients who had an HIV test ordered and excluded those without testing. Files were available for 5540/5774 (96%) eligible patients and 1632 (30%) were tested for HIV, median age 1.3 years (IQR 9 months to 4 years), 58% <18 months old and 61% male. Primary and secondary outcome measures The primary outcome measure was documentation of pretest and post-test counselling, or deferral of counselling, for children tested for HIV in the ED. Secondary measures included the overall rate of HIV testing, the rate of counselling documented in the inpatient record when deferred in the ED, the rate of counselling documented when testing was initiated by the inpatient service, the rate of counselling documented by test result (positive vs negative) and the rate of referral to follow-up HIV care among patients testing positive. Results Of 418 patients tested in the ED, counselling, or deferral of counselling, was documented for 70 (17%). When deferred to the ward, subsequent counselling was documented for 15/42 (36%). Counselling was documented for 33% of patients testing positive versus 1.1% of patients testing negative (OR 43, 95% CI 23 to 83). Of 199 patients who tested positive and survived to hospital discharge, 76 (38%) were referred for follow-up at the HIV clinic on discharge. Conclusions Physicians documented the provision, or deferral, of counselling for <20% of children tested for HIV in the ED. Counselling was much more likely to be documented when the test result was positive. Less than 40% of those testing positive were referred for follow-up care. PMID:26880672
Kendi, Sadiqa; Zonfrillo, Mark R; Seaver Hill, Karen; Arbogast, Kristy B; Gittelman, Michael A
Objective To describe the location, staffing, clientele, safety product disbursement patterns, education provided and sustainability of safety resource centres (SRCs) in US children's hospitals. Methods A cross-sectional survey was distributed to children's hospital-based SRC directors. Survey categories included: funding sources, customer base, items sold, items given free of charge, education provided and directors’ needs. Results 32/38 (84.2%) SRC sites (affiliated with 30 hospitals) completed the survey. SRCs were in many hospital locations including lobby (28.1%), family resource centres (12.5%), gift shop/retail space (18.8%), mobile units (18.8%) and patient clinics (12.5%). 19% of respondents reported that their SRC was financially self-sustainable. Sales to patients predominated (mean of 44%); however, hospital employees made up a mean of 20% (range 0–60%) of sales. 78.1% of SRCs had products for children with special healthcare needs. Documentation kept at SRC sites included items purchased (96.9%), items given free of charge (65.6%) and customer demographics (50%). 56.3% of SRCs provided formal injury prevention education classes. The SRCs’ directors’ most important needs were finances (46.9%), staffing (50%) and space (46.9%). All of the directors were ‘somewhat interested’ or ‘very interested’ in each of the following: creation of a common SRC listserv, national SRC data bank and multisite SRC research platform. Conclusions SRCs are located in many US children's hospitals, and can be characterised as heterogeneous in location, products sold, data kept and ability to be financially sustained. Further research is needed to determine best practices for SRCs to maximise their impact on injury prevention. PMID:24667383
Boyle, Diane K.; Cramer, Emily; Potter, Catima; Staggs, Vincent S.
Background Researchers have studied inpatient falls in relation to aspects of nurse staffing, focusing primarily on staffing levels and proportion of nursing care hours provided by registered nurses (RNs). Less attention has been paid to other nursing characteristics, such as RN national nursing specialty certification. Objective The aim of the study was to examine the relationship over time between changes in RN national nursing specialty certification rates and changes in total patient fall rates at the patient care unit level. Methods We used longitudinal data with standardized variable definitions across sites from the National Database of Nursing Quality Indicators. The sample consisted of 7,583 units in 903 hospitals. Relationships over time were examined using multilevel (units nested in hospitals) latent growth curve modeling. Results The model indices indicated a good fit of the data to the model. At the unit level, there was a small statistically significant inverse relationship (r = −.08, p = .04) between RN national nursing specialty certification rates and total fall rates; increases in specialty certification rates over time tended to be associated with improvements in total fall rates over time. Discussion Our findings may be supportive of promoting national nursing specialty certification as a means of improving patient safety. Future study recommendations are (a) modeling organizational leadership, culture, and climate as mediating variables between national specialty certification rates and patient outcomes and (b) investigating the association of patient safety and specific national nursing specialty certifications which test plans include patient safety, quality improvement, and diffusion of innovation methods in their certifying examinations. PMID:26049719
Kim, Hyo-Geon; Son, Yong-Hyun; Chung, In-Kyo
Purpose: This study examined patients with facial bone fracture visiting Pusan National University Dental Hospital to understand the trends, and to enhance appropriate care and treatment for patients with facial bone fracture. Methods: We investigated 531 patients presenting with facial bone fracture in Yangsan and 802 patients in Busan from January 2010 to December 2013. We divided the patients by year, month, gender, age, site, and cause to compare with historic data and other studies. Results: The gender ratio was 3.58:1 in Yangsan and 4.31:1 in Busan. Patients aged in their 20s had the highest number of facial bone fractures in both Yangsan and Busan. The most frequent fracture site was the mandible, and the most frequent cause was slip down in both Yangsan and Busan. Conclusion: The investigation and comparison of patients with facial bone fracture who visited Pusan National University Hospital located at Yangsan and Busan from 2010 to 2013 found a difference in the total number of patients at each hospital, but the trends were not significantly different. PMID:27489825
Ahmadi, Ali; Soori, Hamid; Mehrabi, Yadollah; Etemad, Koorosh; Sajjadi, Homeira; Sadeghi, Mehraban
Background: Given the failure to establish the statistical presuppositions for analysis of the data by conventional approaches, the hierarchical structure of the data, and the effect of higher-level variables, this study was conducted to determine the factors independently associated with hospital mortality due to myocardial infarction (MI) in Iran using a multilevel analysis. Methods: This was a national, hospital-based, cross-sectional study using the data of 20750 new MI patients between April 2012 and March 2013 in Iran. Hospital mortality due to MI was considered the dependent variable. Demographic data, clinical and behavioral risk factors at the individual level, and environmental data were gathered. Multilevel logistic regression models in Stata software were used to analyze the data. Results: Within the 1-year study period, 2511 patients (12.1%) died in hospital within 30 days of admission. The adjusted odds ratio (OR) of mortality with 95% confidence interval (CI) was 2.07 (95% CI: 1.5–2.8) for right bundle branch block, 1.5 (95% CI: 1.3–1.7) for ST-segment elevation MI, 1.3 (95% CI: 1.1–1.4) for female gender, and 1.2 (95% CI: 1.1–1.3) for humidity, all of which were considered risk factors for mortality. In contrast, precipitation (OR 0.7, 95% CI: 0.7–0.8) and angioplasty (OR 0.5, 95% CI: 0.4–0.6) were considered protective factors. Conclusions: Individual risk factors had independent effects on hospital mortality due to MI. Variables at the province level had no significant effect on the outcome of MI. Increasing access to, and the quality of, treatment could reduce mortality due to MI. PMID:26730342
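The adjusted odds ratios reported above are logistic regression coefficients exponentiated to the OR scale. As a rough illustration only (single-level rather than multilevel for brevity, with entirely synthetic data and invented effect sizes; the study itself fitted multilevel models in Stata), the following sketch fits a logistic model by Newton-Raphson and reports ORs with Wald 95% CIs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
stemi = rng.integers(0, 2, n)         # hypothetical ST-elevation MI indicator
female = rng.integers(0, 2, n)        # hypothetical female-gender indicator

# Simulate 30-day mortality from assumed log-odds (made-up values).
logit_true = -2.0 + 0.4 * stemi + 0.26 * female
died = rng.binomial(1, 1 / (1 + np.exp(-logit_true)))

# Newton-Raphson for the logistic maximum-likelihood estimate.
X = np.column_stack([np.ones(n), stemi, female])
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (died - p)
    hess = X.T @ (X * (p * (1 - p))[:, None])
    beta += np.linalg.solve(hess, grad)

se = np.sqrt(np.diag(np.linalg.inv(hess)))
or_est = np.exp(beta[1:])                       # adjusted odds ratios
ci = np.exp(np.column_stack([beta[1:] - 1.96 * se[1:],
                             beta[1:] + 1.96 * se[1:]]))  # Wald 95% CIs
```

Here exp(beta) is the OR for a one-unit change in each predictor; the province-level random effects of the study's multilevel models would sit on top of this fixed-effects core.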
Maack, Joachim; Lingenfelder, Marcus; Weinacker, Holger; Koch, Barbara
Remote sensing-based timber volume estimation is key for modelling the regional potential, accessibility and price of lignocellulosic raw material for an emerging bioeconomy. We used a unique wall-to-wall airborne LiDAR dataset and Landsat 7 satellite images in combination with terrestrial inventory data derived from the National Forest Inventory (NFI), and applied generalized additive models (GAM) to estimate spatially explicit timber distribution and volume in forested areas. Since the NFI data showed an underlying structure regarding size and ownership, we additionally constructed a socio-economic predictor to enhance the accuracy of the analysis. Furthermore, we balanced the training dataset with a bootstrap method to achieve unbiased regression weights for interpolating timber volume. Finally, we compared and discussed the model performance of the original approach (r2 = 0.56, NRMSE = 9.65%), the approach with balanced training data (r2 = 0.69, NRMSE = 12.43%) and the final approach with balanced training data and the additional socio-economic predictor (r2 = 0.72, NRMSE = 12.17%). The results demonstrate the usefulness of remote sensing techniques for mapping timber volume for a future lignocellulose-based bioeconomy.
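The class-balancing step described above can be pictured as a simple bootstrap: under-represented classes in the training data are resampled with replacement until every class contributes equally. A minimal numpy sketch under stated assumptions (the plot values, class labels, and sample sizes are invented; the actual NFI variables and GAM fitting are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical NFI-style training plots: a timber volume per plot and
# an imbalanced ownership class (many small private woods, few state forests).
volume = rng.gamma(shape=4.0, scale=60.0, size=300)
owner = np.array(["private"] * 250 + ["state"] * 50)

def balance_by_class(values, labels, rng):
    """Bootstrap every class up to the size of the largest class."""
    classes, counts = np.unique(labels, return_counts=True)
    target = counts.max()
    out_vals, out_labs = [], []
    for c in classes:
        idx = np.flatnonzero(labels == c)
        take = rng.choice(idx, size=target, replace=True)  # sample with replacement
        out_vals.append(values[take])
        out_labs.append(labels[take])
    return np.concatenate(out_vals), np.concatenate(out_labs)

bal_vol, bal_own = balance_by_class(volume, owner, rng)
```

A regression model trained on `bal_vol`/`bal_own` then weights each ownership class equally, which is the intent of the balancing step, at the cost of duplicated minority-class plots.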
Mgaya, Edward; Kazaura, Method R; Outwater, Anne; Kinabo, Lina
Suicide surveillance was launched at the Muhimbili National Hospital mortuary in Dar es Salaam Region, Tanzania from 1st January to 31st December, 2005 to determine its magnitude and characteristics. Following the WHO guidelines with minor modifications, information on sex, dates of birth and death, places of residence and death, occupation, and reasons and means of suicide was collected. There were 65 suicides (2.3 per 100,000 population) recorded in 2005. The suicide rate was 3.4/100,000 for males and 1.2/100,000 for females, which may be among the lowest rates ever reported in the world. The mean age at suicide was 32.9 (SD=13.1) years. Males were about three times as likely to commit suicide as females. The main motive behind suicide was recorded for 26 (40%) victims as family-related and for 11 (17%) as health-related. Although there was a wide range of ages at which people committed suicide, the average age seems to be very low. Since the reasons given for suicide are closely linked to family problems, strategies to improve awareness of psychological and mental health services and to provide alternative economic and social support networks are advocated. PMID:18313013
Petrie, J; Easton, S; Naik, V; Lockie, C; Brett, S J; Stümpfle, R
Objectives There is a scarcity of literature reporting hospital costs for treating out of hospital cardiac arrest (OOHCA) survivors, especially within the UK. This is essential for assessment of cost-effectiveness of interventions necessary to allow just allocation of resources within the National Health Service. We set out primarily to calculate costs stratified against hospital survival and neurological outcomes. Secondarily, we estimated cost effectiveness based on estimates of survival and utility from previous studies to calculate costs per quality adjusted life year (QALY). Setting We performed a single centre (London) retrospective review of in-hospital costs of patients admitted to the intensive care unit (ICU) following return of spontaneous circulation (ROSC) after OOHCA over 18 months from January 2011 (following widespread introduction of targeted temperature management and primary percutaneous intervention). Participants Of 69 successive patients admitted over an 18-month period, survival and cerebral performance category (CPC) outcomes were obtained from review of databases and clinical notes. The Trust finance department supplied ICU and hospital costs using the Payment by Results UK system. Results Of those patients with ROSC admitted to ICU, survival to hospital discharge (any CPC) was 33/69 (48%) with 26/33 survivors in CPC 1–2 at hospital discharge. Cost per survivor to hospital discharge (including total cost of survivors and non-survivors) was £50 000, cost per CPC 1–2 survivor was £65 000. Cost and length of stay of CPC 1–2 patients was considerably lower than CPC 3–4 patients. The majority of the costs (69%) related to intensive care. Estimated cost per CPC 1–2 survivor per QALY was £16 000. Conclusions The costs of in-hospital patient care for ICU admissions following ROSC after OOHCA are considerable but within a reasonable threshold when assessed from a QALY perspective. PMID:25838503
Altuwaijri, Majid M.; Bahanshal, Abdullah; Almehaid, Mona
Objective: The purpose of this study is to describe the needs, process, and experience of implementing a computerized physician order entry (CPOE) system in a leading healthcare organization in Saudi Arabia. Materials and Methods: The National Guard Health Affairs (NGHA) deployed the CPOE in a pilot department, the intensive care unit (ICU), in order to assess its benefits and risks and to test the system. After the CPOE was implemented in the ICU area, a survey was sent to the ICU clinicians to assess their perception of the importance of 32 critical success factors (CSFs) acquired from the literature. The project team also held several meetings to gather lessons learned from the pilot project in order to utilize them for the expansion of the project to other NGHA clinics and hospitals. Results: The results of the survey indicated that the selected CSFs, even though they were developed with regard to international settings, are very much applicable to the pilot area. The top three CSFs rated by the survey respondents were: the “before go-live” training, adequate clinical resources during implementation, and the ordering time. After the assessment of the survey and the lessons learned from the pilot project, NGHA decided that the potential benefits of the CPOE were expected to be greater than the expected risks. The project was then expanded to cover all NGHA clinics and hospitals in a phased approach. Currently, the project is in its final stages and expected to be completed by the end of 2011. Conclusion: The role of CPOE systems is very important in hospitals in order to reduce medication errors and to improve the quality of care. In spite of their great benefits, many studies suggest that a high percentage of these projects fail. In order to increase the chances of success, and because CPOE is a clinical system, NGHA implemented the system first in a pilot area in order to test the system without putting patients at risk and to
Jena, Anupam B.; Prasad, Vinay; Goldman, Dana P.; Romley, John
IMPORTANCE Thousands of physicians attend scientific meetings annually. Although hospital physician staffing and composition may be affected by meetings, patient outcomes and treatment patterns during meeting dates are unknown. OBJECTIVE To analyze mortality and treatment differences among patients admitted with acute cardiovascular conditions during dates of national cardiology meetings compared with nonmeeting dates. DESIGN, SETTING, AND PARTICIPANTS Retrospective analysis of 30-day mortality among Medicare beneficiaries hospitalized with acute myocardial infarction (AMI), heart failure, or cardiac arrest from 2002 through 2011 during dates of 2 national cardiology meetings compared with identical nonmeeting days in the 3 weeks before and after conferences (AMI, 8570 hospitalizations during 82 meeting days and 57 471 during 492 nonmeeting days; heart failure, 19 282 during meeting days and 114 591 during nonmeeting days; cardiac arrest, 1564 during meeting days and 9580 during nonmeeting days). Multivariable analyses were conducted separately for major teaching hospitals and nonteaching hospitals and for low- and high-risk patients. Differences in treatment utilization were assessed. EXPOSURES Hospitalization during cardiology meeting dates. MAIN OUTCOMES AND MEASURES Thirty-day mortality, procedure rates, charges, length of stay. RESULTS Patient characteristics were similar between meeting and nonmeeting dates. In teaching hospitals, adjusted 30-day mortality was lower among high-risk patients with heart failure or cardiac arrest admitted during meeting vs nonmeeting dates (heart failure, 17.5% [95% CI, 13.7%–21.2%] vs 24.8% [95% CI, 22.9%–26.6%]; P < .001; cardiac arrest, 59.1% [95% CI, 51.4%–66.8%] vs 69.4% [95% CI, 66.2%–72.6%]; P = .01). Adjusted mortality for high-risk AMI in teaching hospitals was similar between meeting and nonmeeting dates (39.2% [95% CI, 31.8%–46.6%] vs 38.5% [95% CI, 35.0%–42.0%]; P = .86), although adjusted percutaneous
Volpe, M; Scaldaferri, F; Ojetti, V; Poscia, A
The high demand for breath tests (BT) in many gastroenterological conditions, at a time of limited resources for health care systems, generates increased interest in cost analysis from the point of view of service delivery, to better understand how to use money to generate value. This study aims to measure the cost of the C13 urea and other widely utilized breath tests in order to describe key aspects of costs and reimbursements, looking at the economic sustainability for the hospital. A hospital-based cost analysis of the main breath tests commonly delivered in an ambulatory setting was performed. The mean salary of professional nurses and gastroenterologists, the drugs/preparations used and disposable materials, the purchase and depreciation of the instrument, and the testing time were used to estimate the cost, while reimbursements are based on the 2013 Italian National Health System ambulatory pricelist. Variables that could influence the model are considered in the sensitivity analyses. The mean costs for the C13 urea, lactulose and lactose BTs are, respectively, €30.59, €45.20 and €30.29. National reimbursement often does not cover the cost of the analysis, especially in the scenario with a lower number of exams. On the contrary, in the high-performance scenario the reimbursement could cover the cost of all tests except the C13 urea BT, which is highly influenced by the drug cost. Considerations about the differences between Italian Regional Health System ambulatory pricelists are also presented. Our analysis shows that while national reimbursement rates cover the costs of H2 breath testing, they do not sufficiently cover C13 BTs, particularly the urea breath test. The real economic strength of these non-invasive tests should be considered in the overall organization of inpatient and outpatient clinics, accounting for the complete diagnostic pathway for each gastrointestinal disease. PMID:24443075
McKay, Ailsa J; Newson, Roger B; Soljak, Michael; Riboli, Elio; Car, Josip
Objective Identification of primary care factors associated with hospital admissions for adverse drug reactions (ADRs). Design and setting Cross-sectional analysis of 2010–2012 data from all National Health Service hospitals and 7664 of 8358 general practices in England. Method We identified all hospital episodes with an International Classification of Diseases (ICD) 10 code indicative of an ADR, in the 2010–2012 English Hospital Episode Statistics (HES) admissions database. These episodes were linked to contemporary data describing the associated general practice, including general practitioner (GP) and patient demographics, an estimate of overall patient population morbidity, measures of primary care supply, and Quality and Outcomes Framework (QOF) quality scores. Poisson regression models were used to examine associations between primary care factors and ADR-related episode rates. Results 212 813 ADR-related HES episodes were identified. Rates of episodes were relatively high among the very young, older and female subgroups. In fully adjusted models, the following primary care factors were associated with increased likelihood of episode: higher deprivation scores (population attributable fraction (PAF)=0.084, 95% CI 0.067 to 0.100) and relatively poor glycated haemoglobin (HbA1c) control among patients with diabetes (PAF=0.372; 0.218 to 0.496). The following were associated with reduced episode likelihood: lower GP supply (PAF=−0.016; −0.026 to −0.005), a lower proportion of GPs with UK qualifications (PAF=−0.035; −0.058 to −0.012), lower total QOF achievement rates (PAF=−0.021; −0.042 to 0.000) and relatively poor blood pressure control among patients with diabetes (PAF=−0.144; −0.280 to −0.022). Conclusions Various aspects of primary care are associated with ADR-related hospital episodes, including achievement of particular QOF indicators. Further investigation with individual level data would help develop understanding of the
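Episode rates of the kind modelled above are typically handled with Poisson regression, using the practice population as an exposure offset so that coefficients describe rate ratios. A hedged, self-contained sketch with synthetic data (the variable names, effect size, and list sizes are invented; the study's actual covariates and PAF calculations are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000                                  # hypothetical general practices
deprivation = rng.normal(0.0, 1.0, n)     # standardised deprivation score
patients = rng.integers(2000, 15000, n)   # practice list size (exposure)

# Simulate ADR-related episode counts whose rate rises with deprivation.
log_rate = np.log(5e-4) + 0.08 * deprivation
episodes = rng.poisson(patients * np.exp(log_rate))

# Poisson regression with log(list size) as offset, fitted by Newton-Raphson.
X = np.column_stack([np.ones(n), deprivation])
offset = np.log(patients)
beta = np.array([np.log(episodes.sum() / patients.sum()), 0.0])
for _ in range(30):
    mu = np.exp(offset + X @ beta)
    grad = X.T @ (episodes - mu)
    hess = X.T @ (X * mu[:, None])
    beta += np.linalg.solve(hess, grad)

rate_ratio = np.exp(beta[1])  # episode-rate ratio per SD of deprivation
```

A rate ratio above 1 for deprivation mirrors the direction of the association reported in the abstract; attributable-fraction measures such as the PAF would be derived from a fitted model like this in a further step.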
Huesch, Marco D; Currid-Halkett, Elizabeth; Doctor, Jason N
Objectives Publicly available hospital quality reports seek to inform consumers of important healthcare quality and affordability attributes, and may inform consumer decision-making. To understand how much consumers search for such information online on one Internet search engine, whether they mention such information in social media and how positively they view this information. Setting and design A leading Internet search engine (Google) was the main focus of the study. Google Trends and Google Adwords keyword analyses were performed for national and Californian searches between 1 August 2012 and 31 July 2013 for keywords related to ‘top hospital’, ‘best hospital’, and ‘hospital quality’, as well as for six specific hospital quality reports. Separately, a proprietary social media monitoring tool was used to investigate blog, forum, social media and traditional media mentions of, and sentiment towards, major public reports of hospital quality in California in 2012. Primary outcome measures (1) Counts of searches for keywords performed on Google; (2) counts of and (3) sentiment of mentions of public reports on social media. Results National Google search volume for 75 hospital quality-related terms averaged 610 700 searches per month with strong variation by keyword and by state. A commercial report (Healthgrades) was more commonly searched for nationally on Google than the federal government's Hospital Compare, which otherwise dominated quality-related search terms. Social media references in California to quality reports were generally few, and commercially produced hospital quality reports were more widely mentioned than state (Office of Statewide Healthcare Planning and Development (OSHPD)) or non-profit (CalHospitalCompare) reports. Conclusions Consumers are somewhat aware of hospital quality based on Internet search activity and social media disclosures. Public stakeholders may be able to broaden their quality dissemination initiatives by
Cronin, Robert M; VanHouten, Jacob P; Siew, Edward D; Eden, Svetlana K; Fihn, Stephan D; Nielson, Christopher D; Peterson, Josh F; Baker, Clifton R; Ikizler, T Alp; Speroff, Theodore
Objective Hospital-acquired acute kidney injury (HA-AKI) is a potentially preventable cause of morbidity and mortality. Identifying high-risk patients prior to the onset of kidney injury is a key step towards AKI prevention. Materials and Methods A national retrospective cohort of 1,620,898 patient hospitalizations from 116 Veterans Affairs hospitals was assembled from electronic health record (EHR) data collected from 2003 to 2012. HA-AKI was defined at stage 1+, stage 2+, and dialysis. EHR-based predictors were identified through logistic regression, least absolute shrinkage and selection operator (lasso) regression, and random forests, and pair-wise comparisons between each were made. Calibration and discrimination metrics were calculated using 50 bootstrap iterations. In the final models, we report odds ratios, 95% confidence intervals, and importance rankings for predictor variables to evaluate their significance. Results The area under the receiver operating characteristic curve (AUC) for the different model outcomes ranged from 0.746 to 0.758 in stage 1+, 0.714 to 0.720 in stage 2+, and 0.823 to 0.825 in dialysis. Logistic regression had the best AUC in stage 1+ and dialysis. Random forests had the best AUC in stage 2+ but the least favorable calibration plots. Multiple risk factors were significant in our models, including some nonsteroidal anti-inflammatory drugs, blood pressure medications, antibiotics, and intravenous fluids given during the first 48 h of admission. Conclusions This study demonstrated that, although all the models tested had good discrimination, performance characteristics varied between methods, and the random forests models did not calibrate as well as the lasso or logistic regression models. In addition, novel modifiable risk factors were explored and found to be significant. PMID:26104740
Hendel, T; Fish, M; Aboudi, S
This article explores the anxiety level of, and coping strategies used by, hospital nurses during a national state of emergency. The study was guided by a stress and coping framework, developed by Lazarus & Folkman, and was conducted at a large teaching hospital, located in the centre of Israel, during the Iraqi crisis in January and February, 1998. Data were collected from a sample of 100 female nurses, and a descriptive correlational design was used. The findings indicated that approximately 33% of the nurses expressed feelings of stress, tension and a sense of discomfort. The dominant coping strategy used by the nurses was direct-active, which was found to be the most effective strategy. As they were unable to remove or control the stressor, stress management intervention by nursing managers focused mainly on communicating with staff and providing social support (informational and emotional) to buffer the stressful experience. Providing support and help in finding practical solutions is important for maintaining emotional stability of staff, thereby helping them to improve their nursing interventions in assisting people to cope with stressful situations. PMID:11153519
de Micheli, Alfredo
Since the most ancient times, hospital construction and progress in clinical practice have advanced pari passu. Examples of this can be found in the Greek regions as well as in the Greek cities overseas. During the Renaissance, great figures of that time converged in Italy: the genius Leonardo da Vinci (1452-1519) and Leon Battista Alberti (1404-1472), a humanist and innovator of architecture. Michelangelo Buonarroti (1475-1564) and his contemporary artists performed anatomical dissections to perfect their art by studying the human body. Anatomical studies flourished at the University of Padua, driven by the Flemish master. Based on the rigorous study of the anatomical substrate, studies on the function of the already known organic structures excelled in the 17th century. That century started with the revelation of the major blood circulation by the British physician William Harvey, an alumnus of the University of Padua, and continued with the description of the minor or pulmonary circulation by ancient or contemporary authors and of the peripheral connections between the arterial and venous systems (Marcello Malpighi, 1661). All these researchers, and others, were members of the University of Padua, where the beneficial influence of the teachings of Galileo persisted. In the following centuries, together with embryological and normal anatomy, pathological anatomy, systematized by G.B. Morgagni, became the cornerstone of clinical practice. The model of the ancient hospitals evolved toward the National Institutes of Health in Mexico fostered by Dr. Ignacio Chávez. PMID:25862293
Robinson, Geoffrey; McCann, Kieran; Freeman, Peter; Beasley, Richard
The New Zealand junior doctors' strike provided an opportunity to consider strategies that might be employed to overcome the international shortage of junior doctors. This article reports the experience of the emergency department (ED) and internal medicine (IM) services at Wellington Hospital during the national strike, in which medical services were primarily provided by specialist consultants in addition to, or as part of, their routine work. During the strike, elective admissions and outpatient clinics were mostly cancelled. In the ED, the waiting times and length of stay were markedly reduced. In IM, the proportion of patients admitted to the short stay unit rather than the general medical wards increased. Notwithstanding the different work circumstances, in both services one senior doctor carried the workload of at least two junior doctors. The deployment of additional senior medical staff to acute hospital services could greatly reduce the total number of doctors required. This strategy would have implications in terms of supporting acute medicine specialty initiatives, training, quality of care and funding. PMID:18624033
Clark, Katherine; Byfieldt, Naomi; Green, Malcolm; Saul, Peter; Lack, Jill; Philips, Jane L
The Australian Commission for Quality and Safety in Health Care (ACQSHC) has articulated 10 clinical standards with the aim of improving the consistency of quality healthcare delivery. Currently, the majority of Australians die in acute hospitals. But despite this, no agreed standard of care exists to define the minimum standard of care that people should accept in the final hours to days of life. As a result, there is limited capacity to conduct audits that focus on the gap between current care and recommended care. There is, however, accumulating evidence in the end of life literature to define which aspects of care are likely to be considered most important to those people facing imminent death. These themes offer standards against which to conduct audits. This is very apt given the national recommendation that healthcare should be delivered in the context of considering people's wishes while always treating people with dignity and respect. PMID:24589365
Suárez-Varela, María M Morales; Kaerlev, Linda; Zhu, Jin Liang; Bonde, Jens Peter; Nøhr, Ellen-Aagaard; Llopis-González, Agustín; Olsen, Jørn
In hospitals, women of reproductive age do a range of work tasks, some of which are known to carry potential risks. Tasks such as working with radiation, chemicals, and infectious agents, as well as performing heavy lifting or tasks requiring erratic sleep patterns have been reported to increase the risk of reproductive failures. Our aim was to study pregnancy outcomes in female hospital workers in Denmark. We performed a cohort study of 5976 female hospital workers and used as a reference group 60,890 women employed outside of hospitals. The reproductive health of hospital workers working during pregnancy is comparable to those of non-hospital workers for the majority of reproductive failures studied. However, an increased prevalence of congenital abnormalities was noted in some subgroups of hospital workers, which may indicate that some hospital work still entails fetotoxic hazards. PMID:19886351
Bae, Jungbum; Choo, Minsoo; Park, Ji Hyun; Oh, Jin Kyu; Paick, Jae-Seung
Purpose The objective of this study was to report the experience acquired at the Seoul National University Hospital with Holmium Laser Enucleation of Prostate (HoLEP), combined with mechanical morcellation for symptomatic benign prostatic hyperplasia (BPH). Methods A retrospective review was performed on the clinical data of 309 consecutive patients who underwent HoLEP at our institution between July 2008 and June 2010. All patients were evaluated preoperatively for prostate volume by transrectal ultrasound, maximum urinary flow rate (Qmax), International Prostate Symptoms Score (IPSS) and quality of life (QoL) score. Peri- and postoperative parameters were evaluated and patients were followed-up at 1-, 3-, 6-, and 12- months with the aforementioned investigations. Results The patients' mean age was 68.3 (±6.5) years and mean prostate volume was 55.6 (±23.6) mL. Mean enucleation time was 56.2 (±25.1) minutes, mean morcellation time was 11.3 (±9.5) minutes, and the mean resected weight of the prostate was 20.8 (±16.9) g. The mean catheter indwelling period was 1.9 (±1.7) days and mean hospital stay was 2.9 (±1.5) days. Significant improvement was noted in Qmax, IPSS, and QoL at the 1-year follow-up compared with baseline (P<0.01). At 1 month 17.2% of patients complained of irritative urinary symptoms, which were typically self-limiting within 3 months. Transient stress incontinence was reported in 15.2% of patients. No patient experienced persistent obstructive symptoms that required reoperation. Conclusions Our study showed that HoLEP is a safe and effective therapeutic modality for BPH. PMID:21468284
Kempker, Jordan A; Magee, Matthew J; Cegielski, J Peter; Martin, Greg S
Research has implicated low 25-hydroxyvitamin D (25(OH)D) level as a risk factor for infection; however, results have not been consistent. To further determine the nature of this relationship, we conducted a cohort study using Medicare beneficiaries participating in the 2001-2002 and 2003-2004 cycles of the National Health and Nutrition Examination Survey with data individually linked to hospital records from the Centers for Medicare and Medicaid Services. The primary exposure was a 25(OH)D level of <15 ng/mL versus ≥15 ng/mL. The outcomes were a hospitalization with or without an infection within 1 year of participation in the National Health and Nutrition Examination Survey, as determined from the final hospital discharge codes (International Classification of Diseases, Ninth Revision, Clinical Modification). Of 1,713 individuals, 348 had a baseline serum 25(OH)D level of <15 ng/mL, 77 experienced a hospitalization with an infection, and 287 experienced a hospitalization without an infection. In multivariable analyses, a serum 25(OH)D level of <15 ng/mL was associated with a higher risk of hospitalization with an infection (risk ratio = 2.8, 95% confidence interval: 1.3, 5.9, P < 0.01) but not of hospitalization without an infection (risk ratio = 1.4, 95% confidence interval: 0.9, 2.1, P = 0.1). In this study, we found an association between a serum 25(OH)D concentration of <15 ng/mL and a higher subsequent risk for hospitalization with an infection among Medicare beneficiaries. PMID:27189328
Norris, Gill; Williams, Steve; Adam-Smith, Derek
Two key issues thrown up by the 1999 introduction of the National Minimum Wage (NMW) in the United Kingdom are its likely impact on employers' training practices in low paying sectors of the economy and the implications for skills. Based on a study of the hospitality industry, this article assesses the limited significance of the differential,…
Lin, Lan-Ping; Lee, Jiunn-Tay; Lin, Fu-Gong; Lin, Pei-Ying; Tang, Chi-Chieh; Chu, Cordia M.; Wu, Chia-Ling; Lin, Jin-Ding
Nationwide data were collected concerning inpatient care use and medical expenditure of people with disabilities (N = 937,944) among national health insurance beneficiaries in Taiwan. Data included gender, age, hospitalization frequency and expenditure, healthcare setting and service department, and discharge diagnoses according to the ICD-9-CM…
Yoo, Sooyoung; Kim, Seok; Kim, Taegi; Kim, Jon Soo; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb
Objectives The cloud computing-based virtual desktop infrastructure (VDI) allows access to computing environments with no limitations in terms of time or place such that it can permit the rapid establishment of a mobile hospital environment. The objective of this study was to investigate the empirical issues to be considered when establishing a virtual mobile environment using VDI technology in a hospital setting and to examine the utility of the technology with an Apple iPad during a physician's rounds as a case study. Methods Empirical implementation issues were derived from a 910-bed tertiary national university hospital that recently launched a VDI system. During the physicians' rounds, we surveyed patient satisfaction levels with the VDI-based mobile consultation service with the iPad and the relationship between these levels of satisfaction and hospital revisits, hospital recommendations, and the hospital brand image. Thirty-five inpatients (including their next-of-kin) and seven physicians participated in the survey. Results Implementation issues pertaining to the VDI system arose with regard to the highly available system architecture, wireless network infrastructure, and screen resolution of the system. Other issues were related to privacy and security, mobile device management, and user education. When the system was used in rounds, patients and their next-of-kin expressed high satisfaction levels, and a positive relationship was noted regarding patients' decisions to revisit the hospital and whether the use of the VDI system improved the brand image of the hospital. Conclusions Mobile hospital environments have the potential to benefit both physicians and patients. The issues related to the implementation of the VDI system discussed here should be examined in advance for its successful adoption and implementation. PMID:23346476
Ozcan, Y A; Luke, R D
Using a sample of 3,000 urban hospitals, this article examines the contributions of selected hospital characteristics to variations in hospital technical efficiencies, while it accounts for multiple products and inputs, and controls for local environmental variations. Four hospital characteristics are examined: hospital size, membership in a multihospital system, ownership, and payer mix (managed care contracts, percent Medicare, and percent Medicaid). Ownership and percent Medicare are consistently found to be related significantly to hospital efficiency. Within the ownership variable, government hospitals tend to be more efficient and for-profit hospitals less efficient than other hospitals. Higher percentages of Medicare payment are negatively related to efficiency. While not consistently significant across all five of the MSA size categories in which the analyses are conducted, possession of managed care contracts, membership in a multihospital system, and size all are consistently related positively to hospital technical efficiency. These variables are also all significant when the hospitals are examined in a combined analysis. Percent Medicaid was not significant in any of the analyses. Implications for policy and the need for methodological work are discussed. PMID:8428810
Best, Matthew J; Buller, Leonard T; Miranda, Alejandro
Foot and ankle arthrodesis reliably reduces pain and functional disability among patients with arthritis and deformity. Since its introduction in 1953, improvements in surgical technique have enhanced the outcomes and reduced complications. However, little is known regarding US national trends of foot and ankle arthrodesis. The present study sought to use the most recently available Centers for Disease Control and Prevention data to investigate changes in the usage of inpatient and ambulatory foot and ankle arthrodesis. Cases of foot and ankle arthrodesis were identified using the National Hospital Discharge Survey and National Survey of Ambulatory Surgery, and the data were analyzed for trends in demographics, treatment, and usage. From 1994 to 2006, the population-adjusted rates of foot and ankle arthrodeses increased by 146% (8.2/100,000 capita to 20.2/100,000 capita). The number of outpatient arthrodeses performed with arthroscopic assistance increased by 858%. The population-adjusted rate of outpatient and inpatient procedures increased by 415% and 17%, respectively. The gender-adjusted rates increased by 59% for males and 209% for females. The age-adjusted rates increased among patients >35 years old in both settings. The use of peripheral nerve blocks during ambulatory procedures increased from 3.3% to 10.1%. Private insurance was the largest compensator. In conclusion, the rate of foot and ankle arthrodesis increased dramatically from 1990 to 2007 using the most up-to-date publicly available data. Knowledge of these national practice patterns could aid policy-makers and surgeons in appropriately allocating healthcare resources to ensure quality patient care. PMID:26213159
Audu, L. I.; Otuneye, A. T.; Mairami, A. B.; Mshelia, L. J.; Nwatah, V. E.
Anaemia is a common morbidity in the NICU and often requires transfusion of packed red blood cells. Haematocrit equilibration following red cell transfusion occurs over time ultimately resulting in a stable packed cell volume (PCV). Knowledge of this equilibration process is pertinent in the accurate timing of posttransfusion (PT) PCV. We conducted a prospective study to determine an appropriate timing for PT PCV estimation on 47 stable anaemic babies at the Neonatal Unit of National Hospital, Abuja. Values of PCV were determined before transfusion and at 1, 6, 12, 24, and 48 hours posttransfusion. Forty of the recruited neonates and young infants were analyzed. Their gestational age range was 26 to 40 weeks. 1-hour PT PCV (48.5% ± 5.5%) was similar to the 6-hour PT PCV (47.8% ± 5.6%) P = 0.516, but both were significantly different from the 12-hour (46.8% ± 5.9%), 24-hour (45.9 ± 5.8%), and 48-hour (45.4% ± 6.2%) PT PCVs. The 12-hour PT PCV was similar to the 24-hour and 48-hour PT PCVs (P = 0.237 and 0.063, resp.). We concluded that, in stable nonhaemorrhaging and nonhaemolysing young infants, the estimated timing of haematocrit equilibration and, consequently, posttransfusion PCV is 12 hours after red blood cell transfusion. PMID:25861284
Tawk, Rima; Freels, Sally; Mullner, Ross
This study examined the association of mental and medical illnesses with the odds for leaving against medical advice (AMA) in a national sample of adult patients who left general hospitals between 1988 and 2006. Leaving AMA was first examined as a function of year and mental illness. Multiple logistic regression analysis was then used to adjust for patient and hospital characteristics when associating mental and major medical diagnoses with AMA discharges. The results indicated that leaving AMA was most strongly associated with mental health problems. However, the impact of mental illness was attenuated after adjusting for medical illnesses and patient and hospital characteristics. The strongest predictors of AMA discharge included being self-pay, having Medicaid insurance, being young and male, and the regional location of the hospital (Northeast). When substance abuse conditions were excluded from the mental illness discharge diagnoses, mental illness had lower odds for leaving AMA. The results may be of value to clinicians and hospital administrators in helping to profile and target patients at risk for treatment-compliance problems. Prospective primary data collection that would include patient, physician, and hospital variables is recommended. PMID:22057857
Pawłowska, Iga; Pawłowski, Leszek; Kocić, Ivan; Krzyżaniak, Natalia
Background Pharmacist-led care services within the hospital pharmacy setting have a significant impact on efficient drug management processes. The work of pharmacists is directly associated with the provision of drugs and medical supplies along with additional clinical, administrative, organizational and educational duties. Depending on the country, these practice roles may differ to a significant extent. Objective The aim of this research was to explore the role of the hospital pharmacist and the provision of both clinical and traditional pharmaceutical services for patients and medical staff in Polish general hospitals. Setting Hospital pharmacies from all general hospitals in Poland. Method A cross-sectional study was conducted, utilizing an anonymous questionnaire as the research instrument. Heads of hospital pharmacies were requested to participate in this study and complete the questionnaire. The survey was initially piloted to improve the research method. Main outcome measure The types of pharmaceutical services performed in Polish general hospitals. Results 166 hospital pharmacies took part in this survey. The overall response rate was 60.8 %. The total number of full-time equivalent (FTE) professionals employed within the surveyed hospital pharmacies was approximately 833. The procurement and distribution of drugs were identified as pharmaceutical services performed by most of the participants. The significant majority of pharmacists were also involved in compounding, adverse drug reaction monitoring and rational drug management services. Eleven (7 %) of the responding pharmacists had direct contact with patients and 7 (4 %) pharmacists took part in ward rounds. More precise legal regulations regarding hospital pharmacy practice were measures indicated by most pharmacists as necessary changes required in the hospital pharmacy system. Conclusion Polish hospital pharmacists provide various pharmaceutical services. Their work is closely related with direct
Ser, Gloria; Robertson, Ann; Sheikh, Aziz
Aims To investigate the perceptions and reported practices of mental health hospital staff using national hospital electronic health records (EHRs) in order to inform future implementations, particularly in acute mental health settings. Methods Thematic analysis of interviews with a wide range of clinical, information technology (IT), managerial and other staff at two early adopter mental health National Health Service (NHS) hospitals in London, UK, implementing national EHRs. Results We analysed 33 interviews. We first sought out examples of workarounds, such as delayed data entry, entering data in wrong places and individuals using the EHR while logged in as a colleague, then identified possible reasons for the reported workarounds. Our analysis identified four main categories of factors contributing to workarounds (i.e., operational, cultural, organisational and technical). Operational factors included poor system integration with existing workflows and the system not meeting users' perceived needs. Cultural factors involved users' competence with IT and resistance to change. Organisational factors referred to insufficient organisational resources and training, while technical factors included inadequate local technical infrastructure. Many of these factors, such as integrating the EHR system with day-to-day operational processes, staff training and adequate local IT infrastructure, were likely to apply to system implementations in various settings, but we also identified factors that related particularly to implementing EHRs in mental health hospitals, for example: EHR system incompatibility with IT systems used by mental health–related sectors, notably social services; the EHR system lacking specific, mental health functionalities and options; and clinicians feeling unable to use computers while attending to distressed psychiatric patients. Conclusions A better conceptual model of reasons for workarounds should help with designing, and supporting the
Kim, Seoyoung C.; Kim, Mi-Sook; Sanfélix-Gimeno, Gabriel; Song, Hong Ji; Liu, Jun; Hurtado, Isabel; Peiró, Salvador; Lee, Joongyub; Choi, Nam-Kyong; Park, Byung-Joo; Avorn, Jerry
Purpose While current osteoporosis management guidelines recommend use of pharmacologic treatment following hip fracture, the care of such patients has been suboptimal. The objective of this cross-national study is to quantify the use of and adherence to osteoporosis medication following hip fracture in three countries with different health care systems: the United States, Korea and Spain. Methods In three cohorts of patients aged ≥65 years hospitalized for hip fracture, we calculated the proportion receiving ≥1 osteoporosis drug after discharge. Adherence to osteoporosis treatment was measured as the proportion of days covered (PDC) during the first year following the hip fracture. Results We identified 86,202 patients with a hip fracture - 4,704 (U.S. Medicare), 6,700 (U.S. commercial), 57,631 (Korea), and 17,167 (Spain). The mean age was 77–83 years and 74–78% were women. In the year prior to the index hip fracture, 16–18% were taking an osteoporosis medication. Within 3 months following the index hip fracture, 11% (U.S. Medicare), 13% (U.S. commercial), 39% (Korea), and 25% (Spain) of patients filled ≥1 prescription for osteoporosis medication. For those who filled one or more prescriptions for an osteoporosis medication, the mean PDC in the year following the fracture was 0.70 (U.S. Medicare), 0.67 (U.S. commercial), 0.43 (Korea) and 0.66 (Spain). Conclusions Regardless of differences in health care delivery systems and medication reimbursement plans, the use of osteoporosis medications for the secondary prevention of osteoporotic fracture was low. Adherence to osteoporosis treatment was also suboptimal with the PDC<0.70 in all three countries. PMID:25660252
Okoche, Deogratius; Asiimwe, Benon B.; Katabazi, Fred Ashaba; Kato, Laban; Najjuka, Christine F.
Introduction Carbapenemases have increasingly been reported in Enterobacteriaceae worldwide. Most carbapenemases are plasmid encoded, hence resistance can easily spread. Carbapenem-resistant Enterobacteriaceae are reported to cause mortality in up to 50% of patients who acquire bloodstream infections. We set out to determine the burden of carbapenem resistance as well as establish genes encoding for carbapenemases in Enterobacteriaceae clinical isolates obtained from Mulago National Referral Hospital, Uganda. Methods This was a cross-sectional study with a total of 196 clinical isolates previously collected from pus swabs, urine, blood, sputum, tracheal aspirates, cervical swabs, endometrial aspirates, rectal swabs, vaginal swabs, ear swabs, products of conception, wound biopsy and amniotic fluid. All isolates were subjected to phenotypic carbapenemase screening using boronic acid-based inhibition, Modified Hodge and EDTA double combined disk tests. In addition, all the isolates were subjected to PCR assay to confirm presence of carbapenemase encoding genes. Results The study found a carbapenemase prevalence of 22.4% (44/196) in the isolates using phenotypic tests, with the genotypic prevalence slightly higher at 28.6% (56/196). Overall, the most prevalent gene was blaVIM (21, 10.7%), followed by blaOXA-48 (19, 9.7%), blaIMP (12, 6.1%), blaKPC (10, 5.1%) and blaNDM-1 (5, 2.6%). Among 56 isolates positive for 67 carbapenemase encoding genes, Klebsiella pneumoniae was the species with the highest number (52.2%). Most (32/67, 47.7%) of these resistance genes were in bacteria isolated from pus swabs. Conclusion There is a high prevalence of carbapenemases and carbapenem-resistance encoding genes among third generation cephalosporin-resistant Enterobacteriaceae in Uganda, indicating a danger of limited treatment options in this setting in the near future. PMID:26284519
Chincha, Omayra; Cornelio, Elia; Valverde, Violeta; Acevedo, Mónica
In order to describe the incidence of nosocomial infections associated with invasive devices in the intensive care units (ICUs) of the National Hospital Cayetano Heredia, a retrospective observational study was conducted using data from the Office of Epidemiology and Environmental Health from 2010 to 2012. A total of 222 nosocomial infections were reported; the general medicine ICU reported the highest incidence of pneumonia cases associated with mechanical ventilation per 1000 days of device use (28.6), bloodstream infection associated with central venous catheter (11.9), and urinary tract infection associated with a catheter (8.1). The main infectious agents isolated were Pseudomonas sp. (32.3%) in the emergency ICU, coagulase-negative Staphylococcus (36%) in the general medicine ICU and Candida sp. (69.2%) in the surgery ICU. The rates of infections associated with invasive devices were high, as in other national hospitals with limited resources and infrastructure. PMID:24448938
Hanna, Kh; Jeffery, Sla
The current conflict in Afghanistan has seen the increasing use of Improvised Explosive Devices (IEDs) in insurgency attacks. In addition to the coalition forces killed and injured by these devices, local national civilians are also injured. Injuries often include amputations, open fractures and large areas of skin affected by fragmentation. Local national access to long-term care after an IED injury is limited, and often when the patient leaves a coalition hospital this concludes the care the patient will receive. Definitive, durable treatment options are needed for these patients. In the IED-injured patient with open extremity wounds and open metacarpal fractures, pedicled radial forearm flaps offer a suitable soft tissue coverage option. Four cases are reported of IED-injured Afghan patients treated at a Role 3 hospital facility. PMID:23720555
Downey, Erin; Hebert, Anjanette
This paper examines three international healthcare security systems as they relate to patient surge in Canada, Israel, and the United States. Its purpose is to compare the systems, to highlight unique characteristics that define those systems, and to initiate the development of best practices that transcend national boundaries. Several significant national characteristics of demographics, healthcare systems, and political climate, among others, present challenges to translating best practices among these three countries. However, we have found that best practice strategies exist in areas of communications, coordination, building design, space adaptability, and patient routing (both from the community to the hospital, as well as within the hospital) that can be shared and incorporated into the healthcare preparedness efforts in all three countries. PMID:20873500
Alexander, Jeffrey A; Lee, Shoou-Yih D; Wang, Virginia; Margolin, Frances S
Despite the legal and practical importance of monitoring and oversight of management by hospital governing boards, there is little empirical evidence of how hospital boards fulfill these roles and the extent to which these practices have changed over time. We utilize data from three national surveys of hospital governance to examine how oversight and monitoring practices in public and private not-for-profit (NFP) hospital boards have changed over time. Findings suggest that board relations with CEOs in NFP hospitals display important but potentially contradictory patterns. On the one hand, NFP hospital boards appear to be exercising more stringent oversight of management and hospital performance. On the other hand, management is more actively involved with governance matters with less separation of board and management. This general pattern varies by the dimension of oversight and monitoring practice and by specific characteristics of NFP hospitals. PMID:19052168
Background The relationship between extended work hours and health is well documented among hospital doctors, but the effect of national differences in work hours on health is unexplored. This study examines the relationship between work hours and self-rated health in two national samples of hospital doctors. Methods The study population consisted of representative samples of 1,260 German and 562 Norwegian hospital doctors aged 25-65 years (N = 1,822) who received postal questionnaires in 2006 (Germany) and 2008 (Norway). The questionnaires contained items on demography, work hours (number of hours per workday and on-call per month) and self-rated subjective health on a five-point scale, dichotomized into "good" (above average) and "average or below". Results Compared with Norway, a significantly higher proportion of German doctors exceeded a 9-hour workday (58.8% vs. 26.7%) and 60 hours on-call per month (63.4% vs. 18.3%). Every third (32.2%) hospital doctor in Germany exceeded both limits, a pattern that was rare in Norway (2.9%). In a logistic regression model, working in Norway (OR 4.17; 95% CI 3.02-5.73), age 25-44 years (OR 1.66; 95% CI 1.29-2.14) and not exceeding a 9-hour workday and 60 hours on-call per month (OR 1.35; 95% CI 1.03-1.77) were all independent significant predictors of good self-reported health. Conclusion A lower percentage of German hospital doctors reported their health as "good", which is partly explained by the differences in work-time patterns. Initiatives to increase doctors' control over their work time are recommended. PMID:21338494
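Odds ratios from a logistic regression have confidence intervals that are symmetric on the log scale, so the geometric mean of the CI bounds should approximately recover the point estimate. A quick consistency check of the ORs reported in this abstract:

```python
import math

# The 95% CI of an odds ratio is symmetric on the log scale, so
# sqrt(lower * upper) approximates the point estimate.
# Values below are the ORs reported in the abstract.
reported = [
    (4.17, 3.02, 5.73),  # working in Norway
    (1.66, 1.29, 2.14),  # age 25-44 years
    (1.35, 1.03, 1.77),  # not exceeding the work-hour thresholds
]
for point, lower, upper in reported:
    recovered = math.sqrt(lower * upper)
    print(round(recovered, 2))  # close to the reported point estimate
```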
Trachtenberg, R L
When Robert L. Trachtenberg took over the executive directorship of the National Association of Private Psychiatric Hospitals some five months ago, he walked into a situation wherein several psychiatric specialty hospitals in Texas were under fire. "There were a lot of questions," Trachtenberg says, "and challenges to the credibility of psychiatric hospitals." He was referring to the Texas state investigation into abuses by personnel within psychiatric hospitals. Last year, the Texas Senate Interim Committee on Health and Human Services conducted an eight-month investigation into the conduct of the state's psychiatric hospitals after a newspaper article recounted the unconventional way in which a 14-year-old boy was picked up and admitted to a psychiatric facility. After a number of public hearings, three private agencies overseeing Texas psychiatric hospitals adopted rules to prevent further problems in the areas of patient rights, fraudulent billing, patient recruitment and the admission and discharge process. The Senate Interim Committee, however, felt these rules needed to be codified into law and has drafted over 30 bills to be presented to the Texas legislature as omnibus legislation next January. Trachtenberg went to work to iron out methods to encourage better oversight and state governance, as well as tackling the related issues of standards of care and managed care/utilization review. His background as Deputy Administrator of the Alcohol, Drug Abuse and Mental Health Administration within HHS provided him with a broad spectrum of knowledge about the field of psychiatry and its problems, and his vast experience in federal government (over 32 years running domestic programs) enables him to have a keen sense of what can get done, and how. Health Systems REVIEW recently discussed the role of the NAPPH under its new leader, Bob Trachtenberg. What follows is an edited version of that conversation. PMID:10122848
Griffiths, Peter; Sloane, Douglas M; Rafferty, Anne Marie; Ball, Jane E; Aiken, Linda H
Objectives To examine whether patient satisfaction with nursing care in National Health Service (NHS) hospitals in England is associated with the proportion of non-UK educated nurses providing care. Design Cross-sectional analysis using data from the 2010 NHS Adult Inpatient Survey merged with data from nurse and hospital administrator surveys. Logistic regression models with corrections for clustering were used to determine whether the proportions of non-UK educated nurses were significantly related to patient satisfaction before and after taking account of other hospital, nursing and patient characteristics. Setting 31 English NHS trusts. Participants 12 506 patients 16 years of age and older with at least one overnight stay that completed a satisfaction survey; 2962 bedside care nurses who completed a nurse survey; and 31 NHS trusts. Main outcome measure Patient satisfaction. Results The percentage of non-UK educated nurses providing bedside hospital care, which ranged from 1% to 52% of nurses, was significantly associated with patient satisfaction. After controlling for potential confounding factors, each 10-point increase in the percentage of non-UK educated nurses diminished the odds of patients reporting good or excellent care by 12% (OR=0.88), and decreased the odds of patients agreeing that they always had confidence and trust in nurses by 13% (OR=0.87). Other indicators of patient satisfaction also revealed lower satisfaction in hospitals with higher percentages of non-UK educated nurses. Conclusions Use of non-UK educated nurses in English NHS hospitals is associated with lower patient satisfaction. Importing nurses from abroad to substitute for domestically educated nurses may negatively impact quality of care. PMID:26634400
Christiaens, Wendy; Gouwy, Anneleen; Bracke, Piet
Background The Belgian and Dutch societies present many similarities but differ with regard to the organisation of maternity care. The Dutch way of giving birth is well known for its high percentage of home births and its low medical intervention rate. In contrast, home births in Belgium are uncommon and the medical model is taken for granted. Dutch and Belgian maternity care systems are compared with regard to the influence of being referred to specialist care during pregnancy or intrapartum while planning for a home birth. We expected that a referral would result in lower satisfaction with childbirth, especially in Belgium. Methods Two questionnaires were filled out by 605 women, one at 30 weeks of pregnancy and one within the first two weeks after childbirth, either at home or in a hospital. Of these, 563 questionnaires were usable for analysis. Women were invited to participate in the study by independent midwives and obstetricians during antenatal visits in 2004–2005. Satisfaction with childbirth was measured by the Mackey Satisfaction with Childbirth Rating Scale, which takes into account the multidimensional nature of the concept. Results Belgian women are more satisfied than Dutch women, and home births are more satisfying than hospital births. Women who are referred to the hospital while planning for a home birth are less satisfied than women who planned to give birth in hospital and did. A referral has a greater negative impact on satisfaction for Dutch women. Conclusion There is no reason to believe Dutch women receive hospital care of lesser quality than Belgian women in case of a referral. Belgian and Dutch women attach different meanings to being referred, resulting in different evaluations of childbirth. In the Dutch maternity care system home births lead to higher satisfaction, but once a referral to the hospital is necessary, satisfaction drops and ends up lower than satisfaction with hospital births that were planned in advance. We need to understand more
von Theobald, Peter; Cottenet, Jonathan; Iacobelli, Silvia; Quantin, Catherine
We aimed to assess the prevalence of hospitalization for endometriosis in the general population in France and in each French region and to describe temporal trends, rehospitalization rates, and prevalence of the different types of endometriosis. The analyses were carried out on French hospital discharge data and covered the period 2008–2012 and a population of 14,239,197 women of childbearing age. In this population, the prevalence of hospitalization for endometriosis was 0.9%, ranging from 0.4% to 1.6% between regions. Endometriosis affected 1.5% of hospitalized women of childbearing age, ranging from 1.0% to 2.4% between regions. The number of patients hospitalized for endometriosis significantly increased over the study period (p < 0.01). Of these, 4.2% were rehospitalized at least once at one year: ranging from 2.7% to 6.3% between regions. The cumulative rehospitalization rate at 3 years was 6.9%. The types of endometriosis according to the procedures performed were as follows: ovarian (40–50%), peritoneal (20–30%), intestinal (10–20%), and ureteral or bladder (<10%), with significant differences between regions. This is the first detailed epidemiological study of endometriosis in France. Further studies are needed to assess the reasons for the increasing prevalence of endometriosis and for the significant differences in regional prevalence of this disease. PMID:27148550
Gyani, Girdhar J; Krishnamurthy, B
Quality in health care is important as it is directly linked with patient safety. Quality as we know is driven either by regulation or by market demand. Regulation in most developing countries has not been effective, as there is a shortage of health care providers and governments have to be flexible. In such circumstances, quality has taken a back seat. Accreditation symbolizes the framework for quality governance of a hospital and is based on optimum standards. Not only is India establishing numerous state-of-the-art hospitals, but it is also experiencing an increase in demand for quality as well as medical tourism. India launched its own accreditation system in 2006, conforming to standards accredited by ISQua. This article shows the journey to accreditation in India and describes the problems encountered by hospitals as well as the benefits it has generated for the industry and patients. PMID:24938026
Stoelwinder, Johannes U
The National Health and Hospitals Reform Commission (NHHRC) has recommended that Australia develop a "single health system", governed by the federal government. Steps to achieving this include: a "Healthy Australia Accord" to agree on the reform framework; the progressive takeover of funding of public hospitals by the federal government; and the possible implementation of a consumer-choice health funding model, called "Medicare Select". These proposals face significant implementation issues, and the final solution needs to deal with both financial and political sustainability. If the federal and state governments cannot agree on a reform plan, the Prime Minister may need to go to the electorate for a mandate, which may be shaped by other economic issues such as tax reform and intergenerational challenges. PMID:19807630
Nielsen, Philip R.; Benros, Michael E.; Mortensen, Preben B.
Infections and immune responses have been suggested to play an important role in the etiology of schizophrenia. Several studies have reported associations between maternal infections during pregnancy and the child’s risk of schizophrenia; however, infection during childhood and adolescence unrelated to maternal infection during pregnancy has not been studied to nearly the same extent and the results are far from conclusive. Data were drawn from 2 population-based registers, the Danish Psychiatric Central Register and the Danish National Hospital Register. We used a historical population-based cohort design and selected all individuals born in Denmark between 1981 and 1996 (n = 843 390). We identified all individuals with a first-time hospital contact with schizophrenia from 1991 through 2010. Out of the 3409 individuals diagnosed with schizophrenia, a total of 1549 individuals had had a hospital contact with infection before their schizophrenia diagnosis (45%). Our results indicate that individuals who have had a hospital contact with infection are more likely to develop schizophrenia (relative risk [RR] = 1.41; 95% CI: 1.32–1.51) than individuals who had not had such a hospital contact. Bacterial infection was the type of infection that was associated with the highest risk of schizophrenia (RR = 1.63; 95% CI: 1.47–1.82). Our study does not exclude that a certain type of infection may have a specific effect; yet, it does suggest that schizophrenia is associated with a wide range of infections. This association may be due to inflammatory responses affecting the brain or genetic and environmental risk factors aggregating in families. PMID:24379444
Piscitelli, Prisco; Neglia, Cosimo; Falco, Andrea; Rivezzi, Matteo; Agnello, Nadia; Argentiero, Alberto; Chitano, Giovanna; Distante, Chiara; Della Rosa, Giulia; Vinci, Giorgia; De Donno, Antonella; Distante, Alessandro; Romanini, Antonella
Objective: To assess the burden of regional environmental factors influencing the incidence of Melanoma in the Italian population and overcome the problem of partial population coverage by local cancer registries and thematic archives. Methods: We analyzed the Italian national hospitalization records from 2001 to 2008 provided by the Ministry of Health, excluding hospital re-admissions of the same patients, in order to assess the occurrence of Melanoma over an 8-year period. Data were presented by age group (absolute number of cases from 20 to ≥80 years old) and per Region (rates per 100,000 inhabitants) for each year. Results: The overall number of new hospitalizations due to malignant Melanoma increased by 16.8% from 2001 (n = 4846) to 2008 (n = 5823), with the rate per 100,000 inhabitants passing from 10.5 to almost 12.0 at a national level. The majority of new diagnoses of malignant Melanoma were observed in two age groups: 61–70 years old (from 979 in 2001 up to 1209 in 2008, corresponding to 15.1 and 18.1 new cases per 100,000 inhabitants, respectively) and 71–80 years old (from 954 in 2001 up to 1141 in 2008, corresponding to 19.5 and 21.8 new cases per 100,000 inhabitants, respectively). The number of hospitalizations due to Melanoma increased in all age groups with the only exception of the youngest patients, aged 20–30 years old. The highest increases over the 8-year period were observed in people aged ≥81 years old (+34%), 61–70 years old (+20%) and, surprisingly, in the age group 31–40 years old (+17%). Southern Regions showed lower hospitalization rates compared with Northern Italy and Region Lazio. The highest increases between 2001 and 2008 were observed in Trentino/Alto Adige, Friuli Venezia Giulia, Valle d'Aosta and the Veneto Region. Conclusions: Hospitalizations due to malignant Melanoma in Italy seem to be influenced by environmental or population-related factors showing a decreasing incidence rate from the Northern to Southern Regions
Haines, Christine; Brand, Jennie Bickmore
The implementation and effectiveness of the inclusion of literacy and numeracy in industry training packages was examined in case studies of three programs in Western Australia. Two were certificate programs in cooking and food and beverage as specified in the hospitality training package, and the third was an aged care program based on the…
Langstrom, Niklas; Grann, Martin; Ruchkin, Vladislav; Sjostedt, Gabrielle; Fazel, Seena
Little is known about risk factors for violence among individuals with autism spectrum disorder (ASD). This study uses data from Swedish longitudinal registers for all 422 individuals hospitalized with autistic disorder or Asperger syndrome during 1988-2000 and compares those committing violent or sexual offenses with those who did not. Thirty-one…
Crosby, C. J.; Nandigam, V.; Arrowsmith, J. R.; Balakrishnan, S.; Alex, N.; Baru, C.
The recently completed GeoEarthScope airborne LiDAR (Light Detection And Ranging) topography acquisition will provide unprecedented data adjacent to active faults throughout the plate boundary region of western North America. Totaling more than 5000 square kilometers, these community-oriented data offer a high-resolution representation of fault zone topography and should be a revolutionary resource for researchers studying earthquake hazards, active faulting, landscape processes, and ground deformation. Since spring of 2007, the NSF-funded GeoEarthScope LiDAR project has acquired data for the San Andreas fault system in northern California, faults in southern California, the Yakima Fold and Thrust Belt in Washington, Yellowstone National Park, the Tetons, the Wasatch Front, and Alaska. These data will be made available via the OpenTopography Portal (www.opentopography.org), a domain-specific component of the GEON project, as they are processed and delivered by the National Center for Airborne Laser Mapping. The OpenTopography Portal (OpenToPo) provides access to a variety of GeoEarthScope LiDAR data products and uses several cyberinfrastructure components developed by the GEON project. These products range from simple Google Earth visualizations of LiDAR hillshades to standard digital elevation model (DEM) products as well as LiDAR point cloud data. LiDAR users span a wide spectrum of scientific applications, computing resources and technical experience, and thus require a data distribution system that provides various levels of access to the data. Standard DEM products in OpenToPo are accessed via a Google Maps and/or Google Earth-based interface that allows users to browse and download the data products. For users who wish to explore the full potential of the LiDAR data, we provide access to the raw LiDAR point data and a suite of DEM generation tools to enable users to create custom DEMs to best fit their science applications. Storage and management of
Santry, Heena P.; Madore, John C.; Collins, Courtney E.; Ayturk, M. Didem; Velmahos, George C.; Britt, LD; Kiefe, Catarina I.
BACKGROUND To date, no studies have reported nationwide adoption of Acute Care Surgery (ACS) or identified structural and/or process variations for the care of emergency general surgery (EGS) patients within such models. METHODS We surveyed surgeons responsible for EGS coverage at University HealthSystems Consortium hospitals using an 8-page postal/email questionnaire querying respondents on hospital and EGS structure/process measures. Survey responses were analyzed using descriptive statistics, univariate comparisons, and multivariable regression models. RESULTS 258 of 319 (81%) potential respondents completed surveys. 81 hospitals (31%) had implemented ACS, while 134 (52%) had a traditional general surgeon on-call model (GSOC) and 38 (15%) had another model (HYBRID). Larger-bed, university-based, teaching hospitals with Level 1 trauma center verification status located in urban areas were more likely to have adopted ACS. In multivariable modeling, hospital type, setting, and trauma center verification predicted ACS implementation. EGS processes of care varied, with 28% GSOC having block time vs 67% ACS (p<0.0001); 45% GSOC providing ICU care to EGS patients in a surgical/trauma ICU vs 93% ACS (p<0.0001); GSOC sharing call among 5.7 (±3.2) surgeons vs 7.9 (±2.3) ACS surgeons (p<0.0001); and 13% GSOC taking in-house EGS call vs 75% ACS (p<0.0001). Among ACS hospitals there were variations in patient cohorting (25% EGS patients alone; 21% EGS+trauma; 17% EGS+elective; 30% EGS+trauma+elective), data collection (26% had prospective EGS registries), patient handoffs (56% had attending surgeon presence), and call responsibilities (averaging 4.8 (±1.3) calls per month, with 60% providing an extra call stipend and 40% having no post-call clinical duties). CONCLUSION The potential of ACS to address the national crisis in access to EGS care is not fully met. Variations in EGS processes of care among adopters of ACS suggest that standardized criteria for ACS
Mikulovsky, R. P.; De La Fuente, J. A.
The US Forest Service (US Department of Agriculture) manages a broad range of geologic resources and hazards on National Forests and Grasslands throughout the United States. Resources include rock and earth materials, groundwater, caves and paleontological resources, minerals, energy resources, and unique geologic areas. Hazards include landslides, floods, earthquakes, volcanic eruptions, and naturally hazardous materials (e.g., asbestos, radon). Forest Service Geologists who address these issues are Resource Geologists. They have been exploring LiDAR as a revolutionary tool to efficiently manage all of these hazards and resources. However, most LiDAR applications for management have focused on timber and fuels management, rather than landforms. This study shows the applications and preliminary results of using LiDAR for managing geologic resources and hazards on public lands. Applications shown include calculating sediment budgets, mapping and monitoring landslides, mapping and characterizing borrow pits or mines, determining landslide potential, mapping faults, and characterizing groundwater-dependent ecosystems. LiDAR can be used to model potential locations of groundwater-dependent ecosystems with threatened or endangered plant species such as Howellia aquatilis. This difficult-to-locate species typically exists on the Mendocino National Forest within sag ponds on landslide benches. LiDAR metrics of known sites are used to model potential habitat. Thus LiDAR can link the disciplines of geology, hydrology, botany, archaeology and others for enhanced land management. As LiDAR acquisition costs decrease and it becomes more accessible, land management organizations will find a wealth of applications with potential far-reaching benefits for managing geologic resources and hazards.
Makubi, Abel N; Meda, Collins; Magesa, Alex; Minja, Peter; Mlalasi, Juliana; Salum, Zubeda; Kweka, Rumisha E; Rwehabura, James; Quaresh, Amrana; Magesa, Pius M; Robert, David; Makani, Julie; Kaaya, Ephata
In Tanzania, there is a paucity of data for monitoring laboratory medicine, including haematology. This calls for audits of practice in haematology and blood transfusion in order to appraise practice and devise strategies that would result in improved quality of health care services. This descriptive cross-sectional study, which audited laboratory practice in haematology and blood transfusion at Muhimbili National Hospital (MNH), aimed at assessing the pre-analytical stage of laboratory investigations, including laboratory request forms and specimen handling and processing in the haematology laboratory, and at assessing the chain from donor selection and blood component processing to the administration of blood during transfusion. A national standard checklist was used to audit the laboratory request forms (LRF), phlebotomists' practices in specimen handling, and the chain from donor selection to the administration of blood during transfusion. Both interviews and observations were used. A total of 195 LRF were audited, and 100% had incomplete information such as patients' identification numbers, time the sample was ordered, reason for the request, summary of clinical assessment and differential diagnoses. The labelling of specimens was poorly done by phlebotomists/clinicians in 82% of the specimens. Also, 65% (132/202) of the blood samples delivered to the haematology laboratory did not contain the recommended volume of blood. There was no laboratory request form specific for ordering blood, and there were no guidelines for the indication of blood transfusion in the wards/clinics. The blood transfusion laboratory section was not participating in external quality assessment, and the hospital transfusion committee was not in operation. It is recommended that a referral hospital like MNH should have a transfusion committee to provide an active forum to facilitate communication between those involved with transfusion, and to monitor, coordinate and audit blood transfusion practices as per national
Langley, Ricky; Mack, Karin; Haileyesus, Tadesse; Proescholdbell, Scott; Annest, Joseph L.
Objective Injuries resulting from contact with animals and insects are a significant public health concern. This study quantifies nonfatal bite and sting injuries from noncanine sources using data from the National Electronic Injury Surveillance System–All Injury Program (NEISS-AIP). Methods The NEISS-AIP is an ongoing, nationally representative surveillance system used to monitor all types and causes of injuries treated in US hospital emergency departments (EDs). Cases were coded by trained hospital coders using information from medical records on the animal and insect sources of the bite and sting injuries treated. Data were weighted to produce national annualized estimates, percentages, and rates based on the US population. Results From 2001 to 2010, an estimated 10.1 million people visited EDs for noncanine bite and sting injuries, based on an unweighted case count of 169,010. This translates to a rate of 340.1 per 100,000 people (95% CI, 232.9–447.3). Insects accounted for 67.5% (95% CI, 45.8–89.2) of bite and sting injuries, followed by arachnids at 20.8% (95% CI, 13.8–27.9). The estimated number of ED visits for bedbug bite injuries increased more than 7-fold, from 2156 visits in 2007 to 15,945 visits in 2010. Conclusions This study provides an update of national estimates of noncanine bite and sting injuries and describes the diversity of animal exposures based on a national sample of EDs. Treatment of nonfatal bite and sting injuries is costly to society; direct medical costs and work time lost translate to an estimated $7.5 billion annually. PMID:24433776
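The annualized rate reported in this abstract is total ED visits divided by person-years at risk, per 100,000 population. A minimal sketch of that arithmetic; the average US population figure below is an illustrative assumption, not a value from the study:

```python
# Back-of-envelope check of the NEISS-AIP annualized rate calculation.
# The study weights sampled ED cases to national estimates; here we only
# reproduce the final rate arithmetic. The population figure is an
# assumed average over 2001-2010, chosen for illustration.
total_visits = 10_100_000      # estimated noncanine bite/sting ED visits, 2001-2010
years = 10
us_population = 297_000_000    # assumption: approximate average US population

annual_visits = total_visits / years
rate_per_100k = annual_visits / us_population * 100_000
print(round(rate_per_100k, 1))  # ~340, in line with the reported 340.1
```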
2. View northwest of main hospital building complex, hospital building (Building 90), administration and clinical hospital building (Building 88), and hospital building (Building 91) - National Home for Disabled Volunteer Soldiers Western Branch, 4101 South Fourth Street, Leavenworth, Leavenworth County, KS
..., CERTIFICATION, AND ENFORCEMENT PROCEDURES General Provisions § 488.6 Other national accreditation programs for... health agency providers of outpatient physical therapy, occupational therapy or speech pathology...
Ahn, Tae-Sa; Park, Ihn Sook; You, Ock-Su; Shin, Hyeon-Ju; Woo, Kyung-Shun; Jo, Eun-Mee
In an effort to investigate nurses' perceptions of and attitudes toward the use of electronic medical record (EMR) systems, 904 nurses in a university hospital were surveyed for demographic data and their perceptions of and attitudes toward an EMR system 6 months after its implementation. The questionnaire consisted of demographic information, perception statements relating to the effect of an EMR system, and attitude statements toward an EMR system (assessed on 4-point Likert scales, Cronbach's alpha = 0.979). Nurses' perceptions and attitudes were generally positive and correlated with the type of nursing unit, and their age, years of nursing experience, and job title. This result reinforces that nurses are generally accepting of the implementation of a new EMR system. However, strategies are needed for improving the satisfaction of nurses who have a negative perception of and attitude toward EMR systems. It is recommended that the findings of our study be implemented in other hospitals with ongoing EMR projects. PMID:17102423
Tabtabai, Sara; DeFaria Yeh, Doreen; Stefanescu, Ada; Kennedy, Kevin; Yeh, Robert W; Bhatt, Ami B
Patients with single-ventricle (SV) anatomy now live to adulthood. Little is known about the cost of care and outcomes for patients with SV anatomy, especially those who develop heart failure (HF) cared for in adult hospitals in the United States. We analyzed the Nationwide Inpatient Sample from 2000 to 2011 for patients >14 years admitted to adult hospitals with the International Classification of Diseases, Ninth Revision, codes for SV anatomy. Demographics, outcomes, co-morbidities, and cost were assessed. From 2000 to 2011, the number of SV admissions was stable with a trend toward increased cost per admission over time. Coexistent hypertension, obesity, and liver, pulmonary, and renal diseases significantly increased over time. The most common reason for admission was atrial arrhythmia followed by HF. Patients with SV with HF had significantly higher in-hospital mortality, length of stay, and more medical co-morbidities than those with SV and without HF. In conclusion, the cohort of patients with SV admitted to adult hospitals has changed in the modern era. Patients with SV have medical co-morbidities including renal and liver diseases, hypertension, and obesity at a surprisingly young age. Aggressive and proactive management of HF and arrhythmia may reduce cost of care for this challenging population. Patients with SV with HF have particularly high mortality, more medical co-morbidities, and increased cost of care and deserve more focused attention to improve outcomes. PMID:26100589
Nakamura-Pereira, Marcos; Mendes-Silva, Wallace; Dias, Marcos Augusto Bastos; Reichenheim, Michael E; Lobato, Gustavo
This study aimed to investigate the performance of the Hospital Information System of the Brazilian Unified National Health System (SIH-SUS) in identifying cases of maternal near miss in a hospital in Rio de Janeiro, Brazil, in 2008. Cases were identified by reviewing medical records of pregnant and postpartum women admitted to the hospital. The search for potential near miss events in the SIH-SUS database relied on a list of procedures and codes from the International Classification of Diseases, 10th revision (ICD-10) that were consistent with this diagnosis. The patient chart review identified 27 cases, while 70 potential occurrences of near miss were detected in the SIH-SUS database. However, only 5 of 70 were "true cases" of near miss according to the chart review, which corresponds to a sensitivity of 18.5% (95%CI: 6.3-38.1), specificity of 94.3% (95%CI: 92.8-95.6), area under the ROC curve of 0.56 (95%CI: 0.48-0.63), and positive predictive value of 10.1% (95%CI: 4.7-20.3). These findings suggest that SIH-SUS does not appear appropriate for monitoring maternal near miss. PMID:23843001
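The reported sensitivity follows directly from the chart-review counts above (5 of the 27 true near-miss cases were among those flagged in SIH-SUS); a minimal check of that arithmetic:

```python
# Sensitivity check for the SIH-SUS near-miss validation above.
# Chart review found 27 true cases; only 5 of them appeared among the
# 70 potential cases flagged in the SIH-SUS database.
true_cases = 27
true_positives = 5
false_negatives = true_cases - true_positives  # 22 true cases missed

sensitivity = round(100 * true_positives / true_cases, 1)
print(sensitivity)  # 18.5, as reported
```

(The specificity and predictive values additionally depend on the total number of charts reviewed, which the abstract does not state, so they are not reproduced here.)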
Linden, Stefanie Caroline; Jones, Edgar
During the First World War the National Hospital for the Paralysed and Epileptic, in Queen Square, London, then Britain’s leading centre for neurology, took a key role in the treatment and understanding of shell shock. This paper explores the case notes of all 462 servicemen who were admitted with functional neurological disorders between 1914 and 1919. Many of these were severe or chronic cases referred to the National Hospital because of its acknowledged expertise and the resources it could call upon. Biographical data was collected together with accounts of the patient’s military experience, his symptoms, diagnostic interpretations and treatment outcomes. Analysis of the notes showed that motor syndromes (loss of function or hyperkinesias), often combined with somato-sensory loss, were common presentations. Anxiety and depression as well as vegetative symptoms such as sweating, dizziness and palpitations were also prevalent among this patient population. Conversely, psychogenic seizures were reported much less frequently than in comparable accounts from German tertiary referral centres. As the war unfolded the number of physicians who believed that shell shock was primarily an organic disorder fell as research failed to find a pathological basis for its symptoms. However, little agreement existed among the Queen Square doctors about the fundamental nature of the disorder and it was increasingly categorised as functional disorder or hysteria. PMID:25284893
Ertl, Christian W; Royal, David; Arzoiey, Humayoon Abdul; Shefa, Azizullah; Sultani, Salim; Mosafa, Mohammed Omar; Sadat, Safiullah; Zirkle, Lewis
In Afghanistan, adequate and cost-effective medical care for even routine conditions is lacking, especially for complex injuries like long-bone fractures. The Surgical Implant Generation Network (SIGN) intramedullary nail is used for treatment of long-bone fractures from blunt injuries and does not require intraoperative imaging. We report for the first time results of the SIGN intramedullary nail at the Afghan National Police Hospital, a tertiary care facility in Kabul. 71 records from the SIGN Online Surgical Database were reviewed for gender, age, date of injury, implant date, patient's home of record, and type/mechanism of injury. Mean age was 26.7 years, and all but one patient was male; time from injury to implant ranged from 1 to 401 days, with a mean of 40.6 days. Long-bone fractures from motor vehicle accidents remained constant, and war injuries peaked in summer. Follow-up is limited because of security and financial burdens of travel. However, personal communication with Afghan National Police Hospital surgeons suggests that patients included in the current study have not experienced any adverse outcomes. While it remains to be seen if the SIGN Online Surgical Database will facilitate more comprehensive outcome studies, our results provide support for the efficacy of SIGN nails in treating long-bone fractures from war injuries. PMID:26741473
Bennett, Christine C
After extensive community and health industry consultation, the final report of the National Health and Hospitals Reform Commission, A healthier future for all Australians, was presented to the Australian Government on 30 June 2009. The reform agenda aims to tackle major access and equity issues that affect health outcomes for people now; redesign our health system so that it is better positioned to respond to emerging challenges; and create an agile, responsive and self-improving health system for long-term sustainability. The 123 recommendations are grouped in four themes: Taking responsibility: supporting greater individual and collective action to build good health and wellbeing. Connecting care: delivering comprehensive care for people over their lifetime, by strengthening primary health care, reshaping hospitals, improving subacute care, and opening up greater consumer choice and competition in aged care services. Facing inequities: taking action to tackle the causes and impact of health inequities, focusing on Aboriginal and Torres Strait Islander people, people in rural and remote areas, and access to mental health and dental services. Driving quality performance: having leadership and systems to achieve the best use of people, resources and knowledge, including "one health system" with national leadership and local delivery, revised funding arrangements, and changes to health workforce education, training and practice. PMID:19807629
Background We aimed to examine current practice of the management and secondary prevention of intracerebral haemorrhage (ICH) in China, where the disease is more common than in Western populations. Methods Data on baseline characteristics, management in-hospital and post-stroke, and outcome of ICH patients are from the ChinaQUEST (QUality Evaluation of Stroke Care and Treatment) study, a multi-centre, prospective, 62-hospital registry in China during 2006-07. Results Nearly all ICH patients (n = 1572) received an intravenous haemodiluting agent such as mannitol (96%) or a neuroprotectant (72%), and there was high use of intravenous traditional Chinese medicine (TCM) (42%). Neurosurgery was undertaken in 137 (9%) patients; being overweight, having a low Glasgow Coma Scale (GCS) score on admission, and Total Anterior Circulation Syndrome (TACS) clinical pattern on admission were the only baseline factors associated with this intervention in multivariate analyses. Neurosurgery was associated with nearly three times higher risk of death/disability at 3 months post-stroke (odds ratio [OR] 2.60, p < 0.001). Continuation of antihypertensives in-hospital and at 3 and 12 months post-stroke was reported in 732/935 (78%), 775/935 (83%), and 752/935 (80%) living patients with hypertension, respectively. Conclusions The management of ICH in China is characterised by high rates of use of intravenous haemodiluting agents, neuroprotectants, and TCM, and of antihypertensives for secondary prevention. The controversial efficacy of these therapies, coupled with the current lack of treatments of proven benefit, is a call for action for more outcomes-based research in ICH. PMID:21276264
Some are calling it the Enron of the healthcare industry. Ryder trucks hauled possible evidence from embattled financier National Century Financial Enterprises during an FBI raid. NCFE filed for Chapter 11 bankruptcy protection last week, sending ripples through the industry and contributing to the bankruptcies of a string of national healthcare chains and at least six hospitals. PMID:12510558
Hedrick, A.; Marshall, H.-P.; Winstral, A.; Elder, K.; Yueh, S.; Cline, D.
Repeated Light Detection and Ranging (LiDAR) surveys are quickly becoming the de facto method for measuring spatial variability of montane snowpacks at high resolution. This study examines the potential of a 750 km2 LiDAR-derived dataset of snow depths, collected during the 2007 northern Colorado Cold Lands Processes Experiment (CLPX-2), as a validation source for an operational hydrologic snow model. The SNOw Data Assimilation System (SNODAS) model framework, operated by the US National Weather Service, combines a physically-based energy-and-mass-balance snow model with satellite, airborne and automated ground-based observations to provide daily estimates of snowpack properties at nominally 1 km resolution over the coterminous United States. Because SNODAS assimilates most available observations, independent validation data is scarce, compelling the need for a validation dataset with substantial geographic coverage. Within twelve distinctive 500 m × 500 m study areas located throughout the survey swath, ground crews performed approximately 600 manual snow depth measurements during each of the CLPX-2 LiDAR acquisitions. This supplied a dataset for constraining the uncertainty of upscaled LiDAR estimates of snow depth at the 1 km SNODAS resolution, resulting in a root-mean-square difference of 13 cm. Upscaled LiDAR snow depths were then compared to the SNODAS estimates over the entire study area for the dates of the LiDAR flights. The remotely-sensed snow depths provided a more spatially continuous comparison dataset and agreed more closely with the model estimates than did the in situ measurements alone. Finally, the results revealed three distinct areas where the differences between LiDAR observations and SNODAS estimates were most drastic, suggesting natural processes specific to these regions as causal influences on model uncertainty.
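The root-mean-square difference quoted above is the standard pairwise comparison statistic. A minimal sketch of the calculation, with hypothetical snow depths rather than CLPX-2 data:

```python
import math

def rmsd(a, b):
    """Root-mean-square difference between paired depth estimates (same units)."""
    assert len(a) == len(b) and len(a) > 0
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# hypothetical upscaled-LiDAR vs manual snow depths at matched points (cm)
lidar = [95, 110, 87, 132]
manual = [100, 104, 90, 120]
print(round(rmsd(lidar, manual), 1))  # 7.3
```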
d'Orsi, Eleonora; Brüggemann, Odaléa Maria; Diniz, Carmen Simone Grilo; Aguiar, Janaina Marques de; Gusman, Christine Ranier; Torres, Jacqueline Alves; Angulo-Tuesta, Antonia; Rattner, Daphne; Domingues, Rosa Maria Soares Madeira
The objective is to identify factors associated with women's satisfaction towards the care provided by the health professionals during hospital assisted delivery and identify how those factors influence their general levels of satisfaction. The cohort hospital based study was carried out in connection with the Birth in Brazil research. 15,688 women were included, interviewed at home, through the phone, from March 2011 to February 2012. All the variables that compose the professional/pregnant woman relationship (waiting time, respect, privacy, clarity of explanations, possibility of asking questions and participating in the decisions) and schooling remained independently associated with general satisfaction towards delivery care, in the adjusted model. The white women assisted in the southeastern and southern regions of the country, by the private sector and with a companion present gave a better evaluation of the care provided. Women value the way in which they are assisted by the health professionals, and there are inequalities in the way they are treated based on skin color, geographic region and financial situation. PMID:25167175
Unick, George Jay; Rosenblum, Daniel; Mars, Sarah; Ciccarone, Daniel
The historical patterns of opiate use show that sources and methods of access greatly influence who is at risk. Today, there is evidence that an enormous increase in the availability of prescription opiates is fuelling a rise in addiction nationally, drawing in new initiates to these drugs and changing the geography of opiate overdoses. Recent efforts at supply-based reductions in prescription opiates may reduce harm, but addicted individuals may switch to other opiates such as heroin. In this analysis, we test the hypothesis that changes in the rates of prescription opiate overdoses (POD) are correlated with changes in the rate of heroin overdoses (HOD). ICD-9 codes from the Nationwide Inpatient Sample and population data from the Census were used to estimate overall and demographic-specific rates of POD and HOD hospital admissions between 1993 and 2009. Regression models were used to test for linear trends, and lagged negative binomial regression models were used to model the interrelationship between POD and HOD hospital admissions. Findings show that whites, women, and middle-aged individuals had the largest increase in POD and HOD rates over the study period and that HOD rates have increased since 2007. The lagged models show that an increase in a hospital's POD admissions predicts an increase in the subsequent year's HOD admissions by a factor of 1.26 (p<0.001) and that each increase in HOD admissions increases the subsequent year's POD by a factor of 1.57 (p<0.001). Our hypothesis of fungibility between prescription opiates and heroin was supported by these analyses. These findings suggest that focusing on supply-based interventions may simply lead to a shift in use to heroin rather than a reduction in harm. The alternative approach of directing drug abuse prevention resources to treatment and demand-side reduction is likely to be more productive at reducing opiate abuse related harm. PMID:23405084
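The lagged design described above pairs each hospital-year's HOD outcome with the previous year's POD exposure before model fitting. A minimal stdlib sketch of building such lagged predictors, using hypothetical counts rather than Nationwide Inpatient Sample data:

```python
# hypothetical (hospital, year) -> annual overdose admission counts
pod = {("A", 2007): 10, ("A", 2008): 14, ("B", 2007): 3, ("B", 2008): 5}
hod = {("A", 2008): 6, ("B", 2008): 2}

# pair each hospital-year HOD outcome with the prior year's POD exposure;
# these rows would then feed a negative binomial regression
rows = []
for (hosp, year), y in sorted(hod.items()):
    x_lag = pod.get((hosp, year - 1))
    if x_lag is not None:
        rows.append({"hospital": hosp, "year": year, "hod": y, "pod_lag1": x_lag})

print(rows)
```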
Outwater, Anne H.; Campbell, Jacquelyn C.; Mgaya, Edward
A foundational implementation of the WHO/CDC Injury Surveillance Guidelines was conducted in Dar es Salaam region of the United Republic of Tanzania in 2005. The Guidelines were adapted to gather qualitative as well as quantitative data about intentional injury mortality which were collected concurrently at the Muhimbili National Hospital Mortuary. An interview schedule of 12 quantitative variables and one open-ended question, participant observation and newspaper reports were used. Mixed methods allowed an understanding of intentional injury mortality to emerge, even for those with the least amount of data, the 22% of homicides whose bodies were never claimed. Mixed methods made it possible to quantify intentional injury mortality rates, describe subpopulations with scanty data, and learn how to embed ongoing injury mortality surveillance into daily practice. PMID:24130432
In 2006, the Kenyan state joined the international commitment to make antiretroviral treatment free in public health institutions to people infected with HIV. Less than a decade later, treatment has reached over 60% of those who need it in Kenya. This paper, which is based on an in-depth ethnographic case study of the HIV treatment programme at Kenyatta National Hospital, conducted intermittently between 2008 and 2014, examines how HIV-positive peer mentors encourage and track adherence to treatment regimens within and beyond the clinic walls using mobile phones and computer technology. This research into the everyday practices of patient monitoring demonstrates that both surveillance and adherence are collective activities. Peer mentors provide counselling services, follow up people who stray from treatment regimens, and perform a range of other tasks related to patient management and treatment adherence. Despite peer mentors’ involvement in many tasks key to encouraging optimal adherence, their role is rarely acknowledged by co-workers, hospital administrators, or public health officials. Following a biomedical paradigm, adherence at Kenyatta and in Kenya is framed by programme administrators as something individual clients must do and for which they must be held accountable. This framing simultaneously conceals the sociality of adherence and undervalues the work of peer mentors in treatment programmes. PMID:25175291
Catering & Hospitality, Serving Food & Drink, Levels 1-3. 2nd Edition. Catering & Hospitality, Reception & Housekeeping, Levels 1-3. Catering & Hospitality, Supervisory Management, Level 3. Catering & Hospitality Management, Level 4. 2nd Edition. National Vocational Qualifications.
Business and Technology Education Council, London (England).
Britain's National Vocational Qualifications (NVQs) are work qualifications that measure what an employee or potential employee can do as well as how much he or she knows and understands about a particular job. Used as written proof of usable workplace skills that can be put to profitable use by an employer, NVQs range from basic Level 1, for…
Langan Martin, Julie; McLean, Gary; Cantwell, Roch; Smith, Daniel J
Objective To describe weekly admission rates for affective and non-affective psychosis, major depression and other psychiatric disorders in the early and late postpartum periods. To assess the impact of socioeconomic status, age and parity on admission rates. Methods Scottish maternity records were linked to psychiatric hospital admissions. 3290 pregnancy-related psychiatric admissions were assessed. Weekly admission rates were calculated for the pregnancy period, early postpartum period (6 weeks after birth) and late postpartum period (up to 2 years after birth), and compared with pre-pregnancy rates (up to 2 years before pregnancy). Admission rates were generated by calculating the total number of admissions for each time period divided by the number of weeks in the period. Incidence rate ratios (IRRs) were generated for each time period using Poisson regression models adjusted for deprivation, age, parity and record of previous psychiatric hospital care. Results Women from more deprived social quintiles accounted for the largest proportion of admissions across all time periods. Compared with the pre-pregnancy period, admission rates fell during pregnancy, increased markedly during the early postpartum period, and remained elevated for 2 years after childbirth. Within the most affluent quintile, admission IRRs were higher in the early postpartum period (IRR=1.29, 95% CI 1.02 to 1.59) than in the late postpartum period (IRR=0.87, 95% CI 0.74 to 0.98). For the late postpartum period, there was a positive association between higher maternal age and admission IRRs (ages 20–35 years, IRR=1.35, 95% CI 1.16 to 1.54 and age >40 years, IRR=1.72, 95% CI 1.41 to 2.09). Conclusions Rates of psychiatric admission fell during pregnancy and increased in the early postpartum period (particularly during the first 2 weeks after birth), and remained elevated above baseline during the 2-year late postpartum period. An understanding of how social deprivation, age and parity
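The rate calculation described above (total admissions in a period divided by the weeks in that period) can be sketched as, with hypothetical counts rather than the linked Scottish data:

```python
def weekly_admission_rate(total_admissions: int, weeks_in_period: float) -> float:
    """Admissions per week over a given observation period."""
    return total_admissions / weeks_in_period

# e.g. a hypothetical 12 admissions over the 6-week early postpartum window
print(weekly_admission_rate(12, 6))  # 2.0
```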
Vierling, K T; Bässler, C; Brandl, R; Vierling, L A; Weiss, I; Müller, J
LiDAR remote sensing has been used to examine relationships between vertebrate diversity and environmental characteristics, but its application to invertebrates has been limited. Our objectives were to determine whether LiDAR-derived variables could be used to accurately describe single-species distributions and community characteristics of spiders in remote forested and mountainous terrain. We collected over 5300 spiders across multiple transects in the Bavarian National Park (Germany) using pitfall traps. We examined spider community characteristics (species richness, the Shannon index, the Simpson index, community composition, mean body size, and abundance) and single-species distribution and abundance with LiDAR variables and ground-based measurements. We used the R2 and partial R2 provided by variance partitioning to evaluate the predictive power of LiDAR-derived variables compared to ground measurements for each of the community characteristics. The total adjusted R2 for species richness, the Shannon index, community species composition, and body size had a range of 25-57%. LiDAR variables and ground measurements both contributed >80% to the total predictive power. For species composition, the explained variance was approximately 32%, which was significantly greater than expected by chance. The predictive power of LiDAR-derived variables was comparable or superior to that of the ground-based variables for examinations of single-species distributions, and it explained up to 55% of the variance. The predictability of species distributions was higher for species that had strong associations with shade in open-forest habitats, and this niche position has been well documented across the European continent for spider species. The similar statistical performance between LiDAR and ground-based measures at our field sites indicated that deriving spider community and species distribution information using LiDAR data can provide not only high predictive power at
Background Surgical site infections (SSIs) remain a common and widespread problem contributing to significant morbidity and mortality, attributed partly to the increase in antimicrobial resistance among the etiological agents. This study was done to determine the spectrum of bacterial isolates and their susceptibility patterns causing SSIs at Muhimbili National Hospital, Tanzania. Methods This descriptive cross sectional study was conducted between September, 2011 and February, 2012. Pus swabs or pus were cultured on blood agar (Oxoid, UK) and MacConkey agar (Oxoid, UK) and incubated aerobically at 37°C for 18–24 hours. Bacterial identification was done using API 20E and VITEK, and antimicrobial susceptibility was determined by Kirby Bauer disc diffusion. Results Of the 100 patients from whom wound swabs were collected, 90 (90%) had positive aerobic bacterial growth. A total of 147 pathogenic bacteria were isolated, including 114 (77.5%) gram negative and 33 (22.5%) gram positive organisms. The most prevalent bacterial species were Pseudomonas aeruginosa (16.3%), followed by Staphylococcus aureus (12.2%) and Klebsiella pneumoniae (10.8%). Of the 18 S. aureus isolates, 8 (44%) were methicillin resistant Staphylococcus aureus (MRSA) and three of these (17%) carried both MRSA and inducible clindamycin resistance (ICR). Extended spectrum beta-lactamase (ESBL) producing Enterobacteriaceae were observed in 23 (79.3%) of the 29 isolates tested. The majority of Escherichia coli (12; 92.3%) and K. pneumoniae (11; 69%) isolates were ESBL producers. About 63% (93/147) were multidrug-resistant (MDR) isolates, and the overall MDR among Gram positive and Gram negative bacteria was 60.6% (20/33) and 61.4% (73/114), respectively. The prevalence of MDR for E. coli, A. baumannii and P. stuartii was 100% each. The majority (97%) of the Gram negative bacteria were resistant to more than four categories (classes) of antibiotics. Conclusion A high proportion (63%) of the isolates causing
Mohamed-Ahmed, Olaa; McClymont, Charlotte; Knight, Marian
Objectives In countries, such as the UK, where maternal deaths are rare, reviews of other severe complications of pregnancy and the puerperium can provide an additional perspective to help learn lessons to improve future care. The objective of this survey was to identify the types of incidents which triggered local reviews in the UK, in order to inform national safety reporting guidance. Design A national descriptive survey. Setting UK. Participants Consultant-led maternity units. Main outcome measure Seventy-one per cent of maternity units provided an incident review trigger list. The conditions included were classified by two assessors. Incidents that were listed by at least 5% of maternity units were reported and compared with incidents recommended for review by the Royal College of Obstetricians and Gynaecologists (RCOG). Results The conditions covered were highly variable, although those recommended by the RCOG were most highly represented. The most commonly listed conditions that had not been recommended for review by the RCOG included inadequate staffing levels (70%), cardiac arrest (69%) and maternal sepsis (64%). Conclusions Substantial variation exists in the types of incident listed for review by maternity units in the UK. Importantly, some units are not reviewing cases of severe infective complications even though this is a current major concern. Future guidance concerning local serious incident review processes should include how the list of conditions triggering a review should be managed in the light of changing clinical and safety priorities. PMID:25057407
Gay, Greer; Patel-Parekh, Lina; Ajani, Jaffer A.; Donohue, John H.
The concept that complex surgical procedures should be performed at high-volume centers to improve surgical morbidity and mortality is becoming widely accepted. We wanted to determine if there were differences in the treatment of patients with gastric cancer between community cancer centers and teaching hospitals in the United States. Data from the 2001 Gastric Cancer Patient Care Evaluation Study of the National Cancer Data Base, comprising 6,047 patients with gastric adenocarcinoma treated at 691 hospitals, were assessed. The mean number of patients treated was larger at teaching hospitals (14/year) when compared to community centers (5–9/year) (p < 0.05). The utilization of laparoscopy and endoscopic ultrasonography was significantly more common at teaching centers (p < 0.01). Pathologic assessment of greater than 15 nodes was documented in 31% of specimens at community hospitals and 38% at teaching hospitals (p < 0.01). Adjusted for cancer stage, chemotherapy and radiation therapy were utilized with equal frequency at all types of treatment centers. The 30-day postoperative mortality was lowest at teaching hospitals (5.5%) and highest at community hospitals (9.9%) (p < 0.01). These data support previous publications demonstrating that patients with diseases requiring specialized treatment have lower operative mortality when treated at high-volume centers. PMID:17436123
Donaldson, Liam J.; Panesar, Sukhmeet S.; Darzi, Ara
Background Hospital mortality is increasingly being regarded as a key indicator of patient safety, yet methodologies for assessing mortality are frequently contested and seldom point directly to areas of risk and solutions. The aim of our study was to classify reports of deaths due to unsafe care into broad areas of systemic failure capable of being addressed by stronger policies, procedures, and practices. The deaths were reported to a patient safety incident reporting system after mandatory reporting of such incidents was introduced. Methods and Findings The UK National Health Service database was searched for incidents resulting in a reported death of an adult over the period of the study. The study population comprised 2,010 incidents involving patients aged 16 y and over in acute hospital settings. Each incident report was reviewed by two of the authors, and, by scrutinising the structured information together with the free text, a main reason for the harm was identified and recorded as one of 18 incident types. These incident types were then aggregated into six areas of apparent systemic failure: mismanagement of deterioration (35%), failure of prevention (26%), deficient checking and oversight (11%), dysfunctional patient flow (10%), equipment-related errors (6%), and other (12%). The most common incident types were failure to act on or recognise deterioration (23%), inpatient falls (10%), healthcare-associated infections (10%), unexpected per-operative death (6%), and poor or inadequate handover (5%). Analysis of these 2,010 fatal incidents reveals patterns of issues that point to actionable areas for improvement. Conclusions Our approach demonstrates the potential utility of patient safety incident reports in identifying areas of service failure and highlights opportunities for corrective action to save lives. Please see later in the article for the Editors' Summary PMID:24959751
Isfahani, Sakineh Saghaeiannejad; Khajouei, Reza; Jahanbakhsh, Maryan; Mirmohamadi, Mahboubeh
Introduction: Nowadays, modern laboratories are faced with a huge volume of information. One of the goals of the Laboratory Information Management System (LIMS) is to assist in the management of the information generated in the laboratory. This study intends to evaluate the LIMS based on the standards of the American National Standards Institute (ANSI). Materials and Methods: This descriptive-analytical study was conducted in 2011 on the LIMSs in use in the teaching and private hospitals in Isfahan. The data collecting instrument was a checklist constructed by evaluating three groups of information components, namely 'system capabilities', 'work list functions,' and 'reporting', based on LIS8-A. Data were analyzed using SPSS 20 and summarized as (relative) frequencies and percentages; to compare groups, Levene's test, the t-test, and analysis of variance (ANOVA) were used. Results: The results of the study indicated that the LIMSs had a low conformity (30%) with LIS8-A (P = 0.001), with no difference between teaching and private hospitals (P = 0.806). The ANOVA revealed that in terms of conformity with the LIS8-A standard, there was a significant difference between the systems produced by different vendors (P = 0.023). According to the results, the Kowsar system, with more than 57% conformity in the three groups of information components, had a better conformity to the standard compared to the other systems. Conclusions: This study indicated that none of the LIMSs had a good conformity to the standard. It seems that system providers did not pay sufficient attention to many of the information components required by the standards when designing and developing their systems. It is suggested that standards from certified organizations and institutions be followed in the design and development process of health information systems. PMID:25077154
An assessment of composite measures of hospital performance and associated mortality for patients with acute myocardial infarction. Analysis of individual hospital performance and outcome for the National Institute for Cardiovascular Outcomes Research (NICOR)
Baxter, Paul D; Cattle, Brian A; Batin, Phillip D; Wilson, John I; West, Robert M; Hall, Alistair S; Weston, Clive F; Deanfield, John E; Fox, Keith A; Gale, Chris P
Aim: To investigate whether a hospital-specific opportunity-based composite score (OBCS) was associated with mortality in 136,392 patients with acute myocardial infarction (AMI) using data from the Myocardial Ischaemia National Audit Project (MINAP) 2008–2009. Methods and results: For 199 hospitals a multidimensional hospital OBCS was calculated on the number of times that aspirin, thienopyridine, angiotensin-converting enzyme inhibitor (ACEi), statin, β-blocker, and referral for cardiac rehabilitation was given to individual patients, divided by the overall number of opportunities that hospitals had to give that care. OBCS and its six components were compared using funnel plots. Associations between OBCS performance and 30-day and 6-month all-cause mortality were quantified using mixed-effects regression analysis. Median hospital OBCS was 95.3% (range 75.8–100%). By OBCS, 24.1% of hospitals were below funnel plot 99.8% CI, compared to aspirin (11.1%), thienopyridine (15.1%), β-blockers (14.7%), ACEi (19.1%), statins (12.1%), and cardiac rehabilitation (17.6%) on discharge. Mortality (95% CI) decreased with increasing hospital OBCS quartile at 30 days [Q1, 2.25% (2.07–2.43%) vs. Q4, 1.40% (1.25–1.56%)] and 6 months [Q1, 7.93% (7.61–8.25%) vs. Q4, 5.53% (5.22–5.83%)]. Hospital OBCS quartile was inversely associated with adjusted 30-day and 6-month mortality [OR (95% CI), 0.87 (0.80–0.94) and 0.92 (0.88–0.96), respectively] and persisted after adjustment for coronary artery catheterization [0.89 (0.82–0.96) and 0.95 (0.91–0.98), respectively]. Conclusions: Multidimensional hospital OBCS in AMI survivors are high, discriminate hospital performance more readily than single performance indicators, and significantly inversely predict early and longer-term mortality. PMID:24062929
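The opportunity-based composite score defined above pools, across indicators, the number of times care was delivered over the number of eligible opportunities. A minimal sketch with hypothetical counts (not MINAP data):

```python
def obcs(indicators):
    """Opportunity-based composite score (%): total care actions delivered
    divided by total eligible opportunities, pooled across indicators."""
    given = sum(g for g, e in indicators)
    eligible = sum(e for g, e in indicators)
    return 100 * given / eligible

# hypothetical hospital: (times given, eligible opportunities) per indicator
hospital = [(95, 100),  # aspirin
            (88, 95),   # thienopyridine
            (70, 80),   # ACEi
            (92, 98),   # statin
            (85, 94),   # beta-blocker
            (60, 90)]   # cardiac rehabilitation referral
print(round(obcs(hospital), 1))  # 88.0
```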
Mwangala, Sheila; Moland, Karen M.; Nkamba, Hope C.; Musonda, Kunda G.; Monze, Mwaka; Musukwa, Katoba K.; Fylkesnes, Knut
Background With new testing technologies, task-shifting and rapid scale-up of HIV testing services in high HIV prevalence countries, assuring quality of HIV testing is paramount. This study aimed to explore various cadres of providers’ experiences in providing HIV testing services and their understanding of elements that impact on quality of service in Zambia. Methods Sixteen in-depth interviews and two focus group discussions were conducted with HIV testing service providers including lay counselors, nurses and laboratory personnel at purposively selected HIV testing sites at a national reference hospital in Lusaka. Qualitative content analysis was adopted for data analysis. Results Lay counselors and nurses reported confidentiality and privacy to be greatly compromised due to limited space in both in- and out-patient settings. Difficulties in upholding consent were reported in provider-initiated testing in in-patient settings. The providers identified non-adherence to testing procedures, high workload and inadequate training and supervision as key elements impacting on quality of testing. Difficulties related to testing varied by sub-groups of providers: lay counselors, in finger pricking and obtaining adequate volumes of specimen; non-laboratory providers in general, in interpreting invalid, false-negative and false-positive results. The providers had been participating in a recently established national HIV quality assurance program, i.e. proficiency testing, but rarely received site supervisory visits. Conclusion Task-shifting coupled with policy shifts in service provision has seriously challenged HIV testing quality, protection of confidentiality and the process of informed consent. Ways to better protect confidentiality and informed consent need careful attention. Training, supervision and quality assurance need strengthening tailored to the needs of the different cadres of providers. PMID:26605800
Ki-Zerbo, G A; Guigma, Y
Noma (cancrum oris) is a gangrenous stomatitis arising from a periodontal infection and leading to severe soft tissue and bone destruction. The pathology involves numerous factors including local thrombosis, vasculitis, necrotizing gingivitis, immunodeficiency, and gram negative and anaerobic infection. It is usually a disease of infants and malnourished children in tropical areas, often occurring after a debilitating disease like measles. Recently, cases have been reported in adults, especially elderly patients or during immunodeficiency states. Reconstructive surgery is often necessary to deal with the destruction and sequelae but is rarely accessible in developing countries. We report one case of noma (cancrum oris) in an HIV seropositive patient at the National Hospital in Bobo-Dioulasso. Noma was the presenting manifestation of AIDS in a 40-year-old labourer returning from Ivory Coast, and no major opportunistic infection was associated. The course was fulminant, leading to extensive facial gangrene with recurrent bacterial infections. The disease was fatal in this depressed, malnourished and diarrhoeic patient despite local surgical treatment, prolonged antibiotic therapy and supportive care. Pathogenic mechanisms, management and preventive issues are discussed. PMID:11887587
Freeman, Marlene P; Sosinsky, Alexandra Z; Moustafa, Danna; Viguera, Adele C; Cohen, Lee S
Women of reproductive age commonly use integrative treatments. However, the reproductive safety of most complementary products lacks systematic study. We aimed to study the use of supplements by women in a prospective pregnancy registry. The Massachusetts General Hospital National Pregnancy Registry for Atypical Antipsychotics was established to evaluate the reproductive safety of atypical antipsychotics. Exposed and control participants were systematically queried about their use of vitamins and supplements. Slightly more than half (53.2 %) of the participants eligible for analysis (N = 534) were using at least one vitamin or supplement at the time of enrollment, not including prenatal vitamins or folic acid. The most common supplements used were omega-3 fatty acids (38.0 %), vitamin D (11.0 %), calcium (8.2 %), and iron (4.7 %). Probiotics and melatonin were used by 2.6 and 0.9 %, respectively. In this prospective pregnancy registry, we found that over half of the participants were taking supplements or vitamins other than prenatal vitamins and folic acid. These findings underscore the need for active querying by health care providers about the use of supplements during pregnancy, and the need to obtain rigorous reproductive safety and efficacy data for supplements used by pregnant and reproductive-aged women. PMID:26472040
IMANI NASAB, Mohammad Hasan; MOHAGHEGH, Bahram; KHALESI, Nader; JAAFARIPOOYAN, Ebrahim
Background The European Foundation for Quality Management (EFQM) model is a quality management system (QMS) widely used worldwide, including in Iran. The current study aims to verify the quality assessment results of the Iranian National Program for Hospital Evaluation (INPHE) against those of EFQM. Methods: This cross-sectional study was conducted in 2012 on a sample of emergency departments (EDs) affiliated with Tehran University of Medical Sciences (TUMS), Iran. The standard EFQM questionnaire (V-2010) was used to gather the data, and the results were compared with those of INPHE. MS Excel was used to classify and display the findings. Results: The average assessment scores of the EDs based on INPHE and the EFQM model differed markedly (86.4% and 31%, respectively). In addition, the variation range among the five EDs' scores differed considerably between the models (22% for EFQM against 7% for INPHE), especially between EDs with and without a prior record of applying QMSs. Conclusion: INPHE's assessment results were not confirmed by the EFQM model. Moreover, the wider variation range among EDs' scores under the EFQM model may indicate its greater power to differentiate performance compared with INPHE. Improvement of INPHE, drawing on the strengths of other QMSs such as EFQM, therefore seems necessary. PMID:23967429
Hedrick, A. R.; Marshall, H. P.; Winstral, A. H.; Elder, K.; Yueh, S. H.; Cline, D. W.
As survey costs continue to plummet and storage capabilities soar, large-scale multitemporal airborne Light Detection and Ranging (LiDAR) surveys for high-resolution snow depth measurements are becoming commonplace in mountain research watersheds. Though the technique has disadvantages (e.g. poor temporal representation and high uncertainty in steep terrain and dense vegetation), the wealth of information on previously unknown spatial snow depth distributions can be a valuable tool for assessing spatially distributed operational snow models. As a portion of NASA's second Cold Land Processes Experiment (CLPX-2), two 750-km2 LiDAR surveys were conducted over Northern Colorado in December and February of the 2006/2007 winter season. The resulting 5-m gridded changes in snow depth overlay 980 individual pixels of the SNOw Data Assimilation System (SNODAS) spatial framework. As an important operational snow model developed by NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC), SNODAS generally lacks independent validation datasets due to the data assimilation step critical for adjusting the energy balance and downscaled Numerical Weather Prediction (NWP) model components. The influence of sub-grid variability on SNODAS performance is assessed using the independent high-resolution CLPX-2 LiDAR changes in snow depth. This method provides a foundation for further studies to quantitatively address the effect of small-scale physiographic variables on various large-scale operational snow models by making use of forthcoming large-scale LiDAR datasets.
Tamiya, Hiroyuki; Yasunaga, Hideo; Matusi, Hiroki; Fushimi, Kiyohide; Ogawa, Sumito; Akishita, Masahiro
Background Preventing falls and bone fractures in hospital care is an important issue in geriatric medicine. Use of hypnotics is a potential risk factor for falls and bone fractures in older patients. However, data are lacking on the association between use of hypnotics and the occurrence of bone fracture. Methods We used a national inpatient database including 1,057 hospitals in Japan and included dementia patients aged 50 years or older who were hospitalized during a period of 12 months between April 2012 and March 2013. The primary outcome was the occurrence of bone fracture during hospitalization. Use of hypnotics was compared between patients with and without bone fracture in this matched case-control study. Results Of 140,494 patients, 830 suffered an in-hospital fracture. A 1:4 matching on age, sex and hospital created 817 cases with fracture and 3,158 matched patients without fracture. With adjustment for the Charlson comorbidity index, emergent admission, activities of daily living, and scores for level walking, a higher occurrence of fractures was seen with short-acting benzodiazepine hypnotics (odds ratio, 1.43; 95% confidence interval, 1.19–1.73; P<0.001), ultrashort-acting non-benzodiazepine hypnotics (1.66; 1.37–2.01; P<0.001), hydroxyzine (1.45; 1.15–1.82; P=0.001), and risperidone and perospirone (1.37; 1.08–1.73; P=0.010). Other drug groups were not significantly associated with the occurrence of in-hospital fracture. Conclusions Short-acting benzodiazepine hypnotics and ultrashort-acting non-benzodiazepine hypnotics may increase the risk of bone fracture in hospitalized dementia patients. PMID:26061231
Perez, A. M. C.; Gaspa, M. C.; Aloc, D. S.; Mahor, M. A. P.; Gonzalez, K. A. C.; Borlongan, N. J. B.; De La Cruz, R. M.; Olfindo, N. T.; Blanco, A. C.
Water resource monitoring and management has been an important concern in the Philippines, considering that the country is archipelagic in nature and is exposed to many disasters driven by the global effects of climate change. The design and implementation of an effective management scheme relies heavily on accurate, complete, and updated water resource inventories, usually in the form of digital maps and geodatabases. With the aim of developing a detailed and comprehensive database of all water resources in the Philippines, the 3-year project "Development of the Philippine Hydrologic Dataset (PHD) for Watersheds from LiDAR Surveys" under the Phil-LiDAR 2 Program (National Resource Inventory) has been initiated by the University of the Philippines Diliman (UPD) and the Department of Science and Technology (DOST). Various workflows have already been developed to extract inland hydrologic features in the Philippines using accurate Light Detection and Ranging (LiDAR) Digital Terrain Models (DTMs) and LiDAR point cloud data obtained through other government-funded programs such as Disaster Risk and Exposure Assessment for Mitigation (DREAM) and Phil-LiDAR 1, supplemented with other remotely sensed imagery and ancillary information from Local Government Units (LGUs) and National Government Agencies (NGAs). The methodologies implemented are mainly combinations of object-based image analysis, pixel-based image analysis, modeling, and field surveys. This paper presents the PHD project, the methodologies developed, and some sample outputs produced.
Crosby, C. J.; Nandigam, V.; Arrowsmith, R.; Blair, J. L.
The growing availability of high-resolution LiDAR (Light Detection And Ranging) topographic data has proven to be revolutionary for Earth science research. These data allow scientists to study the processes acting on the Earth's surface at resolutions not previously possible yet essential for their appropriate representation. In addition to their utility for research, the data have also been recognized as powerful tools for communicating earth science concepts for education and outreach purposes. Unfortunately, the massive volume of data produced by LiDAR mapping technology can be a barrier to their use. To facilitate access to these powerful data for research and educational purposes, we have been exploring the use of Keyhole Markup Language (KML) and Google Earth to deliver LiDAR-derived visualizations. The OpenTopography Portal (http://www.opentopography.org/) is a National Science Foundation-funded facility designed to provide access to Earth science-oriented LiDAR data. OpenTopography hosts a growing collection of LiDAR data for a variety of geologic domains, including many of the active faults in the western United States. We have found that the wide spectrum of LiDAR users have variable scientific applications, computing resources, and technical experience and thus require a data distribution system that provides various levels of access to the data. For users seeking a synoptic view of the data, and for education and outreach purposes, delivering full-resolution images derived from LiDAR topography into the Google Earth virtual globe is powerful. The virtual globe environment provides a freely available and easily navigated viewer and enables quick integration of the LiDAR visualizations with imagery, geographic layers, and other relevant data available in KML format. Through region-dependent network-linked KML, OpenTopography currently delivers over 20 GB of LiDAR-derived imagery to users via simple, easily downloaded KMZ files hosted at the Portal.
Background Toxoplasmosis, a zoonotic disease distributed worldwide, is an infection caused by the ubiquitous obligate intracellular coccidian protozoan Toxoplasma gondii. It is a major public health concern because the disease is serious in terms of mortality or physical and/or psychological sequelae in patients with HIV disease. The aim of the study was to assess the seroprevalence of Toxoplasma gondii IgG and IgM antibodies and associated risk factors in HIV-infected and non-infected individuals attending Felege Hiwot referral hospital, Bahir Dar, Northwest Ethiopia. Methods A cross-sectional study was conducted at Felege Hiwot referral hospital, Bahir Dar, Amhara National Regional State. Venous blood samples were collected from 103 HIV-infected pre-antiretroviral-therapy patients at Felege Hiwot referral hospital and 101 HIV-negative, apparently healthy voluntary blood donors at the blood bank. Serum samples were analyzed for anti-Toxoplasma gondii IgG and IgM antibodies using a commercially available ELISA kit. Socio-demographic data and associated risk factors for toxoplasmosis were also obtained from each individual, and the data were analyzed using SPSS version 18. Results Of the examined HIV-seropositive individuals, 87.4% (90/103) and 10.7% (11/103) were positive for anti-T. gondii IgG and IgM antibodies, respectively. Multivariate analysis using logistic regression showed that anti-T. gondii seropositivity was independently and significantly associated with undercooked or raw meat consumption (adjusted OR=5.73, 95% CI=1.35-24.39; P=0.02) and having contact with cats (adjusted OR=4.29, 95% CI=1.08-16.94; P=0.04) in HIV-positive individuals. In HIV-negative, apparently healthy blood donors, the prevalences of anti-T. gondii antibodies were 70.29% and 2.97% for IgG and IgM, respectively. Multivariate analysis showed that undercooked or raw meat consumption (adjusted OR=6.45, 95% CI=2.16-19.28; p=0.001) and sex (OR=6.79, 95% CI=2.14-21.60; p=0.001) were
Rosta, Judith; Gerber, Andreas
Objectives: To determine correlations between excessively long working hours and subjectively experienced somatic health complaints among hospital physicians. Methods: Quantitative data were collected as part of the survey “Working life, Lifestyle and Health of Hospital Physicians in Germany 2006” using self-report questionnaires. Individually experienced health was assessed on the basis of Zerssen's list of somatic complaints. The indicator of excessively long working hours was defined as 10 or more working hours per working day and 6 or more on-call shifts a month among full-time employees. The net sample consisted of 3295 randomly selected physicians from 515 hospitals. Results: The response rate was 58% (n=1917). Physicians with excessively long working hours (19%) had a significantly higher sum score of health complaints (p=0.0001) and significantly increased mental and physical fatigue symptoms (feeling faint, languor, uneasiness, heavy legs, excessive need for sleep, trembling; p=0.0001 to 0.047), mood changes (irritability, brooding; p=0.008 to 0.014), gastrointestinal disorders (nausea, loss of weight; p=0.0001 to 0.014) and heart disorders (lumpy sensation in the throat, chest pain; p=0.0001 to 0.042). When the sum score of health complaints was controlled for selected confounders, being female (B=-3.44, p=0.0001) and having excessively long working hours (B=2.76, p=0.0001) were significantly correlated with health complaints. In a separate gender analysis, exposure to excessively long working hours remained a significant predictor of health complaints among both females (B=3.78, p=0.001) and males (B=2.28, p=0.004). Conclusions: Excessively long working hours are associated with an increased risk of health complaints. Reducing working hours may be the first step to improving physicians' health. PMID:19675717
White, Katherine M.; Starfelt, Louise C.; Jimmieson, Nerina L.; Campbell, Megan; Graves, Nicholas; Barnett, Adrian G.; Cockshaw, Wendell; Gee, Phillip; Page, Katie; Martin, Elizabeth; Brain, David; Paterson, David
Hand hygiene is the primary measure in hospitals to reduce the spread of infections, with nurses experiencing the greatest frequency of patient contact. The "5 critical moments" of hand hygiene initiative has been implemented in hospitals across Australia, accompanied by awareness-raising, staff training and auditing. The aim of this…
Makokha, A E
To identify the most significant determinants of maternal mortality in Kenya, a prospective study involving 49,335 deliveries occurring at Kenyatta National Hospital from January 1978 to 1987 was conducted. There were 156 maternal deaths in this series, for a maternal mortality rate of 3.2/1000 deliveries. The 5 most frequent causes of death were abortion (24%), hypertensive disease of pregnancy (13%), sepsis (13%), anemia (10%), and cardiac disease (7%). 24% of the women who died were aged 19 years or under, 27% were 20-24 years, 23% were 25-29 years, and 11% were 30-34 years. The largest percentage (24%) of deaths involved nulliparous women; 16% involved women of parity 5 and above. 28% of the women who died were single, and single women contributed the majority of deaths from abortion. 66% of the women who died had received no prenatal care. The proportion of avoidable deaths was 19% among clinic attenders compared to 29% among non-attenders. Overall, age, parity, and marital status--traditionally regarded as the key factors associated with maternal mortality--vary in their impact, given the cause of death and medical services received. The assumption that high parity is associated with maternal mortality was not confirmed in this study due to the significant number of deaths from abortion that involved single, nulliparous women. In addition, many women who died were in the optimum age group for childbearing, but were more prone to suffer from anemia, hypertension, ectopic pregnancy, and cardiac disease than women over 30 years old. Overall, 126 deaths were considered avoidable. Contributory factors were slowness of surgical management of emergencies, prolonged confinement of women with cardiac disease, and a lack of emergency supplies of blood and drugs for complicated deliveries. PMID:12316813
Walls, Genevieve; Bulifon, Sophie; Breysse, Serge; Daneth, Thol; Bonnet, Maryline; Hurtado, Northan; Molfino, Lucas
Background and objective There are no recent data on the prevalence of drug-resistant tuberculosis (DR TB) in Cambodia. We aim to describe TB drug resistance amongst adults with pulmonary and extra-pulmonary TB and human immunodeficiency virus (HIV) co-infection in a national referral hospital in Phnom Penh, Cambodia. Design Between 22 November 2007 and 30 November 2009, clinical specimens from HIV-infected patients suspected of having TB underwent routine microscopy, Mycobacterium tuberculosis culture, and drug susceptibility testing. Laboratory and clinical data were collected for patients with positive M. tuberculosis cultures. Results M. tuberculosis was cultured from 236 HIV-infected patients. Resistance to any first-line TB drug occurred in 34.7% of patients; 8.1% had multidrug resistant tuberculosis (MDR TB). The proportion of MDR TB amongst new patients and previously treated patients was 3.7 and 28.9%, respectively (p<0.001). The diagnosis of MDR TB was made after death in 15.8% of patients; in total 26.3% of patients with MDR TB died. The diagnosis of TB was established by culture of extra-pulmonary specimens in 23.6% of cases. Conclusions There is significant resistance to first-line TB drugs amongst new and previously treated TB–HIV co-infected patients in Phnom Penh. These data suggest that the prevalence of DR TB in Cambodia may be higher than previously recognised, particularly amongst HIV-infected patients. Additional prevalence studies are needed. This study also illustrates the feasibility and utility of analysis of non-respiratory specimens in the diagnosis of TB, even in low-resource settings, and suggests that extra-pulmonary specimens should be included in TB diagnostic algorithms. PMID:25623609
Reddy, Ashwan D.; Hawbaker, Todd J.; Wurster, F.; Zhu, Zhiliang; Ward, S.; Newcomb, Doug; Murray, R.
Peatlands are a major reservoir of global soil carbon, yet account for just 3% of global land cover. Human impacts such as draining can hinder the ability of peatlands to sequester carbon and expose their soils to fire under dry conditions. Estimating soil carbon loss from peat fires can be challenging due to uncertainty about pre-fire surface elevations. This study uses multi-temporal LiDAR to obtain pre- and post-fire elevations and estimate soil carbon loss caused by the 2011 Lateral West fire in the Great Dismal Swamp National Wildlife Refuge, VA, USA. We also determine how LiDAR elevation error affects uncertainty in our carbon loss estimate by randomly perturbing the LiDAR point elevations and recalculating elevation change and carbon loss, iterating this process 1000 times. Using LiDAR, we calculated a total loss of 1.10 Tg C across the 25 km2 burned area. The fire burned an average of 47 cm deep, equivalent to 44 kg C/m2, a value larger than that of the 1997 Indonesian peat fires (29 kg C/m2). Carbon loss via the First-Order Fire Effects Model (FOFEM) was estimated to be 0.06 Tg C. Propagating the LiDAR elevation error to the carbon loss estimates, we calculated a standard deviation of 0.00009 Tg C, equivalent to 0.008% of the total carbon loss. We conclude that LiDAR elevation error is not a significant contributor to uncertainty in soil carbon loss under severe fire conditions with substantial peat consumption. However, uncertainties may be more substantial when soil elevation loss is of a similar or smaller magnitude than the reported LiDAR error.
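The error-propagation step described in this abstract (perturb the LiDAR elevations, recompute elevation change and carbon loss, repeat 1000 times) amounts to a simple Monte Carlo procedure. The sketch below is a hypothetical illustration, not the study's code: the toy grid, the 0.1 m vertical error, and the 94 kg C/m3 carbon density (back-calculated from the reported 47 cm burn depth and 44 kg C/m2) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def carbon_loss(pre, post, cell_area_m2=1.0, kg_c_per_m3=94.0):
    """Total soil carbon loss (kg) implied by surface lowering."""
    depth_burned = np.clip(pre - post, 0.0, None)  # m of peat consumed
    return depth_burned.sum() * cell_area_m2 * kg_c_per_m3

# Toy 100 x 100 grid of 1 m^2 cells with a uniform 47 cm of surface lowering.
pre = np.full((100, 100), 5.0)
post = pre - 0.47

sigma = 0.1  # assumed LiDAR vertical error (m), applied per cell
losses = np.array([
    carbon_loss(pre + rng.normal(0.0, sigma, pre.shape),
                post + rng.normal(0.0, sigma, post.shape))
    for _ in range(1000)
])
# The spread of `losses` quantifies the propagated elevation error.
print(losses.mean(), losses.std())
```

Because the per-cell errors are independent, they largely cancel when summed over many cells, which is consistent with the abstract's finding that elevation error contributes little to total carbon loss uncertainty under deep burns.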
Background Although the benefit of early hip fracture repair remains a long-running controversy, with studies showing contradictory results, the practice is being adopted as a quality indicator by several health care organizations. The aim of this study is to analyze the association between early hip fracture repair and in-hospital mortality in elderly people attending public hospitals in the Spanish National Health System and, additionally, to explore factors associated with the decision to perform early hip fracture repair. Methods A cohort of 56,500 patients aged 60 years and over, hospitalized for hip fracture during the period 2002 to 2005 in all the public hospitals of 8 Spanish regions, was followed up using administrative databases to identify the time to surgical repair and in-hospital mortality. We used a multivariate logistic regression model to analyze the relationship between the timing of surgery (< 2 days from admission) and in-hospital mortality, controlling for several confounding factors. Results Early surgery was performed on 25% of the patients. In the unadjusted analysis, early surgery showed an absolute difference in risk of mortality of 0.57% (from 4.42% to 3.85%). However, patients undergoing delayed surgery were older and had higher comorbidity and severity of illness. Timeliness of surgery was not found to be related to in-hospital mortality once confounding factors such as age, sex, chronic comorbidities and severity of illness were controlled for in the multivariate analysis. Conclusions Older age, male gender, higher chronic comorbidity and higher severity measured by the Risk Mortality Index were associated with higher mortality, but time to surgery was not. PMID:22257790
Cook, G C
The National Health Service Act of 1946, pioneered by Aneurin Bevan, came into being on the "appointed day", 5 July 1948. Hospitals with their "additional premises" throughout Britain were "seized" by the state and incorporated into this vast socialist enterprise. While the majority of the population welcomed this new initiative in the creation of a welfare state, associated with medical care from cradle to grave, not all (especially members of various Hospital Boards of Management) were so enthusiastic. The hospitals for "incurables" (long stay patients) were unhappy and lost a vast proportion of their income owing to a great deal of procrastination; but most of them ultimately managed to escape nationalisation after a prolonged period of negotiation, by a claim that they were "homes" rather than "hospitals". The confiscation of property which had been built as a result of voluntary subscription was another huge and highly contentious matter, which has been highlighted in recent years. The future of the Seamen's Hospital Society's properties represents a good example of this. PMID:15579611
Meli, Benjamin Mbeba
This paper utilises data from a study that investigated the efficacy of vocational skills training provided to orphans from three orphanages in Temeke District, Dar es Salaam. The three orphanage centres studied were Kurasini National Children Home, Saudia and Don Bosco Vocational Centre. The sample comprised 45 orphans, an official…
Background The National Reporting and Learning System (NRLS) collects reports about patient safety incidents in England. Government regulators use NRLS data to assess the safety of hospitals. This study aims to examine whether annual hospital incident reporting rates can be used as a surrogate indicator of individual hospital safety. Second, it assesses which hospital characteristics are correlated with high incident reporting rates and whether high-reporting hospitals are safer than lower-reporting ones. Finally, it assesses which health-care professionals report more incidents of patient harm, which report more near-miss incidents, and what hospital factors encourage reporting. These findings may suggest methods for increasing the utility of reporting systems. Methods This study used a mixed-methods approach to assess NRLS data. The data were investigated using Pareto analysis and regression models to establish which patients are most vulnerable to reported harm. Hospital factors were correlated with institutional reporting rates over one year to examine what factors influenced reporting. Staff survey findings regarding hospital safety culture were correlated with reported rates of incidents causing harm, no harm and death to understand what barriers influence error disclosure. Findings 5,879,954 incident reports were collected from acute hospitals over the decade. 70.3% of incidents produced no harm to the patient and 0.9% were judged by the reporter to have caused severe harm or death. Obstetrics and Gynaecology reported the most no-harm events [OR 1.61 (95%CI: 1.12 to 2.27), p<0.01] and pharmacy was the hospital location where most near-misses were captured [OR 3.03 (95%CI: 2.04 to 4.55), p<0.01]. Clinicians were significantly more likely to report death than other staff [OR 3.04 (95%CI: 2.43 to 3.80), p<0.01]. A higher ratio of clinicians to beds correlated with a reduced rate of harm reported [RR = -1.78 (95%Cl: -3.33 to -0.23), p = 0.03]. Litigation
BASTANI, Peivand; VATANKHAH, Soudabeh; SALEHI, Masoud
Background: This study was designed to present and compare Iranian hospitals' performance applying the ratio analysis technique. Methods: This cross-sectional survey was conducted to present an instant image of the performance status of 139 Iranian hospitals in 2008, applying ratio analysis as one of the non-parametric technical efficiency assessment methods. Data were collected using nine-dimensional web-based questionnaires to obtain the main hospital ratios. Final analysis was performed applying classic statistics and relevant statistical tests at a significance level of 0.05. Results: Four hospital performance indicators were estimated in the studied hospitals as follows: the bed turnover rate (BTR) fluctuated from 64.5 to 114.8 times for hospitals located in rich and poor areas, respectively. Moreover, the bed interval rate (BIT) was calculated as 1.36 versus 2.4 in the poor and rich areas. Average length of stay (ALS) was computed as 1.82 for the poor regions but 3.27 for the rich ones; furthermore, a positive statistically significant correlation was seen between ALS and hospital size (P=0.001, r=0.28). The average bed occupancy rate (BOR) was 57.8%, varying from 31.4% to 64.5% depending on hospital size, with a positive statistically significant relationship between hospital size and BOR (P=0.006, r=0.32). Conclusion: Given that BOR, ALS, BTR and BIT, along with mortality rates, are considered the most important performance indicators, applying analytic frameworks rather than single raw indicators is strongly recommended. PMID:26056642
Freeman, Elsie T.; And Others
Because the Daughters of the American Revolution's (DAR) Black exclusion rule prevented Black singer Marian Anderson from performing in the DAR auditorium in 1939, Eleanor Roosevelt resigned from the organization. Primary source materials regarding this incident and learning activities for secondary level students are presented. (RM)
Abalharth, Mahdi; Hassan, Marwan A.; Klinkenberg, Brian; Leung, Vivian; McCleary, Richard
Logjams significantly influence watershed hydrology, flow regime, channel morphology and stability, and processes in lowland rivers. Consequently, logjams play a major role in the existence and conservation of the riparian and aquatic ecosystems along major waterways. In this paper, we attempt to detect and quantify logjams in river channels using LiDAR technology in conjunction with traditional fieldwork. To the best of our knowledge, LiDAR-based analysis has not been used to characterize logjams in streams. Overall, when applied in a lowland river environment, LiDAR-based analysis demonstrates a comprehensive solution for detecting logjams relative to fieldwork, with a low rate of omission. A filtered approach predicted the presence of 95% of fieldwork-reported logjams (a 5% rate of omission), but also identified six logjams not identified in the field (a 10% rate of commission). An unfiltered approach identified 87% of field-reported logjams, producing a 13% rate of omission and a 6.7% rate of commission. Dimension measurements were more consistent in the filtered LiDAR approach, showing R2 improvements of 53%, 34%, and 90% for length, width, and height, respectively, over the unfiltered LiDAR values. As vegetation cover hindered accurate delineation of logjam boundaries by LiDAR, field and LiDAR measurements of non-vegetation-obstructed logjams were more highly correlated than those of partially and completely vegetation-obstructed logjams.
Lin, Yu-Wen; Huang, Hui-Chuan; Lin, Mei-Feng; Shyu, Meei-Ling; Tsai, Po-Li; Chang, Hsiu-Ju
Background Investigating the factors related to suicide is crucial for suicide prevention. Psychiatric disorders, gender, socioeconomic status, and catastrophic illnesses are associated with increased risk of suicide. Most studies have typically focused on the separate influences of physiological or psychological factors on suicide-related behaviors, and have rarely used national data records to examine and compare the effects of major physical illnesses, psychiatric disorders, and socioeconomic status on the risk of suicide-related behaviors. Objectives To identify the characteristics of people who exhibited suicide-related behaviors and the multiple factors associated with repeated suicide-related behaviors and deaths by suicide by examining national data records. Design This is a cohort study of Taiwan’s national data records of hospitalized patients with suicide-related behaviors from January 1, 1997, to December 31, 2010. Participants The study population included all people in Taiwan who were hospitalized with a code indicating suicide or self-inflicted injury (E950–E959) according to the International Classification of Disease, Ninth Revision, Clinical Modification. Results Self-poisoning was the most common method of self-inflicted injury among hospitalized patients with suicide-related behaviors who used a single method. Those who were female, had been hospitalized for suicide-related behaviors at a younger age, had a low income, had a psychiatric disorder (i.e., personality disorder, major depressive disorder, bipolar disorder, schizophrenia, alcohol-related disorder, or adjustment disorder), had a catastrophic illness, or had been hospitalized for suicide-related behaviors that involved two methods of self-inflicted injury had a higher risk of hospitalization for repeated suicide-related behaviors. Those who were male, had been hospitalized for suicide-related behaviors at an older age, had low income, had schizophrenia, showed repeated suicide
Pelletier, J. D.; Swetnam, T.; Papuga, S. A.; Nelson, K.; Brooks, P. D.; Harpold, A. A.; Chorover, J.
Standard protocols exist for extracting bare-earth Digital Elevation Models (DEMs) from LiDAR point clouds that include trees and other large woody vegetation. Grasses and other herbaceous plants can also obscure the ground surface, yet methods for optimally distinguishing grass from ground to generate accurate LiDAR-based raster products for geomorphic and ecological applications are still under development. Developing such methods is important because LiDAR-based difference products (e.g. snow thickness) require accurate representations of the ground surface and because raster data for grass height and density have important applications in ecology. In this study, we developed and tested methods for constructing optimal bare-earth and grass height raster layers from LiDAR point clouds and compared the results to high-quality field-based measurements of grass height, density, and species type for nearly 1000 precisely geo-referenced locations collected during the acquisition of a >200 km^2 airborne LiDAR flight of the Valles Caldera National Preserve (New Mexico). In cases of partially bare ground (where the skewness of return heights above a plane fit to the lowest first returns is sufficiently large), a planar fit to the lowest first returns provides a good method of producing an accurate bare-earth DEM and the statistics of the first returns above that planar fit provide good estimates of the mean and variance of grass height. In areas of relatively thick grass cover, however, a fit to the lowest first returns yields a bare-earth DEM that may be a meter or more above the actual ground surface. Here we propose a method to solve this problem using field-measured correlations among the mean, variance, and skewness of grass heights. In this method, the variance and skewness of the differences between LiDAR first returns and a 10m^2 planar fit to the lowest first returns is used, together with field-based correlations of grass height statistics, to estimate the mean
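For the partially-bare-ground case described in this abstract (fit a plane to the lowest first returns, then read grass-height statistics off the residuals above it), the idea can be sketched as follows. This is a hypothetical minimal implementation, not the authors' workflow: the 10% "lowest returns" fraction, the 2 cm vegetation cutoff, and the synthetic test cell are illustrative assumptions.

```python
import numpy as np

def plane_fit_lowest(xyz, frac=0.1):
    """Least-squares plane z = a*x + b*y + c through the lowest `frac`
    of returns, standing in for the 'lowest first returns'."""
    z = xyz[:, 2]
    low = xyz[z <= np.quantile(z, frac)]
    A = np.c_[low[:, 0], low[:, 1], np.ones(len(low))]
    (a, b, c), *_ = np.linalg.lstsq(A, low[:, 2], rcond=None)
    return a, b, c

def grass_stats(xyz, frac=0.1, min_h=0.02):
    """Mean, variance, and skewness of return heights above the fitted
    plane; heights below `min_h` are treated as ground returns."""
    a, b, c = plane_fit_lowest(xyz, frac)
    h = xyz[:, 2] - (a * xyz[:, 0] + b * xyz[:, 1] + c)
    h = h[h > min_h]                       # keep vegetation returns only
    m, s = h.mean(), h.std()
    skew = ((h - m) ** 3).mean() / s ** 3  # sample skewness
    return m, h.var(), skew

# Synthetic cell: gently sloping ground, every third return from bare
# ground, the rest from grass with exponentially distributed heights.
rng = np.random.default_rng(1)
n = 6000
x, y = rng.uniform(0, 10, n), rng.uniform(0, 10, n)
ground = 1.0 + 0.01 * x + 0.01 * y
h_grass = np.where(np.arange(n) % 3 == 0, 0.0, rng.exponential(0.3, n))
xyz = np.c_[x, y, ground + h_grass]
mean_h, var_h, skew_h = grass_stats(xyz)
```

With enough bare-ground returns the plane fit recovers the terrain and the residual moments recover the grass-height distribution (here strongly right-skewed, as the abstract's skewness criterion anticipates); in thick grass, where few returns reach the ground, this fit sits above the true surface, which is the failure mode the proposed correlation-based correction addresses.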
Bernardino, Vera L.; Baird, Janette R.; Liu, Tao; Merchant, Roland C.
Objectives To compare the prevalence of recent alcohol, tobacco, and drug use among patients from two Rhode Island emergency departments (EDs) with Rhode Island state and United States national general population estimates between 2010 and 2012. Methods Secondary analysis of ED patient data and the National Survey on Drug Use and Health. Results Alcohol was the most commonly reported substance, and the prevalence of its use was higher among ED patients than in the national, but not the Rhode Island, general population. Drug use was higher among ED patients than in the state and national general populations. Among ED patients, tobacco and opioid use was highest among 26–34 year-olds, alcohol and marijuana use highest among 18–25 year-olds, and cocaine use highest among 35–49 year-olds. Conclusion Rhode Island Hospital and The Miriam Hospital ED patients report a greater prevalence of substance use than the national population and, in many cases, the state general population. PMID:25830171
Barbetta, Gian Paolo; Turati, Gilberto; Zago, Angelo M
In this paper we attempt to identify behavioral differences between public and private not-for-profit hospitals, by exploiting the introduction of the DRG-based payment system in the Italian NHS during the second half of the 1990s. We estimate the technical efficiency of a sample of hospitals for the period 1995-2000 considering an output distance function, and adopting both parametric (COLS and SF) and nonparametric (DEA) approaches. Our results show a convergence of mean efficiency scores between not-for-profit and public hospitals, and seem to suggest that differences in economic performances between competing ownership forms are more the result of the institutional settings in which they operate than the effect of the incentive structures embedded in the different proprietary forms. We also observe a decline in technical efficiency, probably due to policies aimed at reducing hospitalization rates. PMID:16929498
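Of the estimation approaches named in the abstract, COLS (corrected ordinary least squares) is the simplest to illustrate: fit an average production relationship by OLS, shift the intercept up to envelop all observations, and read efficiency off the residuals. The sketch below is a generic single-output, log-linear version, not the authors' output distance function:

```python
import numpy as np

def cols_efficiency(log_inputs, log_output):
    """Corrected OLS (COLS) technical-efficiency scores.

    Fits a log-linear production frontier by OLS, shifts the intercept
    up by the largest residual so the frontier envelops every hospital,
    and reports efficiency as exp(residual - max residual).
    A simplified sketch under a Cobb-Douglas technology; the paper's
    multi-output distance-function specification is more involved.
    """
    X = np.c_[np.ones(len(log_output)), log_inputs]
    beta, *_ = np.linalg.lstsq(X, log_output, rcond=None)
    resid = log_output - X @ beta
    return np.exp(resid - resid.max())  # scores in (0, 1]; 1 = on the frontier
```

The best-performing unit defines the frontier and scores exactly 1; all others score below it.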
Ahmed, Bina; Davis, Herbert T.; Laskey, Warren K.
Background Case-fatality rates in acute myocardial infarction (AMI) have significantly decreased; however, the prevalence of diabetes mellitus (DM), a risk factor for AMI, has increased. The purposes of the present study were to assess the prevalence and clinical impact of DM among patients hospitalized with AMI and to estimate the impact of important clinical characteristics associated with in-hospital mortality in patients with AMI and DM. Methods and Results We used the National Inpatient Sample to estimate trends in DM prevalence and in-hospital mortality among 1.5 million patients with AMI from 2000 to 2010, using survey data-analysis methods. Clinical characteristics associated with in-hospital mortality were identified using multivariable logistic regression. There was a significant increase in DM prevalence among AMI patients (year 2000, 22.2%; year 2010, 29.6%; P-trend<0.0001). AMI patients with DM tended to be older and female and to have more cardiovascular risk factors. However, age-standardized mortality decreased significantly from 2000 (8.48%) to 2010 (4.95%) (P-trend<0.0001). DM remained independently associated with mortality (adjusted odds ratio 1.069, 95% CI 1.051 to 1.087; P<0.0001). The adverse impact of DM on in-hospital mortality was unchanged over time. The decrease in death risk over time was greatest among women and elderly patients. Among younger patients of both sexes, there was a leveling off of this decrease in more recent years. Conclusions Despite increasing DM prevalence and disease burden among AMI patients, in-hospital mortality declined significantly from 2000 to 2010. The adverse impact of DM on mortality remained unchanged overall over time but was age and sex dependent. PMID:25158866
Mumghamba, Elifuraha GS; Manji, Karim P
Background The study examined the relationship between oral health status (periodontal disease and carious pulpal exposure (CPE)) and preterm low-birth-weight (PTLBW) infant deliveries among Tanzanian-African mothers at Muhimbili National Hospital (MNH), Tanzania. Methods A retrospective case-control study was conducted, involving 373 postpartum mothers aged 14–44 years (150 PTLBW cases and 223 term normal-birth-weight (TNBW) controls), using a structured questionnaire and full-mouth examination for periodontal and dentition status. Results The mean number of sites with gingival bleeding was higher in PTLBW than in TNBW (P = 0.026). No significant differences were observed for sites with plaque or calculus, or for decayed, missing, and filled teeth (DMFT), between PTLBW and TNBW. Controlling for known risk factors in all postpartum (n = 373) and primiparous (n = 206) mothers, no significant differences were found regarding the periodontal disease diagnosis threshold (PDT) (four or more sites with probing periodontal pocket depth ≥ 4 mm and gingival bleeding at ≥ 30% of sites), or CPE, between cases and controls. Significant risk factors for PTLBW among primi- and multiparous mothers together were age ≤ 19 years (adjusted Odds Ratio (aOR) = 2.09, 95% Confidence Interval (95% CI): 1.18–3.67, P = 0.011), hypertension (aOR = 2.44, 95% CI: 1.20–4.93, P = 0.013) and being unmarried (aOR = 1.59, 95% CI: 1.00–2.53, P = 0.049). For primiparous mothers, significant risk factors for PTLBW were age ≤ 19 years (aOR = 2.07, 95% CI: 1.13–3.81, P = 0.019) and being unmarried (aOR = 2.58, 95% CI: 1.42–4.67, P = 0.002). Conclusions These clinical findings show no evidence for periodontal disease or carious pulpal exposure being significant risk factors in PTLBW infant delivery among Tanzanian-African mothers at MNH; the significant risk factors were young age, hypertension, and being unmarried. Further research incorporating periodontal pathogens is recommended. PMID:17594498
Van de Sande, St; Bossens, M; Parmentier, Y; Gigot, J F
Public health and financial aspects of cholecystectomy-related bile duct injury (BDI) are highlighted in a National Cholecystectomy Survey carried out by 'datamining' the Federal State Medical Records Summaries and Financial Summaries of all Belgian hospitals in 1997. All cancer diagnoses, children ≤ 10 years, cholecystectomies performed as an abdominal co-procedure, and patients having undergone other non-related surgery were excluded from the study. 10,595 laparoscopic (LC) and 1,033 open cholecystectomies (OC) as well as 137 secondary BDI treatments (LC/OC) were included in the survey (total 11,765). The LC and OC groups differed significantly in the distribution of patient age and APR-DRG severity classes. Composite criteria in terms of ICD-9-CM and billing codes were elaborated to classify: 1) primary, intra-operatively detected and treated BDI (N = 30), 2) primary delayed BDI treatments (N = 38), 3) secondary BDI treatments (N = 137), 4) non-BDI abdomino-surgical complications (N = 119), 5) uneventful laparoscopic cholecystectomy (N = 7,476), and 6) uneventful open cholecystectomy (N = 681). Complication rates, community costs of the LC and OC groups, incidence of preoperative ERCP and/or intra-operative cholangiography, and interventions for complications were studied. The incidence of cholecystectomy-related BDI was 0.37% in LC, 2.81% in OC and 0.58% overall. Average costs amounted to €1,721 for uneventful LC, €2,924 for uneventful OC, €7,250 for primary, intra-operatively detected and immediately treated BDI, €9,258 for primary delayed BDI treatments, €6,076 for secondary BDI treatments and €10,363 for non-BDI abdomino-surgical complications. In conclusion, BDI with cholecystectomy proves to be a serious complication, increasing the overall average cost ninefold if not detected intra-operatively, in which case the increase is only fourfold
CDC/NCHS, National Hospital Discharge Survey, 2010. How did rural hospital inpatients differ from urban hospital inpatients ... How did patients' first-listed diagnoses differ in rural and ...
St. Germain, Diane; Nacpil, Lianne M; Zaren, Howard A; Swanson, Sandra M; Minnick, Christopher; Carrigan, Angela; Denicoff, Andrea M; Igo, Kathleen E; Acoba, Jared D; Gonzalez, Maria M; McCaskill-Stevens, Worta
Background The value of community-based cancer research has long been recognized. In addition to the National Cancer Institute’s Community Clinical and Minority-Based Oncology Programs, established in 1983 and 1991 respectively, the National Cancer Institute established the National Cancer Institute Community Cancer Centers Program in 2007 with the aim of enhancing access to high-quality cancer care and clinical research in the community setting, where most cancer patients receive their treatment. This article discusses strategies utilized by the National Cancer Institute Community Cancer Centers Program to build research capacity and create a more entrenched culture of research at the community hospitals participating in the program over a 7-year period. Methods To facilitate development of a research culture at the community hospitals, the National Cancer Institute Community Cancer Centers Program required leadership or chief executive officer engagement; utilized a collaborative learning structure where best practices, successes, and challenges could be shared; promoted site-to-site mentoring to foster faster learning within and between sites; required research program assessments that spanned the clinical trial portfolio, accrual barriers, and outreach; increased identification and use of metrics; and, finally, encouraged research team engagement across hospital departments (navigation, multidisciplinary care, pathology, and disparities) to replace the traditionally siloed approach to clinical trials. Limitations The health-care environment is rapidly changing while complexity in research increases. Successful research efforts are impacted by numerous factors (e.g. institutional review board reviews, physician interest, and trial availability). The National Cancer Institute Community Cancer Centers Program sites, as program participants, had access to the required resources and support to develop and implement the strategies described. Metrics are an important
Sureda, Xisca; Ballbè, Montse; Martínez, Cristina; Fu, Marcela; Carabasa, Esther; Saltó, Esteve; Martínez-Sánchez, Jose M.; Fernández, Esteve
Introduction On January 2, 2011, the Spanish government passed a new smoking law that banned smoking on hospital campuses. The objective of this study was to evaluate the implementation of smoke-free campuses in the hospitals of Catalonia based on both airborne particulate matter and observational data. Methods This cross-sectional study included the hospitals registered in the Catalan Network of Smoke-free Hospitals. We measured the concentration of particulate matter < 2.5 µm (PM2.5), in μg/m3, at different locations, both indoors and outdoors, before (2009) and after (2011) the implementation of the tobacco law. During 2011, we also assessed smoke-free zone signage and indications of smoking in the outdoor areas of hospital campuses. Results The overall median PM2.5 concentration fell from 12.22 μg/m3 (7.80–19.76 μg/m3) in 2009 to 7.80 μg/m3 (4.68–11.96 μg/m3) in 2011. Smoke-free zone signage within the campus was moderately implemented after the legislation in most hospitals, and 55% of hospitals exhibited no indications of tobacco consumption around the grounds. Conclusions After the law, PM2.5 concentrations were well below the values obtained before the law and below the annual guideline value recommended by the World Health Organization for outdoor settings (10 μg/m3). Our data show the feasibility of implementing a smoke-free campus ban and its positive effects. PMID:26844041
Whelley, P.; Garry, W. B.; Scheidt, S. P.; Irwin, R. P., III; Fox, J.; Bleacher, J. E.; Hamilton, C. W.
High-resolution point clouds and digital elevation models (DEMs) are used to investigate lava textures on the Big Island of Hawaii. An experienced geologist can distinguish fresh or degraded lava textures (e.g., blocky, a'a and pahoehoe) visually in the field. Lava texture depends significantly on eruption conditions, and it is therefore instructive, if accurately determined. In places where field investigations are prohibitive (e.g., Mercury, Venus, the Moon, Mars, Io and remote regions on Earth), lava texture must be assessed from remote sensing data. A reliable method for differentiating lava textures in remote sensing data remains elusive. We present preliminary results comparing properties of lava textures observed in airborne and terrestrial Light Detection and Ranging (LiDAR) data. Airborne data in this study were collected in 2011 by Airborne 1 Corporation and have a ~1 m point spacing. The authors collected the terrestrial data during a May 2014 field season. The terrestrial scans have a heterogeneous point density: points close to the scanner are 1 mm apart, while points 200 m away are 10 cm apart. Both platforms offer advantages and disadvantages beyond the differences in scale. Terrestrial scans are a quantitative representation of what a geologist sees "on the ground". Airborne scans capture a point of view routinely imaged by other remote sensing tools, and can therefore be quickly compared to complementary data sets (e.g., spectral scans or image data). Preliminary results indicate that LiDAR-derived surface roughness, from both platforms, is useful for differentiating lava textures, but at different spatial scales. As all lava types are quite rough, it is not simply roughness that is the most advantageous parameter; rather, patterns in surface roughness can be used to differentiate lava surfaces of varied textures. This work will lead to faster and more reliable volcanic mapping efforts for planetary exploration as well as terrestrial
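One common way to derive the surface roughness the abstract relies on (not necessarily the authors' exact definition) is the standard deviation of elevation about the local mean in a moving window, where the window size sets the spatial scale at which textures are compared. A minimal sketch, with the function name and window convention invented for illustration:

```python
import numpy as np

def roughness_map(dem, window):
    """Surface roughness as the standard deviation of elevation about
    the local mean within a square moving window.

    dem    : 2-D elevation array (a gridded DEM).
    window : odd window size in cells; larger windows probe coarser
             spatial scales of texture.
    Border cells without a full window are left as NaN.
    """
    h, w = dem.shape
    r = window // 2
    out = np.full_like(dem, np.nan, dtype=float)
    for i in range(r, h - r):
        for j in range(r, w - r):
            patch = dem[i - r:i + r + 1, j - r:j + r + 1]
            out[i, j] = patch.std()  # deviation about the patch mean
    return out
```

Comparing roughness maps computed at several window sizes gives the scale-dependent patterns the abstract describes.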
Ulisubisya, Mpoki; Jörnvall, Henrik; Irestedt, Lars; Baker, Tim
Anaesthesia and Intensive Care is a neglected specialty in low-income countries. There is an acute shortage of health workers - several low-income countries have fewer than 1 anaesthesia provider per 100,000 population. Only 1.5% of hospitals in Africa have the intensive care resources needed for managing patients with sepsis. Health partnerships between institutions in high- and low-income countries have been proposed as an effective way to strengthen health systems. The aim of this article is to describe the origin and conduct of a health partnership in Anaesthesia and Intensive Care between institutions in Tanzania and Sweden, and how the partnership has expanded to have an impact at regional and national levels. The Muhimbili-Karolinska Anaesthesia and Intensive Care Collaboration was initiated in 2008 at the request of the Executive Director of Muhimbili National Hospital in Dar es Salaam. The partnership has conducted training courses, exchanges and research projects, and has introduced new equipment, routines and guidelines. The partnership has expanded to include all hospitals in Dar es Salaam. Through the newly formed Life Support Foundation, the partnership has had a national impact, assisting the revival of the Society of Anaesthesiologists of Tanzania, and a marked increase has been seen in the number of young doctors choosing a residency in Anaesthesia and Intensive Care. PMID:26993790
Comparative histories of health system development have been variously influenced by the theoretical approaches of historical institutionalism, political pluralism, and labor mobilization. Britain and the United States have figured significantly in this literature because of their very different trajectories. This article explores the implications of recent research on hospital history in the two countries for existing historiographies, particularly the coming of the National Health Service in Britain. It argues that the two hospital systems initially developed in broadly similar ways, despite the very different outcomes in the 1940s. Thus, applying the conceptual tools used to explain the U.S. trajectory can deepen appreciation of events in Britain. Attention focuses particularly on working-class hospital contributory schemes and their implications for finance, governance, and participation; these are then compared with Blue Cross and U.S. hospital prepayment. While acknowledging the importance of path dependence in shaping attitudes of British bureaucrats toward these schemes, analysis emphasizes their failure in pressure group politics, in contrast to the United States. In both countries labor was also crucial, in the United States sustaining employment-based prepayment and in Britain broadly supporting system reform. PMID:22323233
Fairweather, Ian; Crabtree, Robert; Hager, Stacey
ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.
Merewood, Anne; Grossman, Xena; Cook, John; Sadacharan, Radha; Singleton, Marcella; Peters, Karen; Navidi, Tina
The World Health Organization's International Code of Marketing of Breast-Milk Substitutes, as well as most major medical authorities, opposes hospital-based distribution of free infant formula at discharge. The goal of this cross-sectional telephone survey of 3209 US maternity sites, conducted from 2006 to 2007, was to determine the extent of this practice. It was found that 91% of hospitals distributed formula sample packs, and a trend toward discontinuation of the practice was statistically significant (P < .001). It was concluded that most US hospitals distribute infant formula samples, in violation of the WHO Code and the recommendations of organizations including the US Government Accountability Office, the American Academy of Pediatrics, and the Centers for Disease Control and Prevention. PMID:20871089
Nair, Manisha; Kurinczuk, Jennnifer J.; Knight, Marian
Introduction As maternal deaths become rarer, monitoring near-miss or severe maternal morbidity becomes important as a tool to measure changes in care quality. Many calls have been made to use routinely available hospital administration data to monitor the quality of maternity care. We investigated 1) the feasibility of developing an English Maternal Morbidity Outcome Indicator (EMMOI) by reproducing an Australian indicator using routinely available hospital data, 2) the impact of modifications to the indicator to address potential data quality issues, 3) the reliability of the indicator. Methods We used data from 6,389,066 women giving birth in England from April 2003 to March 2013 available in the Hospital Episode Statistics (HES) database of the Health and Social Care Information Centre (HSCIC). A composite indicator, EMMOI, was generated from the diagnosis and procedure codes. Rates of individual morbid events included in the EMMOI were compared with the rates in the UK reported by population-based studies. Results EMMOI included 26 morbid events (17 diagnoses and 9 procedures). Selection of the individual morbid events was guided by the Australian indicator and published literature for conditions associated with maternal morbidity and mortality in the UK, but was mainly driven by the quality of the routine hospital data. Comparing the rates of individual morbid events of the indicator with figures from population-based studies showed that the possibility of false positive and false negative cases cannot be ruled out. Conclusion While routine English hospital data can be used to generate a composite indicator to monitor trends in maternal morbidity during childbirth, the quality and reliability of this monitoring indicator depends on the quality of the hospital data, which is currently inadequate. PMID:27054761
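Conceptually, a composite indicator such as EMMOI flags a delivery episode when any of its coded diagnoses or procedures matches the indicator's code lists. A minimal sketch with placeholder code sets (EMMOI's actual 17 diagnosis and 9 procedure codes are defined in the paper and are not reproduced here; the example codes below are hypothetical):

```python
def flag_morbid_event(episode_codes, diagnosis_set, procedure_set):
    """Flag a delivery episode for a composite morbidity indicator if
    any of its coded diagnoses or procedures appears in the indicator's
    code lists.

    episode_codes : (diagnoses, procedures) - two lists of code strings
                    from the episode record.
    The code sets passed in are placeholders, not EMMOI's real lists.
    """
    diagnoses, procedures = episode_codes
    return bool(set(diagnoses) & diagnosis_set or set(procedures) & procedure_set)
```

The population rate of each individual morbid event can then be compared against population-based studies, as the paper does to probe false positives and negatives.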
Li, N.; Liu, C.; Pfeifer, N.; Yin, J. F.; Liao, Z. Y.; Zhou, Y.
Feature selection and description is a key factor in the classification of Earth observation data. In this paper a classification method based on tensor decomposition is proposed. First, multiple features are extracted from the raw LiDAR point cloud, and raster LiDAR images are derived by accumulating features or the "raw" data attributes. Then, the feature rasters of the LiDAR data are stored as a tensor, and tensor decomposition is used to select component features. This tensor representation preserves the initial spatial structure and ensures that the neighborhood is taken into account. Based on a small number of component features, a k-nearest-neighbor classification is applied.
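A feature-mode truncation in the spirit of the pipeline above can be sketched with a plain SVD on the unfolded tensor, followed by a simple k-nearest-neighbor vote. The decomposition choice (one step of higher-order SVD), the component count, and the function names are illustrative assumptions; the abstract does not specify the exact decomposition:

```python
import numpy as np

def select_components(feature_tensor, k):
    """Project a (rows, cols, n_features) raster stack onto its k
    leading feature-mode components via one step of higher-order SVD.

    Unfolds the tensor along the feature mode, takes the right singular
    vectors of the centered matrix, and keeps the top k components.
    """
    r, c, f = feature_tensor.shape
    unfolded = feature_tensor.reshape(r * c, f)       # feature-mode unfolding
    centered = unfolded - unfolded.mean(0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return (centered @ vt[:k].T).reshape(r, c, k)

def knn_classify(train_x, train_y, query, k=3):
    """Plain k-nearest-neighbor majority vote in the reduced space."""
    d = np.linalg.norm(train_x - query, axis=1)
    votes = train_y[np.argsort(d)[:k]]
    return np.bincount(votes).argmax()
```

Each pixel's reduced feature vector would then be classified with `knn_classify` against labeled training pixels.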
LaBresh, Kenneth A
The Paul Coverdell National Acute Stroke Registry prototypes' baseline data collection demonstrated a significant gap in the use of evidence-based interventions. Barriers to the use of these interventions can be characterized as relating to lack of knowledge, attitudes, and ineffective behaviors and systems. Quality improvement programs can address these issues by providing didactic presentations to disseminate the science and peer interactions to address the lack of belief in the evidence, guidelines, and likelihood of improved patient outcomes. Even with knowledge and intention to provide evidence-based care, the absence of effective systems is a significant behavioral barrier. A program for quality improvement that includes multidisciplinary teams of clinical and quality improvement professionals has been successfully used to carry out redesign of stroke care delivery systems. Teams are given a methodology to set goals, test ideas for system redesign, and implement those changes that can be successfully adapted to the hospital's environment. Bringing teams from several hospitals together substantially accelerates the process by sharing examples of successful change and by providing strategies to support the behavior change necessary for the adoption of new systems. The participation of many hospitals also creates momentum for the adoption of change by demonstrating observable and successful improvement. Data collection and feedback are useful to demonstrate the need for change and evaluate the impact of system change, but improvement occurs very slowly without a quality improvement program. This quality improvement framework provides hospitals with the capacity and support to redesign systems, and has been shown to improve stroke care considerably, when coupled with an Internet-based decision support registry, and at a much more rapid pace than when hospitals use only the support registry. PMID:17178313
Yang, Lianping; Liu, Chaojie; Ferrier, J Adamm; Zhang, Xinping
This study identifies potential organizational barriers associated with the implementation of the Chinese National Essential Medicines Policy (NEMP) in rural primary health care institutions. We used a multistage sampling strategy to select 90 township hospitals from six provinces, two each from eastern, central, and western China. Data relating to eight core NEMP indicators and institutional characteristics were collected from January to September 2011, using a questionnaire. Prescription-associated indicators were calculated from 9000 outpatient prescriptions selected at random. We categorized the eight NEMP indicators using an exploratory factor analysis, and performed linear regressions to determine the association between the factor scores and institution-level characteristics. The results identified three main factors. Overall, low levels of expenditure on medicines (F1) and poor performance in rational use of medicines (F2) were evident. The availability of medicines (F3) varied significantly across both hospitals and regions. Factor scores had no significant relationship with hospital size (in terms of number of beds and health workers); however, they were associated with the revenue and structure of the hospital, patient service load, and support for health workers. Regression analyses showed that public finance per health worker was negatively associated with the availability of medicines (p < 0.05), remuneration of prescribers was positively associated with higher performance in the rational use of medicines (p < 0.05), and drug sales were negatively associated with higher levels of drug expenditure (p < 0.01). In conclusion, irrational use of medicines remains a serious issue, although the financial barriers to gaining access to essential medicines may be lower for prescribers and consumers. Limited public finance from local governments may reduce the medicine stock lines of township hospitals and lead them to seek alternative sources of income
O'Neill, Shirley; Hatoss, Aniko
Reports research that aimed to identify the foreign language and cross-cultural skill needs of workers in the tourism and hospitality industry in Australia and to develop foreign language competencies for use in industry training packages. Provides evidence for the need for foreign language skills in the industry and gives an account of the…
Chiu, Ya-Wen; Weng, Yi-Hao; Lo, Heng-Lien; Hsu, Chih-Cheng; Shih, Ya-Hui; Kuo, Ken N.
Introduction: Although evidence-based practice (EBP) has been widely investigated, few studies compare physicians and nurses on performance. Methods: A structured questionnaire survey was used to investigate EBP among physicians and nurses in 61 regional hospitals of Taiwan. Valid postal questionnaires were collected from 605 physicians and 551…
Lin, Jin-Ding; Hung, Wen-Jiu; Lin, Lan-Ping; Lai, Chia-Im
Few studies provide information on health access and health utilization among people with autism spectrum disorders (ASD). The present study describes a general profile of hospital admissions and medical costs among people with ASD, and analyzes the determinants of medical cost. A retrospective study was employed to analyze…
Minges, Karl E; Bikdeli, Behnood; Wang, Yun; Kim, Nancy; Curtis, Jeptha P; Desai, Mayur M; Krumholz, Harlan M
Little is known about national trends of pulmonary embolism (PE) hospitalizations and outcomes in older adults in the context of recent diagnostic and therapeutic advances. Therefore, we conducted a retrospective cohort study of 100% Medicare fee-for-service beneficiaries hospitalized from 1999 to 2010 with a principal discharge diagnosis code for PE. The adjusted PE hospitalization rate increased from 129/100,000 person-years in 1999 to 302/100,000 person-years in 2010, a relative increase of 134% (p <0.001). Black patients had the highest rate of increase (174 to 548/100,000 person-years) among all age, gender, and race categories. The mean (standard deviation) length of hospital stay decreased from 7.6 (5.7) days in 1999 to 5.8 (4.4) days in 2010, and the proportion of patients discharged to home decreased from 51.1% (95% confidence interval [CI] 50.5 to 51.6) to 44.1% (95% CI 43.7 to 44.6), whereas more patients were discharged with home health care and to skilled nursing facilities. The in-hospital mortality rate decreased from 8.3% (95% CI 8.0 to 8.6) in 1999 to 4.4% (95% CI 4.2 to 4.5) in 2010, as did adjusted 30-day (from 12.3% [95% CI 11.9 to 12.6] to 9.1% [95% CI 8.5 to 9.7]) and 6-month mortality rates (from 23.0% [95% CI 22.5 to 23.4] to 19.6% [95% CI 18.8 to 20.5]). There were no significant racial differences in mortality rates by 2010. There was no change in the adjusted 30-day all-cause readmission rate from 1999 to 2010. In conclusion, PE hospitalization rates increased substantially from 1999 to 2010, with a higher rate for black patients. All mortality rates decreased but remained high. The increase in hospitalization rates and continued high mortality and readmission rates confirm the significant burden of PE for older adults. PMID:26409636
Nourbakhshbeidokhti, S.; Kinoshita, A. M.; Chin, A.
Wildfires have potential to significantly alter soil properties and vegetation within watersheds. These alterations often contribute to accelerated erosion, runoff, and sediment transport in stream channels and hillslopes. This research applies repeated Terrestrial Laser Scanning (TLS) Light Detection and Ranging (LiDAR) to stream reaches within the Pike National Forest in Colorado following the 2012 Waldo Canyon Fire. These scans allow investigation of the relationship between sediment delivery and environmental characteristics such as precipitation, soil burn severity, and vegetation. Post-fire LiDAR images provide high resolution information of stream channel changes in eight reaches for three years (2012-2014). All images are processed with RiSCAN PRO to remove vegetation and triangulated and smoothed to create a Digital Elevation Model (DEM) with 0.1 m resolution. Study reaches with two or more successive DEM images are compared using a differencing method to estimate the volume of sediment erosion and deposition. Preliminary analysis of four channel reaches within Williams Canyon and Camp Creek yielded erosion estimates between 0.035 and 0.618 m3 per unit area. Deposition was estimated as 0.365 to 1.67 m3 per unit area. Reaches that experienced higher soil burn severity or larger rainfall events produced the greatest geomorphic changes. Results from LiDAR analyses can be incorporated into post-fire hydrologic models to improve estimates of runoff and sediment yield. These models will, in turn, provide guidance for water resources management and downstream hazards mitigation.
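The DEM-differencing step described above reduces to a cell-by-cell subtraction of co-registered rasters, with negative change summed as erosion and positive change as deposition. A minimal sketch (the function name and return convention are invented for illustration; the study's DEMs have 0.1 m resolution):

```python
import numpy as np

def dem_difference_volumes(dem_before, dem_after, cell_size):
    """Erosion and deposition volumes from two co-registered DEMs.

    Differences the surfaces cell by cell and sums negative change
    (erosion) and positive change (deposition), each scaled by the
    raster cell area. cell_size is the resolution in metres.
    """
    diff = dem_after - dem_before
    cell_area = cell_size ** 2
    erosion = -diff[diff < 0].sum() * cell_area      # m^3 of material removed
    deposition = diff[diff > 0].sum() * cell_area    # m^3 of material added
    return erosion, deposition
```

Dividing either volume by the reach area yields the per-unit-area figures the abstract reports.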
de Miguel-Yanes, José Ma; Esteban-Hernández, Jesús; Jiménez-Trujillo, Isabel; Alvaro-Meca, Alejandro; Carrasco-Garrido, Pilar; de Miguel-Díez, Javier
Background Type 2 diabetes mellitus (T2DM) is the most rapidly increasing risk factor for ischemic stroke. We aimed to compare trends in outcomes for ischemic stroke in people with or without diabetes in Spain between 2003 and 2012. Methods We selected all patients hospitalized for ischemic stroke using national hospital discharge data. We evaluated annual incidence rates stratified by T2DM status. We analyzed trends in the use of diagnostic and therapeutic procedures, patient comorbidities, and in-hospital outcomes. We calculated in-hospital mortality (IHM), length of hospital stay (LOHS) and the readmission rate within one month after discharge. Time trends in the incidence of hospitalization were estimated by fitting Poisson regression models by sex and diabetes status. In-hospital mortality was analyzed using logistic regression models fitted separately for men and women. LOHS was compared with ANOVA or the Kruskal-Wallis test where necessary. Results We identified a total of 423,475 discharges of patients (221,418 men and 202,057 women) admitted with ischemic stroke as the primary diagnosis. Patients with T2DM accounted for 30.9% of the total. The estimated incidence rates of discharges increased significantly in all groups. The incidence of hospitalization due to stroke (with ICD-9 codes for stroke as the main diagnosis at discharge) was higher among those with than those without diabetes in all the years studied. T2DM was positively associated with ischemic stroke, with an adjusted incidence rate ratio (IRR) of 2.27 (95% CI 2.24–2.29) for men and 2.15 (95% CI 2.13–2.17) for women. Over the 10-year period, LOHS decreased significantly in men and women with and without diabetes. The readmission rate remained stable in diabetic and non-diabetic men (around 5%), while it increased slightly in women with and without diabetes. We observed a significant increase in the use of fibrinolysis from 2002–2013. IHM was positively associated with older age in all groups, with Charlson Comorbidity Index > 3 and atrial
Kim, Chang Hwan; Park, Chang Hong; Kim, Hyun Wook; Hyuck Kim, Won; Lee, Myoung Hoon; Park, Hyeon Yeong
Coastal areas, used for human activities such as leisure, medical care, ports and power plants, are regions that change continuously and are interconnected with the ocean and the land; in Korea, the sea level rose by about 8 cm (1.9 mm/yr) due to global warming between 1964 and 2006. Coastal erosion caused by sea-level rise has led to problems such as damage to marine ecosystems and loss of tourism resources. Regular monitoring of coastal erosion is essential at key locations with such volatility, but surveying with a land mobile LiDAR (light detection and ranging) system is time consuming and subject to many restrictions. For effective monitoring of beach erosion, KIOST (Korea Institute of Ocean Science & Technology) has constructed a shipborne mobile LiDAR system. The system comprises a land mobile LiDAR (RIEGL LMS-420i), an INS (inertial navigation system, MAGUS Inertial+), an RTK GPS (LEICA GS15 GS25), and a fixed platform. The shipborne mobile LiDAR system is much more effective than a land mobile LiDAR system for measuring foreshore areas without shadow zones. Because the vessel carrying the shipborne mobile LiDAR system moves continuously along the shoreline, it is possible to survey a large area efficiently in a relatively short time. Effective monitoring of changes in seriously eroded coastal areas using the constructed shipborne mobile LiDAR system will contribute to coastal erosion management and response.
Nathanson, Marcus; Kean, Jason W.; Grabs, Thomas J.; Seibert, Jan; Laudon, Hjalmar; Lyon, Steve W.
Accurate stream discharge measurements are important for many hydrological studies. In remote locations, however, it is often difficult to obtain stream flow information because of the difficulty in making the discharge measurements necessary to define stage-discharge relationships (rating curves). This study investigates the feasibility of defining rating curves by using a fluid mechanics-based model constrained with topographic data from an airborne LiDAR scanning. The study was carried out for an 8-m-wide channel in the boreal landscape of northern Sweden. LiDAR data were used to define channel geometry above a low flow water surface along the 90-m surveyed reach. The channel topography below the water surface was estimated using the simple assumption of a flat streambed. The roughness for the modelled reach was back calculated from a single measurement of discharge. The topographic and roughness information was then used to model a rating curve. To isolate the potential influence of the flat bed assumption, a 'hybrid model' rating curve was developed on the basis of data combined from the LiDAR scan and a detailed ground survey. Whereas this hybrid model rating curve was in agreement with the direct measurements of discharge, the LiDAR model rating curve was equally in agreement with the medium and high flow measurements based on confidence intervals calculated from the direct measurements. The discrepancy between the LiDAR model rating curve and the low flow measurements was likely due to reduced roughness associated with unresolved submerged bed topography. Scanning during periods of low flow can help minimize this deficiency. These results suggest that combined ground surveys and LiDAR scans or multifrequency LiDAR scans that see 'below' the water surface (bathymetric LiDAR) could be useful in generating data needed to run such a fluid mechanics-based model. This opens a realm of possibility to remotely sense and monitor stream flows in channels in remote
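The workflow in this abstract — back-calculate channel roughness from a single gauging, then model the full rating curve from geometry — can be illustrated with a far simpler stand-in than the paper's fluid mechanics-based model: a Manning-equation rating curve for a rectangular channel. The width, slope, and the single gauged stage/discharge pair below are all invented:

```python
import numpy as np

def manning_discharge(stage, width=8.0, n=0.04, slope=0.005):
    """Manning's equation for a rectangular channel (flat-bed assumption, SI units)."""
    area = width * stage                      # flow cross-section [m^2]
    wetted = width + 2.0 * stage              # wetted perimeter [m]
    R = area / wetted                         # hydraulic radius [m]
    return area * R ** (2.0 / 3.0) * np.sqrt(slope) / n

# Back-calculate roughness n from one hypothetical (stage, discharge) gauging,
# exploiting the fact that Q scales as 1/n in Manning's equation
stage_obs, q_obs = 0.5, 2.0                   # invented single measurement
q_unit = manning_discharge(stage_obs, n=1.0)  # discharge computed with n = 1
n_cal = q_unit / q_obs                        # calibrated roughness

# Evaluate the modelled rating curve over a range of stages
stages = np.linspace(0.1, 1.5, 15)
rating = manning_discharge(stages, n=n_cal)
```

By construction the calibrated curve reproduces the gauged point exactly; the paper's point is that LiDAR supplies the above-water geometry this kind of model needs, with the flat-bed assumption filling in the part below the water surface.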
Yuan, Frank; Chung, Kevin C
A clear disparity in the pattern and provision of surgical care exists, particularly for patients from vulnerable socioeconomic backgrounds. For hand-injured patients in particular, this discrepancy has frequently been shown to affect their receipt of appropriate care. With the advent of the Affordable Care Act and with Medicaid expansion on the horizon, more patients will require access to care. Safety net programs have been shown to provide levels of care equivalent to those of non-safety net providers, and the survival of these hospitals for the disadvantaged is essential to providing quality care for this growing patient population. In this article, the authors review the factors that create barriers to care, the importance of safety net hospitals, the epidemiology of the hand-injured patient, and how the Affordable Care Act will impact these safety net programs. PMID:27465165
Llorens, Jordi; Gil, Emilio; Llop, Jordi; Queraltó, Meritxell
The use of electronic devices for canopy characterization has recently been widely discussed. Among such devices, LiDAR sensors appear to be the most accurate and precise. Information obtained with a LiDAR sensor while driving a tractor along a crop row can be managed and transformed into canopy density maps by evaluating the frequency of LiDAR returns. This paper describes a proposed methodology for obtaining a georeferenced canopy map by combining the information obtained with LiDAR with that generated by a GPS receiver installed on top of the tractor. The velocity of the LiDAR measurements and the UTM coordinates of each measured point on the canopy were obtained by applying the proposed transformation process. The process allows the generated canopy density map to be overlaid on an image of the measured area using Google Earth®, providing accurate information about the canopy distribution and/or the location of damage along the rows. This methodology was applied and tested on different vine varieties and crop stages in two important vine production areas in Spain. The results indicate that the georeferenced information obtained with LiDAR sensors appears to be an interesting tool with the potential to improve crop management processes. PMID:22163952
Rosta, Judith; Aasland, Olaf G.
Aims: To describe and discuss the alcohol drinking patterns of the younger generation of hospital doctors in Norway and Germany – respectively the abstainers, frequent drinkers, episodic heavy drinkers and hazardous drinkers. Methods: Data were collected in nationwide postal surveys among doctors in Norway (2000) and Germany (2006). A representative sample of 1898 German and 602 Norwegian hospital doctors aged 27–65 years was included in the analyses (N=2500). Alcohol drinking patterns were measured using the first three items of the AUDIT in Norway and the AUDIT-C in Germany, with scores of ≥5 (on a scale from 0 to 12) indicating hazardous drinking. Episodic heavy drinking was defined as the intake of ≥60 g of ethanol on one occasion at least once a week. Frequent drinkers were those who drank alcoholic beverages at least twice a week. Abstainers were persons who drank no alcohol. The analyses were performed separately for age groups (27–44 years versus 45–65 years) and genders. Results: Compared with the 45–65-year age groups in the Norwegian and German samples, the younger age groups (27–44 years) tended to have higher rates of abstainers, higher rates of infrequent drinking of moderate amounts of alcohol, lower rates of episodic heavy drinking and lower rates of hazardous drinking. Conclusion: The younger generation of hospital doctors in Norway and Germany showed tendencies towards healthier drinking habits. Changes in professional life, and in attitudes towards alcohol consumption, may go some way towards explaining these findings. PMID:20200658
White, Katherine M; Starfelt, Louise C; Jimmieson, Nerina L; Campbell, Megan; Graves, Nicholas; Barnett, Adrian G; Cockshaw, Wendell; Gee, Phillip; Page, Katie; Martin, Elizabeth; Brain, David; Paterson, David
Hand hygiene is the primary measure in hospitals to reduce the spread of infections, with nurses experiencing the greatest frequency of patient contact. The '5 critical moments' of hand hygiene initiative has been implemented in hospitals across Australia, accompanied by awareness-raising, staff training and auditing. The aim of this study was to understand the determinants of nurses' hand hygiene decisions, using an extension of a common health decision-making model, the theory of planned behaviour (TPB), to inform future health education strategies to increase compliance. Nurses from 50 Australian hospitals (n = 2378) completed standard TPB measures (attitude, subjective norm, perceived behavioural control [PBC], intention) and the extended variables of group norm, risk perceptions (susceptibility, severity) and knowledge (subjective, objective) at Time 1, while a sub-sample (n = 797) reported their hand hygiene behaviour 2 weeks later. Regression analyses identified subjective norm, PBC, group norm, subjective knowledge and risk susceptibility as the significant predictors of nurses' hand hygiene intentions, with intention and PBC predicting their compliance behaviour. Rather than targeting attitudes which are already very favourable among nurses, health education strategies should focus on normative influences and perceptions of control and risk in efforts to encourage hand hygiene adherence. PMID:26590244
Matsubara, Kazuo; Toyama, Akira; Satoh, Hiroshi; Suzuki, Hiroshi; Awaya, Toshio; Tasaki, Yoshikazu; Yasuoka, Toshiaki; Horiuchi, Ryuya
It is obvious that pharmacists play a critical role as risk managers in the healthcare system, especially in medication treatment. To date, there has been no multicenter survey report describing the effectiveness of clinical pharmacists in preventing medical errors in hospital wards in Japan. Thus, we conducted a 1-month survey (October 1-31, 2009) to elucidate the relationship between the number of errors and the working hours of pharmacists in the ward, and to verify whether the assignment of clinical pharmacists to the ward prevents medical errors. Questionnaire items for the pharmacists at 42 national university hospitals and a medical institute included the total and respective numbers of medication-related errors, beds, and working hours of pharmacists in 2 internal medicine and 2 surgical departments in each hospital. Regardless of severity, errors were consecutively reported to the Medical Security and Safety Management Section in each hospital. The analysis revealed that longer pharmacist working hours in the ward were associated with fewer medication-related errors; this association was especially significant in the internal medicine wards (where a wide variety of drugs were used) compared with the surgical wards. However, the nurse assignment mode (nurse/inpatient ratio: 1 : 7-10) did not influence the error frequency. The results of this survey strongly indicate that the assignment of clinical pharmacists to the ward is essential for promoting medication safety and efficacy. PMID:21467804
Canto, John G.; Kiefe, Catarina I.; Rogers, William J.; Peterson, Eric D.; Frederick, Paul D.; French, William J.; Gibson, C. Michael; Pollack, Charles V.; Ornato, Joseph P.; Zalenski, Robert J.; Penney, Jan; Tiefenbrunn, Alan J.; Greenland, Philip
Few studies have examined associations between atherosclerotic risk factors and short-term mortality after first myocardial infarction (MI). Histories of 5 traditional atherosclerotic risk factors at presentation (diabetes, hypertension, smoking, dyslipidemia, and family history of premature heart disease) and hospital mortality were examined among 542,008 patients with first MIs in the National Registry of Myocardial Infarction (1994 to 2006). On initial MI presentation, history of hypertension (52.3%) was most common, followed by smoking (31.3%). The least common risk factor was diabetes (22.4%). Crude mortality was highest in patients with MI with diabetes (11.9%) and hypertension (9.8%) and lowest in those with smoking histories (5.4%) and dyslipidemia (4.6%). The inclusion of 5 atherosclerotic risk factors in a stepwise multivariate model contributed little toward predicting hospital mortality over age alone (C-statistic = 0.73 and 0.71, respectively). After extensive multivariate adjustments for clinical and sociodemographic factors, patients with MI with diabetes had higher odds of dying (odds ratio [OR] 1.23, 95% confidence interval [CI] 1.20 to 1.26) than those without diabetes and similarly for hypertension (OR 1.08, 95% CI 1.06 to 1.11). Conversely, family history (OR 0.71, 95% CI 0.69 to 0.73), dyslipidemia (OR 0.62, 95% CI 0.60 to 0.64), and smoking (OR 0.85, 95% CI 0.83 to 0.88) were associated with decreased mortality (C-statistic = 0.82 for the full model). In conclusion, in the setting of acute MI, histories of diabetes and hypertension are associated with higher hospital mortality, but the inclusion of atherosclerotic risk factors in models of hospital mortality does not improve predictive ability beyond other major clinical and sociodemographic characteristics. PMID:22840346
de Miguel-Díez, Javier; Jiménez-García, Rodrigo; Hernández-Barrera, Valentín; Carrasco-Garrido, Pilar; Bueno, Héctor; Puente-Maestu, Luis; Jimenez-Trujillo, Isabel; Alvaro-Meca, Alejandro; Esteban-Hernandez, Jesús; de Andrés, Ana López
Background People with COPD suffering from coronary artery disease are frequently treated with revascularization procedures. We aim to compare trends in the use and outcomes of these procedures in COPD and non-COPD patients in Spain between 2001 and 2011. Methods We identified all patients who had undergone percutaneous coronary interventions (PCIs) and coronary artery bypass graft (CABG) surgeries, using national hospital discharge data. Discharges were grouped into: COPD and no COPD. Results From 2001 to 2011, 428,516 PCIs and 79,619 CABGs were performed. The sex and age-adjusted use of PCI increased by 21.27% per year from 2001 to 2004 and by 5.47% per year from 2004 to 2011 in patients with COPD. In-hospital mortality (IHM) among patients with COPD who underwent a PCI increased significantly from 2001 to 2011 (odds ratio 1.11; 95% confidence interval 1.03–1.20). Among patients with COPD who underwent a CABG, the sex and age-adjusted CABG incidence rate increased by 9.77% per year from 2001 to 2003, and then decreased by 3.15% through 2011. The probability of dying during hospitalization in patients who underwent a CABG did not change significantly in patients with and without COPD (odds ratio, 1.06; 95% confidence interval 0.96–1.17). Conclusion The annual percent change in PCI procedures increased in COPD and non-COPD patients. We found a decrease in the use of CABG procedures in both groups. IHM was higher in patients with COPD who underwent a PCI than in those without COPD. However, COPD did not increase the probability of dying during hospitalization in patients who underwent a CABG. PMID:26543361
Evans, R; Mullally, D I; Wilson, R W; Gergen, P J; Rosenberg, H M; Grauman, J S; Chevarley, F M; Feinleib, M
National population-based data systems of the National Center for Health Statistics (NCHS) were used to study the epidemiology of asthma in the United States over the last 20 years. Asthma is more prevalent among males, those living below the poverty level, persons living in the South and West, and blacks; however, this difference did not attain statistical significance. Death rates from asthma among the older age groups probably increased between 1968 and 1982, with a substantial increase since 1979. For children, the evidence is less clear, but the death rate has increased for children over five years of age during the period from 1979 to 1982. Between 1964 and 1980, asthma has become more prevalent in children under 17 years of age, but this does not reflect an increase in the severity of asthma over this same time period. Hospitalization rates for asthma between 1965 and 1983 increased by 50 percent in adults and by over 200 percent in children. Rates for black patients are 50 percent higher in adults and 150 percent greater in children. It is concluded that there has been a marked increase in hospitalization rates for asthma, a moderate increase in death rates from asthma and a smaller increase in overall prevalence of the disease in the United States. PMID:3581966
Ruchlin, Hirsch S.; Leveson, Irving
This study presents a comprehensive method for quantifying hospital output and estimating hospital productivity. A number of less comprehensive productivity measures that can be quantified from data available from regional third-party payers and from the American Hospital Association are also developed and evaluated as proxies for the comprehensive measure, which is based on local area data. Methods are discussed for estimating the necessary variables on a regional or national level. PMID:4461703
Höfle, Bernhard; Koenig, Kristina; Griesbaum, Luisa; Kiefer, Andreas; Hämmerle, Martin; Eitel, Jan; Koma, Zsófia
Our physical environment undergoes constant changes in space and time with strongly varying triggers, frequencies, and magnitudes. Monitoring these environmental changes is crucial to improve our scientific understanding of complex human-environmental interactions and helps us to respond to environmental change by adaptation or mitigation. The three-dimensional (3D) description of the Earth surface features and the detailed monitoring of surface processes using 3D spatial data have gained increasing attention within the last decades, such as in climate change research (e.g., glacier retreat), carbon sequestration (e.g., forest biomass monitoring), precision agriculture and natural hazard management. In all those areas, 3D data have helped to improve our process understanding by allowing quantifying the structural properties of earth surface features and their changes over time. This advancement has been fostered by technological developments and increased availability of 3D sensing systems. In particular, LiDAR (light detection and ranging) technology, also referred to as laser scanning, has made significant progress and has evolved into an operational tool in environmental research and geosciences. The main result of LiDAR measurements is a highly spatially resolved 3D point cloud. Each point within the LiDAR point cloud has a XYZ coordinate associated with it and often additional information such as the strength of the returned backscatter. The point cloud provided by LiDAR contains rich geospatial, structural, and potentially biochemical information about the surveyed objects. To deal with the inherently unorganized datasets and the large data volume (frequently millions of XYZ coordinates) of LiDAR datasets, a multitude of algorithms for automatic 3D object detection (e.g., of single trees) and physical surface description (e.g., biomass) have been developed. However, so far the exchange of datasets and approaches (i.e., extraction algorithms) among LiDAR users
Jozkow, G.; Toth, C.; Grejner-Brzezinska, D.
Unmanned Aerial System (UAS) technology is now widely used in small-area topographic mapping due to its low cost and the good quality of the derived products. Since the cameras typically used with UAS have limitations, e.g. they cannot penetrate vegetation, LiDAR sensors are increasingly getting attention in UAS mapping. Sensor development has reached the point where cost and size suit the UAS platform; still, LiDAR UAS is an emerging technology. One issue with using LiDAR sensors on UAS is the limited performance of the navigation sensors used on UAS platforms. Therefore, various hardware and software solutions are being investigated to increase the quality of UAS LiDAR point clouds. This work analyses several aspects of UAS LiDAR point cloud generation performance based on UAS flights conducted with a Velodyne laser scanner and cameras. Attention was primarily paid to trajectory reconstruction performance, which is essential for accurate point cloud georeferencing. Since the navigation sensors, especially Inertial Measurement Units (IMUs), may not offer sufficient performance, the estimated camera poses could help increase the robustness of the estimated trajectory and, subsequently, the accuracy of the point cloud. The accuracy of the final UAS LiDAR point cloud was evaluated on the basis of the generated DSM, including comparison with point clouds obtained from dense image matching. The results showed the need for further investigation of the MEMS IMU sensors used for UAS trajectory reconstruction. The accuracy of the UAS LiDAR point cloud, though lower than that of point clouds obtained from images, may still be sufficient for certain mapping applications where optical imagery is not useful.
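The reason trajectory reconstruction dominates point cloud accuracy is visible in the direct-georeferencing step: each ground point is the platform position plus the attitude-rotated sum of the lever arm and the scanner-frame range vector, so any pose error is projected over the full range. A simplified single-return sketch (made-up pose values, ZYX attitude convention, boresight calibration ignored):

```python
import numpy as np

def rot(roll, pitch, yaw):
    """Body-to-world rotation from roll/pitch/yaw in radians (ZYX convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

# Hypothetical single return: GNSS/IMU pose, fixed IMU-to-scanner lever arm,
# and a near-nadir 100 m range vector in the scanner frame
platform_xyz = np.array([350000.0, 4500000.0, 120.0])  # e.g. UTM metres
attitude = np.radians([2.0, -1.5, 45.0])               # roll, pitch, yaw
lever_arm = np.array([0.10, 0.0, -0.25])
range_vec = np.array([0.0, 0.0, -100.0])

ground_pt = platform_xyz + rot(*attitude) @ (lever_arm + range_vec)
```

With a 100 m range, even the small attitude angles used here displace the footprint several metres horizontally, which is why MEMS IMU error accumulates directly into the point cloud.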
Pan, Z.; Prasad, S.; Starek, M. J.; Fernandez Diaz, J. C.; Glennie, C. L.; Carter, W. E.; Shrestha, R. L.; Singhania, A.; Gibeaut, J. C.
Seagrass provides vital habitat for marine fisheries and is a key indicator species of coastal ecosystem vitality. Monitoring seagrass is therefore an important environmental initiative, but measuring details of seagrass distribution over large areas via remote sensing has proved challenging. Developments in airborne bathymetric light detection and ranging (LiDAR) provide great potential in this regard. Traditional bathymetric LiDAR systems have been limited in their ability to map within the shallow water zone (< 1 m) where seagrass is typically present due to limitations in receiver response and laser pulse length. Emergent short-pulse width bathymetric LiDAR sensors and waveform processing algorithms enable depth measurements in shallow water environments previously inaccessible. This 3D information of the benthic layer can be applied to detect seagrass and characterize its distribution. Researchers with the National Center for Airborne Laser Mapping (NCALM) at the University of Houston (UH) and the Coastal and Marine Geospatial Sciences Lab (CMGL) of the Harte Research Institute at Texas A&M University-Corpus Christi conducted a coordinated airborne and boat-based survey of the Redfish Bay State Scientific Area as part of a collaborative study to investigate the capabilities of bathymetric LiDAR and hyperspectral imaging for seagrass mapping. Redfish Bay, located along the middle Texas coast of the Gulf of Mexico, is a state scientific area designated for the purpose of protecting and studying native seagrasses. Redfish Bay is part of the broader Coastal Bend Bays estuary system recognized by the US Environmental Protection Agency (EPA) as a national estuary of significance. For this survey, UH acquired high-resolution discrete-return and full-waveform bathymetric data using their Optech Aquarius 532 nm green LiDAR. In a separate flight, UH collected 2 sets of hyperspectral imaging data (1.2-m pixel resolution and 72 bands, and 0.6m pixel resolution and 36
Puig-Junoy, Jaume; García-Gómez, Pilar; Casado-Marín, David
This paper examines the impact of coinsurance exemption for prescription medicines applied to elderly individuals in Spain after retirement. We use a rich administrative dataset that links pharmaceutical consumption and hospital discharge records for the full population aged 58 to 65 years in January 2004 covered by the public insurer in a Spanish region, and we follow them until December 2006. We use a difference-in-differences strategy and exploit the eligibility age for Social Security to control for the endogeneity of the retirement decision. Our results show that this uniform exemption increases the consumption of prescription medicines on average by 17.5%, total pharmaceutical expenditure by 25% and the costs borne by the insurer by 60.4%, without evidence of any offset effect in the form of lower short term probability of hospitalization. The impact is concentrated among consumers of medicines for acute and other non-chronic diseases whose previous coinsurance rate was 30% to 40%. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26082341
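The difference-in-differences estimator underlying this study can be illustrated on synthetic data: the change in consumption among the exempt group minus the change among the controls. The real analysis additionally exploits the Social Security eligibility age to handle endogenous retirement, which this toy version omits; all numbers below are invented, with the 17.5% effect size borrowed from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000
treated = rng.integers(0, 2, n)   # reached the coinsurance-exempt (retired) status
post = rng.integers(0, 2, n)      # observed after vs. before the exemption
base = 10.0                        # baseline monthly prescriptions (made up)
true_effect = 0.175                # assumed relative increase, as in the abstract

# Simulated outcome: common time trend, group difference, and treatment effect
y = base * (1 + 0.05 * post + 0.02 * treated
            + true_effect * treated * post) + rng.normal(0.0, 0.5, n)

def cell_mean(t, p):
    """Mean outcome in one treated-by-period cell."""
    return y[(treated == t) & (post == p)].mean()

# Difference-in-differences: change among the exempt minus change among controls
did = (cell_mean(1, 1) - cell_mean(1, 0)) - (cell_mean(0, 1) - cell_mean(0, 0))
```

Dividing `did` by the pre-period treated mean recovers the relative increase, i.e. roughly the 17.5% figure the abstract reports for prescription consumption.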
Haule, Caspar; Ongom, Peter A; Kimuli, Timothy
Introduction The treatment of adhesive small bowel obstruction is controversial, with both operative and non-operative management practiced in different centers worldwide. Non-operative management is becoming increasingly popular, though operative rates remain high. A study comparing the efficacy of an oral water-soluble contrast medium (Gastrografin®) with standard conservative management, both non-operative methods, in the management of this condition was conducted in a tertiary sub-Saharan hospital. Methods An open randomised controlled clinical trial was conducted between September 2012 and March 2013 at Mulago National Referral and Teaching Hospital, Uganda. Fifty patients of both genders with adhesive small bowel obstruction in the hospital’s emergency and general surgical wards were included. Randomisation was to Gastrografin® and standard conservative treatment groups. The primary outcomes were: the time interval between admission and relief of obstruction, the length of hospital stay, and the rate of operative surgery. Results All 50 recruited patients were followed up and analysed; 25 in each group. In the Gastrografin® group, 22 (88%) patients had relief of obstruction following the intervention, with 3 (12%) requiring surgery. The conservative treatment group had 16 (64%) patients relieved of obstruction conservatively, and 9 (36%) required surgery. The difference in operative rates between the two groups was not statistically significant (P = 0.67). The average time to relief of obstruction was shorter in the Gastrografin® group (72.52 hrs) than in the conservative treatment group (117.75 hrs), a significant difference (P = 0.023). The average length of hospital stay was shorter in the Gastrografin® group (5.62 days) than in the conservative treatment group (10.88 days), a significant difference (P = 0.04). Conclusion The use of Gastrografin® in patients with adhesive small bowel obstruction helps in earlier resolution of obstruction and
Background Tuberculosis remains a major public health problem in sub-Saharan Africa. District hospitals (DHs) play a central role in district-based health systems, and their relation with vertical programmes is very important. Studies on the impact of vertical programmes on DHs are rare. This study aims to fill this gap. Its purpose is to analyse the interaction between the National Tuberculosis Control Programme (NTCP) and DHs in Cameroon, especially its effects on the human resources, routine health information system (HIS) and technical capacity at the hospital level. Methods We used a multiple case study methodology. From the Adamaoua Region, we selected two DHs, one public and one faith-based. We collected qualitative and quantitative data through document reviews, semi-structured interviews with district and regional staff, and observations in the two DHs. Results The NTCP trained and supervised staff, designed and provided tuberculosis data collection and reporting tools, and provided anti-tuberculosis drugs, reagents and microscopes to DHs. However, these interventions were limited to the hospital units designated as Tuberculosis Diagnostic and Treatment Centres and to staff dedicated to tuberculosis control activities. The NTCP installed a parallel HIS that bypassed the District Health Services. The DH that performs well in terms of general hospital care and that is well managed was successful in tuberculosis control. Based on the available resources, the two hospitals adapt the organisation of tuberculosis control to their settings. The management teams in charge of the District Health Services are not involved in tuberculosis control. In our study, we identified several opportunities to strengthen the local health system that have been missed by the NTCP and the health system managers. Conclusion Well-managed DHs perform better in terms of tuberculosis control than DHs that are not well managed. The analysis of the effects of the NTCP on the human
Background A well functioning Health Information System (HIS) is crucial for effective and efficient health service delivery. In Tanzania there is a national HIS called Mfumo wa Taarifa za Uendeshaji Huduma za Afya (MTUHA). It comprises a guideline/manual, a series of registers for primary data collection and secondary data books where information from the registers is totalled or used for calculations. Methods A mix of qualitative methods were used. These included key informant interviews; staff interviews; participant observations; and a retrospective analysis of the hospital’s 2010 MTUHA reporting documents and the hospital’s development plan. Results All staff members acknowledged data collection as part of their job responsibilities. However, all had concerns about the accuracy of MTUHA data. Access to training was limited, mathematical capabilities often low, dissemination of MTUHA knowledge within the hospital poor, and a broad understanding of the HIS’s full capabilities lacking. Whilst data collection for routine services functioned reasonably well, filling of the secondary data tools was unsatisfactory. Internal inconsistencies between the different types of data tools were found. These included duplications, and the collection of data that was not further used. Sixteen of the total 72 forms (22.2%) that make up one of the key secondary data books (Hospital data/MTUHA book 2) could not be completed with the information collected in the primary data books. Moreover, the hospital made no use of any of the secondary data. The hospital’s main planning document was its development plan. Only 3 of the 22 indicators in this plan were the same as indicators in MTUHA, the information for 9 more was collected by the MTUHA system but figures had to be extracted and recalculated to fit, while for the remaining 10 indicators no use could be made of MTUHA at all. Conclusion The HIS in Tanzania is very extensive and it could be advisable to simplify it to the
Cox-Witton, Keren; Reiss, Andrea; Woods, Rupert; Grillo, Victoria; Baker, Rupert T.; Blyde, David J.; Boardman, Wayne; Cutter, Stephen; Lacasse, Claude; McCracken, Helen; Pyne, Michael; Smith, Ian; Vitali, Simone; Vogelnest, Larry; Wedd, Dion; Phillips, Martin; Bunn, Chris; Post, Lyndel
Emerging infectious diseases are increasingly originating from wildlife. Many of these diseases have significant impacts on human health, domestic animal health, and biodiversity. Surveillance is key to the early detection of emerging diseases. A zoo-based wildlife disease surveillance program developed in Australia incorporates disease information from free-ranging wildlife into the existing national wildlife health information system. This program uses a collaborative approach and provides a strong model for a disease surveillance program for free-ranging wildlife that enhances the national capacity for early detection of emerging diseases. PMID:24787430
Hooshyar, M.; Kim, S.; Wang, D.; Medeiros, S. C.
The temporal dynamics of stream networks are vitally important for understanding hydrologic processes, including groundwater interactions and hydrograph recessions. However, observations of flowing channel heads are limited, as they are usually located in headwater catchments and under canopy. Near-infrared LiDAR data provide an opportunity to map the flowing channel network owing to their fine spatial resolution, canopy penetration, and the strong absorption of light energy by the water surface. A systematic method is developed herein to map flowing channel networks based on the signal intensity of ground LiDAR returns, which is lower on water surfaces than on dry surfaces. Based on selected sample sites where the wetness conditions are known, the signal intensities of ground returns are extracted from the LiDAR point data, and the frequency distributions of wet-surface and dry-surface returns are constructed. With the aid of LiDAR-based ground elevation, signal intensity thresholds are identified for mapping flowing channels. The developed method is applied to the Lake Tahoe area based on eight LiDAR snapshots during recession periods in five watersheds. A power-law relationship between streamflow and flowing channel length during the recession period is derived based on the result.
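The wet/dry intensity-threshold step described in this abstract can be sketched as follows. This is only an illustration of the idea, not the authors' implementation: the function names, the misclassification criterion used to pick the threshold, and the toy intensity values are all assumptions.

```python
import numpy as np

def intensity_threshold(wet, dry, n_bins=256):
    """Pick the intensity value that best separates wet from dry
    ground returns by minimising total misclassification over a
    grid of candidate thresholds (wet NIR returns are darker)."""
    wet, dry = np.asarray(wet, float), np.asarray(dry, float)
    lo, hi = min(wet.min(), dry.min()), max(wet.max(), dry.max())
    candidates = np.linspace(lo, hi, n_bins)
    # errors at t: wet points above t plus dry points at/below t
    errors = [np.sum(wet > t) + np.sum(dry <= t) for t in candidates]
    return candidates[int(np.argmin(errors))]

def classify_flowing(ground_intensity, threshold):
    """True where a ground return is classified as water surface."""
    return np.asarray(ground_intensity) <= threshold

# toy example: water absorbs NIR, so wet returns cluster low
rng = np.random.default_rng(0)
wet = rng.normal(40, 8, 500)    # hypothetical wet-site intensities
dry = rng.normal(120, 15, 500)  # hypothetical dry-site intensities
t = intensity_threshold(wet, dry)
```

In practice the threshold would be derived from the empirical frequency distributions at the known-wetness sample sites, and applied only to returns near the LiDAR-derived channel elevation.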
Wang, K.; Kumar, P.; Dutta, D.
Vegetation biomass information is important for many ecological models that include terrestrial vegetation in their simulations. Biomass has strong influences on carbon, water, and nutrient cycles. Traditionally, biomass estimation requires intensive, and often destructive, field measurements. However, with advances in technology, airborne LiDAR has become a convenient tool for acquiring such information on a large scale. In this study, we use infrared full-waveform LiDAR to estimate biomass information for individual trees in the Sangamon River basin in Illinois, USA. During this process, we also develop automated geolocation calibration algorithms for raw waveform LiDAR data. In the summer of 2014, discrete and waveform LiDAR data were collected over the Sangamon River basin. Field measurements commonly used in biomass equations, such as diameter at breast height and total tree height, were also taken for four sites across the basin. Using discrete LiDAR data, individual trees are delineated. For each tree, a voxelization method is applied to all waveforms associated with the tree to produce a pseudo-waveform. By relating biomass extrapolated using field measurements from a training set of trees to waveform metrics for each corresponding tree, we are able to estimate biomass on an individual-tree basis. The results can be especially useful as current models increase in resolution.
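The per-tree pseudo-waveform step can be sketched as a vertical binning of all waveform samples that fall within one delineated crown. A minimal illustration, assuming sample heights and amplitudes have already been extracted per tree; the function names, bin size, and the example metric are hypothetical, not the study's:

```python
import numpy as np

def pseudo_waveform(heights, amplitudes, bin_size=0.5, top=40.0):
    """Aggregate waveform samples from all pulses over one tree
    crown into fixed vertical bins (a 1-D 'pseudo-waveform').
    heights: sample heights above ground (m); amplitudes: returned
    energy of each sample. Energy is normalised to sum to 1."""
    edges = np.arange(0.0, top + bin_size, bin_size)
    energy, _ = np.histogram(heights, bins=edges, weights=amplitudes)
    total = energy.sum()
    return edges[:-1], energy / total if total > 0 else energy

def height_of_median_energy(bin_bottoms, energy):
    """A simple waveform metric of the kind regressed on biomass:
    height below which half the returned energy accumulates."""
    cum = np.cumsum(energy)
    return bin_bottoms[int(np.searchsorted(cum, 0.5))]
```

Metrics computed from such pseudo-waveforms would then be related to field-extrapolated biomass for the training trees by regression.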
Naesset, Erik; Gobakken, Terje; Bollandsas, Ole Martin; Gregoire, Timothy G.; Nelson, Ross; Stahl, Goeran
Airborne scanning LiDAR (Light Detection and Ranging) has emerged as a promising tool to provide auxiliary data for sample surveys aiming at estimation of above-ground tree biomass (AGB), with potential applications in REDD forest monitoring. For larger geographical regions such as counties, states or nations, it is not feasible to collect airborne LiDAR data continuously ("wall-to-wall") over the entire area of interest. Two-stage cluster survey designs have therefore been demonstrated by which LiDAR data are collected along selected individual flight-lines treated as clusters and with ground plots sampled along these LiDAR swaths. Recently, analytical AGB estimators and associated variance estimators that quantify the sampling variability have been proposed. Empirical studies employing these estimators have shown a seemingly equal or even larger uncertainty of the AGB estimates obtained with extensive use of LiDAR data to support the estimation as compared to pure field-based estimates employing estimators appropriate under simple random sampling (SRS). However, comparison of uncertainty estimates under SRS and sophisticated two-stage designs is complicated by large differences in the designs and assumptions. In this study, probability-based principles to estimation and inference were followed. We assumed designs of a field sample and a LiDAR-assisted survey of Hedmark County (HC) (27,390 km2), Norway, considered to be more comparable than those assumed in previous studies. The field sample consisted of 659 systematically distributed National Forest Inventory (NFI) plots and the airborne scanning LiDAR data were collected along 53 parallel flight-lines flown over the NFI plots. We compared AGB estimates based on the field survey only assuming SRS against corresponding estimates assuming two-phase (double) sampling with LiDAR and employing model-assisted estimators. We also compared AGB estimates based on the field survey only assuming two-stage sampling (the NFI
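The contrast drawn above between pure field estimates and LiDAR-assisted estimates can be illustrated with the textbook two-phase model-assisted (difference) estimator: the mean model prediction over the large LiDAR sample, corrected by the mean residual on the field subsample. This is a generic sketch for illustration, not the paper's exact design-based estimator or its variance estimator.

```python
import numpy as np

def model_assisted_agb(pred_phase1, pred_phase2, field_phase2):
    """Two-phase model-assisted (difference) estimator of mean AGB.
    pred_phase1: model predictions over the large (LiDAR) sample;
    pred_phase2, field_phase2: predictions and field measurements
    on the ground-plot subsample."""
    pred_phase1 = np.asarray(pred_phase1, float)
    residuals = np.asarray(field_phase2, float) - np.asarray(pred_phase2, float)
    # bias-correct the model mean with the mean field residual
    return pred_phase1.mean() + residuals.mean()
```

If the model is unbiased on the subsample, the correction term vanishes and the estimator reduces to the mean LiDAR prediction; a purely field-based SRS estimate would instead be the mean of field_phase2 alone.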
Contemporary forest management on public land incorporates a focus on restoration and maintenance of ecological functions through silvicultural manipulation of forest structure on a landscape scale. Incorporating reference conditions into restoration treatment planning and monitoring can improve treatment efficacy, but the typical ground-based methods of quantifying reference condition data, and comparing it to pre- and post-treatment stands, are expensive, time-consuming, and limited in scale. Airborne LiDAR may be part of the solution to this problem, since LiDAR acquisitions have both broad coverage and high resolution. I evaluated the ability of LiDAR Individual Tree Detection (ITD) to describe forest structure across a structurally variable landscape in support of large-scale forest restoration. I installed nineteen 0.25 ha stem map plots across a range of structural conditions in potential reference areas (Yosemite National Park) and potential restoration treatment areas (Sierra National Forest) in the Sierra Nevada of California. I used the plots to evaluate a common ITD algorithm, the watershed transform, compare it to past uses of ITD, and determine which aspects of forest structure contributed to errors in ITD. I found that ITD across this structurally diverse landscape was generally less accurate than across the smaller and less diverse areas over which it has previously been studied. However, the pattern of tree recognition is consistent: regardless of forest structure, canopy dominants are almost always detected and relatively shorter trees are almost never detected. Correspondingly, metrics dominated by large trees, such as biomass, basal area, and spatial heterogeneity, can be measured using ITD, while metrics dominated by smaller trees, such as stand density, cannot. Bearing these limitations in mind, ITD can be a powerful tool for describing forest structure across heterogeneous landscape restoration project areas.
Jalobeanu, A.; Gonçalves, G. R.
We recently developed a new point cloud registration algorithm. Compared to Iterated Closest Point (ICP) techniques, it is robust to noise and outliers, and easier to use, as it is less sensitive to initial conditions. It minimizes the entropy of the joint point cloud (including intensity attributes to help register areas with poor relief), and uses a voxel space and B-Spline interpolation to accelerate computation. A natural application of registration is swath alignment in airborne light detection and ranging (LiDAR). Indeed, due to uncertainty in the inertial navigation system (INS), attitude angles are subject to time-dependent errors. Such errors can be understood as a sum of three terms: 1) a global term, or boresight error, which can be addressed using several existing techniques; 2) a low-frequency term, which is modeled as a constant attitude error for regions several hundred meters along-track; 3) a high-frequency term, responsible for corduroy artifacts (not addressed here). We propose to use the new registration algorithm to correct the low-frequency attitude variations. Relative geometric errors are significantly reduced, as pairs of swaths are registered onto each other using local corrections. Absolute geometric errors are reduced during a second step, by applying all the corrections together to the entire dataset. We used a test area of 200 km2 in Portugal, with a density of 3-4 pts/m2. The point clouds were derived from waveform data, and include predictive range uncertainties estimated within a Bayesian framework. The data collection was supported by FCT and FEDER as part of the AutoProbaDTM research project (2009-2012). Modeling and reducing geometric error helps build consistent uncertainty maps. After correction, residual errors are taken into account in the final 3D error budget. For gridded elevation models a vertical uncertainty map is computed. Finally, it is possible to use the inter-swath registration parameters to estimate the distribution of
Columbia/HCA, the industry's 800-pound gorilla, is at last building a national brand. In the process, it has created one of the largest-circulation in-house magazines in the U.S. and found a way to build an enviable direct mail database at the same time. PMID:10161965
Kristensen, Solvejg; Túgvustein, Naina; Zachariassen, Hjørdis; Sabroe, Svend; Bartels, Paul; Mainz, Jan
Purpose The Faroe Islands are formally part of the Kingdom of Denmark, but the islands enjoy extensive autonomy under home rule. In Denmark, extensive quality management initiatives have been implemented throughout hospitals; this was not yet the case in the Faroe Islands in 2013. The purpose of this study is to investigate the patient safety culture in the National Hospital of the Faroe Islands prior to the implementation of quality management initiatives. Methods The Danish version of the Safety Attitudes Questionnaire (SAQ-DK) was distributed electronically to 557 staff members from five medical centers of the hospital and one administrative unit. SAQ-DK has six cultural dimensions. The proportion of respondents with positive attitudes and the mean scale scores were described, and comparisons between medical specialties and between clinical leaders and frontline staff were made using analysis of variance and the chi-square test, respectively. Results The response rate was 65.8% (N=367). Job satisfaction was rated most favorably, and the perceived culture of the top management least favorably. Safety climate was the dimension with the greatest variability across the 28 units. The diagnostic center had the most favorable culture of all centers. More leaders than frontline staff had positive attitudes toward teamwork climate, safety climate, and working conditions, and the leaders perceived these dimensions more positively than the frontline staff, P<0.05. Among the three management levels, the unit management was perceived most favorably and the top management least favorably. Conclusion The management group is recommended to raise awareness of their role in supporting a safe and caring environment for patients and staff; moreover, the leaders should ensure that everyday work achieves its objectives, keeping patients safe. Furthermore, following the development of the patient safety culture over time is recommended. PMID:27217800
Crosby, C. J.; Conner, J.; Frank, E.; Arrowsmith, J. R.; Memon, A.; Nandigam, V.; Wurman, G.; Baru, C.
choose to download their results in ESRI or ASCII grid formats as well as GeoTIFF. Additionally, our workflow feeds into GEON web services in development that will allow visualization of LiDAR service outputs in either a web browser window or in 3D through Fledermaus' free viewer iView3D (or our own OpenGL-based tools). This geoinformatics-based system will allow GEON to host LiDAR point cloud data for the greater geoscience community, including data collected by the National Center for Airborne Laser Mapping (NCALM). In addition, most of the functions within this workflow are not limited to LiDAR data and may be used for distributing, interpolating and visualizing any computationally intensive point dataset (such as gravity). By utilizing the computational infrastructure developed by GEON, this system can democratize LiDAR data access for the geoscience community.
Krause, K.; Emery, W. J.; Barnett, D.; Petroy, S. B.; Meier, C. L.; Wessman, C. A.
Remote sensing is a powerful tool for measuring the current state of vegetation and monitoring changes over time with repeated data collections. Airborne Light Detection and Ranging (LiDAR) data are especially well suited for mapping 3D vegetation structure. In 2010, the National Ecological Observatory Network (NEON) contracted LiDAR and hyperspectral airborne data collections over the Ordway-Swisher Biological Station (OSBS). Ground truth campaigns were also conducted in 2010, 2011, and 2014, including structural measurements and the generation of species lists for a set of ground validation plots. The vegetation communities at OSBS can be characterized by the Florida Natural Areas Inventory (FNAI) classification system, with a large area of the property belonging to the Sandhill community. For this study, classification algorithm training locations are hand selected for each FNAI community type using photo-interpretation. A series of LiDAR metrics is calculated on the discrete-return point clouds and the derived digital elevation model (DEM) and canopy height model (CHM). A decision tree classification algorithm is run using the R package "rpart". A main goal of the project is to relate the LiDAR metrics used by the decision tree to direct canopy structural quantities. For instance, the canopy 75th-minus-50th percentile height in the LiDAR point clouds is related to the uniformity of, and light penetration in, the upper canopy. A prototype of the decision tree achieved a classification accuracy of 89% on the training data itself, suggesting that some locations in different FNAI vegetation communities have similar structure and could not be distinguished with the LiDAR metrics used. An improved decision tree is currently under development, which will include more training locations and more LiDAR metrics as input features. Results from this improved model will be presented using the NEON ground truth locations as an independent and quantitative validation measure of the decision tree
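The abstract fits its tree with R's rpart; the core operation such a CART tree repeats recursively (choosing the impurity-minimising threshold on one metric, e.g. a percentile-height difference) can be sketched in pure Python/NumPy. The metric values and class labels below are hypothetical, not the study's:

```python
import numpy as np

def gini(labels):
    """Gini impurity of a group of class labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p * p)

def best_split(x, y):
    """Find the single threshold on one LiDAR metric that minimises
    the size-weighted Gini impurity of the two resulting groups --
    the elementary step of a CART tree such as rpart."""
    order = np.argsort(x)
    x, y = np.asarray(x)[order], np.asarray(y)[order]
    best = (np.inf, None)
    for t in (x[1:] + x[:-1]) / 2.0:   # candidate midpoints
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best[0]:
            best = (score, t)
    return best  # (weighted impurity, threshold)
```

A full tree applies this search over all metrics at each node and recurses on the two groups until a stopping rule is met.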
Piotrowski, J.; Goska, R.; Chen, B.; Krajewski, W. F.; Young, N.; Weber, L.
In June 2008, the state of Iowa experienced an unprecedented flood event which resulted in an economic loss of approximately $2.88 billion. Flooding in the Iowa River corridor, which exceeded the previous flood of record by 3 feet, devastated several communities, including Coralville and Iowa City, home to the University of Iowa. Recognizing an opportunity to capture a unique dataset detailing the impacts of the historic flood, the investigators contacted the National Center for Airborne Laser Mapping (NCALM), which performed an aerial Light Detection and Ranging (LiDAR) survey along the Iowa River. The survey, conducted immediately following the flood peak, provided coverage of a 60-mile reach. The goal of the present research is to develop a process by which flood extents and water surface elevations can be accurately extracted from the LiDAR data set, and to evaluate the benefit of such data in calibrating one- and two-dimensional hydraulic models. Whereas the data typically available for model calibration include sparsely distributed point observations and high water marks, the LiDAR data used in the present study provide broad-scale, detailed, and continuous information describing the spatial extent and depth of flooding. Initial efforts were focused on a 10-mile, primarily urban reach of the Iowa River extending from Coralville Reservoir, a United States Army Corps of Engineers flood control project, downstream through Coralville and Iowa City. The spatial extent and depth of flooding were estimated from the LiDAR data. At a given cross-sectional location, river channel and floodplain measurements were compared. When the differences between floodplain and river channel measurements were less than one standard deviation of the vertical uncertainty in the LiDAR survey, floodplain measurements were classified as flooded. A flood water surface DEM was created using the measurements classified as flooded. A two-dimensional, depth-averaged numerical model of a 10-mile reach of
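The flooded/not-flooded rule described for the cross-sections can be written down directly. A minimal sketch, with hypothetical elevations and a hypothetical 0.15 m vertical uncertainty; the function and variable names are illustrative:

```python
import numpy as np

def flooded_mask(floodplain_z, channel_z, sigma):
    """Classify floodplain water-surface measurements as flooded
    when they fall within one standard deviation (sigma, the
    survey's vertical uncertainty) of the river-channel water
    surface at the same cross-section."""
    diff = np.abs(np.asarray(floodplain_z) - np.asarray(channel_z))
    return diff < sigma

# hypothetical cross-section: first point near water surface, second well above
mask = flooded_mask([100.1, 102.0], [100.0, 100.0], 0.15)
```

The measurements flagged True would then be interpolated into the flood water-surface DEM.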
In the years between 1860 and 1910, a revolution in epilepsy theory and practice occurred. The National Hospital for the Relief and Cure of the Paralysed and the Epileptic at Queen Square in London was at the center of this revolution. A series of remarkable physicians and surgeons were appointed to the staff. The four greatest were John Hughlings Jackson, Sir David Ferrier, Sir Victor Horsley, and Sir William Gowers. Their lasting contribution to epilepsy is discussed. Other physicians who made notable contributions to epilepsy were Jabez Spence Ramskill, Charles Eduard Brown-Séquard, Charles Bland Radcliffe, Sir John Russell Reynolds, Sir Edward Henry Sieveking, Walter Stacy Colman, and William Aldren Turner. At the hospital in this period, amongst the lasting contributions to epilepsy were the following: the development of a new conceptual basis of epilepsy, the development of a theory of the physiological structure of the nervous system in relation to epilepsy, the demonstration and investigation of cortical localization of epileptic activity, the establishment of the principle of focal epilepsy and the description of focal seizure types, the discovery of the first effective drug treatment for epilepsy (bromide therapy, indeed one of the first effective drug treatments in the whole of neurology), and the performance of the first surgical operation for epilepsy. This paper is based on the 2013 Gowers Memorial Lecture, delivered in May 2013. PMID:24239432
Gupta, Anshu; Gupta, Chhavi
Context: Certain quality indicators are mandatory in the maintenance and improvement of quality in blood transfusion. Monitoring of such indicators should be done regularly, and deficiencies corrected, for effective blood transfusion services. Aims: To study the usefulness of monitoring the National Accreditation Board for Hospitals and Healthcare Providers (NABH) core indicators in blood transfusion and in the maintenance of hemovigilance. Settings and Design: Hemovigilance is a quality process to improve quality and increase the safety of blood transfusion. It covers and surveys all activities of the blood transfusion chain from donors to recipients. Core indicator monitoring is part of the hemovigilance process. Materials and Methods: A 2-year retrospective study was conducted in the blood storage unit of an NABH-accredited tertiary care hospital of a metropolitan city. Four NABH core indicators in blood transfusion were observed and monitored by clinical and blood storage unit staff of different levels. Results: It was observed that core indicator monitoring improved quality, with decreased wastage of blood and blood components, decreased average turnaround time for the issue of blood and blood components, and fewer transfusion reactions. Conclusion: This study demonstrated that monitoring of NABH core indicators results in the enhancement of quality and safety in blood transfusion services, reducing the incidence of transfusion reactions. PMID:27011668
Ding, Qiong; Ji, Shengyue; Chen, Wu
Wetlands have received intensive interdisciplinary attention as unique ecosystems and valuable resources. As a new technology, the airborne LiDAR system has been applied in wetland research in recent years. However, most studies have used only one or two LiDAR observations to extract either terrain or vegetation in wetlands. This research aims at integrating LiDAR's multiple attributes (DSM, DTM, off-ground features, slope map, multiple pulse returns, and normalized intensity) to improve the mapping and classification of wetlands based on a multi-level object-oriented classification method. Using this method, we are able to classify the Yellow River Delta wetland into eight classes with an overall classification accuracy of 92.5%.
Gastroenteritis - norovirus; Colitis - norovirus; Hospital-acquired infection - norovirus ... fluids (dehydration). Anyone can become infected with norovirus. Hospital patients who are very old, very young, or ...
Ousseini, H; Kim, D S; Adamou, A
This study was planned to determine the frequency of HIV infection among 394 new tuberculosis patients, over a period extending from July 1990 to July 1991, in the pneumophthisiology department of the National Hospital of Niamey. The proportion of seropositive patients was 7.6%. Both types of virus, HIV-1 and HIV-2, as well as dual HIV-1 + HIV-2 infection, were found among the tuberculosis patients. HIV-1 was most frequently found in subjects in the 20-39 year age group, most of whom were migrants. In spite of the currently low seroprevalence among tuberculosis patients, the authors argue that a sequential sero-epidemiological survey of tuberculosis patients in Niamey could be a relatively easy method for measuring variations in the seroprevalence of AIDS in Niger. PMID:8555766
Rivas-Nieto, Andrea C; Málaga, Germán; Ruiz-Grosso, Paulo; Huayanay-Espinoza, Carlos A; Curioso, Walter H
This study aimed to determine the use of, and perceptions toward, information and communication technologies (ICT) in 206 patients with arterial hypertension, dyslipidemia and diabetes, recruited from the outpatient clinic of a national hospital in Lima, Peru. Of these, 54.4% were older adults and 70.4% were women. Daily phone calls were used by 44.7%. Most had never used a computer (78.2%), email (84%) or the Internet (84%). Many had never sent (80.6%) or received (69.9%) a text message. Seventy percent had at some time forgotten to take their medicine, 72.8% would like to be reminded to take their medication, and 67.9% had a family member who could help them with access to ICT. Despite the low use of ICT in this population, there is willingness and expectation among the patients to participate in programs that implement them. PMID:26338388
Robinson, S. E.; Arrowsmith, R.; de Groot, R. M.; Crosby, C. J.; Whitesides, A. S.; Colunga, J.
The use of high-resolution topography derived from Light Detection and Ranging (LiDAR) in the study of active tectonics is widespread and has become an indispensable tool for better understanding earthquake hazards. For this reason, and because of the spectacular representation of the phenomena the data provide, it is appropriate to integrate these data into the Earth science education curriculum. A collaboration between Arizona State University, the OpenTopography Facility, and the Southern California Earthquake Center is developing three Earth science education products to inform students and other audiences about LiDAR and its application to active tectonics research. First, a 10-minute introductory video titled LiDAR: Illuminating Earthquakes was produced and is freely available online through the OpenTopography portal and SCEC. The second product is an update and enhancement of the Wallace Creek Interpretive Trail website (www.scec.org/wallacecreek). LiDAR topography data products have been added, along with the development of a virtual tour of the offset channels at Wallace Creek using the B4 LiDAR data within the Google Earth environment. The virtual tour of Wallace Creek is designed as a lab activity for introductory undergraduate geology courses to increase understanding of earthquake hazards through exploration of the dramatic offset created by the San Andreas Fault (SAF) at Wallace Creek and of Global Positioning System-derived displacements spanning the SAF there. This activity is currently being tested in courses at Arizona State University. The goal of the assessment is to measure student understanding of plate tectonics and earthquakes after completing the activity. Including high-resolution LiDAR topography data in the Earth science education curriculum promotes understanding of plate tectonics, faults, and other topics related to earthquake hazards.
Background Hospital-associated infections (HAIs) are associated with a considerable burden of disease and direct costs greater than $17 billion. The pathogens that cause the majority of serious HAIs are Enterococcus faecium, Staphylococcus aureus, Clostridium difficile, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter species, referred to as ESCKAPE. We aimed to determine the amount of funding the National Institutes of Health (NIH) National Institute of Allergy and Infectious Diseases (NIAID) allocates to research on antimicrobial-resistant pathogens, particularly ESCKAPE pathogens. Methods The NIH Research Portfolio Online Reporting Tools (RePORT) database was used to identify NIAID antimicrobial resistance research grants funded in 2007-2009 using the terms "antibiotic resistance," "antimicrobial resistance," and "hospital-associated infection." Results Funding for antimicrobial resistance grants increased from 2007 to 2009. Antimicrobial resistance funding for bacterial pathogens has seen a smaller increase than that for non-bacterial pathogens. The total funding for all ESCKAPE pathogens was $22,005,943 in 2007, $30,810,153 in 2008 and $49,801,227 in 2009. S. aureus grants received $29,193,264 in FY2009, the highest funding amount of all the ESCKAPE pathogens. Based on 2009 funding data, approximately $1,565 of research money was spent per S. aureus-related death and $750 was spent per C. difficile-related death. Conclusions Although the funding for ESCKAPE pathogens increased from 2007 to 2009, funding levels for grants related to antimicrobial-resistant bacteria are still lower than funding for antimicrobial-resistant non-bacterial pathogens. Efforts may be needed to improve research funding for resistant bacterial pathogens, particularly as their clinical burden increases. PMID:22958856
Kitamura, Tetsuhisa; Kiyohara, Kosuke; Matsuyama, Tasuku; Hatakeyama, Toshihiro; Shimamoto, Tomonari; Izawa, Junichi; Nishiyama, Chika; Iwami, Taku
Background Outcomes after out-of-hospital cardiac arrests (OHCAs) might be worse during academic meetings because many medical professionals attend them. Methods This was a nationwide population-based observational study of all consecutively enrolled Japanese adult OHCA patients with resuscitation attempts from 2005 to 2012. The primary outcome was 1-month survival with a neurologically favorable outcome. Calendar days of three national meetings (Japanese Society of Intensive Care Medicine, Japanese Association for Acute Medicine, and Japanese Circulation Society) were obtained for each year during the study period, because medical professionals who belong to these academic societies play an important role in treating OHCA patients after hospital admission. We identified two groups: the exposure group included OHCAs that occurred on meeting days, and the control group included OHCAs that occurred on the same days of the week 1 week before and after meetings. Multiple logistic regression analysis was used to adjust for confounding variables. Results A total of 20 143 OHCAs that occurred during meeting days and 38 860 OHCAs that occurred during non-meeting days were eligible for our analyses. The proportion of patients with favorable neurologic outcomes after all arrests did not differ between meeting and non-meeting days (1.6% [324/20 143] vs 1.5% [596/38 855]; adjusted odds ratio 1.02; 95% confidence interval, 0.88–1.19). Regarding bystander-witnessed ventricular fibrillation arrests of cardiac origin, the proportion of patients with favorable neurologic outcomes also did not differ between the groups. Conclusions In this population, there were no significant differences in outcomes between OHCAs that occurred during national meetings of professional organizations related to OHCA care and those that occurred during non-meeting days. PMID:26639754
Sako, F B; Traoré, F A; Camara, M K; Sylla, M; Bangoura, E F; Baldé, O
Cholera is an epidemic diarrheal disease transmitted through the digestive tract; it can cause obstetric complications in pregnant women. The objective of this study was to describe the epidemiological, clinical, and therapeutic aspects of cholera in pregnant women, as well as its course, during the 2012 epidemic in Conakry. This retrospective, descriptive study examined the records of this epidemic over a 7-month period (from May 15 to December 15, 2012). Of 2,808 cholera patients at our hospital, 80 were pregnant, that is, 2.85%. Their mean age was 30 years [range: 15-45 years], 94% were from Conakry, and 69% were in the third trimester of pregnancy. Choleriform diarrhea and vomiting were the main signs, found respectively in 100% and 95% of the women; dehydration was mild in 16%, moderate in 45%, and severe in 39%. Management consisted of rehydration, by plans A (16%), B (45%) or C (39%), and antibiotic treatment based on erythromycin (85%), doxycycline (14%), or azithromycin (1%). Other drugs used included phloroglucinol-trimethylphloroglucinol (Spasfon(®)) in 45%, acetaminophen in 65%, and iron/folic acid in 1% of cases. The major obstetric complications were 4 intrauterine deaths (5%), 2 cases of threatened abortion (2%), 1 preterm delivery (1%), and 1 maternal death. The cholera outbreak in 2012 affected a large number of pregnant women in Conakry, most during their third trimester. The classic clinical manifestations were associated with obstetric complications and maternal-fetal risks. PMID:27412979
The dominant height of 73 georeferenced field sample plots was modeled from various canopy height metrics derived by means of a small-footprint laser scanning technology, known as light detection and ranging (LiDAR), over young and mature forest stands using regression analysis. LiDAR plot metrics were regressed against field-measured dominant height using Best Subsets Regression to reduce the number of models. From those models, regression assumptions were evaluated to determine which model was actually the best. The best model included the 1st and 90th height percentiles as predictors and explained 95% of the variance in average dominant height.
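A fit of that form (dominant height regressed on the 1st and 90th canopy-height percentiles) can be sketched with ordinary least squares. The synthetic data in the test are illustrative only; this reproduces the model form, not the study's published coefficients:

```python
import numpy as np

def fit_dominant_height(p1, p90, h_dom):
    """OLS fit of field-measured dominant height on the 1st and
    90th LiDAR canopy-height percentiles. Returns (coefficients
    [intercept, b_p1, b_p90], R^2)."""
    p1, p90, h_dom = (np.asarray(a, float) for a in (p1, p90, h_dom))
    X = np.column_stack([np.ones_like(p1), p1, p90])
    beta, *_ = np.linalg.lstsq(X, h_dom, rcond=None)
    pred = X @ beta
    ss_res = float(np.sum((h_dom - pred) ** 2))
    ss_tot = float(np.sum((h_dom - h_dom.mean()) ** 2))
    return beta, 1.0 - ss_res / ss_tot
```

The R^2 returned here corresponds to the abstract's "explained 95% of the variance" figure of merit.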
Reto Valiente, Luz Victoria; Pichilingue Reto, Catherina; Pichilingue Reto, Patricia; Angulo, Galindo; Agusto, Carlos; Pichilingue Prieto, Oscar Alfredo
The aim of this study is to evaluate the clinical diagnosis and treatment of pediatric patients with hepatic hydatid disease hospitalized in the pediatric ward of HNHU over the last ten years. This is a descriptive, cross-sectional, retrospective observational study of patients undergoing surgery for liver hydatidosis. We studied 42 confirmed cases of hepatic hydatidosis; ages ranged from 1 to 17 years, and most were adolescents aged 13 to 17 years (20 cases, 47.62%). The gender distribution was equal, and patients came mainly from the Central Andes of Peru (24 cases, 57.14%), followed by the city of Lima (10 cases, 23.81%). The most common presenting symptom was abdominal pain (29 cases, 69.05%), followed by fever (19 cases, 45.24%). Ultrasound was the most common diagnostic method, omitted only in one patient who had a prior CT scan. Serology (indirect immunofluorescence) was positive in only 19 of the 27 patients tested (70%). Most were single hepatic cysts (22 cases, 52.38%), ranging from 3 to 20 cm in size but most commonly 5 to 10 cm. The predominant location was the right lobe (26 cases, 61.98%), followed by both lobes (10 cases, 23.81%). Apart from the liver, there were cysts in the lungs (18 cases, 42.86%). The surgical procedure performed was radical cystectomy with or without drainage in 36 cases (85.71%), and conservative surgery in only 6 cases (14.28%). The important complications were: fever in 15 cases (35.71%), nosocomial respiratory infection in 9 cases (21.43%), biliary fistula in 5 cases (11.90%), and residual abscess in 3 cases (7.14%). Although morbidity was high, mortality in the cases studied was zero. PMID:23128950
... 47 Telecommunication 2 2014-10-01 2014-10-01 false Satellite DARS applications subject to...) COMMON CARRIER SERVICES SATELLITE COMMUNICATIONS Competitive Bidding Procedures for DARS § 25.401 Satellite DARS applications subject to competitive bidding. Mutually exclusive initial applications for...
Low-height vegetation, common in semiarid regions, is difficult to characterize with LiDAR (Light Detection and Ranging) due to similarities, in time and space, of the point returns of vegetation and ground. Other complications may occur due to the low-height vegetation structural characteristics a...
Progress of genomic research in oat has been limited by a lack of common markers and consensus maps that would provide integration platforms for structural genomic analysis. Diversity Array Technology (DArT) is a strategy that provides a high density of molecular markers that can be tested in par...
Zhang, Lihua; Li, Jing; Li, Xi; Nasir, Khurram; Zhang, Haibo; Wu, Yongjian; Hu, Shuang; Wang, Qing; Downing, Nicholas S.; Desai, Nihar R.; Masoudi, Frederick A.; Spertus, John A.; Krumholz, Harlan M.; Jiang, Lixin
Background Statin therapy is among the most effective treatments to improve short- and long-term mortality after acute myocardial infarction. The use of statins, and the intensity of their use, has not been described among acute myocardial infarction patients in China, a country with a rapidly growing burden of cardiovascular disease. Methods and Results Using a nationally representative sample of patients with acute myocardial infarction admitted to 162 Chinese hospitals in 2001, 2006 and 2011, we identified 14,958 patients eligible for statin therapy, determined rates of statin use and the intensity of statin therapy (defined as statin regimens with expected low-density lipoprotein cholesterol lowering of at least 40%), and identified factors associated with the use of statin therapy. Statin use among hospitalized patients with acute myocardial infarction increased from 27.9% in 2001 to 72.5% in 2006, and 88.8% in 2011 (P<0.001 for trend). Regional variation in statin use correspondingly decreased over time. Among treated patients, the proportion receiving intensive statin therapy increased from 1.0% in 2001 to 24.2% in 2006 and 57.2% in 2011 (P<0.001 for trend). Patients without low-density lipoprotein cholesterol measured were less likely to be treated with statins or to receive intensive therapy. Conclusions The use of statin therapy has dramatically increased over the past decade in Chinese patients with acute myocardial infarction. However, half of patients still did not receive intensive statin therapy in 2011. Given that guidelines strongly endorse intensive statin therapy for acute myocardial infarction patients, initiatives promoting the use of statin therapy, with attention to treatment intensity, would support further improvements in practice. PMID:27058862
Fey, Christine; Rutzinger, Martin; Bremer, Magnus; Prager, Christoph; Zangerl, Christian
Information about slope geometry and the kinematics of landslides is essential for hazard assessment, monitoring, and the planning of protection and mitigation measures. Especially for remote and inaccessible slopes, subsurface data (e.g. boreholes, tunnels, investigation adits) are often not available, and thus the deformation characteristics must be derived from surface displacement data. In recent years, multi-temporal topographic LiDAR (Light Detection and Ranging) data have become an increasingly capable tool for detecting topographic surface deformations. In this context, LiDAR-based change detection is commonly applied for quantifying surface elevation changes. Advanced change detection methods derive displacement vectors with the directions and velocities of slope movements. To extract displacement vectors from LiDAR raster data, (i) an approach based on feature tracking by image correlation and (ii) an approach based on feature tracking by vectorized breaklines are investigated. The image correlation method is based on the IMCORR software (National Snow and Ice Data Center, University of Colorado, Boulder), implemented in a SAGA GIS module. The image correlation algorithm is based on a normalized cross-covariance method. The algorithm searches for tie points in two feature rasters derived from digital surface models acquired at different time stamps. The method automatically assesses the displacement rates and directions of distinct terrain features, e.g. displaced mountain ridges or striking boulders. In contrast, the vector-based breakline method requires manual selection of tie points. The breaklines are produced by vectorizing curvature raster images and extracting the "upper terrain edges" (topographic ridges) and "lower terrain edges" (topographic depressions). Both methods were tested on simulated terrain with known displacement rates in order to quantify (i) the accuracy, (ii) the minimum detectable movement rates, (iii) the influence of terrain characteristics, (iv) the
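The correlation-based feature tracking described above can be illustrated with a minimal sketch: a template patch from the first epoch is searched for in the second epoch by maximizing normalized cross-correlation, and the offset of the best match is the displacement vector. This is not the IMCORR implementation; the patch size, search radius, and synthetic rasters are hypothetical:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def track_feature(ras1, ras2, r0, c0, half=5, search=8):
    """Displacement (rows, cols) of the patch centred at (r0, c0) in ras1,
    found by exhaustive NCC search in ras2."""
    tmpl = ras1[r0 - half:r0 + half + 1, c0 - half:c0 + half + 1]
    best, best_dr, best_dc = -2.0, 0, 0
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = r0 + dr, c0 + dc
            cand = ras2[r - half:r + half + 1, c - half:c + half + 1]
            if cand.shape != tmpl.shape:
                continue  # candidate window falls outside the raster
            score = ncc(tmpl, cand)
            if score > best:
                best, best_dr, best_dc = score, dr, dc
    return best_dr, best_dc, best

# Synthetic terrain shifted by (2, 3) cells between the two epochs.
rng = np.random.default_rng(1)
t1 = rng.random((60, 60))
t2 = np.roll(np.roll(t1, 2, axis=0), 3, axis=1)
dr, dc, score = track_feature(t1, t2, 30, 30)  # recovers the (2, 3) shift
```

Repeating this over a grid of template centres yields a field of displacement vectors, from which movement rates and directions can be mapped.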
Phillips, D. A.; Jackson, M. E.; Meertens, C.
UNAVCO has successfully acquired a significant volume of aerial and satellite geodetic imagery as part of GeoEarthScope, a component of the EarthScope Facility project funded by the National Science Foundation. All GeoEarthScope acquisition activities are now complete. Airborne LiDAR data acquisitions took place in 2007 and 2008 and cover a total area of more than 5000 square kilometers. The primary LiDAR survey regions cover features in Northern California, Southern/Eastern California, the Pacific Northwest, the Intermountain Seismic Belt (including the Wasatch and Teton faults and Yellowstone), and Alaska. We have ordered and archived more than 28,000 scenes (more than 81,000 frames) of synthetic aperture radar (SAR) data suitable for interferometric analyses covering most of the western U.S. and parts of Alaska and Hawaii from several satellite platforms, including ERS-1/2, ENVISAT and RADARSAT. In addition to ordering data from existing archives, we also tasked the ESA ENVISAT satellite to acquire new SAR data in 2007 and 2008. GeoEarthScope activities were led by UNAVCO, guided by the community and conducted in partnership with the USGS and NASA. Processed imagery products, in addition to formats intended for use in standard research software, can also be viewed using general purpose tools such as Google Earth. We present a summary of these vast geodetic imagery datasets, totaling tens of terabytes, which are freely available to the community.
Morton, D. C.; Keller, M.; Cook, B. D.; Hunter, Maria; Sales, Marcio; Spinelli, L.; Victoria, D.; Andersen, H.-E.; Saleska, S.
Tropical forest ecosystems respond dynamically to climate variability and disturbances on time scales of minutes to millennia. To date, our knowledge of disturbance and recovery processes in tropical forests is derived almost exclusively from networks of forest inventory plots. These plots typically sample small areas (less than or equal to 1 ha) in conservation units that are protected from logging and fire. Amazon forests with frequent disturbances from human activity remain under-studied. Ongoing negotiations on REDD+ (Reducing Emissions from Deforestation and Forest Degradation plus enhancing forest carbon stocks) have placed additional emphasis on identifying degraded forests and quantifying changing carbon stocks in both degraded and intact tropical forests. We evaluated patterns of forest disturbance and recovery at four ~1000 ha sites in the Brazilian Amazon using small footprint LiDAR data and coincident field measurements. Large-area coverage with airborne LiDAR data in 2011-2012 included logged and unmanaged areas in Cotriguacu (Mato Grosso), Flona do Jamari (Rondonia), and Floresta Estadual do Antimary (Acre), and unmanaged forest within Reserva Ducke (Amazonas). Logging infrastructure (skid trails, log decks, and roads) was identified using LiDAR returns from understory vegetation and validated based on field data. At each logged site, canopy gaps from logging activity and LiDAR metrics of canopy heights were used to quantify differences in forest structure between logged and unlogged areas. Contrasting patterns of harvesting operations and canopy damage at the three logged sites reflect different levels of pre-harvest planning (i.e., informal logging compared to state or national logging concessions), harvest intensity, and site conditions. Finally, we used multi-temporal LiDAR data from two sites, Reserva Ducke (2009, 2012) and Antimary (2010, 2011), to evaluate gap phase dynamics in unmanaged forest areas. The rates and patterns of canopy gap
Yotsumoto, Hideki; Yonemaru, Makoto; Suzuki, Katsuhiro; Kawabe, Yoshiko; Sasaki, Yuka; Toyoda, Emiko; Yamagishi, Fumio; Kudoh, Koichiro; Kurasawa, Takuya; Ito, Masami; Kawashiro, Takeo; Sakatani, Mitsunori; Mori, Masashi
Considering their high social activity, the trend of tuberculosis among young adults appears to be one of the key factors that will influence the future morbidity rate of tuberculosis in Japan. To investigate its current characteristics, we analyzed new cases of tuberculosis in patients aged 20 to 29 who were admitted to 7 national hospitals in the Kanto and Kinki areas during the period from January 1st to December 31st, 2000. Data on the following items were compiled: sex, age, body height and weight, nationality; background factors such as lifestyle and complications; course of the disease before diagnosis; result of the PPD skin test; severity of the disease estimated by the amount of M. tuberculosis in sputum and the grade of chest X-ray findings; and therapeutic regimens and the response rate. Data were collected from 234 patients (129 males and 105 females) and the results were as follows: 1) about 80% of the patients were symptomatic, and in 50% of patients who presented with cough, more than one month was needed before establishing the diagnosis of TB; 2) the disease was found at an advanced stage in more than half of the patients; 3) foreign patients, most of them from the Kanto area, accounted for 11%, and were at an advanced stage, some with drug-resistant tuberculosis; 4) INH resistance was noted in 7.7%; 5) pyrazinamide was included in the therapeutic regimens of 84.0% of the smear-positive patients; 6) the admission period was within 90 days in 63.7% of the patients; however, the duration of treatment was 6 months in only 48.0% of patients who were treated with regimens containing pyrazinamide. More effort for early detection of patients is needed to prevent transmission of the disease, and more extensive use of directly observed therapy is essential for the prevention of dropout. We also discussed shortening the admission period and duration of treatment in these patients. PMID:14509224
Woo, Jung Hoon; Grinspan, Zachary; Shapiro, Jason; Rhee, Sang Youl
The Korean National Health Insurance, which provides universal coverage for the entire Korean population, is now facing financial instability. Frequent emergency department (ED) users may represent a medically vulnerable population who could benefit from interventions that both improve care and lower costs. To understand the nature of frequent ED users in Korea, we analyzed claims data from a population-based national representative sample. We performed both bivariate and multivariable analyses to investigate the association between patient characteristics and frequent ED use (4+ ED visits in a year) using claims data of a 1% random sample of the Korean population, collected in 2009. Among 156,246 total ED users, 4,835 (3.1%) were frequent ED users. These patients accounted for 14% of 209,326 total ED visits and 17.2% of $76,253,784 total medical expenses generated from all ED visits in the 1% data sample. Frequent ED users tended to be older, male, and of lower socio-economic status compared with occasional ED users (p < 0.001 for each). Moreover, frequent ED users had longer stays in the hospital when admitted, higher probability of undergoing an operative procedure, and increased mortality. Among 8,425 primary diagnoses, alcohol-related complaints and schizophrenia showed the strongest positive correlation with the number of ED visits. Among the frequent ED users, mortality and annual outpatient department visits were significantly lower in the alcohol-related patient subgroup compared with other frequent ED users; furthermore, the rate was even lower than that for non-frequent ED users. Our findings suggest that expanding mental health and alcohol treatment programs may be a reasonable strategy to decrease the dependence of these patients on the ED. PMID:26809051
Adeleke, A. K.; Smit, J. L.
Apart from the drive to reduce carbon dioxide emissions by carbon-intensive economies like South Africa, the recent spate of electricity load shedding across most parts of the country, including Cape Town, has left electricity consumers scrambling for alternatives so as to rely less on the national grid. Solar energy, which is adequately available in most parts of Africa and regarded as a clean and renewable source of energy, makes it possible to generate electricity using photovoltaic technology. However, before time and financial resources are invested in rooftop solar photovoltaic systems in urban areas, it is important to evaluate the potential of the building rooftops intended to be used in harvesting the solar energy. This paper presents methodologies that make use of LiDAR data and other ancillary data, such as high-resolution aerial imagery, to automatically extract building rooftops in the City of Cape Town and evaluate their potential for solar photovoltaic systems. Two main processes were involved: (1) automatic extraction of building roofs using the integration of LiDAR data and aerial imagery in order to derive their outlines and areal coverage; and (2) estimating the global solar radiation incident on each roof surface using an elevation model derived from the LiDAR data, in order to evaluate its solar photovoltaic potential. This resulted in a geodatabase which can be queried to retrieve salient information about the viability of a particular building roof for solar photovoltaic installation.
Poppenga, Sandra K.; Gesch, Dean B.; Worstell, Bruce B.
The 1:24,000-scale high-resolution National Hydrography Dataset (NHD) mapped hydrography flow lines require regular updating because land surface conditions that affect surface channel drainage change over time. Historically, NHD flow lines were created by digitizing surface water information from aerial photography and paper maps. Using these same methods to update nationwide NHD flow lines is costly and inefficient; furthermore, these methods result in hydrography that lacks the horizontal and vertical accuracy needed for fully integrated datasets useful for mapping and scientific investigations. Effective methods for improving mapped hydrography employ change detection analysis of surface channels derived from light detection and ranging (LiDAR) digital elevation models (DEMs) and NHD flow lines. In this article, we describe the usefulness of surface channels derived from LiDAR DEMs for hydrography change detection to derive spatially accurate and time-relevant mapped hydrography. The methods employ analyses of horizontal and vertical differences between LiDAR-derived surface channels and NHD flow lines to define candidate locations of hydrography change. These methods alleviate the need to analyze and update the nationwide NHD for time relevant hydrography, and provide an avenue for updating the dataset where change has occurred.
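The horizontal-difference analysis described above — comparing LiDAR-derived surface channels against NHD flow lines to flag candidate locations of hydrography change — might be sketched as follows. The tolerance value and coordinates are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

def flag_change_candidates(nhd_pts, lidar_pts, horiz_tol=15.0):
    """Flag NHD flow-line vertices whose nearest LiDAR-derived channel
    point is farther than horiz_tol (m) away — candidate change locations."""
    lidar = np.asarray(lidar_pts, dtype=float)
    flags = []
    for p in np.asarray(nhd_pts, dtype=float):
        d = np.sqrt(((lidar - p) ** 2).sum(axis=1)).min()  # nearest-neighbour distance
        flags.append(bool(d > horiz_tol))
    return flags

# Hypothetical coordinates (m): one vertex sits on the channel, one has drifted.
channel = [(0, 0), (10, 2), (20, 4), (30, 6)]
nhd = [(10, 3), (25, 40)]
flags = flag_change_candidates(nhd, channel)  # -> [False, True]
```

A production workflow would also compare elevations along the flow lines against the LiDAR DEM (the vertical difference), but the nearest-distance test above captures the horizontal screening step.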
Campo, D.; IsoDAR Collaboration
The IsoDAR experiment is the MIT proposal to investigate several neutrino properties in order to explain some experimentally observed anomalies. It requires 10 mA of proton beam at an energy of 60 MeV to produce a high-intensity electron antineutrino flux from the production and decay of 8Li: an ambitious goal for the accelerator design, also because the machine has to be placed near a neutrino detector, like KAMLAND or WATCHMAN, located in underground sites. A compact cyclotron able to accelerate an H2+ molecular beam up to an energy of 60 MeV/amu is under study. The critical issues of this machine concern beam injection due to space-charge effects, the efficiency of beam extraction, and the technical solutions needed for the machine assembly. Here, the innovative solutions and the preliminary results achieved by the IsoDAR team are discussed.
Varney, Nina M.; Asari, Vijayan K.
One of the most difficult challenges of working with LiDAR data is the large number of data points produced. Analysing these large data sets is an extremely time-consuming process. For this reason, automatic perception of LiDAR scenes is a growing area of research. Currently, most LiDAR feature extraction relies on geometrical features specific to the point cloud of interest. These geometrical features are scene-specific, and often rely on the scale and orientation of the object for classification. This paper proposes a robust method for reduced-dimensionality feature extraction of 3D objects using a volume component analysis (VCA) approach. This VCA approach is based on principal component analysis (PCA). PCA is a method of reduced feature extraction that computes a covariance matrix from the original input vector. The eigenvectors corresponding to the largest eigenvalues of the covariance matrix are used to describe an image. Block-based PCA is an adapted method for feature extraction in facial images because PCA, when performed in local areas of the image, can extract more significant features than when the entire image is considered. The image space is split into several of these blocks, and PCA is computed individually for each block. This VCA proposes that a LiDAR point cloud can be represented as a series of voxels whose values correspond to the point density within that relative location. From this voxelized space, block-based PCA is used to analyze sections of the space which, when combined, represent features of the entire 3D object. These features are then used as the input to a support vector machine which is trained to identify four classes of objects: vegetation, vehicles, buildings and barriers, with an overall accuracy of 93.8%.
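The pipeline described above — voxelize a point cloud into a grid of point densities, then apply PCA for reduced-dimensionality features — can be sketched as follows. For brevity this applies plain PCA to the flattened voxel grids, standing in for the paper's block-based VCA; the grid size, component count, and data are hypothetical:

```python
import numpy as np

def voxelize(points, grid=(8, 8, 8)):
    """Voxel grid whose cell values are normalized point densities."""
    pts = np.asarray(points, dtype=float)
    mins, maxs = pts.min(axis=0), pts.max(axis=0)
    idx = ((pts - mins) / (maxs - mins + 1e-9) * np.array(grid)).astype(int)
    idx = np.minimum(idx, np.array(grid) - 1)  # clamp points on the upper face
    vol = np.zeros(grid)
    for i, j, k in idx:
        vol[i, j, k] += 1
    return vol / len(pts)

def pca_features(volumes, n_components=3):
    """PCA over flattened voxel volumes (the paper refines this per block)."""
    X = np.array([v.reshape(-1) for v in volumes])  # samples x voxels
    X = X - X.mean(axis=0)
    cov = X.T @ X / max(len(X) - 1, 1)              # covariance matrix
    w, V = np.linalg.eigh(cov)
    top = V[:, np.argsort(w)[::-1][:n_components]]  # leading eigenvectors
    return X @ top                                  # samples x components

# Hypothetical point clouds standing in for segmented LiDAR objects.
rng = np.random.default_rng(2)
volumes = [voxelize(rng.random((200, 3))) for _ in range(10)]
feats = pca_features(volumes)
```

The resulting low-dimensional feature vectors would then be fed to a classifier such as a support vector machine.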
Glennie, C. L.; Brooks, B. A.; Ericksen, T. L.; Hudnut, K. W.; Foster, J. H.; Hauser, D.; Avery, J.
Airborne LiDAR (LIght Detection And Ranging) systems have become a standard mechanism for acquiring dense high-precision topography, making it possible to survey large areas (hundreds of km2 per day) at spatial scales as fine as a few decimeters horizontally and a few centimeters vertically. However, current airborne and terrestrial LiDAR systems suffer from a number of drawbacks. They are expensive, bulky, require significant power supplies, and are often optimized for use on only one type of mobility platform. It would therefore be advantageous to design a lightweight, compact and relatively inexpensive multipurpose LiDAR and imagery system that could be used from a variety of mobility platforms, both terrestrial and airborne. The system should be quick and easy to deploy, and require a minimum amount of existing infrastructure for operational support. With these goals in mind, our research teams have developed a prototype field-deployable compact dynamic laser scanning system that is configured for use on a variety of mobility platforms, including backpack-wearable, as well as unmanned aerial vehicles (e.g. balloons and helicopters) and small off-road vehicles such as ATVs. The system is small, self-contained, relatively inexpensive, and easy to deploy. The first version of this multipurpose LiDAR system has been successfully tested in both the backpack configuration and on a tethered flight attached to a helium balloon. We will present system design and development details, along with field experiences and a detailed accuracy analysis of the acquired point clouds, which shows that a vertical accuracy of 3-5 cm (1 sigma) can be achieved in both backpack and balloon modalities.
Today, airborne LiDAR (Light Detection And Ranging) systems have gained acceptance as a powerful tool to rapidly collect invaluable information for assessing the impact of either natural disasters, such as hurricanes, earthquakes and flooding, or human-inflicted disasters such as terrorist/enemy activities. Where satellite-based imagery provides an excellent tool to remotely detect changes in the environment, LiDAR systems, being active remote sensors, provide an unsurpassed method to quantify these changes. The strength of active laser-based systems is especially evident in areas covered by occluding vegetation or in the shallow coastal zone, as the laser can penetrate the vegetation or water body to unveil what is below. The purpose of this paper is to address the task of surveying complex areas with the help of state-of-the-art airborne LiDAR systems, and also to discuss scenarios where the method is used today and where it may be used tomorrow. Regardless of whether it is a post-hurricane survey or a preparation stage for a landing operation in uncharted waters, it is today possible to collect, process and present a dense 3D model of the area of interest within just a few hours of deployment. By utilizing advances in processing power and wireless network capabilities, real-time presentation would be feasible.
Wu, Yu-Ting; Hsuan, Chung-Yao; Lin, Ta-Hui
Taiwan is subject to 3.4 landfall typhoons each year on average, generally occurring in the third quarter of the year (July-September). An understanding of the boundary-layer turbulence characteristics of a typhoon is needed to ensure the safety of both onshore and offshore wind turbines used for power generation. In this study, a floating LiDAR (Light Detection and Ranging) system was deployed in a harbor to collect data on wind turbulence, atmospheric pressure, and temperature during three typhoon events (Typhoons Matmo, Soulik and Trami). Data collected from the floating LiDAR and from meteorological stations located at Taipei, Taichung and Kaohsiung were adopted to analyse the wind turbulence characteristics in the three typhoon events. The measurement results show that the maximum 10-min average wind speed measured with the floating LiDAR is up to 24 m/s at a height of 200 m. Compared with other normal days, the turbulence intensity is lower in the three typhoon events, where the wind speed increases rapidly. Clear changes of wind direction take place as the typhoons cross Taiwan from east to west. Within the crossing intervals, the vertical momentum flux is observed to have a significant pattern with both upward- and downward-propagating waves which are relevant to the flow structure of the typhoons.
Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho
Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information on the source areas of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three-dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls, and the spatial attribution of their frequency and energy.
Background Previous studies show an increased interest in and usage of complementary and alternative medicine (CAM) in the general population and among health care workers, both internationally and nationally. CAM usage is also reported to be common among surgical patients. Earlier international studies have reported that a large proportion of surgical patients use it prior to and after surgery. Recent publications indicate weak knowledge about CAM among health care workers. However, the current situation in Sweden is unknown. The aim of this study was therefore to explore perceived knowledge about CAM among registered healthcare professions in surgical departments at Swedish university hospitals. Method A questionnaire was distributed to 1757 registered physicians, nurses and physiotherapists in surgical wards at the seven university hospitals in Sweden from spring 2010 to spring 2011. The questionnaire included classification of 21 therapies into conventional, complementary, alternative and integrative, and whether patients were recommended these therapies. Questions concerning knowledge, research, and patient communication about CAM were also included. Result A total of 737 (42.0%) questionnaires were returned. Therapies classified as complementary were massage, manual therapies, yoga and acupuncture. Alternative therapies were herbal medicine, dietary supplements, homeopathy and healing. Classification as integrative therapy was low, and unfamiliar therapies were Bowen therapy, iridology and the Rosen method. Therapies recommended by > 40% of the participants were massage and acupuncture. Knowledge of and research about CAM were rated as minor or none at all by 95.7% and 99.2%, respectively. Possessing knowledge about CAM was rated as important by 80.9%. It was believed by 61.2% that more research funding should be allocated to CAM research; 72.8% were interested in reading CAM research results, and 27.8% would consider taking part in such research. Half of the
Falkowski, M. J.; Fekety, P.; Silva, C. A.; Hudak, A. T.
LiDAR data are increasingly applied to support forest inventory and assessment across a variety of spatial scales. Typically this is achieved by integrating LiDAR data with forest inventory data collected at fixed-radius forest inventory plots. A well-designed forest inventory, one that covers the full range of structural and compositional variation across the forest of interest, is costly, especially when collecting fixed-radius plot data. Variable radius plots offer an alternative inventory protocol that is more efficient in terms of both time and money. However, integrating variable radius plot data with LiDAR data is problematic because the plots have unknown sizes that vary with tree size. This leads to a spatial mismatch between LiDAR metrics (e.g., mean height, canopy cover, density, etc.) and plot data, which ultimately translates into errors in LiDAR-derived forest inventory predictions. We propose and evaluate a novel approach for integrating variable radius plot data into LiDAR-based forest inventories in two different forest systems, one in the Inland Northwest and another in the northern Lake States of the USA. The novel approach calculates LiDAR metrics by weighting the point cloud proportionally to return height, mimicking the way in which variable radius plots weight tree measurements by tree size. This could increase inventory sampling efficiency, allowing for the collection of a greater number of inventory plots, and ultimately improve the performance of LiDAR-based inventories.
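The height-weighting idea described above — computing LiDAR plot metrics with each return weighted in proportion to its height, analogous to how a variable radius plot weights trees by size — can be sketched as follows. The weighting power and the 2 m canopy cover threshold are illustrative assumptions, not the authors' values:

```python
import numpy as np

def weighted_lidar_metrics(heights, power=1.0):
    """LiDAR plot metrics with returns weighted proportionally to height,
    mimicking how a variable radius plot weights trees by size."""
    z = np.asarray(heights, dtype=float)
    w = np.maximum(z, 0.0) ** power
    w = w / w.sum()                    # normalize weights
    mean_h = (w * z).sum()             # height-weighted mean height
    cover = w[z > 2.0].sum()           # weighted canopy cover (returns above 2 m)
    return mean_h, cover

# Tall returns dominate: the weighted mean exceeds the plain mean.
mean_h, cover = weighted_lidar_metrics([1.0, 2.0, 10.0])
```

Because tall returns receive more weight, these metrics emphasize large trees just as a prism sweep does, reducing the mismatch with variable radius plot data.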
Mitchard, E. T. A.; Saatchi, S. S.; White, L. J. T.; Abernethy, K. A.; Jeffery, K. J.; Lewis, S. L.; Collins, M.; Lefsky, M. A.; Leal, M. E.; Woodhouse, I. H.; Meir, P.
Spatially explicit maps of aboveground biomass are essential for calculating the losses and gains in forest carbon at a regional to national level. The production of such maps across wide areas will become increasingly necessary as international efforts to protect primary forests, such as the REDD+ (Reducing Emissions from Deforestation and forest Degradation) mechanism, come into effect, alongside their use for management and research more generally. However, mapping biomass over high-biomass tropical forest is challenging because (1) direct regressions with optical and radar data saturate, (2) much of the tropics is persistently cloud-covered, reducing the availability of optical data, (3) many regions include steep topography, making the use of radar data complex, and (4) while LiDAR data do not suffer from saturation, expensive aircraft-derived data are necessary for complete coverage. We present a solution to these problems, using a combination of terrain-corrected L-band radar data (ALOS PALSAR), spaceborne LiDAR data (ICESat GLAS) and ground-based data. We map Gabon's Lopé National Park (5000 km2) because it includes a range of vegetation types from savanna to closed-canopy tropical forest, is topographically complex, has no recent cloud-free high-resolution optical data, and the dense forest is above the saturation point for radar. Our 100 m resolution biomass map is derived by fusing spaceborne LiDAR (7142 ICESat GLAS footprints), 96 ground-based plots (average size 0.8 ha) and an unsupervised classification of terrain-corrected ALOS PALSAR radar data, from which we derive the aboveground biomass stock of the park to be 78 Tg C (173 Mg C ha-1). This value is consistent with our field data average of 181 Mg C ha-1, from the field plots measured in 2009 covering a total of 78 ha, which are independent as they were not used for the GLAS biomass estimation. We estimate an uncertainty of ± 25% on our carbon stock value for the park. This error term includes
Abd Manap, Mohamad; Azhari Razak, Khamarrul; Mohamad, Zakaria; Ahmad, Azhari; Ahmad, Ferdaus; Mohamad Zin, Mazlan; A'zad Rosle, Qalam
Malaysia faces a substantial number of landslide events every year. Cameron Highlands, Pahang, is one of the areas badly affected by slope failures, characterized by extreme climate, rugged topography and weathered geological structures in a tropical environment. The high frequency of landslide occurrence in the hilly areas is predominantly due to the geological materials, tropical monsoon seasons and uncontrolled agricultural activities. Therefore, the Government of Malaysia, through the Prime Minister's Department, has allocated a special budget to conduct a national-level hazard and risk mapping project through the Minerals and Geoscience Department Malaysia, the Ministry of Natural Resources and Environment. The primary aim of this project is to provide slope hazard risk information for better slope management in Malaysia. In addition, this project will establish national infrastructure for geospatial information on geological terrain and slopes by emphasizing disaster risk throughout the country. The areas of interest are located in three different selected areas, i.e. Cameron Highlands (275 square kilometers), Ipoh (200 square kilometers) and Cheras Kajang -- Batang Kali (650 square kilometers). These areas were selected based on the National Slope Master Plan (2009 -- 2023) endorsed by the Malaysian Government Cabinet. The national hazard and risk mapping project includes six major tasks: (1) desk study and mobilization, (2) airborne LiDAR data acquisition and analysis, (3) field data acquisition and verification, (4) hazard and risk analysis for natural terrain, (5) hazard and risk analysis for man-made slopes and (6) man-made slope mitigation/preventive measures. The project was authorized in September 2014 and will end in March 2016. In this paper, the main focus is to evaluate the suitability of the integrated capability of airborne and terrestrial LiDAR data acquisition and analysis, and also digital photography, for regional landslide assessment. The
Skovgaard Andersen, Mikkel; Al-Hamdani, Zyad; Steinbacher, Frank; Rolighed Larsen, Laurids; Brandbyge Ernstsen, Verner
Coastal and tidal environments are valuable ecosystems, which, however, are under pressure in many areas around the world due to globalisation and/or climate change. Detailed mapping of these environments is required in order to manage the coastal zone in a sustainable way. However, historically these transition zones between land and water have been difficult or even impossible to map and investigate in high spatial resolution due to the challenging environmental conditions. The new generation of airborne topobathymetric light detection and ranging (LiDAR) potentially enables full-coverage and high-resolution mapping of these land-water transition zones. We have carried out topobathymetric LiDAR surveys in the Knudedyb tidal inlet system, a coastal environment in the Danish Wadden Sea which is part of the Wadden Sea National Park and UNESCO World Heritage. Detailed digital elevation models (DEMs) with a grid cell size of 0.5 m × 0.5 m were generated from the LiDAR point cloud, which had a mean point density on the order of 20 points/m². The DEMs were analysed morphometrically using a modification of the Benthic Terrain Modeler (BTM) tool developed by Wright et al. (2005). Initially, stage (the elevation in relation to tidal range) was used to divide the area of investigation into the different tidal zones, i.e. subtidal, intertidal and supratidal. Subsequently, morphometric units were identified and characterised by a combination of statistical neighbourhood analysis with varying window sizes (using the Bathymetric Positioning Index (BPI) from the BTM, moving average and standard deviation), slope parameters and area/perimeter ratios. Finally, these morphometric units were classified into six different types of landforms based on their stage and morphometric characteristics, i.e. either subtidal channel, intertidal flat, intertidal creek, linear bar, swash bar or beach dune. We hereby demonstrate the potential of using airborne topobathymetric LiDAR for seamless mapping of land
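The BPI neighbourhood analysis described above reduces to each cell's elevation minus the mean elevation within a moving window. A minimal pure-Python sketch of that calculation (the DEM values and window size here are illustrative assumptions, not the study's actual data or BTM toolchain):

```python
def bpi(dem, window=1):
    """Bathymetric Position Index: each cell's elevation minus the mean
    elevation of its (2*window+1) x (2*window+1) neighbourhood, clipped
    at the grid edges. Positive values suggest crests/bars, negative
    values channels/depressions, and values near zero flat terrain."""
    rows, cols = len(dem), len(dem[0])
    out = []
    for i in range(rows):
        row = []
        for j in range(cols):
            vals = [dem[r][c]
                    for r in range(max(0, i - window), min(rows, i + window + 1))
                    for c in range(max(0, j - window), min(cols, j + window + 1))]
            row.append(dem[i][j] - sum(vals) / len(vals))
        out.append(row)
    return out

# Hypothetical 3 x 3 elevation patch (metres): one high cell on a flat.
patch = [[1.0, 1.0, 1.0],
         [1.0, 4.0, 1.0],
         [1.0, 1.0, 1.0]]
print(bpi(patch)[1][1])  # centre cell stands above its neighbourhood mean
```

In the study, this index is evaluated at several window sizes so that fine features (creeks) and broad ones (channels, bars) are captured at their characteristic scales.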
Kim, Kyunghee; Choi, Jae Wook; Park, Miso; Kim, Min Soo; Lee, Eun Sun
Objectives In light of the need to develop an integrated database on poisoning incidents in Korea, this study seeks to determine the characteristics of poisoning incidents in Korea by age, gender, location of incident, causative substance and patient prognosis. Data sources The Korea National Hospital Discharge In-Depth Injury Survey results (2005–2009) from the Korea Centers for Disease Control and Prevention were used. Participants 3826 participants in the survey who had been hospitalised for poisoning incidents. Results The poisoning hospitalisation rate per 100 000 population was higher in women (1.735) than in men (1.372) and increased with age: the rate was 0.458 among individuals aged ≤9 years, 0.481 among those aged 10–19 years, 1.584 among those aged 20–64 years and 4.053 among those aged ≥65 years. The intentional poisoning hospitalisation rate differed by gender and age group: women aged ≤19 years and 20–64 years showed a higher hospitalisation rate than men, while men aged ≥65 years showed a higher hospitalisation rate than women in the same age group. The most common causative substances were pesticides (33.6%), while antiepileptic, sedative-hypnotic and antiparkinsonism drugs, and psychotropic drugs not elsewhere classified, were also very common. Poisoning in those aged ≤9 years usually involved other drugs, while pesticides were the most common substances in those aged 20–64 years and ≥65 years. Conclusions This study analysed poisoning incidents in Korea from 2005 to 2009 by age, gender, causative substance and other characteristics. The results may serve as evidence for new strategies to prevent poisoning in Korea. PMID:26553832
Yator, Obadia; Mathai, Muthoni; Vander Stoep, Ann; Rao, Deepa; Kumar, Manasi
Mothers with HIV are at high risk of a range of psychosocial problems that may affect HIV disease progression for themselves and their children. Stigma has also become a substantial barrier to accessing HIV/AIDS care and prevention services. The study objective was to determine the prevalence and severity of postpartum depression (PPD) and to further understand the impact of stigma and other psychosocial factors among 123 women living with HIV attending the prevention of mother-to-child transmission (PMTCT) clinic at Kenyatta National Hospital in Nairobi, Kenya. We used the Edinburgh Postnatal Depression Scale and the HIV/AIDS Stigma Instrument - PLWHA (HASI - P). Forty-eight percent (N = 59) of the women screened positive for elevated depressive symptoms. Eleven participants (9%) reported high levels of stigma. Multivariate analyses showed that lower education (OR = 0.14, 95% CI [0.04-0.46], p = .001) and lack of family support (OR = 2.49, 95% CI [1.14-5.42], p = .02) were associated with the presence of elevated depressive symptoms. The presence of stigma implied a more than ninefold risk of developing PPD (OR = 9.44, 95% CI [1.132-78.79], p = .04), and stigma was positively correlated with PPD. PMTCT clinics are an ideal context in which to reach out to women and address mental health problems, in particular by screening for depression and offering psychosocial treatments that bolster the quality of life of the mother-baby dyad. PMID:27045273
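The odds ratios above come from multivariate logistic regression; as a simple illustration of what an odds ratio and its Wald 95% confidence interval represent, here is the standard unadjusted 2×2-table calculation (the cell counts below are hypothetical, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, e.g. stigma (exposed) vs depressive symptoms (cases)
print(odds_ratio_ci(10, 5, 20, 40))
```

A wide interval such as the study's [1.132-78.79] typically reflects few exposed participants (here, only 11 women reported high stigma).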
Chouhy, Diego; D’Andrea, Rubén Mamprín; Iglesias, Mercedes; Messina, Analía; Ivancovich, Juan J.; Cerda, Belen; Galimberti, Diana; Bottai, Hebe; Giri, Adriana A.
Cervarix vaccine was included in the National Immunization Program of Argentina in 2011, but data about the local distribution of human papillomavirus (HPV) infection in women exposed to the virus are scarce. This cross-sectional study determined the prevalence and type distribution of HPV infection in unvaccinated women attending routine gynaecological screening in two public hospitals located in Buenos Aires and Santa Fe, Argentina. Socio-demographic, sexual behaviour and co-factor information was obtained from all participants (Buenos Aires, n=429; Santa Fe, n=433). Cervicovaginal swabs were tested with an MY11/09 primer-based assay and with the CUT primer system targeting mucosal/cutaneous HPVs. Participants from Buenos Aires showed significantly higher rates of HPV infection (52.4% vs. 40.6%), of multiple infections (24.2% vs. 16.4%), and of low-risk (20.3% vs. 13.9%) and high-risk types (44.1% vs. 33.3%) than those from Santa Fe. HPV-66 (Buenos Aires: 17%) and HPV-16 (Santa Fe: 8.5%) were the most prevalent types. A novel HPV-66 putative subtype and variants were identified. Vaccine types 16 and 18 were frequent (Buenos Aires: 13.5%; Santa Fe: 10.2%) but few participants had co-infections with both (Buenos Aires: 1.4%; Santa Fe: 0.2%). A common risk factor for HPV infection was having a new sexual partner in the last year (Buenos Aires: OR 2.53, p<0.001; Santa Fe: OR 1.85, p=0.04). This study provides valuable baseline data for future assessment of the impact of mass vaccination in Argentina and underscores the value of additional HPV testing strategies, such as the CUT system, for surveillance and vaccinology. PMID:23296573
Ghosh, Manab Kumar
Snakebite remains a public health problem in India, occurring most frequently in the summer and rainy seasons. Bites most commonly occur on the lower limbs. Victims are typically male and between 17 and 27 years of age; children and the elderly have higher mortality. The worst affected states are Kerala, Maharashtra, Tamil Nadu, Orissa, Assam and West Bengal. There has been no uniform guideline for the treatment of snakebite cases. The five common venomous Indian snakes that bite humans are the common cobra, krait, Russell's viper, saw-scaled viper and hump-nosed pit viper. Seventy per cent of all snakebites are non-venomous, and even in bites by venomous snakes, envenomation occurs in only 50% of cases. Immobilisation is much more important than a tight ligature, which may cause gangrene. Only a minority of victims need antivenom, which is expensive, in short supply and may cause severe reactions. Antivenom (ASV) treatment is recommended on the basis of local and systemic signs and symptoms and the 20-minute whole blood clotting test (20WBCT). Delay in starting ASV treatment is the main cause of mortality and morbidity. The skin test is of no value, and antivenom should not be used unless specifically indicated. The "Do it RIGHT" approach of the national treatment protocol indicates the initial steps to be taken before reaching a hospital or primary healthcare facility; it resulted in a 66% decline in the amount of ASV administered and an absolute reduction in mortality of 24%. However, treatment of the bitten limb/area with broad-spectrum antibiotics and tetanus antitoxin injection, as well as supportive treatment with blood transfusion, ventilatory support, anticholinesterase and peritoneal dialysis, may also be required. PMID:22315862