Listen up! Processing of intensity change differs for vocal and nonvocal sounds.
Schirmer, Annett; Simpson, Elizabeth; Escoffier, Nicolas
2007-10-24
Changes in the intensity of both vocal and nonvocal sounds can be emotionally relevant. However, as only vocal sounds directly reflect communicative intent, intensity change of vocal but not nonvocal sounds is socially relevant. Here we investigated whether a change in sound intensity is processed differently depending on its social relevance. To this end, participants listened passively to a sequence of vocal or nonvocal sounds that contained rare deviants which differed from standards in sound intensity. Concurrently recorded event-related potentials (ERPs) revealed a mismatch negativity (MMN) and P300 effect for intensity change. Direction of intensity change was of little importance for vocal stimulus sequences, which recruited enhanced sensory and attentional resources for both loud and soft deviants. In contrast, intensity change in nonvocal sequences recruited more sensory and attentional resources for loud as compared to soft deviants. This was reflected in markedly larger MMN/P300 amplitudes and shorter P300 latencies for the loud as compared to soft nonvocal deviants. Furthermore, while the processing pattern observed for nonvocal sounds was largely comparable between men and women, sex differences for vocal sounds suggest that women were more sensitive to their social relevance. These findings extend previous evidence of sex differences in vocal processing and add to reports of voice specific processing mechanisms by demonstrating that simple acoustic change recruits more processing resources if it is socially relevant.
Camerlengo, Terry; Ozer, Hatice Gulcin; Onti-Srinivasan, Raghuram; Yan, Pearlly; Huang, Tim; Parvin, Jeffrey; Huang, Kun
2012-01-01
Next Generation Sequencing is highly resource-intensive. NGS tasks related to data processing, management and analysis require high-end computing servers or even clusters. Additionally, processing NGS experiments requires suitable storage space and significant manual interaction. At The Ohio State University's Biomedical Informatics Shared Resource, we designed and implemented a scalable architecture to address the challenges associated with the resource-intensive nature of NGS secondary analysis built around Illumina Genome Analyzer II sequencers and Illumina's Gerald data processing pipeline. The software infrastructure includes a distributed computing platform consisting of a LIMS called QUEST (http://bisr.osumc.edu), an Automation Server, a computer cluster for processing NGS pipelines, and a network attached storage device expandable up to 40TB. The system has been architected to scale to multiple sequencers without requiring additional computing or labor resources. This platform demonstrates how to manage and automate NGS experiments in an institutional or core facility setting.
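To make the automation layer concrete, the sketch below is a minimal, generic run-folder watcher that dispatches a cluster job when a sequencing run finishes. It is an illustrative stand-in, not the QUEST/Automation Server code; the directory layout, completion-marker file name and qsub invocation are all assumptions.

```python
#!/usr/bin/env python3
"""Minimal sketch of an NGS automation watcher (illustrative only).

Assumptions (not taken from the paper): runs appear as subdirectories of
RUNS_DIR, a run is "finished" when a marker file named DONE_MARKER exists,
and jobs are submitted with a generic `qsub` call.
"""
import subprocess
import time
from pathlib import Path

RUNS_DIR = Path("/data/sequencer_runs")   # hypothetical NAS mount
DONE_MARKER = "run_complete.txt"          # hypothetical completion flag
SUBMITTED = set()                         # runs already dispatched

def submit_pipeline(run_dir: Path) -> None:
    """Submit a secondary-analysis job for one finished run (placeholder script)."""
    subprocess.run(["qsub", "-v", f"RUN={run_dir}", "run_gerald.sh"], check=True)

def poll_once() -> None:
    """Scan the run directory and dispatch any newly completed runs."""
    for run_dir in RUNS_DIR.iterdir():
        if run_dir.is_dir() and (run_dir / DONE_MARKER).exists() and run_dir not in SUBMITTED:
            submit_pipeline(run_dir)
            SUBMITTED.add(run_dir)

if __name__ == "__main__":
    while True:
        poll_once()
        time.sleep(300)  # check every 5 minutes
```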
ERIC Educational Resources Information Center
Bevans, Katherine B.; Fitzpatrick, Leslie-Anne; Sanchez, Betty M.; Riley, Anne W.; Forrest, Christopher
2010-01-01
Background: This study was conducted to empirically evaluate specific human, curricular, and material resources that maximize student opportunities for physical activity during physical education (PE) class time. A structure-process-outcome model was proposed to identify the resources that influence the frequency of PE and intensity of physical…
Kaunonen, Marja; Salin, Sirpa; Aalto, Pirjo
2015-07-01
To explore factors associated with nursing intensity, work environment intensity and nursing resources that may affect nurse job satisfaction and risk of dissatisfaction in outpatient care at one university hospital in Finland. Much research has been done to study how nursing intensity, work environment intensity and nursing resources are associated with nurse job satisfaction, but not in the context of outpatient care. This research used a cross-sectional design. The data were collected from the hospital information systems of outpatient units (n = 12) in autumn 2010. Management style showed a statistically significant association with job satisfaction. The risk of dissatisfaction increased when nursing staff had no influence over the design of their jobs, when conflicts and contradictions were not addressed in the workplace and when feedback was not processed. Nursing intensity and work environment intensity had no effect on nurse job satisfaction. Nursing resources and patient satisfaction, on the other hand, were important to nurses' job satisfaction. The results indicate that nursing management should involve nursing staff in the development of their jobs and the care delivery model. © 2013 John Wiley & Sons Ltd.
Brown, Ross; Rasmussen, Rune; Baldwin, Ian; Wyeth, Peta
2012-08-01
Nursing training for an Intensive Care Unit (ICU) is a resource-intensive process. High demands are made on staff, students and physical resources. Interactive, 3D computer simulations, known as virtual worlds, are increasingly being used to supplement training regimes in the health sciences, especially in areas such as complex hospital ward processes. Such worlds have been found to be very useful in maximising the utilisation of training resources. Our aim is to design and develop a novel virtual world application for teaching and training Intensive Care nurses in the approach and method for shift handover, to provide an independent but rigorous approach to teaching these important skills. In this paper we present a virtual world simulator for students to practice key steps in handing over the 24/7 care requirements of intensive care patients during the first hour of a shift. We describe the modelling process to provide a convincing interactive simulation of the handover steps involved. The virtual world provides a practice tool for students to test their analytical skills with scenarios previously provided by simple physical simulations and live on-the-job training. Additional educational benefits include facilitation of remote learning, high flexibility in study hours and the automatic recording of a reviewable log from the session. To the best of our knowledge, we believe this is a novel and original application of virtual worlds to an ICU handover process. The major outcome of the work was a virtual world environment for training nurses in the shift handover process, designed and developed for use by postgraduate nurses in training. Copyright © 2012 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.
Measuring technical efficiency of output quality in intensive care units.
Junoy, J P
1997-01-01
Presents some examples of the implications derived from imposing the objective of maximizing social welfare, subject to limited resources, on critical care patient management in respect of the quality performance of health services. Conventional knowledge of health economics points out that critically ill patients are responsible for increased use of technological resources and that they receive a high proportion of health care resources. Attempts to answer, from the point of view of microeconomics, the question: how do we measure comparative efficiency in the management of intensive care units? Analyses this question with data from an international empirical study using micro-economic measures of productive efficiency in public services (data envelopment analysis). Results, based on data from 25 intensive care units in the USA, show a 28.8 per cent level of technical inefficiency.
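For readers unfamiliar with data envelopment analysis, the sketch below computes an input-oriented CCR efficiency score by linear programming. The four toy ICUs and their input/output figures are invented for illustration and are not the study's data.

```python
"""Input-oriented CCR data envelopment analysis (illustrative sketch).

The four ICUs and their input/output figures below are made-up numbers,
not data from the cited study.
"""
import numpy as np
from scipy.optimize import linprog

# rows = decision-making units (ICUs), columns = inputs / outputs
X = np.array([[5.0, 14.0], [8.0, 15.0], [7.0, 12.0], [4.0, 10.0]])  # e.g. beds, nurses
Y = np.array([[9.0], [5.0], [4.0], [6.0]])                          # e.g. quality-adjusted survivors

def ccr_efficiency(o: int) -> float:
    """Efficiency of unit o: min theta s.t. sum_j lam_j x_j <= theta*x_o, sum_j lam_j y_j >= y_o."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]            # decision vector z = [theta, lambda_1..lambda_n]
    A_in = np.c_[-X[o], X.T]               # sum_j lam_j x_ij - theta x_io <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]  # -sum_j lam_j y_rj <= -y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for o in range(X.shape[0]):
    print(f"ICU {o}: technical efficiency = {ccr_efficiency(o):.3f}")
```

A score of 1.0 marks a unit on the efficient frontier; scores below 1.0 indicate the proportional input reduction that an efficient peer mix would allow. Output-oriented or variable-returns-to-scale variants follow the same pattern with modified constraints.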
Atypical resource allocation may contribute to many aspects of autism
Goldknopf, Emily J.
2013-01-01
Based on a review of the literature and on reports by people with autism, this paper suggests that atypical resource allocation is a factor that contributes to many aspects of autism spectrum conditions, including difficulties with language and social cognition, atypical sensory and attentional experiences, executive and motor challenges, and perceptual and conceptual strengths and weaknesses. Drawing upon resource theoretical approaches that suggest that perception, cognition, and action draw upon multiple pools of resources, the approach hypothesizes that compared with resources in typical cognition, resources in autism are narrowed or reduced, especially in people with strong sensory symptoms. In narrowed attention, resources are restricted to smaller areas and to fewer modalities, stages of processing, and cognitive processes than in typical cognition; narrowed resources may be more intense than in typical cognition. In reduced attentional capacity, overall resources are reduced; resources may be restricted to fewer modalities, stages of processing, and cognitive processes than in typical cognition, or the amount of resources allocated to each area or process may be reduced. Possible neural bases of the hypothesized atypical resource allocation, relations to other approaches, limitations, and tests of the hypotheses are discussed. PMID:24421760
NASA Astrophysics Data System (ADS)
Xu, Boyi; Xu, Li Da; Fei, Xiang; Jiang, Lihong; Cai, Hongming; Wang, Shuai
2017-08-01
Facing rapidly changing business environments, implementation of flexible business processes is crucial but difficult, especially in data-intensive application areas. This study aims to provide scalable and easily accessible information resources to leverage business process management. In this article, with a resource-oriented approach, enterprise data resources are represented as data-centric Web services, grouped on demand according to business requirements and configured dynamically to adapt to changing business processes. First, a configurable architecture, CIRPA, involving an information resource pool is proposed to act as a scalable and dynamic platform for virtualising enterprise information resources as data-centric Web services. By exposing data-centric resources as REST services at larger granularities, tenant-isolated information resources can be accessed during business process execution. Second, a dynamic information resource pool is designed to fulfil configurable, on-demand data access in business process execution. CIRPA also isolates transaction data from business processes while supporting the composition of diverse business processes. Finally, a case study applying the method to a logistics application shows that CIRPA provides enhanced performance in both static service encapsulation and dynamic service execution in a cloud computing environment.
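A minimal sketch of a data-centric REST service is shown below. It is a generic Flask example under assumed resource names and an in-memory data pool, not the CIRPA implementation, whose interfaces are not described in the abstract.

```python
"""Minimal sketch of a tenant-isolated, data-centric REST service (not the CIRPA code).

A hypothetical 'orders' data resource is exposed read-only per tenant;
a plain dictionary stands in for the enterprise data pool.
"""
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Hypothetical tenant-isolated data pool: {tenant: {order_id: record}}
DATA_POOL = {
    "tenant_a": {"1001": {"item": "steel coil", "qty": 20}},
    "tenant_b": {"2001": {"item": "container", "qty": 3}},
}

@app.route("/<tenant>/orders/<order_id>", methods=["GET"])
def get_order(tenant: str, order_id: str):
    """Return one order record for one tenant, or 404 if absent."""
    record = DATA_POOL.get(tenant, {}).get(order_id)
    if record is None:
        abort(404)
    return jsonify(record)

if __name__ == "__main__":
    app.run(port=8080)
```

A business process step would then bind to a URL such as /tenant_a/orders/1001 at execution time, so the process definition stays decoupled from where and how the underlying data are stored.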
Diversity in computing technologies and strategies for dynamic resource allocation
Garzoglio, G.; Gutsche, O.
2015-12-23
High Energy Physics (HEP) is a very data-intensive and trivially parallelizable science discipline. HEP is probing nature at increasingly finer detail, requiring ever-increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP has provisioned resources so far using Grid technologies, how HEP is starting to include new resource providers like commercial Clouds and HPC installations, and how HEP is transparently provisioning resources at these diverse providers.
The influence of working memory capacity on experimental heat pain.
Nakae, Aya; Endo, Kaori; Adachi, Tomonori; Ikeda, Takashi; Hagihira, Satoshi; Mashimo, Takashi; Osaka, Mariko
2013-10-01
Pain processing and attention have a bidirectional interaction that depends upon one's relative ability to use limited-capacity resources. However, correlations between the size of limited-capacity resources and pain have not been evaluated. Working memory capacity, which is a cognitive resource, can be measured using the reading span task (RST). In this study, we hypothesized that an individual's potential working memory capacity and subjective pain intensity are related. To test this hypothesis, we evaluated 31 healthy participants' potential working memory capacity using the RST, and then applied continuous experimental heat stimulation during the listening span test (LST), which is a modified version of the RST. Subjective pain intensities were significantly lower during the challenging parts of the task. The pain intensity under conditions where memorizing tasks were performed was compared with that under the control condition, and it showed a correlation with potential working memory capacity. These results indicate that working memory capacity reflects the ability to process information, including precise evaluations of changes in pain perception. In this work, we present data suggesting that changes in subjective pain intensity vary depending upon individual potential working memory capacity. Individual working memory capacity may be a phenotype that reflects sensitivity to changes in pain perception. Copyright © 2013 American Pain Society. Published by Elsevier Inc. All rights reserved.
Thermodynamic analysis of resources used in manufacturing processes.
Gutowski, Timothy G; Branham, Matthew S; Dahmus, Jeffrey B; Jones, Alissa J; Thiriez, Alexandre
2009-03-01
In this study we use a thermodynamic framework to characterize the material and energy resources used in manufacturing processes. The analysis and data span a wide range of processes from "conventional" processes such as machining, casting, and injection molding, to the so-called "advanced machining" processes such as electrical discharge machining and abrasive waterjet machining, and to the vapor-phase processes used in semiconductor and nanomaterials fabrication. In all, 20 processes are analyzed. The results show that the intensity of materials and energy used per unit of mass of material processed (measured either as specific energy or exergy) has increased by at least 6 orders of magnitude over the past several decades. This increase in material/energy intensity has been primarily a consequence of the introduction of new manufacturing processes, rather than changes in traditional technologies. This phenomenon has been driven by the desire for precise small-scale devices and product features and enabled by stable and declining material and energy prices over this period. We illustrate the relevance of thermodynamics (including exergy analysis) for all processes in spite of the fact that the long-standing focus in manufacturing has been on product quality--not necessarily energy/material conversion efficiency. We promote the use of thermodynamic tools for the analysis of manufacturing processes within the context of the rapidly increasing relevance of sustainable human enterprises. We confirm that exergy analysis can be used to identify where resources are lost in these processes, which is the first step in proposing and/or redesigning new, more efficient processes.
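The intensity measure itself is simple arithmetic: specific energy is power drawn divided by material throughput. The sketch below uses invented power and throughput placeholders (not the study's data for the 20 processes) to show why low-throughput precision processes end up orders of magnitude more energy-intensive per kilogram.

```python
"""Specific energy intensity sketch: E_specific = P / m_dot (J/kg).

Power draws and throughputs below are invented placeholders meant only to
show why low-throughput precision processes have high specific energy.
"""
processes = {
    # name: (power draw in watts, material throughput in kg per hour) -- hypothetical
    "injection_molding": (50_000.0, 100.0),
    "machining":         (10_000.0, 10.0),
    "edm":               (2_000.0, 0.01),
}

for name, (power_w, kg_per_hour) in processes.items():
    kg_per_second = kg_per_hour / 3600.0
    specific_energy = power_w / kg_per_second  # J per kg of material processed
    print(f"{name:18s} {specific_energy:.3e} J/kg")
```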
Cultural impacts to tribes from climate change influences on forests
Garrit Voggesser; Kathy Lynn; John Daigle; Frank K. Lake; Darren Ranco
2013-01-01
Climate change related impacts, such as increased frequency and intensity of wildfires, higher temperatures, extreme changes to ecosystem processes, forest conversion, and habitat degradation, are threatening tribal access to valued resources. Climate change is affecting, and will continue to affect, the quantity and quality of resources tribes depend upon to perpetuate their cultures and...
NASA Astrophysics Data System (ADS)
Higashino, Hideaki; Motojima, Hideko; Ozaki, Masuo; Mursan, Anwar
Securing safe water is an urgent issue to be solved in rural societies in developing countries. Conventional water environment improvement through public works, which puts priority on the development of water resources such as construction of dams, well digging, etc., has shown successful results on the one hand. On the other hand, however, such works generally require large investment costs and long administrative processes. In addition, inequitable distribution of benefits to residents is a potential problem. Meanwhile, intensive use of existing water resources, with cheap cost and simple technologies, can be an effective alternative measure against water shortage where development of water resources is restricted. From this viewpoint, the study is being conducted to propose water environment improvement through intensive use of existing water resources. According to the results of the on-site survey conducted in the West Nusa Tenggara Province, Indonesia, it was found that the water environment in the province is deteriorating due to the development of beef cattle raising and deforestation. In this paper, the results of the on-site survey are summarized and a water environment improvement plan to mitigate the present status is presented.
USDA-ARS?s Scientific Manuscript database
The measurement of sugar concentration and dry matter in processing potatoes is a time- and resource-intensive activity that cannot be performed in the field and does not easily measure within-tuber variation. A proposed method to improve the phenotyping of processing potatoes is to employ hyperspectral...
A lightweight distributed framework for computational offloading in mobile cloud computing.
Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul
2014-01-01
The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in processing potentials, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC, wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.
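A common textbook-style way to reason about when offloading pays off is to compare local execution time with transfer time plus remote execution time. The sketch below illustrates that rule with hypothetical numbers; it is not the decision logic of the proposed framework.

```python
"""Generic offloading decision sketch (not the paper's framework).

Offload a component when transferring its state and running it in the
cloud is expected to be faster than executing it locally on the SMD.
All numbers are hypothetical.
"""
def should_offload(cycles: float, local_speed: float, cloud_speed: float,
                   data_bytes: float, bandwidth: float, rtt: float) -> bool:
    """Return True if estimated remote turnaround beats local execution time."""
    t_local = cycles / local_speed               # seconds on the device
    t_remote = cycles / cloud_speed              # seconds in the cloud
    t_transfer = data_bytes / bandwidth + rtt    # upload component state + latency
    return t_transfer + t_remote < t_local

# Example with made-up figures: 2e9 CPU cycles, 1 GHz phone vs 10 GHz-equivalent VM,
# 5 MB of state over a 10 MB/s link with 100 ms round-trip time.
print(should_offload(2e9, 1e9, 1e10, 5e6, 1e7, 0.1))  # True: offloading is faster
```

An energy-oriented variant replaces the time terms with radio and CPU energy costs; the same inequality structure applies.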
Lee K. Cerveny; Emily Jane Davis; Rebecca McLain; Clare M. Ryan; Debra R. Whitall; Eric M. White
2018-01-01
The Northwest Forest Plan (NWFP, or Plan) signified a movement away from intensive focus on timber management that was common through the 1980s and toward an ecosystem management approach, which aims to conserve ecological conditions and restore natural resources while meeting the social, cultural, and economic needs of present and future generations (Brussard et al....
Cherp, Aleg; Kopteva, Irina; Mnatsakanian, Ruben
2003-06-01
Economic liberalization in former socialist countries may have various implications for their environmental sustainability. Positive effects of this process are potentially associated with improved efficiency, investments into cleaner technologies, responsiveness to environmentally aware markets, and ending subsidies to heavy industries. On the other hand, market liberalization may result in weaker environmental controls, economic instabilities distracting attention from environmental issues, and increasing orientation towards profit-making leading to more intensive exploitation of natural resources. In addition, trade liberalization may result in shifts towards more pollution- and resource-intensive industries. This article seeks to quantify the effects of economic restructuring in Russia on air pollution from productive economic sectors in the 1990s. Air pollution in Russia declined significantly in 1991-1999; however, this decline was largely due to economic contraction, as the overall pollution intensity of the economy decreased only slightly. The factors that affected the pollution intensity are: (1) a decrease in the combined share of industrial and transport activities in the economy and (2) changing pollution intensities of the industrial and transport sectors. The pollution intensity of Russian industry remained relatively stable during the 1990s. This was the result of two opposite and mutually canceling trends: (a) increasing shares of pollution-intensive branches such as metal smelting and oil production vs. less pollution-intensive manufacturing and (b) a decline in pollution intensities within the industrial branches. The article proposes a methodology by which the contribution of both factors to the overall pollution intensity of the industrial sector can be quantified. The pollution intensity of the Russian transport sector appears to have declined in the first half of the 1990s and increased in the second half. The most recent trend can be explained by a rising proportion of private motorcars used for transportation of people and goods instead of traditional rail and other public transport. The findings of the paper demonstrate that shifts towards more pollution-, resource- and energy-intensive industries as a result of economic liberalization emerge as a significant negative factor in the process of economic transition, threatening the sustainability of emerging market economies. A research agenda to further investigate these impacts is proposed.
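A generic two-factor decomposition of the kind alluded to above can be sketched as follows. The branch shares and intensities are invented numbers, not the article's Russian statistics, and the decomposition form is a standard one rather than the article's own methodology.

```python
"""Two-factor decomposition of aggregate pollution intensity (illustrative).

Aggregate intensity I = sum_i share_i * intensity_i. The change between two
years is split into a structural effect (shares change, intensities held at
the base year) and an intensity effect (intensities change, shares held at
the base year). All numbers are invented.
"""
shares_1991    = {"metals": 0.30, "oil": 0.20, "manufacturing": 0.50}
shares_1999    = {"metals": 0.40, "oil": 0.30, "manufacturing": 0.30}
intensity_1991 = {"metals": 8.0, "oil": 5.0, "manufacturing": 2.0}   # t pollutant per unit output
intensity_1999 = {"metals": 6.5, "oil": 4.5, "manufacturing": 1.8}

def aggregate(shares, intensities):
    """Aggregate pollution intensity of the whole sector."""
    return sum(shares[b] * intensities[b] for b in shares)

total_change      = aggregate(shares_1999, intensity_1999) - aggregate(shares_1991, intensity_1991)
structural_effect = aggregate(shares_1999, intensity_1991) - aggregate(shares_1991, intensity_1991)
intensity_effect  = aggregate(shares_1991, intensity_1999) - aggregate(shares_1991, intensity_1991)

print(f"total change:         {total_change:+.3f}")
print(f"structural effect:    {structural_effect:+.3f}")
print(f"intensity effect:     {intensity_effect:+.3f}")
print(f"interaction residual: {total_change - structural_effect - intensity_effect:+.3f}")
```

With these toy numbers the structural effect is positive (a shift toward dirtier branches) while the within-branch intensity effect is negative, and the two largely cancel, which mirrors the pattern the article describes for Russian industry.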
Privatization of Higher Education in Nigeria: Critical Issues
ERIC Educational Resources Information Center
Okunola, Philips Olayide; Oladipo, Simeon Adebayo
2012-01-01
The broad intent of any educational reform is premised on the assumption that it is capable of improving educational processes and practices; hence the need for evaluation of the system's processes in order to determine the efficiency and effectiveness of resource allocation. Education is capital intensive in terms of human, financial and material…
Fox, Jay
2013-01-01
In 2008 the Virginia Research Resource Consortium was launched with the aim of bringing together the research resources at research-intensive institutions as well as educators in the less research-intensive institutions. The first meeting in 2008 served to provide a survey of resource capabilities in the Commonwealth as well as to open discussions about resource sharing between research-intensive institutions and about educational use of resources at less research-intensive institutions. More recently we have reached out to biotechnology companies in the Commonwealth to provide support for their enterprises as well as reciprocal access to expertise within those companies. This presentation will highlight the steps taken to establish the VRRC and some of its outcomes to date.
Hyperfocusing in Schizophrenia: Evidence from Interactions Between Working Memory and Eye Movements
Luck, Steven J.; McClenon, Clara; Beck, Valerie M.; Hollingworth, Andrew; Leonard, Carly J.; Hahn, Britta; Robinson, Benjamin M.; Gold, James M.
2014-01-01
Recent research suggests that processing resources are focused more narrowly but more intensely in people with schizophrenia (PSZ) than in healthy control subjects (HCS), possibly reflecting local cortical circuit abnormalities. This hyperfocusing hypothesis leads to the counterintuitive prediction that, although PSZ cannot store as much information in working memory as HCS, the working memory representations that are present in PSZ may be more intense than those in HCS. To test this hypothesis, we used a task in which participants make a saccadic eye movement to a peripheral target and avoid a parafoveal nontarget while they are holding a color in working memory. Previous research with this task has shown that the parafoveal nontarget is more distracting when it matches the color being held in working memory. This effect should be enhanced in PSZ if their working memory representations are more intense. Consistent with this prediction, we found that the effect of a match between the distractor color and the memory color was larger in PSZ than in HCS. We also observed evidence that PSZ hyperfocused spatially on the region surrounding the fixation point. These results provide further evidence that some aspects of cognitive dysfunction in schizophrenia may be a result of a narrower and more intense focusing of processing resources. PMID:25089655
Anthropization of groundwater resources in the Mediterranean region: processes and challenges
NASA Astrophysics Data System (ADS)
Leduc, Christian; Pulido-Bosch, Antonio; Remini, Boualem
2017-09-01
A comprehensive overview is provided of processes and challenges related to Mediterranean groundwater resources and associated changes in recent decades. While most studies are focused thematically and/or geographically, this paper addresses different stages of groundwater exploitation in the region and their consequences. Examples emphasize the complex interactions between the physical and social dimensions of groundwater use and evolution. In natural conditions, Mediterranean groundwater resources represent a wide range of hydrogeological contexts, recharge conditions and rates of exploitation. They have been actively exploited for millennia but their pseudo-natural regimes have been considerably modified in the last 50 years, especially to satisfy agricultural demand (80% of total water consumption in North Africa), as well as for tourism and coastal cities. Climate variability affects groundwater dynamics but the various forms of anthropization are more important drivers of hydrological change, including changes in land use and vegetation, hydraulic works, and intense pumping. These changes affect both the quantity and quality of groundwater at different scales, and modify the nature of hydrogeological processes, their location, timing, and intensity. The frequent cases of drastic overexploitation illustrate the fragility of Mediterranean groundwater resources and the limits of present forms of management. There is no easy way to maintain or recover sustainability, which is often threatened by short-term interests. To achieve this goal, a significant improvement in hydrogeological knowledge and closer collaboration between the various disciplines of water sciences are indispensable.
Evaluation of HeadLight: An E-Construction Inspection Technology : research project capsule
DOT National Transportation Integrated Search
2017-09-01
Project delivery and inspection are challenging, resource-intensive jobs. The quality/accuracy of collected field data is crucial. The Louisiana Department of Transportation and Development (DOTD) still relies on a primarily paper-based process for f...
Magargal, Kate E; Parker, Ashley K; Vernon, Kenneth Blake; Rath, Will; Codding, Brian F
2017-07-08
The expansion of Numic-speaking populations into the Great Basin required individuals to adapt to a relatively unproductive landscape. Researchers have proposed numerous social and subsistence strategies to explain how and why these settlers were able to replace any established populations, including private property and intensive plant processing. Here we evaluate these hypotheses and propose a new strategy involving the use of landscape fire to increase resource encounter rates. Implementing a novel, spatially explicit, multi-scalar prey choice model, we examine how individual decisions approximating each alternative strategy (private property, anthropogenic fire, and intensive plant processing) would aggregate at the patch and band level to confer an overall benefit to this colonizing population. Analysis relies on experimental data reporting resource profitability and abundance, ecological data on the historic distribution of vegetation patches, and ethnohistoric data on the distribution of Numic bands. Model results show that while resource privatization and landscape fires produce a substantial advantage, intensified plant processing garners the greatest benefit. The relative benefits of alternative strategies vary significantly across ecological patches, resulting in variation across ethnographic band ranges. Combined, a Numic strategy including all three alternatives would substantially increase subsistence yields. The application of a strategy set that includes landscape fire, privatization and intensified processing of seeds and nuts explains why the Numa were able to outcompete local populations. This approach provides a framework to help explain how individual decisions can result in such population replacement events throughout human history. © 2017 Wiley Periodicals, Inc.
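The aspatial core of the prey (diet-breadth) choice model that the authors extend can be sketched as follows. The resource names, energy values, handling times and encounter rates are invented for illustration and are not the experimental data used in the study.

```python
"""Classic prey-choice (diet-breadth) model sketch with invented numbers.

Resources are ranked by post-encounter profitability e/h; a resource is
added to the diet as long as its profitability exceeds the overall return
rate of the diet composed of all higher-ranked items.
"""
# (name, energy kcal, handling time h, encounters per foraging hour) -- hypothetical
RESOURCES = [
    ("large game", 15000.0, 4.00, 0.02),
    ("small game",  1200.0, 0.50, 0.30),
    ("pine nuts",    800.0, 1.00, 1.50),
    ("grass seeds",  300.0, 2.00, 2.00),
]

def optimal_diet(resources):
    """Return the optimal diet set and its overall return rate (kcal per foraging hour)."""
    ranked = sorted(resources, key=lambda r: r[1] / r[2], reverse=True)
    diet, rate = [], 0.0
    for name, e, h, lam in ranked:
        if e / h <= rate:          # profitability below current return rate: stop adding
            break
        diet.append(name)
        # return rate of the diet so far: sum(lam*e) / (1 + sum(lam*h))
        num = sum(l * en for n, en, hh, l in ranked if n in diet)
        den = 1.0 + sum(l * hh for n, en, hh, l in ranked if n in diet)
        rate = num / den
    return diet, rate

diet, rate = optimal_diet(RESOURCES)
print("diet:", diet, f"-- return rate: {rate:.0f} kcal per foraging hour")
```

Strategies like intensified plant processing (lower handling time for seeds and nuts) or landscape fire (higher encounter rates) change the inputs to this ranking, which is how the alternative hypotheses enter the authors' spatially explicit version.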
Photonic Quantum Networks formed from NV− centers
Nemoto, Kae; Trupke, Michael; Devitt, Simon J.; Scharfenberger, Burkhard; Buczak, Kathrin; Schmiedmayer, Jörg; Munro, William J.
2016-01-01
In this article we present a simple repeater scheme based on the negatively-charged nitrogen vacancy centre in diamond. Each repeater node is built from modules comprising an optical cavity containing a single NV−, with one nuclear spin from 15N as quantum memory. The module uses only deterministic processes and interactions to achieve high fidelity operations (>99%), and modules are connected by optical fiber. In the repeater node architecture, the photon-mediated processes between modules can in principle be deterministic; however, current limitations on optical components render these processes probabilistic but heralded. Our resource-modest repeater architecture contains two modules at each node, and the repeater nodes are then connected by entangled photon pairs. We discuss the performance of such a quantum repeater network with modest resources and then incorporate more resource-intense strategies step by step. Our architecture should allow large-scale quantum information networks with existing or near future technology. PMID:27215433
An assessment of waste processing/resource recovery technologies for lunar/Mars life applications
NASA Technical Reports Server (NTRS)
Verostko, Charles E.; Packham, Nigel J. C.; Henninger, Donald H.
1992-01-01
NASA's future manned missions to explore the solar system are by nature of long duration, mandating extensive regeneration of life support consumables from wastes generated in space-based habitats. Long-duration exploration missions would otherwise be prohibitive due to the number and frequency of energy-intensive resupply missions from Earth. Resource recovery is therefore a critical component of the controlled ecological life support system (CELSS). In order to assess resource recovery technologies for CELSS applications, the Crew and Thermal Systems Division at NASA-Johnson Space Center convened a three-day workshop to assess potential resource recovery technologies for application in a space-based CELSS. This paper describes the methodology of assessing and ranking of these technologies. Recommendations and issues are identified. Evaluations focused on the processes for handling and treatment of inedible plant biomass, human waste, and human generated trash. Technologies were assessed on the basis of safety, reliability, technology readiness, and performance characteristics.
Biological and Dose Thresholds for an Early Genomic Biomarker of Liver Carcinogenesis in Mice.
Traditional data sources for cancer risk assessment are resource-intensive, retrospective, and not feasible for the vast majority of environmental chemicals. The use of quantitative short-term genomic biomarkers may streamline this process by providing protective limits for known...
DOT National Transportation Integrated Search
1997-01-01
The rational allocation of pavement maintenance resources requires the periodic assessment of the condition of all pavements. Traditional manual pavement distress surveys, which are based on visual inspection, are labor intensive, slow, and expensive...
Recent processing methods for preparing starch-based bioproducts
USDA-ARS?s Scientific Manuscript database
There is currently an intense interest in starch-based materials because of the low cost of starch, the replacement of dwindling petroleum-based resources with annually-renewable feedstocks, the biodegradability of starch-based products, and the creation of new markets for farm commodities. Non-trad...
Nickel, Moritz M; May, Elisabeth S; Tiemann, Laura; Postorino, Martina; Ta Dinh, Son; Ploner, Markus
2017-11-01
Pain serves the protection of the body by translating noxious stimulus information into a subjective percept and protective responses. Such protective responses rely on autonomic responses that allocate energy resources to protective functions. However, the precise relationship between objective stimulus intensity, subjective pain intensity, autonomic responses, and brain activity is not fully clear yet. Here, we addressed this question by continuously recording pain ratings, skin conductance, heart rate, and electroencephalography during tonic noxious heat stimulation of the hand in 39 healthy human subjects. The results confirmed that pain intensity dissociates from stimulus intensity during 10 minutes of noxious stimulation. Furthermore, skin conductance measures were significantly related to stimulus intensity but not to pain intensity. Correspondingly, skin conductance measures were significantly related to alpha and beta oscillations in contralateral sensorimotor cortex, which have been shown to encode stimulus intensity rather than pain intensity. No significant relationships were found between heart rate and stimulus intensity or pain intensity. The findings were consistent for stimulation of the left and the right hands. These results suggest that sympathetic autonomic responses to noxious stimuli in part directly result from nociceptive rather than from perceptual processes. Beyond, these observations support concepts of pain and emotions in which sensory, motor, and autonomic components are partially independent processes that together shape emotional and painful experiences.
What can we learn from resource pulses?
Yang, Louie H; Bastow, Justin L; Spence, Kenneth O; Wright, Amber N
2008-03-01
An increasing number of studies in a wide range of natural systems have investigated how pulses of resource availability influence ecological processes at individual, population, and community levels. Taken together, these studies suggest that some common processes may underlie pulsed resource dynamics in a wide diversity of systems. Developing a common framework of terms and concepts for the study of resource pulses may facilitate greater synthesis among these apparently disparate systems. Here, we propose a general definition of the resource pulse concept, outline some common patterns in the causes and consequences of resource pulses, and suggest a few key questions for future investigations. We define resource pulses as episodes of increased resource availability in space and time that combine low frequency (rarity), large magnitude (intensity), and short duration (brevity), and emphasize the importance of considering resource pulses at spatial and temporal scales relevant to specific resource-consumer interactions. Although resource pulses are uncommon events for consumers in specific systems, our review of the existing literature suggests that pulsed resource dynamics are actually widespread phenomena in nature. Resource pulses often result from climatic and environmental factors, processes of spatiotemporal accumulation and release, outbreak population dynamics, or a combination of these factors. These events can affect life history traits and behavior at the level of individual consumers, numerical responses at the population level, and indirect effects at the community level. Consumers show strategies for utilizing ephemeral resources opportunistically, reducing resource variability by averaging over larger spatial scales, and tolerating extended interpulse periods of reduced resource availability. Resource pulses can also create persistent effects in communities through several mechanisms. We suggest that the study of resource pulses provides opportunities to understand the dynamics of many specific systems, and may also contribute to broader ecological questions at individual, population, and community levels.
Evaluating the Psychometric Characteristics of Generated Multiple-Choice Test Items
ERIC Educational Resources Information Center
Gierl, Mark J.; Lai, Hollis; Pugh, Debra; Touchie, Claire; Boulais, André-Philippe; De Champlain, André
2016-01-01
Item development is a time- and resource-intensive process. Automatic item generation integrates cognitive modeling with computer technology to systematically generate test items. To date, however, items generated using cognitive modeling procedures have received limited use in operational testing situations. As a result, the psychometric…
Achieving Self-Reliance: Backyard Energy Lessons.
ERIC Educational Resources Information Center
Cook, Stephen
Appropriate technology (the process most appropriate for local cultural, economic, and social conditions) is geared toward projects which: are small in scale, decentralized, and energy efficient; use local materials, labor, and ingenuity; are not capital-intensive; and maximize the use of renewable energy resources. Descriptions of such projects…
Agencies within communities, communities within ecosystems
Jane Kapler Smith; Kerry McMenus
2000-01-01
Can scientific information and intensive, extensive public involvement through facilitated meetings be expected to lead to agreement on natural resource issues? Communications and research in the Bitterroot Ecosystem Management Research Project indicate that, where people's values differ greatly, consensus is not a realistic goal for short term planning processes....
Extraction and recovery of phosphorus from pig manure using the quick wash process
USDA-ARS?s Scientific Manuscript database
Land disposal of manure is a challenging environmental problem in areas with intense confined pig production. Due to nutrient imbalance, manure applied to soil at optimal nitrogen rates for crop growth can promote soil phosphorus (P) surplus and potential pollution of water resources. Although manur...
Zhu, Hua-Xu; Duan, Jin-Ao; Guo, Li-Wei; Li, Bo; Lu, Jin; Tang, Yu-Ping; Pan, Lin-Mei
2014-05-01
Resource utilization of traditional Chinese medicine residue is an inevitable choice for forming new industries characterized by modernization, environmental protection and intensification in the Chinese medicine industry. Based on an analysis of the sources and main chemical composition of herb residue, and given the advantages of membrane science and technology in the pharmaceutical industry, especially the technical reserves membrane separation has built up for improving traditional extraction and separation processes, it is proposed that membrane science and technology is one of the most important choices in the technological design of traditional Chinese medicine resource industrialization. Traditional Chinese medicine residue is a very complex material system in composition and character, and a scientific and effective "separation" process is the key technology for its re-use. Integrated processes can improve the productivity of the target product, enhance the purity of the product in the separation process, and accomplish many tasks that conventional separation can hardly achieve. As integrated separation technology has the advantages of a simplified process and reduced consumption, in line with the trends of the modern pharmaceutical industry, membrane separation technology can provide a broad platform for process integration, and membrane separation together with its integrated technologies has broad application prospects for realizing the resource utilization and industrialization of traditional Chinese medicine residue. In this paper, we discuss the principles, methods and application practice of recovering effective component resources from herb residue using membrane separation and integrated technology, describe the application of membrane technology to the extraction, separation, concentration and purification of traditional Chinese medicine residue, and systematically discuss the suitability and feasibility of membrane technology in the process of traditional Chinese medicine resource industrialization.
Trade in and Valuation of Virtual Water Impacts in a City: A Case Study Of Flagstaff, Arizona
NASA Astrophysics Data System (ADS)
Rushforth, R.; Ruddell, B. L.
2013-12-01
An increasingly intense component of the global coupled natural and human system (CNH) is the economic trade of various types of resources and the outsourcing of resource impacts between geographically distant economic systems. The human economy's trade arrangements allow specific localities, especially cities, to exceed spatially local resource stock sustainability and footprint constraints, as evidenced in the urban metabolism literature. Each movement or trade of a resource along a network is associated with an embedded or 'virtual' exchange of indirect impacts on the inputs to the production process. The networked trade of embedded resources, therefore, is an essential human adaptation to resource limitations. Using the Embedded Resource Impact Accounting (ERA) framework, we examine the network of embedded water flows created through the trade of goods and services and economic development in Flagstaff, Arizona, and associate these flows with the creation of value in sectors of the economy.
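The arithmetic behind an embedded ('virtual') water flow is straightforward: traded quantity times the water intensity of production. The toy sketch below uses invented commodities and coefficients, not the Flagstaff data or the ERA framework's actual accounts.

```python
"""Toy embedded (virtual) water accounting sketch; all figures are invented.

virtual water flow = traded quantity * water intensity of production
Positive net imports mean the city outsources part of its water footprint.
"""
# commodity: (imports, exports) in tonnes per year -- hypothetical
trade = {"beef": (500.0, 50.0), "hay": (2000.0, 8000.0), "electronics": (100.0, 0.0)}
# water intensity of production in m^3 per tonne -- hypothetical
water_intensity = {"beef": 15000.0, "hay": 500.0, "electronics": 3000.0}

virtual_imports = sum(m * water_intensity[c] for c, (m, x) in trade.items())
virtual_exports = sum(x * water_intensity[c] for c, (m, x) in trade.items())
net_virtual_water = virtual_imports - virtual_exports

print(f"virtual water imports: {virtual_imports:,.0f} m^3/yr")
print(f"virtual water exports: {virtual_exports:,.0f} m^3/yr")
print(f"net inflow (outsourced footprint): {net_virtual_water:,.0f} m^3/yr")
```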
Research on the Intensive Material Management System of Biomass Power Plant
NASA Astrophysics Data System (ADS)
Zhang, Ruosi; Hao, Tianyi; Li, Yunxiao; Zhang, Fangqing; Ding, Sheng
2017-05-01
In view of the widespread problems that material management in biomass power plants is loose and lacks standardization and real-time interaction, a system based on intensive management methods is proposed in this paper to control the whole material process of the power plant. By analysing the whole process of power plant material management and applying the Internet of Things, the method can simplify the management process. By maximizing the use of resources and applying data mining, material utilization, circulation rate and quality control can be improved. The system has been applied in the Gaotang power plant, where it greatly raised the level of materials management and economic effectiveness. It is of significance for the safe, cost-effective and highly efficient operation of the plant.
Alanya, Sevda; Dewulf, Jo; Duran, Metin
2015-08-18
This study focused on the evaluation of biosolids management systems (BMS) from a natural resource consumption point of view. Additionally, the environmental impact of the facilities was benchmarked using Life Cycle Assessment (LCA) to provide a comprehensive assessment. This is the first study to apply a Cumulative Exergy Extraction from the Natural Environment (CEENE) method for an in-depth resource use assessment of BMS where two full-scale BMS and seven system variations were analyzed. CEENE allows better system evaluation and understanding of how much benefit is achievable from the products generated by BMS, which have valorization potential. LCA results showed that environmental burden is mostly from the intense electricity consumption. The CEENE analysis further revealed that the environmental burden is due to the high consumption of fossil and nuclear-based natural resources. Using Cumulative Degree of Perfection, higher resource-use efficiency, 53%, was observed in the PTA-2 where alkaline stabilization rather than anaerobic digestion is employed. However, an anaerobic digestion process is favorable over alkaline stabilization, with 35% lower overall natural resource use. The most significant reduction of the resource footprint occurred when the output biogas was valorized in a combined heat and power system.
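For clarity, the Cumulative Degree of Perfection used above is the ratio of useful exergy delivered to the cumulative exergy extracted from the natural environment. The sketch below illustrates the calculation with invented exergy values, not the figures reported in the study.

```python
"""Cumulative Degree of Perfection (CDP) sketch with invented exergy values.

CDP = exergy of useful products / cumulative exergy extracted from the
natural environment (CEENE) to run the system.
"""
def cdp(useful_output_exergy_mj: float, ceene_mj: float) -> float:
    """Resource-use efficiency of one treatment train."""
    return useful_output_exergy_mj / ceene_mj

# Hypothetical biosolids-management trains (values in MJ exergy per tonne of biosolids)
systems = {
    "anaerobic digestion + CHP": {"useful": 900.0, "ceene": 2400.0},
    "alkaline stabilization":    {"useful": 1000.0, "ceene": 2000.0},
}

for name, v in systems.items():
    print(f"{name:28s} CDP = {cdp(v['useful'], v['ceene']):.2f}")
```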
Emotional intensity influences pre-implementation and implementation of distraction and reappraisal
Shafir, Roni; Schwartz, Naama; Blechert, Jens
2015-01-01
Although emotional intensity powerfully challenges regulatory strategies, its influence remains largely unexplored in affective-neuroscience. Accordingly, the present study addressed the moderating role of emotional intensity in two regulatory stages—implementation (during regulation) and pre-implementation (prior to regulation), of two major cognitive regulatory strategies—distraction and reappraisal. According to our framework, because distraction implementation involves early attentional disengagement from emotional information before it gathers force, in high-intensity it should be more effective in the short-term, relative to reappraisal, which modulates emotional processing only at a late semantic meaning phase. Supporting findings showed that in high (but not low) intensity, distraction implementation resulted in stronger modulation of negative experience, reduced neural emotional processing (centro-parietal late positive potential, LPP), with suggestive evidence for less cognitive effort (frontal-LPP), relative to reappraisal. Related pre-implementation findings confirmed that anticipating regulation of high-intensity stimuli resulted in distraction (over reappraisal) preference. In contrast, anticipating regulation of low-intensity stimuli resulted in reappraisal (over distraction) preference, which is most beneficial for long-term adaptation. Furthermore, anticipating cognitively demanding regulation, either in cases of regulating counter to these preferences or via the more effortful strategy of reappraisal, enhanced neural attentional resource allocation (Stimulus Preceding Negativity). Broad implications are discussed. PMID:25700568
From the forest to the sea: a story of fallen trees.
Chris Maser; Robert F. Tarrant; James M. Trappe; Jerry F. Franklin
1988-01-01
Large, fallen trees in various stages of decay contribute much-needed diversity of ecological processes to terrestrial, aquatic, estuarine, coastal beach, and open ocean habitats in the Pacific Northwest. Intensive utilization and management can deprive these habitats of large, fallen trees. This publication presents sound information for managers making resource...
Web-Based Time Entry Systems: Providing Greater Automation and Compliance
ERIC Educational Resources Information Center
Williams, Tracy
2005-01-01
Time and resources are becoming increasingly scarce in most higher education institutions today. As a result, colleges and universities are looking to streamline and simplify many costly, labor-intensive administrative processes. In this article, Tracy Williams examines how Web-based time-entry systems can help institutions save valuable time and…
[Costs and consumption of material resources in pediatric intensive and semi-intensive care units].
Zuliani, Larissa Lenotti; Jericó, Marli de Carvalho; de Castro, Liliana Cristina; Soler, Zaida Aurora Sperli Geraldes
2012-01-01
Cost management of hospital material resources is a growing research topic, especially in specialized health units. Nurses are pointed out as the main managers of the costs and consumption of hospital material resources. This study aimed to characterize the Pediatric Intensive and Semi-Intensive Care Units of a teaching hospital and to investigate the costs and consumption of material resources used to treat patients admitted to these units. This is a descriptive exploratory study with retrospective data and a quantitative approach. Data were obtained from a Hospital Information System and analyzed according to the ABC classification. The average expenditures were similar in both the neonatal and cardiac units, and lower in the Pediatric Intensive and Semi-Intensive care units. There was a significant variation in the monthly consumption of materials. Higher-cost materials had a greater impact on the budget of the studied units. The data revealed the importance of using a systematic method for the analysis of materials consumption and expenditure in pediatric units, and they support administrative and economic decision-making.
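The ABC classification applied to the consumption data can be illustrated with a short generic sketch; the item names and annual consumption values below are invented, not the hospital's data.

```python
"""Generic ABC classification sketch (invented consumption values).

Items are ranked by annual consumption value; class A covers roughly the
top 80% of cumulative value, class B the next 15%, class C the rest.
"""
items = {  # item: annual consumption value in currency units -- hypothetical
    "central venous catheter": 60000, "infusion set": 20000, "syringe 10 ml": 8000,
    "gauze pad": 5000, "adhesive tape": 3000, "exam gloves": 2000, "saline 0.9%": 2000,
}

total = sum(items.values())
cumulative = 0.0
classes = {}
for name, value in sorted(items.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += value
    share = cumulative / total
    classes[name] = "A" if share <= 0.80 else ("B" if share <= 0.95 else "C")

for name, cls in classes.items():
    print(f"{cls}  {name:28s} {items[name]:>7,}")
```

The 80/95% cut-offs are a common convention, not a fixed rule; units often tune them so that class A isolates the few items that dominate the budget and deserve the tightest inventory control.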
Shaw, M; Singh, S
2015-04-01
Diagnostic error has implications for both clinical outcome and resource utilisation, and may often be traced to impaired data gathering, processing or synthesis because of the influence of cognitive bias. Factors inherent to the intensive/acute care environment afford multiple additional opportunities for such errors to occur. This article illustrates many of these with reference to a case encountered on our intensive care unit. Strategies to improve completeness of data gathering, processing and synthesis in the acute care environment are critically appraised in the context of early detection and amelioration of cognitive bias. These include reflection, targeted simulation training and the integration of social media and IT based aids in complex diagnostic processes. A framework which can be quickly and easily employed in a variety of clinical environments is then presented. © 2015 John Wiley & Sons Ltd.
Galvez, David A.; Zhang, Bin; Najar, Ahmed
2014-01-01
Plant ecologists have debated the mechanisms used by plants to cope with the impact of herbivore damage. While plant resistance mechanisms have received much attention, plant compensatory growth as a type of plant tolerance mechanism has been less studied. We conducted a greenhouse experiment to evaluate compensatory growth for trembling aspen (Populus tremuloides) seedlings under varying intensities and frequencies of simulated defoliation, with or without nutrient-enriched media. For the purpose of this study, changes in biomass production and non-structural carbohydrate concentrations (NSC) of roots and leaves were considered compensatory responses. All defoliated seedlings showed biomass accumulation under low defoliation intensity and frequency, regardless of resource availability; however, as defoliation intensity and frequency increased, compensatory growth of seedlings was altered depending on resource availability. Seedlings in a resource-rich environment showed complete compensation; in contrast, responses ranged from undercompensation to complete compensation in a resource-limited environment. Furthermore, at the highest defoliation intensity and frequency, NSC concentrations in leaves and roots were similar between defoliated and non-defoliated seedlings in a resource-rich environment; in contrast, defoliated seedlings with limited resources sustained the most biomass loss and had lower amounts of stored NSC. Using these results, we developed a new predictive framework incorporating the interactions between frequency and intensity of defoliation and resource availability as modulators of plant compensatory responses. PMID:25083352
Federated data storage system prototype for LHC experiments and data intensive science
NASA Astrophysics Data System (ADS)
Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.
2017-10-01
Rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area aim to unite their resources for future productive work while also supporting large physics collaborations. In our project we address the fundamental problem of designing a computing architecture to integrate distributed storage resources for LHC experiments and other data-intensive science applications and to provide access to data from heterogeneous computing facilities. Studies include development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one National Cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. This project intends to implement a federated distributed storage for all kinds of operations, such as read/write/transfer and access via WAN from Grid centres, university clusters, supercomputers, academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests, including real data processing and analysis workflows from the ATLAS and ALICE experiments, as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns, and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and changes in computing style, for instance how a bioinformatics program running on supercomputers can read and write data from the federated storage.
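A minimal sketch of the kind of WAN access-pattern measurement described above, assuming purely hypothetical HTTP/WebDAV endpoints for the federated storage sites (the prototype's actual endpoints and access protocols are not given in the abstract); it simply times a small remote read from each site:

```python
import time
import urllib.request

# Hypothetical read endpoints of federated storage sites (placeholders only).
ENDPOINTS = {
    "moscow": "https://storage-msk.example.org/testfile.bin",
    "dubna": "https://storage-dubna.example.org/testfile.bin",
    "geneva": "https://storage-gva.example.org/testfile.bin",
}

def time_remote_read(url, nbytes=1 << 20, timeout=30):
    """Read up to nbytes from a remote file; return (seconds, bytes_read)."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = resp.read(nbytes)
        return time.perf_counter() - start, len(data)
    except OSError:  # network errors, timeouts, HTTP errors
        return float("inf"), 0

if __name__ == "__main__":
    for site, url in ENDPOINTS.items():
        elapsed, nread = time_remote_read(url)
        print(f"{site}: read {nread} bytes in {elapsed:.2f} s")
```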
Transition to intensive care nursing: establishing a starting point.
Boyle, Martin; Butcher, Rand; Conyers, Vicki; Kendrick, Tina; MacNamara, Mary; Lang, Susie
2008-11-01
There is a shortage of intensive care (IC) nurses. A supported transition to IC nursing has been identified as a key strategy for recruitment and retention. In 2004 a discussion document relating to the transition of IC nurses was presented to the New South Wales (NSW) Chief Nursing Officer (CNO). A workshop was held with key stakeholders and a Steering Group was established to develop a state-wide transition to IC nursing program. To survey orientation programs and educational resources and develop definitions, goals, learning objectives and clinical competencies relating to transition to IC nursing practice. A questionnaire and a draft document of definitions, target group, goals, learning objectives and clinical competencies for IC transition were distributed to 43 NSW IC units (ICUs). An iterative process of anonymous feedback and modification was undertaken to establish agreement on content. Responses were received from 29 units (a return rate of 67%). The survey of educational resources indicated that ICUs had access to educational support, and it revealed the lack of a common standard or definition for "orientation" or "transition". The definitions, target group, goals and competency statements from the draft document were accepted with minor editorial change. Seventeen learning objectives or psychomotor skills were modified and an additional 19 were added to the draft as a result of the process. This work has established valid definitions, goals, learning objectives and clinical competencies that describe the transition to intensive care nursing.
The impact of a lean rounding process in a pediatric intensive care unit.
Vats, Atul; Goin, Kristin H; Villarreal, Monica C; Yilmaz, Tuba; Fortenberry, James D; Keskinocak, Pinar
2012-02-01
Poor workflow associated with physician rounding can produce inefficiencies that decrease time for essential activities, delay clinical decisions, and reduce staff and patient satisfaction. Workflow and provider resources were not optimized when a pediatric intensive care unit increased by 22,000 square feet (to 33,000) and by nine beds (to 30). Lean methods (focusing on essential processes) and scenario analysis were used to develop and implement a patient-centric standardized rounding process, which we hypothesize would lead to improved rounding efficiency, decrease required physician resources, improve satisfaction, and enhance throughput. Human factors techniques and statistical tools were used to collect and analyze observational data for 11 rounding events before and 12 rounding events after process redesign. Actions included: 1) recording rounding events, times, and patient interactions and classifying them as essential, nonessential, or nonvalue added; 2) comparing rounding duration and time per patient to determine the impact on efficiency; 3) analyzing discharge orders for timeliness; 4) conducting staff surveys to assess improvements in communication and care coordination; and 5) analyzing customer satisfaction data to evaluate impact on patient experience. Thirty-bed pediatric intensive care unit in a children's hospital with academic affiliation. Eight attending pediatric intensivists and their physician rounding teams. Eight attending physician-led teams were observed for 11 rounding events before and 12 rounding events after implementation of a standardized lean rounding process focusing on essential processes. Total rounding time decreased significantly (157 ± 35 mins before vs. 121 ± 20 mins after), through a reduction in time spent on nonessential (53 ± 30 vs. 9 ± 6 mins) activities. The previous process required three attending physicians for an average of 157 mins (7.55 attending physician man-hours), while the new process required two attending physicians for an average of 121 mins (4.03 attending physician man-hours). Cumulative distribution of completed patient rounds by hour of day showed an improvement from 40% to 80% of patients rounded by 9:30 AM. Discharge data showed pediatric intensive care unit patients were discharged an average of 58.05 mins sooner (p < .05). Staff surveys showed a significant increase in satisfaction with the new process (including increased efficiency, improved physician identification, and clearer understanding of process). Customer satisfaction scores showed improvement after implementing the new process. Implementation of a lean-focused, patient-centric rounding structure stressing essential processes was associated with increased timeliness and efficiency of rounds, improved staff and customer satisfaction, improved throughput, and reduced attending physician man-hours.
Quantum tomography of near-unitary processes in high-dimensional quantum systems
NASA Astrophysics Data System (ADS)
Lysne, Nathan; Sosa Martinez, Hector; Jessen, Poul; Baldwin, Charles; Kalev, Amir; Deutsch, Ivan
2016-05-01
Quantum Tomography (QT) is often considered the ideal tool for experimental debugging of quantum devices, capable of delivering complete information about quantum states (QST) or processes (QPT). In practice, the protocols used for QT are resource intensive and scale poorly with system size. In this situation, a well behaved model system with access to large state spaces (qudits) can serve as a useful platform for examining the tradeoffs between resource cost and accuracy inherent in QT. In past years we have developed one such experimental testbed, consisting of the electron-nuclear spins in the electronic ground state of individual Cs atoms. Our available toolkit includes high fidelity state preparation, complete unitary control, arbitrary orthogonal measurements, and accurate and efficient QST in Hilbert space dimensions up to d = 16. Using these tools, we have recently completed a comprehensive study of QPT in 4, 7 and 16 dimensions. Our results show that QPT of near-unitary processes is quite feasible if one chooses optimal input states and efficient QST on the outputs. We further show that for unitary processes in high dimensional spaces, one can use informationally incomplete QPT to achieve high-fidelity process reconstruction (90% in d = 16) with greatly reduced resource requirements.
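For context on the resource scaling mentioned above (a general fact about process tomography, not stated explicitly in the abstract): a completely positive trace-preserving map on a d-dimensional system is described by a d² × d² Hermitian Choi matrix, so full QPT must estimate on the order of

```latex
% Parameter count for full quantum process tomography in dimension d:
% d^4 real parameters of the Hermitian Choi matrix minus d^2 trace-preservation constraints.
N_{\mathrm{params}} = d^{4} - d^{2},
\qquad
N_{\mathrm{params}}(d = 16) = 16^{4} - 16^{2} = 65280 .
```

real parameters, which is why informationally incomplete strategies become attractive at d = 16.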
Context aware adaptive security service model
NASA Astrophysics Data System (ADS)
Tunia, Marcin A.
2015-09-01
Present systems and devices are usually protected against various threats related to digital data processing. The protection mechanisms consume resources, which are either highly limited or intensively utilized by many entities, so optimizing their usage is advantageous. Resources saved through optimization may be utilized by other mechanisms or may last longer. It is usually assumed that protection has to provide a specific quality and attack resistance. By interpreting the context of business services - both the users and the services themselves - it is possible to adapt security service parameters to counter the threats associated with the current situation. This approach optimizes the resources used while maintaining a sufficient security level. This paper presents the architecture of an adaptive, context-aware security service that also accounts for the quality of the context data.
Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools
NASA Technical Reports Server (NTRS)
Bis, Rachael; Maul, William A.
2015-01-01
Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone, owing to the intensive and customized effort needed to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
USDA-ARS?s Scientific Manuscript database
The genetic transformation of monocot grasses is a resource intensive process, the quality and efficiency of which is dependent in part upon the method of DNA introduction, as well as the ability to effectively separate transformed from wildtype tissue. Agrobacterium-mediated transformation of Brac...
The York Digital Journals Project: Strategies for Institutional Open Journal Systems Implementations
ERIC Educational Resources Information Center
Kosavic, Andrea
2010-01-01
Embarking on a universitywide journal-hosting initiative can be a resource-intensive undertaking. Providing such a service, however, can be equally rewarding, as it positions the library as both partner and colleague in the publishing process. This paper discusses ideas and strategies for institutional journal hosting gleaned over two years by the…
Zhang, Bo; Peng, Beihua; Liu, Mingchu
2012-01-01
This paper presents an overview of the resource use and environmental impact of Chinese industry during 1997-2006. For the purpose of this analysis the thermodynamic concept of exergy has been employed both to quantify and to aggregate the resources input and the environmental emissions arising from the sector. The resources input and environmental emissions show an increasing trend in this period. Compared with 47568.7 PJ in 1997, resources input in 2006 increased by 75.4% and reached 83437.9 PJ, of which 82.5% came from nonrenewable resources, mainly from coal and other energy minerals. Furthermore, the total exergy of environmental emissions was estimated to be 3499.3 PJ in 2006, 1.7 times that in 1997, of which 93.4% was from GHG emissions and only 6.6% from "three wastes" emissions. A rapid increase in nonrenewable resources input and GHG emissions over 2002-2006 can be seen, owing to the excessive expansion of resource- and energy-intensive subsectors. Exergy intensities in terms of resource input intensity and environmental emission intensity time series are also calculated; the trends are evidently influenced by the macroeconomic situation, particularly by the investment-driven economic development of recent years. Corresponding policy implications to guide a more sustainable industrial system are addressed.
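A quick arithmetic check of the growth figures quoted above (a sketch only; the 1997 and 2006 totals and the 82.5% share are taken directly from the abstract):

```python
# Resource input exergy reported for Chinese industry (PJ).
exergy_1997 = 47568.7
exergy_2006 = 83437.9

growth = (exergy_2006 / exergy_1997 - 1) * 100      # percent increase 1997 -> 2006
nonrenewable_2006 = 0.825 * exergy_2006             # 82.5% nonrenewable share

print(f"Increase 1997 -> 2006: {growth:.1f}%")      # ~75.4%, matching the abstract
print(f"Nonrenewable input in 2006: {nonrenewable_2006:.1f} PJ")
```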
Caring Decisions: The Development of a Written Resource for Parents Facing End-of-Life Decisions
Gillam, Lynn; Hynson, Jenny; Sullivan, Jane; Cossich, Mary; Wilkinson, Dominic
2015-01-01
Abstract Background: Written resources in adult intensive care have been shown to benefit families facing end of life (EoL) decisions. There are few resources for parents making EoL decisions for their child and no existing resources addressing ethical issues. The Caring Decisions handbook and website were developed to fill these gaps. Aim: We discuss the development of the resources, modification after reviewer feedback and findings from initial pilot implementation. Design: A targeted literature review-to identify resources and factors that impact on parental EoL decision-making; development phase-guided by the literature and the researchers' expertise; consultation process-comprised a multi-disciplinary panel of experts and parents; pilot evaluation study-hard-copy handbook was distributed as part of routine care at an Australian Children's Hospital. Setting/Participants: Twelve experts and parents formed the consultation panel. Eight parents of children with life-limiting conditions and clinicians were interviewed in the pilot study. Results: Numerous factors supporting/impeding EoL decisions were identified. Caring Decisions addressed issues identified in the literature and by the multidisciplinary research team. The consultation panel provided overwhelmingly positive feedback. Pilot study parents found the resources helpful and comforting. Most clinicians viewed the resources as very beneficial to parents and identified them as ideal for training purposes. Conclusions: The development of the resources addressed many of the gaps in existing resources. The consultation process and the pilot study suggest these resources could be of significant benefit to parents and clinicians. PMID:26418215
Pérez, Concepción; Navarro, Ana; Saldaña, María T; Wilson, Koo; Rejas, Javier
2015-03-01
The aim of the present analysis was to model the association and predictive value of pain intensity on cost and resource utilization in patients with chronic peripheral neuropathic pain (PNP) treated in routine clinical practice settings in Spain. We performed a secondary economic analysis based on data from a multicenter, observational, and prospective cost-of-illness study in patients with chronic PNP that is refractory to prior treatment. Pain intensity was measured using the Short-Form McGill Pain Questionnaire. Univariate and multivariate linear regression models were fitted to identify independent predictors of cost and health care/non-health care resource utilization. A total of 1703 patients were included in the current analysis. Pain intensity was an independent predictor of total costs ([total costs]=35.6 [pain intensity]+214.5; coefficient of determination [R(2)]=0.19, P<0.001), direct costs ([direct costs]=10.8 [pain intensity]+257.7; R=0.06, P<0.001), and indirect costs ([indirect costs]=24.8 [pain intensity]-43.4; R(2)=0.20, P<0.001) related to chronic PNP in the univariate analysis. Pain intensity remains significantly associated with total costs, direct costs, and indirect costs after adjustment by other covariates in the multivariate analysis (P<0.001). None of the other variables considered in the multivariate analysis were predictors of resource utilization. Pain intensity predicts the health care and non-health care resource utilization, and costs related to chronic PNP. Management of patients with drugs associated with a higher reduction of pain intensity may have a greater impact on the economic burden of that condition.
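A minimal sketch of the univariate cost model reported above, fitted here to synthetic data (the coefficients 35.6 and 214.5 come from the abstract; the generated scores, noise level and sample size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration only: pain scores and total costs generated around the
# published univariate relation [total costs] = 35.6*[pain intensity] + 214.5.
pain = rng.uniform(0, 45, size=500)          # e.g. SF-MPQ-like total scores
costs = 35.6 * pain + 214.5 + rng.normal(0, 400, size=500)

# Ordinary least squares fit of costs on pain intensity.
slope, intercept = np.polyfit(pain, costs, deg=1)
r2 = np.corrcoef(pain, costs)[0, 1] ** 2

print(f"costs ~ {slope:.1f}*pain + {intercept:.1f}, R^2 = {r2:.2f}")
```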
ERIC Educational Resources Information Center
Nguyen, T. L.
2016-01-01
At research-intensive universities, building human resources management (HRM) capacity has become a key approach to enhancing a university's research performance. However, despite aspiring to become a research-intensive university, many teaching-intensive universities in developing countries may not have created effective research-promoted HRM…
An adaptive signal-processing approach to online adaptive tutoring.
Bergeron, Bryan; Cline, Andrew
2011-01-01
Conventional intelligent or adaptive tutoring online systems rely on domain-specific models of learner behavior based on rules, deep domain knowledge, and other resource-intensive methods. We have developed and studied a domain-independent methodology of adaptive tutoring based on domain-independent signal-processing approaches that obviate the need for the construction of explicit expert and student models. A key advantage of our method over conventional approaches is a lower barrier to entry for educators who want to develop adaptive online learning materials.
Use of a large-scale rainfall simulator reveals novel insights into stemflow generation
NASA Astrophysics Data System (ADS)
Levia, D. F., Jr.; Iida, S. I.; Nanko, K.; Sun, X.; Shinohara, Y.; Sakai, N.
2017-12-01
Detailed knowledge of stemflow generation and its effects on both hydrological and biogeochemical cycling is important to achieve a holistic understanding of forest ecosystems. Field studies and a smaller set of experiments performed under laboratory conditions have increased our process-based knowledge of stemflow production. Building upon these earlier works, a large-scale rainfall simulator was employed to deepen our understanding of stemflow generation processes. The large-scale rainfall simulator provides a unique opportunity to examine a range of rainfall intensities under constant conditions, which is difficult in the field due to the variable nature of natural rainfall. Stemflow generation and production were examined for three species - Cryptomeria japonica D. Don (Japanese cedar), Chamaecyparis obtusa (Siebold & Zucc.) Endl. (Japanese cypress), and Zelkova serrata Thunb. (Japanese zelkova) - under both leafed and leafless conditions at several different rainfall intensities (15, 20, 30, 40, 50, and 100 mm h-1) using a large-scale rainfall simulator at the National Research Institute for Earth Science and Disaster Resilience (Tsukuba, Japan). Stemflow production, rates and funneling ratios were examined in relation to both rainfall intensity and canopy structure. Preliminary results indicate a dynamic and complex response of the funneling ratios of individual trees to different rainfall intensities among the species examined. This is partly the result of different canopy structures, hydrophobicity of vegetative surfaces, and differential wet-up processes across species and rainfall intensities. This presentation delves into these differences and attempts to distill them into generalizable patterns, which can advance our theories of stemflow generation processes and ultimately permit better stewardship of forest resources. ________________ Funding note: This research was supported by JSPS Invitation Fellowship for Research in Japan (Grant Award No.: S16088) and JSPS KAKENHI (Grant Award No.: JP15H05626).
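The funneling ratio mentioned above is not defined in the abstract; the definition commonly used in stemflow studies compares the stemflow volume collected at the trunk base with the rain that would have fallen on the trunk's basal area alone:

```latex
% Funneling ratio F (dimensionless): V_s = stemflow volume per event (L),
% B = trunk basal area (m^2), P = incident rainfall depth for the event (mm).
F = \frac{V_s}{B \, P}
```

Values of F well above 1 indicate that the canopy is concentrating rainfall toward the stem.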
Framework Resources Multiply Computing Power
NASA Technical Reports Server (NTRS)
2010-01-01
As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.
Schmidt, C E; Gerbershagen, M U; Salehin, J; Weib, M; Schmidt, K; Wolff, F; Wappler, F
2011-06-01
The healthcare market will face a serious shortage of qualified personnel by 2020. Aging of staff members is one important driver of this human resource deficit, but current planning horizons of 1-2 years cannot compensate for the demographic effects on the staff portfolio early enough. Therefore, prospective human resource planning is important to avoid loss of competence. The long-range development (10 years) of human resources in the hospitals of the City of Cologne was analyzed. The basis for the analysis was a simulation model that included fluctuation of staff, retirement, maternity leave, employee illness, partial retirement and new engagements per department and profession. The model was matched against the staff requirements of each department. The result was a capacity analysis that was used to derive strategic measures for staff recruitment and retention. The greatest risk of a shortage of qualified staff was found in the fluctuation of doctors and in the aging workforce. Without strategic human resource management, the hospitals would face a 50% reduction of the workforce within 10 years, and after 2 years there would be a 25% deficit of anesthesiologists, with an impact on the functioning of operating rooms (OR) and intensive care units. Qualification and continuous training of staff members, as well as process optimization, are the most important spheres of activity for human resource management in order to recruit and retain qualified staff members. Prospective human resource planning for the OR and intensive care units can help to detect staff shortages and loss of competence early enough to apply effective personnel development measures. A growing number of companies have started to plan ahead for their future demand for human resources. Hospitals should follow this example because the competition for qualified staff members is increasing rapidly.
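A minimal sketch of the kind of long-range head-count projection described above, assuming illustrative constant annual rates for fluctuation, retirement and recruitment (the actual model parameters and department data are not given in the abstract):

```python
def project_headcount(start, years=10, fluctuation=0.07,
                      retirement=0.02, hires_per_year=3):
    """Project staff numbers year by year under constant annual rates (illustrative)."""
    headcount = [float(start)]
    for _ in range(years):
        staff = headcount[-1]
        staff -= staff * (fluctuation + retirement)   # leavers and retirees
        staff += hires_per_year                       # new engagements
        headcount.append(staff)
    return headcount

if __name__ == "__main__":
    # Hypothetical department of 60 anesthesiologists.
    for year, n in enumerate(project_headcount(60)):
        print(f"year {year}: {n:.1f} staff")
```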
Recovery of Iron from Hematite-Rich Diasporic-Type Bauxite Ore
NASA Astrophysics Data System (ADS)
Jiang, Tao; Li, Zhuoxuan; Yang, Lin; Li, Guanghui; Zhang, Yuanbo; Zeng, Jinghua
In this study, a technique is proposed for recovering iron from hematite-rich diasporic-type bauxite ore. Direct reduction roasting followed by a low-intensity wet magnetic separation process was carried out. The parameters, including reduction temperature and time, sodium salts, grinding conditions and magnetic field intensity for the separation of iron, were determined. The optimum process parameters were as follows: a roasting temperature of 1050 °C, a roasting time of 60 min, sodium salts comprising sodium sulfate, borax and sodium carbonate at dosages of 10 wt%, 2 wt% and 35 wt%, respectively, and a magnetic field intensity of 1000 Gs with a grinding fineness of 92.75% passing -0.074 mm. Under the optimal conditions, an iron concentrate containing 88.17% total iron with an iron recovery of 92.51% was obtained, leaving 4.55% total iron in the tailings. This novel technique provides a potential route for utilizing hematite-rich diasporic bauxite ore: first recovering the iron resource and then extracting alumina from the magnetic separation tailings.
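The recovery figure quoted above can be related to the grades through the standard two-product formula used in mineral processing (a general relation, not specific to this study); with the reported concentrate grade c = 88.17% and tailings grade t = 4.55%, the recovery R for a feed grade f is:

```latex
% Two-product recovery formula:
% f = feed grade, c = concentrate grade, t = tailings grade (all as iron fractions).
R = \frac{c\,(f - t)}{f\,(c - t)} \times 100\%
```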
Mark D. Coleman; David R. Coyle; J. Blake; M. Buford; R.G. Campbell; J. Cox; B. Cregg; D. Daniels; M. Jacobson; Kurt Johnsen; Timothy McDonald; K. McLeod; E. Nelson; D. Robison; R. Rummer; F. Sanchez; John A. Stanturf; B. Stokes; Carl Trettin; J. Tuskan; L. Wright; S. Wullschleger
2004-01-01
Many researchers have studied the productivity potential of intensively managed forest plantations. However, we need to learn more about the effects of fundamental growth processes on forest productivity, especially the influence of above- and belowground resource acquisition and allocation. This report presents the installation, establishment, and first-year results of...
3-D Scene Reconstruction from Aerial Imagery
2012-03-01
educational, and recreational uses. Professionals can quickly determine optimal sites for natural resource exploration and potential development... sites. Educators can explore natural landmarks and experience foreign countries with their students, while recreational users can be immediately... sites can forgo an intensive process of erecting a grid and documenting the placement of all finds, and medical surgeons can gauge depth while
ERIC Educational Resources Information Center
Ward, Mary-Helen; West, Sandra; Peat, Mary; Atkinson, Susan
2010-01-01
The University of Sydney is a large, research-intensive, campus-based Australian University. Since 2004 a strategic initiative of project-based eLearning support has been creating teams of non-academic and academic staff, who have worked together to develop online resources to meet identified needs. The University's aims in continuing to provide…
Habitat structure mediates biodiversity effects on ecosystem properties
Godbold, J. A.; Bulling, M. T.; Solan, M.
2011-01-01
Much of what we know about the role of biodiversity in mediating ecosystem processes and function stems from manipulative experiments, which have largely been performed in isolated, homogeneous environments that do not incorporate habitat structure or allow natural community dynamics to develop. Here, we use a range of habitat configurations in a model marine benthic system to investigate the effects of species composition, resource heterogeneity and patch connectivity on ecosystem properties at both the patch (bioturbation intensity) and multi-patch (nutrient concentration) scale. We show that allowing fauna to move and preferentially select patches alters local species composition and density distributions, which has negative effects on ecosystem processes (bioturbation intensity) at the patch scale, but overall positive effects on ecosystem functioning (nutrient concentration) at the multi-patch scale. Our findings provide important evidence that community dynamics alter in response to localized resource heterogeneity and that these small-scale variations in habitat structure influence species contributions to ecosystem properties at larger scales. We conclude that habitat complexity forms an important buffer against disturbance and that contemporary estimates of the level of biodiversity required for maintaining future multi-functional systems may need to be revised. PMID:21227969
Bevans, Katherine B; Fitzpatrick, Leslie-Anne; Sanchez, Betty M; Riley, Anne W; Forrest, Christopher
2010-12-01
This study was conducted to empirically evaluate specific human, curricular, and material resources that maximize student opportunities for physical activity during physical education (PE) class time. A structure-process-outcome model was proposed to identify the resources that influence the frequency of PE and intensity of physical activity during PE. The proportion of class time devoted to management was evaluated as a potential mediator of the relations between resource availability and student activity levels. Data for this cross-sectional study were collected from interviews conducted with 46 physical educators and the systematic observation of 184 PE sessions in 34 schools. Regression analyses were conducted to test for the main effects of resource availability and the mediating role of class management. Students who attended schools with a low student-to-physical educator ratio had more PE time and engaged in higher levels of physical activity during class time. Access to adequate PE equipment and facilities was positively associated with student activity levels. The availability of a greater number of physical educators per student was found to impact student activity levels by reducing the amount of session time devoted to class management. The identification of structure and process predictors of student activity levels in PE will support the allocation of resources and encourage instructional practices that best support increased student activity levels in the most cost-effective way possible. Implications for PE policies and programs are discussed. © 2010, American School Health Association.
Knoke, Thomas; Bendix, Jörg; Pohle, Perdita; Hamer, Ute; Hildebrandt, Patrick; Roos, Kristin; Gerique, Andrés; Sandoval, María L; Breuer, Lutz; Tischer, Alexander; Silva, Brenner; Calvas, Baltazar; Aguirre, Nikolay; Castro, Luz M; Windhorst, David; Weber, Michael; Stimm, Bernd; Günter, Sven; Palomeque, Ximena; Mora, Julio; Mosandl, Reinhard; Beck, Erwin
2014-11-26
Increasing demands for livelihood resources in tropical rural areas have led to progressive clearing of biodiverse natural forests. Restoration of abandoned farmlands could counter this process. However, as aims and modes of restoration differ in their ecological and socio-economic value, the assessment of achievable ecosystem functions and benefits requires holistic investigation. Here we combine the results from multidisciplinary research for a unique assessment based on a normalization of 23 ecological, economic and social indicators for four restoration options in the tropical Andes of Ecuador. A comparison of the outcomes among afforestation with native alder or exotic pine, pasture restoration with either low-input or intense management and the abandoned status quo shows that both variants of afforestation and intense pasture use improve the ecological value, but low-input pasture does not. Economic indicators favour either afforestation or intense pasturing. Both Mestizo and indigenous Saraguro settlers are more inclined to opt for afforestation.
Efficacy beliefs predict collaborative practice among intensive care unit nurses.
Le Blanc, Pascale M; Schaufeli, Wilmar B; Salanova, Marisa; Llorens, Susana; Nap, Raoul E
2010-03-01
This paper is a report of an investigation of whether intensive care nurses' efficacy beliefs predict future collaborative practice, and to test the potential mediating role of team commitment in this relationship. Recent empirical studies in the field of work and organizational psychology have demonstrated that (professional) efficacy beliefs are reciprocally related to workers' resources and well-being over time, resulting in a positive gain spiral. Moreover, there is ample evidence that workers' affective commitment to their organization or work-team is related to desirable work behaviours such as citizenship behaviour. A longitudinal design was applied to questionnaire data from the EURICUS-project. Structural Equation Modelling was used to analyse the data. The sample consisted of 372 nurses working in 29 different European intensive care units. Data were collected in 1997 and 1998. However, our research model deals with fundamental psychosocial processes that are not time-dependent. Moreover, recent empirical literature shows that there is still room for improvement in ICU collaborative practice. The hypotheses that (i) the relationship between efficacy beliefs and collaborative practice is mediated by team commitment and (ii) efficacy beliefs, team commitment and collaborative practice are reciprocally related were supported, suggesting a potential positive gain spiral of efficacy beliefs. Healthcare organizations should create working environments that provide intensive care unit nurses with sufficient resources to perform their job well. Further research is needed to design and evaluate interventions for the enhancement of collaborative practice in intensive care units.
Top 40 priorities for science to inform conservation and management policy in the United States
Fleishman, Erica; Blockstein, David E.; Hall, John A.; Mascia, Michael B.; Rudd, Murray A.; Scott, J. Michael; Sutherland, William J.; Bartuska, Ann M.; Brown, A. Gordon; Christen, Catherine A.; Clement, Joel P.; DellaSala, Dominick; Duke, Clifford D.; Fiske, Shirley J.; Gosnell, Hannah; Haney, J. Christopher; Hutchins, Michael; Klein, Mary L.; Marqusee, Jeffrey; Noon, Barry R.; Nordgren, John R.; Orbuch, Paul M.; Powell, Jimmie; Quarles, Steven P.; Saterson, Kathryn A.; Stein, Bruce A.; Webster, Michael S.; Vedder, Amy
2011-01-01
To maximize the utility of research to decisionmaking, especially given limited financial resources, scientists must set priorities for their efforts. We present a list of the top 40 high-priority, multidisciplinary research questions directed toward informing some of the most important current and future decisions about management of species, communities, and ecological processes in the United States. The questions were generated by an open, inclusive process that included personal interviews with decisionmakers, broad solicitation of research needs from scientists and policymakers, and an intensive workshop that included scientifically oriented individuals responsible for managing and developing policy related to natural resources. The process differed from previous efforts to set priorities for conservation research in its focus on the engagement of decisionmakers in addition to researchers. The research priorities emphasized the importance of addressing societal context and exploration of trade-offs among alternative policies and actions, as well as more traditional questions related to ecological processes and functions.
NASA Astrophysics Data System (ADS)
Seipel, S.; Yu, J.; Periyasamy, A. P.; Viková, M.; Vik, M.; Nierstrasz, V. A.
2017-10-01
For the development of niche products like smart textiles and other functional high-end products, resource-saving production processes are needed. Niche products only require small batches, which makes their production with traditional textile production techniques time-consuming and costly. To achieve profitable production, as well as to further foster innovation, flexible and integrated production techniques are a requirement. Both digital inkjet printing and UV-light curing contribute to a flexible, resource-efficient, energy-saving and thus economical production of smart textiles. In this article, a smart textile UV-sensor is printed using a piezoelectric drop-on-demand printhead and cured with a UV-LED lamp. The UV-curable ink system is based on free radical polymerization, and the integrated UV-sensing material is a photochromic dye, Reversacol Ruby Red. The combination of two photoactive compounds, for which UV-light is both the curer and the activator, challenges two processes: polymer crosslinking of the resin and color performance of the photochromic dye. Differential scanning calorimetry (DSC) is used to characterize the curing efficiency of the prints. Color measurements are made to determine the influence of the degree of polymer crosslinking on the developed color intensities, as well as the coloration and decoloration rates of the photochromic prints. Optimized functionality of the textile UV-sensor is found using different belt speeds and lamp intensities during the curing process.
Beowulf Distributed Processing and the United States Geological Survey
Maddox, Brian G.
2002-01-01
Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.
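A minimal sketch of the general idea described above, splitting a processor-intensive calculation across workers; it uses a single machine's cores via the standard library rather than a Beowulf cluster, and the per-cell work function is purely illustrative:

```python
from multiprocessing import Pool

def process_cell(cell_id):
    """Stand-in for a numerically intensive per-cell calculation (illustrative)."""
    total = 0.0
    for i in range(1, 200_000):
        total += (cell_id % 17 + 1) / i
    return cell_id, total

if __name__ == "__main__":
    cells = range(64)                 # e.g. tiles of a study area
    with Pool() as pool:              # one worker per available core
        results = pool.map(process_cell, cells)
    print(f"processed {len(results)} cells; first result: {results[0]}")
```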
Evaluating open-source cloud computing solutions for geosciences
NASA Astrophysics Data System (ADS)
Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong
2013-09-01
Many organizations are starting to adopt cloud computing to better utilize computing resources by taking advantage of its scalability, cost reduction, and easy-to-access characteristics. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting Geosciences. This paper provides a comprehensive study of three open-source cloud solutions, including OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating the cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory and I/O of virtual machines created and managed by different solutions, (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies, (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula, and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.
An overview of science contributions to the management of the Tongass National Forest, Alaska.
Douglas A. Jr. Boyce; Robert C. Szaro
2005-01-01
After 6 years of intensive study, all the research studies designed to answer the information needs identified in appendix B of the Tongass land management plan have ended, with their results published or in press. The knowledge generated from these studies not only informs the ongoing process of regional natural resource management in southeast Alaska, but also helped...
[Precipitation pulses and ecosystem responses in arid and semiarid regions: a review].
Zhao, Wen-Zhi; Liu, Hu
2011-01-01
Precipitation events in arid and semi-arid environments usually occur in "pulses", with highly variable arrival time, duration, and intensity. These discrete and largely unpredictable features may lead to the pulsed availability of soil water and nutrients in space and time. Resource pulses can affect life history traits and behaviors at the individual level, numerous responses at the population level, and indirect effects at the community level. This paper reviews recent research advances in the related fields, covering both the effects of resource pulses and the responses of ecosystems. Several issues remain open, e.g., the effects of the pulsed features of resource availability on ecosystems, the discrepancies among the effects of resource pulses in different ecosystems, the eco-hydrological mechanisms that determine the persistence of pulsed resource effects, and the effects of pulsed resource availability on ecosystem processes. Given potential changes in global climate and precipitation patterns, an important future research direction is to determine how resource pulses affect ecosystem responses at different scales under different climate scenarios.
The Montage architecture for grid-enabled science processing of large, distributed datasets
NASA Technical Reports Server (NTRS)
Jacob, Joseph C.; Katz, Daniel S .; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui
2004-01-01
Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive, and are stored in distributed archives that are, in most cases, remote from the available Computational resources. Therefore, state of the art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.
Priest, Kelsey C; Lobingier, Hannah; McCully, Nancy; Lombard, Jackie; Hansen, Mark; Uchiyama, Makoto; Hagg, Daniel S
2016-01-01
Health care delivery systems are challenged to support the increasing demands for improving patient safety, satisfaction, and outcomes. Limited resources and staffing are common barriers for making significant and sustained improvements. At Oregon Health & Science University, the medical intensive care unit (MICU) leadership team faced internal capacity limitations for conducting continuous quality improvement, specifically for the implementation and evaluation of the mobility portion of an evidence-based care bundle. The MICU team successfully addressed this capacity challenge using the person power of prehealth volunteers. In the first year of the project, 52 trained volunteers executed an evidence-based mobility intervention for 305 critically ill patients, conducting more than 200 000 exercise repetitions. The volunteers contributed to real-time evaluation of the project, with the collection of approximately 26 950 process measure data points. Prehealth volunteers are an untapped resource for effectively expanding internal continuous quality improvement capacity in the MICU and beyond.
Tuneable porous carbonaceous materials from renewable resources.
White, Robin J; Budarin, Vitaly; Luque, Rafael; Clark, James H; Macquarrie, Duncan J
2009-12-01
Porous carbon materials are ubiquitous with a wide range of technologically important applications, including separation science, heterogeneous catalyst supports, water purification filters, stationary phase materials, as well as the developing future areas of energy generation and storage applications. Hard template routes to ordered mesoporous carbons are well established, but whilst offering different mesoscopic textural phases, the surface of the material is difficult to chemically post-modify and processing is energy, resource and step intensive. The production of carbon materials from biomass (i.e. sugars or polysaccharides) is a relatively new but rapidly expanding research area. In this tutorial review, we compare and contrast recently reported routes to the preparation of porous carbon materials derived from renewable resources, with examples of our previously reported mesoporous polysaccharide-derived "Starbon" carbonaceous material technology.
TORC3: Token-ring clearing heuristic for currency circulation
NASA Astrophysics Data System (ADS)
Humes, Carlos, Jr.; Lauretto, Marcelo S.; Nakano, Fábio; Pereira, Carlos A. B.; Rafare, Guilherme F. G.; Stern, Julio Michael
2012-10-01
Clearing algorithms are at the core of modern payment systems, facilitating the settling of multilateral credit messages with (near) minimum transfers of currency. Traditional clearing procedures use batch processing based on MILP - mixed-integer linear programming algorithms. The MILP approach demands intensive computational resources; moreover, it is also vulnerable to operational risks generated by possible defaults during the inter-batch period. This paper presents TORC3 - the Token-Ring Clearing Algorithm for Currency Circulation. In contrast to the MILP approach, TORC3 is a real time heuristic procedure, demanding modest computational resources, and able to completely shield the clearing operation against the participating agents' risk of default.
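A minimal sketch of the multilateral netting bookkeeping that clearing systems build on (this is only the basic net-position calculation, not the TORC3 token-ring heuristic or a MILP formulation):

```python
from collections import defaultdict

def net_positions(credit_messages):
    """Compute each agent's net position from (payer, payee, amount) messages."""
    net = defaultdict(float)
    for payer, payee, amount in credit_messages:
        net[payer] -= amount
        net[payee] += amount
    return dict(net)

if __name__ == "__main__":
    # Illustrative credit messages among three agents.
    messages = [("A", "B", 100.0), ("B", "C", 80.0), ("C", "A", 90.0)]
    print(net_positions(messages))  # gross flow is 270, but net transfers are far smaller
```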
NASA Astrophysics Data System (ADS)
Hultine, K. R.; Bush, S.; Nagler, P. L.; Morino, K.; Burtch, K.; Dennison, P. E.; Glenn, E. P.; Ehleringer, J.
2010-12-01
Global change processes such as climate change and intensive land use pose significant threats to water resources, particularly in arid regions where potential evapotranspiration far exceeds annual rainfall. Potentially compounding these shortages is the progressive expansion of introduced plant species in riparian areas along streams, canals and rivers in geographically arid regions. The question of whether these invasive species have had or will have impacts on water resources is currently under intense debate. We identify a framework for assessing when and where introduced riparian plant species are likely to have the highest potential impact on hydrologic fluxes of arid and semi-arid river systems. We focus on three introduced plant systems that currently dominate southwestern U.S. riparian forests: tamarisk (Tamarix spp.), Russian olive (Eleagnus angustifolia), and Russian knapweed (Acroptilon repens). Our framework focuses on two main criteria: 1) the ecophysiological traits that promote establishment of invasive species across environmental gradients, and 2) an assessment of how hydrologic fluxes are altered by the establishment of introduced species at varying scales. The framework identifies when and where introduced species should have the highest potential impact on the water cycle. This framework will assist land managers and policy makers with restoration and conservation priorities to preserve water resources and valued riparian habitat given limited economic resources.
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often require only a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
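A minimal sketch of the kind of dynamic resource selection described above, choosing the site with the lowest estimated completion time from illustrative transfer and compute estimates (the infrastructure's actual selection logic and site parameters are not given in the abstract):

```python
def estimated_completion_time(site, data_gb, core_hours):
    """Estimate transfer time plus compute time for a job on a given site (hours)."""
    transfer_h = data_gb / site["bandwidth_gb_per_h"]
    compute_h = core_hours / site["free_cores"]
    return transfer_h + compute_h

def select_site(sites, data_gb, core_hours):
    """Pick the site that minimizes the estimated completion time."""
    return min(sites, key=lambda s: estimated_completion_time(s, data_gb, core_hours))

if __name__ == "__main__":
    # Purely illustrative site descriptions.
    sites = [
        {"name": "campus_cluster", "bandwidth_gb_per_h": 400, "free_cores": 256},
        {"name": "hpc_center", "bandwidth_gb_per_h": 120, "free_cores": 4096},
    ]
    best = select_site(sites, data_gb=2000, core_hours=50000)
    print("selected:", best["name"])
```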
System-on-Chip Data Processing and Data Handling Spaceflight Electronics
NASA Technical Reports Server (NTRS)
Kleyner, I.; Katz, R.; Tiggeler, H.
1999-01-01
This paper presents a methodology and a tool set that implement automated generation of moderate-size blocks of customized intellectual property (IP), thus effectively reusing prior work and minimizing the labor-intensive, error-prone parts of the design process. Customizing components allows optimization for smaller area and lower power consumption, which is an important factor given the limited resources available in radiation-hardened devices. The effects of variations in HDL coding style on the efficiency of synthesized code for various commercial synthesis tools are also discussed.
Tsagkari, Mirela; Couturier, Jean-Luc; Kokossis, Antonis; Dubois, Jean-Luc
2016-09-08
Biorefineries offer a promising alternative to fossil-based processing industries and have undergone rapid development in recent years. Limited financial resources and stringent company budgets necessitate quick capital estimation of pioneering biorefinery projects at the early stages of their conception to screen process alternatives, decide on project viability, and allocate resources to the most promising cases. Biorefineries are capital-intensive projects that involve state-of-the-art technologies for which there is no prior experience or sufficient historical data. This work reviews existing rapid cost estimation practices, which can be used by researchers with no previous cost estimating experience. It also comprises a comparative study of six cost methods on three well-documented biorefinery processes to evaluate their accuracy and precision. The results illustrate discrepancies among the methods because their extrapolation on biorefinery data often violates inherent assumptions. This study recommends the most appropriate rapid cost methods and urges the development of an improved early-stage capital cost estimation tool suitable for biorefinery processes. © 2015 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
NASA Astrophysics Data System (ADS)
Yarovyi, Andrii A.; Timchenko, Leonid I.; Kozhemiako, Volodymyr P.; Kokriatskaia, Nataliya I.; Hamdi, Rami R.; Savchuk, Tamara O.; Kulyk, Oleksandr O.; Surtel, Wojciech; Amirgaliyev, Yedilkhan; Kashaganova, Gulzhan
2017-08-01
The paper addresses the insufficient performance of existing computing systems for large-image processing, which does not meet the modern requirements posed by the resource-intensive computing tasks of laser beam profiling. The research concentrated on one of the profiling problems, namely real-time processing of spot images of the laser beam profile. The development of a theory of parallel-hierarchical transformation made it possible to produce models for high-performance parallel-hierarchical processes, as well as algorithms and software for their implementation on a GPU-oriented architecture using GPGPU technologies. The measured performance of the proposed computerized tools for processing and classification of laser beam profile images shows that they can perform real-time processing of dynamic images of various sizes.
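A minimal sketch of one common spot-image profiling step, computing the centroid and second-moment widths of a beam image (a generic NumPy illustration, not the parallel-hierarchical GPU transformation developed in the paper):

```python
import numpy as np

def beam_centroid_and_widths(image):
    """Return (x0, y0, wx, wy): intensity centroid and 2-sigma second-moment widths."""
    img = np.asarray(image, dtype=float)
    total = img.sum()
    y, x = np.indices(img.shape)
    x0 = (x * img).sum() / total
    y0 = (y * img).sum() / total
    wx = 2.0 * np.sqrt(((x - x0) ** 2 * img).sum() / total)
    wy = 2.0 * np.sqrt(((y - y0) ** 2 * img).sum() / total)
    return x0, y0, wx, wy

if __name__ == "__main__":
    # Synthetic Gaussian spot for demonstration.
    y, x = np.indices((256, 256))
    spot = np.exp(-(((x - 130) ** 2) / (2 * 20 ** 2) + ((y - 120) ** 2) / (2 * 30 ** 2)))
    print(beam_centroid_and_widths(spot))
```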
Daugherty, Elizabeth L; Rubinson, Lewis
2011-11-01
In recent years, healthcare disaster planning has grown from its early place as an occasional consideration within the manuals of emergency medical services and emergency department managers to a rapidly growing field, which considers continuity of function, surge capability, and process changes across the spectrum of healthcare delivery. A detailed examination of critical care disaster planning was undertaken in 2007 by the Task Force for Mass Critical Care of the American College of Chest Physicians Critical Care Collaborative Initiative. We summarize the Task Force recommendations and available updated information to answer a fundamental question for critical care disaster planners: What is a prepared intensive care unit and how do I ensure my unit's readiness? Database searches and review of relevant published literature. Preparedness is essential for successful response, but because intensive care units face many competing priorities, without defining "preparedness for what," the task can seem overwhelming. Intensive care unit disaster planners should, therefore, along with the entire hospital, participate in a hospital or regionwide planning process to 1) identify critical care response vulnerabilities; and 2) clarify the hazards for which their community is most at risk. The process should inform a comprehensive written preparedness plan targeting the most worrisome scenarios and including specific guidance on 1) optimal use of space, equipment, and staffing for delivery of critical care to significantly increased patient volumes; 2) allocation of resources for provision of essential critical care services under conditions of absolute scarcity; 3) intensive care unit evacuation; and 4) redundant internal communication systems and means for timely data collection. Critical care disaster planners have a complex, challenging task. Experienced planners will agree that no disaster response is perfect, but careful planning will enable the prepared intensive care unit to respond effectively in times of crisis.
Scotti, Dennis J; Harmon, Joel; Behson, Scott J
2009-01-01
This study assesses the importance of customer-contact intensity at the service encounter level as a determinant of service quality assessments. Using data from the U.S. Department of Veterans Affairs, it shows that performance-driven human resources practices play an important role as determinants of employee customer orientation and service capability in both high-contact (outpatient healthcare) and low-contact (benefits claim processing) human service contexts. However, there existed significant differences across service delivery settings in the salience of customer orientation and the congruence between employee and customer perceptions of service quality, depending on the intensity of customer contact. In both contexts, managerial attention to high-performance work systems and customer-orientation has the potential to favorably impact perceptions of service quality, amplify consumer satisfaction, and enhance operational efficiency.
NASA Astrophysics Data System (ADS)
Filipponi, Federico; Zucca, Francesco; Taramelli, Andrea; Valentini, Emiliana
2015-12-01
Monitoring sediment flux patterns in coastal areas, such as dispersion, sedimentation and resuspension processes, is a relevant topic for scientists, decision makers and natural resource management. Time series analysis of Earth Observation (EO) data may contribute to the understanding and monitoring of processes in sedimentary depositional marine environments, especially in shallow coastal areas. This study shows the ability of medium-resolution optical imagery to interpret the evolution of sediment resuspension from the seafloor in coastal areas during intense wind forcing. Intense bora wind events in the northern Adriatic Sea basin during the winter season provoke considerable wave-generated resuspension of sediments, which causes variations in water column turbidity. The Total Suspended Matter (TSM) product was selected as a proxy for qualitative and quantitative analysis of resuspended sediments. In addition, the maximum signal depth (Z90_max) was used to evaluate the evolution of sediment concentration in the water column.
Kasey Jacobs
2017-01-01
The U.S. Forest Service has found itself in an era of intense human activity: a changing climate; development and loss of open space; resource consumption; problematic introduced species; and diversity in core beliefs and values. These challenges test our task-relevant maturity and our ability and willingness to meet the growing demands for services. The Forest...
Women’s Role in Disaster Management and Implications for National Security
2017-07-11
…management policies, plans and decision making processes," available at http://www.unisdr.org/we/inform/publications/1037. Beijing Agenda for Global… By Jessica Ear. Introduction: Disasters are increasing in … frequency and intensity. For those lacking control and access to services and resources such as education and information, disaster risks are even…
Jose Luiz Stape; Dan Binkley; Michael G. Ryan
Millions of hectares of Eucalyptus are intensively managed for wood production in the tropics, but little is known about the physiological processes that control growth and their regulation. We examined the main environmental factors controlling growth and resource use across a geographic gradient with clonal E. grandis x urophylla in north-eastern Brazil. Rates of...
Cohen, Alex S; Dinzeo, Thomas J; Donovan, Neila J; Brown, Caitlin E; Morrison, Sean C
2015-03-30
Vocal expression reflects an integral component of communication that varies considerably within individuals across contexts and is disrupted in a range of neurological and psychiatric disorders. There is reason to suspect that variability in vocal expression reflects, in part, the availability of "on-line" resources (e.g., working memory, attention). Thus, vocal expression is a potentially important biometric index of information processing, not only across but also within individuals over time. A first step in this line of research involves establishing a link between vocal expression and information processing systems in healthy adults. The present study employed a dual attention experimental task where participants provided natural speech while simultaneously engaged in a baseline, medium or high nonverbal processing-load task. Objective, automated, and computerized analysis was employed to measure vocal expression in 226 adults. Increased processing load resulted in longer pauses, fewer utterances, greater silence overall and less variability in frequency and intensity levels. These results provide compelling evidence of a link between information processing resources and vocal expression, and provide important information for the development of an automated, inexpensive and noninvasive biometric measure of information processing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Mechanical Properties and Eco-Efficiency of Steel Fiber Reinforced Alkali-Activated Slag Concrete.
Kim, Sun-Woo; Jang, Seok-Joon; Kang, Dae-Hyun; Ahn, Kyung-Lim; Yun, Hyun-Do
2015-10-30
Conventional concrete production that uses ordinary Portland cement (OPC) as a binder seems unsustainable due to its high energy consumption, natural resource exhaustion and huge carbon dioxide (CO₂) emissions. To transform the conventional process of concrete production to a more sustainable process, the replacement of high energy-consumptive PC with new binders such as fly ash and alkali-activated slag (AAS) from available industrial by-products has been recognized as an alternative. This paper investigates the effect of curing conditions and steel fiber inclusion on the compressive and flexural performance of AAS concrete with a specified compressive strength of 40 MPa to evaluate the feasibility of AAS concrete as an alternative to normal concrete for CO₂ emission reduction in the concrete industry. Their performances are compared with reference concrete produced using OPC. The eco-efficiency of AAS use for concrete production was also evaluated by binder intensity and CO₂ intensity based on the test results and literature data. Test results show that it is possible to produce AAS concrete with compressive and flexural performances comparable to conventional concrete. Wet-curing and steel fiber inclusion improve the mechanical performance of AAS concrete. Also, the utilization of AAS as a sustainable binder can lead to significant CO₂ emissions reduction and resources and energy conservation in the concrete industry.
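As an aside for readers unfamiliar with the two eco-efficiency indicators mentioned above, the sketch below shows how binder intensity and CO2 intensity are commonly computed: the binder mass and the embodied CO2 per cubic metre of concrete, each normalized by the achieved compressive strength. This is only a minimal illustration; the mix quantities, emission figures and strengths in the example are placeholders, not values from this study.

    # Illustrative calculation of the two eco-efficiency indicators named above.
    # Input numbers are placeholders, not data from the study.

    def binder_intensity(binder_kg_per_m3: float, strength_mpa: float) -> float:
        """Binder mass per cubic metre of concrete per MPa of compressive strength."""
        return binder_kg_per_m3 / strength_mpa

    def co2_intensity(co2_kg_per_m3: float, strength_mpa: float) -> float:
        """Embodied CO2 per cubic metre of concrete per MPa of compressive strength."""
        return co2_kg_per_m3 / strength_mpa

    # Hypothetical OPC mix versus AAS mix at a similar ~40 MPa strength level.
    opc = {"binder": 400.0, "co2": 350.0, "fc": 42.0}
    aas = {"binder": 400.0, "co2": 120.0, "fc": 41.0}

    for name, mix in (("OPC", opc), ("AAS", aas)):
        print(name,
              round(binder_intensity(mix["binder"], mix["fc"]), 2), "kg/m3/MPa,",
              round(co2_intensity(mix["co2"], mix["fc"]), 2), "kg CO2/m3/MPa")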
NASA Astrophysics Data System (ADS)
Hu, Di; Dolganov, Aleksei; Ma, Mingchan; Bhattacharya, Biyash; Bishop, Matthew T.; Chen, George Z.
2018-02-01
The Kroll process has been employed for titanium extraction since the 1950s. It is a labour and energy intensive multi-step semi-batch process. The post-extraction processes for making the raw titanium into alloys and products are also excessive, including multiple remelting steps. Invented in the late 1990s, the Fray-Farthing-Chen (FFC) Cambridge process extracts titanium from solid oxides at lower energy consumption via electrochemical reduction in molten salts. Its ability to produce alloys and powders, while retaining the cathode shape also promises energy and material efficient manufacturing. Focusing on titanium and its alloys, this article reviews the recent development of the FFC-Cambridge process in two aspects, (1) resource and process sustainability and (2) advanced post-extraction processing.
The future of Yellowcake: a global assessment of uranium resources and mining.
Mudd, Gavin M
2014-02-15
Uranium (U) mining remains controversial in many parts of the world, especially in a post-Fukushima context, and often in areas with significant U resources. Although nuclear proponents point to the relatively low carbon intensity of nuclear power compared to fossil fuels, opponents argue that this will be eroded in the future as ore grades decline and energy and greenhouse gas emissions (GGEs) intensity increases as a result. Invariably both sides fail to make use of the increasingly available data reported by some U mines through sustainability reporting - allowing a comprehensive assessment of recent trends in the energy and GGE intensity of U production, as well as combining this with reported mineral resources to allow more comprehensive modelling of future energy and GGEs intensity. In this study, detailed data sets are compiled on reported U resources by deposit type, as well as mine production, energy and GGE intensity. Some important aspects included are the relationship between ore grade, deposit type and recovery, which are crucial in future projections of U mining. Overall, the paper demonstrates that there are extensive U resources known to meet potential short to medium term demand, although the future of U mining remains uncertain due to the doubt about the future of nuclear power as well as a range of complex social, environmental, economic and some site-specific technical issues. Copyright © 2013 Elsevier B.V. All rights reserved.
McIntosh, Nathalie; Oppel, Eva; Mohr, David; Meterko, Mark
2017-09-01
Improving patient care quality in intensive care units is increasingly important as intensive care unit services account for a growing proportion of hospital services. Organizational factors associated with quality of patient care in such units have been identified; however, most were examined in isolation, making it difficult to assess the relative importance of each. Furthermore, though most intensive care units now use a closed model, little research has been done in this specific context. To examine the relative importance of organizational factors associated with patient care quality in closed intensive care units. In a national exploratory, cross-sectional study focused on intensive care units at US Veterans Health Administration acute care hospitals, unit directors were surveyed about nurse and physician staffing, work resources and training, patient care coordination, rounding, and perceptions of patient care quality. Administrative records yielded data on patient volume and facility teaching status. Descriptive statistics, bivariate analyses, and regression modeling were used for data analysis. Sixty-nine completed surveys from directors of closed intensive care units were returned. Regression model results showed that better patient care coordination (β = 0.43; P = .01) and having adequate work resources (β = 0.26; P = .02) were significantly associated with higher levels of patient care quality in such units (R² = 0.22). Augmenting work resources and/or focusing limited hospital resources on improving patient care coordination may be the most productive ways to improve patient care quality in closed intensive care units. ©2017 American Association of Critical-Care Nurses.
[Limitation of therapeutic effort: Approach to a combined view].
Bueno Muñoz, M J
2013-01-01
Over the past few decades, we have been witnessing that increasingly fewer people pass away at home and increasingly more do so within the hospital. More specifically, 20% of deaths now occur in an intensive care unit (ICU). However, death in the ICU has become a highly technical process. This sometimes leads to excesses because the resources used are not proportionate to the purposes pursued (futility), and it may create situations that do not respect the person's dignity throughout the dying process. It is within this context that the clinical procedure called "limitation of the therapeutic effort" (LTE) is reviewed. LTE has become a true bridge between intensive care and palliative care. Its final goal is to guarantee a dignified and painless death for the terminally ill. Copyright © 2012 Elsevier España, S.L. y SEEIUC. All rights reserved.
Recent developments of downstream processing for microbial lipids and conversion to biodiesel.
Yellapu, Sravan Kumar; Bharti; Kaur, Rajwinder; Kumar, Lalit R; Tiwari, Bhagyashree; Zhang, Xiaolei; Tyagi, Rajeshwar D
2018-05-01
With an increasing global population and depleting resources, there is an apparent demand for radical, unprecedented innovation to satisfy the basic needs of life. Hence, non-conventional renewable energy resources like biodiesel have been developed over the past few decades. Biofuels (e.g. biodiesel) are among the most sustainable answers to the "food vs. fuel" crisis. In the biorefinery process, lipid extraction from oleaginous microorganisms is an integral part, as it facilitates the release of fatty acids. Direct lipid extraction from wet cell biomass is favorable in comparison to dry cell biomass because it eliminates the need for expensive dehydration. However, this process is not yet commercialized; instead, it requires intensive research and development to establish robust approaches for lipid extraction that can be applied practically on an industrial scale. This review aims to present critically the cell disruption, lipid recovery and purification steps that support extraction from wet cell biomass for efficient transesterification. Copyright © 2018 Elsevier Ltd. All rights reserved.
Knoke, Thomas; Bendix, Jörg; Pohle, Perdita; Hamer, Ute; Hildebrandt, Patrick; Roos, Kristin; Gerique, Andrés; Sandoval, María L.; Breuer, Lutz; Tischer, Alexander; Silva, Brenner; Calvas, Baltazar; Aguirre, Nikolay; Castro, Luz M.; Windhorst, David; Weber, Michael; Stimm, Bernd; Günter, Sven; Palomeque, Ximena; Mora, Julio; Mosandl, Reinhard; Beck, Erwin
2014-01-01
Increasing demands for livelihood resources in tropical rural areas have led to progressive clearing of biodiverse natural forests. Restoration of abandoned farmlands could counter this process. However, as aims and modes of restoration differ in their ecological and socio-economic value, the assessment of achievable ecosystem functions and benefits requires holistic investigation. Here we combine the results from multidisciplinary research for a unique assessment based on a normalization of 23 ecological, economic and social indicators for four restoration options in the tropical Andes of Ecuador. A comparison of the outcomes among afforestation with native alder or exotic pine, pasture restoration with either low-input or intense management and the abandoned status quo shows that both variants of afforestation and intense pasture use improve the ecological value, but low-input pasture does not. Economic indicators favour either afforestation or intense pasturing. Both Mestizo and indigenous Saraguro settlers are more inclined to opt for afforestation. PMID:25425182
Burkart, M.R.; Kolpin, D.W.
1993-01-01
The US Geological Survey, US Department of Agriculture, and US Environmental Protection Agency are conducting research and regional assessments in support of policy alternatives intended to protect water resources from agricultural chemical contamination. The mid-continent was selected because of the intense row crop agriculture and associated herbicide application in this region. An application of a geographic information system is demonstrated for analyzing and comparing the distribution of estimated atrazine use to the detection rate of atrazine in groundwater. Understanding the relations between atrazine use and detection in groundwater is important in policy deliberations to protect water resources. Relational analyses between measures of chemical use and detection rate by natural resource units may provide insight into critical factors controlling the processes that result in groundwater contamination from agricultural chemicals.
Krieger, James W.; Takaro, Tim K.; Song, Lin; Weaver, Marcia
2005-01-01
Objectives. We assessed the effectiveness of a community health worker intervention focused on reducing exposure to indoor asthma triggers. Methods. We conducted a randomized controlled trial with 1-year follow-up among 274 low-income households containing a child aged 4–12 years who had asthma. Community health workers provided in-home environmental assessments, education, support for behavior change, and resources. Participants were assigned to either a high-intensity group receiving 7 visits and a full set of resources or a low-intensity group receiving a single visit and limited resources. Results. The high-intensity group improved significantly more than the low-intensity group in its pediatric asthma caregiver quality-of-life score (P=.005) and asthma-related urgent health services use (P=.026). Asthma symptom days declined more in the high-intensity group, although the across-group difference did not reach statistical significance (P= .138). Participant actions to reduce triggers generally increased in the high-intensity group. The projected 4-year net savings per participant among the high-intensity group relative to the low-intensity group were $189–$721. Conclusions. Community health workers reduced asthma symptom days and urgent health services use while improving caregiver quality-of-life score. Improvement was greater with a higher-intensity intervention. PMID:15798126
MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce.
Idris, Muhammad; Hussain, Shujaat; Siddiqi, Muhammad Hameed; Hassan, Waseem; Syed Muhammad Bilal, Hafiz; Lee, Sungyoung
2015-01-01
Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real-time and streaming data in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew-mitigation strategies. A performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement.
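As a rough illustration of the multi-key idea described in this abstract, the plain-Python sketch below tags each intermediate key with an algorithm identifier so that two related algorithms can share a single map/shuffle/reduce pass over the input. It is only a toy stand-in under that assumption; it is not MRPack's code and does not use the Hadoop MapReduce API.

    # Toy sketch: two related algorithms share one map/shuffle/reduce pass by
    # tagging intermediate keys with an algorithm id (the "multi-key" idea).
    from collections import defaultdict

    records = ["the quick brown fox", "the lazy dog", "quick quick fox"]

    def map_word_count(rec):
        for w in rec.split():
            yield ("wordcount", w), 1          # key tagged with algorithm id

    def map_length_hist(rec):
        for w in rec.split():
            yield ("lengthhist", len(w)), 1    # same pass, different algorithm id

    def shuffle(pairs):
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_sum(key, values):
        return key, sum(values)

    intermediate = []
    for rec in records:                        # single scan over the input data
        intermediate.extend(map_word_count(rec))
        intermediate.extend(map_length_hist(rec))

    results = (reduce_sum(k, v) for k, v in shuffle(intermediate).items())
    for (algorithm, key), value in sorted(results):
        print(algorithm, key, value)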
Nanomaterials for renewable energy
Chen, Shimou; Li, Liang; Sun, Hanwen; ...
2015-05-19
With the demand for sustainable energy, resources, and environmental protection, new material technologies have been expanding constantly over the last couple of decades and have received intensive attention from the scientific community. In particular, nanomaterials are increasingly playing an active role, either by increasing the efficiency of energy storage and conversion processes or by improving device design and performance. This special issue presents recent research advances in various aspects of energy storage technologies, advanced batteries, fuel cells, solar cells, biofuels, and so on. The design and synthesis of novel materials have demonstrated great impact on the utilization of sustainable energy, which is needed to address the increasing shortage of resources and the issues of environmental pollution.
Landscape ecological risk assessment study in arid land
NASA Astrophysics Data System (ADS)
Gong, Lu; Amut, Aniwaer; Shi, Qingdong; Wang, Gary Z.
2007-09-01
Ecosystem risk assessment is an essential decision-making tool for predicting the reconstruction and recovery of a damaged ecosystem after intensive human activity. The sustainability of the environment and resources of lake ecosystems in arid districts has received close attention from international communities as well as numerous experts and scholars. Ecological risk assessment offers a scientific foundation for the decision making and execution of ecological risk management. Bosten Lake, the largest inland freshwater lake in China, is the main water source for industrial and agricultural production as well as for local residents in the Yanqi basin, Korla city and Yuli County in southern Xinjiang. Bosten Lake also provides a direct water source for emergency water transfers to the lower reaches of the Tarim River. However, with the intensive utilization of water and soil resources, the environmental condition of Bosten Lake has become more and more serious. In this study, the theory and method of landscape ecological risk assessment were applied using 3S technologies combined with the frontier theory of landscape ecology. The main risk sources, including flood, drought, water pollution and eutrophication, were identified and evaluated within the ecosystem risk assessment framework. The process includes five stages: regional natural resource analysis, risk receptor selection, risk source evaluation, exposure and hazard analysis, and integrated risk assessment. Based on the risk assessment results, environmental risk management countermeasures were determined.
Extracting remanent magnetization from magnetic data inversion
NASA Astrophysics Data System (ADS)
Liu, S.; Fedi, M.; Baniamerian, J.; Hu, X.
2017-12-01
Remanent magnetization is an important vector parameter of rock and ore magnetism; it is related to the intensity and direction of the primary geomagnetic field across geological periods and hence provides critical evidence of tectonic movement and sedimentary evolution. We extract the remanence information from the distribution of the inverted magnetization vector. First, the direction of the total magnetization vector is estimated from the reduced-to-pole anomaly (max-min algorithm) and from its correlations with other magnitude magnetic transforms, such as the magnitude magnetic anomaly and the normalized source strength. We then invert the data for the magnetization intensity and, finally, the intensity and direction of the remanent magnetization are separated from the total magnetization vector using a generalized formula for the apparent susceptibility based on a priori information on the Koenigsberger ratio. Our approach is used to investigate the targeted resources and geologic processes of mining areas in China.
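For readers unfamiliar with the quantities involved, the relations below sketch the standard decomposition that such a separation rests on: the total magnetization is the vector sum of the induced and remanent parts, and the Koenigsberger ratio constrains their relative size. These are textbook relations assumed here for illustration; the authors' generalized apparent-susceptibility formula may differ in detail.

    % Standard relations, assumed here for illustration (not the paper's exact formulation).
    \begin{align}
      \mathbf{M} &= \mathbf{M}_i + \mathbf{M}_r = \chi\,\mathbf{H} + \mathbf{M}_r, \\
      Q &= \frac{|\mathbf{M}_r|}{|\mathbf{M}_i|} = \frac{|\mathbf{M}_r|}{\chi\,|\mathbf{H}|},
    \end{align}
    % so that, given the inverted total magnetization M and an a priori Koenigsberger
    % ratio Q, an apparent susceptibility chi consistent with Q yields the remanent
    % part as M_r = M - chi * H.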
U.S. Geological Survey water resources activities in Florida, 1985-86
Glenn, M. E.
1986-01-01
This report contains summary statements of water resources activities in Florida conducted by the Water Resources Division of the U.S. Geological Survey in cooperation with Federal, State, and local agencies during 1985-86. These activities are part of the Federal program of appraising the Nation's water resources. Water resources appraisals in Florida are highly diversified, ranging from hydrologic records networks to interpretive appraisals of water resources and applied research to develop investigative techniques. Thus, water resource investigations range from basic descriptive water-availability studies for areas of low-intensity water development and management to sophisticated cause and effect studies in areas of high-intensity water development and management. The interpretive reports and records that are products of the investigations are a principal hydrologic foundation upon which the plans for development, management, and protection of Florida's water resources may be based. (Lantz-PTT)
Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Couch, R; Becker, R; Rhee, M
2004-09-24
Lawrence Livermore National Laboratory participated in a U.S. Department of Energy/Office of Industrial Technology sponsored research project, 'Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery', as a Cooperative Agreement TC-02028 with the Alcoa Technical Center (ATC). The objective of the joint project with Alcoa is to develop a numerical modeling capability to optimize the hot rolling process used to produce aluminum plate. Product lost in the rolling process and subsequently recycled wastes resources consumed in the energy-intensive steps of remelting and reprocessing the ingot. The modeling capability developed by the project partners will be used to produce plate more efficiently and with reduced product loss.
Processing ARM VAP data on an AWS cluster
NASA Astrophysics Data System (ADS)
Martin, T.; Macduff, M.; Shippert, T.
2017-12-01
The Atmospheric Radiation Measurement (ARM) Data Management Facility (DMF) manages over 18,000 processes and 1.3 TB of data each day. This includes many Value Added Products (VAPs) that make use of multiple instruments to produce scientifically relevant derived products. A thermodynamic and cloud profile VAP is being developed to provide input to the ARM Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) project (https://www.arm.gov/capabilities/vaps/lasso-122). This algorithm is CPU-intensive, and its processing requirements exceeded the available DMF computing capacity. Amazon Web Services (AWS) together with CfnCluster was investigated to see how it would perform. This cluster environment is cost-effective and scales dynamically based on demand. We took advantage of autoscaling, which allowed the cluster to grow and shrink based on the size of the processing queue, and of the AWS spot market to further reduce cost. Our test was very successful and showed that cloud resources can be used to process time series data efficiently and effectively. This poster presents the resources and methodology used to successfully run the algorithm.
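The cost savings described here come largely from sizing the cluster to the processing queue. The toy sketch below illustrates that queue-driven sizing rule in plain Python; the node limits, the jobs-per-node packing and the function name are hypothetical, and the actual deployment relied on CfnCluster and the AWS spot market rather than this simplified logic.

    # Toy illustration of queue-driven autoscaling: grow the cluster when jobs
    # back up, shrink it when the queue empties. All numbers are hypothetical;
    # the real system used CfnCluster/AWS rather than this function.

    MIN_NODES, MAX_NODES = 0, 32
    JOBS_PER_NODE = 4          # assumed packing of VAP tasks per compute node

    def desired_node_count(queued_jobs: int, running_jobs: int) -> int:
        total = queued_jobs + running_jobs
        wanted = -(-total // JOBS_PER_NODE)          # ceiling division
        return max(MIN_NODES, min(MAX_NODES, wanted))

    # A burst of 50 queued jobs scales the cluster up; an empty queue lets it
    # shrink back to zero so idle nodes incur no cost.
    print(desired_node_count(queued_jobs=50, running_jobs=6))   # -> 14
    print(desired_node_count(queued_jobs=0, running_jobs=0))    # -> 0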
Comparison of approaches for mobile document image analysis using server supported smartphones
NASA Astrophysics Data System (ADS)
Ozarslan, Suleyman; Eren, P. Erhan
2014-03-01
With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcoming these limitations is performing the resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource-consuming process is Optical Character Recognition (OCR), which is used to extract text from mobile-phone-captured images. In this study, our goal is to compare the in-phone and remote-server processing approaches for mobile document image analysis in order to explore their trade-offs. In the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. In the remote-server approach, the core OCR process runs on the remote server and the other processes run on the mobile phone. Results of the experiments show that the remote-server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote-server approach overall outperforms the in-phone approach in terms of the selected speed and correct-recognition metrics, provided the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote-server approach performs better than the in-phone approach in terms of speed and acceptable correct-recognition metrics.
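To make the trade-off concrete, the sketch below times a local (in-phone style) OCR call against uploading a downscaled, compressed copy of the image to a remote OCR service, in the spirit of the comparison described above. The endpoint URL and its JSON response field are assumptions for illustration, not the study's actual client/server implementation.

    # Sketch of timing the in-phone versus remote-server OCR approaches.
    # OCR_ENDPOINT and its JSON "text" field are hypothetical placeholders.
    import io
    import time
    import requests
    import pytesseract
    from PIL import Image

    IMAGE_PATH = "document_photo.jpg"
    OCR_ENDPOINT = "https://example.org/ocr"   # placeholder remote OCR service

    def ocr_in_phone(path):
        start = time.perf_counter()
        text = pytesseract.image_to_string(Image.open(path))
        return text, time.perf_counter() - start

    def ocr_remote(path, max_side=1280):
        start = time.perf_counter()
        img = Image.open(path).convert("RGB")
        img.thumbnail((max_side, max_side))        # downscale to cut upload delay
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=70)   # compress before sending
        resp = requests.post(OCR_ENDPOINT, files={"image": buf.getvalue()})
        return resp.json().get("text", ""), time.perf_counter() - start

    if __name__ == "__main__":
        _, local_seconds = ocr_in_phone(IMAGE_PATH)
        _, remote_seconds = ocr_remote(IMAGE_PATH)   # includes network delay
        print(f"in-phone: {local_seconds:.2f}s  remote: {remote_seconds:.2f}s")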
Managing interior Northwest rangelands: the Oregon Range Evaluation Project.
Thomas M. Quigley; H. Reed Sanderson; Arthur R. Tiedemann
1989-01-01
This report is a synthesis of results from an 11-year study of the effects of increasing intensities of range management strategies on herbage production, water resources, economics, and associated resources-such as wood fiber and recreation-in Grant County, Oregon. Four intensities of management were studied on Federal land (19 grazing allotments) ranging from no...
Homo Heuristicus: Less-is-More Effects in Adaptive Cognition
Brighton, Henry; Gigerenzer, Gerd
2012-01-01
Heuristics are efficient cognitive processes that ignore information. In contrast to the widely held view that less processing reduces accuracy, the study of heuristics shows that less information, computation, and time can in fact improve accuracy. We discuss some of the major progress made so far, focusing on the discovery of less-is-more effects and the study of the ecological rationality of heuristics which examines in which environments a given strategy succeeds or fails, and why. Homo heuristicus has a biased mind and ignores part of the available information, yet a biased mind can handle uncertainty more efficiently and robustly than an unbiased mind relying on more resource-intensive and general-purpose processing strategies. PMID:23613644
Aluvaala, Jalemba; Collins, Gary S; Maina, Michuki; Berkley, James A; English, Mike
2017-12-07
Treatment intensity scores can predict mortality and estimate resource use. They may therefore be of interest for essential neonatal care in low-resource settings where neonatal mortality remains high. We sought to systematically review neonatal treatment intensity scores to (1) assess the level of evidence on their performance in predicting clinical outcomes and estimating resource utilisation and (2) assess the applicability of the identified models to decision making for neonatal care in low-resource settings. We conducted a systematic search of PubMed, EMBASE (OVID), CINAHL, Global Health Library (Global Index, WHO) and Google Scholar to identify studies published up until 21 December 2016. Included were all articles that used treatments as predictors in neonatal models. Individual studies were appraised using the CHecklist for critical Appraisal and data extraction for systematic Reviews of prediction Modelling Studies (CHARMS). In addition, Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used as a guiding framework to assess certainty in the evidence for predicting outcomes across studies. Three thousand two hundred forty-nine articles were screened, of which ten articles were included in the review. All of the studies were conducted in neonatal intensive care units with sample sizes ranging from 22 to 9978, with a median of 163. Two articles reported model development, while eight reported external application of existing models to new populations. Meta-analysis was not possible due to heterogeneity in the conduct and reporting of the identified studies. Discrimination, as assessed by the area under the receiver operating characteristic curve, was reported for in-hospital mortality, median 0.84 (range 0.75-0.96, three studies), and for early and late adverse outcomes (0.78 and 0.59, respectively, one study). Existing neonatal treatment intensity models show promise in predicting mortality and morbidity. There is, however, low certainty in the evidence on their performance in essential neonatal care in low-resource settings, as all studies had methodological limitations and were conducted in intensive care. The approach may, however, be developed further for low-resource settings like Kenya because treatment data may be easier to obtain compared to measures of physiological status. PROSPERO CRD42016034205.
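For illustration of the kind of model performance the review summarizes, the sketch below fits a logistic model that uses counts of treatments as predictors of mortality on simulated data and reports discrimination as the area under the ROC curve. The data, predictors and coefficients are entirely made up; nothing here reproduces the included studies.

    # Illustrative only: a treatment-intensity style model on simulated data,
    # with discrimination summarised by the area under the ROC curve (AUC).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 1000
    # Simulated counts of treatments received (e.g. oxygen, IV fluids, antibiotics).
    treatments = rng.poisson(lam=[1.0, 0.5, 0.8], size=(n, 3))
    risk = 0.6 * treatments.sum(axis=1) - 2.5
    mortality = rng.binomial(1, 1 / (1 + np.exp(-risk)))

    X_train, X_test, y_train, y_test = train_test_split(
        treatments, mortality, test_size=0.3, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"AUC on held-out data: {auc:.2f}")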
NASA Technical Reports Server (NTRS)
Kemeny, Sabrina E.
1994-01-01
Electronic and optoelectronic hardware implementations of highly parallel computing architectures address several ill-defined and/or computation-intensive problems not easily solved by conventional computing techniques. The concurrent processing architectures developed are derived from a variety of advanced computing paradigms including neural network models, fuzzy logic, and cellular automata. Hardware implementation technologies range from state-of-the-art digital/analog custom-VLSI to advanced optoelectronic devices such as computer-generated holograms and e-beam fabricated Dammann gratings. JPL's concurrent processing devices group has developed a broad technology base in hardware implementable parallel algorithms, low-power and high-speed VLSI designs and building block VLSI chips, leading to application-specific high-performance embeddable processors. Application areas include high throughput map-data classification using feedforward neural networks, terrain based tactical movement planner using cellular automata, resource optimization (weapon-target assignment) using a multidimensional feedback network with lateral inhibition, and classification of rocks using an inner-product scheme on thematic mapper data. In addition to addressing specific functional needs of DOD and NASA, the JPL-developed concurrent processing device technology is also being customized for a variety of commercial applications (in collaboration with industrial partners), and is being transferred to U.S. industries. This viewgraph presentation focuses on two application-specific processors which solve the computation intensive tasks of resource allocation (weapon-target assignment) and terrain based tactical movement planning using two extremely different topologies. Resource allocation is implemented as an asynchronous analog competitive assignment architecture inspired by the Hopfield network. Hardware realization leads to a two to four order of magnitude speed-up over conventional techniques and enables multiple assignments (many to many) not achievable with standard statistical approaches. Tactical movement planning (finding the best path from A to B) is accomplished with a digital two-dimensional concurrent processor array. By exploiting the natural parallel decomposition of the problem in silicon, a four order of magnitude speed-up over optimized software approaches has been demonstrated.
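For comparison with the analog competitive-assignment hardware described above, the sketch below solves the one-to-one version of the resource-allocation (weapon-target assignment) problem with a standard optimal-assignment routine. This is only a conventional software baseline on a made-up cost matrix; unlike the JPL processor, it cannot express many-to-many assignments.

    # Conventional one-to-one assignment baseline, for contrast with the analog
    # competitive network described above. Cost values are invented.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # cost[i, j]: cost of assigning resource i (e.g. weapon) to target j
    cost = np.array([[4.0, 1.0, 3.0],
                     [2.0, 0.5, 5.0],
                     [3.0, 2.0, 2.0]])

    rows, cols = linear_sum_assignment(cost)      # minimise total assignment cost
    for i, j in zip(rows, cols):
        print(f"resource {i} -> target {j} (cost {cost[i, j]})")
    print("total cost:", cost[rows, cols].sum())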
Ebels, Kelly; Faulx, Dunia; Gerth-Guyette, Emily; Murunga, Peninah; Mahapatro, Samarendra; Das, Manoja Kumar; Ginsburg, Amy Sarah
2016-01-01
Pneumonia is the leading cause of death from infection in children worldwide. Despite global treatment recommendations that call for children with pneumonia to receive amoxicillin dispersible tablets, only one-third of children with pneumonia receive any antibiotics and many do not complete the full course of treatment. Poor adherence to antibiotics may be driven in part by a lack of user-friendly treatment instructions. In order to optimise childhood pneumonia treatment adherence at the community level, we developed a user-friendly product presentation for caregivers and a job aid for healthcare providers (HCPs). This paper aims to document the development process and offers a model for future health communication tools. We employed an iterative design process that included document review, key stakeholder interviews, engagement with a graphic designer and pre-testing design concepts among target users in India and Kenya. The consolidated criteria for reporting qualitative research were used in the description of results. Though resources for pneumonia treatment are available in some countries, their content is incomplete and inconsistent with global recommendations. Document review and stakeholder interviews provided the information necessary to convey to caregivers and recommendations for how to present this information. Target users in India and Kenya confirmed the need to support better treatment adherence, recommended specific modifications to design concepts and suggested the development of a companion job aid. There was a consensus among caregivers and HCPs that these tools would be helpful and improve adherence behaviours. The development of user-friendly instructions for medications for use in low-resource settings is a critically important but time-intensive and resource-intensive process that should involve engagement with target audiences. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Digital Library Storage using iRODS Data Grids
NASA Astrophysics Data System (ADS)
Hedges, Mark; Blanke, Tobias; Hasan, Adil
Digital repository software provides a powerful and flexible infrastructure for managing and delivering complex digital resources and metadata. However, issues can arise in managing the very large, distributed data files that may constitute these resources. This paper describes an implementation approach that combines the Fedora digital repository software with a storage layer implemented as a data grid, using the iRODS middleware developed by DICE (Data Intensive Cyber Environments) as the successor to SRB. This approach allows us to use Fedora's flexible architecture to manage the structure of resources and to provide application-layer services to users. The grid-based storage layer provides efficient support for managing and processing the underlying distributed data objects, which may be very large (e.g. audio-visual material). The Rule Engine built into iRODS is used to integrate complex workflows at the data level that need not be visible to users, e.g. digital preservation functionality.
Performance Enhancements Under Dual-task Conditions
NASA Technical Reports Server (NTRS)
Kramer, A. F.; Wickens, C. D.; Donchin, E.
1984-01-01
Research on dual-task performance has been concerned with delineating the antecedent conditions which lead to dual-task decrements. Capacity models of attention, which propose that a hypothetical resource structure underlies performance, have been employed as predictive devices. These models predict that tasks which require different processing resources can be more successfully time shared than tasks which require common resources. The conditions under which such dual-task integrality can be fostered were assessed in a study in which three factors likely to influence the integrality between tasks were manipulated: inter-task redundancy, the physical proximity of tasks and the task relevant objects. Twelve subjects participated in three experimental sessions in which they performed both single and dual-tasks. The primary task was a pursuit step tracking task. The secondary tasks required the discrimination between different intensities or different spatial positions of a stimulus. The results are discussed in terms of a model of dual-task integrality.
The Faculty Role in Advocacy: What, Why, and How
NASA Astrophysics Data System (ADS)
Franklin, Scott
2015-04-01
The Capitol Hill environment is completely unlike that in the halls of academia, and advocating for science policy requires a style of communication quite different from scientific discourse. Nevertheless, the experience, while challenging, can be extremely rewarding, and change how one approaches changing our educational system. Fortunately, there are a growing number of resources that faculty can draw upon to make the process easier and more effective. I will discuss my first trip to Capitol Hill, including the details of setting up and managing appointments with congressional aides, and the resources I found useful during my visit. I'll also describe the initial culture shock and how I quickly came to appreciate the intensity and clarity of the visits. In addition to providing a roadmap for other faculty wishing to advocate for science policy, I'll describe additional resources that are in development.
NASA Astrophysics Data System (ADS)
Xue, Changsheng; Li, Qingquan; Li, Deren
2004-02-01
In 1988, detailed information on land resources was surveyed in China. Fourteen years later, the situation had changed considerably, making a second detailed land resource investigation necessary. Under these conditions, the New National Land and Resources Investigation Project in China, which will last 12 years, was started in 1999. The project is directly under the administration of the Ministry of Land and Resources (MLR) and was organized and implemented by China Geological Survey, the China Land Surveying and Planning Institute (CLSPI) and the Information Center of the MLR. It is a major cross-century project supported by central government finance and based on state and public interests and strategic considerations. Up to now, eight subprojects have produced preliminary achievements: "Land Use Dynamic Monitoring by Remote Sensing," "Arable Land Resource Investigation," "Rural Collective Land Property Right Investigation," "Establishment of Public Consulting Standardization of Cadastral Information," "Land Resource Fundamental Maps and Data Updating," "Urban Land Price Investigation and Intensive Utilization Potential Capacity Evaluation," "Farmland Classification, Gradation, and Evaluation," and "Land Use Database Construction at City or County Level." In this project, SPOT-1/2/4 and Landsat-7 TM data have been the main data sources for monitoring land use dynamic change, and IRS, CBERS-2, and IKONOS data have also been tested in small areas. In 2002, SPOT-5 data, with a spatial resolution of 2.5 meters for the panchromatic image and 10 meters for the multispectral image, were applied to update the land use base map at the 1:10,000 scale in 26 Chinese cities. The purpose of this paper is to share the experience of SPOT-5 image processing with colleagues.
Agile Electro-Mechanical Product Accelerator - Final Research Performance Progress Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmidt, Brian
2016-07-29
NCDMM recognized the need to focus on the most efficient use of limited resources while ensuring compliance with regulations and minimizing the energy intensity and environmental impact of manufactured components. This was accomplished through the evaluation of current machining and processing practices, and their efficiencies, to further the sustainability of manufacturing as a whole. Additionally, the activities also identified, and furthered the implementation of new “best practices” within the southwestern Pennsylvania manufacturing sector.
Exploring ethical conflicts in emergency trauma research: The COMBAT study experience
Chin, Theresa L.; Moore, Ernest E.; Coors, Marilyn; Chandler, James G.; Ghasabyan, Arsen; Harr, Jeffrey N.; Stringham, John R.; Ramos, Christopher; Ammons, Sarah; Banerjee, Anirban; Sauaia, Angela
2014-01-01
Background Up to 25% of severely injured patients develop trauma-induced coagulopathy. To study interventions for this vulnerable population for whom consent cannot be obtained easily, the Food and Drug Administration (FDA) issued regulations for emergency research with an exception from informed consent (ER-EIC). We describe the community consultation and public disclosure (CCPD) process in preparation for an ER-EIC study, namely the Control of Major Bleeding after Trauma (COMBAT) study. Methods The CCPD was guided by the four bioethical principles. We employed a multimedia approach including one-way communications (newspaper ads, brochures, television, radio, and web) and two-way communications (interactive in-person presentations at community meetings, printed and online feedback forms) to reach the trial's catchment area (Denver County's population: 643,000 and the larger Denver metro area where commuters reside: 2.9 million). Particular attention was given to special-interest groups (e.g., Jehovah's Witnesses, homeless persons) and to Spanish-speaking communities (brochures and presentations in Spanish). Opt-out materials were available during on-site presentations or via the COMBAT website. Results 227 community organizations were contacted. Brochures were distributed to 11 medical clinics and 3 homeless shelters. The multimedia campaign had the potential to reach an estimated audience of 1.5 million individuals in the large metro Denver area, the majority via one-way communication and 1900 in two-way communications. This resource-intensive process cost over $84,000. Conclusions The CCPD process is resource-intensive, costly, and complex. While the multimedia CCPD reached a large audience, the effectiveness of this process remains elusive. The templates can be helpful to similar ER-EIC studies. PMID:25444222
Chin, Theresa L; Moore, Ernest E; Coors, Marilyn E; Chandler, James G; Ghasabyan, Arsen; Harr, Jeffrey N; Stringham, John R; Ramos, Christopher R; Ammons, Sarah; Banerjee, Anirban; Sauaia, Angela
2015-01-01
Up to 25% of severely injured patients develop trauma-induced coagulopathy. To study interventions for this vulnerable population for whom consent cannot be obtained easily, the Food and Drug Administration issued regulations for emergency research with an exception from informed consent (ER-EIC). We describe the community consultation and public disclosure (CC/PD) process in preparation for an ER-EIC study, namely the Control Of Major Bleeding After Trauma (COMBAT) study. The CC/PD was guided by the four bioethical principles. We used a multimedia approach, including one-way communications (newspaper ads, brochures, television, radio, and web) and two-way communications (interactive in-person presentations at community meetings, printed and online feedback forms) to reach the trial's catchment area (Denver County's population: 643,000 and the larger Denver metro area where commuters reside: 2.9 million). Particular attention was given to special-interest groups (eg, Jehovah's Witnesses, homeless persons) and to Spanish-speaking communities (brochures and presentations in Spanish). Opt-out materials were available during on-site presentations or via the COMBAT study website. A total of 227 community organizations were contacted. Brochures were distributed to 11 medical clinics and 3 homeless shelters. The multimedia campaign had the potential to reach an estimated audience of 1.5 million individuals in the large metro Denver area, the majority via one-way communication and 1900 in two-way communications. This resource-intensive process cost more than $84,000. The CC/PD process is resource-intensive, costly, and complex. Although the multimedia CC/PD reached a large audience, the effectiveness of this process remains elusive. The templates can be helpful to similar ER-EIC studies. Copyright © 2015 Elsevier Inc. All rights reserved.
Burholt, Vanessa; Scharf, Thomas
2014-03-01
We draw on cognitive discrepancy theory to hypothesize and test a pathway from poor health to loneliness in later life. We hypothesize that poor health will have a negative influence on social participation and social resources, and these factors will mediate between health and loneliness. We hypothesize that rural environments will amplify any difficulties associated with social participation or accessing social resources and that depression will moderate how intensely people react to levels of social contact and support. We conceptualize a mediation model and a moderated-mediation model. Nationally representative data on older people living in the Republic of Ireland are used to validate the hypothesized pathways. In the mediation model, health has a significant indirect effect on loneliness through the mediating variables social resources and social participation. In the moderated-mediation model, rurality moderates the pathway between health and social resources but not social participation. Depressive symptoms moderate the effect of social resources on loneliness but not social participation. The results provide further credence to cognitive discrepancy theory, suggesting that depressive symptoms influence cognitive processes, interfering with judgments about the adequacy of social interaction. The theory is extended by demonstrating the impact of the environment on loneliness.
Earth observation for regional scale environmental and natural resources management
NASA Astrophysics Data System (ADS)
Bernknopf, R.; Brookshire, D.; Faulkner, S.; Chivoiu, B.; Bridge, B.; Broadbent, C.
2013-12-01
Earth observations (EO) provide critical information to natural resource assessment. Three examples are presented: conserving potable groundwater in intense agricultural regions, maximizing ecosystem service benefits at regional scales from afforestation investment and management, and enabling integrated natural and behavioral sciences for resource management and policy analysis. In each of these cases EO of different resolutions are used in different ways to help in the classification, characterization, and availability of natural resources and ecosystem services. To inform decisions, each example includes a spatiotemporal economic model to optimize the net societal benefits of resource development and exploitation. 1) EO is used for monitoring land use in intensively cultivated agricultural regions. Archival imagery is coupled to a hydrogeological process model to evaluate the tradeoff between agrochemical use and retention of potable groundwater. EO is used to couple individual producers and regional resource managers using information from markets and natural systems to aid in the objective of maximizing agricultural production and maintaining groundwater quality. The contribution of EO is input to a nitrate loading and transport model to estimate the cumulative impact on groundwater at specified distances from specific sites (wells) for 35 Iowa counties and two aquifers. 2) Land use/land cover (LULC) derived from EO is used to compare biological carbon sequestration alternatives and their provisioning of ecosystem services. EO is used to target land attributes that are more or less desirable for enhancing ecosystem services in two parishes in Louisiana. Ecological production functions are coupled with value data to maximize the expected return on investment in carbon sequestration and other ancillary ecosystem services while minimizing the risk. 3) Environmental and natural resources management decisions employ probabilistic estimates of yet-to-find or yet-to-develop volumes of natural and environmental resources and ecosystem services. The potential quantities of resources available are of great societal relevance, as are the resources that are necessarily disturbed in the development of economic reserves. EO is input to a multidimensional decision framework for natural resources and ecosystem services. Imagery supports a spatiotemporal model of regional resource extraction and the associated impacts on ecosystem services. The framework is used to assess societal tradeoffs by evaluating the benefits and costs of future development or preservation in a comparison of regional development options.
A global dataset of sub-daily rainfall indices
NASA Astrophysics Data System (ADS)
Fowler, H. J.; Lewis, E.; Blenkinsop, S.; Guerreiro, S.; Li, X.; Barbero, R.; Chan, S.; Lenderink, G.; Westra, S.
2017-12-01
It is still uncertain how hydrological extremes will change with global warming, as we do not fully understand the processes that cause extreme precipitation under current climate variability. The INTENSE project is using a novel, fully integrated data-modelling approach to provide a step change in our understanding of the nature and drivers of global precipitation extremes and change on societally relevant timescales, leading to improved high-resolution climate model representation of extreme rainfall processes. The INTENSE project is carried out in conjunction with the World Climate Research Programme (WCRP) Grand Challenge on 'Understanding and Predicting Weather and Climate Extremes' and the Global Water and Energy Exchanges Project (GEWEX) science questions. A new global sub-daily precipitation dataset has been constructed (data collection is ongoing). Metadata for each station have been calculated, detailing record lengths, missing data and station locations. A set of global hydroclimatic indices has been produced based upon stakeholder recommendations, including indices that describe maximum rainfall totals and timing; the intensity, duration and frequency of storms; the frequency of storms above specific thresholds; and information about the diurnal cycle. This will provide a unique global data resource on sub-daily precipitation whose derived indices will be freely available to the wider scientific community.
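As an informal illustration of a few of the indices listed above (maximum rainfall totals and their timing, counts above a threshold, and the diurnal cycle), the sketch below computes them from a single hypothetical hourly gauge record with pandas. The column name, units and the 10 mm threshold are assumptions; the project's actual index definitions and quality control are not reproduced here.

    # Sketch: a few sub-daily indices from an hourly rain-gauge series using pandas.
    # Units (mm per hour) and the 10 mm threshold are assumptions.
    import numpy as np
    import pandas as pd

    # Hypothetical hourly record for one station.
    idx = pd.date_range("2015-01-01", "2015-12-31 23:00", freq="H")
    rain = pd.Series(np.random.default_rng(1).gamma(0.1, 2.0, len(idx)), index=idx)

    rx1hour = rain.resample("Y").max()                     # annual maximum 1-hour total
    rx1hour_time = rain.groupby(rain.index.year).idxmax()  # timing of that maximum
    wet_hours_gt10 = (rain > 10).resample("Y").sum()       # hours above a 10 mm threshold
    diurnal_cycle = rain.groupby(rain.index.hour).mean()   # mean rainfall by hour of day

    print(rx1hour, rx1hour_time, wet_hours_gt10, sep="\n")
    print(diurnal_cycle.round(2))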
Optimization of tomographic reconstruction workflows on geographically distributed resources
Bicer, Tekin; Gürsoy, Doğa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.
2016-01-01
New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, we focus on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Moreover, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), and the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.
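The three-stage execution-time model (data transfer, queue wait, computation) lends itself to a back-of-the-envelope estimator; the sketch below uses invented parameter values, not measurements from the Advanced Photon Source workflows.

```python
# Toy version of the three-stage workflow time model: transfer + queue + compute.
# All parameter values are hypothetical placeholders.

def estimate_workflow_time(dataset_gb, bandwidth_gbps, queue_wait_s,
                           slice_count, secs_per_slice_per_iter, iterations, cores):
    transfer_s = dataset_gb * 8.0 / bandwidth_gbps              # storage -> compute transfer
    compute_s = slice_count * secs_per_slice_per_iter * iterations / cores
    return transfer_s + queue_wait_s + compute_s

# Compare two hypothetical remote resources for the same reconstruction job.
resources = {
    "cluster_A": dict(bandwidth_gbps=2.0, queue_wait_s=600, cores=512),
    "cluster_B": dict(bandwidth_gbps=8.0, queue_wait_s=3600, cores=2048),
}
for name, r in resources.items():
    t = estimate_workflow_time(dataset_gb=400, slice_count=2048,
                               secs_per_slice_per_iter=1.2, iterations=100, **r)
    print(f"{name}: estimated {t / 3600:.1f} h")
```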
South Asia river-flow projections and their implications for water resources
NASA Astrophysics Data System (ADS)
Mathison, C.; Wiltshire, A. J.; Falloon, P.; Challinor, A. J.
2015-12-01
South Asia is a region with a large and rising population, a high dependence on water-intensive industries such as agriculture, and a highly variable climate. In recent years, fears over the changing Asian summer monsoon (ASM) and rapidly retreating glaciers, together with increasing demands for water resources, have caused concern over the reliability of water resources and the potential impact on intensely irrigated crops in this region. Despite these concerns, there is a lack of climate simulations with a high enough resolution to capture the complex orography, and water resource analysis is limited by a lack of observations of the water cycle for the region. In this paper we present the first 25 km resolution regional climate projections of river flow for the South Asia region. Two global climate models (GCMs), which represent the ASM reasonably well, are downscaled (1960-2100) using a regional climate model (RCM). In the absence of robust observations, ERA-Interim reanalysis is also downscaled, providing a constrained estimate of the water balance for the region for comparison against the GCMs (1990-2006). The RCM river flow is routed using a river-routing model to allow analysis of present-day and future river flows through comparison with available river gauge observations. We examine how useful these simulations are for understanding potential changes in water resources for the South Asia region. In general, the downscaled GCMs capture the seasonality of the river flows but overestimate the maximum river flows compared to the observations, probably due to a positive rainfall bias and a lack of abstraction in the model. The simulations suggest an increasing trend in annual mean river flows for some of the river gauges in this analysis, in some cases almost doubling by the end of the century. The future maximum river-flow rates still occur during the ASM period, with a magnitude, in some cases, greater than the present-day natural variability. Increases in river flow could mean additional water resources for irrigation, the largest use of water in this region, but they also have implications in terms of inundation risk. These projected increases could be more than countered by changes in demand due to depleted groundwater, increases in domestic use or expansion of water-intensive industries. Including missing hydrological processes in the model would make these projections more robust but could also change the sign of the projections.
ERIC Educational Resources Information Center
Chou, Yueh-Ching; Lee, Yue-Chune; Chang, Shu-chuan; Yu, Amy Pei-Lung
2013-01-01
This study evaluated the potential of using the Supports Intensity Scale (SIS) for resource allocation for people with intellectual disabilities (ID) in Taiwan. SIS scores were compared with those obtained from three tools that are currently used in Taiwan for homecare services: the medical diagnosis issued by local authorities and two scales…
Sprung, Charles L; Zimmerman, Janice L; Christian, Michael D; Joynt, Gavin M; Hick, John L; Taylor, Bruce; Richards, Guy A; Sandrock, Christian; Cohen, Robert; Adini, Bruria
2010-03-01
To provide recommendations and standard operating procedures for intensive care units and hospital preparedness for an influenza pandemic. Based on a literature review and expert opinion, a Delphi process was used to define the essential topics. Key recommendations include: Hospitals should increase their ICU beds to the maximal extent by expanding ICU capacity and expanding ICUs into other areas. Hospitals should have appropriate beds and monitors for these expansion areas. Establish a management system with control groups at facility, local, regional and/or national levels to exercise authority over resources. Establish a system of communication, coordination and collaboration between the ICU and key interface departments. A plan to access, coordinate and increase labor resources is required with a central inventory of all clinical and non-clinical staff. Delegate duties not within the usual scope of workers' practice. Ensure that adequate essential medical equipment, pharmaceuticals and supplies are available. Protect patients and staff with infection control practices and supporting occupational health policies. Maintain staff confidence with reassurance plans for legal protection and assistance. Have objective, ethical, transparent triage criteria that are applied equitably and publicly disclosed. ICU triage of patients should be based on the likelihood for patients to benefit most or a 'first come, first served' basis. Develop protocols for safe performance of high-risk procedures. Train and educate staff. Mortality, although inevitable during a severe influenza outbreak or disaster, can be reduced by adequate preparation.
Opportunities and challenges for the life sciences community.
Kolker, Eugene; Stewart, Elizabeth; Ozdemir, Vural
2012-03-01
Twenty-first century life sciences have transformed into data-enabled (also called data-intensive, data-driven, or big data) sciences. They principally depend on data-, computation-, and instrumentation-intensive approaches to seek comprehensive understanding of complex biological processes and systems (e.g., ecosystems, complex diseases, environmental, and health challenges). Federal agencies including the National Science Foundation (NSF) have played and continue to play an exceptional leadership role by innovatively addressing the challenges of data-enabled life sciences. Yet even more is required not only to keep up with the current developments, but also to pro-actively enable future research needs. Straightforward access to data, computing, and analysis resources will enable true democratization of research competitions; thus investigators will compete based on the merits and broader impact of their ideas and approaches rather than on the scale of their institutional resources. This is the Final Report for Data-Intensive Science Workshops DISW1 and DISW2. The first NSF-funded Data Intensive Science Workshop (DISW1, Seattle, WA, September 19-20, 2010) overviewed the status of the data-enabled life sciences and identified their challenges and opportunities. This served as a baseline for the second NSF-funded DIS workshop (DISW2, Washington, DC, May 16-17, 2011). Based on the findings of DISW2 the following overarching recommendation to the NSF was proposed: establish a community alliance to be the voice and framework of the data-enabled life sciences. After this Final Report was finished, Data-Enabled Life Sciences Alliance (DELSA, www.delsall.org ) was formed to become a Digital Commons for the life sciences community.
Koni, Phillip; Chishinga, Nathaniel; Nyirenda, Lameck; Kasonde, Prisca; Nsakanya, Richard; Welsh, Michael
2015-01-01
The FHI360-led Zambia Prevention Care and Treatment partnership II (ZPCT II), with funding from the United States Agency for International Development, supports the Zambian Ministry of Health in scaling up HIV/AIDS services. To improve the quality of HIV/AIDS services, ZPCT II provides technical assistance until desired standards are met and districts are weaned off intensive technical support, a process referred to as district graduation. This study describes the graduation process and determines performance domains associated with district graduation. Data were collected from 275 health facilities in 39 districts in 5 provinces of Zambia between 2008 and 2012. Performance in the technical capacity, commodity management, data management and human resources domains was assessed in the following service areas: HIV counselling and testing and prevention of mother to child transmission, antiretroviral therapy/clinical care, pharmacy and laboratory. The overall mean percentage score was calculated by obtaining the mean of mean percentage scores for the four domains. Logistic regression models were used to obtain odds ratios (OR) and 95% confidence intervals (CI) for the domain mean percentage scores in graduated versus non-graduated districts, according to rural-urban and province strata. Twenty-four of the 39 districts graduated from intensive donor-supported technical assistance while 15 districts did not graduate. The overall mean percentage score for all four domains was statistically significantly higher in graduated than non-graduated districts (93.2% versus 91.2%, OR = 1.34, 95%CI:1.20-1.49), including rural settings (92.4% versus 89.4%, OR = 1.43, 95%CI:1.24-1.65). The mean percentage score in the human resources domain was statistically significantly higher in graduated than non-graduated districts (93.6% versus 71.6%, OR = 5.81, 95%CI:4.29-7.86) and in both rural and urban settings. QA/QI tools can be used to assess performance at health facilities and determine readiness for district graduation. The human resources management domain was found to be an important factor associated with district graduation.
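A minimal sketch of the kind of odds-ratio calculation reported above, fitted with statsmodels on made-up facility-level data (the paper does not specify its software, and the ZPCT II data are not reproduced here):

```python
import numpy as np
import statsmodels.api as sm

# Made-up facility-level data: mean percentage score in one domain and whether
# the facility's district graduated (1) or not (0). Not the ZPCT II dataset.
rng = np.random.default_rng(1)
score = np.clip(rng.normal(90, 5, 300), 60, 100)
graduated = rng.binomial(1, 1 / (1 + np.exp(-(score - 90) * 0.3)))

X = sm.add_constant(score)
fit = sm.Logit(graduated, X).fit(disp=False)

odds_ratio = np.exp(fit.params[1])            # OR per one-point increase in score
ci_low, ci_high = np.exp(fit.conf_int()[1])   # 95% CI on the same scale
print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```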
Rangel-Landa, Selene; Casas, Alejandro; García-Frapolli, Eduardo; Lira, Rafael
2017-10-30
Identifying factors influencing plant management allows understanding how processes of domestication operate. Uncertain availability of resources is a main motivation for managing edible plants, but little is known about management motives for non-edible resources like medicinal and ceremonial plants. We hypothesized that uncertain availability of resources would be a general factor motivating their management, but other motives could operate simultaneously. Uncertainty and risk might be less important motives in medicinal than in edible plants, while for ceremonial plants, symbolic and spiritual values would be more relevant. We inventoried edible, medicinal, and ceremonial plants in Ixcatlán, Oaxaca, Mexico, and conducted in-depth studies with 20 native and naturalized species per use type; we documented their cultural importance and abundance by interviewing 25 households and sampling vegetation in 33 sites. Consumption amounts and preferences were studied through surveys and free listings with 38 interviewees. Management intensity and risk indexes were calculated through PCA and their relation analyzed through regression analyses. Canonical methods allowed identifying the main sociocultural and ecological factors influencing management of plants per use type. Nearly 64, 63, and 55% of all ceremonial, edible, and medicinal wild plants recorded, respectively, are managed in order to maintain or increase their availability, to embellish environments, and for reasons of ethics and curiosity. Management intensity was higher in edible plants under human selection and associated with risk. Management of ceremonial and medicinal plants was not associated with indexes of risk or uncertainty in their availability. Other sociocultural and ecological factors influence management intensity, the most important being reciprocal relations and abundance perception. Plant management through practices and collectively regulated strategies is strongly related to control of risk and uncertainty in edible plants, compared with medicinal and ceremonial plants, in which reciprocal interchanges, curiosity, and spiritual values are more important factors. Understanding how needs, worries, social relations, and ethical values influence management decisions is important for understanding how management strategies are constructed and how processes of domestication could have started in the past and operate at present.
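A compressed, hypothetical sketch of the index-building step described above: a management-intensity index from the first principal component of several management variables, regressed against a similarly derived risk index. The variables and data are invented for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 60  # hypothetical number of species

# Invented management descriptors per species (e.g., tolerance, protection,
# transplanting, sowing intensity), loosely correlated with a latent factor.
latent = rng.normal(size=n)
management_vars = np.column_stack([latent + rng.normal(0, 0.5, n) for _ in range(4)])

# Invented risk descriptors (e.g., scarcity, demand, distance to populations).
risk_vars = np.column_stack([latent + rng.normal(0, 1.0, n) for _ in range(3)])

# First principal component of each block serves as the index.
management_index = PCA(n_components=1).fit_transform(management_vars).ravel()
risk_index = PCA(n_components=1).fit_transform(risk_vars).ravel()

reg = LinearRegression().fit(risk_index.reshape(-1, 1), management_index)
print("R^2 of management intensity vs risk:",
      reg.score(risk_index.reshape(-1, 1), management_index))
```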
NASA Astrophysics Data System (ADS)
Caldeira, Rylan; Honnungar, Sunilkumar
2018-04-01
Most small to medium industries tend to follow traditional systems of manufacturing, which aim at maximum resource utilization regardless of customers' volatile demand. In recent times manufacturing has shifted to being consumer-centered, with intense competition among industries to satisfy customer needs in the required quantity and at the right time. To achieve this, companies investigate the possibility of implementing cellular manufacturing, which is characterized by high variety with optimum usage of resources. A cellular layout coupled with the application of lean methodology places the focus on the production process rather than the production methods, so as to identify waste and apply methods to further improve productivity.
Joynt, Gavin M; Loo, Shi; Taylor, Bruce L; Margalit, Gila; Christian, Michael D; Sandrock, Christian; Danis, Marion; Leoniv, Yuval; Sprung, Charles L
2010-04-01
To provide recommendations and standard operating procedures (SOPs) for intensive care unit (ICU) and hospital preparations for an influenza pandemic or mass disaster with a specific focus on enhancing coordination and collaboration between the ICU and other key stakeholders. Based on a literature review and expert opinion, a Delphi process was used to define the essential topics including coordination and collaboration. Key recommendations include: (1) establish an Incident Management System with Emergency Executive Control Groups at facility, local, regional/state or national levels to exercise authority and direction over resource use and communications; (2) develop a system of communication, coordination and collaboration between the ICU and key interface departments within the hospital; (3) identify key functions or processes requiring coordination and collaboration, the most important of these being manpower and resources utilization (surge capacity) and re-allocation of personnel, equipment and physical space; (4) develop processes to allow smooth inter-departmental patient transfers; (5) creating systems and guidelines is not sufficient, it is important to: (a) identify the roles and responsibilities of key individuals necessary for the implementation of the guidelines; (b) ensure that these individuals are adequately trained and prepared to perform their roles; (c) ensure adequate equipment to allow key coordination and collaboration activities; (d) ensure an adequate physical environment to allow staff to properly implement guidelines; (6) trigger events for determining a crisis should be defined. Judicious planning and adoption of protocols for coordination and collaboration with interface units are necessary to optimize outcomes during a pandemic.
P3 and provoked aggressive behavior.
Fanning, Jennifer R; Berman, Mitchell E; Long, James M
2014-01-01
Cognitive and biological processes play a role in human aggression. However, relatively little is known about the neural correlates of cognitive processes in aggressive individuals, particularly as they unfold during an aggressive encounter. We investigated whether the P3 event-related potential (ERP) discriminates aggressive versus nonaggressive individuals during a provocative, aggressive encounter. Forty-eight participants (23 men and 25 women) were classified as aggressive or nonaggressive based on self-reported life history of aggression. Aggressive behavior was assessed using a modification of a well-validated laboratory task during which the participant and a fictitious opponent ostensibly delivered and received noise blasts of low, medium, and high intensity. Provocation was manipulated by altering the level of noise set by the opponent. Aggression was defined as the number of high-intensity noise blasts the participant set for the opponent. As predicted, P3 amplitude in response to provocation differed as a function of aggressive history. Nonaggressive individuals showed enhanced P3 when provoked by the opponent relative to low provocation, but this effect was absent in aggressive individuals. The results suggest that aggressive individuals engage fewer neural processing resources in response to provoking social cues, which may reflect aberrant cognitive and emotional processes.
[Rationing, prioritisation, rationalizing: Significance in everyday intensive care].
Gretenkort, P
2015-11-01
Rationing, even in the treatment of critically ill patients, is a reality in intensive care units. The severity of illness and urgency of care pose high ethical barriers to explicit cost-saving orders. Nevertheless, implicit rationing decisions are a daily ethical minefield, which is not always appreciated by healthcare providers. In this article, typical decision-making situations are described in which the limitation of resources plays a role. The idea of saving resources by rationalising rather than rationing results from the fact that not every patient benefits from the full scope of services available in the intensive care unit, and not every patient desires the full scope of care to be supplied to them. Thus, the irrational use of resources can sometimes be avoided, saving them for cases where they are necessary.
Life cycle assessment of Chinese shrimp farming systems targeted for export and domestic sales.
Cao, Ling; Diana, James S; Keoleian, Gregory A; Lai, Qiuming
2011-08-01
We conducted surveys of six hatcheries and 18 farms for data inputs to complete a cradle-to-farm-gate life cycle assessment (LCA) to evaluate the environmental performance of intensive (for export markets in Chicago) and semi-intensive (for domestic markets in Shanghai) shrimp farming systems in Hainan Province, China. The relative contributions to overall environmental performance of processing and distribution to final markets were also evaluated from a cradle-to-destination-port perspective. Environmental impact categories included global warming, acidification, eutrophication, cumulative energy use, and biotic resource use. Our results indicated that intensive farming had significantly higher environmental impacts per unit production than semi-intensive farming in all impact categories. The grow-out stage contributed between 96.4% and 99.6% of the cradle-to-farm-gate impacts. These impacts were mainly caused by feed production, electricity use, and farm-level effluents. By averaging over intensive (15%) and semi-intensive (85%) farming systems, 1 metric ton (t) live-weight of shrimp production in China required 38.3 ± 4.3 GJ of energy, as well as 40.4 ± 1.7 t of net primary productivity, and generated 23.1 ± 2.6 kg of SO2 equiv, 36.9 ± 4.3 kg of PO4 equiv, and 3.1 ± 0.4 t of CO2 equiv. Processing made a higher contribution to cradle-to-destination-port impacts than distribution of processed shrimp from farm gate to final markets in both supply chains. In 2008, the estimated total electricity consumption, energy consumption, and greenhouse gas emissions from Chinese white-leg shrimp production would be 1.1 billion kW·h, 49 million GJ, and 4 million metric tons, respectively. Improvements suggested for Chinese shrimp aquaculture include changes in feed composition, farm management, electricity-generating sources, and effluent treatment before discharge. Our results can be used to optimize market-oriented shrimp supply chains and promote more sustainable shrimp production and consumption.
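The production-weighted averaging over intensive (15%) and semi-intensive (85%) systems can be reproduced in outline with a few lines of arithmetic; the per-system impact values below are invented placeholders, not the paper's inventory results.

```python
# Production-weighted cradle-to-farm-gate impacts per tonne of live-weight shrimp.
# Per-system values are hypothetical placeholders, not the paper's LCA results.
share = {"intensive": 0.15, "semi_intensive": 0.85}
impacts = {
    "intensive":      {"energy_GJ": 55.0, "CO2eq_t": 4.2, "PO4eq_kg": 48.0, "SO2eq_kg": 31.0},
    "semi_intensive": {"energy_GJ": 35.0, "CO2eq_t": 2.9, "PO4eq_kg": 35.0, "SO2eq_kg": 21.5},
}

weighted = {k: sum(share[s] * impacts[s][k] for s in share) for k in impacts["intensive"]}
print("weighted impacts per tonne:", weighted)

# Scale up to a hypothetical national production volume (tonnes live weight).
national_production_t = 1.3e6
print("national CO2-eq (t):", weighted["CO2eq_t"] * national_production_t)
```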
Siar, Susana V
2003-05-01
The coastal zone is a place of intense activity where resources, users, and resource-use practices interact. This case study of small-scale fisheries in Honda Bay, Palawan, Philippines shows that resources, space, and gender are intertwined. The study was conducted between June 1997 and July 1998. The data were gathered using free listing, pile sort, ranking, resource mapping, and key informant interviews. The results showed that women's knowledge about fishery resources and their fishing activities are associated with the intertidal zone whereas men's knowledge is associated with coral reefs. In classifying fishery resources, appearance is the main consideration for women whereas a combination of appearance, habitat, and type of fishing gear is the consideration used by men. Market price is very important because of its dependence on the demand of the export market as well as the local market. Women dominate the buying of fishery products. Many women market their husband's catch, process fish, or gather shells and sea cucumber for sale. Among the fishing households, type of fishing gear provides an indication of socioeconomic standing. This paper concludes that access to resources is shaped by gender and age. The differences in resource knowledge possessed by men and women lead to differential access to fishery resources. In addition, the differences in socioeconomic status also influence resource access. The socialization of children into fishing reinforces the gender division of labor and space in the coastal zone.
Christian, Michael D; Joynt, Gavin M; Hick, John L; Colvin, John; Danis, Marion; Sprung, Charles L
2010-04-01
To provide recommendations and standard operating procedures for intensive care unit (ICU) and hospital preparations for an influenza pandemic or mass disaster with a specific focus on critical care triage. Based on a literature review and expert opinion, a Delphi process was used to define the essential topics including critical care triage. Key recommendations include: (1) establish an Incident Management System with Emergency Executive Control Groups at facility, local, regional/state or national levels to exercise authority and direction over resources; (2) developing fair and equitable policies may require restricting ICU services to patients most likely to benefit; (3) usual treatments and standards of practice may be impossible to deliver; (4) ICU care and treatments may have to be withheld from patients likely to die even with ICU care and withdrawn after a trial in patients who do not improve or deteriorate; (5) triage criteria should be objective, ethical, transparent, applied equitably and be publicly disclosed; (6) trigger triage protocols for pandemic influenza only when critical care resources across a broad geographic area are or will be overwhelmed despite all reasonable efforts to extend resources or obtain additional resources; (7) triage of patients for ICU should be based on those who are likely to benefit most or a 'first come, first served' basis; (8) a triage officer should apply inclusion and exclusion criteria to determine patient qualification for ICU admission. Judicious planning and adoption of protocols for critical care triage are necessary to optimize outcomes during a pandemic.
Isolated Ficus trees deliver dual conservation and development benefits in a rural landscape.
Cottee-Jones, H Eden W; Bajpai, Omesh; Chaudhary, Lal B; Whittaker, Robert J
2015-11-01
Many of the world's rural populations are dependent on the local provision of economically and medicinally important plant resources. However, increasing land-use intensity is depleting these resources, reducing human welfare, and thereby constraining development. Here we investigate a low cost strategy to manage the availability of valuable plant resources, facilitated by the use of isolated Ficus trees as restoration nuclei. We surveyed the plants growing under 207 isolated trees in Assam, India, and categorized them according to their local human-uses. We found that Ficus trees were associated with double the density of important high-grade timber, firewood, human food, livestock fodder, and medicinal plants compared to non-Ficus trees. Management practices were also important in determining the density of valuable plants, with grazing pressure and land-use intensity significantly affecting densities in most categories. Community management practices that conserve isolated Ficus trees, and restrict livestock grazing and high-intensity land-use in their vicinity, can promote plant growth and the provision of important local resources.
Optimizing a mobile robot control system using GPU acceleration
NASA Astrophysics Data System (ADS)
Tuck, Nat; McGuinness, Michael; Martin, Fred
2012-01-01
This paper describes our attempt to optimize a robot control program for the Intelligent Ground Vehicle Competition (IGVC) by running computationally intensive portions of the system on a commodity graphics processing unit (GPU). The IGVC Autonomous Challenge requires a control program that performs a number of different computationally intensive tasks ranging from computer vision to path planning. For the 2011 competition our Robot Operating System (ROS) based control system would not run comfortably on the multicore CPU on our custom robot platform. The process of profiling the ROS control program and selecting appropriate modules for porting to run on a GPU is described. A GPU-targeting compiler, Bacon, is used to speed up development and help optimize the ported modules. The impact of the ported modules on overall performance is discussed. We conclude that GPU optimization can free a significant amount of CPU resources with minimal effort for expensive user-written code, but that replacing heavily-optimized library functions is more difficult, and a much less efficient use of time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.
2008-05-04
This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP scripting language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.
Incorporating A Structured Writing Process into Existing CLS Curricula.
Honeycutt, Karen; Latshaw, Sandra
2014-01-01
Good communication and critical thinking are essential skills for all successful professionals, including Clinical Laboratory Science/Medical Laboratory Science (CLS/MLS) practitioners. Professional programs can incorporate writing assignments into their curricula to improve student written communication and critical thinking skills. Clearly defined, scenario-focused writing assignments provide student practice in clearly articulating responses to proposed problems or situations, researching and utilizing informational resources, and applying and synthesizing relevant information. Assessment rubrics, structured feedback, and revision writing methodologies help guide students through the writing process. This article describes how a CLS program in a public academic medical center located in the central United States (US), serving five US states, has incorporated writing-intensive assignments into an existing 11-month academic year, using formal, informal and reflective writing to improve student written communication and critical thinking skills. Faculty members and employers of graduates assert that incorporating writing-intensive requirements has better prepared students for their professional role to communicate effectively and think critically.
Quality Improvement in Critical Care: Selection and Development of Quality Indicators
Martin, Claudio M.; Project, The Quality Improvement in Critical Care
2016-01-01
Background. Caring for critically ill patients is complex and resource intensive. An approach to monitor and compare the function of different intensive care units (ICUs) is needed to optimize outcomes for patients and the health system as a whole. Objective. To develop and implement quality indicators for comparing ICU characteristics and performance within and between ICUs and regions over time. Methods. Canadian jurisdictions with established ICU clinical databases were invited to participate in an iterative series of face-to-face meetings, teleconferences, and web conferences. Eighteen adult intensive care units across 14 hospitals and 5 provinces participated in the process. Results. Six domains of ICU function were identified: safe, timely, efficient, effective, patient/family satisfaction, and staff work life. Detailed operational definitions were developed for 22 quality indicators. The feasibility was demonstrated with the collection of 3.5 years of data. Statistical process control charts and graphs of composite measures were used for data display and comparisons. Medical and nursing leaders as well as administrators found the system to be an improvement over prior methods. Conclusions. Our process resulted in the selection and development of 22 indicators representing 6 domains of ICU function. We have demonstrated the feasibility of such a reporting system. This type of reporting system will demonstrate variation between units and jurisdictions to help identify and prioritize improvement efforts.
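As an illustration of the statistical process control charts mentioned above, the sketch below computes Shewhart-style control limits for one simulated monthly indicator; the indicator and data are hypothetical.

```python
import numpy as np

# Sketch of a Shewhart-style control chart calculation for one ICU quality
# indicator (e.g., a monthly rate of delayed transfers). Data are simulated.
rng = np.random.default_rng(3)
monthly_rate = rng.normal(0.12, 0.02, 42)          # 3.5 years of monthly values

center = monthly_rate.mean()
sigma = monthly_rate.std(ddof=1)
ucl, lcl = center + 3 * sigma, max(center - 3 * sigma, 0.0)

signals = [i for i, x in enumerate(monthly_rate) if x > ucl or x < lcl]
print(f"center={center:.3f}, LCL={lcl:.3f}, UCL={ucl:.3f}, out-of-control months={signals}")
```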
Planning for the next influenza pandemic: using the science and art of logistics.
Cupp, O Shawn; Predmore, Brad G
2011-01-01
The complexities and challenges for healthcare providers and their efforts to provide fundamental basic items to meet the logistical demands of an influenza pandemic are discussed in this article. The supply chain, planning, and alternatives for inevitable shortages are some of the considerations associated with this emergency mass critical care situation. The planning process and support for such events are discussed in detail with several recommendations obtained from the literature and the experience from recent mass casualty incidents (MCIs). The first step in this planning process is the development of specific triage requirements during an influenza pandemic. The second step is identification of logistical resources required during such a pandemic, which are then analyzed within the proposed logistics science and art model for planning purposes. Resources highlighted within the model include allocation and use of work force, bed space, intensive care unit assets, ventilators, personal protective equipment, and oxygen. The third step is using the model to discuss in detail possible workarounds, suitable substitutes, and resource allocation. An examination is also made of the ethics surrounding palliative care within the construction of an MCI and the factors that will inevitably determine rationing and prioritizing of these critical assets to palliative care patients.
ASME V\\&V challenge problem: Surrogate-based V&V
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beghini, Lauren L.; Hough, Patricia D.
2015-12-18
The process of verification and validation can be resource intensive. From the computational model perspective, the resource demand typically arises from long simulation run times on multiple cores coupled with the need to characterize and propagate uncertainties. In addition, predictive computations performed for safety and reliability analyses have similar resource requirements. For this reason, there is a tradeoff between the time required to complete the requisite studies and the fidelity or accuracy of the results that can be obtained. At a high level, our approach is cast within a validation hierarchy that provides a framework in which we perform sensitivity analysis, model calibration, model validation, and prediction. The evidence gathered as part of these activities is mapped into the Predictive Capability Maturity Model to assess credibility of the model used for the reliability predictions. With regard to specific technical aspects of our analysis, we employ surrogate-based methods, primarily based on polynomial chaos expansions and Gaussian processes, for model calibration, sensitivity analysis, and uncertainty quantification in order to reduce the number of simulations that must be done. The goal is to tip the tradeoff balance to improving accuracy without increasing the computational demands.
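As a minimal illustration of the surrogate idea (not the challenge-problem models themselves), the sketch below fits a Gaussian-process surrogate to a handful of evaluations of a stand-in "expensive" function and then propagates an input uncertainty through the cheap surrogate.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_model(x):
    """Stand-in for a long-running simulation (hypothetical response)."""
    return np.sin(3 * x) + 0.3 * x ** 2

# A small design of experiments: only a few "expensive" runs.
X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
y_train = expensive_model(X_train).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

# Propagate an input uncertainty through the cheap surrogate via Monte Carlo.
samples = np.random.default_rng(4).normal(1.0, 0.2, 10000).reshape(-1, 1)
pred_mean, pred_std = gp.predict(samples, return_std=True)
print(f"output mean ~ {pred_mean.mean():.3f}, spread from input uncertainty ~ "
      f"{pred_mean.std():.3f}, mean surrogate std ~ {pred_std.mean():.3f}")
```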
Expert system for on-board satellite scheduling and control
NASA Technical Reports Server (NTRS)
Barry, John M.; Sary, Charisse
1988-01-01
An expert system is described that Rockwell Satellite and Space Electronics Division (S&SED) is developing to dynamically schedule the allocation of on-board satellite resources and activities. This expert system is the Satellite Controller. The resources to be scheduled include power, propellant and recording tape. The activities controlled include scheduling satellite functions such as sensor checkout and operation. The scheduling of these resources and activities is presently a labor-intensive and time-consuming ground operations task. Developing a schedule requires extensive knowledge of the system and subsystem operations, operational constraints, and satellite design and configuration. This scheduling process takes highly trained experts anywhere from several hours to several weeks to accomplish. The process is done through brute force, that is, examining cryptic mnemonic data off-line to interpret the health and status of the satellite. Schedules are then formulated either as the result of practical operator experience or heuristics, that is, rules of thumb. Orbital operations must become more productive in the future to reduce life cycle costs and decrease dependence on ground control. This reduction is required to increase the autonomy and survivability of future systems. The design of future satellites requires that the scheduling function be transferred from ground to on-board systems.
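A toy rule-based allocator conveys the flavor of such a scheduler: activities are accepted in priority order only while rules on power, propellant, and tape hold. The budgets, activities, and rules are invented, not the Satellite Controller's actual knowledge base.

```python
# Toy rule-based scheduler: greedily accept activities while resource rules hold.
# Resource budgets, activity costs and priorities are invented for illustration.
budget = {"power_w": 1200.0, "propellant_kg": 4.0, "tape_gb": 60.0}

activities = [  # (name, priority, resource demands)
    ("sensor_checkout", 3, {"power_w": 300, "propellant_kg": 0.0, "tape_gb": 5}),
    ("imaging_pass",    5, {"power_w": 650, "propellant_kg": 0.2, "tape_gb": 40}),
    ("orbit_trim",      4, {"power_w": 150, "propellant_kg": 1.5, "tape_gb": 0}),
    ("downlink",        2, {"power_w": 400, "propellant_kg": 0.0, "tape_gb": -30}),  # frees tape
]

schedule = []
for name, _priority, demand in sorted(activities, key=lambda a: -a[1]):
    # Rule: accept the activity only if every resource stays within budget.
    if all(budget[r] - demand[r] >= 0 for r in budget):
        for r in budget:
            budget[r] -= demand[r]
        schedule.append(name)

print("scheduled:", schedule)
print("remaining resources:", budget)
```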
[Big data, medical language and biomedical terminology systems].
Schulz, Stefan; López-García, Pablo
2015-08-01
A variety of rich terminology systems, such as thesauri, classifications, nomenclatures and ontologies support information and knowledge processing in health care and biomedical research. Nevertheless, human language, manifested as individually written texts, persists as the primary carrier of information, in the description of disease courses or treatment episodes in electronic medical records, and in the description of biomedical research in scientific publications. In the context of the discussion about big data in biomedicine, we hypothesize that the abstraction of the individuality of natural language utterances into structured and semantically normalized information facilitates the use of statistical data analytics to distil new knowledge out of textual data from biomedical research and clinical routine. Computerized human language technologies are constantly evolving and are increasingly ready to annotate narratives with codes from biomedical terminology. However, this depends heavily on linguistic and terminological resources. The creation and maintenance of such resources is labor-intensive. Nevertheless, it is sensible to assume that big data methods can be used to support this process. Examples include the learning of hierarchical relationships, the grouping of synonymous terms into concepts and the disambiguation of homonyms. Although clear evidence is still lacking, the combination of natural language technologies, semantic resources, and big data analytics is promising.
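A minimal sketch of terminology-based annotation of the kind discussed above: narrative text is matched against a small synonym dictionary that maps surface terms to concept codes. The terms and codes are invented placeholders, far simpler than SNOMED CT or UMLS.

```python
import re

# Tiny invented synonym lexicon mapping surface terms to placeholder concept codes
# (real systems would use SNOMED CT, UMLS, etc.; these codes are not real).
lexicon = {
    "myocardial infarction": "C-001",
    "heart attack": "C-001",          # synonym grouped under the same concept
    "hypertension": "C-002",
    "high blood pressure": "C-002",
}

def annotate(text):
    """Return (term, concept code) pairs found in a narrative."""
    hits = []
    for term in sorted(lexicon, key=len, reverse=True):   # try longest terms first
        if re.search(r"\b" + re.escape(term) + r"\b", text, flags=re.IGNORECASE):
            hits.append((term, lexicon[term]))
    return hits

note = "Patient with high blood pressure and a prior myocardial infarction."
print(annotate(note))
```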
Pyrolysis of plastic waste for liquid fuel production as prospective energy resource
NASA Astrophysics Data System (ADS)
Sharuddin, S. D. A.; Abnisa, F.; Daud, W. M. A. W.; Aroua, M. K.
2018-03-01
The worldwide plastic generation expanded over years because of the variety applications of plastics in numerous sectors that caused the accumulation of plastic waste in the landfill. The growing of plastics demand definitely affected the petroleum resources availability as non-renewable fossil fuel since plastics were the petroleum-based material. A few options that have been considered for plastic waste management were recycling and energy recovery technique. Nevertheless, several obstacles of recycling technique such as the needs of sorting process that was labour intensive and water pollution that lessened the process sustainability. As a result, the plastic waste conversion into energy was developed through innovation advancement and extensive research. Since plastics were part of petroleum, the oil produced through the pyrolysis process was said to have high calorific value that could be used as an alternative fuel. This paper reviewed the thermal and catalytic degradation of plastics through pyrolysis process and the key factors that affected the final end product, for instance, oil, gaseous and char. Additionally, the liquid fuel properties and a discussion on several perspectives regarding the optimization of the liquid oil yield for every plastic were also included in this paper.
Advancing Cyberinfrastructure to support high resolution water resources modeling
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Ogden, F. L.; Jones, N.; Horsburgh, J. S.
2012-12-01
Addressing the problem of how the availability and quality of water resources at large scales are sensitive to climate variability, watershed alterations and management activities requires computational resources that combine data from multiple sources and support integrated modeling. Related cyberinfrastructure challenges include: 1) how can we best structure data and computer models to address this scientific problem through the use of high-performance and data-intensive computing, and 2) how can we do this in a way that discipline scientists without extensive computational and algorithmic knowledge and experience can take advantage of advances in cyberinfrastructure? This presentation will describe a new system called CI-WATER that is being developed to address these challenges and advance high resolution water resources modeling in the Western U.S. We are building on existing tools that enable collaboration to develop model and data interfaces that link integrated system models running within an HPC environment to multiple data sources. Our goal is to enhance the use of computational simulation and data-intensive modeling to better understand water resources. Addressing water resource problems in the Western U.S. requires simulation of natural and engineered systems, as well as representation of legal (water rights) and institutional constraints alongside the representation of physical processes. We are establishing data services to represent the engineered infrastructure and legal and institutional systems in a way that they can be used with high resolution multi-physics watershed modeling at high spatial resolution. These services will enable incorporation of location-specific information on water management infrastructure and systems into the assessment of regional water availability in the face of growing demands, uncertain future meteorological forcings, and existing prior-appropriations water rights. This presentation will discuss the informatics challenges involved with data management and easy-to-use access to high performance computing being tackled in this project.
NASA Astrophysics Data System (ADS)
Zhang, Yan; Liu, Hong; Chen, Bin; Zheng, Hongmei; Li, Yating
2014-06-01
Discovering ways in which to increase the sustainability of the metabolic processes involved in urbanization has become an urgent task for urban design and management in China. As cities are analogous to living organisms, the disorders of their metabolic processes can be regarded as the cause of "urban disease". Therefore, identification of these causes through metabolic process analysis and ecological element distribution through the urban ecosystem's compartments will be helpful. By using Beijing as an example, we have compiled monetary input-output tables from 1997, 2000, 2002, 2005, and 2007 and calculated the intensities of the embodied ecological elements to compile the corresponding implied physical input-output tables. We then divided Beijing's economy into 32 compartments and analyzed the direct and indirect ecological intensities embodied in the flows of ecological elements through urban metabolic processes. Based on the combination of input-output tables and ecological network analysis, the description of multiple ecological elements transferred among Beijing's industrial compartments and their distribution has been refined. This hybrid approach can provide a more scientific basis for management of urban resource flows. In addition, the data obtained from distribution characteristics of ecological elements may provide a basic data platform for exploring the metabolic mechanism of Beijing.
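The embodied-intensity calculation underlying such input-output analyses is essentially a Leontief inverse: with direct resource coefficients d and technical coefficient matrix A, the total (direct plus indirect) intensities are eps = d (I - A)^-1. A three-sector toy example (numbers invented, not Beijing's tables) is shown below.

```python
import numpy as np

# Toy 3-sector economy. A[i, j] = input from sector i needed per unit output of sector j.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.10, 0.20],
              [0.05, 0.10, 0.10]])

# Direct resource use (e.g., water, m3) per unit of monetary output in each sector.
d = np.array([2.0, 0.5, 1.2])

# Total (direct + indirect) embodied intensity: eps = d (I - A)^-1
L = np.linalg.inv(np.eye(3) - A)          # Leontief inverse
embodied_intensity = d @ L
print("embodied intensities:", embodied_intensity.round(3))

# Ecological element embodied in a given final-demand vector.
final_demand = np.array([100.0, 50.0, 80.0])
print("total embodied resource:", embodied_intensity @ final_demand)
```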
Cheng, D C; Newman, M F; Duke, P; Wong, D T; Finegan, B; Howie, M; Fitch, J; Bowdle, T A; Hogue, C; Hillel, Z; Pierce, E; Bukenya, D
2001-05-01
We compared (a) the perioperative complications; (b) times to eligibility for, and actual time of the following: extubation, less intense monitoring, intensive care unit (ICU), and hospital discharge; and (c) resource utilization of nursing ratio for patients receiving either a typical fentanyl/isoflurane/propofol regimen or a remifentanil/isoflurane/propofol regimen for fast-track cardiac anesthesia in 304 adults by using a prospective randomized, double-blinded, double-dummy trial. There were no differences in demographic data or perioperative mortality and morbidity between the two study groups. Mini-mental status examination scores at postoperative days 1 to 3 were similar between the two groups. The eligible and actual times for extubation, less intense monitoring, ICU discharge, and hospital discharge were not significantly different. Further analyses revealed no differences in times for extubation and resource utilization after stratification by preoperative risk scores, age, and country. The nurse/patient ratio was similar between the remifentanil/isoflurane/propofol and fentanyl/isoflurane/propofol groups during the initial ICU phase and less intense monitoring phase. Increasing preoperative risk scores and older age (>70 yr) were associated with longer times until extubation (eligible), ICU discharge (eligible and actual), and hospital discharge (eligible and actual). Times until extubation (eligible and actual) and less intense monitoring (eligible) were significantly shorter in Canadian patients than in United States patients. However, there was no difference in hospital length of stay between Canadian and United States patients. We conclude that both anesthesia techniques permit early and similar times until tracheal extubation, less intense monitoring, ICU and hospital discharge, and reduced resource utilization after coronary artery bypass graft surgery. An ultra-short opioid technique was compared with a standard fast-track small-dose opioid technique in coronary artery bypass graft patients in a prospective randomized, double-blinded controlled study. The postoperative recovery and resource utilization, including stratification of preoperative risk score, age, and country, were analyzed.
Streaming support for data intensive cloud-based sequence analysis.
Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed
2013-01-01
Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
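The general streaming idea, processing records as they arrive rather than after the full transfer completes, can be sketched with a generator pipeline; this illustrates the concept only and is not the elastream package's interface.

```python
# Sketch of stream processing: consume sequence records as chunks arrive,
# instead of waiting for the whole dataset to finish transferring.

def incoming_chunks():
    """Stand-in for data arriving over the network, one read at a time."""
    fake_reads = ["ACGTACGT", "GGGTTTAA", "ACGTTTTT", "CCCCGGGA"]
    for read in fake_reads:
        yield read            # in reality this would block on network I/O

def gc_content(read):
    return (read.count("G") + read.count("C")) / len(read)

# Analysis overlaps with transfer: each read is processed as soon as it arrives.
running_total, n = 0.0, 0
for read in incoming_chunks():
    running_total += gc_content(read)
    n += 1
    print(f"processed read {n}, mean GC so far: {running_total / n:.2f}")
```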
Prediction of Disease Case Severity Level To Determine INA CBGs Rate
NASA Astrophysics Data System (ADS)
Puspitorini, Sukma; Kusumadewi, Sri; Rosita, Linda
2017-03-01
Indonesian Case-Based Groups (INA CBGs) is a case-mix payment system that uses a software grouper application. INA CBGs codes consist of four digits, where the last digit indicates the severity level of a disease case. The severity level is influenced by secondary diagnoses (complications and co-morbidities) related to the resource intensity level, that is, the medical resources used to treat a hospitalized patient. The objective of this research is to develop a decision support system to predict the severity level of disease cases and to illustrate the INA CBGs rate by using a data mining decision tree classification model. The primary diagnosis (DU), first secondary diagnosis (DS 1), and second secondary diagnosis (DS 2) are the attributes used as inputs for the severity level. The training process uses the C4.5 algorithm, and the rules are represented in IF-THEN form. The credibility of the system was analyzed through a testing process, with the results presented in a confusion matrix. The outcome of this research shows that the first secondary diagnosis has a significant influence in forming the rules that predict the severity level of new disease cases and the illustrated INA CBGs rate.
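A compact sketch of the classification step: diagnosis codes as categorical inputs, severity level as the target, and IF-THEN style rules read off the fitted tree. Note that scikit-learn's tree is CART rather than C4.5, and the records below are invented, so this only approximates the approach described.

```python
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeClassifier, export_text

# Invented training records: primary diagnosis (DU), first and second secondary
# diagnoses (DS1, DS2), and the severity level (1-3) assigned by the grouper.
records = [
    ("A09", "E11", "-",   1),
    ("A09", "E11", "N18", 2),
    ("A09", "I10", "N18", 2),
    ("J18", "E11", "N18", 3),
    ("J18", "-",   "-",   1),
    ("J18", "I10", "-",   2),
    ("I21", "E11", "N18", 3),
    ("I21", "-",   "-",   2),
]
X_raw = [r[:3] for r in records]
y = [r[3] for r in records]

enc = OrdinalEncoder()
X = enc.fit_transform(X_raw)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The fitted tree can be rendered as IF-THEN style rules.
print(export_text(tree, feature_names=["DU", "DS1", "DS2"]))
new_case = enc.transform([("A09", "I10", "-")])
print("predicted severity level:", tree.predict(new_case)[0])
```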
Louisiana mid-cycle survey shows change in forest resource trends
Charles E. Thomas; Carl V. Bylin
1982-01-01
Because costs of doing surveys are escalating rapidly, and both dollars and manpower are scarce resources, a low-intensity survey for the mid-cycle inventory may be the answer to timely monitoring of state resource trends.
Rainio, Anna-Kaisa; Ohinmaa, Arto E
2005-07-01
RAFAELA is a new Finnish PCS, which is used in several University Hospitals and Central Hospitals and has aroused considerable interest in hospitals in Europe. The aim of the research is firstly to assess the feasibility of the RAFAELA Patient Classification System (PCS) in nursing staff management and, secondly, to assess whether nursing resources are transferred between wards according to the information received from nursing care intensity classification. The material was received from the Central Hospital's 12 general wards between 2000 and 2001. The RAFAELA PCS consists of three different measures: a system measuring patient care intensity, a system recording daily nursing resources, and a system measuring the optimal nursing care intensity/nurse situation. The data were analysed in proportion to the labour costs of nursing work and, from that, we calculated the employer's loss (a situation below the optimal level) and savings (a situation above the optimal level) per ward as both costs and the number of nurses. In 2000 the wards had on average 77 days below the optimal level and 106 days above it. In 2001 the wards had on average 71 days below the optimal level and 129 above it. Converting all these days to monetary and personnel resources, the employer lost 307,745 or 9.84 nurses and saved 369,080 or 11.80 nurses in total in 2000. In 2001 the employer lost in total 242,143 or 7.58 nurses and saved 457,615 or 14.32 nurses. During the time period of the research, nursing resources seemed not to have been transferred between wards. RAFAELA PCS is applicable to the allocation of nursing resources, but its possibilities have not been entirely used in the researched hospital. The management of nursing work should actively use the information received from nursing care intensity classification and plan and implement the transfer of nursing resources in order to ensure the quality of patient care. Information on which units resources should be allocated to is needed in the planning of the staff resources of the whole hospital. More resources do not solve the managerial problem of the right allocation of resources. If resources are placed wrongly, the problems of daily staff management and cost control continue.
Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele
The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to handle scientific workflows of stakeholders to run on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.
Large-Scale Compute-Intensive Analysis via a Combined In-situ and Co-scheduling Workflow Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messer, Bronson; Sewell, Christopher; Heitmann, Katrin
2015-01-01
Large-scale simulations can produce tens of terabytes of data per analysis cycle, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of performing the analysis in situ, utilizing the same resources as the simulation, and/or off-loading subsets of the data to a compute-intensive analysis system. We introduce an analysis framework developed for HACC, a cosmological N-body code, that uses both in situ and co-scheduling approaches for handling Petabyte-size outputs. An initial in situ step is used to reduce the amount of data to be analyzed, and to separate out the data-intensive tasks handled off-line. The analysis routines are implemented using the PISTON/VTK-m framework, allowing a single implementation of an algorithm that simultaneously targets a variety of GPU, multi-core, and many-core architectures.
Smaldone, Arlene; Tsimicalis, Argerie; Stone, Patricia W
2011-01-01
In the United States, rising health care costs have led to discussion about bending the cost curve. To understand the true burden of disease and its treatment, costs of care, including those incurred by patients and their families, must be comprehensively assessed using psychometrically sound instruments. The Resource Utilization Questionnaire (RUQ) is a 21-item self-report questionnaire first developed to measure the costs incurred by families of infants who had required intensive care during the newborn period. The purpose of this article is to describe the conceptualization of resource utilization and costs and other methodological issues in conducting economic analyses, the process of adapting the RUQ for use in children and families with Type 1 diabetes mellitus (T1DM), and the psychometric evaluation to establish content and criterion validity of the instrument. The finalized modified RUQ for T1DM (mRUQ-T1DM) contained 25 items reflecting direct (5 items) and nondirect (3 items) health care, patient/family time (8 items), and patient/family productivity (9 items) costs using a 3-month recall. The mRUQ-T1DM validly measures cost incurred by children and families with T1DM and is easily completed by parents. Furthermore, the mRUQ-T1DM may be adapted for use in other populations using a similar process.
FPGA implementation for real-time background subtraction based on Horprasert model.
Rodriguez-Gomez, Rafael; Fernandez-Sanchez, Enrique J; Diaz, Javier; Ros, Eduardo
2012-01-01
Background subtraction is considered the first processing stage in video surveillance systems and consists of detecting objects in movement in a scene captured by a static camera. It is an intensive task with a high computational cost. This work proposes a novel embedded FPGA architecture that is able to extract the background in resource-limited environments and offers low degradation (caused by the hardware-friendly model modification). In addition, the original model is extended in order to detect shadows and improve the quality of the segmentation of the moving objects. We have analyzed the resource consumption and performance in Spartan-3 Xilinx FPGAs and compared them to other works available in the literature, showing that the proposed architecture offers a good trade-off in terms of accuracy, performance and resource utilization. Using less than 65% of the resources of an XC3SD3400 Spartan-3A low-cost family FPGA, the system achieves a frequency of 66.5 MHz, reaching 32.8 fps at a resolution of 1,024 × 1,024 pixels, with an estimated power consumption of 5.76 W.
Water intensity assessment of shale gas resources in the Wattenberg field in northeastern Colorado.
Goodwin, Stephen; Carlson, Ken; Knox, Ken; Douglas, Caleb; Rein, Luke
2014-05-20
Efficient use of water, particularly in the western U.S., is an increasingly important aspect of many activities, including agriculture, urban use, and industry. As the population increases and agriculture and energy needs continue to rise, the pressure on water and other natural resources is expected to intensify. Recent advances in technology have stimulated growth in oil and gas development as well as increased the industry's need for water resources. This study provides an analysis of how efficiently water resources are used for unconventional shale development in northeastern Colorado. The study is focused on the Wattenberg Field in the Denver-Julesberg Basin. The 2,000-square-mile field, located in a semiarid climate with competing agricultural, municipal, and industrial water demands, was one of the first fields where widespread use of hydraulic fracturing was implemented. The consumptive water intensity is measured as the ratio of net water consumption to net energy recovery and is used to measure how efficiently water is used for energy extraction. Water and energy use as well as energy recovery data were collected from 200 Noble Energy Inc. wells to estimate the consumptive water intensity. The consumptive water intensity of unconventional shale in the Wattenberg is compared with the consumptive water intensity for extraction of fuels for other energy sources, including coal, natural gas, oil, nuclear, and renewables. In the Wattenberg Field, 1.4 to 7.5 million gallons are required to drill and hydraulically fracture a horizontal well before energy is extracted. However, when the large short-term freshwater use is normalized to the amount of energy produced over the lifespan of a well, the consumptive water intensity is estimated to be between 1.8 and 2.7 gal/MMBtu, similar to that of surface coal mining.
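The consumptive water intensity metric reduces to a simple ratio of lifetime water consumed to lifetime energy recovered; a back-of-the-envelope sketch with placeholder numbers (not the study's data) is:

```python
# Back-of-the-envelope sketch of the consumptive water intensity metric used
# above: net water consumed divided by net energy recovered over a well's life.
# The numbers below are placeholders, not data from the study.
water_consumed_gal = 4.0e6          # e.g., drilling + hydraulic fracturing water
energy_recovered_mmbtu = 1.8e6      # estimated ultimate recovery of the well

water_intensity = water_consumed_gal / energy_recovered_mmbtu
print(f"Consumptive water intensity: {water_intensity:.2f} gal/MMBtu")
```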
Agulnik, Asya; Nadkarni, Anisha; Mora Robles, Lupe Nataly; Soberanis Vasquez, Dora Judith; Mack, Ricardo; Antillon-Klussmann, Federico; Rodriguez-Galindo, Carlos
2018-04-10
Pediatric oncology patients hospitalized in resource-limited settings are at high risk for clinical deterioration resulting in mortality. Intermediate care units (IMCUs) provide a cost-effective alternative to pediatric intensive care units (PICUs). Inappropriate IMCU triage, however, can lead to poor outcomes and suboptimal resource utilization. In this study, we sought to characterize patients with clinical deterioration requiring unplanned transfer to the IMCU in a resource-limited pediatric oncology hospital. Patients requiring subsequent early PICU transfer had a longer PICU length of stay. Pediatric Early Warning Score (PEWS) results prior to IMCU transfer were higher in patients requiring early PICU transfer, suggesting that PEWS can aid in triage between IMCU and PICU care. © 2018 Wiley Periodicals, Inc.
Development of an Intelligent Monitoring System for Geological Carbon Sequestration (GCS) Systems
NASA Astrophysics Data System (ADS)
Sun, A. Y.; Jeong, H.; Xu, W.; Hovorka, S. D.; Zhu, T.; Templeton, T.; Arctur, D. K.
2016-12-01
Providing stakeholders with timely evidence that GCS repositories are operating safely and efficiently requires integrated monitoring to assess the performance of the storage reservoir as the CO2 plume moves within it. GCS projects can be data intensive as a result of the proliferation of digital instrumentation and smart-sensing technologies. GCS projects are also resource intensive, often requiring multidisciplinary teams performing different monitoring, verification, and accounting (MVA) tasks throughout the lifecycle of a project to ensure secure containment of injected CO2. How can an anomaly detected by one sensor be correlated with events observed by other devices to verify a leakage incident? How can resources be optimally allocated for task-oriented monitoring if reservoir integrity is in question? These are issues that warrant further investigation before real integration can take place. In this work, we are building a web-based data integration, assimilation, and learning framework for geologic carbon sequestration projects (DIAL-GCS). DIAL-GCS will be an intelligent monitoring system (IMS) for automating GCS closed-loop management by leveraging recent developments in high-throughput database, complex event processing, data assimilation, and machine learning technologies. Results will be demonstrated using realistic data and a model derived from a GCS site.
NASA Astrophysics Data System (ADS)
Wijaya, E. R.; Irianto, D.
2018-03-01
One of the industry sectors that plays an important role in the era of globalization is the electrical engineering sector. The era of globalization has led to intense competition, and one of its negative effects is declining profits. Falling profits have caused many firms to reduce their workforce without examining the root cause of the decline in detail. Yet employees are an important resource for maintaining competitive advantage. Competitive advantage can be measured by the firm's performance. The firm's performance can be built on competencies that are unique, rare, irreplaceable, and difficult to imitate within the firm, one of which is individual competence. According to the competency-based approach and the resource-based approach, the individual competencies that affect firm performance are managerial competence, technical competence, and strategic competence. A questionnaire built on the dimensions of firm performance, managerial competence, technical competence, and strategic competence was administered, and the responses were processed using a partial least squares application. The results indicate that managerial competence has a weak negative impact on firm performance, while technical competence and strategic competence have a moderately positive effect on firm performance.
Analyses of impacts of China's international trade on its water resources and uses
NASA Astrophysics Data System (ADS)
Zhang, Z. Y.; Yang, H.; Shi, M. J.; Zehnder, A. J. B.; Abbaspour, K. C.
2011-04-01
This study provides an insight into the impact of China's international trade of goods and services on its water resources and uses. Virtual water flows associated with China's international trade are quantified in an input-output framework. The analysis is scaled down to the sectoral and provincial levels to trace the origins and destinations of virtual water flows associated with the international trade. The results reveal that China is a net virtual water exporter of 4.7 × 10¹⁰ m³ year⁻¹, accounting for 2.1% of its total water resources and 8.9% of the total water use. Water scarce regions tend to have higher percentages of virtual water export relative to their water resources and water uses. In the water scarce Huang-Huai-Hai region, the net virtual water export accounts for 7.9% of the region's water resources and 11.2% of its water uses. For individual sectors, major net virtual water exporters are those where agriculture provides raw materials in the initial process of the production chain and/or pollution intensity is high. The results suggest that China's economic gains from being a world "manufacture factory" have come at a high cost to its water resources and through pollution to its environment.
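Virtual water accounting of this kind typically follows the environmentally extended input-output logic, propagating direct water-use coefficients through the Leontief inverse. The sketch below uses two toy sectors and made-up coefficients, not the paper's data or its provincial/sectoral detail:

```python
# Minimal environmentally extended input-output sketch (made-up numbers, not
# the paper's data): virtual water embodied in exports = w (I - A)^-1 * e,
# where w holds direct water use per unit of sectoral output, A is the
# technical coefficient matrix, and e is the export (final demand) vector.
import numpy as np

A = np.array([[0.2, 0.1],      # inter-sector input coefficients
              [0.3, 0.4]])     # (2 toy sectors: agriculture, manufacturing)
w = np.array([500.0, 50.0])    # direct water use, m^3 per unit of output
e = np.array([10.0, 40.0])     # exports by sector

leontief_inverse = np.linalg.inv(np.eye(2) - A)
total_water_intensity = w @ leontief_inverse   # embodied water per unit of final demand
virtual_water_exports = total_water_intensity * e

print("Embodied water intensity by sector:", total_water_intensity)
print("Virtual water exports by sector:   ", virtual_water_exports)
print("Total virtual water export:        ", virtual_water_exports.sum())
```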
2014-01-01
Objectives. We draw on cognitive discrepancy theory to hypothesize and test a pathway from poor health to loneliness in later life. We hypothesize that poor health will have a negative influence on social participation and social resources, and these factors will mediate between health and loneliness. We hypothesize that rural environments will amplify any difficulties associated with social participation or accessing social resources and that depression will moderate how intensely people react to levels of social contact and support. Methods. We conceptualize a mediation model and a moderated-mediation model. Nationally representative data on older people living in the Republic of Ireland are used to validate the hypothesized pathways. Results. In the mediation model, health has a significant indirect effect on loneliness through the mediating variables social resources and social participation. In the moderated-mediation model, rurality moderates the pathway between health and social resources but not social participation. Depressive symptoms moderate the effect of social resources on loneliness but not social participation. Discussion. The results provide further credence to cognitive discrepancy theory, suggesting that depressive symptoms influence cognitive processes, interfering with judgments about the adequacy of social interaction. The theory is extended by demonstrating the impact of the environment on loneliness. PMID:24326076
Towards a Unified Architecture for Data-Intensive Seismology in VERCE
NASA Astrophysics Data System (ADS)
Klampanos, I.; Spinuso, A.; Trani, L.; Krause, A.; Garcia, C. R.; Atkinson, M.
2013-12-01
Modern seismology involves managing, storing and processing large datasets, typically geographically distributed across organisations. Performing computational experiments using these data generates more data, which in turn have to be managed, further analysed and frequently made available within or outside the scientific community. As part of the EU-funded project VERCE (http://verce.eu), we research and develop a number of use-cases, interfacing technologies to satisfy the data-intensive requirements of modern seismology. Our solution seeks to support: (1) familiar programming environments to develop and execute experiments, in particular via Python/ObsPy, (2) a unified view of heterogeneous computing resources, public or private, through the adoption of workflows, (3) monitoring the experiments and validating the data products at varying granularities, via a comprehensive provenance system, (4) reproducibility of experiments and consistency in collaboration, via a shared registry of processing units and contextual metadata (computing resources, data, etc.). Here, we provide a brief account of these components and their roles in the proposed architecture. Our design integrates heterogeneous distributed systems, while allowing researchers to retain current practices and control data handling and execution via higher-level abstractions. At the core of our solution lies the workflow language Dispel. While Dispel can be used to express workflows at fine detail, it may also be used as part of meta- or job-submission workflows. User interaction can be provided through a visual editor or through custom applications on top of parameterisable workflows, which is the approach VERCE follows. According to our design, the scientist may use versions of Dispel/workflow processing elements offered by the VERCE library or override them by introducing custom scientific code, using ObsPy. This approach has the advantage that, while the scientist uses a familiar tool, the resulting workflow can be executed on a number of underlying stream-processing engines, such as STORM or OGSA-DAI, transparently. While making efficient use of arbitrarily distributed resources and large data-sets is a priority, such processing requires adequate provenance tracking and monitoring. Hiding computation and orchestration details via a workflow system allows us to embed provenance harvesting where appropriate without impeding the user's regular working patterns. Our provenance model is based on the W3C PROV standard and can provide information of varying granularity regarding execution, systems and data consumption/production. A video demonstrating a prototype provenance exploration tool can be found at http://bit.ly/15t0Fz0. Keeping experimental methodology and results open and accessible, as well as encouraging reproducibility and collaboration, is of central importance to modern science. As our users are expected to be based at different geographical locations, to have access to different computing resources and to employ customised scientific codes, the use of a shared registry of workflow components, implementations, data and computing resources is critical.
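The idea of composing reusable processing elements into a streamed workflow, with the option to override library elements with custom code, can be illustrated generically. The sketch below is plain Python, not Dispel or the VERCE library, and the detrend/peak-amplitude stages are invented examples:

```python
# Generic sketch of composable stream-processing elements (hypothetical; this
# is not Dispel or the VERCE library). Each element consumes an iterator of
# records and yields transformed records, so elements can be chained into a
# workflow and individual stages overridden with custom scientific code.
def detrend(traces):
    for t in traces:
        mean = sum(t) / len(t)
        yield [x - mean for x in t]

def peak_amplitude(traces):
    for t in traces:
        yield max(abs(x) for x in t)

def run_pipeline(source, *elements):
    stream = source
    for element in elements:        # wire the elements into one streamed chain
        stream = element(stream)
    return list(stream)

# Example: two toy "traces" flow through the detrend and peak-amplitude stages.
traces = [[1.0, 2.0, 3.0, 2.0], [0.0, -4.0, 4.0, 0.0]]
print(run_pipeline(iter(traces), detrend, peak_amplitude))
```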
NASA Astrophysics Data System (ADS)
Barreiro, B.; Barton, E. D.
2012-04-01
The study of Eastern Boundary Upwelling Systems is of vital importance, given the interest in rational management of fisheries resources. The high level of biogeochemical activity associated with the physical process of upwelling increases primary production and enriches the living resources of these areas. This presentation focuses on the variability of these physical processes on daily to interdecadal scales, in an investigation of the effects of climate change in the Iberian and California-Oregon Upwelling Systems. The Upwelling Index (UI) was analysed for the period 1967-2010 at 35.5-44.5°N in both areas. The two systems differ in that the magnitudes of upwelling intensity off California-Oregon are 3.3 times higher than off Iberia, but they show a similar latitudinal behaviour. The annual/interannual scale variability of upwelling can be represented by the recently introduced Cumulative Upwelling Index (CUI), based on summing the mean daily UI. The seasonal cycle results show that the length of the upwelling season increases southwards from 180 to 300 days and that net upwelling occurs only for latitudes lower than 43°N. On interannual scales, the CUI showed a roughly linear change at high and low latitudes (R>0.9), with slopes between 250 and -130 m³ s⁻¹ km⁻¹ day⁻¹ off Iberia and between 620 and -290 m³ s⁻¹ km⁻¹ day⁻¹ off California-Oregon. The central areas (40.5-42.5°N) are less stable and shifted between net upwelling and downwelling over extended periods. This information helps us contextualize the present state of the study area and interpret ongoing intensive process-oriented studies within the longer term variability.
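Since the CUI is described as a running sum of the mean daily UI, its bookkeeping is straightforward; a minimal sketch with made-up daily values is:

```python
# Minimal sketch of a Cumulative Upwelling Index (CUI): a running sum of the
# mean daily Upwelling Index. The daily values below are made up for
# illustration; positive UI denotes upwelling, negative denotes downwelling.
from itertools import accumulate

daily_ui = [120.0, 80.0, -30.0, 200.0, -10.0, 150.0]   # daily UI values (made up)
cui = list(accumulate(daily_ui))

for day, value in enumerate(cui, start=1):
    print(f"day {day:2d}: CUI = {value:7.1f}")
```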
Assigning Resources to Health Care Use for Health Services Research: Options and Consequences
Fishman, Paul A.; Hornbrook, Mark C.
2013-01-01
Aims Our goals are threefold: 1) to review the leading options for assigning resource coefficients to health services utilization; 2) to discuss the relative advantages of each option; and 3) to provide examples where the research question had marked implications for the choice of which resource measure to employ. Methods Three approaches have been used to establish relative resource weights in health services research: a) direct estimation of production costs through micro-costing or step-down allocation methods; b) macro-costing/regression analysis; and c) standardized resource assignment. We describe each of these methods and provide examples of how the study question drove the choice of resource use measure. Findings All empirical resource-intensity weighting systems contain distortions that limit their universal application. Hence, users must select the weighting system that matches the needs of their specific analysis. All systems require significant data resources and data processing. However, inattention to the distortions contained in a complex resource weighting system may undermine the validity and generalizability of an economic evaluation. Conclusions Direct estimation of production costs is useful for empirical analyses, but such estimates contain distortions that undermine optimal resource allocation decisions. Researchers must ensure that the data being used meet both the study design and the question being addressed. They also should ensure that the choice of resource measure is the best fit for the analysis. Implications for Research and Policy Researchers should consider which of the available measures is the most appropriate for the question being addressed rather than take 'cost' or utilization as a variable over which they have no control. PMID:19536002
Yamada, Janet; Squires, Janet E; Estabrooks, Carole A; Victor, Charles; Stevens, Bonnie
2017-01-23
Despite substantial research on pediatric pain assessment and management, health care professionals do not adequately incorporate this knowledge into clinical practice. Organizational context (work environment) is a significant factor influencing outcomes; however, the nature of the mechanisms is relatively unknown. The objective of this study was to assess how organizational context moderates the effect of research use on pain outcomes in hospitalized children. A cross-sectional survey was undertaken with 779 nurses in 32 patient care units in 8 Canadian pediatric hospitals, following implementation of a multifaceted knowledge translation intervention, Evidence-based Practice for Improving Quality (EPIQ). The influence of organizational context was assessed in relation to pain process (assessment and management) and clinical (pain intensity) outcomes. Organizational context was measured using the Alberta Context Tool, which includes: leadership, culture, evaluation, social capital, informal interactions, formal interactions, structural and electronic resources, and organizational slack (staff, space, and time). Marginal modeling estimated the effects of instrumental research use (direct use of research knowledge) and conceptual research use (indirect use of research knowledge) on pain outcomes while examining the effects of context. Six of the 10 organizational context factors (culture, social capital, informal interactions, resources, and organizational slack [space and time]) significantly moderated the effect of instrumental research use on pain assessment; four factors (culture, social capital, resources, and organizational slack time) moderated the effect of conceptual research use on pain assessment. Only two factors (evaluation and formal interactions) moderated the effect of instrumental research use on pain management. All organizational factors except slack space significantly moderated the effect of instrumental research use on pain intensity; informal interactions and organizational slack space moderated the effect of conceptual research use on pain intensity. Many aspects of organizational context consistently moderated the effects of instrumental research use on pain assessment and pain intensity, while only a few influenced conceptual use of research on pain outcomes. Organizational context factors did not generally influence the effect of research use on pain management. Further research is required to explore the relationships between organizational context and pain management outcomes.
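Moderation of this kind is commonly expressed as an interaction term, letting the slope of research use on a pain outcome depend on an organizational context score. The sketch below uses simulated data and ordinary least squares purely for illustration; it is not the marginal modeling used in the study.

```python
# Generic moderation sketch with simulated data: the slope of research use on
# a pain outcome depends on an organizational context score (interaction
# term). This uses OLS for simplicity, not the marginal models of the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
context = rng.normal(0, 1, n)                  # e.g., unit culture score
research_use = rng.normal(0, 1, n)             # instrumental research use
# Outcome improves with research use, and more so in supportive contexts.
pain_assessment = 0.3 * research_use + 0.2 * context \
    + 0.4 * research_use * context + rng.normal(0, 1, n)

df = pd.DataFrame({"pain_assessment": pain_assessment,
                   "research_use": research_use,
                   "context": context})
model = smf.ols("pain_assessment ~ research_use * context", data=df).fit()
print(model.params)    # the research_use:context term captures the moderation
```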
Designing and Delivering Intensive Interventions: A Teacher's Toolkit
ERIC Educational Resources Information Center
Murray, Christy S.; Coleman, Meghan A.; Vaughn, Sharon; Wanzek, Jeanne; Roberts, Greg
2012-01-01
This toolkit provides activities and resources to assist practitioners in designing and delivering intensive interventions in reading and mathematics for K-12 students with significant learning difficulties and disabilities. Grounded in research, this toolkit is based on the Center on Instruction's "Intensive Interventions for Students Struggling…
Critical Zone Services as a Measure for Evaluating the Trade-offs in Intensively Managed Landscapes
NASA Astrophysics Data System (ADS)
Richardson, M.; Kumar, P.
2015-12-01
The Critical Zone includes the range of biophysical processes occurring from the top of the vegetation canopy to the weathering zone below the groundwater table. Critical Zone services (Field et al. 2015) provide a measure to value the processes that support the goods and services derived from our landscapes. In intensively managed landscapes, the provisioning and regulating services are being altered through anthropogenic energy inputs so as to derive more agricultural productivity from the landscapes. Land use change and other alterations to the environment result in positive and/or negative net Critical Zone services. Through studies in the Critical Zone Observatory for Intensively Managed Landscapes (IMLCZO), this research seeks to answer questions such as: Are perennial bioenergy crops or the annual crops they replace better for the land and surrounding environment? How do we evaluate the products and services from the land against the energy and resources we put in? Before the economic valuation of Critical Zone services, these questions seemed abstract. However, with developments such as Critical Zone services and life cycle assessments, they are more concrete. To evaluate the trade-offs between positive and negative impacts, life cycle assessments are used to create an inventory of all the energy inputs and outputs in a landscape management system. Total energy is computed by summing the mechanical energy used to construct tile drains, produce fertilizer, and carry out other processes involved in intensively managed landscapes, and the chemical energy gained by the production of biofuels from bioenergy crops. A multi-layer canopy model (MLCan) computes soil, water, and nutrient outputs for each crop type, which can be translated into Critical Zone services. These values are then viewed alongside the energy inputs into the system to show the relationship between agricultural practices and their corresponding ecosystem and environmental impacts.
Bamberger, Katharine T
2016-03-01
The use of intensive longitudinal methods (ILM), rapid in situ assessment at micro timescales, can be overlaid on RCTs and other study designs in applied family research. Particularly when done as part of a multiple timescale design, in bursts over macro timescales, ILM can advance the study of the mechanisms and effects of family interventions and processes of family change. ILM confers measurement benefits in accurately assessing momentary and variable experiences and captures fine-grained dynamic pictures of time-ordered processes. Thus, ILM allows opportunities to investigate new research questions about intervention effects on within-subject (i.e., within-person, within-family) variability (i.e., dynamic constructs) and about the time-ordered change process that interventions induce in families and family members beginning with the first intervention session. This paper discusses the need and rationale for applying ILM to family intervention evaluation, new research questions that can be addressed with ILM, and example research using ILM in the related fields of basic family research and the evaluation of individual-based interventions. Finally, the paper touches on practical challenges and considerations associated with ILM and points readers to resources for the application of ILM.
Meng, Qingmin
2015-05-15
Hydraulic fracturing, also known as fracking, has been increasing exponentially across the United States, which holds the largest known shale gas reserves in the world. Studies have found that the high-volume horizontal hydraulic fracturing process (HVHFP) threatens water resources, harms air quality, changes landscapes, and damages ecosystems. However, there is minimal research focusing on the spatial study of the environmental and human risks of HVHFP, which is necessary for state and federal governments to administer, regulate, and assess fracking. Integrating GIS and spatial kernel functions, we study the currently operating fracking wells across the state of Pennsylvania (PA), which covers the main part of the Marcellus Shale, currently the most active shale play in the US. We geographically process the location data of hydraulic fracturing wells, 2010 census block data, urbanized region data, railway data, local road data, open water data, river data, and wetland data for the state of PA. From this we develop a distance-based risk assessment in order to understand the environmental and urban risks. We generate surface data of fracking well intensity and population intensity by integrating spatial dependence, semivariogram modeling, and a quadratic kernel function. The surface data of population risk, generated by dividing fracking well intensity by population intensity, provide a novel insight into the local and regional regulation of hydraulic fracturing activities in terms of environmental and health related risks due to the proximity of fracking wells. Copyright © 2015 Elsevier B.V. All rights reserved.
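The intensity-surface step described above can be illustrated with a simple kernel estimate: a quadratic (Epanechnikov) kernel turns well locations and population-weighted block centroids into smooth surfaces, and their ratio gives the relative risk surface. Coordinates, weights, and bandwidth in the sketch below are placeholders, not the study's parameters.

```python
# Sketch of the intensity-surface idea above: a quadratic (Epanechnikov)
# kernel turns point locations into smooth intensity surfaces, and the ratio
# of well intensity to population intensity gives a relative risk surface.
# Coordinates, weights, and bandwidth are made up for illustration.
import numpy as np

def quadratic_kernel_intensity(points, weights, grid, bandwidth):
    """Kernel intensity at each grid cell: sum_i w_i * K(d_i / h)."""
    dists = np.linalg.norm(grid[:, None, :] - points[None, :, :], axis=2)
    u = dists / bandwidth
    k = np.where(u < 1.0, 0.75 * (1.0 - u ** 2), 0.0)   # Epanechnikov kernel
    return (k * weights[None, :]).sum(axis=1) / bandwidth ** 2

rng = np.random.default_rng(0)
wells = rng.uniform(0, 10, size=(50, 2))               # fracking well locations (km)
blocks = rng.uniform(0, 10, size=(30, 2))              # census block centroids (km)
population = rng.integers(100, 2000, size=30).astype(float)

gx, gy = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])

well_intensity = quadratic_kernel_intensity(wells, np.ones(len(wells)), grid, bandwidth=2.0)
pop_intensity = quadratic_kernel_intensity(blocks, population, grid, bandwidth=2.0)

risk = well_intensity / (pop_intensity + 1e-9)          # ratio surface as in the abstract
print("max relative risk on grid:", risk.max())
```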
Privalle, Laura S; Chen, Jingwen; Clapper, Gina; Hunst, Penny; Spiegelhalter, Frank; Zhong, Cathy X
2012-10-17
"Genetically modified" (GM) or "biotech" crops have been the most rapidly adopted agricultural technology in recent years. The development of a GM crop encompasses trait identification, gene isolation, plant cell transformation, plant regeneration, efficacy evaluation, commercial event identification, safety evaluation, and finally commercial authorization. This is a lengthy, complex, and resource-intensive process. Crops produced through biotechnology are the most highly studied food or food component consumed. Before commercialization, these products are shown to be as safe as conventional crops with respect to feed, food, and the environment. This paper describes this global process and the various analytical tests that must accompany the product during the course of development, throughout its market life, and beyond.
Are natural resources bad for health?
El Anshasy, Amany A; Katsaiti, Marina-Selini
2015-03-01
The purpose of this paper is to empirically examine whether economic dependence on various natural resources is associated with lower investment in health, after controlling for countries' geographical and historical fixed effects, corruption, autocratic regimes, income levels, and initial health status. Employing panel data for 118 countries for the period 1990-2008, we find no compelling evidence in support of a negative effect of resources on healthcare spending and outcomes. On the contrary, higher dependence on agricultural exports is associated with higher healthcare spending, higher life expectancy, and lower diabetes rates. Similarly, healthcare spending increases with higher mineral intensity. Finally, more hydrocarbon resource rents are associated with lower diabetes and obesity rates. There is, however, evidence that public health provision relative to the size of the economy declines with greater hydrocarbon resource intensity; the magnitude of this effect is less severe in non-democratic countries. Copyright © 2014 Elsevier Ltd. All rights reserved.
Cuny, Henri E; Rathgeber, Cyrille B K; Lebourgeois, François; Fortin, Mathieu; Fournier, Meriem
2012-05-01
We investigated whether timing and rate of growth are related to the life strategies and fitness of three conifer species. Intra-annual dynamics of wood formation, shoot elongation and needle phenology were monitored over 3 years in five Norway spruces (Picea abies (L.) Karst.), five Scots pines (Pinus sylvestris L.) and five silver firs (Abies alba Mill.) grown intermixed. For the three species, the growing season (delimited by cambial activity onset and cessation) lasted about 4 months, while the whole process of wood formation lasted 5-6 months. Needle unfolding and shoot elongation followed the onset of cambial activity and lasted only one-third of the season. Pines exhibited an 'extensive strategy' of cambial activity, with long durations but low growth rates, while firs and spruces adopted an 'intensive strategy' with shorter durations but higher growth rates. We estimated that about 75% of the annual radial increment variability was attributable to the rate of cell production, and only 25% to its duration. Cambial activity rates culminated at the same time for the three species, whereas shoot elongation reached its maximal rate earlier in pines. Results show that species-specific life strategies are recognizable through functional traits of intra-annual growth dynamics. The opposition between Scots pine extensive strategy and silver fir and Norway spruce intensive strategy supports the theory that pioneer species are greater resource expenders and develop riskier life strategies to capture resources, while shade-tolerant species utilize resources more efficiently and develop safer life strategies. Despite different strategies, synchronicity of the maximal rates of cambial activity suggests a strong functional convergence between co-existing conifer species, resulting in head-on competition for resources.
Clark, C H; Miles, E A; Urbano, M T Guerrero; Bhide, S A; Bidmead, A M; Harrington, K J; Nutting, C M
2009-07-01
The purpose of this study was to compare conventional radiotherapy with parotid gland-sparing intensity-modulated radiation therapy (IMRT) using the PARSPORT trial. The validity of such a trial depends on the radiotherapy planning and delivery meeting a defined standard across all centres. At the outset, many of the centres had little or no experience of delivering IMRT; therefore, quality assurance processes were devised to ensure consistency and standardisation of all processes for comparison within the trial. The pre-trial quality assurance (QA) programme and results are described. Each centre undertook exercises in target volume definition and treatment planning, completed a resource questionnaire and produced a process document. Additionally, the QA team visited each participating centre. Each exercise had to be accepted before patients could be recruited into the trial. 10 centres successfully completed the quality assurance exercises. A range of treatment planning systems, linear accelerators and delivery methods were used for the planning exercises, and all the plans created reached the standard required for participation in this multicentre trial. All 10 participating centres achieved implementation of a comprehensive and robust IMRT programme for treatment of head and neck cancer.
Interoperability of GADU in using heterogeneous Grid resources for bioinformatics applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sulakhe, D.; Rodriguez, A.; Wilde, M.
2008-03-01
Bioinformatics tools used for efficient and computationally intensive analysis of genetic sequences require large-scale computational resources to accommodate the growing data. Grid computational resources such as the Open Science Grid and TeraGrid have proved useful for scientific discovery. The genome analysis and database update system (GADU) is a high-throughput computational system developed to automate the steps involved in accessing the Grid resources for running bioinformatics applications. This paper describes the requirements for building an automated scalable system such as GADU that can run jobs on different Grids. The paper describes the resource-independent configuration of GADU using the Pegasus-based virtual data system that makes high-throughput computational tools interoperable on heterogeneous Grid resources. The paper also highlights the features implemented to make GADU a gateway to computationally intensive bioinformatics applications on the Grid. The paper will not go into the details of problems involved or the lessons learned in using individual Grid resources as it has already been published in our paper on genome analysis research environment (GNARE) and will focus primarily on the architecture that makes GADU resource independent and interoperable across heterogeneous Grid resources.
Evidence-based human resource management: a study of nurse leaders' resource allocation.
Fagerström, Lisbeth
2009-05-01
The aims were to illustrate how the RAFAELA system can be used to facilitate evidence-based human resource management. The theoretical framework of the RAFAELA system is based on a holistic view of humankind and a view of leadership founded on human resource management. Nine wards from three central hospitals in Finland participated in the study. The data, stemming from 2006-2007, were taken from the critical indicators (ward-related and nursing intensity information) for national benchmarking used in the RAFAELA system. The data were analysed descriptively. The daily nursing resources per classified patient ratio is a more specific method of measurement than the nurse-to-patient ratio. For four wards, the nursing intensity per nurse surpassed the optimal level on 34% to 62.2% of days. Resource allocation was clearly improved in that a better balance between patients' care needs and available nursing resources was maintained. The RAFAELA system provides a rational, systematic and objective foundation for evidence-based human resource management. Data from systematic use of the RAFAELA system offer objective facts and motives for evidence-based decision making in human resource management, and will therefore enhance nurse leaders' evidence-based and scientifically grounded way of working.
Sprung, Charles L; Kesecioglu, Jozef
2010-04-01
To provide recommendations and standard operating procedures for intensive care unit and hospital preparations for an influenza pandemic or mass disaster with a specific focus on essential equipment, pharmaceuticals and supplies. Based on a literature review and expert opinion, a Delphi process was used to define the essential topics including essential equipment, pharmaceuticals and supplies. Key recommendations include: (1) ensure that adequate essential medical equipment, pharmaceuticals and important supplies are available during a disaster; (2) develop a communication and coordination system between health care facilities and local/regional/state/country governmental authorities for the provision of additional support; (3) determine the required resources, order and stockpile adequate resources, and judiciously distribute them; (4) acquire additional mechanical ventilators that are portable, provide adequate gas exchange for a range of clinical conditions, function with low-flow oxygen and without high pressure, and are safe for patients and staff; (5) provide advanced ventilatory support and rescue therapies including high levels of inspired oxygen and positive end-expiratory pressure, volume and pressure control ventilation, inhaled nitric oxide, high-frequency ventilation, prone positioning ventilation and extracorporeal membrane oxygenation; (6) triage scarce resources including equipment, pharmaceuticals and supplies based on those who are likely to benefit most or on a 'first come, first served' basis. Judicious planning and adoption of protocols for providing adequate equipment, pharmaceuticals and supplies are necessary to optimize outcomes during a pandemic.
Barfield, Wanda D; Rhodes, Julia C; Kohn, Melvin A; Hedberg, Katrina; Schoendorf, Kenneth C
2002-01-01
In November 1998, Oregon voters passed Ballot Measure 58, which allowed Oregon adoptees 21 years of age or older access to their original birth records, which are sealed at adoption. The objective of this study was to evaluate the impact of the measure on the Oregon Health Division (since renamed Oregon Health Services) by assessing procedures used and resources needed after implementation of Measure 58. Vital records employees were interviewed about processing, storage, and archive retrieval procedures for pre-adoption birth records before, during, and after the implementation of Measure 58 and the effect on their usual workload. Personnel time, space, and fiscal resources used to process requests for pre-adoption records were also calculated. The Oregon Health Division began to receive requests from adoptees immediately following the passage of Measure 58 in November 1998, but due to legal challenges, they could not be processed until May 31, 2000. From June 2, 2000, through October 20, 2000, 12 staff members and two supervisors issued more than 4,700 pre-adoption birth records while also processing their normal workload, which averages more than 135,400 vital record orders annually. Due to the need for retrieval from archives, requests for pre-adoption birth records were estimated to take 75 hours to process vs. 2-3 minutes for standard requests. Each batch of approximately 75 pre-adoption birth records required approximately 12.5 person-hours from vital records staff and 3-4 person-hours from archive personnel; in addition, supervisors spent time responding to incomplete orders, informing the public and the media, and responding to concerns of adoptees, birth parents, and adoptive parents. Fewer than 1% of requests went unfilled. Implementation of Measure 58 utilized substantial resources of the Oregon Health Division. States contemplating similar legislation should consider increasing personnel and resources, preparing for intense public and media interest, and reorganizing the storage of adoptees' original birth records so they are easily retrieved.
Qiu, Jiali; Shen, Zhenyao; Wei, Guoyuan; Wang, Guobo; Xie, Hui; Lv, Guanping
2018-03-01
The assessment of peak flow rate, total runoff volume, and pollutant loads during rainfall events is very important for watershed management and the ecological restoration of the aquatic environment. Real-time measurements of rainfall-runoff and pollutant loads are always the most reliable approach but are difficult to carry out at all desired locations in a watershed, considering the large consumption of material and financial resources. An integrated environmental modeling approach that estimates flash streamflow by combining the various hydrological and water quality processes during rainstorms within agricultural watersheds is essential to develop targeted management strategies for endangered drinking water. This study applied the Hydrological Simulation Program-Fortran (HSPF) to simulate the spatial and temporal variation in hydrological and pollutant transport processes during rainstorm events in the Miyun Reservoir watershed, a drinking water resource area in Beijing. The model performance indicators confirmed the acceptable applicability of the HSPF model for simulating flow and pollutant loads in the studied watershed and for establishing a relationship between land use and the parameter values. The proportions of soil types and land uses were then identified as factors influencing pollution intensity. The results indicated that the flush concentrations were much higher than those observed during normal flow periods and considerably exceeded the limits of the Class III Environmental Quality Standards for Surface Water (GB3838-2002) for the secondary protection zones of drinking water resources in China. Agricultural land and leached cinnamon soils were identified as the key sources of sediment, nutrients, and fecal coliforms. Precipitation volume was identified as a driving factor that determined the amount of runoff and pollutant loads during rainfall events. These results are useful for improving streamflow predictions, provide useful information for the identification of highly polluted areas, and aid the development of an integrated watershed management system in the drinking water resource area.
NASA Astrophysics Data System (ADS)
Jacobs, B. E.; Bohls-Graham, E.; Martinez, A. O.; Ellins, K. K.; Riggs, E. M.; Serpa, L. F.; Stocks, E.; Fox, S.; Kent, M.
2014-12-01
Today's instruction in Earth's systems requires thoughtful selection of curricula, and in turn, high quality learning activities that address modern Earth science. The Next Generation Science Standards (NGSS), which are intended to guide K-12 science instruction, further demand a discriminating selection process. The DIG (Diversity & Innovation in Geoscience) Texas Instructional Blueprints attempt to fulfill this practice by compiling vetted educational resources freely available online into units that are the building blocks of the blueprints. Each blueprint is composed of 9 three-week teaching units and serves as a scope and sequence for teaching a one-year Earth science course. In the earliest stages of the project, teams explored the Internet for classroom-worthy resources, including laboratory investigations, videos, visualizations, and readings, and submitted the educational resources deemed suitable for the project into the project's online review tool. Each team member evaluated the educational resources chosen by fellow team members according to a set of predetermined criteria that had been incorporated into the review tool. Resources rated as very good or excellent by all team members were submitted to the project PIs for approval. At this stage, approved resources became candidates for inclusion in the blueprint units. Team members tagged approved resources with descriptors for the type of resource and instructional strategy, and aligned these to the Texas Essential Knowledge and Skills for Earth and Space Science and the Earth Science Literacy Principles. Each team then assembled and sequenced resources according to content strand, balancing the types of learning experiences within each unit. Once units were packaged, teams then considered how they addressed the NGSS and identified the relevant disciplinary core ideas, crosscutting concepts, and science and engineering practices. In addition to providing a brief overview of the project, this presentation will detail the intensive review process educators utilized to determine the viability of the resources included in the blueprints. A short summary of first-year implementation results will be shared, along with the second year now in progress.
NASA Technical Reports Server (NTRS)
Kornhauser, A. L.; Wilson, L. B.
1974-01-01
Potential economic benefits obtainable from a state-of-the-art ERS system in the resource area of intensive use of living resources, agriculture, are studied. A spectrum of equal capability (cost saving), increased capability, and new capability benefits is quantified. These benefits are estimated via ECON-developed models of the agricultural marketplace and include benefits of improved production and distribution of agricultural crops. It is shown that increased capability benefits and new capability benefits result from a reduction of losses due to disease and insect infestation, given ERS's capability to distinguish crop vigor, and from the improvement in world trade negotiations, given ERS's worldwide surveying capability.
Breland, Jessica Y; Asch, Steven M; Slightam, Cindie; Wong, Ava; Zulman, Donna M
2016-03-01
Intensive outpatient programs aim to transform care while conserving resources for high-need, high-cost patients, but little is known about factors that influence their implementation within patient-centered medical homes (PCMHs). In this mixed-methods study, we reviewed the literature to identify factors affecting intensive outpatient program implementation, then used semi-structured interviews to determine how these factors influenced the implementation of an intensive outpatient program within the Veterans Affairs' (VA) PCMH. Interviewees included facility leadership and clinical staff who were involved in a pilot Intensive Management Patient Aligned Care Team (ImPACT) intervention for high-need, high-cost VA PCMH patients. We classified implementation factors in the literature review and qualitative analysis using the Consolidated Framework for Implementation Research (CFIR). The literature review (n=9 studies) and analyses of interviews (n=15) revealed key implementation factors in three CFIR domains. First, the Inner Setting (i.e., the organizational and PCMH environment) mostly enabled implementation through a culture of innovation, good networks and communication, and positive tension for change. Second, Characteristics of Individuals, including creativity, flexibility, and interpersonal skills, allowed program staff to augment existing PCMH services. Finally, certain Intervention Characteristics (e.g., adaptability) enabled implementation, while others (e.g., complexity) generated implementation barriers. Resources and structural features common to PCMHs can facilitate implementation of intensive outpatient programs, but program success is also dependent on staff creativity and flexibility, and on intervention adaptations to meet patient and organizational needs. Established PCMHs likely provide resources and environments that permit accelerated implementation of intensive outpatient programs. V. Published by Elsevier Inc.
Ben Butler
2007-01-01
Obtaining accurate biomass measurements is often a resource-intensive task. Data collection crews often spend large amounts of time in the field clipping, drying, and weighing grasses to calculate the biomass of a given vegetation type. Such a problem is currently occurring in the Great Plains region of the Bureau of Indian Affairs. A study looked at six reservations...
Proposal for grid computing for nuclear applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Idris, Faridah Mohamad; Ismail, Saaidi; Haris, Mohd Fauzi B.
2014-02-12
The use of computer clusters for the computational sciences, including computational physics, is vital as it provides the computing power needed to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to speed up the computing process.
Assessing Resource Intensity and Renewability of Cellulosic Ethanol Technologies using Eco-LCA
Recognizing the contributions of natural resources and the lack of their comprehensive accounting in life cycle assessment (LCA) of cellulosic ethanol, an in-depth analysis of the contribution of natural resources in the life cycle of cellulosic ethanol derived from five differen...
NASA Astrophysics Data System (ADS)
Sineeva, Natalya
2018-03-01
The relevance of our study stems from the increasing man-made impact on water bodies and associated land resources within urban areas and, as a consequence, from changes in the morphology and dynamics of river channels. This leads to the need to predict the development of erosion-accumulation processes, especially within built-up urban areas. The purpose of the study is to develop a program for assessing erosion-accumulation processes at a water body, the mouth area of the Inia River, in a prospective high-rise construction zone of a residential microdistrict, where the floodplain-channel complex is expected to develop intensively. Results of the study: When comparing water flow velocities measured in the field with those calculated from the model, only a slight discrepancy was recorded. This allows us to say that the numerical model reliably describes the physical processes developing in the river. The calculations carried out to assess the direction and intensity of channel re-formation allow us to conclude that erosion processes slightly predominate over accumulative ones on the undeveloped part of the Inia River (the activity of these processes is noticeable only in certain areas, near the banks and the island). Importance of the study: The evaluation of erosion-accumulation processes can be used in design decisions for future high-rise construction in this territory, which will increase their economic efficiency.
Hatzell, H.H.; Oaksford, E.T.; Asbury, C.E.
1995-01-01
The implementation of design guidelines for the National Water-Quality Assessment (NAWQA) Program has resulted in the development of new sampling procedures and the modification of existing procedures commonly used in the Water Resources Division of the U.S. Geological Survey. The Georgia-Florida Coastal Plain (GAFL) study unit began the intensive data collection phase of the program in October 1992. This report documents the implementation of the NAWQA guidelines by describing the sampling design and procedures for collecting surface-water samples in the GAFL study unit in 1993. This documentation is provided for agencies that use water-quality data and for future study units that will be entering the intensive phase of data collection. The sampling design is intended to account for large- and small-scale spatial variations, and temporal variations in water quality for the study area. Nine fixed sites were selected in drainage basins of different sizes and different land-use characteristics located in different land-resource provinces. Each of the nine fixed sites was sampled regularly for a combination of six constituent groups composed of physical and chemical constituents: field measurements, major ions and metals, nutrients, organic carbon, pesticides, and suspended sediments. Some sites were also sampled during high-flow conditions and storm events. Discussion of the sampling procedure is divided into three phases: sample collection, sample splitting, and sample processing. A cone splitter was used to split water samples for the analysis of the sampling constituent groups except organic carbon from approximately nine liters of stream water collected at four fixed sites that were sampled intensively. An example of the sample splitting schemes designed to provide the sample volumes required for each sample constituent group is described in detail. Information about onsite sample processing has been organized into a flowchart that describes a pathway for each of the constituent groups.
NASA Astrophysics Data System (ADS)
Sherwood, John; Clabeaux, Raeanne; Carbajales-Dale, Michael
2017-10-01
We developed a physically-based environmental account of US food production systems and integrated these data into the environmental-input-output life cycle assessment (EIO-LCA) model. The extended model was used to characterize the food, energy, and water (FEW) intensities of every US economic sector. The model was then applied to every Bureau of Economic Analysis metropolitan statistical area (MSA) to determine their FEW usages. The extended EIO-LCA model can determine the water resource use (kGal), energy resource use (TJ), and food resource use in units of mass (kg) or energy content (kcal) of any economic activity within the United States. We analyzed every economic sector to determine its FEW intensities per dollar of economic output. These data were applied to each of the 382 MSAs to determine their total and per-dollar-of-GDP FEW usages by allocating MSA economic production to the corresponding FEW intensities of US economic sectors. Additionally, a longitudinal study was performed for the Los Angeles-Long Beach-Anaheim, CA, metropolitan statistical area to examine trends in this singular MSA and compare it to the overall results. Results show a strong correlation between GDP and energy use, and between food and water use across MSAs. There is also a correlation between GDP and greenhouse gas emissions. The longitudinal study indicates that these correlations can shift alongside a shifting industrial composition. Comparing MSAs on a per-GDP basis reveals that central and southern California tend to be more resource intensive than many other parts of the country, while much of Florida has abnormally low resource requirements. Results of this study enable a more complete understanding of food, energy, and water as key ingredients of a functioning economy. With the addition of the food data to the EIO-LCA framework, researchers will be able to better study the food-energy-water nexus and gain insight into how these three vital resources are interconnected. Applying this extended model to MSAs has demonstrated that all three resources are important to an MSA's vitality, though the exact proportion of each resource may differ across urban areas.
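The allocation step described above reduces to multiplying each sector's FEW intensity (resource use per dollar of output) by that sector's output in the MSA and summing. A toy version with invented sectors and numbers:

# Toy allocation of sector FEW intensities to an MSA's economic output.
# Sector names, intensities, and outputs are made up for illustration.
water_intensity = {"agriculture": 12.0, "manufacturing": 1.5, "services": 0.2}   # kGal per $M output
energy_intensity = {"agriculture": 3.0, "manufacturing": 6.5, "services": 0.8}   # TJ per $M output

msa_output = {"agriculture": 150.0, "manufacturing": 900.0, "services": 4200.0}  # $M of sector output in the MSA

msa_water = sum(water_intensity[s] * msa_output[s] for s in msa_output)
msa_energy = sum(energy_intensity[s] * msa_output[s] for s in msa_output)
print(f"MSA water use: {msa_water:.0f} kGal, energy use: {msa_energy:.0f} TJ")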
Reduced prefrontal and temporal processing and recall of high "sensation value" ads.
Langleben, Daniel D; Loughead, James W; Ruparel, Kosha; Hakun, Jonathan G; Busch-Winokur, Samantha; Holloway, Matthew B; Strasser, Andrew A; Cappella, Joseph N; Lerman, Caryn
2009-05-15
Public service announcements (PSAs) are non-commercial broadcast ads that are an important part of televised public health campaigns. "Message sensation value" (MSV), a measure of the sensory intensity of audio, visual, and content features of an ad, is an important factor in PSA impact. Some communication theories propose that higher message sensation value brings increased attention and cognitive processing, leading to higher ad impact. Others argue that the attention-intensive format could compete with the ad's message for cognitive resources and result in reduced processing of PSA content and reduced overall effectiveness. Brain imaging during PSA viewing provides a quantitative surrogate measure of PSA impact and addresses questions of PSA evaluation and design not accessible with traditional subjective and epidemiological methods. We used Blood Oxygenation Level Dependent (BOLD) functional Magnetic Resonance Imaging (fMRI) and recognition memory measures to compare high and low MSV anti-tobacco PSAs and neutral videos. In a short-delay, forced-choice memory test, frames extracted from PSAs were recognized more accurately than frames extracted from the neutral videos. Frames from the low MSV PSAs were better recognized than frames from the high MSV PSAs. The accuracy of recognition of PSA frames was positively correlated with prefrontal and temporal activation and negatively correlated with occipital cortex activation. The low MSV PSAs were associated with greater prefrontal and temporal activation than the high MSV PSAs. The high MSV PSAs produced greater activation primarily in the occipital cortex. These findings support the "dual processing" and "limited capacity" theories of communication that postulate a competition between an ad's content and format for the viewers' cognitive resources, and suggest that the "attention-grabbing" high MSV format could impede the learning and retention of an ad. These findings demonstrate the potential of using neuroimaging in the design and evaluation of mass media public health communications.
Resource-mediated indirect effects of grassland management on arthropod diversity.
Simons, Nadja K; Gossner, Martin M; Lewinsohn, Thomas M; Boch, Steffen; Lange, Markus; Müller, Jörg; Pašalić, Esther; Socher, Stephanie A; Türke, Manfred; Fischer, Markus; Weisser, Wolfgang W
2014-01-01
Intensive land use is a driving force for biodiversity decline in many ecosystems. In semi-natural grasslands, land-use activities such as mowing, grazing and fertilization affect the diversity of plants and arthropods, but the combined effects of different drivers and the chain of effects are largely unknown. In this study we used structural equation modelling to analyse how the arthropod communities in managed grasslands respond to land use and whether these responses are mediated through changes in resource diversity or resource quantity (biomass). Plants were considered resources for herbivores which themselves were considered resources for predators. Plant and arthropod (herbivores and predators) communities were sampled on 141 meadows, pastures and mown pastures within three regions in Germany in 2008 and 2009. Increasing land-use intensity generally increased plant biomass and decreased plant diversity, mainly through increasing fertilization. Herbivore diversity decreased together with plant diversity but showed no response to changes in plant biomass. Hence, land-use effects on herbivore diversity were mediated through resource diversity rather than quantity. Land-use effects on predator diversity were mediated by both herbivore diversity (resource diversity) and herbivore quantity (herbivore biomass), but indirect effects through resource quantity were stronger. Our findings highlight the importance of assessing both direct and indirect effects of land-use intensity and mode on different trophic levels. In addition to the overall effects, there were subtle differences between the different regions, pointing to the importance of regional land-use specificities. Our study underlines the commonly observed strong effect of grassland land use on biodiversity. It also highlights that mechanistic approaches help us to understand how different land-use modes affect biodiversity.
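The mediation logic of the structural equation model can be illustrated with a much simpler two-regression check on synthetic data (this is not the authors' full SEM; the variable names and effect sizes are invented):

# Bare-bones mediation check: does plant diversity (resource diversity)
# mediate the effect of land-use intensity on herbivore diversity?
# Synthetic data; not the authors' SEM or dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 141
land_use = rng.uniform(0, 3, n)                        # land-use intensity index
plant_div = 20 - 3 * land_use + rng.normal(0, 2, n)    # mediator: plant diversity
herb_div = 5 + 0.8 * plant_div + rng.normal(0, 2, n)   # outcome: herbivore diversity

m1 = sm.OLS(plant_div, sm.add_constant(land_use)).fit()                               # a path
m2 = sm.OLS(herb_div, sm.add_constant(np.column_stack([land_use, plant_div]))).fit()  # direct and b paths
indirect = m1.params[1] * m2.params[2]                 # a * b, the indirect (mediated) effect
print(f"indirect effect of land use via plant diversity: {indirect:.2f}")
print(f"direct effect of land use on herbivore diversity: {m2.params[1]:.2f}")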
Emami, Nasir; Sobhani, Reza; Rosso, Diego
2018-04-01
A model was developed for a water resources recovery facility (WRRF) activated sludge process (ASP) in the Modified Ludzack-Ettinger (MLE) configuration. Amplification of air requirements and the associated energy consumption was observed as a result of concurrent circadian variations in ASP influent flow and carbonaceous/nitrogenous constituent concentrations. The indirect carbon emissions associated with ASP aeration were further amplified by simultaneous variations in carbon emissions intensity (kgCO2,eq per kWh) and electricity consumption (kWh). The ratio of peak to minimum increased to 3.4 for flow, 4.2 for air flow and energy consumption, and 5.2 for indirect CO2,eq emissions, which is indicative of strong amplification. Similarly, the energy costs for ASP aeration were further increased because peak energy consumption and power demand coincided with time-of-use peak electricity rates. A comparison between the results of the equilibrium model and observed data from the benchmark WRRF demonstrated under- and over-aeration attributable to the circadian variation in air requirements and to limitations in the aeration system specification and design.
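The amplification of indirect emissions arises from multiplying a time-varying electricity demand by a time-varying grid carbon intensity. A schematic calculation with invented hourly values shows how the peak-to-minimum ratio of emissions can exceed that of energy use alone:

# Schematic calculation of indirect emissions from aeration energy use,
# multiplying hourly energy consumption by hourly grid carbon intensity.
# All numbers are invented for illustration.
energy_kwh = [40, 35, 30, 45, 70, 95, 120, 110]                     # hourly aeration energy
grid_intensity = [0.30, 0.28, 0.27, 0.32, 0.40, 0.48, 0.52, 0.50]   # kgCO2,eq per kWh

emissions = [e * i for e, i in zip(energy_kwh, grid_intensity)]     # hourly indirect emissions
for name, series in [("energy", energy_kwh), ("indirect CO2,eq", emissions)]:
    print(f"{name}: peak/min ratio = {max(series) / min(series):.1f}")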
Brown, Michael J; Kor, Daryl J; Curry, Timothy B; Marmor, Yariv; Rohleder, Thomas R
2015-01-01
Transfer of intensive care unit (ICU) patients to the operating room (OR) is a resource-intensive, time-consuming process that often results in patient throughput inefficiencies, deficiencies in information transfer, and suboptimal nurse-to-patient ratios. This study evaluates the implementation of a coordinated patient transport system (CPTS) designed to address these issues. Using data from 1,557 patient transfers covering the 2006-2010 period, interrupted time series and before-and-after designs were used to analyze the effect of implementing a CPTS at Mayo Clinic, Rochester. Using a segmented regression for the interrupted time series, on-time OR start time deviations were found to be significantly lower after the implementation of CPTS (p < .0001). The implementation resulted in a fourfold improvement in on-time OR starts (p < .01) while significantly reducing idle OR time (p < .01). A coordinated patient transfer process for moving patients from ICUs to ORs can significantly improve OR efficiency, reduce non-value-added time, and ensure quality of care by preserving appropriate care provider-to-patient ratios.
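Segmented regression for an interrupted time series, as used above, fits a baseline trend plus a level change and a slope change at the intervention point. A minimal sketch on synthetic monthly data (not the study's transfer data):

# Minimal interrupted-time-series (segmented) regression on synthetic data:
# outcome = baseline trend + level change + slope change after intervention.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
t = np.arange(48)                        # months
post = (t >= 24).astype(float)           # 1 after the intervention
t_post = post * (t - 24)                 # time elapsed since intervention
y = 15 + 0.05 * t - 6 * post - 0.1 * t_post + rng.normal(0, 1.5, t.size)  # e.g., OR start delay (min)

X = sm.add_constant(np.column_stack([t, post, t_post]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # [intercept, baseline slope, level change, slope change]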
The Changing Human Resources Function. Report Number 950.
ERIC Educational Resources Information Center
Freedman, Audrey
The role of the top human resources executive in major corporations has changed during the past decade into a directly involved business advisor, strategist, and implementer of business objectives. Intense competition has overridden previous human resources concerns and forced priorities toward internal, business-driven issues. Since cost-cutting…
Thomas, Neil; Farhall, John; Foley, Fiona; Rossell, Susan L; Castle, David; Ladd, Emma; Meyer, Denny; Mihalopoulos, Cathrine; Leitan, Nuwan; Nunan, Cassy; Frankish, Rosalie; Smark, Tara; Farnan, Sue; McLeod, Bronte; Sterling, Leon; Murray, Greg; Fossey, Ellie; Brophy, Lisa; Kyrios, Michael
2016-09-07
Psychosocial interventions have an important role in promoting recovery in people with persisting psychotic disorders such as schizophrenia. Readily available, digital technology provides a means of developing therapeutic resources for use together by practitioners and mental health service users. As part of the Self-Management and Recovery Technology (SMART) research program, we have developed an online resource providing materials on illness self-management and personal recovery based on the Connectedness-Hope-Identity-Meaning-Empowerment (CHIME) framework. Content is communicated using videos featuring persons with lived experience of psychosis discussing how they have navigated issues in their own recovery. This was developed to be suitable for use on a tablet computer during sessions with a mental health worker to promote discussion about recovery. This is a rater-blinded randomised controlled trial comparing a low intensity recovery intervention of eight one-to-one face-to-face sessions with a mental health worker using the SMART website alongside routine care, versus an eight-session comparison condition, befriending. The recruitment target is 148 participants with a schizophrenia-related disorder or mood disorder with a history of psychosis, recruited from mental health services in Victoria, Australia. Following baseline assessment, participants are randomised to intervention, and complete follow up assessments at 3, 6 and 9 months post-baseline. The primary outcome is personal recovery measured using the Process of Recovery Questionnaire (QPR). Secondary outcomes include positive and negative symptoms assessed with the Positive and Negative Syndrome Scale, subjective experiences of psychosis, emotional symptoms, quality of life and resource use. Mechanisms of change via effects on self-stigma and self-efficacy will be examined. This protocol describes a novel intervention which tests new therapeutic methods including in-session tablet computer use and video-based peer modelling. It also informs a possible low intensity intervention model potentially viable for delivery across the mental health workforce. NCT02474524 , 24 May 2015, retrospectively registered during the recruitment phase.
USDA-ARS?s Scientific Manuscript database
The rainfall-induced removal of pathogens and microbial indicators from land-applied manure with runoff and infiltration greatly contributes to the impairment of surface and groundwater resources. It has been assumed that rainfall intensity and changes in rainfall intensity during a rainfall event d...
López-Gamero, María D; Molina-Azorín, José F; Claver-Cortés, Enrique
2009-07-01
The examination of the possible direct link between environmental protection and firm performance in the literature has generally produced mixed results. The present paper contributes to the literature by using the resource-based view as a mediating process in this relationship. The study specifically tests whether or not the resource-based view of the firm mediates the positive relationships of proactive environmental management and improved environmental performance with competitive advantage, which also has consequences for financial performance. We also check the possible link between the adoption of a pioneering approach and good environmental management practices. Our findings support that early investment timing and intensity in environmental issues impact on the adoption of a proactive environmental management, which in turn helps to improve environmental performance. The findings also show that a firm's resources and competitive advantage act as mediator variables for a positive relationship between environmental protection and financial performance. This contribution is original because the present paper develops a comprehensive whole picture of this path process, which has previously only been partially discussed in the literature. In addition, this study clarifies a relevant point in the literature, namely that the effect of environmental protection on firm performance is not direct and can vary depending on the sector considered. Whereas competitive advantage in relation to costs influences financial performance in the IPPC law sector, the relevant influence in the hotel sector comes from competitive advantage through differentiation.
Nahm, Eun-Shim; Orwig, Denise; Resnick, Barbara; Magaziner, Jay; Bellantoni, Michele; Sterling, Robert
2012-01-12
Hip fracture is a significant health problem for older adults and generally requires surgery followed by intensive rehabilitation. Informal caregivers (CGs) can provide vital assistance to older adults recovering from hip fracture. Caregiving is a dyadic process that affects both CGs and care recipients (CRs). In a feasibility study, we assessed the effects of using a theory-based online hip fracture resource program for CGs on both CGs and CRs. In this article, we discuss our recruitment process and the lessons learned. Participants were recruited from six acute hospitals, and CGs used the online resource program for 8 weeks. A total of 256 hip fracture patients were screened, and 164 CRs were ineligible. CG screening was initiated when CRs were determined to be eligible. Among 41 eligible dyads, 36 dyads were recruited. Several challenges to the recruitment of these dyads for online studies were identified, including a low number of eligible dyads in certain hospitals and difficulty recruiting both the CR and the CG during the short hospital stay. Field nurses often had to make multiple trips to the hospital to meet with both the CR and the CG. Thus, when a subject unit is a dyad recruited from acute settings, the resources required for the recruitment may be more than doubled. These challenges could be successfully alleviated with careful planning, competent field staff members, collaboration with hospital staff members, and efficient field operations.
NASA Astrophysics Data System (ADS)
Haider, Shahid A.; Kazemzadeh, Farnoud; Wong, Alexander
2017-03-01
An ideal laser is a useful tool for the analysis of biological systems. In particular, the polarization property of lasers can allow for the concentration of important organic molecules in the human body, such as proteins, amino acids, lipids, and carbohydrates, to be estimated. However, lasers do not always work as intended and there can be effects such as mode hopping and thermal drift that can cause time-varying intensity fluctuations. The causes of these effects can be from the surrounding environment, where either an unstable current source is used or the temperature of the surrounding environment is not temporally stable. This intensity fluctuation can cause bias and error in typical organic molecule concentration estimation techniques. In a low-resource setting where cost must be limited and where environmental factors, like unregulated power supplies and temperature, cannot be controlled, the hardware required to correct for these intensity fluctuations can be prohibitive. We propose a method for computational laser intensity stabilisation that uses Bayesian state estimation to correct for the time-varying intensity fluctuations from electrical and thermal instabilities without the use of additional hardware. This method will allow for consistent intensities across all polarization measurements for accurate estimates of organic molecule concentrations.
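The abstract does not spell out the estimator; a one-dimensional Kalman filter is one common form of Bayesian state estimation for tracking a slowly drifting signal, and a generic sketch on simulated noisy intensity readings looks like this (not the authors' exact method):

# One-dimensional Kalman filter tracking a slowly drifting laser intensity
# from noisy measurements. A generic sketch, not the paper's estimator.
import random

def kalman_1d(measurements, q=1e-4, r=1e-2):
    x, p = measurements[0], 1.0          # initial state estimate and variance
    estimates = []
    for z in measurements:
        p = p + q                        # predict (random-walk intensity model)
        k = p / (p + r)                  # Kalman gain
        x = x + k * (z - x)              # update with measurement z
        p = (1 - k) * p
        estimates.append(x)
    return estimates

random.seed(0)
true_intensity = [1.0 + 0.001 * t for t in range(200)]          # slow thermal drift
measured = [v + random.gauss(0, 0.05) for v in true_intensity]  # fluctuating readings
smoothed = kalman_1d(measured)
print(f"last raw={measured[-1]:.3f}, smoothed={smoothed[-1]:.3f}, true={true_intensity[-1]:.3f}")

The process-noise and measurement-noise parameters q and r would in practice be tuned to the observed drift and fluctuation scales.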
Career Planning: Developing the Nation's Primary Resource.
ERIC Educational Resources Information Center
Jarvis, Phillip S.
1995-01-01
Argues for intensive and ongoing career planning assistance for all age groups to ensure the development of people resources to meet Canada's economic needs. Points out the economic consequences of inadequate planning. (LKS)
Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.
2011-01-01
Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.
Economic input-output life-cycle assessment of trade between Canada and the United States.
Norman, Jonathan; Charpentier, Alex D; MacLean, Heather L
2007-03-01
With increasing trade liberalization, attempts at accounting for environmental impacts and energy use across the manufacturing supply chain are complicated by the predominance of internationally supplied resources and products. This is particularly true for Canada and the United States, the world's largest trading partners. We use an economic input-output life-cycle assessment (EIO-LCA) technique to estimate the economy-wide energy intensity and greenhouse gas (GHG) emissions intensity for 45 manufacturing and resource sectors in Canada and the United States. Overall, we find that U.S. manufacturing and resource industries are about 1.15 times as energy-intensive and 1.3 times as GHG-intensive as Canadian industries, with significant sector-specific discrepancies in energy and GHG intensity. This trend is mainly due to a greater direct reliance on fossil fuels for many U.S. industries, in addition to a highly fossil-fuel based electricity mix in the U.S. To account for these differences, we develop a 76 sector binational EIO-LCA model that implicitly considers trade in goods between Canada and the U.S. Our findings show that accounting for trade can significantly alter the results of life-cycle assessment studies, particularly for many Canadian manufacturing sectors, and the production/consumption of goods in one country often exerts significant energy- and GHG-influences on the other.
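In EIO-LCA models of this kind, economy-wide (direct plus supply-chain) intensities are obtained by propagating direct sector intensities through the Leontief inverse. A two-sector toy example with invented coefficients:

# Two-sector toy EIO-LCA: economy-wide GHG intensity = direct intensity
# propagated through the Leontief inverse (I - A)^-1. Numbers are invented.
import numpy as np

A = np.array([[0.10, 0.20],        # inter-industry requirements ($ input per $ output)
              [0.05, 0.15]])
direct_ghg = np.array([0.8, 0.3])  # kg CO2,eq per $ of each sector's own output

L = np.linalg.inv(np.eye(2) - A)   # Leontief inverse
total_ghg = direct_ghg @ L         # economy-wide intensity per $ of final demand
print("total GHG intensity per $ final demand:", total_ghg.round(3))

A binational version of this calculation links the two countries' requirement matrices so that imported inputs carry the exporting country's intensities.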
Beneficiation of Kulon Progo iron sand by using tabling and magnetic separation methods
NASA Astrophysics Data System (ADS)
Oediyani, Soesaptri; Ikhlasul Amal, M.; M. Victoriyan, N.; Juniarsih, Andinnie
2018-04-01
There are two types of iron resources: primary iron ore and iron sand. In general, primary iron ores are used as raw materials in iron and steel making because their iron content is high (about 60%) and they can be reduced directly. Iron sand, on the other hand, is rarely used as a raw material because its iron content is low (20-40%), even though iron sand reserves in Indonesia are very abundant, for instance about 173 million tons in Kulon Progo, Yogyakarta. In addition, a new regulation of the Ministry of Energy and Mineral Resources requires that iron sands be processed before being exported. Therefore, proper beneficiation methods are needed to improve the iron content of iron sand. In this research, Kulon Progo iron sand was used as the raw material, not only because the reserves are very abundant but also because a new iron making plant will be built there soon. A combination of ore concentration methods, tabling and magnetic separation, was used to improve the iron content. The process variables were the inclination of the table (2°, 3° and 4°), the feed size fraction (-100 mesh, -150 mesh and -200 mesh) and the magnetic intensity (176, 830, 1500 Gauss). The highest recovery was about 96.75%, and a concentrate containing 59.78% Fe was achieved using a -200 mesh particle size, 4° table inclination and 1500 Gauss magnetic intensity. In conclusion, this concentrate fulfills the raw material requirements for iron making (≥ 55% Fe).
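Recovery figures such as the 96.75% quoted above follow from the standard grade-recovery balance, recovery = (concentrate mass x concentrate grade) / (feed mass x feed grade). A worked check with an assumed feed grade and masses (only the concentrate grade comes from the abstract):

# Standard grade-recovery balance for a concentration step.
# Feed mass, feed grade, and concentrate mass are assumed for illustration.
feed_mass, feed_grade = 100.0, 0.35       # kg, fraction Fe (assumed)
conc_mass, conc_grade = 56.65, 0.5978     # kg (assumed), fraction Fe (from the abstract)

recovery = 100.0 * (conc_mass * conc_grade) / (feed_mass * feed_grade)
print(f"Fe recovery = {recovery:.2f} %")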
NASA Astrophysics Data System (ADS)
Sanchez-Mejia, Z. M.; Papuga, S. A.
2013-12-01
In semiarid regions, where water resources are limited and precipitation dynamics are changing, understanding the land surface-atmosphere interactions that regulate the coupled soil moisture-precipitation system is key for resource management and planning. We present a modeling approach to study soil moisture and albedo controls on planetary boundary layer height (PBLh). We used data from the Santa Rita Creosote Ameriflux site and the Tucson Airport atmospheric sounding to generate empirical relationships between soil moisture, albedo, and PBLh; these relationships show that at least 50% of the variation in PBLh can be explained by soil moisture and albedo. We then used a stochastically driven two-layer bucket model of soil moisture dynamics, together with our empirical relationships, to model PBLh. We explored soil moisture dynamics under three different mean annual precipitation regimes (current, increased, and decreased) to evaluate the influence of soil moisture on land surface-atmosphere processes. While our precipitation regimes are simple, they represent future precipitation regimes that can influence the two soil layers in our conceptual framework. For instance, an increase in annual precipitation could affect deep soil moisture and atmospheric processes if precipitation events remain intense. We observed that the response of soil moisture, albedo, and PBLh will depend not only on changes in annual precipitation, but also on the frequency and intensity of this change. We argue that because albedo and soil moisture data are readily available at multiple temporal and spatial scales, developing empirical relationships that can be used in land surface-atmosphere applications is of great value.
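A stripped-down version of this workflow steps a two-layer bucket model forward under stochastic rainfall and feeds the resulting shallow soil moisture and an albedo proxy into an empirical linear relation for PBL height. All parameters and coefficients below are invented placeholders, not the fitted values from the study:

# Stripped-down two-layer bucket model plus empirical PBL-height relation.
# All parameters and regression coefficients are invented placeholders.
import random

random.seed(42)
s_shallow, s_deep = 0.15, 0.20          # soil moisture (vol/vol) in the two layers
pblh = []
for day in range(90):
    rain = random.expovariate(1 / 5.0) if random.random() < 0.2 else 0.0  # mm, stochastic storms
    infil = min(rain, 10.0)
    s_shallow = min(s_shallow + infil / 100.0, 0.40)
    drainage = max(s_shallow - 0.25, 0.0) * 0.5          # excess shallow water drains downward
    s_shallow = max(s_shallow - drainage - 0.01, 0.02)   # drainage plus evaporative loss
    s_deep = max(min(s_deep + drainage - 0.002, 0.35), 0.02)
    albedo = 0.30 - 0.25 * s_shallow                     # wetter surface -> darker
    pblh.append(4000 - 6000 * s_shallow - 5000 * (albedo - 0.25))  # empirical-style relation (m)

print(f"mean PBL height over the run: {sum(pblh) / len(pblh):.0f} m")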
Mental fatigue and impaired response processes: event-related brain potentials in a Go/NoGo task.
Kato, Yuichiro; Endo, Hiroshi; Kizuka, Tomohiro
2009-05-01
The effects of mental fatigue on the availability of cognitive resources and associated response-related processes were examined using event-related brain potentials. Subjects performed a Go/NoGo task for 60 min. Reaction time, number of errors, and mental fatigue scores all significantly increased with time spent on the task. The NoGo-P3 amplitude significantly decreased with time on task, but the Go-P3 amplitude was not modulated. The amplitude of error-related negativity (Ne/ERN) also decreased with time on task. These results indicate that mental fatigue attenuates resource allocation and error monitoring for NoGo stimuli. The Go- and NoGo-P3 latencies both increased with time on task, indicative of a delay in stimulus evaluation time due to mental fatigue. NoGo-N2 latency increased with time on task, but NoGo-N2 amplitude was not modulated. The amplitude of the response-locked lateralized readiness potential (LRP) significantly decreased with time on task. Mental fatigue thus appears to slow down the time course of response inhibition and to impair the intensity of response execution.
Streaming Support for Data Intensive Cloud-Based Sequence Analysis
Issa, Shadi A.; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J.; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed
2013-01-01
Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation. PMID:23710461
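The underlying idea, processing records as they arrive rather than after the full transfer completes, can be sketched generically (this is not the elastream API): read a FASTQ stream from standard input and update per-read statistics on the fly.

# Generic stream-while-transferring sketch (not the elastream package):
# records are processed as they arrive instead of after the full transfer.
import sys

def count_gc(seq):
    return sum(base in "GC" for base in seq)

def process_fastq_stream(stream):
    total_reads, total_gc, total_len = 0, 0, 0
    for i, line in enumerate(stream):      # reads handled as they stream in
        if i % 4 == 1:                     # sequence line of each 4-line FASTQ record
            seq = line.strip()
            total_reads += 1
            total_gc += count_gc(seq)
            total_len += len(seq)
    if total_len:
        print(f"{total_reads} reads, GC content {100 * total_gc / total_len:.1f}%")

if __name__ == "__main__":
    process_fastq_stream(sys.stdin)

Piping a transfer into the script above (for example, curl -s <url> | python the_script.py) lets computation overlap with data movement, which is the effect the streaming scheme exploits.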
NASA Astrophysics Data System (ADS)
Vilotte, Jean-Pierre; Atkinson, Malcolm; Carpené, Michele; Casarotti, Emanuele; Frank, Anton; Igel, Heiner; Rietbrock, Andreas; Schwichtenberg, Horst; Spinuso, Alessandro
2016-04-01
Seismology pioneers global and open-data access -- with internationally approved data, metadata and exchange standards facilitated worldwide by the Federation of Digital Seismic Networks (FDSN) and in Europe the European Integrated Data Archives (EIDA). The growing wealth of data generated by dense observation and monitoring systems and recent advances in seismic wave simulation capabilities induces a change in paradigm. Data-intensive seismology research requires a new holistic approach combining scalable high-performance wave simulation codes and statistical data analysis methods, and integrating distributed data and computing resources. The European E-Infrastructure project "Virtual Earthquake and seismology Research Community e-science environment in Europe" (VERCE) pioneers the federation of autonomous organisations providing data and computing resources, together with a comprehensive, integrated and operational virtual research environment (VRE) and E-infrastructure devoted to the full path of data use in a research-driven context. VERCE delivers to a broad base of seismology researchers in Europe easily used high-performance full waveform simulations and misfit calculations, together with a data-intensive framework for the collaborative development of innovative statistical data analysis methods, all of which were previously only accessible to a small number of well-resourced groups. It balances flexibility with new integrated capabilities to provide a fluent path from research innovation to production. As such, VERCE is a major contribution to the implementation phase of the ``European Plate Observatory System'' (EPOS), the ESFRI initiative of the solid-Earth community. The VRE meets a range of seismic research needs by eliminating chores and technical difficulties to allow users to focus on their research questions. It empowers researchers to harvest the new opportunities provided by well-established and mature high-performance wave simulation codes of the community. It enables active researchers to invent and refine scalable methods for innovative statistical analysis of seismic waveforms in a wide range of application contexts. The VRE paves the way towards a flexible shared framework for seismic waveform inversion, lowering the barriers to uptake for the next generation of researchers. The VRE can be accessed through the science gateway that puts together computational and data-intensive research into the same framework, integrating multiple data sources and services. It provides a context for task-oriented and data-streaming workflows, and maps user actions to the full gamut of the federated platform resources and procurement policies, activating the necessary behind-the-scene automation and transformation. The platform manages and produces domain metadata, coupling them with the provenance information describing the relationships and the dependencies, which characterise the whole workflow process. This dynamic knowledge base, can be explored for validation purposes via a graphical interface and a web API. Moreover, it fosters the assisted selection and re-use of the data within each phase of the scientific analysis. These phases can be identified as Simulation, Data Access, Preprocessing, Misfit and data processing, and are presented to the users of the gateway as dedicated and interactive workspaces. 
By enabling researchers to share results and provenance information, VERCE steers open-science behaviour, allowing researchers to discover and build on prior work and thereby to progress faster. A key asset is the agile strategy that VERCE deployed in a multi-organisational context, engaging seismologists, data scientists, ICT researchers, HPC and data resource providers, and system administrators in short-lived tasks, each with a goal that is a seismology priority, and intimately coupling research thinking with technical innovation. This changes the focus from HPC production environments and community data services to user-focused scenarios, avoiding wasteful bouts of technology centricity where technologists collect requirements and develop a system that is not used because the ideas of the planned users have moved on. As such, the technologies and concepts developed in VERCE are relevant to many other disciplines in computational and data-driven Earth Sciences and can provide the key technologies for a Europe-wide computational and data-intensive framework in Earth Sciences.
IAU Public Astronomical Organisations Network
NASA Astrophysics Data System (ADS)
Canas, Lina; Cheung, Sze Leung
2015-08-01
The Office for Astronomy Outreach has devoted substantial resources to creating and supporting a global network of public astronomical organisations. The effort focuses on bringing established and newly formed amateur astronomy organisations together, providing communication channels and platforms for disseminating news to the global community, and sharing best practices and resources among these associations worldwide. Because these organisations are important for disseminating activities globally and act as key participants in various IAU campaigns, social media has played a key role in keeping the network engaged and connected. Here we discuss the implementation process of maintaining this extensive network, the gathering and processing of information, and the interactions with local active members at national and international levels.
The Real-Time IRB: A Collaborative Innovation to Decrease IRB Review Time.
Spellecy, Ryan; Eve, Ann Marie; Connors, Emily R; Shaker, Reza; Clark, David C
2018-06-01
Lengthy review times for institutional review boards (IRBs) are a well-known barrier to research. In response to numerous calls to reduce review times, we devised "Real-Time IRB," a process that drastically reduces IRB review time. In this process, investigators and study staff attend the IRB meeting and make changes to the protocol while the IRB continues its meeting, so that final approval can be issued at the meeting. This achieved an overall 40% reduction in time from IRB submission to final approval. While this process is time and resource intensive and cannot address all delays in research, it shows great promise for increasing the pace at which research is translated to patient care.
[Implementation of a rational standard of hygiene for preparation of operating rooms].
Bauer, M; Scheithauer, S; Moerer, O; Pütz, H; Sliwa, B; Schmidt, C E; Russo, S G; Waeschle, R M
2015-10-01
The assurance of high standards of care is a major requirement in German hospitals, while cost reduction and efficient use of resources are mandatory. These requirements are particularly evident in the high-risk and cost-intensive operating theatre field with its multiple process steps. The cleaning of operating rooms (OR) between surgical procedures is of major relevance for patient safety and requires time and human resources. The hygiene procedure plan for OR cleaning between operations at the university hospital in Göttingen was revised and optimized according to the plan-do-check-act principle because of unclear specifications of responsibilities and resource use, prolonged process times and increased staff workload. The current status was evaluated in 2012 as part of the first step "plan". The subsequent step "do" included an expert symposium with external consultants, interdisciplinary consensus conferences to update the former hygiene procedure plan, and the implementation process. All staff members involved were integrated into this change management process. The penetration rate of the training and information measures as well as the acceptance of and compliance with the new hygiene procedure plan were reviewed within step "check". The rates of positive swabs and air samples as well as of postoperative wound infections were analyzed for quality control, and no evidence of reduced effectiveness of the new hygiene plan was found. After the successful implementation of these measures, the next improvement cycle ("act") was performed in 2014, which led to a simplification of the hygiene plan by reducing the number of defined cleaning and disinfection programs for preparation of the OR. The reorganization measures described led to broad acceptance of the hygiene procedure plan through clear specifications of responsibilities, procedures and resource use. Furthermore, a simplification of the plan, a rational staff assignment and reduced process times were accomplished. Finally, potential conflicts arising from staff members' insufficient evidence-based knowledge were reduced. This project description can be used by other hospitals as a guideline for similar changes in management processes.
Component Prioritization Schema for Achieving Maximum Time and Cost Benefits from Software Testing
NASA Astrophysics Data System (ADS)
Srivastava, Praveen Ranjan; Pareek, Deepak
Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Defining the end of software testing is a crucial feature of any software development project. A premature release involves risks such as undetected bugs, the cost of fixing faults later, and discontented customers. Any software organization wants to achieve the maximum possible benefit from software testing with minimum resources. Testing time and cost need to be optimized to achieve a competitive edge in the market. In this paper, we propose a schema, called the Component Prioritization Schema (CPS), to achieve an effective and uniform prioritization of software components. This schema serves as an extension to the Non-Homogeneous Poisson Process based Cumulative Priority Model. We also introduce an approach for handling time-intensive versus cost-intensive projects.
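The CPS formulas themselves are not reproduced in this abstract; the sketch below only shows the general shape of such a prioritization, scoring components by expected defect exposure against test cost, with a weight that could be shifted for time-intensive versus cost-intensive projects. Component data and weights are illustrative.

# Generic weighted prioritization of components for testing, trading off
# expected defect exposure against test cost. Not the paper's CPS formulas;
# component data and weights are illustrative.
components = [
    # (name, expected defects exposed, test cost in person-hours)
    ("payment", 12, 30),
    ("auth", 9, 15),
    ("reporting", 5, 10),
    ("ui", 3, 8),
]

def priority(defects, cost, w_defect=1.0, w_cost=0.3):
    # Shift w_cost up for cost-intensive projects, down for time-intensive ones.
    return w_defect * defects - w_cost * cost

ranked = sorted(components, key=lambda c: priority(c[1], c[2]), reverse=True)
for name, defects, cost in ranked:
    print(f"{name:10s} priority = {priority(defects, cost):.1f}")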
NASA Astrophysics Data System (ADS)
Randhir, Timothy O.; Raposa, Sarah
2014-11-01
Urbanization has a significant impact on water resources and requires a watershed-based approach to evaluate the impacts of land use and urban development on watershed processes. This study uses simulation with urban policy scenarios to model and strategize transferable recommendations for municipalities and cities to guide urban decisions using watershed ecohydrologic principles. The watershed simulation model is used to evaluate intensive (policy in existing built regions) and extensive (policy outside existing built regions) urban development scenarios with and without implementation of Best Management Practices (BMPs). Water quantity and quality changes are simulated to assess the effectiveness of five urban development scenarios. It is observed that an optimal combination of intensive and extensive strategies can be used to sustain urban ecosystems. BMPs are found to be critical to reducing the storm water and water quality impacts of urban development. Conservation zoning and incentives for voluntary adoption of BMPs can be used in sustaining urbanizing watersheds.
Horner, Ronnie D; Szaflarski, Jerzy P; Ying, Jun; Meganathan, Karthikeyan; Matthews, Gerald; Schroer, Brian; Weber, Debra; Raphaelson, Marc
2011-11-01
Similarities and differences in physician work intensity among specialties are poorly understood but have implications for quality of care, patient safety, practice organization and management, and payment. To determine the magnitude and important dimensions of physician work intensity for 4 specialties. Cross-sectional assessment of work intensity associated with actual patient care in the examination room or operating room. A convenience sample of 45 family physicians, 20 general internists, 22 neurologists, and 21 surgeons, located in Kansas, Kentucky, Maryland, Ohio, and Virginia. Work intensity measures included the National Aeronautics and Space Administration-Task Load Index (NASA-TLX), Subjective Work Assessment Technique (SWAT), and Multiple Resource Questionnaire. Stress was measured by the Dundee Stress State Questionnaire. Physicians reported similar magnitude of work intensity on the NASA-TLX and Multiple Resource Questionnaire. On the SWAT, general internists reported work intensity similar to surgeons but significantly lower than family physicians and neurologists (P=0.035). Surgeons reported significantly higher levels of task engagement on the stress measure than the other specialties (P=0.019), significantly higher intensity on physical demand (P < 0.001), and significantly lower intensity on the performance dimensions of the NASA-TLX than the other specialties (P=0.003). Surgeons reported the lowest intensity for temporal demand of all specialties, being significantly lower than either family physicians or neurologists (P=0.014). Family physicians reported the highest intensity on the time dimension of the SWAT, being significantly higher than either general internists or surgeons (P=0.008). Level of physician work intensity seems to be similar among specialties.
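For reference, the weighted NASA-TLX workload score combines six subscale ratings (0-100) with pairwise-comparison weights that sum to 15; the ratings and weights below are invented, not the study's data:

# Standard NASA-TLX weighted workload score: six subscale ratings (0-100)
# combined with pairwise-comparison weights that sum to 15.
# Ratings and weights below are invented for illustration.
ratings = {            # 0-100 per dimension
    "mental": 70, "physical": 25, "temporal": 60,
    "performance": 40, "effort": 65, "frustration": 35,
}
weights = {            # tallies from 15 pairwise comparisons (must sum to 15)
    "mental": 5, "physical": 1, "temporal": 3,
    "performance": 2, "effort": 3, "frustration": 1,
}

assert sum(weights.values()) == 15
overall = sum(ratings[d] * weights[d] for d in ratings) / 15.0
print(f"weighted NASA-TLX workload = {overall:.1f}")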
Thompson, Lindsay A; Goodman, David C; Little, George A
2002-06-01
Despite high per capita health care expenditure, the United States has crude infant survival rates that are lower than similarly developed nations. Although differences in vital recording and socioeconomic risk have been studied, a systematic, cross-national comparison of perinatal health care systems is lacking. To characterize systems of reproductive care for the United States, Australia, Canada, and the United Kingdom, including a detailed analysis of neonatal intensive care and mortality. Comparison of selected indicators of reproductive care and mortality from 1993-2000 through a systematic review of journal and government publications and structured interviews of leaders in perinatal and neonatal care. Compared with the other 3 countries, the United States has more neonatal intensive care resources yet provides proportionately less support for preconception and prenatal care. Unlike the United States, the other countries provided free family planning services and prenatal and perinatal physician care, and the United Kingdom and Australia paid for all contraception. The United States has high neonatal intensive care capacity, with 6.1 neonatologists per 10 000 live births; Australia, 3.7; Canada, 3.3; and the United Kingdom, 2.7. For intensive care beds, the United States has 3.3 per 10 000 live births; Australia and Canada, 2.6; and the United Kingdom, 0.67. Greater neonatal intensive care resources were not consistently associated with lower birth weight-specific mortality. The relative risk (United States as reference) of neonatal mortality for infants <1000 g was 0.84 for Australia, 1.12 for Canada, and 0.99 for the United Kingdom; for 1000 to 2499 g infants, the relative risk was 0.97 for Australia, 1.26 for Canada, and 0.95 for the United Kingdom. As reported elsewhere, low birth weight rates were notably higher in the United States, partially explaining the high crude mortality rates. The United States has significantly greater neonatal intensive care resources per capita, compared with 3 other developed countries, without having consistently better birth weight-specific mortality. Despite low birth weight rates that exceed other countries, the United States has proportionately more providers per low birth weight infant, but offers less extensive preconception and prenatal services. This study questions the effectiveness of the current distribution of US reproductive care resources and its emphasis on neonatal intensive care.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Ching-Ho, E-mail: chchen@tea.ntue.edu.t; Liu, Wei-Lin, E-mail: wlliu@nanya.edu.t; Graduate Institute of Environmental Engineering, National Central University, Jungli, Taoyuan 320, Taiwan
Strategic environmental assessment (SEA) focuses primarily on assessing how policies, plans, and programs (PPPs) influence the sustainability of the involved regions. However, the processes of assessing policies and developing management strategies for pollution load and resource use are usually separate in the current SEA system. This study developed a policy management methodology to overcome the defects generated during the above processes. This work first devised a dynamic management framework using the methods of systems thinking, system dynamics, and Managing for Results (MFRs). Furthermore, a driving force-pressure-state-impact-response (DPSIR) indicator system was developed. The golf course installation policy was applied as a case study. Taiwan, the counties of Taiwan, and the golf courses within those individual counties were identified as a system, subsystems, and objects, respectively. This study identified an object-linked double-layer framework with multi-stage options to simultaneously quantify golf courses in each subsystem and determine ratios of abatement and allocation for pollution load and resource use of each golf course. The DPSIR indicator values for each item of each golf course in each subsystem are calculated based on the options taken in the two decision layers. The summation of indicator values for all items of all golf courses in all subsystems according to various options is defined as the sustainability value of the policy. An optimization model and a system (IDPMS) were developed to obtain the greatest sustainability value of the policy, with golf course quantity, human activity intensity, and total quantities of pollution load and resource use obtained simultaneously. A solution method based on enumeration of multiple bounds for objectives and constraints (EMBOC) was developed for the problem, which has 1.95 x 10^128 combinations of possible options, and obtains the optimal solution in ten minutes on a personal computer with a 3.0 GHz CPU. The study found that the policy with the optimal environmental sustainability value in Taiwan is 102 golf courses. The concurrently obtained human activity intensity and total quantities of pollution load and resource use are less than those of the existing policy and the existing quantities in 2006. The optimal solution remains unchanged under most sensitivity analysis conditions, unless the weights and constraints are changed extremely. The analytical results indicate that the proposed methodology can assist the authorities in simultaneously generating and assessing policy during the SEA process.
NASA Astrophysics Data System (ADS)
Hinkley, James T.; McNaughton, Robbie K.; Pye, John; Saw, Woei; Stechel, Ellen B.
2016-05-01
Reforming of methane is practiced on a vast scale globally for the production of syngas as a precursor for the production of many commodities, including hydrogen, ammonia and synthetic liquid fuels. Solar reforming can reduce the greenhouse gas intensity of syngas production by up to about 40% by using solar thermal energy to provide the endothermic heat of reaction, traditionally supplied by combustion of some of the feed. This has the potential to enable the production of solar-derived synthetic fuels as drop-in replacements for conventional fuels with significantly lower CO2 intensity than conventional gas-to-liquids (GTL) processes. However, the intermittent nature of the solar resource - both diurnal and seasonal - poses significant challenges for such a concept, which relies on synthesis processes that typically run continuously on very stable feed compositions. We find that the integration of solar syngas production into a GTL process is a non-trivial exercise; the ability to turn down the capacity of the GTL synthesis section, and indeed to suspend operations for short periods without significant detriment to product quality or process operability, is likely to be a key driver for the commercial implementation of solar liquid fuels. Projected costs for liquid fuel synthesis suggest that solar reforming and small-scale gas-to-liquid synthesis could compete with conventional oil-derived transport fuels in the short to medium term.
A comparative assessment of resource efficiency in petroleum refining
Han, Jeongwoo; Forman, Grant S.; Elgowainy, Amgad; ...
2015-03-25
Because of increasing environmental and energy security concerns, a detailed understanding of energy efficiency and greenhouse gas (GHG) emissions in the petroleum refining industry is critical for fair and equitable energy and environmental policies. To date, this has proved challenging due in part to the complex nature and variability within refineries. In an effort to simplify energy and emissions refinery analysis, we delineated LP modeling results from 60 large refineries from the US and EU into broad categories based on crude density (API gravity) and heavy product (HP) yields. Product-specific efficiencies and process fuel shares derived from this study were incorporated in Argonne National Laboratory's GREET life-cycle model, along with regional upstream GHG intensities of crude, natural gas and electricity specific to the US and EU regions. The modeling results suggest that refineries that process relatively heavier crude inputs and have lower yields of HPs generally have lower energy efficiencies and higher GHG emissions than refineries that run lighter crudes with lower yields of HPs. The former types of refineries tend to utilize energy-intensive units which are significant consumers of utilities (heat and electricity) and hydrogen. Among the three groups of refineries studied, the major difference in the energy intensities is due to the amount of purchased natural gas for utilities and hydrogen, while the sum of refinery feed inputs is generally constant. These results highlight the GHG emissions cost a refiner pays to process deep into the barrel to produce more of the desirable fuels with a low carbon-to-hydrogen ratio.
Experience in using commercial clouds in CMS
NASA Astrophysics Data System (ADS)
Bauerdick, L.; Bockelman, B.; Dykstra, D.; Fuess, S.; Garzoglio, G.; Girone, M.; Gutsche, O.; Holzman, B.; Hufnagel, D.; Kim, H.; Kennedy, R.; Mason, D.; Spentzouris, P.; Timm, S.; Tiradani, A.; Vaandering, E.; CMS Collaboration
2017-10-01
Historically, high-energy physics computing has been performed on large purpose-built computing systems. In the beginning there were single site computing facilities, which evolved into the Worldwide LHC Computing Grid (WLCG) used today. The vast majority of the WLCG resources are used for LHC computing and the resources are scheduled to be continuously used throughout the year. In the last several years there has been an explosion in capacity and capability of commercial and academic computing clouds. Cloud resources are highly virtualized and intended to be flexibly deployed for a variety of computing tasks. There is a growing interest amongst the cloud providers to demonstrate the capability to perform large scale scientific computing. In this presentation we will discuss results from the CMS experiment using the Fermilab HEPCloud Facility, which utilized both local Fermilab resources and Amazon Web Services (AWS). The goal was to work with AWS through a matching grant to demonstrate a sustained scale approximately equal to half of the worldwide processing resources available to CMS. We will discuss the planning and technical challenges involved in organizing the most I/O-intensive CMS workflows on a large-scale set of virtualized resources provisioned by the Fermilab HEPCloud. We will describe the data handling and data management challenges. Also, we will discuss the economic issues and cost and operational efficiency comparison to our dedicated resources. At the end we will consider the changes in the working model of HEP computing in an environment where large-scale resources are available and scheduled at peak times.
Spatiotemporal Domain Decomposition for Massive Parallel Computation of Space-Time Kernel Density
NASA Astrophysics Data System (ADS)
Hohl, A.; Delmelle, E. M.; Tang, W.
2015-07-01
Accelerated processing capabilities are deemed critical when conducting analysis on spatiotemporal datasets of increasing size, diversity and availability. High-performance parallel computing offers the capacity to solve computationally demanding problems in a limited timeframe, but likewise poses the challenge of preventing processing inefficiency due to workload imbalance between computing resources. Therefore, when designing new algorithms capable of implementing parallel strategies, careful spatiotemporal domain decomposition is necessary to account for heterogeneity in the data. In this study, we perform octtree-based adaptive decomposition of the spatiotemporal domain for parallel computation of space-time kernel density. In order to avoid edge effects near subdomain boundaries, we establish spatiotemporal buffers to include adjacent data-points that are within the spatial and temporal kernel bandwidths. Then, we quantify computational intensity of each subdomain to balance workloads among processors. We illustrate the benefits of our methodology using a space-time epidemiological dataset of Dengue fever, an infectious vector-borne disease that poses a severe threat to communities in tropical climates. Our parallel implementation of kernel density reaches substantial speedup compared to sequential processing, and achieves high levels of workload balance among processors due to great accuracy in quantifying computational intensity. Our approach is portable to other space-time analytical tests.
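To make the quantity being parallelized concrete, the following is a serial toy version of a space-time kernel density estimate. It is not the authors' octtree-decomposed parallel implementation; the Epanechnikov kernels and bandwidths are assumptions for illustration only.

```python
# Toy serial space-time kernel density estimate (STKDE). This only illustrates
# the computation that the octtree decomposition distributes; the kernels and
# bandwidths are assumptions, not the authors' choices.
import numpy as np

def stkde(eval_xyt, data_xyt, hs=1.0, ht=7.0):
    """Density at each evaluation point given events with (x, y, t) coordinates."""
    ex, ey, et = (eval_xyt[:, i][:, None] for i in range(3))
    dx, dy, dt = (data_xyt[:, i][None, :] for i in range(3))
    # Normalized spatial and temporal distances
    ds2 = ((ex - dx) ** 2 + (ey - dy) ** 2) / hs ** 2
    dt2 = ((et - dt) / ht) ** 2
    ks = np.where(ds2 < 1.0, 2.0 / np.pi * (1.0 - ds2), 0.0)   # spatial Epanechnikov
    kt = np.where(dt2 < 1.0, 0.75 * (1.0 - dt2), 0.0)          # temporal Epanechnikov
    n = data_xyt.shape[0]
    return (ks * kt).sum(axis=1) / (n * hs ** 2 * ht)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    events = rng.uniform(0, 100, size=(500, 3))   # synthetic (x, y, t) events
    grid = rng.uniform(0, 100, size=(10, 3))      # evaluation points
    print(stkde(grid, events))
```

Because the cost of each evaluation point scales with the number of events inside its space-time bandwidth, the per-subdomain workload estimate used for balancing can be read off directly from counts of buffered events, which is the intuition behind the computational-intensity quantification above.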
Singer, Kanakadurga; Subbaiah, Perla; Hutchinson, Raymond; Odetola, Folafoluwa; Shanley, Thomas P
2011-11-01
To describe the clinical course, resource use, and mortality of patients with leukemia admitted to the pediatric intensive care unit with sepsis and nonsepsis diagnoses over a 10-yr period. Retrospective analysis. Tertiary medical-surgical pediatric intensive care unit at C.S. Mott Children's Hospital, University of Michigan. All patients with leukemia admitted to the pediatric intensive care unit from January 1, 1998, to December 31, 2008. None; chart review. Clinical course was characterized by demographics, leukemia diagnosis, phase of therapy, leukocyte count on admission, presence of sepsis, steroid administration, intensity of care, and Pediatric Risk of Mortality score on admission to the pediatric intensive care unit. The primary outcome was survival to pediatric intensive care unit discharge. Among 68 single admissions to the pediatric intensive care unit with leukemia during the study period, 33 (48.5%) were admitted with sepsis. Admission to the pediatric intensive care unit for sepsis was associated with greater compromise of hemodynamic and renal function and use of stress dose steroids (p = .016), inotropic and/or vasopressor drugs (p = .01), and renal replacement therapy (p = .028) than nonsepsis admission. There was higher mortality among children with sepsis than other diagnoses (52% vs. 17%, p = .004). Also, mortality among children with sepsis was higher among those with acute lymphoblastic leukemia (60% vs. 44%) compared with acute myelogenous leukemia. Administration of stress dose steroids was associated with higher mortality (50% vs. 17%, p = .005) and neutropenia. Patients with acute lymphoblastic leukemia and sepsis showed the greatest mortality and resource use. Patients with acute leukemia and sepsis had a much higher mortality rate compared with previously described sepsis mortality rates for the general pediatric intensive care unit patient populations. Patients who received steroids had an increased mortality rate, but given the retrospective nature of this study, we maintain a position of equipoise with regard to this association. Variation in mortality and resource use by leukemia type suggests further research is needed to develop targeted intervention strategies to enhance patient outcomes.
Karthikeyan, Balasubramanian; Kadhiravan, Tamilarasu; Deepanjali, Surendran; Swaminathan, Rathinam Palamalai
2015-01-01
Mechanical ventilation is a resource intensive organ support treatment, and historical studies from low-resource settings had reported a high mortality. We aimed to study the outcomes in patients receiving mechanical ventilation in a contemporary low-resource setting. We prospectively studied the characteristics and outcomes (disease-related, mechanical ventilation-related, and process of care-related) in 237 adults mechanically ventilated for a medical illness at a teaching hospital in southern India during February 2011 to August 2012. Vital status of patients discharged from hospital was ascertained on Day 90 or later. Mean age of the patients was 40 ± 17 years; 140 (51%) were men. Poisoning and envenomation accounted for 98 (41%) of 237 admissions. In total, 87 (37%) patients died in-hospital; 16 (7%) died after discharge; 115 (49%) were alive at 90-day assessment; and 19 (8%) were lost to follow-up. Weaning was attempted in 171 (72%) patients; most patients (78 of 99 [79%]) failing the first attempt could be weaned off. Prolonged mechanical ventilation was required in 20 (8%) patients. Adherence to head-end elevation and deep vein thrombosis prophylaxis were 164 (69%) and 147 (62%) respectively. Risk of nosocomial infections particularly ventilator-associated pneumonia was high (57.2 per 1,000 ventilator-days). Higher APACHE II score quartiles (adjusted HR [95% CI] quartile 2, 2.65 [1.19-5.89]; quartile 3, 2.98 [1.24-7.15]; quartile 4, 5.78 [2.45-13.60]), and new-onset organ failure (2.98 [1.94-4.56]) were independently associated with the risk of death. Patients with poisoning had higher risk of reintubation (43% vs. 20%; P = 0.001) and ventilator-associated pneumonia (75% vs. 53%; P = 0.001). But, their mortality was significantly lower compared to the rest (24% vs. 44%; P = 0.002). The case-mix considerably differs from other settings. Mortality in this low-resource setting is similar to high-resource settings. But, further improvements in care processes and prevention of nosocomial infections are required.
Blancas, José; Casas, Alejandro; Pérez-Salicrup, Diego; Caballero, Javier; Vega, Ernesto
2013-06-02
Management types and their intensity may vary according to indicators such as: (1) practices complexity, (2) degree of techniques specialization, (3) occurrence and types of social regulations, (4) artificial selection intensity, (5) energy invested, (6) tools types, and (7) amounts of resources obtained. Management types of edible plants were characterized and analyzed in Náhuatl communities of the Tehuacán Valley. We expected that both natural and human pressures generate risk on plant resources availability, influencing human responses of management directed to decrease risk. We particularly hypothesized that magnitude of risk would be a direct function of human pressures favored by cultural and economic value and ecological factors such as scarcity (restricted distribution and abundance). Management practices may decrease risk of plant resources, more effectively when they are more intense; however, absence or insufficiency of management practices on endangered plants may favor loss of their populations. Understanding current management motives and their consequences on the purpose of ensuring availability of plant resources might allow us to understand similar processes occurring in the past. This issue is particularly important to be studied in the Tehuacán Valley, where archaeologists documented possible scenarios motivating origins of plant management by agriculture during prehistory. Through ethnobotanical collecting, 55 semi-structured and free listing interviews we inventoried edible plant species used in five villages of Coyomeapan, Mexico. We identified: (1) native plant species whose products are obtained exclusively through simple gathering, (2) native species involving simple gathering and other management types, and (3) non-native species managed by agricultural management. We conducted in depth studies on the 33 native species managed through gathering and other types of practices. We carried out a total of 660 sessions of detailed interviews to 20 households randomly selected. We showed to people voucher specimens and photos of the sample of species chosen and documented their cultural and economic values. Spatial availability of these plant species was evaluated through vegetation sampling. Values for each cultural, economic, and ecological indicator were codified and averaged or summed and weighed according to frequency of interviewees' responses or ecological conditions per plant species. With the standardized values of these indicators we performed a PCA and scores of the first principal component were considered as a risk index, which summarizes information of thirteen indicators of human use, demand and scarcity of each plant species. Similarly, eleven indicators of energy invested, complexity, tools and management strategies were used for performing PCA and scores of the first principal component were considered as management intensity index for each plant species. A linear regression analysis was performed to analyze the relation between risk and management intensity indexes. Amounts of variation of management data explained by ecological, cultural and economic information, as well as their risk level were analyzed through canonical correspondence analyses (CCA). A total of 122 edible plant species were recorded, nearly 30% of them were introduced domesticated plants, 51 were wild species obtained exclusively by simple gathering and 33 were native species obtained by simple gathering and other management practices, these latter were the ones more deeply studied. 
People recognized variants in 21 of these latter 33 species, the variants receiving differential use, management, artificial selection and incipient domestication. The lowest values of management intensity corresponded to species under simple gathering and tolerance, mostly abundant annual plants occasionally consumed by few people. The highest management intensity values were recorded in species with economic importance, mostly perennials with recognized variants whose management requires tools and which are protected by collective regulations. The regression analysis indicated a significant relationship (R² = 0.433, P < 0.001) between the risk and management intensity indexes. CCA explained 65.5% of the variation in management intensity, mainly through socio-cultural factors (32.6%), whereas ecological data explained 21.3% and the intersection of all factors 11.6%. In total, 67.6% of the variation in management intensity was explained by the risk variables. Life-cycle length, reproductive system type, distribution, number of parts used, number of management and use forms, and type of regulations were statistically significant. People manage plant resources according to the role these play in household subsistence, the quantity available and the quality of their useful products; particularly important is the balance between resource availability and demand. Management responses to risk are also influenced by the ease of propagating or manipulating individual plants and by the time required to develop manipulation strategies and techniques.
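The index construction described above (standardize the indicators, take the first principal component as a composite index, then regress management intensity on risk) can be sketched in a few lines. The code below is a minimal illustration with random placeholder data, not the study's 13 risk and 11 management indicators.

```python
# Minimal sketch of the index construction described above: standardize the
# indicators, take first-principal-component scores as an index, and regress
# the management-intensity index on the risk index. The random data stand in
# for the study's 13 risk and 11 management indicators across 33 species.
import numpy as np

def first_pc_scores(X):
    """Scores on the first principal component of standardized indicators."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)   # SVD of standardized matrix
    return U[:, 0] * S[0]                              # PC1 scores

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    risk_indicators = rng.normal(size=(33, 13))        # placeholder data
    mgmt_indicators = rng.normal(size=(33, 11))
    risk = first_pc_scores(risk_indicators)
    mgmt = first_pc_scores(mgmt_indicators)
    slope, intercept = np.polyfit(risk, mgmt, 1)
    r2 = np.corrcoef(risk, mgmt)[0, 1] ** 2
    print(f"management ~ {intercept:.2f} + {slope:.2f} * risk,  R^2 = {r2:.3f}")
```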
Turkelson, Carman; Aebersold, Michelle; Redman, Richard; Tschannen, Dana
Effective interprofessional communication is critical to patient safety. This pre-/postimplementation project used a multifaceted educational strategy with high-fidelity simulation to introduce evidence-based communication tools, adapted from Nursing Crew Resource Management, to intensive care unit nurses. Results indicated that participants were satisfied with the education, and their perceptions of interprofessional communication and knowledge improved. Teams (n = 16) that used the communication tools during simulation were more likely to identify the problem, initiate key interventions, and have positive outcomes.
Assessment of Southern California environment from ERTS-1
NASA Technical Reports Server (NTRS)
Bowden, L. W.; Viellenave, J. H.
1973-01-01
ERTS-1 imagery is a useful source of data for evaluation of earth resources in Southern California. The improving quality of ERTS-1 imagery and our increasing ability to enhance the imagery have resulted in studies of a variety of phenomena in several Southern California environments. These investigations have produced several significant results of varying detail. They include the detection and identification of macro-scale tectonic and vegetational patterns, as well as detailed analysis of urban and agricultural processes. The sequential nature of ERTS-1 imagery has allowed these studies to monitor significant changes in the environment. In addition, some preliminary work has begun directed toward assessing the impact of expanding recreation, agriculture and urbanization into the fragile desert environment. Refinement of enhancement and mapping techniques and more intensive analysis of ERTS-1 imagery should lead to a greater capability to extract detailed information for more precise evaluations and more accurate monitoring of earth resources in Southern California.
Madore, Amy; Rosenberg, Julie; Muyindike, Winnie R; Bangsberg, David R; Bwana, Mwebesa B; Martin, Jeffrey N; Kanyesigye, Michael; Weintraub, Rebecca
2015-12-01
Implementation lessons: • Technology alone does not necessarily lead to improvement in health service delivery, in contrast to the common assumption that advanced technology goes hand in hand with progress. • Implementation of electronic medical record (EMR) systems is a complex, resource-intensive process that, in addition to software, hardware, and human resource investments, requires careful planning, change management skills, adaptability, and continuous engagement of stakeholders. • Research requirements and goals must be balanced with service delivery needs when determining how much information is essential to collect and who should be interfacing with the EMR system. • EMR systems require ongoing monitoring and regular updates to ensure they are responsive to evolving clinical use cases and research questions. • High-quality data and analyses are essential for EMRs to deliver value to providers, researchers, and patients. Copyright © 2015 Elsevier Inc. All rights reserved.
[HYGIENIC REGULATION OF THE USE OF ELECTRONIC EDUCATIONAL RESOURCES IN THE MODERN SCHOOL].
Stepanova, M I; Aleksandrova, I E; Sazanyuk, Z I; Voronova, B Z; Lashneva, L P; Shumkova, T V; Berezina, N O
2015-01-01
We studied the effect of lessons using a notebook computer and an interactive whiteboard on the functional state of schoolchildren. Using a complex of hygienic and physiological methods, we established that regulation of students' computer activity must take into account not only its duration but also its intensity. Design features of notebook computers were shown both to impede the optimal working posture in primary school children and to increase the risk of developing disorders of vision and the musculoskeletal system. The interactive whiteboard had an activating influence on performance and was associated with favorable dynamics of the indices of the students' functional state, provided that the density of the lesson and the duration of the whiteboard's use were kept optimal. Safety regulations are determined for schoolchildren's work with electronic resources in the educational process.
Petersen, David W; Kawasaki, Ernest S
2007-01-01
DNA microarray technology has become a powerful tool in the arsenal of the molecular biologist. Capitalizing on high precision robotics and the wealth of DNA sequences annotated from the genomes of a large number of organisms, the manufacture of microarrays is now possible for the average academic laboratory with the funds and motivation. Microarray production requires attention to both biological and physical resources, including DNA libraries, robotics, and qualified personnel. While the fabrication of microarrays is a very labor-intensive process, production of quality microarrays individually tailored on a project-by-project basis will help researchers shed light on future scientific questions.
Experiences in autotuning matrix multiplication for energy minimization on GPUs
Anzt, Hartwig; Haugen, Blake; Kurzak, Jakub; ...
2015-05-20
In this study, we report extensive results and analysis of autotuning the computationally intensive graphics processing unit (GPU) kernel for dense matrix–matrix multiplication in double precision. In contrast to traditional autotuning and/or optimization for runtime performance only, we also take energy efficiency into account. For kernels achieving equal performance, we show significant differences in their energy balance. We also identify memory throughput as the most influential metric that trades off performance and energy efficiency. As a result, the performance-optimal case ends up not being the most efficient kernel in overall resource use.
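The performance versus energy-efficiency trade-off mentioned above can be illustrated with a small calculation: given measured runtime and average power for several autotuned kernel variants, compare throughput against throughput per watt. The variant names, runtimes, and power figures below are made up for illustration and are not the paper's measurements.

```python
# Sketch of the trade-off discussed above: two hypothetical kernel variants
# with equal runtime can differ in energy efficiency. All numbers are made up.
flops = 2.0 * 4096 ** 3          # double-precision GEMM operation count for n = 4096

# (variant, runtime in seconds, average board power in watts) -- hypothetical
measurements = [
    ("tile_64x64",  0.140, 210.0),
    ("tile_96x96",  0.137, 245.0),
    ("tile_128x64", 0.137, 225.0),
]

for name, t, watts in measurements:
    gflops = flops / t / 1e9
    gflops_per_watt = gflops / watts
    print(f"{name:12s}  {gflops:7.1f} GFLOP/s  {gflops_per_watt:5.2f} GFLOP/s/W")
```

Here the last two variants deliver the same GFLOP/s but different GFLOP/s per watt, which is exactly the kind of case where the performance-optimal configuration is not the most resource-efficient one.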
ERIC Educational Resources Information Center
Zhang, Liang; Bao, Wei; Sun, Liang
2016-01-01
In this study we examined the resource-research relationship at China's research universities. The stochastic frontier production function was employed in analyses of a panel data set on a group of the most research-intensive universities in China from 2000 to 2010. Results suggested overall tight relationships between various resources (including…
Peng, Bo; Chen, Huann-Sheng; Mechanic, Leah E.; Racine, Ben; Clarke, John; Clarke, Lauren; Gillanders, Elizabeth; Feuer, Eric J.
2013-01-01
Summary: Many simulation methods and programs have been developed to simulate genetic data of the human genome. These data have been widely used, for example, to predict properties of populations retrospectively or prospectively according to mathematically intractable genetic models, and to assist the validation, statistical inference and power analysis of a variety of statistical models. However, owing to the differences in type of genetic data of interest, simulation methods, evolutionary features, input and output formats, terminologies and assumptions for different applications, choosing the right tool for a particular study can be a resource-intensive process that usually involves searching, downloading and testing many different simulation programs. Genetic Simulation Resources (GSR) is a website provided by the National Cancer Institute (NCI) that aims to help researchers compare and choose the appropriate simulation tools for their studies. This website allows authors of simulation software to register their applications and describe them with well-defined attributes, thus allowing site users to search and compare simulators according to specified features. Availability: http://popmodels.cancercontrol.cancer.gov/gsr. Contact: gsr@mail.nih.gov PMID:23435068
Transformation of OODT CAS to Perform Larger Tasks
NASA Technical Reports Server (NTRS)
Mattmann, Chris; Freeborn, Dana; Crichton, Daniel; Hughes, John; Ramirez, Paul; Hardman, Sean; Woollard, David; Kelly, Sean
2008-01-01
A computer program denoted OODT CAS has been transformed to enable performance of larger tasks that involve greatly increased data volumes and increasingly intensive processing of data on heterogeneous, geographically dispersed computers. Prior to the transformation, OODT CAS (also alternatively denoted, simply, 'CAS') [wherein 'OODT' signifies 'Object-Oriented Data Technology' and 'CAS' signifies 'Catalog and Archive Service'] was a proven software component used to manage scientific data from spaceflight missions. In the transformation, CAS was split into two separate components representing its canonical capabilities: file management and workflow management. In addition, CAS was augmented by addition of a resource-management component. This third component enables CAS to manage heterogeneous computing by use of diverse resources, including high-performance clusters of computers, commodity computing hardware, and grid computing infrastructures. CAS is now more easily maintainable, evolvable, and reusable. These components can be used separately or, taking advantage of synergies, can be used together. Other elements of the transformation included addition of a separate Web presentation layer that supports distribution of data products via Really Simple Syndication (RSS) feeds, and provision for full Resource Description Framework (RDF) exports of metadata.
Pan, Shu-Yuan; Chung, Tai-Chun; Ho, Chang-Ching; Hou, Chin-Jen; Chen, Yi-Hung; Chiang, Pen-Chi
2017-12-08
Both steelmaking via an electric arc furnace and the manufacture of portland cement are energy-intensive and resource-exploiting processes that generate large amounts of carbon dioxide (CO2) emissions and alkaline solid waste. In fact, most CO2 capture and storage technologies are currently too expensive to be widely applied in industry. Moreover, proper stabilization of electric arc furnace slag prior to utilization is still challenging due to its high alkalinity, heavy metal leaching potential and volume instability. Here we deploy an integrated approach to mineralizing flue gas CO2 using electric arc furnace slag while utilizing the reacted product as a supplementary cementitious material, establishing a waste-to-resource supply chain toward a circular economy. We found that the flue gas CO2 was rapidly mineralized into calcite precipitates using electric arc furnace slag. The carbonated slag can be successfully utilized as a green construction material in blended cement mortar. On this basis, the global CO2 reduction potential using iron and steel slags was estimated to be ~138 million tons per year.
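The mineralization chemistry behind this approach can be summarized with the usual carbonation reactions of the free calcium phases in the slag. The simplified form below is a generic illustration (it ignores the slower-reacting silicate and magnesium phases), not the paper's reaction scheme.

```latex
% Simplified carbonation chemistry, shown for illustration only; slag also
% contains silicate and Mg phases that react more slowly.
\begin{align}
  \mathrm{CaO_{(slag)}} + \mathrm{H_2O} &\rightarrow \mathrm{Ca(OH)_2} \\
  \mathrm{Ca(OH)_2} + \mathrm{CO_2}     &\rightarrow \mathrm{CaCO_3}\!\downarrow + \mathrm{H_2O}
\end{align}
% Theoretical uptake: 1 mol CO2 (44 g) per mol CaO (56 g), i.e. up to about
% 0.79 g CO2 per g of reactive CaO in the slag.
```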
How Confounder Strength Can Affect Allocation of Resources in Electronic Health Records.
Lynch, Kristine E; Whitcomb, Brian W; DuVall, Scott L
2018-01-01
When electronic health record (EHR) data are used, multiple approaches may be available for measuring the same variable, introducing potentially confounding factors. While additional information may be gleaned and residual confounding reduced through resource-intensive assessment methods such as natural language processing (NLP), whether the added benefits offset the added cost of the additional resources is not straightforward. We evaluated the implications of misclassification of a confounder when using EHRs. Using a combination of simulations and real data surrounding hospital readmission, we considered smoking as a potential confounder. We compared ICD-9 diagnostic code assignment, which is an easily available measure but has the possibility of substantial misclassification of smoking status, with NLP, a method of determining smoking status that is more expensive and time-consuming than ICD-9 code assignment but has less potential for misclassification. Classification of smoking status with NLP consistently produced less residual confounding than the use of ICD-9 codes; however, when minimal confounding was present, differences between the approaches were small. When considerable confounding is present, investing in a superior measurement tool becomes advantageous.
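The residual-confounding mechanism described above is easy to reproduce in a toy simulation: adjust for a binary confounder measured once with high sensitivity (NLP-like) and once with low sensitivity (ICD-9-like), and compare the adjusted exposure effect when the true effect is null. The prevalences, effect sizes, and sensitivities below are hypothetical, not the study's parameters.

```python
# Minimal simulation of residual confounding from a misclassified confounder.
# Smoking affects both exposure and readmission; the true exposure effect is 0,
# so any nonzero adjusted estimate reflects residual confounding.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
smoke = rng.binomial(1, 0.25, n)
exposure = rng.binomial(1, 0.2 + 0.2 * smoke)      # confounder -> exposure
outcome = rng.binomial(1, 0.10 + 0.10 * smoke)     # confounder -> outcome only

def adjusted_risk_diff(outcome, exposure, confounder):
    """Stratum-size-weighted risk difference, adjusted for the measured confounder."""
    rd, wts = [], []
    for c in (0, 1):
        m = confounder == c
        rd.append(outcome[m & (exposure == 1)].mean() - outcome[m & (exposure == 0)].mean())
        wts.append(m.sum())
    return np.average(rd, weights=wts)

for label, sens in [("NLP-like (sensitivity 0.95)", 0.95), ("ICD-9-like (sensitivity 0.40)", 0.40)]:
    # Perfect specificity assumed; only sensitivity differs between the tools.
    measured = np.where(smoke == 1, rng.binomial(1, sens, n), 0)
    print(label, "adjusted risk difference:", round(adjusted_risk_diff(outcome, exposure, measured), 4))
```

With the low-sensitivity measure, many true smokers end up in the "non-smoker" stratum and the adjusted estimate stays biased away from zero, which is the residual confounding the authors quantify.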
Timothy D. Faust; Alexander Clark; Charles E. Courchene; Barry D. Shiver; Monique L. Belli
1999-01-01
The demand for southern pine fiber is increasing. However, the land resources to produce wood fiber are decreasing. The wood industry is now using intensive cultural treatments, such as competition control, fertilization, and short rotations, to increase fiber production. The impact of these intensive environmental treatments on increased growth is positive and...
Hughes, K Michael; Benenson, Ronald S; Krichten, Amy E; Clancy, Keith D; Ryan, James Patrick; Hammond, Christopher
2014-09-01
Crew Resource Management (CRM) is a team-building communication process first implemented in the aviation industry to improve safety. It has been used in health care, particularly in surgical and intensive care settings, to improve team dynamics and reduce errors. We adapted a CRM process for implementation in the trauma resuscitation area. An interdisciplinary steering committee developed our CRM process to include a didactic classroom program based on a preimplementation survey of our trauma team members. Implementation with new cultural and process expectations followed. The Human Factors Attitude Survey and Communication and Teamwork Skills assessment tool were used to design, evaluate, and validate our CRM program. The initial trauma communication survey was completed by 160 team members (49% response). Twenty-five trauma resuscitations were observed and scored using Communication and Teamwork Skills. Areas of concern were identified and 324 staff completed our 3-hour CRM course during a 3-month period. After CRM training, 132 communication surveys and 38 Communication and Teamwork Skills observations were completed. In the post-CRM survey, respondents indicated improvement in accuracy of field to medical command information (p = 0.029); accuracy of emergency department medical command information to the resuscitation area (p = 0.002); and team leader identity, communication of plan, and role assignment (p = 0.001). After CRM training, staff were more likely to speak up when patient safety was a concern (p = 0.002). Crew Resource Management in the trauma resuscitation area enhances team dynamics, communication, and, ostensibly, patient safety. Philosophy and culture of CRM should be compulsory components of trauma programs and in resuscitation of injured patients. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Berge, Jerica M; Adamek, Margaret; Caspi, Caitlin; Loth, Katie A; Shanafelt, Amy; Stovitz, Steven D; Trofholz, Amanda; Grannon, Katherine Y; Nanney, Marilyn S
2017-08-01
Despite intense nationwide efforts to improve healthy eating and physical activity across the lifespan, progress has plateaued. Moreover, health inequities remain. Frameworks that integrate research, clinical practice, policy, and community resources to address weight-related behaviors are needed. Implementation and evaluation of integration efforts also remain a challenge. The purpose of this paper is to: (1) Describe the planning and development process of an integrator entity, HEAL (Healthy Eating and Activity across the Lifespan); (2) present outcomes of the HEAL development process including the HEAL vision, mission, and values statements; (3) define the planned integrator functions of HEAL; and (4) describe the ongoing evaluation of the integration process. HEAL team members used a theoretically-driven, evidence-based, systemic, twelve-month planning process to guide the development of HEAL and to lay the foundation for short- and long-term integration initiatives. Key development activities included a review of the literature and case studies, identifying guiding principles and infrastructure needs, conducting stakeholder/key informant interviews, and continuous capacity building among team members. Outcomes/deliverables of the first year of HEAL included a mission, vision, and values statements; definitions of integration and integrator functions and roles; a set of long-range plans; and an integration evaluation plan. Application of the HEAL integration model is currently underway through community solicited initiatives. Overall, HEAL aims to lead real world integrative work that coalesce across research, clinical practice, and policy with community resources to inspire a culture of health equity aimed at improving healthy eating and physical activity across the lifespan. Copyright © 2017 Elsevier Inc. All rights reserved.
Orchestrating Distributed Resource Ensembles for Petascale Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baldin, Ilya; Mandal, Anirban; Ruth, Paul
2014-04-24
Distributed, data-intensive computational science applications of interest to DOE scientific communities move large amounts of data for experiment data management, distributed analysis steps, remote visualization, and accessing scientific instruments. These applications need to orchestrate ensembles of resources from multiple resource pools and interconnect them with high-capacity multi-layered networks across multiple domains. It is highly desirable that mechanisms are designed that provide this type of resource provisioning capability to a broad class of applications. It is also important to have coherent monitoring capabilities for such complex distributed environments. In this project, we addressed these problems by designing an abstract API, enabled by novel semantic resource descriptions, for provisioning complex and heterogeneous resources from multiple providers using their native provisioning mechanisms and control planes: computational, storage, and multi-layered high-speed network domains. We used an extensible resource representation based on semantic web technologies to afford maximum flexibility to applications in specifying their needs. We evaluated the effectiveness of provisioning using representative data-intensive applications. We also developed mechanisms for providing feedback about resource performance to the application, to enable closed-loop feedback control and dynamic adjustments to resource allocations (elasticity). This was enabled through development of a novel persistent query framework that consumes disparate sources of monitoring data, including perfSONAR, and provides scalable distribution of asynchronous notifications.
Robidoux, Serje; Rauwerda, Derek; Besner, Derek
2014-05-01
Whether or not lexical access from print requires spatial attention has been debated intensively for the last 30 years. Studies involving colour naming generally find evidence that "unattended" words are processed. In contrast, reading-based experiments do not find evidence of distractor processing. One theory ascribes the discrepancy to weaker attentional demands for colour identification. If colour naming does not capture all of a subject's attention, the remaining attentional resources can be deployed to process the distractor word. The present study combined exogenous spatial cueing with colour naming and reading aloud separately and found that colour naming is less sensitive to the validity of a spatial cue than is reading words aloud. Based on these results, we argue that colour naming studies do not effectively control attention so that no conclusions about unattended distractor processing can be drawn from them. Thus we reiterate the consistent conclusion drawn from reading aloud and lexical decision studies: There is no word identification without (spatial) attention.
Water-quality monitoring and process understanding in support of environmental policy and management
Peters, N.E.
2008-01-01
The quantity and quality of freshwater at any point on the landscape reflect the combined effects of many processes operating along hydrological pathways within a drainage basin/watershed/catchment. Primary drivers for the availability of water are landscape changes and patterns, and the processes affecting the timing, magnitude, and intensity of precipitation, including global climate change. The degradation of air, land, and water in one part of a drainage basin can have negative effects on users downstream; the time and space scales of the effects are determined by the residence time along the various hydrological pathways. Hydrology affects transport, deposition, and recycling of inorganic materials and sediment. These components affect biota and associated ecosystem processes, which rely on sustainable flows throughout a drainage basin. Human activities on all spatial scales affect both water quantity and quality, and some human activities can have a disproportionate effect on an entire drainage basin. Aquatic systems have been continuously modified by agriculture, through land-use change, irrigation and navigation, disposal of urban, mining, and industrial wastes, and engineering modifications to the environment. Interdisciplinary integrated basin studies within the last several decades have provided a more comprehensive understanding of the linkages among air, land, and water resources. This understanding, coupled with environmental monitoring, has evolved a more multidisciplinary integrated approach to resource management, particularly within drainage basins.
Implementing partnership-driven clinical federated electronic health record data sharing networks.
Stephens, Kari A; Anderson, Nicholas; Lin, Ching-Ping; Estiri, Hossein
2016-09-01
Building federated data sharing architectures requires supporting a range of data owners, effective and validated semantic alignment between data resources, and consistent focus on end-users. Establishing these resources requires development methodologies that support internal validation of data extraction and translation processes, sustaining meaningful partnerships, and delivering clear and measurable system utility. We describe findings from two federated data sharing case examples that detail critical factors, shared outcomes, and production environment results. Two federated data sharing pilot architectures developed to support network-based research associated with the University of Washington's Institute of Translational Health Sciences provided the basis for the findings. A spiral model for implementation and evaluation was used to structure iterations of development and support knowledge sharing between the two network development teams, which cross-collaborated to support and manage common stages. We found that using a spiral model of software development and multiple cycles of iteration was effective in achieving early network design goals. Both networks required time- and resource-intensive efforts to establish a trusted environment to create the data sharing architectures. Both networks were challenged by the need for adaptive use cases to define and test utility. An iterative cyclical model of development provided a process for developing trust with data partners and refining the design, and supported measurable success in the development of new federated data sharing architectures. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Odetola, Folafoluwa O; Davis, Matthew M; Cohn, Lisa M; Clark, Sarah J
2009-03-01
To describe patterns of transfer, resource utilization, and clinical outcomes associated with interhospital transfer of critically ill and injured children. Secondary analysis of administrative claims data. Children 0 to 18 years in the Michigan Medicaid program who underwent interhospital transfer for intensive care from January 1, 2002 to December 31, 2004. The 3 sources of transfer from referring hospitals were: emergency department (ED), ward, or intensive care unit (ICU). Mortality and duration of hospital stay at the receiving hospitals. Of 1643 interhospital transfer admissions to intensive care at receiving hospitals, 62%, 31%, and 7% were from the ED, ward, and ICU of referring hospitals, respectively. Nineteen percent had comorbid illness, while 11% had organ dysfunction at the referring hospital. After controlling for comorbid illness, patient age, and pretransfer organ dysfunction; compared with ED transfers, mortality in the receiving hospital was higher for ward transfers (odds ratio [OR], 1.76; 95% confidence interval [CI], 1.02-3.03) but not for ICU transfers. Also, compared with ED transfers, hospital stay was longer by 1.5 days for ward transfers and by 13.5 days for ICU transfers. In this multiyear, statewide sample, mortality and resource utilization were higher among children who underwent interhospital transfer to intensive care after initial hospitalization, compared with those transferred directly from emergency to intensive care. Decision-making underlying initial triage and subsequent interhospital transfer of critically ill children warrants further study. (c) 2009 Society of Hospital Medicine.
Potential Applications of Zeolite Membranes in Reaction Coupling Separation Processes
Daramola, Michael O.; Aransiola, Elizabeth F.; Ojumu, Tunde V.
2012-01-01
Future production of chemicals (e.g., fine and specialty chemicals) in industry is faced with the challenge of limited material and energy resources. However, process intensification might play a significant role in alleviating this problem. A vision of process intensification through multifunctional reactors has stimulated research on membrane-based reactive separation processes, in which membrane separation and catalytic reaction occur simultaneously in one unit. These processes are rather attractive applications because they are potentially compact, less capital intensive, and have lower processing costs than traditional processes. Therefore this review discusses the progress and potential applications that have occurred in the field of zeolite membrane reactors during the last few years. The aim of this article is to update researchers in the field of process intensification and also provoke their thoughts on further research efforts to explore and exploit the potential applications of zeolite membrane reactors in industry. Further evaluation of this technology for industrial acceptability is essential in this regard. Therefore, studies such as techno-economical feasibility, optimization and scale-up are of the utmost importance.
NASA Technical Reports Server (NTRS)
Mizell, Carolyn Barrett; Malone, Linda
2007-01-01
The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
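The combination described above, a discrete event process model coupled to a continuously simulated organizational state, can be sketched with a toy hybrid loop: a system-dynamics-style productivity level is integrated between events, and task completion times depend on that level. The structure and all rates below are invented for illustration; this is not the NASA model.

```python
# Toy hybrid simulation: a continuously integrated productivity state (eroded
# by schedule pressure, restored by slack) drives the durations of discrete
# task-completion events. All rates and the structure are illustrative only.
import heapq

def run(n_tasks=20, base_effort=10.0, dt=0.1):
    t, productivity, backlog = 0.0, 1.0, n_tasks
    events = [(base_effort / productivity, "task_done")]   # (completion time, kind)
    heapq.heapify(events)
    while events:
        t_next, kind = heapq.heappop(events)
        while t < t_next:                                   # continuous update between events
            pressure = backlog / n_tasks
            productivity += dt * (0.05 * (1.0 - productivity) - 0.04 * pressure)
            t += dt
        if kind == "task_done":
            backlog -= 1
            if backlog > 0:
                # Remaining effort per task grows as productivity is eroded
                heapq.heappush(events, (t + base_effort / productivity, "task_done"))
    return t

print(f"all tasks finished at t = {run():.1f} (arbitrary time units)")
```

The point of the sketch is only the coupling direction: the continuous state changes the timing of the discrete events, and the discrete events (tasks leaving the backlog) feed back into the continuous state.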
NASA Astrophysics Data System (ADS)
Grover, S.; Tayal, S.
2014-12-01
The interdependency between water and energy generally plays out as a trade-off, where either resource is affected because of the other; this trade-off is commonly known as the water-energy nexus. Many studies have been undertaken in various parts of the world using various approaches to tease out this intricate nexus. This research adopted a different approach to quantify the interdependency: it teased out the nexus from the demand side for both resources. The PODIUM Sim model was used for water demand assessment, and available secondary data were used for the other parameters. Using this approach, the percentage share of water for energy and of energy for water was estimated. For informed decision making and sustainable development, the assessment was carried out at the state level, as most policies are made specifically for the state. The research was done for the southernmost state of India, Tamil Nadu, a rapidly growing industrial hub. Tamil Nadu is an energy- and water-intensive state, and the analysis shows that the share of water demand from the energy sector is minuscule compared to water demand from other major sectors, while the energy demand of the water sector across various processes has a comparable share of about 15-25% of total energy demand. This analysis indicates the relative risk sectors face in competition for each resource and points out that the water sector faces fierce competition with other sectors for energy. Moreover, the study assessed that the state has a negative water balance, which may make access to water more energy intensive over time. However, a future scenario was projected assuming the ongoing policy program of improving irrigation efficiency; it indicated a potential positive equilibrium that conserves both water and energy. This scenario gave promising results: lower water demand from agriculture (the most water-intensive sector in the state), lower energy requirements for irrigation, and an improvement in the overall water balance of the state. With a changing climate and a growing population, resources in crisis can be managed sustainably if this nexus is decoded to understand the interdependency.
The trade of virtual water: do property rights matter?
NASA Astrophysics Data System (ADS)
Xu, Ankai
2016-04-01
My paper examines the determinants of the virtual water trade, embodied in the trade of agricultural products, by estimating a structural gravity model. In particular, it tests the relationship between property rights and the export of water-intensive agricultural products based on water footprint data in Mekonnen and Hoekstra (2011, 2012). Using two different measures of property rights protection, I show that countries with weaker property rights have an apparent comparative advantage in the trade of water-intensive products. After controlling for economic size, natural resource endowments, and possible effects of reverse causality, the trade flow of virtual water is negatively and significantly correlated with the property rights index of the exporting country. Holding other factors constant, a one-point increase in the property rights index of a country is associated with a 24-36% decrease in its virtual water export, whereas a 1% increase in the natural resource protection index of a country is associated with a 16% decrease in its virtual water export. This paper is the first empirical work that tests the relationship between property rights and trade of water-intensive products, offering a new perspective in the virtual water trade debate. The findings provide a possible explanation for the paradoxical evidence that some countries with scarce water resources export water-intensive products. The result is important not only in terms of its theoretical relevance, but also its policy implications. As prescribed by the model of trade and property rights, when countries with weaker property rights open to international trade, they are more likely to over-exploit and thus expedite the depletion of natural resources.
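A generic structural gravity specification for this kind of test can be written as follows; the notation is mine and the paper's exact controls, fixed effects, and estimator may differ.

```latex
% Generic structural gravity specification for virtual water exports; the
% notation and controls are illustrative, not necessarily the paper's exact form.
\begin{equation}
  \ln VW_{ij} = \beta_0 + \beta_1 \, PR_i + \beta_2 \ln GDP_i + \beta_3 \ln GDP_j
              + \beta_4 \ln dist_{ij} + \boldsymbol{\gamma}' \mathbf{Z}_i + \mu_j + \varepsilon_{ij}
\end{equation}
% VW_ij: virtual water embodied in exports from country i to j; PR_i: exporter
% property-rights index; Z_i: resource-endowment controls; mu_j: importer fixed
% effects. A negative beta_1 corresponds to the reported 24-36% fall in virtual
% water exports per one-point increase in PR_i.
```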
Kumar, Parmeshwar; Jithesh, Vishwanathan; Gupta, Shakti Kumar
2015-01-01
Though intensive care units (ICUs) only account for 10% of hospital beds, they consume nearly 22% of hospital resources. Few definitive costing studies have been conducted in Indian settings that would help determine appropriate resource allocation. To evaluate and compare the cost of intensive care delivery between multi-specialty and neurosurgery ICU in an apex trauma care facility in India. The study was conducted in a polytrauma and neurosurgery ICU at a 203 bedded level IV trauma care facility in New Delhi, India from May, 2012 to June 2012. The study was cross-sectional, retrospective, and record-based. Traditional costing was used to arrive at the cost for both direct and indirect cost estimates. The cost centers included in study were building cost, equipment cost, human resources, materials and supplies, clinical and nonclinical support services, engineering maintenance cost, and biomedical waste management. Fisher's two-tailed t-test. Total cost/bed/day for the multi-specialty ICU was Rs. 14,976.9/- and for the neurosurgery ICU was Rs. 14,306.7/-, manpower constituting nearly half of the expenditure in both ICUs. The cost center wise and overall difference in the cost among the ICUs were statistically significant. Quantification of expenditure in running an ICU in a trauma center would assist healthcare decision makers in better allocation of resources. Although multi-specialty ICUs are more expensive, other factors will also play a role in defining the kind of ICU that need to be designed.
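The traditional costing used above reduces to a simple aggregation: apportion each cost centre's annual cost to the ICU, sum, and divide by occupied bed-days. The sketch below shows that arithmetic with placeholder figures; they are not the study's data.

```python
# Sketch of traditional (cost-centre) costing for an ICU. The apportioned
# annual costs, bed count, and occupancy below are placeholders, not the
# figures reported in the study.
cost_centres_inr = {
    "building": 4_000_000,
    "equipment": 9_000_000,
    "human resources": 26_000_000,       # typically the largest share
    "materials and supplies": 10_000_000,
    "support services": 5_000_000,
    "engineering maintenance": 1_500_000,
    "biomedical waste": 500_000,
}
beds, occupancy, days = 12, 0.85, 365
bed_days = beds * occupancy * days
total = sum(cost_centres_inr.values())
print(f"cost per bed-day = Rs. {total / bed_days:,.0f}")
```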
Nelson, Judith E; Bassett, Rick; Boss, Renee D; Brasel, Karen J; Campbell, Margaret L; Cortez, Therese B; Curtis, J Randall; Lustbader, Dana R; Mulkerin, Colleen; Puntillo, Kathleen A; Ray, Daniel E; Weissman, David E
2010-09-01
To describe models used in successful clinical initiatives to improve the quality of palliative care in critical care settings. We searched the MEDLINE database from inception to April 2010 for all English language articles using the terms "intensive care," "critical care," or "ICU" and "palliative care"; we also hand-searched reference lists and author files. Based on review and synthesis of these data and the experiences of our interdisciplinary expert Advisory Board, we prepared this consensus report. We critically reviewed the existing data with a focus on models that have been used to structure clinical initiatives to enhance palliative care for critically ill patients in intensive care units and their families. There are two main models for intensive care unit-palliative care integration: 1) the "consultative model," which focuses on increasing the involvement and effectiveness of palliative care consultants in the care of intensive care unit patients and their families, particularly those patients identified as at highest risk for poor outcomes; and 2) the "integrative model," which seeks to embed palliative care principles and interventions into daily practice by the intensive care unit team for all patients and families facing critical illness. These models are not mutually exclusive but rather represent the ends of a spectrum of approaches. Choosing an overall approach from among these models should be one of the earliest steps in planning an intensive care unit-palliative care initiative. This process entails a careful and realistic assessment of available resources, attitudes of key stakeholders, structural aspects of intensive care unit care, and patterns of local practice in the intensive care unit and hospital. A well-structured intensive care unit-palliative care initiative can provide important benefits for patients, families, and providers.
Prehn, Kristin; Kazzer, Philipp; Lischke, Alexander; Heinrichs, Markus; Herpertz, Sabine C; Domes, Gregor
2013-06-01
To investigate the mechanisms by which oxytocin improves socioaffective processing, we measured behavioral and pupillometric data during a dynamic facial emotion recognition task. In a double-blind between-subjects design, 47 men received either 24 IU intranasal oxytocin (OXT) or a placebo (PLC). Participants in the OXT group recognized all facial expressions at lower intensity levels than did participants in the PLC group. Improved performance was accompanied by increased task-related pupil dilation, indicating an increased recruitment of attentional resources. We also found increased pupil dilation during the processing of female compared with male faces. This gender-specific stimulus effect diminished in the OXT group, in which pupil size specifically increased for male faces. Results suggest that improved emotion recognition after OXT treatment might be due to an intensified processing of stimuli that usually do not recruit much attention. Copyright © 2013 Society for Psychophysiological Research.
Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ostrouchov, George; Doll, William E.; Beard, Les P.
2009-01-01
Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process, the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods that are needed to address the estimation problems are known but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.
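As a minimal illustration of the point-process estimation mentioned above, the sketch below computes a Gaussian kernel intensity surface (expected anomalies per unit area) from detected anomaly locations. The bandwidth and data are placeholders, and the sketch omits the priors, covariates, and uncertainty quantification the representation actually relies on.

```python
# Minimal kernel estimate of a spatial point-process intensity (events per
# unit area) from anomaly locations. Bandwidth and data are placeholders.
import numpy as np

def kernel_intensity(grid_xy, events_xy, h=25.0):
    """Gaussian kernel intensity at each grid point (no edge correction)."""
    d2 = ((grid_xy[:, None, :] - events_xy[None, :, :]) ** 2).sum(axis=2)
    k = np.exp(-0.5 * d2 / h ** 2) / (2.0 * np.pi * h ** 2)
    return k.sum(axis=1)

rng = np.random.default_rng(3)
events = rng.uniform(0, 500, size=(200, 2))          # synthetic anomaly picks
gx, gy = np.meshgrid(np.linspace(0, 500, 6), np.linspace(0, 500, 6))
grid = np.column_stack([gx.ravel(), gy.ravel()])
lam = kernel_intensity(grid, events)
print("mean estimated intensity:", lam.mean(), "events per unit area")
```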
GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research
NASA Astrophysics Data System (ADS)
Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.
2015-05-01
To date, the most common way to deal with geographical information and processes is still to consume local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards to exchange data on the World Wide Web, enhanced in Europe by the INSPIRE Directive, is bound to change the way people (among them research scientists, especially in environmental sciences) make use of, and manage, spatial data. A clever use of OGC standards can help scientists to better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it using GéoSAS, the Spatial Data Infrastructure (SDI) set up for researchers' needs in our department. It is based on the existing open source, modular and interoperable Spatial Data Architecture geOrchestra.
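For readers unfamiliar with the OGC web-service standards the abstract relies on, the snippet below shows what a standard WFS GetCapabilities request looks like; the endpoint URL is a placeholder, not GéoSAS's actual service address.

```python
# Hedged illustration of a generic OGC WFS request; the endpoint is hypothetical.
import requests

endpoint = "https://example.org/geoserver/ows"   # placeholder OGC endpoint
params = {"service": "WFS", "version": "2.0.0", "request": "GetCapabilities"}

resp = requests.get(endpoint, params=params, timeout=30)
print(resp.status_code, resp.headers.get("Content-Type"))  # capabilities XML when the service exists
```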
NASA Astrophysics Data System (ADS)
Li, Jiqing; Huang, Jing; Li, Jianchang
2018-06-01
The time-varying design flood makes full use of the measured data and provides the reservoir with a basis for both flood control and operation scheduling. This paper adopts the peak-over-threshold method for flood sampling in unit periods and a Poisson process with time-dependent parameters to simulate the reservoir's time-varying design flood. Considering the relationship between the model parameters and the underlying hypotheses, the over-threshold intensity, the goodness of fit of the Poisson distribution and the design flood parameters are used as criteria for selecting the unit period and the threshold, and the time-varying design flood process of the Longyangxia reservoir is derived at nine design frequencies. The time-varying design flood of inflow is closer to the reservoir's actual inflow conditions and can be used to adjust the operating water level in the flood season and to plan the resource utilization of floods in the basin.
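As a rough illustration of the sampling and rate-estimation steps described above (not the authors' implementation), the sketch below draws peaks over a threshold from a synthetic daily inflow series and estimates a piecewise-constant, time-dependent Poisson occurrence rate per unit period; the threshold, the unit periods and the inflow series are all hypothetical.

```python
# Minimal sketch: peak-over-threshold sampling plus a simple time-dependent
# Poisson occurrence-rate estimate. All numbers are synthetic placeholders.
import numpy as np

def pot_sample(discharge, threshold):
    """Return indices and values of exceedances over a threshold.
    A very crude declustering: keep only local maxima among exceedances."""
    exceed = np.where(discharge > threshold)[0]
    peaks = [i for i in exceed
             if (i == 0 or discharge[i] >= discharge[i - 1])
             and (i == len(discharge) - 1 or discharge[i] > discharge[i + 1])]
    peaks = np.array(peaks, dtype=int)
    return peaks, discharge[peaks]

def period_rates(peak_days, period_edges):
    """Piecewise-constant Poisson rate lambda_j (events per day) for each
    unit period j defined by consecutive edges (in days)."""
    counts, _ = np.histogram(peak_days, bins=period_edges)
    return counts / np.diff(period_edges)

rng = np.random.default_rng(0)
inflow = rng.gamma(shape=2.0, scale=500.0, size=365)        # synthetic daily inflow, m^3/s
peak_days, peak_flows = pot_sample(inflow, threshold=2500.0)
lam = period_rates(peak_days, period_edges=np.array([0, 150, 270, 365]))
print("over-threshold intensity per unit period (events/day):", lam)
```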
Realizing the geothermal electricity potential—water use and consequences
NASA Astrophysics Data System (ADS)
Shankar Mishra, Gouri; Glassley, William E.; Yeh, Sonia
2011-07-01
Electricity from geothermal resources has the potential to supply a significant portion of US baseload electricity. We estimate the water requirements of geothermal electricity and the impact of a potential scaling up of such electricity on water demand in various western states with rich geothermal resources but stressed water resources. Freshwater, degraded water, and geothermal fluid requirements are estimated explicitly. In general, geothermal electricity has higher water intensity (L/kWh) than thermoelectric or solar thermal electricity. Water intensity decreases with increasing resource enthalpy, and freshwater is substituted by degraded water at higher resource temperatures. Electricity from enhanced geothermal systems (EGS) could displace 8-100% of thermoelectricity generated in most western states. Such displacement would increase stress on water resources if re-circulating evaporative cooling, the dominant cooling system in the thermoelectric sector, is adopted. Adoption of dry cooling, which accounts for 78% of geothermal capacity today, will limit changes in state-wide freshwater abstraction, but increase degraded water requirements. We suggest a research and development focus to develop advanced energy conversion and cooling technologies that reduce water use without imposing energy and consequent financial penalties. Policies should incentivize the development of higher enthalpy resources, and support identification of non-traditional degraded water sources and optimized siting of geothermal plants.
Donázar, José Antonio; Ceballos, Olga; Cortés-Avizanda, Ainara
2018-07-15
The expansion of road networks and the increase in traffic have emerged in recent years as key threats to the conservation of biodiversity. This is particularly concerning in many protected areas because of the increase in recreational activities requiring the use of vehicles. However, the effects of roads and traffic on scavenger guilds and ecological processes remain poorly known. Here we examined how road proximity and traffic intensity influence patterns of resource use in an Old-World avian scavenger guild living in a protected natural park in northern Spain. We experimentally placed 130 carcasses at different distances from a scenic road in the centre of the park. Vehicles were recorded by means of traffic counters, which revealed that maximum numbers were reached during weekends and holidays and during the middle hours of the day. Avian scavenger attendance at carcasses was recorded by means of camera traps. Obligate scavengers, Eurasian griffon (Gyps fulvus) and Egyptian vultures (Neophron percnopterus), were frequently observed (at 59.4% and 37.7% of the consumed carcasses) together with five other facultative scavenger species. We found that the richness (number of species) at carcasses and the probability of consumption of the resource decreased with proximity to the road and on days with higher traffic intensity. The same factors affected the probability of presence of all the scavenger species. Moreover, some of them, notably griffon vultures, showed hourly patterns of carcass attendance suggesting avoidance of maximum traffic levels. Our results highlight that roads and traffic can have consequences for the structure and functioning of scavenger food webs, which may be particularly concerning in protected areas with remarkable levels of biodiversity. Future regulations in protected areas should reconcile traffic and tourist influx with wildlife conservation, so that important ecological processes are preserved while natural values remain accessible to visitors. Copyright © 2018 Elsevier B.V. All rights reserved.
Applications of LANDSAT data to the integrated economic development of Mindoro, Philippines
NASA Technical Reports Server (NTRS)
Wagner, T. W.; Fernandez, J. C.
1977-01-01
LANDSAT data is seen as providing essential up-to-date resource information for the planning process. LANDSAT data of Mindoro Island in the Philippines was processed to provide thematic maps showing patterns of agriculture, forest cover, terrain, wetlands and water turbidity. A hybrid approach using both supervised and unsupervised classification techniques resulted in 30 different scene classes, which were subsequently color-coded and mapped at a scale of 1:250,000. In addition, intensive image analysis is being carried out to evaluate the classified images. The images, maps, and areal statistics are being used to provide data to seven technical departments in planning the economic development of Mindoro. Multispectral aircraft imagery was collected to complement the LANDSAT data and validate the classification results.
Association of unit size, resource utilization and occupancy with outcomes of preterm infants.
Shah, P S; Mirea, L; Ng, E; Solimano, A; Lee, S K
2015-07-01
To assess the association of NICU size, occupancy rate and resource utilization at admission with neonatal outcomes. Retrospective cohort study of 9978 infants born at 23-32 weeks gestation and admitted to 23 tertiary-level Canadian NICUs during 2010-2012. Adjusted odds ratios (AOR) were estimated for a composite outcome of mortality/any major morbidity with respect to NICU size, occupancy rate and intensity of resource utilization at admission. A total of 2889 (29%) infants developed the composite outcome, the odds of which were higher for 16-29, 30-36 and >36-bed NICUs compared with <16-bed NICUs (AOR (95% CI): 1.47 (1.25-1.73); 1.49 (1.25-1.78); 1.55 (1.29-1.87), respectively) and for NICUs with higher resource utilization at admission (AOR: 1.30 (1.08-1.56), Q4 vs Q1), but did not differ by NICU occupancy. Larger NICUs and more intense resource utilization at admission are associated with higher odds of a composite adverse outcome in very preterm infants.
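A hedged sketch of how adjusted odds ratios of this kind are typically obtained from a logistic regression; the column names, covariates and simulated data below are placeholders and do not reproduce the study's model.

```python
# Illustrative only: AORs for a composite outcome vs. NICU size category,
# occupancy and admission resource-utilization quartile, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "composite":  np.random.binomial(1, 0.29, 500),                    # mortality/major morbidity flag
    "nicu_size":  np.random.choice(["<16", "16-29", "30-36", ">36"], 500),
    "occupancy":  np.random.uniform(0.5, 1.1, 500),
    "resource_q": np.random.choice(["Q1", "Q2", "Q3", "Q4"], 500),
    "ga_weeks":   np.random.uniform(23, 32, 500),                      # gestational age, a possible adjuster
})

model = smf.logit(
    "composite ~ C(nicu_size, Treatment('<16')) + occupancy "
    "+ C(resource_q, Treatment('Q1')) + ga_weeks", data=df).fit(disp=0)

aor = np.exp(model.params)          # adjusted odds ratios
ci = np.exp(model.conf_int())       # 95% confidence intervals
print(pd.concat([aor.rename("AOR"), ci], axis=1))
```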
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbose, Galen; Wiser, Ryan; Phadke, Amol
2008-02-01
The long economic lifetime and development lead-time of many electric infrastructure investments require that utility resource planning consider potential costs and risks over a lengthy time horizon. One long-term -- and potentially far-reaching -- risk currently facing the electricity industry is the uncertain cost of future carbon dioxide (CO2) regulations. Recognizing the importance of this issue, many utilities (sometimes spurred by state regulatory requirements) are beginning to actively assess carbon regulatory risk within their resource planning processes, and to evaluate options for mitigating that risk. However, given the relatively recent emergence of this issue and the rapidly changing political landscape, methods and assumptions used to analyze carbon regulatory risk, and the impact of this analysis on the selection of a preferred resource portfolio, vary considerably across utilities. In this study, we examine the treatment of carbon regulatory risk in utility resource planning, through a comparison of the most recent resource plans filed by fifteen investor-owned and publicly-owned utilities in the Western U.S. Together, these utilities account for approximately 60 percent of retail electricity sales in the West, and cover nine of eleven Western states. This report has two related elements. First, we compare and assess utilities' approaches to addressing key analytical issues that arise when considering the risk of future carbon regulations. Second, we summarize the composition and carbon intensity of the preferred resource portfolios selected by these fifteen utilities and compare them to potential CO2 emission benchmark levels.
Salem, Shady; Chang, Sam S; Clark, Peter E; Davis, Rodney; Herrell, S Duke; Kordan, Yakup; Wills, Marcia L; Shappell, Scott B; Baumgartner, Roxelyn; Phillips, Sharon; Smith, Joseph A; Cookson, Michael S; Barocas, Daniel A
2010-10-01
Whole mount processing is more resource intensive than routine systematic sampling of radical retropubic prostatectomy specimens. We compared whole mount and systematic sampling for detecting pathological outcomes, and compared the prognostic value of pathological findings across pathological methods. We included men (608 whole mount and 525 systematic sampling samples) with no prior treatment who underwent radical retropubic prostatectomy at Vanderbilt University Medical Center between January 2000 and June 2008. We used univariate and multivariate analysis to compare the pathological outcome detection rate between pathological methods. Kaplan-Meier curves and the log rank test were used to compare the prognostic value of pathological findings across pathological methods. There were no significant differences between the whole mount and the systematic sampling groups in detecting extraprostatic extension (25% vs 30%), positive surgical margins (31% vs 31%), pathological Gleason score less than 7 (49% vs 43%), 7 (39% vs 43%) or greater than 7 (12% vs 13%), seminal vesicle invasion (8% vs 10%) or lymph node involvement (3% vs 5%). Tumor volume was higher in the systematic sampling group and whole mount detected more multiple surgical margins (each p <0.01). There were no significant differences in the likelihood of biochemical recurrence between the pathological methods when patients were stratified by pathological outcome. Except for estimated tumor volume and multiple margins, whole mount and systematic sampling yield similar pathological information. Each method stratifies patients into comparable risk groups for biochemical recurrence. Thus, while whole mount is more resource intensive, it does not appear to result in improved detection of clinically important pathological outcomes or prognostication. Copyright © 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
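For illustration only, the snippet below shows the generic form of the Kaplan-Meier/log-rank comparison mentioned above, applied to simulated recurrence times; it is not the study's analysis and the numbers are arbitrary.

```python
# Hedged sketch: comparing biochemical-recurrence-free survival between two
# pathology methods on simulated data (months to recurrence, event flags).
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
t_wm, e_wm = rng.exponential(90, 300), rng.binomial(1, 0.2, 300)   # "whole mount" group
t_ss, e_ss = rng.exponential(88, 300), rng.binomial(1, 0.2, 300)   # "systematic sampling" group

km = KaplanMeierFitter().fit(t_wm, e_wm, label="whole mount")
res = logrank_test(t_wm, t_ss, event_observed_A=e_wm, event_observed_B=e_ss)
print(km.median_survival_time_, res.p_value)
```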
[Nitrogen and water cycling of typical cropland in the North China Plain].
Pei, Hong-wei; Shen, Yan-jun; Liu, Chang-ming
2015-01-01
Intensive fertilization and irrigation associated with increasing grain production have led to serious groundwater depletion and soil/water pollution in the North China Plain (NCP). Intensive agriculture changes the initial mass and energy balance and also poses substantial risks to regional water/soil resources and food security. Based on research reports on the nitrogen and water cycles in typical cropland (winter wheat and summer corn) in the NCP during the past 20 years, together with meteorological data, field experiments and surveys, we calculated the nitrogen and water cycles for this typical cropland. Annual total nitrogen input was 632 kg N·hm-2, including 523 kg N·hm-2 from commercial fertilizer, 74 kg N·hm-2 from manure, 23 kg N·hm-2 from the atmosphere, and 12 kg N·hm-2 from irrigation. Annual outputs summed to 532 kg N·hm-2, including 289 kg N·hm-2 taken up by the crop, 77 kg N·hm-2 retained in the soil profile, 104 kg N·hm-2 lost by leaching, 52 kg N·hm-2 lost by ammonia volatilization, and 10 kg N·hm-2 lost through nitrification and denitrification. Uncertainties in the individual estimates and in the summation process account for the nitrogen imbalance. For the dominant terms of the field water cycle, annual precipitation was 557 mm and irrigation 340 mm, while evapotranspiration was 762 mm and deep percolation 135 mm. Considering the uncertainties in the nitrogen and water cycles, coupled multi-disciplinary experiments would be useful for understanding the mechanisms of nitrogen and water transfer in the soil-plant-atmosphere continuum (SPAC), the interaction between nitrogen and water, and the critical threshold values for sustainability of soil and water resources in the NCP.
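The reported budgets can be checked with simple arithmetic: the nitrogen inputs and outputs quoted above differ by about 100 kg N·hm-2, while the water budget closes.

```python
# Back-of-the-envelope check of the budgets reported in the abstract above.
n_inputs  = {"fertilizer": 523, "manure": 74, "atmosphere": 23, "irrigation": 12}    # kg N/hm^2/yr
n_outputs = {"crop uptake": 289, "soil storage": 77, "leaching": 104,
             "NH3 volatilization": 52, "(de)nitrification loss": 10}
water_in  = {"precipitation": 557, "irrigation": 340}                                # mm/yr
water_out = {"evapotranspiration": 762, "deep percolation": 135}

print(sum(n_inputs.values()), sum(n_outputs.values()))   # 632 vs 532 -> ~100 kg N/hm^2 unaccounted
print(sum(water_in.values()), sum(water_out.values()))   # 897 vs 897 -> water budget closes
```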
Managing Academic Libraries with Fewer Resources.
ERIC Educational Resources Information Center
Riggs, Donald E.
1992-01-01
A discussion of academic library management during retrenchment looks at a variety of issues, including staffing needs in the labor-intensive library environment, acquisitions budgeting, interlibrary cooperation (ownership vs. access to resources), entrepreneurship and strategic planning for problem solving, and use of total quality management…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dafler, J.R.; Sinnott, J.; Novil, M.
The first phase of a study to identify candidate processes and products suitable for future exploitation using high-temperature solar energy is presented. This phase has been principally analytical, consisting of techno-economic studies, thermodynamic assessments of chemical reactions and processes, and the determination of market potentials for major chemical commodities that use significant amounts of fossil resources today. The objective was to identify energy-intensive processes that would be suitable for the production of chemicals and fuels using solar energy process heat. Of particular importance was the comparison of relative costs and energy requirements for the selected solar product versus costs for the product derived from conventional processing. The assessment methodology used a systems analytical approach to identify processes and products having the greatest potential for solar energy-thermal processing. This approach was used to establish the basis for work to be carried out in subsequent phases of development. It has been the intent of the program to divide the analysis and process identification into the following three distinct areas: (1) process selection, (2) process evaluation, and (3) ranking of processes. Four conventional processes were selected for assessment namely, methanol synthesis, styrene monomer production, vinyl chloride monomer production, and terephthalic acid production.
Schuchmann, Maike; Siemers, Björn M
2010-09-17
Only recently have data on bat echolocation call intensities started to accumulate. Yet intensity is an ecologically crucial parameter, as it determines the extent of the bats' perceptual space and, specifically, prey detection distance. Interspecifically, we thus asked whether sympatric, congeneric bat species differ in call intensities and whether such differences play a role in niche differentiation. Specifically, we investigated whether R. mehelyi, which calls at a frequency clearly above what is predicted by allometry, compensates for the frequency-dependent loss in detection distance by using elevated call intensity. Maximum echolocation call intensities might depend on body size or condition and thus be used as an honest signal of quality for intraspecific communication. We investigated for the first time whether a size-intensity relation is present in echolocating bats. We measured maximum call intensities and frequencies for all five European horseshoe bat species. Maximum intensity differed among species, largely due to R. euryale. Furthermore, we found no compensation for the frequency-dependent loss in detection distance in R. mehelyi. Intraspecifically, there is a negative correlation between forearm length and intensity in R. euryale and a trend for a negative correlation between body condition index and intensity in R. ferrumequinum. In R. hipposideros, females had 8 dB higher intensities than males. For the other species, there were no correlations between intensity and body size and no sex differences. Based on call intensity and frequency measurements, we estimated echolocation ranges for our study community. These suggest that intensity differences result in different prey detection distances and thus likely play some role in resource access. It is interesting and at first glance counter-intuitive that, where a correlation was found, smaller bats called louder than larger individuals. Such a negative relationship between size or condition and vocal amplitude may indicate an as yet unknown physiological or sexual selection pressure.
Rogan, Fran; San Miguel, Caroline
2013-09-01
Increasingly, students with English as a second language (ESL) are enrolled in nursing degrees in English-speaking countries (Wang et al., 2008). However, they may be at risk of clinical practice failure due to communication difficulties associated with unfamiliar linguistic and cultural factors (Guhde, 2003). This paper describes and evaluates an innovation to assist ESL nursing students at an Australian university to develop their clinical communication skills and practice readiness by providing online learning resources, using podcast and vodcast technology, that blend with classroom activities and facilitate flexible and independent learning. The innovation builds on an intensive clinical language workshop program called 'Clinically Speaking', which has evolved through a cyclical process of ongoing research to produce resources in response to students' learning needs. Whilst uptake of the resources was modest, students of ESL as well as English-speaking backgrounds (ESB) found that the resources improved their clinical preparation and confidence by increasing their understanding of expectations, clinical language and communication skills. The innovation, developed with a modest budget, shows potential in developing ESL and ESB students' readiness for clinical communication, enabling them to engage in clinical practice and develop the competency standards required of nursing graduates and registration authorities. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Rodríguez-Gómez, Guillermo; Palmqvist, Paul; Ros-Montoya, Sergio; Espigares, M. Patrocinio; Martínez-Navarro, Bienvenido
2017-05-01
With an age of ∼1.6-1.5 Ma, the Early Pleistocene site of Venta Micena (Orce, Baza Basin, SE Spain) has provided the Late Villafranchian large mammal assemblage with the highest preservational completeness in Western Europe and offers a unique opportunity to analyze the food webs of the mammalian paleocommunity before the first human arrival on this continent. Taphonomic analysis of the fossil assemblage has shown evidence of carnivore involvement, particularly hyenas, in the bone-accumulating process. In this study we use a mathematical approach based on Leslie matrices to quantify the biomass of ungulates available to the members of the carnivore guild as well as the pattern of resource partitioning and competition intensity among them. The results show that although the biomass of primary consumers available to the secondary consumers was lower than the value expected under optimal conditions, more than half the expected number of individuals and biomass of carnivores could be supported, which allowed a viable ecosystem at Venta Micena. In fact, the biomass available to the members of the carnivore guild is 25-30% greater than the estimates obtained for two nearby sites, Barranco León-D and Fuente Nueva-3, which are somewhat younger (∼1.4 Ma) and preserve the oldest evidence of human presence in this region. Given that the competition intensity estimated for the carnivore guild of Venta Micena was lower than at the latter sites, this suggests that the timing of the first human dispersal into Western Europe was probably not a matter of ecological opportunity.
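As a generic illustration of the Leslie-matrix bookkeeping mentioned above (not the authors' model), the sketch below projects a hypothetical ungulate population and converts it to standing biomass; the age classes, fecundities, survival rates and body masses are invented for the example.

```python
# Minimal Leslie-matrix sketch under hypothetical demographic parameters.
import numpy as np

fecundity = np.array([0.0, 0.8, 1.2, 0.9])      # offspring per female per age class (assumed)
survival  = np.array([0.55, 0.7, 0.6])          # probability of surviving to the next class (assumed)

L = np.zeros((4, 4))
L[0, :] = fecundity                             # first row: reproduction
L[np.arange(1, 4), np.arange(0, 3)] = survival  # sub-diagonal: survival

n = np.array([100.0, 60.0, 35.0, 15.0])         # individuals per age class (assumed)
mass = np.array([80.0, 160.0, 220.0, 240.0])    # mean body mass (kg) per class (assumed)

for year in range(5):                           # project the population forward
    n = L @ n

# Standing biomass after projection; turning this into biomass *available to
# carnivores* would additionally require mortality and carcass-consumption assumptions.
print(round(float(mass @ n), 1), "kg")
```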
Smith, James R.; Ghazoul, Jaboury; Burslem, David F. R. P.; Itoh, Akira; Khoo, Eyen; Lee, Soon Leong; Maycock, Colin R.; Nanami, Satoshi; Ng, Kevin Kit Siong; Kettle, Chris J.
2018-01-01
Documenting the scale and intensity of fine-scale spatial genetic structure (FSGS), and the processes that shape it, is relevant to the sustainable management of genetic resources in timber tree species, particularly where logging or fragmentation might disrupt gene flow. In this study we assessed patterns of FSGS in three species of Dipterocarpaceae (Parashorea tomentella, Shorea leprosula and Shorea parvifolia) across four different tropical rain forests in Malaysia using nuclear microsatellite markers. Topographic heterogeneity varied across the sites. We hypothesised that forests with high topographic heterogeneity would display increased FSGS among the adult populations driven by habitat associations. This hypothesis was not supported for S. leprosula and S. parvifolia which displayed little variation in the intensity and scale of FSGS between sites despite substantial variation in topographic heterogeneity. Conversely, the intensity of FSGS for P. tomentella was greater at a more topographically heterogeneous than a homogeneous site, and a significant difference in the overall pattern of FSGS was detected between sites for this species. These results suggest that local patterns of FSGS may in some species be shaped by habitat heterogeneity in addition to limited gene flow by pollen and seed dispersal. Site factors can therefore contribute to the development of FSGS. Confirming consistency in species’ FSGS amongst sites is an important step in managing timber tree genetic diversity as it provides confidence that species specific management recommendations based on species reproductive traits can be applied across a species’ range. Forest managers should take into account the interaction between reproductive traits and site characteristics, its consequences for maintaining forest genetic resources and how this might influence natural regeneration across species if management is to be sustainable. PMID:29547644
Separation of Allelopathy from Resource Competition Using Rice/Barnyardgrass Mixed-Cultures
Fang, Chang Xun; Lin, Zhi Hua; Yu, Zheng Ming; Lin, Wen Xiong
2012-01-01
Plant-plant interference is the combined effect of allelopathy, resource competition, and many other factors. Separating allelopathy from resource competition is almost impossible in natural systems, but it is important to evaluate the relative contribution of each of the two mechanisms to plant interference. Research on allelopathy in natural and cultivated plant communities has been hindered by the absence of a reliable method that can separate allelopathic effects from resource competition. In this paper, the interactions of the allelopathic rice accession PI312777 and the non-allelopathic rice accession Lemont with barnyardgrass were explored using a target (rice)-neighbor (barnyardgrass) mixed culture in a hydroponic system. The relative competitive intensity (RCI), the relative neighbor effect (RNE) and the competitive ratio (CR) were used to quantify the intensity of competition between each of the two potentially allelopathic rice accessions and barnyardgrass. Use of the hydroponic culture system enabled us to exclude uncontrolled factors that might operate in the soil, so that we were able to separate allelopathy from resource competition between each rice accession and barnyardgrass. The RCI and RNE values showed that the plant-plant interaction was positive (facilitation) for PI312777 but negative (competition) for Lemont and barnyardgrass in rice/barnyardgrass mixed cultures. The CR values showed that one PI312777 plant was more competitive than two barnyardgrass plants. The allelopathic effects of PI312777 were much more intense than resource competition in rice/barnyardgrass mixed cultures; the reverse was true for Lemont. These results demonstrate that the allelopathic effect of PI312777 was predominant in rice/barnyardgrass mixed cultures. The most significant result of our study is the discovery that an experimental design, target-neighbor mixed culture in combination with competition indices, can successfully separate allelopathic effects from competition. PMID:22590655
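For orientation, the snippet below computes two of the named indices using commonly cited textbook formulations; these may differ in detail from the exact formulas applied in the study, and the performance values are hypothetical.

```python
# Hedged sketch of common competition-index formulations (not necessarily the study's).
def rci(p_alone, p_mix):
    """Relative competitive intensity: positive -> competition, negative -> facilitation."""
    return (p_alone - p_mix) / p_alone

def rne(p_alone, p_mix):
    """Relative neighbour effect, bounded in [-1, 1]."""
    return (p_alone - p_mix) / max(p_alone, p_mix)

# Hypothetical dry weights (g per plant): grown alone vs. in mixed culture.
print(rci(12.0, 14.5), rne(12.0, 14.5))   # negative values -> facilitation (PI312777-like pattern)
print(rci(12.0,  8.0), rne(12.0,  8.0))   # positive values -> competition (Lemont-like pattern)
```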
Pandemic influenza-implications for critical care resources in Australia and New Zealand.
Anderson, Therese A; Hart, Graeme K; Kainer, Marion A
2003-09-01
To quantify resource requirements (additional beds and ventilator capacity) for critical care services in the event of pandemic influenza. A cross-sectional survey of existing and potential critical care resources. Participants comprised 156 of the 176 Australasian (Australia and New Zealand) critical care units on the database of the Australian and New Zealand Intensive Care Society (ANZICS) Research Centre for Critical Care Resources. The Meltzer, Cox and Fukuda model was adapted to map a range of influenza attack rate estimates for hospitalisation and episodes likely to require intensive care and to predict critical care admission rates and bed day requirements. Estimations of ventilation rates were based on those for community-acquired pneumonia. The estimated extra number of persons requiring hospitalisation ranged from 8,455 (10% attack rate) to 150,087 (45% attack rate). The estimated number of additional admissions to critical care units ranged from 423 (5% admission rate, 10% attack rate) to 37,522 (25% admission rate, 45% attack rate). The potential number of required intensive care bed days ranged from 846 bed days (2-day length of stay, 10% attack rate) to 375,220 bed days (10-day length of stay, 45% attack rate). The number of persons likely to require mechanical ventilation ranged from 106 (25% of projected critical care admissions, 10% attack rate) to 28,142 (75% of projected critical care admissions, 45% attack rate). An additional 1,195 emergency ventilator beds were identified in public sector hospitals and 248 in private sector hospitals. Cancellation of elective surgery could release a potential 76,402 intensive care bed days (per annum), but in the event of pandemic influenza, 31,150 bed days could be required over an 8- to 12-week period. Australasian critical care services would be overwhelmed in the event of pandemic influenza. More work is required in relation to modelling, contingency plans, and resource allocation.
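The quoted ranges follow from simple arithmetic on their stated components, as the check below shows (this is only an arithmetic check, not the adapted Meltzer-Cox-Fukuda model itself).

```python
# Reproducing the reported ranges from their stated components.
low_admissions  = 423       # 5% ICU admission rate at a 10% attack rate
high_admissions = 37_522    # 25% ICU admission rate at a 45% attack rate

print(low_admissions * 2)            # 846 bed days (2-day length of stay)
print(high_admissions * 10)          # 375,220 bed days (10-day length of stay)
print(round(low_admissions * 0.25))  # ~106 patients ventilated (25% of admissions)
print(round(high_admissions * 0.75)) # ~28,142 patients ventilated (75% of admissions)
```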
What is a hospital bed day worth? A contingent valuation study of hospital Chief Executive Officers.
Page, Katie; Barnett, Adrain G; Graves, Nicholas
2017-02-14
Decreasing hospital length of stay, and so freeing up hospital beds, represents an important cost saving which is often used in economic evaluations. The savings need to be accurately quantified in order to make optimal health care resource allocation decisions. Traditionally the accounting cost of a bed is used. We argue instead that the economic cost of a bed day is the better value for making resource decisions, and we describe our valuation method and estimations for costing this important resource. We performed a contingent valuation using 37 Australian Chief Executive Officers' (CEOs) willingness to pay (WTP) to release bed days in their hospitals, both generally and using specific cases. We provide a succinct thematic analysis from qualitative interviews post survey completion, which provide insight into the decision making process. On average CEOs are willing to pay a marginal rate of $216 for a ward bed day and $436 for an Intensive Care Unit (ICU) bed day, with estimates of uncertainty being greater for ICU beds. These estimates are significantly lower (four times for ward beds and seven times for ICU beds) than the traditional accounting costs often used. Key themes to emerge from the interviews include the importance of national funding and targets, and their associated incentive structures, as well as the aversion to discuss bed days as an economic resource. This study highlights the importance for valuing bed days as an economic resource to inform cost effectiveness models and thus improve hospital decision making and resource allocation. Significantly under or over valuing the resource is very likely to result in sub-optimal decision making. We discuss the importance of recognising the opportunity costs of this resource and highlight areas for future research.
Impact of coastal processes on resource development with an example from Icy Bay, Alaska
Molnia, Bruce F.
1978-01-01
The coastline of Alaska is dynamic and continually readjusting to changes in the many processes that operate in the coastal zone. Because of this dynamic nature, special consideration must be given to planning for development, and caution must be exercised in site selection for facilities to be emplaced in the coastal zone. All types of coastal processes, from continuously active normal processes to low-frequency, high-intensity rare events, must be considered. Site-specific evaluations considering the broad range of possible processes must precede initiation of development. An example of the relation between coastal processes and a proposed resource treatment facility is presented for Icy Bay, Alaska. Icy Bay is the only sheltered bay near many of the offshore tracts leased for petroleum exploration in the 1976 northern Gulf of Alaska OCS (Outer Continental Shelf) lease sale. Consequently, it has been selected as a primary onshore staging site for the support of offshore exploration and development. The environment of Icy Bay has many potentially hazardous features, including a submarine moraine at the bay mouth and actively calving glaciers at the bay's head which produce many icebergs. Most significant from the point of view of locating onshore facilities and pipeline corridors, however, are the high rates of shoreline erosion and sediment deposition. If pipelines or any onshore staging facilities are to be placed in the coastal areas of Icy Bay, then the dynamic changes in shoreline position must be considered so that man-made structures will not be eroded away or silted in before the completion of development.
The correlation study of parallel feature extractor and noise reduction approaches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dewi, Deshinta Arrova; Sundararajan, Elankovan; Prabuwono, Anton Satria
2015-05-15
This paper presents a literature review of techniques for developing a parallel feature extractor and for finding its correlation with noise reduction approaches for low light intensity images. Low light intensity images typically appear dark and low in contrast. Without proper handling techniques, such images regularly lead to misperception of objects and textures and to an inability to segment them. The resulting visual illusions often cause disorientation, user fatigue, and poor detection and classification performance by humans and computer algorithms. Noise reduction (NR) is therefore an essential step preceding other image processing steps such as edge detection, image segmentation, image compression, etc. A Parallel Feature Extractor (PFE), meant to capture the visual content of images, involves partitioning images into segments, detecting image overlaps if any, and controlling distributed and redistributed segments to extract the features. Working on low light intensity images makes the PFE face challenges and depend closely on the quality of its pre-processing steps. Some papers have suggested many well-established NR and PFE strategies; however, only a few resources have suggested or discussed the correlation between them. This paper reviews the best approaches to NR and PFE with a detailed explanation of the suggested correlation. This finding may suggest relevant strategies for PFE development. With the help of knowledge-based reasoning, computational approaches and algorithms, we present a correlation study between NR and PFE that can be useful for the development and enhancement of existing PFEs.
Ivanova, S M; Mokurov, B V; Iarlykova, Iu V; Labetskaia, O I
2012-01-01
Erythrocyte metabolism and erythropoiesis intensity were investigated in human subjects (6 males, 25 to 37 years of age) who volunteered for an experimental simulation of such factors of a mission to Mars as a very long duration (520 d) of isolation and confinement, autonomy, delayed communication, emergencies, and limited consumable resources. Venous blood and extracted erythrocytes were analyzed in the baseline data collection period (2 weeks before the experiment), on days 60, 120, 170, 240, 300, 360, 417 and 510 of the experiment, and on days 7-8 after its completion. Erythrocyte metabolic and plasma membrane parameters were measured. Blood serum was analyzed for iron turnover; erythropoiesis intensity was evaluated by the erythropoietin level. According to the results, there were phase-type shifts in the parameters throughout the experiment that were particularly significant on days 60 and 120. Inhibition of energy production and enhancement of reparative processes in the cell may be signs of compensatory reactions aimed at controlling oxidation processes and raising the antioxidative efficiency of the cell. The phase-type changes in membrane lipids and phospholipids point to increased microviscosity of the plasma membrane at the beginning and at the end of the experiment. Hemoglobin content in blood and erythrocytes showed a significant increase on day 510 of isolation and in the ensuing recovery period. Data on iron turnover and erythropoietin level provide evidence of an adequate bone marrow response to the changed hemoglobin content in blood.
Microwave irradiation biodiesel processing of waste cooking oil
NASA Astrophysics Data System (ADS)
Motasemi, Farough; Ani, Farid Nasir
2012-06-01
A major part of the world's total energy output is generated from fossil fuels; consequently, their consumption has continuously increased, which accelerates the depletion of fossil fuel reserves and raises the price of these valuable, limited resources. Biodiesel is a renewable, non-toxic and biodegradable diesel fuel which can be an environmentally friendly and readily attainable alternative to fossil fuels. The costs of feedstock and of the production process are two important factors that particularly work against large-scale biodiesel production. This study is intended to optimize three critical reaction parameters, namely the intensity of mixing, the microwave exit power and the reaction time, in the transesterification of waste cooking oil using microwave irradiation, in an attempt to reduce the production cost of biodiesel. For the reaction, a methanol/oil molar ratio of 6:1 and potassium hydroxide (2 wt%) as the catalyst were used. The results showed that the best yield (95%) was obtained using 300 W microwave exit power, 300 rpm stirrer speed (intensity of mixing) and 78°C for 5 min. It was observed that increasing the intensity of mixing greatly improves the biodiesel yield (by up to 17%). Moreover, the results demonstrate that increasing the reaction time at low microwave exit power (100 W) improves the biodiesel yield, while it has a negative effect on the conversion yield at higher microwave exit power (300 W). From the obtained results it was clear that the FAME was within the standards for biodiesel fuel.
Sun, Yongqi; Sridhar, Seetharaman; Seetharaman, Seshadri; Wang, Hao; Liu, Lili; Wang, Xidong; Zhang, Zuotai
2016-01-01
Herein, a big Fe-C-Ca cycle clarifying the basic element flows and energy flows in modern carbon-intensive industries, including the metallurgical and cement industries, is proposed for the first time in the context of present-day emission reduction and iron ore degradation. This big cycle focuses on three industrial elements, Fe, C and Ca, and thus mainly comprises three interdependent loops, i.e., a C-cycle, a Fe-cycle and a Ca-path. As an example, we started from the integrated disposal of hot steel slags, a man-made iron resource, via char gasification; the employment of hematite, a natural iron resource, greatly extends the application area of this idea. Accordingly, based on this concept, the theoretical potentials for energy saving, emission reduction and Fe resource recovery achievable in modern industry are estimated at up to 7.66 Mt of standard coal, 63.9 Mt of CO2 and 25.2 Mt of pig iron, respectively. PMID:26923104
Lennert, Barb; Farrelly, Eileen; Sacco, Patricia; Pira, Geraldine; Frost, Michael
2013-04-01
Seizures are a hallmark manifestation of tuberous sclerosis complex, yet data characterizing resource utilization are lacking. This retrospective chart review was performed to assess the economic burden of tuberous sclerosis complex with neurologic manifestations. Demographic and resource utilization data were collected for 95 patients for up to 5 years after tuberous sclerosis complex diagnosis. Mean age at diagnosis was 3.1 years, with complex partial and infantile spasms as the most common seizure types. In the first 5 years post-diagnosis, 83.2% required hospitalization, 30.5% underwent surgery, and the majority of patients (90.5%) underwent ≥3 testing procedures. In 79 patients with a full 5 years of data, hospitalizations, intensive care unit stays, diagnostic testing, and rehabilitation services decreased over the 5-year period. Resource utilization is cost-intensive in children with tuberous sclerosis complex and associated seizures during the first few years following diagnosis. Improving seizure control and reducing health care costs in this population remain unmet needs.
NASA Astrophysics Data System (ADS)
Shadananan Nair, K.
2016-10-01
India's freshwater resources are being rapidly degraded and depleted by the changing climate and the pressure of a fast-rising population. The changing intensity and seasonality of rainfall affect the quantity and quality of water. Most of the rivers are polluted far above safety limits by untreated domestic, industrial and agricultural effluents. Changes in the intensity, frequency and tracks of storms salinize coastal aquifers. Aquifers are also under threat from rising sea level. Groundwater within urban limits and industrial zones is polluted far beyond safety limits. Large-scale destruction of wetlands for industries and residential complexes has affected the quality of surface water and groundwater resources in most parts of India. Measures to maintain food security and new development schemes such as river linking will further deteriorate the water resources. Falling water availability leads to serious health issues and various socio-economic problems. India needs urgent and appropriate adaptation strategies in the water sector.
Gidoin, Cynthia; Avelino, Jacques; Deheuvels, Olivier; Cilas, Christian; Bieng, Marie Ange Ngo
2014-03-01
Vegetation composition and plant spatial structure affect disease intensity through resource and microclimatic variation effects. The aim of this study was to evaluate the independent effect and relative importance of host composition and plant spatial structure variables in explaining disease intensity at the plot scale. For that purpose, frosty pod rot intensity, a disease caused by Moniliophthora roreri on cacao pods, was monitored in 36 cacao agroforests in Costa Rica in order to assess the vegetation composition and spatial structure variables conducive to the disease. Hierarchical partitioning was used to identify the most causal factors. Firstly, pod production, cacao tree density and shade tree spatial structure had significant independent effects on disease intensity. In our case study, the amount of susceptible tissue was the most relevant host composition variable for explaining disease intensity by resource dilution. Indeed, cacao tree density probably affected disease intensity more by the creation of self-shading rather than by host dilution. Lastly, only regularly distributed forest trees, and not aggregated or randomly distributed forest trees, reduced disease intensity in comparison to plots with a low forest tree density. A regular spatial structure is probably crucial to the creation of moderate and uniform shade as recommended for frosty pod rot management. As pod production is an important service expected from these agroforests, shade tree spatial structure may be a lever for integrated management of frosty pod rot in cacao agroforests.
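As a rough illustration of hierarchical partitioning (not the study's implementation), the sketch below computes the independent contribution of each predictor to explained variance by averaging its added R² over all model hierarchies; the predictors and data are simulated placeholders.

```python
# Hedged sketch of hierarchical partitioning of R^2 across predictors.
from itertools import combinations
from math import factorial
import numpy as np

def r2(X, y):
    """R^2 of an OLS fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X]) if X.shape[1] else np.ones((len(y), 1))
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

def hierarchical_partition(X, y, names):
    k = X.shape[1]
    fits = {s: r2(X[:, list(s)], y) for r in range(k + 1) for s in combinations(range(k), r)}
    independent = {}
    for j in range(k):
        contrib = 0.0
        for s, fit in fits.items():
            if j in s:
                continue
            w = factorial(len(s)) * factorial(k - len(s) - 1) / factorial(k)   # Shapley-style weight
            contrib += w * (fits[tuple(sorted(s + (j,)))] - fit)
        independent[names[j]] = contrib
    return independent

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                       # stand-ins for pod production, cacao density, shade structure
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(size=200)
print(hierarchical_partition(X, y, ["pod_production", "cacao_density", "shade_structure"]))
```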
[Video-based self-control in surgical teaching. A new tool in a new concept].
Dahmen, U; Sänger, C; Wurst, C; Arlt, J; Wei, W; Dondorf, F; Richter, B; Settmacher, U; Dirsch, O
2013-10-01
Image- and video-based result and process control are essential tools of a new teaching concept for conveying surgical skills. The new teaching concept integrates approved teaching principles and new media. Every performance of an exercise is videotaped and the result photographically recorded. The quality of the process and of the result thus becomes accessible to analysis by the teacher and the learner. The learners are instructed to perform a criteria-based self-analysis of their own video and image material. The new learning concept has so far been successfully applied in seven rounds of the newly designed modular class "Intensivkurs Chirurgische Techniken" (intensive training in surgical techniques). Result documentation and analysis via digital pictures was completed by almost every student. The quality of the results was high. Interestingly, result quality did not correlate with the time needed for the exercise. The training success had a lasting effect. The new and elaborate concept improves the quality of teaching. In the long run, resources for patient care should be saved when students are trained according to this concept prior to performing tasks in the operating theater. These resources should be allocated to further refining innovative teaching concepts.
A CPS Based Optimal Operational Control System for Fused Magnesium Furnace
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai, Tian-you; Wu, Zhi-wei; Wang, Hong
Fused magnesia smelting in a fused magnesium furnace (FMF) is an energy-intensive process involving high temperatures and considerable complexity. Its operational index, namely energy consumption per ton (ECPT), is defined as the electrical energy consumed per ton of acceptable-quality product and is difficult to measure online. Moreover, the dynamics of ECPT cannot be precisely modelled mathematically. The parameters of the model of the three-phase electrode currents, such as the molten pool level, its variation rate and the resistance, are uncertain and nonlinear functions of changes in both the smelting process and the raw material composition. In this paper, an integrated optimal operational control algorithm is proposed, composed of a current set-point control, a current switching control and a self-optimized tuning mechanism. The tight conjoining of, and coordination between, the computational resources, including the integrated optimal operational control, embedded software, industrial cloud and wireless communication, and the physical resources of the FMF constitutes a cyber-physical system (CPS) based embedded optimal operational control system. This system has been successfully applied to a production line with ten fused magnesium furnaces in a factory in China, leading to a significantly reduced ECPT.
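A very loose sketch of the set-point-plus-switching idea described above; the control gains, current values and toy plant response are invented and do not represent the authors' control law.

```python
# Toy sketch: track a current set-point with PI control and switch to a safe
# set-point when the deviation exceeds a limit. All values are hypothetical.
def pi_step(error, integral, kp=0.4, ki=0.05):
    integral += error
    return kp * error + ki * integral, integral

setpoint, safe_setpoint, limit = 1500.0, 1200.0, 250.0   # amperes, assumed
current, integral = 1480.0, 0.0
for _ in range(10):
    active = setpoint if abs(setpoint - current) < limit else safe_setpoint
    u, integral = pi_step(active - current, integral)
    current += 0.5 * u                                    # toy plant response
print(round(current, 1))
```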
Population regulation and character displacement in a seasonal environment.
Goldberg, Emma E; Lande, Russell; Price, Trevor D
2012-06-01
Competition has negative effects on population size and also drives ecological character displacement, that is, evolutionary divergence to utilize different portions of the resource spectrum. Many species undergo an annual cycle composed of a lean season of intense competition for resources and a breeding season. We use a quantitative genetic model to study the effects of differential reproductive output in the summer or breeding season on character displacement in the winter or nonbreeding season. The model is developed with reference to the avian family of Old World leaf warblers (Phylloscopidae), which breed in the temperate regions of Eurasia and winter in tropical and subtropical regions. Empirical evidence implicates strong winter density-dependent regulation driven by food shortage, but paradoxically, the relative abundance of each species appears to be determined by conditions in the summer. We show how population regulation in the two seasons becomes linked, with higher reproductive output by one species in the summer resulting in its evolution to occupy a larger portion of niche space in the winter. We find short-term ecological processes and longer-term evolutionary processes to have comparable effects on a species' population size. This modeling approach can also be applied to other differential effects of productivity across seasons.
Understanding Variability To Reduce the Energy and GHG Footprints of U.S. Ethylene Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yao, Yuan; Graziano, Diane J.; Riddle, Matthew
2015-11-18
Recent growth in U.S. ethylene production due to the shale gas boom is affecting the U.S. chemical industry's energy and greenhouse gas (GHG) emissions footprints. To evaluate these effects, a systematic, first-principles model of the cradle-to-gate ethylene production system was developed and applied. The variances associated with estimating the energy consumption and GHG emission intensities of U.S. ethylene production, both from conventional natural gas and from shale gas, are explicitly analyzed. A sensitivity analysis illustrates that the large variances in energy intensity are due to process parameters (e.g., compressor efficiency), and that large variances in GHG emissions intensity are due to fugitive emissions from upstream natural gas production. On the basis of these results, the opportunities with the greatest leverage for reducing the energy and GHG footprints are presented. The model and analysis provide energy analysts and policy makers with a better understanding of the drivers of energy use and GHG emissions associated with U.S. ethylene production. They also constitute a rich data resource that can be used to evaluate options for managing the industry's footprints moving forward.
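To make the idea of the sensitivity analysis concrete, the toy sketch below perturbs two hypothetical parameters one at a time through assumed ranges and reports the resulting swings; the surrogate formulas and numbers are placeholders, not the study's cradle-to-gate model.

```python
# One-at-a-time sensitivity sketch with invented surrogate intensity models.
baseline = dict(compressor_efficiency=0.80, fugitive_ch4_rate=0.015, cracker_yield=0.55)

def energy_intensity(p):        # GJ per tonne ethylene, toy surrogate
    return 17.0 / p["compressor_efficiency"] / p["cracker_yield"] * 0.55

def ghg_intensity(p):           # t CO2e per tonne ethylene, toy surrogate
    return 1.0 + 28.0 * p["fugitive_ch4_rate"] * 1.9

for name, lo_hi in {"compressor_efficiency": (0.70, 0.90),
                    "fugitive_ch4_rate": (0.005, 0.03)}.items():
    swings = []
    for metric in (energy_intensity, ghg_intensity):
        vals = [metric({**baseline, name: v}) for v in lo_hi]
        swings.append(round(max(vals) - min(vals), 2))
    print(name, "-> swing in (energy, GHG):", swings)
```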
A hybrid life cycle inventory of nano-scale semiconductor manufacturing.
Krishnan, Nikhil; Boyd, Sarah; Somani, Ajay; Raoux, Sebastien; Clark, Daniel; Dornfeld, David
2008-04-15
The manufacturing of modern semiconductor devices involves a complex set of nanoscale fabrication processes that are energy and resource intensive, and generate significant waste. It is important to understand and reduce the environmental impacts of semiconductor manufacturing because these devices are ubiquitous components in electronics. Furthermore, the fabrication processes used in the semiconductor industry are finding increasing application in other products, such as microelectromechanical systems (MEMS), flat panel displays, and photovoltaics. In this work we develop a library of typical gate-to-gate materials and energy requirements, as well as emissions associated with a complete set of fabrication process models used in manufacturing a modern microprocessor. In addition, we evaluate upstream energy requirements associated with chemicals and materials using both existing process life cycle assessment (LCA) databases and an economic input-output (EIO) model. The result is a comprehensive data set and methodology that may be used to estimate and improve the environmental performance of a broad range of electronics and other emerging applications that involve nano and micro fabrication.
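A minimal sketch of the hybrid idea described above, assuming a process-LCA factor is used where available and an EIO intensity otherwise; all factors, prices and item names are placeholders, not values from the study.

```python
# Hedged sketch of hybrid LCA accounting: process factors first, EIO fallback.
process_factors = {"silane": 85.0, "ultrapure water": 0.004}   # MJ per kg, hypothetical
eio_intensity = {"specialty chemicals": 12.0}                  # MJ per US$, hypothetical

def upstream_energy(item, mass_kg=None, price_usd=None, sector=None):
    if item in process_factors and mass_kg is not None:
        return mass_kg * process_factors[item]                 # process-based route
    return price_usd * eio_intensity[sector]                   # EIO fallback

bill_of_inputs = [
    ("silane", dict(mass_kg=0.02)),
    ("photoresist", dict(price_usd=30.0, sector="specialty chemicals")),
]
total = sum(upstream_energy(name, **kw) for name, kw in bill_of_inputs)
print(total, "MJ upstream (illustrative)")
```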
Samuelson, Karin A M; Corrigan, Ingrid
2009-01-01
The benefits of critical care follow-up services include an increased understanding of the long-term consequences of intensive care and helping patients and their next of kin to come to terms with their problems and distress following critical illness and intensive care treatment. To establish an intensive care after-care programme and to conduct a preliminary evaluation of the follow-up service from the patients' and relatives' perspectives in a general intensive care unit (ICU) in Sweden. A descriptive and evaluative design was used, and data from the first year of the after-care programme were collected. The final programme was nurse led and included five main points: a patient diary with colour photographs, ward visits, a patient information pamphlet, a follow-up consultation 2-3 months after intensive care discharge, and feedback to the ICU staff. An evaluation questionnaire was handed out to patients and next of kin attending the follow-up clinic, asking the respondents, for example, to rate their satisfaction with the consultation on a visual analogue scale (VAS). The first year of after-care statistics showed that 170 survivors with a stay of 48 h or more were discharged from the ICU, resulting in 190 ward visits and 79 follow-up consultations. The preliminary evaluation revealed that the 2-month follow-up consultation achieved a median VAS rating of 9.8 (on a scale from 1 to 10, poor to excellent) from both patients and next of kin. The development and preliminary evaluation of this nurse-led intensive care programme resulted in a feasible programme, requiring modest resources, with a high level of patient and relative satisfaction. This paper attempts to share with professional colleagues important steps in the developmental process of establishing an intensive care follow-up service and presents the content and preliminary evaluation of a nurse-led intensive care after-care programme focusing on the patients' and relatives' perspectives.
Allan, Catherine K; Thiagarajan, Ravi R; Beke, Dorothy; Imprescia, Annette; Kappus, Liana J; Garden, Alexander; Hayes, Gavin; Laussen, Peter C; Bacha, Emile; Weinstock, Peter H
2010-09-01
Resuscitation of pediatric cardiac patients involves unique and complex physiology, requiring multidisciplinary collaboration and teamwork. To optimize team performance, we created a multidisciplinary Crisis Resource Management training course that addressed both teamwork and technical skill needs for the pediatric cardiac intensive care unit. We sought to determine whether participation improved caregiver comfort and confidence levels regarding future resuscitation events. We developed a simulation-based, in situ Crisis Resource Management curriculum using pediatric cardiac intensive care unit scenarios and unit-specific resuscitation equipment, including an extracorporeal membrane oxygenation circuit. Participants replicated the composition of a clinical team. Extensive video-based debriefing followed each scenario, focusing on teamwork principles and technical resuscitation skills. Pre- and postparticipation questionnaires were used to determine the effects on participants' comfort and confidence regarding participation in future resuscitations. A total of 182 providers (127 nurses, 50 physicians, 2 respiratory therapists, 3 nurse practitioners) participated in the course. All participants scored the usefulness of the program and scenarios as 4 of 5 or higher (5 = most useful). There was significant improvement in participants' perceived ability to function as a code team member and confidence in a code (P < .001). Participants reported they were significantly more likely to raise concerns about inappropriate management to the code leader (P < .001). We developed a Crisis Resource Management training program in a pediatric cardiac intensive care unit to teach technical resuscitation skills and improve team function. Participants found the experience useful and reported improved ability to function in a code. Further work is needed to determine whether participation in the Crisis Resource Management program objectively improves team function during real resuscitations. 2010 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Kumar, Parmeshwar; Jithesh, Vishwanathan; Gupta, Shakti Kumar
2015-01-01
Context: Though intensive care units (ICUs) account for only 10% of hospital beds, they consume nearly 22% of hospital resources. Few definitive costing studies have been conducted in Indian settings that would help determine appropriate resource allocation. Aim: To evaluate and compare the cost of intensive care delivery between a multi-specialty and a neurosurgery ICU in an apex trauma care facility in India. Materials and Methods: The study was conducted in the polytrauma and neurosurgery ICUs of a 203-bed level IV trauma care facility in New Delhi, India, from May 2012 to June 2012. The study was cross-sectional, retrospective, and record-based. Traditional costing was used to arrive at both direct and indirect cost estimates. The cost centers included in the study were building cost, equipment cost, human resources, materials and supplies, clinical and nonclinical support services, engineering maintenance cost, and biomedical waste management. Statistical Analysis: Fisher's two-tailed t-test. Results: The total cost/bed/day was Rs. 14,976.9/- for the multi-specialty ICU and Rs. 14,306.7/- for the neurosurgery ICU, with manpower constituting nearly half of the expenditure in both ICUs. Both the cost-center-wise and the overall differences in cost between the ICUs were statistically significant. Conclusions: Quantification of the expenditure of running an ICU in a trauma center would assist healthcare decision makers in better allocation of resources. Although multi-specialty ICUs are more expensive, other factors will also play a role in defining the kind of ICU that needs to be designed. PMID:25829909
A bioinformatics knowledge discovery in text application for grid computing
Castellano, Marcello; Mastronardi, Giuseppe; Bellotti, Roberto; Tarricone, Gianfranco
2009-01-01
Background A fundamental activity in biomedical research is Knowledge Discovery, which has the ability to search through large amounts of biomedical information such as documents and data. High-performance computational infrastructures, such as Grid technologies, are emerging as a possible infrastructure to tackle the intensive use of Information and Communication resources in life science. The goal of this work was to develop a software middleware solution in order to exploit the many knowledge discovery applications on scalable and distributed computing systems and so achieve intensive use of ICT resources. Methods The development of a grid application for Knowledge Discovery in Text using a middleware-based methodology is presented. The system must be able to represent a user application model and to process jobs by creating many parallel jobs for distribution across the computational nodes. Finally, the system must be aware of the available computational resources and their status, and must be able to monitor the execution of the parallel jobs. These operational requirements led to the design of a middleware that is specialized through user application modules. It included a graphical user interface for access to a node search system, a load balancing system and a transfer optimizer to reduce communication costs. Results A middleware solution prototype and its performance evaluation in terms of the speed-up factor are presented. It was written in Java on Globus Toolkit 4 to build the grid infrastructure based on GNU/Linux grid computing nodes. A test was carried out and the results are shown for the named entity recognition search of symptoms and pathologies. The search was applied to a collection of 5,000 scientific documents taken from PubMed. Conclusion In this paper we discuss the development of a grid application based on a middleware solution. It has been tested on a knowledge discovery in text process to extract new and useful information about symptoms and pathologies from a large collection of unstructured scientific documents. As an example, a Knowledge Discovery in Databases computation was applied to the output produced by the KDT user module to extract new knowledge about symptom and pathology bio-entities. PMID:19534749
A bioinformatics knowledge discovery in text application for grid computing.
Castellano, Marcello; Mastronardi, Giuseppe; Bellotti, Roberto; Tarricone, Gianfranco
2009-06-16
A fundamental activity in biomedical research is Knowledge Discovery, which has the ability to search through large amounts of biomedical information such as documents and data. High-performance computational infrastructures, such as Grid technologies, are emerging as a possible infrastructure to tackle the intensive use of Information and Communication resources in life science. The goal of this work was to develop a software middleware solution in order to exploit the many knowledge discovery applications on scalable and distributed computing systems and so achieve intensive use of ICT resources. The development of a grid application for Knowledge Discovery in Text using a middleware-based methodology is presented. The system must be able to represent a user application model and to process jobs by creating many parallel jobs for distribution across the computational nodes. Finally, the system must be aware of the available computational resources and their status, and must be able to monitor the execution of the parallel jobs. These operational requirements led to the design of a middleware that is specialized through user application modules. It included a graphical user interface for access to a node search system, a load balancing system and a transfer optimizer to reduce communication costs. A middleware solution prototype and its performance evaluation in terms of the speed-up factor are presented. It was written in Java on Globus Toolkit 4 to build the grid infrastructure based on GNU/Linux grid computing nodes. A test was carried out and the results are shown for the named entity recognition search of symptoms and pathologies. The search was applied to a collection of 5,000 scientific documents taken from PubMed. In this paper we discuss the development of a grid application based on a middleware solution. It has been tested on a knowledge discovery in text process to extract new and useful information about symptoms and pathologies from a large collection of unstructured scientific documents. As an example, a Knowledge Discovery in Databases computation was applied to the output produced by the KDT user module to extract new knowledge about symptom and pathology bio-entities.
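The core idea described above is splitting a text-mining task into many parallel jobs distributed across computing nodes. The following minimal sketch illustrates that job-splitting pattern only; it is not the authors' middleware. Python's multiprocessing pool stands in for grid nodes, and the term list and document collection are hypothetical placeholders.

```python
# Illustrative sketch (not the paper's middleware): split a document
# collection into parallel jobs, the way a grid scheduler would distribute
# them across nodes. multiprocessing stands in for the grid.
from multiprocessing import Pool

SYMPTOM_TERMS = {"fever", "cough", "fatigue"}  # hypothetical gazetteer

def ner_job(docs):
    """Toy named-entity search: return documents mentioning a symptom term."""
    hits = []
    for doc_id, text in docs:
        found = {t for t in SYMPTOM_TERMS if t in text.lower()}
        if found:
            hits.append((doc_id, sorted(found)))
    return hits

def split_into_jobs(collection, n_jobs):
    """Partition the collection into n_jobs roughly equal chunks."""
    return [collection[i::n_jobs] for i in range(n_jobs)]

if __name__ == "__main__":
    collection = [(i, f"document {i} reports persistent cough") for i in range(5000)]
    jobs = split_into_jobs(collection, n_jobs=8)
    with Pool(processes=8) as pool:          # each chunk -> one "node"
        results = pool.map(ner_job, jobs)
    total_hits = sum(len(r) for r in results)
    print(f"{total_hits} documents matched across {len(jobs)} parallel jobs")
```

The speed-up factor reported in the paper corresponds to how much faster the pooled run completes compared with processing the whole collection in a single job.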
Bauer-Nilsen, Kristine; Hill, Colin; Trifiletti, Daniel M; Libby, Bruce; Lash, Donna H; Lain, Melody; Christodoulou, Deborah; Hodge, Constance; Showalter, Timothy N
2018-01-01
To evaluate the delivery costs, using time-driven activity-based costing, and reimbursement for definitive radiation therapy for locally advanced cervical cancer. Process maps were created to represent each step of the radiation treatment process and included personnel, equipment, and consumable supplies used to deliver care. Personnel were interviewed to estimate time involved to deliver care. Salary data, equipment purchasing information, and facilities costs were also obtained. We defined the capacity cost rate (CCR) for each resource and then calculated the total cost of patient care according to CCR and time for each resource. Costs were compared with 2016 Medicare reimbursement and relative value units (RVUs). The total cost of radiation therapy for cervical cancer was $12,861.68, with personnel costs constituting 49.8%. Brachytherapy cost $8610.68 (66.9% of total) and consumed 423 minutes of attending radiation oncologist time (80.0% of total). External beam radiation therapy cost $4055.01 (31.5% of total). Personnel costs were higher for brachytherapy than for the sum of simulation and external beam radiation therapy delivery ($4798.73 vs $1404.72). A full radiation therapy course provides radiation oncologists 149.77 RVUs with intensity modulated radiation therapy or 135.90 RVUs with 3-dimensional conformal radiation therapy, with total reimbursement of $23,321.71 and $16,071.90, respectively. Attending time per RVU is approximately 4-fold higher for brachytherapy (5.68 minutes) than 3-dimensional conformal radiation therapy (1.63 minutes) or intensity modulated radiation therapy (1.32 minutes). Time-driven activity-based costing was used to calculate the total cost of definitive radiation therapy for cervical cancer, revealing that brachytherapy delivery and personnel resources constituted the majority of costs. However, current reimbursement policy does not reflect the increased attending physician effort and delivery costs of brachytherapy. We hypothesize that the significant discrepancy between treatment costs and physician effort versus reimbursement may be a potential driver of reported national trends toward poor compliance with brachytherapy, and we suggest re-evaluation of payment policies to incentivize quality care. Copyright © 2017 Elsevier Inc. All rights reserved.
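The arithmetic behind time-driven activity-based costing is simply the sum, over process steps, of each resource's capacity cost rate multiplied by the time it is consumed. The sketch below shows that calculation under assumed inputs; the CCR values and minutes are hypothetical placeholders, not the figures reported in the study.

```python
# Minimal time-driven activity-based costing sketch. The capacity cost
# rates (CCR, dollars per minute) and minutes below are hypothetical,
# not the values from the study.
def tdabc_cost(steps):
    """Total cost = sum over process steps of CCR * time consumed."""
    return sum(ccr_per_min * minutes for _, ccr_per_min, minutes in steps)

brachytherapy_steps = [
    ("attending radiation oncologist", 4.10, 423),   # hypothetical CCR
    ("physicist",                      2.50, 180),
    ("treatment delivery equipment",   1.20, 240),
    ("consumable supplies",            1.00,  60),
]

print(f"Estimated brachytherapy cost: ${tdabc_cost(brachytherapy_steps):,.2f}")
```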
NASA Astrophysics Data System (ADS)
Boettcher, Steven; Merz, Christoph; Lischeid, Gunnar
2015-04-01
The water budget of many catchments has changed vastly over the last decades. Intensified land use and increased water withdrawal for drinking water production and irrigation are likely to intensify pressure on water resources. According to model predictions, changing rainfall intensity, duration and spatial distribution in conjunction with increasing temperatures will worsen the situation in the future. Current water resources management has to adapt to these negative developments and to account for competing demands and threats. Essential for successful management applications is the identification and quantification of the cause-and-effect chains driving the hydrological behavior of a catchment at the scale of management. Managers need to check the direction and magnitude of the intended effects of measures taken, as well as to identify unintended side effects that interact with natural effects in heterogeneous environments (Wood et al., 1988; Bloschl and Sivapalan, 1995). Therefore, analysis tools have to be able to distinguish between natural and anthropogenically driven impacts, even in complex geological settings like the Pleistocene landscape of North-East Germany. This study presents an approach that utilizes monitoring data to detect and quantitatively describe the predominant processes or factors of an observed hydrological system. The multivariate data analysis involves a non-linear dimension reduction method called Isometric Feature Mapping (Isomap; Tenenbaum et al., 2000) to extract information about the causes of the observed dynamics. Ordination methods like Isomap are used to derive a meaningful low-dimensional representation of a complex, high-dimensional data set. The approach is based on the hypothesis that the number of processes which explain the variance of the data is relatively low, although the intensity of the processes varies in time and space. Therefore, the results can be interpreted with reference to the effective hydrological processes which control the system. The method was applied to a data set of groundwater heads and lake water levels. Two factors explaining more than 95 percent of the observed spatial variation were identified: (1) the anthropogenic impact of a waterworks in the study area and (2) natural groundwater recharge dynamics with different degrees of dampening at the respective sites of observation. The spatial variation of the identified processes revealed previously unknown hydraulic connections between two aquifers and between surface water bodies and groundwater. The obtained information can be used to reduce model structure uncertainty and to enable more efficient process-based modeling of hydraulic system behavior. Thus, the approach provides essential information to evaluate and adapt strategies for integrated water resources management in complex landscapes. Bloschl, G., Sivapalan, M., 1995. Scale Issues in Hydrological Modeling - a Review. Hydrological Processes, 9(3-4): 251-290. Tenenbaum, J.B., de Silva, V., Langford, J.C., 2000. A global geometric framework for nonlinear dimensionality reduction. Science, 290: 2319-2323. Wood, E.F., Sivapalan, M., Beven, K., Band, L., 1988. Effects of Spatial Variability and Scale with Implications to Hydrologic Modeling. Journal of Hydrology, 102(1-4): 29-47.
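As a rough illustration of the ordination step described above, the sketch below applies scikit-learn's Isomap to a synthetic matrix of monitoring series (sites by time steps). The synthetic "recharge" and "pumping" signals and all parameter choices are assumptions for demonstration only, not the study's preprocessing or data.

```python
# Sketch of nonlinear dimension reduction with Isomap (scikit-learn) on a
# synthetic monitoring matrix; not the authors' exact workflow or data.
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)
n_sites, n_times = 40, 365
# Synthetic groundwater-head series: a recharge-like signal plus a
# pumping (waterworks) signal, mixed in different proportions per site.
recharge = np.cumsum(rng.normal(0, 1, n_times))
pumping = np.sin(np.linspace(0, 20, n_times))
mix = rng.random((n_sites, 2))
heads = mix @ np.vstack([recharge, pumping]) + rng.normal(0, 0.1, (n_sites, n_times))

embedding = Isomap(n_neighbors=8, n_components=2).fit_transform(heads)
print(embedding.shape)  # (40, 2): each site scored on two dominant "processes"
```

In the study's terms, each retained component would be interpreted against known drivers (waterworks impact, recharge dampening) by inspecting how the site scores vary in space.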
Supporting Positive Behaviour in Alberta Schools: An Intensive Individualized Approach
ERIC Educational Resources Information Center
Souveny, Dwaine
2008-01-01
Drawing on current research and best practices, this third part of the three-part resource, "Supporting Positive Behaviour in Alberta Schools," provides information and strategies for providing intensive, individualized support and instruction for the small percentage of students requiring a high degree of intervention. This system of…
Intensive olive orchards on sloping land: good water and pest management are essential.
Metzidakis, I; Martinez-Vilela, A; Castro Nieto, G; Basso, B
2008-11-01
There is intensive cultivation of olives on sloping land in Jaen-Granada (Spain), Basilicata (Italy) and Western Crete (Greece). The intensive olive groves here are characterised by a tree density of about 250 trees ha(-1), yearly fertilisation and pruning, several chemical sprays for pest control, soil tillage one to three times per year and irrigation of up to 2700 m3 ha(-1) yr(-1). Intensive management results in high yields of 3600-6500 kg ha(-1) but also higher labour costs of 1154-1590 euro ha(-1) yr(-1), varying per area. The major environmental concerns in this system are related to chemical residues in the fruit, the extinction of useful insects, the depletion of groundwater resources, the pollution of soil and water and the erosion of soil. This paper describes the impact of intensive orchard management on natural resources and gives recommendations for soil and water conservation, reduction of chemical use and biodiversity enhancement. The specific recommendations for the relevant stakeholders (farmers, technicians, agricultural services and policy makers) are based on the experimental evaluation of different agricultural practices and a socio-economic analysis of local and global production and markets.
Ecosystem-based incorporation of nectar-producing plants for stink bug parasitoids
USDA-ARS?s Scientific Manuscript database
Adult parasitoids of pest insects rely on floral resources for survival and reproduction but can be food-deprived in intensively managed agricultural systems lacking these resources. Stink bugs are serious pests of crops in southwest Georgia. Provisioning nectar-producing plants for parasitoids of s...
[Process design in high-reliability organizations].
Sommer, K-J; Kranz, J; Steffens, J
2014-05-01
Modern medicine is a highly complex service industry in which individual care providers are linked in a complicated network. This complexity and interlinkedness is associated with risks to patient safety. Other highly complex industries, like commercial aviation, have succeeded in maintaining or even increasing their safety levels despite rapidly increasing passenger figures. Standard operating procedures (SOPs), crew resource management (CRM) and operational risk evaluation (ORE) are historically developed and trusted parts of a comprehensive and systemic safety program. If medicine wants to follow this quantum leap towards increased patient safety, it must intensively evaluate the results of other high-reliability industries and seek step-by-step implementation after a critical assessment.
Entrepreneurship management in health services: an integrative model.
Guo, Kristina L
2006-01-01
This research develops an integrated systems model of entrepreneurship management as a method for achieving health care organizational survival and growth. Specifically, it analyzes current health care environment challenges, identifies the roles of managers, discusses organizational theories that are relevant to the health care environment, outlines the role of entrepreneurship in health care, and describes the entrepreneurial manager in the entrepreneurial management process to produce desirable organizational outcomes. The study concludes that as the current health care environment continues to show intense competition, entrepreneurial managers are responsible for creating innovations, managing change, investing in resources, and recognizing opportunities in the environment to increase organizational viability.
Conducting Research from Small University Observatories: Investigating Exoplanet Candidates
NASA Astrophysics Data System (ADS)
Moreland, Kimberly D.
2018-01-01
Kepler has to date discovered 4,496 exoplanet candidates, but only half are confirmed, and only a handful are thought to be Earth sized and in the habitable zone. Planet verification often involves extensive follow-up observations, which are both time and resource intensive. The data set collected by Kepler is massive and will be studied for decades. University and small observatories, such as the one at Texas State University, are in a good position to assist with the exoplanet candidate verification process. By performing extended monitoring campaigns, which are otherwise cost-ineffective for larger observatories, students gain valuable research experience and contribute valuable data and results to the scientific community.
NASA Astrophysics Data System (ADS)
Yoo, C. M.; Joo, J.; Hyeong, K.; Chi, S. B.
2016-12-01
Manganese nodules, also known as polymetallic nodules, contain valuable elements in high concentrations and are regarded as one of the most important future mineral resources. They occur throughout the world's oceans, but economically feasible deposits show a limited distribution, restricted to several deep-sea basins including the Clarion-Clipperton Fracture Zone (CCFZ) in the northeast equatorial Pacific. Estimation of resource potential is one of the key factors prerequisite for an economic feasibility study. Nodule abundance is commonly estimated from direct nodule sampling; however, it is difficult to obtain statistically robust data because of the highly variable spatial distribution and the high cost of direct sampling. Variogram analysis indicates that a 3.5 × 3.5 km sampling resolution is needed to obtain indicated-category resource data, which requires over 1,000 sampling operations to cover a potential exploitation area with a mining life of 20-30 years. High-resolution acoustic surveys, bathymetry and back-scattered intensity, can provide high-resolution resource data along with the definition of obstacles, such as faults and scarps, for the operation of nodule-collecting robots. We operated a 120 kHz deep-tow side scan sonar (DTSSS) with a spatial resolution of 1 × 1 m in a representative area. Sea floor images were also taken continuously by a deep-tow camera along selected tracks, converted to nodule abundance using an image analysis program and a conversion equation, and compared with the acoustic data. Back-scattering intensity values could be divided into several groups and translated into nodule abundance with a high confidence level. Our result indicates that a high-resolution acoustic survey is an appropriate tool for reliable assessment of manganese nodule abundance and definition of the minable area.
Aspects of energy transitions: History and determinants
NASA Astrophysics Data System (ADS)
O'Connor, Peter A.
Energy intensity in the U.S. from 1780 to 2010 shows a declining trend when traditional energy is included, in contrast to the "inverted U-curve" seen when only commercial energy is considered. The analysis quantifies use of human and animal muscle power, wind and water power, biomass, harvested ice, fossil fuels, and nuclear power. Historical prices are provided for many energy resources. The analysis reaffirms the importance of innovation in conversion technologies in energy transitions. An increase in energy intensity in the early 20th century is explained by diminishing returns to pre-electric manufacturing systems, which produced a transformation in manufacturing. In comparison to similar studies for other countries, the U.S. has generally higher energy intensity. A population-weighted series of heating degree days and cooling degree days partially explains differences in energy intensity. Series are developed for 231 countries and territories with multiple reference temperatures, with a "wet-bulb" series accounting for the effects of humidity. Other variables considered include energy prices, income per capita, and governance indices. A panel regression of thirty-two countries from 1995 to 2010 establishes GDP per capita and share of primary energy as determinants of energy intensity, but fails to establish statistical significance of the climate variables. A group mean regression finds average heating and cooling degree days to be significant predictors of average energy intensity over the study period, increasing energy intensity by roughly 1.5 kJ per 2005 international dollar for each annual degree day. Group mean regression results explain differences in countries' average energy intensity, but not changes within a country over time. Energy Return on Investment (EROI) influences the economic competitiveness and environmental impacts of an energy resource and is one driver of energy transitions. The EROI of U.S. petroleum production has declined since 1972, with a partial rebound in the 1980s and 1990s. External Energy Return (EER), which excludes the consumption of energy from within the resource, falls by two-thirds from 1972 to 2007. A literature review finds the projected EROI of oil shale to be much lower than the EROI of U.S. petroleum production.
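The EROI and EER quantities discussed above are simple ratios: energy delivered divided by energy invested, with EER counting only the energy purchased from outside the resource sector. The sketch below shows that arithmetic with entirely hypothetical numbers; it is not the dissertation's data or estimates.

```python
# Hedged sketch of the EROI / EER arithmetic; all figures are hypothetical.
def eroi(energy_out, energy_invested):
    """Energy Return on Investment: output per unit of total energy invested."""
    return energy_out / energy_invested

def eer(energy_out, external_energy_in):
    """External Energy Return: only energy obtained from outside the
    resource sector counts as the investment (self-consumption excluded)."""
    return energy_out / external_energy_in

out = 100.0              # energy delivered (hypothetical units)
invested_total = 10.0    # including energy self-consumed within the sector
invested_external = 4.0  # purchased from outside the sector

print(f"EROI = {eroi(out, invested_total):.1f}:1")
print(f"EER  = {eer(out, invested_external):.1f}:1")
```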
Zhang, Mingji; Wang, Wei; Millar, Ross; Li, Guohong; Yan, Fei
2017-08-04
Health reform in China since 2009 has emphasized basic public health services to enhance the function of Community Health Services as a primary health care facility. A variety of studies have documented these efforts, and the challenges these have faced, yet up to now the experience of primary health care (PHC) providers in terms of how they have coped with these changes remains underdeveloped. Despite the abundant literature on psychological coping processes and mechanisms, the application of coping research within the context of human resources for health remains yet to be explored. This research aims to understand how PHC providers coped with the new primary health care model and the job characteristics brought about by these changes. Semi-structured interviews with primary health care workers were conducted in Jinan city of Shandong province in China. A maximum variation sampling method selected 30 PHC providers from different specialties. Thematic analysis was used drawing on a synthesis of theories related to the Job Demands-Resources model, work adjustment, and the model of exit, voice, loyalty and neglect to understand PHC providers' coping strategies. Our interviews identified that the new model of primary health care significantly affected the nature of primary health work and triggered a range of PHC providers' coping processes. The results found that health workers perceived their job as less intensive than hospital medical work but often more trivial, characterized by heavy workload, blurred job description, unsatisfactory income, and a lack of professional development. However, close relationship with community and low work pressure were satisfactory. PHC providers' processing of job demands and resources displayed two ways of interaction: aggravation and alleviation. Processing of job demands and resources led to three coping strategies: exit, passive loyalty, and compromise with new roles and functions. Primary health care providers employed coping strategies of exit, passive loyalty, and compromise to deal with changes in primary health work. In light of these findings, our paper concludes that it is necessary for the policymakers to provide further job resources for CHS, and involve health workers in policy-making. The introduction of particular professional training opportunities to support job role orientation for PHC providers is advocated.
Comparison of typical mega cities in China using emergy synthesis
NASA Astrophysics Data System (ADS)
Zhang, L. X.; Chen, B.; Yang, Z. F.; Chen, G. Q.; Jiang, M. M.; Liu, G. Y.
2009-06-01
An emergy-based comparison analysis is conducted for three typical mega cities in China, i.e., Beijing, Shanghai and Guangzhou, from 1990 to 2005 from four perspectives: emergy intensity, resource structure, environmental pressure and resource use efficiency. A new index, the non-renewable emergy/money ratio, is established to indicate the utilization efficiency of non-renewable resources. The results show that for the three mega urban systems, Beijing, Shanghai and Guangzhou, the total emergy inputs in 2005 were 3.76E+23, 3.54E+23 and 2.52E+23 sej, of which 64.88%, 91.45% and 72.28%, respectively, were imported from outside. As to the indicators of emergy intensity, involving total emergy use, emergy density and emergy use per capita, the three cities exhibited similar overall increasing trends with annual fluctuations from 1990 to 2005. Shanghai achieved the highest level of economic development and non-renewable resource use efficiency, but also a lower proportion of renewable resource use and higher environmental pressure compared with Beijing and Guangzhou. Guangzhou shows better long-term sustainability, considering the amount of local renewable resources used, per capita emergy use, energy consumption per unit GDP and the ratio of waste to renewable emergy. It can be concluded that the different emergy-based evaluation results arise from the different geographical locations, resource endowments, industrial structures and urban orientations of the mega cities concerned.
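The indicators named above are ratios of emergy flows to population, area or economic output. The sketch below shows plausible formulas under stated assumptions: the total emergy and imported share for Beijing 2005 are taken from the abstract, while the non-renewable emergy, GDP, population and area values are hypothetical placeholders, and the non-renewable emergy/money ratio is interpreted here as non-renewable emergy per unit of GDP (an assumption, since the paper's exact definition is not given in the abstract).

```python
# Sketch of emergy indicators; values marked hypothetical are not the paper's.
def emergy_indicators(total_emergy, imported_emergy, nonrenewable_emergy,
                      gdp, population, area_km2):
    return {
        "imported_share_%": 100 * imported_emergy / total_emergy,
        "emergy_per_capita_sej": total_emergy / population,
        "emergy_density_sej_per_km2": total_emergy / area_km2,
        # Assumed reading of the new index: non-renewable emergy per unit GDP.
        "nonrenewable_emergy_money_ratio_sej_per_yuan": nonrenewable_emergy / gdp,
    }

beijing_2005 = emergy_indicators(
    total_emergy=3.76e23,                 # sej, from the abstract
    imported_emergy=0.6488 * 3.76e23,     # 64.88% imported, from the abstract
    nonrenewable_emergy=2.5e23,           # hypothetical
    gdp=6.9e11, population=1.5e7, area_km2=1.64e4)  # hypothetical rough values
for name, value in beijing_2005.items():
    print(f"{name}: {value:.3g}")
```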
Correcting geometric and photometric distortion of document images on a smartphone
NASA Astrophysics Data System (ADS)
Simon, Christian; Williem; Park, In Kyu
2015-01-01
A set of document image processing algorithms for improving the optical character recognition (OCR) capability of smartphone applications is presented. The scope of the problem covers the geometric and photometric distortion correction of document images. The proposed framework was developed to satisfy industrial requirements. It is implemented on an off-the-shelf smartphone with limited resources in terms of speed and memory. Geometric distortions, i.e., skew and perspective distortion, are corrected by sending horizontal and vertical vanishing points toward infinity in a downsampled image. Photometric distortion includes image degradation from moiré pattern noise and specular highlights. Moiré pattern noise is removed using low-pass filters with different sizes independently applied to the background and text region. The contrast of the text in a specular highlighted area is enhanced by locally enlarging the intensity difference between the background and text while the noise is suppressed. Intensive experiments indicate that the proposed methods show a consistent and robust performance on a smartphone with a runtime of less than 1 s.
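For readers unfamiliar with the geometric step, the sketch below shows the standard homography-based rectification that such a pipeline ultimately performs once the document quadrilateral is known. It is not the paper's vanishing-point algorithm; the corner coordinates and output size are assumptions, and OpenCV is used only as a convenient reference implementation.

```python
# Generic perspective-correction sketch with OpenCV; not the paper's method.
# Assumes the four document corners have already been detected.
import cv2
import numpy as np

def rectify_document(image, corners, out_w=1240, out_h=1754):
    """Warp a quadrilateral document region to an upright rectangle."""
    src = np.asarray(corners, dtype=np.float32)          # TL, TR, BR, BL
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    H = cv2.getPerspectiveTransform(src, dst)             # 3x3 homography
    return cv2.warpPerspective(image, H, (out_w, out_h))

# Usage with a synthetic image and assumed (hypothetical) corner detections:
img = np.full((1000, 800, 3), 255, dtype=np.uint8)
corners = [(120, 80), (700, 60), (760, 950), (60, 930)]
flat = rectify_document(img, corners)
print(flat.shape)
```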
Peebles, Emma; Subbe, Christian P; Hughes, Paul; Gemmell, Les
2012-06-01
Rapid Response Teams aim to accelerate recognition and treatment of acutely unwell patients. Delays in delivery might undermine the efficiency of the intervention. Our understanding of the causes of these delays is, as yet, incomplete. To identify modifiable causes of delays in the treatment of critically ill patients outside intensive care, with a focus on factors amenable to system design, we reviewed care records and carried out direct observation with process mapping of the care delivered to 17 acutely unwell patients attended by a Rapid Response Team in a District General Hospital in the United Kingdom. Delays were defined as processes with no added value for patient care. Essential diagnostic and therapeutic procedures accounted for only 31% of the time of care processes. Causes of delays could be classified into three themes: (1) delays in call-out of the Rapid Response Team, (2) problems with team cohesion, including poor communication and team efficiency, and (3) lack of resources, including lack of first-line antibiotics, essential equipment, experienced staff and critical care beds. We identified a number of potentially modifiable causes of delays in the care of acutely ill patients. Improved process design could include automated call-outs, a dedicated kit for emergency treatment in relevant clinical areas, increased usage of standard operating procedures and staff training using crew resource management techniques. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Powering the planet: Chemical challenges in solar energy utilization
Lewis, Nathan S.; Nocera, Daniel G.
2006-01-01
Global energy consumption is projected to increase, even in the face of substantial declines in energy intensity, at least 2-fold by midcentury relative to the present because of population and economic growth. This demand could be met, in principle, from fossil energy resources, particularly coal. However, the cumulative nature of CO2 emissions in the atmosphere demands that holding atmospheric CO2 levels to even twice their preanthropogenic values by midcentury will require invention, development, and deployment of schemes for carbon-neutral energy production on a scale commensurate with, or larger than, the entire present-day energy supply from all sources combined. Among renewable energy resources, solar energy is by far the largest exploitable resource, providing more energy in 1 hour to the earth than all of the energy consumed by humans in an entire year. In view of the intermittency of insolation, if solar energy is to be a major primary energy source, it must be stored and dispatched on demand to the end user. An especially attractive approach is to store solar-converted energy in the form of chemical bonds, i.e., in a photosynthetic process at a year-round average efficiency significantly higher than current plants or algae, to reduce land-area requirements. Scientific challenges involved with this process include schemes to capture and convert solar energy and then store the energy in the form of chemical bonds, producing oxygen from water and a reduced fuel such as hydrogen, methane, methanol, or other hydrocarbon species. PMID:17043226
Schaubroeck, Thomas; Alvarenga, Rodrigo A F; Verheyen, Kris; Muys, Bart; Dewulf, Jo
2013-01-01
Life Cycle Assessment (LCA) is a tool to assess the environmental sustainability of a product; it quantifies the environmental impact of a product's life cycle. In conventional LCAs, the boundaries of a product's life cycle are limited to the human/industrial system, the technosphere. Ecosystems, which provide resources to and take up emissions from the technosphere, are not included in those boundaries. However, similar to the technosphere, ecosystems also have an impact on their (surrounding) environment through their resource usage (e.g., nutrients) and emissions (e.g., CH4). We therefore propose a LCA framework to assess the impact of integrated Techno-Ecological Systems (TES), comprising relevant ecosystems and the technosphere. In our framework, ecosystems are accounted for in the same manner as technosphere compartments. Also, the remediating effect of uptake of pollutants, an ecosystem service, is considered. A case study was performed on a TES of sawn timber production encompassing wood growth in an intensively managed forest ecosystem and further industrial processing. Results show that the managed forest accounted for almost all resource usage and biodiversity loss through land occupation but also for a remediating effect on human health, mostly via capture of airborne fine particles. These findings illustrate the potential relevance of including ecosystems in the product's life cycle of a LCA, though further research is needed to better quantify the environmental impact of TES.
Powering the planet: chemical challenges in solar energy utilization.
Lewis, Nathan S; Nocera, Daniel G
2006-10-24
Global energy consumption is projected to increase, even in the face of substantial declines in energy intensity, at least 2-fold by midcentury relative to the present because of population and economic growth. This demand could be met, in principle, from fossil energy resources, particularly coal. However, the cumulative nature of CO(2) emissions in the atmosphere demands that holding atmospheric CO(2) levels to even twice their preanthropogenic values by midcentury will require invention, development, and deployment of schemes for carbon-neutral energy production on a scale commensurate with, or larger than, the entire present-day energy supply from all sources combined. Among renewable energy resources, solar energy is by far the largest exploitable resource, providing more energy in 1 hour to the earth than all of the energy consumed by humans in an entire year. In view of the intermittency of insolation, if solar energy is to be a major primary energy source, it must be stored and dispatched on demand to the end user. An especially attractive approach is to store solar-converted energy in the form of chemical bonds, i.e., in a photosynthetic process at a year-round average efficiency significantly higher than current plants or algae, to reduce land-area requirements. Scientific challenges involved with this process include schemes to capture and convert solar energy and then store the energy in the form of chemical bonds, producing oxygen from water and a reduced fuel such as hydrogen, methane, methanol, or other hydrocarbon species.
RESOURCE MANAGEMENT AMONG INTENSIVE CARE NURSES: AN ETHNOGRAPHIC STUDY.
Heydari, Abbas; Najar, Ali Vafaee; Bakhshi, Mahmoud
2015-12-01
Nurses are the main users of the supplies and equipment applied in Intensive Care Units (ICUs), which are high-priced and costly. Therefore, understanding ICU nurses' experiences of resource management contributes to better control of costs. This study aimed to investigate the culture of nurses' working environment regarding resource management in ICUs in Iran. A focused ethnographic method was used. Twenty-eight informants among ICU nurses and other professionals were purposively selected and interviewed. In addition, 400 hours of ethnographic observation as a participant observer were used for data gathering. Data analysis was performed using the methods described by Miles and Huberman (1994). Two main themes describing the culture of ICU nurses regarding resource management emerged: (a) consumption monitoring and auditing, and (b) prudent use. The results revealed that efforts at resource management are conducted under conditions of scarcity and uncertainty of supply. ICU nurses took a forward-looking view of the supply and use of resources in the unit and planned by taking into account the rules and guidelines as well as the available resources and their value. Improper storage of some supplies and equipment was a reaction among nurses to this uncertain condition. To manage resources effectively, improvement of supply chain management in hospitals seems essential. It is also necessary to hold educational classes in order to enhance nurses' awareness of effective supply chains and the storage of items in the unit stock.
Isaak, Robert Scott; Stiegler, Marjorie Podraza
2016-04-01
The practice of medicine is characterized by routine and typical cases whose management usually goes according to plan. However, the occasional case does arise which involves rare catastrophic emergencies, such as intraoperative malignant hyperthermia (MH), which require a comprehensive, coordinated, and resource-intensive treatment plan. Physicians are expected to provide expert quality care for routine, typical cases, but is it reasonable to expect the same standard of expertise and comprehensive management when the emergency involves a rare entity? Although physicians would like to say yes to this question, the reality is that no physician will ever amass the amount of experience in patient care needed to truly qualify as an expert in the management of a rare emergency entity, such as MH. However, physicians can become expert in the global process of managing emergencies by using the principles of crisis resource management (CRM). In this article, we review the key concepts of CRM, using a real life example of a team who utilized CRM principles to successfully manage an intraoperative MH crisis, despite there being no one on the team who had ever previously encountered a true MH crisis.
Optimization of the resources management in fighting wildfires.
Martin-Fernández, Susana; Martínez-Falero, Eugenio; Pérez-González, J Manuel
2002-09-01
Wildfires lead to important economic, social, and environmental losses, especially in areas of Mediterranean climate where they are of high intensity and frequency. Over the past 30 years there has been a dramatic surge in the development and use of fire spread models. However, given the chaotic nature of environmental systems, it is very difficult to develop real-time fire-extinguishing models. This article proposes a method of optimizing the performance of wildfire fighting resources such that losses are kept to a minimum. The optimization procedure includes discrete simulation algorithms and Bayesian optimization methods for discrete and continuous problems (simulated annealing and Bayesian global optimization). Fast calculation algorithms are applied to provide optimization outcomes in periods of time short enough that the predictions of the model and the real behavior of the fire, combat resources, and meteorological conditions remain similar. In addition, adaptive algorithms take into account the chaotic behavior of wildfire so that the system can be updated with data corresponding to the real situation to obtain a new optimum solution. The application of this method to the Northwest Forest of Madrid (Spain) is also described. This application allowed us to check that it is a helpful tool in the decision-making process.
Optimization of the Resources Management in Fighting Wildfires
NASA Astrophysics Data System (ADS)
Martin-Fernández, Susana; Martínez-Falero, Eugenio; Pérez-González, J. Manuel
2002-09-01
Wildfires lead to important economic, social, and environmental losses, especially in areas of Mediterranean climate where they are of high intensity and frequency. Over the past 30 years there has been a dramatic surge in the development and use of fire spread models. However, given the chaotic nature of environmental systems, it is very difficult to develop real-time fire-extinguishing models. This article proposes a method of optimizing the performance of wildfire fighting resources such that losses are kept to a minimum. The optimization procedure includes discrete simulation algorithms and Bayesian optimization methods for discrete and continuous problems (simulated annealing and Bayesian global optimization). Fast calculation algorithms are applied to provide optimization outcomes in periods of time short enough that the predictions of the model and the real behavior of the fire, combat resources, and meteorological conditions remain similar. In addition, adaptive algorithms take into account the chaotic behavior of wildfire so that the system can be updated with data corresponding to the real situation to obtain a new optimum solution. The application of this method to the Northwest Forest of Madrid (Spain) is also described. This application allowed us to check that it is a helpful tool in the decision-making process.
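One of the methods named in this abstract, simulated annealing, can be sketched for a resource-allocation problem of this flavour. The toy loss function, sector intensities and cooling schedule below are assumptions standing in for the coupled fire-spread simulation used in the study; the sketch only illustrates the annealing loop itself.

```python
# Hedged sketch of simulated annealing for allocating firefighting crews to
# fire sectors; the loss function is a toy stand-in, not the paper's model.
import math
import random

random.seed(1)
N_SECTORS, N_CREWS = 6, 10
sector_intensity = [random.uniform(1, 5) for _ in range(N_SECTORS)]  # hypothetical

def loss(allocation):
    """Toy expected loss: each sector's loss shrinks with crews assigned."""
    return sum(inten / (1 + crews) for inten, crews in zip(sector_intensity, allocation))

def neighbour(allocation):
    """Move one crew from a randomly chosen occupied sector to another."""
    a = allocation[:]
    i = random.choice([k for k in range(N_SECTORS) if a[k] > 0])
    j = random.randrange(N_SECTORS)
    a[i] -= 1
    a[j] += 1
    return a

current = [N_CREWS // N_SECTORS] * N_SECTORS
current[0] += N_CREWS - sum(current)
best, temp = current, 1.0
for step in range(5000):
    cand = neighbour(current)
    delta = loss(cand) - loss(current)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        current = cand
        if loss(current) < loss(best):
            best = current
    temp *= 0.999  # geometric cooling schedule (assumed)

print("best allocation:", best, "expected loss:", round(loss(best), 3))
```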
Chirico, Peter G.; Malpeli, Katherine C.
2012-01-01
In May of 2000, a meeting was convened in Kimberley, South Africa, by representatives of the diamond industry and leaders of African governments to develop a certification process intended to assure that export shipments of rough diamonds were free of conflict concerns. Outcomes of the meeting were formally supported later in December of 2000 by the United Nations in a resolution adopted by the General Assembly. By 2002, the Kimberley Process Certification Scheme was ratified and signed by diamond-producing and diamond-importing countries. As of August 2012, the Kimberley Process (KP) had 51 participants representing 77 countries. It is often difficult to obtain independent verification of the diamond production statistics that are provided to the KP. However, some degree of independent verification can be obtained through an understanding of a country’s naturally occurring endowment of diamonds and the intensity of mining activities. Studies that integrate these two components can produce a range of estimated values for a country’s diamond production, and these estimates can then be compared to the production statistics released by that country. This methodology is used to calculate (1) the diamond resource potential of a country, which refers to the total number of carats estimated to be remaining in the country, and (2) the diamond production capacity of a country, which is the current volume of diamonds that may realistically be produced per year utilizing current human and physical resources. The following sections outline the methodology used by the U.S. Geological Survey (USGS) to perform diamond assessments in Mali, the Central African Republic, Ghana, and Guinea.
A Low Cost Structurally Optimized Design for Diverse Filter Types
Kazmi, Majida; Aziz, Arshad; Akhtar, Pervez; Ikram, Nassar
2016-01-01
A wide range of image processing applications deploy two-dimensional (2D) filters for performing diversified tasks such as image enhancement, edge detection, noise suppression, multi-scale decomposition and compression. All of these tasks require multiple types of 2D filters simultaneously to acquire the desired results. The resource-hungry conventional approach is not a viable option for implementing these computationally intensive 2D filters, especially in a resource-constrained environment, and thus calls for optimized solutions. Mostly, the optimization of these filters is based on exploiting structural properties. A common shortcoming of all previously reported optimized approaches is their restricted applicability to only a specific filter type. These narrowly scoped solutions completely disregard the versatility requirement of advanced image processing applications and in turn offset their effectiveness when implementing a complete application. This paper presents an efficient framework which exploits the structural properties of 2D filters to effectively reduce their computational cost, with the added advantage of versatility in supporting diverse filter types. A composite symmetric filter structure is introduced which exploits the identities of quadrant and circular T-symmetries in two distinct filter regions simultaneously. These T-symmetries effectively reduce the number of filter coefficients and consequently the multiplier count. The proposed framework at the same time empowers this composite filter structure with the additional capability of realizing all of its Ψ-symmetry-based subtypes and also its special asymmetric filter case. The two-fold optimized framework thus reduces the filter computational cost by up to 75% compared with the conventional approach, while its versatility not only supports diverse filter types but also offers further cost reduction via resource sharing for the sequential implementation of diversified image processing applications, especially in a constrained environment. PMID:27832133
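The saving from quadrant symmetry can be illustrated in a few lines: a quadrant-symmetric kernel is fully defined by one quadrant, so only roughly a quarter of the coefficients (and hence multiplications, after factoring) need to be stored for large kernels. The sketch below demonstrates that reduction only; it is not the paper's hardware filter architecture, and the kernel values are arbitrary.

```python
# Sketch of the coefficient saving from quadrant symmetry; illustrative only,
# not the paper's composite filter structure.
import numpy as np
from scipy.signal import convolve2d

def quadrant_symmetric_kernel(quadrant):
    """Build a (2k-1) x (2k-1) kernel from its k x k top-left quadrant
    (which includes the centre row and column)."""
    top = np.hstack([quadrant, np.fliplr(quadrant[:, :-1])])
    return np.vstack([top, np.flipud(top[:-1, :])])

q = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [3., 6., 9.]])           # only 9 unique coefficients stored
kernel = quadrant_symmetric_kernel(q)  # expands to a 5 x 5 (25-tap) filter
print(kernel.shape, "unique coefficients:", q.size, "of", kernel.size)

image = np.random.rand(64, 64)
filtered = convolve2d(image, kernel, mode="same")
```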
NASA Astrophysics Data System (ADS)
Dempewolf, J.; Becker-Reshef, I.; Nakalembe, C. L.; Tumbo, S.; Maurice, S.; Mbilinyi, B.; Ntikha, O.; Hansen, M.; Justice, C. J.; Adusei, B.; Kongo, V.
2015-12-01
In-season monitoring of crop conditions provides critical information for agricultural policy and decision making and, most importantly, for food security planning and management. Nationwide agricultural monitoring in countries dominated by smallholder farming systems generally relies on extensive networks of field data collectors. In Tanzania, extension agents make up this network and report on conditions across the country, approaching a "near-census". Data are collected on paper, which is resource- and time-intensive as well as prone to errors. Data quality is ambiguous and there is a general lack of clear and functional feedback loops between farmers, extension agents, analysts and decision makers. Moreover, the data are not spatially explicit, limiting their usefulness for analysis and the quality of policy outcomes. Despite significant advances in remote sensing and information and communication technologies (ICT) for monitoring agriculture, the full potential of these new tools is yet to be realized in Tanzania. Their use is constrained by the lack of resources, skills and infrastructure to access and process these data. The use of ICT for data collection, processing and analysis is equally limited. The AgriSense-STARS project is developing and testing a system for national-scale in-season monitoring of smallholder agriculture using a combination of three main tools: 1) GLAM-East Africa, an automated MODIS satellite image processing system; 2) field data collection using GeoODK and unmanned aerial vehicles (UAVs); and 3) the Tanzania Crop Monitor, a collaborative online portal for data management and reporting. These tools are developed and applied in Tanzania through the National Food Security Division of the Ministry of Agriculture, Food Security and Cooperatives (MAFC) within a statistically representative sampling framework (area frame) that ensures data quality, representativeness and resource efficiency.
NASA Astrophysics Data System (ADS)
Shi, Wenwu; Pinto, Brian
2017-12-01
Melting and holding molten metals within crucibles accounts for a large portion of total energy demand in the resource-intensive nonferrous foundry industry. Multivariate mathematical modeling aided by detailed material characterization and advancements in crucible technologies can make a significant impact in the areas of cost-efficiency and carbon footprint reduction. Key thermal properties such as conductivity and specific heat capacity were studied to understand their influence on crucible furnace energy consumption during melting and holding processes. The effects of conductivity on thermal stresses and longevity of crucibles were also evaluated. With this information, accurate theoretical models using finite element analysis were developed to study total energy consumption and melting time. By applying these findings to recent crucible developments, considerable improvements in field performance were reported and documented as case studies in applications such as aluminum melting and holding.
[Concept for a department of intensive care].
Nierhaus, A; de Heer, G; Kluge, S
2014-10-01
Demographic change and increasing complexity are among the reasons for high-tech critical care playing a major and increasing role in today's hospitals. At the same time, intensive care is one of the most cost-intensive departments in the hospital. To guarantee high-quality care, close cooperation of specialised intensive care staff with specialists of all other medical areas is essential. A network of the intensive care units within the hospital may lead to synergistic effects concerning quality of care, simultaneously optimizing the use of human and technical resources. Notwithstanding any organisational concepts, development and maintenance of the highest possible quality of care should be of overriding importance.
Assessment of groundwater recharge in an ash-fall mantled karst aquifer of southern Italy
NASA Astrophysics Data System (ADS)
Manna, F.; Nimmo, J. R.; De Vita, P.; Allocca, V.
2014-12-01
In southern Italy, Mesozoic carbonate formations covered by ash-fall pyroclastic soils form large karst aquifers and major groundwater resources. For these aquifers, even though Allocca et al. (2014) estimated a mean annual groundwater recharge coefficient at the regional scale, a more complete understanding of the recharge processes at small spatio-temporal scales is a primary scientific target. In this paper, we study groundwater recharge processes at the Acqua della Madonna test site (Allocca et al., 2008) through the integrated analysis of piezometric levels, rainfall, soil moisture and air temperature data. These were gathered with hourly frequency by a monitoring station in 2008. We applied the Episodic Master Recharge method (Nimmo et al., 2014) to identify episodes of recharge and estimate the Recharge to Precipitation Ratio (RPR) at both the individual-episode and annual time scales. For the different recharge episodes observed, RPR ranges from 37% to 97%, with an annual mean around 73%. This result was confirmed by a soil water balance and the application of the Thornthwaite-Mather method to estimate actual evapotranspiration. Although it appears higher than RPRs typical of some parts of the world, it is very close to the mean annual groundwater recharge coefficient estimated at the regional scale for the karst aquifers of southern Italy. In addition, the RPR is affected at the daily scale by both antecedent soil moisture and rainfall intensity, as demonstrated by a statistically significant multiple linear regression among these hydrological variables. In particular, recharge magnitude is greatest for low storm intensity and high antecedent soil moisture values. The results advance the understanding of groundwater recharge processes in karst aquifers, and the sensitivity of RPR to antecedent soil moisture and rainfall intensity facilitates prediction of the influence of climate and precipitation regime change on the groundwater recharge process.
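The episode-scale analysis described above boils down to computing RPR per episode and regressing it on antecedent soil moisture and rainfall intensity. The sketch below reproduces that workflow on synthetic data; the variable ranges and the assumed relationship are illustrative placeholders, not the test-site measurements or fitted coefficients.

```python
# Sketch of the episode-scale analysis: compute the recharge-to-precipitation
# ratio (RPR) per episode and regress it on antecedent soil moisture and
# rainfall intensity. Data are synthetic, not the test-site observations.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n_episodes = 30
antecedent_moisture = rng.uniform(0.15, 0.40, n_episodes)   # m3/m3 (assumed)
rain_intensity = rng.uniform(1.0, 25.0, n_episodes)         # mm/h (assumed)
precip = rng.uniform(10, 120, n_episodes)                   # mm per episode

# Assumed synthetic relationship: RPR rises with antecedent moisture and
# falls with storm intensity, plus noise.
rpr = (0.4 + 1.5 * antecedent_moisture - 0.008 * rain_intensity
       + rng.normal(0, 0.03, n_episodes))
recharge = rpr * precip

X = np.column_stack([antecedent_moisture, rain_intensity])
y = recharge / precip                       # episode RPR
model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "R^2:", round(model.score(X, y), 3))
```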
An evaluation of the range and availability of intensive smoking cessation services in Ireland.
Currie, L M; Keogan, S; Campbell, P; Gunning, M; Kabir, Z; Clancy, L
2010-03-01
A review of smoking cessation (SC) services in Ireland is a necessary step in improving service planning and provision. The aim was to assess the range and availability of intensive SC services in Ireland in 2006. A survey of SC service providers in Ireland was conducted. Descriptive analysis and simple linear regression analysis were used. The response rate was 86.3% (63/73). All service providers surveyed employ evidence-based interventions; the most common form of support is individual counselling, with initial sessions averaging 40 min and weekly review sessions 20 min in duration. Reaching the recommended target of treating 5.0% of smokers does not seem feasible given the current distribution of resources, and there appear to be regional differences in resource allocation. While intensive SC services are available in all four Health Service Executive Areas, it would appear that there is little uniformity or consistency countrywide in the scope and structure of these services.
A new ATLAS muon CSC readout system with system on chip technology on ATCA platform
Claus, R.
2015-10-23
The ATLAS muon Cathode Strip Chamber (CSC) back-end readout system has been upgraded during the LHC 2013-2015 shutdown to be able to handle the higher Level-1 trigger rate of 100 kHz and the higher occupancy at Run 2 luminosity. The readout design is based on the Reconfiguration Cluster Element (RCE) concept for high bandwidth generic DAQ implemented on the ATCA platform. The RCE design is based on the new System on Chip Xilinx Zynq series with a processor-centric architecture with ARM processor embedded in FPGA fabric and high speed I/O resources together with auxiliary memories to form a versatile DAQ building block that can host applications tapping into both software and firmware resources. The Cluster on Board (COB) ATCA carrier hosts RCE mezzanines and an embedded Fulcrum network switch to form an online DAQ processing cluster. More compact firmware solutions on the Zynq for G-link, S-link and TTC allowed the full system of 320 G-links from the 32 chambers to be processed by 6 COBs in one ATCA shelf through software waveform feature extraction to output 32 S-links. Furthermore, the full system was installed in Sept. 2014. We will present the RCE/COB design concept, the firmware and software processing architecture, and the experience from the intense commissioning towards LHC Run 2.
Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter
2017-08-01
Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. This project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities was implemented at the home, clinic and institutional level. Control charts of mean hemoglobin A1C (HbA1C) and the proportion of patients meeting the target HbA1C showed improvement as special cause variation was identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. Intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
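For readers unfamiliar with the control-chart step, the sketch below builds a simple individuals chart for monthly mean HbA1C, with the centre line and 3-sigma-style limits estimated from the moving range of a baseline period. The values and the baseline split are synthetic assumptions, and the rule shown is a generic SPC convention rather than the project's exact special-cause criteria.

```python
# Sketch of an individuals control chart for monthly mean HbA1C. Values are
# synthetic; limits use the standard moving-range estimate (d2 = 1.128),
# not necessarily the project's exact SPC rules.
import numpy as np

hba1c_monthly_mean = np.array([9.4, 9.6, 9.3, 9.5, 9.2, 9.4,   # baseline (assumed)
                               9.0, 8.8, 8.6, 8.5, 8.4, 8.3])  # post-intervention

baseline = hba1c_monthly_mean[:6]
center = baseline.mean()
mean_moving_range = np.abs(np.diff(baseline)).mean()
sigma_hat = mean_moving_range / 1.128          # d2 constant for subgroups of 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

for month, value in enumerate(hba1c_monthly_mean, start=1):
    flag = "special cause" if (value < lcl or value > ucl) else ""
    print(f"month {month:2d}: {value:.1f}  {flag}")
print(f"center={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}")
```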
A new ATLAS muon CSC readout system with system on chip technology on ATCA platform
NASA Astrophysics Data System (ADS)
Claus, R.; ATLAS Collaboration
2016-07-01
The ATLAS muon Cathode Strip Chamber (CSC) back-end readout system has been upgraded during the LHC 2013-2015 shutdown to be able to handle the higher Level-1 trigger rate of 100 kHz and the higher occupancy at Run 2 luminosity. The readout design is based on the Reconfiguration Cluster Element (RCE) concept for high bandwidth generic DAQ implemented on the ATCA platform. The RCE design is based on the new System on Chip Xilinx Zynq series with a processor-centric architecture with ARM processor embedded in FPGA fabric and high speed I/O resources together with auxiliary memories to form a versatile DAQ building block that can host applications tapping into both software and firmware resources. The Cluster on Board (COB) ATCA carrier hosts RCE mezzanines and an embedded Fulcrum network switch to form an online DAQ processing cluster. More compact firmware solutions on the Zynq for G-link, S-link and TTC allowed the full system of 320 G-links from the 32 chambers to be processed by 6 COBs in one ATCA shelf through software waveform feature extraction to output 32 S-links. The full system was installed in Sept. 2014. We will present the RCE/COB design concept, the firmware and software processing architecture, and the experience from the intense commissioning towards LHC Run 2.
A new ATLAS muon CSC readout system with system on chip technology on ATCA platform
NASA Astrophysics Data System (ADS)
Bartoldus, R.; Claus, R.; Garelli, N.; Herbst, R. T.; Huffer, M.; Iakovidis, G.; Iordanidou, K.; Kwan, K.; Kocian, M.; Lankford, A. J.; Moschovakos, P.; Nelson, A.; Ntekas, K.; Ruckman, L.; Russell, J.; Schernau, M.; Schlenker, S.; Su, D.; Valderanis, C.; Wittgen, M.; Yildiz, S. C.
2016-01-01
The ATLAS muon Cathode Strip Chamber (CSC) backend readout system has been upgraded during the LHC 2013-2015 shutdown to be able to handle the higher Level-1 trigger rate of 100 kHz and the higher occupancy at Run-2 luminosity. The readout design is based on the Reconfigurable Cluster Element (RCE) concept for high bandwidth generic DAQ implemented on the Advanced Telecommunication Computing Architecture (ATCA) platform. The RCE design is based on the new System on Chip XILINX ZYNQ series with a processor-centric architecture with ARM processor embedded in FPGA fabric and high speed I/O resources. Together with auxiliary memories, all these components form a versatile DAQ building block that can host applications tapping into both software and firmware resources. The Cluster on Board (COB) ATCA carrier hosts RCE mezzanines and an embedded Fulcrum network switch to form an online DAQ processing cluster. More compact firmware solutions on the ZYNQ for high speed input and output fiberoptic links and TTC allowed the full system of 320 input links from the 32 chambers to be processed by 6 COBs in one ATCA shelf. The full system was installed in September 2014. We will present the RCE/COB design concept, the firmware and software processing architecture, and the experience from the intense commissioning for LHC Run 2.
De Kauwe, Martin G; Medlyn, Belinda E; Zaehle, Sönke; Walker, Anthony P; Dietze, Michael C; Wang, Ying-Ping; Luo, Yiqi; Jain, Atul K; El-Masri, Bassil; Hickler, Thomas; Wårlind, David; Weng, Ensheng; Parton, William J; Thornton, Peter E; Wang, Shusen; Prentice, I Colin; Asao, Shinichi; Smith, Benjamin; McCarthy, Heather R; Iversen, Colleen M; Hanson, Paul J; Warren, Jeffrey M; Oren, Ram; Norby, Richard J
2014-01-01
Elevated atmospheric CO2 concentration (eCO2) has the potential to increase vegetation carbon storage if increased net primary production causes increased long-lived biomass. Model predictions of eCO2 effects on vegetation carbon storage depend on how allocation and turnover processes are represented. We used data from two temperate forest free-air CO2 enrichment (FACE) experiments to evaluate representations of allocation and turnover in 11 ecosystem models. Observed eCO2 effects on allocation were dynamic. Allocation schemes based on functional relationships among biomass fractions that vary with resource availability were best able to capture the general features of the observations. Allocation schemes based on constant fractions or resource limitations performed less well, with some models having unintended outcomes. Few models represent turnover processes mechanistically and there was wide variation in predictions of tissue lifespan. Consequently, models did not perform well at predicting eCO2 effects on vegetation carbon storage. Our recommendations to reduce uncertainty include: use of allocation schemes constrained by biomass fractions; careful testing of allocation schemes; and synthesis of allocation and turnover data in terms of model parameters. Data from intensively studied ecosystem manipulation experiments are invaluable for constraining models and we recommend that such experiments should attempt to fully quantify carbon, water and nutrient budgets. PMID:24844873
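To make the distinction between allocation schemes concrete, the toy Python sketch below contrasts a constant-fraction scheme with a functional scheme whose root fraction responds to resource availability. The functional form and all parameter values are invented for illustration and do not correspond to any of the 11 models evaluated.

    # Two stylized carbon-allocation schemes: fixed fractions versus fractions
    # that shift with resource (here, nitrogen) availability. Parameters are
    # illustrative assumptions, not values from the evaluated models.

    def allocate_fixed(npp, f_leaf=0.3, f_wood=0.4, f_root=0.3):
        """Constant-fraction allocation of net primary production (NPP)."""
        return {"leaf": f_leaf * npp, "wood": f_wood * npp, "root": f_root * npp}

    def allocate_functional(npp, nitrogen_availability):
        """Shift allocation toward roots as nitrogen availability (0..1) falls."""
        f_root = 0.2 + 0.3 * (1.0 - nitrogen_availability)
        f_leaf = 0.3
        f_wood = 1.0 - f_leaf - f_root
        return {"leaf": f_leaf * npp, "wood": f_wood * npp, "root": f_root * npp}

    print(allocate_fixed(1000.0))                                  # g C m-2 yr-1, hypothetical
    print(allocate_functional(1000.0, nitrogen_availability=0.4))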
Making the Most of What We Already Know: A Three-Stage Approach to Systematic Reviewing.
Rebelo Da Silva, Natalie; Zaranyika, Hazel; Langer, Laurenz; Randall, Nicola; Muchiri, Evans; Stewart, Ruth
2016-09-06
Conducting a systematic review in social policy is a resource-intensive process in terms of time and funds. It is thus important to understand the scope of the evidence base of a topic area prior to conducting a synthesis of primary research in order to maximize these resources. One approach to conserving resources is to map out the available evidence prior to undertaking a traditional synthesis. A few examples of this approach exist in the form of gap maps, overviews of reviews, and systematic maps supported by social policy and systematic review agencies alike. Despite this growing call for alternative approaches to systematic reviews, it is still common for systematic review teams to embark on a traditional in-depth review only. This article describes a three-stage approach to systematic reviewing that was applied to a systematic review focusing on interventions for smallholder farmers in Africa. We argue that this approach proved useful in helping us to understand the evidence base. By applying preliminary steps as part of a three-stage approach, we were able to make the most of the resources needed to conduct a traditional systematic review on a more focused research question. This enabled us to identify and fill real knowledge gaps, build on work that had already been done, and avoid wasting resources on areas of work that would have no useful outcome. It also facilitated meaningful engagement between the review team and our key policy stakeholders. © The Author(s) 2016.
Pediatric Sepsis Guidelines: Summary for resource-limited countries
Khilnani, Praveen; Singhi, Sunit; Lodha, Rakesh; Santhanam, Indumathi; Sachdev, Anil; Chugh, Krishan; Jaishree, M.; Ranjit, Suchitra; Ramachandran, Bala; Ali, Uma; Udani, Soonu; Uttam, Rajiv; Deopujari, Satish
2010-01-01
Justification: Pediatric sepsis is a commonly encountered global issue. Existing guidelines for sepsis seem to be applicable to developed countries, and only a few articles have been published regarding the application of these guidelines in developing countries, especially in resource-limited settings such as India and Africa. Process: An expert representative panel drawn from all over India, under the aegis of the Intensive Care Chapter of the Indian Academy of Pediatrics (IAP), met to discuss and draw up guidelines for clinical practice and the feasibility of delivery of care in the early hours in pediatric patients with sepsis, keeping in view the unique patient population and limited availability of equipment and resources. Discussion included issues such as sepsis definitions, rapid cardiopulmonary assessment, feasibility of early aggressive fluid therapy, inotropic support, corticosteroid therapy, early endotracheal intubation and use of positive end expiratory pressure/mechanical ventilation, initial empirical antibiotic therapy, glycemic control, and the role of immunoglobulin, blood, and blood products. Objective: To achieve a reasonable evidence-based consensus on the basis of published literature and expert opinion in order to formulate clinical practice guidelines applicable to resource-limited settings such as India. Recommendations: Pediatric sepsis guidelines are presented in text and flow chart format, keeping resource limitations in mind for settings such as India and Africa. Levels of evidence are indicated wherever applicable. It is anticipated that once the guidelines are used and outcome data are evaluated, further modifications will be necessary. It is planned to periodically review and revise these guidelines every 3–5 years as new evidence accumulates. PMID:20606908
NASA Astrophysics Data System (ADS)
Zhang, A.; Feng, D.; Tian, Y.; Zheng, Y.
2017-12-01
Water resources are of fundamental importance to the society and ecosystems of arid endorheic river basins, where water-use conflicts between upstream and downstream areas are usually significant. The Heihe River Basin (HRB) is the second largest endorheic river basin in China and is characterized by a dry climate, intensively irrigated farmland in oases, and significant surface water-groundwater interaction. The irrigation districts in the middle HRB consume a large portion of the river flow, and the lower HRB, mainly Gobi Desert, has an extremely vulnerable ecological environment. Water resources management has significantly altered the hydrological processes in the HRB and now faces multiple challenges, including the decline of the groundwater table in the middle HRB and insufficient environmental flow for the lower HRB. Furthermore, future climate change adds substantial uncertainty to the water system. It is therefore imperative to manage water resources in the HRB sustainably in order to tackle the existing challenges and future uncertainty. Climate projections from a dynamically downscaled climate change scenario show that, over the next 50 years in the HRB, precipitation will increase at a rate of approximately 3 millimeters per decade and temperature at a rate of approximately 0.2 degrees Celsius per decade. Based on an integrated ecohydrological model, we evaluated how climate change and agricultural development would jointly impact the water resources and ecological health of the middle and lower HRB, and investigated how water management should cope with these combined impacts.
Toghanian, Samira; Johnson, David A; Stålhammar, Nils-Olov; Zerbib, Frank
2011-10-01
Research on the negative impact of gastro-oesophageal reflux disease (GORD) on the health-related quality of life (HR-QOL) and resource utilization of patients with persistent and intense GORD symptoms despite proton pump inhibitor (PPI) therapy is lacking. The aim of this study was to describe the population of patients with GORD with persistent moderate-to-severe symptoms despite ongoing PPI therapy, and to compare their HR-QOL and healthcare resource use with patients with low GORD symptom load during ongoing PPI therapy. In this post hoc analysis of the 2007 National Health and Wellness Survey (NHWS), PPI-compliant (≥22 days with PPI use in the past month) European (France, Germany and the UK) and US respondents with physician-diagnosed GORD were stratified into those with persistent and intense GORD symptoms, those with low symptom load, or an intermediate group. 5672 PPI-compliant respondents were identified (persistent and intense symptoms, n = 1741; low symptom load, n = 1805; intermediate group, n = 2126). Respondents with persistent and intense symptoms had poorer HR-QOL than patients with a low symptom load, but none of the differences were statistically significant. Respondents with persistent and intense symptoms also reported lower work productivity (all countries; significant difference [p < 0.01] only in the US), greater activity impairment (all countries; significant difference [p < 0.01] only in the US) and more hours missed from work due to health problems (US, UK and Germany; significant difference [p < 0.01] only in the US). In the UK and US, respondents with persistent and intense symptoms reported significantly more visits to both primary-care physicians and specialists than respondents with a low symptom load (all p < 0.01). Additionally, US respondents with persistent and intense symptoms reported significantly more emergency room visits (p < 0.01). The 2007 NHWS gives support to the hypothesis that persistent and intense GORD symptoms despite PPI therapy have a significant and negative impact on both HR-QOL and healthcare resource utilization. These findings outline the need for new treatment options for symptomatic GORD patients taking PPI therapy.
A Market Model for Evaluating Technologies That Impact Critical-Material Intensity
NASA Astrophysics Data System (ADS)
Iyer, Ananth V.; Vedantam, Aditya
2016-07-01
A recent Critical Materials Strategy report highlighted the supply chain risk associated with neodymium and dysprosium, which are used in the manufacturing of neodymium-iron-boron permanent magnets (PM). In response, the Critical Materials Institute is developing innovative strategies to increase and diversify primary production, develop substitutes, reduce material intensity and recycle critical materials. Our goal in this paper is to propose an economic model to quantify the impact of one of these strategies, material intensity reduction. Technologies that reduce material intensity impact the economics of magnet manufacturing in multiple ways because of: (1) the lower quantity of critical material required per unit PM, (2) more efficient use of limited supply, and (3) the potential impact on manufacturing cost. However, the net benefit of these technologies to a magnet manufacturer is an outcome of an internal production decision subject to market demand characteristics, availability and resource constraints. Our contribution in this paper shows how a manufacturer's production economics moves from a region of being supply-constrained, to a region enabling the market optimal production quantity, to a region being constrained by resources other than critical materials, as the critical material intensity changes. Key insights for engineers and material scientists are: (1) material intensity reduction can have a significant market impact, (2) benefits to manufacturers are non-linear in the material intensity reduction, (3) there exists a threshold value for material intensity reduction that can be calculated for any target PM application, and (4) there is value for new intellectual property (IP) when existing manufacturing technology is IP-protected.
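The regime logic described above, output limited first by the critical material, then by market demand, then by other resources as intensity falls, can be sketched with a toy calculation. The quantities below are hypothetical and the single min() rule is a simplification of the authors' market model, not a reproduction of it.

    # Toy production-regime calculation: the binding constraint on magnet output
    # shifts as critical-material intensity is reduced. All numbers are hypothetical.

    def production_quantity(intensity_kg_per_unit, material_supply_kg,
                            market_optimal_units, other_resource_cap_units):
        supply_limited = material_supply_kg / intensity_kg_per_unit
        return min(supply_limited, market_optimal_units, other_resource_cap_units)

    for intensity in (1.0, 0.6, 0.3, 0.1):      # kg of critical material per magnet
        q = production_quantity(intensity, material_supply_kg=10_000,
                                market_optimal_units=25_000,
                                other_resource_cap_units=40_000)
        print(f"intensity {intensity:.1f} kg/unit -> output {q:,.0f} units")

In this toy setting the threshold intensity below which the critical material stops binding is material_supply_kg / market_optimal_units = 0.4 kg per unit, echoing the threshold result mentioned in the abstract.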
Protecting your forest asset: managing risks in changing times
Lisa Jennings; Leslie Boby; Bill Hubbard; Mark Megalos
2013-01-01
Private forest owners control most of the southern forest resource and are critical to maintaining forest health in the South. Record droughts, rising temperatures, increased frequency and intensity of wildfires, insect and plant invasions, and more intense storm events all pose threats to the health of Southern forests. Scientists project that increases in temperature...
Schmidt, Susanne I; Cuthbert, Mark O; Schwientek, Marc
2017-08-15
Micro scale processes are expected to have a fundamental role in shaping groundwater ecosystems, yet they remain poorly understood and under-researched. In part, this is because sampling is rarely carried out at the scale at which microorganisms, and their grazers and predators, function, and thus we lack essential information. While set within a larger scale framework in terms of geochemical features, supply of energy and nutrients, and exchange intensity and dynamics, the micro scale adds variability by providing heterogeneous zones that enable a wider range of redox reactions. Here we outline how a better understanding of micro scale processes may lead to an improved appreciation of the range of ecosystem functions taking place at all scales. Such processes are relied upon in bioremediation, and we demonstrate that ecosystem modelling as well as engineering measures have to take into account, and use, understanding at the micro scale. We discuss the importance of integrating faunal processes and computational appraisals in research, in order to continue to secure sustainable water resources from groundwater. Copyright © 2017 Elsevier B.V. All rights reserved.
Quality of life in long-term survivors of intensive care.
Buckley, T A; Cheng, A Y; Gomersall, C D
2001-05-01
Traditionally, outcome from intensive care has focused on mortality. The cost of intensive care and the limited resources devoted to patients who have a poor prognosis also raise questions about the utilisation of such resources. There is increasing pressure for outcome evaluation of intensive care to incorporate assessment of long-term survival and the quality of life of survivors. The principal objectives of this article were to examine current methods of assessing quality of life measures in critically ill patients surviving intensive care and to determine the quality of life of these survivors, based on a direct and computerised search of published research articles. Measurement of quality of life after intensive care is not common practice. There is a lack of consensus concerning appropriate measuring instruments to be used and how best to interpret results. Despite the availability of general outcome tools and disease specific instruments, there is a paucity of studies in the literature which include assessments of quality of life following intensive care unit (ICU) care. Generic health indices suggest that the quality of life in ICU survivors is acceptable, though in certain sub-groups, e.g. adult respiratory distress syndrome and sepsis, quality of life may be moderately impaired. ICU survivors appear to suffer less disability than chronic physical disease patients. Assessment of outcome after intensive care should include health related quality of life measurements. A unifying framework is required to enhance communication between clinicians, administrators and investigators of quality of life research, and also to enable more rational and effective decision making at the bedside. Patients who survive intensive care appear to enjoy a reasonable standard of quality of life. While their health status may not be as good, subjectively patients find this acceptable.
Water Footprint Assessment in the Agro-industry: A Case Study of Soy Sauce Production
NASA Astrophysics Data System (ADS)
Firda, Alfiana Aulia; Purwanto
2018-02-01
In terms of global water scarcity, the water footprint is an indicator of the use of water resources that gives knowledge about the environmental impact of consuming a product. The sustainable use of water resources today brings challenges related to the production and consumption of water-intensive goods, such as those of the agro-industry. The objective of the study was to assess the total water footprint of soy sauce production in Grobogan Regency. The total water footprint is equal to the sum of the supply chain water footprint and the operational water footprint. The assessment is based on the production chain diagram of soy sauce production, which presents the relevant process stages from the source to the final product. The result of this research is that the total water footprint of soy sauce production is 1,986.35 L/kg, with fractions of 78.43% green water, 21.4% blue water and 0.17% grey water.
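As a quick consistency check, the component volumes implied by the reported total and fractions can be recomputed directly; only the figures quoted in the abstract are used, and the snippet below is an illustrative back-calculation rather than part of the study's method.

    # Split the reported total water footprint of soy sauce (1,986.35 L/kg)
    # into green, blue and grey components using the quoted fractions.
    total_wf_l_per_kg = 1986.35
    fractions = {"green": 0.7843, "blue": 0.214, "grey": 0.0017}

    for component, share in fractions.items():
        print(f"{component} water footprint: {total_wf_l_per_kg * share:,.1f} L/kg")
    print(f"fractions sum to {sum(fractions.values()):.4f}")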
Real-time data-intensive computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parkinson, Dilworth Y., E-mail: dyparkinson@lbl.gov; Chen, Xian; Hexemer, Alexander
2016-07-27
Today users visit synchrotrons as sources of understanding and discovery, not as sources of just light, and not as sources of data. To achieve this, the synchrotron facilities frequently provide not just light but often the entire end station and, increasingly, advanced computational facilities that can reduce terabytes of data into a form that can reveal a new key insight. The Advanced Light Source (ALS) has partnered with high performance computing, fast networking, and applied mathematics groups to create a “super-facility”, giving users simultaneous access to the experimental, computational, and algorithmic resources to make this possible. This combination forms an efficient closed loop, where data, despite its high rate and volume, is transferred and processed immediately and automatically on appropriate computing resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beamtime. We will describe our work at the ALS ptychography, scattering, micro-diffraction, and micro-tomography beamlines.
Paleolithic human exploitation of plant foods during the last glacial maximum in North China
Liu, Li; Bestel, Sheahan; Shi, Jinming; Song, Yanhua; Chen, Xingcan
2013-01-01
Three grinding stones from Shizitan Locality 14 (ca. 23,000–19,500 calendar years before present) in the middle Yellow River region were subjected to usewear and residue analyses to investigate human adaptation during the last glacial maximum (LGM) period, when resources were generally scarce and plant foods may have become increasingly important in the human diet. The results show that these tools were used to process various plants, including Triticeae and Paniceae grasses, Vigna beans, Dioscorea opposita yam, and Trichosanthes kirilowii snakegourd roots. Tubers were important food resources for Paleolithic hunter–gatherers, and Paniceae grasses were exploited about 12,000 y before their domestication. The long tradition of intensive exploitation of certain types of flora helped Paleolithic people understand the properties of these plants, including their medicinal uses, and eventually led to the plants' domestication. This study sheds light on the deep history of the broad spectrum subsistence strategy characteristic of late Pleistocene north China before the origins of agriculture in this region. PMID:23509257
NASA Astrophysics Data System (ADS)
Beccali, Marco; Cellura, Maurizio; Iudicello, Maria; Mistretta, Marina
2009-04-01
Food production and consumption cause significant environmental burdens during the product life cycles. As a result of intensive development and the changing social attitudes and behaviors in the last century, the agrofood sector is the highest resource consumer after housing in the EU. This paper is part of an effort to estimate environmental impacts associated with life cycles of the agrofood chain, such as primary energy consumption, water exploitation, and global warming. Life cycle assessment is used to investigate the production of the following citrus-based products in Italy: essential oil, natural juice, and concentrated juice from oranges and lemons. The related process flowcharts, the relevant mass and energy flows, and the key environmental issues are identified for each product. This paper represents one of the first studies on the environmental impacts from cradle to gate for citrus products in order to suggest feasible strategies and actions to improve their environmental performance.
Caraballo, Manuel A; Macías, Francisco; Nieto, José Miguel; Ayora, Carlos
2016-01-01
Water resources management and restoration strategies, and consequently ecological and human life quality, are highly influenced by the presence of short and long term cycles affecting the intensity of a targeted pollution. In this respect, a typical acid mine drainage (AMD) groundwater from a sulfide mining district with a dry Mediterranean climate (Iberian Pyrite Belt, SW Spain) was studied to unravel the effect of long term weather changes on water flow rate and metal pollutant concentrations. Three well differentiated polluting stages were observed, and the specific geochemical, mineralogical and hydrological processes involved (pyrite and enclosing rock dissolution, evaporitic salt precipitation-redissolution and long term pluviometric fluctuations) are discussed. Evidencing the importance of including a longer background monitoring stage in AMD management and restoration strategies, the present study strongly advises a minimum 5-year period of continuous AMD monitoring prior to the design of any AMD remediation system in regions with a dry Mediterranean climate. Copyright © 2015 Elsevier B.V. All rights reserved.
Mid-Shelf Hardground Fish Habitats off the Georgia Coast
NASA Astrophysics Data System (ADS)
Platt, M.; Sautter, L.
2016-02-01
Multibeam sonar data were collected off the Georgia coast aboard the R/V Savannah by the College of Charleston BEAMS Program in May 2015. Kongsberg EM2040C data were post-processed in CARIS HIPS and SIPS 9.0 to create bathymetric maps overlain with backscatter intensity. The mid-shelf focus sites lie at depths between 25 and 40 m, and include the northern edge of Gray's Reef National Marine Sanctuary. The study sites are known areas of abundant fish congregations, identified by the South Carolina Department of Natural Resources' Marine Resources Monitoring, Assessment, & Prediction (MARMAP) program. The regional mid-shelf seafloor morphology consists of sand ridges, rock outcrops, and incised meandering channels 1 to 3 m deep. Backscatter analysis was used to identify hardground structures that might provide habitat for a high diversity of vertebrates and invertebrates. Multiple hardground structures were found and characterized at these locations and will be targeted for further research and possible inclusion in the Georgia and South Carolina continental shelf Marine Protected Areas.
Remien, Robert H; Mellins, Claude A.; Robbins, Reuben N.; Kelsey, Ryan; Rowe, Jessica; Warne, Patricia; Chowdhury, Jenifar; Lalkhen, Nuruneesa; Hoppe, Lara; Abrams, Elaine J.; El-Bassel, Nabila; Witte, Susan; Stein, Dan J.
2013-01-01
Effective medical treatment for HIV/AIDS requires patients’ optimal adherence to antiretroviral therapy (ART). In resource-constrained settings, lack of adequate standardized counseling for patients on ART remains a significant barrier to adherence. Masivukeni (“Let’s Wake Up” in Xhosa) is an innovative multimedia-based intervention designed to help people living with HIV in resource-limited settings achieve and maintain high levels of ART adherence. Adapted from a couples-based intervention tested in the United States (US), Masivukeni was developed through community-based participatory research with US and South African partners and informed by Ewart’s Social Action Theory. Innovative computer-based multimedia strategies were used to translate a labor- and training-intensive intervention into one that could be readily and widely used with low-literacy patients by lay counselors with relatively little training. In this paper, we describe the foundations of this new intervention, the process of its development, and the evidence of its high acceptability and feasibility. PMID:23468079
Stampfli, Andreas; Bloor, Juliette M G; Fischer, Markus; Zeiter, Michaela
2018-05-01
Climate change projections anticipate increased frequency and intensity of drought stress, but grassland responses to severe droughts and their potential to recover are poorly understood. In many grasslands, high land-use intensity has enhanced productivity and promoted resource-acquisitive species at the expense of resource-conservative ones. Such changes in plant functional composition could affect the resistance to drought and the recovery after drought of grassland ecosystems, with consequences for the resilience of feed production and for environmental stewardship. In a 12-site precipitation exclusion experiment in upland grassland ecosystems across Switzerland, we imposed severe edaphic drought in plots under rainout shelters and compared them with plots under ambient conditions. We used soil water potentials to scale drought stress across sites. Impacts of precipitation exclusion and drought legacy effects were examined along a gradient of land-use intensity to determine how grasslands resisted, and recovered after, drought. In the year of precipitation exclusion, aboveground net primary productivity (ANPP) in plots under rainout shelters was 15% to 56% lower than in control plots. Drought effects on ANPP increased with drought severity, specified as the duration of topsoil water potential ψ < -100 kPa, irrespective of land-use intensity. In the year after drought, ANPP had completely recovered, but total species diversity had declined by 10%. Perennial species showed elevated mortality, but species richness of annuals showed a small increase due to enhanced recruitment. In general, the more resource-acquisitive grasses increased at the expense of the deeper-rooted forbs after drought, suggesting that community reorganization was driven by competition rather than plant mortality. The negative effects of precipitation exclusion on forbs increased with land-use intensity. Our study suggests a synergistic impact of land-use intensification and climate change on grassland vegetation composition, and implies that biomass recovery after drought may occur at the expense of biodiversity maintenance. © 2018 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
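The severity metric used above, the duration for which the topsoil water potential stays below -100 kPa, is straightforward to compute from a daily time series; the series in the sketch below is invented for illustration and is not the experiment's data.

    # Drought severity as the number of days with topsoil water potential below -100 kPa.
    # The daily series is a made-up example.
    daily_psi_kpa = [-20, -45, -80, -110, -150, -220, -180, -95, -60, -130, -140, -70]
    THRESHOLD_KPA = -100

    severity_days = sum(1 for psi in daily_psi_kpa if psi < THRESHOLD_KPA)
    print(f"drought severity: {severity_days} days with psi < {THRESHOLD_KPA} kPa")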
Liu, Wenfeng; Yang, Hong; Liu, Yu; Kummu, Matti; Hoekstra, Arjen Y; Liu, Junguo; Schulin, Rainer
2018-08-15
Global food trade entails virtual flows of agricultural resources and pollution across countries. Here we performed a global-scale assessment of impacts of international food trade on blue water use, total water use, and nitrogen (N) inputs and on N losses in maize, rice, and wheat production. We simulated baseline conditions for the year 2000 and explored the impacts of an agricultural intensification scenario, in which low-input countries increase N and irrigation inputs to a greater extent than high-input countries. We combined a crop model with the Global Trade Analysis Project model. Results show that food exports generally occurred from regions with lower water and N use intensities, defined here as water and N uses in relation to crop yields, to regions with higher resource use intensities. Globally, food trade thus conserved a large amount of water resources and N applications, and also substantially reduced N losses. The trade-related conservation in blue water use reached 85 km³ y⁻¹, accounting for more than half of total blue water use for producing the three crops. Food exported from the USA contributed the largest proportion of global water and N conservation as well as N loss reduction, but also led to substantial export-associated N losses in the country itself. Under the intensification scenario, the converging water and N use intensities across countries result in a more balanced world; crop trade will generally decrease, and global water resources conservation and N pollution reduction associated with the trade will reduce accordingly. The study provides useful information to understand the implications of agricultural intensification for international crop trade, crop water use and N pollution patterns in the world. Copyright © 2018 Elsevier B.V. All rights reserved.
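The bookkeeping behind "use intensity" and trade-related conservation can be sketched as follows; the numbers are hypothetical and the single-step saving formula is a simplification of the coupled crop-trade modelling described above, not the authors' accounting.

    # Use intensity = resource use per unit of crop; the saving from importing a
    # tonne of crop is roughly the traded volume times the intensity difference.
    # All numbers are hypothetical.

    def use_intensity(resource_use, crop_production):
        """Resource use (e.g. m3 of blue water or kg N) per tonne of crop."""
        return resource_use / crop_production

    exporter = use_intensity(resource_use=1.2e9, crop_production=3.0e6)   # m3 per tonne
    importer = use_intensity(resource_use=2.1e9, crop_production=3.0e6)   # m3 per tonne
    traded_tonnes = 5.0e5

    saving_m3 = traded_tonnes * (importer - exporter)
    print(f"blue water conserved by trade: {saving_m3:,.0f} m3")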
The Best of Both Worlds: Exploring Cross-Collaborative Community Engagement
ERIC Educational Resources Information Center
Hunt, Kathleen P.; Krakow, Melinda M.
2015-01-01
Lauded as a rewarding pedagogical approach, community engagement can be time-consuming, resource-intensive, and difficult for instructors to manage for effective student learning outcomes. Collaborative teaching can allow instructors working in the same classroom to draw from each other's expertise and share resources. In this essay, we propose a…
Martin, Lynn; Fries, Brant E; Hirdes, John P; James, Mary
2011-06-01
Since 1991, the Minimum Data Set 2.0 (MDS 2.0) has been the mandated assessment in US nursing homes. The Resource Utilization Groups III (RUG-III) case-mix system provides a person-specific means of allocating resources based on the variable costs of caring for persons with different needs. Retrospective analyses of data collected on a sample of 9707 nursing home residents (2.4% had an intellectual disability) were used to examine the fit of the RUG-III case-mix system for determining the cost of supporting persons with an intellectual disability. The RUG-III system explained 33.3% of the variance in age-weighted nursing time among persons with an intellectual disability compared to 29.6% among other residents, making it a good fit for persons with an intellectual disability in nursing homes. The RUG-III may also serve as the basis for the development of a classification system that describes the resource intensity of persons with an intellectual disability in other settings that provide similar types of support.
NASA Astrophysics Data System (ADS)
Masoud, Alaa A.; El-Horiny, Mohamed M.; Atwia, Mohamed G.; Gemail, Khaled S.; Koike, Katsuaki
2018-06-01
Salinization of groundwater and soil resources has long been a serious environmental hazard in arid regions. This study was conducted to investigate and document the factors controlling such salinization, and their inter-relationships, in the Dakhla Oasis (Egypt). To accomplish this, 60 groundwater samples and 31 soil samples were collected in February 2014. Factor analysis (FA) and hierarchical cluster analysis (HCA) were integrated with geostatistical analyses to characterize the chemical properties of groundwater and soil and their spatial patterns, identify the factors controlling the pattern variability, and clarify the salinization mechanism. Groundwater quality standards revealed emerging salinization (av. 885.8 mg/L) and extreme occurrences of Fe2+ (av. 17.22 mg/L) and Mn2+ (av. 2.38 mg/L). Soils were highly salt-affected (av. 15.2 dS/m) and slightly alkaline (av. pH = 7.7). Evaporation and ion-exchange processes governed the evolution of the two main water types: Na-Cl (52%) and Ca-Mg-Cl (47%), respectively. Salinization drives the chemical variability of both resources. Distinctive patterns of slight salinization marked the northern part, while intense salinization marked the middle and southern parts. Congruence between the resource clusters confirmed common geology, soil types, and urban and agricultural practices. Minimizing the environmental and socioeconomic impacts of salinization of these resources urges a better understanding of the hydrochemical characteristics and prediction of quality changes.
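A minimal sketch of the FA/HCA workflow, using scikit-learn and SciPy, is given below. The data are random placeholders with roughly the survey's shape (60 samples, a handful of chemical variables); this is not the Dakhla Oasis data set, and the choices of three factors and three clusters are arbitrary assumptions.

    # Standardize hydrochemical variables, extract factors, then group samples
    # with Ward hierarchical clustering. Random placeholder data, not the survey data.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import FactorAnalysis
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    samples = rng.normal(size=(60, 8))     # 60 groundwater samples x 8 chemical variables

    scaled = StandardScaler().fit_transform(samples)
    scores = FactorAnalysis(n_components=3).fit_transform(scaled)
    clusters = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")
    print("cluster sizes:", np.bincount(clusters)[1:])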
NASA Technical Reports Server (NTRS)
Botts, Michael E.; Phillips, Ron J.; Parker, John V.; Wright, Patrick D.
1992-01-01
Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. A SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.
Autoplan: A self-processing network model for an extended blocks world planning environment
NASA Technical Reports Server (NTRS)
Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank
1990-01-01
Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high level and low level information processing. However they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.
NASA Astrophysics Data System (ADS)
Early, A. B.; Chen, G.; Beach, A. L., III; Northup, E. A.
2016-12-01
NASA has conducted airborne tropospheric chemistry studies for over three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gases and aerosol properties. The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center in Hampton, Virginia originally developed the Toolsets for Airborne Data (TAD) web application in September 2013 to meet the user community's needs for manipulating aircraft data for scientific research on climate change and air quality relevant issues. The analysis of airborne data typically requires data subsetting, which can be challenging and resource intensive for end users. In an effort to streamline this process, the TAD toolset enhancements will include new data subsetting features and updates to the current database model. These will include two subsetters: temporal and spatial, and vertical profile. The temporal and spatial subsetter will allow users to focus on data from a specific location and/or time period. The vertical profile subsetter will retrieve data collected during an individual aircraft ascent or descent spiral. This effort will allow for the automation of the typically labor-intensive manual data subsetting process, which will provide users with data tailored to their specific research interests. The development of these enhancements will be discussed in this presentation.
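The two subsetting operations described, temporal/spatial and vertical profile, can be illustrated with pandas on a toy flight track. This is not TAD code, and the column names (time, lat, lon, altitude_m) and all values are assumptions for the example.

    # Toy illustration of temporal/spatial subsetting and a vertical-profile (ascent) subset.
    import pandas as pd

    flight = pd.DataFrame({
        "time": pd.date_range("2016-07-01 14:00", periods=6, freq="10min"),
        "lat": [36.9, 37.1, 37.3, 37.5, 37.7, 37.9],
        "lon": [-76.4, -76.2, -76.0, -75.8, -75.6, -75.4],
        "altitude_m": [500, 1500, 3200, 5200, 5100, 5000],
    })

    # Temporal and spatial subset: a time window plus a latitude/longitude box.
    box = flight[(flight.time >= "2016-07-01 14:10") & (flight.time <= "2016-07-01 14:40")
                 & flight.lat.between(37.0, 37.6) & flight.lon.between(-76.3, -75.7)]

    # Vertical-profile subset: keep the ascending portion of the flight.
    ascent = flight[flight.altitude_m.diff().fillna(1) > 0]
    print(len(box), "rows in the space-time box;", len(ascent), "rows in the ascent")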
Decreased Pain Perception by Unconscious Emotional Pictures
Peláez, Irene; Martínez-Iñigo, David; Barjola, Paloma; Cardoso, Susana; Mercado, Francisco
2016-01-01
Pain perception arises from a complex interaction between a nociceptive stimulus and different emotional and cognitive factors, which appear to be mediated by both automatic and controlled systems. Previous evidence has shown that whereas conscious processing of unpleasant stimuli enhances pain perception, emotional influences on pain under unaware conditions are much less understood. The aim of the present study was to investigate the modulation of pain perception by unconscious emotional pictures through an emotional masking paradigm. Two kinds of both somatosensory (painful and non-painful) and emotional stimulation (negative and neutral pictures) were employed. Fifty pain-free participants were asked to rate, as fast as they could, the perception of pain they were feeling in response to laser-induced somatosensory stimuli. Pain intensity ratings and reaction times were measured. Statistical analyses revealed a significant effect for the interaction between pain and emotional stimulation, but surprisingly this relationship was opposite to what was expected. In particular, lower pain intensity scores and longer reaction times were found in response to negative images, and this effect was strengthened for painful stimulation. The present findings suggest a clear modulation of pain perception by unconscious emotional contexts. Attentional capture mechanisms triggered by unaware negative stimulation could explain this phenomenon, leading to a withdrawal of processing resources from pain. PMID:27818642
Tang, Long; Wolf, Amelia A; Gao, Yang; Wang, Cheng Huan
2018-06-01
In an attempt to clarify the role of environmental and biotic interactions on plant growth, there has been a long-running ecological debate over whether the intensity and importance of competition stabilizes, increases or decreases across environmental gradients. We conducted an experiment in a Chinese estuary to investigate the effects of a non-resource stress gradient, soil salinity (from 1.4‰ to 19.0‰ salinity), on the competitive interactions between native Phragmites australis and invasive Spartina alterniflora. We linked these effects to measurements of photosynthetic activity to further elucidate the underlying physiological mechanism behind the competitive interactions and the driver of invasion. The experiments revealed that while the biomass of both species decreased in the presence of the other, competition did not alter the photosynthetic activity of either species over time. P. australis exhibited high photosynthetic activity, including low chlorophyllase activity, high chlorophyll content, high stomatal conductance and high net photosynthetic rate, at low salinity. Under these conditions, P. australis experienced low competitive intensity, leading to high biomass production and competitive exclusion of S. alterniflora. The opposite was observed for S. alterniflora: while the competitive intensity experienced by P. australis increased with increasing salinity, and its photosynthetic activity, biomass, competitive dominance and the importance of competition for its growth decreased, those of S. alterniflora were stable. These findings demonstrate that S. alterniflora invasion driven by competitive exclusion is likely to occur and expand in high-salinity zones. The change in the nature of competition along a non-resource stress gradient differs between competitors, likely due to differences in photosynthetic tolerance to salinity. The driver of growth of the less-tolerant species changes from competition to non-resource stress factors with increasing stress levels, whereas competition is constantly important for the growth of the more-tolerant species. Incorporating metrics of both competition intensity and importance, as well as linking these competitive outcomes with physiological mechanisms, is crucial to understanding, predicting, and mediating the effects of invasive species in the future. © 2018 by the Ecological Society of America.
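One common way to separate the two quantities discussed above is a relative interaction index for intensity and a neighbour-importance index that weighs the neighbour effect against the environmental effect. The formulation below is a generic one from the competition literature, not necessarily the metric used in this study, and all biomass values are hypothetical.

    # Intensity: relative interaction index, (B_with - B_without) / (B_with + B_without).
    # Importance: neighbour effect relative to neighbour plus environmental effects.
    # Biomass values are hypothetical.

    def interaction_intensity(biomass_with, biomass_without):
        return (biomass_with - biomass_without) / (biomass_with + biomass_without)

    def interaction_importance(biomass_with, biomass_without, best_biomass_without):
        neighbour_effect = biomass_with - biomass_without
        environment_effect = biomass_without - best_biomass_without
        return neighbour_effect / (abs(neighbour_effect) + abs(environment_effect))

    # Hypothetical P. australis biomass (g) with/without S. alterniflora at two salinities.
    print(interaction_intensity(60.0, 90.0), interaction_importance(60.0, 90.0, 90.0))  # low salinity
    print(interaction_intensity(10.0, 30.0), interaction_importance(10.0, 30.0, 90.0))  # high salinity

With these invented numbers, competition on P. australis is more intense but less important at high salinity, mirroring the qualitative pattern reported above.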
NASA Astrophysics Data System (ADS)
Han, Bangshuai; Benner, Shawn G.; Bolte, John P.; Vache, Kellie B.; Flores, Alejandro N.
2017-07-01
Humans have significantly altered the redistribution of water in intensively managed hydrologic systems, shifting the spatiotemporal patterns of surface water. Evaluating water availability requires integration of hydrologic processes and associated human influences. In this study, we summarize the development and evaluation of an extensible hydrologic model that explicitly integrates water rights to spatially distribute irrigation waters in a semi-arid agricultural region in the western US, using the Envision integrated modeling platform. The model captures both human and biophysical systems, particularly the diversion of water from the Boise River, which is the main water source that supports irrigated agriculture in this region. In agricultural areas, water demand is estimated as a function of crop type and local environmental conditions. Surface water to meet crop demand is diverted from the stream reaches, constrained by the amount of water available in the stream, the water-rights-appropriated amount, and the priority dates associated with particular places of use. Results, measured by flow rates at gaged stream and canal locations within the study area, suggest that the impacts of irrigation activities on the magnitude and timing of flows through this intensively managed system are well captured. The multi-year averaged diverted water from the Boise River matches observations well, reflecting the appropriation of water according to the water rights database. Because of the spatially explicit implementation of surface water diversion, the model can help diagnose places and times where water resources are likely insufficient to meet agricultural water demands, and inform future water management decisions.
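A simplified sketch of the priority-based diversion logic described above follows: requests are served in order of priority date and capped both by the appropriated amount and by the water remaining in the reach. It is not the Envision implementation, and the rights listed are invented for illustration, not Boise River records.

    # Prior-appropriation style allocation: the most senior right is served first,
    # each delivery capped by demand, the appropriated amount and remaining flow.
    water_rights = [
        {"holder": "Canal A", "priority_year": 1890, "appropriated_cms": 3.0, "demand_cms": 2.5},
        {"holder": "Canal B", "priority_year": 1905, "appropriated_cms": 2.0, "demand_cms": 2.0},
        {"holder": "Canal C", "priority_year": 1968, "appropriated_cms": 1.5, "demand_cms": 1.5},
    ]

    def allocate(streamflow_cms, rights):
        remaining = streamflow_cms
        for right in sorted(rights, key=lambda r: r["priority_year"]):
            delivered = min(right["demand_cms"], right["appropriated_cms"], remaining)
            remaining -= delivered
            print(f"{right['holder']} (priority {right['priority_year']}): {delivered:.2f} cms")
        print(f"flow left in the reach: {remaining:.2f} cms")

    allocate(streamflow_cms=4.0, rights=water_rights)

Under low flow the junior right receives nothing, which is the mechanism by which the model can diagnose places and times where supply falls short of agricultural demand.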
NASA Astrophysics Data System (ADS)
Abashev, V. M.; Korabelnikov, A. V.; Kuranov, A. L.; Tretyakov, P. K.
2017-10-01
In analyzing the work process in a ramjet, it is reasonable to consider comprehensively the ensemble of problems whose solution determines the engine efficiency. The main problems are ensuring a high completeness of fuel combustion with minimal hydraulic losses, the reliability of cooling of high-heat areas using the cooling resource of the fuel, and ensuring the strength of the engine duct elements under the non-uniform heat loads that arise from fuel combustion in complex gas-dynamic flow structures. The fundamental techniques and approaches to the solution of the above-noted problems are considered in the present report, and their novelty and advantages over conventional techniques are substantiated. In particular, a technique is proposed for establishing an intense (pre-detonation) combustion regime that ensures a high completeness of fuel combustion and minimal hydraulic losses at a smooth deceleration of a supersonic flow down to the sound velocity, using pulsed-periodic gas-dynamic flow control. A technique is also proposed for cooling the high-heat areas that employs the cooling resource of the hydrocarbon fuel, including the chemical transformation (conversion) of kerosene using nano-catalysts. An analysis has shown that the highly heated structure will operate in the elastic-plastic domain of the behavior of the constructional materials, which is directly connected to the engine's operating life. Problems therefore arise concerning the deformation-dependent life of the ramjet shells. The deformations also significantly influence the work process in the combustor and, naturally, the heat transfer process and the performance of the catalysts (through the action of plastic and elastic deformations of the restrained shells). The report presents some results illustrating the identified problems and concludes that a comprehensive investigation is necessary, combining model experiments with computational and theoretical studies.
Simulation analysis of resource flexibility on healthcare processes
Simwita, Yusta W; Helgheim, Berit I
2016-01-01
Purpose: This paper uses discrete event simulation to explore the best resource flexibility scenario and examine the effect of implementing resource flexibility on different stages of the patient treatment process. Specifically, we investigate the effect of resource flexibility on patient waiting time and throughput in an orthopedic care process. We further seek to explore how implementation of resource flexibility on patient treatment processes affects patient access to healthcare services. We focus on two resources, namely, the orthopedic surgeon and the operating room. Methods: The observational approach was used to collect process data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. We developed different scenarios to identify the best resource flexibility scenario and explore the effect of resource flexibility on patient waiting time, throughput, and future changes in demand. The developed scenarios focused on creating flexibility in the service capacity of this care process by altering the amount of additional human resource capacity at different stages of the patient care process and extending the use of operating room capacity. Results: The study found that resource flexibility can improve responsiveness to patient demand in the treatment process. Testing different scenarios showed that the introduction of resource flexibility reduces patient waiting time and improves throughput. The simulation results show that patient access to health services can be improved by implementing resource flexibility at different stages of the patient treatment process. Conclusion: This study contributes to the current health care literature by explaining how implementing resource flexibility at different stages of patient care processes can improve the ability to respond to increasing patient demands. This study was limited to a single patient process; studies focusing on additional processes are recommended. PMID:27785046
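For readers unfamiliar with discrete event simulation, the sketch below uses the SimPy library to mimic the idea of the study: patients queue for a surgeon and then an operating room, and adding capacity (one stylized form of resource flexibility) changes the mean wait. Service and arrival rates, capacities and the single-queue structure are assumptions for illustration, not parameters of the studied process.

    # Toy discrete-event model: patients pass through consultation (surgeon) and
    # surgery (operating room); wait is measured from arrival to start of surgery.
    # All rates and capacities are invented for illustration.
    import random
    import simpy

    def patient(env, surgeon, theatre, waits):
        arrival = env.now
        with surgeon.request() as req:               # consultation
            yield req
            yield env.timeout(random.expovariate(1 / 20))
        with theatre.request() as req:               # surgery
            yield req
            waits.append(env.now - arrival)
            yield env.timeout(random.expovariate(1 / 60))

    def run(n_surgeons, n_theatres, n_patients=200):
        random.seed(1)
        env = simpy.Environment()
        surgeon = simpy.Resource(env, capacity=n_surgeons)
        theatre = simpy.Resource(env, capacity=n_theatres)
        waits = []
        def generator(env):
            for _ in range(n_patients):
                env.process(patient(env, surgeon, theatre, waits))
                yield env.timeout(random.expovariate(1 / 30))   # arrivals roughly every 30 min
        env.process(generator(env))
        env.run()
        return sum(waits) / len(waits)

    print("baseline mean wait :", round(run(1, 1)), "min")
    print("flexible capacity  :", round(run(2, 2)), "min")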
World Federation of Pediatric Intensive Care and Critical Care Societies: Global Sepsis Initiative.
Kissoon, Niranjan; Carcillo, Joseph A; Espinosa, Victor; Argent, Andrew; Devictor, Denis; Madden, Maureen; Singhi, Sunit; van der Voort, Edwin; Latour, Jos
2011-09-01
According to World Health Organization estimates, sepsis accounts for 60%-80% of lost lives per year in childhood. Measures appropriate for resource-scarce and resource-abundant settings alike can reduce sepsis deaths. In this regard, the World Federation of Pediatric Intensive Care and Critical Care Societies Board of Directors announces the Global Pediatric Sepsis Initiative, a quality improvement program designed to improve the quality of care for children with sepsis. The objectives are to announce the global sepsis initiative, to justify some of the bundles that are included, and to show some preliminary data and encourage participation. The Global Pediatric Sepsis Initiative is developed as a Web-based education, demonstration, and pyramid bundles/checklist tool (http://www.pediatricsepsis.org or http://www.wfpiccs.org). Four health resource categories are included. Category A involves a nonindustrialized setting with an under-5-years mortality rate above 30 per 1,000 children. Category B involves a nonindustrialized setting with an under-5-years mortality rate below 30 per 1,000 children. Category C involves a developing industrialized nation, and category D a developed industrialized nation. Separate accompanying administrative and clinical parameter bundles or checklist quality improvement recommendations are provided for each category, requiring greater resources and tasks as resource allocation increases from group A to D. In the vanguard phase, data for 361 children (category A, n = 34; category B, n = 12; category C, n = 84; category D, n = 231) were successfully entered, and quality-assurance reports were sent to the 23 participating international centers. Analysis of bundles for categories C and D showed that reduction in mortality was associated with compliance with the resuscitation (odds ratio, 0.369; 95% confidence interval, 0.188-0.724; p < .0004) and intensive care unit management (odds ratio, 0.277; 95% confidence interval, 0.096-0.80) bundles. The World Federation of Pediatric Intensive Care and Critical Care Societies Global Pediatric Sepsis Initiative is online. Success in reducing pediatric mortality and morbidity, evaluated yearly as a measure of global child health care quality improvement, requires ongoing active recruitment of international participating centers. Please join us at http://www.pediatricsepsis.org or http://www.wfpiccs.org.
Intensive culture of black cherry
L.R. Auchmoody
1973-01-01
The recently-released Timber Resources Review forecasts increasing demands for wood and wood products through the end of this century. If these demands are to be met, particularly in view of a shrinking forest-land base, then widespread use of intensive culture must ultimately be adopted. Two cultural techniques being looked at more and more closely as ways of...
ERIC Educational Resources Information Center
Ennis, Robin Parks; Lane, Kathleen Lynne; Oakes, Wendy Peia
2018-01-01
Self-monitoring is a low-intensity strategy teachers can use to support instruction in classrooms across the grade span in various instructional settings and content areas. This study extended the knowledge base by examining the effectiveness of self-monitoring through a systematic replication with three students with specific learning…
ERIC Educational Resources Information Center
Miller, Roxanne Greitz; Hurlock, Ashley J.
2017-01-01
Non-research-intensive institutions of higher education are effective at narrowing STEM gender gaps in major selection and persistence to degree completion, yet the decision to attend such a setting is likely seen as counterintuitive when such institutions typically have lower levels of research, financial resources, and total student enrollments…
Sooyoung Kim; Robert J. McGaughey; Hans-Erik Andersen; Gerard Schreuder
2009-01-01
Tree species identification is important for a variety of natural resource management and monitoring activities including riparian buffer characterization, wildfire risk assessment, biodiversity monitoring, and wildlife habitat assessment. Intensity data recorded for each laser point in a LIDAR system is related to the spectral reflectance of the target material and...
RESOURCE MANAGEMENT AMONG INTENSIVE CARE NURSES: AN ETHNOGRAPHIC STUDY
Heydari, Abbas; Najar, Ali Vafaee; Bakhshi, Mahmoud
2015-01-01
Background: Nurses are the main users of the supplies and equipment applied in Intensive Care Units (ICUs), which are high-priced and costly. Therefore, understanding ICU nurses' experiences of resource management contributes to better control of costs. Objectives: This study aimed to investigate the culture of nurses' working environment regarding resource management in ICUs in Iran. Patients and Methods: In this study, a focused ethnographic method was used. Twenty-eight informants, including ICU nurses and other professionals, were purposively selected and interviewed. In addition, 400 hours of participant observation were used for data gathering. Data analysis was performed using the methods described by Miles and Huberman (1994). Results: Two main themes described the culture of ICU nurses regarding resource management: (a) consumption monitoring and auditing, and (b) prudent use. The results revealed that resource management efforts are carried out under conditions of scarcity and uncertainty of supply. ICU nurses took a forward-looking approach to the supply and use of resources in the unit and planned by taking into account the rules and guidelines as well as the available resources and their value. Improper storage of some supplies and equipment was a reaction among nurses to this uncertainty. Conclusions: To manage resources effectively, improvement of supply chain management in the hospital seems essential. It is also necessary to hold educational classes to enhance nurses' awareness of effective supply chain management and of the storage of items in the unit stock. PMID:26889097
Integration of a neuroimaging processing pipeline into a pan-canadian computing grid
NASA Astrophysics Data System (ADS)
Lavoie-Courchesne, S.; Rioux, P.; Chouinard-Decorte, F.; Sherif, T.; Rousseau, M.-E.; Das, S.; Adalat, R.; Doyon, J.; Craddock, C.; Margulies, D.; Chu, C.; Lyttelton, O.; Evans, A. C.; Bellec, P.
2012-02-01
The ethos of the neuroimaging field is quickly moving towards the open sharing of resources, including both imaging databases and processing tools. As a neuroimaging database represents a large volume of datasets and as neuroimaging processing pipelines are composed of heterogeneous, computationally intensive tools, such open sharing raises specific computational challenges. This motivates the design of novel dedicated computing infrastructures. This paper describes an interface between PSOM, a code-oriented pipeline development framework, and CBRAIN, a web-oriented platform for grid computing. This interface was used to integrate a PSOM-compliant pipeline for preprocessing of structural and functional magnetic resonance imaging into CBRAIN. We further tested the capacity of our infrastructure to handle a real large-scale project. A neuroimaging database including close to 1000 subjects was preprocessed using our interface and publicly released to help the participants of the ADHD-200 international competition. This successful experiment demonstrated that our integrated grid-computing platform is a powerful solution for high-throughput pipeline analysis in the field of neuroimaging.
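To illustrate the general idea of a code-oriented pipeline framework handing jobs to a remote execution platform, here is a minimal dependency-resolving sketch; the stage names, the stand-in submit function, and the graph structure are illustrative assumptions and do not represent the actual PSOM or CBRAIN APIs.

    from graphlib import TopologicalSorter   # Python 3.9+

    # Assumed preprocessing stages and their prerequisites (stage -> dependencies).
    pipeline = {
        "motion_correction": set(),
        "coregistration":    {"motion_correction"},
        "normalization":     {"coregistration"},
        "smoothing":         {"normalization"},
        "qc_report":         {"motion_correction", "smoothing"},
    }

    def submit_to_grid(stage, subject):
        """Stand-in for handing one job to the grid-computing platform."""
        print(f"submitting {stage} for {subject}")

    def run_subject(subject):
        # Execute stages in an order that respects the dependency graph.
        for stage in TopologicalSorter(pipeline).static_order():
            submit_to_grid(stage, subject)

    for subject in ["sub-001", "sub-002"]:
        run_subject(subject)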
SenSyF Experience on Integration of EO Services in a Generic, Cloud-Based EO Exploitation Platform
NASA Astrophysics Data System (ADS)
Almeida, Nuno; Catarino, Nuno; Gutierrez, Antonio; Grosso, Nuno; Andrade, Joao; Caumont, Herve; Goncalves, Pedro; Villa, Guillermo; Mangin, Antoine; Serra, Romain; Johnsen, Harald; Grydeland, Tom; Emsley, Stephen; Jauch, Eduardo; Moreno, Jose; Ruiz, Antonio
2016-08-01
SenSyF is a cloud-based data processing framework for EO-based services. It has been a pioneer in addressing Big Data issues from the Earth Observation point of view, and is a precursor of several of the technologies and methodologies that will be deployed in ESA's Thematic Exploitation Platforms and other related systems. The SenSyF system focuses on developing fully automated data management, together with access to a processing and exploitation framework, including Earth Observation specific tools. SenSyF is both a development and validation platform for data-intensive applications using Earth Observation data. With SenSyF, scientific, institutional, or commercial institutions developing EO-based applications and services can take advantage of distributed computational and storage resources tailored for applications dependent on big Earth Observation data, without having to make deep infrastructure and technology investments. This paper describes the integration process and the experience gathered from different EO service providers during the project.
May, Jon; Kavanagh, David J; Andrade, Jackie
2015-05-01
Ten years after the publication of Elaborated Intrusion (EI) Theory, there is now substantial research into its key predictions. The distinction between intrusive thoughts, which are driven by automatic processes, and their elaboration, involving controlled processing, is well established. Desires for both addictive substances and other desired targets are typically marked by imagery, especially when they are intense. Attention training strategies such as body scanning reduce intrusive thoughts, while concurrent tasks that introduce competing sensory information interfere with elaboration, especially if they compete for the same limited-capacity working memory resources. EI Theory has spawned new assessment instruments that are performing strongly and offer the ability to more clearly delineate craving from correlated processes. It has also inspired new approaches to treatment. In particular, training people to use vivid sensory imagery for functional goals holds promise as an intervention for substance misuse, since it is likely to both sustain motivation and moderate craving. Copyright © 2014 Elsevier Ltd. All rights reserved.
Nunes, J P
1998-01-01
Abortion is the interruption of a dynamic process in a final and irreversible form. The legalization of abortion is applied to human ontogenesis, that is, the development of the human being. However, the embryo that is growing in the uterus is not a human being, because a human being is a complex organism with differentiated systems, its own identity, and intrinsic autonomy in its process of development. There are basically four levels of analysis of the problem of abortion: 1) fundamental emotional arguments; 2) profound ignorance of technical and scientific facts; 3) rational positions obscured by the dramatic intensity of everyday situations; and 4) the conjunction of a deliberate position, in which culpability is avoided, with solidarity toward all subjects of the process and a socially oriented view. From an epidemiological point of view, the phenomenon of abortion points to the factors with which it is associated: poverty, illiteracy, shortage or lack of community health resources, absence of centers for adolescents, degradation of the environment, and precariousness of employment.
Shen, Jianbo; Li, Chunjian; Mi, Guohua; Li, Long; Yuan, Lixing; Jiang, Rongfeng; Zhang, Fusuo
2013-03-01
Root and rhizosphere research has been conducted for many decades, but the underlying strategy of root/rhizosphere processes and management in intensive cropping systems remains largely to be determined. Improved grain production to meet the food demand of an increasing population has been highly dependent on chemical fertilizer input based on the traditionally assumed notion of 'high input, high output', which results in overuse of fertilizers but ignores the biological potential of roots and the rhizosphere for efficient mobilization and acquisition of soil nutrients. Root exploration of soil nutrient resources and root-induced rhizosphere processes play an important role in controlling nutrient transformation, efficient nutrient acquisition and use, and thus crop productivity. The efficiency of the root/rhizosphere in terms of improved nutrient mobilization, acquisition, and use can be fully exploited by: (1) manipulating root growth (i.e. root development and size, root system architecture, and distribution); (2) regulating rhizosphere processes (i.e. rhizosphere acidification, organic anion and acid phosphatase exudation, localized application of nutrients, rhizosphere interactions, and use of efficient crop genotypes); and (3) optimizing root zone management to synchronize root growth and soil nutrient supply with the demand of nutrients in cropping systems. Experiments have shown that root/rhizosphere management is an effective approach to increasing both nutrient use efficiency and crop productivity for sustainable crop production. The objectives of this paper are to summarize the principles of root/rhizosphere management and to provide an overview of some successful case studies on how to exploit the biological potential of the root system and rhizosphere processes to improve crop productivity and nutrient use efficiency.
Flight Systems Integration and Test
NASA Technical Reports Server (NTRS)
Wright, Michael R.
2011-01-01
Topics to be Covered in this presentation are: (1) Integration and Test (I&T) Planning (2) Integration and Test Flows (3) Overview of Typical Mission I&T (4) Supporting Elements (5) Lessons-Learned and Helpful Hints (6) I&T Mishaps and Failures (7) The Lighter Side of I&T and (8) Small-Group Activity. This presentation highlights a typical NASA "in-house" I&T program (1) For flight systems that are developed by NASA at a space flight center (like GSFC) (2) Requirements well-defined: qualification/acceptance, documentation, configuration management. (3) Factors: precedents, human flight, risk-aversion ("failure-phobia"), taxpayer dollars, jobs and (4) Some differences among NASA centers, but generally a resource-intensive process
How Tenneco manages energy productivity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glorioso, J.
1982-08-01
Tenneco's energy-management investments are intended to improve energy productivity, and are reported in terms of avoided costs in a way that highlights the energy value of conservation projects. This accounting approach helps management see that the return on conservation projects has increased faster than the rate of inflation. Tenneco's pursuit of higher productivity extends to labor, capital, and materials as well as energy resources. Data collection is the first step, followed by a ranking of possible projects. Continuous monitoring and energy use figures from each plant track the trend of energy value over time. Specific projects at Tenneco's energy-intensive operations of refining, shipbuilding, and food processing illustrate the company's energy management program.
Nora: A Vocabulary Discovery Tool for Concept Extraction.
Divita, Guy; Carter, Marjorie E; Durgahee, B S Begum; Pettey, Warren E; Redd, Andrew; Samore, Matthew H; Gundlapalli, Adi V
2015-01-01
Coverage of terms in domain-specific terminologies and ontologies is often limited in controlled medical vocabularies. Creating and augmenting such terminologies is resource intensive. We developed Nora as an interactive tool to discover terminology from text corpora; the output can then be employed to refine and enhance natural language processing-based concept extraction tasks. Nora provides a visualization of chains of words foraged from word frequency indexes built over a text corpus. Domain experts direct and curate chains that contain relevant terms, which are further curated to identify lexical variants. A test of Nora expanded a domain lexicon for homelessness and related psychosocial factors by 38%, yielding an additional 10% of extracted concepts.
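As an illustration of the kind of frequency-indexed term foraging the abstract describes, here is a minimal sketch; the toy corpus, the chain-expansion heuristic, and the thresholds are assumptions made for illustration and do not reproduce Nora's actual algorithm.

    from collections import Counter
    from itertools import islice

    corpus = [
        "patient is homeless and living in a shelter",
        "veteran reports living in a homeless shelter downtown",
        "no stable housing, staying in an emergency shelter",
    ]

    def ngrams(tokens, n):
        return zip(*(islice(tokens, i, None) for i in range(n)))

    # Build word and bigram frequency indexes over the corpus.
    unigrams, bigrams = Counter(), Counter()
    for doc in corpus:
        tokens = doc.lower().split()
        unigrams.update(tokens)
        bigrams.update(ngrams(tokens, 2))

    def expand_chain(seed, min_count=1):
        """Grow a chain of words to the right of a seed term by repeatedly
        following the most frequent continuation; in Nora, domain experts
        would curate the resulting chains."""
        chain = [seed]
        while True:
            candidates = {b: c for b, c in bigrams.items()
                          if b[0] == chain[-1] and c >= min_count}
            if not candidates:
                return chain
            next_word = max(candidates, key=candidates.get)[1]
            if next_word in chain:          # avoid trivial cycles
                return chain
            chain.append(next_word)

    print(expand_chain("homeless"))   # prints one chain of words rooted at the seed term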
[Features of organization of nutrition for young athletes].
Korosteleva, M M; Nikitiuk, D B; Volkova, L Iu
2013-01-01
The organization of nutrition for young athletes implies a regimen that covers the distribution of meals throughout the day and the frequency of meals and nutrient intake, which must be strictly aligned with the training schedule. Athletes' requirements for energy and nutrients vary considerably depending on the sport discipline and the intensity of physical activity. The Institute of Nutrition of the Russian Academy of Medical Sciences has developed recommended average daily food sets based on the daily energy expenditure of young athletes and on the duration and intensity of physical activity in different sports; these sets provide young athletes with the necessary nutrients and micronutrients. In the precompetitive period, athletes undergo high physical loads and the diet should consist mainly of protein- and fat-containing foods with a high level of fiber. After three days of intense training, the athlete is advised to switch to a carbohydrate-rich diet combined with a significant reduction in training intensity (glycogen supercompensation). During the competition period, meals should be easily digestible and low in volume; they must contain proteins of high biological value and carbohydrates in the required quantities. Introducing new dishes and products into athletes' menus during this period is not advisable. During a marathon, the main aim is to restore energy, water, and mineral resources and to maintain normal blood glucose concentrations. This is achieved by taking carbohydrates with a relatively small amount of liquid, by using products with a high content of vitamins and minerals to maintain water-salt metabolism at an appropriate level, and by taking food in liquid form in small portions. In the recovery period, adequate nutrition should achieve the following objectives: restore the acid-base, fluid, and electrolyte balance; eliminate metabolic products (urea, lactic acid, ammonia, etc.) associated with high physical activity; restore carbohydrate stores; and support plastic metabolism and synthesis processes. The article also covers the basic sanitary and epidemiological requirements for catering departments, product selection, and sports doctors.
Jackson, George L.; Weinberger, Morris; Kirshner, Miriam A.; Stechuchak, Karen M.; Melnyk, Stephanie D.; Bosworth, Hayden B.; Coffman, Cynthia J.; Neelon, Brian; Van Houtven, Courtney; Gentry, Pamela W.; Morris, Isis J.; Rose, Cynthia M.; Taylor, Jennifer P.; May, Carrie L.; Han, Byungjoo; Wainwright, Christi; Alkon, Aviel; Powell, Lesa; Edelman, David
2016-01-01
Despite the availability of efficacious treatments, only half of patients with hypertension achieve adequate blood pressure (BP) control. This paper describes the protocol and baseline subject characteristics of a 2-arm, 18-month randomized clinical trial of titrated disease management (TDM) for patients with pharmaceutically-treated hypertension for whom systolic blood pressure (SBP) is not controlled (≥140 mmHg for non-diabetic or ≥130 mmHg for diabetic patients). The trial is being conducted among patients of four clinic locations associated with a Veterans Affairs Medical Center. An intervention arm has a TDM strategy in which patients' hypertension control at baseline, 6, and 12 months determines the resource intensity of disease management. Intensity levels include: a low-intensity strategy utilizing a licensed practical nurse to provide bi-monthly, non-tailored behavioral support calls to patients whose SBP comes under control; a medium-intensity strategy utilizing a registered nurse to provide monthly tailored behavioral support telephone calls plus home BP monitoring; and a high-intensity strategy utilizing a pharmacist to provide monthly tailored behavioral support telephone calls, home BP monitoring, and pharmacist-directed medication management. Control arm patients receive the low-intensity strategy regardless of BP control. The primary outcome is SBP. The 385 randomized veterans (192 intervention; 193 control) are predominantly older (mean age 63.5 years) men (92.5%); 61.8% are African American, and the mean baseline SBP for all subjects is 143.6 mmHg. This trial will determine if a disease management program that is titrated by matching the intensity of resources to patients' BP control leads to superior outcomes compared to a low-intensity management strategy. PMID:27417982
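The titration rule described in the protocol can be illustrated with a short sketch; the function names and the escalation/de-escalation order (low, medium, high) are assumptions made for illustration, not code or rules from the trial.

    LEVELS = ["low", "medium", "high"]   # LPN calls -> RN calls + home BP -> pharmacist-managed

    def sbp_controlled(sbp_mmhg, diabetic):
        """SBP control thresholds stated in the protocol."""
        target = 130 if diabetic else 140
        return sbp_mmhg < target

    def next_intensity(current, sbp_mmhg, diabetic, in_intervention_arm=True):
        """Return the disease-management intensity for the next period."""
        if not in_intervention_arm:
            return "low"                     # control arm stays low regardless of BP
        i = LEVELS.index(current)
        if sbp_controlled(sbp_mmhg, diabetic):
            return LEVELS[max(i - 1, 0)]     # assumed: step down when controlled
        return LEVELS[min(i + 1, len(LEVELS) - 1)]  # assumed: escalate while uncontrolled

    print(next_intensity("low", 152, diabetic=False))    # -> 'medium'
    print(next_intensity("medium", 128, diabetic=True))  # -> 'low'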
NASA Astrophysics Data System (ADS)
Gaprindashvili, G.; Tsereteli, E.; Gaprindashvili, M.
2013-12-01
In the last decades of the 20th century, protecting the population from geological hazards, preserving land, and ensuring the safe operation of engineering facilities became one of the most important social, economic, demographic, political, and environmental problems worldwide. Georgia, given the scale at which natural catastrophic processes (landslides, mudflows, rockfalls, erosion, etc.) originate, their recurrence, and the damage these processes inflict on the population, agricultural land, and engineering structures, is one of the most complex mountainous regions. These extremely sensitive conditions are driven by: 1. the activation of highly intense earthquakes; 2. the activation of adverse meteorological events that trigger disaster processes against the background of global climate change and their abnormally frequent occurrence (mostly increased atmospheric precipitation, temperature, and humidity); 3. large-scale human impact on the environment. Given the urgency of the problem, a number of governmental and research institutions have intensified their work in this area within the limits of their competence. Foremost is the activity of the Department of Geology of Georgia (now part of the National Environmental Agency of the Ministry of Environment and Natural Resources Protection), which for decades has mapped, identified, and cataloged hazardous processes across the country and determined their spatial limits and developmental patterns. The increased risk of geological catastrophes in Georgia is caused first of all by insufficient information flow between society and the responsible authorities. The current situation calls for a baseline assessment of natural disaster levels, identification of events and their causes, development of special maps in a GIS system, and continuously functioning geomonitoring to support an early warning system.
NASA Astrophysics Data System (ADS)
Gaprindashvili, George; Tsereteli, Emil; Gaprindashvili, Merab
2014-05-01
In the last decades of the 20th century, protecting the population from geological hazards, preserving land, and ensuring the safe operation of engineering facilities became one of the most important social, economic, demographic, political, and environmental problems worldwide. Georgia, given the scale at which natural catastrophic processes (landslides, mudflows, rockfalls, erosion, etc.) originate, their recurrence, and the damage these processes inflict on the population, agricultural land, and engineering structures, is one of the most complex mountainous regions. These extremely sensitive conditions are driven by: 1. the activation of highly intense earthquakes; 2. the activation of adverse meteorological events that trigger disaster processes against the background of global climate change and their abnormally frequent occurrence (mostly increased atmospheric precipitation, temperature, and humidity); 3. large-scale human impact on the environment. Given the urgency of the problem, a number of governmental and research institutions have intensified their work in this area within the limits of their competence. Foremost is the activity of the Department of Geology of Georgia (now part of the National Environmental Agency of the Ministry of Environment and Natural Resources Protection), which for decades has mapped, identified, and cataloged hazardous processes across the country and determined their spatial limits and developmental patterns. The increased risk of geological catastrophes in Georgia is caused first of all by insufficient information flow between society and the responsible authorities. The current situation calls for a baseline assessment of natural disaster levels, identification of events and their causes, development of special maps in a GIS system, and continuously functioning geomonitoring to support an early warning system.
Subsetting Tools for Enabling Easy Access to International Airborne Chemistry Data
NASA Astrophysics Data System (ADS)
Northup, E. A.; Chen, G.; Quam, B. M.; Beach, A. L., III; Silverman, M. L.; Early, A. B.
2017-12-01
In response to the Research Opportunities in Earth and Space Science (ROSES) 2015 release announcement for Advancing Collaborative Connections for Earth System Science (ACCESS), researchers at NASA Langley Research Center (LaRC) proposed to extend the capabilities of the existing Toolsets for Airborne Data (TAD) to include subsetting functionality to allow easier access to international airborne field campaign data. Airborne field studies are commonly used to gain a detailed understanding of atmospheric processes for scientific research on international climate change and air quality issues. To accommodate the rigorous process for manipulating airborne field study chemistry data, and to lessen barriers for researchers, TAD was created with the ability to geolocate data from various sources measured on different time scales from a single flight. The analysis of airborne chemistry data typically requires data subsetting, which can be challenging and resource-intensive for end users. In an effort to streamline this process, new data subsetting features and updates to the current database model will be added to the TAD toolset. These will include two subsetters: temporal and spatial, and vertical profile. The temporal and spatial subsetter will allow users to focus on data from a specific location and/or time period. The vertical profile subsetter will retrieve data collected during an individual aircraft ascent or descent spiral. These new web-based tools will allow for automation of the typically labor-intensive manual data subsetting process, which will provide users with data tailored to their specific research interests. The system has been designed to allow new in-situ airborne missions to be added as they become available, with only minor pre-processing required. The development of these enhancements will be discussed in this presentation.
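A temporal/spatial subsetter of the kind described can be sketched in a few lines; the column names, the bounding box, the time window, and the vertical-profile heuristic below are illustrative assumptions and are not the TAD schema or algorithms.

    import pandas as pd

    # Assumed flat-file layout: one row per merged observation.
    flights = pd.DataFrame({
        "time_utc": pd.to_datetime(["2016-08-01 14:00:05", "2016-08-01 14:20:10",
                                    "2016-08-02 09:15:00"]),
        "lat":  [37.1, 38.4, 36.9],
        "lon":  [-76.3, -77.0, -75.8],
        "alt_m": [500.0, 7200.0, 1500.0],
        "o3_ppbv": [41.0, 68.0, 55.0],
    })

    def subset_temporal_spatial(df, t0, t1, lat_min, lat_max, lon_min, lon_max):
        """Keep observations inside a time window and a lat/lon bounding box."""
        mask = (
            df["time_utc"].between(pd.Timestamp(t0), pd.Timestamp(t1))
            & df["lat"].between(lat_min, lat_max)
            & df["lon"].between(lon_min, lon_max)
        )
        return df[mask]

    def subset_vertical_profile(df, climb=True, min_rate_m_s=2.0):
        """Rough proxy for an ascent/descent spiral: keep rows where the altitude
        change rate exceeds a threshold (sign selects climb or descent)."""
        rate = df["alt_m"].diff() / df["time_utc"].diff().dt.total_seconds()
        return df[rate > min_rate_m_s] if climb else df[rate < -min_rate_m_s]

    print(subset_temporal_spatial(flights, "2016-08-01", "2016-08-02",
                                  36.0, 38.0, -77.5, -75.0))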
[A research review on "fertile islands" of soils under shrub canopy in arid and semi-arid regions].
Chen, Guangsheng; Zeng, Dehui; Chen, Fusheng; Fan, Zhiping; Geng, Haiping
2003-12-01
Due to harsh climate and soil conditions and intense human disturbance, the heterogeneity of soil resources in arid and semi-arid grassland ecosystems worldwide gradually increased during the last century. The interaction between soil heterogeneity and shrubs induced the autogenic development of "fertile islands" and the increasing spread of shrubs in grassland ecosystems. The development of "fertile islands" around individual shrubs can change vegetation composition and structure, as well as the distribution patterns of soil resources, and thus reinforces the shift in ecosystem function and structure from a relatively stable grassland ecosystem to a quasi-stable shrubland ecosystem. The study of the "fertile islands" phenomenon helps us understand the causes, consequences, and processes of desertification in arid and semi-arid areas. In this paper, the causes of "fertile islands", the methods and significance of studying them, their relationship with shrub spreading, and the responses of vegetation to them are summarized. Problems that might occur in the study of this phenomenon are also pointed out. Our aim is to offer references for the study of land desertification processes and vegetation restoration in arid and semi-arid regions.
SWARM : a scientific workflow for supporting Bayesian approaches to improve metabolic models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, X.; Stevens, R.; Mathematics and Computer Science
2008-01-01
With the exponential growth of complete genome sequences, the analysis of these sequences is becoming a powerful approach to building genome-scale metabolic models. These models can be used to study individual molecular components and their relationships, and eventually to study cells as systems. However, constructing genome-scale metabolic models manually is time-consuming and labor-intensive, which is why far fewer genome-scale metabolic models are available than the hundreds of genome sequences already available. To tackle this problem, we designed SWARM, a scientific workflow that can be used to improve genome-scale metabolic models in a high-throughput fashion. SWARM deals with a range of issues including the integration of data across distributed resources, data format conversion, data update, and data provenance. Taken together, SWARM streamlines the whole modeling process: extracting data from various resources, deriving training datasets to train a set of predictors, applying Bayesian techniques to assemble the predictors, inferring on the ensemble of predictors to fill in missing data, and ultimately improving draft metabolic networks automatically. By enhancing metabolic model construction, SWARM enables scientists to generate many genome-scale metabolic models within a short period of time and with less effort.
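To illustrate the kind of Bayesian assembly of predictors mentioned above, here is a minimal sketch of posterior-weighted model averaging; the predictors, training data, and weights are invented for illustration and do not correspond to SWARM's actual components.

    import math

    # Three toy predictors that each return the probability a reaction is present.
    predictors = {
        "sequence_homology": lambda reaction: 0.8 if "kinase" in reaction else 0.3,
        "pathway_context":   lambda reaction: 0.6,
        "phylogenetic":      lambda reaction: 0.7 if "kinase" in reaction else 0.2,
    }

    # Held-out training outcomes: (reaction, truly_present).
    training = [("hexokinase", True), ("unknown_transport", False), ("pyruvate_kinase", True)]

    def log_likelihood(predict, data):
        """Bernoulli log-likelihood of a predictor on labeled examples."""
        ll = 0.0
        for reaction, present in data:
            p = min(max(predict(reaction), 1e-6), 1 - 1e-6)
            ll += math.log(p if present else 1 - p)
        return ll

    # Posterior weights proportional to (uniform prior) x likelihood, normalized.
    raw = {name: math.exp(log_likelihood(f, training)) for name, f in predictors.items()}
    total = sum(raw.values())
    weights = {name: v / total for name, v in raw.items()}

    def ensemble_probability(reaction):
        """Bayesian-model-averaged probability that a missing reaction is present."""
        return sum(weights[name] * predictors[name](reaction) for name in predictors)

    print(ensemble_probability("adenylate_kinase"))   # candidate gap-filling reaction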
Carotenoid coloration is related to fat digestion efficiency in a wild bird
NASA Astrophysics Data System (ADS)
Madonia, Christina; Hutton, Pierce; Giraudeau, Mathieu; Sepp, Tuul
2017-12-01
Some of the most spectacular visual signals found in the animal kingdom are based on dietarily derived carotenoid pigments (which cannot be produced de novo), with a general assumption that carotenoids are limited resources for wild organisms, causing trade-offs in allocation of carotenoids to different physiological functions and ornamentation. This resource trade-off view has been recently questioned, since the efficiency of carotenoid processing may relax the trade-off between allocation toward condition and ornamentation. This hypothesis has so far received little exploratory support, since studies of digestive efficiency of wild animals are limited due to methodological difficulties. Recently, a method for quantifying the percentage of fat in fecal samples to measure digestive efficiency has been developed in birds. Here, we use this method to test if the intensity of the carotenoid-based coloration predicts digestive efficiency in a wild bird, the house finch (Haemorhous mexicanus). The redness of carotenoid feather coloration (hue) positively predicted digestion efficiency, with redder birds being more efficient at absorbing fats from seeds. We show for the first time in a wild species that digestive efficiency predicts ornamental coloration. Though not conclusive due to the correlative nature of our study, these results strongly suggest that fat extraction might be a crucial but overlooked process behind many ornamental traits.
NASA Astrophysics Data System (ADS)
Belyaev, A.; Berezhnaya, A.; Betev, L.; Buncic, P.; De, K.; Drizhuk, D.; Klimentov, A.; Lazin, Y.; Lyalin, I.; Mashinistov, R.; Novikov, A.; Oleynik, D.; Polyakov, A.; Poyda, A.; Ryabinkin, E.; Teslyuk, A.; Tkachenko, I.; Yasnopolskiy, L.
2015-12-01
The LHC experiments are preparing for the precision measurements and further discoveries that will be made possible by higher LHC energies from April 2015 (LHC Run2). The need for simulation, data processing, and analysis would overwhelm the expected capacity of the grid infrastructure computing facilities deployed by the Worldwide LHC Computing Grid (WLCG). To meet this challenge, the integration of opportunistic resources into the LHC computing model is highly important. The Tier-1 facility at Kurchatov Institute (NRC-KI) in Moscow is a part of WLCG, and it will process, simulate, and store up to 10% of the total data obtained from the ALICE, ATLAS, and LHCb experiments. In addition, Kurchatov Institute has supercomputers with a peak performance of 0.12 PFLOPS. Delegating even a fraction of these supercomputing resources to LHC computing will notably increase the total capacity. In 2014, the development of a portal combining the Tier-1 and a supercomputer at Kurchatov Institute was started to provide common interfaces and storage. The portal will be used not only for HENP experiments but also by other data- and compute-intensive sciences, such as biology (genome sequencing analysis) and astrophysics (cosmic ray analysis, antimatter and dark matter searches, etc.).
[Descriptive Analysis of Health Economics of Intensive Home Care of Ventilated Patients].
Lehmann, Yvonne; Ostermann, Julia; Reinhold, Thomas; Ewers, Michael
2018-05-14
Long-term ventilated patients in Germany receive intensive care mainly in their own homes or in assisted-living facilities. There is a lack of knowledge about the nature and extent of resource use and the costs associated with care of this small, heterogeneous, but overall growing patient group. A sub-study within the research project SHAPE descriptively analyzed the costs of 29 patients from a societal perspective. Direct and indirect costs of intensive home care over a period of three months were recorded and analyzed retrospectively. Standardized written self-reports from patients and relatives, as well as information from interviews with nursing staff and from nursing documentation, formed the basis for this analysis. The average total cost of intensive home care over three months was 61,194 € per patient (95% CI 53,884-68,504), including hospital stays. The main costs were directly linked to outpatient medical and nursing care provided according to the Code of Social Law V and XI. Services provided by home nursing care services according to § 37(2) of the Code of Social Law V (65%) were the largest cost item. Approximately 13% of the total costs were attributable to indirect costs. Intensive home care for ventilated patients is resource-intensive and cost-intensive and has so far received little attention from a health economics perspective. Valid information and transparency about the cost structures are required for an effective and economic design and management of long-term care for this patient group. © Georg Thieme Verlag KG Stuttgart · New York.
Shrira, Amit; Shmotkin, Dov; Palgi, Yuval; Hoffman, Yaakov; Bodner, Ehud; Ben-Ezra, Menachem; Litwin, Howard
2017-01-01
The potentially different psychological effects of ongoing trauma vis-à-vis an intense time-limited exposure to trauma have not been examined in older adults. Therefore, this study examined posttraumatic stress disorder (PTSD) symptoms and their health concomitants in two groups of older adults in Israel: those exposed to ongoing missile attacks and those exposed to an intense time-limited period of missile attacks. In the third administration of the Israeli component of the Survey of Health, Ageing, and Retirement in Europe (SHARE-Israel), 297 older adults reported ongoing exposure to missile attacks due to the Israel-Gaza conflict (mean age = 66.97), while 309 older adults reported exposure to an intense period of missile attacks during the Second Lebanon War (mean age = 66.63). Participants completed measures of PTSD symptoms, and physical, cognitive, and mental health. Older adults with ongoing exposure reported higher PTSD symptom level relative to those with intense time-limited exposure. The groups also differed in health variables related to PTSD symptoms. Namely, impaired physical and cognitive health were related to a higher level of PTSD symptoms in ongoing exposure, while impaired mental health was related to a higher PTSD symptom level following intense time-limited exposure. The findings suggest that physical and cognitive health involves resources that are vital for daily survival when living under ongoing warfare threat, whereas mental health involves resources that are needed in dealing with psychological effects of warfare trauma. Accordingly, different interventions may be necessary when helping older adults exposed to ongoing versus intense time-limited trauma.
De Silva, A Pubudu; Stephens, Tim; Welch, John; Sigera, Chathurani; De Alwis, Sunil; Athapattu, Priyantha; Dharmagunawardene, Dilantha; Olupeliyawa, Asela; de Abrew, Ashwini; Peiris, Lalitha; Siriwardana, Somalatha; Karunathilake, Indika; Dondorp, Arjen; Haniffa, Rashan
2015-04-01
To assess the impact of a nurse-led, short, structured training program for intensive care unit (ICU) nurses in a resource-limited setting. A training program using a structured approach to patient assessment and management for ICU nurses was designed and delivered by local nurse tutors in partnership with overseas nurse trainers. The impact of the course was assessed using the following: pre-course and post-course self-assessment, a pre-course and post-course Multiple Choice Questionnaire (MCQ), a post-course Objective Structured Clinical Assessment station, 2 post-course Short Oral Exam (SOE) stations, and post-course feedback questionnaires. In total, 117 ICU nurses were trained. Post-MCQ scores were significantly higher when compared with pre-MCQ (P < .0001). More than 95% passed the post-course Objective Structured Clinical Assessment (patient assessment) and SOE 1 (arterial blood gas analysis), whereas 76.9% passed SOE 2 (3-lead electrocardiogram analysis). The course was highly rated by participants, with 98% believing that this was a useful experience. Nursing Intensive Care Skills Training was highly rated by participants and was effective in improving the knowledge of the participants. This sustainable short course model may be adaptable to other resource-limited settings. Copyright © 2014 Elsevier Inc. All rights reserved.
High-Tech Versus High-Touch: Components of Hospital Costs Vary Widely.
Song, Paula H; Reiter, Kristin L; Yi Xu, Wendy
The recent release by the Centers for Medicare & Medicaid Services of hospital charge and payment data to the public has renewed a national dialogue on hospital costs and prices. However, to better understand the driving force of hospital pricing and to develop strategies for controlling expenditures, it is important to understand the underlying costs of providing hospital services. We use Medicare Provider and Analysis Review inpatient claims data and Medicare cost report data for fiscal years 2008 and 2012 to examine variations in the contribution of "high-tech" resources (i.e., technology/medical device-intensive resources) versus "high-touch" resources (i.e., labor-intensive resources) to the total costs of providing two common services, as well as assess how these costs have changed over time. We found that high-tech inputs accounted for a greater proportion of the total costs of surgical service, whereas medical service costs were primarily attributable to high-touch inputs. Although the total costs of services did not change significantly over time, the distribution of high-tech, high-touch, and other costs for each service varied considerably across hospitals. Understanding resource inputs and the varying contribution of these inputs by clinical condition is an important first step in developing effective cost control strategies.
Pretreatment methods for bioethanol production.
Xu, Zhaoyang; Huang, Fang
2014-09-01
Lignocellulosic biomass, such as wood, grass, and agricultural and forest residues, is a potential resource for the production of bioethanol. The current biochemical process of converting biomass to bioethanol typically consists of three main steps: pretreatment, enzymatic hydrolysis, and fermentation. For this process, pretreatment is probably the most crucial step since it has a large impact on the efficiency of the overall bioconversion. The aim of pretreatment is to disrupt recalcitrant structures of cellulosic biomass to make cellulose more accessible to the enzymes that convert carbohydrate polymers into fermentable sugars. This paper reviews several leading acidic, neutral, and alkaline pretreatment technologies. Different pretreatment methods, including dilute acid pretreatment (DAP), steam explosion pretreatment (SEP), organosolv, liquid hot water (LHW), ammonia fiber expansion (AFEX), soaking in aqueous ammonia (SAA), sodium hydroxide/lime pretreatments, and ozonolysis, are introduced and discussed in detail. In this minireview, the key points focus on the structural changes, primarily in cellulose, hemicellulose, and lignin, during the above leading pretreatment technologies.
Botanochemicals and chemurgy in the petroleum drought ahead
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bagby, M.O.; Buchanan, R.A.; Duke, J.A.
1979-01-01
Green plants, collectively, are still a major under-exploited resource. However, new crops and agricultural systems are being developed for the production of fuels and materials in addition to foods and fibers. Whole-plant oils and botanochemicals are being evaluated as annually renewable replacements for petroleum crude and petrochemicals, respectively. Plant derived fuel alcohols are becoming a viable supplement to gasoline and fuel oils. Polyisoprenes, terpenes, oils, waxes, alcohols, phenols, furfural, methane, and producer gas from plant sources can potentially displace petroleum derived feedstocks for the synthetic chemical industry. Moreover, new botanochemical processing methods offer prospects for reducing US dependence on imports for many specialty plant-products traditionally produced by labor-intensive methods. Extraction of essential oils, pharmaceutical intermediates, tannins, and vegetable dyes may be integrated with botanochemical processing to allow exploitation of the varied US climate for domestic production of nearly every botanical now imported.
Decisions on new product development under uncertainties
NASA Astrophysics Data System (ADS)
Huang, Yeu-Shiang; Liu, Li-Chen; Ho, Jyh-Wen
2015-04-01
In an intensely competitive market, developing a new product has become a valuable strategy for companies to establish their market positions and enhance their competitive advantages. Therefore, it is essential to effectively manage the process of new product development (NPD). However, since various problems may arise in NPD projects, managers should set up milestones and subsequently construct evaluative mechanisms to assess their feasibility. This paper employs Bayesian decision analysis to deal with two crucial uncertainties for NPD: the future market share and the responses of competitors. The proposed decision process provides a systematic analytical procedure to determine whether an NPD project should be continued, taking into account whether effective use is being made of the organisational resources. Accordingly, the proposed decision model can assist managers in effectively addressing NPD decisions in a competitive market.
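A Bayesian decision analysis of the continue/stop choice can be sketched as follows; the prior over market share, the competitor-response likelihoods, and the payoffs are invented for illustration and are not taken from the paper.

    # Discretized prior belief about the future market share of the new product.
    prior = {0.05: 0.3, 0.15: 0.5, 0.30: 0.2}

    # Assumed likelihood of observing an aggressive competitor response
    # given each market-share state (a larger share provokes a stronger reaction).
    p_aggressive_given_share = {0.05: 0.2, 0.15: 0.5, 0.30: 0.8}

    def posterior(observed_aggressive):
        """Update the market-share belief after observing the competitor response."""
        like = {s: (p if observed_aggressive else 1 - p)
                for s, p in p_aggressive_given_share.items()}
        unnorm = {s: prior[s] * like[s] for s in prior}
        z = sum(unnorm.values())
        return {s: v / z for s, v in unnorm.items()}

    def expected_payoff(belief, develop_cost=2.0, margin_per_share=40.0):
        """Expected profit of continuing the NPD project (arbitrary money units)."""
        return sum(p * (margin_per_share * s) for s, p in belief.items()) - develop_cost

    post = posterior(observed_aggressive=True)
    continue_value = expected_payoff(post)
    stop_value = 0.0                          # abandoning recovers nothing in this sketch
    print("continue" if continue_value > stop_value else "stop", round(continue_value, 2))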
European Union pharmacovigilance capabilities: potential for the new legislation
Tanti, Amy; Kouvelas, Dimitrios; Lungu, Calin; Pirozynski, Michal; Serracino-Inglott, Anthony; Aislaitner, George
2015-01-01
European Directives and Regulations introduced between late 2010 and 2012 have substantially overhauled pharmacovigilance processes across the European Union (EU). In this review, the implementation of the pharmacovigilance legislative framework by EU regulators is examined with the aim of mapping Directive 2010/84/EU and Regulation EC No. 1235/2010 against their aspired objectives of strengthening and rationalizing pharmacovigilance in the EU. A comprehensive review of the current state of affairs of the progress made by EU regulators is presented in this paper. Our review shows that intense efforts by regulators and industry to fulfil legislative obligations have resulted in major positive shifts in pharmacovigilance. Harmonized decision making, transparency in decision processes with patient involvement, information accessibility to the public, patient adverse drug reaction reporting, efforts in communication and enhanced cooperation between member states to maximize resource utilization and minimize duplication of efforts are observed. PMID:26301067
Influence of Biological Factors on Connectivity Patterns for Concholepas concholepas (loco) in Chile
Garavelli, Lysel; Colas, François; Verley, Philippe; Kaplan, David Michael; Yannicelli, Beatriz; Lett, Christophe
2016-01-01
In marine benthic ecosystems, larval connectivity is a major process influencing the maintenance and distribution of invertebrate populations. Larval connectivity is a complex process to study as it is determined by several interacting factors. Here we use an individual-based, biophysical model to disentangle the effects of such factors, namely larval vertical migration, larval growth, larval mortality, adult fecundity, and habitat availability, for the marine gastropod Concholepas concholepas (loco) in Chile. Lower transport success and higher dispersal distances are observed when larval vertical migration is included in the model. We find an overall decrease in larval transport success to settlement areas from northern to southern Chile. This spatial gradient results from the combination of current direction and intensity, seawater temperature, and available habitat. From our simulated connectivity patterns we then identify subpopulations of loco along the Chilean coast, which could serve as a basis for spatial management of this resource in the future. PMID:26751574
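The connectivity patterns such a model produces are usually summarized as a matrix of transport success between release and settlement areas; the following is a minimal sketch using a simple random-walk drift, with the coastal zones, drift, mortality, and larval duration all invented for illustration rather than taken from the biophysical model used in the paper.

    import random

    ZONES = ["north", "center", "south"]   # assumed coastal release/settlement zones
    N_LARVAE = 2000
    PLD_DAYS = 30                          # assumed pelagic larval duration
    DAILY_MORTALITY = 0.05
    DAILY_DRIFT_SOUTH = 0.02               # assumed mean daily along-shore displacement;
                                           # zone index increases southward

    def simulate_connectivity(seed=1):
        """Count larvae released in each zone that survive and settle in each zone."""
        random.seed(seed)
        matrix = {src: {dst: 0 for dst in ZONES} for src in ZONES}
        for src_index, src in enumerate(ZONES):
            for _ in range(N_LARVAE):
                position = float(src_index)     # 0 = north ... 2 = south
                alive = True
                for _ in range(PLD_DAYS):
                    if random.random() < DAILY_MORTALITY:
                        alive = False
                        break
                    position += DAILY_DRIFT_SOUTH + random.gauss(0.0, 0.1)
                if alive:
                    dst_index = round(position)
                    if 0 <= dst_index < len(ZONES):   # otherwise lost along-shore/offshore
                        matrix[src][ZONES[dst_index]] += 1
        return matrix

    for src, row in simulate_connectivity().items():
        print(src, {dst: n / N_LARVAE for dst, n in row.items()})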
The Computer Aided Aircraft-design Package (CAAP)
NASA Technical Reports Server (NTRS)
Yalif, Guy U.
1994-01-01
The preliminary design of an aircraft is a complex, labor-intensive, and creative process. Since the 1970s, many computer programs have been written to help automate preliminary airplane design. Time and resource analyses have identified 'a substantial decrease in project duration with the introduction of an automated design capability'. Proof-of-concept studies have been completed which establish 'a foundation for a computer-based airframe design capability'. Unfortunately, today's design codes exist in many different languages on many, often expensive, hardware platforms. Through the use of a module-based system architecture, the Computer Aided Aircraft-design Package (CAAP) will eventually bring together many of the most useful features of existing programs. Through the use of an expert system, it will add an additional feature that could be described as indispensable to entry-level engineers and students: the incorporation of 'expert' knowledge into the automated design process.
Tongue-tied: Confused meanings for common fire terminology can lead to fuels mismanagement
Theresa B. Jain; Russell T. Graham; David S. Pilliod
2004-01-01
The ineffective and inconsistent use of terminology among fire managers, scientists, resource managers and the public is a constant problem in resource management. In fire management and fire science, the terms fire severity, burn severity and fire intensity are defined in a variety of ways, used inconsistently and, in some cases, interchangeably.
Water and water use in southern Nevada [Chapter 3
Wayne R. Belcher; Michael J. Moran; Megan E. Rogers
2013-01-01
Water and water use in southern Nevada is an important issue. The scarcity of water resources for both human and biologic communities often leads to intense competition for both surface and groundwaters. Anthropogenic and climate change impacts on scarce water resources need to be understood to assess human and ecosystem health for the study area.
ERIC Educational Resources Information Center
Runyan, Rodney C.; Finnegan, Carol; Gonzalez-Padron, Tracy; Line, Nathan D.
2013-01-01
The promotion, tenure, and salary of marketing faculty have been topics of intense interest recently. What has received less interest are the drivers of publishing productivity, especially for new, pretenure faculty. We use resource advantage (RA) theory to examine the drivers of pretenure faculty productivity, specifically in the top marketing…
Deploying wildland fire suppression resources with a scenario-based standard response model.
Robert G. Haight; Jeremy S. Fried
2007-01-01
Wildland fire managers deploy suppression resources to bases and dispatch them to fires to maximize the percentage of fires that are successfully contained before unacceptable costs and losses occur. Deployment is made with budget constraints and uncertainty about the daily number, location, and intensity of fires, all of which affect initial-attack success. To address...
Managing uncertainty in climate-driven ecological models to inform adaptation to climate change
Jeremy S. Littell; Donald McKenzie; Becky K. Kerns; Samuel Cushman; Charles G. Shaw
2011-01-01
The impacts of climate change on forest ecosystems are likely to require changes in forest planning and natural resource management. Changes in tree growth, disturbance extent and intensity, and eventually species distributions are expected. In natural resource management and planning, ecosystem models are typically used to provide a "best estimate" about how...
ERIC Educational Resources Information Center
Clark, Lindie; Rowe, Anna; Cantori, Alex; Bilgin, Ayse; Mukuria, Valentine
2016-01-01
Work-integrated learning (WIL) courses can be more time consuming and resource intensive to design, teach, administer and support than classroom-based courses, as they generally require different curricula and pedagogical approaches as well as additional administrative and pastoral responsibilities. Workload and resourcing issues are reported as…
Forest resources of the Susitna Valley, Alaska.
Karl M. Hegg
1970-01-01
This report summarizes the data from the first intensive inventory of the forests in the Susitna Valley, Alaska, conducted during the period 1964-65. The primary purposes of the inventory were to determine the total area of forested lands, the commercial forest area and timber volume, and the condition and growth of this resource, and to report on...
A survey-based benchmarking approach for health care using the Baldrige quality criteria.
Jennings, K; Westfall, F
1994-09-01
Since 1988, manufacturing and service industries have been using the Malcolm Baldrige National Quality Award to assess their management processes (for example, leadership, information, and analysis) against critical performance criteria. Recognizing that the typical Baldrige assessment is time-intensive and dependent on intensive training, The Pacer Group, a consulting firm in Dayton, Ohio, developed a self-assessment tool based on the Baldrige criteria that provides a snapshot assessment of an organization's management practices. The survey was administered at 25 hospitals within a health care system. Hospitals were able to compare their scores with those of other hospitals in the system, as well as with the scores of a Baldrige award winner. Results were also analyzed on a systemwide basis to identify strengths and weaknesses across the system. For all 25 hospitals, the following areas were identified as strengths: management of process quality, leadership, and customer focus and satisfaction. Weaknesses included lack of employee involvement in the quality planning process, poor design of quality systems, and lack of cross-departmental cooperation. One of the surveyed hospitals launched improvement initiatives in knowledge of improvement tools and methods and in patient satisfaction focus. A team was formed to improve the human resource management system. Also, a new unit was designed using patient-centered care principles, and a team re-evaluated every operation that affected patients on the unit. A survey modeled after the Baldrige Award criteria can be useful in benchmarking an organization's quality improvement practices.
NASA Astrophysics Data System (ADS)
Omotoso, T.
2015-12-01
By 2050, the world will need to feed 9 billion people. This will require a 60% increase in agricultural production and, subsequently, a 6% increase in water use by the agricultural sector alone. By 2030, global water demand is expected to increase by 40%, mostly in developing countries like Nigeria (Addams, Boccaletti, Kerlin, & Stuchtey, 2009), and global energy demand is expected to increase by 33% by 2035, again mostly in emerging economies (IEA, 2013). These resources have to be managed efficiently in preparation for these future demands. Population growth leads to increased demand for water, energy, and food. More food production will lead to more water-for-food and energy-for-food usage, and more demand for energy will lead to more water-for-energy needs. This nexus between water, energy, and food is poorly understood and is further complicated by external drivers such as climate change. Niger State, Nigeria, which is blessed with abundant water and arable land resources, houses the country's three hydropower dams and one of the government's proposed Staple Crops Processing Zones (SCPZ) for rice production. Both of these capital-intensive investments depend heavily on water resources and are highly vulnerable to changes in climate. Thus, it is essential to know how the local climate in this state is likely to change and what its impacts on water, energy, and food security will be, so that policy makers can make informed mitigation/adaptation plans and operational and investment decisions. The objective of this project is to provide information, using an integrated resources management approach, on the effects of future climate change on water, energy (hydropower), and food resources in Niger State, Nigeria, and to improve knowledge of the interlinkages between water, energy, and food at a local scale.
Passionate Intensity and the Educational Process.
ERIC Educational Resources Information Center
Thompson, Mark E.
The educational process and passionate intensity are forces that are often at odds in society. Passionate intensity is a force that introduces turmoil and threatens those social processes that depend on reason and independent thought. In contrast, the educational process seeks to develop coping skills to limit dependence on others and promote…
Fire intensity, fire severity and burn severity: A brief review and suggested usage
Keeley, J.E.
2009-01-01
Several recent papers have suggested replacing the terminology of fire intensity and fire severity. Part of the problem with fire intensity is that it is sometimes used incorrectly to describe fire effects, when in fact it is justifiably restricted to measures of energy output. Increasingly, the term has created confusion because some authors have restricted its usage to a single measure of energy output referred to as fireline intensity. This metric is most useful in understanding fire behavior in forests, but is too narrow to fully capture the multitude of ways fire energy affects ecosystems. Fire intensity represents the energy released during various phases of a fire, and different metrics such as reaction intensity, fireline intensity, temperature, heating duration and radiant energy are useful for different purposes. Fire severity, and the related term burn severity, have created considerable confusion because of recent changes in their usage. Some authors have justified this by contending that fire severity is defined broadly as ecosystem impacts from fire and thus is open to individual interpretation. However, empirical studies have defined fire severity operationally as the loss of or change in organic matter aboveground and belowground, although the precise metric varies with management needs. Confusion arises because fire or burn severity is sometimes defined so that it also includes ecosystem responses. Ecosystem responses include soil erosion, vegetation regeneration, restoration of community structure, faunal recolonization, and a plethora of related response variables. Although some ecosystem responses are correlated with measures of fire or burn severity, many important ecosystem processes have either not been demonstrated to be predicted by severity indices or have been shown in some vegetation types to be unrelated to severity. This is a critical issue because fire or burn severity are readily measurable parameters, both on the ground and with remote sensing, yet ecosystem responses are of most interest to resource managers.
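As a concrete example of one energy-output metric named above, fireline intensity is commonly computed with Byram's formulation as the product of fuel heat yield, fuel consumed per unit area, and rate of spread; the sketch below uses that standard formulation with illustrative input values and is not code from the cited review.

    def fireline_intensity_kw_per_m(heat_yield_kj_per_kg, fuel_consumed_kg_per_m2,
                                    rate_of_spread_m_per_s):
        """Byram's fireline intensity I = H * w * r, in kW per metre of fire front."""
        return heat_yield_kj_per_kg * fuel_consumed_kg_per_m2 * rate_of_spread_m_per_s

    # Illustrative surface fire: H ~ 18,000 kJ/kg, w ~ 0.5 kg/m^2, r ~ 0.05 m/s.
    print(fireline_intensity_kw_per_m(18000, 0.5, 0.05))   # -> 450.0 kW/m

The units work out because kJ/kg x kg/m^2 x m/s = kJ/(m*s) = kW/m, which is why fireline intensity is expressed as energy release per unit length of fire front.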
Ishikura, Satoshi
2008-11-01
The process of radiotherapy (RT) is complex and involves understanding of the principles of medical physics, radiobiology, radiation safety, dosimetry, radiation treatment planning, simulation and the interaction of radiation with other treatment modalities. Each step in the integrated process of RT needs quality control and quality assurance (QA) to prevent errors and to give high confidence that patients will receive the prescribed treatment correctly. Recent advances in RT, including intensity-modulated and image-guided RT, highlight the need for a systematic RT QA program that balances patient safety and quality with the available resources. It is necessary to develop more formal error mitigation and process analysis methods, such as failure mode and effect analysis, to focus the available QA resources optimally on process components. External audit programs are also effective. The International Atomic Energy Agency has operated both on-site and off-site (postal) dosimetry audits to improve practice and to assure the dose delivered by RT equipment. Several countries have adopted a similar approach for national clinical auditing. In addition, clinical trial QA has a significant role in enhancing the quality of care. The Advanced Technology Consortium has pioneered the development of an infrastructure and QA method for advanced technology clinical trials, including credentialing and individual case review. These activities have an impact not only on the treatment received by patients enrolled in clinical trials, but also on the quality of treatment administered to all patients treated in each institution, and have been adopted globally, including in the USA, Europe and Japan.
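Failure mode and effect analysis, mentioned above as a way to focus limited QA resources, is commonly operationalized by ranking failure modes with a risk priority number (severity × occurrence × detectability). A minimal sketch with hypothetical failure modes and placeholder scores, not taken from the article:

```python
# Rank hypothetical RT process failure modes by risk priority number (RPN = S * O * D)
failure_modes = [
    # (description, severity 1-10, occurrence 1-10, detectability 1-10; higher D = harder to detect)
    ("wrong patient plan loaded", 9, 2, 3),
    ("MLC leaf position error", 6, 4, 4),
    ("setup shift not corrected", 7, 5, 3),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN={s * o * d:3d}  {desc}")
```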
Processing Shotgun Proteomics Data on the Amazon Cloud with the Trans-Proteomic Pipeline*
Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W.; Moritz, Robert L.
2015-01-01
Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of the cloud-enabled Trans-Proteomic Pipeline by processing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. PMID:25418363
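The kind of resource and cost estimate described above can be sketched as simple arithmetic. The instance pricing, per-file search times, and parallelism below are hypothetical placeholders, not TPP benchmarks or current AWS prices:

```python
# Hypothetical inputs: not actual TPP benchmarks or current AWS prices
n_files = 1100                  # tandem MS runs to search
minutes_per_file = 20           # per search engine, on one CPU-equivalent slot
n_search_engines = 4
slots_per_instance = 8          # concurrent searches per EC2 instance
price_per_instance_hour = 0.40  # USD, placeholder

total_cpu_hours = n_files * n_search_engines * minutes_per_file / 60
instance_hours = total_cpu_hours / slots_per_instance
cost = instance_hours * price_per_instance_hour
wall_hours_on_20_instances = instance_hours / 20

print(f"~{total_cpu_hours:.0f} CPU-hours, ~{instance_hours:.0f} instance-hours, "
      f"~${cost:.0f}, ~{wall_hours_on_20_instances:.1f} h wall time on 20 instances")
```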
Van Dyken, J. David; Wade, Michael J.
2012-01-01
Understanding the evolution of altruism requires knowledge of both its constraints and its drivers. Here we show that, paradoxically, ecological constraints on altruism may ultimately be its strongest driver. We construct a two-trait, co-evolutionary adaptive dynamics model of social evolution in a genetically structured population with local resource competition. The intensity of local resource competition, which influences the direction and strength of social selection and which is typically treated as a static parameter, is here allowed to be an evolvable trait. Evolution of survival/fecundity altruism, which requires weak local competition, increases local competition as it evolves, creating negative environmental feedback that ultimately inhibits its further evolutionary advance. Alternatively, evolution of resource-based altruism, which requires strong local competition, weakens local competition as it evolves, also ultimately causing its own evolution to stall. When evolving independently, these altruistic strategies are intrinsically self-limiting. However, the co-existence of these two altruism types transforms the negative eco-evolutionary feedback generated by each strategy on itself into positive feedback on the other, allowing the presence of one trait to drive the evolution of the other. We call this feedback conversion “reciprocal niche construction”. In the absence of constraints, this process leads to runaway co-evolution of altruism types. We discuss applications to the origins and evolution of eusociality, division of labor, the inordinate ecological success of eusocial species, and the interaction between technology and demography in human evolution. Our theory suggests that the evolution of extreme sociality may often be an autocatalytic process. PMID:22834748
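The feedback logic described above can be caricatured in a toy simulation. This is an illustrative sketch only, with made-up thresholds and linear feedbacks, and is not the authors' adaptive dynamics model:

```python
# Toy illustration (not the paper's model): x = survival/fecundity altruism,
# y = resource-based altruism, c = intensity of local resource competition.
def simulate(enable_x=True, enable_y=True, steps=2000, dt=0.01):
    x = y = 0.1
    c0, theta_x, theta_y = 0.9, 1.2, 0.6   # baseline competition and selection thresholds
    for _ in range(steps):
        c = c0 + x - y                      # x intensifies, y relaxes local competition
        if enable_x:
            x = max(x + dt * x * (theta_x - c), 0.0)  # x favoured only while competition is weak
        if enable_y:
            y = max(y + dt * y * (c - theta_y), 0.0)  # y favoured only while competition is strong
    return round(x, 2), round(y, 2)

print("x alone :", simulate(enable_y=False))  # rises, then stalls as it raises c toward its threshold
print("y alone :", simulate(enable_x=False))  # rises, then stalls as it lowers c toward its threshold
print("together:", simulate())                # each trait's feedback fuels the other: runaway growth
```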
Webb, Nicholas P.; Herrick, Jeffrey E.; Van Zee, Justin W; Courtright, Ericha M; Hugenholtz, Ted M; Zobeck, Ted M; Okin, Gregory S.; Barchyn, Thomas E; Billings, Benjamin J; Boyd, Robert A.; Clingan, Scott D; Cooper, Brad F; Duniway, Michael C.; Derner, Justin D.; Fox, Fred A; Havstad, Kris M.; Heilman, Philip; LaPlante, Valerie; Ludwig, Noel A; Metz, Loretta J; Nearing, Mark A; Norfleet, M Lee; Pierson, Frederick B; Sanderson, Matt A; Sharrat, Brenton S; Steiner, Jean L; Tatarko, John; Tedela, Negussie H; Todelo, David; Unnasch, Robert S; Van Pelt, R Scott; Wagner, Larry
2016-01-01
The National Wind Erosion Research Network was established in 2014 as a collaborative effort led by the United States Department of Agriculture’s Agricultural Research Service and Natural Resources Conservation Service, and the United States Department of the Interior’s Bureau of Land Management, to address the need for a long-term research program to meet critical challenges in wind erosion research and management in the United States. The Network has three aims: (1) provide data to support understanding of basic aeolian processes across land use types, land cover types, and management practices, (2) support development and application of models to assess wind erosion and dust emission and their impacts on human and environmental systems, and (3) encourage collaboration among the aeolian research community and resource managers for the transfer of wind erosion technologies. The Network currently consists of thirteen intensively instrumented sites providing measurements of aeolian sediment transport rates, meteorological conditions, and soil and vegetation properties that influence wind erosion. Network sites are located across rangelands, croplands, and deserts of the western US. In support of Network activities, http://winderosionnetwork.org was developed as a portal for information about the Network, providing site descriptions, measurement protocols, and data visualization tools to facilitate collaboration with scientists and managers interested in the Network and accessing Network products. The Network provides a mechanism for engaging national and international partners in a wind erosion research program that addresses the need for improved understanding and prediction of aeolian processes across complex and diverse land use types and management practices.
A Multiple-player-game Approach to Agricultural Water Use in Regions of Seasonal Drought
NASA Astrophysics Data System (ADS)
Lu, Z.
2013-12-01
In the widely distributed regions of seasonal drought, conflicts over water allocation between multiple stakeholders (i.e., water consumers and policy makers) are frequent and severe problems. These conflicts become extremely serious in the dry seasons, and are ultimately caused by an intense disparity between limited natural resources and the great demands of social development. Meanwhile, these stakeholders are often both competitors and cooperators in water-saving problems, because water is a type of public resource. Conflicts often occur due to the lack of an appropriate water allocation scheme. Among the many uses of water, demand for agricultural irrigation water is highly elastic, but this elasticity has not yet been fully exploited to free up water from agricultural use. The primary goal of this work is to design an optimal distribution scheme of water resources for dry seasons to maximize benefits from precious water resources, considering the high elasticity of agricultural water demand due to soil moisture dynamics affected by the uncertainty of precipitation and other factors such as canopy interception. A dynamic programming model will be used to find an appropriate allocation of water resources among agricultural irrigation and other purposes such as drinking water, industry, and hydropower. In this dynamic programming model, we analytically quantify soil moisture dynamics in the agricultural fields by describing interception with a marked Poisson process and rainfall depth with an exponential distribution. Then, we derive a water-saving irrigation scheme, which regulates the timetable and volumes of irrigation water, in order to minimize the irrigation water requirement while maintaining the necessary crop yield (as a constraint condition). In turn, we provide a scheme for distributing water resources among agriculture and other purposes, aiming to maximize the benefits from precious water resources, or in other words, to make the best use of the limited water resource.
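As a concrete illustration of the stochastic rainfall component referenced above, the following minimal sketch simulates storm arrivals as a marked Poisson process with exponentially distributed depths and a fixed canopy interception threshold. All parameter values are hypothetical, and the dynamic programming allocation itself is not shown:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) parameters for a dry season
season_days = 120      # length of the season [d]
storm_rate = 0.15      # Poisson arrival rate of storms [1/d]
mean_depth = 8.0       # mean storm depth [mm], exponentially distributed
interception = 1.5     # canopy interception threshold [mm]

# Storm arrivals: exponential inter-arrival times => marked Poisson process
t, depths = 0.0, []
while True:
    t += rng.exponential(1.0 / storm_rate)
    if t > season_days:
        break
    depths.append(rng.exponential(mean_depth))

# Effective rainfall reaching the soil after canopy interception
effective = [max(d - interception, 0.0) for d in depths]
print(f"{len(depths)} storms, total effective rainfall = {sum(effective):.1f} mm")
```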
ERIC Educational Resources Information Center
Unger, Daniel R.
2014-01-01
Undergraduate students pursuing a Bachelor of Science in Forestry (BSF) at Stephen F. Austin State University (SFA) attend an intensive 6-week residential hands-on instruction in applied field methods. The intensive 6-week instruction includes learning how to use the Global Positioning System (GPS) with a Garmin eTrex HCx GPS unit to accurately…
Hutchinson, David; Bradley, Samuel D
2009-03-01
In the recent United States-led "war on terror," including ongoing engagements in Iraq and Afghanistan, news organizations have been accused of showing a negative view of developments on the ground. In particular, news depictions of casualties have brought accusations of anti-Americanism and aiding and abetting the terrorists' cause. In this study, video footage of war from television news stories was manipulated to investigate the effects of negative compelling images on cognitive resource allocation, physiological arousal, and recognition memory. Results of a within-subjects experiment indicate that negatively valenced depictions of casualties and destruction elicit greater attention and physiological arousal than positive and low-intensity images. Recognition memory for visual information in the graphic negative news condition was highest, whereas audio recognition for this condition was lowest. The results suggest that negative, high-intensity video imagery diverts cognitive resources away from the encoding of verbal information in the newscast, positioning visual images and not the spoken narrative as a primary channel of viewer learning.
Lim, Seong-Rin; Kang, Daniel; Ogunseitan, Oladele A; Schoenung, Julie M
2011-01-01
Light-emitting diodes (LEDs) are advertised as environmentally friendly because they are energy efficient and mercury-free. This study aimed to determine if LEDs engender other forms of environmental and human health impacts, and to characterize variation across different LEDs based on color and intensity. The objectives are as follows: (i) to use standardized leachability tests to examine whether LEDs are to be categorized as hazardous waste under existing United States federal and California state regulations; and (ii) to use material life cycle impact and hazard assessment methods to evaluate resource depletion and toxicity potentials of LEDs based on their metallic constituents. According to federal standards, LEDs are not hazardous except for low-intensity red LEDs, which leached Pb at levels exceeding regulatory limits (186 mg/L; regulatory limit: 5). However, according to California regulations, excessive levels of copper (up to 3892 mg/kg; limit: 2500), Pb (up to 8103 mg/kg; limit: 1000), nickel (up to 4797 mg/kg; limit: 2000), or silver (up to 721 mg/kg; limit: 500) render all except low-intensity yellow LEDs hazardous. The environmental burden associated with resource depletion potentials derives primarily from gold and silver, whereas the burden from toxicity potentials is associated primarily with arsenic, copper, nickel, lead, iron, and silver. Establishing benchmark levels of these substances can help manufacturers implement design for environment through informed materials substitution, can motivate recyclers and waste management teams to recognize resource value and occupational hazards, and can inform policymakers who establish waste management policies for LEDs.
Kumar, Parmeshwar; Jithesh, V; Gupta, Shakti Kumar
2016-07-01
Although Intensive Care Units (ICUs) only account for 10% of the hospital beds, they consume nearly 22% of the hospital resources. Few definitive costing studies have been conducted in Indian settings that would help determine appropriate resource allocation. The aim of this study was to evaluate and compare the cost of intensive care delivery between multispecialty and neurosurgery ICUs at an apex trauma care facility in India. The study was conducted in a polytrauma and neurosurgery ICU at a 203-bedded Level IV trauma care facility in New Delhi, India, from May 1, 2012 to June 30, 2012. The study was cross-sectional, retrospective, and record-based. Traditional costing was used to arrive at both direct and indirect cost estimates. The cost centers included in the study were building cost, equipment cost, human resources, materials and supplies, clinical and nonclinical support services, engineering maintenance cost, and biomedical waste management. Statistical analysis was performed by Fisher's two-tailed t-test. The total cost/bed/day was Rs. 14,976.9/- for the multispecialty ICU and Rs. 14,306.7/- for the neurosurgery ICU, with workforce constituting nearly half of the expenditure in both ICUs. The cost-center-wise and overall differences in cost between the ICUs were statistically significant. Quantification of the expenditure involved in running an ICU in a trauma center would assist health-care decision makers in better allocation of resources. Although multispecialty ICUs are more cost-effective, other factors will also play a role in defining the kind of ICU that needs to be designed.
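The traditional costing arithmetic implied above amounts to summing cost-center totals and normalizing by occupied bed-days. A minimal sketch with placeholder figures, not data from the study:

```python
# Hypothetical cost-center totals for one ICU over a study period (in rupees)
cost_centers = {
    "building": 1_200_000,
    "equipment": 2_500_000,
    "human_resources": 6_800_000,
    "materials_supplies": 2_100_000,
    "support_services": 1_400_000,
    "maintenance": 500_000,
    "biomedical_waste": 150_000,
}
occupied_bed_days = 980  # hypothetical

total_cost = sum(cost_centers.values())
print(f"cost per bed-day = Rs. {total_cost / occupied_bed_days:,.1f}")
```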
The cellulose resource matrix.
Keijsers, Edwin R P; Yılmaz, Gülden; van Dam, Jan E G
2013-03-01
The emerging biobased economy is causing shifts from mineral and fossil oil-based resources towards renewable resources. Because of market mechanisms, current and new industries utilising renewable commodities will attempt to secure their supply of resources. Cellulose is among these commodities, where large-scale competition can be expected and is already observed for traditional industries such as the paper industry. Cellulose and lignocellulosic raw materials (like wood and non-wood fibre crops) are being utilised in many industrial sectors. Due to the initiated transition towards a biobased economy, these raw materials are also being intensively investigated for new applications such as 2nd generation biofuels and 'green' chemicals and materials production (Clark, 2007; Lange, 2007; Petrus & Noordermeer, 2006; Ragauskas et al., 2006; Regalbuto, 2009). As lignocellulosic raw materials are available in variable quantities and qualities, unnecessary competition can be avoided via the choice of suitable raw materials for a target application. For example, utilisation of cellulose as a carbohydrate source for ethanol production (Kabir Kazi et al., 2010) avoids the discussed competition with more easily digestible carbohydrates (sugars, starch) derived from the food supply chain. For cellulose use as a biopolymer, several different competing markets can also be distinguished. It is clear that these applications and markets will be influenced by large volume shifts. The world will have to reckon with the increase of competition and feedstock shortage (land use/biodiversity) (van Dam, de Klerk-Engels, Struik, & Rabbinge, 2005). It is of interest - in the context of sustainable development of the bioeconomy - to categorize the already available and emerging lignocellulosic resources in a matrix structure. When composing such a "cellulose resource matrix", attention should be given to the quality aspects as well as to the available quantities and practical possibilities of processing the feedstock and the performance in the end-application. The cellulose resource matrix should become a practical tool for stakeholders to make choices regarding raw materials, process or market. Although there is a vast amount of scientific and economic information available on cellulose and lignocellulosic resources, the accessibility for the interested layman or entrepreneur is poor and the relevance of the numerous details in the larger context is limited. Translation of science to practical, accessible information with modern data management and data integration tools is a challenge. Therefore, a detailed matrix structure was composed in which the different elements or entries of the matrix were identified and a tentative rough set-up was made. The inventory includes current commodities and new cellulose-containing raw materials as well as exotic sources and specialties. Important chemical and physical properties of the different raw materials were identified for use in processes and products. When available, market data such as price and availability were recorded. Established and innovative cellulose extraction and refining processes were reviewed. The demands on the raw material for suitable processing were collected. Processing parameters known to affect the cellulose properties were listed. Current and expected emerging markets were surveyed as well as their different demands on cellulose raw materials and processes. The setting up of the cellulose matrix as a practical tool requires two steps. 
Firstly, the reduction of the needed data by clustering of the characteristics of raw materials, processes and markets and secondly, the building of a database that can provide the answers to the questions from stakeholders with an indicative character. This paper describes the steps taken to achieve the defined clusters of most relevant and characteristic properties. These data can be expanded where required. More detailed specification can be obtained from the background literature and handbooks. Where gaps of information are identified, the research questions can be defined that will require further investigation. Copyright © 2012 Elsevier Ltd. All rights reserved.
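One way to hold such a matrix is a simple table of raw materials against characteristic properties, which can then be clustered to reduce the data, as described above. The rows and values below are illustrative placeholders, not entries from the inventory:

```python
import pandas as pd
from sklearn.cluster import KMeans

# Illustrative (hypothetical) rows of a cellulose resource matrix:
# raw material vs. a few characteristic properties
matrix = pd.DataFrame(
    {
        "cellulose_content_pct": [42, 45, 70, 90, 35],
        "fibre_length_mm": [1.0, 2.5, 25.0, 30.0, 0.8],
        "price_eur_per_t": [80, 110, 550, 1500, 60],
    },
    index=["wheat straw", "softwood pulpwood", "flax fibre", "cotton linters", "bagasse"],
)

# Cluster raw materials by their characteristics to reduce the data,
# a first step towards matching resources to processes and markets
matrix["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(matrix)
print(matrix)
```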
Neural processing of emotional-intensity predicts emotion regulation choice.
Shafir, Roni; Thiruchselvam, Ravi; Suri, Gaurav; Gross, James J; Sheppes, Gal
2016-12-01
Emotional-intensity is a core characteristic of affective events that strongly determines how individuals choose to regulate their emotions. Our conceptual framework suggests that in high emotional-intensity situations, individuals prefer to disengage attention using distraction, which can more effectively block highly potent emotional information, as compared with engagement reappraisal, which is preferred in low emotional-intensity situations. However, existing supporting evidence remains indirect because prior intensity categorization of emotional stimuli was based on subjective measures that are potentially biased and only represent the endpoint of emotional-intensity processing. Accordingly, this study provides the first direct evidence for the role of online emotional-intensity processing in predicting behavioral regulatory-choices. Utilizing the high temporal resolution of event-related potentials, we evaluated online neural processing of stimuli's emotional-intensity (late positive potential, LPP) prior to regulatory-choices between distraction and reappraisal. Results showed that enhanced neural processing of intensity (enhanced LPP amplitudes) uniquely predicted (above subjective measures of intensity) an increased tendency to subsequently choose distraction over reappraisal. Additionally, regulatory-choices led to adaptive consequences, as demonstrated by the finding that actual implementation of distraction relative to reappraisal-choice resulted in stronger attenuation of LPPs and self-reported arousal. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
Optimisation of logistics processes of energy grass collection
NASA Astrophysics Data System (ADS)
Bányai, Tamás.
2010-05-01
The collection of energy grass is a logistics-intensive process [1]. The optimal design and control of transportation and collection subprocesses is a critical point of the supply chain. To avoid ill-founded decisions based only on experience and intuition, the optimisation and analysis of collection processes based on mathematical models and methods is the scientifically sound way forward. Within the frame of this work, the author focuses on the optimisation possibilities of the collection processes, especially from the point of view of transportation and related warehousing operations. Although the optimisation methods developed in the literature [2] take into account harvesting processes, county-specific yields, transportation distances, erosion constraints, machinery specifications, and other key variables, the possibility of multiple collection points and multi-level collection has not been taken into consideration. The possible areas of use for energy grass are very wide (energy production, biogas and bioalcohol production, paper and textile industry, industrial fibre material, foddering purposes, biological soil protection [3], etc.), so not only a single-level but also a multi-level collection system with multiple collection and production facilities has to be taken into consideration. The input parameters of the optimisation problem are the following: total amount of energy grass to be harvested in each region; specific facility costs of collection, warehousing and production units; specific costs of transportation resources; pre-scheduling of the harvesting process; specific transportation and warehousing costs; and pre-scheduling of the processing of energy grass at each facility (exclusive warehousing). The model takes into consideration the following assumptions: (1) cooperative relations exist among processing and production facilities; (2) capacity constraints are not ignored; (3) the cost function of transportation is non-linear; (4) drivers' conditions are ignored. The objective function of the optimisation is the maximisation of profit, which means the maximisation of the difference between revenue and cost. The objective function trades off the income of the assigned transportation demands against the logistics costs. The constraints are the following: (1) the free capacity of the assigned transportation resource is greater than the requested capacity of the transportation demand; (2) the calculated arrival time of the transportation resource at the harvesting place is not later than the requested arrival time; (3) the calculated arrival time of the transportation demand at the processing and production facility is not later than the requested arrival time; (4) one transportation demand is assigned to one transportation resource and one resource is assigned to one transportation demand. The decision variables of the optimisation problem are the set of scheduling variables and the assignment of resources to transportation demands. The evaluation parameters of the optimised system are the following: total costs of the collection process; utilisation of transportation resources and warehouses; and efficiency of production and/or processing facilities. The multidimensional heuristic optimisation method is based on a genetic algorithm, while the routing sequence is optimised using an ant colony algorithm. 
The optimal routes are calculated with the aid of the ant colony algorithm as a subroutine of the global optimisation method, and the optimal assignment is given by the genetic algorithm. One important part of the mathematical method is the sensitivity analysis of the objective function, which shows the degree of influence of the different input parameters. Acknowledgements This research was implemented within the frame of the project entitled "Development and operation of the Technology and Knowledge Transfer Centre of the University of Miskolc", with support from the European Union and co-funding from the European Social Fund. References [1] P. R. Daniel: The Economics of Harvesting and Transporting Corn Stover for Conversion to Fuel Ethanol: A Case Study for Minnesota. University of Minnesota, Department of Applied Economics. 2006. http://ideas.repec.org/p/ags/umaesp/14213.html [2] T. G. Douglas, J. Brendan, D. Erin & V.-D. Becca: Energy and Chemicals from Native Grasses: Production, Transportation and Processing Technologies Considered in the Northern Great Plains. University of Minnesota, Department of Applied Economics. 2006. http://ideas.repec.org/p/ags/umaesp/13838.html [3] Homepage of energygrass. www.energiafu.hu
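A highly simplified sketch of the assignment layer described above: a plain genetic algorithm assigns transportation demands to resources under capacity constraints with a non-linear transport cost. All demands, capacities, and cost parameters are made up, and the ant colony routing subroutine is omitted:

```python
import random

random.seed(1)

# Hypothetical problem data: transportation demands (tonnes) and resource capacities (tonnes)
demands = [12, 8, 15, 6, 10, 9]
capacities = [20, 25, 18]
revenue_per_t = 40.0

def transport_cost(load):
    # Non-linear cost function (illustrative): fixed trip cost plus a convex load term
    return 0.0 if load == 0 else 50.0 + 0.8 * load ** 1.3

def profit(assignment):
    # assignment[i] = index of the resource serving demand i
    loads = [0.0] * len(capacities)
    for d, r in zip(demands, assignment):
        loads[r] += d
    if any(l > c for l, c in zip(loads, capacities)):
        return float("-inf")          # capacity constraint violated
    return revenue_per_t * sum(demands) - sum(transport_cost(l) for l in loads)

def mutate(assignment):
    child = assignment[:]
    child[random.randrange(len(child))] = random.randrange(len(capacities))
    return child

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

# Plain genetic algorithm over demand-to-resource assignments
population = [[random.randrange(len(capacities)) for _ in demands] for _ in range(30)]
for _ in range(200):
    population.sort(key=profit, reverse=True)
    parents = population[:10]
    population = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                            for _ in range(20)]

best = max(population, key=profit)
print("best assignment:", best, "profit:", round(profit(best), 1))
```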
Ozcan, Seyda; Rogers, Helen; Choudhary, Pratik; Amiel, Stephanie A; Cox, Alison; Forbes, Angus
2013-01-01
Context: Providing effective support for patients in using insulin is essential for good diabetes care. For that support to be effective, it must reflect and attend to the needs of patients. Purpose: To explore the perspectives of adult type 1 diabetes patients on their current diabetes care in order to generate ideas for creating a new patient-centered intensive insulin clinic. Methods: A multi-method approach was used, comprising: an observational exercise of current clinical care; three focus groups (n = 17); and a survey of service users (n = 419) to test the ideas generated from the observational exercise and focus groups (rated 1 to 5 in terms of importance). The ideas generated by the multi-method approach were organized thematically and mapped onto the Chronic Care Model (CCM). Results: The themes and preferences for service redesign in relation to CCM components were: health care organization, there was an interest in having enhanced systems for sharing clinical information; self-management support, patients would like more flexible and easy-to-access resources and more help with diabetes technology and psychosocial support; delivery system design and clinical information systems, the need for greater integration of care and better use of clinic time; productive relationships, participants would like more continuity; access to health professionals, patient involvement and care planning. The findings from the patient survey indicate high preferences for most of the areas for service enhancement identified in the focus groups and observational exercise. Clinical feedback and professional continuity (median = 5, interquartile range = 1) were the most highly rated. Conclusion: The patient consultation process generated important ideas on how the clinical team and service can improve the care provided. Key areas for service development were: a stronger emphasis on collaborative care planning; improved patient choice in the use of health technology; more resources for self-management support; and a more explicit format for the process of care in the clinic. PMID:23776329
Shi, Yinxian; Hu, Huabin; Xu, Youkai; Liu, Aizhong
2014-09-24
The genus Ficus, collectively known as figs, is a key component of tropical forests and is well known for its ethnobotanical importance. In recent decades an increasing number of studies have documented indigenous knowledge about wild edible Ficus species and their culinary or medicinal value. However, rather little is known about the role of these species in rural livelihoods, because of both species and cultural diversity. In this study we 1) collected the species and ethnic names of wild edible Ficus exploited by four cultural groups in Xishuangbanna, Southwest China, 2) recorded the collection activities and modes of consumption through semi-structured interviews, 3) investigated the resource management by a statistical survey of their field distribution and cultivation, and 4) compared and estimated the usage intensities by the grading method. The young leaves, leaf buds and young or ripe syconia of 13 Ficus species or varieties are traditionally consumed. All the species had fixed and usually food-related ethnic names. All four cultural groups are experienced in the collection and use of edible Ficus species as vegetables, fruits or beverages, with the surplus sold for cash income. Different cultural groups use the Ficus species at different intensities because of differences in availability, forest dependency and cultural factors. Both the mountain and basin villagers make an effort to realize sustainable collection and meet their own and market needs through resource management in situ or cultivation. In comparison with reports from other parts of the world, ethnic groups in Xishuangbanna exploited more edible Ficus species for young leaves or leaf buds. Most of the edible species undergo a gradient of management intensities following a gradient of manipulation from simple field gathering to ex situ cultivation. This study contributes to our understanding of the origins and diffusion of knowledge about perceiving, using and managing a particular group of plant species, and of how local cultural, economic and geographical factors influence this process.
The medical science DMZ: a network design pattern for data-intensive medical science.
Peisert, Sean; Dart, Eli; Barnett, William; Balas, Edward; Cuff, James; Grossman, Robert L; Berman, Ari; Shankar, Anurag; Tierney, Brian
2017-10-06
We describe a detailed solution for maintaining high-capacity, data-intensive network flows (eg, 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. High-end networking, packet-filter firewalls, network intrusion-detection systems. We describe a "Medical Science DMZ" concept as an option for secure, high-volume transport of large, sensitive datasets between research institutions over national research networks, and give 3 detailed descriptions of implemented Medical Science DMZs. The exponentially increasing amounts of "omics" data, high-quality imaging, and other rapidly growing clinical datasets have resulted in the rise of biomedical research "Big Data." The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large datasets. Maintaining data-intensive flows that comply with the Health Insurance Portability and Accountability Act (HIPAA) and other regulations presents a new challenge for biomedical research. We describe a strategy that marries performance and security by borrowing from and redefining the concept of a Science DMZ, a framework that is used in physical sciences and engineering research to manage high-capacity data flows. By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
Treeline proximity alters an alpine plant-herbivore interaction.
Illerbrun, Kurt; Roland, Jens
2011-05-01
Rising treeline threatens the size and contiguity of alpine meadows worldwide. As trees encroach into previously open habitat, the movement and population dynamics of above-treeline alpine species may be disrupted. This process is well documented in studies of the Rocky Mountain apollo butterfly (Parnassius smintheus). However, subtler consequences of treeline rise remain poorly understood. In this study, we examine whether treeline proximity affects feeding behaviour of P. smintheus larvae, due to altered habitat affecting the distribution and availability of their host plant, lance-leaved stonecrop (Sedum lanceolatum). Understanding differential larval exploitation of food resources in relation to the treeline is an important step in predicting the consequences of continued treeline rise. Parnassius smintheus larvae feed more intensively on S. lanceolatum away from the treeline despite the relative paucity of hosts in these areas, and despite higher fitness penalties associated with the plant's herbivory-induced chemical defenses. Sedum lanceolatum growing near the treeline is less attractive, and therefore represents a less significant resource for P. smintheus larvae than its abundance might imply. If treeline rise continues, we suggest that this pattern of altered resource exploitation may represent a mechanism by which larvae are adversely affected even while adult movement among and within meadows appears sufficient for maintaining population health, and total host availability seems ample.
The Physical Economy of the United States of America
Gierlinger, Sylvia; Krausmann, Fridolin
2012-01-01
The United States is not only the world's largest economy, but it is also one of the world's largest consumers of natural resources. The country, which is inhabited by some 5% of the world's population, uses roughly one-fifth of the global primary energy supply and 15% of all extracted materials. This article explores long-term trends and patterns of material use in the United States. Based on a material flow account (MFA) that is fully consistent with current standards of economy-wide MFAs and covers domestic extraction, imports, and exports of materials for a 135-year period, we investigated the evolution of the U.S. industrial metabolism. This process was characterized by an 18-fold increase in material consumption, a multiplication of material use per capita, and a shift from renewable biomass toward mineral and fossil resources. In spite of considerable improvements in material intensity, no dematerialization has happened so far; in contrast to other high-income countries, material use has not stabilized since the 1970s, but has continued to grow. This article compares patterns and trends of material use in the United States with those in Japan and the United Kingdom and discusses the factors underlying the disproportionately high level of U.S. per capita resource consumption. PMID:24436632
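The economy-wide MFA indicators referenced above follow standard definitions (domestic material consumption = domestic extraction + imports − exports; material intensity = DMC per unit GDP). A minimal sketch with placeholder figures, not the study's data:

```python
# Standard economy-wide MFA indicators (placeholder numbers, not the study's data)
domestic_extraction_mt = 6000.0   # million tonnes per year
imports_mt = 800.0
exports_mt = 600.0
gdp_billion_usd = 15000.0
population_million = 310.0

dmc = domestic_extraction_mt + imports_mt - exports_mt  # domestic material consumption [Mt/yr]
material_intensity = dmc / gdp_billion_usd              # kilograms per US dollar
dmc_per_capita = dmc / population_million               # tonnes per person per year

print(f"DMC = {dmc:.0f} Mt, intensity = {material_intensity:.2f} kg/USD, "
      f"per capita = {dmc_per_capita:.1f} t")
```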
NASA Astrophysics Data System (ADS)
Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal
2015-05-01
When running data-intensive applications on distributed computational resources, long I/O overheads may be observed as access to remotely stored data is performed. Latencies and bandwidth can become the major limiting factor for the overall computation performance and can reduce the CPU/WallTime ratio through excessive I/O wait. Building on the knowledge from our previous research, we propose a constraint programming based planner that schedules computational jobs and data placements (transfers) in a distributed environment in order to optimize resource utilization and reduce the overall processing completion time. The optimization is achieved by ensuring that none of the resources (network links, data storages and CPUs) are oversaturated at any moment of time and either (a) that the data is pre-placed at the site where the job runs or (b) that the jobs are scheduled where the data is already present. Such an approach eliminates the idle CPU cycles occurring while a job waits for I/O from a remote site and would have wide application in the community. Our planner was evaluated and simulated based on data extracted from log files of the batch and data management systems of the STAR experiment. The results of the evaluation and the estimated performance improvements are discussed in this paper.
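A deliberately crude, greedy sketch of the co-scheduling idea: jobs run where their data already sit if a CPU is free, otherwise a transfer is planned only if the link has spare bandwidth. Sites, jobs, and bandwidths are made up, and the actual planner uses constraint programming rather than this greedy rule:

```python
# Toy co-scheduling of jobs and data placement across sites (illustrative only)
sites = {
    "siteA": {"free_cpus": 4, "has_data": {"d1", "d2"}},
    "siteB": {"free_cpus": 1, "has_data": {"d3"}},
}
link_free_mbps = {("siteA", "siteB"): 500, ("siteB", "siteA"): 500}
jobs = [("job1", "d1"), ("job2", "d3"), ("job3", "d2"), ("job4", "d3")]
transfer_cost_mbps = 100  # bandwidth reserved per planned data transfer

plan = []
for job, dataset in jobs:
    # Prefer a site that already holds the data and has a free CPU
    local = [s for s, v in sites.items() if dataset in v["has_data"] and v["free_cpus"] > 0]
    if local:
        site = local[0]
        plan.append((job, site, "local data"))
    else:
        # Otherwise pre-place the data on a site with free CPUs, if a link has spare bandwidth
        candidates = [s for s, v in sites.items() if v["free_cpus"] > 0]
        if not candidates:
            plan.append((job, None, "deferred: no free CPUs"))
            continue
        site = candidates[0]
        src = next(s for s, v in sites.items() if dataset in v["has_data"])
        if link_free_mbps.get((src, site), 0) < transfer_cost_mbps:
            plan.append((job, None, "deferred: link saturated"))
            continue
        link_free_mbps[(src, site)] -= transfer_cost_mbps
        sites[site]["has_data"].add(dataset)
        plan.append((job, site, f"transfer {dataset} from {src}"))
    sites[site]["free_cpus"] -= 1

for entry in plan:
    print(entry)
```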
Fort Stewart integrated resource assessment. Volume 1, Executive summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, L.L.; Keller, J.M.
1993-10-01
The US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), has developed a model program that provides a systematic approach to evaluating energy opportunities that (1) identifies the building groups and end uses that use the most energy (not just have the greatest energy-use intensity), and (2) evaluates the numerous options for retrofit or installation of new technology that will result in the selection of the most cost-effective technologies. In essence, this model program provides the federal energy manager with a roadmap to significantly reduce energy use in a planned, rational, cost-effective fashion that is not biased by the constraints of the typical funding sources available to federal sites. The results from this assessment process can easily be turned into a five- to ten-year energy management plan that identifies where to start and how to proceed in order to reach the mandated energy consumption targets. This report provides the results of the fossil fuel and electric energy resource opportunity (ERO) assessments performed by PNL at the US Army US Forces Command (FORSCOM) Fort Stewart facility located approximately 25 miles southwest of Savannah, Georgia. It is a companion report to Volume 2, Baseline Detail, and Volume 3, Resource Assessment.
NASA Astrophysics Data System (ADS)
Newcomer, M. E.; Gurdak, J. J.
2011-12-01
Groundwater resources in urban, coastal environments are highly vulnerable to increased human pressures and climate variability. Impervious surfaces, such as buildings, roads, and parking lots, prevent infiltration, reduce recharge to underlying aquifers, and increase contaminants in surface runoff that often overflow sewage systems. To mitigate these effects, cities worldwide are adopting low impact design (LID) approaches that direct runoff into natural vegetated systems, such as rain gardens, that reduce, filter, and slow stormwater runoff and are hypothesized to increase infiltration and recharge rates to aquifers. The effects of LID on recharge rates and quality are unknown, particularly during intense precipitation events for cities along the Pacific coast in response to interannual variability of the El Niño Southern Oscillation (ENSO). Using vadose zone monitoring sensors and instruments, I collected and monitored soil, hydraulic, and geochemical data to quantify the rates and quality of infiltration and recharge to the California Coastal aquifer system beneath a LID rain garden and a traditional turf-lawn setting in San Francisco, CA. The data were used to calibrate a HYDRUS-3D model to simulate recharge rates under historical and future variability of ENSO. Understanding these processes has important implications for managing groundwater resources in urban, coastal environments.
AMBIT RESTful web services: an implementation of the OpenTox application programming interface.
Jeliazkova, Nina; Jeliazkov, Vedrin
2011-05-16
The AMBIT web services package is one of several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation, or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions without installing any software, as well as to share datasets and models online. The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing, or in a fully independent way, according to the specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application, but also in a network of distributed services. Last, but not least, the services provide a basis for building web mashups, end user applications with friendly GUIs, as well as embedding the functionalities in existing workflow systems.
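The "read data from a web address, perform processing, write to a web address" paradigm can be illustrated with a generic REST call. The URL below is a placeholder following the OpenTox-style resource pattern, not a real AMBIT endpoint:

```python
import requests

# Placeholder resource URL following the OpenTox-style pattern (not a real AMBIT endpoint)
compound_url = "https://example.org/ambit/compound/12345"

# "Read data from a web address": request the RDF representation of a compound resource
response = requests.get(compound_url, headers={"Accept": "application/rdf+xml"}, timeout=30)
print(response.status_code, response.headers.get("Content-Type"))

# "Write to a web address": POSTing a dataset to a resource URL would trigger processing
# (left commented out because the endpoint and file here are placeholders)
# requests.post("https://example.org/ambit/dataset", files={"file": open("compounds.sdf", "rb")})
```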
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chima, C.M.
This study evaluates the commercial energy sector of the Economic Community of West African States (ECOWAS). Presently, an economic union exists between the 16 countries of West Africa that are members of ECOWAS. Although the ECOWAS region has plentiful resources of commercial energy, it faces problems in this sector for two reasons. First is the problem resulting from the diminishing traditional energy resources such as wood fuel and charcoal. Second, most ECOWAS members, except Nigeria, are net importers of commercial energy, and hence face a high import burden for oil. Liquid petroleum is the dominant form of commercial energy used in the ECOWAS despite the availability of other resources. The author argues that the best policy and strategy for dealing with energy problems is a combination of regional cooperative effort and more intensive country-level effort. The intensity-of-use hypothesis is tested with case studies of Ghana, the Ivory Coast, and Nigeria. The results indicate that newly developing countries can deviate from the expectations of the hypothesis.
The RAFAELA system: a workforce planning tool for nurse staffing and human resource management.
Fagerström, Lisbeth; Lønning, Kjersti; Andersen, Marit Helen
2014-05-01
The RAFAELA system was developed in Finland during the 1990s to help with the systematic and daily measurement of nursing intensity (NI) and allocation of nursing staff. The system has now been rolled out across almost all hospitals in Finland, and implementation has started elsewhere in Europe and Asia. This article describes the system, which aims to uphold staffing levels in accordance with patients' care needs, and its structure, which consists of three parts: the Oulu Patient Classification instrument; registration of available nursing resources; and the Professional Assessment of Optimal Nursing Care Intensity Level method, as an alternative to classical time studies. The article also highlights the benefits of using a systematic measurement of NI.
Electrophysiological Evidence for Hyperfocusing of Spatial Attention in Schizophrenia.
Kreither, Johanna; Lopez-Calderon, Javier; Leonard, Carly J; Robinson, Benjamin M; Ruffle, Abigail; Hahn, Britta; Gold, James M; Luck, Steven J
2017-04-05
A recently proposed hyperfocusing hypothesis of cognitive dysfunction in schizophrenia proposes that people with schizophrenia (PSZ) tend to concentrate processing resources more narrowly but more intensely than healthy control subjects (HCS). The present study tests a key prediction of this hypothesis, namely, that PSZ will hyperfocus on information presented at the center of gaze. This should lead to greater filtering of peripheral stimuli when the task requires focusing centrally but reduced filtering of central stimuli when the task requires attending broadly in the periphery. These predictions were tested in a double oddball paradigm, in which frequent standard stimuli and rare oddball stimuli were presented at central and peripheral locations while event-related potentials were recorded. Participants were instructed to discriminate between the standard and oddball stimuli at either the central location or at the peripheral locations. PSZ and HCS showed opposite patterns of spatial bias at the level of early sensory processing, as assessed with the P1 component: PSZ exhibited stronger sensory suppression of peripheral stimuli when the task required attending narrowly to the central location, whereas HCS exhibited stronger sensory suppression of central stimuli when the task required attending broadly to the peripheral locations. Moreover, PSZ exhibited a stronger stimulus categorization response than HCS, as assessed with the P3b component, for central stimuli when the task required attending to the peripheral region. These results provide strong evidence of hyperfocusing in PSZ, which may provide a unified mechanistic account of multiple aspects of cognitive dysfunction in schizophrenia. SIGNIFICANCE STATEMENT Schizophrenia clearly involves impaired attention, but attention is complex, and delineating the precise nature of attentional dysfunction in schizophrenia has been difficult. The present study tests a new hyperfocusing hypothesis, which proposes that people with schizophrenia (PSZ) tend to concentrate processing resources more intensely but more narrowly than healthy control subjects (HCS). Using electrophysiological measures of sensory and cognitive processing, we found that PSZ were actually superior to HCS in focusing attention at the point of gaze and filtering out peripheral distractors when the task required a narrow focusing of attention. This finding of superior filtering in PSZ supports the hyperfocusing hypothesis, which may provide the mechanism underlying a broad range of cognitive impairments in schizophrenia. Copyright © 2017 the authors 0270-6474/17/373813-11$15.00/0.
Developing a Logistics Data Process for Support Equipment for NASA Ground Operations
NASA Technical Reports Server (NTRS)
Chakrabarti, Suman
2010-01-01
The United States NASA Space Shuttle has long been considered an extremely capable yet relatively expensive rocket. A great part of the roughly US $500 million per launch expense was the support footprint: refurbishment and maintenance of the space shuttle system, together with the long list of resources required to support it, including personnel, tools, facilities, transport and support equipment. NASA determined to give its next rocket system a smaller logistics footprint, and thereby lower cost and quicker turnaround. The logical solution was to adopt a standard Logistics Support Analysis (LSA) process based on GEIA-STD-0007 (http://www.logisticsengineers.org/may09pres/GEIASTD0007DEXShortIntro.pdf), the successor of MIL-STD-1388-2B, which is widely used by U.S., NATO, and other military services and industries. This approach is unprecedented at NASA: it is the first time a major program of programs, Project Constellation, is factoring logistics and supportability into design at many levels. This paper focuses on one of those levels, NASA ground support equipment for the next generation of NASA rockets, and on building a Logistics Support Analysis Record (LSAR) for developing and documenting a support solution and an inventory of resources. This LSAR is actually a standards-based database, containing analyses of the time and tools, personnel, facilities and support equipment required to assemble and integrate the stages and umbilicals of a rocket. The paper covers building this database from scratch: creating and importing a hierarchical bill of materials (BOM) from legacy data; identifying line-replaceable units (LRUs) of a given piece of equipment; analyzing the reliability and maintainability of those LRUs; and feeding an assessment back to design as to whether the support solution for a piece of equipment is too much work, i.e., too resource-intensive. If an LRU must be replaced or inspected too often, a modification of the equipment design may make such operational effort unnecessary. Finally, the paper addresses tying resources to a timeline of tasks performed in ground operations, which enables various overarching analyses, e.g., a summarization of all resources used for a given piece of equipment. Quality control of data is also discussed, including importing and exporting data from product teams via spreadsheet-to-database transfers or data exchange between databases.
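The abstract above describes the core LSAR content (a hierarchical BOM, LRU identification, and reliability/maintainability roll-ups) without showing a data model. The following is a minimal, hypothetical sketch of that idea in Python; the class and field names (Part, mtbf_hours, repair_labor_hours) are illustrative assumptions and do not reflect the GEIA-STD-0007 schema or NASA's actual LSAR database.
```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Part:
    """One node in a hierarchical bill of materials (BOM)."""
    name: str
    is_lru: bool = False              # line-replaceable unit?
    mtbf_hours: float = 0.0           # mean time between failures
    repair_labor_hours: float = 0.0   # labor to remove and replace this item
    children: List["Part"] = field(default_factory=list)

def expected_annual_labor(part: Part, operating_hours: float = 2000.0) -> float:
    """Roll up expected maintenance labor per year for a part and all of its children."""
    own = 0.0
    if part.is_lru and part.mtbf_hours > 0:
        failures_per_year = operating_hours / part.mtbf_hours
        own = failures_per_year * part.repair_labor_hours
    return own + sum(expected_annual_labor(c, operating_hours) for c in part.children)

# Hypothetical umbilical arm assembly with two LRUs.
umbilical = Part("Umbilical arm", children=[
    Part("Quick-disconnect valve", is_lru=True, mtbf_hours=500, repair_labor_hours=6),
    Part("Actuator", is_lru=True, mtbf_hours=4000, repair_labor_hours=12),
])
print(f"Expected labor: {expected_annual_labor(umbilical):.1f} h/year")
```
Rolling expected labor hours up the BOM tree is the kind of summarization that would let an analyst flag an overly resource-intensive support solution and feed that assessment back to design.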
Boucek, Dana M; Lal, Ashwin K; Eckhauser, Aaron W; Weng, Hsin-Yi Cindy; Sheng, Xiaoming; Wilkes, Jacob F; Pinto, Nelangi M; Menon, Shaji C
2018-04-15
Pediatric heart transplantation (HT) is resource intensive. Event-driven pediatric databases do not capture data on resource use. The objective of this study was to evaluate resource utilization and identify associated factors during initial hospitalization for pediatric HT. This multicenter retrospective cohort study utilized the Pediatric Health Information Systems database (43 children's hospitals in the United States) of children ≤19 years of age who underwent transplant between January 2007 and July 2013. Demographic variables including site, payer, distance and time to center, clinical pre- and post-transplant variables, mortality, cost, and charge were the data collected. Total length of stay (LOS) and charge for the initial hospitalization were used as surrogates for resource use. Charges were inflation adjusted to 2013 dollars. Of 1,629 subjects, 54% were male, and the median age at HT was 5 years (IQR [interquartile range] 0 to 13). The median total and intensive care unit LOS were 51 (IQR 23 to 98) and 23 (IQR 9 to 58) days, respectively. Total charge and cost for hospitalization were $852,713 ($464,900 to $1,609,300) and $383,600 ($214,900 to $681,000) respectively. Younger age, lower volume center, southern region, and co-morbidities before transplant were associated with higher resource use. In later years, charges increased despite shorter LOS. In conclusion, this large multicenter study provides novel insight into factors associated with resource use in pediatric patients having HT. Peritransplant morbidities are associated with increased cost and LOS. Reducing costs in line with LOS will improve health-care value. Regional and center volume differences need further investigation for optimizing value-based care and efficient use of scarce resources. Copyright © 2018 Elsevier Inc. All rights reserved.
Water and water use in southern Nevada [Chapter 3] (Executive Summary)
Wayne R. Belcher; Michael J. Moran; Megan E. Rogers
2013-01-01
Water and water use in southern Nevada is an important issue. The scarcity of water resources for both human and biologic communities often leads to intense competition for both surface and ground waters. Anthropogenic and climate change impacts on scarce water resources need to be understood to assess human and ecosystem health for southern Nevada. Chapter 3 outlines...
How Is That Done? Student Views on Resources Used outside the Engineering Classroom
ERIC Educational Resources Information Center
Maclaren, Peter
2018-01-01
While the traditional lecture remains a key feature in the teaching of mathematically intensive disciplines at a tertiary level, what students do outside class, the resources they use, and how they use them are critical factors in their success. This study reports on a survey of students studying a range of engineering subjects, giving their views…
ERIC Educational Resources Information Center
Omer, Selma; Hickson, Gilles; Tache, Stephanie; Blind, Raymond; Masters, Susan; Loeser, Helen; Souza, Kevin; Mkony, Charles; Debas, Haile; O'Sullivan, Patricia
2008-01-01
Teaching large classes is often challenging, particularly when faculty and teaching resources are limited. Innovative, less staff-intensive ways need to be explored to enhance teaching and to engage students. We describe our experience teaching biochemistry to 350 students at Muhimbili University of Health and Allied Sciences (MUHAS) under…
Michael R. Vanderberg; Mary Beth Adams; Mark S. Wiseman
2012-01-01
Forests are important economic and ecological resources for both the Appalachian hardwood forest region and the country. Increased demand for woody biomass can be met, at least in part, by improved utilization of these resources. However, concerns exist about the impacts of increased intensity of woody biomass removal on the sustainability of forest ecosystems....
Budding, Karin E.; Kluender, Steven E.
1983-01-01
The depth of several thousand feet at which coal may underlie the surface rocks of the study area makes it a resource with little likelihood of development. The potential for oil and gas appears low because of the apparent lack of structural traps and the intense igneous activity in the area.
A Cost-Benefit Study of Doing Astrophysics On The Cloud: Production of Image Mosaics
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Good, J. C.; Deelman, E.; Singh, G.; Livny, M.
2009-09-01
Utility grids such as the Amazon EC2 and Amazon S3 clouds offer computational and storage resources that can be used on demand for a fee by compute- and data-intensive applications. The cost of running an application on such a cloud depends on the compute, storage and communication resources it will provision and consume. Different execution plans of the same application may result in significantly different costs. We studied via simulation the cost-performance trade-offs of different execution and resource provisioning plans by creating, under the Amazon cloud fee structure, mosaics with the Montage image mosaic engine, a widely used data- and compute-intensive application. Specifically, we studied the cost of building mosaics of 2MASS data that have sizes of 1, 2 and 4 square degrees, and a 2MASS all-sky mosaic. These are examples of mosaics commonly generated by astronomers. We also studied these trade-offs in the context of the storage and communication fees of Amazon S3 when used for long-term application data archiving. Our results show that by provisioning the right amount of storage and compute resources, cost can be significantly reduced with no significant impact on application performance.
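The cost trade-off described above reduces, at its simplest, to a sum of compute, storage and transfer charges. The sketch below illustrates that arithmetic for two hypothetical execution plans; the rates and resource figures are placeholders, not the Amazon fee schedule or the Montage workloads used in the study.
```python
def mosaic_run_cost(cpu_hours, storage_gb_months, transfer_out_gb,
                    ec2_rate=0.10, s3_rate=0.15, transfer_rate=0.17):
    """Estimate the cost of one mosaic workflow run on a utility cloud.
    All rates are illustrative placeholders, not Amazon's actual fees."""
    compute = cpu_hours * ec2_rate               # instance-hours provisioned
    storage = storage_gb_months * s3_rate        # data kept in cloud storage
    transfer = transfer_out_gb * transfer_rate   # results shipped back to the user
    return compute + storage + transfer

# Two hypothetical execution plans for the same mosaic: one provisions more
# nodes to finish quickly, the other runs longer but stores less intermediate data.
fast_plan = mosaic_run_cost(cpu_hours=40, storage_gb_months=2.0, transfer_out_gb=8)
lean_plan = mosaic_run_cost(cpu_hours=55, storage_gb_months=0.5, transfer_out_gb=8)
print(f"fast: ${fast_plan:.2f}  lean: ${lean_plan:.2f}")
```
Comparing such totals across provisioning plans is the essence of the cost-performance trade-off the study simulates.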
Nielsen, Katie R; Becerra, Rosario; Mallma, Gabriela; Tantaleán da Fieno, José
2018-01-01
Acute lower respiratory infections are the leading cause of death outside the neonatal period for children less than 5 years of age. Widespread availability of invasive and non-invasive mechanical ventilation in resource-rich settings has reduced mortality rates; however, these technologies are not always available in many low- and middle-income countries due to the high cost and trained personnel required to implement and sustain their use. High flow nasal cannula (HFNC) is a form of non-invasive respiratory support with growing evidence for use in pediatric respiratory failure. Its simple interface makes utilization in resource-limited settings appealing, although widespread implementation in these settings lags behind resource-rich settings. Implementation science is an emerging field dedicated to closing the know-do gap by incorporating evidence-based interventions into routine care, and its principles have guided the scaling up of many global health interventions. In 2016, we introduced HFNC use for respiratory failure in a pediatric intensive care unit in Lima, Peru using implementation science methodology. Here, we review our experience in the context of the principles of implementation science to serve as a guide for others considering HFNC implementation in resource-limited settings.
NASA Astrophysics Data System (ADS)
Anton, J. M.; Sanchez, M. E.; Grau, J. B.; Andina, D.
2012-04-01
Engineering degree models have historically been diverse across Europe, and Spain is now adopting the Bologna process for European universities. Separate from the older universities, which remain partly active in technical fields, Civil Engineering (Caminos, Canales y Puertos) began in Spain at the end of the 18th century, adopting the French model of upper schools for state civil servants with an entrance examination. After the intense wars that followed 1800, the Ingenieros de Montes appeared as an upper school to conserve forest regions, and in 1855 the Ingenieros Agronomos followed to advance related techniques and practices. Other engineering upper schools appeared as well, oriented more towards private industry. All of these upper schools (ES) acquired associated lower schools of Ingeniero Tecnico. In recent decades both grew considerably in number and evolved, linked also to recognized professions. Spanish society, within the European Community, evolved through the year 2000, in part very successfully, but with severe imbalances that produced high youth unemployment during the 2008-2011 crisis. With the Bologna process, major formal changes took effect from 2010-11 and were accepted with intense adaptation. The lower schools are converging towards the upper schools, and since 2010-11 both have shifted to various four-year degrees (Grado), some mapped onto the pre-existing professions, together with diverse masters. Acceptance of the new degrees by incoming students has started relatively well and will continue to evolve, and acceptance of the new grades for employment in Spain, Europe or elsewhere will be essential. Each Grado now has a fairly rigid curriculum and programme; MOODLE was introduced to connect with students, and specific uses of personal computers are taught in each subject. The Escuela de Agronomos centre, reorganized under its old name in its original buildings at the entrance of Campus Moncloa, offers Grados in Agronomic Engineering and Science for various public and private agricultural activities, Alimentary Engineering for food activities and control, Agro-Environmental Engineering oriented more towards environmental activities, and in part Biotechnology, also taught in laboratories at Campus Monte-Gancedo for Plant Biotechnology and Computational Biotechnology. Curricula include basics, engineering, practical work, visits, English, an end-of-degree project, and work placements. Some masters lead to specific professional diplomas; the list now includes Agro-Engineering, Agro-Forestal Biotechnology, Agro and Natural Resources Economy, Complex Physical Systems, Gardening and Landscaping, Rural Genie, Phytogenetic Resources, Plant Genetic Resources, Environmental Technology for Sustainable Agriculture, and Technology for Human Development and Cooperation.
Chirico, Peter G.; Barthelemy, Francis; Ngbokoto, Francois A.
2010-01-01
In May of 2000, a meeting was convened in Kimberley, South Africa, and attended by representatives of the diamond industry and leaders of African governments to develop a certification process intended to assure that exported rough diamonds were free of conflict concerns. This meeting was supported later in 2000 by the United Nations in a resolution adopted by the General Assembly. By 2002, the Kimberley Process Certification Scheme (KPCS) was ratified and signed by diamond-producing and diamond-importing countries. Over 70 countries were included as members of the KPCS at the end of 2007. To prevent trade in "conflict diamonds" while protecting legitimate trade, the KPCS requires that each country set up an internal system of controls to prevent conflict diamonds from entering any imported or exported shipments of rough diamonds. Every diamond or diamond shipment must be accompanied by a Kimberley Process (KP) certificate and be contained in tamper-proof packaging. The objective of this study was (1) to assess the naturally occurring endowment of diamonds in the Central African Republic (potential resources) based on geological evidence, previous studies, and recent field data and (2) to assess the diamond-production capacity and measure the intensity of mining activity. Several possible methods can be used to estimate the potential diamond resource. However, because there is generally a lack of sufficient and consistent data recording all diamond mining in the Central African Republic and because time to conduct fieldwork and accessibility to the diamond mining areas are limited, two different methodologies were used: the volume and grade approach and the content per kilometer approach. Estimates are that approximately 39,000,000 carats of alluvial diamonds remain in the eastern and western zones of the CAR combined. This amount is roughly twice the total amount of diamonds reportedly exported from the Central African Republic since 1931. Production capacity is calculated to be 840,000 carats per year, a number that is nearly twice the 450,000 carats reported annually by the Central African Republic. The difference in the two numbers reflects the lack of sufficient data on diamond resource grades, worker productivity, and the number and locations of sites being worked.
NASA Astrophysics Data System (ADS)
Agol, D.
2012-04-01
This paper is based on recent studies in the Lake Naivasha Basin that explored the ways in which locally based institutions, namely the Water Resources Users Associations (WRUAs), are contributing to hydrological knowledge for decision-making processes. Lake Naivasha is a shallow freshwater body situated on the floor of Kenya's Rift Valley. It covers approximately 140 km2 and supports a rich diversity of plants and animals. The Lake Naivasha Basin faces several challenges associated with over-population, urbanization and intensive agricultural activities. For example, the large-scale floricultural and horticultural export industries around the Lake have attracted thousands of migrants from different parts of Kenya who have settled around the Lake and exert considerable pressure on its resources. Lake Naivasha is one of the best examples in Kenya where the WRUA development process has shown some progress. There are 12 WRUAs across the Lake Basin representing its various sub-catchments. In recent years, the role of WRUAs in the Lake has changed rapidly as they are no longer restricted to resolving conflicts and fostering cooperation between water users. They now have an additional responsibility of collecting hydrological data within their respective sub-catchments. The majority of WRUA officials have been trained in how to collect data, for example by reading rain gauges and measuring stream flows, turbidity and sediment loads. The data collected are sent to the relevant government authorities for validation and interpretation, and the information derived from this process is used to formulate important strategies such as water allocation plans. Using secondary data analysis, interviews and focus group discussions, the study investigated how this new role of the WRUAs is changing the water resource management landscape in the Lake Naivasha Basin. In particular, it presents key challenges and opportunities associated with attempts to build the capacities of lower-level institutions like the WRUAs to take a more active role in evidence-based research. Some interesting issues have emerged, including data validation, the credibility and authenticity of the information generated, and intellectual property rights.
NASA Astrophysics Data System (ADS)
Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi
Enacting a supply-chain process involves various partners and different IT systems. REST is receiving increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in resource-centric Web service environments. In this paper, a Petri-net-based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Based on the resource meta-model, XML schemas and documents are then derived, which represent resources and their states in the Petri net. Thereafter, XML-net, a high-level Petri net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. A unified resource representation and RESTful service descriptions are therefore proposed for more effective cross-system integration. A case study is given to illustrate the approach, and its desirable features are discussed.
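To make the resource-centric idea concrete, here is a minimal sketch of how a resource meta-model could be mapped onto uniform RESTful routes. It is not the authors' XML-net framework; the ResourceMeta structure and route conventions are assumptions for illustration only.
```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ResourceMeta:
    """Minimal resource meta-model: identity, states, and allowed state transitions."""
    name: str
    states: List[str]
    transitions: Dict[str, str] = field(default_factory=dict)  # action -> target state

def restful_routes(meta: ResourceMeta) -> List[str]:
    """Derive uniform RESTful routes from the meta-model."""
    base = f"/{meta.name.lower()}s"
    routes = [f"GET {base}", f"POST {base}", f"GET {base}/{{id}}"]
    # Each state transition becomes a PUT on the resource's state sub-resource.
    routes += [f"PUT {base}/{{id}}/state  (action={a}, to={s})"
               for a, s in meta.transitions.items()]
    return routes

order = ResourceMeta("Order", ["created", "shipped", "delivered"],
                     {"ship": "shipped", "deliver": "delivered"})
print("\n".join(restful_routes(order)))
```
Deriving routes mechanically from a shared meta-model is one way loosely coupled partner systems can agree on resource representations without hand-coding each interface.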
A Survey on the Feasibility of Sound Classification on Wireless Sensor Nodes
Salomons, Etto L.; Havinga, Paul J. M.
2015-01-01
Wireless sensor networks are suitable to gain context awareness for indoor environments. As sound waves form a rich source of context information, equipping the nodes with microphones can be of great benefit. The algorithms to extract features from sound waves are often highly computationally intensive. This can be problematic as wireless nodes are usually restricted in resources. In order to be able to make a proper decision about which features to use, we survey how sound is used in the literature for global sound classification, age and gender classification, emotion recognition, person verification and identification and indoor and outdoor environmental sound classification. The results of the surveyed algorithms are compared with respect to accuracy and computational load. The accuracies are taken from the surveyed papers; the computational loads are determined by benchmarking the algorithms on an actual sensor node. We conclude that for indoor context awareness, the low-cost algorithms for feature extraction perform equally well as the more computationally-intensive variants. As the feature extraction still requires a large amount of processing time, we present four possible strategies to deal with this problem. PMID:25822142
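As an illustration of why feature choice matters on constrained nodes, the sketch below benchmarks a cheap time-domain feature (zero-crossing rate) against an FFT-based spectral centroid. It is not one of the surveyed algorithms, and timings on a desktop Python interpreter only hint at the relative costs one would measure on an actual sensor node.
```python
import time
import numpy as np

def zero_crossing_rate(frame):
    """Cheap time-domain feature: fraction of adjacent samples that change sign."""
    return float(np.mean(np.signbit(frame[:-1]) != np.signbit(frame[1:])))

def spectral_centroid(frame, fs):
    """FFT-based feature: magnitude-weighted mean frequency (costlier to compute)."""
    mag = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    return float(np.sum(freqs * mag) / (np.sum(mag) + 1e-12))

fs, frame = 8000, np.random.randn(256)          # one 32 ms frame of synthetic "audio"
for name, fn in [("ZCR", lambda: zero_crossing_rate(frame)),
                 ("centroid", lambda: spectral_centroid(frame, fs))]:
    t0 = time.perf_counter()
    for _ in range(10_000):
        fn()
    # total seconds / 10,000 frames, reported in microseconds per frame
    print(f"{name}: {(time.perf_counter() - t0) * 100:.1f} us/frame")
```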
NASA Astrophysics Data System (ADS)
Mishra Patidar, Manju; Jain, Deepti; Nath, R.; Ganesan, V.
2016-10-01
Poly(L-lactic acid) (PLLA) is a biodegradable and biocompatible polyester that can be produced from renewable resources such as corn. Being non-toxic to the human body, PLLA is used in biomedical applications such as surgical sutures, bone fixation devices, and controlled drug delivery. Beyond these application studies, very few experiments have examined its dielectric relaxation in the low-temperature region. With this in mind, we performed low-temperature thermally stimulated depolarization current (TSDC) studies over the temperature range 80-400 K to understand the relaxation phenomena of PLLA. We observed a multi-modal broad relaxation of small but significant intensity at low temperatures, while a sharp and highly intense peak appeared around the glass transition temperature, Tg ∼ 333 K, of PLLA. The fine structure of the low-temperature TSDC peak may be attributed to spherulite formation in crystallite regions intertwined with the polymer, as seen in AFM, and appears to be produced by an isothermal crystallization process. XRD analysis also confirms the semicrystalline nature of the PLLA film.
Skouroliakou, Maria; Soloupis, George; Gounaris, Antonis; Charitou, Antonia; Papasarantopoulos, Petros; Markantonis, Sophia L; Golna, Christina; Souliotis, Kyriakos
2008-07-28
This study assesses the results of implementation of a software program that allows for input of admission/discharge summary data (including cost) in a neonatal intensive care unit (NICU) in Greece, based on the establishment of a baseline statistical database for infants treated in a NICU and the statistical analysis of epidemiological and resource utilization data thus collected. A software tool was designed, developed, and implemented between April 2004 and March 2005 in the NICU of the LITO private maternity hospital in Athens, Greece, to allow for the first time for step-by-step collection and management of summary treatment data. Data collected over this period were subsequently analyzed using defined indicators as a basis to extract results related to treatment options, treatment duration, and relative resource utilization. Data for 499 babies were entered in the tool and processed. Information on medical costs (e.g., mean total cost +/- SD of treatment was €310.44 +/- 249.17 and €6704.27 +/- 4079.53 for babies weighing more than 2500 g and 1000-1500 g respectively), incidence of complications or disease (e.g., 4.3 percent and 14.3 percent of study babies weighing 1,000 to 1,500 g suffered from cerebral bleeding [grade I] and bronchopulmonary dysplasia, respectively, while overall 6.0 percent had microbial infections), and medical statistics (e.g., perinatal mortality was 6.8 percent) was obtained in a quick and robust manner. The software tool allowed for collection and analysis of data traditionally maintained in paper medical records in the NICU with greater ease and accuracy. Data codification and analysis led to significant findings at the epidemiological, medical resource utilization, and respective hospital cost levels that allowed comparisons with literature findings for the first time in Greece. The tool thus contributed to a clearer understanding of treatment practices in the NICU and set the baseline for the assessment of the impact of future interventions at the policy or hospital level.
Population growth and a sustainable environment. The Machakos story.
Mortimore, M; Tiffen, M
1994-10-01
The view is taken that population density in the Machakos District (boundaries prior to 1992) of Kenya influenced both environmental conservation and productivity through adaptation of new technologies. Changes in resource management in Machakos District are identified as a shift to cash crop production, experimentation with staple food options, faster tillage, use of fertilizers for enhancing soil fertility, and livestock and tree cultivation. These agricultural changes occurred due to subdivision of landholdings among sons, private appropriation of scarce grazing land, and land scarcity. Intensive practices such as intensive livestock feeding systems and the permanent manuring of fields increased the efficiency of nutrient cycling through plants, animals, and soils. The Akamba custom gave land rights to those who tilled the soil first. Formal land registration occurred after 1968 and favored owners and investors. Small farm investment was made possible through work off-farm and remittances. The value of output per square kilometer at constant prices increased during 1930-87. Cultivated land area also increased during this period, but mostly on poorer quality land. Agricultural changes were enhanced by social and institutional factors such as small family units and greater partnerships between husband and wife. Families pooled resources through collectives. Women played leadership roles. Competing interest groups and organizations have evolved and enabled people to articulate their needs and obtain access to resources at all levels. These institutions increased in strength over time and with increased density. The cost of service provision decreased with greater population numbers. Development of roads and schools facilitated formal education. Population density, market growth, and a generally supportive economic environment are viewed as the factors responsible for changes in Machakos District. Technological change is viewed as an endogenous process of adaptation to new technologies. Changes in Machakos District are viewed as driven by a combination of exogenous and endogenous practices and local initiative.
43 CFR 1610.4 - Resource management planning process.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Resource management planning process. 1610... LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR GENERAL MANAGEMENT (1000) PLANNING, PROGRAMMING, BUDGETING Resource Management Planning § 1610.4 Resource management planning process. ...
43 CFR 1610.4 - Resource management planning process.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Resource management planning process. 1610... LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR GENERAL MANAGEMENT (1000) PLANNING, PROGRAMMING, BUDGETING Resource Management Planning § 1610.4 Resource management planning process. ...
43 CFR 1610.4 - Resource management planning process.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Resource management planning process. 1610... LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR GENERAL MANAGEMENT (1000) PLANNING, PROGRAMMING, BUDGETING Resource Management Planning § 1610.4 Resource management planning process. ...
43 CFR 1610.4 - Resource management planning process.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Resource management planning process. 1610... LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR GENERAL MANAGEMENT (1000) PLANNING, PROGRAMMING, BUDGETING Resource Management Planning § 1610.4 Resource management planning process. ...
Intense hurricane strikes in southeastern New England since A.D. 1000
NASA Astrophysics Data System (ADS)
Donnelly, J. P.; Ettinger, R.; Cleary, P.
2001-05-01
Intense, category 3, 4, and 5 landfalling hurricanes pose a significant threat to lives and resources in coastal areas. Intense hurricane strikes also play a significant role in transporting sediments and shaping coastal landforms. Potential links between human-induced climate change and the frequency and intensity of tropical cyclones and the recent concentration of resources and population in areas where intense hurricanes may strike necessitate examination of decadal-to-millennial-scale variability in hurricane activity. The National Oceanic and Atmospheric Administration hurricane activity records for the western Atlantic Ocean only go back to the late 19th century. In the northeast United States historical records of hurricanes date back 370 years. We use stratigraphic evidence from coastal wetlands to extend the record of intense hurricane strikes into the prehistoric period in southeastern New England. Storm surge and wave action associated with intense storms can overtop barrier islands, remove sand and gravel from the beach and nearshore environment and deposit these sediments across the surface of coastal wetlands. In a regime of rising sea level, organic wetland sediments accumulate on top of these storm-induced deposits, preserving a record of past storms. We reconstructed storm deposition records within coastal marshes from eastern Connecticut to Cape Cod, Massachusetts. We matched these records to the historic record of storms and established the age of prehistoric storm deposits dating back about 1000 years with isotopic and stratigraphic dating techniques. The ages of storm deposits at all sites correlate to historic intense hurricane strikes. Prehistoric storm deposits can repeatedly be correlated among multiple sites and are of similar character and extent to the more recent deposits that we attribute to historic intense hurricane strikes. Therefore these older storm deposits were also likely deposited during prehistoric intense hurricanes. We documented at least eight deposits consistent with intense hurricane strikes in the last 1000 years. We identified deposits associated with historic intense hurricanes that occurred in A.D. 1954, 1938, 1869, 1815, 1638 and/or 1635. In addition we identified deposits likely associated with prehistoric intense hurricane strikes that occurred in A.D. 1400-1450, 1300-1400, and 1100-1150. These records indicate no apparent correlation between the frequency of intense hurricane landfalls in southeastern New England and the Little Ice Age and Medieval Warm Period climate oscillations.
Water-resources activities in Florida, 1988-89
Glenn, Mildred E.
1989-01-01
This report contains summary statements of water resources activities in Florida conducted by the Water Resources Division of the U.S. Geological Survey in cooperation with Federal, State, and local agencies during 1988. These activities are part of the Federal program of appraising the Nation's water resources. Included are brief descriptions of the nature and scope of all active studies, summaries of significant results for 1988, and anticipated accomplishments during 1989. Water resources appraisals in Florida are highly diversified, ranging from hydrologic records networks to interpretive appraisals of water resources and applied research to develop investigative techniques. Thus, water-resources investigations range from basic descriptive water-availability studies for areas of low-intensity water development and management to sophisticated cause and effect studies in areas of high-intensity water development and management. The interpretive reports and records that are products of the investigations are a principal hydrologic foundation upon which plans for the development, management, and protection of Florida's water resources may be based. Water data and information required to implement sound water-management programs in highly urbanized areas relate to the quantity and quality of storm runoff, sources of aquifer contamination, injection of wastes into deep strata, underground storage of freshwater, artificial recharge of aquifers, environmental effects of reuse of water, and effects of land development on changes in ground- and surface-water quality. In some parts of the State broad areas are largely rural. Future growth is anticipated in many of these. This report is intended to inform those agencies vitally interested in the water resources of Florida as to the current status and objectives of the U.S. Geological Survey cooperative program. The mission of this program is to collect, interpret, and publish information on water resources. Almost all of this work is done in cooperation with other public agencies. (USGS)
The implementation of a postoperative care process on a neurosurgical unit.
Douglas, Mary; Rowed, Sheila
2005-12-01
The postoperative phase is a critical time for any neurosurgical patient. Historically, certain patients having neurosurgical procedures, such as craniotomies and other more complex surgeries, have been nursed postoperatively in the intensive care unit (ICU) for an overnight stay, prior to transfer to a neurosurgical floor. At the Hospital for Sick Children in Toronto, because of challenges with access to ICU beds and the cancellation of surgeries because of lack of available nurses for the ICU setting, this practice was reexamined. A set of criteria was developed to identify which postoperative patients should come directly to the neurosurgical unit immediately following their anesthetic recovery. The criteria were based on patient diagnosis, preoperative condition, comorbidities, the surgical procedure, intraoperative complications, and postoperative status. A detailed process was then outlined that allowed the optimum patients to be selected for this process to ensure patient safety. Included in this process was a postoperative protocol addressing details such as standard physician orders and the levels of monitoring required. Outcomes of this new process include fewer surgical cancellations for patients and families, equally safe, or better patient care, and the conservation of limited ICU resources. The program has currently been expanded to include patients who have undergone endovascular therapies.
Pan, Junqian; Chen, Haimei; Guo, Baolin; Liu, Chang
2017-01-01
Epimedium pseudowushanense B.L.Guo, a light-demanding shade herb, is used in traditional medicine to increase libido and strengthen muscles and bones. Recognition of the health benefits of Epimedium has increased its market demand. However, its resource recycling rate is low and environmentally dependent. Furthermore, its natural sources are endangered, further increasing prices. Commercial cultivation could address these resource constraints. Understanding the effects of environmental factors on the production of its active components would improve the technology for cultivation and germplasm conservation. Here, we studied the effects of light intensity on flavonoid production and investigated the underlying molecular mechanism using RNA-seq analysis. Plants were exposed to five levels of light intensity from germination to flowering, and flavonoid contents were measured using HPLC. Quantification of epimedin A, epimedin B, epimedin C, and icariin showed that flavonoid contents varied with light intensity level, and the largest amount of epimedin C was produced at light intensity level 4 (I4). Next, leaves from the three light intensity levels ("L", "M" and "H") with the largest differences in flavonoid content were subjected to RNA-seq analysis. Transcriptome reconstruction identified 43,657 unigenes. All unigene sequences were annotated by searching against the Nr, Gene Ontology, and Kyoto Encyclopedia of Genes and Genomes (KEGG) databases. In total, 4008, 5260, and 3591 significant differentially expressed genes (DEGs) were identified between the groups L vs. M, M vs. H and L vs. H. In particular, twenty-one full-length genes involved in flavonoid biosynthesis were identified. The expression levels of the flavonol synthase and chalcone synthase genes were strongly associated with light-induced flavonoid abundance, with the highest expression levels found in the H group. Furthermore, 65 transcription factors, including 31 FAR1, 17 MYB-related, 12 bHLH, and 5 WRKY, were differentially expressed after light induction. Finally, a model was proposed to explain the light-induced flavonoid production. This study provided valuable information to improve cultivation practices and produced the first comprehensive transcriptome resource for E. pseudowushanense.
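For readers unfamiliar with the DEG step mentioned above, the following toy sketch shows the general shape of such an analysis (log2 fold change plus a multiple-testing-adjusted test) on simulated data. It is not the authors' pipeline, which would rely on dedicated count-based tools; the thresholds, data, and spike-in are purely illustrative.
```python
import numpy as np
from scipy import stats

# Toy normalized expression matrix: rows = genes, columns = 3 replicates per light level.
rng = np.random.default_rng(0)
low = rng.lognormal(mean=5.0, sigma=0.3, size=(1000, 3))
high = rng.lognormal(mean=5.0, sigma=0.3, size=(1000, 3))
high[:50] *= 4.0  # spike in 50 "light-induced" genes

log2fc = np.log2(high.mean(axis=1) / low.mean(axis=1))
pvals = stats.ttest_ind(np.log2(high), np.log2(low), axis=1).pvalue

# Benjamini-Hochberg adjustment for multiple testing.
order = np.argsort(pvals)
ranked = pvals[order] * len(pvals) / (np.arange(len(pvals)) + 1)
padj = np.empty_like(pvals)
padj[order] = np.minimum.accumulate(ranked[::-1])[::-1]

deg = (np.abs(log2fc) >= 1.0) & (padj < 0.05)
print(f"{deg.sum()} differentially expressed genes flagged (toy data)")
```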
The Water-Energy-Food Nexus of Unconventional Fossil Fuels.
NASA Astrophysics Data System (ADS)
Rosa, L.; Davis, K. F.; Rulli, M. C.; D'Odorico, P.
2017-12-01
Extraction of unconventional fossil fuels has increased human pressure on freshwater resources. Shale formations are globally abundant and widespread. Their extraction through hydraulic fracturing, a water-intensive process, may be limited by water availability, especially in arid and semiarid regions where stronger competition is expected to emerge with food production. It is unclear to what extent and where shale resource extraction could compete with local water and food security. Although extraction of shale deposits materializes economic gains and increases energy security, in some regions it may exacerbate the reliance on food imports, thereby decreasing regional food security. We consider the global distribution of known shale deposits suitable for oil and gas extraction and evaluate their impacts on water resources for food production and other human and environmental needs. We find that 17% of the world's shale deposits are located in areas affected by both surface water and groundwater stress, 50% in areas with surface water stress, and about 30% in irrigated areas. In these regions shale oil and shale gas production will likely threaten water and food security. These results highlight the importance of hydrologic analyses in the extraction of fossil fuels. Indeed, neglecting water availability as one of the possible factors constraining the development of shale deposits around the world could lead to unaccounted environmental impacts and business risks for firms and investors. Because several shale deposits in the world stretch across irrigated agricultural areas in arid regions, an adequate development of these resources requires appropriate environmental, economic and political decisions.
Electrostatic Beneficiation of Lunar Regolith: Applications in In-Situ Resource Utilization
NASA Technical Reports Server (NTRS)
Trigwell, Steve; Captain, James; Weis, Kyle; Quinn, Jacqueline
2011-01-01
Returning to the moon, or venturing farther afield to Mars, presents enormous challenges in sustaining life for extended periods of time, far beyond the few days the astronauts experienced on the moon during the Apollo missions. A stay on Mars is envisioned to last several months, and it would be cost prohibitive to take all the requirements for such a stay from Earth. Therefore, future exploration missions will be required to be self-sufficient and to utilize the resources available at the mission site to sustain human occupation. Such an exercise is currently the focus of intense research at NASA under the In-situ Resource Utilization (ISRU) program. As well as the oxygen and water necessary for human life, resources for providing building materials for habitats, radiation protection, and landing/launch pads are required. All these materials can be provided by the regolith present on the surface, as it contains sufficient minerals and metal oxides to meet the requirements. However, before processing, it would be cost effective if the regolith could be enriched in the mineral(s) of interest. This can be achieved by electrostatic beneficiation, in which tribocharged mineral particles are separated out and the feedstock enriched or depleted as required. The results of electrostatic beneficiation of lunar simulants and actual Apollo regolith in lunar high vacuum are reported, in which various degrees of efficient particle separation and mineral enrichment of up to a few hundred percent were achieved.
Interpersonal interactions, job demands and work-related outcomes in pharmacy.
Gaither, Caroline A; Nadkarni, Anagha
2012-04-01
Objectives: The objective of this study was to examine the interaction between job demands of pharmacists and resources in the form of interpersonal interactions and its association with work-related outcomes such as organizational and professional commitment, job burnout, professional identity and job satisfaction. The job demands-resources (JD-R) model served as the theoretical framework. Methods: Subjects for the study were drawn from the Pharmacy Manpower Project Database (n = 1874). A 14-page mail-in survey measured hospital pharmacists' responses on the frequency of occurrence of various job-related scenarios as well as work-related outcomes. The study design was a 2 × 2 factorial design. Responses were collected on a Likert scale. Descriptive statistics, reliability analyses and correlational and multiple regression analyses were conducted using SPSS version 17 (SPSS, Chicago, IL, USA). Key findings: The 566 pharmacists (30% response rate) who responded to the survey indicated that high-demand/pleasant encounters and low-demand/pleasant encounters occurred more frequently in the workplace. The strongest correlations were found between high-demand/unpleasant encounters and frequency and intensity of emotional exhaustion. Multiple regression analyses indicated that when controlling for demographic factors high-demand/unpleasant encounters were negatively related to affective organizational commitment and positively related to frequency and intensity of emotional exhaustion. Low-demand/pleasant encounters were positively related to frequency and intensity of personal accomplishment. Low-demand/unpleasant encounters were significantly and negatively related to professional commitment, job satisfaction and frequency and intensity of emotional exhaustion, while high-demand/pleasant encounters were also related to frequency and intensity of emotional exhaustion. Conclusion: Support was found for the JD-R model and the proposed interaction effects. Study results suggest that adequate attention must be paid to the interplay between demands on the job and interactions with healthcare professionals to improve the quality of the pharmacist's work life. Future research should examine other types of job demands and resources. © 2011 The Authors. IJPP © 2011 Royal Pharmaceutical Society.
Quality assurance in an adult intensive care unit, Eastern region, Saudi Arabia.
Iqbal, Mobeen; Rehmani, Rifat; Venter, Joan; Alaithan, Abdulsalam M
2007-03-01
Quality assurance (QA) is an increasingly important element in the administrative management of the intensive care unit (ICU), not only to improve clinical practice and patient outcomes but also to help ensure proper resource utilization. We introduced a comprehensive quality assurance program, based on existing medical evidence, in the ICU at King Abdulaziz National Guard Hospital, Alhasa, Saudi Arabia. We identified an already-validated set of quality indicators in intensive care and grouped them into categories of outcome measures (which reflect the patient's subsequent health status) and process measures (related to the interaction between patients and healthcare professionals). Data collection forms were developed for nurses and physicians. Data were reported on a monthly basis starting from January 2005, and the first 10 months of data are presented. Three hundred eighty-seven patients were admitted during the study period. Approximately 56.9% had cardiac-related diseases, 33.5% had medical ailments, and 9.6% had surgery-related issues. There were 54.6% males and 45.4% females. The mean age of the patients was 58.4 +/- 18.3 years. The mean acute physiology and chronic health evaluation II (APACHE-II) score was 13.6 +/- 4.9. Outcome measures were either better than or comparable to international data, while adherence to process measures was found to be excellent. The standardized mortality ratio for the duration of the study was 0.24, with a 95% confidence interval of 0.15-0.36. Implementation of a QA program is practical in an ICU. Disseminating quality monitoring information at the national level can lead to a broad database that can identify the best-performing ICUs, thus enabling benchmarking and the creation of risk-adjusted models applicable to the local population.
Current state and problems of integrated development of mineral resources base in Russia
NASA Astrophysics Data System (ADS)
Filimonova, I. V.; Eder, L. V.; Mishenin, M. V.; Mamakhatov, T. M.
2017-09-01
The article deals with the issues of integrated development of subsoil resources, taking into account the actual problems facing the Russian oil and gas complex. The key factors determining the need for integrated development of subsoil resources have been systematized and investigated. These factors are the declining quality of the hydrocarbon resource base, the growing depletion of the basic (unique and major) oil fields, the increase in the number of small and very small oil fields discovered and brought into development, the increased capital intensity and riskiness of geological exploration, and the territorial location of new subsoil use facilities.
The case for positive emotions in the stress process.
Folkman, Susan
2008-01-01
For many decades, the stress process was described primarily in terms of negative emotions. However, robust evidence that positive emotions co-occurred with negative emotions during intensely stressful situations suggested the need to consider the possible roles of positive emotions in the stress process. About 10 years ago, these possibilities were incorporated into a revision of stress and coping theory (Folkman, 1997). This article summarizes the research reported during the intervening 10 years that pertains to the revised model. Evidence has accumulated regarding the co-occurrence of positive and negative emotions during stressful periods; the restorative function of positive emotions with respect to physiological, psychological, and social coping resources; and the kinds of coping processes that generate positive emotions including benefit finding and reminding, adaptive goal processes, reordering priorities, and infusing ordinary events with positive meaning. Overall, the evidence supports the propositions set forth in the revised model. Contrary to earlier tendencies to dismiss positive emotions, the evidence indicates they have important functions in the stress process and are related to coping processes that are distinct from those that regulate distress. Including positive emotions in future studies will help address an imbalance between research and clinical practice due to decades of nearly exclusive concern with the negative emotions.
Articulating the Resources for Business Process Analysis and Design
ERIC Educational Resources Information Center
Jin, Yulong
2012-01-01
Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…
Organizational Influences on Health Professionals' Experiences of Moral Distress in PICUs.
Wall, Sarah; Austin, Wendy J; Garros, Daniel
2016-03-01
This article reports the findings of a qualitative study (secondary analysis) that explored the organizational influences on moral distress for health professionals working in pediatric intensive care units (PICUs) across Canada. Participants were recruited to the study from PICUs across Canada. The PICU is a high-tech, fast-paced, high-pressure environment where caregivers frequently face conflict and ethical tension in the care of critically ill children. A number of themes including relationships with management, organizational structure and processes, workload and resources, and team dynamics were identified. This study provides a rare and important multi-disciplinary perspective on this topic and the findings have implications for administrators and leaders who seek to improve the moral climate of healthcare delivery.
Management of intestinal failure in children.
Dalzell, A Mark
2015-10-01
The management of children with intestinal failure is a rewarding but resource-intensive process. There is, however, variability in practice and outcome for patients, despite the basic principles of care and measures of success being well defined. Multidisciplinary working is paramount, and there is an urgent need for collaboration between paediatric surgical and medical gastroenterological colleagues, as well as an obligation on commissioners to ensure that ideal practice is recognized and implemented as an essential element in improving the outlook for children with intestinal failure in the United Kingdom. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Floquet-Engineered Valleytronics in Dirac Systems.
Kundu, Arijit; Fertig, H A; Seradjeh, Babak
2016-01-08
Valley degrees of freedom offer a potential resource for quantum information processing if they can be effectively controlled. We discuss an optical approach to this problem in which intense light breaks electronic symmetries of a two-dimensional Dirac material. The resulting quasienergy structures may then differ for different valleys, so that the Floquet physics of the system can be exploited to produce highly polarized valley currents. This physics can be utilized to realize a valley valve whose behavior is determined optically. We propose a concrete way to achieve such valleytronics in graphene as well as in a simple model of an inversion-symmetry broken Dirac material. We study the effect numerically and demonstrate its robustness against moderate disorder and small deviations in optical parameters.
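The valley selectivity described above can be illustrated with the standard high-frequency (van Vleck) result for a circularly driven Dirac cone. This is a textbook-style sketch, not a derivation from the paper, and sign conventions for the valley index and light helicity vary between treatments.
```latex
% Driven Dirac valley, with \tau = \pm 1 labelling the two valleys and \Delta a static,
% inversion-breaking mass. Circularly polarized light A(t) = A(\cos\omega t, \sin\omega t)
% contributes, to leading order in 1/\omega, a valley-dependent mass term:
\begin{align}
  H^{\mathrm{eff}}_{\tau}(\mathbf{k}) &\simeq v\left(\tau k_x \sigma_x + k_y \sigma_y\right)
      + \left[\Delta + \tau\,\frac{(e v A)^2}{\hbar\omega}\right]\sigma_z, \\
  E_{\mathrm{gap}}(\tau) &= 2\left|\,\Delta + \tau\,\frac{(e v A)^2}{\hbar\omega}\right|.
\end{align}
% Because the light-induced mass carries the valley index \tau, the two valleys acquire
% different quasienergy gaps, which is the handle exploited for polarized valley currents.
```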
Lopetegui, Marcelo A; Lara, Barbara A; Yen, Po-Yin; Çatalyürek, Ümit V; Payne, Philip R O
2015-01-01
Multiple choice questions play an important role in training and evaluating biomedical science students. However, the resource-intensive nature of question generation limits their open availability, restricting their contribution mainly to evaluation purposes. Although applied-knowledge questions require a complex formulation process, the creation of concrete-knowledge questions (i.e., definitions, associations) could be assisted by informatics methods. We envisioned a novel and simple algorithm that exploits validated knowledge repositories and generates concrete-knowledge questions by leveraging the relationships between concepts. In this manuscript we present the development and validation of a prototype which successfully produced meaningful concrete-knowledge questions, opening new applications for existing knowledge repositories and potentially benefiting students of all biomedical science disciplines.
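A minimal sketch of the kind of algorithm described, assuming the knowledge repository can be read as subject-relation-object triples: the stem comes from one triple and distractors are drawn from other objects sharing the same relation. The triples and function names here are hypothetical, not the validated repository or the prototype used by the authors.
```python
import random

# Toy slice of a validated knowledge repository as (subject, relation, object) triples.
TRIPLES = [
    ("Insulin", "is_produced_by", "Beta cells of the pancreas"),
    ("Glucagon", "is_produced_by", "Alpha cells of the pancreas"),
    ("Somatostatin", "is_produced_by", "Delta cells of the pancreas"),
    ("Thyroxine", "is_produced_by", "Follicular cells of the thyroid"),
]

def make_question(triple, triples, n_distractors=3, seed=None):
    """Build one concrete-knowledge MCQ: the stem comes from the triple, and the
    distractors are objects of other triples that share the same relation."""
    subject, relation, answer = triple
    pool = [o for s, r, o in triples if r == relation and o != answer]
    rng = random.Random(seed)
    options = rng.sample(pool, min(n_distractors, len(pool))) + [answer]
    rng.shuffle(options)
    stem = f"{subject} {relation.replace('_', ' ')} which of the following?"
    return stem, options, answer

stem, options, answer = make_question(TRIPLES[0], TRIPLES, seed=1)
print(stem)
for i, opt in enumerate(options, 1):
    print(f"  {i}. {opt}")
print("Answer:", answer)
```
Because distractors come from sibling concepts under the same relation, they remain plausible, which is what makes concrete-knowledge questions a good fit for automated generation.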
Islam, Rafiqul; Kar, Sumit; Islam, Clarinda; Farmen, Raymond
2018-06-01
There has been an increased use of commercial kits for biomarker measurement, commensurate with the increased demand for biomarkers in drug development. However, in most cases these kits do not meet the quality attributes for use in regulated environment. The process for adaptation of these kits can be frustrating, time consuming and resource intensive. In addition, a lack of harmonized guidance for the validation of biomarker poses a significant challenge in the adaptation of kits in a regulated environment. The purpose of this perspective is to propose a tiered approach to commercial drug development kits with clearly defined quality attributes and to demonstrate how these kits can be adapted to perform analytical validation in a regulated environment.
Newbery, David M; Chuyong, George B; Zimmermann, Lukas
2006-01-01
Mast fruiting is a distinctive reproductive trait in trees. This rain forest study, at a nutrient-poor site with a seasonal climate in tropical Africa, provides new insights into the causes of this mode of phenological patterning. At Korup, Cameroon, 150 trees of the large, ectomycorrhizal caesalp, Microberlinia bisulcata, were recorded almost monthly for leafing, flowering and fruiting during 1995-2000. The series was extended to 1988-2004 with less detailed data. Individual transitions in phenology were analysed. Masting occurred when the dry season before fruiting was drier, and the one before that was wetter, than average. Intervals between events were usually 2 or 3 yr. Masting was associated with early leaf exchange, followed by mass flowering, and was highly synchronous in the population. Trees at higher elevation showed more fruiting. Output declined between 1995 and 2000. Mast fruiting in M. bisulcata appears to be driven by climate variation and is regulated by internal tree processes. The resource-limitation hypothesis was supported. An 'alternative bearing' system seems to underlie masting. That ectomycorrhizal habit facilitates masting in trees is strongly implied.
Figeys, Daniel; Fai, Stephen; Bennett, Steffany A. L.
2013-01-01
Motivation: Establishing phospholipid identities in large lipidomic datasets is a labour-intensive process. Where genomics and proteomics capitalize on sequence-based signatures, glycerophospholipids lack easily definable molecular fingerprints. Carbon chain length, degree of unsaturation, linkage, and polar head group identity must be calculated from mass to charge (m/z) ratios under defined mass spectrometry (MS) conditions. Given increasing MS sensitivity, many m/z values are not represented in existing prediction engines. To address this need, Visualization and Phospholipid Identification is a web-based application that returns all theoretically possible phospholipids for any m/z value and MS condition. Visualization algorithms produce multiple chemical structure files for each species. Curated lipids detected by the Canadian Institutes of Health Research Training Program in Neurodegenerative Lipidomics are provided as high-resolution structures. Availability: VaLID is available through the Canadian Institutes of Health Research Training Program in Neurodegenerative Lipidomics resources web site at https://www.med.uottawa.ca/lipidomics/resources.html. Contacts: lipawrd@uottawa.ca Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:23162086
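The core lookup such a tool performs can be pictured as matching a query m/z against theoretical masses within a tolerance. The sketch below is a toy illustration only: the candidate table holds a handful of approximate [M+H]+ values for demonstration and is not VaLID's curated chemistry or its handling of different MS conditions.
```python
# Toy candidate table: approximate [M+H]+ monoisotopic values, for demonstration only.
CANDIDATES = {
    "PC(34:1)": 760.585,
    "PE(38:4)": 768.554,
    "PC(36:2)": 786.601,
    "PS(36:2)": 788.544,
}

def match(mz, tolerance_ppm=10.0):
    """Return all candidate species whose theoretical m/z lies within the ppm tolerance."""
    hits = []
    for name, theo in CANDIDATES.items():
        ppm = abs(mz - theo) / theo * 1e6
        if ppm <= tolerance_ppm:
            hits.append((name, theo, round(ppm, 1)))
    return hits

print(match(760.58, tolerance_ppm=10))   # -> [('PC(34:1)', 760.585, 6.6)]
```
The full problem also requires enumerating head group, chain length, unsaturation and linkage combinations for each adduct, which is why a curated web resource is more practical than ad hoc lookups.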
Gonzalez, Luis F.; Montes, Glen A.; Puig, Eduard; Johnson, Sandra; Mengersen, Kerrie; Gaston, Kevin J.
2016-01-01
Surveying threatened and invasive species to obtain accurate population estimates is an important but challenging task that requires a considerable investment in time and resources. Estimates using existing ground-based monitoring techniques, such as camera traps and surveys performed on foot, are known to be resource intensive, potentially inaccurate and imprecise, and difficult to validate. Recent developments in unmanned aerial vehicles (UAV), artificial intelligence and miniaturized thermal imaging systems represent a new opportunity for wildlife experts to inexpensively survey relatively large areas. The system presented in this paper includes thermal image acquisition as well as a video processing pipeline to perform object detection, classification and tracking of wildlife in forest or open areas. The system is tested on thermal video data from ground based and test flight footage, and is found to be able to detect all the target wildlife located in the surveyed area. The system is flexible in that the user can readily define the types of objects to classify and the object characteristics that should be considered during classification. PMID:26784196
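The detection stage of such a pipeline often starts with simple thresholding of warm pixels. The sketch below shows that step only, on a synthetic frame; it is not the authors' detector or classifier, assumes OpenCV 4's two-value findContours return, and uses illustrative threshold and area values.
```python
import cv2
import numpy as np

def detect_warm_blobs(thermal_frame: np.ndarray, threshold=200, min_area=25):
    """Return centroids of warm regions in an 8-bit thermal frame.
    Threshold and minimum area are illustrative and would be tuned per camera and altitude."""
    _, mask = cv2.threshold(thermal_frame, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:
            m = cv2.moments(c)
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

# Synthetic frame: cool background with one warm "animal" patch.
frame = np.full((120, 160), 90, dtype=np.uint8)
frame[50:60, 70:82] = 230
print(detect_warm_blobs(frame))   # approximately [(75.5, 54.5)]
```
Downstream stages would then classify each blob (e.g., by shape and temperature profile) and associate detections across frames to form tracks.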
Herrero, Mario; Havlík, Petr; Valin, Hugo; Notenbaert, An; Rufino, Mariana C.; Thornton, Philip K.; Blümmel, Michael; Weiss, Franz; Grace, Delia; Obersteiner, Michael
2013-01-01
We present a unique, biologically consistent, spatially disaggregated global livestock dataset containing information on biomass use, production, feed efficiency, excretion, and greenhouse gas emissions for 28 regions, 8 livestock production systems, 4 animal species (cattle, small ruminants, pigs, and poultry), and 3 livestock products (milk, meat, and eggs). The dataset contains over 50 new global maps containing high-resolution information for understanding the multiple roles (biophysical, economic, social) that livestock can play in different parts of the world. The dataset highlights: (i) feed efficiency as a key driver of productivity, resource use, and greenhouse gas emission intensities, with vast differences between production systems and animal products; (ii) the importance of grasslands as a global resource, supplying almost 50% of biomass for animals while continuing to be at the epicentre of land conversion processes; and (iii) the importance of mixed crop–livestock systems, producing the greater part of animal production (over 60%) in both the developed and the developing world. These data provide critical information for developing targeted, sustainable solutions for the livestock sector and its widely ranging contribution to the global food system. PMID:24344273
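The two intensity metrics emphasized above can be made concrete with a toy calculation: emission intensity as greenhouse gas output per unit of edible protein, and feed efficiency as feed dry matter per unit of edible protein, aggregated by production system. The numbers below are invented placeholders, not values from the published dataset.

```python
# Toy illustration of the intensity metrics discussed above: emission intensity
# (kg CO2e per kg protein) and feed efficiency (kg dry-matter feed per kg
# protein), aggregated by production system. All numbers are invented.
records = [
    # (system, species, protein_t, feed_dm_t, ghg_t_co2e)
    ("mixed crop-livestock", "cattle",  120.0, 9000.0, 6500.0),
    ("grazing",              "cattle",   60.0, 7500.0, 5200.0),
    ("industrial",           "poultry", 300.0, 6000.0, 1500.0),
]

by_system = {}
for system, _, protein, feed, ghg in records:
    agg = by_system.setdefault(system, {"protein": 0.0, "feed": 0.0, "ghg": 0.0})
    agg["protein"] += protein
    agg["feed"] += feed
    agg["ghg"] += ghg

for system, agg in by_system.items():
    print(f"{system:22s} "
          f"emission intensity: {agg['ghg'] / agg['protein']:5.1f} kg CO2e/kg protein, "
          f"feed efficiency: {agg['feed'] / agg['protein']:5.1f} kg DM/kg protein")
```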
Food plant diversity as broad-scale determinant of avian frugivore richness
Kissling, W. Daniel; Rahbek, Carsten; Böhning-Gaese, Katrin
2007-01-01
The causes of variation in animal species richness at large spatial scales are intensively debated. Here, we examine whether the diversity of food plants, contemporary climate and energy, or habitat heterogeneity determine species richness patterns of avian frugivores across sub-Saharan Africa. Path models indicate that species richness of Ficus (their fruits being one of the major food resources for frugivores in the tropics) has the strongest direct effect on richness of avian frugivores, whereas the influences of variables related to water–energy and habitat heterogeneity are mainly indirect. The importance of Ficus richness for richness of avian frugivores diminishes with decreasing specialization of birds on fruit eating, but is retained when accounting for spatial autocorrelation. We suggest that a positive relationship between food plant and frugivore species richness could result from niche assembly mechanisms (e.g. coevolutionary adaptations to fruit size, fruit colour or vertical stratification of fruit presentation) or, alternatively, from stochastic speciation–extinction processes. In any case, the close relationship between species richness of Ficus and avian frugivores suggests that figs are keystone resources for animal consumers, even at continental scales. PMID:17251107
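The direct-versus-indirect distinction drawn by the path models can be illustrated with a minimal mediation-style sketch: regress the mediator (Ficus richness) on an exogenous variable (a water-energy proxy), regress frugivore richness on both, and take the product of path coefficients as the indirect effect. The sketch below uses synthetic data and plain least squares, so it mirrors only the logic of the analysis, not the published model.

```python
# Sketch of the direct-vs-indirect logic of a simple path model, using ordinary
# least squares on synthetic data (numpy only). Variable names mirror the
# abstract (Ficus richness, a water-energy variable, frugivore richness) but
# the data and coefficients are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 500
water_energy = rng.normal(size=n)                              # exogenous predictor
ficus = 0.8 * water_energy + rng.normal(scale=0.6, size=n)     # mediator
frugivores = 1.2 * ficus + 0.1 * water_energy + rng.normal(scale=0.5, size=n)

def ols(y, *xs):
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]            # drop intercept

b_we_to_ficus, = ols(ficus, water_energy)
b_ficus, b_we_direct = ols(frugivores, ficus, water_energy)

print(f"direct effect of food plants on frugivores:  {b_ficus:.2f}")
print(f"direct effect of water-energy on frugivores: {b_we_direct:.2f}")
print(f"indirect effect of water-energy (via Ficus): {b_we_to_ficus * b_ficus:.2f}")
```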
NASA Astrophysics Data System (ADS)
Andrade, Ricardo G.; de C. Teixeira, Antônio H.; Sano, Edson E.; Leivas, Janice F.; Victoria, Daniel C.; Nogueira, Sandra F.
2014-10-01
The Alto Tocantins watershed, located in the Brazilian Savanna (Cerrado biome), is undergoing an intense land use and occupation process, causing increased pressure on natural resources. Pasture areas in the region are highly relevant to the rational use of natural resources in order to achieve economic and environmental sustainability. In this context, remote sensing techniques have been essential for obtaining information relevant to the assessment of vegetation conditions on a large scale. This study aimed to apply such techniques in conjunction with field measurements to evaluate evapotranspiration (ET) against pasture degradation indicators. The SAFER algorithm was applied to estimate ET using MODIS images and weather station data for 2012. Results showed that ET was lower in degraded pastures. Notably, during the low-rainfall period, ET values were 22.2% lower than in non-degraded pastures. This difference in ET indicates changes in the partitioning of the energy balance and may affect the microclimate. These results may contribute to public policies that aim to reduce the loss of the productive potential of pastures.
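A relative ET difference of the kind reported here can be computed directly from a per-pixel ET grid and a pasture-condition mask. The sketch below is not the authors' code: it uses synthetic numpy arrays in place of MODIS-derived ET and an arbitrary degradation mask, and simply contrasts mean ET over degraded versus non-degraded pixels.

```python
# Minimal sketch (not the authors' code) of how a relative ET difference like
# the 22.2% reported above could be computed from a per-pixel ET raster and a
# pasture-condition mask, with numpy arrays standing in for MODIS-derived grids.
import numpy as np

rng = np.random.default_rng(42)
shape = (200, 200)

# Synthetic stand-ins: daily ET (mm/day) and a boolean degradation mask.
et = rng.normal(loc=3.0, scale=0.6, size=shape)
degraded = rng.random(shape) < 0.3          # ~30% of pixels flagged as degraded
et[degraded] -= 0.7                         # impose lower ET on degraded pasture

et_degraded = et[degraded].mean()
et_intact = et[~degraded].mean()
reduction_pct = 100.0 * (1.0 - et_degraded / et_intact)
print(f"mean ET degraded: {et_degraded:.2f} mm/day, "
      f"non-degraded: {et_intact:.2f} mm/day, reduction: {reduction_pct:.1f}%")
```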
Creating a global sub-daily precipitation dataset
NASA Astrophysics Data System (ADS)
Lewis, Elizabeth; Blenkinsop, Stephen; Fowler, Hayley
2017-04-01
Extremes of precipitation can cause flooding and droughts, which can lead to substantial damage to infrastructure and ecosystems and can result in loss of life. It is still uncertain how hydrological extremes will change with global warming, as we do not fully understand the processes that cause extreme precipitation under current climate variability. The INTENSE project is using a novel and fully integrated data-modelling approach to provide a step change in our understanding of the nature and drivers of global precipitation extremes and change on societally relevant timescales, leading to improved high-resolution climate model representation of extreme rainfall processes. The INTENSE project is run in conjunction with the World Climate Research Programme (WCRP)'s Grand Challenge on 'Understanding and Predicting Weather and Climate Extremes' and the Global Water and Energy Exchanges Project (GEWEX) science questions. The first step towards achieving this is to construct a new global sub-daily precipitation dataset. Data collection is ongoing and already covers North America, Europe, Asia and Australasia. Comprehensive, open-source quality control software is being developed to set a new standard for verifying sub-daily precipitation data, and a set of global hydroclimatic indices will be produced based on stakeholder recommendations. This will provide a unique global data resource on sub-daily precipitation whose derived indices, e.g. monthly/annual maxima, will be freely available to the wider scientific community.
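The index-derivation step can be illustrated with a short sketch: apply simple range checks to an hourly gauge series, mask flagged hours, and derive indices such as annual and monthly one-hour maxima and wet-hour counts. The thresholds and indices below are illustrative assumptions, not the INTENSE project's actual quality-control rules, and the series is synthetic.

```python
# Small sketch of the kind of quality control and index derivation described
# above, applied to an hourly rain-gauge series (pandas). Thresholds and the
# indices shown (annual/monthly maxima, wet-hour count) are illustrative only.
import numpy as np
import pandas as pd

# Synthetic hourly series standing in for one gauge record.
idx = pd.date_range("2000-01-01", "2004-12-31 23:00", freq="h")
rng = np.random.default_rng(1)
rain = pd.Series(rng.gamma(shape=0.05, scale=4.0, size=len(idx)), index=idx)

# Basic QC: flag negative values and physically implausible hourly totals.
flags = (rain < 0) | (rain > 400)            # 400 mm/h as a crude upper bound
clean = rain.mask(flags)                     # flagged hours become NaN

# Hydroclimatic indices from the cleaned series.
annual_max_1h = clean.groupby(clean.index.year).max()               # per-year 1-hour maxima
monthly_max_1h = clean.groupby([clean.index.year, clean.index.month]).max()
wet_hours_per_year = (clean > 0.1).groupby(clean.index.year).sum()

print("flagged hours:", int(flags.sum()))
print("annual 1-hour maxima (mm):")
print(annual_max_1h.round(1))
```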
Caruthers, Allison S.; Van Ryzin, Mark J.; Dishion, Thomas J.
2013-01-01
Adolescent study participants who engaged in a brief, family-centered intervention (the Family Check-Up; FCU) were later assessed for the intervention’s effects on high-risk sexual behavior (HRSB) in early adulthood (age 22). Participants (N = 998 adolescents and their families) were randomly assigned to a family-centered intervention in 6th grade and were offered a gated, multilevel intervention that included (a) a school-based family resource center, (b) the FCU, and (c) more intensive, family-based treatment. All services were voluntary, but high-risk families were actively recruited into the FCU. Approximately 23% of the intervention families engaged in the FCU and approximately 18% engaged in more intensive treatment. Using an intent-to-treat design, we found that the direct effect of the FCU on HRSB was not significant; however, an analysis of the developmental processes indicated that intervention families demonstrated improved family relationship quality when compared to control families, which in turn resulted in lower levels of HRSB in early adulthood. Further, the significant effect of family relationship quality on HRSB was mediated by differences in parental monitoring and early sexual activity, and these effects varied as a function of gender and ethnicity. Indirect effects of the FCU on HRSB were significant via multiple different pathways. The implications of these findings for enhancing the impact of family-centered interventions are discussed. PMID:23536124
2011-01-01
Background: Crew resource management (CRM) has the potential to enhance patient safety in intensive care units (ICU) by improving the use of non-technical skills. However, CRM evaluation studies in health care are inconclusive with regard to the effect of this training on behaviour and organizational outcomes, due to weak study designs and the scarce use of direct observations. Therefore, the aim of this study is to determine the effectiveness and cost-effectiveness of CRM training on attitude, behaviour and organization after one year, using a multi-method approach and matched control units. The purpose of the present article is to describe in detail the study protocol and the underlying choices of this evaluation study of CRM in the ICU. Methods/Design: Six ICUs participated in a paired controlled trial, with one pre-test and two post-test measurements (three months and one year after the training, respectively). Three ICUs were trained and compared to matched control ICUs. The 2-day classroom-based training was delivered to multidisciplinary groups. Typical CRM topics on the individual, team and organizational level were discussed, such as situational awareness, leadership and communication. All levels of Kirkpatrick's evaluation framework (reaction, learning, behaviour and organisation) were assessed using questionnaires, direct observations, interviews and routine ICU administration data. Discussion: It is expected that the CRM training acts as a generic intervention that stimulates specific interventions. Besides effectiveness and cost-effectiveness, the assessment of barriers and facilitators will provide insight into the implementation process of CRM. Trial registration: Netherlands Trial Register (NTR): NTR1976. PMID:22073981
On the Modeling and Management of Cloud Data Analytics
NASA Astrophysics Data System (ADS)
Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni
A new era is dawning in which vast amounts of data are subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good run-time performance to data analytics workloads is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their patterns of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. Throughout, we use the Map-Reduce paradigm as an illustration of data analytics.
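The Map-Reduce paradigm referred to above can be summarized in a few lines: a map phase emits key-value pairs from input records, a shuffle groups values by key, and a reduce phase aggregates each group. The toy, single-process version below shows only this dataflow; in the cloud setting discussed here, each phase would be distributed across allocated compute and storage resources.

```python
# Toy, single-process illustration of the Map-Reduce dataflow used above as the
# running example of a data-intensive analytics workload: map emits key-value
# pairs, a shuffle groups them by key, and reduce aggregates each group.
from collections import defaultdict

def map_phase(records):
    for record in records:                  # map: one record -> many (key, value) pairs
        for word in record.lower().split():
            yield word, 1

def shuffle(pairs):
    groups = defaultdict(list)              # shuffle: group values by key
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}   # reduce: aggregate

logs = ["user clicked ad", "user clicked link", "galaxy survey data ingested"]
print(reduce_phase(shuffle(map_phase(logs))))
```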
Variation in normal mood state influences sensitivity to dynamic changes in emotional expression.
Jackson, Margaret C; Arlegui-Prieto, Maritxu
2016-03-01
Normal social functioning depends on the ability to efficiently and accurately detect when someone's facial expression changes to convey positive or negative emotion. While observer mood state has been shown to influence emotion recognition, how variations in normal mood might influence sensitivity to the dynamic emergence of expressions has not yet been addressed. To investigate this, we modified an existing face-morphing paradigm in which a central face gradually changes from neutral to expressive (angry, sad, happy, surprised). Our sample comprised healthy young adults, and current mood state was measured using the PANAS-X. Participants pressed a key as soon as they (1) noticed a physical change in expression (perceptual sensitivity; a novel task element), and (2) could clearly conceptualize which expression was emerging (conceptual sensitivity). A final unspeeded response required participants to explicitly label the expression as a measure of recognition accuracy. We measured the percentage morph (expression intensity) at which a perceptual and conceptual change was detected, where greater intensity equates to poorer sensitivity. Increased positive mood reduced perceptual and conceptual sensitivity to angry and sad expressions only (a mood-incongruency effect). Of particular interest, increased negative mood decreased conceptual sensitivity for all expressions, but had limited impact on perceptual sensitivity. Thus, heightened negative mood is particularly detrimental for effectively decoding someone else's mood change. This may reflect greater introspection and consumption of attentional resources directed toward the negative self, leaving fewer resources to process emotional signals conveyed by others. This could have important consequences for human social interaction.
NASA Technical Reports Server (NTRS)
Cole, Bjorn; Chung, Seung H.
2012-01-01
One of the challenges of systems engineering is in working multidisciplinary problems in a cohesive manner. When planning analysis of these problems, system engineers must trade off time and cost against analysis quality and quantity. The quality is associated with the fidelity of the multidisciplinary models and the quantity is associated with the design space that can be analyzed. The trade-off is due to the resource-intensive process of creating a cohesive multidisciplinary system model and analysis. Furthermore, reuse or extension of the models used in one stage of a product life cycle for another is a major challenge. Recent developments have enabled a much less resource-intensive and more rigorous approach than handwritten translation scripts or codes for multidisciplinary models and their analyses. The key is to work from a core system model defined in a MOF-based language such as SysML and to leverage the emerging tool ecosystem, such as Query/View/Transformation (QVT), from the OMG community. SysML was designed to model multidisciplinary systems and analyses. The QVT standard was designed to transform SysML models. The Europa Habitability Mission (EHM) team has begun to exploit these capabilities. In one case, a Matlab/Simulink model is generated on the fly from a system description for power analysis written in SysML. In a more general case, a symbolic mathematical framework (supported by Wolfram Mathematica) is coordinated by data objects transformed from the system model, enabling extremely flexible and powerful tradespace exploration and analytical investigations of expected system performance.
NASA Technical Reports Server (NTRS)
Cole, Bjorn; Chung, Seung
2012-01-01
One of the challenges of systems engineering is in working multidisciplinary problems in a cohesive manner. When planning analysis of these problems, system engineers must trade between time and cost for analysis quality and quantity. The quality often correlates with greater run time in multidisciplinary models and the quantity is associated with the number of alternatives that can be analyzed. The trade-off is due to the resource intensive process of creating a cohesive multidisciplinary systems model and analysis. Furthermore, reuse or extension of the models used in one stage of a product life cycle for another is a major challenge. Recent developments have enabled a much less resource-intensive and more rigorous approach than hand-written translation scripts between multi-disciplinary models and their analyses. The key is to work from a core systems model defined in a MOF-based language such as SysML and in leveraging the emerging tool ecosystem, such as Query/View/Transformation (QVT), from the OMG community. SysML was designed to model multidisciplinary systems. The QVT standard was designed to transform SysML models into other models, including those leveraged by engineering analyses. The Europa Habitability Mission (EHM) team has begun to exploit these capabilities. In one case, a Matlab/Simulink model is generated on the fly from a system description for power analysis written in SysML. In a more general case, symbolic analysis (supported by Wolfram Mathematica) is coordinated by data objects transformed from the systems model, enabling extremely flexible and powerful design exploration and analytical investigations of expected system performance.
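The "generate the analysis from the system model" pattern described in both of these abstracts can be sketched without any SysML or QVT tooling: walk an in-memory stand-in for the system description and emit an executable analysis artifact, here a small MATLAB/Octave script that sums duty-cycled component power draws. The component names and numbers are invented, and plain Python stands in for what the real workflow does with SysML models and QVT transformations.

```python
# Sketch of the model-to-analysis idea described above: walk a tiny in-memory
# stand-in for a SysML system description and emit an executable analysis
# artifact (a MATLAB/Octave script summing component power draws). This only
# illustrates the "generate the analysis from the system model" pattern; the
# real workflow uses SysML tooling and QVT transformations.
system_model = {
    "name": "orbiter_power_demo",
    "blocks": [
        {"name": "flight_computer",  "power_w": 35.0, "duty_cycle": 1.00},
        {"name": "radar_sounder",    "power_w": 60.0, "duty_cycle": 0.25},
        {"name": "comm_transmitter", "power_w": 80.0, "duty_cycle": 0.10},
    ],
}

def generate_power_script(model):
    lines = [f"% auto-generated from system model '{model['name']}'",
             "names = {};", "avg_power = [];"]
    for i, block in enumerate(model["blocks"], start=1):
        avg = block["power_w"] * block["duty_cycle"]          # duty-cycled average draw
        lines.append(f"names{{{i}}} = '{block['name']}'; avg_power({i}) = {avg:.2f};")
    lines.append("fprintf('total average power: %.1f W\\n', sum(avg_power));")
    return "\n".join(lines)

print(generate_power_script(system_model))
```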