Evaluating the process parameters of the dry coating process using a 2^(5-1) factorial design.
Kablitz, Caroline Désirée; Urbanetz, Nora Anne
2013-02-01
A recent development in coating technology is dry coating, in which polymer powder and liquid plasticizer are layered onto the cores without using organic solvents or water. Several studies evaluating the process have been reported in the literature; however, little information is given about its critical process parameters (CPPs). The aim of this study was the investigation and optimization of CPPs with respect to one of the critical quality attributes (CQAs), the coating efficiency, of the dry coating process in a rotary fluid bed. Theophylline pellets were coated with hydroxypropyl methylcellulose acetate succinate as enteric film former and triethyl citrate and acetylated monoglyceride as plasticizers. A 2^(5-1) design of experiments (DOE) was created to investigate five independent process parameters, namely coating temperature, curing temperature, feeding/spraying rate, air flow and rotor speed. The results were evaluated by multilinear regression using the software Modde® 7. It is shown that, in general, low feeding/spraying rates and low rotor speeds increase coating efficiency. High coating temperatures enhance coating efficiency, whereas medium curing temperatures were found to be optimal. This study provides a scientific basis for the design of efficient dry coating processes with respect to coating efficiency.
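The study's design and regression were built in Modde® 7; as a rough illustration of the same workflow, the sketch below generates a coded 2^(5-1) fractional factorial design (fifth factor aliased with the four-way interaction) and fits a main-effects multilinear model in Python. The factor names, the aliasing generator and the response values are assumptions for illustration, not data from the paper.

```python
# Minimal sketch (not the study's Modde 7 workflow): generate a coded 2^(5-1)
# fractional factorial design and fit a main-effects multilinear regression.
# Factor names, the generator E = ABCD and the responses are hypothetical.
import itertools
import numpy as np

factors = ["coating_T", "curing_T", "feed_rate", "air_flow", "rotor_speed"]

# Full 2^4 design in the first four factors (coded -1/+1); the fifth column is
# the product of the other four (generator E = ABCD), giving 16 runs.
base = np.array(list(itertools.product([-1, 1], repeat=4)), dtype=float)
design = np.column_stack([base, base.prod(axis=1)])

# Hypothetical coating-efficiency responses (%) for the 16 runs
rng = np.random.default_rng(0)
y = 80 + 5 * design[:, 0] - 4 * design[:, 2] - 3 * design[:, 4] + rng.normal(0, 1, 16)

# Multilinear (main-effects) model: y = b0 + sum_i b_i * x_i
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["intercept"] + factors, coef):
    print(f"{name:12s} {b:+6.2f}")
```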
ERIC Educational Resources Information Center
Research for Better Schools, Inc., Philadelphia, PA.
The process for providing a "thorough and efficient" (T & E) education according to New Jersey statutes and regulations involves six basic steps. This document suggests procedures for handling the fifth step, educational program evaluation. Processes discussed include committee formation, evaluation planning, action plan…
Indicators and Metrics for Evaluating the Sustainability of Chemical Processes
A metric-based method, called GREENSCOPE, has been developed for evaluating process sustainability. Using lab-scale information and engineering assumptions, the method evaluates full-scale representations of processes in the environmental, efficiency, energy and economic areas. The m...
NASA Technical Reports Server (NTRS)
Wolf, M.
1979-01-01
To facilitate the task of objectively comparing competing process options, a methodology was needed for the quantitative evaluation of their relative cost effectiveness. Such a methodology was developed and is described, together with three examples for its application. The criterion for the evaluation is the cost of the energy produced by the system. The method permits the evaluation of competing design options for subsystems, based on the differences in cost and efficiency of the subsystems, assuming comparable reliability and service life, or of competing manufacturing process options for such subsystems, which include solar cells or modules. This process option analysis is based on differences in cost, yield, and conversion efficiency contribution of the process steps considered.
Evaluation of the energy efficiency of enzyme fermentation by mechanistic modeling.
Albaek, Mads O; Gernaey, Krist V; Hansen, Morten S; Stocks, Stuart M
2012-04-01
Modeling biotechnological processes is key to obtaining increased productivity and efficiency. Particularly crucial to successful modeling of such systems is the coupling of the physical transport phenomena and the biological activity in one model. We have applied a model for the expression of cellulosic enzymes by the filamentous fungus Trichoderma reesei and found excellent agreement with experimental data. The most influential factor was demonstrated to be viscosity and its influence on mass transfer. Not surprisingly, the biological model is also shown to have high influence on the model prediction. At different rates of agitation and aeration as well as headspace pressure, we can predict the energy efficiency of oxygen transfer, a key process parameter for economical production of industrial enzymes. An inverse relationship between the productivity and energy efficiency of the process was found. This modeling approach can be used by manufacturers to evaluate the enzyme fermentation process for a range of different process conditions with regard to energy efficiency. Copyright © 2011 Wiley Periodicals, Inc.
76 FR 37344 - Technology Evaluation Process
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-27
...-NOA-0039] Technology Evaluation Process AGENCY: Office of Energy Efficiency and Renewable Energy... is an extension of a prior RFI seeking comment on a proposed commercial buildings technology... seeks comments and information related to a commercial buildings technology evaluation process. DOE is...
76 FR 30696 - Technology Evaluation Process
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
...-NOA-0039] Technology Evaluation Process AGENCY: Office of Energy Efficiency and Renewable Energy... (DOE) seeks comments and information related to a commercial buildings technology evaluation process... technologies for commercial buildings based on the voluntary submittal of product test data. The program would...
Silicon solar cell process. Development, fabrication and analysis
NASA Technical Reports Server (NTRS)
Yoo, H. I.; Iles, P. A.; Tanner, D. P.
1978-01-01
Solar cells were fabricated from unconventional silicon sheets, and their performance was characterized with an emphasis on statistical evaluation. A number of solar cell fabrication processes were used, and conversion efficiency was measured under AM0 conditions at 25 C. Silso solar cells using standard processing showed an average efficiency of about 9.6%. Solar cells made with a back surface field process showed about the same efficiency as cells from the standard process. Solar cells from a grain boundary passivation process did not show any improvement in performance.
Finite-size effect on optimal efficiency of heat engines.
Tajima, Hiroyasu; Hayashi, Masahito
2017-07-01
The optimal efficiency of quantum (or classical) heat engines whose heat baths are n-particle systems is given by the strong large deviation. We give the optimal work extraction process as a concrete energy-preserving unitary time evolution among the heat baths and the work storage. We show that our optimal work extraction turns the disordered energy of the heat baths to the ordered energy of the work storage, by evaluating the ratio of the entropy difference to the energy difference in the heat baths and the work storage, respectively. By comparing the statistical mechanical optimal efficiency with the macroscopic thermodynamic bound, we evaluate the accuracy of the macroscopic thermodynamics with finite-size heat baths from the statistical mechanical viewpoint. We also evaluate the quantum coherence effect on the optimal efficiency of the cycle processes without restricting their cycle time by comparing the classical and quantum optimal efficiencies.
Improvement of the Performance of an Electrocoagulation Process System Using Fuzzy Control of pH.
Demirci, Yavuz; Pekel, Lutfiye Canan; Altinten, Ayla; Alpbaz, Mustafa
2015-12-01
The removal efficiencies of electrocoagulation (EC) systems are highly dependent on the initial value of pH. If an EC system has an acidic influent, the pH of the effluent increases during the treatment process; conversely, if such a system has an alkaline influent, the pH of the effluent decreases during the treatment process. Thus, changes in the pH of the wastewater affect the efficiency of the EC process. In this study, we investigated the dynamic effects of pH. To evaluate approaches for preventing increases in the pH of the system, the MATLAB/Simulink program was used to develop and evaluate an on-line computer-based system for pH control. The aim of this work was to study Proportional-Integral-Derivative (PID) control and fuzzy control of the pH of a real textile wastewater purification process using EC. The performances and dynamic behaviors of these two control systems were evaluated based on determinations of COD, colour, and turbidity removal efficiencies.
Li, Rundong; Li, Yanlong; Yang, Tianhua; Wang, Lei; Wang, Weiyun
2015-05-30
Evaluations of technologies for heavy metal control mainly examine the residual and leaching rates of a single heavy metal, so the evaluation methods developed lack coordination and uniqueness and are therefore unsuitable for evaluating hazard control effectiveness. An overall pollution toxicity index (OPTI) was established in this paper; based on this index, an integrated evaluation method for heavy metal pollution control was established. Application of this method to the melting and sintering of fly ash revealed the following results: the integrated control efficiency of the melting process was higher in all instances than that of the sintering process. The lowest integrated control efficiency of melting was 56.2%, and the highest integrated control efficiency of sintering was 46.6%. Using the same technology, higher integrated control efficiencies were all achieved at lower temperatures and shorter times. This study demonstrated the unification and consistency of this method. Copyright © 2015 Elsevier B.V. All rights reserved.
Efficiency of Osmotic Dehydration of Apples in Polyols Solutions.
Cichowska, Joanna; Żubernik, Joanna; Czyżewski, Jakub; Kowalska, Hanna; Witrowa-Rajchert, Dorota
2018-02-17
The present study aimed to evaluate the influence of selected compounds from the polyol group, as well as other saccharides, on the osmotic dehydration of apples. The following alternative solutions were examined: erythritol, xylitol, maltitol, inulin and oligofructose. The efficiency of the osmotic dehydration process was evaluated based on the kinetics of the process and by comparison with the results obtained using a sucrose solution. This innovative research uses alternative solutions in osmotic pretreatment which, until now, have not been commonly used in fruit processing by researchers worldwide. The results indicate that erythritol and xylitol show stronger or similar efficiency to sucrose; however, the use of inulin and oligofructose was not satisfactory owing to the insufficient osmotic driving force of the process and the low values of the mass transfer parameters.
Evaluation of the Treatment Process of Landfill Leachate Using the Toxicity Assessment Method
Qiu, Aifeng; Cai, Qiang; Zhao, Yuan; Guo, Yingqing; Zhao, Liqian
2016-01-01
Landfill leachate has a complex composition and strong biological toxicity. A combined treatment process of coagulation and sedimentation, anaerobic treatment, electrolysis, and aerobic treatment was set up to treat landfill leachate. This paper explores the effect of different operational parameters of the coagulation and sedimentation tanks and the electrolytic cells, and investigates the removal efficiency of the combined process for physicochemical indices after processing the landfill leachate. Meanwhile, a battery of toxicity tests with Vibrio fischeri, zebrafish larvae, and zebrafish embryos was conducted to evaluate acute toxicity and to calculate the toxicity reduction efficiency after each treatment step. The combined treatment process achieved a 100% removal efficiency for Cu, Cd and Zn, and 93.50% and 87.44% removal efficiencies for Ni and Cr, respectively. The overall removal efficiencies of chemical oxygen demand (COD), ammonium nitrogen (NH4+-N), and total nitrogen (TN) were 93.57%, 97.46% and 73.60%, respectively. In addition, the toxicity test results showed that the acute toxicity of the landfill leachate was also reduced significantly: toxicity units (TU) decreased from 84.75 to 12.00 for zebrafish larvae, from 82.64 to 10.55 for zebrafish embryos, and from 3.41 to 0.63 for Vibrio fischeri. The combined treatment process proved to be an efficient method for removing heavy metals, COD, NH4+-N, and the acute bio-toxicity of landfill leachate. PMID:28009808
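As a point of reference (these are standard definitions commonly used in leachate treatment and toxicity studies, not equations quoted from the paper), the removal efficiency of an index and the acute toxicity unit are typically computed as
\[ R = \frac{C_{in} - C_{out}}{C_{in}} \times 100\%, \qquad \mathrm{TU} = \frac{100}{EC_{50}\,(\%\ \mathrm{v/v})}, \]
where \(C_{in}\) and \(C_{out}\) are the influent and effluent values of the index (e.g., COD) and \(EC_{50}\) is the leachate concentration, expressed as a volume percentage, affecting 50% of the test organisms; the toxicity reduction efficiency then follows by applying the first formula to the TU values.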
Towards a Cloud Based Smart Traffic Management Framework
NASA Astrophysics Data System (ADS)
Rahimi, M. M.; Hakimpour, F.
2017-09-01
Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing and analysis of traffic big data, may hinder their efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud computing based framework can answer the technical challenges of efficient and real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we have used OpenStreetMap (OSM) real trajectories and road network on a distributed environment. Our evaluation results indicate that the data import speed of this framework exceeds 8000 records per second when the dataset size approaches 5 million records. We also evaluated the data retrieval performance of the proposed framework; the retrieval speed exceeds 15000 records per second at the same dataset size. We have also evaluated the scalability and performance of the proposed framework using parallelisation of a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.
Amendola, Alessandra; Coen, Sabrina; Belladonna, Stefano; Pulvirenti, F Renato; Clemens, John M; Capobianchi, M Rosaria
2011-08-01
Diagnostic laboratories need automation that facilitates efficient processing and workflow management to meet today's challenges of expanding services and reducing cost while maintaining the highest levels of quality. The processing efficiency of two commercially available automated systems for quantifying HIV-1 and HCV RNA, the Abbott m2000 system and the Roche COBAS Ampliprep/COBAS TaqMan 96 (docked) system (CAP/CTM), was evaluated in a mid/high-throughput workflow laboratory using a representative daily workload of 24 HCV and 72 HIV samples. Three test scenarios were evaluated: A) one run with four batches on the CAP/CTM system, B) two runs on the Abbott m2000 and C) one run using the Abbott m2000 maxCycle feature (maxCycle) for co-processing these assays. Cycle times for processing, throughput and hands-on time were evaluated. Overall processing cycle time was 10.3, 9.1 and 7.6 h for Scenarios A), B) and C), respectively. Total hands-on time for each scenario was, in order, 100.0 (A), 90.3 (B) and 61.4 min (C). The interface of an automated analyzer to the laboratory workflow, notably system setup for samples and reagents and clean-up functions, is as important as the automation capability of the analyzer for the overall impact on processing efficiency and operator hands-on time.
Li, Chao; Nges, Ivo Achu; Lu, Wenjing; Wang, Haoyu
2017-11-01
The increasing popularity and application of the anaerobic digestion (AD) process has necessitated the development and identification of tools for obtaining reliable indicators of the organic matter degradation rate and hence evaluating process efficiency, especially in full-scale, commercial biogas plants. In this study, four biogas plants (A1, A2, B and C) based on different feedstocks, process configurations, scales and operational performance were selected and investigated. The results showed that the biochemical methane potential (BMP) based degradation rate could be used to gauge process efficiency incisively in lieu of the traditional degradation rate indicators. The BMP degradation rates ranged from 70 to 90%, with plants A2 and C showing the highest throughput. This study therefore corroborates the feasibility of using the BMP degradation rate as a practical tool for evaluating process performance in full-scale biogas processes and sheds light on the microbial diversity in full-scale biogas processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Product Evaluation of the Selective Abandonment Process for School Budgeting
ERIC Educational Resources Information Center
Loofe, Christopher M.
2016-01-01
The purpose of this study is to evaluate the degree to which the Selective Abandonment budget process objectives were achieved by analyzing stakeholder perceptions. Use of this evaluation may enable the district to become more effective, efficient, and more fiscally responsible when developing future program budgeting plans. Program evaluation was…
The evaluation model of the enterprise energy efficiency based on DPSR.
Wei, Jin-Yu; Zhao, Xiao-Yu; Sun, Xue-Shan
2017-05-08
Reasonable evaluation of enterprise energy efficiency is important for reducing energy consumption. In this paper, an effective energy efficiency evaluation index system is proposed based on DPSR (Driving forces-Pressure-State-Response), taking the actual situation of enterprises into consideration. This index system, which covers multi-dimensional indexes of enterprise energy efficiency, can reveal the complete causal chain: the "driving forces" and "pressures" behind the enterprise energy efficiency "state" caused by the internal and external environment, and the ultimate enterprise energy-saving "response" measures. Furthermore, the ANP (Analytic Network Process) and a cloud model are used to calculate the weight of each index and evaluate the energy efficiency level. An analysis of BL Company verifies the feasibility of this index system and provides an effective way to improve energy efficiency.
NASA Astrophysics Data System (ADS)
Li, L.; Zhao, Y.; Wang, L.; Yang, Q.; Liu, G.; Tang, B.; Xiao, J.
2017-08-01
In this paper, the background of performance testing of in-service process flow compressors installed in the field is introduced, the main technical barriers faced in field testing are summarized, and the factors that result in the real efficiencies of most process flow compressors being lower than the values guaranteed by manufacturers are analysed. The authors investigated the present operational situation of process flow compressors in China and found that the low-efficiency operation of flow compressors occurs because the compressed gas is generally forced to flow back into the inlet pipe to adapt to variations in process parameters; for example, the anti-surge valve of a centrifugal compressor is always kept open. To improve the operating efficiency of process compressors, energy efficiency monitoring technology is reviewed and some suggestions are proposed, which form the basis for research on the energy efficiency evaluation and/or labelling of process compressors.
EFFECT OF LOADING DUST TYPE ON THE FILTRATION EFFICIENCY OF ELECTROSTATICALLY CHARGED FILTERS
The paper gives results of an evaluation of the effect of loading dust type on the filtration efficiency of electrostatically charged filters. Three types of filters were evaluated: a rigid-cell filter charged using an electrodynamic spinning process, a pleated-panel filter cha...
Eco-Efficiency Analysis of biotechnological processes.
Saling, Peter
2005-07-01
Eco-Efficiency has been variously defined and analytically implemented by several workers. In most cases, Eco-Efficiency is taken to mean the ecological optimization of overall systems without disregarding economic factors. Eco-Efficiency should increase the positive ecological performance of a commercial company in relation to economic value creation, or reduce negative effects. Several companies use Eco-Efficiency Analysis for decision-making processes, and industrial examples of best practices in developing and implementing Eco-Efficiency have been reviewed. They clearly demonstrate the environmental and business benefits of Eco-Efficiency. An instrument for the early recognition and systematic detection of economic and environmental opportunities and risks for production processes in the chemical industry has been in use since 1997; since then, new features have been developed, leading to many examples. This powerful Eco-Efficiency Analysis allows a feasibility evaluation of existing and future business activities and is applied by BASF. In many cases, decision-makers are able to choose among alternative processes for making a product.
Canseco Grellet, M A; Castagnaro, A; Dantur, K I; De Boeck, G; Ahmed, P M; Cárdenas, G J; Welin, B; Ruiz, R M
2016-10-01
To calculate fermentation efficiency in a continuous ethanol production process, we aimed to develop a robust mathematical method based on the analysis of metabolic by-product formation. This method is in contrast to the traditional way of calculating ethanol fermentation efficiency, where the ratio between the ethanol produced and the sugar consumed is expressed as a percentage of the theoretical conversion yield. Comparison between the two methods, at industrial scale and in sensitivity studies, showed that the indirect method was more robust and gave slightly higher fermentation efficiency values, although the fermentation efficiency of the industrial process was found to be low (~75%). The traditional calculation method is simpler than the indirect method, as it only requires a few chemical determinations in the samples collected. However, a minor error in any measured parameter will have an important impact on the calculated efficiency. In contrast, the indirect method of calculation requires a greater number of determinations but is much more robust, since an error in any parameter will only have a minor effect on the fermentation efficiency value. Application of the indirect calculation methodology is recommended for evaluating the real situation of the process and reaching an optimum fermentation yield in industrial-scale ethanol production. Once a high fermentation yield has been reached, the traditional method should be used to maintain control of the process. Upon detection of lower yields in an optimized process, the indirect method should be employed, as it permits a more accurate diagnosis of the causes of yield losses so that the problem can be corrected rapidly. The low fermentation efficiency obtained in this study shows an urgent need for industrial process optimization, for which the indirect calculation methodology will be an important tool to determine process losses. © 2016 The Society for Applied Microbiology.
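For context (a standard textbook formulation, not an equation reproduced from the paper), the traditional direct estimate of fermentation efficiency referred to above can be written as
\[ \eta_{ferm} = \frac{m_{EtOH}}{0.511\, m_{sugar}} \times 100\%, \]
where \(m_{EtOH}\) is the mass of ethanol produced, \(m_{sugar}\) is the mass of fermentable sugar consumed, and 0.511 g g\(^{-1}\) is the theoretical (Gay-Lussac) yield of ethanol from hexose sugars. Any error in the measured ethanol or sugar terms propagates directly into \(\eta_{ferm}\), which is the sensitivity the indirect, by-product-based method is designed to avoid.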
Shepherd, Jonathan; Frampton, Geoff K; Pickett, Karen; Wyatt, Jeremy C
2018-01-01
To investigate methods and processes for timely, efficient and good quality peer review of research funding proposals in health. A two-stage evidence synthesis: (1) a systematic map to describe the key characteristics of the evidence base, followed by (2) a systematic review of the studies stakeholders prioritised as relevant from the map on the effectiveness and efficiency of peer review 'innovations'. Standard processes included literature searching, duplicate inclusion criteria screening, study keyword coding, data extraction, critical appraisal and study synthesis. A total of 83 studies from 15 countries were included in the systematic map. The evidence base is diverse, investigating many aspects of the systems for, and processes of, peer review. The systematic review included eight studies from Australia, Canada, and the USA, evaluating a broad range of peer review innovations. These studies showed that simplifying the process by shortening proposal forms, using smaller reviewer panels, or expediting processes can speed up the review process and reduce costs, but this might come at the expense of peer review quality, a key aspect that has not been assessed. Virtual peer review using videoconferencing or teleconferencing appears promising for reducing costs by avoiding the need for reviewers to travel, but again any consequences for quality have not been adequately assessed. There is increasing international research activity into the peer review of health research funding. The studies reviewed had methodological limitations and variable generalisability to research funders. Given these limitations it is not currently possible to recommend immediate implementation of these innovations. However, many appear promising based on existing evidence, and could be adapted as necessary by funders and evaluated. Where feasible, experimental evaluation, including randomised controlled trials, should be conducted, evaluating impact on effectiveness, efficiency and quality.
Biocatalytic Synthesis of the Rare Sugar Kojibiose: Process Scale-Up and Application Testing.
Beerens, Koen; De Winter, Karel; Van de Walle, Davy; Grootaert, Charlotte; Kamiloglu, Senem; Miclotte, Lisa; Van de Wiele, Tom; Van Camp, John; Dewettinck, Koen; Desmet, Tom
2017-07-26
Cost-efficient (bio)chemical production processes are essential for evaluating the commercial and industrial applications of promising carbohydrates and for ensuring economically viable production. Here, the synthesis of the naturally occurring disaccharide kojibiose (2-O-α-d-glucopyranosyl-d-glucopyranoside) was evaluated using different Bifidobacterium adolescentis sucrose phosphorylase variants. Variant L341I_Q345S was found to synthesize kojibiose efficiently while remaining fully active after 1 week of incubation at 55 °C. Process optimization allowed kojibiose production at the kilogram scale, and simple but efficient downstream processing, using a yeast treatment and crystallization, yielded more than 3 kg of highly pure crystalline kojibiose (99.8%). These amounts allowed a deeper characterization of its potential in food applications. It was found to have possible beneficial health effects, including delayed glucose release and the potential to trigger SCFA production. Finally, we compared the bulk functionality of highly pure kojibiose to that of sucrose, thereby mapping its potential as a new sweetener in confectionery products.
Efficiency analysis of wood processing industry in China during 2006-2015
NASA Astrophysics Data System (ADS)
Zhang, Kun; Yuan, Baolong; Li, Yanxuan
2018-03-01
The wood processing industry is an important industry that affects the national economy and social development. Data envelopment analysis (DEA) is a quantitative evaluation method for studying industrial efficiency. In this paper, the wood processing industry of 8 provinces in southern China is taken as the study object; the efficiency of each province from 2006 to 2015 was measured with the DEA method, and efficiency changes, technological changes and the Malmquist index were analyzed dynamically. The empirical results show that there is a widening gap in the efficiency of the wood processing industry among the 8 provinces, and that technological progress has lagged in promoting the wood processing industry. Based on these conclusions and on the state of wood processing industry development at home and abroad, the government must introduce relevant policies to strengthen the technology innovation policy system and the coordinated industrial development system of the wood processing industry.
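As background (the standard Färe et al. formulation commonly used with DEA, not necessarily the exact variant adopted in this paper), the output-oriented Malmquist productivity index between periods \(t\) and \(t+1\) is
\[ M = \left[ \frac{D^{t}(x^{t+1},y^{t+1})}{D^{t}(x^{t},y^{t})} \cdot \frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t+1}(x^{t},y^{t})} \right]^{1/2} = \underbrace{\frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t}(x^{t},y^{t})}}_{\text{efficiency change}} \cdot \underbrace{\left[ \frac{D^{t}(x^{t+1},y^{t+1})}{D^{t+1}(x^{t+1},y^{t+1})} \cdot \frac{D^{t}(x^{t},y^{t})}{D^{t+1}(x^{t},y^{t})} \right]^{1/2}}_{\text{technological change}}, \]
where \(D^{s}(x,y)\) is the distance function (DEA efficiency score) of input-output bundle \((x,y)\) evaluated against the period-\(s\) frontier; values of \(M\) above one indicate productivity growth.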
Self Evaluation of Organizations.
ERIC Educational Resources Information Center
Pooley, Richard C.
Evaluation within human service organizations is defined in terms of accepted evaluation criteria, with reasonable expectations shown and structured into a model of systematic evaluation practice. The evaluation criteria of program effort, performance, adequacy, efficiency and process mechanisms are discussed, along with measurement information…
Code of Federal Regulations, 2013 CFR
2013-07-01
... performance test for those control techniques in accordance with paragraph (b)(6) of this section. The design..., immediately preceding the use of the control technique. A design evaluation shall also address other vent... paragraph (f)(1)(i) of this section, the design evaluation shall document the control efficiency and address...
No Cost – Low Cost Compressed Air System Optimization in Industry
NASA Astrophysics Data System (ADS)
Dharma, A.; Budiarsa, N.; Watiniasih, N.; Antara, N. G.
2018-04-01
Energy conservation is a systematic, integrated effort to preserve energy sources and improve the efficiency of energy utilization, using energy efficiently without curtailing the functions it must serve. Energy conservation efforts are applied at all stages of utilization, from the exploitation of energy resources to final use, using efficient technology and cultivating an energy-efficient lifestyle. The most common way is to promote end-use energy efficiency in industry and to overcome barriers to achieving such efficiency by using system energy optimization programs. The facts show that energy saving efforts in the process usually focus only on replacing equipment rather than on improving the overall system. In this research, a framework for sustainable energy reduction in companies that have or have not implemented an energy management system (EnMS) is applied as a systematic technical approach to accurately evaluating a compressed-air system and its optimization potential: observation, measurement and verification of environmental conditions and processes, followed by analysis of the physical quantities of the system, such as air flow, pressure and electrical power, measured at any given time using comparative analysis methods. In this industry, such an approach offers potential energy savings greater than a component-level approach, at no cost to low cost. The process of evaluating energy utilization and energy saving opportunities provides recommendations for increasing efficiency in the industry, reducing CO2 emissions and improving environmental quality.
NASA Astrophysics Data System (ADS)
Kim, Euiyoung; Cho, Maenghyo
2017-11-01
In most non-linear analyses, the construction of a system matrix uses a large amount of computation time, comparable to the computation time required by the solving process. If the process for computing non-linear internal force matrices is substituted with an effective equivalent model that enables the bypass of numerical integrations and assembly processes used in matrix construction, efficiency can be greatly enhanced. A stiffness evaluation procedure (STEP) establishes non-linear internal force models using polynomial formulations of displacements. To efficiently identify an equivalent model, the method has evolved such that it is based on a reduced-order system. The reduction process, however, makes the equivalent model difficult to parameterize, which significantly affects the efficiency of the optimization process. In this paper, therefore, a new STEP, E-STEP, is proposed. Based on the element-wise nature of the finite element model, the stiffness evaluation is carried out element-by-element in the full domain. Since the unit of computation for the stiffness evaluation is restricted by element size, and since the computation is independent, the equivalent model can be constructed efficiently in parallel, even in the full domain. Due to the element-wise nature of the construction procedure, the equivalent E-STEP model is easily characterized by design parameters. Various reduced-order modeling techniques can be applied to the equivalent system in a manner similar to how they are applied in the original system. The reduced-order model based on E-STEP is successfully demonstrated for the dynamic analyses of non-linear structural finite element systems under varying design parameters.
NASA Astrophysics Data System (ADS)
Onizawa, Naoya; Tamakoshi, Akira; Hanyu, Takahiro
2017-08-01
In this paper, reinitialization-free nonvolatile computer systems are designed and evaluated for energy-harvesting Internet of Things (IoT) applications. In energy-harvesting applications, because power supplies generated from renewable power sources cause frequent power failures, processed data need to be backed up when power failures occur. Unless data are safely backed up before the power supply diminishes, reinitialization processes are required when the power supply is recovered, which results in low energy efficiency and slow operation. Using nonvolatile devices in processors and memories can realize a faster backup than a conventional volatile computer system, leading to higher energy efficiency. To evaluate the energy efficiency under frequent power failures, typical computer systems including processors and memories are designed using 90 nm CMOS or CMOS/magnetic tunnel junction (MTJ) technologies. Nonvolatile ARM Cortex-M0 processors with 4 kB MRAMs are evaluated using a typical computing benchmark program, Dhrystone, which shows energy reductions of a few orders of magnitude in comparison with a volatile processor with SRAM.
Zupanc, Mojca; Kosjek, Tina; Petkovšek, Martin; Dular, Matevž; Kompare, Boris; Širok, Brane; Blažeka, Željko; Heath, Ester
2013-07-01
To augment the removal of pharmaceuticals, different conventional and alternative wastewater treatment processes and their combinations were investigated. We tested the efficiency of (1) two distinct laboratory-scale biological processes: suspended activated sludge and attached-growth biomass, (2) a combined hydrodynamic cavitation-hydrogen peroxide process and (3) UV treatment. Five pharmaceuticals were chosen, including ibuprofen, naproxen, ketoprofen, carbamazepine and diclofenac, as well as an active metabolite of the lipid regulating agent clofibric acid. Biological treatment efficiency was evaluated using lab-scale suspended activated sludge and moving bed biofilm flow-through reactors, which were operated under identical conditions with respect to hydraulic retention time, working volume, concentration of added pharmaceuticals and synthetic wastewater composition. The suspended activated sludge process showed poor and inconsistent removal of clofibric acid, carbamazepine and diclofenac, while ibuprofen, naproxen and ketoprofen yielded over 74% removal. The moving bed biofilm reactors were filled with two different types of carriers, i.e. Kaldnes K1 and Mutag BioChip™, and resulted in higher removal efficiencies for ibuprofen and diclofenac. Augmentation and consistency in the removal of diclofenac were observed in reactors using Mutag BioChip™ carriers (85%±10%) compared to reactors using Kaldnes carriers and suspended activated sludge (74%±22% and 48%±19%, respectively). To enhance the removal of pharmaceuticals, a hydrodynamic cavitation/hydrogen peroxide process was evaluated and optimal conditions for removal were established regarding the duration of cavitation, the amount of added hydrogen peroxide and the initial pressure, all of which influence the efficiency of the process. The optimal parameters resulted in removal efficiencies between 3% and 70%. Coupling the attached-growth biomass biological treatment, the hydrodynamic cavitation/hydrogen peroxide process and UV treatment resulted in removal efficiencies of >90% for clofibric acid and >98% for carbamazepine and diclofenac, while the remaining compounds were reduced to levels below the LOD. For ibuprofen, naproxen, ketoprofen and diclofenac the highest contribution to overall removal was attributed to biological treatment; for clofibric acid UV treatment was the most efficient, while for carbamazepine the hydrodynamic cavitation/hydrogen peroxide process and UV treatment were equally efficient. Copyright © 2012 Elsevier B.V. All rights reserved.
Multiple-state quantum Otto engine, 1D box system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Latifah, E., E-mail: enylatifah@um.ac.id; Purwanto, A.
2014-03-24
Quantum heat engines produce work using quantum matter as their working substance. We studied adiabatic and isochoric processes and defined the generalized force for a quantum system. The processes and the generalized force are used to evaluate a quantum Otto engine based on multiple states of a one-dimensional box system and to calculate its efficiency. As a result, the efficiency depends on the ratio of the initial and final widths of the system under the adiabatic processes.
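As a minimal illustration of why only the width ratio matters (a textbook-style sketch, not the multiple-state analysis of the paper): for a particle in a one-dimensional box of width \(L\), the energy levels are
\[ E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \]
so an adiabatic change of the width from \(L_1\) to \(L_2\) rescales every level by \((L_1/L_2)^2\) while the occupation probabilities stay fixed. An Otto cycle that exchanges heat at the two widths \(L_1 < L_2\) therefore has efficiency
\[ \eta = 1 - \left(\frac{L_1}{L_2}\right)^{2}, \]
the quantum analogue of the classical Otto result \(\eta = 1 - (V_1/V_2)^{\gamma-1}\).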
An Econometric Approach to Evaluate Navy Advertising Efficiency.
1996-03-01
This thesis uses an econometric approach to systematically and comprehensively analyze Navy advertising and recruiting data to determine Navy... advertising cost efficiency in the Navy recruiting process. Current recruiting and advertising cost data are merged into an appropriate data base and...evaluated using multiple regression techniques to find assessments of the relationships between Navy advertising expenditures and recruit contracts attained
Zhaojiang Wang; Menghua Qin; J.Y. Zhu; Guoyu Tian; Zongquan Li
2013-01-01
Rejects from a sulfite pulp mill that would otherwise be disposed of by incineration were converted to ethanol by a combined physical-biological process comprising physical refining and simultaneous saccharification and fermentation (SSF). The energy efficiency was evaluated in comparison with thermochemically pretreated biomass, such as those pretreated by...
CFD code evaluation for internal flow modeling
NASA Technical Reports Server (NTRS)
Chung, T. J.
1990-01-01
Research on computational fluid dynamics (CFD) code evaluation, with emphasis on supercomputing in reacting flows, is discussed. The advantages of unstructured grids, multigrids, adaptive methods, improved flow solvers, vector processing, parallel processing, and reduction of memory requirements are discussed. Examples include applications of supercomputing to reacting-flow Navier-Stokes equations, including shock waves and turbulence, and to combustion instability problems associated with solid and liquid propellants. Evaluation of codes developed by other organizations is not included. Instead, the basic criteria for accuracy and efficiency have been established, and some applications to rocket combustion have been made. Research toward the ultimate goal, the most accurate and efficient CFD code, is in progress and will continue for years to come.
Kwak, D H; Yoo, S J; Lee, E J; Lee, J W
2010-01-01
Most water treatment plants applying the DAF process are faced with off-flavor control problems. For the simultaneous control of particulate impurities and of the dissolved organics that cause pungent taste and odor in water, an effective method would be the simple application of powdered activated carbon (PAC) in the DAF process. A series of experiments was carried out to explore the feasibility of the simultaneous removal of kaolin particles and of the organic compounds that produce off-flavors (2-MIB and geosmin). In addition, the flotation efficiency of kaolin and of PAC particles adsorbing organics in the DAF process was evaluated by employing population balance theory. The removal efficiency of 2-MIB and geosmin under simultaneous treatment with kaolin particles was lower than that of the individual treatment. The decrease in removal efficiency was probably caused by 2-MIB and geosmin remaining on PAC particles left in the DAF-treated water after bubble flotation. Simulation results obtained with the population balance model indicate that the initial collision-attachment efficiency of PAC particles was lower than that of kaolin particles.
Photonic efficiency of the photodegradation of paracetamol in water by the photo-Fenton process.
Yamal-Turbay, E; Ortega, E; Conte, L O; Graells, M; Mansilla, H D; Alfano, O M; Pérez-Moya, M
2015-01-01
An experimental study of the homogeneous Fenton and photo-Fenton degradation of 4-acetamidophenol (paracetamol, PCT) is presented. For all the operating conditions evaluated, PCT degradation is efficiently attained by both the Fenton and the photo-Fenton processes. The photonic efficiencies of PCT degradation and mineralization are also determined under different experimental conditions, characterizing the influence of hydrogen peroxide (H2O2) and Fe(II) on both contaminant degradation and sample mineralization. The maximum photonic degradation efficiencies for 5 and 10 mg L⁻¹ Fe(II) were 3.9 (H2O2 = 189 mg L⁻¹) and 5 (H2O2 = 378 mg L⁻¹), respectively. At higher oxidant concentrations, H2O2 acts as a radical scavenger, competing in pollutant degradation and reducing the reaction rate. Moreover, in order to quantify the consumption of the oxidizing agent, the specific consumption of hydrogen peroxide was also evaluated. For all operating conditions of both hydrogen peroxide and Fe(II) concentration, the consumption values obtained for the Fenton process were always higher than the corresponding values observed for photo-Fenton. This implies a less efficient use of the oxidizing agent under dark conditions.
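For reference (a definition commonly used in photochemical degradation studies, which may differ in detail from the authors' exact formulation), the photonic efficiency of degradation relates the initial degradation rate to the rate at which photons enter the reactor,
\[ \xi = \frac{r_{0}}{q_{0}}, \]
with \(r_{0}\) the initial contaminant degradation rate (mol L\(^{-1}\) s\(^{-1}\)) and \(q_{0}\) the incident photon flux per unit reactor volume (einstein L\(^{-1}\) s\(^{-1}\)); in photo-Fenton systems \(\xi\) can exceed unity because the accompanying dark Fenton reactions regenerate oxidizing radicals, so one absorbed photon can lead to the degradation of more than one contaminant molecule.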
Qi, Wenqiang; Chen, Taojing; Wang, Liang; Wu, Minghong; Zhao, Quanyu; Wei, Wei
2017-03-01
In this study, the sequential process of anaerobic fermentation followed by microalgae cultivation was evaluated from both nutrient and energy recovery standpoints. The effects of different fermentation types on biogas generation, broth metabolite composition, algal growth and nutrient utilization, and the energy conversion efficiencies of the whole process are discussed. When the fermentation was designed to produce hydrogen-dominated biogas, the total energy conversion efficiency (TECE) of the sequential process was higher than that of the methane fermentation process. With the production of hydrogen in anaerobic fermentation, more organic carbon metabolites were left in the broth to support better algal growth with more efficient incorporation of ammonia nitrogen. By applying the sequential process, the heat value conversion efficiency (HVCE) for the wastewater could reach 41.2% if methane was avoided in the fermentation biogas. The removal efficiencies of organic metabolites and NH₄⁺-N in the better case were 100% and 98.3%, respectively. Copyright © 2016 Elsevier Ltd. All rights reserved.
Loss, Edenes; Royer, Andrea Rafaela; Barreto-Rodrigues, Marcio; Barana, Ana Claudia
2009-07-30
This study evaluated the Pleurotus spp. mushroom production process using an effluent from the maize agroindustrial process as a carbon and nitrogen source and as a wetting agent. A complete experimental design based on factorial planning was used to optimize the biological efficiency and to evaluate the effects of effluent concentration, pH and Pleurotus species. The results indicated that the effluent affects the biological efficiency of the production of both mushroom species at all pH values studied. The maximum biological efficiency predicted by the model (81.36%) corresponded to the point defined by effluent content (X1 = 1), pH (X2 = -1) and fungus species (X3 = 1), that is, 50%, 5.0 and P. floridae, respectively. The results demonstrated that the effluent is a good alternative for the production of Pleurotus mushrooms.
An exergy approach to efficiency evaluation of desalination
NASA Astrophysics Data System (ADS)
Ng, Kim Choon; Shahzad, Muhammad Wakil; Son, Hyuk Soo; Hamed, Osman A.
2017-05-01
This paper presents an evaluation of process efficiency based on the consumption of primary energy for all types of practical desalination methods available hitherto. The conventional performance ratio has, thus far, been defined with respect to the consumption of derived energy, such as electricity or steam, which is subject to the conversion losses of the power plants and boilers that burn the input primary fuels. As derived energies are usually expressed in units of kWh or joules, these units cannot accurately differentiate the grade of energy supplied to the processes. In this paper, the specific energy consumption is revisited with respect to the efficacy of all large-scale desalination plants. In today's combined production of electricity and desalinated water, accomplished with an advanced cogeneration concept, the input exergy of the fuels is utilized optimally and efficiently in a temperature-cascaded manner. By discerning the exergy destruction successively in the turbines and desalination processes, the relative contribution of primary energy to the processes can be accurately apportioned. Although efficiency is not a law of thermodynamics, a common platform for expressing figures of merit specific to the efficacy of desalination processes can be developed meaningfully, with thermodynamic rigor up to the ideal or thermodynamic limit of seawater desalination, for all scientists and engineers to aspire to.
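One way such a primary-energy figure of merit can be expressed (a generic second-law formulation offered here for orientation, not necessarily the authors' exact definition) is
\[ \eta_{II} = \frac{\dot{W}_{least}}{\dot{E}x_{primary}}, \]
where \(\dot{W}_{least}\) is the thermodynamic least work of separation for the given feed salinity and recovery, and \(\dot{E}x_{primary}\) is the share of the input fuel exergy apportioned to the desalination process after accounting for the cascaded exergy destruction in the cogeneration plant; \(\eta_{II} = 1\) corresponds to the thermodynamic limit mentioned above.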
Improving environmental impact and cost assessment for supplier evaluation
NASA Astrophysics Data System (ADS)
Beucker, Severin; Lang, Claus
2004-02-01
Improving a company's environmental and financial performance necessitates the evaluation of environmental impacts deriving from the production and cost effects of corporate actions. These effects have to be made transparent and concrete targets have to be developed. Such an evaluation has to be done on a regular basis but with limited expenses. To achieve this, different instruments of environmental controlling such as LCA and environmental performance indicators have to be combined with methods from cost accounting. Within the research project CARE (Computer Aided Resource Efficiency Accounting for Medium-Sized Enterprises), the method Resource Efficiency Accounting (REA) is used to give the participating companies new insights into hidden costs and environmental effects of their production and products. The method combines process based cost accounting with environmental impact assessment methodology and offers results that can be integrated into a company's environmental controlling system and business processes like cost accounting, supplier assessment, etc. Much of the data necessary for the combined assessment can be available within a company's IT system and therefore can be efficiently used for the assessment process. The project CARE puts a strong focus on the use of company data and information systems for the described assessment process and offers a methodological background for the evaluation and the structuring of such data. Besides the general approach of the project CARE the paper will present results from a case study in which the described approach is used for the evaluation of suppliers.
Evaluation of a time efficient immunization strategy for anti-PAH antibody development
Li, Xin; Kaattari, Stephen L.; Vogelbein, Mary Ann; Unger, Michael A.
2016-01-01
The development of monoclonal antibodies (mAb) with affinity to small molecules can be a time-consuming process. To evaluate shortening the time for mAb production, we examined mouse antisera at different time points post-immunization to measure titer and to evaluate the affinity to the immunogen PBA (pyrene butyric acid). Fusions were also conducted temporally to evaluate antibody production success at various time periods. We produced anti-PBA antibodies 7 weeks post-immunization and selected for anti-PAH reactivity during the hybridoma screening process. Moreover, there were no obvious sensitivity differences relative to antibodies screened from a more traditional 18 week schedule. Our results demonstrate a more time efficient immunization strategy for anti-PAH antibody development that may be applied to other small molecules. PMID:27282486
Steuten, Lotte; Vrijhoef, Bert; Severens, Hans; van Merode, Frits; Spreeuwenberg, Cor
2006-01-01
An overview was produced of the indicators currently used to assess disease management programs and, based on these findings, a framework was provided regarding the sets of indicators that should be used, taking the aims and types of disease management programs into account. A systematic literature review was performed. Thirty-six studies met the inclusion criteria. It appeared that a link between the aims of disease management and the structure, process, and outcome indicators evaluated does not exist in a substantial part of the published studies on disease management of diabetes and asthma/chronic obstructive pulmonary disease, especially where efficiency of care is concerned. Furthermore, structure indicators are largely missing from the evaluations, although these are of major importance for the interpretation of outcomes for purposes of decision-making. The efficiency of disease management is mainly evaluated by means of process indicators; the use of outcome indicators is less common. Within a framework, structure, process, and outcome indicators for effectiveness and efficiency are recommended for each type of disease management program. The link between the aims of disease management and the structure, process, and outcome indicators evaluated does not exist in a substantial part of published studies on disease management. The added value of this study mainly lies in the development of a framework to guide the choice of indicators for health technology assessment of disease management.
Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver
2017-08-01
Cell disruption is a key unit operation for making valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study, we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate the cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization and finally investigated this unit operation in more detail by a multivariate approach. A combination of HPLC and automated data analysis constitutes a valuable, novel tool to monitor and evaluate cell disruption processes. Our methodology, which can be used in both upstream (USP) and downstream processing (DSP), is a valuable tool for evaluating cell disruption processes, as it can be implemented at-line, gives results within minutes after sampling and does not need manual intervention.
The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. In this work diff...
Ecotoxicological evaluation of diesel-contaminated soil before and after a bioremediation process.
Molina-Barahona, L; Vega-Loyo, L; Guerrero, M; Ramírez, S; Romero, I; Vega-Jarquín, C; Albores, A
2005-02-01
Evaluation of contaminated sites is usually performed by chemical analysis of pollutants in soil. This is not sufficient either to evaluate the environmental risk of contaminated soil or to evaluate the efficiency of soil cleanup techniques. Information on the bioavailability of complex mixtures of xenobiotics and degradation products cannot be fully provided by chemical analytical data, but results from bioassays can integrate the effects of pollutants in complex mixtures. To preserve human health and environmental quality, it is important to assess the ecotoxicological effects of contaminated soils in order to obtain a better evaluation of the healthiness of the system. The monitoring of a diesel-contaminated soil and the evaluation of a bioremediation technique conducted on a microcosm scale were performed with a battery of ecotoxicological tests including phytotoxicity, Daphnia magna, and nematode assays. In this study, we biostimulated the native microflora of soil contaminated with diesel by adding nutrients and crop residue (corn straw) as a bulking agent and as a source of microorganisms and nutrients; in addition, moisture was adjusted to enhance diesel removal. The efficiency of the bioremediation process was evaluated directly by an innovative, simple phytotoxicity test system, and the diesel extracts were evaluated by Daphnia magna and nematode assays. Contaminated soil samples were shown to have toxic effects on seed germination, seedling growth, and Daphnia survival. After biostimulation, the diesel concentration was reduced by 50.6%, and the soil samples showed a significant reduction in phytotoxicity (9%-15%) and in Daphnia toxicity (3-fold), confirming the effectiveness of the bioremediation process. Results from our microcosm study suggest that, in addition to the evaluation of bioremediation process efficiency, toxicity testing with organisms representative of diverse phylogenetic levels is needed. The integration of analytical, toxicological and bioremediation data is necessary to properly assess the ecological risk of bioremediation processes. (c) 2005 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Lalla, Michele; Ferrari, Davide
2011-01-01
The collection of teaching evaluation questionnaires in the traditional paper-and-pencil format is a costly and time-consuming process and yet it is a common assessment practice in many university systems. Web-based data collection would reduce costs and significantly increase the efficiency of the overall evaluation process in numerous ways.…
See, Ya Hui Michelle; Petty, Richard E; Fabrigar, Leandre R
2013-08-01
We proposed that (a) processing interest for affective over cognitive information is captured by meta-bases (i.e., the extent to which people subjectively perceive themselves to rely on affect or cognition in their attitudes) and (b) processing efficiency for affective over cognitive information is captured by structural bases (i.e., the extent to which attitudes are more evaluatively congruent with affect or cognition). Because processing speed can disentangle interest from efficiency by being manifest as longer or shorter reading times, we hypothesized and found that more affective meta-bases predicted longer affective than cognitive reading time when processing efficiency was held constant (Study 1). In contrast, more affective structural bases predicted shorter affective than cognitive reading time when participants were constrained in their ability to allocate resources deliberatively (Study 2). When deliberation was neither encouraged nor constrained, effects for meta-bases and structural bases emerged (Study 3). Implications for affective-cognitive processing and other attitudes-relevant constructs are discussed.
Anaerobic digestion of food waste: A review focusing on process stability.
Li, Lei; Peng, Xuya; Wang, Xiaoming; Wu, Di
2018-01-01
Food waste (FW) is rich in biomass energy, and increasing numbers of national programs are being established to recover energy from FW using anaerobic digestion (AD). However, process instability is a common operational issue for AD of FW. Process monitoring and control as well as microbial management can be used to control instability and increase the energy conversion efficiency of anaerobic digesters. Here, we review research progress related to these methods and identify existing limitations to efficient AD; recommendations for future research are also discussed. Process monitoring and control are suitable for evaluating the current operational status of digesters, whereas microbial management can facilitate early diagnosis and process optimization. Optimizing and combining these two methods are necessary to improve AD efficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Krishnan, Anath Rau; Hamzah, Ahmad Aizuddin
2017-08-01
It is crucial for a zakat institution to evaluate and understand how efficiently it has operated in the past, so that ideal strategies can be developed for future improvement. However, evaluating the efficiency of a zakat institution is a challenging process, as it involves multiple inputs and/or outputs. This paper proposes a step-by-step procedure comprising two data envelopment analysis models, namely the dual Charnes-Cooper-Rhodes model and the slack-based model, to quantitatively measure the overall efficiency of a zakat institution over a period of time. The applicability of the proposed procedure was demonstrated by evaluating the efficiency of Pusat Zakat Sabah, Malaysia, from 2007 to 2015, treating each year as a decision-making unit. Two inputs (i.e. number of staff and number of branches) and two outputs (i.e. total collection and total distribution) were used to measure the overall efficiency achieved each year. The causes of inefficiency and strategies for future improvement are discussed based on the results.
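The envelopment form of the CCR model referenced above can be solved as a small linear program per decision-making unit. The sketch below is a minimal, input-oriented illustration assuming scipy is available; the input/output figures are placeholders and not the Pusat Zakat Sabah data, and the study's dual CCR and slack-based formulations are not reproduced.

```python
# Minimal input-oriented CCR (envelopment form) DEA, treating each year as a DMU.
# Illustrative placeholder data; not the study's dual CCR or slack-based models.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). Returns theta for each DMU."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.zeros(n + 1)              # variables: [theta, lambda_1 ... lambda_n]
        c[0] = 1.0                       # minimise theta
        A_ub, b_ub = [], []
        for i in range(m):               # sum_j lambda_j * x_ij <= theta * x_io
            A_ub.append(np.r_[-X[o, i], X[:, i]])
            b_ub.append(0.0)
        for r in range(s):               # sum_j lambda_j * y_rj >= y_ro
            A_ub.append(np.r_[0.0, -Y[:, r]])
            b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Example: 3 hypothetical years; inputs = (staff, branches), outputs = (collection, distribution)
X = np.array([[40.0, 5.0], [45.0, 6.0], [50.0, 6.0]])
Y = np.array([[10.0, 8.0], [12.0, 9.5], [12.5, 9.0]])
print(ccr_efficiency(X, Y))              # a score of 1.0 means efficient within the set
```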
DOE Office of Scientific and Technical Information (OSTI.GOV)
Folsom, D.W.; Gavaskar, A.R.; Jones, J.A.
1993-10-01
The project compared chemical use, waste generation, cost, and product quality between electroless copper and carbon-black-based preplating technologies at the printed wire board (PWB) manufacturing facility of McCurdy Circuits in Orange, CA. The carbon-black-based preplating technology evaluated is used as an alternative process for electroless copper (EC) plating of through-holes before electrolytic copper plating. The specific process used at McCurdy is the BlackHole (BH) technology process, which uses a dispersion of carbon black in an aqueous solution to provide a conductive surface for subsequent electrolytic copper plating. The carbon-black dispersion technology provided effective waste reduction and long-term cost savings. The economic analysis determined that the new process was cost efficient because chemical use was reduced and the process proved more efficient; the payback period was less than 4 yrs.
Study on loading coefficient in steam explosion process of corn stalk.
Sui, Wenjie; Chen, Hongzhang
2015-03-01
The objective of this work was to evaluate the effect of the loading coefficient on the steam explosion process and its efficacy for corn stalk. The loading coefficient's relation to loading pattern and material properties was first revealed; then its effect on the transfer process and pretreatment efficacy of steam explosion was assessed by established models and enzymatic hydrolysis tests, respectively, in order to propose an optimization strategy for improving the process economy. Results showed that the loading coefficient was mainly determined by loading pattern, moisture content and chip size. Both a compact loading pattern and low moisture content improved the energy efficiency of steam explosion pretreatment and the overall sugar yield of the pretreated materials, indicating that they are desirable for improving the process economy. Pretreatment of small chip size showed opposite effects on pretreatment energy efficiency and enzymatic hydrolysis performance; thus its optimization should balance the investigated aspects according to further techno-economic evaluation. Copyright © 2014 Elsevier Ltd. All rights reserved.
Automated matching software for clinical trials eligibility: measuring efficiency and flexibility.
Penberthy, Lynne; Brown, Richard; Puma, Federico; Dahman, Bassam
2010-05-01
Clinical trials (CT) serve as the medium that translates clinical research into standards of care. Low or slow recruitment leads to delays in the delivery of new therapies to the public. Determination of eligibility in all patients is one of the most important factors in assuring unbiased results from the clinical trials process and represents the first step in addressing the issues of underrepresentation and equal access to clinical trials. This is a pilot project evaluating the efficiency, flexibility, and generalizability of an automated clinical trials eligibility screening tool across 5 different clinical trials and clinical trial scenarios. There was a substantial total saving during the study period in research staff time spent evaluating patients for eligibility, ranging from 165 h to 1329 h. There was a marked enhancement in efficiency with the automated system for all but one study in the pilot. The ratio of mean staff time required per eligible patient identified ranged from 0.8 to 19.4 for the manual versus the automated process. The results of this study demonstrate that automation offers an opportunity to reduce the burden of the manual processes required for CT eligibility screening and to assure that all patients have an opportunity to be evaluated for participation in clinical trials as appropriate. The automated process greatly reduces the time spent on eligibility screening compared with the traditional manual process by effectively transferring the load of the eligibility assessment process to the computer. Copyright (c) 2010 Elsevier Inc. All rights reserved.
Influence of Process Parameters on the Process Efficiency in Laser Metal Deposition Welding
NASA Astrophysics Data System (ADS)
Güpner, Michael; Patschger, Andreas; Bliedtner, Jens
Conventionally manufactured tools are often constructed entirely of a high-alloyed, expensive tool steel. An alternative way to manufacture tools is the combination of a cost-efficient mild steel and a functional coating in the interaction zone of the tool. Thermal processing methods, like laser metal deposition, are always characterized by thermal distortion. The resistance against thermal distortion decreases with the reduction of the material thickness. As a consequence, special process management is necessary for the laser-based coating of thin parts or tools. The experimental approach in the present paper is to keep the energy and the mass per unit length constant by varying the laser power, the feed rate and the powder mass flow. The typical seam parameters are measured in order to characterize the cladding process, define process limits and evaluate the process efficiency. Ways to optimize dilution, angular distortion and clad height are presented.
Efficiency of innovative technology in construction industry
NASA Astrophysics Data System (ADS)
Stverkova, H.; Vaclavik, V.
2017-10-01
The need for sustainability increasingly influences the development of new technologies, business processes and working practices. Innovations are an important part of all business processes. The aim of innovation is, in particular, to reduce the burden on the environment. The current trend in the construction industry is diamond rope cutting. The aim of the paper is to evaluate the most advanced technology for cutting and removing concrete structures in terms of efficiency.
Zuriaga-Agustí, E; Alventosa-deLara, E; Barredo-Damas, S; Alcaina-Miranda, M I; Iborra-Clar, M I; Mendoza-Roca, J A
2014-05-01
Ultrafiltration membrane processes have become an established technology in the treatment and reuse of secondary effluents. Nevertheless, membrane fouling arises as a major obstacle in the efficient operation of these systems. In the current study, the performance of tubular ultrafiltration ceramic membranes was evaluated according to the roles exerted by membrane pore size, transmembrane pressure and feed concentration on a binary foulant system simulating textile wastewater. For that purpose, carboxymethyl cellulose sodium salt (CMC) and an azo dye were used as colloidal and organic foulants, respectively. Results showed that a larger pore size enabled more solutes to get adsorbed into the pores, producing a sharp permeate flux decline attributed to the rapid pore blockage. Besides, an increase in CMC concentration enhanced severe fouling in the case of the tighter membrane. Concerning separation efficiency, organic matter was almost completely removed with removal efficiency above 98.5%. Regarding the dye, 93% of rejection was achieved. Comparable removal efficiencies were attributed to the dynamic membrane formed by the cake layer, which governed process performance in terms of rejection and selectivity. As a result, none of the evaluated parameters showed significant influence on separation efficiency, supporting the significant role of cake layer on filtration process. Copyright © 2014 Elsevier Ltd. All rights reserved.
Souza, Fernanda S; Da Silva, Vanessa V; Rosin, Catiusa K; Hainzenreder, Luana; Arenzon, Alexandre; Pizzolato, Tania; Jank, Louise; Féris, Liliana A
2018-02-23
This study investigates the mineralization efficiency, i.e. removal of total organic carbon (TOC) in hospital wastewater by direct ozonation, ozonation with UV radiation (O3/UV), homogeneous catalytic ozonation (O3/Fe2+) and homogeneous photocatalytic ozonation (O3/Fe2+/UV). The influence of pH and reaction time was evaluated. For the best process, toxicity and degradation efficiency of the selected pharmaceutical compounds (PhCs) were determined. The results showed that the PhCs detected in the hospital wastewater were completely degraded when the mineralization efficiency reached 54.7% for O3/UV with 120 minutes of reaction time using a rate of 1.57 g O3 h-1. This process also achieved a higher chemical oxygen demand removal efficiency (64.05%), an increased aromaticity reduction efficiency (81%) and a toxicity reduction.
Fernández de Dios, Maria Ángeles; Iglesias, Olaia; Pazos, Marta; Sanromán, Maria Ángeles
2014-01-01
The applicability of electro-Fenton technology to remediation of wastewater contaminated by several organic pollutants such as dyes and polycyclic aromatic hydrocarbons has been evaluated using iron-enriched zeolite as heterogeneous catalyst. The electro-Fenton technology is an advanced oxidation process that is efficient for the degradation of organic pollutants, but it suffers from high operating costs due to the required electrical power input. For this reason, in this study microbial fuel cells (MFCs) were designed in order to supply electricity to electro-Fenton processes and to achieve high treatment efficiency at low cost. Initially, the effect of key parameters on the MFC power generation was evaluated. Afterwards, the degradation of Reactive Black 5 dye and phenanthrene was evaluated in an electro-Fenton reactor, containing iron-enriched zeolite as catalyst, using the electricity supplied by the MFC. Near complete dye decolourization and 78% of phenanthrene degradation were reached after 90 min and 30 h, respectively. Furthermore, preliminary reusability tests of the developed catalyst showed high degradation levels for successive cycles. The results indicate that the integrated system is adequate to achieve high treatment efficiency with low electrical consumption. PMID:24723828
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... of performing the technical analysis, management assessment, and program evaluation tasks required to.... Analysis of elements of the review process (including the presubmission process, and investigational device... time to facilitate a more efficient process. This includes analysis of root causes for inefficiencies...
The Scale Effects of Engineered Inlets in Urban Hydrologic Processes
NASA Astrophysics Data System (ADS)
Shevade, L.; Montalto, F. A.
2017-12-01
Runoff from urban surfaces is typically captured by engineered inlets for conveyance to receiving water bodies or treatment plants. Normative hydrologic and hydraulic (H&H) modeling tools generally assume 100% efficient inlets, though observations by the authors suggest this assumption is invalid. The discrepancy is key since the more efficient the inlet, the more linearly hydrologic processes scale with catchment area. Using several years of remote sensing data, the observed efficiencies of urban green infrastructure (GI) facility inlets in New York City are presented as a function of the morphological and climatological properties of their catchments and events. The rainfall-runoff response is modeled with EPA SWMM to assess the degree of inaccuracy that the assumption of fully efficient inlets introduces in block- and neighborhood-scale simulations. Next, an algorithm is presented that incorporates inlet efficiency into SWMM, and the improved predictive skill is evaluated using Nash-Sutcliffe efficiency and root-mean-square error (RMSE). The results are used to evaluate the extent to which decentralized green stormwater management facilities positioned at the low points of urban catchments ought to be designed with larger capacities than their counterparts located further upslope.
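For reference, the two skill metrics named above are conventionally defined as follows; this is a minimal sketch with illustrative variable names and synthetic values, not output from the study.

```python
# Conventional definitions of the two skill metrics named in the abstract.
# Variable names and values are illustrative only.
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

observed  = [0.0, 2.1, 5.4, 3.2, 1.0]    # synthetic runoff depths (mm)
simulated = [0.1, 1.8, 5.0, 3.6, 0.8]
print(nash_sutcliffe(observed, simulated), rmse(observed, simulated))
```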
Analysis and Evaluation of Parameters Determining Maximum Efficiency of Fish Protection
NASA Astrophysics Data System (ADS)
Khetsuriani, E. D.; Kostyukov, V. P.; Khetsuriani, T. E.
2017-11-01
The article is concerned with experimental research findings. The efficiency of fish fry protection from entering water inlets is the main criterion of any fish protection facility or device. The research was aimed to determine an adequate mathematical model E = f(PCT, Vp, α), where PCT, Vp and α are controlled factors influencing the process of fish fry protection. The result of the processing of experimental data was an adequate regression model. We determined the maximum of fish protection Emax = 94.21 and the minimum of the optimization function Emin = 44.41. As a result of the statistical processing of experimental data we obtained adequate dependences for determining an optimal rotational speed of tip and fish protection efficiency. The analysis of fish protection efficiency dependence E% = f(PCT, Vp, α) allowed the authors to recommend the following optimized operating modes for it: the maximum fish protection efficiency is achieved at the process pressure PCT = 3 atm, stream velocity Vp = 0.42 m/s and nozzle inclination angle α = 47°49'. The stream velocity Vp has the most critical influence on fish protection efficiency. The maximum efficiency of fish protection is obtained at the tip rotational speed of 70.92 rpm.
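The abstract reports a regression model E = f(PCT, Vp, α) without giving its form. As a hedged illustration only, the sketch below fits a generic second-order response surface by least squares to synthetic data; the functional form, data and coefficients are assumptions, not the authors' model.

```python
# Hypothetical second-order response surface for E = f(PCT, Vp, alpha), fitted by
# ordinary least squares. Form, data and coefficients are assumptions for
# illustration; they are not the authors' regression model.
import numpy as np

def design_matrix(P, V, A):
    # intercept, linear, interaction and quadratic terms
    return np.column_stack([np.ones_like(P), P, V, A,
                            P * V, P * A, V * A,
                            P ** 2, V ** 2, A ** 2])

# synthetic design points: pressure (atm), stream velocity (m/s), nozzle angle (deg)
P = np.array([2.0, 2.0, 2.0, 3.0, 3.0, 3.0, 4.0, 4.0, 4.0, 2.5, 3.5, 3.0])
V = np.array([0.30, 0.42, 0.50, 0.30, 0.42, 0.50, 0.30, 0.42, 0.50, 0.40, 0.40, 0.42])
A = np.array([40., 45., 50., 42., 48., 55., 40., 47., 52., 44., 46., 48.])
E = np.array([62., 70., 55., 78., 92., 66., 74., 88., 70., 75., 84., 94.])   # efficiency, %

coeffs, *_ = np.linalg.lstsq(design_matrix(P, V, A), E, rcond=None)
E_hat = design_matrix(np.array([3.0]), np.array([0.42]), np.array([47.8])) @ coeffs
print(E_hat)    # predicted efficiency near the reported optimum settings
```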
Evaluation strategy of regenerative braking energy for supercapacitor vehicle.
Zou, Zhongyue; Cao, Junyi; Cao, Binggang; Chen, Wen
2015-03-01
In order to improve the efficiency of energy conversion and increase the driving range of electric vehicles, the regenerative energy captured during the braking process is stored in energy storage devices and then re-used. Due to their high power density, supercapacitors are employed to withstand high currents over short times and thus capture more regenerative energy. Measuring methods for regenerative energy should be investigated to estimate the energy conversion efficiency and performance of electric vehicles. Based on the analysis of the regenerative braking energy system of a supercapacitor vehicle, an evaluation system for energy recovery in the braking process is established using USB portable data-acquisition devices. Experiments under various braking conditions are carried out. The results verify the higher efficiency of the energy regeneration system using supercapacitors and the effectiveness of the proposed measurement method. It is also demonstrated that the maximum regenerative energy conversion efficiency can reach 88%. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
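One common way to express the regenerative energy conversion efficiency reported above is the ratio of electrical energy recovered into the storage device to the kinetic energy released during the braking event. The sketch below assumes that definition and uses synthetic measurements; it is not the paper's evaluation system.

```python
# Assumed definition: conversion efficiency = recovered electrical energy /
# kinetic energy released while braking. Synthetic signals, not the paper's data.
import numpy as np

def recovered_energy(voltage, current, dt):
    """Integrate electrical power (V * I) over the braking interval."""
    return float(np.sum(np.asarray(voltage) * np.asarray(current)) * dt)

def braking_kinetic_energy(mass, v_start, v_end):
    return 0.5 * mass * (v_start ** 2 - v_end ** 2)

dt = 0.1
t = np.arange(0.0, 5.0, dt)
voltage = 48.0 + 6.0 * t / t[-1]            # supercapacitor voltage rising while charging
current = 600.0 * np.exp(-t / 2.0)          # decaying charge current
eta = recovered_energy(voltage, current, dt) / braking_kinetic_energy(1200.0, 12.0, 4.0)
print(f"conversion efficiency ~ {eta:.2%}")
```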
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Michael; Dietsch, Niko
2018-01-01
This guide describes frameworks for evaluation, measurement, and verification (EM&V) of utility customer–funded energy efficiency programs. The authors reviewed multiple frameworks across the United States and gathered input from experts to prepare this guide. This guide provides the reader with both the contents of an EM&V framework, along with the processes used to develop and update these frameworks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Nan; Romankiewicz, John; Vine, Edward
2012-12-15
In recent years, the number of energy efficiency policies implemented has grown very rapidly as energy security and climate change have become top policy issues for many governments around the world. Within the sphere of energy efficiency policy, governments (federal and local), electric utilities, and other types of businesses and institutions are implementing a wide variety of programs to spread energy efficiency practices in industry, buildings, transport, and electricity. As programs proliferate, there is an administrative and business imperative to evaluate the savings and processes of these programs to ensure that program funds spent are indeed leading to a more energy-efficient economy.
Performance of biofuel processes utilising separate lignin and carbohydrate processing.
Melin, Kristian; Kohl, Thomas; Koskinen, Jukka; Hurme, Markku
2015-09-01
Novel biofuel pathways with increased product yields are evaluated against conventional lignocellulosic biofuel production processes: methanol or methane production via gasification and ethanol production via steam-explosion pre-treatment. The novel processes studied are ethanol production combined with methanol production by gasification, hydrocarbon fuel production with additional hydrogen produced from lignin residue gasification, and methanol or methane synthesis using synthesis gas from lignin residue gasification with additional hydrogen obtained by aqueous phase reforming in synthesis gas production. The material and energy balances of the processes were calculated by Aspen flowsheet models and add-on Excel calculations, applicable at the conceptual design stage, to evaluate the pre-feasibility of the alternatives. The processes were compared using the following criteria: energy efficiency from biomass to products, primary energy efficiency, GHG reduction potential and economy (expressed as net present value: NPV). Several novel biorefinery concepts gave higher energy yields, GHG reduction potential and NPV. Copyright © 2015 Elsevier Ltd. All rights reserved.
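The economic criterion mentioned above, net present value, follows the standard discounted cash flow formula. A minimal sketch, with an assumed discount rate, lifetime and cash flows that are not taken from the study:

```python
# Standard discounted cash flow form of the NPV criterion.
# CAPEX, cash flows, lifetime and discount rate are illustrative assumptions.
def npv(capex, annual_cash_flows, discount_rate):
    """NPV = -CAPEX + sum_t CF_t / (1 + r)^t, for t = 1..n."""
    return -capex + sum(cf / (1.0 + discount_rate) ** (t + 1)
                        for t, cf in enumerate(annual_cash_flows))

# example: 150 M EUR investment, 18 M EUR/a net cash flow over 20 years, 8% discount rate
print(npv(150e6, [18e6] * 20, 0.08))
```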
ParaBTM: A Parallel Processing Framework for Biomedical Text Mining on Supercomputers.
Xing, Yuting; Wu, Chengkun; Yang, Xi; Wang, Wei; Zhu, En; Yin, Jianping
2018-04-27
A prevailing way of extracting valuable information from biomedical literature is to apply text mining methods on unstructured texts. However, the massive amount of literature that needs to be analyzed poses a big data challenge to the processing efficiency of text mining. In this paper, we address this challenge by introducing parallel processing on a supercomputer. We developed paraBTM, a runnable framework that enables parallel text mining on the Tianhe-2 supercomputer. It employs a low-cost yet effective load balancing strategy to maximize the efficiency of parallel processing. We evaluated the performance of paraBTM on several datasets, utilizing three types of named entity recognition tasks as demonstration. Results show that, in most cases, the processing efficiency can be greatly improved with parallel processing, and the proposed load balancing strategy is simple and effective. In addition, our framework can be readily applied to other tasks of biomedical text mining besides NER.
Barta, Zsolt; Reczey, Kati; Zacchi, Guido
2010-09-15
Replacing the energy-intensive evaporation of stillage by anaerobic digestion is one way of decreasing the energy demand of the lignocellulosic biomass-to-ethanol process. The biogas can be upgraded and sold as transportation fuel, injected directly into the gas grid or be incinerated on-site for combined heat and power generation. A techno-economic evaluation of the spruce-to-ethanol process, based on SO2-catalysed steam pretreatment followed by simultaneous saccharification and fermentation, has been performed using the commercial flow-sheeting program Aspen Plus™. Various process configurations of anaerobic digestion of the stillage, with different combinations of co-products, have been evaluated in terms of energy efficiency and ethanol production cost versus the reference case of evaporation. Anaerobic digestion of the stillage showed a significantly higher overall energy efficiency (87-92%), based on the lower heating values, than the reference case (81%). Although the amount of ethanol produced was the same in all scenarios, the production cost varied between 4.00 and 5.27 Swedish kronor per litre (0.38-0.50 euro/L), including the reference case. Higher energy efficiency options did not necessarily result in lower ethanol production costs. Anaerobic digestion of the stillage with biogas upgrading was demonstrated to be a favourable option for both energy efficiency and ethanol production cost. The difference in the production cost of ethanol between using the whole stillage or only the liquid fraction in anaerobic digestion was negligible for the combination of co-products including upgraded biogas, electricity and district heat.
Evaluation and recommendations for work group integration within the Materials and Processes Lab
NASA Technical Reports Server (NTRS)
Farrington, Phillip A.
1992-01-01
The goal of this study was to evaluate and make recommendations for improving the level of integration of several work groups within the Materials and Processes Lab at the Marshall Space Flight Center. This evaluation has uncovered a variety of projects that could improve the efficiency and operation of the work groups as well as the overall integration of the system. In addition, this study provides the foundation for specification of a computer integrated manufacturing test bed environment in the Materials and Processes Lab.
Langone, Michela; Ferrentino, Roberta; Cadonna, Maria; Andreottola, Gianni
2016-12-01
A laboratory-scale sequencing batch reactor (SBR) performing partial nitritation-anammox and denitrification was used to treat anaerobic digester effluents. The SBR cycle consisted of a short mixing filling phase followed by oxic and anoxic reaction phases. Working at 25 °C, an ammonium conversion efficiency of 96.5%, a total nitrogen removal efficiency of 88.6%, and an organic carbon removal efficiency of 63.5% were obtained at a nitrogen loading rate of 0.15 kg N m-3 d-1 and a biodegradable organic carbon to nitrogen ratio of 0.37. The potential contribution of each biological process was evaluated by using a stoichiometric model. The nitritation contribution decreased as the temperature decreased, while the contribution from anammox depended on the wastewater type and soluble carbon to nitrogen ratio. Denitrification improved the total nitrogen removal efficiency, and it was influenced by the biodegradable organic carbon to nitrogen ratio. The characteristic patterns of conductivity, oxidation-reduction potential (ORP) and pH in the SBR cycle were well related to biological processes. Conductivity profiles were found to be directly related to the decreasing profiles of ammonium. Positive ORP values at the end of the anoxic phases were detected for total nitrogen removal efficiency of lower than 85%, and the occurrence of bending points on the ORP curves during the anoxic phases was associated with nitrite depletion by the anammox process. Copyright © 2016 Elsevier Ltd. All rights reserved.
FUEL-EFFICIENT SEWAGE SLUDGE INCINERATION
A study was performed to evaluate the status of incineration with low fuel use as a sludge disposal technology. The energy requirements, life-cycle costs, operation and maintenance requirements, and process capabilities of four sludge incineration facilities were evaluated. These...
CONSIDERATIONS FOR INNOVATIVE REMEDIATION TECHNOLOGY EVALUATION SAMPLING PLANS
Field trials of innovative subsurface cleanup technologies require the use of integrated site characterization approaches to obtain critical design parameters, to evaluate pre-treatment contaminant distributions, and to assess process efficiency. This review focuses on the trans...
Cerminati, Sebastián; Eberhardt, Florencia; Elena, Claudia E; Peirú, Salvador; Castelli, María E; Menzella, Hugo G
2017-06-01
Enzymatic degumming using phospholipase C (PLC) enzymes may be used in environmentally friendly processes with improved oil recovery yields. In this work, phosphatidylinositol-specific phospholipase C (PIPLC) candidates obtained from an in silico analysis were evaluated for oil degumming. A PIPLC from Lysinibacillus sphaericus was shown to efficiently remove phosphatidylinositol from crude oil, and when combined with a second phosphatidylcholine and phosphatidylethanolamine-specific phospholipase C, the three major phospholipids were completely hydrolyzed, providing an extra yield of oil greater than 2.1%, compared to standard methods. A remarkably efficient fed-batch Escherichia coli fermentation process producing ∼14 g/L of the recombinant PIPLC enzyme was developed, which may facilitate the adoption of this cost-effective oil-refining process.
Kim, Sung Bong; Park, Chulhwan; Kim, Seung Wook
2014-11-01
To design biorefinery processes producing bioethanol from lignocellulosic biomass with dilute acid pretreatment, biorefinery processes were simulated using the SuperPro Designer program. To improve the efficiency of biomass use and the economics of biorefinery, additional pretreatment processes were designed and evaluated, in which a combined process of dilute acid and aqueous ammonia pretreatments, and a process of waste media containing xylose were used, for the production of 7-aminocephalosporanic acid. Finally, the productivity and economics of the designed processes were compared. Copyright © 2014 Elsevier Ltd. All rights reserved.
Nested polynomial trends for the improvement of Gaussian process-based predictors
NASA Astrophysics Data System (ADS)
Perrin, G.; Soize, C.; Marque-Pucheu, S.; Garnier, J.
2017-10-01
The role of simulation keeps increasing for the sensitivity analysis and uncertainty quantification of complex systems. Such numerical procedures are generally based on processing a huge number of code evaluations. When the computational cost associated with one particular evaluation of the code is high, such direct approaches, based on the computer code only, are not affordable. Surrogate models therefore have to be introduced to interpolate the information given by a fixed set of code evaluations to the whole input space. When confronted with deterministic mappings, Gaussian process regression (GPR), or kriging, presents a good compromise between complexity, efficiency and error control. Such a method considers the quantity of interest of the system as a particular realization of a Gaussian stochastic process, whose mean and covariance functions have to be identified from the available code evaluations. In this context, this work proposes an innovative parametrization of this mean function, based on the composition of two polynomials. This approach is particularly relevant for the approximation of strongly nonlinear quantities of interest from very little information. After presenting the theoretical basis of this method, this work compares its efficiency to alternative approaches on a series of examples.
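As a simplified stand-in for the nested polynomial mean proposed in the paper, the sketch below implements ordinary Gaussian process regression with an explicit (non-nested) polynomial trend, sometimes called universal kriging. Kernel choice, hyperparameters and data are illustrative assumptions, not the authors' implementation.

```python
# Gaussian process regression with an explicit polynomial trend (universal kriging),
# as a simplified stand-in for the paper's nested polynomial mean. Kernel,
# hyperparameters and data are illustrative assumptions.
import numpy as np

def sq_exp(a, b, length=0.3, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def fit_predict(x, y, x_new, degree=2, noise=1e-6):
    H = np.vander(x, degree + 1)                 # polynomial trend basis
    Hn = np.vander(x_new, degree + 1)
    K = sq_exp(x, x) + noise * np.eye(len(x))
    Ki = np.linalg.inv(K)
    beta = np.linalg.solve(H.T @ Ki @ H, H.T @ Ki @ y)   # generalised least squares trend
    resid = y - H @ beta
    return Hn @ beta + sq_exp(x_new, x) @ Ki @ resid     # kriging predictive mean

x = np.linspace(0.0, 1.0, 8)
y = np.sin(6.0 * x) + 0.5 * x ** 2                       # synthetic "code output"
print(fit_predict(x, y, np.array([0.05, 0.55, 0.95])))
```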
Hydrometrics, founded in 1979 and located in Helena, MT, manufactures a commercial-ready High Efficiency Reverse Osmosis (HERO™) industrial wastewater treatment system. The system uses a three-stage reverse osmosis process to remove and concentrate metals for recovery while prod...
Liu, Yang; Jiang, Wen-Ming; Yang, Jie; Li, Yu-Xing; Chen, Ming-Can; Li, Jian-Na
2017-08-01
The tilt angle of parallel-plate electrodes (APE) is very important, as it improves the economy of diffusion-controlled electrocoagulation (EC) processes. This study aimed to evaluate and optimize the APE of a self-made EC device with integrally rotary electrodes at a fixed current density of 120 A m-2. The APEs investigated in this study were 0°, 30°, 45°, 60°, 90°, and a special value (α_d), defined as the electrode orientation at which the upper end of the anode and the lower end of the cathode lie on a line perpendicular to the bottom of the reactor. Experiments were conducted to determine the optimum APE for the demulsification process using four evaluation indexes: the oil removal efficiency in the center between the electrodes, the energy consumption, the Al consumption, and a novel universal evaluation index, the evenness index of oil removal efficiency, employed to fully reflect the distribution characteristics of the demulsification efficiency. At a given plate spacing of 4 cm, the optimal APE was found to be α_d because of its potential for enhancing mass transfer within the whole EC reactor without additional external mechanical stirring energy; the resulting evaluation indexes were 97.07%, 0.11 g Al g-1 oil, 2.99 kWh kg-1 oil, 99.97% and 99.97%, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
Effects of image processing on the detective quantum efficiency
NASA Astrophysics Data System (ADS)
Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na
2010-04-01
Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing the methodologies for image quality characterization. However, as the methodologies for such characterizations have not been standardized, the results of these studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate how the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) are affected by the image processing algorithm. Image performance parameters such as MTF, NPS, and DQE were evaluated using the International Electrotechnical Commission (IEC 62220-1)-defined RQA5 radiographic technique. Computed radiography (CR) images of a hand in the posterior-anterior (PA) projection for measuring the signal-to-noise ratio (SNR), slit images for measuring the MTF, and flat-field (white) images for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. The results showed that all of the modified images considerably influenced the evaluation of SNR, MTF, NPS, and DQE. Images modified by the post-processing had higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, have an effect on the image when it is evaluated for image quality. In conclusion, the control parameters of image processing should be accounted for when evaluating and characterizing image quality in the same way. The results of this study can serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring MTF, NPS, and DQE.
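For context, DQE is conventionally computed from the measured MTF, the normalized noise power spectrum (NNPS) and the incident photon fluence q as DQE(f) = MTF(f)^2 / (q * NNPS(f)). The sketch below assumes that relation; the arrays are illustrative placeholders, not measured data from this study.

```python
# Assumed conventional relation: DQE(f) = MTF(f)**2 / (q * NNPS(f)), with q the
# incident photon fluence. All arrays below are illustrative placeholders.
import numpy as np

def dqe(mtf, nnps, q):
    mtf, nnps = np.asarray(mtf, float), np.asarray(nnps, float)
    return mtf ** 2 / (q * nnps)

freq = np.array([0.5, 1.0, 1.5, 2.0, 2.5])                   # cycles/mm
mtf  = np.array([0.85, 0.65, 0.48, 0.35, 0.25])
nnps = np.array([2.0e-6, 1.6e-6, 1.3e-6, 1.1e-6, 1.0e-6])    # mm^2
q    = 1.0e6                                                 # photons/mm^2 (illustrative)
print(dict(zip(freq, dqe(mtf, nnps, q))))
```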
de Oliveira, Neurilene Batista; Peres, Heloisa Helena Ciqueto
2015-01-01
To evaluate the functional performance and the technical quality of the Electronic Documentation System of the Nursing Process of the Teaching Hospital of the University of São Paulo. This was an exploratory-descriptive study. The Quality Model of standard 25010 and the Evaluation Process defined in standard 25040, both of the International Organization for Standardization/International Electrotechnical Commission, were used. The quality characteristics evaluated were: functional suitability, reliability, usability, performance efficiency, compatibility, security, maintainability and portability. The sample was made up of 37 evaluators. In the evaluation by the specialists in information technology, only the characteristic of usability obtained a rate of positive responses of less than 70%. For the nurse lecturers, all the quality characteristics obtained a rate of positive responses of over 70%. The staff nurses of the medical and surgical clinics (with experience in using the system) and staff nurses from other units of the hospital and from other health institutions (without experience in using the system) gave rates of positive responses of more than 70% for functional suitability, usability, and security. However, performance efficiency, reliability and compatibility all obtained rates below the established parameter. The software achieved rates of positive responses of over 70% for the majority of the quality characteristics evaluated.
"Efficiency Space" - A Framework for Evaluating Joint Evaporation and Runoff Behavior
NASA Technical Reports Server (NTRS)
Koster, Randal
2014-01-01
At the land surface, higher soil moisture levels generally lead to both increased evaporation for a given amount of incoming radiation (increased evaporation efficiency) and increased runoff for a given amount of precipitation (increased runoff efficiency). Evaporation efficiency and runoff efficiency can thus be said to vary with each other, motivating the development of a unique hydroclimatic analysis framework. Using a simple water balance model fitted, in different experiments, with a wide variety of functional forms for evaporation and runoff efficiency, we transform net radiation and precipitation fields into fields of streamflow that can be directly evaluated against observations. The optimal combination of the functional forms (the combination that produces the most skillful streamflow simulations) provides an indication of how evaporation and runoff efficiencies vary with each other in nature, a relationship that can be said to define the overall character of land surface hydrological processes, at least to first order. The inferred optimal relationship is represented herein as a curve in efficiency space and should be valuable for the evaluation and development of GCM-based land surface models, which by this measure are often found to be suboptimal.
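A minimal bucket-model sketch of the idea described above, in which both evaporation efficiency and runoff efficiency increase with soil moisture. The power-law forms, parameters and forcing are illustrative assumptions, not the functional forms evaluated in the study.

```python
# Toy bucket water balance in which evaporation efficiency beta_E(w) and runoff
# efficiency beta_R(w) both increase with soil moisture w. Power-law forms,
# parameters and forcing are illustrative assumptions.
import numpy as np

def step(w, precip, rnet_demand, p_E=1.0, p_R=2.0, w_max=1.0):
    beta_E = (w / w_max) ** p_E          # fraction of radiative demand evaporated
    beta_R = (w / w_max) ** p_R          # fraction of precipitation shed as runoff
    evap = beta_E * rnet_demand
    runoff = beta_R * precip
    w_new = float(np.clip(w + precip - evap - runoff, 0.0, w_max))
    return w_new, evap, runoff

w = 0.5                                   # initial relative soil moisture
for precip, demand in [(0.05, 0.04), (0.00, 0.04), (0.12, 0.04), (0.03, 0.04)]:
    w, e, q = step(w, precip, demand)
    print(round(w, 3), round(e, 3), round(q, 3))
```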
Embedding technology into inter-professional best practices in home safety evaluation.
Burns, Suzanne Perea; Pickens, Noralyn Davel
2017-08-01
To explore inter-professional home evaluators' perspectives and needs for building useful and acceptable decision-support tools for the field of home modifications. Twenty semi-structured interviews were conducted with a range of home modification professionals from different regions of the United States. The interview transcripts were analyzed with a qualitative, descriptive, perspective approach. Technology supports current best practice and has potential to inform decision making through features that could enhance home evaluation processes, quality, efficiency and inter-professional communication. Technological advances with app design have created numerous opportunities for the field of home modifications. Integrating technology and inter-professional best practices will improve home safety evaluation and intervention development to meet client-centred and societal needs. Implications for rehabilitation Understanding home evaluators technology needs for home safety evaluations contributes to the development of app-based assessments. Integrating inter-professional perspectives of best practice and technological needs in an app for home assessments improves processes. Novice and expert home evaluators would benefit from decision support systems embedded in app-based assessments. Adoption of app-based assessment would improve efficiency while remaining client-centred.
Hubert: Software for efficient analysis of in-situ nuclear forward scattering experiments
NASA Astrophysics Data System (ADS)
Vrba, Vlastimil; Procházka, Vít; Smrčka, David; Miglierini, Marcel
2016-10-01
The combination of short data acquisition times and local investigation of the solid state through hyperfine parameters makes nuclear forward scattering (NFS) a unique experimental technique for the investigation of fast processes. However, the total number of acquired NFS time spectra may be very high. Therefore, an efficient way of evaluating the data is needed. In this paper we report the development of the Hubert software package as a response to the rapidly developing field of in-situ NFS experiments. Hubert offers several useful features for processing data files and can significantly shorten the evaluation time by using a simple connection between neighboring time spectra through their input and output parameter values.
Emergy Evaluation of a Production and Utilization Process of Irrigation Water in China
Chen, Dan; Luo, Zhao-Hui; Chen, Jing; Kong, Jun; She, Dong-Li
2013-01-01
Sustainability evaluation of the process of water abstraction, distribution, and use for irrigation can contribute to the policy of decision making in irrigation development. Emergy theory and method are used to evaluate a pumping irrigation district in China. A corresponding framework for its emergy evaluation is proposed. Its emergy evaluation shows that water is the major component of inputs into the irrigation water production and utilization systems (24.7% and 47.9% of the total inputs, resp.) and that the transformities of irrigation water and rice as the systems' products (1.72E+05 sej/J and 1.42E+05 sej/J, resp.; sej/J = solar emjoules per joule) represent their different emergy efficiencies. The irrigated agriculture production subsystem has a higher sustainability than the irrigation water production subsystem and the integrated production system, according to several emergy indices: renewability ratio (%R), emergy yield ratio (EYR), emergy investment ratio (EIR), environmental load ratio (ELR), and environmental sustainability index (ESI). The results show that the performance of this irrigation district could be further improved by increasing the utilization efficiencies of the main inputs in both the production and utilization process of irrigation water. PMID:24082852
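The emergy indices listed in the abstract follow standard definitions in the emergy literature. The sketch below assumes the conventional forms (R = local renewable emergy, N = local nonrenewable emergy, F = purchased emergy, U = R + N + F); the numerical inputs are illustrative, not the study's inventory.

```python
# Conventional emergy index definitions (R = local renewable, N = local
# nonrenewable, F = purchased emergy; U = R + N + F). Inputs are illustrative.
def emergy_indices(R, N, F):
    U = R + N + F
    EYR = U / F
    ELR = (N + F) / R
    return {
        "%R": 100.0 * R / U,      # renewability ratio
        "EYR": EYR,               # emergy yield ratio
        "EIR": F / (R + N),       # emergy investment ratio
        "ELR": ELR,               # environmental load ratio
        "ESI": EYR / ELR,         # environmental sustainability index
    }

print(emergy_indices(R=4.0e20, N=1.0e20, F=3.0e20))   # solar emjoules (sej), illustrative
```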
DEVELOPING A CAPE-OPEN COMPLIANT METAL FINISHING FACILITY POLLUTION PREVENTION TOOL (CO-MFFP2T)
The USEPA is developing a Computer Aided Process Engineering (CAPE) software tool for the metal finishing industry that helps users design efficient metal finishing processes that are less polluting to the environment. Metal finishing process lines can be simulated and evaluated...
Efficient Testing Combining Design of Experiment and Learn-to-Fly Strategies
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Brandon, Jay M.
2017-01-01
Rapid modeling and efficient testing methods are important in a number of aerospace applications. In this study efficient testing strategies were evaluated in a wind tunnel test environment and combined to suggest a promising approach for both ground-based and flight-based experiments. Benefits of using Design of Experiment techniques, well established in scientific, military, and manufacturing applications are evaluated in combination with newly developing methods for global nonlinear modeling. The nonlinear modeling methods, referred to as Learn-to-Fly methods, utilize fuzzy logic and multivariate orthogonal function techniques that have been successfully demonstrated in flight test. The blended approach presented has a focus on experiment design and identifies a sequential testing process with clearly defined completion metrics that produce increased testing efficiency.
Oliveira, Edna M S; Silva, Francisco R; Morais, Crislânia C O; Oliveira, Thiago Mielle B F; Martínez-Huitle, Carlos A; Motheo, Artur J; Albuquerque, Cynthia C; Castro, Suely S L
2018-06-01
This study investigated the anodic oxidation of phenolic wastewater generated by cashew-nut processing industry (CNPI) using active (Ti/RuO2-TiO2) and inactive (boron doped diamond, BDD) anodes. During electrochemical treatment, various operating parameters were investigated, such as current density, chemical oxygen demand (COD), total phenols, O2 production, temperature, pH, as well as current efficiency and energy consumption. After electrolysis under optimized working conditions, samples were evaluated by chromatography and toxicological tests against L. sativa. When both electrode materials were compared under the same operating conditions, higher COD removal efficiency was achieved for BDD anode; achieving lower energy requirements when compared with the values estimated for Ti/RuO2-TiO2. The presence of Cl- in the wastewater promoted the electrogeneration of strong oxidant species as chlorine, hypochlorite and mainly hypochlorous acid, increasing the efficiency of degradation process. Regarding the temperature effect, BDD showed slower performances than those achieved for Ti/RuO2-TiO2. Chromatographic and phytotoxicity studies indicated formation of some by-products after electrolytic process, regardless of the anode evaluated, and phytotoxic action of the effluent. Results encourage the applicability of the electrochemical method as wastewater treatment process for the CNPI, reducing depuration time. Copyright © 2018. Published by Elsevier Ltd.
Microcidal effects of a new pelleting process.
Ekperigin, H E; McCapes, R H; Redus, R; Ritchie, W L; Cameron, W J; Nagaraja, K V; Noll, S
1990-09-01
The microcidal efficiency of a new pelleting process was evaluated in four trials. Also, different methods of measuring temperature and moisture were compared and attempts were made to determine the influence on efficiency of pH changes occurring during processing. In the new process, the traditional boiler-conditioner was replaced by an Anaerobic Pasteurizing Conditioning (APC) System. Microcidal efficiency of the APC System, by itself or in conjunction with a pellet mill, appeared to be 100% against Escherichia coli and nonlactose-fermenters, 99% against aerobic mesophiles, and 90% against fungi. These levels of efficiency were attained when the temperature and moisture of feed conditioned in the APC System for 4.6 +/- .5 min were 82.9 +/- 2.4 °C and 14.9 +/- .3%, respectively. On-line temperature probes were reliable and provided quick, accurate estimates of feed temperature. The near infrared scanner and microwave oven methods of measuring moisture were much quicker but less accurate than the in vacuo method. There were no differences among the pH of samples of raw, conditioned, and pelleted feed.
Bechara, Rami; Gomez, Adrien; Saint-Antonin, Valérie; Schweitzer, Jean-Marc; Maréchal, François
2016-08-01
The application of methodologies for the optimal design of integrated processes has seen increased interest in literature. This article builds on previous works and applies a systematic methodology to an integrated first and second generation ethanol production plant with power cogeneration. The methodology breaks into process simulation, heat integration, thermo-economic evaluation, exergy efficiency vs. capital costs, multi-variable, evolutionary optimization, and process selection via profitability maximization. Optimization generated Pareto solutions with exergy efficiency ranging between 39.2% and 44.4% and capital costs from 210M$ to 390M$. The Net Present Value was positive for only two scenarios and for low efficiency, low hydrolysis points. The minimum cellulosic ethanol selling price was sought to obtain a maximum NPV of zero for high efficiency, high hydrolysis alternatives. The obtained optimal configuration presented maximum exergy efficiency, hydrolyzed bagasse fraction, capital costs and ethanol production rate, and minimum cooling water consumption and power production rate. Copyright © 2016 Elsevier Ltd. All rights reserved.
A hybrid process integrating vapor stripping with vapor compression and vapor permeation membrane separation, termed Membrane Assisted Vapor Stripping (MAVS), was evaluated for recovery and dehydration of ethanol from aqueous solution as an alternative to conventional distillatio...
Efficiency, costs and benefits of AOPs for removal of pharmaceuticals from the water cycle.
Tuerk, J; Sayder, B; Boergers, A; Vitz, H; Kiffmeyer, T K; Kabasci, S
2010-01-01
Different advanced oxidation processes (AOP) were developed for the treatment of highly loaded wastewater streams. Optimisation of removal and improvement of efficiency were carried out on a laboratory, semiworks and pilot plant scale. The persistent cytostatic drug cyclophosphamide was selected as a reference substance regarding elimination and evaluation of the various oxidation processes because of its low degradability rate. The investigated processes are cost-efficient and suitable regarding the treatment of wastewater streams since they lead to efficient elimination of antibiotics and antineoplastics. A total reduction of toxicity was proven by means of the umuC-test. However, in order to reduce pharmaceuticals from the water cycle, it must be considered that the input of more than 80 % of the pharmaceuticals entering wastewater treatment systems results from private households. Therefore, advanced technologies should also be installed at wastewater treatment plants.
[Efficiency evaluation of capsaicinoids to discriminate bio-waste oils from edible vegetable oils].
Mao, Lisha; Liu, Honghe; Kang, Li; Jiang, Jie; Liao, Shicheng; Liu, Guihua; Deng, Pingjian
2014-07-01
To evaluate the efficiency of capsaicinoids in discriminating bio-waste oils from edible vegetable oils. Samples comprising 14 raw vegetable oils, 24 fried waste oils, 34 kitchen-waste oils, 32 edible non-peanut vegetable oils, 32 edible peanut oils, 16 edible oils with added flavor and 11 refined bio-waste oils were prepared and examined for capsaicinoids, including capsaicin, dihydrocapsaicin and nonylic acid vanillylamide. The detection results of the above samples were statistically tested by sample category to assess the effectiveness of capsaicinoids in identifying bio-waste oils. As an indicator, capsaicin showed high detection sensitivity and had the highest efficiency in correctly discriminating kitchen-waste oils and refined bio-waste oils from edible non-peanut vegetable oils, with identification accuracy rates of 100% and 90.1%, respectively. A background level is present in peanut oil. CONCLUSION: Capsaicin added during cooking is retained through refining and can hardly be removed by the refining process. Provided the background interference is fully eliminated, capsaicinoids can effectively discriminate bio-waste oils from edible vegetable oils.
Analysis of the Efficiency of Surfactant-Mediated Stabilization Reactions of EGaIn Nanodroplets.
Finkenauer, Lauren R; Lu, Qingyun; Hakem, Ilhem F; Majidi, Carmel; Bockstaller, Michael R
2017-09-26
A methodology based on light scattering and spectrophotometry was developed to evaluate the effect of organic surfactants on the size and yield of eutectic gallium/indium (EGaIn) nanodroplets formed in organic solvents by ultrasonication. The process was subsequently applied to systematically evaluate the role of headgroup chemistry as well as polar/apolar interactions of aliphatic surfactant systems on the efficiency of nanodroplet formation. Ethanol was found to be the most effective solvent medium in promoting the formation and stabilization of EGaIn nanodroplets. For the case of thiol-based surfactants in ethanol, the yield of nanodroplet formation increased with the number of carbon atoms in the aliphatic part. In the case of the most effective surfactant system, octadecanethiol, the nanodroplet yield increased by about 370% as compared to pristine ethanol. The rather low overall efficiency of the reaction process along with the incompatibility of surfactant-stabilized EGaIn nanodroplets in nonpolar organic solvents suggests that the stabilization mechanism differs from the established self-assembled monolayer formation process that has been widely observed in nanoparticle formation.
Gleich, Stephen J; Nemergut, Michael E; Stans, Anthony A; Haile, Dawit T; Feigal, Scott A; Heinrich, Angela L; Bosley, Christopher L; Tripathi, Sandeep
2016-08-01
Ineffective and inefficient patient transfer processes can increase the chance of medical errors. Improvements in such processes are high-priority local institutional and national patient safety goals. At our institution, nonintubated postoperative pediatric patients are first admitted to the postanesthesia care unit before transfer to the PICU. This quality improvement project was designed to improve the patient transfer process from the operating room (OR) to the PICU. After direct observation of the baseline process, we introduced a structured, direct OR-PICU transfer process for orthopedic spinal fusion patients. We performed value stream mapping of the process to determine error-prone and inefficient areas. We evaluated primary outcome measures of handoff error reduction and the overall efficiency of patient transfer process time. Staff satisfaction was evaluated as a counterbalance measure. With the introduction of the new direct OR-PICU patient transfer process, the handoff communication error rate improved from 1.9 to 0.3 errors per patient handoff (P = .002). Inefficiency (patient wait time and non-value-creating activity) was reduced from 90 to 32 minutes. Handoff content was improved with fewer information omissions (P < .001). Staff satisfaction significantly improved among nearly all PICU providers. By using quality improvement methodology to design and implement a new direct OR-PICU transfer process with a structured multidisciplinary verbal handoff, we achieved sustained improvements in patient safety and efficiency. Handoff communication was enhanced, with fewer errors and content omissions. The new process improved efficiency, with high staff satisfaction. Copyright © 2016 by the American Academy of Pediatrics.
NASA Astrophysics Data System (ADS)
Fotilas, P.; Batzias, A. F.
2007-12-01
The equivalence indices synthesized for the comparative evaluation of the technoeconomic efficiency of industrial processes are of critical importance, since they serve as both (i) positive/analytic descriptors of the physicochemical nature of the process and (ii) measures of effectiveness, especially helpful for investigating competitiveness in the industrial/energy/environmental sector of the economy. In the present work, a new algorithmic procedure has been developed, which initially standardizes a real industrial process, then analyzes it as a compromise of two ideal processes, and finally synthesizes the index that can represent/reconstruct the real process as a result of the trade-off between the two ideal processes taken as parental prototypes. The same procedure performs fuzzy multicriteria ranking within a set of pre-selected industrial processes for two reasons: (a) to analyze the process most representative of the production/treatment under consideration, (b) to use the 'second best' alternative as a dialectic pole in the absence of the two ideal processes mentioned above. An implementation of this procedure is presented, concerning a facility for biological wastewater treatment with six alternatives: activated sludge through (i) continuous-flow incompletely-stirred tank reactors in series, (ii) a plug flow reactor with dispersion, (iii) an oxidation ditch, and biological processing through (iv) a trickling filter, (v) rotating contactors, (vi) shallow ponds. The criteria used for fuzzy (to account for uncertainty) ranking are capital cost, operating cost, environmental friendliness, reliability, flexibility, and extendibility. Two complementary indices were synthesized for the (ii)-alternative ranked first and their quantitative expressions were derived, covering a variety of kinetic models as well as recycle/bypass conditions. Finally, an analysis of estimating the optimal values of these indices at maximum technoeconomic efficiency is presented, and the implications expected to be caused by exogenous and endogenous factors (e.g., changes in environmental standards and innovative energy savings/substitution, respectively) are discussed by means of marginal efficiency graphs.
Performance indicators for the efficiency analysis of urban drainage systems.
Artina, S; Becciu, G; Maglionico, M; Paoletti, A; Sanfilippo, U
2005-01-01
Performance indicators implemented in a decision support system (DSS) for the technical, managerial and economic evaluation of urban drainage systems (UDS), called MOMA FD, are presented. Several kinds of information are collected and processed by MOMA FD to evaluate both the present situation and future scenarios of development and enhancement. Particular attention is focused on the evaluation of the environmental impact, which is considered a very relevant factor in the decision-making process for identifying priorities for UDS improvements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knittel, Christopher; Wolfran, Catherine; Gandhi, Raina
A wide range of climate plans rely on energy efficiency to generate energy and carbon emissions reductions, but conventional wisdom holds that consumers have historically underinvested in energy efficiency upgrades. This underinvestment may occur for a variety of reasons, one of which is that consumers are not adequately informed about the benefits of energy efficiency. To address this, the U.S. Department of Energy created a tool called the Home Energy Score (HEScore) to act as a simple, low-cost means to provide clear information about a home's energy efficiency and motivate homeowners and homebuyers to invest in energy efficiency. The Department of Energy is in the process of conducting four evaluations assessing the impact of the Home Energy Score on residential energy efficiency investments and program participation. This paper describes one of these evaluations: a randomized controlled trial conducted in New Jersey in partnership with New Jersey Natural Gas. The evaluation randomly provides the Home Energy Score to homeowners who received an audit between May 2014 and October 2015, either because they recently replaced their furnace, boiler, and/or gas water heater with a high-efficiency model and participated in a free audit to access an incentive, or because they requested an independent audit.
Stoeckel, D.M.; Stelzer, E.A.; Dick, L.K.
2009-01-01
Quantitative PCR (qPCR), applied to complex environmental samples such as water, wastewater, and feces, is susceptible to methodological and sample-related biases. In this study, we evaluated two exogenous DNA spike-and-recovery controls as proxies for the recovery efficiency of Bacteroidales 16S rDNA gene sequences (AllBac and qHF183) that are used for microbial source tracking (MST) in river water. Two controls, (1) the plant pathogen Pantoea stewartii, carrying the chromosomal target gene cpsD, and (2) Escherichia coli, carrying the plasmid-borne target gene DsRed2, were added to raw water samples immediately prior to concentration and DNA extraction for qPCR. When applied to samples processed in replicate, recovery of each control was positively correlated with the observed concentration of each MST marker. Adjustment of MST marker concentrations according to recovery efficiency reduced variability in replicate analyses when consistent processing and extraction methodologies were applied. Although the effects of this procedure on accuracy could not be tested due to uncertainties in control DNA concentrations, the observed reduction in variability should improve the strength of statistical comparisons. These findings suggest that either of the tested spike-and-recovery controls can be useful to measure the efficiency of extraction and recovery in routine laboratory processing. © 2009 Elsevier Ltd.
Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M
2008-06-01
Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration for time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose limiting toxicities (DLT) and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution to examine study efficiency metrics and evaluate design modifications that would improve study efficiency is presented. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times and time to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule relative to the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.
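As a hedged illustration of the discrete event simulation idea described above (this is not the authors' SAS macro solution; the accrual rate, observation window, and dose-limiting toxicity probability are hypothetical placeholders), a minimal C sketch shows how probability-based DLT outcomes and time-based accrual combine into a study-duration metric for a single 3+3-style cohort:

```c
/*
 * Illustrative sketch only: a tiny discrete event simulation of one
 * dose-escalation cohort under a 3+3-style rule, mixing probability-based
 * events (DLTs) with time-based events (patient accrual, observation window).
 * All rates and probabilities below are hypothetical placeholders.
 */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

static double exp_sample(double mean) {
    /* Sample an exponential waiting time with the given mean (days). */
    double u = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
    return -mean * log(u);
}

int main(void) {
    srand(42);
    const int    n_trials  = 10000;  /* simulated trial replicates      */
    const double accrual   = 14.0;   /* mean days between patient arrivals */
    const double eval_time = 28.0;   /* DLT observation window (days)   */
    const double p_dlt     = 0.15;   /* hypothetical DLT probability    */
    double total_days = 0.0;

    for (int t = 0; t < n_trials; ++t) {
        double clock = 0.0;
        int dlts = 0;
        /* Enroll a cohort of 3; under a 3+3 rule the cohort must be
           fully evaluated before the escalation decision is made. */
        for (int i = 0; i < 3; ++i) {
            clock += exp_sample(accrual);            /* arrival time  */
            if ((double)rand() / RAND_MAX < p_dlt)   /* DLT outcome   */
                ++dlts;
        }
        clock += eval_time;  /* wait for the full observation window  */
        total_days += clock;
        (void)dlts;          /* the escalation decision would branch here */
    }
    printf("Mean time to first escalation decision: %.1f days\n",
           total_days / n_trials);
    return 0;
}
```

A full comparison of the 3+3 and "rolling 6" designs would repeat this kind of loop with each design's enrollment and suspension rules and then compare the resulting time and DLT distributions, which is the role the post-processing macros play in the abstract above.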
Artificial Intelligence in Medicine and Radiation Oncology.
Weidlich, Vincent; Weidlich, Georg A
2018-04-13
Artificial Intelligence (AI) was reviewed with a focus on its potential applicability to radiation oncology. The improvement of process efficiencies and the prevention of errors were found to be the most significant contributions of AI to radiation oncology. It was found that the prevention of errors is most effective when data transfer processes are automated and operational decisions are based on logical or learned evaluations by the system. It was concluded that AI could greatly improve the efficiency and accuracy of radiation oncology operations.
NASA Technical Reports Server (NTRS)
1975-01-01
The gasification reactions necessary for the production of hydrogen from Montana subbituminous coal are presented. The coal composition is given. The gasifier types discussed include suspension (entrained) combustion, fluidized bed, and moving bed. Each gasification process is described. The steam-iron process, raw and product gas compositions, gasifier feed quantities, and process efficiency evaluations are also included.
An Evaluation of the Decennial Review Process.
ERIC Educational Resources Information Center
Barak, Robert; And Others
Information on a review of the decennial review process required by the Arizona Board of Regents (ABOR) for every academic department in each Arizona university is presented as part of the final report by ABOR's Task Force on Excellence, Efficiency and Competitiveness. Revision in the review process and eight policy statements are discussed, and…
Efficient processing of two-dimensional arrays with C or C++
Donato, David I.
2017-07-20
Because fast and efficient serial processing of raster-graphic images and other two-dimensional arrays is a requirement in land-change modeling and other applications, the effects of 10 factors on the runtimes for processing two-dimensional arrays with C and C++ are evaluated in a comparative factorial study. This study's factors include the choice among three C or C++ source-code techniques for array processing; the choice of Microsoft Windows 7 or a Linux operating system; the choice of 4-byte or 8-byte array elements and indexes; and the choice of 32-bit or 64-bit memory addressing. This study demonstrates how programmer choices can reduce runtimes by 75 percent or more, even after compiler optimizations. Ten points of practical advice for faster processing of two-dimensional arrays are offered to C and C++ programmers. Further study and the development of a C and C++ software test suite are recommended. Key words: array processing, C, C++, compiler, computational speed, land-change modeling, raster-graphic image, two-dimensional array, software efficiency
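To make the notion of competing source-code techniques concrete, the following minimal C sketch (an illustrative assumption, not code from the study or its test suite) contrasts two common ways of holding a two-dimensional array: a single contiguous block addressed with a computed row-major index versus an array of row pointers. Relative runtimes depend on compiler, operating system, element size, and addressing mode, which is exactly the kind of interaction a factorial study measures.

```c
/*
 * Illustrative sketch: two common C techniques for a 2-D array.
 * Technique 1: one contiguous allocation with computed row-major indexes.
 * Technique 2: an array of row pointers into separate allocations.
 */
#include <stdio.h>
#include <stdlib.h>

#define ROWS 2000
#define COLS 2000

int main(void) {
    /* Technique 1: contiguous block. */
    double *flat = malloc((size_t)ROWS * COLS * sizeof *flat);
    /* Technique 2: array of row pointers. */
    double **rows = malloc(ROWS * sizeof *rows);
    for (int r = 0; r < ROWS; ++r)
        rows[r] = malloc(COLS * sizeof **rows);

    double sum1 = 0.0, sum2 = 0.0;
    /* Fill both representations with the same values. */
    for (int r = 0; r < ROWS; ++r)
        for (int c = 0; c < COLS; ++c) {
            flat[(size_t)r * COLS + c] = r + c;
            rows[r][c] = r + c;
        }
    /* Traverse both; timing each loop separately would expose the
       runtime difference the study quantifies. */
    for (int r = 0; r < ROWS; ++r)
        for (int c = 0; c < COLS; ++c) {
            sum1 += flat[(size_t)r * COLS + c];
            sum2 += rows[r][c];
        }
    printf("sums: %.0f %.0f\n", sum1, sum2);

    for (int r = 0; r < ROWS; ++r) free(rows[r]);
    free(rows);
    free(flat);
    return 0;
}
```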
NASA Astrophysics Data System (ADS)
Kumar, Pankaj; Saraswat, Chitresh; Mishra, Binaya Kumar; Avtar, Ram; Patel, Hiral; Patel, Asha; Sharma, Tejal; Patel, Roshni
2017-09-01
Fluoride pollution (with concentration >1.0 mg/L) in groundwater has become a global threat in the recent past due to the diminishing availability of potable groundwater resources. Among the several defluoridation techniques developed so far, the adsorption process has proved to be the most economical and efficient. This study evaluates the defluoridation efficiency of powdered rice husk, finely chopped rice husk and sawdust by the batch adsorption process. Optimum defluoridation capacity is achieved by optimizing various parameters, viz. dose of adsorbent, pH, contact time and initial concentration. It was found that all three materials can be employed for defluoridation, but powdered rice husk is the best adsorbent of the three. Powdered rice husk showed fluoride removal efficiency ranging between 85 and 90% within a contact period of only 7 h when all parameters were optimized. Following this parameter optimization, adsorption efficiency was also evaluated at the natural pH of groundwater to minimize the cost of defluoridation. No significant difference was found between fluoride adsorption at the optimized pH (pH = 4) and the natural one (pH = 7), which indicates that powdered rice husk can be efficiently used for defluoridation at field scale. Adsorption onto this adsorbent closely followed the Langmuir isotherm. The value of the calculated separation factor also suggests favourable adsorption of fluoride onto this adsorbent under the conditions used for the experiments. Field application for defluoridation of groundwater using this adsorbent (based on the pH of the local groundwater and the seasonal variation of temperature) showed a high success rate.
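For reference, the Langmuir isotherm and the dimensionless separation factor referred to in the abstract take their standard forms below (generic notation; the fitted parameter values are not given in the abstract):

```latex
% Standard Langmuir isotherm and separation factor R_L (generic symbols):
\begin{align}
  q_e &= \frac{q_{\max} K_L C_e}{1 + K_L C_e}, &
  R_L &= \frac{1}{1 + K_L C_0},
\end{align}
% q_e: fluoride adsorbed at equilibrium (mg/g), q_max: monolayer capacity,
% K_L: Langmuir constant (L/mg), C_e: equilibrium concentration (mg/L),
% C_0: initial concentration (mg/L); 0 < R_L < 1 indicates favourable adsorption.
```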
NASA Astrophysics Data System (ADS)
Qaddus, Muhammad Kamil
The gap between estimated and actual savings in energy efficiency and conservation (EE&C) projects or programs forms the problem statement of this thesis, scoped to public and government buildings. The gap is analyzed first at the impact level and then at the process level. At the impact level, the methodology categorizes the gap as a 'Realization Gap' and views this categorization within the context of past and current narratives linked to the realization gap. At the process level, the methodology further analyzes the realization gap on a process-evaluation basis. The process evaluation criterion, a product of this basis, is then applied to two different programs (DESEU and NYC ACE) linked to the scope of this thesis. Drawing on the synergies of the impact- and process-level analyses, the thesis offers proposals on program development and structure using the process evaluation criterion. An innovative financing and benefit-distribution structure is developed as part of the proposal: Restricted Stakeholder Crowd Financing and a Risk-Free Incentivized Return are the products of the proposed financing and benefit-distribution structures, respectively. These products are complemented by an alternative approach to estimating EE&C savings, which advocates estimation based on range allocation rather than the currently used unique estimated-savings approach. The Way Ahead section explores the synergy between financial and engineering ranges of energy savings as a multi-discipline approach for future research, and equips the proposed program structure with risk aversion and incentive allocation when dealing with uncertainty. This set of new approaches is believed to better close the realization gap between estimated and actual energy efficiency savings.
A Sociotechnical Approach to Evaluating the Impact of ICT on Clinical Care Environments
Li, Julie
2010-01-01
Introduction: Process-supporting information technology holds the potential to increase efficiency, reduce errors, and alter professional roles and responsibilities in a manner that allows improvement in the delivery of patient care. However, clashes between the model of health care work inscribed in these tools and the actual nature of work have resulted in staff resistance and decreased organisational uptake of ICT, as well as unexpected and negative effects on efficiency and patient safety. Sociotechnical theory provides a paradigm against which workflow and the transfusion of ICT in healthcare can be better explored and understood. Design: This paper conceptualises a formative, multi-method longitudinal evaluation process to explore the impact of ICT with an appreciation of the relationship between the social and technical systems within a clinical department. Method: Departmental culture, including clinical work processes and communication patterns, will be thoroughly explored before system implementation using both quantitative and qualitative research methods. Findings will be compared with post-implementation data, which will incorporate measurement of safety and workflow efficiency indicators. Discussion: Sociotechnical and multi-method approaches to evaluation are not without criticism; inherent in the protocol are limitations of sociotechnical theory and criticisms of the multi-method approach. Testing the methodology in real clinical settings will serve to verify efficacy and refine the process. PMID:21594005
In Vitro Comparison of Adipokine Export Signals.
Sharafi, Parisa; Kocaefe, Y Çetin
2016-01-01
Mammalian cells are widely used for recombinant protein production in research and biotechnology. Utilization of export signals significantly facilitates production and purification processes. Thirty-five years after the discovery of the mammalian export machinery, there are still uncertainties regarding the efficiency of export signals. The aim of this study was the comparative evaluation of the efficiency of selected export signals using adipocytes as a cell model. Adipocytes have a large capacity for protein secretion, including several enzymes, adipokines, and other signaling molecules, providing a valid system for quantitative evaluation. Constructs expressing N-terminal fusion export signals were generated with Enhanced Green Fluorescent Protein (EGFP) as a reporter for quantitative and qualitative evaluation. Furthermore, fluorescence microscopy was used to trace the intracellular traffic of the reporter. The export efficiency of six selected proteins secreted from adipocytes was evaluated. Quantitative comparison of the intracellular and exported fractions of the recombinant constructs demonstrated a similar efficiency among the studied sequences, with minor variations. The export signal of Retinol Binding Protein (RBP4) exhibited the highest efficiency. This study presents the first quantitative data showing variations among export signals in adipocytes, which will help optimize recombinant protein distribution.
Wang, Mo; Ling, Jie; Chen, Ying; Song, Jie; Sun, E; Shi, Zi-Qi; Feng, Liang; Jia, Xiao-Bin; Wei, Ying-Jie
2017-11-01
The increasingly apparent liver injury problems of bone-strengthening Chinese medicines have brought challenges for clinical application, and it is necessary to consider both effectiveness and safety when screening anti-osteoporosis Chinese medicines. Metabolic transformation is closely related to drug efficacy and toxicity, so it is important to comprehensively consider metabolism-action/toxicity (M-Act/Tox) when screening anti-osteoporosis Chinese medicines. Current evaluation models and the number of compounds (including metabolites) severely restrict efficient screening in vivo. Drawing on previous relevant research and the domestic and international literature, a zebrafish M-Act/Tox integrative method is put forward for efficiently screening anti-osteoporosis herbal medicines, which organically integrates a zebrafish metabolism model, an osteoporosis model and a toxicity evaluation method. This method can break through the bottleneck that trace components cannot be evaluated efficiently and comprehensively in vivo, and it enables efficient and comprehensive screening of anti-osteoporosis traditional medicines based on in vivo processes, taking both safety and effectiveness into account. This is significant for accelerating the discovery of effective and safe innovative traditional Chinese medicines for osteoporosis. Copyright© by the Chinese Pharmaceutical Association.
NASA Technical Reports Server (NTRS)
Powell, W. B.
1973-01-01
Thrust chamber performance is evaluated in terms of an analytical model incorporating all the loss processes that occur in a real rocket motor. The important loss processes in the real thrust chamber were identified, and a methodology and recommended procedure for predicting real thrust chamber vacuum specific impulse were developed. Simplified equations for the calculation of vacuum specific impulse are developed to relate the delivered performance (both vacuum specific impulse and characteristic velocity) to the ideal performance as degraded by the losses corresponding to a specified list of loss processes. These simplified equations enable the various performance loss components, and the corresponding efficiencies, to be quantified separately (except that interaction effects are arbitrarily assigned in the process). The loss and efficiency expressions presented can be used to evaluate experimentally measured thrust chamber performance, to direct development effort into the areas most likely to yield improvements in performance, and as a basis to predict performance of related thrust chamber configurations.
Integrating Waste Heat from CO 2 Removal and Coal-Fired Flue Gas to Increase Plant Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Irvin, Nick; Kowalczyk, Joseph
In project DE-FE0007525, Southern Company Services demonstrated heat integration methods for the capture and sequestration of carbon dioxide produced from pulverized coal combustion. A waste heat recovery technology (termed the High Efficiency System) from Mitsubishi Heavy Industries America was integrated into an existing 25-MW amine-based CO2 capture process (Kansai Mitsubishi Carbon Dioxide Recovery Process®) at Southern Company's Plant Barry to evaluate improvements in the energy performance of the pulverized coal plant and CO2 capture process. The heat integration system consists of two primary pieces of equipment: (1) the CO2 Cooler, which uses product CO2 gas from the capture process to heat boiler condensate, and (2) the Flue Gas Cooler, which uses air heater outlet flue gas to further heat boiler condensate. Both pieces of equipment were included in the pilot system. The pilot CO2 Cooler used waste heat from the 25-MW CO2 capture plant (but not always from product CO2 gas, as intended). The pilot Flue Gas Cooler used heat from a slipstream of flue gas taken from downstream of Plant Barry's air heater. The pilot also included a 0.25-MW electrostatic precipitator. The 25-MW High Efficiency System operated for approximately six weeks over a four-month period in conjunction with the 25-MW CO2 capture facility at Plant Barry. Results from the program were used to evaluate the technical and economic feasibility of full-scale implementation of this technology. The test program quantified energy efficiency improvements to a host power plant that could be realized due to the High Efficiency System. Through the execution of this project, the team verified the integrated operation of the High Efficiency System and the Kansai Mitsubishi Carbon Dioxide Recovery Process®. The ancillary benefits of the High Efficiency System were also quantified, including reduced water consumption, a decrease in toxic air emissions, and better overall air quality control system performance.
ERIC Educational Resources Information Center
Magagula, Cisco
2002-01-01
The evaluator was contracted to determine whether the online or face-to-face course components met course participants' needs and increased their understanding and knowledge of policy development processes, and to determine the efficiency and effectiveness of delivery strategies. In addition, the course evaluator was asked to look at the…
Kagan, Jonathan M; Rosas, Scott; Trochim, William M K
2010-10-01
New discoveries in basic science are creating extraordinary opportunities to design novel biomedical preventions and therapeutics for human disease. But the clinical evaluation of these new interventions is, in many instances, being hindered by a variety of legal, regulatory, policy and operational factors, few of which enhance research quality, the safety of study participants or research ethics. With the goal of helping increase the efficiency and effectiveness of clinical research, we have examined how the integration of utilization-focused evaluation with elements of business process modeling can reveal opportunities for systematic improvements in clinical research. Using data from the NIH global HIV/AIDS clinical trials networks, we analyzed the absolute and relative times required to traverse defined phases associated with specific activities within the clinical protocol lifecycle. Using simple median durations and Kaplan-Meier survival analysis, we show how such time-based analyses can provide a rationale for the prioritization of research process analysis and re-engineering, as well as a means for statistically assessing the impact of policy modifications, resource utilization, re-engineered processes and best practices. Successfully applied, this approach can help researchers be more efficient in capitalizing on new science to speed the development of improved interventions for human disease.
Scotland, Graham; Bryan, Stirling
2017-02-01
At a time of intense pressure on health care budgets, the technology management challenge is for disinvestment in low-value technologies and reinvestment in higher value alternatives. The aim of this article is to explore ways in which health economists might begin to redress the observed imbalance between the evaluation of new and existing in-use technologies. The argument is not against evaluating new technologies but in favor of the "search for efficiency," where the ultimate objective is to identify reallocations that improve population health in the face of resource scarcity. We explore why in-use technologies may be of low value and consider how economic evaluation analysts might embrace a broader efficiency lens, first through "technology management" (a process of analysis and evidence-informed decision making throughout a technology's life cycle) and progressing through "pathway management" (the search for efficiency gains across entire clinical care pathways). A number of model-based examples are used to illustrate the approaches.
Comparative Evaluation of Financing Programs: Insights From California’s Experience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deason, Jeff
Berkeley Lab examines criteria for a comparative assessment of multiple financing programs for energy efficiency, developed through a statewide public process in California. The state legislature directed the California Alternative Energy and Advanced Transportation Financing Authority (CAEATFA) to develop these criteria. CAEATFA's report to the legislature, an invaluable reference for other jurisdictions considering these topics, discusses the proposed criteria and the rationales behind them in detail. Berkeley Lab's brief focuses on several salient issues that emerged during the criteria development and discussion process. Many of these issues are likely to arise in other states that plan to evaluate the impacts of energy efficiency financing programs, whether for a single program or multiple programs. Issues discussed in the brief include:
- The stakeholder process to develop the proposed assessment criteria
- Attribution of outcomes, such as energy savings, to financing programs vs. other drivers
- Choosing the outcome metric of primary interest: program take-up levels vs. savings
- The use of net benefits vs. benefit-cost ratios for cost-effectiveness evaluation
- Non-energy factors
- Consumer protection factors
- Market transformation impacts
- Accommodating varying program goals in a multi-program evaluation
- Accounting for costs and risks borne by various parties, including taxpayers and utility customers, in cost-effectiveness analysis
- How to account for potential synergies among programs in a multi-program evaluation
2005-05-01
efficiencies similar to those in the private sector. However, along the way, Government and private sector industry have begun to disagree about how PPI is... double that of the private sector due to an evaluation process that is cumbersome, time-consuming, and lacking the efficiencies enjoyed by private
Smith, Joseph M.; Wells, Sarah P.; Mather, Martha E.; Muth, Robert M.
2014-01-01
When researchers and managers initiate sampling on a new stream or river system, they do not know how effective each gear type is and whether their sampling effort is adequate. Although the types and amount of gear may be different for other studies, systems, and research questions, the five-step process described here for making sampling decisions and evaluating sampling efficiency can be applied widely to any system to restore, manage, and conserve aquatic ecosystems. It is believed that incorporating this gear-evaluation process into a wide variety of studies and ecosystems will increase rigour within and across aquatic biodiversity studies.
NASA Astrophysics Data System (ADS)
Smith, L.; Murphy, J. W.; Kim, J.; Rozhdestvenskyy, S.; Mejia, I.; Park, H.; Allee, D. R.; Quevedo-Lopez, M.; Gnade, B.
2016-12-01
Solid-state neutron detectors offer an alternative to 3He-based detectors, but suffer from limited neutron efficiencies that make their use in security applications impractical. Solid-state neutron detectors based on single-crystal silicon also have relatively high gamma-ray efficiencies that lead to false positives. Thin-film polycrystalline CdTe based detectors require less complex processing with significantly lower gamma-ray efficiencies. Advanced geometries can also be implemented to achieve high thermal neutron efficiencies competitive with silicon-based technology. This study evaluates these strategies by simulation and experimentation and demonstrates an approach to achieve >10% intrinsic efficiency with <10^-6 gamma-ray efficiency.
Wasike, Chrilukovian B; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J
2011-01-01
Animal recording in Kenya is characterised by erratic producer participation and high drop-out rates from the national recording scheme. This study evaluates factors influencing the efficiency of the beef and dairy cattle recording system. Factors influencing the efficiency of animal identification and registration, pedigree and performance recording, and genetic evaluation and information utilisation were generated using qualitative and participatory methods. Pairwise comparison of factors was done by strengths-weaknesses-opportunities-threats (SWOT) analytical hierarchy process analysis, and priority scores determining their relative importance to the system were calculated using the eigenvalue method. For identification and registration, and for evaluation and information utilisation, external factors had high priority scores. For pedigree and performance recording, threats and weaknesses had the highest priority scores. Strength factors could not sustain the required efficiency of the system, and its weaknesses predisposed it to threats. Available opportunities could be explored as interventions to restore efficiency in the system. Defensive strategies, such as reorienting the system to offer utility benefits to recording, forming symbiotic and binding collaboration between recording organisations and NARS, and developing institutions to support recording, are feasible.
Yusof, Maryati Mohd; Khodambashi, Soudabeh; Mokhtar, Ariffin Marzuki
2012-12-21
There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy.
Assessment of iron chelates efficiency for photo-Fenton at neutral pH.
De Luca, Antonella; Dantas, Renato F; Esplugas, Santiago
2014-09-15
In this study, a homogeneous photo-Fenton-like process at neutral pH was applied to remove sulfamethoxazole from water. The process was performed using different chelating agents in order to solubilize iron in a neutral aqueous solution. The chelating agents tested were ethylenediaminetetraacetic acid (EDTA), nitrilotriacetic acid (NTA), oxalic acid (OA) and tartaric acid (TA). Iron leaching was monitored over the reaction time to evaluate the stability of the chelates and their resistance to HO· and UV-A radiation. Chelates of EDTA and NTA were more stable than those of OA and TA, which also confirmed their higher efficiency. Total Organic Carbon (TOC) analyses were also performed to evaluate the contribution of the chelating agents to solution contamination. The better biodegradability of NTA compared with EDTA, combined with its higher micro-contaminant removal efficiency and its smaller TOC contribution, indicates that NTA could be a useful option for performing photo-Fenton processes at neutral pH. Copyright © 2014 Elsevier Ltd. All rights reserved.
Large Scale Frequent Pattern Mining using MPI One-Sided Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vishnu, Abhinav; Agarwal, Khushbu
In this paper, we propose a work-stealing runtime, Library for Work Stealing (LibWS), using the MPI one-sided model for designing a scalable FP-Growth (the de facto frequent pattern mining algorithm) on large scale systems. LibWS provides locality-efficient and highly scalable work-stealing techniques for load balancing on a variety of data distributions. We also propose a novel communication algorithm for the FP-Growth data exchange phase, which reduces the communication complexity from the state-of-the-art O(p) to O(f + p/f) for p processes and f frequent attribute-ids. FP-Growth is implemented using LibWS and evaluated on several work distributions and support counts. An experimental evaluation of FP-Growth on LibWS using 4096 processes on an InfiniBand cluster demonstrates excellent efficiency for several work distributions (87% efficiency for Power-law and 91% for Poisson). The proposed distributed FP-Tree merging algorithm provides a 38x communication speedup on 4096 cores.
Solar-assisted photodegradation of isoproturon over easily recoverable titania catalysts.
Tolosana-Moranchel, A; Carbajo, J; Faraldos, M; Bahamonde, A
2017-03-01
An easily recoverable homemade TiO2 catalyst (GICA-1) has been evaluated over the overall photodegradation process, understood as photocatalytic efficiency plus the catalyst recovery step, in the solar light-assisted photodegradation of isoproturon, together with its reuse in two consecutive cycles. Its global feasibility has been compared to that of commercial TiO2 P25. The homemade GICA-1 catalyst presented better sedimentation efficiency than TiO2 P25 at all studied pHs, which could be explained by its larger average hydrodynamic particle size (3 μm) and other physicochemical surface properties. The evaluation of the overall process (isoproturon photo-oxidation plus catalyst recovery) revealed the strengths of the homemade GICA-1 titania catalyst: total removal of isoproturon in less than 60 min, easy recovery by sedimentation, and reusability in two consecutive cycles without any loss of photocatalytic efficiency. Therefore, considering the whole photocatalytic cycle (good performance in photodegradation plus the catalyst recovery step), the homemade GICA-1 photocatalyst proved more affordable than commercial TiO2 P25.
Tsukasaki, Wakako; Maruyama, Jun-Ichi; Kitamoto, Katsuhiko
2014-01-01
Hyphal fusion is involved in the formation of an interconnected colony in filamentous fungi, and it is the first process in sexual/parasexual reproduction. However, it was difficult to evaluate hyphal fusion efficiency due to the low frequency in Aspergillus oryzae in spite of its industrial significance. Here, we established a method to quantitatively evaluate the hyphal fusion ability of A. oryzae with mixed culture of two different auxotrophic strains, where the ratio of heterokaryotic conidia growing without the auxotrophic requirements reflects the hyphal fusion efficiency. By employing this method, it was demonstrated that AoSO and AoFus3 are required for hyphal fusion, and that hyphal fusion efficiency of A. oryzae was increased by depleting nitrogen source, including large amounts of carbon source, and adjusting pH to 7.0.
Asami, Tatsuya; Katayama, Hiroyuki; Torrey, Jason Robert; Visvanathan, Chettiyappan; Furumai, Hiroaki
2016-09-15
In order to properly assess and manage the risk of infection by enteric viruses in tap water, virus removal efficiency should be evaluated quantitatively for individual processes in actual drinking water treatment plants (DWTPs); however, there have been only a few studies due to technical difficulties in quantifying low virus concentration in water samples. In this study, the removal efficiency of indigenous viruses was evaluated for coagulation-sedimentation (CS) and rapid sand filtration (RSF) processes in a DWTP in Bangkok, Thailand by measuring the concentration of viruses before and after treatment processes using real-time polymerase chain reaction (qPCR). Water samples were collected and concentrated from raw source water, after CS, and after RSF, and inhibitory substances in water samples were reduced by use of a hydrophobic resin (DAX-8). Pepper mild mottle virus (PMMoV) and JC polyomavirus (JC PyV) were found to be highly prevalent in raw waters, with concentrations of 10^(2.88 ± 0.35) and 10^(3.06 ± 0.42) copies/L (geometric mean ± S.D.), respectively. Step-wise removal efficiencies were calculated for individual processes, with some variation observed between wet and dry seasons. During the wet season, PMMoV was removed less by CS and more by RSF on average (0.40 log10 vs 1.26 log10, respectively), while the reverse was true for JC PyV (1.91 log10 vs 0.49 log10, respectively). Both viruses were removed similarly during the dry season, with CS removing the most virus (PMMoV, 1.61 log10 and 0.78 log10; JC PyV, 1.70 log10 and 0.59 log10; CS and RSF, respectively). These differences between seasons were potentially due to variations in raw water quality and the characteristics of the viruses themselves. These results suggest that PMMoV and JC PyV, which are more prevalent in environmental waters than the other enteric viruses evaluated in this study, could be useful in determining viral fate for the risk management of viruses in water treatment processes in actual full-scale DWTPs. Copyright © 2016. Published by Elsevier Ltd.
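The step-wise removals quoted above follow the standard log reduction value definition; for reference (generic symbols, not the paper's notation):

```latex
% Log reduction value for an individual treatment process (generic symbols):
\begin{equation}
  \mathrm{LRV} \;=\; \log_{10} C_{\mathrm{in}} - \log_{10} C_{\mathrm{out}}
             \;=\; \log_{10}\!\left(\frac{C_{\mathrm{in}}}{C_{\mathrm{out}}}\right),
\end{equation}
% where C_in and C_out are the virus gene copy concentrations (copies/L)
% entering and leaving the process, e.g. before and after CS or RSF.
```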
The energy demand of distillation-based systems for ethanol recovery and dehydration can be significant, particularly for dilute solutions [1]. An alternative separation process integrating vapor stripping with a vapor compression step and a vapor permeation membrane separation ...
NASA Astrophysics Data System (ADS)
Souček, P.; Murakami, T.; Claux, B.; Meier, R.; Malmbeck, R.; Tsukada, T.; Glatz, J.-P.
2015-04-01
An electrorefining process for the treatment of metallic spent nuclear fuel is being investigated at ITU. Solid aluminium cathodes are used for the homogeneous recovery of all actinides within the process, which is carried out in molten LiCl-KCl eutectic salt at a temperature of 500 °C. As the selectivity, efficiency and performance of solid Al cathodes have already been shown using un-irradiated An-Zr alloy based test fuels, the present work focused on a laboratory-scale demonstration of the process using irradiated METAPHIX-1 fuel composed of U67-Pu19-Zr10-MA2-RE2 (wt.%, MA = Np, Am, Cm, RE = Nd, Ce, Gd, Y). Different electrorefining techniques, conditions and cathode geometries were used during the experiment, yielding evaluation of separation factors, kinetic parameters of actinide-aluminium alloy formation, process efficiency and macro-structure characterisation of the deposits. The results confirmed excellent separation and very high efficiency of the electrorefining process using solid Al cathodes.
Studies on Tasar Cocoon Cooking Using Permeation Method
NASA Astrophysics Data System (ADS)
Javali, Uday C.; Malali, Kiran B.; Ramya, H. G.; Naik, Subhas V.; Padaki, Naveen V.
2018-02-01
Cocoon cooking is an important process before the reeling of tasar silk yarn. Cooking loosens the filaments in tasar cocoons, thereby easing yarn withdrawal during reeling. Tasar cocoons have a very hard shell, and hence these cocoons need a chemical cooking process to loosen the silk filaments. An attempt has been made in this article to study the effect of using a vacuum permeation chamber for tasar cocoon cooking in order to reduce the cooking time and improve the quality of tasar silk yarn. The vacuum-assisted permeation cooking method has been studied on tasar daba cocoons for cooking efficiency, deflossing and reelability, and its efficiency has been evaluated against different cooking methods, viz. the traditional and open pan cooking methods. The tasar silk produced after reeling was tested for fineness, strength and cohesion properties. Results indicate that the permeation method of tasar cooking ensures uniform cooking with higher efficiency, along with better reeling performance and improved yarn properties.
Occupants' Perceptions of Amenity and Efficiency for Verification of Spatial Design Adequacy.
Lee, Sangwon; Wohn, Kwangyun
2016-01-14
The best spatial design condition to satisfy the occupancy needs of amenity and efficiency is determined by analyzing spatial design adequacy (SDA). In this study, the relationship between spatial design elements and future occupants' perception of the space is analyzed. Thirty-three participants reported self-evaluated SDA describing the quality of eight alternative housing living rooms with different spatial factors. The occupants were guided through a perception-processing elaboration so that they could evaluate the actual perception in the real space. The findings demonstrate that spatial size (e.g., width, depth, and height) is significantly correlated with the overall satisfaction of amenity. It was also found that spatial shape (e.g., the width-to-depth ratio, the height-to-area ratio, and room shape) may significantly influence the overall satisfaction of efficiency. The findings also demonstrate that the causal relationship between the spatial factors and the space is clearly present in the occupants' perception, reflecting the time-sequential characteristics of the actual experience divided into amenity and efficiency. This result indicates that the correlation between the spatial factors and the space of SDA under the occupants' perception-processing elaboration can be a useful guide for predicting occupancy satisfaction with amenity and efficiency in real spaces.
Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee
2016-04-01
In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and identifying the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £ 35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only or to collect limited data on most and detailed data on a subset. © The Author(s) 2016.
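In standard value-of-information notation, the expected net gain of sampling used to compare designs can be written as below (a sketch with generic symbols; the paper's exact formulation and sample-size arguments may differ):

```latex
% Expected net gain of sampling for a design collecting n_A item-level and
% n_B class-level cost observations (generic value-of-information notation):
\begin{equation}
  \mathrm{ENGS}(n_A, n_B) \;=\; \mathrm{EVSI}(n_A, n_B) \;-\; C_s(n_A, n_B),
\end{equation}
% where EVSI is the expected value of sample information with respect to
% incremental net benefit and C_s is the cost of collecting and analysing
% that sample; the efficient mix maximizes ENGS over (n_A, n_B).
```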
USDA-ARS?s Scientific Manuscript database
Evaluating the effectiveness of conservation practices (CPs) is an important step to achieving efficient and successful water quality management. Watershed-scale simulation models can provide useful and convenient tools for this evaluation, but simulated conservation practice effectiveness should be...
Holding Students Accountable in Team Projects
ERIC Educational Resources Information Center
Mentzer, Nathan
2014-01-01
This article describes an efficient peer evaluation process that can be implemented at the middle and high school levels, and that holds students accountable for their individual contributions in a team-based project. Teachers faced with this challenge will welcome the web-based peer-evaluation interface that was capable of soliciting student…
Evaluating Eco-Innovation of OECD Countries with Data Envelopment Analysis
ERIC Educational Resources Information Center
Mavi, Reza Kiani; Standing, Craig
2016-01-01
Government regulations require businesses to improve their processes and products/services in a green and sustainable manner. For being environmentally friendly, businesses should invest more on eco-innovation practices. Firms eco-innovate to promote eco-efficiency and sustainability. This paper evaluates the eco-innovation performance of…
Performance evaluation of nonhomogeneous hospitals: the case of Hong Kong hospitals.
Li, Yongjun; Lei, Xiyang; Morton, Alec
2018-02-14
Throughout the world, hospitals are under increasing pressure to become more efficient. Efficiency analysis tools can play a role in giving policymakers insight into which units are less efficient and why. Many researchers have studied the efficiency of hospitals using data envelopment analysis (DEA) as an efficiency analysis tool. However, in the existing literature on DEA-based performance evaluation, a standard assumption of the constant returns to scale (CRS) and variable returns to scale (VRS) DEA models is that decision-making units (DMUs) use a similar mix of inputs to produce a similar set of outputs. In fact, hospitals with different primary goals supply different services and provide different outputs; that is, hospitals are nonhomogeneous and the standard assumption of the DEA model is not applicable to the performance evaluation of nonhomogeneous hospitals. This paper accounts for the nonhomogeneity among hospitals in the performance evaluation and takes hospitals in Hong Kong as a case study. An extension of Cook et al. (2013) [1] based on the VRS assumption is developed to evaluate nonhomogeneous hospitals' efficiencies, since hospital inputs vary greatly. Following the philosophy of Cook et al. (2013) [1], hospitals are divided into homogeneous groups and the production process of each hospital is divided into subunits. The performance of hospitals is measured on the basis of the subunits. The proposed approach can be applied to measure the performance of other nonhomogeneous entities that exhibit variable returns to scale.
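For orientation, the input-oriented envelopment form of the VRS DEA model that such evaluations build on is shown below (generic notation; the paper extends this kind of model to nonhomogeneous subunits following Cook et al. (2013)):

```latex
% Input-oriented VRS (BCC) envelopment model for the hospital under
% evaluation (index o), with n hospitals, m inputs and s outputs:
\begin{align}
  \min_{\theta,\,\lambda}\ & \theta \notag\\
  \text{s.t.}\ & \sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{io}, && i = 1,\dots,m, \notag\\
  & \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro}, && r = 1,\dots,s, \notag\\
  & \sum_{j=1}^{n} \lambda_j = 1, \quad \lambda_j \ge 0, && j = 1,\dots,n. \notag
\end{align}
% x_{ij}, y_{rj}: inputs and outputs of hospital j; the convexity constraint
% sum(lambda_j) = 1 imposes variable returns to scale; theta = 1 indicates
% an efficient unit.
```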
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elia, Valerio; Gnoni, Maria Grazia, E-mail: mariagrazia.gnoni@unisalento.it; Tornese, Fabiana
Highlights:
• Pay-As-You-Throw (PAYT) schemes are becoming widespread in several countries.
• Economic, organizational and technological issues have to be integrated in an efficient PAYT model design.
• Efficiency refers to a PAYT system which supports high citizen participation rates as well as economic sustainability.
• Different steps and constraints have to be evaluated, from collection services to types of technology.
• A holistic approach is discussed to support the diffusion of PAYT systems.
Abstract: Pay-As-You-Throw (PAYT) strategies are becoming widely applied in solid waste management systems; the main purpose is to support a more sustainable management of waste flows from economic, environmental and social points of view. Adopting PAYT charging models increases the complexity of the waste management service, as new organizational issues have to be evaluated compared to flat charging models. In addition, innovative technological solutions could also be adopted to increase the overall efficiency of the service. Unit pricing, user identification and waste measurement are the three most important processes to be defined in a PAYT system. The paper proposes a holistic framework to support an effective design and management process. The framework defines the most critical processes and effective organizational and technological solutions to support waste managers as well as researchers.
Design of the storage location based on the ABC analyses
NASA Astrophysics Data System (ADS)
Jemelka, Milan; Chramcov, Bronislav; Kříž, Pavel
2016-06-01
The paper focuses on process efficiency and on saving storage costs. Maintaining inventory through a putaway strategy takes personnel time and costs money. The aim is to control inventory in the best way. The ABC classification, based on Vilfredo Pareto's theory, is used for the design of the warehouse layout. The new design of storage locations reduces the travel distance of forklifts and total costs, and it increases inventory process efficiency. The suggested solutions and an evaluation of the achieved results are described in detail. The proposed solutions were realized in real warehouse operation.
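A minimal sketch of the ABC (Pareto) classification step is given below in C (an illustrative assumption, not the paper's implementation; the 80%/95% thresholds and the sample data are placeholders). Items are ranked by pick frequency, and the fastest movers (class A) would be assigned to the most accessible storage locations.

```c
/*
 * Illustrative ABC classification of stock items by pick frequency.
 * Thresholds (80 % / 95 % of cumulative picks) and sample data are
 * hypothetical placeholders.
 */
#include <stdio.h>
#include <stdlib.h>

typedef struct { const char *sku; double picks; char cls; } Item;

static int by_picks_desc(const void *a, const void *b) {
    double d = ((const Item *)b)->picks - ((const Item *)a)->picks;
    return (d > 0) - (d < 0);   /* sort in descending order of picks */
}

int main(void) {
    Item items[] = {
        {"A-100", 950, 0}, {"B-220", 420, 0}, {"C-310", 160, 0},
        {"D-405",  90, 0}, {"E-512",  40, 0}, {"F-619",  15, 0},
    };
    const int n = sizeof items / sizeof items[0];
    double total = 0.0;
    for (int i = 0; i < n; ++i) total += items[i].picks;

    qsort(items, n, sizeof items[0], by_picks_desc);

    double cum = 0.0;
    for (int i = 0; i < n; ++i) {
        cum += items[i].picks;
        double share = cum / total;                 /* cumulative share */
        items[i].cls = share <= 0.80 ? 'A'          /* fastest movers   */
                     : share <= 0.95 ? 'B' : 'C';
        printf("%-6s picks=%6.0f cum=%5.1f%% class=%c\n",
               items[i].sku, items[i].picks, 100.0 * share, items[i].cls);
    }
    return 0;
}
```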
Relatively fast! Efficiency advantages of comparative thinking.
Mussweiler, Thomas; Epstude, Kai
2009-02-01
Comparisons are a ubiquitous process in information processing. Seven studies examine whether, how, and when comparative thinking increases the efficiency of judgment and choice. Studies 1-4 demonstrate that procedurally priming participants to engage in more vs. less comparison influences how they process information about a target. Specifically, they retrieve less information about the target (Studies 1A, 1B), think more about an information-rich standard (Study 2) about which they activate judgment-relevant information (Study 3), and use this information to compensate for missing target information (Study 4). Studies 2-5 demonstrate the ensuing efficiency advantages. Participants who are primed on comparative thinking are faster in making a target judgment (Studies 2A, 2B, 4, 5) and have more residual processing capacities for a secondary task (Study 5). Studies 6 and 7 establish two boundary conditions by demonstrating that comparative thinking holds efficiency advantages only if target and standard are partly characterized by alignable features (Study 6) that are difficult to evaluate in isolation (Study 7). These findings indicate that comparative thinking may often constitute a useful mechanism to simplify information processing. (PsycINFO Database Record (c) 2009 APA, all rights reserved).
Improved system integration for integrated gasification combined cycle (IGCC) systems.
Frey, H Christopher; Zhu, Yunhua
2006-03-01
Integrated gasification combined cycle (IGCC) systems are a promising technology for power generation. They include an air separation unit (ASU), a gasification system, and a gas turbine combined cycle power block, and feature competitive efficiency and lower emissions compared to conventional power generation technology. IGCC systems are not yet in widespread commercial use and opportunities remain to improve system feasibility via improved process integration. A process simulation model was developed for IGCC systems with alternative types of ASU and gas turbine integration. The model is applied to evaluate integration schemes involving nitrogen injection, air extraction, and combinations of both, as well as different ASU pressure levels. The optimal nitrogen injection only case in combination with an elevated pressure ASU had the highest efficiency and power output and approximately the lowest emissions per unit output of all cases considered, and thus is a recommended design option. The optimal combination of air extraction coupled with nitrogen injection had slightly worse efficiency, power output, and emissions than the optimal nitrogen injection only case. Air extraction alone typically produced lower efficiency, lower power output, and higher emissions than all other cases. The recommended nitrogen injection only case is estimated to provide annualized cost savings compared to a nonintegrated design. Process simulation modeling is shown to be a useful tool for evaluation and screening of technology options.
Chiu, Ming-Chuan; Hsieh, Min-Chih
2016-05-01
The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used the Human Factors Analysis and Classification System (HFACS) and root cause analysis (RCA) to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed most easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process complements shortcomings in existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
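The ranking step shared by TOPSIS and its fuzzy variants can be summarized as below (a sketch in generic notation; the paper's fuzzy formulation adds fuzzy weights and fuzzy distance measures):

```latex
% TOPSIS closeness coefficient for error factor i (generic notation):
\begin{equation}
  CC_i \;=\; \frac{d_i^{-}}{d_i^{+} + d_i^{-}}, \qquad 0 \le CC_i \le 1,
\end{equation}
% where d_i^+ and d_i^- are the distances of factor i from the positive and
% negative ideal solutions over the four criteria; factors with larger CC_i
% rank higher and are prioritized for improvement.
```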
NASA Astrophysics Data System (ADS)
Ghafuri, Mohazabeh; Golfar, Bahareh; Nosrati, Mohsen; Hoseinkhani, Saman
2014-12-01
ATP production is one of the most vital processes in living cells and occurs with high efficiency. Thermodynamic evaluation of this process and of the factors involved in oxidative phosphorylation can provide a valuable guide for increasing energy production efficiency in research and industry. Although energy transduction has been studied qualitatively in several studies, there are only a few brief reviews based on mathematical models of this subject. In our previous work, we suggested a mathematical model for ATP production based on non-equilibrium thermodynamic principles. In the present study, based on new findings on the respiratory chain of animal mitochondria, Golfar's model has been used to generate improved results for the efficiency of oxidative phosphorylation and the rate of energy loss. The results calculated from the modified coefficients for the proton pumps of the respiratory chain enzymes are closer to the experimental results and validate the model.
Evaluation of automatic video summarization systems
NASA Astrophysics Data System (ADS)
Taskiran, Cuneyt M.
2006-01-01
Compact representations of video data, or video summaries, greatly enhance efficient video browsing. However, rigorous evaluation of video summaries generated by automatic summarization systems is a complicated process. In this paper we examine the summary evaluation problem. Text summarization is the oldest and most successful summarization domain. We show some parallels between these two domains and introduce methods and terminology. Finally, we present results of a comprehensive summary evaluation that we have performed.
ERIC Educational Resources Information Center
Estelami, Hooman
2016-01-01
One of the fundamental drivers of the growing use of distance learning methods in modern business education has been the efficiency gains associated with this method of educational delivery. Distance methods benefit both students and educational institutions as they facilitate the processing of large volumes of learning material to overcome…
ERIC Educational Resources Information Center
Hruška, Ing. Zdenek
2018-01-01
Teaching of accounting is specific due to its frequently updated content, because Czech legal regulations significantly change annually, either because of the legislative or harmonization modifications, hence there is a need to constantly seek new ways to ensure a good quality of teaching in the efficient education process. The paper is based on…
Delayed Slater determinant update algorithms for high efficiency quantum Monte Carlo.
McDaniel, T; D'Azevedo, E F; Li, Y W; Wong, K; Kent, P R C
2017-11-07
Within ab initio Quantum Monte Carlo simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunction. Each Monte Carlo step requires finding the determinant of a dense matrix. This is most commonly iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. The overall computational cost is, therefore, formally cubic in the number of electrons or matrix size. To improve the numerical efficiency of this procedure, we propose a novel multiple rank delayed update scheme. This strategy enables probability evaluation with an application of accepted moves to the matrices delayed until after a predetermined number of moves, K. The accepted events are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency via matrix-matrix operations instead of matrix-vector operations. This procedure does not change the underlying Monte Carlo sampling or its statistical efficiency. For calculations on large systems and algorithms such as diffusion Monte Carlo, where the acceptance ratio is high, order of magnitude improvements in the update time can be obtained on both multi-core central processing units and graphical processing units.
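The en-bloc update described above can be illustrated with a minimal dense-linear-algebra sketch that accumulates K accepted row replacements and folds them into the inverse at once via the Woodbury identity. The function and variable names are illustrative assumptions; this is not the production QMC implementation, which also handles determinant ratios and acceptance bookkeeping.

```python
import numpy as np

def delayed_update_inverse(A, Ainv, rows, new_rows):
    """Fold K accepted single-row replacements of A into A^{-1} en bloc.

    A        : (n, n) current Slater matrix
    Ainv     : its current inverse
    rows     : indices of the K distinct rows accepted for replacement
    new_rows : (K, n) replacement rows, one per accepted move

    A_new = A + U V, where U holds unit columns e_r and V the row
    differences, so the Woodbury identity yields the new inverse from
    one small (K, K) solve plus matrix-matrix products.
    """
    n = A.shape[0]
    K = len(rows)
    U = np.zeros((n, K))
    U[rows, np.arange(K)] = 1.0                      # columns are e_r
    V = np.asarray(new_rows) - A[rows, :]            # (K, n) row differences
    AinvU = Ainv @ U                                 # (n, K)
    S = np.eye(K) + V @ AinvU                        # small capacitance matrix
    Ainv_new = Ainv - AinvU @ np.linalg.solve(S, V @ Ainv)
    A_new = A.copy()
    A_new[rows, :] = new_rows
    return A_new, Ainv_new

# Sanity check against a direct inverse (hypothetical sizes)
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 8))
Ainv = np.linalg.inv(A)
A_new, Ainv_new = delayed_update_inverse(A, Ainv, [1, 4, 6], rng.normal(size=(3, 8)))
assert np.allclose(Ainv_new, np.linalg.inv(A_new))
```

The point of delaying is visible in the shapes: K rank-1 updates are replaced by products involving (n, K) and (K, K) blocks, which have much higher arithmetic intensity.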
Barreiros, Willian; Teodoro, George; Kurc, Tahsin; Kong, Jun; Melo, Alba C. M. A.; Saltz, Joel
2017-01-01
We investigate efficient sensitivity analysis (SA) of algorithms that segment and classify image features in a large dataset of high-resolution images. Algorithm SA is the process of evaluating variations of methods and parameter values to quantify differences in the output. An SA can be very computationally demanding because it requires re-processing the input dataset several times with different parameters to assess variations in output. In this work, we introduce strategies to efficiently speed up SA via runtime optimizations targeting distributed hybrid systems and reuse of computations from runs with different parameters. We evaluate our approach using a cancer image analysis workflow on a hybrid cluster with 256 nodes, each with an Intel Phi and a dual socket CPU. The SA attained a parallel efficiency of over 90% on 256 nodes. The cooperative execution using the CPUs and the Phi available in each node with smart task assignment strategies resulted in an additional speedup of about 2×. Finally, multi-level computation reuse led to an additional speedup of up to 2.46× on the parallel version. The level of performance attained with the proposed optimizations will allow the use of SA in large-scale studies. PMID:29081725
Thermochemical water decomposition. [hydrogen separation for energy applications
NASA Technical Reports Server (NTRS)
Funk, J. E.
1977-01-01
At present, nearly all of the hydrogen consumed in the world is produced by reacting hydrocarbons with water. As the supply of hydrocarbons diminishes, the problem of producing hydrogen from water alone will become increasingly important. Furthermore, producing hydrogen from water is a means of energy conversion by which thermal energy from a primary source, such as solar or nuclear fusion or fission, can be changed into an easily transportable and ecologically acceptable fuel. The attraction of thermochemical processes is that they offer the potential for converting thermal energy to hydrogen more efficiently than water electrolysis. A thermochemical hydrogen-production process is one which requires only water as material input and mainly thermal energy, or heat, as an energy input. Attention is given to a definition of process thermal efficiency, the thermodynamics of the overall process, the single-stage process, the two-stage process, multistage processes, the work of separation and a process evaluation.
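The process thermal efficiency referred to above is commonly written as the water-splitting reaction enthalpy divided by all thermal inputs, with any work input charged as the heat required to produce it. The following is a hedged sketch of that definition; the symbols are assumptions rather than the report's own notation.

```latex
% Process thermal efficiency of a thermochemical water-splitting cycle
% (sketch; symbols are assumptions):
%   \Delta H^{\circ}_{\mathrm{H_2O}} - enthalpy of H2O -> H2 + 1/2 O2 (HHV basis)
%   Q_i     - heat supplied to process step i
%   W_j     - work supplied to step j, charged as heat W_j / \eta_w
%   \eta_w  - efficiency of converting heat into that work
\[
  \eta_{\text{process}}
    \;=\;
  \frac{\Delta H^{\circ}_{\mathrm{H_2O}}}
       {\sum_i Q_i \;+\; \sum_j W_j / \eta_w}
\]
```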
NASA Technical Reports Server (NTRS)
Spitzer, M. B.
1983-01-01
The objective of this program is the investigation and evaluation of the capabilities of the ion implantation process for the production of photovoltaic cells from a variety of present-day, state-of-the-art, low-cost silicon sheet materials. Task 1 of the program concerns application of ion implantation and furnace annealing to fabrication of cells made from dendritic web silicon. Task 2 comprises the application of ion implantation and pulsed electron beam annealing (PEBA) to cells made from SEMIX, SILSO, heat-exchanger-method (HEM), edge-defined film-fed growth (EFG) and Czochralski (CZ) silicon. The goals of Task 1 comprise an investigation of implantation and anneal processes applied to dendritic web. A further goal is the evaluation of surface passivation and back surface reflector formation. In this way, processes yielding the very highest efficiency can be evaluated. Task 2 seeks to evaluate the use of PEBA for various sheet materials. A comparison of PEBA to thermal annealing will be made for a variety of ion implantation processes.
Mennesson, Eric; Erbacher, Patrick; Piller, Véronique; Kieda, Claudine; Midoux, Patrick; Pichon, Chantal
2005-06-01
Following systemic administration, polyplexes must cross the endothelium barrier to deliver genes to the target cells underneath. To design an efficient gene delivery system into lung epithelium, we evaluated capture and transfection efficiencies of DNA complexed with either Jet-PEI (PEI-polyplexes) or histidylated polylysine (His-polyplexes) in human lung microvascular endothelial cells (HLMEC) and tracheal epithelial cells. After optimizing growth conditions to obtain a tight HLMEC monolayer, we characterized uptake of polyplexes by flow cytometry and evaluated their transfection efficiency. Polyplexes were formulated as small particles. YOYO-labelled plasmid fluorescence intensity and luciferase activity were used as readouts for uptake and gene expression, respectively. PEI-polyplexes were more efficiently taken up than His-polyplexes by both non-polarized (2-fold) and polarized HLMEC (10-fold). They were mainly internalized by a clathrin-dependent pathway whatever the cell state. In non-polarized cells, His-polyplexes also entered mainly via a clathrin-dependent pathway but with an involvement of cholesterol. Cell polarization decreased the use of this route, and a clathrin-independent pathway became predominant. PEI-polyplexes transfected HLMEC more efficiently than His-polyplexes (10^7 vs. 10^5 relative light units (RLU)/mg of proteins), with a more pronounced difference in polarized cells. In contrast, no negative effect of cell polarization was observed with tracheal epithelial cells, in which both polyplexes had comparable efficiency. We show that the efficiency of polyplex uptake by HLMEC and their internalization mechanism are polymer-dependent. In contrast with His-polyplexes, HLMEC polarization has little influence on the uptake process and on the transfection efficiency of PEI-polyplexes. Copyright (c) 2005 John Wiley & Sons, Ltd.
Assessing efficiency and effectiveness of Malaysian Islamic banks: A two stage DEA analysis
NASA Astrophysics Data System (ADS)
Kamarudin, Norbaizura; Ismail, Wan Rosmanira; Mohd, Muhammad Azri
2014-06-01
Islamic banks in Malaysia are indispensable players in the financial industry with the growing need for syariah-compliant systems. In the banking industry, most recent studies have focused only on operational efficiency and rarely on operational effectiveness. Since the production process of the banking industry can be described as a two-stage process, two-stage Data Envelopment Analysis (DEA) can be applied to measure bank performance. This study was designed to measure the overall performance, in terms of efficiency and effectiveness, of Islamic banks in Malaysia using a two-stage DEA approach. This paper presents the analysis of a DEA model which separates efficiency and effectiveness in order to evaluate the performance of ten selected Islamic banks in Malaysia for the financial year ended 2011. The analysis shows that the average efficiency score is higher than the average effectiveness score, so Malaysian Islamic banks were more efficient than effective. Furthermore, none of the banks exhibited best practice in both stages, indicating that a bank with better efficiency does not always have better effectiveness at the same time.
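Each stage score in a two-stage DEA is typically obtained from a standard envelopment linear program. The sketch below shows the input-oriented CCR model for a single stage under hypothetical bank inputs and outputs; the study's actual linkage of stage-one outputs into stage-two inputs is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k (envelopment form).

    X: (m_inputs, n_dmus) input matrix, Y: (s_outputs, n_dmus) output matrix.
    Decision variables are [theta, lambda_1..lambda_n]; minimize theta
    subject to X @ lam <= theta * x_k,  Y @ lam >= y_k,  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimize theta
    A_in = np.hstack([-X[:, [k]], X])            # X lam - theta x_k <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y lam <= -y_k
    b_out = -Y[:, k]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1))
    return res.fun  # efficiency score in (0, 1]

# Hypothetical data: 2 inputs (deposits, expenses) and 1 output (financing)
X = np.array([[20., 30., 25., 40.],
              [ 5.,  8.,  6., 10.]])
Y = np.array([[100., 120., 110., 150.]])
for k in range(X.shape[1]):
    print(f"Bank {k}: efficiency = {dea_ccr_efficiency(X, Y, k):.3f}")
```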
NASA Astrophysics Data System (ADS)
Zhang, Wei; Rao, Qiaomeng
2018-01-01
In order to solve the problem of high speed, large capacity and limited spectrum resources in satellite communication networks, a double-layered satellite network with global seamless coverage based on laser and microwave hybrid links is proposed in this paper. By analyzing the characteristics of the double-layered satellite network with laser and microwave hybrid links, an effectiveness evaluation index system for the network is established. Then, the fuzzy analytic hierarchy process, which combines the analytic hierarchy process with fuzzy comprehensive evaluation theory, is used to evaluate the effectiveness of the double-layered satellite network with laser and microwave hybrid links. Furthermore, the evaluation result for the proposed hybrid-link network is obtained by simulation. The effectiveness evaluation process of the proposed double-layered satellite network with laser and microwave hybrid links can help to optimize the design of the hybrid-link double-layered satellite network and improve the operating efficiency of the satellite system.
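The AHP component of the fuzzy AHP described above derives index weights from pairwise comparison matrices. A minimal crisp sketch follows, assuming Saaty's 1-9 scale and the eigenvector method; the index names and comparison values are hypothetical, and the fuzzy comprehensive evaluation layer of the paper is not reproduced.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio from an AHP pairwise matrix.

    pairwise: (n, n) reciprocal comparison matrix (Saaty 1-9 scale).
    Returns the normalized principal eigenvector and the consistency
    ratio CR = CI / RI; CR below roughly 0.1 is conventionally acceptable.
    """
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    i = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, i].real)
    w /= w.sum()
    lam_max = eigvals[i].real
    ci = (lam_max - n) / (n - 1)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
          6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}[n]
    cr = ci / ri if ri > 0 else 0.0
    return w, cr

# Hypothetical comparisons among three top-level indices of the network,
# e.g. coverage, capacity and robustness
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights, cr = ahp_weights(A)
print(weights, cr)
```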
[Efficiency versus quality in the NHS, in Portugal: methodologies for evaluation].
Giraldes, Maria do Rosário
2008-01-01
To evaluate efficiency and quality in the NHS, based on methodologies for the evaluation of management, benchmarking indicators, and process and outcome indicators. The 1980s and 1990s saw the proliferation of all forms of process indicators as a way to control health services. It is no coincidence that the increase in managed care has been accompanied by an explosion of process indicators, as has happened in the health system of the USA. More recently, attention has turned away from measures of performance that assess the process (what has been done) to those that measure outcomes (what the result was). Quality indicators have been developed in Europe, first for use in hospitals, but also for use in primary health care. Conceptually, the justification for the introduction of process indicators comes from the principle that their use will reinforce improvements in the quality of procedures, which will give rise to better outcomes at the population level as well as resource savings. Comparing outcome indicators with process indicators in health care shows that process indicators have the advantage of being more sensitive than outcome indicators to differences in quality. Optimizing health care quality has the objective of establishing a quantitative relationship between the quality of health services and cost-effectiveness, of identifying quality and benchmarking indicators, and of implementing plans to measure the quality of health care. In a study of a group of senior GPs in the UK, with the objective of determining which process indicators best reflect the quality of primary health care services, a Delphi method was used. Only seven indicators were chosen by 75% of the respondents: the percentage of eligible patients receiving cervical screening; the percentage of generic prescribing; the percentage of eligible patients receiving childhood immunization; the percentage of eligible patients receiving influenza vaccinations; the ability to see a GP within 48 hours; the percentage prescribing antibacterial drugs; and primary care management (diabetes and asthma). The main characteristics of health indicators are: acceptability--the acceptability of the data collected using a measure will depend upon the extent to which the findings are acceptable to both those being assessed and those undertaking the assessment; feasibility--information about the quality of services is often driven by data availability rather than by epidemiological and clinical considerations, and quality measurement cannot be achieved without accurate and consistent information systems; reliability--indicators should be used to compare organisations/practitioners with similar organisations/practitioners; sensitivity to change--quality measures must be capable of detecting changes in quality of care in order to discriminate between and within subjects; validity--there has been little methodological scrutiny of the validity of consensus methods. Outcome indicators are not good performance indicators in health care: the variation in outcomes between providers of primary health care services is largely due to observed differences among users in age, sex, co-morbidity, severity and socio-economic situation. The Medical Outcomes Study, published in 1989, introduced for the first time subjective indicators, based on users' evaluations, as an important outcome indicator.
Clinical indicators are those most associated with outcomes. Few studies exist on the effects of management indicators on outcomes. Several indicators, however, reflect norms related to the workplace. The use of a Composite Indicator presents advantages. In England, a Composite Indicator built from process indicators was used in 302 primary health care organizations in 2001-2002. That study used a mathematical model to select the best indicators for evaluating performance, and it concluded that a Composite Indicator is easy to construct and interpret, acceptable, and valid. Giraldes (2007) evaluated health centres from a management and quality-of-delivery perspective using a Composite Indicator of Efficiency and Quality. It includes efficiency indicators for the main activities of the health centre--preventive activities, curative activities and drugs, by main pharmacotherapeutic groups, and auxiliary means of diagnosis (analyses, X-ray, echography and CT per user)--weighted according to the relevance of each expenditure in total expenditure. The Composite Quality Indicator includes 12 performance and 5 outcome indicators. Of the 10 best health centres from an efficiency and quality perspective, 3 are from the Porto Sub-Region (Negrelos, Rebordosa and Paredes) and 2 each from the Braga Sub-Region (Vila Verde and Vila Nova de Famalicão I), Leiria (Pedrogão Grande and Batalha), and Vila Real (Mesão Frio and Sabrosa), while 1 belongs to the Aveiro Sub-Region (Sever do Vouga). The most efficient health centres are from the Aveiro Sub-Region, followed by Braga, Porto, and Lisboa, sub-regions with very similar values. Giraldes (2007) also evaluated hospital expenditure per user from an efficiency perspective and assessed the quality of the health system through process and outcome indicators. From the efficiency perspective, the concept of technical efficiency was chosen, and a correction was also made for the case-mix index (CMI). The indicators were calculated per user for the main hospital activities (expenditure on inpatient care per treated patient, on the day hospital per treated patient, on outpatient care per consultation, etc.) as well as for the auxiliary sections of clinical support and the hotel support services. All the indicators were corrected according to the relevance of their expenditure in total expenditure. From a quality perspective, two types of indicators were considered: process indicators, such as the percentage of surgeries performed in ambulatory care, the percentage of cesareans in total deliveries and the autopsy rate; and an outcome indicator, the number of episodes of inpatient care due to surgical infection per total days of inpatient care. These indicators were aggregated by taking their mean. The Composite Indicator of Efficiency and Quality is the mean of the Composite Indicator of Efficiency and the Composite Indicator of Quality, the latter having been converted to an inverse scale.
NASA Technical Reports Server (NTRS)
Gurtler, R. W.; Baghdadi, A.
1977-01-01
A ribbon-to-ribbon process was used for routine growth of samples for analysis and fabrication into solar cells. One lot of solar cells was completely evaluated: ribbon solar cell efficiencies averaged 9.23% with a highest efficiency of 11.7%. Spherical reflectors have demonstrated significant improvements in laser silicon coupling efficiencies. Material analyses were performed including silicon photovoltage and open circuit photovoltage diffusion length measurements, crystal morphology studies, modulus of rupture measurements, and annealing/gettering studies. An initial economic analysis was performed indicating that ribbon-to-ribbon add-on costs of $.10/watt might be expected in the early 1980's.
Degradation of caffeine by conductive diamond electrochemical oxidation.
Indermuhle, Chloe; Martín de Vidales, Maria J; Sáez, Cristina; Robles, José; Cañizares, Pablo; García-Reyes, Juan F; Molina-Díaz, Antonio; Comninellis, Christos; Rodrigo, Manuel A
2013-11-01
The use of Conductive-Diamond Electrochemical Oxidation (CDEO) and Conductive-Diamond Sonoelectrochemical Oxidation (CDSEO) has been evaluated for the removal of caffeine from wastewater. The effects of initial concentration, current density and supporting electrolyte on process efficiency are assessed. Results show that caffeine is very efficiently removed with CDEO and that the depletion of caffeine has two stages depending on its concentration. At low concentrations, contrary to what is expected in a mass-transfer-controlled process, the efficiency increases very significantly with current density, suggesting a very important role of mediated oxidation processes in the removal of caffeine. In addition, the removal of caffeine is faster than that of TOC, indicating the formation of reaction intermediates. The number and relative abundance of these depend on the operating conditions and the supporting electrolyte used. In chloride media, removal of caffeine is faster and more efficient, although more intermediates occur. CDSEO does not increase the efficiency of caffeine removal, but it affects the formation of intermediates. A detailed characterization of intermediates by liquid chromatography time-of-flight mass spectrometry indicates that the degradation of caffeine by CDEO follows an oxidation pathway similar to the mechanisms proposed for other advanced oxidation processes. Copyright © 2013 Elsevier Ltd. All rights reserved.
Kübler, Andrea; Holz, Elisa M; Riccio, Angela; Zickler, Claudia; Kaufmann, Tobias; Kleih, Sonja C; Staiger-Sälzer, Pit; Desideri, Lorenzo; Hoogerwerf, Evert-Jan; Mattia, Donatella
2014-01-01
Although research on brain-computer interfaces (BCI) for controlling applications has expanded tremendously, we still face a translational gap when bringing BCI to end-users. To bridge this gap, we adapted the user-centered design (UCD) to BCI research and development, which implies a shift from focusing on single aspects, such as accuracy and information transfer rate (ITR), to a more holistic user experience. The UCD implements an iterative process between end-users and developers based on a valid evaluation procedure. Within the UCD framework the usability of a device can be defined with regard to its effectiveness, efficiency, and satisfaction. We operationalized these aspects to evaluate BCI-controlled applications. Effectiveness was regarded as equivalent to the accuracy of selections, and efficiency to the amount of information transferred per time unit and the effort invested (workload). Satisfaction was assessed with questionnaires and visual-analogue scales. These metrics have been successfully applied to several BCI-controlled applications for communication and entertainment, which were evaluated by end-users with severe motor impairment. Results of four studies, involving a total of N = 19 end-users, revealed that effectiveness was moderate to high; efficiency in terms of ITR was low to high and workload low to medium; and, depending on the match between user and technology and the type of application, satisfaction was moderate to high. The evaluation metrics suggested here within the framework of the UCD proved to be an applicable and informative approach to evaluate BCI-controlled applications, and end-users with severe impairment and in the locked-in state were able to participate in this process.
Application of micronucleus test and comet assay to evaluate BTEX biodegradation.
Mazzeo, Dânia Elisa Christofoletti; Matsumoto, Silvia Tamie; Levy, Carlos Emílio; de Angelis, Dejanira de Franceschi; Marin-Morales, Maria Aparecida
2013-01-01
The BTEX (benzene, toluene, ethylbenzene and xylene) mixture is an environmental pollutant that has a high potential to contaminate water resources, especially groundwater. The bioremediation process by microorganisms has often been used as a tool for removing BTEX from contaminated sites. The application of biological assays is useful in evaluating the efficiency of bioremediation processes, besides identifying the toxicity of the original contaminants. It also allows identifying the effects of possible metabolites formed during the biodegradation process on test organisms. In this study, we evaluated the genotoxic and mutagenic potential of five different BTEX concentrations in rat hepatoma tissue culture (HTC) cells, using comet and micronucleus assays, before and after biodegradation. A mutagenic effect was observed for the highest concentration tested and for its respective non-biodegraded concentration. Genotoxicity was significant for all non-biodegraded concentrations and not significant for the biodegraded ones. According to our results, we can state that BTEX is mutagenic at concentrations close to its water solubility, and genotoxic even at lower concentrations, differing from some described results reported for the mixture components, when tested individually. Our results suggest a synergistic effect for the mixture and that the biodegradation process is a safe and efficient methodology to be applied at BTEX-contaminated sites. Copyright © 2012 Elsevier Ltd. All rights reserved.
Rebar, Amanda L.; Ram, Nilam; Conroy, David E.
2014-01-01
Objective: The Single-Category Implicit Association Test (SC-IAT) has been used as a method for assessing automatic evaluations of physical activity, but measurement artifact or consciously-held attitudes could be confounding the outcome scores of these measures. The objective of these two studies was to address these measurement concerns by testing the validity of a novel SC-IAT scoring technique. Design: Study 1 was a cross-sectional study, and study 2 was a prospective study. Method: In study 1, undergraduate students (N = 104) completed SC-IATs for physical activity, flowers, and sedentary behavior. In study 2, undergraduate students (N = 91) completed a SC-IAT for physical activity, self-reported affective and instrumental attitudes toward physical activity, physical activity intentions, and wore an accelerometer for two weeks. The EZ-diffusion model was used to decompose the SC-IAT into three process component scores including the information processing efficiency score. Results: In study 1, a series of structural equation model comparisons revealed that the information processing score did not share variability across distinct SC-IATs, suggesting it does not represent systematic measurement artifact. In study 2, the information processing efficiency score was shown to be unrelated to self-reported affective and instrumental attitudes toward physical activity, and positively related to physical activity behavior, above and beyond the traditional D-score of the SC-IAT. Conclusions: The information processing efficiency score is a valid measure of automatic evaluations of physical activity. PMID:25484621
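For context, the EZ-diffusion decomposition referred to above has a widely used closed-form version (Wagenmakers et al., 2007) that maps accuracy and response-time summaries to a drift rate (information processing efficiency), a boundary separation, and a non-decision time. The sketch below shows that textbook estimator under its usual assumptions; the exact scoring pipeline used in the study (e.g., edge corrections for extreme accuracy) may differ, and the input values are hypothetical.

```python
import numpy as np

def ez_diffusion(pc, vrt, mrt, s=0.1):
    """Closed-form EZ-diffusion estimates (Wagenmakers et al., 2007 form).

    pc : proportion of correct responses (0 < pc < 1, pc != 0.5)
    vrt: variance of correct response times (s^2)
    mrt: mean of correct response times (s)
    Returns drift rate v (information processing efficiency),
    boundary separation a, and non-decision time ter.
    """
    L = np.log(pc / (1.0 - pc))                      # logit of accuracy
    x = L * (L * pc**2 - L * pc + pc - 0.5) / vrt
    v = np.sign(pc - 0.5) * s * x**0.25              # drift rate
    a = s**2 * L / v                                 # boundary separation
    y = -v * a / s**2
    mdt = (a / (2 * v)) * (1 - np.exp(y)) / (1 + np.exp(y))
    ter = mrt - mdt                                  # non-decision time
    return v, a, ter

# Hypothetical SC-IAT summary statistics for one participant
print(ez_diffusion(pc=0.85, vrt=0.12, mrt=0.65))
```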
NASA Astrophysics Data System (ADS)
Blanco, K.; Aponte, H.; Vera, E.
2017-12-01
Throughout the industrial sector it is important to extend the useful life of the materials used in production processes. CaCO3 scale is common in situations where fluids with high ion concentrations are handled, together with elevated temperatures and dissolved CO2 concentrations; such scale generates large annual losses through reduced process efficiency and under-deposit corrosion damage, among other effects. In order to find new alternatives to this problem, citric acid was evaluated as a calcium carbonate scale inhibitor under critical conditions of temperature and dissolved CO2 concentration. Once the results were obtained, a statistical evaluation was carried out to generate an equation describing this behaviour; it showed good inhibition efficiency under the conditions evaluated. The scale products obtained were characterized by scanning electron microscopy.
Evaluation of the course of the vehicle braking process in case of hydraulic circuit malfunction
NASA Astrophysics Data System (ADS)
Szczypiński-Sala, W.; Lubas, J.
2016-09-01
In the paper, the results of the research are discussed, the aim of which was the evaluation of vehicle braking performance efficiency and the course of this process with regard to dysfunctions which may occur in the braking hydraulic circuit. As part of the research, on-road tests were conducted. During the research, the deceleration of the vehicle when braking was measured with a set of sensors placed along the parallel and perpendicular axes of the vehicle. All the tests were conducted on the same flat section of asphalt road with a wet surface. Conditions of diminished tire-to-road adhesion were chosen in order to force the activity of the anti-lock braking system. The research was conducted comparatively for the vehicle with the anti-lock braking system active and subsequently for the vehicle without the system. In both cases, the course of braking was evaluated with a fully functional braking system and with a malfunction of the hydraulic circuit.
Park, Sora; Seon, Jiyun; Byun, Imgyu; Cho, Sunja; Park, Taejoo; Lee, Taeho
2010-05-01
The applicability of modified spent caustic (MSC) as an electron donor for denitrification was evaluated in a lab-scale Bardenpho process reactor under various electron donor conditions: (A) no electron donor, (B) methanol, (C) thiosulfate and (D) MSC. TN removal efficiency varied with each condition: 23.1%, 87.8%, 83.7% and 71.7%, respectively. The distribution ratio of nitrifying bacteria and the DGGE profile, including sulfur-reducing or sulfur-oxidizing bacteria, also varied depending on the conditions. These results indicated that MSC could be used as an efficient electron donor for denitrification by autotrophic denitrifiers in the wastewater treatment process. Copyright 2009 Elsevier Ltd. All rights reserved.
Simulation-Based Learning: The Learning-Forgetting-Relearning Process and Impact of Learning History
ERIC Educational Resources Information Center
Davidovitch, Lior; Parush, Avi; Shtub, Avy
2008-01-01
The results of empirical experiments evaluating the effectiveness and efficiency of the learning-forgetting-relearning process in a dynamic project management simulation environment are reported. Sixty-six graduate engineering students performed repetitive simulation-runs with a break period of several weeks between the runs. The students used a…
Privatization of Higher Education in Nigeria: Critical Issues
ERIC Educational Resources Information Center
Okunola, Philips Olayide; Oladipo, Simeon Adebayo
2012-01-01
The broad intent of any educational reform is premised on the assumption that it is capable of improving educational process and practices, hence, the need for evaluation of the system's process in order to determine the efficiency and effectiveness of resource allocation. Education is capital intensive in terms of human, financial and material…
Cache write generate for parallel image processing on shared memory architectures.
Wittenbrink, C M; Somani, A K; Chen, C H
1996-01-01
We investigate cache write generate, our cache mode invention. We demonstrate that for parallel image processing applications, the new mode improves main memory bandwidth, CPU efficiency, cache hits, and cache latency. We use register level simulations validated by the UW-Proteus system. Many memory, cache, and processor configurations are evaluated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, Alberta; Mann, Margaret; Gelman, Rachel
In evaluating next-generation materials and processes, the supply chain can have a large impact on the life cycle energy impacts. The Materials Flow through Industry (MFI) tool was developed for the Department of Energy's Advanced Manufacturing Office to be able to evaluate the energy impacts of the U.S. supply chain. The tool allows users to perform process comparisons, material substitutions, and grid modifications, and to see the effects of implementing sector efficiency potentials (Masanet, et al. 2009). This paper reviews the methodology of the tool and provides results around specific scenarios.
A framework for the direct evaluation of large deviations in non-Markovian processes
NASA Astrophysics Data System (ADS)
Cavallaro, Massimo; Harris, Rosemary J.
2016-11-01
We propose a general framework to simulate stochastic trajectories with arbitrarily long memory dependence and efficiently evaluate large deviation functions associated with time-extensive observables. This extends the ‘cloning’ procedure of Giardinà et al (2006 Phys. Rev. Lett. 96 120603) to non-Markovian systems. We demonstrate the validity of this method by testing non-Markovian variants of an ion-channel model and the totally asymmetric exclusion process, recovering results obtainable by other means.
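For orientation, the Markovian, discrete-time core of the cloning procedure can be sketched as follows; the transition matrix, observable increments, and parameter values are hypothetical, and the non-Markovian generalization that is the paper's actual contribution is not reproduced here.

```python
import numpy as np

def scgf_cloning(P, g, s, n_clones=1000, n_steps=1000, seed=0):
    """Estimate the scaled cumulant generating function psi(s) of a
    time-additive observable by the cloning (population dynamics) method.

    P : (n, n) transition matrix of a discrete-time Markov chain
    g : (n, n) increment of the observable for a jump i -> j
    At each step every clone jumps, is weighted by exp(s * increment),
    and the population is resampled in proportion to the weights;
    psi(s) is estimated as the average log of the mean weight per step.
    """
    rng = np.random.default_rng(seed)
    n = P.shape[0]
    states = rng.integers(0, n, size=n_clones)
    log_mean_weights = 0.0
    for _ in range(n_steps):
        new_states = np.array([rng.choice(n, p=P[x]) for x in states])
        weights = np.exp(s * g[states, new_states])
        log_mean_weights += np.log(weights.mean())
        # Resample clones in proportion to their weights
        idx = rng.choice(n_clones, size=n_clones, p=weights / weights.sum())
        states = new_states[idx]
    return log_mean_weights / n_steps

# Hypothetical two-state chain; the observable counts 0 -> 1 jumps
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
g = np.array([[0.0, 1.0],
              [0.0, 0.0]])
print(scgf_cloning(P, g, s=0.5))
```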
Rico, Carlos; Montes, Jesús A; Rico, José Luis
2017-08-01
Three different types of anaerobic sludge (granular, thickened digestate and anaerobic sewage) were evaluated as seed inoculum sources for the high-rate anaerobic digestion of pig slurry in UASB reactors. Granular sludge performance was optimal, allowing a high-efficiency process yielding a volumetric methane production rate of 4.1 L CH4 L^-1 d^-1 at 1.5 days HRT (0.248 L CH4 g^-1 COD) at an organic loading rate of 16.4 g COD L^-1 d^-1. The thickened digestate sludge experienced flotation problems and thus proved inappropriate for the UASB process. The anaerobic sewage sludge reactor experienced biomass wash-out, but allowed high process efficiency operation at 3 days HRT, yielding a volumetric methane production rate of 1.7 L CH4 L^-1 d^-1 (0.236 L CH4 g^-1 COD) at an organic loading rate of 7.2 g COD L^-1 d^-1. To guarantee the success of the UASB process, the settleable solids of the slurry must be removed beforehand. Copyright © 2017 Elsevier Ltd. All rights reserved.
The economic production of alcohol fuels from coal-derived synthesis gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kugler, E.L.; Dadyburjor, D.B.; Yang, R.Y.K.
1995-12-31
The objectives of this project are to: (1) discover, study and evaluate novel heterogeneous catalytic systems for the production of oxygenated fuel enhancers from synthesis gas; specifically, alternative methods of preparing catalysts are to be investigated, and novel catalysts, including sulfur-tolerant ones, are to be pursued (Task 1); (2) explore, analytically and on the bench scale, novel reactor and process concepts for use in converting syngas to liquid fuel products (Task 1); (3) simulate by computer the most energy-efficient and economically efficient process for converting coal to energy, with primary focus on converting syngas to fuel alcohols (Task 2); (4) develop on the bench scale the best holistic combination of chemistry, catalyst, reactor and total process configuration, integrated with the overall coal conversion process, to achieve economic optimization for the conversion of syngas to liquid products within the framework of achieving the maximum cost-effective transformation of coal to energy equivalents (Tasks 1 and 2); and (5) evaluate the combustion, emission and performance characteristics of fuel alcohols and blends of alcohols with petroleum-based fuels (Task 2).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Eric C; Smith, Raymond; Ruiz-Mercado, Gerardo
This presentation examines different methods for analyzing manufacturing processes in the early stages of technical readiness. Before developers know much detail about their processes, it is valuable to apply various assessments to evaluate their performance. One type of assessment evaluates performance indicators to describe how closely processes approach desirable objectives. Another type of assessment determines the life cycle inventories (LCI) of inputs and outputs for processes, where for a functional unit of product, the user evaluates the resources used and the releases to the environment. These results can be compared to similar processes or combined with the LCI of other processes to examine up- and down-stream chemicals. The inventory also provides a listing of the up-stream chemicals, which permits study of the whole life cycle. Performance indicators are evaluated in this presentation with the U.S. Environmental Protection Agency's GREENSCOPE (Gauging Reaction Effectiveness for ENvironmental Sustainability with a multi-Objective Process Evaluator) methodology, which evaluates processes in four areas: Environment, Energy, Economics, and Efficiency. The method develops relative scores for indicators that allow comparisons across various technologies. In this contribution, two conversion pathways for producing cellulosic ethanol from biomass, via thermochemical and biochemical routes, are studied. The information developed from the indicators and LCI can be used to inform the process design and the potential life cycle effects of up- and down-stream chemicals.
Towards an evaluation framework for Laboratory Information Systems.
Yusof, Maryati M; Arifin, Azila
Laboratory testing and reporting are error-prone and redundant due to repeated, unnecessary requests and delayed or missed reactions to laboratory reports. Occurring errors may negatively affect the patient treatment process and clinical decision making. Evaluation of laboratory testing and the Laboratory Information System (LIS) may explain the root causes, helping to improve the testing process and enhance LIS support for it. This paper discusses a new evaluation framework for LIS that encompasses the laboratory testing cycle and the socio-technical part of LIS. A literature review was conducted on the discourses, dimensions and evaluation methods of laboratory testing and LIS. A critical appraisal of the Total Testing Process (TTP) and the human, organization, technology-fit (HOT-fit) evaluation frameworks was undertaken in order to identify error incidents, their contributing factors and preventive actions pertinent to the laboratory testing process and LIS. A new evaluation framework for LIS using a comprehensive and socio-technical approach is outlined. A positive relationship between laboratory and clinical staff resulted in a smooth laboratory testing process, reduced errors and increased process efficiency, whilst effective use of LIS streamlined the testing processes. The TTP-LIS framework could serve as an assessment as well as a problem-solving tool for the laboratory testing process and system. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.
Framework for the Intelligent Transportation System (ITS) Evaluation : ITS Integration Activities
DOT National Transportation Integrated Search
2006-08-01
Intelligent Transportation Systems (ITS) represent a significant opportunity to improve the efficiency and safety of the surface transportation system. ITS includes technologies to support information processing, communications, surveillance and cont...
Solar thermal technology evaluation, fiscal year 1982. Volume 2: Technical
NASA Technical Reports Server (NTRS)
1983-01-01
The technology base of solar thermal energy is investigated. The materials, components, subsystems, and processes capable of meeting specific energy cost targets are emphasized, as are system efficiency and reliability.
IRB Process Improvements: A Machine Learning Analysis.
Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A
2017-06-01
Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
Recurrent neural network based virtual detection line
NASA Astrophysics Data System (ADS)
Kadikis, Roberts
2018-04-01
The paper proposes an efficient method for the detection of moving objects in video. The objects are detected when they cross a virtual detection line. Only the pixels of the detection line are processed, which makes the method computationally efficient. A Recurrent Neural Network processes these pixels. The machine learning approach allows one to train a model that works in different and changing outdoor conditions. Also, the same network can be trained for various detection tasks, which is demonstrated by tests on vehicle and people counting. In addition, the paper proposes a method for the semi-automatic acquisition of labeled training data. The labeling method is used to create training and testing datasets, which in turn are used to train and evaluate the accuracy and efficiency of the detection method. The method shows accuracy similar to alternative efficient methods but provides greater adaptability and usability for different tasks.
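A minimal sketch of the kind of recurrent model described above, written with PyTorch and assuming a GRU over the per-frame pixel values of the detection line; the layer sizes, class count, and names are illustrative assumptions and the paper's exact architecture may differ.

```python
import torch
import torch.nn as nn

class DetectionLineRNN(nn.Module):
    """GRU over per-frame intensities of a virtual detection line.

    Each time step receives the pixel values sampled along the line in
    one frame; the hidden state summarizes recent frames and a linear
    head emits a per-frame score (e.g. vehicle / no crossing).
    """
    def __init__(self, line_pixels=64, hidden=128, n_classes=2):
        super().__init__()
        self.rnn = nn.GRU(input_size=line_pixels, hidden_size=hidden,
                          batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, frames, line_pixels)
        out, _ = self.rnn(x)         # (batch, frames, hidden)
        return self.head(out)        # per-frame class logits

# Hypothetical batch: 8 clips, 100 frames, 64 pixels on the line
model = DetectionLineRNN()
logits = model(torch.randn(8, 100, 64))
print(logits.shape)                  # torch.Size([8, 100, 2])
```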
Quantitative basis for component factors of gas flow proportional counting efficiencies
NASA Astrophysics Data System (ADS)
Nichols, Michael C.
This dissertation investigates the counting efficiency calibration of a gas flow proportional counter with beta-particle emitters in order to (1) determine by measurements and simulation the values of the component factors of beta-particle counting efficiency for a proportional counter, (2) compare the simulation results and measured counting efficiencies, and (3) determine the uncertainty of the simulation and measurements. Monte Carlo simulation results by the MCNP5 code were compared with measured counting efficiencies as a function of sample thickness for 14C, 89Sr, 90Sr, and 90Y. The Monte Carlo model simulated strontium carbonate with areal thicknesses from 0.1 to 35 mg cm-2. The samples were precipitated as strontium carbonate with areal thicknesses from 3 to 33 mg cm-2 , mounted on membrane filters, and counted on a low background gas flow proportional counter. The estimated fractional standard deviation was 2--4% (except 6% for 14C) for efficiency measurements of the radionuclides. The Monte Carlo simulations have uncertainties estimated to be 5 to 6 percent for carbon-14 and 2.4 percent for strontium-89, strontium-90, and yttrium-90. The curves of simulated counting efficiency vs. sample areal thickness agreed within 3% of the curves of best fit drawn through the 25--49 measured points for each of the four radionuclides. Contributions from this research include development of uncertainty budgets for the analytical processes; evaluation of alternative methods for determining chemical yield critical to the measurement process; correcting a bias found in the MCNP normalization of beta spectra histogram; clarifying the interpretation of the commonly used ICRU beta-particle spectra for use by MCNP; and evaluation of instrument parameters as applied to the simulation model to obtain estimates of the counting efficiency from simulated pulse height tallies.
The Brain as a Distributed Intelligent Processing System: An EEG Study
da Rocha, Armando Freitas; Rocha, Fábio Theoto; Massad, Eduardo
2011-01-01
Background: Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. Methodology and Principal Findings: In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligent Processing System, was used to investigate the correlations between IQ evaluated with the WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. Conclusion: The present results support these claims and the neural efficiency hypothesis. PMID:21423657
Gao, Pin; Ding, Yunjie; Li, Hui; Xagoraraki, Irene
2012-06-01
Occurrence and removal efficiencies of fifteen pharmaceuticals were investigated in a conventional municipal wastewater treatment plant in Michigan. Concentrations of these pharmaceuticals were determined in both wastewater and sludge phases by a high-performance liquid chromatograph coupled to a tandem mass spectrometer. Detailed mass balance analysis was conducted during the whole treatment process to evaluate the contributing processes for pharmaceutical removal. Among the pharmaceuticals studied, demeclocycline, sulfamerazine, erythromycin and tylosin were not detected in the wastewater treatment plant influent. Other target pharmaceuticals detected in wastewater were also found in the corresponding sludge phase. The removal efficiencies of chlortetracycline, tetracycline, sulfamerazine, acetaminophen and caffeine were >99%, while doxycycline, oxytetracycline, sulfadiazine and lincomycin exhibited relatively lower removal efficiencies (e.g., <50%). For sulfamethoxazole, the removal efficiency was approximately 90%. Carbamazepine manifested a net increase of mass, i.e. 41% more than the input from the influent. Based on the mass balance analysis, biotransformation is believed to be the predominant process responsible for the removal of pharmaceuticals (22% to 99%), whereas contribution of sorption to sludge was relatively insignificant (7%) for the investigated pharmaceuticals. Copyright © 2012 Elsevier Ltd. All rights reserved.
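The mass balance reasoning above can be made concrete with a small accounting sketch; the loads below are hypothetical and the function is purely illustrative, not the authors' computation. A negative biotransformation share (as found for carbamazepine) would indicate net in-plant release, for example from cleavage of conjugated forms.

```python
def removal_contributions(m_influent, m_effluent, m_sludge):
    """Apportion pharmaceutical removal from a plant-wide mass balance.

    All masses are loads over the same period (e.g. g/day). Whatever mass
    is neither in the effluent nor sorbed to wasted sludge is attributed
    to biotransformation (or other losses).
    """
    removed = m_influent - m_effluent
    biotransformed = m_influent - m_effluent - m_sludge
    return {
        "overall_removal_%": 100.0 * removed / m_influent,
        "sorption_to_sludge_%": 100.0 * m_sludge / m_influent,
        "biotransformation_%": 100.0 * biotransformed / m_influent,
    }

# Hypothetical loads for one compound (g/day)
print(removal_contributions(m_influent=10.0, m_effluent=1.0, m_sludge=0.7))
```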
IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
William M. Bond; Salih Ersayin
2007-03-30
This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming the results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, a concentrator energy efficiency improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern Minnesota, and future proposals are pending with non-taconite mineral processing applications.
Fuzzy Evaluating Customer Satisfaction of Jet Fuel Companies
NASA Astrophysics Data System (ADS)
Cheng, Haiying; Fang, Guoyi
Based on the market characteristics of jet fuel companies, the paper proposes an evaluation index system for jet fuel company customer satisfaction covering five dimensions: time, business, security, fee and service. A multi-level fuzzy evaluation model combining the analytic hierarchy process approach with the fuzzy evaluation approach is given. Finally, a customer satisfaction evaluation of one jet fuel company is studied as a case; the evaluation results reflect the feelings of the jet fuel company's customers, which shows that the fuzzy evaluation model is effective and efficient.
Effective clinical education: strategies for teaching medical students and residents in the office.
Cayley, William E
2011-08-01
Educating medical students and residents in the office presents the challenges of providing quality medical care, maintaining efficiency, and incorporating meaningful education for learners. Numerous teaching strategies to address these challenges have been described in the medical educational literature, but only a few teaching strategies have been evaluated for their impact on education and office practice. Literature on the impact of office-based teaching strategies on educational outcomes and on office efficiency was selected from a Pub Med search, from review of references in retrieved articles, and from the author's personal files. Two teaching strategies, "one-minute preceptor" (OMP) and "SNAPPS," have been shown to improve educational processes and outcomes. Two additional strategies, "Aunt Minnie" pattern recognition and "activated demonstration," show promise but have not been fully evaluated. None of these strategies has been shown to improve office efficiency. OMP and SNAPPS are strategies that can be used in office precepting to improve educational processes and outcomes, while pattern recognition and activated demonstration show promise but need further assessment. Additional areas of research also are suggested.
Carbon-free hydrogen production from low rank coal
NASA Astrophysics Data System (ADS)
Aziz, Muhammad; Oda, Takuya; Kashiwagi, Takao
2018-02-01
A novel carbon-free integrated system for hydrogen production and storage from low rank coal is proposed and evaluated. To determine the optimum energy efficiency, two different systems employing different chemical looping technologies are modeled. The first integrated system consists of coal drying, gasification, syngas chemical looping, and hydrogenation. The second system combines coal drying, coal direct chemical looping, and hydrogenation. In addition, in order to cover the consumed electricity and recover the energy, a combined cycle is adopted as an additional module for power generation. The objective of the study is to find the best system, having the highest performance in terms of total energy efficiency, including hydrogen production efficiency and power generation efficiency. To achieve thorough energy/heat circulation throughout each module and the whole integrated system, enhanced process integration technology is employed. It incorporates two core technologies: exergy recovery and process integration. Several operating parameters, including the target moisture content in the drying module and the operating pressure in the chemical looping module, are examined in terms of their influence on energy efficiency. Process modeling and calculation show that both integrated systems can achieve high total energy efficiency, above 60%. However, the system employing coal direct chemical looping shows higher energy efficiency, including hydrogen production and power generation, of about 83%. In addition, the optimum target moisture content in drying and the optimum operating pressure in chemical looping have also been determined.
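The total energy efficiency quoted above is a ratio of useful outputs (hydrogen plus net power) to the coal energy input; a hedged sketch of that figure of merit is given below, with symbols that are assumptions rather than the authors' notation.

```latex
% Total energy efficiency of the integrated system (sketch; symbols assumed):
%   \dot{m}_{H_2}\,LHV_{H_2}     - chemical energy rate of the produced hydrogen
%   W_{net}                      - net electric power from the combined cycle
%                                  minus auxiliary consumption
%   \dot{m}_{coal}\,LHV_{coal}   - energy rate of the low rank coal feed
\[
  \eta_{\text{total}}
    \;=\;
  \frac{\dot{m}_{\mathrm{H_2}}\,\mathrm{LHV}_{\mathrm{H_2}} + W_{\mathrm{net}}}
       {\dot{m}_{\mathrm{coal}}\,\mathrm{LHV}_{\mathrm{coal}}}
\]
```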
Quality measurement and benchmarking of HPV vaccination services: a new approach.
Maurici, Massimo; Paulon, Luca; Campolongo, Alessandra; Meleleo, Cristina; Carlino, Cristiana; Giordani, Alessandro; Perrelli, Fabrizio; Sgricia, Stefano; Ferrante, Maurizio; Franco, Elisabetta
2014-01-01
A new measurement process based upon a well-defined mathematical model was applied to evaluate the quality of human papillomavirus (HPV) vaccination centers in 3 of 12 Local Health Units (ASLs) within the Lazio Region of Italy. The quality aspects considered for evaluation were communicational efficiency, organizational efficiency and comfort. The overall maximum achievable value was 86.10%, while the HPV vaccination quality scores for ASL1, ASL2 and ASL3 were 73.07%, 71.08%, and 67.21%, respectively. With this new approach it is possible to represent the probabilistic reasoning of a stakeholder who evaluates the quality of a healthcare provider. All ASLs had margins for improvement, and optimal quality results can be assessed in terms of better performance conditions, confirming the relationship between the resulting quality scores and HPV vaccination coverage. The measurement process was structured into three steps and involved four stakeholder categories: doctors, nurses, parents and vaccinated women. In Step 1, questionnaires were administered to collect different stakeholders' points of view (i.e., subjective data), which were processed to obtain the best and worst performance conditions for delivering a healthcare service. Step 2 involved the gathering of performance data during service delivery (i.e., objective data collection). Step 3 involved the elaboration of all data: subjective data from Step 1 are used to define a "standard" against which the objective data from Step 2 are tested. This entire process led to the creation of a set of scorecards. Benchmarking is presented as a result of the probabilistic meaning of the evaluated scores.
DOT National Transportation Integrated Search
1996-08-01
The Intermodal Surface Transportation Efficiency Act of 1991 (ISTEA) requested that at least two urban transit investment projects be acquired by means of a process referred to as "turnkey", to demonstrate the concept and determine whether it can ser...
Improving operating room productivity via parallel anesthesia processing.
Brown, Michael J; Subramanian, Arun; Curry, Timothy B; Kor, Daryl J; Moran, Steven L; Rohleder, Thomas R
2014-01-01
Parallel processing of regional anesthesia may improve operating room (OR) efficiency in patients undergoing upper extremity surgical procedures. The purpose of this paper is to evaluate whether performing regional anesthesia outside the OR in parallel increases total cases per day and improves efficiency and productivity. Data from all adult patients who underwent regional anesthesia as their primary anesthetic for upper extremity surgery over a one-year period were used to develop a simulation model. The model evaluated pure operating modes of regional anesthesia performed within the OR and outside the OR in parallel. The scenarios were used to evaluate how many surgeries could be completed in a standard work day (555 minutes) and, assuming a standard three cases per day, what end-of-day overtime was predicted. Modeling results show that parallel processing of regional anesthesia increases the average cases per day for all surgeons included in the study; the average increase was 0.42 surgeries per day. When it was assumed that all surgeons performed three cases per day, the number of days running into overtime was reduced by 43 percent with parallel block administration, and the overtime with parallel anesthesia was projected to be 40 minutes less per day per surgeon. Key limitations include the assumption that all cases used regional anesthesia in the comparisons, whereas many days may include both regional and general anesthesia; as a single-center case study, generalizability may also be limited. Perioperative care providers should consider parallel administration of regional anesthesia where there is a desire to increase daily upper extremity surgical case capacity. Where there are sufficient resources for parallel anesthesia processing, efficiency and productivity can be significantly improved. Simulation modeling can be an effective tool to show the system-wide effects of practice change.
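A minimal Monte Carlo sketch of the comparison described above is given below; it is not the authors' validated simulation model, and all case-duration distributions are hypothetical assumptions introduced for illustration.

```python
import random

# Hedged sketch: compare how many upper-extremity cases fit in a 555-minute OR day when the
# regional block is done inside the OR (sequential) versus outside the OR in parallel with
# room turnover. All durations are hypothetical.

def cases_per_day(parallel_block, day_minutes=555, n_days=10_000, seed=1):
    rng = random.Random(seed)
    totals = []
    for _ in range(n_days):
        t, cases = 0.0, 0
        while True:
            block = rng.gauss(20, 5)        # regional anesthesia time (min)
            surgery = rng.gauss(90, 20)     # surgical time (min)
            turnover = rng.gauss(25, 5)     # room cleaning/setup (min)
            # In parallel mode the block overlaps turnover; only the excess adds to OR time.
            case_time = surgery + turnover + (max(block - turnover, 0.0) if parallel_block else block)
            if t + case_time > day_minutes:
                break
            t += case_time
            cases += 1
        totals.append(cases)
    return sum(totals) / n_days

print("sequential block:", cases_per_day(False))
print("parallel block:  ", cases_per_day(True))
```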
A prototype software methodology for the rapid evaluation of biomanufacturing process options.
Chhatre, Sunil; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel J; Newcombe, Anthony R; Keshavarz-Moore, Eli
2007-10-01
A three-layered simulation methodology is described that rapidly evaluates biomanufacturing process options. In each layer, inferior options are screened out, while more promising candidates are evaluated further in the subsequent, more refined layer, which uses more rigorous models that require more data from time-consuming experimentation. Screening ensures laboratory studies are focused only on options showing the greatest potential. To simplify the screening, outputs of production level, cost and time are combined into a single value using multi-attribute-decision-making techniques. The methodology was illustrated by evaluating alternatives to an FDA (U.S. Food and Drug Administration)-approved process manufacturing rattlesnake antivenom. Currently, antivenom antibodies are recovered from ovine serum by precipitation/centrifugation and proteolyzed before chromatographic purification. Alternatives included increasing the feed volume, replacing centrifugation with microfiltration and replacing precipitation/centrifugation with a Protein G column. The best alternative used a higher feed volume and a Protein G step. By rapidly evaluating the attractiveness of options, the methodology facilitates efficient and cost-effective process development.
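The screening step can be illustrated with a small multi-attribute scoring sketch; the options, attribute values and weights below are hypothetical and serve only to show how production level, cost and time might be folded into a single value for ranking.

```python
# Hedged sketch of the screening idea (weights and option data are invented, not the paper's):
# combine output level, cost, and time for each biomanufacturing option into one score so the
# weakest candidates can be screened out before detailed modelling.

options = {
    "baseline precipitation/centrifugation": {"yield_g": 120, "cost_k": 500, "time_h": 48},
    "higher feed volume + Protein G column": {"yield_g": 180, "cost_k": 620, "time_h": 36},
    "microfiltration instead of centrifuge": {"yield_g": 130, "cost_k": 540, "time_h": 44},
}
weights = {"yield_g": 0.5, "cost_k": 0.3, "time_h": 0.2}   # must sum to 1
higher_is_better = {"yield_g": True, "cost_k": False, "time_h": False}

def normalise(attr, value):
    vals = [o[attr] for o in options.values()]
    lo, hi = min(vals), max(vals)
    x = (value - lo) / (hi - lo) if hi > lo else 1.0
    return x if higher_is_better[attr] else 1.0 - x

scores = {name: sum(weights[a] * normalise(a, o[a]) for a in weights)
          for name, o in options.items()}
print(max(scores, key=scores.get), scores)
```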
Investigation of Test Methods, Material Properties and Processes for Solar Cell Encapsulants
NASA Technical Reports Server (NTRS)
Willis, P.; Baum, B.
1982-01-01
The evaluation of potentially useful low cost encapsulation materials is discussed. The goal is to identify, evaluate, test and recommend encapsulant materials and processes for the production of cost effective, long life solar cell modules. Technical investigations concerned the development of advanced cure chemistries for lamination-type pottants, the continued evaluation of soil-resistant surface treatments, and the results of an accelerated aging test program for the comparison of material stabilities. New compounds were evaluated for efficiency in curing both ethylene/vinyl acetate and ethylene/methyl acrylate pottants intended for vacuum bag lamination of solar cells. Two-component aliphatic urethane casting syrups were evaluated for suitability as solar module pottants on the basis of optical, physical and fabrication characteristics.
Electrosprayed chitosan nanoparticles: facile and efficient approach for bacterial transformation
NASA Astrophysics Data System (ADS)
Abyadeh, Morteza; Sadroddiny, Esmaeil; Ebrahimi, Ammar; Esmaeili, Fariba; Landi, Farzaneh Saeedi; Amani, Amir
2017-12-01
A rapid and efficient procedure for DNA transformation is a key prerequisite for successful cloning and genomic studies. While there have been efforts to develop a facile method, the efficiencies obtained so far for alternative methods have been unsatisfactory (i.e., 10^5-10^6 CFU/μg plasmid) compared with the conventional method (up to 10^8 CFU/μg plasmid). In this work, for the first time, we prepared chitosan/pDNA nanoparticles by electrospraying to improve the transformation process. The electrospray method was used to produce chitosan/pDNA nanoparticles for investigating transformation efficiency in non-competent bacteria; in addition, the effects of chitosan molecular weight, N/P ratio and nanoparticle size on non-competent bacterial transformation efficiency were evaluated. The results showed that transformation efficiency increased with decreasing molecular weight, N/P ratio and nanoparticle size. A transformation efficiency of 1.7 × 10^8 CFU/μg plasmid was obtained with a chitosan molecular weight, N/P ratio and nanoparticle size of 30 kDa, 1 and 125 nm, respectively. Chitosan/pDNA electrosprayed nanoparticles were thus produced and the effects of molecular weight, N/P ratio and nanoparticle size on transformation efficiency were evaluated. Overall, we present a facile and rapid method for bacterial transformation with an efficiency comparable to that of the conventional method.
ERIC Educational Resources Information Center
Shepard, Suzanne
The assessment process can be integrated with treatment and evaluation for helping teenage suicide attempters and families in short term psychiatric hospitalization programs. The method is an extremely efficient way for the therapist to work within a given time constraint. During family assessment sufficient information can be gathered to…
Teacher Evaluation Processes in Lebanon's Catholic Schools: A Problem-Based Learning Project
ERIC Educational Resources Information Center
Wakim, Antonio Joseph
2013-01-01
Teacher effectiveness has been on the decline in Catholic schools in Lebanon due to many factors, including the lack of efficiency and competency of teachers and the lack of professional development opportunities. Much of this is due to the absence of an effective teacher evaluation system. As a result, many unqualified teachers are becoming…
A scalable parallel algorithm for multiple objective linear programs
NASA Technical Reports Server (NTRS)
Wiecek, Malgorzata M.; Zhang, Hong
1994-01-01
This paper presents an ADBASE-based parallel algorithm for solving multiple objective linear programs (MOLPs). Job balance, speedup and scalability are of primary interest in evaluating the efficiency of the new algorithm. Implementation results on Intel iPSC/2 and Paragon multiprocessors show that the algorithm significantly speeds up the process of solving MOLPs, which is understood as generating all or some efficient extreme points and unbounded efficient edges. The algorithm gives especially good results for large and very large problems. Motivation and justification for solving such large MOLPs are also included.
NASA Technical Reports Server (NTRS)
Ambur, Damodar R.
1993-01-01
Geodesically stiffened structures are very efficient in carrying combined bending, torsion, and pressure loading that is typical of primary aircraft structures. They are also very damage tolerant since there are multiple load paths available to redistribute loads compared to prismatically stiffened structures. Geodesically stiffened structures utilize continuous filament composite materials which make them amenable to automated manufacturing processes to reduce cost. The current practice for geodesically stiffened structures is to use a solid blade construction for the stiffener. This stiffener configuration is not an efficient concept and there is a need to identify other stiffener configurations that are more efficient but utilize the same manufacturing process as the solid blade. This paper describes a foam-filled stiffener cross section that is more efficient than a solid-blade stiffener in the load range corresponding to primary aircraft structures. A prismatic hat-stiffener panel design is then selected for structural evaluation in uni-axial compression with and without impact damage. Experimental results for both single stiffener specimens and multi-stiffener panel specimens are presented. Finite element analysis results are presented that predict the buckling and postbuckling response of the test specimens. Analytical results for both the element and panel specimens are compared with experimental results.
Lo Storto, Corrado
2013-11-01
This paper presents an integrative framework to evaluate ecommerce website efficiency from the user viewpoint using Data Envelopment Analysis (DEA). The framework is inspired by concepts drawn from theories of information processing and cognition and treats website efficiency as a measure of quality and performance. When users interact with the website interface to perform a task, they are involved in a cognitive effort, sustaining a cognitive cost to search, interpret and process information, and experiencing either satisfaction or dissatisfaction as a result. The perceived ambiguity, uncertainty and (excess) search time during navigation determine the size of the effort, and consequently the cognitive cost, that users must bear to perform their task. Conversely, task completion and result achievement provide users with cognitive benefits, making interaction with the website potentially attractive, satisfying, and useful. In total, 9 variables are measured, classified into 3 website macro-dimensions (user experience, site navigability and structure). The framework is applied to compare 52 ecommerce websites that sell products in the information technology and media market. A stepwise regression is performed to identify the cognitive costs and benefits that most affect website efficiency. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
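The DEA step can be sketched with an input-oriented CCR model solved by linear programming, as below; the input/output data are invented placeholders for cognitive-cost measures (inputs) and cognitive-benefit measures (outputs), not the paper's 9 variables.

```python
import numpy as np
from scipy.optimize import linprog

# Hedged sketch of the general DEA idea (input-oriented CCR, multiplier form); the data are
# made-up stand-ins: inputs could be cognitive costs (e.g. search time, perceived ambiguity),
# outputs cognitive benefits (e.g. task completion rate, satisfaction score).

X = np.array([[3.0, 2.0], [2.5, 4.0], [4.0, 1.5], [3.5, 3.0]])      # inputs, one row per website
Y = np.array([[80.0, 4.1], [70.0, 3.8], [90.0, 4.4], [60.0, 3.2]])  # outputs, one row per website
n, m = X.shape
_, s = Y.shape

def ccr_efficiency(o):
    # decision variables: u (s output weights) followed by v (m input weights)
    c = np.concatenate([-Y[o], np.zeros(m)])                 # maximise u.y_o
    A_ub = np.hstack([Y, -X])                                # u.y_j - v.x_j <= 0 for all j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

for i in range(n):
    print(f"website {i}: efficiency = {ccr_efficiency(i):.3f}")
```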
Fischman, Daniel
2010-01-01
Patients' connectedness to their providers has been shown to influence the success of preventive health and disease management programs. Lean Six Sigma methodologies were employed to study workflow processes, patient-physician familiarity, and appointment compliance to improve continuity of care in an internal medicine residency clinic. We used a rapid-cycle test to evaluate proposed improvements to the baseline-identified factors impeding efficient clinic visits. Time-study, no-show, and patient-physician familiarity data were collected to evaluate the effect of interventions to improve clinic efficiency and continuity of medical care. Forty-seven patients were seen in each of the intervention and control groups. The wait duration between the end of triage and the resident-patient encounter was statistically shorter for the intervention group. Trends toward shorter wait times for medical assistant triage and total encounter were also seen in the intervention group. On all measures of connectedness, both the physicians and patients in the intervention group showed a statistically significant increased familiarity with each other. This study shows that incremental changes in workflow processes in a residency clinic can have a significant impact on practice efficiency and adherence to scheduled visits for preventive health care and chronic disease management. This project used a structured "Plan-Do-Study-Act" approach.
A hybrid process integrating vapor stripping with vapor compression and vapor permeation membrane separation, termed Membrane Assisted Vapor Stripping (MAVS), was evaluated for recovery and dehydration of ethanol and/or 1-butanol from aqueous solution as an alternative to convent...
Velasco, Antonio; Ramírez, Martha; Hernández, Sergio; Schmidt, Winfried; Revah, Sergio
2012-03-15
Single Cr(VI) reduction and coupled reduction/stabilization (R/S) processes were evaluated at pilot scale to determine their effectiveness in treating chromite ore processing residue (COPR). Sodium sulfide was used as the reducing agent, and cement, gypsum and lime were tested as stabilizing agents. The pilot experiments were performed in a helical ribbon blender mixer with batches of 250 kg of COPR and mixing times up to 30 min. Na2S/Cr(VI) mass ratios of 4.6, 5.7 and 6.8 were evaluated in the single reduction process to treat COPR with a Cr(VI) concentration of ≈4.2 g/kg. The R/S process was tested with a Na2S/Cr(VI) mass ratio of 5.7 and stabilizing agents not exceeding 5% (w/w COPR), to treat COPR with a Cr(VI) content of ≈5.1 g/kg. The single reduction process with a ratio of 6.8 reached Cr(VI) reduction efficiencies of up to 97.6% in the first days; however, these values decreased to around 93% after 380 days of storage, at which point the total Cr level was around 12.5 mg/L. Cr(VI) removal efficiencies exceeding 96.5% were reached and maintained over 380 days when the coupled R/S process was evaluated. Total Cr levels lower than 5 mg/L were attained in the initial days in all R/S batches tested; however, after 380 days, concentrations below the regulatory limit were found only with gypsum (2%) as the single agent and with a blend of cement (4%) and lime (1%). These results indicate that the coupled R/S process is an excellent alternative for stabilizing COPR. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Tsuchida, Yuji; Enokizono, Masato
2018-04-01
The iron loss of industrial motors is increased by residual stress introduced during manufacturing processes. It is therefore very important to clarify the distribution of residual stress in motor cores in order to reduce the iron loss in the motors. Barkhausen signals occurring in electrical steel sheets can be used for evaluating residual stress because they are very sensitive to the material properties. Generally, a B-sensor is used to measure Barkhausen signals; however, we developed a new H-sensor for their measurement and applied it to stress evaluation. Our previous stress-evaluation results suggest that Barkhausen signals measured with the H-sensor are highly sensitive to residual stress in electrical steel sheets. We evaluated the tensile stress of electrical steel sheets by measuring Barkhausen signals with our developed H-sensor, aiming at high-efficiency electrical motors.
A fast and efficient segmentation scheme for cell microscopic image.
Lebrun, G; Charrier, C; Lezoray, O; Meurie, C; Cardot, H
2007-04-27
Microscopic cellular image segmentation schemes must be efficient for reliable analysis and fast enough to process huge quantities of images. Recent studies have focused on improving segmentation quality. Several segmentation schemes achieve good quality, but their processing time is too expensive to deal with a great number of images per day. For segmentation schemes based on pixel classification, the classifier design is crucial, since it accounts for most of the processing time needed to segment an image. The main contribution of this work is a means of reducing the complexity of decision functions produced by support vector machines (SVM) while preserving the recognition rate. Vector quantization is used to reduce the inherent redundancy present in huge pixel databases (i.e., images with expert pixel segmentation). Hybrid color space design is also used to improve both the data set size reduction rate and the recognition rate. A new decision function quality criterion is defined to select a good trade-off between recognition rate and the processing time of the pixel decision function. The first results of this study show that fast and efficient pixel classification with SVM is possible; moreover, posterior class pixel probabilities are easy to estimate with Platt's method. A new segmentation scheme using probabilistic pixel classification has then been developed. This scheme has several free parameters whose automatic selection must be dealt with, but existing criteria for evaluating segmentation quality are not well adapted to cell segmentation, especially when comparison with expert pixel segmentation must be achieved. Another important contribution of this paper is therefore the definition of a new quality criterion for the evaluation of cell segmentation. The results presented here show that selecting the free parameters of the segmentation scheme by optimizing the new cell segmentation quality criterion produces efficient cell segmentation.
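A hedged sketch of the general idea, combining vector quantisation of a large labelled pixel set with an SVM pixel classifier, is shown below on synthetic colour features; it is not the authors' exact scheme, and the class means and codebook size are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

# Hedged sketch: reduce a large expert-labelled pixel database by vector quantisation, then
# train an SVM pixel classifier on the prototypes so the decision function stays small and
# fast. Data here are synthetic stand-ins for colour features of "cell" vs "background" pixels.

rng = np.random.default_rng(0)
bg = rng.normal(loc=[0.7, 0.6, 0.8], scale=0.05, size=(50_000, 3))
cell = rng.normal(loc=[0.4, 0.2, 0.5], scale=0.05, size=(50_000, 3))

def quantise(pixels, n_codewords=200):
    km = KMeans(n_clusters=n_codewords, n_init=5, random_state=0).fit(pixels)
    return km.cluster_centers_

X = np.vstack([quantise(bg), quantise(cell)])        # 400 prototypes instead of 100 000 pixels
y = np.array([0] * 200 + [1] * 200)

clf = SVC(kernel="rbf", probability=True).fit(X, y)   # probability=True uses Platt scaling
test = rng.normal(loc=[0.4, 0.2, 0.5], scale=0.05, size=(5, 3))
print(clf.predict(test), clf.predict_proba(test)[:, 1])
```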
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messenger, Mike; Bharvirkar, Ranjit; Golemboski, Bill
Public and private funding for end-use energy efficiency actions is expected to increase significantly in the United States over the next decade. For example, Barbose et al (2009) estimate that spending on ratepayer-funded energy efficiency programs in the U.S. could increase from $3.1 billion in 2008 to $7.5 and $12.4 billion by 2020 under their medium and high scenarios. This increase in spending could yield annual electric energy savings ranging from 0.58% - 0.93% of total U.S. retail sales in 2020, up from 0.34% of retail sales in 2008. Interest in and support for energy efficiency has broadened among national and state policymakers. Prominent examples include approximately $18 billion in new funding for energy efficiency programs (e.g., State Energy Program, Weatherization, and Energy Efficiency and Conservation Block Grants) in the 2009 American Recovery and Reinvestment Act (ARRA). Increased funding for energy efficiency should result in more benefits as well as more scrutiny of these results. As energy efficiency becomes a more prominent component of the U.S. national energy strategy and policies, assessing the effectiveness and energy saving impacts of energy efficiency programs is likely to become increasingly important for policymakers and private and public funders of efficiency actions. Thus, it is critical that evaluation, measurement, and verification (EM&V) is carried out effectively and efficiently, which implies that: (1) Effective program evaluation, measurement, and verification (EM&V) methodologies and tools are available to key stakeholders (e.g., regulatory agencies, program administrators, consumers, and evaluation consultants); and (2) Capacity (people and infrastructure resources) is available to conduct EM&V activities and report results in ways that support program improvement and provide data that reliably compares achieved results against goals and similar programs in other jurisdictions (benchmarking). The National Action Plan for Energy Efficiency (2007) presented commonly used definitions for EM&V in the context of energy efficiency programs: (1) Evaluation (E) - The performance of studies and activities aimed at determining the effects and effectiveness of EE programs; (2) Measurement and Verification (M&V) - Data collection, monitoring, and analysis associated with the calculation of gross energy and demand savings from individual measures, sites or projects. M&V can be a subset of program evaluation; and (3) Evaluation, Measurement, and Verification (EM&V) - This term is frequently seen in evaluation literature. EM&V is a catchall acronym for determining both the effectiveness of program designs and estimates of load impacts at the portfolio, program and project level. This report is a scoping study that assesses current practices and methods in the evaluation, measurement and verification (EM&V) of ratepayer-funded energy efficiency programs, with a focus on methods and practices currently used for determining whether projected (ex-ante) energy and demand savings have been achieved (ex-post). M&V practices for privately-funded energy efficiency projects (e.g., ESCO projects) or programs where the primary focus is greenhouse gas reductions were not part of the scope of this study.
We identify and discuss key purposes and uses of current evaluations of end-use energy efficiency programs, methods used to evaluate these programs, processes used to determine those methods, and key issues that need to be addressed now and in the future, based on discussions with regulatory agencies, policymakers, program administrators, and evaluation practitioners in 14 states and national experts in the evaluation field. We also explore how EM&V may evolve in a future in which efficiency funding increases significantly, innovative mechanisms for rewarding program performance are adopted, the role of efficiency in greenhouse gas mitigation is more closely linked, and programs are increasingly funded from multiple sources, often with multiple program administrators, and intended to meet multiple purposes.
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao
Business process modeling (BPM) is gaining attention as a means of analyzing and improving business processes. BPM analyzes the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value, as a TO-BE model. However, techniques that seamlessly connect the business process improvement obtained by BPM to the implementation of the information system are rarely reported. If the business model obtained by BPM is converted into UML and the implementation is carried out using UML techniques, an improvement in the efficiency of information system implementation can be expected. In this paper, we describe a system development method that converts the process model obtained by BPM into UML, and the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, a comparison is made with the case where the system is implemented by the conventional UML technique without going via BPM.
Nguyen, Dinh Duc; Yoon, Yong Soo; Bui, Xuan Thanh; Kim, Sung Su; Chang, Soon Woong; Guo, Wenshan; Ngo, Huu Hao
2017-11-01
The performance of an electrocoagulation (EC) process in batch and continuous operating modes was thoroughly investigated and evaluated for enhancing wastewater phosphorus removal under various operating conditions, including initial phosphorus concentration, wastewater conductivity, current density, and electrolysis time, individually and in combination. The results revealed excellent phosphorus removal (72.7-100%) for both processes within 3-6 min of electrolysis, with relatively low energy requirements, i.e., less than 0.5 kWh/m3 of treated wastewater. However, the phosphorus removal efficiency in the continuous EC operation mode was better than that in batch mode within the scope of the study. Additionally, the rate and efficiency of phosphorus removal strongly depended on the operational parameters, including wastewater conductivity, initial phosphorus concentration, current density, and electrolysis time. Based on the experimental data, a statistical model using response surface methodology (RSM) (multiple-factor optimization) was also established and verified to provide further insight and accurately describe the interactive relationships between the process variables, thus optimizing EC process performance. The EC process using iron electrodes is promising for improving wastewater phosphorus removal efficiency, and RSM can be a useful tool for predicting the performance of the EC process and explaining the influence of the process variables.
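A minimal sketch of the RSM step, fitting a second-order response surface to synthetic operating data, follows; the factor ranges, coefficients and noise level are assumptions, not the study's measurements.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Hedged sketch of response surface methodology: fit a second-order polynomial that maps
# operating variables (current density, electrolysis time, initial P concentration,
# conductivity) to phosphorus-removal efficiency, then predict removal at untested settings.
# All data below are synthetic.

rng = np.random.default_rng(42)
X = rng.uniform([5, 1, 5, 0.5], [25, 6, 30, 3.0], size=(40, 4))    # A/m^2, min, mg/L, mS/cm
true = 60 + 2.0 * X[:, 0] + 6.0 * X[:, 1] - 0.5 * X[:, 2] - 0.04 * X[:, 0] ** 2
y = np.clip(true + rng.normal(0, 2, 40), 0, 100)                   # removal efficiency, %

rsm = make_pipeline(PolynomialFeatures(degree=2, include_bias=False), LinearRegression())
rsm.fit(X, y)
print("predicted removal at 20 A/m^2, 5 min, 10 mg/L, 2 mS/cm:",
      rsm.predict([[20, 5, 10, 2]])[0])
```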
Electrochemical regeneration of phenol-saturated activated carbon - proposal of a reactor.
Zanella, Odivan; Bilibio, Denise; Priamo, Wagner Luiz; Tessaro, Isabel Cristina; Féris, Liliana Amaral
2017-03-01
An electrochemical process was used to investigate the regeneration efficiency (RE) of activated carbon saturated with aromatics. For this purpose, an electrochemical reactor was developed and the operating conditions of this equipment, applied to the activated carbon regeneration process, were investigated. The influence of regeneration parameters such as processing time, applied current, polarity and the processing fluid (electrolyte) was studied. The performance of electrochemical regeneration was evaluated by adsorption tests, using phenol as the adsorbate. Increasing the applied current and the processing time was found to enhance the RE. Another aspect that indicated better reactor performance was the type of electrolyte used, with the best results obtained for NaCl. Polarity showed the highest influence on the process, with cathodic regeneration being more efficient. The electrochemical regeneration process developed in this study achieved regeneration capacities greater than 100% under the best process conditions, showing that this form of regeneration for activated carbon saturated with aromatics is very promising.
Ju, Feng; Lee, Hyo Kyung; Yu, Xinhua; Faris, Nicholas R; Rugless, Fedoria; Jiang, Shan; Li, Jingshan; Osarogiagbon, Raymond U
2017-12-01
The process of lung cancer care from initial lesion detection to treatment is complex, involving multiple steps, each introducing the potential for substantial delays. Identifying the steps with the greatest delays enables a focused effort to improve the timeliness of care delivery without sacrificing quality. We retrospectively reviewed clinical events from initial detection, through histologic diagnosis, radiologic and invasive staging, and medical clearance, to surgery for all patients who had an attempted resection of a suspected lung cancer in a community healthcare system. We used a computer process modeling approach to evaluate delays in care delivery, in order to identify potential 'bottlenecks' in waiting time whose reduction could produce greater care efficiency. We also conducted 'what-if' analyses to predict the relative impact of simulated changes in the care delivery process and determine the most efficient pathways to surgery. The waiting time between radiologic lesion detection and diagnostic biopsy and the waiting time from radiologic staging to surgery were the two most critical bottlenecks impeding efficient care delivery (reducing them had more than 3 times the impact of reducing other waiting times). Additionally, instituting surgical consultation prior to cardiac consultation for medical clearance, and decreasing the waiting time between CT scans and diagnostic biopsies, were potentially the most impactful measures to reduce care delays before surgery. Rigorous computer simulation modeling, using clinical data, can provide useful information for identifying areas in which process engineering can improve the efficiency of care delivery for patients who receive surgery for lung cancer.
Quantitative optical diagnostics in pathology recognition and monitoring of tissue reaction to PDT
NASA Astrophysics Data System (ADS)
Kirillin, Mikhail; Shakhova, Maria; Meller, Alina; Sapunov, Dmitry; Agrba, Pavel; Khilov, Alexander; Pasukhin, Mikhail; Kondratieva, Olga; Chikalova, Ksenia; Motovilova, Tatiana; Sergeeva, Ekaterina; Turchin, Ilya; Shakhova, Natalia
2017-07-01
Optical coherence tomography (OCT) is currently being actively introduced into clinical practice. Besides diagnostics, it can be efficiently employed for treatment monitoring, allowing for timely correction of the treatment procedure. In the monitoring of photodynamic therapy (PDT), the traditionally employed fluorescence imaging (FI) can benefit from the complementary use of OCT. Additional diagnostic efficiency can be derived from numerical processing of optical diagnostic data, which provides more information than visual evaluation. In this paper we report on the application of OCT together with numerical processing for clinical diagnostics in gynecology and otolaryngology, for monitoring of PDT in otolaryngology, and on OCT and FI applications in clinical and aesthetic dermatology. Numerical image processing and quantification provide an increase in diagnostic accuracy. Keywords: optical coherence tomography, fluorescence imaging, photod
Mineral Carbonation Potential of CO2 from Natural and Industrial-based Alkalinity Sources
NASA Astrophysics Data System (ADS)
Wilcox, J.; Kirchofer, A.
2014-12-01
Mineral carbonation is a Carbon Capture and Storage (CCS) technology in which gaseous CO2 is reacted with alkaline materials (such as silicate minerals and alkaline industrial wastes) and converted into stable and environmentally benign carbonate minerals (Metz et al., 2005). Here, we present a holistic, transparent life cycle assessment model of aqueous mineral carbonation built using a hybrid process model and economic input-output life cycle assessment approach. We compared the energy efficiency and the net CO2 storage potential of various mineral carbonation processes based on different feedstock materials and process schemes on a consistent basis by determining the energy and material balance of each implementation (Kirchofer et al., 2011). In particular, we evaluated the net CO2 storage potential of aqueous mineral carbonation for serpentine, olivine, cement kiln dust, fly ash, and steel slag across a range of reaction conditions and process parameters. A preliminary systematic investigation of the tradeoffs inherent in mineral carbonation processes was conducted and guidelines for the optimization of the life-cycle energy efficiency are provided. The life-cycle assessment of aqueous mineral carbonation suggests that a variety of alkalinity sources and process configurations are capable of net CO2 reductions. The maximum carbonation efficiency, defined as the mass percent of CO2 mitigated per CO2 input, was 83% for CKD at ambient temperature and pressure conditions. In order of decreasing efficiency, the maximum carbonation efficiencies for the other alkalinity sources investigated were: olivine, 66%; SS, 64%; FA, 36%; and serpentine, 13%. For natural alkalinity sources, availability is estimated based on U.S. production rates of a) lime (18 Mt/yr) or b) sand and gravel (760 Mt/yr) (USGS, 2011). The low estimate assumes the maximum sequestration efficiency of the alkalinity source obtained in the current work and the high estimate assumes a sequestration efficiency of 85%. The total CO2 storage potential for the alkalinity sources considered in the U.S. ranges from 1.3% to 23.7% of U.S. CO2 emissions, depending on the assumed availability of natural alkalinity sources and the efficiency of the mineral carbonation processes.
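The storage-potential arithmetic can be sketched as below; the per-source CO2-uptake capacities, availabilities and the U.S. emissions figure are rough assumptions introduced only to show how such an estimate is assembled, not the study's values.

```python
# Hedged back-of-the-envelope sketch of a national storage-potential estimate.
# All figures below are assumptions for illustration.

US_EMISSIONS_MT = 5500.0   # assumed annual U.S. CO2 emissions, Mt/yr

# source: (availability Mt/yr, assumed CO2 uptake per tonne of alkalinity, carbonation efficiency)
sources = {
    "cement kiln dust": (15.0, 0.4, 0.83),
    "fly ash":          (70.0, 0.2, 0.36),
    "steel slag":       (20.0, 0.3, 0.64),
    "olivine":          (760.0, 0.6, 0.66),   # availability proxied by sand-and-gravel output
}

total_mt = sum(avail * uptake * eff for avail, uptake, eff in sources.values())
print(f"stored: {total_mt:.0f} Mt CO2/yr = {100 * total_mt / US_EMISSIONS_MT:.1f}% of emissions")
```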
IEEE 802.21 Assisted Seamless and Energy Efficient Handovers in Mixed Networks
NASA Astrophysics Data System (ADS)
Liu, Huaiyu; Maciocco, Christian; Kesavan, Vijay; Low, Andy L. Y.
Network selection is the decision process by which a mobile terminal hands off between homogeneous or heterogeneous networks. With multiple available networks, the selection process must evaluate factors such as network services/conditions, monetary cost, system conditions, and user preferences. In this paper, we investigate network selection using a cost function and information provided by IEEE 802.21. The cost function provides the flexibility to balance different factors in decision making, and our research is focused on improving both the seamlessness and the energy efficiency of handovers. Our solution is evaluated using real WiFi, WiMax, and 3G signal strength traces. The results show that appropriate networks were selected based on the selection policies, handovers were triggered at optimal times to increase overall network connectivity compared to traditional triggering schemes, and at the same time the energy consumption of multi-radio devices, both for ongoing operations and during handovers, was optimized.
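A small sketch of a weighted network-selection cost function of the kind described above follows; the factor weights and per-network measurements are illustrative assumptions, not the paper's policy.

```python
# Hedged sketch: score each candidate network with a weighted cost function built from
# normalised factors and pick the lowest-cost network. Weights and measurements are invented.

networks = {
    "WiFi":  {"signal_quality": 0.8, "monetary_cost": 0.0, "power_cost": 0.3},
    "WiMax": {"signal_quality": 0.6, "monetary_cost": 0.4, "power_cost": 0.5},
    "3G":    {"signal_quality": 0.5, "monetary_cost": 0.7, "power_cost": 0.4},
}
weights = {"signal_quality": 0.5, "monetary_cost": 0.3, "power_cost": 0.2}

def network_cost(metrics):
    # Better signal lowers the cost; money and power consumption add to it.
    return (weights["signal_quality"] * (1.0 - metrics["signal_quality"])
            + weights["monetary_cost"] * metrics["monetary_cost"]
            + weights["power_cost"] * metrics["power_cost"])

best = min(networks, key=lambda n: network_cost(networks[n]))
print({n: round(network_cost(m), 3) for n, m in networks.items()}, "->", best)
```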
LeDell, Erin; Petersen, Maya; van der Laan, Mark
In binary classification problems, the area under the ROC curve (AUC) is commonly used to evaluate the performance of a prediction model. Often, it is combined with cross-validation in order to assess how the results will generalize to an independent data set. In order to evaluate the quality of an estimate for cross-validated AUC, we obtain an estimate of its variance. For massive data sets, the process of generating a single performance estimate can be computationally expensive. Additionally, when using a complex prediction method, the process of cross-validating a predictive model on even a relatively small data set can still require a large amount of computation time. Thus, in many practical settings, the bootstrap is a computationally intractable approach to variance estimation. As an alternative to the bootstrap, we demonstrate a computationally efficient influence curve based approach to obtaining a variance estimate for cross-validated AUC.
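A hedged sketch of an influence-function (DeLong-style) variance estimate for the empirical AUC is given below; it illustrates the general idea rather than reproducing the authors' cross-validated estimator, and the synthetic data are assumptions.

```python
import numpy as np

# Hedged sketch: influence-function-based (DeLong-style) variance estimate for the empirical AUC.

def auc_and_variance(scores, labels):
    scores, labels = np.asarray(scores, float), np.asarray(labels, int)
    pos, neg = scores[labels == 1], scores[labels == 0]
    n1, n0 = len(pos), len(neg)
    # Pairwise comparisons of positive vs. negative scores, ties counted as 1/2.
    cmp = (pos[:, None] > neg[None, :]).astype(float) + 0.5 * (pos[:, None] == neg[None, :])
    auc = cmp.mean()
    v10 = cmp.mean(axis=1)          # contribution of each positive case
    v01 = cmp.mean(axis=0)          # contribution of each negative case
    var = v10.var(ddof=1) / n1 + v01.var(ddof=1) / n0
    return auc, var

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 500)
p = np.clip(0.5 * y + rng.normal(0.25, 0.2, 500), 0, 1)   # synthetic predictions
auc, var = auc_and_variance(p, y)
print(f"AUC = {auc:.3f}, SE = {var ** 0.5:.4f}")
```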
O'Connor, Sydney; Ayres, Alison; Cortellini, Lynelle; Rosand, Jonathan; Rosenthal, Eric; Kimberly, W Taylor
2012-08-01
Reliable and efficient data repositories are essential for the advancement of research in Neurocritical care. Various factors, such as the large volume of patients treated within the neuro ICU, their differing length and complexity of hospital stay, and the substantial amount of desired information can complicate the process of data collection. We adapted the tools of process improvement to the data collection and database design of a research repository for a Neuroscience intensive care unit. By the Shewhart-Deming method, we implemented an iterative approach to improve the process of data collection for each element. After an initial design phase, we re-evaluated all data fields that were challenging or time-consuming to collect. We then applied root-cause analysis to optimize the accuracy and ease of collection, and to determine the most efficient manner of collecting the maximal amount of data. During a 6-month period, we iteratively analyzed the process of data collection for various data elements. For example, the pre-admission medications were found to contain numerous inaccuracies after comparison with a gold standard (sensitivity 71% and specificity 94%). Also, our first method of tracking patient admissions and discharges contained higher than expected errors (sensitivity 94% and specificity 93%). In addition to increasing accuracy, we focused on improving efficiency. Through repeated incremental improvements, we reduced the number of subject records that required daily monitoring from 40 to 6 per day, and decreased daily effort from 4.5 to 1.5 h/day. By applying process improvement methods to the design of a Neuroscience ICU data repository, we achieved a threefold improvement in efficiency and increased accuracy. Although individual barriers to data collection will vary from institution to institution, a focus on process improvement is critical to overcoming these barriers.
Rodríguez-González, Alejandro; Torres-Niño, Javier; Valencia-Garcia, Rafael; Mayer, Miguel A; Alor-Hernandez, Giner
2013-09-01
This paper proposes a new methodology for assessing the efficiency of medical diagnostic systems and clinical decision support systems by using the feedback/opinions of medical experts. The methodology is based on a comparison between the expert feedback used to solve different clinical cases and the expert system's evaluation of these same cases. Once the results are returned, an arbitration process is carried out to ensure the correctness of the results provided by both methods. Once this process has been completed, the results are analyzed using Precision, Recall, Accuracy, Specificity and Matthews Correlation Coefficient (MCC) (PRAS-M) metrics. When the methodology is applied, the results obtained from a real diagnostic system allow researchers to establish the accuracy of the system based on objective facts. The methodology returns enough information to analyze the system's behavior for each disease in the knowledge base or across the entire knowledge base. It also returns data on the efficiency of the different assessors involved in the evaluation process, analyzing their behavior in the diagnostic process. The proposed work facilitates the evaluation of medical diagnostic systems, providing a reliable process based on objective facts. The methodology presented in this research makes it possible to identify the main characteristics that define a medical diagnostic system and their values, allowing for system improvement. A good example of the results provided by the application of the methodology is shown in this paper: a diagnosis system was evaluated by means of this methodology, yielding positive (statistically significant) results when comparing the system with the assessors that participated in its evaluation, through metrics such as recall (+27.54%) and MCC (+32.19%). These results demonstrate the real applicability of the methodology. Copyright © 2013 Elsevier Ltd. All rights reserved.
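The PRAS-M metrics named above can be computed from a confusion matrix of system diagnoses against the arbitrated expert reference, as in the following sketch; the helper name and the counts are hypothetical.

```python
import math

# Minimal sketch of the PRAS-M metrics (Precision, Recall, Accuracy, Specificity, MCC)
# computed from a confusion matrix; counts below are hypothetical.

def prasm(tp, fp, tn, fn):
    precision   = tp / (tp + fp)
    recall      = tp / (tp + fn)              # sensitivity
    accuracy    = (tp + tn) / (tp + fp + tn + fn)
    specificity = tn / (tn + fp)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return dict(precision=precision, recall=recall, accuracy=accuracy,
                specificity=specificity, mcc=mcc)

print(prasm(tp=85, fp=10, tn=120, fn=15))
```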
Ríos, Pedro Rizo; Rivera, Aurora González; Oropeza, Itzel Rivas; Rivas Bocanegra, Ruth E
2013-12-01
The high costs generated by the current epidemiological profile and the introduction of new technologies put pressure on public health systems; this situation is aggravated when the health budget is low, forcing drugs to be paid for out of the patient's pocket. In this situation it is necessary to design strategies that strengthen the approval of drugs to be used in public health institutions in Mexico. The objective was to describe the results of the drug approval process for use in public health institutions, intended to ensure the efficacy, safety, and efficiency of health technologies used in public health institutions in Mexico. We conducted a cross-sectional study of the drug approval process from September 2011 to December 2012, with a descriptive analysis of each stage of the process. Of the 394 applications received for approval of health technologies, 244 (62%) were for drugs; of these, 151 (62%) met the requirements for evaluation (32% were modifications and 68% inclusions); finally, 42% were approved (61% of the modifications and 33% of the inclusions). Of the applications, 73% were approved by consensus, 12% were approved conditional on a low price and 6% were approved by majority vote. The main reasons for refusal were lack of clinical evidence (31%) and methodological problems in the economic evaluation (27%). The process was strengthened with methodological rigor based on critical analysis of the scientific evidence, with transparency and legitimacy under a legal framework to promote resource optimization. The highest percentage of requests was for drugs, which are the most commonly used therapeutic technology; for this reason a proper selection process is required to ensure the greatest health benefit and the efficient use of economic resources. The economic evaluation was a support tool to consider, in addition to price, the value for health determined by the quality of evidence, establishing one GDP per capita as the threshold to define a drug as an efficient alternative. © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR) Published by International Society for Pharmacoeconomics and Outcomes Research (ISPOR) All rights reserved.
Keller, Carmen
2011-07-01
Previous experimental research provides evidence that a familiar risk comparison within a risk ladder is understood by low- and high-numerate individuals. It especially helps low numerates to better evaluate risk. In the present study, an eye tracker was used to capture individuals' visual attention to a familiar risk comparison, such as the risk associated with smoking. Two parameters of information processing, efficiency and level, were derived from visual attention. A random sample of participants from the general population (N = 68) interpreted a given risk level with the help of the risk ladder. Numeracy was negatively correlated with overall visual attention on the risk ladder (r(s) = -0.28, p = 0.01), indicating that the lower the numeracy, the more time was spent looking at the whole risk ladder. Numeracy was positively correlated with the efficiency of processing relevant frequency information (r(s) = 0.34, p < 0.001) and relevant textual information (r(s) = 0.34, p < 0.001), but not with the efficiency of processing relevant comparative information and numerical information. There was a significant negative correlation between numeracy and the level of processing of relevant comparative risk information (r(s) = -0.21, p < 0.01), indicating that low numerates processed the comparative risk information more deeply than high numerates. There was no correlation between numeracy and perceived risk. These results add to previous experimental research, indicating that the smoking risk comparison was crucial for low numerates to evaluate and understand risk. Furthermore, the eye-tracker method is promising for studying information processing and improving risk communication formats. © 2011 Society for Risk Analysis.
An Evaluation of Changes in the Curriculum in Elementary School Level in Turkey
ERIC Educational Resources Information Center
Helvaci, M. Akif
2009-01-01
The aim of this study is to evaluate the changes in the curriculum of grades 1-5 in elementary schools and the efficiency of school administrators in managing change during the change process. The questionnaire was applied to school administrators in elementary schools in the Usak province of Turkiye. The questionnaire comprises 3 open-ended…
ERIC Educational Resources Information Center
Pawade, Yogesh R.; Diwase, Dipti S.
2016-01-01
Item analysis of Multiple Choice Questions (MCQs) is the process of collecting, summarizing and utilizing information from students' responses to evaluate the quality of test items. Difficulty Index (p-value), Discrimination Index (DI) and Distractor Efficiency (DE) are the parameters which help to evaluate the quality of MCQs used in an…
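The three indices can be computed as in the sketch below; the response data, the option key, and the 27% upper/lower group split are illustrative assumptions rather than values from the article.

```python
# Hedged sketch of standard item-analysis formulas: difficulty index (p-value) is the proportion
# answering correctly, discrimination index (DI) contrasts upper and lower scoring groups, and
# distractor efficiency (DE) is the share of distractors chosen by at least 5% of examinees.

def item_analysis(responses, key, group_frac=0.27):
    # responses: list of (total_test_score, chosen_option) for one item
    n = len(responses)
    p_value = sum(1 for _, opt in responses if opt == key) / n

    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    g = max(1, int(round(group_frac * n)))
    upper, lower = ranked[:g], ranked[-g:]
    di = (sum(1 for _, o in upper if o == key) - sum(1 for _, o in lower if o == key)) / g

    distractors = {o for _, o in responses if o != key}
    functional = sum(1 for o in distractors
                     if sum(1 for _, c in responses if c == o) / n >= 0.05)
    de = 100.0 * functional / len(distractors) if distractors else 0.0
    return p_value, di, de

demo = [(38, "A"), (35, "A"), (33, "B"), (30, "A"), (28, "C"), (25, "A"),
        (22, "D"), (20, "B"), (18, "C"), (15, "B")]
print(item_analysis(demo, key="A"))   # (0.4, ~0.67, 100.0)
```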
Pyrolysis characteristics of typical biomass thermoplastic composites
NASA Astrophysics Data System (ADS)
Cai, Hongzhen; Ba, Ziyu; Yang, Keyan; Zhang, Qingfa; Zhao, Kunpeng; Gu, Shiyan
Biomass thermoplastic composites were prepared by the extrusion molding method with poplar flour, rice husk, cotton stalk and corn stalk. A thermogravimetric analyzer (TGA) was used to evaluate the pyrolysis process of the composites. The results showed that the pyrolysis process mainly consists of two stages: biomass pyrolysis and plastic pyrolysis. Increasing the biomass content in the composite raised the first-stage pyrolysis peak temperature; however, the carbon residue was reduced and the pyrolysis efficiency improved because of the synergistic effect of biomass and plastic. Composites with different kinds of biomass have similar pyrolysis processes, and the pyrolysis efficiency of the composite with corn stalk was the best. Calcium carbonate, as a filler in the composite, could inhibit the pyrolysis process and increase the first-stage pyrolysis peak temperature and carbon residue.
Winhusen, Theresa; Brady, Kathleen T.; Stitzer, Maxine; Woody, George; Lindblad, Robert; Kropp, Frankie; Brigham, Gregory; Liu, David; Sparenborg, Steven; Sharma, Gaurav; VanVeldhuisen, Paul; Adinoff, Bryon; Somoza, Eugene
2012-01-01
Cocaine dependence is a significant public health problem for which there are currently no FDA-approved medications. Hence, identifying candidate compounds and employing an efficient evaluation process is crucial. This paper describes key design decisions made for a National Institute on Drug Abuse (NIDA) Clinical Trials Network (CTN) study that uses a novel two-stage process to evaluate buspirone (60 mg/day) for cocaine-relapse prevention. The study includes pilot (N=60) and full-scale (estimated N=264) trials. Both trials will be randomized, double-blind, and placebo-controlled and both will enroll treatment-seeking cocaine-dependent participants engaged in inpatient/residential treatment and scheduled for outpatient treatment post-discharge. All participants will receive contingency management in which incentives are given for medication adherence as evaluated by the Medication Events Monitoring System (MEMS). The primary outcome measure is maximum days of continuous cocaine abstinence, as assessed by twice-weekly urine drug screens (UDS) and self-report, during the 15-week outpatient treatment phase. Drug-abuse outcomes include cocaine use as assessed by UDS and self-report of cocaine use, other substance use as assessed by UDS and self-report of substance use (i.e., alcohol and/or illicit drugs), cocaine bingeing, HIV risk behavior, quality of life, functioning, and substance-abuse treatment attendance. Unique aspects of the study include conducting an efficacy trial in community treatment programs, a two-stage process to efficiently evaluate buspirone, and an evaluation of mediators by which buspirone might exert a beneficial effect on relapse prevention. PMID:22613054
NASA Technical Reports Server (NTRS)
Scott-Monck, J. A.; Stella, P. M.; Avery, J. E.
1975-01-01
Ten ohm-cm silicon solar cells, 0.2 mm thick, were produced with short-circuit current efficiencies of up to thirteen percent using a combination of recent technical advances. The cells were fabricated in conventional and wraparound contact configurations. Improvement in cell collection efficiency in both the short- and long-wavelength regions of the solar spectrum was obtained by coupling a shallow junction and an optically transparent antireflection coating with back surface field technology. Both boron diffusion and aluminum alloying techniques were evaluated for forming back surface field cells. The latter method is less complicated and is compatible with wraparound cell processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Won-Seok; Nam, Seongsik; Chang, Seeun
Decontamination techniques have been proposed and used to remove Chalk River unidentified deposit (CRUD) in radioactive waste management. For huge volumes of metal or radionuclides contaminated by CRUD, removal by mechanical or chemical decontamination is difficult. An advanced electrokinetic process combined with chemical decontamination was applied to remove CRUD and evaluated experimentally. We used oxalic acid for CRUD removal, and cobalt (Co) released from the CRUD was transferred to the cathode in an electrokinetic reactor. Our results indicate that the combined system is efficient for CRUD removal, with enhanced efficiency through the use of the cation exchange membrane and zeolite.
Updated estimation of energy efficiencies of U.S. petroleum refineries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palou-Rivera, I.; Wang, M. Q.
2010-12-08
Evaluation of life-cycle (or well-to-wheels, WTW) energy and emission impacts of vehicle/fuel systems requires the energy use (or energy efficiencies) of energy processing or conversion activities. Most such studies include petroleum fuels; thus, determining the energy efficiencies of petroleum refineries becomes a necessary step for life-cycle analyses of vehicle/fuel systems. Petroleum refinery energy efficiencies can then be used to determine the total amount of process energy use for refinery operation. Furthermore, since refineries produce multiple products, energy use and emissions associated with petroleum refineries must be allocated to the various petroleum products for WTW analysis of individual fuels such as gasoline and diesel. In particular, GREET, the life-cycle model developed at Argonne National Laboratory with DOE sponsorship, compares the energy use and emissions of various transportation fuels, including gasoline and diesel. Energy use in petroleum refineries is a key component of the well-to-pump (WTP) energy use and emissions of gasoline and diesel. In GREET, petroleum refinery overall energy efficiencies are used to determine petroleum-product-specific energy efficiencies. Argonne has developed petroleum refining efficiencies from LP simulations of petroleum refineries and EIA survey data of petroleum refineries up to 2006 (see Wang, 2008). This memo documents Argonne's most recent update of petroleum refining efficiencies.
Adverse outcome pathway networks II: Network analytics
The US EPA is developing more cost effective and efficient ways to evaluate chemical safety using high throughput and computationally based testing strategies. An important component of this approach is the ability to translate chemical effects on fundamental biological processes...
Continued evaluation of pothole patching equipment, materials, and processes.
DOT National Transportation Integrated Search
2014-06-14
After the deaths of two Caltrans workers who were patching potholes in 2006-2007, Caltrans tasked the Advanced Highway Maintenance and Construction Technology (AHMCT) Research Center with developing a safer and more efficient means of patching pothol...
Processing uncertain RFID data in traceability supply chains.
Xie, Dong; Xiao, Jie; Guo, Guangjun; Jiang, Tong
2014-01-01
Radio Frequency Identification (RFID) is widely used to track and trace objects in traceability supply chains. However, the massive amounts of uncertain data produced by RFID readers cannot be used effectively and efficiently in RFID application systems. Following an analysis of the key features of RFID objects, this paper proposes a new framework for effectively and efficiently processing uncertain RFID data and supporting a variety of queries for tracking and tracing RFID objects. We adjust different smoothing windows according to different rates of uncertain data, employ different strategies to process uncertain readings, and distinguish ghost, missing, and incomplete data according to their apparent positions. We propose a comprehensive data model which is suitable for different application scenarios. In addition, a path coding scheme is proposed to significantly compress massive data by aggregating the path sequence, the position, and the time intervals; the scheme is suitable for cyclic or long paths. Moreover, we propose a processing algorithm for grouped and independent objects. Experimental evaluations show that our approach is effective and efficient in terms of compression and traceability queries.
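The path-coding idea can be sketched as follows; the encoding shown (aggregating consecutive readings into position/time-interval segments plus a shared path string) illustrates the concept, not the paper's exact scheme, and the readings are invented.

```python
from itertools import groupby

# Hedged sketch: raw RFID readings for one object are aggregated into (position, first_seen,
# last_seen) segments, and the path itself is stored once as an ordered position string that
# many objects travelling the same route can share.

raw_readings = [                      # (position, timestamp) from successive reader polls
    ("warehouse", 0), ("warehouse", 5), ("warehouse", 9),
    ("truck", 15), ("truck", 22),
    ("store", 40), ("store", 41), ("store", 55),
]

def encode_path(readings):
    segments = []
    for pos, group in groupby(readings, key=lambda r: r[0]):
        times = [t for _, t in group]
        segments.append((pos, times[0], times[-1]))      # aggregate time interval per position
    path_code = "->".join(pos for pos, _, _ in segments)  # compact, shareable path identifier
    return path_code, segments

code, segments = encode_path(raw_readings)
print(code)        # warehouse->truck->store
print(segments)    # [('warehouse', 0, 9), ('truck', 15, 22), ('store', 40, 55)]
```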
Fogarasi, Szabolcs; Imre-Lucaci, Florica; Imre-Lucaci, Arpád; Ilea, Petru
2014-05-30
The present study aims to develop an eco-friendly chemical-electrochemical process for the simultaneous recovery of copper and separation of a gold-rich residue from waste printed circuit boards (WPCBs). The process was carried out by employing two different types of reactors coupled in series: a leaching reactor with a perforated rotating drum for the dissolution of base metals, and a divided electrochemical reactor for the regeneration of the leaching solution with parallel electrowinning of copper. The process performance was evaluated on the basis of dissolution efficiency, current efficiency and specific energy consumption. Finally, a process scale-up was carried out taking into consideration the optimal values of the operating parameters. The laboratory-scale leaching plant allowed the recovery of a high-purity copper deposit (99.04 wt.%) at a current efficiency of 63.84% and a specific energy consumption of 1.75 kWh/kg copper. The gold concentration in the remaining solid residue was 25 times higher than the gold concentration in the initial WPCB samples. Copyright © 2014 Elsevier B.V. All rights reserved.
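Current efficiency and specific energy consumption for copper electrowinning are conventionally derived from Faraday's law; the sketch below shows the calculation with illustrative operating values that are not the paper's measurements.

```python
# Hedged sketch: current efficiency and specific energy consumption for copper electrowinning,
# computed from Faraday's law. Operating values below are illustrative assumptions.

F = 96485.0        # Faraday constant, C/mol
M_CU = 63.546      # molar mass of copper, g/mol
Z = 2              # electrons transferred per Cu(II) ion

def electrowinning_figures(current_a, time_h, cell_voltage_v, copper_deposited_g):
    charge_c = current_a * time_h * 3600.0
    theoretical_g = M_CU * charge_c / (Z * F)               # Faraday's law
    current_efficiency = copper_deposited_g / theoretical_g
    energy_kwh = cell_voltage_v * current_a * time_h / 1000.0
    specific_energy = energy_kwh / (copper_deposited_g / 1000.0)   # kWh per kg Cu
    return current_efficiency, specific_energy

eff, sec = electrowinning_figures(current_a=10.0, time_h=4.0, cell_voltage_v=1.8,
                                  copper_deposited_g=30.2)
print(f"current efficiency = {eff:.1%}, specific energy = {sec:.2f} kWh/kg")
```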
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarud, J.; Phillips, S.
This presentation provides a technoeconomic comparison of three biofuels - ethanol, methanol, and gasoline - produced by gasification of woody biomass residues. The presentation includes a brief discussion of the three fuels evaluated; discussion of equivalent feedstock and front end processes; discussion of back end processes for each fuel; process comparisons of efficiencies, yields, and water usage; and economic assumptions and results, including a plant gate price (PGP) for each fuel.
NASA Astrophysics Data System (ADS)
Kardas, Edyta; Brožova, Silvie; Pustějovská, Pavlína; Jursová, Simona
2017-12-01
The paper presents an evaluation of the efficiency of machine use in a selected production company. The OEE (Overall Equipment Effectiveness) method was used for the analysis. The selected company produces tapered roller bearings, and the effectiveness analysis covered 17 automatic grinding lines working in the roller grinding department. The low level of machine efficiency was driven by problems with the availability of machines and devices. The causes of machine downtime on these lines were also analyzed, and three basic causes were identified: no kanban card, diamonding, and no operator. Ways to improve the use of these machines were suggested. The analysis takes into account actual results from the production process and covers a period of one calendar year.
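The abstract does not restate how OEE is computed; in its standard form (assumed here, not quoted from the paper) it is the product of availability, performance and quality, as the short sketch below illustrates with made-up numbers:

```python
# Standard OEE calculation (availability x performance x quality); all figures
# below are illustrative placeholders, not results from the grinding lines studied.
def oee(run_time, planned_time, actual_output, ideal_output, good_units, total_units):
    availability = run_time / planned_time      # downtime (no kanban card, no operator, ...)
    performance = actual_output / ideal_output  # speed losses
    quality = good_units / total_units          # scrap and rework
    return availability * performance * quality

print(round(oee(run_time=5.5, planned_time=8.0,
                actual_output=900, ideal_output=1100,
                good_units=880, total_units=900), 2))   # -> 0.55
```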
Lanying Lin; Sheng He; Feng Fu; Xiping Wang
2015-01-01
Wood failure percentage (WFP) is an important index for evaluating the bond strength of plywood. Currently, the method used for detecting WFP is visual inspection, which lacks efficiency. In order to improve it, image processing methods are applied to wood failure detection. The present study used thresholding and K-means clustering algorithms in wood failure detection...
VIEWDEX: an efficient and easy-to-use software for observer performance studies.
Håkansson, Markus; Svensson, Sune; Zachrisson, Sara; Svalkvist, Angelica; Båth, Magnus; Månsson, Lars Gunnar
2010-01-01
The development of investigation techniques, image processing, workstation monitors, analysing tools etc. within the field of radiology is vast, and the need for efficient tools in the evaluation and optimisation process of image and investigation quality is important. ViewDEX (Viewer for Digital Evaluation of X-ray images) is an image viewer and task manager suitable for research and optimisation tasks in medical imaging. ViewDEX is DICOM compatible and the features of the interface (tasks, image handling and functionality) are general and flexible. The configuration of a study and output (for example, answers given) can be edited in any text editor. ViewDEX is developed in Java and can run from any disc area connected to a computer. It is free to use for non-commercial purposes and can be downloaded from http://www.vgregion.se/sas/viewdex. In the present work, an evaluation of the efficiency of ViewDEX for receiver operating characteristic (ROC) studies, free-response ROC (FROC) studies and visual grading (VG) studies was conducted. For VG studies, the total scoring rate was dependent on the number of criteria per case. A scoring rate of approximately 150 cases per hour can be expected for a typical VG study using single images and five anatomical criteria. For ROC and FROC studies using clinical images, the scoring rate was approximately 100 cases per hour using single images and approximately 25 cases per hour using image stacks (approximately 50 images per case). In conclusion, ViewDEX is an efficient and easy-to-use software for observer performance studies.
Corona-Strauss, Farah I; Delb, Wolfgang; Bloching, Marc; Strauss, Daniel J
2008-01-01
We have recently shown that click evoked auditory brainstem response (ABR) single sweeps can efficiently be processed by a hybrid novelty detection system. This approach allowed for the objective detection of hearing thresholds in a fraction of the time of conventional schemes, making it appropriate for the efficient implementation of newborn hearing screening procedures. It is the objective of this study to evaluate whether this approach might further be improved by different stimulation paradigms and electrode settings. In particular, we evaluate chirp stimulations, which compensate the basilar-membrane dispersion, and active electrodes, which are less sensitive to movements. This is the first study directed to single sweep processing of chirp evoked ABRs. By concentrating on transparent features and a minimum number of adjustable parameters, we present an objective comparison of click vs. chirp stimulations and active vs. passive electrodes in ultrafast ABR detection. We show that chirp evoked brainstem responses and active electrodes might improve the single sweep analysis of ABRs. Consequently, we conclude that single sweep processing of ABRs for the objective determination of hearing thresholds can further be improved by the use of optimized chirp stimulations and active electrodes.
Selecting a Control Strategy for Plug and Process Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lobato, C.; Sheppy, M.; Brackney, L.
2012-09-01
Plug and Process Loads (PPLs) are building loads that are not related to general lighting, heating, ventilation, cooling, and water heating, and typically do not provide comfort to the building occupants. PPLs in commercial buildings account for almost 5% of U.S. primary energy consumption. On an individual building level, they account for approximately 25% of the total electrical load in a minimally code-compliant commercial building, and can exceed 50% in an ultra-high efficiency building such as the National Renewable Energy Laboratory's (NREL) Research Support Facility (RSF) (Lobato et al. 2010). Minimizing these loads is a primary challenge in the design and operation of an energy-efficient building. A complex array of technologies that measure and manage PPLs has emerged in the marketplace. Some fall short of manufacturer performance claims, however. NREL has been actively engaged in developing an evaluation and selection process for PPLs control, and is using this process to evaluate a range of technologies for active PPLs management that will cap RSF plug loads. Using a control strategy to match plug load use to users' required job functions is a huge untapped potential for energy savings.
Structural Efficiency of Composite Struts for Aerospace Applications
NASA Technical Reports Server (NTRS)
Jegley, Dawn C.; Wu, K. Chauncey; McKenney, Martin J.; Oremont, Leonard
2011-01-01
The structural efficiency of carbon-epoxy tapered struts is considered through trade studies, detailed analysis, manufacturing and experimentation. Since some of the lunar lander struts are more highly loaded than struts used in applications such as satellites and telescopes, the primary focus of the effort is on these highly loaded struts. Lunar lander requirements include that the strut has to be tapered on both ends, complicating the design and limiting the manufacturing process. Optimal stacking sequences, geometries, and materials are determined and the sensitivity of the strut weight to each parameter is evaluated. The trade study results indicate that the most efficient carbon-epoxy struts are 30 percent lighter than the most efficient aluminum-lithium struts. Structurally efficient, highly loaded struts were fabricated and loaded in tension and compression to determine if they met the design requirements and to verify the accuracy of the analyses. Experimental evaluation of some of these struts demonstrated that they could meet the greatest Altair loading requirements in both tension and compression. These results could be applied to other vehicles requiring struts with high loading and light weight.
Image Navigation and Registration Performance Assessment Evaluation Tools for GOES-R ABI and GLM
NASA Technical Reports Server (NTRS)
Houchin, Scott; Porter, Brian; Graybill, Justin; Slingerland, Philip
2017-01-01
The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24 hour evaluation period. This paper describes the software design and implementation of IPATS and provides preliminary test results.
Sim, Kyoung Mi; Park, Hyun-Seol; Bae, Gwi-Nam; Jung, Jae Hee
2015-11-15
In this study, we demonstrated an antimicrobial nanoparticle-coated electrostatic (ES) air filter. Antimicrobial natural-product Sophora flavescens nanoparticles were produced using an aerosol process, and were continuously deposited onto the surface of air filter media. For the electrostatic activation of the filter medium, a corona discharge electrification system was used before and after antimicrobial treatment of the filter. In the antimicrobial treatment process, the deposition efficiency of S. flavescens nanoparticles on the ES filter was ~12% higher than that on the pristine (Non-ES) filter. In the evaluation of filtration performance using test particles (a nanosized KCl aerosol and submicron-sized Staphylococcus epidermidis bioaerosol), the ES filter showed better filtration efficiency than the Non-ES filter. However, antimicrobial treatment with S. flavescens nanoparticles affected the filtration efficiency of the filter differently depending on the size of the test particles. While the filtration efficiency of the KCl nanoparticles was reduced on the ES filter after the antimicrobial treatment, the filtration efficiency was improved after the recharging process. In summary, we prepared an antimicrobial ES air filter with >99% antimicrobial activity, ~92.5% filtration efficiency (for a 300-nm KCl aerosol), and a ~0.8 mmAq pressure drop (at 13 cm/s). This study provides valuable information for the development of a hybrid air purification system that can serve various functions and be used in an indoor environment. Copyright © 2015 Elsevier B.V. All rights reserved.
Process development for single-crystal silicon solar cells
NASA Astrophysics Data System (ADS)
Bohra, Mihir H.
Solar energy is a viable, rapidly growing and important renewable alternative to other sources of energy generation because of its abundant supply and low manufacturing cost. Silicon remains the major contributor to solar cell manufacturing, accounting for 80% of the market share, of which single-crystal cells account for half. Laboratory cells have demonstrated 25% efficiency; however, commercial cells have efficiencies of 16% - 20%, resulting from a focus on implementation processes geared to rapid throughput and low cost, thereby reducing the energy pay-back time. An example is the use of metal pastes, which dissolve the dielectric during the firing process, as opposed to lithographically defined contacts. With current trends of single-crystal silicon photovoltaic (PV) module prices down to $0.60/W, almost all other PV technologies are challenged to remain cost competitive. This presents a unique opportunity to revisit the PV cell fabrication process and incorporate moderately more expensive IC process practices into PV manufacturing. While they may drive the cost toward a $1/W benchmark, there is substantial room to "experiment", leading to higher efficiencies which will help maintain the overall system cost. This work entails a turn-key process designed to provide a platform for rapid evaluation of novel materials and processes. A two-step lithographic process yielding a baseline 11% - 13% efficient cell is described. Results of three studies have shown improvements in solar cell output parameters due to the inclusion of a back-surface field implant, a higher emitter doping and an additional RCA Clean.
ACT Payload Shroud Structural Concept Analysis and Optimization
NASA Technical Reports Server (NTRS)
Zalewski, Bart B.; Bednarcyk, Brett A.
2010-01-01
Aerospace structural applications demand a weight efficient design to perform in a cost effective manner. This is particularly true for launch vehicle structures, where weight is the dominant design driver. The design process typically requires many iterations to ensure that a satisfactory minimum weight has been obtained. Although metallic structures can be weight efficient, composite structures can provide additional weight savings due to their lower density and additional design flexibility. This work presents structural analysis and weight optimization of a composite payload shroud for NASA's Ares V heavy lift vehicle. Two concepts, which were previously determined to be efficient for such a structure, are evaluated: a hat stiffened/corrugated panel and a fiber reinforced foam sandwich panel. A composite structural optimization code, HyperSizer, is used to optimize the panel geometry, composite material ply orientations, and sandwich core material. HyperSizer enables an efficient evaluation of thousands of potential designs versus multiple strength and stability-based failure criteria across multiple load cases. The HyperSizer sizing process uses a global finite element model to obtain element forces, which are statistically processed to arrive at panel-level design-to loads. These loads are then used to analyze each candidate panel design. A near optimum design is selected as the one with the lowest weight that also provides all positive margins of safety. The stiffness of each newly sized panel or beam component is taken into account in the subsequent finite element analysis. Iteration of analysis/optimization is performed to ensure a converged design. Sizing results for the hat stiffened panel concept and the fiber reinforced foam sandwich concept are presented.
Development of a kolanut peeling device.
Kareem, I; Owolarafe, O K; Ajayi, O A
2014-10-01
A kolanut peeling machine was designed, constructed and evaluated for the postharvest processing of the seed. The peeling machine consists of a standing frame, peeling unit and hopper. The peeling unit consists of a special paddle, which mixes the kolanuts, rubs them against one another and against the wall of the barrel, and conveys them to the outlet. The performance of the kolanut peeling machine was evaluated for its peeling efficiency at different moisture contents (53.0, 57.6, 61.4 % w.b.) and speeds of operation of the machine. The result of the analysis of variance shows that the main factors and their interaction had significant effects (p < 0.05) on the peeling efficiency of the machine. The results also show that the peeling efficiency of the machine increased as the moisture content increased and decreased with increasing machine speed. The highest efficiency of the machine was 60.3 % at a moisture content of 61.4 % w.b. and speed of 40 rpm.
NASA Astrophysics Data System (ADS)
Shi, Junqin; Chen, Juan; Fang, Liang; Sun, Kun; Sun, Jiapeng; Han, Jing
2018-03-01
The effect of a water film on the nanoscratching behavior of monocrystalline Cu was studied by molecular dynamics (MD) simulation. The results indicate that the friction force acting on the abrasive particle increases due to the resistance of the water film accumulating ahead of the particle, but the lubricating water film decreases the friction force acting on the Cu surface. The accumulation of water molecules around the particle causes anisotropy of the ridge and surface damage around the groove, and the water molecules remaining in the groove lead to a non-regular groove structure. The dislocation evolution displays the re-organization of the dislocation network in the nanoscratching process. The evaluation of removal efficiency shows that the number of removed Cu atoms decreases with increasing water film thickness. It is considered that an appropriate, rather than a high, removal efficiency should be adopted to evaluate the polishing process in real chemical mechanical polishing (CMP). These results are helpful for revealing the polishing mechanism under the effect of a water film from a physical perspective, which benefits the development of ultra-precision manufacture and miniaturized components, as well as the innovation of CMP technology.
von Sperling, M; Oliveira, S C
2009-01-01
This article evaluates and compares the actual behavior of 166 full-scale anaerobic and aerobic wastewater treatment plants in operation in Brazil, providing information on the performance of the processes in terms of the quality of the generated effluent and the removal efficiency achieved. The observed results of effluent concentrations and removal efficiencies of the constituents BOD, COD, TSS (total suspended solids), TN (total nitrogen), TP (total phosphorus) and FC (faecal or thermotolerant coliforms) have been compared with the typical expected performance reported in the literature. The treatment technologies selected for study were: (a) predominantly anaerobic: (i) septic tank + anaerobic filter (ST + AF), (ii) UASB reactor without post-treatment (UASB) and (iii) UASB reactor followed by several post-treatment processes (UASB + POST); (b) predominantly aerobic: (iv) facultative pond (FP), (v) anaerobic pond followed by facultative pond (AP + FP) and (vi) activated sludge (AS). The results, confirmed by statistical tests, showed that, in general, the best performance was achieved by AS, but closely followed by UASB reactor, when operating with any kind of post-treatment. The effluent quality of the anaerobic processes ST + AF and UASB reactor without post-treatment was very similar to the one presented by facultative pond, a simpler aerobic process, regarding organic matter.
Kim, Min Woo; Sun, Gwanggyu; Lee, Jung Hyuk; Kim, Byung-Gee
2018-06-01
Ribozyme (Rz) is a very attractive RNA molecule in the metabolic engineering and synthetic biology fields, where RNA processing is required as a control unit or ON/OFF signal for its cleavage reaction. In order to use Rz for such RNA processing, the Rz must have highly active and specific catalytic activity. However, current methods for assessing the intracellular activity of Rz have limitations, such as difficulty in handling and inaccuracies in the evaluation of correct cleavage activity. In this paper, we propose a simple method to accurately measure the "intracellular cleavage efficiency" of Rz. This method uses a DNA quenching step to suppress unwanted Rz activity that may continue after cell lysis, and calculates the cleavage efficiency from the fraction of Rz-containing mRNA that has been cleaved, relative to the total amount of mRNA containing Rz, quantified by quantitative real-time PCR (qPCR). The proposed method was applied to measure the "intracellular cleavage efficiency" of sTRSV, a representative Rz, and its mutant; their intracellular cleavage efficiencies were calculated as 89% and 93%, respectively. Copyright © 2018 Elsevier Inc. All rights reserved.
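On the simplest reading of this approach (an interpretation, not the authors' exact formula), the intracellular cleavage efficiency is the cleaved fraction of the Rz-carrying transcript,

\[
E_{\mathrm{cleavage}} = \frac{[\mathrm{mRNA}]_{\mathrm{cleaved}}}{[\mathrm{mRNA}]_{\mathrm{total}}}
= 1 - \frac{[\mathrm{mRNA}]_{\mathrm{intact}}}{[\mathrm{mRNA}]_{\mathrm{total}}},
\]

where the intact and total transcript levels would be quantified by qPCR, for example with one amplicon spanning the cleavage site and one outside it; the reported values of 89% and 93% would correspond to this cleaved fraction.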
NASA Astrophysics Data System (ADS)
Ren, Ziqiu; Zhu, Menghua; Li, Xin; Dong, Cunku
2017-09-01
As a promising photovoltaic device, perovskite solar cells have attracted considerable attention in recent years, and forming a compact and pinhole-free perovskite film in air is of great importance. Herein, we evaluate highly efficient and air-stable planar perovskite solar cells fabricated in air (relative humidity over 50%) with a modified two-step sequential deposition method, by adjusting the CH3NH3I (MAI) concentration and regulating the crystallization process of the perovskite film. The optimum MAI concentration is 60 mg mL-1 in isopropanol. With a planar structure of FTO/TiO2/MAPbI3/spiro-OMeTAD/Au, efficient devices composed of compact and pinhole-free perovskite films are constructed in air, achieving an efficiency of up to 15.10% and maintaining over 80% of that value after 20 days of storage in air without any encapsulation. With a facile fabrication process and high photovoltaic performance, this work represents a promising method for fabricating low-cost, highly efficient and stable photovoltaic devices.
An efficient laboratory workflow for environmental risk assessment of organic chemicals.
Zhu, Linyan; Santiago-Schübel, Beatrix; Xiao, Hongxia; Thiele, Björn; Zhu, Zhiliang; Qiu, Yanling; Hollert, Henner; Küppers, Stephan
2015-07-01
In this study, we demonstrate a fast and efficient workflow to investigate the transformation mechanism of organic chemicals and evaluate the toxicity of their transformation products (TPs) in laboratory scale. The transformation process of organic chemicals was first simulated by electrochemistry coupled online to mass spectrometry (EC-MS). The simulated reactions were scaled up in a batch EC reactor to receive larger amounts of a reaction mixture. The mixture sample was purified and concentrated by solid phase extraction (SPE) for the further ecotoxicological testing. The combined toxicity of the reaction mixture was evaluated in fish egg test (FET) (Danio rerio) compared to the parent compound. The workflow was verified with carbamazepine (CBZ). By using EC-MS seven primary TPs of CBZ were identified; the degradation mechanism was elucidated and confirmed by comparison to literature. The reaction mixture and one primary product (acridine) showed higher ecotoxicity in fish egg assay with 96 h EC50 values of 1.6 and 1.0 mg L(-1) than CBZ with the value of 60.8 mg L(-1). The results highlight the importance of transformation mechanism study and toxicological effect evaluation for organic chemicals brought into the environment since transformation of them may increase the toxicity. The developed process contributes a fast and efficient laboratory method for the risk assessment of organic chemicals and their TPs. Copyright © 2015 Elsevier Ltd. All rights reserved.
Enhancement of Micropollutant Degradation at the Outlet of Small Wastewater Treatment Plants
Rossi, Luca; Queloz, Pierre; Brovelli, Alessandro; Margot, Jonas; Barry, D. A.
2013-01-01
The aim of this work was to evaluate low-cost and easy-to-operate engineering solutions that can be added as a polishing step to small wastewater treatment plants to reduce the micropollutant load to water bodies. The proposed design combines a sand filter/constructed wetland with additional and more advanced treatment technologies (UV degradation, enhanced adsorption to the solid phase, e.g., an engineered substrate) to increase the elimination of recalcitrant compounds. The removal of five micropollutants with different physico-chemical characteristics (three pharmaceuticals: diclofenac, carbamazepine, sulfamethoxazole, one pesticide: mecoprop, and one corrosion inhibitor: benzotriazole) was studied to evaluate the feasibility of the proposed system. Separate batch experiments were conducted to assess the removal efficiency of UV degradation and adsorption. The efficiency of each individual process was substance-specific. No process was effective on all the compounds tested, although elimination rates over 80% using light expanded clay aggregate (an engineered material) were observed. A laboratory-scale flow-through setup was used to evaluate interactions when removal processes were combined. Four of the studied compounds were partially eliminated, with poor removal of the fifth (benzotriazole). The energy requirements for a field-scale installation were estimated to be the same order of magnitude as those of ozonation and powdered activated carbon treatments. PMID:23484055
Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J..
The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.
Information efficiency in visual communication
NASA Astrophysics Data System (ADS)
Alter-Gartenberg, Rachel; Rahman, Zia-ur
1993-08-01
This paper evaluates the quantization process in the context of the end-to-end performance of the visual-communication channel. Results show that the trade-off between data transmission and visual quality revolves around the information in the acquired signal, not around its energy. Improved information efficiency is gained by frequency dependent quantization that maintains the information capacity of the channel and reduces the entropy of the encoded signal. Restorations with energy bit-allocation lose both in sharpness and clarity relative to restorations with information bit-allocation. Thus, quantization with information bit-allocation is preferred for high information efficiency and visual quality in optimized visual communication.
Borole, Abhijeet P.
2015-08-25
Conversion of biomass into bioenergy is possible via multiple pathways resulting in production of biofuels, bioproducts and biopower. Efficient and sustainable conversion of biomass, however, requires consideration of many environmental and societal parameters in order to minimize negative impacts. Integration of multiple conversion technologies and inclusion of upcoming alternatives such as bioelectrochemical systems can minimize these impacts and improve conservation of resources such as hydrogen, water and nutrients via recycle and reuse. This report outlines alternate pathways integrating microbial electrolysis in biorefinery schemes to improve energy efficiency while evaluating environmental sustainability parameters.
NASA Technical Reports Server (NTRS)
Lloyd, J. F., Sr.
1987-01-01
Industrial radiography is a well established, reliable means of providing nondestructive structural integrity information. The majority of industrial radiographs are interpreted by trained human eyes using transmitted light and various visual aids. Hundreds of miles of radiographic information are evaluated, documented and archived annually. In many instances, there are serious considerations in terms of interpreter fatigue, subjectivity and limited archival space. Quite often it is difficult to quickly retrieve radiographic information for further analysis or investigation. Methods of improving the quality and efficiency of the radiographic process are being explored, developed and incorporated whenever feasible. High resolution cameras, digital image processing, and mass digital data storage offer interesting possibilities for improving the industrial radiographic process. A review is presented of computer aided radiographic interpretation technology in terms of how it could be used to enhance the radiographic interpretation process in evaluating radiographs of aluminum welds.
Adapting Nielsen’s Design Heuristics to Dual Processing for Clinical Decision Support
Taft, Teresa; Staes, Catherine; Slager, Stacey; Weir, Charlene
2016-01-01
The study objective was to improve the applicability of Nielsen’s standard design heuristics for evaluating electronic health record (EHR) alerts and linked ordering support by integrating them with Dual Process theory. Through initial heuristic evaluation and a user study of 7 physicians, usability problems were identified. Through independent mapping of specific usability criteria to support for each of the Dual Cognitive processes (S1 and S2) and deliberation, agreement was reached on mapping criteria. Finally, usability errors from the heuristic and user study were mapped to S1 and S2. Adding a dual process perspective to specific heuristic analysis increases the applicability and relevance of computerized health information design evaluations. This mapping enables designers to verify that their systems are tailored to support attention allocation. System 1 will be supported by improving pattern recognition and saliency, and system 2 through efficiency and control of information access. PMID:28269915
NASA Astrophysics Data System (ADS)
Profumieri, A.; Bonell, C.; Catalfamo, P.; Cherniz, A.
2016-04-01
Virtual reality has been proposed for different applications, including the evaluation of new control strategies and training protocols for upper limb prostheses and the study of new rehabilitation programs. In this study, a lower limb simulation environment commanded by surface electromyography signals is evaluated. The time delays generated by the acquisition and processing stages for the signals that would command the knee joint were measured, and different acquisition windows were analysed. The subjective perception of the quality of the simulation was also evaluated when extra delays were added to the process. The results showed that the acquisition window is responsible for the longest delay. The implemented processing also allowed three signal channels to be acquired for commanding the simulation. Finally, the communication between the different applications is arguably efficient, although it depends on the amount of data to be sent.
Neumann, Miguel G; Schmitt, Carla C; Ferreira, Giovana C; Corrêa, Ivo C
2006-06-01
To evaluate the efficiency of the photopolymerization of dental resins it is necessary to know to what extent the light emitted by the light curing units is absorbed by the photoinitiators. On the other hand, the efficiency with which the absorbed photons produce species that initiate the polymerization process is also of paramount importance. Therefore, the previously determined PAE (photon absorption efficiency) is used in conjunction with the polymerization quantum yields of the photoinitiators, in order to compare the total process on an equivalent basis. This parameter can be used to identify the best performance for the photochemical process with specific photoinitiators. The efficiencies of LED (Ultrablue IS) and QTH (Optilux 401) lamps were tested by comparing their performances with the photoinitiators camphorquinone (CQ); phenylpropanedione (PPD); monoacylphosphine oxide (Lucirin TPO); and bisacylphosphine oxide (Irgacure 819). The extent of photopolymerization per absorbed photon was determined from the polymerization quantum yields obtained by using the photoinitiators to polymerize methyl methacrylate, and afterwards combined with the previously determined PAEs. Although CQ presents a rather low polymerization quantum yield, its photopolymerization efficiency is practically the highest when irradiated with the Ultrablue LED. On the other hand, Lucirin is much more efficient than the other photoinitiators when irradiated with a QTH lamp, due to its high quantum yield and the overlap between its absorption spectrum and the output of the visible lamp light. Differences in photopolymerization efficiency arise when combinations of photoinitiators are used, and when LED sources are used in preference to QTH. Mechanistic understanding is essential to optimal initiator formulation.
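Expressed compactly (a paraphrase of the comparison, not the authors' notation), the figure of merit combines how many lamp photons the initiator absorbs with how productively each absorbed photon initiates polymerization:

\[
\eta_{\mathrm{photopoly}} \;=\; \mathrm{PAE} \times \Phi_{\mathrm{poly}},
\]

where PAE is the photon absorption efficiency for a given lamp/initiator pair and \(\Phi_{\mathrm{poly}}\) is the polymerization quantum yield (polymerization events per absorbed photon).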
Omondi Aduda, Dickens S; Ouma, Collins; Onyango, Rosebella; Onyango, Mathews; Bertrand, Jane
2015-01-01
Voluntary medical male circumcision (VMMC) service delivery is complex and resource-intensive. In Kenya's context there is still a paucity of information on resource use vis-à-vis outputs as programs scale up. Knowledge of technical efficiency, productivity and potential sources of constraints is desirable to improve decision-making. The objective was to evaluate the technical efficiency and productivity of VMMC service delivery in Nyanza in 2011/2012, through a comparative process evaluation of facilities providing VMMC using output-orientated data envelopment analysis. Twenty-one facilities were evaluated. Only 1 of 7 variables considered (total elapsed operation time) significantly improved, from 32.8 minutes (SD 8.8) in 2011 to 30 minutes (SD 6.6) in 2012 (95% CI = 0.0350-5.2488; p = 0.047). Mean scale technical efficiency significantly improved from 91% (SD 19.8) in 2011 to 99% (SD 4.0) in 2012, particularly among outreach compared to fixed service delivery facilities (CI -31.47959 to 4.698508; p = 0.005). The increase in mean VRS technical efficiency from 84% (SD 25.3) in 2011 to 89% (SD 25.1) in 2012 was not statistically significant. Benchmark facilities were #119 and #125 in 2011 and #103 in 2012. The Malmquist Productivity Index (MPI) at fixed facilities declined by 2.5% but gained by 4.9% at outreach ones by 2012. Total factor productivity improved by 83% (p = 0.032) in 2012, largely due to progress in technological efficiency of 79% (p = 0.008). The significant improvement in scale technical efficiency among outreach facilities in 2012 was attributable to accelerated activities. However, ongoing pure technical inefficiency requires concerted attention. Technological progress was the key driver of service productivity growth in Nyanza. Incorporating service-quality dimensions and using stepwise multiple criteria in performance evaluation enhances comprehensiveness and validity. These findings highlight site-level resource use and sources of variation in VMMC service productivity, which are important for program planning.
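For context, an output-oriented, variable-returns-to-scale DEA score can be obtained from a small linear program solved once per facility; the sketch below is a generic textbook formulation with toy numbers, not the authors' dataset or software.

```python
# Minimal output-oriented VRS DEA sketch using scipy.optimize.linprog.
# inputs[j] and outputs[j] are the resource and output vectors of facility j;
# phi >= 1 measures how much facility `o` could expand its outputs while using
# no more of each input than it currently does.
import numpy as np
from scipy.optimize import linprog

def dea_output_vrs(inputs, outputs, o):
    inputs, outputs = np.asarray(inputs, float), np.asarray(outputs, float)
    n, m, s = inputs.shape[0], inputs.shape[1], outputs.shape[1]
    c = np.r_[-1.0, np.zeros(n)]                 # variables: [phi, lambda_1..lambda_n]; maximize phi
    A_ub, b_ub = [], []
    for i in range(m):                           # sum_j lambda_j * x_ij <= x_io
        A_ub.append(np.r_[0.0, inputs[:, i]]); b_ub.append(inputs[o, i])
    for r in range(s):                           # phi * y_ro <= sum_j lambda_j * y_rj
        A_ub.append(np.r_[outputs[o, r], -outputs[:, r]]); b_ub.append(0.0)
    A_eq = [np.r_[0.0, np.ones(n)]]              # VRS convexity: sum_j lambda_j = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return 1.0 / res.x[0]                        # efficiency score in (0, 1]

# Toy data (not the study's): 4 facilities, 2 inputs (staff, supplies), 1 output (circumcisions).
X = [[5, 20], [8, 30], [6, 22], [10, 40]]
Y = [[120], [150], [160], [170]]
print([round(dea_output_vrs(X, Y, j), 3) for j in range(4)])
```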
Fuel Cell/Reformers Technology Development
NASA Technical Reports Server (NTRS)
2004-01-01
NASA Glenn Research Center is interested in developing solid oxide fuel cells (SOFCs) for use in aerospace applications. A solid oxide fuel cell requires a hydrogen-rich feed stream, which is produced by converting commercial aviation jet fuel in a fuel processing step. The grantee's primary research activities center on designing and constructing a test facility for evaluating injector concepts to provide optimum feeds to the fuel processor; collecting and analyzing literature information on fuel processing and desulfurization technologies; establishing industry and academic contacts in related areas; and providing technical support to in-house SOFC-based system studies. Fuel processing is a chemical reaction process that requires efficient delivery of reactants to reactor beds for optimum performance, i.e., high conversion efficiency and maximum hydrogen production, and reliable continuous operation. Feed delivery and vaporization quality can be improved by applying NASA's expertise in combustor injector design. A 10 kWe injector rig has been designed, procured, and constructed to provide a tool to employ laser diagnostic capability to evaluate various injector concepts for fuel processing reactor feed delivery. This injector rig facility is now undergoing mechanical and system check-out, with actual operation anticipated in July 2004. Multiple injector concepts, including impinging jet, venturi mixing, and discrete jet, will be tested and evaluated with an actual fuel mixture compatible with reforming catalyst requirements. Research activities from September 2002 to the closing of this collaborative agreement have been in the following areas: compiling literature information on jet fuel reforming; conducting autothermal reforming catalyst screening; establishing contacts with other government agencies for collaborative research in jet fuel reforming and desulfurization; and providing the process design basis for the build-up of the injector rig facility and individual injector designs.
Adverse outcome pathway networks: Development, analytics and applications
The US EPA is developing more cost effective and efficient ways to evaluate chemical safety using high throughput and computationally based testing strategies. An important component of this approach is the ability to translate chemical effects on fundamental biological processes...
Adverse outcome pathway networks I: Development and applications
The US EPA is developing more cost effective and efficient ways to evaluate chemical safety using high throughput and computationally based testing strategies. An important component of this approach is the ability to translate chemical effects on fundamental biological processes...
An observational assessment method for aging laboratory rats
The growth of the aging population highlights the need for laboratory animal models to study the basic biological processes of aging and susceptibility to toxic chemicals and disease. Methods to evaluate the health of aging animals over time are needed, especially efficient methods for...
THE SOLAR TRANSFORMITY OF OIL AND PETROLEUM NATURAL GAS
This paper presents an emergy evaluation of the biogeochemical process of petroleum formation. Unlike the previous calculation, in which the transformity of crude oil was back calculated from the relative efficiency of electricity production and factors relating coal to transport...
Performance evaluation of buried pipe installation : LTRC research project capsule 08-6GT.
DOT National Transportation Integrated Search
2008-03-01
The Louisiana Department of : Transportation and Development : (LADOTD) is in the process of revising : the current specifications to obtain a : more cost efficient design and : installation of buried pipes for highway : infrastructure. It aims to de...
Biomaterials Evaluation: Conceptual Refinements and Practical Reforms.
Masaeli, Reza; Zandsalimi, Kavosh; Tayebi, Lobat
2018-01-01
Given the widespread and ever-increasing applications of biomaterials in different medical fields, their accurate assessment is of great importance. Since the safety and efficacy of biomaterials are confirmed only through the evaluation process, the way this process is conducted has direct effects on public health. Although every biomaterial undergoes rigorous premarket evaluation, the regulatory agencies receive a considerable number of complication and adverse event reports annually. The main factors that challenge the process of biomaterials evaluation are dissimilar regulations, asynchrony between biomaterials evaluation and biomaterials development, inherent biases of postmarketing data, and cost and timing issues. Several pieces of evidence indicate that current medical device regulations need to be improved so that they can be used more effectively in the evaluation of biomaterials. This article provides suggested conceptual refinements and practical reforms to increase the efficiency and effectiveness of the existing regulations. The main focus of the article is on strategies for evaluating biomaterials in the US, and then in the EU.
Evaluation of selected chemical processes for production of low-cost silicon, phase 3
NASA Technical Reports Server (NTRS)
Blocher, J. M., Jr.; Browning, M. F.; Seifert, D. A.
1981-01-01
A Process Development Unit (PDU), which consisted of the four major units of the process, was designed, installed, and experimentally operated. The PDU was sized to 50 MT/yr. The deposition took place in a fluidized bed reactor. As a consequence of the experiments, improvements in the design and operation of these units were undertaken and their experimental limitations were partially established. A parallel program of experimental work demonstrated that zinc can be vaporized, for introduction into the fluidized bed reactor, by direct induction-coupled r.f. energy. Residual zinc in the product can be removed by heat treatment below the melting point of silicon. Current efficiencies of 94 percent and above, and power efficiencies around 40 percent, are achievable in the laboratory-scale electrolysis of ZnCl2.
Gerasimov, Gennady
2016-09-01
The efficiency of the electron beam treatment of industrial flue gases for the removal of sulfur and nitrogen oxides was investigated, as applied to polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs), using methods of mathematical modeling. The proposed kinetic model of the process includes a mechanism of PCDD/F decomposition caused by their interaction with OH radicals generated in the flue gases under electron beam (EB) irradiation, as well as PCDD/F formation from unburned aromatic compounds. The model allows prediction of the main features of the process, which are observed in pilot plant installations, as well as evaluation of the process efficiency. The results of the calculations are compared with the available experimental data. Copyright © 2016 Elsevier Ltd. All rights reserved.
Agricultural drainage and wetland management in Ontario.
Walters, Dan; Shrubsole, Dan
2003-12-01
Land drainage is recognized as an integral part of agricultural activity throughout the world. However, the increase in agricultural production has resulted in the loss of wetland functions and values; wetland management and agricultural drainage therefore illustrate the conflict between economic development and natural values. This research assesses the approval process for agricultural land drainage in Ontario, Canada, to determine how the benefits of increased agricultural production are balanced against the loss of wetland values. A permit review of drainage applications from 1978 to 1997 was conducted in Zorra Township, Ontario, Canada. Data collection also included document reviews, interviews with government agencies, and wetland evaluation files. The selected criteria were efficiency, equity, consistency and adequacy. The results indicate that while the process is efficient, fundamental problems remain with the bargaining process.
Yalan Liu; Jinwu Wang; Michael P. Wolcott
2017-01-01
Currently, feedstock size effects on chemical pretreatment performance are not well understood due to the complexity of the pretreatment process and multiple evaluation standards, such as the sugar recovery in spent liquor or enzymatic digestibility. In this study, we evaluated the size effects in several ways: the sugar recovery and coproduct yields in spent liquor, the...
ERIC Educational Resources Information Center
Maryam, Ansary; Alireza, Shavakhi; Reza, Nasr Ahmad; Azizollah, Arbabisarjou
2012-01-01
Evaluation of faculty members' teaching is a means of recognizing their teaching ability and assessing students' learning, and it can improve the teaching efficiency of faculty members. Given the growth of computer technologies in universities and their effect on achievement and information processing, it is necessary to use…
Yang, Qi; Luo, Kun; Li, Xiao-ming; Wang, Dong-bo; Zheng, Wei; Zeng, Guang-ming; Liu, Jing-jin
2010-05-01
In this investigation, the effects of a commercial enzyme preparation containing alpha-amylase and neutral protease on the hydrolysis of excess sludge were evaluated, together with a kinetic analysis of the hydrolysis process. The results indicated that amylase treatment displayed higher hydrolysis efficiency than protease treatment. VSS reduction greatly increased, to 39.70% for protease and 54.24% for amylase at an enzyme dosage of 6% (w/w). The hydrolysis rate of sludge improved with temperature increasing from 40 to 50 degrees Celsius, which could be well described by the amended Arrhenius equation. The mixed-enzyme treatment had a greater impact on sludge solubilisation than either single enzyme. The mixture of the two enzymes (protease:amylase = 1:3) resulted in optimum hydrolysis efficiency: the efficiency of solids hydrolysis increased from 10% (control test) to 68.43% at a temperature of 50 degrees Celsius. Correspondingly, the concentrations of reducing sugar and NH(4)(+)-N improved by about 377% and 201%, respectively. According to the kinetic analysis of the enzymatic hydrolysis process, VSS solubilisation within the first 4 h followed first-order kinetics. Compared with the control test, the hydrolysis rate improved significantly at 50 degrees Celsius when either a single enzyme or the mixed enzymes were added. Copyright 2009. Published by Elsevier Ltd.
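The abstract names first-order kinetics and an amended Arrhenius equation without writing them out; in their standard (unamended) form, stated here only as background, they read

\[
\frac{d\,\mathrm{VSS}}{dt} = -k\,\mathrm{VSS}
\;\;\Rightarrow\;\;
\ln\frac{\mathrm{VSS}_0}{\mathrm{VSS}_t} = k\,t,
\qquad
k = A\,e^{-E_a/(RT)},
\]

with k the hydrolysis rate constant, A the pre-exponential factor, E_a the apparent activation energy, R the gas constant and T the absolute temperature; the authors' amended Arrhenius form is not reproduced here.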
Agile Electro-Mechanical Product Accelerator - Final Research Performance Progress Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmidt, Brian
2016-07-29
NCDMM recognized the need to focus on the most efficient use of limited resources while ensuring compliance with regulations and minimizing the energy intensity and environmental impact of manufactured components. This was accomplished through the evaluation of current machining and processing practices, and their efficiencies, to further the sustainability of manufacturing as a whole. Additionally, the activities identified, and furthered the implementation of, new "best practices" within the southwestern Pennsylvania manufacturing sector.
The Efficiency and the Scalability of an Explicit Operator on an IBM POWER4 System
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Biegel, Bryan A. (Technical Monitor)
2002-01-01
We present an evaluation of the efficiency and the scalability of an explicit CFD operator on an IBM POWER4 system. The POWER4 architecture exhibits a common trend in HPC architectures: boosting CPU processing power by increasing the number of functional units, while hiding the latency of memory access by increasing the depth of the memory hierarchy. The overall machine performance depends on the ability of the caches-buses-fabric-memory path to feed the functional units with the data to be processed. In this study we evaluate the efficiency and scalability of one explicit CFD operator on an IBM POWER4. This operator performs computations at the points of a Cartesian grid and involves a few dozen floating point numbers and on the order of 100 floating point operations per grid point. The computations at all grid points are independent. Specifically, we estimate the efficiency of the RHS operator (from the SP code of the NAS Parallel Benchmarks) on a single processor as the observed/peak performance ratio. Then we estimate the scalability of the operator on a single chip (2 CPUs), a single MCM (8 CPUs), 16 CPUs, and the whole machine (32 CPUs). We then perform the same measurements for a cache-optimized version of the RHS operator. For our measurements we use the HPM (Hardware Performance Monitor) counters available on the POWER4. These counters allow us to analyze the obtained performance results.
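The efficiency metric itself is simply the ratio of achieved to peak floating-point throughput; the short sketch below shows the arithmetic, treating the clock rate and flop count as placeholders rather than figures from the paper (a POWER4 core can retire two fused multiply-adds, i.e. four flops, per cycle).

```python
# Observed/peak efficiency from hardware-counter style numbers.
# The flop count and clock rate below are illustrative placeholders.
def efficiency(measured_flops, elapsed_s, clock_hz, flops_per_cycle=4):
    peak = clock_hz * flops_per_cycle          # per-CPU peak flops per second
    achieved = measured_flops / elapsed_s      # measured flops per second
    return achieved / peak

# e.g. 1.2e9 flops in 1.0 s on a 1.3 GHz CPU -> about 23% of peak
print(f"{efficiency(1.2e9, 1.0, 1.3e9):.1%}")
```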
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1979-01-01
Analyses of slicing processes and junction formation processes are presented. A simple method for evaluating the relative economic merits of competing process options with respect to the cost of energy produced by the system is described. An energy consumption analysis was developed and applied to determine the energy consumption in the solar module fabrication process sequence, from the mining of the SiO2 to shipping. The analysis shows that current technology practice involves inordinate energy use in the purification step and large wastage of the invested energy through losses, particularly poor conversion in slicing, as well as inadequate yields throughout. The cell process energy expenditures already show a downward trend based on increased throughput rates. The large improvement, however, depends on the introduction of a more efficient purification process and of acceptable ribbon growing techniques.
Degradation of Tetracycline with BiFeO3 Prepared by a Simple Hydrothermal Method
Xue, Zhehua; Wang, Ting; Chen, Bingdi; Malkoske, Tyler; Yu, Shuili; Tang, Yulin
2015-01-01
BiFeO3 particles (BFO) were prepared by a simple hydrothermal method and characterized. The BFO was pure, had a wide particle size distribution, and was visible-light responsive. Tetracycline was chosen as the model pollutant in this study. The pH value was an important factor influencing the degradation efficiency. The total organic carbon (TOC) measurement was emphasized as a potential standard for evaluating visible-light photocatalytic degradation efficiency. The photo-Fenton process showed much better degradation efficiency and a wider adaptive pH range than photocatalysis or the Fenton process alone. Under the conditions giving the lowest residual TOC, the residual TOC was 81%, 65% and 21% of the initial value for the photocatalysis, Fenton and photo-Fenton processes, respectively, and the corresponding rate constants were 9.7 × 10−3, 3.2 × 10−2 and 1.5 × 10−1 min−1. BFO was demonstrated to have excellent stability and reusability. A comparison among different reported advanced oxidation processes for removing tetracycline (TC) was also made. Our findings show that the photo-Fenton process has good potential for antibiotic-containing wastewater treatment and provides a new method to deal with antibiotic pollution. PMID:28793568
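Treating the reported rate constants as pseudo-first-order (a common convention for such data, assumed here rather than stated in the abstract), the decay of a pollutant concentration C follows

\[
\frac{C_t}{C_0} = e^{-kt}
\;\;\Rightarrow\;\;
t = \frac{1}{k}\,\ln\frac{C_0}{C_t},
\]

so, purely as an illustration, a rate constant of 1.5 × 10⁻¹ min⁻¹ would reduce a pollutant to 21% of its initial level in roughly ln(1/0.21)/0.15 ≈ 10 min, if first-order behaviour held throughout.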
A PFI mill can be used to predict biomechanical pulp strength properties
Gary F. Leatham; Gary C. Myers
1990-01-01
Recently, we showed that a biomechanical pulping process in which aspen chips are pretreated with a white-rot fungus can give energy savings and can increase paper sheet strength. To optimize this process, we need more efficient ways to evaluate the fungal treatments. Here, we examine a method that consists of treating coarse refiner mechanical pulp, refining in a PFI...
Wang, Ding; Bolton, James R; Hofmann, Ron
2012-10-01
The effectiveness of ultraviolet (UV) combined with chlorine as a novel advanced oxidation process (AOP) for drinking water treatment was evaluated in a bench scale study by comparing the rate of trichloroethylene (TCE) decay when using UV/chlorine to the rates of decay by UV alone and UV/hydrogen peroxide (H₂O₂) at various pH values. A medium pressure mercury UV lamp was used. The UV/chlorine process was more efficient than the UV/H₂O₂ process at pH 5, but in the neutral and alkaline pH range, the UV/H₂O₂ process became more efficient. The pH effect was probably controlled by the increasing concentration of OCl⁻ at higher pH values. A mechanistic kinetic model of the UV/chlorine treatment of TCE showed good agreement with the experimental data. Copyright © 2012 Elsevier Ltd. All rights reserved.
An automated dose tracking system for adaptive radiation therapy.
Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J
2018-02-01
The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Patient image data were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by the integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether, 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, the daily cumulative dose was computed in 3 h, and the manual work was limited to 13 min per case, with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient dose tracking system for ART in the clinical setting is presented. The software and automated processes were rigorously evaluated and validated using patient image datasets. Automation of the various procedures has improved efficiency significantly, allowing for the routine clinical application of ART for improving radiation therapy effectiveness. Copyright © 2017 Elsevier B.V. All rights reserved.
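The abstract describes the ART engine as a chain of modules (data import, pre-processing, deformable registration, dose mapping, accumulation, reporting); the sketch below is only a schematic of such an orchestration, and every name in it is a hypothetical stand-in rather than part of the in-house software.

```python
# Schematic ART dose-tracking loop; all names are hypothetical stand-ins for the
# modular components described in the abstract (DIR, dose mapping, accumulation).
def accumulate(cumulative, mapped_dose):
    """Point-wise dose accumulation on the reference anatomy."""
    if cumulative is None:
        return list(mapped_dose)
    return [c + d for c, d in zip(cumulative, mapped_dose)]

def track_course(fractions, reference_image, register, map_dose, report):
    """Process every treatment fraction through DIR, dose mapping and accumulation.

    `register`, `map_dose` and `report` are injected callables standing in for the
    engine modules; a GUI step could refine the deformation field where needed.
    """
    cumulative = None
    for daily_image, daily_dose in fractions:
        dvf = register(daily_image, reference_image)   # deformable image registration
        mapped = map_dose(daily_dose, dvf)             # warp daily dose to reference
        cumulative = accumulate(cumulative, mapped)
    report(cumulative)                                 # e.g. cumulative DVH / statistics
    return cumulative
```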
Minozzi, Clémentine; Caron, Antoine; Grenier-Petel, Jean-Christophe; Santandrea, Jeffrey; Collins, Shawn K
2018-05-04
A library of 50 copper-based complexes derived from bisphosphines and diamines was prepared and evaluated in three mechanistically distinct photocatalytic reactions. In all cases, a copper-based catalyst was identified that afforded high yields, and new heteroleptic complexes derived from the bisphosphine BINAP displayed high efficiency across all reaction types. Importantly, the evaluation of the library of copper complexes revealed that even when photophysical data are available, it is not always possible to predict which catalyst structure will be efficient or inefficient in a given process, emphasizing the advantages of catalyst structures with high modularity and structural variability. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Gear and survey efficiency of patent tongs for oyster populations on restoration reefs.
Schulte, David M; Lipcius, Romuald N; Burke, Russell P
2018-01-01
Surveys of restored oyster reefs need to produce accurate population estimates to assess the efficacy of restoration. Due to the complex structure of subtidal oyster reefs, one effective and efficient means to sample is by patent tongs, rather than SCUBA, dredges, or bottom cores. Restored reefs vary in relief and oyster density, either of which could affect survey efficiency. This study is the first to evaluate gear (the first full grab) and survey (which includes selecting a specific half portion of the first grab for further processing) efficiencies of hand-operated patent tongs as a function of reef height and oyster density on subtidal restoration reefs. In the Great Wicomico River, a tributary of lower Chesapeake Bay, restored reefs of high- and low-relief (25-45 cm, and 8-12 cm, respectively) were constructed throughout the river as the first large-scale oyster sanctuary reef restoration effort (sanctuary acreage > 20 ha at one site) in Chesapeake Bay. We designed a metal frame to guide a non-hydraulic mechanical patent tong repeatedly into the same plot on a restored reef until all oysters within the grab area were captured. Full capture was verified by an underwater remotely-operated vehicle. Samples (n = 19) were taken on nine different reefs, including five low- (n = 8) and four high-relief reefs (n = 11), over a two-year period. The gear efficiency of the patent tong was estimated to be 76% (± 5% standard error), whereas survey efficiency increased to 81% (± 10%) due to processing. Neither efficiency differed significantly between young-of-the-year oysters (spat) and adults, high- and low-relief reefs, or years. As this type of patent tong is a common and cost-effective tool to evaluate oyster restoration projects as well as population density on fished habitat, knowing the gear and survey efficiencies allows for accurate and precise population estimates.
Evaluation of phase separator number in hydrodesulfurization (HDS) unit
NASA Astrophysics Data System (ADS)
Jayanti, A. D.; Indarto, A.
2016-11-01
The removal of acid gases such as H2S in the natural gas processing industry is required in order to meet sales gas specifications. Hydrodesulfurization (HDS) is one of the refinery processes dedicated to reducing sulphur. In the HDS unit, the phase separator plays an important role in removing H2S from hydrocarbons, operating at a certain pressure and temperature. The number of separators in the system was therefore evaluated to understand its effect on performance and economics. The evaluation shows that all configurations were able to meet the H2S specification of the desired product. However, the one-separator system resulted in the highest capital and operational costs. The two-separator system showed the best performance in terms of energy efficiency, with the lowest capital and operating costs, and is therefore recommended as a reference for removing H2S from natural gas in the HDS unit.
DOT National Transportation Integrated Search
1998-09-01
Commercial Vehicle Administrative (CVO) Processes Cross-Cutting report summarizes and interprets the results of several Field Operational Tests (FOTs) conducted to evaluate systems that increase the efficiency of commercial vehicle administrative pro...
Evaluation of the late merge work zone traffic control strategy.
DOT National Transportation Integrated Search
2004-01-01
Several alternative lane merge strategies have been proposed in recent years to process vehicles through work zone lane closures more safely and efficiently. Among these is the late merge. With the late merge, drivers are instructed to use all lanes ...
NASA Astrophysics Data System (ADS)
Hartmann, D.; Sarfert, W.; Meier, S.; Bolink, H.; García Santamaría, S.; Wecker, J.
2010-05-01
Typically, highly efficient OLED device structures are based on a multitude of stacked thin organic layers prepared by thermal evaporation. For lighting applications these efficient device stacks have to be up-scaled to large areas, which is clearly challenging in terms of high-throughput processing at low cost. One promising approach to meet cost efficiency, high throughput and high light output is the combination of solution and evaporation processing. Moreover, the objective is to substitute as many thermally evaporated layers as possible by solution processing without sacrificing device performance. Hence, starting from the anode side, evaporated layers of an efficient white-light-emitting OLED stack are stepwise replaced by solution-processable polymer and small-molecule layers. In doing so, different solution-processable hole injection layers (polymer HILs) are integrated into small-molecule devices and evaluated with regard to their electro-optical performance as well as their planarizing properties, i.e. the ability to cover ITO spikes, defects and dust particles. Two approaches are followed: in the "single HIL" approach only one polymer HIL is coated, whereas in the "combined HIL" concept the coated polymer HIL is combined with a thin evaporated HIL. These HIL architectures are studied in unipolar as well as bipolar devices. As a result, the combined HIL approach provides better control over the hole current, improved device stability, and improved current and power efficiency compared with a single HIL and with purely small-molecule-based OLED stacks. Furthermore, emitting layers based on guest/host small molecules are fabricated from solution and integrated into a white hybrid stack (WHS). Up to three evaporated layers were successfully replaced by solution processing, showing white-light emission spectra comparable to an evaporated small-molecule reference stack and lifetimes of several hundred hours.
Coderre, Sylvain; Woloschuk, Wayne; McLaughlin, Kevin
2009-04-01
Content validity is a requirement of every evaluation and is achieved when the evaluation content is congruent with the learning objectives and the learning experiences. Congruence between these three pillars of education can be facilitated by blueprinting. Here we describe an efficient process for creating a blueprint and explain how to use this tool to guide all aspects of course creation and evaluation. A well-constructed blueprint is a valuable tool for medical educators. In addition to validating evaluation content, a blueprint can also be used to guide selection of curricular content and learning experiences.
Applicability of Zeolite Based Systems for Ammonia Removal and Recovery From Wastewater.
Das, Pallabi; Prasad, Bably; Singh, Krishna Kant Kumar
2017-09-01
Ammonia discharged in industrial effluents has deleterious effects and necessitates remediation. Integrated systems devoted to the recovery of ammonia in a useful form, together with remediation, address the challenges of waste management and utilization. A comparative performance evaluation study was undertaken to assess the suitability of different zeolite-based systems (commercial zeolites and zeolites synthesized from fly ash) for removal of ammonia followed by its subsequent release. Four main parameters were studied to evaluate the applicability of such systems for large-scale usage: cost-effectiveness, ammonia removal efficiency, performance on regeneration, and ammonia release percentage. The results indicated that synthetic zeolites outperformed zeolites synthesized from fly ash, although the latter proved more economical in terms of total cost incurred. Process technology development in this direction will be a trade-off between cost and the ammonia removal and release efficiencies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMordie Stoughton, Kate; Duan, Xiaoli; Wendel, Emily M.
This technology evaluation was prepared by Pacific Northwest National Laboratory on behalf of the U.S. Department of Energy’s Federal Energy Management Program (FEMP). The technology evaluation assesses techniques for optimizing reverse osmosis (RO) systems to increase RO system performance and water efficiency. This evaluation provides a general description of RO systems, the influence of RO systems on water use, and key areas where RO systems can be optimized to reduce water and energy consumption. The evaluation is intended to help facility managers at Federal sites understand the basic concepts of the RO process and system optimization options, enabling them to make informed decisions during the system design process for either new projects or recommissioning of existing equipment. This evaluation is focused on commercial-sized RO systems generally treating more than 80 gallons per hour.
A detailed evaluation of heating processes in the middle atmosphere
NASA Technical Reports Server (NTRS)
Mlynczak, Martin; Solomon, Susan
1994-01-01
A fundamental problem in the study of the terrestrial middle atmosphere is to calculate accurately the local heating due to the absorption of solar radiation. Knowledge of the heat budget is essential to understanding the atmospheric thermal structure, atmospheric motions, atmospheric chemistry, and their coupling. The evaluation of heating rates is complicated (especially above the stratopause) by the fact that the heating is not a simple one-step process. That is, the absorbed solar energy does not all immediately appear as heat. Rather, substantial portions of the incident energy may appear as internal energy of excited photolysis products (e.g., O(1D) or O2(1 delta)) or as chemical potential energy of product species such as atomic oxygen. The ultimate disposition of the internal and chemical energy possessed by the photolysis products determines the efficiency and thus the rate at which the middle atmosphere is heated. In studies of the heat budget, it is also vitally important to consider transport of long-lived chemical species such as atomic oxygen above approximately 80 km. In such cases, the chemical potential energy may be transported great distances (horizontally or vertically) before undergoing a reaction to release the heat. Atomic oxygen influences the heating not only by reactions with itself and with O2 but also by reactions with odd-hydrogen species, especially those involving OH (Mlynczak and Solomon, 1991a). Consequently, absorbed solar energy may finally be converted to heat long after, and at a location far from, the original deposition. The purpose of this paper is to examine the solar and chemical heating processes and to present parameterizations for the heating efficiencies readily applicable for use in numerical models and heat budget studies. In the next two sections the processes relevant to the heating efficiencies for ozone and molecular oxygen will be reviewed. In Section 4 the processes for the exothermic reactions will be reviewed, and parameterizations for the heating efficiencies for both the solar and chemical processes will be presented in Section 5.
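As a rough illustration of the quantity such parameterizations supply (generic notation for illustration, not the authors' own symbols), the local heating rate can be written as an efficiency factor applied to the rate of solar energy absorption:

% Generic heating-rate parameterization (symbols assumed for illustration):
% Q - local heating rate, J - rate of solar energy absorption, epsilon - heating efficiency.
\begin{equation}
  Q(z) = \varepsilon(z)\, J(z), \qquad 0 \le \varepsilon(z) \le 1,
\end{equation}
% epsilon falls below unity when part of the absorbed energy is stored as internal energy
% of excited photolysis products (e.g., O(1D), O2(1 Delta)) or as chemical potential energy
% of atomic oxygen and is released as heat elsewhere through exothermic reactions.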
Layout optimization of GGISCR structure for on-chip system level ESD protection applications
NASA Astrophysics Data System (ADS)
Zeng, Jie; Dong, Shurong; Wong, Hei; Hu, Tao; Li, Xiang
2016-12-01
To improve the holding voltage, area efficiency and robustness, a comparative study of single-finger, 4-finger and round-shape layouts of gate-grounded-nMOS incorporated SCR (GGISCR) devices is conducted. The devices were fabricated with a commercial 0.35 μm HV-CMOS process without any additional mask or process modification. To enable a fair comparison, we develop a new Figure-of-Merit (FOM) model for the performance evaluation of these devices. We found that the ring-type device, which has an It2 value of 18.9 A, is area-efficient and has a smaller effective capacitance. The different characteristics are explained by the different effective ESD currents in these layout structures.
High resolution, low cost solar cell contact development
NASA Technical Reports Server (NTRS)
Mardesich, N.
1981-01-01
The MIDFILM cell fabrication and encapsulation processes were demonstrated as a means of applying low-cost solar cell collector metallization. The average cell efficiency of 12.0 percent (AM1, 28 C) was achieved with fritted silver metallization with a demonstration run of 500 starting wafers. A 98 percent mechanical yield and 80 percent electrical yield were achieved through the MIDFILM process. High series resistance was responsible for over 90 percent of the electrical failures and was the major factor causing the low average cell efficiency. Environmental evaluations suggest that the MIDFILM cells do not degrade. A slight degradation in power was experienced in the MIDFILM minimodules when the AMP Solarlok connector delaminated during the environmental testing.
Energy-efficient hierarchical processing in the network of wireless intelligent sensors (WISE)
NASA Astrophysics Data System (ADS)
Raskovic, Dejan
Sensor network nodes have benefited from technological advances in the field of wireless communication, processing, and power sources. However, the processing power of microcontrollers is often not sufficient to perform sophisticated processing, while the power requirements of digital signal processing boards or handheld computers are usually too demanding for prolonged system use. We are matching the intrinsic hierarchical nature of many digital signal-processing applications with the natural hierarchy in distributed wireless networks, and building the hierarchical system of wireless intelligent sensors. Our goal is to build a system that will exploit the hierarchical organization to optimize the power consumption and extend battery life for the given time and memory constraints, while providing real-time processing of sensor signals. In addition, we are designing our system to be able to adapt to the current state of the environment, by dynamically changing the algorithm through procedure replacement. This dissertation presents the analysis of hierarchical environment and methods for energy profiling used to evaluate different system design strategies, and to optimize time-effective and energy-efficient processing.
Ashrafi, Omid; Yerushalmi, Laleh; Haghighat, Fariborz
2015-08-01
Pulp-and-paper mills produce various types of contaminants and a significant amount of wastewater depending on the type of processes used in the plant. Since the generated wastewaters can be potentially polluting and very dangerous, they should be treated in wastewater treatment plants before being released to the environment. This paper reviews different wastewater treatment processes used in the pulp-and-paper industry and compares them with respect to their contaminant removal efficiencies and the extent of greenhouse gas (GHG) emission. It also evaluates the impact of operating parameters on the performance of different treatment processes. Two mathematical models were used to estimate GHG emission in common biological treatment processes used in the pulp-and-paper industry. Nutrient removal processes and sludge treatment are discussed and their associated GHG emissions are calculated. Although both aerobic and anaerobic biological processes are appropriate for wastewater treatment, their combination known as hybrid processes showed a better contaminant removal capacity at higher efficiencies under optimized operating conditions with reduced GHG emission and energy costs. Copyright © 2015 Elsevier Ltd. All rights reserved.
Advanced Video Analysis Needs for Human Performance Evaluation
NASA Technical Reports Server (NTRS)
Campbell, Paul D.
1994-01-01
Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.
FTIR Spectroscopy for Evaluation and Monitoring of Lipid Extraction Efficiency for Oleaginous Fungi.
Forfang, Kristin; Zimmermann, Boris; Kosa, Gergely; Kohler, Achim; Shapaval, Volha
2017-01-01
To assess whether Fourier Transform Infrared (FTIR) spectroscopy could be used to evaluate and monitor lipid extraction processes, the extraction methods of Folch, Bligh and Lewis were used. Biomass of the oleaginous fungi Mucor circinelloides and Mortierella alpina were employed as lipid-rich material for the lipid extraction. The presence of lipids was determined by recording infrared spectra of all components in the lipid extraction procedure, such as the biomass before and after extraction, the water and extract phases. Infrared spectra revealed the incomplete extraction after all three extraction methods applied to M. circinelloides and it was shown that mechanical disruption using bead beating and HCl treatment were necessary to complete the extraction in this species. FTIR spectroscopy was used to identify components, such as polyphosphates, that may have negatively affected the extraction process and resulted in differences in extraction efficiency between M. circinelloides and M. alpina. Residual lipids could not be detected in the infrared spectra of M. alpina biomass after extraction using the Folch and Lewis methods, indicating their complete lipid extraction in this species. Bligh extraction underestimated the fatty acid content of both M. circinelloides and M. alpina biomass and an increase in the initial solvent-to-sample ratio (from 3:1 to 20:1) was needed to achieve complete extraction and a lipid-free IR spectrum. In accordance with previous studies, the gravimetric lipid yield was shown to overestimate the potential of the SCO producers and FAME quantification in GC-FID was found to be the best-suited method for lipid quantification. We conclude that FTIR spectroscopy can serve as a tool for evaluating the lipid extraction efficiency, in addition to identifying components that may affect lipid extraction processes.
Industrial application of semantic process mining
NASA Astrophysics Data System (ADS)
Espen Ingvaldsen, Jon; Atle Gulla, Jon
2012-05-01
Process mining relates to the extraction of non-trivial and useful information from information system event logs. It is a new research discipline that has evolved significantly since the early work on idealistic process logs. Over the last years, process mining prototypes have incorporated elements from semantics and data mining and targeted visualisation techniques that are more user-friendly to business experts and process owners. In this article, we present a framework for evaluating different aspects of enterprise process flows and address practical challenges of state-of-the-art industrial process mining. We also explore the inherent strengths of the technology for more efficient process optimisation.
Real-time Medical Emergency Response System: Exploiting IoT and Big Data for Public Health.
Rathore, M Mazhar; Ahmad, Awais; Paul, Anand; Wan, Jiafu; Zhang, Daqiang
2016-12-01
Healthy people are important for any nation's development. Use of the Internet of Things (IoT)-based body area networks (BANs) is increasing for continuous monitoring and medical healthcare in order to perform real-time actions in case of emergencies. However, in the case of monitoring the health of all citizens or people in a country, the millions of sensors attached to human bodies generate massive volume of heterogeneous data, called "Big Data." Processing Big Data and performing real-time actions in critical situations is a challenging task. Therefore, in order to address such issues, we propose a Real-time Medical Emergency Response System that involves IoT-based medical sensors deployed on the human body. Moreover, the proposed system consists of the data analysis building, called "Intelligent Building," depicted by the proposed layered architecture and implementation model, and it is responsible for analysis and decision-making. The data collected from millions of body-attached sensors is forwarded to Intelligent Building for processing and for performing necessary actions using various units such as collection, Hadoop Processing (HPU), and analysis and decision. The feasibility and efficiency of the proposed system are evaluated by implementing the system on Hadoop using an Ubuntu 14.04 LTS Core i5 machine. Various medical sensory datasets and real-time network traffic are considered for evaluating the efficiency of the system. The results show that the proposed system has the capability of efficiently processing WBAN sensory data from millions of users in order to perform real-time responses in case of emergencies.
Thébault, Caroline J; Ramniceanu, Grégory; Michel, Aude; Beauvineau, Claire; Girard, Christian; Seguin, Johanne; Mignet, Nathalie; Ménager, Christine; Doan, Bich-Thuy
2018-06-25
The development of theranostic nanocarriers as an innovative therapy against cancer has been improved by targeting properties in order to optimize drug delivery and safely achieve the desired therapeutic effect. The aim of this paper is to evaluate the magnetic targeting (MT) efficiency of ultra-magnetic liposomes (UML) in CT26 murine colon tumors by magnetic resonance imaging (MRI). Dynamic susceptibility contrast MRI was applied to assess the bloodstream circulation time. A novel semi-quantitative method called %I0.25, based on the intensity distribution in T2*-weighted MRI images, was developed to compare the accumulation of T2 contrast agent in tumors with or without MT. To evaluate the efficiency of magnetic targeting, the percentage of pixels under the intensity value I0.25 (I0.25 = 0.25(Imax - Imin)) was calculated on the intensity distribution histogram. This innovative method of processing MRI images showed the MT efficiency by a %I0.25 that was significantly higher in tumors using MT compared to passive accumulation, rising from 15.3% to 28.6%. This methodology was validated by ex vivo methods, with an iron concentration that was 3-fold higher in tumors using MT. We have developed a method that allows a semi-quantitative evaluation of targeting efficiency in tumors, which could be applied to different T2 contrast agents.
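A minimal sketch of the %I0.25 metric as described above, assuming the tumor region of a T2*-weighted image is available as a NumPy array of pixel intensities; the function name, the ROI extraction and the example arrays are illustrative, not taken from the paper:

import numpy as np

def percent_I025(tumor_pixels: np.ndarray) -> float:
    """Percentage of tumor pixels whose intensity falls below
    I_0.25 = 0.25 * (I_max - I_min), computed from the intensity histogram
    of the tumor ROI in a T2*-weighted image (signal voids indicate iron)."""
    i_min, i_max = tumor_pixels.min(), tumor_pixels.max()
    threshold = 0.25 * (i_max - i_min)          # I_0.25 as defined in the abstract
    return 100.0 * np.mean(tumor_pixels < threshold)

# Hypothetical usage: compare passive accumulation vs. magnetic targeting (MT)
# roi_passive, roi_mt = ...  # 1-D arrays of tumor ROI intensities
# print(percent_I025(roi_passive), percent_I025(roi_mt))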
Arsenic Removal and Its Chemistry in Batch Electrocoagulation Studies.
Sharma, Anshul; Adapureddy, Sri Malini; Goel, Sudha
2014-04-01
The aim of this study was to evaluate the impact of different oxidizing agents such as light, aeration (by mixing) and electrocoagulation (EC) on the oxidation of As(III) and its subsequent removal in an EC batch reactor. Arsenic solutions prepared using distilled water and groundwater were evaluated. Optimum pH and the effect of varying initial pH on As removal efficiency were also evaluated. Maximum As(III) removal efficiency with EC, light and aeration was 97% from distilled water and 71% from groundwater. Other results show that EC alone resulted in 90% As removal efficiency from distilled water in the absence of light and mixing, and 53.6% from groundwater. Removal with light and mixing but without EC resulted in only 26% As removal from distilled water and 29% from groundwater, proving that electro-oxidation and coagulation were more effective in removing arsenic compared to the other oxidizing agents examined. Initial pH was varied from 5 to 10 in distilled water and from 3 to 12 in groundwater for evaluating arsenic removal efficiency by EC. The optimum initial pH for arsenic removal was 7 for distilled water and groundwater. For all initial pHs tested between 5 and 10 in distilled water, the final pH ranged between 7 and 8, indicating that the EC process tends towards near-neutral pH under the conditions examined in this study.
Eriksson, J; Ek, A; Johansson, G
2000-03-01
A software prototype to support the planning process for adapting home and work environments for people with physical disabilities was designed and later evaluated. The prototype exploits low-cost three-dimensional (3-D) graphics products in the home computer market. The essential features of the prototype are: interactive rendering with optional hardware acceleration, interactive walk-throughs, direct manipulation tools for moving objects and measuring distances, and import of 3-D-objects from a library. A usability study was conducted, consisting of two test sessions (three weeks apart) and a final interview. The prototype was then tested and evaluated by representatives of future users: five occupational therapist students, and four persons with physical disability, with no previous experience of the prototype. Emphasis in the usability study was placed on the prototype's efficiency and learnability. We found that it is possible to realise a planning tool for environmental adaptations, both regarding usability and technical efficiency. The usability evaluation confirms our findings from previous case studies, regarding the relevance and positive attitude towards this kind of planning tool. Although the prototype was found to be satisfactorily efficient for the basic tasks, the paper presents several suggestions for improvement of future prototype versions.
Fournier, Eric D; Keller, Arturo A; Geyer, Roland; Frew, James
2016-02-16
This project investigates the energy-water usage efficiency of large-scale civil infrastructure projects involving the artificial recharge of subsurface groundwater aquifers via the reuse of treated municipal wastewater. A modeling framework is introduced which explores the various ways in which spatially heterogeneous variables such as topography, land use, and subsurface infiltration capacity combine to determine the physical layout of proposed reuse system components and their associated process energy-water demands. This framework is applied to the planning and evaluation of the energy-water usage efficiency of hypothetical reuse systems in five case study regions within the State of California. Findings from these case study analyses suggest that, in certain geographic contexts, the water requirements attributable to the process energy consumption of a reuse system can exceed the volume of water that it is able to recover by as much as an order of magnitude.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W; Benton, Nathanael; Burns, Patrick
Compressed-air systems are used widely throughout industry for many operations, including pneumatic tools, packaging and automation equipment, conveyors, and other industrial process operations. Compressed-air systems are defined as a group of subsystems composed of air compressors, air treatment equipment, controls, piping, pneumatic tools, pneumatically powered machinery, and process applications using compressed air. A compressed-air system has three primary functional subsystems: supply, distribution, and demand. Air compressors are the primary energy consumers in a compressed-air system and are the primary focus of this protocol. The two compressed-air energy efficiency measures specifically addressed in this protocol are: high-efficiency/variable speed drive (VSD) compressor replacing modulating, load/unload, or constant-speed compressor; and compressed-air leak survey and repairs. This protocol provides direction on how to reliably verify savings from these two measures using a consistent approach for each.
NASA Astrophysics Data System (ADS)
Pop, Florin; Dobre, Ciprian; Mocanu, Bogdan-Costel; Citoteanu, Oana-Maria; Xhafa, Fatos
2016-11-01
Managing the large dimensions of data processed in distributed systems that are formed by datacentres and mobile devices has become a challenging issue with an important impact on the end-user. Therefore, the management process of such systems can be achieved efficiently by using uniform overlay networks, interconnected through secure and efficient routing protocols. The aim of this article is to advance our previous work with a novel trust model based on a reputation metric that actively uses the social links between users and the model of interaction between them. We present and evaluate an adaptive model for trust management in structured overlay networks, based on a Mobile Cloud architecture and considering a honeycomb overlay. Such a model can be useful for supporting advanced mobile market-share e-Commerce platforms, where users collaborate and exchange reliable information about, for example, products of interest, and for supporting ad-hoc business campaigns.
González, Sheyla; Ibáñez, Elena
2010-01-01
Purpose: The aim of the present study is to compare three previously described mouse embryonic stem cell derivation methods to evaluate the influence of culture conditions, number of isolated blastomeres and embryonic stage in the derivation process. Methods: Three embryonic stem cell derivation methods (standard, pre-adhesion and defined culture medium) were compared in the derivation from isolated blastomeres and whole embryos at the 4- and 8-cell stages. Results: A total of 200 embryonic stem cell lines were obtained with an efficiency ranging from 1.9% to 72%. Conclusions: Using either isolated blastomeres or whole embryos, the highest rates of mouse embryonic stem cell establishment were achieved with the defined culture medium method, and efficiencies increased as development progressed. Using isolated blastomeres, efficiencies increased in parallel to the proportion of the embryo volume used to start the derivation process. PMID:20862536
Ambulatory surgery centers best practices for the 90s.
Hoover, J A
1994-05-01
Outpatient surgery will be the driving force in the continued growth of ambulatory care in the 1990s. Providing efficient, high-quality ambulatory surgical services should therefore be a priority among healthcare providers. Arthur Andersen conducted a survey to discover best practices in ambulatory surgical service. General success characteristics of best performers were business-focused relationships with physicians, the use of clinical protocols, patient convenience, cost management, strong leadership, teamwork, streamlined processes and efficient design. Other important factors included scheduling to maximize OR room use; achieving surgical efficiencies through reduced case pack assembly errors and equipment availability; a focus on cost capture rather than charge capture; sound materiel management practices, such as standardization and vendor teaming; and the appropriate use of automated systems. It is important to evaluate whether the best practices are applicable to your environment and what specific changes to your current processes would be necessary to adopt them.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-12-01
The objective of this proposed program is to evaluate the potential of rotating gas-liquid contactors for natural gas processing by expanding the currently available database. This expansion will focus on application of this technology to environments representative of those typically encountered in natural gas processing plants. Operational and reliability concerns will be addressed while generating pertinent engineering data relating to the mass-transfer process. Work to be performed in this reporting period includes: completing all negotiations and processing of agreements; completing assembly, modifications, and shakedown, and conducting fluid dynamic studies using the plastic rotary contactor unit; confirming the project test matrix; and locating and transporting an amine plant and a dehydration plant. Accomplishments for this period are presented.
Ho, Tiffany C; Zhang, Shunan; Sacchet, Matthew D; Weng, Helen; Connolly, Colm G; Henje Blom, Eva; Han, Laura K M; Mobayed, Nisreen O; Yang, Tony T
2016-01-01
While the extant literature has focused on major depressive disorder (MDD) as being characterized by abnormalities in processing affective stimuli (e.g., facial expressions), little is known regarding which specific aspects of cognition influence the evaluation of affective stimuli, and what are the underlying neural correlates. To investigate these issues, we assessed 26 adolescents diagnosed with MDD and 37 well-matched healthy controls (HCL) who completed an emotion identification task of dynamically morphing faces during functional magnetic resonance imaging (fMRI). We analyzed the behavioral data using a sequential sampling model of response time (RT) commonly used to elucidate aspects of cognition in binary perceptual decision making tasks: the Linear Ballistic Accumulator (LBA) model. Using a hierarchical Bayesian estimation method, we obtained group-level and individual-level estimates of LBA parameters on the facial emotion identification task. While the MDD and HCL groups did not differ in mean RT, accuracy, or group-level estimates of perceptual processing efficiency (i.e., drift rate parameter of the LBA), the MDD group showed significantly reduced responses in left fusiform gyrus compared to the HCL group during the facial emotion identification task. Furthermore, within the MDD group, fMRI signal in the left fusiform gyrus during affective face processing was significantly associated with greater individual-level estimates of perceptual processing efficiency. Our results therefore suggest that affective processing biases in adolescents with MDD are characterized by greater perceptual processing efficiency of affective visual information in sensory brain regions responsible for the early processing of visual information. The theoretical, methodological, and clinical implications of our results are discussed.
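For readers unfamiliar with the LBA, the following is a minimal forward simulation of the model for a two-choice task, assuming illustrative parameter values; it is a sketch of the model itself, not the authors' hierarchical Bayesian estimation code:

import numpy as np

def simulate_lba(n_trials, b=1.0, A=0.5, v=(0.8, 0.6), s=0.25, t0=0.2, rng=None):
    """Simulate the Linear Ballistic Accumulator for a two-choice task.
    Each accumulator starts at a uniform point in [0, A] and rises linearly
    toward threshold b with a drift rate drawn from N(v_i, s); the first
    accumulator to reach b determines the response and RT (plus non-decision
    time t0). Parameter values here are illustrative, not fitted estimates."""
    rng = rng or np.random.default_rng(0)
    v = np.asarray(v)
    starts = rng.uniform(0.0, A, size=(n_trials, v.size))
    drifts = rng.normal(v, s, size=(n_trials, v.size))
    drifts = np.where(drifts > 0, drifts, 1e-6)   # keep rates positive
    times = (b - starts) / drifts                 # time for each accumulator to reach b
    choice = times.argmin(axis=1)
    rt = times.min(axis=1) + t0
    return rt, choice

rt, choice = simulate_lba(1000)
print(rt.mean(), np.bincount(choice) / choice.size)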
BTEX biodegradation by bacteria from effluents of petroleum refinery.
Mazzeo, Dânia Elisa Christofoletti; Levy, Carlos Emílio; de Angelis, Dejanira de Franceschi; Marin-Morales, Maria Aparecida
2010-09-15
Groundwater contamination with benzene, toluene, ethylbenzene and xylene (BTEX) has been increasing, thus requiring the urgent development of methodologies able to remove or minimize the damage these compounds can cause to the environment. The biodegradation process using microorganisms has been regarded as an efficient technology to treat sites contaminated with hydrocarbons, since microorganisms are able to biotransform and/or biodegrade target pollutants. To prove the efficiency of this process, besides chemical analysis, the use of biological assessments has been indicated. This work identified and selected BTEX-biodegrading microorganisms present in effluents from a petroleum refinery, and evaluated the efficiency of the microbial biodegradation process for reducing genotoxic and mutagenic BTEX damage through two test systems: Allium cepa and hepatoma tissue culture (HTC) cells. Five different non-biodegraded BTEX concentrations were evaluated in relation to biodegraded concentrations. The biodegradation process was performed in a BOD Trak Apparatus (HACH) for 20 days, using microorganisms pre-selected through enrichment. Although biodegradation usually occurs by a consortium of different microorganisms, the consortium in this study was composed exclusively of five bacterial species, and the bacterium Pseudomonas putida was considered responsible for the BTEX biodegradation. The chemical analyses showed that BTEX was reduced in the biodegraded concentrations. The results obtained with genotoxicity assays, carried out with both A. cepa and HTC cells, showed that the biodegradation process was able to decrease the genotoxic damage of BTEX. By mutagenicity tests, we observed a decrease in damage only for the A. cepa organism. Although no decrease in mutagenicity was observed for HTC cells, no increase of this effect after the biodegradation process was observed either. The application of pre-selected bacteria in biodegradation processes can represent a reliable and effective tool in the treatment of water contaminated with a BTEX mixture. Therefore, the raw petroleum refinery effluent might be a source of hydrocarbon-biodegrading microorganisms. Copyright 2010 Elsevier B.V. All rights reserved.
Ma, Dehua; Chen, Lujun; Wu, Yuchao; Liu, Rui
2016-06-01
Antiestrogens and antiandrogens are relatively rarely studied endocrine disrupting chemicals which can be found in un/treated wastewaters. Antiestrogens and antiandrogens in the wastewater treatment effluents could contribute to sexual disruption of organisms. In this study, to assess the removal of non-specific antiestrogens and antiandrogens by advanced treatment processes, ozonation and adsorption to granular activated carbon (GAC), the biological activities and excitation emission matrix fluorescence spectroscopy of wastewater were evaluated. As the applied ozone dose increased to 12 mg/L, the antiestrogenic activity dramatically decreased to 3.2 μg 4-hydroxytamoxifen equivalent (4HEQ)/L, with a removal efficiency of 84.8%, while the antiandrogenic activity was 23.1 μg flutamide equivalent (FEQ)/L, with a removal efficiency of 75.5%. The removal of antiestrogenic/antiandrogenic activity has high correlation with the removal of fulvic acid-like materials and humic acid-like organics, suggesting that they can be used as surrogates for antiestrogenic/antiandrogenic activity during ozonation. The adsorption kinetics of antiestrogenic activity and antiandrogenic activity were well described by pseudo-second-order kinetics models. The estimated equilibrium concentration of antiestrogenic activity is 7.9 μg 4HEQ/L with an effective removal efficiency of 70.5%, while the equilibrium concentration of antiandrogenic activity is 33.7 μg FEQ/L with a removal efficiency of 67.0%. Biological activity evaluation of wastewater effluents is an attractive way to assess the removal of endocrine disrupting chemicals by different treatment processes. Fluorescence spectroscopy can be used as a surrogate measure of bioassays during ozonation. Copyright © 2016 Elsevier Ltd. All rights reserved.
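For reference, the pseudo-second-order kinetic model invoked above is conventionally written as follows (generic notation; the study's fitted constants are not reproduced here):

% Pseudo-second-order adsorption kinetics (generic notation):
% q_t - amount (or activity) adsorbed at time t, q_e - equilibrium value, k_2 - rate constant.
\begin{equation}
  \frac{dq_t}{dt} = k_2\,(q_e - q_t)^2
  \quad\Longrightarrow\quad
  \frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e},
\end{equation}
% so a plot of t/q_t versus t is linear, with q_e obtained from the slope
% and k_2 from the intercept.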
Chen, Chia-Wei; Chow, Chi-Wai; Liu, Yang; Yeh, Chien-Hung
2017-10-02
Recently, even low-end mobile phones are equipped with a high-resolution complementary-metal-oxide-semiconductor (CMOS) image sensor. This motivates using a CMOS image sensor for visible light communication (VLC). Here we propose and demonstrate an efficient demodulation scheme to synchronize and demodulate the rolling shutter pattern in image-sensor-based VLC. The implementation algorithm is discussed. The bit-error-rate (BER) performance and processing latency are evaluated and compared with other thresholding schemes.
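A minimal sketch of a generic rolling-shutter demodulation baseline of the kind such schemes are compared against, assuming a grayscale frame captured from a modulated LED; the moving-average thresholding here is an illustrative stand-in, not the authors' proposed scheme:

import numpy as np

def demodulate_rolling_shutter(frame: np.ndarray, window: int = 15) -> np.ndarray:
    """Generic rolling-shutter demodulation baseline (not the paper's scheme):
    collapse each image row to its mean grey level (rows are exposed
    sequentially, so the LED's on/off states appear as bright/dark stripes),
    then slice against a moving-average threshold to recover one bit per row."""
    column = frame.mean(axis=1)                            # one brightness value per row
    kernel = np.ones(window) / window
    threshold = np.convolve(column, kernel, mode="same")   # adapts to uneven illumination
    return (column > threshold).astype(int)

# Hypothetical usage on a grayscale frame:
# bits_per_row = demodulate_rolling_shutter(gray_frame)
# Runs of identical bits still need to be mapped to symbols using the known
# row-exposure rate (synchronization), which is part of what the paper addresses.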
Energy Efficiency Evaluation and Benchmarking of AFRL’s Condor High Performance Computer
2011-08-01
Conference paper (post print); dates covered: January 2011 – June 2011. The Condor cluster comprises 1716 Sony PlayStation 3s (PS3s), adding up to a total of 69,940 cores and a theoretical peak performance of 500 TFLOPS. There are 84 subcluster head... Thus, a critical component to achieving maximum performance is to find the optimum division of processing load between the CPU and GPU.
NASA Astrophysics Data System (ADS)
Jian, Wei; Estevez, Claudio; Chowdhury, Arshad; Jia, Zhensheng; Wang, Jianxin; Yu, Jianguo; Chang, Gee-Kung
2010-12-01
This paper presents an energy-efficient Medium Access Control (MAC) protocol for very-high-throughput millimeter-wave (mm-wave) wireless sensor communication networks (VHT-MSCNs) based on the hybrid multiple access techniques of frequency division multiple access (FDMA) and time division multiple access (TDMA). An energy-efficient superframe for wireless sensor communication networks employing directional mm-wave wireless access technologies is proposed for systems that require very high throughput, such as high-definition video signals, for sensing, processing, transmitting, and actuating functions. Energy consumption modeling for each network element and comparisons among various multi-access technologies in terms of power and MAC layer operations are investigated for evaluating the energy-efficiency improvement of the proposed MAC protocol.
Preliminary Evaluation of a Diagnostic Tool for Prosthetics
2017-10-01
volume change. Processing algorithms for data from the activity monitors were modified to run more efficiently so that large datasets could be... [Figure captions: "...(left) and blade style prostheses (right)"; "Figure 4: Ankle ActiGraph correct position demonstrated for a left leg below-knee amputee cylindrical..."]
A protocol for the health and fitness assessment of NBA players.
Scheller, A; Rask, B
1993-04-01
The assessment of the health and fitness of elite basketball players should be a multidisciplinary process. We have described an organized, efficient, and comprehensive protocol for preseason physical evaluations that could be used at the university as well as professional level.
Computer program determines performance efficiency of remote measuring systems
NASA Technical Reports Server (NTRS)
Merewether, E. K.
1966-01-01
Computer programs control and evaluate instrumentation system performance for numerous rocket engine test facilities and prescribe calibration and maintenance techniques to maintain the systems within process specifications. Similar programs can be written for other test equipment in an industry such as the petrochemical industry.
Modified Universal Design Survey: Enhancing Operability of Launch Vehicle Ground Crew Worksites
NASA Technical Reports Server (NTRS)
Blume, Jennifer L.
2010-01-01
Operability is a driving requirement for next generation space launch vehicles. Launch site ground operations include numerous operator tasks to prepare the vehicle for launch or to perform preflight maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To promote operability, a Design Quality Evaluation Survey based on the Universal Design framework was developed to support Human Factors Engineering (HFE) evaluation for NASA's launch vehicles. Universal Design per se is not a priority for launch vehicle processing; however, applying principles of Universal Design will increase the probability of an error-free and efficient design which promotes operability. The Design Quality Evaluation Survey incorporates and tailors the seven Universal Design Principles and adds new measures for Safety and Efficiency. Adapting an approach proven to measure Universal Design performance in products, each principle is associated with multiple performance measures which are rated with the degree to which the statement is true. The Design Quality Evaluation Survey was employed for several launch vehicle ground processing worksite analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.
NASA Technical Reports Server (NTRS)
Petersen, G. R.; Stokes, B. O.
1986-01-01
A hybrid chemical/biological approach to unconventional food regeneration is discussed. Carbon dioxide and water, the major wastes of human metabolism, would be converted to methanol by one of several physiochemical processes available (thermal, photocatalytic, etc.). Methanol is then used to supply carbon and energy for the culture of microorganisms, which in turn produce biologically useful basic foodstuffs for human nutrition. Our work has focused on increasing the carbohydrate levels of a candidate methylotrophic yeast to more nearly coincide with human nutritional requirements. Yeasts were chosen due to their high carbohydrate levels compared to bacteria and their present familiarity in the human diet. The initial candidate yeast studied was a thermotolerant strain of Hansenula polymorpha, DL-1. The quantitative results that permit an evaluation of the overall efficiency of hybrid chemical/biological food production schemes are discussed. A preliminary evaluation of the overall efficiency of such schemes is also discussed.
Wohlmuth da Silva, Salatiel; Arenhart Heberle, Alan Nelson; Pereira Santos, Alexia; Siqueira Rodrigues, Marco Antônio; Pérez-Herranz, Valentín; Moura Bernardes, Andréa
2018-05-29
Antibiotics are not efficiently removed by conventional wastewater treatments. In fact, different advanced oxidation processes (AOPs), including ozone, peroxide and UV radiation, among others, are being investigated for the elimination of microcontaminants. Most AOPs have proved efficient at degrading antibiotics, but mineralization is either not evaluated or not high. In this work, a UV-based hybrid process, namely photo-assisted electrochemical oxidation (PEO), was applied with the aim of mineralizing microcontaminants such as the antibiotics amoxicillin (AMX), norfloxacin (NOR) and azithromycin (AZI). The influence of the individual contributions of electrochemical oxidation (EO) and the UV-based processes on the hybrid process (PEO) was analysed. Results showed that AMX and NOR presented higher mineralization rates under direct photolysis than AZI due to their high absorption of UV radiation. For the EO processes, low mineralization was found for all antibiotics, which was associated with a mass-transport limitation related to the low concentration of contaminants (200 µg/L). In addition, an increase in mineralization was found when heterogeneous photocatalysis and EO are compared, due to the influence of UV radiation, which overcomes the mass-transport limitations. Although the UV-based processes control the reaction pathway that leads to mineralization, the best results for mineralizing the antibiotics were achieved by the PEO hybrid process. This can be explained by the synergistic effect of the processes that constitute it. A higher mineralization was achieved, which is an important and useful finding for avoiding the discharge of microcontaminants into the environment.
Dumas, C; Perez, S; Paul, E; Lefebvre, X
2010-04-01
The efficiency of a hyper-thermophilic (65 degrees Celsius) aerobic process coupled with a mesophilic (35 degrees Celsius) digester was evaluated for activated sludge degradation and was compared to a conventional mesophilic digester. For two sludge retention times (SRTs), 21 and 42 days, the Chemical Oxygen Demand (COD) solubilisation and biodegradation processes, the methanisation yield and the aerobic oxidation were investigated over 180 days. The best results were obtained at an SRT of 44 days; the COD removal yield was 30% higher with the Mesophilic Anaerobic Digestion/Thermophilic Aerobic Reactor (MAD-TAR) co-treatment. An increase in the intrinsic biodegradability of the sludge was also observed (20-40%), showing that COD that is unbiodegradable under mesophilic conditions becomes bioavailable. However, the methanisation yield was quite similar for both processes at the same SRT. Finally, such a process makes it possible to halve the digester volume with equivalent efficiency. Copyright 2009 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yang, Kai; Chen, Xiangguang; Wang, Li; Jin, Huaiping
2017-01-01
In the rubber mixing process, the key quality parameter (Mooney viscosity), which is used to evaluate the property of the product, can only be obtained offline with a 4-6 h delay. It would be quite helpful for industry if the parameter could be estimated online. Various data-driven soft sensors have been used for prediction in rubber mixing. However, they often do not function well due to the multiphase and nonlinear nature of the process. The purpose of this paper is to develop an efficient soft sensing algorithm to solve this problem. Based on the proposed GMMD local sample selection criterion, the phase information is extracted during local modeling. Using the Gaussian local modeling method within a just-in-time (JIT) learning framework, the nonlinearity of the process is handled well. The efficiency of the new method is verified by comparing its performance with various mainstream soft sensors, using samples from a real industrial rubber mixing process.
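A minimal sketch of the just-in-time local-modeling idea, assuming historical batch data are available as arrays; plain Euclidean distance stands in for the GMMD selection criterion (which the abstract does not detail), and a Gaussian-process regressor stands in for the Gaussian local model:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def jit_predict(X_hist, y_hist, x_query, n_local=50):
    """Just-in-time (JIT) soft-sensor sketch: for each query batch, select the
    most similar historical samples, fit a local Gaussian-process model, and
    predict the Mooney viscosity. Euclidean distance is an illustrative
    stand-in for the paper's GMMD sample selection criterion."""
    d = np.linalg.norm(X_hist - x_query, axis=1)
    idx = np.argsort(d)[:n_local]                  # local, phase-relevant samples
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gp.fit(X_hist[idx], y_hist[idx])
    mean, std = gp.predict(x_query.reshape(1, -1), return_std=True)
    return mean[0], std[0]

# Hypothetical usage: X_hist holds batch process variables (temperatures, power, ...),
# y_hist the lab-measured Mooney viscosities; x_query describes the current batch.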
Pilot line report: Development of a high efficiency thin silicon solar cell
NASA Technical Reports Server (NTRS)
1978-01-01
Experimental technology advances were implemented to increase the conversion efficiency of ultrathin 2 cm x 2 cm cells, to demonstrate a capability for fabricating such cells at a rate of 10,000 per month, and to fabricate 200 large-area ultrathin cells to determine their feasibility of manufacture. A production rate of 10,000 50-micrometer cells per month with lot-average AM0 efficiencies of 11.5% was demonstrated, with peak efficiencies of 13.5% obtained. Losses in most stages of the processing were minimized, the remaining exceptions being in the photolithography and metallization steps for front contact generation and in breakage handling. The 5 cm x 5 cm cells were fabricated with a peak yield in excess of 40% for over 10% AM0 efficiency. Greater fabrication volume is needed to fully evaluate the expected yield and efficiency levels for large cells.
Peter Christoper, G.V.; Vijaya Raghavan, C.; Siddharth, K.; Siva Selva Kumar, M.; Hari Prasad, R.
2013-01-01
In the current study, zidovudine-loaded PLGA nanoparticles were prepared, coated and further investigated for their effectiveness in brain targeting. IR and DSC studies were performed to determine the interaction between the excipients used and to find out the nature of the drug in the formulation. Formulations were prepared by adopting a 2(3) factorial design to evaluate the effects of process and formulation variables. The prepared formulations were subjected to in vitro and in vivo evaluations. In vitro evaluations showed particle sizes below 100 nm, entrapment efficiencies in the range of 28-57%, a process yield of 60-76%, and drug release in the range of 50-85%. The drug release from the formulations was found to follow the Higuchi release pattern; the n-value obtained from the Korsmeyer plot was in the range of 0.56-0.78. In vivo evaluations were performed in mice after intraperitoneal administration of zidovudine drug solution, uncoated and coated formulations. The formulation coated with Tween 80 achieved a higher concentration in the brain than both the drug in solution and the uncoated formulation. Stability studies indicated that there was no degradation of the drug in the formulation after 90 days of preparation when stored under refrigerated conditions. PMID:24648825
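The two release models referred to above take the following standard forms (generic notation; the study's fitted constants are not reproduced here):

% Higuchi model and Korsmeyer-Peppas model for drug release (generic notation):
% Q_t - cumulative amount released at time t, M_t/M_inf - fractional release,
% k_H, k - release constants, n - release exponent (0.56-0.78 in this study).
\begin{align}
  Q_t &= k_H \sqrt{t}, \\
  \frac{M_t}{M_\infty} &= k\, t^{\,n},
\end{align}
% values of n in this intermediate range are commonly read as anomalous
% (non-Fickian) transport, i.e. coupled diffusion and polymer relaxation.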
CO₂ carbonation under aqueous conditions using petroleum coke combustion fly ash.
González, A; Moreno, N; Navia, R
2014-12-01
Fly ash from petroleum coke combustion was evaluated for CO2 capture in aqueous medium. Moreover, the carbonation efficiency based on different methodologies and the kinetic parameters of the process were determined. The results show that petroleum coke fly ash achieved a CO2 capture yield of 21% at the experimental conditions of 12 g L(-1) and 363 K without stirring. The carbonation efficiency of petroleum coke fly ash based on reactive calcium species was within the range of carbonation efficiencies reported by several authors. In addition, carbonation by petroleum coke fly ash follows a pseudo-second-order kinetic model. Copyright © 2014 Elsevier Ltd. All rights reserved.
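One common way to express a carbonation efficiency relative to the reactive calcium content is the stoichiometric ratio below (notation assumed for illustration; the paper's exact methodology may differ):

% Carbonation efficiency relative to reactive calcium (generic notation):
% m_CO2 - mass of CO2 sequestered, m_Ca,reactive - mass of calcium available for
% carbonation, M_CO2 and M_Ca - molar masses.
\begin{equation}
  \eta_{\mathrm{carb}} =
  \frac{m_{\mathrm{CO_2}}}{m_{\mathrm{Ca,reactive}}\,\dfrac{M_{\mathrm{CO_2}}}{M_{\mathrm{Ca}}}}
  \times 100\%,
\end{equation}
% i.e. the CO2 actually captured divided by the stoichiometric maximum for CaCO3 formation.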
Yiu, Sean; Tom, Brian Dm
2017-01-01
Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. However, in practice, the high-dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicate model fitting. Thus, only non-standard, computationally intensive procedures based on simulating the marginal likelihood have so far been proposed. In this paper, we describe an efficient method of implementation by demonstrating how the high-dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high-dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and for when it is of interest to directly model the overall marginal mean. The methodology is applied to a psoriatic arthritis data set concerning functional disability.
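The computational identity being exploited can be illustrated numerically: a high-dimensional integration over a latent Gaussian process can be folded into a single multivariate normal CDF evaluation. The toy example below (arbitrary AR(1)-type covariance and cut-offs, not the paper's two-part model) checks the CDF evaluation against brute-force Monte Carlo:

import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)

# Illustrative latent Gaussian process for one patient at 5 visits
# (AR(1)-type covariance; all parameter values are arbitrary).
t = np.arange(5)
cov = 0.8 ** np.abs(np.subtract.outer(t, t))
mean = np.zeros(5)
thresh = np.array([0.3, -0.2, 0.5, 0.0, 0.4])   # visit-specific cut-offs

# Probability that the latent process stays below the cut-offs at every visit,
# written as one multivariate normal CDF evaluation ...
p_cdf = multivariate_normal(mean, cov).cdf(thresh)

# ... agrees with brute-force Monte Carlo integration over the latent process.
z = rng.multivariate_normal(mean, cov, size=200_000)
p_mc = np.mean(np.all(z < thresh, axis=1))
print(p_cdf, p_mc)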
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report summarizes the results of testing of a rotary flow cyclone, manufactured by Aerodyne Development Corporation under license by Siemens Kraftwerk Union. This cyclone was selected for evaluation due to the unusually high separative efficiencies claimed by the manufacturer (based on developer data), and the relative lack of open literature data. The most significant finding of this work was the observation that electrostatic forces could enhance or, in fact, dominate the separation process. Separative efficiencies, with electrostatic forces present, were found to be substantially independent of flow rate and, by inference, could be independent of unit size. Hence this finding offers a major hope that large cyclones employed in the hot gas cleanup train of the CFCC system may not suffer the performance degradation compared to small cyclones, as projected from conventional inertial theory. The separative efficiencies of the Aerodyne cyclone separator were found from both the cold flow and the hot flow tests to be disappointingly poorer than expectations (in agreement with Westinghouse results), and even poorer than conventional cyclones. (LTN)
Semipermeability Evolution of Wakkanai Mudstones During Isotropic Compression
NASA Astrophysics Data System (ADS)
Takeda, M.; Manaka, M.
2015-12-01
Precise identification of the major processes that influence a groundwater flow system is of fundamental importance for the performance assessment of waste disposal in the subsurface. In the characterization of groundwater flow systems, gravity- and pressure-driven flows have conventionally been assumed to be the dominant processes. However, recent studies have suggested that argillites can act as semipermeable membranes and can cause chemically driven flow, i.e., chemical osmosis, under salinity gradients, which may generate erratic pore pressures in argillaceous formations. In order to identify the possibility that chemical osmosis is involved in erratic pore pressure generation in argillaceous formations, it is essential to measure the semipermeability of the formation media; however, in measurements of semipermeability, little consideration has been given to the stresses that the formation media would have experienced in past geologic processes. This study investigates the influence of stress history on the semipermeability of an argillite by an experimental approach. A series of chemical osmosis experiments was performed on Wakkanai mudstones to measure the evolution of semipermeability during loading and unloading confining pressure cycles. The osmotic efficiency, which represents the semipermeability, was estimated at each confining pressure. The results show that the osmotic efficiency increases almost linearly with increasing confining pressure; however, the increased osmotic efficiency does not recover during unloading unless the confining pressure is almost completely relieved. The observed unrecoverable change in osmotic efficiency may have an important implication for the evaluation of chemical osmosis in argillaceous formations that have been exposed to large stresses in past geologic processes. If the osmotic efficiency increased by past stress remains unchanged to date, the osmotic efficiency should be measured at the past highest stress rather than the current in-situ stress. Otherwise, the effect of chemical osmosis on pore pressure generation would be underestimated.
Desktop microsimulation: a tool to improve efficiency in the medical office practice.
Montgomery, James B; Linville, Beth A; Slonim, Anthony D
2013-01-01
Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.
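As a rough illustration of the kind of discrete-event model described above (not the commercial package used in the study), the toy simulation below tracks exam-room utilisation and the waiting queue for one clinic day; the arrival and service rates and the room count are assumptions.

```python
# Toy discrete-event simulation: patients compete for a fixed number of exam rooms.
import heapq, random

random.seed(1)
N_ROOMS, SIM_END = 4, 480.0            # rooms, minutes in a clinic day (assumed)
ARRIVAL_MEAN, SERVICE_MEAN = 6.0, 20.0 # minutes (assumed)

events = []                            # (time, kind) min-heap
t = 0.0
while True:                            # pre-schedule Poisson arrivals over the day
    t += random.expovariate(1.0 / ARRIVAL_MEAN)
    if t >= SIM_END:
        break
    heapq.heappush(events, (t, "arrival"))

busy, queue, busy_time, last, max_queue = 0, 0, 0.0, 0.0, 0
while events:
    now, kind = heapq.heappop(events)
    busy_time += busy * (now - last)   # accumulate room-minutes in use
    last = now
    if kind == "arrival":
        if busy < N_ROOMS:
            busy += 1
            heapq.heappush(events, (now + random.expovariate(1.0 / SERVICE_MEAN), "done"))
        else:
            queue += 1
            max_queue = max(max_queue, queue)
    else:                              # a room frees up
        if queue:                      # next waiting patient takes the room
            queue -= 1
            heapq.heappush(events, (now + random.expovariate(1.0 / SERVICE_MEAN), "done"))
        else:
            busy -= 1

print(f"room utilisation ≈ {busy_time / (last * N_ROOMS):.2f}, max queue = {max_queue}")
```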
Effects of visual feedback-induced variability on motor learning of handrim wheelchair propulsion.
Leving, Marika T; Vegter, Riemer J K; Hartog, Johanneke; Lamoth, Claudine J C; de Groot, Sonja; van der Woude, Lucas H V
2015-01-01
It has been suggested that a higher intra-individual variability benefits the motor learning of wheelchair propulsion. The present study evaluated whether feedback-induced variability on wheelchair propulsion technique variables would also enhance the motor learning process. Learning was operationalized as an improvement in mechanical efficiency and propulsion technique, which are thought to be closely related during the learning process. 17 Participants received visual feedback-based practice (feedback group) and 15 participants received regular practice (natural learning group). Both groups received equal practice dose of 80 min, over 3 weeks, at 0.24 W/kg at a treadmill speed of 1.11 m/s. To compare both groups the pre- and post-test were performed without feedback. The feedback group received real-time visual feedback on seven propulsion variables with instruction to manipulate the presented variable to achieve the highest possible variability (1st 4-min block) and optimize it in the prescribed direction (2nd 4-min block). To increase motor exploration the participants were unaware of the exact variable they received feedback on. Energy consumption and the propulsion technique variables with their respective coefficient of variation were calculated to evaluate the amount of intra-individual variability. The feedback group, which practiced with higher intra-individual variability, improved the propulsion technique between pre- and post-test to the same extent as the natural learning group. Mechanical efficiency improved between pre- and post-test in the natural learning group but remained unchanged in the feedback group. These results suggest that feedback-induced variability inhibited the improvement in mechanical efficiency. Moreover, since both groups improved propulsion technique but only the natural learning group improved mechanical efficiency, it can be concluded that the improvement in mechanical efficiency and propulsion technique do not always appear simultaneously during the motor learning process. Their relationship is most likely modified by other factors such as the amount of the intra-individual variability.
No-Cook Process for Ethanol Production Using Indian Broken Rice and Pearl Millet
Gohel, Vipul; Duan, Gang
2012-01-01
A no-cook process using granular starch hydrolyzing enzyme (GSHE) was evaluated for Indian broken rice and pearl millet. A one-factor-at-a-time optimization method was used in ethanol production to identify the optimum concentration of GSHE under yeast fermentation conditions using broken rice and pearl millet as fermentation feedstocks. An acid fungal protease at a concentration of 0.2 kg per metric ton of grain was used along with various dosages of GSHE under yeast fermentation conditions to degrade the grain proteins into free amino nitrogen for yeast growth. To measure the efficacy of GSHE in hydrolyzing no-cook broken rice and pearl millet, the chemical composition, fermentation efficiency, and ethanol recovery were determined. In both feedstocks, the fermentation efficiency and ethanol recovery obtained through the single-step no-cook process were higher than those of the conventional multistep high-temperature process, currently considered the ideal industrial process. Furthermore, the no-cook process can directly reduce energy consumption through steam savings and lower water-cooling capacity needs compared to the conventional high-temperature process. PMID:22518148
Phase 1 of the automated array assembly task of the low cost silicon solar array project
NASA Technical Reports Server (NTRS)
Coleman, M. G.; Pryor, R. A.; Grenon, L. A.; Lesk, I. A.
1977-01-01
The state of technology readiness for the automated production of solar cells and modules is reviewed. Individual process steps and process sequences for making solar cells and modules were evaluated both technically and economically. High efficiency with a suggested cell goal of 15% was stressed. It is concluded that the technology exists to manufacture solar cells which will meet program goals.
On the evaluation of segmentation editing tools
Heckel, Frank; Moltz, Jan H.; Meine, Hans; Geisler, Benjamin; Kießling, Andreas; D’Anastasi, Melvin; dos Santos, Daniel Pinto; Theruvath, Ashok Joseph; Hahn, Horst K.
2014-01-01
Efficient segmentation editing tools are important components in the segmentation process, as no automatic methods exist that always generate sufficient results. Evaluating segmentation editing algorithms is challenging, because their quality depends on the user’s subjective impression. So far, no established methods for an objective, comprehensive evaluation of such tools exist and, particularly, intermediate segmentation results are not taken into account. We discuss the evaluation of editing algorithms in the context of tumor segmentation in computed tomography. We propose a rating scheme to qualitatively measure the accuracy and efficiency of editing tools in user studies. In order to objectively summarize the overall quality, we propose two scores based on the subjective rating and the quantified segmentation quality over time. Finally, a simulation-based evaluation approach is discussed, which allows a more reproducible evaluation without the need for human input. This automated evaluation complements user studies, allowing a more convincing evaluation, particularly during development, where frequent user studies are not possible. The proposed methods have been used to evaluate two dedicated editing algorithms on 131 representative tumor segmentations. We show how the comparison of editing algorithms benefits from the proposed methods. Our results also show the correlation of the suggested quality score with the qualitative ratings. PMID:26158063
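A simple stand-in for the quantified segmentation quality over time mentioned above: Dice overlap with a reference mask, recomputed after each editing interaction. The masks are toy data, and the paper's actual scores also fold in the subjective ratings.

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice overlap between two binary masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

rng = np.random.default_rng(3)
reference = rng.random((64, 64)) > 0.6            # toy ground-truth mask
edits = [reference ^ (rng.random((64, 64)) > p)   # masks after successive edits,
         for p in (0.80, 0.90, 0.95, 0.99)]       # with fewer flipped pixels over time

scores = [dice(m, reference) for m in edits]
print("Dice after each edit:", [f"{s:.3f}" for s in scores])
print(f"mean quality over the editing session: {np.mean(scores):.3f}")
```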
Aguilar, I; Misztal, I; Legarra, A; Tsuruta, S
2011-12-01
Genomic evaluations can be calculated using a unified procedure that combines phenotypic, pedigree and genomic information. Implementation of such a procedure requires the inverse of the relationship matrix based on pedigree and genomic relationships. The objective of this study was to investigate efficient computing options to create relationship matrices based on genomic markers and pedigree information as well as their inverses. SNP marker information was simulated for a panel of 40 K SNPs, with the number of genotyped animals up to 30 000. Matrix multiplication in the computation of the genomic relationship matrix was performed by a simple 'do' loop, by two optimized versions of the loop, and by a specific matrix multiplication subroutine. Inversion was by a generalized inverse algorithm and by a LAPACK subroutine. With the most efficient choices and parallel processing, creation of matrices for 30 000 animals would take a few hours. Matrices required to implement a unified approach can be computed efficiently. Optimizations can be either by modifications of existing code or by the use of efficient automatic optimizations provided by open source or third-party libraries. © 2011 Blackwell Verlag GmbH.
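For orientation, the snippet below builds a genomic relationship matrix in the common VanRaden form and inverts it with a LAPACK-backed routine. The sizes and allele frequencies are toy values, far smaller than the 40 K-marker, 30 000-animal setting of the study, and the exact matrix definition used there may differ.

```python
import numpy as np

rng = np.random.default_rng(7)
n_animals, n_snps = 200, 1000
p = rng.uniform(0.05, 0.95, n_snps)                    # allele frequencies (assumed)
M = rng.binomial(2, p, size=(n_animals, n_snps)).astype(float)  # 0/1/2 genotype codes

Z = M - 2.0 * p                                        # centre each marker by 2p
G = (Z @ Z.T) / (2.0 * np.sum(p * (1.0 - p)))          # VanRaden-style G matrix

G += 0.01 * np.eye(n_animals)                          # small ridge to keep G invertible
G_inv = np.linalg.inv(G)                               # LAPACK-backed inversion
print(G.shape, G_inv.shape)
```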
Design and Control of Integrated Systems for Hydrogen Production and Power Generation
NASA Astrophysics Data System (ADS)
Georgis, Dimitrios
Growing concerns about CO2 emissions have led to the development of highly efficient power plants. Options for increased energy efficiency include alternative energy conversion pathways, energy integration and process intensification. Solid oxide fuel cells (SOFC) constitute a promising alternative for power generation since they convert chemical energy electrochemically and directly to electricity. Their high operating temperature shows potential for energy integration with energy intensive units (e.g. steam reforming reactors). Although energy integration is an essential tool for increased efficiencies, it leads to highly complex process schemes with rich dynamic behavior, which are challenging to control. Furthermore, the use of process intensification for increased energy efficiency imposes additional control challenges. This dissertation identifies and proposes solutions to design, operational and control challenges of integrated systems for hydrogen production and power generation. Initially, a study on energy integrated SOFC systems is presented. Design alternatives are identified, control strategies are proposed for each alternative and their validity is evaluated under different operational scenarios. The operational range of the proposed control strategies is also analyzed. Next, thermal management of water gas shift membrane reactors, which are a typical application of process intensification, is considered. Design and operational objectives are identified and a control strategy is proposed employing advanced control algorithms. The performance of the proposed control strategy is evaluated and compared with classical control strategies. Finally, SOFC systems for combined heat and power applications are considered. Multiple recycle loops are placed to increase design flexibility. Different operational objectives are identified and a nonlinear optimization problem is formulated. Optimal designs are obtained and their features are discussed and compared. The results of the dissertation provide a deeper understanding of the design, operational and control challenges of the above systems and can potentially guide further commercialization efforts. In addition, the results can be generalized and used for applications from the transportation and residential sectors to large-scale power plants.
Wilhelm, Nadja; Perle, Nadja; Simmoteit, Robert; Schlensak, Christian; Wendel, Hans P.; Avci-Adali, Meltem
2014-01-01
Surgical instruments are often strongly contaminated with patients' blood and tissues, possibly containing pathogens. The reuse of contaminated instruments without adequate cleaning and sterilization can cause postoperative inflammation and the transmission of infectious diseases from one patient to another. Thus, based on the stringent sterility requirements, the development of highly efficient, validated cleaning processes is necessary. Here, we use for the first time synthetic single-stranded DNA (ssDNA_ODN), which does not appear in nature, as a test soiling to evaluate the cleaning efficiency of routine washing processes. Stainless steel test objects were coated with a certain amount of ssDNA_ODN. After cleaning, the amount of residual ssDNA_ODN on the test objects was determined using quantitative real-time PCR. The established method is highly specific and sensitive, with a detection limit of 20 fg, and enables the determination of the cleaning efficiency of medical cleaning processes under different conditions to obtain optimal settings for the effective cleaning and sterilization of instruments. The use of this highly sensitive method for the validation of cleaning processes can prevent, to a significant extent, the insufficient cleaning of surgical instruments and thus the transmission of pathogens to patients. PMID:24672793
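A hypothetical sketch of the quantification step described above: converting a measured qPCR threshold cycle into residual ssDNA mass via a standard curve. The slope and intercept are assumed values for illustration, not the study's calibration.

```python
import math

# Assumed standard curve: Ct = SLOPE * log10(mass in fg) + INTERCEPT
SLOPE, INTERCEPT = -3.32, 38.0

def ct_to_mass_fg(ct: float) -> float:
    """Residual ssDNA mass (fg) implied by a measured Ct value."""
    return 10 ** ((ct - INTERCEPT) / SLOPE)

for ct in (25.0, 30.0, 35.0):
    print(f"Ct {ct}: ≈ {ct_to_mass_fg(ct):.0f} fg residual ssDNA")
```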
DISPAQ: Distributed Profitable-Area Query from Big Taxi Trip Data.
Putri, Fadhilah Kurnia; Song, Giltae; Kwon, Joonho; Rao, Praveen
2017-09-25
One of the crucial problems for taxi drivers is to efficiently locate passengers in order to increase profits. The rapid advancement and ubiquitous penetration of Internet of Things (IoT) technology into transportation industries enables us to provide taxi drivers with locations that have more potential passengers (more profitable areas) by analyzing and querying taxi trip data. In this paper, we propose a query processing system, called Distributed Profitable-Area Query ( DISPAQ ) which efficiently identifies profitable areas by exploiting the Apache Software Foundation's Spark framework and a MongoDB database. DISPAQ first maintains a profitable-area query index (PQ-index) by extracting area summaries and route summaries from raw taxi trip data. It then identifies candidate profitable areas by searching the PQ-index during query processing. Then, it exploits a Z-Skyline algorithm, which is an extension of skyline processing with a Z-order space filling curve, to quickly refine the candidate profitable areas. To improve the performance of distributed query processing, we also propose local Z-Skyline optimization, which reduces the number of dominant tests by distributing killer profitable areas to each cluster node. Through extensive evaluation with real datasets, we demonstrate that our DISPAQ system provides a scalable and efficient solution for processing profitable-area queries from huge amounts of big taxi trip data.
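The Z-order (Morton) curve underlying the Z-Skyline step can be illustrated with a few lines of bit interleaving; this is a generic encoding sketch, not DISPAQ's actual implementation.

```python
def part1by1(n: int) -> int:
    """Spread the low 16 bits of n so a zero bit sits between each pair."""
    n &= 0xFFFF
    n = (n | (n << 8)) & 0x00FF00FF
    n = (n | (n << 4)) & 0x0F0F0F0F
    n = (n | (n << 2)) & 0x33333333
    n = (n | (n << 1)) & 0x55555555
    return n

def morton2d(x: int, y: int) -> int:
    """Z-order value of grid cell (x, y): x in the even bits, y in the odd bits."""
    return part1by1(x) | (part1by1(y) << 1)

# Ordering candidate grid cells along the Z-curve
cells = [(3, 5), (3, 6), (4, 5), (10, 2)]
print(sorted(cells, key=lambda c: morton2d(*c)))
```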
Solventless visible light-curable coating: I. Critical formulation and processing parameters.
Bose, Sagarika; Bogner, Robin H
2010-06-30
Film coating is generally accomplished by spraying polymers dissolved in solvents onto a cascading bed of tablets. The limitations associated with the use of solvents (both aqueous and organic) can be overcome by the use of solventless coating technologies. In this proposed solventless photocurable film coating system, each layer of coating onto the pellets (non-pareil beads) was formed using liquid photocurable monomer, powdered pore-forming agents, photosensitizers and photoinitiators in a mini-coating pan and later cured by visible light. Yield, coating efficiency, variation in color, diameter and roundness were determined for each batch to evaluate process efficiency and coating quality. It was found that the ratio (S/L ratio) of the amount of solid (S) pore-forming agent to volume of liquid (L) monomer, particle size and type of the pore-forming agent, concentration of initiator, and total exposure (light intensity x exposure time) of light were critical formulation and processing parameters for the process. Using lactose as a pore-forming agent, an optimum ratio of pore-forming agent to photocurable polymer was 1.8-3.0 to achieve good process efficiency and uniformity. The ratio was sensitive to particle size and type of pore-forming agent. 2010 Elsevier B.V. All rights reserved.
Canestaro, William J; Pritchard, Daryl E; Garrison, Louis P; Dubois, Robert; Veenstra, David L
2015-08-01
Companion diagnostic tests (CDTs) have emerged as a vital technology in the effective use of an increasing number of targeted drug therapies. Although CDTs can offer a multitude of potential benefits, assessing their value within a health technology appraisal process can be challenging because of a complex array of factors that influence clinical and economic outcomes. To develop a user-friendly tool to assist managed care and other health care decision makers in screening companion tests and determining whether an intensive technology review is necessary and, if so, where the review should be focused to improve efficiency. First, we conducted a systematic literature review of CDT cost-effectiveness studies to identify value drivers. Second, we conducted key informant interviews with a diverse group of stakeholders to elicit feedback and solicit any additional value drivers and identify desirable attributes for an evidence review tool. A draft tool was developed based on this information that captured value drivers, usability features, and had a particular focus on practical use by nonexperts. Finally, the tool was pilot tested with test developers and managed care evidence evaluators to assess face-validity and usability. The tool was also evaluated using several diverse examples of existing companion diagnostics and refined accordingly. We identified 65 cost-effectiveness studies of companion diagnostic technologies. The following factors were most commonly identified as value drivers from our literature review: clinical validity of testing; efficacy, safety, and cost of baseline and alternative treatments; cost and mortality of health states; and biomarker prevalence and testing cost. Stakeholders identified the following additional factors that they believed influenced the overall value of a companion test: regulatory status, actionability, utility, and market penetration. These factors were used to maximize the efficiency of the evidence review process. Stakeholders also stated that a tool should be easy to use and time efficient. Cognitive interviews with stakeholders led to minor changes in the draft tool to improve usability and relevance. The final tool consisted of 4 sections: (1) eligibility for review (2 questions), (2) prioritization of review (3 questions), (3) clinical review (3 questions), and (4) economic review (5 questions). Although the evaluation of CDTs can be challenging because of limited evidence and the added complexity of incorporating a diagnostic test into drug treatment decisions, using a pragmatic tool to identify tests that do not need extensive evaluation may improve the efficiency and effectiveness of CDT value assessments.
Zanchetta, Priscilla Garozi; Heringer, Otávio; Scherer, Rodrigo; Pacheco, Henrique Poltronieri; Gonçalves, Ricardo; Pena, Angelina
2015-10-01
Pharmaceuticals are emerging contaminants, and it must be noted that approximately 70% of them are excreted via urine. Therefore, urine usage implies the risk of transfer of pharmaceutical residues to agricultural fields and environmental contamination. Thus, this study aimed at the development and validation of an LC-MS/MS method for D-norgestrel (D-NOR) and progesterone (PRO) determination in human urine, as well as the evaluation of the removal efficiency of two methods (storage and evaporation) and the effects of acidification with sulfuric acid. The storage process was evaluated for 6 weeks, while evaporation was assessed at three different temperatures (50, 75, and 100 °C). All experiments were done with normal urine (pH = 6.0) and acidified urine (pH = 2.0, with sulfuric acid). The validation results showed good method performance. In the second week of storage, higher hormone degradation was observed. In the evaporation method, both D-NOR and PRO were almost completely degraded when the volume was reduced to the lowest level. The results also indicate that acidification did not affect degradation. Overall, the results showed that a combination of the two methods can be employed for more efficient hormone removal from urine.
Development of a low-pressure materials pre-treatment process for improved energy efficiency
NASA Astrophysics Data System (ADS)
Lee, Kwanghee; You, Byung Don
2017-09-01
A low-pressure materials pre-treatment process has been developed as an alternative to existing high-temperature sludge drying, limestone calcination, and limonite dehydroxylation. Using the thermodynamic equilibrium relationship between temperature and pressure represented by the Clausius-Clapeyron equation, the operational temperature of these reactions could be lowered at reduced pressure for increased energy efficiency. For industrial sludge drying, the evaporation rate was controlled by interfacial kinetics, showing a constant rate with time, and significant acceleration of the reaction was observed at reduced pressure. At this modified reaction rate under low pressure, the rate was also partially controlled by mass transfer. The temperature of limestone calcination was lowered, but the reaction was limited at the calculated equilibrium temperature of the Clausius-Clapeyron equation and slightly higher temperatures were required. The energy consumption during limestone calcination and limonite dehydroxylation was evaluated; lower processing pressures could enhance the energy efficiency of limestone calcination, but limonite dehydroxylation could not achieve energy savings due to the greater power consumption of the vacuum pump at lower pressure and reduced temperatures.
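A back-of-the-envelope illustration of the Clausius-Clapeyron reasoning above, estimating how far the boiling point of water drops at a reduced drying pressure; the constants and the chosen pressure are textbook/assumed values, not parameters from the study.

```python
import math

R = 8.314          # J mol^-1 K^-1
dH_vap = 40.7e3    # J mol^-1, enthalpy of vaporisation of water (approximate)
T1, P1 = 373.15, 101.325e3   # reference boiling point at 1 atm
P2 = 20.0e3                  # assumed reduced drying pressure, Pa

# Integrated Clausius-Clapeyron: ln(P2/P1) = -(dH/R) * (1/T2 - 1/T1)
inv_T2 = 1.0 / T1 - (R / dH_vap) * math.log(P2 / P1)
T2 = 1.0 / inv_T2
print(f"estimated boiling point at {P2/1e3:.0f} kPa: {T2 - 273.15:.1f} °C")
```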
Dwivedi, Naveen; Balomajumder, Chandrajit; Mondal, Prasenji
2016-07-01
The present study aimed to investigate the removal efficiency of cyanide from contaminated water by adsorption, biodegradation, and a simultaneous adsorption and biodegradation (SAB) process, each individually in a batch reactor. Adsorption was achieved using almond shell granules and biodegradation was conducted with suspended cultures of Bacillus cereus, whereas the SAB process was carried out using Bacillus cereus and almond shell in a batch reactor. The effects of agitation time, pH, and initial cyanide concentration on the percentage removal of cyanide are discussed. Under the experimental conditions, optimum removal was obtained at pH 7 with an agitation time of 48 h and a temperature of 35 degrees C. Cyanide was utilized by the bacteria as the sole source of nitrogen for growth. The removal efficiencies of cyanide by adsorption, biodegradation, and SAB were found to be 91.38%, 95.87%, and 99.63%, respectively, at an initial cyanide concentration of 100 mg l(-1).
Misra, Rohit; Guldhe, Abhishek; Singh, Poonam; Rawat, Ismail; Stenström, Thor Axel; Bux, Faizal
2015-01-01
The efficient harvesting of microalgae is considered to be one of the challenging steps of algal biofuel production and a key factor limiting the commercial use of microalgae. To overcome the limitation of metallic electrode depletion, the application of non-sacrificial electrodes was investigated for the electrochemical harvesting (ECH) of microalgae. The effects of applied current, electrolyte addition and initial pH were investigated. The highest recovery efficiency of 83% was obtained for Scenedesmus obliquus at 1.5 A, initial pH 9 and 6 g L(-1) NaCl, with a power consumption of 3.84 kWh kg(-1). The recovery efficiency of the ECH process was comparable to that of centrifugation, filtration and chemical flocculation techniques reported in the literature, but with a much lower power consumption. The ECH process with addition of electrolyte enhanced lipid extraction by 22% without any adverse effects. The ECH process with non-sacrificial carbon electrodes could be a viable harvesting step for commercial-scale microalgal biomass production. Copyright © 2014 Elsevier Ltd. All rights reserved.
Quantum efficiency harmonic analysis of exciton annihilation in organic light emitting diodes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, J. S.; Giebink, N. C., E-mail: ncg2@psu.edu
2015-06-29
Various exciton annihilation processes are known to impact the efficiency roll-off of organic light emitting diodes (OLEDs); however, isolating and quantifying their contribution in the presence of other factors such as changing charge balance continue to be a challenge for routine device characterization. Here, we analyze OLED electroluminescence resulting from a sinusoidal dither superimposed on the device bias and show that nonlinearity between recombination current and light output arising from annihilation mixes the quantum efficiency measured at different dither harmonics in a manner that depends uniquely on the type and magnitude of the annihilation process. We derive a series of analytical relations involving the DC and first harmonic external quantum efficiency that enable annihilation rates to be quantified through linear regression independent of changing charge balance and evaluate them for prototypical fluorescent and phosphorescent OLEDs based on the emitters 4-(dicyanomethylene)-2-methyl-6-(4-dimethylaminostyryl)-4H-pyran and platinum octaethylporphyrin, respectively. We go on to show that, in most cases, it is sufficient to calculate the needed quantum efficiency harmonics directly from derivatives of the DC light versus current curve, thus enabling this analysis to be conducted solely from standard light-current-voltage measurement data.
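A toy numerical sketch of the final point above: with a synthetic light-versus-current curve, the DC quantum efficiency tracks L/I while the first-harmonic quantity tracks dL/dI in the small-signal limit, so both can be obtained from the same measured curve. The curve and constants are made up and proportionality factors are ignored.

```python
import numpy as np

I = np.linspace(1e-4, 0.05, 200)             # drive current, A (assumed range)
L = 0.25 * I - 2.0 * I**2                    # toy light output with efficiency roll-off

eqe_dc = L / I                               # proportional to the DC external QE
eqe_1f = np.gradient(L, I)                   # proportional to the first-harmonic QE

# The gap between the two grows as annihilation-driven roll-off sets in
print(f"at I = {I[-1]:.3f} A: EQE_DC ∝ {eqe_dc[-1]:.3f}, EQE_1f ∝ {eqe_1f[-1]:.3f}")
```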
The effectiveness and efficiency of disease management programs for patients with chronic diseases.
Hisashige, Akinori
2012-11-26
The disease management (DM) approach is increasingly advocated as a means of improving the effectiveness and efficiency of healthcare for chronic diseases. To evaluate the evidence on the effectiveness and efficiency of DM, an evidence synthesis was carried out. To locate eligible meta-analyses and systematic reviews, we searched Medline, EMBASE, the Cochrane Library, SCI-EXPANDED, SSCI, A&HCI, DARE, HTA and NHS EED from 1995 to 2010. Two reviewers independently extracted data and assessed study quality. Twenty-eight meta-analyses and systematic reviews were included for synthesizing evidence. The proportion of articles that observed improvement with a reasonable amount of evidence was highest for process (69%), followed by health services (63%), QOL (57%), health outcomes (51%), satisfaction (50%), costs (38%) and so on. Regarding mortality, statistically significant results were observed only for coronary heart disease. Important components of DM, such as a multidisciplinary approach, were identified. The synthesized evidence shows considerable support for the effectiveness and efficiency of DM programs in process, health services, QOL and other areas. The question is no longer whether DM programs work, but rather which type or component of DM programs works best and most efficiently in the context of each healthcare system or country.
Kato-Lin, Yi-Chin; Krishnamurti, Lakshmanan; Padman, Rema; Seltman, Howard J
2014-11-01
There is limited application and evaluation of health information systems in the management of vaso-occlusive pain crises in sickle cell disease (SCD) patients. This study evaluates the impact of digitization of paper-based individualized pain plans on process efficiency and care quality by examining both objective patient data and subjective clinician insights. Retrospective, before and after, mixed methods evaluation of digitization of paper documents in Children's Hospital of Pittsburgh of UPMC. Subjective perceptions are analyzed using surveys completed by 115 clinicians in emergency department (ED) and inpatient units (IP). Objective effects are evaluated using mixed models with data on 1089 ED visits collected via electronic chart review 28 months before and 22 months after the digitization. Surveys indicate that all clinicians perceived the digitization to improve the efficiency and quality of pain management. Physicians overwhelmingly preferred using the digitized plans, but only 44% of the nurses had the same response. Analysis of patient records indicates that adjusted time from analgesic order to administration was significantly reduced from 35.50 to 26.77 min (p<.05). However, time to first dose and some of the objective quality measures (time from administration to relief, relief rate, admission rate, and ED re-visit rate) were not significantly affected. The relatively simple intervention, high baseline performance, and limited accommodation of nurses' perspectives may account for the marginal improvements in process efficiency and quality outcomes. Additional efforts, particularly improved communication between physicians and nurses, are needed to further enhance quality of pain management. This study highlights the important role of health information technology (HIT) on vaso-occlusive pain management for pediatric patients with sickle cell disease and the critical challenges in accommodating human factor considerations in implementing and evaluating HIT effects. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Human exposure modeling in a life cycle framework for chemicals and products
A chemical enters into commerce to serve a specific function in a product or process. This decision triggers both the manufacture of the chemical and its potential release over the life cycle of the product. Efficiently evaluating chemical safety and sustainability requires combi...
Stunning and scalding techniques, implications for yield and processing efficiency
USDA-ARS?s Scientific Manuscript database
In most countries, humane slaughter of poultry includes the use of various stunning systems, each of which offers advantages and disadvantages (engineering challenges). The effects of the different stun and stun-to-kill methods will be reviewed, along with research study results evaluating bird welf...
Termination of School Employees: Legal Issues and Techniques.
ERIC Educational Resources Information Center
National School Boards Association, Alexandria, VA. Council of School Attorneys.
The termination of unsatisfactory school employees is a ubiquitous and often expensive legal problem for school districts. This monograph was designed to help school attorneys and administrators handle the termination process as fairly and efficiently as possible. The monograph begins with articles on documentation, evaluation, and…
Optimizing the use of the thermal integrity system for evaluating auger-cast piles [summary].
DOT National Transportation Integrated Search
2016-07-01
Auger-cast-in-place (ACIP) piles offer an efficient method of constructing and installing piles, but because the ACIP process is essentially blind and the configuration of the final pile cannot be assured, applications for ACIP piles have been li...
Efficiency and Accuracy in Thermal Simulation of Powder Bed Fusion of Bulk Metallic Glass
NASA Astrophysics Data System (ADS)
Lindwall, J.; Malmelöv, A.; Lundbäck, A.; Lindgren, L.-E.
2018-05-01
Additive manufacturing by powder bed fusion processes can be utilized to create bulk metallic glass, as the process yields considerably high cooling rates. However, there is a risk that reheated material in previously deposited layers may become devitrified, i.e., crystallize. Therefore, it is advantageous to simulate the process to fully comprehend it and to design it to avoid the aforementioned risk. However, a detailed simulation is computationally demanding. It is necessary to increase the computational speed while maintaining accuracy of the computed temperature field in critical regions. The current study evaluates a few approaches based on temporal reduction to achieve this. It is found that the evaluated approaches save a lot of computation time and accurately predict the temperature history.
Tan, Y M; Flynn, M R
2000-10-01
The transfer efficiency of a spray-painting gun is defined as the amount of coating applied to the workpiece divided by the amount sprayed. Characterizing this transfer process allows for accurate estimation of the overspray generation rate, which is important for determining a spray painter's exposure to airborne contaminants. This study presents an experimental evaluation of a mathematical model for predicting the transfer efficiency of a high volume-low pressure spray gun. The effects of gun-to-surface distance and nozzle pressure on the agreement between the transfer efficiency measurement and prediction were examined. Wind tunnel studies and non-volatile vacuum pump oil in place of commercial paint were used to determine transfer efficiency at nine gun-to-surface distances and four nozzle pressure levels. The mathematical model successfully predicts transfer efficiency within the uncertainty limits. The least squares regression between measured and predicted transfer efficiency has a slope of 0.83 and an intercept of 0.12 (R2 = 0.98). Two correction factors were determined to improve the mathematical model. At higher nozzle pressure settings, 6.5 psig and 5.5 psig, the correction factor is a function of both gun-to-surface distance and nozzle pressure level. At lower nozzle pressures, 4 psig and 2.75 psig, gun-to-surface distance slightly influences the correction factor, while nozzle pressure has no discernible effect.
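A minimal sketch of the measured-versus-predicted comparison reported above, using made-up transfer-efficiency pairs rather than the wind-tunnel data.

```python
import numpy as np

predicted = np.array([0.35, 0.45, 0.55, 0.65, 0.75, 0.85])   # model output (assumed)
measured  = np.array([0.41, 0.49, 0.57, 0.66, 0.74, 0.82])   # measurements (assumed)

# Least squares line: measured = slope * predicted + intercept
slope, intercept = np.polyfit(predicted, measured, 1)
fit = slope * predicted + intercept
r2 = 1.0 - np.sum((measured - fit) ** 2) / np.sum((measured - measured.mean()) ** 2)
print(f"measured ≈ {slope:.2f} * predicted + {intercept:.2f}  (R² = {r2:.3f})")
```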
Recovery of the biodegradable organic matter present in the effluent of a high-rate MBBR
NASA Astrophysics Data System (ADS)
Brosseau, Catherine
High-rate processes are receiving great interest due to their potential to favor the energy balance of water resource recovery facilities (WRRFs) either for their design or retrofit. Anaerobic digestion is a process that allows the valorization of organic biodegradable matter contained in sludge into biogas. This process also produces a stabilized sludge named digestate or biosolids that can be reused for agriculture purposes. This project proposed a secondary treatment train composed of a high-rate moving bed biofilm reactor (HR-MBBR) to biotransform colloidal and soluble biodegradable organics into particulate matter, followed by an enhanced and compact physico-chemical separation process to recover mainly particulate organics and a part of the colloidal matter. A high-rate biological process operated at a low hydraulic retention time aimed at transforming colloidal and soluble fractions of organic matter into a particulate fraction for recovery by a downstream separation process. The HR-MBBR effluent solids are known for their poor settleability, therefore requiring an efficient separation process downstream to ensure their recovery and to meet the effluent discharge regulations. The global objective of this project was to maximize the recovery of organic biodegradable matter for valorization into biogas by anaerobic digestion with an innovative treatment train combining an HR-MBBR and a separation process. The specific objectives of this report were 1) to characterize the HR-MBBR effluent solids and 2) to determine the efficiency of several physico-chemical separation processes combined with unbiodegradable or natural based coagulants and polymers. Effluents of lab-scale HR-MBBR fed with a synthetic soluble or domestic wastewater influent and the effluent of a full-scale HR-MBBR were used to evaluate the efficiency of separation processes adapted at bench-scale in jar-test experiments. The processes studied were conventional settling, ballasted flocculation, dissolved air flotation and an innovative enhanced flotation process. Unlike conventional settling and dissolved air flotation, ballasted flocculation and enhanced flotation use a ballasting or flotation agent to accelerate the sludge settling or flotation rate. The original scientific hypothesis of this project is that the combination of enhanced flotation and natural based chemicals can meet a target total suspended solids (TSS) concentration of less than or equal to 10 mg TSS/L in the clarified effluent of an HR-MBBR. The separation process efficiencies were evaluated based on their TSS recoveries. Monitoring the chemical oxygen demand (COD) fractions allowed a better understanding of the underlying mechanisms of organic matter biotransformation and capture throughout the proposed treatment train. The solids concentration, expressed as TSS, in the MBBR effluent with a synthetic soluble influent was kept very low, from 27 to 61 mg TSS/L, which is about 2 to 9 times less than the expected concentration for an MBBR fed with domestic wastewater. Without the presence of particulate matter in the influent, the particulate matter in the MBBR effluent represented only the production of biomass detached by the shearing forces between the carriers. The TSS concentration and the efficiency of colloidal and soluble matter biotransformation into particulate matter increased with the MBBR hydraulic retention time.
Wide volumetric particle size distributions ranging from 5 to 1000 μm in the lab-scale MBBR effluent were observed, with a higher proportion of particles larger than 100 μm for a synthetic feed, and a higher proportion of small particles of around 30 μm for a domestic wastewater feed. The abundance of small particles was attributed to unsettleable solids in the influent that passed through the reactor unchanged. Despite the high proportion of large particles for the MBBR with a synthetic feed, poor settleability of effluent solids was observed, as static settling could only achieve TSS recoveries between 35 and 78%. Hence, coagulating agents were necessary to enhance the solids recovery. The combination of the innovative enhanced flotation process and unbiodegradable chemicals achieved TSS recovery efficiencies of up to 97%. The enhanced flotation efficiency was reduced when using natural based chemicals, especially the natural based polymer, which was not suited to treat waters with such high TSS concentrations. The hypothesis of the residual TSS concentration of 10 mg TSS/L was verified for half of the HR-MBBR operating conditions and the recovery efficiency did not seem to be influenced by the reactor hydraulic retention time, organic loading rate and temperature. More experiments are needed to confirm the effect of these parameters on TSS recovery efficiency. Although natural based chemicals reduced the coagulation and flocculation efficiency, they allowed a decrease in sludge production, which can represent a significant cost benefit. These chemicals resulted in an increase of 33 to 60% of the total COD of the MBBR effluent, compared to the unbiodegradable chemicals which only contributed about 2%. Natural based chemicals are recommended over unbiodegradable ones to promote the use of high biodegradability potential chemicals and to reduce the production of chemical sludge. However, to offset the increase of total COD, it may be required to add a treatment downstream to meet the target secondary treatment COD concentration. Conventional settling and ballasted flocculation offered similar TSS recovery efficiencies to enhanced flotation (88% TSS recovery efficiency). The efficiency was reduced by 34% when using the dissolved air flotation process, much lower than expected for such a separation process. The efficiency reduction was attributed to lab-scale flotation setups that were not optimized or adapted to treat medium-strength wastewater. A similar innovative treatment train is currently being tested at pilot-scale in order to evaluate its carbon footprint and its potential to be eventually transposed to full-scale. Furthermore, the biodegradability and the biochemical methane production of the natural based chemicals are being determined. This project allowed to determine the potential of the innovative enhanced flotation process to recover the HR-MBBR solids when combined with natural based chemicals, which are currently not often used in wastewater treatment for resource recovery.
Cartilage ablation studies using mid-IR free electron laser
NASA Astrophysics Data System (ADS)
Youn, Jong-In; Peavy, George M.; Venugopalan, Vasan
2005-04-01
The ablation rates of articular cartilage and fibrocartilage (meniscus) were quantified to examine the wavelength and tissue-composition dependence of ablation efficiency for selected mid-infrared wavelengths. The wavelengths tested were 2.9 μm (water-dominant absorption), 6.1 μm (protein and water absorption) and 6.45 μm (protein-dominant absorption), generated by the Free Electron Laser (FEL) at Vanderbilt University. The measurement of tissue mass removal using a microbalance during laser ablation was conducted to determine the ablation rates of cartilage. This technique can be more accurate than methods such as profilometry and histological sectioning, where the tissue surface and crater morphology may be affected by tissue processing. The ablation efficiency was found to be dependent upon the wavelength. Ablation of both articular cartilage and meniscus (fibrocartilage) at 6.1 μm was more efficient than at the other wavelengths evaluated. We observed the lowest ablation efficiency for both types of cartilage at the 6.45 μm wavelength, possibly due to the reduction in water absorption at this wavelength in comparison to the other wavelengths that were evaluated.
Removal of Cu(II) in water by polymer enhanced ultrafiltration: Influence of polymer nature and pH.
Kochkodan, Olga D; Kochkodan, Viktor M; Sharma, Virender K
2018-01-02
This study presents an efficient removal of Cu(II) from water using the polymer enhanced ultrafiltration (PEUF) method. Polymers of different molecular weights (MW) (polyethyleneimine (PEI), sodium lignosulfonates (SLS) and dextrans) were investigated to evaluate their efficiency in removing Cu(II) from water by the PEUF method. The decomposition of the Cu(II)-polymer complex was also evaluated in order to reuse the polymers. Cu(II) complexation depends on the MW of the chelating polymer and the pH of the feed solution. It was found that the Cu(II) rejection increased with the polymer dosage, with high removal of Cu(II) when using PEI and SLS at a 10:20 (mg/mg) [Cu(II)]:[polymer] ratio. It was found that the maximum chelating capacity was 15 mg of Cu(II) per 20 mg of PEI. The Cu(II)-PEI complex could be decomposed by acid addition and the polymer could be efficiently reused over multiple complexation-decomplexation cycles. A conceptual flow chart of the integrated process for efficient removal of Cu(II) by the PEUF method is suggested.
Nitrogen removal via nitrite from seawater contained sewage.
Peng, Yongzhen; Yu, De-Shuang; Liang, Dawei; Zhu, Guibing
2004-01-01
By controlling both pH and the concentration of free ammonia (FA), nitrification-denitrification via the nitrite pathway was accomplished in an SBR to achieve enhanced biological nitrogen removal, under relatively high salinity, from seawater-containing wastewater used to flush toilets. Several parameters including salinity, temperature, pH, and NH4+-N loading rate were studied to evaluate their effects. The results indicate that, at different salinities, the nitrogen removal efficiency depends on the ammonia-nitrogen loading rate. The nitrogen removal efficiency reaches above 90% when the NH4+-N loading does not exceed 0.15 kg NH4+-N/kg MLSS d. As the salinity increases, the ammonia-nitrogen loading rate should be lowered to maintain high removal efficiency. The evaluation of the temperature effect shows that the nitrogen removal efficiency roughly doubles when the reaction temperature is raised from 20 to 30 degrees C. A moderately high pH in the range of 7.5-8.5 is advantageous for achieving effective nitrification-denitrification via nitrite, a process driven by the selective inhibition exerted by free ammonia (FA).
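Free ammonia is usually estimated from total ammonia nitrogen, pH and temperature with an Anthonisen-type expression; the helper below uses that common approximation with assumed inputs and is not necessarily the exact formula used by the authors.

```python
import math

def free_ammonia(tan_mg_l: float, ph: float, temp_c: float) -> float:
    """Free NH3 (mg/L) from total ammonia nitrogen (mg N/L), pH and temperature (°C)."""
    return (17.0 / 14.0) * tan_mg_l * 10 ** ph / (math.exp(6344.0 / (273.0 + temp_c)) + 10 ** ph)

# Assumed feed of 50 mg NH4+-N/L at 30 °C across the pH window discussed above
for ph in (7.5, 8.0, 8.5):
    print(f"pH {ph}: FA ≈ {free_ammonia(50.0, ph, 30.0):.1f} mg/L")
```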
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groth, R.H.; Calabro, D.S.
1969-11-01
The two methods normally used for the analysis of NOx are the Saltzman and the phenoldisulfonic acid techniques. This paper describes an evaluation of these wet chemical methods to determine their practical application to engine exhaust gas analysis. Parameters considered for the Saltzman method included bubbler collection efficiency, NO to NO2 conversion efficiency, the masking effect of other contaminants usually present in exhaust gases and the time-temperature effect of these contaminants on stored developed solutions. Collection efficiency and the effects of contaminants were also considered for the phenoldisulfonic acid method. Test results indicated satisfactory collection and conversion efficiencies for the Saltzman method, but contaminants seriously affected the measurement accuracy, particularly if the developed solution was stored for a number of hours at room temperature before analysis. Storage at 32°F minimized this effect. The standard procedure for the phenoldisulfonic acid method gave good results, but the process was found to be too time consuming for routine analysis and measured only total NOx. 3 references, 9 tables.
NASA Technical Reports Server (NTRS)
Salama, A. M.
1980-01-01
Microstructural and electrical evaluation tests were performed on nickel-doped p-type silicon wafers before and after solar cell fabrication. The concentration levels of nickel in silicon were 5 x 10(14), 4 x 10(15), and 8 x 10(15) atoms/cu cm. It was found that nickel precipitated out during the growth process in all three ingots. Clumps of precipitates, some of which exhibited a star shape, were present at different depths. If the clumps are distributed at depths approximately 20 microns apart and if they are larger than 10 microns in diameter, degradation occurs in solar cell electrical properties and cell conversion efficiency. The larger the size of the precipitate clump, the greater the degradation in solar cell efficiency. A large grain boundary around the cell effective area acted as a gettering center for the precipitates and impurities and caused improvement in solar cell efficiency. Details of the evaluation test results are given.
Mukhopadhyay, Sudarsan; Tomasula, Peggy M; Luchansky, John B; Porto-Fett, Anna; Call, Jeffrey E
2010-09-01
The effectiveness of a cross-flow microfiltration (MF) process for removal of a cocktail of Salmonella enterica serovar Enteritidis strains from commercial unpasteurized liquid egg white (LEW) from a local egg breaking plant, while maintaining its functional properties, was evaluated. To facilitate MF, LEW was wedge screened, homogenized and then diluted (1:2 w/w) with distilled water containing 0.5% sodium chloride. Diluted unpasteurized LEW was inoculated with five strains of S. Enteritidis (ATCC 4931, ATCC BAA-708, ATCC 49215, ATCC 49218, and ATCC BAA-1045) to a level of approximately 10(7) CFU/mL of LEW and microfiltered using a ceramic membrane. Process parameters influencing egg white functional properties and pathogen removal efficiency were evaluated. Average permeate flux increased by almost 126% when the pH of LEW was adjusted from pH 8 to pH 7 at 25 degrees C. Microbial removal efficiency was, on average, at least 6.8 log(10) CFU/mL (limit of detection ≤0.5 log(10) CFU/mL). Functional property analysis indicated that the MF process did not alter the foaming power of LEW. Published by Elsevier B.V.
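A toy calculation of the log-reduction credited to an MF step of this kind, crediting the detection limit when no survivors are found in the permeate; the counts are illustrative, not the study's data.

```python
import math

feed_cfu_per_ml = 1.0e7                  # inoculated level, CFU/mL (approximate)
detection_limit = 10 ** 0.5              # ≈0.5 log10 CFU/mL reporting limit
permeate_cfu_per_ml = detection_limit    # assume no colonies detected in the permeate

lrv = math.log10(feed_cfu_per_ml) - math.log10(permeate_cfu_per_ml)
print(f"log reduction ≥ {lrv:.1f} log10 CFU/mL")
```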
Müller, Gerdt; Kalyani, Dayanand Chandrahas; Horn, Svein Jarle
2017-03-01
Enzymatic catalysis plays a key role in the conversion of lignocellulosic biomass to fuels and chemicals such as lactic acid. In the last decade, the efficiency of commercial cellulase cocktails has increased significantly, in part due to the inclusion of lytic polysaccharide monooxygenases (LPMOs). However, the LPMOs' need for molecular oxygen to break down cellulose demands reinvestigation of process conditions. In this study, we evaluate the efficiency of lactic acid production from steam-exploded birch using an LPMO-containing cellulase cocktail in combination with lactic acid bacteria, investigating both separate hydrolysis and fermentation (SHF) and simultaneous saccharification and fermentation (SSF). While the SSF setup has generally been considered more efficient because it avoids sugar accumulation that may inhibit the cellulases, the SHF setup in our study yielded 26-32% more lactic acid than the SSF. This was mainly due to competition for oxygen between the LPMOs and the fermenting organisms in the SSF process, which resulted in reduced LPMO activity and thus less efficient saccharification of the lignocellulosic substrate. By means of aeration it was possible to activate the LPMOs in the SSF, but less lactic acid was produced due to a shift in metabolic pathways toward production of acetic acid. Overall, this study shows that lactic acid can be produced efficiently from lignocellulosic biomass, but that the use of LPMO-containing cellulase cocktails in fermentation processes demands re-thinking of traditional process setups due to the requirement for oxygen in the saccharification step. Biotechnol. Bioeng. 2017;114: 552-559. © 2016 Wiley Periodicals, Inc.
Shigekawa, Y; Kasamatsu, Y; Shinohara, A
2016-05-01
The nucleus (235m)U is an isomer with extremely low excitation energy (76.8 eV) and decays dominantly through the internal conversion (IC) process. Because outer-shell electrons are involved in the IC process, the decay constant of (235m)U depends on its chemical environment. We plan to study the deexcitation process of (235m)U by measuring the energy spectra of IC electrons in addition to the decay constants for various chemical forms. In this paper, the preparation method of (235m)U samples from (239)Pu by using alpha-recoil energy is reported. A Collection Apparatus for Recoil Products was fabricated, and then collection efficiencies under various conditions were determined by collecting (224)Ra recoiling out of (228)Th electrodeposited and precipitated sources. The pressure in the apparatus (vacuum or 1 atm of N2 gas) affected the variations of the collection efficiencies depending on the negative voltage applied to the collector. The maximum values of the collection efficiencies were mainly affected by the thickness of the (228)Th sources. From these results, the suitable conditions of the (239)Pu sources for preparation of (235m)U were determined. In addition, dissolution efficiencies were determined by washing collected (224)Ra with solutions. When (224)Ra was collected in 1 atm of N2 gas and dissolved with polar solutions such as water, the dissolution efficiencies were nearly 100%. The method of rapid dissolution of recoil products would be applicable to rapid preparation of short-lived (235m)U samples for various chemical forms.
Karn, Pankaj Ranjan; Jin, Su-Eon; Lee, Benjamin Joon; Sun, Bo Kyung; Kim, Min-Soo; Sung, Jong-Hyuk; Hwang, Sung-Joo
2014-01-01
Objectives The objectives of this study were to prepare cyclosporin A (CsA)-containing proliposomes using the supercritical antisolvent (SAS) process and the conventional thin film method for the comparative study of proliposomal formulations and to evaluate the physicochemical properties of these proliposomes. Methods CsA-containing proliposomes were prepared by the SAS process and the conventional film method, composed of natural and synthetic phospholipids. We investigated particle size, polydispersity index, and zeta potential of CsA-containing proliposomes. In addition, both production yield and entrapment efficiency of CsA in different proliposomes were analyzed. Physicochemical properties of CsA-containing proliposomes were also evaluated, using differential scanning calorimetry and X-ray diffraction. The morphology and size of CsA-containing proliposomes were confirmed, using scanning electron microscopy. We checked the in vitro release of CsA from CsA-containing proliposomes prepared by different preparation methods, comparing them with Restasis® as a positive control and the stability of SAS-mediated proliposomes was also studied. Results CsA-containing proliposomes formed by the SAS process had a relatively smaller particle size, with a narrow size distribution and spherical particles compared with those of conventionally prepared proliposomes. The yield and entrapment efficiency of CsA in all proliposomes varied from 85% to 92% and from 86% to 89%, respectively. Differential scanning calorimetry and X-ray diffraction studies revealed that the anhydrous lactose powder used in this formulation retained its crystalline form and that CsA was present in an amorphous form. Proliposome powders were rapidly converted to liposomes on contact with water. The in vitro release study of proliposomal formulations demonstrated a similar pattern to Restasis®. The SAS-mediated CsA-containing proliposomes were stable on storage, with no significant changes in particle size, polydispersity index, and entrapment efficiency. Conclusion These results show promising features of CsA-containing proliposomal formulations, using the SAS process for the large-scale industrial application. PMID:25395846
Madi, Banyana Cecilia; Hussein, Julia; Hounton, Sennen; D'Ambruoso, Lucia; Achadi, Endang; Arhinful, Daniel Kojo
2007-09-01
A participatory approach to priority setting in programme evaluation may help improve the allocation and more efficient use of scarce resources, especially in low-income countries. Research agendas that result from collaboration between researchers, programme managers, policy makers and other stakeholders have the potential to ensure rigorous studies are conducted on matters of local priority, based on local, expert knowledge. This paper describes a process involving key stakeholders to elicit and prioritise evaluation needs for safe motherhood in three developing countries. A series of reiterative consultations with safe motherhood stakeholders from each country was conducted over a period of 36 months. In each country, the consultation process consisted of a series of participatory workshops: firstly, stakeholders' views on evaluation were elicited, with parallel descriptive work on the contexts; secondly, priorities for evaluation were identified from stakeholders; thirdly, the evaluation priorities were refined; and finally, the evaluation research questions, reflecting the identified priorities, were agreed and finalised. Three evaluation questions were identified in each country, and one was selected, on which a full-scale evaluation was undertaken. While a great deal has been written about the importance of transparent and participatory priority setting in evaluation, few examples exist of how such processes could be implemented, particularly for maternal health programmes. Our experience demonstrates that the investment required for a participatory priority-setting effort is high, but the process undertaken resulted in both globally and contextually relevant priorities for evaluation. This experience provides useful lessons for public health practitioners committed to bridging the research-policy interface.
A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance
NASA Astrophysics Data System (ADS)
Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Lee, Andrew J.; Xiao, Ying
2013-07-01
The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials evaluating the plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of indices, there are minimal differences between the two methods. The evaluation time is reduced from 10-20 min to 2 min by applying the semi-automated plan-quality evaluation program.
FastMag: Fast micromagnetic simulator for complex magnetic structures (invited)
NASA Astrophysics Data System (ADS)
Chang, R.; Li, S.; Lubarda, M. V.; Livshitz, B.; Lomakin, V.
2011-04-01
A fast micromagnetic simulator (FastMag) for general problems is presented. FastMag solves the Landau-Lifshitz-Gilbert equation and can handle multiscale problems with a high computational efficiency. The simulator derives its high performance from efficient methods for evaluating the effective field and from implementations on massively parallel graphics processing unit (GPU) architectures. FastMag discretizes the computational domain into tetrahedral elements and therefore is highly flexible for general problems. The magnetostatic field is computed via the superposition principle for both volume and surface parts of the computational domain. This is accomplished by implementing efficient quadrature rules and analytical integration for overlapping elements in which the integral kernel is singular. Thus, discretized superposition integrals are computed using a nonuniform grid interpolation method, which evaluates the field from N sources at N collocated observers in O(N) operations. This approach allows handling objects of arbitrary shape, allows easy calculation of the field outside the magnetized domains, does not require solving a linear system of equations, and requires little memory. FastMag is implemented on GPUs, with GPU over central processing unit (CPU) speed-ups of two orders of magnitude. Simulations are shown of a large array of magnetic dots and a recording head fully discretized down to the exchange length, with over a hundred million tetrahedral elements on an inexpensive desktop computer.
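For reference, the dynamical equation that FastMag integrates is the Landau-Lifshitz-Gilbert equation; a standard Gilbert form (quoted from common micromagnetics usage, not from the paper itself) is

```latex
\frac{d\mathbf{M}}{dt} = -\gamma\,\mathbf{M}\times\mathbf{H}_{\mathrm{eff}}
                         + \frac{\alpha}{M_s}\,\mathbf{M}\times\frac{d\mathbf{M}}{dt}
```

where γ is the gyromagnetic ratio, α the Gilbert damping constant, M_s the saturation magnetization, and H_eff the effective field whose magnetostatic contribution FastMag evaluates by the superposition approach described above.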
Efficient Irregular Wavefront Propagation Algorithms on Hybrid CPU-GPU Machines
Teodoro, George; Pan, Tony; Kurc, Tahsin; Kong, Jun; Cooper, Lee; Saltz, Joel
2013-01-01
We address the problem of efficient execution of a computation pattern, referred to here as the irregular wavefront propagation pattern (IWPP), on hybrid systems with multiple CPUs and GPUs. The IWPP is common in several image processing operations. In the IWPP, data elements in the wavefront propagate waves to their neighboring elements on a grid if a propagation condition is satisfied. Elements receiving the propagated waves become part of the wavefront. This pattern results in irregular data accesses and computations. We develop and evaluate strategies for efficient computation and propagation of wavefronts using a multi-level queue structure. This queue structure improves the utilization of fast memories in a GPU and reduces synchronization overheads. We also develop a tile-based parallelization strategy to support execution on multiple CPUs and GPUs. We evaluate our approaches on a state-of-the-art GPU accelerated machine (equipped with 3 GPUs and 2 multicore CPUs) using the IWPP implementations of two widely used image processing operations: morphological reconstruction and euclidean distance transform. Our results show significant performance improvements on GPUs. The use of multiple CPUs and GPUs cooperatively attains speedups of 50× and 85× with respect to single core CPU executions for morphological reconstruction and euclidean distance transform, respectively. PMID:23908562
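As a rough single-threaded illustration of the irregular wavefront propagation pattern described above, the following Python sketch performs queue-driven grayscale morphological reconstruction on a 2D grid; it mirrors the propagation condition ("a neighbor joins the wavefront if its value can be raised without exceeding the mask") but none of the multi-level GPU queue or CPU-GPU tiling machinery of the paper, and all names are illustrative.

```python
from collections import deque
import numpy as np

def reconstruct(marker, mask):
    """Grayscale morphological reconstruction of `marker` under `mask`
    via irregular wavefront (FIFO queue) propagation; assumes marker <= mask."""
    out = marker.astype(np.float64)
    h, w = out.shape
    # Seed the wavefront with every pixel (simple but unoptimized initialization).
    queue = deque((r, c) for r in range(h) for c in range(w))
    neighbors = ((-1, 0), (1, 0), (0, -1), (0, 1))
    while queue:
        r, c = queue.popleft()
        for dr, dc in neighbors:
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                # Propagation condition: the neighbor can be raised toward the
                # mask; if so, it is updated and joins the wavefront.
                candidate = min(out[r, c], mask[nr, nc])
                if candidate > out[nr, nc]:
                    out[nr, nc] = candidate
                    queue.append((nr, nc))
    return out
```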
Shonnard, David R; Kicherer, Andreas; Saling, Peter
2003-12-01
Life without chemicals would be inconceivable, but the potential risks and impacts to the environment associated with chemical production and chemical products are viewed critically. Eco-efficiency analysis considers the economic and life cycle environmental effects of a product or process, giving these equal weighting. The major elements of the environmental assessment include primary energy use, raw materials utilization, emissions to all media, toxicity, safety risk, and land use. The relevance of each environmental category and also for the economic versus the environmental impacts is evaluated using national emissions and economic data. The eco-efficiency analysis method of BASF is briefly presented, and results from three applications to chemical processes and products are summarized. Through these applications, the eco-efficiency analyses mostly confirm the 12 Principles listed in Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37(5), 94A), with the exception that, in one application, production systems based on bio-based feedstocks were not the most eco-efficient as compared to those based on fossil resources. Over 180 eco-efficiency analyses have been conducted at BASF, and their results have been used to support strategic decision-making, marketing, research and development, and communication with external parties. Eco-efficiency analysis, as one important strategy and success factor in sustainable development, will continue to be a very strong operational tool at BASF.
NASA Astrophysics Data System (ADS)
Kongar, N. Elif
2004-12-01
Today, since customers are able to obtain similar-quality products for similar prices, lead time has become the only preference criterion for most consumers. Therefore, it is crucial that the lead time, i.e., the time from the raw material phase until the manufactured good reaches the customer, is minimized. This issue can be investigated under the title of Supply Chain Management (SCM). An efficiently managed supply chain can lead to reduced response times for customers. To achieve this, continuous observation of supply chain efficiency, i.e., a constant performance evaluation of the current SCM, is required. Widely used conventional performance measurement methods lack the ability to evaluate an SCM, since the supply chain is a dynamic system that requires a more thorough and flexible performance measurement technique. The Balanced Scorecard (BS) is an efficient tool for measuring the performance of dynamic systems and has a proven capability of providing decision makers with appropriate feedback data. In addition to SCM, a relatively new management field, reverse supply chain management (RSCM), also necessitates an appropriate evaluation approach. RSCM differs from SCM in many respects, e.g., the criteria used for evaluation and the high level of uncertainty involved, which precludes the use of the evaluation techniques applied to SCM without adaptation. This study proposes a generic Balanced Scorecard to measure the performance of supply chain management while defining the appropriate performance measures for SCM. A scorecard prototype, ESCAPE, is presented to demonstrate the evaluation process.
NASA Astrophysics Data System (ADS)
Guan, Xiaofei; Pal, Uday B.; Powell, Adam C.
2014-06-01
This paper reports a solid oxide membrane (SOM) electrolysis experiment using an LSM(La0.8Sr0.2MnO3-δ)-Inconel inert anode current collector for production of magnesium and oxygen directly from magnesium oxide at 1423 K (1150 °C). The electrochemical performance of the SOM cell was evaluated by means of various electrochemical techniques including electrochemical impedance spectroscopy, potentiodynamic scan, and electrolysis. Electronic transference numbers of the flux were measured to assess the magnesium dissolution in the flux during SOM electrolysis. The effects of magnesium solubility in the flux on the current efficiency and the SOM stability during electrolysis are discussed. An inverse correlation between the electronic transference number of the flux and the current efficiency of the SOM electrolysis was observed. Based on the experimental results, a new equivalent circuit of the SOM electrolysis process is presented. A general electrochemical polarization model of SOM process for magnesium and oxygen gas production is developed, and the maximum allowable applied potential to avoid zirconia dissociation is calculated as well. The modeling results suggest that a high electronic resistance of the flux and a relatively low electronic resistance of SOM are required to achieve membrane stability, high current efficiency, and high production rates of magnesium and oxygen.
High temperature solar thermal receiver
NASA Technical Reports Server (NTRS)
1979-01-01
A design concept for a high temperature solar thermal receiver to operate at 3 atmospheres pressure and 2500 F outlet was developed. The performance and complexity of windowed matrix, tube-header, and extended surface receivers were evaluated. The windowed matrix receiver proved to offer substantial cost and performance benefits. An efficient and cost effective hardware design was evaluated for a receiver which can be readily interfaced to fuel and chemical processes or to heat engines for power generation.
Low-Temperature Forming of Beta Titanium Alloys
NASA Technical Reports Server (NTRS)
Kaneko, R. S.; Woods, C. A.
1983-01-01
Low cost methods for titanium structural fabrication using advanced cold-formable beta alloys were investigated for application in a Mach 2.7 supersonic cruise vehicle. This work focuses on improving processing and structural efficiencies as compared with standard hot formed and riveted construction of alpha-beta alloy sheet structure. Mechanical property data and manufacturing parameters were developed for cold forming, brazing, welding, and processing Ti-15V-3Cr-3Sn-3Al sheet, and Ti-3Al-8V-6Cr-4Zr on a more limited basis. Cost and structural benefits were assessed through the fabrication and evaluation of large structural panels. The feasibility of increasing structural efficiency of beta titanium structure by selective reinforcement with metal matrix composite was also explored.
Gholikandi, Gagik Badalians; Kazemirad, Khashayar
2018-03-01
In this study, the performance of the electrochemical peroxidation (ECP) process for removing the volatile suspended solids (VSS) content of waste-activated sludge was evaluated. The Fe2+ ions required by the process were obtained directly from iron electrodes in the system. The performance of the ECP process was investigated in various operational conditions employing a laboratory-scale pilot setup and optimized by response surface methodology (RSM). According to the results, the ECP process showed its best performance when the pH value, current density, H2O2 concentration and the retention time were 3, 3.2 mA/cm2, 1,535 mg/L and 240 min, respectively. In these conditions, the introduced Fe2+ concentration was approximately 500 mg/L and the VSS removal efficiency about 74%. Moreover, the results of the microbial characteristics of the raw and the stabilized sludge demonstrated that the ECP process is able to remove close to 99.9% of the coliforms in the raw sludge during the stabilization process. The energy consumption evaluation showed that the required energy of the ECP reactor (about 1.8-2.5 kWh per kg VSS removed) is considerably lower than for aerobic digestion, the conventional waste-activated sludge stabilization method (about 2-3 kWh per kg VSS removed). The RSM optimization process showed that the best operational conditions of the ECP process comply with the experimental results, and the actual and the predicted results are in good conformity with each other. This feature makes it possible to predict the introduced Fe2+ concentrations into the system and the VSS removal efficiency of the process precisely.
NASA Astrophysics Data System (ADS)
Fatrias, D.; Kamil, I.; Meilani, D.
2018-03-01
Coordinating business operations with suppliers is becoming increasingly important for surviving and prospering in a dynamic business environment. A good partnership with suppliers not only increases efficiency but also strengthens corporate competitiveness. In view of this concern, this study aims to develop a practical approach to multi-criteria supplier evaluation using the combined methods of the Taguchi loss function (TLF), the best-worst method (BWM) and VIseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR). A new framework integrating these methods is our main contribution to the supplier evaluation literature. In this integrated approach, a compromise supplier ranking list based on the loss scores of suppliers is obtained using efficient steps of a pairwise-comparison-based decision-making process. Implementation on a case problem with real data from the crumb rubber industry shows the usefulness of the proposed approach. Finally, a suitable managerial implication is presented.
Omondi Aduda, Dickens S.; Ouma, Collins; Onyango, Rosebella; Onyango, Mathews; Bertrand, Jane
2015-01-01
Background Voluntary medical male circumcision (VMMC) service delivery is complex and resource-intensive. In Kenya’s context there is still paucity of information on resource use vis-à-vis outputs as programs scale up. Knowledge of technical efficiency, productivity and potential sources of constraints is desirable to improve decision-making. Objective To evaluate technical efficiency and productivity of VMMC service delivery in Nyanza in 2011/2012 using data envelopment analysis. Design Comparative process evaluation of facilities providing VMMC in Nyanza in 2011/2012 using output orientated data envelopment analysis. Results Twenty one facilities were evaluated. Only 1 of 7 variables considered (total elapsed operation time) significantly improved from 32.8 minutes (SD 8.8) in 2011 to 30 minutes (SD 6.6) in 2012 (95%CI = 0.0350–5.2488; p = 0.047). Mean scale technical efficiency significantly improved from 91% (SD 19.8) in 2011 to 99% (SD 4.0) in 2012 particularly among outreach compared to fixed service delivery facilities (CI -31.47959–4.698508; p = 0.005). Increase in mean VRS technical efficiency from 84% (SD 25.3) in 2011 and 89% (SD 25.1) in 2012 was not statistically significant. Benchmark facilities were #119 and #125 in 2011 and #103 in 2012. Malmquist Productivity Index (MPI) at fixed facilities declined by 2.5% but gained by 4.9% at outreach ones by 2012. Total factor productivity improved by 83% (p = 0.032) in 2012, largely due to progress in technological efficiency by 79% (p = 0.008). Conclusions Significant improvement in scale technical efficiency among outreach facilities in 2012 was attributable to accelerated activities. However, ongoing pure technical inefficiency requires concerted attention. Technological progress was the key driver of service productivity growth in Nyanza. Incorporating service-quality dimensions and using stepwise-multiple criteria in performance evaluation enhances comprehensiveness and validity. These findings highlight site-level resource use and sources of variations in VMMC service productivity, which are important for program planning. PMID:25706119
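The technical-efficiency scores above come from output-orientated data envelopment analysis (DEA). As a hedged illustration of the underlying computation (not the authors' code or data), the sketch below solves the standard output-orientated, constant-returns-to-scale envelopment linear program for one decision-making unit with SciPy; the facility inputs/outputs and variable names are placeholders.

```python
import numpy as np
from scipy.optimize import linprog

def output_oriented_dea(inputs, outputs, dmu):
    """Output-orientated CRS (CCR) DEA for one decision-making unit (DMU).
    inputs:  (n_dmus, n_inputs) array; outputs: (n_dmus, n_outputs) array.
    Returns phi >= 1; technical efficiency is 1/phi."""
    n, m = inputs.shape
    _, s = outputs.shape
    c = np.zeros(1 + n)
    c[0] = -1.0                                   # maximize phi (linprog minimizes)
    A_ub, b_ub = [], []
    for i in range(m):                            # sum_j lam_j * x_ij <= x_i,dmu
        A_ub.append(np.concatenate(([0.0], inputs[:, i])))
        b_ub.append(inputs[dmu, i])
    for r in range(s):                            # phi*y_r,dmu - sum_j lam_j*y_rj <= 0
        A_ub.append(np.concatenate(([outputs[dmu, r]], -outputs[:, r])))
        b_ub.append(0.0)
    bounds = [(1.0, None)] + [(0.0, None)] * n    # phi >= 1, lambdas >= 0
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Hypothetical example: 4 facilities, inputs = (staff, kits), output = procedures.
x = np.array([[10, 40], [12, 35], [8, 50], [15, 60]], dtype=float)
y = np.array([[300], [280], [310], [330]], dtype=float)
for j in range(len(x)):
    phi = output_oriented_dea(x, y, j)
    print(f"facility {j}: phi = {phi:.3f}, technical efficiency = {1/phi:.3f}")
```

A variable-returns-to-scale (VRS) score, as also reported in the study, adds the convexity constraint that the lambdas sum to one.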
In Transit...Making the Change to Metrics.
ERIC Educational Resources Information Center
Farnsworth, Briant J.; And Others
1980-01-01
Granite School District (Utah) developed a systematic, effective, and cost-efficient teacher inservice program which provides a basic understanding of metrics, materials and methods for direct classroom use, and evaluation of the learning process, through the use of self-contained, three-phase modules for home or school use. (Author/SB)
Implementation and evaluation of ILLIAC 4 algorithms for multispectral image processing
NASA Technical Reports Server (NTRS)
Swain, P. H.
1974-01-01
Data concerning a multidisciplinary and multi-organizational effort to implement multispectral data analysis algorithms on a revolutionary computer, the Illiac 4, are reported. The effectiveness and efficiency of implementing the digital multispectral data analysis techniques for producing useful land use classifications from satellite collected data were demonstrated.
Planning the Library Media Center Facility for the 1990s and Beyond.
ERIC Educational Resources Information Center
Texas Education Agency, Austin.
This manual presents recommendations for incorporating present and future technological changes into workable, efficient, pleasant school library media facilities in two major sections: Planning the Facility and Activity Areas. The first section addresses the planning process (appointing the building committee, evaluating the library media…
COPROCESSING OF FOSSIL FUELS AND BIOMASS FOR CO2 EMISSION REDUCTION IN THE TRANSPORTATION SECTOR
The paper discusses an evaluation of the Hydrocarb process for conversion of carbonaceous raw material to clean carbon and methanol products. As fuel, methanol and carbon can be used economically, either independently or in slurry form, in efficient heat engines (turbines and int...
The Role of Microbial Processes in the Oxidation and Removal of Ammonia from Drinking Water
The purpose of this study was two-fold: (1) to monitor and evaluate nitrification in a full-scale iron removal filtration plant with biologically active granular media filters located in Ohio, and (2) to determine how to most efficiently regain nitrification following filter rebe...
celerite: Scalable 1D Gaussian Processes in C++, Python, and Julia
NASA Astrophysics Data System (ADS)
Foreman-Mackey, Daniel; Agol, Eric; Ambikasaran, Sivaram; Angus, Ruth
2017-09-01
celerite provides fast and scalable Gaussian Process (GP) regression in one dimension and is implemented in C++, Python, and Julia. The celerite API is designed to be familiar to users of george and, like george, celerite is designed to efficiently evaluate the marginalized likelihood of a dataset under a GP model. This can then be used alongside a non-linear optimization or posterior inference library for the best results.
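As a brief usage sketch of the Python implementation described above (a minimal example; exact term classes and argument names may differ between celerite versions):

```python
import numpy as np
import celerite
from celerite import terms

# Simulated one-dimensional data: irregular times, observations, uncertainties.
t = np.sort(np.random.uniform(0, 10, 200))
yerr = 0.1 * np.ones_like(t)
y = np.sin(t) + yerr * np.random.randn(len(t))

# A single "real" celerite term (damped random walk); parameters are in log space.
kernel = terms.RealTerm(log_a=0.0, log_c=-1.0)
gp = celerite.GP(kernel, mean=0.0)
gp.compute(t, yerr)                      # factorize with the scalable celerite solver
print("marginalized log-likelihood:", gp.log_likelihood(y))

# The log-likelihood can then be handed to a non-linear optimizer or a sampler,
# e.g. scipy.optimize.minimize over gp.get_parameter_vector().
```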
Analysis of electric power industry restructuring
NASA Astrophysics Data System (ADS)
Al-Agtash, Salem Yahya
1998-10-01
This thesis evaluates alternative structures of the electric power industry in a competitive environment. One structure is based on the principle of creating a mandatory power pool to foster competition and manage system economics; this structure is PoolCo (pool coordination). A second structure is based on the principle of allowing independent multilateral trading and decentralized market coordination; this structure is DecCo (decentralized coordination). The criteria I use to evaluate these two structures are economic efficiency, system reliability and freedom of choice. Economic efficiency evaluation considers the strategic behavior of individual generators as well as behavioral variations of different classes of consumers. A supply-function equilibria model is characterized for deriving bidding strategies of competing generators under PoolCo. It is shown that asymmetric equilibria can exist within the capacities of generators. An augmented Lagrangian approach is introduced to solve iteratively for globally optimal operations schedules. Under DecCo, the process involves solving iteratively for system operations schedules; the schedules reflect generators' strategic behavior and brokers' interactions for arranging profitable trades, allocating losses and managing network congestion. In determining PoolCo and DecCo operations schedules, the overall costs of power generation (start-up and shut-down costs and the availability of hydroelectric power) as well as the losses and costs of the transmission network are considered. For system reliability evaluation, I examine the effect of PoolCo and DecCo operating conditions on system security. Random component-failure perturbations are generated to simulate the actual system behavior; this is done using Monte Carlo simulation. Freedom-of-choice evaluation accounts for each scheme's beneficial opportunities and capabilities to respond to consumers' expressed preferences. An IEEE 24-bus test system is used to illustrate the concepts developed for economic efficiency evaluation. The system was tested over a two-year period. The results indicate efficiency losses of 2.6684 and 2.7269 percent on average for PoolCo and DecCo, respectively. These values, however, do not represent forecasts of the efficiency losses of PoolCo- and DecCo-based competitive industries; rather, they illustrate the efficiency losses for the given IEEE test system under the modeling assumptions underlying the framework development.
Chiu, Kuo Ping; Wong, Chee-Hong; Chen, Qiongyu; Ariyaratne, Pramila; Ooi, Hong Sain; Wei, Chia-Lin; Sung, Wing-Kin Ken; Ruan, Yijun
2006-08-25
We recently developed the Paired End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired-end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads and how to map PETs to reference genome sequences correctly yet efficiently. To accommodate and streamline data analysis of the large volumes of PET sequences generated in each PET experiment, an automated PET data processing pipeline is desirable. We designed an integrated computation program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the Project Manager module for data organization. The performance of PET-Tool was evaluated through the analyses of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. With a 2.4 GHz Linux machine, it takes approximately six hours to process one million PETs from extraction to mapping. The speed, accuracy, and comprehensiveness show that PET-Tool is an important and useful component in PET experiments, and it can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checks and a system for multi-layer data management.
Optimized adipose tissue engineering strategy based on a neo-mechanical processing method.
He, Yunfan; Lin, Maohui; Wang, Xuecen; Guan, Jingyan; Dong, Ziqing; Feng, Lu; Xing, Malcolm; Feng, Chuanbo; Li, Xiaojian
2018-05-26
Decellularized adipose tissue (DAT) represents a promising scaffold for adipose tissue engineering. However, the unique and prolonged lipid removal process required for adipose tissue can damage extracellular matrix (ECM) constituents. Moreover, inadequate vascularization limits the recellularization of DAT in vivo. We proposed a neo-mechanical protocol for rapidly breaking adipocytes and removing lipid content from adipose tissue. The lipid-depleted adipose tissue was then subjected to a fast and mild decellularization to fabricate high-quality DAT (M-DAT). Adipose liquid extract (ALE) derived from this mechanical process was collected and incorporated into M-DAT to further optimize in vivo recellularization. Ordinary DAT was fabricated and served as a control. This developed strategy was evaluated based on decellularization efficiency, ECM quality, and recellularization efficiency. Angiogenic factor components and angiogenic potential of ALE were evaluated in vivo and in vitro. M-DAT achieved the same decellularization efficiency, but exhibited better retention of ECM components and recellularization, compared to those with ordinary DAT. Protein quantification revealed considerable levels of angiogenic factors (basic fibroblast growth factor, epidermal growth factor, transforming growth factor-β1, and vascular endothelial growth factor) in ALE. ALE promoted tube formation in vitro and induced intense angiogenesis in M-DAT in vivo; furthermore, higher expression of the adipogenic factor PPARγ and greater numbers of adipocytes were evident following ALE treatment, compared to those in the M-DAT group. Mechanical processing of adipose tissue led to the production of high-quality M-DAT and angiogenic factor-enriched ALE. The combination of ALE and M-DAT could be a promising strategy for engineered adipose tissue construction. This article is protected by copyright. All rights reserved. © 2018 by the Wound Healing Society.
Big Data Analysis of Manufacturing Processes
NASA Astrophysics Data System (ADS)
Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert
2015-11-01
The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
Receptor-mediated gene transfer vectors: progress towards genetic pharmaceuticals.
Molas, M; Gómez-Valadés, A G; Vidal-Alabró, A; Miguel-Turu, M; Bermudez, J; Bartrons, R; Perales, J C
2003-10-01
Although specific delivery to tissues and unique cell types in vivo has been demonstrated for many non-viral vectors, current methods are still inadequate for human applications, mainly because of limitations on their efficiencies. All the steps required for an efficient receptor-mediated gene transfer process may in principle be exploited to enhance targeted gene delivery. These steps are: DNA/vector binding, internalization, subcellular trafficking, vesicular escape, nuclear import, and unpacking either for transcription or other functions (i.e., antisense, RNA interference, etc.). The large variety of vector designs that are currently available, usually aimed at improving the efficiency of these steps, has complicated the evaluation of data obtained from specific derivatives of such vectors. The importance of the structure of the final vector and the consequences of design decisions at specific steps on the overall efficiency of the vector will be discussed in detail. We emphasize in this review that stability in serum and thus, proper bioavailability of vectors to their specific receptors may be the single greatest limiting factor on the overall gene transfer efficiency in vivo. We discuss current approaches to overcome the intrinsic instability of synthetic vectors in the blood. In this regard, a summary of the structural features of the vectors obtained from current protocols will be presented and their functional characteristics evaluated. Dissecting information on molecular conjugates obtained by such methodologies, when carefully evaluated, should provide important guidelines for the creation of effective, targeted and safe DNA therapeutics.
Maximizing coupling-efficiency of high-power diode lasers utilizing hybrid assembly technology
NASA Astrophysics Data System (ADS)
Zontar, D.; Dogan, M.; Fulghum, S.; Müller, T.; Haag, S.; Brecher, C.
2015-03-01
In this paper, we present hybrid assembly technology to maximize coupling efficiency for spatially combined laser systems. High-quality components, such as center-turned focusing units, as well as suitable assembly strategies are necessary to obtain the highest possible output ratios. Alignment strategies are challenging tasks due to their complexity and sensitivity. Especially in low-volume production, fully automated systems are economically at a disadvantage, as operator experience is often expensive. However, the reproducibility and quality of automatically assembled systems can be superior. Therefore, automated and manual assembly techniques are combined to obtain high coupling efficiency while preserving maximum flexibility. The paper describes the necessary equipment and software to enable hybrid assembly processes. Micromanipulator technology with high step resolution and six degrees of freedom provides a large number of possible evaluation points. Automated algorithms are necessary to speed up data gathering and alignment in order to efficiently utilize the available granularity for manual assembly processes. Furthermore, an engineering environment is presented to enable rapid prototyping of automation tasks with simultaneous data evaluation. Integration with simulation environments, e.g. Zemax, allows the verification of assembly strategies in advance. Data-driven decision making ensures constant high quality, documents the assembly process and is a basis for further improvement. The hybrid assembly technology has been applied in several applications with efficiencies above 80% and will be discussed in this paper. High coupling efficiency has been achieved with minimized assembly effort as a result of semi-automated alignment. This paper focuses on hybrid automation for optimizing and attaching turning mirrors and collimation lenses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCormick, Robert L
It is possible to significantly improve the efficiency of spark-ignition engines given fuels with improved autoignition, evaporative cooling, and particle emission properties. At the same time, a vast range of different fuel chemistries is accessible from biomass, leading to questions about how fuel chemistries outside the range available from petroleum and ethanol can impact engine operation. This presentation will briefly describe the factors leading to poor efficiency in current SI engines, and the technologies available for improving efficiency. Improved fuel properties that enable high-efficiency engine designs to be pursued aggressively will be reviewed, including octane index and sensitivity. A screening process based on fuel properties was applied to a large set of proposed biomass-derived gasoline blendstocks, and the properties of the best blendstocks were evaluated. Some of these fuels exhibit poor stability towards oxidation in the liquid phase, and storage stability studies for alkyl furans and cyclopentanone will be presented in brief. The importance of fuel heat of vaporization for direct-injection engines, along with new research on measurement of this parameter, will be presented, including an SI engine study of the impact of heat of vaporization on octane index and engine knock. Fuel effects on fine particle emissions and how our understanding breaks down for oxygenates will be discussed. Engine combustion experiments, droplet evaporation simulations, and heat of vaporization measurements conducted to better understand how oxygenates affect particle emissions will be described. This research defines a process that can be used to evaluate fuels for other types of combustion such as diesel, gasoline compression ignition, or strategies with mixed modes.
Limitation of Shrinkage Porosity in Aluminum Rotor Die Casting
NASA Astrophysics Data System (ADS)
Kim, Young-Chan; Choi, Se-Weon; Kim, Cheol-Woo; Cho, Jae-Ik; Lee, Sung-Ho; Kang, Chang-Seog
Aluminum rotors are prone to many casting defects, especially large amounts of air and shrinkage porosity, which cause eccentricity, losses and noise during motor operation. Many attempts have been made to develop methods of shrinkage porosity control, but some problems remain to be solved. In this research, a vacuum squeeze die casting process is proposed to limit these defects. Dies with six pin-point gates, capable of local squeeze at the end ring, were used. The influence of filling patterns on HPDC was evaluated, and the important process control parameters were high injection speed, squeeze length, venting and process conditions. By using local squeeze and vacuum during filling and solidification, air and shrinkage porosity were significantly reduced and the feeding efficiency at the upper end ring was improved by 10%. As a result of controlling the defects, the dynamometer test showed motor efficiency improved by more than 4%.
A parallel computational model for GATE simulations.
Rannou, F R; Vega-Acevedo, N; El Bitar, Z
2013-12-01
GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
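The decentralized-generation, centralized-coordination idea described above can be outlined with a generic master-worker sketch (illustrative Python only; it does not use GATE's actual factory classes or any Geant4 interface): a coordinator assigns disjoint virtual-time windows to workers, each worker generates its own singles, and coincidence processing remains centralized.

```python
import multiprocessing as mp
import random

def simulate_time_window(args):
    """Worker: independently generate 'singles' (detection events) for one
    virtual-time window. Placeholder physics: Poisson-like random hits."""
    seed, t_start, t_stop = args
    rng = random.Random(seed)
    singles, t = [], t_start
    while True:
        t += rng.expovariate(1000.0)              # toy event rate of ~1 kHz
        if t >= t_stop:
            return singles
        singles.append((t, rng.randrange(64)))    # (time, detector id)

def find_coincidences(singles, window=1e-5):
    """Coordinator: centralized coincidence sorting over all returned singles."""
    singles.sort()
    return [(a, b) for a, b in zip(singles, singles[1:]) if b[0] - a[0] <= window]

if __name__ == "__main__":
    # Coordinator splits the acquisition into disjoint 1 s windows, one per worker.
    windows = [(seed, float(seed), float(seed + 1)) for seed in range(4)]
    with mp.Pool(4) as pool:
        all_singles = [s for chunk in pool.map(simulate_time_window, windows)
                       for s in chunk]
    print(len(find_coincidences(all_singles)), "coincidences found")
```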
NASA Astrophysics Data System (ADS)
Epelde, Lur; Ma Becerril, José; Alkorta, Itziar; Garbisu, Carlos
Phytoremediation is an effective, non-intrusive, inexpensive, aesthetically pleasing, socially accepted, promising phytotechnology for the remediation of polluted soils. The objective of any soil remediation process must be not only to remove the contaminant(s) from the soil but, most importantly, to restore the continued capacity of the soil to perform or function according to its potential (i.e., to recover soil health). Hence, indicators of soil health are needed to properly assess the efficiency of a phytoremediation process. Biological indicators of soil health, especially those related to the size, activity and diversity of the soil microbial communities, are becoming increasingly used, due to their sensitivity and capacity to provide information that integrates many environmental factors. In particular, microbial indicators of soil health are valid tools to evaluate the success of metal phytoremediation procedures such as phytoextraction and phytostabilization processes.
A Single-use Strategy to Enable Manufacturing of Affordable Biologics.
Jacquemart, Renaud; Vandersluis, Melissa; Zhao, Mochao; Sukhija, Karan; Sidhu, Navneet; Stout, Jim
2016-01-01
The current processing paradigm of large manufacturing facilities dedicated to single product production is no longer an effective approach for best manufacturing practices. Increasing competition for new indications and the launch of biosimilars for the monoclonal antibody market have put pressure on manufacturers to produce at lower cost. Single-use technologies and continuous upstream processes have proven to be cost-efficient options to increase biomass production but as of today the adoption has been only minimal for the purification operations, partly due to concerns related to cost and scale-up. This review summarizes how a single-use holistic process and facility strategy can overcome scale limitations and enable cost-efficient manufacturing to support the growing demand for affordable biologics. Technologies enabling high productivity, right-sized, small footprint, continuous, and automated upstream and downstream operations are evaluated in order to propose a concept for the flexible facility of the future.
Effects of the number of people on efficient capture and sample collection: a lion case study.
Ferreira, Sam M; Maruping, Nkabeng T; Schoultz, Darius; Smit, Travis R
2013-05-24
Certain carnivore research projects and approaches depend on successful capture of individuals of interest. The number of people present at a capture site may determine success of a capture. In this study 36 lion capture cases in the Kruger National Park were used to evaluate whether the number of people present at a capture site influenced lion response rates and whether the number of people at a sampling site influenced the time it took to process the collected samples. The analyses suggest that when nine or fewer people were present, lions appeared faster at a call-up locality compared with when there were more than nine people. The number of people, however, did not influence the time it took to process the lions. It is proposed that efficient lion capturing should spatially separate capture and processing sites and minimise the number of people at a capture site.
Thermogravimetric characterization and gasification of pecan nut shells.
Aldana, Hugo; Lozano, Francisco J; Acevedo, Joaquín; Mendoza, Alberto
2015-12-01
This study focuses on the evaluation of pecan nut shells as an alternative source of energy through pyrolysis and gasification. The physicochemical characteristics of the selected biomass that can influence the process efficiency, consumption rates, and the product yield, as well as create operational problems, were determined. In addition, the thermal decomposition kinetics necessary for prediction of consumption rates and yields were determined. Finally, the performance of a downdraft gasifier fed with pecan nut shells was analyzed in terms of process efficiency and exit gas characteristics. It was found that the pyrolytic decomposition of the nut shells can be modeled adequately using a single equation considering two independent parallel reactions. The performance of the gasification process can be influenced by the particle size and air flow rate, requiring a proper combination of these parameters for reliable operation and production of a valuable syngas. Copyright © 2015 Elsevier Ltd. All rights reserved.
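The "single equation considering two independent parallel reactions" referred to above is commonly written in the following generic form (symbols and parameterization are standard assumptions, not quoted from the paper):

```latex
\frac{d\alpha}{dt} = \sum_{i=1}^{2} c_i \frac{d\alpha_i}{dt},
\qquad
\frac{d\alpha_i}{dt} = A_i\, e^{-E_i/RT}\,\bigl(1-\alpha_i\bigr)^{n_i}
```

where α is the overall conversion, α_i the conversion of pseudo-component i, c_i its mass fraction, A_i and E_i the Arrhenius pre-exponential factor and activation energy, n_i the reaction order, R the gas constant, and T the absolute temperature.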
Daguerre-Martini, S; Vanotti, M B; Rodriguez-Pastor, M; Rosal, A; Moral, R
2018-06-15
Gas-permeable membranes coupled with low-rate aeration are useful to recover ammonia (NH4+) from livestock effluents. In this study, the role of inorganic carbon (bicarbonate, HCO3-) in enhancing the N recovery process was evaluated using synthetic effluents with various NH4+ to HCO3- molar ratios of 0.5, 1.0, 1.5 and 2.0. The study also evaluated the effect of increased organic matter on the NH4+ recovery using humic acids (3000-6000 mg/L), and the N recovery from high-strength swine manure. The release of hydroxide from the HCO3- with aeration increased the wastewater pH and promoted gaseous ammonia formation and membrane uptake. At the same time, the recovery of gaseous ammonia (NH3) through the membrane acidified the wastewater. Therefore, an abundant inorganic carbon supply in balance with the NH4+ is needed for successful operation of the technology. NH4+ removal efficiencies >96% were obtained with NH4+ to HCO3- ratios ≤1. However, higher molar ratios inhibited the N recovery process, resulting in lower efficiencies (<65%). Fortunately, most swine manures contain an ample supply of endogenous inorganic carbon, and the process can be used to recover the ammonia more economically using the natural inorganic carbon instead of expensive alkali chemicals. In 4 days, the recovered NH4+ from swine manure reached 48,000 mg/L. Finally, it was found that the process was not inhibited by the increasing levels of organic matter in the wastewater evaluated. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Dambreville, Frédéric
2013-10-01
While there is a variety of approaches and algorithms for optimizing the mission of an unmanned moving sensor, far fewer works deal with the implementation of several sensors within a human organization. In this case, the management of the sensors is done through at least one human decision layer, and the sensor management as a whole arises as a bi-level optimization process. In this work, the following hypotheses are considered realistic: sensor handlers at the first level plan their sensors by means of elaborate algorithmic tools based on accurate modelling of the environment; the higher level plans the handled sensors according to a global observation mission and on the basis of an approximated model of the environment and of the first-level sub-processes. This problem is formalized very generally as the maximization of an unknown function, defined a priori by sampling a known random function (law of model error). In such a case, each actual evaluation of the function increases the knowledge about the function, and subsequently the efficiency of the maximization. The issue is to optimize the sequence of values to be evaluated, with regard to the evaluation costs. There is a fundamental link here with the domain of experiment design. Jones, Schonlau and Welch proposed a general method, Efficient Global Optimization (EGO), for solving this problem in the case of an additive functional Gaussian law. In our work, a generalization of the EGO is proposed, based on a rare-event simulation approach. It is applied to the aforementioned bi-level sensor planning.
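The EGO method of Jones, Schonlau and Welch cited above chooses the next evaluation point by maximizing expected improvement under the Gaussian model of the unknown function. A minimal Python sketch of that acquisition step is shown below, assuming a fitted Gaussian process that supplies a predictive mean and standard deviation at candidate points (the GP fit itself and the paper's rare-event generalization are not shown, and all data are placeholders).

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """EI for maximization: E[max(f - f_best, 0)] when f ~ N(mu, sigma^2)."""
    sigma = np.maximum(sigma, 1e-12)          # guard against zero predictive std
    z = (mu - f_best) / sigma
    return (mu - f_best) * norm.cdf(z) + sigma * norm.pdf(z)

# One EGO-style iteration over a candidate grid (GP predictions are placeholders).
candidates = np.linspace(0.0, 1.0, 101)
mu = np.sin(3 * candidates)                   # hypothetical GP posterior mean
sigma = 0.2 * np.ones_like(candidates)        # hypothetical GP posterior std
f_best = 0.4                                  # best objective value observed so far
next_x = candidates[np.argmax(expected_improvement(mu, sigma, f_best))]
print("next evaluation point:", next_x)
```

Each new evaluation is added to the design, the GP is refitted, and the criterion is recomputed, trading off exploitation of the predicted optimum against exploration of uncertain regions.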
Process mining is an underutilized clinical research tool in transfusion medicine.
Quinn, Jason G; Conrad, David M; Cheng, Calvino K
2017-03-01
To understand inventory performance, transfusion services commonly use key performance indicators (KPIs) as summary descriptors of inventory efficiency that are graphed, trended, and used to benchmark institutions. Here, we summarize current limitations in KPI-based evaluation of blood bank inventory efficiency and propose process mining as an ideal methodology for application to inventory management research to improve inventory flows and performance. The transit of a blood product from inventory receipt to final disposition is complex and relates to many internal and external influences, and KPIs may be inadequate to fully understand the complexity of the blood supply chain and how units interact with its processes. Process mining lends itself well to analysis of blood bank inventories, and modern laboratory information systems can track nearly all of the complex processes that occur in the blood bank. Process mining is an analytical tool already used in other industries and can be applied to blood bank inventory management and research through laboratory information systems data using commercial applications. Although the current understanding of real blood bank inventories is value-centric through KPIs, it potentially can be understood from a process-centric lens using process mining. © 2017 AABB.
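As a minimal illustration of what a process-centric analysis of a blood bank inventory could look like, the sketch below derives a directly-follows relation (the basic building block from which process mining tools construct discovered models) out of a hypothetical laboratory-information-system event export; the column names and events are invented for illustration, not taken from the article.

```python
import pandas as pd

# Hypothetical LIS export: one row per status change of a blood unit
# (case id = unit number, activity = inventory event, plus a timestamp).
events = pd.DataFrame({
    "unit":     ["U1", "U1", "U1", "U2", "U2", "U2", "U2"],
    "activity": ["receive", "crossmatch", "transfuse",
                 "receive", "crossmatch", "return_to_stock", "transfuse"],
    "timestamp": pd.to_datetime([
        "2016-01-01 08:00", "2016-01-02 09:00", "2016-01-02 14:00",
        "2016-01-01 10:00", "2016-01-03 11:00", "2016-01-04 09:00",
        "2016-01-06 16:00"]),
})

# Directly-follows relation: for each unit, count which activity follows which.
events = events.sort_values(["unit", "timestamp"])
pairs = (events.assign(next_activity=events.groupby("unit")["activity"].shift(-1))
               .dropna(subset=["next_activity"])
               .groupby(["activity", "next_activity"]).size())
print(pairs)   # frequencies of process-flow edges, the raw material of a discovered model
```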
NASA Astrophysics Data System (ADS)
Aad, G.; Abajyan, T.; Abbott, B.; Abdallah, J.; Khalek, S. Abdel; Abdinov, O.; Aben, R.; Abi, B.; et al. (ATLAS Collaboration)
N.; Kashif, L.; Kasieczka, G.; Kass, R. D.; Kastanas, A.; Kataoka, Y.; Katre, A.; Katzy, J.; Kaushik, V.; Kawagoe, K.; Kawamoto, T.; Kawamura, G.; Kazama, S.; Kazanin, V. F.; Kazarinov, M. Y.; Keeler, R.; Keener, P. T.; Kehoe, R.; Keil, M.; Keller, J. S.; Keoshkerian, H.; Kepka, O.; Kerševan, B. P.; Kersten, S.; Kessoku, K.; Keung, J.; Khalil-zada, F.; Khandanyan, H.; Khanov, A.; Khodinov, A.; Khomich, A.; Khoo, T. J.; Khoriauli, G.; Khoroshilov, A.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kim, H. Y.; Kim, H.; Kim, S. H.; Kimura, N.; Kind, O.; King, B. T.; King, M.; King, R. S. B.; King, S. B.; Kirk, J.; Kiryunin, A. E.; Kishimoto, T.; Kisielewska, D.; Kiss, F.; Kitamura, T.; Kittelmann, T.; Kiuchi, K.; Kladiva, E.; Klein, M.; Klein, U.; Kleinknecht, K.; Klimek, P.; Klimentov, A.; Klingenberg, R.; Klinger, J. A.; Klinkby, E. B.; Klioutchnikova, T.; Klok, P. F.; Kluge, E.-E.; Kluit, P.; Kluth, S.; Kneringer, E.; Knoops, E. B. F. G.; Knue, A.; Kobayashi, T.; Kobel, M.; Kocian, M.; Kodys, P.; Koevesarki, P.; Koffas, T.; Koffeman, E.; Kogan, L. A.; Kohlmann, S.; Kohout, Z.; Kohriki, T.; Koi, T.; Kolanoski, H.; Koletsou, I.; Koll, J.; Komar, A. A.; Komori, Y.; Kondo, T.; Köneke, K.; König, A. C.; König, S.; Kono, T.; Konoplich, R.; Konstantinidis, N.; Kopeliansky, R.; Koperny, S.; Köpke, L.; Kopp, A. K.; Korcyl, K.; Kordas, K.; Korn, A.; Korol, A. A.; Korolkov, I.; Korolkova, E. V.; Korotkov, V. A.; Kortner, O.; Kortner, S.; Kostyukhin, V. V.; Kotov, S.; Kotov, V. M.; Kotwal, A.; Kourkoumelis, C.; Kouskoura, V.; Koutsman, A.; Kowalewski, R.; Kowalski, T. Z.; Kozanecki, W.; Kozhin, A. S.; Kral, V.; Kramarenko, V. A.; Kramberger, G.; Krasnopevtsev, D.; Krasny, M. W.; Krasznahorkay, A.; Kraus, J. K.; Kravchenko, A.; Kreiss, S.; Kretz, M.; Kretzschmar, J.; Kreutzfeldt, K.; Krieger, P.; Kroeninger, K.; Kroha, H.; Kroll, J.; Kroseberg, J.; Krstic, J.; Kruchonak, U.; Krüger, H.; Kruker, T.; Krumnack, N.; Krumshteyn, Z. V.; Kruse, A.; Kruse, M. C.; Kruskal, M.; Kubota, T.; Kuday, S.; Kuehn, S.; Kugel, A.; Kuhl, A.; Kuhl, T.; Kukhtin, V.; Kulchitsky, Y.; Kuleshov, S.; Kuna, M.; Kunkle, J.; Kupco, A.; Kurashige, H.; Kurochkin, Y. A.; Kurumida, R.; Kus, V.; Kuwertz, E. S.; Kuze, M.; Kvita, J.; La Rosa, A.; La Rotonda, L.; Labarga, L.; Lacasta, C.; Lacava, F.; Lacey, J.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Laier, H.; Lambourne, L.; Lammers, S.; Lampen, C. L.; Lampl, W.; Lançon, E.; Landgraf, U.; Landon, M. P. J.; Lang, V. S.; Lange, C.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Laplace, S.; Lapoire, C.; Laporte, J. F.; Lari, T.; Lassnig, M.; Laurelli, P.; Lavorini, V.; Lavrijsen, W.; Law, A. T.; Laycock, P.; Le, B. T.; Le Dortz, O.; Guirriec, E. Le; Menedeu, E. Le; LeCompte, T.; Ledroit-Guillon, F.; Lee, C. A.; Lee, H.; Lee, J. S. H.; Lee, S. C.; Lee, L.; Lefebvre, G.; Lefebvre, M.; Legger, F.; Leggett, C.; Lehan, A.; Lehmacher, M.; Miotto, G. Lehmann; Lei, X.; Leister, A. G.; Leite, M. A. L.; Leitner, R.; Lellouch, D.; Lemmer, B.; Leney, K. J. C.; Lenz, T.; Lenzen, G.; Lenzi, B.; Leone, R.; Leonhardt, K.; Leontsinis, S.; Leroy, C.; Lester, C. G.; Lester, C. M.; Levêque, J.; Levin, D.; Levinson, L. J.; Levy, M.; Lewis, A.; Lewis, G. H.; Leyko, A. M.; Leyton, M.; Li, B.; Li, H.; Li, H. L.; Li, S.; Li, X.; Li, Y.; Liang, Z.; Liao, H.; Liberti, B.; Lichard, P.; Lie, K.; Liebal, J.; Liebig, W.; Limbach, C.; Limosani, A.; Limper, M.; Lin, S. C.; Linde, F.; Lindquist, B. E.; Linnemann, J. T.; Lipeles, E.; Lipniacka, A.; Lisovyi, M.; Liss, T. 
M.; Lissauer, D.; Lister, A.; Litke, A. M.; Liu, B.; Liu, D.; Liu, J. B.; Liu, K.; Liu, L.; Liu, M.; Liu, Y.; Livan, M.; Livermore, S. S. A.; Lleres, A.; Llorente Merino, J.; Lloyd, S. L.; Lo Sterzo, F.; Lobodzinska, E.; Loch, P.; Lockman, W. S.; Loddenkoetter, T.; Loebinger, F. K.; Loevschall-Jensen, A. E.; Loginov, A.; Loh, C. W.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Lombardo, V. P.; Long, J. D.; Long, R. E.; Lopes, L.; Lopez Mateos, D.; Paredes, B. Lopez; Lorenz, J.; Lorenzo Martinez, N.; Losada, M.; Loscutoff, P.; Losty, M. J.; Lou, X.; Lounis, A.; Love, J.; Love, P. A.; Lowe, A. J.; Lu, F.; Lubatti, H. J.; Luci, C.; Lucotte, A.; Luehring, F.; Lukas, W.; Luminari, L.; Lundberg, O.; Lund-Jensen, B.; Lungwitz, M.; Lynn, D.; Lysak, R.; Lytken, E.; Ma, H.; Ma, L. L.; Maccarrone, G.; Macchiolo, A.; Maček, B.; Miguens, J. Machado; Macina, D.; Madaffari, D.; Madar, R.; Maddocks, H. J.; Mader, W. F.; Madsen, A.; Maeno, M.; Maeno, T.; Magradze, E.; Mahboubi, K.; Mahlstedt, J.; Mahmoud, S.; Maiani, C.; Maidantchik, C.; Maio, A.; Majewski, S.; Makida, Y.; Makovec, N.; Mal, P.; Malaescu, B.; Malecki, Pa.; Maleev, V. P.; Malek, F.; Mallik, U.; Malon, D.; Malone, C.; Maltezos, S.; Malyshev, V. M.; Malyukov, S.; Mamuzic, J.; Mandelli, B.; Mandelli, L.; Mandić, I.; Mandrysch, R.; Maneira, J.; Manfredini, A.; de Andrade Filho, L. Manhaes; Ramos, J. A. Manjarres; Mann, A.; Manning, P. M.; Manousakis-Katsikakis, A.; Mansoulie, B.; Mantifel, R.; Mapelli, L.; March, L.; Marchand, J. F.; Marchese, F.; Marchiori, G.; Marcisovsky, M.; Marino, C. P.; Marques, C. N.; Marroquim, F.; Marsden, S. P.; Marshall, Z.; Marti, L. F.; Marti-Garcia, S.; Martin, B.; Martin, J. P.; Martin, T. A.; Martin, V. J.; Martin dit Latour, B.; Martinez, H.; Martinez, M.; Martin-Haugh, S.; Martyniuk, A. C.; Marx, M.; Marzano, F.; Marzin, A.; Masetti, L.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. L.; Massa, I.; Massol, N.; Mastrandrea, P.; Mastroberardino, A.; Masubuchi, T.; Matricon, P.; Matsunaga, H.; Matsushita, T.; Mättig, P.; Mättig, S.; Mattmann, J.; Maurer, J.; Maxfield, S. J.; Maximov, D. A.; Mazini, R.; Mazzaferro, L.; Mc Goldrick, G.; Mc Kee, S. P.; McCarn, A.; McCarthy, R. L.; McCarthy, T. G.; McCubbin, N. A.; McFarlane, K. W.; Mcfayden, J. A.; Mchedlidze, G.; Mclaughlan, T.; McMahon, S. J.; McPherson, R. A.; Meade, A.; Mechnich, J.; Medinnis, M.; Meehan, S.; Meera-Lebbai, R.; Mehlhase, S.; Mehta, A.; Meier, K.; Meineck, C.; Meirose, B.; Melachrinos, C.; Mellado Garcia, B. R.; Meloni, F.; Mendoza Navas, L.; Mengarelli, A.; Menke, S.; Meoni, E.; Mercurio, K. M.; Mergelmeyer, S.; Meric, N.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F. S.; Merritt, H.; Messina, A.; Metcalfe, J.; Mete, A. S.; Meyer, C.; Meyer, C.; Meyer, J.-P.; Meyer, J.; Middleton, R. P.; Migas, S.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikuž, M.; Miller, D. W.; Mills, C.; Milov, A.; Milstead, D. A.; Milstein, D.; Minaenko, A. A.; Moya, M. Miñano; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L. M.; Mirabelli, G.; Mitani, T.; Mitrevski, J.; Mitsou, V. A.; Mitsui, S.; Miucci, A.; Miyagawa, P. S.; Mjörnmark, J. U.; Moa, T.; Mochizuki, K.; Moeller, V.; Mohapatra, S.; Mohr, W.; Molander, S.; Moles-Valls, R.; Mönig, K.; Monini, C.; Monk, J.; Monnier, E.; Montejo Berlingen, J.; Monticelli, F.; Monzani, S.; Moore, R. W.; Herrera, C. Mora; Moraes, A.; Morange, N.; Morel, J.; Moreno, D.; Moreno Llácer, M.; Morettini, P.; Morgenstern, M.; Morii, M.; Moritz, S.; Morley, A. K.; Mornacchi, G.; Morris, J. 
D.; Morvaj, L.; Moser, H. G.; Mosidze, M.; Moss, J.; Mount, R.; Mountricha, E.; Mouraviev, S. V.; Moyse, E. J. W.; Muanza, S.; Mudd, R. D.; Mueller, F.; Mueller, J.; Mueller, K.; Mueller, T.; Muenstermann, D.; Munwes, Y.; Murillo Quijada, J. A.; Murray, W. J.; Musto, E.; Myagkov, A. G.; Myska, M.; Nackenhorst, O.; Nadal, J.; Nagai, K.; Nagai, R.; Nagai, Y.; Nagano, K.; Nagarkar, A.; Nagasaka, Y.; Nagel, M.; Nairz, A. M.; Nakahama, Y.; Nakamura, K.; Nakamura, T.; Nakano, I.; Namasivayam, H.; Nanava, G.; Narayan, R.; Nattermann, T.; Naumann, T.; Navarro, G.; Nayyar, R.; Neal, H. A.; Nechaeva, P. Yu.; Neep, T. J.; Negri, A.; Negri, G.; Negrini, M.; Nektarijevic, S.; Nelson, A.; Nelson, T. K.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Neubauer, M. S.; Neumann, M.; Neves, R. M.; Nevski, P.; Newcomer, F. M.; Newman, P. R.; Nguyen, D. H.; Nickerson, R. B.; Nicolaidou, R.; Nicquevert, B.; Nielsen, J.; Nikiforou, N.; Nikiforov, A.; Nikolaenko, V.; Nikolic-Audit, I.; Nikolics, K.; Nikolopoulos, K.; Nilsson, P.; Ninomiya, Y.; Nisati, A.; Nisius, R.; Nobe, T.; Nodulman, L.; Nomachi, M.; Nomidis, I.; Norberg, S.; Nordberg, M.; Nowak, S.; Nozaki, M.; Nozka, L.; Ntekas, K.; Nunes Hanninger, G.; Nunnemann, T.; Nurse, E.; Nuti, F.; O'Brien, B. J.; O'grady, F.; O'Neil, D. C.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Obermann, T.; Ocariz, J.; Ochi, A.; Ochoa, M. I.; Oda, S.; Odaka, S.; Ogren, H.; Oh, A.; Oh, S. H.; Ohm, C. C.; Ohman, H.; Ohshima, T.; Okamura, W.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olariu, A.; Olchevski, A. G.; Olivares Pino, S. A.; Damazio, D. Oliveira; Garcia, E. Oliver; Olivito, D.; Olszewski, A.; Olszowska, J.; Onofre, A.; Onyisi, P. U. E.; Oram, C. J.; Oreglia, M. J.; Oren, Y.; Orestano, D.; Orlando, N.; Barrera, C. Oropeza; Orr, R. S.; Osculati, B.; Ospanov, R.; Garzon, G. Otero y.; Otono, H.; Ouchrif, M.; Ouellette, E. A.; Ould-Saada, F.; Ouraou, A.; Oussoren, K. P.; Ouyang, Q.; Ovcharova, A.; Owen, M.; Ozcan, V. E.; Ozturk, N.; Pachal, K.; Pages, A. Pacheco; Padilla Aranda, C.; Pagáčová, M.; Pagan Griso, S.; Paganis, E.; Pahl, C.; Paige, F.; Pais, P.; Pajchel, K.; Palacino, G.; Palestini, S.; Pallin, D.; Palma, A.; Palmer, J. D.; Pan, Y. B.; Panagiotopoulou, E.; Panduro Vazquez, J. G.; Pani, P.; Panikashvili, N.; Panitkin, S.; Pantea, D.; Paolozzi, L.; Papadopoulou, Th. D.; Papageorgiou, K.; Paramonov, A.; Paredes Hernandez, D.; Parker, M. A.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pasqualucci, E.; Passaggio, S.; Passeri, A.; Pastore, F.; Pastore, Fr.; Pásztor, G.; Pataraia, S.; Patel, N. D.; Pater, J. R.; Patricelli, S.; Pauly, T.; Pearce, J.; Pedersen, M.; Lopez, S. Pedraza; Pedro, R.; Peleganchuk, S. V.; Pelikan, D.; Peng, H.; Penning, B.; Penwell, J.; Perepelitsa, D. V.; Perez Codina, E.; García-Estan, M. T. Pérez; Perez Reale, V.; Perini, L.; Pernegger, H.; Perrino, R.; Peschke, R.; Peshekhonov, V. D.; Peters, K.; Peters, R. F. Y.; Petersen, B. A.; Petersen, J.; Petersen, T. C.; Petit, E.; Petridis, A.; Petridou, C.; Petrolo, E.; Petrucci, F.; Petteni, M.; Pettersson, N. E.; Pezoa, R.; Phillips, P. W.; Piacquadio, G.; Pianori, E.; Picazio, A.; Piccaro, E.; Piccinini, M.; Piec, S. M.; Piegaia, R.; Pignotti, D. T.; Pilcher, J. E.; Pilkington, A. D.; Pina, J.; Pinamonti, M.; Pinder, A.; Pinfold, J. L.; Pingel, A.; Pinto, B.; Pires, S.; Pizio, C.; Pleier, M.-A.; Pleskot, V.; Plotnikova, E.; Plucinski, P.; Poddar, S.; Podlyski, F.; Poettgen, R.; Poggioli, L.; Pohl, D.; Pohl, M.; Polesello, G.; Policicchio, A.; Polifka, R.; Polini, A.; Pollard, C. 
S.; Polychronakos, V.; Pommès, K.; Pontecorvo, L.; Pope, B. G.; Popeneciu, G. A.; Popovic, D. S.; Poppleton, A.; Portell Bueso, X.; Pospelov, G. E.; Pospisil, S.; Potamianos, K.; Potrap, I. N.; Potter, C. J.; Potter, C. T.; Poulard, G.; Poveda, J.; Pozdnyakov, V.; Prabhu, R.; Pralavorio, P.; Pranko, A.; Prasad, S.; Pravahan, R.; Prell, S.; Price, D.; Price, J.; Price, L. E.; Prieur, D.; Primavera, M.; Proissl, M.; Prokofiev, K.; Prokoshin, F.; Protopapadaki, E.; Protopopescu, S.; Proudfoot, J.; Przybycien, M.; Przysiezniak, H.; Ptacek, E.; Pueschel, E.; Puldon, D.; Purohit, M.; Puzo, P.; Pylypchenko, Y.; Qian, J.; Qin, G.; Quadt, A.; Quarrie, D. R.; Quayle, W. B.; Quilty, D.; Qureshi, A.; Radeka, V.; Radescu, V.; Radhakrishnan, S. K.; Radloff, P.; Rados, P.; Ragusa, F.; Rahal, G.; Rajagopalan, S.; Rammensee, M.; Rammes, M.; Randle-Conde, A. S.; Rangel-Smith, C.; Rao, K.; Rauscher, F.; Rave, T. C.; Ravenscroft, T.; Raymond, M.; Read, A. L.; Rebuzzi, D. M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Rehnisch, L.; Reinsch, A.; Reisin, H.; Relich, M.; Rembser, C.; Ren, Z. L.; Renaud, A.; Rescigno, M.; Resconi, S.; Resende, B.; Rezanova, O. L.; Reznicek, P.; Rezvani, R.; Richter, R.; Richter-Was, E.; Ridel, M.; Rieck, P.; Rijssenbeek, M.; Rimoldi, A.; Rinaldi, L.; Ritsch, E.; Riu, I.; Rizatdinova, F.; Rizvi, E.; Robertson, S. H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, J. E. M.; Robson, A.; Roda, C.; Rodrigues, L.; Roe, S.; Røhne, O.; Rolli, S.; Romaniouk, A.; Romano, M.; Romeo, G.; Adam, E. Romero; Rompotis, N.; Roos, L.; Ros, E.; Rosati, S.; Rosbach, K.; Rose, A.; Rose, M.; Rosendahl, P. L.; Rosenthal, O.; Rossetti, V.; Rossi, E.; Rossi, L. P.; Rosten, R.; Rotaru, M.; Roth, I.; Rothberg, J.; Rousseau, D.; Royon, C. R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Rubbo, F.; Rubinskiy, I.; Rud, V. I.; Rudolph, C.; Rudolph, M. S.; Rühr, F.; Ruiz-Martinez, A.; Rurikova, Z.; Rusakovich, N. A.; Ruschke, A.; Rutherfoord, J. P.; Ruthmann, N.; Ryabov, Y. F.; Rybar, M.; Rybkin, G.; Ryder, N. C.; Saavedra, A. F.; Sacerdoti, S.; Saddique, A.; Sadeh, I.; Sadrozinski, H. F.-W.; Sadykov, R.; Safai Tehrani, F.; Sakamoto, H.; Sakurai, Y.; Salamanna, G.; Salamon, A.; Saleem, M.; Salek, D.; Sales De Bruin, P. H.; Salihagic, D.; Salnikov, A.; Salt, J.; Ferrando, B. M. Salvachua; Salvatore, D.; Salvatore, F.; Salvucci, A.; Salzburger, A.; Sampsonidis, D.; Sanchez, A.; Sánchez, J.; Sanchez Martinez, V.; Sandaker, H.; Sander, H. G.; Sanders, M. P.; Sandhoff, M.; Sandoval, T.; Sandoval, C.; Sandstroem, R.; Sankey, D. P. C.; Sansoni, A.; Santoni, C.; Santonico, R.; Santos, H.; Castillo, I. Santoyo; Sapp, K.; Sapronov, A.; Saraiva, J. G.; Sarrazin, B.; Sartisohn, G.; Sasaki, O.; Sasaki, Y.; Satsounkevitch, I.; Sauvage, G.; Sauvan, E.; Savard, P.; Savu, D. O.; Sawyer, C.; Sawyer, L.; Saxon, J.; Sbarra, C.; Sbrizzi, A.; Scanlon, T.; Scannicchio, D. A.; Scarcella, M.; Schaarschmidt, J.; Schacht, P.; Schaefer, D.; Schaefer, R.; Schaelicke, A.; Schaepe, S.; Schaetzel, S.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schamberger, R. D.; Scharf, V.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Scherzer, M. I.; Schiavi, C.; Schieck, J.; Schillo, C.; Schioppa, M.; Schlenker, S.; Schmidt, E.; Schmieden, K.; Schmitt, C.; Schmitt, S.; Schneider, B.; Schnellbach, Y. J.; Schnoor, U.; Schoeffel, L.; Schoening, A.; Schoenrock, B. D.; Schorlemmer, A. L. S.; Schott, M.; Schouten, D.; Schovancova, J.; Schram, M.; Schramm, S.; Schreyer, M.; Schroeder, C.; Schuh, N.; Schultens, M. 
J.; Schultz-Coulon, H.-C.; Schulz, H.; Schumacher, M.; Schumm, B. A.; Schune, Ph.; Schwartzman, A.; Schwegler, Ph.; Schwemling, Ph.; Schwienhorst, R.; Schwindling, J.; Schwindt, T.; Schwoerer, M.; Sciacca, F. G.; Scifo, E.; Sciolla, G.; Scott, W. G.; Scuri, F.; Scutti, F.; Searcy, J.; Sedov, G.; Sedykh, E.; Seidel, S. C.; Seiden, A.; Seifert, F.; Seixas, J. M.; Sekhniaidze, G.; Sekula, S. J.; Selbach, K. E.; Seliverstov, D. M.; Sellers, G.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Serkin, L.; Serre, T.; Seuster, R.; Severini, H.; Sforza, F.; Sfyrla, A.; Shabalina, E.; Shamim, M.; Shan, L. Y.; Shank, J. T.; Shao, Q. T.; Shapiro, M.; Shatalov, P. B.; Shaw, K.; Sherwood, P.; Shimizu, S.; Shimmin, C. O.; Shimojima, M.; Shiyakova, M.; Shmeleva, A.; Shochet, M. J.; Short, D.; Shrestha, S.; Shulga, E.; Shupe, M. A.; Shushkevich, S.; Sicho, P.; Sidorov, D.; Sidoti, A.; Siegert, F.; Sijacki, Dj.; Silbert, O.; Silva, J.; Silver, Y.; Silverstein, D.; Silverstein, S. B.; Simak, V.; Simard, O.; Simic, Lj.; Simion, S.; Simioni, E.; Simmons, B.; Simoniello, R.; Simonyan, M.; Sinervo, P.; Sinev, N. B.; Sipica, V.; Siragusa, G.; Sircar, A.; Sisakyan, A. N.; Sivoklokov, S. Yu.; Sjölin, J.; Sjursen, T. B.; Skinnari, L. A.; Skottowe, H. P.; Skovpen, K. Yu.; Skubic, P.; Slater, M.; Slavicek, T.; Sliwa, K.; Smakhtin, V.; Smart, B. H.; Smestad, L.; Smirnov, S. Yu.; Smirnov, Y.; Smirnova, L. N.; Smirnova, O.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snidero, G.; Snow, J.; Snyder, S.; Sobie, R.; Socher, F.; Sodomka, J.; Soffer, A.; Soh, D. A.; Solans, C. A.; Solar, M.; Solc, J.; Soldatov, E. Yu.; Soldevila, U.; Camillocci, E. Solfaroli; Solodkov, A. A.; Solovyanov, O. V.; Solovyev, V.; Sommer, P.; Song, H. Y.; Soni, N.; Sood, A.; Sopko, V.; Sopko, B.; Sorin, V.; Sosebee, M.; Soualah, R.; Soueid, P.; Soukharev, A. M.; South, D.; Spagnolo, S.; Spanò, F.; Spearman, W. R.; Spighi, R.; Spigo, G.; Spousta, M.; Spreitzer, T.; Spurlock, B.; Denis, R. D. St.; Staerz, S.; Stahlman, J.; Stamen, R.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stanescu-Bellu, M.; Stanitzki, M. M.; Stapnes, S.; Starchenko, E. A.; Stark, J.; Staroba, P.; Starovoitov, P.; Staszewski, R.; Stavina, P.; Steele, G.; Steinberg, P.; Stekl, I.; Stelzer, B.; Stelzer, H. J.; Stelzer-Chilton, O.; Stenzel, H.; Stern, S.; Stewart, G. A.; Stillings, J. A.; Stockton, M. C.; Stoebe, M.; Stoerig, K.; Stoicea, G.; Stolte, P.; Stonjek, S.; Stradling, A. R.; Straessner, A.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strauss, E.; Strauss, M.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Stroynowski, R.; Stucci, S. A.; Stugu, B.; Styles, N. A.; Su, D.; Su, J.; Subramania, HS.; Subramaniam, R.; Succurro, A.; Sugaya, Y.; Suhr, C.; Suk, M.; Sulin, V. V.; Sultansoy, S.; Sumida, T.; Sun, X.; Sundermann, J. E.; Suruliz, K.; Susinno, G.; Sutton, M. R.; Suzuki, Y.; Svatos, M.; Swedish, S.; Swiatlowski, M.; Sykora, I.; Sykora, T.; Ta, D.; Tackmann, K.; Taenzer, J.; Taffard, A.; Tafirout, R.; Taiblum, N.; Takahashi, Y.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Takubo, Y.; Talby, M.; Talyshev, A. A.; Tam, J. Y. C.; Tamsett, M. C.; Tan, K. G.; Tanaka, J.; Tanaka, R.; Tanaka, S.; Tanaka, S.; Tanasijczuk, A. J.; Tani, K.; Tannoury, N.; Tapprogge, S.; Tarem, S.; Tarrade, F.; Tartarelli, G. F.; Tas, P.; Tasevsky, M.; Tashiro, T.; Tassi, E.; Tavares Delgado, A.; Tayalati, Y.; Taylor, C.; Taylor, F. E.; Taylor, G. N.; Taylor, W.; Teischinger, F. A.; Teixeira Dias Castanheira, M.; Teixeira-Dias, P.; Temming, K. K.; Ten Kate, H.; Teng, P. 
K.; Terada, S.; Terashi, K.; Terron, J.; Terzo, S.; Testa, M.; Teuscher, R. J.; Therhaag, J.; Theveneaux-Pelzer, T.; Thoma, S.; Thomas, J. P.; Thomas-Wilsker, J.; Thompson, E. N.; Thompson, P. D.; Thompson, P. D.; Thompson, A. S.; Thomsen, L. A.; Thomson, E.; Thomson, M.; Thong, W. M.; Thun, R. P.; Tian, F.; Tibbetts, M. J.; Tikhomirov, V. O.; Tikhonov, Yu. A.; Timoshenko, S.; Tiouchichine, E.; Tipton, P.; Tisserant, S.; Todorov, T.; Todorova-Nova, S.; Toggerson, B.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tollefson, K.; Tomlinson, L.; Tomoto, M.; Tompkins, L.; Toms, K.; Topilin, N. D.; Torrence, E.; Torres, H.; Torró Pastor, E.; Toth, J.; Touchard, F.; Tovey, D. R.; Tran, H. L.; Trefzger, T.; Tremblet, L.; Tricoli, A.; Trigger, I. M.; Trincaz-Duvoid, S.; Tripiana, M. F.; Triplett, N.; Trischuk, W.; Trocmé, B.; Troncon, C.; Trottier-McDonald, M.; Trovatelli, M.; True, P.; Trzebinski, M.; Trzupek, A.; Tsarouchas, C.; Tseng, J. C.-L.; Tsiareshka, P. V.; Tsionou, D.; Tsipolitis, G.; Tsirintanis, N.; Tsiskaridze, S.; Tsiskaridze, V.; Tskhadadze, E. G.; Tsukerman, I. I.; Tsulaia, V.; Tsuno, S.; Tsybychev, D.; Tua, A.; Tudorache, A.; Tudorache, V.; Tuna, A. N.; Tupputi, S. A.; Turchikhin, S.; Turecek, D.; Turk Cakir, I.; Turra, R.; Tuts, P. M.; Tykhonov, A.; Tylmad, M.; Tyndel, M.; Uchida, K.; Ueda, I.; Ueno, R.; Ughetto, M.; Ugland, M.; Uhlenbrock, M.; Ukegawa, F.; Unal, G.; Undrus, A.; Unel, G.; Ungaro, F. C.; Unno, Y.; Urbaniec, D.; Urquijo, P.; Usai, G.; Usanova, A.; Vacavant, L.; Vacek, V.; Vachon, B.; Valencic, N.; Valentinetti, S.; Valero, A.; Valery, L.; Valkar, S.; Gallego, E. Valladolid; Vallecorsa, S.; Ferrer, J. A. Valls; Van Berg, R.; Van Der Deijl, P. C.; van der Geer, R.; van der Graaf, H.; Van Der Leeuw, R.; van der Ster, D.; Eldik, N. van; van Gemmeren, P.; Van Nieuwkoop, J.; van Vulpen, I.; van Woerden, M. C.; Vanadia, M.; Vandelli, W.; Vaniachine, A.; Vankov, P.; Vannucci, F.; Vardanyan, G.; Vari, R.; Varnes, E. W.; Varol, T.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vazeille, F.; Schroeder, T. Vazquez; Veatch, J.; Veloso, F.; Veneziano, S.; Ventura, A.; Ventura, D.; Venturi, M.; Venturi, N.; Venturini, A.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vest, A.; Vetterli, M. C.; Viazlo, O.; Vichou, I.; Vickey, T.; Vickey Boeriu, O. E.; Viehhauser, G. H. A.; Viel, S.; Vigne, R.; Villa, M.; Villaplana Perez, M.; Vilucchi, E.; Vincter, M. G.; Vinogradov, V. B.; Virzi, J.; Vitells, O.; Vivarelli, I.; Vives Vaque, F.; Vlachos, S.; Vladoiu, D.; Vlasak, M.; Vogel, A.; Vokac, P.; Volpi, G.; Volpi, M.; Schmitt, H. von der; Radziewski, H. von; Toerne, E. von; Vorobel, V.; Vorobev, K.; Vos, M.; Voss, R.; Vossebeld, J. H.; Vranjes, N.; Milosavljevic, M. Vranjes; Vrba, V.; Vreeswijk, M.; Vu Anh, T.; Vuillermet, R.; Vukotic, I.; Vykydal, Z.; Wagner, W.; Wagner, P.; Wahrmund, S.; Wakabayashi, J.; Walder, J.; Walker, R.; Walkowiak, W.; Wall, R.; Waller, P.; Walsh, B.; Wang, C.; Wang, C.; Wang, F.; Wang, H.; Wang, H.; Wang, J.; Wang, J.; Wang, K.; Wang, R.; Wang, S. M.; Wang, T.; Wang, X.; Warburton, A.; Ward, C. P.; Wardrope, D. R.; Warsinsky, M.; Washbrook, A.; Wasicki, C.; Watanabe, I.; Watkins, P. M.; Watson, A. T.; Watson, I. J.; Watson, M. F.; Watts, G.; Watts, S.; Waugh, B. M.; Webb, S.; Weber, M. S.; Weber, S. W.; Webster, J. S.; Weidberg, A. R.; Weigell, P.; Weinert, B.; Weingarten, J.; Weiser, C.; Weits, H.; Wells, P. 
S.; Wenaus, T.; Wendland, D.; Weng, Z.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, P.; Wessels, M.; Wetter, J.; Whalen, K.; White, A.; White, M. J.; White, R.; White, S.; Whiteson, D.; Wicke, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik-Fuchs, L. A. M.; Wijeratne, P. A.; Wildauer, A.; Wildt, M. A.; Wilkens, H. G.; Will, J. Z.; Williams, H. H.; Williams, S.; Willis, C.; Willocq, S.; Wilson, J. A.; Wilson, A.; Wingerter-Seez, I.; Winkelmann, S.; Winklmeier, F.; Wittgen, M.; Wittig, T.; Wittkowski, J.; Wollstadt, S. J.; Wolter, M. W.; Wolters, H.; Wosiek, B. K.; Wotschack, J.; Woudstra, M. J.; Wozniak, K. W.; Wright, M.; Wu, M.; Wu, S. L.; Wu, X.; Wu, Y.; Wulf, E.; Wyatt, T. R.; Wynne, B. M.; Xella, S.; Xiao, M.; Xu, D.; Xu, L.; Yabsley, B.; Yacoob, S.; Yamada, M.; Yamaguchi, H.; Yamaguchi, Y.; Yamamoto, A.; Yamamoto, K.; Yamamoto, S.; Yamamura, T.; Yamanaka, T.; Yamauchi, K.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, H.; Yang, U. K.; Yang, Y.; Yanush, S.; Yao, L.; Yao, W.-M.; Yasu, Y.; Yatsenko, E.; Yau Wong, K. H.; Ye, J.; Ye, S.; Yen, A. L.; Yildirim, E.; Yilmaz, M.; Yoosoofmiya, R.; Yorita, K.; Yoshida, R.; Yoshihara, K.; Young, C.; Young, C. J. S.; Youssef, S.; Yu, D. R.; Yu, J.; Yu, J. M.; Yu, J.; Yuan, L.; Yurkewicz, A.; Zabinski, B.; Zaidan, R.; Zaitsev, A. M.; Zaman, A.; Zambito, S.; Zanello, L.; Zanzi, D.; Zaytsev, A.; Zeitnitz, C.; Zeman, M.; Zemla, A.; Zengel, K.; Zenin, O.; Ženiš, T.; Zerwas, D.; Zevi della Porta, G.; Zhang, D.; Zhang, F.; Zhang, H.; Zhang, J.; Zhang, L.; Zhang, X.; Zhang, Z.; Zhao, Z.; Zhemchugov, A.; Zhong, J.; Zhou, B.; Zhou, L.; Zhou, N.; Zhu, C. G.; Zhu, H.; Zhu, J.; Zhu, Y.; Zhuang, X.; Zibell, A.; Zieminska, D.; Zimine, N. I.; Zimmermann, C.; Zimmermann, R.; Zimmermann, S.; Zimmermann, S.; Zinonos, Z.; Ziolkowski, M.; Zitoun, R.; Zobernig, G.; Zoccoli, A.; zur Nedden, M.; Zurzolo, G.; Zutshi, V.; Zwalinski, L.
2014-07-01
Many of the interesting physics processes to be measured at the LHC have a signature involving one or more isolated electrons. The electron reconstruction and identification efficiencies of the ATLAS detector at the LHC have been evaluated using proton-proton collision data collected in 2011 at √s = 7 TeV and corresponding to an integrated luminosity of 4.7 fb⁻¹. Tag-and-probe methods using events with leptonic decays of W and Z bosons and J/ψ mesons are employed to benchmark these performance parameters. The combination of all measurements results in identification efficiencies determined with an accuracy at the few per mil level for electron transverse energy greater than 30 GeV.
NASA Technical Reports Server (NTRS)
Li, Zhenlong; Hu, Fei; Schnase, John L.; Duffy, Daniel Q.; Lee, Tsengdar; Bowen, Michael K.; Yang, Chaowei
2016-01-01
Climate observations and model simulations are producing vast amounts of array-based spatiotemporal data. Efficient processing of these data is essential for assessing global challenges such as climate change, natural disasters, and diseases. This is challenging not only because of the large data volume, but also because of the intrinsic high-dimensional nature of geoscience data. To tackle this challenge, we propose a spatiotemporal indexing approach to efficiently manage and process big climate data with MapReduce in a highly scalable environment. Using this approach, big climate data are directly stored in a Hadoop Distributed File System in their original, native file format. A spatiotemporal index is built to bridge the logical array-based data model and the physical data layout, which enables fast data retrieval when performing spatiotemporal queries. Based on the index, a data-partitioning algorithm is applied to enable MapReduce to achieve high data locality, as well as to balance the workload. The proposed indexing approach is evaluated using the National Aeronautics and Space Administration (NASA) Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. The experimental results show that the index can significantly accelerate querying and processing (a 10× speedup compared to the baseline test using the same computing cluster), while keeping the index-to-data ratio small (0.0328). The applicability of the indexing approach is demonstrated by a climate anomaly detection application deployed on a NASA Hadoop cluster. This approach is also able to support efficient processing of general array-based spatiotemporal data in various geoscience domains without special configuration on a Hadoop cluster.
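As a rough illustration of how such a grid-based spatiotemporal key can group array records for locality-aware retrieval, the sketch below builds a tile-to-offset index in plain Python; the tile size, key layout, and record tuples are illustrative assumptions, not the MERRA/Hadoop implementation described above.

```python
# Minimal sketch of a grid-based spatiotemporal index key, assuming a regular
# lat/lon grid and integer day steps; the tile size, key layout, and record
# tuples are illustrative assumptions, not the MERRA/Hadoop implementation.
from collections import defaultdict

TILE_DEG = 10.0  # assumed spatial tile size in degrees

def tile_key(day, lat, lon):
    """Map a (day, lat, lon) observation to a coarse spatiotemporal tile."""
    return (day, int((lat + 90.0) // TILE_DEG), int((lon + 180.0) // TILE_DEG))

def build_index(records):
    """records: iterable of (byte_offset, day, lat, lon).
    Returns tile -> list of byte offsets, so a query that touches one tile
    only needs to read the matching byte ranges of the native file."""
    index = defaultdict(list)
    for offset, day, lat, lon in records:
        index[tile_key(day, lat, lon)].append(offset)
    return index

recs = [(0, 1, 35.2, -97.1), (4096, 1, 36.0, -96.5), (8192, 2, -12.3, 130.8)]
idx = build_index(recs)
print(idx[tile_key(1, 35.2, -97.1)])  # offsets co-located in the same tile
```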
Performance of point-of-use devices to remove manganese from drinking water.
Carrière, Annie; Brouillon, Manon; Sauvé, Sébastien; Bouchard, Maryse F; Barbeau, Benoit
2011-01-01
A recent epidemiological study reported significant cognitive deficits among children in relation to consumption of water with manganese concentrations on the order of 50-100 μg/L. Concerns about neurotoxic effects of manganese raise the need for evaluating the efficiency of domestic water treatment systems for removal of this metal. The objective of the present study was to determine whether point-of-use (POU) devices are efficient at reducing dissolved manganese concentration in drinking water. Various devices were tested according to the NSF 53 protocol for general metals for high pH test water. Based on these assays, the pour-through filters were identified as the most promising POU devices, with dissolved manganese removal greater than 60% at 100% rated capacity, and greater than 45% at 200% rated capacity (influent Mn ≈1,000 μg/L). Under-the-sink filters using cationic exchange resins (i.e., water softeners) were also efficient at removing dissolved manganese but over a shorter operating life. Manganese leaching was also observed beyond their rated capacity, making them less robust treatments. The activated carbon block filters and other proprietary technologies were found to be inappropriate for dissolved manganese removal. Further evaluation of POU device performance should assess the impact of hardness on process performance. The impact of particulate Mn should also be evaluated.
Influence of operational parameters on electro-Fenton degradation of organic pollutants from soil.
Rosales, E; Pazos, M; Longo, M A; Sanroman, M A
2009-09-01
The combination of Fenton's reagent with electrochemistry (the electro-Fenton process) represents an efficient method for wastewater treatment. This study describes the use of this process to clean soil or clay contaminated by organic compounds. Model soil of kaolinite clay polluted with the dye Lissamine Green B (LGB) was used to evaluate the capability of the electro-Fenton process. The effects of operating parameters such as electrode material and dye concentration were investigated. Operating in an electrochemical cell under optimized conditions (graphite electrodes, a constant potential difference of 5 V, pH 3, 0.2 mM FeSO₄·7H₂O, and 0.1 M Na₂SO₄ electrolyte), around 80% of the LGB dye on kaolinite clay was decolorized after 3 hours, with an electric power consumption of around 0.15 Wh g⁻¹. Furthermore, the efficiency of this process for the remediation of a real soil polluted with phenanthrene, a typical polycyclic aromatic hydrocarbon, has been demonstrated.
Evaluation Of Sludge Heel Dissolution Efficiency With Oxalic Acid Cleaning At Savannah River Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sudduth, Christie; Vitali, Jason; Keefer, Mark
The chemical cleaning process baseline strategy at the Savannah River Site was revised to improve efficiency during future execution of the process based on lessons learned during previous bulk oxalic acid cleaning activities and to account for operational constraints imposed by safety basis requirements. These improvements were also intended to transcend the difficulties that arise from waste removal in higher rheological yield stress sludge tanks. Tank 12 implemented this improved strategy and the bulk oxalic acid cleaning efforts concluded in July 2013. The Tank 12 radiological removal results were similar to previous bulk oxalic acid cleaning campaigns despite the fact that Tank 12 contained higher rheological yield stress sludge that would make removal more difficult than the sludge treated in previous cleaning campaigns. No appreciable oxalate precipitation occurred during the cleaning process in Tank 12 compared to previous campaigns, which aided in the net volume reduction of 75-80%. Overall, the controls established for Tank 12 provide a template for an improved cleaning process.
Crawford, Forrest W.; Suchard, Marc A.
2011-01-01
A birth-death process is a continuous-time Markov chain that counts the number of particles in a system over time. In the general process with n current particles, a new particle is born with instantaneous rate λn and a particle dies with instantaneous rate μn. Currently no robust and efficient method exists to evaluate the finite-time transition probabilities in a general birth-death process with arbitrary birth and death rates. In this paper, we first revisit the theory of continued fractions to obtain expressions for the Laplace transforms of these transition probabilities and make explicit an important derivation connecting transition probabilities and continued fractions. We then develop an efficient algorithm for computing these probabilities that analyzes the error associated with approximations in the method. We demonstrate that this error-controlled method agrees with known solutions and outperforms previous approaches to computing these probabilities. Finally, we apply our novel method to several important problems in ecology, evolution, and genetics. PMID:21984359
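For context, a common reference approach that such continued-fraction algorithms are designed to outperform is to truncate the state space and exponentiate the generator matrix; the sketch below shows that baseline, not the paper's method, and the birth and death rates and the truncation level are chosen purely for illustration.

```python
# Baseline sketch: finite-time transition probabilities of a birth-death
# process via state-space truncation and a matrix exponential of the
# generator. This is NOT the continued-fraction method of the paper, only a
# common reference approach; rates and truncation level are assumptions.
import numpy as np
from scipy.linalg import expm

def transition_matrix(birth, death, t, n_max):
    """birth(n), death(n): instantaneous rates; returns P(t) on states 0..n_max."""
    Q = np.zeros((n_max + 1, n_max + 1))
    for n in range(n_max + 1):
        lam = birth(n) if n < n_max else 0.0  # drop births at the truncation boundary
        mu = death(n) if n > 0 else 0.0
        Q[n, n] = -(lam + mu)
        if n < n_max:
            Q[n, n + 1] = lam
        if n > 0:
            Q[n, n - 1] = mu
    return expm(Q * t)

P = transition_matrix(lambda n: 0.5 * n + 0.1, lambda n: 0.3 * n, t=2.0, n_max=200)
print(P[5, 7])  # P(X(2) = 7 | X(0) = 5), accurate if 200 lies well above the probability mass
```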
NASA Astrophysics Data System (ADS)
Keey, Tony Tiew Chun; Azuddin, M.
2017-06-01
The injection molding process appears to be one of the most suitable mass-production and cost-efficient manufacturing processes for polymeric parts nowadays due to its high efficiency in large-scale production. When down-scaling products and components, the limits of the conventional injection molding process are reached. These constraints initiated the development of the conventional injection molding process into a new era of micro injection molding technology. In this study, fiberglass-reinforced polypropylene (PP) materials with various glass fiber percentages were used. The study starts with the fabrication of micro tensile specimens at three different injection temperatures, 260°C, 270°C and 280°C, for different percentages by weight of fiberglass-reinforced PP. The effects of the various injection temperatures on the tensile properties of the micro tensile specimens were then evaluated. Different percentages by weight of fiberglass-reinforced PP were tested as well, and it was found that 20% fiberglass-reinforced PP showed the greatest percentage increase in tensile strength with increasing temperature.
Treatment of emulsified oils by electrocoagulation: pulsed voltage applications.
Genc, Ayten; Bakirci, Busra
2015-01-01
The effect of pulsed voltage application on energy consumption during electrocoagulation was investigated. Three voltage profiles having the same arithmetic average with respect to time were applied to the electrodes. The specific energy consumption for these profiles was evaluated and analyzed together with oil removal efficiencies. The effects of applied voltages, electrode materials, electrode configurations, and pH on oil removal efficiency were determined. Electrocoagulation experiments were performed using synthetic and real wastewater samples. The pulsed voltages saved energy during the electrocoagulation process. In continuous operation, the energy saving was as high as 48%. Aluminum electrodes used for the treatment of emulsified oils resulted in higher oil removal efficiencies in comparison with stainless steel and iron electrodes. When the electrode gap was less than 1 cm, higher oil removal efficiencies were obtained. The highest oil removal efficiencies were 95% and 35% for the batch and continuous operating modes, respectively.
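A minimal sketch of how the specific energy consumption of a run can be computed from logged voltage and current traces is given below; the pulsed profile, cell current, and treated volume are hypothetical values, not the measurements reported above.

```python
# Minimal sketch of computing the specific energy consumption of a run from
# logged voltage and current traces; the pulsed profile, cell current, and
# treated volume are hypothetical values, not the measurements reported above.
import numpy as np

def specific_energy_wh_per_l(t_s, voltage_v, current_a, treated_volume_l):
    """Trapezoidal integral of V*I over time, converted to Wh per litre."""
    t_s = np.asarray(t_s, dtype=float)
    p_w = np.asarray(voltage_v, dtype=float) * np.asarray(current_a, dtype=float)
    energy_ws = np.sum(0.5 * (p_w[1:] + p_w[:-1]) * np.diff(t_s))  # watt-seconds
    return energy_ws / 3600.0 / treated_volume_l

t = np.arange(0.0, 1801.0, 1.0)               # 30 min run, 1 s sampling
v = np.where((t // 60) % 2 == 0, 24.0, 0.0)   # hypothetical pulse train, 12 V time-average
i = np.where(v > 0, 0.8, 0.0)                 # current assumed to flow only while voltage is on
print(specific_energy_wh_per_l(t, v, i, treated_volume_l=1.0))
```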
Improvement of Speckle Contrast Image Processing by an Efficient Algorithm.
Steimers, A; Farnung, W; Kohl-Bareis, M
2016-01-01
We demonstrate an efficient algorithm for the temporal- and spatial-based calculation of speckle contrast for the imaging of blood flow by laser speckle contrast analysis (LASCA). It reduces the numerical complexity of the necessary calculations, facilitates a multi-core and many-core implementation of the speckle analysis, and makes temporal or spatial resolution independent of SNR. The new algorithm was evaluated for both spatial- and temporal-based analysis of speckle patterns with different image sizes and numbers of recruited pixels as sequential, multi-core and many-core code.
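For reference, the quantity being computed is the speckle contrast K = sigma/mean over a small window; a straightforward, unoptimized spatial implementation using two box filters might look like the following, where the window size and synthetic test image are assumptions and the code is not the authors' accelerated algorithm.

```python
# Straightforward (unoptimized) spatial speckle contrast K = sigma/mean over a
# small sliding window, computed with two box filters; the window size and the
# synthetic test image are assumptions, not the authors' accelerated code.
import numpy as np
from scipy.ndimage import uniform_filter

def spatial_speckle_contrast(img, win=7):
    img = img.astype(np.float64)
    mean = uniform_filter(img, win)
    mean_sq = uniform_filter(img * img, win)
    var = np.maximum(mean_sq - mean * mean, 0.0)   # clamp small negative round-off
    return np.sqrt(var) / np.maximum(mean, 1e-12)

rng = np.random.default_rng(0)
frame = rng.gamma(shape=1.0, scale=100.0, size=(256, 256))  # fully developed speckle
K = spatial_speckle_contrast(frame)
print(K.mean())  # close to 1 for static, fully developed speckle
```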
Hierarchy of Efficiently Computable and Faithful Lower Bounds to Quantum Discord
NASA Astrophysics Data System (ADS)
Piani, Marco
2016-08-01
Quantum discord expresses a fundamental nonclassicality of correlations that is more general than entanglement, but that, in its standard definition, is not easily evaluated. We derive a hierarchy of computationally efficient lower bounds to the standard quantum discord. Every nontrivial element of the hierarchy constitutes by itself a valid discordlike measure, based on a fundamental feature of quantum correlations: their lack of shareability. Our approach emphasizes how the difference between entanglement and discord depends on whether shareability is intended as a static property or as a dynamical process.
Measuring the performance of Internet companies using a two-stage data envelopment analysis model
NASA Astrophysics Data System (ADS)
Cao, Xiongfei; Yang, Feng
2011-05-01
In exploring the business operation of Internet companies, few researchers have used data envelopment analysis (DEA) to evaluate their performance. Since Internet companies have a two-stage production process (marketability and profitability), this study employs a relational two-stage DEA model to assess the efficiency of 40 dot-com firms. The results show that our model performs better in measuring efficiency and is able to discriminate the causes of inefficiency, thus helping business management to be more effective by providing more guidance for business performance improvement.
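To make the DEA machinery concrete, the sketch below solves the classic single-stage, input-oriented CCR efficiency score as a linear program; it is not the relational two-stage model used in the study, and the toy input/output data are assumptions.

```python
# Minimal sketch of a single-stage, input-oriented CCR DEA efficiency score
# solved as a linear program; this is the classic one-stage model rather than
# the relational two-stage model of the study, and the toy data are assumptions.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """X: (m inputs x n DMUs), Y: (s outputs x n DMUs); returns theta for DMU j0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # minimize theta
    A_in = np.hstack([-X[:, [j0]], X])          # sum_j lambda_j x_ij <= theta * x_i,j0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # sum_j lambda_j y_rj >= y_r,j0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, j0]])
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lambdas nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

X = np.array([[4.0, 7.0, 8.0, 4.0], [3.0, 3.0, 1.0, 2.0]])  # two inputs, four firms
Y = np.array([[1.0, 1.0, 1.0, 1.0]])                         # single output
print([round(ccr_efficiency(X, Y, j), 3) for j in range(X.shape[1])])
```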
1984-02-01
appropriate. Flood damage prevention activities may be categorized in three major strategy groupings: 1) modifying flooding, 2) modifying susceptibility...it is a more complex process. He categorizes the forces that interact on the allocation of resources for Corps projects as economic efficiency...outputs to outcomes. It is this relationship that is the essence of the research effort.
Program evaluation in integrated resource planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Council, C.D.
1994-12-31
The Western Area Power Administration along with the Southwestern and Southeastern Power Administrations joined together to develop a set of integrated resource planning (IRP) tools to help their customers develop and implement an IRP process. The project has been entitled the Resource Planning Guide (RPG), and is specifically designed to help small- to mid-sized utilities analyze supply- and demand-side alternatives as part of an IRP process. The RPG project will be available in January 1994 and will include such support as: workshops, technical assistance, an RPG hotline, and an RPG User's Group for the project. The RPG grew out of the interest shown by utility customers who wanted a user-friendly tool to aid in their application of the IRP process. The project has been field tested by 43 utilities and related organizations over the last year, has sparked interest both nationally and internationally, and is now available for public use. The program evaluation aspects of the IRP process are heightened by a requirement of the Energy Policy Act of 1992 which requires all long-term power customers of the Western Area Power Administration to develop, implement, and monitor an IRP process. The EPAct defines IRP as: A planning process for new energy resources that evaluates the full range of alternatives, including new generating capacity, power purchases, energy conservation and efficiency, cogeneration and district heating and cooling applications, and renewable energy resources, to provide adequate and reliable service to its electric customers at the lowest system cost. The process takes into account necessary features for system operation, such as diversity, reliability, dispatchability, and other factors of risk; the ability to verify energy savings achieved through energy conservation and efficiency and the projected durability of such savings measured over time; and treats demand and supply resources on a consistent and integrated basis.
A General-purpose Framework for Parallel Processing of Large-scale LiDAR Data
NASA Astrophysics Data System (ADS)
Li, Z.; Hodgson, M.; Li, W.
2016-12-01
Light detection and ranging (LiDAR) technologies have proven efficient at quickly obtaining very detailed Earth surface data over large spatial extents. Such data are important for scientific discovery in the Earth and ecological sciences as well as for natural disaster and environmental applications. However, handling LiDAR data poses grand geoprocessing challenges due to data intensity and computational intensity. Previous studies achieved notable success in parallel processing of LiDAR data to address these challenges. However, these studies either relied on high performance computers and specialized hardware (GPUs) or focused mostly on finding customized solutions for some specific algorithms. We developed a general-purpose scalable framework coupled with a sophisticated data decomposition and parallelization strategy to efficiently handle big LiDAR data. Specifically, 1) a tile-based spatial index is proposed to manage big LiDAR data in the scalable and fault-tolerant Hadoop distributed file system, 2) two spatial decomposition techniques are developed to enable efficient parallelization of different types of LiDAR processing tasks, and 3) by coupling existing LiDAR processing tools with Hadoop, this framework is able to conduct a variety of LiDAR data processing tasks in parallel in a highly scalable distributed computing environment. The performance and scalability of the framework are evaluated with a series of experiments conducted on a real LiDAR dataset using a proof-of-concept prototype system. The results show that the proposed framework 1) is able to handle massive LiDAR data more efficiently than standalone tools; and 2) provides almost linear scalability in terms of either increased workload (data volume) or increased computing nodes with both spatial decomposition strategies. We believe that the proposed framework provides a valuable reference for developing a collaborative cyberinfrastructure for processing big earth science data in a highly scalable environment.
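As an illustration of one possible spatial decomposition strategy, the sketch below assigns LiDAR points to tiles and duplicates points into neighboring tiles within a buffer distance so that edge neighborhoods stay complete; the tile size, buffer width, and point tuples are assumptions rather than the framework's actual settings.

```python
# Sketch of a buffered tile decomposition for LiDAR points, so that
# neighborhood operations near tile edges still see the points they need;
# tile size and buffer width are assumptions, not the framework's settings.
from collections import defaultdict

def buffered_tiles(points, tile=100.0, buffer=10.0):
    """points: iterable of (x, y, z). Returns tile id -> list of points, where
    each point is also copied into any neighboring tile whose core rectangle
    lies within `buffer` of it."""
    tiles = defaultdict(list)
    for x, y, z in points:
        ix, iy = int(x // tile), int(y // tile)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                jx, jy = ix + dx, iy + dy
                # nearest point of tile (jx, jy)'s core rectangle to (x, y)
                cx = min(max(x, jx * tile), (jx + 1) * tile)
                cy = min(max(y, jy * tile), (jy + 1) * tile)
                if (x - cx) ** 2 + (y - cy) ** 2 <= buffer ** 2:
                    tiles[(jx, jy)].append((x, y, z))
    return tiles

pts = [(5.0, 5.0, 101.2), (99.0, 50.0, 98.7), (150.0, 150.0, 102.4)]
print(sorted(buffered_tiles(pts).keys()))
```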
A Systems Approach to Nitrogen Delivery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goins, Bobby
A systems based approach will be used to evaluate the nitrogen delivery process. This approach involves principles found in Lean, Reliability, Systems Thinking, and Requirements. This unique combination of principles and thought process yields a very in depth look into the system to which it is applied. By applying a systems based approach to the nitrogen delivery process there should be improvements in cycle time, efficiency, and a reduction in the required number of personnel needed to sustain the delivery process. This will in turn reduce the amount of demurrage charges that the site incurs. In addition there should be less frustration associated with the delivery process.
Estimation of Transpiration and Water Use Efficiency Using Satellite and Field Observations
NASA Technical Reports Server (NTRS)
Choudhury, Bhaskar J.; Quick, B. E.
2003-01-01
Structure and function of terrestrial plant communities bring about intimate relations between water, energy, and carbon exchange between land surface and atmosphere. Total evaporation, which is the sum of transpiration, soil evaporation and evaporation of intercepted water, couples water and energy balance equations. The rate of transpiration, which is the major fraction of total evaporation over most of the terrestrial land surface, is linked to the rate of carbon accumulation because functioning of stomata is optimized by both of these processes. Thus, quantifying the spatial and temporal variations of the transpiration efficiency (which is defined as the ratio of the rate of carbon accumulation and transpiration), and water use efficiency (defined as the ratio of the rate of carbon accumulation and total evaporation), and evaluation of modeling results against observations, are of significant importance in developing a better understanding of land surface processes. An approach has been developed for quantifying spatial and temporal variations of transpiration, and water-use efficiency based on biophysical process-based models, satellite and field observations. Calculations have been done using concurrent meteorological data derived from satellite observations and four dimensional data assimilation for four consecutive years (1987-1990) over an agricultural area in the Northern Great Plains of the US, and compared with field observations within and outside the study area. The paper provides substantive new information about interannual variation, particularly the effect of drought, on the efficiency values at a regional scale.
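A tiny worked example of the two ratios defined above, transpiration efficiency and water use efficiency, is shown below; the daily values are hypothetical and serve only to make the definitions concrete.

```python
# Tiny worked example of the two efficiency ratios defined above; the daily
# series are hypothetical values, not the Northern Great Plains results.
carbon_accum  = [5.1, 4.8, 6.0]   # carbon accumulation, g C m-2 d-1 (hypothetical)
transpiration = [2.4, 2.2, 2.9]   # mm d-1
soil_evap     = [0.6, 0.5, 0.4]   # mm d-1
interception  = [0.1, 0.0, 0.2]   # evaporation of intercepted water, mm d-1

total_evap = [t + s + i for t, s, i in zip(transpiration, soil_evap, interception)]
transpiration_efficiency = [c / t for c, t in zip(carbon_accum, transpiration)]
water_use_efficiency = [c / e for c, e in zip(carbon_accum, total_evap)]
print(transpiration_efficiency)
print(water_use_efficiency)
```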
Evaluation and analysis on the coupling performance of a high-speed turboexpander compressor
NASA Astrophysics Data System (ADS)
Chen, Shuangtao; Fan, Yufeng; Yang, Shanju; Chen, Xingya; Hou, Yu
2017-12-01
A high-speed turboexpander compressor (TEC) for a small reverse Brayton air refrigerator is tested and analyzed in the present work. A TEC consists of an expander and a compressor, which are coupled together and interact with each other directly. Meanwhile, the expander and compressor have different effects on the refrigerator. The TEC overall efficiency, which contains the effects of the expander's expansion, the compressor's pre-compression, and the pressure drop between them, was proposed. It unifies the influences of both compression and expansion processes on the COP of the refrigerator and could be used to evaluate the TEC overall performance. Then, the coupling parameters were analyzed, which showed that for a TEC, the expander efficiency should be fully utilized first, followed by the compressor pressure ratio. Experiments were carried out to test the TEC coupling performance. The results indicated that the TEC overall efficiency could reach 67.2%, while 22.3% of the energy output was recycled.
NASA Astrophysics Data System (ADS)
Venugopal, Krishnaveni; Murugappan, Minnoli; Dharmalingam, Sangeetha
2017-07-01
Potable water has become a scarce resource in many countries. In fact, the world is not running out of water, but rather, the relatively fixed quantity is becoming too contaminated for many applications. Hence, the present work was designed to evaluate the desalination efficiency of resin and glass fiber-reinforced Polysulfone polymer-based monopolar and bipolar (BPM) ion exchange membranes (with polyvinyl pyrrolidone as the intermediate layer) on a real brine sample over an 8 h duration. The prepared ion exchange membranes (IEMs) were characterized using FTIR, SEM, TGA, water absorption, and contact angle measurements. The BPM efficiency, electrical conductivity, salinity, sodium, and chloride ion concentration were evaluated for both prepared and commercial-based IEM systems. The current efficiency and energy consumption values obtained during the BPMED process were found to be 45% and 0.41 Wh for the RPSu-PVP-based IEM system and 38% and 1.60 Wh for the PSDVB-based IEM system, respectively.
NASA Astrophysics Data System (ADS)
Ariyarit, Atthaphon; Sugiura, Masahiko; Tanabe, Yasutada; Kanazaki, Masahiro
2018-06-01
A multi-fidelity optimization technique based on an efficient global optimization process using a hybrid surrogate model is investigated for solving real-world design problems. The model constructs the local deviation using the kriging method and the global model using a radial basis function. The expected improvement is computed to select additional samples that can improve the model. The approach was first investigated by solving mathematical test problems. The results were compared with optimization results from an ordinary kriging method and a co-kriging method, and the proposed method produced the best solution. The proposed method was also applied to aerodynamic design optimization of helicopter blades to obtain the maximum blade efficiency. The optimal shape obtained by the proposed method achieved performance almost equivalent to that obtained using single-fidelity optimization based on high-fidelity evaluations. Comparing all three methods, the proposed method required the lowest total number of high-fidelity evaluation runs to obtain a converged solution.
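For readers unfamiliar with the sampling criterion, the standard expected-improvement formula (minimization form) that efficient global optimization builds on can be sketched as follows; the posterior means and standard deviations are hypothetical, and this is not the paper's hybrid kriging/RBF surrogate itself.

```python
# Sketch of the standard expected-improvement criterion (minimization form)
# used to select additional samples; the posterior means and standard
# deviations below are hypothetical, not outputs of the hybrid surrogate.
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """mu, sigma: surrogate posterior mean/std at candidate points."""
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    z = np.where(sigma > 0, (f_best - mu) / np.where(sigma > 0, sigma, 1.0), 0.0)
    ei = (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    return np.where(sigma > 0, ei, 0.0)

mu = np.array([0.2, -0.1, 0.4])
sigma = np.array([0.3, 0.05, 0.0])
print(expected_improvement(mu, sigma, f_best=0.0))
```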
Wang, Zhaojiang; Qin, Menghua; Zhu, J Y; Tian, Guoyu; Li, Zongquan
2013-02-01
Rejects from a sulfite pulp mill that otherwise would be disposed of by incineration were converted to ethanol by a combined physical-biological process comprising physical refining and simultaneous saccharification and fermentation (SSF). The energy efficiency was evaluated in comparison with thermochemically pretreated biomass, such as biomass pretreated by dilute acid (DA) and sulfite pretreatment to overcome recalcitrance of lignocelluloses (SPORL). It was observed that structural deconstruction of the rejects by physical refining was indispensable to effective bioconversion but more energy intensive than that of thermochemically pretreated biomass. Fortunately, the energy consumption was compensated by the reduced enzyme dosage and the elevated ethanol yield. Furthermore, adjustment of the disk-plate gap led to a reduction in energy consumption with negligible influence on ethanol yield. In this context, energy efficiency up to 717.7% was achieved for rejects, much higher than that of the SPORL sample (283.7%) and the DA sample (152.8%). Copyright © 2012 Elsevier Ltd. All rights reserved.
Potential Evaluation of Solar Heat Assisted Desiccant Hybrid Air Conditioning System
NASA Astrophysics Data System (ADS)
Tran, Thien Nha; Hamamoto, Yoshinori; Akisawa, Atsushi; Kashiwagi, Takao
The solar-thermal-driven desiccant dehumidification-absorption cooling hybrid system has a distinct advantage in hot, humid climate regions. The rational air processing of the desiccant hybrid air conditioning system and the use of clean, free energy make the system environmentally friendly and energy efficient. The study investigates the performance of desiccant dehumidification air conditioning systems with solar thermal assistance. The investigation is performed for three cases, which are combinations of solar thermal and absorption cooling systems with different heat supply temperature levels. Two solar thermal systems are used in the study: the flat plate collector (FPC) and the vacuum tube with compound parabolic concentrator (CPC). The single-effect and the highly energy-efficient double- and triple-effect LiBr-water absorption cooling cycles are considered for the cooling systems. The COPs of the desiccant hybrid air conditioning systems are determined, and the evaluation of these systems is subsequently performed.
Data Envelopment Analysis for steel production with the use of Total Material Requirement
NASA Astrophysics Data System (ADS)
Oyaizu, Akira; Cravioto, Jordi; Yamasue, Eiji; Daigo, Ichiro
2018-06-01
High-performance properties of stainless steels can be achieved by adding alloying elements and/or by structure control through complex heat treatments. Evaluations of such processes have so far been made in terms of carbon dioxide emissions or energy consumption, but less attention has been given to resource intensity and "hidden flows". Using Data Envelopment Analysis (DEA), this study evaluates the efficiency of the environmental impact, characterised through Total Material Requirement (TMR), a measurement of hidden flows, and three properties (two mechanical and one chemical) of the materials obtained after manufacturing. Out of a sample of 72 stainless steels, it was found that seven (SUS312L, SUS836L, SUS447J1, SUSXM27, SUS410, SUS420F2, and SUS329J4L) showed the highest comparative efficiency, and that, when classifying the sample into four groups, the production of martensitic steels showed the highest efficiency (88%), followed by ferritic and duplex (82.6% each) and lastly austenitic (43.3%).
The effect of primary sedimentation on full-scale WWTP nutrient removal performance.
Puig, S; van Loosdrecht, M C M; Flameling, A G; Colprim, J; Meijer, S C F
2010-06-01
Traditionally, the performance of full-scale wastewater treatment plants (WWTPs) is measured based on influent and/or effluent and waste sludge flows and concentrations. Full-scale WWTP data typically have a high variance and often contain (large) measurement errors. A sound process engineering evaluation of WWTP performance is therefore difficult, which also makes it hard to evaluate the effect of process changes in a plant or to compare plants with each other. In this paper we present a case study of a full-scale nutrient-removing WWTP. The plant normally treats presettled wastewater; as a means to increase nutrient removal, the plant was operated for a period with raw wastewater (27% of the influent flow) by-passed directly to the biological reactor. The effect of raw wastewater addition was evaluated by different approaches: (i) influent characteristics, (ii) design retrofit, (iii) effluent quality, (iv) removal efficiencies, (v) activated sludge characteristics, (vi) microbial activity tests and FISH analysis and (vii) performance assessment based on mass balance evaluation. The paper demonstrates that the mass balance evaluation approach helps WWTP engineers to distinguish between and quantify different strategies where other approaches could not. In the studied case, by-passing raw wastewater (27% of the influent flow) directly to the biological reactor did not improve the effluent quality or the nutrient removal efficiency of the WWTP. The increase of the influent C/N and C/P ratios was associated with particulate compounds with a low COD/VSS ratio and a high non-biodegradable COD fraction. Copyright 2010 Elsevier Ltd. All rights reserved.
Assessment and evaluation of engineering options at a low-level radioactive waste storage site
NASA Astrophysics Data System (ADS)
Kanehiro, B. Y.; Guvanasen, V.
1982-09-01
Solutions to hydrologic and geotechnical problems associated with existing disposal sites were sought, and the efficiency of engineering options proposed to improve the integrity of such sites was evaluated. The Weldon Spring site is generally similar to other low-level nuclear waste sites, except that the wastes are primarily residues and contaminated rubble from the processing of uranium and thorium ores rather than industrial isotopes or mill tailings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Law, J.D.; Tillotson, R.D.; Todd, T.A.
2002-09-19
The Caustic-Side Solvent Extraction (CSSX) process has been selected for the separation of cesium from Savannah River Site high-level waste. The solvent composition used in the CSSX process was recently optimized so that the solvent is no longer supersaturated with respect to the calixarene crown ether extractant. Hydraulic performance and mass transfer efficiency testing of a single stage of 5.5-cm ORNL-designed centrifugal contactor has been performed for the CSSX process with the optimized solvent. Maximum throughputs of the 5.5-cm centrifugal contactor, as a function of contactor rotor speed, have been measured for the extraction, scrub, strip, and wash sections of the CSSX flowsheet at the baseline organic/aqueous flow ratios (O/A) of the process, as well as at O/A's 20% higher and 20% lower than the baseline. Maximum throughputs are comparable to the design throughput of the contactor, as well as with throughputs obtained previously in a 5-cm centrifugal contactor with the non-optimized CSSX solvent formulation. The 20% variation in O/A had minimal effect on contactor throughput. Additionally, mass transfer efficiencies have been determined for the extraction and strip sections of the flowsheet. Efficiencies were lower than the process goal of greater than or equal to 80%, ranging from 72 to 75% for the extraction section and from 36 to 60% in the strip section. Increasing the mixing intensity and/or the solution level in the mixing zone of the centrifugal contactor (residence time) could potentially increase efficiencies. Several methods are available to accomplish this including (1) increasing the size of the opening in the bottom of the rotor, resulting in a contactor which is partially pumping instead of fully pumping, (2) decreasing the number of vanes in the contactor, (3) increasing the vane height, or (4) adding vanes on the rotor and baffles on the housing of the contactor. The low efficiency results obtained stress the importance of proper design of a centrifugal contactor for use in the CSSX process. A prototype of any centrifugal contactors designed for future pilot-scale or full-scale processing should be thoroughly tested prior to implementation.
Application of a responsive evaluation approach in medical education.
Curran, Vernon; Christopher, Jeanette; Lemire, Francine; Collins, Alice; Barrett, Brendan
2003-03-01
This paper reports on the usefulness of a responsive evaluation model in evaluating the clinical skills assessment and training (CSAT) programme at the Faculty of Medicine, Memorial University of Newfoundland, Canada. The purpose of this paper is to introduce the responsive evaluation approach, ascertain its utility, feasibility, propriety and accuracy in a medical education context, and discuss its applicability as a model for medical education programme evaluation. Robert Stake's original 12-step responsive evaluation model was modified and reduced to five steps, including: (1) stakeholder audience identification, consultation and issues exploration; (2) stakeholder concerns and issues analysis; (3) identification of evaluative standards and criteria; (4) design and implementation of evaluation methodology; and (5) data analysis and reporting. This modified responsive evaluation process was applied to the CSAT programme and a meta-evaluation was conducted to evaluate the effectiveness of the approach. The responsive evaluation approach was useful in identifying the concerns and issues of programme stakeholders, solidifying the standards and criteria for measuring the success of the CSAT programme, and gathering rich and descriptive evaluative information about educational processes. The evaluation was perceived to be human resource dependent in nature, yet was deemed to have been practical, efficient and effective in uncovering meaningful and useful information for stakeholder decision-making. Responsive evaluation is derived from the naturalistic paradigm and concentrates on examining the educational process rather than predefined outcomes of the process. Responsive evaluation results are perceived as having more relevance to stakeholder concerns and issues, and therefore more likely to be acted upon. Conducting an evaluation that is responsive to the needs of these groups will ensure that evaluative information is meaningful and more likely to be used for programme enhancement and improvement.
Liu, Xing; Hou, Kun Mean; de Vaulx, Christophe; Shi, Hongling; Gholami, Khalid El
2014-01-01
Operating system (OS) technology is significant for the proliferation of the wireless sensor network (WSN). With an outstanding OS, the constrained WSN resources (processor, memory and energy) can be utilized efficiently and user application development can be well supported. In this article, a new hybrid, real-time, memory-efficient, energy-efficient, user-friendly and fault-tolerant WSN OS, MIROS, is designed and implemented. MIROS implements a hybrid scheduler and a dynamic memory allocator, so real-time scheduling can be achieved with low memory consumption. In addition, it implements a mid-layer software, EMIDE (Efficient Mid-layer Software for User-Friendly Application Development Environment), to decouple the WSN application from the low-level system. The application programming process can consequently be simplified and the application reprogramming performance improved. Moreover, it combines both software and multi-core hardware techniques to conserve energy resources, improve node reliability, and enable a new debugging method. To evaluate the performance of MIROS, it is compared with other WSN OSes (TinyOS, Contiki, SOS, openWSN and mantisOS) from different OS concerns. The final evaluation results prove that MIROS is suitable for use even on tightly resource-constrained WSN nodes. It can support real-time WSN applications. Furthermore, it is energy efficient, user friendly and fault tolerant. PMID:25248069
Durán, A; Monteagudo, J M; San Martín, I; Merino, S
2018-03-15
The aim of this work was to evaluate the performance of a novel self-autonomous reactor technology (capable of working with solar irradiation and artificial UV light) for water treatment using aniline as model compound. This new reactor design overcomes the problems of the external mass transfer effect and the accessibility to photons occurring in traditional reaction systems. The UV-light source is located inside the rotating quartz drums (where TiO2 is immobilized), allowing light to easily reach the water and the TiO2 surface. Several processes (UV, H2O2, Solar, TiO2, Solar/TiO2, Solar/TiO2/H2O2 and UV/Solar/H2O2/TiO2) were tested. The synergy between the Solar/H2O2 and Solar/TiO2 processes was quantified to be 40.3% using the pseudo-first-order degradation rate. The apparent photonic efficiency, ζ, was also determined for evaluating light utilization. For the Solar/TiO2/H2O2 process, the efficiency was found to be practically constant (0.638-0.681%) when the film thickness is in the range of 1.67-3.87 μm. However, the efficiency increases up to 2.67% when artificial UV light was used in combination, confirming the efficient design of this installation. Thus, if needed, lamps can be switched on during cloudy days to improve the degradation rate of aniline and its mineralization. Under the optimal conditions selected for the Solar/TiO2/H2O2 process ([H2O2] = 250 mg/L, pH = 4, [TiO2] = 0.65-1.25 mg/cm2), 89.6% of aniline is degraded in 120 min. If the lamps are switched on, aniline is completely degraded in 10 min, reaching 85% of mineralization in 120 min. TiO2 was re-used during 5 reaction cycles without apparent loss in activity (<2%). Quantification of hydroxyl radicals and dissolved oxygen allows a chemical-based explanation of the process. Finally, the UV/Solar/TiO2/H2O2 process was found to have lower operation costs than other systems described in the literature (0.67 €/m3). Copyright © 2018 Elsevier Ltd. All rights reserved.
Moreti, Livia O R; Coldebella, Priscila Ferri; Camacho, Franciele P; Carvalho Bongiovani, Milene; Pereira de Souza, Aloisio Henrique; Kirie Gohara, Aline; Matsushita, Makoto; Fernandes Silva, Marcela; Nishi, Letícia; Bergamasco, Rosângela
2016-01-01
This study aimed to evaluate the efficiency of the coagulation/flocculation/dissolved air flotation (C/F/DAF) process using Moringa oleifera (MO) seed powder as coagulant, and to analyse the profile of fatty acids present in the sludge generated after treatment. For the tests, deionized water artificially contaminated with cell cultures of Anabaena flos-aquae was used, with a cell density in the order of 10(4) cells mL(-1). C/F/DAF tests were conducted using 'Flotest' equipment. For fatty acid profile analyses, a gas chromatograph equipped with a flame ionization detector was used. The optimal dosage (100 mg L(-1)) of MO in the C/F/DAF process was efficient at removing nearly all A. flos-aquae cells (96.4%). The sludge obtained after treatment contained oleic acid (61.7%) and palmitic acid (10.8%). Thus, a water treatment process using C/F/DAF with whole MO seed powder was found to be efficient at removing cyanobacterial cells and produced a sludge rich in oleic acid, a favourable precursor for obtaining quality biodiesel, thus offering an alternative route for recycling such biomass.
Zhou, Yingbiao; Zhu, Yueming; Dai, Longhai; Men, Yan; Wu, Jinhai; Zhang, Juankun; Sun, Yuanxia
2017-01-01
Melibiose is widely used as a functional carbohydrate. Whole-cell biocatalytic production of melibiose from raffinose could reduce its cost; however, the characteristics of suitable strains for whole-cell biocatalysis and the mechanism of such a process are unclear. We compared three different Saccharomyces cerevisiae strains (liquor, wine, and baker's yeasts) in terms of concentration variations of the substrate (raffinose), the target product (melibiose), and the by-products (fructose and galactose) during the whole-cell biocatalysis process. A distinct difference in whole-cell catalytic efficiency was observed among the three strains. Furthermore, the activities of key enzymes (invertase, α-galactosidase, and fructose transporter) involved in the process and the expression levels of their coding genes (suc2, mel1, and fsy1) were investigated. The conservation of the key genes in S. cerevisiae strains was also evaluated. The results show that the whole-cell catalytic efficiency of S. cerevisiae on the raffinose substrate was closely related to the activity of the key enzymes and the expression of their coding genes. Finally, we summarize the characteristics of producing strains that offer advantages, as well as the contributions of the key genes to excellent strains, and we present a dynamic mechanism model to provide mechanistic insight into this whole-cell biocatalytic process. This pioneering study should contribute to improving whole-cell biocatalytic production of melibiose from raffinose.
Process optimization by use of design of experiments: Application for liposomalization of FK506.
Toyota, Hiroyasu; Asai, Tomohiro; Oku, Naoto
2017-05-01
Design of experiments (DoE) can accelerate the optimization of drug formulations, especially complex formulations such as drugs using delivery systems. Administration of FK506 encapsulated in liposomes (FK506 liposomes) is an effective approach to treat acute stroke in animal studies. To provide FK506 liposomes as a brain-protective agent, it is necessary to manufacture these liposomes with good reproducibility. The objective of this study was to confirm the usefulness of DoE for a process-optimization study of FK506 liposomes. The Box-Behnken design was used to evaluate the effect of the process parameters on the properties of FK506 liposomes. The results of multiple regression analysis showed that there was an interaction between the hydration temperature and the freeze-thaw cycle on both the particle size and the encapsulation efficiency. An increase in the PBS hydration volume resulted in an increase in encapsulation efficiency. The process parameters had no effect on the ζ-potential. The multiple regression equation showed good predictability of the particle size and the encapsulation efficiency. These results indicate that manufacturing conditions must be taken into consideration to prepare liposomes with desirable properties. DoE would thus be a promising approach to optimizing the conditions for the manufacturing of liposomes. Copyright © 2017 Elsevier B.V. All rights reserved.
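As an illustration of the kind of design and analysis described above, the following minimal Python sketch generates a coded three-factor Box-Behnken design, simulates a hypothetical response (e.g., encapsulation efficiency), and fits a full quadratic regression model; the factor roles, response values, and coefficients are illustrative assumptions, not the study's data.

```python
import itertools
import numpy as np

def box_behnken_3(center_points=3):
    """Coded 3-factor Box-Behnken design: +/-1 on two factors, 0 on the third."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0, 0, 0]] * center_points
    return np.array(runs, dtype=float)

X = box_behnken_3()

# Hypothetical responses (e.g., encapsulation efficiency, %) for each run.
rng = np.random.default_rng(0)
y = 60 + 5*X[:, 0] + 3*X[:, 2] - 4*X[:, 0]*X[:, 1] - 2*X[:, 0]**2 + rng.normal(0, 1, len(X))

def quad_terms(X):
    """Full quadratic model matrix: intercept, main effects, two-way interactions, squares."""
    cols = [np.ones(len(X))]
    cols += [X[:, k] for k in range(3)]
    cols += [X[:, i]*X[:, j] for i, j in itertools.combinations(range(3), 2)]
    cols += [X[:, k]**2 for k in range(3)]
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)
print(np.round(coef, 2))  # estimated regression coefficients
```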
Biohydrogen Production: Strategies to Improve Process Efficiency through Microbial Routes
Chandrasekhar, Kuppam; Lee, Yong-Jik; Lee, Dong-Woo
2015-01-01
The current fossil fuel-based generation of energy has led to large-scale industrial development. However, the reliance on fossil fuels leads to the significant depletion of natural resources of buried combustible geologic deposits and to negative effects on the global climate with emissions of greenhouse gases. Accordingly, enormous efforts are directed to transition from fossil fuels to nonpolluting and renewable energy sources. One potential alternative is biohydrogen (H2), a clean energy carrier with high-energy yields; upon the combustion of H2, H2O is the only major by-product. In recent decades, the attractive and renewable characteristics of H2 led us to develop a variety of biological routes for the production of H2. Based on the mode of H2 generation, the biological routes for H2 production are categorized into four groups: photobiological fermentation, anaerobic fermentation, enzymatic and microbial electrolysis, and a combination of these processes. Thus, this review primarily focuses on the evaluation of the biological routes for the production of H2. In particular, we assess the efficiency and feasibility of these bioprocesses with respect to the factors that affect operations, and we delineate the limitations. Additionally, alternative options such as bioaugmentation, multiple process integration, and microbial electrolysis to improve process efficiency are discussed to address industrial-level applications. PMID:25874756
2011-01-01
Background Despite more than a decade of research on hospitalists and their performance, disagreement still exists regarding whether and how hospital-based physicians improve the quality of inpatient care delivery. This systematic review summarizes the findings from 65 comparative evaluations to determine whether hospitalists provide a higher quality of inpatient care relative to traditional inpatient physicians who maintain hospital privileges with concurrent outpatient practices. Methods Articles on hospitalist performance published between January 1996 and December 2010 were identified through MEDLINE, Embase, Science Citation Index, CINAHL, NHS Economic Evaluation Database and a hand-search of reference lists, key journals and editorials. Comparative evaluations presenting original, quantitative data on processes, efficiency or clinical outcome measures of care between hospitalists, community-based physicians and traditional academic attending physicians were included (n = 65). After proposing a conceptual framework for evaluating inpatient physician performance, major findings on quality are summarized according to their percentage change, direction and statistical significance. Results The majority of reviewed articles demonstrated that hospitalists are efficient providers of inpatient care on the basis of reductions in their patients' average length of stay (69%) and total hospital costs (70%); however, the clinical quality of hospitalist care appears to be comparable to that provided by their colleagues. The methodological quality of hospitalist evaluations remains a concern and has not improved over time. Persistent issues include insufficient reporting of source or sample populations (n = 30), patients lost to follow-up (n = 42) and estimates of effect or random variability (n = 35); inappropriate use of statistical tests (n = 55); and failure to adjust for established confounders (n = 37). Conclusions Future research should include an expanded focus on the specific structures of care that differentiate hospitalists from other inpatient physician groups as well as the development of better conceptual and statistical models that identify and measure underlying mechanisms driving provider-outcome associations in quality. PMID:21592322
Sarkar, Sudipto; Kamilya, Dibyendu; Mal, B C
2007-03-01
Inclined plate settlers are used in treating wastewater due to their low space requirement and high removal rates. The prediction of sedimentation efficiency of these settlers is essential for their performance evaluation. In the present study, the technique of dimensional analysis was applied to predict the sedimentation efficiency of these inclined plate settlers. The effect of various geometric parameters namely, distance between plates (w(p)), plate angle (alpha), length of plate (l(p)), plate roughness (epsilon(p)), number of plates (n(p)) and particle diameter (d(s)) on the dynamic conditions, influencing the sedimentation process was studied. From the study it was established that neither the Reynolds criterion nor the Froude criterion was singularly valid to simulate the sedimentation efficiency (E) for different values of w(p) and flow velocity (v(f)). Considering the prevalent scale effect, simulation equations were developed to predict E at different dynamic conditions. The optimum dynamic condition producing the maximum E is also discussed.
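The simulation equations developed in the study are not reproduced in the abstract; the short Python sketch below merely evaluates the two dynamic similarity criteria it mentions, the Reynolds and Froude numbers, for an assumed flow velocity and plate spacing, with the plate spacing taken as the characteristic length (an assumption for illustration).

```python
import math

def reynolds_number(v_f, w_p, nu=1.0e-6):
    """Re for flow between inclined plates; characteristic length taken as the
    plate spacing w_p (m); nu is the kinematic viscosity of water (m^2/s)."""
    return v_f * w_p / nu

def froude_number(v_f, w_p, g=9.81):
    """Fr based on the plate spacing as characteristic length (assumption)."""
    return v_f / math.sqrt(g * w_p)

# Hypothetical operating point: 1 cm/s flow velocity, 5 cm plate spacing.
v_f, w_p = 0.01, 0.05
print(f"Re = {reynolds_number(v_f, w_p):.0f}, Fr = {froude_number(v_f, w_p):.4f}")
```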
Probabilistic cost-benefit analysis of disaster risk management in a development context.
Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan
2013-07-01
Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.
Design of an efficient music-speech discriminator.
Tardón, Lorenzo J; Sammartino, Simone; Barbancho, Isabel
2010-01-01
In this paper, the problem of designing a simple and efficient music-speech discriminator for large audio data sets, in which advanced music playing techniques are taught and voice and music are intrinsically interleaved, is addressed. In the process, a number of features used in speech-music discrimination are defined and evaluated over the available data set. Specifically, the data set contains pieces of classical music played with different and unspecified instruments (or even lyrics) and the voice of a teacher (a top music performer), or even the overlapping voices of the translator and other persons. After an initial test of the performance of the features implemented, a selection process is started, which takes into account the type of classifier selected beforehand, to achieve good discrimination performance and computational efficiency, as shown in the experiments. The discrimination application has been defined and tested on a large data set supplied by Fundacion Albeniz, containing a large variety of classical music pieces played with different instruments and including comments and speeches by famous performers.
Uncertainty quantification-based robust aerodynamic optimization of laminar flow nacelle
NASA Astrophysics Data System (ADS)
Xiong, Neng; Tao, Yang; Liu, Zhiyong; Lin, Jun
2018-05-01
The aerodynamic performance of a laminar flow nacelle is highly sensitive to uncertain working conditions, especially surface roughness. An efficient robust aerodynamic optimization method based on non-deterministic computational fluid dynamics (CFD) simulation and the Efficient Global Optimization (EGO) algorithm was employed. A non-intrusive polynomial chaos method is used in conjunction with an existing well-verified CFD module to quantify the uncertainty propagation in the flow field. This paper investigates the roughness modeling behavior with the γ-Ret shear stress transport model, including modeling of flow transition and surface roughness effects. The roughness effects are modeled to simulate sand grain roughness. A Class-Shape Transformation-based parametric description of the nacelle contour, as part of an automatic design evaluation process, is presented. A Design-of-Experiments (DoE) study was performed and a surrogate model was built by the Kriging method. The new nacelle design process demonstrates that significant improvements in both the mean and variance of the efficiency are achieved, and the proposed method can be applied successfully to laminar flow nacelle design.
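The sketch below shows, under stated assumptions, how a Kriging (Gaussian process) surrogate can be fitted over DoE samples and queried for a predictive mean and standard deviation, which an EGO loop would then use to compute an improvement criterion; the two shape parameters and the placeholder objective stand in for the paper's CFD-evaluated metric and are not taken from it.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical DoE: 20 samples of two normalized shape parameters in [0, 1]^2.
rng = np.random.default_rng(1)
X = rng.uniform(size=(20, 2))

# Placeholder objective standing in for a CFD-evaluated performance metric.
y = np.sin(3*X[:, 0]) + 0.5*np.cos(5*X[:, 1]) + rng.normal(0, 0.02, 20)

# Kriging surrogate: constant * RBF kernel with anisotropic length scales.
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict mean and standard deviation at new candidate designs; an EGO loop
# would use these to evaluate an expected-improvement criterion.
X_new = rng.uniform(size=(5, 2))
mean, std = gp.predict(X_new, return_std=True)
print(np.round(mean, 3), np.round(std, 3))
```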
Li, Run-dong; Nie, Yong-feng; Li, Ai-min; Wang, Lei; Chi, Yong; Cen, Ke-fa
2004-09-01
Vitrification can effectively control the leachability of heavy metals in fly ash generated from municipal solid waste incinerators (MSWI). The use of a liquid ceramic (LC) additive as a heavy metal chemical stabilization agent was evaluated for MSWI fly ash. The residuals of chromium, lead and zinc in the slag increase to different degrees with the liquid ceramic additive at 1400 degrees C, while those of cadmium and copper decrease. The migration characteristics of nickel are hardly affected by additive levels below 10%. The volatilization of Cr and Zn occurs after 61 minutes with 10% addition of LC, and the binding efficiency of Cr decreases with increasing melting temperature. The results indicate that the binding efficiency of heavy metals was affected greatly by the LC additive and showed significant differences depending on the type of heavy metal during the melting process. A short melting time (no longer than 33 min) is useful for obtaining a high binding efficiency of heavy metals.
Combining Image Processing with Signal Processing to Improve Transmitter Geolocation Estimation
2014-03-27
transmitter by searching a grid of possible transmitter locations within the image region. At each evaluated grid point, theoretical TDOA values are computed... requires converting the image to a grayscale intensity image. This allows efficient manipulation of data and ease of comparison among pixel values. The... cluster of redundant y values along the top edge of an ideal rectangle. The same is true for the bottom edge, as well as for the x values along the
Evaluation of target efficiencies for solid-liquid separation steps in biofuels production.
Kochergin, Vadim; Miller, Keith
2011-01-01
Development of liquid biofuels has entered a new phase of large scale pilot demonstration. A number of plants that are in operation or under construction face the task of addressing the engineering challenges of creating a viable plant design, scaling up and optimizing various unit operations. It is well-known that separation technologies account for 50-70% of both capital and operating cost. Additionally, reduction of environmental impact creates technological challenges that increase project cost without adding to the bottom line. Different technologies vary in terms of selection of unit operations; however, solid-liquid separations are likely to be a major contributor to the overall project cost. Despite the differences in pretreatment approaches, similar challenges arise for solid-liquid separation unit operations. A typical process for ethanol production from biomass includes several solid-liquid separation steps, depending on which particular stream is targeted for downstream processing. The nature of biomass-derived materials makes it either difficult or uneconomical to accomplish complete separation in a single step. Therefore, setting realistic efficiency targets for solid-liquid separations is an important task that influences overall process recovery and economics. Experimental data will be presented showing typical characteristics for pretreated cane bagasse at various stages of processing into cellulosic ethanol. Results of generic material balance calculations will be presented to illustrate the influence of separation target efficiencies on overall process recoveries and characteristics of waste streams.
Tomei, M Concetta; Rita, Sara; Mininni, Giuseppe
2011-12-15
Sequential anaerobic-aerobic digestion was applied to waste activated sludge (WAS) from a full-scale wastewater treatment plant. The study was performed with the objective of testing the sequential digestion process on WAS, which is characterized by poorer digestibility in comparison with mixed sludge. Process performance was evaluated in terms of biogas production, volatile solids (VS) and COD reduction, and the patterns of biopolymers (proteins and polysaccharides) in the subsequent digestion stages. A VS removal efficiency of 40% in the anaerobic phase and an additional removal of 26% in the aerobic phase were observed. Total COD removal efficiencies of 35% and 25% were obtained for the anaerobic and aerobic stages, respectively. The kinetics of the VS degradation process were analyzed by assuming a first-order equation with respect to VS concentration. The evaluated kinetic parameters were 0.44 ± 0.20 d(-1) and 0.25 ± 0.15 d(-1) for the anaerobic and aerobic stages, respectively. With regard to biopolymers, in the anaerobic phase the content of proteins and polysaccharides increased to 50% and 69%, respectively, whereas in the subsequent aerobic phase a decrease of 71% for proteins and 67% for polysaccharides was observed. The average specific biogas production, 0.74 m(3)/(kg VS destroyed), was in the range of values reported in the specialized literature for conventional anaerobic mesophilic WAS digestion. Copyright © 2011 Elsevier B.V. All rights reserved.
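A minimal sketch of the first-order kinetic analysis mentioned above: it estimates the rate constant k from hypothetical volatile solids data by log-linear regression; the time points and concentrations are made up for illustration and are not the study's measurements.

```python
import numpy as np

# Hypothetical volatile solids (VS) concentrations (g/L) over digestion time (d).
t = np.array([0, 2, 4, 7, 10, 14], dtype=float)
vs = np.array([20.0, 16.5, 13.8, 10.7, 8.6, 6.4])

# First-order model VS(t) = VS0 * exp(-k t)  =>  ln(VS) = ln(VS0) - k t
slope, intercept = np.polyfit(t, np.log(vs), 1)
k = -slope
print(f"k = {k:.3f} 1/d, VS0 = {np.exp(intercept):.1f} g/L")
```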
Ueda, Keisuke; Higashi, Kenjirou; Kataoka, Makoto; Yamashita, Shinji; Yamamoto, Keiji; Moribe, Kunikazu
2014-10-01
The effect of a drug-crystallization inhibitor in a bile acid/lipid micelle solution on drug permeation was evaluated during the drug crystallization process. Hydroxypropyl methylcellulose acetate succinate (HPMC-AS) was used as a drug-crystallization inhibitor, which efficiently suppressed dexamethasone (DEX) crystallization in a gastrointestinal fluid model containing sodium taurocholate (NaTC) and egg-phosphatidylcholine (egg-PC). Changes in the molecular state of supersaturated DEX during the DEX crystallization process were monitored in real time using proton nuclear magnetic resonance (1H NMR). This revealed that the distribution of DEX between the bulk water and the micellar phases formed by NaTC and egg-PC was not changed during the DEX crystallization process, even in the presence of HPMC-AS. DEX permeation during DEX crystallization was evaluated using a dissolution/permeability system. The combination of crystallization inhibition by HPMC-AS and micellar encapsulation by NaTC and egg-PC led to considerably higher DEX concentrations and improved DEX permeation at the beginning of the DEX crystallization process. Crystallization inhibition by HPMC-AS can work efficiently even in the micellar solution, where NaTC/egg-PC micelles encapsulate some DEX. It was concluded that a crystallization inhibitor contributed to improving the permeation of a poorly water-soluble drug in gastrointestinal fluid. Copyright © 2014 Elsevier B.V. All rights reserved.
A General Accelerated Degradation Model Based on the Wiener Process.
Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning
2016-12-06
Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
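The following sketch simulates degradation paths from a Wiener process with an assumed nonlinear time-scale transformation, unit-to-unit variation in the drift, and a crude stress scaling of the drift; the functional form and parameter values are illustrative assumptions, not the model or estimates reported in the paper.

```python
import numpy as np

def simulate_wiener_degradation(n_units=5, n_steps=100, dt=1.0,
                                mu0=0.05, sigma_mu=0.01, sigma_B=0.1,
                                stress=1.0, gamma=0.8, seed=0):
    """Simulate degradation paths X(t) = mu_i * Lambda(t) + sigma_B * B(Lambda(t)).

    Lambda(t) = t**gamma is an assumed (possibly nonlinear) time-scale
    transformation, mu_i ~ N(mu0*stress, sigma_mu^2) captures unit-to-unit
    variation, and 'stress' scales the drift as a crude acceleration effect.
    """
    rng = np.random.default_rng(seed)
    t = np.arange(1, n_steps + 1) * dt
    lam = t**gamma
    d_lam = np.diff(np.concatenate(([0.0], lam)))
    paths = np.empty((n_units, n_steps))
    for i in range(n_units):
        mu_i = rng.normal(mu0 * stress, sigma_mu)
        increments = mu_i * d_lam + sigma_B * rng.normal(0, np.sqrt(d_lam))
        paths[i] = np.cumsum(increments)
    return t, paths

t, paths = simulate_wiener_degradation()
print(paths[:, -1])  # degradation level of each unit at the end of the test
```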
Adaptation of in-situ microscopy for crystallization processes
NASA Astrophysics Data System (ADS)
Bluma, A.; Höpfner, T.; Rudolph, G.; Lindner, P.; Beutel, S.; Hitzmann, B.; Scheper, T.
2009-08-01
In biotechnological and pharmaceutical engineering, the study of crystallization processes is gaining importance. An efficient analytical inline sensor could help to improve knowledge about these processes in order to increase efficiency and yields. The in-situ microscope (ISM) is an optical sensor developed for the monitoring of bioprocesses. A new application for this sensor is monitoring in downstream processes, e.g. the crystallization of proteins and other organic compounds. This contribution shows new aspects of using in-situ microscopy to monitor crystallization processes. Crystals of different chemical compounds were precipitated from supersaturated solutions and the crystal growth was monitored. Exemplary morphological properties and different crystal forms could be distinguished on the basis of offline experiments. For inline monitoring of crystallization processes, a special 0.5 L stirred tank reactor was developed and equipped with the in-situ microscope. This reactor was used to carry out batch experiments for crystallizations of O-acetylsalicylic acid (ASS) and hen egg white lysozyme (HEWL). During the whole crystallization process, the in-situ microscope system acquired images directly from the crystallization broth. For the data evaluation, an image analysis algorithm was developed and implemented in the microscope analysis software.
Ryan, Marybeth
2011-01-01
The portfolio is emerging as an efficient and effective method for evaluating program outcomes and professional development in nursing education. Although there is a host of literature about the use of portfolios in undergraduate nursing programs, fewer reports exist about their use in graduate nursing education. This article presents the results of a formative evaluation process, using student and faculty focus groups, conducted at a midsized university's graduate nursing education program to determine the effectiveness of portfolio use. Content analysis of the focus group data yielded three student themes and two faculty themes with associated theme clusters that revealed similarities and unique perceptions of students and faculty regarding the portfolio process. The information gleaned will provide direction to faculty as they make decisions about the use of this evaluation method in the graduate program. Copyright © 2011. Published by Elsevier Inc.
Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics
NASA Astrophysics Data System (ADS)
Vasil'ev, V. A.; Dobrynina, N. V.
2017-06-01
The article describes a method for evaluating information to solve problems of control, monitoring and diagnostics. The method is needed to reduce the dimensionality of the informational indicators of situations, bring them to relative units, calculate generalized information indicators on their basis, rank them by characteristic levels, and calculate the efficiency criterion of a system functioning in real time. On this basis, the design of an information evaluation system has been developed that allows analyzing, processing and assessing information about an object, which can be a complex technical, economic or social system. The method, and the system based on it, can find wide application in the analysis, processing and evaluation of information on the functioning of systems, regardless of their purpose, goals, tasks and complexity. For example, they can be used to assess the innovation capacities of industrial enterprises and management decisions.
Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. The means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
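The abstract does not give the exact computation used to combine the nurses' estimates, so the sketch below only illustrates one common way such per-step failure-rate estimates might be aggregated, treating the task as a series of independent steps; the step names and probabilities are hypothetical, not taken from the study.

```python
# Hypothetical mean failure-rate estimates (probability of failure per attempt)
# for each step of a medication-processing task, as might be elicited from nurses.
step_failure_rates = {
    "select_patient": 0.02,
    "select_medication": 0.05,
    "confirm_dose": 0.03,
    "document_administration": 0.04,
}

def series_failure_probability(rates):
    """Probability that at least one step fails, assuming independent steps (an assumption)."""
    p_all_succeed = 1.0
    for p_fail in rates.values():
        p_all_succeed *= (1.0 - p_fail)
    return 1.0 - p_all_succeed

print(f"Task-level failure probability: {series_failure_probability(step_failure_rates):.3f}")
```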
Research on assessment and improvement method of remote sensing image reconstruction
NASA Astrophysics Data System (ADS)
Sun, Li; Hua, Nian; Yu, Yanbo; Zhao, Zhanping
2018-01-01
Remote sensing image quality assessment and improvement is an important part of image processing. Generally, the use of compressive sampling theory in a remote sensing imaging system allows images to be compressed while being sampled, which can improve efficiency. In this paper, a two-dimensional principal component analysis (2DPCA) method is proposed to reconstruct the remote sensing image in order to improve the quality of the compressed image; the reconstruction retains the useful image information while suppressing noise. Then, the factors influencing remote sensing image quality are analyzed, and evaluation parameters for quantitative assessment are introduced. On this basis, the quality of the reconstructed images is evaluated and the influence of the different factors on the reconstruction is analyzed, providing meaningful reference data for enhancing the quality of remote sensing images. The experimental results show that the evaluation results fit human visual perception, and the proposed method has good application value in the field of remote sensing image processing.
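A minimal 2DPCA sketch, assuming images are reconstructed by projecting onto the leading eigenvectors of the image covariance matrix; the image data here are random placeholders, and the paper's specific evaluation parameters are not reproduced.

```python
import numpy as np

def two_dpca_reconstruct(images, d):
    """2DPCA: project each image onto the top-d eigenvectors of the image
    covariance matrix and reconstruct. images: array of shape (N, m, n)."""
    mean_img = images.mean(axis=0)
    centered = images - mean_img
    # Image covariance matrix (n x n), averaged over the training set.
    G = np.einsum('kij,kil->jl', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)
    X = eigvecs[:, -d:]                      # top-d eigenvectors (n x d)
    proj = centered @ X                      # feature matrices (N, m, d)
    return mean_img + proj @ X.T             # reconstructed images (N, m, n)

# Hypothetical noisy image patches: 10 images of 32 x 32 pixels.
rng = np.random.default_rng(2)
imgs = rng.normal(size=(10, 32, 32))
recon = two_dpca_reconstruct(imgs, d=5)
print(np.mean((imgs - recon)**2))            # mean squared reconstruction error
```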
Correlation between safety assessments in the driver-car interaction design process.
Broström, Robert; Bengtsson, Peter; Axelsson, Jakob
2011-05-01
With the functional revolution in modern cars, evaluation methods to be used in all phases of driver-car interaction design have gained importance. It is crucial for car manufacturers to discover and solve safety issues early in the interaction design process. A current problem is thus to find a correlation between the formative methods that are used during development and the summative methods that are used when the product has reached the customer. This paper investigates the correlation between efficiency metrics from summative and formative evaluations, where the results of two studies on sound and navigation system tasks are compared. The first, an analysis of the J.D. Power and Associates APEAL survey, consists of answers given by about two thousand customers. The second, an expert evaluation study, was done by six evaluators who assessed the layouts by task completion time, TLX and Nielsen heuristics. The results show a high degree of correlation between the studies in terms of task efficiency, i.e. between customer ratings and task completion time, and customer ratings and TLX. However, no correlation was observed between Nielsen heuristics and customer ratings, task completion time or TLX. The results of the studies introduce a possibility to develop a usability evaluation framework that includes both formative and summative approaches, as the results show a high degree of consistency between the different methodologies. Hence, combining a quantitative approach with the expert evaluation method, such as task completion time, should be more useful for driver-car interaction design. Copyright © 2010 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Shiroiwa, Takeru; Fukuda, Takashi; Ikeda, Shunya; Takura, Tomoyuki
2017-08-01
Economic evaluation is used in decision-making processes for healthcare technologies in many developed countries. In Japan, no health economic data have been requested for drugs, medical devices, and interventions to date. However, economic evaluation is gradually gaining importance, and a trial implementation of the cost-effectiveness evaluation of drugs and medical devices has begun. Discussions on economic evaluation began in May 2012 within a newly established sub-committee of the Chuikyo, referred to as the "Special Committee on Cost Effectiveness." After four years of discussions, this committee determined that during the trial implementation, the results of the cost-effectiveness evaluation would be used for the re-pricing of drugs and medical devices at the end of fiscal year (FY) 2017. The Chuikyo selected 13 products (7 drugs and 6 medical devices) as targets for this evaluation. These products will be evaluated until the end of FY 2017 based on the following process: manufacturers will submit economic evaluation data; the National Institute of Public Health will coordinate the review process; academic groups will perform the actual review of the submitted data; and an expert committee will appraise these data. This represents the first step toward introducing cost-effectiveness analysis into the Japanese healthcare system. We believe that these efforts will contribute to the efficiency and sustainability of the Japanese healthcare system. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
Modeling Hydrological Processes in New Mexico-Texas-Mexico Border Region
NASA Astrophysics Data System (ADS)
Samimi, M.; Jahan, N. T.; Mirchi, A.
2017-12-01
Efficient allocation of limited water resources to competing use sectors is becoming increasingly critical for water-scarce regions. Understanding the natural and anthropogenic processes affecting hydrology is key to efficient water management. We used the Soil and Water Assessment Tool (SWAT) to model the governing hydrologic processes in the New Mexico-Texas-Mexico border region. Our study area includes the Elephant Butte Irrigation District (EBID), which manages water resources to support irrigated agriculture. The region is facing water resources challenges associated with chronic water scarcity, over-allocation, diminishing water supply, and growing water demand. Agricultural activities rely on conjunctive use of Rio Grande River water supply and groundwater withdrawal. The model is calibrated and validated under baseline conditions in the arid and semi-arid climate in order to evaluate potential impacts of climate change on the agricultural sector and regional water availability. We highlight the importance of calibrating the crop growth parameters, evapotranspiration, and groundwater recharge to provide a realistic representation of the hydrological processes and water availability in the region. Furthermore, limitations of the model and its utility for informing stakeholders are discussed.
Overview of CMOS process and design options for image sensor dedicated to space applications
NASA Astrophysics Data System (ADS)
Martin-Gonthier, P.; Magnan, P.; Corbiere, F.
2005-10-01
With the growth of huge-volume markets (mobile phones, digital cameras...), CMOS technologies for image sensors have improved significantly. New process flows have appeared that optimize parameters such as quantum efficiency, dark current, and conversion gain. Space applications can of course benefit from these improvements. To illustrate this evolution, this paper reports results from three technologies that have been evaluated with test vehicles composed of several sub-arrays designed with space applications as the target. These three technologies are a standard, an improved, and a sensor-optimized CMOS process in the 0.35 μm generation. Measurements are focused on quantum efficiency, dark current, conversion gain and noise. Other measurements such as the Modulation Transfer Function (MTF) and crosstalk are described in [1]. A comparison between the results has been made and three categories of CMOS process for image sensors have been identified. Radiation tolerance has also been studied for the improved CMOS process, with the aim of hardening the imager by design. Results at 4, 15, 25 and 50 krad demonstrate good ionizing dose radiation tolerance when specific techniques are applied.
NASA Astrophysics Data System (ADS)
Anderson, O. Roger
The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.
Compression-RSA technique: A more efficient encryption-decryption procedure
NASA Astrophysics Data System (ADS)
Mandangan, Arif; Mei, Loh Chai; Hung, Chang Ee; Che Hussin, Che Haziqah
2014-06-01
The efficiency of encryption-decryption procedures has become a major concern in asymmetric cryptography. The Compression-RSA technique was developed to overcome this efficiency problem by compressing k plaintexts, where k ∈ Z+ and k > 2, into only 2 plaintexts. That means that, no matter how many plaintexts there are, they will be compressed into only 2 plaintexts. The encryption-decryption procedures are expected to be more efficient since they only receive 2 inputs to be processed instead of k inputs. However, it is observed that as the number of original plaintexts increases, the size of the new plaintexts becomes larger. As a consequence, this will probably affect the efficiency of the encryption-decryption procedures, especially for the RSA cryptosystem, since both of its encryption-decryption procedures involve exponential operations. In this paper, we evaluated the relationship between the number of original plaintexts and the size of the new plaintexts. In addition, we conducted several experiments to show that the RSA cryptosystem with the embedded Compression-RSA technique is more efficient than the ordinary RSA cryptosystem.
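The compression step that maps k plaintexts to 2 is not specified in the abstract, so the sketch below shows only textbook RSA key generation, encryption and decryption with toy parameters, to make explicit the modular exponentiation whose per-input cost the technique aims to reduce; the key values are insecure illustrations, not part of the proposed scheme.

```python
from math import gcd

def rsa_keygen(p=61, q=53, e=17):
    """Toy RSA key generation (insecure sizes, for illustration only)."""
    n, phi = p * q, (p - 1) * (q - 1)
    assert gcd(e, phi) == 1
    d = pow(e, -1, phi)          # modular inverse (Python 3.8+)
    return (n, e), (n, d)

def encrypt(pub, m):
    n, e = pub
    return pow(m, e, n)          # modular exponentiation dominates the cost

def decrypt(priv, c):
    n, d = priv
    return pow(c, d, n)

pub, priv = rsa_keygen()
plaintexts = [65, 66, 67, 68]    # k separate inputs -> k exponentiations
ciphertexts = [encrypt(pub, m) for m in plaintexts]
print([decrypt(priv, c) for c in ciphertexts] == plaintexts)  # True
```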
Efficient path-based computations on pedigree graphs with compact encodings
2012-01-01
A pedigree is a diagram of family relationships, and it is often used to determine the mode of inheritance (dominant, recessive, etc.) of genetic diseases. Along with rapidly growing knowledge of genetics and accumulation of genealogy information, pedigree data is becoming increasingly important. In large pedigree graphs, path-based methods for efficiently computing genealogical measurements, such as inbreeding and kinship coefficients of individuals, depend on efficient identification and processing of paths. In this paper, we propose a new compact path encoding scheme on large pedigrees, accompanied by an efficient algorithm for identifying paths. We demonstrate the utilization of our proposed method by applying it to the inbreeding coefficient computation. We present time and space complexity analysis, and also manifest the efficiency of our method for evaluating inbreeding coefficients as compared to previous methods by experimental results using pedigree graphs with real and synthetic data. Both theoretical and experimental results demonstrate that our method is more scalable and efficient than previous methods in terms of time and space requirements. PMID:22536898
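As a simplified stand-in for the compact path encoding, the sketch below computes an inbreeding coefficient with Wright's classical path-counting formula by brute-force enumeration of ancestor paths; the pedigree is a small hypothetical example, and the method shown is not the authors' encoding scheme.

```python
from itertools import product

# Hypothetical pedigree: individual -> (sire, dam); founders map to (None, None).
PEDIGREE = {
    "A": (None, None), "B": (None, None),
    "C": ("A", "B"),   "D": ("A", "B"),
    "X": ("C", "D"),   # X's parents are full sibs
}

def ancestor_paths(ind):
    """All upward paths from ind to each of its ancestors (including the trivial path)."""
    paths = [[ind]]
    sire, dam = PEDIGREE.get(ind, (None, None))
    for parent in (sire, dam):
        if parent is not None:
            paths += [[ind] + p for p in ancestor_paths(parent)]
    return paths

def inbreeding(ind):
    """Wright's path-counting formula: F = sum over common ancestors A and
    non-overlapping path pairs of (1/2)**(n1+n2+1) * (1 + F_A)."""
    sire, dam = PEDIGREE.get(ind, (None, None))
    if sire is None or dam is None:
        return 0.0
    f = 0.0
    for p1, p2 in product(ancestor_paths(sire), ancestor_paths(dam)):
        if p1[-1] == p2[-1] and set(p1[:-1]).isdisjoint(p2[:-1]):
            n1, n2 = len(p1) - 1, len(p2) - 1
            f += 0.5 ** (n1 + n2 + 1) * (1 + inbreeding(p1[-1]))
    return f

print(inbreeding("X"))  # 0.25 for offspring of full sibs with unrelated founder grandparents
```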
Measuring the efficiency of zakat collection process using data envelopment analysis
NASA Astrophysics Data System (ADS)
Hamzah, Ahmad Aizuddin; Krishnan, Anath Rau
2016-10-01
It is necessary for each zakat institution in the nation to measure and understand its efficiency in collecting zakat in a timely manner, for the sake of continuous improvement. Pusat Zakat Sabah, Malaysia, which began operations in early 2007, is not exempt from this obligation. However, measuring collection efficiency is not an easy task, as it usually requires the consideration of multiple inputs and/or outputs. This paper sequentially employed three data envelopment analysis models, namely the Charnes-Cooper-Rhodes (CCR) primal model, the CCR dual model, and the slack-based model, to quantitatively evaluate the efficiency of zakat collection in Sabah from 2007 to 2015, treating each year as a decision making unit. The three models were developed based on two inputs (i.e. number of zakat branches and number of staff) and one output (i.e. total collection). The causes for not achieving efficiency and suggestions on how the efficiency in each year could have been improved are disclosed.
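An illustrative input-oriented CCR envelopment model solved as a linear program with scipy is sketched below; the yearly input-output figures are invented placeholders, not Pusat Zakat Sabah data, and the primal, dual, and slack-based formulations used in the paper are not reproduced in full.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: one row per year (DMU); inputs = [branches, staff], output = [collection].
X = np.array([[3, 25], [4, 30], [5, 38], [6, 45], [6, 50]], dtype=float)   # inputs
Y = np.array([[2.1], [2.8], [3.0], [4.2], [4.9]], dtype=float)             # outputs

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR envelopment model for DMU k:
       min theta  s.t.  sum_j lam_j x_j <= theta * x_k,  sum_j lam_j y_j >= y_k,  lam >= 0."""
    n, m = X.shape            # number of DMUs, number of inputs
    s = Y.shape[1]            # number of outputs
    c = np.r_[1.0, np.zeros(n)]                      # decision variables: [theta, lam_1..lam_n]
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[k]                              # lam^T X - theta * x_k <= 0
    A_ub[:m, 1:] = X.T
    A_ub[m:, 1:] = -Y.T                              # -lam^T Y <= -y_k
    b_ub[m:] = -Y[k]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for k in range(len(X)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(X, Y, k):.3f}")
```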
Zhou, Ronggang; Chan, Alan H S
2017-01-01
In order to compare existing usability data with ideal goals or with data for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. This paper presents a universal method of usability evaluation combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction with a product. With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed using the fuzzy comprehensive evaluation model to characterize fuzzy human judgments. Then, with the use of AHP, the weights of the usability components were elicited from these experts. Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process.
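A minimal sketch of the two ingredients named above: AHP weights taken from the principal eigenvector of a pairwise-comparison matrix, followed by a fuzzy comprehensive evaluation that multiplies the weight vector by a membership matrix; the comparison judgments, grade labels, and membership degrees are hypothetical, not the paper's elicited values.

```python
import numpy as np

# Hypothetical AHP pairwise comparisons of effectiveness, efficiency, satisfaction.
A = np.array([
    [1.0, 2.0, 3.0],
    [1/2, 1.0, 2.0],
    [1/3, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                         # AHP weights of the three usability components

# Consistency ratio check (random index RI = 0.58 for a 3x3 matrix).
ci = (np.max(np.real(eigvals)) - len(A)) / (len(A) - 1)
cr = ci / 0.58

# Hypothetical membership matrix: rows = components, columns = appraisal grades
# (e.g., poor / fair / good / excellent), as might be elicited from an expert panel.
R = np.array([
    [0.0, 0.2, 0.5, 0.3],
    [0.1, 0.3, 0.4, 0.2],
    [0.0, 0.1, 0.5, 0.4],
])

B = w @ R                               # fuzzy comprehensive evaluation vector
print(np.round(w, 3), round(cr, 3), np.round(B / B.sum(), 3))
```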
Observer efficiency in free-localization tasks with correlated noise.
Abbey, Craig K; Eckstein, Miguel P
2014-01-01
The efficiency of visual tasks involving localization has traditionally been evaluated using forced choice experiments that capitalize on independence across locations to simplify the performance of the ideal observer. However, developments in ideal observer analysis have shown how an ideal observer can be defined for free-localization tasks, where a target can appear anywhere in a defined search region and subjects respond by localizing the target. Since these tasks are representative of many real-world search tasks, it is of interest to evaluate the efficiency of observer performance in them. The central question of this work is whether humans are able to effectively use the information in a free-localization task relative to a similar task where target location is fixed. We use a yes-no detection task at a cued location as the reference for this comparison. Each of the tasks is evaluated using a Gaussian target profile embedded in four different Gaussian noise backgrounds having power-law noise power spectra with exponents ranging from 0 to 3. The free-localization task had a square 6.7° search region. We report on two follow-up studies investigating efficiency in a detect-and-localize task, and the effect of processing the white-noise backgrounds. In the fixed-location detection task, we find average observer efficiency ranges from 35 to 59% for the different noise backgrounds. Observer efficiency improves dramatically in the tasks involving localization, ranging from 63 to 82% in the forced localization tasks and from 78 to 92% in the detect-and-localize tasks. Performance in white noise, the lowest efficiency condition, was improved by filtering the backgrounds to give them a power-law exponent of 2. Classification images, used to examine spatial frequency weights for the tasks, show better tuning to ideal weights in the free-localization tasks. The high absolute levels of efficiency suggest that observers are well-adapted to free-localization tasks.
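For context (a standard definition in ideal observer analysis, not stated explicitly in the abstract), observer efficiency is commonly computed as the squared ratio of human to ideal detectability,

\eta = \left( d'_{\mathrm{human}} / d'_{\mathrm{ideal}} \right)^2,

so an efficiency of 82% corresponds to the human observer achieving roughly 91% of the ideal observer's d' on the same images.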
Catalysts for Efficient Production of Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Sun, Ted X.; Dong, Yi
2009-01-01
Several metal alloys have shown promise as improved catalysts for catalytic thermal decomposition of hydrocarbon gases to produce carbon nanotubes (CNTs). Heretofore almost every experiment on the production of carbon nanotubes by this method has involved the use of iron, nickel, or cobalt as the catalyst. However, the catalytic-conversion efficiencies of these metals have been observed to be limited. The identification of better catalysts is part of a continuing program to develop means of mass production of high-quality carbon nanotubes at costs lower than those achieved thus far (as much as $100/g for purified multi-wall CNTs or $1,000/g for single-wall CNTs in year 2002). The main effort thus far in this program has been the design and implementation of a process tailored specifically for high-throughput screening of alloys for catalyzing the growth of CNTs. The process includes an integral combination of (1) formulation of libraries of catalysts, (2) synthesis of CNTs from decomposition of ethylene on powders of the alloys in a pyrolytic chemical-vapor-decomposition reactor, and (3) scanning-electron-microscope screening of the CNTs thus synthesized to evaluate the catalytic efficiencies of the alloys. Information gained in this process is put into a database and analyzed to identify promising alloy compositions, which are to be subjected to further evaluation in a subsequent round of testing. Some of these alloys have been found to catalyze the formation of carbon nanotubes from ethylene at temperatures as low as 350 to 400 C. In contrast, the temperatures typically required for prior catalysts range from 550 to 750 C.
Yang, Shuo; Lin, Ling; Li, Shao Peng; Li, Qiang; Wang, Xiu Teng; Sun, Liang
2017-05-01
Utilization of fly ash is of great importance in China in the context of resource and environmental crises. Different fly ash utilization processes have been proposed, and some have been practically applied. However, none of these fly ash utilization pathways has been evaluated comprehensively by integrating both environmental and economic perspectives. In this study, three high-aluminum fly ash utilization methods in Mongolia were assessed and compared based on the concept of eco-efficiency. The environmental assessment was conducted in accordance with life-cycle assessment principles, and a monetization-weighting approach was applied to obtain social willingness-to-pay as a reflection of environmental impact. The environmental assessment results revealed that the reuse of fly ash had a significant advantage in saving primary resources, while solid waste, depletion of water, and global warming were the three highest environmental impacts from the life-cycle perspective. The economic performance assessment showed positive net profits for fly ash utilization, but high value-added products were not necessarily indicative of better economic performance due to the relatively high operation cost. Comparison of the eco-efficiency indicators (EEIs) implied that the process of scenario 1#, which produced mullite ceramic and active calcium silicate, was the most recommended of the three scenarios at the present scale. This judgment was consistent with the evaluation of the resource utilization rate. The present study showed that the EEI could be used to compare different fly ash utilization processes in a comprehensive and objective manner, thus providing definitive and insightful suggestions for decision-making and technical improvement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiller, Steven R.; Schwartz, Lisa C.
Demand-side energy efficiency (efficiency) represents a low-cost opportunity to reduce electricity consumption and demand and provide a wide range of non-energy benefits, including avoiding air pollution. Efficiency-related energy and non-energy impacts are determined and documented by implementing evaluation, measurement and verification (EM&V) systems. This technical brief describes efficiency EM&V coordination strategies that Western states can consider taking on together, outlines EM&V-related products that might be appropriate for multistate coordination, and identifies some implications of coordination. Coordinating efficiency EM&V activities can save both time and costs for state agencies and stakeholders engaged in efficiency activities and can be particularly beneficial for multiple states served by the same utility. First, the brief summarizes basic information on efficiency, its myriad potential benefits and EM&V for assessing those benefits. Second, the brief introduces the concept of multistate EM&V coordination in the context of assessing such benefits, including achievement of state and federal goals to reduce air pollutants. Next, the brief presents three coordination strategy options for efficiency EM&V: information clearinghouse/exchange, EM&V product development, and a regional energy efficiency tracking system platform. The brief then describes five regional EM&V products that could be developed on a multistate basis: EM&V reporting formats, database of consistent deemed electricity savings values, glossary of definitions and concepts, efficiency EM&V methodologies, and EM&V professional standards or accreditation processes. Finally, the brief discusses options for next steps that Western states can take to consider multistate coordination on efficiency EM&V. Appendices provide background information on efficiency and EM&V, as well as definitions and suggested resources on the covered topics. This brief is intended to inform state public utility commissions, boards for public and consumer-owned utilities, state energy offices and air agencies, and other organizations involved in discussions about the use of efficiency EM&V.
Gohlke, Oliver
2009-11-01
Global warming is a focus of political interest and life-cycle assessment of waste management systems reveals that energy recovery from municipal solid waste is a key issue. This paper demonstrates how the greenhouse gas effects of waste treatment processes can be described in a simplified manner by considering energy efficiency indicators. For evaluation to be consistent, it is necessary to use reasonable system boundaries and to take the generation of electricity and the use of heat into account. The new European R1 efficiency criterion will lead to the development and implementation of optimized processes/systems with increased energy efficiency which, in turn, will exert an influence on the greenhouse gas effects of waste management in Europe. Promising technologies are: the increase of steam parameters, reduction of in-plant energy consumption, and the combined use of heat and power. Plants in Brescia and Amsterdam are current examples of good performance with highly efficient electricity generation. Other examples of particularly high heat recovery rates are the energy-from-waste (EfW) plants in Malmö and Gothenburg. To achieve the full potential of greenhouse gas reduction in waste management, it is necessary to avoid landfilling combustible wastes, for example, by means of landfill taxes and by putting incentives in place for increasing the efficiency of EfW systems.
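For reference (the formula is quoted here from the EU Waste Framework Directive as commonly cited, not from the abstract), the R1 energy efficiency criterion for municipal waste incinerators is usually written as

R1 = \frac{E_p - (E_f + E_i)}{0.97\,(E_w + E_f)}

where E_p is the annual energy produced as heat or electricity (electricity weighted by 2.6 and exported heat by 1.1), E_f is the energy input from fuels contributing to steam production, E_w is the energy contained in the treated waste, E_i is other imported energy, and 0.97 accounts for boiler and radiation losses; plants qualify as energy recovery (R1) operations above thresholds of 0.60 or 0.65 depending on the permitting date.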
Ooi, Shing Ming; Sarkar, Srimanta; van Varenbergh, Griet; Schoeters, Kris; Heng, Paul Wan Sia
2013-04-01
Continuous processing and production in pharmaceutical manufacturing has received increased attention in recent years, mainly due to the industry's pressing needs for more efficient, cost-effective processes and production, as well as regulatory facilitation. To achieve optimum product quality, the traditional trial-and-error method for the optimization of different process and formulation parameters is expensive and time consuming. Real-time evaluation and control of product quality using an online process analyzer in continuous processing can provide high-quality production with very high throughput at low unit cost. This review focuses on continuous processing and the application of different real-time monitoring tools used in the pharmaceutical industry for continuous processing from powder to tablets.
Li, Meng-Nan; Zheng, Guang-Hong; Wang, Lei; Xiao, Wei; Fu, Xiao-Hua; Le, Yi-Quan; Ren, Da-Ming
2009-01-01
The discharge of recombinant DNA waste from biological laboratories into the eco-system may be one of the pathways resulting in horizontal gene transfer or "gene pollution". Heating at 100 degrees C for 5-10 min is a common method for treating recombinant DNA waste in biological research laboratories in China. In this study, we evaluated the effectiveness and the safety of the thermo-treatment method in the disposal of recombinant DNA waste. Quantitative PCR, plasmid transformation and electrophoresis technology were used to evaluate the decay/denaturation efficiency during the thermo-treatment process of recombinant plasmid, pET-28b. Results showed that prolonging thermo-treatment time could improve decay efficiency of the plasmid, and its decay half-life was 2.7-4.0 min during the thermo-treatment at 100 degrees C. However, after 30 min of thermo-treatment some transforming activity remained. Higher ionic strength could protect recombinant plasmid from decay during the treatment process. These results indicate that thermo-treatment at 100 degrees C cannot decay and inactivate pET-28b completely. In addition, preliminary results showed that thermo-treated recombinant plasmids were not degraded completely in a short period when they were discharged into an aquatic environment. This implies that when thermo-treated recombinant DNAs are discharged into the eco-system, they may have enough time to re-nature and transform, thus resulting in gene diffusion.
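As an illustrative back-of-the-envelope check (the first-order decay model is assumed here, not asserted by the authors), the surviving fraction under a half-life t_{1/2} is

N(t)/N_0 = (1/2)^{t/t_{1/2}},

so with t_{1/2} = 4.0 min a 30-min treatment would nominally leave about 0.5^{7.5} ≈ 0.6% of the plasmid intact, a residue consistent with the transforming activity still detected after treatment.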
New Design Tool Can Help Cut Building Energy Use
The tool can help almost any architect or engineer evaluate passive solar and efficiency design strategies, walking them through the design process and showing the consequences of design choices, including a feature that tells designers how large a heating, ventilation and air conditioning (HVAC) system is needed.
Developing the Mathematics Learning Management Model for Improving Creative Thinking in Thailand
ERIC Educational Resources Information Center
Sriwongchai, Arunee; Jantharajit, Nirat; Chookhampaeng, Sumalee
2015-01-01
The study purposes were: 1) to study the current state and problems of relevant secondary students in developing a mathematics learning management model for improving creative thinking, and 2) to evaluate the effectiveness of the model regarding: a) efficiency of the learning process, b) comparisons of pretest and posttest on creative thinking and achievement of…
Corrosion Prevention for Wheeled Vehicle Systems
1993-08-13
The audit objective was to evaluate the effectiveness and efficiency of the Army's procedures for acquiring corrosion prevention and chemical agent-resistant coatings for wheeled vehicle systems. To accomplish this objective, we reviewed corrosion controls and painting processes. The audit also included a review of the adequacy of internal controls related to the audit objective.
Neural Correlates of Performance Monitoring during the Transition to Young Adulthood
ERIC Educational Resources Information Center
Kneževic, Martina; Veroude, Kim; Jolles, Jelle; Krabbendam, Lydia
2016-01-01
Cognitive challenges during transition to adulthood are generally high and require particular skills, such as self-control, performance evaluation, and behavioral adjustment for success in everyday living. However, age and sex differences in timing and efficiency of brain maturational processes in the early twenties are not well known. We used a…
Reverse Engineering Course at Philadelphia University in Jordan
ERIC Educational Resources Information Center
Younis, M. Bani; Tutunji, T.
2012-01-01
Reverse engineering (RE) is the process of testing and analysing a system or a device in order to identify, understand and document its functionality. RE is an efficient tool in industrial benchmarking where competitors' products are dissected and evaluated for performance and costs. RE can play an important role in the re-configuration and…
ERIC Educational Resources Information Center
Kim, Jisun
2012-01-01
This dissertation aims to provide a better understanding of the technology licensing practices of academic research institutions. The study identifies time durations in licensing and incorporates these into a model to evaluate licensing performance. Performance is measured by the efficiency of an institution's technology licensing process and…
Welding And Cutting A Nickel Alloy By Laser
NASA Technical Reports Server (NTRS)
Banas, C. M.
1990-01-01
Technique effective and energy-efficient. Report describes evaluation of laser welding and cutting of Inconel(R) 718. Notes that electron-beam welding processes developed for In-718, but difficult to use on large or complex structures. Cutting of In-718 by laser fast and produces only narrow kerf. Cut edge requires dressing, to endure fatigue.
Efficient Evaluation System for Learning Management Systems
ERIC Educational Resources Information Center
Cavus, Nadire
2009-01-01
A learning management system (LMS) provides the platform for web-based learning environment by enabling the management, delivery, tracking of learning, testing, communication, registration process and scheduling. There are many LMS systems on the market that can be obtained for free or through payment. It has now become an important task to choose…
ERIC Educational Resources Information Center
Karatas, Ilhan; Baki, Adnan
2013-01-01
Problem solving is recognized as an important life skill involving a range of processes including analyzing, interpreting, reasoning, predicting, evaluating and reflecting. For that reason educating students as efficient problem solvers is an important role of mathematics education. Problem solving skill is the centre of mathematics curriculum.…
Performance Evaluation of Indoor Environment towards Sustainability for Higher Educational Buildings
ERIC Educational Resources Information Center
Khalil, Natasha; Husin, Husrul Nizam; Wahab, Lilawati Ab; Kamal, Kamarul Syahril; Mahat, Noorsaidi
2011-01-01
The indoor environmental factors considered in higher educational buildings must be determined in order to meet the users' requirements. Disruption of the indoor environment may reduce occupants' efficiency and impair their learning processes and activities. But the question is, how to ensure that the provision of indoor environmental aspects…
Beyond the Computer: Reading as a Process of Intellectual Development.
ERIC Educational Resources Information Center
Thompson, Mark E.
With more than 100,000 computers in public schools across the United States, the impact of computer assisted instruction (CAI) on students' reading behavior needs to be evaluated. In reading laboratories, CAI has been found to provide an efficient and highly motivating means of teaching specific educational objectives. Yet, while computer…
Evaluating architecture impact on system energy efficiency
Yu, Shijie; Yang, Hailong; Wang, Rui; Luan, Zhongzhi; Qian, Depei
2017-01-01
As energy consumption has been surging in an unsustainable way, it is important to understand the impact of existing architecture designs from an energy efficiency perspective, which is especially valuable for High Performance Computing (HPC) and datacenter environments hosting tens of thousands of servers. One obstacle hindering comprehensive evaluation of energy efficiency is the deficient power measuring approach. Most energy studies rely on either external power meters or power models, and both methods have intrinsic drawbacks in practical adoption and measuring accuracy. Fortunately, the advent of the Intel Running Average Power Limit (RAPL) interfaces has promoted power measurement ability to the next level, with higher accuracy and finer time resolution. Therefore, we argue it is the right time to conduct an in-depth evaluation of existing architecture designs to understand their impact on system energy efficiency. In this paper, we leverage representative benchmark suites including serial and parallel workloads from diverse domains to evaluate architecture features such as Non-Uniform Memory Access (NUMA), Simultaneous Multithreading (SMT) and Turbo Boost. Energy is tracked at the subcomponent level, such as Central Processing Unit (CPU) cores, uncore components and Dynamic Random-Access Memory (DRAM), by exploiting the power measurement ability exposed by RAPL. The experiments reveal non-intuitive results: 1) the mismatch between local compute and remote memory node caused by the NUMA effect not only generates a dramatic power and energy surge but also deteriorates energy efficiency significantly; 2) for multithreaded applications such as the Princeton Application Repository for Shared-Memory Computers (PARSEC), most of the workloads gain a notable increase in energy efficiency using SMT, with more than a 40% decline in average power consumption; 3) Turbo Boost is effective in accelerating workload execution and further preserving energy, but it may not be applicable on systems with a tight power budget. PMID:29161317
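A minimal sketch of reading package energy through the Linux powercap/RAPL sysfs interface mentioned above; the sysfs path, its availability and the read permissions depend on the kernel and CPU (recent kernels may restrict energy_uj to root), so this is an assumption-laden illustration rather than the instrumentation used in the paper.

```python
# Minimal sketch: measure package energy around a workload via the Linux
# powercap/RAPL sysfs counters, and report average power.
import time

RAPL = "/sys/class/powercap/intel-rapl:0"   # package-0 domain (assumed present)

def read_uj(path):
    with open(path) as f:
        return int(f.read())

def energy_joules(workload, *args):
    max_uj = read_uj(f"{RAPL}/max_energy_range_uj")
    e0, t0 = read_uj(f"{RAPL}/energy_uj"), time.time()
    workload(*args)
    e1, t1 = read_uj(f"{RAPL}/energy_uj"), time.time()
    d_uj = (e1 - e0) % (max_uj + 1)          # handle counter wrap-around
    return d_uj / 1e6, t1 - t0               # joules, seconds

def busy_loop(n=10_000_000):
    s = 0
    for i in range(n):
        s += i * i
    return s

joules, secs = energy_joules(busy_loop)
print(f"energy: {joules:.2f} J over {secs:.2f} s -> avg power {joules/secs:.2f} W")
```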
Nguyen, D Duc; Ngo, H Hao; Guo, W; Nguyen, T Thanh; Chang, Soon W; Jang, A; Yoon, Yong S
2016-09-01
This paper evaluated a novel pilot-scale electrocoagulation (EC) system for improving total phosphorus (TP) removal from municipal wastewater. The EC system was operated in continuous and batch operating modes under differing conditions (e.g. flow rate, initial concentration, electrolysis time, conductivity, voltage) to evaluate correlative phosphorus removal and electrical energy consumption. The results demonstrated that the EC system could effectively remove phosphorus to meet current stringent discharge standards of less than 0.2 mg/L within 2 to 5 min. This target was achieved at all initial TP concentrations studied. It was also found that an increase in solution conductivity, voltage, or electrolysis time correlated with improved TP removal efficiency and reduced specific energy consumption. Based on these results, some key economic considerations, such as operating costs, cost-effectiveness, product manufacturing feasibility, facility design and retrofitting, and program implementation are also discussed. This EC process can conclusively be considered highly efficient, relatively simple, easily managed, and cost-effective for wastewater treatment systems. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Zareei, Zahra; Navi, Keivan; Keshavarziyan, Peiman
2018-03-01
In this paper, three novel low-power and high-speed 1-bit inexact Full Adder cell designs are presented, based on current-mode logic in 32 nm carbon nanotube field effect transistor technology, for the first time. The circuit-level figures of merit, i.e. power, delay and power-delay product, as well as an application-level metric, error distance, are considered to assess the efficiency of the proposed cells over their counterparts. The effect of voltage scaling and temperature variation on the proposed cells is studied using the HSPICE tool. Moreover, using MATLAB, the peak signal-to-noise ratio of the proposed cells is evaluated in an image-processing application, namely a motion detector. Simulation results confirm the efficiency of the proposed cells.
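As background for the image-quality metric cited above (this is the standard definition, not a formula from the paper), the peak signal-to-noise ratio between the exact and approximate outputs is

\mathrm{PSNR} = 10 \log_{10}\!\left( \mathrm{MAX}^2 / \mathrm{MSE} \right),

where MAX is the largest representable pixel value (255 for 8-bit images) and MSE is the mean squared error introduced by the inexact adders; a higher PSNR means the approximation error is less visible in the motion-detector output.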
NASA Astrophysics Data System (ADS)
Ueno, Tetsuro; Hino, Hideitsu; Hashimoto, Ai; Takeichi, Yasuo; Sawada, Masahiro; Ono, Kanta
2018-01-01
Spectroscopy is a widely used experimental technique, and enhancing its efficiency can have a strong impact on materials research. We propose an adaptive design for spectroscopy experiments that uses a machine learning technique to improve efficiency. We examined X-ray magnetic circular dichroism (XMCD) spectroscopy for the applicability of a machine learning technique to spectroscopy. An XMCD spectrum was predicted by Gaussian process modelling with learning of an experimental spectrum using a limited number of observed data points. Adaptive sampling of data points with maximum variance of the predicted spectrum successfully reduced the total data points for the evaluation of magnetic moments while providing the required accuracy. The present method reduces the time and cost for XMCD spectroscopy and has potential applicability to various spectroscopies.
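A minimal sketch of the variance-driven adaptive sampling strategy described above, using a Gaussian process regressor; the synthetic "spectrum", kernel choice and stopping rule are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch: adaptively sample the point of maximum predictive variance of a
# Gaussian process fit to a synthetic spectrum, refitting after each "measurement".
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

energies = np.linspace(-10, 10, 400)
true_spectrum = np.exp(-(energies - 1.0)**2) - 0.6 * np.exp(-(energies + 2.0)**2 / 2)

def measure(idx):
    return true_spectrum[idx] + 0.01 * np.random.randn()   # noisy "experiment"

sampled = list(np.random.choice(len(energies), size=5, replace=False))
values = [measure(i) for i in sampled]

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-4),
                              normalize_y=True)

for _ in range(30):                       # adaptive acquisition loop
    gp.fit(energies[sampled].reshape(-1, 1), np.array(values))
    mean, std = gp.predict(energies.reshape(-1, 1), return_std=True)
    nxt = int(np.argmax(std))             # next point = maximum predictive variance
    if nxt in sampled:
        break
    sampled.append(nxt)
    values.append(measure(nxt))

print(f"spectrum approximated from {len(sampled)} of {len(energies)} grid points")
```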
Tony, Maha A; Zhao, Y Q; Purcell, P J; El-Sherbiny, M F
2009-04-01
In the present study, homogenous (photo-Fenton) and heterogeneous photo-assisted systems (Fenton/TiO(2)/UV, Fenton/ZnO/UV and Fenton/TiO(2)/UV/Air) were investigated for the treatment of a diesel-oil wastewater emulsion. The augmentation of the photo-Fenton process by heterogeneous TiO(2) increased the reaction rate, in terms of COD reduction efficiency, from 61% to 71%. Furthermore, the COD removal efficiency was increased to 84% when air was bubbled through the reactants. However, if the Fenton/TiO(2)/UV/Air process is to be utilized as a treatment for this wastewater, the separation of the TiO(2) from the treated effluent would need further consideration.
NASA Astrophysics Data System (ADS)
Beyhaghi, Pooriya
2016-11-01
This work considers the problem of the efficient minimization of the infinite time average of a stationary ergodic process in the space of a handful of independent parameters which affect it. Problems of this class, derived from physical or numerical experiments which are sometimes expensive to perform, are ubiquitous in turbulence research. In such problems, any given function evaluation, determined with finite sampling, is associated with a quantifiable amount of uncertainty, which may be reduced via additional sampling. This work proposes the first algorithm of this type. Our algorithm remarkably reduces the overall cost of the optimization process for problems of this class. Further, under certain well-defined conditions, rigorous proof of convergence is established to the global minimum of the problem considered.
Bayesian experimental design for models with intractable likelihoods.
Drovandi, Christopher C; Pettitt, Anthony N
2013-12-01
In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables. © 2013, The International Biometric Society.
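A minimal sketch of the ABC rejection step mentioned above, using a toy Poisson model as a stand-in for the epidemic and macroparasite processes; the prior, summary statistic and tolerance are illustrative assumptions.

```python
# Minimal sketch: ABC rejection with pre-computed model simulations over prior draws.
import numpy as np

rng = np.random.default_rng(0)
observed = rng.poisson(lam=4.0, size=50)        # pretend these are the field data
obs_summary = observed.mean()

def simulate(theta, size=50):
    return rng.poisson(lam=theta, size=size)

# Pre-compute model simulations over prior draws (as in the pre-computed design setup)
prior_draws = rng.uniform(0.0, 10.0, size=20000)
summaries = np.array([simulate(t).mean() for t in prior_draws])

eps = 0.2                                        # ABC tolerance
accepted = prior_draws[np.abs(summaries - obs_summary) < eps]

print(f"accepted {accepted.size} draws; ABC posterior mean ~ {accepted.mean():.2f}, "
      f"sd ~ {accepted.std():.2f}")
```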
Moore, Priscilla A; Kery, Vladimir
2009-01-01
High-throughput protein purification is a complex, multi-step process. There are several technical challenges in the course of this process that are not experienced when purifying a single protein. Among the most challenging are the high-throughput protein concentration and buffer exchange, which are not only labor-intensive but can also result in significant losses of purified proteins. We describe two methods of high-throughput protein concentration and buffer exchange: one using ammonium sulfate precipitation and one using micro-concentrating devices based on membrane ultrafiltration. We evaluated the efficiency of both methods on a set of 18 randomly selected purified proteins from Shewanella oneidensis. While both methods provide similar yield and efficiency, the ammonium sulfate precipitation is much less labor intensive and time consuming than the ultrafiltration.
Rossmann, Maike; Matos, Antonio Teixeira; Abreu, Edgar Carneiro; Silva, Fabyano Fonseca; Borges, Alisson Carraro
2013-10-15
The aim of the present study was to evaluate the influence of aeration and vegetation on the removal of organic matter in coffee processing wastewater (CPW) treated in 4 constructed wetlands (CWs), characterized as follows: (i) ryegrass (Lolium multiflorum) cultivated system operating with an aerated influent; (ii) non-cultivated system operating with an aerated influent; (iii) ryegrass cultivated system operating with a non-aerated influent; and (iv) non-cultivated system operating with a non-aerated influent. The lowest average chemical oxygen demand (COD), biochemical oxygen demand (BOD) and total suspended solids (TSS) removal efficiencies of 87, 84 and 73%, respectively, were obtained in the ryegrass cultivated system operating with a non-aerated influent. However, ryegrass cultivation did not influence the removal efficiency of organic matter. Artificial aeration of the CPW, prior to its injection into the CW, did not improve the removal efficiencies of organic matter. On the other hand, it did contribute to increasing the instantaneous rate at which the maximum COD removal efficiency was reached. Although aeration did not result in greater organic matter removal efficiencies, it is important to consider the benefits of aeration on the removal of the other compounds. Copyright © 2013 Elsevier Ltd. All rights reserved.
Jin, Zhan; He, Yin; Xu, Xuan; Zheng, Xiang-yong
2017-01-01
There are two biological systems available for removing phosphorus from wastewater, conventional phosphorus removal (CPR) and denitrifying phosphorus removal (DPR) systems, and each is characterized by the type of sludge used in the process. In this study, we compared the characteristics associated with the efficiency of carbon utilization between CPR and DPR sludge using acetate as a carbon source. For DPR sludge, the heat emitted during the phosphorus release and phosphorus uptake processes was 45.79 kJ/mol e- and 84.09 kJ/mol e-, respectively. These values were about 2-fold higher than the corresponding values obtained for CPR sludge, suggesting that much of the energy obtained from the carbon source was emitted as heat. Further study revealed a smaller microbial mass within the DPR sludge compared to CPR sludge, as shown by a lower sludge yield coefficient (0.05 gVSS/g COD versus 0.36 gVSS/g COD), a result that was due to the lower energy-capturing efficiency of DPR sludge according to bioenergetic analysis. Although the efficiency of anoxic phosphorus removal was only 39% of the efficiency of aerobic phosphorus removal, the consumption of carbon by DPR sludge was reduced by 27.8% compared to CPR sludge through the coupling of denitrification with dephosphatation. PMID:29065157
Najafpoor, Ali Asghar; Jonidi Jafari, Ahmad; Hosseinzadeh, Ahmad; Khani Jazani, Reza; Bargozin, Hasan
2018-01-01
Treatment with a non-thermal plasma (NTP) is a new and effective technology applied recently for the conversion of gases in air pollution control. This research was initiated to optimize the efficient application of the NTP process in benzene, toluene, ethylbenzene, and xylene (BTEX) removal. The effects of four variables, including temperature, initial BTEX concentration, voltage, and flow rate, on the BTEX elimination efficiency were investigated using response surface methodology (RSM). The constructed model was evaluated by analysis of variance (ANOVA). The model goodness-of-fit and statistical significance were assessed using determination coefficients (R² and adjusted R²) and the F-test. The results revealed that the R² value was greater than 0.96 for BTEX removal efficiency. The statistical analysis demonstrated that the BTEX removal efficiency was significantly correlated with the temperature, BTEX concentration, voltage, and flow rate. Voltage was the most influential variable affecting the dependent variable, as it exerted a significant effect (p < 0.0001) on the response variable. According to the achieved results, NTP can be applied as a progressive, cost-effective, and practical process for the treatment of airstreams polluted with BTEX under conditions of low residence time and high pollutant concentrations.
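A minimal sketch of fitting the kind of second-order response surface used in RSM for the four factors named above; the coded factor settings and efficiency responses are fabricated placeholders intended only to show the mechanics of the fit and the R² check.

```python
# Minimal sketch: least-squares fit of a full second-order (quadratic) response
# surface for four coded factors, with R^2 as a goodness-of-fit check.
import numpy as np

# columns: temperature, concentration, voltage, flow rate (coded -1..+1)
X = np.random.default_rng(1).uniform(-1, 1, size=(30, 4))
y = (80 + 5*X[:, 2] - 3*X[:, 1] - 2*X[:, 2]**2
     + np.random.default_rng(2).normal(0, 1, 30))          # synthetic removal efficiency (%)

def design_matrix(X):
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]                   # linear terms
    cols += [X[:, i]**2 for i in range(k)]                              # quadratic terms
    cols += [X[:, i]*X[:, j] for i in range(k) for j in range(i+1, k)]  # interactions
    return np.column_stack(cols)

D = design_matrix(X)
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
y_hat = D @ beta
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print(f"fitted {D.shape[1]} coefficients, R^2 = {r2:.3f}")
```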
The Effectiveness and Efficiency of Disease Management Programs for Patients with Chronic Diseases
Hisashige, Akinori
2013-01-01
Objective: The disease management (DM) approach is increasingly advocated as a means of improving the effectiveness and efficiency of healthcare for chronic diseases. To evaluate the evidence on the effectiveness and efficiency of DM, an evidence synthesis was carried out. Methods: To locate eligible meta-analyses and systematic reviews, we searched Medline, EMBASE, the Cochrane Library, SCI-EXPANDED, SSCI, A&HCI, DARE, HTA and NHS EED from 1995 to 2010. Two reviewers independently extracted data and assessed study quality. Results: Twenty-eight meta-analyses and systematic reviews were included for synthesizing evidence. The proportion of articles which observed improvement with a reasonable amount of evidence was highest for process (69%), followed by health services (63%), QOL (57%), health outcomes (51%), satisfaction (50%), costs (38%) and so on. As to mortality, statistically significant results were observed only in coronary heart disease. Important components of DM, such as a multidisciplinary approach, were identified. Conclusion: The evidence synthesized shows considerable evidence of the effectiveness and efficiency of DM programs in process, health services, QOL and so on. The question is no longer whether DM programs work, but rather which type or component of DM programs works best and most efficiently in the context of each healthcare system or country. PMID:23445693
Cai, Di; Li, Ping; Chen, Changjing; Wang, Yong; Hu, Song; Cui, Caixia; Qin, Peiyong; Tan, Tianwei
2016-11-01
In this study, different pretreatment methods were evaluated for modifying corn stalk bagasse, and the pretreated bagasse was further used as an immobilization carrier in the acetone-butanol-ethanol (ABE) fermentation process. Structural changes of the bagasses pretreated by different methods were analyzed by Fourier transform infrared spectroscopy, crystallinity index and scanning electron microscopy. The performance of batch fermentation using the corn stalk-based carriers was evaluated. Results indicated that the highest ABE concentration of 23.86 g/L was achieved using the NaOH-pretreated carrier in batch fermentation. An integrated immobilized fermentation-pervaporation process was further carried out. The integrated process showed long-term stability with 225-394 g/L of ABE solvents on the permeate side of the pervaporation membrane. This novel integrated process was found to be an efficient method for biobutanol production. Copyright © 2016 Elsevier Ltd. All rights reserved.
Optimizing MRI Logistics: Prospective Analysis of Performance, Efficiency, and Patient Throughput.
Beker, Kevin; Garces-Descovich, Alejandro; Mangosing, Jason; Cabral-Goncalves, Ines; Hallett, Donna; Mortele, Koenraad J
2017-10-01
The objective of this study is to optimize MRI logistics through evaluation of MRI workflow and analysis of performance, efficiency, and patient throughput in a tertiary care academic center. For 2 weeks, workflow data from two outpatient MRI scanners were prospectively collected and stratified by value added to the process (i.e., value-added time, business value-added time, or non-value-added time). Two separate time cycles were measured: the actual MRI process cycle as well as the complete length of patient stay in the department. In addition, the impact and frequency of delays across all observations were measured. A total of 305 MRI examinations were evaluated, including body (34.1%), neurologic (28.9%), musculoskeletal (21.0%), and breast examinations (16.1%). The MRI process cycle lasted a mean of 50.97 ± 24.4 (SD) minutes per examination; the mean non-value-added time was 13.21 ± 18.77 minutes (25.87% of the total process cycle time). The mean length-of-stay cycle was 83.51 ± 33.63 minutes; the mean non-value-added time was 24.33 ± 24.84 minutes (29.14% of the total patient stay). The delay with the highest frequency (5.57%) was IV or port placement, which had a mean delay of 22.82 minutes. The delay with the greatest impact on time was MRI arthrography for which joint injection of contrast medium was necessary but was not accounted for in the schedule (mean delay, 42.2 minutes; frequency, 1.64%). Of 305 patients, 34 (11.15%) did not arrive at or before their scheduled time. Non-value-added time represents approximately one-third of the total MRI process cycle and patient length of stay. Identifying specific delays may expedite the application of targeted improvement strategies, potentially increasing revenue, efficiency, and overall patient satisfaction.
Adams, Pornpimon; Kaewkungwal, Jaranit; Limphattharacharoen, Chanthima; Prakobtham, Sukanya; Pengsaa, Krisana; Khusmith, Srisin
2014-01-01
Tensions between researchers and ethics committees have been reported in several institutions. Some reports suggest researchers lack confidence in the quality of institutional review board (IRB) reviews, and that emphasis on strict procedural compliance and ethical issues raised by the IRB might unintentionally lead to delays in correspondence between researchers and ethics committees, and/or even encourage prevarication/equivocation, if researchers perceive committee concerns and criticisms unjust. This study systematically analyzed the efficiency of different IRB functions, and the relationship between efficiency and perceived quality of the decision-making process. The major purposes of this study were thus (1) to use the IRB Metrics developed by the Faculty of Tropical Medicine, Mahidol University, Thailand (FTM-EC) to assess the operational efficiency and perceived effectiveness of its ethics committees, and (2) to determine ethical issues that may cause the duration of approval process to be above the target limit of 60 days. Based on a literature review of definitions and methods used and proposed for use, in assessing aspects of IRB quality, an “IRB Metrics” was developed to assess IRB processes using a structure-process-outcome measurement model. To observe trends in the indicators evaluated, data related to all protocols submitted to the two panels of the FTM-EC (clinical and non-clinical), between January 2010–September 2013, were extracted and analyzed. Quantitative information based on IRB Metrics structure-process-outcome illuminates different areas for internal-process improvement. Ethical issues raised with researchers by the IRB, which were associated with the duration of the approval process in protocol review, could be considered root causes of tensions between the parties. The assessment of IRB structure-process-outcome thus provides a valuable opportunity to strengthen relationships and reduce conflicts between IRBs and researchers, with positive outcomes for all parties involved in the conduct of human-subject research. PMID:25406085
Hata, Akihiko; Katayama, Hiroyuki; Kojima, Keisuke; Sano, Shoichi; Kasuga, Ikuro; Kitajima, Masaaki; Furumai, Hiroaki
2014-01-15
Rainfall events can introduce large amount of microbial contaminants including human enteric viruses into surface water by intermittent discharges from combined sewer overflows (CSOs). The present study aimed to investigate the effect of rainfall events on viral loads in surface waters impacted by CSO and the reliability of molecular methods for detection of enteric viruses. The reliability of virus detection in the samples was assessed by using process controls for virus concentration, nucleic acid extraction and reverse transcription (RT)-quantitative PCR (qPCR) steps, which allowed accurate estimation of virus detection efficiencies. Recovery efficiencies of poliovirus in river water samples collected during rainfall events (<10%) were lower than those during dry weather conditions (>10%). The log10-transformed virus concentration efficiency was negatively correlated with suspended solid concentration (r(2)=0.86) that increased significantly during rainfall events. Efficiencies of DNA extraction and qPCR steps determined with adenovirus type 5 and a primer sharing control, respectively, were lower in dry weather. However, no clear relationship was observed between organic water quality parameters and efficiencies of these two steps. Observed concentrations of indigenous enteric adenoviruses, GII-noroviruses, enteroviruses, and Aichi viruses increased during rainfall events even though the virus concentration efficiency was presumed to be lower than in dry weather. The present study highlights the importance of using appropriate process controls to evaluate accurately the concentration of water borne enteric viruses in natural waters impacted by wastewater discharge, stormwater, and CSOs. © 2013.
Brodic, Darko; Milivojevic, Dragan R.; Milivojevic, Zoran N.
2011-01-01
The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation represents the key action for correct optical character recognition. Many of the tests for the evaluation of text line segmentation algorithms deal with text databases as reference templates. Because of this mismatch, a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multi-like text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for the evaluation of algorithm efficiency, based on the obtained error-type classification, are proposed. The first is based on the segmentation line error description, while the second one incorporates well-known signal detection theory. Each of them has different capabilities and conveniences, but they can be used as supplements to make the evaluation process efficient. Overall, the proposed procedure based on the segmentation line error description has some advantages, characterized by five measures that describe the measurement procedures. PMID:22164106
Martin, José Guilherme Prado; de Oliveira E Silva, Gabriela; da Fonseca, Carolina Rodrigues; Morales, Caio Baptista; Souza Pamplona Silva, Caroline; Miquelluti, Daniel Lima; Porto, Ernani
2016-12-05
Staphylococci are considered a major concern in dairy plants, mainly due to the intensive production flow, automation of processing plants and increased demand for microbiological quality of dairy products. This study aimed to identify S. aureus strains isolated from three Brazilian dairy plants, evaluate the influence of time, temperature and contact surface on the bacterial adhesion process, as well as the efficiency of a simulated hygiene and sanitation protocol in removing adhered cells. For genotypic analyses, the presence of icaA and icaD in strains was evaluated. Adherence assays were performed in a biofilm reactor, comparing the influence of 2 temperatures (5°C and 35°C), 2 surfaces (stainless steel and polypropylene) and 4 contact times (3, 6, 12 h and post-sanitization). To evaluate the process effectiveness in removing adhered cells, neutral detergent and a sanitizing agent based on sodium hypochlorite were used in order to simulate the situation observed in one of the dairy plants analyzed. The presence of icaA and icaD genes was determined in 75.3% and 77.6% of strains, respectively; 70.6% of isolates showed both genes, whereas 17.6% showed no genes. Genes for enterotoxin production were found in all samples, relating to SEG and SEH toxins. The number of cells adhered on both surfaces was about 3 and 6 log10 CFU/cm² at temperatures of 5°C and 35°C, respectively, for most situations evaluated, with a significant increase over the evaluation period. In general, the temperature of 35°C favored greater adherence of S. aureus. At 5°C, there was a considerable number of adhered cells, but in populations significantly lower than those observed at 35°C. The cleaning and sanitizing protocol was ineffective in removing adhered cells; better performance of sodium hypochlorite was observed at 5°C, which should be related to the lower adherence observed at this temperature. Thus, the process was not able to reduce the number of S. aureus bacteria adhered on both surfaces to safe levels under the conditions evaluated. Copyright © 2016 Elsevier B.V. All rights reserved.
Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy
NASA Astrophysics Data System (ADS)
Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli
2014-03-01
One of the key challenges in three-dimensional (3D) medical imaging is to enable a fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.
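For context (an inference from the reported numbers, not a claim made in the abstract), Amdahl's law bounds the speedup on n cores as

S(n) = \frac{1}{(1-p) + p/n},

so a measured S(12) ≈ 12 on a 12-core workstation implies that the parallel fraction p of the image-processing pipeline is very close to 1, consistent with the block-volume parallelization described.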
Wang, Jin; Patel, Vimal; Burns, Daniel; Laycock, John; Pandya, Kinnari; Tsoi, Jennifer; DeSilva, Binodh; Ma, Mark; Lee, Jean
2013-07-01
Regulated bioanalytical laboratories that run ligand-binding assays in support of biotherapeutics development face ever-increasing demand to support more projects with increased efficiency. Laboratory automation is a tool that has the potential to improve both quality and efficiency in a bioanalytical laboratory. The success of laboratory automation requires thoughtful evaluation of program needs and fit-for-purpose strategies, followed by pragmatic implementation plans and continuous user support. In this article, we present the development of fit-for-purpose automation in total walk-away and flexible modular modes. We share our sustaining experience of vendor collaboration and teamwork to educate, promote and track the use of automation. The implementation of laboratory automation improves assay performance, data quality, process efficiency and method transfer to CROs in a regulated bioanalytical laboratory environment.