Process safety improvement--quality and target zero.
Van Scyoc, Karl
2008-11-15
Process safety practitioners have adopted quality management principles in the design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods, and explores how methods intended for product quality can be additionally applied to continual improvement of process safety. Methods such as Kaizen, Poka-yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on the methods, are given.
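The statistical incident trending the abstract attributes to Deming is often realized as a Shewhart control chart. A minimal sketch of a c-chart over periodic incident counts, with limits at the mean plus or minus three times its square root; the monthly counts below are invented for illustration:

```python
import math

# Invented monthly process-safety incident counts.
incidents = [4, 2, 5, 3, 6, 2, 4, 3, 12, 3, 2, 4]

c_bar = sum(incidents) / len(incidents)        # centre line (mean count)
ucl = c_bar + 3 * math.sqrt(c_bar)             # upper control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))   # counts cannot go below zero

# Months whose counts fall outside the limits signal special-cause variation.
signals = [(month, c) for month, c in enumerate(incidents, start=1)
           if c > ucl or c < lcl]
print(f"CL={c_bar:.2f} UCL={ucl:.2f} LCL={lcl:.2f} signals={signals}")
```

With these numbers only month 9 (12 incidents) exceeds the upper limit and would prompt investigation.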
Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K
2015-12-01
There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.
Nolte, Kurt B; Stewart, Douglas M; O'Hair, Kevin C; Gannon, William L; Briggs, Michael S; Barron, A Marie; Pointer, Judy; Larson, Richard S
2008-10-01
The authors developed a novel continuous quality improvement (CQI) process for academic biomedical research compliance administration. A challenge in developing a quality improvement program in a nonbusiness environment is that the terminology and processes are often foreign. Rather than training staff in an existing quality improvement process, the authors opted to develop a novel process based on the scientific method--a paradigm familiar to all team members. The CQI process included our research compliance units. Unit leaders identified problems in compliance administration where a resolution would have a positive impact and which could be resolved or improved with current resources. They then generated testable hypotheses about a change to standard practice expected to improve the problem, and they developed methods and metrics to assess the impact of the change. The CQI process was managed in a "peer review" environment. The program included processes to reduce the incidence of infections in animal colonies, decrease research protocol-approval times, improve compliance and protection of animal and human research subjects, and improve research protocol quality. This novel CQI approach is well suited to the needs and the unique processes of research compliance administration. Using the scientific method as the improvement paradigm fostered acceptance of the project by unit leaders and facilitated the development of specific improvement projects. These quality initiatives will allow us to improve support for investigators while ensuring that compliance standards continue to be met. We believe that our CQI process can readily be used in other academically based offices of research.
System and method for networking electrochemical devices
Williams, Mark C.; Wimer, John G.; Archer, David H.
1995-01-01
An improved electrochemically active system and method including a plurality of electrochemical devices, such as fuel cells and fluid separation devices, in which the anode and cathode process-fluid flow chambers are connected in fluid-flow arrangements so that the operating parameters of each of said plurality of electrochemical devices which are dependent upon process-fluid parameters may be individually controlled to provide improved operating efficiency. The improvements in operation include improved power efficiency and improved fuel utilization in fuel cell power generating systems and reduced power consumption in fluid separation devices and the like through interstage process fluid parameter control for series networked electrochemical devices. The improved networking method includes recycling of various process flows to enhance the overall control scheme.
An exploratory survey of methods used to develop measures of performance
NASA Astrophysics Data System (ADS)
Hamner, Kenneth L.; Lafleur, Charles A.
1993-09-01
Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was the most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommended improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metric development method. Those components were incorporated into a proposed metric development method that was based on the OMB Generic Method and that should be more likely to produce high-quality metrics resulting in continuous process improvement.
Improved silicon carbide for advanced heat engines. I - Process development for injection molding
NASA Technical Reports Server (NTRS)
Whalen, Thomas J.; Trela, Walter
1989-01-01
Alternate processing methods have been investigated as a means of improving the mechanical properties of injection-molded SiC. Various mixing processes (dry, high-shear, and fluid) were evaluated along with the morphology and particle size of the starting beta-SiC powder. Statistically designed experiments were used to determine significant effects and interactions of variables in the mixing, injection molding, and binder removal process steps. Improvements in mechanical strength can be correlated with the reduction in flaw size observed in the injection-molded green bodies obtained with improved processing methods.
Bertholey, F; Bourniquel, P; Rivery, E; Coudurier, N; Follea, G
2009-05-01
Continuous improvement of efficiency, as well as new expectations from customers (quality and safety of blood products) and employees (working conditions), implies constant efforts in Blood Transfusion Establishments (BTE) to improve work organisation. The Lean method (from "Lean", meaning "thin") aims at identifying wastes in the process (overproduction, waiting, over-processing, inventory, transport, motion) and then reducing them by establishing a map of the value chain (Value Stream Mapping), which consists in determining the added value of each step of the process from a customer perspective. Lean also consists in standardizing operations while involving and empowering all staff. The name 5S comes from the first letters of five operations of a Japanese management technique: to clear, rank, keep clean, standardize, and make durable. The 5S method develops team working and induces an evolution of the way in which management is performed. The Lean VSM method has been applied to blood processing (component laboratory) in the Pays de la Loire BTE. The Lean 5S method has been applied to blood processing, quality control, purchasing, warehousing, human resources and quality assurance in the Rhône-Alpes BTE. Feedback from both BTE shows that these methods improved: (1) the processes and working conditions from a quality perspective, (2) staff satisfaction, and (3) efficiency. These experiences, implemented in two BTE for different processes, confirm the applicability and usefulness of these methods for improving work organisation in BTE.
Asou, Hiroya; Imada, N; Sato, T
2010-06-20
On coronary MR angiography (CMRA), cardiac motion worsens image quality. To improve image quality, detection of cardiac motion, and especially of individual coronary motion, is very important. Usually, scan delay and duration are determined manually by the operator. We developed a new evaluation method to calculate the static time of each individual coronary artery. First, coronary cine MRI was acquired at a level about 3 cm below the aortic valve (80 images/R-R). Chronological changes in signal were evaluated by Fourier transformation of each pixel of the images. Noise reduction with subtraction and extraction processes was performed. To extract structures with greater motion, such as the coronary arteries, morphological filtering and labeling processes were added. Using this image processing, individual coronary motion was extracted and the static time of each coronary artery was calculated automatically. We compared images obtained with the ordinary manual method and with the new automated method in 10 healthy volunteers. Coronary static times calculated with our method were shorter than those of the ordinary manual method, and scan time became about 10% longer than with the ordinary method. Image quality was improved with our method. Our automated detection method for coronary static time, based on chronological Fourier transformation, has the potential to improve the image quality of CMRA and to simplify processing.
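The core idea, a temporal Fourier transform applied per pixel to highlight moving structures, can be sketched with NumPy on a synthetic cine stack. All data here are simulated, and the paper's subtraction, morphological filtering and labeling steps are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical cine stack: 80 frames of 32x32 pixels
# (the study acquired 80 images per R-R interval).
frames = rng.normal(0.0, 0.01, size=(80, 32, 32))
t = np.arange(80)
# Simulate a small region whose intensity changes periodically, i.e. "moving" tissue.
frames[:, 10:12, 10:12] += np.sin(2 * np.pi * 5 * t / 80)[:, None, None]

spectrum = np.fft.rfft(frames, axis=0)            # temporal Fourier transform per pixel
motion_energy = np.abs(spectrum[1:]).sum(axis=0)  # drop the DC bin, sum magnitudes

# Pixels with strong temporal variation stand out against the static background.
mask = motion_energy > motion_energy.mean() + 3 * motion_energy.std()
print(int(mask.sum()), "moving pixels found")
```

The frames during which such motion energy stays low would correspond to the static window used for scan timing.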
The Data-to-Action Framework: A Rapid Program Improvement Process
ERIC Educational Resources Information Center
Zakocs, Ronda; Hill, Jessica A.; Brown, Pamela; Wheaton, Jocelyn; Freire, Kimberley E.
2015-01-01
Although health education programs may benefit from quality improvement methods, scant resources exist to help practitioners apply these methods for program improvement. The purpose of this article is to describe the Data-to-Action framework, a process that guides practitioners through rapid-feedback cycles in order to generate actionable data to…
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao
Business process modeling (BPM) is gaining attention as a means of analyzing and improving business processes. BPM analyzes the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value as a TO-BE model. However, research on techniques that seamlessly connect the business process improvement obtained by BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML, and the implementation is carried out with UML techniques, we can expect an improvement in the efficiency of information system implementation. In this paper, we describe a system development method that converts the process model obtained by BPM into UML; the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, a comparison is made with the case where the system is implemented by the conventional UML technique without going via BPM.
Actively Teaching Research Methods with a Process Oriented Guided Inquiry Learning Approach
ERIC Educational Resources Information Center
Mullins, Mary H.
2017-01-01
Active learning approaches have shown to improve student learning outcomes and improve the experience of students in the classroom. This article compares a Process Oriented Guided Inquiry Learning style approach to a more traditional teaching method in an undergraduate research methods course. Moving from a more traditional learning environment to…
The use of process mapping in healthcare quality improvement projects.
Antonacci, Grazia; Reed, Julie E; Lennox, Laura; Barlow, James
2018-05-01
Introduction Process mapping provides insight into systems and processes in which improvement interventions are introduced and is seen as useful in healthcare quality improvement projects. There is little empirical evidence on the use of process mapping in healthcare practice. This study advances understanding of the benefits and success factors of process mapping within quality improvement projects. Methods Eight quality improvement projects were purposively selected from different healthcare settings within the UK's National Health Service. Data were gathered from multiple data-sources, including interviews exploring participants' experience of using process mapping in their projects and perceptions of benefits and challenges related to its use. These were analysed using inductive analysis. Results Eight key benefits related to process mapping use were reported by participants (gathering a shared understanding of the reality; identifying improvement opportunities; engaging stakeholders in the project; defining the project's objectives; monitoring project progress; learning; increased empathy; simplicity of the method) and five factors related to successful process mapping exercises (simple and appropriate visual representation, information gathered from multiple stakeholders, facilitator's experience and soft skills, basic training, iterative use of process mapping throughout the project). Conclusions Findings highlight the benefits and versatility of process mapping and provide practical suggestions to improve its use in practice.
Research on pre-processing of QR Code
NASA Astrophysics Data System (ADS)
Sun, Haixing; Xia, Haojie; Dong, Ning
2013-10-01
QR codes encode many kinds of information thanks to their advantages: large storage capacity, high reliability, ultra-high-speed omnidirectional reading, small printing size, highly efficient representation of Chinese characters, etc. In order to obtain a clearer binarized image from a complex background and improve the recognition rate of QR codes, this paper investigates pre-processing methods for QR codes (Quick Response Codes) and presents algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by adapting Sauvola's adaptive thresholding method for text recognition. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
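Sauvola's adaptive thresholding, the method the abstract builds on, computes a per-pixel threshold T = m * (1 + k*(s/R - 1)) from the local windowed mean m and standard deviation s. A naive sketch; the window size, k, R and the toy image below are illustrative choices, not the paper's parameters:

```python
import numpy as np

def sauvola_threshold(img, window=15, k=0.2, R=128.0):
    """Per-pixel Sauvola threshold: T = m * (1 + k*(s/R - 1))."""
    pad = window // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    out = np.zeros(img.shape, dtype=bool)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            win = padded[y:y + window, x:x + window]
            m, s = win.mean(), win.std()
            out[y, x] = img[y, x] > m * (1 + k * (s / R - 1))
    return out  # True = background (white), False = ink (dark modules)

# Toy example: dark QR modules (30) on an unevenly lit background.
img = np.full((40, 40), 200.0)
img[:, 20:] -= 80          # darker right half simulates uneven illumination
img[10:14, 10:14] = 30     # a dark module in the bright half
img[10:14, 30:34] = 30     # a dark module in the dim half
binary = sauvola_threshold(img)
print(binary[12, 12], binary[12, 32])  # prints "False False": both modules are ink
```

Because the threshold adapts to the local statistics, the module in the dim half is classified correctly even though its background is darker than the other half's ink would be under a single global threshold.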
Piccinno, Fabiano; Hischier, Roland; Seeger, Stefan; Som, Claudia
2018-05-15
We present here a new eco-efficiency process-improvement method to highlight combined environmental and costs hotspots of the production process of new material at a very early development stage. Production-specific and scaled-up results for life cycle assessment (LCA) and production costs are combined in a new analysis to identify synergetic improvement potentials and trade-offs, setting goals for the eco-design of new processes. The identified hotspots and bottlenecks will help users to focus on the relevant steps for improvements from an eco-efficiency perspective and potentially reduce their associated environmental impacts and production costs. Our method is illustrated with a case study of nanocellulose. The results indicate that the production route should start with carrot pomace, use heat and solvent recovery, and deactivate the enzymes with bleach instead of heat. To further improve the process, the results show that focus should be laid on the carrier polymer, sodium alginate, and the production of the GripX coating. Overall, the method shows that the underlying LCA scale-up framework is valuable for purposes beyond conventional LCA studies and is applicable at a very early stage to provide researchers with a better understanding of their production process.
O'Connor, Sydney; Ayres, Alison; Cortellini, Lynelle; Rosand, Jonathan; Rosenthal, Eric; Kimberly, W Taylor
2012-08-01
Reliable and efficient data repositories are essential for the advancement of research in Neurocritical care. Various factors, such as the large volume of patients treated within the neuro ICU, their differing length and complexity of hospital stay, and the substantial amount of desired information can complicate the process of data collection. We adapted the tools of process improvement to the data collection and database design of a research repository for a Neuroscience intensive care unit. By the Shewhart-Deming method, we implemented an iterative approach to improve the process of data collection for each element. After an initial design phase, we re-evaluated all data fields that were challenging or time-consuming to collect. We then applied root-cause analysis to optimize the accuracy and ease of collection, and to determine the most efficient manner of collecting the maximal amount of data. During a 6-month period, we iteratively analyzed the process of data collection for various data elements. For example, the pre-admission medications were found to contain numerous inaccuracies after comparison with a gold standard (sensitivity 71% and specificity 94%). Also, our first method of tracking patient admissions and discharges contained higher than expected errors (sensitivity 94% and specificity 93%). In addition to increasing accuracy, we focused on improving efficiency. Through repeated incremental improvements, we reduced the number of subject records that required daily monitoring from 40 to 6 per day, and decreased daily effort from 4.5 to 1.5 h/day. By applying process improvement methods to the design of a Neuroscience ICU data repository, we achieved a threefold improvement in efficiency and increased accuracy. Although individual barriers to data collection will vary from institution to institution, a focus on process improvement is critical to overcoming these barriers.
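Accuracy audits like those described, sensitivity and specificity against a gold standard, reduce to counting the four agreement cases. A minimal sketch with invented audit data (not the study's actual records):

```python
def sensitivity_specificity(pairs):
    """pairs: (recorded, gold) booleans per audited item, e.g. whether a
    pre-admission medication appears in the repository vs. the gold standard."""
    tp = sum(1 for r, g in pairs if r and g)
    fn = sum(1 for r, g in pairs if not r and g)
    tn = sum(1 for r, g in pairs if not r and not g)
    fp = sum(1 for r, g in pairs if r and not g)
    return tp / (tp + fn), tn / (tn + fp)

# Invented audit: 10 items truly present (7 captured), 16 truly absent (15 correct).
audit = ([(True, True)] * 7 + [(False, True)] * 3 +
         [(False, False)] * 15 + [(True, False)] * 1)
sens, spec = sensitivity_specificity(audit)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
```

Tracking these two numbers per data element over each improvement cycle is what lets a root-cause fix be verified quantitatively.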
SEIPS-based process modeling in primary care.
Wooldridge, Abigail R; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter L T
2017-04-01
Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. Copyright © 2016 Elsevier Ltd. All rights reserved.
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
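The segmented-regression side of the comparison fits an intercept, a baseline trend, a level change at the intervention, and a slope change afterwards. A minimal ordinary-least-squares sketch on an invented series with a known level change:

```python
import numpy as np

# Invented monthly series: level 50, slope +0.5/month, then an -8 level drop at month 12.
t = np.arange(24, dtype=float)
post = (t >= 12).astype(float)
y = 50 + 0.5 * t - 8 * post

# Segmented-regression design matrix: intercept, baseline trend,
# level change at the intervention, slope change after it.
X = np.column_stack([np.ones_like(t), t, post, (t - 12) * post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 3))  # ≈ [50, 0.5, -8, 0]
```

Here the fit recovers the simulated parameters exactly; on real quality improvement data the level- and slope-change coefficients quantify the intervention effect, which is what the article contrasts with control-chart signals.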
Learning process mapping heuristics under stochastic sampling overheads
NASA Technical Reports Server (NTRS)
Ieumwananonthachai, Arthur; Wah, Benjamin W.
1991-01-01
A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to get the best possible heuristics while trading between the solution quality of the process mapping heuristics and their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance is presented using the more realistic assumption along with some methods that alleviate the additional complexity.
Intelligent process mapping through systematic improvement of heuristics
NASA Technical Reports Server (NTRS)
Ieumwananonthachai, Arthur; Aizawa, Akiko N.; Schwartz, Steven R.; Wah, Benjamin W.; Yan, Jerry C.
1992-01-01
The present system for the automatic learning/evaluation of novel heuristic methods applicable to the mapping of communication-process sets onto a computer network is based on testing a population of competing heuristic methods within a fixed time constraint. The TEACHER 4.1 prototype learning system, implemented for learning new post-game analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.
Wavelet-Based Processing for Fiber Optic Sensing Systems
NASA Technical Reports Server (NTRS)
Hamory, Philip J. (Inventor); Parker, Allen R., Jr. (Inventor)
2016-01-01
The present invention is an improved method of processing conglomerate data. The method employs a Triband Wavelet Transform that decomposes and decimates the conglomerate signal to obtain a final result. The invention may be employed to improve performance of Optical Frequency Domain Reflectometry systems.
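The patent's Triband Wavelet Transform is not detailed in this abstract; as a hedged illustration of the general decompose-and-decimate idea it relies on, here is one level of the standard two-band Haar wavelet transform:

```python
import numpy as np

def haar_step(x):
    """One analysis level: averages (low band) and differences (high band),
    each decimated by two, using the orthonormal Haar filters."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

signal = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
approx, detail = haar_step(signal)
print(approx)  # smoothed content at half the sample rate
print(detail)  # edge-like content
```

Each level halves the data rate while separating slow trends from sharp transitions, which is why such transforms suit high-volume sensing streams; the patented three-band variant would differ in filter count and design.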
Laurila, J; Standertskjöld-Nordenstam, C G; Suramo, I; Tolppanen, E M; Tervonen, O; Korhola, O; Brommels, M
2001-01-01
To study the efficacy of continuous quality improvement (CQI) compared to ordinary management in an on-duty radiology department. Because of complaints regarding delivery of on-duty radiological services, improvement projects were initiated simultaneously at two hospitals: at the HUCH (Helsinki University Central Hospital) utilising the CQI method, and at the OUH (Oulu University Hospital) using a traditional management process. For the CQI project, a team was formed to evaluate the process with flow charts, cause-and-effect diagrams, Pareto analysis and control charts. Interventions to improve the process were based on the results of these analyses. The team at the HUCH implemented the following changes: a radiologist was added to the evening shift between 15:00-22:00, and a radiographer was moved from the morning shift to 15:00-22:00. A clear improvement was achieved in the turn-around time, but in the follow-up some of the gains were lost. Only minimal changes were achieved at the OUH, where the intervention was based on traditional management processes. CQI was an effective method for improving the quality of performance of a radiology department compared with ordinary management methods, but some of this improvement may be subsequently lost without a continuous measurement system.
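The Pareto analysis step such a CQI team uses can be sketched as ranking delay causes and keeping the "vital few" that account for roughly 80% of occurrences. The categories and counts below are invented, not taken from the study:

```python
# Invented counts of causes for delayed on-duty reports.
causes = {"waiting for radiologist": 46, "film transport": 21,
          "patient preparation": 18, "equipment downtime": 9,
          "re-examinations": 6}

ranked = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)
total = sum(causes.values())
cum, vital_few = 0, []
for name, count in ranked:
    cum += count
    vital_few.append(name)
    if cum / total >= 0.8:   # stop once ~80% of occurrences are covered
        break
print(vital_few)
```

Focusing interventions on the top-ranked causes (here, radiologist availability) is what motivated the staffing changes described above.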
The Taguchi Method Application to Improve the Quality of a Sustainable Process
NASA Astrophysics Data System (ADS)
Titu, A. M.; Sandu, A. V.; Pop, A. B.; Titu, S.; Ciungu, T. C.
2018-06-01
Taguchi’s method has long been used to improve the quality of the processes and products analyzed. This research addresses an unusual situation, namely the modeling of some technical parameters in a process intended to be sustainable, improving process quality and ensuring quality by means of an experimental research method. Modern experimental techniques can be applied in any field, and this study reflects the benefits of the interaction between agricultural sustainability principles and the application of Taguchi’s method. The experimental method used in this practical study combines engineering techniques with statistical experimental modeling to achieve rapid improvement of quality costs, in effect seeking optimization of existing processes and their main technical parameters. The paper is a purely technical study that promotes a technical experiment using the Taguchi method, considered an effective method since it allows rapid achievement of 70 to 90% of the desired optimization of the technical parameters. The missing 10 to 30 percent can be obtained with one or two complementary experiments, limited to the 2 to 4 technical parameters considered most influential. Applying Taguchi’s method in engineering and beyond allowed the simultaneous study, in the same experiment, of the influence factors considered most important in different combinations, and at the same time the determination of each factor's contribution.
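Taguchi's method compares factor-level settings through signal-to-noise ratios; for a larger-the-better response the usual form is S/N = -10 * log10(mean(1/y^2)). A minimal sketch with invented replicate data (not the paper's experiment):

```python
import math

def sn_larger_is_better(values):
    """Taguchi S/N ratio for a larger-the-better response:
    S/N = -10 * log10(mean(1 / y^2))."""
    return -10.0 * math.log10(sum(1.0 / (v * v) for v in values) / len(values))

# Invented replicate yields for two factor-level settings.
setting_a = [92.0, 95.0, 93.0]   # high and consistent
setting_b = [80.0, 99.0, 90.0]   # similar mean, more scatter
print(sn_larger_is_better(setting_a), sn_larger_is_better(setting_b))
```

The ratio rewards both a high mean and low scatter, so the more consistent setting wins even when the plain means are close; ranking settings by S/N across an orthogonal array is how the method attributes each factor's contribution.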
System approach to modeling of industrial technologies
NASA Astrophysics Data System (ADS)
Toropov, V. S.; Toropov, E. S.
2018-03-01
The authors present a system of methods for modeling and improving industrial technologies. The system consists of an information part and a software part. The information part is structured information about industrial technologies, organized according to a template with several essential categories used to improve the technological process and eliminate weaknesses in the process chain. The base category is the physical effect that takes place as the technological process proceeds. The software part of the system can apply various methods of creative search to the content stored in the information part. These methods pay particular attention to energy transformations in the technological process. Applying the system will make it possible to systematize the approach to improving technologies and to obtain new technical solutions.
Optimizing The DSSC Fabrication Process Using Lean Six Sigma
NASA Astrophysics Data System (ADS)
Fauss, Brian
Alternative energy technologies must become more cost effective to achieve grid parity with fossil fuels. Dye-sensitized solar cells (DSSCs) are an innovative third-generation photovoltaic technology demonstrating tremendous potential to become a revolutionary technology due to recent breakthroughs in fabrication cost. The study here focused on quality improvement measures undertaken to improve the fabrication of DSSCs and enhance process efficiency and effectiveness. Several quality improvement methods were implemented to optimize the seven individual steps of the DSSC fabrication process. Lean Manufacturing's 5S method successfully increased efficiency in all of the processes. Six Sigma's DMAIC methodology was used to identify and eliminate each of the root causes of defects in the critical titanium dioxide deposition process. These optimizations resulted in the following significant improvements in the production process: 1. fabrication time of the DSSCs was reduced by 54%; 2. fabrication procedures were improved to the extent that all critical defects in the process were eliminated; 3. the yield of functioning DSSCs was increased from 17% to 90%.
Novel image processing method study for a label-free optical biosensor
NASA Astrophysics Data System (ADS)
Yang, Chenhao; Wei, Li'an; Yang, Rusong; Feng, Ying
2015-10-01
Optical biosensors are generally divided into labeled and label-free types. The former mainly comprise fluorescence-labeled and radioactive-labeled methods, with the fluorescence-labeled method being the more mature in application. The main image processing methods for fluorescence-labeled biosensors include smoothing filters, manual gridding and constant thresholding. Since some fluorescent molecules may influence the biological reaction, label-free methods have become the main development direction of optical biosensors. However, the use of a wider field of view and a larger angle of incidence in the light path, which can effectively improve the sensitivity of a label-free biosensor, also brings more difficulties in image processing compared with fluorescence-labeled biosensors. Otsu's method, widely applied in machine vision, chooses the threshold that minimizes the intraclass variance of the thresholded black and white pixels; as a global threshold segmentation it is limited when the image intensity distribution is asymmetrical. In order to handle the irregularity of light intensity on the transducer, we improved the algorithm. In this paper, we present a new image processing algorithm based on a reflectance modulation biosensor platform, mainly comprising a sliding normalization algorithm for image rectification and an improved Otsu's method for image segmentation, in order to implement automatic recognition of target areas. Finally, we used an adaptive gridding method to extract the target parameters for analysis. These methods improve the efficiency of image processing, reduce human intervention, enhance the reliability of experiments and lay the foundation for high-throughput label-free optical biosensors.
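Standard Otsu thresholding, the baseline the authors improve upon, picks the threshold that maximizes between-class variance of the intensity histogram (equivalently, minimizes intraclass variance). A plain NumPy sketch on a synthetic bimodal image, not the authors' improved variant:

```python
import numpy as np

def otsu_threshold(img, levels=256):
    """Otsu's global threshold: pick t maximizing between-class variance."""
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    idx = np.arange(levels)
    best_t, best_var = 0, -1.0
    for t in range(1, levels):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (idx[:t] * p[:t]).sum() / w0          # class means
        mu1 = (idx[t:] * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2    # between-class variance
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Synthetic bimodal image: dark spots (mean 60) and bright background (mean 180).
rng = np.random.default_rng(1)
img = np.clip(np.concatenate([rng.normal(60, 5, 5000),
                              rng.normal(180, 5, 5000)]), 0, 255)
t = otsu_threshold(img)
print(t)  # threshold lands between the two intensity modes
```

When illumination varies across the transducer, the two modes smear and overlap, which is exactly the failure mode that motivates the paper's sliding normalization before thresholding.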
ERIC Educational Resources Information Center
Hahn, William G.; Bart, Barbara D.
2003-01-01
Business students were taught a total quality management-based outlining process for course readings and a tally method to measure learning efficiency. Comparison of 233 who used the process and 99 who did not showed that the group means of users' test scores were 12.4 points higher than those of nonusers. (Contains 25 references.) (SK)
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to undergraduate and graduate students in U.S. college and university systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI RTM) Models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps that exist between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
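As a rough illustration of the kind of Monte Carlo baseline described above, the sketch below draws hypothetical satisfaction drivers from normal distributions and combines them into an index score. The weights, means and deviations are invented for illustration and are not the ACSI model's actual parameters.

```python
import random
import statistics

def simulate_scores(weights, driver_means, driver_sds, n_trials=10_000, seed=42):
    """Monte Carlo sketch: draw each satisfaction driver from a normal
    distribution and combine them with fixed weights into an index score."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_trials):
        drivers = [rng.gauss(m, s) for m, s in zip(driver_means, driver_sds)]
        score = sum(w * d for w, d in zip(weights, drivers))
        scores.append(min(100.0, max(0.0, score)))  # clamp to a 0-100 index scale
    return statistics.mean(scores), statistics.stdev(scores)

# hypothetical drivers: perceived quality, expectations, perceived value
mean, sd = simulate_scores([0.5, 0.3, 0.2], [78, 74, 70], [5, 6, 8])
```

The simulated mean gives a baseline score and the spread supports sensitivity analysis, i.e. seeing which driver's uncertainty moves the predicted index most.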
Improvement of radiology services based on the process management approach.
Amaral, Creusa Sayuri Tahara; Rozenfeld, Henrique; Costa, Janaina Mascarenhas Hornos; Magon, Maria de Fátima de Andrade; Mascarenhas, Yvone Maria
2011-06-01
The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al. (2008) [4]). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
The Rocket Equation Improvement under ICF Implosion Experiment
NASA Astrophysics Data System (ADS)
Wang, Yanbin; Zheng, Zhijian
2013-10-01
The ICF implosion process has been studied in detail. The rocket equation has been improved for the ablative process by introducing the fuel pressure parameter. Several design guidelines can be drawn from the improved rocket equation and used to improve ICF target design, driving pulse design and experimental design. The first is to increase the ablation pressure; the second, to decrease the fuel pressure; the third, to use a larger target sphere diameter; and the fourth, to shorten the driving pulse.
An improved clustering algorithm based on reverse learning in intelligent transportation
NASA Astrophysics Data System (ADS)
Qiu, Guoqing; Kou, Qianqian; Niu, Ting
2017-05-01
With the development of artificial intelligence and data mining technology, big data has gradually entered people's field of vision. Clustering is an important method for processing large data sets. By introducing the reverse learning method into the clustering process of the PAM (Partitioning Around Medoids) algorithm, the limitations of one-time clustering in unsupervised learning are reduced and the diversity of the resulting clusters is increased, thereby improving the quality of clustering. Algorithm analysis and experimental results show that the algorithm is feasible.
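Reverse (opposition-based) learning is commonly formulated as evaluating, for each candidate x in a bounded range [a, b], the opposite point a + b - x, and keeping whichever scores better. The sketch below follows that generic formulation; it is not the paper's exact PAM variant, and the function names are illustrative.

```python
def opposite(point, lower, upper):
    """Opposition-based learning: for each coordinate x in [lo, hi],
    the opposite value is lo + hi - x."""
    return [lo + hi - x for x, lo, hi in zip(point, lower, upper)]

def better_of(point, lower, upper, cost):
    """Keep whichever of the candidate / its opposite has lower cost;
    evaluating both diversifies the search for cluster medoids."""
    opp = opposite(point, lower, upper)
    return point if cost(point) <= cost(opp) else opp
```

In a clustering context, the candidate would encode a set of medoids and the cost would be the total within-cluster dissimilarity; generating opposites lets the search escape a poor one-shot initialization.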
A novel double loop control model design for chemical unstable processes.
Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He
2014-03-01
In this manuscript, based on the Smith predictor control scheme for unstable processes in industry, an improved double loop control model is proposed for chemical unstable processes. The inner loop stabilizes the integrating unstable process and transforms the original process into a first-order plus pure dead-time stable process. The outer loop enhances the performance of the set-point response, and a disturbance controller is designed to enhance the performance of the disturbance response. The improved control system is simple, has an exact physical meaning, and its characteristic equation is easy to stabilize. The three controllers are designed separately, so each controller is easy to design and good control performance is obtained for each closed-loop transfer function. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method gives better system performance than existing design methods. © 2013 ISA. Published by ISA. All rights reserved.
Methods for consistent forewarning of critical events across multiple data channels
Hively, Lee M.
2006-11-21
This invention teaches further method improvements to forewarn of critical events via phase-space dissimilarity analysis of data from biomedical equipment, mechanical devices, and other physical processes. One improvement involves conversion of time-serial data into equiprobable symbols. A second improvement is a method to maximize the channel-consistent total-true rate of forewarning from a plurality of data channels over multiple data sets from the same patient or process. This total-true rate requires resolution of the forewarning indications into true positives, true negatives, false positives and false negatives. A third improvement is the use of various objective functions, as derived from the phase-space dissimilarity measures, to give the best forewarning indication. A fourth improvement uses various search strategies over the phase-space analysis parameters to maximize said objective functions. A fifth improvement shows the usefulness of the method for various biomedical and machine applications.
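The first improvement, conversion of time-serial data into equiprobable symbols, amounts to quantile binning: rank the samples, then split the ranks into equal-sized groups so every symbol occurs (nearly) equally often. A minimal sketch (the function name is illustrative):

```python
def equiprobable_symbols(series, n_symbols=4):
    """Convert time-serial data into symbols with (nearly) equal
    occurrence counts: rank the samples, then split the ranks into
    n_symbols equal-sized groups."""
    order = sorted(range(len(series)), key=lambda i: series[i])
    symbols = [0] * len(series)
    for rank, idx in enumerate(order):
        symbols[idx] = rank * n_symbols // len(series)
    return symbols
```

Equiprobable symbols make the downstream phase-space statistics insensitive to the amplitude distribution of the raw signal, which is why the patent cites them as a robustness improvement.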
Improved Method of Purifying Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Delzeit, Lance D.
2004-01-01
An improved method of removing the residues of fabrication from carbon nanotubes has been invented. These residues comprise amorphous carbon and metal particles that are produced during the growth process. Prior methods of removing the residues include a variety of processes that involved the use of halogens, oxygen, or air in both thermal and plasma processes. Each of the prior methods entails one or more disadvantages, including non-selectivity (removal of or damage to nanotubes in addition to removal of the residues), the need to dispose of toxic wastes, and/or processing times as long as 24 hours or more. In contrast, the process described here uses no toxic chemicals, generates no toxic wastes, causes little or no damage to the carbon nanotubes, and involves processing times of less than 1 hour. In the improved method, purification is accomplished by flowing water vapor through the reaction chamber at elevated temperatures and ambient pressures. The impurities are converted to gaseous waste products by selective hydrogenation and hydroxylation by the water in the reaction chamber. This process can be performed either immediately after growth or as a post-growth purification process. The water used needs to be substantially free of oxygen and can be obtained by a repeated freeze-pump-thaw process. The presence of oxygen would non-selectively attack the carbon nanotubes in addition to the amorphous carbon.
2011-01-01
Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303
NASA Astrophysics Data System (ADS)
Lu, Lei; Yan, Jihong; Chen, Wanqun; An, Shi
2018-03-01
This paper proposed a novel spatial frequency analysis method for the investigation of potassium dihydrogen phosphate (KDP) crystal surfaces based on an improved bidimensional empirical mode decomposition (BEMD) method. Aiming to eliminate end effects of the BEMD method and improve the intrinsic mode functions (IMFs) for efficient identification of texture features, a denoising process was embedded in the sifting iteration of the BEMD method. By removing redundant information in the decomposed sub-components of the KDP crystal surface, middle spatial frequencies of the cutting and feeding processes were identified. Comparative study with the power spectral density method, the two-dimensional wavelet transform (2D-WT), as well as the traditional BEMD method, demonstrated that the method developed in this paper can efficiently extract texture features and reveal the gradient development of the KDP crystal surface. Furthermore, the proposed method is a self-adaptive, data-driven technique requiring no prior knowledge, which overcomes shortcomings of the 2D-WT model such as parameter selection. Additionally, the proposed method is a promising tool for online monitoring and optimal control of precision machining processes.
Improving operational anodising process performance using simulation approach
NASA Astrophysics Data System (ADS)
Liong, Choong-Yeun; Ghazali, Syarah Syahidah
2015-10-01
The use of aluminium is very widespread, especially in the transportation, electrical and electronics, architectural, automotive and engineering applications sectors. The anodising process is therefore important for making aluminium durable, attractive and weather resistant. This research is focused on the anodising process operations in the manufacture and supply of aluminium extrusions. The data required for the development of the model were collected from observations and interviews conducted in the study. To study the current system, the processes involved in anodising are modeled using Arena 14.5 simulation software. They consist of five main processes, namely the degreasing process, the etching process, the desmut process, the anodising process and the sealing process, plus 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on comparisons between the improvement methods, productivity could be increased by reallocating the workers and reducing loading time.
NASA Astrophysics Data System (ADS)
Witantyo; Setyawan, David
2018-03-01
In the lead acid battery industry, grid casting is a process with high defect and thickness variation levels. The DMAIC (Define-Measure-Analyse-Improve-Control) method and its tools are used to improve the casting process. In the Define stage, a project charter and a SIPOC (Supplier Input Process Output Customer) diagram are used to map the existing problem. In the Measure stage, data are collected on the types and number of defects and on the grid thickness variation; the data are then processed and analyzed using the 5 Whys and FMEA methods. In the Analyze stage, grids exhibiting fragile and crack defects are examined under a microscope, revealing Pb oxide inclusions in the grid. Analysis of the grid casting process shows an excessive temperature difference between the molten metal and the mold, as well as a corking process that lacks a standard. In the Improve stage, corrective actions reduce the grid thickness variation and lower the defect-per-unit level from 9.184% to 0.492%. In the Control stage, a new working standard is established and the improved process is placed under control.
Sethi, Rajiv; Yanamadala, Vijay; Burton, Douglas C; Bess, Robert Shay
2017-11-01
Lean methodology was developed in the manufacturing industry to increase output and decrease costs. These labor organization methods have become the mainstay of major manufacturing companies worldwide. Lean methods involve continuous process improvement through the systematic elimination of waste, prevention of mistakes, and empowerment of workers to make changes. Because of the profit and productivity gains made in the manufacturing arena using lean methods, several healthcare organizations have adopted lean methodologies for patient care. Lean methods have now been implemented in many areas of health care. In orthopaedic surgery, lean methods have been applied to reduce complication rates and create a culture of continuous improvement. A step-by-step guide based on our experience can help surgeons use lean methods in practice. Surgeons and hospital centers well versed in lean methodology will be poised to reduce complications, improve patient outcomes, and optimize cost/benefit ratios for patient care.
NASA Astrophysics Data System (ADS)
Abitew, T. A.; van Griensven, A.; Bauwens, W.
2015-12-01
Evapotranspiration (ET) is the main process in hydrology (on average around 60% of the water balance), yet it has not received as much attention in the evaluation and calibration of hydrological models. In this study, Remote Sensing (RS) derived ET is used to improve the spatially distributed ET processes of SWAT model applications in the upper Mara basin (Kenya) and the Blue Nile basin (Ethiopia). The RS-derived ET data are obtained from recently compiled global datasets (continuous monthly data at 1 km resolution from the MOD16NBI, SSEBop, ALEXI and CMRSET models) and from regionally applied Energy Balance Models (for several cloud-free days). The RS-ET data are used in three ways: Method 1) to evaluate spatially distributed evapotranspiration model results; Method 2) to calibrate the evapotranspiration processes in the hydrological model; Method 3) to bias-correct the evapotranspiration in the hydrological model during simulation, after changing the SWAT code. An inter-comparison of the RS-ET products shows that at present there is a significant bias, but at the same time an agreement on the spatial variability of ET. The ensemble mean of the different ET products seems the most realistic estimate and was further used in this study. The results show that: Method 1) the spatially mapped evapotranspiration of hydrological models shows clear differences when compared to RS-derived evapotranspiration (low correlations).
In particular, evapotranspiration in forested areas is strongly underestimated compared to other land covers. Method 2) Calibration improves the correlations between the RS and hydrological model results to some extent. Method 3) Bias corrections are efficient in producing (seasonal or annual) evapotranspiration maps from hydrological models that are very similar to the patterns obtained from RS data. Though the bias correction is very efficient, it is advised to improve the model results by better representing the ET processes through improved plant/crop computations, improved agricultural management practices, or improved meteorological data.
Improving Informed Consent with Minority Participants: Results from Researcher and Community Surveys
Quinn, Sandra Crouse; Garza, Mary A.; Butler, James; Fryer, Craig S.; Casper, Erica T.; Thomas, Stephen B.; Barnard, David; Kim, Kevin H.
2013-01-01
Strengthening the informed consent process is one avenue for improving recruitment of minorities into research. This study examines that process from two different perspectives, that of researchers and that of African American and Latino community members. Through the use of two separate surveys, we compared strategies used by researchers with the preferences and attitudes of community members during the informed consent process. Our data suggest that researchers can improve the informed consent process by incorporating methods preferred by the community members along with methods shown in the literature for increasing comprehension. With this approach, the informed consent process may increase both participants’ comprehension of the material and overall satisfaction, fostering greater trust in research and openness to future research opportunities. PMID:23324203
George, David L; Smith, Michael J; Draugalis, JoLaine R; Tolma, Eleni L; Keast, Shellie L; Wilson, Justin B
2018-03-01
The Center for Medicare and Medicaid Services (CMS) created the Star Rating system based on multiple measures that indicate the overall quality of health plans. Community pharmacists can impact certain Star Ratings measure scores through medication adherence and patient safety interventions. The objective was to explore the methods, needs, and workflow issues of community pharmacists working to improve CMS Star Ratings measures. Think-aloud protocols (TAPs) were conducted with active community retail pharmacists in Oklahoma. Each TAP was audio recorded and transcribed for analysis. Analysts agreed on common themes, illuminated differences in findings, and established saturation of the data gathered. The methods, needs, and workflow themes of community pharmacists associated with improving Star Ratings measures were compiled and organized to exhibit a decision-making process. Five TAPs were performed among three independent pharmacy owners, one multi-store owner, and one chain-store administrator. A thematically common 4-step process to monitor and improve CMS Star Ratings scores among participants was identified. To improve Star Ratings measures, pharmacists: 1) used technology to access scores, 2) analyzed data to strategically set goals, 3) assessed individual patient information for comprehensive assessment, and 4) decided on interventions to best impact Star Ratings scores. Participants also shared common needs, workflow issues, and benefits associated with the methods used in improving Star Ratings. TAPs were useful in exploring the processes of pharmacists who improve CMS Star Ratings scores. Pharmacists demonstrated and verbalized their methods, workflow issues, needs, and benefits related to performing the task. The themes and decision-making process identified for improving CMS Star Ratings scores will assist in the development of training and education programs for pharmacists in the community setting. Published by Elsevier Inc.
Significant improvement in the thermal annealing process of optical resonators
NASA Astrophysics Data System (ADS)
Salzenstein, Patrice; Zarubin, Mikhail
2017-05-01
Thermal annealing performed during fabrication improves the surface roughness of optical resonators, reducing stresses at the periphery of their surface and thus allowing higher Q-factors. After a preliminary realization, the design of the oven and the electronic method were significantly improved thanks to nichrome resistance-alloy wires and chopped basalt fibers for thermal insulation during the annealing process. Q-factors can then be improved.
NASA Astrophysics Data System (ADS)
Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan
Based on our observations of Austrian video game software development (VGSD) practices we identified a lack of systematic process/method support and inefficient collaboration between the various disciplines involved, i.e. engineers and artists. VGSD includes heterogeneous disciplines, e.g. creative arts, game/content design, and software. Improving team collaboration and process support is therefore an ongoing challenge in enabling a comprehensive view on game development projects. Lessons learned from software engineering practices can help game developers improve game development processes within a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper presents (a) first results with a focus on process/method support and (b) a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results (a) showed a trend toward highly flexible software processes involving various disciplines and (b) identified the suggested flexible process approach as feasible and useful for project application.
[An Improved Spectral Quaternion Interpolation Method of Diffusion Tensor Imaging].
Xu, Yonghong; Gao, Shangce; Hao, Xiaofei
2016-04-01
Diffusion tensor imaging (DTI) is a rapidly developing magnetic resonance imaging technology of recent years. Diffusion tensor interpolation is a very important procedure in DTI image processing. The traditional spectral quaternion interpolation method revises the direction of the interpolated tensor and can preserve tensor anisotropy, but it does not revise the size of the tensors. The present study puts forward an improved spectral quaternion interpolation method on the basis of the traditional one. Firstly, we decomposed the diffusion tensors, with the direction of the tensors represented by a quaternion. Then we revised the size and direction of the tensor respectively according to different situations. Finally, we acquired the tensor at the interpolation point by calculating the weighted average. We compared the improved method with the spectral quaternion method and the Log-Euclidean method on simulated and real data. The results showed that the improved method not only keeps the monotonicity of the fractional anisotropy (FA) and of the determinant of the tensors, but also preserves tensor anisotropy. In conclusion, the improved method provides an important interpolation method for diffusion tensor image processing.
Improved wavelet de-noising method of rail vibration signal for wheel tread detection
NASA Astrophysics Data System (ADS)
Zhao, Quan-ke; Zhao, Quanke; Gao, Xiao-rong; Luo, Lin
2011-12-01
Irregularities of the wheel tread can be detected by processing the acceleration vibration signal of the rail. Various kinds of noise from different sources, such as wheel-rail resonance, bad weather and human activity, are the key factors influencing detection accuracy. A method using wavelet threshold de-noising is investigated to reduce noise in the detection signal, and an improved signal processing algorithm based on it has been established. The results of simulations and field experiments show that the proposed method can effectively increase the signal-to-noise ratio (SNR) of the rail vibration signal and improve detection accuracy.
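Wavelet threshold de-noising of the kind described can be sketched with a single-level Haar transform and soft thresholding. The paper's actual wavelet basis, decomposition level and threshold rule are not specified here, so this is a generic illustration:

```python
import math

def haar_soft_denoise(signal, threshold):
    """One-level Haar wavelet soft-threshold de-noising sketch:
    transform, shrink the detail coefficients toward zero, invert."""
    assert len(signal) % 2 == 0
    # forward Haar transform: pairwise averages (approximation) and
    # differences (detail), orthonormal scaling by 1/sqrt(2)
    approx = [(a + b) / math.sqrt(2) for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / math.sqrt(2) for a, b in zip(signal[::2], signal[1::2])]
    # soft threshold: |d| <= threshold is zeroed, larger |d| is shrunk
    soft = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    # inverse transform from thresholded coefficients
    out = []
    for a, d in zip(approx, soft):
        out.extend([(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)])
    return out
```

Small high-frequency fluctuations (noise) live in the detail coefficients and are suppressed, while large-amplitude transients, such as a tread-flat impact, survive the threshold.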
Process Security in Chemical Engineering Education
ERIC Educational Resources Information Center
Piluso, Cristina; Uygun, Korkut; Huang, Yinlun; Lou, Helen H.
2005-01-01
The threats of terrorism have greatly alerted the chemical process industries to assure plant security at all levels: infrastructure-improvement-focused physical security, information-protection-focused cyber security, and design-and-operation-improvement-focused process security. While developing effective plant security methods and technologies…
Sanchez-Lite, Alberto; Garcia, Manuel; Domingo, Rosario; Angel Sebastian, Miguel
2013-01-01
Musculoskeletal disorders (MSDs) that result from poor ergonomic design are one of the occupational disorders of greatest concern in the industrial sector. A key advantage in the primary design phase is to focus on an assessment method that detects and evaluates the potential risks the operative faces from these types of physical injuries. Such an assessment method improves process design by identifying potential ergonomic improvements among design alternatives or activities undertaken as part of the cycle of continuous improvement throughout the phases of the product life cycle. This paper presents a novel postural assessment method (NERPA) fit for product-process design, developed with the help of a digital human model together with a 3D CAD tool widely used in the aeronautic and automotive industries. The power of 3D visualization and the possibility of studying the actual assembly sequence in a virtual environment allow the functional performance of the parts to be addressed. Such tools can also provide an ergonomic workstation design, together with a competitive advantage in the assembly process. The method developed was used in the design of six production lines, studying 240 manual assembly operations and improving 21 of them. This study demonstrated the proposed method's usefulness and found statistically significant differences between the evaluations of the proposed method and the widely used Rapid Upper Limb Assessment (RULA) method.
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
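The p-chart used in the analysis places 3-sigma limits around the average problem rate; points outside the limits signal special-cause variation rather than the process's normal spread. A minimal sketch of the limit computation (per-sample limits, since sample sizes typically vary):

```python
import math

def p_chart_limits(defectives, sample_sizes):
    """Centre line and 3-sigma control limits for a p-chart.
    Returns the overall rate p_bar and one (LCL, UCL) pair per sample."""
    p_bar = sum(defectives) / sum(sample_sizes)  # centre line
    limits = []
    for n in sample_sizes:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)))
    return p_bar, limits
```

This is how the authors could distinguish a stable process with an unacceptably high rate (all points inside limits, but the centre line is too high) from an unstable one (points breaching limits during quality improvement efforts).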
NASA Astrophysics Data System (ADS)
Sembiring, M. T.; Wahyuni, D.; Sinaga, T. S.; Silaban, A.
2018-02-01
Cost allocation in the manufacturing industry, particularly in palm oil mills, is still widely practiced based on estimation, which leads to cost distortion. Besides, the processing time determined by the company is not in accordance with the actual processing time at each work station. Hence, the purpose of this study is to eliminate non-value-added activities so that processing time can be shortened and production cost reduced. The Activity Based Costing method is used in this research to calculate production cost with consideration of value-added and non-value-added activities. The results of this study are processing time reductions of 35.75% at the Weighing Bridge Station, 29.77% at the Sorting Station, 5.05% at the Loading Ramp Station, and 0.79% at the Sterilizer Station. The cost of manufacturing Crude Palm Oil is IDR 5,236.81/kg calculated by the traditional method, IDR 4,583.37/kg by the Activity Based Costing method before implementation of the activity improvement, and IDR 4,581.71/kg after implementation. Meanwhile, the cost of manufacturing Palm Kernel is IDR 2,159.50/kg calculated by the traditional method, IDR 4,584.63/kg by the Activity Based Costing method before implementation of the activity improvement, and IDR 4,582.97/kg after implementation.
Employee empowerment through team building and use of process control methods.
Willems, S
1998-02-01
The article examines the use of statistical process control and performance improvement techniques in employee empowerment. The focus is how these techniques provide employees with information to improve their productivity and become involved in the decision-making process. Findings suggest that at one Mississippi hospital employee improvement has had a positive effect on employee productivity, morale, and quality of work.
Ensemble analyses improve signatures of tumour hypoxia and reveal inter-platform differences
2014-01-01
Background The reproducibility of transcriptomic biomarkers across datasets remains poor, limiting clinical application. We and others have suggested that this is in part caused by differential error structure between datasets, and its incomplete removal by pre-processing algorithms. Methods To test this hypothesis, we systematically assessed the effects of pre-processing on biomarker classification using 24 different pre-processing methods and 15 distinct signatures of tumour hypoxia in 10 datasets (2,143 patients). Results We confirm strong pre-processing effects for all datasets and signatures, and find that these differ between microarray versions. Importantly, exploiting different pre-processing techniques in an ensemble technique improved classification for a majority of signatures. Conclusions Assessing biomarkers using an ensemble of pre-processing techniques shows clear value across multiple diseases, datasets and biomarkers. Importantly, ensemble classification improves biomarkers with initially good results but does not result in spuriously improved performance for poor biomarkers. While further research is required, this approach has the potential to become a standard for transcriptomic biomarkers. PMID:24902696
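An ensemble over pre-processing techniques can be as simple as running one classifier on each pre-processed version of a sample and majority-voting the resulting labels. The sketch below follows that generic formulation, not the authors' exact aggregation rule:

```python
from collections import Counter

def ensemble_classify(preprocessors, classify, sample):
    """Run the same classifier on versions of a sample produced by
    different pre-processing functions, then majority-vote the labels."""
    votes = [classify(pre(sample)) for pre in preprocessors]
    return Counter(votes).most_common(1)[0][0]
```

The appeal is that no single pre-processing pipeline has to be "right": as long as a majority of pipelines preserve the signal, the voted label is stable even when individual pipelines flip borderline cases.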
An improved correlation method for determining the period of a torsion pendulum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo Jie; Wang Dianhong
Considering the variation of environmental temperature and the inhomogeneity of the background gravitational field, an improved correlation method was proposed to determine the variational period of a torsion pendulum with high precision. Processing of experimental data shows that the uncertainty in determining the period with this method is improved about twofold compared with the traditional correlation method, which is significant for the determination of the gravitational constant with the time-of-swing method.
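A basic correlation-based period estimate, which the improved method refines, searches for the lag that maximizes the autocorrelation of the mean-removed signal. This is the textbook formulation, not the paper's improved variant; the search-window bounds are illustrative.

```python
def autocorr_period(samples, dt, min_lag, max_lag):
    """Estimate the oscillation period as the lag (within the given
    window) that maximizes the autocorrelation of the mean-removed data."""
    n = len(samples)
    mean = sum(samples) / n
    x = [s - mean for s in samples]

    def r(lag):
        # average lagged product, normalized by the number of terms
        return sum(x[i] * x[i + lag] for i in range(n - lag)) / (n - lag)

    best_lag = max(range(min_lag, max_lag), key=r)
    return best_lag * dt
```

The improvement the abstract reports targets exactly the weaknesses of this baseline: slow temperature drift and background-field inhomogeneity distort the correlation peak, inflating the period uncertainty.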
Processing the image gradient field using a topographic primal sketch approach.
Gambaruto, A M
2015-03-01
The spatial derivatives of the image intensity provide topographic information that may be used to identify and segment objects. The accurate computation of the derivatives is often hampered in medical images by the presence of noise and a limited resolution. This paper focuses on accurate computation of spatial derivatives and their subsequent use to process an image gradient field directly, from which an image with improved characteristics can be reconstructed. The improvements include noise reduction, contrast enhancement, thinning object contours and the preservation of edges. Processing the gradient field directly instead of the image is shown to have numerous benefits. The approach is developed such that the steps are modular, allowing the overall method to be improved and possibly tailored to different applications. As presented, the approach relies on a topographic representation and primal sketch of an image. Comparisons with existing image processing methods on a synthetic image and different medical images show improved results and accuracy in segmentation. Here, the focus is on objects with low spatial resolution, which is often the case in medical images. The methods developed show the importance of improved accuracy in derivative calculation and the potential in processing the image gradient field directly. Copyright © 2015 John Wiley & Sons, Ltd.
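Processing the gradient field directly requires a way to integrate the (possibly modified) gradients back into an image. A minimal sketch, assuming periodic boundaries and forward-difference gradients (the paper's actual reconstruction scheme is not specified here), solves the discrete Poisson equation via FFT:

```python
import numpy as np

def reconstruct_from_gradient(gx, gy):
    """Integrate a gradient field (gx, gy) back to an image by solving the
    discrete Poisson equation with periodic boundaries via FFT. The image is
    recovered up to an additive constant (its mean is set to zero)."""
    # divergence of the gradient field = discrete Laplacian of the image
    div = (gx - np.roll(gx, 1, axis=1)) + (gy - np.roll(gy, 1, axis=0))
    H, W = div.shape
    ky = 2.0 * np.cos(2.0 * np.pi * np.arange(H) / H)[:, None]
    kx = 2.0 * np.cos(2.0 * np.pi * np.arange(W) / W)[None, :]
    denom = kx + ky - 4.0          # eigenvalues of the periodic 5-point Laplacian
    denom[0, 0] = 1.0              # avoid division by zero at DC
    U = np.fft.fft2(div) / denom
    U[0, 0] = 0.0                  # the mean is unrecoverable; fix it to zero
    return np.real(np.fft.ifft2(U))
```

Noise reduction or contrast enhancement would then amount to attenuating or amplifying gradient magnitudes before the call to `reconstruct_from_gradient`.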
Application Process Improvement Yields Results.
ERIC Educational Resources Information Center
Holesovsky, Jan Paul
1995-01-01
After a continuing effort to improve its grant application process, the department of medical microbiology and immunology at the University of Wisconsin-Madison is submitting many more applications and realizing increased funding. The methods and strategy used to make the process more efficient and effective are outlined. (Author/MSE)
Improving operational anodising process performance using simulation approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liong, Choong-Yeun, E-mail: lg@ukm.edu.my; Ghazali, Syarah Syahidah, E-mail: syarah@gapps.kptm.edu.my
The use of aluminium is very widespread, especially in the transportation, electrical and electronics, architectural, automotive and engineering sectors. The anodising process is therefore important for making aluminium durable, attractive and weather resistant. This research focuses on the anodising operations in the manufacture and supply of aluminium extrusions. The data required for developing the model were collected from observations and interviews conducted in the study. To study the current system, the anodising operations were modelled using Arena 14.5 simulation software. They comprise five main processes, namely degreasing, etching, desmutting, anodising and sealing, together with 16 other processes. The results obtained were analysed to identify the problems or bottlenecks that occurred and to propose improvement methods that could be implemented on the original model. Based on comparisons between the improvement methods, productivity could be increased by reallocating the workers and reducing loading time.
Multithreading with separate data to improve the performance of Backpropagation method
NASA Astrophysics Data System (ADS)
Dhamma, Mulia; Zarlis, Muhammad; Budhiarti Nababan, Erna
2017-12-01
Backpropagation is an artificial neural network method that can make predictions for new data through supervised learning from past data. The learning process becomes slow when the backpropagation method is given too much data to learn. Multithreading, with the data separated across the threads, is used here to improve the performance of the backpropagation method. Based on experiments with 39 data records, each repeated five times with the data separated across 2 threads, the average number of epochs was 6,490 with 2 threads versus 453,049 with 1 thread. The lowest epoch count was 1,295 for 2 threads and 356,116 for 1 thread. The improvement comes from comparing the minimum errors of the 2 threads and taking the weight and bias values of the better thread; this process repeats for as long as the backpropagation is learning.
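The scheme above can be sketched with a toy model: two threads each take a gradient step on their half of the data, then both adopt the weights and bias of the lower-error half. A single logistic unit stands in for the full backpropagation network, and the data, learning rate, and epoch count are illustrative; note that CPython threads share the GIL, so this shows the scheme rather than a real parallel speed-up.

```python
import threading
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_split(X, y, epochs=1000, lr=2.0):
    """Each epoch, two threads take one gradient step on their half of the
    data; afterwards both adopt the weights of the lower-error half."""
    halves = [(X[::2], y[::2]), (X[1::2], y[1::2])]
    w = [np.zeros(X.shape[1]), np.zeros(X.shape[1])]
    b = [0.0, 0.0]
    err = [0.0, 0.0]

    def step(i):
        Xi, yi = halves[i]
        p = sigmoid(Xi @ w[i] + b[i])
        w[i] = w[i] - lr * Xi.T @ (p - yi) / len(yi)   # mean log-loss gradient
        b[i] = b[i] - lr * np.mean(p - yi)
        err[i] = np.mean((p - yi) ** 2)                # error used for comparison

    for _ in range(epochs):
        threads = [threading.Thread(target=step, args=(i,)) for i in range(2)]
        for th in threads: th.start()
        for th in threads: th.join()
        best = int(np.argmin(err))                     # keep the better weights/bias
        w = [w[best].copy(), w[best].copy()]
        b = [b[best], b[best]]
    return w[0], b[0]
```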
NASA Astrophysics Data System (ADS)
Vajedian, S.; Motagh, M.; Nilfouroushan, F.
2013-09-01
InSAR's capacity to detect slow deformation over terrain is limited by temporal and geometric decorrelation. Multitemporal InSAR techniques, involving Persistent Scatterer InSAR (PS-InSAR) and Small Baseline Subset (SBAS) methods, have recently been developed to compensate for decorrelation problems. Geometric decorrelation in mountainous areas, especially for Envisat images, makes the phase unwrapping process difficult. To address this, we first modified the phase filtering to make the wrapped phase image as smooth as possible. In addition, a modified unwrapping method was developed to improve the unwrapping results. This method includes removing possible orbital and tropospheric effects: topographic correction is performed within three-dimensional unwrapping, while orbital and tropospheric corrections are applied after the unwrapping process. To evaluate the effectiveness of the improved method, we tested the proposed algorithm on Envisat and ALOS datasets and compared our results with the recently developed persistent scatterer software StaMPS. We also used GPS observations to evaluate the modified method. The results indicate that our method improves the estimated deformation significantly.
Scandurra, Isabella; Hägglund, Maria
2009-01-01
Introduction Integrated care involves different professionals, belonging to different care provider organizations and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method Based on Human-computer Interaction Science and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].
McCarty, L Kelsey; Saddawi-Konefka, Daniel; Gargan, Lauren M; Driscoll, William D; Walsh, John L; Peterfreund, Robert A
2014-12-01
Process improvement in healthcare delivery settings can be difficult, even when there is consensus among clinicians about a clinical practice or desired outcome. Airway management is a medical intervention fundamental to the delivery of anesthesia care. Like other medical interventions, a detailed description of the management methods should be documented. Despite this expectation, airway documentation is often insufficient. The authors hypothesized that formal adoption of process improvement methods could be used to increase the rate of "complete" airway management documentation. The authors defined a set of criteria as a local practice standard of "complete" airway management documentation. The authors then employed selected process improvement methodologies over 13 months in three iterative and escalating phases to increase the percentage of records with complete documentation. The criteria were applied retrospectively to determine the baseline frequency of complete records, and prospectively to measure the impact of process improvements efforts over the three phases of implementation. Immediately before the initial intervention, a retrospective review of 23,011 general anesthesia cases over 6 months showed that 13.2% of patient records included complete documentation. At the conclusion of the 13-month improvement effort, documentation improved to a completion rate of 91.6% (P<0.0001). During the subsequent 21 months, the completion rate was sustained at an average of 90.7% (SD, 0.9%) across 82,571 general anesthetic records. Systematic application of process improvement methodologies can improve airway documentation and may be similarly effective in improving other areas of anesthesia clinical practice.
A Mixed-Methods Research Framework for Healthcare Process Improvement.
Bastian, Nathaniel D; Munoz, David; Ventura, Marta
2016-01-01
The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.
Studies on Tasar Cocoon Cooking Using Permeation Method
NASA Astrophysics Data System (ADS)
Javali, Uday C.; Malali, Kiran B.; Ramya, H. G.; Naik, Subhas V.; Padaki, Naveen V.
2018-02-01
Cocoon cooking is an important process before reeling of tasar silk yarn. Cooking loosens the filaments in tasar cocoons, easing yarn withdrawal during the reeling process. Tasar cocoons have a very hard shell, so they need a chemical cooking process to loosen the silk filaments. An attempt has been made in this article to study the effect of using a vacuum permeation chamber for tasar cocoon cooking in order to reduce the cooking time and improve the quality of tasar silk yarn. The vacuum-assisted permeation cooking method has been studied on tasar daba cocoons for cooking efficiency, deflossing and reelability, and its efficiency has been evaluated against different cooking methods, viz. the traditional and open-pan cooking methods. The tasar silk produced after reeling was tested for fineness, strength and cohesion properties. Results indicate that the permeation method of tasar cooking ensures uniform cooking with higher efficiency, along with better reeling performance and improved yarn properties.
2011-01-01
Background Segmentation is the most crucial part of computer-aided bone age assessment. A well-known type of segmentation performed in such systems is adaptive segmentation. While providing better results than global thresholding, adaptive segmentation produces a lot of unwanted noise that can affect the later process of epiphysis extraction. Methods A method with anisotropic diffusion as pre-processing and a novel Bounded Area Elimination (BAE) post-processing algorithm is proposed to improve the ossification site localization technique, with the intent of improving the adaptive segmentation result and the region-of-interest (ROI) localization accuracy. Results The results are evaluated by quantitative and qualitative analysis using texture feature evaluation. Image homogeneity after anisotropic diffusion improved by an average of 17.59% for each age group. Experiments showed that smoothness improved by an average of 35% after the BAE algorithm, and ROI localization improved by an average of 8.19%. The MSSIM improved by an average of 10.49% after performing the BAE algorithm on the adaptively segmented hand radiographs. Conclusions The results indicate that hand radiographs which have undergone anisotropic diffusion have greatly reduced noise in the segmented image, and that the proposed BAE algorithm is capable of removing the artifacts generated in adaptive segmentation. PMID:21952080
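The anisotropic diffusion pre-processing named above is typically the classic Perona-Malik scheme: smooth within regions while preserving edges, since conduction drops where the gradient is large. A compact sketch, with periodic border handling via `np.roll` and illustrative parameter values:

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, lam=0.2):
    """Perona-Malik diffusion with exponential conduction coefficient.
    lam <= 0.25 keeps the explicit 4-neighbour scheme stable."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences to the four neighbours (periodic borders for brevity)
        dN = np.roll(u, 1, 0) - u
        dS = np.roll(u, -1, 0) - u
        dE = np.roll(u, -1, 1) - u
        dW = np.roll(u, 1, 1) - u
        # conduction coefficients: near 1 for small differences (noise),
        # near 0 across strong edges, so edges are preserved
        c = lambda d: np.exp(-(d / kappa) ** 2)
        u += lam * (c(dN) * dN + c(dS) * dS + c(dE) * dE + c(dW) * dW)
    return u
```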
A Study on Improving Information Processing Abilities Based on PBL
ERIC Educational Resources Information Center
Kim, Du Gyu; Lee, JaeMu
2014-01-01
This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…
Pre-processing by data augmentation for improved ellipse fitting.
Kumar, Pankaj; Belchamber, Erika R; Miklavcic, Stanley J
2018-01-01
Ellipse fitting is a highly researched and mature topic. Surprisingly, however, no existing method has thus far considered the data point eccentricity in its ellipse fitting procedure. Here, we introduce the concept of eccentricity of a data point, in analogy with the idea of ellipse eccentricity. We then show empirically that, irrespective of ellipse fitting method used, the root mean square error (RMSE) of a fit increases with the eccentricity of the data point set. The main contribution of the paper is based on the hypothesis that if the data point set were pre-processed to strategically add additional data points in regions of high eccentricity, then the quality of a fit could be improved. Conditional validity of this hypothesis is demonstrated mathematically using a model scenario. Based on this confirmation we propose an algorithm that pre-processes the data so that data points with high eccentricity are replicated. The improvement of ellipse fitting is then demonstrated empirically in real-world application of 3D reconstruction of a plant root system for phenotypic analysis. The degree of improvement for different underlying ellipse fitting methods as a function of data noise level is also analysed. We show that almost every method tested, irrespective of whether it minimizes algebraic error or geometric error, shows improvement in the fit following data augmentation using the proposed pre-processing algorithm.
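The pre-processing idea above — replicate data points in regions of high eccentricity before fitting — can be sketched as below. The "far from centroid" proxy for the paper's data-point eccentricity, the replication factor, and the algebraic fit normalization are all simplifying assumptions, not the authors' algorithm.

```python
import numpy as np

def fit_conic(x, y):
    # algebraic least-squares fit of a x^2 + b xy + c y^2 + d x + e y = 1
    A = np.column_stack([x * x, x * y, y * y, x, y])
    coef, *_ = np.linalg.lstsq(A, np.ones_like(x), rcond=None)
    return coef

def augment_and_fit(x, y, reps=5):
    """Replicate the points farthest from the centroid (roughly the ends of
    the major axis, where eccentricity effects are strongest) before fitting."""
    cx, cy = x.mean(), y.mean()
    r = np.hypot(x - cx, y - cy)
    far = r > np.percentile(r, 75)          # crude high-eccentricity proxy
    xa = np.concatenate([x] + [x[far]] * reps)
    ya = np.concatenate([y] + [y[far]] * reps)
    return fit_conic(xa, ya)
```

Replication re-weights the least-squares problem toward the replicated region, which is the mechanism the paper exploits.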
An Improved Aerial Target Localization Method with a Single Vector Sensor
Zhao, Anbang; Bi, Xuejie; Hui, Juan; Zeng, Caigao; Ma, Lin
2017-01-01
This paper focuses on problems encountered in processing real data with existing aerial target localization methods, analyzes their causes, and proposes an improved algorithm. Processing of sea-experiment data shows that the existing algorithms place high demands on the accuracy of the angle estimation. The improved algorithm relaxes these accuracy requirements and obtains robust estimates. A closest-distance matching estimation algorithm and a horizontal distance estimation compensation algorithm are proposed. The smoothing of the post-processed data is improved by a forward-backward double-filtering method, so that the initial-stage data can also be filtered and the filtering results retain more useful information. Aerial target height measurement methods are studied and estimation results are given, realizing three-dimensional localization of the aerial target. This increases an underwater platform's awareness of aerial targets, giving it better mobility and concealment. PMID:29135956
Research on the raw data processing method of the hydropower construction project
NASA Astrophysics Data System (ADS)
Tian, Zhichao
2018-01-01
In this paper, based on the characteristics of quota data, various mathematical statistical analysis methods are compared and the improved Grubbs criterion is chosen to analyze the data, with unsuitable values eliminated through the data processing. It is shown that this method can be applied to the processing of fixed raw data. This paper provides a reference for reasonably determining effective quota analysis data.
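The standard Grubbs criterion that the paper builds on can be sketched as follows; the critical values are an excerpt from the standard two-sided table at α = 0.05, and the paper's improvement is not reproduced here.

```python
import numpy as np

# two-sided Grubbs critical values at alpha = 0.05 (standard table excerpt)
G_CRIT = {5: 1.715, 6: 1.887, 7: 2.020, 8: 2.126, 9: 2.215, 10: 2.290}

def grubbs_outlier(data):
    """Return the index of the most extreme value if Grubbs' statistic
    G = max|x_i - mean| / s exceeds the critical value, else None."""
    x = np.asarray(data, float)
    n = len(x)
    i = int(np.argmax(np.abs(x - x.mean())))
    G = abs(x[i] - x.mean()) / x.std(ddof=1)
    return i if G > G_CRIT[n] else None
```

In practice the test is applied iteratively: remove a detected outlier, then re-test the remaining data.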
Boe, Debra Thingstad; Parsons, Helen
2009-01-01
Local public health agencies are challenged to continually improve service delivery, yet they frequently operate with constrained resources. Quality improvement methods and techniques such as statistical process control are commonly used in other industries, and they have recently been proposed as a means of improving service delivery and performance in public health settings. We analyzed a quality improvement project undertaken at a local Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) clinic to reduce waiting times and improve client satisfaction with a walk-in nutrition education service. We used statistical process control techniques to evaluate initial process performance, implement an intervention, and assess process improvements. We found that implementation of these techniques significantly reduced waiting time and improved clients' satisfaction with the WIC service. PMID:19608964
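Statistical process control of waiting times typically starts with an individuals (I) chart; a minimal sketch follows. The 2.66 factor is the standard I-chart constant (3/d2 with d2 = 1.128); the waiting-time data in the test are invented, not from the WIC study.

```python
import numpy as np

def individuals_chart(x):
    """Individuals control chart: centre line at the mean, 3-sigma limits at
    mean +/- 2.66 * average moving range. Returns the limits and the indices
    of out-of-control points."""
    x = np.asarray(x, float)
    mr_bar = np.mean(np.abs(np.diff(x)))      # average moving range
    center = x.mean()
    ucl = center + 2.66 * mr_bar
    lcl = center - 2.66 * mr_bar
    out = np.where((x > ucl) | (x < lcl))[0]
    return center, lcl, ucl, out
```

Points outside the limits signal special-cause variation worth investigating; an intervention's effect shows up as a shift in the centre line and narrower limits on a re-baselined chart.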
A Study on Micropipetting Detection Technology of Automatic Enzyme Immunoassay Analyzer.
Shang, Zhiwu; Zhou, Xiangping; Li, Cheng; Tsai, Sang-Bing
2018-04-10
In order to improve the accuracy and reliability of micropipetting, a detection and calibration method is proposed that combines dynamic pressure monitoring of the pipetting process with image-based quantitative identification of the pipetted volume. First, a normalized pressure model for the pipetting process is established from the kinematic model of the pipetting operation, and the model is corrected experimentally. The pressure and its first derivative are monitored in real time during pipetting, a segmented double-threshold method is used as the fault evaluation criterion, and the pressure sensor data are processed by Kalman filtering, which improves the accuracy of fault diagnosis. When a fault occurs, an image of the pipette tip is captured by the camera, the boundary of the liquid region is extracted by a background contrast method, and the liquid volume in the tip is obtained from the geometric characteristics of the tip. The deviation is fed back to the automatic pipetting module and corrected. Titration tests show that combining the segmented pipetting kinematic model with double-threshold pressure monitoring can effectively judge and classify pipetting faults in real time, and that closed-loop adjustment of the pipetting volume effectively improves the accuracy and reliability of the pipetting system.
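The pressure-derivative monitoring can be sketched as a scalar Kalman filter followed by a two-threshold check on the filtered derivative. The random-walk state model, noise parameters, and thresholds below are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def kalman_1d(z, q=1e-4, r=0.01):
    """Scalar Kalman filter with a random-walk state model
    (process noise q, measurement noise r)."""
    x, p = z[0], 1.0
    out = []
    for zi in z:
        p += q                    # predict
        k = p / (p + r)           # Kalman gain
        x += k * (zi - x)         # update with innovation
        p *= (1.0 - k)
        out.append(x)
    return np.array(out)

def detect_fault(pressure, dt, lo, hi):
    """Flag samples whose filtered pressure derivative leaves the [lo, hi]
    band -- a sketch of a double-threshold fault criterion."""
    d = np.gradient(kalman_1d(np.asarray(pressure, float)), dt)
    return (d < lo) | (d > hi)
```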
An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks
NASA Astrophysics Data System (ADS)
Zhao, Peng-yuan; Huang, Xiao-ping
2018-03-01
Errors arise when the fatigue damage of details in liquid cargo tanks is calculated with the traditional spectral analysis method, which assumes a linear system, because the relationship between the dynamic stress and the ship acceleration is nonlinear. An improved spectral analysis method for assessing the fatigue damage of details in liquid cargo tanks is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This method takes the nonlinear relationship into account, and the fatigue damage is then calculated from the stress PSD. Taking an independent tank in an LNG carrier as an example, the improved spectral analysis method is shown to be much more accurate than the traditional one by comparing the calculated damage with results from the time-domain method. The proposed spectral analysis method is thus more accurate for calculating fatigue damage in details of ship liquid cargo tanks.
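The key superposition step (adding the amplitudes of harmonic components with the same frequency) only makes sense if the components are added as phasors, i.e., with their phases. A sketch of that step, with the frequency grid and the conversion to per-line spectral ordinates as assumptions:

```python
import numpy as np

def combine_harmonics(components, freqs):
    """Add the complex amplitudes of stress harmonics that share a frequency
    line, then return the variance contribution |A|^2 / 2 per line.
    components: iterable of (frequency, amplitude, phase) tuples."""
    A = np.zeros(len(freqs), dtype=complex)
    for f, a, ph in components:
        i = int(np.argmin(np.abs(freqs - f)))   # nearest frequency line
        A[i] += a * np.exp(1j * ph)             # same-frequency terms add as phasors
    return np.abs(A) ** 2 / 2.0
```

The fatigue damage then follows from the resulting stress spectrum via the usual spectral fatigue formulas (spectral moments and an S-N curve).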
Training for Template Creation: A Performance Improvement Method
ERIC Educational Resources Information Center
Lyons, Paul
2008-01-01
Purpose: There are three purposes to this article: first, to offer a training approach to employee learning and performance improvement that makes use of a step-by-step process of skill/knowledge creation. The process offers follow-up opportunities for skill maintenance and improvement; second, to explain the conceptual bases of the approach; and…
Interactive Management and Updating of Spatial Data Bases
NASA Technical Reports Server (NTRS)
French, P.; Taylor, M.
1982-01-01
The decision making process, whether for power plant siting, load forecasting or energy resource planning, invariably involves a blend of analytical methods and judgement. Management decisions can be improved by implementing techniques which permit an increased comprehension of results from analytical models. Even where analytical procedures are not required, decisions can be aided by improving the methods used to examine spatially and temporally variant data. How the use of computer aided planning (CAP) programs and the selection of a predominant data structure can improve the decision making process is discussed.
Quality Improvement on the Acute Inpatient Psychiatry Unit Using the Model for Improvement
Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean
2013-01-01
Background A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. Methods We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients—those starting or continuing on standing neuroleptics—with the Abnormal Involuntary Movement Scale (AIMS). Results After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Conclusion Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team. PMID:24052768
Stuit, Marco; Wortmann, Hans; Szirbik, Nick; Roodenburg, Jan
2011-12-01
In the healthcare domain, human collaboration processes (HCPs), which consist of interactions between healthcare workers from different (para)medical disciplines and departments, are of growing importance as healthcare delivery becomes increasingly integrated. Existing workflow-based process modelling tools for healthcare process management, which are the most commonly applied, are not suited for healthcare HCPs mainly due to their focus on the definition of task sequences instead of the graphical description of human interactions. This paper uses a case study of a healthcare HCP at a Dutch academic hospital to evaluate a novel interaction-centric process modelling method. The HCP under study is the care pathway performed by the head and neck oncology team. The evaluation results show that the method brings innovative, effective, and useful features. First, it collects and formalizes the tacit domain knowledge of the interviewed healthcare workers in individual interaction diagrams. Second, the method automatically integrates these local diagrams into a single global interaction diagram that reflects the consolidated domain knowledge. Third, the case study illustrates how the method utilizes a graphical modelling language for effective tree-based description of interactions, their composition and routing relations, and their roles. A process analysis of the global interaction diagram is shown to identify HCP improvement opportunities. The proposed interaction-centric method has wider applicability since interactions are the core of most multidisciplinary patient-care processes. A discussion argues that, although (multidisciplinary) collaboration is in many cases not optimal in the healthcare domain, it is increasingly considered a necessity to improve integration, continuity, and quality of care. The proposed method is helpful to describe, analyze, and improve the functioning of healthcare collaboration. Copyright © 2011 Elsevier Inc. All rights reserved.
[Sustainable process improvement with application of 'lean philosophy'].
Rouppe van der Voort, Marc B V; van Merode, G G Frits; Veraart, Henricus G N
2013-01-01
Process improvement is increasingly being implemented, particularly with the aid of 'lean philosophy'. This management philosophy aims to improve quality by reducing 'wastage'. Local improvements can produce negative effects elsewhere due to interdependence of processes. An 'integrated system approach' is required to prevent this. Some hospitals claim that this has been successful. Research into process improvement with the application of lean philosophy has reported many positive effects, defined as improved safety, quality and efficiency. Due to methodological shortcomings and lack of rigorous evaluations it is, however, not yet possible to determine the impact of this approach. It is, however, obvious that the investigated applications are fragmentary, with a dominant focus on the instrumental aspect of the philosophy and a lack of integration in a total system, and with insufficient attention to human aspects. Process improvement is required to achieve better and more goal-oriented healthcare. To achieve this, hospitals must develop integrated system approaches that combine methods for process design with continuous improvement of processes and with personnel management. It is crucial that doctors take the initiative to guide and improve processes in an integral manner.
Computer-Based Enhancements for the Improvement of Learning.
ERIC Educational Resources Information Center
Tennyson, Robert D.
The third of four symposium papers argues that, if instructional methods are to improve learning, they must have two aspects: a direct trace to a specific learning process, and empirical support that demonstrates their significance. Focusing on the tracing process, the paper presents an information processing model of learning that can be used by…
Sanchez-Lite, Alberto; Garcia, Manuel; Domingo, Rosario; Angel Sebastian, Miguel
2013-01-01
Background Musculoskeletal disorders (MSDs) that result from poor ergonomic design are one of the occupational disorders of greatest concern in the industrial sector. A key advantage in the primary design phase is to focus on a method of assessment that detects and evaluates the potential risks experienced by the operative when faced with these types of physical injuries. The method of assessment will improve the process design by identifying potential ergonomic improvements from various design alternatives or activities undertaken as part of the cycle of continuous improvement throughout the differing phases of the product life cycle. Methodology/Principal Findings This paper presents a novel postural assessment method (NERPA) fit for product-process design, which was developed with the help of a digital human model together with a 3D CAD tool, which is widely used in the aeronautic and automotive industries. The power of 3D visualization and the possibility of studying the actual assembly sequence in a virtual environment can allow the functional performance of the parts to be addressed. Such tools can also provide us with an ergonomic workstation design, together with a competitive advantage in the assembly process. Conclusions The method developed was used in the design of six production lines, studying 240 manual assembly operations and improving 21 of them. This study demonstrated the proposed method’s usefulness and found statistically significant differences in the evaluations of the proposed method and the widely used Rapid Upper Limb Assessment (RULA) method. PMID:23977340
Jain, Rahi; Venkatasubramanian, Padma
2014-01-01
Quality Ayurvedic herbal medicines are potential, low-cost solutions for addressing contemporary healthcare needs of both Indian and global community. Correlating Ayurvedic herbal preparations with modern processing principles (MPPs) can help develop new and use appropriate technology for scaling up production of the medicines, which is necessary to meet the growing demand. Understanding the fundamental Ayurvedic principles behind formulation and processing is also important for improving the dosage forms. Even though Ayurvedic industry has adopted technologies from food, chemical and pharmaceutical industries, there is no systematic study to correlate the traditional and modern processing methods. This study is an attempt to provide a possible correlation between the Ayurvedic processing methods and MPPs. A systematic literature review was performed to identify the Ayurvedic processing methods by collecting information from English editions of classical Ayurveda texts on medicine preparation methods. Correlation between traditional and MPPs was done based on the techniques used in Ayurvedic drug processing. It was observed that in Ayurvedic medicine preparations there were two major types of processes, namely extraction, and separation. Extraction uses membrane rupturing and solute diffusion principles, while separation uses volatility, adsorption, and size-exclusion principles. The study provides systematic documentation of methods used in Ayurveda for herbal drug preparation along with its interpretation in terms of MPPs. This is the first step which can enable improving or replacing traditional techniques. New technologies or use of existing technologies can be used to improve the dosage forms and scaling up while maintaining the Ayurvedic principles similar to traditional techniques.
An Improved Experimental Method for Simulating Erosion Processes by Concentrated Channel Flow
Chen, Xiao-Yan; Zhao, Yu; Mo, Bin; Mi, Hong-Xing
2014-01-01
Rill erosion is an important process on hill slopes, including sloped farmland, and laboratory simulations have been vital to understanding it. Previous experiments obtained sediment yields from rills of various lengths to characterize the sedimentation process, which disrupted the continuity of the rill erosion process and was time-consuming. In this study, an improved experimental method was used to measure rill erosion processes driven by concentrated channel flow. A laboratory platform, 12 m long and 3 m wide, was used to construct rills 0.1 m wide and 12 m long for experiments under five slope gradients (5, 10, 15, 20, and 25 degrees) and three flow rates (2, 4, and 8 L min−1). Sediment-laden water was sampled simultaneously along the rill at 0.5, 1, 2, 3, 4, 5, 6, 7, 8, 10, and 12 m from the water inlet to determine the sediment concentration distribution. The rill erosion process measured by this method agrees closely with that measured by previous experimental methods. The experimental data indicated that sediment concentration increases with slope gradient and flow rate, which highlights the hydraulic influence on rill erosion. Sediment concentration increased rapidly in the initial section of the rill, and its rate of increase declined with rill length. Overall, both experimental methods are feasible and applicable; however, the method proposed in this study is more efficient and easier to operate, and should be useful in related research. PMID:24949621
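The concentration behaviour described above, a rapid initial rise whose rate declines with distance down the rill, can be sketched as a simple first-order approach toward a transport capacity. The capacity and length-scale values below are hypothetical, not taken from the study:

```python
import math

def sediment_concentration(x, capacity, length_scale):
    """First-order approach of sediment concentration toward a
    transport capacity along the rill (hypothetical parameters)."""
    return capacity * (1.0 - math.exp(-x / length_scale))

# Sampling stations used in the study (m from the water inlet).
stations = [0.5, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12]
profile = [sediment_concentration(x, capacity=120.0, length_scale=2.5)
           for x in stations]

# The gain between adjacent stations falls off along the rill.
initial_gain = profile[1] - profile[0]
final_gain = profile[-1] - profile[-2]
```

The profile rises steeply near the inlet and flattens toward capacity, matching the qualitative trend reported in the abstract.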
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Yanmei; Li, Xinli; Bai, Yan
The measurement of multiphase flow parameters is of great importance in a wide range of industries. In multiphase flow measurement, the signals from the sensors are extremely weak and often buried in strong background noise, so it is desirable to develop effective signal processing techniques that can detect the weak signal in the sensor outputs. In this paper, two methods, the lock-in amplifier (LIA) and an improved Duffing chaotic oscillator, are compared for detecting and processing the weak signal. For a sinusoidal signal buried in noise, correlation detection with a sinusoidal reference signal is simulated using the LIA. The improved Duffing chaotic oscillator method, which is based on the Wigner transformation, can restore the signal waveform and detect its frequency. The two methods are then combined to detect and extract the weak signal. Simulation results show the effectiveness and accuracy of the proposed improved method. The comparative analysis shows that the improved Duffing chaotic oscillator method strongly suppresses noise because it is sensitive to initial conditions.
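Correlation detection with a sinusoidal reference, the operation a lock-in amplifier performs, can be sketched in a few lines: multiplying the noisy record by quadrature references and averaging suppresses noise that is uncorrelated with the reference. The signal and noise parameters below are illustrative only, not from the paper's simulations:

```python
import math
import random

def lock_in_amplitude(signal, freq, fs):
    """Dual-phase lock-in detection: correlate the record with
    quadrature references and low-pass by averaging over the record."""
    n = len(signal)
    i_sum = q_sum = 0.0
    for k, s in enumerate(signal):
        t = k / fs
        i_sum += s * math.sin(2 * math.pi * freq * t)
        q_sum += s * math.cos(2 * math.pi * freq * t)
    # Factor 2 recovers the peak amplitude of the sinusoid.
    return 2.0 * math.hypot(i_sum / n, q_sum / n)

random.seed(0)
fs, freq, amp = 1000.0, 50.0, 0.02            # weak 50 Hz tone
signal = [amp * math.sin(2 * math.pi * freq * k / fs)
          + random.gauss(0.0, 0.5)            # noise std 25x the amplitude
          for k in range(20000)]              # integer number of periods
recovered = lock_in_amplitude(signal, freq, fs)
```

Even with noise far stronger than the tone, the recovered amplitude lands close to the true 0.02 because the averaging gain grows with record length.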
IRB Process Improvements: A Machine Learning Analysis.
Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A
2017-06-01
Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single-variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on the initially identified predictors, changes to IRB workflow and staffing procedures were instituted, and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
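The single-variable analysis mentioned above can be illustrated by grouping submission timelines by a candidate predictor and comparing group medians. The records and field values below are hypothetical, not drawn from the study's data:

```python
from statistics import median

# Hypothetical records: (review_type, va_purview, days_to_approval).
submissions = [
    ("expedited",  False, 21), ("expedited",  False, 18),
    ("expedited",  True,  35), ("full_board", False, 62),
    ("full_board", True,  88), ("full_board", False, 55),
    ("exempt",     False,  9), ("exempt",     False, 12),
]

def median_days_by(records, key):
    """Median processing time per group defined by `key`."""
    groups = {}
    for rec in records:
        groups.setdefault(key(rec), []).append(rec[2])
    return {k: median(v) for k, v in groups.items()}

by_review_type = median_days_by(submissions, lambda r: r[0])
by_va_purview = median_days_by(submissions, lambda r: r[1])
```

Large median gaps between groups (e.g. full-board versus exempt review, or VA versus non-VA protocols) flag a variable as a candidate predictor worth feeding into the machine learning stage.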
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of an investment plan, providing a detailed examination of the likely and foreseeable impacts of a proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions, namely flood protection structures, using risk analysis methods. Applying the methodology designed for the environmental impact assessment process establishes assumptions for further improvements or for more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process; through the use of risk analysis methods in that process, this objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.
Improved performance of mesostructured perovskite solar cells via an anti-solvent method
NASA Astrophysics Data System (ADS)
Hao, Jiabin; Hao, Huiying; Cheng, Feiyu; Li, Jianfeng; Zhang, Haiyu; Dong, Jingjing; Xing, Jie; Liu, Hao; Wu, Jian
2018-06-01
The one-step solution process is a facile and widely used procedure for preparing organic-inorganic perovskite materials. However, the poor surface morphology of the resulting films, attributable to uncontrollable nucleation and crystal growth during the process, is unfavorable for solar cells. In this study, an anti-solvent treatment during the one-step solution process, in which ethyl acetate (EA) was dropped onto the sample while the precursor solution containing CH3NH3Cl was being spun, was adopted to fabricate perovskite materials and solar cells. The morphology of the perovskite film was significantly improved owing to rapid nucleation followed by slow crystal growth. The modified process enabled fabrication of a mesoporous solar cell with a power conversion efficiency of 14%, an improvement of 40% over the 9.7% efficiency of a device prepared by the conventional one-step method. The effect of annealing time on the morphology, crystal structure, and transport properties of the perovskite layer, as well as on the performance of devices fabricated by the anti-solvent method, was investigated, and a possible mechanism is discussed.
NASA Astrophysics Data System (ADS)
Alfadhlani; Samadhi, T. M. A. Ari; Ma’ruf, Anas; Setiasyah Toha, Isa
2018-03-01
Assembly is a part of the manufacturing process that must be considered at the product design stage. Design for Assembly (DFA) is a method for evaluating a product design in order to make it simpler, easier, and quicker to assemble, thereby reducing assembly cost. This article discusses a framework for developing a computer-based DFA method. The method is expected to aid product designers in extracting data, evaluating the assembly process, and providing recommendations for design improvement. Ideally, these three tasks are performed without interactive steps or user intervention, so that product design evaluation can run automatically. The input to the proposed framework is a 3D solid engineering drawing. Product design evaluation is performed by minimizing the number of components; generating assembly sequence alternatives; selecting the best assembly sequence based on the minimum number of assembly reorientations; and providing suggestions for design improvement.
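Selecting the best assembly sequence by the minimum number of reorientations, as in the evaluation steps above, can be sketched as a brute-force search over feasible sequences. The components, assembly directions, and the precedence constraint below are invented for illustration:

```python
from itertools import permutations

# Hypothetical insertion direction for each component.
direction = {"base": "+z", "shaft": "+z", "clip": "+x", "cover": "+z"}

def reorientations(sequence):
    """Count direction changes, i.e. fixture reorientations, in a sequence."""
    dirs = [direction[c] for c in sequence]
    return sum(1 for a, b in zip(dirs, dirs[1:]) if a != b)

# Assume "base" must be assembled first; enumerate the rest exhaustively.
candidates = [("base",) + p for p in permutations(["shaft", "clip", "cover"])]
best = min(candidates, key=reorientations)
best_cost = reorientations(best)
```

For realistic part counts, an exhaustive search is replaced by precedence-graph pruning or heuristics, but the objective, fewest reorientations, is the same.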
Improving the medical records department processes by lean management
Ajami, Sima; Ketabi, Saeedeh; Sadeghian, Akram; Saghaeinnejad-Isfahani, Sakine
2015-01-01
Background: Lean management is a process improvement technique that identifies wasteful actions and processes in order to eliminate them. Its benefits for healthcare organizations are, first, that the quality of outcomes improves in terms of mistakes and errors, and second, that the time taken by the whole process shortens significantly. Aims: The purpose of this paper is to improve the Medical Records Department (MRD) processes at Ayatolah-Kashani Hospital in Isfahan, Iran, by utilizing Lean management. Materials and Methods: This research was an applied, interventional study. Data were collected by brainstorming, observation, interview, and workflow review. The study population included MRD staff and other expert staff within the hospital who were stakeholders and users of the MRD. Statistical Analysis Used: The MRD staff were first taught the concepts of Lean management and then formed into an MRD Lean team. The team identified and reviewed the current processes; subsequently, they identified wastes and values and proposed solutions. Results: In the MRD units (Archive, Coding, Statistics, and Admission), 17 current processes, 28 wastes, and 11 values were identified, and 27 suggestions were offered for eliminating the wastes. Conclusion: The MRD is a critical department for the hospital information system; the continuous improvement of its services and processes through scientific methods such as Lean management is therefore essential. Originality/Value: The study represents one of the few attempts to eliminate wastes in the MRD. PMID:26097862
NASA Astrophysics Data System (ADS)
Jiang, Junfeng; An, Jianchang; Liu, Kun; Ma, Chunyu; Li, Zhichen; Liu, Tiegen
2017-09-01
We propose a fast positioning algorithm for an asymmetric dual Mach-Zehnder interferometric infrared fiber vibration sensor. Using an approximate derivation method and an envelope detection method, we eliminate the asymmetry of the interference outputs and improve the processing speed. A positioning measurement experiment was carried out to verify the effectiveness of the proposed algorithm. At a sensing length of 85 km, the experimental results show a mean positioning error of 18.9 m and a mean processing time of 116 ms; the processing speed is five times that achievable with the traditional time-frequency analysis-based positioning method.
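In dual Mach-Zehnder sensing, the disturbance position follows from the arrival-time difference between the two counter-propagating interference signals. The sketch below estimates that delay by brute-force cross-correlation and applies a commonly used localization relation; the sampling rate, group velocity, and waveforms are assumed, and the paper's actual algorithm (approximate derivation plus envelope detection) is more elaborate:

```python
def cross_correlation_delay(a, b):
    """Lag (in samples) at which b best matches a; a positive lag
    means b is a delayed copy of a."""
    n = len(a)
    best_lag, best_score = 0, float("-inf")
    for lag in range(-n + 1, n):
        score = sum(a[i - lag] * b[i]
                    for i in range(max(0, lag), min(n, n + lag)))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def vibration_position(delay_s, fiber_length_m, v_group=2.0e8):
    """Common localization relation for dual Mach-Zehnder sensing:
    place the disturbance from the arrival-time difference of the
    two counter-propagating signals (assumed group velocity)."""
    return (fiber_length_m - v_group * delay_s) / 2.0

fs = 1.0e6                                        # assumed 1 MHz sampling
pulse   = [0, 1, 3, 6, 3, 1, 0, 0, 0, 0, 0, 0]
delayed = [0, 0, 0, 0, 1, 3, 6, 3, 1, 0, 0, 0]    # same pulse, 3 samples later
lag = cross_correlation_delay(pulse, delayed)
position_m = vibration_position(lag / fs, fiber_length_m=85000.0)
```

A 3-sample delay at 1 MHz corresponds to 3 µs, which the relation maps to a position along the 85 km fiber.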
Remote Sensing Image Quality Assessment Experiment with Post-Processing
NASA Astrophysics Data System (ADS)
Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.
2018-04-01
This paper describes a post-processing influence assessment experiment comprising three steps: physical simulation, image processing, and image quality assessment. The physical simulation models the sampled imaging system in the laboratory; the imaging system parameters are measured, and the digital images serving as image-processing input are produced by this imaging system with those same parameters. The gathered optically sampled images are then subjected to three digital image processing operations: calibration pre-processing, lossy compression at different compression ratios, and image post-processing with different kernels. The image quality assessment method used is just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of the different imaging parameters and of post-processing on image quality can be determined. The six JND subjective assessment experimental datasets validate one another. The main conclusions are: image post-processing can improve image quality, even in the presence of lossy compression, although image quality improves less at higher compression ratios than at lower ones; and with our image post-processing method, image quality is better when the camera MTF lies within a small range.
2012-01-01
Background There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846
An approach of point cloud denoising based on improved bilateral filtering
NASA Astrophysics Data System (ADS)
Zheng, Zeling; Jia, Songmin; Zhang, Guoliang; Li, Xiuzhi; Zhang, Xiangyin
2018-04-01
An omnidirectional mobile platform is designed for building point clouds, based on an improved filtering algorithm employed to handle the depth image. First, the mobile platform can move flexibly, and its control interface is convenient. Then, because the traditional bilateral filtering algorithm is time-consuming and inefficient, a novel method called local bilateral filtering (LBF) is proposed and applied to the depth images obtained by the Kinect sensor. The results show that noise removal is improved compared with standard bilateral filtering. Offline, the color images and processed depth images are used to build point clouds. Finally, experimental results demonstrate that our method reduces the depth-image processing time and improves the quality of the resulting point cloud.
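The principle behind bilateral filtering, smoothing that preserves edges by weighting neighbors on both spatial closeness and value similarity, can be shown on a 1-D depth profile. This is the standard bilateral filter, not the paper's LBF variant, and all parameters are illustrative:

```python
import math

def bilateral_filter_1d(signal, radius=2, sigma_s=1.5, sigma_r=10.0):
    """Edge-preserving smoothing: each weight combines spatial
    closeness and range (value) similarity, so depth steps survive."""
    out = []
    for i, center in enumerate(signal):
        num = den = 0.0
        for j in range(max(0, i - radius), min(len(signal), i + radius + 1)):
            w = (math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2))
                 * math.exp(-((center - signal[j]) ** 2) / (2 * sigma_r ** 2)))
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out

# A noisy depth step: fluctuations are smoothed, the edge is kept.
depth = [100, 102, 99, 101, 100, 200, 201, 199, 202, 200]
smoothed = bilateral_filter_1d(depth)
edge_jump = smoothed[5] - smoothed[4]
```

Values across the 100-to-200 step receive near-zero range weights, so the edge stays sharp while the small fluctuations on either side are averaged out.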
NASA Astrophysics Data System (ADS)
Budzan, Sebastian
2018-04-01
In this paper, an automatic method of grain detection and classification is presented. As input, it uses a single digital image of milled copper ore obtained with a high-quality digital camera. Because grinding is an extremely energy- and cost-intensive process, granularity evaluation should be performed efficiently and quickly. The method proposed in this paper is based on three-stage image processing. First, all grains are detected using Seeded Region Growing (SRG) segmentation with a proposed adaptive threshold based on the calculation of the Relative Standard Deviation (RSD). In the next step, the detection results are refined using shape information about the detected grains derived from a distance map. Finally, each grain in the sample is classified into one of the predefined granularity classes. The quality of the proposed method was evaluated using samples of nominal granularity, with a comparison to other methods.
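The first stage above, seeded region growing with an RSD-derived adaptive threshold, can be sketched on a small grid. The acceptance rule used here (a threshold equal to the image's RSD times its global mean, i.e. the global standard deviation) is an assumption for illustration; the paper's exact rule may differ:

```python
from collections import deque
from statistics import mean, pstdev

def grow_region(image, seed, k=1.0):
    """Seeded region growing: a 4-connected pixel joins the region if
    it lies within an adaptive threshold of the running region mean."""
    h, w = len(image), len(image[0])
    flat = [p for row in image for p in row]
    rsd = pstdev(flat) / mean(flat)          # relative standard deviation
    threshold = k * rsd * mean(flat)         # equivalently k * global std
    region, values = {seed}, [image[seed[0]][seed[1]]]
    frontier = deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in region:
                if abs(image[nr][nc] - mean(values)) <= threshold:
                    region.add((nr, nc))
                    values.append(image[nr][nc])
                    frontier.append((nr, nc))
    return region

# A bright "grain" on a dark background.
image = [[10, 10, 10, 10],
         [10, 90, 95, 10],
         [10, 92, 94, 10],
         [10, 10, 10, 10]]
grain = grow_region(image, seed=(1, 1))
```

The image-derived threshold lets the same code segment bright grains on a dark background without hand-tuned constants.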
Hue-preserving and saturation-improved color histogram equalization algorithm.
Song, Ki Sun; Kang, Hee; Kang, Moon Gi
2016-06-01
In this paper, an algorithm is proposed to improve contrast and saturation without color degradation. The local histogram equalization (HE) method offers better performance than the global HE method, but it sometimes produces undesirable results due to its block-based processing. The proposed contrast-enhancement (CE) algorithm incorporates the characteristics of the global HE method into the local HE method to avoid these artifacts while enhancing both global and local contrast. There are two common ways to apply a CE algorithm to color images: processing the luminance channel only, or processing each color channel independently. However, these approaches suffer from excessive or reduced saturation and from color degradation. The proposed algorithm solves these problems by using channel-adaptive equalization and the similarity of the ratios between the channels. Experimental results show that the proposed algorithm enhances contrast and saturation while preserving hue, and performs better than existing methods in terms of objective evaluation metrics.
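The global HE building block that the proposed algorithm draws on can be sketched as the classic cumulative-histogram mapping (this is the textbook method, not the paper's channel-adaptive variant):

```python
def equalize(gray, levels=256):
    """Global histogram equalization: map each gray level through the
    normalized cumulative histogram so the output spans the full range."""
    hist = [0] * levels
    for p in gray:
        hist[p] += 1
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = min(c for c in cdf if c > 0)
    n = len(gray)
    return [round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
            for p in gray]

# A low-contrast strip occupying levels 100-103 is stretched to 0-255.
low_contrast = [100, 100, 101, 101, 102, 102, 103, 103]
stretched = equalize(low_contrast)
```

Applied naively per RGB channel, this mapping changes the ratios between channels, which is exactly the hue and saturation distortion the paper's channel-adaptive scheme is designed to avoid.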
Duncan, Fiona; Haigh, Carol
2013-10-01
To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex, and intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design combined process control and quality improvement: routine, prospectively acquired data collection started in 2006, and patients were asked about their pain and the side effects of treatment. Statistical Process Control methods were applied for continuous data analysis, and a multidisciplinary group worked together to identify reasons for variation in the data and to instigate ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean pain score (VNRS) was four. There was no special-cause variation when the data were stratified by surgeon, clinical area, or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management, and its application offers the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that have led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
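The control-chart machinery behind such a project can be sketched as a p-chart for the proportion of patients in severe pain: the center line and 3-sigma limits follow from a binomial model. The monthly counts and sample size below are hypothetical:

```python
import math

def p_chart_limits(failure_counts, n):
    """Center line and 3-sigma control limits for a proportion,
    from equal-size monthly samples of n patients (binomial model)."""
    p_bar = sum(failure_counts) / (len(failure_counts) * n)
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    return p_bar, max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma

# Hypothetical monthly counts of epidurals failing to control pain,
# out of 25 monitored patients each month.
monthly_failures = [7, 8, 6, 9, 7, 8, 7, 6, 8, 7]
center, lcl, ucl = p_chart_limits(monthly_failures, n=25)
out_of_control = [f / 25 for f in monthly_failures
                  if not lcl <= f / 25 <= ucl]
```

Points inside the limits reflect common-cause variation; only points outside them (or systematic runs) signal special causes worth investigating, which is what distinguishes this approach from intermittent audits.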
Lively, Brooks; Kumar, Sandeep; Tian, Liu; Li, Bin; Zhong, Wei-Hong
2011-05-01
In this study we report the advantages of a 2-step method that adds a process pre-conditioning step for rapid and precise blending of the constituents prior to the commonly used melt compounding method for preparing polycarbonate/oxidized carbon nanofiber composites. This additional step (equivalent to a manufacturing cell) involves forming a highly concentrated solid nano-nectar of polycarbonate/carbon nanofiber composite by solution mixing, followed by melt mixing with pure polycarbonate. The combined method yields excellent dispersion and improved mechanical and thermal properties compared with the 1-step melt mixing method. The test results indicated that incorporating carbon nanofibers via the 2-step method dramatically reduced the coefficient of thermal expansion: approximately 48% lower than that of pure polycarbonate, and 30% lower than that from 1-step processing, at the same loading of 1.0 wt%. Improvements were also found in dynamic mechanical analysis and flexural mechanical properties. The 2-step approach is more precise and leads to better dispersion, higher quality, consistency, and improved performance in critical application areas. It is also consistent with Lean Manufacturing principles, in which manufacturing cells are linked together using fewer key resources to create a smoother production flow. This 2-step process can therefore be attractive for industry.
Effective Report Preparation: Streamlining the Reporting Process. AIR 1999 Annual Forum Paper.
ERIC Educational Resources Information Center
Dalrymple, Margaret; Wang, Mindy; Frost, Jacquelyn
This paper describes the processes and techniques used to improve and streamline the standard student reports used at Purdue University (Indiana). Various models for analyzing reporting processes are described, especially the model used in the study, the Shewart or Deming Cycle, a method that aids in continuous analysis and improvement through a…
Enhancing performing characteristics of organic semiconducting films by improved solution processing
Bazan, Guillermo C; Moses, Daniel; Peet, Jeffrey; Heeger, Alan J
2014-05-13
Improved processing methods for enhanced properties of conjugated polymer films are disclosed, as well as the enhanced conjugated polymer films produced thereby. Addition of low molecular weight alkyl-containing molecules to solutions used to form conjugated polymer films leads to improved photoconductivity and improvements in other electronic properties. The enhanced conjugated polymer films can be used in a variety of electronic devices, such as solar cells and photodiodes.
A new data processing technique for Rayleigh-Taylor instability growth experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, Yongteng; Tu, Shaoyong; Miao, Wenyong
Typical face-on experiments for studying the Rayleigh-Taylor instability involve time-resolved radiography of an accelerated foil, with the radiography line of sight along the direction of motion. The usual method of deriving perturbation amplitudes from the face-on images reverses the actual image transmission procedure, so the results obtained have a large error when the optical depth is large. To improve accuracy, a new data processing technique has been developed for the face-on images. The technique is based on the convolution theorem; refined solutions for the optical depth are achieved by solving equations. We also discuss both techniques for image processing, including the influence of the modulation transfer function of the imaging system and of the backlighter spatial profile. Finally, we used the two methods to process experimental results from the Shenguang-II laser facility; the comparison shows that the new method effectively improves the accuracy of data processing.
Image enhancement in positron emission mammography
NASA Astrophysics Data System (ADS)
Slavine, Nikolai V.; Seiler, Stephen; McColl, Roderick W.; Lenkinski, Robert E.
2017-02-01
Purpose: To evaluate an efficient iterative deconvolution method (RSEMD) for improving the quantitative accuracy of breast images previously reconstructed by a commercial positron emission mammography (PEM) scanner. Materials and Methods: The RSEMD method was tested on breast phantom data and clinical PEM imaging data. Data acquisition was performed on a commercial Naviscan Flex Solo II PEM camera. The method was applied to patient breast images previously reconstructed with Naviscan software (MLEM) to determine improvements in resolution, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Results: In all of the patients' breast studies, the post-processed images proved to have higher resolution and lower noise than images reconstructed by conventional methods. In general, the SNR values reached a plateau at around 6 iterations, with an improvement factor of about 2 for post-processed Flex Solo II PEM images. Improvements in image resolution after the application of RSEMD have also been demonstrated. Conclusions: A rapidly converging iterative deconvolution algorithm with a novel resolution-subsets-based approach (RSEMD) that operates on patient DICOM images has been used for quantitative improvement in breast imaging. The RSEMD method can be applied to clinical PEM images to improve image quality to diagnostically acceptable levels, which will be crucial for facilitating diagnosis of tumor progression at the earliest stages. The RSEMD method can be considered an extended Richardson-Lucy algorithm with multiple resolution levels (resolution subsets).
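RSEMD is described above as an extended Richardson-Lucy algorithm, and the underlying Richardson-Lucy iteration can be sketched in 1-D: the estimate is repeatedly multiplied by the back-projected ratio of the observed data to the re-blurred estimate. The point source and PSF below are illustrative only:

```python
def convolve_same(x, kernel):
    """'Same'-size 1-D convolution with a centred (symmetric) kernel."""
    k = len(kernel) // 2
    return [sum(kernel[j] * x[i + j - k]
                for j in range(len(kernel)) if 0 <= i + j - k < len(x))
            for i in range(len(x))]

def richardson_lucy(observed, psf, iterations=50):
    """Classic Richardson-Lucy deconvolution (the scheme RSEMD
    extends with resolution subsets)."""
    estimate = [1.0] * len(observed)
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        blurred = convolve_same(estimate, psf)
        ratio = [o / b if b > 0 else 0.0 for o, b in zip(observed, blurred)]
        correction = convolve_same(ratio, psf_flipped)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

psf = [0.25, 0.5, 0.25]                 # simple symmetric blur kernel
truth = [0, 0, 0, 10, 0, 0, 0]          # point source
observed = convolve_same(truth, psf)    # blurred measurement
restored = richardson_lucy(observed, psf)
```

The multiplicative update keeps the estimate non-negative and progressively re-concentrates the blurred flux back into the point source, which is why such schemes sharpen PEM images without introducing negative intensities.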
Improvement of Selected Logistics Processes Using Quality Engineering Tools
NASA Astrophysics Data System (ADS)
Zasadzień, Michał; Žarnovský, Jozef
2018-03-01
The increasing number of orders, rising quality requirements, and the speed required in order preparation demand the implementation of new solutions and the improvement of logistics processes. Any disruption during the execution of an order often leads to customer dissatisfaction and loss of confidence. The article presents a case study of the use of quality engineering methods and tools to improve an e-commerce logistics process. This made it possible to identify and prioritize key issues, identify their causes, and formulate improvement and prevention measures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burian, Cosmin; Llobet, Eduard; Vilanova, Xavier
We designed a challenging experimental sample set, in the form of 20 solutions with a high degree of similarity, in order to study whether the addition of chromatographic separation information improves the performance of regular MS-based electronic noses. For an initial study of the approach, two different chromatographic methods were used. By processing the data from these experiments with 2-way and 3-way algorithms, we have shown that adding chromatographic separation information improves the results compared with 2-way analysis of the mass spectra or of the total ion chromatogram treated separately. Our findings show that when the chromatographic peaks are resolved (longer measurement times), 2-way methods work better than 3-way methods, whereas for more challenging measurements (more co-eluted chromatograms and much faster GC-MS measurements) 3-way methods work better.
Şahinkaya, S; Sevimli, M F; Aygün, A
2012-01-01
One of the most serious problems encountered in biological wastewater treatment processes is the production of waste activated sludge (WAS). Sonication, although energy-intensive, is the most powerful sludge pre-treatment method. Because information about combined pre-treatments involving sonication is lacking, this study investigated such combinations, aiming to improve the disintegration efficiency of sonication by combining it with alkalization and thermal pre-treatment. Process performance was evaluated from the increases in soluble chemical oxygen demand (COD), protein, and carbohydrate. The releases of soluble COD, carbohydrate, and protein by the combined methods were higher than those achieved by sonication, alkalization, or thermal pre-treatment alone. The degrees of sludge disintegration for the sonication options ranked, in descending order: sono-alkalization > sono-thermal pre-treatment > sonication. Combining sonication with alkalization therefore significantly improved sludge disintegration and decreased the energy required to reach the same yield as sonication alone. In addition, the effects on sludge settleability and dewaterability were examined, and the pre-treatment performances were described with a kinetic mathematical model; the proposed model accurately predicted the efficiencies of the ultrasonic pre-treatment methods.
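Disintegration performance of sludge pre-treatments is commonly expressed as a disintegration degree referenced to alkaline hydrolysis. A sketch with hypothetical soluble COD values reproduces the ranking reported above; the formula is a widely used convention, not necessarily the exact one used in this study:

```python
def disintegration_degree(scod_treated, scod_raw, scod_alkaline):
    """Disintegration degree (%), referencing the COD solubilized by
    strong alkaline hydrolysis (NaOH) as 100% disintegration."""
    return 100.0 * (scod_treated - scod_raw) / (scod_alkaline - scod_raw)

# Hypothetical soluble COD values in mg/L.
scod_raw, scod_naoh = 120.0, 4120.0
methods = {
    "sonication":        disintegration_degree(1120.0, scod_raw, scod_naoh),
    "sono-thermal":      disintegration_degree(1620.0, scod_raw, scod_naoh),
    "sono-alkalization": disintegration_degree(2120.0, scod_raw, scod_naoh),
}
ranking = sorted(methods, key=methods.get, reverse=True)
```

Ranking the pre-treatments by this index gives the same descending order the study reports: sono-alkalization, then sono-thermal, then sonication alone.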
Improved soft-agar colony assay in a fluid processing apparatus.
Forsman, A D; Herpich, A R; Chapes, S K
1999-01-01
The standard method for quantitating bone marrow precursor cells has been to count the number of colony-forming units that form in semisolid (0.3%) agar. Recently we adapted this assay for use in hardware, the Fluid Processing Apparatus, that is flown in standard payload lockers of the space shuttle. When mouse or rat macrophage colony-forming units were measured with this hardware in ground-based assays, we found significantly more colony growth than that seen in standard plate assays. The improved growth correlates with increased agar thickness but also appears to be due to properties inherent to the Fluid Processing Apparatus. This paper describes an improved method for determining bone marrow macrophage precursor numbers in semisolid agar.
NASA Astrophysics Data System (ADS)
Kobayashi, Takashi; Komoda, Norihisa
Traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. As a result, design efficiency and quality vary widely with the designer's experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design using this model, we determine event occurrence conditions so that all events synchronize with one another. We also propose a design pattern for determining the event occurrence conditions (a business event improvement strategy). Finally, we apply the business process design method based on the business event model and the business event improvement strategy to a credit card issuing process and estimate its effect.
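The idea of advancing a process only when an event's occurrence condition holds can be sketched as a small state-transition table for a card issuing process; the states, events, and transitions here are invented for illustration and are not taken from the paper:

```python
# Hypothetical business-event state machine for a card issue process.
TRANSITIONS = {
    ("requested", "application_received"): "screening",
    ("screening", "credit_approved"):      "issuing",
    ("screening", "credit_rejected"):      "closed",
    ("issuing",   "card_mailed"):          "closed",
}

def run(events, state="requested"):
    """Advance the state only when the event's occurrence condition
    (a valid current state) holds, keeping events synchronized."""
    history = [state]
    for ev in events:
        nxt = TRANSITIONS.get((state, ev))
        if nxt is None:
            raise ValueError(f"event {ev!r} not allowed in state {state!r}")
        state = nxt
        history.append(state)
    return history

trace = run(["application_received", "credit_approved", "card_mailed"])
```

Encoding the occurrence conditions explicitly makes out-of-order events fail fast, which is the synchronization property the business event model aims to guarantee by design rather than by the designer's skill.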
Reducing the complexity of the software design process with object-oriented design
NASA Technical Reports Server (NTRS)
Schuler, M. P.
1991-01-01
Designing software is a complex process. This paper describes and illustrates how object-oriented design (OOD), coupled with formalized documentation and tailored object diagramming techniques, can reduce the complexity of the software design process. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower-level child objects; a method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are discussed. The method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from users of the design method. The concepts discussed are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.
Investigation of micro-injection molding based on longitudinal ultrasonic vibration core.
Qiu, Zhongjun; Yang, Xue; Zheng, Hui; Gao, Shan; Fang, Fengzhou
2015-10-01
An ultrasound-assisted micro-injection molding method is proposed to radically improve the rheological behavior of the polymer melt, and a micro-injection molding system based on a longitudinal ultrasonic vibration core is developed and employed in the micro-injection molding of Fresnel lenses. Verification experiments show that the filled mold area of the polymer melt increases by 6.08% to 19.12%, and the symmetric deviation of the Fresnel lens improves by 15.62% on average. The method effectively improves the filling performance and replication quality of the polymer melt in the injection molding process.
Improving designer productivity
NASA Technical Reports Server (NTRS)
Hill, Gary C.
1992-01-01
Designer and design team productivity improves with skill, experience, and the tools available. The design process involves numerous trials and errors, analyses, refinements, and addition of details. Computerized tools have greatly speeded the analysis, and now new theories and methods, emerging under the label Artificial Intelligence (AI), are being used to automate skill and experience. These tools improve designer productivity by capturing experience, emulating recognized skillful designers, and making the essence of complex programs easier to grasp. This paper outlines the aircraft design process in today's technology and business climate, presenting some of the challenges ahead and some of the promising AI methods for meeting those challenges.
Improved image processing of road pavement defect by infrared thermography
NASA Astrophysics Data System (ADS)
Sim, Jun-Gi
2018-03-01
This paper intends to achieve improved image processing for the clear identification of defects in damaged road pavement structures using infrared thermography non-destructive testing (NDT). To that end, four types of pavement specimens containing internal defects were fabricated, and results were obtained by heating the specimens with natural light. The results showed that defects located down to a depth of 3 cm could be detected by infrared thermography NDT using the improved image processing method.
Operations research methods improve chemotherapy patient appointment scheduling.
Santibáñez, Pablo; Aristizabal, Ruben; Puterman, Martin L; Chow, Vincent S; Huang, Wenhai; Kollmannsberger, Christian; Nordin, Travis; Runzer, Nancy; Tyldesley, Scott
2012-12-01
Clinical complexity, scheduling restrictions, and outdated manual booking processes resulted in frequent clerical rework, long waitlists for treatment, and late appointment notification for patients at a chemotherapy clinic in a large cancer center in British Columbia, Canada. A 17-month study was conducted to address booking, scheduling and workload issues and to develop, implement, and evaluate solutions. A review of scheduling practices included process observation and mapping, analysis of historical appointment data, creation of a new performance metric (final appointment notification lead time), and a baseline patient satisfaction survey. Process improvement involved discrete event simulation to evaluate alternative booking practice scenarios, development of an optimization-based scheduling tool to improve scheduling efficiency, and change management for implementation of process changes. Results were evaluated through analysis of appointment data, a follow-up patient survey, and staff surveys. Process review revealed a two-stage scheduling process. Long waitlists and late notification resulted from an inflexible first-stage process. The second-stage process was time consuming and tedious. After a revised, more flexible first-stage process and an automated second-stage process were implemented, the median percentage of appointments exceeding the final appointment notification lead time target of one week was reduced by 57% and median waitlist size decreased by 83%. Patient surveys confirmed increased satisfaction while staff feedback reported reduced stress levels. Significant operational improvements can be achieved through process redesign combined with operations research methods.
Six-sigma application in tire-manufacturing company: a case study
NASA Astrophysics Data System (ADS)
Gupta, Vikash; Jain, Rahul; Meena, M. L.; Dangayach, G. S.
2017-09-01
Globalization, technological advancement, and growing customer demand are changing the way companies do business. To keep pace, the six-sigma define-measure-analyze-improve-control (DMAIC) method is among the most popular and useful approaches: it helps trim waste and generate potential improvements in process as well as service industries. In the current research, the DMAIC method was used to decrease process variation in the bead splice, which was causing wastage of material. The six-sigma DMAIC study began with problem identification through the voice of the customer in the define step. The subsequent step gathered the specification data of the existing tire bead. This was followed by the analyze and improve steps, where six-sigma quality tools such as the cause-effect diagram, statistical process control, and substantial analysis of the existing system were applied for root cause identification and reduction of process variation. Process control charts were used for systematic observation and control of the process. Using the DMAIC methodology, the standard deviation was decreased from 2.17 to 1.69. The process capability index (Cp) was enhanced from 1.65 to 2.95 and the process performance capability index (Cpk) from 0.94 to 2.66. The established DMAIC methodology can play a key role in reducing defects in the tire-manufacturing process in India.
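The capability indices quoted above follow the standard definitions Cp = (USL − LSL) / 6σ and Cpk = min(USL − μ, μ − LSL) / 3σ. A minimal sketch of the calculation (the measurements and specification limits below are illustrative, not data from the study):

```python
def process_capability(data, lsl, usl):
    """Compute Cp and Cpk from sample data and specification limits."""
    n = len(data)
    mean = sum(data) / n
    # sample standard deviation (n - 1 in the denominator)
    sigma = (sum((x - mean) ** 2 for x in data) / (n - 1)) ** 0.5
    cp = (usl - lsl) / (6 * sigma)                   # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)  # centring-adjusted capability
    return cp, cpk

# Illustrative bead-splice measurements centred on a nominal value of 10.0
measurements = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
cp, cpk = process_capability(measurements, lsl=9.0, usl=11.0)
```

A perfectly centred process gives Cp = Cpk; the gap between the study's pre-improvement values (1.65 vs. 0.94) indicates the process mean was off-centre relative to the specification limits.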
Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I
2016-01-01
Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers.
2010-01-01
Background Histologic samples all funnel through the H&E microtomy staining area. Here manual processes intersect with semi-automated processes creating a bottleneck. We compare alternate work processes in anatomic pathology primarily in the H&E staining work cell. Methods We established a baseline measure of H&E process impact on personnel, information management and sample flow from historical workload and production data and direct observation. We compared this to performance after implementing initial Lean process modifications, including workstation reorganization, equipment relocation and workflow levelling, and the Ventana Symphony stainer to assess the impact on productivity in the H&E staining work cell. Results Average time from gross station to assembled case decreased by 2.9 hours (12%). Total process turnaround time (TAT) exclusive of processor schedule changes decreased 48 minutes/case (4%). Mean quarterly productivity increased 8.5% with the new methods. Process redesign reduced the number of manual steps from 219 to 182, a 17% reduction. Specimen travel distance was reduced from 773 ft/case to 395 ft/case (49%) overall, and from 92 to 53 ft/case in the H&E cell (42% improvement). Conclusions Implementation of Lean methods in the H&E work cell of histology can result in improved productivity, improved through-put and case availability parameters including TAT. PMID:20181123
Jain, Tanu; Grover, Kiran; Kaur, Gurpreet
2016-12-15
Garden cress seeds were subjected to different processing methods and analyzed for nutritional composition. The effect of processing on nutrient retention was evaluated to identify the processed form of the seeds with the maximum amount of nutrients. Soaking improved protein and ash by 2.10 and 2.48 percent, respectively. Boiling improved fat and fibre by 1.66 and 8.32 percent, respectively. Maximum retention of iron and zinc was found with roasting, which also improved calcium by 3.18 percent. Percent ionizable iron and bioavailability were maximal with boiling (13.59 and 6.88%, respectively). In vitro starch and protein digestibility were also maximal on boiling (57.98 and 32.39%, respectively), with decreases of 9.65 and 14.13 percent in phytin phosphorus and oxalate, respectively. Amino acids and fatty acids decreased with heat treatment, with maximum retention found after soaking. Overall, the greatest improvement in nutrient composition and maximum nutrient retention were found with the boiling method. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Qian, Xiaoshan
2018-01-01
Traditional models of evaporation process parameters suffer from prediction errors that are continuous and cumulative. To address this, an adaptive particle swarm neural network forecasting method for the process parameters is proposed, with an autoregressive moving average (ARMA) error-correction procedure that compensates the neural network's predictions to improve accuracy. Production data from an alumina plant evaporation process were used for validation. Compared with the traditional model, the new model's prediction accuracy is greatly improved, and it can be used to predict the dynamic behavior of the components of sodium aluminate solution during evaporation.
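The error-compensation idea — fit a time-series model to the base predictor's residuals and add the forecast residual back onto the next prediction — can be sketched with a first-order autoregressive correction. This is a simplification of the ARMA procedure described, and all data below are synthetic:

```python
def ar1_coefficient(residuals):
    """Least-squares fit of e_t = phi * e_(t-1) (AR(1), zero mean assumed)."""
    num = sum(residuals[t] * residuals[t - 1] for t in range(1, len(residuals)))
    den = sum(e * e for e in residuals[:-1])
    return num / den

def corrected_prediction(base_pred, last_residual, phi):
    """Compensate the base model's next prediction with the forecast residual."""
    return base_pred + phi * last_residual

# Synthetic example: the base model has a slowly decaying systematic error
residuals = [1.0, 0.8, 0.64, 0.512, 0.4096]   # exactly AR(1) with phi = 0.8
phi = ar1_coefficient(residuals)
next_pred = corrected_prediction(base_pred=5.0, last_residual=residuals[-1], phi=phi)
```

Because the synthetic residuals decay geometrically by a factor of 0.8, the least-squares fit recovers phi = 0.8 exactly; on real data the ARMA fit would absorb whatever serial structure the neural network's errors exhibit.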
Improvement attributes in healthcare: implications for integrated care.
Harnett, Patrick John
2018-04-16
Purpose Healthcare quality improvement is a key concern for policy makers, regulators, carers and service users. Despite a contemporary consensus among policy makers that integrated care represents a means to substantially improve service outcomes, progress has been slow. Difficulties achieving sustained improvement at scale imply that methods employed are not sufficient and that healthcare improvement attributes may be different when compared to prior reference domains. The purpose of this paper is to examine and synthesise key improvement attributes relevant to a complex healthcare change process, specifically integrated care. Design/methodology/approach This study is based on an integrative literature review on systemic improvement in healthcare. Findings A central theme emerging from the literature review indicates that implementing systemic change needs to address the relationship between vision, methods and participant social dynamics. Practical implications Accommodating personal and professional network dynamics is required for systemic improvement, especially among high autonomy individuals. This reinforces the need to recognise the change process as taking place in a complex adaptive system where personal/professional purpose/meaning is central to the process. Originality/value Shared personal/professional narratives are insufficiently recognised as a powerful change force, under-represented in linear and rational empirical improvement approaches.
NASA Astrophysics Data System (ADS)
Widianta, M. M. D.; Rizaldi, T.; Setyohadi, D. P. S.; Riskiawan, H. Y.
2018-01-01
The right decision in placing employees in an appropriate position in a company supports the quality of management and improves the quality of the company's human resources. Such decision-making can be assisted by a Decision Support System (DSS) to improve accuracy in the employee placement process. The purpose of this paper is to compare four Multi Criteria Decision Making (MCDM) methods, i.e., Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), Simple Additive Weighting (SAW), Analytic Hierarchy Process (AHP), and Preference Ranking Organization Method for Enrichment of Evaluations (PROMETHEE), for employee placement in accordance with predetermined criteria. The ranking results and accuracy levels obtained differ across the methods, depending on each method's scaling and weighting processes.
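Of the four MCDM methods compared, TOPSIS is compact enough to sketch in full: vector-normalize the decision matrix, weight it, and rank alternatives by closeness to the ideal point. The candidate scores, criteria, and weights below are hypothetical, not from the paper:

```python
def topsis(matrix, weights, benefit):
    """Rank alternatives by TOPSIS.
    matrix:  rows = alternatives, columns = criterion scores
    weights: criterion weights
    benefit: True where higher is better, False for cost criteria
    """
    n_crit = len(weights)
    # vector-normalize each column, then apply the weights
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(n_crit)]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]
    # ideal and anti-ideal points per criterion
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sum((x - i) ** 2 for x, i in zip(row, ideal)) ** 0.5
        d_neg = sum((x - a) ** 2 for x, a in zip(row, anti)) ** 0.5
        scores.append(d_neg / (d_pos + d_neg))   # relative closeness in [0, 1]
    return scores

# Hypothetical placement criteria: (skill test, interview, absence rate)
candidates = [[80, 70, 5], [90, 60, 3], [70, 85, 2]]
scores = topsis(candidates, weights=[0.5, 0.3, 0.2], benefit=[True, True, False])
best = max(range(len(scores)), key=scores.__getitem__)
```

SAW differs only in skipping the distance step (it sums the weighted normalized scores directly), which is one source of the ranking differences the paper reports.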
An intraorganizational model for developing and spreading quality improvement innovations
Kellogg, Katherine C.; Gainer, Lindsay A.; Allen, Adrienne S.; O'Sullivan, Tatum; Singer, Sara J.
2017-01-01
Background: Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. Purpose: We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. Methodology/Approach: We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group’s traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. Findings: The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Practice Implications: Successful intraorganizational spread is possible and sustainable. 
Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread. PMID:27428788
Joint Processing of Envelope Alignment and Phase Compensation for Isar Imaging
NASA Astrophysics Data System (ADS)
Chen, Tao; Jin, Guanghu; Dong, Zhen
2018-04-01
Range envelope alignment and phase compensation are split into two isolated steps in the classical methods of translational motion compensation for Inverse Synthetic Aperture Radar (ISAR) imaging. In the classic method for rotating-object imaging, the reference points used for envelope alignment and for Phase Difference (PD) estimation are probably not the same point, making it difficult to decouple the coupling term when correcting Migration Through Resolution Cells (MTRC). In this paper, an improved joint-processing approach that chooses a certain scattering point as the sole reference point is proposed, using the Prominent Point Processing (PPP) method. To this end, we first form an initial image using the classical methods, from which a scattering point is chosen. Envelope alignment and phase compensation are then conducted using the selected scattering point as the common reference. The keystone transform is subsequently applied to further improve imaging quality. Both simulation experiments and real data processing demonstrate the performance of the proposed method compared with the classical method.
A method to evaluate process performance by integrating time and resources
NASA Astrophysics Data System (ADS)
Wang, Yu; Wei, Qingjie; Jin, Shuang
2017-06-01
The purpose of process mining is to improve an enterprise's existing processes, so measuring process performance is particularly important. However, current research on performance evaluation methods remains insufficient: evaluations mainly rely on time or resources alone, and such basic statistics cannot evaluate process performance well. In this paper, a method for evaluating process performance that integrates the time dimension and the resource dimension is proposed; it can measure the utilization and redundancy of resources in the process. The paper introduces the design principle and formula of the evaluation algorithm, then describes the design and implementation of the evaluation method. Finally, the method is used to analyse an event log from a telephone maintenance process and to propose an optimization plan.
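A resource-dimension metric of the kind described — utilization of each resource over an event log — might be sketched as below; the event-log fields and figures are invented for illustration, not the paper's formula:

```python
from collections import defaultdict

def resource_utilization(events, horizon):
    """Fraction of the observation horizon each resource spends busy.
    events: (resource, start, end) tuples; overlapping intervals for the
    same resource are merged so double-booked time is not counted twice."""
    busy = defaultdict(list)
    for res, start, end in events:
        busy[res].append((start, end))
    util = {}
    for res, intervals in busy.items():
        intervals.sort()
        total = 0
        cur_s, cur_e = intervals[0]
        for s, e in intervals[1:]:
            if s <= cur_e:                 # overlap: extend current interval
                cur_e = max(cur_e, e)
            else:
                total += cur_e - cur_s
                cur_s, cur_e = s, e
        total += cur_e - cur_s
        util[res] = total / horizon
    return util

# Invented event log: (resource, start hour, end hour) over a 10-hour window
log = [("clerk", 0, 4), ("clerk", 3, 6), ("tech", 1, 3), ("tech", 5, 7)]
util = resource_utilization(log, horizon=10)
```

Low utilization over many cases signals the resource redundancy the paper's method is meant to expose.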
Improved compression molding process
NASA Technical Reports Server (NTRS)
Heier, W. C.
1967-01-01
Modified compression molding process produces plastic molding compounds that are strong, homogeneous, free of residual stresses, and have improved ablative characteristics. The conventional method is modified by applying a vacuum to the mold during the molding cycle, using a volatile sink, and exercising precise control of the mold closure limits.
Visual improvement for bad handwriting based on Monte-Carlo method
NASA Astrophysics Data System (ADS)
Shi, Cao; Xiao, Jianguo; Xu, Canhui; Jia, Wenhua
2014-03-01
A visual improvement algorithm based on Monte Carlo simulation is proposed in this paper to enhance the visual effect of bad handwriting. The improvement process uses a well designed typeface to optimize the bad handwriting image. A series of linear operators for image transformation is defined to transform the typeface image toward the handwriting image, and the parameters of these linear operators are estimated by the Monte Carlo method. Visual improvement experiments illustrate that the proposed algorithm can effectively enhance the visual effect of a handwriting image while maintaining the original handwriting features, such as tilt, stroke order, and drawing direction. The proposed algorithm has great potential for application on tablet computers and the mobile Internet to improve the user experience of handwriting.
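The parameter-estimation step — choosing transformation parameters by Monte Carlo simulation — can be sketched generically as random-search minimization of a mismatch function. The "tilt" and "scale" parameters and the quadratic loss below are illustrative stand-ins, not the paper's actual operators:

```python
import random

def monte_carlo_fit(loss, bounds, n_samples=5000, seed=42):
    """Randomly sample parameter vectors within bounds, keep the best one."""
    rng = random.Random(seed)           # fixed seed for reproducibility
    best_params, best_loss = None, float("inf")
    for _ in range(n_samples):
        params = [rng.uniform(lo, hi) for lo, hi in bounds]
        l = loss(params)
        if l < best_loss:
            best_params, best_loss = params, l
    return best_params, best_loss

# Toy stand-in for the typeface-to-handwriting match: recover a tilt and a scale
target_tilt, target_scale = 0.3, 1.5
loss = lambda p: (p[0] - target_tilt) ** 2 + (p[1] - target_scale) ** 2
params, err = monte_carlo_fit(loss, bounds=[(-1.0, 1.0), (0.5, 2.0)])
```

In the paper the loss would instead measure the pixel-level difference between the transformed typeface image and the handwriting image.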
Conductivity fuel cell collector plate and method of fabrication
Braun, James C.
2002-01-01
An improved method of manufacturing a PEM fuel cell collector plate is disclosed. During molding a highly conductive polymer composite is formed having a relatively high polymer concentration along its external surfaces. After molding the polymer rich layer is removed from the land areas by machining, grinding or similar process. This layer removal results in increased overall conductivity of the molded collector plate. The polymer rich surface remains in the collector plate channels, providing increased mechanical strength and other benefits to the channels. The improved method also permits greater mold cavity thickness providing a number of advantages during the molding process.
Grant, Aileen; Dreischulte, Tobias; Treweek, Shaun; Guthrie, Bruce
2012-08-28
Trials of complex interventions are criticized for being 'black box', so the UK Medical Research Council recommends carrying out a process evaluation to explain the trial findings. We believe it is good practice to pre-specify and publish process evaluation protocols to set standards and minimize bias. Unlike protocols for trials, little guidance or standards exist for the reporting of process evaluations. This paper presents the mixed-method process evaluation protocol of a cluster randomized trial, drawing on a framework designed by the authors. This mixed-method evaluation is based on four research questions and maps data collection to a logic model of how the data-driven quality improvement in primary care (DQIP) intervention is expected to work. Data collection will be predominately by qualitative case studies in eight to ten of the trial practices, focus groups with patients affected by the intervention, and quantitative analysis of routine practice data, trial outcome and questionnaire data, and data from the DQIP intervention. We believe that pre-specifying the intentions of a process evaluation can help to minimize bias arising from potentially misleading post-hoc analysis. We recognize it is also important to retain flexibility to examine the unexpected and the unintended. From that perspective, a mixed-methods evaluation allows the combination of exploratory and flexible qualitative work with more pre-specified quantitative analysis, each method contributing to the design, implementation and interpretation of the other. As well as strengthening the study, the authors hope to stimulate discussion among their academic colleagues about publishing protocols for evaluations of randomized trials of complex interventions. Trial registration (data-driven quality improvement in primary care): ClinicalTrials.gov NCT01425502.
Brooks, Robin; Thorpe, Richard; Wilson, John
2004-11-11
A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms, and hence in alarm annunciation rates, in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework, so that operations improvement, and hence economic benefit, is obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral to the method is a new-format process operator display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for investments about to be made, or already made, in process historian systems. Field trials have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, of the new geometric process control (GPC) method, which improves the quality of both process operations and product by providing process alarms and alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for alarm rationalisation that quickly delivers large sets of consistent alarm limits, and the extension to full alert management, with highlights from the field trials to indicate the overall effectiveness of the method in practice.
Alan A. Ager; Jeffrey D. Kline; A. Paige Fisher
2015-01-01
We describe recent advances in biophysical and social aspects of risk and their potential combined contribution to improve mitigation planning on fire-prone landscapes. The methods and tools provide an improved method for defining the spatial extent of wildfire risk to communities compared to current planning processes. They also propose an expanded role for social...
[Quality assessment in anesthesia].
Kupperwasser, B
1996-01-01
Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity, for which it is necessary to define and calibrate measurement parameters (indicators) from data available in the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods, and are closely related to processes and the main targets of quality improvement. The three types of methods for analysing the problems (indicators) are: peer review, quantitative methods and risk management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit, with conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk management techniques include: a) critical incident analysis, establishing an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, which examines all the process components leading to the unpredictable outcome, not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process).
Definition and implementation of corrective measures, based on the findings of the two previous stages, constitute the third step of the evaluation cycle. The Hawthorne effect is an improvement in outcome occurring before the implementation of any corrective action. Verification of the implemented actions is the final and mandatory step closing the evaluation cycle.
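The control charts mentioned among the quantitative methods can be sketched with a Shewhart individuals chart, where the 3-sigma limits are estimated from the mean moving range (MR-bar / 1.128). The monthly indicator rates below are invented for illustration:

```python
def control_limits(samples):
    """Shewhart individuals chart: centre line and 3-sigma control limits,
    with sigma estimated from the mean moving range (MR-bar / d2, d2 = 1.128)."""
    n = len(samples)
    mean = sum(samples) / n
    moving_ranges = [abs(samples[i] - samples[i - 1]) for i in range(1, n)]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return mean - 3 * sigma, mean, mean + 3 * sigma

def out_of_control(samples):
    """Indices of points falling outside the control limits."""
    lcl, _, ucl = control_limits(samples)
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

# Illustrative monthly indicator rates, with one aberrant month (index 6)
rates = [2.1, 2.3, 2.0, 2.2, 2.4, 2.1, 4.8, 2.2, 2.0, 2.3]
flagged = out_of_control(rates)
```

A flagged month is a signal to start the problem-analysis stage of the cycle, not proof of a quality defect in itself.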
Is audit research? The relationships between clinical audit and social research.
Hughes, Rhidian
2005-01-01
Quality has an established history in health care. Audit, as a means of quality assessment, is well understood and the existing literature has identified links between audit and research processes. This paper reviews the relationships between audit and research processes, highlighting how audit can be improved through the principles and practice of social research. The review begins by defining the audit process. It goes on to explore salient relationships between clinical audit and research, grouped into the following broad themes: ethical considerations, highlighting responsibilities towards others and the need for ethical review for audit; asking questions and using appropriate methods, emphasising transparency in audit methods; conceptual issues, including identifying problematic concepts, such as "satisfaction", and the importance of reflexivity within audit; emphasising research in context, highlighting the benefits of vignettes and action research; complementary methods, demonstrating improvements for the quality of findings; and training and multidisciplinary working, suggesting the need for closer relationships between researchers and clinical practitioners. Audit processes cannot be considered research. Both audit and research processes serve distinct purposes. Attention to the principles of research when conducting audit is necessary to improve the quality of audit and, in turn, the quality of health care.
Quality management benchmarking: FDA compliance in pharmaceutical industry.
Jochem, Roland; Landgraf, Katja
2010-01-01
By analyzing and comparing industry and business best practice, processes can be optimized and become more successful mainly because efficiency and competitiveness increase. This paper aims to focus on some examples. Case studies are used to show knowledge exchange in the pharmaceutical industry. Best practice solutions were identified in two companies using a benchmarking method and five-stage model. Despite large administrations, there is much potential regarding business process organization. This project makes it possible for participants to fully understand their business processes. The benchmarking method gives an opportunity to critically analyze value chains (a string of companies or players working together to satisfy market demands for a special product). Knowledge exchange is interesting for companies that like to be global players. Benchmarking supports information exchange and improves competitive ability between different enterprises. Findings suggest that the five-stage model improves efficiency and effectiveness. Furthermore, the model increases the chances for reaching targets. The method gives security to partners that did not have benchmarking experience. The study identifies new quality management procedures. Process management and especially benchmarking is shown to support pharmaceutical industry improvements.
Research on assessment and improvement method of remote sensing image reconstruction
NASA Astrophysics Data System (ADS)
Sun, Li; Hua, Nian; Yu, Yanbo; Zhao, Zhanping
2018-01-01
Remote sensing image quality assessment and improvement is an important part of image processing. Generally, the use of compressive sampling theory in a remote sensing imaging system can compress images while sampling, which improves efficiency. In this paper, a two-dimensional principal component analysis (2DPCA) method is proposed to reconstruct the remote sensing image and improve the quality of the compressed image; the reconstruction retains the useful information of the image while suppressing noise. Then, the factors that influence remote sensing image quality are analyzed, and evaluation parameters for quantitative assessment are introduced. On this basis, the quality of the reconstructed images is evaluated and the influence of the different factors on reconstruction is analyzed, providing meaningful reference data for enhancing the quality of remote sensing images. The experimental results show that the evaluation results agree with human visual perception and that the proposed method has good application value in the field of remote sensing image processing.
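The abstract leaves the reconstruction step implicit; the following is a minimal sketch of the 2DPCA idea it builds on, in which an image covariance matrix is formed directly from the 2-D image matrices (no vectorization into long vectors) and images are projected onto its leading eigenvectors. Function names and the toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def twodpca_basis(images, k):
    """Top-k 2DPCA projection axes from a stack of images of shape (N, h, w)."""
    mean = images.mean(axis=0)
    g = np.zeros((images.shape[2], images.shape[2]))
    for img in images:
        d = img - mean
        g += d.T @ d                  # image covariance built from 2-D matrices
    g /= len(images)
    _, vecs = np.linalg.eigh(g)       # eigenvalues in ascending order
    return vecs[:, ::-1][:, :k]       # columns = top-k axes

def twodpca_reconstruct(img, basis):
    """Project an image onto the 2DPCA axes and reconstruct it."""
    return (img @ basis) @ basis.T
```

Keeping more axes can only lower the reconstruction error, and with all axes the reconstruction is exact; in a denoising setting it is the trailing, noise-dominated axes that are dropped.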
Chen, Kun; Zhang, Hongyuan; Wei, Haoyun; Li, Yan
2014-08-20
In this paper, we propose an improved subtraction algorithm for rapid recovery of Raman spectra that can substantially reduce the computation time. This algorithm is based on an improved Savitzky-Golay (SG) iterative smoothing method, which involves two key novel approaches: (a) the use of the Gauss-Seidel method and (b) the introduction of a relaxation factor into the iterative procedure. By applying a novel successive relaxation (SG-SR) iterative method to the relaxation factor, additional improvement in convergence speed over the standard Savitzky-Golay procedure is realized. The proposed improved algorithm (the RIA-SG-SR algorithm), which uses SG-SR-based iteration instead of Savitzky-Golay iteration, has been optimized and validated with a mathematically simulated Raman spectrum, as well as experimentally measured Raman spectra from non-biological and biological samples. The method results in a significant reduction in computing cost while yielding consistent rejection of fluorescence and noise for spectra with low signal-to-fluorescence ratios and varied baselines. In the simulation, RIA-SG-SR achieved 1 order of magnitude improvement in iteration number and 2 orders of magnitude improvement in computation time compared with the range-independent background-subtraction algorithm (RIA). Furthermore, the time to process an experimentally measured raw Raman spectrum from skin tissue decreased from 6.72 to 0.094 s. In general, the processing of the SG-SR method can be completed within dozens of milliseconds, enabling real-time use in practical situations.
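The SG-SR update itself is not given in the abstract; the sketch below shows only the general shape of such an iterative baseline estimate, with a plain moving average standing in for the Savitzky-Golay smoother and a relaxation factor omega (> 1) accelerating the clipped-smoothing iteration. All parameter values are illustrative assumptions, not the paper's.

```python
import numpy as np

def smooth(y, half_window=15):
    """Simple moving-average smoother (stand-in for Savitzky-Golay)."""
    kernel = np.ones(2 * half_window + 1) / (2 * half_window + 1)
    return np.convolve(np.pad(y, half_window, mode="edge"), kernel, mode="valid")

def iterative_baseline(y, omega=1.3, n_iter=100):
    """Estimate a fluorescence-like baseline by iterative clipped smoothing.

    Each iteration clips the spectrum to the current baseline estimate,
    smooths the result, and applies a relaxation step controlled by omega;
    omega = 1 recovers plain iterative smoothing.
    """
    b = y.copy()
    for _ in range(n_iter):
        clipped = np.minimum(y, b)     # peaks above the baseline are clipped
        s = smooth(clipped)
        b = b + omega * (s - b)        # relaxation (over-correction) update
    return b
```

Subtracting the returned baseline from the raw spectrum leaves the Raman peaks; the relaxation factor is what speeds convergence relative to the un-relaxed iteration.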
School Improvement Model to Foster Student Learning
ERIC Educational Resources Information Center
Rulloda, Rudolfo Barcena
2011-01-01
Many classroom teachers are still using the traditional teaching methods. The traditional teaching methods are one-way learning process, where teachers would introduce subject contents such as language arts, English, mathematics, science, and reading separately. However, the school improvement model takes into account that all students have…
NASA Astrophysics Data System (ADS)
Wang, Hongyan
2017-04-01
This paper addresses the waveform optimization problem for improving the detection performance of multiple-input multiple-output (MIMO) orthogonal frequency division multiplexing (OFDM) radar-based space-time adaptive processing (STAP) in complex environments. By maximizing the output signal-to-interference-and-noise ratio (SINR), the waveform optimization problem for improving the detection performance of STAP, subject to a constant modulus constraint, is derived. To tackle the resulting nonlinear and complicated optimization problem, a diagonal loading-based method is proposed to reformulate it as a semidefinite programming problem, which can then be solved very efficiently. The optimized waveform maximizes the output SINR of MIMO-OFDM so that the detection performance of STAP is improved. Simulation results show that the proposed method improves the output SINR, and hence the detection performance, considerably compared with uncorrelated waveforms and the existing MIMO-based STAP method.
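The paper's semidefinite-programming reformulation is not reproduced here; the sketch below illustrates only the diagonal-loading ingredient, applied to a textbook SINR-maximizing filter w proportional to (R + lambda*I)^(-1) s. All names and values are illustrative assumptions.

```python
import numpy as np

def loaded_mvdr_weights(R, s, loading=1e-2):
    """Adaptive filter weights w = (R + loading*I)^(-1) s, normalized.

    Diagonal loading regularizes a rank-deficient or poorly estimated
    interference-plus-noise covariance R so the inverse exists and the
    filter degrades gracefully.
    """
    Rl = R + loading * np.eye(R.shape[0])
    w = np.linalg.solve(Rl, s)
    return w / (w.conj() @ s)          # unit response toward the target s

def output_sinr(w, R, s):
    """Output SINR of filter w for target steering s and covariance R."""
    signal = np.abs(w.conj() @ s) ** 2
    interference = np.real(w.conj() @ R @ w)
    return signal / interference
```

Even with a singular (rank-1) interference covariance, the loaded solve succeeds and the resulting filter places a near-null on the interference, far outperforming a simple matched filter.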
Method for producing ethanol and co-products from cellulosic biomass
Nguyen, Quang A
2013-10-01
The present invention generally relates to processes for production of ethanol from cellulosic biomass. The present invention also relates to production of various co-products of preparation of ethanol from cellulosic biomass. The present invention further relates to improvements in one or more aspects of preparation of ethanol from cellulosic biomass including, for example, improved methods for cleaning biomass feedstocks, improved acid impregnation, and improved steam treatment, or "steam explosion."
NASA Astrophysics Data System (ADS)
Shao, Rongjun; Qiu, Lirong; Yang, Jiamiao; Zhao, Weiqian; Zhang, Xin
2013-12-01
We have proposed a component-parameter measuring method based on differential confocal focusing theory. In order to improve the positioning precision of the laser differential confocal component parameters measurement system (LDDCPMS), this paper provides a data processing method based on tracking the light spot. To reduce the error caused by movement of the light spot while the axial intensity signal is collected, an image centroiding algorithm is used to find and track the center of the Airy disk in the images collected by the laser differential confocal system. To weaken the influence of higher-harmonic noise during the measurement, a Gaussian filter is used to process the axial intensity signal. Finally, the zero point corresponding to the focus of the objective in the differential confocal system is obtained by linear fitting of the differential confocal axial intensity data. Preliminary experiments indicate that the method based on tracking the light spot can accurately collect the axial intensity response signal of the virtual pinhole and improve the anti-interference ability of the system, thus improving the system's positioning accuracy.
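As a rough illustration of the two processing ingredients named above, the sketch below computes an intensity-weighted centroid for tracking the Airy-disk center and locates the axial zero point by a straight-line fit. It is a toy version under assumed function names, not the LDDCPMS code.

```python
import numpy as np

def spot_centroid(image):
    """Intensity-weighted centroid (center of mass) of a focal-spot image,
    used to track the center of the Airy disk between frames."""
    ys, xs = np.indices(image.shape)
    total = image.sum()
    return (ys * image).sum() / total, (xs * image).sum() / total

def axial_zero(z, signal):
    """Locate the zero crossing of the differential confocal axial
    response by a straight-line fit near focus."""
    slope, intercept = np.polyfit(z, signal, 1)
    return -intercept / slope
```

Near focus the differential confocal response is approximately linear in axial position, which is why a linear fit through the sign change recovers the focus position robustly.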
Digital Signal Processing Based on a Clustering Algorithm for Ir/Au TES Microcalorimeter
NASA Astrophysics Data System (ADS)
Zen, N.; Kunieda, Y.; Takahashi, H.; Hiramoto, K.; Nakazawa, M.; Fukuda, D.; Ukibe, M.; Ohkubo, M.
2006-02-01
In recent years, cryogenic microcalorimeters using a superconducting transition edge have been under development for possible application in astronomical X-ray observations. To improve the energy resolution of superconducting transition edge sensors (TES), several correction methods have been developed. Among them, a clustering method based on digital signal processing has recently been proposed. In this paper, we applied the clustering method to an Ir/Au bilayer TES. This method resulted in almost a 10% improvement in energy resolution. Furthermore, from the point of view of imaging X-ray spectroscopy, we applied the clustering method to pixellated Ir/Au-TES devices. We thus show how a clustering method that sorts signals by their shapes is also useful for position identification.
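The paper's clustering algorithm is not described in detail in the abstract; as an illustration of sorting pulses by shape, the sketch below normalizes each trace to unit peak height (so amplitude differences do not dominate) and groups the resulting shapes with a plain k-means using farthest-point initialization. This is a generic stand-in, not the authors' method.

```python
import numpy as np

def kmeans(X, k, n_iter=50):
    """Plain k-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])          # seed far from existing centers
    centers = np.array(centers)
    for _ in range(n_iter):
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dist.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def cluster_pulses(pulses, k=2):
    """Cluster pulse traces by shape: each trace is scaled to unit peak
    height before clustering."""
    shapes = pulses / pulses.max(axis=1, keepdims=True)
    return kmeans(shapes, k)
```

For pixellated devices, pulses from different pixels arrive with different rise/decay shapes, so shape clusters can double as position labels.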
Navigating change: how outreach facilitators can help clinicians improve patient outcomes.
Laferriere, Dianne; Liddy, Clare; Nash, Kate; Hogg, William
2012-01-01
The objective of this study was to describe outreach facilitation as an effective method of assisting and supporting primary care practices to improve processes and delivery of care. We spent 4 years working with 83 practices in Eastern Ontario, Canada, on the Improved Delivery of Cardiovascular Care through the Outreach Facilitation program. Primary care practices, even if highly motivated, face multiple challenges when providing quality patient care. Outreach facilitation can be an effective method of assisting and supporting practices to make the changes necessary to improve processes and delivery of care. Multiple jurisdictions use outreach facilitation for system redesign, improved efficiencies, and advanced access. The development and implementation of quality improvement programs using practice facilitation can be challenging. Our research team has learned valuable lessons in developing tools, finding resources, and assisting practices to reach their quality improvement goals. These lessons can lead to improved experiences for the practices and overall improved outcomes for the patients they serve.
A Web Centric Architecture for Deploying Multi-Disciplinary Engineering Design Processes
NASA Technical Reports Server (NTRS)
Woyak, Scott; Kim, Hongman; Mullins, James; Sobieszczanski-Sobieski, Jaroslaw
2004-01-01
There are continuing needs for engineering organizations to improve their design processes. Current state-of-the-art techniques use computational simulations to predict design performance and optimize it through advanced design methods. These tools have been used mostly by individual engineers. This paper presents an architecture for achieving results at an organizational level, beyond the individual level. The next set of gains in process improvement will come from improving the effective use of computers and software within a whole organization, not just by an individual. The architecture takes advantage of state-of-the-art capabilities to produce a Web-based system to carry engineering design into the future. To illustrate deployment of the architecture, a case study implementing advanced multidisciplinary design optimization processes such as Bi-Level Integrated System Synthesis is discussed. Another example, rolling out a design process for Design for Six Sigma, is also described. Each example explains how an organization can effectively infuse engineering practice with new design methods and retain the knowledge over time.
Clarifying values: an updated review
2013-01-01
Background Consensus guidelines have recommended that decision aids include a process for helping patients clarify their values. We sought to examine the theoretical and empirical evidence related to the use of values clarification methods in patient decision aids. Methods Building on the International Patient Decision Aid Standards (IPDAS) Collaboration’s 2005 review of values clarification methods in decision aids, we convened a multi-disciplinary expert group to examine key definitions, decision-making process theories, and empirical evidence about the effects of values clarification methods in decision aids. To summarize the current state of theory and evidence about the role of values clarification methods in decision aids, we undertook a process of evidence review and summary. Results Values clarification methods (VCMs) are best defined as methods to help patients think about the desirability of options or attributes of options within a specific decision context, in order to identify which option they prefer. Several decision-making process theories were identified that can inform the design of values clarification methods, but no single “best” practice for how such methods should be constructed was determined. Our evidence review found that existing VCMs were used for a variety of different decisions, rarely referenced underlying theory for their design, but generally were well described in regard to their development process. Listing the pros and cons of a decision was the most common method used. The 13 trials that compared decision support with or without VCMs yielded mixed results: some found that VCMs improved some decision-making processes, while others found no effect. Conclusions Values clarification methods may improve decision-making processes and potentially more distal outcomes.
However, the small number of evaluations of VCMs and, where evaluations exist, the heterogeneity in outcome measures makes it difficult to determine their overall effectiveness or the specific characteristics that increase effectiveness. PMID:24625261
Kania, John; Qiao, Ming; Woods, Elizabeth M.; Cortright, Randy D.; Myren, Paul
2015-12-15
The present invention includes improved systems and methods for producing biomass-derived feedstocks for biofuel and biochemical manufacturing processes. The systems and methods use components that are capable of transferring relatively high concentrations of solid biomass utilizing pressure variations between vessels, and allows for the recovery and recycling of heterogeneous catalyst materials.
Quality Improvement of Liver Ultrasound Images Using Fuzzy Techniques
Bayani, Azadeh; Langarizadeh, Mostafa; Radmard, Amir Reza; Nejad, Ahmadreza Farzaneh
2016-01-01
Background: Liver ultrasound images are widely used to diagnose diffuse liver diseases such as fatty liver. However, the low quality of such images makes it difficult to analyze them and diagnose diseases. The purpose of this study, therefore, is to improve the contrast and quality of liver ultrasound images. Methods: In this study, several fuzzy-logic-based image contrast enhancement algorithms were applied in Matlab 2013b to liver ultrasound images in which the kidney is visible, in order to improve image contrast and quality, a property with an inherently fuzzy definition. Three approaches were compared: contrast improvement using a fuzzy intensification operator, contrast improvement using fuzzy image histogram hyperbolization, and contrast improvement using fuzzy IF-THEN rules. Results: Measured by Mean Squared Error and Peak Signal to Noise Ratio on different images, the fuzzy methods provided better results; compared with the histogram equalization method, their implementation improved both the contrast and visual quality of the images and the results of liver segmentation algorithms applied to them. Conclusion: Comparison of the algorithms revealed the power of fuzzy logic in improving image contrast compared with traditional image processing algorithms. Moreover, the contrast improvement algorithm based on a fuzzy intensification operator was the strongest algorithm on the measured indicators. This method can also be used in future studies on other ultrasound images for quality improvement and other image processing and analysis applications. PMID:28077898
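Of the three fuzzy approaches compared, the intensification (INT) operator has the simplest closed form: membership values below 0.5 are pushed down and those above 0.5 are pushed up, which stretches mid-range contrast. A minimal sketch, assuming a float image already scaled to [0, 1]:

```python
import numpy as np

def fuzzy_intensify(image, n_passes=2):
    """Contrast enhancement with the fuzzy intensification (INT) operator.

    Gray levels are treated as fuzzy memberships mu in [0, 1]:
    mu' = 2*mu^2 for mu <= 0.5, else 1 - 2*(1 - mu)^2.
    Repeated passes intensify the stretching.
    """
    mu = image.astype(float)
    for _ in range(n_passes):
        low = mu <= 0.5
        mu = np.where(low, 2.0 * mu ** 2, 1.0 - 2.0 * (1.0 - mu) ** 2)
    return mu
```

The operator fixes 0, 0.5, and 1, so the output stays in [0, 1] while the spread of mid-range intensities (and hence the contrast) increases.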
Improving designer productivity. [artificial intelligence
NASA Technical Reports Server (NTRS)
Hill, Gary C.
1992-01-01
Designer and design team productivity improves with skill, experience, and the tools available. The design process involves numerous trials and errors, analyses, refinements, and addition of details. Computerized tools have greatly speeded the analysis, and now new theories and methods, emerging under the label Artificial Intelligence (AI), are being used to automate skill and experience. These tools improve designer productivity by capturing experience, emulating recognized skillful designers, and making the essence of complex programs easier to grasp. This paper outlines the aircraft design process in today's technology and business climate, presenting some of the challenges ahead and some of the promising AI methods for meeting these challenges.
Fingerprint pattern restoration by digital image processing techniques.
Wen, Che-Yen; Yu, Chiu-Chung
2003-09-01
Fingerprint evidence plays an important role in solving criminal problems. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms" so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and our method are evaluated and compared.
Quality control process improvement of flexible printed circuit board by FMEA
NASA Astrophysics Data System (ADS)
Krasaephol, Siwaporn; Chutima, Parames
2018-02-01
This research focuses on quality control process improvement for a Flexible Printed Circuit Board (FPCB), centred on model 7-Flex, using the Failure Mode and Effect Analysis (FMEA) method to decrease the proportion of defective finished goods found at the final inspection process. Because defective units are found only at final inspection, scrap may escape to customers. The problem stems from a quality control process that is not efficient enough to filter defective products in-process, because there is no In-Process Quality Control (IPQC) or sampling inspection in the process. Therefore, the quality control process has to be improved by setting inspection gates and IPQCs at critical processes in order to filter out defective products. The critical processes are analysed by the FMEA method. IPQC is used for detecting defective products and reducing the chance of defective finished goods escaping to customers. Reducing the proportion of defective finished goods also decreases scrap cost, because finished goods incur a higher scrap cost than work in-process. Moreover, defective products found during the process can reveal abnormal processes, so engineers and operators can solve the problems in a timely manner. The improved quality control was implemented on 7-Flex production lines from July 2017 to September 2017. The results show decreases in the average proportion of defective finished goods and in the average Customer Manufacturers Lot Reject Rate (%LRR of CMs) of 4.5% and 4.1%, respectively. Furthermore, the cost saving from this quality control process equals 100K Baht.
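FMEA prioritizes failure modes by the Risk Priority Number, RPN = severity x occurrence x detection, each rated on a 1-10 scale; the highest-RPN modes are the candidates for inspection gates and IPQC. A minimal sketch (the failure modes and ratings below are hypothetical illustrations, not the study's data):

```python
def risk_priority(failure_modes):
    """Rank failure modes by Risk Priority Number, RPN = S * O * D.

    `failure_modes` maps a failure-mode name to (severity, occurrence,
    detection) ratings on a 1-10 scale; higher RPN means higher priority
    for an in-process quality control (IPQC) gate.
    """
    rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
    return sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)
```

The ranked list directly answers "which process step gets an inspection gate first": spend the limited IPQC budget from the top of the list down.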
2012-01-01
In this work, we report a direct synthesis of vertically aligned ZnO nanowires on fluorine-doped tin oxide-coated substrates using the chemical vapor deposition (CVD) method. ZnO nanowires with a length of more than 30 μm were synthesized, and dye-sensitized solar cells (DSSCs) based on the as-grown nanowires were fabricated, which showed improvement of the device performance compared to those fabricated using transferred ZnO nanowires. Dependence of the cell performance on nanowire length and annealing temperature was also examined. This synthesis method provided a straightforward, one-step CVD process to grow relatively long ZnO nanowires and avoided subsequent nanowire transfer process, which simplified DSSC fabrication and improved cell performance. PMID:22673046
Yusof, Maryati Mohd; Khodambashi, Soudabeh; Mokhtar, Ariffin Marzuki
2012-12-21
There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows by removing waste and shortening the delivery cycle time. There are few studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes with regard to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. We conducted a case study in actual clinical settings to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. The case study findings show that Value Stream Mapping and A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy.
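A Value Stream Map ultimately reduces to lead time, value-added time, and process cycle efficiency (PCE = value-added time / lead time), the waste metric Lean aims to improve. A minimal sketch with hypothetical workflow steps (not the study's measured CCIS data):

```python
def process_cycle_efficiency(steps):
    """Value Stream Mapping summary: lead time, value-added time, and
    process cycle efficiency (PCE = value-added / lead time).

    `steps` is a list of (name, duration_minutes, value_added) tuples;
    waiting and rework steps carry value_added=False.
    """
    lead = sum(d for _, d, _ in steps)
    value_added = sum(d for _, d, va in steps if va)
    return lead, value_added, value_added / lead
```

Removing a non-value-added step shortens lead time without touching value-added time, which is exactly what raises PCE.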
Expanding the printable design space for lithography processes utilizing a cut mask
NASA Astrophysics Data System (ADS)
Wandell, Jerome; Salama, Mohamed; Wilkinson, William; Curtice, Mark; Feng, Jui-Hsuan; Gao, Shao Wen; Asthana, Abhishek
2016-03-01
The utilization of a cut-mask in semiconductor patterning processes has been in practice for logic devices since the inception of 32nm-node devices, notably with unidirectional gate level printing. However, the microprocessor applications where cut-mask patterning methods are used are expanding as Self-Aligned Double Patterning (SADP) processes become mainstream for 22/14nm fin diffusion, and sub-14nm metal levels. One common weakness for these types of lithography processes is that the initial pattern requiring the follow-up cut-mask typically uses an extreme off-axis imaging source such as dipole to enhance the resolution and line-width roughness (LWR) for critical dense patterns. This source condition suffers from poor process margin in the semi-dense (forbidden pitch) realm and wrong-way directional design spaces. Common pattern failures in these limited design regions include bridging and extra-printing defects that are difficult to resolve with traditional mask improvement means. This forces the device maker to limit the allowable geometries that a designer may use on a device layer. This paper will demonstrate methods to expand the usable design space on dipole-like processes such as unidirectional gate and SADP processes by utilizing the follow-up cut mask to improve the process window. Traditional mask enhancement means for improving the process window in this design realm will be compared to this new cut-mask approach. The unique advantages and disadvantages of the cut-mask solution will be discussed in contrast to those customary methods.
Cherepy, Nerine Jane; Payne, Stephen Anthony; Drury, Owen B; Sturm, Benjamin W
2014-11-11
A scintillator radiation detector system according to one embodiment includes a scintillator; and a processing device for processing pulse traces corresponding to light pulses from the scintillator, wherein pulse digitization is used to improve energy resolution of the system. A scintillator radiation detector system according to another embodiment includes a processing device for fitting digitized scintillation waveforms to an algorithm based on identifying rise and decay times and performing a direct integration of fit parameters. A method according to yet another embodiment includes processing pulse traces corresponding to light pulses from a scintillator, wherein pulse digitization is used to improve energy resolution of the system. A method in a further embodiment includes fitting digitized scintillation waveforms to an algorithm based on identifying rise and decay times; and performing a direct integration of fit parameters. Additional systems and methods are also presented.
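The claim that energy can be obtained by "direct integration of fit parameters" has a simple closed form for a two-exponential pulse model: the integral over [0, inf) of A*(e^(-t/tau_d) - e^(-t/tau_r)) is A*(tau_d - tau_r). A sketch under that assumed pulse model (the model and names are illustrative, not necessarily the patented fitting procedure):

```python
import numpy as np

def pulse_model(t, amplitude, tau_rise, tau_decay):
    """Two-exponential scintillation pulse with rise and decay time constants."""
    return amplitude * (np.exp(-t / tau_decay) - np.exp(-t / tau_rise))

def energy_from_fit(amplitude, tau_rise, tau_decay):
    """Direct integration of the fitted pulse over [0, infinity):
    closed form amplitude * (tau_decay - tau_rise)."""
    return amplitude * (tau_decay - tau_rise)
```

Once the waveform has been fitted, the closed form replaces a numerical integration of the digitized trace, which is both faster and immune to tail truncation.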
Studies on Hot-Melt Prepregging of PMR-II-50 Polyimide Resin with Graphite Fibers
NASA Technical Reports Server (NTRS)
Shin, E. Eugene; Sutter, James K.; Juhas, John; Veverka, Adrienne; Klans, Ojars; Inghram, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Zoha, John; Bubnick, Jim
2004-01-01
A second generation PMR (in situ Polymerization of Monomer Reactants) polyimide resin, PMR-II-50, has been considered for high temperature and high stiffness space propulsion composites applications for its improved high temperature performance. As part of composite processing optimization, two commercial prepregging methods, solution vs. hot-melt processes, were investigated with M40J fabrics from Toray. In a previous study a systematic chemical, physical, thermal and mechanical characterization of these composites indicated that poor resin-fiber interfacial wetting, especially for the hot-melt process, resulted in poor composite quality. In order to improve the interfacial wetting, optimization of the resin viscosity and process variables were attempted in a commercial hot-melt prepregging line. In addition to presenting the results from the prepreg quality optimization trials, the combined effects of the prepregging method and two different composite cure methods, i.e., hot press vs. autoclave, on composite quality and properties are discussed.
Studies on Hot-Melt Prepregging of PMR-II-50 Polyimide Resin with Graphite Fibers
NASA Technical Reports Server (NTRS)
Shin, E. Eugene; Sutter, James K.; Juhas, John; Veverka, Adrienne; Klans, Ojars; Inghram, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Zoha, John; Bubnick, Jim
2003-01-01
A Second generation PMR (in situ Polymerization of Monomer Reactants) polyimide resin, PMR-II-50, has been considered for high temperature and high stiffness space propulsion composites applications for its improved high temperature performance. As part of composite processing optimization, two commercial prepregging methods: solution vs. hot-melt processes were investigated with M40J fabrics from Toray. In a previous study a systematic chemical, physical, thermal and mechanical characterization of these composites indicated that poor resin-fiber interfacial wetting, especially for the hot-melt process, resulted in poor composite quality. In order to improve the interfacial wetting, optimization of the resin viscosity and process variables were attempted in a commercial hot-melt prepregging line. In addition to presenting the results from the prepreg quality optimization trials, the combined effects of the prepregging method and two different composite cure methods, i.e., hot press vs. autoclave on composite quality and properties are discussed.
Combining Project Management Methods: A Case Study of Distributed Work Practices
NASA Astrophysics Data System (ADS)
Backlund, Per; Lundell, Björn
The increasing complexity of information systems development (ISD) projects calls for improved project management practices. This, together with an endeavour to improve the success rate of ISD projects (Lyytinen and Robey 1999; Cooke-Davies 2002; White and Fortune 2002), has served as a driver for various efforts in process improvement, such as the introduction of new development methods (Fitzgerald 1997; Iivari and Maansaari 1998).
Methods for improved forewarning of condition changes in monitoring physical processes
Hively, Lee M.
2013-04-09
This invention teaches further improvements in methods for forewarning of critical events via phase-space dissimilarity analysis of data from biomedical equipment, mechanical devices, and other physical processes. One improvement involves objective determination of a forewarning threshold (U.sub.FW), together with a failure-onset threshold (U.sub.FAIL) corresponding to a normalized value of a composite measure (C) of dissimilarity; and providing a visual or audible indication to a human observer of failure forewarning and/or failure onset. Another improvement relates to symbolization of the data according to the binary numbers representing the slope between adjacent data points. Another improvement relates to adding measures of dissimilarity based on state-to-state dynamical changes of the system. And still another improvement relates to using a Shannon entropy as the measure of condition change in lieu of a connected or unconnected phase space.
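Two of the claimed improvements, slope-based binary symbolization and a Shannon-entropy condition measure, can be sketched as follows. The word length and the test signals are illustrative assumptions, not the patent's parameters.

```python
import numpy as np

def slope_symbols(x):
    """Symbolize a time series as binary slope signs between adjacent
    samples: 1 where the series rises, 0 where it falls or is flat."""
    return (np.diff(x) > 0).astype(int)

def shannon_entropy(symbols, word_len=3):
    """Shannon entropy (bits) of the distribution of overlapping binary
    words of length `word_len` built from the symbol stream."""
    words = np.array([symbols[i:i + word_len]
                      for i in range(len(symbols) - word_len + 1)])
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

A regular (healthy) signal concentrates its word distribution on a few patterns and has low entropy; a drift toward erratic dynamics spreads the distribution and raises the entropy, which is what makes it usable as a condition-change measure.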
Liao, Wenjie; van der Werf, Hayo M G; Salmon-Monviola, Jordy
2015-09-15
One of the major challenges in environmental life cycle assessment (LCA) of crop production is the nonlinearity between nitrogen (N) fertilizer inputs and on-site N emissions resulting from complex biogeochemical processes. A few studies have addressed this nonlinearity by combining process-based N simulation models with LCA, but none accounted for nitrate (NO3(-)) flows across fields. In this study, we present a new method, TNT2-LCA, that couples the topography-based simulation of nitrogen transfer and transformation (TNT2) model with LCA, and compare the new method with a current LCA method based on a French life cycle inventory database. Application of the two methods to a case study of crop production in a catchment in France showed that, compared to the current method, TNT2-LCA allows delineation of more appropriate temporal limits when developing data for on-site N emissions associated with specific crops in this catchment. It also improves estimates of NO3(-) emissions by better consideration of agricultural practices, soil-climatic conditions, and spatial interactions of NO3(-) flows across fields, and by providing predicted crop yield. The new method presented in this study provides improved LCA of crop production at the catchment scale.
Underwater image enhancement based on the dark channel prior and attenuation compensation
NASA Astrophysics Data System (ADS)
Guo, Qingwen; Xue, Lulu; Tang, Ruichun; Guo, Lingrui
2017-10-01
Aiming at two problems of underwater imaging, fog effect and color cast, an Improved Segmentation Dark Channel Prior (ISDCP) defogging method is proposed to address the fog effects caused by the physical properties of water. Because of the extensive refraction of light during underwater imaging, fog effects lead to image blurring, while color cast is closely related to the differing attenuation of light of different wavelengths traveling in water. The proposed method integrates ISDCP and quantitative histogram stretching techniques into the image enhancement procedure. First, a threshold value is set during refinement of the transmission maps to identify the original mismatching and to apply a differentiated defogging process. Second, a method of judging the propagation distance of light is adopted to obtain the degree of energy attenuation during underwater propagation. Finally, the image histogram is stretched quantitatively in the red, green, and blue channels according to the degree of attenuation in each color channel. The proposed ISDCP method reduces computational complexity and improves defogging efficiency so as to meet real-time requirements. Qualitative and quantitative comparisons on several different underwater scenes reveal that the proposed method significantly improves visibility compared with previous methods.
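The segmentation and thresholding specific to ISDCP are not reproduced here; the sketch below shows the underlying dark channel prior computation (per-pixel channel minimum followed by a local minimum filter) and the standard transmission estimate t = 1 - omega * dark(I / A) that such defogging methods refine. The patch size and omega are conventional, illustrative values.

```python
import numpy as np

def dark_channel(image, patch=7):
    """Dark channel of an (h, w, 3) float image: channel minimum per pixel,
    then a local (patch x patch) minimum filter."""
    mins = image.min(axis=2)
    h, w = mins.shape
    r = patch // 2
    padded = np.pad(mins, r, mode="edge")
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def transmission(image, airlight, omega=0.95, patch=7):
    """Transmission estimate t = 1 - omega * dark_channel(I / A)."""
    return 1.0 - omega * dark_channel(image / airlight, patch)
```

On a haze-free scene the dark channel is close to zero; haze (or underwater scattering) lifts it by (1 - t) * A, which is exactly what the transmission estimate inverts.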
Hussein, Khaled; Türk, Michael; Wahl, Martin A
2007-03-01
The preparation of drug/cyclodextrin complexes is a suitable method to improve the dissolution of poorly soluble drugs. The efficacy of Controlled Particle Deposition (CPD), a newly developed method to prepare these complexes in a single-stage process using supercritical carbon dioxide, is therefore compared with other conventional methods. Ibuprofen/beta-cyclodextrin complexes were prepared with different techniques and characterized using FTIR-ATR spectroscopy, powder X-ray diffractometry (PXRD), differential scanning calorimetry (DSC) and scanning electron microscopy (SEM). In addition, the influences of the processing technique on the drug content (HPLC) and the dissolution behavior were studied. Employing the CPD process resulted in a drug content of 2.8+/-0.22 wt.% in the carrier. The material obtained by CPD showed an improved dissolution rate of ibuprofen at pH 5 compared with the pure drug and its physical mixture with beta-cyclodextrin. In addition, CPD material displays the highest dissolution (93.5+/-2.89% after 75 min) compared to material obtained by co-precipitation (61.3+/-0.52%) or freeze-drying (90.6+/-2.54%). This study presents the CPD technique as a well suited method to prepare a drug/beta-cyclodextrin complex with improved drug dissolution compared to the pure drug and materials obtained by other methods.
NASA Astrophysics Data System (ADS)
Griesbach, Christopher
Methods used to process raw Light Detection and Ranging (LiDAR) data can sometimes obscure the digital signatures indicative of an archaeological site. This thesis explains the negative effects that certain LiDAR data processing procedures can have on the preservation of an archaeological site. This thesis also presents methods for effectively integrating LiDAR with other forms of mapping data in a Geographic Information Systems (GIS) environment in order to improve LiDAR archaeological signatures by examining several pre-Columbian Native American shell middens located in Canaveral National Seashore Park (CANA).
Improving wavelet denoising based on an in-depth analysis of the camera color processing
NASA Astrophysics Data System (ADS)
Seybold, Tamara; Plichta, Mathias; Stechele, Walter
2015-02-01
While denoising is an extensively studied task in signal processing research, most denoising methods are designed and evaluated using readily processed image data, e.g., the well-known Kodak data set, with an additive white Gaussian noise (AWGN) model. This kind of test data does not correspond to today's real-world image data taken with a digital camera. Using such unrealistic data to test, optimize, and compare denoising algorithms may lead to incorrect parameter tuning or suboptimal choices in research on real-time camera denoising algorithms. In this paper we derive a precise analysis of the noise characteristics at the different steps of the color processing. Based on real camera noise measurements and simulation of the processing steps, we obtain a good approximation of the noise characteristics. We further show how this approximation can be used in standard wavelet denoising methods, improving both wavelet hard thresholding and bivariate thresholding based on our noise analysis results. Both the visual quality and objective quality metrics show the advantage of the proposed method. As the method is implemented using look-up tables that are calculated before the denoising step, it has very low computational complexity and can process HD video sequences in real time on an FPGA.
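The thresholding step the paper builds on can be sketched as follows. This is a generic wavelet hard-thresholding illustration with a made-up coefficient-dependent threshold standing in for the paper's measured, look-up-table-based noise model.

```python
def hard_threshold(coeffs, thresh):
    """Wavelet hard thresholding: zero out coefficients whose magnitude
    falls below the threshold, keep the rest unchanged."""
    return [c if abs(c) >= thresh(c) else 0.0 for c in coeffs]

# The paper's key idea is a *signal-dependent* threshold: noise from a real
# camera pipeline is not AWGN, so the threshold should vary with the signal.
# This stand-in function is purely illustrative, not the paper's model.
def adaptive_thresh(c):
    return 1.0 + 0.1 * abs(c)

noisy = [0.4, -3.0, 0.9, 5.0, -0.2]
print(hard_threshold(noisy, adaptive_thresh))  # small coefficients zeroed
```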
Service tough composite structures using the Z-direction reinforcement process
NASA Technical Reports Server (NTRS)
Freitas, Glenn; Magee, Constance; Boyce, Joseph; Bott, Richard
1992-01-01
Foster-Miller has developed a new process to provide through-thickness reinforcement of composite structures. The process reinforces laminates locally or globally on-tool during standard autoclave processing cycles. Initial test results indicate that the method has the potential to significantly reduce delamination in carbon-epoxy. Laminates reinforced with the z-fiber process have demonstrated significant improvements in Mode I fracture toughness and compression strength after impact. Unlike alternative methods, in-plane properties are not adversely affected.
Kang, Edith Y; Fields, Henry W; Kiyak, Asuman; Beck, F Michael; Firestone, Allen R
2009-10-01
Low general and health literacy in the United States means informed consent documents are not well understood by most adults. Methods to improve recall and comprehension of informed consent have not been tested in orthodontics. The purposes of this study were to evaluate (1) recall and comprehension among patients and parents by using the American Association of Orthodontists' (AAO) informed consent form and new forms incorporating improved readability and processability; (2) the association between reading ability, anxiety, and sociodemographic variables and recall and comprehension; and (3) how various domains (treatment, risk, and responsibility) of information are affected by the forms. Three treatment groups (30 patient-parent pairs in each) received an orthodontic case presentation and either the AAO form, an improved readability form (MIC), or an improved readability and processability (pairing audio and visual cues) form (MIC + SS). Structured interviews were transcribed and coded to evaluate recall and comprehension. Significant relationships among patient-related variables and recall and comprehension explained little of the variance. The MIC + SS form significantly improved patient recall and parent recall and comprehension. Recall was better than comprehension, and parents performed better than patients. The MIC + SS form significantly improved patient treatment comprehension and risk recall and parent treatment recall and comprehension. Patients and parents both overestimated their understanding of the materials. Improving the readability of consent materials made little difference, but combining improved readability and processability benefited both patients' recall and parents' recall and comprehension compared with the AAO form.
DOT National Transportation Integrated Search
2009-09-01
Changing At-Risk Behavior (CAB) is a safety process being conducted at Union Pacific's San Antonio Service Unit (SASU) with the aim of improving road and yard safety. CAB is an example of a proactive safety risk-reduction method called Clea...
ERIC Educational Resources Information Center
Educational Products Information Exchange Inst., Stony Brook, NY.
Learner Verification and Revision (LVR) Process of Instructional Materials is an ongoing effort for the improvement of instructional materials based on systematic feedback from learners who have used the materials. This evaluation gives publishers a method of identifying instructional strengths and weaknesses of a product and provides an…
Cognitive Science and Instructional Technology: Improvements in Higher Order Thinking Strategies.
ERIC Educational Resources Information Center
Tennyson, Robert D.
This paper examines the cognitive processes associated with higher-order thinking strategies--i.e., cognitive processes directly associated with the employment of knowledge in the service of problem solving and creativity--in order to more clearly define a prescribed instructional method to improve problem-solving skills. The first section of the…
Evaluating and Improving the Mathematics Teaching-Learning Process through Metacognition
ERIC Educational Resources Information Center
Desoete, Annemie
2007-01-01
Introduction: Despite all the emphasis on metacognition, researchers currently use different techniques to assess metacognition. The purpose of this contribution is to help to clarify some of the paradigms on the evaluation of metacognition. In addition the paper reviews studies aiming to improve the learning process through metacognition. Method:…
A New Parallel Approach for Accelerating the GPU-Based Execution of Edge Detection Algorithms
Emrani, Zahra; Bateni, Soroosh; Rabbani, Hossein
2017-01-01
Real-time image processing is used in a wide variety of applications like those in medical care and industrial processes. This technique in medical care has the ability to display important patient information graphically, which can supplement and help the treatment process. Medical decisions made based on real-time images are more accurate and reliable. According to recent research, graphic processing unit (GPU) programming is a useful method for improving the speed and quality of medical image processing and is one of the ways of achieving real-time image processing. Edge detection is an early stage in most image processing methods for the extraction of features and object segments from a raw image. The Canny method, Sobel and Prewitt filters, and the Roberts' Cross technique are some examples of edge detection algorithms that are widely used in image processing and machine vision. In this work, these algorithms are implemented using the Compute Unified Device Architecture (CUDA), Open Source Computer Vision (OpenCV), and Matrix Laboratory (MATLAB) platforms. An existing parallel method for the Canny approach has been modified to run in a fully parallel manner by replacing the breadth-first search procedure with a parallel method. These algorithms have been compared by testing them on a database of optical coherence tomography images. The comparison of results shows that the proposed implementation of the Canny method on GPU using the CUDA platform improves the speed of execution by 2–100× compared to the central processing unit-based implementation using the OpenCV and MATLAB platforms. PMID:28487831
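As a concrete illustration of the kind of edge-detection kernel these platforms accelerate, here is a minimal, single-pixel Sobel gradient computation in plain Python; the paper's GPU implementations are of course far more elaborate.

```python
# Sobel kernels for horizontal and vertical gradients.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def convolve_at(img, ker, r, c):
    """Apply a 3x3 kernel centered at (r, c); assumes an interior pixel."""
    return sum(ker[i][j] * img[r - 1 + i][c - 1 + j]
               for i in range(3) for j in range(3))

def sobel_magnitude(img, r, c):
    """Gradient magnitude sqrt(gx^2 + gy^2) at one pixel."""
    gx = convolve_at(img, SOBEL_X, r, c)
    gy = convolve_at(img, SOBEL_Y, r, c)
    return (gx * gx + gy * gy) ** 0.5

# Vertical step edge: left half dark (0), right half bright (10).
img = [[0, 0, 10, 10]] * 4
print(sobel_magnitude(img, 1, 1))  # strong response at the edge
```

On a GPU, the per-pixel computation above is exactly what each thread evaluates independently, which is why edge detection parallelizes so well.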
An Improved Heuristic Method for Subgraph Isomorphism Problem
NASA Astrophysics Data System (ADS)
Xiang, Yingzhuo; Han, Jiesi; Xu, Haijiang; Guo, Xin
2017-09-01
This paper focuses on the subgraph isomorphism (SI) problem. We present an improved genetic algorithm, a heuristic method for searching for the optimal solution. The contribution of this paper is a dedicated crossover algorithm and a new fitness function to guide the evolution process. Experiments show that our improved genetic algorithm performs better than other heuristic methods. For a large graph, such as a subgraph of 40 nodes, our algorithm outperforms traditional tree search algorithms. We find that the performance of our improved genetic algorithm does not decrease as the number of nodes in the prototype graphs increases.
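One plausible shape for a fitness function in such a genetic algorithm counts how many pattern edges a candidate node mapping preserves in the target graph; this sketch is illustrative and is not the dedicated fitness function designed in the paper.

```python
def fitness(pattern_edges, target_edges, mapping):
    """Count pattern edges preserved under a candidate node mapping.

    A hedged sketch of a GA fitness for subgraph isomorphism: a mapping
    scoring len(pattern_edges) embeds the pattern in the target, and the
    genetic algorithm evolves mappings toward that maximum.
    """
    target = {frozenset(e) for e in target_edges}
    return sum(frozenset((mapping[u], mapping[v])) in target
               for u, v in pattern_edges)

# Pattern: triangle 0-1-2.  Target: square 0-1-2-3 plus diagonal 0-2.
pattern = [(0, 1), (1, 2), (0, 2)]
target = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(fitness(pattern, target, {0: 0, 1: 1, 2: 2}))  # 3: full embedding
print(fitness(pattern, target, {0: 0, 1: 1, 2: 3}))  # 2: one edge lost
```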
Consistent linguistic fuzzy preference relations method with ranking fuzzy numbers
NASA Astrophysics Data System (ADS)
Ridzuan, Siti Amnah Mohd; Mohamad, Daud; Kamis, Nor Hanimah
2014-12-01
Multi-Criteria Decision Making (MCDM) methods have been developed to help decision makers select the best criteria or alternatives from the options given. One of the well-known methods in MCDM is the Consistent Fuzzy Preference Relation (CFPR) method, which essentially utilizes a pairwise comparison approach. This method was later improved to cater for subjectivity in the data by using fuzzy sets, and is known as the Consistent Linguistic Fuzzy Preference Relations (CLFPR) method. The CLFPR method uses the additive transitivity property in the evaluation of pairwise comparison matrices. However, the calculations involved are lengthy and cumbersome. To overcome this problem, defuzzification methods were introduced by researchers. Nevertheless, defuzzification has a major setback: some information may be lost in the simplification process. In this paper, we propose a CLFPR method that preserves the fuzzy-number form throughout the process. To obtain the desired ordering result, a fuzzy-number ranking method is utilized in the procedure. This improved CLFPR procedure is applied to a case study to verify its effectiveness. The method is useful for solving decision-making problems and can be applied in many areas.
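As an illustration of ranking fuzzy numbers without defuzzifying early, the sketch below ranks alternatives scored as triangular fuzzy numbers by their centroids. The centroid method is one common ranking and is assumed here for illustration, not necessarily the one used in the paper.

```python
def centroid(tfn):
    """Centroid (x-coordinate) of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def rank(fuzzy_scores):
    """Order (name, triangular-fuzzy-score) pairs by descending centroid."""
    return sorted(fuzzy_scores, key=lambda kv: centroid(kv[1]), reverse=True)

# Hypothetical alternatives with triangular fuzzy scores (a, b, c).
scores = [("A1", (2, 3, 4)), ("A2", (1, 4, 6)), ("A3", (0, 1, 2))]
for name, tfn in rank(scores):
    print(name, centroid(tfn))
```

The fuzzy numbers stay in (a, b, c) form until the very last comparison step, which mirrors the paper's goal of avoiding information loss from early defuzzification.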
Lanying Lin; Sheng He; Feng Fu; Xiping Wang
2015-01-01
Wood failure percentage (WFP) is an important index for evaluating the bond strength of plywood. Currently, the method used for detecting WFP is visual inspection, which lacks efficiency. In order to improve it, image processing methods are applied to wood failure detection. The present study used thresholding and K-means clustering algorithms in wood failure detection...
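The K-means step can be illustrated with a one-dimensional, two-cluster version applied to grayscale pixel intensities; this is a simplified stand-in for the study's image-processing pipeline, with invented pixel values.

```python
def kmeans_1d(values, iters=20):
    """Two-cluster Lloyd's algorithm on grayscale intensities.

    A simplified sketch of the K-means idea applied to wood-failure
    images: split pixels into a dark and a light cluster and return the
    two cluster centers, which can then serve as a binarization rule.
    """
    lo, hi = min(values), max(values)      # initialize at the extremes
    for _ in range(iters):
        a = [v for v in values if abs(v - lo) <= abs(v - hi)]
        b = [v for v in values if abs(v - lo) > abs(v - hi)]
        if not b:                          # degenerate: one flat cluster
            return lo, hi
        lo, hi = sum(a) / len(a), sum(b) / len(b)
    return lo, hi

pixels = [12, 15, 10, 200, 210, 190, 14, 205]
print(kmeans_1d(pixels))  # → (12.75, 201.25): dark vs. light centers
```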
Measuring and improving quality of care in an academic medical center.
Blayney, Douglas W
2013-05-01
The Donabedian definition of quality—structure, process, and outcome—provides a useful framework. A relentless focus on measuring process adherence and outcome is critical. Systemic improvements usually require teams to plan and to implement them. The lean or Toyota production system for process improvement is one useful method of organizing work, although different approaches are often necessary at the physician, practice unit, and statewide level. Challenges include scalability of the change (ie, rolling them out across the institution or system), tailoring the information technology tools, and building systems for sustainability.
Improved compliance by BPM-driven workflow automation.
Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin
2014-12-01
Using the methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and the alignment of process diagrams and technical code. First experiences with the business process model and notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. BPMN 2.0, as an automation language able to control every kind of activity or subprocess, addresses complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). This means that, with the BPM standard, a method for sharing process knowledge between laboratories is also available. © 2014 Society for Laboratory Automation and Screening.
Bridging the gap between finance and clinical operations with activity-based cost management.
Storfjell, J L; Jessup, S
1996-12-01
Activity-based cost management (ABCM) is an exciting management tool that links financial information with operations. By determining the costs of specific activities and processes, nurse managers can determine the true costs of services more accurately than with traditional cost accounting methods, and can then target processes for improvement and monitor them for change. The authors describe the ABCM process applied to nursing management situations.
Continuous quality improvement for continuity of care.
Kibbe, D C; Bentz, E; McLaughlin, C P
1993-03-01
Continuous quality improvement (CQI) techniques have been used most frequently in hospital operations such as pharmaceutical ordering, patient admitting, and billing of insurers, and less often to analyze and improve processes that are close to the clinical interaction of physicians and their patients. This paper describes a project in which CQI was implemented in a family practice setting to improve continuity of care. A CQI study team was assembled in response to patients' complaints about not being able to see their regular physician providers when they wanted. Following CQI methods, the performance of the practice in terms of provider continuity was measured. Two "customer" groups were surveyed: physician faculty members were surveyed to assess their attitudes about continuity, and patients were surveyed about their preferences for provider continuity and convenience factors. Process improvements were selected in the critical pathways that influence provider continuity. One year after implementation of selected process improvements, repeat chart audit showed that provider continuity levels had improved from .45 to .74, a 64% increase from 1 year earlier. The project's main accomplishment was to establish the practicality of using CQI methods in a primary care setting to identify a quality issue of value to both providers and patients, in this case, continuity of provider care, and to identify processes that linked the performance of health care delivery procedures with patient expectations.
IT investments can add business value.
Williams, Terry G
2002-05-01
Investment in information technology (IT) is costly, but necessary to enable healthcare organizations to improve their infrastructure and achieve other improvement initiatives. Such an investment is even more costly, however, if the technology does not appropriately enable organizations to perform business processes that help them accomplish their mission of providing safe, high-quality care cost-effectively. Before committing to a costly IT investment, healthcare organizations should implement a decision-making process that can help them choose, implement, and use technology that will provide sustained business value. A seven-step decision-making process that can help healthcare organizations achieve this result involves performing a gap analysis, assessing and aligning organizational goals, establishing distributed accountability, identifying linked organizational-change initiatives, determining measurement methods, establishing appropriate teams to ensure systems are integrated with multidisciplinary improvement methods, and developing a plan to accelerate adoption of the IT product.
NASA Astrophysics Data System (ADS)
Song, Yanpo; Peng, Xiaoqi; Tang, Ying; Hu, Zhikun
2013-07-01
To improve the operation level of the copper converter, an approach to optimal decision-making modeling for the copper matte converting process based on data mining is studied. In view of the characteristics of the process data, such as noise contamination and small sample size, a new robust improved ANN (artificial neural network) modeling method is proposed. Taking into account the application purpose of the decision-making model, three new evaluation indexes, named support, confidence, and relative confidence, are proposed. Using real production data and the methods mentioned above, an optimal decision-making model for the blowing time of the S1 period (the first slag-producing period) is developed. Simulation results show that this model can significantly improve the converting quality of the S1 period, increasing the optimal probability from about 70% to about 85%.
Perera, D P; Andrades, Marie; Wass, Val
2017-12-08
The International Membership Examination (MRCGP[INT]) of the Royal College of General Practitioners UK is a unique collaboration between four South Asian countries with diverse cultures, epidemiology, clinical facilities, and resources. In this setting, good quality assurance is imperative to achieve acceptable standards of inter-rater reliability. This study aims to explore the process of peer feedback for examiner quality assurance with regard to factors affecting the implementation and acceptance of the method. A sequential mixed-methods approach was used, based on focus group discussions with examiners (n = 12) and clinical examination convenors who acted as peer reviewers (n = 4). A questionnaire based on emerging themes and a literature review was then completed by 20 examiners at the subsequent OSCE exam. Qualitative data were analysed using an iterative reflexive process. Quantitative data were integrated by interpretive analysis looking for convergence, complementarity, or dissonance. The qualitative data helped in understanding the issues and informed the development of the questionnaire. The quantitative data allowed for further refining of issues, wider sampling of examiners, and giving voice to different perspectives. Examiners stated specifically that peer feedback gave an opportunity for discussion, standardisation of judgments, and improved discriminatory abilities. Interpersonal dynamics, hierarchy, and the perceived validity of feedback were major factors influencing acceptance of feedback. Examiners desired increased transparency, accountability, and the opportunity for equal partnership within the process. The process was stressful for examiners and reviewers; however, acceptance increased with increasing exposure to receiving feedback.
The process could be refined to improve acceptability through scrupulous attention to training and selection of those giving feedback to improve the perceived validity of feedback and improved reviewer feedback skills to enable better interpersonal dynamics and a more equitable feedback process. It is important to highlight the role of quality assurance and peer feedback as a tool for continuous improvement and maintenance of standards to examiners during training. Examiner quality assurance using peer feedback was generally a successful and accepted process. The findings highlight areas for improvement and guide the path towards a model of feedback that is responsive to examiner views and cultural sensibilities.
NASA Astrophysics Data System (ADS)
Jean, Yoomin; Meyer, Ulrich; Arnold, Daniel; Bentel, Katrin; Jäggi, Adrian
2017-04-01
The monthly global gravity field solutions derived using the measurements from the GRACE (Gravity Recovery and Climate Experiment) satellites have been continuously improved by the processing centers. One of the improvements in the processing method is a more detailed calibration of the on-board accelerometers in the GRACE satellites. The accelerometer data calibration is usually restricted to the scale factors and biases. It has been assumed that the three different axes are perfectly orthogonal in the GRACE science reference frame. Recently, it was shown by Klinger and Mayer-Gürr (2016) that a fully-populated scale matrix considering the non-orthogonality of the axes and the misalignment of the GRACE science reference frame and the GRACE accelerometer frame improves the quality of the C20 coefficient in the GRACE monthly gravity field solutions. We investigate the effect of the more detailed calibration of the GRACE accelerometer data on the C20 coefficient in the case of the AIUB (Astronomical Institute of the University of Bern) processing method using the Celestial Mechanics Approach. We also investigate the effect of the new calibration parameters on the stochastic parameters in the Celestial Mechanics Approach.
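The difference between a diagonal-only calibration (scale factors and biases) and a fully-populated scale matrix can be written as a one-line linear model, a_cal = S · a_raw + b, where the off-diagonal terms of S absorb axis non-orthogonality and frame misalignment. The numbers below are purely illustrative, not actual GRACE calibration parameters.

```python
def calibrate(S, bias, a_raw):
    """Apply a fully-populated 3x3 scale matrix S and bias vector b to a
    raw accelerometer reading: a_cal = S @ a_raw + b.

    With a diagonal S this reduces to the conventional per-axis scale
    factors; the off-diagonal terms model axis cross-coupling.
    """
    return [sum(S[i][j] * a_raw[j] for j in range(3)) + bias[i]
            for i in range(3)]

# Hypothetical calibration parameters for illustration only.
S = [[0.96, 0.01, 0.00],
     [0.02, 0.98, 0.01],
     [0.00, 0.00, 1.01]]
bias = [1e-6, -2e-6, 0.0]
print(calibrate(S, bias, [1.0, 2.0, 3.0]))
```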
Evaluating Process Improvement Courses of Action Through Modeling and Simulation
2017-09-16
changes to a process is time consuming and has the potential to overlook stochastic effects. By modeling a process as a Numerical Design Structure Matrix...
The Data-to-Action Framework: A Rapid Program Improvement Process.
Zakocs, Ronda; Hill, Jessica A; Brown, Pamela; Wheaton, Jocelyn; Freire, Kimberley E
2015-08-01
Although health education programs may benefit from quality improvement methods, scant resources exist to help practitioners apply these methods for program improvement. The purpose of this article is to describe the Data-to-Action framework, a process that guides practitioners through rapid-feedback cycles in order to generate actionable data to improve implementation of ongoing programs. The framework was designed while implementing DELTA PREP, a 3-year project aimed at building the primary prevention capacities of statewide domestic violence coalitions. The authors describe the framework's main steps and provide a case example of a rapid-feedback cycle and several examples of rapid-feedback memos produced during the project period. The authors also discuss implications for health education evaluation and practice. © 2015 Society for Public Health Education.
Designs and Methods in School Improvement Research: A Systematic Review
ERIC Educational Resources Information Center
Feldhoff, Tobias; Radisch, Falk; Bischof, Linda Marie
2016-01-01
Purpose: The purpose of this paper is to focus on challenges faced by longitudinal quantitative analyses of school improvement processes and offers a systematic literature review of current papers that use longitudinal analyses. In this context, the authors assessed designs and methods that are used to analyze the relation between school…
NASA Astrophysics Data System (ADS)
Ai, Yuewei; Shao, Xinyu; Jiang, Ping; Li, Peigen; Liu, Yang; Yue, Chen
2015-11-01
The welded joints of dissimilar materials have been widely used in automotive, ship and space industries. The joint quality is often evaluated by weld seam geometry, microstructures and mechanical properties. To obtain the desired weld seam geometry and improve the quality of welded joints, this paper proposes a process modeling and parameter optimization method to obtain the weld seam with minimum width and desired depth of penetration for laser butt welding of dissimilar materials. During the process, Taguchi experiments are conducted on the laser welding of the low carbon steel (Q235) and stainless steel (SUS301L-HT). The experimental results are used to develop the radial basis function neural network model, and the process parameters are optimized by genetic algorithm. The proposed method is validated by a confirmation experiment. Simultaneously, the microstructures and mechanical properties of the weld seam generated from optimal process parameters are further studied by optical microscopy and tensile strength test. Compared with the unoptimized weld seam, the welding defects are eliminated in the optimized weld seam and the mechanical properties are improved. The results show that the proposed method is effective and reliable for improving the quality of welded joints in practical production.
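The surrogate model class used in such work, a radial basis function network, can be sketched as a forward pass over Gaussian kernels; the centers and weights below are hypothetical, not values fitted to the welding trials.

```python
import math

def rbf_predict(x, centers, weights, gamma=1.0):
    """Forward pass of a radial basis function network: a weighted sum
    of Gaussian kernels around learned centers.  A sketch of the model
    class only; real centers/weights would come from fitting the
    Taguchi experiment data, which is not reproduced here."""
    def kernel(c):
        r2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        return math.exp(-gamma * r2)
    return sum(w * kernel(c) for w, c in zip(weights, centers))

# Inputs could be (laser power, welding speed); output a seam-width estimate.
centers = [(1.0, 2.0), (3.0, 1.0)]
weights = [0.8, 1.2]
print(rbf_predict((1.0, 2.0), centers, weights))
```

A genetic algorithm, as in the paper, would then search this smooth surrogate for process parameters minimizing predicted seam width, instead of running costly welding experiments for every candidate.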
Dellal, George; Peterson, Laura E; Provost, Lloyd; Gloor, Peter A; Fore, David Livingstone; Margolis, Peter A
2018-01-01
Background Our health care system fails to deliver necessary results, and incremental system improvements will not deliver needed change. Learning health systems (LHSs) are seen as a means to accelerate outcomes, improve care delivery, and further clinical research; yet, few such systems exist. We describe the process of codesigning, with all relevant stakeholders, an approach for creating a collaborative chronic care network (C3N), a peer-produced networked LHS. Objective The objective of this study was to report the methods used, with a diverse group of stakeholders, to translate the idea of a C3N to a set of actionable next steps. Methods The setting was ImproveCareNow, an improvement network for pediatric inflammatory bowel disease. In collaboration with patients and families, clinicians, researchers, social scientists, technologists, and designers, C3N leaders used a modified idealized design process to develop a design for a C3N. Results Over 100 people participated in the design process that resulted in (1) an overall concept design for the ImproveCareNow C3N, (2) a logic model for bringing about this system, and (3) 13 potential innovations likely to increase awareness and agency, make it easier to collect and share information, and to enhance collaboration that could be tested collectively to bring about the C3N. Conclusions We demonstrate methods that resulted in a design that has the potential to transform the chronic care system into an LHS. PMID:29472173
NASA Astrophysics Data System (ADS)
Marchukov, E.; Egorov, I.; Popov, G.; Baturin, O.; Goriachkin, E.; Novikova, Y.; Kolmakova, D.
2017-08-01
The article presents an optimization method for improving the working process of an axial compressor of a gas turbine engine. The developed method automatically searches for the best compressor blade geometry using the optimization software IOSO and the CFD software NUMECA Fine/Turbo. Optimization was performed by changing the form of the middle line in three sections of each blade and by shifting three sections of the guide vanes in the circumferential and axial directions. Compressor parameters were calculated for the working and stall points of the performance map at each optimization step. The study was carried out for a seven-stage high-pressure compressor and a three-stage low-pressure compressor. As a result of the optimization, efficiency improved for all investigated compressors.
Standing on the shoulders of giants: improving medical image segmentation via bias correction.
Wang, Hongzhi; Das, Sandhitsu; Pluta, John; Craige, Caryne; Altinay, Murat; Avants, Brian; Weiner, Michael; Mueller, Susanne; Yushkevich, Paul
2010-01-01
We propose a simple strategy to improve automatic medical image segmentation. The key idea is that without deep understanding of a segmentation method, we can still improve its performance by directly calibrating its results with respect to manual segmentation. We formulate the calibration process as a bias correction problem, which is addressed by machine learning using training data. We apply this methodology on three segmentation problems/methods and show significant improvements for all of them.
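The calibration idea can be illustrated in one dimension: learn, from training pairs of automatic and manual measurements, a linear correction that is applied after segmentation. This least-squares sketch is a simplified stand-in for the machine-learning bias correction in the paper; the data are invented.

```python
def fit_bias_correction(auto, manual):
    """Fit manual ~= a * auto + b by ordinary least squares.

    A minimal, one-dimensional stand-in for the paper's idea: rather
    than redesigning a segmentation method, learn from training data how
    its output deviates from manual segmentation and calibrate it.
    """
    n = len(auto)
    mx, my = sum(auto) / n, sum(manual) / n
    sxx = sum((x - mx) ** 2 for x in auto)
    sxy = sum((x - mx) * (y - my) for x, y in zip(auto, manual))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical training pairs: (automatic volume, manual volume).
auto = [10.0, 20.0, 30.0, 40.0]
manual = [13.0, 23.0, 33.0, 43.0]
a, b = fit_bias_correction(auto, manual)
print(a, b)  # → 1.0 3.0: a constant under-segmentation bias of 3 units
```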
Processing of micro-nano bacterial cellulose with hydrolysis method as a reinforcing bioplastic
NASA Astrophysics Data System (ADS)
Maryam, Maryam; Dedy, Rahmad; Yunizurwan, Yunizurwan
2017-01-01
Nanotechnology is the ability to create and manipulate atoms and molecules at the smallest scales. The small size of such materials allows them to exhibit novel and significantly improved physical, chemical, and biological properties, phenomena, and processes. The purpose of this research is to obtain micro-nano bacterial cellulose for reinforcing bioplastics. Bacterial cellulose (BC) was made from coconut water over two weeks, then dried and ground. The bacterial cellulose was purified with 5% NaOH for 6 hours, and micro-nano bacterial cellulose was produced by a hydrolysis method using hydrochloric acid (HCl) at 3.5 M and 55°C for 6 hours. The drying process used a spray dryer. The hydrolysis process yielded bacterial cellulose of ±7 μm. The addition of 2% micro-nano bacterial cellulose as reinforcement in the bioplastic composite can improve its physical characteristics.
Increasing patient safety and efficiency in transfusion therapy using formal process definitions.
Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L
2007-01-01
The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration as well as in automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (ie, at the point of care) has essentially remained unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete. For example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative to these informal descriptions is to use formal process definitions, which can serve as the basis for a variety of analyses because these formal definitions offer precision in the representation of all possible ways that a process can be carried out in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes. The use of such formal definitions to prospectively identify potential error and improve the transfusion process has not previously been reported. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.
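A toy example conveys what a formal process definition adds over a flow chart: once the allowed transitions are stated machine-readably, an observed execution trace can be checked mechanically, including exceptional paths. The step names below are invented for illustration; the paper's process-definition language is far richer than this sketch.

```python
# A toy formal definition of a transfusion process as a transition
# system: the set of allowed step-to-step transitions, including an
# exceptional path that an informal flow chart often omits.
TRANSITIONS = {
    ("order_received", "verify_patient_id"),
    ("verify_patient_id", "crossmatch_check"),
    ("crossmatch_check", "administer"),
    ("crossmatch_check", "report_mismatch"),   # exceptional path
    ("administer", "monitor_patient"),
}

def is_valid_trace(trace):
    """Check whether an observed sequence of steps is permitted by the
    formal definition -- the kind of mechanical analysis that informal
    flow charts and medical algorithms cannot support."""
    return all((a, b) in TRANSITIONS for a, b in zip(trace, trace[1:]))

print(is_valid_trace(["order_received", "verify_patient_id",
                      "crossmatch_check", "administer", "monitor_patient"]))
# Skipping identity verification is flagged as an invalid execution:
print(is_valid_trace(["order_received", "crossmatch_check", "administer"]))
```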
Improving informed consent: Stakeholder views
Anderson, Emily E.; Newman, Susan B.; Matthews, Alicia K.
2017-01-01
Purpose Innovation will be required to improve the informed consent process in research. We aimed to obtain input from key stakeholders—research participants and those responsible for obtaining informed consent—to inform potential development of a multimedia informed consent “app.” Methods This descriptive study used a mixed-methods approach. Five 90-minute focus groups were conducted with volunteer samples of former research participants and researchers/research staff responsible for obtaining informed consent. Participants also completed a brief survey that measured background information and knowledge and attitudes regarding research and the use of technology. Established qualitative methods were used to conduct the focus groups and data analysis. Results We conducted five focus groups with 41 total participants: three groups with former research participants (total n = 22), and two groups with researchers and research coordinators (total n = 19). Overall, individuals who had previously participated in research had positive views regarding their experiences. However, further discussion elicited that the informed consent process often did not meet its intended objectives. Findings from both groups are presented according to three primary themes: content of consent forms, experience of the informed consent process, and the potential of technology to improve the informed consent process. A fourth theme, need for lay input on informed consent, emerged from the researcher groups. Conclusions Our findings add to previous research that suggests that the use of interactive technology has the potential to improve the process of informed consent. However, our focus-group findings provide additional insight that technology cannot replace the human connection that is central to the informed consent process. 
More research that incorporates the views of key stakeholders is needed to ensure that multimedia consent processes do not repeat the mistakes of paper-based consent forms. PMID:28949896
Resist process optimization for further defect reduction
NASA Astrophysics Data System (ADS)
Tanaka, Keiichi; Iseki, Tomohiro; Marumoto, Hiroshi; Takayanagi, Koji; Yoshida, Yuichi; Uemura, Ryouichi; Yoshihara, Kosuke
2012-03-01
Defect reduction has become one of the most important technical challenges in device mass-production. Knowing that resist processing on a clean track strongly impacts defect formation in many cases, we have been trying to improve the track process to enhance customer yield. For example, residual-type defects and pattern collapse are strongly related to process parameters in the developer, and we have reported new develop and rinse methods in previous papers. Also, we have reported an optimization method for filtration conditions to reduce bridge-type defects, which are mainly caused by foreign substances such as gels in resist. Even though we have contributed to reducing resist-caused defects in past studies, defect reduction requirements continue to be very demanding. In this paper, we will introduce further process improvements in terms of resist defect reduction, including the latest experimental data.
NASA Technical Reports Server (NTRS)
Ankenman, Bruce; Ermer, Donald; Clum, James A.
1994-01-01
Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other course was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done by both survey questionnaires and inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and to provide training in quality for improved engineering practice.
Improving performances of suboptimal greedy iterative biclustering heuristics via localization.
Erten, Cesim; Sözdinler, Melih
2010-10-15
Biclustering gene expression data is the problem of extracting submatrices of genes and conditions exhibiting significant correlation across both the rows and the columns of a data matrix of expression values. Even the simplest versions of the problem are computationally hard. Most of the proposed solutions therefore employ greedy iterative heuristics that locally optimize a suitably assigned scoring function. We provide a fast and simple pre-processing algorithm called localization that reorders the rows and columns of the input data matrix in such a way as to group correlated entries in small local neighborhoods within the matrix. The proposed localization algorithm takes its roots from effective use of graph-theoretical methods applied to problems exhibiting a similar structure to that of biclustering. In order to evaluate the effectiveness of the localization pre-processing algorithm, we focus on three representative greedy iterative heuristic methods. We show how the localization pre-processing can be incorporated into each representative algorithm to improve biclustering performance. Furthermore, we propose a simple biclustering algorithm, Random Extraction After Localization (REAL), that randomly extracts submatrices from the localization pre-processed data matrix, eliminates those with low similarity scores, and provides the rest as correlated structures representing biclusters. We compare the proposed localization pre-processing with another pre-processing alternative, non-negative matrix factorization. We show that our fast and simple localization procedure provides similar or even better results than the computationally heavy matrix factorization pre-processing with regard to H-value tests.
We next demonstrate that the performances of the three representative greedy iterative heuristic methods improve with localization pre-processing when biological correlations in the form of functional enrichment and PPI verification constitute the main performance criteria. The fact that the localization-based random extraction method, REAL, performs better than the representative greedy heuristic methods under the same criteria also confirms the effectiveness of the suggested pre-processing method. Supplementary material, including code implementations in the LEDA C++ library, experimental data, and the results, is available at http://code.google.com/p/biclustering/ cesim@khas.edu.tr; melihsozdinler@boun.edu.tr Supplementary data are available at Bioinformatics online.
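The reordering idea behind localization can be illustrated with a deliberately naive sketch. The sort keys below are invented for illustration only; the published algorithm uses graph-theoretic ordering, not this shortcut.

```python
def localize(matrix):
    """Toy 'localization' pre-processing: reorder rows and columns so
    correlated entries end up in small local neighborhoods.  Rows are
    sorted by their entry in the first column and columns by their
    entry in the first row -- a naive stand-in for the paper's
    graph-theoretic ordering."""
    row_order = sorted(range(len(matrix)), key=lambda i: -matrix[i][0])
    col_order = sorted(range(len(matrix[0])), key=lambda j: -matrix[0][j])
    return [[matrix[i][j] for j in col_order] for i in row_order]

# A scrambled checkerboard hiding two 2x2 biclusters.
data = [[5, 1, 5, 1],
        [1, 5, 1, 5],
        [5, 1, 5, 1],
        [1, 5, 1, 5]]
grouped = localize(data)
# After reordering, each bicluster occupies a contiguous block.
```

After localization, a downstream heuristic (or random extraction, as in REAL) only needs to scan local neighborhoods instead of the whole scrambled matrix.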
Transitions of Care in Medical Education: A Compilation of Effective Teaching Methods.
McBryde, Meagan; Vandiver, Jeremy W; Onysko, Mary
2016-04-01
Transitioning patients safely from the inpatient environment back to an outpatient environment is an important component of health care, and multidisciplinary cooperation and formal processes are necessary to accomplish this task. This Transitions of Care (TOC) process is constantly being shaped in health care systems to improve patient safety, outcomes, and satisfaction. While there are many models that have been published on methods to improve the TOC process systematically, there is no clear roadmap for educators to teach TOC concepts to providers in training. This article reviews published data to highlight specific methods shown to effectively instill these concepts and values into medical students and residents. Formal, evidence-based TOC curricula should be developed within medical schools and residency programs. TOC education should ideally begin early in the education process, and its importance should be reiterated throughout the curriculum longitudinally. Curricula should have a specific focus on recognition of common causes of hospital readmissions, such as medication errors, lack of adequate follow-up visits, and social/economic barriers. Didactic lectures, case-based workshops, role-playing activities, home visits, interprofessional activities, and resident-led quality improvement projects have all been shown to be effective ways to teach TOC concepts.
Modified Confidence Intervals for the Mean of an Autoregressive Process.
1985-08-01
There are several standard methods of setting confidence intervals in simulations, including the regenerative method, batch means, and time series methods. We will focus on improved confidence intervals for the mean of an autoregressive process, and as such our…
Improving Mathematics Achievement of Indonesian 5th Grade Students through Guided Discovery Learning
ERIC Educational Resources Information Center
Yurniwati; Hanum, Latipa
2017-01-01
This research aims to find information about the improvement of mathematics achievement of grade five students through guided discovery learning. The research method is classroom action research using the Kemmis and Taggart model, consisting of three cycles. The data used in this study are the learning process and learning results. Learning process data is…
When Kids Act Out: A Comparison of Embodied Methods to Improve Children's Memory for a Story
ERIC Educational Resources Information Center
Berenhaus, Molly; Oakhill, Jane; Rusted, Jennifer
2015-01-01
Over the last decade, embodied cognition, the idea that sensorimotor processes facilitate higher cognitive processes, has proven useful for improving children's memory for a story. In order to compare the benefits of two embodiment techniques, active experiencing (AE) and indexing, for children's memory for a story, we compared the immediate…
NASA Technical Reports Server (NTRS)
Jensen, Brian J. (Inventor)
2000-01-01
Polyimide copolymers were obtained containing 1,3-bis(3-aminophenoxy)benzene (APB) and other diamines and dianhydrides and terminating with the appropriate amount of a non-reactive endcapper, such as phthalic anhydride. Homopolymers containing only other diamines and dianhydrides which are not processable under conditions described previously can be made processable by incorporating various amounts of APB, depending on the chemical structures of the diamines and dianhydrides used. Polyimides that are more rigid in nature require more APB to impart processability than polyimides that are less rigid in nature. The copolymers that result from using APB to enhance processability have a unique combination of properties including excellent thin film properties, low pressure processing (200 psi and below), improved toughness, improved solvent resistance, improved adhesive properties, improved composite mechanical properties, long term melt stability (several hours at 390 C), and lower melt viscosities.
An improved approximate-Bayesian model-choice method for estimating shared evolutionary history
2014-01-01
Background To understand biological diversification, it is important to account for large-scale processes that affect the evolutionary history of groups of co-distributed populations of organisms. Such events predict temporally clustered divergence times, a pattern that can be estimated using genetic data from co-distributed species. I introduce a new approximate-Bayesian method for comparative phylogeographical model-choice that estimates the temporal distribution of divergences across taxa from multi-locus DNA sequence data. The model is an extension of that implemented in msBayes. Results By reparameterizing the model, introducing more flexible priors on demographic and divergence-time parameters, and implementing a non-parametric Dirichlet-process prior over divergence models, I improved the robustness, accuracy, and power of the method for estimating shared evolutionary history across taxa. Conclusions The results demonstrate that the improved performance of the new method is due to (1) more appropriate priors on divergence-time and demographic parameters that avoid prohibitively small marginal likelihoods for models with more divergence events, and (2) the Dirichlet process providing a flexible prior on divergence histories that does not strongly disfavor models with intermediate numbers of divergence events. The new method yields more robust estimates of posterior uncertainty, and thus greatly reduces the tendency to incorrectly estimate models of shared evolutionary history with strong support. PMID:24992937
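The Dirichlet-process prior over divergence models can be pictured via the equivalent Chinese restaurant process: each taxon pair either joins an existing divergence-time cluster or opens a new one. A minimal sketch follows; the concentration parameter and random seed are illustrative assumptions, not msBayes settings.

```python
import random

def crp_partition(n_pairs, alpha, rng):
    """Chinese restaurant process: assign n_pairs taxon pairs to
    divergence-time clusters.  Pair k joins an existing cluster c with
    probability size(c)/(k+alpha), or opens a new cluster with
    probability alpha/(k+alpha)."""
    clusters = []     # current cluster sizes
    assignments = []  # cluster index per taxon pair
    for k in range(n_pairs):
        r = rng.random() * (k + alpha)
        for c, size in enumerate(clusters):
            if r < size:          # join existing cluster c
                clusters[c] += 1
                assignments.append(c)
                break
            r -= size
        else:                     # open a new cluster
            assignments.append(len(clusters))
            clusters.append(1)
    return assignments

rng = random.Random(0)
model = crp_partition(8, alpha=1.5, rng=rng)
# 'model' groups the 8 taxon pairs into shared divergence events;
# larger alpha favors more distinct divergence times.
```

Because the number of clusters is random rather than fixed, models with intermediate numbers of divergence events receive non-negligible prior mass, which is the property the abstract highlights.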
Monge, Paul
2006-01-01
Activity-based methods serve as a dynamic process that has allowed many other industries to reduce and control their costs, increase productivity, and streamline their processes while improving product quality and service. The method could serve the healthcare industry in an equally beneficial way. Activity-based methods encompass both activity-based costing (ABC) and activity-based management (ABM). ABC is a cost management approach that links resource consumption to the activities that an enterprise performs, and then assigns those activities and their associated costs to customers, products, or product lines. ABM uses the resource assignments derived in ABC so that operations managers can improve their departmental processes and workflows. There are three fundamental problems with traditional cost systems. First, traditional systems fail to reflect the underlying diversity of work taking place within an enterprise. Second, they use allocations that are, for the most part, arbitrary; single-step allocations fail to reflect the real work, that is, the activities being performed and the associated resources actually consumed. Third, they provide only a cost number that, standing alone, does not offer any guidance on how to improve performance by lowering cost or enhancing throughput.
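The two-stage assignment ABC describes, from resource pools to activities and from activities to cost objects, can be sketched with hypothetical numbers. Every name and rate below is invented for illustration; no real clinic data is implied.

```python
def abc_cost(activity_pools, driver_volumes, consumption):
    """Two-stage activity-based costing sketch.
    Stage 1 (assumed done): costs are already pooled per activity.
    Stage 2: rate per driver unit = pool / total driver volume;
             object cost = sum over activities of rate * units consumed."""
    rates = {a: activity_pools[a] / driver_volumes[a] for a in activity_pools}
    return {obj: sum(rates[a] * units for a, units in acts.items())
            for obj, acts in consumption.items()}

# Hypothetical clinic with two activities and two service lines.
pools = {"registration": 5000.0, "imaging": 20000.0}   # cost per activity pool
volumes = {"registration": 1000, "imaging": 400}       # driver units (visits, scans)
usage = {"clinic_A": {"registration": 600, "imaging": 100},
         "clinic_B": {"registration": 400, "imaging": 300}}
costs = abc_cost(pools, volumes, usage)
```

Unlike a single-step allocation (e.g., splitting all overhead by visit count alone), the activity rates make clinic_B's heavy imaging use visible in its cost, which is exactly the diversity of work the abstract says traditional systems hide.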
Zhou, Yanli; Faber, Tracy L.; Patel, Zenic; Folks, Russell D.; Cheung, Alice A.; Garcia, Ernest V.; Soman, Prem; Li, Dianfu; Cao, Kejiang; Chen, Ji
2013-01-01
Objective Left ventricular (LV) function and dyssynchrony parameters measured from serial gated single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) using blinded processing had a poorer repeatability than when manual side-by-side processing was used. The objective of this study was to validate whether an automatic alignment tool can reduce the variability of LV function and dyssynchrony parameters in serial gated SPECT MPI. Methods Thirty patients who had undergone serial gated SPECT MPI were prospectively enrolled in this study. Thirty minutes after the first acquisition, each patient was repositioned and a gated SPECT MPI image was reacquired. The two data sets were first processed blinded from each other by the same technologist in different weeks. These processed data were then realigned by the automatic tool, and manual side-by-side processing was carried out. All processing methods used standard iterative reconstruction and Butterworth filtering. The Emory Cardiac Toolbox was used to measure the LV function and dyssynchrony parameters. Results The automatic tool failed in one patient, who had a large, severe scar in the inferobasal wall. In the remaining 29 patients, the repeatability of the LV function and dyssynchrony parameters after automatic alignment was significantly improved from blinded processing and was comparable to manual side-by-side processing. Conclusion The automatic alignment tool can be an alternative method to manual side-by-side processing to improve the repeatability of LV function and dyssynchrony measurements by serial gated SPECT MPI. PMID:23211996
The applicability of Lean and Six Sigma techniques to clinical and translational research.
Schweikhart, Sharon A; Dembe, Allard E
2009-10-01
Lean and Six Sigma are business management strategies commonly used in production industries to improve process efficiency and quality. During the past decade, these process improvement techniques increasingly have been applied outside the manufacturing sector, for example, in health care and in software development. This article concerns the potential use of Lean and Six Sigma in improving the processes involved in clinical and translational research. Improving quality, avoiding delays and errors, and speeding up the time to implementation of biomedical discoveries are prime objectives of the National Institutes of Health (NIH) Roadmap for Medical Research and the NIH's Clinical and Translational Science Award program. This article presents a description of the main principles, practices, and methods used in Lean and Six Sigma. Available literature involving applications of Lean and Six Sigma to health care, laboratory science, and clinical and translational research is reviewed. Specific issues concerning the use of these techniques in different phases of translational research are identified. Examples of Lean and Six Sigma applications that are being planned at a current Clinical and Translational Science Award site are provided, which could potentially be replicated elsewhere. We describe how different process improvement approaches are best adapted for particular translational research phases. Lean and Six Sigma process improvement methods are well suited to help achieve NIH's goal of making clinical and translational research more efficient and cost-effective, enhancing the quality of the research, and facilitating the successful adoption of biomedical research findings into practice.
Biotechnology in Food Production and Processing
NASA Astrophysics Data System (ADS)
Knorr, Dietrich; Sinskey, Anthony J.
1985-09-01
The food processing industry is the oldest and largest industry using biotechnological processes. Further development of food products and processes based on biotechnology depends upon the improvement of existing processes, such as fermentation, immobilized biocatalyst technology, and production of additives and processing aids, as well as the development of new opportunities for food biotechnology. Improvements are needed in the characterization, safety, and quality control of food materials, in processing methods, in waste conversion and utilization processes, and in currently used food microorganism and tissue culture systems. Also needed are fundamental studies of the structure-function relationship of food materials and of the cell physiology and biochemistry of raw materials.
NASA Astrophysics Data System (ADS)
Shen, Yan; Ge, Jin-ming; Zhang, Guo-qing; Yu, Wen-bin; Liu, Rui-tong; Fan, Wei; Yang, Ying-xuan
2018-01-01
This paper explores the problem of signal processing in optical current transformers (OCTs). Based on the noise characteristics of OCTs, such as overlapping signals, noise frequency bands, low signal-to-noise ratios, and difficulties in acquiring statistical features of noise power, an improved standard Kalman filtering algorithm was proposed for direct current (DC) signal processing. The state-space model of the OCT DC measurement system is first established, and then mixed noise can be processed by adding mixed noise into measurement and state parameters. According to the minimum mean squared error criterion, state predictions and update equations of the improved Kalman algorithm could be deduced based on the established model. An improved central difference Kalman filter was proposed for alternating current (AC) signal processing, which improved the sampling strategy and noise processing of colored noise. Real-time estimation and correction of noise were achieved by designing AC and DC noise recursive filters. Experimental results show that the improved signal processing algorithms had a good filtering effect on the AC and DC signals with mixed noise of OCT. Furthermore, the proposed algorithm was able to achieve real-time correction of noise during the OCT filtering process.
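The standard Kalman recursion that the improved algorithm builds on can be sketched for a noisy DC level as follows. This is a textbook scalar filter with illustrative noise variances, not the authors' improved mixed-noise algorithm.

```python
def kalman_dc(measurements, q=1e-5, r=0.1):
    """Scalar Kalman filter for a constant (DC) level observed in noise.
    State model:  x_k = x_{k-1} + w,  w ~ N(0, q)
    Measurement:  z_k = x_k + v,      v ~ N(0, r)
    Returns the sequence of state estimates."""
    x, p = measurements[0], 1.0   # initial estimate and covariance
    estimates = []
    for z in measurements:
        p = p + q                 # predict: covariance grows by process noise
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update estimate toward the measurement
        p = (1.0 - k) * p         # update covariance
        estimates.append(x)
    return estimates

# Synthetic noisy readings around a true DC value of 2.0.
readings = [2.3, 1.8, 2.1, 1.9, 2.2, 1.95, 2.05, 1.9, 2.1, 2.0]
filtered = kalman_dc(readings)
```

The improved algorithm in the paper additionally folds mixed noise into the state and measurement models and estimates its statistics recursively; the predict/update skeleton above is the part both share.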
Sepehrband, Farshid; Choupan, Jeiran; Caruyer, Emmanuel; Kurniawan, Nyoman D; Gal, Yaniv; Tieng, Quang M; McMahon, Katie L; Vegh, Viktor; Reutens, David C; Yang, Zhengyi
2014-01-01
We describe and evaluate a pre-processing method based on a periodic spiral sampling of diffusion-gradient directions for high angular resolution diffusion magnetic resonance imaging. Our pre-processing method incorporates prior knowledge about the acquired diffusion-weighted signal, facilitating noise reduction. Periodic spiral sampling of gradient direction encodings results in an acquired signal in each voxel that is pseudo-periodic with characteristics that allow separation of low-frequency signal from high frequency noise. Consequently, it enhances local reconstruction of the orientation distribution function used to define fiber tracks in the brain. Denoising with periodic spiral sampling was tested using synthetic data and in vivo human brain images. The level of improvement in signal-to-noise ratio and in the accuracy of local reconstruction of fiber tracks was significantly improved using our method.
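The property being exploited, that a pseudo-periodic acquired signal lets slowly varying content be separated from high-frequency noise, can be illustrated with a plain moving-average low-pass filter. This is an illustrative stand-in, not the authors' spiral-sampling pre-processing.

```python
import math

def moving_average(signal, window):
    """Crude low-pass filter: replace each sample with the mean of a
    centered window, attenuating high-frequency noise while keeping
    the slowly varying component."""
    half = window // 2
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

# Slowly varying component plus deterministic high-frequency 'noise'.
clean = [math.sin(2 * math.pi * i / 64) for i in range(64)]
noisy = [c + 0.3 * (-1) ** i for i, c in enumerate(clean)]
smooth = moving_average(noisy, window=5)
```

The spiral-sampling scheme effectively arranges the diffusion-weighted signal so that this kind of frequency separation becomes possible per voxel; the filter itself is the easy part.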
NASA Astrophysics Data System (ADS)
Roger-Estrade, Jean; Boizard, Hubert; Peigné, Josephine; Sasal, Maria Carolina; Guimaraes, Rachel; Piron, Denis; Tomis, Vincent; Vian, Jean-François; Cadoux, Stephane; Ralisch, Ricardo; Filho, Tavares; Heddadj, Djilali; de Battista, Juan; Duparque, Annie
2016-04-01
In France, agronomists have studied the effects of cropping systems on soil structure, using a field method based on a visual description of soil structure. The "profil cultural" method (Manichon and Gautronneau, 1987) has been designed to perform a field diagnostic of the effects of tillage and compaction on soil structure dynamics. This method is of great use to agronomists improving crop management for a better preservation of soil structure. However, this method was developed and mainly used in conventional tillage systems, with ploughing. As several forms of reduced, minimum and no tillage systems are expanding in many parts of the world, it is necessary to re-evaluate the ability of this method to describe and interpret soil macrostructure in unploughed situations. In unploughed fields, soil structure dynamics of untilled layers is mainly driven by compaction and regeneration by natural agents (climatic conditions, root growth and macrofauna) and it is of major importance to evaluate the importance of these natural processes on soil structure regeneration. These concerns have led us to adapt the standard method and to propose amendments based on a series of field observations and experimental work in different situations of cropping systems, soil types and climatic conditions. We improved the description of crack type and we introduced an index of biological activity, based on the visual examination of clods. To test the improved method, a comparison with the reference method was carried out and the ability of the "profil cultural" method to make a diagnosis was tested on five experiments in France, Brazil and Argentina. Using the improved method, the impact of cropping systems on soil functioning was better assessed when natural processes were integrated into the description.
A continuous quality improvement project to improve the quality of cervical Papanicolaou smears.
Burkman, R T; Ward, R; Balchandani, K; Kini, S
1994-09-01
To improve the quality of cervical Papanicolaou smears by continuous quality improvement techniques. The study used a Papanicolaou smear data base of over 200,000 specimens collected between June 1988 and December 1992. A team approach employing techniques such as process flow-charting, cause and effect diagrams, run charts, and a randomized trial of collection methods was used to evaluate potential causes of Papanicolaou smear reports with the notation "inadequate" or "less than optimal" due to too few or absent endocervical cells. Once a key process variable (method of collection) was identified, the proportion of Papanicolaou smears with inadequate or absent endocervical cells was determined before and after employment of a collection technique using a spatula and Cytobrush. We measured the rate of less than optimal Papanicolaou smears due to too few or absent endocervical cells. Before implementing the new collection technique fully by June 1990, the overall rate of less than optimal cervical Papanicolaou smears ranged from 20-25%; by December 1993, it had stabilized at about 10%. Continuous quality improvement can be used successfully to study a clinical process and implement change that will lead to improvement.
Improvement of Functional Properties by Severe Plastic Deformation on Parts of Titanium Biomaterials
NASA Astrophysics Data System (ADS)
Czán, Andrej; Babík, Ondrej; Daniš, Igor; Martikáň, Pavol; Czánová, Tatiana
2017-12-01
The main requirement for materials used in invasive implantology is biocompatibility with tissue, but demands on the functional properties of these materials are also increasing constantly. One problem is that the functional properties of biocompatible materials cannot be improved by changing the percentages of their chemical elements, so other innovative methods must be found, such as mechanical action in the form of severe plastic deformation. This paper focuses on severe plastic deformation methods such as Equal Channel Angular Pressing (ECAP), by which rods with record strength properties were obtained. Current studies of the deformation process, in which tri-axial compressive stress acts on the workpiece at a high deformation rate, show effects similar to those obtained with other methods but at lower stress levels. Hydrostatic extrusion (HE) is applied to refine the structure of commercially pure titanium down to the nano-scale; experiments showed the ability to reduce the grain size below 100 nm. Because severe plastic deformation significantly changes the performance of titanium materials, it is necessary to identify the processability of the materials, including characterization of the created surfaces and monitoring of surface integrity; the experimental results show that SPD technologies can be applied to biomaterials.
Methods for Improving Information from ’Undesigned’ Human Factors Experiments.
Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction
Demonstration of improved seismic source inversion method of tele-seismic body wave
NASA Astrophysics Data System (ADS)
Yagi, Y.; Okuwaki, R.
2017-12-01
Seismic rupture inversion of tele-seismic body waves has been widely applied to studies of large earthquakes. In general, tele-seismic body waves contain information on the overall rupture process of a large earthquake, but have been considered inappropriate for analyzing the detailed rupture process of an M6-7 class earthquake. Recently, the quality and quantity of tele-seismic data and the inversion method have been greatly improved. The improved data and method enable us to study the detailed rupture process of an M6-7 class earthquake even using tele-seismic body waves alone. In this study, we demonstrate the ability of the improved data and method through analyses of the 2016 Rieti, Italy earthquake (Mw 6.2) and the 2016 Kumamoto, Japan earthquake (Mw 7.0), both of which have been well investigated using InSAR data sets and field observations. We assumed rupture occurring on a single fault plane inferred from the moment tensor solutions and the aftershock distribution. We constructed spatiotemporal discretized slip-rate functions with patches arranged as closely as possible. We performed inversions using several fault models and found that the spatiotemporal location of the large slip-rate area was robust. In the 2016 Kumamoto earthquake, the slip-rate distribution shows that the rupture propagated to the southwest during the first 5 s; at 5 s after the origin time, the main rupture started to propagate toward the northeast. The first and second episodes correspond to rupture propagation along the Hinagu fault and the Futagawa fault, respectively. In the 2016 Rieti earthquake, the slip-rate distribution shows that the rupture propagated in the up-dip direction during the first 2 s and then toward the northwest. From both analyses, we propose that the spatiotemporal slip-rate distribution estimated by the improved inversion method of tele-seismic body waves carries enough information to study the detailed rupture process of M6-7 class earthquakes.
McIntosh, Heather M; Calvert, Julie; Macpherson, Karen J; Thompson, Lorna
2016-06-01
Rapid review has become widely adopted by health technology assessment agencies in response to demand for evidence-based information to support imperative decisions. Concern about the credibility of rapid reviews and the reliability of their findings has prompted a call for wider publication of their methods. In publishing this overview of the accredited rapid review process developed by Healthcare Improvement Scotland, we aim to raise awareness of our methods and advance the discourse on best practice. Healthcare Improvement Scotland produces rapid reviews called evidence notes using a process that has achieved external accreditation through the National Institute for Health and Care Excellence. Key components include a structured approach to topic selection, initial scoping, considered stakeholder involvement, streamlined systematic review, internal quality assurance, external peer review and updating. The process was introduced in 2010 and continues to be refined over time in response to user feedback and operational experience. Decision-makers value the responsiveness of the process and perceive it as being a credible source of unbiased evidence-based information supporting advice for NHSScotland. Many agencies undertaking rapid reviews are striving to balance efficiency with methodological rigour. We agree that there is a need for methodological guidance and that it should be informed by better understanding of current approaches and the consequences of different approaches to streamlining systematic review methods. Greater transparency in the reporting of rapid review methods is essential to enable that to happen.
A simple method for processing data with least square method
NASA Astrophysics Data System (ADS)
Wang, Chunyan; Qi, Liqun; Chen, Yongxiang; Pang, Guangning
2017-08-01
The least square method is widely used in data processing and error estimation. This mathematical method has become an essential technique for parameter estimation, data processing, regression analysis and experimental data fitting, and has become a criterion tool for statistical inference. In measurement data analysis, data following complex distributions are usually treated on the least square principle, i.e., matrices are used to solve for the final estimate and to improve its accuracy. In this paper, a new method for this solution is presented that is based on algebraic computation and is relatively straightforward and easy to understand. The practicability of the method is illustrated by a concrete example.
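For the common case of fitting a straight line, the least square principle reduces to a 2x2 system of normal equations that can be solved directly by algebra, in the spirit of the paper's approach. This is a generic textbook sketch, not the paper's specific derivation.

```python
def fit_line(xs, ys):
    """Least squares fit of y = a*x + b by solving the normal equations
    algebraically:
        a = (n*Sxy - Sx*Sy) / (n*Sxx - Sx**2)
        b = (Sy - a*Sx) / n
    where Sx, Sy, Sxx, Sxy are the usual sums over the data."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Measurements lying exactly on y = 2x + 1 recover the true parameters.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

For noisy data the same formulas return the slope and intercept minimizing the sum of squared residuals; the matrix formulation mentioned in the abstract generalizes this to more parameters.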
Hayashi, Norio; Miyati, Tosiaki; Takanaga, Masako; Ohno, Naoki; Hamaguchi, Takashi; Kozaka, Kazuto; Sanada, Shigeru; Yamamoto, Tomoyuki; Matsui, Osamu
2011-01-01
In parallel magnetic resonance imaging (MRI), coil sensitivity falls significantly in the direction perpendicular to the arrangement of the phased array coil. Moreover, in 3.0 tesla (3T) abdominal MRI, image quality is degraded by changes in relaxation times, a stronger magnetic susceptibility effect, and other factors; at the high resonant frequency of 3T, the signal from the depths (central part) of the trunk is reduced. SCIC, a conventional sensitivity correction process, corrects inadequately: edges are over-emphasized while the central part is insufficiently corrected. We therefore investigated a nonuniformity correction for the sensitivity of 3T abdominal MR images based on a Gaussian distribution. The correction processing consisted of the following steps: 1) the center of gravity of the human-body region in the abdominal MR image was calculated; 2) a correction coefficient map was created around the center of gravity using a Gaussian distribution; and 3) a sensitivity-corrected image was created from the correction coefficient map and the original image. With the Gaussian correction processing, the uniformity calculated using the NEMA method improved significantly compared with the original image of a phantom, and in a visual evaluation by radiologists the uniformity also improved significantly. Because it improves the homogeneity of abdominal images acquired at 3T MRI, the Gaussian correction processing is considered a very useful technique.
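The three processing steps described above can be sketched as follows. This is a hedged illustration, not the paper's implementation: the sensitivity model (a Gaussian signal dip centred at the centre of gravity), the sigma value, and the test image are all assumptions.

```python
import numpy as np

def gaussian_sensitivity_correction(img, sigma=32.0, depth_loss=0.5):
    ys, xs = np.nonzero(img > 0)                  # crude body-region mask
    cy, cx = ys.mean(), xs.mean()                 # step 1: centre of gravity
    yy, xx = np.indices(img.shape)
    r2 = (yy - cy) ** 2 + (xx - cx) ** 2
    # Step 2: model depth-dependent signal loss as a Gaussian dip centred
    # at the centre of gravity; inverting it gives the coefficient map.
    sens = 1.0 - depth_loss * np.exp(-r2 / (2.0 * sigma ** 2))
    return img / sens                             # step 3: corrected image

img = np.ones((64, 64))
img[24:40, 24:40] *= 0.6                          # simulated central signal drop
out = gaussian_sensitivity_correction(img)
print(out[32, 32] > img[32, 32])                  # True: the centre is boosted
```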
Improved Discrete Approximation of Laplacian of Gaussian
NASA Technical Reports Server (NTRS)
Shuler, Robert L., Jr.
2004-01-01
An improved method of computing a discrete approximation of the Laplacian of a Gaussian convolution of an image has been devised. The primary advantage of the method is that, without substantially degrading the accuracy of the end result, it reduces the amount of information that must be processed and thus reduces the amount of circuitry needed to perform the Laplacian-of-Gaussian (LOG) operation. Some background information is necessary to place the method in context. The method is intended for application to the LOG part of a process of real-time digital filtering of digitized video data that represent brightnesses in pixels in a square array. The particular filtering process of interest is one that converts pixel brightnesses to binary form, thereby reducing the amount of information that must be processed in subsequent correlation processing (e.g., correlations between images in a stereoscopic pair for determining distances or correlations between successive frames of the same image for detecting motions). The Laplacian is often included in the filtering process because it emphasizes edges and textures, while the Gaussian is often included because it smooths out noise that might not be consistent between left and right images or between successive frames of the same image.
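A minimal sketch of the operation being approximated — LOG filtering followed by the binarization step the abstract describes — is shown below. The kernel size, sigma, and sign-based binarization are illustrative choices, not the improved discrete approximation the report develops.

```python
import numpy as np

def log_kernel(size=9, sigma=1.4):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx ** 2 + yy ** 2
    # Standard Laplacian-of-Gaussian profile, adjusted to sum to zero so
    # that flat image regions produce exactly zero response.
    k = (r2 - 2 * sigma ** 2) / sigma ** 4 * np.exp(-r2 / (2 * sigma ** 2))
    return k - k.mean()

def log_binarize(img, size=9, sigma=1.4):
    k = log_kernel(size, sigma)
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + size, j:j + size] * k)
    return out > 0          # one-bit sign image for cheap correlation

img = np.zeros((16, 16))
img[:, 8:] = 1.0            # vertical step edge
bits = log_binarize(img)
```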
Lei, Xusheng; Li, Jingjing
2012-01-01
This paper presents an adaptive information fusion method to improve the accuracy and reliability of altitude measurement for a small unmanned aerial rotorcraft during the landing process. To address the low measurement performance of sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate the high-frequency noise in the sensor output. Furthermore, to improve the altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate the measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is demonstrated by static tests, hovering flights and autonomous landing flight tests. PMID:23201993
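The idea of estimating measurement-noise covariance online can be illustrated with a deliberately simplified scalar example. This is not the paper's extended Kalman filter: it is a 1-D constant-altitude model whose measurement variance R is re-estimated from a sliding window of innovations, in the spirit of the adaptation described above; all tuning values and data are invented.

```python
import numpy as np

def adaptive_kalman(zs, q=1e-4, r0=1.0, window=20):
    x, p, r = zs[0], 1.0, r0
    history, estimates = [], []
    for z in zs:
        p = p + q                          # predict (constant-altitude model)
        nu = z - x                         # innovation
        history.append(nu)
        if len(history) >= window:         # adapt R from recent innovations
            c = np.mean(np.square(history[-window:]))
            r = max(c - p, 1e-6)           # R ~ E[nu^2] - P, kept positive
        k = p / (p + r)                    # Kalman gain and update
        x = x + k * nu
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
true_alt = 5.0
zs = true_alt + rng.normal(0.0, 0.3, size=300)    # noisy altitude readings
est = adaptive_kalman(zs)
```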
PRECIPITATION METHOD OF SEPARATING PLUTONIUM FROM CONTAMINATING ELEMENTS
Sutton, J.B.
1958-02-18
This patent relates to an improved method for the decontamination of plutonium. The process consists broadly in an improvement in a method for recovering plutonium from radioactive uranium fission products in aqueous solutions by decontamination steps including byproduct carrier precipitation comprising the step of introducing a preformed aqueous slurry of a hydroxide of a metal of group IV B into any aqueous acidic solution which contains the plutonium in the hexavalent state, radioactive uranium fission products contaminant and a by-product carrier precipitate and separating the metal hydroxide and by-product precipitate from the solution. The process of this invention is especially useful in the separation of plutonium from radioactive zirconium and columbium fission products.
Peng, Kuan; He, Ling; Zhu, Ziqiang; Tang, Jingtian; Xiao, Jiaying
2013-12-01
Compared with commonly used analytical reconstruction methods, the frequency-domain finite element method (FEM) based approach has proven to be an accurate and flexible algorithm for photoacoustic tomography. However, the FEM-based algorithm is computationally demanding, especially for three-dimensional cases. To enhance the algorithm's efficiency, in this work a parallel computational strategy is implemented in the framework of the FEM-based reconstruction algorithm using a graphic-processing-unit parallel frame named the "compute unified device architecture." A series of simulation experiments is carried out to test the accuracy and accelerating effect of the improved method. The results obtained indicate that the parallel calculation does not change the accuracy of the reconstruction algorithm, while its computational cost is significantly reduced by a factor of 38.9 with a GTX 580 graphics card using the improved method.
Towards a Better Corrosion Resistance and Biocompatibility Improvement of Nitinol Medical Devices
NASA Astrophysics Data System (ADS)
Rokicki, Ryszard; Hryniewicz, Tadeusz; Pulletikurthi, Chandan; Rokosz, Krzysztof; Munroe, Norman
2015-04-01
Haemocompatibility of Nitinol implantable devices and their corrosion resistance, as well as resistance to fracture, are very important features of advanced medical implants. The authors present several novel methods capable of improving Nitinol implantable devices to a marked degree beyond the currently used electropolishing (EP) processes; instead, a magnetoelectropolishing process is advised. The polarization study shows that a magnetoelectropolished Nitinol surface is more corrosion resistant than one obtained after standard EP and has a unique ability to repassivate. Currently used sterilization processes for Nitinol implantable devices can dramatically change the physicochemical properties of a medical device and thereby influence its biocompatibility. The authors' experimental results clearly show a way to improve the biocompatibility of the NiTi alloy surface: a final sodium hypochlorite treatment should replace the currently used sterilization methods for Nitinol implantable devices, the rationale for which was also given in our previous study.
Improvement of the GRACE star camera data based on the revision of the combination method
NASA Astrophysics Data System (ADS)
Bandikova, Tamara; Flury, Jakob
2014-11-01
The new release of the sensor and instrument data (Level-1B release 02) of the Gravity Recovery and Climate Experiment (GRACE) had a substantial impact on the improvement of the overall accuracy of the gravity field models. This implies that improvements at the sensor data level can still contribute significantly to arriving closer to the GRACE baseline accuracy. A recent analysis of the GRACE star camera data (SCA1B RL02) revealed unexpectedly high noise. As the star camera (SCA) data are essential for the processing of the K-band ranging data and the accelerometer data, a thorough investigation of the data set was needed. We fully reexamined the SCA data processing from Level-1A to Level-1B, with focus on the method for combining the data delivered by the two SCA heads. In the first step, we produced and compared our own combined attitude solutions by applying two different combination methods to the SCA Level-1A data. The first method introduces information about the anisotropic accuracy of the star camera measurement in terms of a weighting matrix; this method was applied in the official processing as well. The alternative method merges only the well-determined SCA boresight directions, and was implemented on the GRACE SCA data for the first time. Both methods were expected to provide an optimal solution characterized by full accuracy about all three axes, which was confirmed. In the second step, we analyzed the differences between the official SCA1B RL02 data generated by the Jet Propulsion Laboratory (JPL) and our solution. SCA1B RL02 contains systematically higher noise, by a factor of about 3-4. The data analysis revealed that the cause is an incorrect implementation of the algorithms in the JPL processing routines. After correct implementation of the combination method, significant improvement across the whole spectrum was achieved.
Based on these results, the official reprocessing of the SCA data is suggested, as the SCA attitude data are one of the key observations needed for the gravity field recovery.
Improving Transfer of Learning: Relationship to Methods of Using Business Simulation
ERIC Educational Resources Information Center
Mayer, Brad W.; Dale, Kathleen M.; Fraccastoro, Katherine A.; Moss, Gisele
2011-01-01
This study investigates whether the processes associated with the use of business simulations can be structured to improve transfer of learning from the classroom environment to the workplace. The answer to this question is explored by investigating teaching methods used to introduce the simulation, the amount of time students spend on decisions,…
ERIC Educational Resources Information Center
Lin, P. L.; Tan, W. H.
2003-01-01
Presents a new method to improve the performance of query processing in a spatial database. Experiments demonstrated that performance of database systems can be improved because both the number of objects accessed and number of objects requiring detailed inspection are much less than those in the previous approach. (AEF)
Horwood, Christiane M; Youngleson, Michele S; Moses, Edward; Stern, Amy F; Barker, Pierre M
2015-07-01
Achieving long-term retention in HIV care is an important challenge for HIV management and achieving elimination of mother-to-child transmission. Sustainable, affordable strategies are required to achieve this, including strengthening of community-based interventions. Deployment of community-based health workers (CHWs) can improve health outcomes but there is a need to identify systems to support and maintain high-quality performance. Quality-improvement strategies have been successfully implemented to improve quality and coverage of healthcare in facilities and could provide a framework to support community-based interventions. Four community-based quality-improvement projects from South Africa, Malawi and Mozambique are described. Community-based improvement teams linked to the facility-based health system participated in learning networks (modified Breakthrough Series), and used quality-improvement methods to improve process performance. Teams were guided by trained quality mentors who used local data to help nurses and CHWs identify gaps in service provision and test solutions. Learning network participants gathered at intervals to share progress and identify successful strategies for improvement. CHWs demonstrated understanding of quality-improvement concepts, tools and methods, and implemented quality-improvement projects successfully. Challenges of using quality-improvement approaches in community settings included adapting processes, particularly data reporting, to the education level and first language of community members. Quality-improvement techniques can be implemented by CHWs to improve outcomes in community settings but these approaches require adaptation and additional mentoring support to be successful. More research is required to establish the effectiveness of this approach on processes and outcomes of care.
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
Accuracy improvement of multimodal measurement of speed of sound based on image processing
NASA Astrophysics Data System (ADS)
Nitta, Naotaka; Kaya, Akio; Misawa, Masaki; Hyodo, Koji; Numano, Tomokazu
2017-07-01
Since the speed of sound (SOS) reflects tissue characteristics and is expected as an evaluation index of elasticity and water content, the noninvasive measurement of SOS is eagerly anticipated. However, it is difficult to measure the SOS by using an ultrasound device alone. Therefore, we have presented a noninvasive measurement method of SOS using ultrasound (US) and magnetic resonance (MR) images. By this method, we determine the longitudinal SOS based on the thickness measurement using the MR image and the time of flight (TOF) measurement using the US image. The accuracy of SOS measurement is affected by the accuracy of image registration and the accuracy of thickness measurements in the MR and US images. In this study, we address the accuracy improvement in the latter thickness measurement, and present an image-processing-based method for improving the accuracy of thickness measurement. The method was investigated by using in vivo data obtained from a tissue-engineered cartilage implanted in the back of a rat, with an unclear boundary.
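The core relation behind the multimodal measurement described above is a simple one: the longitudinal SOS follows from the tissue thickness (from the MR image) and the ultrasound TOF. The snippet below is a tiny numeric illustration, not the paper's procedure; a pulse-echo (round-trip) TOF convention is assumed and the numbers are invented.

```python
# Speed of sound from thickness and round-trip time of flight.
d = 4.0e-3                       # thickness measured on the MR image [m]
tof = 5.2e-6                     # round-trip time of flight from the US image [s]
sos = 2 * d / tof                # longitudinal speed of sound [m/s]
print(round(sos, 1))             # 1538.5
```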
Frank Gilbreth and health care delivery method study driven learning.
Towill, Denis R
2009-01-01
The purpose of this article is to look at method study, as devised by the Gilbreths at the beginning of the twentieth century, which found early application in hospital quality assurance and surgical "best practice". It has since become a core activity in all modern methods applied to healthcare delivery improvement programmes. The article traces the origin of what is now variously called "business process re-engineering", "business process improvement", "lean healthcare" etc. by different management gurus back to the century-old pioneering work of Frank Gilbreth. The outcome is a consistent framework involving "width", "length" and "depth" dimensions within which healthcare delivery systems can be analysed, designed and successfully implemented to achieve better and more consistent performance. Healthcare method study (saving time plus saving motion) is best practised as a co-joint action-learning activity "owned" by all "players" involved in the re-engineering process. However, although process mapping is a key step forward, in itself it is no guarantee of effective re-engineering. It is not even the beginning of the end of the change challenge, although it should be the end of the beginning. What is needed is innovative exploitation of method study within a healthcare organisational learning culture accelerated via the Gilbreth Knowledge Flywheel. It is shown that effective healthcare delivery pipeline improvement is anchored in a team approach involving all "players" in the system, especially physicians. A comprehensive process study, constructive dialogue, proper and highly professional re-engineering, plus managed implementation are essential components. Experience suggests "learning" is thereby achieved via "natural groups" actively involved in healthcare processes.
The article provides a proven method for exploiting Gilbreths' outputs and their many successors in enabling more productive evidence-based healthcare delivery as summarised in the "learn-do-learn-do" feedback loop in the Gilbreth Knowledge Flywheel.
Computer image analysis in caryopses quality evaluation as exemplified by malting barley
NASA Astrophysics Data System (ADS)
Koszela, K.; Raba, B.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Czekała, W.; Ludwiczak, A.; Przybylak, A.; Boniecki, P.; Przybył, J.
2015-07-01
One of the purposes of employing modern technologies in the agricultural and food industry is to increase the efficiency and automation of production processes, which helps improve the productive effectiveness of business enterprises and thus makes them more competitive. Nowadays, this branch of the economy faces the challenge of producing agricultural and food products characterized by the best quality parameters while maintaining optimum production and distribution costs for the processed biological material. Thus, several scientific centers seek to devise new and improved methods and technologies in this field that will meet these expectations. A new solution, under constant development, is to employ so-called machine vision to replace human work in both quality and quantity evaluation processes. An indisputable advantage of this method is that it keeps the evaluation unbiased while improving its speed and, importantly, eliminating expert fatigue. This paper elaborates on quality evaluation by marking the contamination in malting barley grains using computer image analysis and selected methods of artificial intelligence [4-5].
Analysis of Non Local Image Denoising Methods
NASA Astrophysics Data System (ADS)
Pardo, Álvaro
Image denoising is probably one of the most studied problems in the image processing community. Recently a new paradigm of non-local denoising was introduced. The Non-Local Means method proposed by Buades, Morel and Coll attracted the attention of other researchers, who proposed improvements and modifications to their proposal. In this work we analyze those methods, trying to understand their properties while connecting them to segmentation based on spectral graph properties. We also propose some improvements to automatically estimate the parameters used in these methods.
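The Non-Local Means idea the abstract refers to can be sketched compactly for a 1-D signal: each sample is replaced by a weighted average of all samples, with weights given by the similarity of small surrounding patches. The patch size, filtering parameter h, and test signal below are illustrative assumptions; note that the weight matrix is exactly the kind of affinity used in spectral-graph segmentation, which is the connection the paper explores.

```python
import numpy as np

def nlm_1d(signal, patch=3, h=0.1):
    n = len(signal)
    pad = patch // 2
    padded = np.pad(signal, pad, mode="edge")
    patches = np.array([padded[i:i + patch] for i in range(n)])
    out = np.empty(n)
    for i in range(n):
        d2 = np.mean((patches - patches[i]) ** 2, axis=1)  # patch distances
        w = np.exp(-d2 / (h ** 2))                         # similarity weights
        out[i] = np.sum(w * signal) / np.sum(w)            # non-local average
    return out

rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0], 50)                 # piecewise-constant signal
noisy = clean + rng.normal(0, 0.05, size=100)
denoised = nlm_1d(noisy)
```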
NASA Technical Reports Server (NTRS)
Mickey, F. E.; Mcewan, A. J.; Ewing, E. G.; Huyler, W. C., Jr.; Khajeh-Nouri, B.
1970-01-01
An analysis was conducted with the objective of upgrading and improving the loads, stress, and performance prediction methods for Apollo spacecraft parachutes. The subjects considered were: (1) methods for a new theoretical approach to the parachute opening process, (2) new experimental-analytical techniques to improve the measurement of pressures, stresses, and strains in inflight parachutes, and (3) a numerical method for analyzing the dynamical behavior of rapidly loaded pilot chute risers.
NASA Astrophysics Data System (ADS)
Gomes, Zahra; Jarvis, Matt J.; Almosallam, Ibrahim A.; Roberts, Stephen J.
2018-03-01
The next generation of large-scale imaging surveys (such as those conducted with the Large Synoptic Survey Telescope and Euclid) will require accurate photometric redshifts in order to optimally extract cosmological information. Gaussian Process for photometric redshift estimation (GPZ) is a promising new method that has been proven to provide efficient, accurate photometric redshift estimations with reliable variance predictions. In this paper, we investigate a number of methods for improving the photometric redshift estimations obtained using GPZ (but which are also applicable to others). We use spectroscopy from the Galaxy and Mass Assembly Data Release 2 with a limiting magnitude of r < 19.4 along with corresponding Sloan Digital Sky Survey visible (ugriz) photometry and the UKIRT Infrared Deep Sky Survey Large Area Survey near-IR (YJHK) photometry. We evaluate the effects of adding near-IR magnitudes and angular size as features for the training, validation, and testing of GPZ and find that these improve the accuracy of the results by ˜15-20 per cent. In addition, we explore a post-processing method of shifting the probability distributions of the estimated redshifts based on their Quantile-Quantile plots and find that it improves the bias by ˜40 per cent. Finally, we investigate the effects of using more precise photometry obtained from the Hyper Suprime-Cam Subaru Strategic Program Data Release 1 and find that it produces significant improvements in accuracy, similar to the effect of including additional features.
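The quantile-quantile post-processing step described above can be sketched as a monotone recalibration: a mapping is learned from the quantiles of the predicted redshifts to those of the spectroscopic redshifts on a validation set, then applied to new predictions. This is a hedged illustration of the general idea, not the GPZ pipeline; the data are synthetic and the estimator bias is invented.

```python
import numpy as np

def qq_recalibrate(pred_val, spec_val, pred_test, n_q=99):
    qs = np.linspace(1, 99, n_q)
    q_pred = np.percentile(pred_val, qs)   # quantiles of the estimator
    q_spec = np.percentile(spec_val, qs)   # quantiles of the truth
    # Map each test prediction through the piecewise-linear QQ correction
    return np.interp(pred_test, q_pred, q_spec)

rng = np.random.default_rng(2)
spec = rng.uniform(0.1, 1.0, size=2000)           # "true" redshifts
pred = spec + 0.05                                 # systematically biased estimator
fixed = qq_recalibrate(pred[:1000], spec[:1000], pred[1000:])
bias_before = np.mean(pred[1000:] - spec[1000:])
bias_after = np.mean(fixed - spec[1000:])
```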
Shi, Li-Ping; Ou, Qiao-Ming; Cui, Wen-Juan; Chen, Yu-Liang
2014-04-01
To break the hard testa and improve the seed germination of Astragalus membranaceus var. mongholicus, in order to solve the problems of a low success rate in seed germination and seedling establishment. Longxi Astragalus membranaceus var. mongholicus seed was treated by soaking in 75% alcohol and concentrated sulfuric acid, warm-water incubating, grinding, and a comprehensive treatment combining warm-water incubating, grinding and sand culture. Seed germination was evaluated by germination potential, germination rate and germination index. The different processing methods significantly improved seed germination, with differing effects. The comprehensive treatment with warm-water incubating, grinding and sand culture was the best for Astragalus membranaceus var. mongholicus seed germination: its germination potential, germination rate and germination index were 66.04%, 87.70% and 1.34, respectively. This comprehensive treatment is an economic and effective processing method, suitable for actual production.
Grey Comprehensive Evaluation of Biomass Power Generation Project Based on Group Judgement
NASA Astrophysics Data System (ADS)
Xia, Huicong; Niu, Dongxiao
2017-06-01
The comprehensive evaluation of benefit is an important task that needs to be carried out at all stages of biomass power generation projects. This paper proposed an improved grey comprehensive evaluation method based on the triangle whitening function. To improve the objectivity of the weights calculated by the reference-comparison judgment method alone, this paper introduced group judgment into the weighting process. In the grey comprehensive evaluation, a number of experts were invited to estimate the benefit level of projects, and the basic estimations were optimized based on the minimum variance principle to improve the accuracy of the evaluation result. Taking a biomass power generation project as an example, the grey comprehensive evaluation showed that the benefit level of this project was good. This example demonstrates the feasibility of the grey comprehensive evaluation method based on group judgment for benefit evaluation of biomass power generation projects.
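A minimal sketch of a grey evaluation step with triangular whitening functions follows. The grade set, breakpoints, expert scores, and weights are all illustrative assumptions, not the paper's model: each benefit grade gets a triangular membership function, the weighted expert score is whitened against each grade, and the grade with the largest membership wins.

```python
# Triangular whitening (membership) function with breakpoints a < b < c.
def triangle(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative grade classes on a 0-10 benefit scale.
grades = {"poor": (0, 2.5, 5), "fair": (2.5, 5, 7.5), "good": (5, 7.5, 10)}

def grey_grade(scores, weights):
    s = sum(w * x for w, x in zip(weights, scores)) / sum(weights)
    memberships = {g: triangle(s, *abc) for g, abc in grades.items()}
    return max(memberships, key=memberships.get), memberships

# Four weighted expert scores for one project.
label, m = grey_grade(scores=[8, 7, 7.5, 8.5], weights=[0.4, 0.2, 0.2, 0.2])
print(label)   # good
```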
Effects of stamp-charging coke making on strength and high temperature thermal properties of coke.
Zhang, Yaru; Bai, Jinfeng; Xu, Jun; Zhong, Xiangyun; Zhao, Zhenning; Liu, Hongchun
2013-12-01
The stamp-charging coke making process has the advantages of improving the operating environment, decreasing fugitive emissions, higher gas collection efficiency and less environmental pollution. This article describes the different structural strength and high-temperature thermal properties of 4 types of coke manufactured using a conventional coking process and the stamp-charging coke making process. The 4 kinds of coke were prepared from a mixture of five feed coals blended by the petrography blending method. The results showed that the structural strength indices of coke prepared using the stamp-charging method increase sharply. In contrast with the conventional coking process, the stamp-charging process improved the coke strength after reaction but had little impact on the coke reactivity index. Copyright © 2013 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Morgan, Philip
2008-01-01
The use of evaluation to examine and improve the quality of teaching and courses is now a component of most universities. However, despite the various methods and opportunities for evaluation, a lack of understanding of the processes, measures and value is among the major impediments to effective evaluation. Evaluation requires an understanding…
Improved Sand-Compaction Method for Lost-Foam Metal Casting
NASA Technical Reports Server (NTRS)
Bakhtiyarov, Sayavur I.; Overfelt, Ruel A.
2008-01-01
An improved method of filling a molding flask with sand and compacting the sand around a refractory-coated foam mold pattern has been developed for incorporation into the lost-foam metal-casting process. In comparison with the conventional method of sand filling and compaction, this method affords more nearly complete filling of the space around the refractory-coated foam mold pattern and more thorough compaction of the sand. In so doing, this method enables the sand to better support the refractory coat under metallostatic pressure during filling of the mold with molten metal.
Selection of remedial alternatives for mine sites: a multicriteria decision analysis approach.
Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon
2013-04-15
The selection of remedial alternatives for mine sites is a complex task because it involves multiple criteria, often with conflicting objectives. However, the existing framework used to select remedial alternatives lacks multicriteria decision analysis (MCDA) aids and does not consider uncertainty in the selection of alternatives. The objective of this paper is to improve the existing framework by introducing deterministic and probabilistic MCDA methods. The Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) methods have been implemented in this study. The MCDA analysis involves preparing the inputs to the PROMETHEE methods, namely identifying the alternatives, defining the criteria, defining the criteria weights using the analytical hierarchy process (AHP), defining the probability distribution of the criteria weights, and conducting Monte Carlo simulation (MCS); running the PROMETHEE methods using these inputs; and conducting a sensitivity analysis. A case study at a mine site was presented to demonstrate the improved framework. The results showed that the improved framework provides a reliable way of selecting remedial alternatives as well as quantifying the impact of different criteria on the selection. Copyright © 2013 Elsevier Ltd. All rights reserved.
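The PROMETHEE ranking step can be illustrated with a deliberately simplified PROMETHEE II sketch using the "usual" preference function (1 if one alternative beats another on a criterion, else 0). The alternatives, criteria, and weights below are invented for illustration; the paper's full analysis additionally uses AHP-derived weights and Monte Carlo simulation.

```python
import numpy as np

def promethee_net_flows(scores, weights):
    n = scores.shape[0]
    phi = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # Weighted preference of alternative i over j across criteria
            pref = np.sum(weights * (scores[i] > scores[j]))
            phi[i] += pref
            phi[j] -= pref
    return phi / (n - 1)    # net outranking flows; higher is better

# Rows: remedial alternatives; columns: criteria (higher is better).
scores = np.array([[0.9, 0.4, 0.7],
                   [0.6, 0.8, 0.5],
                   [0.3, 0.5, 0.2]])
weights = np.array([0.5, 0.3, 0.2])
phi = promethee_net_flows(scores, weights)
best = int(np.argmax(phi))
print(best)   # 0
```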
Wei, A.C.; Devitt, K.S.; Wiebe, M.; Bathe, O.F.; McLeod, R.S.; Urbach, D.R.
2014-01-01
Background Surgery is a cornerstone of cancer treatment, but significant differences in the quality of surgery have been reported. Surgical process improvement tools (spits) modify the processes of care as a means to quality improvement (qi). We were interested in developing spits in the area of gastrointestinal (gi) cancer surgery. We report the recommendations of an expert panel held to define quality gaps and establish priority areas that would benefit from spits. Methods The present study used the knowledge-to-action cycle as a framework. Canadian experts in qi and in gi cancer surgery were assembled in a nominal group workshop. Participants evaluated the merits of spits, described gaps in current knowledge, and identified and ranked processes of care that would benefit from qi. A qualitative analysis of the workshop deliberations using modified grounded theory methods identified major themes. Results The expert panel consisted of 22 participants. Experts confirmed that spits were an important strategy for qi. The top-rated spits included clinical pathways, electronic information technology, and patient safety tools. The preferred settings for use of spits included preoperative and intraoperative settings and multidisciplinary contexts. Outcomes of interest were cancer-related outcomes, process, and technical quality of surgery measures. Conclusions Surgical process improvement tools were confirmed as an important strategy. The expert panel recommendations will be used to guide future research efforts for spits in gi cancer surgery. PMID:24764704
High-yielding continuous-flow synthesis of antimalarial drug hydroxychloroquine
Telang, Nakul S; Kong, Caleb J; Verghese, Jenson; Gilliland III, Stanley E; Ahmad, Saeed; Dominey, Raymond N
2018-01-01
Numerous synthetic methods for the continuous preparation of fine chemicals and active pharmaceutical ingredients (APIs) have been reported in recent years, resulting in a dramatic improvement in process efficiencies. Herein we report a highly efficient continuous synthesis of the antimalarial drug hydroxychloroquine (HCQ). Key improvements in the new process include the elimination of protecting groups, with an overall yield improvement of 52% over the current commercial process. The continuous process employs a combination of packed bed reactors and continuous stirred tank reactors for the direct conversion of the starting materials to the product. This high-yielding, multigram-scale continuous synthesis provides an opportunity to increase global access to hydroxychloroquine for the treatment of malaria. PMID:29623120
Chemically etched fiber tips for near-field optical microscopy: a process for smoother tips.
Lambelet, P; Sayah, A; Pfeffer, M; Philipona, C; Marquis-Weible, F
1998-11-01
An improved method for producing fiber tips for scanning near-field optical microscopy is presented. The improvement consists of chemically etching quartz optical fibers through their acrylate jacket. This new method is compared with the previous one in which bare fibers were etched. With the new process the meniscus formed by the acid along the fiber does not move during etching, leading to a much smoother surface of the tip cone. Subsequent metallization is thus improved, resulting in better coverage of the tip with an aluminum opaque layer. Our results show that leakage can be avoided along the cone, and light transmission through the tip is spatially limited to an optical aperture of a 100-nm dimension.
Developing and executing quality improvement projects (concept, methods, and evaluation).
Likosky, Donald S
2014-03-01
Continuous quality improvement, quality assurance, cycles of change--these words are often used to express the process of using data to inform and improve clinical care. Although many of us have been exposed to the theory and practice of experimental work (e.g., randomized trials), few of us have been similarly exposed to the science underlying quality improvement. Through the lens of a single-center quality improvement study, this article exposes the reader to the methodology for conducting such studies. The reader will gain an understanding of the methods required to embark on such a study.
Research on signal processing method for total organic carbon of water quality online monitor
NASA Astrophysics Data System (ADS)
Ma, R.; Xie, Z. X.; Chu, D. Z.; Zhang, S. W.; Cao, X.; Wu, N.
2017-08-01
At present, there is no rapid, stable and effective approach to total organic carbon (TOC) measurement in the marine environmental online monitoring field. This paper therefore proposes a chemiluminescence signal processing method for an online TOC monitor. The weak optical signal detected by the photomultiplier tube is enhanced and converted by a series of signal processing modules: a phase-locked amplifier module, a fourth-order band-pass filter module and an AD conversion module. Long-term comparison tests show that, while maintaining sufficient accuracy, this chemiluminescence signal processing method offers greatly improved measuring speed and high practicability for online monitoring compared with the traditional method.
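The phase-locked (lock-in) amplification stage can be illustrated with a short numeric sketch: a weak tone buried in broadband noise is recovered by mixing with in-phase and quadrature references and averaging. This is an illustration of the principle only, not the instrument's actual signal chain.

```python
import numpy as np

def lock_in_amplitude(signal, f_ref, fs):
    """Recover the amplitude of a weak tone at f_ref by lock-in detection:
    mix with in-phase and quadrature references, then low-pass filter
    (here simply the mean over the whole record)."""
    t = np.arange(len(signal)) / fs
    i_comp = np.mean(signal * np.cos(2 * np.pi * f_ref * t))  # in-phase
    q_comp = np.mean(signal * np.sin(2 * np.pi * f_ref * t))  # quadrature
    return 2.0 * np.hypot(i_comp, q_comp)

# A 50 Hz tone of amplitude 0.01 buried in noise 20x stronger
rng = np.random.default_rng(0)
fs, n = 10_000, 100_000
t = np.arange(n) / fs
noisy = 0.01 * np.sin(2 * np.pi * 50 * t) + 0.2 * rng.standard_normal(n)
amplitude = lock_in_amplitude(noisy, 50, fs)
```

Mixing shifts the 50 Hz component to DC, where averaging rejects the broadband noise; real instruments replace the mean with an analog or digital low-pass filter.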
Development of optimized, graded-permeability axial groove heat pipes
NASA Technical Reports Server (NTRS)
Kapolnek, Michael R.; Holmes, H. Rolland
1988-01-01
Heat pipe performance can usually be improved by uniformly varying or grading wick permeability from end to end. A unique and cost effective method for grading the permeability of an axial groove heat pipe is described - selective chemical etching of the pipe casing. This method was developed and demonstrated on a proof-of-concept test article. The process improved the test article's performance by 50 percent. Further improvement is possible through the use of optimally etched grooves.
Quality Improvement of Liver Ultrasound Images Using Fuzzy Techniques.
Bayani, Azadeh; Langarizadeh, Mostafa; Radmard, Amir Reza; Nejad, Ahmadreza Farzaneh
2016-12-01
Liver ultrasound images are common and are often used to diagnose diffuse liver diseases such as fatty liver. However, the low quality of such images makes it difficult to analyze them and diagnose diseases. The purpose of this study, therefore, was to improve the contrast and quality of liver ultrasound images. A number of fuzzy-logic-based image contrast enhancement algorithms were applied, using Matlab 2013b, to liver ultrasound images in which the kidney is observable: contrast improvement using a fuzzy intensification operator, contrast improvement applying fuzzy image histogram hyperbolization, and contrast improvement by fuzzy IF-THEN rules. Measured by the Mean Squared Error and Peak Signal-to-Noise Ratio obtained from different images, the fuzzy methods provided better results; compared with the histogram equalization method, their implementation improved both the contrast and visual quality of the images and the results of liver segmentation algorithms. Comparison of the four algorithms revealed the power of fuzzy logic in improving image contrast compared with traditional image processing algorithms. Moreover, the contrast improvement algorithm based on a fuzzy intensification operator was selected as the strongest algorithm according to the measured indicators. This method can also be used in future studies on other ultrasound images for quality improvement and for other image processing and analysis applications.
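The fuzzy intensification (INT) operator the study found strongest is a classic construction: fuzzify grey levels into [0, 1], repeatedly push memberships away from 0.5, and de-fuzzify. A minimal sketch (not the authors' Matlab code):

```python
import numpy as np

def fuzzy_intensification(img, n_iter=2):
    """Contrast enhancement via the fuzzy INT operator: fuzzify grey
    levels to [0, 1], push memberships away from 0.5, de-fuzzify."""
    g = img.astype(float)
    lo, hi = g.min(), g.max()
    mu = (g - lo) / (hi - lo)                     # fuzzification
    for _ in range(n_iter):                       # intensification
        mu = np.where(mu <= 0.5, 2 * mu**2, 1 - 2 * (1 - mu)**2)
    return (lo + mu * (hi - lo)).astype(img.dtype)  # de-fuzzification

img = np.array([[60, 100], [150, 200]], dtype=np.uint8)
out = fuzzy_intensification(img, n_iter=1)
```

Mid-grey values are pushed toward the extremes while the darkest and brightest pixels stay fixed, which stretches the apparent contrast without clipping.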
Reasoning with case histories of process knowledge for efficient process development
NASA Technical Reports Server (NTRS)
Bharwani, Seraj S.; Walls, Joe T.; Jackson, Michael E.
1988-01-01
The significance of compiling case histories of empirical process knowledge and the role of such histories in improving the efficiency of manufacturing process development is discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundancy of empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.
Method For Enhanced Gas Monitoring In High Density Flow Streams
Von Drasek, William A.; Mulderink, Kenneth A.; Marin, Ovidiu
2005-09-13
A method for conducting laser absorption measurements in high temperature process streams having high levels of particulate matter is disclosed. An impinger is positioned substantially parallel to the laser beam propagation path and at an upstream position relative to the laser beam. Beam shielding pipes shield the beam from the surrounding environment. Measurement is conducted only in the gap between the two shielding pipes where the beam propagates through the process gas. The impinger facilitates reduced particle presence in the measurement beam, resulting in improved signal-to-noise ratio (SNR) and improved sensitivity and dynamic range of the measurement.
Human body region enhancement method based on Kinect infrared imaging
NASA Astrophysics Data System (ADS)
Yang, Lei; Fan, Yubo; Song, Xiaowei; Cai, Wenjing
2016-10-01
To effectively improve the low contrast of the human body region in infrared images, a combination of several enhancement methods is utilized. Firstly, an Optimal Contrast-Tone Mapping (OCTM) method with multiple iterations is applied to the infrared images acquired by Kinect to balance the contrast of low-luminosity images and improve their overall contrast. Secondly, a Level Set algorithm is employed to improve the contour edges of the human body region. Finally, Laplacian Pyramid decomposition is adopted to further enhance the contour-improved human body region, while the background area without the human body is processed by bilateral filtering to improve the overall effect. Theoretical analysis and experimental verification show that the proposed method can effectively enhance the human body region in such infrared images.
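The Laplacian-pyramid enhancement step can be sketched in simplified form. This illustration substitutes 2x2 mean pooling and nearest-neighbour upsampling for proper Gaussian filtering, so it is a toy version of the technique, not the paper's pipeline:

```python
import numpy as np

def down(img):   # 2x2 mean pooling as a crude low-pass + downsample
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def up(img):     # nearest-neighbour upsampling back to the finer grid
    return img.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_enhance(img, levels=2, gain=1.5):
    """Decompose into a Laplacian pyramid, amplify the detail bands by
    `gain`, and reconstruct -- a simplified detail-enhancement step.
    Image sides must be divisible by 2**levels."""
    g, bands = img.astype(float), []
    for _ in range(levels):
        smaller = down(g)
        bands.append(g - up(smaller))    # detail (Laplacian) band
        g = smaller
    for band in reversed(bands):         # reconstruct with boosted detail
        g = up(g) + gain * band
    return g

img = np.arange(64, dtype=float).reshape(8, 8)
out = laplacian_enhance(img, levels=2, gain=1.0)
```

With `gain=1.0` the pyramid reconstructs the input exactly; a gain above one amplifies the detail bands, which is what sharpens contours.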
NASA Astrophysics Data System (ADS)
Meng, Hao; Wang, Zhongyu; Fu, Jihua
2008-12-01
The non-diffracting beam triangulation measurement system possesses the advantages of longer measurement range, higher theoretical measurement accuracy and higher resolution over the traditional laser triangulation measurement system. Unfortunately the measurement accuracy of the system is greatly degraded due to the speckle noise, the CCD photoelectric noise and the background light noise in practical applications. Hence, some effective signal processing methods must be applied to improve the measurement accuracy. In this paper a novel effective method for removing the noises in the non-diffracting beam triangulation measurement system is proposed. In the method the grey system theory is used to process and reconstruct the measurement signal. Through implementing the grey dynamic filtering based on the dynamic GM(1,1), the noises can be effectively removed from the primary measurement data and the measurement accuracy of the system can be improved as a result.
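The grey dynamic filtering above rests on the standard GM(1,1) model; a minimal sketch of GM(1,1) smoothing (a generic textbook construction, not the authors' implementation; the series is assumed positive and trend-like) is:

```python
import numpy as np

def gm11_smooth(x0):
    """GM(1,1) grey model: fit dx1/dt + a*x1 = b on the accumulated
    series x1 = cumsum(x0), then reconstruct a smoothed x0."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)
    z = 0.5 * (x1[1:] + x1[:-1])                 # mean-generated background
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0))
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x0[0]], np.diff(x1_hat)])  # smoothed series

# Noisy exponential-trend measurement; GM(1,1) recovers the trend
rng = np.random.default_rng(1)
true = 10 * np.exp(0.05 * np.arange(20))
noisy = true + rng.normal(0, 0.3, 20)
smooth = gm11_smooth(noisy)
```

Fitting the first-order grey differential equation on the accumulated series and differencing the fitted curve suppresses zero-mean noise in the raw measurements.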
2010-01-01
Background: Benchmarking is one of the methods used in business that is applied to hospitals to improve the management of their operations. International comparison between hospitals can explain performance differences. As there is a trend towards specialization of hospitals, this study examines the benchmarking process and the success factors of benchmarking in international specialized cancer centres. Methods: Three independent international benchmarking studies on operations management in cancer centres were conducted. The first study included three comprehensive cancer centres (CCC), the second involved three chemotherapy day units (CDU), and the final study included four radiotherapy departments. For each multiple case study, a research protocol was used to structure the benchmarking process. After reviewing the multiple case studies, the resulting description was used to address the research objectives. Results: We adapted and evaluated existing benchmarking processes by formalizing stakeholder involvement and verifying the comparability of the partners. We also devised a framework to structure the indicators, to produce a coherent indicator set and better improvement suggestions. Evaluating the feasibility of benchmarking as a tool to improve hospital processes led to mixed results. Case study 1 resulted in general recommendations for the organizations involved. In case study 2, the combination of benchmarking and lean management led in one CDU to a 24% increase in bed utilization and a 12% increase in productivity. Three radiotherapy departments in case study 3 were considering implementing the recommendations. Additionally, success factors were identified: a well-defined and small project scope, partner selection based on clear criteria, stakeholder involvement, simple and well-structured indicators, analysis of both the process and its results, and adaptation of the identified better working methods to one's own setting.
Conclusions The improved benchmarking process and the success factors can produce relevant input to improve the operations management of specialty hospitals. PMID:20807408
An electromagnetic induction method for underground target detection and characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartel, L.C.; Cress, D.H.
1997-01-01
An improved capability for subsurface structure detection is needed to support military and nonproliferation requirements for inspection and for surveillance of activities of threatening nations. As part of the DOE/NN-20 program to apply geophysical methods to detect and characterize underground facilities, Sandia National Laboratories (SNL) initiated an electromagnetic induction (EMI) project to evaluate low frequency electromagnetic (EM) techniques for subsurface structure detection. Low frequency, in this case, extended from kilohertz to hundreds of kilohertz. An EMI survey procedure had already been developed for borehole imaging of coal seams and had successfully been applied in a surface mode to detect a drug smuggling tunnel. The SNL project has focused on building upon the success of that procedure and applying it to surface and low altitude airborne platforms. Part of SNL's work has focused on improving that technology through improved hardware and data processing. The improved hardware development has been performed utilizing Laboratory Directed Research and Development (LDRD) funding. In addition, SNL's effort focused on: (1) improvements in modeling of the basic geophysics of the illuminating electromagnetic field and its coupling to the underground target (partially funded using LDRD funds) and (2) development of techniques for phase-based and multi-frequency processing and spatial processing to support subsurface target detection and characterization. The products of this project are: (1) an evaluation of an improved EM gradiometer, (2) an improved gradiometer concept for possible future development, (3) an improved modeling capability, (4) demonstration of an EM wave migration method for target recognition, and (5) a demonstration that the technology is capable of detecting targets to depths exceeding 25 meters.
Improved Imaging With Laser-Induced Eddy Currents
NASA Technical Reports Server (NTRS)
Chern, Engmin J.
1993-01-01
A system that tests a specimen of material nondestructively by laser-induced eddy-current imaging is improved by changing the method of processing the eddy-current signal: changes in the impedance of the eddy-current coil are measured in absolute rather than relative units.
Reducing intraoperative red blood cell unit wastage in a large academic medical center
Whitney, Gina M.; Woods, Marcella C.; France, Daniel J.; Austin, Thomas M.; Deegan, Robert J.; Paroskie, Allison; Booth, Garrett S.; Young, Pampee P.; Dmochowski, Roger R.; Sandberg, Warren S.; Pilla, Michael A.
2015-01-01
BACKGROUND The wastage of red blood cell (RBC) units within the operative setting results in significant direct costs to health care organizations. Previous education-based efforts to reduce wastage were unsuccessful at our institution. We hypothesized that a quality and process improvement approach would result in sustained reductions in intraoperative RBC wastage in a large academic medical center. STUDY DESIGN AND METHODS Utilizing a failure mode and effects analysis supplemented with time and temperature data, key drivers of perioperative RBC wastage were identified and targeted for process improvement. RESULTS Multiple contributing factors, including improper storage and transport and lack of accurate, locally relevant RBC wastage event data were identified as significant contributors to ongoing intraoperative RBC unit wastage. Testing and implementation of improvements to the process of transport and storage of RBC units occurred in liver transplant and adult cardiac surgical areas due to their history of disproportionately high RBC wastage rates. Process interventions targeting local drivers of RBC wastage resulted in a significant reduction in RBC wastage (p <0.0001; adjusted odds ratio, 0.24; 95% confidence interval, 0.15–0.39), despite an increase in operative case volume over the period of the study. Studied process interventions were then introduced incrementally in the remainder of the perioperative areas. CONCLUSIONS These results show that a multidisciplinary team focused on the process of blood product ordering, transport, and storage was able to significantly reduce operative RBC wastage and its associated costs using quality and process improvement methods. PMID:26202213
Process Improvement for Interinstitutional Research Contracting
Logan, Jennifer; Bjorklund, Todd; Whitfield, Jesse; Reed, Peggy; Lesher, Laurie; Sikalis, Amy; Brown, Brent; Drollinger, Sandy; Larrabee, Kristine; Thompson, Kristie; Clark, Erin; Workman, Michael; Boi, Luca
2015-01-01
Abstract Introduction Sponsored research increasingly requires multiinstitutional collaboration. However, research contracting procedures have become more complicated and time consuming. The perinatal research units of two colocated healthcare systems sought to improve their research contracting processes. Methods The Lean Process, a management practice that iteratively involves team members in root cause analyses and process improvement, was applied to the research contracting process, initially using Process Mapping and then developing Problem Solving Reports. Results Root cause analyses revealed that the longest delays were the individual contract legal negotiations. In addition, the “business entity” was the research support personnel of both healthcare systems whose “customers” were investigators attempting to conduct interinstitutional research. Development of mutually acceptable research contract templates and language, chain of custody templates, and process development and refinement formats decreased the Notice of Grant Award to Purchase Order time from a mean of 103.5 days in the year prior to Lean Process implementation to 45.8 days in the year after implementation (p = 0.004). Conclusions The Lean Process can be applied to interinstitutional research contracting with significant improvement in contract implementation. PMID:26083433
NASA Astrophysics Data System (ADS)
Buchari; Tarigan, U.; Ambarita, M. B.
2018-02-01
PT. XYZ is a wood processing company producing semi-finished wood under a make-to-order production system. In the production process, the production line is not balanced; the imbalance is caused by differences in cycle time between work stations. In addition, the material flow pattern is irregular, resulting in backtracking and long travel distances. This study aimed to allocate work elements to specific work stations and to propose an improved production layout based on the results of the line balancing. The method used for balancing is Ranked Positional Weight (RPW), also known as the Helgeson-Birnie method, while the layout improvement uses the Systematic Layout Planning (SLP) method. Using RPW, line efficiency increases to 84.86% and balance delay decreases to 15.14%. Improving the layout using SLP also gives good results, reducing the path length from 213.09 meters to 133.82 meters, a decrease of 37.2%.
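The Ranked Positional Weight (Helgeson-Birnie) heuristic named above can be sketched with hypothetical task data (not the company's actual work elements): each task's weight is its own time plus the times of all its transitive successors, and tasks are assigned by descending weight to the earliest feasible station.

```python
def rpw_balance(times, preds, cycle_time):
    """Ranked Positional Weight (Helgeson-Birnie) line balancing.
    Assumes every task time is <= cycle_time."""
    succs = {t: set() for t in times}
    for t, ps in preds.items():
        for p in ps:
            succs[p].add(t)

    def collect(t, seen):                    # all transitive successors
        for s in succs[t] - seen:
            seen.add(s)
            collect(s, seen)
        return seen

    weight = {t: times[t] + sum(times[s] for s in collect(t, set()))
              for t in times}
    unassigned, stations = set(times), []
    while unassigned:
        load, station = 0.0, []
        while True:
            done = set(times) - unassigned
            ready = [t for t in unassigned           # precedence satisfied
                     if set(preds.get(t, ())) <= done
                     and times[t] <= cycle_time - load]  # fits in station
            if not ready:
                break
            pick = max(ready, key=lambda t: weight[t])   # highest weight
            station.append(pick)
            unassigned.remove(pick)
            load += times[pick]
        stations.append(station)
    return stations

# Four tasks with precedence A -> {B, C} -> D and a 7-minute cycle time
times = {"A": 4, "B": 3, "C": 5, "D": 2}
preds = {"B": ["A"], "C": ["A"], "D": ["B", "C"]}
stations = rpw_balance(times, preds, cycle_time=7)
efficiency = sum(times.values()) / (len(stations) * 7)   # line efficiency
```

For this toy instance the heuristic packs the 14 minutes of work into two 7-minute stations, giving a line efficiency of 14/(2*7) = 100% and a balance delay of 0%.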
Fabrication of titanium thermal protection system panels by the NOR-Ti-bond process
NASA Technical Reports Server (NTRS)
Wells, R. R.
1971-01-01
A method for fabricating titanium thermal protection system panels is described. The method has the potential for producing wide faying surface bonds to minimize temperature gradients and thermal stresses resulting during service at elevated temperatures. Results of nondestructive tests of the panels are presented. Concepts for improving the panel quality and for improved economy in production are discussed.
Intelligent form removal with character stroke preservation
NASA Astrophysics Data System (ADS)
Garris, Michael D.
1996-03-01
A new technique for intelligent form removal has been developed along with a new method for evaluating its impact on optical character recognition (OCR). All the dominant lines in the image are automatically detected using the Hough line transform and intelligently erased while simultaneously preserving overlapping character strokes by computing line width statistics and keying off certain visual cues. This new method of form removal operates on loosely defined zones with no image deskewing. Any field in which the writer is provided a horizontal line to enter a response can be processed by this method. Several examples of processed fields are provided, including a comparison of results between the new method and a commercially available form removal package. Even if this new form removal method did not improve character recognition accuracy, it would still be a significant improvement to the technology because the requirement of a priori knowledge of the form's geometric details has been greatly reduced. This relaxes the recognition system's dependence on rigid form design, printing, and reproduction by automatically detecting and removing some of the physical structures (lines) on the form. Using the National Institute of Standards and Technology (NIST) public domain form-based handprint recognition system, the technique was tested on a large number of fields containing randomly ordered handprinted lowercase alphabets, as these letters (especially those with descenders) frequently touch and extend through the line along which they are written. Preserving character strokes improves overall lowercase recognition performance by 3%, which is a net improvement, but a single performance number like this does not communicate how the recognition process was really influenced. Trade-offs are expected with the introduction of any new technique into a complex recognition system.
To understand both the improvements and the trade-offs, a new analysis was designed to compare the statistical distributions of individual confusion pairs between two systems. As OCR technology continues to improve, sophisticated analyses like this are necessary to reduce the errors remaining in complex recognition problems.
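The Hough line transform at the core of the line-detection step can be sketched with a plain accumulator-voting implementation (a generic illustration, not the NIST system's code):

```python
import numpy as np

def hough_lines(binary, n_theta=180, peak_frac=0.5):
    """Hough line transform: each foreground pixel votes for every
    (rho, theta) line passing through it, with rho = x*cos(theta) +
    y*sin(theta); strong accumulator peaks are the dominant lines."""
    ys, xs = np.nonzero(binary)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.hypot(*binary.shape)) + 1
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    # rho index for every (pixel, theta) pair, shifted to be non-negative
    rhos = np.round(np.outer(xs, np.cos(thetas))
                    + np.outer(ys, np.sin(thetas))).astype(int) + diag
    for j in range(n_theta):
        np.add.at(acc[:, j], rhos[:, j], 1)      # cast the votes
    peaks = np.argwhere(acc >= peak_frac * acc.max())
    return [(int(rho) - diag, float(thetas[j])) for rho, j in peaks]

# Synthetic form field: one dominant horizontal line at row 5
img = np.zeros((20, 20), dtype=bool)
img[5, :] = True
lines = hough_lines(img)
```

Each foreground pixel votes once per sampled angle, so the horizontal line shows up as a sharp peak near theta = pi/2 at rho = 5; a form-removal system would then erase pixels along such peaks while preserving strokes that cross them.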
Matrix model of the grinding process of cement clinker in the ball mill
NASA Astrophysics Data System (ADS)
Sharapov, Rashid R.
2018-02-01
In the article, attention is paid to improving the efficiency of production of fine powders, in particular Portland cement clinker, and the grinding of Portland cement clinker in closed-circuit ball mills is considered. It is noted that the main task in modeling the grinding process is predicting the granulometric composition of the finished product, taking into account the design and technological parameters of the ball mill and separator used. It is shown that the most complete and informative characterization of the grinding process in a ball mill is a grinding matrix that accounts for the transformation of the grain composition inside the mill drum: each entry gives the relative mass fraction of the crushed material that passes from one size fraction to the corresponding product fraction. An actual task is the reconstruction of the grinding matrix from experimental data obtained on real operating installations. On the basis of experimental data obtained on industrial installations, the matrix method is used to determine the kinetics of the grinding process in closed-circuit ball mills, and a calculation method for the conversion of the grain composition of the crushed material along the mill drum is developed. With the proposed approach, processing methods can be optimized to improve the manufacturing process of Portland cement clinker.
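A toy version of such a grinding matrix (hypothetical selection and breakage numbers, purely illustrative) shows how a feed composition is transformed in one pass:

```python
import numpy as np

# Size fractions ordered coarse -> fine; f is the feed mass-fraction vector.
f = np.array([0.5, 0.3, 0.2])          # feed grain composition
S = np.diag([0.6, 0.4, 0.0])           # selection: fraction broken per pass
B = np.array([[0.0, 0.0, 0.0],         # breakage: where broken mass lands
              [0.7, 0.0, 0.0],         # 70% of broken coarse -> middle
              [0.3, 1.0, 0.0]])        # the rest -> fines
G = np.eye(3) - S + B @ S              # one-pass grinding matrix
p = G @ f                              # product composition after one pass
```

Because every column of B corresponding to a broken fraction sums to one, mass is conserved; the evolution of the grain composition along the mill drum can then be modelled by repeated application, p_n = G**n @ f.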
NASA Astrophysics Data System (ADS)
Shin, Hyeonwoo; Kang, Chan-mo; Baek, Kyu-Ha; Kim, Jun Young; Do, Lee-Mi; Lee, Changhee
2018-05-01
We present a novel method of fabricating low-temperature (180 °C), solution-processed zinc oxide (ZnO) transistors using a ZnO precursor that is blended with zinc hydroxide [Zn(OH)2] and zinc oxide hydrate (ZnO · H2O) in an ammonium solution. Using the proposed method, we successfully improved the electrical performance of the transistor in terms of mobility (μ), on/off current ratio (I on/I off), sub-threshold swing (SS), and operational stability. Our new approach to forming a ZnO film was systematically compared with previously proposed methods. An atomic force microscopy (AFM) image and an X-ray photoelectron spectroscopy (XPS) analysis showed that our method increases the ZnO crystallite size with fewer OH− impurities. Thus, we attribute the improved electrical performance to the better ZnO film formation achieved with the blending method.
NASA Astrophysics Data System (ADS)
Wang, Tai-Han; Huang, Da-Nian; Ma, Guo-Qing; Meng, Zhao-Hai; Li, Ye
2017-06-01
With the continuous development of full tensor gradiometer (FTG) measurement techniques, three-dimensional (3D) inversion of FTG data is becoming increasingly used in oil and gas exploration. For the fast processing and interpretation of large-scale high-precision data, the graphics processing unit (GPU) and preconditioning methods are very important in the data inversion. In this paper, an improved preconditioned conjugate gradient algorithm is proposed by combining the symmetric successive over-relaxation (SSOR) technique with the incomplete Cholesky decomposition conjugate gradient algorithm (ICCG). Since preparing the preconditioner requires extra time, a parallel implementation based on the GPU is proposed. The improved method is then applied to the inversion of noise-contaminated synthetic data to prove its suitability for the inversion of 3D FTG data. Results show that the parallel SSOR-ICCG algorithm on an NVIDIA Tesla C2050 GPU achieves a speedup of approximately 25 times over a serial program on a 2.0 GHz central processing unit (CPU). Real airborne gravity-gradiometry data from the Vinton salt dome (southwest Louisiana, USA) are also considered. Good results are obtained, which verifies the efficiency and feasibility of the proposed parallel method for fast inversion of 3D FTG data.
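The serial algorithm being accelerated, conjugate gradients with an SSOR preconditioner, can be sketched on the CPU as follows. This is a generic numpy-only illustration on a small SPD test matrix, not the paper's GPU code:

```python
import numpy as np

def solve_lower(L, r):
    """Forward substitution for a lower-triangular system L y = r."""
    y = np.zeros_like(r)
    for i in range(len(r)):
        y[i] = (r[i] - L[i, :i] @ y[:i]) / L[i, i]
    return y

def solve_upper(U, r):
    """Back substitution for an upper-triangular system U y = r."""
    y = np.zeros_like(r)
    for i in range(len(r) - 1, -1, -1):
        y[i] = (r[i] - U[i, i + 1:] @ y[i + 1:]) / U[i, i]
    return y

def ssor_pcg(A, b, omega=1.2, tol=1e-8, max_iter=500):
    """Conjugate gradients with the SSOR preconditioner
    M = (D + wL) D^-1 (D + wL)^T / (w(2 - w)), applied via two
    triangular solves per iteration. A must be symmetric positive definite."""
    d = np.diag(A)
    K = np.diag(d) + omega * np.tril(A, -1)

    def apply_Minv(r):
        return omega * (2.0 - omega) * solve_upper(K.T, d * solve_lower(K, r))

    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_Minv(r)
    p, rz = z.copy(), r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:     # converged
            break
        z = apply_Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small SPD test system: 1-D Laplacian stencil (a stand-in for a real kernel)
n = 50
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = ssor_pcg(A, b)
```

The two triangular solves are the serial bottleneck the paper targets: on a GPU they (and the matrix-vector products) are what the SSOR-ICCG parallelization speeds up.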
Yamamoto, J Jay; Malatestinic, Bill; Lehman, Angela; Juneja, Rattan
2010-01-01
The objective of this project was to improve the timing of inpatient insulin administration related to meal delivery and the scheduling of radiology tests, using the Lean Six Sigma method. A multidisciplinary hospital team and a Six Sigma team from a pharmaceutical manufacturer collaborated to evaluate the food delivery and radiology scheduling processes related to the timing of insulin administration. Key factors leading to problems within each system were addressed to improve the efficiency of each process while improving the timeliness of glucose testing and insulin administration. Standardizing the food delivery schedule and utilizing scorecards to track on-time meal deliveries to the floor enabled nursing to administer insulin more accurately in coordination with the delivery of meals. Increasing communication and restricting the scheduling of inpatient procedures during mealtimes reduced disruptions to insulin administration. Data at 6 months postimplementation demonstrated that the institution met goals for most primary outcome metrics, including increasing on-time meal delivery and the proportion of patients taking insulin scheduled for radiology tests during appropriate times. By implementing the recommendations identified via Lean Six Sigma, this collaborative effort improved the timing of inpatient insulin administration related to meal delivery and radiology testing.
Josefsberg, Jessica O; Buckland, Barry
2012-06-01
The evolution of vaccines (e.g., live attenuated, recombinant) and vaccine production methods (e.g., in ovo, cell culture) are intimately tied to each other. As vaccine technology has advanced, the methods to produce the vaccine have advanced and new vaccine opportunities have been created. These technologies will continue to evolve as we strive for safer and more immunogenic vaccines and as our understanding of biology improves. The evolution of vaccine process technology has occurred in parallel to the remarkable growth in the development of therapeutic proteins as products; therefore, recent vaccine innovations can leverage the progress made in the broader biotechnology industry. Numerous important legacy vaccines are still in use today despite their traditional manufacturing processes, with further development focusing on improving stability (e.g., novel excipients) and updating formulation (e.g., combination vaccines) and delivery methods (e.g., skin patches). Modern vaccine development is currently exploiting a wide array of novel technologies to create safer and more efficacious vaccines including: viral vectors produced in animal cells, virus-like particles produced in yeast or insect cells, polysaccharide conjugation to carrier proteins, DNA plasmids produced in E. coli, and therapeutic cancer vaccines created by in vitro activation of patient leukocytes. Purification advances (e.g., membrane adsorption, precipitation) are increasing efficiency, while innovative analytical methods (e.g., microsphere-based multiplex assays, RNA microarrays) are improving process understanding. Novel adjuvants such as monophosphoryl lipid A, which acts on antigen presenting cell toll-like receptors, are expanding the previously conservative list of widely accepted vaccine adjuvants. As in other areas of biotechnology, process characterization by sophisticated analysis is critical not only to improve yields, but also to determine the final product quality. 
From a regulatory perspective, Quality by Design (QbD) and Process Analytical Technology (PAT) are important initiatives that can be applied effectively to many types of vaccine processes. Universal demand for vaccines requires that a manufacturer plan to supply tens and sometimes hundreds of millions of doses per year at low cost. To enable broader use, there is intense interest in improving temperature stability to allow for excursions from a rigid cold chain supply, especially at the point of vaccination. Finally, there is progress in novel routes of delivery to move away from the traditional intramuscular injection by syringe approach. Copyright © 2012 Wiley Periodicals, Inc.
Watkins, Eren Youmans; Kemeter, Dave M; Spiess, Anita; Corrigan, Elizabeth; Kateley, Keri; Wills, John V; Mancha, Brent Edward; Nichols, Jerrica; Bell, Amy Millikan
2014-01-01
Lean Six Sigma (LSS) is a process improvement, problem-solving methodology used in business and manufacturing to improve the speed, quality, and cost of products. LSS can also be used to improve knowledge-based products integral to public health surveillance. An LSS project by the Behavioral Social Health Outcomes Program of the Army Institute of Public Health reduced the number of labor hours spent producing the routine surveillance of suicidal behavior publication. At baseline, the total number of labor hours was 448; after project completion, total labor hours were 199. Based on customer feedback, publication production was reduced from quarterly to annually. Process improvements enhanced group morale and established best practices in the form of standard operating procedures and business rules to ensure solutions are sustained. LSS project participation also fostered a change in the conceptualization of tasks and projects. These results demonstrate that LSS can be used to inform the public health process and should be considered a viable method of improving knowledge-based products and processes.
The development of a patient-specific method for physiotherapy goal setting: a user-centered design.
Stevens, Anita; Köke, Albère; van der Weijden, Trudy; Beurskens, Anna
2018-08-01
To deliver client-centered care, physiotherapists need to identify the patient's individual treatment goals. However, practical tools for involving patients in goal setting are lacking. The purpose of this study was to improve the frequently used Patient-Specific Complaints instrument in Dutch physiotherapy, and to develop it into a feasible method for improving physiotherapy goal setting. An iterative user-centered design was conducted in three phases, in co-creation with physiotherapists and patients. Their needs and preferences were identified by means of group meetings and questionnaires. The new method was tested in several field tests in physiotherapy practices. Four main objectives for improvement were formulated: clear instructions for the administration procedure, targeted use across the physiotherapy process, client-activating communication skills, and a client-centered attitude of the physiotherapist. A theoretical goal-setting framework and elements of shared decision making were integrated into the new method, called the Patient-Specific Goal-setting method, together with a practical training course. The user-centered approach resulted in a goal-setting method that is fully integrated in the physiotherapy process. The new goal-setting method contributes to a more structured approach to goal setting and enables patient participation and goal-oriented physiotherapy. Before large-scale implementation, its feasibility in physiotherapy practice needs to be investigated. Implications for rehabilitation: Involving patients and physiotherapists in the development and testing of a goal-setting method increases the likelihood of its feasibility in practice. The integration of a goal-setting method into the physiotherapy process offers the opportunity to focus more fully on the patient's goals. Patients should be informed about the aim of every step of the goal-setting process in order to increase their awareness and involvement.
Training physiotherapists to use a patient-specific method for goal setting is crucial for a correct application.
Innovations in coating technology.
Behzadi, Sharareh S; Toegel, Stefan; Viernstein, Helmut
2008-01-01
Despite representing one of the oldest pharmaceutical techniques, coating of dosage forms is still frequently used in pharmaceutical manufacturing. The aims of coating range from simply masking the taste or odour of drugs to the sophisticated controlling of site and rate of drug release. The high expectations for different coating technologies have required great efforts regarding the development of reproducible and controllable production processes. Basically, improvements in coating methods have focused on particle movement, spraying systems, and air and energy transport. Thereby, homogeneous distribution of coating material and increased drying efficiency should be accomplished in order to achieve high end product quality. Moreover, given the claim of the FDA to design the end product quality already during the manufacturing process (Quality by Design), the development of analytical methods for the analysis, management and control of coating processes has attracted special attention during recent years. The present review focuses on recent patents claiming improvements in pharmaceutical coating technology and intends to first familiarize the reader with the available procedures and to subsequently explain the application of different analytical tools. Aiming to structure this comprehensive field, coating technologies are primarily divided into pan and fluidized bed coating methods. Regarding pan coating procedures, pans rotating around inclined, horizontal and vertical axes are reviewed separately. On the other hand, fluidized bed technologies are subdivided into those involving fluidized and spouted beds. Then, continuous processing techniques and improvements in spraying systems are discussed in dedicated chapters. Finally, currently used analytical methods for the understanding and management of coating processes are reviewed in detail in the last section of the review.
NASA Astrophysics Data System (ADS)
Nurdiyanto, Heri; Rahim, Robbi; Wulan, Nur
2017-12-01
Symmetric-key cryptographic algorithms are known to have more weaknesses in the encryption process than asymmetric algorithms. A symmetric stream cipher works by XORing the plaintext with a key stream. To improve the security of a symmetric stream cipher, this work improvises by using a Triple Transposition Key, developed from the Transposition Cipher, and applies the Base64 algorithm as the final encoding step; experiments show that the resulting ciphertext is sufficiently random.
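The basic scheme described, repeating-key XOR followed by a Base64 encoding pass, can be illustrated with a minimal sketch. This is a generic illustration only: the Triple Transposition Key derivation is not specified in the abstract, so a plain byte key is assumed, and a cipher like this is not secure for real use.

```python
import base64
from itertools import cycle

def xor_stream(data: bytes, key: bytes) -> bytes:
    """XOR each byte with a repeating key stream (symmetric: same op both ways)."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR encryption, then Base64 as the final encoding step."""
    return base64.b64encode(xor_stream(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return xor_stream(base64.b64decode(ciphertext), key)

ct = encrypt(b"attack at dawn", b"secret")
assert decrypt(ct, b"secret") == b"attack at dawn"
```

Because XOR is its own inverse, decryption simply reverses the Base64 step and reapplies the same key stream.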
Kim, Christopher S.; Hayman, James A.; Billi, John E.; Lash, Kathy; Lawrence, Theodore S.
2007-01-01
Purpose Patients with bone and brain metastases are among the most symptomatic nonemergency patients treated by radiation oncologists. Treatment should begin as soon as possible after the request is generated. We tested the hypothesis that the operational improvement method based on lean thinking could help streamline the treatment of our patients referred for bone and brain metastases. Methods University of Michigan Health System has adopted lean thinking as a consistent approach to quality and process improvement. We applied the principles and tools of lean thinking, especially value as defined by the customer, value stream mapping processes, and one piece flow, to improve the process of delivering care to patients referred for bone or brain metastases. Results and Conclusion The initial evaluation of the process revealed that it was rather chaotic and highly variable. Implementation of the lean thinking principles permitted us to improve the process by cutting the number of individual steps to begin treatment from 27 to 16 and minimize variability by applying standardization. After an initial learning period, the percentage of new patients with brain or bone metastases receiving consultation, simulation, and treatment within the same day rose from 43% to nearly 95%. By implementing the ideas of lean thinking, we improved the delivery of clinical care for our patients with bone or brain metastases. We believe these principles can be applied to much of the care administered throughout our and other health care delivery areas. PMID:20859409
Kruskal, Jonathan B; Reedy, Allen; Pascal, Laurie; Rosen, Max P; Boiselle, Phillip M
2012-01-01
Many hospital radiology departments are adopting "lean" methods developed in automobile manufacturing to improve operational efficiency, eliminate waste, and optimize the value of their services. The lean approach, which emphasizes process analysis, has particular relevance to radiology departments, which depend on a smooth flow of patients and uninterrupted equipment function for efficient operation. However, the application of lean methods to isolated problems is not likely to improve overall efficiency or to produce a sustained improvement. Instead, the authors recommend a gradual but continuous and comprehensive "lean transformation" of work philosophy and workplace culture. Fundamental principles that must consistently be put into action to achieve such a transformation include equal involvement of and equal respect for all staff members, elimination of waste, standardization of work processes, improvement of flow in all processes, use of visual cues to communicate and inform, and use of specific tools to perform targeted data collection and analysis and to implement and guide change. Many categories of lean tools are available to facilitate these tasks: value stream mapping for visualizing the current state of a process and identifying activities that add no value; root cause analysis for determining the fundamental cause of a problem; team charters for planning, guiding, and communicating about change in a specific process; management dashboards for monitoring real-time developments; and a balanced scorecard for strategic oversight and planning in the areas of finance, customer service, internal operations, and staff development. © RSNA, 2012.
ERIC Educational Resources Information Center
Aldowaisan, Tariq; Allahverdi, Ali
2016-01-01
This paper describes the process employed by the Industrial and Management Systems Engineering programme at Kuwait University to continuously improve the programme. Using a continuous improvement framework, the paper demonstrates how various qualitative and quantitative analyses methods, such as hypothesis testing and control charts, have been…
Improving the Pedagogy Associated with the Teaching of Psychopharmacology
ERIC Educational Resources Information Center
Glick, Ira D.; Salzman, Carl; Cohen, Bruce M.; Klein, Donald F.; Moutier, Christine; Nasrallah, Henry A.; Ongur, Dost; Wang, Po; Zisook, Sidney
2007-01-01
Objective: The authors summarize two special sessions focused on the teaching of psychopharmacology at the 2003 and 2004 annual meeting of the American College of Neuropsychopharmacology (ACNP). The focus was on whether "improving the teaching-learning process" in psychiatric residency programs could improve clinical practice. Method: Problems of…
Meeuwse, Marco
2018-03-30
Lean Six Sigma is an improvement method combining Lean, which focuses on removing 'waste' from a process, with Six Sigma, which is a data-driven approach making use of statistical tools. Traditionally it is used to improve the quality of products (reducing defects) or processes (reducing variability). However, it can also be used as a tool to increase the productivity or capacity of a production plant. The Lean Six Sigma methodology is therefore an important pillar of continuous improvement within DSM. In the example shown here, a multistep batch process is improved by analyzing the duration of the relevant process steps and optimizing the procedures. Process steps were performed in parallel instead of sequentially, and some steps were shortened. The variability was reduced, making tighter planning possible and thereby reducing waiting times. Without any investment in new equipment or technical modifications, the productivity of the plant was improved by more than 20%, solely by changing procedures and the programming of the process control system.
Li, Zhongwei; Liu, Xingjian; Wen, Shifeng; He, Piyao; Zhong, Kai; Wei, Qingsong; Shi, Yusheng; Liu, Sheng
2018-04-12
Lack of monitoring of the in situ process signatures is one of the challenges that has been restricting the improvement of Powder-Bed-Fusion Additive Manufacturing (PBF AM). Among various process signatures.
NASA Astrophysics Data System (ADS)
Hou, Zhenlong; Huang, Danian
2017-09-01
In this paper, we first study the inversion of probability tomography (IPT) with gravity gradiometry data. The spatial resolution of the results is improved by multi-tensor joint inversion, a depth-weighting matrix, and other methods. To handle the large data volumes encountered in exploration, we present a parallel algorithm, with a performance analysis, that combines Compute Unified Device Architecture (CUDA) with Open Multi-Processing (OpenMP) for Graphics Processing Unit (GPU) acceleration. Tests on a synthetic model and on real data from Vinton Dome yield improved results, demonstrating that the improved inversion algorithm is effective and feasible. The parallel algorithm we designed outperforms other CUDA implementations, with a maximum speedup of more than 200. In the performance analysis, multi-GPU speedup and multi-GPU efficiency are applied to analyze the scalability of the multi-GPU programs. The designed parallel algorithm is demonstrated to be able to process larger volumes of data, and the new analysis method is practical.
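The multi-GPU speedup and efficiency metrics used in the performance analysis are standard quantities; a small sketch with hypothetical timings (all numbers are assumptions, not taken from the paper):

```python
def speedup(t_base: float, t_parallel: float) -> float:
    """Wall-clock time on the baseline divided by time on the parallel version."""
    return t_base / t_parallel

def multi_gpu_efficiency(t_1gpu: float, t_ngpu: float, n: int) -> float:
    """Achieved multi-GPU speedup divided by the ideal speedup n."""
    return speedup(t_1gpu, t_ngpu) / n

# hypothetical timings in seconds: CPU baseline, single GPU, four GPUs
print(speedup(3600.0, 18.0))                # GPU-vs-CPU speedup on the order of 200x
print(multi_gpu_efficiency(60.0, 18.0, 4))  # fraction of ideal 4-GPU scaling
```

Efficiency close to 1.0 indicates near-linear scaling; values well below 1.0 suggest communication or load-balancing overhead.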
Wu, Sheng; Jin, Qibing; Zhang, Ridong; Zhang, Junfeng; Gao, Furong
2017-07-01
In this paper, an improved constrained tracking control design is proposed for batch processes under uncertainties. A new process model that facilitates process state and tracking error augmentation with further additional tuning is first proposed. Then a subsequent controller design is formulated using robust stable constrained MPC optimization. Unlike conventional robust model predictive control (MPC), the proposed method enables the controller design to bear more degrees of tuning so that improved tracking control can be acquired, which is very important since uncertainties exist inevitably in practice and cause model/plant mismatches. An injection molding process is introduced to illustrate the effectiveness of the proposed MPC approach in comparison with conventional robust MPC. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Novel Automated Blood Separations Validate Whole Cell Biomarkers
Burger, Douglas E.; Wang, Limei; Ban, Liqin; Okubo, Yoshiaki; Kühtreiber, Willem M.; Leichliter, Ashley K.; Faustman, Denise L.
2011-01-01
Background Progress in clinical trials in infectious disease, autoimmunity, and cancer is stymied by a dearth of successful whole cell biomarkers for peripheral blood lymphocytes (PBLs). Successful biomarkers could help to track drug effects at early time points in clinical trials to prevent costly trial failures late in development. One major obstacle is the inaccuracy of Ficoll density centrifugation, the decades-old method of separating PBLs from the abundant red blood cells (RBCs) of fresh blood samples. Methods and Findings To replace the Ficoll method, we developed and studied a novel blood-based magnetic separation method. The magnetic method strikingly surpassed Ficoll in viability, purity and yield of PBLs. To reduce labor, we developed an automated platform and compared two magnet configurations for cell separations. These more accurate and labor-saving magnet configurations allowed the lymphocytes to be tested in bioassays for rare antigen-specific T cells. The automated method succeeded at identifying 79% of patients with the rare PBLs of interest as compared with Ficoll's uniform failure. We validated improved upfront blood processing and show accurate detection of rare antigen-specific lymphocytes. Conclusions Improving, automating and standardizing lymphocyte detections from whole blood may facilitate development of new cell-based biomarkers for human diseases. Improved upfront blood processes may lead to broad improvements in monitoring early trial outcome measurements in human clinical trials. PMID:21799852
ERIC Educational Resources Information Center
Davenport, Carol
2013-01-01
Three methods from different schools illustrate how the cyclic process of action research can be used to develop teaching skills. The importance of learning from successful and unsuccessful lessons or parts of lessons is emphasised as the basis for development and improvement. This process can be carried out on an individual basis but development…
Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design
NASA Astrophysics Data System (ADS)
Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.
1987-04-01
Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high quality products at low cost. The method was introduced to us by Professor Genichi Taguchi who is a Deming-award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 µm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of size very near the target dimension. Windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) Identify important manipulatable process factors and their potential working levels. ii) Perform fractional factorial experiments on the process using orthogonal array designs. iii) Analyze the resulting data to determine the optimum operating levels of the factors. Both the process mean and the process variance are considered in this analysis. iv) Conduct an additional experiment to verify that the new factor levels indeed give an improvement.
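The analysis step (iii), determining optimum factor levels from an orthogonal-array experiment, can be sketched as follows. The array, the run data, and the `level_means` helper are hypothetical illustrations of the technique, not the authors' actual experimental design:

```python
from statistics import mean

# Hypothetical L4(2^3) orthogonal array: one row per run, one 0/1 level per factor
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
# Hypothetical measured window sizes (um) for each run, two replicates each
runs = [[3.4, 3.6], [3.0, 3.2], [3.8, 4.0], [3.5, 3.7]]

def level_means(factor: int) -> list:
    """Average response at each level of one factor, pooled across the array."""
    out = []
    for lvl in (0, 1):
        vals = [x for row, ys in zip(L4, runs) if row[factor] == lvl for x in ys]
        out.append(mean(vals))
    return out

# Compare the mean window size at the two levels of factor 0; the level whose
# mean is nearer the target (and whose variance is smaller) would be chosen.
print(level_means(0))
```

In Taguchi's method the same pooling is also done for a dispersion statistic (e.g. log variance), so that variance is reduced first and the mean is then adjusted onto target.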
Processing time tolerance-based ACO algorithm for solving job-shop scheduling problem
NASA Astrophysics Data System (ADS)
Luo, Yabo; Waden, Yongo P.
2017-06-01
The Job Shop Scheduling Problem (JSSP) is an NP-hard problem whose uncertainty and complexity cannot be handled by linear methods, so current studies concentrate mainly on improving heuristics for optimizing the JSSP. However, problems of low efficiency and poor reliability remain, which can easily trap the optimization process in local optima. To address this, this paper studies an Ant Colony Optimization (ACO) algorithm combined with constraint-handling tactics. The work is subdivided into three parts: (1) analysis of processing time tolerance-based constraint features in the JSSP, performed with a constraint-satisfaction model; (2) satisfaction of the constraints through consistency technology and a constraint-spreading algorithm to improve the performance of the ACO algorithm, from which the JSSP model based on the improved ACO algorithm is constructed; (3) demonstration of the effectiveness of the proposed method, in terms of reliability and efficiency, through comparative experiments on benchmark problems. The results obtained by the proposed method are better, and the applied technique can be used in optimizing the JSSP.
Larson, David B; Mickelsen, L Jake; Garcia, Kandice
2016-01-01
Performance improvement in a complex health care environment depends on the cooperation of diverse individuals and groups, allocation of time and resources, and use of effective improvement methods. To address this challenge, we developed an 18-week multidisciplinary training program that would also provide a vehicle for effecting needed improvements, by using a team- and project-based model. The program began in the radiology department and subsequently expanded to include projects from throughout the medical center. Participants were taught a specific method for team-based problem solving, which included (a) articulating the problem, (b) observing the process, (c) analyzing possible causes of problems, (d) identifying key drivers, (e) testing and refining interventions, and (f) providing for sustainment of results. Progress was formally reviewed on a weekly basis. A total of 14 teams consisting of 78 participants completed the course in two cohorts; one project was discontinued. All completed projects resulted in at least modest improvement. Mean skill scores increased from 2.5/6 to 4.5/6 (P < .01), and the mean satisfaction score was 4.7/5. Identified keys to success include (a) engagement of frontline staff, (b) teams given authority to make process changes, (c) capable improvement coaches, (d) a physician-director with improvement expertise and organizational authority, (e) capable administrative direction, (f) supportive organizational leaders, (g) weekly progress reviews, (h) timely educational material, (i) structured problem-solving methods, and (j) multiple projects working simultaneously. The purpose of this article is to review the program, including the methods and results, and discuss perceived keys to program success. © RSNA, 2016.
Particle Morphology Analysis of Biomass Material Based on Improved Image Processing Method
Lu, Zhaolin
2017-01-01
Particle morphology, including size and shape, is an important factor that significantly influences the physical and chemical properties of biomass material. Based on image processing technology, a method was developed to process sample images, measure particle dimensions, and analyse the particle size and shape distributions of knife-milled wheat straw, which had been preclassified into five nominal size groups using a mechanical sieving approach. Considering the great variation of particle size, from micrometers to millimeters, the powders larger than 250 μm were photographed by a flatbed scanner without zoom function, and the others were photographed using a scanning electron microscope (SEM) with high image resolution. Actual imaging tests confirmed the excellent effect of the backscattered electron (BSE) imaging mode of the SEM. Particle aggregation is an important factor that affects the recognition accuracy of the image processing method. In sample preparation, singulated-arrangement and ultrasonic dispersion methods were used to separate powders into particles larger and smaller than the nominal size of 250 μm, respectively. In addition, an image segmentation algorithm based on particle geometrical information was proposed to recognise the finer clustered powders. Experimental results demonstrated that the improved image processing method was suitable for analysing the particle size and shape distributions of ground biomass materials and resolved the size inconsistencies of sieving analysis. PMID:28298925
Applying PCI in Combination Swivel Head Wrench
NASA Astrophysics Data System (ADS)
Chen, Tsang-Chiang; Yang, Chun-Ming; Hsu, Chang-Hsien; Hung, Hsiang-Wen
2017-09-01
Taiwan’s traditional industries face competition from globalization and environmental change, which places them under economic pressure. To remain sustainable, businesses must continually improve production efficiency and technological quality in order to stabilize their market position and obtain high market share. This study uses process capability indices to monitor the quality of a dual-use ratchet wrench: the key functions of the wrench are identified, actual measurement data are collected, the process capability index Cpk is analyzed, and a Process Capability Analysis Chart model is drawn. Finally, the study examines the current situation of this case, identifies shortcomings, and proposes improvement methods to raise overall quality and thereby strengthen the industry as a whole.
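The Cpk index used in the study has a standard definition: the distance from the process mean to the nearer specification limit, in units of three standard deviations. A minimal sketch with hypothetical measurement data (the specification limits and values are assumptions, not taken from the study):

```python
from statistics import mean, stdev

def cpk(samples, lsl: float, usl: float) -> float:
    """Process capability index: distance from the mean to the nearest
    spec limit, divided by three standard deviations."""
    mu, sigma = mean(samples), stdev(samples)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

# hypothetical torque measurements for a wrench feature, spec limits 9.0-11.0
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.1]
print(round(cpk(data, 9.0, 11.0), 2))
```

A Cpk of 1.33 or above is a common rule-of-thumb threshold for a capable process; lower values signal that the process is off-center or too variable relative to its specification.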
Improving the preparticipation exam process.
Reed, F E
2001-08-01
The Preparticipation Exam for too long has been a mandatory yearly athletic exam and not the base from which a process of continuous athletic care took place. The purpose of this article is not only to introduce improvements in the exam itself but to also describe some extensions of the process that allow us to improve athletic care in South Carolina. It is hoped that a software scanning program will allow compiling of demographic data from individual and group examinations and thus support the method of exam preferred by all physicians in our state. Standard forms will also facilitate communication within the Athletic Care Unit and between physicians involved in athletic care.
Gupta, Munish; Kaplan, Heather C
2017-09-01
Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.
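A minimal sketch of the control-chart idea described here, using a Shewhart individuals chart with three-sigma limits and the simplest special-cause rule (a point falling outside the limits). The data and function names are hypothetical illustrations, not from the review:

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Shewhart-style limits: center line plus/minus three standard deviations
    estimated from an in-control baseline period."""
    mu, sigma = mean(baseline), stdev(baseline)
    return mu - 3 * sigma, mu, mu + 3 * sigma

def special_cause(points, lcl: float, ucl: float):
    """Flag points outside the control limits; everything inside is treated
    as common cause variation."""
    return [x for x in points if x < lcl or x > ucl]

baseline = [12, 14, 13, 15, 12, 14, 13, 13, 14, 12]  # hypothetical weekly counts
lcl, cl, ucl = control_limits(baseline)
print(special_cause([13, 14, 22, 12], lcl, ucl))
```

Production control charts estimate sigma from moving ranges rather than the raw standard deviation, and add further run rules (e.g. eight consecutive points on one side of the center line), but the logic is the same: separate special cause signals from common cause noise before reacting.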
Sun, Hao; Guo, Jianbin; Wu, Shubiao; Liu, Fang; Dong, Renjie
2017-09-01
The volatile fatty acids (VFAs) concentration has been considered as one of the most sensitive process performance indicators in anaerobic digestion (AD) process. However, the accurate determination of VFAs concentration in AD processes normally requires advanced equipment and complex pretreatment procedures. A simplified method with fewer sample pretreatment procedures and improved accuracy is greatly needed, particularly for on-site application. This report outlines improvements to the Nordmann method, one of the most popular titrations used for VFA monitoring. The influence of ion and solid interfering subsystems in titrated samples on results accuracy was discussed. The total solid content in titrated samples was the main factor affecting accuracy in VFA monitoring. Moreover, a high linear correlation was established between the total solids contents and VFA measurement differences between the traditional Nordmann equation and gas chromatography (GC). Accordingly, a simplified titration method was developed and validated using a semi-continuous experiment of chicken manure anaerobic digestion with various organic loading rates. The good fitting of the results obtained by this method in comparison with GC results strongly supported the potential application of this method to VFA monitoring. Copyright © 2017. Published by Elsevier Ltd.
TBDQ: A Pragmatic Task-Based Method to Data Quality Assessment and Improvement
Vaziri, Reza; Mohsenzadeh, Mehran; Habibi, Jafar
2016-01-01
Organizations are increasingly accepting data quality (DQ) as a major key to their success. In order to assess and improve DQ, methods have been devised. Many of these methods attempt to raise DQ by directly manipulating low quality data. Such methods operate reactively and are suitable for organizations with highly developed integrated systems. However, there is a lack of a proactive DQ method for businesses with weak IT infrastructure where data quality is largely affected by tasks that are performed by human agents. This study aims to develop and evaluate a new method for structured data, which is simple and practical so that it can easily be applied to real world situations. The new method detects the potentially risky tasks within a process, and adds new improving tasks to counter them. To achieve continuous improvement, an award system is also developed to help with the better selection of the proposed improving tasks. The task-based DQ method (TBDQ) is most appropriate for small and medium organizations, and simplicity in implementation is one of its most prominent features. TBDQ is case studied in an international trade company. The case study shows that TBDQ is effective in selecting optimal activities for DQ improvement in terms of cost and improvement. PMID:27192547
NASA Astrophysics Data System (ADS)
Bergen, K.; Yoon, C. E.; O'Reilly, O. J.; Beroza, G. C.
2015-12-01
Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
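Treating the detector output as a sparse similarity graph and grouping similar waveforms can be sketched as a connected-components pass over thresholded edges. This is a simplified illustration (the function name, threshold, and data are assumptions), not the FAST post-processing pipeline itself:

```python
def similar_groups(n_events: int, edges, threshold: float):
    """Cluster candidate events: union-find over similarity edges
    whose weight meets the threshold."""
    parent = list(range(n_events))

    def find(x: int) -> int:
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for i, j, w in edges:
        if w >= threshold:
            parent[find(i)] = find(j)

    groups = {}
    for v in range(n_events):
        groups.setdefault(find(v), []).append(v)
    return sorted(groups.values())

# hypothetical (event_i, event_j, similarity) edges from a similarity detector
edges = [(0, 1, 0.9), (1, 2, 0.85), (3, 4, 0.2)]
print(similar_groups(5, edges, 0.8))
```

Singleton groups here correspond to candidates with no strong matches, which are natural candidates for closer inspection or removal as false positives.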
Optimization of processing parameters of UAV integral structural components based on yield response
NASA Astrophysics Data System (ADS)
Chen, Yunsheng
2018-05-01
In order to improve the overall strength of unmanned aerial vehicles (UAVs), it is necessary to optimize the machining parameters of UAV structural components, which are affected by initial residual stress during machining. Because machining errors occur easily, an optimization model for the machining parameters of UAV integral structural components based on yield response is proposed. The finite element method is used to simulate the machining parameters of UAV integral structural components, and a prediction model of workpiece surface machining error is established. The influence of the tool path on the residual stress of the UAV integral structure is studied according to the stress state of the integral component. The yield response of the time-varying stiffness is analyzed, together with the stress evolution mechanism of the UAV integral structure. Simulation results show that the method optimizes the machining parameters of UAV integral structural components, improves the precision of UAV milling, and reduces machining error, realizing deformation prediction and error compensation for UAV integral structural parts and thus improving machining quality.
The maturing of the quality improvement paradigm in the SEL
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1993-01-01
The Software Engineering Laboratory uses a paradigm for improving the software process and product, called the quality improvement paradigm. This paradigm has evolved over the past 18 years, along with our software development processes and product. Since 1976, when we first began the SEL, we have learned a great deal about improving the software process and product, making a great many mistakes along the way. Quality improvement paradigm, as it is currently defined, can be broken up into six steps: characterize the current project and its environment with respect to the appropriate models and metrics; set the quantifiable goals for successful project performance and improvement; choose the appropriate process model and supporting methods and tools for this project; execute the processes, construct the products, and collect, validate, and analyze the data to provide real-time feedback for corrective action; analyze the data to evaluate the current practices, determine problems, record findings, and make recommendations for future project improvements; and package the experience gained in the form of updated and refined models and other forms of structured knowledge gained from this and prior projects and save it in an experience base to be reused on future projects.
Shao, Xueguang; Yu, Zhengliang; Ma, Chaoxiong
2004-06-01
An improved method is proposed for the quantitative determination of multicomponent overlapping chromatograms based on a known transmutation method. To overcome the main limitation of the transmutation method caused by the oscillation generated in the transmutation process, two techniques--wavelet transform smoothing and the cubic spline interpolation for reducing data points--were adopted, and a new criterion was also developed. By using the proposed algorithm, the oscillation can be suppressed effectively, and quantitative determination of the components in both the simulated and experimental overlapping chromatograms is successfully obtained.
NASA Astrophysics Data System (ADS)
Chen, Kai-Huang; Chang, Ting-Chang; Chang, Guan-Chang; Hsu, Yung-En; Chen, Ying-Chung; Xu, Hong-Quan
2010-04-01
To improve the electrical properties of as-deposited BZ1T9 ferroelectric thin films, a supercritical carbon dioxide fluid (SCF) process was used as a low-temperature treatment. In this study, the BZ1T9 ferroelectric thin films were post-treated with an SCF process mixed with propyl alcohol and pure H2O. After the SCF treatment, the remnant polarization in the hysteresis curves increased, and passivation of oxygen vacancies and defects was observed in the leakage current density curves. Additionally, the improved quality of the as-deposited BZ1T9 thin films after SCF treatment was confirmed by XPS, C-V, and J-E measurements.
ERIC Educational Resources Information Center
Prouty, Kenneth E.
2004-01-01
This essay examines how jazz educators construct methods for teaching the art of improvisation in institutionalized jazz studies programs. Unlike previous studies of the processes and philosophies of jazz instruction, I examine such processes from a cultural standpoint, to identify why certain methods might be favored over others. Specifically,…
Forest Service National Visitor Use Monitoring Process: Research Method Documentation
Donald B.K. English; Susan M. Kocis; Stanley J. Zarnoch; J. Ross Arnold
2002-01-01
In response to the need for improved information on recreational use of National Forest System lands, the authors have developed a nationwide, systematic monitoring process. This report documents the methods they used in estimating recreational use on an annual basis. The basic unit of measure is exiting volume of visitors from a recreation site on a given day. Sites...
NASA Astrophysics Data System (ADS)
Bascetin, A.
2007-04-01
The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located in the Seyitomer region in Turkey. The use of the proposed model indicates that it can be applied to improve group decision making in selecting a reclamation method that satisfies optimal specifications. It is also found that the decision process is systematic and that using the proposed model can reduce the time taken to select an optimal method.
Non-rigid ultrasound image registration using generalized relaxation labeling process
NASA Astrophysics Data System (ADS)
Lee, Jong-Ha; Seong, Yeong Kyeong; Park, MoonHo; Woo, Kyoung-Gu; Ku, Jeonghun; Park, Hee-Jun
2013-03-01
This research proposes a novel non-rigid registration method for ultrasound images. The most predominant anatomical features in medical images are tissue boundaries, which appear as edges. In ultrasound images, however, other features can be identified as well due to the specular reflections that appear as bright lines superimposed on the ideal edge location. In this work, an image's local phase information (via the frequency domain) is used to find the ideal edge location. The generalized relaxation labeling process is then formulated to align the feature points extracted from the ideal edge location. In this work, the original relaxation labeling method was generalized by taking n compatibility coefficient values to improve non-rigid registration performance. This contextual information combined with a relaxation labeling process is used to search for a correspondence. Then the transformation is calculated by the thin plate spline (TPS) model. These two processes are iterated until the optimal correspondence and transformation are found. We have tested our proposed method and the state-of-the-art algorithms with synthetic data and bladder ultrasound images of in vivo human subjects. Experiments show that the proposed method improves registration performance significantly, as compared to other state-of-the-art non-rigid registration algorithms.
Advanced Hydrogen Liquefaction Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Joseph; Kromer, Brian; Neu, Ben
2011-09-28
The project identified and quantified ways to reduce the cost of hydrogen liquefaction, and reduce the cost of hydrogen distribution. The goal was to reduce the power consumption by 20% and then to reduce the capital cost. Optimizing the process, improving process equipment, and improving ortho-para conversion significantly reduced the power consumption of liquefaction, but by less than 20%. Because the efficiency improvement was less than the target, the program was stopped before the capital cost was addressed. These efficiency improvements could provide a benefit to the public to improve the design of future hydrogen liquefiers. The project increased the understanding of hydrogen liquefaction by modeling different processes and thoroughly examining ortho-para separation and conversion. The process modeling provided a benefit to the public because the project incorporated para hydrogen into the process modeling software, so liquefaction processes can be modeled more accurately than using only normal hydrogen. Adding catalyst to the first heat exchanger, a simple method to reduce liquefaction power, was identified, analyzed, and quantified. The demonstrated performance of ortho-para separation is sufficient for at least one identified process concept to show reduced power cost when compared to hydrogen liquefaction processes using conventional ortho-para conversion. The impact of improved ortho-para conversion can be significant because ortho para conversion uses about 20-25% of the total liquefaction power, but performance improvement is necessary to realize a substantial benefit. Most of the energy used in liquefaction is for gas compression. Improvements in hydrogen compression will have a significant impact on overall liquefier efficiency. Improvements to turbines, heat exchangers, and other process equipment will have less impact.
Improved signal recovery for flow cytometry based on ‘spatially modulated emission’
NASA Astrophysics Data System (ADS)
Quint, S.; Wittek, J.; Spang, P.; Levanon, N.; Walther, T.; Baßler, M.
2017-09-01
Recently, the technique of ‘spatially modulated emission’ has been introduced (Baßler et al 2008 US Patent 0080181827A1; Kiesel et al 2009 Appl. Phys. Lett. 94 041107; Kiesel et al 2011 Cytometry A 79A 317-24), improving the signal-to-noise ratio (SNR) for detecting bio-particles in the field of flow cytometry. Based on this concept, we developed two advanced signal processing methods which further enhance the SNR and selectivity for cell detection. The improvements are achieved by adapting digital filtering methods from radar technology and mainly address offset elimination, increased signal dynamics, and the reduction of erroneous detections due to processing artifacts. We present a comprehensive theory of the SNR gain and provide experimental results for our concepts.
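As a rough illustration of the radar-style processing the abstract alludes to, a matched filter (cross-correlation with the known modulation pattern) recovers a weak modulated particle transit from a noisy trace; the mask, amplitude, and noise level below are invented for the demo and are not the authors' parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 64-element binary shadow mask (the spatial modulation pattern).
pattern = rng.integers(0, 2, 64).astype(float)
template = pattern - pattern.mean()     # zero-mean template suppresses DC offset

# Simulated detector trace: noise plus one particle transit starting at sample 300.
trace = rng.normal(0.0, 1.0, 1000)
true_pos = 300
trace[true_pos:true_pos + pattern.size] += 2.0 * pattern

# Matched filtering (borrowed from radar pulse compression): cross-correlate
# the trace with the template and take the strongest peak.
score = np.correlate(trace, template, mode="valid")
detected = int(np.argmax(score))        # estimated transit start sample
```

The SNR gain grows roughly with the square root of the mask length, which is why spatial modulation beats a single-spot excitation geometry.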
Post processing for offline Chinese handwritten character string recognition
NASA Astrophysics Data System (ADS)
Wang, YanWei; Ding, XiaoQing; Liu, ChangSong
2012-01-01
Offline Chinese handwritten character string recognition is one of the most important research fields in pattern recognition. Due to the free writing style, large variability in character shapes and different geometric characteristics, Chinese handwritten character string recognition is a challenging problem. Among current methods, however, the over-segmentation and merging method, which integrates geometric information, character recognition information and contextual information, shows promising results. It is found experimentally that a large part of the errors are segmentation errors, occurring mainly around non-Chinese characters. A Chinese character string contains not only wide characters, namely Chinese characters, but also narrow characters such as digits and letters of the alphabet. The segmentation errors are mainly caused by a uniform geometric model being imposed on all segmented candidate characters. To solve this problem, post-processing is employed to improve the recognition accuracy of narrow characters. On one hand, multi-geometric models are established for wide characters and narrow characters respectively; under multi-geometric models, narrow characters are less prone to being merged. On the other hand, the top-ranked recognition results of candidate paths are integrated to boost the final recognition of narrow characters. The post-processing method is evaluated on two datasets totalling 1405 handwritten address strings. Wide character recognition accuracy improved slightly, and narrow character recognition accuracy increased by 10.41% and 10.03% on the two datasets, respectively. This indicates that the post-processing method is effective in improving the recognition accuracy of narrow characters.
An Accurate and Stable FFT-based Method for Pricing Options under Exp-Lévy Processes
NASA Astrophysics Data System (ADS)
Ding, Deng; Chong U, Sio
2010-05-01
An accurate and stable method for pricing European options in exp-Lévy models is presented. The main idea of this new method is to combine the quadrature technique with the Carr-Madan fast Fourier transform method. Theoretical analysis shows that the overall complexity of the new method is still O(N log N) with N grid points, the same as the fast Fourier transform methods. Numerical experiments with different exp-Lévy processes also show that the numerical algorithm is accurate and stable even for small strike prices K, which develops and improves on the Carr-Madan method.
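A minimal sketch of the underlying Carr-Madan damped-transform representation, evaluated here by simple trapezoid quadrature rather than the paper's hybrid FFT scheme, and checked against the closed-form Black-Scholes price (geometric Brownian motion being one exp-Lévy example). Parameter values are illustrative:

```python
import numpy as np
from math import erf, exp, log, pi, sqrt

S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
alpha = 1.5                                    # Carr-Madan damping factor

def phi(u):
    """Characteristic function of ln(S_T) under Black-Scholes dynamics."""
    mu = log(S0) + (r - 0.5 * sigma**2) * T
    return np.exp(1j * u * mu - 0.5 * sigma**2 * u**2 * T)

def call_carr_madan(k, n=20000, vmax=200.0):
    """Carr-Madan damped call transform, integrated by the trapezoid rule."""
    v = np.linspace(0.0, vmax, n)
    psi = np.exp(-r * T) * phi(v - (alpha + 1) * 1j) / (
        alpha**2 + alpha - v**2 + 1j * (2 * alpha + 1) * v)
    integrand = np.real(np.exp(-1j * v * k) * psi)
    w = np.full(n, vmax / (n - 1)); w[0] *= 0.5; w[-1] *= 0.5  # trapezoid weights
    return exp(-alpha * k) / pi * float(integrand @ w)

def bs_call():
    """Closed-form Black-Scholes call price for comparison."""
    d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return S0 * N(d1) - K * exp(-r * T) * N(d2)

price = call_carr_madan(log(K))                # price at log-strike k = ln K
```

The damping factor alpha keeps the integrand square-integrable; the paper's contribution is replacing this naive quadrature/FFT trade-off with a scheme that stays accurate for small strikes.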
Adaptive Image Processing Methods for Improving Contaminant Detection Accuracy on Poultry Carcasses
USDA-ARS?s Scientific Manuscript database
Technical Abstract A real-time multispectral imaging system has been demonstrated as a science-based tool for fecal and ingesta contaminant detection during poultry processing. In order to implement this imaging system in the commercial poultry processing industry, the false positives must be removed. For doi...
Investigations in adaptive processing of multispectral data
NASA Technical Reports Server (NTRS)
Kriegler, F. J.; Horwitz, H. M.
1973-01-01
Adaptive data processing procedures are applied to the problem of classifying objects in a scene scanned by multispectral sensor. These procedures show a performance improvement over standard nonadaptive techniques. Some sources of error in classification are identified and those correctable by adaptive processing are discussed. Experiments in adaptation of signature means by decision-directed methods are described. Some of these methods assume correlation between the trajectories of different signature means; for others this assumption is not made.
2016-04-01
Practical Problem Solving Method RMD Resource Management Decision ROI Return on Investment SECAF Secretary of the Air Force SECNAV Secretary of...AFSO21 and now AF CPI, this program seeks to train and certify an organic cadre of CPI practitioners to support the use of its standard problem solving ...process known as the AF Practical Problem Solving Method (PPSM) to solve mission critical process deficiencies. The PPSM leverages several industry
Partial branch and bound algorithm for improved data association in multiframe processing
NASA Astrophysics Data System (ADS)
Poore, Aubrey B.; Yan, Xin
1999-07-01
A central problem in multitarget, multisensor, and multiplatform tracking remains that of data association. Lagrangian relaxation methods have shown themselves to yield near optimal answers in real-time. The necessary improvement in the quality of these solutions warrants a continuing interest in these methods. These problems are NP-hard; the only known methods for solving them optimally are enumerative in nature with branch-and-bound being most efficient. Thus, the development of methods less than a full branch-and-bound are needed for improving the quality. Such methods as K-best, local search, and randomized search have been proposed to improve the quality of the relaxation solution. Here, a partial branch-and-bound technique along with adequate branching and ordering rules are developed. Lagrangian relaxation is used as a branching method and as a method to calculate the lower bound for subproblems. The result shows that the branch-and-bound framework greatly improves the resolution quality of the Lagrangian relaxation algorithm and yields better multiple solutions in less time than relaxation alone.
Health-care process improvement decisions: a systems perspective.
Walley, Paul; Silvester, Kate; Mountford, Shaun
2006-01-01
The paper seeks to investigate decision-making processes within hospital improvement activity, to understand how performance measurement systems influence decisions and potentially lead to unsuccessful or unsustainable process changes. A longitudinal study over a 33-month period investigates key events, decisions and outcomes at one medium-sized hospital in the UK. Process improvement events are monitored using process control methods and by direct observation. The authors took a systems perspective of the health-care processes, ensuring that the impacts of decisions across the health-care supply chain were appropriately interpreted. The research uncovers the ways in which measurement systems disguise failed decisions and encourage managers to take a low-risk approach of "symptomatic relief" when trying to improve performance metrics. This prevents many managers from trying higher-risk, sustainable process improvement changes. The behaviour of the health-care system is not understood by many managers, and this leads to poor analysis of problem situations. Measurement using time-series methodologies, such as statistical process control, is vital for a better understanding of the systems impact of changes. Senior managers must also be aware of the behavioural influence of similar performance measurement systems that discourage sustainable improvement. There is a risk that such experiences will tarnish the reputation of performance management as a discipline. Recommends process control measures as a way of creating an organization memory of how decisions affect performance--something that is currently lacking.
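The time-series measurement the authors recommend can be illustrated with an individuals (XmR) control chart, the standard statistical process control tool: limits are set at the mean plus or minus 2.66 times the average moving range, and points outside them signal special-cause variation. The weekly counts below are hypothetical:

```python
import statistics

# Hypothetical weekly discharge-delay counts from a hospital process.
data = [12, 15, 11, 14, 13, 16, 12, 30, 14, 13, 15, 12]

mean = statistics.mean(data)
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = statistics.mean(moving_ranges)

# Individuals (XmR) chart limits: mean +/- 2.66 * average moving range.
ucl = mean + 2.66 * mr_bar
lcl = mean - 2.66 * mr_bar

# Points outside the limits are special-cause signals worth investigating,
# rather than routine variation to be "managed" with symptomatic relief.
signals = [x for x in data if x > ucl or x < lcl]
```

Here only the week with 30 delays breaches the upper limit, so the chart correctly separates one special cause from the ordinary week-to-week noise.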
Hybrid performance measurement of a business process outsourcing - A Malaysian company perspective
NASA Astrophysics Data System (ADS)
Oluyinka, Oludapo Samson; Tamyez, Puteri Fadzline; Kie, Cheng Jack; Freida, Ayodele Ozavize
2017-05-01
It is now well established that the value customers perceive in products and services is strongly influenced by their psychological and social advantages. To cope with rising operational costs and increasing demands on response time, quality and innovative capability, many companies have converted fixed operational costs into variable costs through outsourcing. The researchers therefore explored different underlying outsourcing theories and inferred that these theories are essential to performance improvement. In this study, the performance of a business process outsourcing company is evaluated using a combination of lean and agile methods. To test the hypotheses, we analyze the different sources of variability that a business process company faces and how lean and agile methods have been used in other industries to address them, and we discuss the results of a predictive multiple regression analysis on data collected from companies in Malaysia. The findings reveal that while each method has its own advantages, a business process outsourcing company could achieve up to an 87% increase in performance by developing a strategy that focuses on the right mixture of lean and agile improvement methods. Secondly, this study shows that performance indicators can be better evaluated with the non-metric variables of the agile method. Thirdly, this study also shows that business process outsourcing companies perform better when they concentrate on strengthening the internal process integration of employees.
Lynn, Joanne
2011-04-01
The methods for healthcare reform are strikingly underdeveloped, with much reliance on political power. A methodology that combined methods from sources such as clinical trials, experience-based wisdom, and improvement science could be among the aims of the upcoming work in the USA on comparative effectiveness and on the agenda of the Center for Medicare and Medicaid Innovation in the Centers for Medicare and Medicaid Services. Those working in quality improvement have an unusual opportunity to generate substantial input into these processes through professional organisations such as the Academy for Healthcare Improvement and dominant leadership organisations such as the Institute for Healthcare Improvement.
Modification of polymers by polymeric additives
NASA Astrophysics Data System (ADS)
Nesterov, A. E.; Lebedev, E. V.
1989-08-01
The conditions for the thermodynamic compatibility of polymers and methods for its enhancement are examined. The study of the influence of various factors on the concentration-temperature limits of compatibility, dispersion stabilisation processes, and methods for the improvement of adhesion between phases in mixtures of thermodynamically incompatible polymers is described. Questions concerning the improvement of the physicomechanical characteristics of polymer dispersions are considered. The bibliography includes 200 references.
Tan, Ryan Y C; Met-Domestici, Marie; Zhou, Ke; Guzman, Alexis B; Lim, Soon Thye; Soo, Khee Chee; Feeley, Thomas W; Ngeow, Joanne
2016-03-01
To meet increasing demand for cancer genetic testing and improve value-based cancer care delivery, National Cancer Centre Singapore restructured the Cancer Genetics Service in 2014. Care delivery processes were redesigned. We sought to improve access by increasing the clinic capacity of the Cancer Genetics Service by 100% within 1 year without increasing direct personnel costs. Process mapping and plan-do-study-act (PDSA) cycles were used in a quality improvement project for the Cancer Genetics Service clinic. The impact of interventions was evaluated by tracking the weekly number of patient consultations and access times for appointments between April 2014 and May 2015. The cost impact of implemented process changes was calculated using the time-driven activity-based costing method. Our study completed two PDSA cycles. An important outcome was achieved after the first cycle: The inclusion of a genetic counselor increased clinic capacity by 350%. The number of patients seen per week increased from two in April 2014 (range, zero to four patients) to seven in November 2014 (range, four to 10 patients). Our second PDSA cycle showed that manual preappointment reminder calls reduced the variation in the nonattendance rate and contributed to a further increase in patients seen per week to 10 in May 2015 (range, seven to 13 patients). There was a concomitant decrease in costs of the patient care cycle by 18% after both PDSA cycles. This study shows how quality improvement methods can be combined with time-driven activity-based costing to increase value. In this paper, we demonstrate how we improved access while reducing costs of care delivery. Copyright © 2016 by American Society of Clinical Oncology.
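The costing side can be sketched with the basic time-driven activity-based costing relation, cost = capacity cost rate × time consumed, summed over the personnel involved in a care cycle. The roles, rates, and minutes below are hypothetical, not the paper's figures:

```python
# Time-driven activity-based costing (TDABC): the cost of one care cycle is
# the sum of (capacity cost rate per minute) x (minutes consumed) per role.
# Rates and times are hypothetical illustrations, not the study's data.
cost_per_min = {"geneticist": 6.0, "genetic_counselor": 2.0, "admin": 0.8}

def cycle_cost(steps):
    """steps: list of (role, minutes) pairs for one patient care cycle."""
    return sum(cost_per_min[role] * minutes for role, minutes in steps)

before = cycle_cost([("geneticist", 60), ("admin", 20)])           # physician-led
after = cycle_cost([("genetic_counselor", 45), ("geneticist", 15),
                    ("admin", 20)])                                # counselor-led
saving = (before - after) / before
```

Shifting the bulk of the consultation to a lower-cost-rate role is exactly the kind of process change whose financial impact TDABC makes visible, which is how the study could pair PDSA cycles with an 18% cost reduction.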
The potential application of behavior-based safety in the trucking industry
DOT National Transportation Integrated Search
2000-04-01
Behavior-based safety (BBS) is a set of methods to improve safety performance in the workplace by engaging workers in the improvement process, identifying critical safety behaviors, performing observations to gather data, providing feedback to encour...
Student Receivables Management: Opportunities for Improved Practices.
ERIC Educational Resources Information Center
Jacquin, Jules C.; Goyal, Anil K.
1995-01-01
The college or university's business office can help reduce problems with student receivables through procedural review of the tuition revenue process, application of analytical methods, and improved operating practices. Admissions, financial aid, and billing offices must all be involved. (MSE)
Template for success: using a resident-designed sign-out template in the handover of patient care.
Clark, Clancy J; Sindell, Sarah L; Koehler, Richard P
2011-01-01
Report our implementation of a standardized handover process in a general surgery residency program. The standardized handover process, sign-out template, method of implementation, and continuous quality improvement process were designed by general surgery residents with support of faculty and senior hospital administration using standard work principles and business models of the Virginia Mason Production System and the Toyota Production System. Nonprofit, tertiary referral teaching hospital. General surgery residents, residency faculty, patient care providers, and hospital administration. After instruction in quality improvement initiatives, a team of general surgery residents designed a sign-out process using an electronic template and standard procedures. The initial implementation phase resulted in 73% compliance. Using resident-driven continuous quality improvement processes, real-time feedback enabled residents to modify and improve this process, eventually attaining 100% compliance and acceptance by residents. The creation of a standardized template and protocol for patient handovers might eliminate communication failures. Encouraging residents to participate in this process can establish the groundwork for successful implementation of a standardized handover process. Integrating a continuous quality-improvement process into such an initiative can promote active participation of busy general surgery residents and lead to successful implementation of standard procedures. Copyright © 2011 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.
2012-04-01
In the context of a national energy company (EDF: Électricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise of hydrological forecasts, which is essential to synthesize available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where a large amount of human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical model of the empirical error of perfect forecasts, using streamflow sub-samples by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples by quantile class, streamflow variation and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure a good post-processing of the hydrological ensemble, allowing a good improvement of the reliability, skill and sharpness of ensemble forecasts.
The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take hydrological dynamics and processes into account, i.e. sample heterogeneity: the same streamflow range corresponds to different processes, such as rising limbs or recessions, where the uncertainties differ. The dynamical approach improves the reliability, skill and sharpness of forecasts and globally reduces confidence interval width. When compared in detail, the dynamical approach allows a noticeable reduction of confidence intervals during recessions, where uncertainty is relatively low, and a slight increase of confidence intervals during rising limbs or snowmelt, where uncertainty is greater. The dynamical approach, validated by forecasters' experience that considered the empirical approach not discriminative enough, improved forecasters' confidence and the communication of uncertainties. Montanari, A. and Brath, A., (2004). A stochastic approach for assessing the uncertainty of rainfall-runoff simulations. Water Resources Research, 40, W01106, doi:10.1029/2003WR002540. Schaefli, B., Balin Talamba, D. and Musy, A., (2007). Quantifying hydrological modeling errors through a mixture of normal distributions. Journal of Hydrology, 332, 303-315.
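The empirical approach described above, pooling past model errors by forecast quantile class and using them to dress a new deterministic forecast, can be sketched as follows, with synthetic flows standing in for the archive of perfect forecasts:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical archive of past "perfect" model forecasts and observed flows (m3/s).
fcst = rng.gamma(2.0, 50.0, 5000)
obs = fcst * rng.lognormal(0.0, 0.15, 5000)     # multiplicative model error

# Empirical approach: pool relative errors by quantile class of the forecast.
edges = np.quantile(fcst, [0.0, 0.25, 0.5, 0.75, 1.0])
classes = np.clip(np.searchsorted(edges, fcst, side="right") - 1, 0, 3)
error_pools = [obs[classes == c] / fcst[classes == c] for c in range(4)]

def dress(new_fcst):
    """Dress a deterministic forecast with the error sample of its quantile class."""
    c = int(np.clip(np.searchsorted(edges, new_fcst, side="right") - 1, 0, 3))
    return new_fcst * error_pools[c]             # an empirical predictive ensemble

ens = dress(120.0)
```

The dynamical approach would additionally condition the pools on streamflow variation (rising vs. falling), which is precisely the sample-heterogeneity refinement the abstract argues for.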
NATO COMMITTEE ON THE CHALLENGES TO MODERN SOCIETY PILOT STUDY: CLEAN PRODUCTS AND PROCESSES
Promote cooperation for improving the common pollution landscape by stimulating cross-national dialogues and collaboration. Share knowledge on the methods, tools, and technologies for making cleaner products and processes possible.
Process for using surface strain measurements to obtain operational loads for complex structures
NASA Technical Reports Server (NTRS)
Ko, William L. (Inventor); Richards, William Lance (Inventor)
2010-01-01
The invention is an improved process for using surface strain data to obtain real-time, operational loads data for complex structures that significantly reduces the time and cost versus current methods.
Postprocessing for character recognition using pattern features and linguistic information
NASA Astrophysics Data System (ADS)
Yoshikawa, Takatoshi; Okamoto, Masayosi; Horii, Hiroshi
1993-04-01
We propose a new method of post-processing for character recognition using pattern features and linguistic information. This method corrects errors in the recognition of handwritten Japanese sentences containing Kanji characters, and is characterized by using two types of character recognition. Improving the character recognition rate for Japanese is made difficult by the large number of characters and the existence of characters with similar patterns, so it is not practical for a character recognition system to recognize all characters in detail. First, this post-processing method generates a candidate character table by recognizing the simplest features of characters. Then, it selects words corresponding to the characters in the candidate character table by referring to a word and grammar dictionary, before selecting suitable words. If the correct character is included in the candidate character table, this process can correct an error; if it is not included, it cannot. Therefore, this method uses linguistic information (the word and grammar dictionary) to presume characters that are missing from the candidate character table, and then verifies each presumed character by character recognition using complex features. When this method is applied to an online character recognition system, the character recognition accuracy improves from 93.5% to 94.7%. This proved to be the case when it was used on editorials from a Japanese newspaper (Asahi Shinbun).
Integrated aerodynamic-structural design of a forward-swept transport wing
NASA Technical Reports Server (NTRS)
Haftka, Raphael T.; Grossman, Bernard; Kao, Pi-Jen; Polen, David M.; Sobieszczanski-Sobieski, Jaroslaw
1989-01-01
The introduction of composite materials is having a profound effect on aircraft design. Since these materials permit the designer to tailor material properties to improve structural, aerodynamic and acoustic performance, they require an integrated multidisciplinary design process. Furthermore, because of the complexity of the design process, numerical optimization methods are required. The utilization of integrated multidisciplinary design procedures for improving aircraft design is not currently feasible because of software coordination problems and the enormous computational burden. Even with the expected rapid growth of supercomputers and parallel architectures, these tasks will not be practical without the development of efficient methods for cross-disciplinary sensitivities and efficient optimization procedures. The present research is part of an on-going effort which is focused on the processes of simultaneous aerodynamic and structural wing design as a prototype for design integration. A sequence of integrated wing design procedures has been developed in order to investigate various aspects of the design process.
Research on oral test modeling based on multi-feature fusion
NASA Astrophysics Data System (ADS)
Shi, Yuliang; Tao, Yiyue; Lei, Jun
2018-04-01
In this paper, the spectrogram of the speech signal is taken as the input for feature extraction. The advantages of the PCNN in image segmentation and related processing are used to process the speech spectrogram and extract features, exploring a new method that combines speech signal processing and image processing. In addition to the spectrogram features, MFCC spectral features are extracted and fused with them to further improve the accuracy of spoken language recognition. Since the input features are complex and discriminative, a Support Vector Machine (SVM) is used to construct the classifier, and the extracted test voice features are then compared with standard voice features to detect whether the speech meets the standard. Experiments show that extracting features from spectrograms using the PCNN is feasible, and that the fusion of image features and spectral features improves detection accuracy.
Accelerated simulation of stochastic particle removal processes in particle-resolved aerosol models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, J.H.; Michelotti, M.D.; Riemer, N.
2016-10-01
Stochastic particle-resolved methods have proven useful for simulating multi-dimensional systems such as composition-resolved aerosol size distributions. While particle-resolved methods have substantial benefits for highly detailed simulations, these techniques suffer from high computational cost, motivating efforts to improve their algorithmic efficiency. Here we formulate an algorithm for accelerating particle removal processes by aggregating particles of similar size into bins. We present the Binned Algorithm for particle removal processes and analyze its performance with application to the atmospherically relevant process of aerosol dry deposition. We show that the Binned Algorithm can dramatically improve the efficiency of particle removals, particularly for low removal rates, and that computational cost is reduced without introducing additional error. In simulations of aerosol particle removal by dry deposition in atmospherically relevant conditions, we demonstrate an approximately 50-times increase in algorithm efficiency.
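The binning idea can be sketched as a per-bin rejection sampler: pick a size bin weighted by (particle count × bin-maximum rate), then accept a uniformly chosen particle from that bin with probability rate/rate_max, so only bin-level summaries need maintaining. Diameters, rates, and bin counts below are toy values, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical particle population: diameters in micrometres, toy removal rates.
diam = np.sort(rng.lognormal(0.0, 0.5, 8000))
rate = 0.01 * diam ** 2                 # dry-deposition rate grows with size

# Partition particles into 16 equal-count size bins (diam is sorted).
bin_of = np.repeat(np.arange(16), diam.size // 16)
rate_max = np.array([rate[bin_of == b].max() for b in range(16)])
members = [list(np.flatnonzero(bin_of == b)) for b in range(16)]

def remove_one():
    """One removal event via per-bin rejection sampling: choose a bin weighted
    by count * max rate, then accept/reject a uniform pick inside the bin."""
    while True:
        weights = np.array([len(m) for m in members]) * rate_max
        b = rng.choice(16, p=weights / weights.sum())
        i = rng.integers(len(members[b]))
        p = members[b][i]
        if rng.random() < rate[p] / rate_max[b]:   # accept with prob r_p / r_max
            members[b].pop(i)
            return p

removed = [remove_one() for _ in range(100)]
```

Because rates within a bin are similar, the rejection rate stays low, and the sampler touches per-bin aggregates instead of scanning every particle per event.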
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, D; Gach, H; Li, H
Purpose: The daily treatment MRIs acquired on MR-IGRT systems, like diagnostic MRIs, suffer from intensity inhomogeneity associated with B1 and B0 inhomogeneities. An improved homomorphic unsharp mask (HUM) filtering method, automatic and robust body segmentation, and imaging field-of-view (FOV) detection methods were developed to compute the multiplicative slow-varying correction field and correct the intensity inhomogeneity. The goal is to improve and normalize the voxel intensity so that the images can be processed more accurately by quantitative methods (e.g., segmentation and registration) that require consistent image voxel intensity values. Methods: HUM methods have been widely used for years. A body mask is required, otherwise the body surface in the corrected image would be incorrectly bright due to the sudden intensity transition at the body surface. In this study, we developed an improved HUM-based correction method that includes three main components: 1) Robust body segmentation on the normalized image gradient map, 2) Robust FOV detection (needed for body segmentation) using region growing and morphologic filters, and 3) An effective implementation of HUM using repeated Gaussian convolution. Results: The proposed method was successfully tested on patient images of common anatomical sites (H/N, lung, abdomen and pelvis). Initial qualitative comparisons showed that this improved HUM method outperformed three recently published algorithms (FCM, LEMS, MICO) in both computation speed (by 50+ times) and robustness (in intermediate to severe inhomogeneity situations). Currently implemented in MATLAB, it takes 20 to 25 seconds to process a 3D MRI volume. Conclusion: Compared to more sophisticated MRI inhomogeneity correction algorithms, the improved HUM method is simple and effective.
The inhomogeneity correction, body mask, and FOV detection methods developed in this study would be useful as preprocessing tools for many MRI-related research and clinical applications in radiotherapy. Authors have received research grants from ViewRay and Varian.
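A minimal sketch of HUM-style correction, assuming the bias field is recoverable as a heavily smoothed version of the image (repeated box filtering standing in for the repeated Gaussian convolution the abstract mentions). The phantom, mask, and filter sizes are invented for the demo:

```python
import numpy as np

def box_blur(img, k=15, passes=3):
    """Repeated box filtering approximates a wide Gaussian (separable, fast)."""
    out = img.astype(float)
    kernel = np.ones(k) / k
    for _ in range(passes):
        out = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 0, out)
        out = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, out)
    return out

def hum_correct(img, body_mask):
    """Homomorphic unsharp mask: divide by the smoothed image (the estimated
    slow-varying bias field), restricted to the body mask to avoid the bright
    body-surface artifact the abstract describes."""
    filled = np.where(body_mask, img, img[body_mask].mean())
    smoothed = box_blur(filled)
    corrected = np.where(body_mask, img / np.maximum(smoothed, 1e-6), img)
    return corrected * img[body_mask].mean()    # restore the mean intensity level

# Synthetic demo: flat phantom times a linear bias field.
yy, xx = np.mgrid[0:64, 0:64]
bias = 0.5 + xx / 63.0                          # intensity doubles left to right
img = 100.0 * bias
mask = np.ones_like(img, dtype=bool)
fixed = hum_correct(img, mask)
```

Away from the image borders the smoothed image tracks the slow bias exactly, so the corrected interior flattens back to the phantom's true uniform intensity.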
[Detection of lung nodules. New opportunities in chest radiography].
Pötter-Lang, S; Schalekamp, S; Schaefer-Prokop, C; Uffmann, M
2014-05-01
Chest radiography still represents the most commonly performed X-ray examination because it is readily available, requires low radiation doses and is relatively inexpensive. However, as previously published, many initially undetected lung nodules are retrospectively visible in chest radiographs. The great improvements in detector technology with the increasing dose efficiency and improved contrast resolution provide a better image quality and reduced dose needs. The dual energy acquisition technique and advanced image processing methods (e.g. digital bone subtraction and temporal subtraction) reduce the anatomical background noise by reduction of overlapping structures in chest radiography. Computer-aided detection (CAD) schemes increase the awareness of radiologists for suspicious areas. The advanced image processing methods show clear improvements for the detection of pulmonary lung nodules in chest radiography and strengthen the role of this method in comparison to 3D acquisition techniques, such as computed tomography (CT). Many of these methods will probably be integrated into standard clinical treatment in the near future. Digital software solutions offer advantages as they can be easily incorporated into radiology departments and are often more affordable as compared to hardware solutions.
Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad
2016-01-01
Introduction: In any complex human system, human error is inevitable and cannot be eliminated by blaming wrongdoers. With the aim of improving the reliability of Intensive Care Units (ICUs) in hospitals, this research identifies and analyzes ICU process failure modes from the standpoint of a systematic approach to error. Methods: In this descriptive research, data was gathered qualitatively by observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis was quantitative, based on the failures' Risk Priority Numbers (RPNs) according to the Failure Modes and Effects Analysis (FMEA) method. In addition, some causes of failures were analyzed using the qualitative Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities in hospital A and 184 potential failures from 99 ICU activities in hospital B were identified and evaluated. With 90% reliability (RPN≥100), 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were then analyzed by ECM. Conclusions: Applying the modified PFMEA to improve the process reliability of two selected ICUs in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize and analyze all potential failure modes, and also makes them eager to identify causes, recommend corrective actions and even participate in improving processes without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can easily identify failure causes from a health care perspective. PMID:27157162
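The RPN screening step can be sketched directly: multiply the severity, occurrence, and detection scores and keep failures at or above the study's threshold of 100. The failure modes and scores below are hypothetical ICU examples, not the study's data:

```python
# FMEA scoring sketch: RPN = severity x occurrence x detection (each scored 1-10).
# Failure modes and scores are hypothetical illustrations, not the paper's data.
failure_modes = [
    ("ventilator alarm silenced without review", 9, 4, 5),
    ("medication dose transcription error",      7, 5, 3),
    ("delayed lab result communication",         5, 3, 4),
]

scored = [(name, s * o * d) for name, s, o, d in failure_modes]

# Non-acceptable risks at the paper's threshold (RPN >= 100), highest first,
# which is the prioritized list handed to cause analysis (e.g., ECM).
priority = sorted((x for x in scored if x[1] >= 100), key=lambda x: -x[1])
```

Only the first two modes clear the threshold, so cause analysis effort is concentrated where the combined risk is highest.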
Vavadi, Hamed; Zhu, Quing
2016-01-01
Imaging-guided near infrared diffuse optical tomography (DOT) has demonstrated a great potential as an adjunct modality for differentiation of malignant and benign breast lesions and for monitoring treatment response of breast cancers. However, diffused light measurements are sensitive to artifacts caused by outliers and errors in measurements due to probe-tissue coupling, patient and probe motions, and tissue heterogeneity. In general, pre-processing of the measurements is needed by experienced users to manually remove these outliers and therefore reduce imaging artifacts. An automated method of outlier removal, data selection, and filtering for diffuse optical tomography is introduced in this manuscript. This method consists of multiple steps to first combine several data sets collected from the same patient at contralateral normal breast and form a single robust reference data set using statistical tests and linear fitting of the measurements. The second step improves the perturbation measurements by filtering out outliers from the lesion site measurements using model based analysis. The results of 20 malignant and benign cases show similar performance between manual data processing and automated processing and improvement in tissue characterization of malignant to benign ratio by about 27%. PMID:27867711
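A simplified version of the reference-forming step above can be sketched with median/MAD outlier rejection across repeated measurement sets. The data, the per-channel layout, and the 3.5 modified z-score cut-off are illustrative stand-ins for the statistical tests described in the paper.

```python
import statistics

# Illustrative repeated reference measurement sets (e.g. log amplitude per
# source-detector channel) from the contralateral normal breast; one channel
# in one set is an outlier caused by poor probe-tissue coupling.
sets = [
    [1.02, 0.98, 1.01, 0.99],
    [1.00, 1.03, 0.97, 1.02],
    [1.01, 0.99, 3.40, 1.00],   # third channel badly coupled
]

def robust_reference(sets, z_cut=3.5):
    """Combine repeated sets channel-by-channel, rejecting outliers with a
    modified z-score (median / MAD) before averaging."""
    ref = []
    for channel in zip(*sets):
        med = statistics.median(channel)
        mad = statistics.median(abs(v - med) for v in channel) or 1e-12
        kept = [v for v in channel if 0.6745 * abs(v - med) / mad <= z_cut]
        ref.append(sum(kept) / len(kept))
    return ref

print([round(v, 3) for v in robust_reference(sets)])
```

The outlying channel value is dropped before averaging, so the reference is not dragged toward the coupling artifact.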
Balasubramanian, Bijal A; Cohen, Deborah J; Davis, Melinda M; Gunn, Rose; Dickinson, L Miriam; Miller, William L; Crabtree, Benjamin F; Stange, Kurt C
2015-03-10
In healthcare change interventions, on-the-ground learning about the implementation process is often lost because of a primary focus on outcome improvements. This paper describes the Learning Evaluation, a methodological approach that blends quality improvement and implementation research methods to study healthcare innovations. Learning Evaluation is an approach to multi-organization assessment. Qualitative and quantitative data are collected to conduct real-time assessment of implementation processes while also assessing changes in context, facilitating quality improvement using run charts and audit and feedback, and generating transportable lessons. Five principles are the foundation of this approach: (1) gather data to describe changes made by healthcare organizations and how changes are implemented; (2) collect process and outcome data relevant to healthcare organizations and to the research team; (3) assess multi-level contextual factors that affect implementation, process, outcome, and transportability; (4) assist healthcare organizations in using data for continuous quality improvement; and (5) operationalize common measurement strategies to generate transportable results. Learning Evaluation principles are applied across organizations by the following: (1) establishing a detailed understanding of the baseline implementation plan; (2) identifying target populations and tracking relevant process measures; (3) collecting and analyzing real-time quantitative and qualitative data on important contextual factors; (4) synthesizing data and emerging findings and sharing with stakeholders on an ongoing basis; and (5) harmonizing and fostering learning from process and outcome data. Application to a multi-site program focused on primary care and behavioral health integration shows the feasibility and utility of Learning Evaluation for generating real-time insights into evolving implementation processes. 
Learning Evaluation generates systematic and rigorous cross-organizational findings about implementing healthcare innovations while also enhancing organizational capacity and accelerating translation of findings by facilitating continuous learning within individual sites. Researchers evaluating change initiatives and healthcare organizations implementing improvement initiatives may benefit from a Learning Evaluation approach.
NASA Astrophysics Data System (ADS)
Azmi, Nur Iffah Mohamed; Arifin Mat Piah, Kamal; Yusoff, Wan Azhar Wan; Romlay, Fadhlur Rahman Mohd
2018-03-01
Controllers that use PID parameters require a good tuning method to improve control system performance. PID tuning methods fall into two groups: classical methods and artificial intelligence methods. The particle swarm optimization (PSO) algorithm is one of the artificial intelligence methods, and researchers have previously integrated PSO algorithms into the PID parameter tuning process. This research aims to improve PSO-PID tuning algorithms by integrating the tuning process with the Variable Weight Grey-Taguchi Design of Experiment (DOE) method. This is done by conducting the DOE on two PSO optimization parameters: the particle velocity limit and the weight distribution factor. Computer simulations and physical experiments were conducted using the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE and the classical Ziegler-Nichols method, both implemented on a hydraulic positioning system. Simulation results show that the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE reduced the rise time by 48.13% and the settling time by 48.57% compared with the Ziegler-Nichols method. Furthermore, the physical experiment results also show that the proposed tuning method responds better than Ziegler-Nichols tuning. In conclusion, this research has improved PID parameter tuning by applying the PSO-PID algorithm together with the Variable Weight Grey-Taguchi DOE method in a hydraulic positioning system.
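A minimal PSO-PID tuning loop can be sketched as below. The plant (a first-order lag, not the paper's hydraulic system), the ISE cost, and all swarm settings are illustrative assumptions; note that the inertia weight and velocity limit held fixed here are exactly the two parameters the paper tunes with the Variable Weight Grey-Taguchi DOE.

```python
import random

def step_response_cost(kp, ki, kd, dt=0.01, steps=500):
    """Integral squared error of the unit-step response of a first-order
    plant dy/dt = (-y + u)/tau under PID control (Euler integration)."""
    tau, y, integ, prev_err, cost = 0.5, 0.0, 0.0, 1.0, 0.0
    for _ in range(steps):
        err = 1.0 - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        y += dt * (-y + u) / tau
        prev_err = err
        cost += err * err * dt
        if y != y or abs(y) > 1e6:      # diverged: penalise heavily
            return 1e9
    return cost

def pso_tune(n_particles=20, iters=40, seed=1):
    rng = random.Random(seed)
    bounds = [(0.0, 10.0), (0.0, 10.0), (0.0, 0.5)]   # Kp, Ki, Kd ranges
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * 3 for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [step_response_cost(*p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    # Fixed inertia weight and velocity limit; the paper instead selects
    # these two PSO parameters via the Grey-Taguchi DOE.
    w, c1, c2, vmax = 0.7, 1.5, 1.5, 2.0
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(3):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-vmax, min(vmax, vel[i][d]))   # velocity limit
                pos[i][d] = max(bounds[d][0],
                                min(bounds[d][1], pos[i][d] + vel[i][d]))
            c = step_response_cost(*pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

gains, cost = pso_tune()
print("tuned [Kp, Ki, Kd]:", [round(g, 2) for g in gains])
```

The same skeleton applies to any plant model: only `step_response_cost` changes.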
Fuangrod, Todsaporn; Greer, Peter B; Simpson, John; Zwan, Benjamin J; Middleton, Richard H
2017-03-13
Purpose Due to increasing complexity, modern radiotherapy techniques require comprehensive quality assurance (QA) programmes that, to date, generally focus on the pre-treatment stage. The purpose of this paper is to provide a method for individual patient treatment QA evaluation and for identification of a "quality gap" for continuous quality improvement. Design/methodology/approach Statistical process control (SPC) was applied to evaluate treatment delivery using in vivo electronic portal imaging device (EPID) dosimetry. A moving range control chart was constructed to monitor individual patient treatment performance, based on a control limit generated from initial data of 90 intensity-modulated radiotherapy (IMRT) and ten volumetric-modulated arc therapy (VMAT) patient deliveries. A process capability index was used to evaluate continuing treatment quality in three quality classes: treatment type-specific, treatment linac-specific, and body site-specific. Findings The determined control limits were 62.5 and 70.0 per cent of the χ pass-rate for IMRT and VMAT deliveries, respectively. In total, 14 patients were selected for a pilot study, the results of which showed that about 1 per cent of all treatments contained errors relating to unexpected anatomical changes between treatment fractions. Both rectum and pelvis cancer treatments showed process capability indices less than 1, indicating potential for quality improvement, and hence may benefit from further assessment. Research limitations/implications The study relied on the application of in vivo EPID dosimetry for patients treated at the specific centre, and the patient sample used to generate the control limits was limited to 100 patients. Whilst the quantitative results are specific to the clinical techniques and equipment used, the described method is generally applicable to IMRT and VMAT treatment QA. 
Whilst more work is required to determine the level of clinical significance, the authors have demonstrated the capability of the method for both treatment specific QA and continuing quality improvement. Practical implications The proposed method is a valuable tool for assessing the accuracy of treatment delivery whilst also improving treatment quality and patient safety. Originality/value Assessing in vivo EPID dosimetry with SPC can be used to improve the quality of radiation treatment for cancer patients.
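The moving range chart above can be sketched with standard individuals/moving-range (I-MR) limits. The pass-rate values below are invented for illustration; only the chart construction is standard SPC practice.

```python
# Hypothetical chi pass-rates (%) for successive treatment fractions of one
# patient; the numbers are illustrative, not data from the paper.
pass_rates = [91.2, 93.5, 90.8, 92.1, 94.0, 89.7, 92.6, 61.3, 93.1, 92.4]

def individuals_limits(x):
    """Individuals / moving-range (I-MR) chart limits.
    2.66 = 3/d2 with d2 = 1.128 for a moving range of span 2."""
    mr = [abs(b - a) for a, b in zip(x, x[1:])]
    mr_bar = sum(mr) / len(mr)
    centre = sum(x) / len(x)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

lcl, centre, ucl = individuals_limits(pass_rates)
# A pass-rate below the lower limit flags a fraction for review, e.g. for
# unexpected anatomical change between fractions.
out_of_control = [r for r in pass_rates if r < lcl]
print(f"LCL = {lcl:.1f}%, flagged fractions: {out_of_control}")
```

In practice the limits would be established from in-control baseline fractions rather than from data containing the suspect fraction itself.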
Method for enhanced atomization of liquids
Thompson, Richard E.; White, Jerome R.
1993-01-01
In a process for atomizing a slurry or liquid process stream in which a slurry or liquid is passed through a nozzle to provide a primary atomized process stream, an improvement which comprises subjecting the liquid or slurry process stream to microwave energy as the liquid or slurry process stream exits the nozzle, wherein sufficient microwave heating is provided to flash vaporize the primary atomized process stream.
Software life cycle methodologies and environments
NASA Technical Reports Server (NTRS)
Fridge, Ernest
1991-01-01
Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer Aided Software Engineering (CASE) technology in two forms. Environments include the Engineering Script Language/Parts Composition System (ESL/PCS) application generator; an intelligent user interface for cost avoidance in setting up operational computer runs; a framework programmable platform for defining process and software development workflow control; a process for bringing CASE technology into an organization's culture; and the CLIPS/CLIPS Ada language for developing expert systems. Methodologies include a method for developing fault-tolerant, distributed systems and a method for developing systems for common-sense reasoning and for solving expert system problems when only approximate truths are known.
Raab, Stephen S; Andrew-Jaja, Carey; Condel, Jennifer L; Dabbs, David J
2006-01-01
The objective of the study was to determine whether the Toyota production system process improves Papanicolaou test quality and patient safety. An 8-month nonconcurrent cohort study that included 464 case and 639 control women who had a Papanicolaou test was performed. Office workflow was redesigned using Toyota production system methods by introducing a 1-by-1 continuous flow process. We measured the frequency of Papanicolaou tests without a transformation zone component, follow-up and Bethesda System diagnostic frequency of atypical squamous cells of undetermined significance, and diagnostic error frequency. After the intervention, the percentage of Papanicolaou tests lacking a transformation zone component decreased from 9.9% to 4.7% (P = .001). The percentage of Papanicolaou tests with a diagnosis of atypical squamous cells of undetermined significance decreased from 7.8% to 3.9% (P = .007). The frequency of error per correlating cytologic-histologic specimen pair decreased from 9.52% to 7.84%. The introduction of the Toyota production system process resulted in improved Papanicolaou test quality.
Suppressing molecular vibrations in organic semiconductors by inducing strain
Kubo, Takayoshi; Häusermann, Roger; Tsurumi, Junto; Soeda, Junshi; Okada, Yugo; Yamashita, Yu; Akamatsu, Norihisa; Shishido, Atsushi; Mitsui, Chikahiko; Okamoto, Toshihiro; Yanagisawa, Susumu; Matsui, Hiroyuki; Takeya, Jun
2016-01-01
Organic molecular semiconductors are solution processable, enabling the growth of large-area single-crystal semiconductors. Improving the performance of organic semiconductor devices by increasing the charge mobility is an ongoing quest, which calls for novel molecular and material design, and improved processing conditions. Here we show a method to increase the charge mobility in organic single-crystal field-effect transistors, by taking advantage of the inherent softness of organic semiconductors. We compress the crystal lattice uniaxially by bending the flexible devices, leading to an improved charge transport. The mobility increases from 9.7 to 16.5 cm2 V−1 s−1 by 70% under 3% strain. In-depth analysis indicates that compressing the crystal structure directly restricts the vibration of the molecules, thus suppresses dynamic disorder, a unique mechanism in organic semiconductors. Since strain can be easily induced during the fabrication process, we expect our method to be exploited to build high-performance organic devices. PMID:27040501
The effect of process parameters in Aluminum Metal Matrix Composites with Powder Metallurgy
NASA Astrophysics Data System (ADS)
Vani, Vemula Vijaya; Chak, Sanjay Kumar
2018-06-01
Metal matrix composites have been developed in recent years as an alternative to conventional engineering materials owing to their improved properties. Among them, aluminium matrix composites (AMCs) are in increasing demand because of their low density, high strength-to-weight ratio, high toughness, corrosion resistance, higher stiffness, improved wear resistance, increased creep resistance, low coefficient of thermal expansion, and improved high-temperature properties. Major applications of these materials are in the aerospace, automobile, and military sectors. Several processing techniques exist for the fabrication of AMCs; powder metallurgy is one of the most promising and versatile routes for fabricating particle-reinforced AMCs compared with other manufacturing methods. It ensures good wettability between matrix and reinforcement and a homogeneous microstructure in the fabricated MMC, and prevents the formation of undesirable phases. This article addresses mainly the effect of process parameters such as sintering time, temperature, and particle size on the microstructure of aluminium metal matrix composites.
Method for rapidly producing microporous and mesoporous materials
Coronado, Paul R.; Poco, John F.; Hrubesh, Lawrence W.; Hopper, Robert W.
1997-01-01
An improved, rapid process is provided for making microporous and mesoporous materials, including aerogels and pre-ceramics. A gel or gel precursor is confined in a sealed vessel to prevent structural expansion of the gel during the heating process. This confinement allows the gelation and drying processes to be greatly accelerated, and significantly reduces the time required to produce a dried aerogel compared to conventional methods. Drying may be performed either by subcritical drying with a pressurized fluid to expel the liquid from the gel pores or by supercritical drying. The rates of heating and decompression are significantly higher than for conventional methods.
Partnering through Training and Practice to Achieve Performance Improvement
ERIC Educational Resources Information Center
Lyons, Paul R.
2010-01-01
This article presents a partnership effort among managers, trainers, and employees to spring to life performance improvement using the performance templates (P-T) approach. P-T represents a process model as well as a method of training leading to performance improvement. Not only does it add to our repertoire of training and performance management…
Advances in medical image computing.
Tolxdorff, T; Deserno, T M; Handels, H; Meinzer, H-P
2009-01-01
Medical image computing has become a key technology in high-tech applications in medicine and a ubiquitous part of modern imaging systems and the related processes of clinical diagnosis and intervention. Over the past years significant progress has been made in the field, at both the methodological and the application level. Despite this progress, major challenges must still be met in order to establish image processing routinely in health care. In this issue, selected contributions of the German Conference on Medical Image Processing (BVM) are assembled to present the latest advances in the field of medical image computing. The winners of the scientific awards of the German Conference on Medical Image Processing (BVM) 2008 were invited to submit a manuscript on their latest developments and results for possible publication in Methods of Information in Medicine. Finally, seven excellent papers were selected to describe important aspects of recent advances in the field of medical image processing. The selected papers give an impression of the breadth and heterogeneity of new developments. New methods for improved image segmentation, non-linear image registration and modeling of organs are presented together with applications of image analysis methods in different medical disciplines. Furthermore, state-of-the-art tools and techniques to support the development and evaluation of medical image processing systems in practice are described. The selected articles describe different aspects of the intense development in medical image computing. The image processing methods presented enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.
Improving surgeon utilization in an orthopedic department using simulation modeling
Simwita, Yusta W; Helgheim, Berit I
2016-01-01
Purpose Worldwide, more than two billion people lack appropriate access to surgical services due to a mismatch between existing human resources and patient demand. Improving utilization of the existing workforce capacity can reduce the gap between surgical demand and available capacity. In this paper, the authors use discrete event simulation to explore the care process at an orthopedic department, with a focus on improving surgeon utilization while minimizing patient waiting time. Methods The authors collaborated with orthopedic department personnel to map the current operations of the orthopedic care process in order to identify factors behind poor surgeon utilization and high patient waiting time. An observational approach was used to collect data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. The authors developed a proposed scenario to show how surgeon utilization could be improved. Results The simulation results showed that if ancillary services could be performed before the start of clinic examination services, the orthopedic care process could be greatly improved, with higher surgeon utilization and reduced patient waiting time. The simulation results demonstrate that with improved surgeon utilization, up to a 55% increase in future demand can be accommodated without patients exceeding the current waiting time at this clinic, thus improving patient access to health care services. Conclusion This study shows how simulation modeling can be used to improve health care processes. It was limited to a single care process; however, the findings can be applied to improve other orthopedic care processes with similar operational characteristics. PMID:29355193
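The effect of moving ancillary services ahead of the clinic session can be illustrated with a deliberately tiny queueing model (not the authors' simulation: one surgeon, one ancillary station, deterministic service times, and all parameter values invented):

```python
def simulate(n_patients, anc_time, exam_time, ancillary_first):
    """Toy single-surgeon clinic with one ancillary (e.g. X-ray) station.
    All patients arrive at t = 0. If ancillary_first, ancillary work is
    completed before the clinic session, so every patient is ready at t = 0;
    otherwise each patient joins the surgeon's queue only after the single
    ancillary station has finished with them."""
    ready = ([0.0] * n_patients if ancillary_first
             else [(i + 1) * anc_time for i in range(n_patients)])
    t, finish = 0.0, []
    for r in ready:                       # FIFO service by the surgeon
        t = max(t, r) + exam_time
        finish.append(t)
    utilisation = n_patients * exam_time / t          # busy time / makespan
    mean_time_in_clinic = sum(finish) / n_patients    # arrival is t = 0
    return utilisation, mean_time_in_clinic

for label, flag in (("baseline", False), ("ancillary first", True)):
    util, tic = simulate(n_patients=12, anc_time=15, exam_time=10,
                         ancillary_first=flag)
    print(f"{label:15s} surgeon utilisation {util:5.1%}, "
          f"mean time in clinic {tic:6.1f} min")
```

Even this toy model reproduces the qualitative finding: completing ancillary work first removes surgeon idle time and shortens the mean time patients spend in the clinic.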
Elevation data fitting and precision analysis of Google Earth in road survey
NASA Astrophysics Data System (ADS)
Wei, Haibin; Luan, Xiaohan; Li, Hanchao; Jia, Jiangkun; Chen, Zhao; Han, Leilei
2018-05-01
Objective: In order to improve the efficiency of road surveys and save manpower and material resources, this paper applies Google Earth to the feasibility study stage of road survey and design. Because Google Earth elevation data lack precision, the paper focuses on finding fitting or interpolation methods that improve the data precision enough to meet the accuracy requirements of road survey and design specifications. Method: On the basis of the elevation differences at a limited number of common control points, the elevation difference at any other point can be fitted or interpolated; the precise elevation is then obtained by subtracting the elevation difference from the Google Earth data. Quadratic polynomial surface fitting, cubic polynomial surface fitting, V4 interpolation in MATLAB, and a neural network method are used to process the Google Earth elevation data, with internal conformity, external conformity, and the cross-correlation coefficient as evaluation indexes. Results: The V4 interpolation method has zero residual at the fitting points, but its external conformity is the largest and its accuracy improvement the worst, so it is ruled out. The internal and external conformity of the cubic polynomial surface fitting method are both better than those of the quadratic polynomial surface fitting method. The neural network method has a fitting effect similar to the cubic polynomial surface fitting method but performs better where elevation differences are larger. Because the neural network is a less manageable fitting model, the cubic polynomial surface fitting method should be the main method, with the neural network as an auxiliary method where elevation differences are large. 
Conclusions: The cubic polynomial surface fitting method can markedly improve the precision of Google Earth elevation data. After the precision improvement, the error of the data in hilly terrain meets the requirements of the specifications, so the data can be used in the feasibility study stage of road survey and design.
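The cubic-surface fit favoured above can be sketched as an ordinary least-squares fit of a bivariate polynomial to control-point elevation differences. The coordinates, the synthetic "true" difference surface, and the noise level are all invented for illustration.

```python
import numpy as np

def poly_design(x, y, degree):
    """Design matrix with all monomials x**i * y**j, i + j <= degree."""
    cols = [x**i * y**j for i in range(degree + 1)
                        for j in range(degree + 1 - i)]
    return np.column_stack(cols)

# Hypothetical control points: planar coordinates (km) and the elevation
# difference (m) between surveyed ground truth and Google Earth data.
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 5, 40), rng.uniform(0, 5, 40)
true_diff = 1.5 + 0.8 * x - 0.3 * y + 0.05 * x * y      # synthetic surface
z_diff = true_diff + rng.normal(0, 0.05, 40)            # survey noise

# Fit the cubic difference surface; corrected elevation at any point is
# then the Google Earth elevation minus the fitted difference.
coef, *_ = np.linalg.lstsq(poly_design(x, y, 3), z_diff, rcond=None)

fitted = poly_design(x, y, 3) @ coef
rmse = np.sqrt(np.mean((fitted - z_diff) ** 2))         # internal conformity
print(f"internal conformity (RMSE): {rmse:.3f} m")
```

External conformity would be computed the same way, but on held-out check points not used in the fit.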
A 3D inversion for all-space magnetotelluric data with static shift correction
NASA Astrophysics Data System (ADS)
Zhang, Kun
2017-04-01
Based on previous studies of static shift correction and 3D inversion algorithms, we improve the NLCG 3D inversion method and propose a new static shift correction method that works within the inversion. The static shift correction method is based on 3D theory and real data. Static shift can be detected by quantitative analysis of the apparent MT parameters (apparent resistivity and impedance phase) in the high-frequency range, and the correction is completed during inversion. The method is an automatic, computer-based processing technique that comes at no additional cost and avoids extra field work and indoor processing, with good results. The 3D inversion algorithm is improved (Zhang et al., 2013) based on the NLCG method of Newman & Alumbaugh (2000) and Rodi & Mackie (2001). We added a parallel structure, improved the computational efficiency, reduced the memory requirements, and added topographic and marine factors, so the 3D inversion can run on an ordinary PC with high efficiency and accuracy. All MT data from surface, seabed, and underground stations can be used in the inversion algorithm.
Earthquake Forecasting Through Semi-periodicity Analysis of Labeled Point Processes
NASA Astrophysics Data System (ADS)
Quinteros Cartaya, C. B. M.; Nava Pichardo, F. A.; Glowacka, E.; Gomez-Trevino, E.
2015-12-01
Large earthquakes show semi-periodic behavior as a result of critically self-organized processes of stress accumulation and release in a seismogenic region. Thus, large earthquakes in a region constitute semi-periodic sequences with recurrence times varying slightly from periodicity. Nava et al. (2013) and Quinteros et al. (2013) observed that not all earthquakes in a given region need to belong to the same sequence, since there can be more than one process of stress accumulation and release in it; they also proposed a method to identify semi-periodic sequences through analytic Fourier analysis. This work presents improvements on the above-mentioned method: the influence of earthquake size on the spectral analysis and its importance in identifying semi-periodic events, which means that earthquake occurrence times are treated as a labeled point process; the estimation of appropriate upper-limit uncertainties for use in forecasts; and the use of Bayesian analysis to evaluate forecast performance. The improved method is applied to specific regions: the southwestern coast of Mexico, the northeastern Japan Arc, the San Andreas Fault zone at Parkfield, and northeastern Venezuela.
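The idea of treating occurrence times as a labeled (magnitude-weighted) point process can be sketched with a simple Fourier scan over trial periods. The event catalogue below is synthetic, with a built-in recurrence interval of roughly 35 years, and this is only a schematic of the spectral step, not the authors' full method.

```python
import cmath

# Hypothetical large-earthquake sequence: (occurrence year, magnitude),
# with recurrence intervals varying slightly around ~35 years.
events = [(1900.0, 7.4), (1936.2, 7.6), (1969.8, 7.3), (2006.1, 7.5)]

def periodicity_spectrum(events, periods):
    """Magnitude-weighted Fourier amplitude versus trial period. Occurrence
    times (referred to the first event) are treated as a labeled point
    process; normalised amplitudes near 1 mark candidate semi-periodic
    sequences."""
    t0 = min(t for t, _ in events)
    total = sum(m for _, m in events)
    out = []
    for T in periods:
        s = sum(m * cmath.exp(2j * cmath.pi * (t - t0) / T) for t, m in events)
        out.append((T, abs(s) / total))
    return out

spectrum = periodicity_spectrum(events, [n / 10 for n in range(200, 501)])
best_T, best_amp = max(spectrum, key=lambda p: p[1])
print(f"dominant trial period: {best_T:.1f} yr, amplitude {best_amp:.3f}")
```

Events that do not phase-align at the dominant period would lower the amplitude, which is how non-members of a sequence reveal themselves.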
Improved NSGA model for multi objective operation scheduling and its evaluation
NASA Astrophysics Data System (ADS)
Li, Weining; Wang, Fuyu
2017-09-01
Reasonable operation scheduling can increase hospital income and improve patient satisfaction. In this paper, a multi-objective operation scheduling method using an improved NSGA algorithm shortens operation time, reduces operation cost, and lowers operation risk. A multi-objective optimization model is established for flexible operation scheduling, the Pareto solution set is obtained through MATLAB simulation, and the data are standardized. The optimal scheduling scheme is then selected using a combined entropy weight-TOPSIS method. The results show that the algorithm is feasible for solving the multi-objective operation scheduling problem and provides a reference for hospital operation scheduling.
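The final selection step, combining entropy weights with TOPSIS, can be sketched as follows. The Pareto solutions and the three criteria (all treated as "smaller is better") are invented for illustration.

```python
import numpy as np

# Hypothetical Pareto solutions from the NSGA run: columns are total
# operation time (min), cost, and a risk score; all are cost-type criteria.
X = np.array([[480., 10.2, 0.30],
              [455., 11.8, 0.35],
              [470., 10.9, 0.25],
              [500.,  9.5, 0.40]])

# Entropy weights: criteria whose values vary more get more weight.
P = X / X.sum(axis=0)                              # column-normalised shares
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))  # entropy per criterion
w = (1 - E) / (1 - E).sum()

# TOPSIS: closeness to the ideal (best) and anti-ideal (worst) solutions.
R = X / np.sqrt((X**2).sum(axis=0))                # vector normalisation
V = R * w
ideal, anti = V.min(axis=0), V.max(axis=0)         # min is best (cost-type)
d_pos = np.sqrt(((V - ideal)**2).sum(axis=1))
d_neg = np.sqrt(((V - anti)**2).sum(axis=1))
closeness = d_neg / (d_pos + d_neg)
print("ranking (best first):", np.argsort(-closeness))
```

Benefit-type criteria would simply swap the roles of the column minimum and maximum when forming the ideal and anti-ideal solutions.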
The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis
NASA Astrophysics Data System (ADS)
Yu, Huiling; Fan, Delin; Zhang, Yizhuo
This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) with cascade control based on a Smith predictor. A case study of a particleboard glue supplying and dosing system defines the problem and the solution using the proposed methodology. Status differences in the glue dosing process of particleboard production usually cause inaccurate glue volumes. To solve this problem, we applied TRIZ technical contradictions and inventive principles to improve this key process of particleboard production. The improvement method mapped the inaccuracy problem onto a TRIZ technical contradiction, and the "prior action" principle suggested a Smith predictor as the control algorithm for the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve users' ability to address difficult or recurring problems, and also testifies to the practicality and validity of TRIZ. Several suggestions are presented on how to approach this class of problem.
Global Sensitivity Analysis for Process Identification under Model Uncertainty
NASA Astrophysics Data System (ADS)
Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.
2015-12-01
The environmental system consists of various physical, chemical, and biological processes, and environmental models are always built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize the processes. While global sensitivity analysis has been widely used to identify important processes, the process identification is always based on deterministic process conceptualization that uses a single model for representing a process. However, environmental systems are complex, and it happens often that a single process may be simulated by multiple alternative models. Ignoring the model uncertainty in process identification may lead to biased identification in that identified important processes may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concept of Sobol sensitivity analysis and model averaging. Similar to the Sobol sensitivity analysis to identify important parameters, our new method evaluates variance change when a process is fixed at its different conceptualizations. The variance considers both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic study of groundwater modeling that considers recharge process and parameterization process. Each process has two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general, and can be applied to a wide range of environmental problems.
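The variance-based process index described above can be sketched on a toy groundwater example. Everything here is invented for illustration: two processes (recharge and geology), two alternative conceptual models per process, assumed model weights, and a trivial "head" output; the sketch only mirrors the idea of measuring the variance of the conditional mean when a process is fixed at each of its conceptualizations.

```python
import itertools, random

# Toy system: two processes, each with two alternative conceptual models
# (names, distributions and weights are illustrative assumptions).
recharge_models = {"R1": lambda rng: rng.gauss(200, 10),
                   "R2": lambda rng: rng.gauss(220, 30)}
geology_models  = {"G1": lambda rng: rng.gauss(50, 5),
                   "G2": lambda rng: rng.gauss(40, 20)}
weights = {"R1": 0.6, "R2": 0.4, "G1": 0.5, "G2": 0.5}

def head(r, g):                  # toy output: simulated groundwater head
    return 0.5 * r + g

rng, n = random.Random(42), 20000
samples = {}                     # Monte Carlo output per model combination
for rm, gm in itertools.product(recharge_models, geology_models):
    samples[(rm, gm)] = [head(recharge_models[rm](rng), geology_models[gm](rng))
                         for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

# Model-averaged mean and total variance (parametric + model uncertainty).
combo_w = {k: weights[k[0]] * weights[k[1]] for k in samples}
mu = sum(combo_w[k] * mean(samples[k]) for k in samples)
var_total = sum(combo_w[k] * mean([(x - mu)**2 for x in samples[k]])
                for k in samples)

def process_index(position, models):
    """Sobol-style index: variance of the conditional mean when the process
    at `position` is fixed at each of its conceptualisations."""
    cond = []
    for m in models:
        w_m, mean_m = 0.0, 0.0
        for k in samples:
            if k[position] == m:
                w_m += combo_w[k]
                mean_m += combo_w[k] * mean(samples[k])
        cond.append((weights[m], mean_m / w_m))
    return sum(w * (c - mu)**2 for w, c in cond) / var_total

print("recharge index:", round(process_index(0, recharge_models), 3))
print("geology  index:", round(process_index(1, geology_models), 3))
```

A larger index means that the choice of conceptual model for that process explains more of the output variance, marking it as a priority for better characterization.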
Error detection and reduction in blood banking.
Motschman, T L; Moore, S B
1996-12-01
Error management plays a major role in facility process improvement efforts. By detecting and reducing errors, quality and, therefore, patient care improve. It begins with a strong organizational foundation: a supportive management attitude, clear and consistent employee direction, and appropriate physical facilities. Clearly defined critical processes, critical activities, and SOPs act as the framework for operations as well as active quality monitoring. To ensure that personnel can detect and report errors, they must be trained in both operational duties and error management practices. Use of simulated/intentional errors and incorporation of error detection into competency assessment keeps employees practiced and confident and diminishes fear of the unknown. Personnel can clearly see that errors are indeed used as opportunities for process improvement and not for punishment. The facility must have a clearly defined and consistently used definition of reportable errors. Reportable errors should include those errors with potentially harmful outcomes as well as those errors that are "upstream," and thus further away from the outcome. A well-written error report covers who, what, when, where, why/how, and follow-up to the error. Before correction can occur, an investigation to determine the underlying cause of the error should be undertaken. Obviously, the best corrective action is prevention. Correction can occur at five different levels; however, only three of these levels are directed at prevention. Prevention requires a method to collect and analyze data concerning errors. In the authors' facility, a functional error classification method and a quality system-based classification have been useful. An active method to search for problems uncovers them further upstream, before they can have disastrous outcomes. 
In the continual quest for improving processes, an error management program is itself a process that needs improvement, and we must strive to always close the circle of quality assurance. Ultimately, the goal of better patient care will be the reward.
Nkundabombi, Marie Grace; Nakimbugwe, Dorothy; Muyonga, John H
2016-05-01
Common beans (Phaseolus vulgaris L.) are nutritious and affordable for vulnerable groups, making them a good choice for biofortification to address malnutrition. However, increasing the micronutrient content of beans without improving micronutrient bioavailability will not improve the micronutrient status of consumers. The effect of different processing methods on the physicochemical characteristics of biofortified bean flour was determined. The processing methods used in this study were malting (48 h), roasting (170°C/45 min), and extrusion cooking using a twin-screw extruder with three heating sections, the first set at 60°C, the second at 130°C, and the last at 150°C; the screw speed was set at 35 Hz (123g) and the bean flour moisture content was 15%. Mineral extractability, in vitro protein digestibility, pasting properties, and sensory acceptability of porridge and sauce from processed flour were determined. All processing methods significantly increased (P < 0.05) mineral extractability (iron from 38.9% to 79.5% for K131 and from 40.7% to 83.4% for ROBA1) and in vitro protein digestibility (from 58.2% to 82% for ROBA1 and from 56.2% to 79% for K131). Pasting viscosities of both bean varieties decreased with processing. There was no significant difference (P < 0.05) between the sensory acceptability of porridge or sauce from extruded biofortified bean flour and from malted/roasted biofortified bean flour; acceptability was also unaffected by the bean variety used. Mineral bioavailability and in vitro protein digestibility increased more for extruded flour than for malted/roasted flours. Sauce and porridge prepared from processed biofortified bean flour had lower viscosity (extruded flour had the lowest), and thus higher nutrient and energy density, than those prepared from unprocessed biofortified bean flour. 
Estimated nutritional contribution of sauce and porridge made from processed ROBA1 flour to daily requirement of children below 5 years and women of reproductive age found to be high. These results show that processing methods enhanced nutritional value of biofortified bean flour and that processed biofortified bean flour can be used to prepare nutrient and energy-dense gruel to improve on nutritional status of children under 5 years and women of reproductive age.
UAV path planning using artificial potential field method updated by optimal control theory
NASA Astrophysics Data System (ADS)
Chen, Yong-bo; Luo, Guan-chen; Mei, Yue-song; Yu, Jian-qiao; Su, Xiao-long
2016-04-01
The unmanned aerial vehicle (UAV) path planning problem is an important task in UAV mission planning. Starting from the artificial potential field (APF) UAV path planning method, the problem is reconstructed as a constrained optimisation problem by introducing an additional control force. The constrained optimisation problem is then translated into an unconstrained optimisation problem with the help of slack variables. The functional optimisation method is applied to reformulate this problem as an optimal control problem. The whole transformation process is deduced in detail, based on a discrete UAV dynamic model. The path planning problem is then solved with the help of the optimal control method. A path-following process based on a six-degrees-of-freedom simulation model of a quadrotor helicopter is introduced to verify the practicability of this method. Finally, the simulation results show that the improved method is more effective for path planning: in the planning space, the calculated path is shorter and smoother than that obtained with the traditional APF method. In addition, the improved method can solve the dead point problem effectively.
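As a hedged illustration of the baseline the paper starts from, a minimal artificial potential field planner can be sketched as follows; the gains, influence radius, and obstacle layout are illustrative assumptions, not values from the paper:

```python
import numpy as np

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=5.0, step=0.1):
    """One step of gradient descent on a classic APF: an attractive force
    pulls toward the goal; repulsive forces push away from obstacles that
    lie within the influence radius d0."""
    force = k_att * (goal - pos)                          # attractive term
    for obs in obstacles:
        d = np.linalg.norm(pos - obs)
        if 0.0 < d < d0:                                  # repulsive term
            force += k_rep * (1.0/d - 1.0/d0) / d**2 * (pos - obs) / d
    # take a fixed-length step along the resulting force direction
    return pos + step * force / (np.linalg.norm(force) + 1e-9)

pos, goal = np.array([0.0, 0.0]), np.array([10.0, 10.0])
obstacles = [np.array([5.0, 6.5])]
path = [pos]
for _ in range(400):
    pos = apf_step(pos, goal, obstacles)
    path.append(pos)
    if np.linalg.norm(pos - goal) < 0.2:
        break
```

The well-known weaknesses of this baseline, local minima ("dead points") and jagged paths, are exactly what the paper's optimal-control reformulation targets.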
Review of the methods to form hydrogen peroxide in electrical discharge plasma with liquid water
NASA Astrophysics Data System (ADS)
Locke, Bruce R.; Shih, Kai-Yuan
2011-06-01
This paper presents a review of the literature dealing with the formation of hydrogen peroxide from plasma processes. Energy yields for hydrogen peroxide generation by plasma from water span approximately three orders of magnitude, from 4 × 10-2 to 80 g kWh-1. A wide range of plasma processes from rf to pulsed, ac, and dc discharges directly in the liquid phase have similar energy yields and may thus be limited by radical quenching processes at the plasma-liquid interface. Reactor modification using discharges in bubbles and discharges over the liquid phase can provide modest improvements in energy yield over direct discharge in the liquid, but the interpretation is complicated by additional chemical reactions of gas phase components such as ozone and nitrogen oxides. The highest-efficiency plasma process utilizes liquid water droplets that may enhance efficiency by sequestering hydrogen peroxide in the liquid and by suppressing decomposition reactions by radicals from the gas and at the interface. Kinetic simulations of water vapor reported in the literature suggest that plasma generation of hydrogen peroxide should approach 45% of the thermodynamic limit, and this result, coupled with experimental studies demonstrating improvements in the presence of the condensed liquid phase, suggests that further improvements in energy yield may be possible. Plasma generation of hydrogen peroxide directly from water compares favorably with a number of other methods including electron beam, ultrasound, electrochemical and photochemical methods, and other chemical processes.
Six Sigma methods applied to cryogenic coolers assembly line
NASA Astrophysics Data System (ADS)
Ventre, Jean-Marc; Germain-Lacour, Michel; Martin, Jean-Yves; Cauquil, Jean-Marc; Benschop, Tonny; Griot, René
2009-05-01
Six Sigma methods have been applied to the manufacturing process of a rotary Stirling cooler, the RM2. The project was named NoVa, as the main goal of the Six Sigma approach is to reduce variability (No Variability). The project was based on the DMAIC guideline, following five stages: Define, Measure, Analyse, Improve, Control. The objective was set on the rate of coolers passing the performance test at first attempt, with a goal value of 95%. A team was gathered involving the people and skills acting on the RM2 manufacturing line. Measurement System Analysis (MSA) was applied to the test bench, and gage R&R results showed that measurement was one of the root causes of variability in the RM2 process. Two more root causes were identified by the team after process mapping analysis: the regenerator filling factor and the cleaning procedure. The causes of measurement variability were identified and eradicated, as shown by new gage R&R results. Experimental results showed that the regenerator filling factor impacts process variability and affects yield. An improved process was established after a new calibration process for the test bench, a new filling procedure for the regenerator, and an additional cleaning stage were implemented. The objective of 95% of coolers passing the performance test at first attempt was reached and maintained for a significant period. The RM2 manufacturing process is now managed according to Statistical Process Control based on control charts. Improvements in process capability have enabled the introduction of a sample testing procedure before delivery.
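For the Statistical Process Control step mentioned above, X-bar control-chart limits are typically computed from subgroup means and the average range. A minimal sketch follows; the data are synthetic and A2 = 0.577 is the standard chart constant for subgroups of five, not a value from the paper:

```python
import numpy as np

def xbar_limits(samples, A2=0.577):
    """Shewhart X-bar chart limits: centre line +/- A2 * average range.
    `samples` has one row per rational subgroup."""
    samples = np.asarray(samples, dtype=float)
    centre = samples.mean()                                  # grand mean
    rbar = (samples.max(axis=1) - samples.min(axis=1)).mean()  # average range
    return centre - A2 * rbar, centre, centre + A2 * rbar

rng = np.random.default_rng(0)
data = rng.normal(10.0, 0.2, size=(20, 5))   # 20 subgroups of 5 measurements
lcl, centre, ucl = xbar_limits(data)
```

Subgroup means falling outside (lcl, ucl) would signal an out-of-control process.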
NASA Astrophysics Data System (ADS)
Tian, Yu; Rao, Changhui; Wei, Kai
2008-07-01
Adaptive optics can only partially compensate for image blur caused by atmospheric turbulence, owing to observing conditions and hardware restrictions. A post-processing method based on frame selection and multi-frame blind deconvolution is proposed to improve images partially corrected by adaptive optics. Frames suitable for blind deconvolution are selected from the recorded AO closed-loop frame series by the frame selection technique, and multi-frame blind deconvolution is then performed. No prior knowledge is required except for the positivity constraint in the blind deconvolution. The use of multi-frame images benefits the stability and convergence of the blind deconvolution algorithm. The method was applied to the restoration of images of celestial bodies observed with the 1.2 m telescope equipped with a 61-element adaptive optical system at Yunnan Observatory. The results show that the method can effectively improve images partially corrected by adaptive optics.
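A hedged sketch of the frame-selection idea: rank the recorded frames by a simple sharpness metric (gradient energy here; the paper's actual selection criterion may differ) and keep the best ones for the deconvolution stage:

```python
import numpy as np

def sharpness(frame):
    """Gradient-energy sharpness metric; higher means sharper."""
    gy, gx = np.gradient(frame.astype(float))
    return float((gx**2 + gy**2).mean())

def select_frames(frames, k):
    """Return the k sharpest frames for multi-frame deconvolution."""
    scores = [sharpness(f) for f in frames]
    order = np.argsort(scores)[::-1]
    return [frames[i] for i in order[:k]]

# synthetic test scene: a bright square, blurred by different amounts
rng = np.random.default_rng(1)
truth = np.zeros((32, 32)); truth[12:20, 12:20] = 1.0
frames = []
for blur in [0, 1, 2, 3]:
    f = truth.copy()
    for _ in range(blur):                      # crude repeated box blur
        f = (f + np.roll(f, 1, 0) + np.roll(f, -1, 0)
               + np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 5.0
    frames.append(f + 0.01 * rng.standard_normal(truth.shape))
best = select_frames(frames, 2)
```

Only the selected frames would then be fed to the multi-frame blind deconvolution, improving its stability.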
Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen
2013-10-01
Many segmentation algorithms in medical image analysis use Bayesian modeling to augment local image appearance with prior anatomical knowledge. Such methods often contain a large number of free parameters that are first estimated and then kept fixed during the actual segmentation process. However, a faithful Bayesian analysis would marginalize over such parameters, accounting for their uncertainty by considering all possible values they may take. Here we propose to incorporate this uncertainty into Bayesian segmentation methods in order to improve the inference process. In particular, we approximate the required marginalization over model parameters using computationally efficient Markov chain Monte Carlo techniques. We illustrate the proposed approach using a recently developed Bayesian method for the segmentation of hippocampal subfields in brain MRI scans, showing a significant improvement in an Alzheimer's disease classification task. As an additional benefit, the technique also allows one to compute informative "error bars" on the volume estimates of individual structures. Copyright © 2013 Elsevier B.V. All rights reserved.
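The core idea above, replacing a fixed point estimate of model parameters with Monte Carlo averaging over posterior samples, can be sketched on a toy one-parameter model. This is a Metropolis sampler for a Gaussian mean with known variance, purely to illustrate the principle; it is not the hippocampal segmentation model:

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(1.0, 1.0, size=20)          # observed data, unit-variance model

def log_post(mu):
    """Log posterior of the mean under a flat prior, sigma fixed at 1."""
    return -0.5 * np.sum((data - mu)**2)

# Metropolis sampler over the model parameter mu
mu, samples = 0.0, []
for _ in range(5000):
    prop = mu + 0.5 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(mu):
        mu = prop
    samples.append(mu)
samples = np.asarray(samples[1000:])          # drop burn-in

# marginalized predictive density at a query point y, vs the plug-in estimate
y = 2.0
marginal = np.mean(np.exp(-0.5 * (y - samples)**2) / np.sqrt(2*np.pi))
plugin = np.exp(-0.5 * (y - data.mean())**2) / np.sqrt(2*np.pi)
```

The `marginal` value averages the prediction over parameter uncertainty, which is exactly the marginalization the paper approximates with MCMC in the segmentation setting.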
Improving the medical records department processes by lean management.
Ajami, Sima; Ketabi, Saeedeh; Sadeghian, Akram; Saghaeinnejad-Isfahani, Sakine
2015-01-01
Lean management is a process improvement technique for identifying wasteful actions and processes in order to eliminate them. The benefits of Lean for healthcare organizations are, first, that the quality of outcomes improves in terms of mistakes and errors, and second, that the time taken through the whole process is significantly reduced. The purpose of this paper is to improve the Medical Records Department (MRD) processes at Ayatolah-Kashani Hospital in Isfahan, Iran by utilizing Lean management. This research was an applied, interventional study. The data were collected by brainstorming, observation, interview, and workflow review. The study population included MRD staff and other expert staff within the hospital who were stakeholders and users of the MRD. The MRD staff were initially taught the concepts of Lean management and then formed into the MRD Lean team. The team then identified and reviewed the current processes; subsequently, they identified wastes and values and proposed solutions. The findings showed that across the MRD units (Archive, Coding, Statistics, and Admission), 17 current processes, 28 wastes, and 11 values were identified. In addition, the team offered 27 recommendations for eliminating the wastes. The MRD is a critical department for the hospital information system and, therefore, continuous improvement of its services and processes through scientific methods such as Lean management is essential. The study represents one of the few attempts to eliminate wastes in the MRD.
Removal of bone in CT angiography by multiscale matched mask bone elimination.
Gratama van Andel, H A F; Venema, H W; Streekstra, G J; van Straten, M; Majoie, C B L M; den Heeten, G J; Grimbergen, C A
2007-10-01
For clear visualization of vessels in CT angiography (CTA) images of the head and neck using maximum intensity projection (MIP) or volume rendering (VR) bone has to be removed. In the past we presented a fully automatic method to mask the bone [matched mask bone elimination (MMBE)] for this purpose. A drawback is that vessels adjacent to bone may be partly masked as well. We propose a modification, multiscale MMBE, which reduces this problem by using images at two scales: a higher resolution than usual for image processing and a lower resolution to which the processed images are transformed for use in the diagnostic process. A higher in-plane resolution is obtained by the use of a sharper reconstruction kernel. The out-of-plane resolution is improved by deconvolution or by scanning with narrower collimation. The quality of the mask that is used to remove bone is improved by using images at both scales. After masking, the desired resolution for the normal clinical use of the images is obtained by blurring with Gaussian kernels of appropriate widths. Both methods (multiscale and original) were compared in a phantom study and with clinical CTA data sets. With the multiscale approach the width of the strip of soft tissue adjacent to the bone that is masked can be reduced from 1.0 to 0.2 mm without reducing the quality of the bone removal. The clinical examples show that vessels adjacent to bone are less affected and therefore better visible. Images processed with multiscale MMBE have a slightly higher noise level or slightly reduced resolution compared with images processed by the original method and the reconstruction and processing time is also somewhat increased. Nevertheless, multiscale MMBE offers a way to remove bone automatically from CT angiography images without affecting the integrity of the blood vessels. The overall image quality of MIP or VR images is substantially improved relative to images processed with the original MMBE method.
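The mask-then-blur pipeline described above can be caricatured in a few lines. A toy intensity threshold stands in for MMBE's registered non-contrast mask, and the threshold, dilation width, and Gaussian sigma are illustrative assumptions only:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, binary_dilation

def mask_bone_and_blur(img, bone_hu=300.0, dilate=1, sigma=1.0):
    """Toy illustration of mask-based bone removal followed by Gaussian
    blurring down to the desired clinical resolution. The real MMBE mask
    comes from a registered non-contrast scan, not a simple threshold."""
    mask = binary_dilation(img > bone_hu, iterations=dilate)  # bone mask
    masked = np.where(mask, img.min(), img)                   # suppress bone
    return gaussian_filter(masked.astype(float), sigma=sigma) # target resolution

# synthetic slice with one bright "bone" block
img = np.zeros((32, 32))
img[10:14, 10:14] = 1000.0
out = mask_bone_and_blur(img)
```

The multiscale point of the paper is that computing the mask at a higher resolution lets the dilation strip be made much thinner before this final blurring step.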
Methods for Conducting Cognitive Task Analysis for a Decision Making Task.
1996-01-01
Cognitive task analysis (CTA) improves traditional task analysis procedures by analyzing the thought processes of performers while they complete a...for using these methods to conduct a CTA for domains which involve critical decision making tasks in naturalistic settings. The cognitive task analysis methods
Wilding, Bruce M; Turner, Terry D
2014-12-02
A method of natural gas liquefaction may include cooling a gaseous NG process stream to form a liquid NG process stream. The method may further include directing a first tail gas stream out of a plant at a first pressure and directing a second tail gas stream out of the plant at a second pressure. An additional method of natural gas liquefaction may include separating CO₂ from a liquid NG process stream and processing the CO₂ to provide a CO₂ product stream. Another method of natural gas liquefaction may include combining a marginal gaseous NG process stream with a secondary substantially pure NG stream to provide an improved gaseous NG process stream. Additionally, a NG liquefaction plant may include a first tail gas outlet and at least a second tail gas outlet, the at least a second tail gas outlet separate from the first tail gas outlet.
Computer modeling of lung cancer diagnosis-to-treatment process
Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U.; Yu, Xinhua; Faris, Nick
2015-01-01
We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging, and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed-form expressions. Markov chain models and their application in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure for deriving closed-form expressions evaluating diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed. PMID:26380181
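A minimal illustration of the Markov-chain approach: given a transition matrix over care stages (the states and probabilities below are hypothetical, not taken from the paper), the fundamental matrix yields the expected number of steps until the absorbing treatment state:

```python
import numpy as np

# Hypothetical diagnosis-to-treatment chain; "treatment" is absorbing.
states = ["referral", "diagnosis", "staging", "treatment"]
P = np.array([
    [0.2, 0.8, 0.0, 0.0],
    [0.0, 0.3, 0.7, 0.0],
    [0.0, 0.0, 0.4, 0.6],
    [0.0, 0.0, 0.0, 1.0],
])
Q = P[:3, :3]                       # transitions among transient states
N = np.linalg.inv(np.eye(3) - Q)    # fundamental matrix: expected visits
expected_steps = N.sum(axis=1)      # mean steps to absorption, per start state
```

With these illustrative numbers, a patient starting at referral spends on average 1/0.8 + 1/0.7 + 1/0.6 steps in the transient stages before reaching treatment; this is the kind of closed-form performance measure the paper derives.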
Bringing in the "CIA": A New Process to Improve Staff Communication.
Hunter, Rebecca; Mitchell, Jinjer; Loomis, Elena
2015-01-01
Nurses consistently express dissatisfaction with the overwhelming amount and rate of change in health care today. Nurse educators at a 475-bed hospital identified this as a problem and developed a process to present changing information in a new and engaging way. This article reports on the identification and implementation of the new communication model and the lessons learned during the process. A new method for communication dissemination was designed utilizing a "Coordinator Information Advisory Group" concept.
Li, Ke; Liu, Yi; Wang, Quanxin; Wu, Yalei; Song, Shimin; Sun, Yi; Liu, Tengchong; Wang, Jun; Li, Yang; Du, Shaoyi
2015-01-01
This paper proposes a novel multi-label classification method for resolving spacecraft electrical characteristics problems, which involve large amounts of unlabeled test data, high-dimensional features, long computing times, and slow identification rates. First, both fuzzy c-means (FCM) offline clustering and principal component feature extraction algorithms are applied in the feature selection process. Second, the approximate weighted proximal support vector machine (WPSVM) online classification algorithm is used to reduce the feature dimension and further improve the recognition rate for spacecraft electrical characteristics. Finally, a threshold-based data capture contribution method is proposed to guarantee the validity and consistency of the data selection. The experimental results indicate that the proposed method can obtain better data features of the spacecraft electrical characteristics, improve the accuracy of identification, and shorten the computing time effectively. PMID:26544549
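A compact sketch of the fuzzy c-means step used for the offline clustering: the update rules below are the standard FCM iterations, while the data and the fuzzifier m are illustrative, not the paper's spacecraft telemetry:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Standard FCM: alternate weighted-centroid and membership updates.
    Returns the membership matrix U (n x c) and the cluster centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)              # rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))              # membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centers

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
U, centers = fuzzy_c_means(X)
labels = U.argmax(axis=1)
```

The soft memberships in U (rather than hard labels) are what make FCM suitable for the unlabeled-data setting the paper describes.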
Information Filtering via Heterogeneous Diffusion in Online Bipartite Networks
Zhang, Fu-Guo; Zeng, An
2015-01-01
The rapid expansion of the Internet brings us overwhelming online information, which is impossible for an individual to go through in full. Therefore, recommender systems were created to help people dig through this abundance of information. In networks composed of users and objects, recommender algorithms based on diffusion have been proven to be among the best performing methods. Previous works considered the diffusion process from user to object, and from object to user, to be equivalent. We show in this work that this is not the case, and we improve the quality of the recommendation by taking into account the asymmetrical nature of this process. We apply this idea to modify state-of-the-art recommendation methods. The simulation results show that the new methods can outperform the existing ones in both recommendation accuracy and diversity. Finally, this modification is shown to improve the recommendation in a realistic case as well. PMID:26125631
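The symmetric baseline this work modifies is mass diffusion (ProbS) on the user-object bipartite network. A minimal sketch follows, with a toy adjacency matrix as an illustrative assumption; the two diffusion legs, object-to-user and user-to-object, are exactly the steps the paper proposes to weight asymmetrically:

```python
import numpy as np

def probs_scores(A, user):
    """Mass-diffusion (ProbS) recommendation scores on a bipartite
    adjacency matrix A (users x objects) for one target user."""
    ku = A.sum(axis=1)                               # user degrees
    ko = A.sum(axis=0)                               # object degrees
    resource = A[user].astype(float)                 # initial resource on objects
    to_users = A @ (resource / np.maximum(ko, 1))    # leg 1: object -> user
    scores = A.T @ (to_users / np.maximum(ku, 1))    # leg 2: user -> object
    scores[A[user] == 1] = -np.inf                   # don't re-recommend owned items
    return scores

A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]])
ranked = np.argsort(probs_scores(A, user=0))[::-1]   # best candidates first
```

In the paper's spirit, one would raise the degree terms in the two legs to different powers instead of treating them symmetrically.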
Hawthorne, Steven B.; Miller, David J.; Yang, Yu; Lagadec, Arnaud Jean-Marie
1999-01-01
The method of the present invention is adapted to manipulate the chemical properties of water in order to improve the effectiveness of a desired chemical process. The method involves heating the water in the vessel to subcritical temperatures between 100° and 374° C while applying sufficient pressure to maintain the water in the liquid state. Various physicochemical properties of the water can be manipulated, including polarity, solute solubility, surface tension, viscosity, and the dissociation constant. The method of the present invention has various uses, including extracting organics from solids and semisolids such as soil, selectively extracting desired organics from nonaqueous liquids, selectively separating organics using sorbent phases, enhancing reactions by controlling the dissociation constant of water, cleaning waste water, and removing organics from water using activated carbon or other suitable sorbents.
Hawthorne, Steven B.; Miller, David J.; Lagadec, Arnaud Jean-Marie; Hammond, Peter James; Clifford, Anthony Alan
2002-01-01
The method of the present invention is adapted to manipulate the chemical properties of water in order to improve the effectiveness of a desired process. The method involves heating the water in the vessel to subcritical temperatures between 100° and 374° C while applying sufficient pressure to maintain the water in the liquid state. Various physicochemical properties of the water can be manipulated, including polarity, solute solubility, surface tension, viscosity, and the dissociation constant. The method of the present invention has various uses, including extracting organics from solids and semisolids such as soil, selectively extracting desired organics from liquids, selectively separating organics using sorbent phases, enhancing reactions by controlling the dissociation constant of water, cleaning waste water, removing organics from water using activated carbon or other suitable sorbents, and degrading various compounds.
Methods for improved forewarning of critical events across multiple data channels
Hively, Lee M [Philadelphia, TN
2007-04-24
This disclosed invention concerns improvements in the forewarning of critical events via phase-space dissimilarity analysis of data from mechanical devices, electrical devices, biomedical sources, and other physical processes. First, a single channel of process-indicative data is selected that can be used in place of multiple data channels without sacrificing consistent forewarning of critical events. Second, the method discards data of inadequate quality via statistical analysis of the raw data, because the analysis of poor-quality data always yields inferior results. Third, two separate filtering operations are used in sequence to remove both high-frequency and low-frequency artifacts using a zero-phase quadratic filter. Fourth, the method constructs phase-space dissimilarity measures (PSDM) by combining multi-channel time-serial data into a multi-channel time-delay phase-space reconstruction. Fifth, the method uses a composite measure of dissimilarity (C_i) to provide a forewarning of failure and an indicator of failure onset.
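The fourth step, a time-delay phase-space reconstruction compared through a dissimilarity measure, can be sketched as follows. The histogram-based L1 dissimilarity here is a simple stand-in; the patent's composite measure C_i is defined differently:

```python
import numpy as np

def delay_embed(x, dim=3, tau=5):
    """Time-delay phase-space reconstruction of a scalar series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i*tau : i*tau + n] for i in range(dim)])

def psdm(a, b, bins=10):
    """Toy phase-space dissimilarity: L1 distance between the occupation
    histograms of two embedded trajectories over a common range."""
    lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
    ha, _ = np.histogramdd(a, bins=bins, range=[(lo, hi)] * a.shape[1])
    hb, _ = np.histogramdd(b, bins=bins, range=[(lo, hi)] * b.shape[1])
    return np.abs(ha / ha.sum() - hb / hb.sum()).sum()

t = np.linspace(0, 40*np.pi, 4000)
baseline = delay_embed(np.sin(t))                       # nominal dynamics
drifted = delay_embed(np.sin(t) + 0.3*np.sin(2.7*t))    # altered dynamics
same = delay_embed(np.sin(t + 0.1))                     # same attractor, shifted
```

A rising dissimilarity between the current window and a nominal baseline is the forewarning signal in this family of methods.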
NASA Astrophysics Data System (ADS)
Kurchatkin, I. V.; Gorshkalev, A. A.; Blagin, E. V.
2017-01-01
This article presents developed methods for modelling the working processes in the combustion chamber of an internal combustion engine (ICE). The methods include preparation of a 3-D model of the combustion chamber, generation of the finite-element mesh, setting of boundary conditions, and solution customization. The M-14 aircraft radial engine was selected for modelling. A cold blowdown cycle was carried out in the ANSYS IC Engine software, and the obtained data were compared with results of known calculation methods. A method for improving the engine's induction port was suggested.
An integrated lean-methods approach to hospital facilities redesign.
Nicholas, John
2012-01-01
Lean production methods for eliminating waste and improving processes in manufacturing are now being applied in healthcare. As the author shows, the methods are appropriate for redesigning hospital facilities. When used in an integrated manner and employing teams of mostly clinicians, the methods produce facility designs that are custom-fit to patient needs and caregiver work processes, and reduce operational costs. The author reviews lean methods and an approach for integrating them in the redesign of hospital facilities. A case example of the redesign of an emergency department shows the feasibility and benefits of the approach.
Processing on high efficiency solar collector coatings
NASA Technical Reports Server (NTRS)
Roberts, M.
1977-01-01
Wavelength selective coatings for solar collectors are considered. Substrates with good infrared reflectivity were examined along with their susceptibility to physical and environmental damage. Improvements of reflective surfaces were accomplished through buffing, chemical polishing and other surface processing methods.
NASA Astrophysics Data System (ADS)
Konakhina, I. A.; Khusnutdinova, E. M.; Khamidullina, G. R.; Khamidullina, A. F.
2016-06-01
This paper describes a mathematical model of flow-related hydrodynamic processes for rheologically complex high-viscosity bitumen oil and oil-water suspensions and presents methods to improve the design and performance of oil pipelines.
Doppler ultrasound monitoring technology.
Docker, M F
1993-03-01
Developments in the signal processing of Doppler ultrasound used for the detection of fetal heart rate (FHR) have improved the operation of cardiotocographs. These developments are reviewed and the advantages and disadvantages of the various Doppler and signal processing methods are compared.
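One of the standard signal-processing methods in this area is autocorrelation-based heart-rate estimation. A toy sketch with a synthetic rectified Doppler envelope follows; the 2.2 Hz rate, sampling frequency, and noise level are illustrative assumptions:

```python
import numpy as np

def estimate_rate(signal, fs, min_bpm=60, max_bpm=240):
    """Estimate a periodic rate (e.g. FHR) from the autocorrelation peak
    within a physiologically plausible lag window."""
    x = signal - signal.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]   # lags 0..N-1
    lo = int(fs * 60.0 / max_bpm)                       # shortest allowed period
    hi = int(fs * 60.0 / min_bpm)                       # longest allowed period
    lag = lo + int(np.argmax(ac[lo:hi]))
    return 60.0 * fs / lag

fs = 500.0
t = np.arange(0, 10, 1/fs)
f_heart = 2.2                                  # 132 beats per minute
env = np.abs(np.sin(np.pi * f_heart * t))      # rectified Doppler envelope
rng = np.random.default_rng(4)
bpm = estimate_rate(env + 0.1 * rng.standard_normal(len(t)), fs)
```

Restricting the lag search window is one of the simple tricks that makes autocorrelation FHR estimation robust to harmonics and noise.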
Lean Manufacturing Principles Improving the Targeting Process
2012-06-08
The author has familiarity with Lean manufacturing principles. Third, Lean methods have been used in different industries and have proven adaptable to the... The case study also demonstrates the multi-organizational application of VSM, JIT, and the 5S method... new members not knowing the process, this will serve as a starting point for developing understanding. Within the food industry we observed "the
Method and apparatus for obtaining enhanced production rate of thermal chemical reactions
Tonkovich, Anna Lee Y.; Wang, Yong; Wegeng, Robert S.; Gao, Yufei
2003-09-09
Reactors and processes are disclosed that can utilize high heat fluxes to obtain fast, steady-state reaction rates. Porous catalysts used in conjunction with microchannel reactors to obtain high rates of heat transfer are also disclosed. Reactors and processes that utilize short contact times, high heat flux and low pressure drop are described. Improved methods of steam reforming are also provided.
Method and apparatus for obtaining enhanced production rate of thermal chemical reactions
Tonkovich, Anna Lee Y [Pasco, WA; Wang, Yong [Richland, WA; Wegeng, Robert S [Richland, WA; Gao, Yufei [Kennewick, WA
2006-05-16
Reactors and processes are disclosed that can utilize high heat fluxes to obtain fast, steady-state reaction rates. Porous catalysts used in conjunction with microchannel reactors to obtain high rates of heat transfer are also disclosed. Reactors and processes that utilize short contact times, high heat flux and low pressure drop are described. Improved methods of steam reforming are also provided.
Overlap junctions for high coherence superconducting qubits
NASA Astrophysics Data System (ADS)
Wu, X.; Long, J. L.; Ku, H. S.; Lake, R. E.; Bal, M.; Pappas, D. P.
2017-07-01
Fabrication of sub-micron Josephson junctions is demonstrated using standard processing techniques for high-coherence, superconducting qubits. These junctions are made in two separate lithography steps with normal-angle evaporation. Most significantly, this work demonstrates that it is possible to achieve high coherence with junctions formed on aluminum surfaces cleaned in situ by Ar plasma before junction oxidation. This method eliminates the angle-dependent shadow masks typically used for small junctions. Therefore, this is conducive to the implementation of typical methods for improving margins and yield using conventional CMOS processing. The current method uses electron-beam lithography and an additive process to define the top and bottom electrodes. Extension of this work to optical lithography and subtractive processes is discussed.
NASA Astrophysics Data System (ADS)
Dachyar, M.; Christy, E.
2014-04-01
To maintain its position as a major milk producer, the Indonesian milk industry should pursue business development aimed at increasing customer service levels. One strategy is to create on-time release conditions for finished goods to be distributed to customers and distributors. To achieve this condition, the management information system for on-time release of finished goods needs to be improved. The focus of this research is to conduct business process improvement using Business Process Reengineering (BPR). The key deliverable of this study is a comprehensive business strategy that addresses the root problems. To achieve this goal, evaluation, reengineering, and improvement of the ERP system were conducted. To visualize the predicted implementation, a simulation model was built in Oracle BPM. The output of this simulation showed that the proposed solution could effectively reduce process lead time and increase the number of quality releases.
Redesigning a risk-management process for tracking injuries.
Wenzel, G R
1998-01-01
The changing responsibilities of registered nurses are challenging even the most dedicated professionals. To survive within her newly-defined roles, one nurse used a total quality improvement model to understand, analyze, and improve a medical center's system for tracking inpatient injuries. This process led to the drafting of an original software design that implemented a nursing informatics tracking system. It has resulted in significant savings of time and money and has far surpassed the accuracy, efficiency, and scope of the previous method. This article presents an overview of the design process.
They do, They Get and They Know; How to Motivate Learner to Upgrade Their Learning Quality
NASA Astrophysics Data System (ADS)
Yogica, R.; Helendra, H.
2018-04-01
The quality of the learning process that occurs in the classroom is very important to attend to, since it can determine students' success in understanding the content of a lesson. The success of the learning process can be seen from the learning outcomes and the level of positive activity of students while in class. Students who are active in the classroom during learning are interested in the content of the lesson and will develop a deeper understanding of it. In some classroom learning processes, the authors observed that the level of student activity in the first weeks of learning was very low, due to low student learning motivation. The authors applied a method named "they do, they get, and they know." This method strongly influences learning activity because it affects students' psychology and improves their learning motivation. After applying this method in two different university courses, the authors conclude that it is effective in increasing the frequency of positive student activity, and thus plays a role in improving the quality of learning.
Generated spiral bevel gears: Optimal machine-tool settings and tooth contact analysis
NASA Technical Reports Server (NTRS)
Litvin, F. L.; Tsung, W. J.; Coy, J. J.; Heine, C.
1985-01-01
Geometry and kinematic errors were studied for Gleason generated spiral bevel gears. A new method was devised for choosing optimal machine settings. These settings provide zero kinematic errors and an improved bearing contact. The kinematic errors are a major source of noise and vibration in spiral bevel gears. The improved bearing contact gives improved conditions for lubrication. A computer program for tooth contact analysis was developed, and thereby the new generation process was confirmed. The new process is governed by the requirement that during the generation process there is directional constancy of the common normal of the contacting surfaces for generator and generated surfaces of pinion and gear.
Development program to produce mullite fiber insulation
NASA Technical Reports Server (NTRS)
Long, W. G.
1975-01-01
Processing methods were utilized to form a mullite fiber-Kaowool felt. The formation of a blended felt using the Rotoformer wet-laying method was successful. Felt products were evaluated for tensile strength, thermal stability, thermal conductivity and structural integrity at 1259 C and 1371 C. Textile processing methods failed in an attempt to form a yarn from staple and multifilament mullite fiber due to fiber damage through mechanical handling. The refractoriness of pure Kaowool ceramic fiber is improved with additions of 30% or greater mullite fiber.
Reid, Denise
2013-01-01
Background. This pilot study investigated the efficacy of a novel virtual reality-cognitive rehabilitation (VR-CR) intervention to improve contextual processing of objects in children with autism. Previous research supports that children with autism show deficits in contextual processing, as well as deficits in its elementary components: abstraction and cognitive flexibility. Methods. Four children with autism participated in a multiple-baseline, single-subject study. The children were taught how to see objects in context by reinforcing attention to pivotal contextual information. Results. All children demonstrated statistically significant improvements in contextual processing and cognitive flexibility. Mixed results were found on the control test and changes in context-related behaviours. Conclusions. Larger-scale studies are warranted to determine the effectiveness and usability in comprehensive educational programs. PMID:24324379
Zhang, Ruifen; Su, Dongxiao; Hou, Fangli; Liu, Lei; Huang, Fei; Dong, Lihong; Deng, Yuanyuan; Zhang, Yan; Wei, Zhencheng; Zhang, Mingwei
2017-08-01
To establish optimal ultra-high-pressure (UHP)-assisted extraction conditions for procyanidins from lychee pericarp, a response surface analysis method with four factors and three levels was adopted. The optimum conditions were as follows: 295 MPa pressure, 13 min pressure holding time, 16.0 mL/g liquid-to-solid ratio, and 70% ethanol concentration. Compared with conventional ethanol extraction and ultrasonic-assisted extraction methods, the yields of the total procyanidins, flavonoids, and phenolics extracted using the UHP process were significantly increased; consequently, the oxygen radical absorbance capacity and cellular antioxidant activity of UHP-assisted lychee pericarp extracts were substantially enhanced. LC-MS/MS and high-performance liquid chromatography quantification results for individual phenolic compounds revealed that the yield of procyanidin compounds, including epicatechin, procyanidin A2, and procyanidin B2, from lychee pericarp could be significantly improved by the UHP-assisted extraction process. This UHP-assisted extraction process is thus a practical method for the extraction of procyanidins from lychee pericarp.
Localization of multiple defects using the compact phased array (CPA) method
NASA Astrophysics Data System (ADS)
Senyurek, Volkan Y.; Baghalian, Amin; Tashakori, Shervin; McDaniel, Dwayne; Tansel, Ibrahim N.
2018-01-01
Array systems of transducers have found numerous applications in detection and localization of defects in structural health monitoring (SHM) of plate-like structures. Different types of array configurations and analysis algorithms have been used to improve the process of localization of defects. For accurate and reliable monitoring of large structures by array systems, a high number of actuator and sensor elements are often required. In this study, a compact phased array system consisting of only three piezoelectric elements is used in conjunction with an updated total focusing method (TFM) for localization of single and multiple defects in an aluminum plate. The accuracy of the localization process was greatly improved by including wave propagation information in TFM. Results indicated that the proposed CPA approach can locate single and multiple defects with high accuracy while decreasing the processing costs and the number of required transducers. This method can be utilized in critical applications such as aerospace structures where the use of a large number of transducers is not desirable.
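The delay-and-sum imaging at the heart of the total focusing method can be sketched as follows. This is a generic illustration with a hypothetical `tfm_image` helper, not the authors' updated TFM, which additionally incorporates wave-propagation information:

```python
import math

def tfm_image(signals, tx_rx_pairs, sensor_pos, grid, wave_speed, fs):
    """Delay-and-sum total focusing method (TFM) over a pixel grid.

    signals[k] is the time-domain trace for transmit/receive pair k;
    tx_rx_pairs[k] = (i, j) indexes into sensor_pos. For each pixel,
    sum the samples taken at the transmit + receive travel time.
    """
    image = []
    for px in grid:
        acc = 0.0
        for k, (i, j) in enumerate(tx_rx_pairs):
            # total path: transmitter -> pixel -> receiver
            d = math.dist(sensor_pos[i], px) + math.dist(sensor_pos[j], px)
            n = int(round(d / wave_speed * fs))  # sample index at that delay
            if n < len(signals[k]):
                acc += signals[k][n]
        image.append(abs(acc))
    return image
```

A defect then appears as the grid point where the focused amplitude peaks, since only there do the delayed echoes from all transducer pairs add coherently.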
NASA Astrophysics Data System (ADS)
Wu, Weibin; Dai, Yifan; Zhou, Lin; Xu, Mingjin
2016-09-01
Material removal accuracy has a direct impact on the machining precision and efficiency of ion beam figuring. By analyzing the factors suppressing the improvement of material removal accuracy, we conclude that correcting the removal function deviation and reducing the amount of material removed during each iterative process could help to improve material removal accuracy. The removal function correction principle can effectively compensate for the removal function deviation between the actual figuring process and the simulated process, while experiments indicate that material removal accuracy decreases with long machining times, so removing only a small amount of material in each iterative process is suggested. However, this introduces more clamping and measuring steps, which also generate machining errors and suppress the improvement of material removal accuracy. On this account, a free-measurement iterative process method is put forward to improve material removal accuracy and figuring efficiency by using fewer measuring and clamping steps. Finally, an experiment on a φ100-mm Zerodur planar is performed, which shows that, in similar figuring time, three free-measurement iterative processes could improve the material removal accuracy and the surface error convergence rate by 62.5% and 17.6%, respectively, compared with a single iterative process.
The road to business process improvement--can you get there from here?
Gilberto, P A
1995-11-01
Historically, "improvements" within the organization have been frequently attained through automation by building and installing computer systems. Material requirements planning (MRP), manufacturing resource planning II (MRP II), just-in-time (JIT), computer aided design (CAD), computer aided manufacturing (CAM), electronic data interchange (EDI), and various other TLAs (three-letter acronyms) have been used as the methods to attain business objectives. But most companies have found that installing computer software, cleaning up their data, and providing every employee with training on how to best use the systems have not resulted in the level of business improvements needed. The software systems have simply made management around the problems easier but did little to solve the basic problems. The missing element in the efforts to improve the performance of the organization has been a shift in focus from individual department improvements to cross-organizational business process improvements. This article describes how the Electric Boat Division of General Dynamics Corporation, in conjunction with the Data Systems Division, moved its focus from one of vertical organizational processes to horizontal business processes. In other words, how we got rid of the dinosaurs.
Facilitation Standards: A Mixed Methods Study
ERIC Educational Resources Information Center
Hunter, Jennifer
2017-01-01
Online education is increasing as a solution to manage ever increasing enrollment numbers at higher education institutions. Intentionally and thoughtfully constructed courses allow students to improve performance through practice and self-assessment and instructors benefit from improving consistency in providing content and assessing process,…
Software Acquisition Improvement in the Aeronautical Systems Center
2008-09-01
Interviewees suggested a variety of different methods for fielding software improvements, including blocks, suites, and other tailored processes. Consistent with the Defense Science Board's recommendation, the DoD was urged to look to the commercial market to buy tools, methods, environments, and application software, instead of custom-built software (DSB: 1987).
Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory
NASA Technical Reports Server (NTRS)
Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.
1994-01-01
As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.
2018-03-01
Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
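The Schaake Shuffle step mentioned above can be sketched as follows: a minimal illustration of the rank-reordering idea, assuming equally sized ensemble and template, with hypothetical names; this is not the RPP-S implementation:

```python
def schaake_shuffle(ensemble, template):
    """Schaake Shuffle: reorder calibrated ensemble values so their ranks
    at each lead time match the ranks of a historical template series.

    ensemble[t] and template[t] are lists of n values for lead time t.
    Member m receives the value whose rank matches the rank template
    member m had at lead time t, restoring realistic temporal coherence
    across lead times.
    """
    shuffled = []
    for ens_t, tmpl_t in zip(ensemble, template):
        sorted_ens = sorted(ens_t)
        # template member indices ordered by value (0 = smallest)
        order = sorted(range(len(tmpl_t)), key=lambda m: tmpl_t[m])
        out = [0.0] * len(ens_t)
        for rank, m in enumerate(order):
            out[m] = sorted_ens[rank]
        shuffled.append(out)
    return shuffled
```

Because only the ordering changes, the marginal distribution at each lead time is untouched; the day-to-day correlation structure of each member is borrowed from the historical template.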
Development and accuracy of a multipoint method for measuring visibility.
Tai, Hongda; Zhuang, Zibo; Sun, Dongsong
2017-10-01
Accurate measurements of visibility are of great importance in many fields. This paper reports a multipoint visibility measurement (MVM) method to measure and calculate the atmospheric transmittance, extinction coefficient, and meteorological optical range (MOR). The relative errors of atmospheric transmittance and MOR measured by the MVM method and traditional transmissometer method are analyzed and compared. Experiments were conducted indoors, and the data were simultaneously processed. The results revealed that the MVM can effectively improve the accuracy under different visibility conditions. The greatest improvement of accuracy was 27%. The MVM can be used to calibrate and evaluate visibility meters.
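For context, the standard (Koschmieder) relation linking transmittance, extinction coefficient, and MOR used by transmissometer-style methods can be sketched as below. This is the textbook relation, not the paper's MVM calculation:

```python
import math

def extinction_from_transmittance(T, baseline_m):
    """Extinction coefficient (1/m) from transmittance T measured over a
    baseline, via Beer-Lambert: T = exp(-sigma * L)."""
    return -math.log(T) / baseline_m

def mor_from_extinction(sigma):
    """Meteorological optical range: the distance at which contrast falls
    to 5% (Koschmieder relation, MOR = -ln(0.05)/sigma, roughly 3/sigma)."""
    return -math.log(0.05) / sigma
```

For example, a transmittance of 0.9 over a 100 m baseline gives an extinction coefficient of about 1.05e-3 per metre and an MOR of roughly 2.8 km.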
NASA Astrophysics Data System (ADS)
Alrbaey, K.; Wimpenny, D. I.; Al-Barzinjy, A. A.; Moroz, A.
2016-07-01
This three-level three-factor full factorial study describes the effects of electropolishing using deep eutectic solvents on the surface roughness of re-melted 316L stainless steel samples produced by the selective laser melting (SLM) powder bed fusion additive manufacturing method. An improvement in the surface finish of re-melted stainless steel 316L parts was achieved by optimizing the processing parameters for a relatively environmentally friendly ('green') electropolishing process using a Choline Chloride ionic electrolyte. The results show that further improvement of the response value, average surface roughness (Ra), can be obtained by electropolishing after re-melting to yield a 75% improvement compared to the as-built Ra. The best Ra value was less than 0.5 μm, obtained with a potential of 4 V, maintained for 30 min at 40 °C. Electropolishing has been shown to be effective at removing the residual oxide film formed during the re-melting process. The material dissolution during the process is not homogenous and is directed preferentially toward the iron and nickel, leaving the surface rich in chromium with potentially enhanced properties. The re-melted and polished surface of the samples gave an approximately 20% improvement in fatigue life at low stresses (approximately 570 MPa). The results of the study demonstrate that a combination of re-melting and electropolishing provides a flexible method for surface texture improvement which is capable of delivering a significant improvement in surface finish while holding the dimensional accuracy of parts within an acceptable range.
An automatic method for segmentation of fission tracks in epidote crystal photomicrographs
NASA Astrophysics Data System (ADS)
de Siqueira, Alexandre Fioravante; Nakasuga, Wagner Massayuki; Pagamisse, Aylton; Tello Saenz, Carlos Alberto; Job, Aldo Eloizo
2014-08-01
Manual identification of fission tracks has practical problems, such as variation due to observer-to-observer differences in counting efficiency. An automatic processing method that could identify fission tracks in a photomicrograph could solve this problem and improve the speed of track counting. However, separation of nontrivial images is one of the most difficult tasks in image processing. Several commercial and free software packages are available, but they are designed for specific types of images. In this paper, an automatic method based on starlet wavelets is presented in order to separate fission tracks in mineral photomicrographs. Automation is achieved using the Matthews correlation coefficient, and results are evaluated by precision, recall, and accuracy. This technique is an improvement of a method aimed at segmentation of scanning electron microscopy images. The method is applied to photomicrographs of epidote phenocrysts, in which accuracy higher than 89% was obtained in fission track segmentation, even for difficult images. Algorithms corresponding to the proposed method are available for download. Using the method presented here, a user can easily determine fission tracks in photomicrographs of mineral samples.
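The Matthews correlation coefficient used to automate threshold selection can be computed from a binary confusion matrix as follows (a generic sketch, not the authors' code):

```python
import math

def matthews_cc(tp, tn, fp, fn):
    """Matthews correlation coefficient from binary confusion counts.
    Ranges from -1 (total disagreement) to +1 (perfect segmentation);
    0 means no better than chance. Robust to class imbalance, which
    matters when track pixels are a small fraction of the image."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    if denom == 0:
        return 0.0  # degenerate case: a row or column of the matrix is empty
    return (tp * tn - fp * fn) / denom
```

A segmentation parameter (e.g. a wavelet decomposition level) can then be chosen automatically as the one that maximizes the MCC against a reference mask.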
Methods and systems for deacidizing gaseous mixtures
Hu, Liang
2010-05-18
An improved process for deacidizing a gaseous mixture using phase enhanced gas-liquid absorption is described. The process utilizes a multiphasic absorbent that absorbs an acid gas at increased rate and leads to reduced overall energy costs for the deacidizing operation.
Improvement of a method for positioning of pithead by considering motion of the surface water
NASA Astrophysics Data System (ADS)
Yi, H.; Lee, D. K.
2016-12-01
Underground mining has weaknesses compared with open-pit mining in efficiency, economy, and working environment; nevertheless, it is applied for the development of deep orebodies. A development plan is established once the economic valuation and technical analysis of the deposit are completed through mineral exploration. Development, one step of the mining process, opens a passage from the ground surface to the orebody. The plan covers details such as pithead positioning, mining method selection, and shaft design. Among these, pithead positioning takes into account infrastructure, watersheds, geology, and economics. In this study, we propose a method that considers the motion of surface water in order to improve existing pithead positioning techniques. The method models the terrain around the mine, derives surface water flow information, and then estimates the drainage treatment cost for each candidate pithead location. This study covers the concept and design of the scheme.
NASA Astrophysics Data System (ADS)
Benea, Lidia
2018-06-01
Our group applies two electrochemical methods to obtain advanced functional surfaces on materials: (i) direct electrochemical synthesis by an electro-codeposition process, and (ii) anodization of materials to form nanoporous oxide layers, followed by electrodeposition of hydroxyapatite or other bioactive molecules and compounds into the porous film. Electrodeposition is a process of low energy consumption, and therefore very convenient for the surface modification of various types of materials. It is a powerful method compared with alternatives, which has led to its rapid adoption and spread in nanotechnology for obtaining nanostructured layers and films. Nanoporous thin oxide layers on titanium alloys, serving as supports for the electrodeposition of hydroxyapatite or other biomolecules for biomedical applications, can be obtained by electrochemical methods. To modify the surface of titanium or titanium alloys for improved biocompatibility or osseointegration, two steps must be fulfilled: first, controlled growth of the oxide layer; second, electrodeposition of the biomolecule into the nanoporous titanium oxide layer thus formed.
Quality improvement on the acute inpatient psychiatry unit using the model for improvement.
Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean
2013-01-01
A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients-those starting or continuing on standing neuroleptics-with the Abnormal Involuntary Movement Scale (AIMS). After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team.
Optimizing process and equipment efficiency using integrated methods
NASA Astrophysics Data System (ADS)
D'Elia, Michael J.; Alfonso, Ted F.
1996-09-01
The semiconductor manufacturing industry is continually riding the edge of technology as it tries to push toward higher design limits. Mature fabs must cut operating costs while increasing productivity to remain profitable and cannot justify large capital expenditures to improve productivity. Thus, they must push current tool production capabilities to cut manufacturing costs and remain viable. Working to continuously improve mature production methods requires innovation. Furthermore, testing and successful implementation of these ideas into modern production environments require both supporting technical data and commitment from those working with the process daily. At AMD, natural work groups (NWGs) composed of operators, technicians, engineers, and supervisors collaborate to foster innovative thinking and secure commitment. Recently, an AMD NWG improved equipment cycle time on the Genus tungsten silicide (WSi) deposition system. The team used total productive manufacturing (TPM) to identify areas for process improvement. Improved in-line equipment monitoring was achieved by constructing a real-time overall equipment effectiveness (OEE) calculator which tracked equipment down, idle, qualification, and production times. In-line monitoring results indicated that qualification time associated with slow Inspex turn-around time, and machine downtime associated with manual cleans, contributed greatly to reduced availability. Qualification time was reduced by 75% by implementing a new Inspex monitor pre-staging technique. Downtime associated with manual cleans was reduced by implementing an in-situ plasma etch back to extend the time between manual cleans. A designed experiment was used to optimize the process. The interval between 18-hour manual cleans has been extended from every 250 cycles to every 1500 cycles. Moreover, defect density improved threefold. Overall, the team achieved a 35% increase in tool availability. This paper details the above strategies and accomplishments.
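A minimal OEE calculation in the spirit of the team's tracker might look like the sketch below; the function name and the split of tool states are assumptions for illustration, not AMD's actual calculator:

```python
def oee(down_h, idle_h, qual_h, prod_h, ideal_rate, units_out, good_units):
    """Overall equipment effectiveness from logged tool-state hours.

    availability: share of total tracked time spent producing
                  (down, idle, and qualification time all count against it);
    performance:  actual output vs. ideal output in the production hours;
    quality:      share of output that passed.
    OEE is the product of the three factors.
    """
    total_h = down_h + idle_h + qual_h + prod_h
    availability = prod_h / total_h
    performance = units_out / (ideal_rate * prod_h)
    quality = good_units / units_out
    return availability * performance * quality
```

For example, a tool that spends 16 of 24 tracked hours in production (availability 0.667), runs at 75% of its ideal rate, and yields 95% good units has an OEE of about 0.475.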
Kaizen method for esophagectomy patients: improved quality control, outcomes, and decreased costs.
Iannettoni, Mark D; Lynch, William R; Parekh, Kalpaj R; McLaughlin, Kelley A
2011-04-01
The majority of costs associated with esophagectomy are related to the initial 3 days of hospital stay requiring intensive care unit stays, ventilator support, and intraoperative time. Additional costs arise from hospital-based services. The major cost increases are related to complications associated with the procedure. We attempted to define these costs and identify expense management by streamlining care through strict adherence to patient care maps, operative standardization, and rapid discharge planning to reduce variability. Utilizing methods of Kaizen philosophy, we evaluated all processes related to the entire experience of esophageal resection. This process has taken over 5 years, with quality and cost being tracked over the period. Cost analysis included expenses related to the intensive care unit, anesthesia, disposables, and hospital services. Quality improvement measures were related to intraoperative complications, in-hospital complications, and postoperative outcomes. The Institutional Review Board approved the use of anonymous data from standard clinical practice because no additional treatment was planned (observational study). Utilizing a continuous process improvement methodology, a 43% reduction in cost per case has been achieved, with a significant increase in contribution margin for esophagectomy. The length of stay has been reduced from 14 days to 5. With intraoperative and postoperative standardization, the leak rate dropped from 12% to less than 3%, and to no leaks among the last 64 patients under our current Kaizen modification of care. Utilizing lean manufacturing techniques and continuous process evaluation, we have attempted to eliminate variability and standardize the phases of care, resulting in improved outcomes, decreased length of stay, and improved contribution margins. These Kaizen improvements require continuous interventions, strict adherence to care maps, and input from all levels for quality improvement.
Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Crosby, Lori E.; Joffe, Naomi E.; Davis, Blair; Quinn, Charles T.; Shook, Lisa; Morgan, Darice; Simmons, Kenya; Kalinyak, Karen A.
2016-01-01
Stroke, a devastating complication of sickle cell anemia (SCA), can cause irreversible brain injury with physical and cognitive deficits. Transcranial Doppler ultrasonography (TCD) is a non-invasive tool for identifying children with SCA at highest risk of stroke. National guidelines recommend that TCD screening begin at age 2 years, yet there is research to suggest less than half of young children undergo screening. The purpose of this project was to use quality improvement methods to improve the proportion of patients aged 24–27 months who successfully completed their initial TCD from 25% to 75% by December 31, 2013. Quality improvement methods (e.g., process mapping, simplified failure mode effect analysis, and plan–do–study–act cycles) were used to develop and test processes for identifying eligible patients, scheduling TCDs, preparing children and families for the first TCD, and monitoring outcomes (i.e., TCD protocol). Progress was tracked using a report of eligible patients and a chart showing the age in months for the first successful TCD (population metric). As of December 2013, 100% of eligible patients successfully completed their initial TCD screen; this improvement was maintained for the next 20 months. In November 2014, a Welch’s one-way ANOVA was conducted. Results showed a statistically significant difference between the average age of first TCD for eligible patients born in 2009 and eligible patients born during the intervention period (2010–2013; F[1,11.712]=16.03, p=0.002). Use of quality improvement methods to implement a TCD protocol was associated with improved TCD screening rates in young children with SCA. PMID:27320459
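For two groups, the Welch's one-way ANOVA reported above reduces to the computation sketched below (a generic illustration; names are hypothetical, and this is not the project's analysis code):

```python
def welch_anova_two_groups(a, b):
    """Welch's one-way ANOVA for two groups, robust to unequal variances.

    Returns (F, df2); the numerator degrees of freedom df1 is 1 for two
    groups, matching the F[1, df2] form reported in the abstract. df2 comes
    from the Welch-Satterthwaite approximation.
    """
    def mean(x):
        return sum(x) / len(x)

    def var(x):  # unbiased sample variance
        m = mean(x)
        return sum((v - m) ** 2 for v in x) / (len(x) - 1)

    na, nb = len(a), len(b)
    sa, sb = var(a) / na, var(b) / nb  # squared standard errors
    F = (mean(a) - mean(b)) ** 2 / (sa + sb)
    df2 = (sa + sb) ** 2 / (sa ** 2 / (na - 1) + sb ** 2 / (nb - 1))
    return F, df2
```

With two groups, F here equals the square of the Welch t-statistic, so the same comparison could equivalently be run as an unequal-variance t-test.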
Use of Thermochrons in the Classroom
ERIC Educational Resources Information Center
Avard, Margaret Marie
2010-01-01
Preservice elementary education students often do not have a good feel for the process of science. Many may be acquainted with the steps of the scientific method but have never been through the scientific process. An exercise was designed using temperature-logging iButtons (Thermochrons) to improve knowledge of and familiarity with the process of…
USDA-ARS?s Scientific Manuscript database
The measurement of sugar concentration and dry matter in processing potatoes is a time and resource intensive activity, cannot be performed in the field, and does not easily measure within tuber variation. A proposed method to improve the phenotyping of processing potatoes is to employ hyperspectral...
USDA-ARS?s Scientific Manuscript database
Infrared (IR) radiation heating has been considered as an alternative to current food and agricultural processing methods for improving product quality and safety, increasing energy and processing efficiency, and reducing water and chemical usage. As part of the electromagnetic spectrum, IR has the ...
The Transition Assessment Process and IDEIA 2004
ERIC Educational Resources Information Center
Sitlington, Patricia L.; Clark, Gary M.
2007-01-01
This article will first provide an overview of the transition assessment process in terms of the requirements of the Individuals with Disabilities Education Improvement Act of 2004 and the basic tenets of the process. The second section will provide an overview of the methods of gathering assessment information on the student and on the living,…
NASA Astrophysics Data System (ADS)
Siahaan, P.; Suryani, A.; Kaniawati, I.; Suhendi, E.; Samsudin, A.
2017-02-01
The purpose of this research is to identify the development of students' science process skills (SPS) on the linear motion concept by utilizing simple computer simulation. To simplify the learning process, the concept is divided into three sub-concepts: 1) the definition of motion, 2) uniform linear motion, and 3) uniformly accelerated motion. This research was administered via a pre-experimental method with a one-group pretest-posttest design. The respondents were 23 seventh-grade students in a junior high school in Bandung City. The improvement in students' science process skills is examined through normalized gain analysis of pretest and posttest scores for all sub-concepts. The results show that students' science process skills improved markedly: by 47% (moderate) in observation skill, 43% (moderate) in summarizing skill, 70% (high) in prediction skill, 44% (moderate) in communication skill, and 49% (moderate) in classification skill. These results indicate that utilizing simple computer simulations in physics learning can improve overall science process skills to a moderate level.
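Normalized gain analysis conventionally uses Hake's formula, with low/moderate/high bands like those quoted above; a minimal sketch (the helper names are assumptions, not the study's code):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain: the fraction of the possible improvement
    actually realized, g = (post - pre) / (100 - pre), with scores in %."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

def gain_category(g):
    """Conventional bands: low g < 0.3, moderate 0.3 <= g <= 0.7, high g > 0.7."""
    if g < 0.3:
        return "low"
    if g <= 0.7:
        return "moderate"
    return "high"
```

For example, a class averaging 40% on the pretest and 70% on the posttest realizes half of its possible improvement (g = 0.5, a moderate gain), regardless of how easy the absolute jump would have been for a higher-scoring class.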
Rapid and Checkable Electrical Post-Treatment Method for Organic Photovoltaic Devices
Park, Sangheon; Seo, Yu-Seong; Shin, Won Suk; Moon, Sang-Jin; Hwang, Jungseek
2016-01-01
Post-treatment processes improve the performance of organic photovoltaic devices by changing the microscopic morphology and configuration of the vertical phase separation in the active layer. Thermal annealing and solvent vapor (or chemical) treatment processes have been extensively used to improve the performance of bulk-heterojunction (BHJ) organic photovoltaic (OPV) devices. In this work we introduce a new post-treatment process which we apply only electrical voltage to the BHJ-OPV devices. We used the commercially available P3HT [Poly(3-hexylthiophene)] and PC61BM (Phenyl-C61-Butyric acid Methyl ester) photovoltaic materials as donor and acceptor, respectively. We monitored the voltage and current applied to the device to check for when the post-treatment process had been completed. This electrical treatment process is simpler and faster than other post-treatment methods, and the performance of the electrically treated solar cell is comparable to that of a reference (thermally annealed) device. Our results indicate that the proposed treatment process can be used efficiently to fabricate high-performance BHJ-OPV devices. PMID:26932767
Adaptive Swarm Balancing Algorithms for rare-event prediction in imbalanced healthcare data
Wong, Raymond K.; Mohammed, Sabah; Fiaidhi, Jinan; Sung, Yunsick
2017-01-01
Clinical data analysis and forecasting have made substantial contributions to disease control, prevention and detection. However, such data usually suffer from highly imbalanced class distributions. In this paper, we aim to formulate effective methods to rebalance binary imbalanced datasets in which the positive samples form only a small minority. We investigate two meta-heuristic algorithms, particle swarm optimization and the bat algorithm, and apply them to strengthen the effects of the synthetic minority over-sampling technique (SMOTE) for pre-processing the datasets. One approach processes the full dataset as a whole; the other splits the dataset and adaptively processes it one segment at a time. The experimental results reported in this paper reveal that the performance improvements obtained by the former approach do not scale to larger datasets. The latter methods, which we call Adaptive Swarm Balancing Algorithms, deliver significant efficiency and effectiveness improvements on large datasets where the whole-dataset approach fails, and are more consistent with the practice of typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE. The proposed methods yield more credible classifier performance and shorten the run time compared to the brute-force method. PMID:28753613
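SMOTE, the over-sampling step the paper tunes, generates synthetic minority samples by interpolating between an existing minority sample and one of its k nearest neighbours. A minimal pure-Python sketch; the function name and parameters are illustrative, and it omits the swarm-optimized tuning of the over-sampling rate and k that the paper adds:

```python
import random

def smote(minority, n_new, k=5, rng=random.Random(0)):
    """Generate n_new synthetic minority samples: pick a minority point,
    pick one of its k nearest neighbours (Euclidean distance), and
    interpolate a random fraction of the way between them."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not base),
                            key=lambda p: dist(base, p))[:k]
        nb = rng.choice(neighbours)
        lam = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(b + lam * (n - b) for b, n in zip(base, nb)))
    return synthetic
```

Each synthetic point lies on a segment between two real minority points, so the minority region is densified without simply duplicating samples.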
Takeda, Kayoko; Takahashi, Kiyoshi; Masukawa, Hiroyuki; Shimamori, Yoshimitsu
2017-01-01
Recently, the practice of active learning has spread, increasingly recognized as an essential component of academic studies. Classes incorporating small group discussion (SGD) are conducted at many universities. At present, assessments of the effectiveness of SGD have mostly involved evaluation by questionnaires conducted by teachers, by peer assessment, and by self-evaluation of students. However, qualitative data, such as open-ended descriptions by students, have not been widely evaluated. As a result, we have been unable to analyze the processes and methods involved in how students acquire knowledge in SGD. In recent years, due to advances in information and communication technology (ICT), text mining has enabled the analysis of qualitative data. We therefore investigated whether the introduction of a learning system comprising the jigsaw method and problem-based learning (PBL) would improve student attitudes toward learning; we did this by text mining analysis of the content of student reports. We found that by applying the jigsaw method before PBL, we were able to improve student attitudes toward learning and increase the depth of their understanding of the area of study as a result of working with others. The use of text mining to analyze qualitative data also allowed us to understand the processes and methods by which students acquired knowledge in SGD and also changes in students' understanding and performance based on improvements to the class. This finding suggests that the use of text mining to analyze qualitative data could enable teachers to evaluate the effectiveness of various methods employed to improve learning.
Efficiency improvement of technological preparation of power equipment manufacturing
NASA Astrophysics Data System (ADS)
Milukov, I. A.; Rogalev, A. N.; Sokolov, V. P.; Shevchenko, I. V.
2017-11-01
Competitiveness of power equipment depends primarily on speeding up the development and mastering of new equipment samples and technologies, and on enhancing the organisation and management of design, manufacturing and operation. Current political, technological and economic conditions create an acute need to change the strategy and tactics of process planning, while also addressing the maintenance of equipment together with improving its efficiency and its compatibility with domestically produced components. To solve these problems, using computer-aided process planning systems for process design at all stages of the power equipment life cycle is economically viable. Computer-aided process planning improves process planning by using mathematical methods and optimisation of design and management processes on the basis of CALS technologies, which allows simultaneous process design, process planning organisation and management based on mathematical and physical modelling of interrelated design objects and the production system. An integration of computer-aided systems providing the interaction of informative and material processes at all stages of the product life cycle is proposed as an effective solution to the challenges of new equipment design and process planning.
Macrae, Rhoda; Lillo-Crespo, Manuel; Rooney, Kevin D
2017-01-01
Abstract Introduction There is a limited body of research in the field of healthcare improvement science (HIS). Quality improvement and ‘change making’ should become an intrinsic part of everyone’s job, every day in all parts of the healthcare system. The lack of theoretical grounding may partly explain the minimal transfer of health research into health policy. Methods This article seeks to present the development of the definition for healthcare improvement science. A consensus method approach was adopted with a two-stage Delphi process, expert panel and consensus group techniques. A total of 18 participants were involved in the expert panel and consensus group, and 153 answers were analysed as a part of the Delphi survey. Participants were researchers, educators and healthcare professionals from Scotland, Slovenia, Spain, Italy, England, Poland, and Romania. Results A high level of consensus was achieved for the broad definition in the 2nd Delphi iteration (86%). The final definition was agreed on by the consensus group: ‘Healthcare improvement science is the generation of knowledge to cultivate change and deliver person-centred care that is safe, effective, efficient, equitable and timely. It improves patient outcomes, health system performance and population health.’ Conclusions The process of developing a consensus definition revealed different understandings of healthcare improvement science between the participants. Having a shared consensus definition of healthcare improvement science is an important step forward, bringing about a common understanding in order to advance the professional education and practice of healthcare improvement science. PMID:28289467
Improvement for enhancing effectiveness of universal power system (UPS) continuous testing process
NASA Astrophysics Data System (ADS)
Sriratana, Lerdlekha
2018-01-01
This experiment aims to enhance the effectiveness of the Universal Power System (UPS) continuous testing process of the Electrical and Electronic Institute by applying work scheduling and time study methods. Initially, no standard time had been established for the testing process, which resulted in inaccurate testing targets and observable time wasting. To monitor and reduce wasted time and thereby improve the efficiency of the testing process, a Yamazumi chart and job scheduling theory (the North West Corner Rule) were applied to develop a new work process. After the improvements, the overall efficiency of the process increased from 52.8% to 65.6% (a gain of 12.7%), wasted time was reduced from 828.3 minutes to 653.6 minutes (21%), and the number of units tested per batch increased from 3 to 4. The number of units tested per month would therefore increase from 12 to 20, which also contributes to a 72% increase in the net income of the UPS testing process.
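The North West Corner Rule cited above is a classic heuristic that builds an initial feasible allocation for a balanced transportation or scheduling problem by filling cells from the top-left (north-west) corner. A short sketch under that assumption; the supply/demand numbers in the usage example are invented, since the paper does not publish its data:

```python
def northwest_corner(supply, demand):
    """Initial feasible allocation for a balanced transportation problem:
    starting at the top-left cell, assign as much as possible, then move
    right when a demand is exhausted or down when a supply is exhausted."""
    supply, demand = list(supply), list(demand)
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        q = min(supply[i], demand[j])
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            i += 1          # this source is used up: move down
        else:
            j += 1          # this destination is satisfied: move right
    return alloc
```

The result satisfies every row (supply) and column (demand) total, giving a feasible starting schedule that later steps can refine.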
Student Evaluations of the Portfolio Process
Airey, Tatum C.; Bisso, Andrea M.; Slack, Marion K.
2011-01-01
Objective. To evaluate pharmacy students’ perceived benefits of the portfolio process and to gather suggestions for improving the process. Methods. A questionnaire was designed and administered to 250 first-, second-, and third-year pharmacy students at the University of Arizona College of Pharmacy. Results. Although the objectives of the portfolio process were for students to understand the expected outcomes, understand the impact of extracurricular activities on attaining competencies, identify what should be learned, identify their strengths and weaknesses, and modify their approach to learning, overall students perceived the portfolio process as having less than moderate benefit. First-year students wanted more examples of portfolios, while second- and third-year students suggested that more time with their advisor would be beneficial. Conclusions. The portfolio process will continue to be refined and efforts made to improve students’ perceptions of the process, as it is intended to develop the self-assessment skills they will need to improve their knowledge and professional skills throughout their pharmacy careers. PMID:21969718
Lean methodology in health care.
Kimsey, Diane B
2010-07-01
Lean production is a process management philosophy that examines organizational processes from a customer perspective with the goal of limiting the use of resources to those processes that create value for the end customer. Lean manufacturing emphasizes increasing efficiency, decreasing waste, and using methods to decide what matters rather than accepting preexisting practices. A rapid improvement team at Lehigh Valley Health Network, Allentown, Pennsylvania, implemented a plan, do, check, act cycle to determine problems in the central sterile processing department, test solutions, and document improved processes. By using A3 thinking, a consensus building process that graphically depicts the current state, the target state, and the gaps between the two, the team worked to improve efficiency and safety, and to decrease costs. Use of this methodology has increased teamwork, created user-friendly work areas and processes, changed management styles and expectations, increased staff empowerment and involvement, and streamlined the supply chain within the perioperative area. Copyright (c) 2010 AORN, Inc. Published by Elsevier Inc. All rights reserved.
Michael, Claire W; Naik, Kalyani; McVicker, Michael
2013-05-01
We developed a value stream map (VSM) of the Papanicolaou test procedure to identify opportunities to reduce waste and errors, created a new VSM, and implemented a new process emphasizing Lean tools. Preimplementation data revealed the following: (1) processing time (PT) for 1,140 samples averaged 54 hours; (2) 27 accessioning errors were detected on review of 357 random requisitions (7.6%); (3) 5 of the 20,060 tests had labeling errors that had gone undetected in the processing stage. Four were detected later during specimen processing but 1 reached the reporting stage. Postimplementation data were as follows: (1) PT for 1,355 samples averaged 31 hours; (2) 17 accessioning errors were detected on review of 385 random requisitions (4.4%); and (3) no labeling errors were undetected. Our results demonstrate that implementation of Lean methods, such as first-in first-out processes and minimizing batch size by staff actively participating in the improvement process, allows for higher quality, greater patient safety, and improved efficiency.
Computer image processing in marine resource exploration
NASA Technical Reports Server (NTRS)
Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.
1976-01-01
Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.
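The filtering and noise removal applied to the deep-sea photographs can be illustrated with a median filter, a standard technique for removing impulse ("salt-and-pepper") noise while preserving edges. A pure-Python sketch on a 2-D list of grey levels; the 3x3 window and function name are illustrative, not the specific software the survey used:

```python
def median_filter(img, radius=1):
    """Replace each interior pixel with the median of its (2r+1)x(2r+1)
    neighbourhood; border pixels are copied through unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            window = sorted(img[yy][xx]
                            for yy in range(y - radius, y + radius + 1)
                            for xx in range(x - radius, x + radius + 1))
            out[y][x] = window[len(window) // 2]
    return out
```

A single bright noise pixel surrounded by dark water is outvoted by its neighbourhood median and disappears, while a genuine edge spanning several pixels survives.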
Plagiarism Detection for Indonesian Language using Winnowing with Parallel Processing
NASA Astrophysics Data System (ADS)
Arifin, Y.; Isa, S. M.; Wulandhari, L. A.; Abdurachman, E.
2018-03-01
Plagiarism takes many forms, not only copy-and-paste but also changing passive voice to active, or paraphrasing without appropriate acknowledgment. It occurs in all languages, including Indonesian. There is considerable previous research on plagiarism detection for Indonesian using different methods, but several aspects still leave room for improvement. This research proposes a solution that improves the plagiarism detection technique so that it catches more than the simple copy-and-paste form. The proposed solution uses Winnowing with additional steps in the pre-processing stage: stemming for Indonesian, and fingerprint generation with parallel processing, which saves processing time while still producing the plagiarism result for the suspected document.
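The Winnowing algorithm at the core of the proposal fingerprints a document by hashing every k-gram of the normalised text and keeping the minimum hash of each sliding window of w consecutive hashes; shared fingerprints between two documents flag likely copied passages. A minimal sketch; it omits the Indonesian stemming and the parallel fingerprint generation the paper adds, and k, w, and the CRC32 hash are illustrative choices:

```python
import zlib

def winnow(text, k=5, w=4):
    """Winnowing fingerprint (Schleimer et al.): lower-case and strip
    whitespace, hash every k-gram, then keep the minimum hash of each
    window of w consecutive hashes."""
    text = "".join(text.lower().split())   # crude normalisation
    if len(text) < k:
        return set()
    hashes = [zlib.crc32(text[i:i + k].encode())
              for i in range(len(text) - k + 1)]
    return {min(hashes[i:i + w])
            for i in range(max(len(hashes) - w + 1, 1))}
```

Winnowing guarantees that any copied run of at least w + k − 1 characters contributes at least one shared fingerprint, so even a short lifted phrase inside an otherwise original essay is detectable by intersecting the two fingerprint sets.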
Development of an Improved Simulator for Chemical and Microbial EOR Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, Gary A.; Sepehrnoori, Kamy; Delshad, Mojdeh
2000-09-11
The objective of this research was to extend the capability of an existing simulator (UTCHEM) to improved oil recovery methods that use surfactants, polymers, gels, alkaline chemicals, microorganisms and foam, as well as various combinations of these, in both conventional and naturally fractured oil reservoirs. Task 1 is the addition of a dual-porosity model for chemical improved-oil-recovery processes in naturally fractured oil reservoirs. Task 2 is the addition of a foam model. Task 3 addresses several numerical and coding enhancements that will greatly improve the versatility and performance of UTCHEM. Task 4 is the enhancement of physical property models.
Proposal of Heuristic Algorithm for Scheduling of Print Process in Auto Parts Supplier
NASA Astrophysics Data System (ADS)
Matsumoto, Shimpei; Okuhara, Koji; Ueno, Nobuyuki; Ishii, Hiroaki
We are interested in the print process within the manufacturing processes of an auto parts supplier as a practical problem. The purpose of this research is to apply our scheduling technique, developed at the university, to the actual print process in a mass customization environment. Rationalization of the print process depends on lot sizing. The manufacturing lead time of the print process is long, and in the present method production depends on workers’ experience and intuition, so the construction of an efficient production system is an urgent problem. Therefore, in this paper, in order to shorten the entire manufacturing lead time and to reduce stock, we re-examine the usual lot sizing rule based on a heuristic technique, and we propose an improvement method that can plan a more efficient schedule.
New signal processing technique for density profile reconstruction using reflectometry.
Clairet, F; Ricaud, B; Briolle, F; Heuraux, S; Bottereau, C
2011-08-01
Reflectometry profile measurement requires an accurate determination of the plasma-reflected signal. Along with good resolution and a high signal-to-noise ratio of the phase measurement, adequate data analysis is required. A new data processing method based on time-frequency tomographic representation is used. It provides a clearer separation between multiple components and improves isolation of the relevant signals. In this paper, this data processing technique is applied to two sets of signals coming from two different reflectometer devices used on the Tore Supra tokamak. For standard density profile reflectometry, it improves the initialization process and its reliability, providing a more accurate profile determination in the far scrape-off layer with density measurements as low as 10^16 m^-3. For a second reflectometer, which provides measurements in front of a lower hybrid launcher, this method improves the separation of the relevant plasma signal from multi-reflection processes due to the proximity of the plasma.
Improved silicon nitride for advanced heat engines
NASA Technical Reports Server (NTRS)
Yeh, H. C.; Wimmer, J. M.
1986-01-01
Silicon nitride is a high temperature material currently under consideration for heat engine and other applications. The objective is to improve the net shape fabrication technology of Si3N4 by injection molding. This is to be accomplished by optimizing the process through a series of statistically designed matrix experiments. To provide input to the matrix experiments, a wide range of alternate materials and processing parameters was investigated throughout the whole program. The improvement in the processing is to be demonstrated by a 20 percent increase in strength and a 100 percent increase in the Weibull modulus over that of the baseline material. A full characterization of the baseline process was completed. Material properties were found to be highly dependent on each step of the process. Several important parameters identified thus far are the starting raw materials, sinter/hot isostatic pressing cycle, powder bed, mixing methods, and sintering aid levels.
Compute as Fast as the Engineers Can Think! ULTRAFAST COMPUTING TEAM FINAL REPORT
NASA Technical Reports Server (NTRS)
Biedron, R. T.; Mehrotra, P.; Nelson, M. L.; Preston, M. L.; Rehder, J. J.; Rogers, J. L.; Rudy, D. H.; Sobieski, J.; Storaasli, O. O.
1999-01-01
This report documents findings and recommendations by the Ultrafast Computing Team (UCT). In the period 10-12/98, UCT reviewed design case scenarios for a supersonic transport and a reusable launch vehicle to derive computing requirements necessary for support of a design process with efficiency so radically improved that human thought rather than the computer paces the process. Assessment of the present computing capability against the above requirements indicated a need for further improvement in computing speed by several orders of magnitude to reduce time to solution from tens of hours to seconds in major applications. Evaluation of the trends in computer technology revealed a potential to attain the postulated improvement by further increases of single processor performance combined with massively parallel processing in a heterogeneous environment. However, utilization of massively parallel processing to its full capability will require redevelopment of the engineering analysis and optimization methods, including invention of new paradigms. To that end UCT recommends initiation of a new activity at LaRC called Computational Engineering for development of new methods and tools geared to the new computer architectures in disciplines, their coordination, and validation and benefit demonstration through applications.
The direct liquefaction proof of concept program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Comolli, A.G.; Lee, L.K.; Pradhan, V.R.
1995-12-31
The goal of the Proof of Concept (POC) Program is to develop Direct Coal Liquefaction and associated transitional technologies towards commercial readiness for economically producing premium liquid fuels from coal in an environmentally acceptable manner. The program focuses on developing the two-stage liquefaction (TSL) process by utilizing geographically strategic feedstocks, commercially feasible catalysts, new prototype equipment, and testing co-processing or alternate feedstocks and improved process configurations. Other high priority objectives include dispersed catalyst studies, demonstrating low rank coal liquefaction without solids deposition, improving distillate yields on a unit reactor volume basis, demonstrating ebullated bed operations while obtaining scale-up data, demonstrating optimum catalyst consumption using new concepts (e.g. regeneration, cascading), producing premium products through on-line hydrotreating, demonstrating improved hydrogen utilization for low rank coals using novel heteroatom removal methods, defining and demonstrating two-stage product properties for upgrading; demonstrating efficient and economic solid separation methods, examining the merits of integrated coal cleaning, demonstrating co-processing, studying interactions between the preheater and first and second-stage reactors, improving process operability by testing and incorporating advanced equipment and instrumentation, and demonstrating operation with alternate coal feedstocks. During the past two years major PDU Proof of Concept runs were completed: POC-1 with Illinois No. 6 coal and POC-2 with Black Thunder sub-bituminous coal. Results from these operations are continuing under review and the products are being further refined and upgraded. This paper will update the results from these operations and discuss future plans for the POC program.
Klystron Manufacturing Technology Program.
1983-09-01
processes, and methodology used on the current production tube, VKU-7735E, and the new methods and techniques used to improve and reduce the cost of...the bellows. This alignment is critical to the smooth operation of the internal tuning mechanism. IT METRD - VKCU-7795F The new assembly method changes...Varian, the MT contractor, that the new methodology, technologies and process changes introduced into the MT power klystron and autotuner assembly - VKU
Dahling, Daniel R
2002-01-01
Large-scale virus studies of groundwater systems require practical and sensitive procedures for both sample processing and viral assay. Filter adsorption-elution procedures have traditionally been used to process large-volume water samples for viruses. In this study, five filter elution procedures using cartridge filters were evaluated for their effectiveness in processing samples. Of the five procedures tested, the third method, which incorporated two separate beef extract elutions (one being an overnight filter immersion in beef extract), recovered 95% of seeded poliovirus compared with recoveries of 36 to 70% for the other methods. For viral enumeration, an expanded roller bottle quantal assay was evaluated using seeded poliovirus. This cytopathic-based method was considerably more sensitive than the standard plaque assay method. The roller bottle system was more economical than the plaque assay for the evaluation of comparable samples. Using roller bottles required less time and manipulation than the plaque procedure and greatly facilitated the examination of large numbers of samples. The combination of the improved filter elution procedure and the roller bottle assay for viral analysis makes large-scale virus studies of groundwater systems practical. This procedure was subsequently field tested during a groundwater study in which large-volume samples (exceeding 800 L) were processed through the filters.
Improved synthesis of carbon-clad silica stationary phases.
Haidar Ahmad, Imad A; Carr, Peter W
2013-12-17
Previously, we described a novel method for cladding elemental carbon onto the surface of catalytically activated silica by chemical vapor deposition (CVD) using hexane as the carbon source, and its use as a substitute for carbon-clad zirconia [1,2]. In that method, we showed that very nearly one uniform monolayer of Al(III) was deposited on the silica by a process analogous to precipitation from homogeneous solution, in order to preclude pore blockage. The purpose of the Al(III) monolayer is to activate the surface for subsequent CVD of carbon. In this work, we present an improved procedure for preparing the carbon-clad silica (denoted CCSi) phases, along with a new column packing process. The new method yields CCSi phases having better efficiency, peak symmetry, and higher retentivity compared to carbon-clad zirconia. The enhancements were achieved by modifying the original procedure in three ways. First, the kinetics of the deposition of Al(III) were more stringently controlled. Second, the CVD chamber was flushed with a mixture of hydrogen and nitrogen gas during the carbon cladding process to minimize generation of polar sites by oxygen incorporation. Third, the fine particles generated during the CVD process were exhaustively removed by flotation in an appropriate solvent.
NASA Astrophysics Data System (ADS)
Wu, Hsin-Hung; Tsai, Ya-Ning
2012-11-01
This study uses both the analytic hierarchy process (AHP) and the decision-making trial and evaluation laboratory (DEMATEL) method to evaluate criteria in the auto spare parts industry in Taiwan. Traditionally, AHP does not consider indirect effects for each criterion and assumes that criteria are independent, without further addressing the interdependence between or among them. Thus, the importance computed by AHP can be viewed as a short-term improvement opportunity. By contrast, the DEMATEL method not only evaluates the importance of criteria but also depicts their causal relations. The causal diagrams show which cause-oriented criteria to target, so improvements there can raise performance effectively and efficiently from a long-term perspective. As a result, the major advantage of integrating the AHP and DEMATEL methods is that the decision maker can continuously improve suppliers' performance from both short-term and long-term viewpoints.
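AHP derives criterion weights from the principal eigenvector of a pairwise comparison matrix, which can be approximated by power iteration. A small sketch; the 2x2 example matrix is invented for illustration and is not from the study:

```python
def ahp_weights(M, iters=100):
    """Priority weights for an AHP pairwise comparison matrix M:
    power iteration toward the principal eigenvector, renormalised
    to sum to 1 at each step."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w
```

For a matrix saying criterion A is 3 times as important as B, [[1, 3], [1/3, 1]], the weights converge to (0.75, 0.25); DEMATEL would then go further and separate A and B into cause and effect groups.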
Varughese, Anna M; Hagerman, Nancy; Townsend, Mari E
2013-07-01
The anesthesia preoperative screening and evaluation of a patient prior to surgery is a critical element in the safe and effective delivery of anesthesia care. In this era of increased focus on cost containment, many anesthesia practices are looking for ways to maximize productivity while maintaining the quality of the preoperative evaluation process by harnessing and optimizing all available resources. We sought to develop a Nurse Practitioner-assisted Preoperative Anesthesia Screening process using quality improvement methods with the goal of maintaining the quality of the screening process, while at the same time redirecting anesthesiologists' time for the provision of non-operating room (OR) anesthesia. The nurse practitioner (NP) time (approximately 10 h per week) directed to this project was gained as a result of an earlier resource utilization improvement project within the Department of Anesthesia. The goal of this improvement project was to increase the proportion of patient anesthesia screens conducted by NPs to 50% within 6 months. After discussion with key stakeholders of the process, a multidisciplinary improvement team identified a set of operational factors (key drivers) believed to be important to the success of the preoperative anesthesia screening process. These included the development of dedicated NP time for daily screening, NP competency and confidence with the screening process, effective mentoring by anesthesiologists, standardization of the screening process, and communication with stakeholders of the process, that is, surgeons.
These key drivers focused on the development of several interventions such as (i) NP education in the preoperative anesthesia screening for consultation process by a series of didactic lectures conducted by anesthesiologists, and NPs shadowing an anesthesiologist during the screening process, (ii) anesthesiologist mentoring and assessment of NP screenings using a dual screening process whereby both anesthesiologist and NP conducted the screening independently and results were compared and discussed, (iii) examination and re-adjustment of NP schedules to provide time for daily screening while preserving other responsibilities, and (iv) standardization through the development of guidelines for the preoperative screening process. Measures recorded included the percentage of patient anesthesia screens conducted by NPs, the percentage of dual screens with MD and NP agreement regarding the screening decision, and the average times taken for the anesthesiologist and NP screening processes. After implementation of these interventions, the percentage of successful NP-assisted anesthesia consultation screenings increased from 0% to 65% over a period of 6 months. The anesthesiologists' time redirected to non-OR anesthesia averaged at least 8 h a week. The percentage of dual screens with agreement on the screening decision was 96% (goal >95%). The overall average time taken for an NP screen was 8.2 min vs 4.5 min for an anesthesiologist screen. The overall average operating room delays and cancelations for cases on the day of surgery remained the same. By applying quality improvement methods, we identified key drivers for the institution of an NP-assisted preoperative screening process and successfully implemented this process while redirecting anesthesiologists' time for the provision of non-OR anesthesia. This project was instrumental in improving the matching of provider skills with clinical need while maintaining superior outcomes at the lowest possible cost.
© 2013 John Wiley & Sons Ltd.
An efficient hole-filling method based on depth map in 3D view generation
NASA Astrophysics Data System (ADS)
Liang, Haitao; Su, Xiu; Liu, Yilin; Xu, Huaiyuan; Wang, Yi; Chen, Xiaodong
2018-01-01
A new virtual view is synthesized through depth-image-based rendering (DIBR) using a single color image and its associated depth map in 3D view generation. Holes are unavoidably generated in the 2D-to-3D conversion process. We propose a hole-filling method based on the depth map to address the problem. Firstly, we improve the DIBR process by proposing a one-to-four (OTF) algorithm, using the z-buffer algorithm to solve the overlap problem. Then, based on the classical patch-based algorithm of Criminisi et al., we propose a hole-filling algorithm that uses the information in the depth map to handle the image after DIBR. To improve the accuracy of the virtual image, inpainting starts from the background side. In the calculation of the priority, in addition to the confidence term and the data term, we add a depth term. In the search for the most similar patch in the source region, we define a depth similarity to improve the accuracy of the search. Experimental results show that the proposed method effectively improves the quality of the 3D virtual view both subjectively and objectively.
Preferred skin color enhancement for photographic color reproduction
NASA Astrophysics Data System (ADS)
Zeng, Huanzhao; Luo, Ronnier
2011-01-01
Skin tones are the most important colors in the memory color category, and reproducing skin colors pleasingly is an important factor in photographic color reproduction. Moving skin colors toward a preferred skin color center improves the color preference of skin color reproduction. Several methods to morph skin colors into a smaller preferred skin color region have been reported in the past. In this paper, a new approach is proposed to further improve the result of skin color enhancement. An ellipsoid skin color model is applied to compute skin color probabilities for skin color detection and to determine a weight for skin color adjustment. Preferred skin color centers determined through psychophysical experiments are applied for color adjustment, with separate centers for dark, medium, and light skin colors, and skin colors are morphed toward their preferred color centers. A special processing step is applied to avoid contrast loss in highlights, and a 3-D interpolation method is applied to fix a potential contouring problem and to improve color processing efficiency. A psychophysical experiment validates that the method of preferred skin color enhancement effectively identifies skin colors, improves skin color preference, and does not objectionably affect preferred skin colors in original images.
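The ellipsoid model can be read as a Mahalanobis-distance membership test: the further a colour lies from the skin-cluster centre (scaled by the cluster's inverse covariance), the lower its skin probability and the weaker the pull toward the preferred centre. A sketch under those assumptions; the exp(−d²/2) mapping, the function names, and the blend strength are illustrative, not the paper's exact formulation:

```python
import math

def skin_probability(color, center, inv_cov):
    """Squared Mahalanobis distance from the skin-cluster centre,
    mapped to a weight in (0, 1] via exp(-d2 / 2)."""
    d = [c - m for c, m in zip(color, center)]
    n = len(d)
    d2 = sum(d[i] * inv_cov[i][j] * d[j]
             for i in range(n) for j in range(n))
    return math.exp(-0.5 * d2)

def enhance_skin(color, center, inv_cov, preferred, strength=0.5):
    """Morph a colour toward the preferred skin centre, weighted by its
    skin probability so non-skin colours are left almost untouched."""
    p = skin_probability(color, center, inv_cov)
    return tuple(c + p * strength * (t - c)
                 for c, t in zip(color, preferred))
```

Because the weight falls off smoothly with distance from the skin ellipsoid, skin pixels are shifted toward the preferred centre while colours far outside the ellipsoid are essentially unchanged, avoiding visible boundaries in the adjusted image.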
Comparative Mechanical Improvement of Stainless Steel 304 Through Three Methods
NASA Astrophysics Data System (ADS)
Mubarok, N.; Notonegoro, H. A.; Thosin, K. A. Z.
2018-05-01
Stainless Steel 304 (SS304) is a grade of stainless steel widely used in industry for various purposes. In this paper, we compare three experimental processes for enhancing the surface mechanical properties of SS304: cold rolling, annealed salt-bath boronizing (ASB), and annealed salt-bath boronizing-quench (ASB-Q). The phase change induced in SS304 by cold rolling led to this method being abandoned. In the ASB method, increasing the annealing time has a nonlinear relationship with the increase in hardness. The hardness achieved by the ASB-Q method remains lower than that of the ASB method.
Ten tools of continuous quality improvement: a review and case example of hospital discharge.
Ziegenfuss, J T; McKenna, C K
1995-01-01
Concepts and methods of continuous quality improvement have been endorsed by quality specialists in American health care, and their use has convinced CEOs that industrial methods can contribute to health and medical care. For all the quality improvement publications, few offer a clear, concise definition and explanation of the primary tools for teaching purposes. This report reviews ten continuous quality improvement tools: the problem-solving cycle, affinity diagrams, cause-and-effect diagrams, Pareto diagrams, histograms, bar charts, control charts, scatter diagrams, checklists, and the process decision program chart. These do not represent an exhaustive list, but a set of commonly used tools. They are applied to a case study of bed utilization in a university hospital.
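Of the ten tools, the control chart is the most computational. A minimal textbook sketch of Shewhart-style three-sigma limits on individual values follows; production charts usually estimate sigma from moving ranges rather than the raw standard deviation, so this is a simplification:

```python
import statistics

def control_limits(samples):
    """Shewhart-style individuals chart: center line at the mean,
    control limits at +/- 3 standard deviations. (A textbook sketch;
    real charts typically estimate sigma from moving ranges.)"""
    center = statistics.fmean(samples)
    sigma = statistics.pstdev(samples)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(samples):
    """Points outside the control limits, i.e. special-cause signals."""
    lcl, _, ucl = control_limits(samples)
    return [x for x in samples if x < lcl or x > ucl]
```

Applied to daily discharge counts or bed-occupancy figures, points flagged by `out_of_control` indicate special-cause variation worth investigating, while points inside the limits reflect common-cause variation the process design itself produces.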
Method for rapidly producing microporous and mesoporous materials
Coronado, P.R.; Poco, J.F.; Hrubesh, L.W.; Hopper, R.W.
1997-11-11
An improved, rapid process is provided for making microporous and mesoporous materials, including aerogels and pre-ceramics. A gel or gel precursor is confined in a sealed vessel to prevent structural expansion of the gel during the heating process. This confinement allows the gelation and drying processes to be greatly accelerated, and significantly reduces the time required to produce a dried aerogel compared to conventional methods. Drying may be performed either by subcritical drying with a pressurized fluid to expel the liquid from the gel pores or by supercritical drying. The rates of heating and decompression are significantly higher than for conventional methods. 3 figs.
Artificial mismatch hybridization
Guo, Zhen; Smith, Lloyd M.
1998-01-01
An improved nucleic acid hybridization process is provided which employs a modified oligonucleotide and improves the ability to discriminate a control nucleic acid target from a variant nucleic acid target containing a sequence variation. The modified probe contains at least one artificial mismatch relative to the control nucleic acid target in addition to any mismatch(es) arising from the sequence variation. The invention has direct and advantageous application to numerous existing hybridization methods, including, applications that employ, for example, the Polymerase Chain Reaction, allele-specific nucleic acid sequencing methods, and diagnostic hybridization methods.
Symetrica Measurements at PNNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kouzes, Richard T.; Mace, Emily K.; Redding, Rebecca L.
2009-01-26
Symetrica is a small company based in Southampton, England, that has developed an algorithm for processing gamma ray spectra obtained from a variety of scintillation detectors. Their analysis method applied to NaI(Tl), BGO, and LaBr spectra results in deconvoluted spectra with the “resolution” improved by about a factor of three to four. This method has also been applied by Symetrica to plastic scintillator with the result that full energy peaks are produced. If this method is valid and operationally viable, it could lead to a significantly improved plastic scintillator based radiation portal monitor system.
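Symetrica's algorithm itself is proprietary, so as a generic stand-in, an iterative Richardson-Lucy deconvolution illustrates how spectral resolution can be recovered when the detector response function is known:

```python
import numpy as np

def richardson_lucy(observed, psf, iters=50):
    """Iterative deconvolution of a 1-D spectrum given a known detector
    response (psf). Richardson-Lucy is a generic illustration here, not
    Symetrica's proprietary method."""
    est = np.full_like(observed, observed.mean())
    psf_flip = psf[::-1]
    for _ in range(iters):
        blurred = np.convolve(est, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        est = est * np.convolve(ratio, psf_flip, mode="same")
    return est
```

On a spectrum of isolated Gaussian-broadened peaks, iterating this update concentrates counts back toward the underlying peak positions, which is qualitatively the kind of "resolution" improvement the abstract describes.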
Shelnutt, J.A.
1984-11-29
A method is disclosed for improving product yields in an anionic metalloporphyrin-based artificial photosynthesis system for hydrogen generation. The method comprises forming an aqueous solution comprising an electron donor, methylviologen, and certain metalloporphyrins and metallochlorins, and irradiating the aqueous solution with light in the presence of a catalyst. In the photosynthesis process, solar energy is collected and stored in the form of hydrogen. Ligands attached above and below the metalloporphyrin and metallochlorin plane are capable of sterically blocking the photochemically inactive, electrostatically bound π-π complexes which can otherwise develop.
Method and apparatus for improving the quality and efficiency of ultrashort-pulse laser machining
Stuart, Brent C.; Nguyen, Hoang T.; Perry, Michael D.
2001-01-01
A method and apparatus for improving the quality and efficiency of machining materials with laser pulse durations shorter than 100 picoseconds, by orienting and maintaining the polarization of the laser light such that the electric field vector is perpendicular to the edges of the material being processed. It can be used in any machining operation requiring remote delivery and/or high precision with minimal collateral damage.
Healthcare quality measurement in orthopaedic surgery: current state of the art.
Auerbach, Andrew
2009-10-01
Improving quality of care in arthroplasty is of increasing importance to payors, hospitals, surgeons, and patients. Efforts to compel improvement have traditionally focused on measurement and reporting of data describing structural factors, care processes (or 'quality measures'), and clinical outcomes. Reporting structural measures (eg, surgical case volume) has been used with varying degrees of success. Care process measures, exemplified by initiatives such as the Surgical Care Improvement Project measures, are chosen based on the strength of randomized trial evidence linking the process to improved outcomes. However, evidence linking improved performance on Surgical Care Improvement Project measures with improved outcomes is limited. Outcome measures in surgery are of increasing importance as an approach to compel care improvement, a prominent example being the National Surgical Quality Improvement Project. Although outcomes-focused approaches are often costly, when linked to active benchmarking and collaborative activities they may improve care broadly. Moreover, implementation of computerized data systems collecting information formerly collected only on paper will facilitate benchmarking. In the end, care will only be improved if these data are used to define methods for innovating care systems that deliver better outcomes at lower or equivalent costs.
Health technology assessment process of a cardiovascular medical device in four different settings.
Olry de Labry Lima, Antonio; Espín Balbino, Jaime; Lemgruber, Alexandre; Caro Martínez, Araceli; García-Mochón, Leticia; Martín Ruiz, Eva; Lessa, Fernanda
2017-10-01
Health technology assessment (HTA) is a tool to support the decision-making process. The aim of this study is to describe the methods and processes used in reimbursement decision-making for drug-eluting stents (DES) in four different settings. DES was selected as the technology under study according to criteria agreed by a working group, and a survey of key informants was designed. DES was evaluated following well-structured HTA processes. Nonetheless, scope for improvement was observed in the data considered for the final decision, the transparency and inclusiveness of the process, and the methods employed. This study is an attempt to describe the HTA processes applied to a well-known medical device.
NASA Astrophysics Data System (ADS)
Kang, Chao; Shi, Yaoyao; He, Xiaodong; Yu, Tao; Deng, Bo; Zhang, Hongji; Sun, Pengcheng; Zhang, Wenbin
2017-09-01
This study investigates the multi-objective optimization of quality characteristics for a T300/epoxy prepreg tape-wound cylinder. The method integrates the Taguchi method, grey relational analysis (GRA) and response surface methodology, and is adopted to improve tensile strength and reduce residual stress. In the winding process, the main process parameters involving winding tension, pressure, temperature and speed are selected to evaluate the parametric influences on tensile strength and residual stress. Experiments are conducted using the Box-Behnken design. Based on principal component analysis, the grey relational grades are properly established to convert multi-responses into an individual objective problem. Then the response surface method is used to build a second-order model of grey relational grade and predict the optimum parameters. The predictive accuracy of the developed model is proved by two test experiments with a low prediction error of less than 7%. The following process parameters, namely winding tension 124.29 N, pressure 2000 N, temperature 40 °C and speed 10.65 rpm, have the highest grey relational grade and give better quality characteristics in terms of tensile strength and residual stress. The confirmation experiment shows that better results are obtained with GRA improved by the proposed method than with ordinary GRA. The proposed method is proved to be feasible and can be applied to optimize the multi-objective problem in the filament winding process.
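The conversion of multiple responses (here, tensile strength and residual stress) into a single grey relational grade can be sketched as below. The equal weights stand in for the paper's PCA-derived weights (an assumption), and zeta = 0.5 is the conventional distinguishing coefficient:

```python
import numpy as np

def grey_relational_grade(responses, larger_better, zeta=0.5, weights=None):
    """Grey relational analysis for multi-response optimization.

    Each response column is normalized to [0, 1] ("larger is better" or
    "smaller is better"), deviations from the ideal sequence give grey
    relational coefficients, and a weighted mean yields one grade per
    experimental run. Equal weights stand in for the paper's PCA-derived
    weights; zeta is the conventional distinguishing coefficient."""
    X = np.asarray(responses, dtype=float)
    norm = np.empty_like(X)
    for j, lb in enumerate(larger_better):
        col = X[:, j]
        lo, hi = col.min(), col.max()
        span = (hi - lo) if hi > lo else 1.0
        norm[:, j] = (col - lo) / span if lb else (hi - col) / span
    delta = 1.0 - norm                       # deviation from the ideal (1.0)
    coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    n = X.shape[1]
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    return coef @ w
```

The run with the highest grade is the best compromise across all responses; in the paper, a response-surface model is then fitted to these grades to predict the optimum winding parameters between the tested settings.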
Improvement in the amine glass platform by bubbling method for a DNA microarray
Jee, Seung Hyun; Kim, Jong Won; Lee, Ji Hyeong; Yoon, Young Soo
2015-01-01
A glass platform with high sensitivity for sexually transmitted diseases microarray is described here. An amino-silane-based self-assembled monolayer was coated on the surface of a glass platform using a novel bubbling method. The optimized surface of the glass platform had highly uniform surface modifications using this method, as well as improved hybridization properties with capture probes in the DNA microarray. On the basis of these results, the improved glass platform serves as a highly reliable and optimal material for the DNA microarray. Moreover, in this study, we demonstrated that our glass platform, manufactured by utilizing the bubbling method, had higher uniformity, shorter processing time, lower background signal, and higher spot signal than the platforms manufactured by the general dipping method. The DNA microarray manufactured with a glass platform prepared using bubbling method can be used as a clinical diagnostic tool. PMID:26468293
Harrington, J Timothy; Barash, Harvey L; Day, Sherry; Lease, Joellen
2005-04-15
To develop new processes that assure more reliable, population-based care of fragility fracture patients. A 4-year clinical improvement project was performed in a multispecialty, community practice health system using evidence-based guidelines and rapid cycle process improvement methods (plan-do-study-act cycles). Prior to this project, appropriate osteoporosis care was provided to only 5% of our 1999 hip fracture patients. In 2001, primary physicians were provided prompts about appropriate care (cycle 1), which resulted in improved care for only 20% of patients. A process improvement pilot in 2002 (cycle 2) and full program implementation in 2003 (cycle 3) have assured osteoporosis care for all willing and able patients with any fragility fracture. Altogether, 58% of 2003 fragility fracture patients, including 46% of those with hip fracture, have had a bone measurement, have been assigned to osteoporosis care with their primary physician or a consultant, and are being monitored regularly. Only 19% refused osteoporosis care. Key process improvements have included using orthopedic billings to identify patients, referring patients directly from orthopedics to an osteoporosis care program, organizing care with a nurse manager and process management computer software, assigning patients to primary or consultative physician care based on disease severity, and monitoring adherence to therapy by telephone. Reliable osteoporosis care is achievable by redesigning clinical processes. Performance data motivate physicians to reconsider traditional approaches. Improving the care of osteoporosis and other chronic diseases requires coordinated care across specialty boundaries and health system support.
NASA Astrophysics Data System (ADS)
Hong, Wei; Wang, Shaoping; Liu, Haokuo; Tomovic, Mileta M.; Chao, Zhang
2017-01-01
Inductive debris detection is an effective method for monitoring mechanical wear and can be used to prevent serious accidents. However, detecting the small debris (<100 μm) generated in the early phase of mechanical wear requires a sensor with high sensitivity relative to the background noise. To detect smaller debris with existing sensors, this paper presents a hybrid method that combines a band-pass filter with a correlation algorithm to improve the sensor's signal-to-noise ratio (SNR). Simulation results indicate that the SNR is improved by a factor of at least 2.67 after signal processing. In other words, the method ensures debris identification whenever the sensor's SNR is greater than -3 dB, so smaller debris can be detected at the same SNR. Finally, the effectiveness of the proposed method is experimentally validated.
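The band-pass-plus-correlation idea can be sketched as follows; the brick-wall FFT filter and the sinusoidal-burst debris signature are illustrative assumptions, not the paper's actual filter design or sensor model:

```python
import numpy as np

def bandpass_fft(x, fs, f_lo, f_hi):
    """Brick-wall FFT band-pass: zero spectral content outside [f_lo, f_hi].
    (A simple stand-in; the paper does not specify its filter design.)"""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), 1.0 / fs)
    X[(f < f_lo) | (f > f_hi)] = 0.0
    return np.fft.irfft(X, len(x))

def correlate_template(x, template):
    """Cross-correlate with the expected debris signature; averaging over
    the template length suppresses uncorrelated noise and raises SNR."""
    return np.correlate(x - x.mean(), template - template.mean(), mode="same")
```

The filter first removes out-of-band noise, then correlation against the known debris pulse shape adds a matched-filter gain on top, which is the two-stage SNR improvement the abstract describes.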
Munoz-Plaza, Corrine E; Parry, Carla; Hahn, Erin E; Tang, Tania; Nguyen, Huong Q; Gould, Michael K; Kanter, Michael H; Sharp, Adam L
2016-08-15
Despite reports advocating for integration of research into healthcare delivery, scant literature exists describing how this can be accomplished. Examples highlighting application of qualitative research methods embedded into a healthcare system are particularly needed. This article describes the process and value of embedding qualitative research as the second phase of an explanatory, sequential, mixed methods study to improve antibiotic stewardship for acute sinusitis. Purposive sampling of providers for in-depth interviews improved understanding of unwarranted antibiotic prescribing and elicited stakeholder recommendations for improvement. Qualitative data collection, transcription and constant comparative analyses occurred iteratively. Emerging themes and sub-themes identified primary drivers of unwarranted antibiotic prescribing patterns and recommendations for improving practice. These findings informed the design of a health system intervention to improve antibiotic stewardship for acute sinusitis. Core components of the intervention are also described. Qualitative research can be effectively applied in learning healthcare systems to elucidate quantitative results and inform improvement efforts.
Improved productivity through interactive communication
NASA Technical Reports Server (NTRS)
Marino, P. P.
1985-01-01
New methods and approaches are being tried and evaluated with the goal of increasing productivity and quality. The underlying concept in all of these approaches, methods or processes is that people require interactive communication to maximize the organization's strengths and minimize impediments to productivity improvement. This paper examines Bendix Field Engineering Corporation's organizational structure and experiences with employee involvement programs. The paper focuses on methods Bendix developed and implemented to open lines of communication throughout the organization. The Bendix approach to productivity and quality enhancement shows that interactive communication is critical to the successful implementation of any productivity improvement program. The paper concludes with an examination of the Bendix methodologies which can be adopted by any corporation in any industry.
A Method for Improved Interpretation of "Spot" Biomarker Data ...
The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for the EPA.
COMPACT, CONTINUOUS MONITORING FOR VOLATILE ORGANIC COMPOUNDS - PHASE I
Improved methods for onsite measurement of multiple volatile organic compounds are needed for process control, monitoring, and remediation. This Phase I SBIR project sets forth an optical measurement method that meets these needs. The proposed approach provides an instantaneous m...
Methods for estimating bicycling and walking in Washington state.
DOT National Transportation Integrated Search
2014-05-01
This report presents the work performed in the first and second phases of creating a method to calculate Bicycle and Pedestrian Miles Traveled (BMT/PMT) for the state of Washington. First, we recommend improvements to the existing ...
Reformulated diesel fuel and method
McAdams, Hiramie T [Carrollton, IL]; Crawford, Robert W [Tucson, AZ]; Hadder, Gerald R [Oak Ridge, TN]; McNutt, Barry D [Arlington, VA]
2006-08-22
A method for mathematically identifying at least one diesel fuel suitable for combustion in an automotive diesel engine with significantly reduced emissions and producible from known petroleum blendstocks using known refining processes, including the use of cetane additives (ignition improvers) and oxygenated compounds.
Innovative methods for calculation of freeway travel time using limited data : final report.
DOT National Transportation Integrated Search
2008-01-01
Travel time estimates created by processing simulated freeway loop-detector data with the proposed method have been compared with travel times reported by a VISSIM model. An improved methodology was proposed to estimate freeway corrido...
NASA Astrophysics Data System (ADS)
Meng, Xiangwei; Yang, Qing; Chen, Feng; Shan, Chao; Liu, Keyin; Li, Yanyang; Bian, Hao; Du, Guangqing; Hou, Xun
2015-02-01
This paper reports a flexible fabrication method for 3D solenoid microcoils in silica glass. The method combines femtosecond laser wet etching (FLWE) with a microsolidics process. A 3D microchannel with a high aspect ratio is fabricated by an improved FLWE method. In the microsolidics process, an alloy is chosen as the conductive metal: microwires are formed by injecting liquid alloy into the microchannel and allowing it to cool and solidify. Alloy microwires with a high melting point overcome the limitation on working temperature and improve the electrical properties. The geometry, height, and diameter of the microcoils are flexibly controlled by the pre-designed laser writing path, the laser power, and the etching time. The 3D microcoils can provide a uniform magnetic field and be widely integrated in many magnetic microsystems.
Laing, Karen; Baumgartner, Katherine
2005-01-01
Many endoscopy units are looking for ways to improve their efficiency without increasing staff, purchasing additional equipment, or making patients feel rushed through the care process. To accomplish this, a few hospitals have looked to other industries for help. Recently, "lean" methods and tools from the manufacturing industry have been applied successfully in health care systems and have proven to be an effective way to eliminate waste and redundancy in workplace processes. In service organizations, "lean" methods and tools focus on providing the most efficient and effective flow of services and products. This article describes the journey of one endoscopy department within a community hospital to illustrate the application of "lean" methods and tools and the results.
Search Radar Track-Before-Detect Using the Hough Transform.
1995-03-01
This document presents an improved target detection scheme, applicable to search radars, using the Hough transform image processing technique. The system concept involves a track-before-detect processing method which allows previous data to help in target detection. The technique provides many advantages compared to...
The study of crystals for space processing and the effect of zero-gravity
NASA Technical Reports Server (NTRS)
Lal, R. B.
1977-01-01
The mechanism of crystal growth from solution was studied, along with how it is affected by the space environment. The use of space processing methods to improve promising candidate materials for different devices was investigated.
Methods for deacidizing gaseous mixtures by phase enhanced absorption
Hu, Liang
2012-11-27
An improved process for deacidizing a gaseous mixture using phase enhanced gas-liquid absorption is described. The process utilizes a multiphasic absorbent that absorbs an acid gas at increased rate and leads to reduced overall energy costs for the deacidizing operation.
Zhang, Erlin; Li, Shengyi; Ren, Jing; Zhang, Lan; Han, Yong
2016-12-01
Ti-Cu sintered alloys, Ti-Cu(S), exhibited good corrosion resistance and strong antibacterial properties, but low ductility, in a previous study. In this paper, Ti-Cu(S) alloys were subjected to extrusion processing in order to improve their overall properties. The phase constitution, microstructure, mechanical properties, biocorrosion behavior, and antibacterial activity of the extruded alloys, Ti-Cu(E), were investigated in comparison with Ti-Cu(S) by X-ray diffraction (XRD), optical microscopy (OM), scanning electron microscopy (SEM) with energy-dispersive spectroscopy (EDS), mechanical testing, electrochemical testing, and the plate-count method, in order to reveal the effect of the extrusion process. XRD, OM, and SEM results showed that extrusion did not change the phase constitution but significantly refined the grain size and the Ti2Cu particles. Ti-Cu(E) alloys exhibited higher hardness and compressive yield strength than Ti-Cu(S) alloys owing to the fine grains and Ti2Cu particles. Considering the total compressive strain, it was suggested that the extrusion process also improved the ductility of the Ti-Cu(S) alloys. Electrochemical results indicated that extrusion improved the corrosion resistance of Ti-Cu(S) alloys. The plate-count method showed that both Ti-Cu(S) and Ti-Cu(E) exhibited strong antibacterial activity (>99%) against S. aureus. All these results demonstrate that hot forming, such as the extrusion in this study, refines the microstructure and densifies the alloy, in turn improving ductility, strength, and corrosion resistance without reducing the antibacterial properties. Copyright © 2016 Elsevier B.V. All rights reserved.
Improving informed consent: Stakeholder views.
Anderson, Emily E; Newman, Susan B; Matthews, Alicia K
2017-01-01
Innovation will be required to improve the informed consent process in research. We aimed to obtain input from key stakeholders-research participants and those responsible for obtaining informed consent-to inform potential development of a multimedia informed consent "app." This descriptive study used a mixed-methods approach. Five 90-minute focus groups were conducted with volunteer samples of former research participants and researchers/research staff responsible for obtaining informed consent. Participants also completed a brief survey that measured background information and knowledge and attitudes regarding research and the use of technology. Established qualitative methods were used to conduct the focus groups and data analysis. We conducted five focus groups with 41 total participants: three groups with former research participants (total n = 22), and two groups with researchers and research coordinators (total n = 19). Overall, individuals who had previously participated in research had positive views regarding their experiences. However, further discussion elicited that the informed consent process often did not meet its intended objectives. Findings from both groups are presented according to three primary themes: content of consent forms, experience of the informed consent process, and the potential of technology to improve the informed consent process. A fourth theme, need for lay input on informed consent, emerged from the researcher groups. Our findings add to previous research that suggests that the use of interactive technology has the potential to improve the process of informed consent. However, our focus-group findings provide additional insight that technology cannot replace the human connection that is central to the informed consent process. More research that incorporates the views of key stakeholders is needed to ensure that multimedia consent processes do not repeat the mistakes of paper-based consent forms.
Methods to identify, study and understand end-user participation in HIT development.
Høstgaard, Anna Marie; Bertelsen, Pernille; Nøhr, Christian
2011-09-28
Experience has shown that for new health information technology (HIT) to be successful, clinicians must obtain positive clinical benefits as a result of its implementation and joint ownership of the decisions made during the development process. A prerequisite for achieving both success criteria is real end-user participation. Experience has also shown that further research into developing improved methods to collect more detailed information on social groups participating in HIT development is needed in order to support, facilitate and improve real end-user participation. A case study of an EHR planning process in a Danish county from October 2003 until April 2006 was conducted using process analysis. Three social groups (physicians, IT professionals and administrators) were identified and studied in the local, present perspective. In order to understand the interactions between the three groups, the national, historic perspective was included through a literature study. Data were collected through observations, interviews, and insight gathered from documents and relevant literature. In the local, present perspective, the administrators' strategy for the EHR planning process meant that there was no clinical workload reduction, which was seen as one of the main barriers to the physicians achieving real influence. In the national, historic perspective, physicians and administrators have had, and still have, different perceptions of the purpose of the patient record, and both have struggled to influence its definition. To date, the administrators have won the battle. This explains the conditions made available for the physicians' participation in this case, which reduced their role to that of clinical consultants rather than real participants. In HIT development, the interests of, and the balance of power between, the different social groups involved are decisive in determining whether or not the end-users become real participants in the development process.
Real end-user-participation is essential for the successful outcome of the process. By combining and developing existing theories and methods, this paper presents an improved method to collect more detailed information on social groups participating in HIT-development and their interaction during the development. This allows HIT management to explore new avenues during the HIT development process in order to support, facilitate and improve real end-user participation.
Challenges of using quality improvement methods in nursing homes that "need improvement".
Rantz, Marilyn J; Zwygart-Stauffacher, Mary; Flesner, Marcia; Hicks, Lanis; Mehr, David; Russell, Teresa; Minner, Donna
2012-10-01
Qualitatively describe the adoption of strategies and challenges experienced by intervention facilities participating in a study targeted to improve quality of care in nursing homes "in need of improvement". To describe how staff use federal quality indicator/quality measure (QI/QM) scores and reports, quality improvement methods and activities, and how staff supported and sustained the changes recommended by their quality improvement teams. A randomized, two-group, repeated-measures design was used to test a 2-year intervention for improving quality of care and resident outcomes in facilities in "need of improvement". Intervention group (n = 29) received an experimental multilevel intervention designed to help them: (1) use quality-improvement methods, (2) use team and group process for direct-care decision-making, (3) focus on accomplishing the basics of care, and (4) maintain more consistent nursing and administrative leadership committed to communication and active participation of staff in decision-making. 
A qualitative analysis revealed a subgroup of homes likely to continue quality improvement activities and readiness indicators of homes likely to improve: (1) a leadership team (nursing home administrator, director of nurses) interested in learning how to use their federal QI/QM reports as a foundation for improving resident care and outcomes; (2) one of the leaders to be a "change champion" and make sure that current QI/QM reports are consistently printed and shared monthly with each nursing unit; (3) leaders willing to involve all staff in the facility in educational activities to learn about the QI/QM process and the reports that show how their facility compares with others in the state and nation; (4) leaders willing to plan and continuously educate new staff about the MDS and federal QI/QM reports and how to do quality improvement activities; (5) leaders willing to continuously involve all staff in quality improvement committee and team activities so they "own" the process and are responsible for change. Results of this qualitative analysis can help allocate expert nurse time to facilities that are actually ready to improve. Wide-spread adoption of this intervention is feasible and could be enabled by nursing home medical directors in collaborative practice with advanced practice nurses. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Nelson, Adam
Multi-group scattering moment matrices are critical to the solution of the multi-group form of the neutron transport equation, as they are responsible for describing the change in direction and energy of neutrons. These matrices, however, are difficult to calculate correctly from the measured nuclear data with both deterministic and stochastic methods. Calculating these parameters with deterministic methods requires a set of assumptions which do not hold true in all conditions. The quantities can be calculated accurately with stochastic methods; however, doing so is computationally expensive due to the poor efficiency of tallying scattering moment matrices. This work presents an improved method of obtaining multi-group scattering moment matrices from a Monte Carlo neutron transport code. The improved tallying method is based on recognizing that all of the outgoing particle information is known a priori and can be taken advantage of to increase the tallying efficiency (and therefore reduce the uncertainty) of the stochastically integrated tallies. In this scheme, the complete outgoing probability distribution is tallied, supplying every element of the scattering moment matrices with its share of data. In addition to reducing the uncertainty, this method allows the use of a track-length estimation process, potentially offering even further improvement to the tallying efficiency. Unfortunately, to produce the needed distributions, the probability functions themselves must undergo an integration over the outgoing energy and scattering angle dimensions. This integration is too costly to perform during the Monte Carlo simulation itself and therefore must be performed in advance by a pre-processing code. The new method increases the information obtained from tally events and therefore has a significantly higher efficiency than the currently used techniques.
The improved method has been implemented in a code system containing a new pre-processor code, NDPP, and a Monte Carlo neutron transport code, OpenMC. This method is then tested in a pin cell problem and a larger problem designed to accentuate the importance of scattering moment matrices. These tests show that accuracy was retained while the figure-of-merit for generating scattering moment matrices and fission energy spectra was significantly improved.
NASA Astrophysics Data System (ADS)
Tamimi, E.; Ebadi, H.; Kiani, A.
2017-09-01
Automatic building detection from High Spatial Resolution (HSR) images is one of the most important issues in Remote Sensing (RS). Due to the limited number of spectral bands in HSR images, using additional features can improve accuracy. However, adding features increases the probability that dependent features are present, which reduces accuracy. In addition, several parameters must be determined for Support Vector Machine (SVM) classification. Therefore, it is necessary to simultaneously determine the classification parameters and select independent features according to the image type. An optimization algorithm is an efficient way to solve this problem. On the other hand, pixel-based classification faces several challenges, such as salt-and-pepper results and high computational time on high-dimensional data. Hence, in this paper, a novel method is proposed to optimize object-based SVM classification by applying a continuous Ant Colony Optimization (ACO) algorithm. The advantages of the proposed method are a relatively high level of automation, independence from the image scene and type, reduced post-processing for building edge reconstruction, and improved accuracy. The proposed method was evaluated against pixel-based SVM and Random Forest (RF) classification in terms of accuracy. In comparison with optimized pixel-based SVM classification, the results showed that the proposed method improved the quality factor and overall accuracy by 17% and 10%, respectively. The Kappa coefficient of the proposed method was also 6% higher than that of RF classification. The processing time of the proposed method was relatively low because the unit of analysis was the image object. These results show the superiority of the proposed method in terms of both time and accuracy.
Ordered iron aluminide alloys having an improved room-temperature ductility and method thereof
Sikka, Vinod K.
1992-01-01
A process is disclosed for improving the room temperature ductility and strength of iron aluminide intermetallic alloys. The process involves thermomechanically working an iron aluminide alloy by means which produce an elongated grain structure. The worked alloy is then heated at a temperature in the range of about 650.degree. C. to about 800.degree. C. to produce a B2-type crystal structure. The alloy is rapidly cooled in a moisture free atmosphere to retain the B2-type crystal structure at room temperature, thus providing an alloy having improved room temperature ductility and strength.
NASA Technical Reports Server (NTRS)
Parra, A.; Schultz, D.; Boger, J.; Condon, S.; Webby, R.; Morisio, M.; Yakimovich, D.; Carver, J.; Stark, M.; Basili, V.;
1999-01-01
This paper describes a study performed at the Information System Center (ISC) in NASA Goddard Space Flight Center. The ISC was set up in 1998 as a core competence center in information technology. The study aims at characterizing people, processes and products of the new center, to provide a basis for proposing improvement actions and comparing the center before and after these actions have been performed. The paper presents the ISC, goals and methods of the study, results and suggestions for improvement, through the branch-level portion of this baselining effort.
NASA Astrophysics Data System (ADS)
Guo, X.; Li, Y.; Suo, T.; Liu, H.; Zhang, C.
2017-11-01
This paper proposes a method for de-blurring images captured during the dynamic deformation of materials. De-blurring is achieved with a dynamics-based approach, which is used to estimate the Point Spread Function (PSF) over the camera exposure window. The deconvolution process, which involves iterative matrix calculations over pixels, is then performed on the GPU to reduce the time cost. Compared with the Gauss method and the Lucy-Richardson method, the proposed approach yields the best image restoration results. The method has been evaluated using a Hopkinson bar loading system: in comparison with the blurry input, it successfully restores the image. Image processing experiments also demonstrate that the de-blurring method can improve the accuracy and stability of digital image correlation measurements.
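The Lucy-Richardson method that the paper compares against can be sketched in a few lines of NumPy/SciPy. This is a minimal baseline deconvolution, assuming the PSF is known and the image noiseless; it does not reproduce the paper's dynamics-based PSF estimation or GPU pipeline.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, n_iter=50):
    """Minimal Richardson-Lucy deconvolution for nonnegative images."""
    est = np.full_like(blurred, blurred.mean())
    psf_flip = psf[::-1, ::-1]
    for _ in range(n_iter):
        denom = fftconvolve(est, psf, mode="same")
        ratio = blurred / np.maximum(denom, 1e-12)
        est = est * fftconvolve(ratio, psf_flip, mode="same")
    return est

# Synthetic test: blur a bright square with a Gaussian PSF, then restore.
y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()

img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0
blurred = fftconvolve(img, psf, mode="same")
restored = richardson_lucy(blurred, psf)

err_blur = np.mean((blurred - img) ** 2)
err_rest = np.mean((restored - img) ** 2)
print(err_blur, err_rest)
```

The multiplicative update keeps the estimate nonnegative and, for a correct PSF, drives the restored image closer to the sharp original than the blurred input.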
Parallel AFSA algorithm accelerating based on MIC architecture
NASA Astrophysics Data System (ADS)
Zhou, Junhao; Xiao, Hong; Huang, Yifan; Li, Yongzhao; Xu, Yuanrui
2017-05-01
An analysis of past applications of the Artificial Fish Swarm Algorithm (AFSA) to the traveling salesman problem (TSP) shows that algorithm efficiency is often a major problem, and that the standard processing method does not fully exploit the characteristics of the TSP. We therefore propose an improved parallel AFSA. Simulations against known optimal TSP solutions show that the improved AFSA requires fewer iterations and, when run on MIC cards, doubles its operating efficiency, a significant gain.
Henze Bancroft, Leah C; Strigel, Roberta M; Hernando, Diego; Johnson, Kevin M; Kelcz, Frederick; Kijowski, Richard; Block, Walter F
2016-03-01
Chemical shift based fat/water decomposition methods such as IDEAL are frequently used in challenging imaging environments with large B0 inhomogeneity. However, they do not account for the signal modulations introduced by a balanced steady state free precession (bSSFP) acquisition. Here we demonstrate improved performance when the bSSFP frequency response is properly incorporated into the multipeak spectral fat model used in the decomposition process. Balanced SSFP allows for rapid imaging but also introduces a characteristic frequency response featuring periodic nulls and pass bands. Fat spectral components in adjacent pass bands will experience bulk phase offsets and magnitude modulations that change the expected constructive and destructive interference between the fat spectral components. A bSSFP signal model was incorporated into the fat/water decomposition process and used to generate images of a fat phantom, and bilateral breast and knee images in four normal volunteers at 1.5 Tesla. Incorporation of the bSSFP signal model into the decomposition process improved the performance of the fat/water decomposition. Incorporation of this model allows rapid bSSFP imaging sequences to use robust fat/water decomposition methods such as IDEAL. While only one set of imaging parameters were presented, the method is compatible with any field strength or repetition time. © 2015 Wiley Periodicals, Inc.
[Analysis of scatterer microstructure feature based on Chirp-Z transform cepstrum].
Guo, Jianzhong; Lin, Shuyu
2007-12-01
The characterization of tissue scatterers has been a fundamental research field in medical ultrasound, and signal processing methods are widely used in this field. A new Chirp-Z Transform Cepstrum method for estimating the mean spacing of tissue scatterers from ultrasonic scattered signals has been developed. Using this method together with the conventional AR cepstrum method, we processed backscattered signals from tissue-mimicking phantoms and pig liver in vitro. The results illustrate that the Chirp-Z Transform Cepstrum method is effective for the signal analysis of ultrasonic scattering and the characterization of tissue scatterers, and that it can improve the resolution of mean spacing estimation for tissue scatterers.
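The underlying principle, that regularly spaced scatterers imprint a periodic ripple on the log spectrum which appears as a cepstral peak at the mean spacing, can be demonstrated with a toy signal. This sketch uses a plain FFT-based real cepstrum rather than the paper's Chirp-Z variant (which zooms the spectral band for finer resolution), and the scatterer train and wavelet are invented for illustration.

```python
import numpy as np

spacing = 40                       # true mean scatterer spacing, in samples

# Four regularly spaced scatterers with geometrically decaying strength,
# convolved with a short decaying-sinusoid "ultrasonic" wavelet.
reflectors = np.zeros(512)
reflectors[64 + spacing * np.arange(4)] = 0.6 ** np.arange(4)
t = np.arange(32)
wavelet = np.exp(-t / 6.0) * np.sin(2 * np.pi * t / 8.0)
signal = np.convolve(reflectors, wavelet)[:512]

# Real cepstrum: the periodic ripple that the regular spacing imprints on
# the log spectrum appears as a peak at quefrency == spacing.
spectrum = np.abs(np.fft.rfft(signal))
cepstrum = np.fft.irfft(np.log(spectrum + 1e-12))

search_from = 35                   # skip the low-quefrency wavelet region
est = search_from + int(np.argmax(cepstrum[search_from:256]))
print(est)
```

Peak picking in the quefrency domain recovers the spacing; the Chirp-Z refinement would interpolate the spectrum over a narrow band to sharpen this estimate.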
Mishina, T; Okano, F; Yuyama, I
1999-06-10
The single-sideband method of holography, as is well known, cuts off beams that come from conjugate images for holograms produced in the Fraunhofer region and from objects with no phase components. The single-sideband method with half-zone-plate processing is also effective in the Fresnel region for beams from an object that has phase components. However, this method restricts the viewing zone to a narrow range. We propose a method to improve this restriction by time-alternating switching of hologram patterns and a spatial filter set on the focal plane of a reconstruction lens.
Methods for improved growth of group III nitride semiconductor compounds
Melnik, Yuriy; Chen, Lu; Kojiri, Hidehiro
2015-03-17
Methods are disclosed for growing group III-nitride semiconductor compounds with advanced buffer layer technique. In an embodiment, a method includes providing a suitable substrate in a processing chamber of a hydride vapor phase epitaxy processing system. The method includes forming an AlN buffer layer by flowing an ammonia gas into a growth zone of the processing chamber, flowing an aluminum halide containing precursor to the growth zone and at the same time flowing additional hydrogen halide or halogen gas into the growth zone of the processing chamber. The additional hydrogen halide or halogen gas that is flowed into the growth zone during buffer layer deposition suppresses homogeneous AlN particle formation. The hydrogen halide or halogen gas may continue flowing for a time period while the flow of the aluminum halide containing precursor is turned off.
Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design
NASA Astrophysics Data System (ADS)
Koga, Tsuyoshi; Aoyama, Kazuhiro
This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.
He, Tian; Xiao, Denghong; Pan, Qiang; Liu, Xiandong; Shan, Yingchun
2014-01-01
This paper introduces an improved acoustic emission (AE) beamforming method to localize rotor-stator rubbing faults in rotating machinery. To investigate the propagation characteristics of acoustic emission signals in the casing shell plate of rotating machinery, plate wave theory is applied to a thin plate. A simulation shows that the localization accuracy of beamforming depends on multi-mode propagation, dispersion, velocity, and array dimension. To reduce the effect of these propagation characteristics on source localization, an AE signal pre-processing method combining plate wave theory and the wavelet packet transform is introduced, and a revised localization velocity that reduces the effect of array size is presented. The localization accuracy of standard beamforming and of the improved method is compared in rubbing tests carried out on a rotating machinery test rig. The results indicate that the improved method can localize rub faults effectively. Copyright © 2013 Elsevier B.V. All rights reserved.
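The beamforming step itself can be sketched as one-dimensional delay-and-sum localization. This toy assumes a single nondispersive wave speed and a fabricated AE burst; the paper's contribution is precisely the correction for the multi-mode, dispersive plate-wave propagation that this sketch ignores.

```python
import numpy as np

c = 5000.0          # assumed (nondispersive) wave speed, m/s
fs = 1.0e6          # sampling rate, Hz
sensors = np.array([0.0, 0.1, 0.2, 0.3])   # linear array positions, m
src = 0.17                                  # true source position, m

def burst(tt):
    """Synthetic AE burst: Gaussian-windowed 100 kHz tone."""
    return np.exp(-((tt - 0.5e-3) / 2e-5) ** 2) * np.sin(2 * np.pi * 1e5 * tt)

# Each sensor receives the burst with a distance-dependent delay.
n = 4096
t = np.arange(n) / fs
data = np.zeros((len(sensors), n))
for i, s in enumerate(sensors):
    data[i] = burst(t - abs(s - src) / c)

# Delay-and-sum scan: undo the candidate delays and pick the candidate
# position with the maximum summed energy.
candidates = np.linspace(0.0, 0.3, 61)
energy = []
for xc in candidates:
    shifts = np.rint(np.abs(sensors - xc) / c * fs).astype(int)
    aligned = [np.roll(data[i], -shifts[i]) for i in range(len(sensors))]
    energy.append(np.sum(np.sum(aligned, axis=0) ** 2))
estimate = candidates[int(np.argmax(energy))]
print(estimate)
```

Only at the true source position do the shifted channels add coherently, which is why the energy map peaks there; dispersion would smear this alignment, motivating the paper's wavelet-packet pre-processing.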
An Improved Algorithm of Congruent Matching Cells (CMC) Method for Firearm Evidence Identifications
Tong, Mingsi; Song, John; Chu, Wei
2015-01-01
The Congruent Matching Cells (CMC) method was invented at the National Institute of Standards and Technology (NIST) for firearm evidence identifications. The CMC method divides the measured image of a surface area, such as a breech face impression from a fired cartridge case, into small correlation cells and uses four identification parameters to identify correlated cell pairs originating from the same firearm. The CMC method was validated by identification tests using both 3D topography images and optical images captured from breech face impressions of 40 cartridge cases fired from a pistol with 10 consecutively manufactured slides. In this paper, we discuss the processing of the cell correlations and propose an improved algorithm of the CMC method which takes advantage of the cell correlations at a common initial phase angle and combines the forward and backward correlations to improve the identification capability. The improved algorithm is tested by 780 pairwise correlations using the same optical images and 3D topography images as the initial validation. PMID:26958441
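The cell-division idea behind CMC can be illustrated with a much-simplified sketch: split both images into cells, compute a normalized cross-correlation per cell, and count cells above a threshold. This toy correlates co-located cells at zero offset only; the actual CMC method also searches over translation and a common initial phase angle, and uses four identification parameters rather than one threshold.

```python
import numpy as np

def cmc_count(img_a, img_b, cell=16, threshold=0.6):
    """Count cells whose normalized cross-correlation exceeds threshold.

    Simplified: co-located cells, zero offset, single NCC threshold.
    """
    h, w = img_a.shape
    count = 0
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            a = img_a[y:y+cell, x:x+cell].ravel()
            b = img_b[y:y+cell, x:x+cell].ravel()
            a = a - a.mean()
            b = b - b.mean()
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            if denom > 0 and (a @ b) / denom > threshold:
                count += 1
    return count

rng = np.random.default_rng(7)
surface = rng.standard_normal((128, 128))    # stand-in breech-face topography
same_gun = surface + 0.3 * rng.standard_normal((128, 128))  # noisy repeat
other_gun = rng.standard_normal((128, 128))  # unrelated topography

same_score = cmc_count(surface, same_gun)
diff_score = cmc_count(surface, other_gun)
print(same_score, diff_score)
```

Impressions from the same source yield many congruent cells while unrelated surfaces yield almost none, which is the separation the identification criterion exploits.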
Zou, Feng; Chen, Debao; Wang, Jiangtao
2016-01-01
An improved teaching-learning-based optimization algorithm combined with the social character of PSO (TLBO-PSO), which considers the influence of the teacher's behavior on the students and the mean grade of the class, is proposed in this paper to find global solutions of function optimization problems. In this method, the teacher phase of TLBO is modified: the new position of an individual is determined by its old position, the mean position, and the best position of the current generation. The method overcomes the disadvantage that the evolution of the original TLBO may stall when the mean position of the students equals the position of the teacher. To decrease the computational cost of the algorithm, the step of removing duplicate individuals in the original TLBO is not adopted in the improved algorithm. Moreover, the probability of local convergence of the improved method is decreased by a mutation operator. The effectiveness of the proposed method is tested on several benchmark functions, and the results are competitive with respect to other methods.
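A minimal sketch of the modified teacher phase helps make the update concrete. The exact coefficients here are assumptions (the paper's formula is not reproduced): the new position combines the old position, the mean position, and the best (teacher) position, with a PSO-like pull toward the teacher so the update does not vanish when the mean coincides with the teacher; the mutation operator is omitted.

```python
import numpy as np

def tlbo_pso(obj, dim=5, pop=20, iters=200, lo=-5.0, hi=5.0, seed=1):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (pop, dim))
    f = np.array([obj(x) for x in X])
    for _ in range(iters):
        mean = X.mean(axis=0)
        teacher = X[f.argmin()].copy()
        TF = rng.integers(1, 3)                   # teaching factor, 1 or 2
        r1, r2 = rng.random((2, pop, dim))
        # Modified teacher phase: old position, mean position, and the
        # teacher position all enter the update.
        Xn = X + r1 * (teacher - TF * mean) + r2 * (teacher - X)
        fn = np.array([obj(x) for x in Xn])
        imp = fn < f
        X[imp], f[imp] = Xn[imp], fn[imp]
        # Learner phase: learn pairwise from a random classmate.
        for i in range(pop):
            j = int(rng.integers(pop))
            if j == i:
                continue
            step = X[i] - X[j] if f[i] < f[j] else X[j] - X[i]
            cand = X[i] + rng.random(dim) * step
            fc = obj(cand)
            if fc < f[i]:
                X[i], f[i] = cand, fc
    return X[f.argmin()], f.min()

best, val = tlbo_pso(lambda x: np.sum((x - 1.0) ** 2))
print(best, val)
```

On a shifted sphere function the hybrid update converges toward the optimum at all-ones, while a pure teacher phase would stall once the class mean reaches the teacher.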
Machine Learning for Discriminating Quantum Measurement Trajectories and Improving Readout.
Magesan, Easwar; Gambetta, Jay M; Córcoles, A D; Chow, Jerry M
2015-05-22
Current methods for classifying measurement trajectories in superconducting qubit systems produce fidelities systematically lower than those predicted by experimental parameters. Here, we place current classification methods within the framework of machine learning (ML) algorithms and improve on them by investigating more sophisticated ML approaches. We find that nonlinear algorithms and clustering methods produce significantly higher assignment fidelities that help close the gap to the fidelity possible under ideal noise conditions. Clustering methods group trajectories into natural subsets within the data, which allows for the diagnosis of systematic errors. We find large clusters in the data associated with T1 processes and show these are the main source of discrepancy between our experimental and ideal fidelities. These error diagnosis techniques help provide a path forward to improve qubit measurements.
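The clustering idea can be sketched on synthetic readout data: integrate each trajectory to an IQ-plane point and let an unsupervised method separate the qubit states. This toy uses plain 2-means on fabricated Gaussian IQ clouds; the paper applies more sophisticated nonlinear classifiers and finds additional clusters from T1 decay events, which this sketch does not model.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
labels = rng.integers(0, 2, n)                      # prepared state per shot
centers_true = np.array([[1.0, 0.0], [-1.0, 0.0]])  # mean IQ point, |0> / |1>
iq = centers_true[labels] + 0.3 * rng.standard_normal((n, 2))

# Plain 2-means on the integrated IQ points; initialize the two centers
# from the extreme points along the I axis so they start in different blobs.
centers = np.array([iq[iq[:, 0].argmax()], iq[iq[:, 0].argmin()]])
for _ in range(50):
    d = np.linalg.norm(iq[:, None, :] - centers[None, :, :], axis=2)
    assign = d.argmin(axis=1)
    centers = np.array([iq[assign == k].mean(axis=0) for k in range(2)])

acc = (assign == labels).mean()
fidelity = max(acc, 1.0 - acc)   # assignment fidelity up to label swap
print(fidelity)
```

Because clustering finds natural subsets rather than imposing a linear boundary, outlier groups (such as shots that decayed mid-measurement) would appear as separate clusters, enabling the error diagnosis described in the abstract.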
Peleato, Nicolas M; Legge, Raymond L; Andrews, Robert C
2018-06-01
The use of fluorescence data coupled with neural networks for improved predictability of drinking water disinfection by-products (DBPs) was investigated. Novel application of autoencoders to process high-dimensional fluorescence data was related to common dimensionality reduction techniques of parallel factors analysis (PARAFAC) and principal component analysis (PCA). The proposed method was assessed based on component interpretability as well as for prediction of organic matter reactivity to formation of DBPs. Optimal prediction accuracies on a validation dataset were observed with an autoencoder-neural network approach or by utilizing the full spectrum without pre-processing. Latent representation by an autoencoder appeared to mitigate overfitting when compared to other methods. Although DBP prediction error was minimized by other pre-processing techniques, PARAFAC yielded interpretable components which resemble fluorescence expected from individual organic fluorophores. Through analysis of the network weights, fluorescence regions associated with DBP formation can be identified, representing a potential method to distinguish reactivity between fluorophore groupings. However, distinct results due to the applied dimensionality reduction approaches were observed, dictating a need for considering the role of data pre-processing in the interpretability of the results. In comparison to common organic measures currently used for DBP formation prediction, fluorescence was shown to improve prediction accuracies, with improvements to DBP prediction best realized when appropriate pre-processing and regression techniques were applied. The results of this study show promise for the potential application of neural networks to best utilize fluorescence EEM data for prediction of organic matter reactivity. Copyright © 2018 Elsevier Ltd. All rights reserved.
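The dimensionality-reduction step can be illustrated with a tiny autoencoder on synthetic fluorescence-like data. This sketch uses a linear single-hidden-layer autoencoder trained by plain gradient descent on fabricated rank-3 "EEM" spectra; the study's actual autoencoders, PARAFAC components, and DBP regression targets are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: 200 samples, each a nonnegative mixture of 3
# fluorophore-like spectral profiles (rank-3 structure) plus noise,
# flattened to 64 features.
profiles = np.abs(rng.standard_normal((3, 64)))
weights = np.abs(rng.standard_normal((200, 3)))
X = weights @ profiles + 0.05 * rng.standard_normal((200, 64))
X -= X.mean(axis=0)

# Linear autoencoder with a bottleneck of 3, trained by gradient descent
# on the reconstruction error.
W_enc = 0.01 * rng.standard_normal((64, 3))
W_dec = 0.01 * rng.standard_normal((3, 64))
lr = 1e-3
losses = []
for _ in range(500):
    Z = X @ W_enc                 # encode to the 3-dim latent space
    Xhat = Z @ W_dec              # decode back to the spectrum
    err = Xhat - X
    losses.append(np.mean(err ** 2))
    g_dec = Z.T @ err / len(X)
    g_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

print(losses[0], losses[-1])
```

The latent codes Z would then feed a downstream regression (a neural network in the paper); the point of the autoencoder, per the abstract, is that this learned low-dimensional representation mitigates overfitting relative to using the full spectrum.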
Chitama, Dereck; Baltussen, Rob; Ketting, Evert; Kamazima, Switbert; Nswilla, Anna; Mujinja, Phares G M
2011-10-21
Successful priority setting is increasingly known to be an important aspect in achieving better family planning, maternal, newborn and child health (FMNCH) outcomes in developing countries. However, far too little attention has been paid to capturing and analysing the priority setting processes and criteria for FMNCH at district level. This paper seeks to capture and analyse the priority setting processes and criteria for FMNCH at district level in Tanzania. Specifically, we assess the FMNCH actor's engagement and understanding, the criteria used in decision making and the way criteria are identified, the information or evidence and tools used to prioritize FMNCH interventions at district level in Tanzania. We conducted an exploratory study mixing both qualitative and quantitative methods to capture and analyse the priority setting for FMNCH at district level, and identify the criteria for priority setting. We purposively sampled the participants to be included in the study. We collected the data using the nominal group technique (NGT), in-depth interviews (IDIs) with key informants and documentary review. We analysed the collected data using both content analysis for qualitative data and correlation analysis for quantitative data. We found a number of shortfalls in the district's priority setting processes and criteria which may lead to inefficient and unfair priority setting decisions in FMNCH. In addition, participants identified the priority setting criteria and established the perceived relative importance of the identified criteria. However, we noted differences exist in judging the relative importance attached to the criteria by different stakeholders in the districts. In Tanzania, FMNCH contents in both general development policies and sector policies are well articulated. 
However, the current priority setting process for FMNCH at the district level is wanting in several aspects, rendering it inefficient and unfair (or unsuccessful). To improve the district-level priority setting process for FMNCH interventions, we recommend a fundamental revision of the current process. The improvement strategy should use rigorous research methods combining normative and empirical approaches to further analyze and correct past problems, while building on good practices in the current process. The suggested improvements might give room for an efficient and fair (or successful) priority setting process for FMNCH interventions.
Multi-scale Morphological Image Enhancement of Chest Radiographs by a Hybrid Scheme.
Alavijeh, Fatemeh Shahsavari; Mahdavi-Nasab, Homayoun
2015-01-01
Chest radiography is a common diagnostic imaging test, which contains an enormous amount of information about a patient. However, its interpretation is highly challenging. The accuracy of the diagnostic process is greatly influenced by image processing algorithms; hence enhancement of the images is indispensable in order to improve visibility of the details. This paper aims at improving radiograph parameters such as contrast, sharpness, noise level, and brightness to enhance chest radiographs, making use of a triangulation method. Here, contrast limited adaptive histogram equalization technique and noise suppression are simultaneously performed in wavelet domain in a new scheme, followed by morphological top-hat and bottom-hat filtering. A unique implementation of morphological filters allows for adjustment of the image brightness and significant enhancement of the contrast. The proposed method is tested on chest radiographs from Japanese Society of Radiological Technology database. The results are compared with conventional enhancement techniques such as histogram equalization, contrast limited adaptive histogram equalization, Retinex, and some recently proposed methods to show its strengths. The experimental results reveal that the proposed method can remarkably improve the image contrast while keeping the sensitive chest tissue information so that radiologists might have a more precise interpretation.
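The morphological stage of the scheme can be sketched with the classic single-scale top-hat/bottom-hat enhancement. The structuring-element size and the synthetic test image below are assumptions; the paper applies this at multiple scales together with wavelet-domain equalization and denoising, which are omitted here.

```python
import numpy as np
from scipy.ndimage import grey_opening, grey_closing

def tophat_enhance(img, size=9):
    """Morphological contrast enhancement: img + top-hat - bottom-hat."""
    opened = grey_opening(img, size=(size, size))
    closed = grey_closing(img, size=(size, size))
    tophat = img - opened       # bright details smaller than the element
    bottomhat = closed - img    # dark details smaller than the element
    return np.clip(img + tophat - bottomhat, 0, 255)

# Synthetic low-contrast "radiograph": mid-gray background with a faint
# bright nodule and a faint dark band.
img = np.full((64, 64), 120.0)
img[20:25, 20:25] += 40.0
img[40:43, 10:50] -= 40.0
out = tophat_enhance(img)
print(img.std(), out.std())
```

Adding the top-hat brightens small bright structures and subtracting the bottom-hat darkens small dark ones, stretching local contrast while leaving the smooth background untouched.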
Policy improvement by a model-free Dyna architecture.
Hwang, Kao-Shing; Lo, Chia-Yue
2013-05-01
The objective of this paper is to accelerate the process of policy improvement in reinforcement learning. The proposed Dyna-style system combines two learning schemes, one of which utilizes a temporal difference method for direct learning; the other uses relative values for indirect learning in planning between two successive direct learning cycles. Instead of establishing a complicated world model, the approach introduces a simple predictor of average rewards into the actor-critic architecture in the simulation (planning) mode. The relative value of a state, defined as the accumulated difference between immediate reward and average reward, is used to steer the improvement process in the right direction. The proposed learning scheme is applied to control a pendulum system tracking a desired trajectory, to demonstrate its adaptability and robustness. Through reinforcement signals from the environment, the system takes appropriate actions to drive an unknown dynamic system to track desired outputs within a few learning cycles. Comparisons are made between the proposed model-free method, a connectionist adaptive heuristic critic, and an advanced Dyna-Q learning method in labyrinth exploration experiments. The proposed method outperforms its counterparts in terms of elapsed time and convergence rate.
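For readers unfamiliar with the Dyna architecture the paper builds on, a minimal tabular Dyna-Q (the baseline it is compared against, not the proposed relative-value scheme) can be sketched on a toy chain task. The environment, learning rates, and planning budget below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8                  # chain states 0..7; reaching state 7 ends an episode
Q = np.zeros((N, 2))   # actions: 0 = left, 1 = right
model = {}             # learned deterministic model: (s, a) -> (r, s')
alpha, gamma, eps, planning_steps = 0.5, 0.95, 0.2, 10

def step(s, a):
    s2 = min(s + 1, N - 1) if a == 1 else max(s - 1, 0)
    return (1.0 if s2 == N - 1 else 0.0), s2

for episode in range(30):
    s = 0
    while s != N - 1:
        explore = rng.random() < eps or Q[s, 0] == Q[s, 1]
        a = int(rng.integers(2)) if explore else int(Q[s].argmax())
        r, s2 = step(s, a)
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])  # direct RL
        model[(s, a)] = (r, s2)                                 # model learning
        for _ in range(planning_steps):                         # planning
            ps, pa = list(model)[rng.integers(len(model))]
            pr, ps2 = model[(ps, pa)]
            Q[ps, pa] += alpha * (pr + gamma * Q[ps2].max() - Q[ps, pa])
        s = s2

# The learned greedy policy should walk straight to the goal in N-1 steps.
s, steps = 0, 0
while s != N - 1 and steps < 2 * N:
    _, s = step(s, int(Q[s].argmax()))
    steps += 1
print(steps)
```

The planning loop replays remembered transitions between real steps, which is the Dyna mechanism the paper retains; its contribution is replacing the explicit world model with an average-reward predictor and relative values.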
Fabrication of nano-scale Cu bond pads with seal design in 3D integration applications.
Chen, K N; Tsang, C K; Wu, W W; Lee, S H; Lu, J Q
2011-04-01
A method to fabricate nano-scale Cu bond pads for improving bonding quality in 3D integration applications is reported. The effect of Cu bonding quality on inter-level via structural reliability for 3D integration applications is investigated. We developed a Cu nano-scale-height bond pad structure and fabrication process for improved bonding quality by recessing oxides using a combination of SiO2 CMP process and dilute HF wet etching. In addition, in order to achieve improved wafer-level bonding, we introduced a seal design concept that prevents corrosion and provides extra mechanical support. Demonstrations of these concepts and processes provide the feasibility of reliable nano-scale 3D integration applications.
Panorama parking assistant system with improved particle swarm optimization method
NASA Astrophysics Data System (ADS)
Cheng, Ruzhong; Zhao, Yong; Li, Zhichao; Jiang, Weigang; Wang, Xin'an; Xu, Yong
2013-10-01
A panorama parking assistant system (PPAS) for the automotive aftermarket together with a practical improved particle swarm optimization method (IPSO) are proposed in this paper. In the PPAS system, four fisheye cameras are installed in the vehicle with different views, and four channels of video frames captured by the cameras are processed as a 360-deg top-view image around the vehicle. Besides the embedded design of PPAS, the key problem for image distortion correction and mosaicking is the efficiency of parameter optimization in the process of camera calibration. In order to address this problem, an IPSO method is proposed. Compared with other parameter optimization methods, the proposed method allows a certain range of dynamic change for the intrinsic and extrinsic parameters, and can exploit only one reference image to complete all of the optimization; therefore, the efficiency of the whole camera calibration is increased. The PPAS is commercially available, and the IPSO method is a highly practical way to increase the efficiency of the installation and the calibration of PPAS in automobile 4S shops.
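The optimization core can be sketched as a generic global-best PSO minimizing a stand-in objective. The IPSO refinements described in the abstract (dynamic ranges for the intrinsic and extrinsic parameters, calibration from a single reference image) are not reproduced; the "calibration residual" below is a hypothetical quadratic placeholder.

```python
import numpy as np

def pso(obj, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (n, dim))       # particle positions
    V = np.zeros((n, dim))                 # particle velocities
    P = X.copy()                           # personal bests
    pf = np.array([obj(x) for x in X])
    g = P[pf.argmin()].copy()              # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        V = w * V + c1 * r1 * (P - X) + c2 * r2 * (g - X)
        X = X + V
        f = np.array([obj(x) for x in X])
        better = f < pf
        P[better], pf[better] = X[better], f[better]
        g = P[pf.argmin()].copy()
    return g, pf.min()

# Hypothetical stand-in for a calibration residual: squared distance of a
# parameter vector from the "true" camera parameters.
true_params = np.array([1.2, -0.5, 3.0, 0.1])
best, err = pso(lambda p: np.sum((p - true_params) ** 2), dim=4)
print(best, err)
```

In the real system the objective would be the reprojection/mosaicking error of the fisheye images, and the IPSO's dynamic parameter ranges would replace the fixed bounds used here.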
ERIC Educational Resources Information Center
Lindle, Jane Clark; Stalion, Nancy; Young, Lu
2005-01-01
Kentucky's accountability system includes a school-processes audit known as Standards and Indicators for School Improvement (SISI), which is in a nascent stage of validation. Content validity methods include comparison to instruments measuring similar constructs as well as other techniques such as job analysis. This study used a two-phase process…
ERIC Educational Resources Information Center
Spanbauer, Stanley J.
The Measurement and Costing Model (MCM) described in this book was developed and tested at Fox Valley Technical College (FVTC), Wisconsin, to enhance the college's quality improvement process and to serve as a guide to other institutions interested in improving their quality. The book presents a description of the model and outlines seven steps…
The Use of Journals To Improve Writing Skills in Commercial Spanish: State of a Research Project.
ERIC Educational Resources Information Center
Alvarez-Ruf, Hersilia
Recent writing theory, research, and pedagogy have aided in the development of several models for improving second language writing skills. Attention is focused more on writing as a process, the importance of practice in writing improvement, and the need to learn cognitive structures contributing to writing. Daily journal writing is one method of…
Background on Ammonia and EPA methods for key Ammonia (NH3) sectors in the NEI
Emissions Research for the National Emissions Inventory – 2017 NEI and Beyond Objective: Improve science of emissions sources that are associated with natural and physical processes in the environment. Include these improved emissions in the National Emissions Inventory (N...
Handlogten, Michael W; Stefanick, Jared F; Deak, Peter E; Bilgicer, Basar
2014-09-07
In a previous study, we demonstrated a non-chromatographic affinity-based precipitation method, using trivalent haptens, for the purification of mAbs. In this study, we significantly improved this process by using a simplified bivalent peptidic hapten (BPH) design, which enables facile and rapid purification of mAbs while overcoming the limitations of the previous trivalent design. The improved affinity-based precipitation method (ABP(BPH)) combines the simplicity of salt-induced precipitation with the selectivity of affinity chromatography for the purification of mAbs. The ABP(BPH) method involves 3 steps: (i) precipitation and separation of protein contaminants larger than immunoglobulins with ammonium sulfate; (ii) selective precipitation of the target-antibody via BPH by inducing antibody-complex formation; (iii) solubilization of the antibody pellet and removal of BPH with membrane filtration resulting in the pure antibody. The ABP(BPH) method was evaluated by purifying the pharmaceutical antibody trastuzumab from common contaminants including CHO cell conditioned media, DNA, ascites fluid, other antibodies, and denatured antibody with >85% yield and >97% purity. Importantly, the purified antibody demonstrated native binding activity to cell lines expressing the target protein, HER2. Combined, the ABP(BPH) method is a rapid and scalable process for the purification of antibodies with the potential to improve product quality while decreasing purification costs.
Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization
Chiu, Chung-Cheng; Ting, Chih-Chung
2016-01-01
Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature loss problems that lead to unnatural look and loss of details in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It refers to a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves the enhancement effects of VCEA. CegaHE adjusts the gaps between two gray values based on the adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. Besides, it also alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods. PMID:27338412
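The baseline that CegaHE improves on, global histogram equalization, is easy to sketch; CegaHE's gap-adjustment equation itself is not reproduced here. The low-contrast test image is fabricated for illustration.

```python
import numpy as np

def hist_equalize(img):
    """Global histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]

# Low-contrast test image: values squeezed into [100, 150].
rng = np.random.default_rng(1)
img = rng.integers(100, 151, size=(64, 64)).astype(np.uint8)
eq = hist_equalize(img)
print(img.min(), img.max(), eq.min(), eq.max())
```

Plain HE stretches the CDF to the full gray range, which produces the over-enhancement and feature loss that CegaHE's perceptually motivated gap adjustment is designed to avoid.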
Pricing of American style options with an adjoint process correction method
NASA Astrophysics Data System (ADS)
Jaekel, Uwe
2005-07-01
Pricing of American options is a more complicated problem than pricing of European options. In this work a formula is derived that allows the computation of the early exercise premium, i.e. the price difference between these two option types, in terms of an adjoint process evolving in the time direction reversed from that of the original process, which determines the evolution of the European price. We show how this equation can be utilised to improve option price estimates from numerical schemes such as finite difference or Monte Carlo methods.
Peña-Perez, Luis Manuel; Pedraza-Ortega, Jesus Carlos; Ramos-Arreguin, Juan Manuel; Arriaga, Saul Tovar; Fernandez, Marco Antonio Aceves; Becerra, Luis Omar; Hurtado, Efren Gorrostieta; Vargas-Soto, Jose Emilio
2013-10-24
This work presents an improved method to align the measurement scale mark in an immersion hydrometer calibration system at CENAM, the National Metrology Institute (NMI) of Mexico. The proposed method uses a vision system to align the scale mark of the hydrometer with the surface of the liquid in which it is immersed, by implementing image processing algorithms. This approach reduces the variability in the apparent mass determination during hydrostatic weighing in the calibration process, thereby decreasing the relative uncertainty of calibration.
Peña-Perez, Luis Manuel; Pedraza-Ortega, Jesus Carlos; Ramos-Arreguin, Juan Manuel; Arriaga, Saul Tovar; Fernandez, Marco Antonio Aceves; Becerra, Luis Omar; Hurtado, Efren Gorrostieta; Vargas-Soto, Jose Emilio
2013-01-01
This work presents an improved method to align the measurement scale mark in an immersion hydrometer calibration system at CENAM, the National Metrology Institute (NMI) of Mexico. The proposed method uses a vision system to align the scale mark of the hydrometer with the surface of the liquid in which it is immersed, by implementing image processing algorithms. This approach reduces the variability in the apparent mass determination during hydrostatic weighing in the calibration process, thereby decreasing the relative uncertainty of calibration. PMID:24284770
Experiment Research on Hot-Rolling Processing of Nonsmooth Pit Surface.
Gu, Yun-Qing; Fan, Tian-Xing; Mou, Jie-Gang; Yu, Wei-Bo; Zhao, Gang; Wang, Evan
2016-01-01
In order to achieve a nonsmooth drag-reducing surface structure on the inner polymer coating of oil and gas pipelines and improve the efficiency of pipeline transport, a structural model of a machining robot for the pipe inner coating is established. Based on the machining robot, an experimental technique is applied to investigate the embossing and coating behavior of the rolling-head, and the molding process rules under different rolling temperatures, speeds, and depths are analyzed. An orthogonal experiment analysis method is also employed to analyze the effects of the hot-rolling apparatus on the morphology of the embossed pits and the quality of rolling. The results reveal that elevating the rolling temperature or decreasing the rolling speed can improve the pit structure replication rates of the polymer coating surface, while the rolling feed has little effect on replication rates. After the rolling-head separates from the polymer coating, the coating rebounds and reflows, which limits the replication capability of the process. A continuous hot-rolling method is therefore adopted in the robot, and a dynamics analysis of the hot-rolling process of the apparatus is carried out.
NASA Astrophysics Data System (ADS)
Omega, Dousmaris; Andika, Aditya
2017-12-01
This paper discusses the results of research conducted on the production process of an Indonesian pharmaceutical company. The company is experiencing low performance on the Overall Equipment Effectiveness (OEE) metric: the OEE values of its machines are below the world-class standard, with the filler machine having the lowest OEE. Through observation and analysis, it is found that the cleaning process of the filler machine consumes a significant amount of time. The long duration of the cleaning process is caused by the lack of a structured division of jobs among the cleaning operators, differences in the operators' abilities, and the operators' inability to utilize the available cleaning equipment. The company therefore needs to improve the cleaning process. Critical Path Method (CPM) analysis is conducted to identify the critical activities so that the cleaning process can be shortened and the division of tasks simplified. Afterwards, the Maynard Operation Sequence Technique (MOST) method is used to reduce ineffective movement and specify the standard time of the cleaning process. From CPM and MOST, the shortest possible cleaning time is found to be 1 hour 28 minutes, with a standard time of 1 hour 38.826 minutes.
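The forward/backward passes of CPM can be sketched as follows; the activity names and durations are purely illustrative, not taken from the study.

```python
def critical_path(tasks):
    """Critical Path Method over a dict {task: (duration, [predecessors])},
    listed predecessor-first. Returns (project duration, critical tasks)."""
    es, ef = {}, {}                       # earliest start / finish
    def earliest(t):
        if t not in ef:
            dur, preds = tasks[t]
            es[t] = max((earliest(p) for p in preds), default=0.0)
            ef[t] = es[t] + dur
        return ef[t]
    duration = max(earliest(t) for t in tasks)
    lf = {t: duration for t in tasks}     # latest finish via backward pass
    for t in reversed(list(tasks)):       # reverse topological order
        for p in tasks[t][1]:
            lf[p] = min(lf[p], lf[t] - tasks[t][0])
    # Zero total slack marks the critical activities.
    critical = [t for t in tasks if abs(lf[t] - tasks[t][0] - es[t]) < 1e-9]
    return duration, critical

# Hypothetical filler-machine cleaning activities (durations in minutes,
# invented for illustration -- not the study's actual task breakdown).
tasks = {
    "drain":      (10, []),
    "dismantle":  (15, ["drain"]),
    "wash_parts": (40, ["dismantle"]),
    "wipe_body":  (20, ["dismantle"]),
    "reassemble": (23, ["wash_parts", "wipe_body"]),
}
total, crit = critical_path(tasks)   # total = 88 minutes; wipe_body has slack
```

Activities off the critical path (here "wipe_body") have slack and can be reassigned among operators without lengthening the cleaning, which is exactly the lever the task-division redesign exploits.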
Experiment Research on Hot-Rolling Processing of Nonsmooth Pit Surface
Gu, Yun-qing; Fan, Tian-xing; Mou, Jie-gang; Yu, Wei-bo; Zhao, Gang; Wang, Evan
2016-01-01
In order to achieve a nonsmooth drag-reducing surface structure on the inner polymer coating of oil and gas pipelines and improve the efficiency of pipeline transport, a structural model of a machining robot for the pipe inner coating is established. Based on the machining robot, an experimental technique is applied to investigate the embossing and coating behavior of the rolling-head, and the molding process rules under different rolling temperatures, speeds, and depths are analyzed. An orthogonal experiment analysis method is also employed to analyze the effects of the hot-rolling apparatus on the morphology of the embossed pits and the quality of rolling. The results reveal that elevating the rolling temperature or decreasing the rolling speed can improve the pit structure replication rates of the polymer coating surface, while the rolling feed has little effect on replication rates. After the rolling-head separates from the polymer coating, the coating rebounds and reflows, which limits the replication capability of the process. A continuous hot-rolling method is therefore adopted in the robot, and a dynamics analysis of the hot-rolling process of the apparatus is carried out. PMID:27022235
Using human rights for sexual and reproductive health: improving legal and regulatory frameworks
Kismodi, Eszter; Hilber, Adriane Martin; Lincetto, Ornella; Stahlhofer, Marcus; Gruskin, Sofia
2010-01-01
Abstract This paper describes the development of a tool that uses human rights concepts and methods to improve relevant laws, regulations and policies related to sexual and reproductive health. This tool aims to improve awareness and understanding of States’ human rights obligations. It includes a method for systematically examining the status of vulnerable groups, involving non-health sectors, fostering a genuine process of civil society participation and developing recommendations to address regulatory and policy barriers to sexual and reproductive health with a clear assignment of responsibility. Strong leadership from the ministry of health, with support from the World Health Organization or other international partners, and the serious engagement of all involved in this process can strengthen the links between human rights and sexual and reproductive health, and contribute to national achievement of the highest attainable standard of health. PMID:20616975
Everly, Marcee C
2013-02-01
To report the transformation from lecture to more active learning methods in a maternity nursing course and to evaluate whether student perception of improved learning through active-learning methods is supported by improved test scores. The process of transforming a course into an active-learning model of teaching is described. A voluntary mid-semester survey for student acceptance of the new teaching method was conducted. Course examination results, from both a standardized exam and a cumulative final exam, among students who received lecture in the classroom and students who had active learning activities in the classroom were compared. Active learning activities were very acceptable to students. The majority of students reported learning more from having active-learning activities in the classroom rather than lecture-only and this belief was supported by improved test scores. Students who had active learning activities in the classroom scored significantly higher on a standardized assessment test than students who received lecture only. The findings support the use of student reflection to evaluate the effectiveness of active-learning methods and help validate the use of student reflection of improved learning in other research projects. Copyright © 2011 Elsevier Ltd. All rights reserved.
Gounaridis, Lefteris; Groumas, Panos; Schreuder, Erik; Heideman, Rene; Avramopoulos, Hercules; Kouloumentas, Christos
2016-04-04
It is still a common belief that ultra-high quality factors (Q-factors) are a prerequisite in optical resonant cavities for high refractive index resolution and a low detection limit in biosensing applications. In combination with the ultra-short steps that are necessary when the measurement of the resonance shift relies on the wavelength scanning of a laser source and conventional methods for data processing, the high Q-factor requirement makes these biosensors extremely impractical. In this work we analyze an alternative processing method based on the fast Fourier transform, and show through Monte-Carlo simulations that an improvement by 2-3 orders of magnitude can be achieved in the resolution and the detection limit of the system in the presence of amplitude and spectral noise. More significantly, this improvement is maximum for low Q-factors around 10^4 and persists for high intra-cavity losses and large scanning steps, making the designs compatible with the low-cost aspect of lab-on-a-chip technology. Using a micro-ring resonator as the model cavity and a system design with low Q-factor (10^4), low amplitude transmission (0.85) and a relatively large scanning step (0.25 pm), we show that a resolution close to 0.01 pm and a detection limit close to 10^-7 RIU can be achieved, improving the sensing performance by more than 2 orders of magnitude compared to systems relying on a simple peak-search processing method. The improvement in the limit of detection is present even when the simple method is combined with ultra-high Q-factors and ultra-short scanning steps, due to the trade-off between system resolution and sensitivity. Early experimental results are in agreement with the trends of the numerical studies.
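The abstract does not detail the exact FFT-based algorithm; the following toy sketch illustrates one frequency-domain approach of the same flavor, cross-correlation with parabolic sub-sample interpolation, recovering a resonance shift from a noisy, broad (low-Q) Lorentzian dip scanned with a 0.25 pm step. All parameters (linewidth, noise level, shift) are illustrative.

```python
import numpy as np

def spectral_shift(ref, shifted, step_pm):
    """Estimate the resonance shift between two scans by FFT cross-correlation
    with parabolic sub-sample interpolation around the correlation peak."""
    n = len(ref)
    # Cross-correlation computed in the frequency domain (circular).
    R = np.fft.fft(ref - ref.mean()) * np.conj(np.fft.fft(shifted - shifted.mean()))
    xc = np.real(np.fft.ifft(R))
    k = int(np.argmax(xc))
    # Parabola through the three samples around the peak for sub-step accuracy.
    y0, y1, y2 = xc[k - 1], xc[k], xc[(k + 1) % n]
    frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    frac = max(-0.5, min(0.5, frac))          # guard against noisy curvature
    lag = ((k + frac + n / 2) % n) - n / 2    # wrap to a signed lag
    return -lag * step_pm

# Synthetic broad Lorentzian dip scanned with a 0.25 pm step.
wl = np.arange(-200, 200) * 0.25              # pm around resonance
def dip(center):
    return 1 - 0.8 / (1 + ((wl - center) / 20.0) ** 2)
rng = np.random.default_rng(1)
ref = dip(0.0) + rng.normal(0, 0.01, wl.size)
mov = dip(1.7) + rng.normal(0, 0.01, wl.size)
est = spectral_shift(ref, mov, 0.25)          # estimate of the 1.7 pm shift
```

Because the whole line shape contributes to the estimate rather than a single noisy extremum, a broad dip sampled with a coarse step can still yield a usable shift estimate, the qualitative point the paper quantifies.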
NASA Astrophysics Data System (ADS)
Kuntoro, Hadiyan Yusuf; Hudaya, Akhmad Zidni; Dinaryanto, Okto; Majid, Akmal Irfan; Deendarlianto
2016-06-01
Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed various methods and techniques to study two-phase flow phenomena in industrial settings, such as in the chemical, petroleum, and nuclear industries. One of these developing techniques is image processing. This technique is widely used in two-phase flow research due to its non-intrusive capability to process large amounts of visualization data containing many complexities. Moreover, it allows capturing direct visual information about the flow that is difficult to obtain with other methods and techniques. The main objective of this paper is to present an improved image processing algorithm, building on a preceding algorithm, for stratified flow cases. The present algorithm can measure the film thickness (hL) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also aimed at developing a high-quality database of stratified flow, which is currently scanty. In the present work, the measurement results show satisfactory agreement with previous works.
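A much-simplified sketch of the kind of measurement involved (not the authors' algorithm): once a side-view frame is binarized into liquid and gas phases, a per-column film thickness hL follows from counting liquid pixels in each image column. The synthetic frame and pixel scale below are invented for illustration.

```python
import numpy as np

def film_thickness_profile(frame, mm_per_px, threshold=0.5):
    """Per-column liquid film thickness from a side-view frame.
    frame: 2D float array, rows = vertical axis (row 0 at the top),
    pixels brighter than `threshold` are taken as the liquid phase."""
    liquid = frame > threshold
    # Thickness = number of liquid pixels in each column times the pixel scale.
    return liquid.sum(axis=0) * mm_per_px

# Synthetic stratified frame: liquid below a wavy interface.
h, w = 80, 100
rows = np.arange(h)[:, None]
interface = 50 + 5 * np.sin(2 * np.pi * np.arange(w) / 50)  # interface row
frame = (rows > interface).astype(float)
hL = film_thickness_profile(frame, mm_per_px=0.1)
```

The wave geometry (amplitude, wavelength) then follows from the spatial variation of the hL profile, which is why a fast per-column thickness measurement is the core of such algorithms.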
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shoaf, S.; APS Engineering Support Division
A real-time image analysis system was developed for beam imaging diagnostics. An Apple Power Mac G5 with an Active Silicon LFG frame grabber was used to capture video images that were processed and analyzed. Software routines were created to utilize vector-processing hardware to reduce the time to process images as compared to conventional methods. These improvements allow for more advanced image processing diagnostics to be performed in real time.
Multi-pose facial correction based on Gaussian process with combined kernel function
NASA Astrophysics Data System (ADS)
Shi, Shuyan; Ji, Ruirui; Zhang, Fan
2018-04-01
In order to improve the recognition rate across various postures, this paper proposes a facial correction method based on a Gaussian process, which builds a nonlinear regression model between the frontal and side views of the face using a combined kernel function. Face images with horizontal angles from -45° to +45° can be properly corrected to frontal faces. Finally, a Support Vector Machine is employed for face recognition. Experiments on the CAS-PEAL-R1 face database show that the Gaussian process can weaken the influence of pose changes and improve the accuracy of face recognition to a certain extent.
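A minimal numpy sketch of Gaussian-process regression with a combined (linear + RBF) kernel, the core ingredient described above. The one-dimensional toy mapping and all hyperparameters are stand-ins for the actual side-to-front feature regression, which the abstract does not specify.

```python
import numpy as np

def combined_kernel(A, B, w_lin=0.5, w_rbf=0.5, length=1.0):
    """Combined kernel: weighted sum of a linear and an RBF (Gaussian) term,
    mixing a global linear trend with local nonlinear structure."""
    lin = A @ B.T
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return w_lin * lin + w_rbf * np.exp(-sq / (2 * length**2))

def gp_predict(X_train, Y_train, X_test, noise=1e-3):
    """Posterior mean of GP regression with the combined kernel."""
    K = combined_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = combined_kernel(X_test, X_train)
    return Ks @ np.linalg.solve(K, Y_train)

# Toy stand-in for the side-to-front mapping: a feature of the rotated face (x)
# regressed onto the frontal feature (y); here y = sin(x) + 0.5 x.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(40, 1))
Y = np.sin(X[:, 0]) + 0.5 * X[:, 0]
Xt = np.array([[0.0], [1.0]])
pred = gp_predict(X, Y, Xt)
```

Summing kernels keeps the Gram matrix positive semi-definite, so the linear term can capture the roughly linear part of the pose-to-frontal mapping while the RBF term absorbs the nonlinear residual.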
USDA-ARS?s Scientific Manuscript database
Infrared (IR) radiation heating has been considered as an alternative to current food and agricultural processing methods for improving product quality and safety, increasing energy and processing efficiency, and reducing water and chemical usage. As part of the electromagnetic spectrum, IR has the ...
ERIC Educational Resources Information Center
Vila, Montserrat; Pallisera, Maria; Fullana, Judit
2007-01-01
Background: It is important to ensure that regular processes of labour market integration are available for all citizens. Method: Thematic content analysis techniques, using semi-structured group interviews, were used to identify the principal elements contributing to the processes of integrating people with disabilities into the regular labour…
A strategic planning methodology for aircraft redesign
NASA Astrophysics Data System (ADS)
Romli, Fairuz Izzuddin
Due to a progressive market shift to a customer-driven environment, the influence of engineering changes on the product's market success is becoming more prominent. This situation affects many long lead-time product industries, including aircraft manufacturing. Derivative development has been the key strategy for many aircraft manufacturers to survive the competitive market, and this trend is expected to continue in the future. Within this environment of design adaptation and variation, the main market advantages are often gained by the aircraft manufacturers that are fastest to develop and produce their range of market offerings without any costly mistakes. This realization creates an emphasis on the efficiency of the redesign process, particularly on the handling of engineering changes. However, most activities involved in the redesign process are supported either inefficiently or not at all by the current design methods and tools, primarily because they have been mostly developed to improve original product development. In view of this, the main goal of this research is to propose an aircraft redesign methodology that will act as a decision-making aid for aircraft designers in the change implementation planning of derivative developments. The proposed method, known as Strategic Planning of Engineering Changes (SPEC), combines the key elements of the product redesign planning and change management processes. Its application is aimed at reducing the redesign risks of derivative aircraft development, improving the detection of possible change effects propagation, increasing the efficiency of the change implementation planning and also reducing the costs and the time delays due to the redesign process. To address these challenges, four research areas have been identified: baseline assessment, change propagation prediction, change impact analysis and change implementation planning.
Based on the established requirements for the redesign planning process, several methods and tools that are identified within these research areas have been abstracted and adapted into the proposed SPEC method to meet the research goals. The proposed SPEC method is shown to be promising in improving the overall efficiency of the derivative aircraft planning process through two notional aircraft system redesign case studies that are presented in this study.
Automated Fall Detection With Quality Improvement “Rewind” to Reduce Falls in Hospital Rooms
Rantz, Marilyn J.; Banerjee, Tanvi S.; Cattoor, Erin; Scott, Susan D.; Skubic, Marjorie; Popescu, Mihail
2014-01-01
The purpose of this study was to test the implementation of a fall detection and “rewind” privacy-protecting technique using the Microsoft® Kinect™ to not only detect but prevent falls from occurring in hospitalized patients. Kinect sensors were placed in six hospital rooms in a step-down unit and data were continuously logged. Prior to implementation with patients, three researchers performed a total of 18 falls (walking and then falling down or falling from the bed) and 17 non-fall events (crouching down, stooping down to tie shoe laces, and lying on the floor). All falls and non-falls were correctly identified using automated algorithms to process Kinect sensor data. During the first 8 months of data collection, processing methods were perfected to manage data and provide a “rewind” method to view events that led to falls for post-fall quality improvement process analyses. Preliminary data from this feasibility study show that using the Microsoft Kinect sensors provides detection of falls, fall risks, and facilitates quality improvement after falls in real hospital environments unobtrusively, while taking into account patient privacy. PMID:24296567
Integrating artificial and human intelligence into tablet production process.
Gams, Matjaž; Horvat, Matej; Ožek, Matej; Luštrek, Mitja; Gradišek, Anton
2014-12-01
We developed a new machine learning-based method to facilitate the manufacturing of pharmaceutical products, such as tablets, in accordance with the Process Analytical Technology (PAT) and Quality by Design (QbD) initiatives. Our approach combines the data available from prior production runs with machine learning algorithms that are assisted by a human operator with expert knowledge of the production process. The process parameters encompass those that relate to the attributes of the precursor raw materials and those that relate to the manufacturing process itself. During manufacturing, our method allows the production operator to inspect the impacts of various settings of process parameters within their proven acceptable range, with the purpose of choosing the most promising values in advance of the actual batch manufacture. The interaction between the human operator and the artificial intelligence system provides improved performance and quality. We successfully implemented the method on data provided by a pharmaceutical company for a particular product, a tablet, under development. We tested the accuracy of the method in comparison with some other machine learning approaches. The method is especially suitable for analyzing manufacturing processes characterized by a limited amount of data.
Method and means for producing fluorocarbon finishes on fibrous structures
NASA Technical Reports Server (NTRS)
Toy, Madeline S. (Inventor); Stringham, Roger S. (Inventor); Fogg, Lawrence C. (Inventor)
1981-01-01
An improved process and apparatus is provided for imparting chemically bonded fluorocarbon finishes to textiles. In the process, the textiles are contacted with a gaseous mixture of fluoroolefins in an inert diluent gas in the presence of ultraviolet light under predetermined conditions.
How current ginning processes affect fiber length uniformity index
USDA-ARS?s Scientific Manuscript database
There is a need to develop cotton ginning methods that improve fiber characteristics that are compatible with the newer and more efficient spinning technologies. A literature search produced recent studies that described how current ginning processes affect HVI fiber length uniformity index. Resul...
Process Simulation of Aluminium Sheet Metal Deep Drawing at Elevated Temperatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winklhofer, Johannes; Trattnig, Gernot; Lind, Christoph
Lightweight design is essential for an economic and environmentally friendly vehicle. Aluminium sheet metal is well known for its ability to improve the strength-to-weight ratio of lightweight structures. One disadvantage of aluminium is that it is less formable than steel. Therefore complex part geometries can only be realized by expensive multi-step production processes. One method for overcoming this disadvantage is deep drawing at elevated temperatures. In this way the formability of aluminium sheet metal can be improved significantly, and the number of necessary production steps can thereby be reduced. This paper introduces deep drawing of aluminium sheet metal at elevated temperatures, a corresponding simulation method, a characteristic process and its optimization. The temperature and strain rate dependent material properties of a 5xxx series alloy and their modelling are discussed. A three-dimensional thermomechanically coupled finite element deep drawing simulation model and its validation are presented. Based on the validated simulation model, an optimised process strategy regarding formability, time and cost is introduced.
Wang, Wenqiang
2018-01-01
To exploit the adsorption capacity of commercial powdered activated carbon (PAC) and to improve the efficiency of Cr(VI) removal from aqueous solutions, the adsorption of Cr(VI) by commercial PAC and a countercurrent two-stage adsorption (CTA) process was investigated. Different adsorption kinetics models and isotherms were compared, and the pseudo-second-order model and the Langmuir and Freundlich models fit the experimental data well. The Cr(VI) removal efficiency was >80% and was improved by 37% through the CTA process compared with the conventional single-stage adsorption process when the initial Cr(VI) concentration was 50 mg/L with a PAC dose of 1.250 g/L and a pH of 3. A method for calculating the effluent Cr(VI) concentration and the required PAC dose was developed for the CTA process, and its validity was confirmed by a deviation of <5%. Copyright © 2017. Published by Elsevier Ltd.
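A sketch of the stage-wise mass balances behind such a countercurrent two-stage calculation, using a hypothetical Langmuir isotherm. The isotherm parameters are invented, so the resulting removal figures will not reproduce the paper's numbers; the point is the structure of the calculation, in which fresh carbon meets the cleanest water and partly loaded carbon then treats the feed.

```python
def langmuir(c, qm=45.0, b=0.15):
    """Hypothetical Langmuir isotherm: loading q (mg/g) vs concentration c (mg/L)."""
    return qm * b * c / (1.0 + b * c)

def bisect(g, lo, hi, iters=200):
    """Simple bisection; assumes g(lo) and g(hi) bracket a root."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

C0, m = 50.0, 1.25   # feed Cr(VI) (mg/L) and PAC dose (g/L), 1 L liquid basis

# Conventional single-stage equilibrium: C0 - C = m * q(C)
c_single = bisect(lambda c: (C0 - c) - m * langmuir(c), 1e-9, C0)

# Countercurrent two-stage: liquid flows stage 1 -> 2, carbon flows 2 -> 1.
# Stage 2 (fresh carbon):   C1 - C2 = m * q(C2)
# Stage 1 (loaded carbon):  C0 - C1 = m * (q(C1) - q(C2))
def residual(c2):
    c1 = c2 + m * langmuir(c2)            # from the stage-2 balance
    return (C0 - c1) - m * (langmuir(c1) - langmuir(c2))

c_cta = bisect(residual, 1e-9, C0)
removal_single = 1.0 - c_single / C0
removal_cta = 1.0 - c_cta / C0            # countercurrent removal is higher
```

The carbon leaves stage 1 in equilibrium with the higher concentration C1 rather than with the effluent, which is why the same dose achieves more removal in countercurrent operation, the effect the paper quantifies experimentally.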
Super-fine rice-flour production by enzymatic treatment with high hydrostatic pressure processing
NASA Astrophysics Data System (ADS)
Kido, Miyuki; Kobayashi, Kaneto; Chino, Shuji; Nishiwaki, Toshikazu; Homma, Noriyuki; Hayashi, Mayumi; Yamamoto, Kazutaka; Shigematsu, Toru
2013-06-01
In response to the recent expansion of rice-flour use, we established a new rice-flour manufacturing process by applying high hydrostatic pressure (HP) to the enzyme-treated milling method. HP improved both the activity of pectinase, which is used in the enzyme-treated milling method, and the water absorption capacity of rice grains. These results indicate enhanced disruption of the rice grain tissue structures. In contrast, HP suppressed the increase in glucose, which may have led to less starch damage. The manufacturing process was optimized to HP treatment at 200 MPa (40°C) for 1 h and subsequent wet-pulverization at 11,000 rpm. Using this process, rice flour with an exceptionally fine mean particle size of less than 20 μm and starch damage of less than 5% was obtained from rice grains soaked in an enzyme solution and distilled water. This super-fine rice flour is suitable for bread, pasta, noodles and Western-style sweets.
Stapleton, F Bruder; Hendricks, James; Hagan, Patrick; DelBeccaro, Mark
2009-08-01
The Toyota Production System (TPS) has become a successful model for improving efficiency and eliminating errors in manufacturing processes. In an effort to provide patients and families with the highest quality clinical care, our academic children's hospital has modified the techniques of the TPS for a program in continuous performance improvement (CPI) and has expanded its application to educational and research programs. Over a period of years, physicians, nurses, residents, administrators, and hospital staff have become actively engaged in a culture of continuous performance improvement. This article provides background into the methods of CPI and describes examples of how we have applied these methods for improvement in clinical care, resident teaching, and research administration.