The high-order decoupled direct method in three dimensions for particulate matter (HDDM-3D/PM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to enable advanced sensitivity analysis. The major effort of this work is to develop high-order DDM sensitivity...
Air Quality Management Alternatives: United States Air Force Firefighter Training Facilities
1988-01-01
A methodology utilizing questionnaires, interviews, and site visits is developed and applied. This method enabled fire prevention and environmental management experts and professionals to provide data, opinions, and to...
Improvement of the edge method for on-orbit MTF measurement.
Viallefont-Robinet, Françoise; Léger, Dominique
2010-02-15
The edge method is a widely used way to assess the on-orbit Modulation Transfer Function (MTF). Since a good-quality edge is required, the higher the spatial resolution, the better the results: at high resolution, an artificial target can be built and used to guarantee edge quality. For moderate spatial resolutions, only natural targets are available, so the edge quality is unknown and generally rather poor. Improvements of the method have been investigated to compensate for the poor quality of natural edges. This has been done through the use of symmetry and/or a transfer function model, which enables noise to be eliminated. The same approach has also been applied to artificial targets; in this case, the model overcomes incomplete sampling when the target is too small, or provides an opportunity to assess the defocus of the sensor. This paper begins with a review of the method, followed by a presentation of the changes, which rely on a parametric transfer-function model. The transfer function model and the corresponding processing are described. Applications of these changes to several satellites of the French space agency are presented: for SPOT 1, the method enables the XS MTF to be assessed with natural edges; for SPOT 5, it enables the Salon-de-Provence artificial target to be used for MTF assessment in the HM mode; and for the forthcoming Pleiades, it enables the defocus to be estimated.
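Where a reader wants the mechanics of the baseline edge method, a minimal numpy sketch follows: recover the line spread function (LSF) by differentiating the oversampled edge spread function (ESF), then take the magnitude of its Fourier transform. The parametric transfer-function model that this paper fits on top of the baseline is not shown, and the synthetic edge and function names are illustrative only.

```python
# Minimal sketch of the classical edge-method pipeline (ESF -> LSF -> MTF),
# assuming an oversampled ESF has already been extracted from the image.
import numpy as np

def mtf_from_esf(esf: np.ndarray, oversampling: int = 4):
    """Differentiate the ESF into an LSF, then take the magnitude of its
    Fourier transform, normalized to 1 at zero frequency."""
    lsf = np.gradient(esf)                      # ESF -> LSF
    lsf = lsf * np.hanning(lsf.size)            # window to limit noise leakage
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                               # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(lsf.size, d=1.0 / oversampling)  # cycles/pixel
    return freqs, mtf

# Example: a synthetic blurred edge sampled at 4x oversampling
x = np.linspace(-8, 8, 129)
esf = 0.5 * (1 + np.tanh(x / 1.5))              # smooth synthetic edge
freqs, mtf = mtf_from_esf(esf)
print(f"MTF at Nyquist (0.5 cy/px): {np.interp(0.5, freqs, mtf):.3f}")
```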
Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald
2017-01-01
ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool, which supports its users in generating computable queries from a patient data model that can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that had been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
Design of k-Space Channel Combination Kernels and Integration with Parallel Imaging
Beatty, Philip J.; Chang, Shaorong; Holmes, James H.; Wang, Kang; Brau, Anja C. S.; Reeder, Scott B.; Brittain, Jean H.
2014-01-01
Purpose: In this work, a new method is described for producing local k-space channel combination kernels using a small amount of low-resolution multichannel calibration data. Additionally, this work describes how these channel combination kernels can be combined with local k-space unaliasing kernels produced by the calibration phase of parallel imaging methods such as GRAPPA, PARS, and ARC. Methods: Experiments were conducted to evaluate both the image quality and computational efficiency of the proposed method compared to a channel-by-channel parallel imaging approach with image-space sum-of-squares channel combination. Results: Results indicate comparable image quality overall, with some very minor differences seen in reduced field-of-view imaging. It was demonstrated that this method enables a speed-up in computation time on the order of 3–16X for 32-channel data sets. Conclusion: The proposed method enables high-quality channel combination to occur earlier in the reconstruction pipeline, reducing computational and memory requirements for image reconstruction. PMID:23943602
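As a rough illustration of the general idea (not the paper's algorithm), channel combination can happen directly in k-space by convolving each channel with a small combination kernel and summing, instead of transforming every channel to image space and combining by sum-of-squares. The sketch below assumes the kernels have already been produced from low-resolution calibration data; all names and values are hypothetical.

```python
# Hedged sketch: combine multichannel k-space data with per-channel kernels,
# versus the image-space sum-of-squares baseline. Kernel values below are toy
# placeholders; real kernels come from low-resolution calibration data.
import numpy as np
from scipy.signal import fftconvolve

def combine_channels_kspace(kspace: np.ndarray, kernels: np.ndarray) -> np.ndarray:
    """kspace: (ncoil, ky, kx) complex data; kernels: (ncoil, kh, kw).
    Convolve each channel with its kernel in k-space and sum the channels."""
    return sum(fftconvolve(kspace[c], kernels[c], mode="same")
               for c in range(kspace.shape[0]))

def sum_of_squares_image(kspace: np.ndarray) -> np.ndarray:
    """Baseline: per-channel inverse FFT, then sum-of-squares in image space."""
    imgs = np.fft.ifft2(kspace, axes=(-2, -1))
    return np.sqrt((np.abs(imgs) ** 2).sum(axis=0))

rng = np.random.default_rng(0)
kspace = rng.standard_normal((8, 64, 64)) + 1j * rng.standard_normal((8, 64, 64))
kernels = np.full((8, 5, 5), 1.0 / (8 * 25), dtype=complex)  # toy kernels
combined = combine_channels_kspace(kspace, kernels)          # single k-space
```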
The U. S. EPA's Framework for Developing Suspended and Bedded Sediments (SABS) Water Quality Criteria (SABS Framework) provides a consistent process, technical methods, and supporting materials to enable resource managers to develop ambient water quality criteria for one of the m...
An objective method for a video quality evaluation in a 3DTV service
NASA Astrophysics Data System (ADS)
Wilczewski, Grzegorz
2015-09-01
The following article describes a proposed objective method for 3DTV video quality evaluation, the Compressed Average Image Intensity (CAII) method. Identification of the 3DTV service's content chain nodes enables the design of a versatile, objective video quality metric based on an advanced approach to stereoscopic videostream analysis. Insights into the designed metric's mechanisms, as well as an evaluation of its performance under simulated environmental conditions, are discussed herein. As a result, the created CAII metric might be effectively used in a variety of service quality assessment applications.
Modern Methods of Rail Welding
NASA Astrophysics Data System (ADS)
Kozyrev, Nikolay A.; Kozyreva, Olga A.; Usoltsev, Aleksander A.; Kryukov, Roman E.; Shevchenko, Roman A.
2017-10-01
Existing methods of rail welding that enable production of continuous welded rail track are reviewed in this article. Analysis of the existing welding methods allows the issue of continuous rail track to be considered in detail. Metallurgical and welding technologies of rail welding, and also process technologies reducing the aftereffects of temperature exposure, are important factors determining the quality and reliability of the continuous rail track. Analysis of the existing methods of rail welding enables the research direction for solving this problem to be identified.
Sol-gel processing with inorganic metal salt precursors
Hu, Zhong-Cheng
2004-10-19
Methods for sol-gel processing that generally involve mixing together an inorganic metal salt, water, and a water-miscible alcohol or other organic solvent at room temperature, with a macromolecular dispersant material such as hydroxypropyl cellulose (HPC) added. The resulting homogeneous solution is incubated at a desired temperature for a desired time to yield the desired product. The methods enable production of high-quality sols and gels at lower temperatures than standard methods, and production of nanosize sols directly from inorganic metal salts.
Using Electronic Messaging to Improve the Quality of Instruction.
ERIC Educational Resources Information Center
Zack, Michael H.
1995-01-01
Qualitative and quantitative data from business students using electronic mail and computer conferencing showed these methods enabled the instructor to be more accessible and responsive; greater class cohesion developed, and perceived quality of the course and instructor effectiveness increased. (SK)
Nakanishi, Rine; Sankaran, Sethuraman; Grady, Leo; Malpeso, Jenifer; Yousfi, Razik; Osawa, Kazuhiro; Ceponiene, Indre; Nazarat, Negin; Rahmani, Sina; Kissel, Kendall; Jayawardena, Eranthi; Dailing, Christopher; Zarins, Christopher; Koo, Bon-Kwon; Min, James K; Taylor, Charles A; Budoff, Matthew J
2018-03-23
Our goal was to evaluate the efficacy of a fully automated method for assessing the image quality (IQ) of coronary computed tomography angiography (CCTA). The machine learning method was trained using 75 CCTA studies by mapping features (noise, contrast, misregistration scores, and un-interpretability index) to an IQ score based on manual ground truth data. The automated method was validated on a set of 50 CCTA studies and subsequently tested on a new set of 172 CCTA studies against visual IQ scores on a 5-point Likert scale. The area under the curve in the validation set was 0.96. In the 172 CCTA studies, our method yielded a Cohen's kappa statistic for the agreement between automated and visual IQ assessment of 0.67 (p < 0.01). In the group where good to excellent (n = 163), fair (n = 6), and poor visual IQ scores (n = 3) were graded, 155, 5, and 2 of the patients received an automated IQ score > 50 %, respectively. Fully automated assessment of the IQ of CCTA data sets by machine learning was reproducible and provided similar results compared with visual analysis within the limits of inter-operator variability. • The proposed method enables automated and reproducible image quality assessment. • Machine learning and visual assessments yielded comparable estimates of image quality. • Automated assessment potentially allows for more standardised image quality. • Image quality assessment enables standardization of clinical trial results across different datasets.
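A hedged sketch of the overall approach, with synthetic data standing in for real studies and a generic regressor standing in for the paper's trained model: map the four named quality features to an image-quality score learned from ground-truth labels.

```python
# Hedged sketch (synthetic data; generic regressor): learn a mapping from the
# four quality features named above to an image-quality (IQ) score.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# feature columns: noise, contrast, misregistration score, un-interpretability
X_train = rng.random((75, 4))            # 75 training studies, as in the paper
y_train = rng.random(75) * 100           # manual ground-truth IQ scores (0-100)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

X_test = rng.random((172, 4))            # 172 test studies
iq = model.predict(X_test)
print(f"{(iq > 50).sum()} of {iq.size} studies received an IQ score > 50%")
```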
Romero, Peggy; Miller, Ted; Garakani, Arman
2009-12-01
Current methods to assess neurodegeneration in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase sensitivity for detecting early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications with decreased costs.
Improving the quality of parameter estimates obtained from slug tests
Butler, J.J.; McElwee, C.D.; Liu, W.
1996-01-01
The slug test is one of the most commonly used field methods for obtaining in situ estimates of hydraulic conductivity. Despite its prevalence, this method has received criticism from many quarters in the ground-water community. This criticism emphasizes the poor quality of the estimated parameters, a condition that is primarily a product of the somewhat casual approach that is often employed in slug tests. Recently, the Kansas Geological Survey (KGS) has pursued research directed at improving methods for the performance and analysis of slug tests. Based on extensive theoretical and field research, a series of guidelines have been proposed that should enable the quality of parameter estimates to be improved. The most significant of these guidelines are: (1) three or more slug tests should be performed at each well during a given test period; (2) two or more different initial displacements (Ho) should be used at each well during a test period; (3) the method used to initiate a test should enable the slug to be introduced in a near-instantaneous manner and should allow a good estimate of Ho to be obtained; (4) data-acquisition equipment that enables a large quantity of high-quality data to be collected should be employed; (5) if an estimate of the storage parameter is needed, an observation well other than the test well should be employed; (6) the method chosen for analysis of the slug-test data should be appropriate for site conditions; (7) use of pre- and post-analysis plots should be an integral component of the analysis procedure; and (8) appropriate well construction parameters should be employed. Data from slug tests performed at a number of KGS field sites demonstrate the importance of these guidelines.
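For readers unfamiliar with slug-test analysis, the sketch below shows one standard analysis route (the Hvorslev method), assuming a simple well geometry; it is illustrative only and not the KGS procedure. Head-recovery data are fit on a semilog scale to obtain the basic time lag, from which hydraulic conductivity K follows.

```python
# Illustrative Hvorslev-method sketch; all geometry values are hypothetical.
import numpy as np

def hvorslev_k(t: np.ndarray, h: np.ndarray, h0: float,
               r_c: float, R: float, L_e: float) -> float:
    """Fit ln(h/h0) vs t to get the basic time lag T0, then apply
    K = r_c^2 * ln(L_e/R) / (2 * L_e * T0)."""
    slope = np.polyfit(t, np.log(h / h0), 1)[0]   # ln(h/h0) = -t/T0
    T0 = -1.0 / slope                             # basic time lag [s]
    return r_c**2 * np.log(L_e / R) / (2.0 * L_e * T0)

t = np.array([5.0, 10, 20, 40, 80])               # time since initiation, s
h = 0.5 * np.exp(-t / 30.0)                       # synthetic head data, m
K = hvorslev_k(t, h, h0=0.5, r_c=0.05, R=0.05, L_e=1.5)
print(f"K = {K:.2e} m/s")
```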
Spectrum analysis on quality requirements consideration in software design documents.
Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji
2013-12-01
Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
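A toy sketch of what a "spectrum" of quality-requirement considerations might look like in code, assuming each quality characteristic is represented by a keyword list (the lists and documents below are invented): count the terms per characteristic, normalize to a vector, and compare the requirements and design spectra.

```python
# Toy spectrum sketch: normalized keyword counts per quality characteristic,
# compared between a requirements text and a design text. Keyword lists are
# invented for illustration; the paper's technique is more elaborate.
import numpy as np

QUALITY_TERMS = {
    "security":    ["encrypt", "authentication", "access control"],
    "performance": ["response time", "throughput", "latency"],
    "reliability": ["failover", "recovery", "availability"],
}

def spectrum(text: str) -> np.ndarray:
    text = text.lower()
    counts = np.array([sum(text.count(term) for term in terms)
                       for terms in QUALITY_TERMS.values()], dtype=float)
    total = counts.sum()
    return counts / total if total else counts

req = "The system shall encrypt data and meet a response time below 100 ms."
design = "The gateway performs authentication; failover uses a hot standby."
s_req, s_des = spectrum(req), spectrum(design)
# cosine similarity: how well the design covers the requirements' spectrum
cos = s_req @ s_des / (np.linalg.norm(s_req) * np.linalg.norm(s_des))
print(dict(zip(QUALITY_TERMS, s_req.round(2))), f"similarity={cos:.2f}")
```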
Microbiological Quality and Food Safety of Plants Grown on ISS Project
NASA Technical Reports Server (NTRS)
Wheeler, Raymond M. (Compiler)
2014-01-01
The goal of this project is to select and advance methods to enable real-time sampling, microbiological analysis, and sanitation of crops grown on the International Space Station (ISS). These methods would validate the microbiological quality of crops grown for consumption to ensure safe and palatable fresh foods. This would be achieved through the development / advancement of microbiological sample collection, rapid pathogen detection and effective sanitation methods that are compatible with a microgravity environment.
An Automated Blur Detection Method for Histological Whole Slide Imaging
Moles Lopez, Xavier; D'Andrea, Etienne; Barbot, Paul; Bridoux, Anne-Sophie; Rorive, Sandrine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine
2013-01-01
Whole slide scanners are novel devices that enable high-resolution imaging of an entire histological slide. Furthermore, the imaging is achieved in only a few minutes, which enables image rendering of large-scale studies involving multiple immunohistochemistry biomarkers. Although whole slide imaging has improved considerably, locally poor focusing causes blurred regions of the image. These artifacts may strongly affect the quality of subsequent analyses, making a slide review process mandatory. This tedious and time-consuming task requires the scanner operator to carefully assess the virtual slide and to manually select new focus points. We propose a statistical learning method that provides early image quality feedback and automatically identifies regions of the image that require additional focus points. PMID:24349343
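As a hedged stand-in for the paper's trained statistical model, the sketch below flags candidate blurred tiles with a simple focus measure (variance of the Laplacian); tile size and threshold are illustrative.

```python
# Hedged sketch: tile-wise blur flagging with a variance-of-Laplacian focus
# measure. The paper trains a statistical classifier; this simple threshold
# merely illustrates the tile-scanning idea.
import numpy as np
from scipy.ndimage import laplace

def blurred_tiles(slide: np.ndarray, tile: int = 256, thresh: float = 50.0):
    """Yield (row, col) of tiles whose Laplacian variance falls below a
    threshold, i.e., candidate regions needing additional focus points."""
    h, w = slide.shape
    for r in range(0, h - tile + 1, tile):
        for c in range(0, w - tile + 1, tile):
            patch = slide[r:r + tile, c:c + tile].astype(float)
            if laplace(patch).var() < thresh:
                yield r, c

slide = np.random.default_rng(1).random((1024, 1024)) * 255  # stand-in image
print(list(blurred_tiles(slide))[:5])
```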
Human Connectome Project Informatics: quality control, database services, and data visualization
Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.
2013-01-01
The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591
Automated image quality assessment for chest CT scans.
Reeves, Anthony P; Xie, Yiting; Liu, Shuang
2018-02-01
Medical image quality needs to be maintained at standards sufficient for effective clinical reading. Automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans and results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between different scanners and protocols. Automated image quality assessment may be useful for quality control for lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.
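A minimal sketch of the two measurements the method rests on, assuming the three homogeneous regions have already been segmented: noise as the HU standard deviation within a region, and calibration error as the offset of the region's mean HU from its expected value. Masks, names, and values below are illustrative.

```python
# Minimal sketch: noise and calibration measurements inside pre-segmented
# homogeneous regions. Expected HU values and masks are assumptions.
import numpy as np

EXPECTED_HU = {"external_air": -1000.0, "trachea_air": -1000.0, "aorta_blood": 40.0}

def region_quality(scan_hu: np.ndarray, masks: dict) -> dict:
    report = {}
    for name, mask in masks.items():
        vals = scan_hu[mask]
        report[name] = {
            "noise_sd_hu": float(vals.std()),
            "calibration_error_hu": float(vals.mean() - EXPECTED_HU[name]),
        }
    return report

scan = np.random.default_rng(0).normal(-995.0, 12.0, (64, 64, 64))  # synthetic HU
masks = {"external_air": np.ones(scan.shape, dtype=bool)}           # toy mask
print(region_quality(scan, masks))
```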
A New Vision for Integrated Breast Care.
1998-09-01
Analysis tools to Mapping; and established counseling methods to Debriefing. We are now investigating how Neurolinguistic Programming may help... programs and services for the benefit of the patient. Our Continuous Quality Improvement, Informatics, and Education Cores are working together to help... streamline implementation of programs. This enables us to identify the quality improvements we hope to gain by changing a service and the quality...
Defining the best quality-control systems by design and inspection.
Hinckley, C M
1997-05-01
Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.
Banić, Nikola; Lončarić, Sven
2015-11-01
Removing the influence of illumination on image colors and adjusting the brightness across the scene are important image enhancement problems. This is achieved by applying adequate color constancy and brightness adjustment methods. One of the earliest models to deal with both of these problems was the Retinex theory. Some of the Retinex implementations tend to give high-quality results by performing local operations, but they are computationally relatively slow. One of the recent Retinex implementations is light random sprays Retinex (LRSR). In this paper, a new method is proposed for brightness adjustment and color correction that overcomes the main disadvantages of LRSR. There are three main contributions of this paper. First, a concept of memory sprays is proposed to reduce the number of LRSR's per-pixel operations to a constant regardless of the parameter values, thereby enabling a fast Retinex-based local image enhancement. Second, an effective remapping of image intensities is proposed that results in significantly higher quality. Third, the problem of LRSR's halo effect is significantly reduced by using an alternative illumination processing method. The proposed method enables a fast Retinex-based image enhancement by processing Retinex paths in a constant number of steps regardless of the path size. Due to the halo effect removal and remapping of the resulting intensities, the method outperforms many of the well-known image enhancement methods in terms of resulting image quality. The results are presented and discussed. It is shown that the proposed method outperforms most of the tested methods in terms of image brightness adjustment, color correction, and computational speed.
Characterization of Adipose Tissue Product Quality Using Measurements of Oxygen Consumption Rate.
Suszynski, Thomas M; Sieber, David A; Mueller, Kathryn; Van Beek, Allen L; Cunningham, Bruce L; Kenkel, Jeffrey M
2018-03-14
Fat grafting is a common procedure in plastic surgery but is associated with unpredictable graft retention. Adipose tissue (AT) "product" quality is affected by the methods used for harvest, processing, and transfer, which vary widely amongst surgeons. Currently, there is no method available to accurately assess the quality of AT. In this study, we present a novel method for the assessment of AT product quality through direct measurements of oxygen consumption rate (OCR). OCR has exhibited potential in predicting outcomes following pancreatic islet transplant. Our study aim was to repurpose existing technology for use with AT preparations and to confirm that these measurements are feasible. OCR was successfully measured for en bloc and post-processed AT using a stirred microchamber system. OCR was then normalized to DNA content (OCR/DNA), which represents the AT product quality. Mean (±SE) OCR/DNA values for fresh en bloc and post-processed AT were 149.8 (±9.1) and 61.1 (±6.1) nmol/min/mg DNA, respectively. These preliminary data suggest that: (1) OCR and OCR/DNA measurements of AT harvested using conventional protocol are feasible; and (2) standard AT processing results in a decrease in overall AT product quality. OCR measurement of AT using existing technology is possible and enables accurate, real-time, quantitative assessment of the quality of the AT product prior to transfer. The availability and further validation of this type of assay could enable optimization of fat grafting protocol by providing a tool for the more detailed study of procedural variables that affect AT product quality.
The creation, management, and use of data quality information for life cycle assessment.
Edelen, Ashley; Ingwersen, Wesley W
2018-04-01
Despite growing access to data, questions of "best fit" data and the appropriate use of results in supporting decision making still plague the life cycle assessment (LCA) community. This discussion paper addresses revisions to assessing data quality captured in a new US Environmental Protection Agency guidance document as well as additional recommendations on data quality creation, management, and use in LCA databases and studies. Existing data quality systems and approaches in LCA were reviewed and tested. The evaluations resulted in a revision to a commonly used pedigree matrix, for which flow and process level data quality indicators are described, more clarity for scoring criteria, and further guidance on interpretation are given. Increased training for practitioners on data quality application and its limits are recommended. A multi-faceted approach to data quality assessment utilizing the pedigree method alongside uncertainty analysis in result interpretation is recommended. A method of data quality score aggregation is proposed and recommendations for usage of data quality scores in existing data are made to enable improved use of data quality scores in LCA results interpretation. Roles for data generators, data repositories, and data users are described in LCA data quality management. Guidance is provided on using data with data quality scores from other systems alongside data with scores from the new system. The new pedigree matrix and recommended data quality aggregation procedure can now be implemented in openLCA software. Additional ways in which data quality assessment might be improved and expanded are described. Interoperability efforts in LCA data should focus on descriptors to enable user scoring of data quality rather than translation of existing scores. Developing and using data quality indicators for additional dimensions of LCA data, and automation of data quality scoring through metadata extraction and comparison to goal and scope are needed.
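To make the aggregation idea concrete, here is a hedged sketch of pedigree-style scoring: each flow carries a score from 1 (best) to 5 (worst) on each indicator, and study-level scores are computed as a weighted average over flows. The weights and the specific aggregation rule are assumptions for illustration, not the procedure proposed in the paper.

```python
# Hedged sketch of pedigree-score aggregation across flows; indicator names
# follow the common pedigree matrix, weights/rule are illustrative.
import numpy as np

INDICATORS = ["reliability", "completeness", "temporal", "geographical", "technological"]

def aggregate_dq(scores: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """scores: (nflows, nindicators) pedigree scores, 1 best to 5 worst;
    weights: (nflows,) relative contribution of each flow (e.g., impact share)."""
    w = weights / weights.sum()
    return w @ scores                      # weighted mean score per indicator

scores = np.array([[1, 2, 1, 3, 2],        # flow A
                   [4, 3, 5, 2, 3]])       # flow B
weights = np.array([0.8, 0.2])             # flow A dominates the result
print(dict(zip(INDICATORS, np.round(aggregate_dq(scores, weights), 2))))
```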
Oriented modulation for watermarking in direct binary search halftone images.
Guo, Jing-Ming; Su, Chang-Cheng; Liu, Yun-Fu; Lee, Hua; Lee, Jiann-Der
2012-09-01
In this paper, a halftoning-based watermarking method is presented. This method enables high pixel-depth watermark embedding while maintaining high image quality. This technique is capable of embedding watermarks with pixel depths of up to 3 bits without causing prominent degradation to the image quality. To achieve high image quality, the parallel-oriented, highly efficient direct binary search (DBS) halftoning is selected for integration with the proposed orientation modulation (OM) method. The OM method utilizes different halftone texture orientations to carry different watermark data. In the decoder, least-mean-square-trained filters are applied for feature extraction from watermarked images in the frequency domain, and a naïve Bayes classifier is used to analyze the extracted features and ultimately decode the watermark data. Experimental results show that the DBS-based OM encoding method maintains a high degree of image quality and achieves the processing efficiency and robustness needed for adoption in printing applications.
Scholkmann, F; Spichtig, S; Muehlemann, T; Wolf, M
2010-05-01
Near-infrared imaging (NIRI) is a neuroimaging technique which enables us to non-invasively measure hemodynamic changes in the human brain. Since the technique is very sensitive, the movement of a subject can cause movement artifacts (MAs), which strongly affect the signal quality and the results. No general method is yet available to reduce these MAs effectively. The aim was to develop a new MA reduction method. A method based on a moving standard deviation and spline interpolation was developed. It enables the semi-automatic detection and reduction of MAs in the data. It was validated using simulated and real NIRI signals. The results show that a significant reduction of MAs and an increase in signal quality are achieved. The effectiveness and usability of the method are demonstrated by the improved detection of evoked hemodynamic responses. The present method can be used not only in the postprocessing of NIRI signals but also for other kinds of data containing artifacts, for example ECG or EEG signals.
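A minimal sketch of the described two-step idea, with illustrative window length and threshold rather than the paper's calibrated values: flag samples where a moving standard deviation is anomalously high, then bridge the flagged segments by spline interpolation.

```python
# Minimal sketch: moving-SD artifact flagging plus spline interpolation over
# the flagged segments. Window and threshold values are illustrative only.
import numpy as np
from scipy.interpolate import CubicSpline

def reduce_artifacts(sig: np.ndarray, win: int = 10, k: float = 3.0) -> np.ndarray:
    # moving standard deviation via a sliding window
    pad = np.pad(sig, win // 2, mode="edge")
    mov_sd = np.array([pad[i:i + win].std() for i in range(sig.size)])
    bad = mov_sd > k * np.median(mov_sd)          # candidate artifact samples
    good = ~bad
    spline = CubicSpline(np.flatnonzero(good), sig[good])
    out = sig.copy()
    out[bad] = spline(np.flatnonzero(bad))        # bridge artifact segments
    return out

t = np.linspace(0, 10, 500)
sig = np.sin(2 * np.pi * 0.3 * t)
sig[200:220] += 5.0                               # simulated movement artifact
clean = reduce_artifacts(sig)
```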
Multi-stakeholder perspectives in defining health-services quality in cataract care.
Stolk-Vos, Aline C; van de Klundert, Joris J; Maijers, Niels; Zijlmans, Bart L M; Busschbach, Jan J V
2017-08-01
Objective: To develop a method to define a multi-stakeholder perspective on health-service quality that enables the expression of differences in systematically identified stakeholders' perspectives, and to pilot the approach for cataract care. Design: Mixed-method study between 2014 and 2015. Setting: Cataract care in the Netherlands. Participants: Stakeholder representatives. Methods: We first identified and classified stakeholders using stakeholder theory. Participants established a multi-stakeholder perspective on quality of cataract care using concept mapping, which yielded a cluster map based on multivariate statistical analyses. Consensus-based quality dimensions were subsequently defined in a plenary stakeholder session. Main outcome measures: Stakeholders and a multi-stakeholder perspective on health-service quality. Results: Our analysis identified seven definitive stakeholders, as follows: the Dutch Ophthalmology Society, ophthalmologists, general practitioners, optometrists, health insurers, hospitals and private clinics. Patients, as dependent stakeholders, were considered to lack power by other stakeholders; hence, they were not classified as definitive stakeholders. Overall, 18 stakeholders representing ophthalmologists, general practitioners, optometrists, health insurers, hospitals, private clinics, patients, patient federations and the Dutch Healthcare Institute sorted 125 systematically collected indicators into the following seven clusters: patient centeredness and accessibility, interpersonal conduct and expectations, experienced outcome, clinical outcome, process and structure, medical technical acting, and safety. Importance scores from stakeholders directly involved in the cataract service delivery process correlated strongly, as did scores from stakeholders not directly involved in this process. Conclusions: Using a case study on cataract care, the proposed methods enable different views among stakeholders concerning quality dimensions to be systematically revealed, and the stakeholders jointly agreed on these dimensions. The methods helped to unify different quality definitions and facilitated operationalisation of quality measurement in a way that was accepted by relevant stakeholders. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
Achieving quality of service in IP networks
NASA Astrophysics Data System (ADS)
Hays, Tim
2001-07-01
The Internet Protocol (IP) has served global networks well, providing a standardized method to transmit data among many disparate systems. But IP is designed for simplicity and only enables a "best effort" service that can be subject to delays and loss of data. For data networks, this is an acceptable trade-off. In the emerging world of convergence, driven by new applications such as video streaming and IP telephony, minimizing latency, packet loss, and jitter can be critical. Simply increasing the size of the IP network "pipe" to meet those demands is not always sufficient. In this environment, vendors and standards bodies are endeavoring to create technologies and techniques that enable IP to improve the quality of service it can provide, while retaining the characteristics that have enabled it to become the dominant networking protocol.
Weng, Chunhua
2013-01-01
Objective: To review the methods and dimensions of data quality assessment in the context of electronic health record (EHR) data reuse for research. Materials and methods: A review of the clinical research literature discussing data quality assessment methodology for EHR data was performed. Using an iterative process, the aspects of data quality being measured were abstracted and categorized, as well as the methods of assessment used. Results: Five dimensions of data quality were identified, which are completeness, correctness, concordance, plausibility, and currency, and seven broad categories of data quality assessment methods: comparison with gold standards, data element agreement, data source agreement, distribution comparison, validity checks, log review, and element presence. Discussion: Examination of the methods by which clinical researchers have investigated the quality and suitability of EHR data for research shows that there are fundamental features of data quality, which may be difficult to measure, as well as proxy dimensions. Researchers interested in the reuse of EHR data for clinical research are recommended to consider the adoption of a consistent taxonomy of EHR data quality, to remain aware of the task-dependence of data quality, to integrate work on data quality assessment from other fields, and to adopt systematic, empirically driven, statistically based methods of data quality assessment. Conclusion: There is currently little consistency or potential generalizability in the methods used to assess EHR data quality. If the reuse of EHR data for clinical research is to become accepted, researchers should adopt validated, systematic methods of EHR data quality assessment. PMID:22733976
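Two of the assessment categories named above, element presence (completeness) and validity checks (plausibility), reduce to very small computations; the sketch below uses invented field names, ranges, and records.

```python
# Toy sketch of element presence and validity checks on tabular EHR extracts.
# Field names, plausibility ranges, and records are invented for illustration.
import pandas as pd

ehr = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "heart_rate": [72, None, 250, 65],     # bpm; 250 is implausible
    "birth_date": ["1970-01-01", "1985-06-15", None, "2001-12-31"],
})

completeness = ehr.notna().mean()                    # share of present values
plausible_hr = ehr["heart_rate"].between(20, 220)    # validity check
implausible = (~plausible_hr) & ehr["heart_rate"].notna()
print("Completeness by element:\n", completeness, sep="")
print("Implausible heart rates:", int(implausible.sum()))
```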
Kaufman, David R; Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark
2016-01-01
Background: The process of documentation in electronic health records (EHRs) is known to be time consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)–enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user's experience. Objective: The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. Methods: This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods ("protocols") of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. Results: A total of 118 notes were documented across the 3 subject areas. The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. Conclusions: In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. PMID:27793791
A cross-domain communication resource scheduling method for grid-enabled communication networks
NASA Astrophysics Data System (ADS)
Zheng, Xiangquan; Wen, Xiang; Zhang, Yongding
2011-10-01
To support a wide range of different grid applications in environments where various heterogeneous communication networks coexist, it is important to enable advanced capabilities for on-demand, dynamic integration and efficient co-sharing of cross-domain heterogeneous communication resources, thus providing communication services that are impossible for any single communication resource to afford. Based on plug-and-play co-sharing and soft integration of communication resources, a grid-enabled communication network is flexibly built up to provide on-demand communication services for grid applications with various quality-of-service requirements. Based on an analysis of joint job and communication resource scheduling in grid-enabled communication networks (GECNs), this paper presents a cooperative scheduling method for communication resources across multiple domains and describes the main processes, such as traffic requirement resolution for communication services, cross-multi-domain negotiation on communication resources, and on-demand communication resource scheduling. The presented method provides communication service capability for cross-domain traffic delivery in GECNs. Further research toward validation and implementation of the presented method is outlined at the end.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zaluzhna, Oksana; Li, Ying; Allison, Thomas C.
2012-10-09
Inverse-micelle-encapsulated water formed in the two-phase Brust-Schiffrin method (BSM) synthesis of Au nanoparticles (NPs) is identified as essential for dialkyl diselenide/disulfide to react with the Au(III) complex in which the Se-Se/S-S bond is broken, leading to formation of higher-quality Au NPs.
Waites, Ken B; Duffy, Lynn B; Bébéar, Cécile M; Matlow, Anne; Talkington, Deborah F; Kenny, George E; Totten, Patricia A; Bade, Donald J; Zheng, Xiaotian; Davidson, Maureen K; Shortridge, Virginia D; Watts, Jeffrey L; Brown, Steven D
2012-11-01
An international multilaboratory collaborative study was conducted to develop standard media and consensus methods for the performance and quality control of antimicrobial susceptibility testing of Mycoplasma pneumoniae, Mycoplasma hominis, and Ureaplasma urealyticum using broth microdilution and agar dilution techniques. A reference strain from the American Type Culture Collection was designated for each species, which was to be used for quality control purposes. Repeat testing of replicate samples of each reference strain by participating laboratories utilizing both methods and different lots of media enabled a 3- to 4-dilution MIC range to be established for drugs in several different classes, including tetracyclines, macrolides, ketolides, lincosamides, and fluoroquinolones. This represents the first multilaboratory collaboration to standardize susceptibility testing methods and to designate quality control parameters to ensure accurate and reliable assay results for mycoplasmas and ureaplasmas that infect humans.
Jones, Emma; Lees, Nicholas; Martin, Graham; Dixon-Woods, Mary
2014-09-05
Quality improvement (QI) methods are widely used in surgery in an effort to improve care, often using techniques such as Plan-Do-Study-Act cycles to implement specific interventions. Explicit definition of both the QI method and the quality intervention is necessary to enable the accurate replication of effective interventions in practice, facilitate cumulative learning, reduce research waste, and optimise benefits to patients. This systematic review aims to assess the quality of reporting of QI methods and quality interventions in perioperative care. Studies reporting on quality interventions implemented in perioperative care settings will be identified. Searches will be conducted in the Ovid SP version of Medline, Scopus, the Cochrane Central Register of Controlled Trials, the Cochrane Effective Practice and Organisation of Care database, and the related articles function of PubMed. The journal BMJ Quality will be searched separately. Search strategy terms will relate to (i) surgery, (ii) QI, and (iii) evaluation methods. Explicit exclusion and inclusion criteria will be applied. Data from studies will be extracted using a data extraction form. The Template for Intervention Description and Replication (TIDieR) checklist will be used to evaluate quality of reporting, together with additional items aimed at assessing QI methods specifically. PROSPERO registration number: CRD42014012845.
Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka
2016-03-01
This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).
Nemeth, Lynne S; Wessell, Andrea M; Jenkins, Ruth G; Nietert, Paul J; Liszka, Heather A; Ornstein, Steven M
2007-01-01
This research describes implementation strategies used by primary care practices using electronic medical records in a national quality improvement demonstration project, Accelerating Translation of Research into Practice, conducted within the Practice Partner Research Network. Qualitative methods enabled identification of strategies to improve 36 quality indicators. Quantitative survey results provide mean scores reflecting the integration of these strategies by practices. Nursing staff plays important roles to facilitate quality improvement within collaborative primary care practices.
Process for the physical segregation of minerals
Yingling, Jon C.; Ganguli, Rajive
2004-01-06
With highly heterogeneous groups or streams of minerals, physical segregation using online quality measurements is an economically important first stage of the mineral beneficiation process. Segregation enables high quality fractions of the stream to bypass processing, such as cleaning operations, thereby reducing the associated costs and avoiding the yield losses inherent in any downstream separation process. The present invention includes various methods for reliably segregating a mineral stream into at least one fraction meeting desired quality specifications while at the same time maximizing yield of that fraction.
NASA Astrophysics Data System (ADS)
Roghanian, E.; Alipour, Mohammad
2014-06-01
Lean production has become an integral part of the manufacturing landscape, as its link with superior performance and its ability to provide competitive advantage are well accepted among academics and practitioners. Lean production helps producers overcome the challenges organizations face through the use of powerful tools and enablers. However, most companies are faced with restricted resources such as financial and human resources, time, etc., in using these enablers, and are not capable of implementing all of these techniques. Therefore, identifying and selecting the most appropriate and efficient tool can be a significant challenge for many companies. Hence, this study seeks to combine competitive advantages, lean attributes, and lean enablers to determine the most appropriate enablers for improvement of lean attributes. Quality function deployment in a fuzzy environment and the house of quality matrix are implemented. Throughout the methodology, fuzzy logic is the basis for translating the linguistic judgments required for the relationship and correlation matrices into numerical values. Moreover, for the final ranking of lean enablers, a multi-criteria decision-making method (PROMETHEE) is adopted. Finally, a case study in the automotive industry is presented to illustrate the implementation of the proposed methodology.
Testing a simple field method for assessing nitrate removal in riparian zones
Philippe Vidon; Michael G. Dosskey
2008-01-01
Being able to identify riparian sites that function better for nitrate removal from groundwater is critical to using riparian zones efficiently for water quality management. For this purpose, managers need a method that is quick, inexpensive, and accurate enough to enable effective management decisions. This study assesses the precision and accuracy of a simple...
3D printing functional materials and devices (Conference Presentation)
NASA Astrophysics Data System (ADS)
McAlpine, Michael C.
2017-05-01
The development of methods for interfacing high performance functional devices with biology could impact regenerative medicine, smart prosthetics, and human-machine interfaces. Indeed, the ability to three-dimensionally interweave biological and functional materials could enable the creation of devices possessing unique geometries, properties, and functionalities. Yet, most high quality functional materials are two dimensional, hard and brittle, and require high crystallization temperatures for maximal performance. These properties render the corresponding devices incompatible with biology, which is three-dimensional, soft, stretchable, and temperature sensitive. We overcome these dichotomies by: 1) using 3D printing and scanning for customized, interwoven, anatomically accurate device architectures; 2) employing nanotechnology as an enabling route for overcoming mechanical discrepancies while retaining high performance; and 3) 3D printing a range of soft and nanoscale materials to enable the integration of a diverse palette of high quality functional nanomaterials with biology. 3D printing is a multi-scale platform, allowing for the incorporation of functional nanoscale inks, the printing of microscale features, and ultimately the creation of macroscale devices. This three-dimensional blending of functional materials and `living' platforms may enable next-generation 3D printed devices.
A method for identifying boundary interference in PADV data
USDA-ARS's Scientific Manuscript database
Recent commercialization of profiling acoustic Doppler velocimeters (PADVs) has enabled researchers to measure velocities at high frequencies simultaneously at specified increments over the instrument measurement range. The quantity of data output by PADVs can be large, hence robust quality control...
Using magnetic levitation for non-destructive quality control of plastic parts.
Hennek, Jonathan W; Nemiroski, Alex; Subramaniam, Anand Bala; Bwambok, David K; Yang, Dian; Harburg, Daniel V; Tricard, Simon; Ellerbee, Audrey K; Whitesides, George M
2015-03-04
Magnetic levitation (MagLev) enables rapid and non-destructive quality control of plastic parts. The feasibility of MagLev as a method to: i) rapidly assess injection-molded plastic parts for defects during process optimization, ii) monitor the degradation of plastics after exposure to harsh environmental conditions, and iii) detect counterfeit polymers by density is demonstrated. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Antoszewska-Smith, Joanna; Sarul, Michał; Łyczek, Jan; Konopka, Tomasz; Kawala, Beata
2017-03-01
The aim of this systematic review was to compare the effectiveness of orthodontic miniscrew implants, i.e., temporary intraoral skeletal anchorage devices (TISADs), in anchorage reinforcement during en-masse retraction, relative to conventional methods of anchorage. A search of PubMed, Embase, the Cochrane Central Register of Controlled Trials, and Web of Science was performed. The keywords were orthodontic, mini-implants, miniscrews, miniplates, and temporary anchorage device. Relevant articles were assessed for quality according to Cochrane guidelines, and the data were extracted for statistical analysis. A meta-analysis of raw mean differences concerning anchorage loss, tipping of molars, retraction of incisors, tipping of incisors, and treatment duration was carried out. Initially, we retrieved 10,038 articles. The selection process finally resulted in 14 articles including 616 patients (451 female, 165 male) for detailed analysis. The quality of the included studies was assessed as moderate. Meta-analysis showed that the use of TISADs facilitates better anchorage reinforcement than conventional methods. On average, TISADs enabled 1.86 mm more anchorage preservation than did conventional methods (P <0.001). The results of the meta-analysis showed that TISADs are more effective than conventional methods of anchorage reinforcement. The average difference of 2 mm seems not only statistically but also clinically significant. However, the results should be interpreted with caution because of the moderate quality of the included studies. More high-quality studies on this issue are necessary to enable more reliable conclusions to be drawn. Copyright © 2016 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
Kawata, Masaaki; Sato, Chikara
2007-06-01
In determining the three-dimensional (3D) structure of macromolecular assemblies in single particle analysis, a large representative dataset of two-dimensional (2D) average images from a huge number of raw images is key for high resolution. Because alignments prior to averaging are computationally intensive, currently available multireference alignment (MRA) software does not survey every possible alignment. This leads to misaligned images, creating blurred averages and reducing the quality of the final 3D reconstruction. We present a new method, in which multireference alignment is harmonized with classification (multireference multiple alignment: MRMA). This method enables a statistical comparison of multiple alignment peaks, reflecting the similarities between each raw image and a set of reference images. Among the selected alignment candidates for each raw image, misaligned images are statistically excluded, based on the principle that aligned raw images of similar projections have a dense distribution around the correctly aligned coordinates in image space. This newly developed method was examined for accuracy and speed using model image sets with various signal-to-noise ratios, and with electron microscope images of the Transient Receptor Potential C3 and the sodium channel. In every data set, the newly developed method outperformed conventional methods in robustness against noise and in speed, creating 2D average images of higher quality. This statistically harmonized alignment-classification combination should greatly improve the quality of single particle analysis.
Image quality assessment using deep convolutional networks
NASA Astrophysics Data System (ADS)
Li, Yezhou; Ye, Xiang; Li, Yong
2017-12-01
This paper proposes a method of accurately assessing image quality without a reference image by using a deep convolutional neural network. Existing training-based methods usually utilize a compact set of linear filters for learning features of images captured by different sensors to assess their quality. These methods may not be able to learn the semantic features that are intimately related to the features used in human subjective assessment. Observing this drawback, this work proposes training a deep convolutional neural network (CNN) with labelled images for image quality assessment. The ReLU in the CNN allows non-linear transformations for extracting high-level image features, providing a more reliable assessment of image quality than linear filters. To enable the neural network to take images of any arbitrary size as input, spatial pyramid pooling (SPP) is introduced, connecting the top convolutional layer and the fully-connected layer. In addition, the SPP makes the CNN robust to object deformations to a certain extent. The proposed method takes an image as input, carries out an end-to-end learning process, and outputs the quality of the image. It is tested on public datasets. Experimental results show that it outperforms existing methods by a large margin and can accurately assess the quality of images of varying sizes taken by different sensors.
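The SPP step is the part that makes arbitrary input sizes possible, and it is easy to state in code. Below is a hedged numpy sketch of spatial pyramid pooling: max-pool the top feature map into fixed grids (1x1, 2x2, 4x4 here) and concatenate, so any spatial size yields a fixed-length vector for the fully connected layer.

```python
# Hedged numpy sketch of spatial pyramid pooling (SPP); pyramid levels are
# illustrative, not necessarily those used by the paper.
import numpy as np

def spp(feature_map: np.ndarray, levels=(1, 2, 4)) -> np.ndarray:
    """feature_map: (channels, H, W). Returns (channels * sum(n*n),)."""
    c, h, w = feature_map.shape
    out = []
    for n in levels:
        # bin edges that cover the map for any H, W (adaptive pooling)
        hs = np.linspace(0, h, n + 1).astype(int)
        ws = np.linspace(0, w, n + 1).astype(int)
        for i in range(n):
            for j in range(n):
                cell = feature_map[:, hs[i]:hs[i + 1], ws[j]:ws[j + 1]]
                out.append(cell.max(axis=(1, 2)))  # max-pool each grid cell
    return np.concatenate(out)

fmap = np.random.default_rng(0).random((64, 13, 19))   # arbitrary spatial size
print(spp(fmap).shape)                                 # (64 * (1+4+16),) = (1344,)
```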
Nuckols, Teryl; Harber, Philip; Sandin, Karl; Benner, Douglas; Weng, Haoling; Shaw, Rebecca; Griffin, Anne; Asch, Steven
2011-03-01
Providing higher quality medical care to workers with occupationally associated carpal tunnel syndrome (CTS) may reduce disability, facilitate return to work, and lower the associated costs. Although many workers' compensation systems have adopted treatment guidelines to reduce the overuse of unnecessary care, limited attention has been paid to ensuring that the care workers do receive is high quality. Further, guidelines are not designed to enable objective assessments of quality of care. This study sought to develop quality measures for the diagnostic evaluation and non-operative management of CTS, including managing occupational activities and functional limitations. Using a variation of the well-established RAND/UCLA Appropriateness Method, we developed draft quality measures using guidelines and literature reviews. Next, in a two-round modified-Delphi process, a multidisciplinary panel of 11 U.S. experts in CTS rated the measures on validity and feasibility. Of 40 draft measures, experts rated 31 (78%) valid and feasible. Nine measures pertained to diagnostic evaluation, such as assessing symptoms, signs, and risk factors. Eleven pertain to non-operative treatments, such as the use of splints, steroid injections, and medications. Eleven others address assessing the association between symptoms and work, managing occupational activities, and accommodating functional limitations. These measures will complement existing treatment guidelines by enabling providers, payers, policymakers, and researchers to assess quality of care for CTS in an objective, structured manner. Given the characteristics of previous measures developed with these methods, greater adherence to these measures will probably lead to improved patient outcomes at a population level.
[Integral quantitative evaluation of working conditions in the construction industry].
Guseĭnov, A A
1993-01-01
The present method of evaluating environmental quality (using maximum allowable concentrations and maximum allowable levels, MAC and MAL) does not enable a complete and objective assessment of working conditions in the construction industry because of multiple confounding elements. A solution to this complicated problem, which includes the analysis of the various correlated elements of the system "human--working conditions--environment", may be supported by a social norm of morbidity that is independent of the industrial and natural environment. A complete integral assessment makes it possible to see the whole situation and to reveal the points of risk.
1995-09-01
vital processes of a business. process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems... knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems
Laksmana, F L; Van Vliet, L J; Hartman Kok, P J A; Vromans, H; Frijlink, H W; Van der Voort Maarschalk, K
2009-04-01
This study aims to develop a characterization method for coating structure based on image analysis, which is particularly promising for the rational design of coated particles in the pharmaceutical industry. The method applies the MATLAB image processing toolbox to images of coated particles taken with Confocal Laser Scanning Microscopy (CSLM). The coating thicknesses have been determined along the particle perimeter, from which a statistical analysis could be performed to obtain relevant thickness properties, e.g. the minimum coating thickness and the span of the thickness distribution. The characterization of the pore structure involved a proper segmentation of pores from the coating and a granulometry operation. The presented method facilitates the quantification of the porosity, thickness, and pore size distribution of a coating. These parameters are considered the important coating properties that are critical to coating functionality. Additionally, the effect of coating process variations on coating quality can be straightforwardly assessed. By enabling a good characterization of coating quality, the presented method can be used as a fast and effective tool to predict coating functionality. This approach also enables the influence of different process conditions on coating properties to be effectively monitored, which ultimately enables process tailoring.
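A minimal sketch of the thickness-statistics step, assuming a ray-casting approach over a binary coating mask; the function and parameter names are illustrative, not the authors' MATLAB routine:

```python
import numpy as np

def coating_thickness_stats(coating_mask, center, n_rays=360):
    """Estimate coating thickness along a particle's perimeter.

    coating_mask : 2D boolean array, True where the coating phase is.
    center       : (row, col) of the particle centre.
    Casts rays from the centre and counts coating pixels along each,
    giving one thickness sample per angle (in pixels).
    """
    h, w = coating_mask.shape
    cy, cx = center
    r_max = int(np.hypot(h, w))
    thicknesses = []
    for theta in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
        rr = (cy + np.arange(r_max) * np.sin(theta)).astype(int)
        cc = (cx + np.arange(r_max) * np.cos(theta)).astype(int)
        keep = (rr >= 0) & (rr < h) & (cc >= 0) & (cc < w)
        # Thickness sample = number of coating pixels crossed by the ray.
        thicknesses.append(coating_mask[rr[keep], cc[keep]].sum())
    t = np.asarray(thicknesses)
    return {"min": t.min(), "mean": t.mean(), "span": t.max() - t.min()}
```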
Zweigenbaum, P.; Bouaud, J.; Bachimont, B.; Charlet, J.; Boisvieux, J. F.
1997-01-01
The Menelas project aimed to produce a normalized conceptual representation from natural language patient discharge summaries. Because of the complex and detailed nature of conceptual representations, evaluating the quality of the output of such a system is difficult. We present the method designed to measure the quality of Menelas output, and its application to the state of the French Menelas prototype as of the end of the project. We examine this method in the framework recently proposed by Friedman and Hripcsak. We also propose two conditions which make it possible to reduce the evaluation preparation workload. PMID:9357694
Lattice Cleaving: A Multimaterial Tetrahedral Meshing Algorithm with Guarantees
Bronson, Jonathan; Levine, Joshua A.; Whitaker, Ross
2014-01-01
We introduce a new algorithm for generating tetrahedral meshes that conform to physical boundaries in volumetric domains consisting of multiple materials. The proposed method allows for an arbitrary number of materials, produces high-quality tetrahedral meshes with upper and lower bounds on dihedral angles, and guarantees geometric fidelity. Moreover, the method is combinatoric so its implementation enables rapid mesh construction. These meshes are structured in a way that also allows grading, to reduce element counts in regions of homogeneity. Additionally, we provide proofs showing that both element quality and geometric fidelity are bounded using this approach. PMID:24356365
Appreciative Inquiry for Quality Improvement in Primary Care Practices
Ruhe, Mary C.; Bobiak, Sarah N.; Litaker, David; Carter, Caroline A.; Wu, Laura; Schroeder, Casey; Zyzanski, Stephen; Weyer, Sharon M.; Werner, James J.; Fry, Ronald E.; Stange, Kurt C.
2014-01-01
Purpose To test the effect of an Appreciative Inquiry (AI) quality improvement strategy, on clinical quality management and practice development outcomes. AI enables discovery of shared motivations, envisioning a transformed future, and learning around implementation of a change process. Methods Thirty diverse primary care practices were randomly assigned to receive an AI-based intervention focused on a practice-chosen topic and on improving preventive service delivery (PSD) rates. Medical record review assessed change in PSD rates. Ethnographic fieldnotes and observational checklist analysis used editing and immersion/crystallization methods to identify factors affecting intervention implementation and practice development outcomes. Results PSD rates did not change. Field note analysis suggested that the intervention elicited core motivations, facilitated development of a shared vision, defined change objectives and fostered respectful interactions. Practices most likely to implement the intervention or develop new practice capacities exhibited one or more of the following: support from key leader(s), a sense of urgency for change, a mission focused on serving patients, health care system and practice flexibility, and a history of constructive practice change. Conclusions An AI approach and enabling practice conditions can lead to intervention implementation and practice development by connecting individual and practice strengths and motivations to the change objective. PMID:21192206
Boiret, Mathieu; Chauchard, Fabien
2017-01-01
Near-infrared (NIR) spectroscopy is a non-destructive analytical technique that enables better understanding and optimization of pharmaceutical processes and final drug products. Its in-line use is often limited by acquisition speed and sampling area. This work focuses on performing multipoint measurements at high acquisition speed at the end of the manufacturing process, on a conveyor belt system, to control both the distribution and the content of active pharmaceutical ingredient within final drug products, i.e., tablets. A specially designed probe with several collection fibers was developed for this study. By measuring spectral and spatial information, it provides physical and chemical knowledge of the final drug product. The NIR probe was installed on a conveyor belt system that enables the analysis of large numbers of tablets. The use of these NIR multipoint measurement probes on a conveyor belt system provided an innovative method that has the potential to be used as a new paradigm to ensure drug product quality at the end of the manufacturing process and as a new analytical method for a real-time release control strategy. Graphical abstract: Use of near-infrared spectroscopy and multipoint measurements for quality control of pharmaceutical drug products.
Pursuing the perfect patient experience.
Kaplan, Gary S
2013-01-01
Adapting the principles and tools of the Toyota Production System to healthcare in the form of the Virginia Mason Production System has enabled Virginia Mason Medical Center to transform itself as an organization. Virginia Mason has worked persistently for more than a decade to apply Toyota methods to eliminate waste, improve safety and quality, and provide the community it serves with the highest-quality healthcare at the lowest cost. We have made great progress in this pursuit.
NASA Astrophysics Data System (ADS)
Faizah, Arbiati; Syafei, Wahyul Amien; Isnanto, R. Rizal
2018-02-01
This research proposed a model combining a Total Quality Management (TQM) approach and a fuzzy Service Quality (SERVQUAL) method to assess service quality. TQM implementation was treated as quality management oriented toward customer satisfaction by involving all stakeholders. The SERVQUAL model was used to measure service quality across five dimensions: tangibles, reliability, responsiveness, assurance, and empathy. Fuzzy set theory was used to accommodate the subjectivity and ambiguity of quality assessment. Input data consisted of indicator data and quality assessment aspects. The input data were then processed into service quality assessment questionnaires for the Pesantren using the fuzzy method to obtain a service quality score. This process consisted of the following steps: dimension and questionnaire data were entered into the database system; questionnaires were filled in through the system; and the system then calculated the fuzzification, the defuzzification, the gap between the quality expected and that received by service recipients, and each dimension's rating, which indicates the priority for quality refinement. The rating of each quality dimension was then displayed on a dashboard so that users could see the information. The system that was built showed that the tangible dimension had the largest gap, -0.399, so it should be prioritized and receive evaluation and refinement action soon.
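A minimal sketch of the fuzzy SERVQUAL gap computation, assuming triangular fuzzy numbers for a 5-point Likert scale and centroid defuzzification; the scale mapping is illustrative and not necessarily the one used in the study:

```python
import numpy as np

# Illustrative mapping of a 5-point Likert scale to triangular
# fuzzy numbers (low, mode, high); the paper's exact scale may differ.
TFN = {1: (0, 0, 25), 2: (0, 25, 50), 3: (25, 50, 75),
       4: (50, 75, 100), 5: (75, 100, 100)}

def defuzzify(responses):
    """Average the responses as TFNs, then take the centroid."""
    tfns = np.array([TFN[r] for r in responses], dtype=float)
    low, mode, high = tfns.mean(axis=0)
    return (low + mode + high) / 3.0  # centroid of a triangle

def servqual_gap(perceived, expected):
    """Gap = defuzzified perception - defuzzified expectation.
    Negative values flag dimensions needing improvement first."""
    return defuzzify(perceived) - defuzzify(expected)

# Example: a 'tangible' dimension with perceptions below expectations.
print(servqual_gap(perceived=[3, 4, 3, 2], expected=[5, 4, 5, 4]))
```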
Autocalibrating motion-corrected wave-encoding for highly accelerated free-breathing abdominal MRI.
Chen, Feiyu; Zhang, Tao; Cheng, Joseph Y; Shi, Xinwei; Pauly, John M; Vasanawala, Shreyas S
2017-11-01
To develop a motion-robust wave-encoding technique for highly accelerated free-breathing abdominal MRI. A comprehensive 3D wave-encoding-based method was developed to enable fast free-breathing abdominal imaging: (a) auto-calibration for wave-encoding was designed to avoid extra scan for coil sensitivity measurement; (b) intrinsic butterfly navigators were used to track respiratory motion; (c) variable-density sampling was included to enable compressed sensing; (d) golden-angle radial-Cartesian hybrid view-ordering was incorporated to improve motion robustness; and (e) localized rigid motion correction was combined with parallel imaging compressed sensing reconstruction to reconstruct the highly accelerated wave-encoded datasets. The proposed method was tested on six subjects and image quality was compared with standard accelerated Cartesian acquisition both with and without respiratory triggering. Inverse gradient entropy and normalized gradient squared metrics were calculated, testing whether image quality was improved using paired t-tests. For respiratory-triggered scans, wave-encoding significantly reduced residual aliasing and blurring compared with standard Cartesian acquisition (metrics suggesting P < 0.05). For non-respiratory-triggered scans, the proposed method yielded significantly better motion correction compared with standard motion-corrected Cartesian acquisition (metrics suggesting P < 0.01). The proposed methods can reduce motion artifacts and improve overall image quality of highly accelerated free-breathing abdominal MRI. Magn Reson Med 78:1757-1766, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
Alternative Land-Use Method for Spatially Informed Watershed Management Decision Making Using SWAT
In this study, a modification is proposed to the Soil and Water Assessment Tool (SWAT) to enable identification of areas where the implementation of best management practices would likely result in the most significant improvement in downstream water quality. To geospatially link...
Ovretveit, John; Klazinga, Niek
2013-02-01
Both public and private health and social care services are facing increased and changing demands to improve quality and reduce costs. To enable local services to respond to these demands, governments and other organisations have established large scale improvement programmes. These usually seek to enable many services to make changes to apply proven improvements and to make use of quality improvement methods. The purpose of this paper is to provide an empirical description of how one organisation coordinated ten national improvement programmes between 2004 and 2010. It provides details which may be useful to others seeking to plan and implement such programmes, and also contributes to the understanding of knowledge translation and of network governance. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Baumgart, Leigh A; Bass, Ellen J; Lyman, Jason A; Springs, Sherry; Voss, John; Hayden, Gregory F; Hellems, Martha A; Hoke, Tracey R; Schlag, Katharine A; Schorling, John B
2010-01-01
Participating in self-assessment activities may stimulate improvement in practice behaviors. However, it is unclear how best to support the development of self-assessment skills, particularly in the health care domain. Exploration of population-based data is one method to enable health care providers to identify deficiencies in overall practice behavior that can motivate quality improvement initiatives. At the University of Virginia, we are developing a decision support tool to integrate and present population-based patient data to health care providers related to both clinical outcomes and non-clinical measures (e.g., demographic information). By enabling users to separate their direct impact on clinical outcomes from other factors out of their control, we may enhance the self-assessment process.
Baumgart, Leigh A.; Bass, Ellen J.; Lyman, Jason A.; Springs, Sherry; Voss, John; Hayden, Gregory F.; Hellems, Martha A.; Hoke, Tracey R.; Schlag, Katharine A.; Schorling, John B.
2011-01-01
Participating in self-assessment activities may stimulate improvement in practice behaviors. However, it is unclear how best to support the development of self-assessment skills, particularly in the health care domain. Exploration of population-based data is one method to enable health care providers to identify deficiencies in overall practice behavior that can motivate quality improvement initiatives. At the University of Virginia, we are developing a decision support tool to integrate and present population-based patient data to health care providers related to both clinical outcomes and non-clinical measures (e.g., demographic information). By enabling users to separate their direct impact on clinical outcomes from other factors out of their control, we may enhance the self-assessment process. PMID:21874123
Senda, Miki; Muto, Shinsuke; Horikoshi, Masami; Senda, Toshiya
2008-10-01
One of the most frequent problems in crystallization is poor quality of the crystals. In order to overcome this obstacle several methods have been utilized, including amino-acid substitutions of the target protein. Here, an example is presented of crystal-quality improvement by leucine-to-methionine substitutions. A variant protein with three amino-acid substitutions enabled improvement of the crystal quality of the histone chaperone SET/TAF-Ibeta/INHAT when combined with optimization of the cryoconditions. This procedure improved the resolution of the SET/TAF-Ibeta/INHAT crystals from around 5.5 Å to 2.3 Å without changing the crystallization conditions.
Setting standards and monitoring quality in the NHS 1999-2013: a classic case of goal conflict.
Littlejohns, Peter; Knight, Alec; Littlejohns, Anna; Poole, Tara-Lynn; Kieslich, Katharina
2017-04-01
2013 saw the National Health Service (NHS) in England severely criticized for providing poor quality despite successive governments in the previous 15 years establishing a range of new institutions to improve NHS quality. This study seeks to understand the contributions of political and organizational influences in enabling the NHS to deliver high-quality care through exploring the experiences of two of the major new organizations established to set standards and monitor NHS quality. We used a mixed method approach: first a cross-sectional, in-depth qualitative interview study and then the application of principal agent modeling (Waterman and Meier broader framework). Ten themes were identified as influencing the functioning of the NHS regulatory institutions: socio-political environment; governance and accountability; external relationships; clarity of purpose; organizational reputation; leadership and management; organizational stability; resources; organizational methods; and organizational performance. The organizations could be easily mapped onto the framework, and their transience between the different states could be monitored. We concluded that differing policy objectives for NHS quality monitoring resulted in central involvement and organizational change. This had a disruptive effect on the ability of the NHS to monitor quality. Constant professional leadership, both clinical and managerial, and basing decisions on best evidence, both technical and organizational, helped one institution to deliver on its remit, even within a changing political/policy environment. Application of the Waterman-Meier framework enabled an understanding and description of the dynamic relationship between central government and organizations in the NHS and may predict when tensions will arise in the future. © 2016 The Authors. The International Journal of Health Planning and Management Published by John Wiley & Sons Ltd.
Setting standards and monitoring quality in the NHS 1999–2013: a classic case of goal conflict
Knight, Alec; Littlejohns, Anna; Poole, Tara‐Lynn; Kieslich, Katharina
2016-01-01
Abstract 2013 saw the National Health Service (NHS) in England severely criticized for providing poor quality despite successive governments in the previous 15 years, establishing a range of new institutions to improve NHS quality. This study seeks to understand the contributions of political and organizational influences in enabling the NHS to deliver high‐quality care through exploring the experiences of two of the major new organizations established to set standards and monitor NHS quality. We used a mixed method approach: first a cross‐sectional, in‐depth qualitative interview study and then the application of principal agent modeling (Waterman and Meier broader framework). Ten themes were identified as influencing the functioning of the NHS regulatory institutions: socio‐political environment; governance and accountability; external relationships; clarity of purpose; organizational reputation; leadership and management; organizational stability; resources; organizational methods; and organizational performance. The organizations could be easily mapped onto the framework, and their transience between the different states could be monitored. We concluded that differing policy objectives for NHS quality monitoring resulted in central involvement and organizational change. This had a disruptive effect on the ability of the NHS to monitor quality. Constant professional leadership, both clinical and managerial, and basing decisions on best evidence, both technical and organizational, helped one institution to deliver on its remit, even within a changing political/policy environment. Application of the Waterman–Meier framework enabled an understanding and description of the dynamic relationship between central government and organizations in the NHS and may predict when tensions will arise in the future. © 2016 The Authors. The International Journal of Health Planning and Management Published by John Wiley & Sons Ltd. PMID:27435020
Pomeroy, Linda; Burnett, Susan; Anderson, Janet E; Fulop, Naomi J
2017-01-01
Background Health systems worldwide are increasingly holding boards of healthcare organisations accountable for the quality of care that they provide. Previous empirical research has found associations between certain board practices and higher quality patient care; however, little is known about how boards govern for quality improvement (QI). Methods We conducted fieldwork over a 30-month period in 15 healthcare provider organisations in England as part of a wider evaluation of a board-level organisational development intervention. Our data comprised board member interviews (n=65), board meeting observations (60 hours) and documents (30 sets of board meeting papers, 15 board minutes and 15 Quality Accounts). We analysed the data using a framework developed from existing evidence of links between board practices and quality of care. We mapped the variation in how boards enacted governance of QI and constructed a measure of QI governance maturity. We then compared organisations to identify the characteristics of those with mature QI governance. Results We found that boards with higher levels of maturity in relation to governing for QI had the following characteristics: explicitly prioritising QI; balancing short-term (external) priorities with long-term (internal) investment in QI; using data for QI, not just quality assurance; engaging staff and patients in QI; and encouraging a culture of continuous improvement. These characteristics appeared to be particularly enabled and facilitated by board-level clinical leaders. Conclusions This study contributes to a deeper understanding of how boards govern for QI. The identified characteristics of organisations with mature QI governance seemed to be enabled by active clinical leadership. Future research should explore the biographies, identities and work practices of board-level clinical leaders and their role in organisation-wide QI. PMID:28689191
EMISSIONS INVENTORY OF PM 2.5 TRACE ELEMENTS ACROSS THE U.S.
This abstract describes work done to speciate PM2.5 emissions into emissions of trace metals to enable concentrations of metal species to be predicted by air quality models. Methods are described and initial results are presented. A technique for validating the resul...
Blom, Mozes P K
2015-08-05
Recently developed molecular methods enable geneticists to target and sequence thousands of orthologous loci and infer evolutionary relationships across the tree of life. Large numbers of genetic markers benefit species tree inference, but visual inspection of alignment quality, as traditionally conducted, is challenging with thousands of loci. Furthermore, due to the impracticality of repeated visual inspection with alternative filtering criteria, the potential consequences of using datasets with different degrees of missing data remain only nominally explored in most empirical phylogenomic studies. In this short communication, I describe a flexible high-throughput pipeline designed to assess alignment quality and filter exonic sequence data for subsequent inference. The stringency criteria for alignment quality and missing data can be adapted based on the expected level of sequence divergence. Each alignment is automatically evaluated based on the stringency criteria specified, significantly reducing the number of alignments that require visual inspection. By developing a rapid method for alignment filtering and quality assessment, the consistency of phylogenetic estimation based on exonic sequence alignments can be further explored across distinct inference methods, while accounting for different degrees of missing data.
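A minimal sketch of the filtering idea under stated assumptions; the thresholds and function names are illustrative, not the pipeline's actual API:

```python
# Flag alignments whose missing-data fraction or proportion of variable
# columns exceeds user-set thresholds, so only borderline cases need
# visual inspection.
def alignment_stats(seqs):
    """seqs: dict of taxon -> aligned sequence (equal lengths)."""
    cols = list(zip(*seqs.values()))
    missing = sum(c.count('-') + c.count('N') for c in cols)
    total = len(cols) * len(seqs)
    variable = sum(len(set(c) - {'-', 'N'}) > 1 for c in cols)
    return missing / total, variable / len(cols)

def passes_filter(seqs, max_missing=0.3, max_variable=0.5):
    miss, var = alignment_stats(seqs)
    return miss <= max_missing and var <= max_variable

aln = {"taxonA": "ATGCC-TT", "taxonB": "ATGCCNTT", "taxonC": "ATGACGTT"}
print(alignment_stats(aln), passes_filter(aln))
```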
77 FR 42738 - Request for Information on Quality Measurement Enabled by Health IT
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-20
... Information on Quality Measurement Enabled by Health IT AGENCY: Agency for Healthcare Research and Quality (AHRQ), Health and Human Services (HHS). ACTION: Notice of Request for Information (RFI). SUMMARY: The Agency for Healthcare Research and Quality (AHRQ) requests information from the Public, including...
Hosokawa, Masahito; Nishikawa, Yohei; Kogawa, Masato; Takeyama, Haruko
2017-07-12
Massively parallel single-cell genome sequencing is required to further understand genetic diversities in complex biological systems. Whole genome amplification (WGA) is the first step for single-cell sequencing, but its throughput and accuracy are insufficient in conventional reaction platforms. Here, we introduce single droplet multiple displacement amplification (sd-MDA), a method that enables massively parallel amplification of single cell genomes while maintaining sequence accuracy and specificity. Tens of thousands of single cells are compartmentalized in millions of picoliter droplets and then subjected to lysis and WGA by passive droplet fusion in microfluidic channels. Because single cells are isolated in compartments, their genomes are amplified to saturation without contamination. This enables the high-throughput acquisition of contamination-free and cell-specific sequence reads from single cells (21,000 single cells/h), resulting in enhanced sequence data quality compared to conventional methods. This method allowed WGA of both single bacterial cells and human cancer cells. The obtained sequencing coverage rivals those of conventional techniques with superior sequence quality. In addition, we also demonstrate de novo assembly of uncultured soil bacteria and obtain draft genomes from single cell sequencing. This sd-MDA is promising for flexible and scalable use in single-cell sequencing.
NASA Astrophysics Data System (ADS)
Siok, Katarzyna; Jenerowicz, Agnieszka; Woroszkiewicz, Małgorzata
2017-07-01
Archival aerial photographs are often the only reliable source of information about an area. However, they are single-band data that do not allow unambiguous detection of particular forms of land cover. The authors of this article therefore seek to develop a method of coloring panchromatic aerial photographs, which enables an increase in the spectral information of such images. The study used data integration algorithms based on pansharpening, implemented in commonly used remote sensing programs: ERDAS, ENVI, and PCI. Aerial photos and Landsat multispectral data recorded in 1987 and 2016 were chosen. This study proposes the use of modified intensity-hue-saturation and Brovey methods. The use of these methods enabled the addition of red-green-blue (RGB) components to monochrome images, thus enhancing their interpretability and spectral quality. The limitations of the proposed method relate to the availability of RGB satellite imagery, the accuracy of mutual orientation of the aerial and satellite data, and the imperfection of archival aerial photographs. It should therefore be expected that the results of coloring will not be perfect compared to the results of the fusion of recent data with a similar ground sampling resolution, but they will still allow a more accurate and efficient classification of the land cover registered on archival aerial photographs.
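A minimal NumPy sketch of the Brovey transform used in this kind of coloring, assuming the panchromatic photo and the RGB satellite bands are already co-registered and resampled to a common grid (array names are illustrative):

```python
import numpy as np

def brovey(pan, rgb, eps=1e-6):
    """Brovey pansharpening: each RGB band is scaled by the ratio of
    the panchromatic image to the sum of the RGB bands.

    pan : (H, W) panchromatic image, float.
    rgb : (H, W, 3) multispectral image, float, same grid as pan.
    """
    intensity = rgb.sum(axis=2) + eps  # eps avoids division by zero
    return rgb * (pan / intensity)[..., None]

pan = np.random.rand(100, 100)
rgb = np.random.rand(100, 100, 3)
coloured = brovey(pan, rgb)  # (100, 100, 3), pan-sharpened colours
```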
Light Weight MP3 Watermarking Method for Mobile Terminals
NASA Astrophysics Data System (ADS)
Takagi, Koichi; Sakazawa, Shigeyuki; Takishima, Yasuhiro
This paper proposes a novel MP3 watermarking method which is applicable to a mobile terminal with limited computational resources. Considering that in most cases the embedded information is copyright information or metadata, which should be extracted before playing back audio contents, the watermark detection process should be executed at high speed. However, when conventional methods are used with a mobile terminal, it takes a considerable amount of time to detect a digital watermark. This paper focuses on scalefactor manipulation to enable high speed watermark embedding/detection for MP3 audio and also proposes the manipulation method which minimizes audio quality degradation adaptively. Evaluation tests showed that the proposed method is capable of embedding 3 bits/frame information without degrading audio quality and detecting it at very high speed. Finally, this paper describes application examples for authentication with a digital signature.
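As a hedged illustration of scalefactor-based embedding in general (a toy on integer arrays, not the paper's exact manipulation and not a real MP3 bitstream parser), one bit per frame can be hidden in the parity of a chosen scalefactor, so detection needs only partial bitstream parsing rather than full decoding:

```python
def embed(scalefactors, bits, index=0):
    """scalefactors: list of per-frame integer scalefactor lists."""
    out = [list(frame) for frame in scalefactors]
    for frame, bit in zip(out, bits):
        # Force the parity of one scalefactor to encode the bit.
        if frame[index] % 2 != bit:
            frame[index] += 1  # a +-1 step keeps the audible change small
    return out

def detect(scalefactors, n_bits, index=0):
    return [frame[index] % 2 for frame in scalefactors[:n_bits]]

frames = [[10, 7, 3], [4, 9, 2], [6, 5, 8]]
marked = embed(frames, [1, 0, 1])
assert detect(marked, 3) == [1, 0, 1]
```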
Water Quality Monitoring in Developing Countries; Can Microbial Fuel Cells be the Answer?
Chouler, Jon; Di Lorenzo, Mirella
2015-01-01
The provision of safe water and adequate sanitation in developing countries is a must. A range of chemical and biological methods are currently used to ensure the safety of water for consumption. These methods however suffer from high costs, complexity of use and inability to function onsite and in real time. The microbial fuel cell (MFC) technology has great potential for the rapid and simple testing of the quality of water sources. MFCs have the advantages of high simplicity and possibility for onsite and real time monitoring. Depending on the choice of manufacturing materials, this technology can also be highly cost effective. This review covers the state-of-the-art research on MFC sensors for water quality monitoring, and explores enabling factors for their use in developing countries. PMID:26193327
Water Quality Monitoring in Developing Countries; Can Microbial Fuel Cells be the Answer?
Chouler, Jon; Di Lorenzo, Mirella
2015-07-16
The provision of safe water and adequate sanitation in developing countries is a must. A range of chemical and biological methods are currently used to ensure the safety of water for consumption. These methods however suffer from high costs, complexity of use and inability to function onsite and in real time. The microbial fuel cell (MFC) technology has great potential for the rapid and simple testing of the quality of water sources. MFCs have the advantages of high simplicity and possibility for onsite and real time monitoring. Depending on the choice of manufacturing materials, this technology can also be highly cost effective. This review covers the state-of-the-art research on MFC sensors for water quality monitoring, and explores enabling factors for their use in developing countries.
Methods for slow axis beam quality improvement of high power broad area diode lasers
NASA Astrophysics Data System (ADS)
An, Haiyan; Xiong, Yihan; Jiang, Ching-Long J.; Schmidt, Berthold; Treusch, Georg
2014-03-01
For high brightness direct diode laser systems, it is of fundamental importance to improve the slow axis beam quality of the incorporated laser diodes regardless of what beam combining technology is applied. To further advance our products in terms of increased brightness at a high power level, we must optimize the slow axis beam quality despite the far field blooming at high current levels. The latter is caused predominantly by the built-in index step in combination with the thermal lens effect. Most of the methods for beam quality improvement reported in publications sacrifice device efficiency and reliable output power. In order to improve the beam quality as well as maintain the efficiency and reliable output power, we investigated methods of influencing local heat generation to reduce the thermal gradient across the slow axis direction, optimizing the built-in index step, and discriminating against high order modes. Based on our findings, we have combined different methods in our new device design. Subsequently, the beam parameter product (BPP) of a 10% fill factor bar has improved by approximately 30% at 7 W/emitter without an efficiency penalty. This technology has enabled fiber-coupled high brightness multi-kilowatt direct diode laser systems. In this paper, we will elaborate on the methods used as well as the results achieved.
Robust synthesis and continuous manufacturing of carbon nanotube forests and graphene films
NASA Astrophysics Data System (ADS)
Polsen, Erik S.
Successful translation of the outstanding properties of carbon nanotubes (CNTs) and graphene to commercial applications requires highly consistent methods of synthesis, using scalable and cost-effective machines. This thesis presents robust process conditions and a series of process operations that will enable integrated roll-to-roll (R2R) CNT and graphene growth on flexible substrates. First, a comprehensive study was undertaken to establish the sources of variation in laboratory CVD growth of CNT forests. Statistical analysis identified factors that contribute to variation in forest height and density, including ambient humidity, sample position in the reactor, and barometric pressure. Implementation of system modifications and user procedures reduced the variation in height and density by 50% and 54%, respectively. With improved growth, two new methods for continuous deposition and patterning of catalyst nanoparticles for CNT forest growth were developed, enabling the diameter, density, and pattern geometry to be tailored through the control of process parameters. Convective assembly of catalyst nanoparticles in solution enables growth of CNT forests with density 3-fold higher than using sputtered catalyst films with the same growth parameters. Additionally, laser printing of magnetic ink character recognition toner provides a large scale patterning method, with digital control of the pattern density and tunable CNT density via laser intensity. A concentric tube CVD reactor was conceptualized, designed, and built for R2R growth of CNT forests and graphene on flexible substrates helically fed through the annular gap. The design enables downstream injection of the hydrocarbon source, and gas consumption is reduced by 90% compared to a standard tube furnace. Multi-wall CNT forests are grown continuously on metallic and ceramic fiber substrates at 33 mm/min. High quality, uniform bi- and multi-layer graphene is grown on Cu and Ni foils at 25-495 mm/min. A second machine for continuous forest growth and delamination was developed, and forest-substrate adhesion strength was controlled through CVD parameters. Taken together, these methods enable uniform R2R processing of CNT forests and graphene with engineered properties. Last, it is projected that foreseeable improvements in CNT forest quality and density using these methods will result in electrical and thermal properties that exceed state-of-the-art bulk materials.
Time Together: A nursing intervention in psychiatric inpatient care: Feasibility and effects.
Molin, Jenny; Lindgren, Britt-Marie; Graneheim, Ulla Hällgren; Ringnér, Anders
2018-04-25
The facilitation of quality time between patients and staff in psychiatric inpatient care is useful to promote recovery and reduce stress experienced by staff. However, interventions are reported to be complex to implement and are poorly described in the literature. This multisite study aimed to evaluate the feasibility and effects of the nursing intervention Time Together, using mixed methods. Data consisted of notes from participant observations and logs to evaluate feasibility, and questionnaires to evaluate effects. The primary outcome for patients was quality of interactions, and for staff, it was perceived stress. The secondary outcome for patients was anxiety and depression symptom levels, and for staff, it was stress of conscience. Data were analysed using visual analysis, percentage of nonoverlapping data, and qualitative content analysis. The results showed that Time Together was a feasible intervention, but measurements showed no effects on the two patient outcomes, quality of interactions and anxiety and depressive symptoms, and questionable effects on perceived stress and stress of conscience among staff. Shared responsibility, a friendly approach, and a predictable structure enabled Time Together, while a distant approach and an unpredictable structure hindered the intervention. In conclusion, the intervention proved to be feasible, with potential to enable quality interactions between patients and staff using the enabling factors as supportive components. It also had some effects on perceived stress and stress of conscience among staff. Further evaluation is needed to build on the evidence for the intervention. © 2018 Australian College of Mental Health Nurses Inc.
Research Committee Issues Brief: Professional Development for Virtual Schooling and Online Learning
ERIC Educational Resources Information Center
Davis, Niki; Rose, Ray
2007-01-01
This report examines the types of professional development necessary to implement successful online learning initiatives. The potential for schools utilizing online learning is tremendous: schools can develop new distribution methods to enable equity and access for all students, they can provide high quality content for all students and they can…
USDA-ARS's Scientific Manuscript database
The amount of secondary cell wall (SCW) cellulose in the fiber affects the quality and commercial value of cotton. Accurate assessments of SCW cellulose are essential for improving cotton fibers. Fourier Transform Infrared (FT-IR) spectroscopy enables distinguishing SCW from other cell wall componen...
Precision production: enabling deterministic throughput for precision aspheres with MRF
NASA Astrophysics Data System (ADS)
Maloney, Chris; Entezarian, Navid; Dumas, Paul
2017-10-01
Aspherical lenses offer advantages over spherical optics by improving image quality or reducing the number of elements necessary in an optical system. Aspheres are no longer being used exclusively by high-end optical systems but are now replacing spherical optics in many applications. The need for a method of production-manufacturing of precision aspheres has emerged and is part of the reason that the optics industry is shifting away from artisan-based techniques towards more deterministic methods. Not only does Magnetorheological Finishing (MRF) empower deterministic figure correction for the most demanding aspheres but it also enables deterministic and efficient throughput for series production of aspheres. The Q-flex MRF platform is designed to support batch production in a simple and user friendly manner. Thorlabs routinely utilizes the advancements of this platform and has provided results from using MRF to finish a batch of aspheres as a case study. We have developed an analysis notebook to evaluate necessary specifications for implementing quality control metrics. MRF brings confidence to optical manufacturing by ensuring high throughput for batch processing of aspheres.
Coater/developer based techniques to improve high-resolution EUV patterning defectivity
NASA Astrophysics Data System (ADS)
Hontake, Koichi; Huli, Lior; Lemley, Corey; Hetzer, Dave; Liu, Eric; Ko, Akiteru; Kawakami, Shinichiro; Shimoaoki, Takeshi; Hashimoto, Yusaku; Tanaka, Koichiro; Petrillo, Karen; Meli, Luciana; De Silva, Anuja; Xu, Yongan; Felix, Nelson; Johnson, Richard; Murray, Cody; Hubbard, Alex
2017-10-01
Extreme ultraviolet lithography (EUVL) technology is one of the leading candidates under consideration for enabling the next generation of devices, for 7nm node and beyond. As the focus shifts to driving down the 'effective' k1 factor and enabling the full scaling entitlement of EUV patterning, new techniques and methods must be developed to reduce the overall defectivity, mitigate pattern collapse, and eliminate film-related defects. In addition, CD uniformity and LWR/LER must be improved in terms of patterning performance. Tokyo Electron Limited (TEL™) and IBM Corporation are continuously developing manufacturing quality processes for EUV. In this paper, we review the ongoing progress in coater/developer based processes (coating, developing, baking) that are required to enable EUV patterning.
Shao, Xu; Milner, Ben
2005-08-01
This work proposes a method to reconstruct an acoustic speech signal solely from a stream of mel-frequency cepstral coefficients (MFCCs) as may be encountered in a distributed speech recognition (DSR) system. Previous methods for speech reconstruction have required, in addition to the MFCC vectors, fundamental frequency and voicing components. In this work the voicing classification and fundamental frequency are predicted from the MFCC vectors themselves using two maximum a posteriori (MAP) methods. The first method enables fundamental frequency prediction by modeling the joint density of MFCCs and fundamental frequency using a single Gaussian mixture model (GMM). The second scheme uses a set of hidden Markov models (HMMs) to link together a set of state-dependent GMMs, which enables a more localized modeling of the joint density of MFCCs and fundamental frequency. Experimental results on speaker-independent male and female speech show that accurate voicing classification and fundamental frequency prediction is attained when compared to hand-corrected reference fundamental frequency measurements. The use of the predicted fundamental frequency and voicing for speech reconstruction is shown to give very similar speech quality to that obtained using the reference fundamental frequency and voicing.
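A minimal sketch of prediction from a joint GMM over (MFCC, F0), using the posterior-weighted conditional mean (standard GMM regression under stated assumptions); the paper's MAP variant may differ, and all parameter names are illustrative:

```python
import numpy as np
from scipy.stats import multivariate_normal

def predict_f0(x, weights, means, covs, d):
    """x: MFCC vector of length d; means: (K, d+1) over [MFCC, F0];
    covs: (K, d+1, d+1) full covariances; weights: (K,) priors."""
    post, cond = [], []
    for w, mu, S in zip(weights, means, covs):
        mu_x, mu_y = mu[:d], mu[d]
        S_xx, S_yx = S[:d, :d], S[d, :d]
        # Posterior responsibility of this component given the MFCCs.
        post.append(w * multivariate_normal.pdf(x, mu_x, S_xx))
        # Conditional mean of F0 given the MFCCs for this component.
        cond.append(mu_y + S_yx @ np.linalg.solve(S_xx, x - mu_x))
    post = np.asarray(post) / np.sum(post)
    return float(post @ np.asarray(cond))
```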
Rapid Flow-Based Peptide Synthesis
Simon, Mark D.; Heider, Patrick L.; Adamo, Andrea; Vinogradov, Alexander A.; Mong, Surin K.; Li, Xiyuan; Berger, Tatiana; Policarpo, Rocco L.; Zhang, Chi; Zou, Yekui; Liao, Xiaoli; Spokoyny, Alexander M.; Jensen, Klavs F.
2014-01-01
A flow-based solid phase peptide synthesis methodology that enables the incorporation of an amino acid residue every 1.8 minutes under automatic control, or every three minutes under manual control, is described. This is accomplished by passing a stream of reagent through a heat exchanger, into a low volume, low backpressure reaction vessel, and through a UV detector. These features enable the continuous delivery of heated solvents and reagents to the solid support at high flow rate, maintaining a maximal concentration of reagents in the reaction vessel, quickly exchanging reagents, and eliminating the need to rapidly heat reagents after they have been added to the vessel. The UV detector enables continuous monitoring of the process. To demonstrate the broad applicability and reliability of this method, it was employed in the total synthesis of a small protein, as well as dozens of peptides. The quality of the material obtained with this method is comparable to traditional batch methods, and, in all cases, the desired material was readily purifiable via RP-HPLC. The application of this method to the synthesis of the 113 residue B. amyloliquefaciens RNase and the 130 residue pE59 DARPin is described in the accompanying manuscript. PMID:24616230
Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen
2015-11-10
An innovative combination of green chemistry and a quality by design (QbD) approach is presented through the development of an UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impacts and analyst exposure. This analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as functions of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots enabled the design space (DS) (method operable design region), where all CQAs fulfilled the requirements, to be established. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore, no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
Ibrahim, Mohamed; Wickenhauser, Patrick; Rautek, Peter; Reina, Guido; Hadwiger, Markus
2018-01-01
Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.
Ranking Reputation and Quality in Online Rating Systems
Liao, Hao; Zeng, An; Xiao, Rui; Ren, Zhuo-Ming; Chen, Duan-Bing; Zhang, Yi-Cheng
2014-01-01
How to design an accurate and robust ranking algorithm is a fundamental problem with wide applications in many real systems. It is especially significant in online rating systems due to the existence of some spammers. In the literature, many well-performed iterative ranking methods have been proposed. These methods can effectively recognize the unreliable users and reduce their weight in judging the quality of objects, and finally lead to a more accurate evaluation of the online products. In this paper, we design an iterative ranking method with high performance in both accuracy and robustness. More specifically, a reputation redistribution process is introduced to enhance the influence of highly reputed users and two penalty factors enable the algorithm resistance to malicious behaviors. Validation of our method is performed in both artificial and real user-object bipartite networks. PMID:24819119
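A hedged sketch of an iterative ranking loop in the spirit described, where object quality is the reputation-weighted mean rating and a user's reputation shrinks with disagreement from the current quality estimates; the update rules are illustrative, not the paper's exact algorithm:

```python
import numpy as np

def iterative_ranking(R, n_iter=50):
    """R: (n_users, n_objects) rating matrix, NaN where unrated.
    Assumes every object has at least one rating and every user
    has rated at least one object."""
    rated = ~np.isnan(R)
    rep = np.ones(R.shape[0])  # start with uniform reputations
    for _ in range(n_iter):
        W = rep[:, None] * rated
        # Object quality: reputation-weighted mean of its ratings.
        quality = (np.where(rated, R, 0) * rep[:, None]).sum(axis=0) \
                  / W.sum(axis=0)
        # Mean squared disagreement of each user with current quality.
        err = np.where(rated, (R - quality) ** 2, np.nan)
        rep = 1.0 / (np.nanmean(err, axis=1) + 1e-6)
        rep /= rep.max()  # keep reputations on a comparable scale
    return quality, rep
```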
Ruszczyńska, A; Szteyn, J; Wiszniewska-Laszczych, A
2007-01-01
Producing dairy products which are safe for consumers requires the constant monitoring of the microbiological quality of the raw material, the production process itself, and the end product. Traditional methods, still the "gold standard", require a specialized laboratory working with recognized and validated methods. Obtaining results is time- and labor-consuming and does not allow rapid evaluation. Hence, there is a need for a rapid, precise method enabling the real-time monitoring of microbiological quality, and flow cytometry serves this function well. It is based on labeling cells suspended in a solution with fluorescent dyes and pumping them into a measurement zone where they are exposed to a precisely focused laser beam. This paper is aimed at presenting the possibilities of applying flow cytometry in the dairy industry.
van Uem, Janet M.T.; Isaacs, Tom; Lewin, Alan; Bresolin, Eros; Salkovic, Dina; Espay, Alberto J.; Matthews, Helen; Maetzler, Walter
2016-01-01
In this viewpoint, we discuss how several aspects of Parkinson's disease (PD), known to be correlated with wellbeing and health-related quality of life, could be measured using wearable devices ('wearables'). Moreover, three people with PD (PwP) having exhaustive experience with using such devices write about their personal understanding of wellbeing and health-related quality of life, building a bridge between the true needs defined by PwP and the available methods of data collection. Rapidly evolving new technologies yield wearables that probe function and behaviour in the domestic environments of people with chronic conditions such as PD and have the potential to serve their needs. Gathered data can serve to inform patient-driven management changes, enabling greater control by PwP and enhancing the likelihood of improvements in wellbeing and health-related quality of life. Data can also be used to quantify wellbeing and health-related quality of life. Additionally, these techniques can uncover novel, more sensitive, and more ecologically valid disease-related endpoints. Active involvement of PwP in data collection and interpretation stands to provide personally and clinically meaningful endpoints and milestones to inform advances in research and the relevance of translational efforts in PD. PMID:27003779
Bergholz, W
2008-11-01
In many high-tech industries, quality management (QM) has enabled improvements of quality by a factor of 100 or more, in combination with significant cost reductions. Compared to this, the application of QM methods in health care is in its initial stages. It is anticipated that stringent process management, embedded in an effective QM system will lead to significant improvements in health care in general and in the German public health service in particular. Process management is an ideal platform for controlling in the health care sector, and it will significantly improve the leverage of controlling to bring down costs. Best practice sharing in industry has led to quantum leap improvements. Process management will enable best practice sharing also in the public health service, in spite of the highly diverse portfolio of services that the public health service offers in different German regions. Finally, it is emphasised that "technical" QM, e.g., on the basis of the ISO 9001 standard is not sufficient to reach excellence. It is necessary to integrate soft factors, such as patient or employee satisfaction, and leadership quality into the system. The EFQM model for excellence can serve as proven tool to reach this goal.
Full-frame video stabilization with motion inpainting.
Matsushita, Yasuyuki; Ofek, Eyal; Ge, Weina; Tang, Xiaoou; Shum, Heung-Yeung
2006-07-01
Video stabilization is an important video enhancement technology which aims at removing annoying shaky motion from videos. We propose a practical and robust approach of video stabilization that produces full-frame stabilized videos with good visual quality. While most previous methods end up with producing smaller size stabilized videos, our completion method can produce full-frame videos by naturally filling in missing image parts by locally aligning image data of neighboring frames. To achieve this, motion inpainting is proposed to enforce spatial and temporal consistency of the completion in both static and dynamic image areas. In addition, image quality in the stabilized video is enhanced with a new practical deblurring algorithm. Instead of estimating point spread functions, our method transfers and interpolates sharper image pixels of neighboring frames to increase the sharpness of the frame. The proposed video completion and deblurring methods enabled us to develop a complete video stabilizer which can naturally keep the original image quality in the stabilized videos. The effectiveness of our method is confirmed by extensive experiments over a wide variety of videos.
Fast and accurate de novo genome assembly from long uncorrected reads
Vaser, Robert; Sović, Ivan; Nagarajan, Niranjan
2017-01-01
The assembly of long reads from Pacific Biosciences and Oxford Nanopore Technologies typically requires resource-intensive error-correction and consensus-generation steps to obtain high-quality assemblies. We show that the error-correction step can be omitted and that high-quality consensus sequences can be generated efficiently with a SIMD-accelerated, partial-order alignment–based, stand-alone consensus module called Racon. Based on tests with PacBio and Oxford Nanopore data sets, we show that Racon coupled with miniasm enables consensus genomes with similar or better quality than state-of-the-art methods while being an order of magnitude faster. PMID:28100585
Adaptive single-pixel imaging with aggregated sampling and continuous differential measurements
NASA Astrophysics Data System (ADS)
Huo, Yaoran; He, Hongjie; Chen, Fan; Tai, Heng-Ming
2018-06-01
This paper proposes an adaptive compressive imaging technique with a single-pixel detector and a single arm. The aggregated sampling (AS) method enables the reduction of the resolution of the reconstructed images, aiming to reduce time and space consumption. A target image with a resolution of up to 1024 × 1024 can be reconstructed successfully at a 20% sampling rate. The continuous differential measurement (CDM) method combined with a ratio factor of significant coefficient (RFSC) improves the imaging quality. Moreover, RFSC reduces human intervention in parameter setting. This technique enhances the practicability of single-pixel imaging through less time and space consumption, better imaging quality, and less human intervention.
Interactive visual exploration and refinement of cluster assignments.
Kern, Michael; Lex, Alexander; Gehlenborg, Nils; Johnson, Chris R
2017-09-12
With ever-increasing amounts of data produced in biology research, scientists are in need of efficient data analysis methods. Cluster analysis, combined with visualization of the results, is one such method that can be used to make sense of large data volumes. At the same time, cluster analysis is known to be imperfect and depends on the choice of algorithms, parameters, and distance measures. Most clustering algorithms don't properly account for ambiguity in the source data, as records are often assigned to discrete clusters, even if an assignment is unclear. While there are metrics and visualization techniques that allow analysts to compare clusterings or to judge cluster quality, there is no comprehensive method that allows analysts to evaluate, compare, and refine cluster assignments based on the source data, derived scores, and contextual data. In this paper, we introduce a method that explicitly visualizes the quality of cluster assignments, allows comparisons of clustering results and enables analysts to manually curate and refine cluster assignments. Our methods are applicable to matrix data clustered with partitional, hierarchical, and fuzzy clustering algorithms. Furthermore, we enable analysts to explore clustering results in context of other data, for example, to observe whether a clustering of genomic data results in a meaningful differentiation in phenotypes. Our methods are integrated into Caleydo StratomeX, a popular, web-based, disease subtype analysis tool. We show in a usage scenario that our approach can reveal ambiguities in cluster assignments and produce improved clusterings that better differentiate genotypes and phenotypes.
X-ray Computed Microtomography technique applied for cementitious materials: A review.
da Silva, Ítalo Batista
2018-04-01
The main objective of this article is to present a bibliographical review of the use of the X-ray microtomography method in 3D image processing of the microstructure of cementitious materials, analyzing the pore microstructure and connectivity network and enabling a relationship to be built between permeability and porosity. The use of this technique enables the understanding of the physical, chemical, and mechanical properties of cementitious materials, with good published results, considering that the quality and quantity of accessible information were significant and may contribute to the study of the development of cementitious materials. Copyright © 2018 Elsevier Ltd. All rights reserved.
The river absorption capacity determination as a tool to evaluate state of surface water
NASA Astrophysics Data System (ADS)
Wilk, Paweł; Orlińska-Woźniak, Paulina; Gębala, Joanna
2018-02-01
In order to complete a thorough and systematic assessment of water quality, it is useful to measure the absorption capacity of a river. Absorption capacity is understood as the pollution load introduced into river water that will not cause permanent and irreversible changes in the aquatic ecosystem and will not cause a change in the classification of water quality in the river profile. In order to implement the method, the Macromodel DNS/SWAT was applied to the Middle Warta pilot basin (central Poland) to simulate nutrient loads. This enabled detailed analysis of water quality in each water body and assessment of the absorption capacity parameter, which allows the determination of how much pollution can be added to the river without compromising its quality class. Positive values of the calculated absorption capacity parameter mean that the ecosystem can eliminate the introduced pollution loads through a number of self-purification processes. Negative values indicate that the load limit has been exceeded, and too much pollution has been introduced into the ecosystem for it to cope through self-purification. Absorption capacity thus enables the connection of environmental standards of water quality with the water quality management plans intended to meet those standards.
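A minimal sketch of the absorption-capacity calculation under stated assumptions; the limit concentration, flow, and loads below are illustrative values, not results from the study:

```python
# Capacity at a river profile = permissible load for the target quality
# class minus the simulated load; positive means headroom, negative
# means the self-purification capacity is exceeded.
def absorption_capacity(permissible_load, simulated_load):
    return permissible_load - simulated_load

# Permissible load from the class limit concentration and flow:
# load [kg/day] = limit [mg/L] * flow [m3/s] * 86.4
limit_mg_l, flow_m3_s = 5.0, 12.0          # e.g. a total nitrogen limit
permissible = limit_mg_l * flow_m3_s * 86.4
print(absorption_capacity(permissible, simulated_load=6200.0))
```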
NASA Astrophysics Data System (ADS)
Wierzbicki, Damian; Fryskowska, Anna; Kedzierski, Michal; Wojtkowska, Michalina; Delis, Paulina
2018-01-01
Unmanned aerial vehicles are suited to various photogrammetry and remote sensing missions. Such platforms are equipped with various optoelectronic sensors imaging in the visible and infrared spectral ranges, as well as thermal sensors. Nowadays, near-infrared (NIR) images acquired from low altitudes are often used for producing orthophoto maps, for precision agriculture among other applications. One major problem results from the use of low-cost, compact, custom NIR cameras whose wide-angle lenses introduce vignetting. In numerous cases, such cameras acquire images of low radiometric quality, depending on the lighting conditions. The paper presents a method of radiometric quality assessment of low-altitude NIR imagery data from a custom sensor. The method utilizes statistical analysis of the NIR images. The data used for the analyses were acquired from various altitudes in various weather and lighting conditions. An objective NIR imagery quality index was determined as a result of the research. The results obtained using this index enabled the classification of images into three categories: good, medium, and low radiometric quality. The classification makes it possible to determine the a priori error of the acquired images and to assess whether a rerun of the photogrammetric flight is necessary.
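The abstract does not give the index's formula, so the following sketch only illustrates the general idea of an image-statistics quality index with three classes; the features, weights, and thresholds are invented for the example and are not the authors' index.

```python
import numpy as np

def radiometric_quality_index(img):
    """Illustrative image-statistics index (not the paper's exact formula):
    a weighted mix of global contrast and mean gradient magnitude,
    each roughly normalised to [0, 1] for 8-bit imagery."""
    contrast = img.std() / 127.5
    gy, gx = np.gradient(img.astype(float))
    sharpness = np.hypot(gx, gy).mean() / 255.0
    return 0.5 * contrast + 0.5 * sharpness

def quality_class(q, good=0.15, medium=0.07):
    """Hypothetical thresholds splitting images into the three categories."""
    return "good" if q >= good else "medium" if q >= medium else "low"

img = np.random.default_rng(0).integers(60, 200, size=(512, 512)).astype(np.uint8)
q = radiometric_quality_index(img)
print(f"index = {q:.3f}, class = {quality_class(q)}")
```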
Wavefront measurement using computational adaptive optics.
South, Fredrick A; Liu, Yuan-Zhi; Bower, Andrew J; Xu, Yang; Carney, P Scott; Boppart, Stephen A
2018-03-01
In many optical imaging applications, it is necessary to correct for aberrations to obtain high quality images. Optical coherence tomography (OCT) provides access to the amplitude and phase of the backscattered optical field for three-dimensional (3D) imaging of samples. Computational adaptive optics (CAO) modifies the phase of the OCT data in the spatial frequency domain to correct optical aberrations without using a deformable mirror, as is commonly done in hardware-based adaptive optics (AO). This provides improvement of image quality throughout the 3D volume, enabling imaging across greater depth ranges and in highly aberrated samples. However, the CAO aberration correction has a complicated relation to the imaging pupil and is not a direct measurement of the pupil aberrations. Here we present new methods for recovering the wavefront aberrations directly from the OCT data without the use of hardware adaptive optics. This enables both computational measurement and correction of optical aberrations.
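The sketch below shows the generic CAO operation described above: multiplying the spatial-frequency spectrum of a complex en-face OCT field by a corrective phase. The toy field and the quadratic (defocus-like) phase are assumptions for illustration, not the paper's recovered wavefronts.

```python
import numpy as np

def apply_cao_correction(field, phase):
    """Computational aberration correction of one en-face OCT plane:
    multiply the 2D spectrum of the complex field by the conjugate of the
    estimated pupil phase error (in radians), then transform back."""
    spectrum = np.fft.fft2(field)
    return np.fft.ifft2(spectrum * np.exp(-1j * phase))

# Toy complex field and a defocus-like quadratic phase over the spectrum.
n = 64
field = np.random.default_rng(0).normal(size=(n, n)) * np.exp(1j * 0.1)
fx = np.fft.fftfreq(n)
phase = 4.0 * np.pi * (fx[:, None] ** 2 + fx[None, :] ** 2)
corrected = apply_cao_correction(field, phase)
print(corrected.shape)  # (64, 64) complex image
```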
Electrically Injected UV-Visible Nanowire Lasers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, George T.; Li, Changyi; Li, Qiming
2015-09-01
There is strong interest in minimizing the volume of lasers to enable ultracompact, low-power, coherent light sources. Nanowires represent an ideal candidate for such nanolasers as stand-alone optical cavities and gain media, and optically pumped nanowire lasing has been demonstrated in several semiconductor systems. Electrically injected nanowire lasers are needed to realize actual working devices but have been elusive due to limitations of current methods to address the requirement for nanowire device heterostructures with high material quality, controlled doping and geometry, low optical loss, and efficient carrier injection. In this project we proposed to demonstrate electrically injected single nanowire lasers emitting in the important UV to visible wavelengths. Our approach to simultaneously address these challenges is based on high quality III-nitride nanowire device heterostructures with precisely controlled geometries and strong gain and mode confinement to minimize lasing thresholds, enabled by a unique top-down nanowire fabrication technique.
Design and simulation of a sensor for heliostat field closed loop control
NASA Astrophysics Data System (ADS)
Collins, Mike; Potter, Daniel; Burton, Alex
2017-06-01
Significant research has been completed in pursuit of capital cost reductions for heliostats [1],[2]. The camera array closed loop control concept has the potential to radically alter the way heliostats are controlled and installed, by replacing high quality open loop targeting systems with low quality targeting devices that rely on measurement of image position to remove tracking errors during operation. Although the system could be used for any heliostat size, it particularly benefits small heliostats by reducing actuation costs, enabling large numbers of heliostats to be calibrated simultaneously, and enabling calibration of heliostats that produce low irradiance on Lambertian calibration targets (images similar to or dimmer than ambient light), such as small heliostats that are far from the tower. A simulation method for the camera array has been designed and verified experimentally. The simulation tool demonstrates that closed loop calibration or control is possible using this device.
Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F
2017-11-01
Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated by a hybrid target-capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). The rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. Conventional total QCs were inappropriate for determining optimal method conditions, in contrast to free/active QCs. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.
A portable borehole temperature logging system using the four-wire resistance method
NASA Astrophysics Data System (ADS)
Erkan, Kamil; Akkoyunlu, Bülent; Balkan, Elif; Tayanç, Mete
2017-12-01
High-quality temperature-depth information from boreholes with a depth of 100 m or more is used in geothermal studies and in studies of climate change. Electrical wireline tools with thermistor sensors are capable of measuring borehole temperatures with millikelvin resolution. The use of a surface readout mode allows analysis of the thermally conductive state of a borehole, which is especially important for climatic and regional heat flow studies. In this study we describe the design of a portable temperature logging tool that uses the four-wire resistance measurement method. The four-wire method enables the elimination of cable resistance effects, thus allowing millikelvin resolution of temperature data at depth. A preliminary two-wire model of the system is also described. The portability of the tool enables one to collect data from boreholes down to 300 m, even in locations with limited accessibility.
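A minimal sketch of the measurement principle (not the tool's actual electronics): it assumes a known source current and a generic 10 kOhm thermistor with textbook Steinhart-Hart coefficients, since the abstract does not give the sensor's constants.

```python
import math

def four_wire_resistance(v_sense, i_source):
    """Four-wire measurement: a known current is forced through two leads
    while voltage is sensed through the other two; since the sense leads
    carry (almost) no current, cable resistance drops out of R = V / I."""
    return v_sense / i_source

def thermistor_temp_c(r_ohm, a=1.129e-3, b=2.341e-4, c=8.775e-8):
    """Steinhart-Hart conversion from thermistor resistance to Celsius,
    with generic 10 kOhm NTC coefficients (illustrative only)."""
    ln_r = math.log(r_ohm)
    return 1.0 / (a + b * ln_r + c * ln_r ** 3) - 273.15

r = four_wire_resistance(v_sense=0.9985, i_source=1.0e-4)  # ~9985 ohm
print(f"{thermistor_temp_c(r):.3f} C")  # ~25 C
```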
Whole mount nuclear fluorescent imaging: convenient documentation of embryo morphology
Sandell, Lisa L.; Kurosaka, Hiroshi; Trainor, Paul A.
2012-01-01
Here we describe a relatively inexpensive and easy method to produce high quality images that reveal fine topological details of vertebrate embryonic structures. The method relies on nuclear staining of whole mount embryos in combination with confocal microscopy or conventional widefield fluorescent microscopy. In cases where confocal microscopy is used in combination with whole mount nuclear staining, the resulting embryo images can rival the clarity and resolution of images of similar specimens produced by Scanning Electron Microscopy (SEM). The fluorescent nuclear staining may be performed with a variety of cell permeable nuclear dyes, enabling the technique to be performed with multiple standard microscope/illumination or confocal/laser systems. The method may be used to document morphology of embryos of a variety of organisms, as well as individual organs and tissues. Nuclear stain imaging imposes minimal impact on embryonic specimens, enabling imaged specimens to be utilized for additional assays.
Accounting for the costs of quality.
Suver, J D; Neumann, B R; Boles, K E
1992-09-01
Total quality management (TQM) represents a paradigm shift in the organizational values that shape every aspect of a healthcare provider's activities. The TQM approach to quality management subscribes to the theory that it is not the work of employees of an organization that leads to poor quality; rather, it is the poor design of systems and procedures. In a book recently published by HFMA, Management Accounting for Healthcare Organizations, third edition, authors Suver, Neumann and Boles point out that the changes in behavioral focus and organizational climate brought about by TQM will have a major impact on the management accounting function in healthcare organizations. TQM will require new methods of accounting that will enable the effects of declining quality to be recognized and evaluated. It also will require new types of management accounting reports that will identify opportunities for quality improvement and will monitor the effectiveness of quality management endeavors. The following article has been adapted from the book cited above.
Enabling Transformative Learning in the Workplace: An Educative Research Intervention
ERIC Educational Resources Information Center
Wilhelmson, Lena; Åberg, Marie Moström; Backström, Tomas; Olsson, Bengt Köping
2015-01-01
The aim of this article is to discuss the potential of an educative research intervention to influence the quality of the learning outcome in the workplace as interpreted from the perspectives of adult learning theory. The research project was designed as a quasi-experimental, mixed-methods study. In this article, quantitative survey data were…
Teaching Data Analysis with Interactive Visual Narratives
ERIC Educational Resources Information Center
Saundage, Dilal; Cybulski, Jacob L.; Keller, Susan; Dharmasena, Lasitha
2016-01-01
Data analysis is a major part of business analytics (BA), which refers to the skills, methods, and technologies that enable managers to make swift, quality decisions based on large amounts of data. BA has become a major component of Information Systems (IS) courses all over the world. The challenge for IS educators is to teach data analysis--the…
ERIC Educational Resources Information Center
Welfare, Rhonda Marie
2013-01-01
In an effort to increase the quantity and quality of available teachers, states have begun to offer alternate methods of teacher certification. This means that in addition to traditional teacher training, which involves graduation from an accredited teacher-education institution, states provide alternate routes to enable teachers to transition to…
Burrows, T; Golley, R K; Khambalia, A; McNaughton, S A; Magarey, A; Rosenkranz, R R; Allman-Farinelli, M; Rangan, A M; Truby, H; Collins, C
2012-12-01
Assessing dietary intake is important in evaluating the effectiveness of childhood obesity interventions. The purpose of this review was to evaluate the dietary intake methods and their reporting in intervention studies that included a dietary component to treat overweight or obese children. A systematic review was conducted of English-language studies published between 1985 and August 2010 in health databases. The search identified 2,295 papers, of which 335 were retrieved and 31 met the inclusion criteria. Twenty-three studies reported energy intake as an outcome measure, 20 reported macronutrient intakes and 10 studies reported food intake outcomes. The most common dietary method employed was the food diary (n = 13), followed by 24-h recall (n = 5), food frequency questionnaire (FFQ) (n = 4) and dietary questionnaire (n = 4). The quality of dietary intake methods reporting was rated as 'poor' in 15 studies (52%), and only 3 were rated as 'excellent'. The reporting quality of FFQs tended to be higher than that of food diaries/recalls. Deficiencies in the quality of dietary intake methods reporting in child obesity studies were identified. Use of a dietary intake methods reporting checklist is recommended. This will enable the quality of dietary intake results to be evaluated and increase the ability of other researchers to replicate study methodology.
Golberg, Alexander; Linshiz, Gregory; Kravets, Ilia; Stawski, Nina; Hillson, Nathan J; Yarmush, Martin L; Marks, Robert S; Konry, Tania
2014-01-01
We report an all-in-one platform - ScanDrop - for the rapid and specific capture, detection, and identification of bacteria in drinking water. The ScanDrop platform integrates droplet microfluidics, a portable imaging system, and cloud-based control software and data storage. The cloud-based control software and data storage enables robotic image acquisition, remote image processing, and rapid data sharing. These features form a "cloud" network for water quality monitoring. We have demonstrated the capability of ScanDrop to perform water quality monitoring via the detection of an indicator coliform bacterium, Escherichia coli, in drinking water contaminated with feces. Magnetic beads conjugated with antibodies to E. coli antigen were used to selectively capture and isolate specific bacteria from water samples. The bead-captured bacteria were co-encapsulated in pico-liter droplets with fluorescently-labeled anti-E. coli antibodies, and imaged with an automated custom designed fluorescence microscope. The entire water quality diagnostic process required 8 hours from sample collection to online-accessible results compared with 2-4 days for other currently available standard detection methods.
Quality assessment of SPR sensor chips; case study on L1 chips.
Olaru, Andreea; Gheorghiu, Mihaela; David, Sorin; Polonschii, Cristina; Gheorghiu, Eugen
2013-07-15
Surface quality of Surface Plasmon Resonance (SPR) chips is a major limiting issue in most SPR analyses, even more so in experiments with supported lipid membranes, where both the organization of the lipid matrix and the subsequent incorporation of the target molecule depend on the surface quality. A novel quantitative method to characterize the quality of SPR sensor chips is described for L1 chips subjected to formation of lipid films and injection of membrane-disrupting compounds, followed by appropriate regeneration procedures. The method consists of analysis of the SPR reflectivity curves for several standard solutions (e.g. PBS, HEPES or deionized water). This analysis reveals the decline of the sensor surface as a function of the number of experimental cycles (each consisting of a biosensing assay and a regeneration step) and enables active control of surface regeneration for enhanced reproducibility. We demonstrate that quantitative evaluation of the changes in the reflectivity curves (the shape of the SPR dip) and of the slope of the calibration curve provides a rapid and effective procedure for surface quality assessment. Although the method was tested on L1 SPR sensor chips, we stress its amenability to assessing the quality of other types of SPR chips as well.
Impact of voice- and knowledge-enabled clinical reporting--US example.
Bushko, Renata G; Havlicek, Penny L; Deppert, Edward; Epner, Stephen
2002-01-01
This study presents qualitative and quantitative estimates of the national and clinic-level impact of utilizing voice- and knowledge-enabled clinical reporting systems. Using a common-sense estimation methodology, we show that the delivery of health care can experience a dramatic improvement in four areas as a result of the broad use of voice- and knowledge-enabled clinical reporting: (1) process quality, as measured by cost savings; (2) organizational quality, as measured by compliance; (3) clinical quality, as measured by clinical outcomes; and (4) service quality, as measured by patient satisfaction. If only 15 percent of US physicians replaced transcription with modern voice-based clinical reporting methodology, about half a billion dollars could be saved. $6.7 billion could be saved annually if all medical reporting currently transcribed were handled with voice- and knowledge-enabled dictation and reporting systems.
[Quality assurance of rehabilitation by the German pension insurance: an overview].
Klosterhuis, H; Baumgarten, E; Beckmann, U; Erbstösser, S; Lindow, B; Naumann, B; Widera, T; Zander, J
2010-12-01
The German pension insurance has in recent years developed a comprehensive programme for quality assurance in rehabilitation and has implemented it in routine practice. Different aspects of rehabilitation are evaluated with differentiated instruments. The issues addressed include, inter alia, the quality of rehabilitative care in a narrower sense, the structure and organisation of the rehabilitation centres, and quality from the patients' perspective. On the whole, positive results predominate; however, large differences in quality have been found between the rehabilitation centres. The data collections and evaluations carried out make continuous quality assurance reporting possible for use by rehabilitation centres and pension insurance agencies, enabling targeted initiatives for quality improvement. The methods and procedures of quality assurance are enhanced at regular intervals, and the scope of quality assurance is extended; thus, quality assurance is also being expanded to cover outpatient rehabilitation and the rehabilitation of children and young people.
Incentives and enablers to improve adherence in tuberculosis
Lutge, Elizabeth E; Wiysonge, Charles Shey; Knight, Stephen E; Sinclair, David; Volmink, Jimmy
2015-01-01
Background
Patient adherence to medications, particularly for conditions requiring prolonged treatment such as tuberculosis (TB), is frequently less than ideal and can result in poor treatment outcomes. Material incentives to reward good behaviour and enablers to remove economic barriers to accessing care are sometimes given in the form of cash, vouchers, or food to improve adherence.
Objectives
To evaluate the effects of material incentives and enablers in patients undergoing diagnostic testing, or receiving prophylactic or curative therapy, for TB.
Search methods
We undertook a comprehensive search of the Cochrane Infectious Diseases Group Specialized Register; Cochrane Central Register of Controlled Trials (CENTRAL); MEDLINE; EMBASE; LILACS; Science Citation Index; and reference lists of relevant publications up to 5 June 2015.
Selection criteria
Randomized controlled trials of material incentives in patients being investigated for TB, or on treatment for latent or active TB.
Data collection and analysis
At least two review authors independently screened and selected studies, extracted data, and assessed the risk of bias in the included trials. We compared the effects of interventions using risk ratios (RR), and presented RRs with 95% confidence intervals (CI). The quality of the evidence was assessed using GRADE.
Main results
We identified 12 eligible trials. Ten were conducted in the USA: in adolescents (one trial), in injection drug or cocaine users (four trials), in homeless adults (three trials), and in prisoners (two trials). The remaining two trials, in general adult populations, were conducted in Timor-Leste and South Africa.
Sustained incentive programmes
Only two trials have assessed whether material incentives and enablers can improve long-term adherence and completion of treatment for active TB, and neither demonstrated a clear benefit (RR 1.04, 95% CI 0.97 to 1.14; two trials, 4356 participants; low quality evidence). In one trial, the incentive, given as a daily hot meal, was not well received by the population due to the inconvenience of attending the clinic at midday, whilst in the other trial, nurses distributing the vouchers chose to "ration" their distribution among eligible patients, giving only to those whom they felt were most deprived. Three trials assessed the effects of material incentives and enablers on completion of TB prophylaxis with mixed results (low quality evidence). A large effect was seen with regular cash incentives given to drug users at each clinic visit in a setting with extremely low treatment completion in the control group (treatment completion 52.8% intervention versus 3.6% control; RR 14.53, 95% CI 3.64 to 57.98; one trial, 108 participants), but no effects were seen in one trial assessing a cash incentive for recently released prisoners (373 participants), or another trial assessing material incentives offered by parents to teenagers (388 participants).
Single once-only incentives
However in specific populations, such as recently released prisoners, drug users, and the homeless, trials show that material incentives probably do improve one-off clinic re-attendance for initiation or continuation of anti-TB prophylaxis (RR 1.58, 95% CI 1.27 to 1.96; three trials, 595 participants; moderate quality evidence), and may increase the return rate for reading of tuberculin skin test results (RR 2.16, 95% CI 1.41 to 3.29; two trials, 1371 participants; low quality evidence).
Comparison of different types of incentives
Single trials in specific sub-populations suggest that an immediate cash incentive may be more effective than delaying the incentive until completion of treatment (RR 1.11, 95% CI 0.98 to 1.24; one trial, 300 participants; low quality evidence); cash incentives may be more effective than non-cash incentives (completion of TB prophylaxis: RR 1.26, 95% CI 1.02 to 1.56; one trial, 141 participants; low quality evidence; return for skin test reading: RR 1.13, 95% CI 1.07 to 1.19; one trial, 652 participants; low quality evidence); and higher cash incentives may be more effective than lower cash incentives (RR 1.08, 95% CI 1.01 to 1.16; one trial, 404 participants; low quality evidence).
Authors' conclusions
Material incentives and enablers may have some positive short term effects on clinic attendance, particularly for marginal populations such as drug users, recently released prisoners, and the homeless, but there is currently insufficient evidence to know if they can improve long term adherence to TB treatment.
Plain language summary: Incentives and enablers for improving patient adherence to tuberculosis diagnosis, prophylaxis, and treatment
Cochrane researchers conducted a review of the effects of material (economic) incentives or enablers on the adherence and outcomes of patients being tested or treated for latent or active tuberculosis (TB). After searching up to 5 June 2015 for relevant trials, they included 12 randomized controlled trials in this Cochrane review.
What are material incentives and enablers and how might they improve patient care?
Material incentives and enablers are economic interventions which may be given to patients to reward healthy behaviour (incentives) or remove economic barriers to accessing healthcare (enablers). Incentives and enablers may be given directly as cash or vouchers, or indirectly in the provision of a service for which the patient might otherwise have to pay (like transport to a health facility).
What the research says
Material incentives and enablers may have little or no effect in improving the outcomes of patients on treatment for active TB (low quality evidence), but further trials of alternative incentives and enablers are needed. Material incentives and enablers may have some effects on completion of prophylaxis for latent TB in some circumstances but trial results were mixed, with one trial showing a large effect, and two trials showing no effect (low quality evidence). One-off material incentives and enablers probably improve rates of return to a single clinic appointment for patients starting or continuing prophylaxis for TB (moderate quality evidence) and may improve the rate of return to the clinic for the reading of diagnostic tests for TB (low quality evidence). Thus although material incentives and enablers may improve some patients' attendance at the clinic in the short term, more research is needed to determine if they have an important positive effect in patients on long term treatment for TB.
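For reference, risk ratios and confidence intervals of the kind quoted above follow the standard log-normal approximation. The sketch below implements it with illustrative counts, not data taken from the included trials.

```python
import math

def risk_ratio(events_tx, n_tx, events_ctrl, n_ctrl):
    """Risk ratio with a 95% CI from 2x2 trial counts, using the usual
    log-normal approximation for the standard error of log(RR)."""
    rr = (events_tx / n_tx) / (events_ctrl / n_ctrl)
    se_log = math.sqrt(1/events_tx - 1/n_tx + 1/events_ctrl - 1/n_ctrl)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Illustrative counts only: 57/108 completed prophylaxis with incentives
# versus 4/110 without.
print(risk_ratio(57, 108, 4, 110))  # RR ~ 14.5 with a wide CI
```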
Mini-implants for orthodontic anchorage.
Reynders, Reint Meursinge; Ladu, Luisa
2017-10-27
Data sources
PubMed, Embase, Cochrane Central Register of Controlled Trials and the Web of Science databases. Hand searches of the journals European Journal of Orthodontics, Journal of Orthodontics, Journal of Clinical Orthodontics, Seminars in Orthodontics, American Journal of Orthodontics & Dentofacial Orthopaedics and Angle Orthodontist.
Study selection
Two reviewers independently selected studies. Randomised controlled trials (RCTs) and controlled clinical trials (CCTs) of orthodontic patients requiring extraction of the maxillary first premolars and closure of the spaces without anchorage loss were considered.
Data extraction and synthesis
Data extraction and risk of bias assessment were carried out independently by two reviewers. Meta-analysis and sensitivity analysis were conducted.
Results
Fourteen studies, seven RCTs and seven CCTs, were included. In total 303 patients received TISADs with 313 control patients. Overall the quality of the studies was considered to be moderate. Overall the TISAD group had significantly less anchorage loss than the control group. On average, TISADs enabled 1.86 mm more anchorage preservation than did conventional methods.
Conclusions
The results of the meta-analysis showed that TISADs are more effective than conventional methods of anchorage reinforcement. The average difference of 2 mm seems not only statistically but also clinically significant. However, the results should be interpreted with caution because of the moderate quality of the included studies. More high-quality studies on this issue are necessary to enable drawing more reliable conclusions.
Sensor-Based Optimization Model for Air Quality Improvement in Home IoT.
Kim, Jonghyuk; Hwangbo, Hyunwoo
2018-03-23
We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that the present introduction of a practical marketing application method will contribute to enhancing the expansion of the home IoT market.
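A minimal sketch of the natural cubic spline step mentioned above, interpolating sparse sensor readings so that denser synthetic data can be generated; the PM2.5 values and timing are invented for the example.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hourly PM2.5 readings from a home sensor (illustrative values).
hours = np.array([0, 4, 8, 12, 16, 20, 24], dtype=float)
pm25 = np.array([12.0, 9.5, 18.0, 25.5, 22.0, 15.0, 11.0])

# A natural cubic spline (second derivative zero at both ends) smoothly
# interpolates the sparse readings; sampling it densely yields synthetic
# data consistent with the observed behaviour.
spline = CubicSpline(hours, pm25, bc_type="natural")
dense_hours = np.linspace(0, 24, 97)  # every 15 minutes
dense_pm25 = spline(dense_hours)
print(dense_pm25[:4])
```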
Modeling the time-varying subjective quality of HTTP video streams with rate adaptations.
Chen, Chao; Choi, Lark Kwon; de Veciana, Gustavo; Caramanis, Constantine; Heath, Robert W; Bovik, Alan C
2014-05-01
Newly developed hypertext transfer protocol (HTTP)-based video streaming technologies enable flexible rate-adaptation under varying channel conditions. Accurately predicting the users' quality of experience (QoE) for rate-adaptive HTTP video streams is thus critical to achieve efficiency. An important aspect of understanding and modeling QoE is predicting the up-to-the-moment subjective quality of a video as it is played, which is difficult due to hysteresis effects and nonlinearities in human behavioral responses. This paper presents a Hammerstein-Wiener model for predicting the time-varying subjective quality (TVSQ) of rate-adaptive videos. To collect data for model parameterization and validation, a database of longer duration videos with time-varying distortions was built and the TVSQs of the videos were measured in a large-scale subjective study. The proposed method is able to reliably predict the TVSQ of rate adaptive videos. Since the Hammerstein-Wiener model has a very simple structure, the proposed method is suitable for online TVSQ prediction in HTTP-based streaming.
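A Hammerstein-Wiener model is a static input nonlinearity followed by a linear dynamic block and a static output nonlinearity. The sketch below wires up that generic structure with placeholder blocks; it is not the parameterization fitted in the paper.

```python
import numpy as np
from scipy.signal import lfilter

def hammerstein_wiener(u, f, b, a, g):
    """Generic Hammerstein-Wiener response: static input nonlinearity f,
    linear time-invariant filter with coefficients (b, a), then static
    output nonlinearity g."""
    return g(lfilter(b, a, f(u)))

# Toy example: compressive input nonlinearity and first-order smoothing,
# mimicking the sluggish, hysteretic response of viewers to a bitrate step.
u = np.concatenate([np.full(50, 30.0), np.full(50, 70.0)])  # bitrate-like input
y = hammerstein_wiener(u, f=np.sqrt, b=[0.05], a=[1.0, -0.95], g=lambda x: x)
print(y[[0, 49, 50, 99]])  # slow rise toward each new quality level
```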
larvalign: Aligning Gene Expression Patterns from the Larval Brain of Drosophila melanogaster.
Muenzing, Sascha E A; Strauch, Martin; Truman, James W; Bühler, Katja; Thum, Andreas S; Merhof, Dorit
2018-01-01
The larval brain of the fruit fly Drosophila melanogaster is a small, tractable model system for neuroscience. Genes for fluorescent marker proteins can be expressed in defined, spatially restricted neuron populations. Here, we introduce the methods for 1) generating a standard template of the larval central nervous system (CNS), 2) spatial mapping of expression patterns from different larvae into a reference space defined by the standard template. We provide a manually annotated gold standard that serves for evaluation of the registration framework involved in template generation and mapping. A method for registration quality assessment enables the automatic detection of registration errors, and a semi-automatic registration method allows one to correct registrations, which is a prerequisite for a high-quality, curated database of expression patterns. All computational methods are available within the larvalign software package: https://github.com/larvalign/larvalign/releases/tag/v1.0.
The 1999 ICSI/IHI colloquium on clinical quality improvement--"quality: settling the frontier".
Palmersheim, T M
1999-12-01
A Colloquium on Clinical Quality Improvement, "Quality: Settling the Frontier," held in May 1999, covered methods and programs in clinical quality improvement. Leadership and organizational behavior were the main themes of the breakout sessions; specific topics included implementing guidelines, applying continuous quality improvement (CQI) methods in preventive services and primary care, and using systems thinking to improve clinical outcomes. Three keynote addresses were presented. James L. Reinertsen, MD (CareGroup, Boston), characterized the financial challenges faced by many health care organizations as a "clarion call" for leadership on quality: "The leadership imperative is to establish an environment in which quality can thrive, despite unprecedented, severe economic pressures on our health systems." How do we make improvement more effective? G. Ross Baker, PhD (University of Toronto), reviewed what the organizational literature says about making teams more effective, understanding the organizational context to enable improvement work, and augmenting existing methods for creating sustainable improvement. For example, he noted the increasing interest among many organizations in rapid-cycle improvement but cautioned that such efforts may work best where problems can be addressed by existing clinical teams (not cross-functional work groups) and where there are available solutions that have worked in other settings. Mark Chassin, MD (Mount Sinai School of Medicine, New York), stated that critical tasks for improving quality include increasing public awareness, engaging clinicians in improvement, increasing the investment in producing measures and improvement tools, and reinventing health care delivery, clinical education and training, and QI.
Opening the black box of ethics policy work: evaluating a covert practice.
Frolic, Andrea; Drolet, Katherine; Bryanton, Kim; Caron, Carole; Cupido, Cynthia; Flaherty, Barb; Fung, Sylvia; McCall, Lori
2012-01-01
Hospital ethics committees (HECs) and ethicists generally describe themselves as engaged in four domains of practice: case consultation, research, education, and policy work. Despite the increasing attention to quality indicators, practice standards, and evaluation methods for the other domains, comparatively little is known or published about the policy work of HECs or ethicists. This article attempts to open the "black box" of this health care ethics practice by providing two detailed case examples of ethics policy reviews. We also describe the development and application of an evaluation strategy to assess the quality of ethics policy review work, and to enable continuous improvement of ethics policy review processes. Given the potential for policy work to impact entire patient populations and organizational systems, it is imperative that HECs and ethicists develop clearer roles, responsibilities, procedural standards, and evaluation methods to ensure the delivery of consistent, relevant, and high-quality ethics policy reviews.
Role to Be Played by Independent Geotechnical Supervision in the Foundation for Bridge Construction
NASA Astrophysics Data System (ADS)
Sobala, Dariusz; Rybak, Jarosław
2017-10-01
The paper presents some remarks concerning the necessity of employing independent and thoroughly ethical geotechnical supervision. From the design phase through the whole construction process, the importance of the geotechnical engineer is stated in legal acts. Numerous testing technologies serve the calibration of geotechnical technologies and allow the quality and capacity of piles to be confirmed. Special emphasis is placed on the involvement of scientific and research institutions, which can not only render services but also postprocess and systematize the collected data. Such databases enable the development of new codes, methods and recommendations. The selection of deep foundations for bridge-type structures is most often dictated by complex geotechnical conditions, concentrated loads and constraints on pier displacements. The last of these, before the design-and-construct system became more common, could be a convenient justification for a design engineer who imposed a deep foundation because he did not want, or was not able, to estimate the effect of pier settlement on the structure. The paper provides some notes on the need to engage a geotechnical supervising service of high competency and ethical quality during the engineering and construction stages of foundations for bridge-type structures, where legal requirements demand special consideration. Successive stages of projects are reviewed, and the research methods used for ongoing calibration of geotechnical technologies and verification of the quality of geotechnical work are analysed. Special attention is given to the potential involvement of independent R&D institutions which, apart from rendering specific services, also collect and systemize research results, enabling, in the long term, revision of engineering standards, instructions and guidelines.
Reiser, Vladimír; Smith, Ryan C; Xue, Jiyan; Kurtz, Marc M; Liu, Rong; Legrand, Cheryl; He, Xuanmin; Yu, Xiang; Wong, Peggy; Hinchcliffe, John S; Tanen, Michael R; Lazar, Gloria; Zieba, Renata; Ichetovkin, Marina; Chen, Zhu; O'Neill, Edward A; Tanaka, Wesley K; Marton, Matthew J; Liao, Jason; Morris, Mark; Hailman, Eric; Tokiwa, George Y; Plump, Andrew S
2011-11-01
With expanding biomarker discovery efforts and increasing costs of drug development, it is critical to maximize the value of mass-limited clinical samples. The main limitation of available methods is the inability to isolate and analyze, from a single sample, molecules requiring incompatible extraction methods. Thus, we developed a novel semiautomated method for tissue processing and tissue milling and division (TMAD). We used a SilverHawk atherectomy catheter to collect atherosclerotic plaques from patients requiring peripheral atherectomy. Tissue preservation by flash freezing was compared with immersion in RNAlater®, and tissue grinding by traditional mortar and pestle was compared with TMAD. Comparators were protein, RNA, and lipid yield and quality. Reproducibility of analyte yield from aliquots of the same tissue sample processed by TMAD was also measured. The quantity and quality of biomarkers extracted from tissue prepared by TMAD were at least as good as those extracted from tissue stored and prepared by traditional means. TMAD enabled parallel analysis of gene expression (quantitative reverse-transcription PCR, microarray), protein composition (ELISA), and lipid content (biochemical assay) from as little as 20 mg of tissue. The mean correlation was r = 0.97 in molecular composition (RNA, protein, or lipid) between aliquots of individual samples generated by TMAD. We also demonstrated that it is feasible to use TMAD in a large-scale clinical study setting. The TMAD methodology described here enables semiautomated, high-throughput sampling of small amounts of heterogeneous tissue specimens by multiple analytical techniques with generally improved quality of recovered biomolecules.
Hu, E; Liao, T. W.; Tiersch, T. R.
2013-01-01
Cryopreservation of fish sperm has been studied for decades at a laboratory (research) scale. However, high-throughput cryopreservation of fish sperm has recently been developed to enable industrial-scale production. This study treated the high-throughput cryopreservation of blue catfish (Ictalurus furcatus) sperm as a manufacturing production line and initiated development of a quality assurance plan. The main objectives were to identify: 1) the main production quality characteristics; 2) the process features for quality assurance; 3) the internal quality characteristics and their specification designs; 4) the quality control and process capability evaluation methods, and 5) the directions for further improvements and applications. The essential product quality characteristics were identified as fertility-related characteristics. Specification design, which establishes tolerance levels according to demand and process constraints, was performed based on these quality characteristics. Meanwhile, to ensure integrity throughout the process, internal quality characteristics (characteristics at each quality control point within the process) that could affect the fertility-related quality characteristics were defined with specifications. Due to the process feature of 100% inspection (quality inspection of every fish), a specific calculation method, the use of cumulative sum (CUSUM) control charts, was applied to monitor each quality characteristic. An index of overall process evaluation, process capability, was analyzed based on the in-control process and the designed specifications, which further integrates the quality assurance plan. With the established quality assurance plan, the process can operate stably and the quality of products will be reliable.
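A minimal tabular CUSUM sketch of the kind of monitoring described above, with invented motility values and control limits; the allowance k and decision interval h would in practice be chosen from the designed specifications.

```python
import numpy as np

def cusum(x, target, k, h):
    """Tabular CUSUM for a quality characteristic (e.g. post-thaw motility
    per batch). k is the allowance (slack) and h the decision interval,
    both in the units of x; returns the high/low sums and the indices
    where either exceeds h (out-of-control signals)."""
    s_hi = s_lo = 0.0
    highs, lows, signals = [], [], []
    for i, xi in enumerate(x):
        s_hi = max(0.0, s_hi + (xi - target - k))
        s_lo = max(0.0, s_lo + (target - xi - k))
        highs.append(s_hi)
        lows.append(s_lo)
        if s_hi > h or s_lo > h:
            signals.append(i)
    return np.array(highs), np.array(lows), signals

# Illustrative motility percentages drifting downward after batch 6.
x = [42, 40, 41, 43, 39, 41, 36, 35, 34, 33]
print(cusum(x, target=40.0, k=1.0, h=5.0)[2])  # signals at batches 7, 8, 9
```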
2011-01-01
Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol.
ERIC Educational Resources Information Center
Webster, Amanda A.; Carter, Mark
2013-01-01
Background: One of the most commonly cited rationales for inclusive education is to enable the development of quality relationships with typically developing peers. Relatively few researchers have examined the features of the range of relationships that children with developmental disability form in inclusive school settings. Method: Interviews…
Information Integration for Concurrent Engineering (IICE) Compendium of Methods Report
1995-06-01
technological, economic, and strategic benefits can be attained through the effective capture, control, and management of information and knowledge ...resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to achieve...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems that
Enhancement of automated blood flow estimates (ENABLE) from arterial spin-labeled MRI.
Shirzadi, Zahra; Stefanovic, Bojana; Chappell, Michael A; Ramirez, Joel; Schwindt, Graeme; Masellis, Mario; Black, Sandra E; MacIntosh, Bradley J
2018-03-01
To validate a multiparametric automated algorithm, ENhancement of Automated Blood fLow Estimates (ENABLE), that identifies useful and poor arterial spin-labeled (ASL) difference images in multiple postlabeling delay (PLD) acquisitions and thereby improves clinical ASL. ENABLE is a sort/check algorithm that uses a linear combination of ASL quality features. ENABLE uses simulations to determine quality weighting factors based on an unconstrained nonlinear optimization. We acquired a set of 6-PLD ASL images with 1.5T or 3.0T systems in 98 healthy elderly adults and adults with mild cognitive impairment or dementia. We contrasted the signal-to-noise ratio (SNR) of cerebral blood flow (CBF) images obtained with ENABLE vs. conventional ASL analysis. In a subgroup, we validated our CBF estimates against single-photon emission computed tomography (SPECT) CBF images. ENABLE produced significantly increased SNR compared to a conventional ASL analysis (Wilcoxon signed-rank test, P < 0.0001). We also found that the similarity between ASL and SPECT was greater when using ENABLE vs. conventional ASL analysis (n = 51, Wilcoxon signed-rank test, P < 0.0001), and this similarity was strongly related to ASL SNR (t = 24, P < 0.0001). These findings suggest that ENABLE improves CBF image quality from multiple PLD ASL in dementia cohorts at either 1.5T or 3.0T, achieved by multiparametric quality features that guided postprocessing of dementia ASL.
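A schematic of the sort/check step as described: score each volume by a linear combination of quality features and discard low scorers before averaging. The features, weights, and threshold here are placeholders, not the optimized values from the paper.

```python
import numpy as np

def score_and_select(features, weights, threshold):
    """ENABLE-style selection: each ASL difference volume gets a score that
    is a linear combination of its quality features (rows of `features`);
    volumes scoring below `threshold` are dropped before CBF averaging."""
    scores = features @ weights
    return scores, scores >= threshold

# Six repeats x three hypothetical quality features (e.g. SNR, a motion
# score, similarity to the mean volume).
feats = np.array([[8.1, 0.9, 0.95],
                  [7.9, 0.8, 0.94],
                  [2.3, 0.2, 0.40],   # corrupted repeat
                  [8.4, 0.9, 0.96],
                  [7.5, 0.7, 0.90],
                  [3.0, 0.3, 0.52]])  # corrupted repeat
scores, keep = score_and_select(feats, weights=np.array([0.5, 2.0, 3.0]),
                                threshold=6.0)
print(keep)  # the two corrupted repeats are flagged False
```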
2012-01-01
Background
Sub-Saharan African populations are growing in many European countries, yet data on the health of these populations are rare. Additionally, many sub-Saharan African migrants are confronted with issues of low socio-economic status, acculturation and language difficulties, which may hamper their access to health care. Despite the identification of some of those barriers, little is known about the enabling factors. Knowledge about the enablers and barriers experienced in access to healthcare is important in addressing these migrants' health needs and promoting healthcare access. This study aimed to investigate the enabling factors as well as barriers in access to the Dutch healthcare system among the largest sub-Saharan African migrant group (Ghanaians) living in Amsterdam, the Netherlands.
Methods
Six focus groups were conducted from November 2009 to February 2010. A semi-structured interview guideline was used. Discussions were conducted in English or Twi (a Ghanaian dialect), recorded and transcribed verbatim. Analysis was based on the Andersen model of healthcare utilisation using MAXQDA software.
Results
Knowledge and perceived quality of the health system, awareness of diseases, family and community support, community initiatives and availability of social support were the main enablers to the healthcare system. Difficulties with the Dutch language and mistrust in health care providers were major barriers in access to healthcare.
Conclusions
Access to healthcare is facilitated mainly by knowledge of, and the perceived efficiency and quality of, the Dutch healthcare system. However, poor Dutch language proficiency and mistrust in health care providers appear to be important barriers in accessing healthcare. The enablers and barriers identified by this study provide useful information for promoting healthcare access among this and similar sub-Saharan African communities.
NASA Astrophysics Data System (ADS)
Polosin, A. N.; Chistyakova, T. B.
2018-05-01
In this article, the authors describe mathematical modeling of polymer processing in extruders of various types used in the extrusion and calender-based production of film materials. The method consists of the synthesis of a static model, for calculating the throughput, energy consumption of the extruder, and extrudate quality indices, with a dynamic model for evaluating the polymer residence time in the extruder, on which the quality indices depend. The models are adjusted according to the extruder type (single-screw, reciprocating, twin-screw), its screw and head configuration, the extruder's operating temperature conditions, and the type of polymer processed. The models enable the creation of extruder screw configurations and the determination of control action values that yield extrudate of the required quality while satisfying throughput and energy consumption requirements. Model adequacy has been verified using data from the processing of polyolefins and polyvinyl chloride in different extruders. A software package based on these mathematical models has been developed to control extruders of various types, ensuring resource and energy savings in multi-product film production. Using the package in the control system for the extrusion stage of polymeric film production improves film quality, reduces spoilage, and shortens production line changeover to a different throughput or film type.
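As a small illustration of the kind of quantity the dynamic model tracks, the sketch below computes a plug-flow estimate of mean melt residence time. This is a textbook simplification with invented numbers, not the authors' model.

```python
def mean_residence_time(channel_volume_cm3, throughput_kg_h, melt_density_g_cm3):
    """Crude plug-flow estimate of mean polymer residence time in an
    extruder: free channel volume divided by volumetric throughput.
    Returns minutes."""
    volumetric_flow_cm3_h = throughput_kg_h * 1000.0 / melt_density_g_cm3
    return 60.0 * channel_volume_cm3 / volumetric_flow_cm3_h

# Hypothetical single-screw extruder: 2500 cm^3 free channel volume,
# 120 kg/h throughput, melt density 0.75 g/cm^3.
print(f"{mean_residence_time(2500.0, 120.0, 0.75):.2f} min")  # ~0.94 min
```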
Enabling Self-Monitoring Data Exchange in Participatory Medicine.
Lopez-Campos, Guillermo; Ofoghi, Bahadorreza; Martin-Sanchez, Fernando
2015-01-01
The development of new methods, devices and apps for self-monitoring has enabled the extension of these approaches to consumer health and research purposes. The increase in the number and variety of devices has generated a complex scenario where reporting guidelines and data exchange formats will be needed to ensure the quality of the information and the reproducibility of experimental results. Based on the Minimal Information for Self Monitoring Experiments (MISME) reporting guideline, we have developed an XML format (MISME-ML) to facilitate data exchange for self-monitoring experiments. We have also developed a sample instance to illustrate the concept and a Java MISME-ML validation tool. The implementation and adoption of these tools should contribute to the consolidation of a set of methods that ensure the reproducibility of self-monitoring experiments for research purposes.
IT investments can add business value.
Williams, Terry G
2002-05-01
Investment in information technology (IT) is costly, but necessary to enable healthcare organizations to improve their infrastructure and achieve other improvement initiatives. Such an investment is even more costly, however, if the technology does not appropriately enable organizations to perform business processes that help them accomplish their mission of providing safe, high-quality care cost-effectively. Before committing to a costly IT investment, healthcare organizations should implement a decision-making process that can help them choose, implement, and use technology that will provide sustained business value. A seven-step decision-making process that can help healthcare organizations achieve this result involves performing a gap analysis, assessing and aligning organizational goals, establishing distributed accountability, identifying linked organizational-change initiatives, determining measurement methods, establishing appropriate teams to ensure systems are integrated with multidisciplinary improvement methods, and developing a plan to accelerate adoption of the IT product.
Papoulias, Constantina
2018-06-01
This article considers the strengths and potential contributions of participatory visual methods for healthcare quality improvement research. It argues that such approaches may enable us to expand our understanding of 'patient experience' and of its potential for generating new knowledge for health systems. In particular, they may open up dimensions of people's engagement with services and treatments which exceed both the declarative nature of responses to questionnaires and the narrative sequencing of self-reports gathered through qualitative interviewing. I suggest that working with such methods may necessitate a more reflexive approach to the constitution of evidence in quality improvement work. To this end, the article first considers the emerging rationale for the use of visual participatory methods in improvement before outlining the implications of two related approaches, photo-elicitation and PhotoVoice, for the constitution of 'experience'. It then moves to a participatory model for healthcare improvement work, Experience Based Co-Design (EBCD), arguing that EBCD exemplifies both the strengths and the limitations of fitting visual participatory approaches to quality improvement ends. The article concludes with a critical reflection on a small photographic study, in which the author participated, that sought to harness service user perspectives for the design of psychiatric facilities, as a way of considering the potential contribution of visual participatory methods to quality improvement.
High-quality Health Information Provision for Stroke Patients.
Du, Hong-Sheng; Ma, Jing-Jian; Li, Mu
2016-09-05
High-quality information provision can allow stroke patients to participate effectively in healthcare decision-making, better manage their stroke, and make a good recovery. In this study, we reviewed the information needs of stroke patients, methods for providing information to patients, and considerations needed by information providers. The English-language literature concerning information provision for patients with stroke, published from 1990 to 2015, was collected from PubMed. We included all relevant articles on information provision for stroke patients in English, with no limitation on study design. Stroke is a major public health concern worldwide. High-quality and effective health information provision plays an essential role in helping patients to take an active part in decision-making and healthcare, and in empowering them to effectively self-manage their long-standing chronic conditions. Different methods for providing information to patients have their relative merits and suitability; as a result, effective strategies for health professionals may include providing high-quality information, meeting patients' individual needs, using suitable methods of providing information, and maintaining the active involvement of patients. To enable stroke patients to access high-quality health information, greater efforts need to be made to ensure that patients receive accurate and current evidence-based information that meets their individual needs. Health professionals should use suitable information delivery methods and actively involve stroke patients in information provision.
Aurumskjöld, Marie-Louise; Ydström, Kristina; Tingberg, Anders; Söderberg, Marcus
2017-01-01
Background
The number of computed tomography (CT) examinations is increasing, leading to an increase in total patient exposure. It is therefore important to optimize CT scan imaging conditions in order to reduce the radiation dose. The introduction of iterative reconstruction methods has enabled an improvement in image quality and a reduction in radiation dose.
Purpose
To investigate how image quality depends on reconstruction method and to discuss patient dose reduction resulting from the use of hybrid and model-based iterative reconstruction.
Material and Methods
An image quality phantom (Catphan® 600) and an anthropomorphic torso phantom were examined on a Philips Brilliance iCT. The image quality was evaluated in terms of CT numbers, noise, noise power spectra (NPS), contrast-to-noise ratio (CNR), low-contrast resolution, and spatial resolution for different scan parameters and dose levels. The images were reconstructed using filtered back projection (FBP) and different settings of hybrid (iDose4) and model-based (IMR) iterative reconstruction methods.
Results
iDose4 decreased the noise by 15-45% compared with FBP, depending on the level of iDose4. IMR reduced the noise even further, by 60-75% compared with FBP. The results are independent of dose. The NPS showed changes in the noise distribution for different reconstruction methods. The low-contrast resolution and CNR were improved with iDose4, and the improvement was even greater with IMR.
Conclusion
There is great potential to reduce noise, and thereby improve image quality, by using hybrid or, in particular, model-based iterative reconstruction methods, or to lower radiation dose and maintain image quality.
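For concreteness, the sketch below computes CNR with one common definition (mean ROI difference over background noise) on synthetic data mimicking the reported noise reduction; the numbers are illustrative only, not phantom measurements.

```python
import numpy as np

def cnr(roi, background):
    """Contrast-to-noise ratio between a target ROI and a background ROI:
    absolute mean difference divided by the background standard deviation.
    One common definition; others normalise by pooled noise."""
    return abs(roi.mean() - background.mean()) / background.std(ddof=1)

rng = np.random.default_rng(1)
fbp_bg = rng.normal(40.0, 12.0, 500)  # noisier FBP background
imr_bg = rng.normal(40.0, 3.0, 500)   # model-based IR suppresses noise
target = rng.normal(55.0, 3.0, 500)
print(f"CNR FBP ~ {cnr(target, fbp_bg):.1f}, CNR IMR ~ {cnr(target, imr_bg):.1f}")
```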
Method for isolating nucleic acids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurt, Jr., Richard Ashley; Elias, Dwayne A.
The current disclosure provides methods and kits for isolating nucleic acid from an environmental sample. The current methods and compositions further provide methods for isolating nucleic acids by reducing adsorption of nucleic acids by charged ions and particles within an environmental sample. The methods of the current disclosure provide methods for isolating nucleic acids by releasing adsorbed nucleic acids from charged particles during the nucleic acid isolation process. The current disclosure facilitates the isolation of nucleic acids of sufficient quality and quantity to enable one of ordinary skill in the art to utilize or analyze the isolated nucleic acids for a wide variety of applications, including sequencing or species population analysis.
Advancing Resident Assessment in Graduate Medical Education
Swing, Susan R.; Clyman, Stephen G.; Holmboe, Eric S.; Williams, Reed G.
2009-01-01
Background The Outcome Project requires high-quality assessment approaches to provide reliable and valid judgments of the attainment of competencies deemed important for physician practice. Intervention The Accreditation Council for Graduate Medical Education (ACGME) convened the Advisory Committee on Educational Outcome Assessment in 2007–2008 to identify high-quality assessment methods. The assessments selected by this body would form a core set that could be used by all programs in a specialty to assess resident performance and enable initial steps toward establishing national specialty databases of program performance. The committee identified a small set of methods for provisional use and further evaluation. It also developed frameworks and processes to support the ongoing evaluation of methods and the longer-term enhancement of assessment in graduate medical education. Outcome The committee constructed a set of standards, a methodology for applying the standards, and grading rules for their review of assessment method quality. It developed a simple report card for displaying grades on each standard and an overall grade for each method reviewed. It also described an assessment system of factors that influence assessment quality. The committee proposed a coordinated, national-level infrastructure to support enhancements to assessment, including method development and assessor training. It recommended the establishment of a new assessment review group to continue its work of evaluating assessment methods. The committee delivered a report summarizing its activities and 5 related recommendations for implementation to the ACGME Board in September 2008. PMID:21975993
NASA Astrophysics Data System (ADS)
Přibil, Jiří; Přibilová, Anna; Frollo, Ivan
2017-12-01
The paper focuses on two methods for evaluating the success of enhancement of speech signals recorded in an open-air magnetic resonance imager during phonation for 3D human vocal tract modeling. The first approach enables a comparison based on statistical analysis by ANOVA and hypothesis tests. The second method is based on classification by Gaussian mixture models (GMM). The experiments confirmed that the proposed ANOVA and GMM classifiers for automatic evaluation of speech quality are functional and produce results fully comparable with the standard evaluation based on the listening test method.
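The two evaluation routes described above can be illustrated in a few lines. The following Python sketch is a minimal, hypothetical stand-in for the pipeline: synthetic per-frame spectral features replace real recordings, and an ANOVA test plus GMM log-likelihood scoring stand in for the authors' classifiers.

```python
# A minimal sketch of the two evaluation strategies, using synthetic spectral
# features in place of real recordings. Feature dimensions and values are
# illustrative assumptions, not the authors' implementation.
import numpy as np
from scipy.stats import f_oneway
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical per-frame spectral features for three recording conditions:
# clean reference, noisy MRI recording, and enhanced recording.
clean = rng.normal(0.0, 1.0, size=(500, 12))
noisy = rng.normal(0.8, 1.3, size=(500, 12))
enhanced = rng.normal(0.2, 1.05, size=(500, 12))

# 1) Statistical comparison: one-way ANOVA on the first feature coefficient.
F, p = f_oneway(clean[:, 0], noisy[:, 0], enhanced[:, 0])
print(f"ANOVA: F={F:.2f}, p={p:.3g}")

# 2) GMM-based evaluation: fit a model on clean speech and use the average
# log-likelihood of each condition as a quality score (higher = closer to clean).
gmm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0).fit(clean)
for name, feats in [("noisy", noisy), ("enhanced", enhanced)]:
    print(name, "mean log-likelihood:", gmm.score(feats))
```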
Developing Confidence Limits For Reliability Of Software
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.
1991-01-01
Technique developed for estimating reliability of software by use of Moranda geometric de-eutrophication model. Pivotal method enables straightforward construction of exact bounds with associated degree of statistical confidence about reliability of software. Confidence limits thus derived provide precise means of assessing quality of software. Limits take into account number of bugs found while testing and effects of sampling variation associated with random order of discovering bugs.
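As a rough illustration of the underlying model (not the report's exact pivotal construction of confidence bounds), the sketch below fits Moranda's geometric de-eutrophication model - failure rate D·k^(i-1) before the (i+1)-th failure, with exponentially distributed inter-failure times - to hypothetical failure data by maximum likelihood.

```python
# A minimal sketch of fitting Moranda's geometric de-eutrophication model to
# inter-failure times by maximum likelihood; the inter-failure times below are
# invented, and the pivotal confidence-bound construction is not reproduced.
import numpy as np
from scipy.optimize import minimize

t = np.array([1.2, 0.9, 2.5, 3.1, 2.8, 5.0, 4.2, 7.9, 6.5, 11.0])  # hours
i = np.arange(len(t))  # failure index, 0-based

def neg_log_lik(params):
    log_D, log_k = params
    D, k = np.exp(log_D), np.exp(log_k)
    lam = D * k**i                         # failure rate before (i+1)-th failure
    return -np.sum(np.log(lam) - lam * t)  # exponential inter-failure times

res = minimize(neg_log_lik, x0=[0.0, -0.1], method="Nelder-Mead")
D_hat, k_hat = np.exp(res.x)
print(f"D={D_hat:.3f}, k={k_hat:.3f}  (k<1 indicates reliability growth)")
```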
Utilization of Low Temperatures in Electrical Machines,
1983-09-08
quality ... of the obtained junctions. For welding of steels, we used the TIG method which is the most frequently used technique for joining alloy steels. We studied the effects of the chemical composition of the weld, linear energy of welding and ... disappearance of resistance in certain metals and alloys at very low temperatures, in the vicinity of absolute zero. This fact enables currents to
WE-EF-207-02: The Rotate-Plus-Shift C-Arm Trajectory: Theory and First Clinical Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritschl, L; Kachelriess, M; Kuntz, J
Purpose: The proposed method enables the acquisition of a complete dataset for 3D reconstruction of C-arm data using less than 180° of rotation. Methods: Typically a C-arm cone-beam CT scan is performed using a circle-like trajectory around a region of interest. Therefore an angular range of at least 180° plus the fan angle must be covered to ensure a completely sampled data set. This fact imposes constraints on the geometry and technical specifications of a C-arm system, for example a larger C radius or a smaller C opening. This is even more important for mobile C-arm devices, which are typically used in surgical applications. To overcome these limitations we propose a new trajectory which requires only 180° minus the fan angle of rotation for a complete data set. The trajectory consists of three parts: a rotation of the C-arm around a defined iso-center and two translational movements parallel to the detector plane at the beginning and at the end of the rotation (rotate-plus-shift trajectory). This enables the acquisition of a completely sampled dataset using only 180° minus the fan angle of rotation. Results: For the evaluation of the method we show simulated and measured data. The results show that the rotate-plus-shift scan yields image quality equivalent to the short scan, which is assumed to be the gold standard for C-arm CT today. Compared to the pure rotational scan over only 165°, the rotate-plus-shift scan shows strong improvements in image quality. Conclusion: The proposed method makes 3D imaging possible using C-arms with less than a 180° rotation range. This enables integrating full 3D functionality into a C-arm device without any loss of handling and usability for 2D imaging.
Appreciative Inquiry for quality improvement in primary care practices.
Ruhe, Mary C; Bobiak, Sarah N; Litaker, David; Carter, Caroline A; Wu, Laura; Schroeder, Casey; Zyzanski, Stephen J; Weyer, Sharon M; Werner, James J; Fry, Ronald E; Stange, Kurt C
2011-01-01
To test the effect of an Appreciative Inquiry (AI) quality improvement strategy on clinical quality management and practice development outcomes. Appreciative inquiry enables the discovery of shared motivations, envisioning a transformed future, and learning around the implementation of a change process. Thirty diverse primary care practices were randomly assigned to receive an AI-based intervention focused on a practice-chosen topic and on improving preventive service delivery (PSD) rates. Medical-record review assessed change in PSD rates. Ethnographic field notes and observational checklist analysis used editing and immersion/crystallization methods to identify factors affecting intervention implementation and practice development outcomes. The PSD rates did not change. Field note analysis suggested that the intervention elicited core motivations, facilitated development of a shared vision, defined change objectives, and fostered respectful interactions. Practices most likely to implement the intervention or develop new practice capacities exhibited 1 or more of the following: support from key leader(s), a sense of urgency for change, a mission focused on serving patients, health care system and practice flexibility, and a history of constructive practice change. An AI approach and enabling practice conditions can lead to intervention implementation and practice development by connecting individual and practice strengths and motivations to the change objective.
TiN-buffered substrates for photoelectrochemical measurements of oxynitride thin films
NASA Astrophysics Data System (ADS)
Pichler, Markus; Pergolesi, Daniele; Landsmann, Steve; Chawla, Vipin; Michler, Johann; Döbeli, Max; Wokaun, Alexander; Lippert, Thomas
2016-04-01
Developing novel materials for the conversion of solar to chemical energy is becoming an increasingly important endeavour. Perovskite compounds based on bandgap-tunable oxynitrides represent an exciting class of novel photoactive materials. To date, the literature mostly focuses on the characterization of oxynitride powder samples, which have undeniable technological interest but do not allow the investigation of fundamental properties such as the role of the crystalline quality and/or the surface crystallographic orientation in photocatalytic activity. The challenge of growing high-quality oxynitride thin films lies in finding a suitable substrate, owing to strict material and processing requirements: effective lattice matching, sufficiently high conductivity, and stability at high temperatures and in strongly reducing environments. Here, we have established the foundations of a model system incorporating a TiN buffer layer which enables fundamental investigations of how the crystallographic surface orientation and crystalline quality of the photocatalyst affect its photo(electro)chemical performance. Furthermore, we find that TiN as a current collector enables control over the nitrogen content of oxynitride thin films produced by a modified pulsed laser deposition method and allows the growth of highly ordered LaTiO3-xNx thin films.
Cepeda-Carrión, Gabriel; Cegarra-Navarro, Juan Gabriel; Martínez-Caro, Eva; Eldridge, Stephen
2011-10-01
With the passing of time, knowledge, like other resources, can become obsolete. Thus, people in a healthcare system need to update their knowledge in order to keep pace with the ongoing changes in their operational environment. Information technology continually provides a great amount of new knowledge, which can lead to healthcare professionals becoming overloaded with knowledge. This overloading can be alleviated by a process of unlearning which enables the professional to retain just the relevant and critical knowledge required to improve the quality of service they provide. This paper shows some of the tools and methods that Hospital-in-the-Home Units (HHUs) have used to update the physician-patient knowledge and the technology knowledge of the HHUs' personnel. A survey study was carried out in HHUs in the Spanish health system in 2010, involving 55 doctors and 62 nurses belonging to 44 HHUs; no intervention was applied. Three hypotheses are presented and supported, which suggest that technology and physician-patient knowledge are related to the unlearning context, and that the unlearning context impacts positively on the quality of health services provided. The key benefits of the unlearning context for the quality of service provided in HHUs are clear: it enables them to identify and replace poor practices, avoids the reinvention of the wheel (e.g. by minimizing unnecessary work caused by the use of poor methods), and reduces costs through better productivity and efficiency (improving services to patients).
The integration of quality function deployment and Kansei Engineering: An overview of application
NASA Astrophysics Data System (ADS)
Lokman, Anitawati Mohd; Awang, Ahmad Azran; Omar, Abdul Rahman; Abdullah, Nur Atiqah Sia
2016-02-01
As a result of today's globalized world and the robust development of emerging markets, consumers are able to select from an endless number of products that are mostly similar in terms of design and properties, as well as equivalent in function and performance. The survival of businesses in a competitive environment requires innovation, consumer loyalty, and products that are easily identifiable by consumers. Today's manufacturers have started to employ customer research instruments to survive in the highly industrialized world—for example, Conjoint Analysis, Design of Experiments and Semantic Design of Environment. This work, however, concentrates on Kansei Engineering and Quality Function Deployment. Kansei Engineering (KE) is deemed the most appropriate method to link consumers' feelings, emotions or senses to the properties of a product because it translates people's impressions, interests, and feelings into product design solutions. Likewise, Quality Function Deployment (QFD) enables clearer interpretation of the needs of consumers, better concepts or products, and enhanced communication to the internal operations that must then manufacture and deliver the product or services. The integration of KE and QFD is believed possible, as many product manufacturers and businesses have started to utilize systematized methods to translate consumers' needs and wants into processes and products. Therefore, this work surveys various integrations of KE and QFD processes in industry. It aims to provide evidence on the integration mechanism that enables the successful incorporation of consumers' implicit feelings and demands into product quality improvement, while also providing an overview of both KE and QFD from the perspective of a novice.
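The computational core of QFD, the "house of quality", reduces to a weighted matrix product. The following minimal sketch (with invented customer needs, weights, and conventional 9/3/1 relationship scores) shows how customer importance ratings are propagated to a ranking of engineering characteristics.

```python
# A minimal sketch of the core QFD "house of quality" computation: customer
# importance ratings propagated through a relationship matrix to rank
# engineering characteristics. All requirements and weights are invented.
import numpy as np

customer_weights = np.array([5, 3, 4])   # e.g. comfort, price, appearance
# Relationship matrix (customer need x engineering characteristic), using the
# conventional 9 (strong) / 3 (medium) / 1 (weak) strength scores.
R = np.array([
    [9, 1, 3],
    [1, 9, 0],
    [3, 0, 9],
])
technical_importance = customer_weights @ R
ranking = np.argsort(technical_importance)[::-1]
print("technical importance:", technical_importance, "ranking:", ranking)
```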
Hyperspectral microscope for in vivo imaging of microstructures and cells in tissues
Demos, Stavros G. [Livermore, CA]
2011-05-17
An optical hyperspectral/multimodal imaging method and apparatus is utilized to provide high signal sensitivity for the implementation of various optical imaging approaches. Such a system utilizes long-working-distance microscope objectives so as to enable off-axis illumination of predetermined tissue, thereby allowing for excitation at any optical wavelength; this simplifies the design, reduces the required optical elements, significantly reduces spectral noise from the optical elements, and allows for fast image acquisition, enabling high-quality imaging in vivo. Such a technology provides a means of detecting disease at the single-cell level, such as cancer, precancer, ischemic, traumatic or other types of injury, infection, or other diseases or conditions causing alterations in cells and tissue microstructures.
Software Analytical Instrument for Assessment of the Process of Casting Slabs
NASA Astrophysics Data System (ADS)
Franěk, Zdeněk; Kavička, František; Štětina, Josef; Masarik, Miloš
2010-06-01
The paper describes the design and function of a software system for assessing the process of casting slabs. The program system LITIOS was developed and implemented at EVRAZ Vitkovice Steel Ostrava on the equipment for continuous casting of steel (ECC). This program system works on a data warehouse of technological casting parameters and slab quality parameters. It enables an ECC technologist to analyze the course of a casting melt and, using statistical methods, to determine the influence of individual technological parameters on the quality of the final slabs. The system also enables long-term monitoring and optimization of production.
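The kind of analysis such a system supports - relating individual technological parameters to final slab quality over a data warehouse - can be sketched as a simple statistical screen. The variables, units, and data below are invented for illustration and are not drawn from LITIOS.

```python
# A minimal sketch of a statistical screen over casting data: correlate each
# technological parameter with a slab quality index, then fit a joint linear
# model. Column names and data are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 200
casting_speed = rng.normal(1.0, 0.1, n)    # m/min
superheat = rng.normal(25.0, 5.0, n)       # K above liquidus
defect_index = 0.8 * superheat - 2.0 * casting_speed + rng.normal(0, 3, n)

X = np.column_stack([casting_speed, superheat])
for name, col in zip(["casting_speed", "superheat"], X.T):
    r = np.corrcoef(col, defect_index)[0, 1]
    print(f"{name}: r = {r:+.2f}")

# Least-squares fit to estimate each parameter's influence jointly.
A = np.column_stack([X, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, defect_index, rcond=None)
print("fitted influence (speed, superheat, intercept):", np.round(coef, 2))
```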
A vacuum flash-assisted solution process for high-efficiency large-area perovskite solar cells
NASA Astrophysics Data System (ADS)
Li, Xiong; Bi, Dongqin; Yi, Chenyi; Décoppet, Jean-David; Luo, Jingshan; Zakeeruddin, Shaik Mohammed; Hagfeldt, Anders; Grätzel, Michael
2016-07-01
Metal halide perovskite solar cells (PSCs) currently attract enormous research interest because of their high solar-to-electric power conversion efficiency (PCE) and low fabrication costs, but their practical development is hampered by difficulties in achieving high performance with large-size devices. We devised a simple vacuum flash-assisted solution processing method to obtain shiny, smooth, crystalline perovskite films of high electronic quality over large areas. This enabled us to fabricate solar cells with an aperture area exceeding 1 square centimeter, a maximum efficiency of 20.5%, and a certified PCE of 19.6%. By contrast, the best certified PCE to date is 15.6% for PSCs of similar size. We demonstrate that the reproducibility of the method is excellent and that the cells show virtually no hysteresis. Our approach enables the realization of highly efficient large-area PSCs for practical deployment.
Lensless Photoluminescence Hyperspectral Camera Employing Random Speckle Patterns.
Žídek, Karel; Denk, Ondřej; Hlubuček, Jiří
2017-11-10
We propose and demonstrate a spectrally resolved photoluminescence imaging setup based on the so-called single-pixel camera - a compressive sensing technique which enables imaging by using a single-pixel photodetector. The method relies on encoding an image with a series of random patterns. In our approach, the image encoding was achieved via laser speckle patterns generated by an excitation laser beam scattered on a diffusor. By using a spectrometer as the single-pixel detector we realized a spectrally resolved photoluminescence camera of unmatched simplicity. We present reconstructed hyperspectral images of several model scenes. We also discuss parameters affecting the imaging quality, such as the correlation degree of the speckle patterns, pattern fineness, and the number of data points. Finally, we compare the presented technique to hyperspectral imaging using sample scanning. The presented method enables photoluminescence imaging for a broad range of coherent excitation sources and detection spectral regions.
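A minimal single-pixel reconstruction, under simplifying assumptions, looks as follows: plain random binary patterns stand in for the laser speckle patterns, and sparse recovery (Lasso) stands in for the reconstruction algorithm. In the hyperspectral setting each detector reading would be a full spectrum, so the recovery would be repeated per wavelength bin.

```python
# A minimal sketch of single-pixel imaging with random patterns: measurements
# y_i = <P_i, x>, followed by sparse recovery. Speckle statistics are replaced
# by plain random on/off patterns for simplicity.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n = 16 * 16                       # image pixels
m = 130                           # number of patterns (< n: compressive)

x = np.zeros(n)                   # ground-truth scene: a few bright emitters
x[rng.choice(n, size=8, replace=False)] = rng.uniform(0.5, 1.0, 8)

P = rng.integers(0, 2, size=(m, n)).astype(float)   # random on/off patterns
y = P @ x + rng.normal(0, 0.01, m)                  # single-pixel readings

x_hat = Lasso(alpha=1e-3, max_iter=10000).fit(P, y).coef_
print("reconstruction error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```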
Segment scheduling method for reducing 360° video streaming latency
NASA Astrophysics Data System (ADS)
Gudumasu, Srinivas; Asbun, Eduardo; He, Yong; Ye, Yan
2017-09-01
360° video is an emerging new format in the media industry enabled by the growing availability of virtual reality devices. It provides the viewer a new sense of presence and immersion. Compared to conventional rectilinear video (2D or 3D), 360° video poses a new and difficult set of engineering challenges on video processing and delivery. Enabling a comfortable and immersive user experience requires very high video quality and very low latency, while the large video file size poses a challenge to delivering 360° video at high quality and at scale. Conventionally, 360° video represented in equirectangular or other projection formats can be encoded as a single standards-compliant bitstream using existing video codecs such as H.264/AVC or H.265/HEVC. Such a method usually needs very high bandwidth to provide an immersive user experience, and at the client side much of that bandwidth and the computational power used to decode the video are wasted, because the user only watches a small portion (i.e., the viewport) of the entire picture. Viewport-dependent 360° video processing and delivery approaches spend more bandwidth on the viewport than on non-viewports and are therefore able to reduce the overall transmission bandwidth. This paper proposes a dual-buffer segment scheduling algorithm for viewport adaptive streaming methods to reduce latency when switching between high-quality viewports in 360° video streaming. The approach decouples the scheduling of viewport segments and non-viewport segments to ensure the viewport segment requested matches the latest user head orientation. A base-layer buffer stores all lower-quality segments, and a viewport buffer stores high-quality viewport segments corresponding to the most recent viewer's head orientation. The scheduling scheme determines the viewport requesting time based on the buffer status and the head orientation. This paper also discusses how to deploy the proposed scheduling design for various viewport adaptive video streaming methods. The proposed dual-buffer segment scheduling method is implemented in an end-to-end tile-based 360° viewport adaptive video streaming platform, where the entire 360° video is divided into a number of tiles, and each tile is independently encoded into multiple quality-level representations. The client requests different quality-level representations of each tile based on the viewer's head orientation and the available bandwidth, and then composes all tiles together for rendering. The simulation results verify that the proposed dual-buffer segment scheduling algorithm reduces the viewport switch latency and utilizes the available bandwidth more efficiently. As a result, a more consistent immersive 360° video viewing experience can be presented to the user.
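The core scheduling decision can be sketched as follows. This is a minimal illustration of the dual-buffer idea, not the paper's algorithm: buffer targets, tile naming, and the request policy are invented for the example.

```python
# A minimal sketch of dual-buffer scheduling: non-viewport segments are kept
# well ahead in a base buffer, while high-quality viewport segments are
# requested as late as the buffer allows so they match the most recent head
# orientation. Thresholds and types are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class BufferState:
    base_seconds: float       # buffered low-quality (all tiles) playback time
    viewport_seconds: float   # buffered high-quality viewport playback time

def next_request(state: BufferState, head_yaw_deg: float,
                 base_target: float = 10.0, viewport_target: float = 2.0):
    """Decide what to request next. Returns (buffer_name, tile_set)."""
    if state.base_seconds < base_target:
        return ("base", "all_tiles_low_quality")
    if state.viewport_seconds < viewport_target:
        # Requesting late keeps the viewport tiles aligned with current gaze.
        viewport_tiles = f"tiles_near_yaw_{int(head_yaw_deg) % 360}"
        return ("viewport", viewport_tiles)
    return ("idle", None)

print(next_request(BufferState(base_seconds=12.0, viewport_seconds=0.5),
                   head_yaw_deg=95.0))
```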
Nemiroski, Alex; Soh, Siowling; Kwok, Sen Wai; Yu, Hai-Dong; Whitesides, George M
2016-02-03
Magnetic levitation (MagLev) of diamagnetic or weakly paramagnetic materials suspended in a paramagnetic solution in a magnetic field gradient provides a simple method to measure the density of small samples of solids or liquids. One major limitation of this method, thus far, has been an inability to measure or manipulate materials outside of a narrow range of densities (0.8 g/cm(3) < ρ < 2.3 g/cm(3)) that are close in density to the suspending aqueous medium. This paper explores a simple method - "tilted MagLev" - to increase the range of densities that can be levitated magnetically. Tilting the MagLev device relative to the gravitational vector enables the component of the gravitational force along the axis of measurement to be decreased (relative to the magnetic force). This approach enables many practical measurements over the entire range of densities observed in matter at ambient conditions - from air bubbles (ρ ≈ 0) to osmium and iridium (ρ ≈ 23 g/cm(3)). The ability to levitate, simultaneously, objects with a broad range of different densities provides an operationally simple method that may find application to forensic science (e.g., for identifying the composition of miscellaneous objects or powders), industrial manufacturing (e.g., for quality control of parts), or resource-limited settings (e.g., for identifying and separating small particles of metals and alloys).
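In practice, the density read-out in MagLev (tilted or not) is commonly a linear calibration between levitation position and density established with density standards; the sketch below illustrates such a two-point calibration with invented numbers, not data from the paper.

```python
# A minimal sketch of the routine MagLev density read-out: calibrate the
# (approximately linear) position-density relation with two density standards,
# then convert a sample's levitation position to density. The tilt angle enters
# only through the calibration, so the same two-point scheme applies.
import numpy as np

rho_std = np.array([1.05, 1.35])     # standard bead densities (g/cm^3)
h_std = np.array([12.0, 28.0])       # measured positions along the axis (mm)

slope = (rho_std[1] - rho_std[0]) / (h_std[1] - h_std[0])
intercept = rho_std[0] - slope * h_std[0]

def density(h_mm: float) -> float:
    return slope * h_mm + intercept

print(f"sample at 20.0 mm -> {density(20.0):.3f} g/cm^3")
```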
An Analysis Method for Superconducting Resonator Parameter Extraction with Complex Baseline Removal
NASA Technical Reports Server (NTRS)
Cataldo, Giuseppe
2014-01-01
A new semi-empirical model is proposed for extracting the quality (Q) factors of arrays of superconducting microwave kinetic inductance detectors (MKIDs). The determination of the total internal and coupling Q factors enables the computation of the loss in the superconducting transmission lines. The method used allows the simultaneous analysis of multiple interacting discrete resonators with the presence of a complex spectral baseline arising from reflections in the system. The baseline removal allows an unbiased estimate of the device response as measured in a cryogenic instrumentation setting.
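A common way to realize this kind of extraction - though not necessarily the paper's exact semi-empirical model - is to fit a notch-type resonator response multiplied by a complex linear baseline, as in the following sketch with synthetic data.

```python
# A minimal sketch of resonator parameter extraction in the presence of a
# complex baseline: a standard notch-type S21 model times a linear complex
# baseline, fit to synthetic data. The model form is a common MKID choice,
# assumed here for illustration.
import numpy as np
from scipy.optimize import least_squares

def s21(f, f0, Q, Qc, a_re, a_im, b_re, b_im):
    baseline = (a_re + 1j * a_im) + (b_re + 1j * b_im) * (f - f.mean())
    resonance = 1 - (Q / Qc) / (1 + 2j * Q * (f - f0) / f0)
    return baseline * resonance

f = np.linspace(4.999e9, 5.001e9, 400)
true = dict(f0=5e9, Q=50_000, Qc=80_000, a_re=0.9, a_im=0.1,
            b_re=2e-9, b_im=-1e-9)
rng = np.random.default_rng(3)
data = s21(f, **true) + 0.002 * (rng.normal(size=f.size) + 1j * rng.normal(size=f.size))

def residuals(p):
    model = s21(f, *p)
    return np.concatenate([(model - data).real, (model - data).imag])

p0 = [5e9, 3e4, 6e4, 1.0, 0.0, 0.0, 0.0]
fit = least_squares(residuals, p0, x_scale=[1e9, 1e4, 1e4, 1, 1, 1e-9, 1e-9])
f0, Q, Qc = fit.x[:3]
Qi = 1 / (1 / Q - 1 / Qc)   # internal Q from total and coupling Q
print(f"f0={f0/1e9:.6f} GHz, Q={Q:.0f}, Qc={Qc:.0f}, Qi={Qi:.0f}")
```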
Morgan, R M
2017-11-01
This paper builds on the FoRTE conceptual model presented in part I to address the forms of knowledge that are integral to the four components of the model. Articulating the different forms of knowledge within effective forensic reconstructions is valuable. It enables a nuanced approach to the development and use of evidence bases to underpin decision-making at every stage of a forensic reconstruction by enabling transparency in the reporting of inferences. It also enables appropriate methods to be developed to ensure quality and validity. It is recognised that the domains of practice, research, and policy/law intersect to form the nexus where forensic science is situated. Each domain has a distinctive infrastructure that influences the production and application of different forms of knowledge in forensic science. The channels that can enable the interaction between these domains, enhance the impact of research in theory and practice, increase access to research findings, and support quality are presented. The particular strengths within the different domains to deliver problem-solving forensic reconstructions are thereby identified and articulated. It is argued that a conceptual understanding of forensic reconstruction that draws on the full range of both explicit and tacit forms of knowledge, and incorporates the strengths of the different domains pertinent to forensic science, offers a pathway to harness the full value of trace evidence for context-sensitive, problem-solving forensic applications. Copyright © 2017 The Author. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Qiu, Guoping; Kheiri, Ahmed
2011-01-01
Current subjective image quality assessments have been developed in laboratory environments, under controlled conditions, and depend on the participation of limited numbers of observers. In this research, with the help of Web 2.0 and social media technology, a new method for building a subjective image quality metric has been developed in which the observers are Internet users. A website with a simple user interface that enables Internet users from anywhere at any time to vote for the better-quality version of a pair of the same image has been constructed. Users' votes are recorded and used to rank the images according to their perceived visual qualities. We have developed three rank aggregation algorithms to process the recorded pair comparison data: the first uses a naive approach, the second employs a Condorcet method, and the third uses Dykstra's extension of the Bradley-Terry method. The website has been collecting data for about three months and had accumulated over 10,000 votes at the time of writing this paper. Results show that the Internet and its allied technologies, such as crowdsourcing, offer a promising new paradigm for image and video quality assessment in which hundreds of thousands of Internet users can contribute to building more robust image quality metrics. We have made Internet-user-generated social image quality (SIQ) data for a public image database available online (http://www.hdri.cs.nott.ac.uk/siq/) to provide the image quality research community with a new source of ground truth data. The website continues to collect votes and will include more public image databases; it will also be extended to include videos to collect social video quality (SVQ) data. All data will be made publicly available on the website in due course.
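As an illustration of the third aggregation route, the sketch below fits a Bradley-Terry model to an invented matrix of pairwise votes using the classic MM (Zermelo) iteration; the Dykstra extension used in the paper is omitted for brevity.

```python
# A minimal sketch of aggregating pairwise image-quality votes with the
# Bradley-Terry model, fitted with the standard MM (Zermelo) iteration.
# The vote matrix is invented.
import numpy as np

# wins[i, j] = number of votes saying image i looks better than image j.
wins = np.array([
    [0, 8, 6, 9],
    [2, 0, 5, 7],
    [4, 5, 0, 6],
    [1, 3, 4, 0],
], dtype=float)

n = wins.shape[0]
w = np.ones(n)
for _ in range(200):
    total = wins + wins.T          # comparisons made between each pair
    denom = (total / (w[:, None] + w[None, :])).sum(axis=1)
    w = wins.sum(axis=1) / denom   # MM update of the strength parameters
    w /= w.sum()                   # fix the arbitrary scale
print("quality scores (higher = better):", np.round(w, 3))
```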
Comans, Tracy A; Nguyen, Kim-Huong; Mulhern, Brendan; Corlis, Megan; Li, Li; Welch, Alyssa; Kurrle, Susan E; Rowen, Donna; Moyle, Wendy; Kularatna, Sanjeewa; Ratcliffe, Julie
2018-01-01
Introduction Generic instruments for assessing health-related quality of life may lack the sensitivity to detect changes in health specific to certain conditions, such as dementia. The Quality of Life in Alzheimer’s Disease (QOL-AD) is a widely used and well-validated condition-specific instrument for assessing health-related quality of life for people living with dementia, but it does not enable the calculation of quality-adjusted life years, the basis of cost utility analysis. This study will generate a preference-based scoring algorithm for a health state classification system - the Alzheimer’s Disease Five Dimensions (AD-5D) - derived from the QOL-AD. Methods and analysis Discrete choice experiments with duration (DCETTO) and best–worst scaling health state valuation tasks will be administered to a representative sample of 2000 members of the Australian general population via an online survey and to 250 dementia dyads (250 people with dementia and their carers) via face-to-face interview. A multinomial (conditional) logistic framework will be used to analyse responses and produce the utility algorithm for the AD-5D. Ethics and dissemination The algorithms developed will enable prospective and retrospective economic evaluation of any treatment or intervention targeting people with dementia where the QOL-AD has been administered and will be available online. Results will be disseminated through journals that publish health economics articles and through professional conferences. This study has ethical approval. PMID:29358437
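The multinomial (conditional) logit estimation at the heart of such valuation studies can be sketched compactly. Everything below - attribute coding, data, and true weights - is invented; it only illustrates the likelihood being maximized.

```python
# A minimal sketch of conditional logit estimation for discrete choice data:
# choices among hypothetical health states described by attribute levels.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n_tasks, n_alts, n_attrs = 300, 3, 5
X = rng.integers(0, 3, size=(n_tasks, n_alts, n_attrs)).astype(float)
beta_true = np.array([-0.5, -0.3, -0.8, -0.2, -0.6])

# Simulate choices under random-utility assumptions (Gumbel errors).
util = X @ beta_true + rng.gumbel(size=(n_tasks, n_alts))
choice = util.argmax(axis=1)

def neg_log_lik(beta):
    v = X @ beta                                  # systematic utilities
    v -= v.max(axis=1, keepdims=True)             # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(n_tasks), choice].sum()

fit = minimize(neg_log_lik, np.zeros(n_attrs), method="BFGS")
print("estimated attribute weights:", np.round(fit.x, 2))
```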
NASA Astrophysics Data System (ADS)
Obropta, Christopher C.; Niazi, Mehran; Kardos, Josef S.
2008-12-01
Environmental decision support systems (EDSSs) are an emerging tool used to integrate the evaluation of highly complex and interrelated physicochemical, biological, hydrological, social, and economic aspects of environmental problems. An EDSS approach is developed to address hot-spot concerns for a water quality trading program intended to implement the total maximum daily load (TMDL) for phosphorus in the Non-Tidal Passaic River Basin of New Jersey. Twenty-two wastewater treatment plants (WWTPs) spread throughout the watershed are considered the major sources of phosphorus loading to the river system. Periodic surface water diversions to a major reservoir from the confluence of two key tributaries alter the natural hydrology of the watershed and must be considered in the development of a trading framework that ensures protection of water quality. An EDSS is applied that enables the selection of a water quality trading framework that protects the watershed from phosphorus-induced hot spots. The EDSS employs Simon’s (1960) three stages of the decision-making process: intelligence, design, and choice. The identification of two potential hot spots and three diversion scenarios enables the delineation of three management areas for buying and selling of phosphorus credits among WWTPs. The result shows that the most conservative option entails consideration of two possible diversion scenarios, and trading between management areas is restricted accordingly. The method described here is believed to be the first application of an EDSS to a water quality trading program that explicitly accounts for surface water diversions.
Kim, Tae Hyung; Setsompop, Kawin; Haldar, Justin P.
2016-01-01
Purpose Parallel imaging and partial Fourier acquisition are two classical approaches for accelerated MRI. Methods that combine these approaches often rely on prior knowledge of the image phase, but the need to obtain this prior information can place practical restrictions on the data acquisition strategy. In this work, we propose and evaluate SENSE-LORAKS, which enables combined parallel imaging and partial Fourier reconstruction without requiring prior phase information. Theory and Methods The proposed formulation is based on combining the classical SENSE model for parallel imaging data with the more recent LORAKS framework for MR image reconstruction using low-rank matrix modeling. Previous LORAKS-based methods have successfully enabled calibrationless partial Fourier parallel MRI reconstruction, but have been most successful with nonuniform sampling strategies that may be hard to implement for certain applications. By combining LORAKS with SENSE, we enable highly accelerated partial Fourier MRI reconstruction for a broader range of sampling trajectories, including widely used calibrationless uniformly undersampled trajectories. Results Our empirical results with retrospectively undersampled datasets indicate that when SENSE-LORAKS reconstruction is combined with an appropriate k-space sampling trajectory, it can provide substantially better image quality at high acceleration rates relative to existing state-of-the-art reconstruction approaches. Conclusion The SENSE-LORAKS framework provides promising new opportunities for highly accelerated MRI. PMID:27037836
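The low-rank structured-matrix idea behind LORAKS-type methods can be illustrated with a 1D toy problem: for a sparse image, the Hankel matrix built from its k-space samples is exactly low rank, so missing samples can be recovered by alternating rank truncation with data consistency. This sketch illustrates the principle only; it is not the SENSE-LORAKS algorithm.

```python
# A minimal 1D toy of low-rank structured-matrix recovery: build a Hankel
# matrix from undersampled k-space, truncate its rank, average back, and
# re-impose the measured samples (a Cadzow-style iteration).
import numpy as np

rng = np.random.default_rng(5)
n, win, rank = 64, 32, 6

x = np.zeros(n)                                  # sparse toy "image"
x[rng.choice(n, size=rank, replace=False)] = rng.uniform(0.5, 1.5, rank)
k_full = np.fft.fft(x)
mask = rng.random(n) < 0.6                       # keep ~60% of k-space
k = np.where(mask, k_full, 0)

rows = n - win + 1
est = k.copy()
for _ in range(200):
    H = np.array([est[i:i + win] for i in range(rows)])   # Hankel matrix
    U, s, Vh = np.linalg.svd(H, full_matrices=False)
    H = (U[:, :rank] * s[:rank]) @ Vh[:rank]              # rank truncation
    acc = np.zeros(n, complex); cnt = np.zeros(n)
    for i in range(rows):                                 # de-Hankelization
        acc[i:i + win] += H[i]
        cnt[i:i + win] += 1
    est = acc / cnt
    est[mask] = k[mask]                                   # data consistency

x_rec = np.fft.ifft(est).real
print("relative error:", np.linalg.norm(x_rec - x) / np.linalg.norm(x))
```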
Image registration: enabling technology for image guided surgery and therapy.
Sauer, Frank
2005-01-01
Imaging looks inside the patient's body, exposing the patient's anatomy beyond what is visible on the surface. Medical imaging has a very successful history in medical diagnosis. It also plays an increasingly important role as an enabling technology for minimally invasive procedures. Interventional procedures (e.g. catheter-based cardiac interventions) are traditionally supported by intra-procedure imaging (X-ray fluoro, ultrasound). There is real-time feedback, but the images provide limited information. Surgical procedures are traditionally supported with pre-operative images (CT, MR). The image quality can be very good; however, the link between images and patient has been lost. For both cases, image registration can play an essential role - augmenting intra-op images with pre-op images, and mapping pre-op images to the patient's body. We will present examples of both approaches from an application-oriented perspective, covering electrophysiology, radiation therapy, and neurosurgery. Ultimately, as the boundaries between interventional radiology and surgery become blurry, the different methods for image guidance will also merge. Image guidance will draw upon a combination of pre-op and intra-op imaging together with magnetic or optical tracking systems, and enable precise minimally invasive procedures. The information is registered into a common coordinate system, and allows advanced methods for visualization such as augmented reality or advanced methods for therapy delivery such as robotics.
Primary Care Practice Transformation Is Hard Work
Crabtree, Benjamin F.; Nutting, Paul A.; Miller, William L.; McDaniel, Reuben R.; Stange, Kurt C.; Jaén, Carlos Roberto; Stewart, Elizabeth
2010-01-01
Background Serious shortcomings remain in clinical care in the United States despite widespread use of improvement strategies for enhancing clinical performance based on knowledge transfer approaches. Recent calls to transform primary care practice to a patient-centered medical home present even greater challenges and require more effective approaches. Methods Our research team conducted a series of National Institutes of Health funded descriptive and intervention projects to understand organizational change in primary care practice settings, emphasizing a complexity science perspective. The result was a developmental research effort that enabled the identification of critical lessons relevant to enabling practice change. Results A summary of findings from a 15-year program of research highlights the limitations of viewing primary care practices in the mechanistic terms that underlie current or traditional approaches to quality improvement. A theoretical perspective that views primary care practices as dynamic complex adaptive systems with “agents” who have the capacity to learn, and the freedom to act in unpredictable ways provides a better framework for grounding quality improvement strategies. This framework strongly emphasizes that quality improvement interventions should not only use a complexity systems perspective, but also there is a need for continual reflection, careful tailoring of interventions, and ongoing attention to the quality of interactions among agents in the practice. Conclusions It is unlikely that current strategies for quality improvement will be successful in transforming current primary care practice to a patient-centered medical home without a stronger guiding theoretical foundation. Our work suggests that a theoretical framework guided by complexity science can help in the development of quality improvement strategies that will more effectively facilitate practice change. PMID:20856145
Patient perspectives of telemedicine quality
LeRouge, Cynthia M; Garfield, Monica J; Hevner, Alan R
2015-01-01
Background The purpose of this study was to explore the quality attributes required for effective telemedicine encounters from the perspective of the patient. Methods We used a multi-method (direct observation, focus groups, survey) field study to collect data from patients who had experienced telemedicine encounters. Multi-perspectives (researcher and provider) were used to interpret a rich set of data from both a research and practice perspective. Results The result of this field study is a taxonomy of quality attributes for telemedicine service encounters that prioritizes the attributes from the patient perspective. We identify opportunities to control the level of quality for each attribute (ie, who is responsible for control of each attribute and when control can be exerted in relation to the encounter process). This analysis reveals that many quality attributes are in the hands of various stakeholders, and all attributes can be addressed proactively to some degree before the encounter begins. Conclusion Identification of the quality attributes important to a telemedicine encounter from a patient perspective enables one to better design telemedicine encounters. This preliminary work not only identifies such attributes, but also ascertains who is best able to address quality issues prior to an encounter. For practitioners, explicit representation of the quality attributes of technology-based systems and processes and insight on controlling key attributes are essential to implementation, utilization, management, and common understanding. PMID:25565781
Measuring the patient experience.
Lees, Carolyn
2011-01-01
This paper examines the complex issues of measuring the patient experience and evaluating the quality of health care. It discusses the use of surveys, patient stories and narrative methods of data collection in an attempt to define quality and how it should be measured. A recent Department of Health (DH) document insists that patients will be at the heart of decision making in the NHS by having greater control in informing strategic commissioning decisions (DH 2010c). The government aims to improve patient experience, enabling patients to rate services according to the quality of care they receive. This will be carried out using information generated by patients. This paper discusses the advantages and disadvantages of using surveys in gathering patient satisfaction data. It considers the value of surveys in measuring quality of care and appraises their usefulness in strengthening patients' collective voice. The paper investigates the use of another source of feedback - it examines the design of qualitative data collection methods as a means of gaining feedback from service users in encouraging providers of health care to be more responsive to their needs. Too often, patients are expected to fit the services, rather than services meeting the patients' needs. The most effective way of exploring and representing the patient's experience is by using a mixed-method approach. In other words, an integrated approach with the use of surveys and more narrative methods, such as patient stories, will effectively define quality and how it should be measured, ensuring that the focus is always on what matters most to patients.
Analysis of nonreciprocal noise based on mode splitting in a high-Q optical microresonator
NASA Astrophysics Data System (ADS)
Yang, Zhaohua; Xiao, Yarong; Huo, Jiayan; Shao, Hui
2018-01-01
The whispering gallery mode optical microresonator offers a high quality factor, which enables it to act as the core component of a high-sensitivity resonator optic gyro; however, nonreciprocal noise limits its precision. Considering the Sagnac effect, i.e. mode splitting in high-quality optical microresonators, we derive the explicit expression for the angular velocity versus the splitting amount, and verify the sensing mechanism by simulation using the finite element method. Remarkably, the accuracy of the angular velocity measurement in a whispering gallery mode optical microresonator with a quality factor of 10^8 is 10^6 °/s. We obtain the optimal coupling position of the novel angular velocity sensing system by examining the output transmittance spectra for different vertical coupling distances and axial coupling positions. In addition, the origin of the nonreciprocal phenomenon is determined by theoretical analysis of the evanescent field distribution of a tapered fiber. These results provide an effective method and a theoretical basis for the suppression of nonreciprocal noise.
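Converting a measured splitting into an angular velocity can be sketched if one assumes the standard Sagnac scaling for a circular cavity, Δf = 4AΩ/(λP); the resonator dimensions below are invented, and the exact expression derived in the paper may differ.

```python
# A minimal sketch converting a measured mode splitting into an angular
# velocity, assuming the standard Sagnac scaling for a circular resonator:
# delta_f = 4 * A * Omega / (lambda * P). Numbers are illustrative only.
import numpy as np

R = 100e-6          # resonator radius (m), hypothetical
lam = 1.55e-6       # wavelength (m)
A = np.pi * R**2    # enclosed area
P = 2 * np.pi * R   # perimeter

def omega_from_splitting(delta_f_hz: float) -> float:
    omega = delta_f_hz * lam * P / (4 * A)   # rad/s
    return np.degrees(omega)                 # deg/s

print(f"1 kHz splitting -> {omega_from_splitting(1e3):.3g} deg/s")
```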
Adaptive optical fluorescence microscopy.
Ji, Na
2017-03-31
The past quarter century has witnessed rapid developments of fluorescence microscopy techniques that enable structural and functional imaging of biological specimens at unprecedented depth and resolution. The performance of these methods in multicellular organisms, however, is degraded by sample-induced optical aberrations. Here I review recent work on incorporating adaptive optics, a technology originally applied in astronomical telescopes to combat atmospheric aberrations, to improve image quality of fluorescence microscopy for biological imaging.
Development of a synoptic MRI report for primary rectal cancer.
Spiegle, Gillian; Leon-Carlyle, Marisa; Schmocker, Selina; Fruitman, Mark; Milot, Laurent; Gagliardi, Anna R; Smith, Andy J; McLeod, Robin S; Kennedy, Erin D
2009-12-02
Although magnetic resonance imaging (MRI) is an important imaging modality for pre-operative staging and surgical planning of rectal cancer, to date there has been little investigation of the completeness and overall quality of MRI reports. This is important because optimal patient care depends on the quality of the MRI report and clear communication of these reports to treating physicians. Previous work has shown that the use of synoptic pathology reports improves the quality of pathology reports and communication between physicians. The aims of this project are to develop a synoptic MRI report for rectal cancer and to determine the enablers of and barriers to the implementation of a synoptic MRI report for rectal cancer in the clinical setting. A three-step Delphi process with an expert panel, following a review of the literature, will extract the key criteria for the MRI report to guide pre-operative chemoradiation and surgical planning, and a synoptic template will be developed. Furthermore, standardized qualitative research methods will be used to conduct interviews with radiologists to determine the enablers of and barriers to the implementation and sustainability of the synoptic MRI report in the clinical setting. Synoptic MRI reports for rectal cancer are currently not used in North America and may improve the overall quality of MRI reports and communication between physicians. This may, in turn, lead to improved patient care and outcomes for rectal cancer patients.
Dental MRI using wireless intraoral coils
NASA Astrophysics Data System (ADS)
Ludwig, Ute; Eisenbeiss, Anne-Katrin; Scheifele, Christian; Nelson, Katja; Bock, Michael; Hennig, Jürgen; von Elverfeldt, Dominik; Herdt, Olga; Flügge, Tabea; Hövener, Jan-Bernd
2016-03-01
Currently, the gold standard for dental imaging is projection radiography or cone-beam computed tomography (CBCT). These methods are fast and cost-efficient, but exhibit poor soft tissue contrast and expose the patient to ionizing radiation (X-rays). The need for an alternative imaging modality, e.g. for soft tissue management, has stimulated a rising interest in dental magnetic resonance imaging (MRI), which provides superior soft tissue contrast. Compared to X-ray imaging, however, the spatial resolution of MRI is so far lower and the scan time is longer. In this contribution, we describe wireless, inductively coupled intraoral coils whose local sensitivity enables high-resolution MRI of dental soft tissue. In comparison to CBCT, a similar image quality with complementary contrast was obtained ex vivo. In vivo, a voxel size on the order of 250 × 250 × 500 μm³ was achieved in only 4 min. Compared to dental MRI acquired with clinical equipment, the quality of the images was superior in the sensitive volume of the coils and is expected to improve the planning of interventions and subsequent monitoring. This method may enable more accurate dental diagnosis and avoid unnecessary interventions, improving patient welfare and bringing MRI a step closer to becoming a radiation-free alternative for dental imaging.
Optimisation in the Design of Environmental Sensor Networks with Robustness Consideration
Budi, Setia; de Souza, Paulo; Timms, Greg; Malhotra, Vishv; Turner, Paul
2015-01-01
This work proposes the design of Environmental Sensor Networks (ESN) through balancing robustness and redundancy. An Evolutionary Algorithm (EA) is employed to find the optimal placement of sensor nodes in the Region of Interest (RoI). Data quality issues are introduced to simulate their impact on the performance of the ESN. Spatial Regression Test (SRT) is also utilised to promote robustness in data quality of the designed ESN. The proposed method provides high network representativeness (fit for purpose) with minimum sensor redundancy (cost), and ensures robustness by enabling the network to continue to achieve its objectives when some sensors fail. PMID:26633392
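A minimal evolutionary placement loop in the spirit of the paper - though with invented fitness terms and a simple (1+1) strategy rather than the authors' EA - might look like this: coverage of the region of interest is rewarded and overlapping (redundant) coverage is mildly penalized.

```python
# A minimal (1+1) evolutionary strategy for sensor placement on a grid:
# maximise RoI coverage while penalising redundant overlap. All parameters
# are illustrative assumptions, not those of the paper.
import numpy as np

rng = np.random.default_rng(6)
grid, n_sensors, radius = 20, 6, 4.0

def fitness(pos):
    yy, xx = np.mgrid[0:grid, 0:grid]
    d = np.sqrt((yy[..., None] - pos[:, 0])**2 + (xx[..., None] - pos[:, 1])**2)
    covered_by = (d <= radius).sum(axis=-1)          # sensors covering each cell
    coverage = (covered_by >= 1).mean()              # fraction of RoI covered
    redundancy = np.clip(covered_by - 1, 0, None).mean()
    return coverage - 0.1 * redundancy

pos = rng.uniform(0, grid, size=(n_sensors, 2))
best = fitness(pos)
for _ in range(2000):
    cand = np.clip(pos + rng.normal(0, 1.0, pos.shape), 0, grid - 1)  # mutate
    if (f := fitness(cand)) >= best:
        pos, best = cand, f
print(f"final fitness: {best:.3f}")
```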
Ovretveit, John; Mittman, Brian; Rubenstein, Lisa; Ganz, David A
2017-10-09
Purpose The purpose of this paper is to enable improvers to use recent knowledge from implementation science to carry out improvement changes more effectively. It also highlights the importance of converting research findings into practical tools and guidance for improvers so as to make research easier to apply in practice. Design/methodology/approach This study provides an illustration of how a quality improvement (QI) team project can make use of recent findings from implementation research so as to make their improvement changes more effective and sustainable. The guidance is based on a review and synthesis of improvement and implementation methods. Findings The paper illustrates how research can help a quality project team in the phases of problem definition and preparation, in design and planning, in implementation, and in sustaining and spreading a QI. Examples of the use of different ideas and methods are cited where they exist. Research limitations/implications The example is illustrative, and there is limited experimental evidence of whether using all the steps and tools in the proposed approach enables a quality team to be more effective. Evidence supporting individual guidance proposals is cited where it exists. Practical implications If the steps proposed and illustrated in the paper were followed, it is possible that quality projects could avoid waste by ensuring the conditions they need for success are in place, and could sustain and spread improvement changes more effectively. Social implications More patients could benefit more quickly from more effective implementation of proven interventions. Originality/value The paper is the first to describe how improvement and implementation science can be combined in a tangible way that practical improvers can use in their projects. It shows how QI project teams can take advantage of recent advances in improvement and implementation science to make their work more effective and sustainable.
A Quality Assurance Method that Utilizes 3D Dosimetry and Facilitates Clinical Interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldham, Mark, E-mail: mark.oldham@duke.edu; Thomas, Andrew; O'Daniel, Jennifer
2012-10-01
Purpose: To demonstrate a new three-dimensional (3D) quality assurance (QA) method that provides comprehensive dosimetry verification and facilitates evaluation of the clinical significance of QA data acquired in a phantom. Also to apply the method to investigate the dosimetric efficacy of base-of-skull (BOS) intensity-modulated radiotherapy (IMRT) treatment. Methods and Materials: Two types of IMRT QA verification plans were created for 6 patients who received BOS IMRT. The first plan enabled conventional 2D planar IMRT QA using the Varian portal dosimetry system. The second plan enabled 3D verification using an anthropomorphic head phantom. In the latter, the 3D dose distribution was measured using the DLOS/Presage dosimetry system (DLOS = Duke Large-field-of-view Optical-CT System, Presage Heuris Pharma, Skillman, NJ), which yielded isotropic 2-mm data throughout the treated volume. In a novel step, measured 3D dose distributions were transformed back to the patient's CT to enable calculation of dose-volume histograms (DVH) and dose overlays. Measured and planned patient DVHs were compared to investigate clinical significance. Results: Close agreement between measured and calculated dose distributions was observed for all 6 cases. For gamma criteria of 3%, 2 mm, the mean passing rate for portal dosimetry was 96.8% (range, 92.0%-98.9%), compared to 94.9% (range, 90.1%-98.9%) for 3D. There was no clear correlation between 2D and 3D passing rates. Planned and measured dose distributions were evaluated on the patient's anatomy, using DVH and dose overlays. Minor deviations were detected, and their clinical significance is presented and discussed. Conclusions: Two advantages accrue to the methods presented here. First, treatment accuracy is evaluated throughout the whole treated volume, yielding comprehensive verification. Second, the clinical significance of any deviations can be assessed through the generation of DVH curves and dose overlays on the patient's anatomy. The latter step represents an important development that advances the clinical relevance of complex treatment QA.
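The gamma criterion referenced above (3% dose difference, 2 mm distance to agreement) can be illustrated in 1D; clinical tools evaluate it on full 3D grids, and the toy dose profiles below are invented.

```python
# A minimal 1D sketch of the gamma test (3% / 2 mm) used to compare measured
# and planned dose distributions; this only illustrates the metric itself.
import numpy as np

dx = 0.5                                    # grid spacing (mm)
x = np.arange(0, 100, dx)
planned = np.exp(-((x - 50) / 15) ** 2)     # toy dose profiles
measured = np.exp(-((x - 50.8) / 15) ** 2) * 1.01

dose_tol = 0.03 * planned.max()             # 3% global dose criterion
dist_tol = 2.0                              # 2 mm distance to agreement

def gamma_pass_rate(meas, plan):
    gammas = np.empty(meas.size)
    for i, (xi, mi) in enumerate(zip(x, meas)):
        dd = (mi - plan) / dose_tol         # dose difference term
        rr = (xi - x) / dist_tol            # distance term
        gammas[i] = np.sqrt(dd**2 + rr**2).min()
    return (gammas <= 1).mean()

print(f"gamma pass rate: {100 * gamma_pass_rate(measured, planned):.1f}%")
```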
Jabbari, Hamidreza; Fakhri, Mohammad; Lotfaliani, Mojtaba; Kiani, Arda
2013-01-01
It has been suggested that hot electrocoagulation-enabled forceps (hot biopsy) may reduce hemorrhage risk after biopsy of endobronchial tumors. The main concern with this method is a possible reduction of specimen quality. The aims were to compare procedure-related hemorrhage between hot biopsy and conventional forceps biopsy, and the diagnostic quality of the specimens obtained with either technique. In this prospective study, assessment of the biopsy samples and quantification of hemorrhage were done in a blinded fashion. At first, for each patient a definite clinical diagnosis was made based on pathologic examination of all available samples, clinical data, and imaging findings. Then, a second pathologist reviewed all samples to evaluate their quality. A total of 36 patients with endobronchial lesions were included in this study. A definite diagnosis was made in 83% of the patients. The diagnostic yields of the two methods were not statistically different, while the mean hemorrhage grades of all hot biopsy protocols were significantly lower than that of conventional biopsy (p=0.003, p<0.001 and p<0.001 for the 10, 20 and 40 voltage settings, respectively). No significant difference was detected between the quality of specimens obtained by the hot biopsy methods and conventional biopsy (p>0.05 for all three voltages). Hot biopsy can be a valuable alternative to forceps biopsy in evaluating endobronchial lesions.
Morris, Meg E; Erickson, Shane; Serry, Tanya A
2016-01-01
Background Although mobile apps are readily available for speech sound disorders (SSD), their validity has not been systematically evaluated. This evidence-based appraisal will critically review and synthesize current evidence on available therapy apps for use by children with SSD. Objective The main aims are to (1) identify the types of apps currently available for Android and iOS mobile phones and tablets, and (2) to critique their design features and content using a structured quality appraisal tool. Methods This protocol paper presents and justifies the methods used for a systematic review of mobile apps that provide intervention for use by children with SSD. The primary outcomes of interest are (1) engagement, (2) functionality, (3) aesthetics, (4) information quality, (5) subjective quality, and (6) perceived impact. Quality will be assessed by 2 certified practicing speech-language pathologists using a structured quality appraisal tool. Two app stores will be searched from the 2 largest operating platforms, Android and iOS. Systematic methods of knowledge synthesis shall include searching the app stores using a defined procedure, data extraction, and quality analysis. Results This search strategy shall enable us to determine how many SSD apps are available for Android and for iOS compatible mobile phones and tablets. It shall also identify the regions of the world responsible for the apps’ development, the content and the quality of offerings. Recommendations will be made for speech-language pathologists seeking to use mobile apps in their clinical practice. Conclusions This protocol provides a structured process for locating apps and appraising the quality, as the basis for evaluating their use in speech pathology for children in English-speaking nations. PMID:27899341
Standards for Cell Line Authentication and Beyond
Cole, Kenneth D.; Plant, Anne L.
2016-01-01
Different genomic technologies have been applied to cell line authentication, but only one method (short tandem repeat [STR] profiling) has been the subject of a comprehensive and definitive standard (ASN-0002). Here we discuss the power of this document and why standards such as this are so critical for establishing the consensus technical criteria and practices that can enable progress in the fields of research that use cell lines. We also examine other methods that could be used for authentication and discuss how a combination of methods could be used in a holistic fashion to assess various critical aspects of the quality of cell lines. PMID:27300367
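For the STR method specifically, the comparison scores commonly used in authentication practice (often attributed to Tanabe and Masters; their use here is an assumption beyond the abstract's text) reduce to counting shared alleles, as sketched below with invented profiles.

```python
# A minimal sketch of STR-profile comparison scores; marker names and allele
# calls are invented examples, not reference data.
ref = {"D5S818": {11, 12}, "TH01": {6, 9.3}, "TPOX": {8, 11}, "vWA": {16, 18}}
query = {"D5S818": {11, 12}, "TH01": {6, 9.3}, "TPOX": {8}, "vWA": {16, 17}}

shared = sum(len(ref[m] & query[m]) for m in ref)   # alleles found in both
n_ref = sum(len(ref[m]) for m in ref)
n_query = sum(len(query[m]) for m in query)

tanabe = 2 * shared / (n_ref + n_query)             # 2*shared / (ref + query)
masters_query = shared / n_query                    # shared / query alleles
print(f"Tanabe: {tanabe:.0%}, Masters (vs query): {masters_query:.0%}")
```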
2014-01-01
Background Workplace learning refers to continuing professional development that is stimulated by and occurs through participation in workplace activities. Workplace learning is essential for staff development and high quality clinical care. The purpose of this study was to explore the barriers to and enablers of workplace learning for allied health professionals within NSW Health. Methods A qualitative study was conducted with a purposively selected maximum variation sample (n = 46) including 19 managers, 19 clinicians and eight educators from 10 allied health professions. Seven semi-structured interviews and nine focus groups were audio-recorded and transcribed. The ‘framework approach’ was used to guide the interviews and analysis. Textual data were coded and charted using an evolving thematic framework. Results Key enablers of workplace learning included having access to peers, expertise and ‘learning networks’, protected learning time, supportive management and positive staff attitudes. The absence of these key enablers including heavy workload and insufficient staffing were important barriers to workplace learning. Conclusion Attention to these barriers and enablers may help organisations to more effectively optimise allied health workplace learning. Ultimately better workplace learning may lead to improved patient, staff and organisational outcomes. PMID:24661614
Martin, Priya; Kumar, Saravana; Lizarondo, Lucylynn; VanErp, Ans
2015-09-24
Health professionals practising in countries with dispersed populations such as Australia rely on clinical supervision for professional support. While there are directives and guidelines in place to govern clinical supervision, little is known about how it is actually conducted and what makes it effective. The purpose of this study was to explore the enablers of and barriers to high quality clinical supervision among occupational therapists across Queensland in Australia. This qualitative study took place as part of a broader project. Individual, in-depth, semi-structured interviews were conducted with occupational therapy supervisees in Queensland. The interviews explored the enablers of and barriers to high quality clinical supervision in this group. They further explored some findings from the initial quantitative study. Content analysis of the interview data resulted in eight themes. These themes were broadly around the importance of the supervisory relationship, the impact of clinical supervision and the enablers of and barriers to high quality clinical supervision. This study identified a number of factors that were perceived to be associated with high quality clinical supervision. Supervisor-supervisee matching and fit, supervisory relationship and availability of supervisor for support in between clinical supervision sessions appeared to be associated with perceptions of higher quality of clinical supervision received. Some face-to-face contact augmented with telesupervision was found to improve perceptions of the quality of clinical supervision received via telephone. Lastly, dual roles where clinical supervision and line management were provided by the same person were not considered desirable by supervisees. A number of enablers of and barriers to high quality clinical supervision were also identified. With clinical supervision gaining increasing prominence as part of organisational and professional governance, this study provides important lessons for successful and sustainable clinical supervision in practice contexts.
Toward an integrative and predictive sperm quality analysis in Bos taurus.
Yániz, J L; Soler, C; Alquézar-Baeta, C; Santolaria, P
2017-06-01
There is a need to develop more integrative sperm quality analysis methods, enabling researchers to evaluate different parameters simultaneously, cell by cell. In this work, we present a new multi-parametric fluorescent test able to discriminate different sperm subpopulations based on their labeling pattern and motility characteristics. Cryopreserved semen samples from 20 Holstein bulls were used in the study. Analyses of sperm motility using computer-assisted sperm analysis (CASA-mot), of membrane integrity using an acridine orange-propidium iodide combination, and of multiple parameters using the ISAS® 3Fun kit were performed. The new method allows a clear discrimination of sperm subpopulations based on membrane and acrosomal integrity, motility and morphology. It was also possible to observe live spermatozoa showing signs of capacitation such as hyperactivated motility and changes in acrosomal structure. The sperm subpopulation with intact plasma membrane and acrosome showed a higher proportion of motile sperm than those with damaged acrosome or increased fluorescence intensity. Spermatozoa with intact plasmalemma and damaged acrosome were static or exhibited weak movement. Significant correlations among the different sperm quality parameters evaluated were also described. We concluded that the ISAS® 3Fun is an integrated method that represents an advance in sperm quality analysis, with the potential to improve fertility predictions. Copyright © 2017 Elsevier B.V. All rights reserved.
The Quality of Methods Reporting in Parasitology Experiments
Flórez-Vargas, Oscar; Bramhall, Michael; Noyes, Harry; Cruickshank, Sheena; Stevens, Robert; Brass, Andy
2014-01-01
There is a growing concern both inside and outside the scientific community over the lack of reproducibility of experiments. The depth and detail of reported methods are critical to the reproducibility of findings, but also for making it possible to compare and integrate data from different studies. In this study, we evaluated in detail the methods reporting in a comprehensive set of trypanosomiasis experiments that should enable valid reproduction, integration and comparison of research findings. We evaluated a subset of other parasitic (Leishmania, Toxoplasma, Plasmodium, Trichuris and Schistosoma) and non-parasitic (Mycobacterium) experimental infections in order to compare the quality of method reporting more generally. A systematic review using PubMed (2000–2012) of all publications describing gene expression in cells and animals infected with Trypanosoma spp was undertaken based on PRISMA guidelines; 23 papers were identified and included. We defined a checklist of essential parameters that should be reported and have scored the number of those parameters that are reported for each publication. Bibliometric parameters (impact factor, citations and h-index) were used to look for association between Journal and Author status and the quality of method reporting. Trichuriasis experiments achieved the highest scores and included the only paper to score 100% in all criteria. The mean of scores achieved by Trypanosoma articles through the checklist was 65.5% (range 32–90%). Bibliometric parameters were not correlated with the quality of method reporting (Spearman's rank correlation coefficient <−0.5; p>0.05). Our results indicate that the quality of methods reporting in experimental parasitology is a cause for concern and it has not improved over time, despite there being evidence that most of the assessed parameters do influence the results. We propose that our set of parameters be used as guidelines to improve the quality of the reporting of experimental infection models as a pre-requisite for integrating and comparing sets of data. PMID:25076044
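The scoring and correlation analysis described above is straightforward to reproduce in outline. A minimal sketch, assuming a hypothetical binary paper-by-parameter checklist matrix and illustrative impact factors (none of this is the authors' data):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical data: 23 papers scored against a 20-item checklist
# (1 = parameter reported, 0 = not reported).
checklist = rng.integers(0, 2, size=(23, 20))
impact_factor = rng.uniform(1.0, 15.0, size=23)  # illustrative bibliometric

# Per-paper score as the percentage of reported parameters.
scores = 100.0 * checklist.mean(axis=1)
print(f"mean score: {scores.mean():.1f}% (range {scores.min():.0f}-{scores.max():.0f}%)")

# Rank correlation between journal status and reporting quality.
rho, p = spearmanr(impact_factor, scores)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```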
Feedstock powder processing research needs for additive manufacturing development
Anderson, Iver E.; White, Emma M. H.; Dehoff, Ryan
2018-02-01
Additive manufacturing (AM) promises to redesign traditional manufacturing by enabling the ultimate in agility for rapid component design changes in commercial products and for fabricating complex integrated parts. Here, by significantly increasing quality and yield of metallic alloy powders, the pace for design, development, and deployment of the most promising AM approaches can be greatly accelerated, resulting in rapid commercialization of these advanced manufacturing methods. By successful completion of a critical suite of processing research tasks that are intended to greatly enhance gas atomized powder quality and the precision and efficiency of powder production, researchers can help promote continued rapid growth of AM. Finally, other powder-based or spray-based advanced manufacturing methods could also benefit from these research outcomes, promoting the next wave of sustainable manufacturing technologies for conventional and advanced materials.
Copying of holograms by spot scanning approach.
Okui, Makoto; Wakunami, Koki; Oi, Ryutaro; Ichihashi, Yasuyuki; Jackin, Boaz Jessie; Yamamoto, Kenji
2018-05-20
To replicate holograms, contact copying has conventionally been used. In this approach, a photosensitive material is fixed together with a master hologram and illuminated with a coherent beam. This method is simple and enables high-quality copies; however, it requires a large optical setup for large-area holograms. In this paper, we present a new method of replicating holograms that uses a relatively compact optical system even for the replication of large holograms. A small laser spot that irradiates only part of the hologram is used to reproduce the hologram by scanning the spot over the whole area of the hologram. We report on the results of experiments carried out to confirm the copy quality, along with a guide to design scanning conditions. The results show the potential effectiveness of the large-area hologram replication technology using a relatively compact apparatus.
New methods to monitor emerging chemicals in the drinking water production chain.
van Wezel, Annemarie; Mons, Margreet; van Delft, Wouter
2010-01-01
New techniques enable a shift in the monitoring of chemicals that affect water quality: from monitoring mainly the end product, tap water, towards monitoring the whole process along the production chain. This is congruent with the 'HACCP' system (hazard analysis of critical control points), which is fairly well integrated into food production but less so in drinking water production. This shift brings more information about source quality, the efficiency of treatment and distribution, and understanding of processes within the production chain, and can therefore lead to more pro-active management of drinking water production. At present, monitoring is focused neither on emerging chemicals nor on detection of compounds with chronic toxicity. We discuss techniques to be used, detection limits compared to quality criteria, data interpretation and possible interventions in production.
Evaluation of Knowledge Development in a Healthcare Setting
NASA Astrophysics Data System (ADS)
Schaffer, Scott P.
Healthcare organizations worldwide have recently increased efforts to improve performance, quality, and knowledge transfer using information and communication technologies. Evaluation of the effectiveness and quality of such efforts is challenging. A macro- and micro-level system evaluation conducted with a 14,000-member US hospital administrative services organization examined the appropriateness of a blended face-to-face and technology-enabled performance improvement and knowledge development system. Furthermore, a successful team, or microsystem, in a high-performing hospital was studied in depth. Several data collection methods, including interviews, observation, and questionnaires, were used to address evaluation questions within a knowledge development framework created for the study. Results of this preliminary study focus on how this organization attempted to organize clinical improvement efforts around quality and performance improvement processes supported by networked technologies.
Modeling of the laser beam shape for high-power applications
NASA Astrophysics Data System (ADS)
Jabczyński, Jan K.; Kaskow, Mateusz; Gorajek, Lukasz; Kopczyński, Krzysztof; Zendzian, Waldemar
2018-04-01
Aperture losses and thermo-optic effects (TOE) inside optics, as well as the effective beam width in the far field, should be taken into account in the analysis of the most appropriate laser beam profile for high-power applications. We have theoretically analyzed this problem for a group of super-Gaussian beams, considering first only diffraction limitations. Furthermore, we have investigated the TOE on the far-field parameters of such beams to determine the influence of absorption in optical elements on beam quality degradation. The best compromise is offered by the super-Gaussian profile of index p = 5, for which beam quality does not decrease noticeably and the higher-order thermo-optic aberrations are compensated. Simplified formulas were derived for beam quality metrics (the M² parameter and the Strehl ratio), which enable estimation of the influence of heat deposited in optics on the degradation of beam quality. A method of dynamic compensation of this effect is proposed.
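To make the trade-off concrete, the sketch below generates one-dimensional super-Gaussian amplitude profiles under one common convention (p = 1 recovers a Gaussian), clips them at a hard aperture, and compares far-field second-moment widths via an FFT. The aperture size, grid, and convention are assumptions for illustration, not parameters from the paper:

```python
import numpy as np

def far_field_width(p, w=1.0, aperture=1.5, n=4096, span=16.0):
    """Second-moment width of the far field of an aperture-clipped super-Gaussian."""
    x = np.linspace(-span / 2, span / 2, n)
    field = np.exp(-((x / w) ** 2) ** p)     # super-Gaussian amplitude profile
    field[np.abs(x) > aperture] = 0.0        # hard aperture (truncation losses)
    ff = np.fft.fftshift(np.fft.fft(field))  # far field ~ Fourier transform
    inten = np.abs(ff) ** 2
    fx = np.fft.fftshift(np.fft.fftfreq(n, d=x[1] - x[0]))
    mean = np.sum(fx * inten) / np.sum(inten)
    return np.sqrt(np.sum((fx - mean) ** 2 * inten) / np.sum(inten))

for p in (1, 2, 5, 10):
    print(f"p = {p:2d}: far-field second-moment width = {far_field_width(p):.4f}")
```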
Quality of life after cancer: how the extent of impairment is influenced by patient characteristics.
Peters, Elisabeth; Mendoza Schulz, Laura; Reuss-Borst, Monika
2016-10-10
Although the impairment of quality of life after cancer is well known, tailored treatment methods have not yet been broadly adopted. The aim of this study was to identify those patient characteristics that most influence the impairment of quality of life and thus to identify those patients who need and can benefit most from specific intervention treatment. 1879 cancer patients were given the EORTC QLQ-C30 questionnaire at the beginning and end of their inpatient rehabilitation. Patients' scores were compared to those of 2081 healthy adults (Schwarz and Hinz, Eur J Cancer 37:1345-1351, 2001). Furthermore, differences in quality of life corresponding to sex, age, tumor site, TNM stage, interval between diagnosis and rehabilitation, and therapy method were examined. Compared to the healthy population, the study group showed a decreased quality of life in all analyzed domains. This difference diminished with increasing age. Women reported a lower quality of life than men in general. Patients with prostate cancer showed the least impairment in several domains. Patients having undergone chemotherapy as well as radiotherapy were impaired the most. Surprisingly, TNM stage and interval between diagnosis and rehabilitation did not significantly influence quality of life. Global quality of life and all functional domains significantly improved after a 3-week rehabilitation program. Despite an individualized and increasingly better tolerated therapy, the quality of life of cancer patients is still considerably impaired. However, systematic screening of psychosocial aspects of cancer, e.g. quality of life, could enable improved intervention.
Carvalho, J J; Jerónimo, P C A; Gonçalves, C; Alpendurada, M F
2008-11-01
European Council Directive 98/83/EC on the quality of water intended for human consumption brought a new challenge for water-quality control routine laboratories, mainly on pesticides analysis. Under the guidelines of ISO/IEC 17025:2005, a multiresidue method was developed, validated, implemented in routine, and studied with real samples during a one-year period. The proposed method enables routine laboratories to handle a large number of samples, since 28 pesticides of 14 different chemical groups can be quantitated in a single procedure. The method comprises a solid-phase extraction step and subsequent analysis by liquid chromatography-mass spectrometry (LC-MS-MS). The accuracy was established on the basis of participation in interlaboratory proficiency tests, with encouraging results (majority |z-score| < 2), and the precision was consistently analysed over one year. The limits of quantitation (below 0.050 µg L⁻¹) are in agreement with the enforced threshold value for pesticides of 0.10 µg L⁻¹. Overall method performance is suitable for routine use according to accreditation rules, taking into account the data collected over one year.
McNab, Duncan; McKay, John; Bowie, Paul
2015-11-01
Small-scale quality improvement projects are expected to make a significant contribution towards improving the quality of healthcare. Enabling doctors-in-training to design and lead quality improvement projects is important preparation for independent practice. Participation is mandatory in speciality training curricula. However, provision of training and ongoing support in quality improvement methods and practice is variable. We aimed to design and deliver a quality improvement training package to core medical and general practice specialty trainees and evaluate impact in terms of project participation, completion and publication in a healthcare journal. A quality improvement training package was developed and delivered to core medical trainees and general practice specialty trainees in the west of Scotland encompassing a 1-day workshop and mentoring during completion of a quality improvement project over 3 months. A mixed methods evaluation was undertaken and data collected via questionnaire surveys, knowledge assessment, and formative assessment of project proposals, completed quality improvement projects and publication success. Twenty-three participants attended the training day with 20 submitting a project proposal (87%). Ten completed quality improvement projects (43%), eight were judged as satisfactory (35%), and four were submitted and accepted for journal publication (17%). Knowledge and confidence in aspects of quality improvement improved during the pilot, while early feedback on project proposals was valued (85.7%). This small study reports modest success in training core medical trainees and general practice specialty trainees in quality improvement. Many gained knowledge of, confidence in and experience of quality improvement, while journal publication was shown to be possible. The development of educational resources to aid quality improvement project completion and mentoring support is necessary if expectations for quality improvement are to be realised. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Carles, Guillem; Muyo, Gonzalo; van Hemert, Jano; Harvey, Andrew R.
2017-11-01
We demonstrate a multimode detection system in a scanning laser ophthalmoscope (SLO) that enables simultaneous operation in confocal, indirect, and direct modes to permit an agile trade between image contrast and optical sensitivity across the retinal field of view to optimize the overall imaging performance, enabling increased contrast in very wide-field operation. We demonstrate the method on a wide-field SLO employing a hybrid pinhole at its image plane, yielding a twofold increase in vasculature contrast in the central retina compared to its conventional direct mode while retaining high-quality imaging across a wide field of the retina, of up to 200°, with 20 µm on-axis resolution.
Surface smoothening of the inherent roughness of micro-lenses fabricated with 2-photon lithography
NASA Astrophysics Data System (ADS)
Schift, Helmut; Kirchner, Robert; Chidambaram, Nachiappan; Altana, Mirco
2018-01-01
Two-photon polymerization by direct laser writing enables the writing of refractive micro-optical elements with sub-µm precision. The trajectories and layering during the direct writing process often result in roughness in the range of the writing increment, which has adverse effects for optical applications. Instead of increasing the overlap between adjacent voxels, roughness in the range of 100 nm can be smoothed out by post-processing. For this, a method known as TASTE was developed, which allows polishing of surfaces without changing the structural details or the overall shape. It works particularly well with thermoplastic polymers and enables sub-10 nm roughness. The optical quality was confirmed for an array of several hundred microlenses.
Pearce, Christopher; Shearer, Marianne; Gardner, Karina; Kelly, Jill; Xu, Tony Baixian
2012-01-01
This paper describes how the Melbourne East General Practice Network supports general practice to enable quality of care, the challenges and enablers of change, and the evidence of practice capacity building and improved quality of care. Primary care is well known as a place where quality, relatively inexpensive medical care occurs. General practice is made up of multiple small sites with fragmented systems and a funding system that challenges a whole-of-practice approach to clinical care. General Practice Networks support GPs to synthesise complexity and crystallise solutions that enhance general practice beyond current capacity. Through a culture of change management, GP Networks create the link between the practice and the big picture of the whole health system and reduce the isolation of general practice. They distribute information (evidence-based learning and resources) and provide individualised support, responding to practice need and capacity.
Kania-Richmond, Ania; Weeks, Laura; Scholten, Jeffrey; Reney, Mikaël
2016-01-01
Background: Practice based research networks (PBRNs) are increasingly used as a tool for evidence based practice. We developed and tested the feasibility of using software to enable online collection of patient data within a chiropractic PBRN to support clinical decision making and research in participating clinics. Purpose: To assess the feasibility of using online software to collect quality patient information. Methods: The study consisted of two phases: 1) Assessment of the quality of information provided, using a standardized form; and 2) Exploration of patients’ perspectives and experiences regarding online information provision through semi-structured interviews. Data analysis was descriptive. Results: Forty-five new patients were recruited. Thirty-six completed online forms, which were submitted by an appropriate person 100% of the time, with an error rate of less than 1%, and submitted in a timely manner 83% of the time. Twenty-one participants were interviewed. Overall, online forms were preferred given perceived security, ease of use, and enabling provision of more accurate information. Conclusions: Use of online software is feasible, provides high quality information, and is preferred by most participants. A pen-and-paper format should be available for patients with this preference and in case of technical difficulties. PMID:27069272
Is infant feeding pattern associated with father's quality of life?
Chen, Yi Chun; Chie, Wei-Chu; Chang, Pei-Jen; Chuang, Chao-Hua; Lin, Yu-Hsuan; Lin, Shio-Jean; Chen, Pau-Chung
2010-12-01
The aim of this study was to compare the health-related quality of life of fathers under different infant feeding type scenarios. The Medical Outcomes Study 36-item Short-Form was used to measure the health-related quality of life of 1,699 fathers, and the scores were used to look for associations with different infant feeding methods. Multivariable linear regression analysis was used to explore the contribution of other potentially related factors to fathers' quality of life. After controlling for confounding factors, fathers whose infants were ever breast-fed reported lower scores than fathers whose infants were bottle-fed. Apart from the infant feeding pattern, having a job, higher family income, and being the major caregiver were positively related to the father's quality of life. Fathers may not benefit during the breast-feeding process. Because fathers' involvement plays an important role in the success of breast-feeding, the development of interventions that enable fathers to support their breast-feeding partner is very important.
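A multivariable linear regression of this general shape can be set up with statsmodels. The sketch below runs on simulated data with invented variable names, so the coefficients bear no relation to the study's findings:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1699
df = pd.DataFrame({
    "sf36_score": rng.normal(70, 15, n),  # outcome: a quality-of-life score
    "breastfed": rng.integers(0, 2, n),   # 1 = infant ever breast-fed
    "has_job": rng.integers(0, 2, n),
    "family_income": rng.normal(50, 20, n),
    "major_caregiver": rng.integers(0, 2, n),
})

# Adjusted association between feeding pattern and fathers' scores.
model = smf.ols(
    "sf36_score ~ breastfed + has_job + family_income + major_caregiver",
    data=df,
).fit()
print(model.summary().tables[1])
```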
Volovitz, Ilan; Shapira, Netanel; Ezer, Haim; Gafni, Aviv; Lustgarten, Merav; Alter, Tal; Ben-Horin, Idan; Barzilai, Ori; Shahar, Tal; Kanner, Andrew; Fried, Itzhak; Veshchev, Igor; Grossman, Rachel; Ram, Zvi
2016-06-01
Conducting research on the molecular biology, immunology, and physiology of brain tumors (BTs) and primary brain tissues requires the use of viably dissociated single cells. Inadequate methods for tissue dissociation generate considerable loss in the quantity of single cells produced and in the produced cells' viability. Improper dissociation may also diminish the quality of data attained in functional and molecular assays due to the presence of large quantities of cellular debris containing immune-activatory danger-associated molecular patterns, and due to increased quantities of degraded proteins and RNA. Over 40 resected BTs and non-tumorous brain tissue samples were dissociated into single cells by mechanical dissociation or by mechanical and enzymatic dissociation. The quality of dissociation was compared for all frequently used dissociation enzymes (collagenase, DNase, hyaluronidase, papain, dispase) and for neutral protease (NP) from Clostridium histolyticum. Single-cell-dissociated cell mixtures were evaluated for cellular viability and for cell-mixture dissociation quality. Dissociation quality was graded by the quantity of subcellular debris, non-dissociated cell clumps, and DNA released from dead cells. Of all enzymes or enzyme combinations examined, NP (an enzyme previously not evaluated on brain tissues) produced dissociated cell mixtures with the highest mean cellular viability: 93 % in gliomas, 85 % in brain metastases, and 89 % in non-tumorous brain tissue. NP also produced cell mixtures with significantly less cellular debris than the other enzymes tested. Dissociation using NP was non-aggressive over time: no changes in cell viability or dissociation quality were found when comparing 2-h dissociation at 37 °C to overnight dissociation at ambient temperature. The use of NP allows for the most effective dissociation of viable single cells from human BTs or brain tissue. Its non-aggressive dissociative capacity may enable ambient-temperature shipping of tumor pieces in multi-center clinical trials while they are being dissociated. As clinical-grade NP is commercially available, it can be easily integrated into cell-therapy clinical trials in neuro-oncology. The high-quality viable cells produced may enable investigators to conduct more consistent research by avoiding the experimental artifacts associated with the presence of dead cells or cellular debris.
Assessment of Factors Influencing Communication in Clinical Pharmacy.
Yao, Dongning; Jiang, Liang; Huang, Yuankai; Chen, Lei; Wang, Yitao; Xi, Xiaoyu
2018-01-01
This study aimed to identify and assess the factors that influence communication quality between clinical pharmacists and patients, using a structural equation model based on the predisposing, reinforcing, and enabling constructs in educational/environmental diagnosis and evaluation (PRECEDE) and the policy, regulatory, and organizational constructs in educational and ecological development (PROCEED) model, to identify the most effective path to increasing communication quality. A survey was conducted at 253 Class-A tertiary hospitals in China from March to December 2016. During on-site observations, verbal communications between clinical pharmacists (n = 752) and patients were audio recorded, and communication quality was rated by an expert panel on an 8-item Quality of Communication Rating Scale. Clinical pharmacists completed questionnaires that examined the predisposing, enabling, and reinforcing factors that influenced communication quality. Finally, AMOS was employed to examine the relationships between the three factors and communication quality. The results indicated that all three factors positively affected communication quality, with correlation coefficients of .26, .13, and .17, respectively. The most influential predisposing factor was attitude (.77), the most influential enabling factors were self-efficacy (.71) and confidence (.72), and the most influential reinforcing factor was rewards (.74). The findings suggest that pharmacists' attitudes toward, perceived knowledge of, and skill and confidence in communication, and the rewards offered by pharmacy management, are the factors that most influence communication quality.
Improved GMP-compliant multi-dose production and quality control of 6-[18F]fluoro-L-DOPA.
Luurtsema, G; Boersma, H H; Schepers, M; de Vries, A M T; Maas, B; Zijlma, R; de Vries, E F J; Elsinga, P H
2017-01-01
6-[¹⁸F]Fluoro-L-3,4-dihydroxyphenylalanine (FDOPA) is a frequently used radiopharmaceutical for detecting neuroendocrine and brain tumors and for the differential diagnosis of Parkinson's disease. To meet the demand for FDOPA, a high-yield GMP-compliant production method is required. Therefore, this study aimed to improve the FDOPA production and quality control procedures to enable distribution of the radiopharmaceutical over distances. FDOPA was prepared by electrophilic fluorination of the trimethylstannyl precursor with [¹⁸F]F₂ produced from [¹⁸O]O₂ via the double-shoot approach, leading to FDOPA with higher specific activity than FDOPA synthesized using [¹⁸F]F₂ produced from ²⁰Ne, which gives a lower specific activity. The quality control of the product was performed using a validated UPLC system and compared with quality control on a conventional HPLC system. Impurities were identified using UPLC-MS. The [¹⁸O]O₂ double-shoot radionuclide production method yielded significantly more [¹⁸F]F₂ with less carrier F₂ than the conventional method starting from ²⁰Ne. After adjustment of the radiolabeling parameters, substantially higher amounts of FDOPA with higher specific activity could be obtained. Quality control by UPLC was much faster and detected more side-products than HPLC. UPLC-MS showed that the most important side-product was FDOPA-quinone, rather than 6-hydroxydopa as suggested by the European Pharmacopoeia. The production and quality control of FDOPA were significantly improved by introducing the [¹⁸O]O₂ double-shoot radionuclide production method and product analysis by UPLC, respectively. As a result, FDOPA is now routinely available for clinical practice and for distribution over distances.
Coriat, R; Pommaret, E; Chryssostalis, A; Viennot, S; Gaudric, M; Brezault, C; Lamarque, D; Roche, H; Verdier, D; Parlier, D; Prat, F; Chaussade, S
2009-02-01
To produce valid information, an evaluation of professional practices has to assess the quality of all practices before, during and after the procedure under study. Several auditing techniques have been proposed for colonoscopy. The purpose of this work is to describe a straightforward original validated method for the prospective evaluation of professional practices in the field of colonoscopy applicable in all endoscopy units without increasing the staff work load. Pertinent quality-control criteria (14 items) were identified by the endoscopists at the Cochin Hospital and were compatible with: findings in the available literature; guidelines proposed by the Superior Health Authority; and application in any endoscopy unit. Prospective routine data were collected and the methodology validated by evaluating 50 colonoscopies every quarter for one year. The relevance of the criteria was assessed using data collected during four separate periods. The standard checklist was complete for 57% of the colonoscopy procedures. The colonoscopy procedure was appropriate according to national guidelines in 94% of cases. These observations were particularly noteworthy: the quality of the colonic preparation was insufficient for 9% of the procedures; complete colonoscopy was achieved for 93% of patients; and 0.38 adenomas and 0.045 carcinomas were identified per colonoscopy. This simple and reproducible method can be used for valid quality-control audits in all endoscopy units. In France, unit-wide application of this method enables endoscopists to validate 100 of the 250 points required for continuous medical training. This is a quality-control tool that can be applied annually, using a random month to evaluate any changes in routine practices.
Sustaining Reliability on Accountability Measures at The Johns Hopkins Hospital.
Pronovost, Peter J; Holzmueller, Christine G; Callender, Tiffany; Demski, Renee; Winner, Laura; Day, Richard; Austin, J Matthew; Berenholtz, Sean M; Miller, Marlene R
2016-02-01
In 2012 Johns Hopkins Medicine leaders challenged their health system to reliably deliver best practice care linked to nationally vetted core measures and achieve The Joint Commission Top Performer on Key Quality Measures® program recognition and the Delmarva Foundation award. Thus, the Armstrong Institute for Patient Safety and Quality implemented an initiative to ensure that ≥96% of patients received care linked to measures. Nine low-performing process measures were targeted for improvement: eight Joint Commission accountability measures and one Delmarva Foundation core measure. In the initial evaluation at The Johns Hopkins Hospital, all accountability measures for the Top Performer program reached the required ≥95% performance, gaining them recognition by The Joint Commission in 2013. Efforts were made to sustain performance of accountability measures at The Johns Hopkins Hospital. Improvements were sustained through 2014 using the following conceptual framework: declare and communicate goals, create an enabling infrastructure, engage clinicians and connect them in peer learning communities, report transparently, and create accountability systems. One part of the accountability system was for teams to create a sustainability plan, which they presented to senior leaders. To support sustained improvements, Armstrong Institute leaders added a project management office for all externally reported quality measures and concurrent reviewers to audit performance on care processes for certain measure sets. The Johns Hopkins Hospital sustained performance on all accountability measures, and now more than 96% of patients receive recommended care consistent with nationally vetted quality measures. The initiative methods enabled the transition of quality improvement from an isolated project to a way of leading an organization.
Improving the care of older persons in Australian prisons using the Policy Delphi method.
Patterson, Karen; Newman, Claire; Doona, Katherine
2016-09-01
There are currently no internationally recognised and approved processes relating to the care of older persons with dementia in prison. This research aimed to develop tools and procedures related to managing the care of, including the identification and assessment of, older persons with dementia who are imprisoned in New South Wales, Australia. A modified approach to the Policy Delphi method, using both surveys and facilitated discussion groups, enabled experts to come together to discuss improving the quality of care provision for older persons with dementia in prison and achieve research aims. © The Author(s) 2014.
Computer numeric control generation of toric surfaces
NASA Astrophysics Data System (ADS)
Bradley, Norman D.; Ball, Gary A.; Keller, John R.
1994-05-01
Until recently, the manufacture of toric ophthalmic lenses relied largely upon expensive, manual techniques for generation and polishing. Recent gains in computer numeric control (CNC) technology and tooling enable lens designers to employ single- point diamond, fly-cutting methods in the production of torics. Fly-cutting methods continue to improve, significantly expanding lens design possibilities while lowering production costs. Advantages of CNC fly cutting include precise control of surface geometry, rapid production with high throughput, and high-quality lens surface finishes requiring minimal polishing. As accessibility and affordability increase within the ophthalmic market, torics promise to dramatically expand lens design choices available to consumers.
Sports Injury Surveillance Systems: A Review of Methods and Data Quality.
Ekegren, Christina L; Gabbe, Belinda J; Finch, Caroline F
2016-01-01
Data from sports injury surveillance systems are a prerequisite to the development and evaluation of injury prevention strategies. This review aimed to identify ongoing sports injury surveillance systems and determine whether there are gaps in our understanding of injuries in certain sport settings. A secondary aim was to determine which of the included surveillance systems have evaluated the quality of their data, a key factor in determining their usefulness. A systematic search was carried out to identify (1) publications presenting methodological details of sports injury surveillance systems within clubs and organisations; and (2) publications describing quality evaluations and the quality of data from these systems. Data extracted included methodological details of the surveillance systems, methods used to evaluate data quality, and results of these evaluations. Following literature search and review, a total of 15 sports injury surveillance systems were identified. Data relevant to each aim were summarised descriptively. Most systems were found to exist within professional and elite sports. Publications concerning data quality were identified for seven (47%) systems. Validation of system data through comparison with alternate sources has been undertaken for only four systems (27%). This review identified a shortage of ongoing injury surveillance data from amateur and community sport settings and limited information about the quality of data in professional and elite settings. More surveillance systems are needed across a range of sport settings, as are standards for data quality reporting. These efforts will enable better monitoring of sports injury trends and the development of sports safety strategies.
Hu, Wenfa; He, Xinhua
2014-01-01
The time, quality, and cost are three important but conflicting objectives in a building construction project. It is a tough challenge for project managers to optimize all three simultaneously, since improving one objective typically comes at the expense of the others. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives. The model derives from the project breakdown structure method, in which the task resources of a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is generated based on the correlations between construction activities. A genetic algorithm is applied in the model to solve the comprehensive nonlinear time-cost-quality problem. The building of a three-storey house is used as an example to illustrate the implementation of the model, demonstrate its advantages in optimizing the trade-off between construction time, cost, and quality, and help make a winning decision in construction practice. The computational time-cost-quality curves, presented as visual graphics in the case study, confirm that traditional cost-time assumptions are reasonable and demonstrate the sophistication of the time-cost-quality trade-off model.
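The genetic-algorithm step can be pictured compactly: each activity offers a few discrete resource options, each option implies a (time, cost, quality) triple, and a chromosome selects one option per activity. Everything below (option values, weights, GA settings) is invented for illustration and is not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(7)
N_ACT, N_OPT = 8, 4

# Hypothetical (time, cost, quality) per activity option: faster options
# cost more; quality varies with the resources assigned.
time = rng.uniform(2, 10, (N_ACT, N_OPT))
cost = 100.0 / time + rng.uniform(0, 2, (N_ACT, N_OPT))
qual = rng.uniform(0.7, 1.0, (N_ACT, N_OPT))

def fitness(chrom, w=(0.4, 0.3, 0.3)):
    idx = (np.arange(N_ACT), chrom)
    t, c, q = time[idx].sum(), cost[idx].sum(), qual[idx].mean()
    # Weighted sum: minimize normalized time and cost, maximize quality.
    return -(w[0] * t / time.sum() + w[1] * c / cost.sum()) + w[2] * q

pop = rng.integers(0, N_OPT, (60, N_ACT))
for _ in range(200):
    fit = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(fit)[-30:]]                  # truncation selection
    cuts = rng.integers(1, N_ACT, 30)
    kids = np.array([np.r_[parents[i][:k], parents[(i + 1) % 30][k:]]
                     for i, k in enumerate(cuts)])        # one-point crossover
    mutate = rng.random(kids.shape) < 0.05
    kids[mutate] = rng.integers(0, N_OPT, mutate.sum())   # random mutation
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(c) for c in pop])]
print("best option per activity:", best)
```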
Metabolite Profiling and Classification of DNA-Authenticated Licorice Botanicals
Simmler, Charlotte; Anderson, Jeffrey R.; Gauthier, Laura; Lankin, David C.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2015-01-01
Raw licorice roots represent heterogeneous materials obtained from mainly three Glycyrrhiza species. G. glabra, G. uralensis, and G. inflata exhibit marked metabolite differences in terms of flavanones (Fs), chalcones (Cs), and other phenolic constituents. The principal objective of this work was to develop complementary chemometric models for the metabolite profiling, classification, and quality control of authenticated licorice. A total of 51 commercial and macroscopically verified samples were DNA authenticated. Principal component analysis and canonical discriminant analysis were performed on ¹H NMR spectra and area under the curve values obtained from UHPLC-UV chromatograms, respectively. The developed chemometric models enable the identification and classification of Glycyrrhiza species according to their composition in major Fs, Cs, and species-specific phenolic compounds. Further key outcomes demonstrated that DNA authentication combined with chemometric analyses enabled the characterization of mixtures, hybrids, and species outliers. This study provides a new foundation for the botanical and chemical authentication, classification, and metabolomic characterization of crude licorice botanicals and derived materials. Collectively, the proposed methods offer a comprehensive approach for the quality control of licorice as one of the most widely used botanical dietary supplements. PMID:26244884
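A principal component analysis of this kind is easy to reproduce in outline with scikit-learn. The sketch below treats each ¹H NMR spectrum as a row vector; the spectra and class labels are simulated stand-ins, not the study's data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Simulated stand-in for 51 binned NMR spectra (rows) x 500 bins (columns),
# with a species-dependent offset in an invented marker region.
species = rng.integers(0, 3, 51)               # invented labels for three species
spectra = rng.normal(0, 1, (51, 500))
spectra[:, 100:120] += species[:, None] * 2.0  # species-specific signal

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))
for s in range(3):
    centroid = scores[species == s].mean(axis=0)
    print(f"species {s}: PC1/PC2 centroid = {centroid.round(2)}")
```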
Childs, Kevin L; Konganti, Kranti; Buell, C Robin
2012-01-01
Major feedstock sources for future biofuel production are likely to be high biomass producing plant species such as poplar, pine, switchgrass, sorghum and maize. One active area of research in these species is genome-enabled improvement of lignocellulosic biofuel feedstock quality and yield. To facilitate genomic-based investigations in these species, we developed the Biofuel Feedstock Genomic Resource (BFGR), a database and web-portal that provides high-quality, uniform and integrated functional annotation of gene and transcript assembly sequences from species of interest to lignocellulosic biofuel feedstock researchers. The BFGR includes sequence data from 54 species and permits researchers to view, analyze and obtain annotation at the gene, transcript, protein and genome level. Annotation of biochemical pathways permits the identification of key genes and transcripts central to the improvement of lignocellulosic properties in these species. The integrated nature of the BFGR in terms of annotation methods, orthologous/paralogous relationships and linkage to seven species with complete genome sequences allows comparative analyses for biofuel feedstock species with limited sequence resources. Database URL: http://bfgr.plantbiology.msu.edu.
Development of a shock wave adhesion test for composite bonds by pulsed laser and mechanical impacts
NASA Astrophysics Data System (ADS)
Ecault, R.; Boustie, M.; Touchard, F.; Arrigoni, M.; Berthe, L.
2014-05-01
Evaluating the bonding quality of composite materials is becoming one of the main challenges faced by the aeronautics industry. This work aims at the development of a shock-wave technique that would enable the mechanical quality of a bond to be quantified. Laser shock experiments were carried out. This technique enables high tensile stress generation through the thickness of composite bonds. The resulting damage has been quantified using different methods such as confocal microscopy, ultrasound and cross-section observation. The discrimination between a correct bond and a weak bond was possible thanks to these experiments. Nevertheless, laser sources are not well adapted for optimization of such a test because their settings are often fixed. That is why mechanical impacts on bonded composites were also performed in this work. By changing the thickness of the aluminum projectiles, the tensile stresses generated by the shock wave propagation were moved toward the composite/bond interface. The observations made prove that optimization of the technique is possible. The key parameters for the development of a bonding test using shock waves have been identified.
Development of a shock wave adhesion test for composite bonds by laser pulsed and mechanical impacts
NASA Astrophysics Data System (ADS)
Ecault, Romain; Boustie, Michel; Touchard, Fabienne; Arrigoni, Michel; Berthe, Laurent; CNRS Collaboration
2013-06-01
Evaluating the bonding quality of composite materials is becoming one of the main challenges faced by the aeronautics industry. This work aims at the development of a technique using shock waves, which would enable the bonding mechanical quality to be quantified. Laser shock experiments were carried out. This technique enables high tensile stress generation in the thickness of a composite bond without any mechanical contact. The resulting damage has been quantified using different methods such as confocal microscopy, ultrasound and cross-section observation. The discrimination between a correct bond and a weak bond was possible thanks to these experiments. Nevertheless, laser sources are not well adapted for optimization of such a test since their parameters are often fixed. That is why mechanical impacts on bonded composites were also performed in this work. By changing the thickness of aluminum projectiles, the tensile stresses generated by the shock wave propagation were moved toward the composite/bond interface. The observations made prove that the optimization of the technique is possible. The key parameters for the development of a bonding test using shock waves have been identified.
Fast online deconvolution of calcium imaging data
Zhou, Pengcheng; Paninski, Liam
2017-01-01
Fluorescent calcium indicators are a popular means for observing the spiking activity of large neuronal populations, but extracting the activity of each neuron from raw fluorescence calcium imaging data is a nontrivial problem. We present a fast online active set method to solve this sparse non-negative deconvolution problem. Importantly, the algorithm progresses through each time series sequentially from beginning to end, thus enabling real-time online estimation of neural activity during the imaging session. Our algorithm is a generalization of the pool adjacent violators algorithm (PAVA) for isotonic regression and inherits its linear-time computational complexity. We gain remarkable increases in processing speed: more than one order of magnitude compared to currently employed state-of-the-art convex solvers relying on interior point methods. Unlike these approaches, our method can exploit warm starts; therefore optimizing model hyperparameters only requires a handful of passes through the data. A minor modification can further improve the quality of activity inference by imposing a constraint on the minimum spike size. The algorithm enables real-time simultaneous deconvolution of O(10⁵) traces of whole-brain larval zebrafish imaging data on a laptop. PMID:28291787
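The primitive the authors generalize, the pool adjacent violators algorithm for isotonic regression, fits in a few lines and already shows the sequential, linear-time character of the approach. This is plain PAVA for a non-decreasing least-squares fit, not the authors' sparse non-negative deconvolution itself:

```python
import numpy as np

def pava(y):
    """Pool adjacent violators: linear-time isotonic (non-decreasing)
    least-squares fit. Merge any block whose mean exceeds the mean of
    the following block, then expand the pooled blocks back out."""
    blocks = []  # each block: [sum, count]
    for v in y:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and \
                blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            s, n = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += n
    fit = []
    for s, n in blocks:
        fit.extend([s / n] * n)
    return np.array(fit)

y = np.array([1.0, 3.0, 2.0, 2.5, 5.0, 4.0])
print(pava(y))  # [1.  2.5 2.5 2.5 4.5 4.5]
```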
An approach for accurate simulation of liquid mixing in a T-shaped micromixer.
Matsunaga, Takuya; Lee, Ho-Joon; Nishino, Koichi
2013-04-21
In this paper, we propose a new computational method for efficient evaluation of the fluid mixing behaviour in a T-shaped micromixer with a rectangular cross section at high Schmidt number under steady-state conditions. Our approach enables a low-cost, high-quality simulation based on tracking of fluid particles for convective fluid mixing and posterior solving of a model of the species equation for molecular diffusion. The examined parameter range is Re = 1.33 × 10⁻² to 240 at Sc = 3600. The proposed method is shown to simulate well the mixing quality even in the engulfment regime, where ordinary grid-based simulation is not able to obtain accurate solutions with affordable mesh sizes due to numerical diffusion at high Sc. The obtained results agree well with a backward random-walk Monte Carlo simulation, by which the accuracy of the proposed method is verified. For further investigation of the characteristics of the proposed method, the Sc dependency is examined in a wide range of Sc from 10 to 3600 at Re = 200. The study reveals that the model discrepancy error emerges more significantly in the concentration distribution at lower Sc, while the resulting mixing quality is accurate over the entire range.
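The verification step rests on the standard Brownian increment Δx = √(2DΔt)·ξ used in random-walk Monte Carlo methods. The forward-walk sketch below illustrates the diffusion half of such a simulation together with a simple intensity-of-segregation mixing measure; the geometry, the omission of advection, and all parameter values are simplifications, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(5)

D, dt, steps = 1e-9, 1e-4, 200  # diffusivity (m^2/s), time step (s), step count
n = 20000
# Two species initially segregated across a 100-micron channel.
x = rng.uniform(0, 100e-6, n)
species = (x > 50e-6).astype(float)

for _ in range(steps):
    x += np.sqrt(2 * D * dt) * rng.normal(size=n)  # Brownian increment
    x = np.clip(x, 0, 100e-6)                      # crude wall treatment

# Mixing quality: 1 - sigma/sigma_max over binned concentrations.
edges = np.linspace(0, 100e-6, 21)
conc = np.array([species[(x >= a) & (x < b)].mean()
                 for a, b in zip(edges[:-1], edges[1:])])
sigma, sigma_max = conc.std(), 0.5
print(f"mixing quality ~ {1 - sigma / sigma_max:.2f}")
```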
Vogel, Simon M; Bauer, Matthias R; Boeckler, Frank M
2011-10-24
For widely applied in silico screening techniques, success depends on the rational selection of an appropriate method. We herein present a fast, versatile, and robust method to construct demanding evaluation kits for objective in silico screening (DEKOIS). This automated process enables creating tailor-made decoy sets for any given set of bioactives. It facilitates a target-dependent validation of docking algorithms and scoring functions, helping to save time and resources. We have developed metrics for assessing and improving decoy set quality and employ them to investigate how decoy embedding affects docking. We demonstrate that screening performance is target-dependent and can be impaired by latent actives in the decoy set (LADS) or enhanced by poor decoy embedding. The presented method allows extending and complementing the collection of publicly available high-quality decoy sets toward new target space. All present and future DEKOIS data sets will be made accessible at www.dekois.com.
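The heart of tailor-made decoy selection is property matching: embed actives and candidate decoys in a normalized physicochemical property space and pick, for each active, the closest candidates (DEKOIS additionally screens out latent actives, which this sketch omits). The vectors below are invented, so this is a generic illustration of the idea rather than the DEKOIS procedure:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical property vectors (e.g., MW, logP, HBD, HBA, rotatable bonds).
actives = rng.normal(0, 1, (15, 5))
candidates = rng.normal(0, 1, (5000, 5))

# Normalize all compounds in a shared property space.
pool = np.vstack([actives, candidates])
mu, sd = pool.mean(axis=0), pool.std(axis=0)
A = (actives - mu) / sd
C = (candidates - mu) / sd

# For each active, keep the 30 physicochemically closest decoys.
decoys = {i: np.argsort(np.linalg.norm(C - a, axis=1))[:30]
          for i, a in enumerate(A)}
print("decoy indices for active 0:", decoys[0][:10])
```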
From Theory to Practice: Measuring end-of-life communication quality using multiple goals theory.
Van Scoy, L J; Scott, A M; Reading, J M; Chuang, C H; Chinchilli, V M; Levi, B H; Green, M J
2017-05-01
To describe how multiple goals theory can be used as a reliable and valid measure (i.e., coding scheme) of the quality of conversations about end-of-life issues. We analyzed 17 conversations in which 68 participants (mean age = 51 years) played a game that prompted discussion in response to open-ended questions about end-of-life issues. Conversations (mean duration = 91 min) were audio-recorded and transcribed. Communication quality was assessed by three coders who assigned numeric scores rating how well individuals accomplished task, relational, and identity goals in the conversation. The coding measure, which results in a quantifiable outcome, yielded strong reliability (intra-class correlation range = 0.73-0.89 and Cronbach's alpha range = 0.69-0.89 for each of the coded domains) and validity (using multilevel nonlinear modeling, we detected significant variability in scores between games for each of the coded domains, all p-values < 0.02). Our coding scheme provides a theory-based measure of end-of-life conversation quality that is superior to other methods of measuring communication quality. Our description of the coding method enables researchers to adapt and apply this measure to communication interventions in other clinical contexts. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
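Once coder ratings are arranged as a conversations-by-items matrix, reliability statistics of the kind reported here reduce to a few lines. A minimal sketch of Cronbach's alpha on simulated ratings (illustrative only; the intra-class correlation would be computed analogously from a coder-by-conversation layout):

```python
import numpy as np

def cronbach_alpha(ratings):
    """ratings: conversations (rows) x coded items (columns)."""
    k = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)
    total_var = ratings.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(2)
latent = rng.normal(0, 1, (17, 1))            # latent conversation quality
items = latent + rng.normal(0, 0.5, (17, 8))  # 8 correlated coded items
print(f"alpha = {cronbach_alpha(items):.2f}")
```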
Towards the Application of Fuzzy Logic for Developing a Novel Indoor Air Quality Index (FIAQI)
JAVID, Allahbakhsh; HAMEDIAN, Amir Abbas; GHARIBI, Hamed; SOWLAT, Mohammad Hossein
2016-01-01
Background: In the past few decades, Indoor Air Pollution (IAP) has become a primary concern, increasingly believed to be of equal or greater importance to human health than ambient air pollution. However, due to the lack of comprehensive indices for the integrated assessment of indoor air quality (IAQ), we aimed to develop a novel Fuzzy-Based Indoor Air Quality Index (FIAQI) to bridge the existing gap in this area. Methods: We based our index on fuzzy logic, which enables us to overcome the limitations of traditional methods applied to develop environmental quality indices. Fifteen parameters, including the criteria air pollutants, volatile organic compounds, and bioaerosols, were included in the FIAQI, due mainly to their significant health effects. Weighting factors were assigned to the parameters based on the medical evidence available in the literature on their health effects. The final FIAQI consisted of 108 rules. In order to demonstrate the performance of the index, data were intentionally generated to cover a variety of quality levels. In addition, a sensitivity analysis was conducted to assess the validity of the index. Results: The FIAQI proved to be a comprehensive tool for classifying IAQ and produced accurate results. Conclusion: It appears useful and reliable enough to be considered by authorities for assessing indoor air quality. PMID:27114985
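The fuzzy machinery behind such an index can be shown in miniature: triangular membership functions fuzzify each pollutant, min plays the role of AND in each rule, and a weighted centroid defuzzifies the result. The breakpoints, rules, and output levels below are invented for illustration; they are not the FIAQI's 108-rule base:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0, 1))

def fuzzy_iaq_sketch(pm25, co2):
    # Fuzzify the two inputs (breakpoints invented).
    pm_good, pm_bad = tri(pm25, -1, 0, 35), tri(pm25, 20, 75, 130)
    co2_good, co2_bad = tri(co2, 0, 400, 1000), tri(co2, 800, 2000, 3200)

    # Tiny rule base: AND = min, OR = max; each rule fires an output level.
    r_good = min(pm_good, co2_good)        # both good -> index ~ 25
    r_mod = max(min(pm_good, co2_bad),     # one bad   -> index ~ 60
                min(pm_bad, co2_good))
    r_poor = min(pm_bad, co2_bad)          # both bad  -> index ~ 90

    # Weighted-centroid defuzzification.
    strengths = np.array([r_good, r_mod, r_poor])
    levels = np.array([25.0, 60.0, 90.0])
    total = strengths.sum()
    return float((strengths * levels).sum() / total) if total else float("nan")

print(f"fuzzy index for PM2.5 = 40, CO2 = 1200: {fuzzy_iaq_sketch(40.0, 1200.0):.1f}")
```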
Fast Batch Production of High-Quality Graphene Films in a Sealed Thermal Molecular Movement System.
Xu, Jianbao; Hu, Junxiong; Li, Qi; Wang, Rubing; Li, Weiwei; Guo, Yufen; Zhu, Yongbo; Liu, Fengkui; Ullah, Zaka; Dong, Guocai; Zeng, Zhongming; Liu, Liwei
2017-07-01
Chemical vapor deposition (CVD) growth of high-quality graphene has emerged as the most promising technique in terms of integrated manufacturing. However, a controllable growth method that simultaneously produces high-quality graphene films in large quantities at a fast growth rate has been lacking, whether roll-to-roll (R2R) or batch-to-batch (B2B). Here, a stationary-atmospheric-pressure CVD (SAPCVD) system based on thermal molecular movement, which enables fast B2B growth of continuous and uniform graphene films on tens of stacked Cu(111) foils with a growth rate of 1.5 µm s⁻¹, is demonstrated. The monolayer graphene of batch production is found to nucleate from arrays of well-aligned domains, and the films possess few defects and exhibit high carrier mobility, up to 6944 cm² V⁻¹ s⁻¹ at room temperature. The results indicate that the SAPCVD system combined with single-domain Cu(111) substrates makes it possible to realize fast batch-growth of high-quality graphene films, which opens up enormous opportunities to use this unique 2D material for industrial device applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Primer and platform effects on 16S rRNA tag sequencing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tremblay, Julien; Singh, Kanwar; Fern, Alison
Sequencing of 16S rRNA gene tags is a popular method for profiling and comparing microbial communities. The protocols and methods used, however, vary considerably with regard to amplification primers, sequencing primers, sequencing technologies, as well as quality filtering and clustering. How results are affected by these choices, and whether data produced with different protocols can be meaningfully compared, is often unknown. Here we compare results obtained using three different amplification primer sets (targeting V4, V6–V8, and V7–V8) and two sequencing technologies (454 pyrosequencing and Illumina MiSeq) using DNA from a mock community containing a known number of species as well as complex environmental samples whose PCR-independent profiles were estimated using shotgun sequencing. We find that paired-end MiSeq reads produce higher quality data and enabled the use of more aggressive quality control parameters over 454, resulting in a higher retention rate of high quality reads for downstream data analysis. While primer choice considerably influences quantitative abundance estimations, sequencing platform has relatively minor effects when matched primers are used. In conclusion, beta diversity metrics are surprisingly robust to both primer and sequencing platform biases.
Appleton, P L; Quyn, A J; Swift, S; Näthke, I
2009-05-01
Visualizing overall tissue architecture in three dimensions is fundamental for validating and integrating biochemical, cell biological and visual data from less complex systems such as cultured cells. Here, we describe a method to generate high-resolution three-dimensional image data of intact mouse gut tissue. Regions of highest interest lie between 50 and 200 µm within this tissue. The quality and usefulness of three-dimensional image data of tissue at such depth are limited owing to problems associated with scattered light, photobleaching and spherical aberration. Furthermore, the highest-quality oil-immersion lenses are designed to work at a maximum distance of about 10-15 µm into the sample, further compounding the ability to image at high resolution deep within tissue. We show that manipulating the refractive index of the mounting media and decreasing sample opacity greatly improves image quality, such that the limiting factor for a standard, inverted multi-photon microscope is determined by the working distance of the objective as opposed to detectable fluorescence. This method negates the need for mechanical sectioning of tissue and enables the routine generation of high-quality, quantitative image data that can significantly advance our understanding of tissue architecture and physiology.
MO-C-18A-01: Advances in Model-Based 3D Image Reconstruction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, G; Pan, X; Stayman, J
2014-06-15
Recent years have seen the emergence of CT image reconstruction techniques that exploit physical models of the imaging system, photon statistics, and even the patient to achieve improved 3D image quality and/or reduction of radiation dose. With numerous advantages in comparison to conventional 3D filtered backprojection, such techniques bring a variety of challenges as well, including: a demanding computational load associated with sophisticated forward models and iterative optimization methods; nonlinearity and nonstationarity in image quality characteristics; a complex dependency on multiple free parameters; and the need to understand how best to incorporate prior information (including patient-specific prior images) within the reconstruction process. The advantages, however, are even greater – for example: improved image quality; reduced dose; robustness to noise and artifacts; task-specific reconstruction protocols; suitability to novel CT imaging platforms and noncircular orbits; and incorporation of known characteristics of the imager and patient that are conventionally discarded. This symposium features experts in 3D image reconstruction, image quality assessment, and the translation of such methods to emerging clinical applications. Dr. Chen will address novel methods for the incorporation of prior information in 3D and 4D CT reconstruction techniques. Dr. Pan will show recent advances in optimization-based reconstruction that enable potential reduction of dose and sampling requirements. Dr. Stayman will describe a “task-based imaging” approach that leverages models of the imaging system and patient in combination with a specification of the imaging task to optimize both the acquisition and reconstruction process. Dr. Samei will describe the development of methods for image quality assessment in such nonlinear reconstruction techniques and the use of these methods to characterize and optimize image quality and dose in a spectrum of clinical applications. Learning Objectives: Learn the general methodologies associated with model-based 3D image reconstruction. Learn the potential advantages in image quality and dose associated with model-based image reconstruction. Learn the challenges associated with computational load and image quality assessment for such reconstruction methods. Learn how imaging task can be incorporated as a means to drive optimal image acquisition and reconstruction techniques. Learn how model-based reconstruction methods can incorporate prior information to improve image quality, ease sampling requirements, and reduce dose.
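As a rough sketch of the model-based reconstruction idea discussed in the symposium (a toy penalized least-squares solver, not any presenter's algorithm), one can minimize a data-fidelity term plus a regularizer by gradient descent:

```python
import numpy as np

def mbir_gradient_descent(A, y, lam=0.1, step=0.1, iters=500):
    """Toy penalized least-squares reconstruction:
    minimize ||A x - y||^2 + lam * ||x||^2 by gradient descent.
    A is the forward (projection) model, y the measured data."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = 2.0 * A.T @ (A @ x - y) + 2.0 * lam * x
        x -= step * grad
    return x

# Two-pixel "object" and a three-ray system matrix, purely illustrative.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
y = A @ x_true + np.random.default_rng(0).normal(0.0, 0.05, 3)
print(mbir_gradient_descent(A, y, lam=0.01))
```

Real systems replace the quadratic penalty with edge-preserving or prior-image terms and use far more efficient optimizers, which is where the computational-load challenge described above arises.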
NASA Astrophysics Data System (ADS)
Christianson, D. S.; Varadharajan, C.; Detto, M.; Faybishenko, B.; Gimenez, B.; Jardine, K.; Negron Juarez, R. I.; Pastorello, G.; Powell, T.; Warren, J.; Wolfe, B.; McDowell, N. G.; Kueppers, L. M.; Chambers, J.; Agarwal, D.
2016-12-01
The U.S. Department of Energy's (DOE) Next Generation Ecosystem Experiment (NGEE) Tropics project aims to develop a process-rich tropical forest ecosystem model that is parameterized and benchmarked by field observations. Thus, data synthesis, quality assurance and quality control (QA/QC), and data product generation of a diverse and complex set of ecohydrological observations, including sapflux, leaf surface temperature, soil water content, and leaf gas exchange from sites across the Tropics, are required to support model simulations. We have developed a metadata reporting framework, implemented in conjunction with the NGEE Tropics Data Archive tool, to enable cross-site and cross-method comparison, data interpretability, and QA/QC. We employed a modified User-Centered Design approach, which involved short development cycles based on user-identified needs, and iterative testing with data providers and users. The metadata reporting framework has so far been implemented for sensor-based observations and leverages several existing metadata protocols. The framework consists of templates that define a multi-scale measurement position hierarchy, descriptions of measurement settings, and details about data collection and data file organization. The framework also enables data providers to define data-access permission settings, provenance, and referencing to enable appropriate data usage, citation, and attribution. In addition to describing the metadata reporting framework, we discuss tradeoffs and impressions from both data providers and users during the development process, focusing on the scalability, usability, and efficiency of the framework.
The principles of quality-associated costing: derivation from clinical transfusion practice.
Trenchard, P M; Dixon, R
1997-01-01
As clinical transfusion practice works towards achieving cost-effectiveness, prescribers of blood and its derivatives must be certain that the prices of such products are based on real manufacturing costs and not market forces. Using clinical cost-benefit analysis as the context for the costing and pricing of blood products, this article identifies the following two principles: (1) the product price must equal the product cost (the "price = cost" rule) and (2) the product cost must equal the real cost of product manufacture. In addition, the article describes a new method of blood product costing, quality-associated costing (QAC), that will enable valid cost-benefit analysis of blood products.
Event-Driven Messaging for Offline Data Quality Monitoring at ATLAS
NASA Astrophysics Data System (ADS)
Onyisi, Peter
2015-12-01
During LHC Run 1, the information flow through the offline data quality monitoring in ATLAS relied heavily on chains of processes polling each other's outputs for handshaking purposes. This resulted in a fragile architecture with many possible points of failure and an inability to monitor the overall state of the distributed system. We report on the status of a project undertaken during the LHC shutdown to replace the ad hoc synchronization methods with a uniform message queue system. This enables the use of standard protocols to connect processes on multiple hosts; reliable transmission of messages between possibly unreliable programs; easy monitoring of the information flow; and the removal of inefficient polling-based communication.
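The pattern described, replacing polling chains with a broker-mediated queue, might look like the following sketch. The abstract does not name the messaging system; RabbitMQ via the pika client is an assumption here, and the queue name and payload are invented:

```python
import pika  # assumes a RabbitMQ broker reachable on localhost

conn = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
ch = conn.channel()
ch.queue_declare(queue="dq-results", durable=True)

# Producer: a processing step announces completion instead of being polled.
ch.basic_publish(exchange="", routing_key="dq-results",
                 body=b'{"run": 123456, "status": "histograms-ready"}')

# Consumer: downstream steps react to messages as they arrive, and the
# broker itself gives a single place to monitor the information flow.
def on_message(channel, method, properties, body):
    print("received:", body)
    channel.basic_ack(delivery_tag=method.delivery_tag)

ch.basic_consume(queue="dq-results", on_message_callback=on_message)
ch.start_consuming()
```

Because messages are acknowledged and persisted by the broker, an unreliable consumer can crash and resume without losing handshakes, which is the failure mode the polling architecture suffered from.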
Odusola, Aina O; Stronks, Karien; Hendriks, Marleen E; Schultsz, Constance; Akande, Tanimola; Osibogun, Akin; van Weert, Henk; Haafkens, Joke A
2016-01-01
Hypertension is a highly prevalent risk factor for cardiovascular diseases in sub-Saharan Africa (SSA) that can be modified through timely and long-term treatment in primary care. We explored perspectives of primary care staff and health insurance managers on enablers and barriers for implementing high-quality hypertension care, in the context of a community-based health insurance programme in rural Nigeria. Qualitative study using semi-structured individual interviews with primary care staff (n = 11) and health insurance managers (n=4). Data were analysed using standard qualitative techniques. Both stakeholder groups perceived health insurance as an important facilitator for implementing high-quality hypertension care because it covered costs of care for patients and provided essential resources and incentives to clinics: guidelines, staff training, medications, and diagnostic equipment. Perceived inhibitors included the following: high staff workload; administrative challenges at facilities; discordance between healthcare provider and insurer on how health insurance and provider payment methods work; and insufficient fit between some guideline recommendations and tools for patient education and characteristics/needs of the local patient population. Perceived strategies to address inhibitors included the following: task-shifting; adequate provider payment benchmarking; good provider-insurer relationships; automated administration systems; and tailoring guidelines/patient education. By providing insights into perspectives of primary care providers and health insurance managers, this study offers information on potential strategies for implementing high-quality hypertension care for insured patients in SSA.
Mashamba-Thompson, Tivani P.; Jama, Ngcwalisa A.; Sartorius, Benn; Drain, Paul K.; Thompson, Rowan M.
2017-01-01
Introduction: Key stakeholders’ involvement is crucial to the sustainability of quality point-of-care (POC) diagnostic services in low- and middle-income countries. The aim of this study was to explore key stakeholder perceptions on the implementation of POC diagnostics in rural primary healthcare (PHC) clinics in South Africa. Method: We conducted a qualitative study encompassing in-depth interviews with multiple key stakeholders of POC diagnostic services for rural and resource-limited PHC clinics. Interviews were digitally recorded and transcribed verbatim prior to thematic content analysis. Thematic content analysis was conducted using themes guided by the World Health Organisation (WHO) ASSURED criteria (Affordable, Sensitive, Specific, User-friendly, Rapid and robust, Equipment-free, and Delivered to those who need it) for POC diagnostic services in resource-limited settings. Results: 11 key stakeholders participated in the study. All stakeholders perceived the main advantage of POC diagnostics as enabling access to healthcare for rural patients. Stakeholders perceived the current POC diagnostic services to have an ability to meet patients’ needs, but recommended further improvement of the following areas: research on cost-effectiveness; improved quality management systems; and development of affordable POC diagnostics and clinic-based monitoring and evaluation. Conclusions: Key stakeholders of POC diagnostics in rural PHC clinics in South Africa highlighted the need to assess affordability and ensure quality assurance of current services before adopting new POC diagnostics and scaling up current POC diagnostics. PMID:28075337
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nose, Takayuki, E-mail: nose-takayuki@nms.ac.jp; Chatani, Masashi; Otani, Yuki
Purpose: High-dose-rate (HDR) brachytherapy misdeliveries can occur at any institution, and they can cause disastrous results. Even a patient's death has been reported. Misdeliveries could be avoided with real-time verification methods. In 1996, we developed a modified C-arm fluoroscopic verification of an HDR Iridium 192 source position to prevent these misdeliveries. This method provided excellent image quality sufficient to detect errors, and it has been in clinical use at our institutions for 20 years. The purpose of the current study is to introduce the mechanisms and validity of our straightforward C-arm fluoroscopic verification method. Methods and Materials: Conventional X-ray fluoroscopic images are degraded by spurious signals and quantum noise from Iridium 192 photons, which make source verification impractical. To improve image quality, we quadrupled the C-arm fluoroscopic X-ray dose per pulse. The pulse rate was reduced by a factor of 4 to keep the average exposure compliant with Japanese medical regulations. The images were then displayed with quarter-frame rates. Results: Sufficient quality was obtained to enable observation of the source position relative to both the applicators and the anatomy. With this method, 2 errors were detected among 2031 treatment sessions for 370 patients within a 6-year period. Conclusions: With the use of a modified C-arm fluoroscopic verification method, treatment errors that were otherwise overlooked were detected in real time. This method should be given consideration for widespread use.
Multiview 3D sensing and analysis for high quality point cloud reconstruction
NASA Astrophysics Data System (ADS)
Satnik, Andrej; Izquierdo, Ebroul; Orjesek, Richard
2018-04-01
Multiview 3D reconstruction techniques enable digital reconstruction of 3D objects from the real world by fusing different viewpoints of the same object into a single 3D representation. This process is by no means trivial and the acquisition of high quality point cloud representations of dynamic 3D objects is still an open problem. In this paper, an approach for high fidelity 3D point cloud generation using low cost 3D sensing hardware is presented. The proposed approach runs in an efficient low-cost hardware setting based on several Kinect v2 scanners connected to a single PC. It performs autocalibration and runs in real-time exploiting an efficient composition of several filtering methods including Radius Outlier Removal (ROR), Weighted Median filter (WM) and Weighted Inter-Frame Average filtering (WIFA). The performance of the proposed method has been demonstrated through efficient acquisition of dense 3D point clouds of moving objects.
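Of the filters named, Radius Outlier Removal is the simplest to sketch; a possible NumPy/SciPy version (parameter values invented) is:

```python
import numpy as np
from scipy.spatial import cKDTree

def radius_outlier_removal(points, radius=0.1, min_neighbors=5):
    """Keep only points with at least `min_neighbors` others within `radius`."""
    tree = cKDTree(points)
    # query_ball_point includes the query point itself, hence the -1.
    counts = np.array([len(tree.query_ball_point(p, radius)) - 1
                       for p in points])
    return points[counts >= min_neighbors]

# Stand-in for a fused multi-Kinect cloud (coordinates are invented).
cloud = np.random.default_rng(0).random((2000, 3))
print(radius_outlier_removal(cloud).shape)
```

Isolated sensor-noise points have few neighbors within the search radius and are discarded, while densely sampled surface points survive; the weighted median and inter-frame averaging stages then smooth the retained points.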
Biodiversity informatics: managing and applying primary biodiversity data.
Soberón, Jorge; Peterson, A Townsend
2004-01-01
Recently, advances in information technology and an increased willingness to share primary biodiversity data are enabling unprecedented access to such data. By combining species presence data with electronic cartography via a number of algorithms, estimating the niches of species and their areas of distribution becomes feasible at resolutions one to three orders of magnitude higher than was possible a few years ago. Some examples of the power of that technique are presented. For the method to work, limitations such as the lack of high-quality taxonomic determination, precise georeferencing of the data and availability of high-quality and updated taxonomic treatments of the groups must be overcome. These are discussed, together with comments on the potential of these biodiversity informatics techniques not only for fundamental studies but also as a way for developing countries to apply state-of-the-art bioinformatic methods and large quantities of data, in practical ways, to tackle issues of biodiversity management. PMID:15253354
Sorge, John P; Harmon, C Reid; Sherman, Susan M; Baillie, E Eugene
2005-07-01
We used data management software to compare pathology report data concerning regional lymph node sampling for colorectal carcinoma from 2 institutions using different dissection methods. Data were retrieved from 2 disparate anatomic pathology information systems for all cases of colorectal carcinoma in 2003 involving the ascending and descending colon. Initial sorting of the data included overall lymph node recovery to assess differences between the dissection methods at the 2 institutions. Additional segregation of the data was used to challenge the application's capability of accurately addressing the complexity of the process. This software approach can be used to evaluate data from disparate computer systems, and we demonstrate how an automated function can enable institutions to compare internal pathologic assessment processes and the results of those comparisons. The use of this process has future implications for pathology quality assurance in other areas.
Lucas, Patricia J; Baird, Janis; Arai, Lisa; Law, Catherine; Roberts, Helen M
2007-01-01
Background The inclusion of qualitative studies in systematic reviews poses methodological challenges. This paper presents worked examples of two methods of data synthesis (textual narrative and thematic), used in relation to one review, with the aim of enabling researchers to consider the strength of different approaches. Methods A systematic review of lay perspectives of infant size and growth was conducted, locating 19 studies (including both qualitative and quantitative). The data extracted from these were synthesised using both a textual narrative and a thematic synthesis. Results The processes of both methods are presented, showing a stepwise progression to the final synthesis. Both methods led us to similar conclusions about lay views toward infant size and growth. Differences between methods lie in the way they dealt with study quality and heterogeneity. Conclusion On the basis of the work reported here, we consider textual narrative and thematic synthesis have strengths and weaknesses in relation to different research questions. Thematic synthesis holds most potential for hypothesis generation, but may obscure heterogeneity and quality appraisal. Textual narrative synthesis is better able to describe the scope of existing research and account for the strength of evidence, but is less good at identifying commonality. PMID:17224044
Printable semiconductor structures and related methods of making and assembling
Nuzzo, Ralph G.; Rogers, John A.; Menard, Etienne; Lee, Keon Jae; Khang, Dahl-Young; Sun, Yugang; Meitl, Matthew; Zhu, Zhengtao; Ko, Heung Cho; Mack, Shawn
2013-03-12
The present invention provides a high yield pathway for the fabrication, transfer and assembly of high quality printable semiconductor elements having selected physical dimensions, shapes, compositions and spatial orientations. The compositions and methods of the present invention provide high precision registered transfer and integration of arrays of microsized and/or nanosized semiconductor structures onto substrates, including large area substrates and/or flexible substrates. In addition, the present invention provides methods of making printable semiconductor elements from low cost bulk materials, such as bulk silicon wafers, and smart-materials processing strategies that enable a versatile and commercially attractive printing-based fabrication platform for making a broad range of functional semiconductor devices.
Printable semiconductor structures and related methods of making and assembling
Nuzzo, Ralph G [Champaign, IL; Rogers, John A [Champaign, IL; Menard, Etienne [Durham, NC; Lee, Keon Jae [Tokyo, JP; Khang, Dahl-Young [Urbana, IL; Sun, Yugang [Westmont, IL; Meitl, Matthew [Raleigh, NC; Zhu, Zhengtao [Rapid City, SD; Ko, Heung Cho [Urbana, IL; Mack, Shawn [Goleta, CA
2011-10-18
The present invention provides a high yield pathway for the fabrication, transfer and assembly of high quality printable semiconductor elements having selected physical dimensions, shapes, compositions and spatial orientations. The compositions and methods of the present invention provide high precision registered transfer and integration of arrays of microsized and/or nanosized semiconductor structures onto substrates, including large area substrates and/or flexible substrates. In addition, the present invention provides methods of making printable semiconductor elements from low cost bulk materials, such as bulk silicon wafers, and smart-materials processing strategies that enable a versatile and commercially attractive printing-based fabrication platform for making a broad range of functional semiconductor devices.
Printable semiconductor structures and related methods of making and assembling
Nuzzo, Ralph G.; Rogers, John A.; Menard, Etienne; Lee, Keon Jae; Khang, Dahl-Young; Sun, Yugang; Meitl, Matthew; Zhu, Zhengtao; Ko, Heung Cho; Mack, Shawn
2010-09-21
The present invention provides a high yield pathway for the fabrication, transfer and assembly of high quality printable semiconductor elements having selected physical dimensions, shapes, compositions and spatial orientations. The compositions and methods of the present invention provide high precision registered transfer and integration of arrays of microsized and/or nanosized semiconductor structures onto substrates, including large area substrates and/or flexible substrates. In addition, the present invention provides methods of making printable semiconductor elements from low cost bulk materials, such as bulk silicon wafers, and smart-materials processing strategies that enable a versatile and commercially attractive printing-based fabrication platform for making a broad range of functional semiconductor devices.
NASA Astrophysics Data System (ADS)
Vogt, William C.; Jia, Congxian; Wear, Keith A.; Garra, Brian S.; Pfefer, T. Joshua
2017-03-01
As Photoacoustic Tomography (PAT) matures and undergoes clinical translation, objective performance test methods are needed to facilitate device development, regulatory clearance and clinical quality assurance. For mature medical imaging modalities such as CT, MRI, and ultrasound, tissue-mimicking phantoms are frequently incorporated into consensus standards for performance testing. A well-validated set of phantom-based test methods is needed for evaluating performance characteristics of PAT systems. To this end, we have constructed phantoms using a custom tissue-mimicking material based on PVC plastisol with tunable, biologically-relevant optical and acoustic properties. Each phantom is designed to enable quantitative assessment of one or more image quality characteristics including 3D spatial resolution, spatial measurement accuracy, ultrasound/PAT co-registration, uniformity, penetration depth, geometric distortion, sensitivity, and linearity. Phantoms contained targets including high-intensity point source targets and dye-filled tubes. This suite of phantoms was used to measure the dependence of performance of a custom PAT system (equipped with four interchangeable linear array transducers of varying design) on design parameters (e.g., center frequency, bandwidth, element geometry). Phantoms also allowed comparison of image artifacts, including surface-generated clutter and bandlimited sensing artifacts. Results showed that transducer design parameters create strong variations in performance including a trade-off between resolution and penetration depth, which could be quantified with our method. This study demonstrates the utility of phantom-based image quality testing in device performance assessment, which may guide development of consensus standards for PAT systems.
BCD Beam Search: considering suboptimal partial solutions in Bad Clade Deletion supertrees.
Fleischauer, Markus; Böcker, Sebastian
2018-01-01
Supertree methods enable the reconstruction of large phylogenies. The supertree problem can be formalized in different ways in order to cope with contradictory information in the input. Some supertree methods are based on encoding the input trees in a matrix; other methods try to find minimum cuts in some graph. Recently, we introduced Bad Clade Deletion (BCD) supertrees which combines the graph-based computation of minimum cuts with optimizing a global objective function on the matrix representation of the input trees. The BCD supertree method has guaranteed polynomial running time and is very swift in practice. The quality of reconstructed supertrees was superior to matrix representation with parsimony (MRP) and usually on par with SuperFine for simulated data; but particularly for biological data, quality of BCD supertrees could not keep up with SuperFine supertrees. Here, we present a beam search extension for the BCD algorithm that keeps alive a constant number of partial solutions in each top-down iteration phase. The guaranteed worst-case running time of the new algorithm is still polynomial in the size of the input. We present an exact and a randomized subroutine to generate suboptimal partial solutions. Both beam search approaches consistently improve supertree quality on all evaluated datasets when keeping 25 suboptimal solutions alive. Supertree quality of the BCD Beam Search algorithm is on par with MRP and SuperFine even for biological data. This is the best performance of a polynomial-time supertree algorithm reported so far.
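A generic beam-search skeleton conveys the extension's core idea, keeping several suboptimal partial solutions alive per iteration instead of greedily committing to one. This is a sketch of the pattern only, not the BCD implementation; the width of 25 echoes the setting evaluated in the paper:

```python
import heapq

def beam_search(initial, expand, score, beam_width=25, max_steps=50):
    """Keep the `beam_width` best partial solutions alive at each step."""
    beam = [initial]
    best = initial
    for _ in range(max_steps):
        # Expand every surviving partial solution into its children.
        candidates = [child for state in beam for child in expand(state)]
        if not candidates:
            break
        # Retain only the top-scoring partial solutions for the next round.
        beam = heapq.nlargest(beam_width, candidates, key=score)
        best = max([best] + beam, key=score)
    return best
```

Because the beam width is a constant, the worst-case running time grows only by a constant factor over the greedy algorithm, consistent with the polynomial guarantee stated in the abstract.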
Systems engineering and management.
Rouse, William B; Compton, W Dale
2010-01-01
This chapter offers a systems view of healthcare delivery and outlines a wide range of concepts, principles, models, methods and tools from systems engineering and management that can enable the transformation of the dysfunctional "as is" healthcare system to an agreed-upon "to be" system that will provide quality, affordable care for everyone. Topics discussed include systems definition, design, analysis, and control, as well as the data and information needed to support these functions. Barriers to implementation are also considered.
Management and assimilation of diverse, distributed watershed datasets
NASA Astrophysics Data System (ADS)
Varadharajan, C.; Faybishenko, B.; Versteeg, R.; Agarwal, D.; Hubbard, S. S.; Hendrix, V.
2016-12-01
The U.S. Department of Energy's (DOE) Watershed Function Scientific Focus Area (SFA) seeks to determine how perturbations to mountainous watersheds (e.g., floods, drought, early snowmelt) impact the downstream delivery of water, nutrients, carbon, and metals over seasonal to decadal timescales. We are building a software platform that enables integration of diverse and disparate field, laboratory, and simulation datasets, of various types including hydrological, geological, meteorological, geophysical, geochemical, ecological and genomic datasets across a range of spatial and temporal scales within the Rifle floodplain and the East River watershed, Colorado. We are using agile data management and assimilation approaches to enable web-based integration of heterogeneous, multi-scale datasets. Sensor-based observations of water level, vadose zone and groundwater temperature, water quality, and meteorology, as well as biogeochemical analyses of soil and groundwater samples, have been curated and archived in federated databases. Quality Assurance and Quality Control (QA/QC) are performed on priority datasets needed for ongoing scientific analyses and hydrological and geochemical modeling. Automated QA/QC methods are used to identify and flag issues in the datasets. Data integration is achieved via a brokering service that dynamically integrates data from distributed databases via web services, based on user queries. The integrated results are presented to users in a portal that enables intuitive search, interactive visualization and download of integrated datasets. The concepts, approaches and codes being used are shared across the data science components of several large DOE-funded projects, such as the Watershed Function SFA, Next Generation Ecosystem Experiment (NGEE) Tropics, Ameriflux/FLUXNET, and Advanced Simulation Capability for Environmental Management (ASCEM), and together contribute towards DOE's cyberinfrastructure for data management and model-data integration.
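Automated QA/QC flagging of the kind mentioned can be as simple as range and spike checks over a sensor time series; a minimal pandas sketch with invented thresholds and data:

```python
import pandas as pd

def flag_sensor_series(s, lo, hi, max_step):
    """Flag out-of-range values and abrupt spikes in a sensor time series."""
    flags = pd.Series("ok", index=s.index)
    flags[s.diff().abs() > max_step] = "spike"
    flags[(s < lo) | (s > hi)] = "out_of_range"
    return flags

# Invented hourly groundwater-temperature record (deg C) with one bad value.
ts = pd.Series([9.1, 9.2, 9.1, 25.0, 9.3, 9.2],
               index=pd.date_range("2016-06-01", periods=6, freq="h"))
print(flag_sensor_series(ts, lo=0.0, hi=20.0, max_step=5.0))
```

Production pipelines typically layer further checks (stuck-sensor detection, cross-sensor consistency) on the same flag-column pattern so that downstream modeling can filter on flags rather than on edited values.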
Nåbo, Lina J; Olsen, Jógvan Magnus Haugaard; Martínez, Todd J; Kongsted, Jacob
2017-12-12
The calculation of spectral properties for photoactive proteins is challenging because of the large cost of electronic structure calculations on large systems. Mixed quantum mechanical (QM) and molecular mechanical (MM) methods are typically employed to make such calculations computationally tractable. This study addresses the connection between the minimal QM region size and the method used to model the MM region in the calculation of absorption properties-here exemplified for calculations on the green fluorescent protein. We find that polarizable embedding is necessary for a qualitatively correct description of the MM region, and that this enables the use of much smaller QM regions compared to fixed charge electrostatic embedding. Furthermore, absorption intensities converge very slowly with system size and inclusion of effective external field effects in the MM region through polarizabilities is therefore very important. Thus, this embedding scheme enables accurate prediction of intensities for systems that are too large to be treated fully quantum mechanically.
LeBlanc, André; Michaud, Sarah A; Percy, Andrew J; Hardie, Darryl B; Yang, Juncong; Sinclair, Nicholas J; Proudfoot, Jillaine I; Pistawka, Adam; Smith, Derek S; Borchers, Christoph H
2017-07-07
When quantifying endogenous plasma proteins for fundamental and biomedical research - as well as for clinical applications - precise, reproducible, and robust assays are required. Targeted detection of peptides in a bottom-up strategy is the most common and precise mass spectrometry-based quantitation approach when combined with the use of stable isotope-labeled peptides. However, when measuring protein in plasma, the unknown endogenous levels prevent the implementation of the best calibration strategies, since no blank matrix is available. Consequently, several alternative calibration strategies are employed by different laboratories. In this study, these methods were compared to a new approach using two different stable isotope-labeled standard (SIS) peptide isotopologues for each endogenous peptide to be quantified, enabling an external calibration curve as well as the quality control samples to be prepared in pooled human plasma without interference from endogenous peptides. This strategy improves the analytical performance of the assay and enables the accuracy of the assay to be monitored, which can also facilitate method development and validation.
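The calibration logic can be illustrated with a small worked example (all numbers invented): fit a line to the ratio of the calibration isotopologue to the fixed internal-standard isotopologue, then back-calculate unknowns from their measured ratios:

```python
import numpy as np

# Hypothetical calibration points: known spiked concentrations of SIS-1
# versus measured peak-area ratios to the constant SIS-2 internal standard.
conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])      # e.g., fmol/uL
ratio = np.array([0.09, 0.52, 1.05, 4.80, 10.10])   # measured ratios

slope, intercept = np.polyfit(conc, ratio, 1)

# Back-calculate an endogenous peptide concentration from its ratio.
unknown_ratio = 2.40
print(f"estimated concentration: {(unknown_ratio - intercept) / slope:.2f}")
```

Because both isotopologues are synthetic, the curve and quality-control samples can be prepared directly in pooled plasma without interference from the endogenous peptide, which is the point of the two-isotopologue design.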
Uervirojnangkoorn, Monarin; Zeldin, Oliver B.; Lyubimov, Artem Y.; ...
2015-03-17
There is considerable potential for X-ray free electron lasers (XFELs) to enable determination of macromolecular crystal structures that are difficult to solve using current synchrotron sources. Prior XFEL studies often involved the collection of thousands to millions of diffraction images, in part due to limitations of data processing methods. We implemented a data processing system based on classical post-refinement techniques, adapted to specific properties of XFEL diffraction data. When applied to XFEL data from three different proteins collected using various sample delivery systems and XFEL beam parameters, our method improved the quality of the diffraction data as well as the resulting refined atomic models and electron density maps. Moreover, the number of observations for a reflection necessary to assemble an accurate data set could be reduced to a few observations. In conclusion, these developments will help expand the applicability of XFEL crystallography to challenging biological systems, including cases where sample is limited.
Uervirojnangkoorn, Monarin; Zeldin, Oliver B; Lyubimov, Artem Y; Hattne, Johan; Brewster, Aaron S; Sauter, Nicholas K; Brunger, Axel T; Weis, William I
2015-01-01
There is considerable potential for X-ray free electron lasers (XFELs) to enable determination of macromolecular crystal structures that are difficult to solve using current synchrotron sources. Prior XFEL studies often involved the collection of thousands to millions of diffraction images, in part due to limitations of data processing methods. We implemented a data processing system based on classical post-refinement techniques, adapted to specific properties of XFEL diffraction data. When applied to XFEL data from three different proteins collected using various sample delivery systems and XFEL beam parameters, our method improved the quality of the diffraction data as well as the resulting refined atomic models and electron density maps. Moreover, the number of observations for a reflection necessary to assemble an accurate data set could be reduced to a few observations. These developments will help expand the applicability of XFEL crystallography to challenging biological systems, including cases where sample is limited. DOI: http://dx.doi.org/10.7554/eLife.05421.001 PMID:25781634
Autonomy enables new science missions
NASA Astrophysics Data System (ADS)
Doyle, Richard J.; Gor, Victoria; Man, Guy K.; Stolorz, Paul E.; Chapman, Clark; Merline, William J.; Stern, Alan
1997-01-01
The challenge of space flight in NASA's future is to enable smaller, more frequent and intensive space exploration at much lower total cost without substantially decreasing mission reliability, capability, or the scientific return on investment. The most effective way to achieve this goal is to build intelligent capabilities into the spacecraft themselves. Our technological vision for meeting the challenge of returning quality science through limited communication bandwidth will actually put scientists in a more direct link with the spacecraft than they have enjoyed to date. Technologies such as pattern recognition and machine learning can place a part of the scientist's awareness onboard the spacecraft to prioritize downlink or to autonomously trigger time-critical follow-up observations (particularly important in flyby missions) without ground interaction. Onboard knowledge discovery methods can be used to include candidate discoveries in each downlink for scientists' scrutiny. Such capabilities will allow scientists to quickly reprioritize missions in a much more intimate and efficient manner than is possible today. Ultimately, new classes of exploration missions will be enabled.
Nursing home quality of life: study of an enabling garden.
Raske, Martha
2010-05-01
The purpose of this study was to conduct an in-depth evaluation of the impact of the construction and use of an enabling garden on resident quality of life in a rural nursing home. This qualitative study used interviews with residents, family members, staff members, and community volunteers who built the garden. Findings suggest the garden had positive effects on resident quality of life, particularly in terms of meaningful daily activities, enjoyment of daily life, resident relationships, and functional competency. Implications for research and practice are discussed.
Alamar, Priscila D; Caramês, Elem T S; Poppi, Ronei J; Pallone, Juliana A L
2016-07-01
The present study investigated the application of near infrared spectroscopy as a green, quick, and efficient alternative to the analytical methods currently used to evaluate the quality (moisture, total sugars, acidity, soluble solids, pH and ascorbic acid) of frozen guava and passion fruit pulps. Fifty samples were analyzed by near infrared spectroscopy (NIR) and reference methods. Partial least squares regression (PLSR) was used to develop calibration models relating the NIR spectra to the reference values. Reference methods indicated adulteration by water addition in 58% of guava pulp samples and 44% of yellow passion fruit pulp samples. The PLS models produced low values of root mean square error of calibration (RMSEC) and root mean square error of prediction (RMSEP), and coefficients of determination above 0.7. Moisture and total sugars presented the best calibration models (RMSEP of 0.240 and 0.269, respectively, for guava pulp; RMSEP of 0.401 and 0.413, respectively, for passion fruit pulp), which enables the application of these models to detect adulteration of guava and yellow passion fruit pulp by water or sugar addition. The models constructed for calibration of quality parameters of frozen fruit pulps in this study indicate that NIR spectroscopy coupled with multivariate calibration could be applied to determine the quality of guava and yellow passion fruit pulp.
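A minimal sketch of the PLSR workflow using scikit-learn, with a synthetic stand-in for the NIR spectra (the real models would use measured spectra and reference values, and the component count is a tuning choice):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
# Synthetic stand-in for NIR spectra: 50 samples x 200 wavelengths, with
# "moisture" encoded in a few bands plus noise (all values invented).
X = rng.normal(size=(50, 200))
y = 80.0 + 3.0 * X[:, 10] - 2.0 * X[:, 50] + rng.normal(0.0, 0.3, 50)

# Calibrate on the first 35 samples, predict the held-out 15.
pls = PLSRegression(n_components=5).fit(X[:35], y[:35])
y_pred = pls.predict(X[35:]).ravel()
rmsep = np.sqrt(np.mean((y[35:] - y_pred) ** 2))
print(f"RMSEP: {rmsep:.3f}")
```

RMSEP on an independent prediction set, as computed above, is the figure of merit the abstract reports per quality parameter.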
Can aging in place be cost effective? A systematic review.
Graybill, Erin M; McMeekin, Peter; Wildman, John
2014-01-01
To systematically review cost, cost-minimization and cost-effectiveness studies for assisted living technologies (ALTs) that specifically enable older people to 'age in place', and to highlight what further research is needed to inform decisions regarding aging in place. The population comprised people aged 65+ and their live-in carers (where applicable), using an ALT to age in place at home as opposed to a community-dwelling arrangement. Studies were identified using a predefined search strategy on two key economic and cost evaluation databases, NHS EED and HEED. Studies were assessed using methods recommended by the Campbell and Cochrane Economic Methods Group and presented in a narrative synthesis style. Eight eligible studies were identified from North America spread over a diverse geographical range. The majority of studies reported the ALT intervention group as having lower resource use costs than the control group, though the low methodological quality and heterogeneity of the individual costs and outcomes reported across studies must be considered. The studies suggest that in some cases ALTs may reduce costs, though few data were identified and those available were of poor quality. Methods to capture quality of life gains were not used, so potential effects on health and wellbeing may have been missed. Further research is required using newer developments such as the capabilities approach. High quality studies assessing the cost-effectiveness of ALTs for aging in place are required before robust conclusions on their use can be drawn.
Piracha, Afaq H; Rath, Patrik; Ganesan, Kumaravelu; Kühn, Stefan; Pernice, Wolfram H P; Prawer, Steven
2016-05-11
Diamond has emerged as a promising platform for nanophotonic, optical, and quantum technologies. High-quality, single crystalline substrates of acceptable size are a prerequisite to meet the demanding requirements on low-level impurities and low absorption loss when targeting large photonic circuits. Here, we describe a scalable fabrication method for single crystal diamond membrane windows that achieves three major goals with one fabrication method: providing high quality diamond, as confirmed by Raman spectroscopy; achieving homogeneously thin membranes, enabled by ion implantation; and providing compatibility with established planar fabrication via lithography and vertical etching. On such suspended diamond membranes we demonstrate a suite of photonic components as building blocks for nanophotonic circuits. Monolithic grating couplers are used to efficiently couple light between photonic circuits and optical fibers. In waveguide coupled optical ring resonators, we find loaded quality factors up to 66 000 at a wavelength of 1560 nm, corresponding to propagation loss below 7.2 dB/cm. Our approach holds promise for the scalable implementation of future diamond quantum photonic technologies and all-diamond photonic metrology tools.
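As a plausibility check on the quoted numbers (assuming a group index of roughly 2.7 for the diamond waveguide, which the abstract does not state), the loss bound follows from the standard relation between loaded quality factor and propagation loss:

\[
\alpha_{\mathrm{dB/cm}} = 10\log_{10}(e)\,\frac{2\pi n_g}{Q\,\lambda}
\approx 4.34 \times \frac{2\pi \times 2.7}{66\,000 \times 1.56\times 10^{-4}\,\mathrm{cm}}
\approx 7.2\ \mathrm{dB/cm},
\]

with the loaded \(Q\) providing an upper bound on the intrinsic propagation loss, consistent with the "below 7.2 dB/cm" phrasing.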
Deng, William Nanqiao; Wang, Shuo; Ventrici de Souza, Joao; Kuhl, Tonya L; Liu, Gang-Yu
2018-06-25
Scanning probe microscopy (SPM), such as atomic force microscopy (AFM), is widely known for high-resolution imaging of surface structures and nanolithography in two dimensions (2D), providing important physical insights into surface science and material science. This work reports a new algorithm to enable construction and display of layer-by-layer 3D structures from SPM images. The algorithm enables alignment of SPM images acquired during layer-by-layer deposition and removal of redundant features and faithfully constructs the deposited 3D structures. The display uses a "see-through" strategy to enable the structure of each layer to be visible. The results demonstrate high spatial accuracy as well as algorithm versatility; users can set parameters for reconstruction and display as per image quality and research needs. To the best of our knowledge, this method represents the first report to enable SPM technology for 3D imaging construction and display. The detailed algorithm is provided to facilitate usage of the same approach in any SPM software. These new capabilities support wide applications of SPM that require 3D image reconstruction and display, such as 3D nanoprinting and 3D additive and subtractive manufacturing and imaging.
Solhi, Mahnaz; Shabani Hamedan, Marziyeh; Salehi, Masoud
2016-01-01
Background: Women-headed households are more exposed to social damages than other women, a condition that remarkably influences these women's health-related quality of life. The present study aimed to investigate the effect of an educational intervention on the quality of life of women-headed households under the protection of the Tehran Welfare Organization in 2015. Methods: In this quasi-experimental study with a control group, 180 women-headed households participated. The sampling method was random allocation. Data collection tools were the standard quality of life questionnaire (WHOQOL-BREF) and a researcher-made questionnaire about the structures of the ecological and educational diagnosis phase of the PRECEDE-PROCEED model. Validity and reliability of the questionnaire were approved in a primary study. Based on the results obtained from the primary study, the intervention was performed in the case group only. Participants were followed one and three months after the intervention. Data were analyzed through SPSS v. 15 software using descriptive and analytical tests. Results: Before the intervention, no significant difference was observed among the mean scores of quality of life, behavioral factors, and knowledge, enabling, and reinforcing factors in the two groups. However, one month and three months after the intervention, a significant difference was observed between the mean scores of these variables (in five instances p<0.001). Conclusion: Intervention through the PRECEDE-PROCEED model improved the quality of life of the women-headed households. The novelty of this study lies in applying such an intervention to the quality of life of women-headed households for the first time.
NASA Astrophysics Data System (ADS)
Foulser-Piggott, R.; Saito, K.; Spence, R.
2012-04-01
Loss estimates produced by catastrophe models are dependent on the quality of the input data, including both the hazard and exposure data. Currently, some of the exposure data input into a catastrophe model is aggregated over an area and therefore an estimate of the risk in this area may have a low level of accuracy. In order to obtain a more detailed and accurate loss estimate, it is necessary to have higher resolution exposure data. However, high resolution exposure data is not commonly available worldwide and therefore methods to infer building distribution and characteristics at higher resolution from existing information must be developed. This study is focussed on the development of disaggregation methodologies for exposure data which, if implemented in current catastrophe models, would lead to improved loss estimates. The new methodologies developed for disaggregating exposure data make use of GIS, remote sensing and statistical techniques. The main focus of this study is on earthquake risk, however the methods developed are modular so that they may be applied to different hazards. A number of different methods are proposed in order to be applicable to different regions of the world which have different amounts of data available. The new methods give estimates of both the number of buildings in a study area and a distribution of building typologies, as well as a measure of the vulnerability of the building stock to hazard. For each method, a way to assess and quantify the uncertainties in the methods and results is proposed, with particular focus on developing an index to enable input data quality to be compared. The applicability of the methods is demonstrated through testing for two study areas, one in Japan and the second in Turkey, selected because of the occurrence of recent and damaging earthquake events. The testing procedure is to use the proposed methods to estimate the number of buildings damaged at different levels following a scenario earthquake event. This enables the results of the models to be compared with real data and the relative performance of the different methodologies to be evaluated. A sensitivity analysis is also conducted for two main reasons. Firstly, to determine the key input variables in the methodology that have the most significant impact on the resulting loss estimate. Secondly, to enable the uncertainty in the different approaches to be quantified and therefore provide a range of uncertainty in the loss estimates.
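One of the simpler building blocks such disaggregation methods can use is proportional (dasymetric-style) allocation of an aggregated building count over grid cells using an ancillary weight layer; a sketch with invented numbers:

```python
import numpy as np

def disaggregate_counts(total_buildings, weights):
    """Distribute an aggregated building count over grid cells in
    proportion to an ancillary weight layer (e.g., remotely sensed
    built-up area per cell)."""
    w = np.asarray(weights, dtype=float)
    return total_buildings * w / w.sum()

# Invented district: 1200 buildings spread over four cells whose weights
# come from a built-up-area layer.
print(disaggregate_counts(1200, [0.1, 0.4, 0.3, 0.2]))
```

The quality of the weight layer then dominates the uncertainty of the result, which is why the study pairs each method with an input-data quality index.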
Remotely Powered Reconfigurable Receiver for Extreme Sensing Platforms
NASA Technical Reports Server (NTRS)
Sheldon, Douglas J. (Inventor)
2017-01-01
Unmanned space programs are currently used to enable scientists to explore and research the furthest reaches of outer space. Systems and methods for low power communication devices in accordance with embodiments of the invention are disclosed, describing a wide variety of low power communication devices capable of remotely collecting, processing, and transmitting data from outer space in order to further mankind's goal of exploring the cosmos. Many embodiments of the invention include a Flash-based FPGA, an energy-harvesting power supply module, a sensor module, and a radio module. By utilizing technologies that withstand the harsh environment of outer space, more reliable low power communication devices can be deployed, enhancing the quality and longevity of the low power communication devices, enabling more data to be gathered and aiding in the exploration of outer space.
Wojdyla, Justyna Aleksandra; Kaminski, Jakub W; Panepucci, Ezequiel; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian
2018-01-01
Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.
Decision support for patient care: implementing cybernetics.
Ozbolt, Judy; Ozdas, Asli; Waitman, Lemuel R; Smith, Janis B; Brennan, Grace V; Miller, Randolph A
2004-01-01
The application of principles and methods of cybernetics permits clinicians and managers to use feedback about care effectiveness and resource expenditure to improve quality and to control costs. Keys to the process are the specification of therapeutic goals and the creation of an organizational culture that supports the use of feedback to improve care. Daily feedback on the achievement of each patient's therapeutic goals provides tactical decision support, enabling clinicians to adjust care as needed. Monthly or quarterly feedback on aggregated goal achievement for all patients on a clinical pathway provides strategic decision support, enabling clinicians and managers to identify problems with supposed "best practices" and to test hypotheses about solutions. Work is underway at Vanderbilt University Medical Center to implement feedback loops in care and management processes and to evaluate the effects.
Method for Estimating the Charge Density Distribution on a Dielectric Surface.
Nakashima, Takuya; Suhara, Hiroyuki; Murata, Hidekazu; Shimoyama, Hiroshi
2017-06-01
High-quality color output from digital photocopiers and laser printers is in strong demand, motivating attempts to achieve fine dot reproducibility and stability. The resolution of a digital photocopier depends on the charge density distribution on the organic photoconductor surface; however, directly measuring the charge density distribution is impossible. In this study, we propose a new electron optical instrument that can rapidly measure the electrostatic latent image on an organic photoconductor surface, which is a dielectric surface, as well as a novel method to quantitatively estimate the charge density distribution on a dielectric surface by combining experimental data obtained from the apparatus with computer simulation. In the computer simulation, an improved three-dimensional boundary charge density method (BCM) is used for electric field analysis in the vicinity of the dielectric material with a charge density distribution. This method enables us to estimate the profile and quantity of the charge density distribution on a dielectric surface with a resolution on the order of microns. Furthermore, the surface potential on the dielectric surface can be immediately calculated using the obtained charge density. This method enables the relation between the charge pattern on the organic photoconductor surface and toner particle behavior to be studied; such an understanding may lead to the development of a new generation of higher-resolution photocopiers.
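The underlying relation that a boundary charge density method discretizes can be written, in standard electrostatics, as the surface integral for the potential due to a surface charge density; discretizing the surface into elements turns charge estimation into a linear inverse problem (this is the generic form, not the paper's exact improved formulation):

\[
V(\mathbf{r}) = \frac{1}{4\pi\varepsilon_0}\int_{S}\frac{\sigma(\mathbf{r}')}{\lvert \mathbf{r}-\mathbf{r}'\rvert}\,\mathrm{d}A'
\quad\Longrightarrow\quad
V_i = \sum_j G_{ij}\,\sigma_j ,
\]

so that, given potentials \(V_i\) constrained by the measured data, the element charge densities \(\sigma_j\) follow from solving the linear system; the surface potential is then recovered directly from the same relation.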
Online analysis: Deeper insights into water quality dynamics in spring water.
Page, Rebecca M; Besmer, Michael D; Epting, Jannis; Sigrist, Jürg A; Hammes, Frederik; Huggenberger, Peter
2017-12-01
We have studied the dynamics of water quality in three karst springs, taking advantage of new technological developments that enable high-resolution measurements of bacterial load (total cell concentration: TCC) as well as online measurements of abiotic parameters. We developed a novel data analysis approach, using self-organizing maps and non-linear projection methods, to approximate the TCC dynamics using the multivariate data sets of abiotic parameter time series, thus providing a method that could be implemented in an online water quality management system for water suppliers. The TCC data, obtained over several months, provided a good basis to study the microbiological dynamics in detail. Alongside the TCC measurements, online abiotic parameter time series, including spring discharge, turbidity, spectral absorption coefficient at 254 nm (SAC254) and electrical conductivity, were obtained. High-density sampling over an extended period of time, i.e. every 45 min for 3 months, allowed a detailed analysis of the dynamics in karst spring water quality. Substantial increases in both the TCC and the abiotic parameters followed precipitation events in the catchment area. Differences between the parameter fluctuations were only apparent when analyzed at a high temporal scale. Spring discharge was always the first to react to precipitation events in the catchment area. Lag times between the onset of precipitation and a change in discharge varied between 0.2 and 6.7 h, depending on the spring and event. TCC mostly reacted second or approximately concurrently with turbidity and SAC254, whereby the fastest observed reaction in the TCC time series occurred after 2.3 h. The methodological approach described here enables a better understanding of bacterial dynamics in karst springs, which can be used to estimate risks and management options to avoid contamination of the drinking water.
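A sketch of the self-organizing map step, using the minisom package as one possible implementation (the grid size, training length and synthetic data are all assumptions, not the paper's configuration):

```python
import numpy as np
from minisom import MiniSom  # one common SOM implementation (assumed here)

rng = np.random.default_rng(0)
# Synthetic stand-ins for the abiotic time series: discharge, turbidity,
# SAC254 and electrical conductivity (values are invented).
X = rng.normal(size=(500, 4))

som = MiniSom(6, 6, input_len=4, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(X, num_iteration=2000)

# Map each observation to its best-matching unit; per-unit TCC statistics
# can then approximate bacterial load from abiotic measurements alone.
bmus = [som.winner(x) for x in X]
print(bmus[:5])
```

Once each map unit is associated with the TCC values of its training observations, new abiotic measurements can be projected onto the map to estimate the bacterial load without waiting for a cell count.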
NASA Astrophysics Data System (ADS)
Etnoyer, P. J.; Hourigan, T. F.; Reser, B.; Monaco, M.
2016-02-01
The growing fleet of telepresence-enabled research vessels equipped with deep-sea imaging technology provides a new opportunity to catalyze and coordinate research efforts among ships. This development is particularly useful for studying the distribution and diversity of deep-sea corals, which occur worldwide from 50 to 8600 m depth. Marine managers around the world seek to conserve these habitats, but require a clear consensus on what types of information are most important and most relevant for marine conservation. The National Oceanic and Atmospheric Administration (NOAA) seeks to develop a reproducible, non-invasive set of ROV methods designed to measure conservation value, or habitat quality, for deep-sea corals and sponges. New tools and methods will be proposed to inform ocean resource management, as well as facilitate research, outreach, and education. A new database schema will be presented, building upon the Ocean Biogeographic Information System (OBIS) and efforts of submersible and ROV teams over the years. Visual information about corals and sponges has proven paramount, particularly high-quality images with standard attributes for marine geology and marine biology, including scientific names, colony size, health, abundance, and density. Improved habitat suitability models can be developed from these data if presence and absence are measured. Recent efforts to incorporate physical sampling into telepresence protocols further increase the value of such information. It is possible for systematic observations with small file sizes to be distributed as geo-referenced, time-stamped still images with environmental variables for water chemistry and a standardized habitat classification. The technique is common among researchers, but a distributed network for this information is still in its infancy. One goal of this presentation is to make progress towards a more integrated network of these measured observations of habitat quality to better facilitate research, education, and conservation of deep-sea corals.
NASA Astrophysics Data System (ADS)
Hannachi, Ammar; Kohler, Sophie; Lallement, Alex; Hirsch, Ernest
2015-04-01
3D modeling of scene contents is of increasing importance for many computer vision based applications. In particular, industrial applications of computer vision require efficient tools for computing this 3D information. Stereo-vision is routinely used as a powerful technique to obtain the 3D outlines of imaged objects from the corresponding 2D images; as a consequence, it provides only a partial description of the scene contents. Structured light based reconstruction techniques, on the other hand, can often compute the 3D surfaces of imaged objects with high accuracy, but the resulting active range data does not characterize the object edges. Thus, to benefit from the strengths of both acquisition techniques, we introduce in this paper promising approaches for computing a complete 3D reconstruction based on the cooperation of two complementary acquisition and processing techniques, in our case stereoscopic and structured light based methods, providing two 3D data sets that describe respectively the outlines and the surfaces of the imaged objects. We present, accordingly, the principles of three fusion techniques and compare them using evaluation criteria related to the nature of the workpiece and the type of application tackled. The proposed fusion methods rely on geometric characteristics of the workpiece, which favour the quality of the registration. Further, the results obtained demonstrate that the developed approaches are well suited to 3D modeling of manufactured parts including free-form surfaces and, consequently, to quality control applications using these 3D reconstructions.
Prest, E I; Hammes, F; Kötzsch, S; van Loosdrecht, M C M; Vrouwenvelder, J S
2013-12-01
Flow cytometry (FCM) is a rapid, cultivation-independent tool to assess and evaluate the bacteriological quality and biological stability of water. Here we demonstrate that a stringent, reproducible staining protocol combined with fixed FCM operational and gating settings is essential for reliable quantification of bacteria and detection of changes in aquatic bacterial communities. Triplicate measurements of diverse water samples with this protocol typically showed relative standard deviation values and 95% confidence interval values below 2.5% on all the main FCM parameters. We propose a straightforward and instrument-independent method for the characterization of water samples based on the combination of bacterial cell concentration and fluorescence distribution. Analysis of the fluorescence distribution (the so-called fluorescence fingerprint) was accomplished firstly through a direct comparison of the raw FCM data and subsequently simplified by quantifying the percentage of large and brightly fluorescent high nucleic acid (HNA) content bacteria in each sample. Our approach enables fast differentiation of dissimilar bacterial communities (less than 15 min from sampling to final result) and allows accurate detection of even small changes in aquatic environments (detection above 3% change). Demonstrative studies on (a) indigenous bacterial growth in water, (b) contamination of drinking water with wastewater, (c) household drinking water stagnation and (d) mixing of two drinking water types unequivocally showed that this FCM approach enables detection and quantification of relevant bacterial water quality changes with high sensitivity. This approach has the potential to be used as a new tool in the drinking water field, e.g. for rapid screening of microbial water quality and stability during water treatment and distribution in networks and premise plumbing.
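With fixed gating settings, the HNA percentage reduces to counting events above a fluorescence threshold; a toy NumPy sketch with simulated LNA/HNA populations (distributions and gate position are invented):

```python
import numpy as np

def hna_percentage(green_fl, threshold):
    """Percentage of events above a fixed green-fluorescence gate,
    i.e. high nucleic acid (HNA) content bacteria."""
    fl = np.asarray(green_fl, dtype=float)
    return 100.0 * np.count_nonzero(fl > threshold) / fl.size

# Simulated SYBR-type fluorescence for LNA and HNA subpopulations.
rng = np.random.default_rng(2)
events = np.concatenate([rng.lognormal(1.0, 0.3, 7000),   # LNA cluster
                         rng.lognormal(2.0, 0.3, 3000)])  # HNA cluster
print(f"%HNA = {hna_percentage(events, threshold=np.exp(1.5)):.1f}")
```

Keeping the gate fixed across instruments and samples is what makes the resulting percentage comparable between laboratories, which is the instrument-independence the abstract emphasizes.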
NASA Astrophysics Data System (ADS)
Zwickl, Titus; Carleer, Bart; Kubli, Waldemar
2005-08-01
In the past decade, sheet metal forming simulation became a well-established tool to predict the formability of parts. In the automotive industry, this has enabled significant reductions in the cost and time of vehicle design and development, and has helped to improve the quality and performance of vehicle parts. However, production stoppages for troubleshooting and unplanned die maintenance, as well as production quality fluctuations, continue to plague manufacturing cost and time. The focus has therefore shifted in recent times beyond mere feasibility to the robustness of the product and process being engineered. Ensuring robustness is the next big challenge for virtual tryout / simulation technology. We introduce new methods, based on systematic stochastic simulations, to visualize the behavior of the part during the whole forming process, in simulation as well as in production. Sensitivity analysis explains the response of the part to changes in influencing parameters. Virtual tryout allows quick exploration of changed designs and conditions. Robust design and manufacturing guarantees quality and process capability for the production process. While conventional simulations helped to reduce development time and cost by ensuring feasible processes, robustness engineering tools have the potential for far greater cost and time savings. Through examples we illustrate how expected and unexpected behavior of deep-drawing parts may be tracked down, identified and assigned to the influential parameters. With this knowledge, defects can be eliminated and springback compensated, for example; the response of the part to uncontrollable noise can be predicted and minimized. The newly introduced methods enable more reliable and predictable stamping processes in general.
SECIMTools: a suite of metabolomics data analysis tools.
Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M
2018-04-20
Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares - discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
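Several of the suite's building blocks (PCA for visualization, feature-wise tests with multiple-testing correction) can be sketched compactly. The following is a hedged, minimal illustration of those generic steps in plain NumPy/SciPy, not the SECIMTools API itself:

```python
import numpy as np
from scipy import stats

def pca_scores(X, k=2):
    """Project samples (rows) of a feature table onto the first k PCs."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def bh_adjust(p):
    """Benjamini-Hochberg adjusted p-values for a vector of raw p-values."""
    p = np.asarray(p)
    m = p.size
    order = np.argsort(p)
    ranked = p[order] * m / np.arange(1, m + 1)
    ranked = np.minimum.accumulate(ranked[::-1])[::-1]  # enforce monotonicity
    q = np.empty(m)
    q[order] = np.clip(ranked, 0.0, 1.0)
    return q

# Toy peak-intensity table: 20 samples x 100 features, 2 groups of 10,
# with 5 genuinely differential features spiked into group 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 100))
groups = np.repeat([0, 1], 10)
X[groups == 1, :5] += 1.5
print("PCA scores shape:", pca_scores(X).shape)
_, p = stats.ttest_ind(X[groups == 0], X[groups == 1], axis=0)
print("features with q < 0.05:", np.where(bh_adjust(p) < 0.05)[0])
```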
Nose, Takayuki; Chatani, Masashi; Otani, Yuki; Teshima, Teruki; Kumita, Shinichirou
2017-03-15
High-dose-rate (HDR) brachytherapy misdeliveries can occur at any institution, and they can cause disastrous results; even a patient's death has been reported. Misdeliveries could be avoided with real-time verification methods. In 1996, we developed a modified C-arm fluoroscopic verification of the HDR Iridium-192 source position to prevent these misdeliveries. This method provided image quality sufficient to detect errors, and it has been in clinical use at our institutions for 20 years. The purpose of the current study is to introduce the mechanisms and validity of our straightforward C-arm fluoroscopic verification method. Conventional X-ray fluoroscopic images are degraded by spurious signals and quantum noise from Iridium-192 photons, which make source verification impractical. To improve image quality, we quadrupled the C-arm fluoroscopic X-ray dose per pulse. The pulse rate was reduced by a factor of 4 to keep the average exposure compliant with Japanese medical regulations. The images were then displayed at quarter-frame rates. Sufficient quality was obtained to enable observation of the source position relative to both the applicators and the anatomy. With this method, 2 errors were detected among 2031 treatment sessions for 370 patients within a 6-year period. With the use of a modified C-arm fluoroscopic verification method, treatment errors that would otherwise have been overlooked were detected in real time. This method should be given consideration for widespread use. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Bootsma, Gregory J.
X-ray scatter in cone-beam computed tomography (CBCT) is known to reduce image quality by introducing image artifacts, reducing contrast, and limiting computed tomography (CT) number accuracy. The extent of the effect of x-ray scatter on CBCT image quality is determined by the shape and magnitude of the scatter distribution in the projections. A method to allay the effects of scatter is imperative to enable application of CBCT to solve a wider domain of clinical problems. The work contained herein proposes such a method. A characterization of the scatter distribution through the use of a validated Monte Carlo (MC) model is carried out. The effects of imaging parameters and compensators on the scatter distribution are investigated. The spectral frequency components of the scatter distribution in CBCT projection sets are analyzed using Fourier analysis and found to reside predominately in the low frequency domain. The exact frequency extents of the scatter distribution are explored for different imaging configurations and patient geometries. Based on the Fourier analysis it is hypothesized the scatter distribution can be represented by a finite sum of sine and cosine functions. The fitting of MC scatter distribution estimates enables the reduction of the MC computation time by diminishing the number of photon tracks required by over three orders of magnitude. The fitting method is incorporated into a novel scatter correction method using an algorithm that simultaneously combines multiple MC scatter simulations. Running concurrent MC simulations while simultaneously fitting the results allows for the physical accuracy and flexibility of MC methods to be maintained while enhancing the overall efficiency. CBCT projection set scatter estimates, using the algorithm, are computed on the order of 1–2 minutes instead of hours or days. Resulting scatter corrected reconstructions show a reduction in artifacts and improvement in tissue contrast and voxel value accuracy.
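The low-frequency finding suggests a simple linear least-squares fit of the noisy Monte Carlo estimate onto a truncated Fourier basis. A minimal 1D sketch of that idea follows (the thesis works with full 2D projections and its own basis-selection and concurrent-simulation machinery, none of which is reproduced here):

```python
import numpy as np

def fourier_design(u, n_terms):
    """Design matrix of a truncated Fourier basis on [0, 1]."""
    cols = [np.ones_like(u)]
    for k in range(1, n_terms + 1):
        cols += [np.cos(2 * np.pi * k * u), np.sin(2 * np.pi * k * u)]
    return np.column_stack(cols)

def fit_scatter(u, noisy, n_terms=4):
    """Least-squares fit of a noisy MC scatter profile to a low-frequency
    Fourier series; returns the smoothed profile evaluated at u."""
    A = fourier_design(np.asarray(u, float), n_terms)
    coef, *_ = np.linalg.lstsq(A, noisy, rcond=None)
    return A @ coef

# Synthetic example: a smooth scatter profile estimated with few photon
# tracks (hence noisy); the fit recovers the smooth low-frequency shape.
u = np.linspace(0.0, 1.0, 256)
truth = 1.0 + 0.5 * np.cos(2 * np.pi * u) + 0.2 * np.sin(4 * np.pi * u)
noisy = truth + np.random.default_rng(1).normal(0.0, 0.3, u.size)
rmse = lambda a: np.sqrt(np.mean((a - truth) ** 2))
print(f"RMSE: raw MC {rmse(noisy):.3f} -> fitted {rmse(fit_scatter(u, noisy)):.3f}")
```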
NASA Astrophysics Data System (ADS)
Liba, Orly; Sorelle, Elliott D.; Sen, Debasish; de La Zerda, Adam
2016-03-01
Optical Coherence Tomography (OCT) enables real-time imaging of living tissues at cell-scale resolution over millimeters in three dimensions. Despite these advantages, functional biological studies with OCT have been limited by a lack of exogenous contrast agents that can be distinguished from tissue. Here we report an approach to functional OCT imaging that implements custom algorithms to spectrally identify unique contrast agents: large gold nanorods (LGNRs). LGNRs exhibit 110-fold greater spectral signal per particle than conventional GNRs, which enables detection of individual LGNRs in water and concentrations as low as 250 pM in the circulation of living mice. This translates to ~40 particles per imaging voxel in vivo. Unlike previous implementations of OCT spectral detection, the methods described herein adaptively compensate for depth and processing artifacts on a per-sample basis. Collectively, these methods enable high-quality noninvasive contrast-enhanced OCT imaging in living subjects, including detection of tumor microvasculature at twice the depth achievable with conventional OCT. Additionally, multiplexed detection of spectrally-distinct LGNRs was demonstrated to observe discrete patterns of lymphatic drainage and to identify individual lymphangions and lymphatic valve functional states. These capabilities provide a powerful platform, termed MOZART, for noninvasive molecular imaging and characterization of tissue at cellular resolution.
Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.
2015-01-01
Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to computer systems of data providers, and away from systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions instead of links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and sub-setting. Only the reduced content required to complete an analysis is then transferred to the user.
Metric Evaluation Pipeline for 3D Modeling of Urban Scenes
NASA Astrophysics Data System (ADS)
Bosch, M.; Leichtman, A.; Chilcott, D.; Goldberg, H.; Brown, M.
2017-05-01
Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state-of-the-art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high-resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger-scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline, developed as publicly available open source software, to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced with open source multi-view stereo methods. Data and software are made publicly available to enable further research and planned benchmarking activities.
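Completeness and correctness metrics of the kind listed above are commonly defined via thresholded nearest-neighbour distances between the reconstructed and ground-truth clouds. The sketch below follows that common convention and is not the pipeline's exact implementation; the tolerance value is illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def completeness_correctness(recon, truth, tol=1.0):
    """Completeness: fraction of ground-truth points within tol of the
    reconstruction. Correctness: fraction of reconstructed points within
    tol of ground truth. tol is in scene units (e.g. meters)."""
    recon, truth = np.asarray(recon), np.asarray(truth)
    d_t2r, _ = cKDTree(recon).query(truth)
    d_r2t, _ = cKDTree(truth).query(recon)
    comp = float(np.mean(d_t2r <= tol))
    corr = float(np.mean(d_r2t <= tol))
    f1 = 2 * comp * corr / (comp + corr) if comp + corr else 0.0
    return comp, corr, f1

# Synthetic check: a reconstruction that misses ~10% of the scene.
rng = np.random.default_rng(0)
truth = rng.uniform(0, 50, size=(5000, 3))
kept = truth[truth[:, 0] < 45]
recon = kept + rng.normal(0, 0.2, kept.shape)
print(completeness_correctness(recon, truth, tol=1.0))
```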
Tuckley, Kushal
2017-01-01
In telemedicine systems, critical medical data are shared over a public communication channel, which increases the risk of unauthorised access to patients' information and underlines the importance of secrecy and authentication for medical data. This paper presents two innovative variations of classical histogram-shift methods that increase the hiding capacity. The first technique divides the image into nonoverlapping blocks and embeds the watermark individually using the histogram method. The second method separates the region of interest and embeds the watermark only in the region of noninterest, preserving the medical information intact; this approach is suited to critical medical cases. The high PSNR (above 45 dB) obtained for both techniques indicates the imperceptibility of the approaches. Experimental results illustrate the superiority of the proposed approaches when compared with other methods based on histogram-shifting techniques. These techniques improve embedding capacity by 5–15% depending on the image type, without affecting the quality of the watermarked image. Both techniques also enable lossless reconstruction of the watermark and the host medical image. A higher embedding capacity makes the proposed approaches attractive for medical image watermarking applications without compromising the quality of the image. PMID:29104744
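Both variants build on the classical histogram-shift embedding step. The sketch below shows that underlying step for a single peak/zero pair on a whole image, assuming an empty bin exists above the peak; the paper's contributions (block-wise embedding, ROI/RONI separation) would apply this same routine per region:

```python
import numpy as np

def hs_embed(img, bits):
    """Classical histogram-shift embedding with one peak/zero bin pair.

    Shifts grey levels strictly between the peak bin and the next empty
    bin up by one, freeing bin peak+1, then encodes one bit per peak
    pixel (0 -> stays at peak, 1 -> moves to peak+1). Reversible given
    the (peak, zero) key."""
    img = img.astype(np.int32)
    hist = np.bincount(img.ravel(), minlength=256)
    peak = int(np.argmax(hist))
    empties = np.where(hist[peak + 1:] == 0)[0]
    assert empties.size > 0 and len(bits) <= hist[peak], "insufficient capacity"
    zero = peak + 1 + int(empties[0])
    out = img.copy()
    out[(img > peak) & (img < zero)] += 1
    flat = out.ravel()                       # view: edits write through to out
    idx = np.flatnonzero(flat == peak)
    flat[idx[:len(bits)]] += np.asarray(bits, np.int32)
    return out.astype(np.uint8), (peak, zero)

img = np.random.default_rng(0).integers(0, 200, (64, 64), dtype=np.uint8)
wm, key = hs_embed(img, bits=[1, 0, 1, 1])
psnr = 10 * np.log10(255.0 ** 2 / np.mean((img.astype(float) - wm) ** 2))
print(f"peak/zero key = {key}, PSNR = {psnr:.1f} dB")
```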
Lucas, Patricia J; Baird, Janis; Arai, Lisa; Law, Catherine; Roberts, Helen M
2007-01-15
The inclusion of qualitative studies in systematic reviews poses methodological challenges. This paper presents worked examples of two methods of data synthesis (textual narrative and thematic), used in relation to one review, with the aim of enabling researchers to consider the strength of different approaches. A systematic review of lay perspectives of infant size and growth was conducted, locating 19 studies (including both qualitative and quantitative). The data extracted from these were synthesised using both a textual narrative and a thematic synthesis. The processes of both methods are presented, showing a stepwise progression to the final synthesis. Both methods led us to similar conclusions about lay views toward infant size and growth. Differences between methods lie in the way they dealt with study quality and heterogeneity. On the basis of the work reported here, we consider textual narrative and thematic synthesis have strengths and weaknesses in relation to different research questions. Thematic synthesis holds most potential for hypothesis generation, but may obscure heterogeneity and quality appraisal. Textual narrative synthesis is better able to describe the scope of existing research and account for the strength of evidence, but is less good at identifying commonality.
A framework for directional and higher-order reconstruction in photoacoustic tomography
NASA Astrophysics Data System (ADS)
Boink, Yoeri E.; Lagerwerf, Marinus J.; Steenbergen, Wiendelt; van Gils, Stephan A.; Manohar, Srirang; Brune, Christoph
2018-02-01
Photoacoustic tomography is a hybrid imaging technique that combines high optical tissue contrast with high ultrasound resolution. Direct reconstruction methods such as filtered back-projection, time reversal and least squares suffer from curved line artefacts and blurring, especially in the case of limited angles or strong noise. In recent years, there has been great interest in regularised iterative methods. These methods employ prior knowledge of the image to provide higher quality reconstructions. However, easy comparisons between regularisers and their properties are limited, since many tomography implementations heavily rely on the specific regulariser chosen. To overcome this bottleneck, we present a modular reconstruction framework for photoacoustic tomography, which enables easy comparisons between regularisers with different properties, e.g. nonlinear, higher-order or directional. We solve the underlying minimisation problem with an efficient first-order primal-dual algorithm. Convergence rates are optimised by choosing an operator-dependent preconditioning strategy. A variety of reconstruction methods are tested on challenging 2D synthetic and experimental data sets. They outperform direct reconstruction approaches for strong noise levels and limited angle measurements, offering immediate benefits in terms of acquisition time and quality. This work provides a basic platform for the investigation of future advanced regularisation methods in photoacoustic tomography.
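The solver at the core of such a framework can be illustrated in its simplest setting. Below is a minimal sketch of a first-order primal-dual (Chambolle-Pock) iteration for denoising with anisotropic total variation, i.e. an identity forward operator; the actual framework generalises this to the photoacoustic operator, other regularisers, and operator-dependent preconditioning, none of which are shown:

```python
import numpy as np

def grad(u):
    """Forward differences with Neumann boundary (2D)."""
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    """Discrete divergence, the negative adjoint of grad."""
    d = np.zeros_like(px)
    d[:-1, :] += px[:-1, :]; d[1:, :] -= px[:-1, :]
    d[:, :-1] += py[:, :-1]; d[:, 1:] -= py[:, :-1]
    return d

def tv_denoise_pdhg(b, lam=0.3, n_iter=200):
    """Chambolle-Pock for min_x 0.5*||x - b||^2 + lam*||grad x||_1."""
    tau = sigma = 1.0 / np.sqrt(8.0)          # tau * sigma * ||grad||^2 <= 1
    x, xbar = b.copy(), b.copy()
    px, py = np.zeros_like(b), np.zeros_like(b)
    for _ in range(n_iter):
        gx, gy = grad(xbar)
        px = np.clip(px + sigma * gx, -lam, lam)   # dual prox: projection
        py = np.clip(py + sigma * gy, -lam, lam)
        x_old = x
        x = (x + tau * div(px, py) + tau * b) / (1.0 + tau)  # primal prox
        xbar = 2.0 * x - x_old
    return x

rng = np.random.default_rng(0)
clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
noisy = clean + 0.3 * rng.normal(size=clean.shape)
rec = tv_denoise_pdhg(noisy)
print("MSE noisy %.4f -> TV-denoised %.4f"
      % (np.mean((noisy - clean) ** 2), np.mean((rec - clean) ** 2)))
```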
Sollazzo, Marco; Baccelloni, Simone; D'Onofrio, Claudio; Bellincontro, Andrea
2018-01-03
This paper provides data for the potential use of a color chart to establish the best quality of white wine grapes destined for postharvest processing. Grechetto, Vermentino and Muscat of Alexandria white wine grape varieties were tested by sampling berries at different dates during the evolution of their quality attributes. A color chart and a reflectance spectrocolorimeter were used in combination with analyses of total carotenoids and chlorophylls in all three varieties and of volatile organic compounds (VOCs) in Grechetto alone. Total carotenoids decreased from 0.85 to 0.76 µg g⁻¹ in Grechetto berries and from 0.70 to 0.46 µg g⁻¹ in Vermentino berries, but increased from 0.70 to 0.80 µg g⁻¹ in Muscat berries during ripening. Total chlorophylls decreased in all varieties, and a strict correlation was found between hue angle (measured by color chart or spectrocolorimeter) and chlorophyll disappearance, with R² ranging from 0.81 to 0.95 depending on the variety. VOCs were only measured in Grechetto grapes, and a significant increase in glycosylation was found with ripening. The concentration of different classes of VOCs exhibited a clear decrease during ripening, except for terpenoids and esters, which showed a peak at the beginning. The benzenoid class reached the highest concentration, almost 50% of the total. Cluster analysis using Ward's method enabled the best grape quality to be identified. This experimental work highlights that a color chart is a cheap and easy-to-use means of defining the right quality stage for white wine grapes. The color chart enabled the enochemical features to be matched with the VOC results for the aromatic maturity of Grechetto. © 2018 Society of Chemical Industry.
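The reported hue-angle/chlorophyll relationship is an ordinary least-squares correlation. A minimal sketch with purely illustrative numbers (not the paper's measurements):

```python
import numpy as np

# Hypothetical paired measurements across ripening dates: hue angle (deg)
# from the color chart vs. total chlorophyll (ug/g fresh weight).
hue = np.array([115.0, 112.0, 108.0, 104.0, 100.0, 96.0])
chl = np.array([0.85, 0.80, 0.72, 0.63, 0.55, 0.47])

slope, intercept = np.polyfit(hue, chl, 1)
pred = slope * hue + intercept
r2 = 1.0 - np.sum((chl - pred) ** 2) / np.sum((chl - chl.mean()) ** 2)
print(f"chlorophyll ~ {slope:.3f} * hue + {intercept:.2f},  R^2 = {r2:.2f}")
```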
Controlling protein adsorption on graphene for cryo-EM using low-energy hydrogen plasmas
Russo, Christopher J.; Passmore, Lori A.
2014-01-01
Despite its many favorable properties as a sample support for biological electron microscopy, graphene is not widely used because its hydrophobicity precludes reliable protein deposition. We describe a method to modify graphene using a low-energy hydrogen plasma, which reduces hydrophobicity without degrading the graphene lattice. We show that the use of plasma-treated graphene enables better control of protein distribution in ice for electron cryo-microscopy and improved image quality by reducing radiation-induced sample motion. PMID:24747813
Quantifying quality in DNA self-assembly
Wagenbauer, Klaus F.; Wachauf, Christian H.; Dietz, Hendrik
2014-01-01
Molecular self-assembly with DNA is an attractive route for building nanoscale devices. The development of sophisticated and precise objects with this technique requires detailed experimental feedback on the structure and composition of assembled objects. Here we report a sensitive assay for the quality of assembly. The method relies on measuring the content of unpaired DNA bases in self-assembled DNA objects using a fluorescent de-Bruijn probe for three-base ‘codons’, which enables a comparison with the designed content of unpaired DNA. We use the assay to measure the quality of assembly of several multilayer DNA origami objects and illustrate the use of the assay for the rational refinement of assembly protocols. Our data suggest that large and complex objects like multilayer DNA origami can be made with high strand integration quality, up to 99%. Beyond DNA nanotechnology, we speculate that the ability to discriminate unpaired from paired nucleic acids in the same macromolecule may also be useful for analysing cellular nucleic acids. PMID:24751596
McGrath, Michael J; Burns, Adrian; Dishongh, Terry
2007-01-01
Using five different commercially available class one and class two Bluetooth dongles, a total of seven homes representing a cross-section of typical Irish homes were surveyed to determine the effects of construction methods, house size, sensor placement, host placement, antenna design and RF interference on the link quality of Bluetooth-enabled sensors. The results obtained indicate high variability in link quality, which is determined by the quality of the BT radio, the placement of the antenna on both the master and the slave, the number of walls that must be penetrated, and the construction materials used in the walls. The placement of the sensor was the single biggest factor in determining link quality, and the type of construction used in the interior walls also had a significant influence. The final factor of significant influence was the type of antenna used on the Bluetooth dongle: an external antenna gave significantly better range performance than an internal antenna.
Mosier, Jarrod; Joseph, Bellal; Sakles, John C
2013-02-01
Since the first remote intubation with telemedicine guidance, wireless technology has advanced to enable more portable methods of telemedicine involvement in remote airway management. Three voice over Internet protocol (VoIP) services were evaluated for quality of image transmitted, data lag, and audio quality with remotely observed and assisted intubations in an academic emergency department. The VoIP clients evaluated were Apple (Cupertino, CA) FaceTime(®), Skype™ (a division of Microsoft, Luxembourg City, Luxembourg), and Tango(®) (TangoMe, Palo Alto, CA). Each client was tested over a Wi-Fi network as well as cellular third generation (3G) (Skype and Tango). All three VoIP clients provided acceptable image and audio quality. There is a significant data lag in image transmission and quality when VoIP clients are used over cellular broadband (3G) compared with Wi-Fi. Portable remote telemedicine guidance is possible with newer technology devices such as a smartphone or tablet, as well as VoIP clients used over Wi-Fi or cellular broadband.
[Evoked potentials extraction based on cross-talk resistant adaptive noise cancellation].
Zeng, Qingning; Li, Ling; Liu, Qinghua; Yao, Dezhong
2004-06-01
As evoked potentials (EPs) are much lower in amplitude than the ongoing EEG, the common averaging technique needs many trigger-locked trials, which precludes single-trial extraction of the EPs. How to acquire EPs from fewer evocations is therefore an important research topic. This paper proposes a cross-talk resistant adaptive noise cancellation method to extract EPs. Together with the use of filtering and the common averaging technique, the present method needs far fewer evocations to acquire EP signals. According to the simulation experiments, only a few evocations, or even a single one, are needed to obtain EP signals of good quality.
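For orientation, a standard single-reference LMS noise canceller is sketched below; the paper's cross-talk resistant variant adds a second adaptive path to cope with EP leakage into the reference channel, which this minimal version omits:

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=16, mu=0.01):
    """Single-reference LMS adaptive noise canceller.

    primary   -- recorded channel: evoked potential (EP) + correlated noise
    reference -- noise-only reference channel
    Returns the error signal, which approximates the EP after convergence.
    """
    primary = np.asarray(primary, float)
    reference = np.asarray(reference, float)
    w = np.zeros(n_taps)                      # adaptive FIR weights
    e = np.zeros_like(primary)
    for n in range(n_taps, primary.size):
        x = reference[n - n_taps:n][::-1]     # most recent sample first
        y = w @ x                             # estimate of noise in primary
        e[n] = primary[n] - y                 # cleaned output
        w += 2.0 * mu * e[n] * x              # LMS weight update
    return e

# Demo: sinusoidal "noise" leaking into a primary channel containing a pulse.
t = np.arange(2000)
noise = np.sin(0.07 * t)
ep = np.exp(-0.5 * ((t - 1500) / 30.0) ** 2)  # synthetic EP waveform
cleaned = lms_cancel(ep + 0.8 * noise, noise, n_taps=8, mu=0.02)
print("residual power:", np.mean((cleaned[1000:] - ep[1000:]) ** 2).round(4))
```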
Dale, Simeon; Levi, Christopher; Ward, Jeanette; Grimshaw, Jeremy M; Jammali-Blasi, Asmara; D'Este, Catherine; Griffiths, Rhonda; Quinn, Clare; Evans, Malcolm; Cadilhac, Dominique; Cheung, N Wah; Middleton, Sandy
2015-02-01
The Quality in Acute Stroke Care (QASC) trial evaluated systematic implementation of clinical treatment protocols to manage fever, sugar, and swallow (FeSS protocols) in acute stroke care. This cluster-randomised controlled trial was conducted in 19 stroke units in Australia. To describe perceived barriers and enablers preimplementation to the introduction of the FeSS protocols and, postimplementation, to determine which of these barriers eventuated as actual barriers. Preimplementation: Workshops were held at the intervention stroke units (n = 10). The first workshop involved senior clinicians who identified perceived barriers and enablers to implementation of the protocols, the second workshop involved bedside clinicians. Postimplementation, an online survey with stroke champions from intervention sites was conducted. A total of 111 clinicians attended the preimplementation workshops, identifying 22 barriers covering four main themes: (a) need for new policies, (b) limited workforce (capacity), (c) lack of equipment, and (d) education and logistics of training staff. Preimplementation enablers identified were: support by clinical champions, medical staff, nursing management and allied health staff; easy adaptation of current protocols, care-plans, and local policies; and presence of specialist stroke unit staff. Postimplementation, only five of the 22 barriers identified preimplementation were reported as actual barriers to adoption of the FeSS protocols, namely, no previous use of insulin infusions; hyperglycaemic protocols could not be commenced without written orders; medical staff reluctance to use the ASSIST swallowing screening tool; poor level of engagement of medical staff; and doctors' unawareness of the trial. The process of identifying barriers and enablers preimplementation allowed staff to take ownership and to address barriers and plan for change. As only five of the 22 barriers identified preimplementation were reported to be actual barriers at completion of the trial, this suggests that barriers are often overcome whilst some are only ever perceived rather than actual barriers. © 2015 Sigma Theta Tau International.
Clinical Quality Performance in U.S. Health Centers
Shi, Leiyu; Lebrun, Lydie A; Zhu, Jinsheng; Hayashi, Arthur S; Sharma, Ravi; Daly, Charles A; Sripipatana, Alek; Ngo-Metzger, Quyen
2012-01-01
Objective To describe current clinical quality among the nation's community health centers and to examine health center characteristics associated with performance excellence. Data Sources National data from the 2009 Uniform Data System. Data Collection/Extraction Methods Health centers reviewed patient records and reported aggregate data to the Uniform Data System. Study Design Six measures were examined: first-trimester prenatal care, childhood immunization completion, Pap tests, low birth weight, controlled hypertension, and controlled diabetes. The top 25 percent performing centers were compared with lower performing (bottom 75 percent) centers on these measures. Logistic regressions were utilized to assess the impact of patient, provider, and institutional characteristics on health center performance. Principal Findings Clinical care and outcomes among health centers were generally comparable to national averages. For instance, 67 percent of pregnant patients received timely prenatal care (national = 68 percent), 69 percent of children achieved immunization completion (national = 67 percent), and 63 percent of hypertensive patients had blood pressure under control (national = 48 percent). Depending on the measure, centers with more uninsured patients were less likely to do well, while centers with more physicians and enabling service providers were more likely to do well. Conclusions Health centers provide quality care at rates comparable to national averages. Performance may be improved by increasing insurance coverage among patients and increasing the ratios of physicians and enabling service providers to patients. PMID:22594465
Bulk Group-III Nitride Crystal Growth in Supercritical Ammonia-Sodium Solutions
NASA Astrophysics Data System (ADS)
Griffiths, Steven Herbert
Gallium nitride (GaN) and its alloys with indium nitride (InGaN) and aluminum nitride (AlGaN), collectively referred to as Group-III Nitride semiconductors, have enabled white solid-state lighting (SSL) sources and power electronic devices. While these technologies have already made a lasting, positive impact on society, improvements in design and efficiency are anticipated by shifting from heteroepitaxial growth on foreign substrates (such as sapphire, Si, SiC, etc.) to homoepitaxial growth on native, bulk GaN substrates. Bulk GaN has not supplanted foreign substrate materials due to the extreme conditions required to achieve a stoichiometric GaN melt (temperatures and pressures in excess of 2200°C and 6 GPa, respectively). The only method used to produce bulk GaN on an industrial scale is hydride vapor phase epitaxy (HVPE), but the high cost of gaseous precursors and relatively poor crystal quality have limited the adoption of this technology. A solution growth technique known as the ammonothermal method has attracted interest from academia and industry alike for its ability to produce bulk GaN boules of exceedingly high crystal quality. The ammonothermal method employs supercritical ammonia (NH3) solutions to dissolve, transport, and crystallize GaN. However, ammonothermal growth pressures are still relatively high (~200 MPa), which has thus far prevented the acquisition of fundamental crystal growth knowledge needed to efficiently (i.e. through data-driven approaches) advance the field. This dissertation focused on addressing the gaps in the literature through two studies employing in situ fluid temperature analysis. The first study focused on identifying the solubility of GaN in supercritical NH3-Na solutions. The design and utilization of in situ and ex situ monitoring equipment enabled the first reports of the two-phase nature of supercritical NH3-Na solutions, and of Ga-alloying of Ni-containing autoclave components. The effects of these error sources on the gravimetric determination of GaN solubility were explored in detail. The second study was aimed at correlating autoclave dissolution and growth zone fluid temperatures with bulk GaN crystal growth kinetics, crystal quality, and impurity incorporation. The insights resulting from this analysis include the identification of the barrier between mass transport and surface integration-limited GaN growth regimes, GaN crystal shape evolution with fluid temperature, the sensitivity of (0001)-orientation crystal quality to fluid temperature, and impurity-specific incorporation activated from the dissolution and growth zones of the autoclave. The results of the aforementioned studies motivated a paradigm shift in ammonothermal growth: a fundamentally different crystal growth approach involving isothermal solutions and tailor-made Group-III alloy source materials was developed and demonstrated. This growth method enabled reduced impurity incorporation compared with traditional ammonothermal GaN growth, and the realization of bulk, ternary Group-III Nitride crystals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeraatkar, Navid; Farahani, Mohammad Hossein; Rahmim, Arman
Purpose: Given increasing efforts in biomedical research utilizing molecular imaging methods, development of dedicated high-performance small-animal SPECT systems has been growing rapidly in the last decade. In the present work, we propose and assess an alternative concept for SPECT imaging enabling desktop open-gantry imaging of small animals. Methods: The system, PERSPECT, consists of an imaging desk, with a set of tilted detector and pinhole collimator placed beneath it. The object to be imaged is simply placed on the desk. Monte Carlo (MC) and analytical simulations were utilized to accurately model and evaluate the proposed concept and design. Furthermore, a dedicated image reconstruction algorithm, finite-aperture-based circular projections (FABCP), was developed and validated for the system, enabling more accurate modeling of the system and higher quality reconstructed images. Image quality was quantified as a function of different tilt angles in the acquisition and number of iterations in the reconstruction algorithm. Furthermore, more complex phantoms including Derenzo, Defrise, and mouse whole body were simulated and studied. Results: The sensitivity of the PERSPECT was 207 cps/MBq. It was quantitatively demonstrated that for a tilt angle of 30°, comparable image qualities were obtained in terms of normalized squared error, contrast, uniformity, noise, and spatial resolution measurements, the latter at ∼0.6 mm. Furthermore, quantitative analyses demonstrated that 3 iterations of FABCP image reconstruction (16 subsets/iteration) led to optimally reconstructed images. Conclusions: The PERSPECT, using a novel imaging protocol, can achieve comparable image quality performance in comparison with a conventional pinhole SPECT with the same configuration. The dedicated FABCP algorithm, which was developed for reconstruction of data from the PERSPECT system, can produce high quality images for small-animal imaging via accurate modeling of the system as incorporated in the forward- and back-projection steps. Meanwhile, the developed MC model and the analytical simulator of the system can be applied for further studies on development and evaluation of the system.
Sadler, Euan; Fisher, Helen R.; Maher, John; Wolfe, Charles D. A.; McKevitt, Christopher
2016-01-01
Introduction Translational research is central to international health policy, research and funding initiatives. Despite increasing use of the term, the translation of basic science discoveries into clinical practice is not straightforward. This systematic search and narrative synthesis aimed to examine factors enabling or hindering translational research from the perspective of basic and clinician scientists, a key stakeholder group in translational research, and to draw policy-relevant implications for organisations seeking to optimise translational research opportunities. Methods and Results We searched SCOPUS and Web of Science from inception until April 2015 for papers reporting scientists’ views of the factors they perceive as enabling or hindering the conduct of translational research. We screened 8,295 papers from electronic database searches and 20 papers from hand searches and citation tracking, identifying 26 studies of qualitative, quantitative or mixed method designs. We used a narrative synthesis approach and identified the following themes: 1) differing concepts of translational research 2) research processes as a barrier to translational research; 3) perceived cultural divide between research and clinical care; 4) interdisciplinary collaboration as enabling translation research, but dependent on the quality of prior and current social relationships; 5) translational research as entrepreneurial science. Across all five themes, factors enabling or hindering translational research were largely shaped by wider social, organisational, and structural factors. Conclusion To optimise translational research, policy could consider refining translational research models to better reflect scientists’ experiences, fostering greater collaboration and buy in from all types of scientists. Organisations could foster cultural change, ensuring that organisational practices and systems keep pace with the change in knowledge production brought about by the translational research agenda. PMID:27490373
Multilayer ultra thick resist development for MEMS
NASA Astrophysics Data System (ADS)
Washio, Yasushi; Senzaki, Takahiro; Masuda, Yasuo; Saito, Koji; Obiya, Hiroyuki
2005-05-01
MEMS (Micro-Electro-Mechanical Systems) are produced through a process technology called micro-machining. There are two distinct methods of manufacturing a MEMS product: one forms a permanent film through photolithography; the other forms a non-permanent resist film by photolithography followed by an etching or plating process. This three-dimensional ultra-fine processing technology is based on photolithography and is combined with assembly steps, such as anodic bonding, and with post-lithography processes such as etching and plating. Currently, ORDYL PR-100 (dry-film type) is used for the permanent resist process; TOK has also developed TMMR S2000 (liquid type) and TMMF S2000 (dry-film type), together with a new process utilizing these resists. The electro-forming method based on photolithography was developed as one way of enabling high-resolution, high-aspect-ratio formation. In recent years, multilayer structures that were conventionally difficult to manufacture have become possible through our joint material and equipment (M&E) development project. As the material for electro-forming, a chemically amplified resist was confirmed to be optimal on account of its reaction mechanism, since it is easily removed by the cleaning solution; moreover, a new process enabled multiple plating formations with this resist. On the equipment side, TOK developed an applicator capable of coating films of 500 μm or more, and a developer that achieves high throughput and quality. Detailed plating formations with differing paths, as well as air wiring, can be realized through M&E. In summary, compared with metal-mold plating, the resist-based electro-forming method enables high-resolution, high-aspect-ratio patterns at low cost, and this process is expected to open up a wide range of further possibilities.
2014-01-01
Time, quality, and cost are three important but mutually conflicting objectives in a building construction project, and optimizing them is a tough challenge for project managers since they are measured in different units. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives. The model follows the project breakdown structure method, in which the task resources of a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is generated from the correlations between construction activities. A genetic algorithm tool is applied in the model to solve the resulting nonlinear time-cost-quality problems. The construction of a three-storey house is used as an example to illustrate the implementation of the model, demonstrate its advantages in optimizing the trade-off of construction time, cost, and quality, and help make winning decisions in construction practice. The computational time-cost-quality curves, presented as visual graphics in the case study, confirm traditional cost-time assumptions and demonstrate the sophistication of this time-cost-quality trade-off model. PMID:24672351
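A minimal sketch of the optimization core follows: each activity picks one of several resource options, and a genetic algorithm searches for the assignment minimizing a weighted time-cost-quality score. All numbers, weights, and the simple scalarized fitness are illustrative assumptions; the paper's model additionally encodes correlations between activities:

```python
import numpy as np

rng = np.random.default_rng(42)

# 12 activities, each with 3 hypothetical resource options (time, cost, quality).
options = np.array([[[10, 5.0, 0.90], [8, 6.5, 0.85], [6, 9.0, 0.80]]] * 12)
n_act, n_opt = options.shape[:2]
w_t, w_c, w_q = 0.4, 0.4, 0.2                    # decision-maker weights
t_max = options[..., 0].max(axis=1).sum()        # normalisation constants
c_max = options[..., 1].max(axis=1).sum()

def score(pop):
    """Lower is better: weighted normalised time + cost - quality."""
    sel = options[np.arange(n_act), pop]         # (pop_size, n_act, 3)
    t = sel[..., 0].sum(axis=-1) / t_max
    c = sel[..., 1].sum(axis=-1) / c_max
    q = sel[..., 2].mean(axis=-1)
    return w_t * t + w_c * c - w_q * q

pop = rng.integers(n_opt, size=(60, n_act))      # random initial chromosomes
for _ in range(200):
    parents = pop[np.argsort(score(pop))[:30]]   # truncation selection
    cut = rng.integers(1, n_act, size=30)        # one-point crossover
    kids = np.where(np.arange(n_act) < cut[:, None], parents, parents[::-1])
    mut = rng.random(kids.shape) < 0.02          # random mutation
    kids[mut] = rng.integers(n_opt, size=mut.sum())
    pop = np.vstack([parents, kids])

best = options[np.arange(n_act), pop[np.argmin(score(pop))]]
print("time", best[:, 0].sum(), "cost", best[:, 1].sum(),
      "mean quality", best[:, 2].mean().round(3))
```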
A Framework Incorporating Community Preferences in Use ...
The report is intended to assist water quality officials, watershed managers, members of stakeholder groups, and other interested individuals in fully evaluating ecological and socioeconomic objectives and the gains and losses that are often involved in use-attainment decisions. In addition, this report enables local, state, and tribal managers to better understand the benefits, as well as the costs, of attaining high water quality, and to incorporate community preferences in decision-making. Specific objectives are (1) to provide an introduction to the CWA and WQS regulation and analyses related to setting or changing designated uses; (2) to create a basis for understanding the relationship between use-attainment decisions and the effects on ecosystems, ecosystem services, and ecological benefits; (3) to serve as a reference for methods that elicit or infer preferences for benefits and costs related to attaining uses; and (4) to present a process for incorporating new approaches in water quality decisions.
NASA Astrophysics Data System (ADS)
Palma, V.; Carli, M.; Neri, A.
2011-02-01
In this paper, a multi-view distributed video coding scheme for mobile applications is presented; specifically, a new technique for fusing temporal and spatial side information in the Zernike moment domain is proposed. Distributed video coding introduces a flexible architecture that enables the design of video encoders of very low complexity compared with their traditional counterparts. The main goal of our work is to generate at the decoder the side information that optimally blends temporal and inter-view data. Multi-view distributed coding performance strongly depends on the quality of the side information built at the decoder, so to improve this quality a spatial view compensation/prediction in the Zernike moment domain is applied. Spatial and temporal motion activity are fused to obtain the overall side information. The proposed method has been evaluated by its rate-distortion performance under different inter-view and temporal estimation quality conditions.
Designing a Broadband Pump for High-Quality Micro-Lasers via Modified Net Radiation Method.
Nechayev, Sergey; Reusswig, Philip D; Baldo, Marc A; Rotschild, Carmel
2016-12-07
High-quality micro-lasers are key ingredients in non-linear optics, communication, sensing and low-threshold solar-pumped lasers. However, such micro-lasers exhibit negligible absorption of free-space broadband pump light. Recently, this limitation was lifted by cascade energy transfer, in which the absorption and quality factor are modulated with wavelength, enabling non-resonant pumping of high-quality micro-lasers and solar-pumped laser to operate at record low solar concentration. Here, we present a generic theoretical framework for modeling the absorption, emission and energy transfer of incoherent radiation between cascade sensitizer and laser gain media. Our model is based on linear equations of the modified net radiation method and is therefore robust, fast converging and has low complexity. We apply this formalism to compute the optimal parameters of low-threshold solar-pumped lasers. It is revealed that the interplay between the absorption and self-absorption of such lasers defines the optimal pump absorption below the maximal value, which is in contrast to conventional lasers for which full pump absorption is desired. Numerical results are compared to experimental data on a sensitized Nd3+:YAG cavity, and quantitative agreement with theoretical models is found. Our work modularizes the gain and sensitizing components and paves the way for the optimal design of broadband-pumped high-quality micro-lasers and efficient solar-pumped lasers.
NASA Astrophysics Data System (ADS)
Hsieh, M.; Zhao, L.; Ma, K.
2010-12-01
The finite-frequency approach enables seismic tomography to fully utilize the spatial and temporal distributions of the seismic wavefield to improve resolution. In achieving this goal, one of the most important tasks is to compute efficiently and accurately the (Fréchet) sensitivity kernels of finite-frequency seismic observables, such as traveltime and amplitude, with respect to perturbations of the model parameters. In the scattering-integral approach, the Fréchet kernels are expressed in terms of the strain Green tensors (SGTs), and a pre-established SGT database is necessary to achieve practical efficiency for a three-dimensional reference model, in which the SGTs must be calculated numerically. Methods for computing Fréchet kernels for seismic velocities have long been established. In this study, we develop algorithms based on the finite-difference method for calculating Fréchet kernels for the quality factor Qμ and for seismic boundary topography. Kernels for the quality factor can be obtained in a way similar to those for seismic velocities with the help of the Hilbert transform. The effects of seismic velocities and the quality factor on either traveltime or amplitude are coupled. Kernels for boundary topography involve the spatial gradient of the SGTs, and they also exhibit interesting finite-frequency characteristics. Examples of quality factor and boundary topography kernels are shown for a realistic model of the Taiwan region with three-dimensional velocity variations as well as surface and Moho discontinuity topography.
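The linearized relation underlying these kernels can be written generically (a hedged sketch in standard notation, not the paper's own equations): a finite-frequency measurement perturbation δd, whether traveltime or amplitude, accumulates volumetric and boundary contributions as

\[
\delta d \;=\; \int_V \left[\, K_\alpha(\mathbf{x})\,\frac{\delta\alpha}{\alpha}(\mathbf{x}) \;+\; K_\beta(\mathbf{x})\,\frac{\delta\beta}{\beta}(\mathbf{x}) \;+\; K_{Q_\mu}(\mathbf{x})\,\delta Q_\mu^{-1}(\mathbf{x}) \,\right] \mathrm{d}^3\mathbf{x} \;+\; \int_\Sigma K_h(\mathbf{x})\,\delta h(\mathbf{x})\,\mathrm{d}^2\mathbf{x},
\]

where α and β are the P- and S-wave speeds, δh is the topographic perturbation of a discontinuity Σ (e.g. the surface or the Moho), and, per the coupling noted above, the attenuation kernel K_{Q_μ} for traveltime involves the Hilbert transform of the time functions entering the velocity kernels.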
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dekker, A.G.; Hoogenboom, H.J.; Rijkeboer, M.
1997-06-01
Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air/water interface correction, and application of water quality algorithms. A prototype software environment has recently been developed that enables the user to perform and control these processing steps. Its main parts are: (i) access to the MODTRAN 3 radiative transfer code for removing atmospheric and air-water interface influences, (ii) a tool for analyzing algorithms for estimating water quality, and (iii) a spectral database containing apparent and inherent optical properties and associated water quality parameters. The use of the software is illustrated by applying the implemented chlorophyll-estimation algorithms to data from a spectral library of Dutch inland waters with CHL ranging from 1 to 500 µg l⁻¹. The algorithms currently implemented in the Toolkit software are recommended for optically simple waters; for optically complex waters, development of more advanced retrieval methods is required.
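Implemented water quality algorithms of this kind are typically semi-empirical band ratios. The sketch below shows a generic red-edge ratio form often used for turbid inland waters; the function name and coefficients are illustrative placeholders, not the Toolkit's actual algorithm:

```python
import numpy as np

def chl_band_ratio(r705, r665, a=25.0, b=1.0):
    """Generic semi-empirical red-edge ratio algorithm for turbid inland
    waters: CHL [ug/l] ~ a * (R(705)/R(665))**b. The coefficients a and b
    are illustrative placeholders; real values come from regression
    against in situ samples such as the spectral library described above.
    """
    return a * (np.asarray(r705, float) / np.asarray(r665, float)) ** b

# Subsurface reflectances at 705 and 665 nm -> estimated CHL of ~37.5 ug/l
print(chl_band_ratio(0.030, 0.020))
```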
Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing
González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto
2015-01-01
Objective: to identify aspects for improvement of the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate students of Nursing. Method: prospective longitudinal study conducted in a population of 60 second-year Nursing students, based on registration data from which quality indicators evaluating the acquisition of skills were obtained, with descriptive and inferential analysis. Results: nine items and nine learning activities included in the assessment tools were identified that did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practice unit (p<0.05). Conclusion: the analysis of the evaluation tools used in the subject "Nursing Care in Welfare Processes" of the undergraduate course analyzed enabled the detection of areas for improvement in the teaching-learning process. The challenge of education in nursing is to achieve the best clinical research and educational results, in order to improve the quality of education and health care. PMID:26444173
Wang, Ying; Yang, Zaixing; Wu, Xiaofeng; Han, Ning; Liu, Hanyu; Wang, Shuobo; Li, Jun; Tse, WaiMan; Yip, SenPo; Chen, Yunfa; Ho, Johnny C
2016-12-01
Growing high-quality, low-cost GaAs nanowires (NWs) and fabricating high-performance NW solar cells by facile means are important steps towards cost-effective next-generation photovoltaics. In this work, highly crystalline, dense, and long GaAs NWs are successfully synthesized on non-crystalline SiO2 substrates using a two-source configuration in a simple solid-source chemical vapor deposition system. The high V/III ratio and precursor concentration enabled by this two-source configuration significantly benefit NW growth and suppress crystal defect formation compared with the conventional one-source system. Since fewer NW crystal defects result in fewer electrons being trapped by the surface oxides, the p-type conductivity is greatly enhanced, as revealed by the electrical characterization of the fabricated NW devices. Furthermore, both individual single NWs and high-density NW parallel arrays achieved by contact printing can be effectively fabricated into Schottky-barrier solar cells simply by employing asymmetric Ni-Al contacts, with an open-circuit voltage of ~0.3 V. All these results indicate the technological promise of these high-quality two-source-grown GaAs NWs, especially for the realization of facile Schottky solar cells utilizing asymmetric Ni-Al contacts.
Ferreira, Ana P; Tobyn, Mike
2015-01-01
In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets thus contributing to increase product and process understanding which is at the core of the Food and Drug Administration's Process Analytical Tools (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used: principal component analysis and partial least squares regression, their advantages, common pitfalls and requirements for their effective use. That is followed with an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, from formulation optimization during development to the application of quality-by-design principles to improve manufacture of existing commercial products.
Tshitenge, Dieudonné Tshitenge; Ioset, Karine Ndjoko; Lami, José Nzunzu; Ndelo-di-Phanzu, Josaphat; Mufusama, Jean-Pierre Koy Sita; Bringmann, Gerhard
2016-04-01
Herbal medicines are the most widely used type of medical drugs globally. Their high cultural acceptability is due to the safety and efficiency experienced over centuries of use. Many of them are still phytochemically under-investigated and are used without standardization or quality control. Choosing SIROP KILMA, an authorized Congolese antimalarial phytomedicine, as a model case, our study describes an interdisciplinary approach to the rational quality assessment of herbal drugs in general. It combines an authentication step for the herbal remedy prior to any fingerprinting, the isolation of the major constituents, the development and validation of an HPLC-DAD analytical method with internal markers, and the application of the method to several batches of the herbal medicine (here KILMA), thus permitting the establishment of a quantitative fingerprint. From the constituent plants of KILMA, acteoside, isoacteoside, stachannin A, and pectolinarigenin-7-O-glucoside were isolated, and acteoside was used as the prime marker for the validation of the analytical method. This study contributes to the efforts of the WHO to establish standards enabling the analytical evaluation of herbal materials. Moreover, the paper describes the first phytochemical and analytical report on a marketed Congolese phytomedicine. Copyright © 2016 Elsevier B.V. All rights reserved.
Regional Principal Color Based Saliency Detection
Lou, Jing; Ren, Mingwu; Wang, Huan
2014-01-01
Saliency detection is widely used in many visual applications such as image segmentation, object recognition and classification. In this paper, we introduce a new method to detect salient objects in natural images. The approach is based on a regional principal color contrast model, which incorporates low-level and medium-level visual cues. The method allows a simple computation of color features and of two categories of spatial relationships into a saliency map, achieving higher F-measure rates. At the same time, we present an interpolation approach to evaluate the resulting curves and analyze parameter selection. Our method enables the effective computation of arbitrary-resolution images. Experimental results on a saliency database show that our approach produces high-quality saliency maps and performs favorably against ten saliency detection algorithms. PMID:25379960
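Region-contrast saliency methods of this family typically score each segmented region by its color distance to all other regions, weighted by region size and spatial proximity. A minimal sketch under those assumptions (not the paper's exact model, which adds its principal-color representation):

```python
import numpy as np

def region_contrast_saliency(colors, centers, sizes, sigma=0.4):
    """Score each segmented region by its color contrast to all other
    regions, weighted by region size and spatial proximity.

    colors  -- (n, 3) mean color per region (e.g. in Lab space)
    centers -- (n, 2) region centroids, normalized to [0, 1]
    sizes   -- (n,) region areas in pixels
    """
    colors = np.asarray(colors, float)
    centers = np.asarray(centers, float)
    sizes = np.asarray(sizes, float)
    dc = np.linalg.norm(colors[:, None] - colors[None], axis=2)    # color distance
    ds = np.linalg.norm(centers[:, None] - centers[None], axis=2)  # spatial distance
    w = sizes[None, :] * np.exp(-(ds ** 2) / sigma ** 2)  # big, nearby regions weigh more
    sal = (w * dc).sum(axis=1)
    return (sal - sal.min()) / (np.ptp(sal) + 1e-12)      # normalize to [0, 1]

# Three toy regions: two similar background regions and one distinct object.
cols = [[50, 0, 0], [52, 2, 1], [70, 40, -30]]
cents = [[0.3, 0.5], [0.7, 0.5], [0.5, 0.5]]
print(region_contrast_saliency(cols, cents, sizes=[2000, 2000, 500]))
```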
Johanson, Helene C; Hyland, Valentine; Wicking, Carol; Sturm, Richard A
2009-04-01
We describe here a method for DNA elution from buccal cells and whole blood both collected onto Whatman FTA technology, using methanol fixation followed by an elution PCR program. Extracted DNA is comparable in quality to published Whatman FTA protocols, as judged by PCR-based genotyping. Elution of DNA from the dried sample is a known rate-limiting step in the published Whatman FTA protocol; this method enables the use of each 3-mm punch of sample for several PCR reactions instead of the standard, one PCR reaction per sample punch. This optimized protocol therefore extends the usefulness and cost effectiveness of each buccal swab sample collected, when used for nucleic acid PCR and genotyping.
An Integrated Decision Support System for Water Quality Management of Songhua River Basin
NASA Astrophysics Data System (ADS)
Zhang, Haiping; Yin, Qiuxiao; Chen, Ling
2010-11-01
In the Songhua River Basin of China, many water resource and water environment conflicts interact. A decision support system (DSS) for water quality management has been established for the basin. The system is characterized by the incorporation of a numerical water quality model system into a conventional water quality management system, which usually consists of a geographic information system (GIS), WebGIS technology, a database system, and network technology. The model system is built on the DHI MIKE software and comprises a basin rainfall-runoff module, a basin pollution load evaluation module, a river hydrodynamic module, and a river water quality module. The DSS provides a friendly graphical user interface that enables the rapid and transparent calculation of various water quality management scenarios, as well as convenient access to, and interpretation of, the modeling results to assist decision-making.
Odusola, Aina O.; Stronks, Karien; Hendriks, Marleen E.; Schultsz, Constance; Akande, Tanimola; Osibogun, Akin; van Weert, Henk; Haafkens, Joke A.
2016-01-01
Background Hypertension is a highly prevalent risk factor for cardiovascular diseases in sub-Saharan Africa (SSA) that can be modified through timely and long-term treatment in primary care. Objective We explored perspectives of primary care staff and health insurance managers on enablers and barriers for implementing high-quality hypertension care, in the context of a community-based health insurance programme in rural Nigeria. Design Qualitative study using semi-structured individual interviews with primary care staff (n = 11) and health insurance managers (n = 4). Data were analysed using standard qualitative techniques. Results Both stakeholder groups perceived health insurance as an important facilitator for implementing high-quality hypertension care because it covered costs of care for patients and provided essential resources and incentives to clinics: guidelines, staff training, medications, and diagnostic equipment. Perceived inhibitors included the following: high staff workload; administrative challenges at facilities; discordance between healthcare provider and insurer on how health insurance and provider payment methods work; and insufficient fit between some guideline recommendations and tools for patient education and characteristics/needs of the local patient population. Perceived strategies to address inhibitors included the following: task-shifting; adequate provider payment benchmarking; good provider–insurer relationships; automated administration systems; and tailoring guidelines/patient education. Conclusions By providing insights into perspectives of primary care providers and health insurance managers, this study offers information on potential strategies for implementing high-quality hypertension care for insured patients in SSA. PMID:26880152
Yan, Yin-zhuo; Qian, Yu-lin; Ji, Feng-di; Chen, Jing-yu; Han, Bei-zhong
2013-05-01
Koji-making is a key process in the production of high-quality soy sauce. The microbial composition during koji-making was investigated by culture-dependent and culture-independent methods to determine the predominant bacterial and fungal populations. The culture-dependent methods used were direct culture with colony morphology observation, and PCR amplification of 16S/26S rDNA fragments followed by sequencing analysis. The culture-independent method was based on the analysis of 16S/26S rDNA clone libraries. There were differences between the results obtained by the different methods; however, sufficient overlap existed to identify potentially significant microbial groups. Sixteen and 20 different bacterial species were identified using culture-dependent and culture-independent methods, respectively; 7 species were identified by both. The most predominant bacterial genera were Weissella and Staphylococcus. Six different fungal species were identified by each of the culture-dependent and culture-independent methods, of which only 3 were identified by both. The most predominant fungi were Aspergillus and Candida species. This work illustrates the importance of a comprehensive polyphasic approach to analyzing microbial composition during soy sauce koji-making, knowledge of which will enable further optimization of the microbial composition and quality control of koji to upgrade traditional Chinese soy sauce products. Copyright © 2013 Elsevier Ltd. All rights reserved.
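The degree of overlap between method-specific species lists is the key quantity in such a comparison; as a minimal sketch (with illustrative species names, not those reported in the study), it reduces to set operations:

```python
# Minimal sketch: quantifying overlap between species lists obtained by
# culture-dependent and culture-independent methods. Species names are
# illustrative placeholders, not the study's actual identifications.
culture_dependent = {"Weissella confusa", "Staphylococcus gallinarum",
                     "Aspergillus oryzae", "Candida tropicalis"}
culture_independent = {"Weissella cibaria", "Staphylococcus gallinarum",
                       "Aspergillus oryzae", "Candida tropicalis",
                       "Bacillus subtilis"}

shared = culture_dependent & culture_independent
union = culture_dependent | culture_independent
print(f"shared: {len(shared)}, total distinct: {len(union)}")
print(f"Jaccard overlap: {len(shared) / len(union):.2f}")
```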
A review of distributed parameter groundwater management modeling methods
Gorelick, Steven M.
1983-01-01
Models which solve the governing groundwater flow or solute transport equations in conjunction with optimization techniques, such as linear and quadratic programming, are powerful aquifer management tools. Groundwater management models fall into two general categories: hydraulic management, and policy evaluation and water allocation. Groundwater hydraulic management models enable the determination of optimal locations and pumping rates of numerous wells under a variety of restrictions placed upon local drawdown, hydraulic gradients, and water production targets. Groundwater policy evaluation and allocation models can be used to study the influence of institutional policies, such as taxes and quotas, upon regional groundwater use. Furthermore, fairly complex groundwater-surface water allocation problems can be handled using system decomposition and multilevel optimization. Experience from the few real-world applications of groundwater optimization-management techniques is summarized. Classified separately are methods for groundwater quality management aimed at optimal waste disposal in the subsurface. This classification comprises steady-state and transient management models that determine disposal patterns in such a way that water quality is protected at supply locations. Classes of research missing from the literature are groundwater quality management models involving nonlinear constraints, models which join groundwater hydraulic and quality simulations with political-economic management considerations, and management models that include parameter uncertainty.
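As an illustration of the hydraulic management category, the following minimal sketch poses well-rate optimization as a linear program; the response-matrix coefficients and drawdown limits are hypothetical stand-ins for values that would come from a calibrated flow model:

```python
# A minimal sketch of a groundwater hydraulic management model: maximize total
# pumping from three wells subject to drawdown limits at two control points.
# The response coefficients r[i][j] and limits are hypothetical.
import numpy as np
from scipy.optimize import linprog

r = np.array([[0.8, 0.3, 0.1],    # drawdown at control point 1 per unit pumping
              [0.2, 0.5, 0.9]])   # drawdown at control point 2 per unit pumping
s_max = np.array([3.0, 4.0])      # allowable drawdown (m) at each control point

# linprog minimizes, so negate the objective to maximize total pumping;
# each well's rate is bounded between 0 and 5 (arbitrary units).
res = linprog(c=-np.ones(3), A_ub=r, b_ub=s_max, bounds=[(0, 5)] * 3)
print("optimal pumping rates:", res.x, "total:", -res.fun)
```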
CZT sensors for Computed Tomography: from crystal growth to image quality
NASA Astrophysics Data System (ADS)
Iniewski, K.
2016-12-01
Recent advances in Traveling Heater Method (THM) growth and device fabrication, which require additional processing steps, have enabled dramatic improvements in hole transport properties and reduced polarization effects in Cadmium Zinc Telluride (CZT) material. As a result, high-flux operation of CZT sensors at rates in excess of 200 Mcps/mm² is now possible and has enabled multiple medical imaging companies to start building prototype Computed Tomography (CT) scanners. CZT sensors are also finding new commercial applications in non-destructive testing (NDT) and baggage scanning. In order to prepare for high-volume commercial production, we are moving from individual tile processing to whole-wafer processing using silicon methodologies such as waxless processing and cassette-based/touchless wafer handling. We have been developing parametric-level screening at the wafer stage to ensure high wafer quality before detector fabrication in order to maximize production yields. These process improvements enable us, and other CZT manufacturers who pursue similar developments, to provide high-volume production for photon counting applications in an economically feasible manner. CZT sensors are capable of delivering both high count rates and high-resolution spectroscopic performance, although it is challenging to achieve both of these attributes simultaneously. The paper discusses material challenges, detector design trade-offs and ASIC architectures required to build cost-effective CZT-based detection systems. Photon counting ASICs are an essential part of the integrated module platforms, as the charge-sensitive electronics needs to deal with charge-sharing and pile-up effects.
Hilligoss, Brian; Zheng, Kai
2013-01-01
To examine how clinicians on the receiving end of admission handoffs use electronic health records (EHRs) in preparation for those handoffs and to identify the kinds of impacts such usage may have. This analysis is part of a two-year ethnographic study of emergency department (ED) to internal medicine admission handoffs at a tertiary teaching and referral hospital. Qualitative data were gathered and analyzed iteratively, following a grounded theory methodology. Data collection methods included semi-structured interviews (N = 48), observations (349 hours), and recording of handoff conversations (N = 48). Data analyses involved coding, memo writing, and member checking. The use of EHRs has enabled an emerging practice that we refer to as pre-handoff "chart biopsy": the activity of selectively examining portions of a patient's health record to gather specific data or information about that patient or to get a broader sense of the patient and the care that patient has received. Three functions of chart biopsy are identified: getting an overview of the patient; preparing for handoff and subsequent care; and defending against potential biases. Chart biopsies appear to impact important clinical and organizational processes. Among these are the nature and quality of handoff interactions, and the quality of care, including the appropriateness of dispositioning of patients. Chart biopsy has the potential to enrich collaboration and to enable the hospital to act safely, efficiently, and effectively. Implications for handoff research and for the design and evaluation of EHRs are also discussed.
Meng, Bowen; Lee, Ho; Xing, Lei; Fahimian, Benjamin P.
2013-01-01
Purpose: X-ray scatter results in a significant degradation of image quality in computed tomography (CT), representing a major limitation in cone-beam CT (CBCT) and large field-of-view diagnostic scanners. In this work, a novel scatter estimation and correction technique is proposed that utilizes peripheral detection of scatter during the patient scan to simultaneously acquire image and patient-specific scatter information in a single scan, in conjunction with a proposed compressed sensing scatter recovery technique to reconstruct and correct for the patient-specific scatter in the projection space. Methods: The method consists of the detection of patient scatter at the edges of the field of view (FOV) followed by measurement-based compressed sensing recovery of the scatter throughout the projection space. In the prototype implementation, the kV x-ray source of the Varian TrueBeam OBI system was blocked at the edges of the projection FOV, and the image detector in the corresponding blocked region was used for scatter detection. The design enables acquisition of the projection data on the unblocked central region and of scatter data at the blocked boundary regions. For the initial scatter estimation on the central FOV, a prior consisting of a hybrid scatter model that combines the scatter interpolation method and a scatter convolution model is estimated using the acquired scatter distribution on the boundary region. With the hybrid scatter estimation model, compressed sensing optimization is performed to generate the scatter map by penalizing the L1 norm of the discrete cosine transform of the scatter signal. The estimated scatter is subtracted from the projection data by soft-tuning, and the scatter-corrected CBCT volume is obtained by the conventional Feldkamp-Davis-Kress algorithm. Experimental studies using image quality and anthropomorphic phantoms on a Varian TrueBeam system were carried out to evaluate the performance of the proposed scheme. Results: The scatter shading artifacts were markedly suppressed in the reconstructed images using the proposed method. On the Catphan 504 phantom, the proposed method reduced the error of CT number to 13 Hounsfield units, 10% of that without scatter correction, and increased the image contrast by a factor of 2 in high-contrast regions. On the anthropomorphic phantom, the spatial nonuniformity decreased from 10.8% to 6.8% after correction. Conclusions: A novel scatter correction method, enabling unobstructed acquisition of the high-frequency image data and concurrent detection of the patient-specific low-frequency scatter data at the edges of the FOV, is proposed and validated in this work. Relative to blocker-based techniques, rather than obstructing the central portion of the FOV, which degrades and limits the image reconstruction, compressed sensing is used to solve for the scatter from its detection at the periphery of the FOV, enabling the highest quality reconstruction in the central region and robust patient-specific scatter correction. PMID:23298098
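The compressed-sensing step can be sketched as follows: a smooth scatter map is recovered from edge-only measurements by iteratively soft-thresholding its DCT coefficients. This is an ISTA-style loop on a toy geometry; the step size, penalty weight and synthetic scatter field are assumptions, not the authors' implementation:

```python
# Hedged sketch: recover a smooth scatter map from measurements available only
# at the blocked FOV edges by penalizing the L1 norm of its DCT coefficients.
import numpy as np
from scipy.fft import dctn, idctn

ny, nx = 64, 64
mask = np.zeros((ny, nx), dtype=bool)
mask[:, :4] = mask[:, -4:] = True            # scatter measured at FOV edges only

true_scatter = np.outer(np.hanning(ny), np.hanning(nx))   # smooth toy scatter
measured = true_scatter * mask

x, step, lam = np.zeros((ny, nx)), 1.0, 1e-3
for _ in range(200):
    grad = mask * (x - measured)             # data fidelity on measured pixels
    z = dctn(x - step * grad, norm="ortho")
    z = np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)   # soft threshold (L1)
    x = idctn(z, norm="ortho")

print("RMS error of recovered scatter:", np.sqrt(np.mean((x - true_scatter) ** 2)))
```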
Water use data to enhance scientific and policy insight
NASA Astrophysics Data System (ADS)
Konar, M.
2017-12-01
We live in an era of big data. However, water use data remains sparse. There is an urgent need to enhance both the quality and resolution of water data. Metered water use information - as opposed to estimated water use, typically based on climate - would enhance the quality of existing water databases. Metered water use data would enable the research community to evaluate the "who, where, and when" of water use. Importantly, this information would enable the scientific community to better understand decision making related to water use (i.e. the "why"), providing the insight necessary to guide policies that promote water conservation. Metered water use data is needed at a sufficient resolution (i.e. spatial, temporal, and water user) to fully resolve how water is used throughout the economy and society. Improving the quality and resolution of water use data will enable scientific understanding that can inform policy.
[Quality Indicators of Primary Health Care Facilities in Austria].
Semlitsch, Thomas; Abuzahra, Muna; Stigler, Florian; Jeitler, Klaus; Posch, Nicole; Siebenhofer, Andrea
2017-07-11
Background The strengthening of primary health care is one major goal of the current national health reform in Austria. In this context, a new interdisciplinary concept was developed in 2014 that defines structures and requirements for future primary health care facilities. Objective The aim of this project was the development of quality indicators for the evaluation of the scheduled primary health care facilities in Austria, in accordance with the new Austrian concept. Methods We used the RAND/NPCRDC method for the development and selection of the quality indicators. We conducted systematic literature searches for existing measures in international databases for quality indicators as well as in bibliographic databases. All retrieved measures were evaluated and rated by an expert panel in a 2-step process regarding relevance and feasibility. Results Overall, the literature searches yielded 281 potentially relevant quality indicators, which were summarized into 65 different quality measures for primary health care. Of these, the panel rated and accepted 30 measures as relevant and feasible for use in Austria. Five of these indicators were structure measures, 14 were process measures and the remaining 11 were outcome measures. Based on the Austrian primary health care concept, the final set of quality indicators was grouped into the following 5 domains: access to primary health care (5), quality of care (15), continuity of care (5), coordination of care (4), and safety (1). Conclusion This set of quality measures largely covers the four defined functions of primary health care. It enables standardized evaluation of primary health care facilities in Austria regarding the implementation of the Austrian primary health care concept, as well as improvement in the healthcare of the population. © Georg Thieme Verlag KG Stuttgart · New York.
Furlong, Lisa M; Morris, Meg E; Erickson, Shane; Serry, Tanya A
2016-11-29
Although mobile apps are readily available for speech sound disorders (SSD), their validity has not been systematically evaluated. This evidence-based appraisal will critically review and synthesize current evidence on available therapy apps for use by children with SSD. The main aims are to (1) identify the types of apps currently available for Android and iOS mobile phones and tablets, and (2) critique their design features and content using a structured quality appraisal tool. This protocol paper presents and justifies the methods used for a systematic review of mobile apps that provide intervention for children with SSD. The primary outcomes of interest are (1) engagement, (2) functionality, (3) aesthetics, (4) information quality, (5) subjective quality, and (6) perceived impact. Quality will be assessed by 2 certified practicing speech-language pathologists using a structured quality appraisal tool. The app stores of the 2 largest operating platforms, Android and iOS, will be searched. Systematic methods of knowledge synthesis shall include searching the app stores using a defined procedure, data extraction, and quality analysis. This search strategy shall enable us to determine how many SSD apps are available for Android- and iOS-compatible mobile phones and tablets. It shall also identify the regions of the world responsible for the apps' development, and the content and quality of the offerings. Recommendations will be made for speech-language pathologists seeking to use mobile apps in their clinical practice. This protocol provides a structured process for locating apps and appraising their quality, as the basis for evaluating their use in speech pathology for children in English-speaking nations. ©Lisa M Furlong, Meg E Morris, Shane Erickson, Tanya A Serry. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 29.11.2016.
Huang, Yuan; Sutter, Eli; Shi, Norman N; Zheng, Jiabao; Yang, Tianzhong; Englund, Dirk; Gao, Hong-Jun; Sutter, Peter
2015-11-24
Mechanical exfoliation has been a key enabler of the exploration of the properties of two-dimensional materials, such as graphene, by providing routine access to high-quality material. The original exfoliation method, which remained largely unchanged during the past decade, provides relatively small flakes with moderate yield. Here, we report a modified approach for exfoliating thin monolayer and few-layer flakes from layered crystals. Our method introduces two process steps that enhance and homogenize the adhesion force between the outermost sheet in contact with a substrate: prior to exfoliation, ambient adsorbates are effectively removed from the substrate by oxygen plasma cleaning, and an additional heat treatment maximizes the uniform contact area at the interface between the source crystal and the substrate. For graphene exfoliation, these simple process steps increased the yield and the area of the transferred flakes by more than 50 times compared to the established exfoliation methods. Raman and AFM characterization shows that the graphene flakes are of similarly high quality to those obtained in previous reports. Graphene field-effect devices were fabricated and measured with back-gating and solution top-gating, yielding mobilities of ∼4000 and 12,000 cm²/(V s), respectively, and thus demonstrating excellent electrical properties. Experiments with other layered crystals, e.g., a bismuth strontium calcium copper oxide (BSCCO) superconductor, show enhancements in exfoliation yield and flake area similar to those for graphene, suggesting that our modified exfoliation method provides an effective way of producing large-area, high-quality flakes of a wide range of 2D materials.
Advanced industrial fluorescence metrology used for qualification of high quality optical materials
NASA Astrophysics Data System (ADS)
Engel, Axel; Becker, Hans-Juergen; Sohr, Oliver; Haspel, Rainer; Rupertus, Volker
2003-11-01
Schott Glas develops and produces optical materials for various specialized applications in telecommunications, biomedical, optical, and microlithography technology. The quality requirements for optical materials are extremely high and still increasing; for example, in microlithography applications the impurities of the material are specified to be in the low-ppb range. Usually, impurities in the low-ppb range are determined using analytical methods such as LA-ICP-MS and neutron activation analysis, whereas the absorption and laser resistivity of optical materials are qualified with optical methods such as precision spectral photometers and in-situ transmission measurements using UV lasers. The analytical methods have the drawback of being time-consuming and rather expensive, whereas the sensitivity of the absorption method will not be sufficient to characterize future needs (absorption coefficients well below 10⁻³ cm⁻¹). In order to meet current and future quality requirements, a Jobin Yvon FLUOROLOG 3.22 fluorescence spectrometer is employed to enable fast and precise qualification and analysis. The main advantage of this setup is the combination of very high sensitivity (more than one order of magnitude more sensitive than state-of-the-art UV absorption spectroscopy) with fast measurement and evaluation cycles (several minutes, compared to the several hours necessary for chemical analytics). An overview is given of the spectral characteristics measured using specified standards, and correlations to material quality are shown. In particular, we have investigated the elementary fluorescence and absorption of rare earth element impurities as well as defect-induced luminescence originating from impurities.
Burnett, E; Curran, E; Loveday, H P; Kiernan, M A; Tannahill, M
2014-01-01
Healthcare is delivered in a dynamic environment with frequent changes in populations, methods, equipment and settings. Infection prevention and control practitioners (IPCPs) must ensure that they are competent in addressing the challenges they face and are equipped to develop infection prevention and control (IPC) services in line with a changing world of healthcare provision. A multifaceted Framework was developed to assist IPCPs to enhance competence at an individual, team and organisational level to enable quality performance and improved quality of care. However, if these aspirations are to be met, it is vital that competency frameworks are fit for purpose or they risk being ignored. The aim of this unique study was to evaluate short and medium term outcomes as set out in the Outcome Logic Model to assist with the evaluation of the impact and success of the Framework. This study found that while the Framework is being used effectively in some areas, it is not being used as much or in the ways that were anticipated. The findings will enable future work on revision, communication and dissemination, and will provide intelligence to those initiating education and training in the utilisation of the competences.
Adaptive image inversion of contrast 3D echocardiography for enabling automated analysis.
Shaheen, Anjuman; Rajpoot, Kashif
2015-08-01
Contrast 3D echocardiography (C3DE) is commonly used to enhance the visual quality of ultrasound images in comparison with non-contrast 3D echocardiography (3DE). Although the image quality in C3DE is perceived to be improved for visual analysis, it actually deteriorates for the purposes of automatic or semi-automatic analysis due to higher speckle noise and intensity inhomogeneity. Therefore, LV endocardial feature extraction and segmentation from C3DE images remain a challenging problem. To address this challenge, this work proposes an adaptive pre-processing method to invert the appearance of the C3DE image. The image inversion is based on an image intensity threshold value which is automatically estimated through image histogram analysis. In the inverted appearance, the LV cavity appears dark while the myocardium appears bright, making it similar in appearance to a 3DE image. Moreover, the resulting inverted image has a high-contrast, low-noise appearance, yielding a strong LV endocardium boundary and facilitating feature extraction for segmentation. Our results demonstrate that the inverted appearance of the contrast image enables subsequent LV segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.
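A minimal sketch of the adaptive inversion idea, using Otsu's histogram criterion as a stand-in for the paper's threshold estimation:

```python
# Hedged sketch: estimate an intensity threshold from the image histogram and
# invert the volume so the LV cavity appears dark, as in non-contrast 3DE.
# Otsu's criterion and the sub-threshold clipping are illustrative choices.
import numpy as np

def otsu_threshold(img, nbins=256):
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(hist)                      # class-0 weight per threshold
    w1 = w0[-1] - w0                          # class-1 weight per threshold
    m0 = np.cumsum(hist * centers)
    mu0 = m0 / np.maximum(w0, 1)
    mu1 = (m0[-1] - m0) / np.maximum(w1, 1)
    between = w0 * w1 * (mu0 - mu1) ** 2      # between-class variance
    return centers[np.argmax(between)]

def invert_adaptive(vol):
    t = otsu_threshold(vol)
    clipped = np.clip(vol, t, None)           # treat sub-threshold voxels as background
    return clipped.max() - clipped            # inversion: bright cavity becomes dark

vol = np.random.rand(32, 32, 32) ** 2          # toy volume stand-in for C3DE data
print("inverted volume shape:", invert_adaptive(vol).shape)
```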
Lu, Liming; Zou, Guanyang; Zeng, Zhi; Han, Lu; Guo, Yan; Ling, Li
2014-01-01
Objectives To explore the relationship between health-related quality of life (HRQOL) status and associated factors among rural-to-urban migrants in China. Methods A cross-sectional survey was conducted with 856 rural-to-urban migrants working at small- and medium-size enterprises (SMEs) in Shenzhen and Zhongshan City in 2012. Andersen's behavioral model was used as a theoretical framework to examine the relationships among factors affecting HRQOL. Analysis was performed using structural equation modeling (SEM). Results Workers with statutory working hours, higher wages and less migrant experience had higher HRQOL scores. Need (contracting a disease in the past two weeks and perception of needing health services) had the greatest total effect on HRQOL (β = −0.78), followed by enabling (labor contract, insurance purchase, income, physical examination during work and training) (β = 0.40), predisposing (age, family separation, education) (β = 0.22) and health practices and use of health services (weekly physical exercise, health check-ups and use of protective equipment) (β = −0.20). Conclusions Priority should be given to satisfying the needs of migrant workers and improving the enabling resources. PMID:24392084
NASA Astrophysics Data System (ADS)
Modegi, Toshio
We are developing audio watermarking techniques that enable extraction of embedded data by cell phones. For that, data must be embedded in frequency ranges where the auditory response is prominent; data embedding therefore causes considerable audible noise. Previously, we proposed a two-channel stereo playback scheme in which the noise generated by the data-embedded left-channel signal is reduced by the right-channel signal. However, this approach has the practical problem of restricting the location of the extracting terminal. In this paper, we propose synthesizing the noise-reducing right-channel signal with the left-channel signal, suppressing the noise by inducing an auditory stream segregation phenomenon in listeners. This newly proposed method makes a separate noise-reducing right channel unnecessary and supports monaural playback. Moreover, we propose a wide-band embedding method that causes dual auditory stream segregation phenomena, enabling data embedding across the whole public telephone frequency range and stable extraction with 3G mobile phones. With these proposals, extraction precision is higher than with the previously proposed method, while the quality degradation of the embedded signals is smaller. We present an overview of the newly proposed method and experimental results compared with those of the previously proposed method.
3D Ultrasonic Non-destructive Evaluation of Spot Welds Using an Enhanced Total Focusing Method
NASA Astrophysics Data System (ADS)
Jasiuniene, Elena; Samaitis, Vykintas; Mazeika, Liudas; Sanderson, Ruth
2015-02-01
Spot welds are used to join sheets of metals in the automotive industry. When spot weld quality is evaluated using the conventional manual ultrasonic pulse-echo method, the reliability of the inspection is affected by the selection of the probe diameter and the positioning of the probe in the weld center. The application of a 2D matrix array is a potential solution to these problems. The objective of this work was to develop a signal processing algorithm to reconstruct the 3D spot weld volume showing the size of the nugget and the defects in it. To achieve this, the conventional total focusing method was enhanced by taking into account the directivities of the single elements of the array and the divergence of the ultrasonic beam due to the propagation distance. These enhancements enabled a reduction in background noise and a uniform sensitivity at different depths. The proposed algorithm was verified using a finite element model of ultrasonic wave propagation simulating three common spot weld conditions: a good weld, an undersized weld, and a weld containing a pore. The investigations demonstrated that the proposed method enables the determination of the size of the nugget and the detection of discontinuities.
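The enhanced TFM amounts to a weighted delay-and-sum over all transmit-receive pairs of the full matrix capture; the sketch below uses a cosine directivity term and a 1/distance divergence weight on synthetic data (array geometry, weighting and data are illustrative, not the authors' code):

```python
# Hedged sketch of a directivity/divergence-weighted total focusing method.
import numpy as np

c = 5900.0                      # assumed longitudinal wave speed in steel, m/s
fs = 100e6                      # assumed sampling frequency, Hz
elem_x = np.linspace(-6e-3, 6e-3, 16)          # 16-element linear array (toy)
n_t = 2048
fmc = np.random.randn(16, 16, n_t) * 0.01      # toy full-matrix-capture data

xs = np.linspace(-5e-3, 5e-3, 81)              # image grid, lateral
zs = np.linspace(1e-3, 10e-3, 73)              # image grid, depth
image = np.zeros((len(zs), len(xs)))

for iz, z in enumerate(zs):
    for ix, x in enumerate(xs):
        d = np.hypot(elem_x - x, z)            # element-to-pixel distances
        w = (z / d) / d                        # cos(theta) directivity x 1/d divergence
        tof = (d[:, None] + d[None, :]) / c    # transmit + receive time of flight
        idx = np.clip((tof * fs).astype(int), 0, n_t - 1)
        amp = fmc[np.arange(16)[:, None], np.arange(16)[None, :], idx]
        image[iz, ix] = np.abs(np.sum(w[:, None] * w[None, :] * amp))
```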
Wojdyla, Justyna Aleksandra; Kaminski, Jakub W.; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian
2018-01-01
Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods. PMID:29271779
Serial femtosecond crystallography of soluble proteins in lipidic cubic phase
Fromme, Raimund; Ishchenko, Andrii; Metz, Markus; ...
2015-08-04
Serial femtosecond crystallography (SFX) at X-ray free-electron lasers (XFELs) enables high-resolution protein structure determination using micrometre-sized crystals at room temperature with minimal effects from radiation damage. SFX requires a steady supply of microcrystals intersecting the XFEL beam at random orientations. An LCP–SFX method has recently been introduced in which microcrystals of membrane proteins are grown and delivered for SFX data collection inside a gel-like membrane-mimetic matrix, known as lipidic cubic phase (LCP), using a special LCP microextrusion injector. Here, it is shown that this approach can be extended to soluble proteins, enabling a dramatic reduction in the amount of crystallized protein required for data collection compared with crystals delivered by liquid injectors. High-quality LCP–SFX data sets were collected for two soluble proteins, lysozyme and phycocyanin, using less than 0.1 mg of each protein.
Attya, Mohamed; Benabdelkamel, Hicham; Perri, Enzo; Russo, Anna; Sindona, Giovanni
2010-12-01
The quality of olive oils is sensorially tested by accurate and well-established methods that enable the classification of pressed oils into the classes of extra virgin oil, virgin oil and lampante oil. Nonetheless, it would be convenient to have analytical methods for screening oils or supporting sensorial analysis using a reliable independent approach based on the exploitation of mass spectrometric methodologies. A number of methods have been proposed to evaluate deficiencies of extra virgin olive oils resulting from inappropriate technological treatments, such as high- or low-temperature deodorization, and home cooking processes. The quality and nutraceutical value of extra virgin olive oil (EVOO) can be related to the antioxidant properties of its phenolic compounds. Olive oil is a source of at least 30 phenolic compounds, such as oleuropein, oleocanthal, hydroxytyrosol, and tyrosol, all acting as strong antioxidants, radical scavengers and NSAID-like drugs. We now report the efficacy of MRM tandem mass spectrometry, assisted by the isotope dilution assay, in the evaluation of the thermal stability of selected active principles of extra virgin olive oil.
Enhanced sequencing coverage with digital droplet multiple displacement amplification
Sidore, Angus M.; Lan, Freeman; Lim, Shaun W.; Abate, Adam R.
2016-01-01
Sequencing small quantities of DNA is important for applications ranging from the assembly of uncultivable microbial genomes to the identification of cancer-associated mutations. To obtain sufficient quantities of DNA for sequencing, the small amount of starting material must be amplified significantly. However, existing methods often yield errors or non-uniform coverage, reducing sequencing data quality. Here, we describe digital droplet multiple displacement amplification, a method that enables massive amplification of low-input material while maintaining sequence accuracy and uniformity. The low-input material is compartmentalized as single molecules in millions of picoliter droplets. Because the molecules are isolated in compartments, they amplify to saturation without competing for resources; this yields uniform representation of all sequences in the final product and, in turn, enhances the quality of the sequence data. We demonstrate the ability to uniformly amplify the genomes of single Escherichia coli cells, comprising just 4.7 fg of starting DNA, and obtain sequencing coverage distributions that rival that of unamplified material. Digital droplet multiple displacement amplification provides a simple and effective method for amplifying minute amounts of DNA for accurate and uniform sequencing. PMID:26704978
Talarska, D
2007-01-01
Evaluation of quality of life has become a frequently used method in monitoring treatment effects. The Quality of Life Childhood Epilepsy (QOLCE) questionnaire, which is completed by patients' parents, has been prepared for children with epilepsy. It enables determination of quality of life in children aged 4-18 years. The aim of the study was to show the usefulness of the QOLCE questionnaire in evaluating the quality of life of children with epilepsy. 160 epileptic children aged 8-18 years and their parents were examined at the Chair and Department of Developmental Neurology, K. Marcinkowski University of Medical Sciences in Poznań. The QOLCE questionnaire was completed by parents, and a "Young people and epilepsy" questionnaire was designed for the children. The reliability index of the complete questionnaire, as measured by Cronbach's alpha coefficient, was 0.93 both in our study and in the original. Drug-resistant epileptic children constituted 28% of the examined group. Parents of children with controlled seizures rated their children's functioning higher in the analyzed quality-of-life areas. 1. The QOLCE questionnaire is a suitable tool for evaluating quality of life in children and adolescents. 2. The most significant differences in functioning between drug-resistant patients and those with controlled seizures were observed in the areas of cognitive processes and social activity.
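The reported reliability statistic is Cronbach's alpha; a minimal numpy implementation on a hypothetical item-response matrix:

```python
# Cronbach's alpha, the reliability coefficient reported for the QOLCE (0.93).
# The item-response data below are simulated placeholders, not study data.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(160, 1))                  # 160 respondents, as in the study
responses = latent + rng.normal(scale=0.5, size=(160, 10))  # 10 hypothetical items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```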
Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric
2014-03-01
Pharmaceutical companies are progressively adopting the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near-infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) crystallization step during which the API and residual solvent contents must be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As the variability of the sampling method and the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove it fit for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduced process time and the elimination of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.
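The chemometric core of such a method is a PLS model relating NIR spectra to analyte content; a minimal scikit-learn sketch on synthetic spectra spanning the reported 9.0-12.0% w/w API range (the actual model, preprocessing and ICH Q2(R1) validation are not reproduced here):

```python
# Hedged sketch: PLS calibration of synthetic NIR spectra against API content.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 60, 200
concentration = rng.uniform(9.0, 12.0, n_samples)   # % w/w API, per the dosing range
# A single Gaussian band stands in for the API's NIR signature (illustrative).
pure_spectrum = np.exp(-0.5 * ((np.arange(n_wavelengths) - 80) / 15) ** 2)
X = np.outer(concentration, pure_spectrum) + rng.normal(0, 0.05, (n_samples, n_wavelengths))

pls = PLSRegression(n_components=3)
r2 = cross_val_score(pls, X, concentration, cv=5, scoring="r2")
print("cross-validated R^2 per fold:", r2.round(3))
```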
Corre, Guillaume; Dessainte, Michel; Marteau, Jean-Brice; Dalle, Bruno; Fenard, David; Galy, Anne
2016-02-01
Nonreplicative recombinant HIV-1-derived lentiviral vectors (LV) are increasingly used in gene therapy of various genetic diseases, infectious diseases, and cancer. Before they are used in humans, preparations of LV must undergo extensive quality control testing. In particular, testing of LV must demonstrate the absence of replication-competent lentiviruses (RCL) with suitable methods, on representative fractions of vector batches. Current methods based on cell culture are challenging because high titers of vector batches translate into high volumes of cell culture to be tested in RCL assays. As vector batch sizes and titers are continuously increasing owing to improved production and purification methods, it became necessary for us to modify the current RCL assay based on the detection of p24 in cultures of indicator cells. Here, we propose a practical optimization of this method using a pairwise pooling strategy enabling easier testing of higher vector inoculum volumes. These modifications significantly decrease material handling and operator time, leading to a cost-effective method, while maintaining optimal sensitivity of the RCL testing. This optimized "RCL-pooling assay" improves the feasibility of the quality control of large-scale batches of clinical-grade LV while maintaining the same sensitivity.
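The pooling idea can be illustrated with a simple row/column (pairwise) layout in which each culture contributes to exactly two pools, so a single positive is located at the intersection of its positive pools; the layout and decoding below are illustrative, not the authors' exact protocol:

```python
# Hedged sketch of pairwise (row/column) pooling: 24 cultures tested in only
# 10 pools, with a single positive located by intersecting positive pools.
import numpy as np

n_rows, n_cols = 4, 6
samples = np.zeros((n_rows, n_cols), dtype=bool)
samples[2, 4] = True                         # one hypothetical RCL-positive culture

row_pools = samples.any(axis=1)              # p24 readout of each row pool
col_pools = samples.any(axis=0)              # p24 readout of each column pool

hits = [(r, c) for r in np.flatnonzero(row_pools)
               for c in np.flatnonzero(col_pools)]
print("candidate positive culture(s):", hits)   # -> [(2, 4)]
```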
Automated ground-water monitoring with Robowell: case studies and potential applications
NASA Astrophysics Data System (ADS)
Granato, Gregory E.; Smith, Kirk P.
2002-02-01
Robowell is an automated system and method for monitoring ground-water quality. Robowell meets accepted manual- sampling protocols without high labor and laboratory costs. Robowell periodically monitors and records water-quality properties and constituents in ground water by pumping a well or multilevel sampler until one or more purge criteria have been met. A record of frequent water-quality measurements from a monitoring site can indicate changes in ground-water quality and can provide a context for the interpretation of laboratory data from discrete samples. Robowell also can communicate data and system performance through a remote communication link. Remote access to ground-water data enables the user to monitor conditions and optimize manual sampling efforts. Six Robowell prototypes have successfully monitored ground-water quality during all four seasons of the year under different hydrogeologic conditions, well designs, and geochemical environments. The U.S. Geological Survey is seeking partners for research with robust and economical water-quality monitoring instruments designed to measure contaminants of concern in conjunction with the application and commercialization of the Robowell technology. Project publications and information about technology transfer opportunities are available on the Internet at URL http://ma.water.usgs.gov/automon/
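A purge-criterion loop of the kind Robowell automates can be sketched as follows; the stabilization tolerances follow common low-flow sampling practice and are assumptions, not the instrument's actual configuration:

```python
# Hedged sketch: pump until successive field-parameter readings stabilize
# within tolerances, then record/sample. Values and readout are stand-ins.
import random, time

TOLERANCES = {"pH": 0.1, "conductance_uS": 3.0, "turbidity_NTU": 10.0}

def read_sonde():
    # Stand-in for a multiparameter sonde readout.
    return {"pH": 6.8 + random.gauss(0, 0.02),
            "conductance_uS": 250 + random.gauss(0, 1.0),
            "turbidity_NTU": 4 + random.gauss(0, 2.0)}

def purge_until_stable(max_cycles=60, interval_s=1):
    prev = read_sonde()
    for _ in range(max_cycles):
        time.sleep(interval_s)               # pumping continues between readings
        cur = read_sonde()
        if all(abs(cur[k] - prev[k]) <= tol for k, tol in TOLERANCES.items()):
            return cur                       # purge criteria met: record/sample now
        prev = cur
    raise RuntimeError("well did not stabilize; flag record for review")

print(purge_until_stable())
```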
Bailie, Jodie; Laycock, Alison; Matthews, Veronica; Bailie, Ross
2016-01-01
There is an enduring gap between recommended practice and care that is actually delivered; and there is wide variation between primary health care (PHC) centers in delivery of care. Where aspects of care are not being done well across a range of PHC centers, this is likely due to inadequacies in the broader system. This paper aims to describe stakeholders' perceptions of the barriers and enablers to addressing gaps in Australian Aboriginal and Torres Strait Islander chronic illness care and child health, and to identify key drivers for improvement. This paper draws on data collected as part of a large-scale continuous quality improvement project in Australian Indigenous PHC settings. We undertook a qualitative assessment of stakeholder feedback on the main barriers and enablers to addressing gaps in care for Aboriginal and Torres Strait Islander children and in chronic illness care. Themes on barriers and enablers were further analyzed to develop a "driver diagram," an improvement tool used to locate barriers and enablers within causal pathways (as primary and secondary drivers), enabling them to be targeted by tailored interventions. We identified 5 primary drivers and 11 secondary drivers of high-quality care, and associated strategies that have potential for wide-scale implementation to address barriers and enablers for improving care. Perceived barriers to addressing gaps in care included both health system and staff attributes. Primary drivers were: staff capability to deliver high-quality care; availability and use of clinical information systems and decision support tools; embedding of quality improvement processes and data-driven decision-making; appropriate and effective recruitment and retention of staff; and community capacity, engagement and mobilization for health. Suggested strategies included mechanisms for increasing clinical supervision and support, staff retention, reorientation of service delivery, use of information systems and community health literacy. The findings identify areas of focus for development of barrier-driven, tailored interventions to improve health outcomes. They reinforce the importance of system-level action to improve health center performance and health outcomes, and of developing strategies to address system-wide challenges that can be adapted to local contexts.
Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette
2018-05-10
Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.
Multiclassifier information fusion methods for microarray pattern recognition
NASA Astrophysics Data System (ADS)
Braun, Jerome J.; Glina, Yan; Judson, Nicholas; Herzig-Marx, Rachel
2004-04-01
This paper addresses automatic recognition of microarray patterns, a capability that could have major significance for medical diagnostics, enabling the development of diagnostic tools for automatic discrimination of specific diseases. The paper presents multiclassifier information fusion methods for microarray pattern recognition. The input-space partitioning approach, based on fitness measures that constitute an a priori gauging of classification efficacy for each subspace, is investigated. Methods for the generation of fitness measures, the generation of input subspaces, and their use in the multiclassifier fusion architecture are presented. In particular, a two-level quantification of fitness that accounts for the quality of each subspace as well as the quality of individual neighborhoods within the subspace is described. The individual-subspace classifiers are Support Vector Machine (SVM) based. The decision fusion stage fuses the information from multiple SVMs along with the multi-level fitness information. Final decision fusion stage techniques, including weighted fusion as well as Dempster-Shafer theory based fusion, are investigated. It should be noted that while the above methods are discussed in the context of microarray pattern recognition, they are applicable to a broader range of discrimination problems, in particular to problems involving a large number of information sources irreducible to a low-dimensional feature space.
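A minimal sketch of fitness-weighted fusion: one SVM per input subspace, with posterior estimates combined using weights derived from a per-subspace fitness measure (here simply training accuracy; the paper's two-level fitness and Dempster-Shafer variants are not reproduced):

```python
# Hedged sketch: fitness-weighted fusion of subspace SVM classifiers.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=30, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

subspaces = [slice(0, 10), slice(10, 20), slice(20, 30)]   # input-space partition
clfs, fitness = [], []
for sub in subspaces:
    clf = SVC(probability=True).fit(X_tr[:, sub], y_tr)
    clfs.append(clf)
    fitness.append(clf.score(X_tr[:, sub], y_tr))          # a priori fitness gauge

w = np.array(fitness) / np.sum(fitness)                    # normalized weights
fused = sum(wi * clf.predict_proba(X_te[:, sub])
            for wi, clf, sub in zip(w, clfs, subspaces))
print("fused accuracy:", (fused.argmax(axis=1) == y_te).mean())
```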
Wagner Mackenzie, Brett; Waite, David W; Taylor, Michael W
2015-01-01
The human gut contains dense and diverse microbial communities which have profound influences on human health. Gaining meaningful insights into these communities requires provision of high quality microbial nucleic acids from human fecal samples, as well as an understanding of the sources of variation and their impacts on the experimental model. We present here a systematic analysis of commonly used microbial DNA extraction methods, and identify significant sources of variation. Five extraction methods (Human Microbiome Project protocol, MoBio PowerSoil DNA Isolation Kit, QIAamp DNA Stool Mini Kit, ZR Fecal DNA MiniPrep, phenol:chloroform-based DNA isolation) were evaluated based on the following criteria: DNA yield, quality and integrity, and microbial community structure based on Illumina amplicon sequencing of the V4 region of bacterial and archaeal 16S rRNA genes. Our results indicate that the largest portion of variation within the model was attributed to differences between subjects (biological variation), with a smaller proportion of variation associated with DNA extraction method (technical variation) and intra-subject variation. A comprehensive understanding of the potential impact of technical variation on the human gut microbiota will help limit preventable bias, enabling more accurate diversity estimates.
Blind multirigid retrospective motion correction of MR images.
Loktyushin, Alexander; Nickisch, Hannes; Pohmann, Rolf; Schölkopf, Bernhard
2015-04-01
Physiological nonrigid motion is inevitable when imaging, e.g., abdominal viscera, and can lead to serious deterioration of image quality. Prospective techniques for motion correction can handle only special types of nonrigid motion, as they allow only global correction. Retrospective methods developed so far need guidance from navigator sequences or external sensors. We propose a fully retrospective nonrigid motion correction scheme that needs only the raw data as input. Our method is based on a forward model that describes the effects of nonrigid motion by partitioning the image into patches with locally rigid motion. Using this forward model, we construct an objective function that we can optimize with respect to both the unknown motion parameters per patch and the underlying sharp image. We evaluate our method on both synthetic and real data in 2D and 3D. In vivo data were acquired using standard imaging sequences. The correction algorithm significantly improves image quality. Our compute unified device architecture (CUDA)-enabled graphics processing unit implementation ensures feasible computation times. The presented technique is the first computationally feasible retrospective method that uses the raw data of standard imaging sequences and allows correction of nonrigid motion without guidance from external motion sensors. © 2014 Wiley Periodicals, Inc.
Revising the lower statistical limit of x-ray grating-based phase-contrast computed tomography.
Marschner, Mathias; Birnbacher, Lorenz; Willner, Marian; Chabior, Michael; Herzen, Julia; Noël, Peter B; Pfeiffer, Franz
2017-01-01
Phase-contrast x-ray computed tomography (PCCT) is currently investigated as an interesting extension of conventional CT, providing high soft-tissue contrast even when examining weakly absorbing specimens. Until now, the potential for dose reduction was thought to be limited compared to attenuation CT, since meaningful phase retrieval fails for scans with very low photon counts when using the conventional phase retrieval method via phase stepping. In this work, we examine the statistical behaviour of the reverse projection method, an alternative phase retrieval approach, and compare the results to the conventional phase retrieval technique. We investigate the noise levels in the projections as well as the image quality and quantitative accuracy of the reconstructed tomographic volumes. The results of our study show that this method performs better in a low-dose scenario than the conventional phase retrieval approach, resulting in lower noise levels, enhanced image quality and more accurate quantitative values. Overall, we demonstrate that the lower statistical limit of the phase stepping procedure as proposed by recent literature does not apply to this alternative phase retrieval technique. However, further development is necessary to overcome the experimental challenges posed by this method, which would enable mainstream or even clinical application of PCCT.
NASA Astrophysics Data System (ADS)
Pfefer, Joshua; Agrawal, Anant
2012-03-01
In recent years there has been increasing interest in development of consensus, tissue-phantom-based approaches for assessment of biophotonic imaging systems, with the primary goal of facilitating clinical translation of novel optical technologies. Well-characterized test methods based on tissue phantoms can provide useful tools for performance assessment, thus enabling standardization and device inter-comparison during preclinical development as well as quality assurance and re-calibration in the clinical setting. In this review, we study the role of phantom-based test methods as described in consensus documents such as international standards for established imaging modalities including X-ray CT, MRI and ultrasound. Specifically, we focus on three image quality characteristics - spatial resolution, spatial measurement accuracy and image uniformity - and summarize the terminology, metrics, phantom design/construction approaches and measurement/analysis procedures used to assess these characteristics. Phantom approaches described are those in routine clinical use and tend to have simplified morphology and biologically-relevant physical parameters. Finally, we discuss the potential for applying knowledge gained from existing consensus documents in the development of standardized, phantom-based test methods for optical coherence tomography.
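As a concrete example of one such metric, integral uniformity over a region of interest is computed from the extreme pixel values; the exact formula varies between standards, so the version below is one common convention:

```python
# Illustrative computation of integral uniformity on a flat phantom region.
# The ROI placement and the specific formula are one common convention,
# not the definition from any single standard.
import numpy as np

def integral_uniformity(image, roi):
    """roi: boolean mask selecting the uniform phantom region."""
    vals = image[roi]
    return (vals.max() - vals.min()) / (vals.max() + vals.min())

img = 100 + np.random.default_rng(2).normal(0, 2, (128, 128))  # toy flat-field image
roi = np.zeros_like(img, dtype=bool)
roi[32:96, 32:96] = True
print(f"integral uniformity: {integral_uniformity(img, roi):.3f}")
```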
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolly, S; Mutic, S; Anastasio, M
Purpose: Traditionally, image quality in radiation therapy is assessed subjectively or by utilizing physically-based metrics. Some model observers exist for task-based medical image quality assessment, but almost exclusively for diagnostic imaging tasks. As opposed to disease diagnosis, the task for image observers in radiation therapy is to utilize the available images to design and deliver a radiation dose which maximizes patient disease control while minimizing normal tissue damage. The purpose of this study was to design and implement a new computer simulation model observer to enable task-based image quality assessment in radiation therapy. Methods: A modular computer simulation framework was developed to resemble the radiotherapy observer by simulating an end-to-end radiation therapy treatment. Given images and the ground-truth organ boundaries from a numerical phantom as inputs, the framework simulates an external beam radiation therapy treatment and quantifies patient treatment outcomes using the previously defined therapeutic operating characteristic (TOC) curve. As a preliminary demonstration, TOC curves were calculated for various CT acquisition and reconstruction parameters, with the goal of assessing and optimizing simulation CT image quality for radiation therapy. Sources of randomness and bias within the system were analyzed. Results: The relationship between CT imaging dose and patient treatment outcome was objectively quantified in terms of a singular value, the area under the TOC (AUTOC) curve. The AUTOC decreases more rapidly for low-dose imaging protocols. AUTOC variation introduced by the dose optimization algorithm was approximately 0.02%, at the 95% confidence interval. Conclusion: A model observer has been developed and implemented to assess image quality based on radiation therapy treatment efficacy. It enables objective determination of appropriate imaging parameter values (e.g. imaging dose). Framework flexibility allows for incorporation of additional modules to include any aspect of the treatment process, and therefore has great potential for both assessment and optimization within radiation therapy.
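Reducing a TOC curve to its area is a straightforward numerical integration; the TCP/NTCP pairs below are hypothetical stand-ins for simulated treatment outcomes:

```python
# Hedged sketch: area under a therapeutic operating characteristic (AUTOC)
# by trapezoidal integration. The curve points are hypothetical placeholders.
import numpy as np

ntcp = np.array([0.0, 0.05, 0.10, 0.20, 0.35, 1.0])   # normal tissue complication prob.
tcp = np.array([0.0, 0.55, 0.75, 0.88, 0.95, 1.0])    # tumor control probability

autoc = np.trapz(tcp, ntcp)    # area under the TOC curve; closer to 1 is better
print(f"AUTOC = {autoc:.3f}")
```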
Automated daily quality control analysis for mammography in a multi-unit imaging center.
Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli
2018-01-01
Background The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose To develop an automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
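The analysis principle can be sketched with PyWavelets: decompose a phantom image by a 2D discrete wavelet transform and score feature visibility from detail-band energy (wavelet choice, decomposition level and scoring are assumptions, not the validated software):

```python
# Hedged sketch of wavelet-based multiresolution analysis of a phantom image.
# PyWavelets (pywt) is assumed to be installed; the image is a toy stand-in.
import numpy as np
import pywt

img = np.zeros((256, 256))
img[120:136, 120:136] = 1.0                        # toy "mass" on a flat background
img += np.random.default_rng(3).normal(0, 0.05, img.shape)

coeffs = pywt.wavedec2(img, "db2", level=3)        # [cA3, details3, details2, details1]
for depth, (cH, cV, cD) in zip(range(3, 0, -1), coeffs[1:]):
    energy = sum(np.sum(c ** 2) for c in (cH, cV, cD))   # detail-band energy score
    print(f"detail level {depth}: energy = {energy:.2f}")
```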
Global parameter estimation for thermodynamic models of transcriptional regulation.
Suleimenov, Yerzhan; Ay, Ahmet; Samee, Md Abul Hassan; Dresch, Jacqueline M; Sinha, Saurabh; Arnosti, David N
2013-07-15
Deciphering the mechanisms involved in gene regulation holds the key to understanding the control of central biological processes, including human disease, population variation, and the evolution of morphological innovations. New experimental techniques including whole genome sequencing and transcriptome analysis have enabled comprehensive modeling approaches to study gene regulation. In many cases, it is useful to be able to assign biological significance to the inferred model parameters, but such interpretation should take into account features that affect these parameters, including model construction and sensitivity, the type of fitness calculation, and the effectiveness of parameter estimation. This last point is often neglected, as estimation methods are often selected for historical reasons or for computational ease. Here, we compare the performance of two parameter estimation techniques broadly representative of local and global approaches, namely, a quasi-Newton/Nelder-Mead simplex (QN/NMS) method and a covariance matrix adaptation-evolutionary strategy (CMA-ES) method. The estimation methods were applied to a set of thermodynamic models of gene transcription for regulatory elements active in the Drosophila embryo. Measuring overall fit, the global CMA-ES method performed significantly better than the local QN/NMS method on high quality data sets, but this difference was negligible on lower quality data sets with increased noise or on data sets simplified by stringent thresholding. Our results suggest that the choice of parameter estimation technique for evaluation of gene expression models depends on the quality of the data, the nature of the models, and the aims of the modeling effort. Copyright © 2013 Elsevier Inc. All rights reserved.
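For readers unfamiliar with the two estimator families compared here, the sketch below contrasts a local Nelder-Mead simplex search (via SciPy) with a global CMA-ES search (via the cma package) on a toy objective. The objective is a placeholder standing in for a model-vs-data error score, not one of the transcription models from the study.

    import numpy as np
    from scipy.optimize import minimize
    import cma  # pip install cma

    def objective(theta):
        # Toy multimodal fitness with many local minima around a global one.
        theta = np.asarray(theta)
        return np.sum(theta ** 2) + 2.0 * np.sum(1.0 - np.cos(2.0 * np.pi * theta))

    x0 = np.full(4, 2.5)

    # Local search: Nelder-Mead simplex (derivative-free, can stall in a local minimum).
    local = minimize(objective, x0, method="Nelder-Mead")

    # Global search: covariance matrix adaptation evolution strategy.
    es = cma.CMAEvolutionStrategy(x0.tolist(), 0.8, {"verbose": -9})
    es.optimize(objective)

    print("Nelder-Mead best:", local.fun)
    print("CMA-ES best:    ", es.result.fbest)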
Integrated standardization concept for Angelica botanicals using quantitative NMR
Gödecke, Tanja; Yao, Ping; Napolitano, José G.; Nikolić, Dejan; Dietz, Birgit M.; Bolton, Judy L.; van Breemen, Richard B.; Farnsworth, Norman R.; Chen, Shao-Nong; Lankin, David C.; Pauli, Guido F.
2011-01-01
Despite numerous in vitro/in vivo and phytochemical studies, the active constituents of Angelica sinensis (AS) have not been conclusively identified for standardization to bioactive markers. Phytochemical analyses of AS extracts and fractions that demonstrate activity in a panel of in vitro bioassays have repeatedly pointed to ligustilide as being (associated with) the active principle(s). Due to the chemical instability of ligustilide and related issues in GC/LC analyses, new methods capable of quantifying ligustilide in mixtures that do not rely on an identical reference standard are in high demand. This study demonstrates how NMR can satisfy the requirement for simultaneous, multi-target quantification and qualitative identification. First, the AS activity was concentrated into a single fraction by RP-solid-phase extraction, as confirmed by (anti-)estrogenicity and cytotoxicity assays. Next, a quantitative 1H NMR (qHNMR) method was established and validated using standard compounds and comparing processing methods. Subsequent 1D/2D NMR and qHNMR analysis led to the identification and quantification of ligustilide and other minor components in the active fraction, and to the development of quality criteria for authentic AS preparations. The absolute and relative quantities of ligustilide, six minor alkyl phthalides, and groups of phenylpropanoids, polyynes, and poly-unsaturated fatty acids were measured by a combination of qHNMR and 2D COSY. The qNMR approach enables multi-target quality control of the bioactive fraction, and enables the integrated biological and chemical standardization of AS botanicals. This methodology can potentially be transferred to other botanicals with active principles that act synergistically, or that contain closely related constituents which have not been conclusively identified as the active principles. PMID:21907766
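The quantitative backbone of qHNMR is simple: signal integrals scale with the number of contributing protons and the molar amount of each species. A minimal sketch of the standard internal-standard purity calculation follows; the numbers are illustrative assumptions, not values from the study.

    def qhnmr_purity(i_analyte, n_analyte, m_analyte, mw_analyte,
                     i_std, n_std, m_std, mw_std, purity_std):
        """Standard internal-standard qHNMR purity equation.

        P_a = (I_a/I_s) * (N_s/N_a) * (M_a/M_s) * (m_s/m_a) * P_s
        where I = integral, N = number of protons in the signal,
        M = molar mass, m = weighed mass, P = purity.
        """
        return ((i_analyte / i_std) * (n_std / n_analyte)
                * (mw_analyte / mw_std) * (m_std / m_analyte) * purity_std)

    # Illustrative numbers only (hypothetical ligustilide signal vs. a standard).
    print(qhnmr_purity(i_analyte=1.00, n_analyte=1, m_analyte=10.0, mw_analyte=190.24,
                       i_std=2.05, n_std=2, m_std=12.0, mw_std=204.22, purity_std=0.999))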
Rees, Alan F.; Avens, Larisa; Ballorain, Katia; Bevan, Elizabeth; Broderick, Annette C.; Carthy, Raymond R.; Christianen, Marjolijn J. A.; Duclos, Gwénaël; Heithaus, Michael R.; Johnston, David W.; Mangel, Jeffrey C.; Paladino, Frank V.; Pendoley, Kellie; Reina, Richard D.; Robinson, Nathan J.; Ryan, Robert; Sykora-Bodie, Seth T.; Tilley, Dominic; Varela, Miguel R.; Whitman, Elizabeth R.; Whittock, Paul A.; Wibbels, Thane; Godley, Brendan J.
2018-01-01
The use of satellite systems and manned aircraft surveys for remote data collection has been shown to be transformative for sea turtle conservation and research by enabling the collection of data on turtles and their habitats over larger areas than can be achieved by surveys on foot or by boat. Unmanned aerial vehicles (UAVs) or drones are increasingly being adopted to gather data at unprecedented spatial and temporal resolutions in diverse geographic locations. This easily accessible, low-cost tool is improving existing research methods and enabling novel approaches in marine turtle ecology and conservation. Here we review the diverse ways in which incorporating inexpensive UAVs may reduce costs and field time while improving safety and data quality and quantity over existing methods for studies on turtle nesting, at-sea distribution and behaviour surveys, as well as expanding into new avenues such as surveillance against illegal take. Furthermore, we highlight the impact that high-quality aerial imagery captured by UAVs can have for public outreach and engagement. This technology does not come without challenges. We discuss the potential constraints of these systems within the ethical and legal frameworks in which researchers must operate, and the difficulties that can arise with regard to the storage and analysis of large amounts of imagery. We then suggest areas where technological development could further expand the utility of UAVs as data-gathering tools; for example, functioning as downloading nodes for data collected by sensors placed on turtles. Development of methods for the use of UAVs in sea turtle research will serve as case studies for use with other marine and terrestrial taxa.
Nims, Raymond W; Sykes, Greg; Cottrill, Karin; Ikonomi, Pranvera; Elmore, Eugene
2010-12-01
The role of cell authentication in biomedical science has received considerable attention, especially within the past decade. This quality control attribute is now beginning to be given the emphasis it deserves by granting agencies and by scientific journals. Short tandem repeat (STR) profiling, one of a few DNA profiling technologies now available, is being proposed for routine identification (authentication) of human cell lines, stem cells, and tissues. The advantage of this technique over methods such as isoenzyme analysis, karyotyping, human leukocyte antigen typing, etc., is that STR profiling can establish identity to the individual level, provided that the appropriate number and types of loci are evaluated. To best employ this technology, a standardized protocol and a data-driven, quality-controlled, and publicly searchable database will be necessary. This public STR database (currently under development) will enable investigators to rapidly authenticate human-based cultures to the individual from whom the cells were sourced. Use of similar approaches for non-human animal cells will require developing other suitable loci sets. While implementing STR analysis on a more routine basis should significantly reduce the frequency of cell misidentification, additional technologies may be needed as part of an overall authentication paradigm. For instance, isoenzyme analysis, PCR-based DNA amplification, and sequence-based barcoding methods enable rapid confirmation of a cell line's species of origin while screening against cross-contaminations, especially when the cells present are not recognized by the species-specific STR method. Karyotyping may also be needed as a supporting tool during establishment of an STR database. Finally, good cell culture practices must always remain a major component of any effort to reduce the frequency of cell misidentification.
Microaspiration of Solanum tuberosum root cells at early stages of infection by Globodera pallida.
Kooliyottil, Rinu; Dandurand, Louise-Marie; Kuhl, Joseph C; Caplan, Allan; Xiao, Fangming
2017-01-01
Sedentary endoparasitic cyst nematodes form a feeding structure in plant roots, called a syncytium. Syncytium formation involves extensive transcriptional modifications, which lead to cell modifications such as increased cytoplasmic streaming, enlarged nuclei, increased numbers of organelles, and replacement of a central vacuole by many small vacuoles. When whole root RNA is isolated and analyzed, transcript changes manifested in the infected plant cells are overshadowed by gene expression from cells of the entire root system. Use of microaspiration allows isolation of the content of nematode-infected cells from a heterogeneous cell population. However, one challenge with this method is identifying the nematode-infected cells under the microscope at early stages of infection. This problem was addressed by staining nematode juveniles with a fluorescent dye prior to infection so that the infected cells could be located and microaspirated. In the present study, we used the fluorescent vital stain PKH26 coupled with a micro-rhizosphere chamber to locate the nematode Globodera pallida within infected Solanum tuberosum root cells. This enabled microaspiration of nematode-infected root cells during the early stages of parasitism. To study the transcriptional events occurring in these cells, an RNA isolation method from microaspirated samples was optimized, and subsequently the RNA was purified using magnetic beads. With this method, we obtained an RNA quality number of 7.8. For transcriptome studies, cDNA was synthesized from the isolated RNA and assessed by successfully amplifying several pathogenesis-related protein coding genes. The use of PKH26-stained nematode juveniles enabled early detection of nematode-infected cells for microaspiration. To investigate transcriptional changes in low-yield RNA samples, bead-based RNA extraction procedures minimized RNA degradation and provided high quality RNA. This protocol provides a robust procedure to analyze gene expression in nematode-infected cells.
Zhang, Hai; Bussini, Daniele; Hortal, Mercedes; Elegir, Graziano; Mendes, Joana; Jordá Beneyto, Maria
2016-06-01
For paper and paperboard packaging, recyclability plays an important role in conserving resources and reducing environmental impacts. Therefore, when it comes to nano-enabled paper packaging materials, the recyclability issue should be properly addressed. This study represents our first report on the fate of nanomaterials in the paper recycling process. The packaging material of concern is a PLA (Polylactic Acid) coated paper incorporating zinc oxide nanoparticles in the coating layer. The material was characterised and assessed in a lab-scale paper recycling line. The recyclability test was based on a method adapted from ATICELCA MC501-13, which enabled recovery of over 99% of the solid material. The mass balance result indicates that 86-91% of the zinc oxide nanoparticles ended up in the rejected material stream, mostly embedded within the polymer coating, whereas 7-16% of the nanoparticles ended up in the accepted material stream. In addition, the tensile strength of the recycled handsheets suggests that the nano-enabled coating had no negative impacts on the recovered fibre quality. Copyright © 2016 Elsevier Ltd. All rights reserved.
Automated Tumor Volumetry Using Computer-Aided Image Segmentation
Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A.; Ali, Zarina S.; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M.; Davatzikos, Christos
2015-01-01
Rationale and Objectives Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. Materials and Methods A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Results Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0–5 rating scale where 5 indicated perfect segmentation. Conclusions The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. PMID:25770633
Agrimonti, Caterina; Bottari, Benedetta; Sardaro, Maria Luisa Savo; Marmiroli, Nelson
2017-09-08
Dairy foods represent an important sector of the food market for their nutritional qualities and their organoleptic characteristics, which are often linked to tradition and to region. These products are typically protected by labels such as PDO (Protected Designation of Origin) and PGI (Protected Geographical Indication). Real-time PCR (qPCR) is a fundamental tool in "Food Genomics," a discipline concerned with the residual DNA in food, which, alongside traditional physical and chemical methods, is frequently used to determine product safety, quality and authenticity. Compared to conventional or "end-point" PCR, qPCR incorporates continuous monitoring of reaction progress, thereby enabling quantification of target DNA. This review describes qPCR applications to the analysis of microbiota and to the identification of the animal species source of milk from which dairy products have been made. These are important aspects for ensuring safety and authenticity. The various applications of qPCR are discussed, as well as advantages and disadvantages in comparison with other analytical methods.
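As background to the quantification step that distinguishes qPCR from end-point PCR, the sketch below converts a measured quantification cycle (Cq) into a starting copy number via a standard curve; the slope and intercept values are hypothetical examples, not from the review.

    def copies_from_cq(cq, slope, intercept):
        """Invert a qPCR standard curve Cq = slope*log10(N0) + intercept."""
        return 10 ** ((cq - intercept) / slope)

    def amplification_efficiency(slope):
        """Efficiency from the standard-curve slope; E = 10^(-1/slope) - 1."""
        return 10 ** (-1.0 / slope) - 1.0

    # Hypothetical curve: slope near the theoretical -3.32 (100% efficiency).
    slope, intercept = -3.36, 38.5
    print(f"starting copies: {copies_from_cq(24.8, slope, intercept):.3e}")
    print(f"efficiency: {amplification_efficiency(slope):.1%}")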
Luber, Florian; Demmel, Anja; Hosken, Anne; Busch, Ulrich; Engel, Karl-Heinz
2012-06-13
The confectionery ingredient marzipan is exclusively prepared from almond kernels and sugar. The potential use of apricot kernels, so-called persipan, is an important issue for the quality assessment of marzipan. Therefore, a ligation-dependent probe amplification (LPA) assay was developed that enables a specific and sensitive detection of apricot DNA, as an indicator for the presence of persipan. The limit of detection was determined to be 0.1% persipan in marzipan. The suitability of the method was confirmed by the analysis of 20 commercially available food samples. The integration of a Prunus-specific probe in the LPA assay as a reference allowed for the relative quantitation of persipan in marzipan. The limit of quantitation was determined to be 0.5% persipan in marzipan. The analysis of two self-prepared mixtures of marzipan and persipan demonstrated the applicability of the quantitation method at concentration levels of practical relevance for quality control.
Isleroglu, Hilal; Kemerli, Tansel; Özdestan, Özgül; Uren, Ali; Kaymak-Ertekin, Figen
2014-09-01
The aim of this study was to evaluate the effect of a steam-assisted hybrid oven cooking method, in comparison with convection ovens (natural and forced), on quality characteristics (color, hardness, cooking loss, soluble protein content, fat retention, and formation of heterocyclic aromatic amines) of chicken patties. The cooking experiments of chicken patties (n = 648) were conducted at oven temperatures of 180, 210, and 240°C until 3 different end point temperatures (75, 90, and 100°C) were reached. Steam-assisted hybrid oven cooking enabled faster cooking than convection ovens and resulted in chicken patties having lower a* and higher L* values, lower hardness, lower fat and soluble protein content (P < 0.05), and higher cooking loss than convection ovens. The steam-assisted hybrid oven could reduce the formation of heterocyclic aromatic amines, which have mutagenic and carcinogenic effects on humans. © 2014 Poultry Science Association Inc.
Freestanding films of crosslinked gold nanoparticles prepared via layer-by-layer spin-coating.
Schlicke, Hendrik; Schröder, Jan H; Trebbin, Martin; Petrov, Alexey; Ijeh, Michael; Weller, Horst; Vossmeyer, Tobias
2011-07-29
A new, extremely efficient method for the fabrication of films comprised of gold nanoparticles (GNPs) crosslinked by organic dithiols is presented in this paper. The method is based on layer-by-layer spin-coating of both components, GNPs and crosslinker, and enables the deposition of films several tens of nanometers in thickness within a few minutes. X-ray diffraction and conductance measurements reveal that proper adjustment of the crosslinker solution concentration is critical in order to prevent the destabilization and coalescence of particles. UV/vis spectroscopy, atomic force microscopy, and conductivity measurements indicate that films prepared via layer-by-layer spin-coating are of comparable quality to coatings prepared via laborious layer-by-layer self-assembly using immersion baths. Because spin-coated films are not bound chemically to the substrate, they can be lifted off by alkaline underetching and transferred onto 3d-electrodes to produce electrically addressable, freely suspended films. Comparative measurements of the sheet resistances indicate that the transfer process does not compromise the film quality.
Fast imaging of live organisms with sculpted light sheets
NASA Astrophysics Data System (ADS)
Chmielewski, Aleksander K.; Kyrsting, Anders; Mahou, Pierre; Wayland, Matthew T.; Muresan, Leila; Evers, Jan Felix; Kaminski, Clemens F.
2015-04-01
Light-sheet microscopy is an increasingly popular technique in the life sciences due to its fast 3D imaging capability of fluorescent samples with low phototoxicity compared to confocal methods. In this work we present a new, fast, flexible and simple-to-implement method to optimize the illumination light-sheet to the requirement at hand. A telescope composed of two electrically tuneable lenses enables us to define the thickness and position of the light-sheet independently but accurately within milliseconds, and therefore to optimize the image quality of the features of interest interactively. We demonstrated the practical benefit of this technique by (1) assembling large fields of view from tiled single exposures, each with individually optimized illumination settings, and (2) sculpting the light-sheet to trace complex sample shapes within single exposures. This technique proved compatible with confocal line scanning detection, further improving image contrast and resolution. Finally, we determined the effect of light-sheet optimization in the context of scattering tissue, devising procedures for balancing image quality, field of view and acquisition speed.
Cost-effective rapid prototyping and assembly of poly(methyl methacrylate) microfluidic devices.
Matellan, Carlos; Del Río Hernández, Armando E
2018-05-03
The difficulty in translating conventional microfluidics from laboratory prototypes to commercial products has shifted research efforts towards thermoplastic materials for their higher translational potential and amenability to industrial manufacturing. Here, we present an accessible method to fabricate and assemble poly(methyl methacrylate) (PMMA) microfluidic devices in a "mask-less" and cost-effective manner that can be applied to manufacture a wide range of designs due to its versatility. Laser micromachining offers high flexibility in channel dimensions and morphology by controlling the laser properties, while our two-step surface treatment, based on exposure to acetone vapour and low-temperature annealing, enables improvement of the surface quality without deformation of the device. Finally, we demonstrate a capillarity-driven adhesive delivery bonding method that can produce an effective seal between PMMA devices and a variety of substrates, including glass, silicon and LiNbO3. We illustrate the potential of this technique with two microfluidic devices, an H-filter and a droplet generator. The technique proposed here offers a low entry barrier for the rapid prototyping of thermoplastic microfluidics, enabling iterative design for laboratories without access to conventional microfabrication equipment.
NASA Astrophysics Data System (ADS)
Wyjadłowski, Marek
2017-12-01
The constant development of geotechnical technologies imposes the need for monitoring techniques that ensure proper quality and safe execution of geotechnical works. Several monitoring methods enable the preliminary design of the work process and the ongoing control of hydrotechnical works (pile driving, sheet piling, ground improvement methods). Wave parameter measurements and/or continuous histogram recording of shocks and vibrations, and of their dynamic impact on engineering structures in the close vicinity of the building site, enable the modification of technology parameters such as vibrator frequency or hammer drop height. Many examples of practical applications have already been published and provide a basis for the formulation of guidelines for work on subsequent sites. In the current work, the author's experience gained during sheet piling works for the reconstruction of the City Channel in Wrocław (Poland) is presented. The examples chosen describe ways of proceeding in the case of new and old residential buildings where concrete or masonry walls were exposed to vibrations, and in the case of hydrotechnical structures (sluices, bridges).
Owlia, P; Vasei, M; Goliaei, B; Nassiri, I
2011-04-01
Interest in the journal impact factor (JIF) within scientific communities has grown over the last decades. JIFs are used to evaluate the quality of journals and the papers published therein. The JIF is a discipline-specific measure, and comparison between JIFs dedicated to different disciplines is inadequate unless a normalization process is performed. In this study, the normalized impact factor (NIF) was introduced as a relatively simple method enabling JIFs to be used when evaluating the quality of journals and research works in different disciplines. The NIF index was established based on the multiplication of the JIF by a discipline-specific constant factor. The constants were calculated for all 54 disciplines of the biomedical field for 2005, 2006, 2007, 2008 and 2009. Also, rankings of 393 journals in different biomedical disciplines according to the NIF and the JIF were compared to illustrate how the NIF index can be used for the evaluation of publications in different disciplines. The findings show that use of the NIF enhances equality in assessing the quality of research works produced by researchers who work in different disciplines. Copyright © 2010 Elsevier Inc. All rights reserved.
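The abstract defines the NIF only as the JIF multiplied by a discipline-specific constant. The sketch below assumes one common way to construct such a constant (scaling each discipline so its mean JIF maps onto a shared reference level); this construction and all numbers are assumptions for illustration, not necessarily the authors' formula.

    from statistics import mean

    # Hypothetical discipline mean JIFs and journal JIFs (illustrative values).
    discipline_mean_jif = {"microbiology": 3.2, "physiology": 2.4}
    journals = [("J. Micro. A", "microbiology", 4.0),
                ("J. Physio. B", "physiology", 3.0)]

    reference = mean(discipline_mean_jif.values())  # common reference level

    for name, discipline, jif in journals:
        k = reference / discipline_mean_jif[discipline]  # discipline constant
        nif = jif * k                                    # NIF = JIF * constant
        print(f"{name}: JIF={jif:.2f} NIF={nif:.2f}")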
Ramos, Rommel Thiago Jucá; Carneiro, Adriana Ribeiro; Soares, Siomar de Castro; dos Santos, Anderson Rodrigues; Almeida, Sintia; Guimarães, Luis; Figueira, Flávia; Barbosa, Eudes; Tauch, Andreas; Azevedo, Vasco; Silva, Artur
2013-03-01
New sequencing platforms have enabled rapid decoding of complete prokaryotic genomes at relatively low cost. The Ion Torrent platform is an example of these technologies, characterized by lower coverage, generating challenges for genome assembly. One particular problem is the lack of genomes that enable reference-based assembly, such as the one used in the present study, Corynebacterium pseudotuberculosis biovar equi, which causes high economic losses in the US equine industry. The quality treatment strategy incorporated into the assembly pipeline enabled a 16-fold greater use of the sequencing data obtained compared with traditional quality filter approaches. Data preprocessing prior to the de novo assembly enabled the use of known methodologies in next-generation sequencing data assembly. Moreover, manual curation proved essential for ensuring a quality assembly, which was validated by comparative genomics with other species of the genus Corynebacterium. This study presents a modus operandi that enables a greater and better use of data obtained from semiconductor sequencing for obtaining the complete genome of a prokaryotic microorganism, C. pseudotuberculosis, which is not a traditional biological model such as Escherichia coli. © 2012 The Authors. Published by Society for Applied Microbiology and Blackwell Publishing Ltd. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
Corbel, Michael J; Das, Rose Gaines; Lei, Dianliang; Xing, Dorothy K L; Horiuchi, Yoshinobu; Dobbelaer, Roland
2008-04-07
This report reflects the discussion and conclusions of a WHO group of experts from National Regulatory Authorities (NRAs), National Control Laboratories (NCLs), vaccine industries and other relevant institutions involved in the standardization and control of diphtheria, tetanus and pertussis vaccines (DTP), held on 20-21 July 2006 and 28-30 March 2007 in Geneva, Switzerland, for the revision of the WHO Manual for quality control of DTP vaccines. Taking into account recent developments and standardization in quality control methods and the revision of the WHO recommendations for D, T, P vaccines, a need for updating the manual has been recognized. In these two meetings, the current situation of quality control methods in terms of potency, safety and identity tests for DTP vaccines, and the statistical analysis of data, were reviewed. Based on the WHO recommendations and recent validation of testing methods, the content of the current manual was reviewed and discussed. The group agreed that the principles to be observed in selecting methods included identifying those critical for assuring safety, efficacy and quality, and those consistent with WHO recommendations/requirements. Methods that are well recognized but not yet included in the current Recommendations should also be taken into account. These would include in vivo and/or in vitro methods for determining potency, safety testing and identity. The statistical analysis of the data should be revised and updated. It was noted that mouse-based assays for toxoid potency were still quite widely used, and it was desirable to establish appropriate standards for these to enable the results to be related to the standard guinea pig assays. The working group met again to review the first drafts and to provide further suggestions or amendments to the contributions of the drafting groups. The revised manual was to be finalized and published by WHO.
Sader, John E; Friend, James R
2015-05-01
The overall precision of the simplified calibration method in J. E. Sader et al., Rev. Sci. Instrum. 83, 103705 (2012), Sec. III D, is dominated by the uncertainty in the spring constant of the reference cantilever. The question arises: how does one take measurements from multiple reference cantilevers, and combine these results, to improve the uncertainty of the reference cantilever's spring constant and hence the overall precision of the method? This question is addressed in this note. Its answer enables manufacturers to specify a single set of data for the spring constant, resonant frequency, and quality factor from measurements on multiple reference cantilevers. With this data set, users can trivially calibrate cantilevers of the same type.
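The abstract does not reproduce the note's formulas, but the textbook route to combining independent measurements of the same quantity is an inverse-variance weighted mean; the sketch below is written under that assumption and is not the note's prescription.

    import numpy as np

    def combine_spring_constants(k_values, k_uncertainties):
        """Inverse-variance weighted mean of spring constants (N/m).

        Returns the combined estimate and its standard uncertainty:
        k = sum(k_i/s_i^2)/sum(1/s_i^2),  s = sqrt(1/sum(1/s_i^2)).
        """
        k = np.asarray(k_values, dtype=float)
        sigma = np.asarray(k_uncertainties, dtype=float)
        weights = 1.0 / sigma ** 2
        k_combined = np.sum(weights * k) / np.sum(weights)
        sigma_combined = np.sqrt(1.0 / np.sum(weights))
        return k_combined, sigma_combined

    # Three reference cantilevers of the same type (illustrative values).
    print(combine_spring_constants([0.95, 1.02, 0.99], [0.05, 0.04, 0.06]))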
Laser capture microdissection to study flower morphogenesis
NASA Astrophysics Data System (ADS)
Pawełkowicz, Magdalena Ewa; Skarzyńska, Agnieszka; Kowalczuk, Cezary; Pląder, Wojciech; Przybecki, Zbigniew
2017-08-01
Laser Capture Microdissection (LCM) is a microscopic sample preparation method that enables isolation of a cell or cell population of interest from human, animal or plant tissue. This technique allows a pure sample to be obtained from a heterogeneous mixture. From isolated cells, it is possible to obtain material of the appropriate quality for genomic research in transcriptomics, proteomics and metabolomics. We used the LCM method to study flower morphogenesis and the organization and development of specific bud organs. Gene expression levels in developing flower buds of male (B10) and female (2gg) lines were analyzed with qPCR. Expression was checked for stamen and carpel primordia obtained with LCM and for whole flower buds at successive stages of growth.
Methods to increase the rate of mass transfer during osmotic dehydration of foods.
Chwastek, Anna
2014-01-01
Traditional methods of food preservation such as freezing, freeze drying (lyophilization), vacuum drying and convection drying are often supplemented by new technologies that enable the production of high-quality products. Osmotic dehydration is used more and more often during the processing of fruits and vegetables. This method allows good organoleptic and functional properties to be maintained in the finished product. Obtaining the desired degree of dehydration, or of saturation of the material with an osmoactive substance, often requires extended processing times or the use of high temperatures. In recent years much attention has been devoted to techniques aimed at increasing the mass transfer between the dehydrated material and the hypertonic solution. This work reviews the literature on methods of streamlining the process of osmotic dehydration, which include the use of: ultrasound, high hydrostatic pressure, vacuum osmotic dehydration and pulsed electric field.
Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model
ERIC Educational Resources Information Center
Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep
2016-01-01
Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose graph theoretic approach for quantification of quality-enabled…
Non-Academic Service Quality: Comparative Analysis of Students and Faculty as Users
ERIC Educational Resources Information Center
Sharif, Khurram; Kassim, Norizan Mohd
2012-01-01
The research focus was a non-academic service quality assessment within higher education. In particular, non-academic service quality perceptions of faculty and students were evaluated using a service profit chain. This enabled a comparison which helped understanding of non-academic service quality orientation from a key users' perspective. Data…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Divan, Deepak; Brumsickle, William; Eto, Joseph
2003-04-01
This report describes a new approach for collecting information on power quality and reliability and making it available in the public domain. Making this information readily available in a form that is meaningful to electricity consumers is necessary for enabling more informed private and public decisions regarding electricity reliability. The system dramatically reduces the cost (and expertise) needed for customers to obtain information on the most significant power quality events, called voltage sags and interruptions. The system also offers widespread access to information on power quality collected from multiple sites and the potential for capturing information on the impacts of power quality problems, together enabling a wide variety of analysis and benchmarking to improve system reliability. Six case studies demonstrate selected functionality and capabilities of the system, including: linking measured power quality events to process interruption and downtime; demonstrating the ability to correlate events recorded by multiple monitors to narrow and confirm the causes of power quality events; and benchmarking power quality and reliability on a firm and regional basis.
NASA Astrophysics Data System (ADS)
Karam, Lina J.; Zhu, Tong
2015-03-01
The varying quality of face images is an important challenge that limits the effectiveness of face recognition technology when applied in real-world applications. Existing face image databases do not consider the effect of distortions that commonly occur in real-world environments. This database (QLFW) represents an initial attempt to provide a set of labeled face images spanning the wide range of quality, from no perceived impairment to strong perceived impairment for face detection and face recognition applications. Types of impairment include JPEG2000 compression, JPEG compression, additive white noise, Gaussian blur and contrast change. Subjective experiments are conducted to assess the perceived visual quality of faces under different levels and types of distortions and also to assess the human recognition performance under the considered distortions. One goal of this work is to enable automated performance evaluation of face recognition technologies in the presence of different types and levels of visual distortions. This will consequently enable the development of face recognition systems that can operate reliably on real-world visual content in the presence of real-world visual distortions. Another goal is to enable the development and assessment of visual quality metrics for face images and for face detection and recognition applications.
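To make the considered impairment types concrete, the sketch below generates the listed distortions (JPEG compression, additive white Gaussian noise, Gaussian blur, contrast change) for an image using Pillow and NumPy. The distortion levels and the stand-in image are arbitrary examples, not the levels or data used in QLFW.

    import io
    import numpy as np
    from PIL import Image, ImageEnhance, ImageFilter

    def distort(face, kind, level):
        """Return a distorted copy of a PIL image for one impairment type."""
        if kind == "jpeg":                      # compression artifacts
            buf = io.BytesIO()
            face.save(buf, format="JPEG", quality=level)
            buf.seek(0)
            return Image.open(buf)
        if kind == "awgn":                      # additive white Gaussian noise
            arr = np.asarray(face, dtype=float)
            arr += np.random.default_rng(0).normal(0.0, level, arr.shape)
            return Image.fromarray(np.clip(arr, 0, 255).astype(np.uint8))
        if kind == "blur":                      # Gaussian blur
            return face.filter(ImageFilter.GaussianBlur(radius=level))
        if kind == "contrast":                  # contrast change
            return ImageEnhance.Contrast(face).enhance(level)
        raise ValueError(kind)

    face = Image.new("RGB", (128, 128), "gray")  # stand-in for a face image
    blurred = distort(face, "blur", 2.5)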
Kim, Tae Hyung; Setsompop, Kawin; Haldar, Justin P
2017-03-01
Parallel imaging and partial Fourier acquisition are two classical approaches for accelerated MRI. Methods that combine these approaches often rely on prior knowledge of the image phase, but the need to obtain this prior information can place practical restrictions on the data acquisition strategy. In this work, we propose and evaluate SENSE-LORAKS, which enables combined parallel imaging and partial Fourier reconstruction without requiring prior phase information. The proposed formulation is based on combining the classical SENSE model for parallel imaging data with the more recent LORAKS framework for MR image reconstruction using low-rank matrix modeling. Previous LORAKS-based methods have successfully enabled calibrationless partial Fourier parallel MRI reconstruction, but have been most successful with nonuniform sampling strategies that may be hard to implement for certain applications. By combining LORAKS with SENSE, we enable highly accelerated partial Fourier MRI reconstruction for a broader range of sampling trajectories, including widely used calibrationless uniformly undersampled trajectories. Our empirical results with retrospectively undersampled datasets indicate that when SENSE-LORAKS reconstruction is combined with an appropriate k-space sampling trajectory, it can provide substantially better image quality at high-acceleration rates relative to existing state-of-the-art reconstruction approaches. The SENSE-LORAKS framework provides promising new opportunities for highly accelerated MRI. Magn Reson Med 77:1021-1035, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
NASA Technical Reports Server (NTRS)
Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.;
2007-01-01
The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field-of-view imaging interferometry using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have provided simple demonstrations of the methodology already, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed; operation in an improved environment; acquisition of large, high-quality datasets; development of image reconstruction algorithms; and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.
Xiao, Xia; Feng, Ya-Ping; Du, Bin; Sun, Han-Ru; Ding, You-Quan; Qi, Jian-Guo
2017-03-01
Fluorescent immunolabeling and imaging in free-floating thick (50-60 μm) tissue sections is relatively simple in practice and enables design-based non-biased stereology, or 3-D reconstruction and analysis. This method is widely used for 3-D in situ quantitative biology in many areas of biological research. However, the labeling quality and efficiency of standard protocols for fluorescent immunolabeling of these tissue sections are not always satisfactory. Here, we systematically evaluate the effects of raising the conventional antibody incubation temperatures (4°C or 21°C) to mammalian body temperature (37°C) in these protocols. Our modification significantly enhances the quality (labeling sensitivity, specificity, and homogeneity) and efficiency (antibody concentration and antibody incubation duration) of fluorescent immunolabeling of free-floating thick tissue sections.
Evolution of microbiological analytical methods for dairy industry needs
Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence
2014-01-01
Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus, offer new perspectives to integration of microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that Polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675
Reconstructing Spatial Distributions from Anonymized Locations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horey, James L; Forrest, Stephanie; Groat, Michael
2012-01-01
Devices such as mobile phones, tablets, and sensors are often equipped with GPS that accurately report a person's location. Combined with wireless communication, these devices enable a wide range of new social tools and applications. These same qualities, however, leave location-aware applications vulnerable to privacy violations. This paper introduces the Negative Quad Tree, a privacy protection method for location-aware applications. The method is broadly applicable to applications that use spatial density information, such as social applications that measure the popularity of social venues. The method employs a simple anonymization algorithm running on mobile devices, and a more complex reconstruction algorithm on a central server. This strategy is well suited to low-powered mobile devices. The paper analyzes the accuracy of the reconstruction method in a variety of simulated and real-world settings and demonstrates that the method is accurate enough to be used in many real-world scenarios.
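The NQT algorithm itself is not specified in the abstract. As generic background only, the sketch below shows how a plain quadtree aggregates point locations into cell counts for spatial-density queries, so that devices report coarse cells rather than raw coordinates; this illustrates the underlying data structure, not the paper's anonymization or reconstruction method.

    from collections import Counter

    def quad_cell(lon, lat, depth, west=-180.0, east=180.0, south=-90.0, north=90.0):
        """Return the quadtree cell path (string of quadrant digits) for a point."""
        path = []
        for _ in range(depth):
            mid_lon, mid_lat = (west + east) / 2.0, (south + north) / 2.0
            q = (lon >= mid_lon) + 2 * (lat >= mid_lat)   # quadrant index 0..3
            path.append(str(q))
            west, east = (mid_lon, east) if lon >= mid_lon else (west, mid_lon)
            south, north = (mid_lat, north) if lat >= mid_lat else (south, mid_lat)
        return "".join(path)

    # Density map: devices report only their depth-6 cell, never raw coordinates.
    reports = [(-106.6, 35.1), (-106.7, 35.0), (2.35, 48.85)]
    density = Counter(quad_cell(lon, lat, depth=6) for lon, lat in reports)
    print(density)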
Duque-Ramos, Astrid; Quesada-Martínez, Manuel; Iniesta-Moreno, Miguela; Fernández-Breis, Jesualdo Tomás; Stevens, Robert
2016-10-17
The biomedical community has now developed a significant number of ontologies. The curation of biomedical ontologies is a complex task, and biomedical ontologies evolve rapidly, so new versions are regularly and frequently published in ontology repositories. This has the implication of there being a high number of ontology versions over a short time span. Given this level of activity, ontology designers need to be supported in the effective management of the evolution of biomedical ontologies, as the different changes may affect the engineering and quality of the ontology. This is why there is a need for methods that contribute to the analysis of the effects of changes and evolution of ontologies. In this paper we approach this issue from the ontology quality perspective. In previous work we have developed an ontology evaluation framework based on quantitative metrics, called OQuaRE. Here, OQuaRE is used as a core component in a method that enables the analysis of the different versions of biomedical ontologies using the quality dimensions included in OQuaRE. Moreover, we describe and use two scales for evaluating the changes between the versions of a given ontology. The first is the static scale used in OQuaRE, and the second is a new, dynamic scale based on the observed values of the quality metrics of a corpus defined by all the versions of a given ontology (its life-cycle). In this work we explain how OQuaRE can be adapted for understanding the evolution of ontologies. Its use has been illustrated with the ontology of bioinformatics operations, types of data, formats, and topics (EDAM). The two scales included in OQuaRE provide complementary information about the evolution of the ontologies. The application of the static scale, which is the original OQuaRE scale, to the versions of the EDAM ontology reveals a design based on good ontological engineering principles. The application of the dynamic scale has enabled a more detailed analysis of the evolution of the ontology, measured through differences between versions. The statistics of change based on the OQuaRE quality scores make it possible to identify key versions where changes in the engineering of the ontology triggered a change from the OQuaRE quality perspective. In the case of EDAM, comparative analyses between pairs of consecutive versions identified the fifth version of the ontology as having the largest impact on its quality metrics.
Verbist, Bie; Clement, Lieven; Reumers, Joke; Thys, Kim; Vapirev, Alexander; Talloen, Willem; Wetzels, Yves; Meys, Joris; Aerssens, Jeroen; Bijnens, Luc; Thas, Olivier
2015-02-22
Deep sequencing allows for an in-depth characterization of sequence variation in complex populations. However, technology-associated errors may impede a powerful assessment of low-frequency mutations. Fortunately, base calls are complemented with quality scores, which for Illumina sequencing are derived from a quadruplet of intensities, one channel for each nucleotide type. The highest intensity of the four channels determines the base that is called. Mismatch bases can often be corrected by the second best base, i.e., the base with the second highest intensity in the quadruplet. A virus variant model-based clustering method, ViVaMBC, is presented that explores quality scores and second best base calls for identifying and quantifying viral variants. ViVaMBC is optimized to call variants at the codon level (nucleotide triplets), which enables immediate biological interpretation of the variants with respect to their antiviral drug responses. Using mixtures of HCV plasmids, we show that our method accurately estimates frequencies down to 0.5%. The estimates are unbiased when average coverages of 25,000 are reached. A comparison with the SNP callers V-Phaser2, ShoRAH, and LoFreq shows that ViVaMBC has superb sensitivity and specificity for variants with frequencies above 0.4%. Unlike the competitors, ViVaMBC reports a higher number of false-positive findings with frequencies below 0.4%, which might partially originate from picking up artificial variants introduced by errors in the sample and library preparation steps. ViVaMBC is the first method to call viral variants directly at the codon level. The strength of the approach lies in modeling the error probabilities based on the quality scores. Although the use of second best base calls appeared very promising in our data exploration phase, their utility was limited. They provided a slight increase in sensitivity, which, however, does not warrant the additional computational cost of running the offline base caller. Apparently a lot of information is already contained in the quality scores, enabling the model-based clustering procedure to adjust for the majority of the sequencing errors. Overall, the sensitivity of ViVaMBC is such that technical constraints like PCR errors start to form the bottleneck for low-frequency variant detection.
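The key modeling ingredient, turning a Phred-style quality score into an error probability, follows p = 10^(-Q/10). The sketch below uses that relation to ask whether an observed variant count exceeds what sequencing error alone would explain; the binomial test is a simplified stand-in for illustration, not ViVaMBC's model-based clustering.

    from scipy.stats import binomtest

    def phred_error_prob(q):
        """Per-base error probability from a Phred quality score."""
        return 10.0 ** (-q / 10.0)

    def variant_above_noise(alt_count, coverage, mean_quality, alpha=0.05):
        """Crude check: is the alt-allele count larger than expected from error?"""
        p_err = phred_error_prob(mean_quality)
        # A miscall can produce any of 3 wrong bases; roughly 1/3 hit the alt allele.
        result = binomtest(alt_count, coverage, p_err / 3.0, alternative="greater")
        return result.pvalue < alpha

    print(phred_error_prob(30))                      # Q30 -> 0.001
    print(variant_above_noise(125, 25000, 30))       # a 0.5% variant at 25,000x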
Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will
2016-01-01
With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish a workflow for, and to demonstrate, a unique set of web application tools linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.
Morgan, Alison; Jimenez Soto, Eliana; Bhandari, Gajananda; Kermode, Michelle
2014-12-01
In Nepal, where difficult geography and an under-resourced health system contribute to poor health care access, the government has increased the number of trained skilled birth attendants (SBAs) and posted them in newly constructed birthing centres attached to peripheral health facilities that are available to women 24 h a day. This study describes their views on their enabling environment. Qualitative methods included semi-structured interviews with 22 SBAs within Palpa district, a hill district in the Western Region of Nepal; a focus group discussion with ten SBA trainees, and in-depth interviews with five key informants. Participants identified the essential components of an enabling environment as: relevant training; ongoing professional support; adequate infrastructure, equipment and drugs; and timely referral pathways. All SBAs who practised alone felt unable to manage obstetric complications because quality management of life-threatening complications requires the attention of more than one SBA. Maternal health guidelines should account for the provision of an enabling environment in addition to the deployment of SBAs. In Nepal, referral systems require strengthening, and the policy of posting SBAs alone, in remote clinics, needs to be reconsidered to achieve the goal of reducing maternal deaths through timely management of obstetric complications. © 2014 John Wiley & Sons Ltd.
Advanced materials for aircraft engine applications.
Backman, D G; Williams, J C
1992-02-28
A review of advances for aircraft engine structural materials and processes is presented. Improved materials, such as superalloys, and the processes for making turbine disks and blades have had a major impact on the capability of modern gas turbine engines. New structural materials, notably composites and intermetallic materials, are emerging that will eventually further enhance engine performance, reduce engine weight, and thereby enable new aircraft systems. In the future, successful aerospace manufacturers will combine product design and materials excellence with improved manufacturing methods to increase production efficiency, enhance product quality, and decrease the engine development cycle time.
The New Tropospheric Product of the International GNSS Service
NASA Technical Reports Server (NTRS)
Byun, Sung H.; Bar-Sever, Yoaz E.; Gendt, Gerd
2005-01-01
We compare this new approach for generating the IGS tropospheric products with the previous approach, which was based on explicit combination of total zenith delay contributions from the IGS ACs. The new approach enables the IGS to rapidly generate highly accurate and highly reliable total zenith delay time series for many hundreds of sites, thus increasing the utility of the products to weather modelers, climatologists, and GPS analysts. In this paper we describe this new method, discuss issues of accuracy, quality control and the utility of the new products, and assess their benefits.
Oxidation of GaAs substrates to enable β-Ga2O3 films for sensors and optoelectronic devices
NASA Astrophysics Data System (ADS)
Mao, Howard; Alhalaili, Badriyah; Kaya, Ahmet; Dryden, Daniel M.; Woodall, Jerry M.; Islam, M. Saif
2017-08-01
A very simple and inexpensive method for growing β-Ga2O3 films by heating GaAs wafers at high temperature in a furnace was found to yield large-area, high-quality β-Ga2O3 nanoscale thin films as well as nanowires, depending on the growth conditions. We present the material characterization results, including the optical band gap, the Schottky barrier height with metal (gold), and the field ionization and photoconductance of the β-Ga2O3 films and nanowires.
Instrumental and atmospheric background lines observed by the SMM gamma-ray spectrometer
NASA Technical Reports Server (NTRS)
Share, G. H.; Kinzer, R. L.; Strickman, M. S.; Letaw, J. R.; Chupp, E. L.
1989-01-01
Preliminary identifications of instrumental and atmospheric background lines detected by the gamma-ray spectrometer on NASA's Solar Maximum Mission satellite (SMM) are presented. The long-term, stable operation of this experiment has provided data of high quality for use in this analysis. Methods for identifying radioactive isotopes based on their different decay times are described. Temporal evolution of the features is revealed by spectral comparisons, subtractions, and fits. An understanding of these temporal variations has enabled the data to be used for detecting celestial gamma-ray sources.
Tran Thi, Thu Nhi; Morse, J.; Caliste, D.; Fernandez, B.; Eon, D.; Härtwig, J.; Mer-Calfati, C.; Tranchant, N.; Arnault, J. C.; Lafford, T. A.; Baruchel, J.
2017-01-01
Bragg diffraction imaging enables the quality of synthetic single-crystal diamond substrates and their overgrown, mostly doped, diamond layers to be characterized. This is very important for improving diamond-based devices produced for X-ray optics and power electronics applications. The usual first step for this characterization is white-beam X-ray diffraction topography, which is a simple and fast method to identify the extended defects (dislocations, growth sectors, boundaries, stacking faults, overall curvature etc.) within the crystal. This allows easy and quick comparison of the crystal quality of diamond plates available from various commercial suppliers. When needed, rocking curve imaging (RCI) is also employed, which is the quantitative counterpart of monochromatic Bragg diffraction imaging. RCI enables the local determination of both the effective misorientation, which results from lattice parameter variation and the local lattice tilt, and the local Bragg position. Maps derived from these parameters are used to measure the magnitude of the distortions associated with polishing damage and the depth of this damage within the volume of the crystal. For overgrown layers, these maps also reveal the distortion induced by the incorporation of impurities such as boron, or the lattice parameter variations associated with the presence of growth-incorporated nitrogen. These techniques are described, and their capabilities for studying the quality of diamond substrates and overgrown layers, and the surface damage caused by mechanical polishing, are illustrated by examples. PMID:28381981
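As a schematic of the rocking curve imaging analysis mentioned above, the sketch below reduces a stack of monochromatic topographs (one image per rocking angle) to per-pixel maps of the local Bragg position (intensity-weighted centroid) and rocking-curve width. This is a generic centroid analysis assumed for illustration, not the authors' software.

    import numpy as np

    def rocking_curve_maps(stack, angles):
        """Per-pixel local Bragg position and RMS width from an RCI stack.

        stack  : array (n_angles, ny, nx) of diffracted-intensity images
        angles : array (n_angles,) of rocking angles
        """
        stack = np.asarray(stack, dtype=float)
        angles = np.asarray(angles, dtype=float)[:, None, None]
        total = stack.sum(axis=0)
        total[total == 0] = np.nan                        # avoid division by zero
        centroid = (stack * angles).sum(axis=0) / total   # local Bragg position
        width = np.sqrt((stack * (angles - centroid) ** 2).sum(axis=0) / total)
        return centroid, width

    angles = np.linspace(-10e-6, 10e-6, 21)     # rad, illustrative rocking scan
    stack = np.random.rand(21, 64, 64)          # stand-in topograph stack
    bragg_map, width_map = rocking_curve_maps(stack, angles)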
NASA Astrophysics Data System (ADS)
Gorczynska, Iwona; Migacz, Justin; Zawadzki, Robert J.; Sudheendran, Narendran; Jian, Yifan; Tiruveedhula, Pavan K.; Roorda, Austin; Werner, John S.
2015-07-01
We tested and compared the capability of multiple optical coherence tomography (OCT) angiography methods: phase variance, amplitude decorrelation and speckle variance, with application of the split spectrum technique, to image the chorioretinal complex of the human eye. To test whether OCT imaging stability could be improved, we utilized a real-time tracking scanning laser ophthalmoscopy (TSLO) system combined with a swept source OCT setup. In addition, we implemented a post-processing volume averaging method for improved angiographic image quality and reduction of motion artifacts. The OCT system operated at a central wavelength of 1040 nm to enable sufficient depth penetration into the choroid. Imaging was performed in the eyes of healthy volunteers and patients diagnosed with age-related macular degeneration.
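Of the angiography contrasts compared here, speckle variance is the simplest to state: the inter-frame intensity variance computed over N repeated B-scans acquired at the same position. A minimal sketch follows; the array shapes are assumptions for illustration.

    import numpy as np

    def speckle_variance(bscans):
        """Speckle-variance angiography contrast from repeated B-scans.

        bscans: array (n_repeats, depth, width) of OCT intensity B-scans
        acquired at the same position. Flowing blood decorrelates the speckle
        pattern between repeats, giving high variance; static tissue stays low.
        """
        b = np.asarray(bscans, dtype=float)
        return b.var(axis=0)

    repeats = np.random.rand(4, 512, 500)   # 4 repeated B-scans (stand-in data)
    sv_image = speckle_variance(repeats)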
Automated tumor volumetry using computer-aided image segmentation.
Gaonkar, Bilwaj; Macyszyn, Luke; Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A; Ali, Zarina S; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M; Davatzikos, Christos
2015-05-01
Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0-5 rating scale where 5 indicated perfect segmentation. The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
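The Dice measure of overlap used for the quantitative validation is a standard statistic; a minimal numpy sketch (the masks below are hypothetical, not the study's data):

```python
import numpy as np

def dice_overlap(seg_a: np.ndarray, seg_b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentation masks:
    2|A ∩ B| / (|A| + |B|); 1.0 for identical masks, 0.0 for disjoint ones."""
    a, b = seg_a.astype(bool), seg_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return float(2.0 * np.logical_and(a, b).sum() / denom)

# hypothetical manual vs. semiautomatic masks
manual = np.zeros((64, 64), dtype=bool); manual[20:40, 20:40] = True
auto = np.zeros((64, 64), dtype=bool); auto[22:42, 22:42] = True
print(f"Dice = {dice_overlap(manual, auto):.3f}")
```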
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, B; Southern Medical University, Guangzhou, Guangdong; Tian, Z
Purpose: While compressed sensing-based cone-beam CT (CBCT) iterative reconstruction techniques have demonstrated a tremendous capability of reconstructing high-quality images from undersampled noisy data, their long computation time still hinders wide application in routine clinic. The purpose of this study is to develop a reconstruction framework that employs modern consensus optimization techniques to achieve CBCT reconstruction on a multi-GPU platform for improved computational efficiency. Methods: Total projection data were evenly distributed to multiple GPUs. Each GPU performed reconstruction using its own projection data with a conventional total variation regularization approach to ensure image quality. In addition, the solutions from the GPUs were subject to a consistency constraint that they should be identical. We solved the optimization problem with all the constraints considered rigorously using an alternating direction method of multipliers (ADMM) algorithm. The reconstruction framework was implemented using OpenCL on a platform with two Nvidia GTX590 GPU cards, each with two GPUs. We studied the performance of our method and demonstrated its advantages through a simulation case with an NCAT phantom and an experimental case with a Catphan phantom. Results: Compared with the CBCT images reconstructed using the conventional FDK method with full projection datasets, our proposed method achieved comparable image quality with about one third of the projections. The computation time on the multi-GPU platform was ∼55 s and ∼35 s in the two cases respectively, achieving a speedup factor of ∼3.0 compared with single-GPU reconstruction. Conclusion: We have developed a consensus ADMM-based CBCT reconstruction method which enables reconstruction on a multi-GPU platform. The achieved efficiency makes this method clinically attractive.
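The consensus structure of the algorithm, local reconstructions coupled by an agreement constraint, can be illustrated on a toy least-squares problem. A sketch of consensus ADMM with the data split across workers (the total-variation term and all GPU specifics are omitted; this is not the authors' implementation):

```python
import numpy as np

def consensus_admm_lstsq(A_blocks, b_blocks, rho=1.0, n_iter=100):
    """Consensus ADMM: each worker fits its own data block, subject to the
    constraint that all local solutions equal a shared consensus variable z."""
    n = A_blocks[0].shape[1]
    xs = [np.zeros(n) for _ in A_blocks]   # local solutions (one per GPU)
    us = [np.zeros(n) for _ in A_blocks]   # scaled dual variables
    z = np.zeros(n)
    for _ in range(n_iter):
        for i, (A, b) in enumerate(zip(A_blocks, b_blocks)):
            # local update: argmin 0.5||Ax - b||^2 + (rho/2)||x - z + u||^2
            xs[i] = np.linalg.solve(A.T @ A + rho * np.eye(n),
                                    A.T @ b + rho * (z - us[i]))
        # consensus update: average of local solutions plus duals
        z = np.mean([x + u for x, u in zip(xs, us)], axis=0)
        for i in range(len(us)):
            us[i] += xs[i] - z             # dual ascent step
    return z
```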
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, A; Stayman, J; Otake, Y
Purpose: To address the challenges of image quality, radiation dose, and reconstruction speed in intraoperative cone-beam CT (CBCT) for neurosurgery by combining model-based image reconstruction (MBIR) with accelerated algorithmic and computational methods. Methods: Preclinical studies involved a mobile C-arm for CBCT imaging of two anthropomorphic head phantoms that included simulated imaging targets (ventricles, soft-tissue structures/bleeds) and neurosurgical procedures (deep brain stimulation (DBS) electrode insertion) for assessment of image quality. The penalized likelihood (PL) framework was used for MBIR, incorporating a statistical model with image regularization via an edge-preserving penalty. To accelerate PL reconstruction, the ordered-subset, separable quadratic surrogates (OS-SQS) algorithm was modified to incorporate Nesterov's method and implemented on a multi-GPU system. A fair comparison of image quality between PL and conventional filtered backprojection (FBP) was performed by selecting reconstruction parameters that provided matched low-contrast spatial resolution. Results: CBCT images of the head phantoms demonstrated that PL reconstruction improved image quality (∼28% higher CNR) even at half the radiation dose (3.3 mGy) compared to FBP. A combination of Nesterov's method and fast projectors yielded a PL reconstruction run-time of 251 sec (cf., 5729 sec for OS-SQS, 13 sec for FBP). Insertion of a DBS electrode resulted in severe metal artifact streaks in FBP reconstructions, whereas PL was intrinsically robust against metal artifact. The combination of noise and artifact was reduced from 32.2 HU in FBP to 9.5 HU in PL, thereby providing better assessment of device placement and potential complications. Conclusion: The methods can be applied to intraoperative CBCT for guidance and verification of neurosurgical procedures (DBS electrode insertion, biopsy, tumor resection) and detection of complications (intracranial hemorrhage). Significant improvement in image quality, dose reduction, and a reconstruction time of ∼4 min will enable practical deployment of low-dose C-arm CBCT within the operating room. AAPM Research Seed Funding (2013-2014); NIH Fellowship F32EB017571; Siemens Healthcare (XP Division)
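Nesterov's method, the acceleration applied to OS-SQS above, adds an extrapolation step to plain gradient descent. A generic sketch on a smooth objective (not the paper's OS-SQS implementation):

```python
import numpy as np

def nesterov_descent(grad, x0, step, n_iter=200):
    """Gradient descent with Nesterov (FISTA-style) momentum."""
    x, v, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_next = v - step * grad(v)                        # step at extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))  # momentum schedule
        v = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation
        x, t = x_next, t_next
    return x

# usage on a hypothetical least-squares objective ||Ax - b||^2
rng = np.random.default_rng(0)
A, b = rng.normal(size=(20, 5)), rng.normal(size=20)
L = 2.0 * np.linalg.norm(A, 2) ** 2                        # gradient Lipschitz constant
x_hat = nesterov_descent(lambda x: 2.0 * A.T @ (A @ x - b), np.zeros(5), 1.0 / L)
```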
Parallel mutual information estimation for inferring gene regulatory networks on GPUs
2011-01-01
Background Mutual information is a measure of similarity between two variables. It has been widely used in various application domains including computational biology, machine learning, statistics, image processing, and financial computing. Previously used simple histogram-based mutual information estimators lack precision compared to kernel-based methods. The recently introduced B-spline function-based mutual information estimation method is competitive with the kernel-based methods in terms of quality, but at a lower computational complexity. Results We present a new approach to accelerate the B-spline function-based mutual information estimation algorithm with commodity graphics hardware. To derive an efficient mapping onto this type of architecture, we have used the Compute Unified Device Architecture (CUDA) programming model to design and implement a new parallel algorithm. Our implementation, called CUDA-MI, can achieve speedups of up to 82 using double precision on a single GPU compared to a multi-threaded implementation on a quad-core CPU for large microarray datasets. We have used the results obtained by CUDA-MI to infer gene regulatory networks (GRNs) from microarray data. The comparisons to existing methods including ARACNE and TINGe show that CUDA-MI produces GRNs of higher quality in less time. Conclusions CUDA-MI is publicly available open-source software, written in CUDA and C++ programming languages. It obtains significant speedup over the multi-threaded CPU implementation by fully exploiting the compute capability of commonly used CUDA-enabled low-cost GPUs. PMID:21672264
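The histogram estimator the paper improves upon is the plug-in baseline: bin both variables, form the joint distribution, and sum p log(p/(p_x p_y)). A numpy sketch (the B-spline method differs by spreading each sample over neighbouring bins with B-spline weights rather than a hard indicator):

```python
import numpy as np

def mutual_information_hist(x, y, bins=32):
    """Plug-in mutual information estimate from a 2-D histogram (in nats)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                      # joint probability table
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                          # avoid log(0) on empty bins
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))
```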
Cabral, E C; Sevart, L; Spindola, H M; Coelho, M B; Sousa, I M O; Queiroz, N C A; Foglio, M A; Eberlin, M N; Riveros, J M
2013-02-01
The oil obtained from Pterodon pubescens (Leguminosae) seeds is known to display anti-cancer, anti-dermatogenic and anti-nociceptive activity. Phytochemical studies have demonstrated that its main constituents are diterpenoids with voucapan skeletons. Considering the potential biological activities of the oil, rapid and efficient methods for assessing its quality would facilitate certification and quality control. To develop a direct mass spectrometric fingerprinting method for the P. pubescens seed oil that would focus on the major diterpenoid constituents, enabling quality control, origin certification and recognition of marker species in commercially available products. Two techniques were used: (i) direct infusion electrospray ionisation (ESI) mass spectrometry after solvent extraction and dilution and (ii) ambient desorption/ionisation via easy ambient sonic-spray ionisation, EASI(+)-MS, performed directly on the seed surface or on a paper surface imprinted with the oil. From a combination of ESI-MS, HRESI-MS and ESI-MS/MS data, 12 diterpenes were characterised, and typical profiles were obtained for the oil extract or the crude oil via both ESI-MS and EASI-MS. These techniques require no or very simple sample preparation protocols, and the whole analytical process with spectra acquisition takes just a few minutes. Both techniques, but particularly EASI-MS, provide simple, fast and efficient MS fingerprinting methodologies to characterise the P. pubescens oil, with typical (di)terpene profiles being applicable to quality control and certification of authenticity and origin. Copyright © 2012 John Wiley & Sons, Ltd.
Curtis, J. Randall; Tulsky, James A.
2018-01-01
Abstract Background: High-quality care for seriously ill patients aligns treatment with their goals and values. Failure to achieve “goal-concordant” care is a medical error that can harm patients and families. Because communication between clinicians and patients enables goal concordance and also affects the illness experience in its own right, healthcare systems should endeavor to measure communication and its outcomes as a quality assessment. Yet, little consensus exists on what should be measured and by which methods. Objectives: To propose measurement priorities for serious illness communication and its anticipated outcomes, including goal-concordant care. Methods: We completed a narrative review of the literature to identify links between serious illness communication, goal-concordant care, and other outcomes. We used this review to identify gaps and opportunities for quality measurement in serious illness communication. Results: Our conceptual model describes the relationship between communication, goal-concordant care, and other relevant outcomes. Implementation-ready measures to assess the quality of serious illness communication and care include (1) the timing and setting of serious illness communication, (2) patient experience of communication and care, and (3) caregiver bereavement surveys that include assessment of perceived goal concordance of care. Future measurement priorities include direct assessment of communication quality, prospective patient or family assessment of care concordance with goals, and assessment of the bereaved caregiver experience. Conclusion: Improving serious illness care necessitates ensuring that high-quality communication has occurred and measuring its impact. Measuring patient experience and receipt of goal-concordant care should be our highest priority. We have the tools to measure both. PMID:29091522
2014-01-01
Background In 2009, the Lebanese Ministry of Public Health (MOPH) launched the Primary Healthcare (PHC) accreditation program to improve quality across the continuum of care. The MOPH, with the support of Accreditation Canada, conducted the accreditation survey in 25 PHC centers in 2012. This paper aims to gain a better understanding of the impact of accreditation on quality of care as perceived by PHC staff members and directors; how accreditation affected staff and patient satisfaction; key enablers, challenges and strategies to improve implementation of accreditation in PHC. Methods The study was conducted in 25 PHC centers using a cross-sectional mixed methods approach; all staff members were surveyed using a self-administered questionnaire whereas semi-structured interviews were conducted with directors. Results The scales measuring Management and Leadership had the highest mean score followed by Accreditation Impact, Human Resource Utilization, and Customer Satisfaction. Regression analysis showed that Strategic Quality Planning, Customer Satisfaction and Staff Involvement were associated with a perception of higher Quality Results. Directors emphasized the benefits of accreditation with regards to documentation, reinforcement of quality standards, strengthened relationships between PHC centers and multiple stakeholders and improved staff and patient satisfaction. Challenges encountered included limited financial resources, poor infrastructure, and staff shortages. Conclusions To better respond to population health needs, accreditation is an important first step towards improving the quality of PHC delivery arrangement system. While there is a need to expand the implementation of accreditation to cover all PHC centers in Lebanon, considerations should be given to strengthening their financial arrangements as well. PMID:24568632
Health-Enabling and Ambient Assistive Technologies: Past, Present, Future
2016-01-01
Summary Background During the last decades, health-enabling and ambient assistive technologies became of considerable relevance for new informatics-based forms of diagnosis, prevention, and therapy. Objectives To describe the state of the art of health-enabling and ambient assistive technologies in 1992 and today, and its evolution over the last 25 years as well as to project where the field is expected to be in the next 25 years. In the context of this review, we define health-enabling and ambient assistive technologies as ambiently used sensor-based information and communication technologies, aiming at contributing to a person's health and health care as well as to her or his quality of life. Methods Systematic review of all original articles with research focus in all volumes of the IMIA Yearbook of Medical Informatics. Surveying authors independently on key projects and visions as well as on their lessons learned in the context of health-enabling and ambient assistive technologies and summarizing their answers. Surveying authors independently on their expectations for the future and summarizing their answers. Results IMIA Yearbook papers containing statements on health-enabling and ambient assistive technologies appear first in 2002. These papers form a minor part of published research articles in medical informatics. However, during recent years the number of articles published has increased significantly. Key projects were identified. There was a clear progress on the use of technologies. However, proof of diagnostic relevance and therapeutic efficacy remains still limited. Reforming health care processes and focussing more on patient needs are required. Conclusions Health-enabling and ambient assistive technologies remain an important field for future health care and for interdisciplinary research. More and more publications assume that a person's home and their interactions therein are becoming important components in health care provision, assessment, and management. PMID:27362588
Serial femtosecond crystallography of soluble proteins in lipidic cubic phase
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fromme, Raimund; Ishchenko, Andrii; Metz, Markus
Serial femtosecond crystallography (SFX) at X-ray free-electron lasers (XFELs) enables high-resolution protein structure determination using micrometre-sized crystals at room temperature with minimal effects from radiation damage. SFX requires a steady supply of microcrystals intersecting the XFEL beam at random orientations. An LCP–SFX method has recently been introduced in which microcrystals of membrane proteins are grown and delivered for SFX data collection inside a gel-like membrane-mimetic matrix, known as lipidic cubic phase (LCP), using a special LCP microextrusion injector. Here, it is demonstrated that LCP can also be used as a suitable carrier medium for microcrystals of soluble proteins, enabling a dramatic reduction in the amount of crystallized protein required for data collection compared with crystals delivered by liquid injectors. High-quality LCP–SFX data sets were collected for two soluble proteins, lysozyme and phycocyanin, using less than 0.1 mg of each protein.
Computational Analysis of Behavior.
Egnor, S E Roian; Branson, Kristin
2016-07-08
In this review, we discuss the emerging field of computational behavioral analysis: the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.
Policy perspectives on the emerging pathways of personalized medicine
Downing, Gregory J.
2009-01-01
Remarkable advances in the fundamental knowledge about the biological basis of disease and technical advances in methods to assess genomic information have led the health care system to the threshold of personalized medicine. It is now feasible to consider strategic application of genomic information to guide patient management by being predictive, preemptive, and preventive, and enabling patient participation in medical decisions. Early evidence of this transition has some hallmarks of disruptive innovation to existing health care practices. Presented here is an examination of the changes underway to enable this new concept in health care in the United States, to improve precision and quality of care through innovations aimed at individualized approaches to medical decision making. A broad range of public policy positions will need to be considered for the health care delivery enterprise to accommodate the promise of this new science and technology for the benefit of patients. PMID:20135895
Developing methods for systematic reviewing in health services delivery and organisation
Alborz, Alison; McNally, Rosalind
2007-01-01
Objectives To develop methods to facilitate the ‘systematic’ review of evidence from a range of methodologies on diffuse or ‘soft’ topics, as exemplified by ‘access to healthcare’. Data sources 28 bibliographic databases, research registers, organisational web sites or library catalogues. Reference lists from identified studies. Contact with experts and service users. Current awareness and contents-alerting services in the area of learning disabilities. Review methods Inclusion criteria were English-language literature from 1980 onwards, relating to people with learning disabilities of any age and all study designs. The main criterion for assessment was relevance to Gulliford's model of access to health care, which was adapted to the circumstances of people with learning disabilities. Selected studies were evaluated for scientific rigour, then data were extracted and the results synthesised. Quality assessment was by an initial set of ‘generic’ quality indicators. This enabled further evidence selection before evaluation of findings according to specific criteria for qualitative, quantitative or mixed-method studies. Results 82 studies were fully evaluated. Five studies were rated ‘highly rigorous’, 22 ‘rigorous’ and 46 ‘less rigorous’; 9 ‘poor’ papers were retained only where they were the sole evidence covering aspects of the guiding model. The majority of studies were quantitative but used only descriptive statistics. Most evidence lacked methodological detail, which often lowered final quality ratings. Conclusions The application of a consistent structure to quality evaluation can facilitate data appraisal, extraction and synthesis across a range of methodologies in diffuse or ‘soft’ topics. Synthesis can be facilitated further by using software, such as the Microsoft ‘Access’ database, for managing information. PMID:15606880
Exploring Antarctic Land Surface Temperature Extremes Using Condensed Anomaly Databases
NASA Astrophysics Data System (ADS)
Grant, Glenn Edwin
Satellite observations have revolutionized the Earth Sciences and climate studies. However, data and imagery continue to accumulate at an accelerating rate, and efficient tools for data discovery, analysis, and quality checking lag behind. In particular, studies of long-term, continental-scale processes at high spatiotemporal resolutions are especially problematic. The traditional technique of downloading an entire dataset and using customized analysis code is often impractical or consumes too many resources. The Condensate Database Project was envisioned as an alternative method for data exploration and quality checking. The project's premise was that much of the data in any satellite dataset is unneeded and can be eliminated, compacting massive datasets into more manageable sizes. Dataset sizes are further reduced by retaining only anomalous data of high interest. Hosting the resulting "condensed" datasets in high-speed databases enables immediate availability for queries and exploration. Proof of the project's success relied on demonstrating that the anomaly database methods can enhance and accelerate scientific investigations. The hypothesis of this dissertation is that the condensed datasets are effective tools for exploring many scientific questions, spurring further investigations and revealing important information that might otherwise remain undetected. This dissertation uses condensed databases containing 17 years of Antarctic land surface temperature anomalies as its primary data. The study demonstrates the utility of the condensate database methods by discovering new information. In particular, the process revealed critical quality problems in the source satellite data. The results are used as the starting point for four case studies, investigating Antarctic temperature extremes, cloud detection errors, and the teleconnections between Antarctic temperature anomalies and climate indices. The results confirm the hypothesis that the condensate databases are a highly useful tool for Earth Science analyses. Moreover, the quality checking capabilities provide an important method for independent evaluation of dataset veracity.
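The condensation idea, keep only the anomalies of high interest and load them into a query-ready store, can be sketched in a few lines of pandas. The data, the 3-sigma retention rule, and the table schema below are all hypothetical; the project's actual thresholds are not specified here:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
lst = pd.DataFrame({                      # stand-in for a satellite LST table
    "time": pd.date_range("2004-01-01", periods=10_000, freq="h"),
    "pixel": rng.integers(0, 500, size=10_000),
    "lst_k": rng.normal(250.0, 10.0, size=10_000),
})

# per-pixel climatology and standardized anomaly
mean = lst.groupby("pixel")["lst_k"].transform("mean")
sd = lst.groupby("pixel")["lst_k"].transform("std")
lst["anomaly"] = (lst["lst_k"] - mean) / sd

condensed = lst[lst["anomaly"].abs() > 3.0]   # retain only extreme records
print(f"kept {len(condensed)} of {len(lst)} rows")
# condensed.to_sql("lst_anomalies", engine)   # hypothetical database load
```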
Rapid determination of sugar level in snack products using infrared spectroscopy.
Wang, Ting; Rodriguez-Saona, Luis E
2012-08-01
Real-time spectroscopic methods can provide a valuable window into food manufacturing to permit optimization of production rate, quality and safety. There is a need for cutting-edge sensor technology directed at improving the efficiency, throughput and reliability of critical processes. The aim of the research was to evaluate the feasibility of infrared systems combined with chemometric analysis to develop rapid methods for determination of sugars in cereal products. Samples were ground and spectra were collected using a mid-infrared (MIR) spectrometer equipped with a triple-bounce ZnSe MIRacle attenuated total reflectance accessory or a Fourier transform near-infrared (NIR) system equipped with a diffuse reflection-integrating sphere. Sugar contents were determined using a reference HPLC method. Partial least squares regression (PLSR) was used to create cross-validated calibration models. The predictability of the models was evaluated on an independent set of samples and compared with reference techniques. MIR and NIR spectra showed characteristic absorption bands for sugars, and generated excellent PLSR models (sucrose: SEP < 1.7% and r > 0.96). Multivariate models accurately and precisely predicted sugar levels in snacks, allowing for rapid analysis. This simple technique allows for reliable prediction of quality parameters and for automation, enabling food manufacturers to take early corrective actions that ultimately save time and money while establishing uniform quality. The U.S. snack food industry generates billions of dollars in revenue each year, and vibrational spectroscopic methods combined with pattern recognition analysis could permit optimization of production rate, quality, and safety of many food products. This research showed that infrared spectroscopy is a powerful technique for near real-time (approximately 1 min) assessment of sugar content in various cereal products. © 2012 Institute of Food Technologists®
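The PLSR workflow described, spectra as predictors, HPLC sugar values as reference, cross-validated calibration, maps directly onto scikit-learn. A sketch with synthetic stand-in data (the component count and preprocessing are assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 400))                # stand-in MIR/NIR spectra
y = rng.normal(loc=20.0, scale=3.0, size=60)  # stand-in HPLC sucrose values

pls = PLSRegression(n_components=8)           # latent variables chosen by CV
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

sep = float(np.std(y - y_cv, ddof=1))         # standard error of prediction
r = float(np.corrcoef(y, y_cv)[0, 1])
print(f"SEP = {sep:.2f}, r = {r:.2f}")
```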
Alper, Brian S; Tristan, Mario; Ramirez-Morera, Anggie; Vreugdenhil, Maria M T; Van Zuuren, Esther J; Fedorowicz, Zbys
2016-06-01
Guideline development is challenging, expensive and labor-intensive. A high-quality guideline with 90 recommendations for breast cancer treatment was developed within 6 months with limited resources in Costa Rica. We describe the experience and propose a process others can use and adapt. The ADAPTE method (using existing guidelines to minimize repeating work that has been done) was used, but the existing guidelines were not current. The method was extended to use databases that systematically identify, appraise and synthesize evidence for clinical application (DynaMed, EBM Guidelines) to provide current evidence searches and critical appraisal of evidence. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach was used to rate the quality of evidence and the strength of recommendations. Draft recommendations with supporting evidence were provided to panel members for facilitated voting to target panel discussion to areas necessary for reaching consensus. Training panelists in guideline development methodology facilitated rapid consensus development. Extending 'guideline adaptation' to 'evidence database adaptation' was highly effective and efficient. Methods were created to simplify mapping DynaMed evidence ratings to GRADE ratings. Twelve steps are presented to facilitate rapid guideline development and enable further adaptation by others. This is a case report, and the RAPADAPTE method was retrospectively derived. Prospective replication and validation will support advances for the guideline development community. If guideline development can be accelerated without compromising the validity and relevance of the resulting recommendations, this would greatly improve our ability to impact clinical care. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
Farag, Mohamed A
2014-01-01
The number of botanical dietary supplements in the market has recently increased primarily due to increased health awareness. Standardization and quality control of the constituents of these plant extracts is an important topic, particularly when such ingredients are used long term as dietary supplements, or in cases where higher doses are marketed as drugs. The development of fast, comprehensive, and effective untargeted analytical methods for plant extracts is of high interest. Nuclear magnetic resonance spectroscopy and mass spectrometry are the most informative tools, each of which enables high-throughput and global analysis of hundreds of metabolites in a single step. Although only one of the two techniques is utilized in the majority of plant metabolomics applications, there is a growing interest in combining the data from both platforms to effectively unravel the complexity of plant samples. The application of combined MS and NMR in the quality control of nutraceuticals forms the major part of this review. Finally I will look at the future developments and perspectives of these two technologies for the quality control of herbal materials.
Younger, D; Martin, G W
2000-11-01
The audit reported in this paper, submitted to the Psychiatry of Old Age Management group, assessed six units within each of two health districts in the UK. Using a nonparticipatory observation method in the units selected, the aim was to measure the quality and environment of care. Dependency levels of the clients/residents were also estimated to give a clearer picture of the setting and the care requirements. This was intended to establish a baseline for the units mapped and to enable care developments to be focussed upon intended outcomes. Results led to a number of observations related to the levels of interaction between staff and clients/residents, the need for a wider range of activities to promote person-centred care, and a suggested route to the improvement in quality of life for this vulnerable group of people. Assessment of dependency levels linked to the results of the mapping showed that high dependency does not automatically lead to a lower quality of person-centred care.
Posey, Laurie; Pintz, Christine
2017-09-01
To help address the challenges of providing undergraduate nursing education in an accelerated time frame, the Teaching and Transforming through Technology (T3) project was funded to transition a second-degree ABSN program to a blended learning format. The project has explored the use of blended learning to: enable flexible solutions to support teaching goals and address course challenges; provide students with new types of independent learning activities outside of the traditional classroom; increase opportunities for active learning in the classroom; and improve students' digital literacy and lifelong learning skills. Program evaluation included quality reviews of the redesigned courses, surveys of student perceptions, pre- and post-program assessment of students' digital literacy and interviews with faculty about their experiences with the new teaching methods. Adopting an established quality framework to guide course design and evaluation for quality contributed to the efficient and effective development of a high-quality undergraduate blended nursing program. Program outcomes and lessons learned are presented to inform future teaching innovation and research related to blended learning in undergraduate nursing education. Copyright © 2016 Elsevier Ltd. All rights reserved.
Barriers and enablers for iron folic acid (IFA) supplementation in pregnant women.
Siekmans, Kendra; Roche, Marion; Kung'u, Jacqueline K; Desrochers, Rachelle E; De-Regil, Luz Maria
2017-12-22
In order to inform large-scale supplementation programme design, we review and summarize the barriers and enablers for improved coverage and utilization of iron and folic acid (IFA) supplements by pregnant women in 7 countries in Africa and Asia. Mixed methods were used to analyse IFA supplementation programmes in Afghanistan, Bangladesh, Indonesia, Ethiopia, Kenya, Nigeria, and Senegal based on formative research conducted in 2012-2013. Qualitative data from focus-group discussions and interviews with women and service providers were used for content analysis to elicit common themes on barriers and enablers at internal, external, and relational levels. Anaemia symptoms in pregnancy are well known among women and health care providers in all countries, yet many women do not feel personally at risk. Broad awareness and increased coverage of facility-based antenatal care (ANC) make it an efficient delivery channel for IFA; however, first-trimester access to IFA is hindered by beliefs about when to first attend ANC and preferences for disclosing pregnancy status. Variable access and poor-quality ANC services, including insufficient IFA supplies and inadequate counselling to encourage consumption, are barriers to both coverage and adherence. Community-based delivery of IFA and referral to ANC provides earlier and more frequent access and opportunities for follow-up. Improving ANC access and quality is needed to facilitate IFA supplementation during pregnancy. Community-based delivery and counselling can address problems of timely and continuous access to supplements. Renewed investment in training for service providers and effective behaviour change designs are urgently needed to achieve the desired impact. © 2018 John Wiley & Sons Ltd.
Office of Student Financial Aid Quality Improvement Program: Design and Implementation Plan.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
The purpose and direction of the Office of Student Financial Aid (OSFA) quality improvement program are described. The background and context for the Pell Grant quality control (QC) design study and the meaning of QC are reviewed. The general approach to quality improvement consists of the following elements: a strategic approach that enables OSFA…
A next generation air quality modeling system is being developed at the U.S. EPA to enable seamless modeling of air quality from global to regional to (eventually) local scales. State of the science chemistry and aerosol modules from the Community Multiscale Air Quality (CMAQ) mo...
Estimating Impaired Waters on a County Level for Public Health Analysis
Assessing the population-level impact of water quality on health can be difficult. Water quality data are measured at a watershed level and health data are organized at different levels of aggregation. To address this discrepancy and enable the consideration of water quality for ...
Development of the Next Generation Air Quality Modeling System
A next generation air quality modeling system is being developed at the U.S. EPA to enable modeling of air quality from global to regional to (eventually) local scales. We envision that the system will have three configurations: 1. Global meteorology with seamless mesh refinemen...
Setup calibration and optimization for comparative digital holography
NASA Astrophysics Data System (ADS)
Baumbach, Torsten; Osten, Wolfgang; Kebbel, Volker; von Kopylow, Christoph; Jueptner, Werner
2004-08-01
With increasing globalization, many enterprises decide to produce the components of their products at different locations all over the world. Consequently, new technologies and strategies for quality control are required. In this context, the remote comparison of objects with regard to their shape or their response to certain loads is becoming increasingly important for a variety of applications. For such a task, the novel method of comparative digital holography is a suitable tool with interferometric sensitivity. With this technique, the comparison of two objects in shape or deformation does not require the presence of both objects at the same place. In contrast to the well-known incoherent techniques based on inverse fringe projection, this new approach uses a coherent mask for the illumination of the sample object. The coherent mask is created by digital holography to enable instant access to the complete optical information of the master object at any desired place. The reconstruction of the mask is done by a spatial light modulator (SLM). The transmission of the digital master hologram to the place of comparison can be done via digital telecommunication networks. Contrary to other interferometric techniques, this method enables the comparison of objects with different microstructures. In continuation of earlier reports, our investigations here focus on the analysis of the constraints of the setup with respect to the quality of the hologram reconstruction with a spatial light modulator. For successful measurements, the selection of the appropriate reconstruction method and an adequate optical setup is mandatory. In addition, the use of an SLM for the reconstruction requires knowledge of its properties for the accomplishment of this method. Investigation results for display properties such as display curvature and phase shift, and their consequences for the technique, are presented. The optimization and calibration of the setup and its components lead to improved results in comparative digital holography with respect to resolution. Examples of measurements before and after the optimization and calibration are presented.
Real-time assessments of water quality: expanding nowcasting throughout the Great Lakes
,
2013-01-01
Nowcasts are systems that inform the public of current bacterial water-quality conditions at beaches on the basis of predictive models. During 2010–12, the U.S. Geological Survey (USGS) worked with 23 local and State agencies to improve existing operational beach nowcast systems at 4 beaches and expand the use of predictive models in nowcasts at an additional 45 beaches throughout the Great Lakes. The predictive models were specific to each beach, and the best model for each beach was based on a unique combination of environmental and water-quality explanatory variables. The variables used most often in models to predict Escherichia coli (E. coli) concentrations or the probability of exceeding a State recreational water-quality standard included turbidity, day of the year, wave height, wind direction and speed, antecedent rainfall for various time periods, and change in lake level over 24 hours. During validation of 42 beach models in 2012, the models performed better than the current method to assess recreational water quality (previous day's E. coli concentration). The USGS will continue to work with local agencies to improve nowcast predictions, enable technology transfer of predictive model development procedures, and implement more operational systems during 2013 and beyond.
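A nowcast of this kind reduces to a per-beach statistical model mapping the morning's environmental variables to the probability of exceeding the standard. A hedged scikit-learn sketch with synthetic data (the USGS models are beach-specific and not necessarily logistic regressions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
# columns mirror the variables named above: turbidity, day of year,
# wave height, wind speed, antecedent rainfall, 24-h lake-level change
X = rng.normal(size=(300, 6))
y = (rng.random(300) < 0.2).astype(int)   # 1 = E. coli standard exceeded

model = LogisticRegression(max_iter=1000).fit(X, y)

today = rng.normal(size=(1, 6))           # this morning's observations
print(f"P(exceedance) = {model.predict_proba(today)[0, 1]:.2f}")
```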
Quantitative evaluation of 3D images produced from computer-generated holograms
NASA Astrophysics Data System (ADS)
Sheerin, David T.; Mason, Ian R.; Cameron, Colin D.; Payne, Douglas A.; Slinger, Christopher W.
1999-08-01
Advances in computing and optical modulation techniques now make it possible to anticipate the generation of near real-time, reconfigurable, high quality, three-dimensional images using holographic methods. Computer generated holography (CGH) is the only technique which holds promise of producing synthetic images having the full range of visual depth cues. These realistic images will be viewable by several users simultaneously, without the need for headtracking or special glasses. Such a data visualization tool will be key to speeding up the manufacture of new commercial and military equipment by negating the need for the production of physical 3D models in the design phase. DERA Malvern has been involved in designing and testing fixed CGH in order to understand the connection between the complexity of the CGH, the algorithms used to design them, the processes employed in their implementation and the quality of the images produced. This poster describes results from CGH containing up to 10⁸ pixels. The methods used to evaluate the reconstructed images are discussed and quantitative measures of image fidelity made. An understanding of the effect of the various system parameters upon final image quality enables a study of the possible system trade-offs to be carried out. Such an understanding of CGH production and resulting image quality is key to effective implementation of a reconfigurable CGH system currently under development at DERA.
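One simple quantitative fidelity measure for comparing a reconstructed image against its target is peak signal-to-noise ratio; the abstract does not specify its metrics, so this sketch is purely illustrative:

```python
import numpy as np

def psnr(reference: np.ndarray, reconstructed: np.ndarray, peak: float = 1.0) -> float:
    """Peak signal-to-noise ratio in dB between a target image and its
    holographic reconstruction (higher is better)."""
    mse = np.mean((reference - reconstructed) ** 2)
    return float(10.0 * np.log10(peak ** 2 / mse))
```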
Meat Quality Assessment by Electronic Nose (Machine Olfaction Technology)
Ghasemi-Varnamkhasti, Mahdi; Mohtasebi, Seyed Saeid; Siadat, Maryam; Balasubramanian, Sundar
2009-01-01
Over the last twenty years, newly developed chemical sensor systems (so-called “electronic noses”) have made odor analyses possible. These systems involve various types of electronic chemical gas sensors with partial specificity, as well as suitable statistical methods enabling the recognition of complex odors. As commercial instruments have become available, a substantial increase in research into the application of electronic noses in the evaluation of volatile compounds in food, cosmetics and other items of everyday life has been observed. At present, the commercial gas sensor technologies comprise metal oxide semiconductors, metal oxide semiconductor field effect transistors, organic conducting polymers, and piezoelectric crystal sensors. Further sensors based on fibre-optic, electrochemical and bi-metal principles are still in the developmental stage. Statistical analysis techniques range from simple graphical evaluation to multivariate analysis such as artificial neural networks and radial basis functions. The introduction of electronic noses into the area of food is envisaged for quality control, process monitoring, freshness evaluation, shelf-life investigation and authenticity assessment. Considerable work has already been carried out on meat, grains, coffee, mushrooms, cheese, sugar, fish, beer and other beverages, as well as on the odor quality evaluation of food packaging material. This paper describes the applications of these systems for meat quality assessment, where fast detection methods are essential for appropriate product management. The results suggest the possibility of using this new technology in meat handling. PMID:22454572
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parsons, D; Robar, J; Nova Scotia Health Authority, Halifax, NS
Purpose: The focus of this work is to improve the available kV image quality for continuous intra-fraction monitoring of the prostate. This is investigated using a novel blade collimation system enabling modulated volume-of-interest (VOI) imaging of prostate fiducial markers. Methods: A four-blade dynamic kV collimator was used to track a VOI during gantry rotation. Planar image quality was investigated as a function of collimator dimension, while maintaining the same dose to isocenter, for a 22.2 cm diameter cylindrical water phantom with a 9 mm diameter bone insert. A sample prostate anatomy was defined in the planning system, including three fiducial markers within the CTV. The VOI margin around each marker was set to be 2σ of the population covariance matrix characterizing prostate motion. DRRs were used to calculate the kV attenuation for each VOI as a function of angle. The optimal marker and tube current were determined using kV attenuation. Monte Carlo simulations were used to calculate the imaging dose to the phantom and the MV scatter dose to the imaging panel. Results: Preliminary measurements show an increase in CNR by a factor of 1.3 with the VOI method when decreasing from a 6×6 to a 2×2 cm² field. Attenuation calculations show a change in kV fluence at the detector by a factor of 21.6 with fiducial optimization; the resultant tube current modulation increases the maximum dose by a factor of 1.4 compared to no modulation. The MV scatter contribution to the kV detector changes by approximately a factor of two over a complete gantry rotation. Conclusion: The dynamic collimation system allows single fiducial marker tracking at a very low dose, with reduction of scatter and improvement of image quality, compared to imaging the entire prostate. The approach is compatible with tube current modulation, which enables consistent image quality throughout the range of gantry rotation. This project was funded by Varian Medical Systems.
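The CNR figure quoted above compares signal contrast against background noise. One common definition, sketched in numpy (conventions vary, so this is not necessarily the authors' exact formula):

```python
import numpy as np

def cnr(image: np.ndarray, signal_mask: np.ndarray, background_mask: np.ndarray) -> float:
    """Contrast-to-noise ratio: |mean(signal) - mean(background)| / std(background),
    with the two regions of interest given as boolean masks."""
    sig, bg = image[signal_mask], image[background_mask]
    return float(abs(sig.mean() - bg.mean()) / bg.std(ddof=1))
```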
WE-EF-207-09: Single-Scan Dual-Energy CT Using Primary Modulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petrongolo, M; Zhu, L
Purpose: Compared with conventional CT, dual energy CT (DECT) provides better material differentiation but requires projection data with two different effective x-ray spectra. Current DECT scanners use either a two-scan setting or costly imaging components, which are not feasible or available on open-gantry cone-beam CT systems. We propose a hardware-based method which utilizes primary modulation to enable single-scan DECT on a conventional CT scanner. The CT imaging geometry of primary modulation is identical to that used in our previous method for scatter removal, making it possible for future combination with effective scatter correction on the same CT scanner. Methods: We insert an attenuation sheet with a spatially varying pattern (the primary modulator) between the x-ray source and the imaged object. During the CT scan, the modulator selectively hardens the x-ray beam at specific detector locations. Thus, the proposed method simultaneously acquires high- and low-energy data. High- and low-energy CT images are then reconstructed from projections with missing data via an iterative CT reconstruction algorithm with gradient weighting. Proof-of-concept studies are performed using a copper modulator on a cone-beam CT system. Results: Our preliminary results on the Catphan 600 phantom indicate that the proposed method for single-scan DECT is able to successfully generate high-quality high- and low-energy CT images and distinguish different materials through basis material decomposition. By applying correction algorithms and using all of the acquired projection data, we can reconstruct a single CT image of comparable image quality to conventional CT images, i.e., without primary modulation. Conclusion: This work shows great promise in using a primary modulator to perform high-quality single-scan DECT imaging. Future studies will test the method's performance on anthropomorphic phantoms and perform quantitative analyses of image quality and DECT decomposition accuracy. We will use simulations to optimize the modulator material and geometry parameters.
NASA Astrophysics Data System (ADS)
Uranishi, Katsushige; Ikemori, Fumikazu; Nakatsubo, Ryohei; Shimadera, Hikari; Kondo, Akira; Kikutani, Yuki; Asano, Katsuyoshi; Sugata, Seiji
2017-10-01
This study presented a comparison approach with multiple source apportionment methods to identify which sectors of emission data have large biases. The source apportionment methods for the comparison approach included both receptor and chemical transport models, which are widely used to quantify the impacts of emission sources on fine particulate matter of less than 2.5 μm in diameter (PM2.5). We used daily chemical component concentration data for the year 2013, including data for water-soluble ions, elements, and carbonaceous species of PM2.5 at 11 sites in the Kinki-Tokai district in Japan, to apply the Positive Matrix Factorization (PMF) model for the source apportionment. Seven PMF factors of PM2.5 were identified on the basis of their temporal and spatial variation patterns and site-specific features. These factors comprised two types of secondary sulfate, road transportation, heavy oil combustion by ships, biomass burning, secondary nitrate, and soil and industrial dust, accounting for 46%, 17%, 7%, 14%, 13%, and 3% of the PM2.5, respectively. The multiple-site data enabled a comprehensive identification of the PM2.5 sources. For the same period, source contributions were estimated by air quality simulations using the Community Multiscale Air Quality model (CMAQ) with the brute-force method (BFM) for four source categories. Both models provided consistent results for three of the four source categories: secondary sulfates, road transportation, and heavy oil combustion sources. For these three target categories, the models' agreement was supported by the small differences and high correlations between the CMAQ/BFM- and PMF-estimated source contributions to the concentrations of PM2.5, SO₄²⁻, and EC. In contrast, contributions of the biomass burning sources apportioned by CMAQ/BFM were much lower than, and poorly correlated with, those captured by the PMF model, indicating large uncertainties in the biomass burning emissions used in the CMAQ simulations. Thus, this comparison approach using the two antithetical models enables us to identify which sectors of emission data have large biases, for improvement of future air quality simulations.
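PMF factors a days-by-species concentration matrix into non-negative factor contributions and profiles, weighting residuals by measurement uncertainty. Plain non-negative matrix factorization captures the same structure without the uncertainty weighting; a scikit-learn sketch with synthetic stand-in data:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
X = np.abs(rng.normal(size=(365, 20)))    # stand-in days x species table

# 7 factors, as in the study; true PMF additionally weights each entry
# by its measurement uncertainty, which sklearn's NMF does not
nmf = NMF(n_components=7, init="nndsvda", max_iter=500, random_state=0)
G = nmf.fit_transform(X)                  # factor contributions per day
F = nmf.components_                       # factor profiles over species

share = G.sum(axis=0) * F.sum(axis=1)     # mass attributed to each factor
print(share / share.sum())                # fractional source contributions
```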
Using quality experts from manufacturing to transform primary care.
Steiner, Rose M; Walsworth, David T
2010-01-01
Improving Performance in Practice (IPIP) is an initiative convened by the American Board of Medical Specialties. It investigates the efficacy of coaches in helping primary-care practices improve the care of patients with diabetes and asthma. Most IPIP states use coaches who have a health care background and are trained in quality and process improvement. Michigan uses quality experts from the manufacturing industry who are educated regarding the health care environment, which enables them to perform as quality-improvement coaches (QICs) in primary-care practices. In this case study, ninety-six quality experts were trained to coach primary-care practices, with 53 currently assigned to offices and others assisting as needed. Practice teams and QICs identify gaps in care and office practices with the use of assorted quality-improvement tools. Reports are made monthly to describe clinical and process measures and methods used. Michigan has 33 practices engaged, involving 205 physicians and 40 midlevel providers. The teaming of quality experts from the manufacturing industry with primary-care office providers and staff resulted in improved office efficiency, improved care, and progress toward attainment of a patient-centered medical home (PCMH). Quality experts from manufacturing volunteered to coach for improvements in primary care. The efforts of QICs have been successful. Because the QICs are volunteers, sustainability of the Michigan Improving Performance in Practice program is a challenge.
Development of Indicators to Assess Quality of Care for Prostate Cancer.
Nag, Nupur; Millar, Jeremy; Davis, Ian D; Costello, Shaun; Duthie, James B; Mark, Stephen; Delprado, Warick; Smith, David; Pryor, David; Galvin, David; Sullivan, Frank; Murphy, Áine C; Roder, David; Elsaleh, Hany; Currow, David; White, Craig; Skala, Marketa; Moretti, Kim L; Walker, Tony; De Ieso, Paolo; Brooks, Andrew; Heathcote, Peter; Frydenberg, Mark; Thavaseelan, Jeffery; Evans, Sue M
2016-02-20
The development, monitoring, and reporting of indicator measures that describe standard of care provide the gold standard for assessing quality of care and patient outcomes. Although indicator measures have been reported, little evidence of their use in measuring and benchmarking performance is available. A standard set, defining numerator, denominator, and risk adjustments, will enable global benchmarking of quality of care. To develop a set of indicators to enable assessment and reporting of quality of care for men with localised prostate cancer (PCa). Candidate indicators were identified from the literature. An international panel was invited to participate in a modified Delphi process. Teleconferences were held before and after each voting round to provide instruction and to review results. Panellists were asked to rate each proposed indicator on a Likert scale of 1-9 in a two-round iterative process. Calculations required to report on the endorsed indicators were evaluated and modified to reflect the data capture of the Prostate Cancer Outcomes Registry-Australia and New Zealand (PCOR-ANZ). A total of 97 candidate indicators were identified, of which 12 were endorsed. The set includes indicators covering pre-, intra-, and post-treatment of PCa care, within the limits of the data captured by PCOR-ANZ. The 12 endorsed quality measures enable international benchmarking on the quality of care of men with localised PCa. Reporting on these indicators enhances safety and efficacy of treatment, reduces variation in care, and can improve patient outcomes. PCa has the highest incidence of all cancers in men. Early diagnosis and relatively high survival rates mean issues of quality of care and best possible health outcomes for patients are important. This paper identifies 12 important measurable quality indicators in PCa care. Copyright © 2016 European Association of Urology. Published by Elsevier B.V. All rights reserved.
A manual and an automatic TERS based virus discrimination
NASA Astrophysics Data System (ADS)
Olschewski, Konstanze; Kämmer, Evelyn; Stöckel, Stephan; Bocklitz, Thomas; Deckert-Gaudig, Tanja; Zell, Roland; Cialla-May, Dana; Weber, Karina; Deckert, Volker; Popp, Jürgen
2015-02-01
Rapid techniques for virus identification are more relevant today than ever. Conventional virus detection and identification strategies generally rest upon various microbiological methods and genomic approaches, which are not suited for the analysis of single virus particles. In contrast, the highly sensitive spectroscopic technique tip-enhanced Raman spectroscopy (TERS) allows the characterisation of biological nano-structures like virions on a single-particle level. In this study, the feasibility of TERS in combination with chemometrics to discriminate two pathogenic viruses, Varicella-zoster virus (VZV) and Porcine teschovirus (PTV), was investigated. In a first step, chemometric methods transformed the spectral data in such a way that a rapid visual discrimination of the two examined viruses was enabled. In a further step, these methods were utilised to perform an automatic quality rating of the measured spectra. Spectra that passed this test were eventually used to calculate a classification model, through which a successful discrimination of the two viral species based on TERS spectra of single virus particles was also realised with a classification accuracy of 91%. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr07033j
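The chemometric pipeline described, transform the spectra, filter by quality, then classify, commonly reduces to dimensionality reduction followed by a discriminant model. A scikit-learn sketch with synthetic spectra (the authors' exact preprocessing and classifier are not specified here):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
X = rng.normal(size=(120, 500))           # stand-in single-particle spectra
y = rng.integers(0, 2, size=120)          # 0 = VZV, 1 = PTV labels

# PCA compresses the spectra; LDA separates the two species
clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
acc = cross_val_score(clf, X, y, cv=10).mean()
print(f"Cross-validated accuracy: {acc:.2f}")
```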
Armstrong, Lorraine; Lauder, William; Shepherd, Ashley
2015-01-14
Despite criticism, quality improvement (QI) continues to drive political and educational priorities within health care. Until recently, QI educational interventions have varied, targeting mainly postgraduates, middle management and the medical profession. However, there is now consensus within the UK, USA and beyond to integrate QI explicitly into nurse education, and faculties may require redesign of their QI curriculum to achieve this. Whilst growth in QI preregistration nurse education is emerging, little empirical evidence exists to determine such effects. Furthermore, previous healthcare studies evaluating QI educational interventions lend little in the way of support and have instead been subject to criticism. They reveal methodological weakness such as no reporting of theoretical underpinnings, insufficient intervention description, poor evaluation methods, little clinical or patient impact and lack of sustainability. This study aims therefore to identify, evaluate and synthesise teaching methods used within the undergraduate population to aid development of QI curriculum within preregistration nurse education. A systematic review of the literature will be conducted. Electronic databases, Cumulative Index to Nursing and Allied Health Literature (CINAHL), Psychological Information (PsychINFO), Education Resources Information Centre (ERIC), Medical Literature Analysis and Retrieval System Online (MEDLINE) and Applied Social Sciences Index and Abstracts (ASSIA), will be searched alongside reference list scanning and a grey literature search. Peer-reviewed studies from 2000-2014 will be identified using key terms quality improvement, education, curriculum, training, undergraduate, teaching methods, students and evaluation. Studies describing a QI themed educational intervention aimed at undergraduate healthcare students will be included and data extracted using a modified version of the Reporting of Primary Studies in Education (REPOSE) Guidelines. Studies will be judged for quality and relevance using the Evidence for Policy and Practice Information and Co-ordinating Centre's (EPPI) Weight of Evidence framework and a narrative synthesis of the findings provided. This study aims to identify, evaluate and synthesise the teaching methods used in quality improvement education for undergraduate healthcare students where currently this is lacking. This will enable nursing faculty to adopt the most effective methods when developing QI education within their curriculum. Prospero CRD42014013847.
Safe and effective nursing shift handover with NURSEPASS: An interrupted time series.
Smeulers, Marian; Dolman, Christine D; Atema, Danielle; van Dieren, Susan; Maaskant, Jolanda M; Vermeulen, Hester
2016-11-01
To implement a locally developed, evidence-based nursing shift handover blueprint with a bedside-safety-check and determine its effect on the quality of handover. A mixed methods design with: (1) an interrupted time series analysis to determine the effect on handover quality in six domains; (2) descriptive statistics to analyze the discrepancies intercepted by the bedside-safety-check; (3) evaluation sessions to gather experiences with the new handover process. We observed a continued trend of improvement in handover quality and a significant improvement in two domains of handover: organization/efficiency and contents. The bedside-safety-check successfully identified discrepancies in drains, intravenous medications, bandages and general condition, and was highly appreciated. Use of the nursing shift handover blueprint showed promising results on effectiveness as well as on feasibility and acceptability. However, to enable long-term measurement of effectiveness, evaluation with large-scale interrupted time series or statistical process control is needed. Copyright © 2016 Elsevier Inc. All rights reserved.
Mitigating Provider Uncertainty in Service Provision Contracts
NASA Astrophysics Data System (ADS)
Smith, Chris; van Moorsel, Aad
Uncertainty is an inherent property of open, distributed and multiparty systems. The viability of the mutually beneficial relationships which motivate these systems relies on rational decision-making by each constituent party under uncertainty. Service provision in distributed systems is one such relationship. Uncertainty is experienced by the service provider in his ability to deliver a service with selected quality level guarantees due to inherent non-determinism, such as load fluctuations and hardware failures. Statistical estimators utilized to model this non-determinism introduce additional uncertainty through sampling error. Inability of the provider to accurately model and analyze uncertainty in the quality level guarantees can result in the formation of sub-optimal service provision contracts. Emblematic consequences include loss of revenue, inefficient resource utilization and erosion of reputation and consumer trust. We propose a utility model for contract-based service provision to provide a systematic approach to optimal service provision contract formation under uncertainty. Performance prediction methods to enable the derivation of statistical estimators for quality level are introduced, with analysis of their resultant accuracy and cost.
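As one concrete illustration of the sampling error the abstract describes, consider estimating the probability that a request meets a guaranteed quality level from a finite sample of observations. The sketch below is not taken from the paper; it simply computes a Wilson score interval so the provider can see how tight (or loose) the estimate behind a contract guarantee actually is.

```python
# Minimal sketch (not the paper's model): estimate the probability that a
# request meets a latency guarantee, with a Wilson score interval exposing
# the sampling error in the estimate.
import math

def wilson_interval(successes: int, trials: int, z: float = 1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = z * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2)) / denom
    return centre - half, centre + half

# e.g. 942 of 1000 sampled requests met the guaranteed response time
lo, hi = wilson_interval(942, 1000)
print(f"estimated compliance: 0.942, 95% CI [{lo:.3f}, {hi:.3f}]")
```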
Design of an instrument to measure the quality of care in Physical Therapy
Cavalheiro, Leny Vieira; Eid, Raquel Afonso Caserta; Talerman, Claudia; do Prado, Cristiane; Gobbi, Fátima Cristina Martorano; Andreoli, Paola Bruno de Araujo
2015-01-01
ABSTRACT Objective: To design an instrument composed of domains that would demonstrate physical therapy activities and generate a consistent index to represent the quality of care in physical therapy. Methods: The Lean Six Sigma methodology was used to design the tool. The discussion involved staff from seven different management groups. By means of brainstorming and a Cause & Effect Matrix, we set up the process map. Results: Five requirements composed the quality of care index in physical therapy after application of the Cause & Effect Matrix tool. The following requirements were assessed: physical therapist performance, care outcome indicator, adherence to physical therapy protocols, whether the prognosis and treatment outcome were achieved, and infrastructure. Conclusion: The proposed design allowed evaluating several items related to the physical therapy service, enabling customization, reproducibility and benchmarking with other organizations. For management, this index provides the opportunity to identify areas for improvement and the strengths of the team and the process of physical therapy care. PMID:26154548
Super Resolution Algorithm for CCTVs
NASA Astrophysics Data System (ADS)
Gohshi, Seiichi
2015-03-01
Recently, security cameras and CCTV systems have become an important part of our daily lives. The rising demand for such systems has created business opportunities in this field, especially in big cities. Analogue CCTV systems are being replaced by digital systems, and HDTV CCTV has become quite common. HDTV CCTV can achieve images with high contrast and decent quality if they are captured in daylight. However, an image captured at night does not always have sufficient contrast and resolution because of poor lighting conditions. CCTV systems depend on infrared light at night to compensate for insufficient lighting, thereby producing monochrome images and videos. However, these images and videos have low contrast and are blurred. We propose a nonlinear signal processing technique that significantly improves the visual quality (contrast and resolution) of low-contrast infrared images. The proposed method enables the use of infrared cameras for various purposes, such as night surveillance and other poor-lighting environments.
Chen, Pei; Jin, Hong-Yu; Sun, Lei; Ma, Shuang-Cheng
2016-09-01
Multi-source analysis of traditional Chinese medicine is key to ensuring its safety and efficacy. Compared with traditional experimental differentiation, chemometric analysis is a simpler strategy to identify traditional Chinese medicines, and multi-component analysis plays an increasingly vital role in their quality control. A novel strategy, based on chemometric analysis and quantitative analysis of multiple components, was proposed to easily and effectively control the quality of traditional Chinese medicines such as Chonglou, with ultra-high performance liquid chromatography making the analysis more convenient and efficient. Five species of Chonglou were distinguished by chemometric analysis, and nine saponins, including Chonglou saponins I, II, V, VI, VII, D, and H, as well as dioscin and gracillin, were determined in 18 min. The method is feasible and credible, and enables improved quality control of traditional Chinese medicines and natural products. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Document segmentation for high-quality printing
NASA Astrophysics Data System (ADS)
Ancin, Hakan
1997-04-01
A technique to segment dark text on the light background of mixed-mode color documents is presented. This process does not perceptually change graphics and photo regions. Color documents are scanned and printed from various media which usually do not have a clean background. This is especially the case for printouts generated from thin magazine samples: these printouts usually include text and figures from the back of the page, an artifact called bleeding. Removal of bleeding artifacts improves the perceptual quality of the printed document and reduces color ink usage. By detecting the light background of the document, these artifacts are removed from background regions. Detection of dark text regions also enables the halftoning algorithms to use true black ink for black text pixels instead of composite black. The processed document contains sharp black text on a white background, resulting in improved perceptual quality and better ink utilization. The described method is memory efficient and requires only a small number of scan lines of the high-resolution color document during processing.
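A minimal sketch of the core idea — wipe near-white background pixels (where bleed-through shows up) and force dark text pixels to true black — might look like the following. The thresholds and the synthetic image are illustrative assumptions, not the paper's actual segmentation algorithm.

```python
# Illustrative sketch only: suppress faint bleed-through on a light page
# background while keeping dark text, using simple global thresholds
# (the paper's actual segmentation is more elaborate).
import numpy as np

def clean_page(gray: np.ndarray, bg_thresh=200, text_thresh=80) -> np.ndarray:
    """gray: uint8 image, 0 = black, 255 = white."""
    out = gray.copy()
    out[gray >= bg_thresh] = 255   # light background: wipe bleed-through
    out[gray <= text_thresh] = 0   # dark text: force to true black ink
    return out

# Synthetic page: white background, black text stroke, grey bleed-through blob
page = np.full((64, 64), 245, dtype=np.uint8)
page[10:12, 5:60] = 30     # text line
page[40:50, 20:40] = 215   # faint bleed-through from the reverse side
print(np.unique(clean_page(page)))  # -> [  0 255]
```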
Brouilette, Scott; Kuersten, Scott; Mein, Charles; Bozek, Monika; Terry, Anna; Dias, Kerith-Rae; Bhaw-Rosun, Leena; Shintani, Yasunori; Coppen, Steven; Ikebe, Chiho; Sawhney, Vinit; Campbell, Niall; Kaneko, Masahiro; Tano, Nobuko; Ishida, Hidekazu; Suzuki, Ken; Yashiro, Kenta
2012-10-01
Deep sequencing of single cell-derived cDNAs offers novel insights into oncogenesis and embryogenesis. However, traditional library preparation for RNA-seq analysis requires multiple steps, with consequent sample loss and stochastic variation at each step significantly affecting output. Thus, a simpler and better protocol is desirable. The recently developed hyperactive Tn5-mediated library preparation, which yields high-quality libraries, is one likely solution. Here, we tested the applicability of hyperactive Tn5-mediated library preparation to deep sequencing of single cell cDNA, optimized the protocol, and compared it with the conventional method based on sonication. This new technique does not require any expensive or special equipment, which secures wider availability. A library was constructed from only 100 ng of cDNA, which enables the saving of precious specimens. Only a few robust enzymatic reaction steps are required, which saves time, enables more specimens to be prepared at once, and gives a more reproducible size distribution among different specimens. The obtained RNA-seq results were comparable to the conventional method. Thus, this Tn5-mediated preparation is applicable for anyone who aims to carry out deep sequencing of single cell cDNAs. Copyright © 2012 Wiley Periodicals, Inc.
A new quantitative approach to measure perceived work-related stress in Italian employees.
Cevenini, Gabriele; Fratini, Ilaria; Gambassi, Roberto
2012-09-01
We propose a method for a reliable quantitative measure of subjectively perceived occupational stress, applicable in any company to enhance occupational safety and psychosocial health, to enable precise prevention policies and interventions, and to improve work quality and efficiency. A suitable questionnaire was administered by telephone to a stratified sample of the whole Italian population of employees. Combined multivariate statistical methods, including principal component, cluster and discriminant analyses, were used to identify risk factors and to design a causal model for understanding work-related stress. The model explained the causal links of stress through employee perception of an imbalance between job demands and the resources for responding appropriately, supplying a reliable U-shaped nonlinear stress index, expressed in terms of values of human systolic arterial pressure. Low, intermediate and high values indicated demotivation (or inefficiency), well-being and distress, respectively. Costs of stress-dependent productivity shortcomings were estimated at about 3.7% of national income from employment. The method identified useful structured information able to supply a simple and precise interpretation of employees' well-being and stress risk. Results can be compared with estimated national benchmarks to enable targeted intervention strategies to protect the health and safety of workers, and to reduce unproductive costs for firms.
Biomek Cell Workstation: A Variable System for Automated Cell Cultivation.
Lehmann, R; Severitt, J C; Roddelkopf, T; Junginger, S; Thurow, K
2016-06-01
Automated cell cultivation is an important tool for simplifying routine laboratory work. Automated methods are independent of the skill level and daily condition of laboratory staff, and deliver constant quality and performance. The Biomek Cell Workstation was configured as a flexible and compatible system. The modified Biomek Cell Workstation enables the cultivation of adherent and suspension cells. Until now, no commercially available system enabled the automated handling of both types of cells in one system; in particular, the automated cultivation of suspension cells in this form has not been published. The cell counts and viabilities were nonsignificantly decreased for cells cultivated in AutoFlasks in automated handling. Comparison of manual and automated bioscreening by the WST-1 assay showed a nonsignificantly lower proliferation of automatically disseminated cells, associated with a mostly lower standard error. The disseminated suspension cell lines showed differing degrees of proliferation, in descending order starting with Jurkat cells, followed by SEM, Molt4, and RS4 cells, which had the lowest proliferation. In this respect, we successfully disseminated and screened suspension cells in an automated way. The automated cultivation and dissemination of a variety of suspension cells can replace the manual method. © 2015 Society for Laboratory Automation and Screening.
Custom implant design for large cranial defects.
Marreiros, Filipe M M; Heuzé, Y; Verius, M; Unterhofer, C; Freysinger, W; Recheis, W
2016-12-01
The aim of this work was to introduce a computer-aided design (CAD) tool that enables the design of implants for large skull defects (>100 cm²). Functional and aesthetically correct custom implants are extremely important for patients with large cranial defects. For these cases, preoperative fabrication of implants is recommended to avoid problems of donor site morbidity, sufficiency of donor material and quality. Finally, crafting the correct shape is a non-trivial task that becomes increasingly complicated with defect size. We present a CAD tool to design such implants for the neurocranium. A combination of geometric morphometrics and radial basis functions, namely thin-plate splines, allows semiautomatic implant generation. The method uses symmetry and the best-fitting shape to estimate missing data directly within the radiologic volume data. In addition, this approach delivers correct implant fitting via a boundary fitting approach. The method generates a smooth implant surface, free of sharp edges, that follows the main contours of the boundary, enabling accurate implant placement in the defect site intraoperatively. The present approach is evaluated and compared to existing methods. On average, 89.29% (range 72.64-100%) of missing landmarks were estimated with an error of 1 mm or less. In conclusion, the results show that our CAD tool can generate patient-specific implants with high accuracy.
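The thin-plate spline step can be illustrated with SciPy's radial basis function interpolator: fit the spline to surface samples surrounding a hole and evaluate it inside to estimate the missing shape. This is a simplified sketch on assumed synthetic geometry; the actual tool additionally exploits symmetry and geometric morphometrics.

```python
# Hedged sketch: thin-plate-spline completion of a surface with a hole,
# in the spirit of the paper's implant-shape estimation (symmetry and
# morphometric steps of the real tool are omitted).
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
pts = rng.uniform(-1, 1, (400, 2))
r = np.linalg.norm(pts, axis=1)
keep = r > 0.4                                  # circular "defect": no data inside
z = np.sqrt(np.clip(1.2**2 - r**2, 0, None))    # dome-like height field

tps = RBFInterpolator(pts[keep], z[keep], kernel='thin_plate_spline')

hole = np.array([[0.0, 0.0], [0.2, 0.1]])       # points inside the defect
print(tps(hole))                                 # smooth estimate of missing surface
print(np.sqrt(1.2**2 - np.linalg.norm(hole, axis=1)**2))  # ground truth
```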
Quality of Life for Individuals with Disabilities: A Conceptual Framework.
ERIC Educational Resources Information Center
Marinoble, Rita; Hegenauer, Judy
One of the desired outcomes of transition planning for students with disabilities is to enable the students to lead a quality adult life. This report contains a literature review which outlines recent approaches to addressing quality of life issues, including conceptualizations, methodologies, and ethical concerns. A field inquiry report…
Scurlock-Evans, Laura; Upton, Penney; Upton, Dominic
2014-09-01
Despite clear benefits of the Evidence-Based Practice (EBP) approach to ensuring quality and consistency of care, its uptake within physiotherapy has been inconsistent. Synthesise the findings of research into EBP barriers, facilitators and interventions in physiotherapy and identify methods of enhancing adoption and implementation. Literature concerning physiotherapists' practice between 2000 and 2012 was systematically searched using: Academic Search Complete, Cumulative Index of Nursing and Allied Health Literature Plus, American Psychological Association databases, Medline, Journal Storage, and Science Direct. Reference lists were searched to identify additional studies. Thirty-two studies, focusing either on physiotherapists' EBP knowledge, attitudes or implementation, or EBP interventions in physiotherapy were included. One author undertook all data extraction and a second author reviewed to ensure consistency and rigour. Synthesis was organised around the themes of EBP barriers/enablers, attitudes, knowledge/skills, use and interventions. Many physiotherapists hold positive attitudes towards EBP. However, this does not necessarily translate into consistent, high-quality EBP. Many barriers to EBP implementation are apparent, including: lack of time and skills, and misperceptions of EBP. Only studies published in the English language, in peer-reviewed journals were included, thereby introducing possible publication bias. Furthermore, narrative synthesis may be subject to greater confirmation bias. There is no "one-size fits all" approach to enhancing EBP implementation; assessing organisational culture prior to designing interventions is crucial. Although some interventions appear promising, further research is required to explore the most effective methods of supporting physiotherapists' adoption of EBP. Copyright © 2014 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
Atomic Layer Deposition of Vanadium Dioxide and a Temperature-dependent Optical Model.
Currie, Marc; Mastro, Michael A; Wheeler, Virginia D
2018-05-23
Vanadium dioxide is a material that has a reversible metal-insulator phase change near 68 °C. To grow VO2 on a wide variety of substrates, with wafer-scale uniformity and angstrom-level control of thickness, the method of atomic layer deposition (ALD) was chosen. This ALD process enables high-quality, low-temperature (≤150 °C) growth of ultrathin films (100-1000 Å) of VO2. For this demonstration, the VO2 films were grown on sapphire substrates. This low-temperature growth technique produces mostly amorphous VO2 films. A subsequent anneal in an ultra-high vacuum chamber with a pressure of 7×10⁻⁴ Pa of ultra-high purity (99.999%) oxygen produced oriented, polycrystalline VO2 films. The crystallinity, phase, and strain of the VO2 were determined by Raman spectroscopy and X-ray diffraction, the stoichiometry and impurity levels were determined by X-ray photoelectron spectroscopy, and the morphology was determined by atomic force microscopy. These data demonstrate the high quality of the films grown by this technique. A model was created to fit the data for VO2 in its metallic and insulating phases in the near-infrared spectral region. The permittivity and refractive index of the ALD VO2 agreed well with films from other fabrication methods in the insulating phase, but showed a difference in the metallic state. Finally, the analysis of the films' optical properties enabled the creation of a wavelength- and temperature-dependent model of the complex optical refractive index for developing VO2 as a tunable refractive index material.
Sweetapple, Christine; Fu, Guangtao; Butler, David
2013-09-01
This study investigates sources of uncertainty in the modelling of greenhouse gas emissions from wastewater treatment, through the use of local and global sensitivity analysis tools, and contributes to an in-depth understanding of wastewater treatment modelling by revealing critical parameters and parameter interactions. One-factor-at-a-time sensitivity analysis is used to screen model parameters and identify those with significant individual effects on three performance indicators: total greenhouse gas emissions, effluent quality and operational cost. Sobol's method enables identification of parameters with significant higher-order effects and of particular parameter pairs to which model outputs are sensitive. Use of a variance-based global sensitivity analysis tool to investigate parameter interactions enables identification of important parameters not revealed in one-factor-at-a-time sensitivity analysis. These interaction effects have not been considered in previous studies and thus provide a better understanding of wastewater treatment plant model characterisation. It was found that uncertainty in modelled nitrous oxide emissions is the primary contributor to uncertainty in total greenhouse gas emissions, due largely to the interaction effects of three nitrogen conversion modelling parameters. The higher-order effects of these parameters are also shown to be a key source of uncertainty in effluent quality. Copyright © 2013 Elsevier Ltd. All rights reserved.
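A variance-based Sobol analysis of the kind described can be run with the SALib library; the sketch below uses a toy three-parameter model with invented parameter names and bounds, not the wastewater plant model itself, and the Saltelli sampler shown here is one of several sampling options SALib provides.

```python
# Sketch of a variance-based (Sobol') sensitivity analysis with SALib on a
# toy 3-parameter model; names and bounds are placeholders.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["k_N2O", "mu_AOB", "b_H"],   # hypothetical parameter labels
    "bounds": [[0.5, 1.5], [0.2, 1.0], [0.05, 0.4]],
}

X = saltelli.sample(problem, 1024)           # N * (2D + 2) parameter sets
Y = X[:, 0] * X[:, 1] + 10.0 * X[:, 2] ** 2  # stand-in for the plant model output

Si = sobol.analyze(problem, Y)
print("first-order indices:", Si["S1"])   # individual effects
print("total-order indices:", Si["ST"])   # individual + interaction effects
```

Parameters whose total-order index greatly exceeds their first-order index are exactly the ones whose influence comes through interactions — the effect the one-factor-at-a-time screening misses.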
Occupancy as a surrogate for abundance estimation
MacKenzie, D.I.; Nichols, J.D.
2004-01-01
In many monitoring programmes it may be prohibitively expensive to estimate the actual abundance of a bird species in a defined area, particularly at large spatial scales, or where birds occur at very low densities. Often it may be appropriate to consider the proportion of area occupied by the species as an alternative state variable. However, as with abundance estimation, issues of detectability must be taken into account in order to make accurate inferences: the non-detection of the species does not imply the species is genuinely absent. Here we review some recent modelling developments that permit unbiased estimation of the proportion of area occupied, colonization and local extinction probabilities. These methods allow for unequal sampling effort and enable covariate information on sampling locations to be incorporated. We also describe how these models could be extended to incorporate information from marked individuals, which would enable finer questions of population dynamics (such as turnover rate of nest sites by specific breeding pairs) to be addressed. We believe these models may be applicable to a wide range of bird species and may be useful for investigating various questions of ecological interest. For example, with respect to habitat quality, we might predict that a species is more likely to have higher local extinction probabilities, or higher turnover rates of specific breeding pairs, in poor quality habitats.
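The simplest of these models, the single-season occupancy model, can be written down compactly: each site is occupied with probability ψ, and an occupied site is detected on a given visit with probability p. A minimal maximum-likelihood sketch on simulated detection histories (no covariates, unlike the richer models reviewed) is given below; the binomial coefficient is omitted since it does not affect the parameter estimates.

```python
# Minimal single-season occupancy model sketch: jointly estimate occupancy
# psi and detection p from simulated detection histories.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)
n_sites, n_visits, psi_true, p_true = 200, 5, 0.6, 0.3
occupied = rng.random(n_sites) < psi_true
detections = (rng.random((n_sites, n_visits)) < p_true) & occupied[:, None]
d = detections.sum(axis=1)                        # detections per site

def nll(theta):
    psi, p = expit(theta)                         # logit scale keeps (0, 1)
    det = psi * p**d * (1 - p) ** (n_visits - d)  # sites with >= 1 detection
    never = psi * (1 - p) ** n_visits + (1 - psi) # never detected: two causes
    return -np.log(np.where(d > 0, det, never)).sum()

fit = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
print("psi, p estimates:", expit(fit.x))          # ~ [0.6, 0.3]
```

The `never` term is the key detectability correction: a site with no detections may be unoccupied, or occupied but missed on all visits.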
LeProust, Emily M.; Peck, Bill J.; Spirin, Konstantin; McCuen, Heather Brummel; Moore, Bridget; Namsaraev, Eugeni; Caruthers, Marvin H.
2010-01-01
We have achieved the ability to synthesize thousands of unique, long oligonucleotides (150mers) in fmol amounts using parallel synthesis of DNA on microarrays. The sequence accuracy of the oligonucleotides in such large-scale syntheses has been limited by the yields and side reactions of the DNA synthesis process used. While there has been significant demand for libraries of long oligos (150mer and more), the yields in conventional DNA synthesis and the associated side reactions have previously limited the availability of oligonucleotide pools to lengths <100 nt. Using novel array based depurination assays, we show that the depurination side reaction is the limiting factor for the synthesis of libraries of long oligonucleotides on Agilent Technologies’ SurePrint® DNA microarray platform. We also demonstrate how depurination can be controlled and reduced by a novel detritylation process to enable the synthesis of high quality, long (150mer) oligonucleotide libraries and we report the characterization of synthesis efficiency for such libraries. Oligonucleotide libraries prepared with this method have changed the economics and availability of several existing applications (e.g. targeted resequencing, preparation of shRNA libraries, site-directed mutagenesis), and have the potential to enable even more novel applications (e.g. high-complexity synthetic biology). PMID:20308161
Synthetic spike-in standards for high-throughput 16S rRNA gene amplicon sequencing
Tourlousse, Dieter M.; Yoshiike, Satowa; Ohashi, Akiko; Matsukura, Satoko; Noda, Naohiro
2017-01-01
Abstract High-throughput sequencing of 16S rRNA gene amplicons (16S-seq) has become a widely deployed method for profiling complex microbial communities but technical pitfalls related to data reliability and quantification remain to be fully addressed. In this work, we have developed and implemented a set of synthetic 16S rRNA genes to serve as universal spike-in standards for 16S-seq experiments. The spike-ins represent full-length 16S rRNA genes containing artificial variable regions with negligible identity to known nucleotide sequences, permitting unambiguous identification of spike-in sequences in 16S-seq read data from any microbiome sample. Using defined mock communities and environmental microbiota, we characterized the performance of the spike-in standards and demonstrated their utility for evaluating data quality on a per-sample basis. Further, we showed that staggered spike-in mixtures added at the point of DNA extraction enable concurrent estimation of absolute microbial abundances suitable for comparative analysis. Results also underscored that template-specific Illumina sequencing artifacts may lead to biases in the perceived abundance of certain taxa. Taken together, the spike-in standards represent a novel bioanalytical tool that can substantially improve 16S-seq-based microbiome studies by enabling comprehensive quality control along with absolute quantification. PMID:27980100
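The absolute-quantification logic reduces to a per-sample scaling: reads for each taxon are converted to copies using the known number of spiked-in copies and the spike-in reads recovered in that sample. A back-of-envelope sketch with invented numbers:

```python
# Spike-in-based absolute quantification: scale taxon reads by
# (spiked-in copies / spike-in reads recovered) per sample.
import numpy as np

spiked_copies = 1.0e6                  # synthetic 16S copies added per sample
#                          sample A   sample B
spikein_reads = np.array([  5_000.0,  20_000.0])
taxon_reads   = np.array([[12_000.0, 12_000.0],    # taxon 1
                          [ 3_000.0,  6_000.0]])   # taxon 2

copies_per_sample = taxon_reads * (spiked_copies / spikein_reads)
print(copies_per_sample)
# Taxon 1 has identical read counts in both samples, but 4x fewer spike-in
# reads in sample A imply a 4x higher absolute abundance there.
```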
Older Patients' Perspectives on Quality of Serious Illness Care in Primary Care.
Abu Al Hamayel, Nebras; Isenberg, Sarina R; Hannum, Susan M; Sixon, Joshua; Smith, Katherine Clegg; Dy, Sydney M
2018-01-01
Despite increased focus on measuring and improving quality of serious illness care, there has been little emphasis on the primary care context or incorporation of the patient perspective. To explore older patients' perspectives on the quality of serious illness care in primary care. Qualitative interview study. Twenty patients aged 60 or older who were at risk for or living with serious illness and who had participated in the clinic's quality improvement initiative. We used a semistructured, open-ended guide focusing on how older patients perceived quality of serious illness care, particularly in primary care. We transcribed interviews verbatim and inductively identified codes. We identified emergent themes using a thematic and constant comparative method. We identified 5 key themes: (1) the importance of patient-centered communication, (2) coordination of care, (3) the shared decision-making process, (4) clinician competence, and (5) access to care. Communication was an overarching theme that facilitated coordination of care between patients and their clinicians, empowered patients for shared decision-making, related to clinicians' perceived competence, and enabled access to primary and specialty care. Although access to care is not traditionally considered an aspect of quality, patients considered this integral to the quality of care they received. Patients perceived serious illness care as a key aspect of quality in primary care. Efforts to improve quality measurement and implementation of quality improvement initiatives in serious illness care should consider these aspects of care that patients deem important, particularly communication as an overarching priority.
New approach to probability estimate of femoral neck fracture by fall (Slovak regression model).
Wendlova, J
2009-01-01
3,216 Slovak women with primary or secondary osteoporosis or osteopenia, aged 20-89 years (mean age 58.9 years, 95% CI 58.42-59.38), were examined with the DXA bone densitometer (dual energy X-ray absorptiometry, GE, Prodigy - Primo). The values of the following variables were measured for each patient: FSI (femur strength index), T-score total hip left, alpha angle - left, theta angle - left, and HAL (hip axis length) left; BMI (body mass index) was calculated from the height and weight of the patients. The regression model determined the following order of independent variables according to the intensity of their influence upon the occurrence of values of the dependent FSI variable: 1. BMI, 2. theta angle, 3. T-score total hip, 4. alpha angle, 5. HAL. The regression model equation, calculated from the variables monitored in the study, enables a doctor in practice to determine the probability magnitude (absolute risk) of a pathological FSI value (FSI < 1) in the femoral neck area, i.e., it allows a probability estimate of femoral neck fracture by fall for Slovak women. 1. The Slovak regression model differs from regression models published until now in its chosen independent variables and a dependent variable belonging to the biomechanical variables characterising bone quality. 2. The Slovak regression model excludes the inaccuracies of other models, which are not able to define precisely the current and past clinical condition of tested patients (e.g., to define the length and dose of exposure to risk factors). 3. The Slovak regression model opens the way to a new method of estimating the probability (absolute risk) or the odds of a femoral neck fracture by fall, based upon bone quality determination. 4. It is assumed that development will proceed by improving the methods for measuring bone quality and determining the probability of fracture by fall.
Film-based delivery quality assurance for robotic radiosurgery: Commissioning and validation.
Blanck, Oliver; Masi, Laura; Damme, Marie-Christin; Hildebrandt, Guido; Dunst, Jürgen; Siebert, Frank-Andre; Poppinga, Daniela; Poppe, Björn
2015-07-01
Robotic radiosurgery demands comprehensive delivery quality assurance (DQA), but guidelines for commissioning of the DQA method are missing. We investigated the stability and sensitivity of our film-based DQA method with various test scenarios and routine patient plans. We also investigated the applicability of tight distance-to-agreement (DTA) Gamma-Index criteria. We used radiochromic films with multichannel film dosimetry and re-calibration, and our analysis was performed in four steps: 1) film-to-plan registration; 2) standard Gamma-Index criteria evaluation (local-pixel-dose-difference ≤2%, distance-to-agreement ≤2 mm, pass-rate ≥90%); 3) dose distribution shift until the maximum pass-rate (Maxγ) was found (shift acceptance <1 mm); and 4) final evaluation with tight DTA criteria (≤1 mm). Test scenarios consisted of purposefully introduced phantom misalignments, dose miscalibrations, and undelivered MU. Initial method evaluation was done on 30 clinical plans. Our method showed similar sensitivity compared to the standard End-2-End test and incorporated an estimate of global system offsets in the analysis. The simulated errors (phantom shifts, global robot misalignment, undelivered MU) were detected by our method, while standard Gamma-Index criteria often did not reveal these deviations. Dose miscalibration was not detected by film alone, hence simultaneous ion-chamber measurement for film calibration is strongly recommended. 83% of the clinical patient plans were within our tight DTA tolerances. The presented methods provide additional measurements and quality references for film-based DQA, enabling more sensitive error detection. We provide various test scenarios for commissioning of robotic radiosurgery DQA and demonstrate the necessity of using tight DTA criteria. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
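For intuition, a Gamma-Index in one dimension can be computed directly from its definition; the simplified sketch below uses a global dose normalisation (the study's standard criteria use local pixel dose differences) and compares the standard 2 mm DTA against the tight 1 mm criterion on a synthetic dose profile.

```python
# Simplified 1-D global Gamma-Index sketch (the clinical DQA workflow uses
# 2-D multichannel film data; thresholds mirror the 2%/2 mm and tight 1 mm
# DTA criteria discussed above).
import numpy as np

def gamma_1d(x, d_ref, d_eval, dd=0.02, dta=2.0):
    """Global gamma: dose difference normalised to the max reference dose."""
    norm = dd * d_ref.max()
    dist = (x[:, None] - x[None, :]) / dta          # all ref/eval position pairs
    ddiff = (d_ref[:, None] - d_eval[None, :]) / norm
    return np.sqrt(dist**2 + ddiff**2).min(axis=1)  # best match per ref point

x = np.arange(0.0, 50.0, 0.5)                       # position [mm]
d_ref = np.exp(-0.5 * ((x - 25) / 6.0) ** 2)        # planned dose profile
d_eval = np.roll(d_ref, 2) * 1.015                  # 1 mm shift, 1.5% hot delivery

for dta in (2.0, 1.0):                              # standard vs tight DTA
    g = gamma_1d(x, d_ref, d_eval, dd=0.02, dta=dta)
    print(f"DTA {dta} mm: pass rate {(g <= 1).mean():.1%}")
```

Tightening the DTA from 2 mm to 1 mm is exactly what makes the small simulated misalignment visible in the pass rate.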
Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin
2017-01-01
Background Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors. Methods Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted along with data visualisation designed to inform service improvement within the context of limited resources. Results A ‘Rich Picture’ was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. Conclusions When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further consideration. PMID:28062603
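The CART component of such an analysis can be reproduced in outline with scikit-learn; the sketch below uses fabricated audit-like variables (feature names and risk structure are invented) purely to show how a tree yields interpretable patient risk groups for targeting interventions.

```python
# Sketch of a CART risk-group analysis in the spirit of the study, using
# scikit-learn on fabricated audit-like data (feature names are invented).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 2000
weight_kg = rng.normal(3.2, 0.6, n)             # weight at discharge
complex_cardiac = rng.random(n) < 0.3           # complex diagnosis flag
X = np.column_stack([weight_kg, complex_cardiac])

# Fabricated outcome: low weight and complex diagnosis raise adverse-event risk
risk = 0.05 + 0.15 * (weight_kg < 2.8) + 0.10 * complex_cardiac
y = rng.random(n) < risk

tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=100).fit(X, y)
print(export_text(tree, feature_names=["weight_kg", "complex_cardiac"]))
```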
2013-01-01
Background To develop a Consumer Quality Index (CQI) Cancer Care questionnaire for measuring experiences with hospital care of patients with different types of cancer. Methods We derived quality aspects from focus group discussions, existing questionnaires and literature. We developed an experience questionnaire and sent it to 1,498 Dutch cancer patients. Another questionnaire measuring the importance of the quality aspects was sent to 600 cancer patients. Data were psychometrically analysed. Results The response to the experience questionnaire was 50 percent. Psychometric analysis revealed 12 reliable scales. Patients rated rapid and adequate referral, rapid start of the treatment after diagnosis, enough information and confidence in the healthcare professionals as most important themes. Hospitals received high scores for skills and cooperation of healthcare professionals and a patient-centered approach by doctors; and low scores for psychosocial guidance and information at completion of the treatment. Conclusions The CQI Cancer Care questionnaire is a valuable tool for the evaluation of the quality of cancer care from the patient’s perspective. Large scale implementation is necessary to determine the discriminatory powers of the questionnaire and may enable healthcare providers to improve the quality of cancer care. Preliminary results indicate that hospitals could improve their psychosocial guidance and information provision. PMID:23617741
A proposed ground-water quality monitoring network for Idaho
Whitehead, R.L.; Parliman, D.J.
1979-01-01
A ground-water quality monitoring network is proposed for Idaho. The network comprises 565 sites, 8 of which will require construction of new wells. Sampling frequencies at the different sites are assigned at quarterly, semiannual, annual, and 5-year intervals. Selected characteristics of the water will be monitored by both laboratory- and field-analysis methods. The network is designed to: (1) enable water managers to keep abreast of the general quality of the State's ground water, and (2) serve as a warning system for undesirable changes in ground-water quality. Data were compiled for hydrogeologic conditions, ground-water quality, cultural elements, and pollution sources. A 'hydrologic unit priority index' is used to rank 84 hydrologic units (river basins or segments of river basins) of the State for monitoring according to pollution potential. Emphasis for selection of monitoring sites is placed on the 15 highest-ranked units. The potential for pollution is greatest in areas of privately owned agricultural land. Other areas of pollution potential are residential development, mining and related processes, and hazardous waste disposal. Data are given for laboratory and field analyses, number of site visits, manpower, subsistence, and mileage, from which costs for implementing the network can be estimated. Suggestions are made for data storage and retrieval and for reporting changes in water quality.
A novel scalable manufacturing process for the production of hydrogel-forming microneedle arrays.
Lutton, Rebecca E M; Larrañeta, Eneko; Kearney, Mary-Carmel; Boyd, Peter; Woolfson, A David; Donnelly, Ryan F
2015-10-15
A novel manufacturing process for fabricating microneedle arrays (MN) has been designed and evaluated. The prototype is able to successfully produce 14×14 MN arrays and is easily capable of scale-up, enabling the transition from laboratory to industry and subsequent commercialisation. The method requires the custom design of metal MN master templates to produce silicone MN moulds using an injection moulding process. The MN arrays produced using this novel method were compared with those made by centrifugation, the traditional method of producing aqueous hydrogel-forming MN arrays. The results proved that there was negligible difference between the two methods, with each producing MN arrays of comparable quality. Both types of MN arrays can be successfully inserted into a skin simulant. In both cases the insertion depth was approximately 60% of the needle length and the height reduction after insertion was approximately 3%. Copyright © 2015 Elsevier B.V. All rights reserved.
Bridges for Pedestrians with Random Parameters using the Stochastic Finite Elements Analysis
NASA Astrophysics Data System (ADS)
Szafran, J.; Kamiński, M.
2017-02-01
The main aim of this paper is to present a Stochastic Finite Element Method analysis of the principal design parameters of bridges for pedestrians: the eigenfrequency and the deflection of the bridge span. They are considered with respect to the random thickness of plates in the boxed-section bridge platform, the Young's modulus of the structural steel, and the static load resulting from a crowd of pedestrians. The influence of the quality of the numerical model in the context of traditional FEM is also shown using the example of a simple steel shield. Steel structures with random parameters are discretized in exactly the same way as for traditional Finite Element Method analysis. The probabilistic version is provided by the Response Function Method, where several numerical tests with random parameter values varying around the mean value enable the determination of the structural response and, via the Least Squares Method, its final probabilistic moments.
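The Response Function Method can be sketched in a few lines: evaluate the response at several parameter values around the mean, fit a polynomial response function by least squares, and take probabilistic moments of that cheap surrogate. The example below substitutes a closed-form beam deflection for a full FEM solve and assumes a Gaussian Young's modulus; all numbers are illustrative.

```python
# Response Function Method sketch: polynomial response surface fitted by
# least squares, then moments from the surrogate. A closed-form midspan
# deflection of a simply supported beam stands in for a full FEM solve.
import numpy as np

L, P, I = 10.0, 50e3, 8.0e-4            # span [m], crowd load [N], inertia [m^4]
def deflection(E):                       # point load at midspan: P L^3 / (48 E I)
    return P * L**3 / (48 * E * I)

E_mean, E_cov = 210e9, 0.05              # Young's modulus: mean, coeff. of variation
xi = np.linspace(0.85, 1.15, 7)          # numerical tests around the mean
coeffs = np.polyfit(xi, deflection(xi * E_mean), deg=2)   # response function

# Probabilistic moments via sampling of the cheap fitted polynomial
E_samples = np.random.default_rng(0).normal(E_mean, E_cov * E_mean, 100_000)
w = np.polyval(coeffs, E_samples / E_mean)
print(f"mean deflection {w.mean()*1e3:.2f} mm, std {w.std()*1e3:.2f} mm")
```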
2010-01-01
Background Transferring knowledge from research into practice can be challenging, partly because the process involves a change in attitudes, roles and behaviour by individuals and teams. Helping teams to identify then target potential barriers may aid the knowledge transfer process. The aim of this study was to identify barriers and enablers, as perceived by allied health professionals, to delivering an evidence-based (Level 1) outdoor journey intervention for people with stroke. Methods A qualitative design and semi-structured interviews were used. Allied health professionals (n = 13) from two community rehabilitation teams were interviewed, before and after receiving feedback from a medical record audit and attending a training workshop. Interviews allowed participants to identify potential and actual barriers, as well as enablers to delivering the intervention. Qualitative data were analysed using theoretical domains described by Michie and colleagues. Results Two barriers to delivery of the intervention were the social influence of people with stroke and their family, and professionals' beliefs about their capabilities. Other barriers included professionals' knowledge and skills, their role identity, availability of resources, whether professionals remembered to provide the intervention, and how they felt about delivering the intervention. Enablers to delivering the intervention included a belief that they could deliver the intervention, a willingness to expand and share professional roles, procedures that reminded them what to do, and feeling good about helping people with stroke to participate. Conclusions This study represents one step in the quality improvement process. The interviews encouraged reflection by staff. We obtained valuable data which have been used to plan behaviour change interventions addressing identified barriers. Our methods may assist other researchers who need to design similar behaviour change interventions. PMID:20082725
Armstrong, Susan J.; Rispel, Laetitia C.; Penn-Kekana, Loveday
2015-01-01
Background Improving the quality of health care is central to the proposed health care reforms in South Africa. Nursing unit managers play a key role in coordinating patient care activities and in ensuring quality care in hospitals. Objective This paper examines whether the activities of nursing unit managers facilitate the provision of quality patient care in South African hospitals. Methods During 2011, a cross-sectional, descriptive study was conducted in nine randomly selected hospitals (six public, three private) in two South African provinces. In each hospital, one of each of the medical, surgical, paediatric, and maternity units was selected (n=36). Following informed consent, each unit manager was observed for a period of 2 hours on the survey day and the activities recorded on a minute-by-minute basis. The activities were entered into Microsoft Excel, coded into categories, and analysed according to the time spent on activities in each category. The observation data were complemented by semi-structured interviews with the unit managers, who were asked to recall their activities on the day preceding the interview. The interviews were analysed using thematic content analysis. Results The study found that nursing unit managers spent 25.8% of their time on direct patient care, 16% on hospital administration, 14% on patient administration, 3.6% on education, 13.4% on support and communication, 3.9% on managing stock and equipment, 11.5% on staff management, and 11.8% on miscellaneous activities. There were also numerous interruptions and distractions. The semi-structured interviews revealed concordance between unit managers' recall and the observed time spent on patient care, but a marked inflation of their perceived time spent on hospital administration. Conclusion The creation of an enabling practice environment, supportive executive management, and continuing professional development are needed to enable nursing managers to lead the provision of consistent and high-quality patient care. PMID:25971397
Injection-controlled laser resonator
Chang, Jim J.
1995-07-18
A new injection-controlled laser resonator incorporates self-filtering and self-imaging characteristics with an efficient injection scheme. A low-divergence laser signal is injected into the resonator, which enables the injection signal to be converted to the desired resonator modes before the main laser pulse starts. This injection technique and resonator design enable the laser cavity to improve the quality of the injection signal through self-filtering before the main laser pulse starts. The self-imaging property of the present resonator reduces the cavity induced diffraction effects and, in turn, improves the laser beam quality.
Improving the efficacy of healthcare services for Aboriginal Australians.
Gwynne, Kylie; Jeffries, Thomas; Lincoln, Michelle
2018-01-16
Objective The aim of the present systematic review was to examine the enablers for effective health service delivery for Aboriginal Australians. Methods This systematic review was undertaken in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Papers were included if they had data related to health services for Australian Aboriginal people and were published between 2000 and 2015. The 21 papers that met the inclusion criteria were assessed using the Effective Public Health Practice Project Quality Assessment Tool for Quantitative Studies. Seven papers were subsequently excluded due to weak methodological approaches. Results There were two findings in the present study: (1) that Aboriginal people fare worse than non-Aboriginal people when accessing usual healthcare services; and (2) there are five enablers for effective health care services for Australian Aboriginal people: cultural competence, participation rates, organisational, clinical governance and compliance, and availability of services. Conclusions Health services for Australian Aboriginal people must be tailored and implementation of the five enablers is likely to affect the effectiveness of health services for Aboriginal people. The findings of the present study have significant implications in directing the future design, funding, delivery and evaluation of health care services for Aboriginal Australians. What is known about the topic? There is significant evidence about poor health outcomes and the 10-year gap in life expectancy between Aboriginal and non-Aboriginal people, and limited evidence about improving health service efficacy. What does this paper add? This systematic review found that with usual health care delivery, Aboriginal people experience worse health outcomes. This paper identifies five strategies in the literature that improve the effectiveness of health care services intended for Aboriginal people. What are the implications for practitioners? Aboriginal people fare worse in both experience and outcomes when they access usual care services. Health services intended for Aboriginal people should be tailored using the five enablers to provide timely, culturally safe and high-quality care.
Wu, Zhenqin; Ramsundar, Bharath; Feinberg, Evan N.; Gomes, Joseph; Geniesse, Caleb; Pappu, Aneesh S.; Leswing, Karl
2017-01-01
Molecular machine learning has been maturing rapidly over the last few years. Improved methods and the presence of larger datasets have enabled machine learning algorithms to make increasingly accurate predictions about molecular properties. However, algorithmic progress has been limited due to the lack of a standard benchmark to compare the efficacy of proposed methods; most new algorithms are benchmarked on different datasets making it challenging to gauge the quality of proposed methods. This work introduces MoleculeNet, a large scale benchmark for molecular machine learning. MoleculeNet curates multiple public datasets, establishes metrics for evaluation, and offers high quality open-source implementations of multiple previously proposed molecular featurization and learning algorithms (released as part of the DeepChem open source library). MoleculeNet benchmarks demonstrate that learnable representations are powerful tools for molecular machine learning and broadly offer the best performance. However, this result comes with caveats. Learnable representations still struggle to deal with complex tasks under data scarcity and highly imbalanced classification. For quantum mechanical and biophysical datasets, the use of physics-aware featurizations can be more important than choice of particular learning algorithm. PMID:29629118
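Since MoleculeNet ships as part of the open-source DeepChem library mentioned in the abstract, a typical session looks roughly like the following; loader signatures vary somewhat between DeepChem releases, so treat this as indicative rather than exact.

```python
# Usage sketch for the MoleculeNet benchmark via DeepChem; the loader
# signature may differ between DeepChem releases.
import deepchem as dc
from sklearn.ensemble import RandomForestClassifier

# Load the Tox21 collection with circular-fingerprint featurisation
tasks, (train, valid, test), transformers = dc.molnet.load_tox21(
    featurizer="ECFP")

print(len(tasks), "tasks;", train.X.shape[0], "training molecules")

# Baseline: a random forest on the first toxicity task
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(train.X, train.y[:, 0])
print("validation accuracy:", clf.score(valid.X, valid.y[:, 0]))
```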
Professional Competencies of Cuban Specialists in Intensive Care and Emergency Medicine.
Véliz-Martínez, Pedro L; Jorna-Calixto, Ana R; Oramas-González, René
2016-10-01
INTRODUCTION The quality of medical training and practice reflects the competency level of the professionals involved. The intensive care and emergency medicine specialty in Cuba has not defined its competencies. OBJECTIVE Identify the competencies required for specialty practice in intensive care and emergency medicine. METHODS The study was conducted from January 2014 to December 2015, using qualitative techniques; 48 professionals participated. We undertook functional occupational analysis, based on functions defined in a previous study. Three expert groups were utilized: the first used various group techniques; the second, the Delphi method; and the third, the Delphi method and a Likert questionnaire. RESULTS A total of 73 specific competencies were defined, grouped in 11 units: 44 in the patient care function, 16 in management, 7 in teaching and 6 in research. A competency map is provided. CONCLUSIONS The intensive care and emergency medicine specialty competencies identified will help improve professional standards, ensure health workforce quality, improve patient care and academic performance, and enable objective evaluation of specialists' competence and performance. KEYWORDS Clinical competency, competency-based education, professional education, intensive care, emergency medicine, urgent care, continuing medical education, curriculum, medical residency, Cuba.
NASA Astrophysics Data System (ADS)
Lin, Tingting; Zhang, Siyuan; Zhang, Yang; Wan, Ling; Lin, Jun
2017-01-01
Compared with other geophysical approaches, the magnetic resonance sounding (MRS) technique is direct and nondestructive in subsurface water exploration. It provides the water content distribution and estimates hydrogeological properties. The biggest challenge is that MRS measurements always suffer from a poor signal-to-noise ratio, so they can be carried out only far from sources of noise. To solve this problem, a series of de-noising methods have been developed. However, most of them are applied in post-processing, leaving data quality uncontrolled during in situ measurements. In the present study, a new approach, online removal of correlated noise, is proposed to overcome this restriction. Based on LabVIEW, a method is provided that enables online data quality control by realizing signal acquisition and noise filtering simultaneously. Using one or more reference coils, adaptive noise cancellation based on LabVIEW to eliminate correlated noise is available for in situ measurements. The approach was examined through numerical simulation and field measurements. The correlated noise is mitigated effectively, making MRS measurements feasible in high-noise environments. The method shortens the measurement time and improves measurement efficiency.
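Adaptive noise cancellation with a reference coil is classically implemented with an LMS filter: the filter learns the coupling between the reference channel and the primary channel and subtracts its noise estimate sample by sample. The Python sketch below reproduces that standard algorithm on synthetic data (the paper's implementation is in LabVIEW; the decay constants and coupling filter here are invented).

```python
# Illustrative LMS adaptive noise cancellation, the standard algorithm behind
# reference-coil noise removal; synthetic data, paraphrased in Python.
import numpy as np

rng = np.random.default_rng(0)
fs, n = 5000, 20000
t = np.arange(n) / fs
signal = 1e-2 * np.exp(-t / 1.5) * np.sin(2 * np.pi * 2000 * t)  # MRS-like decay
reference = rng.standard_normal(n)                 # noise seen by reference coil
coupled = np.convolve(reference, [0.8, 0.3, -0.2])[:n]  # causal coupling into main coil
primary = signal + coupled

taps, mu = 8, 1e-3
w = np.zeros(taps)
out = np.zeros(n)
for i in range(taps, n):
    x = reference[i - taps + 1:i + 1][::-1]        # most recent samples first
    est = w @ x                                    # estimate of coupled noise
    out[i] = primary[i] - est                      # cleaned sample
    w += 2 * mu * out[i] * x                       # LMS weight update

print("noise power before:", np.var(primary - signal))
print("noise power after: ", np.var(out[taps:] - signal[taps:]))
```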
Fluorescence dye-based detection of mAb aggregates in CHO culture supernatants.
Paul, Albert Jesuran; Schwab, Karen; Prokoph, Nina; Haas, Elena; Handrick, René; Hesse, Friedemann
2015-06-01
Product yields, efficacy, and safety of monoclonal antibodies (mAbs) are reduced by the formation of higher molecular weight aggregates during upstream processing. In-process characterization of mAb aggregate formation is a challenge since there is a lack of a fast detection method to identify mAb aggregates in cell culture. In this work, we present a rapid method to characterize mAb aggregate-containing Chinese hamster ovary (CHO) cell culture supernatants. The fluorescence dyes thioflavin T (ThT) and 4-4-bis-1-phenylamino-8-naphthalene sulfonate (Bis-ANS) enabled the detection of soluble as well as large mAb aggregates. Partial least square (PLS) regression models were used to evaluate the linearity of the dye-based mAb aggregate detection in buffer down to a mAb aggregate concentration of 2.4 μg mL⁻¹. Furthermore, mAb aggregates were detected in bioprocess medium using Bis-ANS and ThT. Dye binding to aggregates was stable for 60 min, making the method robust and reliable. Finally, the developed method using 10 μmol L⁻¹ Bis-ANS enabled discrimination between CHO cell culture supernatants containing different levels of mAb aggregates. The method can be adapted for high-throughput screening, e.g., to screen for cell culture conditions influencing mAb product quality, and hence can contribute to the improvement of production processes of biopharmaceuticals in mammalian cell culture.
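The PLS evaluation step can be sketched with scikit-learn: regress the dye-fluorescence spectra onto known aggregate concentrations and check linearity by cross-validated R². The data below are synthetic stand-ins, not the re-calibrated multichannel measurements used in the study.

```python
# Sketch of the PLS linearity check: regress fluorescence spectra onto known
# aggregate concentrations (synthetic stand-in data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_channels = 60, 200
conc = rng.uniform(2.4, 100.0, n_samples)          # aggregate conc. [ug/mL]
emission = np.exp(-0.5 * ((np.arange(n_channels) - 90) / 25) ** 2)
X = conc[:, None] * emission + 0.5 * rng.standard_normal((n_samples, n_channels))

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, X, conc, cv=5).ravel()
r2 = 1 - np.sum((conc - pred) ** 2) / np.sum((conc - conc.mean()) ** 2)
print(f"cross-validated R^2: {r2:.3f}")
```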
High spatial resolution compressed sensing (HSPARSE) functional MRI.
Fang, Zhongnan; Van Le, Nguyen; Choy, ManKin; Lee, Jin Hyung
2016-08-01
To propose a novel compressed sensing (CS) high spatial resolution functional MRI (fMRI) method and demonstrate the advantages and limitations of using CS for high spatial resolution fMRI. A randomly undersampled variable-density spiral trajectory enabling an acceleration factor of 5.3 was designed with a balanced steady-state free precession sequence to achieve high spatial resolution data acquisition. A modified k-t SPARSE method was then implemented and applied with a strategy to optimize regularization parameters for consistent, high-quality CS reconstruction. The proposed method improves spatial resolution six-fold with 12 to 47% contrast-to-noise ratio (CNR) and 33 to 117% F-value improvement while maintaining the same temporal resolution. It also achieves high sensitivity of 69 to 99% compared with the original ground truth, a small false positive rate of less than 0.05, and low hemodynamic response function distortion across a wide range of CNRs. The proposed method is robust to physiological noise and enables detection of layer-specific activities in vivo, which cannot be resolved using the highest spatial resolution Nyquist acquisition. The proposed method enables high spatial resolution fMRI that can resolve layer-specific brain activity and demonstrates the significant improvement that CS can bring to high spatial resolution fMRI. Magn Reson Med 76:440-455, 2016. © 2015 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License.
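The principle behind CS reconstruction — enforce consistency with the undersampled data while promoting sparsity via soft thresholding — can be shown in a toy 1-D setting with the classic ISTA iteration. This is a didactic sketch, not the modified k-t SPARSE method, which operates on undersampled k-t space with sparsifying transforms and the regularization-parameter strategy described above.

```python
# Toy compressed-sensing reconstruction via ISTA (iterative soft
# thresholding) on a synthetic sparse signal.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 8                        # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x_true                                  # undersampled measurements

lam = 0.01
L = np.linalg.norm(A, 2) ** 2                   # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(500):
    z = x - (A.T @ (A @ x - y)) / L             # gradient step (data consistency)
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold (sparsity)

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```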
Rossler, Tomas; Mandat, Dusan; Gallo, Jiri; Hrabovsky, Miroslav; Pochmon, Michal; Havranek, Vitezslav
2009-07-20
Total hip arthroplasty (THA) significantly improves the quality of life in the majority of patients with severe osteoarthritis. However, long-term outcomes of THAs are compromised by aseptic loosening and periprosthetic osteolysis, which necessitate revision surgery. Both of these are causally linked to prosthetic wear debris liberated from the prosthetic articulating surfaces. As a result, there is a need to measure the mode and magnitude of wear. The paper evaluates three optical methods proposed for the construction of a device for non-contact prosthetic wear measurement. Of these, scanning profilometry achieved a promising combination of accuracy and repeatability, while being time-efficient enough to enable the development of a sensor for wear measurement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sader, John E., E-mail: jsader@unimelb.edu.au; Friend, James R.
2015-05-15
Overall precision of the simplified calibration method in J. E. Sader et al., Rev. Sci. Instrum. 83, 103705 (2012), Sec. III D, is dominated by the spring constant of the reference cantilever. The question arises: How does one take measurements from multiple reference cantilevers, and combine these results, to improve the uncertainty of the reference cantilever's spring constant and hence the overall precision of the method? This question is addressed in this note. Its answer enables manufacturers to specify a single set of data for the spring constant, resonant frequency, and quality factor from measurements on multiple reference cantilevers. With this data set, users can trivially calibrate cantilevers of the same type.
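One standard way to pool such repeated measurements (stated here as an assumption about the general statistics, not as the note's exact prescription) is the inverse-variance weighted mean, which also yields the reduced uncertainty of the combined estimate:

```python
# Minimal sketch: inverse-variance weighted mean of spring-constant
# measurements from multiple reference cantilevers (invented numbers).
import numpy as np

k = np.array([0.92, 0.95, 0.89, 0.94])        # spring constants [N/m]
sigma = np.array([0.04, 0.05, 0.03, 0.04])    # one-sigma uncertainties [N/m]

w = 1.0 / sigma**2
k_hat = np.sum(w * k) / np.sum(w)             # pooled estimate
sigma_hat = np.sqrt(1.0 / np.sum(w))          # pooled (reduced) uncertainty
print(f"k = {k_hat:.3f} +/- {sigma_hat:.3f} N/m")
```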
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiong, Kanglin; Mi, Hongyi; Chang, Tzu-Hsuan
2018-01-04
A novel method is developed to realize a III-V/Si dual-junction photovoltaic cell by combining epitaxial lift-off (ELO) and print-transfer-assisted bonding methods. The adoption of ELO enables III-V wafers to be recycled and reused, which can further lower the cost of III-V/Si photovoltaic panels. For demonstration, high crystal quality, micrometer-thick GaAs/AlGaAs/GaAs films are lifted off, transferred, and directly bonded onto a Si wafer without the use of any adhesive or bonding agents. The bonding interface is optically transparent and conductive both thermally and electrically. Prototype AlGaAs/Si dual-junction tandem solar cells have been fabricated and exhibit decent performance.
NASA Technical Reports Server (NTRS)
2001-01-01
Through a Small Business Innovation Research (SBIR) contract with NASA's Glenn Research Center, Rhenium Alloys, Inc., of Elyria, Ohio, developed a new method for producing rhenium combustion chambers. Using room temperature isostatic pressing, Rhenium Alloys, Inc., compacted rhenium powder to a high density and into the approximated end shape and dimension of the rocket thruster. The item was then subjected to sintering and containerless hot isostatic pressing, increasing the density of the powder metallurgy part. With the new manufacturing process, both production time and costs are reduced while quality is significantly increased. The method enabled the company to deliver two chemical rocket thrusters to Glenn Research Center. The company makes rhenium a practical choice in manufacturing fields, including the aerospace, nuclear, and electronic industries, with upcoming opportunities projected in medical instrumentation.
Schilardi, Patricia L; Dip, Patricio; dos Santos Claro, Paula C; Benítez, Guillermo A; Fonticelli, Mariano H; Azzaroni, Omar; Salvarezza, Roberto C
2005-12-16
Pattern transfer with high resolution is a frontier topic in the emerging field of nanotechnologies. Electrochemical molding is a possible route for nanopatterning metal, alloy, and oxide surfaces with high resolution in a simple and inexpensive way. This method involves electrodeposition onto a conducting master covered by a self-assembled alkanethiolate monolayer (SAM). This molecular film enables direct surface-relief pattern transfer from the conducting master to the inner face of the electrodeposit, and also allows easy release of the electrodeposited film due to its excellent anti-adherent properties. Replicas of the original conductive master can also be obtained by a simple two-step procedure. SAM quality and stability under electrodeposition conditions, combined with the formation of smooth electrodeposits, are crucial to obtain high-quality pattern transfer with sub-50 nm resolution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tendille, Florian, E-mail: florian.tendille@crhea.cnrs.fr; Vennéguès, Philippe; De Mierry, Philippe
2016-08-22
Semipolar GaN crystal stripes larger than 100 μm with dislocation densities below 5 × 10⁶ cm⁻² are achieved using a low cost fabrication process. An original sapphire patterning procedure is proposed, enabling selective growth of semipolar oriented GaN stripes while confining the defects to specific areas. Radiative and non-radiative crystalline defects are investigated by cathodoluminescence and can be correlated to the development of crystal microstructure during the growth process. A dislocation reduction mechanism, supported by transmission electron microscopy, is proposed. This method represents a step forward toward low-cost quasi-bulk semipolar GaN epitaxial platforms with an excellent structural quality which will allow for even more efficient III-nitride based devices.
One chromosome, one contig: complete microbial genomes from long-read sequencing and assembly.
Koren, Sergey; Phillippy, Adam M
2015-02-01
Like a jigsaw puzzle with large pieces, a genome sequenced with long reads is easier to assemble. However, recent sequencing technologies have favored lowering per-base cost at the expense of read length. This has dramatically reduced sequencing cost, but resulted in fragmented assemblies, which negatively affect downstream analyses and hinder the creation of finished (gapless, high-quality) genomes. In contrast, emerging long-read sequencing technologies can now produce reads tens of kilobases in length, enabling the automated finishing of microbial genomes for under $1000. This promises to improve the quality of reference databases and facilitate new studies of chromosomal structure and variation. We present an overview of these new technologies and the methods used to assemble long reads into complete genomes. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Davidson, John B.
1998-01-01
A multi-input, multi-output control law design methodology, named "CRAFT", is presented. CRAFT stands for the design objectives addressed, namely, Control power, Robustness, Agility, and Flying Qualities Tradeoffs. The methodology makes use of control law design metrics from each of the four design objective areas. It combines eigenspace assignment, which allows for direct specification of eigenvalues and eigenvectors, with a graphical approach for representing the metrics that captures numerous design goals in one composite illustration. Sensitivity of the metrics to eigenspace choice is clearly displayed, enabling the designer to assess the cost of design tradeoffs. This approach enhances the designer's ability to make informed design tradeoffs and to reach effective final designs. An example of the CRAFT methodology applied to an advanced experimental fighter and discussion of associated design issues are provided.
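For readers unfamiliar with eigenvalue placement, the sketch below shows pole placement for a toy two-state system using SciPy. Eigenspace assignment as used in CRAFT also constrains the eigenvectors, which this generic routine does not; the system matrices here are illustrative inventions, not from the paper.

```python
import numpy as np
from scipy.signal import place_poles

# Toy two-state linear system x' = Ax + Bu (hypothetical numbers)
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])

# Desired closed-loop eigenvalues (a damped complex-conjugate pair)
desired = np.array([-1.5 + 1.0j, -1.5 - 1.0j])

K = place_poles(A, B, desired).gain_matrix
print(np.linalg.eigvals(A - B @ K))   # verify the placed eigenvalues
```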
Yamada, I; Narihara, K; Funaba, H; Hayashi, H; Kohmoto, T; Takahashi, H; Shimozuma, T; Kubo, S; Yoshimura, Y; Igami, H; Tamura, N
2010-10-01
In Large Helical Device (LHD) experiments, an electron temperature (T(e)) of more than 15 keV has been observed by the yttrium-aluminum-garnet (YAG) laser Thomson scattering diagnostic. Since the LHD Thomson scattering system has been optimized for the temperature region 50 eV≤T(e)≤10 keV, data quality degrades in the higher T(e) region exceeding 10 keV. In order to accurately determine T(e) in the LHD high-T(e) experiments, we increased the laser pulse energy by simultaneously firing three lasers. The technique enables us to decrease the uncertainties in the measured T(e). Another signal accumulation method was also tested. In addition, we estimated the influence of high-energy electrons on T(e) obtained by the LHD Thomson scattering system.
Silva, Luiz Antonio F.; Barriviera, Mauricio; Januário, Alessandro L.; Bezerra, Ana Cristina B.; Fioravanti, Maria Clorinda S.
2011-01-01
The development of veterinary dentistry has substantially improved the ability to diagnose canine and feline dental abnormalities. Consequently, examinations previously performed only on humans are now available for small animals, thus improving the diagnostic quality. This has increased the need for technical qualification of veterinary professionals and increased technological investments. This study evaluated the use of cone beam computed tomography and intraoral radiography as complementary exams for diagnosing dental abnormalities in dogs and cats. Cone beam computed tomography provided faster image acquisition with high image quality, was associated with low ionizing radiation levels, enabled image editing, and reduced the exam duration. Our results showed that radiography was an effective method for dental radiographic examination, with low cost and fast execution times, and can be performed during surgical procedures. PMID:22122905
Approaches to Enable Demand Response by Industrial Loads for Ancillary Services Provision
NASA Astrophysics Data System (ADS)
Zhang, Xiao
Demand response has gained significant attention in recent years as it demonstrates potential to enhance the power system's operational flexibility in a cost-effective way. Industrial loads such as aluminum smelters, steel manufacturers, and cement plants demonstrate advantages in supporting power system operation through demand response programs, because of their intensive power consumption, already existing advanced monitoring and control infrastructure, and strong economic incentives due to high energy costs. In this thesis, we study approaches to efficiently integrate each of these types of manufacturing processes as demand response resources. The aluminum smelting process is able to change its power consumption both accurately and quickly by controlling the pots' DC voltage, without affecting the production quality. Hence, an aluminum smelter has both the motivation and the ability to participate in demand response. First, we focus on determining the optimal regulation capacity that such a manufacturing plant should provide. Next, we focus on determining its optimal bidding strategy in the day-ahead energy and ancillary services markets. Electric arc furnaces (EAFs) in steel manufacturing consume a large amount of electric energy. However, a steel plant can take advantage of time-based electricity prices by optimally arranging energy-consuming activities to avoid peak hours. We first propose scheduling methods that incorporate the EAFs' flexibilities to reduce the electricity cost. We then propose methods to make the computations more tractable. Finally, we extend the scheduling formulations to enable the provision of spinning reserve. Cement plants are able to quickly adjust their power consumption rate by switching on/off the crushers. However, switching on/off the loading units only achieves discrete power changes, which prevents the load from offering valuable ancillary services such as regulation and load following, as continuous power changes are required for these services. We propose methods that enable these services with the support of an on-site energy storage device. As demonstrated by the case studies, the proposed approaches are effective and can generate practical production instructions for the industrial loads. This thesis not only provides methods to enable demand response by industrial loads but also potentially encourages industrial loads to be active in electricity markets.
Acoustic window planning for ultrasound acquisition.
Göbl, Rüdiger; Virga, Salvatore; Rackerseder, Julia; Frisch, Benjamin; Navab, Nassir; Hennersperger, Christoph
2017-06-01
Autonomous robotic ultrasound has recently gained considerable interest, especially for collaborative applications. Existing methods for acquisition trajectory planning are solely based on geometrical considerations, such as the pose of the transducer with respect to the patient surface. This work aims at establishing acoustic window planning to enable autonomous ultrasound acquisitions of anatomies with restricted acoustic windows, such as the liver or the heart. We propose a fully automatic approach for the planning of acquisition trajectories, which only requires information about the target region as well as existing tomographic imaging data, such as X-ray computed tomography. The framework integrates both geometrical and physics-based constraints to estimate the best ultrasound acquisition trajectories with respect to the available acoustic windows. We evaluate the developed method using virtual planning scenarios based on real patient data as well as for real robotic ultrasound acquisitions on a tissue-mimicking phantom. The proposed method yields superior image quality in comparison with a naive planning approach, while maintaining the necessary coverage of the target. We demonstrate that by taking image formation properties into account acquisition planning methods can outperform naive plannings. Furthermore, we show the need for such planning techniques, since naive approaches are not sufficient as they do not take the expected image quality into account.
Sharif, Behzad; Derbyshire, J. Andrew; Faranesh, Anthony Z.; Bresler, Yoram
2010-01-01
MR imaging of the human heart without explicit cardiac synchronization promises to extend the applicability of cardiac MR to a larger patient population and potentially expand its diagnostic capabilities. However, conventional non-gated imaging techniques typically suffer from low image quality or inadequate spatio-temporal resolution and fidelity. Patient-Adaptive Reconstruction and Acquisition in Dynamic Imaging with Sensitivity Encoding (PARADISE) is a highly-accelerated non-gated dynamic imaging method that enables artifact-free imaging with high spatio-temporal resolutions by utilizing novel computational techniques to optimize the imaging process. In addition to using parallel imaging, the method gains acceleration from a physiologically-driven spatio-temporal support model; hence, it is doubly accelerated. The support model is patient-adaptive, i.e., its geometry depends on dynamics of the imaged slice, e.g., subject’s heart-rate and heart location within the slice. The proposed method is also doubly adaptive as it adapts both the acquisition and reconstruction schemes. Based on the theory of time-sequential sampling, the proposed framework explicitly accounts for speed limitations of gradient encoding and provides performance guarantees on achievable image quality. The presented in-vivo results demonstrate the effectiveness and feasibility of the PARADISE method for high resolution non-gated cardiac MRI during a short breath-hold. PMID:20665794
Elkin, L L; Harden, D G; Saldanha, S; Ferguson, H; Cheney, D L; Pieniazek, S N; Maloney, D P; Zewinski, J; O'Connell, J; Banks, M
2015-06-01
Compound pooling, or multiplexing more than one compound per well during primary high-throughput screening (HTS), is a controversial approach with a long history of limited success. Many issues with this approach likely arise from long-term storage of library plates containing complex mixtures of compounds at high concentrations. Due to the historical difficulties with using multiplexed library plates, primary HTS often uses a one-compound-one-well approach. However, as compound collections grow, innovative strategies are required to increase the capacity of primary screening campaigns. Toward this goal, we have developed a novel compound pooling method that increases screening capacity without compromising data quality. This method circumvents issues related to the long-term storage of complex compound mixtures by using acoustic dispensing to enable "just-in-time" compound pooling directly in the assay well immediately prior to assay. Using this method, we can pool two compounds per well, effectively doubling the capacity of a primary screen. Here, we present data from pilot studies using just-in-time pooling, as well as data from a large >2-million-compound screen using this approach. These data suggest that, for many targets, this method can be used to vastly increase screening capacity without significant reduction in the ability to detect screening hits. © 2015 Society for Laboratory Automation and Screening.
Berendes, Sima; Adeyemi, Olusegun; Oladele, Edward Adekola; Oresanya, Olusola Bukola; Okoh, Festus; Valadez, Joseph J.
2012-01-01
Background Patent medicine vendors (PMV) provide antimalarial treatment and care throughout Sub-Saharan Africa, and can play an important role in the fight against malaria. Their close-to-client infrastructure could enable lifesaving artemisinin-based combination therapy (ACT) to reach patients in time. However, systematic assessments of drug sellers’ performance quality are crucial if their role is to be managed within the health system. Lot quality assurance sampling (LQAS) could be an efficient method to monitor and evaluate PMV practice, but has so far never been used for this purpose. Methods In support of the Nigeria Malaria Booster Program we assessed PMV practices in three Senatorial Districts (SDs) of Jigawa, Nigeria. A two-stage LQAS assessed whether at least 80% of PMV stores in SDs used national treatment guidelines. Acceptable sampling errors were set in consultation with government officials (alpha and beta <0.10). The hypergeometric formula determined sample sizes and cut-off values for SDs. A structured assessment tool identified high and low performing SDs for quality of care indicators. Findings Drug vendors performed poorly in all SDs of Jigawa for all indicators. For example, all SDs failed for stocking and selling first-line antimalarials. PMV sold no longer recommended antimalarials, such as Chloroquine, Sulfadoxine-Pyrimethamine and oral Artesunate monotherapy. Most PMV were ignorant of and lacked training about new treatment guidelines that had endorsed ACTs as first-line treatment for uncomplicated malaria. Conclusion There is urgent need to regularly monitor and improve the availability and quality of malaria treatment provided by medicine sellers in Nigeria; the irrational use of antimalarials in the ACT era revealed in this study bears a high risk of economic loss, death and development of drug resistance. LQAS has been shown to be a suitable method for monitoring malaria-related indicators among PMV, and should be applied in Nigeria and elsewhere to improve service delivery. PMID:22984555
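A hedged sketch of the two classification-error probabilities behind such an LQAS design, using the hypergeometric distribution as the abstract describes; the lot size, sample size, and decision threshold below are placeholders, not the study's actual values.

```python
from scipy.stats import hypergeom

def lqas_errors(N, n, d, p_upper=0.80, p_lower=0.50):
    # N : number of PMV stores in the district (lot size)
    # n : sample size drawn without replacement
    # d : decision threshold -- classify the district as acceptable
    #     if more than d sampled stores meet the treatment guideline
    K_hi = round(p_upper * N)   # compliant stores in a truly good lot
    K_lo = round(p_lower * N)   # compliant stores in a truly bad lot
    alpha = hypergeom.cdf(d, N, K_hi, n)      # P(reject | good lot)
    beta = 1.0 - hypergeom.cdf(d, N, K_lo, n) # P(accept | bad lot)
    return alpha, beta

# Hypothetical design: 150 stores, sample 19, accept if > 12 comply
print(lqas_errors(N=150, n=19, d=12))
```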
Challenges of using quality improvement methods in nursing homes that "need improvement".
Rantz, Marilyn J; Zwygart-Stauffacher, Mary; Flesner, Marcia; Hicks, Lanis; Mehr, David; Russell, Teresa; Minner, Donna
2012-10-01
Qualitatively describe the adoption of strategies and challenges experienced by intervention facilities participating in a study targeted to improve quality of care in nursing homes "in need of improvement". To describe how staff use federal quality indicator/quality measure (QI/QM) scores and reports, quality improvement methods and activities, and how staff supported and sustained the changes recommended by their quality improvement teams. A randomized, two-group, repeated-measures design was used to test a 2-year intervention for improving quality of care and resident outcomes in facilities in "need of improvement". Intervention group (n = 29) received an experimental multilevel intervention designed to help them: (1) use quality-improvement methods, (2) use team and group process for direct-care decision-making, (3) focus on accomplishing the basics of care, and (4) maintain more consistent nursing and administrative leadership committed to communication and active participation of staff in decision-making. A qualitative analysis revealed a subgroup of homes likely to continue quality improvement activities and readiness indicators of homes likely to improve: (1) a leadership team (nursing home administrator, director of nurses) interested in learning how to use their federal QI/QM reports as a foundation for improving resident care and outcomes; (2) one of the leaders to be a "change champion" and make sure that current QI/QM reports are consistently printed and shared monthly with each nursing unit; (3) leaders willing to involve all staff in the facility in educational activities to learn about the QI/QM process and the reports that show how their facility compares with others in the state and nation; (4) leaders willing to plan and continuously educate new staff about the MDS and federal QI/QM reports and how to do quality improvement activities; (5) leaders willing to continuously involve all staff in quality improvement committee and team activities so they "own" the process and are responsible for change. Results of this qualitative analysis can help allocate expert nurse time to facilities that are actually ready to improve. Wide-spread adoption of this intervention is feasible and could be enabled by nursing home medical directors in collaborative practice with advanced practice nurses. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.
Reljin, Branimir; Milosević, Zorica; Stojić, Tomislav; Reljin, Irini
2009-01-01
Two methods for segmentation and visualization of microcalcifications in digital or digitized mammograms are described. The first method is based on modern mathematical morphology, while the second uses a multifractal approach. In the first method, an appropriate combination of morphological operations yields high local contrast enhancement, followed by significant suppression of background tissue, irrespective of its radiological density. Through an iterative procedure, this method strongly emphasizes only small bright details, the possible microcalcifications. In the multifractal approach, corresponding multifractal "images" are created from the initial mammogram, from which a radiologist has the freedom to change the level of segmentation. An appropriate user-friendly computer-aided visualization (CAV) system embedding the two methods is realized. The interactive approach enables the physician to control the level and the quality of segmentation. The suggested methods were tested on mammograms from the MIAS database as a gold standard, and from clinical praxis, using digitized films and digital images from a full-field digital mammograph.
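As a rough illustration of the first (morphological) method, a white top-hat transform is one standard way to emphasize small bright details while suppressing background tissue regardless of its density; the authors' exact operator sequence and iterative procedure are not reproduced here, and the radius is an illustrative parameter.

```python
from skimage.morphology import disk, white_tophat

def enhance_microcalcifications(mammo, radius=5):
    # White top-hat: the image minus its morphological opening.
    # Bright details smaller than the structuring element are
    # retained, while larger background structures are suppressed.
    footprint = disk(radius)   # chosen larger than the expected details
    return white_tophat(mammo, footprint)
```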
Podshivalov, L; Fischer, A; Bar-Yoseph, P Z
2011-04-01
This paper describes a new alternative for individualized mechanical analysis of bone trabecular structure. This new method closes the gap between the classic homogenization approach that is applied to macro-scale models and the modern micro-finite element method that is applied directly to micro-scale high-resolution models. The method is based on multiresolution geometrical modeling that generates intermediate structural levels. A new method for estimating multiscale material properties has also been developed to facilitate reliable and efficient mechanical analysis. What makes this method unique is that it enables direct and interactive analysis of the model at every intermediate level. Such flexibility is of principal importance in the analysis of trabecular porous structure. The method enables physicians to zoom-in dynamically and focus on the volume of interest (VOI), thus paving the way for a large class of investigations into the mechanical behavior of bone structure. This is one of the very few methods in the field of computational bio-mechanics that applies mechanical analysis adaptively on large-scale high resolution models. The proposed computational multiscale FE method can serve as an infrastructure for a future comprehensive computerized system for diagnosis of bone structures. The aim of such a system is to assist physicians in diagnosis, prognosis, drug treatment simulation and monitoring. Such a system can provide a better understanding of the disease, and hence benefit patients by providing better and more individualized treatment and high quality healthcare. In this paper, we demonstrate the feasibility of our method on a high-resolution model of vertebra L3. Copyright © 2010 Elsevier Inc. All rights reserved.
Eligibility, Quality, and Identification of Aeronautical Replacement Parts
DOT National Transportation Integrated Search
1996-05-24
This advisory circular (AC) provides information and guidance for use in determining the quality, eligibility and traceability of aeronautical parts and materials intended for installation on U.S. type-certificated products and to enable compliance w...
Application of the PRECEDE model to understanding mental health promoting behaviors in Hong Kong.
Mo, Phoenix K H; Mak, Winnie W S
2008-08-01
The burdens related to mental illness have been increasingly recognized in many countries. Nevertheless, research in positive mental health behaviors remains scarce. This study utilizes the Predisposing, Reinforcing, and Enabling Causes in Education Diagnosis and Evaluation (PRECEDE) model to identify factors associated with mental health promoting behaviors and to examine the effects of these behaviors on mental well-being and quality of life among 941 adults in Hong Kong. Structural equation modeling shows that sense of coherence (predisposing factor), social support (reinforcing factor), and daily hassles (enabling factor) are significantly related to mental health promoting behaviors, which are associated with mental well-being and quality of life. Results of bootstrap analyses confirm the mediating role of mental health promoting behaviors on well-being and quality of life. The study supports the application of the PRECEDE model in understanding mental health promoting behaviors and demonstrates its relationships with well-being and quality of life.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-29
... Government Solutions, Koansys LLC, and Quality Associates Inc.; Transfer of Data AGENCY: Environmental...Info Solutions and its subcontractors, Avaya Government Solutions, Koansys LLC, and Quality Associates Inc. have been awarded a contract to perform work for OPP, and access to this information will enable...
Bullen, A.; Taylor, R.R.; Kachar, B.; Moores, C.; Fleck, R.A.; Forge, A.
2014-01-01
In the preservation of tissues in as ‘close to life’ a state as possible, rapid freeze fixation has many benefits over conventional chemical fixation. One technique by which rapid freeze fixation can be achieved, high pressure freezing (HPF), has been shown to enable ice-crystal-artefact-free freezing and tissue preservation to greater depths (more than 40 μm) than other quick-freezing methods. Despite increasingly becoming routine in electron microscopy, the use of HPF for the fixation of inner ear tissue has been limited. Assessment of the quality of preservation showed that routine HPF techniques were suitable for preparation of inner ear tissues in a variety of species. Good preservation throughout the depth of sensory epithelia was achievable. Comparison to chemically fixed tissue indicated that fresh-frozen preparations exhibited overall superior structural preservation of cells. However, HPF fixation caused characteristic artefacts in stereocilia that suggested poor quality freezing of the actin bundles. The hybrid technique of pre-fixation and high pressure freezing was shown to produce cellular preservation throughout the tissue similar to that seen in HPF alone. Pre-fixation HPF produced consistent high quality preservation of stereociliary actin bundles. Optimising the preparation of samples with minimal artefact formation allows analysis of the links between ultrastructure and function in inner ear tissues. PMID:25016142
A Bioassay System Using Bioelectric Signals from Small Fish
NASA Astrophysics Data System (ADS)
Terawaki, Mitsuru; Soh, Zu; Hirano, Akira; Tsuji, Toshio
Although the quality of tap water is generally examined using chemical assay, this method cannot be used for examination in real time. Against such a background, the technique of fish bioassay has attracted attention as an approach that enables constant monitoring of aquatic contamination. The respiratory rhythms of fish are considered an efficient indicator for the ongoing assessment of water quality, since they are sensitive to chemicals and can be indirectly measured from bioelectric signals generated by breathing. In order to judge aquatic contamination accurately, it is necessary to measure bioelectric signals from fish swimming freely as well as to stably discriminate measured signals, which vary between individuals. However, no bioassay system meeting the above requirements has yet been established. This paper proposes a bioassay system using bioelectric signals generated from small fish in free-swimming conditions. The system records signals using multiple electrodes to cover the extensive measurement range required in a free-swimming environment, and automatically discriminates changes in water quality from signal frequency components. This discrimination is achieved through an ensemble classification method using probability neural networks to solve the problem of differences between individual fish. The paper also reports on the results of related validation experiments, which showed that the proposed system was able to stably discriminate between water conditions before and after bleach exposure.
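A minimal sketch of the kind of probability (probabilistic) neural network such an ensemble could be built from, assuming Gaussian Parzen windows over spectral feature vectors; this is not the authors' implementation, and `sigma` is an illustrative smoothing parameter. An ensemble of such classifiers, each trained on a different fish, could then majority-vote to flag a water-quality change.

```python
import numpy as np

def pnn_posteriors(x, train_X, train_y, sigma=1.0):
    # Parzen-window PNN: class posteriors proportional to the mean
    # Gaussian kernel response of each class's training patterns.
    # x       : feature vector (e.g., respiratory-signal spectrum)
    # train_X : (n_samples, n_features) training features
    # train_y : class labels, e.g., "normal" vs "contaminated"
    classes = np.unique(train_y)
    post = []
    for c in classes:
        d2 = np.sum((train_X[train_y == c] - x) ** 2, axis=1)
        post.append(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))
    post = np.array(post)
    return classes, post / post.sum()
```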
Bakitas, Marie; Lyons, Kathleen Doyle; Hegel, Mark T.; Ahles, Tim
2013-01-01
Purpose To understand oncology clinicians’ perspectives about the care of advanced cancer patients following the completion of the ENABLE II (Educate, Nurture, Advise, Before Life Ends) randomized clinical trial (RCT) of a concurrent oncology palliative care model. Methods Qualitative interview study of 35 oncology clinicians about their approach to patients with advanced cancer and the effect of the ENABLE II RCT. Results Oncologists believed that integrating palliative care at the time of an advanced cancer diagnosis enhanced patient care and complemented their practice. Self-assessment of their practice with advanced cancer patients comprised four themes: 1) treating the whole patient, 2) focusing on quality versus quantity of life, 3) “some patients just want to fight”, and 4) helping with transitions; timing is everything. Five themes comprised oncologists’ views on the complementary role of palliative care: 1) “refer early and often”, 2) referral challenges: “Palliative” equals hospice; “Heme patients are different”, 3) palliative care as consultants or co-managers, 4) palliative care “shares the load”, and 5) ENABLE II facilitated palliative care integration. Conclusions Oncologists described the RCT as holistic and complementary, and as a significant factor in adopting concurrent care as a standard of care. PMID:23040412
Controlled Environments Enable Adaptive Management in Aquatic Ecosystems Under Altered Environments
NASA Technical Reports Server (NTRS)
Bubenheim, David L.
2016-01-01
Ecosystems worldwide are impacted by altered environment conditions resulting from climate, drought, and land use changes. Gaps in the science knowledge base regarding plant community response to these novel and rapid changes limit both science understanding and management of ecosystems. We describe how CE Technologies have enabled the rapid supply of gap-filling science, development of ecosystem simulation models, and remote sensing assessment tools to provide science-informed, adaptive management methods in the impacted aquatic ecosystem of the California Sacramento-San Joaquin River Delta. The Delta is the hub for California's water, supplying Southern California agriculture and urban communities as well as the San Francisco Bay area. The changes in environmental conditions including temperature, light, and water quality and associated expansion of invasive aquatic plants negatively impact water distribution and ecology of the San Francisco Bay/Delta complex. CE technologies define changes in resource use efficiencies, photosynthetic productivity, evapotranspiration, phenology, reproductive strategies, and spectral reflectance modifications in native and invasive species in response to altered conditions. We will discuss how the CE technologies play an enabling role in filling knowledge gaps regarding plant response to altered environments, parameterization and validation of ecosystem models, development of satellite-based, remote sensing tools, and operational management strategies.
Passmore, Erin; Mason, Chloe; Rissel, Chris
2013-01-01
Introduction. Cycling can be an enjoyable way to meet physical activity recommendations and is suitable for older people; however, cycling participation by older Australians is low. This qualitative study explored motivators, enablers, and barriers to cycling among older people through an age-targeted cycling promotion program. Methods. Seventeen adults aged 50–75 years participated in a 12-week cycling promotion program which included a cycling skills course, mentor, and resource pack. Semistructured interviews at the beginning and end of the program explored motivators, enablers, and barriers to cycling. Results. Fitness and recreation were the primary motivators for cycling. The biggest barrier was fear of cars and traffic, and the cycling skills course was the most important enabler for improving participants' confidence. Reported outcomes from cycling included improved quality of life (better mental health, social benefit, and empowerment) and improved physical health. Conclusions. A simple cycling program increased cycling participation among older people. This work confirms the importance of improving confidence in this age group through a skills course, mentors, and maps and highlights additional strategies for promoting cycling, such as ongoing improvement to infrastructure and advertising. PMID:23864869
78 FR 39008 - Renewal of Approved Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-28
... enables the BLM to manage Federal coal resources in accordance with applicable statutes. The Office of...: This collection enables the BLM to learn the extent and qualities of Federal coal resources; evaluate... lessees to acquire and hold Federal coal leases; and ensure lessee compliance with applicable statutes...
NASA Astrophysics Data System (ADS)
Murakami, S.; Takemoto, T.; Ito, Y.
2012-07-01
The Japanese government, local governments and businesses are working closely together to establish spatial data infrastructures in accordance with the Basic Act on the Advancement of Utilizing Geospatial Information (NSDI Act, established in August 2007). Spatial data infrastructures are urgently required not only to accelerate computerization of the public administration, but also to help restoration and reconstruction of the areas struck by the East Japan Great Earthquake and future disaster prevention and reduction. For construction of a spatial data infrastructure, various guidelines have been formulated. But after an infrastructure is constructed, there is the problem of maintaining it. In one case, an organization updates its spatial data only once every several years because of budget problems. Departments and sections update the data on their own without careful consideration. That upsets the quality control of the entire data system and the system loses integrity, which is crucial to a spatial data infrastructure. To ensure quality, it would ideally be desirable to update data for the entire area every year, but that is virtually impossible, considering the recent budget crunch. The method we suggest is to update only the spatial data items of higher importance in order to maintain quality, rather than updating all the items across the board. We have explored a method of partially updating the data of two such features, roads and buildings, while ensuring locational accuracy. Using this method, data on roads and buildings, which change greatly with time, can be updated almost in real time, or at least within a year. The method will help increase the availability of a spatial data infrastructure. We have conducted an experiment on the spatial data infrastructure of a municipality using those data. As a result, we have found that it is possible to update data of both features almost in real time.
Eckermann, Simon; Coelli, Tim
2013-01-01
Evidence based medicine supports net benefit maximising therapies and strategies in processes of health technology assessment (HTA) for reimbursement and subsidy decisions internationally. However, translation of evidence based medicine to practice is impeded by efficiency measures such as cost per case-mix adjusted separation in hospitals, which ignore health effects of care. In this paper we identify a correspondence method that allows quality variables under control of providers to be incorporated in efficiency measures consistent with maximising net benefit. Including effects framed from a disutility bearing (utility reducing) perspective (e.g. mortality, morbidity or reduction in life years) as inputs and minimising quality inclusive costs on the cost-disutility plane is shown to enable efficiency measures consistent with maximising net benefit under a one to one correspondence. The method combines advantages of radial properties with an appropriate objective of maximising net benefit to overcome problems of inappropriate objectives implicit with alternative methods, whether specifying quality variables with utility bearing output (e.g. survival, reduction in morbidity or life years), hyperbolic or exogenous variables. This correspondence approach is illustrated in undertaking efficiency comparison at a clinical activity level for 45 Australian hospitals allowing for their costs and mortality rates per admission. Explicit coverage and comparability conditions of the underlying correspondence method are also shown to provide a robust framework for preventing cost-shifting and cream-skimming incentives, with appropriate qualification of analysis and support for data linkage and risk adjustment where these conditions are not satisfied. Comparison on the cost-disutility plane has previously been shown to have distinct advantages in comparing multiple strategies in HTA, which this paper naturally extends to a robust method and framework for comparing efficiency of health care providers in practice. Consequently, the proposed approach provides a missing link between HTA and practice, to allow active incentives for evidence based net benefit maximisation in practice. Copyright © 2012 Elsevier Ltd. All rights reserved.
Automated Data Quality Assurance using OGC Sensor Web Enablement Frameworks for Marine Observatories
NASA Astrophysics Data System (ADS)
Toma, Daniel; Bghiel, Ikram; del Rio, Joaquin; Hidalgo, Alberto; Carreras, Normandino; Manuel, Antoni
2014-05-01
Over the past years, environmental sensors have continuously improved by becoming smaller, cheaper, and more intelligent. Therefore, many sensor networks are increasingly deployed to monitor our environment. But due to the large number of sensor manufacturers, accompanying protocols, and data encodings, automated integration and data quality assurance of diverse sensors in an observing system is not straightforward, requiring development of data management code and tedious manual configuration. However, over the past few years it has been demonstrated that Open Geospatial Consortium (OGC) frameworks can enable web services with fully described sensor systems, including data processing, sensor characteristics, and quality control tests and results. So far, the SWE framework does not describe how to integrate sensors on the fly with minimal human intervention. The data management software which enables access to sensors, data processing and quality control tests has to be implemented, and the results have to be manually mapped to the SWE models. In this contribution, we describe a Sensor Plug & Play infrastructure for the Sensor Web by combining (1) the OGC PUCK protocol - a simple standard embedded instrument protocol to store on and retrieve directly from the devices the declarative description of sensor characteristics and quality control tests, (2) an automatic mechanism for data processing and quality control tests underlying the Sensor Web - the Sensor Interface Descriptor (SID) concept, and (3) a model for the declarative description of sensors which serves as a generic data management mechanism - designed as a profile and extension of OGC SWE's SensorML standard. We implement and evaluate our approach by applying it to the OBSEA Observatory, demonstrating the ability to assess data quality for temperature, salinity, air pressure, and wind speed and direction observations off the coast of Garraf, in north-eastern Spain.
Didelot, Audrey; Kotsopoulos, Steve K; Lupo, Audrey; Pekin, Deniz; Li, Xinyu; Atochin, Ivan; Srinivasan, Preethi; Zhong, Qun; Olson, Jeff; Link, Darren R; Laurent-Puig, Pierre; Blons, Hélène; Hutchison, J Brian; Taly, Valerie
2013-05-01
Assessment of DNA integrity and quantity remains a bottleneck for high-throughput molecular genotyping technologies, including next-generation sequencing. In particular, DNA extracted from paraffin-embedded tissues, a major potential source of tumor DNA, varies widely in quality, leading to unpredictable sequencing data. We describe a picoliter droplet-based digital PCR method that enables simultaneous detection of DNA integrity and the quantity of amplifiable DNA. Using a multiplex assay, we detected 4 different target lengths (78, 159, 197, and 550 bp). Assays were validated with human genomic DNA fragmented to sizes of 170 bp to 3000 bp. The technique was validated with DNA quantities as low as 1 ng. We evaluated 12 DNA samples extracted from paraffin-embedded lung adenocarcinoma tissues. One sample contained no amplifiable DNA. The fractions of amplifiable DNA for the 11 other samples were between 0.05% and 10.1% for 78-bp fragments and ≤1% for longer fragments. Four samples were chosen for enrichment and next-generation sequencing. The quality of the sequencing data was in agreement with the results of the DNA-integrity test. Specifically, DNA with low integrity yielded sequencing results with lower levels of coverage and uniformity and had higher levels of false-positive variants. The development of DNA-quality assays will enable researchers to downselect samples or process more DNA to achieve reliable genome sequencing with the highest possible efficiency of cost and effort, as well as minimize the waste of precious samples. © 2013 American Association for Clinical Chemistry.
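For context, standard droplet digital PCR statistics convert the fraction of positive droplets into a Poisson-corrected copy number, which is the basis for quantifying amplifiable DNA at each target length; this is generic dPCR math, not the cited assay's specific pipeline, and the droplet volume below is an assumed placeholder.

```python
import numpy as np

def dpcr_copies_per_ul(n_positive, n_total, v_droplet_nl=0.005):
    # Poisson correction: with p the fraction of positive droplets,
    # the mean number of targets per droplet is lambda = -ln(1 - p).
    # Assumes p < 1 (at least one negative droplet observed).
    p = n_positive / n_total
    lam = -np.log(1.0 - p)            # mean copies per droplet
    return lam / (v_droplet_nl * 1e-3)  # copies per microliter

# The fraction of amplifiable DNA at a given fragment length can then
# be estimated as the ratio of the measured copy number to the copy
# number expected from the total DNA input.
print(dpcr_copies_per_ul(n_positive=1200, n_total=20000))
```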
Genetic Engineering of Alfalfa (Medicago sativa L.).
Wang, Dan; Khurshid, Muhammad; Sun, Zhan Min; Tang, Yi Xiong; Zhou, Mei Liang; Wu, Yan Min
2016-01-01
Alfalfa is an excellent perennial legume forage owing to its extensive ecological adaptability, high nutritional value, palatability, and biological nitrogen fixation. It plays a very important role in agriculture, animal husbandry, and ecological construction, and is cultivated on all continents. With the development of modern plant breeding and genetic engineering techniques, a large amount of work has been carried out on alfalfa. Here we summarize the recent research advances in genetic engineering of alfalfa breeding, including transformation, quality improvement, stress resistance, and use as a bioreactor. This review provides an overview of the methods, directions, and achievements of alfalfa genetic engineering.
Multi-Robot Assembly Strategies and Metrics.
Marvel, Jeremy A; Bostelman, Roger; Falco, Joe
2018-02-01
We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies.
Microspherical photonics: Sorting resonant photonic atoms by using light
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maslov, Alexey V., E-mail: avmaslov@yandex.ru; Astratov, Vasily N., E-mail: astratov@uncc.edu
2014-09-22
A method of sorting microspheres by resonant light forces in vacuum, air, or liquid is proposed. Based on a two-dimensional model, it is shown that the sorting can be realized by allowing spherical particles to traverse a focused beam. Under resonance with the whispering gallery modes, the particles acquire significant velocity along the beam direction. This opens a unique way of large-volume sorting of nearly identical photonic atoms with 1/Q accuracy, where Q is the resonance quality factor. This is an enabling technology for developing super-low-loss coupled-cavity structures and devices.
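The quoted 1/Q selectivity follows from a back-of-envelope argument, assuming the whispering-gallery resonance wavelength scales roughly linearly with sphere radius:

\[ \frac{\delta\lambda_{\mathrm{res}}}{\lambda_{\mathrm{res}}} \approx \frac{\delta r}{r}, \qquad \frac{\Delta\lambda_{\mathrm{FWHM}}}{\lambda_{\mathrm{res}}} = \frac{1}{Q} \quad\Rightarrow\quad \left(\frac{\delta r}{r}\right)_{\min} \sim \frac{1}{Q}, \]

so only spheres whose radii match the beam wavelength to within about one part in Q experience the full resonant force.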
The development of a decision aid for tinnitus.
Pryce, Helen; Durand, Marie-Anne; Hall, Amanda; Shaw, Rachel; Culhane, Beth-Anne; Swift, Sarah; Straus, Jean; Marks, Elizabeth; Ward, Melanie; Chilvers, Katie
2018-05-09
To develop a decision aid for tinnitus care that would meet international consensus for decision aid quality. A mixed methods design that included qualitative in-depth interviews, literature review, focus groups, user testing and readability checking. Patients and clinicians who have clinical experience of tinnitus. A decision aid for tinnitus care was developed. This incorporates key evidence of efficacy for the most frequently used tinnitus care options, together with information derived from patient priorities when deciding which choice to make. The decision aid has potential to enable shared decision making between clinicians and patients in audiology. The decision aid meets consensus standards.
Live immunization against East Coast fever--current status.
Di Giulio, Giuseppe; Lynen, Godelieve; Morzaria, Subhash; Oura, Chris; Bishop, Richard
2009-02-01
The infection-and-treatment method (ITM) for immunization of cattle against East Coast fever has historically been used only on a limited scale because of logistical and policy constraints. Recent large-scale deployment among pastoralists in Tanzania has stimulated demand. Concurrently, a suite of molecular tools, developed from the Theileria parva genome, has enabled improved quality control of the immunizing stabilate and post-immunization monitoring of the efficacy and biological impact of ITM in the field. This article outlines the current status of ITM immunization in the field, with associated developments in the molecular epidemiology of T. parva.
The LSST Data Mining Research Agenda
NASA Astrophysics Data System (ADS)
Borne, K.; Becla, J.; Davidson, I.; Szalay, A.; Tyson, J. A.
2008-12-01
We describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; design of a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; indexing of multi-attribute, multi-dimensional astronomical databases (beyond spatial indexing) for rapid querying of petabyte databases; and more.
Method for vacuum fusion bonding
Ackler, Harold D.; Swierkowski, Stefan P.; Tarte, Lisa A.; Hicks, Randall K.
2001-01-01
An improved vacuum fusion bonding structure and process for aligned bonding of large area glass plates, patterned with microchannels and access holes and slots, at elevated glass fusion temperatures. Vacuum pumpout of all components is through the bottom platform, which yields an untouched, defect-free top surface that greatly improves optical access through this smooth surface. Also, a completely non-adherent interlayer, such as graphite, with alignment and location features is located between the main steel platform and the glass plate pair, which makes large improvements in quality, yield, and ease of use, and enables aligned bonding of very large glass structures.
ScreenCube: A 3D Printed System for Rapid and Cost-Effective Chemical Screening in Adult Zebrafish.
Monstad-Rios, Adrian T; Watson, Claire J; Kwon, Ronald Y
2018-02-01
Phenotype-based small molecule screens in zebrafish embryos and larvae have been successful in accelerating pathway and therapeutic discovery for diverse biological processes. Yet, the application of chemical screens to adult physiologies has been relatively limited due to additional demands on cost, space, and labor associated with screens in adult animals. In this study, we present a 3D printed system and methods for intermittent drug dosing that enable rapid and cost-effective chemical administration in adult zebrafish. Using prefilled screening plates, the system enables dosing of 96 fish in ∼3 min, with a 10-fold reduction in drug quantity compared to that used in previous chemical screens in adult zebrafish. We characterize water quality kinetics during immersion in the system and use these kinetics to rationally design intermittent dosing regimens that result in 100% fish survival. As a demonstration of system fidelity, we show the potential to identify two known chemical inhibitors of adult tail fin regeneration, cyclopamine and dorsomorphin. By developing methods for rapid and cost-effective chemical administration in adult zebrafish, this study expands the potential for small molecule discovery in postembryonic models of development, disease, and regeneration.
Advances in remote sensing of the daytime ionosphere with EUV airglow
NASA Astrophysics Data System (ADS)
Stephan, Andrew W.
2016-09-01
This paper summarizes recent progress in developing a method for characterizing the daytime ionosphere from limb profile measurements of the OII 83.4 nm emission. This extreme ultraviolet emission is created by solar photoionization of atomic oxygen in the lower thermosphere and is resonantly scattered by O+ in the ionosphere. The brightness and shape of the measured altitude profile thus depend on both the photoionization source in the lower thermosphere and the ionospheric densities that determine the resonant scattering contribution. This technique has greatly matured over the past decade due to measurements by the series of Naval Research Laboratory Special Sensor Ultraviolet Limb Imager (SSULI) instruments flown on Defense Meteorological Satellite Program (DMSP) missions and the Remote Atmospheric and Ionospheric Detection System (RAIDS) on the International Space Station. The volume of data from these missions has enabled a better approach to handling specific biases and uncertainties in both the measurement and retrieval process that affect the accuracy of the result. This paper identifies the key measurement and data quality factors that will enable the continued evolution of this technique into an advanced method for characterization of the daytime ionosphere.
Comparative study on novel test systems to determine disintegration time of orodispersible films.
Preis, Maren; Gronkowsky, Dorothee; Grytzan, Dominik; Breitkreutz, Jörg
2014-08-01
Orodispersible films (ODFs) are a promising innovative dosage form enabling drug administration without the need for water and minimizing the danger of aspiration due to their fast disintegration in small amounts of liquid. This study focuses on the development of a disintegration test system for ODFs. Two systems were developed and investigated: one provides an electronic end-point, and the other adapts the setup of the existing disintegration tester for orodispersible tablets. Different ODF preparations were investigated to determine the suitability of the disintegration test systems. The use of different test media and the impact of different storage conditions of ODFs on their disintegration time were additionally investigated. The experiments showed acceptable reproducibility (low deviations within sample replicates due to a clear determination of the measurement end-point). High temperatures and high humidity affected some of the investigated ODFs, resulting in longer disintegration times or even no disintegration within the tested time period. The methods provided clear end-point detection and were applicable to different types of ODFs. By modifying a conventional test system for application to films, a standard method can be presented to ensure uniformity in current quality control settings. © 2014 Royal Pharmaceutical Society.
Correction tool for Active Shape Model based lumbar muscle segmentation.
Valenzuela, Waldo; Ferguson, Stephen J; Ignasiak, Dominika; Diserens, Gaelle; Vermathen, Peter; Boesch, Chris; Reyes, Mauricio
2015-08-01
In the clinical environment, the accuracy and speed of the image segmentation process play a key role in the analysis of pathological regions. Despite advances in anatomic image segmentation, time-effective correction tools are commonly needed to improve segmentation results. Therefore, these tools must provide faster corrections with a low number of interactions, and a user-independent solution. In this work we present a new interactive method for correcting image segmentations. Given an initial segmentation and the original image, our tool provides a 2D/3D environment that enables 3D shape correction through simple 2D interactions. Our scheme is based on direct manipulation of free-form deformation adapted to a 2D environment. This approach enables an intuitive and natural correction of 3D segmentation results. The developed method has been implemented into a software tool and has been evaluated for the task of lumbar muscle segmentation from Magnetic Resonance Images. Experimental results show that full segmentation correction could be performed within an average correction time of 6±4 minutes and an average of 68±37 interactions, while maintaining the quality of the final segmentation result within an average Dice coefficient of 0.92±0.03.
Fragon: rapid high-resolution structure determination from ideal protein fragments.
Jenkins, Huw T
2018-03-01
Correctly positioning ideal protein fragments by molecular replacement presents an attractive method for obtaining preliminary phases when no template structure for molecular replacement is available. This has been exploited in several existing pipelines. This paper presents a new pipeline, named Fragon, in which fragments (ideal α-helices or β-strands) are placed using Phaser and the phases calculated from these coordinates are then improved by the density-modification methods provided by ACORN. The reliable scoring algorithm provided by ACORN identifies success. In these cases, the resulting phases are usually of sufficient quality to enable automated model building of the entire structure. Fragon was evaluated against two test sets comprising mixed α/β folds and all-β folds at resolutions between 1.0 and 1.7 Å. Success rates of 61% for the mixed α/β test set and 30% for the all-β test set were achieved. In almost 70% of successful runs, fragment placement and density modification took less than 30 min on relatively modest four-core desktop computers. In all successful runs the best set of phases enabled automated model building with ARP/wARP to complete the structure.
Quality of phenobarbital solid-dosage forms in the urban community of Nouakchott (Mauritania).
Laroche, Marie-Laure; Traore, Hamidou; Merle, Louis; Gaulier, Jean-Michel; Viana, Marylene; Preux, Pierre-Marie
2005-08-01
Epilepsy is a major public-health problem in Africa. The quality of available drugs is a limiting factor for adequate management. The aim of this study was to describe the proportion of poor-quality phenobarbital (PB) solid-dosage forms and evaluate the factors associated with its quality in Nouakchott (Mauritania). A cross-sectional study was carried out within pharmacies, hospitals, and on the parallel market in March 2003. PB samples were bought by a native person and then assayed by a liquid chromatography method. A package was considered to be of good quality if the average active-substance content was between 85 and 115% of the stated content printed on the packet. Forty-five pharmaceutical stores were visited, enabling us to collect 146 samples of PB. Three brand names were available in Nouakchott, originating from France, Morocco, Senegal, and Egypt. Results: A prevalence of 13.7% [95% confidence interval (CI), 8.8-20.0] of poor-quality PB was found. All samples from Morocco were underdosed. The generic active content was satisfactory, but saccharose, an excipient with potential side effects, was identified. Two factors associated with good PB quality were put forward: manufacture in France, and loose packaging, as generics conditioned in this way were of good quality. This study shows that the quality of antiepileptic drugs in Africa is still worrying. The setting up of medicine quality control in Mauritania is legitimate. Considering the good quality of generic PB and its lower cost, this type of medicine should be promoted in this region.
Sawchuk, Dena; Currie, Kris; Vich, Manuel Lagravere; Palomo, Juan Martin
2016-01-01
Objective To evaluate the accuracy and reliability of the diagnostic tools available for assessing maxillary transverse deficiencies. Methods An electronic search of three databases was performed from their dates of establishment to April 2015, with manual searching of the reference lists of relevant articles. Articles were considered for inclusion if they reported the accuracy or reliability of a diagnostic method or evaluation technique for maxillary transverse dimensions in mixed or permanent dentitions. Risk of bias in the included articles was assessed using the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool. Results Nine articles were selected. The studies were heterogeneous, with moderate to low methodological quality, and all had a high risk of bias. Four studies suggested that the use of arch-width prediction indices with dental cast measurements is unreliable for diagnosis. Frontal cephalograms derived from cone-beam computed tomography (CBCT) images were reportedly more reliable for assessing intermaxillary transverse discrepancies than posteroanterior cephalograms. Two studies proposed new three-dimensional transverse analyses with CBCT images that were reportedly reliable, but these have not been validated for clinical sensitivity or specificity. No studies reported sensitivity, specificity, positive or negative predictive values, likelihood ratios, or ROC curves of the methods for the diagnosis of transverse deficiencies. Conclusions Current evidence does not enable solid conclusions to be drawn, owing to a lack of reliable high-quality diagnostic studies evaluating maxillary transverse deficiencies. CBCT images are reportedly more reliable for diagnosis, but further validation is required to confirm CBCT's accuracy and diagnostic superiority. PMID:27668196
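For context, the diagnostic-accuracy quantities the review found missing are all derived from a 2x2 confusion matrix; a brief illustrative sketch (counts are hypothetical):

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard diagnostic-accuracy measures from a 2x2 table."""
    sens = tp / (tp + fn)        # sensitivity (true-positive rate)
    spec = tn / (tn + fp)        # specificity (true-negative rate)
    ppv = tp / (tp + fp)         # positive predictive value
    npv = tn / (tn + fn)         # negative predictive value
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    lr_neg = (1 - sens) / spec   # negative likelihood ratio
    return {"sensitivity": sens, "specificity": spec,
            "PPV": ppv, "NPV": npv, "LR+": lr_pos, "LR-": lr_neg}

# Hypothetical validation of a transverse-deficiency test against a gold standard
print(diagnostic_metrics(tp=42, fp=8, fn=6, tn=44))
```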
Gulmans, J; Vollenbroek-Hutten, M M R; Van Gemert-Pijnen, J E W C; Van Harten, W H
2007-10-01
Owing to the involvement of multiple professionals from various institutions, integrated care settings are prone to suboptimal patient care communication. To assure continuity, communication gaps should be identified for targeted improvement initiatives. However, available assessment methods are often one-sided evaluations that are not appropriate for integrated care settings. We developed an evaluation approach that takes into account the multiple communication links and evaluation perspectives inherent to these settings. In this study, we describe this approach, using the integrated care setting of cerebral palsy as an illustration. The approach follows a three-step mixed design in which the results of each step are used to mark out the subsequent step's focus. The first step, a patient questionnaire, aims to identify quality gaps experienced by patients by comparing their expectations and experiences with respect to patient-professional and inter-professional communication. The resulting gaps form the input of in-depth interviews with a subset of patients to evaluate the underlying factors of ineffective communication. The resulting factors form the input of the final step's focus group meetings with professionals to corroborate and complete the findings. By combining methods, the presented approach aims to minimize the limitations inherent to the application of single methods. The comprehensiveness of the approach enables its applicability in various integrated care settings, and its sequential design allows for in-depth evaluation of relevant quality gaps. Further research is needed to evaluate the approach's feasibility in practice. In our subsequent study, we present the results of the approach in the integrated care setting of children with cerebral palsy in three Dutch care regions.
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...
2016-07-05
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography-mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
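Inter-laboratory variability of the kind reported above is commonly summarized as a reproducibility relative standard deviation across laboratories; a minimal sketch (the acid-number values are hypothetical, not the round-robin data):

```python
import statistics

def reproducibility_rsd(lab_means: list[float]) -> float:
    """Relative standard deviation (%) of per-laboratory mean results."""
    grand_mean = statistics.mean(lab_means)
    sd = statistics.stdev(lab_means)
    return 100.0 * sd / grand_mean

# Hypothetical acid numbers (mg KOH/g) reported by six laboratories
acid_numbers = [92.1, 95.4, 90.8, 94.0, 93.3, 91.7]
print(f"inter-lab RSD = {reproducibility_rsd(acid_numbers):.1f}%")
```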
Ultrafast Comparison of Personal Genomes via Precomputed Genome Fingerprints.
Glusman, Gustavo; Mauldin, Denise E; Hood, Leroy E; Robinson, Max
2017-01-01
We present an ultrafast method for comparing personal genomes. We transform the standard genome representation (lists of variants relative to a reference) into "genome fingerprints" via locality sensitive hashing. The resulting genome fingerprints can be meaningfully compared even when the input data were obtained using different sequencing technologies, processed using different pipelines, represented in different data formats and relative to different reference versions. Furthermore, genome fingerprints are robust to up to 30% missing data. Because of their reduced size, computation on the genome fingerprints is fast and requires little memory. For example, we could compute all-against-all pairwise comparisons among the 2504 genomes in the 1000 Genomes data set in 67 s at high quality (21 μs per comparison, on a single processor), and achieved a lower-quality approximation in just 11 s. Efficient computation enables scaling up a variety of important genome analyses, including quantifying relatedness, recognizing duplicate sequenced genomes in a set, population reconstruction, and many others. The original genome representation cannot be reconstructed from its fingerprint, effectively decoupling genome comparison from genome interpretation; the method thus has significant implications for privacy-preserving genome analytics.
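As a loose illustration of the idea (not the published fingerprinting scheme), a variant list can be reduced to a fixed-length fingerprint by hashing each variant into a small vector of buckets, after which two fingerprints are compared in time independent of genome size. All names and formats here are hypothetical:

```python
import hashlib
import math

def genome_fingerprint(variants: list[str], size: int = 128) -> list[int]:
    """Reduce a list of variant strings to a fixed-length bucket-count vector."""
    fp = [0] * size
    for v in variants:
        h = int(hashlib.sha1(v.encode()).hexdigest(), 16)
        fp[h % size] += 1
    return fp

def cosine_similarity(a: list[int], b: list[int]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Two toy variant lists sharing two of three variants
g1 = ["chr1:12345:A>G", "chr2:9876:C>T", "chr3:555:G>A"]
g2 = ["chr1:12345:A>G", "chr2:9876:C>T", "chrX:42:T>C"]
print(cosine_similarity(genome_fingerprint(g1), genome_fingerprint(g2)))
```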
Helping the police with their inquiries
NASA Astrophysics Data System (ADS)
Kitson, Anthony J.
1995-09-01
The UK Home Office has held a long-term interest in facial recognition. Work has concentrated on providing the UK police with facilities to improve the use that can be made of the memory of victims and witnesses, rather than on automatically matching images. During the 1970s a psychological coding scheme and a search method were developed by Aberdeen University and the Home Office. These have been incorporated into systems for searching prisoner photographs, both experimentally and operationally. The coding scheme has also been incorporated into a facial-likeness composition system. The Home Office is currently implementing a national criminal record system (Phoenix), and work has been conducted to define and demonstrate standards for image-enabled terminals for this application. Users have been consulted to establish suitable picture quality for the purpose, and a study of compression methods is in hand. Recently, UK courts have made increased use of expert testimony based upon the measurement of facial images. We are currently working with a group of practitioners to examine and improve the quality of such evidence and to develop a national standard.
Ge, Jian; Dong, Haobin; Liu, Huan; Yuan, Zhiwen; Dong, He; Zhao, Zhizhuo; Liu, Yonghua; Zhu, Jun; Zhang, Haiyang
2016-01-01
Based on the dynamic nuclear polarization (DNP) effect, an alternative design of an Overhauser geomagnetic sensor is presented that enhances the proton polarization and increases the amplitude of the free induction decay (FID) signal. The short-pulse method is adopted to rotate the enhanced proton magnetization into the plane of precession to create an FID signal. To reduce the negative effect of powerful electromagnetic interference, the anti-interference design of the pick-up coil is studied. Furthermore, a radio-frequency polarization method based on a capacitive-loaded coaxial cavity is proposed to improve the quality factor of the resonant circuit. In addition, a special test instrument is designed that enables the simultaneous testing of a classical proton-precession sensor and the Overhauser sensor. Overall, comparison experiments with and without the free radical in the Overhauser sensors show that the DNP effect does effectively improve the amplitude and quality of the FID signal, and the magnetic sensitivity, resolution and range reach 10 pT/Hz^(1/2) at 1 Hz, 0.0023 nT and 20-100 μT, respectively. PMID:27258283
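The quality factor of the tuned pick-up circuit referred to above is a standard figure of merit; a minimal sketch under a simple series-RLC assumption (all component values are hypothetical, not the sensor's actual design):

```python
import math

def series_rlc_q(resistance_ohm: float, inductance_h: float,
                 capacitance_f: float) -> float:
    """Quality factor of a series RLC resonator: Q = (1/R) * sqrt(L/C)."""
    return math.sqrt(inductance_h / capacitance_f) / resistance_ohm

def resonant_freq_hz(inductance_h: float, capacitance_f: float) -> float:
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Hypothetical pick-up coil (500 mH, 50 ohm) tuned near the proton band (~2 kHz)
L, R = 0.5, 50.0
C = 1.0 / ((2 * math.pi * 2000.0) ** 2 * L)   # choose C for ~2 kHz resonance
print(f"f0 = {resonant_freq_hz(L, C):.0f} Hz, Q = {series_rlc_q(R, L, C):.0f}")
```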
NASA Astrophysics Data System (ADS)
Mehedi, H.-A.; Baudrillart, B.; Alloyeau, D.; Mouhoub, O.; Ricolleau, C.; Pham, V. D.; Chacon, C.; Gicquel, A.; Lagoute, J.; Farhat, S.
2016-08-01
This article describes the significant roles of process parameters in the deposition of graphene films via cobalt-catalyzed decomposition of methane diluted in hydrogen using plasma-enhanced chemical vapor deposition (PECVD). The influence of growth temperature (700-850 °C), molar concentration of methane (2%-20%), growth time (30-90 s), and microwave power (300-400 W) on graphene thickness and defect density is investigated using the Taguchi method, which enables the optimal parameter settings to be reached with a reduced number of experiments. Growth temperature is found to be the most influential parameter in minimizing the number of graphene layers, whereas microwave power has the second-largest effect on crystalline quality and a minor role in the thickness of the graphene films. The structural properties of PECVD graphene obtained under the optimized synthesis conditions are investigated with Raman spectroscopy and corroborated by atomic-scale characterization performed with high-resolution transmission electron microscopy and scanning tunneling microscopy, which reveals the formation of a continuous film consisting of 2-7 high-quality graphene layers.
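In Taguchi analysis of this kind, each factor level is ranked by a signal-to-noise ratio; for minimizing a response such as defect density, the "smaller-the-better" form is typical. A minimal sketch (the measurements are hypothetical, not the paper's data):

```python
import math

def sn_smaller_the_better(responses: list[float]) -> float:
    """Taguchi 'smaller-the-better' S/N ratio: -10 * log10(mean(y^2))."""
    mean_sq = sum(y * y for y in responses) / len(responses)
    return -10.0 * math.log10(mean_sq)

# Hypothetical defect-density responses (a.u.) at two growth temperatures
runs_700C = [1.8, 2.1, 1.9]
runs_850C = [0.9, 1.1, 1.0]
for label, runs in [("700 C", runs_700C), ("850 C", runs_850C)]:
    print(label, f"S/N = {sn_smaller_the_better(runs):.2f} dB")
# The level with the higher S/N ratio is preferred.
```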
A scoring metric for multivariate data for reproducibility analysis using chemometric methods
Sheen, David A.; de Carvalho Rocha, Werickson Fortunato; Lippa, Katrice A.; Bearden, Daniel W.
2017-01-01
Process quality control and reproducibility in emerging measurement fields such as metabolomics are normally assured by interlaboratory comparison testing. As part of this testing process, spectral features from a spectroscopic method such as nuclear magnetic resonance (NMR) spectroscopy are attributed to particular analytes within a mixture, and it is the metabolite concentrations that are returned for comparison between laboratories. However, data quality may also be assessed directly by using binned spectral data before the time-consuming identification and quantification. Use of the binned spectra has some advantages, including preserving information about trace constituents and enabling identification of process difficulties. In this paper, we demonstrate the use of binned NMR spectra to conduct a detailed interlaboratory comparison and composition analysis. Spectra of synthetic and biologically obtained metabolite mixtures, taken from a previous interlaboratory study, are compared with cluster analysis using a variety of distance and entropy metrics. The individual measurements are then evaluated based on where they fall within their clusters, and a laboratory-level scoring metric is developed, which provides an assessment of each laboratory's individual performance. PMID:28694553
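As a rough illustration of distance-based scoring on binned spectra (not the paper's exact metric), each measurement can be scored by its distance to its group centroid relative to the group's typical spread; all data here are hypothetical:

```python
import numpy as np

def centroid_scores(spectra: np.ndarray) -> np.ndarray:
    """Score each binned spectrum by Euclidean distance to the group centroid,
    normalized by the mean within-group distance (1.0 = typical, >1 = outlying)."""
    centroid = spectra.mean(axis=0)
    dists = np.linalg.norm(spectra - centroid, axis=1)
    return dists / dists.mean()

rng = np.random.default_rng(0)
lab_spectra = rng.normal(1.0, 0.05, size=(8, 200))   # 8 labs, 200 spectral bins
lab_spectra[3] += 0.2                                 # one deviant laboratory
print(np.round(centroid_scores(lab_spectra), 2))
```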
Information technology model for evaluating emergency medicine teaching
NASA Astrophysics Data System (ADS)
Vorbach, James; Ryan, James
1996-02-01
This paper describes work in progress to develop an Information Technology (IT) model and supporting information system for the evaluation of clinical teaching in the Emergency Medicine (EM) Department of North Shore University Hospital. In the academic hospital setting student physicians, i.e. residents, and faculty function daily in their dual roles as teachers and students respectively, and as health care providers. Databases exist that are used to evaluate both groups in either academic or clinical performance, but rarely has this information been integrated to analyze the relationship between academic performance and the ability to care for patients. The goal of the IT model is to improve the quality of teaching of EM physicians by enabling the development of integrable metrics for faculty and resident evaluation. The IT model will include (1) methods for tracking residents in order to develop experimental databases; (2) methods to integrate lecture evaluation, clinical performance, resident evaluation, and quality assurance databases; and (3) a patient flow system to monitor patient rooms and the waiting area in the Emergency Medicine Department, to record and display status of medical orders, and to collect data for analyses.
Kovarik, Miroslav; Hronek, Miloslav; Zadak, Zdenek
2014-04-01
Lung cancer is among the tumor types with a relatively high frequency of malnutrition, sarcopenia and cachexia, severe metabolic syndromes related to impairment of physical function and quality of life, resistance to therapy and short survival. Inexpensive and accessible methods of evaluating changes in body composition, physical function and nutritional status are therefore of great importance in clinical practice, enabling the early identification, monitoring, prevention and treatment of these nutritional deficiencies. This could lead to improved outcomes in the quality of life, physical performance and survival of patients with lung cancer. The aim of this article is to summarize recent knowledge on the use of such methods, their predictive value for patient outcomes and their association with other clinically relevant parameters, specifically in lung cancer patients, because an article collectively describing their practical application in clinical practice has been lacking. The focus is on the use of anthropometry, handgrip dynamometry, the bioelectrical impedance analysis derived phase angle and nutritional screening questionnaires in lung cancer patients. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
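The bioelectrical impedance phase angle mentioned above is derived directly from the measured resistance and reactance; a minimal sketch (values are hypothetical):

```python
import math

def bia_phase_angle_deg(resistance_ohm: float, reactance_ohm: float) -> float:
    """BIA phase angle in degrees: arctan(Xc / R) * 180 / pi."""
    return math.degrees(math.atan(reactance_ohm / resistance_ohm))

# Hypothetical 50 kHz whole-body measurement: R = 480 ohm, Xc = 45 ohm
print(f"phase angle = {bia_phase_angle_deg(480.0, 45.0):.1f} degrees")
```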
Electroporation in food processing and biorefinery.
Mahnič-Kalamiza, Samo; Vorobiev, Eugène; Miklavčič, Damijan
2014-12-01
Electroporation is a method of treating plant tissue that, owing to its nonthermal nature, enables preservation of the natural quality, colour and vitamin composition of food products. The range of processes in which electroporation has been shown to preserve quality, increase extract yield or optimize energy input is extensive, though not yet exhausted; examples include extraction of valuable compounds and juices, dehydration and cryopreservation. Owing to its antimicrobial action, electroporation is also a subject of research as one stage of the pasteurization or sterilization process, as well as a method of stimulating plant metabolism. This paper provides an overview of electroporation as applied to plant materials and of electroporation applications in food processing, a quick summary of the basic technical aspects of the topic, and a brief discussion of perspectives for future research and development in the field. The paper is a review in the broadest sense of the word, written with the purpose of orienting the interested newcomer to the field of electroporation applications in food technology towards the pertinent, highly relevant and more in-depth literature from the respective subdomains of electroporation research.
Krüger, H P
1989-02-01
The term "speech chronemics" is introduced to characterize a research strategy which extracts from the physical qualities of the speech signal only the pattern of ons ("speaking") and offs ("pausing"). The research in this field can be structured into the methodological dimension "unit of time", "number of speakers", and "quality of the prosodic measures". It is shown that a researcher's actual decision for one method largely determines the outcome of his study. Then, with the Logoport a new portable measurement device is presented. It enables the researcher to study speaking behavior over long periods of time (up to 24 hours) in the normal environment of his subjects. Two experiments are reported. The first shows the validity of articulation pauses for variations in the physiological state of the organism. The second study proves a new betablocking agent to have sociotropic effects: in a long-term trial socially high-strung subjects showed an improved interaction behavior (compared to placebo and socially easy-going persons) in their everyday life. Finally, the need for a comprehensive theoretical foundation and for standardization of measurement situations and methods is emphasized.
Complex adaptive systems: a tool for interpreting responses and behaviours.
Ellis, Beverley
2011-01-01
Quality improvement is a priority for health services worldwide. There are many barriers to implementing change at the locality level, and misinterpreting responses and behaviours can effectively block change. Electronic health records will influence the means by which knowledge and information are generated and sustained among those operating quality improvement programmes. The aim of this paper is to explain how complex adaptive system (CAS) theory provides a useful tool and new insight into the responses and behaviours that relate to informatics-enabled quality improvement programmes in primary care. The research strategy comprised purposefully sampled case studies in two English localities that participated in the implementation and development of quality improvement programmes, conducted within a social constructionist ontological perspective. Responses and behaviours of quality improvement programmes in the two localities include both positive and negative influences associated with a networked model of governance. Pressures of time, resources and workload are common issues, along with the need for education and training about capturing, coding, recording and sharing information held within electronic health records to support various information requirements. Primary care informatics enables information symmetry among those operating quality improvement programmes by making some aspects of care explicit, allowing consensus about quality improvement priorities and implementable solutions.
Time-resolved dosimetry using a pinpoint ionization chamber as quality assurance for IMRT and VMAT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louwe, Robert J. W., E-mail: rob.louwe@ccdbh.org.nz; Satherley, Thomas; Day, Rebecca A.
Purpose: To develop a method to verify the dose delivery in relation to the individual control points of intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) using an ionization chamber. In addition to more effective problem solving during patient-specific quality assurance (QA), the aim is to eventually map out the limitations in the treatment chain and enable a targeted improvement of the treatment technique in an efficient way. Methods: Pretreatment verification was carried out for 255 treatment plans that included a broad range of treatment indications in two departments using the equipment of different vendors. In-house developed software was used to enable calculation of the dose delivery for the individual beamlets in the treatment planning system (TPS), for data acquisition, and for analysis of the data. The observed deviations were related to various delivery and measurement parameters such as gantry angle, field size, and the position of the detector with respect to the field edge to distinguish between error sources. Results: The average deviation of the integral fraction dose during pretreatment verification of the planning target volume dose was −2.1% ± 2.2% (1 SD), −1.7% ± 1.7% (1 SD), and 0.0% ± 1.3% (1 SD) for IMRT at the Radboud University Medical Center (RUMC), VMAT (RUMC), and VMAT at the Wellington Blood and Cancer Centre, respectively. Verification of the dose to organs at risk gave very similar results but was generally subject to a larger measurement uncertainty due to the position of the detector at a high dose gradient. The observed deviations could be related to limitations of the TPS beam models, attenuation of the treatment couch, as well as measurement errors. The apparent systematic error of about −2% in the average deviation of the integral fraction dose in the RUMC results could be explained by the limitations of the TPS beam model in the calculation of the beam penumbra. Conclusions: This study showed that time-resolved dosimetry using an ionization chamber is feasible and can be largely automated, which limits the required additional time compared to integrated dose measurements. It provides a unique QA method which enables identification and quantification of the contribution of various error sources during IMRT and VMAT delivery.
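Time-resolved verification of this kind ultimately reduces to comparing measured and planned dose per control point and flagging outliers; a schematic sketch (the numbers are hypothetical, not output of the in-house software):

```python
import numpy as np

def control_point_deviations(measured: np.ndarray, planned: np.ndarray) -> np.ndarray:
    """Percent deviation per control point, relative to the planned dose."""
    return 100.0 * (measured - planned) / planned

planned = np.array([0.52, 0.48, 0.55, 0.60, 0.50])   # Gy per control point
measured = np.array([0.51, 0.47, 0.49, 0.59, 0.50])
dev = control_point_deviations(measured, planned)
print(np.round(dev, 1))                      # per-control-point deviations (%)
print(f"integral deviation = {100 * (measured.sum() / planned.sum() - 1):.1f}%")
flagged = np.flatnonzero(np.abs(dev) > 5.0)  # control points outside a 5% tolerance
print("flagged control points:", flagged)
```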
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... what types of quality measures should a combination of natural language processing and structured data... collection, analysis, processing, and its ability to facilitate information exchange among and across care...
NASA Astrophysics Data System (ADS)
Ginting, E.; Tambunanand, M. M.; Syahputri, K.
2018-02-01
Evolutionary Operation (EVOP) is a method designed to be used during routine plant operation to enable high productivity. Quality is one of the critical factors for a company to win the competition. For this reason, product quality was investigated by gathering the company's production data and making direct observations on the factory floor, especially in the drying department, to identify the problem of high water content in the mosquito coil. PT. X, which produces mosquito coils, attempted to reduce product defects caused by inaccurate operating conditions. Water content is a key quality parameter of mosquito coils: if the moisture content is too high, the product molds and breaks easily, whereas if it is too low, the product breaks easily and burns for fewer hours. Three factors affect the optimal water content: stirring time, drying temperature and drying time. To obtain the required conditions, the Evolutionary Operation (EVOP) method was used; EVOP is an efficient technique for optimizing two or three experimental parameters using two-level factorial designs with a center point. The optimal operating conditions found in the experiment were a stirring time of 20 minutes, a drying temperature of 65°C, and a drying time of 130 minutes. Based on the EVOP analysis, the optimum water content is 6.90%, which approaches the production plant's target value of 7%.
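EVOP cycles of this kind are built on a two-level factorial design with a center point; a minimal sketch of main-effect estimation for the three factors (the factor levels and water-content responses are hypothetical, not the plant's data):

```python
import itertools

# Factor levels: (stirring time min, drying temperature C, drying time min)
low, high = (15, 60, 120), (25, 70, 140)

# The 8 corner runs of the 2^3 design, in itertools.product order,
# plus the center point. Water-content responses (%) are hypothetical.
corners = list(itertools.product(*zip(low, high)))
water = [7.9, 7.2, 7.4, 6.8, 7.6, 7.0, 7.1, 6.5]
center = ((20, 65, 130), 6.9)
print("center point:", center)

# Main effect of each factor: mean response at the high level
# minus mean response at the low level.
for i, name in enumerate(["stirring time", "drying temp", "drying time"]):
    hi_mean = sum(y for run, y in zip(corners, water) if run[i] == high[i]) / 4
    lo_mean = sum(y for run, y in zip(corners, water) if run[i] == low[i]) / 4
    print(f"{name:13s} effect = {hi_mean - lo_mean:+.2f} % water")
```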
Gatidis, Sergios; Würslin, Christian; Seith, Ferdinand; Schäfer, Jürgen F; la Fougère, Christian; Nikolaou, Konstantin; Schwenzer, Nina F; Schmidt, Holger
2016-01-01
Optimization of tracer dose regimes in positron emission tomography (PET) imaging is a trade-off between diagnostic image quality and radiation exposure. The challenge lies in defining minimal tracer doses that still result in sufficient diagnostic image quality. In order to find such minimal doses, it would be useful to simulate tracer dose reduction, as this would enable studying the effects of dose reduction on image quality in single patients without repeated injections of different amounts of tracer. The aim of our study was to introduce and validate a method for simulating low-dose PET images, enabling direct comparison of different tracer doses in single patients and under constant influencing factors. (18)F-fluoride PET data were acquired on a combined PET/magnetic resonance imaging (MRI) scanner. PET data were stored together with the temporal information of the occurrence of single events (list-mode format). A predefined proportion of PET events was then randomly deleted, resulting in undersampled PET data. These data sets were subsequently reconstructed, resulting in simulated low-dose PET images (retrospective undersampling of list-mode data). This approach was validated in phantom experiments by visual inspection and by comparison of the PET quality metrics contrast recovery coefficient (CRC), background variability (BV) and signal-to-noise ratio (SNR) between measured and simulated PET images for different activity concentrations. In addition, reduced-dose PET images were simulated from a clinical (18)F-FDG PET dataset using the proposed approach. (18)F-PET image quality degraded with decreasing activity concentrations, with comparable visual image characteristics in measured and corresponding simulated PET images. This result was confirmed by quantification of the image quality metrics: CRC, SNR and BV showed concordant behavior with decreasing activity concentrations for measured and corresponding simulated PET images. Simulation of dose-reduced datasets based on clinical (18)F-FDG PET data demonstrated the clinical applicability of the proposed approach. Simulation of PET tracer dose reduction is thus possible with retrospective undersampling of list-mode data. The resulting simulated low-dose images have characteristics equivalent to those of PET images actually measured at lower doses and can be used to derive optimal tracer dose regimes.
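The core of the proposed simulation is random thinning of list-mode events; a schematic sketch using a simplified event representation (not the scanner's actual list-mode format):

```python
import numpy as np

def undersample_listmode(event_times_s: np.ndarray, keep_fraction: float,
                         seed: int = 0) -> np.ndarray:
    """Randomly retain a fraction of list-mode events, simulating a lower tracer dose."""
    rng = np.random.default_rng(seed)
    keep = rng.random(event_times_s.size) < keep_fraction
    return event_times_s[keep]

# Toy acquisition: 1e6 events over 600 s; simulate a 25% dose
events = np.sort(np.random.default_rng(1).uniform(0, 600, size=1_000_000))
low_dose = undersample_listmode(events, keep_fraction=0.25)
print(f"kept {low_dose.size} of {events.size} events "
      f"({100 * low_dose.size / events.size:.1f}%)")
# The thinned event stream would then be reconstructed like a normal acquisition.
```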
Metrology of human-based and other qualitative measurements
NASA Astrophysics Data System (ADS)
Pendrill, Leslie; Petersson, Niclas
2016-09-01
The metrology of human-based and other qualitative measurements is in its infancy: concepts such as traceability and uncertainty are as yet poorly developed. This paper reviews how a measurement system analysis approach, particularly one invoking as performance metric the ability of a probe (such as a human being) acting as a measurement instrument to make a successful decision, can enable a more general metrological treatment of qualitative observations. Measures based on human observations are typically qualitative, not only in sectors such as health care, services and safety, where the human factor is obvious, but also in customer perception of traditional products of all kinds. A principal challenge is that the usual tools of statistics normally employed for expressing measurement accuracy and uncertainty will probably not work reliably if the relations between distances on different portions of a scale are not fully known, as is typical of ordinal or other qualitative measurements. A key enabling insight is to connect the treatment of decision risks associated with measurement uncertainty to generalized linear modelling (GLM). Handling qualitative observations in this way unites information theory with the perceptive identification and choice paradigms of psychophysics. The Rasch invariant-measure psychometric GLM approach in particular enables a proper treatment of ordinal data, a clear separation of probe and item attribute estimates, simple expressions for instrument sensitivity, and so on. Examples include two aspects of the care of breast cancer patients, from diagnosis to rehabilitation. The Rasch approach leads in turn to opportunities for establishing metrological references for quality assurance of qualitative measurements. In psychometrics, one could imagine a certified reference for a knowledge challenge (for example, a particular concept in understanding physics) or for the product quality of a certain health care service. Multivariate methods, such as principal component regression, can also be improved by exploiting the increased resolution of the Rasch approach.
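The dichotomous Rasch model underlying this approach gives the probability of a successful decision as a logistic function of the difference between probe (person) ability and item difficulty; a minimal sketch (parameter values are hypothetical):

```python
import math

def rasch_success_probability(ability: float, difficulty: float) -> float:
    """Dichotomous Rasch model: P(success) = exp(b - d) / (1 + exp(b - d))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Hypothetical probe abilities vs. one item of difficulty 0.5 logits
for ability in (-1.0, 0.0, 0.5, 2.0):
    p = rasch_success_probability(ability, difficulty=0.5)
    print(f"ability {ability:+.1f} -> P(success) = {p:.2f}")
```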
ZoroufchiBenis, Khaled; Fatehifar, Esmaeil; Ahmadi, Javad; Rouhi, Alireza
2015-01-01
Background: Industrial air pollution is a growing challenge to human health, especially in developing countries where there is no systematic monitoring of air pollution. Given the importance of valid information on population exposure to air pollutants, it is important to design an optimal Air Quality Monitoring Network (AQMN) for assessing population exposure to air pollution and predicting the magnitude of the health risks to the population. Methods: A multi-pollutant method (implemented as a MATLAB program) was explored for configuring an AQMN to detect the highest level of pollution around an oil refinery plant. The method ranks potential monitoring sites (grids) according to their ability to represent the ambient concentration. Clusters of contiguous grids that exceed a threshold value were used to calculate the station dosage. The best configuration of the AQMN was selected based on the ratio of a station's dosage to the total dosage in the network. Results: Six monitoring stations were needed to detect the pollutant concentrations around the study area for estimating the level and distribution of exposure in the population, with a total network efficiency of about 99%. An analysis of the design procedure showed that wind regimes have the greatest effect on the location of monitoring stations. Conclusion: The optimal AQMN enables authorities to implement an effective air quality management program for protecting human health. PMID:26933646
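A sketch of the dosage-ratio idea used for ranking candidate stations, simplified to per-cell dosages on a toy grid with an arbitrary threshold (not the MATLAB program described above, which works on clusters of contiguous grids):

```python
import numpy as np

def station_dosages(conc_grid: np.ndarray, threshold: float) -> np.ndarray:
    """Dosage per candidate grid cell: concentration above threshold, else zero."""
    return np.where(conc_grid > threshold, conc_grid, 0.0)

# Toy annual-mean concentration field (arbitrary units) on a 4x4 grid
conc = np.array([[2, 5, 9, 4],
                 [3, 8, 12, 6],
                 [1, 4, 7, 3],
                 [0, 2, 3, 1]], dtype=float)
dosage = station_dosages(conc, threshold=4.0)
ratios = dosage / dosage.sum()               # each cell's share of the total dosage
best = tuple(int(i) for i in np.unravel_index(np.argmax(ratios), ratios.shape))
print("best station cell:", best, f"({100 * ratios.max():.0f}% of network dosage)")
```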
Taylor, Natalie; Clay-Williams, Robyn; Hogden, Emily; Pye, Victoria; Li, Zhicheng; Groene, Oliver; Suñol, Rosa; Braithwaite, Jeffrey
2015-01-01
Introduction Despite the growing body of research on quality and safety in healthcare, there is little evidence of the association between the way hospitals are organised for quality and patient factors, limiting our understanding of how to effect large-scale change. The ‘Deepening our Understanding of Quality in Australia’ (DUQuA) study aims to measure and examine relationships between (1) organisation and department-level quality management systems (QMS), clinician leadership and culture, and (2) clinical treatment processes, clinical outcomes and patient-reported perceptions of care within Australian hospitals. Methods and analysis The DUQuA project is a national, multilevel, cross-sectional study with data collection at organisation (hospital), department, professional and patient levels. Sample size calculations indicate a minimum of 43 hospitals are required to adequately power the study. To allow for rejection and attrition, 70 hospitals across all Australian jurisdictions that meet the inclusion criteria will be invited to participate. Participants will consist of hospital quality management professionals; clinicians; and patients with stroke, acute myocardial infarction and hip fracture. Organisation and department-level QMS, clinician leadership and culture, patient perceptions of safety, clinical treatment processes, and patient outcomes will be assessed using validated, evidence-based or consensus-based measurement tools. Data analysis will consist of simple correlations, linear and logistic regression and multilevel modelling. Multilevel modelling methods will enable identification of the amount of variation in outcomes attributed to the hospital and department levels, and the factors contributing to this variation. Ethics and dissemination Ethical approval has been obtained. Results will be disseminated to individual hospitals in de-identified national and international benchmarking reports with data-driven recommendations. This ground-breaking national study has the potential to influence decision-making on the implementation of quality and safety systems and processes in Australian and international hospitals. PMID:26644128