Major advances in testing of dairy products: milk component and dairy product attribute testing.
Barbano, D M; Lynch, J M
2006-04-01
Milk component analysis is relatively unusual in the field of quantitative analytical chemistry in that an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. There is therefore a high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance, combined with significant improvements in both chemical and instrumental methods, have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value-added dairy foods for specific market segments has created a need for instrumental and sensory approaches, and for quantitative data, to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography-olfactometry for the identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.
Sigma Metrics Across the Total Testing Process.
Charuruks, Navapun
2017-03-01
Laboratory quality control has been developed over several decades to ensure patient safety, expanding from a statistical quality control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million count and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Our data show improvement in sigma-scale performance. New tools and techniques for integration are needed.
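The defects-per-million and sigma metric relationship mentioned above can be sketched as a small conversion function. This is a minimal illustration using the conventional 1.5-sigma long-term shift; it is not code from the cited study.

```python
from statistics import NormalDist

def sigma_from_dpm(dpm: float) -> float:
    """Convert a defects-per-million (DPM) count to a sigma level.

    Uses the conventional 1.5-sigma long-term shift, under which
    3.4 DPM corresponds to roughly 6 sigma.
    """
    return NormalDist().inv_cdf(1 - dpm / 1_000_000) + 1.5
```

For example, `sigma_from_dpm(3.4)` returns approximately 6.0 and `sigma_from_dpm(66807)` approximately 3.0, matching the standard Six Sigma conversion tables.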
Kumar, B. Vinodh; Mohan, Thuthi
2018-01-01
OBJECTIVE: Six Sigma is one of the most popular quality management tools employed for process improvement. Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for each parameter, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study; the data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study were the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias percentage for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For the level 2 IQC, the same four analytes as in level 1 again performed at ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below 6 sigma, the quality goal index (QGI) was <0.8, indicating that imprecision was the area requiring improvement, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. CONCLUSION: This study shows that the sigma metric is a good quality tool for assessing the analytical performance of a clinical chemistry laboratory. Sigma metric analysis thus provides a benchmark for the laboratory to design an IQC protocol, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
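The sigma metric and quality goal index (QGI) used in studies like this one follow standard formulas: sigma = (TEa − |bias|) / CV and QGI = |bias| / (1.5 × CV), all in percent. A minimal sketch follows; the analyte values in the example are hypothetical, not taken from the study.

```python
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma metric = (TEa - |bias|) / CV, with TEa the total allowable
    error, bias typically from EQAS, and CV from internal QC, all in %."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct: float, cv_pct: float) -> float:
    """QGI = |bias| / (1.5 * CV); by the convention used above, values
    < 0.8 point to imprecision and values > 1.2 to inaccuracy."""
    return abs(bias_pct) / (1.5 * cv_pct)

# Hypothetical analyte: TEa 10%, bias 2%, CV 2%
print(sigma_metric(10, 2, 2))    # 4.0 -> below the 6-sigma ideal
print(quality_goal_index(2, 2))  # ~0.67 -> imprecision is the gap
```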
Aronsson, T; Bjørnstad, P; Leskinen, E; Uldall, A; de Verdier, C H
1984-01-01
The aim of this investigation was primarily to assess analytical quality, expressed as between-laboratory, within-laboratory, and total imprecision; the goal was not to detect laboratories with poor performance but, in a positive sense, to provide data for improving critical steps in analytical methodology. A further aim was to establish the present state of the art in comparison with earlier investigations, to see whether improvement in analytical quality could be observed.
The Role of Nanoparticle Design in Determining Analytical Performance of Lateral Flow Immunoassays.
Zhan, Li; Guo, Shuang-Zhuang; Song, Fayi; Gong, Yan; Xu, Feng; Boulware, David R; McAlpine, Michael C; Chan, Warren C W; Bischof, John C
2017-12-13
Rapid, simple, and cost-effective diagnostics are needed to improve healthcare at the point of care (POC). However, the most widely used POC diagnostic, the lateral flow immunoassay (LFA), is ~1,000 times less sensitive and has a smaller analytical range than laboratory tests, requiring a confirmatory test to establish truly negative results. Here, a rational and systematic strategy is used to design the LFA contrast label (i.e., gold nanoparticles) to improve the analytical sensitivity, analytical detection range, and antigen quantification of LFAs. Specifically, we discovered that the size (30, 60, or 100 nm) of the gold nanoparticles is a main contributor to LFA analytical performance, through both the degree of receptor interaction and the ultimate visual or thermal contrast signals. Using the optimal LFA design, we demonstrated the ability to improve the analytical sensitivity by 256-fold and to expand the analytical detection range from 3 log10 to 6 log10 for diagnosing patients with inflammatory conditions by measuring C-reactive protein. This work demonstrates that, with appropriate design of the contrast label, a simple and commonly used diagnostic technology can compete with more expensive state-of-the-art laboratory tests.
Temporal Learning Analytics for Adaptive Assessment
ERIC Educational Resources Information Center
Papamitsiou, Zacharoula; Economides, Anastasios A.
2014-01-01
Accurate and early predictions of student performance could significantly affect interventions during teaching and assessment, which gradually could lead to improved learning outcomes. In our research, we seek to identify and formalize temporal parameters as predictors of performance ("temporal learning analytics" or TLA) and examine…
Churchwell, Mona I; Twaddle, Nathan C; Meeker, Larry R; Doerge, Daniel R
2005-10-25
Recent technological advances have made available reversed-phase chromatographic media with a 1.7 µm particle size, along with a liquid handling system that can operate such columns at much higher pressures. This technology, termed ultra performance liquid chromatography (UPLC), offers significant theoretical advantages in resolution, speed, and sensitivity for analytical determinations, particularly when coupled with mass spectrometers capable of high-speed acquisitions. This paper explores the differences in LC-MS performance through a side-by-side comparison of UPLC against several methods previously optimized for HPLC-based separation and quantification of multiple analytes with maximum throughput. In general, UPLC produced significant improvements in method sensitivity, speed, and resolution. Sensitivity increases with UPLC were analyte-dependent and as large as 10-fold, and improvements in method speed were as large as 5-fold under conditions of comparable peak separations. Improvements in chromatographic resolution with UPLC were apparent from generally narrower peak widths and from a separation of diastereomers not possible using HPLC. Overall, the improvements in LC-MS method sensitivity, speed, and resolution provided by UPLC show that further advances can be made in analytical methodology to add significant value to hypothesis-driven research.
ERIC Educational Resources Information Center
Buerck, John P.; Mudigonda, Srikanth P.
2014-01-01
Academic analytics and learning analytics have been increasingly adopted by academic institutions of higher learning for improving student performance and retention. While several studies have reported the implementation details and the successes of specific analytics initiatives, relatively fewer studies exist in literature that describe the…
Embarking on performance improvement.
Brown, Bobbi; Falk, Leslie Hough
2014-06-01
Healthcare organizations should approach performance improvement as a program, not a project. The program should be led by a guidance team that identifies goals, prioritizes work, and removes barriers to enable clinical improvement teams and work groups to realize performance improvements. A healthcare enterprise data warehouse can provide the initial foundation for the program analytics. Evidence-based best practices can help achieve improved outcomes and reduced costs.
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. The accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis techniques are then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. The results obtained show significant improvements in the aircraft's aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
Screen-printed back-to-back electroanalytical sensors.
Metters, Jonathan P; Randviir, Edward P; Banks, Craig E
2014-11-07
We introduce the concept of screen-printed back-to-back electroanalytical sensors. In this facile and generic approach, screen-printed electrodes are printed back-to-back with a common electrical connection to the two working electrodes; the counter and reference electrodes for each are connected in the same manner as in a normal "traditional" screen-printed sensor. This approach utilises the usually redundant back of the screen-printed sensor, converting this "dead space" into a further electrochemical sensor and thereby improving the analytical performance. In the back-to-back design, the electrode area is doubled; correspondingly, the analytical sensitivity (the gradient of a plot of peak height, or analytical signal, against concentration) doubles and the limit of detection is reduced. We also demonstrate that, through intelligent electrode design, a quadrupling of the observed analytical sensitivity can be realised when double microband electrodes are used in the back-to-back configuration, provided they are placed sufficiently far apart that no diffusional interaction occurs. This approach is generic in nature and can be readily applied to a plethora of screen-printed (and related) sensors, using the commonly overlooked back of the electrode to provide facile improvements in electroanalytical performance.
On the performance of piezoelectric harvesters loaded by finite width impulses
NASA Astrophysics Data System (ADS)
Doria, A.; Medè, C.; Desideri, D.; Maschio, A.; Codecasa, L.; Moro, F.
2018-02-01
The response of cantilevered piezoelectric harvesters loaded by finite width impulses of base acceleration is studied analytically in the frequency domain in order to identify the parameters that influence the generated voltage. Experimental tests are then performed on harvesters loaded by hammer impacts. The latter are used to confirm analytical results and to validate a linear finite element (FE) model of a unimorph harvester. The FE model is, in turn, used to extend analytical results to more general harvesters (tapered, inverse tapered, triangular) and to more general impulses (heel strike in human gait). From analytical and numerical results design criteria for improving harvester performance are obtained.
Performance Evaluation of an Improved GC-MS Method to Quantify Methylmercury in Fish.
Watanabe, Takahiro; Kikuchi, Hiroyuki; Matsuda, Rieko; Hayashi, Tomoko; Akaki, Koichi; Teshima, Reiko
2015-01-01
Here, we set out to improve our previously developed methylmercury analytical method, involving phenyl derivatization and gas chromatography-mass spectrometry (GC-MS). In the improved method, phenylation of methylmercury with sodium tetraphenylborate was carried out in a toluene/water two-phase system, instead of in water alone. This modification enabled derivatization at the optimum pH, and the formation of by-products was dramatically reduced. In addition, adsorption of methyl phenyl mercury in the GC system was suppressed by co-injection of PEG200, enabling continuous analysis without loss of sensitivity. The performance of the improved analytical method was independently evaluated by three analysts using certified reference materials and methylmercury-spiked fresh fish samples. The analytical method was validated as suitable for determining compliance with the provisional regulation value for methylmercury in fish set in the Food Sanitation Law.
Improved sidelobe suppression mode performance on ATCRBS with various antennas
DOT National Transportation Integrated Search
1975-02-01
The SLS mode performance of terminal and enroute ATCRBS using existing and various improved antennas in the presence of perfectly dielectric flat ground are investigated theoretically. Necessary analytical expressions for various quantities character...
Using business analytics to improve outcomes.
Rivera, Jose; Delaney, Stephen
2015-02-01
Orlando Health has brought its hospital and physician practice revenue cycle systems into better balance using four sets of customized analytics. Physician performance analytics gauge the total net revenue for every employed physician; patient-pay analytics provide financial risk scores for all patients on both the hospital and physician practice sides; revenue management analytics bridge the gap between the back-end central business office and the front-end physician practice managers and administrators; and enterprise management analytics allow the hospitals and physician practices to share important information about common patients.
Development of airframe design technology for crashworthiness.
NASA Technical Reports Server (NTRS)
Kruszewski, E. T.; Thomson, R. G.
1973-01-01
This paper describes the NASA portion of a joint FAA-NASA General Aviation Crashworthiness Program leading to the development of improved crashworthiness design technology. The objectives of the program are to develop analytical technology for predicting crashworthiness of structures, provide design improvements, and perform full-scale crash tests. The analytical techniques which are being developed both in-house and under contract are described, and typical results from these analytical programs are shown. In addition, the full-scale testing facility and test program are discussed.
NASA Astrophysics Data System (ADS)
Takahashi, D.; Sawaki, S.; Mu, R.-L.
2016-06-01
A new method for improving the sound insulation performance of double-glazed windows is proposed. This technique uses viscoelastic materials as connectors between the two glass panels to ensure that the appropriate spacing is maintained. An analytical model that makes it possible to discuss the effects of spacing, contact area, and viscoelastic properties of the connectors on the performance in terms of sound insulation is developed. The validity of the model is verified by comparing its results with measured data. The numerical experiments using this analytical model showed the importance of the ability of the connectors to achieve the appropriate spacing and their viscoelastic properties, both of which are necessary for improving the sound insulation performance. In addition, it was shown that the most effective factor is damping: the stronger the damping, the more the insulation performance increases.
Using Learning Analytics to Assess Student Learning in Online Courses
ERIC Educational Resources Information Center
Martin, Florence; Ndoye, Abdou
2016-01-01
Learning analytics can be used to enhance student engagement and performance in online courses. Using learning analytics, instructors can collect and analyze data about students and improve the design and delivery of instruction to make it more meaningful for them. In this paper, the authors review different categories of online assessments and…
Big data and visual analytics in anaesthesia and health care.
Simpao, A F; Ahumada, L M; Rehman, M A
2015-09-01
Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics--the systematic use of data combined with quantitative and qualitative analysis to make decisions--can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences.
The 2D analytic signal for envelope detection and feature extraction on ultrasound images.
Wachinger, Christian; Klein, Tassilo; Navab, Nassir
2012-08-01
The fundamental property of the analytic signal is the split of identity: the separation of qualitative and quantitative information into the local phase and the local amplitude, respectively. The structural representation of the local phase, independent of brightness and contrast, is especially interesting for numerous image processing tasks. Recently, the extension of the analytic signal from 1D to 2D, covering also intrinsically 2D structures, was proposed. We show the advantages of this improved concept on ultrasound RF and B-mode images. Specifically, we use the 2D analytic signal for the envelope detection of RF data. This offers advantages for the extraction of the information-bearing signal from the modulated carrier wave. We illustrate this, first, by visual assessment of the images and, second, by performing goodness-of-fit tests against a Nakagami distribution, which indicate a clear improvement in statistical properties. The evaluation is performed for multiple window sizes and parameter estimation techniques. Finally, we show that the 2D analytic signal allows for improved estimation of local features on B-mode images.
Analysis of a virtual memory model for maintaining database views
NASA Technical Reports Server (NTRS)
Kinsley, Kathryn C.; Hughes, Charles E.
1992-01-01
This paper presents an analytical model for predicting the performance of a new support strategy for database views. This strategy, called the virtual method, is compared with traditional methods for supporting views. The analytical model's predictions of improved performance by the virtual method are then validated by comparing these results with those achieved in an experimental implementation.
Big data analytics to improve cardiovascular care: promise and challenges.
Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M
2016-06-01
The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.
Preijers, Frank W M B; van der Velden, Vincent H J; Preijers, Tim; Brooimans, Rik A; Marijt, Erik; Homburg, Christa; van Montfort, Kees; Gratama, Jan W
2016-05-01
In 1985, external quality assurance was initiated in the Netherlands to reduce the between-laboratory variability of leukemia/lymphoma immunophenotyping and to improve diagnostic conclusions. This program consisted of regular distributions of test samples followed by biannual plenary participant meetings in which results were presented and discussed. A scoring system was developed in which the quality of results was rated by systematically reviewing the pre-analytical, analytical, and post-analytical assay stages using three scores, i.e., correct (A), minor fault (B), and major fault (C). Here, we report on 90 consecutive samples distributed to 40-61 participating laboratories between 1998 and 2012. Most samples contained >20% aberrant cells, mainly selected from mature lymphoid malignancies (B or T cell) and acute leukemias (myeloid or lymphoblastic). In 2002, minimally required monoclonal antibody (mAb) panels were introduced, whilst methodological guidelines for all three assay stages were implemented. Retrospectively, we divided the study into subsequent periods of 4 ("initial"), 4 ("learning"), and 7 years ("consolidation") to detect "learning effects." Uni- and multivariate models showed that analytical performance declined since 2002, but that post-analytical performance improved during the entire period. These results emphasized the need to improve technical aspects of the assay, and reflected improved interpretational skills of the participants. A strong effect of participant affiliation in all three assay stages was observed: laboratories in academic and large peripheral hospitals performed significantly better than those in small hospitals.
Improved explosive collection and detection with rationally assembled surface sampling materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chouyyok, Wilaiwan; Bays, J. Timothy; Gerasimenko, Aleksandr A.
Sampling and detection of trace explosives is a key analytical process in modern transportation safety. In this work, we explored some of the fundamental analytical processes for the collection and detection of trace-level explosives on surfaces with the most widely utilized system, thermal desorption IMS. The performance of the standard muslin swipe material was compared with chemically modified fiberglass cloth, whose surface was modified to include phenyl functional groups. Compared to standard muslin, the phenyl-functionalized fiberglass sampling material showed better analyte release from the sampling material as well as improved response and repeatability over multiple uses of the same swipe. The improved sample release of the functionalized fiberglass swipes resulted in a significant increase in sensitivity. Various physical and chemical properties were systematically explored to determine optimal performance. The results herein have relevance to improving the detection of other explosive compounds and potentially to a wide range of other chemical sampling and field detection challenges.
Chung, Hee-Jung; Song, Yoon Kyung; Hong, Sung Kuk; Hwang, Sang-Hyun; Seo, Hee Seung; Whang, Dong Hee; Nam, Myung-Hyun; Lee, Do Hoon
2017-01-01
Recently, because the quality of laboratory analyses has increased along with the need for quality improvement, several external quality control bodies have adopted performance specifications based on the Desirable Biological Variation Database, termed "Ricos goals"; these criteria are more stringent than those presented in CLIA 88. In this study, we aimed to validate newly introduced serum separator tubes, Improvacutor, for routine clinical chemistry testing in accordance with the Ricos goals and CLIA 88. Blood samples were collected from 100 volunteers into three types of serum vacuum tubes: Greiner Vacuette, Becton Dickinson (BD) Vacutainer, and Improve Improvacutor. The samples were subjected to 16 routine chemistry tests using a TBA-200fr NEO chemistry autoanalyzer. In the comparison analysis, all 16 test results were acceptable according to CLIA 88. However, in the comparison of the Improve and BD tubes, creatinine showed a 4.31% (+0.08 μmol/L) bias. This slightly exceeded the Ricos Desirable Specification for Inaccuracy limit of ±3.96%, but still satisfied the CLIA 88 limit of ±26.52 μmol/L. The remaining 15 analytes performed acceptably according to the Desirable Specifications of Ricos. The correlation coefficient was greater than 0.95 for 12 analytes in Passing-Bablok regression analysis among the three tubes, but was lower for four analytes: calcium, sodium, potassium, and chloride. In the stability assay, only potassium tested in the Greiner tube revealed a larger positive bias (2.18%) than the Ricos Desirable Specification for Inaccuracy based on biological variation (1.8%). The BD tube also showed a positive bias of 1.74%, whereas the new Improve tube showed the smallest positive bias of 1.17% in potassium level after 72 h of storage. Thus, the results of this study demonstrate that the recently introduced analytical performance specifications based on components of biological variation (Ricos goals) can be extended as criteria for performance evaluation and applied.
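The bias comparison reported above reduces to a simple percent-difference check against a specification limit. A minimal sketch follows; the means used in the example are hypothetical, chosen only to reproduce a 4.31% bias, and the limits echo the ±3.96% Ricos and ±26.52 μmol/L-scale CLIA 88 figures cited in the abstract.

```python
def percent_bias(test_mean: float, comparison_mean: float) -> float:
    """Percent bias of the test tube's mean versus the comparison tube's mean."""
    return 100.0 * (test_mean - comparison_mean) / comparison_mean

def within_spec(bias_pct: float, limit_pct: float) -> bool:
    """True if |bias| stays inside a symmetric specification limit (in %)."""
    return abs(bias_pct) <= limit_pct

bias = percent_bias(104.31, 100.0)  # hypothetical creatinine means
print(within_spec(bias, 3.96))      # False: exceeds the Ricos desirable limit
print(within_spec(bias, 30.0))      # True: within a looser CLIA-style limit
```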
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Widanapathirana, Chathuranga
2014-01-01
Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…
Human, Lauren J; Thorson, Katherine R; Woolley, Joshua D; Mendes, Wendy Berry
2017-04-01
Intranasal administration of the hypothalamic neuropeptide oxytocin (OT) has, in some studies, been associated with positive effects on social perception and cognition. Similarly, positive emotion inductions can improve a range of perceptual and performance-based behaviors. In this exploratory study, we examined how OT administration and positive emotion inductions interact in their associations with social and analytical performance. Participants (N=124) were randomly assigned to receive an intranasal spray of OT (40 IU) or placebo and then viewed one of three videos designed to engender one of the following emotion states: social warmth, pride, or an affectively neutral state. Following the emotion induction, participants completed social perception and analytical tasks. There were no significant main effects of OT condition on social perception tasks, failing to replicate prior research, or on analytical performance. Further, OT condition and positive emotion inductions did not interact with each other in their associations with social perception performance. However, OT condition and positive emotion manipulations did significantly interact in their associations with analytical performance. Specifically, combining positive emotion inductions with OT administration was associated with worse analytical performance: the pride induction no longer benefited performance, and the warmth induction resulted in worse performance. In sum, we found little evidence for main or interactive effects of OT on social perception, but preliminary evidence that OT administration may impair analytical performance when paired with positive emotion inductions.
Shaikh, M S; Moiz, B
2016-04-01
Around two-thirds of important clinical decisions about the management of patients are based on laboratory test results. Clinical laboratories are required to adopt quality control (QC) measures to ensure provision of accurate and precise results. Six Sigma is a statistical tool that provides an opportunity to assess performance at the highest level of excellence. The purpose of this study was to assess the performance of our hematological parameters on the sigma scale in order to identify gaps, and hence areas of improvement, in patient care. The twelve analytes included in the study were hemoglobin (Hb), hematocrit (Hct), red blood cell count (RBC), mean corpuscular volume (MCV), red cell distribution width (RDW), total leukocyte count (TLC) with percentages of neutrophils (Neutr%) and lymphocytes (Lymph%), platelet count (Plt), mean platelet volume (MPV), prothrombin time (PT), and fibrinogen (Fbg). Internal quality control data and external quality assurance survey results were utilized for the calculation of sigma metrics for each analyte. An acceptable sigma value of ≥3 was obtained for the majority of the analytes included in the analysis. MCV, Plt, and Fbg achieved values of <3 for the level 1 (low abnormal) control. PT performed poorly on both level 1 and 2 controls, with sigma values of <3. Despite acceptable conventional QC tools, application of sigma metrics can identify analytical deficits and hence prospects for improvement in clinical laboratories. © 2016 John Wiley & Sons Ltd.
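The sigma metric used in studies like this one is conventionally computed from the allowable total error (TEa), the bias observed in external quality assurance, and the imprecision (CV) from internal quality control. A minimal sketch; the numeric values are illustrative, not this study's data:

```python
# Standard sigma-metric formula: sigma = (TEa% - |bias%|) / CV%
# (TEa, bias, and CV values below are hypothetical examples.)

def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric from allowable total error, EQAS bias, and IQC imprecision."""
    if cv_pct <= 0:
        raise ValueError("CV% must be positive")
    return (tea_pct - abs(bias_pct)) / cv_pct

# Example: an analyte with TEa = 10%, bias = 2%, CV = 1.3%
s = sigma_metric(10.0, 2.0, 1.3)   # ~6.2, i.e. >=6 sigma ("world class")
```

An analyte scoring ≥6 needs only minimal QC rules, while one scoring <3 calls for stricter Westgard rules and more control levels.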
High-performance heat pipes for heat recovery applications
NASA Technical Reports Server (NTRS)
Saaski, E. W.; Hartl, J. H.
1980-01-01
Methods to improve the performance of reflux heat pipes for heat recovery applications were examined both analytically and experimentally. Various models for the estimation of reflux heat pipe transport capacity were surveyed in the literature and compared with experimental data. A high transport capacity reflux heat pipe was developed that provides up to a factor of 10 capacity improvement over conventional open tube designs; analytical models were developed for this device and incorporated into a computer program HPIPE. Good agreement of the model predictions with data for R-11 and benzene reflux heat pipes was obtained.
Integrated pest management of "Golden Delicious" apples.
Simončič, A; Stopar, M; Velikonja Bolta, Š; Bavčar, D; Leskovšek, R; Baša Česnik, H
2015-01-01
Monitoring of plant protection product (PPP) residues in "Golden Delicious" apples was performed in 2011-2013, in which 216 active substances were analysed with three analytical methods. Integrated pest management (IPM) production and improved IPM production were compared. Results were in favour of improved IPM production. Some active compounds determined in IPM production (boscalid, pyraclostrobin, thiacloprid and thiamethoxam) were not found in improved IPM production. In addition, in 2011 and 2012, captan residues were lower in improved IPM production. Risk assessment was also performed. Chronic exposure of consumers was low in general and showed no major differences between IPM and improved IPM production for active substances determined in both types of production. Analytical results were compared with the European Union report of 2010, in which 1.3% of apple samples exceeded maximum residue levels (MRLs), while MRL exceedances were not observed in this survey.
NASA Technical Reports Server (NTRS)
Graham, A. B.
1977-01-01
Small- and large-scale models of supersonic cruise fighter vehicles were used to determine the effectiveness of airframe/propulsion integration concepts for improved low-speed performance and stability and control characteristics. Computer programs were used for engine/airframe sizing studies to yield optimum vehicle performance.
Weykamp, Cas; Secchiero, Sandra; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Thomas, Annette; Jassam, Nuthar; Barth, Julian H; Perich, Carmen; Ricós, Carmen; Faria, Ana Paula
2017-02-01
Optimum patient care in relation to laboratory medicine is achieved when results of laboratory tests are equivalent, irrespective of the analytical platform used or the country where the laboratory is located. Standardization and harmonization minimize differences, and the success of efforts to achieve this can be monitored with international category 1 external quality assessment (EQA) programs. An EQA project with commutable samples, targeted with reference measurement procedures (RMPs), was organized by EQA institutes in Italy, the Netherlands, Portugal, the UK, and Spain. Results of 17 general chemistry analytes were evaluated across countries and across manufacturers according to performance specifications derived from biological variation (BV). For K, uric acid, glucose, cholesterol, and high-density lipoprotein (HDL) cholesterol, the minimum performance specification was met in all countries and by all manufacturers. For Na, Cl, and Ca, the minimum performance specifications were met by none of the countries and manufacturers. For enzymes, the situation was complicated, as standardization of results of enzymes toward RMPs was still not achieved in 20% of the laboratories and questionable in the remaining 80%. The overall performance of the measurement of 17 general chemistry analytes in European medical laboratories met the minimum performance specifications. In this general picture, there were no significant differences per country and no significant differences per manufacturer. There were major differences between the analytes. There were six analytes for which the minimum quality specifications were not met, and manufacturers should improve their performance for these analytes. Standardization of results of enzymes requires ongoing efforts.
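Performance specifications derived from biological variation, as used above, are conventionally computed from the within-subject (CVi) and between-subject (CVg) variation. A hedged sketch of the widely used formulas, with the 0.75/0.375 multipliers of the "minimum" tier (the "desirable" tier uses 0.5/0.25); the BV values below are illustrative, not this study's data:

```python
# Minimum-tier performance specifications from biological variation data.
# Multipliers and the 1.65 z-value follow the commonly cited BV framework;
# CVi and CVg inputs here are hypothetical examples.

def minimum_specs(cvi, cvg):
    """Minimum allowable imprecision, bias, and total error (%) from BV data."""
    imprecision = 0.75 * cvi
    bias = 0.375 * (cvi**2 + cvg**2) ** 0.5
    total_error = bias + 1.65 * imprecision   # allowable total error, p < 0.05
    return imprecision, bias, total_error

# e.g. glucose-like BV data: CVi ~ 5.6%, CVg ~ 7.5%
imp, bias, tea = minimum_specs(5.6, 7.5)
```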
Quality in laboratory medicine: 50 years on.
Plebani, Mario
2017-02-01
The last 50 years have seen substantial changes in the landscape of laboratory medicine: its role in modern medicine is in evolution and the quality of laboratory services is changing. The need to control and improve quality in clinical laboratories has grown hand in hand with the growth in technological developments, leading to an impressive reduction of analytical errors over time. An essential cause of this impressive improvement has been the introduction and monitoring of quality indicators (QIs) such as the analytical performance specifications (in particular bias and imprecision) based on well-established goals. The evolving landscape of quality and errors in clinical laboratories moved first from analytical errors to all errors performed within the laboratory walls, subsequently to errors in laboratory medicine (including errors in test requesting and result interpretation), and finally, to a focus on errors more frequently associated with adverse events (laboratory-associated errors). After decades in which clinical laboratories have focused on monitoring and improving internal indicators of analytical quality, efficiency and productivity, it is time to shift toward indicators of total quality, clinical effectiveness and patient outcomes. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Analytical concepts for health management systems of liquid rocket engines
NASA Technical Reports Server (NTRS)
Williams, Richard; Tulpule, Sharayu; Hawman, Michael
1990-01-01
Substantial improvement in health management systems performance can be realized by implementing advanced analytical methods of processing existing liquid rocket engine sensor data. In this paper, such techniques ranging from time series analysis to multisensor pattern recognition to expert systems to fault isolation models are examined and contrasted. The performance of several of these methods is evaluated using data from test firings of the Space Shuttle main engines.
Performance specifications for the extra-analytical phases of laboratory testing: Why and how.
Plebani, Mario
2017-07-01
An important priority in the current healthcare scenario should be to address errors in laboratory testing, which account for a significant proportion of diagnostic errors. Efforts made in laboratory medicine to enhance the diagnostic process have been directed toward improving technology, achieving greater test volumes and more accurate laboratory tests, but data collected in the last few years highlight the need to re-evaluate the total testing process (TTP) as the unique framework for improving quality and patient safety. Valuable quality indicators (QIs) and extra-analytical performance specifications are required for guidance in improving all TTP steps. Yet no data are available in the literature on extra-analytical performance specifications based on outcomes, nor is it possible to set any specification using calculations involving biological variability. The collection of data representing the state of the art based on quality indicators is, therefore, underway. The adoption of a harmonized set of QIs, a common data collection, and a standardised reporting method is mandatory, as it will not only allow the accreditation of clinical laboratories according to the International Standard, but also assure guidance for promoting improvement processes and guaranteeing quality care to patients. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
He, Xiaoyong; Dong, Bo; Chen, Yuqi; Li, Runhua; Wang, Fujuan; Li, Jiaoyang; Cai, Zhigang
2018-03-01
In order to improve the analytical speed and performance of laser-ablation-based atomic emission spectroscopy, high repetition rate laser-ablation spark-induced breakdown spectroscopy (HRR LA-SIBS) was developed for the first time. Magnesium and copper in aluminum alloys were analyzed with this technique. In the experiments, the fundamental output of an acousto-optically Q-switched Nd:YAG laser operated at 1 kHz repetition rate with low pulse energy and 120 ns pulse width was used to ablate the samples, and the plasma emission was enhanced by spark discharge. The spectra were recorded with a compact fiber spectrometer with a non-intensified charge-coupled device in non-gating mode. Parameters relevant to analytical performance, such as capacitance, voltage, and laser pulse energy, were optimized. Under the current experimental conditions, calibration curves for magnesium and copper in aluminum alloys were built, and their limits of detection were determined to be 14.0 and 9.9 ppm, respectively, by HRR LA-SIBS, 8-12-fold better than those achieved by HRR LA under similar experimental conditions without spark discharge. The analytical sensitivities are close to those obtained with conventional LIBS, but with improved analytical speed as well as the possibility of using a compact fiber spectrometer. Under high repetition rate operation, the noise level can be decreased and the analytical reproducibility markedly improved by averaging multiple measurements within a short time. High repetition rate operation of laser-ablation spark-induced breakdown spectroscopy is very helpful for improving analytical speed, and may find applications in fast elemental analysis, especially fast two-dimensional elemental mapping of solid samples.
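Limits of detection from a calibration curve, as reported above, are commonly estimated with the 3-sigma criterion: three times the standard deviation of the background signal divided by the calibration slope. A sketch with hypothetical noise and slope values (not the paper's measurements):

```python
# 3-sigma limit-of-detection estimate from a calibration curve.
# noise_sd and slope below are illustrative, not the HRR LA-SIBS data.

def limit_of_detection(noise_sd, slope):
    """LOD = 3 * (standard deviation of the background) / (calibration slope)."""
    if slope <= 0:
        raise ValueError("calibration slope must be positive")
    return 3.0 * noise_sd / slope

# e.g. a background SD of 70 counts and a slope of 21 counts per ppm:
lod_ppm = limit_of_detection(70.0, 21.0)   # -> 10.0 ppm
```

Averaging N repeated spectra reduces the background SD roughly by sqrt(N), which is why high repetition rates improve the detection limit within the same acquisition time.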
Design Patterns to Achieve 300x Speedup for Oceanographic Analytics in the Cloud
NASA Astrophysics Data System (ADS)
Jacob, J. C.; Greguska, F. R., III; Huang, T.; Quach, N.; Wilson, B. D.
2017-12-01
We describe how we achieve super-linear speedup over standard approaches for oceanographic analytics on a cluster computer and the Amazon Web Services (AWS) cloud. NEXUS is an open source platform for big data analytics in the cloud that enables this performance through a combination of horizontally scalable data parallelism with Apache Spark and rapid data search, subset, and retrieval with tiled array storage in cloud-aware NoSQL databases like Solr and Cassandra. NEXUS is the engine behind several public portals at NASA and OceanWorks is a newly funded project for the ocean community that will mature and extend this capability for improved data discovery, subset, quality screening, analysis, matchup of satellite and in situ measurements, and visualization. We review the Python language API for Spark and how to use it to quickly convert existing programs to use Spark to run with cloud-scale parallelism, and discuss strategies to improve performance. We explain how partitioning the data over space, time, or both leads to algorithmic design patterns for Spark analytics that can be applied to many different algorithms. We use NEXUS analytics as examples, including area-averaged time series, time averaged map, and correlation map.
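The partition-then-combine design pattern described above can be sketched without Spark: tile the data over time (or space), map a partial reduction over each tile, and combine the partials. In NEXUS the map step would run as a Spark transformation over tiles retrieved from Solr/Cassandra; the tile layout and data below are hypothetical:

```python
# Minimal, Spark-free sketch of the partitioned area-averaged time series
# pattern. Each tile is a (time index, 2-D grid) pair; None marks fill values.

def area_average(tile):
    """Partial reduction for one tile: (time, sum of valid values, count)."""
    t, grid = tile
    vals = [v for row in grid for v in row if v is not None]
    return t, sum(vals), len(vals)

def area_averaged_time_series(tiles):
    """Combine per-tile partial sums into an area-averaged time series."""
    partials = map(area_average, tiles)   # the parallelizable map step
    return {t: s / n for t, s, n in partials if n}

tiles = [
    (0, [[1.0, 2.0], [3.0, None]]),
    (1, [[2.0, 2.0], [2.0, 2.0]]),
]
series = area_averaged_time_series(tiles)   # {0: 2.0, 1: 2.0}
```

Because each tile is reduced independently, the same shape works for time-averaged maps (partition by space instead of time) and scales linearly with the number of workers.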
A flexible influence of affective feelings on creative and analytic performance.
Huntsinger, Jeffrey R; Ray, Cara
2016-09-01
Considerable research shows that positive affect improves performance on creative tasks and negative affect improves performance on analytic tasks. The present research entertained the idea that affective feelings have flexible, rather than fixed, effects on cognitive performance. Consistent with the idea that positive and negative affect signal the value of accessible processing inclinations, the influence of affective feelings on performance on analytic or creative tasks was found to be flexibly responsive to the relative accessibility of different styles of processing (i.e., heuristic vs. systematic, global vs. local). When a global processing orientation was accessible happy participants generated more creative uses for a brick (Experiment 1), successfully solved more remote associates and insight problems (Experiment 2) and displayed broader categorization (Experiment 3) than those in sad moods. When a local processing orientation was accessible this pattern reversed. When a heuristic processing style was accessible happy participants were more likely to commit the conjunction fallacy (Experiment 3) and showed less pronounced anchoring effects (Experiment 4) than sad participants. When a systematic processing style was accessible this pattern reversed. Implications of these results for relevant affect-cognition models are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Post-analytical Issues in Hemostasis and Thrombosis Testing.
Favaloro, Emmanuel J; Lippi, Giuseppe
2017-01-01
Analytical concerns within hemostasis and thrombosis testing are continuously decreasing. This is essentially attributable to modern instrumentation, improvements in test performance and reliability, as well as the application of appropriate internal quality control and external quality assurance measures. Pre-analytical issues are also being dealt with in some newer instrumentation, which is able to detect hemolysis, icterus and lipemia, and, in some cases, other issues related to sample collection such as tube under-filling. Post-analytical issues are generally related to appropriate reporting and interpretation of test results, and these are the focus of the current overview, which provides a brief description of these events, as well as guidance for their prevention or minimization. In particular, we propose several strategies for improved post-analytical reporting of hemostasis assays and advise that this may provide the final opportunity to prevent serious clinical errors in diagnosis.
Improving laboratory results turnaround time by reducing the pre-analytical phase.
Khalifa, Mohamed; Khalid, Parwaiz
2014-01-01
Laboratory turnaround time is considered one of the most important indicators of work efficiency in hospitals; physicians always need timely results to take effective clinical decisions, especially in the emergency department, where these results can guide physicians whether to admit patients to the hospital, discharge them home, or do further investigations. A retrospective data analysis study was performed to identify the effects of ER and lab staff training on new routines for sample collection and transportation on the pre-analytical phase of turnaround time. Renal profile tests requested by the ER and performed in 2013 were selected as a sample, and data about 7,519 tests were retrieved and analyzed to compare turnaround time intervals before and after implementing the new routines. Results showed significant time reduction in the "Request to Sample Collection" and "Collection to In Lab Delivery" time intervals, with less significant improvement in the analytical phase of the turnaround time.
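The turnaround-time intervals compared in such a study are simple differences between consecutive event timestamps in the total testing process. A sketch with hypothetical event names and times (the study's actual schema is not given):

```python
# Compute consecutive turnaround-time intervals, in minutes, from event
# timestamps: request -> collection -> in-lab delivery -> result.

from datetime import datetime

def tat_intervals(requested, collected, delivered, resulted):
    """Interval lengths in minutes between consecutive TTP events."""
    events = [requested, collected, delivered, resulted]
    return [
        (later - earlier).total_seconds() / 60.0
        for earlier, later in zip(events, events[1:])
    ]

ts = lambda s: datetime.strptime(s, "%H:%M")
mins = tat_intervals(ts("10:00"), ts("10:25"), ts("10:40"), ts("11:10"))
# -> [25.0, 15.0, 30.0]: request->collection, collection->delivery,
#    delivery->result
```

Comparing the distribution of the first two intervals before and after an intervention isolates the pre-analytical phase, as done in the study above.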
Competing on talent analytics.
Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy
2010-10-01
Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people-ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.
MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.
Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang
2005-01-01
MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.
Using Keystroke Analytics to Improve Pass-Fail Classifiers
ERIC Educational Resources Information Center
Casey, Kevin
2017-01-01
Learning analytics offers insights into student behaviour and the potential to detect poor performers before they fail exams. If the activity is primarily online (for example computer programming), a wealth of low-level data can be made available that allows unprecedented accuracy in predicting which students will pass or fail. In this paper, we…
Design and Implementation of a Learning Analytics Toolkit for Teachers
ERIC Educational Resources Information Center
Dyckhoff, Anna Lea; Zielke, Dennis; Bultmann, Mareike; Chatti, Mohamed Amine; Schroeder, Ulrik
2012-01-01
Learning Analytics can provide powerful tools for teachers in order to support them in the iterative process of improving the effectiveness of their courses and to collaterally enhance their students' performance. In this paper, we present the theoretical background, design, implementation, and evaluation details of eLAT, a Learning Analytics…
NASA Technical Reports Server (NTRS)
Sawdy, D. T.; Beckemeyer, R. J.; Patterson, J. D.
1976-01-01
Results are presented from detailed analytical studies made to define methods for obtaining improved multisegment lining performance by taking advantage of relative placement of each lining segment. Properly phased liner segments reflect and spatially redistribute the incident acoustic energy and thus provide additional attenuation. A mathematical model was developed for rectangular ducts with uniform mean flow. Segmented acoustic fields were represented by duct eigenfunction expansions, and mode-matching was used to ensure continuity of the total field. Parametric studies were performed to identify attenuation mechanisms and define preliminary liner configurations. An optimization procedure was used to determine optimum liner impedance values for a given total lining length, Mach number, and incident modal distribution. Optimal segmented liners are presented and it is shown that, provided the sound source is well-defined and flow environment is known, conventional infinite duct optimum attenuation rates can be improved. To confirm these results, an experimental program was conducted in a laboratory test facility. The measured data are presented in the form of analytical-experimental correlations. Excellent agreement between theory and experiment verifies and substantiates the analytical prediction techniques. The results indicate that phased liners may be of immediate benefit in the development of improved aircraft exhaust duct noise suppressors.
Sidelobe Suppression Mode Performance of ATCRBS with Various Antennas.
DOT National Transportation Integrated Search
1975-02-01
The SLS mode performance of terminal and enroute ATCRBS using existing and various improved antennas in the presence of perfectly dielectric flat ground is investigated theoretically. Necessary analytical expressions for various quantities character...
Asgharzadeh, Hafez; Borazjani, Iman
2017-02-15
The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for nonlinear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate, but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are among the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs), to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve the unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form a preconditioner for the matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against the Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation, depending on the grid (size) and the flow problem.
In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42-74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with higher time steps and in approximately 30% fewer iterations, even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one over the other depends on the flow problem. Furthermore, the implemented methods are fully parallelized, with parallel efficiency of 80-90% on the problems tested. The NKM with the analytical Jacobian can guide building preconditioners for other techniques to improve their performance in the future.
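The core contrast drawn in this abstract, Newton iteration with an analytical Jacobian versus fixed-point iteration, can be illustrated in one dimension. This is a toy sketch of the convergence behavior only, not the paper's 3-D Navier-Stokes solver; the test function and iteration maps are hypothetical:

```python
# Newton with an analytical derivative vs. a damped fixed-point iteration
# for f(x) = x^2 - 2 (root sqrt(2)). Newton converges quadratically; the
# fixed-point map converges linearly, so it needs far more iterations.

def newton(f, df, x, tol=1e-12, max_iter=100):
    """Newton's method with an analytical Jacobian (a scalar derivative here)."""
    for i in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x, i
        x -= fx / df(x)                  # x_{k+1} = x_k - f(x_k)/f'(x_k)
    return x, max_iter

def fixed_point(g, x, tol=1e-12, max_iter=10000):
    """Plain fixed-point iteration x_{k+1} = g(x_k)."""
    for i in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new, i
        x = x_new
    return x, max_iter

f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
root_n, it_n = newton(f, df, 1.0)

# A damped fixed-point map with the same root: g(x) = x - 0.1 * f(x).
g = lambda x: x - 0.1 * (x * x - 2.0)
root_f, it_f = fixed_point(g, 1.0)
```

Both iterations reach sqrt(2), but Newton does so in a handful of steps while the fixed-point map needs dozens, mirroring (in miniature) the iteration-count gap the paper reports.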
NASA Astrophysics Data System (ADS)
Asgharzadeh, Hafez; Borazjani, Iman
2017-02-01
The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for non-linear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are one of the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs) to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form a preconditioner for matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation depending on the grid (size) and the flow problem. 
In addition, using only the diagonal of the Jacobian further improves performance by 42-74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta method because it converged with larger time steps and in approximately 30% fewer iterations, even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one over the other depends on the flow problem. Furthermore, the implemented methods are fully parallelized, with parallel efficiency of 80-90% on the problems tested. The NKM with the analytical Jacobian can guide the construction of preconditioners for other techniques to improve their performance in the future.
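The core idea of a Newton method with a hand-derived (analytical) Jacobian can be illustrated on a small nonlinear system. This is a generic sketch, not the paper's Navier-Stokes solver; the example system and its root are hypothetical illustrations:

```python
import numpy as np

def F(x):
    # Example nonlinear system: F(x) = 0 at x = (sqrt(2), sqrt(2))
    return np.array([x[0]**2 - 2.0, x[0] * x[1] - 2.0])

def J(x):
    # Analytical Jacobian dF_i/dx_j, derived by hand rather than by
    # automatic differentiation or finite differences
    return np.array([[2.0 * x[0], 0.0],
                     [x[1],       x[0]]])

def newton(x0, tol=1e-12, maxit=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(maxit):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        # In a Newton-Krylov method this direct solve would be replaced
        # by a preconditioned Krylov iteration (e.g., GMRES) on J dx = -r
        x = x + np.linalg.solve(J(x), -r)
    return x

root = newton([1.0, 1.0])
```

In a full NKM the linear solve at each Newton step is handed to a Krylov method, and the analytical Jacobian (or its diagonal) serves as the preconditioner that the abstract reports speeds up the matrix-free variant.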
Performance implications from sizing a VM on multi-core systems: A data analytic application's view
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lim, Seung-Hwan; Horey, James L; Begoli, Edmon
In this paper, we present a quantitative performance analysis of data analytics applications running on multi-core virtual machines. Such environments form the core of cloud computing. In addition, data analytics applications, such as Cassandra and Hadoop, are becoming increasingly popular on cloud computing platforms. This convergence necessitates a better understanding of the performance and cost implications of such hybrid systems. For example, the very first step in hosting applications in virtualized environments requires the user to configure the number of virtual processors and the size of memory. To understand the performance implications of this step, we benchmarked three Yahoo Cloud Serving Benchmark (YCSB) workloads in a virtualized multi-core environment. Our measurements indicate that the performance of Cassandra for YCSB workloads does not heavily depend on the processing capacity of a system, while the size of the data set relative to allocated memory is critical to performance. We also identified a strong relationship between the running time of workloads and various hardware events (last-level cache loads, misses, and CPU migrations). From this analysis, we provide several suggestions to improve the performance of data analytics applications running on cloud computing environments.
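The kind of relationship reported between workload running time and hardware counter events can be quantified with a plain Pearson correlation. The numbers below are made-up stand-ins for perf-counter samples, not the paper's measurements:

```python
import numpy as np

# Hypothetical samples: per-run wall-clock time (s) and last-level-cache
# misses (millions), as might be collected with a tool such as perf
runtime_s  = np.array([102.0, 118.0, 131.0, 149.0, 163.0])
llc_misses = np.array([210.0, 250.0, 270.0, 325.0, 340.0])

# Pearson correlation coefficient; values near 1.0 indicate the strong
# positive relationship the study describes
r = np.corrcoef(runtime_s, llc_misses)[0, 1]
```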
Coates, James; Jeyaseelan, Asha K; Ybarra, Norma; David, Marc; Faria, Sergio; Souhami, Luis; Cury, Fabio; Duclos, Marie; El Naqa, Issam
2015-04-01
We explore analytical and data-driven approaches to investigate the integration of genetic variations (single nucleotide polymorphisms [SNPs] and copy number variations [CNVs]) with dosimetric and clinical variables in modeling radiation-induced rectal bleeding (RB) and erectile dysfunction (ED) in prostate cancer patients. Sixty-two patients who underwent curative hypofractionated radiotherapy (66 Gy in 22 fractions) between 2002 and 2010 were retrospectively genotyped for CNV and SNP rs5489 in the xrcc1 DNA repair gene. Fifty-four patients had full dosimetric profiles. Two parallel modeling approaches were compared to assess the risk of severe RB (Grade ≥3) and ED (Grade ≥1): maximum-likelihood-estimated generalized Lyman-Kutcher-Burman (LKB) modeling and logistic regression. Statistical resampling based on cross-validation was used to evaluate model predictive power and generalizability to unseen data. Integration of the biological variables xrcc1 CNV and SNP improved the fit of the RB and ED analytical and data-driven models. Cross-validation of the generalized LKB models yielded increases in classification performance of 27.4% for RB and 14.6% for ED when xrcc1 CNV and SNP were included, respectively. Biological variables added to logistic regression modeling improved classification performance over standard dosimetric models by 33.5% for RB and 21.2% for ED models. As a proof-of-concept, we demonstrated that the combination of genetic and dosimetric variables can provide significant improvement in NTCP prediction using analytical and data-driven approaches. The improvement in prediction performance was more pronounced in the data-driven approaches. Moreover, we have shown that CNVs, in addition to SNPs, may be useful structural genetic variants in predicting radiation toxicities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
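The generalized LKB model mentioned above has a standard closed form: a generalized equivalent uniform dose (gEUD) reduces the dose-volume histogram to a scalar, which a probit link maps to a complication probability. The sketch below implements the textbook version with illustrative parameters, not the values fitted in this study:

```python
import math

def geud(doses, volumes, n):
    """Generalized equivalent uniform dose for a DVH given as parallel
    lists of dose (Gy) and fractional volume; n is the volume-effect
    parameter (a = 1/n)."""
    a = 1.0 / n
    return sum(v * d**a for d, v in zip(doses, volumes)) ** (1.0 / a)

def lkb_ntcp(geud_gy, td50, m):
    """LKB normal-tissue complication probability: probit of
    t = (gEUD - TD50) / (m * TD50)."""
    t = (geud_gy - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
```

By construction, NTCP is exactly 0.5 when gEUD equals TD50, and the slope parameter m controls how steeply risk rises around that dose.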
Assessing data and modeling needs for urban transport : an Australian perspective
DOT National Transportation Integrated Search
2000-04-01
Managing the transport assets of an urban economy and ensuring that change is in accordance with suitable performance measures requires continuing improvement in analytical power and empirical information. One crucial input for improving planning and...
NASA Technical Reports Server (NTRS)
Hahne, David E.; Glaab, Louis J.
1999-01-01
An investigation was performed to evaluate leading- and trailing-edge flap deflections for optimal aerodynamic performance of a High-Speed Civil Transport concept during takeoff and approach-to-landing conditions. The configuration used for this study was designed by the Douglas Aircraft Company during the 1970's. A 0.1-scale model of this configuration was tested in the Langley 30- by 60-Foot Tunnel with both the original leading-edge flap system and a new leading-edge flap system, which was designed with modern computational flow analysis and optimization tools. Leading- and trailing-edge flap deflections were generated for the original and modified leading-edge flap systems with the computational flow analysis and optimization tools. Although wind tunnel data indicated improvements in aerodynamic performance for the analytically derived flap deflections for both leading-edge flap systems, perturbations of the analytically derived leading-edge flap deflections yielded significant additional improvements in aerodynamic performance. In addition to the aerodynamic performance optimization testing, stability and control data were also obtained. An evaluation of the crosswind landing capability of the aircraft configuration revealed that insufficient lateral control existed as a result of high levels of lateral stability. Deflection of the leading- and trailing-edge flaps improved the crosswind landing capability of the vehicle considerably; however, additional improvements are required.
Analytical investigation of thermal barrier coatings on advanced power generation gas turbines
NASA Technical Reports Server (NTRS)
Amos, D. J.
1977-01-01
An analytical investigation of present and advanced gas turbine power generation cycles incorporating thermal barrier turbine component coatings was performed. Approximately 50 parametric points considering simple, recuperated, and combined cycles (including gasification) with gas turbine inlet temperatures from current levels through 1644 K (2500°F) were evaluated. The results indicated that thermal barriers would be an attractive means to improve performance and reduce cost of electricity for these cycles. A recommended thermal barrier development program has been defined.
Dual nozzle aerodynamic and cooling analysis study
NASA Technical Reports Server (NTRS)
Meagher, G. M.
1981-01-01
Analytical models to predict performance and operating characteristics of dual nozzle concepts were developed and improved. Aerodynamic models are available to define flow characteristics and bleed requirements for both the dual throat and dual expander concepts. Advanced analytical techniques were utilized to provide quantitative estimates of the bleed flow, boundary layer, and shock effects within dual nozzle engines. Thermal analyses were performed to define cooling requirements for baseline configurations, and special studies of unique dual nozzle cooling problems defined feasible means of achieving adequate cooling.
Simple functionalization method for single conical pores with a polydopamine layer
NASA Astrophysics Data System (ADS)
Horiguchi, Yukichi; Goda, Tatsuro; Miyahara, Yuji
2018-04-01
Resistive pulse sensing (RPS) is an interesting analytical system in which micro- to nanosized pores are used to evaluate particles or small analytes. Recently, molecular immobilization techniques to improve the performance of RPS have been reported. The problem in functionalization for RPS is that molecular immobilization by chemical reaction is restricted by the pore material type. Herein, a simple functionalization is performed using mussel-inspired polydopamine as an intermediate layer to connect the pore material with functional molecules.
High-frequency phase shift measurement greatly enhances the sensitivity of QCM immunosensors.
March, Carmen; García, José V; Sánchez, Ángel; Arnau, Antonio; Jiménez, Yolanda; García, Pablo; Manclús, Juan J; Montoya, Ángel
2015-03-15
In spite of being widely used for in-liquid biosensing applications, sensitivity improvement of conventional (5-20 MHz) quartz crystal microbalance (QCM) sensors remains an unsolved, challenging task. With the help of a new electronic characterization approach based on phase change measurements at a constant fixed frequency, a highly sensitive and versatile high fundamental frequency (HFF) QCM immunosensor has successfully been developed and tested for its use in pesticide (carbaryl and thiabendazole) analysis. The analytical performance of several immunosensors was compared in competitive immunoassays taking carbaryl insecticide as the model analyte. The highest sensitivity was exhibited by the 100 MHz HFF-QCM carbaryl immunosensor. When results were compared with those reported for 9 MHz QCM, analytical parameters clearly showed an improvement of one order of magnitude for sensitivity (estimated as the I50 value) and two orders of magnitude for the limit of detection (LOD): 30 μg L(-1) vs 0.66 μg L(-1) for the I50 value, and 11 μg L(-1) vs 0.14 μg L(-1) for the LOD, for 9 and 100 MHz, respectively. For the fungicide thiabendazole, the I50 value was roughly the same as that previously reported for SPR under the same biochemical conditions, whereas the LOD improved by a factor of 2. The analytical performance achieved by high-frequency QCM immunosensors surpassed those of conventional QCM and SPR, closely approaching the most sensitive ELISAs. The developed 100 MHz QCM immunosensor strongly improves sensitivity in biosensing, and therefore can be considered a very promising new analytical tool for in-liquid applications where highly sensitive detection is required. Copyright © 2014 Elsevier B.V. All rights reserved.
The evaluation and enhancement of quality, environmental protection and seaport safety by using FAHP
NASA Astrophysics Data System (ADS)
Tadic, Danijela; Aleksic, Aleksandar; Popovic, Pavle; Arsovski, Slavko; Castelli, Ana; Joksimovic, Danijela; Stefanovic, Miladin
2017-02-01
The evaluation and enhancement of business processes in any organization operating in an uncertain environment is one of the main requirements of ISO 9001:2008 and has a key effect on competitive advantage and long-term sustainability. The aim of this paper is to identify and discuss some of the most important business processes of seaports, the performances of those processes, and their key performance indicators (KPIs). The complexity and importance of the treated problem call for analytic methods rather than intuitive decisions. The decision variables of the considered problem are described by linguistic expressions, which are modelled by triangular fuzzy numbers (TFNs). In this paper, a modified fuzzy extended analytic hierarchy process (FAHP) is proposed. The assessment of the relative importance of each pair of performances and their key performance indicators is stated as a fuzzy group decision-making problem. By using the modified fuzzy extended analytic hierarchy process, the fuzzy rank of business processes of a seaport is obtained. The model is tested through an illustrative example with real-life data, where the obtained results suggest measures which should enhance business strategy and improve key performance indicators. Future improvement is based on benchmarking and knowledge sharing.
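For readers unfamiliar with the machinery, the degree-of-possibility comparison at the heart of fuzzy extent analysis (Chang's method, on which extended FAHP variants build) can be sketched for triangular fuzzy numbers (l, m, u). This is the standard textbook formula, not the authors' specific modification:

```python
def possibility(M2, M1):
    """Degree of possibility V(M2 >= M1) for triangular fuzzy numbers
    given as (l, m, u) triples, per Chang's extent analysis."""
    l1, m1, u1 = M1
    l2, m2, u2 = M2
    if m2 >= m1:
        return 1.0          # M2's peak is at or beyond M1's peak
    if l1 >= u2:
        return 0.0          # supports do not overlap
    # Ordinate of the intersection of the two membership functions
    return (l1 - u2) / ((m2 - u2) - (m1 - l1))
```

In FAHP, these pairwise possibility degrees are aggregated (typically via a minimum over comparisons) to produce the crisp priority weights that rank the business processes.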
Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Y.; Keller, J.; Wallen, R.
2015-02-01
Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.
Analysis of serum angiotensin-converting enzyme.
Muller, B R
2002-09-01
Serum angiotensin-converting enzyme (SACE) levels are influenced by genetic polymorphism. Interpretation of serum levels against the appropriate genotypic reference range improves the diagnostic sensitivity of the assay for sarcoidosis. SACE assays are performed by a large number of routine clinical laboratories. However, there is no external quality assessment (EQA) for SACE other than an informal regional scheme. This scheme showed the analytical performance of SACE assays to be poor, with a diversity of reference ranges leading to widely disparate clinical classification of EQA samples. Genetic polymorphism combined with poor analytical performance suggests that SACE assays should perhaps revert to being the province of specialized laboratories.
NASA Astrophysics Data System (ADS)
Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan
2016-02-01
Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.
Performance Improvement Through Indexing of Turbine Airfoils. Part 2; Numerical Simulation
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Huber, Frank W.; Sharma, Om P.
1996-01-01
An experimental/analytical study has been conducted to determine the performance improvements achievable by circumferentially indexing succeeding rows of turbine stator airfoils. A series of tests was conducted to experimentally investigate stator wake clocking effects on the performance of the space shuttle main engine (SSME) alternate turbopump development (ATD) fuel turbine test article (TTA). The results from this study indicate that significant increases in stage efficiency can be attained through application of this airfoil clocking concept. Details of the experiment and its results are documented in part 1 of this paper. In order to gain insight into the mechanisms of the performance improvement, extensive computational fluid dynamics (CFD) simulations were executed. The subject of the present paper is the initial results from the CFD investigation of the configurations and conditions detailed in part 1 of the paper. To characterize the aerodynamic environments in the experimental test series, two-dimensional (2D), time-accurate, multistage, viscous analyses were performed at the TTA midspan. Computational analyses for five different circumferential positions of the first-stage stator have been completed. Details of the computational procedure and the results are presented. The analytical results verify the experimentally demonstrated performance improvement and are compared with data whenever possible. Predictions of time-averaged turbine efficiencies as well as gas conditions throughout the flow field are presented. An initial understanding of the turbine performance improvement mechanism based on the results from this investigation is described.
"Dip-and-read" paper-based analytical devices using distance-based detection with color screening.
Yamada, Kentaro; Citterio, Daniel; Henry, Charles S
2018-05-15
An improved paper-based analytical device (PAD) using color screening to enhance device performance is described. Current detection methods for PADs relying on the distance-based signalling motif can be slow due to the assay time being limited by capillary flow rates that wick fluid through the detection zone. For traditional distance-based detection motifs, analysis can take up to 45 min for a channel length of 5 cm. By using a color screening method, quantification with a distance-based PAD can be achieved in minutes through a "dip-and-read" approach. A colorimetric indicator line deposited onto a paper substrate using inkjet-printing undergoes a concentration-dependent colorimetric response for a given analyte. This color intensity-based response has been converted to a distance-based signal by overlaying a color filter with a continuous color intensity gradient matching the color of the developed indicator line. As a proof-of-concept, Ni quantification in welding fume was performed as a model assay. The results of multiple independent user testing gave mean absolute percentage error and average relative standard deviations of 10.5% and 11.2% respectively, which were an improvement over analysis based on simple visual color comparison with a read guide (12.2%, 14.9%). In addition to the analytical performance comparison, an interference study and a shelf life investigation were performed to further demonstrate practical utility. The developed system demonstrates an alternative detection approach for distance-based PADs enabling fast (∼10 min), quantitative, and straightforward assays.
Aarsand, Aasne K; Villanger, Jørild H; Støle, Egil; Deybach, Jean-Charles; Marsden, Joanne; To-Figueras, Jordi; Badminton, Mike; Elder, George H; Sandberg, Sverre
2011-11-01
The porphyrias are a group of rare metabolic disorders whose diagnosis depends on identification of specific patterns of porphyrin precursor and porphyrin accumulation in urine, blood, and feces. Diagnostic tests for porphyria are performed by specialized laboratories in many countries. Data regarding the analytical and diagnostic performance of these laboratories are scarce. We distributed 5 sets of multispecimen samples from different porphyria patients, accompanied by clinical case histories, to 18-21 European specialist porphyria laboratories/centers as part of an external analytical and postanalytical quality assessment (EQA) program organized by the European Porphyria Network. The laboratories stated which analyses they would normally have performed given the case histories and reported the results of all porphyria-related analyses available, interpretative comments, and diagnoses. Reported diagnostic strategies initially showed considerable diversity, but the number of laboratories applying adequate diagnostic strategies increased during the study period. We found an average interlaboratory CV of 50% (range 12%-152%) for analytes in absolute concentrations. Normalizing results as ratios to the upper reference limits did not reduce this variation. Sixty-five percent of reported results were within biological variation-based analytical quality specifications. Clinical interpretation of the obtained analytical results was accurate, and most laboratories established the correct diagnosis in all distributions. Based on this case-based EQA scheme, variations were apparent in analytical and diagnostic performance between European specialist porphyria laboratories. Our findings reinforce the use of EQA schemes as an essential tool to assess both analytical and diagnostic processes and thereby to improve patient care in rare diseases.
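The interlaboratory CV and the reference-limit normalization described above are simple to reproduce. The values here are invented illustrations, not the study's data; with a shared upper reference limit the ratios are a pure rescaling, so the CV is unchanged, consistent with the study's finding that normalization need not reduce variation:

```python
import statistics

def cv_percent(values):
    """Interlaboratory coefficient of variation, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical porphyrin results (nmol/L) reported by different labs
results = [120.0, 95.0, 180.0, 60.0, 145.0]
cv_raw = cv_percent(results)

# Normalizing each result to an upper reference limit (URL); if the labs'
# URLs are similar, the dispersion carries over to the ratios
urls = [100.0, 100.0, 100.0, 100.0, 100.0]
ratios = [r / u for r, u in zip(results, urls)]
cv_ratio = cv_percent(ratios)
```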
Ayal, Shahar; Rusou, Zohar; Zakay, Dan; Hochman, Guy
2015-01-01
A framework is presented to better characterize the role of individual differences in information processing style and their interplay with contextual factors in determining decision making quality. In Experiment 1, we show that individual differences in information processing style are flexible and can be modified by situational factors. Specifically, a situational manipulation that induced an analytical mode of thought improved decision quality. In Experiment 2, we show that this improvement in decision quality is highly contingent on the compatibility between the dominant thinking mode and the nature of the task. That is, encouraging an intuitive mode of thought led to better performance on an intuitive task but hampered performance on an analytical task. The reverse pattern was obtained when an analytical mode of thought was encouraged. We discuss the implications of these results for the assessment of decision making competence, and suggest practical directions to help individuals better adjust their information processing style to the situation at hand and make optimal decisions. PMID:26284011
Ayal, Shahar; Rusou, Zohar; Zakay, Dan; Hochman, Guy
2015-01-01
A framework is presented to better characterize the role of individual differences in information processing style and their interplay with contextual factors in determining decision making quality. In Experiment 1, we show that individual differences in information processing style are flexible and can be modified by situational factors. Specifically, a situational manipulation that induced an analytical mode of thought improved decision quality. In Experiment 2, we show that this improvement in decision quality is highly contingent on the compatibility between the dominant thinking mode and the nature of the task. That is, encouraging an intuitive mode of thought led to better performance on an intuitive task but hampered performance on an analytical task. The reverse pattern was obtained when an analytical mode of thought was encouraged. We discuss the implications of these results for the assessment of decision making competence, and suggest practical directions to help individuals better adjust their information processing style to the situation at hand and make optimal decisions.
Glucose Biosensors: An Overview of Use in Clinical Practice
Yoo, Eun-Hyung; Lee, Soo-Youn
2010-01-01
Blood glucose monitoring has been established as a valuable tool in the management of diabetes. Since maintaining normal blood glucose levels is recommended, a series of suitable glucose biosensors have been developed. During the last 50 years, glucose biosensor technology, including point-of-care devices, continuous glucose monitoring systems, and noninvasive glucose monitoring systems, has been significantly improved. However, several challenges remain in achieving accurate and reliable glucose monitoring. Further technical improvements in glucose biosensors, standardization of analytical goals for their performance, and continuous assessment and training of lay users are required. This article reviews the brief history, basic principles, analytical performance, and present status of glucose biosensors in clinical practice. PMID:22399892
Improving Student Performance Using Nudge Analytics
ERIC Educational Resources Information Center
Feild, Jacqueline
2015-01-01
Providing students with continuous and personalized feedback on their performance is an important part of encouraging self regulated learning. As part of our higher education platform, we built a set of data visualizations to provide feedback to students on their assignment performance. These visualizations give students information about how they…
Murray, Ian; Walker, Glenn; Bereman, Michael S
2016-06-20
Two paper-based microfluidic techniques, photolithography and wax patterning, were investigated for their potential to improve upon the sensitivity, reproducibility, and versatility of paper spray mass spectrometry. The main limitation of photolithography was the significant signal (approximately three orders of magnitude) above background, which was attributed to the chemicals used in the photoresist process. Hydrophobic barriers created via wax patterning were found to have approximately two orders of magnitude less background signal than analogous barriers created using photolithography. A minimum printed wax barrier thickness of approximately 0.3 mm was necessary to consistently retain commonly used paper spray solvents (1:1 water:acetonitrile/methanol) and avoid leakage. Constricting capillary flow via wax-printed channels yielded a significant increase in both signal and detection time for model analytes. This signal increase, attributed to restricting the radial flow of analyte/solvent on paper (i.e., a concentrating effect), afforded a significant increase in sensitivity (p ≪ 0.05) for the detection of pesticides spiked into residential tap water using a five-point calibration curve. Finally, unique mixing designs using wax patterning can be envisioned to perform on-paper analyte derivatization.
Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J
2016-02-01
Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performances, to highlight the main sources of unsatisfactory analytical results, and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3.0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were either pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation), or postanalytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post to highlight analytical problems, which would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. This study draws the attention of the laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, for educational purposes. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in laboratory testing results. © 2015 The Society for Applied Microbiology.
Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry
2017-08-01
Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.
Analytical and Clinical Performance of Blood Glucose Monitors
Boren, Suzanne Austin; Clarke, William L.
2010-01-01
Background: The objective of this study was to understand the level of performance of blood glucose monitors as assessed in the published literature. Methods: Medline from January 2000 to October 2009 and reference lists of included articles were searched to identify eligible studies. Key information was abstracted from eligible studies: blood glucose meters tested, blood sample, meter operators, setting, sample of people (number, diabetes type, age, sex, and race), duration of diabetes, years using a glucose meter, insulin use, recommendations followed, performance evaluation measures, and specific factors affecting the accuracy evaluation of blood glucose monitors. Results: Thirty-one articles were included in this review. Articles were categorized as review articles of blood glucose accuracy (6 articles), original studies that reported the performance of blood glucose meters in laboratory settings (14 articles) or clinical settings (9 articles), and simulation studies (2 articles). A variety of performance evaluation measures were used in the studies. The authors did not identify any studies that demonstrated a difference in clinical outcomes. Examples of analytical tools used in the description of accuracy (e.g., correlation coefficient, linear regression equations, and International Organization for Standardization standards) and how these traditional measures can complicate the achievement of target blood glucose levels for the patient were presented. The benefits of using error grid analysis to quantify the clinical accuracy of patient-determined blood glucose values were discussed. Conclusions: When examining blood glucose monitor performance in the real world, it is important to consider whether an improvement in analytical accuracy would lead to improved clinical outcomes for patients. There are several examples of how analytical tools used in the description of self-monitoring of blood glucose accuracy could be irrelevant to treatment decisions. PMID:20167171
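One common analytical benchmark of the kind this review discusses is the ISO 15197:2013 system-accuracy criterion. The sketch below encodes our reading of its per-sample limits; it is an illustration, not a formula taken from the article:

```python
def within_iso15197(reference_mgdl, meter_mgdl):
    """Per-sample accuracy limit in the spirit of ISO 15197:2013:
    within +/-15 mg/dL of the reference below 100 mg/dL, and within
    +/-15% at or above 100 mg/dL. (The standard additionally requires
    that at least 95% of samples meet this limit.)"""
    if reference_mgdl < 100.0:
        return abs(meter_mgdl - reference_mgdl) <= 15.0
    return abs(meter_mgdl - reference_mgdl) <= 0.15 * reference_mgdl
```

As the review notes, such purely analytical limits say nothing about clinical impact, which is why error grid analysis, classifying each reading by its treatment consequence, is often preferred.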
Analytical sensitivity of current best-in-class malaria rapid diagnostic tests.
Jimenez, Alfons; Rees-Channer, Roxanne R; Perera, Rushini; Gamboa, Dionicia; Chiodini, Peter L; González, Iveth J; Mayor, Alfredo; Ding, Xavier C
2017-03-24
Rapid diagnostic tests (RDTs) are today the most widely used method for malaria diagnosis and are recommended, alongside microscopy, for the confirmation of suspected cases before the administration of anti-malarial treatment. The diagnostic performance of RDTs, as compared to microscopy or PCR, is well described, but the actual analytical sensitivity of current best-in-class tests is poorly documented. This value is, however, a key performance indicator and a benchmark needed to develop new RDTs of improved sensitivity. Thirteen RDTs detecting either the Plasmodium falciparum histidine-rich protein 2 (HRP2) or the plasmodial lactate dehydrogenase (pLDH) antigen were selected from the best-performing RDTs according to the WHO-FIND product testing programme. The analytical sensitivity of these products was evaluated using a range of reference materials, including P. falciparum and Plasmodium vivax whole-parasite samples as well as recombinant proteins. The best-performing HRP2-based RDTs could detect all P. falciparum cultured samples at concentrations as low as 0.8 ng/mL of HRP2. The limit of detection of the best-performing pLDH-based RDT specifically detecting P. vivax was 25 ng/mL of pLDH. The analytical sensitivity of P. vivax and Pan pLDH-based RDTs appears to vary considerably from product to product, and improvement of the limit of detection of P. vivax-detecting RDTs is needed to match the performance of HRP2- and Pf pLDH-based RDTs for P. falciparum. Different assays using different reference materials produce different values for antigen concentration in a given specimen, highlighting the need to establish universal reference assays.
NASA Technical Reports Server (NTRS)
Everett, L.
1992-01-01
This report documents the performance characteristics of a Targeting Reflective Alignment Concept (TRAC) sensor. The performance is documented for both short and long ranges. For long ranges, the sensor is used without the flat mirror attached to the target. To better understand the capabilities of TRAC-based sensors, an engineering model is required. The model can be used to better design the system for a particular application. This is necessary because there are many interrelated design variables in an application, including lens parameters, the camera, and the target configuration. The report presents first an analytical development of the performance and second an experimental verification of the equations. The analytical presentation assumes that the best vision resolution is a single pixel element. The experimental results suggest, however, that the resolution is better than one pixel; hence, the analytical results should be considered worst-case conditions. The report also discusses advantages and limitations of the TRAC sensor in light of the performance estimates. Finally, the report discusses potential improvements.
Performance criteria and quality indicators for the post-analytical phase.
Sciacovelli, Laura; Aita, Ada; Padoan, Andrea; Pelloso, Michela; Antonelli, Giorgia; Piva, Elisa; Chiozza, Maria Laura; Plebani, Mario
2016-07-01
Quality indicators (QIs) used as performance measurements are an effective tool for accurately estimating quality, identifying problems that may need to be addressed, and monitoring processes over time. In Laboratory Medicine, QIs should cover all steps of the testing process, as error studies have confirmed that most errors occur in the pre- and post-analytical phases of testing. The aim of the present study is to provide preliminary results on QIs and related performance criteria in the post-analytical phase. This work was conducted according to a previously described study design based on the voluntary participation of clinical laboratories in the project on QIs of the Working Group "Laboratory Errors and Patient Safety" (WG-LEPS) of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC). Overall, the data collected highlighted an improvement or stability in performance over time for all reported indicators, demonstrating that the use of QIs is effective in a quality improvement strategy. Moreover, QI data are an important source for defining the state of the art concerning the error rate in the total testing process. The definition of performance specifications based on the state of the art, as suggested by consensus documents, is a valuable benchmark in evaluating the performance of each laboratory. Laboratory tests play a relevant role in monitoring and evaluating patient outcomes, thus assisting clinicians in decision-making. Laboratory performance evaluation is therefore crucial to providing patients with safe, effective and efficient care.
Sánchez-Margalet, Víctor; Rodriguez-Oliva, Manuel; Sánchez-Pozo, Cristina; Fernández-Gallardo, María Francisca; Goberna, Raimundo
2005-01-01
Portable meters for blood glucose concentrations are used at the patient's bedside, as well as by patients for self-monitoring of blood glucose. Even though most devices have important technological advances that decrease operator error, the analytical goals proposed for the performance of glucose meters have recently been changed by the American Diabetes Association (ADA) to <5% analytical error and <7.9% total error. We studied 80 meters throughout the Virgen Macarena Hospital and found that most devices had a performance error higher than 10%. The aim of the present study was to establish a new system to control portable glucose meters, together with an educational program for nurses, in a 1200-bed university hospital, in order to achieve the recommended analytical goals and thereby improve the quality of diabetes care. We used portable glucose meters connected on-line to the laboratory after an educational program for nurses with responsibilities in point-of-care testing. We evaluated the system by assessing the total error of the glucometers using high- and low-level glucose control solutions. Over a period of 6 months, we collected data from 5642 control samples obtained by 14 devices (Precision PCx) directly from the control program (QC manager). The average total error for the low-level glucose control (2.77 mmol/l) was 6.3% (range 5.5-7.6%), and even lower for the high-level glucose control (16.66 mmol/l), at 4.8% (range 4.1-6.5%). In conclusion, the performance of the glucose meters used in our university hospital of more than 1000 beds not only improved after the intervention; the meters also achieved the analytical goals of the suggested ADA/National Academy of Clinical Biochemistry criteria for total error (<7.9% in the range 2.77-16.66 mmol/l glucose) and the optimal total error for high glucose concentrations of <5%, which will improve the quality of care of our patients.
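Total error of the kind reported above can be estimated from repeated control measurements. A hedged sketch using the common TE = |bias%| + 1.96·CV% convention (the abstract does not state which multiplier was used; 1.65 is another frequent choice):

```python
import statistics

def total_error_percent(qc_results, target):
    """Estimate total analytical error (in %) from repeated QC
    measurements against an assigned target, using the common
    TE = |bias%| + 1.96 * CV% convention."""
    mean = statistics.fmean(qc_results)
    bias_pct = abs(mean - target) / target * 100.0
    cv_pct = statistics.stdev(qc_results) / mean * 100.0
    return bias_pct + 1.96 * cv_pct

# Illustrative low-level control (target 2.77 mmol/l); synthetic data.
te = total_error_percent([2.8, 2.7, 2.8, 2.8, 2.75], 2.77)
```

A meter meeting the ADA goal cited in the abstract would show a TE below 7.9% across the measuring range.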
Rao, Shalinee; Masilamani, Suresh; Sundaram, Sandhya; Duvuru, Prathiba; Swaminathan, Rajendiran
2016-01-01
Quality monitoring in a histopathology unit is categorized into three phases (pre-analytical, analytical and post-analytical) to cover the various steps in the entire test cycle. A review of the literature on quality evaluation studies pertaining to histopathology revealed that earlier reports mainly focused on analytical aspects, with limited studies assessing the pre-analytical phase. The pre-analytical phase encompasses several processing steps and the handling of specimens/samples by multiple individuals, thus allowing ample scope for errors. Because of its critical nature and the limited studies in the past assessing quality in the pre-analytical phase, it deserves more attention. This study was undertaken to analyse and assess the quality parameters of the pre-analytical phase in a histopathology laboratory. This was a retrospective study of pre-analytical parameters in the histopathology laboratory of a tertiary care centre, based on 18,626 tissue specimens received over 34 months. Registers and records were checked for efficiency and errors in the pre-analytical quality variables: specimen identification, specimens in appropriate fixatives, lost specimens, daily internal quality control performance on staining, performance in an inter-laboratory quality assessment program (External Quality Assurance Scheme, EQAS) and evaluation of internal non-conformities (NCs) for other errors. The study revealed incorrect specimen labelling in 0.04%, 0.01% and 0.01% of specimens in 2007, 2008 and 2009 respectively. About 0.04%, 0.07% and 0.18% of specimens were not sent in fixatives in 2007, 2008 and 2009 respectively. There was no incidence of lost specimens. A total of 113 non-conformities were identified, of which 92.9% belonged to the pre-analytical phase. The predominant NC (any deviation from the normal standard which may generate an error and result in compromised quality standards) identified was wrong labelling of slides. Performance in the EQAS for the pre-analytical phase was satisfactory in 6 of 9 cycles.
A low incidence of errors in pre-analytical phase implies that a satisfactory level of quality standards was being practiced with still scope for improvement.
Study on application of aerospace technology to improve surgical implants
NASA Technical Reports Server (NTRS)
Johnson, R. E.; Youngblood, J. L.
1982-01-01
The areas where aerospace technology could be used to improve the reliability and performance of metallic orthopedic implants were assessed. Specifically, comparisons were made between the material controls, design approaches, analytical methods, and inspection approaches used in the implant industry and those used for aerospace hardware. Several areas for possible improvement were noted, such as increased use of finite element stress analysis and fracture control programs on devices where maximum reliability and high structural performance are needed.
Use of multiple colorimetric indicators for paper-based microfluidic devices.
Dungchai, Wijitar; Chailapakul, Orawon; Henry, Charles S
2010-08-03
We report here the use of multiple indicators for a single analyte in paper-based microfluidic devices (microPADs) in an effort to improve the ability to visually discriminate between analyte concentrations. In existing microPADs, a single dye system is used for the measurement of a single analyte. In our approach, devices are designed to simultaneously quantify analytes using multiple indicators for each analyte, improving the accuracy of the assay. The use of multiple indicators for a single analyte allows different indicator colors to be generated in different analyte concentration ranges and increases the ability to visually discriminate colors. The principle of our devices is based on the oxidation of indicators by hydrogen peroxide produced by oxidase enzymes specific for each analyte. Each indicator reacts at a different peroxide concentration, and therefore analyte concentration, giving an extended range of operation. To demonstrate the utility of our approach, a mixture of 4-aminoantipyrine and 3,5-dichloro-2-hydroxybenzenesulfonic acid, o-dianisidine dihydrochloride, potassium iodide, acid black, and acid yellow was chosen as the indicator set for simultaneous semi-quantitative measurement of glucose, lactate, and uric acid on a microPAD. Our approach was successfully applied to quantify glucose (0.5-20 mM), lactate (1-25 mM), and uric acid (0.1-7 mM) in clinically relevant ranges. The determination of glucose, lactate, and uric acid in control serum and urine samples was also performed to demonstrate the applicability of this device for biological sample analysis. Finally, results for the multi-indicator and single-indicator systems were compared using untrained readers to demonstrate the improvements in accuracy achieved with the new system. Copyright © 2010 Elsevier B.V. All rights reserved.
Numerical convergence improvements for PORFLOW unsaturated flow simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, Greg
2017-08-14
Section 3.6 of SRNL (2016) discusses various PORFLOW code improvements to increase modeling efficiency, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision. This memorandum documents interaction with Analytic & Computational Research, Inc. (http://www.acricfd.com/default.htm) to improve numerical convergence efficiency using PORFLOW version 6.42 for unsaturated flow simulations.
High performance cryogenic turboexpanders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agahi, R.R.; Ershaghi, B.; Lin, M.C.
1996-12-31
The use of turboexpanders at deep cryogenic temperatures has been constrained by thermal efficiency limitations, which were mostly due to mechanical constraints. Recent improvements in analytical techniques, bearing technology, and design features have made it possible to design and operate turboexpanders under more favorable conditions, such as higher rotational speeds. Several turboexpander installations in helium and hydrogen processes have shown a significant improvement in plant performance over non-turboexpander options.
A genetic algorithm-based job scheduling model for big data analytics.
Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei
Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework; it implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. Existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes high energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
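As a rough illustration of the general approach, a genetic algorithm over job orderings can be sketched as below. This is a toy model with identical clusters and greedy dispatch; the paper's actual chromosome encoding, fitness function, and cluster performance-estimation module are not specified here, so everything below is illustrative:

```python
import random

def makespan(order, durations, n_clusters):
    """Greedy list scheduling: each job, in the given order, is
    dispatched to the cluster that frees up earliest; the makespan
    is the overall finish time."""
    free = [0.0] * n_clusters
    for j in order:
        k = free.index(min(free))
        free[k] += durations[j]
    return max(free)

def evolve(durations, n_clusters, pop=30, gens=60, seed=0):
    """Toy GA over job orderings: permutation encoding, tournament
    selection, order crossover, swap mutation, with elitism."""
    rng = random.Random(seed)
    n = len(durations)

    def fitness(ind):
        return makespan(ind, durations, n_clusters)

    population = [rng.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        nxt = [min(population, key=fitness)]        # keep the best as-is
        while len(nxt) < pop:
            a, b = (min(rng.sample(population, 3), key=fitness)
                    for _ in range(2))              # tournament selection
            cut = rng.randrange(1, n)               # order crossover
            child = a[:cut] + [j for j in b if j not in a[:cut]]
            if rng.random() < 0.3:                  # swap mutation
                i, k = rng.randrange(n), rng.randrange(n)
                child[i], child[k] = child[k], child[i]
            nxt.append(child)
        population = nxt
    best = min(population, key=fitness)
    return best, fitness(best)
```

In a real scheduler the `makespan` surrogate would be replaced by predictions from an estimation module such as the one the paper describes.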
Solid Lubrication Fundamentals and Applications. Chapter 2
NASA Technical Reports Server (NTRS)
Miyoshi, Kazuhisa
1998-01-01
This chapter describes powerful analytical techniques capable of sampling tribological surfaces and solid-film lubricants. Some of these techniques may also be used to determine the locus of failure in a bonded structure or coated substrate; such information is important when seeking improved adhesion between a solid-film lubricant and a substrate and when seeking improved performance and long life expectancy of solid lubricants. Many examples are given here and throughout the book on the nature and character of solid surfaces and their significance in lubrication, friction, and wear. The analytical techniques used include the latest spectroscopic methods.
Huang, Yao-Hung; Chang, Jeng-Shian; Chao, Sheng D.; Wu, Kuang-Chong; Huang, Long-Sun
2014-01-01
A quartz crystal microbalance (QCM) serving as a biosensor to detect target biomolecules (analytes) often suffers from a time-consuming detection process, especially in the case of diffusion-limited reactions. In this experimental work, we modify the reaction chamber of a conventional QCM by integrating multiple microelectrodes to produce an electrothermal vortex flow that efficiently drives the analytes toward the sensor surface, where they are captured by the immobilized ligands. The microelectrodes are placed on the top surface of the chamber, opposite the sensor, which is located on the bottom of the chamber. In addition, the height of the reaction chamber is reduced to ensure that analytes suspended in the fluid can be effectively driven to the sensor surface by the induced electrothermal vortex flow, and to reduce sample consumption. A series of frequency-shift measurements, associated with the mass added by the specific binding between the analytes in the fluid flow and the immobilized ligands on the QCM sensor surface, are performed with and without applying the electrothermal effect (ETE). The experimental results show that the electrothermal vortex flow effectively accelerates the specific binding and makes the frequency-shift measurement more sensitive. In addition, images of the binding surfaces of the sensors with and without the electrothermal effect are taken by scanning electron microscopy. Comparison of the images also clearly indicates that the ETE increases the specific binding of analytes and ligands and efficiently improves the performance of the QCM sensor. PMID:25538808
Rizk, Mostafa M; Zaki, Adel; Hossam, Nermine; Aboul-Ela, Yasmin
2014-12-01
The performance of clinical laboratories plays a fundamental role in the quality and effectiveness of healthcare. The aim was to evaluate laboratory performance in the Alexandria University Hospital Clinical Laboratories using key quality indicators, and to compare performance before and after an improvement plan based on ISO 15189 standards. The study was carried out on inpatient samples over a period of 7 months divided into three phases: phase I, data collection for evaluation of the existing process before improvement (March-May 2012); an intermediate phase, which included corrective and preventive action, quality initiatives, and steps for improvement (June 2012); and phase II, data collection for evaluation of the process after improvement (July 2012-September 2012). In terms of the preanalytical indicators, in phase I the total number of received requests was 31,944, with 33.66% of request forms defective; in phase II there was a significant reduction in all defective request items (P<0.001), with 9.64% of requests defective. As for the analytical indicators, the proficiency testing accuracy score in phase I showed poor performance for 10 analytes, in which the total error (TE) exceeded the allowable total error (TEa), with a corresponding sigma value of less than 3, indicating test problems and an unreliable method. The remaining analytes showed acceptable performance, in which TE did not exceed TEa, with a sigma value of more than 6. Following a 3-month intervention, performance showed marked improvement. Error tracking in phase I showed a TE of 5.11%, whereas in phase II it was reduced to 2.48% (P<0.001). For the postanalytical indicators, our results in phase I showed that the percentage of nonreported critical results was 26.07%. In phase II, there was a significant improvement (P<0.001).
In phase II the percentage of nonreported critical results fell to 11.37%; the reasons were inability to contact the authorized doctor (8.24%), wrong patient identification (1.0%), lack of reporting by the lab doctor (1.11%), and lack of reporting by the lab technician (1.03%). Standardization and monitoring of each step in the total testing process is very important and is associated with the most efficient and well-organized laboratories.
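The sigma values cited in this and related studies follow a standard definition, as does the quality goal index used to attribute poor performance to imprecision or inaccuracy. A minimal sketch (TEa, bias, and CV all in percent; the 0.8/1.2 QGI cutoffs are the conventional interpretation thresholds):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa - |bias|) / CV, all expressed in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI = |bias| / (1.5 * CV): values < 0.8 point to imprecision,
    > 1.2 to inaccuracy, and values in between to both."""
    return abs(bias_pct) / (1.5 * cv_pct)
```

For example, an analyte with TEa = 10%, bias = 2%, and CV = 2% sits at 4 sigma, below the 6-sigma "world-class" benchmark but above the 3-sigma floor mentioned in the abstract.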
Assessing Thinking Skills in Astro 101: Do We Make an Impact?
NASA Astrophysics Data System (ADS)
Bruning, D.
2005-12-01
Most instructors agree that a major goal of "Astronomy 101" is to develop thinking skills in our students (Partridge and Greenstein, AER 2, 46, 2003). Much educational research in astronomy has initially concentrated on "best practices" for improving student learning (development of "think-pair-share", lecture tutorials, peer tutoring, etc.). Little has been done to date to assess our efforts to improve student thinking skills and students' desire to think more deeply about the cognitively rich ideas offered in the typical astronomy class. This study surveys several astronomy and physics courses to determine whether general analytical thinking skills increase because of the science course and whether students' attitudes toward cognition improve. Cacioppo, Petty, and Kao's "Need for Cognition" scale is used for the latter assessment (J. Personality Assessment 48, 306, 1984). A shortened version of Whimbey and Lochhead's ASI skills instrument is used to assess analytical skills ("Problem Solving and Comprehension," 1986). Preliminary results suggest that students' need for cognition does not change in general, although there may be a correlation between an increasing need for cognition and improvement in grades through the semester. There is a suggestion that need for cognition is slightly predictive of course performance, but a greater correlation exists between the post-course survey and grades. Gains in general analytical skills have been seen in initial surveys, but correlations with course performance appear elusive.
Prentice, Boone M; Chumbley, Chad W; Hachey, Brian C; Norris, Jeremy L; Caprioli, Richard M
2016-10-04
Quantitative matrix-assisted laser desorption/ionization time-of-flight (MALDI TOF) approaches have historically suffered from poor accuracy and precision mainly due to the nonuniform distribution of matrix and analyte across the target surface, matrix interferences, and ionization suppression. Tandem mass spectrometry (MS/MS) can be used to ensure chemical specificity as well as improve signal-to-noise ratios by eliminating interferences from chemical noise, alleviating some concerns about dynamic range. However, conventional MALDI TOF/TOF modalities typically only scan for a single MS/MS event per laser shot, and multiplex assays require sequential analyses. We describe here new methodology that allows for multiple TOF/TOF fragmentation events to be performed in a single laser shot. This technology allows the reference of analyte intensity to that of the internal standard in each laser shot, even when the analyte and internal standard are quite disparate in m/z, thereby improving quantification while maintaining chemical specificity and duty cycle. In the quantitative analysis of the drug enalapril in pooled human plasma with ramipril as an internal standard, a greater than 4-fold improvement in relative standard deviation (<10%) was observed as well as improved coefficients of determination (R²) and accuracy (>85% quality controls). Using this approach we have also performed simultaneous quantitative analysis of three drugs (promethazine, enalapril, and verapamil) using deuterated analogues of these drugs as internal standards.
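The per-shot internal-standard referencing described above amounts to taking the analyte/IS intensity ratio shot by shot, which cancels multiplicative shot-to-shot variation from matrix inhomogeneity and laser coupling. A schematic illustration with synthetic numbers (not the paper's data):

```python
import statistics

def normalized_response(analyte_counts, istd_counts):
    """Per-shot analyte/internal-standard intensity ratios; averaging
    ratios rather than raw intensities cancels any multiplicative
    factor common to both species in a given laser shot."""
    return [a / s for a, s in zip(analyte_counts, istd_counts)]

def rsd_percent(values):
    """Relative standard deviation in percent."""
    return statistics.stdev(values) / statistics.fmean(values) * 100.0

# Synthetic shots: the same multiplicative factor hits both channels.
shot_factors = [1.0, 0.5, 2.0, 0.25]
analyte = [100 * f for f in shot_factors]
istd = [50 * f for f in shot_factors]
ratios = normalized_response(analyte, istd)
```

The raw analyte intensities have a large RSD, while the ratios are constant, which is the mechanism behind the >4-fold RSD improvement the authors report.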
Mischak, Harald; Vlahou, Antonia; Ioannidis, John P A
2013-04-01
Mass spectrometry platforms have attracted considerable interest over the past two decades as profiling tools for native peptides and proteins with clinical potential. However, limitations associated with reproducibility and analytical robustness, especially pronounced with the initial SELDI systems, hindered the application of such platforms in biomarker qualification and clinical implementation. The scope of this article is to give a short overview of the available data on the performance and analytical robustness of the different platforms for peptide profiling. Using the CE-MS platform as a paradigm, data on analytical performance are described, including reproducibility (short-term and intermediate repeatability), stability, interference, quantification capabilities (limits of detection), and inter-laboratory variability. We discuss these issues using as an example our experience with the development of a 273-peptide marker for chronic kidney disease. Finally, we discuss pros and cons and means for improvement, and emphasize the need to test, in terms of comparative clinical performance and impact, the different platforms that pass analytical validation tests reasonably well. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high-speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high-speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions.
The multidisciplinary design optimization procedures for high-speed wing-body configurations simultaneously improve the aerodynamic, sonic boom, and structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load-carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin-wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
Computational Methodology for Absolute Calibration Curves for Microfluidic Optical Analyses
Chang, Chia-Pin; Nagel, David J.; Zaghloul, Mona E.
2010-01-01
Optical fluorescence and absorption are two of the primary techniques used for analytical microfluidics. We provide a thorough yet tractable method for computing the performance of diverse optical micro-analytical systems. Sample sizes range from nano- to many micro-liters and concentrations from nano- to milli-molar. Equations are provided to trace quantitatively the flow of the fundamental entities, namely photons and electrons, and the conversion of energy from the source, through optical components, samples and spectral-selective components, to the detectors and beyond. The equations permit facile computations of calibration curves that relate the concentrations or numbers of molecules measured to the absolute signals from the system. This methodology provides the basis for both detailed understanding and improved design of microfluidic optical analytical systems. It saves prototype turn-around time, and is much simpler and faster to use than ray tracing programs. Over two thousand spreadsheet computations were performed during this study. We found that some design variations produce higher signal levels and, for constant noise levels, lower minimum detection limits. Improvements of more than a factor of 1,000 were realized. PMID:22163573
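The photon/electron bookkeeping the authors describe can be caricatured as a chain of multiplicative efficiencies from source to detector. A deliberately simplified, single-pass, dilute-sample sketch (the parameter names and the linear model are our assumptions, not the paper's full equation set):

```python
AVOGADRO = 6.022e23

def detected_photons(conc_molar, volume_liters, excitation_flux,
                     capture_cross_section, quantum_yield,
                     collection_eff, filter_transmission, detector_qe):
    """Trace a fluorescence photon budget: emitted photons scale with
    the number of molecules in the probed volume, and each optical
    stage (collection, spectral filtering, detection) multiplies the
    count by its efficiency. Flux in photons/cm^2, cross section in
    cm^2; all efficiencies are dimensionless fractions."""
    n_molecules = conc_molar * volume_liters * AVOGADRO
    emitted = (excitation_flux * capture_cross_section
               * n_molecules * quantum_yield)
    return emitted * collection_eff * filter_transmission * detector_qe
```

Sweeping `conc_molar` through the range of interest yields an absolute calibration curve of signal versus concentration; in this linear regime doubling the concentration doubles the detected signal, which is the behavior such spreadsheet-style models exploit.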
Design, fabrication, and test of a steel spar wind turbine blade
NASA Technical Reports Server (NTRS)
Sullivan, T. L.; Sirocky, P. J., Jr.; Viterna, L. A.
1979-01-01
The design and fabrication of wind turbine blades based on 60-foot steel spars are discussed. Performance and blade-load information is given and compared with analytical predictions. In addition, performance is compared to that of the original MOD-0 aluminum blades. Costs for building the two blades are given, and a projection is made for the cost in mass production. Design improvements to reduce weight and improve fatigue life are suggested.
Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Milan, Luis Aparecido; Stockton, Amanda M; Carrilho, Emanuel
2017-05-02
Paper-based devices are a portable, user-friendly, and affordable technology that is one of the best analytical tools for inexpensive diagnostic devices. Three-dimensional microfluidic paper-based analytical devices (3D-μPADs) are an evolution of single layer devices and they permit effective sample dispersion, individual layer treatment, and multiplex analytical assays. Here, we present the rational design of a wax-printed 3D-μPAD that enables more homogeneous permeation of fluids along the cellulose matrix than other existing designs in the literature. Moreover, we show the importance of the rational design of channels on these devices using glucose oxidase, peroxidase, and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS) reactions. We present an alternative method for layer stacking using a magnetic apparatus, which facilitates fluidic dispersion and improves the reproducibility of tests performed on 3D-μPADs. We also provide the optimized designs for printing, facilitating further studies using 3D-μPADs.
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2014-01-01
Monitoring, by analytical chemistry strategies, the misuse of drugs and the abuse of substances and methods that potentially or evidently improve athletic performance is one of the main pillars of modern anti-doping efforts. Owing to the continuously growing knowledge in medicine, pharmacology, and (bio)chemistry, new chemical entities are frequently established and developed, many of which present a temptation for sportsmen and women due to the assumed or attributed beneficial effects of such substances and preparations on, for example, endurance, strength, and regeneration. By means of new technologies, expanded existing test protocols, and new insights into the metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA), analytical assays have been further improved in agreement with the content of the 2013 Prohibited List. In this annual banned-substance review, literature concerning human sports drug testing published between October 2012 and September 2013 is summarized and reviewed, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2013 John Wiley & Sons, Ltd.
Review and assessment of the HOST turbine heat transfer program
NASA Technical Reports Server (NTRS)
Gladden, Herbert J.
1988-01-01
The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena occurring in high-performance gas turbine engines and to assess and improve the analytical methods used to predict the fluid dynamics and heat transfer phenomena. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. Therefore, a building-block approach was utilized, with research ranging from the study of fundamental phenomena and analytical modeling to experiments in simulated real-engine environments. Experimental research accounted for 75 percent of the project, and analytical efforts accounted for approximately 25 percent. Extensive experimental datasets were created depicting the three-dimensional flow field, high free-stream turbulence, boundary-layer transition, blade tip region heat transfer, film cooling effects in a simulated engine environment, rough-wall cooling enhancement in a rotating passage, and rotor-stator interaction effects. In addition, analytical modeling of these phenomena was initiated using boundary-layer assumptions as well as Navier-Stokes solutions.
Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei
2017-08-15
Clinicians working in the health-care diagnostic systems of developing countries currently face the challenges of rising costs, increased numbers of patient visits, and limited resources. A significant trend is the use of low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiplex analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce the patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performance and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performance, can potentially improve the quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.
Many-core graph analytics using accelerated sparse linear algebra routines
NASA Astrophysics Data System (ADS)
Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric
2016-05-01
Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run time, for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without requiring the customer to make any changes to their analytics code, thanks to compatibility with existing graph APIs.
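As a minimal illustration of the linear-algebra view of graph traversal that GraphBLAS formalizes, the sketch below runs breadth-first search by repeated sparse matrix-vector products. This is a generic SciPy example, not the paper's many-core implementation; the small graph is invented for demonstration.

```python
import numpy as np
from scipy.sparse import csr_matrix

# Adjacency matrix of a small directed graph: edge i -> j means A[i, j] = 1.
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]
rows, cols = zip(*edges)
n = 5
A = csr_matrix((np.ones(len(edges)), (rows, cols)), shape=(n, n))

def bfs_levels(A, source):
    """Return the BFS level of each vertex (-1 if unreachable)."""
    n = A.shape[0]
    levels = -np.ones(n, dtype=int)
    frontier = np.zeros(n)
    frontier[source] = 1
    levels[source] = 0
    level = 0
    while frontier.any():
        level += 1
        # One sparse matrix-vector product advances the frontier one hop.
        reached = (A.T @ frontier) > 0
        frontier = np.where((levels < 0) & reached, 1.0, 0.0)
        levels[frontier > 0] = level
    return levels

print(bfs_levels(A, 0))  # -> [0 1 1 2 3]
```

Vertex-centric APIs hide exactly this loop behind per-vertex callbacks; expressing it as algebra lets the runtime dispatch to tuned sparse kernels.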
Petruzziello, Filomena; Grand-Guillaume Perrenoud, Alexandre; Thorimbert, Anita; Fogwill, Michael; Rezzi, Serge
2017-07-18
Analytical solutions enabling the quantification of circulating levels of liposoluble micronutrients such as vitamins and carotenoids are currently limited to either a single analyte or a reduced panel of analytes. The requirement to use multiple approaches hampers the investigation of biological variability across a large number of samples in a time- and cost-efficient manner. With the goal of developing high-throughput and robust quantitative methods for the profiling of micronutrients in human plasma, we introduce a novel, validated workflow for the determination of 14 fat-soluble vitamins and carotenoids in a single run. Automated supported liquid extraction was optimized and implemented to process 48 samples in parallel in 1 h, and the analytes were measured using ultrahigh-performance supercritical fluid chromatography coupled to tandem mass spectrometry in less than 8 min. Improved mass spectrometry interface hardware was built to minimize the post-decompression volume and to allow better control of the chromatographic effluent density on its route toward and into the ion source. In addition, a specific make-up solvent condition was developed to ensure the solubility of both analytes and matrix constituents after mobile phase decompression. The optimized interface resulted in improved spray plume stability and conserved matrix compound solubility, leading to enhanced hyphenation robustness while ensuring suitable analytical repeatability and improved detection sensitivity. The overall methodology gives recoveries within 85-115%, with within- and between-day coefficients of variation of 2 and 14%, respectively.
Guadalupe, Zenaida; Soldevilla, Alberto; Sáenz-Navajas, María-Pilar; Ayestarán, Belén
2006-04-21
A multiple-step analytical method was developed to improve the analysis of polymeric phenolics in red wines. With a common initial step based on the fractionation of wine phenolics by gel permeation chromatography (GPC), different analytical techniques were used: high-performance liquid chromatography-diode array detection (HPLC-DAD), HPLC-mass spectrometry (MS), capillary zone electrophoresis (CZE) and spectrophotometry. This method proved to be valid for analyzing different families of phenolic compounds, such as monomeric phenolics and their derivatives, polymeric pigments and proanthocyanidins. The analytical characteristics of fractionation by GPC were studied and the method was fully validated, yielding satisfactory statistical results. GPC fractionation substantially improved the analysis of polymeric pigments by CZE, in terms of response, repeatability and reproducibility. It also represented an improvement in the traditional vanillin assay used for proanthocyanidin (PA) quantification. Astringent proanthocyanidins were also analyzed using a simple combined method that allowed these compounds, for which only general indexes were available, to be quantified.
NASA Astrophysics Data System (ADS)
Vinh, T.
1980-08-01
There is a need for better and more effective lightning protection for transmission and switching substations. In the past, a number of empirical methods were utilized to design systems to protect substations and transmission lines from direct lightning strokes. The need exists for convenient analytical lightning models adequate for engineering usage. In this study, analytical lightning models were developed along with a method for improved analysis of the physical properties of lightning through their use. This method of analysis is based upon the most recent statistical field data. The result is an improved method for predicting the occurrence of shielding failure and for designing more effective protection of high- and extra-high-voltage substations from direct strokes.
The forensic validity of visual analytics
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.
2008-01-01
The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues with digital data arise around admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss these issues of digital forensics as they apply to visual analytics, identify how visual analytics techniques fit into the digital forensics analysis process, show how visual analytics techniques can improve the legal admissibility of digital data, and identify what research is needed to further improve this process.
The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and countries.
CAP/ACMG proficiency testing for biochemical genetics laboratories: a summary of performance.
Oglesbee, Devin; Cowan, Tina M; Pasquali, Marzia; Wood, Timothy C; Weck, Karen E; Long, Thomas; Palomaki, Glenn E
2018-01-01
Purpose: Testing for inborn errors of metabolism is performed by clinical laboratories worldwide, each utilizing laboratory-developed procedures. We sought to summarize performance in the College of American Pathologists' (CAP) proficiency testing (PT) program and identify opportunities for improving laboratory quality. When evaluating PT data, we focused on a subset of laboratories that have participated in at least one survey since 2010. Methods: An analysis of laboratory performance (2004 to 2014) on the Biochemical Genetics PT Surveys, a program administered by CAP and the American College of Medical Genetics and Genomics. Analytical and interpretive performance was evaluated for four tests: amino acids, organic acids, acylcarnitines, and mucopolysaccharides. Results: Since 2010, 150 laboratories have participated in at least one of four PT surveys. Analytic sensitivities ranged from 88.2 to 93.4%, while clinical sensitivities ranged from 82.4 to 91.0%. Performance was higher for US participants and for more recent challenges. Performance was lower for challenges with subtle findings or complex analytical patterns. Conclusion: US clinical biochemical genetics laboratory proficiency is satisfactory, with a minority of laboratories accounting for the majority of errors. Our findings underscore the complex nature of clinical biochemical genetics testing and highlight the necessity of continuous quality management.
NASA Astrophysics Data System (ADS)
Thiébaut, E.; Goupil, C.; Pesty, F.; D'Angelo, Y.; Guegan, G.; Lecoeur, P.
2017-12-01
Increasing the maximum cooling effect of a Peltier cooler can be achieved through material and device design. Inhomogeneous, functionally graded materials may be adopted to increase maximum cooling without improvement of ZT (the figure of merit); however, such systems are usually based on the assumption that local optimization of ZT is the suitable criterion for increasing thermoelectric performance. We solve the heat equation in a graded material and perform both analytical and numerical analysis of a graded Peltier cooler. We find a local criterion that we use to assess the possible improvement of graded materials for thermoelectric cooling. A fair improvement of the cooling effect (up to 36%) is predicted for semiconductor materials, and the best graded system for cooling is described. The influence of the equation of state of the electronic gas of the material is discussed, and the difference in terms of entropy production between the graded and the classical system is also described.
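For reference, the thermoelectric figure of merit ZT discussed above is conventionally defined from the Seebeck coefficient S, the electrical conductivity σ, the thermal conductivity κ, and the absolute temperature T:

```latex
ZT = \frac{S^{2}\,\sigma\,T}{\kappa}
```

In a graded material S, σ, and κ become position-dependent, which is why optimizing ZT locally at every point need not maximize the cooling effect of the device as a whole.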
High-Performance Data Analytics Beyond the Relational and Graph Data Models with GEMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castellana, Vito G.; Minutoli, Marco; Bhatt, Shreyansh
Graphs represent an increasingly popular data model for data analytics, since they can naturally represent relationships and interactions between entities. Relational databases and their pure table-based data model are not well suited to storing and processing sparse data. Consequently, graph databases have gained interest in the last few years, and the Resource Description Framework (RDF) became the standard data model for graph data. Nevertheless, while RDF is well suited to analyzing the relationships between entities, it is not efficient in representing their attributes and properties. In this work we propose the adoption of a new hybrid data model, based on attributed graphs, that aims at overcoming the limitations of the pure relational and graph data models. We present how we have re-designed the GEMS data-analytics framework to fully take advantage of the proposed hybrid data model. To improve analyst productivity, in addition to a C++ API for application development, we adopt GraQL as the input query language. We validate our approach by implementing a set of queries on net-flow data, and we compare our framework's performance against Neo4j. Experimental results show significant performance improvement over Neo4j, up to several orders of magnitude when increasing the size of the input data.
NASA Technical Reports Server (NTRS)
Amos, D. J.
1977-01-01
An analytical evaluation was conducted to determine quantitatively the improvement potential in cycle efficiency and cost of electricity made possible by the introduction of thermal barrier coatings to power generation combustion turbine systems. The thermal barrier system, a metallic bond coat and yttria stabilized zirconia outer layer applied by plasma spray techniques, acts as a heat insulator to provide substantial metal temperature reductions below that of the exposed thermal barrier surface. The study results show the thermal barrier to be a potentially attractive means for improving performance and reducing cost of electricity for the simple, recuperated, and combined cycles evaluated.
Strategic analytics: towards fully embedding evidence in healthcare decision-making.
Garay, Jason; Cartagena, Rosario; Esensoy, Ali Vahit; Handa, Kiren; Kane, Eli; Kaw, Neal; Sadat, Somayeh
2015-01-01
Cancer Care Ontario (CCO) has implemented multiple information technology solutions and collected health-system data to support its programs. There is now an opportunity to leverage these data and perform advanced end-to-end analytics that inform decisions around improving health-system performance. In 2014, CCO engaged in an extensive assessment of its current data capacity and capability, with the intent to drive increased use of data for evidence-based decision-making. The breadth and volume of data at CCO uniquely place the organization to contribute not only to system-wide operational reporting, but also to more advanced modelling of current and future state system management and planning. In 2012, CCO established a strategic analytics practice to assist the agency's programs in contextualizing and informing key business decisions and to provide support through innovative predictive analytics solutions. This paper describes the organizational structure, services and supporting operations that have enabled progress to date, and discusses the next steps towards the vision of embedding evidence fully into healthcare decision-making. Copyright © 2014 Longwoods Publishing.
A Visual Analytics Paradigm Enabling Trillion-Edge Graph Exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Pak C.; Haglin, David J.; Gillen, David S.
We present a visual analytics paradigm and a system prototype for exploring web-scale graphs. A web-scale graph is described as a graph with ~one trillion edges and ~50 billion vertices. While there is an aggressive R&D effort in processing and exploring web-scale graphs among internet vendors such as Facebook and Google, visualizing a graph of that scale still remains an underexplored R&D area. The paper describes a nontraditional peek-and-filter strategy that facilitates the exploration of a graph database of unprecedented size for visualization and analytics. We demonstrate that our system prototype can 1) preprocess a graph with ~25 billion edges in less than two hours and 2) support database query and visualization on the processed graph database afterward. Based on our computational performance results, we argue that we will most likely achieve the one-trillion-edge mark (a computational performance improvement of 40 times) for graph visual analytics in the near future.
NASA Astrophysics Data System (ADS)
Zaiwani, B. E.; Zarlis, M.; Efendi, S.
2018-03-01
This research improves on a hybridization of the Fuzzy Analytic Hierarchy Process (FAHP) with the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS) for selecting the best bank chief inspector based on several qualitative and quantitative criteria with various priorities. To improve on that earlier work, a hybridization of FAHP with the Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW) algorithm was adopted, applying FAHP to the weighting process and SAW to the ranking process to determine the promotion of employees at a government institution. The average Efficiency Rate (ER) improved to 85.24%, surpassing the 77.82% achieved by the previous research. Keywords: Ranking and Selection, Fuzzy AHP, Fuzzy TOPSIS, FMADM-SAW.
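The SAW ranking step used above can be sketched in a few lines: normalize each criterion column, then rank alternatives by the weighted sum of normalized scores. The candidate scores and FAHP-derived weights below are purely illustrative assumptions, not values from the study.

```python
import numpy as np

# Simple Additive Weighting (SAW) over benefit criteria.
scores = np.array([           # rows: candidates, cols: criteria
    [80.0, 70.0, 90.0],
    [75.0, 85.0, 80.0],
    [90.0, 60.0, 85.0],
])
weights = np.array([0.5, 0.3, 0.2])   # illustrative FAHP output, sums to 1

# Benefit criteria are normalized against the column maximum.
normalized = scores / scores.max(axis=0)
totals = normalized @ weights
ranking = np.argsort(totals)[::-1]    # best candidate first
print(ranking, totals.round(3))
```

Cost criteria (where lower is better) would instead be normalized as column minimum divided by each score; the weighted-sum and ranking steps are unchanged.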
High efficiency silicon solar cell review
NASA Technical Reports Server (NTRS)
Godlewski, M. P. (Editor)
1975-01-01
An overview is presented of the current research and development efforts to improve the performance of the silicon solar cell. The 24 papers presented reviewed experimental and analytic modeling work emphasizing the improvement of conversion efficiency and the reduction of manufacturing costs. A summary is given of the round-table discussion, in which the near- and far-term directions of future efficiency improvements were discussed.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to conduct a study that assesses the process improvement, quality management, and analytical techniques taught to students in U.S. college and university undergraduate and graduate degree programs in systems engineering and the computing sciences (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
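A toy version of the kind of Monte Carlo baseline described above might look like the following. The drivers, weights, and distribution parameters are invented for illustration; they are not the actual ACSI model coefficients.

```python
import random
import statistics

# Monte Carlo sketch: propagate uncertainty in three ACSI-style drivers
# (quality, expectations, value) through a linear scoring model to obtain
# a distribution of predicted satisfaction scores.
random.seed(42)

def simulate_score():
    quality = random.gauss(82, 4)        # perceived quality, 0-100 scale
    expectations = random.gauss(78, 5)   # customer expectations
    value = random.gauss(75, 6)          # perceived value
    return 0.5 * quality + 0.2 * expectations + 0.3 * value

runs = sorted(simulate_score() for _ in range(10_000))
mean = statistics.mean(runs)
p5, p95 = runs[500], runs[9500]
print(f"baseline {mean:.1f}, 90% interval [{p5:.1f}, {p95:.1f}]")
```

Sensitivity analysis then follows by perturbing one driver's distribution at a time and observing the shift in the baseline and interval, which is the sort of what-if exploration a dashboard would expose.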
DOT National Transportation Integrated Search
2012-11-30
The objective of this project was to develop technical relationships between reliability improvement strategies and reliability performance metrics. This project defined reliability, explained the importance of travel time distributions for measuring...
Engineering Analysis of Stresses in Railroad Rails.
DOT National Transportation Integrated Search
1981-10-01
One portion of the Federal Railroad Administration's (FRA) Track Performance Improvement Program is the development of engineering and analytic techniques required for the design and maintenance of railroad track of increased integrity and safety. Un...
Preliminary Description of Stresses in Railroad Rail
DOT National Transportation Integrated Search
1976-11-01
One portion of the Federal Railroad Administration's (FRA) Track Performance Improvement Program is the development of engineering and analytic techniques required for the design and maintenance of railroad track of increased integrity and safety. Un...
ERIC Educational Resources Information Center
Sanders, Ethan S.; Ruggles, Julie L.
2000-01-01
Discusses the evolution of human performance improvement, an outgrowth of instructional systems design and programmed instruction that emerged after World War II. Discusses the contributing disciplines (behaviorism, analytical systems, organizational learning, organizational development, systems theory, management development) and the major…
Contextual Facilitators of and Barriers to Nursing Home Pressure Ulcer Prevention
Hartmann, Christine W.; Solomon, Jeffrey; Palmer, Jennifer A.; Lukas, Carol VanDeusen
2016-01-01
OBJECTIVE Important gaps exist in the knowledge of how to achieve successful, sustained prevention of pressure ulcers (PrUs) in nursing homes. This study aimed to address those gaps by comparing nursing leadership and indirect care staff members’ impressions about the context of PrU prevention in facilities with improving and declining PrU rates. SETTING The study was conducted in a sample of 6 Veterans Health Administration nursing homes (known as community living centers) purposively selected to represent a range of PrU care performance. DESIGN AND PARTICIPANTS One-time 30-minute semistructured interviews with 23 community living center staff were conducted. Qualitative interview data were analyzed using an analytic framework containing (a) a priori analytic constructs based on the study’s conceptual framework and (b) sections for emerging constructs. MAIN RESULTS Analysis revealed 6 key concepts differentiating sites with improving and declining PrU care performance. These concepts were (1) structures through which the change effort is initiated; (2) organizational prioritization, alignment, and support; (3) improvement culture; (4) clarity of roles and responsibilities; (5) communication strategies; and (6) staffing and clinical practices. Results also pointed to potential contextual facilitators of and barriers to successful PrU prevention. CONCLUSIONS Leadership’s visible prioritization of and support for PrU prevention and the initiation of PrU prevention activities through formal structures were the most striking components represented at sites with improving performance, but not at ones where performance declined. Sites with improving performance were more likely to align frontline staff and leadership goals for PrU prevention. PMID:27089151
The "hospital central laboratory": automation, integration and clinical usefulness.
Zaninotto, Martina; Plebani, Mario
2010-07-01
Recent technological developments in laboratory medicine have led to a major challenge: maintaining a close connection between the search for efficiency through automation and consolidation and the assurance of effectiveness. The adoption of systems that automate most of the manual tasks characterizing routine activities has significantly improved the quality of laboratory performance, total laboratory automation being the paradigm of the idea that "human-less" robotic laboratories may allow for better operation and ensure fewer human errors. Furthermore, even if ongoing technological developments have considerably improved the productivity of clinical laboratories as well as reducing the turnaround time of the entire process, the value of qualified personnel remains a significant issue. Recent evidence confirms that automation allows clinical laboratories to improve analytical performance only if trained staff operate in accordance with well-defined standard operating procedures, thus assuring continuous monitoring of analytical quality. In addition, laboratory automation may improve the appropriateness of test requests through the use of algorithms and reflex testing. This should allow the adoption of clinical and biochemical guidelines. In conclusion, in laboratory medicine, technology represents a tool for improving clinical effectiveness and patient outcomes, but it has to be managed by qualified laboratory professionals.
NASA Astrophysics Data System (ADS)
Rappleye, Devin Spencer
The development of electroanalytical techniques in multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has been limited to single-analyte mixtures with a few exceptions. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even with an additional signal from another analyte, correlating signals to concentration and identifying improvements in experimental and analytical methodologies. (Abstract shortened by ProQuest.).
Assessing Vocal Performances Using Analytical Assessment: A Case Study
ERIC Educational Resources Information Center
Gynnild, Vidar
2016-01-01
This study investigated ways to improve the appraisal of vocal performances within a national academy of music. Since a criterion-based assessment framework had already been adopted, the conceptual foundation of an assessment rubric was used as a guide in an action research project. The group of teachers involved wanted to explore thinking…
Extending Climate Analytics-as-a-Service to the Earth System Grid Federation
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.
2015-12-01
We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.
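The "commonly used operations" over the reanalysis collections are not enumerated in the abstract; as a rough illustration of the kind of ensemble operation such a service might expose, here is a NumPy sketch over a fake (dataset, time, lat, lon) array. The array layout and values are assumptions, and this is not the NASA CDS API.

```python
import numpy as np

# 6 reanalysis products, 12 monthly means, on a coarse 4x8 grid (fake data).
rng = np.random.default_rng(0)
ensemble = 280 + 5 * rng.standard_normal((6, 12, 4, 8))

climatology = ensemble.mean(axis=1)     # per-dataset annual mean field
ensemble_mean = ensemble.mean(axis=0)   # across-dataset mean, per month
spread = ensemble.std(axis=0)           # where the reanalyses disagree

print(climatology.shape, ensemble_mean.shape, spread.shape)
```

A server-side analytics service performs reductions like these next to the data and ships back only the small result fields, which is the point of the analytics-as-a-service model.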
Liu, Shao-Ying; Huang, Xi-Hui; Wang, Xiao-Fang; Jin, Quan; Zhu, Guo-Nian
2014-05-01
This study developed an improved analytical method for the simultaneous quantification of 13 quinolones in cosmetics by ultra high performance liquid chromatography combined with ESI triple quadrupole MS/MS under the multiple reaction monitoring mode. The analytes were extracted and purified by using an SPE cartridge. The limits of quantification ranged from 0.03 to 3.02 μg/kg. The precision for determining the quinolones was <19.39%. The proposed method was successfully developed for the determination of quinolones in real cosmetic samples. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
1986-01-01
the information that has been determined experimentally. The Labyrinth Seal Analysis program was, therefore, directed to the development of an...labyrinth seal performance, the program included the development of an improved empirical design model to provide the calculation of the flow... program. Phase I was directed to the analytical development of both an "analysis" model and an improved empirical "design" model. Supporting rig tests
NASA Technical Reports Server (NTRS)
Storey, Jedediah Morse
2016-01-01
Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and to improving the performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many experimental and numerical studies of water slosh have been conducted. However, slosh data for cryogenic liquids is lacking. Water and cryogenic liquid nitrogen are used in various ground-based tests with a spherical tank to characterize damping, slosh mode frequencies, and slosh forces. A single ring baffle is installed in the tank for some of the tests. Analytical models for slosh modes, slosh forces, and baffle damping are constructed based on prior work. Select experiments are simulated using a commercial CFD software package, and the numerical results are compared to the analytical and experimental results for the purposes of validation and methodology improvement.
An improved multiple flame photometric detector for gas chromatography.
Clark, Adrian G; Thurbide, Kevin B
2015-11-20
An improved multiple flame photometric detector (mFPD) is introduced, based upon interconnecting fluidic channels within a planar stainless steel (SS) plate. Relative to the previous quartz tube mFPD prototype, the SS mFPD provides a 50% reduction in background emission levels, an orthogonal analytical flame, and easier, more sensitive operation. As a result, sulfur response in the SS mFPD spans 4 orders of magnitude, yields a minimum detectable limit near 9×10⁻¹² g S/s, and has a selectivity approaching 10⁴ over carbon. The device also exhibits exceptionally large resistance to hydrocarbon response quenching. Additionally, the SS mFPD uniquely allows analyte emission monitoring in the multiple worker flames for the first time. The findings suggest that this mode can potentially further improve upon the analytical flame response of sulfur (both linear HSO and quadratic S2) and also phosphorus. Of note, the latter is nearly 20-fold stronger in S/N in the collective worker-flame response and provides 6 orders of linearity with a detection limit of about 2.0×10⁻¹³ g P/s. Overall, the results indicate that this new SS design notably improves the analytical performance of the mFPD and can provide a versatile and beneficial monitoring tool for gas chromatography. Copyright © 2015 Elsevier B.V. All rights reserved.
In vivo analytical performance of nitric oxide-releasing glucose biosensors.
Soto, Robert J; Privett, Benjamin J; Schoenfisch, Mark H
2014-07-15
The in vivo analytical performance of percutaneously implanted nitric oxide (NO)-releasing amperometric glucose biosensors was evaluated in swine for 10 d. Needle-type glucose biosensors were functionalized with NO-releasing polyurethane coatings designed to release similar total amounts of NO (3.1 μmol cm⁻²) for rapid (16.0 ± 4.4 h) or slower (>74.6 ± 16.6 h) durations and remain functional as outer glucose sensor membranes. Relative to controls, NO-releasing sensors were characterized with improved numerical accuracy on days 1 and 3. Furthermore, the clinical accuracy and sensitivity of rapid NO-releasing sensors were superior to control and slower NO-releasing sensors at both 1 and 3 d implantation. In contrast, the slower, extended, NO-releasing sensors were characterized by shorter sensor lag times (<4.2 min) in response to intravenous glucose tolerance tests versus burst NO-releasing and control sensors (>5.8 min) at 3, 7, and 10 d. Collectively, these results highlight the potential for NO release to enhance the analytical utility of in vivo glucose biosensors. Initial results also suggest that this analytical performance benefit is dependent on the NO-release duration.
Slushy weightings for the optimal pilot model. [considering a visual tracking task]
NASA Technical Reports Server (NTRS)
Dillow, J. D.; Picha, D. G.; Anderson, R. O.
1975-01-01
A pilot model is described which accounts for the effect of motion cues in a well-defined visual tracking task. The effects of visual and motion cues are accounted for in the model in two ways. First, the observation matrix in the pilot model is structured to account for the visual and motion inputs presented to the pilot. Second, the weightings in the quadratic cost function associated with the pilot model are modified to account for the pilot's perception of the variables he considers important in the task. Analytic results obtained using the pilot model are compared to experimental results, and in general good agreement is demonstrated. The analytic model yields small improvements in tracking performance with the addition of motion cues for easily controlled task dynamics and large improvements with the addition of motion cues for difficult task dynamics.
A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.
Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy
2016-12-01
Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling, and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
Propeller flow visualization techniques
NASA Technical Reports Server (NTRS)
Stefko, G. L.; Paulovich, F. J.; Greissing, J. P.; Walker, E. D.
1982-01-01
Propeller flow visualization techniques were tested. The actual operating blade shape as it determines the actual propeller performance and noise was established. The ability to photographically determine the advanced propeller blade tip deflections, local flow field conditions, and gain insight into aeroelastic instability is demonstrated. The analytical prediction methods which are being developed can be compared with experimental data. These comparisons contribute to the verification of these improved methods and give improved capability for designing future advanced propellers with enhanced performance and noise characteristics.
Empirically Optimized Flow Cytometric Immunoassay Validates Ambient Analyte Theory
Parpia, Zaheer A.; Kelso, David M.
2010-01-01
Ekins’ ambient analyte theory predicts, counterintuitively, that an immunoassay’s limit of detection can be improved by reducing the amount of capture antibody. In addition, it also anticipates that results should be insensitive to the volume of sample as well as the amount of capture antibody added. The objective of this study is to empirically validate all of the performance characteristics predicted by Ekins’ theory. Flow cytometric analysis was used to detect binding between a fluorescent ligand and capture microparticles, since it can directly measure fractional occupancy, the primary response variable in ambient analyte theory. After experimentally determining ambient analyte conditions, comparisons were carried out between ambient and non-ambient assays in terms of their signal strengths, limits of detection, and their sensitivity to variations in reaction volume and number of particles. The critical number of binding sites required for an assay to be in the ambient analyte region was estimated to be 0.1·V·Kd. As predicted, such assays exhibited superior signal/noise levels and limits of detection, and were not affected by variations in sample volume and number of binding sites. When the signal detected measures fractional occupancy, ambient analyte theory is an excellent guide to developing assays with superior performance characteristics. PMID:20152793
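The criterion above, that the number of binding sites stay below roughly 0.1·V·Kd (V the reaction volume, Kd the dissociation constant), can be made concrete with a small calculation; the volume and Kd below are assumed for illustration only, not taken from the study.

```python
# Illustration of Ekins' ambient analyte criterion: an assay operates in
# the ambient analyte region when the amount of capture binding sites
# (in moles) stays below roughly 0.1 * V * Kd.

N_AVOGADRO = 6.02214076e23  # 1/mol

def max_ambient_binding_sites(volume_l, kd_molar):
    """Upper bound on binding-site moles for ambient analyte conditions."""
    return 0.1 * volume_l * kd_molar

# Hypothetical assay: 100 uL reaction volume, Kd = 1e-10 M
moles = max_ambient_binding_sites(100e-6, 1e-10)
print(f"limit: {moles:.1e} mol (~{moles * N_AVOGADRO:.2e} sites)")
```

Below this bound, the capture sites bind a negligible fraction of the analyte, so fractional occupancy reflects the ambient concentration regardless of sample volume, which is why the theory predicts insensitivity to both volume and antibody amount.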
Analytical study of effect of casing treatment on performance of a multistage compressor
NASA Technical Reports Server (NTRS)
Snyder, R. W.; Blade, R. J.
1972-01-01
The simulation was based on individual stage pressure and efficiency maps. These maps were modified to account for casing treatment effects on the individual stage characteristics. The effects of the modified stage maps on overall compressor performance were then observed. The results show that to improve the performance of the compressor in its normal operating range, casing treatment of the rear stages is required.
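The stage-stacking idea behind the simulation can be sketched simply: the overall pressure ratio is the product of the stage pressure ratios, so lifting the rear-stage characteristics raises the overall ratio. All stage numbers below are invented for illustration and are not from the study.

```python
from math import prod

def overall_pressure_ratio(stage_ratios):
    """Overall compressor pressure ratio as the product of stage ratios."""
    return prod(stage_ratios)

# Hypothetical 4-stage compressor; casing treatment assumed to lift
# the rear-stage pressure ratios slightly.
baseline = [1.30, 1.28, 1.25, 1.22]
treated = [1.30, 1.28, 1.27, 1.25]
print(overall_pressure_ratio(baseline), overall_pressure_ratio(treated))
```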
Bias Assessment of General Chemistry Analytes using Commutable Samples.
Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter
2014-11-01
Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years, but analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples (samples that have the same properties as the clinical samples routinely analysed) should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt to determine not only country-specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoegg, Edward D.; Marcus, R. Kenneth; Hager, Georg
2018-02-28
In an effort to understand and improve the accuracy and precision of the liquid sampling-atmospheric pressure glow discharge (LS-APGD)/Orbitrap system, effects of concomitant ions on the acquired mass spectra are examined and presented. The LS-APGD/Orbitrap instrument system is capable of high-quality isotope ratio measurements, which are of high analytical interest for nuclear non-proliferation detection applications. The presence of background and concomitant ions (water clusters, matrix, and other analytes) has presented limitations in earlier studies. In order to mitigate these effects, an alternative quadrupole-Orbitrap hybrid mass spectrometer was employed in this study. This instrument configuration has a quadrupole mass filter preceding the Orbitrap to filter out undesired non-analyte ions. Results are presented for the analysis of U in the presence of Rb, Ag, Ba, and Pb as concomitants, each present at 5 µg/mL concentration. Progressive filtering of each concomitant ion shows steadily improved U isotope ratio performance. Ultimately, a 235U/238U ratio of 0.007133, with a relative accuracy of -2.1% and a relative standard deviation of 0.087%, was achieved using this system, along with improved calibration linearity and lowered limits of detection. The resultant performance compares very favorably with other commonly accepted isotope ratio measurement platforms - surprisingly so for an ion trap type mass spectrometry instrument.
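The accuracy and precision figures quoted for the isotope ratio follow from simple definitions: bias relative to a reference value, and relative standard deviation across replicates. The replicate values and certified reference below are hypothetical, chosen only so the bias lands near the reported -2.1%.

```python
import statistics

def relative_accuracy_pct(measured_mean, reference):
    """Relative accuracy (bias) versus a reference value, in percent."""
    return 100.0 * (measured_mean - reference) / reference

def rsd_pct(values):
    """Relative standard deviation of replicate measurements, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate 235U/238U ratios and an assumed certified value
replicates = [0.007130, 0.007136, 0.007133]
certified = 0.007286
print(relative_accuracy_pct(statistics.mean(replicates), certified))
print(rsd_pct(replicates))
```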
Bukve, Tone; Røraas, Thomas; Riksheim, Berit Oddny; Christensen, Nina Gade; Sandberg, Sverre
2015-01-01
The Norwegian Quality Improvement of Primary Care Laboratories (Noklus) offers external quality assurance (EQA) schemes (EQASs) for urine albumin (UA) annually. This study analyzed the EQA results to determine how the analytical quality of UA analysis in general practice (GP) offices developed between 1998 (n=473) and 2012 (n=1160). Two EQA urine samples were distributed yearly to the participants by mail. The participants measured the UA of each sample and returned the results together with information about their instrument, the profession and number of employees at the office, frequency of internal quality control (IQC), and number of analyses per month. In the feedback report, they received an assessment of their analytical performance. The number of years that the GP office had participated in Noklus was inversely related to the percentage of "poor" results for quantitative but not semiquantitative instruments. The analytical quality improved for participants using quantitative instruments who received an initial assessment of "poor" and who subsequently changed their instrument. Participants using reagents that had expired or were within 3 months of the expiration date performed worse than those using reagents that were expiring in more than 3 months. Continuous participation in the Noklus program improved the performance of quantitative UA analyses at GP offices. This is probably in part attributable to the complete Noklus quality system, whereby in addition to participating in EQAS, participants are visited by laboratory consultants who examine their procedures and provide practical advice and education regarding the use of different instruments.
NASA Astrophysics Data System (ADS)
Xu, Jingyan; Fuld, Matthew K.; Fung, George S. K.; Tsui, Benjamin M. W.
2015-04-01
Iterative reconstruction (IR) methods for x-ray CT are a promising approach to improving image quality or reducing radiation dose to patients. The goal of this work was to use task-based image quality measures and the channelized Hotelling observer (CHO) to evaluate both analytic and IR methods for clinical x-ray CT applications. We performed realistic computer simulations at five radiation dose levels, from a clinical reference low dose D0 to 25% D0. A lesion of fixed size and contrast was inserted at different locations into the liver of the XCAT phantom to simulate a weak signal. The simulated data were reconstructed on a commercial CT scanner (SOMATOM Definition Flash; Siemens, Forchheim, Germany) using the vendor-provided analytic (WFBP) and IR (SAFIRE) methods. The reconstructed images were analyzed by CHOs with both rotationally symmetric (RS) and rotationally oriented (RO) channels, and with different numbers of lesion locations (5, 10, and 20) in a signal-known-exactly (SKE), background-known-exactly-but-variable (BKEV) detection task. The area under the receiver operating characteristic curve (AUC) was used as a summary measure to compare the IR and analytic methods; the AUC was also used as the equal-performance criterion to derive the potential dose reduction factor of IR. In general, there was good agreement in the relative AUC values of different reconstruction methods using CHOs with RS and RO channels, although the CHO with RO channels achieved higher AUCs than with RS channels. The improvement of IR over analytic methods depends on the dose level. The reference dose level D0 was based on a clinical low-dose protocol, lower than the standard dose due to the use of IR methods. At 75% D0, the performance improvement was statistically significant (p < 0.05). The potential dose reduction factor also depended on the detection task. For the SKE/BKEV task involving 10 lesion locations, a dose reduction of at least 25% from D0 was achieved.
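Under the usual equal-variance Gaussian assumptions, a linear observer such as the CHO has a detectability index d' that maps to AUC as AUC = Φ(d'/√2). A small helper makes the AUC summary measure concrete; the d' values below are placeholders, not results from the study.

```python
from math import erf

def auc_from_dprime(dprime):
    """AUC of a linear (e.g. channelized Hotelling) observer under
    equal-variance Gaussian assumptions:
    AUC = Phi(d'/sqrt(2)) = 0.5 * (1 + erf(d'/2))."""
    return 0.5 * (1.0 + erf(dprime / 2.0))

# Placeholder detectabilities for two reconstructions at the same dose
print(auc_from_dprime(1.5), auc_from_dprime(1.8))
```

Comparing reconstructions at equal AUC, rather than equal dose, is what yields the dose reduction factor described in the abstract.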
NASA Astrophysics Data System (ADS)
Tan, Yimin; Lin, Kejian; Zu, Jean W.
2018-05-01
The Halbach permanent magnet (PM) array has attracted tremendous research attention in the development of electromagnetic generators for its unique properties. This paper proposes a generalized analytical model for linear generators that, for the first time, combines slotted stator pole-shifting with the implementation of a Halbach array. Initially, the magnetization components of the Halbach array are determined using Fourier decomposition. Then, based on the magnetic scalar potential method, the magnetic field distribution is derived employing specially treated boundary conditions. FEM analysis has been conducted to verify the analytical model. A slotted linear PM generator with a Halbach PM array was constructed to validate the model and further improved using piece-wise springs to trigger full-range reciprocating motion. A dynamic model has been developed to characterize the dynamic behavior of the slider. This analytical method provides an effective tool for the development and optimization of Halbach PM generators. The experimental results indicate that piece-wise springs can be employed to improve generator performance under low excitation frequency.
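The first modeling step, Fourier decomposition of the magnetization, can be sketched for an idealized square-wave magnetization component of amplitude M0, whose sine coefficients are b_n = 4·M0/(n·π) for odd n. This is a generic textbook decomposition used to illustrate the step, not the paper's exact array geometry.

```python
from math import pi

def square_wave_fourier_b(m0, n):
    """Fourier sine coefficient b_n of an ideal square wave of amplitude m0:
    4*m0/(n*pi) for odd n, zero for even n."""
    return 0.0 if n % 2 == 0 else 4.0 * m0 / (n * pi)

# The fundamental dominates; odd harmonics fall off as 1/n
coeffs = [square_wave_fourier_b(1.0, n) for n in range(1, 6)]
print(coeffs)
```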
Dunand, Marielle; Donzelli, Massimiliano; Rickli, Anna; Hysek, Cédric M; Liechti, Matthias E; Grouzmann, Eric
2014-08-01
The diagnosis of pheochromocytoma relies on the measurement of plasma free metanephrines, an assay whose reliability has been considerably improved by ultra-high pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS). Here we report an analytical interference occurring between 4-hydroxy-3-methoxymethamphetamine (HMMA), a metabolite of 3,4-methylenedioxymethamphetamine (MDMA, "Ecstasy"), and normetanephrine (NMN), since they share a common pharmacophore resulting in the same product ion after fragmentation. Synthetic HMMA was spiked into plasma samples containing various concentrations of NMN, and the intensity of the interference was determined by UHPLC-MS/MS before and after improvement of the analytical method. Using a careful adjustment of chromatographic conditions, including a change of the UHPLC analytical column, we were able to distinguish both compounds. HMMA interference in NMN determination should be seriously considered, since MDMA activates the sympathetic nervous system and, if confounded with NMN, may lead to false-positive tests when performing a differential diagnosis of pheochromocytoma. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
JT8D and JT9D jet engine performance improvement program. Task 1: Feasibility analysis
NASA Technical Reports Server (NTRS)
Gaffin, W. O.; Webb, D. E.
1979-01-01
JT8D and JT9D component performance improvement concepts which have a high probability of incorporation into production engines were identified and ranked. An evaluation method based on airline payback period was developed for the purpose of identifying the most promising concepts. The method used available test data and analytical models along with conceptual/preliminary designs to predict the performance improvements, weight, installation characteristics, cost for new production and retrofit, maintenance cost, and qualitative characteristics of candidate concepts. These results were used to arrive at the concept payback period, which is the time required for an airline to recover the investment cost of concept implementation.
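The payback-period criterion used to rank concepts reduces to simple arithmetic: the time for accumulated savings to equal the cost of implementing the concept. The dollar figures below are invented for illustration and are not from the study.

```python
def payback_period_years(implementation_cost, annual_savings):
    """Simple payback period: years for savings to recover the investment."""
    if annual_savings <= 0:
        raise ValueError("concept must produce positive annual savings")
    return implementation_cost / annual_savings

# Hypothetical concept: $250k retrofit cost, $100k/yr fuel and
# maintenance savings
print(payback_period_years(250_000, 100_000))  # 2.5 years
```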
Review of recent advances in analytical techniques for the determination of neurotransmitters
Perry, Maura; Li, Qiang; Kennedy, Robert T.
2009-01-01
Methods and advances for monitoring neurotransmitters in vivo or for tissue analysis of neurotransmitters over the last five years are reviewed. The review is organized primarily by neurotransmitter type. Transmitters and related compounds may be monitored by either in vivo sampling coupled to analytical methods or implanted sensors. Sampling is primarily performed using microdialysis, but low-flow push-pull perfusion may offer advantages of spatial resolution while minimizing the tissue disruption associated with higher flow rates. Analytical techniques coupled to these sampling methods include liquid chromatography, capillary electrophoresis, enzyme assays, sensors, and mass spectrometry. Methods for the detection of amino acid, monoamine, neuropeptide, acetylcholine, nucleoside, and soluble gas neurotransmitters have been developed and improved upon. Advances in the speed and sensitivity of these methods have enabled improvements in temporal resolution and increased the number of compounds detectable. Similar advances have enabled improved detection in tissue samples, with a substantial emphasis on single cells and other small samples. Sensors provide excellent temporal and spatial resolution for in vivo monitoring. Advances in application to catecholamines, indoleamines, and amino acids have been prominent. Improvements in stability, sensitivity, and selectivity of the sensors have been of paramount interest. PMID:19800472
Khorosheva, Eugenia M.; Karymov, Mikhail A.; Selck, David A.; Ismagilov, Rustem F.
2016-01-01
In this paper, we asked if it is possible to identify the best primers and reaction conditions based on improvements in reaction speed when optimizing isothermal reactions. We used digital single-molecule, real-time analyses of both speed and efficiency of isothermal amplification reactions, which revealed that improvements in the speed of isothermal amplification reactions did not always correlate with improvements in digital efficiency (the fraction of molecules that amplify) or with analytical sensitivity. However, we observed that the speeds of amplification for single-molecule (in a digital device) and multi-molecule (e.g. in a PCR well plate) formats always correlated for the same conditions. Also, digital efficiency correlated with the analytical sensitivity of the same reaction performed in a multi-molecule format. Our finding was supported experimentally with examples of primer design, the use or exclusion of loop primers in different combinations, and the use of different enzyme mixtures in one-step reverse-transcription loop-mediated amplification (RT-LAMP). Our results show that measuring the digital efficiency of amplification of single-template molecules allows quick, reliable comparisons of the analytical sensitivity of reactions under any two tested conditions, independent of the speeds of the isothermal amplification reactions. PMID:26358811
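Digital formats estimate template numbers from the fraction of positive partitions via Poisson statistics, which is one standard way to turn a "fraction of molecules that amplify" into a concentration. A minimal sketch of that conversion, not the authors' exact analysis pipeline:

```python
from math import log

def mean_copies_per_partition(positive_fraction):
    """Poisson estimate for a digital assay: lambda = -ln(1 - p),
    where p is the fraction of positive partitions."""
    if not 0.0 <= positive_fraction < 1.0:
        raise ValueError("positive fraction must be in [0, 1)")
    return -log(1.0 - positive_fraction)

# e.g. half the partitions positive -> ~0.69 copies per partition on average
print(mean_copies_per_partition(0.5))
```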
Shameli, Seyed Mostafa; Glawdel, Tomasz; Ren, Carolyn L
2015-03-01
Counter-flow gradient electrofocusing allows the simultaneous concentration and separation of analytes by generating a gradient in the total velocity of each analyte, which is the sum of its electrophoretic velocity and the bulk counter-flow velocity. In the scanning format, the bulk counter-flow velocity varies with time so that a number of analytes with large differences in electrophoretic mobility can be sequentially focused and passed by a single detection point. Studies have shown that nonlinear (such as bilinear) velocity gradients along the separation channel can improve both peak capacity and separation resolution simultaneously, which cannot be realized by using a single linear gradient. Developing an effective separation system based on the scanning counter-flow nonlinear gradient electrofocusing technique usually requires extensive experimental and numerical efforts, which can be reduced significantly with the help of analytical models for design optimization and for guiding experimental studies. Therefore, this study focuses on developing an analytical model to evaluate the separation performance of scanning counter-flow bilinear gradient electrofocusing methods. In particular, this model allows a bilinear gradient and a scanning rate to be optimized for the desired separation performance. The results based on this model indicate that any bilinear gradient provides a higher separation resolution (up to 100%) compared to the linear case. This model is validated by numerical studies. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
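The focusing condition itself is simple to state: an analyte accumulates where its electrophoretic velocity exactly cancels the bulk counter-flow. For a linear field gradient E(x) = E0 + g·x this gives a closed-form focal position; the parameter values below are placeholders, not from the paper.

```python
def focus_position(mobility, u_bulk, e0, g):
    """Focal point where the total analyte velocity vanishes:
    u_bulk + mobility*(e0 + g*x) = 0  ->  x = -(u_bulk/mobility + e0)/g."""
    return -(u_bulk / mobility + e0) / g

# Placeholder values: mobility 2e-8 m^2/(V s), counter-flow -1e-4 m/s,
# field offset 1000 V/m, field gradient 1e7 V/m^2
print(focus_position(2e-8, -1e-4, 1000.0, 1e7))  # 4e-4 m, i.e. 0.4 mm
```

Scanning the counter-flow velocity u_bulk in time sweeps this focal point along the channel, which is how analytes of different mobility are marched past a single detector.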
Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas
2017-01-01
As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. The trend is attractive to analytical problems in the manufacturing system design and performance improvement domain because 1) finding a global optimization for the system is a complex problem; and 2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems by independent services can result in a sub-optimal solution at the system level. This paper investigates the technique called Analytical Target Cascading (ATC) to coordinate the optimization of loosely-coupled sub-problems, each may be modularly formulated by differing departments and be solved by modular analytical services. The result demonstrates that ATC is a promising method in that it offers system-level optimal solutions that can scale up by exploiting distributed and modular executions while allowing easier management of the problem formulation.
NASA Astrophysics Data System (ADS)
Lyashko, A. D.
2017-11-01
A new analytical presentation of the solution for steady-state oscillations of an orthotropic rectangular prism is found. The corresponding infinite system of linear algebraic equations is deduced by the superposition method. A countable set of precise eigenfrequencies and elementary eigenforms is found. Identities are derived that improve the convergence of all the infinite series in the solution of the problem, and all the infinite series in the presentation of the solution are summed analytically. Numerical calculations of stresses have been performed for a rectangular orthotropic prism under a load that is uniform along the border and harmonic in time, applied on two opposite faces.
Ding, Chenghua; Qu, Kang; Li, Yongbo; Hu, Kai; Liu, Hongxia; Ye, Baoxian; Wu, Yangjie; Zhang, Shusheng
2007-11-02
Six calixarene-bonded silica gel stationary phases were prepared and characterized by elemental analysis, infrared spectroscopy, and thermal analysis. Their chromatographic performance was investigated by using PAHs, aromatic positional isomers, and E- and Z-ethyl 3-(4-acetylphenyl)acrylate isomers as probes. Separation mechanisms based on the different interactions between calixarenes and analytes are discussed. The chromatographic behaviors of those analytes on the calixarene columns were influenced by supramolecular interactions, including pi-pi interaction, steric hindrance, and hydrogen bonding between calixarenes and analytes. Notably, the presence of polar groups (-OH, -NO2, and -NH2) in the aromatic isomers could improve their separation selectivity on calixarene-phase columns. The results from quantum chemistry calculations using the DFT-B3LYP/STO-3G* basis set were consistent with the retention behaviors of PAHs on the calix[4]arene column.
Evans, Elizabeth; Gabriel, Ellen Flávia Moreira; Coltro, Wendell Karlos Tomazelli; Garcia, Carlos D
2014-05-07
A systematic investigation was conducted to study the effect of paper type on the analytical performance of a series of microfluidic paper-based analytical devices (μPADs) fabricated using a CO2 laser engraver. Samples included three different grades of Whatman chromatography paper, and three grades of Whatman filter paper. According to the data collected and the characterization performed, different papers offer a wide range of flow rate, thickness, and pore size. After optimizing the channel widths on the μPAD, the focus of this study was directed towards the color intensity and color uniformity formed during a colorimetric enzymatic reaction. According to the results herein described, the type of paper and the volume of reagents dispensed in each detection zone can determine the color intensity and uniformity. Therefore, the objective of this communication is to provide rational guidelines for the selection of paper substrates for the fabrication of μPADs.
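The dependence of flow on paper grade noted above is commonly modeled by the Lucas-Washburn equation, L(t) = sqrt(gamma*r*cos(theta)*t / (2*eta)), which links wicking distance to pore radius and liquid properties. A sketch under idealized-capillary assumptions; the parameter values are illustrative, not from the paper.

```python
from math import sqrt, cos, radians

def washburn_distance(gamma, pore_radius, contact_angle_deg, viscosity, t):
    """Lucas-Washburn wicking distance L(t) = sqrt(gamma*r*cos(theta)*t/(2*eta))
    for an idealized capillary; real paper deviates, but the scaling with
    pore radius and time holds."""
    return sqrt(gamma * pore_radius * cos(radians(contact_angle_deg)) * t
                / (2.0 * viscosity))

# Illustrative values: water (gamma = 0.072 N/m, eta = 1e-3 Pa s),
# 5 um pore radius, fully wetting, t = 10 s
print(washburn_distance(0.072, 5e-6, 0.0, 1e-3, 10.0))
```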
Challenges in Modern Anti-Doping Analytical Science.
Ayotte, Christiane; Miller, John; Thevis, Mario
2017-01-01
The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require an important investment from the laboratories and recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may result in the future in a change in the strategy applied by the World Anti-Doping Agency in relation to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.
Advanced analytical modeling of double-gate Tunnel-FETs - A performance evaluation
NASA Astrophysics Data System (ADS)
Graef, Michael; Hosenfeld, Fabian; Horst, Fabian; Farokhnejad, Atieh; Hain, Franziska; Iñíguez, Benjamín; Kloes, Alexander
2018-03-01
The Tunnel-FET is one of the most promising devices to succeed the standard MOSFET due to its alternative current transport mechanism, which allows a subthreshold slope smaller than the physically limited 60 mV/dec of the MOSFET. Recently fabricated devices already show smaller slopes, but mostly not over multiple decades of the current transfer characteristics. In this paper the performance-limiting effects occurring during the fabrication process of the device, such as doping profiles and midgap traps, are analyzed by physics-based analytical models, and their performance-limiting impact is quantified. Additionally, performance-enhancing possibilities, such as hetero-structures and ambipolarity improvements, are introduced and discussed. An extensive double-gate n-Tunnel-FET model is presented, which meets the versatile device requirements and shows a good fit with TCAD simulations and measurement data.
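The 60 mV/dec figure is the room-temperature thermionic limit SS = ln(10)·kT/q, which a Tunnel-FET can undercut because band-to-band tunneling is not thermally limited. The limit itself is a one-line computation:

```python
from math import log

def thermionic_ss_limit_mv_per_dec(temp_k=300.0):
    """MOSFET subthreshold-slope limit SS = ln(10)*k*T/q, in mV/decade."""
    k_boltzmann = 1.380649e-23    # Boltzmann constant, J/K
    q_electron = 1.602176634e-19  # elementary charge, C
    return log(10.0) * k_boltzmann * temp_k / q_electron * 1e3

print(thermionic_ss_limit_mv_per_dec())  # ~59.5 mV/dec at 300 K
```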
Big Data Analytic, Big Step for Patient Management and Care in Puerto Rico.
Borrero, Ernesto E
2018-01-01
This letter provides an overview of the application of big data in the health-care system to improve quality of care, including predictive modelling for risk and resource use, precision medicine and clinical decision support, quality-of-care and performance measurement, and public health and research applications, among others. The author delineates the tremendous potential of big data analytics and discusses how it can be successfully implemented in clinical practice as an important component of a learning health-care system.
Development of a plasma sprayed ceramic gas path seal for high pressure turbine applications
NASA Technical Reports Server (NTRS)
Shiembob, L. T.
1977-01-01
The plasma sprayed, graded, layered yttria-stabilized zirconia (ZrO2)/metal (CoCrAlY) seal system for gas turbine blade tip applications up to 1589 K (2400 F) seal temperatures was studied. Abradability, erosion, and thermal fatigue characteristics of the graded layered system were evaluated by rig tests. Satisfactory abradability and erosion resistance were demonstrated, and encouraging thermal fatigue tolerance was shown. Initial properties for the plasma sprayed materials in the graded, layered seal system were obtained, and thermal stress analyses were performed. Sprayed residual stresses were determined. Thermal stability of the sprayed layer materials was evaluated at estimated maximum operating temperatures in each layer. Anisotropic behavior in the layer thickness direction was demonstrated by all layers. Residual stresses and thermal stability effects were not included in the analyses. Analytical results correlated reasonably well with results of the thermal fatigue tests. Analytical application of the seal system to a typical gas turbine engine predicted performance similar to rig specimen thermal fatigue performance. A model for predicting crack propagation in the sprayed ZrO2/CoCrAlY seal system was proposed, and recommendations for improving thermal fatigue resistance were made. Seal system layer thicknesses were analytically optimized to minimize thermal stresses in the abradability specimen during thermal fatigue testing. Rig tests on the optimized seal configuration demonstrated some improvement in thermal fatigue characteristics.
The CCLM contribution to improvements in quality and patient safety.
Plebani, Mario
2013-01-01
Clinical laboratories play an important role in improving patient care. The past decades have seen remarkable, often unpredictable improvements in analytical performance. Although the seminal concept of the brain-to-brain laboratory loop was described more than four decades ago, there is now a growing awareness of the importance of extra-analytical aspects in laboratory quality. According to this concept, all phases and activities of the testing cycle should be assessed, monitored and improved in order to decrease total error rates, thereby improving patient safety. Clinical Chemistry and Laboratory Medicine (CCLM) not only has followed the shift in perception of quality in the discipline, but has been the catalyst for promoting a large debate on this topic, underlining the value of papers dealing with errors in clinical laboratories and possible remedies, as well as new approaches to the definition of quality in pre-, intra-, and post-analytical steps. The celebration of the 50th anniversary of the CCLM journal offers the opportunity to recall and mention some milestones in the approach to quality and patient safety and to inform our readers, as well as laboratory professionals, clinicians and all the stakeholders, of the willingness of the journal to keep quality issues central to its interest in the future.
Tests of a Semi-Analytical Case 1 and Gelbstoff Case 2 SeaWiFS Algorithm with a Global Data Set
NASA Technical Reports Server (NTRS)
Carder, Kendall L.; Hawes, Steve K.; Lee, Zhongping
1997-01-01
A semi-analytical algorithm was tested with a total of 733 points of either unpackaged or packaged-pigment data, with corresponding algorithm parameters for each data type. The 'unpackaged' type consisted of data sets that were generally consistent with the Case 1 CZCS algorithm and other well calibrated data sets. The 'packaged' type consisted of data sets apparently containing somewhat more packaged pigments, requiring modification of the absorption parameters of the model consistent with the CalCOFI study area. This resulted in two equally divided data sets. A more thorough scrutiny of these and other data sets using a semi-analytical model requires improved knowledge of the phytoplankton and gelbstoff of the specific environment studied. Since the semi-analytical algorithm depends upon 4 spectral channels including the 412 nm channel, while most other algorithms do not, a means of testing data sets for consistency was sought. A numerical filter was developed to classify data sets into the above classes. The filter uses reflectance ratios, which can be determined from space. The sensitivity of such numerical filters to measurement errors resulting from atmospheric correction and sensor noise requires further study. The semi-analytical algorithm performed superbly on each of the data sets after classification, resulting in RMS1 errors of 0.107 and 0.121, respectively, for the unpackaged and packaged data-set classes, with little bias and slopes near 1.0. In combination, the RMS1 performance was 0.114. While these numbers appear rather sterling, one must bear in mind what mis-classification does to the results. Using an average or compromise parameterization on the modified global data set yielded an RMS1 error of 0.171, while using the unpackaged parameterization on the global evaluation data set yielded an RMS1 error of 0.284.
So, without classification, the algorithm performs better globally using the average parameters than it does using the unpackaged parameters. Finally, the effects of even more extreme pigment packaging must be examined in order to improve algorithm performance at high latitudes. Note, however, that the North Sea and Mississippi River plume studies contributed data to the packaged and unpackaged classes, respectively, with little effect on algorithm performance. This suggests that gelbstoff-rich Case 2 waters do not seriously degrade performance of the semi-analytical algorithm.
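The RMS1 comparisons above can be illustrated with a small sketch. The abstract does not define RMS1 exactly, so the common ocean-colour convention of an RMS of log10(retrieved/measured) ratios is assumed here; the chlorophyll values are hypothetical, not from the study's data set.

```python
import math

# RMS error of log10(retrieved/measured) ratios -- a common ocean-colour
# convention, assumed here since the abstract does not define RMS1 exactly.
def rms_log10(retrieved, measured):
    resid = [math.log10(r / m) for r, m in zip(retrieved, measured)]
    return math.sqrt(sum(x * x for x in resid) / len(resid))

retrieved = [0.10, 0.55, 1.9]   # algorithm output (hypothetical, mg m^-3)
measured = [0.12, 0.50, 2.0]    # matching in-situ values (hypothetical)
print(round(rms_log10(retrieved, measured), 3))   # 0.053
```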
Climate Analytics as a Service
NASA Technical Reports Server (NTRS)
Schnase, John L.; Duffy, Daniel Q.; McInerney, Mark A.; Webster, W. Phillip; Lee, Tsengdar J.
2014-01-01
Climate science is a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). CAaaS combines high-performance computing and data-proximal analytics with scalable data management, cloud computing virtualization, the notion of adaptive analytics, and a domain-harmonized API to improve the accessibility and usability of large collections of climate data. MERRA Analytic Services (MERRA/AS) provides an example of CAaaS. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of key climate variables. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, CAaaS is providing the agility required to meet our customers' increasing and changing data management and data analysis needs.
ERIC Educational Resources Information Center
Gilliam, Walter S.; Zigler, Edward F.
2000-01-01
Presents a meta-analytic review of evaluations of state-funded preschool programs over 20 years. Identifies several methodological flaws but also suggests that pattern of findings offers modest support for positive impact in improving children's developmental competence, improving later school attendance and performance, and reducing subsequent…
Iqbal, Sahar; Mustansar, Tazeen
2017-03-01
Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. In clinical laboratories, sigma metric analysis is used to assess the performance of the laboratory process system. The sigma metric is also used as a quality management strategy for a laboratory process, improving quality by addressing errors after they are identified. The aim of this study is to evaluate the errors in quality control of the analytical phase of the laboratory system by sigma metrics. For this purpose, sigma metric analysis was done for analytes using internal and external quality control as quality indicators, and the results were used to identify gaps and the need for modification in the laboratory quality control strategy. The sigma metric was calculated for the quality control program of ten clinical chemistry analytes, including glucose, chloride, cholesterol, triglyceride, HDL, albumin, direct bilirubin, total bilirubin, protein and creatinine, at two control levels. To calculate the sigma metric, imprecision and bias were calculated from internal and external quality control data, respectively. The minimum acceptable performance was considered to be 3 sigma. Westgard sigma rules were applied to customize the quality control procedure. The sigma level was found acceptable (≥3) for glucose (L2), cholesterol, triglyceride, HDL, direct bilirubin and creatinine at both levels of control. For the rest of the analytes the sigma metric was <3. The lowest sigma value was found for chloride (1.1) at L2 and the highest for creatinine (10.1) at L3. HDL had the highest sigma values at both control levels (8.8 and 8.0 at L2 and L3, respectively). We conclude that analytes with a sigma value <3 require strict monitoring and modification of the quality control procedure. In this study, application of the Westgard sigma rules provided a practical solution for an improved and focused design of the QC procedure.
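As a sketch of the computation both sigma-metric studies describe, the standard formula sigma = (TEa − |bias|) / CV can be written directly. The TEa, bias, and CV values below are hypothetical, not taken from either study:

```python
# Sigma-metric sketch: sigma = (TEa - |bias|) / CV, all values in percent.
# The TEa, bias, and CV numbers here are hypothetical examples.
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma level from total allowable error, EQAS bias, and IQC imprecision."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# e.g. an analyte with TEa = 10%, bias = 2%, CV = 2% sits at 4 sigma,
# above the 3-sigma minimum acceptable performance used in the study:
print(sigma_metric(10.0, 2.0, 2.0))   # 4.0
```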
Analytical and experimental evaluation of an aluminum bridge deck panel. Part 2, failure analysis.
DOT National Transportation Integrated Search
1999-01-01
Aluminum bridge decks may prove to be an alternative to concrete decks for improving the performance of structural bridge systems. Combining excellent corrosion resistance with extremely low density, aluminum decks can prolong surface life, facilitat...
Teaching for Successful Intelligence Raises School Achievement.
ERIC Educational Resources Information Center
Sternberg, Robert J.; Torff, Bruce; Grigorenko, Elena
1998-01-01
A "successful intelligence" intervention improved school achievement for a group of 225 ethnically diverse third-graders, both on performance assessments measuring analytical, creative, and practical achievements and on conventional multiple-choice memory assessments. Teaching for triarchic thinking facilitates factual recall, because learning…
Estimating and Enhancing Public Transit Accessibility for People with Mobility Limitations
DOT National Transportation Integrated Search
2017-06-30
This two-part study employs fine-scale performance measures and analytical techniques designed to evaluate and improve transit services for people experiencing disability. Part one puts forth a series of time-sensitive, general transit feed system (G...
Convolved substructure: analytically decorrelating jet substructure observables
NASA Astrophysics Data System (ADS)
Moult, Ian; Nachman, Benjamin; Neill, Duff
2018-05-01
A number of recent applications of jet substructure, in particular searches for light new particles, require substructure observables that are decorrelated with the jet mass. In this paper we introduce the Convolved SubStructure (CSS) approach, which uses a theoretical understanding of the observable to decorrelate the complete shape of its distribution. This decorrelation is performed by convolution with a shape function whose parameters and mass dependence are derived analytically. We consider in detail the case of the D2 observable and perform an illustrative case study using a search for a light hadronically decaying Z'. We find that the CSS approach completely decorrelates the D2 observable over a wide range of masses. Our approach highlights the importance of improving the theoretical understanding of jet substructure observables to exploit increasingly subtle features for performance.
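The decorrelation step described above rests on an ordinary convolution of the observable's distribution with a normalized shape function. The toy numerical sketch below only illustrates the convolution mechanics; the actual CSS shape function and its analytically derived mass dependence are in the paper and are not reproduced here.

```python
# Toy discrete convolution of a binned distribution with a normalized shape
# function. The grid, the bump, and the shape values are arbitrary choices;
# they stand in for the D2 distribution and the CSS shape function.
DX = 0.1
N = 60

dist = [0.0] * N
for i in range(5, 16):                  # a bump well inside the grid
    dist[i] = 1.0
norm = sum(dist) * DX
dist = [d / norm for d in dist]         # normalize: integrates to 1

shape = [0.5, 1.5, 3.0, 1.5, 0.5]       # toy shape function
s = sum(shape) * DX
shape = [v / s for v in shape]          # normalize: integrates to 1

def convolve(f, g, dx):
    """Discrete convolution (f * g) on the same grid as f."""
    out = [0.0] * len(f)
    for i in range(len(f)):
        for j in range(len(g)):
            if i - j >= 0:
                out[i] += f[i - j] * g[j] * dx
    return out

smeared = convolve(dist, shape, DX)
# Convolving with a unit-normalized shape function preserves normalization:
print(round(sum(smeared) * DX, 6))      # 1.0
```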
Contextual Facilitators of and Barriers to Nursing Home Pressure Ulcer Prevention.
Hartmann, Christine W; Solomon, Jeffrey; Palmer, Jennifer A; Lukas, Carol VanDeusen
2016-05-01
To present findings of a study of institutional factors related to pressure ulcer (PrU) prevention in Veterans Health Administration nursing homes. This continuing education activity is intended for physicians and nurses with an interest in skin and wound care. After participating in this educational activity, the participant should be better able to: 1. Identify the study's design, process, and purpose. 2. List the factors pertaining to sites with improving performance. Important gaps exist in the knowledge of how to achieve successful, sustained prevention of pressure ulcers (PrUs) in nursing homes. This study aimed to address those gaps by comparing nursing leadership and indirect care staff members' impressions about the context of PrU prevention in facilities with improving and declining PrU rates. The study was conducted in a sample of 6 Veterans Health Administration nursing homes (known as community living centers) purposively selected to represent a range of PrU care performance. One-time 30-minute semistructured interviews with 23 community living center staff were conducted. Qualitative interview data were analyzed using an analytic framework containing (a) a priori analytic constructs based on the study's conceptual framework and (b) sections for emerging constructs. Analysis revealed 6 key concepts differentiating sites with improving and declining PrU care performance. These concepts were (1) structures through which the change effort is initiated; (2) organizational prioritization, alignment, and support; (3) improvement culture; (4) clarity of roles and responsibilities; (5) communication strategies; and (6) staffing and clinical practices. Results also pointed to potential contextual facilitators of and barriers to successful PrU prevention.
Leadership's visible prioritization of and support for PrU prevention and the initiation of PrU prevention activities through formal structures were the most striking components represented at sites with improving performance, but not at ones where performance declined. Sites with improving performance were more likely to align frontline staff and leadership goals for PrU prevention.
Eckfeldt, J H; Copeland, K R
1993-04-01
Proficiency testing using stabilized control materials has been used for decades as a means of monitoring and improving performance in the clinical laboratory. Often, the commonly used proficiency testing materials exhibit "matrix effects" that cause them to behave differently from fresh human specimens in certain clinical analytic systems. Because proficiency testing is the primary method in which regulatory agencies have chosen to evaluate clinical laboratory performance, the College of American Pathologists (CAP) has proposed guidelines for investigating the influence of matrix effects on their Survey results. The purpose of this investigation was to determine the feasibility, usefulness, and potential problems associated with this CAP Matrix Effect Analytical Protocol, in which fresh patient specimens and CAP proficiency specimens are analyzed simultaneously by a field method and a definitive, reference, or other comparative method. The optimal outcome would be that both the fresh human and CAP Survey specimens agree closely with the comparative method result. However, this was not always the case. Using several different analytic configurations, we were able to demonstrate matrix and calibration biases for several of the analytes investigated.
Errors in the Extra-Analytical Phases of Clinical Chemistry Laboratory Testing.
Zemlin, Annalise E
2018-04-01
The total testing process consists of various phases from the pre-preanalytical to the post-postanalytical phase, the so-called brain-to-brain loop. With improvements in analytical techniques and efficient quality control programmes, most laboratory errors now occur in the extra-analytical phases. There has been recent interest in these errors, with numerous publications highlighting their effect on service delivery, patient care and cost. This interest has led to the formation of various working groups whose mission is to develop standardized quality indicators which can be used to measure the service performance of these phases. This will eventually lead to the development of external quality assessment schemes to monitor these phases in agreement with ISO 15189:2012 recommendations. This review focuses on potential errors in the extra-analytical phases of clinical chemistry laboratory testing, some of the studies performed to assess the severity and impact of these errors, and processes that are in place to address these errors. The aim of this review is to highlight the importance of these errors for the requesting clinician.
Signal Enhancement in HPLC/Micro-Coil NMR Using Automated Column Trapping
Djukovic, Danijel; Liu, Shuhui; Henry, Ian; Tobias, Brian; Raftery, Daniel
2008-01-01
A new HPLC-NMR system is described that performs analytical separation, pre-concentration, and NMR spectroscopy in rapid succession. The central component of our method is the online pre-concentration sequence that improves the match between post-column analyte peak volume and the micro-coil NMR detection volume. Separated samples are collected onto a C18 guard column with a mobile phase composed of 90% D2O/10% acetonitrile-D3, and back-flushed to the NMR micro-coil probe with 90% acetonitrile-D3/10% D2O. In order to assess the performance of our unit, we separated a standard mixture of 1 mM ibuprofen, naproxen, and phenylbutazone using a commercially available C18 analytical column. The S/N measurements from the NMR acquisitions indicated that we achieved signal enhancement factors up to 10.4 (±1.2)-fold. Furthermore, we observed that pre-concentration factors increased as the injected amount of analyte decreased. The highest concentration enrichment of 14.7 (±2.2)-fold was attained injecting a 100 μL solution of 0.2 mM (~4 μg) ibuprofen. PMID:17037915
Coetzee, L M; Cassim, N; Glencross, D K
2015-12-16
The CD4 integrated service delivery model (ITSDM) provides for reasonable access to pathology services across South Africa (SA) by offering three new service tiers that extend services into remote, under-serviced areas. ITSDM identified Pixley ka Seme as such an under-serviced district. To address the poor service delivery in this area, a new ITSDM community (tier 3) laboratory was established in De Aar, SA. Laboratory performance and turnaround time (TAT) were monitored post implementation to assess the impact on local service delivery. Using the National Health Laboratory Service Corporate Data Warehouse, CD4 data were extracted for the period April 2012-July 2013 (n=11,964). Total mean TAT (in hours) was calculated and its pre-analytical and analytical components assessed. Ongoing testing volumes, as well as external quality assessment performance across ten trials, were used to indicate post-implementation success. Data were analysed using Stata 12. Prior to the implementation of CD4 testing at De Aar, the total mean TAT was 20.5 hours. This fell to 8.2 hours post implementation, predominantly because the pre-analytical mean TAT fell from 18.9 to 1.8 hours. The analytical testing TAT remained unchanged after implementation, and monthly test volumes increased by up to 20%. External quality assessment indicated adequate performance. Although subjective, questionnaire responses from facilities reported improved service delivery. Establishing CD4 testing in a remote community laboratory substantially reduces overall TAT. Additional community CD4 laboratories should be established in under-serviced areas, especially where laboratory infrastructure is already in place.
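The TAT decomposition used in the study can be sketched as follows. The record layout, field names, and timestamps are hypothetical illustrations, not the NHLS Corporate Data Warehouse schema:

```python
from datetime import datetime

# Sketch of the TAT decomposition: pre-analytical (collection -> receipt) plus
# analytical (receipt -> result). Timestamps and field names are hypothetical.
records = [
    {"collect": datetime(2013, 7, 1, 8, 0),
     "receive": datetime(2013, 7, 1, 9, 30),
     "result": datetime(2013, 7, 1, 16, 0)},
    {"collect": datetime(2013, 7, 1, 10, 0),
     "receive": datetime(2013, 7, 1, 11, 0),
     "result": datetime(2013, 7, 1, 18, 30)},
]

def mean_hours(rows, start, end):
    """Mean elapsed hours between two timestamps across all records."""
    total = sum((r[end] - r[start]).total_seconds() / 3600 for r in rows)
    return total / len(rows)

pre_tat = mean_hours(records, "collect", "receive")
analytical_tat = mean_hours(records, "receive", "result")
total_tat = mean_hours(records, "collect", "result")
print(pre_tat, analytical_tat, total_tat)   # 1.25 7.0 8.25
```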
Huang, Qiong; Bu, Tong; Zhang, Wentao; Yan, Lingzhi; Zhang, Mengyue; Yang, Qingfeng; Huang, Lunjie; Yang, Baowei; Hu, Na; Suo, Yourui; Wang, Jianlong; Zhang, Daohong
2018-10-01
Immunochromatographic assays (ICAs) are most frequently used for on-site rapid screening of clenbuterol. To improve sensitivity, a novel probe with bacteria as signal carriers was developed. Bacteria can load a great deal of gold nanoparticles (AuNPs) on their surface, meaning much fewer antibodies are needed to produce clearly visible results, although low concentrations of antibody could also trigger fierce competition between free analyte and the immobilized antigen. Thus, a limited number of antibodies was key to significantly improved sensitivity. Analytical conditions, including bacterial species, coupling method, and concentration, were optimized. The visual detection limit (VDL) for clenbuterol was 0.1 ng/mL, a 20-fold improvement in sensitivity compared with traditional strips. This work has opened up a new route for signal amplification and improved performance of ICAs. Furthermore, inactivated bacteria could also be environment-friendly and robust signal carriers for other biosensors. Copyright © 2018 Elsevier Ltd. All rights reserved.
Pitsiladis, Yannis P; Durussel, Jérôme; Rabin, Olivier
2014-05-01
Administration of recombinant human erythropoietin (rHumanEPO) improves sporting performance and hence, although prohibited by WADA, is frequently abused by athletes. Approaches to detect rHumanEPO doping have improved significantly in recent years but remain imperfect. A new transcriptomic-based longitudinal screening approach is being developed that has the potential to improve the analytical performance of current detection methods. In particular, studies are being funded by WADA to identify a 'molecular signature' of rHumanEPO doping, and preliminary results are promising. In the first systematic study to be conducted, the expression of hundreds of genes was found to be altered by rHumanEPO, with numerous gene transcripts differentially expressed after the first injection and further transcripts profoundly upregulated during and subsequently downregulated up to 4 weeks postadministration of the drug; the same transcriptomic pattern was observed in all participants. The identification of a blood 'molecular signature' of rHumanEPO administration is the strongest evidence to date that gene biomarkers have the potential to substantially improve the analytical performance of current antidoping methods such as the Athlete Biological Passport for rHumanEPO detection. Given the early promise of transcriptomics, research using an 'omics'-based approach involving genomics, transcriptomics, proteomics and metabolomics should be intensified in order to achieve improved detection of rHumanEPO and other doping substances and methods that are difficult to detect, such as recombinant human growth hormone and blood transfusions.
The analytical validation of the Oncotype DX Recurrence Score assay
Baehner, Frederick L
2016-01-01
In vitro diagnostic multivariate index assays are highly complex molecular assays that can provide clinically actionable information regarding the underlying tumour biology and facilitate personalised treatment. These assays are only useful in clinical practice if all of the following are established: analytical validation (i.e., how accurately/reliably the assay measures the molecular characteristics), clinical validation (i.e., how consistently/accurately the test detects/predicts the outcomes of interest), and clinical utility (i.e., how likely the test is to significantly improve patient outcomes). In considering the use of these assays, clinicians often focus primarily on the clinical validity/utility; however, the analytical validity of an assay (e.g., its accuracy, reproducibility, and standardisation) should also be evaluated and carefully considered. This review focuses on the rigorous analytical validation and performance of the Oncotype DX® Breast Cancer Assay, which is performed at the Central Clinical Reference Laboratory of Genomic Health, Inc. The assay process includes tumour tissue enrichment (if needed), RNA extraction, gene expression quantitation (using a gene panel consisting of 16 cancer genes plus 5 reference genes and quantitative real-time RT-PCR), and an automated computer algorithm to produce a Recurrence Score® result (scale: 0–100). This review presents evidence showing that the Recurrence Score result reported for each patient falls within a tight clinically relevant confidence interval. Specifically, the review discusses how the development of the assay was designed to optimise assay performance, presents data supporting its analytical validity, and describes the quality control and assurance programmes that ensure optimal test performance over time. PMID:27729940
Tank 241-B-108, cores 172 and 173 analytical results for the final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nuzum, J.L., Fluoro Daniel Hanford
1997-03-04
The Data Summary Table (Table 3) included in this report compiles analytical results in compliance with all applicable DQOs. Liquid subsamples that were prepared for analysis by an acid adjustment of the direct subsample are indicated by a `D` in the A column of Table 3. Solid subsamples that were prepared for analysis by performing a fusion digest are indicated by an `F` in the A column of Table 3. Solid subsamples that were prepared for analysis by performing a water digest are indicated by a `W` or an `I` in the A column of Table 3. Due to poor precision and accuracy in the original analysis of both Lower Half Segment 2 of Core 173 and the core composite of Core 173, fusion and water digests were performed a second time. Precision and accuracy improved with the repreparation of the Core 173 composite. Analyses from the repreparation of Lower Half Segment 2 of Core 173 did not show improvement and suggest sample heterogeneity. Results from both preparations are included in Table 3.
A splay tree-based approach for efficient resource location in P2P networks.
Zhou, Wei; Tan, Zilong; Yao, Shaowen; Wang, Shipu
2014-01-01
Resource location in structured P2P systems has a critical influence on system performance. Existing analytical studies of the Chord protocol have shown some potential improvements in performance. In this paper a splay-tree-based Chord structure called SChord is proposed to improve the efficiency of locating resources. We consider a novel implementation of the Chord finger table (routing table) based on the splay tree; this approach extends the Chord finger table with additional routing entries. An adaptive routing algorithm is proposed for the implementation, and it can be shown that hop count is significantly reduced without introducing any other protocol overhead. We analyze the hop count of the adaptive routing algorithm, as compared to Chord variants, and demonstrate sharp upper and lower bounds for both worst-case and average-case settings. In addition, we theoretically analyze the hop reduction in SChord and show that SChord can significantly reduce routing hops compared to Chord. Several simulations are presented to evaluate the performance of the algorithm and support our analytical findings. The simulation results show the efficiency of SChord.
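As context for the hop-count analysis, a minimal sketch of plain Chord greedy finger-table routing (the baseline that SChord improves on) is shown below. SChord's splay-tree finger table itself is not reproduced, and the ring parameters, node spacing, and keys are arbitrary choices:

```python
# Minimal sketch of plain Chord greedy finger-table routing (the baseline that
# SChord improves on). Ring size, node spacing, and keys are arbitrary choices.
M = 10                             # identifier bits
RING = 2 ** M                      # 1024 identifiers
nodes = list(range(0, RING, 8))    # 128 evenly spaced nodes

def successor(ident):
    """First node clockwise from ident."""
    ident %= RING
    for n in nodes:
        if n >= ident:
            return n
    return nodes[0]                # wrap around the ring

def in_interval(x, a, b):
    """True if x lies strictly in (a, b) going clockwise on the ring."""
    if a < b:
        return a < x < b
    return x > a or x < b

def lookup_hops(start, key):
    """Hops taken to reach the node responsible for key."""
    cur, hops = start, 0
    while True:
        succ = successor(cur + 1)
        if key == succ or in_interval(key, cur, succ):
            return hops + 1        # final hop to the responsible node
        nxt = cur
        for k in range(M - 1, -1, -1):   # closest preceding finger
            f = successor(cur + 2 ** k)
            if in_interval(f, cur, key):
                nxt = f
                break
        if nxt == cur:
            nxt = succ             # fall back to the immediate successor
        cur, hops = nxt, hops + 1

print(lookup_hops(0, 517))   # 2 hops on this ring
```

With fingers at power-of-two offsets, lookups finish in O(log N) hops; SChord's splay-tree structure adds routing entries to reduce this further.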
Analytic investigation of helicopter rotor blade appended aeroelastic devices
NASA Technical Reports Server (NTRS)
Bielawa, Richard L.
1984-01-01
Analytic evaluations of four different passive aeroelastic devices appended to helicopter rotor blades are presented. The devices consist of a passive tuned tab, a control coupled tab, an all-flying tip and a harmonic dilational airfoil tip. Each device was conceived for improving either aerodynamic performance or reducing vibratory control loads or hub shears. The evaluation was performed using a comprehensive rotor aeroelastic analysis (the G400PA code with appropriate modifications), together with data for a realistic helicopter rotor blade (the UH-60A Blackhawk), in high speed flight (90 m/s, 175 kts). The results of this study show that significant performance (L/D_e) gains can be achieved with the all-flying free tip. Results from the harmonic dilational airfoil tip show the potential for moderate improvements in L/D_e. Finally, the results for the passive tuned tab and the control coupled tab, as configured for this study, show these devices to be impractical. Sections are included which describe the operation of each device, the required G400PA modifications, and the detailed results obtained for each device.
Data and Analytics to Inform Energy Retrofit of High Performance Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Tianzhen; Yang, Le; Hill, David
Buildings consume more than one-third of the world's primary energy. Reducing energy use in buildings with energy-efficient technologies is feasible and is also driven by energy policies such as energy benchmarking, disclosure, rating, and labeling in both developed and developing countries. Current energy retrofits focus on the existing building stock, especially older buildings, but the growing number of new high performance buildings built around the world raises the question of how these buildings perform and whether there are retrofit opportunities to further reduce their energy use. This is a new and unique problem for the building industry. Traditional energy audit or analysis methods are inadequate to look deeply into the energy use of high performance buildings. This study aims to tackle this problem with a new holistic approach powered by building performance data and analytics. First, three types of measured data are introduced: time-series energy use, building system operating conditions, and indoor and outdoor environmental parameters. An energy data model based on ISO Standard 12655 is used to represent building energy use in a three-level hierarchy. Secondly, a suite of analytics is proposed to analyze energy use and to identify retrofit measures for high performance buildings. The data-driven analytics are based on data monitored at short time intervals and cover three levels of analysis: energy profiling, benchmarking and diagnostics.
Thirdly, the analytics were applied to a high performance building in California to analyze its energy use and identify retrofit opportunities, including: (1) analyzing patterns of major energy end-use categories at various time scales, (2) benchmarking the whole-building total energy use as well as major end-uses against its peers, (3) benchmarking the power usage effectiveness for the data center, which is the largest electricity consumer in this building, and (4) diagnosing HVAC equipment using detailed time-series operating data. Finally, a few energy efficiency measures were identified for retrofit, and their energy savings were estimated at 20% of the whole-building electricity consumption. Based on the analyses, the building manager took steps to improve the operation of fans, chillers, and data centers, which will lead to actual energy savings. This study demonstrated that there are energy retrofit opportunities for high performance buildings and that detailed measured building performance data and analytics can help identify and estimate energy savings and inform decision making during the retrofit process. Challenges of data collection and analytics were also discussed to shape best practice for retrofitting high performance buildings.
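The energy-profiling and benchmarking levels described above can be sketched minimally. The end-use names, kWh figures, and the power-usage-effectiveness inputs below are hypothetical, not the study's data:

```python
# Minimal sketch of the "energy profiling" and "benchmarking" analytics levels.
# End-use names and kWh values are hypothetical, not the study's data.
from collections import defaultdict

readings = [  # (end_use, hour_of_day, kWh) from interval meters
    ("hvac", 9, 120.0), ("hvac", 9, 110.0),
    ("lighting", 9, 40.0), ("lighting", 21, 12.0),
    ("hvac", 21, 35.0),
]

def average_profile(rows):
    """Mean load per (end use, hour of day) -- a basic daily profile."""
    sums, counts = defaultdict(float), defaultdict(int)
    for use, hour, kwh in rows:
        sums[(use, hour)] += kwh
        counts[(use, hour)] += 1
    return {k: sums[k] / counts[k] for k in sums}

profile = average_profile(readings)
print(profile[("hvac", 9)])          # 115.0

# Benchmarking the data center: power usage effectiveness is total facility
# energy over IT energy (hypothetical 480 kWh facility / 300 kWh IT).
pue = 480.0 / 300.0
print(round(pue, 2))                 # 1.6
```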
Svagera, Zdeněk; Hanzlíková, Dagmar; Simek, Petr; Hušek, Petr
2012-03-01
Four disulfide-reducing agents, dithiothreitol (DTT), 2,3-dimercaptopropanesulfonate (DMPS), and the newly tested 2-mercaptoethanesulfonate (MESNA) and Tris(hydroxypropyl)phosphine (THP), were investigated in detail for release of sulfur amino acids in human plasma. After protein precipitation with trichloroacetic acid (TCA), the plasma supernatant was treated with methyl, ethyl, or propyl chloroformate via the well-proven derivatization-extraction technique and the products were subjected to gas chromatographic-mass spectrometric (GC-MS) analysis. All the tested agents proved to be rapid and effective reducing agents for the assay of plasma thiols. When compared with DTT, the novel reducing agents DMPS, MESNA, and THP provided much cleaner extracts and improved analytical performance. Quantification of homocysteine, cysteine, and methionine was performed using their deuterated analogues, whereas other analytes were quantified by means of 4-chlorophenylalanine. Precise and reliable assay of all examined analytes was achieved, irrespective of the chloroformate reagent used. Average relative standard deviations at each analyte level were ≤6%, quantification limits were 0.1-0.2 μmol L(-1), recoveries were 94-121%, and linearity was over three orders of magnitude (r(2) equal to 0.997-0.998). Validation performed with the THP agent and propyl chloroformate derivatization demonstrated the robustness and reliability of this simple sample-preparation methodology.
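Quantification against deuterated internal standards, as described above, follows the usual peak-area-ratio arithmetic. The peak areas, spike concentration, and response factor below are hypothetical numbers for illustration, not values from the study:

```python
# Internal-standard quantification sketch: concentration from the ratio of the
# analyte peak area to its deuterated internal-standard peak area. All numbers
# (areas, spike concentration, response factor) are hypothetical.
def quantify(area_analyte, area_istd, conc_istd_umol_l, rrf=1.0):
    """Analyte concentration via the peak-area ratio and a relative response factor."""
    return (area_analyte / area_istd) * conc_istd_umol_l / rrf

# e.g. homocysteine against a deuterated analogue spiked at 10 umol/L
conc = quantify(5200.0, 4800.0, 10.0)
print(round(conc, 2))   # 10.83 umol/L
```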
Automated Deployment of Advanced Controls and Analytics in Buildings
NASA Astrophysics Data System (ADS)
Pritoni, Marco
Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies; larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings. In the first phase, "Site Preparation and Interface with Legacy Systems," I used models to discover or map relationships among building components, automatically gathering the metadata (information about data points) necessary to run the applications. During the second phase, "Application Deployment and Commissioning," models automatically learn the system parameters used for advanced controls and analytics. In the third phase, "Continuous Monitoring and Verification," I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
Performance analysis and improvement of WPAN MAC for home networks.
Mehta, Saurabh; Kwak, Kyung Sup
2010-01-01
The wireless personal area network (WPAN) is an emerging wireless technology for future short-range indoor and outdoor communication applications. The IEEE 802.15.3 medium access control (MAC) is proposed to coordinate access to the wireless medium among competing devices, especially for short-range, high-data-rate applications in home networks. In this paper we use analytical modeling to study the performance of the WPAN (IEEE 802.15.3) MAC in terms of throughput, efficient bandwidth utilization, and delay with various ACK policies under error-prone channel conditions. This allows us to introduce a K-Dly-ACK-AGG policy, a payload size adjustment mechanism, and an Improved Backoff algorithm to improve the performance of the WPAN MAC. Performance evaluation results demonstrate the impact of our improvements on network capacity. Moreover, these results can be very useful to WPAN application designers and protocol architects to easily and correctly implement WPAN for home networking.
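The gain from acknowledging K frames at once (the Dly-ACK family of policies) can be illustrated with a toy channel-time model; the timing constants below are hypothetical placeholders, not IEEE 802.15.3 values, and the model ignores retries and channel errors that the paper's analysis covers:

```python
def dly_ack_efficiency(k: int, t_payload: float, t_header: float,
                       t_ack: float, t_sifs: float) -> float:
    """Fraction of channel time carrying payload when a single ACK covers k
    frames (highly simplified: no retransmissions, backoff, or bit errors)."""
    useful = k * t_payload
    total = k * (t_payload + t_header + t_sifs) + t_ack + t_sifs
    return useful / total

# Illustrative timings in microseconds: efficiency grows with the ACK span k
for k in (1, 4, 8):
    print(k, round(dly_ack_efficiency(k, t_payload=800, t_header=50, t_ack=30, t_sifs=10), 3))
```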
Analytic network process model for sustainable lean and green manufacturing performance indicator
NASA Astrophysics Data System (ADS)
Aminuddin, Adam Shariff Adli; Nawawi, Mohd Kamal Mohd; Mohamed, Nik Mohd Zuki Nik
2014-09-01
Sustainable manufacturing is regarded as the most complex manufacturing paradigm to date, as it holds the widest scope of requirements. In addition, its three major pillars (economic, environmental, and social), though distinct, overlap in some of their elements. Even though the concept of sustainability is not new, the development of the performance indicator still needs much improvement due to its multifaceted nature, which requires an integrated approach to solve the problem. This paper proposed the best combination of criteria for forming a robust sustainable manufacturing performance indicator via the Analytic Network Process (ANP). The integrated lean, green, and sustainable ANP model can be used to comprehend the complex decision system of sustainability assessment. The findings show that green manufacturing is more sustainable than lean manufacturing. They also illustrate that procurement practice is the most important criterion in the sustainable manufacturing performance indicator.
Potential benefits of propulsion and flight control integration for supersonic cruise vehicles
NASA Technical Reports Server (NTRS)
Berry, D. T.; Schweikhard, W. G.
1976-01-01
Typical airframe/propulsion interactions such as Mach/altitude excursions and inlet unstarts are reviewed. The improvements in airplane performance and flight control that can be achieved by improving the interfaces between propulsion and flight control are estimated. A research program to determine the feasibility of integrating propulsion and flight control is described. This program includes analytical studies and YF-12 flight tests.
Hunt, R.J.; Anderson, M.P.; Kelson, V.A.
1998-01-01
This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.
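The core of the analytic element method is superposition of closed-form solutions; a minimal sketch for a confined aquifer with uniform flow and discharge wells (all parameter values below are hypothetical, not from the study's models):

```python
import math

def head(x: float, y: float, wells, transmissivity: float,
         h_ref: float, qx: float = 0.0) -> float:
    """Steady two-dimensional head by superposing analytic elements:
    a uniform-flow potential plus Thiem well solutions. Each well is
    (xw, yw, Q, r_ref); Q > 0 extracts water, and r_ref is the distance
    at which that well's contribution vanishes."""
    phi = -qx * x  # uniform-flow discharge potential
    for xw, yw, q, r_ref in wells:
        r = math.hypot(x - xw, y - yw)
        phi += q / (2.0 * math.pi) * math.log(r / r_ref)
    return h_ref + phi / transmissivity

# One pumping well at the origin: head declines toward it
well = [(0.0, 0.0, 400.0, 1000.0)]
far = head(500.0, 0.0, well, transmissivity=100.0, h_ref=20.0)
near = head(50.0, 0.0, well, transmissivity=100.0, h_ref=20.0)
print(round(far, 2), round(near, 2))  # near < far < 20.0
```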
Sciacovelli, Laura; Panteghini, Mauro; Lippi, Giuseppe; Sumarac, Zorica; Cadamuro, Janne; Galoro, César Alex De Olivera; Pino Castro, Isabel Garcia Del; Shcolnik, Wilson; Plebani, Mario
2017-08-28
Improving the quality of laboratory testing requires a deep understanding of the many vulnerable steps involved in the total examination process (TEP), along with the identification of a hierarchy of risks and challenges that need to be addressed. From this perspective, the Working Group "Laboratory Errors and Patient Safety" (WG-LEPS) of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) is focusing its activity on implementation of an efficient tool for obtaining meaningful information on the risk of errors developing throughout the TEP, and for establishing reliable information about error frequencies and their distribution. More recently, the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) has created the Task and Finish Group "Performance specifications for the extra-analytical phases" (TFG-PSEP) for defining performance specifications for extra-analytical phases. Both the IFCC and EFLM groups are working to provide laboratories with a system to evaluate their performance and recognize the critical aspects where improvement actions are needed. A Consensus Conference was organized in Padova, Italy, in 2016 in order to bring together all the experts and interested parties to achieve a consensus for effective harmonization of quality indicators (QIs). A general agreement was achieved, and the main outcomes have been the release of a new version of the model of quality indicators (MQI), the approval of a criterion for establishing performance specifications, and the definition of the type of information that should be provided within the report to the clinical laboratories participating in the QIs project.
NASA Astrophysics Data System (ADS)
Askari, Davood
The theoretical objectives and accomplishments of this work are the analytical and numerical investigation of material properties and mechanical behavior of carbon nanotubes (CNTs) and nanotube nanocomposites when they are subjected to various loading conditions. First, the finite element method is employed to investigate numerically the effective Young's modulus and Poisson's ratio of a single-walled CNT. Next, the effects of chirality on the effective Young's modulus and Poisson's ratio are investigated, and then variations of the effective coefficients of thermal expansion and effective thermal conductivities are studied for CNTs with different structural configurations. To study the influence of small vacancy defects on the mechanical properties of CNTs, finite element analyses are performed and the behavior of CNTs with various structural configurations having different types of vacancy defects is studied. It is frequently reported that nano-materials are excellent candidates as reinforcements to change or enhance the material properties of polymers and their nanocomposites. Second, the inclusion of nano-materials can considerably improve the electrical, thermal, and mechanical properties of the bonding agent, i.e., the resin. Note that materials at the atomic and molecular level do not usually show isotropic behavior; rather, they have orthotropic properties. Therefore, two-phase and three-phase cylindrically orthotropic composite models consisting of different constituents with orthotropic properties are developed and introduced in this work to analytically predict the effective mechanical properties and mechanical behavior of such structures when they are subjected to various external loading conditions. To verify the analytically obtained exact solutions, finite element analyses of identical cylindrical structures are also performed; the results are compared with those obtained analytically, and excellent agreement is achieved.
The third part of this dissertation investigates the growth of vertically aligned, long, high-density arrays of CNTs and novel 3-D carbon nanotube nano-forests. A chemical vapor deposition technique is used to grow radially aligned CNTs on various types of fibrous materials, such as silicon carbide, carbon, Kevlar, and glass fibers and cloths, that can be used for the fabrication of multifunctional, high-performing laminated nanocomposite structures. Using the CNT nano-forest cloths, nanocomposite samples are prepared and tested, giving promising results for the improvement of the mechanical properties and performance of composite structures.
Ho, Sirikit; Lukacs, Zoltan; Hoffmann, Georg F; Lindner, Martin; Wetter, Thomas
2007-07-01
In newborn screening with tandem mass spectrometry, multiple intermediary metabolites are quantified in a single analytical run for the diagnosis of fatty-acid oxidation disorders, organic acidurias, and aminoacidurias. Published diagnostic criteria for these disorders normally incorporate a primary metabolic marker combined with secondary markers, often analyte ratios, for which the markers have been chosen to reflect metabolic pathway deviations. We applied a procedure to extract new markers and diagnostic criteria for newborn screening to the data of newborns with confirmed medium-chain acyl-CoA dehydrogenase deficiency (MCADD) and a control group from the newborn screening program, Heidelberg, Germany. We validated the results with external data of the screening center in Hamburg, Germany. We extracted new markers by performing a systematic search for analyte combinations (features) with high discriminatory performance for MCADD. To select feature thresholds, we applied automated procedures to separate controls and cases on the basis of the feature values. Finally, we built classifiers from these new markers to serve as diagnostic criteria in screening for MCADD. On the basis of χ² scores, we identified approximately 800 of >628,000 new analyte combinations with superior discriminatory performance compared with the best published combinations. Classifiers built with the new features achieved diagnostic sensitivities and specificities approaching 100%. Feature construction methods provide ways to disclose information hidden in the set of measured analytes. Other diagnostic tasks based on high-dimensional metabolic data might also profit from this approach.
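The χ²-based ranking of candidate analyte combinations can be sketched as a 2×2 test on a feature dichotomized at a threshold; the values below are synthetic, not screening data:

```python
def chi2_score(case_values, control_values, threshold):
    """2x2 chi-square statistic for a candidate feature (for example an
    analyte ratio) dichotomized at `threshold`; higher scores mean better
    case/control separation."""
    a = sum(v > threshold for v in case_values)      # cases above threshold
    b = len(case_values) - a                         # cases below
    c = sum(v > threshold for v in control_values)   # controls above
    d = len(control_values) - c                      # controls below
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    return 0.0 if denom == 0 else n * (a * d - b * c) ** 2 / denom

# Synthetic values of a hypothetical analyte ratio
cases = [8.1, 9.4, 7.7, 10.2]
controls = [1.2, 0.9, 1.5, 1.1, 1.3, 0.8]
print(chi2_score(cases, controls, threshold=5.0))  # 10.0 (perfect separation, chi2 = n)
```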
Analytical N beam position monitor method
NASA Astrophysics Data System (ADS)
Wegscheider, A.; Langner, A.; Tomás, R.; Franchi, A.
2017-11-01
Measurement and correction of focusing errors is of great importance for performance and machine protection of circular accelerators. Furthermore LHC needs to provide equal luminosities to the experiments ATLAS and CMS. High demands are also set on the speed of the optics commissioning, as the foreseen operation with β*-leveling on luminosity will require many operational optics. A fast measurement of the β -function around a storage ring is usually done by using the measured phase advance between three consecutive beam position monitors (BPMs). A recent extension of this established technique, called the N-BPM method, was successfully applied for optics measurements at CERN, ALBA, and ESRF. We present here an improved algorithm that uses analytical calculations for both random and systematic errors and takes into account the presence of quadrupole, sextupole, and BPM misalignments, in addition to quadrupolar field errors. This new scheme, called the analytical N-BPM method, is much faster, further improves the measurement accuracy, and is applicable to very pushed beam optics where the existing numerical N-BPM method tends to fail.
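For reference, the underlying three-BPM relation estimates β at the first monitor from the measured phase advances to the other two, normalized by the model optics; a sketch (the optics numbers are made up, and the full N-BPM method adds the systematic and random error treatment this sketch omits):

```python
from math import pi, tan

def cot(x: float) -> float:
    return 1.0 / tan(x)

def beta_three_bpm(beta_model: float, psi12_mdl: float, psi13_mdl: float,
                   phi12_meas: float, phi13_meas: float) -> float:
    """Classic three-BPM estimate of the beta function at BPM 1 from the
    measured phase advances to BPMs 2 and 3, scaled by the model values."""
    return beta_model * (cot(phi12_meas) - cot(phi13_meas)) \
                      / (cot(psi12_mdl) - cot(psi13_mdl))

# Sanity check: measured phases equal to the model recover the model beta
b = beta_three_bpm(30.0,
                   psi12_mdl=0.25 * 2 * pi, psi13_mdl=0.40 * 2 * pi,
                   phi12_meas=0.25 * 2 * pi, phi13_meas=0.40 * 2 * pi)
print(round(b, 6))  # 30.0
```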
NASA Astrophysics Data System (ADS)
Mao, Chao; Chen, Shou
2017-01-01
Because the traditional entropy value method still has low evaluation accuracy when assessing the performance of mining projects, a performance evaluation model for mineral projects founded on an improved entropy value method is proposed. First, a new weight assignment model is established, based on compatibility matrix analysis from the analytic hierarchy process (AHP) and the entropy value method: when the compatibility matrix analysis meets the consistency requirements and differences remain between the subjective and objective weights, both proportions are moderately adjusted, and a fuzzy evaluation matrix for performance evaluation is then built on this basis. Simulation experiments show that, compared with the traditional entropy and compatibility matrix analysis methods, the proposed performance evaluation model based on the improved entropy value method has higher assessment accuracy.
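The entropy value method derives objective weights from the information content of each criterion column; a minimal sketch (the decision matrix is hypothetical):

```python
import math

def entropy_weights(matrix):
    """Objective criterion weights via the entropy value method: normalize
    each column, compute its Shannon entropy, and weight criteria by their
    degree of divergence (1 - entropy). Rows are alternatives, columns criteria."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)
    divergences = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        total = sum(col)
        p = [v / total for v in col]
        e = -k * sum(pij * math.log(pij) for pij in p if pij > 0)
        divergences.append(1.0 - e)
    d_sum = sum(divergences)
    return [d / d_sum for d in divergences]

# Hypothetical matrix: 3 projects x 2 criteria; the uniform second
# criterion carries no information, so it gets (near-)zero weight
w = entropy_weights([[0.9, 0.4], [0.5, 0.4], [0.1, 0.4]])
print([round(x, 3) for x in w])
```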
Khorosheva, Eugenia M; Karymov, Mikhail A; Selck, David A; Ismagilov, Rustem F
2016-01-29
In this paper, we asked if it is possible to identify the best primers and reaction conditions based on improvements in reaction speed when optimizing isothermal reactions. We used digital single-molecule, real-time analyses of both speed and efficiency of isothermal amplification reactions, which revealed that improvements in the speed of isothermal amplification reactions did not always correlate with improvements in digital efficiency (the fraction of molecules that amplify) or with analytical sensitivity. However, we observed that the speeds of amplification for single-molecule (in a digital device) and multi-molecule (e.g. in a PCR well plate) formats always correlated for the same conditions. Also, digital efficiency correlated with the analytical sensitivity of the same reaction performed in a multi-molecule format. Our finding was supported experimentally with examples of primer design, the use or exclusion of loop primers in different combinations, and the use of different enzyme mixtures in one-step reverse-transcription loop-mediated amplification (RT-LAMP). Our results show that measuring the digital efficiency of amplification of single-template molecules allows quick, reliable comparisons of the analytical sensitivity of reactions under any two tested conditions, independent of the speeds of the isothermal amplification reactions. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
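Digital efficiency, the fraction of loaded template molecules that amplify, follows from partition counts via a Poisson correction; a sketch with illustrative numbers (not data from the study):

```python
import math

def poisson_corrected_count(positive_partitions: int, total_partitions: int) -> float:
    """Template molecules inferred from the fraction of positive partitions
    in a digital assay (corrects for multiple molecules per partition)."""
    p = positive_partitions / total_partitions
    return -math.log(1.0 - p) * total_partitions

def digital_efficiency(positive_partitions: int, total_partitions: int,
                       molecules_loaded: float) -> float:
    """Observed (Poisson-corrected) molecule count over molecules loaded."""
    return poisson_corrected_count(positive_partitions, total_partitions) / molecules_loaded

# Illustrative: 400 of 1000 partitions positive, 800 template molecules loaded
print(round(digital_efficiency(400, 1000, 800), 3))
```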
The lunar libration: comparisons between various models - a model fitted to LLR observations
NASA Astrophysics Data System (ADS)
Chapront, J.; Francou, G.
2005-09-01
We consider 4 libration models: 3 numerical models built by JPL (ephemerides for the libration in DE245, DE403 and DE405) and an analytical model improved with numerical complements fitted to recent LLR observations. The analytical solution uses 3 angular variables (ρ1, ρ2, τ) which represent the deviations with respect to Cassini's laws. After having referred the models to a unique reference frame, we study the differences between the models, which depend on gravitational and tidal parameters of the Moon, as well as amplitudes and frequencies of the free librations. It appears that the differences vary widely depending on the above quantities. They correspond to a few meters displacement on the lunar surface, bearing in mind that LLR distances are precise to the centimeter level. Taking advantage of the lunar libration theory built by Moons (1984) and improved by Chapront et al. (1999), we are able to establish 4 solutions and to represent their differences by Fourier series after a numerical substitution of the gravitational constants and free libration parameters. The results are confirmed by frequency analyses performed separately. Using DE245 as a basic reference ephemeris, we approximate the differences between the analytical and numerical models with Poisson series. The analytical solution, improved with numerical complements under the form of Poisson series, is valid over several centuries with an internal precision better than 5 centimeters.
NASA Astrophysics Data System (ADS)
Arafa, Safia; Bouchemat, Mohamed; Bouchemat, Touraya; Benmerkhi, Ahlem; Hocini, Abdesselam
2017-02-01
A biosensing platform based on an infiltrated photonic crystal ring-shaped-holes cavity-coupled waveguide system is proposed for glucose concentration detection. Considering silicon-on-insulator (SOI) technology, it has been demonstrated that the ring-shaped-holes configuration provides excellent optical confinement within the cavity region, which further enhances the light-matter interactions at the precise location of the analyte medium. Thus, the sensitivity and the quality factor (Q) can be significantly improved. The transmission characteristics of light in the biosensor under different refractive indices that correspond to the change in the analyte glucose concentration are analyzed by performing finite-difference time-domain (FDTD) simulations. Accordingly, an improved sensitivity of 462 nm/RIU and a Q factor as high as 1.11×10⁵ have been achieved, resulting in a detection limit of 3.03×10⁻⁶ RIU. Such a combination of attributes makes the designed structure a promising element for performing label-free biosensing in medical diagnosis and environmental monitoring.
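As a consistency check, the detection limit is the smallest resolvable wavelength shift divided by the bulk sensitivity, so the quoted figures imply a spectral resolution of roughly 1.4 pm:

```python
sensitivity_nm_per_riu = 462.0   # sensitivity from the abstract
detection_limit_riu = 3.03e-6    # detection limit from the abstract

# DL = resolution / sensitivity, so the implied wavelength resolution is:
resolution_nm = sensitivity_nm_per_riu * detection_limit_riu
print(round(resolution_nm * 1e3, 2), "pm")  # 1.4 pm
```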
CF6 jet engine diagnostics program. High pressure turbine roundness/clearance investigation
NASA Technical Reports Server (NTRS)
Howard, W. D.; Fasching, W. A.
1982-01-01
The effects of high pressure turbine clearance changes on engine and module performance were evaluated, in addition to the measurement of CF6-50C high pressure turbine Stage 1 tip clearance and stator out-of-roundness during steady-state and transient operation. The results indicated a good correlation of the analytical model of round-engine clearance response with measured data. The stator out-of-roundness measurements verified that the analytical technique for predicting the distortion effects of mechanical loads is accurate, whereas the technique for calculating the effects of certain circumferential thermal gradients requires some modifications. A potential for improvement in roundness was established on the order of 0.38 mm (0.015 in.), equivalent to 0.86 percent turbine efficiency, which translates to a cruise SFC improvement of 0.36 percent. The HP turbine Stage 1 tip clearance performance derivative was established as 0.44 mm (17 mils) per percent of turbine efficiency at take-off power, somewhat smaller, and therefore more sensitive, than predicted from previous investigations.
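The 0.86 percent efficiency figure follows directly from the tip-clearance derivative; a quick arithmetic check using the abstract's numbers:

```python
clearance_derivative_mm = 0.44   # mm of tip clearance per percent turbine efficiency
roundness_improvement_mm = 0.38  # potential out-of-roundness improvement

efficiency_gain_pct = roundness_improvement_mm / clearance_derivative_mm
print(round(efficiency_gain_pct, 2))  # 0.86, matching the quoted value
```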
Cuadrado-Cenzual, M A; García Briñón, M; de Gracia Hills, Y; González Estecha, M; Collado Yurrita, L; de Pedro Moro, J A; Fernández Pérez, C; Arroyo Fernández, M
2015-01-01
Errors in the identification of patients and their biological samples are among the problems carrying the highest risk of causing an adverse event for the patient. The aims were to detect and analyse the causes of patient identification errors in analytical requests (PIEAR) from emergency departments, and to develop improvement strategies. A process and protocol were designed, to be followed by all professionals involved in the requesting and performing of laboratory tests. Evaluation and monitoring indicators of PIEAR were determined, before and after the implementation of these improvement measures (years 2010-2014). A total of 316 PIEAR were detected in a total of 483,254 emergency service requests during the study period, representing a mean of 6.80/10,000 requests. Patient identification failure was the most frequent error in all the 6-monthly periods assessed, with a significant difference (P<.0001). The improvement strategies applied proved effective in detecting PIEAR as well as in preventing such errors. However, we must continue working with this strategy, promoting a culture of safety among all the professionals involved, and trying to achieve the goal that 100% of analytical requests and samples are properly identified. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
Schleyer, Anneliese M; Robinson, Ellen; Dumitru, Roxana; Taylor, Mark; Hayes, Kimberly; Pergamit, Ronald; Beingessner, Daphne M; Zaros, Mark C; Cuschieri, Joseph
2016-12-01
Hospital-acquired venous thromboembolism (HA-VTE) is a potentially preventable cause of morbidity and mortality. Despite high rates of venous thromboembolism (VTE) prophylaxis in accordance with an institutional guideline, VTE remains the most common hospital-acquired condition in our institution. To improve the safety of all hospitalized patients, examine current VTE prevention practices, identify opportunities for improvement, and decrease rates of HA-VTE. Pre/post assessment. Urban academic tertiary referral center, level 1 trauma center, safety net hospital; all patients. We formed a multidisciplinary VTE task force to review all HA-VTE events, assess prevention practices relative to evidence-based institutional guidelines, and identify improvement opportunities. The task force developed an electronic tool to facilitate efficient VTE event review and designed decision-support and reporting tools, now integrated into the electronic health record, to bring optimal VTE prevention practices to the point of care. Performance is shared transparently across the institution. Harborview benchmarks process and outcome performance, including patient safety indicators and core measures, against hospitals nationally using Hospital Compare and Vizient data. Our program has resulted in >90% guideline-adherent VTE prevention and zero preventable HA-VTEs. Initiatives have resulted in a 15% decrease in HA-VTE and a 21% reduction in postoperative VTE. Keys to success include the multidisciplinary approach, clinical roles of task force members, senior leadership support, and use of quality improvement analytics for retrospective review, prospective reporting, and performance transparency. Ongoing task force collaboration with frontline providers is critical to sustained improvements. Journal of Hospital Medicine 2016;11:S38-S43. © 2016 Society of Hospital Medicine.
Mengoli, Carlo; Springer, Jan; Bretagne, Stéphane; Cuenca-Estrella, Manuel; Klingspor, Lena; Lagrou, Katrien; Melchers, Willem J. G.; Morton, C. Oliver; Barnes, Rosemary A.; Donnelly, J. Peter; White, P. Lewis
2015-01-01
The use of serum or plasma for Aspergillus PCR testing facilitates automated and standardized technology. Recommendations for serum testing are available, and while serum and plasma are regularly considered interchangeable for use in fungal diagnostics, differences in galactomannan enzyme immunoassay (GM-EIA) performance have been reported and are attributed to clot formation. Therefore, it is important to assess plasma PCR testing to determine if previous recommendations for serum are applicable and also to compare analytical performance with that of serum PCR. Molecular methods testing serum and plasma were compared through multicenter distribution of quality control panels, with additional studies to investigate the effect of clot formation and blood fractionation on DNA availability. Analytical sensitivity and time to positivity (TTP) were compared, and a regression analysis was performed to identify variables that enhanced plasma PCR performance. When testing plasma, sample volume, preextraction-to-postextraction volume ratio, PCR volume, duplicate testing, and the use of an internal control for PCR were positively associated with performance. When whole-blood samples were spiked and then fractionated, the analytical sensitivity and TTP were superior when testing plasma. Centrifugation had no effect on DNA availability, whereas the presence of clot material significantly lowered the concentration (P = 0.028). Technically, there are no major differences in the molecular processing of serum and plasma, but the formation of clot material potentially reduces available DNA in serum. During disease, Aspergillus DNA burdens in blood are often at the limits of PCR performance. Using plasma might improve performance while maintaining the methodological simplicity of serum testing. PMID:26085614
Evaluating supplier quality performance using fuzzy analytical hierarchy process
NASA Astrophysics Data System (ADS)
Ahmad, Nazihah; Kasim, Maznah Mat; Rajoo, Shanmugam Sundram Kalimuthu
2014-12-01
Evaluating supplier quality performance is vital in ensuring continuous supply chain improvement, reducing operational costs and risks, and meeting customer expectations. This paper aims to illustrate an application of the Fuzzy Analytic Hierarchy Process to prioritize the evaluation criteria in the context of automotive manufacturing in Malaysia. Five main criteria were identified: quality, cost, delivery, customer service, and technology support. These criteria were arranged into a hierarchical structure and evaluated by an expert. The relative importance of each criterion was determined by using linguistic variables, which were represented as triangular fuzzy numbers. The Center of Gravity defuzzification method was used to convert the fuzzy evaluations into their corresponding crisp values. Such fuzzy evaluation can be used as a systematic tool to overcome the uncertainty in evaluating suppliers' performance, which is usually associated with subjective human judgments.
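The center-of-gravity step reduces each triangular fuzzy number (l, m, u) to the mean of its vertices before normalizing; a minimal sketch (the linguistic ratings below are hypothetical, not the expert's actual judgments):

```python
def defuzzify_centroid(tfn):
    """Center-of-gravity defuzzification of a triangular fuzzy number
    (l, m, u): the centroid of a triangle is the mean of its vertices."""
    l, m, u = tfn
    return (l + m + u) / 3.0

def crisp_weights(fuzzy_scores):
    """Defuzzify a list of fuzzy evaluations and normalize to crisp weights."""
    crisp = [defuzzify_centroid(t) for t in fuzzy_scores]
    total = sum(crisp)
    return [c / total for c in crisp]

# Hypothetical triangular-fuzzy ratings for (quality, cost, delivery)
weights = crisp_weights([(7, 9, 9), (3, 5, 7), (1, 3, 5)])
print([round(w, 3) for w in weights])
```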
A hybrid approach to near-optimal launch vehicle guidance
NASA Technical Reports Server (NTRS)
Leung, Martin S. K.; Calise, Anthony J.
1992-01-01
This paper evaluates a proposed hybrid analytical/numerical approach to launch-vehicle guidance for ascent to orbit injection. The feedback-guidance approach is based on a piecewise nearly analytic zero-order solution evaluated using a collocation method. The zero-order solution is then improved through a regular perturbation analysis, wherein the neglected dynamics are corrected in the first-order term. For real-time implementation, the guidance approach requires solving a set of small-dimension nonlinear algebraic equations and performing quadrature. Assessment of performance and reliability is carried out through closed-loop simulation of a vertically launched two-stage heavy-lift-capacity vehicle to a low Earth orbit. The solutions are compared with optimal solutions generated from a multiple shooting code. In the example, the guidance approach delivers over 99.9 percent of optimal performance and terminal constraint accuracy.
Efficient parallelization of analytic bond-order potentials for large-scale atomistic simulations
NASA Astrophysics Data System (ADS)
Teijeiro, C.; Hammerschmidt, T.; Drautz, R.; Sutmann, G.
2016-07-01
Analytic bond-order potentials (BOPs) provide a way to compute atomistic properties with controllable accuracy. For large-scale computations of heterogeneous compounds at the atomistic level, both the computational efficiency and memory demand of BOP implementations have to be optimized. Since the evaluation of BOPs is a local operation within a finite environment, the parallelization concepts known from short-range interacting particle simulations can be applied to improve the performance of these simulations. In this work, several efficient parallelization methods for BOPs that use three-dimensional domain decomposition schemes are described. The schemes are implemented into the bond-order potential code BOPfox, and their performance is measured in a series of benchmarks. Systems of up to several millions of atoms are simulated on a high performance computing system, and parallel scaling is demonstrated for up to thousands of processors.
Laminar flow control perforated wing panel development
NASA Technical Reports Server (NTRS)
Fischler, J. E.
1986-01-01
Many structural concepts for a wing leading-edge laminar flow control hybrid panel were analytically investigated. After many small, medium, and large tests, the selected design was verified. New analytic methods were developed for combining a porous titanium sheet bonded to a substructure of fiberglass and carbon/epoxy cloth. At the -65 and +160 F test conditions, the critical bond of the porous titanium to the composite failed at lower-than-anticipated test loads. New cure cycles, design improvements, and test improvements significantly improved the strength and reduced the deflections from thermal and lateral loadings. The wave tolerance limits for turbulence were not exceeded. The beam-column midbay deflections from combined axial and lateral loadings and thermal bowing at -65 F, room temperature, and +160 F were considered. Many lap-shear tests were performed with several cure cycles. Results indicate that sufficient verification was obtained to fabricate a demonstration vehicle.
An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Zhou, Ning
With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide capabilities needed for better decision support by utilizing high performance computing (HPC) and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.
Assessing the effect of cognitive styles with different learning modes on learning outcome.
Liao, Chechen; Chuang, Shu-Hui
2007-08-01
In this study, similarities and differences in learning outcome associated with individual differences in cognitive styles are examined using the traditional (face-to-face) and web-based learning modes. 140 undergraduate students were categorized as having analytic or holistic cognitive styles by their scores on the Style of Learning and Thinking questionnaire. Four different conditions were studied: students with an analytic cognitive style in a traditional learning mode, an analytic cognitive style in a web-based learning mode, a holistic cognitive style in a traditional learning mode, and a holistic cognitive style in a web-based learning mode. Analysis of the data shows that the analytic style in the traditional mode led to significantly higher performance and perceived satisfaction than the other conditions. Satisfaction did not differ significantly between students with an analytic style in web-based learning and those with a holistic style in traditional learning. This suggests that integrating different learning modes into the learning environment may be insufficient to improve learners' satisfaction.
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time and simplifying analytical procedures for geoscientists.
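The map/group/reduce flow the framework builds on can be sketched in miniature. The in-memory Python sketch below only illustrates the pattern; the paper's implementation runs on Hadoop-style infrastructure with HBase, not this code, and the sample observations and variable names are invented.

```python
from collections import defaultdict

def map_phase(records):
    # Map: each record (variable, value) is emitted as a key-value pair
    for var, value in records:
        yield var, value

def reduce_phase(pairs):
    # Shuffle/group: collect values by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    # Reduce: a per-variable aggregate (here, the mean)
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

obs = [("temperature", 14.2), ("temperature", 15.8), ("pressure", 1013.0)]
result = reduce_phase(map_phase(obs))  # {'temperature': 15.0, 'pressure': 1013.0}
```

In a real deployment the map and reduce functions run on many machines over partitions of the data; the sequential composition above is the single-node semantics of the same computation.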
Dowling, Geraldine; Malone, Edward; Harbison, Tom; Martin, Sheila
2010-07-01
A sensitive and selective method for the determination of six non-steroidal anti-inflammatory drugs (NSAIDs) in bovine plasma was developed, along with an improved method for the determination of authorized and non-authorized residues of 10 NSAIDs in milk. Analytes were separated and acquired by high-performance liquid chromatography coupled with electrospray ionisation tandem mass spectrometry (ESI-MS/MS). Plasma samples were acidified, plasma and milk samples were extracted with acetonitrile, and both extracts were purified using an improved solid-phase extraction procedure with Evolute ABN cartridges. The accuracy of the methods for milk and plasma was between 73 and 109%. The precision of the method for authorized and non-authorized NSAIDs in milk and plasma, expressed as % RSD for within-laboratory reproducibility, was less than 16%. The % RSD for authorized NSAIDs at their associated MRL(s) in milk was less than 10% for meloxicam, flunixin, and tolfenamic acid, and less than 25% for hydroxy flunixin. The methods were validated according to Commission Decision 2002/657/EC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Optis, Michael; Scott, George N.; Draxl, Caroline
The goal of this analysis was to assess the wind power forecast accuracy of the Vermont Weather Analytics Center (VTWAC) forecast system and to identify potential improvements to the forecasts. Based on the analysis at Georgia Mountain, the following recommendations for improving forecast performance were made: 1. Resolve the significant negative forecast bias in February-March 2017 (50% underprediction on average); 2. Improve the ability of the forecast model to capture the strong diurnal cycle of wind power; 3. Add the ability for the forecast model to assess internal wake loss, particularly at sites where strong diurnal shifts in wind direction are present. Data availability and quality limited the robustness of this forecast assessment. A more thorough analysis would be possible given a longer period of record for the data (at least one full year), detailed supervisory control and data acquisition data for each wind plant, and more detailed information on the forecast system's input data and methodologies.
Web-Based Predictive Analytics to Improve Patient Flow in the Emergency Department
NASA Technical Reports Server (NTRS)
Buckler, David L.
2012-01-01
The Emergency Department (ED) simulation project was established to demonstrate how requirements-driven analysis and process simulation can help improve the quality of patient care for the Veterans Health Administration's (VHA) Veterans Affairs Medical Centers (VAMC). This project developed a web-based simulation prototype of patient flow in EDs, validated the performance of the simulation against operational data, and documented IT requirements for the ED simulation.
Evaluation of analytical performance of a new high-sensitivity immunoassay for cardiac troponin I.
Masotti, Silvia; Prontera, Concetta; Musetti, Veronica; Storti, Simona; Ndreu, Rudina; Zucchelli, Gian Carlo; Passino, Claudio; Clerico, Aldo
2018-02-23
The study aim was to evaluate and compare the analytical performance of the new chemiluminescent immunoassay for cardiac troponin I (cTnI), called Access hs-TnI, on the DxI platform with those of the Access AccuTnI+3 method and the high-sensitivity (hs) cTnI method for the ARCHITECT platform. The limits of blank (LoB), detection (LoD), and quantitation (LoQ) at 10% and 20% CV were evaluated according to international standardized protocols. For the evaluation of analytical performance and comparison of cTnI results, both heparinized plasma samples, collected from healthy subjects and patients with cardiac diseases, and quality control samples distributed in external quality assessment programs were used. The LoB, LoD, and LoQ at 20% and 10% CV values of the Access hs-cTnI method were 0.6, 1.3, 2.1, and 5.3 ng/L, respectively. The Access hs-cTnI method showed analytical performance significantly better than that of the Access AccuTnI+3 method and results similar to those of the hs ARCHITECT cTnI method. Moreover, the cTnI concentrations measured with the Access hs-cTnI method showed close linear regressions with both the Access AccuTnI+3 and ARCHITECT hs-cTnI methods, although there were systematic differences between these methods. There was no difference between cTnI values measured by Access hs-cTnI in heparinized plasma and serum samples, whereas there was a significant difference between cTnI values measured in EDTA and heparin plasma samples. The Access hs-cTnI method has analytical sensitivity parameters significantly improved over those of the Access AccuTnI+3 method and similar to those of the high-sensitivity method on the ARCHITECT platform.
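LoB and LoD figures like those reported above are conventionally derived from replicate measurements along the lines of the CLSI EP17 parametric approach. The sketch below uses those standard formulas with invented replicate values, not the study's data:

```python
import statistics

def limit_of_blank(blanks):
    # Parametric LoB: mean of blank replicates + 1.645 * SD of blanks
    # (approximates the 95th percentile of the blank distribution)
    return statistics.mean(blanks) + 1.645 * statistics.stdev(blanks)

def limit_of_detection(lob, low_samples):
    # LoD: LoB + 1.645 * SD of replicates of a low-concentration sample
    return lob + 1.645 * statistics.stdev(low_samples)

# Illustrative replicate measurements (ng/L); hypothetical numbers only
blanks = [0.4, 0.6, 0.5, 0.7, 0.5, 0.6]
lows = [1.1, 1.4, 1.2, 1.5, 1.3, 1.4]
lob = limit_of_blank(blanks)
lod = limit_of_detection(lob, lows)
```

The LoQ at a given CV is then found empirically as the lowest concentration at which replicate imprecision stays at or below that CV, which is why the abstract quotes separate LoQ values at 20% and 10% CV.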
Nitric Oxide Release for Improving Performance of Implantable Chemical Sensors - A Review.
Cha, Kyoung Ha; Wang, Xuewei; Meyerhoff, Mark E
2017-12-01
Over the last three decades, there has been extensive interest in developing in vivo chemical sensors that can provide real-time measurements of blood gases (oxygen, carbon dioxide, and pH), glucose/lactate, and potentially other critical care analytes in the blood of hospitalized patients. However, clot formation with intravascular sensors and foreign body response toward sensors implanted subcutaneously can cause inaccurate analytical results. Further, the risk of bacterial infection from any sensor implanted in the human body is another major concern. To solve these issues, the release of an endogenous gas molecule, nitric oxide (NO), from the surface of such sensors has been investigated owing to NO's ability to inhibit platelet activation/adhesion, foreign body response and bacterial growth. This paper summarizes the importance of NO's therapeutic potential for this application and reviews the publications to date that report on the analytical performance of NO release sensors in laboratory testing and/or during in vivo testing.
Big-BOE: Fusing Spanish Official Gazette with Big Data Technology.
Basanta-Val, Pablo; Sánchez-Fernández, Luis
2018-06-01
The proliferation of new data sources stemming from the adoption of open-data schemes, in combination with increasing computing capacity, has caused the inception of a new type of analytics that processes Internet of Things data with low-cost engines to speed up data processing using parallel computing. In this context, the article presents an initiative, called Big-BOE, designed to process the Spanish official government gazette (Boletín Oficial del Estado, BOE) with state-of-the-art processing engines, to reduce computation time and to offer additional speed-up for big data analysts. The goal of including a big data infrastructure is to be able to process different BOE documents in parallel with specific analytics, to search for several issues in different documents. The application infrastructure's processing engine is described from architectural and performance perspectives, showing evidence of how this type of infrastructure improves the performance of different types of simple analytics as several machines cooperate.
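The document-parallel analytics described here can be sketched with a worker pool fanning a simple term-count analytic over gazette documents. This is only an illustrative stand-in for the article's big data engines, and the sample documents are invented:

```python
from concurrent.futures import ThreadPoolExecutor

def count_term(doc, term):
    # A simple per-document analytic: occurrences of a search term
    return doc.lower().count(term.lower())

def parallel_term_counts(docs, term, workers=4):
    # Fan the analytic out over documents; each document is independent,
    # so the work partitions cleanly across workers (or machines)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(lambda d: count_term(d, term), docs))

docs = ["Ley de contratos del sector publico",
        "Resolucion sobre contratos y subvenciones",
        "Anuncio de licitacion"]
counts = parallel_term_counts(docs, "contratos")  # [1, 1, 0]
```

The same embarrassingly parallel shape is what lets a cluster engine scale the analytic across many cooperating machines instead of threads.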
An analytic study of nonsteady two-phase laminar boundary layer around an airfoil
NASA Technical Reports Server (NTRS)
Hsu, Yu-Kao
1989-01-01
Recently, NASA, the FAA, and other organizations have focused their attention on the possible effects of rain on airfoil performance. Rhode carried out early experiments and concluded that rain impacting the aircraft increased drag. Bergrum made numerical calculations of rain effects on airfoils. Luers and Haines performed an analytic investigation and found that heavy rain induces severe aerodynamic penalties, including both a momentum penalty due to the impact of the rain and drag and lift penalties due to rain roughening of the airfoil and fuselage. More recently, Hansman and Barsotti performed experiments and concluded that the performance degradation of an airfoil in heavy rain is due to the effective roughening of the surface by the water layer. Hansman and Craig did further experimental research at low Reynolds number. E. Dunham made a critical review of the potential influence of rain on airfoil performance, and Dunham et al. carried out experiments on a transport-type airfoil, concluding that there is a reduction in maximum lift capability along with an increase in drag. Published analytic research on the two-phase boundary layer around an airfoil is scarce, and such analytic work is advanced here. The following assumptions are made: the fluid flow is non-steady, viscous, and incompressible; the airfoil is represented by a two-dimensional flat plate; and there is only a laminar boundary layer throughout the flow region. The boundary-layer approximation is solved and discussed.
Experimental Evaluation of Tuned Chamber Core Panels for Payload Fairing Noise Control
NASA Technical Reports Server (NTRS)
Schiller, Noah H.; Allen, Albert R.; Herlan, Jonathan W.; Rosenthal, Bruce N.
2015-01-01
Analytical models have been developed to predict the sound absorption and sound transmission loss of tuned chamber core panels. The panels are constructed of two facesheets sandwiching a corrugated core. When ports are introduced through one facesheet, the long chambers within the core can be used as an array of low-frequency acoustic resonators. To evaluate the accuracy of the analytical models, absorption and sound transmission loss tests were performed on flat panels. Measurements show that the acoustic resonators embedded in the panels improve both the absorption and transmission loss of the sandwich structure at frequencies near the natural frequency of the resonators. Analytical predictions for absorption closely match measured data. However, transmission loss predictions miss important features observed in the measurements. This suggests that higher-fidelity analytical or numerical models will be needed to supplement transmission loss predictions in the future.
Hsu, Yen-Michael S; Burnham, Carey-Ann D
2014-06-01
Matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) has emerged as a tool for identifying clinically relevant anaerobes. We evaluated the analytical performance characteristics of the Bruker Microflex with the Biotyper 3.0 software system for identification of anaerobes and examined the impact of direct formic acid (FA) treatment and other pre-analytical factors on MALDI-TOF MS performance. A collection of 101 anaerobic bacteria was evaluated, including Clostridium spp., Propionibacterium spp., Fusobacterium spp., Bacteroides spp., and other anaerobic bacteria of clinical relevance. The results of our study indicate that an on-target extraction with 100% FA improves the rate of accurate identification without introducing misidentification (P<0.05). In addition, we modified the reporting cutoffs for the Biotyper "score" that yield acceptable identification, finding that a score of ≥1.700 can maximize the rate of identification. Of interest, MALDI-TOF MS can correctly identify anaerobes grown under suboptimal conditions, such as on selective culture media or following oxygen exposure. In conclusion, we report on a number of simple and cost-effective pre- and post-analytical modifications that could enhance MALDI-TOF MS identification of anaerobic bacteria.
Improving trends in gender disparities in the Department of Veterans Affairs: 2008-2013.
Whitehead, Alison M; Czarnogorski, Maggie; Wright, Steve M; Hayes, Patricia M; Haskell, Sally G
2014-09-01
Increasing numbers of women veterans using Department of Veterans Affairs (VA) services have contributed to the need for equitable, high-quality care for women. The VA has evaluated performance measure data by gender since 2006. In 2008, the VA launched a 5-year women's health redesign, and, in 2011, gender disparity improvement was included on leadership performance plans. We examined data from VA Office of Analytics and Business Intelligence quarterly gender reports for trends in gender disparities in gender-neutral performance measures from 2008 to 2013. Through reporting of data by gender, leadership involvement, electronic reminders, and population management dashboards, the VA has seen a decreasing trend in gender inequities on most Health Effectiveness Data and Information Set performance measures.
Evaluating supplier quality performance using analytical hierarchy process
NASA Astrophysics Data System (ADS)
Kalimuthu Rajoo, Shanmugam Sundram; Kasim, Maznah Mat; Ahmad, Nazihah
2013-09-01
This paper elaborates on the importance of evaluating supplier quality performance to an organization. Supplier quality performance evaluation reflects the actual performance of the supplier exhibited at the customer's end. It is critical in enabling the organization to determine areas of improvement and thereafter work with the supplier to close the gaps. The customer's success partly depends on the supplier's quality performance. Key criteria such as quality, cost, delivery, technology support, and customer service are categorized as the main factors contributing to supplier quality performance. Eighteen suppliers manufacturing automotive application parts were evaluated in 2010 using a weighted-point system. A few suppliers received identical ratings, which led to tied rankings. The Analytical Hierarchy Process (AHP), a user-friendly decision-making tool for complex, multi-criteria problems, was used to evaluate the suppliers' quality performance against the weighted-point system. The consistency ratio was checked for criteria and sub-criteria. The final AHP results contained no overlapping ratings and therefore yielded a better decision-making methodology than the weighted-point rating system.
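The AHP machinery referenced above (priority weights plus the consistency-ratio check) can be sketched as follows, using the common geometric-mean approximation of the principal eigenvector and Saaty's random consistency indices. The example comparison matrix is illustrative, not the paper's supplier data:

```python
import math

# Saaty's random consistency index by matrix size
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_priorities(m):
    """Priority weights (geometric-mean approximation) and consistency ratio."""
    n = len(m)
    g = [math.prod(row) ** (1.0 / n) for row in m]   # row geometric means
    s = sum(g)
    w = [x / s for x in g]                            # normalized weights
    # Approximate the principal eigenvalue: mean of (A w)_i / w_i
    aw = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)                          # consistency index
    cr = ci / RI[n] if RI[n] else 0.0                 # consistency ratio
    return w, cr

# Hypothetical pairwise comparison of quality, cost, delivery (Saaty 1-9 scale)
m = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w, cr = ahp_priorities(m)  # cr < 0.1 indicates acceptable consistency
```

The consistency ratio is the check mentioned in the abstract: judgments with CR above roughly 0.1 are considered too inconsistent and should be revisited before the weights are used for ranking.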
Advances in Adaptive Control Methods
NASA Technical Reports Server (NTRS)
Nguyen, Nhan
2009-01-01
This poster presentation describes recent advances in adaptive control technology developed by NASA. Optimal Control Modification is a novel adaptive law that can improve performance and robustness of adaptive control systems. A new technique has been developed to provide an analytical method for computing time delay stability margin for adaptive control systems.
On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood
ERIC Educational Resources Information Center
Karabatsos, George
2017-01-01
This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon…
Leveraging Big-Data for Business Process Analytics
ERIC Educational Resources Information Center
Vera-Baquero, Alejandro; Colomo Palacios, Ricardo; Stantchev, Vladimir; Molloy, Owen
2015-01-01
Purpose: This paper aims to present a solution that enables organizations to monitor and analyse the performance of their business processes by means of Big Data technology. Business process improvement can drastically influence the profit of corporations and helps them to remain viable. However, the use of traditional Business Intelligence…
High Performance Analytics with the R3-Cache
NASA Astrophysics Data System (ADS)
Eavis, Todd; Sayeed, Ruhan
Contemporary data warehouses now represent some of the world’s largest databases. As these systems grow in size and complexity, however, it becomes increasingly difficult for brute force query processing approaches to meet the performance demands of end users. Certainly, improved indexing and more selective view materialization are helpful in this regard. Nevertheless, with warehouses moving into the multi-terabyte range, it is clear that the minimization of external memory accesses must be a primary performance objective. In this paper, we describe the R 3-cache, a natively multi-dimensional caching framework designed specifically to support sophisticated warehouse/OLAP environments. R 3-cache is based upon an in-memory version of the R-tree that has been extended to support buffer pages rather than disk blocks. A key strength of the R 3-cache is that it is able to utilize multi-dimensional fragments of previous query results so as to significantly minimize the frequency and scale of disk accesses. Moreover, the new caching model directly accommodates the standard relational storage model and provides mechanisms for pro-active updates that exploit the existence of query “hot spots”. The current prototype has been evaluated as a component of the Sidera DBMS, a “shared nothing” parallel OLAP server designed for multi-terabyte analytics. Experimental results demonstrate significant performance improvements relative to simpler alternatives.
Improving Conceptions in Analytical Chemistry: The Central Limit Theorem
ERIC Educational Resources Information Center
Rodriguez-Lopez, Margarita; Carrasquillo, Arnaldo, Jr.
2006-01-01
This article describes the central limit theorem (CLT) and its relation to analytical chemistry. The pedagogic rationale, which argues for teaching the CLT in the analytical chemistry classroom, is discussed. Some analytical chemistry concepts that could be improved through an understanding of the CLT are also described. (Contains 2 figures.)
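The CLT's practical content for analytical chemistry, that means of replicate measurements are approximately normal with SD sigma/sqrt(n) even when the individual errors are skewed, can be demonstrated with a short simulation (illustrative only, not from the article):

```python
import math
import random
import statistics

random.seed(7)

def sample_means(draw, n, trials):
    # Means of `trials` samples, each of size n, drawn from the sampler `draw`
    return [statistics.mean(draw() for _ in range(n)) for _ in range(trials)]

# A strongly skewed "measurement error" population: exponential, mean 1, SD 1
means = sample_means(lambda: random.expovariate(1.0), n=30, trials=2000)

# CLT prediction: the sampling distribution of the mean is centered near 1.0
# with SD close to sigma / sqrt(n) = 1 / sqrt(30)
expected_sd = 1 / math.sqrt(30)
```

Even though single exponential draws are far from normal, the distribution of 30-replicate means is nearly symmetric, which is why reporting the mean of replicates (and its standard error) is justified for skewed analytical error distributions.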
Heave-pitch-roll analysis and testing of air cushion landing systems
NASA Technical Reports Server (NTRS)
Boghani, A. B.; Captain, K. M.; Wormley, D. N.
1978-01-01
The analytical tools (analysis and computer simulation) needed to explain and predict the dynamic operation of air cushion landing systems (ACLS) are described. The following tasks were performed: development of improved analytical models for the fan and the trunk; formulation of a heave-pitch-roll analysis for the complete ACLS; development of a general-purpose computer simulation to evaluate the landing and taxi performance of an ACLS-equipped aircraft; and verification and refinement of the analysis by comparison with test data obtained through laboratory testing of a prototype cushion. Simulation capabilities are demonstrated through typical landing and taxi simulations of an ACLS aircraft. Initial results show that fan dynamics have a major effect on system performance. Comparison with laboratory test data (zero forward speed) indicates that the analysis can predict most of the key static and dynamic parameters (pressure, deflection, acceleration, etc.) within a margin of 10 to 25 percent.
Fiber-Reinforced Origamic Robotic Actuator.
Yi, Juan; Chen, Xiaojiao; Song, Chaoyang; Wang, Zheng
2018-02-01
A novel pneumatic soft linear actuator, the Fiber-reinforced Origamic Robotic Actuator (FORA), is proposed with significant improvements over the popular McKibben-type actuators, offering nearly doubled motion range, a substantially improved force profile, and significantly lower actuation pressure. The desirable feature set is made possible by a novel soft origamic chamber that expands radially while contracting axially when pressurized. Combining this new origamic chamber with a reinforcing fiber mesh, FORA generates very high traction force (over 150 N) and very large contractile motion (over 50%) at very low input pressure (100 kPa). We developed quasi-static analytical models both to characterize the motion and forces and to serve as guidelines for actuator design. Fabrication of FORA mostly involves consumer-grade three-dimensional (3D) printing. We provide a detailed list of materials and dimensions. Fabricated FORAs were tested on a dedicated platform against commercially available pneumatic artificial muscles from Shadow and Festo to showcase their superior performance and to validate the analytical models, with very good agreement. Finally, a robotic joint driven by two antagonistic FORAs was developed to showcase the benefits of the performance improvements. With its simple structure, fully characterized mechanism, easy fabrication procedure, and highly desirable performance, FORA can be easily customized to application requirements and fabricated by anyone with access to a 3D printer. This will pave the way to the wider adoption and application of soft robotic systems.
2010-04-01
available [11]. Additionally, Table 3 is a guide to the DMAIC methodology, covering 29 different methods [12]. Table 3 (DMAIC Methodology, the 5-phase methodology) maps the Define, Measure, Analyze, Improve, and Control phases to tools such as the Project Charter, Prioritization Matrix, and 5 Whys Analysis [13]. Develop performance priorities: this is a preliminary stage that precedes specific improvement projects, and the aim
Customization of UWB 3D-RTLS Based on the New Uncertainty Model of the AoA Ranging Technique
Jachimczyk, Bartosz; Dziak, Damian; Kulesza, Wlodek J.
2017-01-01
The increased potential and effectiveness of Real-time Locating Systems (RTLSs) substantially influence their application spectrum. They are widely used, inter alia, in the industrial sector, healthcare, home care, and in logistic and security applications. This research aims to develop an analytical method to customize UWB-based RTLSs in order to improve their localization performance in terms of accuracy and precision. The analytical uncertainty model of Angle of Arrival (AoA) localization in a 3D indoor space, which is the foundation of the customization concept, is established in a working environment. Additionally, a suitable angular-based 3D localization algorithm is introduced. The paper investigates the following issues: the influence of the proposed correction vector on localization accuracy, and the impact of the system’s configuration and the location sensors’ relative deployment on the localization precision distribution map. The advantages of the method are verified by comparison with a reference commercial RTLS localization engine. The results of simulations and physical experiments prove the value of the proposed customization method. The research confirms that the analytical uncertainty model is a valid representation of the RTLS’ localization uncertainty in terms of accuracy and precision and can be useful for its performance improvement. The research shows that AoA localization in a 3D indoor space, applying the simple angular-based localization algorithm and correction vector, improves localization accuracy and precision to the point that the system challenges the reference hardware’s advanced localization engine. Moreover, the research guides the deployment of location sensors to enhance localization precision.
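A generic least-squares flavor of AoA position fixing can be sketched as follows. This is not the authors' algorithm or uncertainty model, just a standard ray-intersection formulation for bearings measured by two location sensors; the sensor placements and target are invented.

```python
import math

def aoa_ray(origin, azimuth, elevation):
    # Unit bearing vector from azimuth (in the xy-plane) and elevation, in radians
    d = (math.cos(elevation) * math.cos(azimuth),
         math.cos(elevation) * math.sin(azimuth),
         math.sin(elevation))
    return origin, d

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 linear system
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

def locate(rays):
    # Least-squares point closest to all bearing rays:
    # solve  sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) p_i
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for p, d in rays:
        for r in range(3):
            for c in range(3):
                m = (1.0 if r == c else 0.0) - d[r] * d[c]
                A[r][c] += m
                b[r] += m * p[c]
    return solve3(A, b)

# Two location sensors bearing on a tag at (5, 5, 2)
s1 = aoa_ray((0.0, 0.0, 0.0), math.atan2(5, 5), math.atan2(2, math.hypot(5, 5)))
s2 = aoa_ray((10.0, 0.0, 0.0), math.atan2(5, -5), math.atan2(2, math.hypot(5, 5)))
pos = locate([s1, s2])  # ≈ [5.0, 5.0, 2.0]
```

With noisy angle measurements the rays no longer intersect exactly, and the residual of this least-squares fit is exactly the kind of geometry-dependent uncertainty that sensor-deployment guidance aims to minimize.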
Analytical and Clinical Performance Evaluation of the Abbott Architect PIVKA Assay.
Ko, Dae-Hyun; Hyun, Jungwon; Kim, Hyun Soo; Park, Min-Jeong; Kim, Jae-Seok; Park, Ji-Young; Shin, Dong Hoon; Cho, Hyoun Chan
2018-01-01
Protein induced by vitamin K absence (PIVKA) is measured using various assays and is used to help diagnose hepatocellular carcinoma. The present study evaluated the analytical and clinical performances of the recently released Abbott Architect PIVKA assay. Precision, linearity, and correlation tests were performed in accordance with the Clinical Laboratory Standardization Institute guidelines. Sample type suitability was assessed using serum and plasma samples from the same patients, and the reference interval was established using sera from 204 healthy individuals. The assay had coefficients of variation of 3.2-3.5% and intra-laboratory variation of 3.6-5.5%. Linearity was confirmed across the entire measurable range. The Architect PIVKA assay was comparable to the Lumipulse PIVKA assay, and the plasma and serum samples provided similar results. The lower reference limit was 13.0 mAU/mL and the upper reference limit was 37.4 mAU/mL. The ability of the Architect PIVKA assay to detect hepatocellular carcinoma was comparable to that of the alpha-fetoprotein test and the Lumipulse PIVKA assay. The Architect PIVKA assay provides excellent analytical and clinical performance, is simple for clinical laboratories to adopt, and has improved sample type suitability that could broaden the assay's utility. © 2018 by the Association of Clinical Scientists, Inc.
Maximum flow-based resilience analysis: From component to system
Jin, Chong; Li, Ruiying; Kang, Rui
2017-01-01
Resilience, the ability to withstand disruptions and recover quickly, must be considered during system design because any disruption of the system may cause considerable losses, both economic and societal. This work develops analytic maximum flow-based resilience models for series and parallel systems using Zobel's resilience measure. The two analytic models can be used to evaluate quantitatively and compare the resilience of the systems with the corresponding performance structures. For systems with identical components, the resilience of the parallel system increases with an increasing number of components, while the resilience of the series system remains constant. A Monte Carlo-based simulation method is also provided to verify the correctness of our analytic resilience models and to analyze the resilience of networked systems based on that of components. A road network example is used to illustrate the analysis process, and the resilience comparison among networks with different topologies but the same components indicates that a system with redundant performance is usually more resilient than one without redundant performance. However, not all redundant capacities of components can improve the system resilience; the effectiveness of the capacity redundancy depends on where the redundant capacity is located. PMID:28545135
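The series/parallel flow structures and a Zobel-style resilience computation can be sketched as follows (illustrative: the capacities, loss, and recovery numbers are invented, and the triangle-approximation formula used here is the commonly cited form of Zobel's predicted-resilience measure, not necessarily the exact variant in the paper):

```python
def series_capacity(caps):
    # Max flow through components in series is limited by the weakest one
    return min(caps)

def parallel_capacity(caps):
    # Parallel components carry flow simultaneously, so capacities add
    return sum(caps)

def zobel_resilience(loss_fraction, recovery_time, horizon):
    """Zobel-style predicted resilience R = 1 - X*T / (2*T*), where X is
    the fraction of system performance lost, T the recovery time, and T*
    the analysis horizon (triangular approximation of the loss area)."""
    return 1.0 - loss_fraction * recovery_time / (2.0 * horizon)

# One parallel component (5 of 23 flow units) fails for 4 h in a 100 h horizon
caps = [10.0, 5.0, 8.0]
full = parallel_capacity(caps)
degraded = parallel_capacity([10.0, 0.0, 8.0])
X = (full - degraded) / full
print(zobel_resilience(X, 4.0, 100.0))
```

The same loss fraction computed with `series_capacity` would be 1.0 (any single failure stops a series flow), which is the intuition behind the series/parallel contrast in the abstract.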
Lyon, Elaine; Schrijver, Iris; Weck, Karen E; Ferreira-Gonzalez, Andrea; Richards, C Sue; Palomaki, Glenn E
2015-03-01
Molecular testing for cystic fibrosis mutations is widespread and routine in reproductive decision making and diagnosis. Our objective was to assess the level of performance of laboratories for this test. The College of American Pathologists administers external proficiency testing with multiple DNA samples distributed biannually. Results are analyzed, reviewed, and graded by the joint College of American Pathologists/American College of Medical Genetics and Genomics Biochemical and Molecular Genetics Committee. Assessment is based on genotype and associated clinical interpretation. Overall, 357 clinical laboratories participated in the proficiency testing survey between 2003 and 2013 (322 in the United States and 35 international). In 2013, US participants reported performing nearly 120,000 tests monthly. Analytical sensitivity and specificity of US laboratories were 98.8% (95% confidence interval: 98.4-99.1%) and 99.6% (95% confidence interval: 99.4-99.7%), respectively. Analytical sensitivity improved between 2003 and 2008 (from 97.9 to 99.3%; P = 0.007) and remained steady thereafter. Clinical interpretation matched the intended response for 98.8, 86.0, and 91.0% of challenges with no, one, or two mutations, respectively. International laboratories performed similarly. Laboratory testing for cystic fibrosis in the United States has improved since 2003, and these data demonstrate a high level of quality. Neither the number of samples tested nor test methodology affected performance.
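Confidence intervals on proportions like the reported analytical sensitivity can be computed with a standard interval. A minimal sketch (the Wilson score interval is shown as one common choice; the abstract does not state which method was used, and the counts below are illustrative, not the survey's actual denominators):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% interval for a proportion, a common choice for
    sensitivity/specificity confidence intervals."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# e.g. 988 of 1000 positive challenges correctly detected (invented counts)
lo, hi = wilson_ci(988, 1000)
print(round(lo, 4), round(hi, 4))
```

Unlike the simple Wald interval, the Wilson interval stays inside (0, 1) even for proportions near 100%, which matters at the performance levels reported here.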
Comparative Kinetic Analysis of Closed-Ended and Open-Ended Porous Sensors
NASA Astrophysics Data System (ADS)
Zhao, Yiliang; Gaur, Girija; Mernaugh, Raymond L.; Laibinis, Paul E.; Weiss, Sharon M.
2016-09-01
Efficient mass transport through porous networks is essential for achieving rapid response times in sensing applications utilizing porous materials. In this work, we show that open-ended porous membranes can overcome diffusion challenges experienced by closed-ended porous materials in a microfluidic environment. A theoretical model including both transport and reaction kinetics is employed to study the influence of flow velocity, bulk analyte concentration, analyte diffusivity, and adsorption rate on the performance of open-ended and closed-ended porous sensors integrated with flow cells. The analysis shows that open-ended pores enable analyte flow through the pores and greatly reduce the response time and analyte consumption for detecting large molecules with slow diffusivities compared with closed-ended pores for which analytes largely flow over the pores. Experimental confirmation of the results was carried out with open- and closed-ended porous silicon (PSi) microcavities fabricated in flow-through and flow-over sensor configurations, respectively. The adsorption behavior of small analytes onto the inner surfaces of closed-ended and open-ended PSi membrane microcavities was similar. However, for large analytes, PSi membranes in a flow-through scheme showed significant improvement in response times due to more efficient convective transport of analytes. The experimental results and theoretical analysis provide quantitative estimates of the benefits offered by open-ended porous membranes for different analyte systems.
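The reaction half of such a transport-reaction model can be sketched in isolation (illustrative only: this is plain Langmuir adsorption at a constant analyte concentration, ignoring the convection and diffusion terms that distinguish flow-through from flow-over operation; all rate constants are invented):

```python
import math

def langmuir_coverage(c, ka, kd, t):
    """Fractional surface coverage theta(t) for Langmuir adsorption
    kinetics at a constant analyte concentration c at the pore wall:
        d(theta)/dt = ka*c*(1 - theta) - kd*theta
    which integrates to theta(t) = theta_eq * (1 - exp(-(ka*c + kd)*t)),
    with equilibrium coverage theta_eq = ka*c / (ka*c + kd)."""
    r = ka * c + kd
    return (ka * c / r) * (1.0 - math.exp(-r * t))

# Illustrative numbers: 1 uM analyte, ka = 1e6 1/(M*s), kd = 0.01 1/s
theta_30s = langmuir_coverage(c=1e-6, ka=1e6, kd=0.01, t=30.0)
print(round(theta_30s, 4))  # → 0.9901
```

In the full model of the paper, c at the pore wall is itself set by transport, which is why open-ended pores (convection through the pore) respond faster than closed-ended pores (diffusion into the pore) for slowly diffusing analytes.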
Continuing evolution of in-vitro diagnostic instrumentation
NASA Astrophysics Data System (ADS)
Cohn, Gerald E.
2000-04-01
The synthesis of analytical instrumentation and analytical biochemistry technologies in modern in vitro diagnostic instrumentation continues to generate new systems with improved performance and expanded capability. Detection modalities have expanded to include multichip modes of fluorescence, scattering, luminescence and reflectance so as to accommodate increasingly sophisticated immunochemical and nucleic acid based reagent systems. The time line graph of system development now extends from the earliest automated clinical spectrophotometers through molecule recognition assays and biosensors to the new breakthroughs of biochip and DNA diagnostics. This brief review traces some of the major innovations in the evolution of system technologies and previews the conference program.
Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis
NASA Technical Reports Server (NTRS)
Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.
2012-01-01
MapReduce is an approach to high-performance analytics that may be useful to data intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern Era Retrospective-Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data intensive analytic workflows.
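The canonical averaging operation maps cleanly onto the map/reduce pattern. A toy in-process sketch (illustrative: the grid-cell keys and values are invented, and real MERRA processing runs on a cluster framework rather than plain Python):

```python
from collections import defaultdict

def map_phase(records):
    """Emit (key, (partial_sum, count)) pairs; a key might be a
    spatial grid cell or a (cell, month) tuple."""
    for key, value in records:
        yield key, (value, 1)

def reduce_phase(pairs):
    """Combine partial (sum, count) pairs per key, then finish the mean.
    Because sums and counts combine associatively, this step can run in
    parallel across many reducers."""
    acc = defaultdict(lambda: [0.0, 0])
    for key, (s, n) in pairs:
        acc[key][0] += s
        acc[key][1] += n
    return {k: s / n for k, (s, n) in acc.items()}

records = [("cellA", 10.0), ("cellA", 20.0), ("cellB", 5.0)]
print(reduce_phase(map_phase(records)))  # → {'cellA': 15.0, 'cellB': 5.0}
```

The key design point is that the reducer never needs the raw values, only (sum, count) partials, which is what lets averages over arbitrary extents scale with data-local computation.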
RLV Turbine Performance Optimization
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.
Enhancing Community Detection By Affinity-based Edge Weighting Scheme
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Andy; Sanders, Geoffrey; Henson, Van
Community detection refers to an important graph analytics problem of finding a set of densely-connected subgraphs in a graph and has gained a great deal of interest recently. The performance of current community detection algorithms is limited by an inherent constraint of unweighted graphs that offer very little information on their internal community structures. In this paper, we propose a new scheme to address this issue that weights the edges in a given graph based on recently proposed vertex affinity. The vertex affinity quantifies the proximity between two vertices in terms of their clustering strength, and therefore, it is ideal for graph analytics applications such as community detection. We also demonstrate that the affinity-based edge weighting scheme can improve the performance of community detection algorithms significantly.
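The idea of turning an unweighted graph into a weighted one before community detection can be sketched with a simple proxy (illustrative: Jaccard neighbourhood similarity stands in for the paper's vertex-affinity measure, which is defined differently; the toy graph is invented):

```python
def jaccard_edge_weights(adj):
    """Weight each edge (u, v) by the Jaccard similarity of the closed
    neighbourhoods of its endpoints. Edges inside a dense community share
    many neighbours and score high; bridge edges score low."""
    weights = {}
    for u, nbrs in adj.items():
        for v in nbrs:
            if u < v:  # each undirected edge once
                a, b = adj[u] | {u}, adj[v] | {v}
                weights[(u, v)] = len(a & b) / len(a | b)
    return weights

adj = {
    1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4},   # a triangle community
    4: {3, 5}, 5: {4},                     # a pendant chain
}
w = jaccard_edge_weights(adj)
print(w[(1, 2)], w[(3, 4)])  # → 1.0 0.4
```

A weighted community detection algorithm (e.g. weighted modularity optimization) can then exploit the contrast between the intra-community edge (weight 1.0) and the bridge edge (weight 0.4).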
Evaluation of Aspergillus PCR protocols for testing serum specimens.
White, P Lewis; Mengoli, Carlo; Bretagne, Stéphane; Cuenca-Estrella, Manuel; Finnstrom, Niklas; Klingspor, Lena; Melchers, Willem J G; McCulloch, Elaine; Barnes, Rosemary A; Donnelly, J Peter; Loeffler, Juergen
2011-11-01
A panel of human serum samples spiked with various amounts of Aspergillus fumigatus genomic DNA was distributed to 23 centers within the European Aspergillus PCR Initiative to determine analytical performance of PCR. Information regarding specific methodological components and PCR performance was requested. The information provided was made anonymous, and meta-regression analysis was performed to determine any procedural factors that significantly altered PCR performance. Ninety-seven percent of protocols were able to detect a threshold of 10 genomes/ml on at least one occasion, with 83% of protocols reproducibly detecting this concentration. Sensitivity and specificity were 86.1% and 93.6%, respectively. Positive associations between sensitivity and the use of larger sample volumes, an internal control PCR, and PCR targeting the internal transcribed spacer (ITS) region were shown. Negative associations between sensitivity and the use of larger elution volumes (≥100 μl) and PCR targeting the mitochondrial genes were demonstrated. Most Aspergillus PCR protocols used to test serum generate satisfactory analytical performance. Testing serum requires less standardization, and the specific recommendations shown in this article will only improve performance.
NASA Astrophysics Data System (ADS)
Maghsoudi, Mohammad Javad; Mohamed, Z.; Sudin, S.; Buyamin, S.; Jaafar, H. I.; Ahmad, S. M.
2017-08-01
This paper proposes an improved input shaping scheme for an efficient sway control of a nonlinear three dimensional (3D) overhead crane with friction using the particle swarm optimization (PSO) algorithm. Using this approach, a higher payload sway reduction is obtained as the input shaper is designed based on a complete nonlinear model, as compared to the analytical-based input shaping scheme derived using a linear second order model. Zero Vibration (ZV) and Distributed Zero Vibration (DZV) shapers are designed using both analytical and PSO approaches for sway control of rail and trolley movements. To test the effectiveness of the proposed approach, MATLAB simulations and experiments on a laboratory 3D overhead crane are performed under various conditions involving different cable lengths and sway frequencies. Their performances are studied based on the maximum residual payload sway and Integrated Absolute Error (IAE) values, which indicate the total payload sway of the crane. In the experiments, the superiority of the proposed approach over the analytical-based design is shown by 30-50% reductions of the IAE values for rail and trolley movements, for both ZV and DZV shapers. In addition, simulation results show higher sway reductions with the proposed approach. It is revealed that the proposed PSO-based input shaping design provides higher payload sway reductions of a 3D overhead crane with friction as compared to the commonly designed input shapers.
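The analytical baseline referred to above is the classical two-impulse ZV shaper derived from a linear second-order model. A minimal sketch (standard textbook formulas; the frequency and damping values are illustrative, not the crane's identified parameters):

```python
import math

def zv_shaper(wn, zeta):
    """Two-impulse Zero Vibration shaper for a linear second-order mode
    with natural frequency wn (rad/s) and damping ratio zeta.
    K = exp(-zeta*pi / sqrt(1 - zeta^2)); amplitudes sum to 1 and the
    second impulse lands half a damped period after the first."""
    K = math.exp(-zeta * math.pi / math.sqrt(1 - zeta**2))
    wd = wn * math.sqrt(1 - zeta**2)          # damped frequency
    amps = [1 / (1 + K), K / (1 + K)]
    times = [0.0, math.pi / wd]               # half damped period
    return amps, times

# Illustrative sway mode: 2 rad/s, 5% damping
amps, times = zv_shaper(wn=2.0, zeta=0.05)
print(amps, times)
```

Convolving the operator's command with these impulses cancels residual sway of the modeled linear mode; the paper's point is that PSO tuning against the full nonlinear friction model outperforms these analytically derived impulse parameters.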
Analysis of Advanced Rotorcraft Configurations
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2000-01-01
Advanced rotorcraft configurations are being investigated with the objectives of identifying vehicles that are larger, quieter, and faster than current-generation rotorcraft. A large rotorcraft, carrying perhaps 150 passengers, could do much to alleviate airport capacity limitations, and a quiet rotorcraft is essential for community acceptance of the benefits of VTOL operations. A fast, long-range, long-endurance rotorcraft, notably the tilt-rotor configuration, will improve rotorcraft economics through productivity increases. A major part of the investigation of advanced rotorcraft configurations consists of conducting comprehensive analyses of vehicle behavior for the purpose of assessing vehicle potential and feasibility, as well as to establish the analytical models required to support the vehicle development. The analytical work of FY99 included applications to tilt-rotor aircraft. Tilt Rotor Aeroacoustic Model (TRAM) wind tunnel measurements are being compared with calculations performed by using the comprehensive analysis tool (Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD II)). The objective is to establish the wing and wake aerodynamic models that are required for tilt-rotor analysis and design. The TRAM test in the German-Dutch Wind Tunnel (DNW) produced extensive measurements. This is the first test to encompass air loads, performance, and structural load measurements on tilt rotors, as well as acoustic and flow visualization data. The correlation of measurements and calculations includes helicopter-mode operation (performance, air loads, and blade structural loads), hover (performance and air loads), and airplane-mode operation (performance).
NASA Astrophysics Data System (ADS)
Erfaisalsyah, M. H.; Mansur, A.; Khasanah, A. U.
2017-11-01
For a company engaged in the textile field, selecting the suppliers of raw materials for production is an important part of supply chain management that can affect the company's business processes. This study aims to identify the best suppliers of yarn as a raw material for PC. PKBI based on several criteria. In this study, an integration of the Analytical Hierarchy Process (AHP) and the Standardized Unitless Rating (SUR) is used to assess the performance of the suppliers. With AHP, the relative weight of each criterion can be determined, while SUR orders the suppliers by their performance values. The resulting supplier ranking can be used to identify the strengths and weaknesses of each supplier with respect to the performance criteria. From the final result, it can be determined which suppliers should improve their performance in order to build long-term cooperation with the company.
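The AHP-plus-scoring pipeline can be sketched as follows (illustrative: the geometric-mean row method is a standard approximation of AHP's principal eigenvector, the weighted-sum step is a generic stand-in for SUR, and all pairwise judgments and supplier ratings are invented):

```python
import math

def ahp_weights(pairwise):
    """Criteria weights from a reciprocal pairwise-comparison matrix,
    using the geometric-mean (row) approximation of the principal
    eigenvector, normalised to sum to 1."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def weighted_scores(ratings, weights):
    """Weighted sum of standardized per-criterion ratings per supplier."""
    return [sum(w * r for w, r in zip(weights, supplier))
            for supplier in ratings]

# Invented judgments: price 3x as important as quality, 5x as delivery
M = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
w = ahp_weights(M)
scores = weighted_scores([[0.9, 0.6, 0.7],   # supplier 1
                          [0.5, 0.9, 0.8]],  # supplier 2
                         w)
print([round(x, 3) for x in w], [round(s, 3) for s in scores])
```

Ranking the `scores` list reproduces the kind of supplier ordering the study describes; a full AHP would also check the consistency ratio of `M` before trusting the weights.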
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
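The core PFA idea of propagating parameter uncertainty through an analytical failure model can be sketched with a toy Monte Carlo (illustrative only: the simple S-N fatigue life model and every parameter value below are invented; this is not the documented PFA software or its statistical procedures):

```python
import random

def pfa_failure_probability(n_samples=100_000, seed=1):
    """Monte Carlo sketch of a PFA-style calculation: draw uncertain
    parameters, push each draw through a deterministic life model
        N = A / S**m
    and estimate P(N < required service life)."""
    rng = random.Random(seed)
    required_life = 1e5                               # cycles demanded
    failures = 0
    for _ in range(n_samples):
        A = rng.lognormvariate(30.0, 0.4)             # material capacity
        S = rng.gauss(350.0, 20.0)                    # stress amplitude, MPa
        m = 3.0                                       # S-N exponent (fixed)
        life = A / S**m
        failures += life < required_life
    return failures / n_samples

p = pfa_failure_probability()
print(p)
```

In the real methodology, these prior failure-probability distributions are then updated with test and flight experience; the sketch shows only the forward uncertainty-propagation step.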
Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek
2016-05-01
This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits.
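The partial-order core of the HDT can be sketched directly (illustrative: the procedure names and the three criteria scores below are invented; the actual study used 11 greenness and metrological variables):

```python
def dominates(a, b):
    """a dominates b if a >= b on every criterion and > on at least one
    (all criteria oriented so that higher is better)."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def maximal_elements(procedures):
    """Procedures not dominated by any other, i.e. the top level of the
    Hasse diagram; incomparable procedures can share this level."""
    return [name for name, va in procedures.items()
            if not any(dominates(vb, va)
                       for nb, vb in procedures.items() if nb != name)]

# Invented scores on (recovery, 1/LOD, greenness), higher is better
procs = {
    "GC-MS":    (0.9, 0.8, 0.3),
    "HPLC-FLD": (0.8, 0.9, 0.6),
    "SPME-GC":  (0.7, 0.7, 0.9),
    "old-LLE":  (0.6, 0.6, 0.2),
}
print(maximal_elements(procs))  # → ['GC-MS', 'HPLC-FLD', 'SPME-GC']
```

The example shows the study's central point in miniature: with mixed greenness and metrological criteria, several incomparable procedures can all be maximal, and which one tops the diagram depends on which variable set is used.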
Manier, M. Lisa; Spraggins, Jeffrey M.; Reyzer, Michelle L.; Norris, Jeremy L.; Caprioli, Richard M.
2014-01-01
Imaging mass spectrometry (IMS) studies increasingly focus on endogenous small molecular weight metabolites and consequently bring special analytical challenges. Since analytical tissue blanks do not exist for endogenous metabolites, careful consideration must be given to confirm molecular identity. Here we present approaches for the improvement in detection of endogenous amine metabolites such as amino acids and neurotransmitters in tissues through chemical derivatization and matrix-assisted laser desorption/ionization (MALDI) IMS. Chemical derivatization with 4-hydroxy-3-methoxycinnamaldehyde (CA) was used to improve sensitivity and specificity. CA was applied to the tissue via MALDI sample targets precoated with a mixture of derivatization reagent and ferulic acid (FA) as a MALDI matrix. Spatial distributions of chemically derivatized endogenous metabolites in tissue were determined by high-mass resolution and MSn imaging mass spectrometry. We highlight an analytical strategy for metabolite validation whereby tissue extracts are analyzed by high-performance liquid chromatography (HPLC)-MS/MS to unambiguously identify metabolites and distinguish them from isobaric compounds. PMID:25044893
QSPR studies on the photoinduced-fluorescence behaviour of pharmaceuticals and pesticides.
López-Malo, D; Bueso-Bordils, J I; Duart, M J; Alemán-López, P A; Martín-Algarra, R V; Antón-Fos, G M; Lahuerta-Zamora, L; Martínez-Calatayud, J
2017-07-01
Fluorimetric analysis is still a growing line of research in the determination of a wide range of organic compounds, including pharmaceuticals and pesticides, which makes it necessary to develop new strategies aimed at improving the performance of fluorescence determinations as well as the sensitivity and, especially, the selectivity of newly developed analytical methods. This paper presents applications of a useful and growing tool suitable for fostering and improving research in the analytical field. Experimental screening, molecular connectivity and discriminant analysis are applied to organic compounds to predict their fluorescent behaviour after photodegradation by UV irradiation in a continuous flow manifold (multicommutation flow assembly). The screening was based on online fluorimetric measurement and comprised pre-selected compounds with different molecular structures (pharmaceuticals and some pesticides with known 'native' fluorescent behaviour) to study their changes in fluorescent behaviour after UV irradiation. Theoretical predictions agree with the results from the experimental screening and could be used to develop selective analytical methods, as well as helping to reduce the need for expensive, time-consuming and trial-and-error screening procedures.
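Molecular connectivity descriptors of the kind used in such QSPR models can be computed from the hydrogen-suppressed molecular graph. A minimal sketch of the first-order Randić (branching) index, one classical member of this descriptor family (the abstract does not specify which indices were used):

```python
import math

def randic_index(adj):
    """First-order Randic molecular connectivity index over the
    hydrogen-suppressed graph: sum over edges (u, v) of
    1 / sqrt(deg(u) * deg(v))."""
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    return sum(1 / math.sqrt(deg[u] * deg[v])
               for u in adj for v in adj[u] if u < v)

# n-butane carbon skeleton: C1-C2-C3-C4
butane = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(round(randic_index(butane), 4))  # → 1.9142
```

In a QSPR workflow, vectors of such indices feed a discriminant model that classifies compounds as photoinduced-fluorescent or not before any laboratory screening.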
Curved Thermopiezoelectric Shell Structures Modeled by Finite Element Analysis
NASA Technical Reports Server (NTRS)
Lee, Ho-Jun
2000-01-01
"Smart" structures composed of piezoelectric materials may significantly improve the performance of aeropropulsion systems through a variety of vibration, noise, and shape-control applications. The development of analytical models for piezoelectric smart structures is an ongoing, in-house activity at the NASA Glenn Research Center at Lewis Field focused toward the experimental characterization of these materials. Research efforts have been directed toward developing analytical models that account for the coupled mechanical, electrical, and thermal response of piezoelectric composite materials. Current work revolves around implementing thermal effects into a curvilinear-shell finite element code. This enhances capabilities to analyze curved structures and to account for coupling effects arising from thermal effects and the curved geometry. The current analytical model implements a unique mixed multi-field laminate theory to improve computational efficiency without sacrificing accuracy. The mechanics can model both the sensory and active behavior of piezoelectric composite shell structures. Finite element equations are being implemented for an eight-node curvilinear shell element, and numerical studies are being conducted to demonstrate capabilities to model the response of curved piezoelectric composite structures (see the figure).
Hummel, J M Marjan; Snoek, Govert J; van Til, Janine A; van Rossum, Wouter; Ijzerman, Maarten J
2005-01-01
This study supported the evaluation by a rehabilitation team of the performance of two treatment options that improve the arm-hand function in subjects with sixth cervical vertebra (C6) level Motor Group 2 tetraplegia. The analytic hierarchy process, a technique for multicriteria decision analysis, was used by a rehabilitation team and potential recipients to quantitatively compare a new technology, Functional Electrical Stimulation (FES), with conventional surgery. Performance was measured by functional improvement, treatment load, risks, user-friendliness, and social outcomes. Functional improvement after FES was considered better than that after conventional surgery. However, the rehabilitation team's overall rating for conventional surgery was slightly higher than that for FES (57% vs 44%). Compared with the rehabilitation team, potential recipients gave greater weight to burden of treatment and less weight to functional improvement. This study shows that evaluation of new technology must be more comprehensive than the evaluation of functional improvement alone, and that patient preferences may differ from those of the rehabilitation team.
Kai, Junhai; Puntambekar, Aniruddha; Santiago, Nelson; Lee, Se Hwan; Sehy, David W; Moore, Victor; Han, Jungyoup; Ahn, Chong H
2012-11-07
In this work we introduce a novel microfluidic enzyme-linked immunoassay (ELISA) microplate as the next generation assay platform for unparalleled assay performance. A combination of microfluidic technology with the standard SBS-configured 96-well microplate architecture, in the form of microfluidic microplate technology, allows for the improvement of ELISA workflows, conservation of samples and reagents, improved reaction kinetics, and the ability to improve the sensitivity of the assay by multiple analyte loading. This paper presents the design and characterization of the microfluidic microplate, and its application in ELISA.
Computer Simulation For Design Of TWT's
NASA Technical Reports Server (NTRS)
Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard
1992-01-01
A three-dimensional finite-element analytical technique facilitates the design and fabrication of traveling-wave-tube (TWT) slow-wave structures. It is used to perform thermal and mechanical analyses of TWTs designed with a variety of configurations, geometries, and materials. Using three-dimensional computer analysis, the designer is able to simulate the building and testing of a TWT, with a consequent substantial saving of time and money. The technique enables a detailed look into the operation of traveling-wave tubes to help improve performance for future communications systems.
Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry
Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui
2014-01-01
Matrix-assisted laser desorption/ionization has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To challenge this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample in a concentration close to optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements on an average of 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355
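The TAD decision logic can be sketched with a toy linear signal model (illustrative only: real MALDI response is not strictly linear and the numbers below are invented; the sketch only mirrors the spike-above-the-noise reasoning of the abstract):

```python
def detected(signal, lod):
    """An analyte is called detected when its signal reaches the LOD."""
    return signal >= lod

def tad_detects_endogenous(endogenous, spike, lod, calib=1.0):
    """Targeted analyte detection sketch: spike a known amount of the
    target into the sample and into a matched blank. A reproducibly
    higher signal in the spiked sample than in the spiked blank implies
    the endogenous analyte is present, even though it sits below the
    LOD on its own (linear response assumed)."""
    sample_signal = calib * (endogenous + spike)
    blank_signal = calib * spike
    return detected(sample_signal, lod) and sample_signal > blank_signal

lod = 10.0
endo = 4.0   # below LOD: invisible without spiking
print(detected(endo, lod), tad_detects_endogenous(endo, spike=10.0, lod=lod))
# → False True
```

The abstract's empirical finding maps onto the `spike` argument: choosing the exogenous spike near the original LOD maximised the observed LOD improvement.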
Performance monitoring can boost turboexpander efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
McIntire, R.
1982-07-05
This paper discusses ways of improving the productivity of the turboexpander/refrigeration system's radial expander and radial compressor through systematic review of component performance. It reviews several techniques to determine the performance of an expander and compressor. It suggests that any performance improvement program requires quantifying the performance of separate components over a range of operating conditions; estimating the increase in performance associated with any hardware change; and developing an analytical (computer) model of the entire system by using the performance curves of individual components. The model is used to quantify the economic benefits of any change in the system, either a change in operating procedures or a hardware modification. Topics include proper ways of using antisurge control valves and modifying flow rate/shaft speed (Q/N). It is noted that compressor efficiency depends on the incidence angle of the blade at the rotor leading edge and the angle of the incoming gas stream.
Armbruster, David A; Overcash, David R; Reyes, Jaime
2014-01-01
The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing, and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA), through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated, but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology.
It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists. PMID:25336760
Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka
2014-01-01
For more than thirty years, it has been claimed that a way to improve software developers’ productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states—emotions and moods—deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint. 
PMID:24688866
BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.
White, B J; Amrine, D E; Larson, R L
2018-04-14
Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing accuracy of the created classifiers. The partitioning of the datasets allows model building and refining to occur prior to testing the predictive accuracy of the model with naive data to evaluate overall accuracy. Many different classification algorithms are available for predictive use and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that is currently collected or that could be collected on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
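The predictive analytic process described above (select a target variable, partition the data, build and refine algorithms, then compare classifier accuracy on naive held-out data) can be sketched generically. The following is an illustration with synthetic data and two deliberately simple classifiers, not the livestock models from the abstract; all names and thresholds are invented.

```python
import random

random.seed(42)

# Synthetic records: feature x, binary target y (e.g., a health outcome).
data = [(x := random.uniform(0, 10), 1 if x > 5 else 0) for _ in range(200)]

# Partition: build/refine on training data, hold out naive data for accuracy.
random.shuffle(data)
train, test = data[:150], data[150:]

def majority_rule(train):
    """Baseline classifier: always predict the most common training class."""
    most_common = round(sum(y for _, y in train) / len(train))
    return lambda x: most_common

def threshold_rule(train):
    """Pick the integer threshold on x that maximizes training accuracy."""
    best = max(range(11),
               key=lambda t: sum((x > t) == y for x, y in train))
    return lambda x: 1 if x > best else 0

# Compare multiple algorithms on naive (held-out) data, as the abstract advises.
for name, fit in [("majority", majority_rule), ("threshold", threshold_rule)]:
    model = fit(train)
    acc = sum(model(x) == y for x, y in test) / len(test)
    print(name, round(acc, 2))
```

The point of the partition is visible in the comparison loop: both models are scored only on records they never saw during fitting, which is what the abstract means by evaluating overall accuracy with naive data.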
NASA Astrophysics Data System (ADS)
Wang, D.; Cui, Y.
2015-12-01
The objectives of this paper are to validate the applicability of a multi-band quasi-analytical algorithm (QAA) for retrieving absorption coefficients of optically active constituents in turbid coastal waters, and to further improve the model using a proposed semi-analytical model (SAA). The SAA derives ap(531) and ag(531) semi-analytically, in contrast to the QAA retrieval procedure, in which ap(531) and ag(531) are derived from empirical retrievals of a(531) and a(551). The two models are calibrated and evaluated against datasets taken from 19 independent cruises on the West Florida Shelf in 1999-2003, provided by SeaBASS. The results indicate that the SAA model produces superior performance to the QAA model in absorption retrieval: using the SAA model to retrieve absorption coefficients of optically active constituents from the West Florida Shelf decreases the random uncertainty of estimation by >23.05% relative to the QAA model. This study demonstrates the potential of the SAA model for estimating absorption coefficients of optically active constituents even in turbid coastal waters. Keywords: Remote sensing; Coastal Water; Absorption Coefficient; Semi-analytical Model
Improving Workplace-Based Assessment and Feedback by an E-Portfolio Enhanced with Learning Analytics
ERIC Educational Resources Information Center
van der Schaaf, Marieke; Donkers, Jeroen; Slof, Bert; Moonen-van Loon, Joyce; van Tartwijk, Jan; Driessen, Eric; Badii, Atta; Serban, Ovidiu; Ten Cate, Olle
2017-01-01
Electronic portfolios (E-portfolios) are crucial means for workplace-based assessment and feedback. Although E-portfolios provide a useful approach to view each learner's progress, so far options for personalized feedback and potential data about a learner's performances at the workplace often remain unexploited. This paper advocates that…
ERIC Educational Resources Information Center
Allenbaugh, R. J.; Herrera, K. M.
2014-01-01
Determining student readiness for gateway chemistry courses and providing underprepared students effective remediation are important as student bodies are growing increasingly diverse in their precollege preparation. The effectiveness of the ACT Mathematics Test and the Whimbey Analytical Skills Inventory (WASI) in predicting student success in…
A novel optimization algorithm for MIMO Hammerstein model identification under heavy-tailed noise.
Jin, Qibing; Wang, Hehe; Su, Qixin; Jiang, Beiyan; Liu, Qie
2018-01-01
In this paper, we study the system identification of multi-input multi-output (MIMO) Hammerstein processes under typical heavy-tailed noise. To the best of our knowledge, there is no general analytical method to solve this identification problem. Motivated by this, we propose a general identification method based on a Gaussian-mixture distribution intelligent optimization algorithm (GMDA). The nonlinear part of the Hammerstein process is modeled by a radial basis function (RBF) neural network, and the identification problem is converted to an optimization problem. To overcome the drawbacks of analytical identification methods in the presence of heavy-tailed noise, a meta-heuristic optimization algorithm, the cuckoo search (CS) algorithm, is used. To improve its performance on this identification problem, the Gaussian-mixture distribution (GMD) and GMD sequences are introduced into the standard CS algorithm. Numerical simulations for different MIMO Hammerstein models are carried out, and the simulation results verify the effectiveness of the proposed GMDA. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Nanomaterials towards fabrication of cholesterol biosensors: Key roles and design approaches.
Saxena, Urmila; Das, Asim Bikas
2016-01-15
Importance of cholesterol biosensors is already recognized in the clinical diagnosis of cardiac and brain vascular diseases, as discernible from the enormous amount of research in this field. Nevertheless, the practical application of a majority of the fabricated cholesterol biosensors is ordinarily limited by their inadequate performance in terms of one or more analytical parameters including stability, sensitivity and detection limit. Nanoscale materials offer distinctive size-tunable electronic, catalytic and optical properties, which have opened new opportunities for designing highly efficient biosensor devices. Incorporation of nanomaterials in biosensing devices has been found to improve the electroactive surface, electronic conductivity and biocompatibility of the electrode surfaces, which in turn improves the analytical performance of the biosensors. Here we have reviewed recent advances in nanomaterial-based cholesterol biosensors. Foremost, the diverse roles of nanomaterials in these sensor systems have been discussed. Later, we have exhaustively explored the strategies used for engineering cholesterol biosensors with nanotubes, nanoparticles and nanocomposites. Finally, this review concludes with a future outlook signifying some challenges of these nanoengineered cholesterol sensors. Copyright © 2015 Elsevier B.V. All rights reserved.
The intrinsic matter bispectrum in ΛCDM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tram, Thomas; Crittenden, Robert; Koyama, Kazuya
2016-05-01
We present a fully relativistic calculation of the matter bispectrum at second order in cosmological perturbation theory assuming a Gaussian primordial curvature perturbation. For the first time we perform a full numerical integration of the bispectrum for both baryons and cold dark matter using the second-order Einstein-Boltzmann code, SONG. We review previous analytical results and provide an improved analytic approximation for the second-order kernel in Poisson gauge which incorporates Newtonian nonlinear evolution, relativistic initial conditions, the effect of radiation at early times and the cosmological constant at late times. Our improved kernel provides a percent level fit to the full numerical result at late times for most configurations, including both equilateral shapes and the squeezed limit. We show that baryon acoustic oscillations leave an imprint in the matter bispectrum, making a significant impact on squeezed shapes.
Extra-analytical quality indicators and laboratory performances.
Sciacovelli, Laura; Aita, Ada; Plebani, Mario
2017-07-01
In the last few years much progress has been made in raising the awareness of laboratory medicine professionals about the effectiveness of quality indicators (QIs) in monitoring, and improving upon, performances in the extra-analytical phases of the Total Testing Process (TTP). An effective system for management of QIs includes the implementation of an internal assessment system and participation in inter-laboratory comparison. A well-designed internal assessment system allows the identification of critical activities and their systematic monitoring. Active participation in inter-laboratory comparison provides information on the performance level of one laboratory with respect to that of other participating laboratories. In order to guarantee the use of appropriate QIs and facilitate their implementation, many laboratories have, since 2008, adopted the Model of Quality Indicators (MQI) proposed by the Working Group "Laboratory Errors and Patient Safety" (WG-LEPS) of the IFCC, which is the result of international consensus, continuous experimentation, and updating to meet new, constantly emerging needs. Data from participating laboratories are collected monthly, and reports describing the statistical results and evaluating laboratory data, utilizing the Six Sigma metric, are issued regularly. Although the results demonstrate that the processes need to be improved upon, overall the comparison with data collected in 2014 shows a general stability of quality levels and that an improvement has been achieved over time for some activities. The continuous monitoring of QI data allows the identification of all possible improvements, thus highlighting the value of participation in the inter-laboratory program proposed by WG-LEPS. The active participation of numerous laboratories will guarantee an ever more significant State of the Art, promote the reduction of errors and improve quality of the TTP, thus guaranteeing patient safety. Copyright © 2017. Published by Elsevier Inc.
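Grading extra-analytical QI data on the Six Sigma scale, as the WG-LEPS reports do, conventionally means converting a defect rate into a sigma value via the inverse normal quantile plus the customary 1.5-sigma short-term shift. A minimal sketch (the conversion formula is the standard one; the defect counts below are invented for illustration):

```python
from statistics import NormalDist

def sigma_metric(defects, opportunities):
    """Convert a defect rate to a short-term sigma value
    (inverse normal quantile plus the conventional 1.5 shift)."""
    dpm = 1e6 * defects / opportunities          # defects per million
    return NormalDist().inv_cdf(1 - dpm / 1e6) + 1.5

# Example QI: 12 mis-identified samples out of 85,000 requests (hypothetical).
print(round(sigma_metric(12, 85_000), 2))
```

A sanity check on the conversion: 66,807 defects per million corresponds to 3.0 sigma in the standard shifted table, which the function reproduces.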
NASA Astrophysics Data System (ADS)
Dhariwal, Rohit; Rani, Sarma; Koch, Donald
2015-11-01
In an earlier work, Rani, Dhariwal, and Koch (JFM, Vol. 756, 2014) developed an analytical closure for the diffusion current in the PDF transport equation describing the relative motion of high-Stokes-number particle pairs in isotropic turbulence. In this study, an improved closure was developed for the diffusion coefficient, such that the motion of the particle-pair center of mass is taken into account. Using the earlier and the new analytical closures, Langevin simulations of pair relative motion were performed for four particle Stokes numbers, Stη = 10, 20, 40, 80, and at two Taylor micro-scale Reynolds numbers, Reλ = 76, 131. Detailed comparisons of the analytical model predictions with those of DNS were undertaken. It is seen that the pair relative motion statistics obtained from the improved theory show excellent agreement with the DNS statistics. The radial distribution functions (RDFs) and relative velocity PDFs obtained from the improved-closure-based Langevin simulations are found to be in very good agreement with those from DNS. It was found that the RDFs and relative velocity RMS increased with Reλ for all Stη. The collision kernel also increased strongly with Reλ, since it depends on the RDF and the radial relative velocities.
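A Langevin simulation of the kind referenced above integrates a stochastic differential equation for the velocity with a drift and a diffusion term. As a generic illustration (an Ornstein-Uhlenbeck process integrated with Euler-Maruyama, not the paper's actual pair-relative-motion closure), one can verify the integrator against the analytical stationary variance sigma^2 * tau / 2:

```python
import math
import random

random.seed(1)
tau, sigma, dt = 1.0, 1.0, 0.01       # relaxation time, noise strength, step
v, samples = 0.0, []

# Euler-Maruyama integration of dv = -(v/tau) dt + sigma dW
for step in range(200_000):
    v += -(v / tau) * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
    if step > 10_000:                  # discard burn-in before sampling
        samples.append(v)

var = sum(s * s for s in samples) / len(samples)
print(round(var, 2))                   # analytical stationary value: sigma^2*tau/2 = 0.5
```

Agreement of the time-averaged variance with the closed-form stationary value is the usual first check before trusting the simulation's higher-order statistics, analogous to the DNS comparisons in the abstract.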
FLOQSwab™: Optimisation of Procedures for the Recovery of Microbiological Samples from Surfaces
Finazzi, Guido; Losio, Marina Nadia; Varisco, Giorgio
2016-01-01
The FLOQSwab™ is a specimen collection device recognised worldwide for its superior performance in clinical diagnostics. The aim of this work was to evaluate the FLOQSwab™ for the recovery of microbiological samples from surfaces compared to the traditional swab (rayon-tipped swab) as per the ISO 18593:2004 standard. The FLOQSwab™, thanks to its innovative manufacturing technology, improves the efficiency of recovery and release of the analyte. The study was divided into two experiments. In the first experiment the two swabs were evaluated for their capacity to recover and release the analyte (three different bacterial loads of Escherichia coli). In the second experiment, the two swabs were evaluated for their capacity to recover three different bacterial loads of E. coli from two different surface materials (stainless steel and polypropylene). In all experiments the flocked swab demonstrated a higher recovery rate compared to the traditional rayon-tipped swab. The data obtained from this preliminary study demonstrated that the FLOQSwab™ could be a good collection device for food surfaces, improving recovery of the analyte and thus producing accurate results. Based on the outcomes of the study, a larger field study is in progress using the FLOQSwab™ for sample collection to improve both environmental monitoring and the efficacy of hygiene controls for food safety. PMID:27853708
Qi, H.; Coplen, T.B.; Wassenaar, L.I.
2011-01-01
It is well known that N2 in the ion source of a mass spectrometer interferes with the CO background during the δ18O measurement of carbon monoxide. A similar problem arises with the high-temperature conversion (HTC) analysis of nitrogenous O-bearing samples (e.g. nitrates and keratins) to CO for δ18O measurement, where the sample introduces a significant N2 peak before the CO peak, making determination of accurate oxygen isotope ratios difficult. Although using a gas chromatography (GC) column longer than that commonly provided by manufacturers (0.6 m) can improve the efficiency of separation of CO and N2, and using a valve to divert nitrogen and prevent it from entering the ion source of a mass spectrometer can improve measurement results, biased δ18O values could still be obtained. A careful evaluation of the performance of the GC separation column was carried out. With optimal GC columns, the δ18O reproducibility of human hair keratins and other keratin materials was better than ±0.15 ‰ (n = 5; internal analytical reproducibility), and better than ±0.10 ‰ (n = 4; external analytical reproducibility).
Wang, Hua; Liu, Feng; Xia, Ling; Crozier, Stuart
2008-11-21
This paper presents a stabilized Bi-conjugate gradient algorithm (BiCGstab) that can significantly improve the performance of the impedance method, which has been widely applied to model low-frequency field induction phenomena in voxel phantoms. The improved impedance method offers remarkable computational advantages in terms of convergence performance and memory consumption over the conventional, successive over-relaxation (SOR)-based algorithm. The scheme has been validated against other numerical/analytical solutions on a lossy, multilayered sphere phantom excited by an ideal coil loop. To demonstrate the computational performance and application capability of the developed algorithm, the induced fields inside a human phantom due to a low-frequency hyperthermia device is evaluated. The simulation results show the numerical accuracy and superior performance of the method.
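BiCGstab itself is a short algorithm. A minimal dense-matrix sketch follows (illustrative only: a production impedance-method code like the one described would use sparse storage, preconditioning, and far larger voxel systems); it solves a small nonsymmetric system and checks the residual:

```python
def bicgstab(A, b, tol=1e-10, max_iter=200):
    """Stabilized bi-conjugate gradient (van der Vorst) for dense Ax = b."""
    n = len(b)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = [0.0] * n
    r = [bi - Axi for bi, Axi in zip(b, matvec(x))]   # initial residual
    r_hat, rho, alpha, omega = r[:], 1.0, 1.0, 1.0    # shadow residual, scalars
    v = p = [0.0] * n
    for _ in range(max_iter):
        rho_new = dot(r_hat, r)
        beta = (rho_new / rho) * (alpha / omega)
        p = [ri + beta * (pi - omega * vi) for ri, pi, vi in zip(r, p, v)]
        v = matvec(p)
        alpha = rho_new / dot(r_hat, v)
        s = [ri - alpha * vi for ri, vi in zip(r, v)]  # intermediate residual
        t = matvec(s)
        omega = dot(t, s) / dot(t, t)                  # stabilization step
        x = [xi + alpha * pi + omega * si for xi, pi, si in zip(x, p, s)]
        r = [si - omega * ti for si, ti in zip(s, t)]
        rho = rho_new
        if dot(r, r) ** 0.5 < tol:
            break
    return x

A = [[4.0, 1.0, 0.0], [2.0, 5.0, 1.0], [0.0, 1.0, 3.0]]  # nonsymmetric test matrix
b = [1.0, 2.0, 3.0]
x = bicgstab(A, b)
print([round(xi, 6) for xi in x])
```

The stabilization parameter omega is what distinguishes BiCGstab from plain BiCG and is the source of the smoother convergence the abstract credits over SOR-type iteration.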
Improving Trends in Gender Disparities in the Department of Veterans Affairs: 2008–2013
Czarnogorski, Maggie; Wright, Steve M.; Hayes, Patricia M.; Haskell, Sally G.
2014-01-01
Increasing numbers of women veterans using Department of Veterans Affairs (VA) services has contributed to the need for equitable, high-quality care for women. The VA has evaluated performance measure data by gender since 2006. In 2008, the VA launched a 5-year women’s health redesign, and, in 2011, gender disparity improvement was included on leadership performance plans. We examined data from VA Office of Analytics and Business Intelligence quarterly gender reports for trends in gender disparities in gender-neutral performance measures from 2008 to 2013. Through reporting of data by gender, leadership involvement, electronic reminders, and population management dashboards, VA has seen a decreasing trend in gender inequities on most Healthcare Effectiveness Data and Information Set (HEDIS) performance measures. PMID:25100416
Sliding mode control of magnetic suspensions for precision pointing and tracking applications
NASA Technical Reports Server (NTRS)
Misovec, Kathleen M.; Flynn, Frederick J.; Johnson, Bruce G.; Hedrick, J. Karl
1991-01-01
A recently developed nonlinear control method, sliding mode control, is examined as a means of advancing the achievable performance of space-based precision pointing and tracking systems that use nonlinear magnetic actuators. Analytic results indicate that sliding mode control improves performance compared to linear control approaches. In order to realize these performance improvements, precise knowledge of the plant is required. Additionally, the interaction of an estimating scheme and the sliding mode controller has not been fully examined in the literature. Estimation schemes were designed for use with this sliding mode controller that do not seriously degrade system performance. The authors designed and built a laboratory testbed to determine the feasibility of utilizing sliding mode control in these types of applications. Using this testbed, experimental verification of the authors' analyses is ongoing.
1990-06-01
on simple railgun accelerators and homopolar generators. Complex rotating flux compressors would drastically improve the performance of EM launchers... velocities. If this is the direction of improvement, then energies stored in the electric trains built with linear electric motors in Japan and Western... laboratories which had power supplies already built for other programs (homopolar generators in conjunction with an inductor and an opening switch
Using Analytics to Support Petabyte-Scale Science on the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Ganguly, S.; Nemani, R. R.
2014-12-01
NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. Analytics within NEX occurs at several levels - data, workflows, science and knowledge. At the data level, we are focusing on collecting and analyzing any information that is relevant to efficient acquisition, processing and management of data at the smallest granularity, such as files or collections. This includes processing and analyzing all local and many external metadata that are relevant to data quality, size, provenance, usage and other attributes. This then helps us better understand usage patterns and improve efficiency of data handling within NEX. When large-scale workflows are executed on NEX, we capture information that is relevant to processing and that can be analyzed in order to improve efficiencies in job scheduling, resource optimization, or data partitioning that would improve processing throughput. At this point we also collect data provenance as well as basic statistics of intermediate and final products created during the workflow execution. These statistics and metrics form basic process and data QA that, when combined with analytics algorithms, helps us identify issues early in the production process. We have already seen impact in some petabyte-scale projects, such as global Landsat processing, where we were able to reduce processing times from days to hours and enhance process monitoring and QA. While the focus so far has been mostly on support of NEX operations, we are also building a web-based infrastructure that enables users to perform direct analytics on science data - such as climate predictions or satellite data. 
Finally, as one of the main goals of NEX is knowledge acquisition and sharing, we began gathering and organizing information that associates users and projects with data, publications, locations and other attributes that can then be analyzed as a part of the NEX knowledge graph and used to greatly improve advanced search capabilities. Overall, we see data analytics at all levels as an important part of NEX as we are continuously seeking improvements in data management, workflow processing, use of resources, usability and science acceleration.
Comprehensive characterizations of nanoparticle biodistribution following systemic injection in mice
NASA Astrophysics Data System (ADS)
Liao, Wei-Yin; Li, Hui-Jing; Chang, Ming-Yao; Tang, Alan C. L.; Hoffman, Allan S.; Hsieh, Patrick C. H.
2013-10-01
Various nanoparticle (NP) properties such as shape and surface charge have been studied in an attempt to enhance the efficacy of NPs in biomedical applications. When trying to determine the precise biodistribution of NPs within the target organs, the analytical method becomes the determining factor in measuring the precise quantity of distributed NPs. High performance liquid chromatography (HPLC) represents a more powerful tool in quantifying NP biodistribution compared to conventional analytical methods such as an in vivo imaging system (IVIS). This, in part, is due to better curve linearity offered by HPLC than IVIS. Furthermore, HPLC enables us to fully analyze each gram of NPs present in the organs without compromising the signals and the depth-related sensitivity as is the case in IVIS measurements. In addition, we found that changing physiological conditions improved large NP (200-500 nm) distribution in brain tissue. These results reveal the importance of selecting analytic tools and physiological environment when characterizing NP biodistribution for future nanoscale toxicology, therapeutics and diagnostics. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr03954d
NASA Technical Reports Server (NTRS)
Chung, William; Chachad, Girish; Hochstetler, Ronald
2016-01-01
The Integrated Gate Turnaround Management (IGTM) concept was developed to improve gate turnaround performance at the airport by leveraging relevant historical data to support optimization of airport gate operations, which include taxi to the gate, gate services, pushback, taxi to the runway, and takeoff, based on available resources, constraints, and uncertainties. By analyzing gate operation events, the primary performance-dependent attributes of these events were identified for the historical data analysis, so that performance models could be developed under uncertainty to support descriptive, predictive, and prescriptive functions. A system architecture was developed to examine system requirements in support of such a concept. An IGTM prototype was developed to demonstrate the concept, using a distributed network and collaborative decision tools for stakeholders to meet on-time pushback performance under uncertainties.
Krleza, Jasna Lenicek; Dorotic, Adrijana; Grzunov, Ana
2017-02-15
Proper standardization of laboratory testing requires assessment of performance after the tests are performed, known as the post-analytical phase. A nationwide external quality assessment (EQA) scheme implemented in Croatia in 2014 includes a questionnaire on post-analytical practices, and the present study examined laboratory responses in order to characterize current post-analytical phase practices and identify areas for improvement. In four EQA exercises between September 2014 and December 2015, 145-174 medical laboratories across Croatia were surveyed using the Module 11 questionnaire on the post-analytical phase of testing. Based on their responses, the laboratories were evaluated on four quality indicators: turnaround time (TAT), critical values, interpretative comments and procedures in the event of abnormal results. Results were presented as absolute numbers and percentages. Just over half of laboratories (56.3%) monitored TAT. Laboratories varied substantially in how they dealt with critical values. Most laboratories (65-97%) issued interpretative comments with test results. One third of medical laboratories (30.6-33.3%) issued abnormal test results without confirming them in additional testing. Our results suggest that the nationwide post-analytical EQA scheme launched in 2014 in Croatia has yet to be fully implemented. To close the gaps between existing recommendations and laboratory practice, laboratory professionals should focus on ensuring that TAT is monitored and that lists of critical values are established within laboratories. Professional bodies and institutions should focus on clarifying and harmonizing the rules, and standardizing the practices, applied to adding interpretative comments to laboratory test results and to dealing with abnormal test results.
Improvement of a Pneumatic Control Valve with Self-Holding Function
NASA Astrophysics Data System (ADS)
Dohta, Shujiro; Akagi, Tetsuya; Kobayashi, Wataru; Shimooka, So; Masago, Yusuke
2017-10-01
The purpose of this study is to develop a small-sized, lightweight and low-cost control valve with low energy consumption and to apply it to an assistive system. We have developed several control valves: a tiny on/off valve using a vibration motor, and an on/off valve with a self-holding function. We have also proposed and tested a digital servo valve with a self-holding function using permanent magnets and a small-sized servo motor. In this paper, in order to improve the valve, an analytical model of the digital servo valve is proposed, and results simulated using the analytical model with identified parameters are compared with experimental results. The improved digital servo valve was then designed based on the calculated results and tested. As a result, we realized a digital servo valve that can control the flow rate more precisely while maintaining the volume and weight of the previous valve. As an application of the improved valve, a position control system for a rubber artificial muscle was built and position control was performed successfully.
Toward improved understanding and control in analytical atomic spectrometry
NASA Astrophysics Data System (ADS)
Hieftje, Gary M.
1989-01-01
As with most papers which attempt to predict the future, this treatment will begin with coverage of past events. It will be shown that progress in the field of analytical atomic spectrometry has occurred through a series of steps which involve the addition of new techniques and the occasional displacement of established ones. Because it is difficult or impossible to presage true breakthroughs, this manuscript will focus on how such existing methods can be modified or improved to greatest advantage. The thesis will be that rational improvement can be accomplished most effectively by understanding fundamentally the nature of an instrumental system, a measurement process, and a spectrometric technique. In turn, this enhanced understanding can lead to closer control, from which can spring improved performance. Areas where understanding is now lacking and where control is most greatly needed will be identified and a possible scheme for implementing control procedures will be outlined. As we draw toward the new millennium, these novel procedures seem particularly appealing; new high-speed computers, the availability of expert systems, and our enhanced understanding of atomic spectrometric events combine to make future prospects extremely bright.
NASA Technical Reports Server (NTRS)
Burns, W. W., III
1977-01-01
An analytically derived approach to the control of energy-storage dc-to-dc converters, which enables improved system performance and an extensive understanding of the manner in which this improved performance is accomplished, is presented. The control approach is derived from a state-plane analysis of dc-to-dc converter power stages which enables a graphical visualization of the movement of the system state during both steady state and transient operation. This graphical representation of the behavior of dc-to-dc converter systems yields considerable qualitative insight into the cause and effect relationships which exist between various commonly used converter control functions and the system performance which results from them.
Novell, Arnau; Méndez, Alberto; Minguillón, Cristina
2015-07-17
The chromatographic behaviour and performance of four polyproline-derived chiral stationary phases (CSPs) were tested using supercritical fluid chromatography (SFC). A series of structurally related racemic compounds, whose enantioseparation had proved to be sensitive to the type of mobile phase used in NP-HPLC, were chosen for testing under SFC conditions. The CSPs showed good enantioselection ability for the analytes tested under the new conditions. Resolution, efficiency and analysis time were considerably improved with respect to NP-HPLC when CO2/alcohol mobile phases were used. Monolithic columns clearly show enhanced chromatographic parameters and improved performance with respect to their bead-based counterparts. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Trung, Ha Duyen
2017-12-01
In this paper, the end-to-end performance of a free-space optical (FSO) communication system combining Amplify-and-Forward (AF)-assisted, or fixed-gain, relaying with subcarrier quadrature amplitude modulation (SC-QAM) is studied over weak atmospheric turbulence channels modeled by a log-normal distribution with pointing error impairments. More specifically, unlike previous studies of AF relaying FSO communication systems that neglected pointing errors, the pointing error effect is studied here by taking into account the influence of beamwidth, aperture size and jitter variance. In addition, these models are combined to analyze the joint effect of atmospheric turbulence and pointing error on AF relaying FSO/SC-QAM systems. Finally, an analytical expression is derived to evaluate the average symbol error rate (ASER) performance of such systems. The numerical results show the impact of pointing error on the performance of AF relaying FSO/SC-QAM systems and how proper values of aperture size and beamwidth can be used to improve the performance of such systems. Some analytical results are confirmed by Monte-Carlo simulations.
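The kind of turbulence averaging behind an ASER expression can be illustrated with a minimal Monte-Carlo sketch for a log-normal irradiance model. The modulation order (4-QAM/QPSK), mean SNR, and scintillation parameter below are illustrative assumptions, not values from the paper, and the linear irradiance-to-SNR mapping is a deliberate simplification of the system model.

```python
import math
import numpy as np

def qfunc(x):
    """Gaussian Q-function via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def ser_qpsk(snr):
    """Exact symbol error rate of QPSK (4-QAM) at symbol SNR `snr`."""
    q = qfunc(math.sqrt(snr))
    return 2.0 * q - q * q

def avg_ser_lognormal(mean_snr, sigma_x, n=100_000, seed=1):
    """Average QPSK SER over log-normal irradiance fading.

    The irradiance is I = exp(2X) with X ~ N(-sigma_x^2, sigma_x^2),
    so that E[I] = 1 and the mean SNR is preserved.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(-sigma_x**2, sigma_x, size=n)
    irradiance = np.exp(2.0 * x)
    return float(np.mean([ser_qpsk(mean_snr * i) for i in irradiance]))
```

Because the SER curve is steep, fading fills in low-SNR realizations, so the turbulence-averaged SER lies above the no-fading SER at the same mean SNR.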
NASA Astrophysics Data System (ADS)
Chen, G.; Chacón, L.
2013-08-01
We propose a 1D analytical particle mover for the recent charge- and energy-conserving electrostatic particle-in-cell (PIC) algorithm in Ref. [G. Chen, L. Chacón, D.C. Barnes, An energy- and charge-conserving, implicit, electrostatic particle-in-cell algorithm, Journal of Computational Physics 230 (2011) 7018-7036]. The approach computes particle orbits exactly for a given piece-wise linear electric field. The resulting PIC algorithm maintains the exact charge and energy conservation properties of the original algorithm, but with improved performance (both in efficiency and robustness against the number of particles and timestep). We demonstrate the advantageous properties of the scheme with a challenging multiscale numerical test case, the ion acoustic wave. Using the analytical mover as a reference, we demonstrate that the choice of error estimator in the Crank-Nicolson mover has significant impact on the overall performance of the implicit PIC algorithm. The generalization of the approach to the multi-dimensional case is outlined, based on a novel and simple charge conserving interpolation scheme.
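The core idea, integrating particle orbits exactly in a field that is piece-wise linear in x, can be sketched in isolation for one cell. For E(x) = E0 + E1*x with q*E1/m < 0, the equation of motion is a harmonic oscillator about the field's zero crossing, so the orbit has a closed form. The field coefficients below are invented for illustration; this demonstrates the exact-orbit principle, not the paper's conserving PIC scheme.

```python
import math

def exact_orbit(x0, v0, t, qm, e0, e1):
    """Closed-form orbit for x'' = qm*(e0 + e1*x), requiring qm*e1 < 0.

    The motion is harmonic about the equilibrium xc = -e0/e1 with
    angular frequency w = sqrt(-qm*e1).
    """
    w = math.sqrt(-qm * e1)
    xc = -e0 / e1
    x = xc + (x0 - xc) * math.cos(w * t) + (v0 / w) * math.sin(w * t)
    v = -(x0 - xc) * w * math.sin(w * t) + v0 * math.cos(w * t)
    return x, v

def leapfrog_orbit(x0, v0, t, qm, e0, e1, n=100_000):
    """Reference small-step kick-drift-kick leapfrog of the same orbit."""
    dt = t / n
    x, v = x0, v0
    v += 0.5 * dt * qm * (e0 + e1 * x)    # initial half kick
    for _ in range(n):
        x += dt * v                        # drift
        v += dt * qm * (e0 + e1 * x)       # full kick
    v -= 0.5 * dt * qm * (e0 + e1 * x)     # undo the surplus half kick
    return x, v
```

The closed-form mover needs one evaluation per field interval regardless of the orbit's time scale, which is where the efficiency and robustness gains over an error-controlled Crank-Nicolson mover come from.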
NASA Astrophysics Data System (ADS)
Huo, Sen; Zhou, Jiaxun; Wang, Tianyou; Chen, Rui; Jiao, Kui
2018-04-01
Experimental tests and analytical modeling are conducted to investigate the operating behavior of an alkaline electrolyte membrane (AEM) fuel cell fed with H2/air (or O2) and to explore the effect of various operating pressures on the water transfer mechanism. According to the experimental tests, cell performance is greatly improved by increasing the operating pressure gradient from anode to cathode, which leads to significant liquid water permeation through the membrane. The high-frequency resistance of the A901 alkaline membrane is observed to be relatively stable as the operating pressure varies, based on the electrochemical impedance spectroscopy (EIS) method. Correspondingly, based on the modeling prediction, the averaged water content in the membrane electrode assembly (MEA) does not change much, which leads to the weak variation of membrane ohmic resistance. This reveals that the performance enhancement is attributable to better electrochemical reaction kinetics at both the anode and cathode, which is also supported by the EIS results. A reversal of the direction of water back-diffusion across the membrane is also observed in the analytical solution.
Flatland, Bente; Freeman, Kathy P; Friedrichs, Kristen R; Vap, Linda M; Getzy, Karen M; Evans, Ellen W; Harr, Kendal E
2010-09-01
Owing to lack of governmental regulation of veterinary laboratory performance, veterinarians ideally should demonstrate a commitment to self-monitoring and regulation of laboratory performance from within the profession. In response to member concerns about quality management in veterinary laboratories, the American Society for Veterinary Clinical Pathology (ASVCP) formed a Quality Assurance and Laboratory Standards (QAS) committee in 1996. This committee recently published updated and peer-reviewed Quality Assurance Guidelines on the ASVCP website. The Quality Assurance Guidelines are intended for use by veterinary diagnostic laboratories and veterinary research laboratories that are not covered by the US Food and Drug Administration Good Laboratory Practice standards (Code of Federal Regulations Title 21, Chapter 58). The guidelines have been divided into 3 reports on 1) general analytic factors for veterinary laboratory performance and comparisons, 2) hematology and hemostasis, and 3) clinical chemistry, endocrine assessment, and urinalysis. This report documents recommendations for control of general analytical factors within veterinary clinical laboratories and is based on section 2.1 (Analytical Factors Important In Veterinary Clinical Pathology, General) of the newly revised ASVCP QAS Guidelines. These guidelines are not intended to be all-inclusive; rather, they provide minimum guidelines for quality assurance and quality control for veterinary laboratory testing. It is hoped that these guidelines will provide a basis for laboratories to assess their current practices, determine areas for improvement, and guide continuing professional development and education efforts. ©2010 American Society for Veterinary Clinical Pathology.
In-Line Detection and Measurement of Molecular Contamination in Semiconductor Process Solutions
NASA Astrophysics Data System (ADS)
Wang, Jason; West, Michael; Han, Ye; McDonald, Robert C.; Yang, Wenjing; Ormond, Bob; Saini, Harmesh
2005-09-01
This paper discusses a fully automated metrology tool for detection and quantitative measurement of contamination, including cationic, anionic, metallic, organic, and molecular species present in semiconductor process solutions. The instrument is based on an electrospray ionization time-of-flight mass spectrometer (ESI-TOF/MS) platform. The tool can be used in diagnostic or analytical modes to understand process problems in addition to enabling routine metrology functions. Metrology functions include in-line contamination measurement with near real-time trend analysis. This paper discusses representative organic and molecular contamination measurement results in production process problem solving efforts. The examples include the analysis and identification of organic compounds in SC-1 pre-gate clean solution; urea, NMP (N-Methyl-2-pyrrolidone) and phosphoric acid contamination in UPW; and plasticizer and an organic sulfur-containing compound found in isopropyl alcohol (IPA). It is expected that these unique analytical and metrology capabilities will improve the understanding of the effect of organic and molecular contamination on device performance and yield. This will permit the development of quantitative correlations between contamination levels and process degradation. It is also expected that the ability to perform routine process chemistry metrology will lead to corresponding improvements in manufacturing process control and yield, the ability to avoid excursions and will improve the overall cost effectiveness of the semiconductor manufacturing process.
Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I
2016-01-01
Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers.
Geometric model of pseudo-distance measurement in satellite location systems
NASA Astrophysics Data System (ADS)
Panchuk, K. L.; Lyashkov, A. A.; Lyubchinov, E. V.
2018-04-01
The existing mathematical model of pseudo-distance measurement in satellite location systems does not provide a precise solution of the problem, but rather an approximate one. This approximation, together with bias in the measurement of the distance from satellite to receiver, results in errors of several meters. Refinement of the current mathematical model is therefore clearly relevant. The solution of the system of quadratic equations used in the current mathematical model is based on linearization. The objective of the paper is refinement of the current mathematical model and derivation of an analytical solution of the system of equations on its basis. To attain this objective, a geometric analysis is performed and a geometric interpretation of the equations is given. As a result, an equivalent system of equations, which admits an analytical solution, is derived. An example implementation of the analytical solution is presented. Application of the analytical solution algorithm to the problem of pseudo-distance measurement in satellite location systems makes it possible to improve the accuracy of such measurements.
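For contrast with the analytical solution the authors derive, the standard numerical treatment of the pseudo-distance equations is iterative linearization (Gauss-Newton) on the receiver position and clock bias. The satellite geometry, receiver position, and bias in the sketch below are synthetic values assumed for illustration.

```python
import numpy as np

def solve_pseudoranges(sats, rho, x0=None, iters=20):
    """Gauss-Newton solve for receiver position and clock bias.

    sats: (n, 3) satellite positions; rho: n pseudoranges, modeled as
    rho_i = |sats_i - pos| + bias. Returns (pos, bias).
    """
    est = np.zeros(4) if x0 is None else np.asarray(x0, float)
    for _ in range(iters):
        diff = est[:3] - sats                       # (n, 3) offsets
        dist = np.linalg.norm(diff, axis=1)         # geometric ranges
        resid = dist + est[3] - rho                 # pseudorange residuals
        # Jacobian: unit line-of-sight vectors plus a clock-bias column.
        jac = np.hstack([diff / dist[:, None], np.ones((len(rho), 1))])
        est -= np.linalg.lstsq(jac, resid, rcond=None)[0]
    return est[:3], est[3]
```

Each iteration re-linearizes the range equations about the current estimate, which is exactly the approximation step the paper's analytical solution avoids.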
Kawamoto, Kensaku; Martin, Cary J; Williams, Kip; Tu, Ming-Chieh; Park, Charlton G; Hunter, Cheri; Staes, Catherine J; Bray, Bruce E; Deshmukh, Vikrant G; Holbrook, Reid A; Morris, Scott J; Fedderson, Matthew B; Sletta, Amy; Turnbull, James; Mulvihill, Sean J; Crabtree, Gordon L; Entwistle, David E; McKenna, Quinn L; Strong, Michael B; Pendleton, Robert C; Lee, Vivian S
2015-01-01
Objective To develop expeditiously a pragmatic, modular, and extensible software framework for understanding and improving healthcare value (costs relative to outcomes). Materials and methods In 2012, a multidisciplinary team was assembled by the leadership of the University of Utah Health Sciences Center and charged with rapidly developing a pragmatic and actionable analytics framework for understanding and enhancing healthcare value. Based on an analysis of relevant prior work, a value analytics framework known as Value Driven Outcomes (VDO) was developed using an agile methodology. Evaluation consisted of measurement against project objectives, including implementation timeliness, system performance, completeness, accuracy, extensibility, adoption, satisfaction, and the ability to support value improvement. Results A modular, extensible framework was developed to allocate clinical care costs to individual patient encounters. For example, labor costs in a hospital unit are allocated to patients based on the hours they spent in the unit; actual medication acquisition costs are allocated to patients based on utilization; and radiology costs are allocated based on the minutes required for study performance. Relevant process and outcome measures are also available. A visualization layer facilitates the identification of value improvement opportunities, such as high-volume, high-cost case types with high variability in costs across providers. Initial implementation was completed within 6 months, and all project objectives were fulfilled. The framework has been improved iteratively and is now a foundational tool for delivering high-value care. Conclusions The framework described can be expeditiously implemented to provide a pragmatic, modular, and extensible approach to understanding and improving healthcare value. PMID:25324556
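The cost-allocation rule described (a hospital unit's labor cost spread over the hours each patient spent in the unit) amounts to simple proportional allocation. A minimal sketch, with invented encounter identifiers and figures rather than data from the VDO framework:

```python
def allocate_by_hours(total_labor_cost, hours_by_encounter):
    """Allocate a unit's labor cost to encounters in proportion to
    the hours each patient spent in the unit."""
    total_hours = sum(hours_by_encounter.values())
    return {enc: total_labor_cost * h / total_hours
            for enc, h in hours_by_encounter.items()}

# Hypothetical unit: $12,000 of labor cost over three encounters.
costs = allocate_by_hours(12_000.0, {"enc1": 10.0, "enc2": 30.0, "enc3": 20.0})
```

The same proportional pattern applies to the other drivers mentioned (medication acquisition cost by utilization, radiology cost by study minutes), with only the driver quantity changing.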
Trellis phase codes for power-bandwidth efficient satellite communications
NASA Technical Reports Server (NTRS)
Wilson, S. G.; Highfill, J. H.; Hsu, C. D.; Harkness, R.
1981-01-01
Support work on improved power and spectrum utilization on digital satellite channels was performed. Specific attention is given to the class of signalling schemes known as continuous phase modulation (CPM). The specific work described in this report addresses: analytical bounds on error probability for multi-h phase codes, power and bandwidth characterization of 4-ary multi-h codes, and initial results of channel simulation to assess the impact of band limiting filters and nonlinear amplifiers on CPM performance.
NASA Technical Reports Server (NTRS)
Martin, Glenn L.; Tice, David C.; Marcum, Don C., Jr.; Seidel, Jonathan A.
1991-01-01
The present analytic study of the potential performance of SST configurations radically differing from arrow-winged designs in lifting surface planform geometry gives attention to trapezoidal-wing and M-wing configurations; the trapezoidal wing is used as the baseline in the performance comparisons. The design mission was all-supersonic (Mach 2), carrying 248 passengers over a 5500 nautical-mile range. Design constraints encompassed approach speed, TO&L field length, and engine-out second-segment climb and missed-approach performance. Techniques for improving these configurations are discussed.
Ultrasensitive SERS Flow Detector Using Hydrodynamic Focusing
Negri, Pierre; Jacobs, Kevin T.; Dada, Oluwatosin O.; Schultz, Zachary D.
2013-01-01
Label-free, chemical specific detection in flow is important for high throughput characterization of analytes in applications such as flow injection analysis, electrophoresis, and chromatography. We have developed a surface-enhanced Raman scattering (SERS) flow detector capable of ultrasensitive optical detection on the millisecond time scale. The device employs hydrodynamic focusing to improve SERS detection in a flow channel where a sheath flow confines analyte molecules eluted from a fused silica capillary over a planar SERS-active substrate. Increased analyte interactions with the SERS substrate significantly improve detection sensitivity. The performance of this flow detector was investigated using a combination of finite element simulations, fluorescence imaging, and Raman experiments. Computational fluid dynamics based on finite element analysis was used to optimize the flow conditions. The modeling indicates that a number of factors, such as the capillary dimensions and the ratio of the sheath flow to analyte flow rates, are critical for obtaining optimal results. Sample confinement resulting from the flow dynamics was confirmed using wide-field fluorescence imaging of rhodamine 6G (R6G). Raman experiments at different sheath flow rates showed increased sensitivity compared with the modeling predictions, suggesting increased adsorption. Using 50-millisecond acquisitions, a sheath flow rate of 180 μL/min, and a sample flow rate of 5 μL/min, a linear dynamic range from nanomolar to micromolar concentrations of R6G with an LOD of 1 nM is observed. At low analyte concentrations, rapid analyte desorption is observed, enabling repeated and high-throughput SERS detection. The flow detector offers substantial advantages over conventional SERS-based assays such as minimal sample volumes and high detection efficiency. PMID:24074461
Arreaza, Gladys; Qiu, Ping; Pang, Ling; Albright, Andrew; Hong, Lewis Z.; Marton, Matthew J.; Levitan, Diane
2016-01-01
In cancer drug discovery, it is important to investigate the genetic determinants of response or resistance to cancer therapy as well as factors that contribute to adverse events in the course of clinical trials. Despite the emergence of new technologies and the ability to measure more diverse analytes (e.g., circulating tumor cell (CTC), circulating tumor DNA (ctDNA), etc.), tumor tissue is still the most common and reliable source for biomarker investigation. Because of its worldwide use and ability to preserve samples for many decades at ambient temperature, formalin-fixed, paraffin-embedded tumor tissue (FFPE) is likely to be the preferred choice for tissue preservation in clinical practice for the foreseeable future. Multiple analyses are routinely performed on the same FFPE samples (such as Immunohistochemistry (IHC), in situ hybridization, RNAseq, DNAseq, TILseq, Methyl-Seq, etc.). Thus, specimen prioritization and optimization of the isolation of analytes is critical to ensure successful completion of each assay. FFPE is notorious for producing suboptimal DNA quality and low DNA yield. However, commercial vendors tend to request higher DNA sample mass than what is actually required for downstream assays, which restricts the breadth of biomarker work that can be performed. We evaluated multiple genomics service laboratories to assess the current state of NGS pre-analytical processing of FFPE. Significant differences in pre-analytical capabilities were observed. Key aspects are highlighted and recommendations are made to improve the current practice in translational research. PMID:27657050
Stacey, Peter; Butler, Owen
2008-06-01
This paper emphasizes the need for occupational hygiene professionals to require evidence of the quality of welding fume data from analytical laboratories. The measurement of metals in welding fume using atomic spectrometric techniques is a complex analysis often requiring specialist digestion procedures. The results from a trial programme testing the proficiency of laboratories in the Workplace Analysis Scheme for Proficiency (WASP) to measure potentially harmful metals in several different types of welding fume showed that most laboratories underestimated the mass of analyte on the filters. The average recovery was 70-80% of the target value and >20% of reported recoveries for some of the more difficult welding fume matrices were <50%. This level of under-reporting has significant implications for any health or hygiene studies of the exposure of welders to toxic metals for the types of fumes included in this study. Good laboratory performance in measuring spiked WASP filter samples containing soluble metal salts did not guarantee good performance in measuring the more complex welding fume trial filter samples. Consistent rather than erratic error predominated, suggesting that the main analytical factor contributing to the differences between the target values and results was the effectiveness of the sample preparation procedures used by participating laboratories. It is concluded that, with practice and regular participation in WASP, performance can improve over time.
Noise suppressing capillary separation system
Yeung, Edward S.; Xue, Yongjun
1996-07-30
A noise-suppressing capillary separation system for detecting the real-time presence or concentration of an analyte in a sample is provided. The system contains a capillary separation means through which the analyte is moved, a coherent light source that generates a beam which is split into a reference beam and a sample beam that irradiate the capillary, and a detector for detecting the reference beam and the sample beam light that transmits through the capillary. The laser beam is of a wavelength effective to be absorbed by a chromophore in the capillary. The system includes a noise suppressing system to improve performance and accuracy without signal averaging or multiple scans.
Just, Wolfram; Popovich, Svitlana; Amann, Andreas; Baba, Nilüfer; Schöll, Eckehard
2003-02-01
We investigate time-delayed feedback control schemes which are based on the unstable modes of the target state, to stabilize unstable periodic orbits. The periodic time dependence of these modes introduces an external time scale in the control process. Phase shifts that develop between these modes and the controlled periodic orbit may lead to a huge increase of the control performance. We illustrate such a feature on a nonlinear reaction diffusion system with global coupling and give a detailed investigation for the Rössler model. In addition we provide the analytical explanation for the observed control features.
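The control scheme under discussion, time-delayed (Pyragas) feedback F(t) = K[y(t-τ) - y(t)], can be sketched for the chaotic Rössler system with a simple Euler integration and a circular history buffer. The parameter values (a, b, c, K, τ) are common textbook choices rather than necessarily those of the paper, and the sketch only demonstrates the delayed-feedback bookkeeping, not a stability analysis of the controlled orbit.

```python
import math

def rossler_tdfc(K=0.2, tau=5.9, dt=0.005, steps=20_000):
    """Euler integration of the Rössler system with Pyragas
    time-delayed feedback K*(y(t - tau) - y(t)) added to dy/dt."""
    a, b, c = 0.2, 0.2, 5.7
    delay = int(round(tau / dt))
    hist = [0.0] * delay              # circular buffer of past y values
    x, y, z = 1.0, 1.0, 0.0
    traj = []
    for k in range(steps):
        y_delayed = hist[k % delay]
        # Control is switched on once a full delay's worth of history exists.
        force = K * (y_delayed - y) if k >= delay else 0.0
        dx = -y - z
        dy = x + a * y + force
        dz = b + z * (x - c)
        hist[k % delay] = y           # overwrite the slot just read
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        traj.append((x, y, z))
    return traj
```

When the target state is a period-τ orbit, the feedback term vanishes on the orbit itself, which is the noninvasiveness property that makes the phase relationship between the delayed signal and the orbit's unstable modes decisive for control performance.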
Noise suppressing capillary separation system
Yeung, E.S.; Xue, Y.
1996-07-30
A noise-suppressing capillary separation system for detecting the real-time presence or concentration of an analyte in a sample is provided. The system contains a capillary separation means through which the analyte is moved, a coherent light source that generates a beam which is split into a reference beam and a sample beam that irradiate the capillary, and a detector for detecting the reference beam and the sample beam light that transmits through the capillary. The laser beam is of a wavelength effective to be absorbed by a chromophore in the capillary. The system includes a noise suppressing system to improve performance and accuracy without signal averaging or multiple scans. 13 figs.
Contamination in food from packaging material.
Lau, O W; Wong, S K
2000-06-16
Packaging has become an indispensable element in the food manufacturing process, and different types of additives, such as antioxidants, stabilizers, lubricants, anti-static and anti-blocking agents, have been developed to improve the performance of polymeric packaging materials. Recently, packaging itself has been found to be a source of contamination through the migration of substances from the packaging into food. Various analytical methods have been developed to analyze the migrants in foodstuffs, and migration evaluation procedures based on theoretical prediction of migration from plastic food contact materials were also introduced recently. In this paper, the regulatory control, analytical methodology, factors affecting migration and migration evaluation are reviewed.
Nikiforova, Marina N; Mercurio, Stephanie; Wald, Abigail I; Barbi de Moura, Michelle; Callenberg, Keith; Santana-Santos, Lucas; Gooding, William E; Yip, Linwah; Ferris, Robert L; Nikiforov, Yuri E
2018-04-15
Molecular tests have clinical utility for thyroid nodules with indeterminate fine-needle aspiration (FNA) cytology, although their performance requires further improvement. This study evaluated the analytical performance of the newly created ThyroSeq v3 test. ThyroSeq v3 is a DNA- and RNA-based next-generation sequencing assay that analyzes 112 genes for a variety of genetic alterations, including point mutations, insertions/deletions, gene fusions, copy number alterations, and abnormal gene expression, and it uses a genomic classifier (GC) to separate malignant lesions from benign lesions. It was validated in 238 tissue samples and 175 FNA samples with known surgical follow-up. Analytical performance studies were conducted. In the training tissue set of samples, ThyroSeq GC detected more than 100 genetic alterations, including BRAF, RAS, TERT, and DICER1 mutations, NTRK1/3, BRAF, and RET fusions, 22q loss, and gene expression alterations. GC cutoffs were established to distinguish cancer from benign nodules with 93.9% sensitivity, 89.4% specificity, and 92.1% accuracy. This correctly classified most papillary, follicular, and Hurthle cell lesions, medullary thyroid carcinomas, and parathyroid lesions. In the FNA validation set, the GC sensitivity was 98.0%, the specificity was 81.8%, and the accuracy was 90.9%. Analytical accuracy studies demonstrated a minimal required nucleic acid input of 2.5 ng, a 12% minimal acceptable tumor content, and reproducible test results under variable stress conditions. The ThyroSeq v3 GC analyzes 5 different classes of molecular alterations and provides high accuracy for detecting all common types of thyroid cancer and parathyroid lesions. The analytical sensitivity, specificity, and robustness of the test have been successfully validated and indicate its suitability for clinical use. Cancer 2018;124:1682-90. © 2018 American Cancer Society. © 2018 American Cancer Society.
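The sensitivity, specificity, and accuracy figures reported throughout this abstract are standard confusion-matrix metrics. For reference, a minimal computation, using invented counts rather than the study's data:

```python
def classifier_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from confusion counts."""
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Hypothetical example: 45 true positives, 5 false negatives,
# 40 true negatives, 10 false positives.
sens, spec, acc = classifier_metrics(45, 5, 40, 10)
```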
Improvement of the accuracy of noise measurements by the two-amplifier correlation method.
Pellegrini, B; Basso, G; Fiori, G; Macucci, M; Maione, I A; Marconcini, P
2013-10-01
We present a novel method for device noise measurement, based on a two-channel cross-correlation technique and a direct "in situ" measurement of the transimpedance of the device under test (DUT), which allows improved accuracy with respect to what is available in the literature, in particular when the DUT is a nonlinear device. Detailed analytical expressions for the total residual noise are derived, and an experimental investigation of the increased accuracy provided by the method is performed.
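The principle behind the two-channel cross-correlation technique, that each channel's independent amplifier noise averages out of the cross moment while the common DUT signal survives, can be sketched with synthetic data. The signal and noise variances below are arbitrary illustrative values.

```python
import numpy as np

def cross_correlation_power(n=200_000, sig_var=1.0, noise_var=4.0, seed=7):
    """Estimate the common-signal power from two noisy channels.

    Each channel sees the same signal s plus its own independent
    amplifier noise; E[ch1*ch2] tends to var(s) as n grows, whereas a
    single-channel estimate is biased upward by the full noise variance.
    Returns (cross_estimate, single_channel_estimate).
    """
    rng = np.random.default_rng(seed)
    s = rng.normal(0.0, np.sqrt(sig_var), n)            # common DUT noise signal
    ch1 = s + rng.normal(0.0, np.sqrt(noise_var), n)    # channel 1 amplifier noise
    ch2 = s + rng.normal(0.0, np.sqrt(noise_var), n)    # channel 2 amplifier noise
    return float(np.mean(ch1 * ch2)), float(np.mean(ch1 * ch1))
```

With channel noise four times stronger than the signal, the cross estimate still converges to the signal power, which is why the residual-noise terms derived in the paper shrink with averaging time.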
NASA Technical Reports Server (NTRS)
Mickey, F. E.; Mcewan, A. J.; Ewing, E. G.; Huyler, W. C., Jr.; Khajeh-Nouri, B.
1970-01-01
An analysis was conducted with the objective of upgrading and improving the loads, stress, and performance prediction methods for Apollo spacecraft parachutes. The subjects considered were: (1) methods for a new theoretical approach to the parachute opening process, (2) new experimental-analytical techniques to improve the measurement of pressures, stresses, and strains in inflight parachutes, and (3) a numerical method for analyzing the dynamical behavior of rapidly loaded pilot chute risers.
Dillon, Roslyn; Croner, Lisa J; Bucci, John; Kairs, Stefanie N; You, Jia; Beasley, Sharon; Blimline, Mark; Carino, Rochele B; Chan, Vicky C; Cuevas, Danissa; Diggs, Jeff; Jennings, Megan; Levy, Jacob; Mina, Ginger; Yee, Alvin; Wilcox, Bruce
2018-05-30
Early detection of colorectal cancer (CRC) is key to reducing associated mortality. Despite the importance of early detection, approximately 40% of individuals in the United States between the ages of 50-75 have never been screened for CRC. The low compliance with colonoscopy and fecal-based screening may be addressed with a non-invasive alternative such as a blood-based test. We describe here the analytical validation of a multiplexed blood-based assay that measures the plasma concentrations of 15 proteins to assess advanced adenoma (AA) and CRC risk in symptomatic patients. The test was developed on an electrochemiluminescent immunoassay platform employing four multi-marker panels, to be implemented in the clinic as a laboratory developed test (LDT). Under the Clinical Laboratory Improvement Amendments (CLIA) and College of American Pathologists (CAP) regulations, a United States-based clinical laboratory utilizing an LDT must establish performance characteristics relating to analytical validity prior to releasing patient test results. This report describes a series of studies demonstrating the precision, accuracy, analytical sensitivity, and analytical specificity for each of the 15 assays, as required by CLIA/CAP. In addition, the report describes studies characterizing each of the assays' dynamic range, parallelism, tolerance to common interfering substances, spike recovery, and stability to sample freeze-thaw cycles. Upon completion of the analytical characterization, a clinical accuracy study was performed to evaluate concordance of AA and CRC classifier model calls using the analytical method intended for use in the clinic. Of 434 symptomatic patient samples tested, the percent agreement with original CRC and AA calls was 87% and 92% respectively. All studies followed CLSI guidelines and met the regulatory requirements for implementation of a new LDT. 
The results provide the analytical evidence to support the implementation of the novel multi-marker test as a clinical test for evaluating CRC and AA risk in symptomatic individuals. Copyright © 2018 Elsevier B.V. All rights reserved.
Evaluation of Force Transfer Around Openings - Experimental and Analytical Studies
Borjen Yeh; Tom Skaggs; Frank Lam; Minghao Li; Douglas Rammer; James Wacker
2011-01-01
Wood structural panel (WSP) sheathed shear walls and diaphragms are the primary lateral-load-resisting elements in wood-frame construction. The historical performance of light-frame structures in North America is very good due, in part, to model building codes that are designed to safeguard life safety. These model building codes have spawned continual improvement and...
Resolution of seven-axis manipulator redundancy: A heuristic issue
NASA Technical Reports Server (NTRS)
Chen, I.
1990-01-01
An approach is presented for the resolution of the redundancy of a seven-axis manipulator arm from the AI and expert systems point of view. This approach is heuristic and analytical, and globally resolves the redundancy at the position level. Compared with other approaches, it has several improved performance capabilities, including singularity avoidance, repeatability, stability, and simplicity.
ERIC Educational Resources Information Center
Giacumo, Lisa A.; Breman, Jeroen
2016-01-01
This article provides a systematic literature review about nonprofit and for-profit organizations using "big data" to inform performance improvement initiatives. The review of literature resulted in 4 peer-reviewed articles and an additional 33 studies covering the topic for these contexts. The review found that big data and analytics…
Antony S. Cheng; Toddi Steelman; Cassandra Moseley
2011-01-01
U.S. wildfire policy and governance increasingly emphasize collaboration among levels of government and between government and non-governmental entities, expanding the roles and duties of nonfederal and nongovernmental organizations, and instituting performance-based measures to improve accountability and control costs. While many changes have been enacted, others have...
Evaluation of SHEEO's State Policy Resource Connections (SPRC) Initiative. Final Report
ERIC Educational Resources Information Center
Ryherd, Ann Daley
2011-01-01
With the assistance of the Lumina Foundation, the State Higher Education Executive Officers (SHEEO) staff has been working to develop a broad, up-to-date database of policy relevant information for the states and to create analytical studies to help state leaders identify priorities and practices for improving policies and performance across the…
ERIC Educational Resources Information Center
Lin, Chi-Jen; Hwang, Gwo-Jen
2018-01-01
Flipped classrooms have been widely adopted and discussed by school teachers and researchers in the past decade. However, few studies have been conducted to formally evaluate the effectiveness of flipped classrooms in terms of improving EFL students' English oral presentation, not to mention investigating factors affecting their flipped learning…
Predicting Success: How Predictive Analytics Are Transforming Student Support and Success Programs
ERIC Educational Resources Information Center
Boerner, Heather
2015-01-01
Every year, Lone Star College in Texas hosts a "Men of Honor" program to provide assistance and programming to male students, but particularly those who are Hispanic and black, in hopes their academic performance will improve. Lone Star might have kept directing its limited resources toward these students--and totally missed the subset…
NASA Astrophysics Data System (ADS)
Ferhati, H.; Djeffal, F.
2017-12-01
In this paper, a new MSM-UV-photodetector (PD) based on a dual wide band-gap material (DM) engineering approach is proposed to achieve a high-performance self-powered device. Comprehensive analytical models for the proposed sensor's photocurrent and the device properties are developed, incorporating the impact of the DM aspect on the device's photoelectrical behavior. The obtained results are validated against numerical data from commercial TCAD software. Our investigation demonstrates that the adopted design amendment modulates the electric field in the device, which makes it possible to drive the photo-generated carriers without an externally applied voltage. This enables the dual role of effective carrier separation and efficient reduction of the dark current. Moreover, a new hybrid approach based on analytical modeling and Particle Swarm Optimization (PSO) is proposed to achieve improved photoelectric behavior at zero bias, ensuring a favorable self-powered MSM-based UV-PD. It is found that the proposed design methodology succeeds in identifying an optimized design that offers a self-powered device with high responsivity (98 mA/W) and a superior ION/IOFF ratio (480 dB). These results make the optimized MSM-UV-DM-PD suitable for providing low-cost self-powered devices for high-performance optical communication and monitoring applications.
Experimental and analytical investigation of a modified ring cusp NSTAR engine
NASA Technical Reports Server (NTRS)
Sengupta, Anita
2005-01-01
A series of experimental measurements on a modified laboratory NSTAR engine were used to validate a zero-dimensional analytical discharge performance model of a ring cusp ion thruster. The model predicts the discharge performance of a ring cusp NSTAR thruster as a function of the magnetic field configuration, thruster geometry, and throttle level. Analytical formalisms for electron and ion confinement are used to predict the ionization efficiency for a given thruster design. Explicit determination of discharge loss and volume-averaged plasma parameters is also obtained. The model was used to predict the performance of the nominal and modified three- and four-ring cusp 30-cm ion thruster configurations operating at the full-power (2.3 kW) NSTAR throttle level. Experimental measurements of the modified engine configuration discharge loss compare well with the predicted values for propellant utilizations from 80 to 95%. The theory, as validated by experiment, indicates that increasing the magnetic field strength at the closed minimum reduces Maxwellian electron diffusion and electrostatically confines the ion population, reducing its subsequent loss to the anode wall. The theory also indicates that increasing the cusp strength and minimizing the cusp area improve primary electron confinement, increasing the probability of an ionization collision prior to loss at the cusp.
Status of internal quality control for thyroid hormones immunoassays from 2011 to 2016 in China.
Zhang, Shishi; Wang, Wei; Zhao, Haijian; He, Falin; Zhong, Kun; Yuan, Shuai; Wang, Zhiguo
2018-01-01
Internal quality control (IQC) plays a key role in the evaluation of precision performance in clinical laboratories. This report presents the precision status of thyroid hormone immunoassays from 2011 to 2016 in China. Through the Clinet-EQA reporting system, IQC information for triiodothyronine and thyroxine in free and total form (FT3, TT3, FT4, TT4), as well as thyroid-stimulating hormone (TSH), was collected from participant laboratories submitting IQC data each February from 2011 to 2016. For each analyte, current CVs were compared among different years and measurement systems. Percentages of laboratories meeting five allowable imprecision specifications (pass rates) were also calculated, and an analysis of IQC practice was conducted to complete the report. Over the 6 years, current CVs decreased significantly, but pass rates increased only for FT3. FT3, TT3, FT4, and TT4 had their highest pass rates against the 1/3 TEa imprecision specification, whereas TSH had its highest pass rate against the minimum imprecision specification derived from biological variation. Constituent ratios of the four mainstream measurement systems changed insignificantly. In 2016, the precision performance of the Abbott and Roche systems was better than that of the Beckman and Siemens systems for all analytes except FT3, for which Siemens also outperformed Beckman. Analysis of IQC practice demonstrated wide variation and great progress in IQC rules and control frequency. Despite changes in IQC practice, only FT3 showed improved precision performance over the 6 years; the precision status of the five analytes in China remained unsatisfactory, and ongoing investigation and improvement of IQC have yet to be achieved. © 2017 Wiley Periodicals, Inc.
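The pass-rate comparison described above reduces to a simple computation: an observed CV% from IQC data is checked against an allowable-imprecision specification such as 1/3 of the total allowable error (TEa). A minimal sketch, with illustrative function names and hypothetical FT3 values (the 12% TEa is an assumption for illustration, not a value from this report):

```python
# Sketch of the IQC precision check: coefficient of variation from
# replicate IQC results, compared against an allowable-imprecision spec.
from statistics import mean, stdev

def cv_percent(values):
    """Coefficient of variation of IQC results, in percent."""
    return 100.0 * stdev(values) / mean(values)

def passes_spec(cv, allowable_cv):
    """True when observed imprecision meets the specification."""
    return cv <= allowable_cv

# Hypothetical monthly FT3 IQC results (pmol/L).
ft3 = [4.9, 5.1, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9]
cv = cv_percent(ft3)
ok = passes_spec(cv, 1 / 3 * 12)  # assuming a TEa of 12% for illustration
```

A laboratory's pass rate for an analyte is then simply the fraction of participating laboratories for which `passes_spec` is true.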
Holistic rubric vs. analytic rubric for measuring clinical performance levels in medical students.
Yune, So Jung; Lee, Sang Yeoup; Im, Sun Ju; Kam, Bee Sung; Baek, Sun Yong
2018-06-05
Task-specific checklists, holistic rubrics, and analytic rubrics are often used for performance assessments. We examined what factors evaluators consider important in holistic scoring of clinical performance assessment, and compared the usefulness of holistic and analytic rubrics applied separately, and of analytic rubrics used in addition to task-specific checklists based on traditional standards. We compared the usefulness of a holistic rubric versus an analytic rubric in effectively measuring the clinical skill performances of 126 third-year medical students who participated in a clinical performance assessment conducted by Pusan National University School of Medicine. We conducted a questionnaire survey of 37 evaluators who used all three evaluation methods-holistic rubric, analytic rubric, and task-specific checklist-for each student. The relationship between the scores on the three evaluation methods was analyzed using Pearson's correlation. Inter-rater agreement was analyzed by the Kappa index. The effect of holistic and analytic rubric scores on the task-specific checklist score was analyzed using multiple regression analysis. Evaluators perceived accuracy and proficiency to be major factors in evaluating objective structured clinical examinations, and history taking and physical examination to be major factors in evaluating clinical performance examinations. Holistic rubric scores were highly related to the scores of the task-specific checklist and analytic rubric. Relatively low agreement was found in clinical performance examinations compared to objective structured clinical examinations. Meanwhile, the holistic and analytic rubric scores explained 59.1% of the task-specific checklist score in objective structured clinical examinations and 51.6% in clinical performance examinations.
The results show the usefulness of holistic and analytic rubrics in clinical performance assessment, which can be used in conjunction with task-specific checklists for more efficient evaluation.
Temporal abstraction-based clinical phenotyping with Eureka!
Post, Andrew R; Kurc, Tahsin; Willard, Richie; Rathod, Himanshu; Mansour, Michel; Pai, Akshatha Kalsanka; Torian, William M; Agravat, Sanjay; Sturm, Suzanne; Saltz, Joel H
2013-01-01
Temporal abstraction, a method for specifying and detecting temporal patterns in clinical databases, is very expressive and performs well, but it is difficult for clinical investigators and data analysts to understand. Such patterns are critical in phenotyping patients using their medical records in research and quality improvement. We have previously developed the Analytic Information Warehouse (AIW), which computes such phenotypes using temporal abstraction but requires software engineers to operate. We have extended the AIW's web user interface, Eureka! Clinical Analytics, to support specifying phenotypes using an alternative model that we developed with clinical stakeholders. The software converts phenotypes from this model to that of temporal abstraction prior to data processing. The model can represent all phenotypes in a quality improvement project and a growing set of phenotypes in a multi-site research study. Phenotyping that is accessible to investigators and IT personnel may enable its broader adoption.
An Improved Correlation between Impression and Uniaxial Creep
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsueh, Chun-Hway; Miranda, Pedro; Becher, Paul F
2006-01-01
A semiempirical correlation between impression and uniaxial creep has been established by Hyde et al. [Int. J. Mech. Sci. 35, 451 (1993)] using finite element results for materials exhibiting general power-law creep with the stress exponent n in the range 2 ≤ n ≤ 15. Here, we derive the closed-form solution for a special case of viscoelastic materials, i.e., n = 1, subjected to impression creep and obtain the exact correlation between impression and uniaxial creep. This analytical solution serves as a checkpoint for the finite element results. We then perform finite element analyses for the general case to derive a semiempirical correlation, which agrees well with both analytical viscoelastic results and the existing experimental data. Our improved correlation agrees with the correlation of Hyde et al. for n ≥ 4, and the difference increases with decreasing n for n < 4.
Chu, Hsuan-Lien; Wang, Chen-Chin; Dai, Yu-Tzu
2009-01-01
The health care industry is under pressure from government and private entities as well as from market conditions to contain costs. In an effort to respond to these pressures, the case hospital in this study implemented a Balanced Scorecard (BSC) in January 2003 and integrated it with the hospital's formal incentive plan for non-physicians in January 2005. The nursing department's performance improved in the 2 years following the introduction of the plan. This study contributes to the literature by demonstrating the performance improvement that results from integrating the BSC with an incentive plan in the nursing field. The results provide insight into the current BSC performance metrics applied by the case nursing department, and could be used as guidelines by other health care organizations that wish to implement BSC-based incentive plans.
Boscardin, Christy; Fergus, Kirkpatrick B; Hellevig, Bonnie; Hauer, Karen E
2017-11-09
Easily accessible and interpretable performance data constitute critical feedback for learners that facilitate informed self-assessment and learning planning. To provide this feedback, there has been a proliferation of educational dashboards in recent years. An educational (learner) dashboard systematically delivers timely and continuous feedback on performance and can provide easily visualized and interpreted performance data. In this paper, we provide practical tips for developing a functional, user-friendly individual learner performance dashboard and literature review of dashboard development, assessment theory, and users' perspectives. Considering key design principles and maximizing current technological advances in data visualization techniques can increase dashboard utility and enhance the user experience. By bridging current technology with assessment strategies that support learning, educators can continue to improve the field of learning analytics and design of information management tools such as dashboards in support of improved learning outcomes.
Long, H. Keith; Daddow, Richard L.; Farrar, Jerry W.
1998-01-01
Since 1962, the U.S. Geological Survey (USGS) has operated the Standard Reference Sample Project to evaluate the performance of USGS, cooperator, and contractor analytical laboratories that analyze chemical constituents of environmental samples. The laboratories are evaluated by using performance evaluation samples, called Standard Reference Samples (SRSs). SRSs are submitted to laboratories semi-annually for round-robin laboratory performance comparison purposes. Currently, approximately 100 laboratories are evaluated for their analytical performance on six SRSs for inorganic and nutrient constituents. As part of the SRS Project, a surplus of homogeneous, stable SRSs is maintained for purchase by USGS offices and participating laboratories for use in continuing quality-assurance and quality-control activities. Statistical evaluation of the laboratories' results provides information to compare the analytical performance of the laboratories and to determine possible analytical deficiencies and problems. SRS results also provide information on the bias and variability of different analytical methods used in the SRS analyses.
Vanhoof, Chris; Corthouts, Valère; Tirez, Kristof
2004-04-01
To determine the heavy metal content of soil samples at contaminated locations, a static and time-consuming procedure is used in most cases: soil samples are collected and analyzed in the laboratory at high quality and high analytical cost. Demand is growing from government and consultants for a more dynamic approach, and from customers for analyses performed in the field with immediate feedback of the analytical results. Field analyses are advisable especially during the follow-up of remediation projects or when determining the sampling strategy. For this purpose, four types of ED-XRF systems, ranging from portable up to high-performance laboratory systems, have been evaluated. The evaluation criteria are based on the performance characteristics of all the ED-XRF systems, such as limit of detection, accuracy, and measurement uncertainty on the one hand, and the influence of sample pretreatment on the obtained results on the other. The study proved that the field-portable system and the benchtop system, placed in a mobile van, can be applied as field techniques, yielding semi-quantitative analytical results. A limited homogenization of the analyzed sample significantly increases the representativeness of the soil sample. The ED-XRF systems can be differentiated by their limits of detection, which are a factor of 10 to 20 higher for the portable system. The accuracy of the results and the measurement uncertainty also improved using the benchtop system. Therefore, the selection criteria for the applicability of both field systems are based on the required detection level and the required accuracy of the results.
Leion, Felicia; Hegbrant, Josefine; den Bakker, Emil; Jonsson, Magnus; Abrahamson, Magnus; Nyman, Ulf; Björk, Jonas; Lindström, Veronica; Larsson, Anders; Bökenkamp, Arend; Grubb, Anders
2017-09-01
Estimating glomerular filtration rate (GFR) in adults by using the average of values obtained by a cystatin C-based (eGFR(cystatin C)) and a creatinine-based (eGFR(creatinine)) equation shows at least the same diagnostic performance as GFR estimates obtained by equations using only one of these analytes or by complex equations using both analytes. Comparison of eGFR(cystatin C) and eGFR(creatinine) plays a pivotal role in the diagnosis of Shrunken Pore Syndrome, where a low eGFR(cystatin C) compared to eGFR(creatinine) has been associated with higher mortality in adults. The present study was undertaken to elucidate whether this concept can also be applied in children. Using iohexol and inulin clearance as the gold standard in 702 children, we studied the diagnostic performance of 10 creatinine-based, 5 cystatin C-based, and 3 combined cystatin C-creatinine eGFR equations and compared them to the results of the averages of 9 pairs of eGFR(cystatin C) and eGFR(creatinine) estimates. While creatinine-based GFR estimations are unsuitable in children unless calibrated in a pediatric or mixed pediatric-adult population, cystatin C-based estimations in general performed well in children. The average of a suitable creatinine-based and a cystatin C-based equation generally displayed a better diagnostic performance than estimates obtained by equations using only one of these analytes or by complex equations using both analytes. Comparing eGFR(cystatin C) and eGFR(creatinine) may help identify pediatric patients with Shrunken Pore Syndrome.
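The averaging strategy evaluated above is arithmetically simple. A minimal sketch follows, where the cystatin C- and creatinine-based estimates are assumed to be precomputed by published equations, and the 60% ratio cutoff used to flag a possible Shrunken Pore Syndrome pattern is an illustrative assumption, not a value taken from this study:

```python
# Sketch of the average-eGFR strategy and the cystatin C/creatinine
# comparison described above. Inputs are precomputed eGFR values in
# mL/min/1.73 m^2; the 0.6 cutoff is hypothetical.

def average_egfr(egfr_cys, egfr_crea):
    """Arithmetic mean of a cystatin C-based and a creatinine-based eGFR."""
    return (egfr_cys + egfr_crea) / 2.0

def shrunken_pore_flag(egfr_cys, egfr_crea, ratio_cutoff=0.6):
    """Flag when eGFR(cystatin C) is disproportionately low vs eGFR(creatinine)."""
    return egfr_cys < ratio_cutoff * egfr_crea

mean_gfr = average_egfr(45.0, 80.0)    # 62.5
flagged = shrunken_pore_flag(45.0, 80.0)  # 45 < 0.6 * 80 = 48 -> True
```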
Radhakrishnan, Rajiv; Kiluk, Brian D; Tsai, Jack
2016-03-01
Cognitive remediation (CR) has been found to improve cognitive performance among adults with schizophrenia in randomized controlled trials (RCTs). However, improvements in cognitive performance are often observed in the control groups of RCTs as well. There has been no comprehensive examination of change in control groups for CR, which may inform trial methodology and improve our understanding of measured outcomes for cognitive remediation. In this meta-analysis, we calculated pre-post change in cognitive test performance within control groups of RCTs in 32 CR trials (n = 794 participants) published between 1970 and 2011, and examined the association between pre-post change and sample size, duration of treatment, type of control group, and participants' age, intelligence, duration of illness, and psychiatric symptoms. Results showed that control groups in CR trials showed small effect size changes (Cohen's d = 0.12 ± 0.16) in cognitive test performance over the trial duration. Study characteristics associated with pre-post change included participant age and sample size. These findings suggest attention to change in control groups may help improve detection of cognitive remediation effects for schizophrenia.
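The pre-post change pooled in the meta-analysis above can be sketched as a standardized mean change (Cohen's d with a pooled SD); the exact estimator the authors used may differ, for example by correcting for pre-post correlation or small samples:

```python
# Sketch of a pre-post effect size for a control group: the mean change
# in cognitive test scores divided by the pooled standard deviation.
from math import sqrt
from statistics import mean, stdev

def cohens_d_prepost(pre, post):
    """Standardized mean change between paired pre and post scores."""
    sd_pooled = sqrt((stdev(pre) ** 2 + stdev(post) ** 2) / 2.0)
    return (mean(post) - mean(pre)) / sd_pooled

# Hypothetical control-group scores at baseline and end of trial.
d = cohens_d_prepost([50, 52, 48, 51, 49], [51, 53, 49, 52, 50])
```

A small positive d in the control group, as reported above, indicates modest improvement attributable to practice or placebo effects rather than the remediation itself.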
Performance evaluation of the croissant production line with reparable machines
NASA Astrophysics Data System (ADS)
Tsarouhas, Panagiotis H.
2015-03-01
In this study, analytical probability models were developed for an automated serial production system: a bufferless line consisting of n machines in series with a common transfer mechanism and control system. Both the time to failure and the time to repair a failure are assumed to follow exponential distributions. Applying these models, the effect of system parameters on system performance was studied in an actual croissant production line, which consists of six workstations with different numbers of repairable machines in series. Mathematical models of the croissant production line have been developed using a Markov process. The strength of this study is in the classification of the whole system into states representing failures of different machines. Failure and repair data from the actual production environment have been used to estimate reliability and maintainability for each machine, each workstation, and the entire line, based on the analytical models. The analysis provides useful insight into the system's behaviour, helps to find inherent design faults, and suggests optimal modifications to upgrade the system and improve its performance.
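Under the exponential failure/repair assumption above, each machine is a two-state Markov process whose steady-state availability is MTTF/(MTTF + MTTR), and a bufferless serial line is up only when every machine is up. A minimal sketch with hypothetical rates (the paper's actual models track the full failure state space, not just availability):

```python
# Steady-state availability of a bufferless serial line under
# exponential failure and repair times. All (MTTF, MTTR) values
# below are illustrative, not data from the study.

def machine_availability(mttf, mttr):
    """Steady-state availability of one machine (two-state Markov chain)."""
    return mttf / (mttf + mttr)

def line_availability(machines):
    """Bufferless serial line is up only when all machines are up."""
    a = 1.0
    for mttf, mttr in machines:
        a *= machine_availability(mttf, mttr)
    return a

# Six workstations with hypothetical (MTTF, MTTR) in hours.
line = [(120, 2), (90, 3), (150, 1.5), (80, 2), (200, 4), (100, 2)]
line_a = line_availability(line)
```

This makes visible why a single poorly maintained workstation dominates line performance: availabilities multiply, so the line can never exceed its weakest machine.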
An analytical study of various telecommunication networks using Markov models
NASA Astrophysics Data System (ADS)
Ramakrishnan, M.; Jayamani, E.; Ezhumalai, P.
2015-04-01
The main aim of this paper is to examine issues relating to the performance of various telecommunication networks, applying queuing theory for better design and improved efficiency. First, an analytical study of queues quantifies the phenomenon of waiting lines using representative performance measures such as average queue length (the average number of customers in the queue), average waiting time in the queue, and average facility utilization (the proportion of time the service facility is in use). Second, using the Matlab simulator, the paper summarizes the findings of the investigations and describes the methodology used to (a) compare the waiting time and average number of messages in the queue for M/M/1 and M/M/2 queues, (b) compare the performance of M/M/1 and M/D/1 queues, and (c) study the effect of increasing the number of servers on the blocking probability in the M/M/k/k queue model.
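The representative measures named above have closed forms for the models compared in the paper. A minimal sketch of the M/M/1 metrics and the M/M/k/k (Erlang B) blocking probability, assuming Poisson arrivals at rate lam and exponential service at rate mu:

```python
# Closed-form queueing measures: M/M/1 mean queue length and waiting
# time, and the Erlang B blocking probability for the M/M/k/k model.
from math import factorial

def mm1_metrics(lam, mu):
    """(Lq, Wq) for a stable M/M/1 queue (requires lam < mu)."""
    rho = lam / mu
    lq = rho ** 2 / (1 - rho)  # mean number waiting in queue
    wq = lq / lam              # mean waiting time, by Little's law
    return lq, wq

def erlang_b(lam, mu, k):
    """Blocking probability of an M/M/k/k loss system."""
    a = lam / mu  # offered load in erlangs
    terms = [a ** n / factorial(n) for n in range(k + 1)]
    return terms[k] / sum(terms)
```

Evaluating `erlang_b` for increasing k shows the effect of adding servers on the blocking probability, which is the comparison (c) above performs by simulation.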
On the performance of energy detection-based CR with SC diversity over IG channel
NASA Astrophysics Data System (ADS)
Verma, Pappu Kumar; Soni, Sanjay Kumar; Jain, Priyanka
2017-12-01
Cognitive radio (CR) is a viable 5G technology to address the scarcity of the spectrum. Energy detection-based sensing is known to be the simplest method as far as hardware complexity is concerned. In this paper, the performance of the spectrum sensing-based energy detection technique in CR networks over an inverse Gaussian channel with the selection combining diversity technique is analysed. More specifically, accurate analytical expressions for the average detection probability under different detection scenarios, such as a single channel (no diversity) and diversity reception, are derived and evaluated. Further, the detection threshold parameter is optimised by minimising the probability of error over several diversity branches. The results clearly show a significant improvement in the probability of detection when the optimised threshold parameter is applied. The impact of shadowing parameters on the performance of the energy detector is studied in terms of the complementary receiver operating characteristic curve. To verify the correctness of our analysis, the derived analytical expressions are corroborated against exact results and Monte Carlo simulations.
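The detection and false-alarm probabilities analysed above can be checked by simulation. A Monte Carlo sketch of a basic energy detector over plain AWGN (not the inverse Gaussian shadowing channel or SC diversity treated in the paper; all parameter values are illustrative):

```python
# Monte Carlo estimate of (Pd, Pfa) for an energy detector: sum the
# energy of n_samples observations and compare against a threshold,
# under signal-present (H1) and noise-only (H0) hypotheses.
import random

def energy_detect(snr_linear, n_samples, threshold, trials=20000, seed=1):
    rng = random.Random(seed)
    amp = snr_linear ** 0.5  # signal amplitude for unit-variance noise
    pd = pfa = 0
    for _ in range(trials):
        e_h1 = sum((amp + rng.gauss(0, 1)) ** 2 for _ in range(n_samples))
        e_h0 = sum(rng.gauss(0, 1) ** 2 for _ in range(n_samples))
        pd += e_h1 > threshold   # detection under H1
        pfa += e_h0 > threshold  # false alarm under H0
    return pd / trials, pfa / trials
```

Sweeping the threshold and plotting Pfa against (1 - Pd) produces the complementary ROC curve used in the paper to study shadowing effects.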
NASA Astrophysics Data System (ADS)
Li, Yuanyuan; Gao, Guanjun; Zhang, Jie; Zhang, Kai; Chen, Sai; Yu, Xiaosong; Gu, Wanyi
2015-06-01
A simplex-method based optimizing (SMO) strategy is proposed to improve the transmission performance of dispersion-uncompensated (DU) coherent optical systems with non-identical spans. Through an analytical expression for the quality of transmission (QoT), this strategy improves the Q factors effectively while minimizing the number of erbium-doped fiber amplifiers (EDFAs) that need to be optimized. Numerical simulations are performed for 100 Gb/s polarization-division multiplexed quadrature phase shift keying (PDM-QPSK) channels over 10 spans of standard single-mode fiber (SSMF) with randomly distributed span lengths. Compared to EDFA configurations with complete span-loss compensation, the Q factor of the SMO strategy is improved by approximately 1 dB at the optimal transmitter launch power. Moreover, instead of adjusting the gains of all the EDFAs to their optimal values, the number of EDFAs that need to be adjusted for SMO is reduced from 8 to 2, showing much lower tuning cost and almost negligible performance degradation.
Ishibashi, Midori
2015-01-01
Cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and drug efficacy. To assure the quality of laboratory tests, providing high-quality testing is mandatory, and quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, some tests are done in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Besides these, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. The analytical method, units, and reference interval are also important factors. It is concluded that, to overcome the problems derived from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.
Decision-directed detector for overlapping PCM/NRZ signals.
NASA Technical Reports Server (NTRS)
Wang, C. D.; Noack, T. L.
1973-01-01
A decision-directed (DD) technique for the detection of overlapping PCM/NRZ signals in the presence of white Gaussian noise is investigated. The performance of the DD detector is represented by probability of error Pe versus input signal-to-noise ratio (SNR). To examine how much improvement in performance can be achieved with this technique, Pe's with and without DD feedback are evaluated in parallel. Further, analytical results are compared with those found by Monte Carlo simulations. The results are in good agreement.
Hybrid bearings for LH2 and LO2 turbopumps
NASA Technical Reports Server (NTRS)
Butner, M. F.; Lee, F. C.
1985-01-01
Hybrid combinations of hydrostatic and ball bearings can improve bearing performance for liquid hydrogen and liquid oxygen turbopumps. Analytic studies were conducted to optimize hybrid bearing designs for the SSME-type turbopump conditions. A method to empirically determine damping coefficients was devised. Four hybrid bearing configurations were designed, and three were fabricated. Six hybrid and hydrostatic-only bearing configurations will be tested for steady-state and transient performance, and quantification of damping coefficients. The initial tests were conducted with the liquid hydrogen bearing.
Public Health Analysis Transport Optimization Model v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyeler, Walt; Finley, Patrick; Walser, Alex
PHANTOM models the logistic functions of national public health systems. The system enables public health officials to visualize and coordinate options for public health surveillance, diagnosis, response, and administration in an integrated analytical environment. Users may simulate and analyze system performance by applying scenarios that represent current conditions or future contingencies, enabling what-if analyses of potential systemic improvements. Public health networks are visualized as interactive maps, with graphical displays of relevant system performance metrics as calculated by the simulation modeling components.
The Spanish external quality assessment scheme for mercury in urine.
Quintana, M J; Mazarrasa, O
1996-01-01
In 1986 the Instituto Nacional de Seguridad e Higiene en el Trabajo (INSHT) established the "Programa interlaboratorios de control de calidad de mercurio en orina (PICC-HgU)". The operation of this scheme is explained, criteria for evaluation of laboratory performance are defined, and some of the results obtained are reviewed. Since the scheme started, an improvement in the overall performance of laboratories has been observed. The differences in the analytical methods used by laboratories do not seem to have a clear influence on the results.
Du, Lihong; White, Robert L
2009-02-01
A previously proposed partition equilibrium model for quantitative prediction of analyte response in electrospray ionization mass spectrometry is modified to yield an improved linear relationship. Analyte mass spectrometer response is modeled by a competition mechanism between analyte and background electrolytes that is based on partition equilibrium considerations. The correlation between analyte response and solution composition is described by the linear model over a wide concentration range and the improved model is shown to be valid for a wide range of experimental conditions. The behavior of an analyte in a salt solution, which could not be explained by the original model, is correctly predicted. The ion suppression effects of 16:0 lysophosphatidylcholine (LPC) on analyte signals are attributed to a combination of competition for excess charge and reduction of total charge due to surface tension effects. In contrast to the complicated mathematical forms that comprise the original model, the simplified model described here can more easily be employed to predict analyte mass spectrometer responses for solutions containing multiple components. Copyright (c) 2008 John Wiley & Sons, Ltd.
Enhanced biosensor performance using an avidin-biotin bridge for antibody immobilization
NASA Astrophysics Data System (ADS)
Narang, Upvan; Anderson, George P.; King, Keeley D.; Liss, Heidi S.; Ligler, Frances S.
1997-05-01
Maintaining antibody function after immobilization is critical to the performance of a biosensor. The conventional methods to immobilize antibodies onto surfaces are via covalent attachment using a crosslinker or by adsorption. Often, these methods of immobilization result in partial denaturation of the antibody and conformational changes leading to reduced antibody activity. In this paper, we report on the immobilization of antibodies onto the surface of an optical fiber through an avidin-biotin bridge for the detection of ricin, ovalbumin, and Bacillus globigii (Bg). The assays are performed in a sandwich format. First, a capture antibody is immobilized, followed by the addition of the analyte. Finally, a fluorophore-labeled antibody is added for the specific detection of the analyte. The evanescent wave-induced fluorescence is coupled back through the same fiber to be detected using a photodiode. In all cases, we observe improved performance of the biosensor, i.e., a lower limit of detection and a wider linear dynamic range, for the assays in which the antibody is immobilized via avidin-biotin bridges compared to the covalent attachment method.
Bassanese, Danielle N; Conlan, Xavier A; Barnett, Neil W; Stevenson, Paul G
2015-05-01
This paper explores the analytical figures of merit of two-dimensional high-performance liquid chromatography for the separation of antioxidant standards. The cumulative two-dimensional peak area was calculated for 11 antioxidants by two different methods: from the areas reported by the control software and by fitting the data with a Gaussian model. Both methods were evaluated for precision and sensitivity, and both demonstrated excellent precision with regard to retention time in the second dimension (%RSD below 1.16%) and cumulative second-dimension peak area (%RSD below 3.73% for the instrument software and 5.87% for the Gaussian method). Combining areas reported by the chromatographic control software gave superior limits of detection, on the order of 1 × 10^-6 M, almost an order of magnitude lower than the Gaussian method for some analytes. The introduction of the countergradient eliminated the strong-solvent mismatch between dimensions, leading to much improved peak shape and better detection limits for quantification. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
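The two peak-area methods above differ mainly in how the area is obtained. As a minimal sketch (peak parameters are illustrative, not from the paper): a fitted Gaussian peak's area has the closed form amp · sigma · sqrt(2π), which can be checked against numerical integration of the same peak.

```python
import math

def gaussian(t, amp, mu, sigma):
    # Gaussian peak profile as a function of retention time t
    return amp * math.exp(-((t - mu) ** 2) / (2 * sigma ** 2))

def gaussian_area(amp, sigma):
    # closed-form area under a Gaussian peak: amp * sigma * sqrt(2*pi)
    return amp * sigma * math.sqrt(2 * math.pi)

def trapezoid_area(amp, mu, sigma, t0, t1, n=2000):
    # numerical integration of the sampled peak (as software would do)
    h = (t1 - t0) / n
    ys = [gaussian(t0 + i * h, amp, mu, sigma) for i in range(n + 1)]
    return h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))
```

Fitting and using the closed form avoids baseline-integration choices, at the cost of assuming the peak really is Gaussian.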
Neumann, Cedric; Ramotowski, Robert; Genessay, Thibault
2011-05-13
Forensic examinations of ink have been performed since the beginning of the 20th century. Since the 1960s, the International Ink Library, maintained by the United States Secret Service, has supported those analyses. Until 2009, the search and identification of inks were essentially performed manually. This paper describes the results of a project designed to improve ink samples' analytical and search processes. The project focused on the development of improved standardization procedures to ensure the best possible reproducibility between analyses run on different HPTLC plates. The successful implementation of this new calibration method enabled the development of mathematical algorithms and of a software package to complement the existing ink library. Copyright © 2010 Elsevier B.V. All rights reserved.
Duan, Min; Wang, Wei; Zhao, Haijian; Zhang, Chuanbao; He, Falin; Zhong, Kun; Yuan, Shuai; Wang, Zhiguo
2018-05-01
Internal quality control (IQC) is essential for precision evaluation and continuous quality improvement. This study investigates the IQC status of blood gas analysis (BGA) in clinical laboratories of China from 2014 to 2017. IQC information on BGA (pH, pCO2, pO2, Na+, K+, Ca2+, and Cl-) was submitted by external quality assessment (EQA) participant laboratories and collected through the Clinet-EQA reporting system each March from 2014 to 2017. First, current CVs were compared among years and measurement systems. Then, the percentage of laboratories meeting each of five allowable imprecision specifications was calculated for each analyte. Finally, laboratories were divided into groups based on control rules and control frequency to compare their variation trends. The current CVs of BGA decreased significantly from 2014 to 2017. pH and pCO2 achieved the highest pass rates against the minimum imprecision specification, whereas pO2, Na+, K+, Ca2+, and Cl- achieved their highest pass rates when the 1/3 TEa imprecision specification was applied. The pass rates of pH, pO2, Na+, K+, Ca2+, and Cl- increased significantly over the 4 years. Comparison of current CVs among measurement systems showed no regular pattern in the precision performance of the different analytes from 2014 to 2017. The analysis of IQC practice indicated considerable progress across the years: the imprecision performance of BGA improved from 2014 to 2017, but it remains unsatisfactory in China. Therefore, further investigation and continuous improvement measures should be undertaken.
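The pass-rate computation above reduces, per laboratory and analyte, to comparing a CV% against a fraction of the total allowable error (TEa). A minimal sketch; the replicate values and the TEa of 0.8% are hypothetical, since real specifications vary by analyte.

```python
import statistics

def cv_percent(values):
    # coefficient of variation of IQC replicates, in percent
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def passes(cv, tea_percent, fraction=1 / 3):
    # imprecision specification taken as a fraction of TEa (here 1/3 TEa)
    return cv <= fraction * tea_percent

# hypothetical pH IQC replicates and hypothetical TEa
iqc = [7.38, 7.40, 7.39, 7.41, 7.40]
ok = passes(cv_percent(iqc), tea_percent=0.8)
```

The laboratory-level pass rate is then the fraction of laboratories for which `passes` is true under the chosen specification.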
Performance and capacity analysis of Poisson photon-counting based Iter-PIC OCDMA systems.
Li, Lingbin; Zhou, Xiaolin; Zhang, Rong; Zhang, Dingchen; Hanzo, Lajos
2013-11-04
In this paper, an iterative parallel interference cancellation (Iter-PIC) technique is developed for optical code-division multiple-access (OCDMA) systems relying on shot-noise-limited Poisson photon-counting reception. The novel semi-analytical tool of extrinsic information transfer (EXIT) charts is used for analysing both the bit error rate (BER) performance and the channel capacity of these systems, and the results are verified by Monte Carlo simulations. The proposed Iter-PIC OCDMA system is capable of achieving two orders of magnitude of BER improvement and a capacity improvement of 0.1 nats over conventional chip-level OCDMA systems at a coding rate of 1/10.
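Shot-noise-limited reception, as above, means the detector observes Poisson-distributed photon counts. A minimal single-user sketch (the Iter-PIC receiver itself is not reproduced, and the intensity values are illustrative): BER for on-off keying with a likelihood-ratio count threshold.

```python
import math
import random

def sample_poisson(rng, lam):
    # Knuth's method: count uniforms until their product drops below e^-lam
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def ook_ber(lam0, lam1, n=100_000, seed=7):
    """Monte Carlo BER for on-off keying with Poisson photon counting.
    lam0: mean count for bit 0 (dark counts); lam1: mean count for bit 1."""
    rng = random.Random(seed)
    # likelihood-ratio threshold: equate Poisson pmfs and solve for the count
    thr = (lam1 - lam0) / math.log(lam1 / lam0)
    errors = 0
    for _ in range(n):
        bit = rng.getrandbits(1)
        count = sample_poisson(rng, lam1 if bit else lam0)
        decided = 1 if count > thr else 0
        errors += decided != bit
    return errors / n
```

Raising the signal intensity lowers the error floor, which is the regime in which interference cancellation gains matter.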
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, T.
The Nuclear Forensics Analysis Center (NFAC) is part of Savannah River National Laboratory (SRNL) and is one of only two USG National Laboratories accredited to perform nuclear forensic analyses to the requirements of ISO 17025. SRNL NFAC is capable of analyzing nuclear and radiological samples from bulk material to ultra-trace samples. NFAC provides analytical support to the FBI's Radiological Evidence Examination Facility (REEF), which is located within SRNL. REEF gives the FBI the capability to perform traditional forensics on material that is radiological and/or is contaminated. SRNL is engaged in research and development efforts to improve the USG technical nuclear forensics capabilities. Research includes improving predictive signatures and developing a database containing comparative samples.
Some trends in aircraft design: Structures
NASA Technical Reports Server (NTRS)
Brooks, G. W.
1975-01-01
Trends and programs currently underway on the national scene to improve the structural interface in the aircraft design process are discussed. The National Aeronautics and Space Administration shares a partnership with the educational and industrial community in the development of the tools, the criteria, and the data base essential to produce high-performance and cost-effective vehicles. Several thrusts to build the technology in materials, structural concepts, analytical programs, and integrated design procedures essential for performing the trade-offs required to fashion competitive vehicles are presented. The application of advanced fibrous composites, improved methods for structural analysis, and continued attention to important peripheral problems of aeroelastic and thermal stability are among the topics considered.
Chung, Yun Won; Kwon, Jae Kyun; Park, Suwon
2014-01-01
One of the key technologies supporting the mobility of a mobile station (MS) in mobile communication systems is location management, which consists of location update and paging. In this paper, an improved movement-based location management scheme with two movement thresholds is proposed, considering the bursty data traffic characteristics of packet-switched (PS) services. Analytical models for the location update and paging signaling loads of the proposed scheme are developed in detail, and its performance is compared with that of the conventional scheme. We show that the proposed scheme outperforms the conventional scheme in terms of total signaling load with an appropriate selection of movement thresholds.
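In movement-based schemes like the one above, the threshold trades update signaling against paging load. A toy single-threshold cost model (hexagonal cell layout; the unit costs and call rate are illustrative, and this is not the paper's two-threshold analysis) shows why an intermediate threshold minimizes total signaling:

```python
def cells_within(d):
    # cells within distance d of the last-update cell in a hexagonal layout
    return 3 * d * (d - 1) + 1

def total_cost(d, moves_per_call, cu=10.0, cp=1.0):
    # cu: cost per location update; cp: cost per paged cell (illustrative)
    update_cost = cu * (moves_per_call / d)  # updates between call arrivals
    paging_cost = cp * cells_within(d)       # page every cell in the area
    return update_cost + paging_cost

# a small threshold pages few cells but updates often; a large one the reverse
best = min(range(1, 20), key=lambda d: total_cost(d, moves_per_call=30))
```

Adding a second threshold, as the paper does, refines this trade-off for bursty PS traffic.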
Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.
Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis
2016-07-01
Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description on the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of different approaches, assumptions, and scope of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. 
The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; to include a detailed description of how the data used to model the natural course of disease progression were derived; to state and justify the economic model selected, along with its structural assumptions and limitations; to provide a detailed (rather than high-level) description of the cost components included in the model; and to report on the face, internal, and cross validity of the model to strengthen the credibility of and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, on a scale of 0-100). Despite the advancements in decision-analytic models of AD, several areas of improvement remain necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models.
An analytic performance model of disk arrays and its application
NASA Technical Reports Server (NTRS)
Lee, Edward K.; Katz, Randy H.
1991-01-01
As disk arrays become widely used, tools for understanding and analyzing their performance become increasingly important. In particular, performance models can be invaluable in both configuring and designing disk arrays. Accurate analytic performance models are desirable over other types of models because they can be quickly evaluated, are applicable under a wide range of system and workload parameters, and can be manipulated by a range of mathematical techniques. Unfortunately, analytical performance models of disk arrays are difficult to formulate due to the presence of queuing and fork-join synchronization; a disk array request is broken up into independent disk requests which must all complete to satisfy the original request. We develop, validate, and apply an analytic performance model for disk arrays. We derive simple equations for approximating their utilization, response time, and throughput. We then validate the analytic model via simulation and investigate the accuracy of each approximation used in deriving the analytical model. Finally, we apply the analytical model to derive an equation for the optimal unit of data striping in disk arrays.
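The fork-join synchronization noted above is why disk-array models resort to approximations. A minimal sketch in that spirit, not the authors' equations: M/M/1 per-disk queues plus a harmonic-number penalty for waiting on the slowest of the forked disk requests (a standard approximation for the maximum of i.i.d. exponential stages).

```python
def harmonic(k):
    # H_k = 1 + 1/2 + ... + 1/k
    return sum(1.0 / i for i in range(1, k + 1))

def fork_join_response(arrival_rate, disk_service_rate, stripe_width, n_disks):
    # each array request forks into stripe_width disk requests spread evenly
    per_disk_rate = arrival_rate * stripe_width / n_disks
    rho = per_disk_rate / disk_service_rate
    assert rho < 1, "per-disk queue is unstable"
    # single-disk M/M/1 response time
    t_disk = 1.0 / (disk_service_rate - per_disk_rate)
    # fork-join penalty: expected max of k i.i.d. exponential stage times
    # scales with the harmonic number H_k
    return t_disk * harmonic(stripe_width)
```

Wider striping speeds transfers but inflates the synchronization penalty, which is the tension behind an optimal striping unit.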
The Analog Revolution and Its On-Going Role in Modern Analytical Measurements.
Enke, Christie G
2015-12-15
The electronic revolution in analytical instrumentation began when we first exceeded the two-digit resolution of panel meters and chart recorders and then took the first steps into automated control. It started with the first uses of operational amplifiers (op amps) in the analog domain, 20 years before the digital computer entered the analytical lab. Their application greatly increased both accuracy and precision in chemical measurement, and they provided an elegant means for the electronic control of experimental quantities. Later, laboratory and personal computers provided unlimited readout resolution and enabled programmable control of instrument parameters as well as storage and computation of acquired data. However, digital computers did not replace the op amp's critical role of converting the analog sensor's output to a robust and accurate voltage. Rather, they added a new role: converting that voltage into a number. These analog operations are generally the limiting portions of our computerized instrumentation systems. Operational amplifier performance in gain, input current and resistance, offset voltage, and rise time has improved by a remarkable 3-4 orders of magnitude since their first implementations. Each 10-fold improvement has opened the doors for the development of new techniques in all areas of chemical analysis. Along with some interesting history, this article describes the multiple roles op amps play in modern instrumentation and gives a number of examples of new areas of analysis that their improvements have enabled.
Hybrid Analytical and Data-Driven Modeling for Feed-Forward Robot Control †
Reinhart, René Felix; Shareef, Zeeshan; Steil, Jochen Jakob
2017-01-01
Feed-forward model-based control relies on models of the controlled plant, e.g., in robotics on accurate knowledge of manipulator kinematics or dynamics. However, mechanical and analytical models do not capture all aspects of a plant’s intrinsic properties and there remain unmodeled dynamics due to varying parameters, unmodeled friction or soft materials. In this context, machine learning is an alternative suitable technique to extract non-linear plant models from data. However, fully data-based models suffer from inaccuracies as well and are inefficient if they include learning of well known analytical models. This paper thus argues that feed-forward control based on hybrid models comprising an analytical model and a learned error model can significantly improve modeling accuracy. Hybrid modeling here serves the purpose to combine the best of the two modeling worlds. The hybrid modeling methodology is described and the approach is demonstrated for two typical problems in robotics, i.e., inverse kinematics control and computed torque control. The former is performed for a redundant soft robot and the latter for a rigid industrial robot with redundant degrees of freedom, where a complete analytical model is not available for any of the platforms. PMID:28208697
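The hybrid idea above, an analytic model plus a learned error model for the residual dynamics, can be sketched with a toy one-joint plant whose friction-like term is absent from the analytic part. The plant, the single sin(q) feature, and the data are illustrative assumptions, not the paper's robots or learning machinery.

```python
import math

def analytic_model(q):
    # known rigid-body term only; the sin(q) friction-like term is unmodeled
    return 2.0 * q

def fit_error_model(qs, taus):
    # least-squares fit of the residual tau - analytic(q) on a sin(q) feature
    xs = [math.sin(q) for q in qs]
    rs = [tau - analytic_model(q) for q, tau in zip(qs, taus)]
    return sum(x * r for x, r in zip(xs, rs)) / sum(x * x for x in xs)

def hybrid_model(q, w):
    # analytic part plus the learned error correction
    return analytic_model(q) + w * math.sin(q)

# synthetic training data from the "true" plant: tau = 2.0*q + 0.3*sin(q)
qs = [0.1 * i for i in range(1, 60)]
taus = [2.0 * q + 0.3 * math.sin(q) for q in qs]
w = fit_error_model(qs, taus)
```

The learner only has to capture the small residual, which is the efficiency argument the paper makes against fully data-based models.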
Hybrid Analytical and Data-Driven Modeling for Feed-Forward Robot Control.
Reinhart, René Felix; Shareef, Zeeshan; Steil, Jochen Jakob
2017-02-08
Feed-forward model-based control relies on models of the controlled plant, e.g., in robotics on accurate knowledge of manipulator kinematics or dynamics. However, mechanical and analytical models do not capture all aspects of a plant's intrinsic properties and there remain unmodeled dynamics due to varying parameters, unmodeled friction or soft materials. In this context, machine learning is an alternative suitable technique to extract non-linear plant models from data. However, fully data-based models suffer from inaccuracies as well and are inefficient if they include learning of well known analytical models. This paper thus argues that feed-forward control based on hybrid models comprising an analytical model and a learned error model can significantly improve modeling accuracy. Hybrid modeling here serves the purpose to combine the best of the two modeling worlds. The hybrid modeling methodology is described and the approach is demonstrated for two typical problems in robotics, i.e., inverse kinematics control and computed torque control. The former is performed for a redundant soft robot and the latter for a rigid industrial robot with redundant degrees of freedom, where a complete analytical model is not available for any of the platforms.
ClimateSpark: An in-memory distributed computing framework for big climate data analytics
NASA Astrophysics Data System (ADS)
Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei
2018-06-01
The unprecedented growth of climate data creates new opportunities for climate studies, yet big climate data pose a grand challenge for climatologists to manage and analyze efficiently. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high-performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index built over the chunks avoids unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to provide a web portal that facilitates interaction among climatologists, climate data, analytic operations, and computing resources (e.g., using SQL queries and Scala/Python notebooks). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multi-dimensional, array-based datasets in various geoscience domains.
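The chunking-plus-index idea above can be sketched with a dictionary index that maps spatiotemporal chunk coordinates to record positions, so a point query reads only one chunk instead of scanning everything. Chunk sizes and the record layout are illustrative assumptions, not ClimateSpark's actual formats.

```python
def chunk_id(t, lat, lon, t_step=24, lat_step=10, lon_step=10):
    # map a (time, lat, lon) point to its chunk coordinates
    return (t // t_step, int(lat // lat_step), int(lon // lon_step))

def build_index(records):
    # spatiotemporal index: chunk coordinates -> positions of records inside
    index = {}
    for i, (t, lat, lon, _) in enumerate(records):
        index.setdefault(chunk_id(t, lat, lon), []).append(i)
    return index

def query(records, index, t, lat, lon):
    # read only the matching chunk instead of scanning all records
    return [records[i] for i in index.get(chunk_id(t, lat, lon), [])]
```

In a distributed setting the same lookup decides which worker's in-memory partition holds the data, which is where the data-locality benefit comes from.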
Poore, Joshua C; Forlines, Clifton L; Miller, Sarah M; Regan, John R; Irvine, John M
2014-12-01
The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures-personality, cognitive style, motivated cognition-predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated "top-down" cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains.
Forlines, Clifton L.; Miller, Sarah M.; Regan, John R.; Irvine, John M.
2014-01-01
The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures—personality, cognitive style, motivated cognition—predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated “top-down” cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains. PMID:25983670
Automated dynamic analytical model improvement for damped structures
NASA Technical Reports Server (NTRS)
Fuh, J. S.; Berman, A.
1985-01-01
A method is described to improve a linear, nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or inversion of a large matrix.
NASA Astrophysics Data System (ADS)
Wu, Hsin-Hung; Tsai, Ya-Ning
2012-11-01
This study uses both the analytic hierarchy process (AHP) and the decision-making trial and evaluation laboratory (DEMATEL) method to evaluate criteria in the auto spare parts industry in Taiwan. Traditionally, AHP does not consider indirect effects for each criterion and assumes that criteria are independent, without addressing interdependence between or among them. Thus, the importance computed by AHP can be viewed as a short-term improvement opportunity. By contrast, the DEMATEL method not only evaluates the importance of criteria but also depicts their causal relations. By inspecting the causal diagrams, improvement focused on cause-oriented criteria may raise performance effectively and efficiently from a long-term perspective. As a result, the major advantage of integrating the AHP and DEMATEL methods is that the decision maker can continuously improve suppliers' performance from both short-term and long-term viewpoints.
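The DEMATEL computation referred to above follows fixed steps: normalize the direct-influence matrix, compute the total-relation matrix T = N(I - N)^-1, and read prominence (R+C) and cause/effect (R-C) from row and column sums. A minimal two-criterion sketch with illustrative influence scores; the closed-form 2x2 inverse stands in for a linear-algebra library.

```python
def dematel_2x2(d):
    # normalize the direct-influence matrix by its largest row sum
    s = max(d[0][0] + d[0][1], d[1][0] + d[1][1])
    n = [[x / s for x in row] for row in d]
    # total-relation matrix T = N (I - N)^-1, with the 2x2 inverse in closed form
    a, b = 1 - n[0][0], -n[0][1]
    c, e = -n[1][0], 1 - n[1][1]
    det = a * e - b * c
    inv = [[e / det, -b / det], [-c / det, a / det]]
    t = [[sum(n[i][k] * inv[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
    rows = [sum(t[i]) for i in range(2)]
    cols = [t[0][j] + t[1][j] for j in range(2)]
    # (prominence R+C, relation R-C); positive relation marks a net cause
    return [(rows[i] + cols[i], rows[i] - cols[i]) for i in range(2)]
```

With a direct matrix where criterion 0 strongly influences criterion 1 but not vice versa, criterion 0 comes out as the cause, which is the cause-oriented signal AHP alone does not provide.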
Nano-immunoassay with improved performance for detection of cancer biomarkers
Krasnoslobodtsev, Alexey V.; Torres, Maria P.; Kaur, Sukhwinder; ...
2015-01-01
Nano-immunoassay utilizing the surface-enhanced Raman scattering (SERS) effect is a promising analytical technique for the early detection of cancer. In its current form the assay is capable of discriminating samples of healthy individuals from samples of pancreatic cancer patients. Further improvements in sensitivity and reproducibility will extend practical applications of SERS-based detection platforms to a wider range of problems. In this report, we discuss several strategies designed to improve the performance of the SERS-based detection system. We demonstrate that the reproducibility of the platform is enhanced by using an atomically smooth mica surface as a template for preparation of the capture surface in the SERS sandwich immunoassay. Furthermore, the assay's stability and sensitivity can be further improved by applying either a polymer or a graphene monolayer as a thin protective layer on top of the assay. The protective layer renders the signal more stable against photo-induced damage and carbonaceous contamination.
A Novel IEEE 802.15.4e DSME MAC for Wireless Sensor Networks
Sahoo, Prasan Kumar; Pattanaik, Sudhir Ranjan; Wu, Shih-Lin
2017-01-01
The IEEE 802.15.4e standard proposes a Deterministic and Synchronous Multichannel Extension (DSME) mode for wireless sensor networks (WSNs) to support industrial, commercial, and health care applications. In this paper, a new channel access scheme and beacon scheduling schemes are designed for IEEE 802.15.4e-enabled WSNs in a star topology to reduce network discovery time and energy consumption. In addition, a new dynamic guaranteed retransmission slot allocation scheme is designed for devices whose Guaranteed Time Slot (GTS) transmission fails, to reduce retransmission delay. To evaluate our schemes, analytical models are designed to analyze the performance of WSNs in terms of reliability, delay, throughput, and energy consumption. Our schemes are validated against simulation, and the simulation results are observed to match the analytical ones well. The evaluated results of our designed schemes show significant improvements in reliability, throughput, delay, and energy consumption. PMID:28275216
Improved Quantification of Free and Ester-Bound Gallic Acid in Foods and Beverages by UHPLC-MS/MS.
Newsome, Andrew G; Li, Yongchao; van Breemen, Richard B
2016-02-17
Hydrolyzable tannins are measured routinely during the characterization of food and beverage samples. Most methods for the determination of hydrolyzable tannins use hydrolysis or methanolysis to convert complex tannins to small molecules (gallic acid, methyl gallate, and ellagic acid) for quantification by HPLC-UV. Often unrecognized, the analytical limitations and variability inherent in these approaches include the variable mass fraction (0-0.90) that is released as analyte, contributions from sources other than tannins to hydrolyzable gallate (which can exceed 10% w/w), the measurement of both free and total analyte, and the lack of controls to account for degradation. An accurate, specific, sensitive, and higher-throughput approach for the determination of hydrolyzable gallate, based on ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), was developed to overcome these limitations.
A Novel IEEE 802.15.4e DSME MAC for Wireless Sensor Networks.
Sahoo, Prasan Kumar; Pattanaik, Sudhir Ranjan; Wu, Shih-Lin
2017-01-16
The IEEE 802.15.4e standard proposes a Deterministic and Synchronous Multichannel Extension (DSME) mode for wireless sensor networks (WSNs) to support industrial, commercial, and health care applications. In this paper, a new channel access scheme and beacon scheduling schemes are designed for IEEE 802.15.4e-enabled WSNs in a star topology to reduce network discovery time and energy consumption. In addition, a new dynamic guaranteed retransmission slot allocation scheme is designed for devices whose Guaranteed Time Slot (GTS) transmission fails, to reduce retransmission delay. To evaluate our schemes, analytical models are designed to analyze the performance of WSNs in terms of reliability, delay, throughput, and energy consumption. Our schemes are validated against simulation, and the simulation results are observed to match the analytical ones well. The evaluated results of our designed schemes show significant improvements in reliability, throughput, delay, and energy consumption.
NASA Astrophysics Data System (ADS)
Aymard, François; Gulminelli, Francesca; Margueron, Jérôme
2016-08-01
We have recently addressed the problem of the determination of the nuclear surface energy for symmetric nuclei in the framework of the extended Thomas-Fermi (ETF) approximation using Skyrme functionals. We presently extend this formalism to the case of asymmetric nuclei and the question of the surface symmetry energy. We propose an approximate expression for the diffuseness and the surface energy. These quantities are analytically related to the parameters of the energy functional. In particular, the influence of the different equation of state parameters can be explicitly quantified. Detailed analyses of the different energy components (local/non-local, isoscalar/isovector, surface/curvature and higher order) are also performed. Our analytical solution of the ETF integral improves previous models and leads to a precision of better than 200 keV per nucleon in the determination of the nuclear binding energy for dripline nuclei.
A Widely Applicable Silver Sol for TLC Detection with Rich and Stable SERS Features.
Zhu, Qingxia; Li, Hao; Lu, Feng; Chai, Yifeng; Yuan, Yongfang
2016-12-01
Thin-layer chromatography (TLC) coupled with surface-enhanced Raman spectroscopy (SERS) has gained tremendous popularity in the study of various complex systems. However, the detection of hydrophobic analytes is difficult, and the specificity still needs to be improved. In this study, a SERS-active non-aqueous silver sol which could activate the analytes to produce rich and stable spectral features was rapidly synthesized. Then, the optimized silver nanoparticles (AgNPs)-DMF sol was employed for TLC-SERS detection of hydrophobic (and also hydrophilic) analytes. SERS performance of this sol was superior to that of traditional Lee-Meisel AgNPs due to its high specificity, acceptable stability, and wide applicability. The non-aqueous AgNPs would be suitable for the TLC-SERS method, which shows great promise for applications in food safety assurance, environmental monitoring, medical diagnoses, and many other fields.
Hydrogel nanoparticle based immunoassay
Liotta, Lance A; Luchini, Alessandra; Petricoin, Emanuel F; Espina, Virginia
2015-04-21
An immunoassay device incorporating porous polymeric capture nanoparticles, either within the sample collection vessel or pre-impregnated into a porous substratum within the fluid flow path of the analytical device, is presented. This incorporation of capture particles within the immunoassay device improves sensitivity while removing the requirement for pre-processing of samples prior to loading the immunoassay device. A preferred embodiment is core-shell, bait-containing capture nanoparticles which perform three functions in one step, in solution: a) molecular size sieving, b) target analyte sequestration and concentration, and c) protection from degradation. The polymeric matrix of the capture particles may be made of co-polymeric materials having a structural monomer and an affinity monomer, the affinity monomer having properties that attract the analyte to the capture particle. This device is useful for point-of-care diagnostic assays for biomedical applications and as field-deployable assays for environmental, pathogen, and chemical or biological threat identification.
Developing an Emergency Physician Productivity Index Using Descriptive Health Analytics.
Khalifa, Mohamed
2015-01-01
Emergency department (ED) crowding has become a major barrier to receiving timely emergency care. At King Faisal Specialist Hospital and Research Center, Saudi Arabia, we identified variables and factors affecting crowding and performance in order to develop indicators that support evaluation and improvement. To measure the efficiency of work and the activity of throughput processes, it was important to develop an ED physician productivity index. Data on all ED patient encounters over the last six months of 2014 were retrieved, and descriptive health analytics methods were used. Three variables were identified for their influence on productivity and performance: Number of Treated Patients per Physician, Patient Acuity Level, and Treatment Time. The study suggested a formula that calculates each physician's productivity index by dividing the Number of Treated Patients by the square of the Patient Acuity Level and by the Treatment Time, making it possible to identify physicians with a low productivity index and to investigate causes and contributing factors.
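The suggested formula can be sketched numerically. The exact weighting below is an assumption read off the abstract's wording (patients divided by squared acuity and by treatment time), and all physician names and values are illustrative:

```python
# Sketch of the productivity index described above. The weighting
# (patients / (acuity^2 * treatment time)) is an assumption based on the
# abstract's wording; physician data are illustrative placeholders.
def productivity_index(treated_patients, mean_acuity_level, mean_treatment_hours):
    """Higher values suggest more patients handled per unit of acuity-adjusted time."""
    return treated_patients / (mean_acuity_level ** 2 * mean_treatment_hours)

physicians = {
    "A": (120, 3.0, 1.5),  # patients, mean acuity level, mean treatment time (h)
    "B": (90, 2.0, 1.0),
}
scores = {name: productivity_index(*vals) for name, vals in physicians.items()}
low = min(scores, key=scores.get)  # candidate for further review of causes
```

A physician with many treated patients but low acuity and long treatment times would score lower than one handling fewer, more acute cases quickly, which matches the stated intent of the index.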
NASA Technical Reports Server (NTRS)
Bienart, W. B.
1973-01-01
The objective of this program was to investigate analytically and experimentally the performance of heat pipes with composite wicks--specifically, those having pedestal arteries and screwthread circumferential grooves. An analytical model was developed to describe the effects of screwthreads and screen secondary wicks on the transport capability of the artery. The model describes the hydrodynamics of the circumferential flow in triangular grooves with azimuthally varying capillary menisci and liquid cross-sections. Normalized results were obtained which give the influence of evaporator heat flux on the axial heat transport capability of the arterial wick. In order to evaluate the priming behavior of composite wicks under actual load conditions, an 'inverted' glass heat pipe was designed and constructed. The results obtained from the analysis and from the tests with the glass heat pipe were applied to the OAO-C Level 5 heat pipe, and an improved correlation between predicted and measured evaporator and transport performance was obtained.
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.
42 CFR 493.845 - Standard; Toxicology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...
42 CFR 493.851 - Standard; Hematology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...
42 CFR 493.843 - Standard; Endocrinology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...
42 CFR 493.845 - Standard; Toxicology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...
42 CFR 493.845 - Standard; Toxicology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...
42 CFR 493.851 - Standard; Hematology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...
42 CFR 493.843 - Standard; Endocrinology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...
42 CFR 493.843 - Standard; Endocrinology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...
42 CFR 493.851 - Standard; Hematology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...
Katrangi, Waddah; Grebe, Stephan K G; Algeciras-Schimnich, Alicia
2017-10-26
While thyroglobulin autoantibodies (TgAb) can result in false low serum thyroglobulin (Tg) immunoassay (IA) measurements, they might also be indicators of disease persistence/recurrence. Hence, accurate TgAb measurement, in addition to Tg quantification, is crucial for thyroid cancer monitoring. We compared the analytical and clinical performance of four commonly used TgAb IAs. We measured Tg by mass spectrometry (Tg-MS) and by four pairs of Tg and TgAb IAs (Beckman, Roche, Siemens, Thermo) in 576 samples. Limit of quantitation (LOQ) and manufacturers' upper reference interval cut-off (URI) were used for comparisons. Clinical performance was assessed by receiver operating characteristic (ROC) curve analysis. Quantitative and qualitative agreement between TgAb-IAs was moderate, with R2 of 0.20-0.70 and κ from 0.41 to 0.66 using the LOQ and from 0.47 to 0.71 using the URI. In samples with TgAb interference, detection rates of TgAb were similar using LOQ and URI for Beckman, Siemens, and Thermo, but much lower for the Roche TgAb-IA when the URI was used. In TgAb-positive cases, the ROC areas under the curve (AUC) for the TgAb-IAs were 0.59 (Beckman), 0.62 (Siemens), 0.59 (Roche), and 0.59 (Thermo), similar to the ROC AUCs achieved with Tg. Combining Tg and TgAb measurements improved the ROC AUCs compared to Tg or TgAb alone. TgAb-IAs show significant qualitative and quantitative differences. For 2 of the 4 TgAb-IAs, using the LOQ improves the detection of interfering TgAbs. All assays showed suboptimal clinical performance when used as surrogate markers of disease, with modest improvements when Tg and TgAb were combined.
Li, Tingting; Wang, Wei; Zhao, Haijian; He, Falin; Zhong, Kun; Yuan, Shuai; Wang, Zhiguo
2017-09-01
This study aimed to evaluate whether the quality performance of clinical laboratories in China has been greatly improved and whether the Internal Quality Control (IQC) practice for HbA1c has also changed since the National Center for Clinical Laboratories (NCCL) of China began organizing laboratories to report IQC data for HbA1c in 2012. IQC information from 306 External Quality Assessment (EQA) participant laboratories that kept reporting IQC data each February from 2012 to 2016 was collected through the Web-based EQA system. Percentages of laboratories meeting four different imprecision specifications for the current coefficients of variation (CVs) of HbA1c measurements were then calculated. Finally, we comprehensively analyzed the analytical systems and IQC practice of HbA1c measurements. The current CVs of HbA1c tests decreased significantly from 2012 to 2016, and the percentages of laboratories meeting all four imprecision specifications increased year by year. As for the analytical system, 52.1% (159/306) of laboratories changed their systems along with a change in assay principle, and many laboratories began to use cation-exchange high-performance liquid chromatography (CE-HPLC) instead of immunoturbidimetry because CE-HPLC had lower intra-laboratory CVs. IQC practice data, such as IQC rules and frequency, also showed significant variability among years, with an overall tendency toward meeting requirements. The imprecision performance of HbA1c tests improved over these 5 years along with changes in IQC practice, but it is still disappointing in China. Therefore, laboratories should actively identify existing problems and take action to improve the performance of HbA1c measurements. © 2016 Wiley Periodicals, Inc.
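A minimal sketch of the imprecision evaluation described above. The CV thresholds and laboratory values below are illustrative placeholders, not the study's actual specifications or data:

```python
# Count the share of laboratories whose current HbA1c CV meets each
# imprecision specification. Thresholds and CVs are assumed for illustration.
specs = {"optimal": 1.0, "desirable": 2.0, "minimum": 3.0}  # CV% limits (assumed)
lab_cvs = [0.8, 1.5, 2.4, 3.1, 1.9, 2.8]                    # one CV% per laboratory

def pct_meeting(cvs, threshold):
    """Percentage of laboratories with CV at or below the specification."""
    return 100.0 * sum(cv <= threshold for cv in cvs) / len(cvs)

summary = {name: pct_meeting(lab_cvs, t) for name, t in specs.items()}
```

Tracking such percentages year over year is one straightforward way to express the improving-tendency finding the abstract reports.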
Population-Based Pediatric Reference Intervals in General Clinical Chemistry: A Swedish Survey.
Ridefelt, Peter
2015-01-01
Very few high quality studies on pediatric reference intervals for general clinical chemistry and hematology analytes have been performed. Three recent prospective community-based projects utilising blood samples from healthy children in Sweden, Denmark and Canada have substantially improved the situation. The Swedish survey included 701 healthy children. Reference intervals for general clinical chemistry and hematology were defined.
Access to Basic Education in Ghana: The Evidence and the Issues. Country Analytic Report
ERIC Educational Resources Information Center
Akyeampong, Kwame; Djangmah, Jerome; Oduro, Abena; Seidu, Alhassan; Hunt, Frances
2007-01-01
The analysis of access to education in Ghana builds on the Ministry of Education Sector Performance Report and the World Bank sector studies. Though access has improved, it remains uneven and has not grown fast enough to reach universal levels of participation in primary school and JSS [Junior Secondary School] by 2015. More needs to be…
Analytics-Driven Lossless Data Compression for Rapid In-situ Indexing, Storing, and Querying
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, John; Arkatkar, Isha; Lakshminarasimhan, Sriram
2013-01-01
The analysis of scientific simulations is highly data-intensive and is becoming an increasingly important challenge. Peta-scale data sets require the use of light-weight query-driven analysis methods, as opposed to heavy-weight schemes that optimize for speed at the expense of size. This paper is an attempt in the direction of query processing over losslessly compressed scientific data. We propose a co-designed double-precision compression and indexing methodology for range queries by performing unique-value-based binning on the most significant bytes of double precision data (sign, exponent, and most significant mantissa bits), and inverting the resulting metadata to produce an inverted index over a reduced data representation. Without the inverted index, our method matches or improves compression ratios over both general-purpose and floating-point compression utilities. The inverted index is light-weight, and the overall storage requirement for both reduced column and index is less than 135%, whereas existing DBMS technologies can require 200-400%. As a proof-of-concept, we evaluate univariate range queries that additionally return column values, a critical component of data analytics, against state-of-the-art bitmap indexing technology, showing multi-fold query performance improvements.
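The binning idea can be sketched as follows. This is a toy illustration of unique-value binning on the most significant bytes of doubles and the resulting inverted index, not the paper's implementation:

```python
import struct
from collections import defaultdict

# Bin double-precision values by their k most significant bytes (big-endian,
# so sign/exponent/high-mantissa bits come first), then invert the bin
# assignments into an index mapping byte-prefix bin -> row positions.
def high_bytes(x, k=2):
    return struct.pack(">d", x)[:k]

def build_inverted_index(values, k=2):
    index = defaultdict(list)
    for i, v in enumerate(values):
        index[high_bytes(v, k)].append(i)
    return index

data = [1.5, 1.25, -3.0, 1.5, 100.0]
idx = build_inverted_index(data)
# A range query need only visit bins whose byte prefix can overlap the
# queried interval, then check the reduced representation for exact matches.
```

Because IEEE 754 orders sign and exponent before the mantissa, numerically close positive values tend to share a prefix, which is what makes prefix bins useful for range pruning.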
NASA Astrophysics Data System (ADS)
Rahman, Mohammed M.; Jamal, A.; Khan, Sher Bahadar; Faisal, M.
2011-10-01
Hydrothermally prepared as-grown low-dimensional nanoparticles (NPs) have been characterized using UV-vis spectroscopy, Fourier transform infrared (FT-IR) spectroscopy, powder X-ray diffraction (XRD), field emission scanning electron microscopy (FE-SEM), Raman spectroscopy, and electron dispersion spectroscopy (EDS). The uniformity of the nanomaterial was examined by scanning electron microscopy, and the single phase of the nanocrystalline β-Fe2O3 was characterized using XRD techniques. A glassy carbon electrode (GCE) fabricated with β-Fe2O3 nanoparticles showed improved chloroform-sensing performance in terms of electrical response (I-V technique) for detecting the analyte in the liquid phase. The analytical performance was investigated, showing that the sensitivity, stability, and reproducibility of the sensor were improved significantly by using the Fe2O3 NP thin film on the GCE. The calibration plot was linear (R = 0.9785) over the large range of 12.0 μM to 12.0 mM. The sensitivity was calculated as 2.1792 μA cm−2 mM−1, with a detection limit of 4.4 ± 0.10 μM and a short response time (10.0 s).
Sastrawan, J; Jones, C; Akhalwaya, I; Uys, H; Biercuk, M J
2016-08-01
We introduce concepts from optimal estimation to the stabilization of precision frequency standards limited by noisy local oscillators. We develop a theoretical framework casting various measures for frequency standard variance in terms of frequency-domain transfer functions, capturing the effects of feedback stabilization via a time series of Ramsey measurements. Using this framework, we introduce an optimized hybrid predictive feedforward measurement protocol that employs results from multiple past measurements and transfer-function-based calculations of measurement covariance to improve the accuracy of corrections within the feedback loop. In the presence of common non-Markovian noise processes these measurements will be correlated in a calculable manner, providing a means to capture the stochastic evolution of the local oscillator frequency during the measurement cycle. We present analytic calculations and numerical simulations of oscillator performance under competing feedback schemes and demonstrate benefits in both correction accuracy and long-term oscillator stability using hybrid feedforward. Simulations verify that in the presence of uncompensated dead time and noise with significant spectral weight near the inverse cycle time predictive feedforward outperforms traditional feedback, providing a path towards developing a class of stabilization software routines for frequency standards limited by noisy local oscillators.
Microchip integrating magnetic nanoparticles for allergy diagnosis.
Teste, Bruno; Malloggi, Florent; Siaugue, Jean-Michel; Varenne, Anne; Kanoufi, Frederic; Descroix, Stéphanie
2011-12-21
We report on the development of a simple and easy-to-use microchip dedicated to allergy diagnosis. This microchip combines the advantages of homogeneous immunoassays, i.e. species diffusion, with those of heterogeneous immunoassays, i.e. easy separation and preconcentration steps. In vitro allergy diagnosis is based on specific immunoglobulin E (IgE) quantitation; accordingly, we have developed and integrated magnetic core-shell nanoparticles (MCSNPs) as an IgE capture nanoplatform in a microdevice, taking benefit from both their magnetic and colloidal properties. Integrating such an immunosupport allows the target analyte (IgE) to be captured in the colloidal phase, increasing the capture kinetics since both immunological partners diffuse during the immune reaction. This colloidal approach improves the analyte capture kinetics 1000-fold compared with conventional methods. Moreover, based on the MCSNPs' magnetic properties and on the magnetic chamber we previously developed, the MCSNPs, and therefore the target, can be confined and preconcentrated within the microdevice prior to the detection step. The MCSNP preconcentration factor achieved was about 35,000, which allows high sensitivity to be reached without catalytic amplification during the detection step. The developed microchip offers many advantages: the analytical procedure is fully integrated on-chip, analyses are performed in a short assay time (20 min), and sample and reagent consumption is reduced to a few microlitres (5 μL), while a low limit of detection (about 1 ng mL(-1)) can be achieved.
NASA Astrophysics Data System (ADS)
Setty, Srinivas J.; Cefola, Paul J.; Montenbruck, Oliver; Fiedler, Hauke
2016-05-01
Catalog maintenance for Space Situational Awareness (SSA) demands an accurate and computationally lean orbit propagation and orbit determination technique to cope with the ever increasing number of observed space objects. As an alternative to established numerical and analytical methods, we investigate the accuracy and computational load of the Draper Semi-analytical Satellite Theory (DSST). The standalone version of the DSST was enhanced with additional perturbation models to improve its recovery of short periodic motion. The accuracy of DSST is, for the first time, compared to a numerical propagator with fidelity force models for a comprehensive grid of low, medium, and high altitude orbits with varying eccentricity and different inclinations. Furthermore, the run-time of both propagators is compared as a function of propagation arc, output step size and gravity field order to assess its performance for a full range of relevant use cases. For use in orbit determination, a robust performance of DSST is demonstrated even in the case of sparse observations, which is most sensitive to mismodeled short periodic perturbations. Overall, DSST is shown to exhibit adequate accuracy at favorable computational speed for the full set of orbits that need to be considered in space surveillance. Along with the inherent benefits of a semi-analytical orbit representation, DSST provides an attractive alternative to the more common numerical orbit propagation techniques.
Kawamoto, Kensaku; Martin, Cary J; Williams, Kip; Tu, Ming-Chieh; Park, Charlton G; Hunter, Cheri; Staes, Catherine J; Bray, Bruce E; Deshmukh, Vikrant G; Holbrook, Reid A; Morris, Scott J; Fedderson, Matthew B; Sletta, Amy; Turnbull, James; Mulvihill, Sean J; Crabtree, Gordon L; Entwistle, David E; McKenna, Quinn L; Strong, Michael B; Pendleton, Robert C; Lee, Vivian S
2015-01-01
To develop expeditiously a pragmatic, modular, and extensible software framework for understanding and improving healthcare value (costs relative to outcomes). In 2012, a multidisciplinary team was assembled by the leadership of the University of Utah Health Sciences Center and charged with rapidly developing a pragmatic and actionable analytics framework for understanding and enhancing healthcare value. Based on an analysis of relevant prior work, a value analytics framework known as Value Driven Outcomes (VDO) was developed using an agile methodology. Evaluation consisted of measurement against project objectives, including implementation timeliness, system performance, completeness, accuracy, extensibility, adoption, satisfaction, and the ability to support value improvement. A modular, extensible framework was developed to allocate clinical care costs to individual patient encounters. For example, labor costs in a hospital unit are allocated to patients based on the hours they spent in the unit; actual medication acquisition costs are allocated to patients based on utilization; and radiology costs are allocated based on the minutes required for study performance. Relevant process and outcome measures are also available. A visualization layer facilitates the identification of value improvement opportunities, such as high-volume, high-cost case types with high variability in costs across providers. Initial implementation was completed within 6 months, and all project objectives were fulfilled. The framework has been improved iteratively and is now a foundational tool for delivering high-value care. The framework described can be expeditiously implemented to provide a pragmatic, modular, and extensible approach to understanding and improving healthcare value. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association.
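The activity-based allocation described above (for example, a unit's labor cost spread across patient encounters in proportion to hours spent) can be sketched as follows; names and values are illustrative, not the VDO system's code:

```python
# Spread a hospital unit's labor cost across encounters by hours spent there.
# Cost and hours are illustrative; real allocation would also cover actual
# medication acquisition costs and radiology minutes, as described above.
def allocate_labor(unit_labor_cost, hours_by_encounter):
    total_hours = sum(hours_by_encounter.values())
    return {enc: unit_labor_cost * h / total_hours
            for enc, h in hours_by_encounter.items()}

alloc = allocate_labor(9000.0, {"enc1": 10.0, "enc2": 20.0, "enc3": 60.0})
# The allocated amounts sum back to the unit's total labor cost.
```

The same proportional pattern generalizes to other cost drivers (medication units used, radiology minutes), which is what makes the framework modular.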
Improving Data Transfer Throughput with Direct Search Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaprakash, Prasanna; Morozov, Vitali; Kettimuthu, Rajkumar
2016-01-01
Improving data transfer throughput over high-speed long-distance networks has become increasingly difficult. Numerous factors such as nondeterministic congestion, dynamics of the transfer protocol, and multiuser and multitask source and destination endpoints, as well as interactions among these factors, contribute to this difficulty. A promising approach to improving throughput consists in using parallel streams at the application layer. We formulate and solve the problem of choosing the number of such streams from a mathematical optimization perspective. We propose the use of direct search methods, a class of easy-to-implement and light-weight mathematical optimization algorithms, to improve the performance of data transfers by dynamically adapting the number of parallel streams in a manner that does not require domain expertise, instrumentation, analytical models, or historic data. We apply our method to transfers performed with the GridFTP protocol, and illustrate the effectiveness of the proposed algorithm when used within Globus, a state-of-the-art data transfer tool, on production WAN links and servers. We show that when compared to user default settings our direct search methods can achieve up to 10x performance improvement under certain conditions. We also show that our method can overcome performance degradation due to external compute and network load on source end points, a common scenario at high performance computing facilities.
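A minimal direct-search sketch of this kind of approach, assuming a simple neighborhood probe over integer stream counts; the throughput function below is a stand-in for a measured transfer rate, not the authors' actual algorithm:

```python
# Direct search over the number of parallel streams: probe the two neighboring
# counts and move toward higher measured throughput, stopping at a local
# optimum. No model or historic data is required, only throughput samples.
def direct_search(measure, start=4, lo=1, hi=64, max_iters=20):
    best = start
    for _ in range(max_iters):
        candidates = [n for n in (best - 1, best + 1) if lo <= n <= hi]
        better = max(candidates, key=measure)
        if measure(better) <= measure(best):
            return best  # no neighbor improves throughput
        best = better
    return best

# Illustrative concave throughput curve peaking at 16 streams.
throughput = lambda n: -(n - 16) ** 2
best_streams = direct_search(throughput)  # converges to the peak at 16
```

In practice each `measure(n)` call would be a short timed transfer, so the search cost itself must stay small relative to the transfer, which is why light-weight direct search methods are attractive here.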
Dynamic remapping of parallel computations with varying resource demands
NASA Technical Reports Server (NTRS)
Nicol, D. M.; Saltz, J. H.
1986-01-01
A large class of computational problems is characterized by frequent synchronization, and computational requirements which change as a function of time. When such a problem must be solved on a message passing multiprocessor machine, the combination of these characteristics leads to system performance which decreases in time. Performance can be improved with periodic redistribution of computational load; however, redistribution can exact a sometimes large delay cost. We study the issue of deciding when to invoke a global load remapping mechanism. Such a decision policy must effectively weigh the costs of remapping against the performance benefits. We treat this problem by constructing two analytic models which exhibit stochastically decreasing performance. One model is quite tractable; we are able to describe the optimal remapping algorithm, and the optimal decision policy governing when to invoke that algorithm. However, computational complexity prohibits the use of the optimal remapping decision policy. We then study the performance of a general remapping policy on both analytic models. This policy attempts to minimize a statistic W(n) which measures the system degradation (including the cost of remapping) per computation step over a period of n steps. We show that as a function of time, the expected value of W(n) has at most one minimum, and that when this minimum exists it defines the optimal fixed-interval remapping policy. Our decision policy appeals to this result by remapping when it estimates that W(n) is minimized. Our performance data suggests that this policy effectively finds the natural frequency of remapping. We also use the analytic models to express the relationship between performance and remapping cost, number of processors, and the computation's stochastic activity.
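The statistic W(n) can be sketched as follows, assuming per-step degradation is observed directly and the remapping cost is amortized over the interval; the degradation values and remap cost are illustrative, not the paper's models:

```python
# W(n) = (accumulated degradation over n steps + remapping cost) / n.
# With performance that degrades over time, W(n) first falls (the remap cost
# is amortized over more steps) and then rises (degradation dominates); the
# policy remaps at the step where W(n) is minimized.
def remap_step(degradation_per_step, remap_cost):
    total, best_w, best_n = 0.0, float("inf"), None
    for n, d in enumerate(degradation_per_step, start=1):
        total += d
        w = (total + remap_cost) / n
        if w < best_w:
            best_w, best_n = w, n
    return best_n

# Linearly degrading performance: each step costs a little more than the last.
steps = [1.0 + 0.5 * k for k in range(20)]
```

With these numbers, a larger remap cost pushes the minimizing n later, matching the intuition that expensive remaps should be amortized over longer intervals.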
Renz, Nora; Cabric, Sabrina; Morgenstern, Christian; Schuetz, Michael A; Trampuz, Andrej
2018-04-01
Bone healing disturbance following fracture fixation represents a continuing challenge. We evaluated a novel fully automated polymerase chain reaction (PCR) assay using sonication fluid from retrieved orthopedic hardware to diagnose infection. In this prospective diagnostic cohort study, explanted orthopedic hardware materials from consecutive patients were investigated by sonication, and the resulting sonication fluid was analyzed by culture (standard procedure) and multiplex PCR (investigational procedure). Hardware-associated infection was defined as visible purulence, presence of a sinus tract, implant on view, inflammation in peri-implant tissue, or positive culture. McNemar's chi-squared test was used to compare the performance of diagnostic tests. For clinical performance, all pathogens were considered, whereas for analytical performance, only microorganisms for which primers are included in the PCR assay were considered. Among 51 patients, hardware-associated infection was diagnosed in 38 cases (75%) and non-infectious causes in 13 patients (25%). The sensitivity for diagnosing infection was 66% for peri-implant tissue culture, 84% for sonication fluid culture, and 71% (clinical performance) and 77% (analytical performance) for sonication fluid PCR; the specificity of all tests was >90%. The analytical sensitivity of PCR was higher for gram-negative bacilli (100%), coagulase-negative staphylococci (89%), and Staphylococcus aureus (75%) than for Cutibacterium (formerly Propionibacterium) acnes (57%), enterococci (50%), and Candida spp. (25%). The performance of sonication fluid PCR for diagnosis of orthopedic hardware-associated infection was comparable to culture tests. The additional advantage of PCR was short processing time (<5 h) and a fully automated procedure. With further improvement of its performance, PCR has the potential to complement conventional cultures. Copyright © 2018 Elsevier Ltd. All rights reserved.
Linear and Order Statistics Combiners for Pattern Classification
NASA Technical Reports Server (NTRS)
Tumer, Kagan; Ghosh, Joydeep; Lau, Sonie (Technical Monitor)
2001-01-01
Several researchers have experimentally shown that substantial improvements can be obtained in difficult pattern recognition problems by combining or integrating the outputs of multiple classifiers. This chapter provides an analytical framework to quantify the improvements in classification results due to combining. The results apply to both linear combiners and order statistics combiners. We first show that to a first order approximation, the error rate obtained over and above the Bayes error rate is directly proportional to the variance of the actual decision boundaries around the Bayes optimum boundary. Combining classifiers in output space reduces this variance, and hence reduces the 'added' error. If N unbiased classifiers are combined by simple averaging, the added error rate can be reduced by a factor of N if the individual errors in approximating the decision boundaries are uncorrelated. Expressions are then derived for linear combiners which are biased or correlated, and the effect of output correlations on ensemble performance is quantified. For order statistics based non-linear combiners, we derive expressions that indicate how much the median, the maximum and in general the i-th order statistic can improve classifier performance. The analysis presented here facilitates the understanding of the relationships among error rates, classifier boundary distributions, and combining in output space. Experimental results on several public domain data sets are provided to illustrate the benefits of combining and to support the analytical results.
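The 1/N variance-reduction claim for unbiased, uncorrelated estimators can be illustrated numerically with a toy boundary-estimation experiment (synthetic Gaussian errors, not the chapter's datasets):

```python
import random

# Average N independent, unbiased noisy estimates of a decision boundary and
# compare the variance of the ensemble estimate with that of a single
# estimator. With uncorrelated errors the variance shrinks roughly as 1/N.
random.seed(0)

def boundary_estimates(n_trials, N, true_boundary=0.0, sigma=1.0):
    out = []
    for _ in range(n_trials):
        avg = sum(random.gauss(true_boundary, sigma) for _ in range(N)) / N
        out.append(avg)
    return out

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

v1 = variance(boundary_estimates(5000, 1))  # single classifier
v8 = variance(boundary_estimates(5000, 8))  # 8-classifier average
# v8 is close to v1 / 8 when the individual errors are uncorrelated.
```

Since the added error is proportional to this boundary variance, the same factor carries over to the error-rate reduction stated above.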
Heat conduction tuning by hyperbranched nanophononic metamaterials
NASA Astrophysics Data System (ADS)
Li, Bing; Tan, K. T.; Christensen, Johan
2018-05-01
Phonon dispersion and thermal conduction properties of hyperbranched nanostructures with unique topological complexity are theoretically and numerically investigated in this research. We present analytical cantilever-in-mass models to analyze and control the inherent resonance hybridization in hyperbranched nanomembranes containing different configurations and cross sections. We show that these local resonances hosted by hyperbranched nanopillars can generate numerous flat bands in the phonon dispersion relation and dramatically lower the group velocities, consequently resulting in a significant reduction of the thermal conductivity. The applicability of the proposed analytical models in thermal conductivity tuning is demonstrated, and a superior performance in reducing the heat flux in nano-structured membranes is exhibited, which can potentially lead to improved thermoelectric energy conversion devices.
Boareto, Marcelo; Cesar, Jonatas; Leite, Vitor B P; Caticha, Nestor
2015-01-01
We introduce Supervised Variational Relevance Learning (Suvrel), a variational method to determine metric tensors that define distance-based similarity in pattern classification, inspired by relevance learning. The variational method is applied to a cost function that penalizes large intraclass distances and favors small interclass distances. We analytically find the metric tensor that minimizes the cost function. Preprocessing the patterns by applying linear transformations derived from the metric tensor yields a dataset which can be more efficiently classified. We test our methods using publicly available datasets, for some standard classifiers. Among these datasets, two were tested by the MAQC-II project and, even without the use of further preprocessing, our results improve on their performance.
Preliminary Results on Uncertainty Quantification for Pattern Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene
2015-09-01
This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.
Fast analytical scatter estimation using graphics processing units.
Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris
2015-01-01
To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first order scatter in cone-beam image reconstruction improves the contrast to noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and with further acceleration and a method to account for multiple scatter may be useful for practical scatter correction schemes.
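The scaled root-mean-square difference metric used to compare the analytical and Monte Carlo scatter estimates can be sketched as follows. The exact scaling convention is not given in the abstract, so normalizing by the mean of the reference profile is an assumption, and the two profiles below are made-up illustration data:

```python
import numpy as np

# One plausible form of a scaled root-mean-square difference metric:
# RMS of the pointwise difference, normalized by the reference mean.
def scaled_rmsd(estimate, reference):
    diff = np.asarray(estimate, float) - np.asarray(reference, float)
    return np.sqrt(np.mean(diff ** 2)) / np.mean(reference)

mc = np.array([100.0, 120.0, 90.0, 110.0])       # stand-in Monte Carlo profile
analytic = np.array([98.0, 123.0, 88.0, 112.0])  # stand-in analytical profile
print(round(scaled_rmsd(analytic, mc), 4))
```

A value near zero indicates close quantitative agreement between the two scatter estimates, complementing the qualitative comparison of images and profiles.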
Bhushan, Ravi; Sen, Arijit
2017-04-01
Very few Indian studies exist on the evaluation of pre-analytical variables affecting "Prothrombin Time", the commonest coagulation assay performed. The study was performed in an Indian tertiary care setting with the aim of quantitatively assessing the prevalence of pre-analytical variables and their effects on results (patient safety) for the Prothrombin Time test. The study also evaluated whether intervention corrected the results. The study first evaluated the prevalence of the various pre-analytical variables detected in samples sent for Prothrombin Time testing. Wherever possible, samples with detected variables were tested and the results noted. Samples from the same patients were then recollected and retested, ensuring that no pre-analytical variable was present, and the results were again noted to check for the difference the intervention produced. The study evaluated 9989 samples received for PT/INR over a period of 18 months. The prevalence of pre-analytical variables was found to be 862 (8.63%). The proportions of the various pre-analytical variables detected were: haemolysed samples 515 (5.16%), overfilled vacutainers 62 (0.62%), underfilled vacutainers 39 (0.39%), low values 205 (2.05%), clotted samples 11 (0.11%), wrong labeling 4 (0.04%), wrong vacutainer use 2 (0.02%), chylous samples 7 (0.07%), and samples with more than one variable 17 (0.17%). The percentage of samples showing errors was compared for the first four variables, since these could be tested with and without the variable in place. The reduction in error percentage post intervention was 91.5%, 69.2%, 81.5%, and 95.4% for haemolysed, overfilled, underfilled, and samples collected with excess pressure at phlebotomy, respectively. Correcting the variables reduced the error percentage to a great extent for these four variables; hence these variables are found to affect "Prothrombin Time" testing and can hamper patient safety.
Quintana, Daniel S.
2015-01-01
Meta-analysis synthesizes a body of research investigating a common research question. Outcomes from meta-analyses provide a more objective and transparent summary of a research area than traditional narrative reviews. Moreover, they are often used to support research grant applications, guide clinical practice, and direct health policy. The aim of this article is to provide a practical and non-technical guide for psychological scientists that outlines the steps involved in planning and performing a meta-analysis of correlational datasets. I provide a supplementary R script to demonstrate each analytical step described in the paper, which is readily adaptable for researchers to use for their analyses. While the worked example is the analysis of a correlational dataset, the general meta-analytic process described in this paper is applicable for all types of effect sizes. I also emphasize the importance of meta-analysis protocols and pre-registration to improve transparency and help avoid unintended duplication. An improved understanding of this tool will not only help scientists to conduct their own meta-analyses but also improve their evaluation of published meta-analyses. PMID:26500598
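The paper supplies an R script; as a hedged, language-neutral illustration of the core pooling step for correlational data (Fisher's z transform with inverse-variance weights, fixed-effect for brevity), here is a sketch with made-up study correlations and sample sizes:

```python
import math

# Fixed-effect meta-analysis of correlations via Fisher's z transform.
# The (r, n) pairs below are invented illustration data, not from any study.
studies = [(0.30, 50), (0.45, 120), (0.25, 80), (0.38, 60)]

def fisher_z(r):
    return 0.5 * math.log((1 + r) / (1 - r))  # atanh(r)

# Inverse-variance weights: Var(z) = 1 / (n - 3) per study.
num = sum((n - 3) * fisher_z(r) for r, n in studies)
den = sum(n - 3 for r, n in studies)
z_pooled = num / den
se = 1 / math.sqrt(den)

r_pooled = math.tanh(z_pooled)  # back-transform to a correlation
ci = (math.tanh(z_pooled - 1.96 * se), math.tanh(z_pooled + 1.96 * se))
print(round(r_pooled, 3), tuple(round(c, 3) for c in ci))
```

A full analysis of the kind the paper describes would typically use a random-effects model with heterogeneity estimates and publication-bias diagnostics; this sketch shows only the pooling arithmetic.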
Promoting clinical and laboratory interaction by harmonization.
Plebani, Mario; Panteghini, Mauro
2014-05-15
The lack of interchangeable results in current practice among clinical laboratories has underpinned greater attention to standardization and harmonization projects. Although the focus has mainly been on the standardization and harmonization of measurement procedures and their results, the scope of harmonization goes beyond methods and analytical results: it includes all other aspects of laboratory testing, including terminology and units, report formats, reference limits and decision thresholds, as well as test profiles and criteria for the interpretation of results. In particular, as evidence collected in recent decades demonstrates that the pre-pre- and post-post-analytical steps are more vulnerable to errors, harmonization initiatives should be performed to improve procedures and processes at the laboratory-clinical interface. Managing upstream demand, downstream interpretation of laboratory results, and subsequent appropriate action through close relationships between laboratorians and clinicians remains a crucial issue of the laboratory testing process. Therefore, initiatives to improve test demand management on the one hand, and to harmonize procedures to improve physicians' acknowledgment of laboratory data and their interpretation on the other, are needed in order to assure quality and safety in the total testing process.
Piezoresistive Cantilever Performance—Part II: Optimization
Park, Sung-Jin; Doll, Joseph C.; Rastegar, Ali J.; Pruitt, Beth L.
2010-01-01
Piezoresistive silicon cantilevers fabricated by ion implantation are frequently used for force, displacement, and chemical sensors due to their low cost and electronic readout. However, the design of piezoresistive cantilevers is not a straightforward problem due to coupling between the design parameters, constraints, process conditions, and performance. We systematically analyzed the effect of design and process parameters on force resolution and then developed an optimization approach to improve force resolution while satisfying various design constraints using simulation results. The combined simulation and optimization approach is, in principle, extensible to doping methods beyond ion implantation. The optimization results were validated by fabricating cantilevers with the optimized conditions and characterizing their performance. The measurement results demonstrate that the analytical model accurately predicts force and displacement resolution, as well as the sensitivity-noise tradeoff in optimal cantilever performance. We also compared our optimization technique with existing models and demonstrated an eightfold improvement in force resolution over simplified models. PMID:20333323
Diwakar, Prasoon K.; Harilal, Sivanandan S.; LaHaye, Nicole L.; Hassanein, Ahmed; Kulkarni, Pramod
2015-01-01
Laser parameters, typically wavelength, pulse width, irradiance, repetition rate, and pulse energy, are critical parameters which influence the laser ablation process and thereby the LA-ICP-MS signal. In recent times, femtosecond laser ablation has gained popularity owing to the reduction in fractionation-related issues and improved analytical performance, which can provide matrix-independent sampling. The advantage offered by fs-LA is due to the pulse duration of the laser being shorter than the phonon relaxation time and heat diffusion time; hence thermal effects are minimized in fs-LA. Recently, fs-LA-ICP-MS has demonstrated improved analytical performance compared to ns-LA-ICP-MS, but the detailed mechanisms and processes are still not clearly understood. The improvement of fs-LA-ICP-MS over ns-LA-ICP-MS elucidates the importance of laser pulse duration and its related effects on the ablation process. In this study, we have investigated the influence of laser pulse width (40 fs to 0.3 ns) and energy on LA-ICP-MS signal intensity and repeatability using a brass sample. Experiments were performed in single-spot ablation mode as well as rastering ablation mode to monitor the Cu/Zn ratio. The recorded ICP-MS signal was correlated with total particle counts generated during laser ablation as well as particle size distribution. Our results show the importance of pulse width effects in the fs regime, which become more pronounced when moving from the femtosecond to the picosecond and nanosecond regimes. PMID:26664120
NASA Astrophysics Data System (ADS)
Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.
2015-03-01
Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV, and DSA), device architectures (FinFET, nanowire, graphene), and patterning scale (a few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and challenge metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models but can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach, which has been applied to three different applications. Machine learning and predictive analytics have been leveraged to accurately predict dimensions of EUV resist patterns down to 18 nm half pitch by leveraging resist shrinkage patterns. These patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As a wafer goes through various processes, its associated cost multiplies, and it may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can be very valuable in enabling timely, actionable decisions such as rework, scrap, or feeding predicted information (or information derived from predictions) forward or back to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.
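The predictive-metrology idea, learning a nonlinear map from upstream process and metrology signals to a downstream quantity so it can be predicted before the readout exists, can be sketched generically. Everything here is synthetic: the four input columns stand in for upstream measurements and the target stands in for an electrical readout such as line resistance; the model choice is an assumption, not the paper's:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic "wafers": upstream signals (e.g. CD, film thickness, dose, overlay)
# drive a downstream readout through linear terms plus one interaction.
rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 4))
y = 5 + 2 * X[:, 0] - 1.5 * X[:, 1] + 0.8 * X[:, 0] * X[:, 2] \
    + rng.normal(scale=0.3, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)  # learns the nonlinear map
r2 = r2_score(y_te, model.predict(X_te))
print(round(r2, 2))
```

A high held-out R² is what makes "predict early, act early" decisions such as rework or scrap defensible; in practice the model would be validated against actual delayed readouts before being trusted.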
Shuttle TPS thermal performance and analysis methodology
NASA Technical Reports Server (NTRS)
Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.
1983-01-01
Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.
NASA Astrophysics Data System (ADS)
Dachyar, M.; Risky, S. A.
2014-06-01
Telecommunications companies have to improve their business performance in the face of customer growth every year. In Indonesia, telecommunication companies have sought to provide the best services, improving their operational systems by designing a framework for operational systems of the Internet of Things (IoT), also known as Machine to Machine (M2M). This study was conducted with expert opinions, which were further processed by the Analytic Hierarchy Process (AHP) to obtain the important factors for the organization's operational systems, and by Interpretive Structural Modeling (ISM) to determine which organizational factors exert the greatest driving power. The study found that handling of SLA & KPI problems carried the greatest weight, and that the current M2M dashboard and current M2M connectivity have the power to affect other factors and play an important role in an effectively run M2M operations system.
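The AHP step mentioned here derives priority weights for criteria from a pairwise comparison matrix of expert judgments, conventionally via the matrix's principal eigenvector. The 3x3 matrix below is made-up illustration data, not the study's actual judgments:

```python
import numpy as np

# Pairwise comparison matrix (Saaty scale): entry [i][j] says how much more
# important criterion i is than criterion j. Values here are illustrative only.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()  # normalized priority weights

# Consistency check: CI = (lambda_max - n) / (n - 1); RI = 0.58 for n = 3
# (Saaty's random index table). CR < 0.1 means the judgments are acceptable.
lam_max = eigvals.real.max()
cr = ((lam_max - 3) / 2) / 0.58
print(np.round(weights, 3), round(cr, 3))
```

In a study like this one, the resulting weights rank factors such as SLA & KPI problem handling; ISM then supplies the separate driving-power structure among the factors.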
Improved coordinates of features in the vicinity of the Viking lander site on Mars
NASA Technical Reports Server (NTRS)
Davies, M. E.; Dole, S. H.
1980-01-01
The measurement of the longitude of the Viking 1 landing site and the accuracy of the coordinates of features in the area around the landing site are discussed. The longitude must be measured photogrammetrically from the small crater, Airy 0, which defines the 0 deg meridian on Mars. The computer program, GIANT, which was used to perform the analytical triangulations, and the photogrammetric computation of the longitude of the Viking 1 lander site are described. Improved coordinates of features in the vicinity of the Viking 1 lander site are presented.
Concept mapping improves academic performance in problem solving questions in biochemistry subject.
Baig, Mukhtiar; Tariq, Saba; Rehman, Rehana; Ali, Sobia; Gazzaz, Zohair J
2016-01-01
To assess the effectiveness of concept mapping (CM) on the academic performance of medical students in problem-solving as well as in declarative knowledge questions, and their perception regarding CM. The present analytical and questionnaire-based study was carried out at Bahria University Medical and Dental College (BUMDC), Karachi, Pakistan. In this analytical study, students were assessed with problem-solving questions (A-type MCQs) and declarative knowledge questions (short essay questions), and 50% of the questions were from the topics learned by CM. Students also filled out a 10-item, 3-point Likert scale questionnaire about their perception regarding the effectiveness of the CM approach, and two open-ended questions were also asked. There was a significant difference in the marks obtained in those problem-solving questions which were learned by CM as compared to those topics taught by traditional lectures (p<0.001), while no significant difference was observed in marks in declarative knowledge questions (p=0.704). Analysis of students' perception regarding CM showed that the majority of the students perceive CM as a helpful technique and enjoy it. In the open-ended questions, the majority of the students commented positively about the effectiveness of CM. Our results indicate that CM improves academic performance in problem-solving but not in declarative knowledge questions. Students' perception of the effectiveness of CM was overwhelmingly positive.
GoFFish: A Sub-Graph Centric Framework for Large-Scale Graph Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simmhan, Yogesh; Kumbhare, Alok; Wickramaarachchi, Charith
2014-08-25
Large scale graph processing is a major research area for Big Data exploration. Vertex centric programming models like Pregel are gaining traction due to their simple abstraction that naturally allows for scalable execution on distributed systems. However, there are limitations to this approach which cause vertex centric algorithms to under-perform due to a poor compute to communication overhead ratio and slow convergence of iterative supersteps. In this paper we introduce GoFFish, a scalable sub-graph centric framework co-designed with a distributed persistent graph storage for large scale graph analytics on commodity clusters. We introduce a sub-graph centric programming abstraction that combines the scalability of a vertex centric approach with the flexibility of shared memory sub-graph computation. We map Connected Components, SSSP, and PageRank algorithms to this model to illustrate its flexibility. Further, we empirically analyze GoFFish using several real world graphs and demonstrate its significant performance improvement, orders of magnitude in some cases, compared to Apache Giraph, the leading open source vertex centric implementation.
Evaluation of analytical performance based on partial order methodology.
Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin
2015-01-01
Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparison of objects according to their data profile (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable for the comparison of analytical methods, or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data in order to evaluate the analytical performance taking into account all indicators simultaneously, thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation, and the skewness, which are used simultaneously for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently (2) a "distance" to the Reference laboratory, and (3) a classification based on the concept of "peculiar points".
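The core of the partial-order idea can be sketched in a few lines: one laboratory dominates another only if it is at least as good on every indicator simultaneously; laboratories that beat each other on different indicators remain incomparable rather than being forced onto a single linear scale. The indicator triples below are made-up illustration values, not real proficiency-test data:

```python
# Each lab's profile: (|bias|, standard deviation, |skewness|), lower is better.
# These values are invented purely to illustrate the ordering.
labs = {
    "Lab1": (0.1, 0.5, 0.2),
    "Lab2": (0.3, 0.4, 0.1),
    "Lab3": (0.2, 0.6, 0.3),
}

def dominates(a, b):
    """a dominates b iff a is at least as good on every indicator (and differs)."""
    return all(x <= y for x, y in zip(a, b)) and a != b

order = [(p, q) for p in labs for q in labs
         if p != q and dominates(labs[p], labs[q])]
print(order)
```

Here Lab1 dominates Lab3 on all three indicators, while Lab1 and Lab2 are incomparable (Lab1 has the smaller bias, Lab2 the smaller standard deviation); the set of such relations is exactly the partial order that a Hasse diagram would display.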
Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis
ERIC Educational Resources Information Center
Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay
2018-01-01
Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…
Lean Stability augmentation study
NASA Technical Reports Server (NTRS)
Mcvey, J. B.; Kennedy, J. B.
1979-01-01
An analytical and experimental program was conducted to investigate techniques and develop technology for improving the lean combustion limits of premixing, prevaporizing combustors applicable to gas turbine engine main burners. Three concepts for improving lean stability limits were selected for experimental evaluation among twelve approaches considered. Concepts were selected on the basis of the potential for improving stability limits and achieving emission goals, the technological risks associated with development of practical burners employing the concepts, and the penalties to airline direct operating costs resulting from decreased combustor performance, increased engine cost, increased maintenance cost and increased engine weight associated with implementation of the concepts. Tests of flameholders embodying the selected concepts were conducted.
Data analytics and optimization of an ice-based energy storage system for commercial buildings
Luo, Na; Hong, Tianzhen; Li, Hui; ...
2017-07-25
Ice-based thermal energy storage (TES) systems can shift peak cooling demand and reduce operational energy costs (with time-of-use rates) in commercial buildings. The accurate prediction of the cooling load, and the optimal control strategy for managing the charging and discharging of a TES system, are two critical elements to improving system performance and achieving energy cost savings. This study utilizes data-driven analytics and modeling to holistically understand the operation of an ice-based TES system in a shopping mall, calculating the system's performance using actual measured data from installed meters and sensors. Results show that there is significant savings potential when the current operating strategy is improved by appropriately scheduling the operation of each piece of equipment of the TES system, as well as by determining the amount of charging and discharging for each day. A novel optimal control strategy, determined by a Sequential Quadratic Programming optimization algorithm, was developed to minimize the TES system's operating costs. Three heuristic strategies were also investigated for comparison with our proposed strategy, and the results demonstrate the superiority of our method to the heuristic strategies in terms of total energy cost savings. Specifically, the optimal strategy yields energy cost savings of up to 11.3% per day and 9.3% per month compared with current operational strategies. A one-day-ahead hourly load prediction was also developed using machine learning algorithms, which facilitates the adoption of the developed data analytics and optimized control strategy in real TES system operation.
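The optimal-control formulation can be sketched with SciPy's SLSQP solver (an SQP method). This is a toy stand-in for the paper's model: the time-of-use prices, hourly cooling load, and tank capacity below are invented, and the cost model ignores chiller efficiency, charging cost, and demand charges:

```python
import numpy as np
from scipy.optimize import minimize

# Toy 24-hour problem: choose hourly TES discharge to minimize electricity
# cost under time-of-use prices. All numbers are illustrative assumptions.
price = np.array([0.08] * 8 + [0.25] * 12 + [0.08] * 4)  # $/kWh
load = np.array([50.0] * 24)                              # kWh cooling per hour
capacity = 300.0                                          # kWh of stored ice

def cost(discharge):
    grid = np.maximum(load - discharge, 0.0)  # chiller covers the remainder
    return float(price @ grid)

res = minimize(
    cost,
    x0=np.zeros(24),
    method="SLSQP",                         # SciPy's sequential quadratic programming
    bounds=[(0.0, 50.0)] * 24,              # cannot discharge more than the load
    constraints=[{"type": "ineq",
                  "fun": lambda d: capacity - d.sum()}],  # finite storage
)
print(round(res.fun, 2))
```

The optimizer concentrates the 300 kWh of discharge in the high-price midday hours, which is the qualitative behavior the paper's optimal strategy exploits; the real study additionally couples this with the one-day-ahead load prediction.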
Durner, Bernhard; Ehmann, Thomas; Matysik, Frank-Michael
2018-06-05
The adaptation of a parallel-path poly(tetrafluoroethylene) (PTFE) ICP nebulizer to an evaporative light scattering detector (ELSD) was realized by substituting the originally installed concentric glass nebulizer of the ELSD. The performance of both nebulizers was compared with regard to nebulizer temperature, evaporator temperature, nebulizing gas flow rate, and mobile phase flow rate for different solvents, using caffeine and poly(dimethylsiloxane) (PDMS) as analytes. Both nebulizers showed similar performance, but the parallel-path PTFE nebulizer performed considerably better at low LC flow rates and its lifetime was substantially longer. In general, for both nebulizers the highest sensitivity was obtained by applying the lowest possible evaporator temperature in combination with the highest possible nebulizer temperature at preferably low gas flow rates. Besides the optimization of detector parameters, response factors for various PDMS oligomers were determined and the dependency of the detector signal on the molar mass of the analytes was studied. The significant improvement in long-term stability made the modified ELSD much more robust and saved time and money by reducing maintenance efforts. Thus, especially in polymer HPLC, with its complex matrix situation, the PTFE-based parallel-path nebulizer exhibits attractive characteristics for analytical studies of polymers.
Post, Andrew R.; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H.
2013-01-01
Objective To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. Materials and Methods We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. Results We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in five years of data from our institution’s clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. Discussion and Conclusion A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. PMID:23402960
Improving Learning Analytics--Combining Observational and Self-Report Data on Student Learning
ERIC Educational Resources Information Center
Ellis, Robert A.; Han, Feifei; Pardo, Abelardo
2017-01-01
The field of education technology is embracing a use of learning analytics to improve student experiences of learning. Along with exponential growth in this area is an increasing concern of the interpretability of the analytics from the student experience and what they can tell us about learning. This study offers a way to address some of the…
Kim, Dalho; Han, Jungho; Choi, Yongwook
2013-01-01
A method using on-line solid-phase microextraction (SPME) on a carbowax-templated fiber followed by liquid chromatography (LC) with ultraviolet (UV) detection was developed for the determination of triclosan in environmental water samples. Along with triclosan, other selected phenolic compounds, bisphenol A, and acidic pharmaceuticals were studied. Previous SPME/LC and stir-bar sorptive extraction/LC-UV methods for polar analytes lacked sensitivity. In this study, the calculated octanol-water distribution coefficient (log D) values of the target analytes at different pH values were used to estimate the polarity of the analytes. The lack of sensitivity observed in earlier studies is identified as a lack of desorption caused by strong polar-polar interactions between analyte and solid phase. Calculated log D values were useful for understanding or predicting the interaction between analyte and solid phase. Under the optimized conditions, the method detection limits of the selected analytes using the on-line SPME-LC-UV method ranged from 5 to 33 ng/L, except for the very polar 3-chlorophenol and 2,4-dichlorophenol, which were obscured in wastewater samples by an interfering substance. This level of detection represents a remarkable improvement over the conventional existing methods. The on-line SPME-LC-UV method, which did not require derivatization of analytes, was applied to the determination of TCS, including phenolic compounds and acidic pharmaceuticals, in tap water, river water, and municipal wastewater samples.
Analytical Chemistry Division annual progress report for period ending December 31, 1988
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Analytical Chemistry Division of Oak Ridge National Laboratory (ORNL) is a large and diversified organization. As such, it serves a multitude of functions for a clientele that exists both in and outside of ORNL. These functions fall into the following general categories: (1) Analytical Research, Development, and Implementation. The division maintains a program to conceptualize, investigate, develop, assess, improve, and implement advanced technology for chemical and physicochemical measurements. Emphasis is on problems and needs identified with ORNL and Department of Energy (DOE) programs; however, attention is also given to advancing the analytical sciences themselves. (2) Programmatic Research, Development, and Utilization. The division carries out a wide variety of chemical work that typically involves analytical research and/or development plus the utilization of analytical capabilities to expedite programmatic interests. (3) Technical Support. The division performs chemical and physicochemical analyses of virtually all types. The Analytical Chemistry Division is organized into four major sections, each of which may carry out any of the three types of work mentioned above. Chapters 1 through 4 of this report highlight progress within the four sections during the period January 1 to December 31, 1988. A brief discussion of the division's role in an especially important environmental program is given in Chapter 5. Information about quality assurance, safety, and training programs is presented in Chapter 6, along with a tabulation of analyses rendered. Publications, oral presentations, professional activities, educational programs, and seminars are cited in Chapters 7 and 8.
Findeisen, P; Zahn, I; Fiedler, G M; Leichtle, A B; Wang, S; Soria, G; Johnson, P; Henzell, J; Hegel, J K; Bendavid, C; Collet, N; McGovern, M; Klopprogge, K
2018-06-04
The new immunochemistry cobas e 801 module (Roche Diagnostics) was developed to meet increasing demands on routine laboratories to further improve testing efficiency while maintaining high quality and reliable data. During a non-interventional multicenter evaluation study, the overall performance, functionality, and reliability of the new module were investigated under routine-like conditions. It was tested as a dedicated immunochemistry system at four sites and as a consolidator combined with clinical chemistry at three sites. We report on testing efficiency and analytical performance of the new module. Evaluation of sample workloads with site-specific routine request patterns demonstrated increased speed and almost doubled throughput (maximum 300 tests per h), revealing that one cobas e 801 module can replace two cobas e 602 modules while saving up to 44% of floor space. Result stability was demonstrated by QC analysis per assay throughout the study. Precision testing over 21 days yielded excellent results within and between labs, and method comparison against cobas e 602 module routine results showed high consistency for all assays under study. In a practicability assessment related to performance and handling, 99% of graded features met (44%) or even exceeded (55%) laboratory expectations, with enhanced reagent management and loading during operation being highlighted. By nearly doubling immunochemistry testing efficiency on the same footprint as a cobas e 602 module, the new module has great potential to further consolidate and enhance laboratory testing while maintaining high-quality analytical performance with Roche platforms.
Britz-McKibbin, Philip; Otsuka, Koji; Terabe, Shigeru
2002-08-01
Simple yet effective methods to enhance concentration sensitivity are needed for capillary electrophoresis (CE) to become a practical method for analyzing trace levels of analytes in real samples. In this report, the development of a novel on-line preconcentration technique combining dynamic pH junction and sweeping modes of focusing is applied to the sensitive and selective analysis of three flavin derivatives: riboflavin, flavin mononucleotide (FMN), and flavin adenine dinucleotide (FAD). Picomolar (pM) detectability of flavins by CE with laser-induced fluorescence (LIF) detection is demonstrated through effective focusing of large sample volumes (up to 22% of the capillary length) using a dual pH junction-sweeping focusing mode. This results in a greater than 1,200-fold improvement in sensitivity relative to conventional injection methods, giving a limit of detection (S/N = 3) of approximately 4.0 pM for FAD and FMN. Flavin focusing is examined in terms of analyte mobility dependence on buffer pH, borate complexation, and SDS interaction. Dynamic pH junction-sweeping extends on-line focusing to both neutral (hydrophobic) and weakly acidic (hydrophilic) species and is considered useful in cases when either conventional sweeping or dynamic pH junction techniques used alone are less effective for certain classes of analytes. Enhanced focusing performance by this hyphenated method was demonstrated by a greater than 4-fold reduction in flavin bandwidth compared to either sweeping or dynamic pH junction alone, reflected by analyte detector bandwidths <0.20 cm. Novel on-line focusing strategies are required to improve sensitivity in CE, and may be applied toward more effective biochemical analysis methods for diverse types of analytes.
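The limit of detection at S/N = 3 scales inversely with the calibration sensitivity, so an on-line preconcentration factor translates directly into a proportionally lower LOD. A small sketch with hypothetical noise and slope values, chosen only to illustrate the arithmetic of a 1,200-fold enhancement (not this study's calibration data):

```python
def lod_sn3(noise_sd, slope):
    """Limit of detection at S/N = 3: the concentration whose signal equals
    three times the baseline noise, LOD = 3 * sigma / slope."""
    return 3.0 * noise_sd / slope

# Hypothetical calibration: noise sigma in signal units, slope in
# signal units per mol/L.  A 1200-fold focusing gain multiplies the
# effective slope, dividing the LOD by the same factor.
sigma, slope = 0.6, 3.75e8
print(lod_sn3(sigma, slope))          # conventional injection (mol/L)
print(lod_sn3(sigma, slope * 1200))   # with on-line focusing (mol/L)
```

With these invented numbers the conventional LOD of about 4.8 nM drops to about 4 pM, the same order of improvement the abstract reports.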
Cates, Joshua W.; Vinke, Ruud; Levin, Craig S.
2015-01-01
Excellent timing resolution is required to enhance the signal-to-noise ratio (SNR) gain available from the incorporation of time-of-flight (ToF) information in image reconstruction for positron emission tomography (PET). As the detector’s timing resolution improves, so does SNR, reconstructed image quality, and accuracy. This directly impacts the challenging detection and quantification tasks in the clinic. The recognition of these benefits has spurred efforts within the molecular imaging community to determine to what extent the timing resolution of scintillation detectors can be improved and develop near-term solutions for advancing ToF-PET. Presented in this work, is a method for calculating the Cramér-Rao lower bound (CRLB) on timing resolution for scintillation detectors with long crystal elements, where the influence of the variation in optical path length of scintillation light on achievable timing resolution is non-negligible. The presented formalism incorporates an accurate, analytical probability density function (PDF) of optical transit time within the crystal to obtain a purely mathematical expression of the CRLB with high-aspect-ratio (HAR) scintillation detectors. This approach enables the statistical limit on timing resolution performance to be analytically expressed for clinically-relevant PET scintillation detectors without requiring Monte Carlo simulation-generated photon transport time distributions. The analytically calculated optical transport PDF was compared with detailed light transport simulations, and excellent agreement was found between the two. The coincidence timing resolution (CTR) between two 3×3×20 mm3 LYSO:Ce crystals coupled to analogue SiPMs was experimentally measured to be 162±1 ps FWHM, approaching the analytically calculated lower bound within 6.5%. PMID:26083559
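The statistical limit described above can be illustrated with a much-simplified model. Assuming each detected photon's registration time is an independent Gaussian spread (a crude stand-in for the paper's analytical transit-time PDF), the Fisher information per photon is 1/σ², giving the familiar σ/√N bound; two detectors in coincidence add their variances:

```python
import math

def crlb_fwhm_ctr(sigma_tps, n_photons):
    """Cramer-Rao lower bound on coincidence timing resolution (FWHM).

    Simplified sketch: each detected photon's timestamp is modeled as an
    independent Gaussian with standard deviation sigma_tps (lumping
    scintillator kinetics, optical transit spread, and photodetector
    jitter), so the per-detector CRLB variance is sigma^2 / N.
    """
    var_single = sigma_tps ** 2 / n_photons      # CRLB, one detector
    fwhm = 2.355 * math.sqrt(2.0 * var_single)   # coincidence of two, FWHM
    return fwhm

# Illustrative numbers, not the paper's model: 200 ps single-photon
# spread and 2000 detected photons per event.
print(f"{crlb_fwhm_ctr(200e-12, 2000) * 1e12:.1f} ps FWHM")
```

The Gaussian assumption greatly understates a real detector's bound; the paper's contribution is precisely an accurate, non-Gaussian optical transit-time PDF for long crystals.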
ERIC Educational Resources Information Center
Colthorpe, Kay; Zimbardi, Kirsten; Ainscough, Louise; Anderson, Stephen
2015-01-01
It is well established that a student's capacity to regulate his or her own learning is a key determinant of academic success, suggesting that interventions targeting improvements in self-regulation will have a positive impact on academic performance. However, to evaluate the success of such interventions, the self-regulatory characteristics of…
Belcour, Laurent; Pacanowski, Romain; Delahaie, Marion; Laville-Geay, Aude; Eupherte, Laure
2014-12-01
We compare the performance of various analytical retroreflecting bidirectional reflectance distribution function (BRDF) models to assess how accurately they reproduce measured data of retroreflecting materials. We introduce a new parametrization, the back vector parametrization, to analyze retroreflecting data, and we show that this parametrization better preserves the isotropy of the data. Furthermore, we update existing BRDF models to improve the representation of retroreflective data.
Evaluating Trends in Historical PM2.5 Element Concentrations by Reanalyzing a 15-Year Sample Archive
NASA Astrophysics Data System (ADS)
Hyslop, N. P.; White, W. H.; Trzepla, K.
2014-12-01
The IMPROVE (Interagency Monitoring of PROtected Visual Environments) network monitors aerosol concentrations at 170 remote sites throughout the United States. Twenty-four-hour filter samples of particulate matter are collected every third day and analyzed for chemical composition. About 30 of the sites have operated continuously since 1988, and the sustained data record (http://views.cira.colostate.edu/web/) offers a unique window on regional aerosol trends. All elemental analyses have been performed by Crocker Nuclear Laboratory at the University of California in Davis, and sample filters collected since 1995 are archived on campus. The suite of reported elements has remained constant, but the analytical methods employed for their determination have evolved. For example, the elements Na - Mn were determined by PIXE until November 2001, then by XRF analysis in a He-flushed atmosphere through 2004, and by XRF analysis in vacuum since January 2005. In addition to these fundamental changes, incompletely-documented operational factors such as detector performance and calibration details have introduced variations in the measurements. Because the past analytical methods were non-destructive, the archived filters can be re-analyzed with the current analytical systems and protocols. The 15-year sample archives from Great Smoky Mountains (GRSM), Mount Rainier (MORA), and Point Reyes National Parks (PORE) were selected for reanalysis. The agreement between the new analyses and original determinations varies with element and analytical era. The graph below compares the trend estimates for all the elements measured by IMPROVE based on the original and repeat analyses; the elements identified in color are measured above the detection limit more than 90% of the time. The trend estimates are sensitive to the treatment of non-detect data. The original and reanalysis trends are indistinguishable (have overlapping confidence intervals) for most of the well-detected elements.
Reanalysis of a 15-year Archive of IMPROVE Samples
NASA Astrophysics Data System (ADS)
Hyslop, N. P.; White, W. H.; Trzepla, K.
2013-12-01
The IMPROVE (Interagency Monitoring of PROtected Visual Environments) network monitors aerosol concentrations at 170 remote sites throughout the United States. Twenty-four-hour filter samples of particulate matter are collected every third day and analyzed for chemical composition. About 30 of the sites have operated continuously since 1988, and the sustained data record (http://views.cira.colostate.edu/web/) offers a unique window on regional aerosol trends. All elemental analyses have been performed by Crocker Nuclear Laboratory at the University of California in Davis, and sample filters collected since 1995 are archived on campus. The suite of reported elements has remained constant, but the analytical methods employed for their determination have evolved. For example, the elements Na - Mn were determined by PIXE until November 2001, then by XRF analysis in a He-flushed atmosphere through 2004, and by XRF analysis in vacuum since January 2005. In addition to these fundamental changes, incompletely-documented operational factors such as detector performance and calibration details have introduced variations in the measurements. Because the past analytical methods were non-destructive, the archived filters can be re-analyzed with the current analytical systems and protocols. The 15-year sample archives from Great Smoky Mountains, Mount Rainier, and Point Reyes National Parks were selected for reanalysis. The agreement between the new analyses and original determinations varies with element and analytical era (Figure 1). Temporal trends for some elements are affected by these changes in measurement technique while others are not (Figure 2). Figure 1. Repeatability of analyses for sulfur and vanadium at Great Smoky Mountains National Park. Each point shows the ratio of mass loadings determined by the original analysis and recent reanalysis. Major method distinctions are indicated at the top. Figure 2. 
Trends, based on Theil-Sen regression, in lead concentrations from the original and reanalysis data.
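The Theil-Sen estimator used for these trends is simply the median of the slopes over all point pairs, which makes it robust to occasional outlier years. A minimal sketch with toy data, not IMPROVE values:

```python
import statistics
from itertools import combinations

def theil_sen_slope(x, y):
    """Theil-Sen estimator: the median of slopes over all point pairs.

    Robust to outliers, which is why it is a common choice for long-term
    concentration trend estimates in monitoring data.
    """
    slopes = [(y2 - y1) / (x2 - x1)
              for (x1, y1), (x2, y2) in combinations(zip(x, y), 2)
              if x2 != x1]
    return statistics.median(slopes)

# Toy annual concentrations (arbitrary units) with one outlier year.
years = [2000, 2001, 2002, 2003, 2004, 2005]
conc  = [5.0, 4.6, 4.1, 9.0, 3.4, 3.0]   # 2003 is an outlier
print(theil_sen_slope(years, conc))
```

Despite the spike in 2003, the estimated slope stays at the underlying −0.4 per year, where an ordinary least-squares fit would be pulled noticeably off.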
Jones, Graham R D; Albarede, Stephanie; Kesseler, Dagmar; MacKenzie, Finlay; Mammen, Joy; Pedersen, Morten; Stavelin, Anne; Thelen, Marc; Thomas, Annette; Twomey, Patrick J; Ventura, Emma; Panteghini, Mauro
2017-06-27
External Quality Assurance (EQA) is vital to ensure acceptable analytical quality in medical laboratories. A key component of an EQA scheme is an analytical performance specification (APS) for each measurand that a laboratory can use to assess the extent of deviation of the obtained results from the target value. A consensus conference held in Milan in 2014 proposed three models to set APS, and these can be applied to setting APS for EQA. A goal arising from this conference is the harmonisation of EQA APS between different schemes to deliver consistent quality messages to laboratories irrespective of location and the choice of EQA provider. At this time there are wide differences in the APS used in different EQA schemes for the same measurands. Contributing factors to this variation are that the APS in different schemes are established using different criteria, applied to different types of data (e.g. single data points, multiple data points), used for different goals (e.g. improvement of analytical quality; licensing), and with the aim of eliciting different responses from participants. This paper provides recommendations and clear terminology for EQA APS from the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Task and Finish Group on Performance Specifications for External Quality Assurance Schemes (TFG-APSEQA). The recommended terminology covers six elements required to understand APS: 1) a statement on the EQA material matrix and its commutability; 2) the method used to assign the target value; 3) the data set to which APS are applied; 4) the applicable analytical property being assessed (i.e. total error, bias, imprecision, uncertainty); 5) the rationale for the selection of the APS; and 6) the type of the Milan model(s) used to set the APS. The terminology is required for EQA participants and other interested parties to understand the meaning of meeting or not meeting APS.
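The analytical properties listed in element 4 (total error, bias, imprecision) also combine into common benchmark indices for laboratory performance assessment. A brief sketch of the usual sigma-metric and quality goal index formulas, with illustrative numbers rather than any scheme's actual specifications:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric: (allowable total error - |bias|) / imprecision,
    all expressed as percentages."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI = bias / (1.5 * CV).  Common rule of thumb: QGI < 0.8 points
    to imprecision, QGI > 1.2 to inaccuracy, 0.8-1.2 to both."""
    return bias_pct / (1.5 * cv_pct)

# Illustrative analyte: TEa 10%, bias 2%, CV 1.2%.
print(sigma_metric(10, 2, 1.2))        # about 6.7 sigma
print(quality_goal_index(2, 1.2))      # about 1.1, between the cut-offs
```

Such indices give an EQA participant a single number to compare against an APS, which is exactly where harmonised specifications across schemes matter.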
Improving adsorption cryocoolers by multi-stage compression and reducing void volume
NASA Technical Reports Server (NTRS)
Bard, S.
1986-01-01
It is shown that the performance of gas adsorption cryocoolers is greatly improved by using adsorbents with low void volume within and between individual adsorbent particles, by reducing void volumes in plumbing lines, and by compressing the working fluid in more than one stage. Refrigerator specific power requirements and compressor volumetric efficiencies are obtained in terms of adsorbent and plumbing-line void volumes and operating pressures for various charcoal adsorbents using an analytical model. Performance optimization curves for 117.5 and 80 K charcoal/nitrogen adsorption cryocoolers are given for both single-stage and multistage compressor systems, and compressing the nitrogen in two stages is shown to lower the specific power requirements by 18 percent for the 117.5 K system.
RSNA Diagnosis Live: A Novel Web-based Audience Response Tool to Promote Evidence-based Learning.
Awan, Omer A; Shaikh, Faiq; Kalbfleisch, Brian; Siegel, Eliot L; Chang, Paul
2017-01-01
Audience response systems have become more commonplace in radiology residency programs in the last 10 years, as a means to engage learners and promote improved learning and retention. A variety of systems are currently in use. RSNA Diagnosis Live™ provides unique features that are innovative, particularly for radiology resident education. One specific example is the ability to annotate questions with subspecialty tags, which allows resident performance to be tracked over time. In addition, deficiencies in learning can be monitored for each trainee and analytics can be provided, allowing documentation of resident performance improvement. Finally, automated feedback is given not only to the instructor, but also to the trainee. Online supplemental material is available for this article.
NASA Astrophysics Data System (ADS)
Siami, Mohammad; Gholamian, Mohammad Reza; Basiri, Javad
2014-10-01
Nowadays, credit scoring is one of the most important topics in the banking sector. Credit scoring models have been widely used to facilitate the process of credit assessment. In this paper, the locally linear model tree algorithm (LOLIMOT) was applied to evaluate its performance in predicting customers' credit status. The algorithm was adapted to the credit scoring domain by means of data fusion and feature selection techniques. Two real-world credit data sets, Australian and German, from the UCI machine learning database were selected to demonstrate the performance of our new classifier. The analytical results indicate that the improved LOLIMOT significantly increases prediction accuracy.
Computer-aided design analysis of 57-mm, angular-contact, cryogenic turbopump bearings
NASA Technical Reports Server (NTRS)
Armstrong, Elizabeth S.; Coe, Harold H.
1988-01-01
The Space Shuttle main engine high-pressure oxygen turbopumps have not experienced the service life required of them. This insufficiency has been due in part to the shortened life of the bearings. To improve the life of the existing turbopump bearings, an effort is under way to investigate bearing modifications that could be retrofitted into the present bearing cavity. Several bearing parameters were optimized using the computer program SHABERTH, which performs a thermomechanical simulation of a load support system. The computer analysis showed that improved bearing performance is feasible if low friction coefficients can be attained. Bearing geometries were optimized considering heat generation, equilibrium temperatures, and relative life. Thermal gradients through the bearings were found to be lower with liquid lubrication than with solid film lubrication, and a liquid oxygen coolant flowrate of approximately 4.0 kg/s was found to be optimal. This paper describes the analytical modeling used to determine these feasible modifications to improve bearing performance.
Promoting principals' managerial involvement in instructional improvement
Gillat, Alex; Sulzer-Azaroff, Beth
1994-01-01
Studies of school leadership suggest that visiting classrooms, emphasizing achievement and training, and supporting teachers are important indicators of the effectiveness of school principals. The utility of a behavior-analytic program to support the enhancement of these behaviors in 2 school principals and the impact of their involvement upon teachers' and students' performances in three classes were examined in two experiments, one at an elementary school and another at a secondary school. Treatment conditions consisted of helping the principal or teacher to schedule his or her time and to use goal setting, feedback, and praise. A withdrawal design (Experiment 1) and a multiple baseline across classrooms (Experiment 2) showed that the principal's and teacher's rates of praise, feedback, and goal setting increased during the intervention, and were associated with improvements in the academic performance of the students. In the future, school psychologists might analyze the impact of involving themselves in supporting the principal's involvement in improving students' and teachers' performances or in playing a similar leadership role themselves. PMID:16795819
NASA Astrophysics Data System (ADS)
Tsifouti, A.; Triantaphillidou, S.; Larabi, M. C.; Doré, G.; Bilissi, E.; Psarrou, A.
2015-01-01
In this investigation we study the effects of compression and frame rate reduction on the performance of four video analytics (VA) systems utilizing a low complexity scenario, such as the Sterile Zone (SZ). Additionally, we identify the most influential scene parameters affecting the performance of these systems. The SZ scenario is a scene consisting of a fence, not to be trespassed, and an area with grass. The VA system needs to alarm when there is an intruder (attack) entering the scene. The work includes testing of the systems with uncompressed and compressed (using H.264/MPEG-4 AVC at 25 and 5 frames per second) footage, consisting of quantified scene parameters. The scene parameters include descriptions of scene contrast, camera to subject distance, and attack portrayal. Additional footage, including only distractions (no attacks) is also investigated. Results have shown that every system has performed differently for each compression/frame rate level, whilst overall, compression has not adversely affected the performance of the systems. Frame rate reduction has decreased performance and scene parameters have influenced the behavior of the systems differently. Most false alarms were triggered with a distraction clip, including abrupt shadows through the fence. Findings could contribute to the improvement of VA systems.
Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek
2016-01-15
In this study we rank analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios (metrological, economic, and environmental) by applying different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable, although the details of the rankings differ among the scenarios. A second run of rankings was done for scenarios that include only metrological, economic, or environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This implies that green analytical chemistry can be brought into laboratories without analytical performance costs, and that it is even supported by economic reasons.
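PROMETHEE ranks alternatives by pairwise outranking flows: each pair is compared criterion by criterion, preferences are weighted and aggregated, and the net flow orders the alternatives. A minimal PROMETHEE II sketch using the simplest ("usual") preference function and invented scores and weights, purely to show the mechanics, not this study's data:

```python
import itertools

def promethee_ii(scores, weights, maximize):
    """Minimal PROMETHEE II with the 'usual' (step) preference function.

    scores[a][c]: performance of alternative a on criterion c;
    weights: criterion weights summing to 1; maximize[c]: True if larger
    is better on criterion c.  Returns net outranking flows
    (higher flow = more preferred).
    """
    n = len(scores)
    flows = [0.0] * n
    for a, b in itertools.permutations(range(n), 2):
        pref = 0.0
        for c, w in enumerate(weights):
            diff = scores[a][c] - scores[b][c]
            if not maximize[c]:
                diff = -diff
            if diff > 0:   # usual criterion: any advantage counts fully
                pref += w
        flows[a] += pref / (n - 1)   # a's positive-flow contribution
        flows[b] -= pref / (n - 1)   # b's negative-flow contribution
    return flows

# Toy ranking of three hypothetical procedures on recovery (maximize),
# cost (minimize), and solvent use (minimize); the weights mimic an
# 'environmental' scenario that emphasizes solvent use.
scores = [[95, 12, 30], [90, 8, 5], [99, 20, 50]]
flows = promethee_ii(scores, [0.2, 0.3, 0.5], [True, False, False])
print(flows)
```

With these weights the cheap, low-solvent procedure wins despite its lower recovery; shifting weight onto the recovery criterion would reorder the ranking, which is exactly the scenario sensitivity the study explores.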
Mechanics of additively manufactured porous biomaterials based on the rhombicuboctahedron unit cell.
Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A
2016-01-01
Thanks to recent developments in additive manufacturing techniques, it is now possible to fabricate porous biomaterials with arbitrarily complex micro-architectures. Micro-architectures of such biomaterials determine their physical and biological properties, meaning that one could potentially improve the performance of such biomaterials through rational design of micro-architecture. The relationship between the micro-architecture of porous biomaterials and their physical and biological properties has therefore received increasing attention recently. In this paper, we studied the mechanical properties of porous biomaterials made from a relatively unexplored unit cell, namely rhombicuboctahedron. We derived analytical relationships that relate the micro-architecture of such porous biomaterials, i.e. the dimensions of the rhombicuboctahedron unit cell, to their elastic modulus, Poisson's ratio, and yield stress. Finite element models were also developed to validate the analytical solutions. Analytical and numerical results were compared with experimental data from one of our recent studies. It was found that analytical solutions and numerical results show a very good agreement particularly for smaller values of apparent density. The elastic moduli predicted by analytical and numerical models were in very good agreement with experimental observations too. While in excellent agreement with each other, analytical and numerical models somewhat over-predicted the yield stress of the porous structures as compared to experimental data. As the ratio of the vertical struts to the inclined struts, α, approaches zero and infinity, the rhombicuboctahedron unit cell respectively approaches the octahedron (or truncated cube) and cube unit cells. 
For those limits, the analytical solutions presented here were found to approach the analytic solutions obtained for the octahedron, truncated cube, and cube unit cells, meaning that the presented solutions are generalizations of the analytical solutions obtained for several other types of porous biomaterials.
Performance of Dental Ceramics
Rekow, E.D.; Silva, N.R.F.A.; Coelho, P.G.; Zhang, Y.; Guess, P.; Thompson, V.P.
2011-01-01
The clinical success of modern dental ceramics depends on an array of factors, ranging from initial physical properties of the material itself, to the fabrication and clinical procedures that inevitably damage these brittle materials, and the oral environment. Understanding the influence of these factors on clinical performance has engaged the dental, ceramics, and engineering communities alike. The objective of this review is to first summarize clinical, experimental, and analytic results reported in the recent literature. Additionally, it seeks to address how this new information adds insight into predictive test procedures and reveals challenges for future improvements. PMID:21224408
Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil
2014-01-01
Background: Military hospitals are responsible for preserving, restoring, and improving the health not only of armed forces personnel but also of other people. According to the strategy of military organizations, which is to be a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, to compare these hospitals with each other, and to rank them using the analytic hierarchy process (AHP) technique in 2013. Materials and Methods: This was a cross-sectional, descriptive study conducted in 2013 on five military hospitals selected using the purposive sampling method. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritization, and Expert Choice 11.0 was used to analyze the collected data. Results: Among the JCI standards, access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121), and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125), and SAB (weight = 0.066) ranked first to fifth, respectively. Conclusion: AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospital performance by identifying areas in need of focused quality improvement and by selecting strategies to improve service quality. PMID:25250364
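In AHP, criterion weights such as those above are derived from a pairwise comparison matrix, classically via its principal eigenvector; the row geometric-mean method below is a standard approximation, and the consistency ratio checks whether the judgments are acceptably coherent (CR < 0.10 is the usual threshold). The comparison matrix here is hypothetical, not the study's:

```python
import math

def ahp_weights(m):
    """Approximate AHP priority vector via the row geometric-mean method,
    a standard stand-in for the principal eigenvector."""
    n = len(m)
    gms = [math.prod(row) ** (1.0 / n) for row in m]
    total = sum(gms)
    return [g / total for g in gms]

def consistency_ratio(m, weights):
    """CR = CI / RI with CI = (lambda_max - n) / (n - 1) and RI from
    Saaty's random-index table (here only up to n = 6)."""
    n = len(m)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]
    # Estimate lambda_max by comparing A*w with w, row by row.
    lam = sum(sum(m[i][j] * weights[j] for j in range(n)) / weights[i]
              for i in range(n)) / n
    return (lam - n) / ((n - 1) * ri) if ri else 0.0

# Hypothetical 3x3 pairwise comparison of three standard groups.
m = [[1.0, 2.0, 3.0],
     [0.5, 1.0, 2.0],
     [1 / 3, 0.5, 1.0]]
w = ahp_weights(m)
print([round(x, 3) for x in w], round(consistency_ratio(m, w), 3))
```

The weights sum to one and the small CR confirms the toy judgments are consistent; with real checklist data the same machinery produces the criterion weights and hospital rankings reported above.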
WE-G-18A-06: Sinogram Restoration in Helical Cone-Beam CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Little, K; Riviere, P La
2014-06-15
Purpose: To extend CT sinogram restoration, which has been shown in 2D to reduce noise and to correct for geometric effects and other degradations at a low computational cost, from 2D to a 3D helical cone-beam geometry. Methods: A method for calculating sinogram degradation coefficients for a helical cone-beam geometry was proposed. These values were used to perform penalized-likelihood sinogram restoration on simulated data that were generated from the FORBILD thorax phantom. Sinogram restorations were performed using both a quadratic penalty and the edge-preserving Huber penalty. After sinogram restoration, Fourier-based analytical methods were used to obtain reconstructions. Resolution-variance trade-offs were investigated for several locations within the reconstructions for the purpose of comparing sinogram restoration to no restoration. In order to compare potential differences, reconstructions were performed using different groups of neighbors in the penalty, two analytical reconstruction methods (Katsevich and single-slice rebinning), and differing helical pitches. Results: The resolution-variance properties of reconstructions restored using sinogram restoration with a Huber penalty outperformed those of reconstructions with no restoration. However, the use of a quadratic sinogram restoration penalty did not lead to an improvement over performing no restoration at the outer regions of the phantom. Application of the Huber penalty to neighbors both within a view and across views did not perform as well as only applying the penalty to neighbors within a view. General improvements in resolution-variance properties using sinogram restoration with the Huber penalty were not dependent on the reconstruction method used or the magnitude of the helical pitch. Conclusion: Sinogram restoration for noise and degradation effects for helical cone-beam CT is feasible and should be able to be applied to clinical data.
When applied with the edge-preserving Huber penalty, sinogram restoration leads to an improvement in resolution-variance tradeoffs.« less
SRB Environment Evaluation and Analysis. Volume 2: RSRB Joint Filling Test/Analysis Improvements
NASA Technical Reports Server (NTRS)
Knox, E. C.; Woods, G. Hamilton
1991-01-01
Following the Challenger accident a very comprehensive solid rocket booster (SRB) redesign program was initiated. One objective of the program was to develop expertise at NASA/MSFC in the techniques for analyzing the flow of hot gases in the SRB joints. Several test programs were undertaken to provide a data base of joint performance with manufactured defects in the joints to allow hot gases to fill the joints. This data base was used also to develop the analytical techniques. Some of the test programs were Joint Environment Simulator (JES), Nozzle Joint Environment Simulator (NJES), Transient Pressure Test Article (TPTA), and Seventy-Pound Charge (SPC). In 1988 the TPTA test hardware was moved from the Utah site to MSFC and several RSRM tests were scheduled, to be followed by tests for the ASRM program. REMTECH Inc. supported these activities with pretest estimates of the flow conditions in the test joints, and post-test analysis and evaluation of the measurements. During this support REMTECH identified deficiencies in the gas-measurement instrumentation that existed in the TPTA hardware, made recommendations for its replacement, and identified improvements to the analytical tools used in the test support. Only one test was completed under the TPTA RSRM test program, and those scheduled for the ASRM were rescheduled to a time after the expiration of this contract. The attention of this effort was directed toward improvements in the analytical techniques in preparation for when the ASRM program begins.
Electrodialytic in-line preconcentration for ionic solute analysis.
Ohira, Shin-Ichi; Yamasaki, Takayuki; Koda, Takumi; Kodama, Yuko; Toda, Kei
2018-04-01
Preconcentration is an effective way to improve analytical sensitivity. Many types of methods are used for enrichment of ionic solute analytes. However, current methods are batchwise and include procedures such as trapping and elution. In this manuscript, we propose in-line electrodialytic enrichment of ionic solutes. The method can enrich ionic solutes within seconds by quantitative transfer of analytes from the sample solution to the acceptor solution under an electric field. Because of quantitative ion transfer, the enrichment factor (the ratio of the analyte concentration in the obtained acceptor solution to that in the sample) depends only on the flow rate ratio of the sample solution to the acceptor solution. The ratios of the concentrations and flow rates are equal for ratios up to 70, 20, and 70 for the tested inorganic cations, inorganic anions, and heavy metal ions, respectively. The sensitivity of ionic solute determinations is also improved in proportion to the enrichment factor. The method can also simultaneously achieve matrix isolation and enrichment. The method was successfully applied to determine the concentrations of trace amounts of chloroacetic acids in tap water. The regulated concentration levels cannot be determined by conventional high-performance liquid chromatography with ultraviolet detection (HPLC-UV) without enrichment. However, enrichment with the present method is effective for determination of tap water quality by improving the limits of detection of HPLC-UV. The standard addition test with real tap water samples shows good recoveries (94.9-109.6%). Copyright © 2017 Elsevier B.V. All rights reserved.
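Because ion transfer is quantitative, the enrichment factor reduces to a simple flow-rate ratio. A minimal sketch of that relationship (an illustration with hypothetical flow-rate values, not the authors' code):

```python
def enrichment_factor(sample_flow_ul_min, acceptor_flow_ul_min):
    """Enrichment factor under quantitative ion transfer: equal to the
    sample-to-acceptor flow-rate ratio, per the abstract's definition."""
    if acceptor_flow_ul_min <= 0:
        raise ValueError("acceptor flow rate must be positive")
    return sample_flow_ul_min / acceptor_flow_ul_min

# A 700 uL/min sample stream transferred into a 10 uL/min acceptor stream
# gives a 70-fold enrichment, the maximum reported for inorganic cations.
print(enrichment_factor(700, 10))  # → 70.0
```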
Li, Tingting; Wang, Wei; Zhao, Haijian; He, Falin; Zhong, Kun; Yuan, Shuai; Wang, Zhiguo
2017-09-07
This study aimed to investigate the status of internal quality control (IQC) for cardiac biomarkers from 2011 to 2016, to provide an overall picture of the precision of these measurements in China and to set appropriate precision specifications. Internal quality control data for cardiac biomarkers, including creatine kinase MB (CK-MB) (μg/L), CK-MB (U/L), myoglobin (Mb), cardiac troponin I (cTnI), cardiac troponin T (cTnT), and homocysteine (HCY), were collected by a web-based external quality assessment (EQA) system. Percentages of laboratories meeting five precision quality specifications for current coefficients of variation (CVs) were calculated. Then, appropriate precision specifications were chosen for these six analytes. Finally, the CVs and IQC practice were further analyzed with different grouping methods. The current CVs remained nearly constant over the 6 years. cTnT had the highest pass rates every year against all five specifications, whereas HCY had the lowest. Overall, most analytes had satisfactory performance (pass rates >80%), except for HCY, if one-third TEa or the minimum specification was employed. When the optimal specification was applied, the performance of most analytes was unsatisfactory (pass rates <60%), except for cTnT. The appropriate precision specifications of CK-MB (μg/L), CK-MB (U/L), Mb, cTnI, cTnT, and HCY were set as current CVs of less than 9.20%, 9.90%, 7.50%, 10.54%, 7.63%, and 6.67%, respectively. The data on IQC practices indicated wide variation and substantial progress. The precision performance of cTnT was already satisfactory, while that of the other five analytes, especially HCY, was still unsatisfactory; thus, ongoing investigation and continuous improvement of IQC are still needed. © 2017 Wiley Periodicals, Inc.
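The pass rates reported above follow directly from comparing each laboratory's current CV against a chosen precision specification. A minimal sketch of that comparison (our own illustration with hypothetical CV values, not the study's code):

```python
def pass_rate(current_cvs, spec_cv):
    """Percentage of laboratories whose current CV meets a precision
    specification (a CV at or below the specified limit passes)."""
    passing = sum(1 for cv in current_cvs if cv <= spec_cv)
    return 100.0 * passing / len(current_cvs)

# hypothetical CVs (%) for one analyte, judged against a 7.50% specification
print(pass_rate([4.1, 6.8, 7.5, 12.0], 7.5))  # → 75.0
```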
Experimental and analytical studies of advanced air cushion landing systems
NASA Technical Reports Server (NTRS)
Lee, E. G. S.; Boghani, A. B.; Captain, K. M.; Rutishauser, H. J.; Farley, H. L.; Fish, R. B.; Jeffcoat, R. L.
1981-01-01
Several concepts are developed for air cushion landing systems (ACLS) which have the potential for improving performance characteristics (roll stiffness, heave damping, and trunk flutter) and reducing fabrication cost and complexity. After an initial screening, the following five concepts were evaluated in detail: damped trunk, filled trunk, compartmented trunk, segmented trunk, and roll feedback control. The evaluation was based on tests performed on scale models. An ACLS dynamic simulation developed earlier is updated so that it can be used to predict the performance of full-scale ACLS incorporating these refinements. The simulation was validated through scale-model tests. A full-scale ACLS based on the segmented trunk concept was fabricated and installed on the NASA ACLS test vehicle, where it is used to support advanced system development. A geometrically scaled model (one-third full scale) of the NASA test vehicle was fabricated and tested. This model, evaluated by means of a series of static and dynamic tests, is used to investigate scaling relationships between reduced- and full-scale models. The analytical model developed earlier is applied to simulate both the one-third-scale and the full-scale response.
Study on additional carrier sensing for IEEE 802.15.4 wireless sensor networks.
Lee, Bih-Hwang; Lai, Ruei-Lung; Wu, Huai-Kuei; Wong, Chi-Ming
2010-01-01
Wireless sensor networks based on the IEEE 802.15.4 standard achieve low-power transmission in low-rate, short-distance wireless personal area networks (WPANs). Slotted carrier sense multiple access with collision avoidance (CSMA/CA) is used as the contention mechanism. Sensor nodes perform a backoff process as soon as the clear channel assessment (CCA) detects a busy channel. In doing so, they may neglect the implicit information carried by the failed CCA detection and thus cause redundant sensing. This blind backoff process in slotted CSMA/CA lowers channel utilization. This paper proposes an additional carrier sensing (ACS) algorithm based on IEEE 802.15.4 to enhance the carrier sensing mechanism of the original slotted CSMA/CA. An analytical Markov chain model is developed to evaluate the performance of the ACS algorithm. Both analytical and simulation results show that the proposed algorithm outperforms IEEE 802.15.4, significantly improving throughput, average medium access control (MAC) delay, and the power consumed by CCA detection.
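For context, the blind backoff that the ACS algorithm targets can be sketched as the standard slotted CSMA/CA loop, in which a busy CCA simply widens the contention window and discards the information from the failed sensing. This is a simplified illustration (a single CCA check per attempt, default 802.15.4 parameters), not the authors' model:

```python
import random

def slotted_csma_attempt(channel_busy, macMinBE=3, macMaxBE=5,
                         macMaxCSMABackoffs=4, rng=random):
    """Simplified slotted CSMA/CA channel-access attempt.
    channel_busy: callable standing in for the CCA result (True = busy).
    Returns the number of backoffs used, or None on channel-access failure."""
    be = macMinBE
    for backoffs in range(macMaxCSMABackoffs + 1):
        rng.randint(0, 2 ** be - 1)   # blind random backoff delay (in slots)
        if not channel_busy():        # CCA finds the channel idle
            return backoffs           # -> proceed to transmit
        be = min(be + 1, macMaxBE)    # busy: only widen the contention window
    return None                       # too many failed CCAs
```

An always-idle channel succeeds with zero extra backoffs, while a persistently busy channel exhausts all attempts; nothing else is learned from the failed CCAs, which is the inefficiency the ACS scheme addresses.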
High precision analytical description of the allowed β spectrum shape
NASA Astrophysics Data System (ADS)
Hayen, Leendert; Severijns, Nathal; Bodek, Kazimierz; Rozpedzik, Dagmara; Mougeot, Xavier
2018-01-01
A fully analytical description of the allowed β spectrum shape is given in view of ongoing and planned measurements. Its study forms an invaluable tool in the search for physics beyond the standard electroweak model and in the determination of the weak magnetism recoil term. Contributions stemming from finite-size corrections, mass effects, and radiative corrections are reviewed. Particular focus is placed on atomic and chemical effects, for which the existing description is extended and provided analytically. The effects of QCD-induced recoil terms are discussed, and cross-checks were performed for different theoretical formalisms. Special attention was given to a comparison of the treatment of nuclear structure effects in different formalisms. Corrections were derived for both Fermi and Gamow-Teller transitions, and methods of analytical evaluation are thoroughly discussed. In its integrated form, calculated f values were in agreement with the most precise numerical results within the aimed-for precision. The need for an accurate evaluation of weak magnetism contributions is stressed, and the possible significance of the oft-neglected induced pseudoscalar interaction is noted. Together with improved atomic corrections, an analytical description is presented of the allowed β spectrum shape accurate to a few parts in 10^-4 down to 1 keV for low to medium Z nuclei, thereby extending the work of previous authors by nearly an order of magnitude.
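As a baseline for the corrections reviewed above, the leading-order (statistical) allowed spectrum shape is just the phase-space factor. A sketch in electron-mass units, with the Fermi function and all corrections discussed in the paper deliberately omitted:

```python
import math

def statistical_shape(W, W0):
    """Leading-order allowed beta spectrum, N(W) proportional to
    p * W * (W0 - W)**2, with W the total electron energy in units of
    m_e c^2 and W0 the endpoint. Fermi function, finite-size, radiative,
    and recoil corrections are all omitted."""
    if W <= 1.0 or W >= W0:
        return 0.0                    # outside the physical region
    p = math.sqrt(W * W - 1.0)        # electron momentum (same units)
    return p * W * (W0 - W) ** 2
```

The analytical description in the paper multiplies this bare shape by the correction factors it reviews; the sketch only fixes the zeroth-order term they modify.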
Capillary waveguide optrodes: an approach to optical sensing in medical diagnostics
NASA Astrophysics Data System (ADS)
Lippitsch, Max E.; Draxler, Sonja; Kieslinger, Dietmar; Lehmann, Hartmut; Weigl, Bernhard H.
1996-07-01
Glass capillaries with a chemically sensitive coating on the inner surface are used as optical sensors for medical diagnostics. A capillary simultaneously serves as a sample compartment, a sensor element, and an inhomogeneous optical waveguide. Various detection schemes based on absorption, fluorescence intensity, or fluorescence lifetime are described. In absorption-based capillary waveguide optrodes the absorption in the sensor layer is analyte dependent; hence light transmission along the inhomogeneous waveguiding structure formed by the capillary wall and the sensing layer is a function of the analyte concentration. Similarly, in fluorescence-based capillary optrodes the fluorescence intensity or the fluorescence lifetime of an indicator dye fixed in the sensing layer is analyte dependent; thus the specific property of fluorescent light excited in the sensing layer and thereafter guided along the inhomogeneous waveguiding structure is a function of the analyte concentration. Both schemes are experimentally demonstrated, one with carbon dioxide as the analyte and the other one with oxygen. The device combines optical sensors with the standard glass capillaries usually applied to gather blood drops from fingertips, to yield a versatile diagnostic instrument, integrating the sample compartment, the optical sensor, and the light-collecting optics into a single piece. This ensures enhanced sensor performance as well as improved handling compared with other sensors.
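The abstract does not state the calibration model for the lifetime-based oxygen scheme, but lifetime-based oxygen optrodes are conventionally described by the Stern-Volmer quenching relation; a sketch under that assumption (constants are hypothetical, not from this work):

```python
def quenched_lifetime(tau0, k_sv, o2):
    """Stern-Volmer quenching: tau = tau0 / (1 + K_SV * [O2]).
    tau0: unquenched lifetime; k_sv: Stern-Volmer constant in units
    consistent with the [O2] scale (both hypothetical here)."""
    return tau0 / (1.0 + k_sv * o2)

# when K_SV * [O2] = 1, the measured lifetime halves
print(quenched_lifetime(5.0, 0.1, 10.0))  # → 2.5
```

Inverting this relation turns a measured lifetime into an oxygen concentration, which is how an analyte-dependent lifetime becomes a sensor readout.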
Big data in health care: using analytics to identify and manage high-risk and high-cost patients.
Bates, David W; Saria, Suchi; Ohno-Machado, Lucila; Shah, Anand; Escobar, Gabriel
2014-07-01
The US health care system is rapidly adopting electronic health records, which will dramatically increase the quantity of clinical data that are available electronically. Simultaneously, rapid progress has been made in clinical analytics--techniques for analyzing large quantities of data and gleaning new insights from that analysis--which is part of what is known as big data. As a result, there are unprecedented opportunities to use big data to reduce the costs of health care in the United States. We present six use cases--that is, key examples--where some of the clearest opportunities exist to reduce costs through the use of big data: high-cost patients, readmissions, triage, decompensation (when a patient's condition worsens), adverse events, and treatment optimization for diseases affecting multiple organ systems. We discuss the types of insights that are likely to emerge from clinical analytics, the types of data needed to obtain such insights, and the infrastructure--analytics, algorithms, registries, assessment scores, monitoring devices, and so forth--that organizations will need to perform the necessary analyses and to implement changes that will improve care while reducing costs. Our findings have policy implications for regulatory oversight, ways to address privacy concerns, and the support of research on analytics. Project HOPE—The People-to-People Health Foundation, Inc.
Pilolli, Rosa; Ditaranto, Nicoletta; Di Franco, Cinzia; Palmisano, Francesco; Cioffi, Nicola
2012-10-01
Metal nanomaterials have an emerging role in surface-assisted laser desorption ionisation-mass spectrometry (SALDI-MS) providing a useful tool to overcome some limitations intrinsically related to the use of conventional organic matrices in matrix-assisted LDI-MS. In this contribution, the possibility to use a stainless-steel-supported gold nanoparticle (AuNP) film as a versatile platform for SALDI-MS was assessed. A sacrificial anode electrosynthetic route was chosen in order to obtain morphologically controlled core-shell AuNPs; the colloidal AuNPs were, thereafter, drop cast onto a stainless-steel sample plate and the resulting AuNP film was thermally annealed in order to improve its effectiveness as LDI-MS promoter. Spectroscopic characterization of the nanostructured film by X-ray photoelectron spectroscopy was crucial for understanding how annealing induced changes in the surface chemistry and influenced the performance of AuNPs as desorption/ionisation promoter. In particular, it was demonstrated that the post-deposition treatments were essential to enhance the AuNP core/analyte interaction, thus resulting in SALDI-MS spectra of significantly improved quality. The AuNP films were applied to the detection of three different classes of low molecular weight (LMW) analytes, i.e. amino acids, peptides and LMW polymers, in order to demonstrate the versatility of this nanostructured material.
Albuquerque De Almeida, Fernando; Al, Maiwenn; Koymans, Ron; Caliskan, Kadir; Kerstens, Ankie; Severens, Johan L
2018-04-01
Describing the general and methodological characteristics of decision-analytical models used in the economic evaluation of early warning systems for the management of chronic heart failure patients and performing a quality assessment of their methodological characteristics is expected to provide concise and useful insight to inform the future development of decision-analytical models in the field of heart failure management. Areas covered: The literature on decision-analytical models for the economic evaluation of early warning systems for the management of chronic heart failure patients was systematically reviewed. Nine electronic databases were searched through the combination of synonyms for heart failure and sensitive filters for cost-effectiveness and early warning systems. Expert commentary: The retrieved models show some variability with regards to their general study characteristics. Overall, they display satisfactory methodological quality, even though some points could be improved, namely on the consideration and discussion of any competing theories regarding model structure and disease progression, identification of key parameters and the use of expert opinion, and uncertainty analyses. A comprehensive definition of early warning systems and further research under this label should be pursued. To improve the transparency of economic evaluation publications, authors should make available detailed technical information regarding the published models.
Challenges and Opportunities of Big Data in Health Care: A Systematic Review
Goswamy, Rishi; Raval, Yesha; Marawi, Sarah
2016-01-01
Background: Big data analytics offers promise in many business sectors, and health care is looking at big data to provide answers to many age-related issues, particularly dementia and chronic disease management. Objective: The purpose of this review was to summarize the challenges faced by big data analytics and the opportunities that big data opens in health care. Methods: A total of 3 searches were performed for publications between January 1, 2010 and January 1, 2016 (PubMed/MEDLINE, CINAHL, and Google Scholar), and an assessment was made on content germane to big data in health care. From the results of the searches in research databases and Google Scholar (N=28), the authors summarized content and identified 9 and 14 themes under the categories Challenges and Opportunities, respectively. We rank-ordered and analyzed the themes based on the frequency of occurrence. Results: The top challenges were issues of data structure, security, data standardization, storage and transfers, and managerial skills such as data governance. The top opportunities revealed were quality improvement, population management and health, early detection of disease, data quality, structure, and accessibility, improved decision making, and cost reduction. Conclusions: Big data analytics has the potential for positive impact and global implications; however, it must overcome some legitimate obstacles. PMID:27872036
Kouri, T T; Gant, V A; Fogazzi, G B; Hofmann, W; Hallander, H O; Guder, W G
2000-07-01
Because urinalysis continues to be one of the most frequently requested laboratory tests, improved and standardized performance is needed. Since 1997, the European Confederation of Laboratory Medicine (ECLM) has been supporting an interdisciplinary project aiming to produce European urinalysis guidelines. More than seventy clinical chemists, microbiologists and ward-based clinicians, as well as representatives of manufacturers, are taking part. These guidelines aim to improve the quality and consistency of chemical urinalysis, particle counting and bacterial culture by suggesting optimal investigative processes that could be applied in Europe. The approach is based on medical needs for urinalysis. The importance of the pre-analytical stage for total quality is stressed by detailed illustrative advice for specimen collection. Attention is also given to emerging automated technology. For cost-containment reasons, both optimum (ideal) procedures and minimum analytical approaches are suggested. Since urinalysis mostly lacks genuine reference methods (primary reference measurement procedures; Level 4), a novel classification of the methods is proposed: comparison measurement procedures (Level 3), quantitative routine procedures (Level 2), and ordinal scale examinations (Level 1). Stepwise strategies are suggested to save costs, applying different rules for general and specific patient populations. New analytical quality specifications have been created. After a consultation period, the final text will be published in full as a separate document.
NASA Astrophysics Data System (ADS)
Chowdhury, Prasun; Saha Misra, Iti
2014-10-01
With the increasing demand on Broadband Wireless Access (BWA) networks, a guaranteed Quality of Service (QoS) is required to manage the seamless transmission of heterogeneous handoff calls. To this end, this paper proposes an improved Call Admission Control (CAC) mechanism with a prioritized handoff queuing scheme that aims to reduce the dropping probability of handoff calls. Handoff calls are queued when no bandwidth is available even after the allowable bandwidth degradation of the ongoing calls, and are admitted into the network, with higher priority than newly originated calls, when an ongoing call terminates. An analytical Markov model for the proposed CAC mechanism is developed to analyze various performance parameters. Analytical results show that the proposed CAC with handoff queuing prioritizes handoff calls effectively and reduces the dropping probability of the system by 78.57% for real-time traffic without increasing the number of failed new call attempts. This results in increased bandwidth utilization of the network.
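Markov models of handoff prioritization are commonly introduced via the classic guard-channel birth-death chain, in which a few channels are reserved for handoff calls. The sketch below is that textbook model (not the authors' degradation-and-queuing scheme) and shows the basic trade between new-call blocking and handoff dropping:

```python
def guard_channel_probs(C, g, lam_new, lam_ho, mu):
    """Steady-state probabilities of the classic guard-channel Markov chain.
    C: total channels; g: channels reserved for handoffs;
    lam_new / lam_ho: new-call / handoff arrival rates; mu: per-call
    service rate. Returns (new-call blocking, handoff dropping)."""
    thresh = C - g
    pi = [1.0]                              # unnormalized state probabilities
    for n in range(1, C + 1):
        # new calls are admitted only below the threshold state
        lam = lam_new + lam_ho if n - 1 < thresh else lam_ho
        pi.append(pi[-1] * lam / (n * mu))  # birth-death balance
    Z = sum(pi)
    pi = [x / Z for x in pi]
    p_drop_handoff = pi[C]                  # all channels busy
    p_block_new = sum(pi[thresh:])          # at or above the threshold
    return p_block_new, p_drop_handoff
```

With g = 0 the two probabilities coincide (plain Erlang loss); reserving channels lowers the handoff dropping probability at the cost of more new-call blocking, which is exactly the trade-off that queuing-based schemes such as the one above try to soften.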
DOE Office of Scientific and Technical Information (OSTI.GOV)
Afzal, Muhammad U., E-mail: muhammad.afzal@mq.edu.au; Esselle, Karu P.
This paper presents a quasi-analytical technique to design a continuous, all-dielectric phase-correcting structure (PCS) for circularly polarized Fabry-Perot resonator antennas (FPRAs). The PCS is realized by varying the thickness of a rotationally symmetric dielectric block placed above the antenna. A global analytical expression is derived for the PCS thickness profile, which is required to achieve nearly uniform phase distribution at the output of the PCS despite the non-uniform phase distribution at its input. An alternative piecewise technique based on spline interpolation is also explored to design a PCS. Both far- and near-field results show that a PCS substantially improves the radiation performance of the FPRA. These improvements include an increase in peak directivity from 22 to 120 (from 13.4 dBic to 20.8 dBic) and a decrease in 3 dB beamwidth from 41.5° to 15°. The phase-corrected antenna also has a good directivity bandwidth of 1.3 GHz, which is 11% of the center frequency.
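The two directivity figures quoted are the same quantities on linear and decibel scales; the conversion is the usual 10·log10, which can be checked directly:

```python
import math

def to_dbic(directivity_linear):
    """Convert the linear directivity of a circularly polarized
    antenna to dBic (decibels relative to an isotropic circularly
    polarized radiator)."""
    return 10.0 * math.log10(directivity_linear)

# the reported improvement, 22 -> 120 in linear terms:
print(round(to_dbic(22), 1), round(to_dbic(120), 1))  # → 13.4 20.8
```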
NASA Astrophysics Data System (ADS)
Wenzel, Thomas J.
2001-09-01
The availability of state-of-the-art instruments such as a high-performance liquid chromatograph, a gas chromatograph-mass spectrometer, an inductively coupled plasma-atomic emission spectrometer, a capillary electrophoresis system, and an ion chromatograph, obtained through four Instructional Laboratory Improvement grants and one Course, Curriculum, and Laboratory Improvement grant from the National Science Foundation, has led to a profound change in the structure of the analytical and general chemistry courses at Bates College. Students in both sets of courses now undertake ambitious, semester-long, small-group projects. The general chemistry course, which fulfills the prerequisite requirement for all upper-level chemistry courses, focuses on the connection between chemistry and the study of the environment. The projects provide students with an opportunity to conduct a real scientific investigation. The projects emphasize problem solving, teamwork, and communication, while still fostering the development of important laboratory skills. Cooperative learning is also used extensively in the classroom portion of these courses.
NASA Astrophysics Data System (ADS)
Wang, Peng; Li, Hong; Zhang, Jiye; Mei, TX
2015-10-01
In this paper, an analytical design approach for the development of self-powered active suspensions is investigated and applied to optimise the control system design for an active lateral secondary suspension for railway vehicles. The conditions for energy balance are analysed, and the relationship between ride quality improvement and energy consumption is discussed in detail. Modal skyhook control is applied to analyse the energy consumption of this suspension by separating its dynamics into the lateral and yaw modes, and, based on a simplified model, the average power consumption of the actuators is computed in the frequency domain using the power spectral density of the lateral alignment of track irregularities. The impact of the control gains and the actuators' key parameters on both vibration suppression and energy recovery/storage performance is then analysed. Computer simulation is used to verify the obtained energy balance condition and to demonstrate that improved ride comfort is achieved by this self-powered active suspension without any external power supply.
Interactive Management and Updating of Spatial Data Bases
NASA Technical Reports Server (NTRS)
French, P.; Taylor, M.
1982-01-01
The decision-making process, whether for power plant siting, load forecasting or energy resource planning, invariably involves a blend of analytical methods and judgement. Management decisions can be improved by implementing techniques which permit an increased comprehension of results from analytical models. Even where analytical procedures are not required, decisions can be aided by improving the methods used to examine spatially and temporally variant data. How the use of computer-aided planning (CAP) programs and the selection of a predominant data structure can improve the decision-making process is discussed.
NASA Astrophysics Data System (ADS)
Khazaee, I.
2015-05-01
In this study, the performance of a proton exchange membrane (PEM) fuel cell in mobile applications is investigated analytically. At present, fuel cells have a particularly strong impact on mobile applications such as vehicles, mobile computers and mobile telephones. External parameters such as the cell temperature (Tcell), the operating pressure of the gases (P) and the air stoichiometry (λair) affect the performance and voltage losses of the PEM fuel cell. Because many theoretical, empirical and semi-empirical models of the PEM fuel cell exist, it is necessary to compare their accuracy. Theoretical models obtained from thermodynamic and electrochemical approaches are exact but complex, so it is easier to use empirical and semi-empirical models to forecast fuel cell system performance in many applications, such as mobile applications. The main purpose of this study is to obtain the semi-empirical relation of a PEM fuel cell with the least voltage losses. The results are compared with existing experimental results in the literature, and good agreement is observed.
Boeing Smart Rotor Full-scale Wind Tunnel Test Data Report
NASA Technical Reports Server (NTRS)
Kottapalli, Sesi; Hagerty, Brandon; Salazar, Denise
2016-01-01
A full-scale helicopter smart material actuated rotor technology (SMART) rotor test was conducted in the USAF National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel at NASA Ames. The SMART rotor system is a five-bladed MD 902 bearingless rotor with active trailing-edge flaps. The flaps are actuated using piezoelectric actuators. Rotor performance, structural loads, and acoustic data were obtained over a wide range of rotor shaft angles of attack, thrust, and airspeeds. The primary test objective was to acquire unique validation data for the high-performance computing analyses developed under the Defense Advanced Research Project Agency (DARPA) Helicopter Quieting Program (HQP). Other research objectives included quantifying the ability of the on-blade flaps to achieve vibration reduction, rotor smoothing, and performance improvements. This data set of rotor performance and structural loads can be used for analytical and experimental comparison studies with other full-scale rotor systems and for analytical validation of computer simulation models. The purpose of this final data report is to document a comprehensive, high-quality data set that includes only data points where the flap was actively controlled and each of the five flaps behaved in a similar manner.
Wang, Zhibing; He, Mengyu; Jiang, Chunzhu; Zhang, Fengqing; Du, Shanshan; Feng, Wennan; Zhang, Hanqi
2015-12-01
Matrix solid-phase dispersion coupled with homogeneous ionic liquid microextraction was developed and applied to the extraction of some sulfonamides, including sulfamerazine, sulfamethazine, sulfathiazole, sulfachloropyridazine, sulfadoxine, sulfisoxazole, and sulfaphenazole, from animal tissues. High-performance liquid chromatography was applied to the separation and determination of the target analytes. The solid sample was directly treated by matrix solid-phase dispersion, and the eluate obtained was treated by homogeneous ionic liquid microextraction. An ionic liquid was used as the extraction solvent, which may improve the recoveries of the target analytes. To avoid using organic solvent and reduce environmental pollution, water was used as the elution solvent for matrix solid-phase dispersion. The effects of the experimental parameters on recoveries, including the type and volume of ionic liquid, type of dispersant, ratio of sample to dispersant, pH value of the elution solvent, volume of the elution solvent, amount of salt in the eluate, amount of ion-pairing agent (NH4PF6), and centrifuging time, were evaluated. When the present method was applied to the analysis of animal tissues, the recoveries of the analytes ranged from 85.4 to 118.0%, and the relative standard deviations were lower than 9.30%. The detection limits for the analytes were 4.3-13.4 μg/kg. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Enhanced performance of microfluidic soft pressure sensors with embedded solid microspheres
NASA Astrophysics Data System (ADS)
Shin, Hee-Sup; Ryu, Jaiyoung; Majidi, Carmel; Park, Yong-Lae
2016-02-01
The cross-sectional geometry of an embedded microchannel influences the electromechanical response of a soft microfluidic sensor to applied surface pressure. When a pressure exerted on the surface of the sensor deforms the soft structure, the cross-sectional area of the embedded channel filled with a conductive fluid decreases, increasing the channel's electrical resistance. This electromechanical coupling can be tuned by adding solid microspheres into the channel. In order to determine the influence of the microspheres, we use both analytical and computational methods to predict the pressure responses of soft microfluidic sensors with two different channel cross-sections: a square and an equilateral triangle. The analytical models were derived from contact mechanics, in which the microspheres were regarded as spherical indenters, and finite element analysis (FEA) was used for simulation. For experimental validation, sensor samples with the two channel cross-sections were prepared and tested, both with and without microspheres. The results from the analytical models, the FEA simulations, and the experiments showed reasonable agreement, confirming that the multi-material soft structure significantly improved the pressure response in terms of both linearity and sensitivity. The embedded solid particles enhanced the performance of the soft sensors while maintaining their flexible and stretchable mechanical characteristics. We also provide analytical and experimental analyses of the hysteresis of microfluidic soft sensors, considering the resistive force exerted on the shape recovery of the polymer structure by the embedded viscous fluid.
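The coupling described above (channel area down, resistance up) follows from the uniform-conductor relation R = ρL/A. A minimal sketch of that idealization (a uniform-cross-section approximation, not the paper's contact-mechanics model):

```python
def channel_resistance(resistivity, length, area):
    """Resistance of a conductive-fluid channel treated as a uniform
    conductor: R = rho * L / A."""
    return resistivity * length / area

def relative_resistance_increase(area_rest, area_deformed):
    """Fractional resistance increase when pressure shrinks the
    cross-sectional area (length and resistivity held fixed)."""
    return area_rest / area_deformed - 1.0

# halving the cross-sectional area doubles the resistance:
print(relative_resistance_increase(1.0, 0.5))  # → 1.0
```

The sensor's sensitivity comes from how strongly applied pressure shrinks the area, which is exactly what the channel geometry and embedded microspheres modify.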
Hot film wall shear instrumentation for compressible boundary layer transition research
NASA Technical Reports Server (NTRS)
Schneider, Steven P.
1992-01-01
Experimental and analytical studies of hot film wall shear instrumentation were performed. A new hot film anemometer was developed and tested. The anemometer performance was not quite as good as that of commercial anemometers, but the cost was much less and testing flexibility was improved. The main focus of the project was a parametric study of the effect of sensor size and substrate material on the performance of hot film surface sensors. Both electronic and shock-induced flow experiments were performed to determine the sensitivity and frequency response of the sensors. The results are presented in Michael Moen's M.S. thesis, which is appended. A condensed form of the results was also submitted for publication.
saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings
Myers, Nicholas M.; Strydom, Emmerentia Elza; Sweet, James; Sweet, Christopher; Spohrer, Rebecca; Dhansay, Muhammad Ali; Lieberman, Marya
2016-01-01
We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low-resource settings, the South African Medical Research Council (SAMRC) and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n=1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production. PMID:29942380
Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján
2016-02-04
Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for the state of the art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of solvent and sample compared to classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review surveys comprehensively and in two parts the developments of automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operation advantages as well as their potentials are further described and discussed. In this first part, an introduction to LPME and their static and dynamic operation modes as well as their automation methodologies is given. The LPME techniques are classified according to the different approaches of protection of the extraction solvent using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent such as membranes and porous media are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.
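The preconcentration factor mentioned in this review can be illustrated with the usual volume-ratio bound: under exhaustive extraction, enrichment cannot exceed the donor-to-acceptor volume ratio scaled by the recovery. A hedged sketch with invented volumes:

```python
def max_enrichment(v_donor, v_acceptor, recovery=1.0):
    """Upper bound on the enrichment (preconcentration) factor:
    recovery * V_donor / V_acceptor, with both volumes in the same units."""
    return recovery * v_donor / v_acceptor

# 5 mL of sample extracted into a 50 uL acceptor phase at 80% recovery
# (illustrative values, not from any specific LPME study):
ef_max = max_enrichment(v_donor=5000.0, v_acceptor=50.0, recovery=0.8)
print(ef_max)  # 80.0
```

This is why LPME, using only fractions of the solvent and sample of classical liquid-liquid extraction, can still reach high preconcentration factors.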
Shelley, Jacob T.; Wiley, Joshua S.; Hieftje, Gary M.
2011-01-01
The advent of ambient desorption/ionization mass spectrometry has resulted in a strong interest in ionization sources that are capable of direct analyte sampling and ionization. One source that has enjoyed increasing interest is the Flowing Atmospheric-Pressure Afterglow (FAPA). FAPA has been proven capable of directly desorbing/ionizing samples in any phase (solid, liquid, or gas) and with impressive limits of detection (<100 fmol). The FAPA was also shown to be less affected by competitive-ionization matrix effects than other plasma-based sources. However, the original FAPA design exhibited substantial background levels, cluttered background spectra in the negative-ion mode, and significant oxidation of aromatic analytes, which ultimately compromised analyte identification and quantification. In the present study, a change in the FAPA configuration from a pin-to-plate to a pin-to-capillary geometry was found to vastly improve performance. Background signals in positive- and negative-ionization modes were reduced by 89% and 99%, respectively. Additionally, the capillary anode strongly reduced the amount of atomic oxygen that could cause oxidation of analytes. Temperatures of the gas stream that interacts with the sample, which heavily influences desorption capabilities, were compared between the two sources by means of IR thermography. The performance of the new FAPA configuration is evaluated through the determination of a variety of compounds in positive- and negative-ion mode, including agrochemicals and explosives. A detection limit of 4 amol was found for the direct determination of the agrochemical ametryn, and appears to be spectrometer-limited. The ability to quickly screen for analytes in bulk liquid samples with the pin-to-capillary FAPA is also shown. PMID:21627097
Molecularly imprinted polymers as selective adsorbents for ambient plasma mass spectrometry.
Cegłowski, Michał; Smoluch, Marek; Reszke, Edward; Silberring, Jerzy; Schroeder, Grzegorz
2017-05-01
The application of molecularly imprinted polymers (MIPs) as molecular scavengers for ambient plasma ionization mass spectrometry has been reported for the first time. MIPs were synthesized using methacrylic acid as functional monomer; nicotine, propyphenazone, or methylparaben as templates; ethylene glycol dimethacrylate as a cross-linker; and 2,2'-azobisisobutyronitrile as polymerization initiator. To perform ambient plasma ionization experiments, a setup consisting of a heated crucible, a flowing atmospheric-pressure afterglow (FAPA) plasma ion source, and a quadrupole ion trap mass spectrometer was used. The heated crucible with programmable temperature allows desorption of the analytes from the MIP structure, which results in their direct introduction into the ion stream. Limits of detection, linearity of the proposed analytical procedure, and selectivities were determined for three analytes: nicotine, propyphenazone, and methylparaben. The analytes were chosen from various classes of organic compounds to show the feasibility of the analytical procedure. The limits of detection (LODs) were 10 nM, 10 μM, and 0.5 μM for nicotine, propyphenazone, and methylparaben, respectively. In comparison with measurements performed on the non-imprinted polymers, the LODs were improved by at least one order of magnitude, owing to preconcentration of the sample and reduction of the background noise that contributes to signal suppression. The described procedure showed linearity over a broad range of concentrations. The overall time of a single analysis is short, requiring ca. 5 min. The developed technique was applied to the determination of nicotine, propyphenazone, and methylparaben in spiked real-life samples, with recoveries of 94.6-98.4%. The proposed method is rapid, sensitive, and accurate, providing a new option for the detection of small organic compounds in various samples. Graphical abstract The experimental setup used for analysis.
How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?
West, Brady T; Sakshaug, Joseph W; Aurelien, Guy Alain S
2016-01-01
Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.
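One concrete instance of the analytic errors discussed here is ignoring survey weights: an unweighted mean over a sample that over-represents a stratum is biased toward that stratum. A toy illustration with invented data (not from SESTAT):

```python
def weighted_mean(values, weights):
    """Design-consistent mean: weights undo unequal selection probabilities."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Stratum A (value 50) was oversampled 4:1 relative to its population share;
# the weight of 4 on the single stratum-B case restores the true proportions.
values  = [50, 50, 50, 50, 100]
weights = [1, 1, 1, 1, 4]

naive = sum(values) / len(values)        # 60.0, biased by the oversampling
design = weighted_mean(values, weights)  # 75.0, the population mean
print(naive, design)
```

Correct variance estimation additionally requires the stratification and clustering information, which is precisely what the design-ignorant analyses criticized in the article omit.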
A Lightweight I/O Scheme to Facilitate Spatial and Temporal Queries of Scientific Data Analytics
NASA Technical Reports Server (NTRS)
Tian, Yuan; Liu, Zhuo; Klasky, Scott; Wang, Bin; Abbasi, Hasan; Zhou, Shujia; Podhorszki, Norbert; Clune, Tom; Logan, Jeremy; Yu, Weikuan
2013-01-01
In the era of petascale computing, more scientific applications are being deployed on leadership-scale computing platforms to enhance scientific productivity. Many I/O techniques have been designed to address the growing I/O bottleneck on large-scale systems by handling massive scientific data in a holistic manner. While such techniques have been leveraged in a wide range of applications, they have not been shown to be adequate for many mission-critical applications, particularly in the data post-processing stage. For example, some scientific applications generate datasets composed of a vast number of small data elements that are organized along many spatial and temporal dimensions but require sophisticated data analytics on one or more dimensions. Including such dimensional knowledge in the data organization can benefit the efficiency of data post-processing, a capability often missing from existing I/O techniques. In this study, we propose a novel I/O scheme named STAR (Spatial and Temporal AggRegation) to enable high-performance data queries for scientific analytics. STAR is able to dive into the massive data, identify the spatial and temporal relationships among data variables, and accordingly organize them into an optimized multi-dimensional data structure before writing them to storage. This technique not only facilitates the common access patterns of data analytics, but also further reduces the application turnaround time. In particular, STAR enables efficient data queries along the time dimension, a practice common in scientific analytics but not yet supported by existing I/O techniques. In our case study with GEOS-5, a critical climate modeling application, experimental results on the Jaguar supercomputer demonstrate an improvement of up to 73 times in read performance compared to the original I/O method.
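The idea of folding dimensional knowledge into the data layout can be sketched with a toy in-memory analogue: records written variable-major are regrouped time-major so that a query along the time dimension touches a single bucket. This is only the access-pattern intuition, not STAR's actual on-disk format:

```python
from collections import defaultdict

# Toy records: (variable, timestep, value), arriving in write order
# (variable-major), as a post-processing-unfriendly layout would store them.
records = [("temp", t, 20.0 + t) for t in range(4)] + \
          [("wind", t, 5.0 * t) for t in range(4)]

# Aggregation step: group values by timestep so a temporal query reads
# one contiguous bucket instead of scattering across the variable runs.
by_time = defaultdict(dict)
for var, t, val in records:
    by_time[t][var] = val

def query_timestep(t):
    """All variables at one timestep -- a single lookup after aggregation."""
    return by_time[t]

print(query_timestep(2))  # {'temp': 22.0, 'wind': 10.0}
```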
Sternick, Edward S
2011-01-01
The Malcolm Baldrige National Quality Improvement Act was signed into law in 1987 to advance US business competitiveness and economic growth. Administered by the National Institute of Standards and Technology, the Act created the Baldrige National Quality Program, recently renamed the Baldrige Performance Excellence Program. The comprehensive analytical approaches referred to as the Baldrige Healthcare Criteria are well suited for the evaluation and sustainable improvement of radiation oncology management and operations. A multidisciplinary self-assessment approach is used for radiotherapy program evaluation and development in order to generate a fact-based, knowledge-driven system for improving quality of care, increasing patient satisfaction, enhancing leadership effectiveness, building employee engagement, and boosting organizational innovation. This methodology also provides a valuable framework for benchmarking an individual radiation oncology practice's operations and results against guidelines defined by accreditation and professional organizations and regulatory agencies.
Strategy for improved frequency response of electric double-layer capacitors
NASA Astrophysics Data System (ADS)
Wada, Yoshifumi; Pu, Jiang; Takenobu, Taishi
2015-10-01
We propose a strategy for improving the response speed of electric double-layer capacitors (EDLCs) and electric double-layer transistors (EDLTs), based on an asymmetric structure with differently sized active materials and gate electrodes. We validate the strategy analytically by a classical calculation and experimentally by fabricating EDLCs with asymmetric Au electrodes (1:50 area ratio and 7.5 μm gap distance). The performance of the EDLCs is compared with that of conventional symmetric EDLCs. Our strategy dramatically improved the cut-off frequency from 14 to 93 kHz and this improvement is explained by fast charging of smaller electrodes. Therefore, this approach is particularly suitable to EDLTs, potentially expanding the applicability to medium speed (kHz-MHz) devices.
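As a rough point of reference, the reported jump from 14 to 93 kHz corresponds to a ~6.6-fold reduction in the effective RC product under a generic first-order model f_c = 1/(2πRC). This is back-of-envelope arithmetic, not the equivalent circuit used in the paper:

```python
import math

def rc_from_cutoff(f_c):
    """Effective RC product implied by a first-order cut-off f_c = 1/(2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * f_c)

# Ratio of effective RC products before and after the asymmetric redesign:
improvement = rc_from_cutoff(14e3) / rc_from_cutoff(93e3)
print(round(improvement, 2))  # 6.64
```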
Numerical Modeling of Pulse Detonation Rocket Engine Gasdynamics and Performance
NASA Technical Reports Server (NTRS)
Morris, C. I.
2003-01-01
Pulse detonation engines (PDEs) have generated considerable research interest in recent years as a chemical propulsion system potentially offering improved performance and reduced complexity compared to conventional gas turbines and rocket engines. The detonative mode of combustion employed by these devices offers a theoretical thermodynamic advantage over the constant-pressure deflagrative combustion mode used in conventional engines. However, the unsteady blowdown process intrinsic to all pulse detonation devices has made realistic estimates of the actual propulsive performance of PDEs problematic. The recent review article by Kailasanath highlights some of the progress that has been made in comparing the available experimental measurements with analytical and numerical models.
Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration
NASA Technical Reports Server (NTRS)
Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.
1993-01-01
Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions is examined.
European Multicenter Study on Analytical Performance of DxN Veris System HCV Assay.
Braun, Patrick; Delgado, Rafael; Drago, Monica; Fanti, Diana; Fleury, Hervé; Gismondo, Maria Rita; Hofmann, Jörg; Izopet, Jacques; Kühn, Sebastian; Lombardi, Alessandra; Marcos, Maria Angeles; Sauné, Karine; O'Shea, Siobhan; Pérez-Rivilla, Alfredo; Ramble, John; Trimoulet, Pascale; Vila, Jordi; Whittaker, Duncan; Artus, Alain; Rhodes, Daniel W
2017-04-01
The analytical performance of the Veris HCV Assay for use on the new and fully automated Beckman Coulter DxN Veris Molecular Diagnostics System (DxN Veris System) was evaluated at 10 European virology laboratories. Precision, analytical sensitivity, specificity, performance with negative samples, linearity, and performance with hepatitis C virus (HCV) genotypes were evaluated. Precision for all sites showed a standard deviation (SD) of 0.22 log10 IU/ml or lower for each level tested. Analytical sensitivity determined by probit analysis was between 6.2 and 9.0 IU/ml. Specificity on 94 unique patient samples was 100%, and performance with 1,089 negative samples demonstrated 100% not-detected results. Linearity using patient samples was shown from 1.34 to 6.94 log10 IU/ml. The assay demonstrated linearity upon dilution with all HCV genotypes. The Veris HCV Assay demonstrated an analytical performance comparable to that of currently marketed HCV assays when tested across multiple European sites. Copyright © 2017 American Society for Microbiology.
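The probit-derived analytical sensitivity reported here is the concentration at which the modeled detection rate reaches 95%. A minimal sketch of the idea using a normal CDF in log-concentration; the parameters below are a toy fit, not the study's:

```python
import math
from statistics import NormalDist

def detection_rate(conc, mu_log10, sigma_log10):
    """Probit model: P(detect) = Phi((log10(conc) - mu) / sigma)."""
    return NormalDist(mu_log10, sigma_log10).cdf(math.log10(conc))

def lod95(mu_log10, sigma_log10):
    """Concentration at which the modeled detection rate reaches 95%."""
    z95 = NormalDist().inv_cdf(0.95)
    return 10 ** (mu_log10 + z95 * sigma_log10)

# Toy parameters (not the study's fit): 50% detection at 2 IU/mL,
# slope sigma = 0.3 log10 units.
lod = lod95(math.log10(2.0), 0.3)
print(round(lod, 1))
```

In practice the (mu, sigma) pair is estimated by regressing observed hit rates from replicate dilutions against log concentration.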
Median of patient results as a tool for assessment of analytical stability.
Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György
2015-06-15
In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes, measured on two COBAS INTEGRA 800 instruments, performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
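The monitoring scheme described can be sketched as: compute each month's median of patient results and flag months whose deviation from a baseline median exceeds the allowable bias. The function name, data, and the 3% limit below are illustrative, not taken from the paper:

```python
from statistics import median

def flag_unstable_months(monthly_results, baseline_median, allowable_bias_pct):
    """Return months whose patient-result median deviates from the baseline
    median by more than the allowable analytical bias (in percent)."""
    flagged = []
    for month, results in monthly_results.items():
        m = median(results)
        if abs(m - baseline_median) / baseline_median * 100.0 > allowable_bias_pct:
            flagged.append(month)
    return flagged

data = {
    "Jan": [4.9, 5.0, 5.1, 5.0],  # stable around the baseline
    "Feb": [5.4, 5.5, 5.6, 5.5],  # ~10% shift, e.g., a calibration drift
}
flagged = flag_unstable_months(data, baseline_median=5.0, allowable_bias_pct=3.0)
print(flagged)  # ['Feb']
```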
Vehicle-scale investigation of a fluorine jet-pump liquid hydrogen tank pressurization system
NASA Technical Reports Server (NTRS)
Cady, E. C.; Kendle, D. W.
1972-01-01
A comprehensive analytical and experimental program was performed to evaluate the performance of a fluorine-hydrogen jet-pump injector for main tank injection (MTI) pressurization of a liquid hydrogen (LH2) tank. The injector performance during pressurization and LH2 expulsion was determined by a series of seven tests of a full-scale injector and MTI pressure control system in a 28.3 cu m (1000 cu ft) flight-weight LH2 tank. Although the injector did not effectively jet-pump LH2 continuously, it showed improved pressurization performance compared to straight-pipe injectors tested under the same conditions in a previous program. The MTI computer code was modified to allow performance prediction for the jet-pump injector.
Supersonic wings with significant leading-edge thrust at cruise
NASA Technical Reports Server (NTRS)
Robins, A. W.; Carlson, H. W.; Mack, R. J.
1980-01-01
Experimental/theoretical correlations are presented which show that significant levels of leading-edge thrust are possible at supersonic speeds for certain planforms which match the theoretical thrust distribution potential with the supporting airfoil geometry. The analytical process employed spanwise distributions of both leading-edge thrust and/or the component of full theoretical thrust which acts as vortex lift. Significantly improved aerodynamic performance in the moderate supersonic speed regime is indicated.
Estimating reliable paediatric reference intervals in clinical chemistry and haematology.
Ridefelt, Peter; Hellberg, Dan; Aldrimer, Mattias; Gustafsson, Jan
2014-01-01
Very few high-quality studies on paediatric reference intervals for general clinical chemistry and haematology analytes have been performed. Three recent prospective community-based projects utilising blood samples from healthy children in Sweden, Denmark and Canada have substantially improved the situation. The present review summarises current reference interval studies for common clinical chemistry and haematology analyses. ©2013 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Barrientos, Francesca; Castle, Joseph; McIntosh, Dawn; Srivastava, Ashok
2007-01-01
This document presents a preliminary evaluation of the utility of the FAA Safety Analytics Thesaurus (SAT) in enhancing automated document processing applications under development at NASA Ames Research Center (ARC). Current development efforts at ARC are described, including overviews of the statistical machine learning techniques that have been investigated. An analysis of opportunities for applying thesaurus knowledge to improve algorithm performance is then presented.
Mitchell, Elizabeth O; Stewart, Greg; Bajzik, Olivier; Ferret, Mathieu; Bentsen, Christopher; Shriver, M Kathleen
2013-12-01
A multisite study was conducted to evaluate the performance of the Bio-Rad 4th generation GS HIV Combo Ag/Ab EIA versus the Abbott 4th generation ARCHITECT HIV Ag/Ab Combo. The performance of two 3rd generation EIAs, Ortho Diagnostics Anti-HIV 1+2 EIA and Siemens HIV 1/O/2, was also evaluated. The study objectives were comparison of analytical HIV-1 p24 antigen detection, sensitivity in HIV-1 seroconversion panels, and specificity in blood donors and two HIV false-reactive panels. Analytical sensitivity was evaluated with International HIV-1 p24 antigen standards, the AFSSAPS (pg/mL) and WHO 90/636 (IU/mL) standards; sensitivity in acute infection was compared on 55 seroconversion samples, and specificity was evaluated on 1000 negative blood donors and two false-reactive panels. GS HIV Combo Ag/Ab demonstrated better analytical HIV antigen sensitivity than ARCHITECT HIV Ag/Ab Combo: 0.41 IU/mL versus 1.2 IU/mL (WHO) and 12.7 pg/mL versus 20.1 pg/mL (AFSSAPS); GS HIV Combo Ag/Ab EIA also demonstrated slightly better specificity than ARCHITECT HIV Ag/Ab Combo (100% versus 99.7%). The 4th generation HIV Combo tests detected seroconversion 7-11 days earlier than the 3rd generation HIV antibody-only EIAs. Both 4th generation immunoassays demonstrated excellent sensitivity, with a reduction of the serological window period (detection 7-11 days earlier than with the 3rd generation HIV tests). However, GS HIV Combo Ag/Ab demonstrated improved HIV antigen analytical sensitivity and slightly better specificity than the ARCHITECT HIV Ag/Ab Combo assay, with higher positive predictive values (PPV) for low-prevalence populations. Copyright © 2013 Elsevier B.V. All rights reserved.
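The remark about positive predictive value in low-prevalence populations follows directly from Bayes' rule: at low prevalence, even a small false-positive rate dominates the true positives. A small sketch (the prevalence value is illustrative):

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value via Bayes' rule."""
    tp = sensitivity * prevalence
    fp = (1.0 - specificity) * (1.0 - prevalence)
    return tp / (tp + fp)

# At 0.1% prevalence, a 0.3% false-positive rate (99.7% specificity) drags
# PPV down to ~25% even with perfect sensitivity, which is why the small
# specificity edge (100% vs 99.7%) matters for screening populations.
ppv_low = ppv(sensitivity=1.0, specificity=0.997, prevalence=0.001)
print(round(ppv_low, 2))  # 0.25
```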
NASA Astrophysics Data System (ADS)
Doe, T.; McLaren, R.; Finilla, A.
2017-12-01
An enduring legacy of Paul Witherspoon and his students and colleagues has been both the development of geothermal energy and the foundations of modern fractured-rock hydrogeology. One of the seminal contributions to the geothermal field was Gringarten, Witherspoon, and Ohnishi's analytical models for enhanced geothermal systems. Although discrete fracture network (DFN) modeling developed somewhat independently in the late 1970s, Paul Witherspoon's foresight in promoting underground in situ testing at the Stripa Mine in Sweden was a major driver in Lawrence Berkeley Laboratory's contributions to its development. This presentation looks at extensions of Gringarten's analytical model into discrete fracture network modeling as a basis for providing further insights into the challenges and opportunities of engineered geothermal systems. The analytical solution itself has many insightful applications beyond those presented in the original paper. The definition of dimensionless time by itself shows that thermal breakthrough has a second-power dependence on surface area and on flow rate. Fracture intensity also plays a strong role, as it both increases the surface area and decreases the flow rate per fracture. The improvement of EGS performance with fracture intensity reaches a limit where thermal depletion of the rock lags only slightly behind the thermal breakthrough of cold water in the fracture network. Simple network models, which couple a DFN generator (FracMan) with a hydrothermally coupled flow solver (HydroGeoSphere), expand on Gringarten's concepts to show that realistic heterogeneity of fracture spacing and transmissivity significantly degrades EGS performance. EGS production in networks of stimulated fractures initially follows Gringarten's type curves, with a later deviation as the smaller rock blocks thermally deplete and the entire stimulated volume acts as a single sink.
Three-dimensional models of EGS performance show the critical importance of the relative magnitudes of the fluid pressure and stress gradients; preferential fracture growth and aperture enhancement may change with depth, creating preferential pathways through rock that is cooler than the injection depth.
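The second-power dependence noted in the abstract implies a simple scaling rule: at fixed flow rate, doubling fracture surface area delays thermal breakthrough fourfold, while doubling flow rate at fixed area advances it fourfold. A scaling sketch (the proportionality constant is absorbed, so only ratios between scenarios are meaningful):

```python
def relative_breakthrough_time(area, flow_rate):
    """Thermal breakthrough time up to a constant factor: t_b ~ (A / Q)**2.
    Only ratios between scenarios are meaningful in this sketch."""
    return (area / flow_rate) ** 2

base = relative_breakthrough_time(area=1.0, flow_rate=1.0)
print(relative_breakthrough_time(2.0, 1.0) / base)  # 4.0: double the area
print(relative_breakthrough_time(1.0, 2.0) / base)  # 0.25: double the flow
```

This is also why fracture intensity helps on both counts: it adds surface area and splits the total flow across more fractures.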
Pilot proficiency testing study for second tier congenital adrenal hyperplasia newborn screening.
De Jesús, Víctor R; Simms, David A; Schiffer, Jarad; Kennedy, Meredith; Mei, Joanne V; Hannon, W Harry
2010-11-11
Congenital adrenal hyperplasia (CAH) is caused by inherited defects in steroid biosynthesis. The Newborn Screening Quality Assurance Program (NSQAP) initiated a pilot, dried-blood spot (DBS)-based proficiency testing program designed to investigate materials and laboratory performance for second tier CAH screening by tandem mass spectrometry (MS/MS). The ratio of 17-α-hydroxyprogesterone (17-OHP), androstenedione (4-AD) and cortisol is used as an indicator of CAH in laboratory protocols for second tier analysis of DBS specimens. DBS prepared by NSQAP contained a range of steroid concentrations resulting in different clinical ratios. Laboratories received blind-coded DBS specimens and reported results to NSQAP for evaluation. Quantitative values reported by participants for 17-OHP, 4-AD, and cortisol, reflected small differences in their analytical methods. Average quantitative values for 17-OHP increased from 81% to 107% recovery over the 3.5-year period; cortisol recoveries increased from 61.9% to 89.5%; and 4-AD recoveries decreased from 184% to 68%. Laboratory participation in the CAH second tier proficiency testing program has resulted in improved analyte recoveries and enhanced sample preparation methodologies. NSQAP services for the second tier CAH analysis in DBS demonstrate the need for surveillance to ensure harmonization and continuous improvements, and to achieve sustained high-performance of newborn screening laboratories worldwide. Published by Elsevier B.V.
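The steroid ratio used as the second-tier CAH indicator is commonly computed as (17-OHP + 4-AD) / cortisol in published MS/MS protocols; an elevated ratio suggests the steroidogenic block. A minimal sketch (the concentrations are illustrative, and screening cutoffs are program-specific):

```python
def cah_second_tier_ratio(ohp17, ad4, cortisol):
    """Second-tier CAH marker, commonly (17-OHP + 4-AD) / cortisol in
    published MS/MS protocols; inputs in consistent units (e.g., nmol/L)."""
    return (ohp17 + ad4) / cortisol

# Illustrative concentrations only, not from the NSQAP materials:
ratio = cah_second_tier_ratio(ohp17=90.0, ad4=30.0, cortisol=60.0)
print(ratio)  # 2.0
```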
Freire, Carmen S. R.; Coutinho, João A. P.; Silvestre, Armando J. D.; Freire, Mara G.
2016-01-01
Due to their unique properties, in recent years, ionic liquids (ILs) have been largely investigated in the field of analytical chemistry. Particularly during the last sixteen years, they have been successfully applied in the chromatographic and electrophoretic analysis of value-added compounds extracted from biomass. Considering the growing interest in the use of ILs in this field, this critical review provides a comprehensive overview on the improvements achieved using ILs as constituents of mobile or stationary phases in analytical techniques, namely in capillary electrophoresis and its different modes, in high performance liquid chromatography, and in gas chromatography, for the separation and analysis of natural compounds. The impact of the IL chemical structure and the influence of secondary parameters, such as the IL concentration, temperature, pH, voltage and analysis time (when applied), are also critically addressed regarding the achieved separation improvements. Major conclusions on the role of ILs in the separation mechanisms and the performance of these techniques in terms of efficiency, resolution and selectivity are provided. Based on a critical analysis of all published results, some target-oriented ILs are suggested. Finally, current drawbacks and future challenges in the field are highlighted. In particular, the design and use of more benign and effective ILs as well as the development of integrated (and thus more sustainable) extraction–separation processes using IL aqueous solutions are suggested within a green chemistry perspective. PMID:27667965
The Empower project - a new way of assessing and monitoring test comparability and stability.
De Grande, Linde A C; Goossens, Kenneth; Van Uytfanghe, Katleen; Stöckl, Dietmar; Thienpont, Linda M
2015-07-01
Manufacturers and laboratories might benefit from using a modern integrated tool for quality management/assurance. The tool should not be confounded by commutability issues and should focus on the intrinsic analytical quality and comparability of assays as performed in routine laboratories. In addition, it should enable monitoring of the long-term stability of performance, with the possibility of quasi "real-time" remedial action. Therefore, we developed the "Empower" project. The project comprises four pillars: (i) master comparisons with panels of frozen single-donation samples, (ii) monitoring of patient percentiles and (iii) internal quality control data, and (iv) conceptual and statistical education about analytical quality. In the pillars described here (i and ii), state-of-the-art as well as biologically derived specifications are used. In the 2014 master comparisons survey, 125 laboratories forming 8 peer groups participated. It showed not only good intrinsic analytical quality of assays but also assay biases/non-comparability. Although laboratory performance was mostly satisfactory, sometimes huge between-laboratory differences were observed. In patient percentile monitoring, 100 laboratories currently participate with 182 devices. In particular, laboratories with a high daily throughput and low patient population variation show a moving median that is stable in time, with good between-instrument concordance. Shifts/drifts due to lot changes are sometimes revealed. There is evidence that outpatient medians mirror the calibration set-points shown in the master comparisons. The Empower project gives manufacturers and laboratories a realistic view of assay quality/comparability as well as stability of performance and/or the reasons for increased variation. Therefore, it is a modern tool for quality management/assurance toward improved patient care.
Data quality in drug discovery: the role of analytical performance in ligand binding assays
NASA Astrophysics Data System (ADS)
Wätzig, Hermann; Oltmann-Norden, Imke; Steinicke, Franziska; Alhazmi, Hassan A.; Nachbar, Markus; El-Hady, Deia Abd; Albishri, Hassan M.; Baumann, Knut; Exner, Thomas; Böckler, Frank M.; El Deeb, Sami
2015-09-01
Despite its importance and all the considerable efforts made, progress in drug discovery is limited. One main reason for this is the partly questionable data quality. Models relating biological activity to structure, as well as in silico predictions, rely on precisely and accurately measured binding data. However, these data vary so strongly that only variations by orders of magnitude are considered unreliable. This can certainly be improved, considering the high analytical performance achieved in pharmaceutical quality control. Thus the principles, properties and performance of biochemical and cell-based assays are revisited and evaluated. Among the biochemical assays, immunoassays, fluorescence assays, surface plasmon resonance, isothermal calorimetry, nuclear magnetic resonance and affinity capillary electrophoresis are discussed in detail; in addition, radiation-based ligand binding assays, mass spectrometry, atomic force microscopy and microscale thermophoresis are briefly evaluated. General sources of error, such as solvent, dilution, sample pretreatment and the quality of reagents and reference materials, are also discussed. Biochemical assays can be optimized to provide good accuracy and precision (e.g., relative standard deviation <10%). Cell-based assays are often considered superior with respect to biological significance; however, they typically still cannot be considered truly quantitative, in particular when results are compared over longer periods of time or between laboratories. A very careful choice of assays is therefore recommended. Strategies to further optimize assays are outlined, considering the evaluation and reduction of the relevant error sources. Analytical performance and data quality are still advancing and will further advance the progress in drug development.
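The precision figure quoted above (relative standard deviation below 10%) can be computed as a simple sketch; the replicate readings below are illustrative values, not data from the paper.

```python
# Minimal sketch of the precision criterion cited above: the percent
# relative standard deviation (%RSD) of replicate assay readings.
from statistics import mean, stdev

def percent_rsd(replicates):
    """%RSD = 100 * sample standard deviation / mean."""
    return 100.0 * stdev(replicates) / mean(replicates)

# Illustrative replicate binding measurements (arbitrary units); an
# optimized biochemical assay would be expected to stay below ~10 %RSD.
readings = [98.0, 101.0, 100.0, 99.0, 102.0]
rsd = percent_rsd(readings)
```

Reporting %RSD alongside binding constants would make it easier to judge whether differences between laboratories reflect biology or merely assay noise.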
DRREP: deep ridge regressed epitope predictor.
Sher, Gene; Zhi, Degui; Zhang, Shaojie
2017-10-03
The ability to predict epitopes plays an enormous role in vaccine development in terms of our ability to zero in on where to perform a more thorough in-vivo analysis of the protein in question. Though for the past decade there have been numerous advancements and improvements in epitope prediction, on average the best benchmark prediction accuracies are still only around 60%. New machine learning algorithms have arisen within the domains of deep learning, text mining, and convolutional networks. This paper presents a novel analytically trained, string-kernel-based deep neural network tailored for continuous epitope prediction, called the Deep Ridge Regressed Epitope Predictor (DRREP). DRREP was tested on long protein sequences from the following datasets: SARS, Pellequer, HIV, AntiJen, and SEQ194. DRREP was compared to numerous state-of-the-art epitope predictors, including the most recently published predictors, LBtope and DMNLBE. Using area under the ROC curve (AUC), DRREP achieved a performance improvement over the best performing predictors on SARS (13.7%), HIV (8.9%), Pellequer (1.5%), and SEQ194 (3.1%), with its performance being matched only on the AntiJen dataset, by the LBtope predictor, where both DRREP and LBtope achieved an AUC of 0.702. DRREP is an analytically trained deep neural network, thus capable of learning in a single step through regression. By combining the features of deep learning, string kernels, and convolutional networks, the system is able to perform residue-by-residue prediction of continuous epitopes with higher accuracy than the current state-of-the-art predictors.
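"Analytically trained ... in a single step through regression" refers to solving a regression problem in closed form rather than by iterative gradient descent. The sketch below shows generic single-step ridge regression via the normal equations; it is not DRREP's actual architecture, and the feature matrix is a made-up stand-in for string-kernel features.

```python
# Generic closed-form ridge regression: one analytical training step.
# This illustrates the idea of "analytically trained", not DRREP itself.
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Solve w = (X^T X + lam*I)^(-1) X^T y in a single step."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy feature matrix (imagine string-kernel features of residues) and
# labels generated from known weights plus small noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
w_true = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ w_true + 0.01 * rng.normal(size=50)
w = ridge_fit(X, y, lam=0.1)
```

Because the solution is obtained in one linear solve, training cost is dominated by forming and factoring a d-by-d matrix, which is what makes single-step learning attractive for long sequences.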
Improved apparatus for continuous culture of hydrogen-fixing bacteria
NASA Technical Reports Server (NTRS)
Foster, J. F.; Litchfield, J. H.
1970-01-01
Improved apparatus permits the continuous culture of Hydrogenomonas eutropha. The system incorporates three essential subsystems: (1) an environmentally isolated culture vessel, (2) an analytical system with appropriate sensors and readout devices, and (3) a control system with feedback responses to each analytical measurement.
Habchi, Baninia; Alves, Sandra; Jouan-Rimbaud Bouveresse, Delphine; Appenzeller, Brice; Paris, Alain; Rutledge, Douglas N; Rathahao-Paris, Estelle
2018-01-01
Due to the presence of pollutants in the environment and food, the assessment of human exposure is required. This necessitates high-throughput approaches enabling large-scale analysis and, as a consequence, the use of high-performance analytical instruments to obtain highly informative metabolomic profiles. In this study, direct introduction mass spectrometry (DIMS) was performed using a Fourier transform ion cyclotron resonance (FT-ICR) instrument equipped with a dynamically harmonized cell. Data quality was evaluated based on mass resolving power (RP), mass measurement accuracy, and ion intensity drifts from repeated injections of a quality control sample (QC) along the analytical process. The large DIMS data size entails the use of bioinformatic tools for the automatic selection of common ions found in all QC injections and for robustness assessment and correction of eventual technical drifts. RP values greater than 10^6 and mass measurement accuracies better than 1 ppm were obtained using broadband mode, resulting in the detection of isotopic fine structure. Hence, a very accurate relative isotopic mass defect (RΔm) value was calculated. This significantly reduces the number of elemental composition (EC) candidates and greatly improves compound annotation. A very satisfactory estimate of the repeatability of both peak intensity and mass measurement was demonstrated. Although a non-negligible ion intensity drift was observed for negative ion mode data, a normalization procedure was easily applied to correct this phenomenon. This study illustrates the performance and robustness of the dynamically harmonized FT-ICR cell for performing large-scale high-throughput metabolomic analyses in routine conditions. Graphical abstract: Analytical performance of an FT-ICR instrument equipped with a dynamically harmonized cell.
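The sub-ppm mass accuracy figure above is the signed relative error between measured and theoretical m/z. A minimal sketch (the glucose [M-H]- value used here is a common textbook example, not a measurement from this study):

```python
# Sketch of the mass measurement accuracy metric: signed error in parts
# per million between measured and theoretical m/z. Values illustrative.

def ppm_error(measured_mz: float, theoretical_mz: float) -> float:
    """Signed mass error in ppm."""
    return 1e6 * (measured_mz - theoretical_mz) / theoretical_mz

# A deprotonated glucose ion at theoretical m/z 179.05611 measured at
# 179.05605 corresponds to roughly -0.34 ppm, within the 1 ppm cited.
err = ppm_error(179.05605, 179.05611)
```

Keeping this error under 1 ppm is what allows the relative isotopic mass defect to discriminate between elemental composition candidates that differ only in fine structure.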
Casella, Innocenzo G; Pierri, Marianna; Contursi, Michela
2006-02-24
The electrochemical behaviour of the polycrystalline platinum electrode towards the oxidation/reduction of short-chain unsaturated aliphatic molecules such as acrylamide and acrylic acid was investigated in acidic solutions. Analytes were separated by reversed-phase liquid chromatography and quantified using pulsed amperometric detection. A new two-step waveform is introduced for the detection of acrylamide and acrylic acid. Detection limits (LODs) of 20 nM (1.4 microg/kg) and 45 nM (3.2 microg/kg) were determined in water solutions containing acrylamide and acrylic acid, respectively. Compared to the classical three-step waveform, the proposed two-step waveform shows favourable analytical performance in terms of LOD, linear range, precision and improved long-term reproducibility. The proposed analytical method, combined with a clean-up procedure accomplished with Carrez clearing reagent and subsequent extraction with strong cation-exchange cartridges (SPE), was successfully used for the quantification of low concentrations of acrylamide in foodstuffs such as coffee and potato fries.
Zhang, Mei; Zhang, Yong; Ren, Siqi; Zhang, Zunjian; Wang, Yongren; Song, Rui
2018-06-06
A method for monitoring l-asparagine (ASN) depletion in patients' serum using reversed-phase high-performance liquid chromatography with precolumn o-phthalaldehyde and ethanethiol (ET) derivatization is described. To improve the signal and stability of the analytes, several important factors, including the precipitant reagent, derivatization conditions, and detection wavelengths, were optimized. Recovery of the analytes from the biological matrix was highest when 4% sulfosalicylic acid (1:1, v/v) was used as the precipitant reagent. Optimal fluorescence detection parameters were determined to be λex = 340 nm and λem = 444 nm for maximal signal. The analyte signal was highest when the reagent ET and a borate buffer of pH 9.9 were used in the derivatization solution, and the corresponding derivative products were stable for up to 19 h. The validated method has been successfully applied to monitor ASN depletion and l-aspartic acid, l-glutamine, and l-glutamic acid levels in pediatric patients during l-asparaginase therapy.
Analytical Optimization of the Net Residual Dispersion in SPM-Limited Dispersion-Managed Systems
NASA Astrophysics Data System (ADS)
Xiao, Xiaosheng; Gao, Shiming; Tian, Yu; Yang, Changxi
2006-05-01
Dispersion management is an effective technique to suppress the nonlinear impairment in fiber transmission systems; it includes tuning the amounts of precompensation, residual dispersion per span (RDPS), and net residual dispersion (NRD) of the systems. For self-phase modulation (SPM)-limited systems, optimizing the NRD is necessary because it can greatly improve the system performance. In this paper, an analytical method is presented to optimize the NRD for SPM-limited dispersion-managed systems. The method is based on the correlation between the nonlinear impairment and the output pulse broadening of SPM-limited systems; therefore, dispersion-managed systems can be optimized by minimizing the output single-pulse broadening. A set of expressions is derived to calculate the output pulse broadening of the SPM-limited dispersion-managed system, from which the analytical result for the optimal NRD is obtained. Furthermore, with the expressions for pulse broadening, the dependence of the nonlinear impairment on the amounts of precompensation and RDPS can be conveniently revealed.
Peltonen, Leena
2018-06-16
The number of poorly soluble drug candidates is increasing, and this is also seen in the research interest in drug nanoparticles and (nano-)cocrystals; improved solubility is the most important application of these nanosystems. In order to confirm the functionality of these nanoparticles throughout their lifecycle (repeatability of the formulation processes, functional performance of the formed systems in a pre-determined way, and system stability), a thorough physicochemical understanding, aided by the necessary analytical techniques, is needed. Even very minor deviations in, for example, particle size or size distribution at the nanoscale can alter the product bioavailability, and the effect is even more dramatic with the smallest particle size fractions. Small particle size also sets special requirements for the analytical techniques. In this review the most important physicochemical properties of drug nanocrystals and nano-cocrystals are presented, and suitable analytical techniques, with their pros and cons, are described from a practical point of view. Copyright © 2018. Published by Elsevier B.V.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakajima, Yuya; Seino, Junji; Nakai, Hiromi, E-mail: nakai@waseda.jp
In this study, the analytical energy gradient for the spin-free infinite-order Douglas-Kroll-Hess (IODKH) method at the levels of Hartree-Fock (HF), density functional theory (DFT), and second-order Møller-Plesset perturbation theory (MP2) is developed. Furthermore, adopting the local unitary transformation (LUT) scheme for the IODKH method improves the efficiency in computation of the analytical energy gradient. Numerical assessments of the present gradient method are performed at the HF, DFT, and MP2 levels for IODKH with and without the LUT scheme. The accuracies are examined for diatomic molecules such as hydrogen halides, halogen dimers, coinage metal (Cu, Ag, and Au) halides, and coinage metal dimers, and for 20 metal complexes, including the fourth–sixth row transition metals. In addition, the efficiencies are investigated for one-, two-, and three-dimensional silver clusters. The numerical results confirm the accuracy and efficiency of the present method.
NASA Astrophysics Data System (ADS)
Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.
2016-07-01
The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first 2015 round of the Wepal proficiency test are presented. Only elements with radioactive isotopes of medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal based on Z-score distributions showed that most results had |Z-scores| ≤ 3.
Performance improvement of planar dielectric elastomer actuators by magnetic modulating mechanism
NASA Astrophysics Data System (ADS)
Zhao, Yun-Hua; Li, Wen-Bo; Zhang, Wen-Ming; Yan, Han; Peng, Zhi-Ke; Meng, Guang
2018-06-01
In this paper, a novel planar dielectric elastomer actuator (DEA) with a magnetic modulating mechanism is proposed. This design can provide a wider actuation range and larger output force, which are significant indicators for evaluating the performance of DEAs. The DEA features a compact and simple design, and an analytical model is developed to characterize its mechanical behavior. The result shows that the output force induced by the DEA can be improved by 76.90% under a certain applied voltage and initial magnet distance. Moreover, experiments are carried out to reveal the performance of the proposed DEA and validate the theoretical model. They demonstrate that the DEA using the magnetic modulating mechanism can enlarge the actuation range, and the effect becomes more remarkable as the initial magnet distance decreases within the stable range. These results can be useful in promoting the application of DEAs to soft robots and haptic feedback.
Improved scintillation detector performance via a method of enhanced layered coatings
Wakeford, Daniel Tyler; Tornga, Shawn Robert; Adams, Jillian Cathleen; ...
2016-11-16
Increasing demand for better detection performance with a simultaneous reduction in size, weight and power consumption has motivated the use of compact semiconductors as photo-converters for many gamma-ray and neutron scintillators. The spectral response of devices such as silicon avalanche photodiodes (APDs) is poorly matched to many common high-performance scintillators. We have developed a generalized analytical method that utilizes an optical reference database to match scintillator luminescence to the excitation spectrum of high quantum efficiency semiconductor detectors. This is accomplished by the fabrication and application of a series of high quantum yield, short fluorescence lifetime, wavelength-shifting coatings. Furthermore, we show here a 22% increase in photoelectron collection and a 10% improvement in energy resolution when applying a layered coating to an APD-coupled, cerium-doped, yttrium oxyorthosilicate (YSO:Ce) scintillator. Wavelength-shifted radioluminescence emission and rise time analysis are also discussed.
NASA Technical Reports Server (NTRS)
Kwatra, S. C.
1998-01-01
A large number of papers have been published attempting to give some analytical basis for the performance of Turbo-codes. It has been shown that performance improves with increased interleaver length. Also, procedures have been given to pick the best constituent recursive systematic convolutional codes (RSCCs). However, testing by computer simulation is still required to verify these results. This thesis begins by describing the encoding and decoding schemes used. Next, simulation results on several memory-4 RSCCs are shown. It is found that the best BER performance at low Eb/N0 is not given by the RSCCs that were found using the analytic techniques given so far. Next, results are given from simulations using a smaller-memory RSCC for one of the constituent encoders. Significant reduction in decoding complexity is obtained with minimal loss in performance. Simulation results are then given for a rate 1/3 Turbo-code, with the result that this code performed as well as a rate 1/2 Turbo-code as measured by the distance from their respective Shannon limits. Finally, the results of simulations where an inaccurate noise variance measurement was used are given. From this it was observed that Turbo-decoding is fairly stable with regard to noise variance measurement.
A probabilistic and multi-objective analysis of lexicase selection and ε-lexicase selection.
Cava, William La; Helmuth, Thomas; Spector, Lee; Moore, Jason H
2018-05-10
Lexicase selection is a parent selection method that considers training cases individually, rather than in aggregate, when performing parent selection. Whereas previous work has demonstrated the ability of lexicase selection to solve difficult problems in program synthesis and symbolic regression, the central goal of this paper is to develop the theoretical underpinnings that explain its performance. To this end, we derive an analytical formula that gives the expected probabilities of selection under lexicase selection, given a population and its behavior. In addition, we expand upon the relation of lexicase selection to many-objective optimization methods to describe the behavior of lexicase selection, which is to select individuals on the boundaries of Pareto fronts in high-dimensional space. We show analytically why lexicase selection performs more poorly for certain population and training case sizes, and why it performs more poorly in continuous error spaces. To address this last concern, we propose new variants of ε-lexicase selection, a method that modifies the pass condition in lexicase selection to allow near-elite individuals to pass cases, thereby improving selection performance with continuous errors. We show that ε-lexicase outperforms several diversity-maintenance strategies on a number of real-world and synthetic regression problems.
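The selection procedure described above can be sketched directly: cases are visited in random order, and with the ε variant an individual survives a case if its error is within ε of the best error among the current survivors. This is a minimal illustrative implementation; the population and error values below are made up.

```python
# Minimal sketch of epsilon-lexicase selection: filter the population
# case by case, keeping individuals within epsilon of the elite error.
import random

def epsilon_lexicase(population, errors, epsilon, rng=random):
    """Select one parent index; errors[i][c] is individual i's error
    on training case c."""
    survivors = list(range(len(population)))
    cases = list(range(len(errors[0])))
    rng.shuffle(cases)                      # cases considered in random order
    for c in cases:
        best = min(errors[i][c] for i in survivors)
        survivors = [i for i in survivors if errors[i][c] <= best + epsilon]
        if len(survivors) == 1:
            break
    return rng.choice(survivors)

# Individual 0 is elite on every case, so with a small epsilon it is
# always the one selected, regardless of case order.
pop = ["a", "b", "c"]
errs = [[0.0, 0.0, 0.0], [0.5, 0.0, 0.9], [0.4, 0.8, 0.2]]
```

Setting epsilon to 0 recovers plain lexicase selection; in continuous error spaces that strict pass condition tends to leave a single survivor after the first case, which is the failure mode the ε variants address.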
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gong, Zhenhuan; Boyuka, David; Zou, X
The size and scope of cutting-edge scientific simulations are growing much faster than the I/O and storage capabilities of their run-time environments. The growing gap is exacerbated by exploratory, data-intensive analytics, such as querying simulation data with multivariate, spatio-temporal constraints, which induces heterogeneous access patterns that stress the performance of the underlying storage system. Previous work addresses data layout and indexing techniques to improve query performance for a single access pattern, which is not sufficient for complex analytics jobs. We present PARLO, a parallel run-time layout optimization framework, to achieve multi-level data layout optimization for scientific applications at run-time before data is written to storage. The layout schemes optimize for heterogeneous access patterns with user-specified priorities. PARLO is integrated with ADIOS, a high-performance parallel I/O middleware for large-scale HPC applications, to achieve user-transparent, light-weight layout optimization for scientific datasets. It offers simple XML-based configuration for users to achieve flexible layout optimization without the need to modify or recompile application codes. Experiments show that PARLO improves performance by 2 to 26 times for queries with heterogeneous access patterns compared to state-of-the-art scientific database management systems. Compared to traditional post-processing approaches, its underlying run-time layout optimization achieves a 56% savings in processing time and a reduction in storage overhead of up to 50%. PARLO also exhibits a low run-time resource requirement, while limiting the performance impact on running applications to a reasonable level.
Laminar flow burner system with infrared heated spray chamber and condenser.
Hell, A; Ulrich, W F; Shifrin, N; Ramírez-Muñoz, J
1968-07-01
A laminar flow burner is described that provides several advantages in atomic absorption flame photometry. Included in its design is a heated spray chamber followed by a condensing system. This combination improves the concentration level of the analyte in the flame and keeps solvent concentration low. Therefore, sensitivities are significantly improved for most elements relative to cold-chamber burners. The burner also contains several safety features. These various design features are discussed in detail, and performance data are given on (a) signal size, (b) signal-to-noise ratio, (c) linearity, (d) working range, (e) precision, and (f) accuracy.
The X-ARAPUCA: an improvement of the ARAPUCA device
NASA Astrophysics Data System (ADS)
Machado, A. A.; Segreto, E.; Warner, D.; Fauth, A.; Gelli, B.; Máximo, R.; Pissolatti, A.; Paulucci, L.; Marinho, F.
2018-04-01
The ARAPUCA is a novel technology for the detection of liquid argon scintillation light, which has been proposed for the far detector of the Deep Underground Neutrino Experiment. The X-ARAPUCA is an improvement on the original ARAPUCA design, retaining the original concept of photon trapping inside a highly reflective box while using a wavelength-shifting slab inside the box to increase the probability of collecting trapped photons onto a silicon photomultiplier array. The X-ARAPUCA concept is presented and its performance is compared to that of a standard ARAPUCA by means of analytical calculations and Monte Carlo simulations.
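The kind of comparison described above can be illustrated with a toy Monte Carlo of photon trapping: a trapped photon either lands on the photosensor, is absorbed by a wall, or keeps bouncing. The reflectivity, per-bounce detection probability, and bounce limit below are made-up numbers, not the X-ARAPUCA's actual optics.

```python
# Illustrative toy Monte Carlo of photon trapping in a reflective box.
# All optical parameters are assumptions for the sketch.
import random

def collection_efficiency(reflectivity, p_detect, n_photons=100_000,
                          max_bounces=100, rng=None):
    """Estimate the fraction of trapped photons eventually detected."""
    rng = rng or random.Random()
    detected = 0
    for _ in range(n_photons):
        for _ in range(max_bounces):
            if rng.random() < p_detect:       # photon lands on the SiPM array
                detected += 1
                break
            if rng.random() > reflectivity:   # absorbed by the box walls
                break
    return detected / n_photons

# Analytically the expected efficiency is p / (1 - (1 - p) * R), about
# 0.725 for R = 0.98 and p = 0.05, which the estimator should approach.
eff = collection_efficiency(0.98, 0.05, rng=random.Random(42))
```

A design change that raises the per-cycle collection probability p, which is the role of the internal wavelength-shifting slab, feeds directly into this geometric series, which is why modest optical improvements compound into noticeably higher collection efficiency.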
Van Hooreweder, Brecht; Apers, Yanni; Lietaert, Karel; Kruth, Jean-Pierre
2017-01-01
This paper provides new insights into the fatigue properties of porous metallic biomaterials produced by additive manufacturing. Cylindrical porous samples with diamond unit cells were produced from Ti6Al4V powder using Selective Laser Melting (SLM). After measuring all morphological and quasi-static properties, compression-compression fatigue tests were performed to determine fatigue strength and to identify important fatigue-influencing factors. Next, post-SLM treatments were used to improve the fatigue life of these biomaterials by changing the microstructure and by reducing stress concentrators and surface roughness. In particular, the influence of stress relieving, hot isostatic pressing and chemical etching was studied. Analytical and numerical techniques were developed to calculate the maximum local tensile stress in the struts as a function of the strut diameter and load. With this method, the variability in relative density between samples was taken into account. The local stress in the struts was then used to quantify the exact influence of the applied post-SLM treatments on the fatigue life. A significant improvement of the fatigue life was achieved. Also, the post-SLM treatments, procedures and calculation methods can be applied to different types of porous metallic structures, and hence this paper provides useful tools for improving the fatigue performance of metallic biomaterials. Additive Manufacturing (AM) techniques such as Selective Laser Melting (SLM) are increasingly being used for producing customized porous metallic biomaterials. These biomaterials are regularly used for biomedical implants, and hence a long lifetime is required. In this paper, a set of post-built surface and heat treatments is presented that can be used to significantly improve the fatigue life of porous SLM-Ti6Al4V samples.
In addition, a novel and efficient analytical local stress method was developed to accurately quantify the influence of the post-built treatments on the fatigue life. Also numerical simulation techniques were used for validation. The developed methods and techniques can be applied to other types of porous biomaterials and hence provide new and useful tools for improving and predicting the fatigue life of porous biomaterials. Copyright © 2016 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
An Improved Method for Determination of Cyanide Content in Bitter Almond Oil.
Chen, Jia; Liu, Lei; Li, Mengjun; Yu, Xiuzhu; Zhang, Rui
2018-01-01
An improved colorimetric method for the determination of cyanide content in bitter almond oil was developed. The optimal determination parameters were as follows: volume ratio of hydrochloric acid to bitter almond oil (v/v), 1.5:1; holding time for hydrolysis, 120 min; and volume ratio of distillation solution to bitter almond oil (v/v), 8:1. Analytical results showed that the relative standard deviations (RSDs) of the determinations were less than 10%, which satisfies the test requirements. The colorimetric measurements and the results of high-performance liquid chromatography exhibited a significant correlation (R = 0.9888, SD = 0.2015). Therefore, the improved colorimetric method can be used to determine cyanide content in bitter almond oil.
42 CFR 493.807 - Condition: Reinstatement of laboratories performing nonwaived testing.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., subspecialties, analyte or test, or voluntarily withdraws its certification under CLIA for the failed specialty, subspecialty, or analyte, the laboratory must then demonstrate sustained satisfactory performance on two... reinstatement for certification and Medicare or Medicaid approval in that specialty, subspecialty, analyte or...
42 CFR 493.807 - Condition: Reinstatement of laboratories performing nonwaived testing.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., subspecialties, analyte or test, or voluntarily withdraws its certification under CLIA for the failed specialty, subspecialty, or analyte, the laboratory must then demonstrate sustained satisfactory performance on two... reinstatement for certification and Medicare or Medicaid approval in that specialty, subspecialty, analyte or...
42 CFR 493.807 - Condition: Reinstatement of laboratories performing nonwaived testing.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., subspecialties, analyte or test, or voluntarily withdraws its certification under CLIA for the failed specialty, subspecialty, or analyte, the laboratory must then demonstrate sustained satisfactory performance on two... reinstatement for certification and Medicare or Medicaid approval in that specialty, subspecialty, analyte or...
Esposito, Gabriella; Ruggiero, Raffaella; Savarese, Maria; Savarese, Giovanni; Tremolaterra, Maria Roberta; Salvatore, Francesco; Carsana, Antonella
2013-12-01
Neuromuscular disease is a broad term that encompasses many diseases that, either directly via an intrinsic muscle disorder or indirectly via a nerve disorder, impair muscle function. Here we report the experience of our group in the counselling and molecular prenatal diagnosis of three inherited neuromuscular diseases, i.e., Duchenne/Becker muscular dystrophy (DMD/BMD), myotonic dystrophy type 1 (DM1), and spinal muscular atrophy (SMA). We performed a total of 83 DMD/BMD, 15 DM1 and 54 SMA prenatal diagnoses using a combination of technologies for either direct or linkage diagnosis. We identified 16, 5 and 10 affected foetuses, respectively. The improvement of analytical procedures in recent years has increased the mutation detection rate and reduced the analysis time. Due to the complexity of the experimental procedures and the high, specific professional expertise required for both laboratory activities and the related counselling, these types of analyses should preferentially be performed in reference molecular diagnostic centres.
GANViz: A Visual Analytics Approach to Understand the Adversarial Game.
Wang, Junpeng; Gou, Liang; Yang, Hao; Shen, Han-Wei
2018-06-01
Generative models hold promise for learning data representations in an unsupervised fashion with deep learning. Generative Adversarial Nets (GAN) is one of the most popular frameworks in this arena. Despite the promising results from different types of GANs, in-depth understanding of the adversarial training process of the models remains a challenge for domain experts. The complexity and the potentially long training process of the models make them hard to evaluate, interpret, and optimize. In this work, guided by practical needs from domain experts, we design and develop a visual analytics system, GANViz, aiming to help experts understand the adversarial process of GANs in depth. Specifically, GANViz evaluates the model performance of the two subnetworks of GANs, provides evidence and interpretations of the models' performance, and empowers comparative analysis with that evidence. Through case studies with two real-world datasets, we demonstrate that GANViz can provide useful insight into helping domain experts understand, interpret, evaluate, and potentially improve GAN models.
Liang, Xiao-Ping; Liang, Qiong-Lin; Xia, Jian-Fei; Wang, Yong; Hu, Ping; Wang, Yi-Ming; Zheng, Xiao-Ying; Zhang, Ting; Luo, Guo-An
2009-06-15
Disturbances in maternal folate, homocysteine, and glutathione metabolism have been reported to be associated with neural tube defects (NTDs). However, the role played by specific components in the metabolic pathways leading to NTDs remains unclear. Thus, an analytical method for the simultaneous measurement of sixteen compounds involved in these three metabolic pathways by high performance liquid chromatography-tandem mass spectrometry was developed. The use of a hydrophilic chromatography column improved the separation of polar analytes, and the multiple-reaction monitoring (MRM) detection mode enhanced specificity and sensitivity, allowing simultaneous determination of three classes of metabolites that vary widely in polarity and content. The influence of parameters such as temperature, pH, and flow rate on the performance of the analytes was studied to obtain optimal conditions. The method was validated for linearity, accuracy, and precision, and was used for the analysis of serum samples from NTD-affected pregnancies and normal women. The results showed that the present method is sensitive and reliable for the simultaneous determination of as many as sixteen metabolites of interest, which may provide a new means to study the underlying mechanism of NTDs as well as to discover new potential biomarkers.
NASA Astrophysics Data System (ADS)
Larabi, Mohamed Aziz; Mutschler, Dimitri; Mojtabi, Abdelkader
2016-06-01
Our present work focuses on the coupling between thermal diffusion and convection in order to improve the thermal gravitational separation of mixture components. The separation phenomenon was studied in a porous medium contained in vertical columns. We performed analytical and numerical simulations to corroborate the experimental measurements of the thermal diffusion coefficients of the ternary mixture n-dodecane, isobutylbenzene, and tetralin obtained in microgravity on the International Space Station. Our approach corroborates the existing data published in the literature. We show that it is possible to quantify and optimize the species separation for ternary mixtures, and we verified, for ternary mixtures, the validity of the "forgotten effect" hypothesis established for binary mixtures by Furry, Jones, and Onsager. Two complete and distinct analytical resolution methods were used to describe the separation in terms of the Lewis numbers, the separation ratios, the cross-diffusion coefficients, and the Rayleigh number. The analytical model is based on the parallel flow approximation. To validate this model, a numerical simulation was performed using the finite element method. From our new approach to vertical separation columns, new relations for the mass fraction gradients and the optimal Rayleigh number were obtained for each component of the ternary mixture.
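For reference, the Rayleigh number governing convection in such a vertical porous column is conventionally defined (in the Darcy regime; the symbols here are the standard ones, not taken from the abstract) as:

```latex
Ra = \frac{g \, \beta \, \Delta T \, K \, H}{\nu \, a}
```

where \(g\) is gravitational acceleration, \(\beta\) the thermal expansion coefficient, \(\Delta T\) the temperature difference across the cell, \(K\) the permeability of the porous medium, \(H\) the column height, \(\nu\) the kinematic viscosity, and \(a\) the thermal diffusivity. The optimal separation reported above corresponds to a particular value of this dimensionless group.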
NASA Astrophysics Data System (ADS)
Mohan, C.
In this paper, I briefly survey some recent and emerging trends in hardware and software features that impact high-performance transaction processing and data analytics applications. These features include multicore processor chips, ultra-large main memories, flash storage, storage-class memories, database appliances, field-programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, system architects and designers are slowly beginning to address such previously ignored issues. The availability, analytics, and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase-change memory for transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance, and monitoring, and to improvement of total cost of ownership, have made database appliances very popular. The MapReduce paradigm is now quite popular for large-scale data analysis, in spite of the major inefficiencies associated with it.
Hadoop for High-Performance Climate Analytics: Use Cases and Lessons Learned
NASA Technical Reports Server (NTRS)
Tamkin, Glenn
2013-01-01
Scientific data services are a critical aspect of the mission of the NASA Center for Climate Simulation (NCCS). Hadoop, via MapReduce, provides an approach to high-performance analytics that is proving useful for data-intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. The NCCS is particularly interested in the potential of Hadoop to speed up basic operations common to a wide range of analyses. To evaluate this potential, we prototyped a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. The initial focus was on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. After preliminary results suggested that this approach improves efficiencies within data-intensive analytic workflows, we invested in building a cyberinfrastructure resource for developing a new generation of climate data analysis capabilities using Hadoop. This resource is focused on reducing the time spent preparing reanalysis data used in data-model intercomparison, a long-sought goal of the climate community. This paper summarizes the related use cases and lessons learned.
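The canonical averaging operation described above follows the usual MapReduce shape: a map step emits partial (value, count) pairs per key, and a reduce step combines them. The following is an illustrative pure-Python analogue, not the NCCS/MERRA implementation; the record format and monthly keying are invented for the sketch:

```python
from collections import defaultdict

def map_phase(records):
    """Emit (month, (value, 1)) pairs from (date, value) records,
    mimicking the map step of a temporal-averaging job."""
    for date, value in records:
        month = date[:7]  # "1981-07-14" -> "1981-07"
        yield month, (value, 1)

def reduce_phase(pairs):
    """Combine partial (sum, count) contributions per key, then divide
    to produce the per-key mean, as a MapReduce reducer would."""
    totals = defaultdict(lambda: [0.0, 0])
    for key, (v, n) in pairs:
        totals[key][0] += v
        totals[key][1] += n
    return {k: s / n for k, (s, n) in totals.items()}

# Toy daily "observations" spanning two months.
records = [("1981-07-01", 30.0), ("1981-07-02", 32.0),
           ("1981-08-01", 28.0), ("1981-08-02", 26.0)]
monthly_means = reduce_phase(map_phase(records))
print(monthly_means)  # {'1981-07': 31.0, '1981-08': 27.0}
```

Because the (value, 1) pairs combine associatively, the same reducer can run as a combiner on each node before the shuffle, which is where the distributed speedup comes from.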
Predicting Student Success using Analytics in Course Learning Management Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olama, Mohammed M; Thakur, Gautam; McNair, Wade
Educational data analytics is an emerging discipline concerned with developing methods for exploring the unique types of data that come from the educational context. For example, predicting college student performance is crucial for both the student and educational institutions. It can support timely intervention to prevent students from failing a course, increase the efficacy of advising functions, and improve course completion rates. In this paper, we present the efforts carried out at Oak Ridge National Laboratory (ORNL) toward conducting predictive analytics on academic data collected from 2009 through 2013 and available in one of the most commonly used learning management systems, called Moodle. First, we identified the data features useful for predicting student outcomes, such as students' scores in homework assignments, quizzes, and exams, in addition to their activity in discussion forums and their total GPA in the term they enrolled in the course. Then, Logistic Regression and Neural Network predictive models are used to identify, as early as possible, students who are in danger of failing the course they are currently enrolled in. These models compute the likelihood of any given student failing (or passing) the current course. Numerical results are presented to evaluate and compare the performance of the developed models and their predictive accuracy.
Predicting student success using analytics in course learning management systems
NASA Astrophysics Data System (ADS)
Olama, Mohammed M.; Thakur, Gautam; McNair, Allen W.; Sukumar, Sreenivas R.
2014-05-01
Educational data analytics is an emerging discipline concerned with developing methods for exploring the unique types of data that come from the educational context. For example, predicting college student performance is crucial for both the student and educational institutions. It can support timely intervention to prevent students from failing a course, increase the efficacy of advising functions, and improve course completion rates. In this paper, we present the efforts carried out at Oak Ridge National Laboratory (ORNL) toward conducting predictive analytics on academic data collected from 2009 through 2013 and available in one of the most commonly used learning management systems, called Moodle. First, we identified the data features useful for predicting student outcomes, such as students' scores in homework assignments, quizzes, and exams, in addition to their activity in discussion forums and their total GPA in the term they enrolled in the course. Then, Logistic Regression and Neural Network predictive models are used to identify, as early as possible, students who are in danger of failing the course they are currently enrolled in. These models compute the likelihood of any given student failing (or passing) the current course. Numerical results are presented to evaluate and compare the performance of the developed models and their predictive accuracy.
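A logistic regression model of the kind used above maps student features to a failure probability via a sigmoid over a weighted sum. The following is a minimal pure-Python sketch trained by gradient descent on the log-loss; the feature names and toy data are invented, not taken from the ORNL study:

```python
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Fit weights w and bias b by stochastic gradient descent
    on the logistic log-loss (labels: 1 = failed, 0 = passed)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted failure probability
            err = p - yi                     # gradient of log-loss w.r.t. z
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def fail_risk(w, b, x):
    """Likelihood that a student with features x fails the course."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical per-student features, scaled to [0, 1]:
# [mean quiz score, discussion-forum activity]
X = [[0.9, 0.8], [0.8, 0.6], [0.3, 0.2], [0.2, 0.1]]
y = [0, 0, 1, 1]  # 1 = failed
w, b = train_logistic(X, y)
at_risk = fail_risk(w, b, [0.25, 0.15])   # low scores, low activity
on_track = fail_risk(w, b, [0.85, 0.70])
print(at_risk > 0.5, on_track < 0.5)  # True True
```

Because the model outputs a probability rather than a hard label, an early-warning system can flag any student whose estimated failure likelihood crosses a chosen threshold, which is what makes timely intervention possible.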
Analytical Challenges in Biotechnology.
ERIC Educational Resources Information Center
Glajch, Joseph L.
1986-01-01
Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structure determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)
-Omic and Electronic Health Record Big Data Analytics for Precision Medicine.
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D
2017-02-01
Rapid advances in high-throughput technologies and the wide adoption of electronic health records (EHRs) have led to the fast accumulation of -omic and EHR data. These voluminous, complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, the associated challenges, and data analytics techniques including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies: identifying disease biomarkers from multi-omic data and incorporating -omic information into the EHR. Big data analytics can address -omic and EHR data challenges for a paradigm shift toward precision medicine; it makes sense of -omic and EHR data to improve healthcare outcomes and has a long-lasting societal impact.
ERIC Educational Resources Information Center
Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.
2016-01-01
This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…