Effects of a Format-based Second Language Teaching Method in Kindergarten.
ERIC Educational Resources Information Center
Uilenburg, Noelle; Plooij, Frans X.; de Glopper, Kees; Damhuis, Resi
2001-01-01
Focuses on second language teaching with a format-based method. The differences between the format-based teaching method and the standard approach, used as treatments in a quasi-experimental, non-equivalent control group design, are described in detail. Examines whether the effects of a format-based teaching method and a standard foreign language method differ…
A Comparative Study of Two Azimuth-Based Non-Standard Location Methods
2017-03-23
Jih, Rongsong (U.S. Department of State / Arms Control, Verification, and Compliance Bureau, 2201 C Street, NW, Washington). The so-called “Yin Zhong Xian” (“引中线” in Chinese) algorithm, hereafter the YZX method, is an Oriental version of the IPB-based procedure. It…
ERIC Educational Resources Information Center
Needham, Martha Elaine
2010-01-01
This research compares differences between standardized test scores in problem-based learning (PBL) classrooms and a traditional classroom for 6th grade students using a mixed-method, quasi-experimental and qualitative design. The research shows that problem-based learning is as effective as traditional teaching methods on standardized tests. The…
A new IRT-based standard setting method: application to eCat-listening.
García, Pablo Eduardo; Abad, Francisco José; Olea, Julio; Aguado, David
2013-01-01
Criterion-referenced interpretations of tests are highly necessary, but they usually involve the difficult task of establishing cut scores. In contrast with other Item Response Theory (IRT)-based standard setting methods, a non-judgmental approach is proposed in this study, in which Item Characteristic Curve (ICC) transformations lead to the final cut scores. eCat-Listening, a computerized adaptive test for the evaluation of English listening, was administered to 1,576 participants, and the proposed standard setting method was applied to classify them into the performance standards of the Common European Framework of Reference for Languages (CEFR). The results showed a classification closely related to relevant external measures of the English language domain, according to the CEFR. It is concluded that the proposed method is a practical and valid standard setting alternative for IRT-based test interpretations.
Methods for estimating flood frequency in Montana based on data through water year 1998
Parrett, Charles; Johnson, Dave R.
2004-01-01
Annual peak discharges having recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years (T-year floods) were determined for 660 gaged sites in Montana and in adjacent areas of Idaho, Wyoming, and Canada, based on data through water year 1998. The updated flood-frequency information was subsequently used in regression analyses, either ordinary or generalized least squares, to develop equations relating T-year floods to various basin and climatic characteristics, equations relating T-year floods to active-channel width, and equations relating T-year floods to bankfull width. The equations can be used to estimate flood frequency at ungaged sites. Montana was divided into eight regions, within which flood characteristics were considered to be reasonably homogeneous, and the three sets of regression equations were developed for each region. A measure of the overall reliability of the regression equations is the average standard error of prediction. The average standard errors of prediction for the equations based on basin and climatic characteristics ranged from 37.4 percent to 134.1 percent. Average standard errors of prediction for the equations based on active-channel width ranged from 57.2 percent to 141.3 percent. Average standard errors of prediction for the equations based on bankfull width ranged from 63.1 percent to 155.5 percent. In most regions, the equations based on basin and climatic characteristics generally had smaller average standard errors of prediction than equations based on active-channel or bankfull width. An exception was the Southeast Plains Region, where all equations based on active-channel width had smaller average standard errors of prediction than equations based on basin and climatic characteristics or bankfull width. Methods for weighting estimates derived from the basin- and climatic-characteristic equations and the channel-width equations also were developed. The weights were based on the cross correlation of residuals from the different methods and the average standard errors of prediction. When all three methods were combined, the average standard errors of prediction ranged from 37.4 percent to 120.2 percent. Weighting of estimates reduced the standard errors of prediction for all T-year flood estimates in four regions, reduced the standard errors of prediction for some T-year flood estimates in two regions, and provided no reduction in average standard error of prediction in two regions. A computer program for solving the regression equations, weighting estimates, and determining reliability of individual estimates was developed and placed on the USGS Montana District World Wide Web page. A new regression method, termed Region of Influence regression, also was tested. Test results indicated that the Region of Influence method was not as reliable as the regional equations based on generalized least squares regression. Two additional methods for estimating flood frequency at ungaged sites located on the same streams as gaged sites also are described. The first method, based on a drainage-area-ratio adjustment, is intended for use on streams where the ungaged site of interest is located near a gaged site. The second method, based on interpolation between gaged sites, is intended for use on streams that have two or more streamflow-gaging stations.
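As context for the weighting of estimates described above, a minimal sketch of minimum-variance weighting of two correlated flood estimates follows; the formula is the textbook combination of two correlated, unbiased estimators, and the standard errors and residual correlation are hypothetical inputs rather than the report's actual values.

```python
# Minimal sketch: minimum-variance weighting of two correlated flood estimates.
# The standard errors (se1, se2) and residual cross-correlation (rho) are
# hypothetical inputs; the weighting formula is the textbook result for
# combining two correlated, unbiased estimators, not the exact USGS procedure.

def weighted_estimate(q1, q2, se1, se2, rho):
    """Combine two T-year flood estimates q1 and q2 (same units)."""
    var1, var2 = se1 ** 2, se2 ** 2
    cov = rho * se1 * se2
    denom = var1 + var2 - 2.0 * cov
    w1 = (var2 - cov) / denom          # weight on the first estimate
    w2 = 1.0 - w1
    q = w1 * q1 + w2 * q2
    var_q = w1 ** 2 * var1 + w2 ** 2 * var2 + 2.0 * w1 * w2 * cov
    return q, var_q ** 0.5

if __name__ == "__main__":
    # hypothetical 100-year flood estimates from regional regression and channel-width equations
    print(weighted_estimate(q1=850.0, q2=1020.0, se1=0.40 * 850, se2=0.60 * 1020, rho=0.3))
```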
Improved lossless intra coding for H.264/MPEG-4 AVC.
Lee, Yung-Lyul; Han, Ki-Hun; Sullivan, Gary J
2006-09-01
A new lossless intra coding method based on sample-by-sample differential pulse code modulation (DPCM) is presented as an enhancement of the H.264/MPEG-4 AVC standard. The H.264/AVC design includes a multidirectional spatial prediction method to reduce spatial redundancy by using neighboring samples as a prediction for the samples in a block of data to be encoded. In the new lossless intra coding method, the spatial prediction is performed based on samplewise DPCM instead of in the block-based manner used in the current H.264/AVC standard, while the block structure is retained for the residual difference entropy coding process. We show that the new method, based on samplewise DPCM, does not have a major complexity penalty, despite its apparent pipeline dependencies. Experiments show that the new lossless intra coding method reduces the bit rate by approximately 12% in comparison with the lossless intra coding method previously included in the H.264/AVC standard. As a result, the new method is currently being adopted into the H.264/AVC standard in a new enhancement project.
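To illustrate the samplewise prediction idea, the following sketch applies horizontal-only, sample-by-sample DPCM to one row of pixels; this is a simplified stand-in, not the actual H.264/AVC intra prediction modes or entropy coding.

```python
# Minimal sketch: horizontal sample-by-sample DPCM for lossless coding of one row.
# Only the left-neighbor predictor is shown; the actual H.264/AVC enhancement
# applies samplewise prediction along the direction of the chosen intra mode.

def dpcm_encode_row(samples):
    """Return prediction residuals for a row of pixel samples (lossless)."""
    residuals = []
    prev = 128                      # assumed default predictor for the first sample
    for s in samples:
        residuals.append(s - prev)  # residual is entropy-coded in the real codec
        prev = s                    # lossless: reconstructed value equals original
    return residuals

def dpcm_decode_row(residuals):
    samples, prev = [], 128
    for r in residuals:
        prev = prev + r
        samples.append(prev)
    return samples

if __name__ == "__main__":
    row = [120, 122, 125, 125, 130, 131]
    res = dpcm_encode_row(row)
    assert dpcm_decode_row(res) == row
    print(res)
```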
Rathi, Monika; Ahrenkiel, S P; Carapella, J J; Wanlass, M W
2013-02-01
Given an unknown multicomponent alloy, and a set of standard compounds or alloys of known composition, can one improve upon popular standards-based methods for energy dispersive X-ray (EDX) spectrometry to quantify the elemental composition of the unknown specimen? A method is presented here for determining elemental composition of alloys using transmission electron microscopy-based EDX with appropriate standards. The method begins with a discrete set of related reference standards of known composition, applies multivariate statistical analysis to those spectra, and evaluates the compositions with a linear matrix algebra method to relate the spectra to elemental composition. By using associated standards, only limited assumptions about the physical origins of the EDX spectra are needed. Spectral absorption corrections can be performed by providing an estimate of the foil thickness of one or more reference standards. The technique was applied to III-V multicomponent alloy thin films: composition and foil thickness were determined for various III-V alloys. The results were then validated by comparing with X-ray diffraction and photoluminescence analysis, demonstrating accuracy of approximately 1% in atomic fraction.
ERIC Educational Resources Information Center
Wang, Tianyou
2009-01-01
Holland and colleagues used the delta method to derive a formula for the analytical standard error of equating for the kernel equating method. Extending their derivation, this article derives an analytical standard error of equating procedure for conventional percentile-rank-based equipercentile equating with log-linear smoothing. This procedure is…
Use of focused ultrasonication in activity-based profiling of deubiquitinating enzymes in tissue.
Nanduri, Bindu; Shack, Leslie A; Rai, Aswathy N; Epperson, William B; Baumgartner, Wes; Schmidt, Ty B; Edelmann, Mariola J
2016-12-15
To develop a reproducible tissue lysis method that retains enzyme function for activity-based protein profiling, we compared four different methods to obtain protein extracts from bovine lung tissue: focused ultrasonication, standard sonication, mortar & pestle method, and homogenization combined with standard sonication. Focused ultrasonication and mortar & pestle methods were sufficiently effective for activity-based profiling of deubiquitinases in tissue, and focused ultrasonication also had the fastest processing time. We used focused-ultrasonicator for subsequent activity-based proteomic analysis of deubiquitinases to test the compatibility of this method in sample preparation for activity-based chemical proteomics. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Woodruff, David; Traynor, Anne; Cui, Zhongmin; Fang, Yu
2013-01-01
Professional standards for educational testing recommend that both the overall standard error of measurement and the conditional standard error of measurement (CSEM) be computed on the score scale used to report scores to examinees. Several methods have been developed to compute scale score CSEMs. This paper compares three methods, based on…
The Objective Borderline Method: A Probabilistic Method for Standard Setting
ERIC Educational Resources Information Center
Shulruf, Boaz; Poole, Phillippa; Jones, Philip; Wilkinson, Tim
2015-01-01
A new probability-based standard setting technique, the Objective Borderline Method (OBM), was introduced recently. This was based on a mathematical model of how test scores relate to student ability. The present study refined the model and tested it using 2500 simulated data-sets. The OBM was feasible to use. On average, the OBM performed well…
A Learner-Centered Grading Method Focused on Reaching Proficiency with Course Learning Outcomes
ERIC Educational Resources Information Center
Toledo, Santiago; Dubas, Justin M.
2017-01-01
Getting students to use grading feedback as a tool for learning is a continual challenge for educators. This work proposes a method for evaluating student performance that provides feedback to students based on standards of learning dictated by clearly delineated course learning outcomes. This method combines elements of standards-based grading…
Characterization of Used MIL-L-7808 Lubricants.
1985-05-01
Excerpts: contents include Gravimetric Finish; Separation of Mineral Oil by means of Sulphuric Acid; Rolls-Royce Method 1032; application of the sulphuric acid method for MOC to standards based upon ATL.9100 and ATL.9101; and total acid numbers of the standards before and after application of each technique, considering firstly the volatile contaminant contents obtained by STM…
Alignment of Standards and Assessment: A Theoretical and Empirical Study of Methods for Alignment
ERIC Educational Resources Information Center
Nasstrom, Gunilla; Henriksson, Widar
2008-01-01
Introduction: In a standards-based school-system alignment of policy documents with standards and assessment is important. To be able to evaluate whether schools and students have reached the standards, the assessment should focus on the standards. Different models and methods can be used for measuring alignment, i.e. the correspondence between…
Lin, Long-Ze; Harnly, James M
2008-11-12
A screening method using LC-DAD-ESI/MS was developed for the identification of common hydroxycinnamoylquinic acids based on direct comparison with standards. A complete standard set for mono-, di-, and tricaffeoylquinic isomers was assembled from commercially available standards, positively identified compounds in common plants (artichokes, asparagus, coffee bean, honeysuckle flowers, sweet potato, and Vernonia amygdalina leaves) and chemically modified standards. Four C18 reversed phase columns were tested using the standardized profiling method (based on LC-DAD-ESI/MS) for 30 phenolic compounds, and their elution order and retention times were evaluated. Using only two columns under standardized LC condition and the collected phenolic compound database, it was possible to separate all of the hydroxycinnamoylquinic acid conjugates and to identify 28 and 18 hydroxycinnamoylquinic acids in arnica flowers (Arnica montana L.) and burdock roots (Arctium lappa L.), respectively. Of these, 22 are reported for the first time.
Collective Leadership Measurement for the U.S. Army
2014-03-01
Excerpts: the methods employed were adapted from standard texts on survey research methods (e.g., Podsakoff, MacKenzie, Lee, & Podsakoff, 2003; Shadish, Cook…); interview procedures for members of the research team followed standard research methods texts (e.g., Campion, Palmer, & Campion, 1997; Latham…); the critical incident protocol was based on procedures for critical incidents in standard research methods texts (e.g., Flanagan, 1954; Lowenberg, 1979…)
Comparison of Web-Based and Face-to-Face Standard Setting Using the Angoff Method
ERIC Educational Resources Information Center
Katz, Irvin R.; Tannenbaum, Richard J.
2014-01-01
Web-based standard setting holds promise for reducing the travel and logistical inconveniences of traditional, face-to-face standard setting meetings. However, because there are few published reports of setting standards via remote meeting technology, little is known about the practical potential of the approach, including technical feasibility of…
Trofimov, Vyacheslav A.; Varentsova, Svetlana A.
2016-01-01
Low efficiency of the standard THz TDS method for the detection and identification of substances, based on a comparison of the spectrum of the signal under investigation with a standard signal spectrum, is demonstrated using physical experiments conducted under real conditions with a thick paper bag as well as with Si-based semiconductors under laboratory conditions. In fact, standard THz spectroscopy leads to false detection of hazardous substances in neutral samples, which do not contain them. This disadvantage of the THz TDS method can be overcome by using time-dependent THz pulse spectrum analysis. To assess whether the spectral features of a standard substance are present in the signal under analysis, time-dependent integral correlation criteria may be used. PMID:27070617
A Comparison of Web-Based Standard Setting and Monitored Standard Setting.
ERIC Educational Resources Information Center
Harvey, Anne L.; Way, Walter D.
Standard setting, when carefully done, can be an expensive and time-consuming process. The modified Angoff method and the benchmark method, as utilized in this study, employ representative panels of judges to provide recommended passing scores to standard setting decision-makers. It has been considered preferable to have the judges meet in a…
Dong, Ren G; Sinsel, Erik W; Welcome, Daniel E; Warren, Christopher; Xu, Xueyan S; McDowell, Thomas W; Wu, John Z
2015-09-01
The hand coordinate systems for measuring vibration exposures and biodynamic responses have been standardized, but they are not actually used in many studies. This contradicts the purpose of the standardization. The objectives of this study were to identify the major sources of this problem, and to help define or identify better coordinate systems for the standardization. This study systematically reviewed the principles and definition methods, and evaluated typical hand coordinate systems. This study confirms that, as accelerometers remain the major technology for vibration measurement, it is reasonable to standardize two types of coordinate systems: a tool-based basicentric (BC) system and an anatomically based biodynamic (BD) system. However, these coordinate systems are not well defined in the current standard. Definition of the standard BC system is confusing, and it can be interpreted differently; as a result, it has been inconsistently applied in various standards and studies. The standard hand BD system is defined using the orientation of the third metacarpal bone. It is neither convenient nor defined based on important biological or biodynamic features. This explains why it is rarely used in practice. To resolve these inconsistencies and deficiencies, we proposed a revised method for defining the realistic handle BC system and an alternative method for defining the hand BD system. A fingertip-based BD system for measuring the principal grip force is also proposed based on an important feature of the grip force confirmed in this study.
Valente, Marta Sofia; Pedro, Paulo; Alonso, M Carmen; Borrego, Juan J; Dionísio, Lídia
2010-03-01
Monitoring the microbiological quality of water used for recreational activities is very important to human public health. Although the sanitary quality of recreational marine waters could be evaluated by standard methods, they are time-consuming and need confirmation. For these reasons, faster and more sensitive methods, such as the defined substrate-based technology, have been developed. In the present work, we have compared the standard method of membrane filtration using Tergitol-TTC agar for total coliforms and Escherichia coli, and Slanetz and Bartley agar for enterococci, and the IDEXX defined substrate technology for these faecal pollution indicators to determine the microbiological quality of natural recreational waters. ISO 17994:2004 standard was used to compare these methods. The IDEXX for total coliforms and E. coli, Colilert, showed higher values than those obtained by the standard method. Enterolert test, for the enumeration of enterococci, showed lower values when compared with the standard method. It may be concluded that more studies to evaluate the precision and accuracy of the rapid tests are required in order to apply them for routine monitoring of marine and freshwater recreational bathing areas. The main advantages of these methods are that they are more specific, feasible and simpler than the standard methodology.
ERIC Educational Resources Information Center
Coester, Lee Anne
2010-01-01
This study was designed to gather input from early career elementary teachers with the goal of finding ways to improve elementary mathematics methods courses. Multiple areas were explored including the degree to which respondents' elementary mathematics methods course focused on the NCTM Process Standards, the teachers' current standards-based…
Kim, Joo-Hwan; Kim, Jin Ho; Wang, Pengbin; Park, Bum Soo; Han, Myung-Soo
2016-01-01
The identification and quantification of Heterosigma akashiwo cysts in sediments by light microscopy can be difficult due to the small size and morphology of the cysts, which are often indistinguishable from those of other types of algae. Quantitative real-time PCR (qPCR) based assays represent a potentially efficient method for quantifying the abundance of H. akashiwo cysts, although standard curves must be based on cyst DNA rather than on vegetative cell DNA due to differences in gene copy number and DNA extraction yield between these two cell types. Furthermore, qPCR on sediment samples can be complicated by the presence of extracellular DNA debris. To solve these problems, we constructed a cyst-based standard curve and developed a simple method for removing DNA debris from sediment samples. This cyst-based standard curve was compared with a standard curve based on vegetative cells, as vegetative cells may have twice the gene copy number of cysts. To remove DNA debris from the sediment, we developed a simple method involving dilution with distilled water and heating at 75°C. A total of 18 sediment samples were used to evaluate this method. Cyst abundance determined using the qPCR assay without DNA debris removal yielded results up to 51-fold greater than with direct counting. By contrast, a highly significant correlation was observed between cyst abundance determined by direct counting and the qPCR assay in conjunction with DNA debris removal (r2 = 0.72, slope = 1.07, p < 0.001). Therefore, this improved qPCR method should be a powerful tool for the accurate quantification of H. akashiwo cysts in sediment samples.
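The cyst-based standard curve amounts to a linear fit of quantification cycle (Cq) against log10 cyst number, inverted to estimate cyst abundance in an unknown; a minimal sketch with hypothetical numbers follows (the paper's qPCR chemistry and DNA-debris removal step are not modeled).

```python
# Minimal sketch: build a cyst-based qPCR standard curve (Cq vs. log10 cysts)
# and invert it to quantify cysts in a sediment extract. Values are hypothetical.
import math

def fit_standard_curve(cyst_counts, cq_values):
    """Least-squares fit Cq = slope * log10(cysts) + intercept."""
    x = [math.log10(c) for c in cyst_counts]
    n = len(x)
    mx, my = sum(x) / n, sum(cq_values) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, cq_values))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

def cysts_from_cq(cq, slope, intercept):
    return 10 ** ((cq - intercept) / slope)

if __name__ == "__main__":
    # serial dilution of counted cysts vs. measured Cq (hypothetical data)
    slope, intercept = fit_standard_curve([10, 100, 1000, 10000],
                                          [33.1, 29.7, 26.3, 22.9])
    print(round(cysts_from_cq(28.0, slope, intercept)))
```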
Kwon, Deukwoo; Reis, Isildinha M
2015-08-12
When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC performs well. However, the Wan et al. method is best for estimating standard deviation under normal distribution. In the estimation of the mean, our ABC method is best regardless of assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95 % credible interval when Bayesian analysis has been employed.
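A minimal sketch of the ABC idea follows: draw candidate (mean, SD) pairs from simple priors, simulate samples, and keep the draws whose simulated median, minimum, and maximum are closest to the reported values. The flat priors, normal data model, and keep-the-closest-draws rule are illustrative assumptions, not the settings of the published method.

```python
# Minimal sketch of ABC for estimating a study's mean and SD from a reported
# median, minimum and maximum. Priors, the normal data model, and "keep the
# closest 1% of draws" are illustrative assumptions, not the published settings.
import random
import statistics

def abc_mean_sd(median_obs, min_obs, max_obs, n, draws=20000, keep=200):
    trials = []
    for _ in range(draws):
        mu = random.uniform(min_obs, max_obs)             # flat prior on the mean
        sigma = random.uniform(1e-6, max_obs - min_obs)   # flat prior on the SD
        sample = [random.gauss(mu, sigma) for _ in range(n)]
        sim = (statistics.median(sample), min(sample), max(sample))
        obs = (median_obs, min_obs, max_obs)
        dist = sum((s - o) ** 2 for s, o in zip(sim, obs))
        trials.append((dist, mu, sigma))
    trials.sort()                                         # keep the closest draws
    best = trials[:keep]
    return (statistics.mean(m for _, m, _ in best),
            statistics.mean(s for _, _, s in best))

if __name__ == "__main__":
    print(abc_mean_sd(median_obs=17.5, min_obs=5.0, max_obs=30.0, n=40))
```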
SU-E-T-226: Correction of a Standard Model-Based Dose Calculator Using Measurement Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, M; Jiang, S; Lu, W
Purpose: To propose a hybrid method that combines the advantages of model-based and measurement-based methods for independent dose calculation. Model-based dose calculation, such as collapsed-cone convolution/superposition (CCCS) or the Monte Carlo method, models dose deposition in the patient body accurately; however, due to lack of detailed knowledge about the linear accelerator (LINAC) head, commissioning for an arbitrary machine is tedious and challenging in case of hardware changes. On the contrary, the measurement-based method characterizes the beam properties accurately but lacks the capability of modeling dose deposition in heterogeneous media. Methods: We used a standard CCCS calculator, commissioned with published data, as the standard model calculator. For a given machine, water phantom measurements were acquired. A set of dose distributions was also calculated using the CCCS for the same setup. The differences between the measurements and the CCCS results were tabulated and used as the commissioning data for a measurement-based calculator; here we used a direct-ray-tracing calculator (ΔDRT). The proposed independent dose calculation consists of the following steps: (1) calculate D_model using CCCS; (2) calculate D_ΔDRT using ΔDRT; (3) combine D = D_model + D_ΔDRT. Results: The hybrid dose calculation was tested on digital phantoms and patient CT data for standard fields and an IMRT plan. The results were compared to doses calculated by the treatment planning system (TPS). The agreement between the hybrid method and the TPS was within 3%, 3 mm for over 98% of the volume for phantom studies and lung patients. Conclusion: The proposed hybrid method uses the same commissioning data as the measurement-based method and can be easily extended to any non-standard LINAC. The results met the accuracy, independence, and simple-commissioning criteria for an independent dose calculator.
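A minimal sketch of the final combination step follows; the dose grids are placeholder arrays, with D_model standing in for the CCCS result and D_ΔDRT for the ray-tracing correction commissioned on the tabulated measurement-minus-CCCS differences.

```python
# Minimal sketch of the hybrid combination D = D_model + D_deltaDRT on a dose grid.
# The dose arrays are placeholders; in practice D_model comes from a CCCS engine
# and D_deltaDRT from a ray-tracing calculator commissioned on the tabulated
# (measurement - CCCS) differences for the specific machine.

def hybrid_dose(d_model, d_delta_drt):
    """Combine model-based dose and measurement-based correction, point by point."""
    return [[m + c for m, c in zip(row_m, row_c)]
            for row_m, row_c in zip(d_model, d_delta_drt)]

if __name__ == "__main__":
    d_model = [[1.00, 0.95], [0.90, 0.85]]       # Gy, hypothetical CCCS result
    d_delta = [[0.02, 0.01], [-0.01, 0.00]]      # Gy, hypothetical DRT correction
    print(hybrid_dose(d_model, d_delta))
```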
Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Rzepa, Henry S
2015-01-01
We describe three different procedures based on metadata standards for enabling automated retrieval of scientific data from digital repositories utilising the persistent identifier of the dataset with optional specification of the attributes of the data document such as filename or media type. The procedures are demonstrated using the JSmol molecular visualizer as a component of a web page and Avogadro as a stand-alone modelling program. We compare our methods for automated retrieval of data from a standards-compliant data repository with those currently in operation for a selection of existing molecular databases and repositories. Our methods illustrate the importance of adopting a standards-based approach of using metadata declarations to increase access to and discoverability of repository-based data.
B-spline based image tracking by detection
NASA Astrophysics Data System (ADS)
Balaji, Bhashyam; Sithiravel, Rajiv; Damini, Anthony; Kirubarajan, Thiagalingam; Rajan, Sreeraman
2016-05-01
Visual image tracking involves the estimation of the motion of desired targets in a surveillance region using a sequence of images. A standard method of isolating moving targets in image tracking uses background subtraction. The standard background subtraction method is often impacted by irrelevant information in the images, which can lead to poor performance in image-based target tracking. In this paper, a B-spline-based image tracking method is implemented. The novel method models the background and foreground using the B-spline method, followed by a tracking-by-detection algorithm. The effectiveness of the proposed algorithm is demonstrated.
Comparison of EPA Method 1615 RT-qPCR Assays in Standard and Kit Format
EPA Method 1615 contains protocols for measuring enterovirus and norovirus by reverse transcription quantitative polymerase chain reaction. A commercial kit based upon these protocols was designed and compared to the method's standard approach. Reagent grade, secondary effluent, ...
NASA Astrophysics Data System (ADS)
Dong, Min; Dong, Chenghui; Guo, Miao; Wang, Zhe; Mu, Xiaomin
2018-04-01
Multiresolution-based methods, such as the wavelet and contourlet transforms, are widely used for image fusion. This work presents a new image fusion framework that uses the area-based standard deviation in the dual-tree contourlet transform domain. First, the pre-registered source images are decomposed with the dual-tree contourlet transform to obtain low-pass and high-pass coefficients. The low-pass bands are then fused with a weighted average based on the area standard deviation rather than the simple "averaging" rule, while the high-pass bands are merged with the "max-absolute" fusion rule. Finally, the modified low-pass and high-pass coefficients are used to reconstruct the final fused image. The major advantage of the proposed fusion method over conventional fusion is the approximate shift invariance and multidirectional selectivity of the dual-tree contourlet transform. The proposed method is compared with wavelet- and contourlet-based methods and other state-of-the-art methods on commonly used multi-focus images. Experiments demonstrate that the proposed fusion framework is feasible and effective, and that it performs better in both subjective and objective evaluation.
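The two fusion rules can be sketched independently of the transform: low-pass coefficients are averaged with weights proportional to a local (area-based) standard deviation, and high-pass coefficients are taken from whichever source has the larger magnitude. In the sketch below, the 3x3 window and the use of plain arrays instead of dual-tree contourlet subbands are simplifying assumptions.

```python
# Minimal sketch of the two fusion rules on coefficient arrays, using NumPy.
# The 3x3 neighbourhood for the "area standard deviation" weight and the use of
# raw arrays (instead of dual-tree contourlet subbands) are simplifications.
import numpy as np

def local_std(a, radius=1):
    """Standard deviation in a (2*radius+1)^2 window around each coefficient."""
    padded = np.pad(a, radius, mode="reflect")
    windows = np.lib.stride_tricks.sliding_window_view(
        padded, (2 * radius + 1, 2 * radius + 1))
    return windows.std(axis=(-2, -1))

def fuse_lowpass(a, b):
    """Weighted average, weights proportional to area-based standard deviation."""
    wa, wb = local_std(a), local_std(b)
    total = wa + wb + 1e-12
    return (wa * a + wb * b) / total

def fuse_highpass(a, b):
    """Max-absolute rule: keep the coefficient with the larger magnitude."""
    return np.where(np.abs(a) >= np.abs(b), a, b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    low_a, low_b = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))
    high_a, high_b = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))
    print(fuse_lowpass(low_a, low_b).shape, fuse_highpass(high_a, high_b).shape)
```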
Hacke, Uwe G; Venturas, Martin D; MacKinnon, Evan D; Jacobsen, Anna L; Sperry, John S; Pratt, R Brandon
2015-01-01
The standard centrifuge method has been frequently used to measure vulnerability to xylem cavitation. This method has recently been questioned. It was hypothesized that open vessels lead to exponential vulnerability curves, which were thought to be indicative of measurement artifact. We tested this hypothesis in stems of olive (Olea europea) because its long vessels were recently claimed to produce a centrifuge artifact. We evaluated three predictions that followed from the open vessel artifact hypothesis: shorter stems, with more open vessels, would be more vulnerable than longer stems; standard centrifuge-based curves would be more vulnerable than dehydration-based curves; and open vessels would cause an exponential shape of centrifuge-based curves. Experimental evidence did not support these predictions. Centrifuge curves did not vary when the proportion of open vessels was altered. Centrifuge and dehydration curves were similar. At highly negative xylem pressure, centrifuge-based curves slightly overestimated vulnerability compared to the dehydration curve. This divergence was eliminated by centrifuging each stem only once. The standard centrifuge method produced accurate curves of samples containing open vessels, supporting the validity of this technique and confirming its utility in understanding plant hydraulics. Seven recommendations for avoiding artefacts and standardizing vulnerability curve methodology are provided. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.
Critical levels as applied to ozone for North American forests
Robert C. Musselman
2006-01-01
The United States and Canada have used concentration-based parameters for air quality standards for ozone effects on forests in North America. The European critical levels method for air quality standards uses an exposure-based parameter, a cumulative ozone concentration index with a threshold cutoff value. The critical levels method has not been used in North America...
ERIC Educational Resources Information Center
Kimball, Steven M.; Milanowski, Anthony
2009-01-01
Purpose: The article reports on a study of school leader decision making that examined variation in the validity of teacher evaluation ratings in a school district that has implemented a standards-based teacher evaluation system. Research Methods: Applying mixed methods, the study used teacher evaluation ratings and value-added student achievement…
New methods of MR image intensity standardization via generalized scale
NASA Astrophysics Data System (ADS)
Madabhushi, Anant; Udupa, Jayaram K.
2005-04-01
Image intensity standardization is a post-acquisition processing operation designed for correcting acquisition-to-acquisition signal intensity variations (non-standardness) inherent in Magnetic Resonance (MR) images. While existing standardization methods based on histogram landmarks have been shown to produce a significant gain in the similarity of resulting image intensities, their weakness is that, in some instances the same histogram-based landmark may represent one tissue, while in other cases it may represent different tissues. This is often true for diseased or abnormal patient studies in which significant changes in the image intensity characteristics may occur. In an attempt to overcome this problem, in this paper, we present two new intensity standardization methods based on the concept of generalized scale. In reference 1 we introduced the concept of generalized scale (g-scale) to overcome the shape, topological, and anisotropic constraints imposed by other local morphometric scale models. Roughly speaking, the g-scale of a voxel in a scene was defined as the largest set of voxels connected to the voxel that satisfy some homogeneity criterion. We subsequently formulated a variant of the generalized scale notion, referred to as generalized ball scale (gB-scale), which, in addition to having the advantages of g-scale, also has superior noise resistance properties. These scale concepts are utilized in this paper to accurately determine principal tissue regions within MR images, and landmarks derived from these regions are used to perform intensity standardization. The new methods were qualitatively and quantitatively evaluated on a total of 67 clinical 3D MR images corresponding to four different protocols and to normal, Multiple Sclerosis (MS), and brain tumor patient studies. The generalized scale-based methods were found to be better than the existing methods, with a significant improvement observed for severely diseased and abnormal patient studies.
Comparison of five methods for the estimation of methane production from vented in vitro systems.
Alvarez Hess, P S; Eckard, R J; Jacobs, J L; Hannah, M C; Moate, P J
2018-05-23
There are several methods for estimating methane production (MP) from feedstuffs in vented in vitro systems. One method (A; "gold standard") measures methane proportions in the incubation bottle's head space (HS) and in the vented gas collected in gas bags. Four other methods (B, C, D and E) measure methane proportion in a single gas sample from HS. Method B assumes the same methane proportion in the vented gas as in HS, method C assumes constant methane to carbon dioxide ratio, method D has been developed based on empirical data and method E assumes constant individual venting volumes. This study aimed to compare the MP predictions from these methods to that of the gold standard method under different incubation scenarios, to validate these methods based on their concordance with a gold standard method. Methods C, D and E had greater concordance (0.85, 0.88 and 0.81), lower root mean square error (RMSE) (0.80, 0.72 and 0.85) and lower mean bias (0.20, 0.35, -0.35) with the gold standard than did method B (concordance 0.67, RMSE 1.49 and mean bias 1.26). Methods D and E were simpler to perform than method C and method D was slightly more accurate than method E. Based on precision, accuracy and simplicity of implementation, it is recommended that, when method A cannot be used, methods D and E are preferred to estimate MP from vented in vitro systems. This article is protected by copyright. All rights reserved.
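The agreement statistics used to rank the methods can be computed directly; a minimal sketch of Lin's concordance correlation coefficient, RMSE, and mean bias for predicted versus gold-standard methane values follows, with made-up example numbers.

```python
# Minimal sketch: agreement statistics (Lin's concordance correlation, RMSE,
# mean bias) between a candidate method's MP estimates and the gold standard.
# The example values are made up; population (biased) variances are used in CCC.
import math

def agreement(gold, pred):
    n = len(gold)
    mg, mp_ = sum(gold) / n, sum(pred) / n
    vg = sum((g - mg) ** 2 for g in gold) / n
    vp = sum((p - mp_) ** 2 for p in pred) / n
    cov = sum((g - mg) * (p - mp_) for g, p in zip(gold, pred)) / n
    ccc = 2 * cov / (vg + vp + (mg - mp_) ** 2)      # Lin's concordance
    rmse = math.sqrt(sum((p - g) ** 2 for g, p in zip(gold, pred)) / n)
    bias = mp_ - mg                                   # mean bias (pred - gold)
    return ccc, rmse, bias

if __name__ == "__main__":
    gold = [18.2, 22.5, 25.1, 30.4, 27.8]   # mL CH4 / g DM, hypothetical
    pred = [19.0, 23.9, 24.2, 31.8, 29.1]
    print(agreement(gold, pred))
```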
Combining the Best of Two Standard Setting Methods: The Ordered Item Booklet Angoff
ERIC Educational Resources Information Center
Smith, Russell W.; Davis-Becker, Susan L.; O'Leary, Lisa S.
2014-01-01
This article describes a hybrid standard setting method that combines characteristics of the Angoff (1971) and Bookmark (Mitzel, Lewis, Patz & Green, 2001) methods. The proposed approach utilizes strengths of each method while addressing weaknesses. An ordered item booklet, with items sorted based on item difficulty, is used in combination…
CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.
Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng
2017-01-01
Standardized terminology is a prerequisite for data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidence model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping results and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.
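One way to picture the crowdsourced aggregation step is a per-code confidence score: each organization's local-term-to-ICD-10 mapping counts as a vote, and the spread of votes determines how confident the aggregate mapping is. The vote-share score below is an illustrative stand-in for the paper's confidence model.

```python
# Minimal sketch: aggregate local-term -> ICD-10 mappings contributed by several
# organizations and score each candidate code by its vote share. The vote-share
# confidence score is an illustrative stand-in for the paper's confidence model.
from collections import Counter, defaultdict

def aggregate_mappings(contributions):
    """contributions: list of (local_term, icd10_code) pairs from all sites."""
    votes = defaultdict(Counter)
    for term, code in contributions:
        votes[term][code] += 1
    result = {}
    for term, counter in votes.items():
        code, count = counter.most_common(1)[0]
        result[term] = {"code": code, "confidence": count / sum(counter.values())}
    return result

if __name__ == "__main__":
    contributions = [
        ("AMI", "I21.9"), ("AMI", "I21.9"), ("AMI", "I25.2"),        # hypothetical
        ("chest pain NOS", "R07.9"), ("chest pain NOS", "R07.9"),
    ]
    print(aggregate_mappings(contributions))
```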
Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han
2014-01-01
Objectives Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Methods Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Community Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. Results In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and extended, CCR+, model. Conclusions A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models. PMID:24627817
ERIC Educational Resources Information Center
Battistone, William A., Jr.
2017-01-01
Problem: There is an existing cycle of questionable grading practices at the K-12 level. As a result, districts continue to search for innovative methods of evaluating and reporting student progress. One result of this effort has been the adoption of a standards-based grading approach. Research concerning standards-based grading implementation has…
Heinrich, Andreas; Teichgräber, Ulf K; Güttler, Felix V
2015-12-01
The standard ASTM F2119 describes a test method for measuring the size of a susceptibility artifact based on the example of a passive implant. A pixel in an image is considered to be a part of an image artifact if the intensity is changed by at least 30% in the presence of a test object, compared to a reference image in which the test object is absent (reference value). The aim of this paper is to simplify and accelerate the test method using a histogram-based reference value. Four test objects were scanned parallel and perpendicular to the main magnetic field, and the largest susceptibility artifacts were measured using two methods of reference value determination (reference image-based and histogram-based reference value). The results between both methods were compared using the Mann-Whitney U-test. The difference between both reference values was 42.35 ± 23.66. The difference of artifact size was 0.64 ± 0.69 mm. The artifact sizes of both methods did not show significant differences; the p-value of the Mann-Whitney U-test was between 0.710 and 0.521. A standard-conform method for a rapid, objective, and reproducible evaluation of susceptibility artifacts could be implemented. The result of the histogram-based method does not significantly differ from the ASTM-conform method.
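The 30% criterion itself is straightforward to express in code: a pixel belongs to the artifact if its intensity differs from the reference by at least 30%. The sketch below accepts either a full reference image or a single histogram-derived scalar; how that scalar is obtained from the histogram is the paper's contribution and is not reproduced here.

```python
# Minimal sketch of the ASTM F2119 artifact criterion: a pixel is part of the
# susceptibility artifact if its intensity changes by >= 30% relative to the
# reference. The reference may be a full reference image (standard approach)
# or a single histogram-based scalar (the simplification studied in the paper).
import numpy as np

def artifact_mask(image, reference):
    """reference: 2-D reference image or a scalar reference value."""
    ref = np.asarray(reference, dtype=float)
    return np.abs(image - ref) >= 0.30 * ref

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    img = rng.normal(100.0, 5.0, size=(64, 64))
    img[28:36, 28:36] = 20.0                      # hypothetical signal void near implant
    mask = artifact_mask(img, reference=100.0)    # histogram-derived scalar (assumed)
    print(int(mask.sum()), "artifact pixels")
```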
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid serious variance under-estimation by conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of cluster event times.
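A minimal sketch of the cluster-bootstrap skeleton follows: resample whole clusters with replacement, refit on each resample, and take the standard deviation of the replicate estimates as the SE. A simple mean-difference estimator stands in for the Cox partial-likelihood fit used in the paper, and the data are hypothetical.

```python
# Minimal sketch of the cluster bootstrap: resample whole clusters with
# replacement, refit the estimator on each resample, and take the standard
# deviation of the replicate estimates as the SE. A simple mean-difference
# estimator stands in for the Cox partial-likelihood fit used in the paper.
import random
import statistics

def cluster_bootstrap_se(clusters, estimator, n_boot=500, seed=0):
    """clusters: list of cluster datasets; estimator: callable on pooled data."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        resampled = [rng.choice(clusters) for _ in range(len(clusters))]
        pooled = [rec for cluster in resampled for rec in cluster]
        estimates.append(estimator(pooled))
    return statistics.stdev(estimates)

def mean_difference(records):
    """Placeholder estimator: exposed-minus-unexposed mean event time."""
    exposed = [t for t, x in records if x == 1]
    control = [t for t, x in records if x == 0]
    return sum(exposed) / len(exposed) - sum(control) / len(control)

if __name__ == "__main__":
    # each cluster is a list of (event_time, exposure) records, hypothetical data
    clusters = [[(5.2, 1), (6.1, 0), (4.8, 1)],
                [(7.0, 0), (6.5, 1), (8.1, 0)],
                [(3.9, 1), (4.4, 0)],
                [(9.2, 0), (8.7, 1), (7.9, 0)]]
    print(cluster_bootstrap_se(clusters, mean_difference))
```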
Methodological Pluralism: The Gold Standard of STEM Evaluation
ERIC Educational Resources Information Center
Lawrenz, Frances; Huffman, Douglas
2006-01-01
Nationally, there is continuing debate about appropriate methods for conducting educational evaluations. The U.S. Department of Education has placed a priority on "scientifically" based evaluation methods and has advocated a "gold standard" of randomized controlled experimentation. The priority suggests that randomized control methods are best,…
NASA Astrophysics Data System (ADS)
Pacheco-Sanchez, Anibal; Claus, Martin; Mothes, Sven; Schröter, Michael
2016-11-01
Three different methods for the extraction of the contact resistance based on both the well-known transfer length method (TLM) and two variants of the Y-function method have been applied to simulation and experimental data of short- and long-channel CNTFETs. While for TLM special CNT test structures are mandatory, standard electrical device characteristics are sufficient for the Y-function methods. The methods have been applied to CNTFETs with low and high channel resistance. It turned out that the standard Y-function method fails to deliver the correct contact resistance in case of a relatively high channel resistance compared to the contact resistances. A physics-based validation is also given for the application of these methods based on applying traditional Si MOSFET theory to quasi-ballistic CNTFETs.
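For the TLM part of the comparison, the contact resistance follows from a straight-line fit of total on-state resistance against channel length, with half of the zero-length intercept taken as the resistance per contact; a minimal sketch with hypothetical resistances follows (the Y-function variants are not reproduced here).

```python
# Minimal sketch of the transfer length method (TLM): fit total resistance
# against channel length; half of the zero-length intercept is the contact
# resistance per contact. Example resistances are hypothetical.

def tlm_contact_resistance(lengths_um, resistances_kohm):
    n = len(lengths_um)
    mx = sum(lengths_um) / n
    my = sum(resistances_kohm) / n
    sxx = sum((x - mx) ** 2 for x in lengths_um)
    sxy = sum((x - mx) * (y - my) for x, y in zip(lengths_um, resistances_kohm))
    slope = sxy / sxx                    # channel resistance per unit length
    intercept = my - slope * mx          # R_total at L = 0, i.e. 2 * R_contact
    return intercept / 2.0, slope

if __name__ == "__main__":
    lengths = [0.2, 0.5, 1.0, 2.0]                 # channel lengths, micrometres
    r_total = [38.0, 47.5, 63.0, 95.0]             # on-state resistance, kOhm
    r_c, r_per_um = tlm_contact_resistance(lengths, r_total)
    print(f"R_contact ≈ {r_c:.1f} kOhm per contact, {r_per_um:.1f} kOhm/µm channel")
```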
ERIC Educational Resources Information Center
Li, Deping; Oranje, Andreas
2007-01-01
Two versions of a general method for approximating standard error of regression effect estimates within an IRT-based latent regression model are compared. The general method is based on Binder's (1983) approach, accounting for complex samples and finite populations by Taylor series linearization. In contrast, the current National Assessment of…
Wei, Xiang; Camino, Acner; Pi, Shaohua; Cepurna, William; Huang, David; Morrison, John C; Jia, Yali
2018-05-01
Phase-based optical coherence tomography (OCT), such as OCT angiography (OCTA) and Doppler OCT, is sensitive to the confounding phase shift introduced by subject bulk motion. Traditional bulk motion compensation methods are limited by their accuracy and computing cost-effectiveness. In this Letter, to the best of our knowledge, we present a novel bulk motion compensation method for phase-based functional OCT. Bulk motion associated phase shift can be directly derived by solving its equation using a standard deviation of phase-based OCTA and Doppler OCT flow signals. This method was evaluated on rodent retinal images acquired by a prototype visible light OCT and human retinal images acquired by a commercial system. The image quality and computational speed were significantly improved, compared to two conventional phase compensation methods.
Fuguet, Elisabet; Ràfols, Clara; Bosch, Elisabeth; Rosés, Martí
2009-04-24
A new and fast method to determine acidity constants of monoprotic weak acids and bases by capillary zone electrophoresis based on the use of an internal standard (compound of similar nature and acidity constant as the analyte) has been developed. This method requires only two electrophoretic runs for the determination of an acidity constant: a first one at a pH where both analyte and internal standard are totally ionized, and a second one at another pH where both are partially ionized. Furthermore, the method is not pH dependent, so an accurate measure of the pH of the buffer solutions is not needed. The acidity constants of several phenols and amines have been measured using internal standards of known pK(a), obtaining a mean deviation of 0.05 pH units compared to the literature values.
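The internal-standard calculation can be sketched with the classical relation between effective mobility and ionization degree: because analyte and internal standard are run in the same buffer, the pH term cancels and the unknown pKa follows from the internal standard's pKa and the measured limiting and effective mobilities. The sketch below uses this Henderson-Hasselbalch-type mobility expression with hypothetical mobilities; it illustrates the principle rather than reproducing the authors' exact working equations.

```python
# Minimal sketch of the internal-standard pKa calculation for monoprotic acids.
# mu_lim: limiting (fully ionized) mobility from the first run; mu_eff: effective
# mobility from the partially ionized run; both analyte and internal standard are
# measured in the same runs, so the buffer pH cancels. This uses the classical
# Henderson-Hasselbalch-type mobility relation as an illustration, not the
# authors' exact working equations.
import math

def pka_internal_standard(pka_is, mu_lim_is, mu_eff_is, mu_lim_an, mu_eff_an):
    term_an = math.log10((mu_lim_an - mu_eff_an) / mu_eff_an)
    term_is = math.log10((mu_lim_is - mu_eff_is) / mu_eff_is)
    return pka_is + term_an - term_is

if __name__ == "__main__":
    # hypothetical mobilities in 10^-9 m^2 V^-1 s^-1
    print(pka_internal_standard(pka_is=9.95,          # e.g. a phenol used as IS
                                mu_lim_is=-30.0, mu_eff_is=-12.0,
                                mu_lim_an=-28.0, mu_eff_an=-8.0))
```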
Fibrinolysis standards: a review of the current status.
Thelwell, C
2010-07-01
Biological standards are used to calibrate measurements of components of the fibrinolytic system, either for assigning potency values to therapeutic products, or to determine levels in human plasma as an indicator of thrombotic risk. Traditionally WHO International Standards are calibrated in International Units based on consensus values from collaborative studies. The International Unit is defined by the response activity of a given amount of the standard in a bioassay, independent of the method used. Assay validity is based on the assumption that both standard and test preparation contain the same analyte, and the response in an assay is a true function of this analyte. This principle is reflected in the diversity of source materials used to prepare fibrinolysis standards, which has depended on the contemporary preparations they were employed to measure. With advancing recombinant technology, and improved analytical techniques, a reference system based on reference materials and associated reference methods has been recommended for future fibrinolysis standards. Careful consideration and scientific judgement must however be applied when deciding on an approach to develop a new standard, with decisions based on the suitability of a standard to serve its purpose, and not just to satisfy a metrological ideal. 2010 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.
External Standards or Standard Addition? Selecting and Validating a Method of Standardization
NASA Astrophysics Data System (ADS)
Harvey, David T.
2002-05-01
A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and obtain a practical experience in the difference between performing an external standardization and a standard addition.
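The contrast the experiment teaches can be captured in a few lines: with external standards the unknown is read off a calibration line built from matrix-free standards, while with standard addition known spikes are added to the sample itself and the concentration comes from extrapolating the signal-versus-added-concentration line to zero signal. A minimal sketch follows, assuming the constant-total-volume form of standard addition and hypothetical signals.

```python
# Minimal sketch contrasting external standardization and standard addition
# (constant total volume assumed for the additions). Data are hypothetical.

def linear_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

def external_standard(conc_std, signal_std, signal_sample):
    slope, intercept = linear_fit(conc_std, signal_std)
    return (signal_sample - intercept) / slope

def standard_addition(conc_added, signal_spiked):
    # signal = k * (C_sample + C_added); the x-intercept magnitude gives C_sample
    slope, intercept = linear_fit(conc_added, signal_spiked)
    return intercept / slope

if __name__ == "__main__":
    # external calibration (matrix-free standards)
    print(external_standard([1.0, 2.0, 4.0], [0.10, 0.20, 0.40], signal_sample=0.25))
    # standard addition (spikes added to the sample itself)
    print(standard_addition([0.0, 1.0, 2.0, 4.0], [0.15, 0.25, 0.35, 0.55]))
```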
ERIC Educational Resources Information Center
Caruthers, Tarchell Peeples
2013-01-01
Current research shows that, despite standards-based mathematics reform, American students lag behind in mathematics achievement when compared to their counterparts in other countries. The purpose of this mixed methods study was to examine if reading level, as measured by the Scholastic Reading Inventory, is related to standards-based mathematics…
NASA Astrophysics Data System (ADS)
Tian, Lunfu; Wang, Lili; Gao, Wei; Weng, Xiaodong; Liu, Jianhui; Zou, Deshuang; Dai, Yichun; Huang, Shuke
2018-03-01
For the quantitative analysis of the principal elements in lead-antimony-tin alloys, direct X-ray fluorescence (XRF) analysis of solid metal disks introduces considerable errors due to microstructural inhomogeneity. To solve this problem, an aqueous-solution XRF method is proposed for determining major amounts of Sb, Sn, and Pb in lead-based bearing alloys. The alloy samples were dissolved in a mixture of nitric acid and tartaric acid to eliminate the effects of the microstructure of these alloys on the XRF analysis. Rh Compton scattering was used as the internal standard for Sb and Sn, and Bi was added as the internal standard for Pb, to correct for matrix effects and instrumental and operational variations. High-purity lead, antimony, and tin were used to prepare synthetic standards. Using these standards, calibration curves were constructed for the three elements after optimizing the spectrometer parameters. The method has been successfully applied to the analysis of lead-based bearing alloys and is more rapid than the classical titration methods normally used. The determination results are consistent with certified values or with those obtained by titration.
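The internal-standard calibration reduces to fitting the analyte-to-internal-standard intensity ratio of the synthetic standards against concentration and then inverting the line for an unknown; a minimal sketch for one element (Pb referenced to the added Bi) follows, with hypothetical intensities and concentrations.

```python
# Minimal sketch of internal-standard XRF calibration for one element: fit the
# Pb/Bi intensity ratio of synthetic standards against Pb concentration, then
# invert the line for an unknown. Intensities and concentrations are hypothetical.

def linear_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

def calibrate(conc_pb, i_pb, i_bi):
    ratios = [a / b for a, b in zip(i_pb, i_bi)]
    return linear_fit(conc_pb, ratios)           # ratio = slope * conc + intercept

def quantify(i_pb_unknown, i_bi_unknown, slope, intercept):
    return (i_pb_unknown / i_bi_unknown - intercept) / slope

if __name__ == "__main__":
    slope, intercept = calibrate(conc_pb=[70.0, 80.0, 90.0],       # mass %
                                 i_pb=[140.0, 161.0, 180.5],       # counts/s
                                 i_bi=[100.0, 100.5, 99.8])
    print(round(quantify(150.0, 99.9, slope, intercept), 1), "% Pb")
```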
Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2013-01-01
The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
On standardization of low symmetry crystal fields
NASA Astrophysics Data System (ADS)
Gajek, Zbigniew
2015-07-01
Standardization methods for low-symmetry (orthorhombic, monoclinic, and triclinic) crystal fields are formulated and discussed. Two alternative approaches are presented: the conventional one, based on the second-rank parameters, and a standardization based on the fourth-rank parameters. Mainly f-electron systems are considered, but some guidelines for d-electron systems and for the spin Hamiltonian describing the zero-field splitting are given. The discussion focuses on premises for choosing the most suitable method, in particular on the inadequacy of the conventional one. A few examples from the literature illustrate this situation.
Standardization of Laser Methods and Techniques for Vibration Measurements and Calibrations
NASA Astrophysics Data System (ADS)
von Martens, Hans-Jürgen
2010-05-01
The realization and dissemination of the SI units of motion quantities (vibration and shock) have been based on laser interferometer methods specified in international documentary standards. New and refined laser methods and techniques developed by national metrology institutes and by leading manufacturers in the past two decades have been swiftly specified as standard methods for inclusion in the ISO 16063 series of international documentary standards. A survey of ISO standards for the calibration of vibration and shock transducers demonstrates the extended ranges and improved accuracy (measurement uncertainty) of laser methods and techniques for vibration and shock measurements and calibrations. The first standard for the calibration of laser vibrometers by laser interferometry, or by a reference accelerometer calibrated by laser interferometry (ISO 16063-41), is at the Draft International Standard (DIS) stage and may be issued by the end of 2010. The standard methods with refined techniques proved to achieve wider measurement ranges and smaller measurement uncertainties than those specified in the ISO standards. The applicability of different standardized interferometer methods to vibrations at high frequencies was recently demonstrated up to 347 kHz (acceleration amplitudes up to 350 km/s2). The relative deviations between the amplitude measurement results of the different interferometer methods, applied simultaneously, were less than 1% in all cases.
Research on Generating Method of Embedded Software Test Document Based on Dynamic Model
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying
2018-03-01
This paper presents a dynamic model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test document. The method implements dynamic test requirements in dynamic models, so that dynamic test demand tracking can be generated easily. It automatically produces standardized test requirements and test documentation, addresses inconsistency and incompleteness in document-related content, and improves documentation efficiency.
[Establishment of database with standard 3D tooth crowns based on 3DS MAX].
Cheng, Xiaosheng; An, Tao; Liao, Wenhe; Dai, Ning; Yu, Qing; Lu, Peijun
2009-08-01
The database with standard 3D tooth crowns lays the groundwork for dental CAD/CAM systems. In this paper, we design standard tooth crowns in 3DS MAX 9.0 and successfully create a database with these models. First, key lines are collected from standard tooth pictures. We then use 3DS MAX 9.0 to design the digital tooth models based on these lines; during the design process it is important to refer to the standard plaster tooth model. Tests show that the standard tooth models designed with this method are accurate and adaptable; furthermore, it is easy to perform operations on the models such as deformation and translation. This method provides a new approach to building a database with standard 3D tooth crowns and a basis for dental CAD/CAM systems.
NASA Astrophysics Data System (ADS)
Powell, P. E.
Educators have recently come to consider inquiry-based instruction a more effective method of instruction than didactic instruction. Experience-based learning theory suggests that student performance is linked to teaching method. However, research is limited on inquiry teaching and its effectiveness in preparing students to perform well on standardized tests. The purpose of the study was to investigate whether one of these two teaching methodologies was more effective in increasing student performance on standardized science tests. The quasi-experimental quantitative study comprised two stages. Stage 1 used a survey to identify the teaching methods of a convenience sample of 57 teacher participants and determined the level of inquiry used in instruction in order to place participants into instructional groups (the independent variable). Stage 2 used analysis of covariance (ANCOVA) to compare posttest scores on a standardized exam by teaching method. Additional analyses were conducted to examine the differences in science achievement by ethnicity, gender, and socioeconomic status by teaching methodology. Results demonstrated a statistically significant gain in test scores when students were taught using inquiry-based instruction. Subpopulation analyses indicated that all groups showed improved mean standardized test scores except African American students. The findings benefit teachers and students by presenting data supporting a method of content delivery that increases teacher efficacy and produces students with a greater cognition of science content that meets the school's mission and goals.
Fast calculation of the line-spread-function by transversal directions decoupling
NASA Astrophysics Data System (ADS)
Parravicini, Jacopo; Tartara, Luca; Hasani, Elton; Tomaselli, Alessandra
2016-07-01
We propose a simplified method to calculate the optical spread function of a paradigmatic system constituted by a pupil-lens with a line-shaped illumination (‘line-spread-function’). Our approach is based on decoupling the two transversal directions of the beam and treating the propagation by means of the Fourier optics formalism. This requires simpler calculations with respect to the more usual Bessel-function-based method. The model is discussed and compared with standard calculation methods by carrying out computer simulations. The proposed approach is found to be much faster than the Bessel-function-based one (CPU time ≲ 5% of the standard method), while the results of the two methods present a very good mutual agreement.
An XML-based method for astronomy software designing
NASA Astrophysics Data System (ADS)
Liao, Mingxue; Aili, Yusupu; Zhang, Jin
An XML-based method for standardizing software design is introduced, analyzed, and successfully applied to renovating the hardware and software of the digital clock at Urumqi Astronomical Station. The basic strategy for eliciting time information from the new digital clock FT206 in the antenna control program is introduced. With FT206, the need to compute how many centuries have passed since a given day using sophisticated formulas is eliminated, and it is no longer necessary to set the UT time of the computer controlling the antenna, because the year, month, and day are all deduced from the Julian day held in FT206 rather than from the computer time. With an XML-based method and standard for software design, various existing design methods are unified, communication and collaboration between developers are facilitated, and an Internet-based mode of software development becomes possible. The trend of development of XML-based design methods is predicted.
16 CFR 1000.29 - Directorate for Engineering Sciences.
Code of Federal Regulations, 2010 CFR
2010-01-01
... standards, product safety tests and test methods, performance criteria, design specifications, and quality control standards for consumer products, based on engineering and scientific methods. It conducts... consumer interest groups. The Directorate conducts human factors studies and research of consumer product...
16 CFR 1000.29 - Directorate for Engineering Sciences.
Code of Federal Regulations, 2012 CFR
2012-01-01
... standards, product safety tests and test methods, performance criteria, design specifications, and quality control standards for consumer products, based on engineering and scientific methods. It conducts... consumer interest groups. The Directorate conducts human factors studies and research of consumer product...
Confidence Limits for the Indirect Effect: Distribution of the Product and Resampling Methods
ERIC Educational Resources Information Center
MacKinnon, David P.; Lockwood, Chondra M.; Williams, Jason
2004-01-01
The most commonly used method to test an indirect effect is to divide the estimate of the indirect effect by its standard error and compare the resulting z statistic with a critical value from the standard normal distribution. Confidence limits for the indirect effect are also typically based on critical values from the standard normal…
Safaei-Asl, Afshin; Enshaei, Mercede; Heydarzadeh, Abtin; Maleknejad, Shohreh
2016-01-01
Assessment of glomerular filtration rate (GFR) is an important tool for monitoring renal function. Given the limitations of the available methods, we aimed to calculate GFR with cystatin C (Cys C)-based formulas and to determine their correlation with current methods. We studied 72 children (38 boys and 34 girls) with renal disorders. The 24-hour urinary creatinine (Cr) clearance was the gold standard method. GFR was measured with the Schwartz formula and with Cys C-based formulas (Grubb, Hoek, Larsson and Simple), and the correlations of these formulas were then determined. Using the Pearson correlation coefficient, a significant positive correlation between all formulas and the standard method was seen (R(2) for the Schwartz, Hoek, Larsson, Grubb and Simple formulas was 0.639, 0.722, 0.705, 0.712, and 0.722, respectively) (P<0.001). The Cys C-based formulas could predict the variance of the standard method results with high power. These formulas correlated with the Schwartz formula with R(2) of 0.62-0.65 (intermediate correlation). Linear regression analysis (including the y-intercept) revealed that the Larsson, Hoek and Grubb formulas estimate GFR with no statistical difference from the standard method, whereas the Schwartz and Simple formulas overestimate GFR. This study shows that Cys C-based formulas have a strong relationship with 24-hour urinary Cr clearance. Hence, they can determine GFR in children with kidney injury more easily and with sufficient accuracy, helping the physician to diagnose renal disease at an early stage and improving the prognosis.
Li, Li; Liu, Dong-Jun
2014-01-01
Since 2012, China has been facing haze-fog weather conditions, and haze-fog pollution and PM2.5 have become hot topics. Evaluating and analyzing the ecological status of China's air environment is therefore necessary and of great significance for environmental protection measures. In this study the current situation of haze-fog pollution in China was analyzed first, and the new Ambient Air Quality Standards were introduced. For the issue of air quality evaluation, a comprehensive evaluation model based on an entropy weighting method and the nearest neighbor method was developed. The entropy weighting method was used to determine the weights of the indicators, and the nearest neighbor method was used to evaluate the air quality levels. The comprehensive evaluation model was then applied to the practical evaluation problem of air quality in Beijing to analyze the haze-fog pollution. Two simulation experiments were implemented in this study. One experiment included the indicator PM2.5 and was carried out based on the new Ambient Air Quality Standards (GB 3095-2012); the other excluded PM2.5 and was carried out based on the old Ambient Air Quality Standards (GB 3095-1996). Their results were compared, and the simulation results showed that PM2.5 is an important indicator of air quality and that the evaluation results under the new Air Quality Standards were more scientific than those under the old ones. The haze-fog pollution situation in Beijing was also analyzed based on these results, and corresponding management measures were suggested. PMID:25170682
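A minimal sketch of the entropy weighting step described above, assuming an indicator matrix of monitoring samples by pollutant indicators; the indicator set, the toy values, and the subsequent use of the weights in a distance-based nearest-neighbor classifier are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: X is an (n_samples, m_indicators) matrix of
    non-negative indicator values (e.g. pollutant concentrations)."""
    # Normalize each indicator column to proportions
    P = X / X.sum(axis=0, keepdims=True)
    n = X.shape[0]
    # Shannon entropy of each indicator (0 * log 0 treated as 0)
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)
    # Indicators with lower entropy (more discriminating) receive higher weight
    d = 1.0 - e
    return d / d.sum()

# Hypothetical daily readings for SO2, NO2, PM10, PM2.5 (arbitrary units)
X = np.array([[0.04, 0.05, 0.12, 0.08],
              [0.06, 0.04, 0.20, 0.15],
              [0.03, 0.06, 0.09, 0.05]])
w = entropy_weights(X)
print(w)  # weights that would scale the distances in the nearest-neighbor step
```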
Jha, Abhinav K.; Kupinski, Matthew A.; Rodríguez, Jeffrey J.; Stephen, Renu M.; Stopeck, Alison T.
2012-01-01
In many studies, the estimation of the apparent diffusion coefficient (ADC) of lesions in visceral organs in diffusion-weighted (DW) magnetic resonance images requires an accurate lesion-segmentation algorithm. To evaluate these lesion-segmentation algorithms, region-overlap measures are used currently. However, the end task from the DW images is accurate ADC estimation, and the region-overlap measures do not evaluate the segmentation algorithms on this task. Moreover, these measures rely on the existence of gold-standard segmentation of the lesion, which is typically unavailable. In this paper, we study the problem of task-based evaluation of segmentation algorithms in DW imaging in the absence of a gold standard. We first show that using manual segmentations instead of gold-standard segmentations for this task-based evaluation is unreliable. We then propose a method to compare the segmentation algorithms that does not require gold-standard or manual segmentation results. The no-gold-standard method estimates the bias and the variance of the error between the true ADC values and the ADC values estimated using the automated segmentation algorithm. The method can be used to rank the segmentation algorithms on the basis of both accuracy and precision. We also propose consistency checks for this evaluation technique. PMID:22713231
NASA Astrophysics Data System (ADS)
Salminen, J.; Högström, R.; Saxholm, S.; Lakka, A.; Riski, K.; Heinonen, M.
2018-04-01
In this paper we present the development of a primary standard for dynamic pressures that is based on the drop-weight method. At present, dynamic pressure transducers are typically calibrated using reference transducers, which are calibrated against static pressure standards. Because the dynamic and static characteristics of pressure transducers may differ significantly from each other, it is important that these transducers are calibrated against dynamic pressure standards. In a method developed at VTT Technical Research Centre of Finland Ltd, Centre for Metrology MIKES, a pressure pulse is generated by the impact between a dropping weight and the piston of a liquid-filled piston-cylinder assembly. The traceability to SI units is realized through interferometric measurement of the acceleration of the dropping weight during impact, the effective area of the piston-cylinder assembly and the mass of the weight. Based on experimental validation and an uncertainty evaluation, the developed primary standard provides traceability for peak pressures in the range from 10 MPa to 400 MPa with a few-millisecond pulse width and a typical relative expanded uncertainty (k = 2) of 1.5%. The performance of the primary standard is demonstrated by test calibrations of two dynamic pressure transducers.
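The traceability chain described above implies, under the usual idealization, that the generated pressure follows from Newton's second law applied to the decelerating weight, p(t) = m·a(t)/A_eff. The sketch below codes this relation with purely hypothetical numbers; it is not a description of the MIKES apparatus.

```python
# Minimal sketch of the traceability relation implied above: the pressure pulse
# follows from the decelerating weight via p(t) = m * a(t) / A_eff.
# All names and numbers are illustrative, not parameters of the MIKES setup.
def pulse_pressure(mass_kg, acceleration_m_s2, effective_area_m2):
    return mass_kg * acceleration_m_s2 / effective_area_m2

# Hypothetical values: 5 kg weight, 2000 m/s^2 peak deceleration, 1 cm^2 effective area
peak_pa = pulse_pressure(5.0, 2.0e3, 1.0e-4)
print(f"peak pressure ~ {peak_pa / 1e6:.0f} MPa")  # ~100 MPa
```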
Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás
2014-01-01
Objective: To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts that were produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold standard and competency questions based evaluation methods, respectively. Background: In the last decades many methods for ontology construction and ontology evaluation have been proposed. However, none of them has become a standard and there is no empirical evidence of comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. Methods: In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained for the previous studies based on the same data. Results: Our results show a significant effect of the GoodOD training over developed ontologies by topics: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. Conclusion: The GoodOD guideline had a significant effect over the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies. PMID:25148262
An Investigation of Undefined Cut Scores with the Hofstee Standard-Setting Method
ERIC Educational Resources Information Center
Wyse, Adam E.; Babcock, Ben
2017-01-01
This article provides an overview of the Hofstee standard-setting method and illustrates several situations where the Hofstee method will produce undefined cut scores. The situations where the cut scores will be undefined involve cases where the line segment derived from the Hofstee ratings does not intersect the score distribution curve based on…
Standard Error Estimation of 3PL IRT True Score Equating with an MCMC Method
ERIC Educational Resources Information Center
Liu, Yuming; Schulz, E. Matthew; Yu, Lei
2008-01-01
A Markov chain Monte Carlo (MCMC) method and a bootstrap method were compared in the estimation of standard errors of item response theory (IRT) true score equating. Three test form relationships were examined: parallel, tau-equivalent, and congeneric. Data were simulated based on Reading Comprehension and Vocabulary tests of the Iowa Tests of…
A Mapmark method of standard setting as implemented for the National Assessment Governing Board.
Schulz, E Matthew; Mitzel, Howard C
2011-01-01
This article describes a Mapmark standard setting procedure, developed under contract with the National Assessment Governing Board (NAGB). The procedure enhances the bookmark method with spatially representative item maps, holistic feedback, and an emphasis on independent judgment. A rationale for these enhancements, and the bookmark method, is presented, followed by a detailed description of the materials and procedures used in a meeting to set standards for the 2005 National Assessment of Educational Progress (NAEP) in Grade 12 mathematics. The use of difficulty-ordered content domains to provide holistic feedback is a particularly novel feature of the method. Process evaluation results comparing Mapmark to Angoff-based methods previously used for NAEP standard setting are also presented.
Phillips, Melissa M; Bedner, Mary; Reitz, Manuela; Burdette, Carolyn Q; Nelson, Michael A; Yen, James H; Sander, Lane C; Rimmer, Catherine A
2017-02-01
Two independent analytical approaches, based on liquid chromatography with absorbance detection and liquid chromatography with mass spectrometric detection, have been developed for determination of isoflavones in soy materials. These two methods yield comparable results for a variety of soy-based foods and dietary supplements. Four Standard Reference Materials (SRMs) have been produced by the National Institute of Standards and Technology to assist the food and dietary supplement community in method validation and have been assigned values for isoflavone content using both methods. These SRMs include SRM 3234 Soy Flour, SRM 3236 Soy Protein Isolate, SRM 3237 Soy Protein Concentrate, and SRM 3238 Soy-Containing Solid Oral Dosage Form. A fifth material, SRM 3235 Soy Milk, was evaluated using the methods and found to be inhomogeneous for isoflavones and unsuitable for value assignment. Graphical Abstract Separation of six isoflavone aglycones and glycosides found in Standard Reference Material (SRM) 3236 Soy Protein Isolate.
Li, Dan; Jiang, Jia; Han, Dandan; Yu, Xinyu; Wang, Kun; Zang, Shuang; Lu, Dayong; Yu, Aimin; Zhang, Ziwei
2016-04-05
A new method is proposed for measuring the antioxidant capacity by electron spin resonance spectroscopy based on the loss of electron spin resonance signal after Cu(2+) is reduced to Cu(+) with antioxidant. Cu(+) was removed by precipitation in the presence of SCN(-). The remaining Cu(2+) was coordinated with diethyldithiocarbamate, extracted into n-butanol and determined by electron spin resonance spectrometry. Eight standards widely used in antioxidant capacity determination, including Trolox, ascorbic acid, ferulic acid, rutin, caffeic acid, quercetin, chlorogenic acid, and gallic acid were investigated. The standard curves for determining the eight standards were plotted, and results showed that the linear regression correlation coefficients were all high enough (r > 0.99). Trolox equivalent antioxidant capacity values for the antioxidant standards were calculated, and a good correlation (r > 0.94) between the values obtained by the present method and cupric reducing antioxidant capacity method was observed. The present method was applied to the analysis of real fruit samples and the evaluation of the antioxidant capacity of these fruits.
Barrett, Bruce; Brown, Roger; Mundt, Marlon
2008-02-01
Evaluative health-related quality-of-life instruments used in clinical trials should be able to detect small but important changes in health status. Several approaches to the minimal important difference (MID) and to responsiveness have been developed. The aim was to compare anchor-based and distributional approaches to important difference and responsiveness for the Wisconsin Upper Respiratory Symptom Survey (WURSS), an illness-specific quality-of-life outcomes instrument. Participants with community-acquired colds self-reported daily using the WURSS-44. Distribution-based methods calculated the standardized effect size (ES) and the standard error of measurement (SEM). Anchor-based methods compared daily interval changes to global ratings of change, using: (1) standard MID methods based on correspondence to ratings of "a little better" or "somewhat better," and (2) two-level multivariate regression models. About 150 adults were monitored throughout their colds (1,681 sick days): 88% were white, 69% were women, and 50% had completed college. The mean age was 35.5 years (SD = 14.7). WURSS scores increased 2.2 points from the first to second day, and then dropped by an average of 8.2 points per day from days 2 to 7. The SEM averaged 9.1 during these 7 days. Standard methods yielded a between-day MID of 22 points. Regression models of MID projected 11.3-point daily changes. Dividing these estimates of small-but-important difference by pooled SDs yielded coefficients of .425 for standard MID, .218 for the regression model, .177 for SEM, and .157 for ES. These imply per-group sample sizes of 870 using ES, 616 for SEM, 302 for the regression model, and 89 for standard MID, assuming alpha = .05, beta = .20 (80% power), and two-tailed testing. Distribution- and anchor-based approaches provide somewhat different estimates of small but important difference, which in turn can have substantial impact on trial design.
Shao, Jing-Yuan; Qu, Hai-Bin; Gong, Xing-Chu
2018-05-01
In this work, two algorithms (overlapping method and the probability-based method) for design space calculation were compared by using the data collected from extraction process of Codonopsis Radix as an example. In the probability-based method, experimental error was simulated to calculate the probability of reaching the standard. The effects of several parameters on the calculated design space were studied, including simulation number, step length, and the acceptable probability threshold. For the extraction process of Codonopsis Radix, 10 000 times of simulation and 0.02 for the calculation step length can lead to a satisfactory design space. In general, the overlapping method is easy to understand, and can be realized by several kinds of commercial software without coding programs, but the reliability of the process evaluation indexes when operating in the design space is not indicated. Probability-based method is complex in calculation, but can provide the reliability to ensure that the process indexes can reach the standard within the acceptable probability threshold. In addition, there is no probability mutation in the edge of design space by probability-based method. Therefore, probability-based method is recommended for design space calculation. Copyright© by the Chinese Pharmaceutical Association.
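A minimal sketch of the probability-based calculation outlined above: experimental error is simulated around a fitted process model, and a grid point belongs to the design space when the simulated probability of reaching the standard exceeds the acceptable threshold. The response model, acceptance limit, error level and parameter ranges below are placeholders, not the Codonopsis Radix extraction model.

```python
import numpy as np

rng = np.random.default_rng(0)

def predicted_index(x1, x2):
    # Placeholder response-surface model on scaled parameters in [0, 1];
    # a real model would be fitted to extraction experiments.
    return 0.70 + 0.15 * x1 + 0.10 * x2

def prob_reaching_standard(x1, x2, limit=0.80, sigma=0.03, n_sim=10_000):
    """Simulate experimental error and return P(quality index >= limit)."""
    noise = rng.normal(0.0, sigma, n_sim)
    return np.mean(predicted_index(x1, x2) + noise >= limit)

# Scan the scaled parameter grid with a step of 0.02 and keep points whose
# probability of reaching the standard exceeds the acceptable threshold (0.90 here).
grid = np.arange(0.0, 1.0 + 1e-9, 0.02)
design_space = [(x1, x2) for x1 in grid for x2 in grid
                if prob_reaching_standard(x1, x2) >= 0.90]
print(f"{len(design_space)} of {grid.size**2} grid points lie inside the design space")
```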
Leff, J.; Henley, J.; Tittl, J.; De Nardo, E.; Butler, M.; Griggs, R.; Fierer, N.
2017-01-01
ABSTRACT Hands play a critical role in the transmission of microbiota on one’s own body, between individuals, and on environmental surfaces. Effectively measuring the composition of the hand microbiome is important to hand hygiene science, which has implications for human health. Hand hygiene products are evaluated using standard culture-based methods, but standard test methods for culture-independent microbiome characterization are lacking. We sampled the hands of 50 participants using swab-based and glove-based methods prior to and following four hand hygiene treatments (using a nonantimicrobial hand wash, alcohol-based hand sanitizer [ABHS], a 70% ethanol solution, or tap water). We compared results among culture plate counts, 16S rRNA gene sequencing of DNA extracted directly from hands, and sequencing of DNA extracted from culture plates. Glove-based sampling yielded higher numbers of unique operational taxonomic units (OTUs) but had less diversity in bacterial community composition than swab-based sampling. We detected treatment-induced changes in diversity only by using swab-based samples (P < 0.001); we were unable to detect changes with glove-based samples. Bacterial cell counts significantly decreased with use of the ABHS (P < 0.05) and ethanol control (P < 0.05). Skin hydration at baseline correlated with bacterial abundances, bacterial community composition, pH, and redness across subjects. The importance of the method choice was substantial. These findings are important to ensure improvement of hand hygiene industry methods and for future hand microbiome studies. On the basis of our results and previously published studies, we propose recommendations for best practices in hand microbiome research. PMID:28351915
Animal behavior and well-being symposium: Farm animal welfare assurance: science and application.
Rushen, J; Butterworth, A; Swanson, J C
2011-04-01
Public and consumer pressure for assurances that farm animals are raised humanely has led to a range of private and public animal welfare standards, and for methods to assess compliance with these standards. The standards usually claim to be science based, but even though researchers have developed measures of animal welfare and have tested the effects of housing and management variables on welfare within controlled laboratory settings, there are challenges in extending this research to develop on-site animal welfare standards. The standards need to be validated against a definition of welfare that has broad support and which is amenable to scientific investigation. Ensuring that such standards acknowledge scientific uncertainty is also challenging, and balanced input from all scientific disciplines dealing with animal welfare is needed. Agencies providing animal welfare audit services need to integrate these scientific standards and legal requirements into successful programs that effectively measure and objectively report compliance. On-farm assessment of animal welfare requires a combination of animal-based measures to assess the actual state of welfare and resource-based measures to identify risk factors. We illustrate this by referring to a method of assessing welfare in broiler flocks. Compliance with animal welfare standards requires buy-in from all stakeholders, and this will be best achieved by a process of inclusion in the development of pragmatic assessment methods and the development of audit programs verifying the conditions and continuous improvement of farm animal welfare.
Blom, Kimberly C; Farina, Sasha; Gomez, Yessica-Haydee; Campbell, Norm R C; Hemmelgarn, Brenda R; Cloutier, Lyne; McKay, Donald W; Dawes, Martin; Tobe, Sheldon W; Bolli, Peter; Gelfer, Mark; McLean, Donna; Bartlett, Gillian; Joseph, Lawrence; Featherstone, Robin; Schiffrin, Ernesto L; Daskalopoulou, Stella S
2015-04-01
Despite progress in automated blood pressure measurement (BPM) technology, there is limited research linking hard outcomes to automated office BPM (OBPM) treatment targets and thresholds. Equivalences for automated BPM devices have been estimated from approximations of standardized manual measurements of 140/90 mmHg. Until outcome-driven targets and thresholds become available for automated measurement methods, deriving evidence-based equivalences between automated methods and standardized manual OBPM is the next best solution. The MeasureBP study group was initiated by the Canadian Hypertension Education Program to close this critical knowledge gap. MeasureBP aims to define evidence-based equivalent values between standardized manual OBPM and automated BPM methods by synthesizing available evidence using a systematic review and individual subject-level data meta-analyses. This manuscript provides a review of the literature and MeasureBP study protocol. These results will lay the evidenced-based foundation to resolve uncertainties within blood pressure guidelines which, in turn, will improve the management of hypertension.
NASA Astrophysics Data System (ADS)
Li, Xiongwei; Wang, Zhe; Lui, Siu-Lung; Fu, Yangting; Li, Zheng; Liu, Jianming; Ni, Weidou
2013-10-01
A bottleneck of the wide commercial application of laser-induced breakdown spectroscopy (LIBS) technology is its relatively high measurement uncertainty. A partial least squares (PLS) based normalization method was proposed to improve pulse-to-pulse measurement precision for LIBS, building on our previous spectrum standardization method. The proposed model utilized multi-line spectral information of the measured element and characterized the signal fluctuations due to the variation of plasma characteristic parameters (plasma temperature, electron number density, and total number density) for signal uncertainty reduction. The model was validated by the application of copper concentration prediction in 29 brass alloy samples. The results demonstrated an improvement in both measurement precision and accuracy over the generally applied normalization as well as our previously proposed simplified spectrum standardization method. The average relative standard deviation (RSD), average of the standard error (error bar), the coefficient of determination (R2), the root-mean-square error of prediction (RMSEP), and the average value of the maximum relative error (MRE) were 1.80%, 0.23%, 0.992, 1.30%, and 5.23%, respectively, while those for the generally applied spectral area normalization were 3.72%, 0.71%, 0.973, 1.98%, and 14.92%, respectively.
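The following sketch illustrates the general idea of a multivariate (PLS-based) correction that uses several line intensities at once, in contrast to single-line or total-area normalization; the simulated spectra, line choices and plasma-fluctuation model are assumptions for illustration only and do not reproduce the authors' spectrum standardization model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_shots = 200
conc = rng.uniform(55, 75, n_shots)      # Cu concentration of calibration shots (wt%)
temp = rng.normal(1.0, 0.05, n_shots)    # pulse-to-pulse plasma-state factor (hypothetical)

# Three hypothetical Cu lines whose intensities scale with concentration but
# fluctuate with the plasma state, plus two lines of the matrix element.
I_cu = np.column_stack([conc * temp**k for k in (0.8, 1.0, 1.3)])
I_matrix = np.column_stack([(100 - conc) * temp**k for k in (0.9, 1.1)])
X = np.hstack([I_cu, I_matrix]) + rng.normal(0, 0.5, (n_shots, 5))

# PLS uses the multi-line information to compensate part of the shot-to-shot
# fluctuation that a single-line or area normalization would leave in.
pls = PLSRegression(n_components=3).fit(X, conc)
pred = pls.predict(X).ravel()
rsd = np.std(pred - conc) / np.mean(conc) * 100
print(f"residual scatter ~ {rsd:.1f}% of the mean concentration")
```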
Evaluation of methods for measuring particulate matter emissions from gas turbines.
Petzold, Andreas; Marsh, Richard; Johnson, Mark; Miller, Michael; Sevcenco, Yura; Delhaye, David; Ibrahim, Amir; Williams, Paul; Bauer, Heidi; Crayford, Andrew; Bachalo, William D; Raper, David
2011-04-15
The project SAMPLE evaluated methods for measuring particle properties in the exhaust of aircraft engines with respect to the development of standardized operation procedures for particulate matter measurement in the aviation industry. Filter-based off-line mass methods included gravimetry and chemical analysis of carbonaceous species by combustion methods. Online mass methods were based on light absorption measurement or used size distribution measurements obtained from an electrical mobility analyzer approach. Number concentrations were determined using different condensation particle counters (CPC). Total mass from filter-based methods balanced gravimetric mass within 8% error. Carbonaceous matter accounted for 70% of gravimetric mass, while the remaining 30% was attributed to hydrated sulfate and noncarbonaceous organic matter fractions. Online methods were closely correlated over the entire range of emission levels studied in the tests. Elemental carbon from combustion methods and black carbon from optical methods deviated by a maximum of 5% with respect to mass for low to medium emission levels, whereas for high emission levels a systematic deviation between online methods and filter-based methods was found, which is attributed to sampling effects. CPC-based instruments proved highly reproducible for number concentration measurements, with a maximum interinstrument standard deviation of 7.5%.
Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis
Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.
2011-01-01
Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93 for the analysis of a certified reference material using the standardized methodologies. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for F analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184
NASA Astrophysics Data System (ADS)
Richings, Gareth W.; Habershon, Scott
2018-04-01
We present significant algorithmic improvements to a recently proposed direct quantum dynamics method, based upon combining well established grid-based quantum dynamics approaches and expansions of the potential energy operator in terms of a weighted sum of Gaussian functions. Specifically, using a sum of low-dimensional Gaussian functions to represent the potential energy surface (PES), combined with a secondary fitting of the PES using singular value decomposition, we show how standard grid-based quantum dynamics methods can be dramatically accelerated without loss of accuracy. This is demonstrated by on-the-fly simulations (using both standard grid-based methods and multi-configuration time-dependent Hartree) of both proton transfer on the electronic ground state of salicylaldimine and the non-adiabatic dynamics of pyrazine.
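A one-dimensional toy version of the PES representation described above: the potential is expanded in a weighted sum of Gaussians and the weights are obtained by SVD-based least squares. The grid, basis and potential are illustrative; the actual method uses low-dimensional Gaussian products and on-the-fly electronic-structure data.

```python
import numpy as np

# 1D toy PES sampled on a grid (stand-in for on-the-fly ab initio energies)
x = np.linspace(-3.0, 3.0, 200)
V = 0.5 * x**2 + 0.1 * x**3            # anharmonic toy potential

# Gaussian basis centred on a coarse set of points
centres = np.linspace(-3.0, 3.0, 25)
width = 0.5
G = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width**2))

# Least-squares weights via SVD (np.linalg.lstsq uses an SVD internally)
w, *_ = np.linalg.lstsq(G, V, rcond=None)
V_fit = G @ w
print(f"max abs fit error: {np.max(np.abs(V_fit - V)):.2e}")
```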
Peterson, Leif E
2002-01-01
CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
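A brief sketch of the type of analysis CLUSFAVOR performs, here using SciPy rather than the program itself: gene profiles are standardized and then clustered by UPGMA (average linkage) with a correlation distance. The random expression matrix and cluster count are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)
expr = rng.normal(size=(30, 8))        # genes x arrays (hypothetical log ratios)

# Standardize each gene profile (CLUSFAVOR likewise standardizes input data)
z = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)

# UPGMA (average linkage) with correlation distance
d = pdist(z, metric="correlation")
tree = linkage(d, method="average")
labels = fcluster(tree, t=4, criterion="maxclust")
print(labels)
```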
24 CFR 35.1335 - Standard treatments.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities § 35.1335 Standard treatments. Standard... § 35.1330, unless it is found not to be a soil-lead hazard in accordance with § 35.1320(b). (e) Safe...
24 CFR 35.1335 - Standard treatments.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities § 35.1335 Standard treatments. Standard... § 35.1330, unless it is found not to be a soil-lead hazard in accordance with § 35.1320(b). (e) Safe...
24 CFR 35.1335 - Standard treatments.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities § 35.1335 Standard treatments. Standard... § 35.1330, unless it is found not to be a soil-lead hazard in accordance with § 35.1320(b). (e) Safe...
24 CFR 35.1335 - Standard treatments.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities § 35.1335 Standard treatments. Standard... § 35.1330, unless it is found not to be a soil-lead hazard in accordance with § 35.1320(b). (e) Safe...
Ionization-Assisted Getter Pumping for Ultra-Stable Trapped Ion Frequency Standards
NASA Technical Reports Server (NTRS)
Tjoelker, Robert L.; Burt, Eric A.
2010-01-01
A method eliminates (or recovers from) residual methane buildup in getter-pumped atomic frequency standard systems by applying ionizing assistance. Ultra-high stability trapped ion frequency standards for applications requiring very high reliability, and/or low power and mass (both for ground-based and space-based platforms) benefit from using sealed vacuum systems. These systems require careful material selection and system processing (cleaning and high-temperature bake-out). Even under the most careful preparation, residual hydrogen outgassing from vacuum chamber walls typically limits the base pressure. Non-evaporable getter pumps (NEGs) provide a convenient pumping option for sealed systems because of low mass and volume, and no power once activated. An ion gauge in conjunction with a NEG can be used to provide a low mass, low-power method for avoiding the deleterious effects of methane buildup in high-performance frequency standard vacuum systems.
Investigating a Judgemental Rank-Ordering Method for Maintaining Standards in UK Examinations
ERIC Educational Resources Information Center
Black, Beth; Bramley, Tom
2008-01-01
A new judgemental method of equating raw scores on two tests, based on rank-ordering scripts from both tests, has been developed by Bramley. The rank-ordering method has potential application as a judgemental standard-maintaining mechanism, because given a mark on one test (e.g. the A grade boundary mark), the equivalent mark (i.e. at the same…
Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás
2014-01-01
To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts that were produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold standard and competency questions based evaluation methods, respectively. In the last decades many methods for ontology construction and ontology evaluation have been proposed. However, none of them has become a standard and there is no empirical evidence of comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained for the previous studies based on the same data. Our results show a significant effect of the GoodOD training over developed ontologies by topics: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. The GoodOD guideline had a significant effect over the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies.
NASA Astrophysics Data System (ADS)
Leggett, Allison Gail Wilson
The No Child Left Behind (NCLB) Act of 2001 presented one of the most significant and comprehensive literacy reforms in many years (McDonnell, 2005; U.S. Department of Education, 2006). The era of school accountability and standards-based reform has brought many challenges and changes to public schools. Increasingly, public officials and educational administrators are asked to use standards-based assessments to make high-stakes decisions, such as whether a student will move on to the next grade level or receive a diploma (American Psychological Association, 2005). It is important to understand any shifts in teachers' perceptions and to identify the changes teachers are making as they implement standards-based reform. This mixed-methods study was designed to assess teachers' perceptions of changes related to standards-based reform as supported by Fullan's (2001) change theory and transformational leadership theory. Survey questions sought to identify teacher perceptions of changes in curriculum, instruction and daily practice as schools documented and incorporated standards-based reform and began focusing on preparing students for the California Standards Test in Science (CSTS). Descriptive statistical analysis and in-depth interviews showed favorable attitudes toward standards-based reform. The survey was distributed to 30 middle school science teachers from 10 low-performing schools in Los Angeles, California. Results were analyzed using Spearman rank-order correlations. Interviews were conducted with middle school teachers representing each grade level. Teachers who receive more support from administrators have more positive attitudes toward all aspects of SBR and the CSTS as measured in this study. No school should overlook the potential of a supportive administration in its effort to improve school programs.
Zhang, Dabing; Guo, Jinchao
2011-07-01
As the worldwide commercialization of genetically modified organisms (GMOs) increases and consumers grow concerned about the safety of GMOs, many countries and regions are issuing labeling regulations for GMOs and their products. Analytical methods and their standardization for GM ingredients in foods and feed are essential for the implementation of labeling regulations. To date, GMO testing methods have mainly been based on the inserted DNA sequences and the newly produced proteins in GMOs. This paper presents an overview of GMO testing methods as well as their standardization. © 2011 Institute of Botany, Chinese Academy of Sciences.
ERIC Educational Resources Information Center
Fowell, S. L.; Fewtrell, R.; McLaughlin, P. J.
2008-01-01
Absolute standard setting procedures are recommended for assessment in medical education. Absolute, test-centred standard setting procedures were introduced for written assessments in the Liverpool MBChB in 2001. The modified Angoff and Ebel methods have been used for short answer question-based and extended matching question-based papers,…
A simple web-based tool to compare freshwater fish data collected using AFS standard methods
Bonar, Scott A.; Mercado-Silva, Norman; Rahr, Matt; Torrey, Yuta T.; Cate, Averill
2016-01-01
The American Fisheries Society (AFS) recently published Standard Methods for Sampling North American Freshwater Fishes. Enlisting the expertise of 284 scientists from 107 organizations throughout Canada, Mexico, and the United States, this text was developed to facilitate comparisons of fish data across regions or time. Here we describe a user-friendly web tool that automates among-sample comparisons in individual fish condition, population length-frequency distributions, and catch per unit effort (CPUE) data collected using AFS standard methods. Currently, the web tool (1) provides instantaneous summaries of almost 4,000 data sets of condition, length frequency, and CPUE of common freshwater fishes collected using standard gears in 43 states and provinces; (2) is easily appended with new standardized field data to update subsequent queries and summaries; (3) compares fish data from a particular water body with continent, ecoregion, and state data summaries; and (4) provides additional information about AFS standard fish sampling including benefits, ongoing validation studies, and opportunities to comment on specific methods. The web tool—programmed in a PHP-based Drupal framework—was supported by several AFS Sections, agencies, and universities and is freely available from the AFS website and fisheriesstandardsampling.org. With widespread use, the online tool could become an important resource for fisheries biologists.
Selection of reference standard during method development using the analytical hierarchy process.
Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun
2015-03-25
A reference standard is critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of the reference standard during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia Miltiorrhiza and its preparations by using ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility to obtain, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, and rosmarinic acid, with about 79.8% of that priority, is the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed comprehensive consideration of the benefits and risks of the alternatives. It was an effective and practical tool for optimization of reference standards during method development. Copyright © 2015 Elsevier B.V. All rights reserved.
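For illustration, the standard AHP priority computation takes a reciprocal pairwise comparison matrix of the criteria and derives the weights from its principal eigenvector, together with a consistency check. The 3x3 matrix below is a made-up example; the paper used six criteria with its own expert judgments.

```python
import numpy as np

# Illustrative pairwise comparison matrix for three criteria on Saaty's 1-9 scale
# (reciprocal matrix; not the judgments used in the paper).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # priority vector (criterion weights)

# Consistency ratio CR = CI / RI, with RI = 0.58 for a 3x3 matrix
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", np.round(w, 3), "CR:", round(ci / 0.58, 3))
```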
Quantitative data standardization of X-ray based densitometry methods
NASA Astrophysics Data System (ADS)
Sergunova, K. A.; Petraikin, A. V.; Petrjajkin, F. A.; Akhmad, K. S.; Semenov, D. S.; Potrakhov, N. N.
2018-02-01
In the present work, the design of a special liquid phantom for assessing the accuracy of quantitative densitometric data is proposed. The dependences between the measured bone mineral density (BMD) values and the nominal values are also presented for different X-ray based densitometry techniques. The linear plots make it possible to introduce correction factors that increase the accuracy of BMD measurement by QCT, DXA and DECT methods, and to use them for standardization and comparison of measurements.
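A minimal sketch of how such correction factors could be derived, assuming phantom inserts of known (nominal) BMD measured by one technique and a simple linear fit; the numbers are hypothetical and not taken from the phantom described.

```python
import numpy as np

# Hypothetical phantom data: nominal BMD of the phantom inserts vs. the values
# measured by one densitometry technique (mg/cm^3)
nominal  = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
measured = np.array([46.0,  95.0, 193.0, 388.0, 779.0])

# Linear calibration: corrected = slope * measured + intercept
slope, intercept = np.polyfit(measured, nominal, 1)
print(f"corrected = {slope:.3f} * measured + {intercept:.1f}")

# Applying the correction to a later scan value
print(slope * 250.0 + intercept)
```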
Adaptive reconnection-based arbitrary Lagrangian Eulerian method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bo, Wurigen; Shashkov, Mikhail
We present a new adaptive Arbitrary Lagrangian Eulerian (ALE) method. This method is based on the reconnection-based ALE (ReALE) methodology of Refs. [35], [34] and [6]. The main elements in a standard ReALE method are: an explicit Lagrangian phase on an arbitrary polygonal (in 2D) mesh in which the solution and positions of grid nodes are updated; a rezoning phase in which a new grid is defined by changing the connectivity (using Voronoi tessellation) but not the number of cells; and a remapping phase in which the Lagrangian solution is transferred onto the new grid. Furthermore, in the standard ReALE method, the rezoned mesh is smoothed by using one or several steps toward centroidal Voronoi tessellation, but it is not adapted to the solution in any way.
Adaptive reconnection-based arbitrary Lagrangian Eulerian method
Bo, Wurigen; Shashkov, Mikhail
2015-07-21
We present a new adaptive Arbitrary Lagrangian Eulerian (ALE) method. This method is based on the reconnection-based ALE (ReALE) methodology of Refs. [35], [34] and [6]. The main elements in a standard ReALE method are: an explicit Lagrangian phase on an arbitrary polygonal (in 2D) mesh in which the solution and positions of grid nodes are updated; a rezoning phase in which a new grid is defined by changing the connectivity (using Voronoi tessellation) but not the number of cells; and a remapping phase in which the Lagrangian solution is transferred onto the new grid. Furthermore, in the standard ReALE method, the rezoned mesh is smoothed by using one or several steps toward centroidal Voronoi tessellation, but it is not adapted to the solution in any way.
Applying open source data visualization tools to standard based medical data.
Kopanitsa, Georgy; Taranik, Maxim
2014-01-01
Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The diverse backgrounds of patients, especially elderly people, require simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. Application of standard-based medical data allows development of generic visualization methods. Focusing on the deployment of open source tools, in this paper we applied JavaScript libraries to create data presentations for standard-based medical data.
Phillips, Melissa M.; Bedner, Mary; Gradl, Manuela; Burdette, Carolyn Q.; Nelson, Michael A.; Yen, James H.; Sander, Lane C.; Rimmer, Catherine A.
2017-01-01
Two independent analytical approaches, based on liquid chromatography with absorbance detection and liquid chromatography with mass spectrometric detection, have been developed for determination of isoflavones in soy materials. These two methods yield comparable results for a variety of soy-based foods and dietary supplements. Four Standard Reference Materials (SRMs) have been produced by the National Institute of Standards and Technology to assist the food and dietary supplement community in method validation and have been assigned values for isoflavone content using both methods. These SRMs include SRM 3234 Soy Flour, SRM 3236 Soy Protein Isolate, SRM 3237 Soy Protein Concentrate, and SRM 3238 Soy-Containing Solid Oral Dosage Form. A fifth material, SRM 3235 Soy Milk, was evaluated using the methods and found to be inhomogeneous for isoflavones and unsuitable for value assignment. PMID:27832301
Method and platform standardization in MRM-based quantitative plasma proteomics.
Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H
2013-12-16
There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.
NASA Astrophysics Data System (ADS)
Chen, Ming-Chih; Hsiao, Shen-Fu
In this paper, we propose an area-efficient design of an Advanced Encryption Standard (AES) processor by applying a new common-expression-elimination (CSE) method to the sub-functions of the various transformations required in AES. The proposed method reduces the area cost of realizing the sub-functions by extracting the common factors in the bit-level XOR/AND-based sum-of-product expressions of these sub-functions using a new CSE algorithm. Cell-based implementation results show that the AES processor with our proposed CSE method achieves a significant area improvement compared with previous designs.
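To illustrate the flavor of common-expression elimination on bit-level XOR expressions (not the authors' algorithm), the sketch below greedily extracts the most frequently shared input pair into an intermediate signal so that one XOR gate can be reused across outputs; the expressions are toy examples.

```python
from itertools import combinations
from collections import Counter

# Toy XOR expressions (each output is the XOR of the listed input signals),
# loosely in the spirit of the bit-level expressions in AES transformations.
exprs = {
    "y0": {"x0", "x1", "x3"},
    "y1": {"x0", "x1", "x2"},
    "y2": {"x1", "x3", "x4"},
}

def greedy_cse(exprs):
    new_id = 0
    while True:
        # Count how often each pair of terms co-occurs across the expressions
        pairs = Counter()
        for terms in exprs.values():
            for p in combinations(sorted(terms), 2):
                pairs[p] += 1
        if not pairs or pairs.most_common(1)[0][1] < 2:
            return exprs
        (a, b), _ = pairs.most_common(1)[0]
        t = f"t{new_id}"; new_id += 1
        exprs[t] = {a, b}                      # shared intermediate XOR gate
        for name, terms in exprs.items():
            if name != t and {a, b} <= terms:
                exprs[name] = (terms - {a, b}) | {t}

print(greedy_cse(exprs))   # x0 ^ x1 is factored into a shared signal t0
```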
NASA Astrophysics Data System (ADS)
Pfefer, Joshua; Agrawal, Anant
2012-03-01
In recent years there has been increasing interest in development of consensus, tissue-phantom-based approaches for assessment of biophotonic imaging systems, with the primary goal of facilitating clinical translation of novel optical technologies. Well-characterized test methods based on tissue phantoms can provide useful tools for performance assessment, thus enabling standardization and device inter-comparison during preclinical development as well as quality assurance and re-calibration in the clinical setting. In this review, we study the role of phantom-based test methods as described in consensus documents such as international standards for established imaging modalities including X-ray CT, MRI and ultrasound. Specifically, we focus on three image quality characteristics - spatial resolution, spatial measurement accuracy and image uniformity - and summarize the terminology, metrics, phantom design/construction approaches and measurement/analysis procedures used to assess these characteristics. Phantom approaches described are those in routine clinical use and tend to have simplified morphology and biologically-relevant physical parameters. Finally, we discuss the potential for applying knowledge gained from existing consensus documents in the development of standardized, phantom-based test methods for optical coherence tomography.
Joseph, Leena; Das, A P; Ravindra, Anuradha; Kulkarni, D B; Kulkarni, M S
2018-07-01
The 4πβ-γ coincidence method is a powerful and widely used method to determine the absolute activity concentration of radioactive solutions. A new automated liquid scintillator-based coincidence system has been designed, developed, tested and established as an absolute standard for radioactivity measurements. The automation is achieved using a PLC (programmable logic controller) and SCADA (supervisory control and data acquisition). A radioactive solution of 60Co was standardized to compare the performance of the automated system with the proportional counter-based absolute standard maintained in the laboratory. The activity concentrations determined using these two systems were in very good agreement; the new automated system can be used for absolute measurement of the activity concentration of radioactive solutions. Copyright © 2018. Published by Elsevier Ltd.
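Under ideal conditions (no background, dead-time or decay-scheme corrections), the textbook 4πβ-γ coincidence relation gives the activity directly from the beta, gamma and coincidence count rates; the sketch below codes this idealized relation with hypothetical rates and is not a description of the automated system's data processing.

```python
# Idealized 4π beta-gamma coincidence relation: corrections for background,
# dead time and decay-scheme effects are neglected here, so the source
# activity follows from the beta, gamma and coincidence count rates alone.
def coincidence_activity(r_beta, r_gamma, r_coinc):
    """Activity in Bq from count rates in s^-1."""
    return r_beta * r_gamma / r_coinc

# Hypothetical rates for a 60Co solution aliquot
print(coincidence_activity(r_beta=4500.0, r_gamma=820.0, r_coinc=750.0))  # ~4920 Bq
```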
Quan, Hui; Zhang, Ji
2003-09-15
Analyses of study variables are frequently based on log transformations. To calculate the power for detecting the between-treatment difference in the log scale, we need an estimate of the standard deviation of the log-transformed variable. However, in many situations a literature search only provides the arithmetic means and the corresponding standard deviations. Without individual log-transformed data to directly calculate the sample standard deviation, we need alternative methods to estimate it. This paper presents methods for estimating and constructing confidence intervals for the standard deviation of a log-transformed variable given the mean and standard deviation of the untransformed variable. It also presents methods for estimating the standard deviation of change from baseline in the log scale given the means and standard deviations of the untransformed baseline value, on-treatment value and change from baseline. Simulations and examples are provided to assess the performance of these estimates. Copyright 2003 John Wiley & Sons, Ltd.
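One common way to obtain such an estimate, assuming the untransformed variable is approximately lognormal, is the closed-form relation between the coefficient of variation on the original scale and the standard deviation on the log scale; the sketch below codes this generic relation, which may differ from the exact estimators developed in the paper.

```python
import math

def log_scale_sd(mean, sd):
    """SD of ln(X), assuming X is lognormal with the given arithmetic mean and SD."""
    cv2 = (sd / mean) ** 2
    return math.sqrt(math.log(1.0 + cv2))

def log_scale_mean(mean, sd):
    """Mean of ln(X) under the same lognormal assumption."""
    return math.log(mean) - 0.5 * math.log(1.0 + (sd / mean) ** 2)

# Example: a literature report gives mean 12.0 and SD 6.0 on the original scale
print(log_scale_mean(12.0, 6.0), log_scale_sd(12.0, 6.0))
```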
NASA Technical Reports Server (NTRS)
Hughitt, Brian; Generazio, Edward (Principal Investigator); Nichols, Charles; Myers, Mika (Principal Investigator); Spencer, Floyd (Principal Investigator); Waller, Jess (Principal Investigator); Wladyka, Jordan (Principal Investigator); Aldrin, John; Burke, Eric; Cerecerez, Laura;
2016-01-01
NASA-STD-5009 requires that successful flaw detection by NDE methods be statistically qualified for use on fracture critical metallic components, but does not standardize practices. This task works towards standardizing calculations and record retention with a web-based tool, the NNWG POD Standards Library or NPSL. Test methods will also be standardized with an appropriately flexible appendix to -5009 identifying best practices. Additionally, this appendix will describe how specimens used to qualify NDE systems will be cataloged, stored and protected from corrosion, damage, or loss.
40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?
Code of Federal Regulations, 2010 CFR
2010-07-01
... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... test method documentation, including a description of the technology and/or instrumentation that makes... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...
40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?
Code of Federal Regulations, 2011 CFR
2011-07-01
... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... test method documentation, including a description of the technology and/or instrumentation that makes... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...
The Development of MST Test Information for the Prediction of Test Performances
ERIC Educational Resources Information Center
Park, Ryoungsun; Kim, Jiseon; Chung, Hyewon; Dodd, Barbara G.
2017-01-01
The current study proposes novel methods to predict multistage testing (MST) performance without conducting simulations. This method, called MST test information, is based on analytic derivation of standard errors of ability estimates across theta levels. We compared standard errors derived analytically to the simulation results to demonstrate the…
A low noise and ultra-narrow bandwidth frequency-locked loop based on the beat method.
Gao, Wei; Sui, Jianping; Chen, Zhiyong; Yu, Fang; Sheng, Rongwu
2011-06-01
A novel frequency-locked loop (FLL) based on the beat method is proposed in this paper. Compared with other frequency feedback loops, this FLL is a digital loop with simple structure and very low noise. As shown in the experimental results, this FLL can be used to reduce close-in phase noise on atomic frequency standards, through which a composite frequency standard with ultra-low phase noise and low cost can be easily realized.
Schiffman, Eric L.; Truelove, Edmond L.; Ohrbach, Richard; Anderson, Gary C.; John, Mike T.; List, Thomas; Look, John O.
2011-01-01
AIMS: The purpose of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Validation Project was to assess the diagnostic validity of this examination protocol. An overview is presented, including Axis I and II methodology and descriptive statistics for the study participant sample. This paper details the development of reliable methods to establish the reference standards for assessing criterion validity of the Axis I RDC/TMD diagnoses. Validity testing for the Axis II biobehavioral instruments was based on previously validated reference standards. METHODS: The Axis I reference standards were based on the consensus of 2 criterion examiners independently performing a comprehensive history, clinical examination, and evaluation of imaging. Intersite reliability was assessed annually for criterion examiners and radiologists. Criterion exam reliability was also assessed within study sites. RESULTS: Study participant demographics were comparable to those of participants in previous studies using the RDC/TMD. Diagnostic agreement of the criterion examiners with each other and with the consensus-based reference standards was excellent, with all kappas ≥ 0.81, except for osteoarthrosis (moderate agreement, k = 0.53). Intrasite criterion exam agreement with reference standards was excellent (k ≥ 0.95). Intersite reliability of the radiologists for detecting computed tomography-disclosed osteoarthrosis and magnetic resonance imaging-disclosed disc displacement was good to excellent (k = 0.71 and 0.84, respectively). CONCLUSION: The Validation Project study population was appropriate for assessing the reliability and validity of the RDC/TMD Axis I and II. The reference standards used to assess the validity of Axis I TMD were based on reliable and clinically credible methods. PMID:20213028
Zhang, Donglu; Raghavan, Nirmala; Chando, Theodore; Gambardella, Janice; Fu, Yunlin; Zhang, Duxi; Unger, Steve E; Humphreys, W Griffith
2007-12-01
An LC-MS/MS-based approach that employs authentic radioactive metabolites as reference standards was developed to estimate metabolite exposures in early drug development studies. This method is useful for estimating metabolite levels in studies done with non-radiolabeled compounds where metabolite standards are not available to allow standard LC-MS/MS assay development. A metabolite mixture obtained from an in vivo source treated with a radiolabeled compound was partially purified, quantified, and spiked into human plasma to provide metabolite standard curves. Metabolites were analyzed by LC-MS/MS using the specific mass transitions and an internal standard. The metabolite concentrations determined by this approach were found to be comparable to those determined by valid LC-MS/MS assays. This approach does not require synthesis of authentic metabolites or knowledge of the exact structures of metabolites, and therefore should provide a useful method to obtain early estimates of circulating metabolites in early clinical or toxicological studies.
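The sketch below illustrates the general idea of calibrating against a spiked, radioactivity-quantified metabolite and back-calculating unknown concentrations from a linear standard curve; it is a hedged simplification with hypothetical concentrations, area ratios and a plain linear fit, not the authors' validated assay.

```python
import numpy as np

# Hypothetical standard curve: the partially purified radioactive metabolite
# (concentration assigned from its radioactivity) spiked into blank human plasma,
# with analyte/internal-standard peak-area ratios measured by LC-MS/MS.
conc_std = np.array([1.0, 5.0, 10.0, 50.0, 100.0])        # ng/mL (hypothetical)
ratio_std = np.array([0.021, 0.105, 0.208, 1.04, 2.09])    # area ratios (hypothetical)

slope, intercept = np.polyfit(conc_std, ratio_std, 1)      # simple linear calibration

def metabolite_concentration(area_ratio):
    """Back-calculate a metabolite concentration from an observed area ratio."""
    return (area_ratio - intercept) / slope

print(f"study sample ≈ {metabolite_concentration(0.52):.1f} ng/mL")
```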
Evaluation of MLACF based calculated attenuation brain PET imaging for FDG patient studies
NASA Astrophysics Data System (ADS)
Bal, Harshali; Panin, Vladimir Y.; Platsch, Guenther; Defrise, Michel; Hayden, Charles; Hutton, Chloe; Serrano, Benjamin; Paulmier, Benoit; Casey, Michael E.
2017-04-01
Calculating attenuation correction for brain PET imaging rather than using CT presents opportunities for low radiation dose applications such as pediatric imaging and serial scans to monitor disease progression. Our goal is to evaluate the iterative time-of-flight based maximum-likelihood activity and attenuation correction factors estimation (MLACF) method for clinical FDG brain PET imaging. FDG PET/CT brain studies were performed in 57 patients using the Biograph mCT (Siemens) four-ring scanner. The time-of-flight PET sinograms were acquired using the standard clinical protocol consisting of a CT scan followed by 10 min of single-bed PET acquisition. Images were reconstructed using CT-based attenuation correction (CTAC) and used as a gold standard for comparison. Two methods were compared with respect to CTAC: a calculated brain attenuation correction (CBAC) and MLACF based PET reconstruction. Plane-by-plane scaling was performed for MLACF images in order to fix the variable axial scaling observed. The noise structure of the MLACF images was different compared to those obtained using CTAC and the reconstruction required a higher number of iterations to obtain comparable image quality. To analyze the pooled data, each dataset was registered to a standard template and standard regions of interest were extracted. An SUVr analysis of the brain regions of interest showed that CBAC and MLACF were each well correlated with CTAC SUVrs. A plane-by-plane error analysis indicated that there were local differences for both CBAC and MLACF images with respect to CTAC. Mean relative error in the standard regions of interest was less than 5% for both methods and the mean absolute relative errors for both methods were similar (3.4% ± 3.1% for CBAC and 3.5% ± 3.1% for MLACF). However, the MLACF method recovered activity adjoining the frontal sinus regions more accurately than CBAC method. The use of plane-by-plane scaling of MLACF images was found to be a crucial step in order to obtain improved activity estimates. Presence of local errors in both MLACF and CBAC based reconstructions would require the use of a normal database for clinical assessment. However, further work is required in order to assess the clinical advantage of MLACF over CBAC based method.
Simple method to detect triacylglycerol biosynthesis in a yeast-based recombinant system
USDA-ARS?s Scientific Manuscript database
Standard methods to quantify the activity of triacylglycerol (TAG) synthesizing enzymes DGAT and PDAT (TAG-SE) require a sensitive but rather arduous laboratory assay based on radio-labeled substrates. Here we describe two straightforward methods to detect TAG production in baker’s yeast Saccharomyc...
Many PCR-based methods for microbial source tracking (MST) have been developed and validated within individual research laboratories. Inter-laboratory validation of these methods, however, has been minimal, and the effects of protocol standardization regimes have not been thor...
Exploiting salient semantic analysis for information retrieval
NASA Astrophysics Data System (ADS)
Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui
2016-11-01
Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representations for words or documents. However, its feasibility and effectiveness in information retrieval are largely unknown. In this paper, we study how to use SSA efficiently to improve information retrieval performance, and propose an SSA-based retrieval method under the language model framework. First, the SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations are used in combination to estimate the language models of queries and documents. The proposed method is evaluated on several standard Text REtrieval Conference (TREC) collections; the results show that the proposed models consistently outperform existing Wikipedia-based retrieval methods.
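The abstract does not specify how the two representations are combined; as one plausible sketch, the snippet below mixes a BOW unigram model with a concept-based model by simple linear interpolation (Jelinek-Mercer style) and scores a query by query likelihood. All distributions and the mixing weight are illustrative assumptions, not the paper's model.

```python
import math

def mixed_language_model(p_bow, p_concept, lam=0.7):
    """P(w|d) as a lambda-weighted mixture of BOW and concept-based unigram models."""
    vocab = set(p_bow) | set(p_concept)
    return {w: lam * p_bow.get(w, 0.0) + (1.0 - lam) * p_concept.get(w, 0.0)
            for w in vocab}

def query_log_likelihood(query_terms, doc_model, epsilon=1e-9):
    """Query-likelihood score of a document model (epsilon avoids log(0))."""
    return sum(math.log(doc_model.get(t, 0.0) + epsilon) for t in query_terms)

# Toy, partial unigram distributions for one document (values are illustrative only):
p_bow = {"retrieval": 0.05, "wikipedia": 0.02, "language": 0.03}
p_concept = {"retrieval": 0.04, "semantics": 0.03, "corpus": 0.01}
doc_model = mixed_language_model(p_bow, p_concept)
print(query_log_likelihood(["retrieval", "semantics"], doc_model))
```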
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Jaromy; Sun Zaijing; Wells, Doug
2009-03-10
Photon activation analysis detected elements in two NIST standards that did not have reported concentration values. A method is currently being developed to infer these concentrations by using scaling parameters and the appropriate known quantities within the NIST standard itself. Scaling parameters include: threshold, peak and endpoint energies; photo-nuclear cross sections for specific isotopes; Bremsstrahlung spectrum; target thickness; and photon flux. Photo-nuclear cross sections and energies for the unknown elements must also be known. With these quantities, the same integral was performed for both the known and unknown elements, resulting in an inference of the concentration of the unreported element based on the reported value. Since Rb and Mn were elements that were reported in the standards, and because they had well-identified peaks, they were used as the standards of inference to determine concentrations of the unreported elements As, I, Nb, Y, and Zr. This method was tested by choosing other known elements within the standards and inferring a value based on the stated procedure. The reported value of Mn in the first NIST standard was 403 ± 15 ppm and the reported value of Ca in the second NIST standard was 87000 ppm (no reported uncertainty). The inferred concentrations were 370 ± 23 ppm and 80200 ± 8700 ppm, respectively.
Zhang, X; Platt, R W; Cnattingius, S; Joseph, K S; Kramer, M S
2007-04-01
The objective of this study was to critically examine potential artifacts and biases underlying the use of 'customised' standards of birthweight for gestational age (GA). Design: Population-based cohort study. Setting: Sweden. Population: A total of 782,303 singletons of ≥28 weeks of gestation born in 1992-2001 to Nordic mothers with complete data on birthweight; GA; and maternal age, parity, height, and pre-pregnancy weight. Methods: We compared perinatal mortality in four groups of infants based on the following classification of small for gestational age (SGA): non-SGA based on either population-based or customised standards (the reference group), SGA based on the population-based standard only, SGA based on the customised standard only, and SGA according to both standards. We used graphical methods to compare GA-specific birthweight cutoffs for SGA using the two standards and also used logistic regression to control for differences in GA and maternal pre-pregnancy body mass index (BMI) in the four groups. Main outcome measures: Perinatal mortality, including stillbirth and neonatal death. Results: Customisation led to a large artifactual increase in the proportion of SGA infants born preterm. Adjustment for differences in GA and maternal BMI markedly reduced the excess risk among infants classified as SGA by customised standards only. Conclusions: The large increase in perinatal mortality risk among infants classified as SGA based on customised standards is largely an artifact due to inclusion of more preterm births.
Winterfield, Craig; van de Voort, F R
2014-12-01
The Fluid Life Corporation assessed and implemented Fourier transform infrared spectroscopy (FTIR)-based methods using American Society for Testing and Materials (ASTM)-like stoichiometric reactions for determination of acid and base number for in-service mineral-based oils. The basic protocols, quality control procedures, calibration, validation, and performance of these new quantitative methods are assessed. ASTM correspondence is attained using a mixed-mode calibration, using primary reference standards to anchor the calibration, supplemented by representative sample lubricants analyzed by ASTM procedures. A partial least squares calibration is devised by combining primary acid/base reference standards and representative samples, focusing on the main spectral stoichiometric response with chemometrics assisting in accounting for matrix variability. FTIR(AN/BN) methodology is precise, accurate, and free of most interference that affects ASTM D664 and D4739 results. Extensive side-by-side operational runs produced normally distributed differences with mean differences close to zero and standard deviations of 0.18 and 0.26 mg KOH/g, respectively. Statistically, the FTIR methods are a direct match to the ASTM methods, with superior performance in terms of analytical throughput, preparation time, and solvent use. FTIR(AN/BN) analysis is a viable, significant advance for in-service lubricant analysis, providing an economic means of trending samples instead of tedious and expensive conventional ASTM(AN/BN) procedures. © 2014 Society for Laboratory Automation and Screening.
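A hedged sketch of the kind of partial least squares calibration described, using scikit-learn's PLSRegression; the spectra, acid-number values, latent-variable count and cross-validation scheme below are placeholders, not the Fluid Life calibration itself.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Placeholder calibration set: rows are FTIR spectra over the stoichiometric response
# region, y is the acid number assigned from primary reference standards or from
# ASTM D664 analysis of representative in-service lubricants.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 400))          # 60 spectra x 400 wavenumber points (placeholder)
y = rng.uniform(0.0, 4.0, size=60)      # acid number, mg KOH/g (placeholder)

pls = PLSRegression(n_components=6)     # latent-variable count chosen by validation in practice
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
rmsecv = float(np.sqrt(np.mean((y - y_cv) ** 2)))
print(f"RMSECV ≈ {rmsecv:.2f} mg KOH/g")

pls.fit(X, y)
an_unknown = pls.predict(X[:1])         # predicted acid number for a new spectrum
```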
David, Sophia; Mentasti, Massimo; Tewolde, Rediat; Aslett, Martin; Harris, Simon R; Afshar, Baharak; Underwood, Anthony; Fry, Norman K; Parkhill, Julian; Harrison, Timothy G
2016-08-01
Sequence-based typing (SBT), analogous to multilocus sequence typing (MLST), is the current "gold standard" typing method for investigation of legionellosis outbreaks caused by Legionella pneumophila. However, as common sequence types (STs) cause many infections, some investigations remain unresolved. In this study, various whole-genome sequencing (WGS)-based methods were evaluated according to published guidelines, including (i) a single nucleotide polymorphism (SNP)-based method, (ii) extended MLST using different numbers of genes, (iii) determination of gene presence or absence, and (iv) a kmer-based method. L. pneumophila serogroup 1 isolates (n = 106) from the standard "typing panel," previously used by the European Society for Clinical Microbiology Study Group on Legionella Infections (ESGLI), were tested together with another 229 isolates. Over 98% of isolates were considered typeable using the SNP- and kmer-based methods. Percentages of isolates with complete extended MLST profiles ranged from 99.1% (50 genes) to 86.8% (1,455 genes), while only 41.5% produced a full profile with the gene presence/absence scheme. Replicates demonstrated that all methods offer 100% reproducibility. Indices of discrimination range from 0.972 (ribosomal MLST) to 0.999 (SNP based), and all values were higher than that achieved with SBT (0.940). Epidemiological concordance is generally inversely related to discriminatory power. We propose that an extended MLST scheme with ∼50 genes provides optimal epidemiological concordance while substantially improving the discrimination offered by SBT and can be used as part of a hierarchical typing scheme that should maintain backwards compatibility and increase discrimination where necessary. This analysis will be useful for the ESGLI to design a scheme that has the potential to become the new gold standard typing method for L. pneumophila. Copyright © 2016 David et al.
Chiu, Huai-Hsuan; Liao, Hsiao-Wei; Shao, Yu-Yun; Lu, Yen-Shen; Lin, Ching-Hung; Tsai, I-Lin; Kuo, Ching-Hua
2018-08-17
Monoclonal antibody (mAb) drugs have generated much interest in recent years for treating various diseases. Immunoglobulin G (IgG) represents a high percentage of mAb drugs that have been approved by the Food and Drug Administration (FDA). To facilitate therapeutic drug monitoring and pharmacokinetic/pharmacodynamic studies, we developed a general liquid chromatography-tandem mass spectrometry (LC-MS/MS) method to quantify the concentration of IgG-based mAbs in human plasma. Three IgG-based drugs (bevacizumab, nivolumab and pembrolizumab) were selected to demonstrate our method. Protein G beads were used for sample pretreatment due to their universal ability to trap IgG-based drugs. Surrogate peptides obtained after trypsin digestion were quantified by LC-MS/MS. To correct for sample preparation errors and matrix effects that occur during LC-MS/MS analysis, we used a two-internal-standard (IS) method that includes the IgG-based drug IS tocilizumab and a post-column infused IS. Using two internal standards was found to effectively improve quantification accuracy, which was within 15% for all mAb drugs tested at three different concentrations. This general method was validated in terms of its precision, accuracy, linearity and sensitivity for the 3 demonstration mAb drugs. The successful application of the method to clinical samples demonstrated its applicability in clinical analysis. It is anticipated that this general method could be applied to other mAb-based drugs for use in precision medicine and clinical studies. Copyright © 2018 Elsevier B.V. All rights reserved.
Pang, Susan; Cowen, Simon
2017-12-13
We describe a novel generic method to derive the unknown endogenous concentrations of analyte within complex biological matrices (e.g. serum or plasma) based upon the relationship between the immunoassay signal response of a biological test sample spiked with known analyte concentrations and the log transformed estimated total concentration. If the estimated total analyte concentration is correct, a portion of the sigmoid on a log-log plot is very close to linear, allowing the unknown endogenous concentration to be estimated using a numerical method. This approach obviates conventional relative quantification using an internal standard curve and need for calibrant diluent, and takes into account the individual matrix interference on the immunoassay by spiking the test sample itself. This technique is based on standard additions for chemical analytes. Unknown endogenous analyte concentrations within even 2-fold diluted human plasma may be determined reliably using as few as four reaction wells.
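One way such a numerical search could look, under the stated assumption that the correct total concentration linearizes the log-log response, is sketched below; the spike levels, responses and the choice of 1 − R² as the objective are illustrative assumptions rather than the authors' exact algorithm.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical data: the same plasma sample spiked at several known analyte levels,
# with the corresponding immunoassay responses (arbitrary units).
spike = np.array([10.0, 30.0, 100.0, 300.0])    # added analyte, pg/mL (hypothetical)
signal = np.array([0.42, 0.55, 0.83, 1.31])     # assay response (hypothetical)

def nonlinearity(endogenous):
    """1 - R^2 of a straight-line fit of log(signal) vs log(endogenous + spike)."""
    x = np.log(endogenous + spike)
    y = np.log(signal)
    r = np.corrcoef(x, y)[0, 1]
    return 1.0 - r ** 2

# Numerical search for the endogenous concentration that makes the log-log plot most linear.
result = minimize_scalar(nonlinearity, bounds=(1e-3, 1e4), method="bounded")
print(f"estimated endogenous concentration ≈ {result.x:.1f} pg/mL")
```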
A framework for automatic creation of gold-standard rigid 3D-2D registration datasets.
Madan, Hennadii; Pernuš, Franjo; Likar, Boštjan; Špiclin, Žiga
2017-02-01
Advanced image-guided medical procedures incorporate 2D intra-interventional information into pre-interventional 3D image and plan of the procedure through 3D/2D image registration (32R). To enter clinical use, and even for publication purposes, novel and existing 32R methods have to be rigorously validated. The performance of a 32R method can be estimated by comparing it to an accurate reference or gold standard method (usually based on fiducial markers) on the same set of images (gold standard dataset). Objective validation and comparison of methods are possible only if evaluation methodology is standardized, and the gold standard dataset is made publicly available. Currently, very few such datasets exist and only one contains images of multiple patients acquired during a procedure. To encourage the creation of gold standard 32R datasets, we propose an automatic framework. The framework is based on rigid registration of fiducial markers. The main novelty is spatial grouping of fiducial markers on the carrier device, which enables automatic marker localization and identification across the 3D and 2D images. The proposed framework was demonstrated on clinical angiograms of 20 patients. Rigid 32R computed by the framework was more accurate than that obtained manually, with the respective target registration error below 0.027 mm compared to 0.040 mm. The framework is applicable for gold standard setup on any rigid anatomy, provided that the acquired images contain spatially grouped fiducial markers. The gold standard datasets and software will be made publicly available.
White, Donald J; Schneiderman, Eva; Colón, Ellen; St John, Samuel
2015-01-01
This paper describes the development and standardization of a profilometry-based method for assessment of dentifrice abrasivity called Radioactive Dentin Abrasivity - Profilometry Equivalent (RDA-PE). Human dentin substrates are mounted in acrylic blocks of precise standardized dimensions, permitting mounting and brushing in V8 brushing machines. Dentin blocks are masked to create an area of "contact brushing." Brushing is carried out in V8 brushing machines and dentifrices are tested as slurries. An abrasive standard is prepared by diluting the ISO 11609 abrasivity reference calcium pyrophosphate abrasive into carboxymethyl cellulose/glycerin, just as in the RDA method. Following brushing, masked areas are removed and profilometric analysis is carried out on treated specimens. Assessments of average abrasion depth (contact or optical profilometry) are made. Inclusion of the standard calcium pyrophosphate abrasive permits a direct RDA-equivalent assessment of abrasion, characterized with profilometry as (Depth_test / Depth_control) × 100. Within the test, the maximum abrasivity standard of 250 can be created in situ simply by including a treatment group of standard abrasive with 2.5× the number of brushing strokes. RDA-PE is enabled in large part by the availability of easy-to-use and well-standardized modern profilometers, but its use in V8 brushing machines is enabled by the specific conditions described herein. RDA-PE permits the evaluation of dentifrice abrasivity to dentin without the requirement of irradiated teeth or the infrastructure for handling them. In direct comparisons, the RDA-PE method provides dentifrice abrasivity assessments comparable to the industry gold-standard RDA technique.
NASA Astrophysics Data System (ADS)
Jin, Yang; Ciwei, Gao; Jing, Zhang; Min, Sun; Jie, Yu
2017-05-01
Selecting and evaluating priority domains for Global Energy Internet standard development helps overcome the limits of national investment, so that priority can be given to standardizing the technical areas of highest urgency and feasibility. In this paper, a Delphi survey process based on technology foresight is put forward, an evaluation index system for priority domains is established, and the index calculation method is determined. Statistical methods are then used to evaluate the alternative domains. Finally, the top four priority domains are determined as follows: Interconnected Network Planning and Simulation Analysis, Interconnected Network Safety Control and Protection, Intelligent Power Transmission and Transformation, and Internet of Things.
Valkenborg, Dirk; Baggerman, Geert; Vanaerschot, Manu; Witters, Erwin; Dujardin, Jean-Claude; Burzykowski, Tomasz; Berg, Maya
2013-01-01
Combining liquid chromatography-mass spectrometry (LC-MS)-based metabolomics experiments that were collected over a long period of time remains problematic due to systematic variability between LC-MS measurements. Until now, most normalization methods for LC-MS data have been model-driven, based on internal standards or intermediate quality control runs, where an external model is extrapolated to the dataset of interest. In the first part of this article, we evaluate several existing data-driven normalization approaches on LC-MS metabolomics experiments, which do not require the use of internal standards. According to variability measures, each normalization method performs relatively well, showing that the use of any normalization method will greatly improve data analysis originating from multiple experimental runs. In the second part, we apply cyclic-Loess normalization to a Leishmania sample. This normalization method allows the removal of systematic variability between two measurement blocks over time and maintains the differential metabolites. In conclusion, normalization allows for pooling datasets from different measurement blocks over time and increases the statistical power of the analysis, hence paving the way to increase the scale of LC-MS metabolomics experiments. From our investigation, we recommend data-driven normalization methods over model-driven normalization methods if only a few internal standards were used. Moreover, data-driven normalization methods are the best option to normalize datasets from untargeted LC-MS experiments. PMID:23808607
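A much-simplified sketch of pairwise cyclic-loess normalization on log-scaled intensities, using the lowess smoother from statsmodels, is shown below; the iteration scheme, smoothing span and synthetic data are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def cyclic_loess(log_intensities, n_iter=3, frac=0.4):
    """Very simplified cyclic-loess normalization of a features x samples matrix
    of log-scaled LC-MS intensities (no internal standards required)."""
    x = log_intensities.copy()
    n_samples = x.shape[1]
    for _ in range(n_iter):
        for i in range(n_samples - 1):
            for j in range(i + 1, n_samples):
                m = x[:, i] - x[:, j]                 # log fold change between runs
                a = 0.5 * (x[:, i] + x[:, j])         # mean log intensity
                fit = lowess(m, a, frac=frac, return_sorted=False)
                x[:, i] -= fit / 2.0                  # split the fitted trend between both runs
                x[:, j] += fit / 2.0
    return x

# Hypothetical data: 500 features measured in 6 runs, with a systematic offset in run 3.
rng = np.random.default_rng(1)
data = rng.normal(20.0, 2.0, size=(500, 6))
data[:, 3] += 0.8
normalized = cyclic_loess(data)
print(np.round(normalized.mean(axis=0), 2))
```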
Tanuja, Penmatsa; Venugopal, Namburi; Sashidhar, Rao Beedu
2007-01-01
A simple thin-layer chromatography-digital image-based analytical method has been developed for the quantitation of the botanical pesticide azadirachtin. The method was validated by analyzing azadirachtin in spiked food matrixes and processed commercial pesticide formulations, using acidified vanillin reagent as a postchromatographic derivatizing agent. The separated azadirachtin was clearly identified as a green spot. The Rf value was found to be 0.55, which was similar to that of a reference standard. A standard calibration plot was established using a reference standard, based on linear regression analysis (r² = 0.996; y = 371.43 + 634.82x). The sensitivity of the method was found to be 0.875 microg azadirachtin. Spiking studies conducted at the 1 ppm (microg/g) level in various agricultural matrixes, such as brinjal, tomato, coffee, and cotton seeds, revealed recoveries of azadirachtin in the range of 67-92%. The azadirachtin content of commercial neem formulations analyzed by the method was in the range of 190-1825 ppm (microg/mL). Further, the present method was compared with an immunoanalytical method (enzyme-linked immunosorbent assay) developed earlier in our laboratory. Statistical comparison of the 2 methods, using Fisher's F-test, indicated no significant difference in variance, suggesting that both methods are comparable.
Schargus, Marc; Grehn, Franz; Glaucocard Workgroup
2008-12-01
To evaluate existing international IT-based ophthalmological medical data projects, and to define a glaucoma data set based on existing international standards of medical and ophthalmological documentation. To develop the technical environment for easy data mining and data exchange in different countries in Europe. Existing clinical and IT-based projects for documentation of medical data in general medicine and ophthalmology were analyzed to create new data sets for medical documentation in glaucoma patients. Different types of data transfer methods were evaluated to find the best method of data exchange between ophthalmologists in different European countries. Data sets from existing IT projects showed a wide variability in specifications, use of codes, terms and graphical data (perimetry, optic nerve analysis etc.) in glaucoma patients. New standardized digital datasets for glaucoma patients were defined, based on existing standards, which can be used by general ophthalmologists for follow-up examinations and for glaucoma specialists to perform teleconsultation, also across country borders. Datasets are available in different languages. Different types of data exchange methods using secure medical data transfer by internet, USB stick and smartcard were tested for different countries with regard to legal acceptance, practicability and technical realization (e.g. compatibility with EMR systems). By creating new standardized glaucoma specific cross-national datasets, it is now possible to develop an electronic glaucoma patient record system for data storage and transfer based on internet, smartcard or USB stick. The digital data can be used for referrals and for teleconsultation of glaucoma specialists in order to optimize glaucoma treatment. This should lead to an increase of quality in glaucoma care, and prevent expenses in health care costs by unnecessary repeated examinations.
NASA Astrophysics Data System (ADS)
Wu, Fan; Cao, Pin; Yang, Yongying; Li, Chen; Chai, Huiting; Zhang, Yihui; Xiong, Haoliang; Xu, Wenlin; Yan, Kai; Zhou, Lin; Liu, Dong; Bai, Jian; Shen, Yibing
2016-11-01
The inspection of surface defects is a significant part of optical surface quality evaluation. Based on microscopic scattering dark-field imaging, sub-aperture scanning and stitching, the Surface Defects Evaluating System (SDES) can acquire a full-aperture image of defects on an optical element's surface and then extract the geometric size and position of the defects with image processing such as feature recognition. However, optical distortion in the SDES degrades the precision of surface defect inspection. In this paper, a distortion correction algorithm based on a standard lattice pattern is proposed. Feature extraction, polynomial fitting and bilinear interpolation, in combination with adjacent sub-aperture stitching, are employed to correct the optical distortion of the SDES automatically and with high accuracy. Subsequently, to evaluate the surface defect information obtained from the SDES digitally against the American military standard MIL-PRF-13830B, a standard-based digital evaluation algorithm is proposed, which mainly includes a judgment method for surface defect concentration. The judgment method establishes a weight region for each defect and calculates defect concentration from the overlap of weight regions. The algorithm takes full advantage of the convenience of matrix operations and has the merits of low complexity and fast execution, making it well suited to high-efficiency inspection of surface defects. Finally, various experiments are conducted and the correctness of these algorithms is verified. At present, these algorithms are in use in the SDES.
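The snippet below sketches only the polynomial-fit-plus-bilinear-interpolation step of such a correction (detected lattice centroids mapped to their ideal positions, then image resampling); the coordinate conventions, polynomial degree and synthetic demo data are assumptions and do not reproduce the SDES pipeline.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def poly_terms(r, c, deg=3):
    """Design matrix of 2D polynomial terms r^i * c^j with i + j <= deg."""
    return np.column_stack([r**i * c**j
                            for i in range(deg + 1)
                            for j in range(deg + 1 - i)])

def correct_distortion(image, ideal_pts, observed_pts, deg=3):
    """Fit a polynomial mapping from ideal (row, col) lattice coordinates to the
    observed (distorted) coordinates, then resample by bilinear interpolation."""
    A = poly_terms(ideal_pts[:, 0], ideal_pts[:, 1], deg)
    coef_r, *_ = np.linalg.lstsq(A, observed_pts[:, 0], rcond=None)
    coef_c, *_ = np.linalg.lstsq(A, observed_pts[:, 1], rcond=None)
    rows, cols = np.mgrid[0:image.shape[0], 0:image.shape[1]].astype(float)
    T = poly_terms(rows.ravel(), cols.ravel(), deg)
    src = [(T @ coef_r).reshape(image.shape), (T @ coef_c).reshape(image.shape)]
    return map_coordinates(image, src, order=1, mode="nearest")   # order=1 -> bilinear

# Hypothetical demo: lattice dot centroids detected in the distorted image (observed_pts)
# and their known ideal positions on the standard lattice pattern (ideal_pts).
ideal_r, ideal_c = np.mgrid[8:64:8, 8:64:8]
ideal_pts = np.column_stack([ideal_r.ravel(), ideal_c.ravel()]).astype(float)
observed_pts = ideal_pts + 0.02 * (ideal_pts - 32.0)        # mild synthetic distortion
image = np.random.default_rng(0).random((64, 64))
corrected = correct_distortion(image, ideal_pts, observed_pts)
```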
Place-Based Pedagogy in the Era of Accountability: An Action Research Study
ERIC Educational Resources Information Center
Saracino, Peter C.
2010-01-01
Today's most common method of teaching biology--driven by calls for standardization and high-stakes testing--relies on a standards-based, de-contextualized approach to education. This results in "one size fits all" curriculums that ignore local contexts relevant to students' lives, discourage student engagement and ultimately work against a deep…
STANDARD REFERENCE MATERIALS FOR THE POLYMERS INDUSTRY.
McDonough, Walter G; Orski, Sara V; Guttman, Charles M; Migler, Kalman D; Beers, Kathryn L
2016-01-01
The National Institute of Standards and Technology (NIST) provides science, industry, and government with a central source of well-characterized materials certified for chemical composition or for some chemical or physical property. These materials are designated Standard Reference Materials ® (SRMs) and are used to calibrate measuring instruments, to evaluate methods and systems, or to produce scientific data that can be referred readily to a common base. In this paper, we discuss the history of polymer based SRMs, their current status, and challenges and opportunities to develop new standards to address industrial measurement challenges.
Lindoerfer, Doris; Mansmann, Ulrich
2017-07-01
Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches to the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist summarizing essential items for patient registry software systems (CIPROS) to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. A comparison of CIPROS with traditional software requirements elicitation methods, SRS templates, and standards shows broad consensus but also differences regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step toward creating standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.
In July 1997, EPA promulgated a new National Ambient Air Quality Standard (NAAQS) for fine particulate matter (PM2.5). This new standard was based on collection of an integrated mass sample on a filter. Field studies have demonstrated that the collection of semivolatile compoun...
40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?
Code of Federal Regulations, 2013 CFR
2013-07-01
... marker content of distillate fuels and how will EPA qualify or decline to qualify a test method?—(1... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...
40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?
Code of Federal Regulations, 2012 CFR
2012-07-01
... marker content of distillate fuels and how will EPA qualify or decline to qualify a test method?—(1... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...
40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?
Code of Federal Regulations, 2014 CFR
2014-07-01
... marker content of distillate fuels and how will EPA qualify or decline to qualify a test method?—(1... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...
NASA Astrophysics Data System (ADS)
Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.
2017-12-01
We propose efficient single-step formulations for reinitialization and extending algorithms, which are critical components of level-set based interface-tracking methods. The level-set field is reinitialized with a single-step (non iterative) "forward tracing" algorithm. A minimum set of cells is defined that describes the interface, and reinitialization employs only data from these cells. Fluid states are extrapolated or extended across the interface by a single-step "backward tracing" algorithm. Both algorithms, which are motivated by analogy to ray-tracing, avoid multiple block-boundary data exchanges that are inevitable for iterative reinitialization and extending approaches within a parallel-computing environment. The single-step algorithms are combined with a multi-resolution conservative sharp-interface method and validated by a wide range of benchmark test cases. We demonstrate that the proposed reinitialization method achieves second-order accuracy in conserving the volume of each phase. The interface location is invariant to reapplication of the single-step reinitialization. Generally, we observe smaller absolute errors than for standard iterative reinitialization on the same grid. The computational efficiency is higher than for the standard and typical high-order iterative reinitialization methods. We observe a 2- to 6-times efficiency improvement over the standard method for serial execution. The proposed single-step extending algorithm, which is commonly employed for assigning data to ghost cells with ghost-fluid or conservative interface interaction methods, shows about 10-times efficiency improvement over the standard method while maintaining same accuracy. Despite their simplicity, the proposed algorithms offer an efficient and robust alternative to iterative reinitialization and extending methods for level-set based multi-phase simulations.
Hanford Technical Basis for Multiple Dosimetry Effective Dose Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, Robin L.; Rathbone, Bruce A.
2010-08-01
The current method at Hanford for dealing with the results from multiple dosimeters worn during non-uniform irradiation is to use a compartmentalization method to calculate the effective dose (E). The method, as documented in the current version of Section 6.9.3 in the 'Hanford External Dosimetry Technical Basis Manual, PNL-MA-842,' is based on the compartmentalization method presented in the 1997 ANSI/HPS N13.41 standard, 'Criteria for Performing Multiple Dosimetry.' With the adoption of the ICRP 60 methodology in the 2007 revision to 10 CFR 835 came changes that have a direct effect on the compartmentalization method described in the 1997 ANSI/HPS N13.41 standard, and, thus, on the method used at Hanford. The ANSI/HPS N13.41 standard committee is in the process of updating the standard, but the changes have not yet been approved. Moreover, the drafts of the revised standard tend to align more with ICRP 60 than with the changes specified in the 2007 revision to 10 CFR 835. Therefore, a revised method for calculating effective dose from non-uniform external irradiation using a compartmental method was developed using the tissue weighting factors and remainder organs specified in 10 CFR 835 (2007).
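A minimal sketch of the compartment-style calculation, E = Σ_c (Σ_{T∈c} w_T) · H_c, is shown below; the compartment assignment is hypothetical and incomplete and does not reproduce the Hanford technical basis table, while the w_T values follow ICRP 60.

```python
# Illustrative sketch of a compartment-style effective dose calculation for multiple
# dosimeters: each compartment's dose is weighted by the sum of the tissue weighting
# factors assigned to that compartment, and the weighted doses are summed.
# The compartment assignment below is hypothetical and incomplete (a full assignment
# distributes all w_T so the weights sum to 1); w_T values follow ICRP 60.

COMPARTMENT_WEIGHT = {
    "head_neck": 0.05 + 0.05,                 # e.g. thyroid + oesophagus (illustrative)
    "thorax":    0.12 + 0.12 + 0.05,          # e.g. lung + red marrow + breast (illustrative)
    "abdomen":   0.20 + 0.12 + 0.12 + 0.05,   # e.g. gonads + colon + stomach + bladder (illustrative)
}

def effective_dose(compartment_doses_mSv):
    """Weighted sum of compartment doses, E = sum_c w_c * H_c (mSv)."""
    return sum(COMPARTMENT_WEIGHT[c] * h for c, h in compartment_doses_mSv.items())

# Hypothetical personal dose equivalents reported by three dosimeters (mSv):
print(round(effective_dose({"head_neck": 1.2, "thorax": 3.4, "abdomen": 0.8}), 3))
```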
NASA Astrophysics Data System (ADS)
Shi, Liehang; Ling, Tonghui; Zhang, Jianguo
2016-03-01
Radiologists currently use a variety of terminologies and standards in most hospitals in China, and there are often multiple terminologies in use across different sections of one department. In this presentation, we introduce a medical semantic comprehension system (MedSCS) to extract semantic information about clinical findings and conclusions from free-text radiology reports so that the reports can be classified correctly based on medical term indexing standards such as RadLex or SNOMED CT. MedSCS is based on both rule-based methods and statistics-based methods, which improves its performance and scalability. To evaluate the overall performance of the system and measure the accuracy of its outcomes, we developed computational methods to calculate precision, recall, F-score and exact confidence intervals.
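A small sketch of the evaluation metrics mentioned (precision, recall, F-score, and an exact confidence interval, here via the Clopper-Pearson method) is shown below with hypothetical counts.

```python
from scipy.stats import beta

def precision_recall_f1(tp, fp, fn):
    """Precision, recall and F1 from true-positive, false-positive and false-negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

def clopper_pearson_ci(successes, n, alpha=0.05):
    """Exact (Clopper-Pearson) confidence interval for a proportion."""
    lower = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
    return lower, upper

# Hypothetical evaluation of extracted findings against a manually annotated reference:
tp, fp, fn = 412, 38, 55
p, r, f1 = precision_recall_f1(tp, fp, fn)
lo, hi = clopper_pearson_ci(tp, tp + fp)
print(f"precision={p:.3f} (95% CI {lo:.3f}-{hi:.3f}), recall={r:.3f}, F1={f1:.3f}")
```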
Ash, Susan; O'Connor, Jackie; Anderson, Sarah; Ridgewell, Emily; Clarke, Leigh
2015-06-01
The requirement for an allied health workforce is expanding as the global burden of disease increases internationally. To safely meet the demand for an expanded workforce of orthotist/prosthetists in Australia, competency based standards, which are up-to-date and evidence-based, are required. The aims of this study were to determine the minimum level for entry into the orthotic/prosthetic profession; to develop entry level competency standards for the profession; and to validate the developed entry-level competency standards within the profession nationally, using an evidence-based approach. A mixed-methods research design was applied, using a three-step sequential exploratory design, where step 1 involved collecting and analyzing qualitative data from two focus groups; step 2 involved exploratory instrument development and testing, developing the draft competency standards; and step 3 involved quantitative data collection and analysis - a Delphi survey. In stage 1 (steps 1 and 2), the two focus groups - an expert and a recent graduate group of Australian orthotist/prosthetists - were led by an experienced facilitator, to identify gaps in the current competency standards and then to outline a key purpose, and work roles and tasks for the profession. The resulting domains and activities of the first draft of the competency standards were synthesized using thematic analysis. In stage 2 (step 3), the draft-competency standards were circulated to a purposive sample of the membership of the Australian Orthotic Prosthetic Association, using three rounds of Delphi survey. A project reference group of orthotist/prosthetists reviewed the results of both stages. In stage 1, the expert (n = 10) and the new graduate (n = 8) groups separately identified work roles and tasks, which formed the initial draft of the competency standards. Further drafts were refined and performance criteria added by the project reference group, resulting in the final draft-competency standards. In stage 2, the final draft-competency standards were circulated to 56 members (n = 44 final round) of the Association, who agreed on the key purpose, 6 domains, 18 activities, and 68 performance criteria of the final competency standards. This study outlines a rigorous and evidence-based mixed-methods approach for developing and endorsing professional competency standards, which is representative of the views of the profession of orthotist/prosthetists.
Peedikayil, Musthafa Chalikandy; AlSohaibani, Fahad Ibrahim; Alkhenizan, Abdullah Hamad
2014-01-01
Background: First-line levofloxacin-based treatments eradicate Helicobacter pylori with varying success. We examined the efficacy and safety of first-line levofloxacin-based treatment in comparison to standard first-line therapy for H. pylori eradication. Materials and Methods: We searched literature databases including Medline, EMBASE, and the Cochrane Register of Randomized Controlled Trials through March 2013 for randomized controlled trials comparing first-line levofloxacin and standard therapy. We included only randomized controlled trials conducted in adults with treatment-naïve H. pylori infection. A systematic review was conducted, and meta-analysis was performed with Review Manager 5.2. Treatment effect was determined as the relative risk, using a random- or fixed-effect model by the Mantel-Haenszel method. Results: Seven trials were identified, with 888 patients receiving 7 days of first-line levofloxacin and 894 treated with standard therapy (amoxicillin, clarithromycin and a proton pump inhibitor) for 7 days. The overall crude eradication rate in the levofloxacin group was 79.05% versus 81.4% in the standard group (risk ratio 0.97; 95% CI 0.93, 1.02). Overall dropout was 46 (5.2%) in the levofloxacin group and 52 (5.8%) for standard therapy. Dizziness was more common in the levofloxacin group, and taste disturbance was more common in the standard therapy group. In the meta-analysis, overall adverse events were similar between the two groups, with a relative risk of 1.06 (95% CI 0.72, 1.57). Conclusion: Helicobacter pylori eradication with 7 days of levofloxacin-based first-line therapy was safe and equally effective compared to 7 days of standard first-line therapy. PMID:24465624
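A minimal sketch of the fixed-effect Mantel-Haenszel pooled relative risk used in such meta-analyses is shown below; the per-trial counts are hypothetical and are not the data from this review.

```python
import numpy as np

def mantel_haenszel_rr(events_t, total_t, events_c, total_c):
    """Fixed-effect Mantel-Haenszel pooled relative risk across trials."""
    a, n1 = np.asarray(events_t, float), np.asarray(total_t, float)
    c, n2 = np.asarray(events_c, float), np.asarray(total_c, float)
    N = n1 + n2
    return np.sum(a * n2 / N) / np.sum(c * n1 / N)

# Hypothetical per-trial eradication counts (levofloxacin arm vs. standard arm):
rr = mantel_haenszel_rr(events_t=[95, 110, 80], total_t=[120, 140, 100],
                        events_c=[100, 115, 85], total_c=[121, 139, 102])
print(round(rr, 3))
```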
Earth Science for Educators: Preparing 7-12 Teachers for Standards-based, Inquiry Instruction
NASA Astrophysics Data System (ADS)
Sloan, H.
2002-05-01
"Earth Science for Educators" is an innovative, standards-based, graduate level teacher education curriculum that presents science content and pedagogic technique in parallel. The curriculum calls upon the resources and expertise of the American Museum of Natural History (AMNH) to prepare novice New York City teachers for teaching Earth Science. One of the goals of teacher education is to assure and facilitate science education reform through preparation of K-12 teachers who understand and are able to implement standard-based instruction. Standards reflect not only the content knowledge students are expected to attain but also the science skills and dispositions towards science they are expected to develop. Melding a list of standards with a curriculum outline to create inquiry-based classroom instruction that reaches a very diverse population of learners is extremely challenging. "Earth Science for Educators" helps novice teachers make the link between standards and practice by constantly connecting standards with instruction they receive and activities they carry out. Development of critical thinking and enthusiasm for inquiry is encouraged through engaging experience and contact with scientists and their work. Teachers are taught Earth systems science content through modeling of a wide variety of instruction and assessment methods based upon authentic scientific inquiry and aimed at different learning styles. Use of fieldwork and informal settings, such as the Museum, familiarizes novice teachers with ways of drawing on community resources for content and instructional settings. Metacognitive reflection that articulates standards, practice, and the teachers' own learning experience help draw out teachers' insights into their students' learning. The innovation of bring science content together with teaching methods is key to preparing teachers for standards-based, inquiry instruction. This curriculum was successfully piloted with a group of 28 novice teachers as part of the AMNH-City University of New York partnership and the CUNY Teaching Opportunity Program Scholarship. Reactions and feedback from program coordinators and teachers have been extremely positive during the year and a half since its implementation.
Potential reductions in ambient NO2 concentrations from meeting diesel vehicle emissions standards
NASA Astrophysics Data System (ADS)
von Schneidemesser, Erika; Kuik, Friderike; Mar, Kathleen A.; Butler, Tim
2017-11-01
Exceedances of the concentration limit value for ambient nitrogen dioxide (NO2) at roadside sites are an issue in many cities throughout Europe. This is linked to the emissions of light duty diesel vehicles which have on-road emissions that are far greater than the regulatory standards. These exceedances have substantial implications for human health and economic loss. This study explores the possible gains in ambient air quality if light duty diesel vehicles were able to meet the regulatory standards (including both emissions standards from Europe and the United States). We use two independent methods: a measurement-based and a model-based method. The city of Berlin is used as a case study. The measurement-based method used data from 16 monitoring stations throughout the city of Berlin to estimate annual average reductions in roadside NO2 of 9.0 to 23 µg m-3 and in urban background NO2 concentrations of 1.2 to 2.7 µg m-3. These ranges account for differences in fleet composition assumptions, and the stringency of the regulatory standard. The model simulations showed reductions in urban background NO2 of 2.0 µg m-3, and at the scale of the greater Berlin area of 1.6 to 2.0 µg m-3 depending on the setup of the simulation and resolution of the model. Similar results were found for other European cities. The similarities in results using the measurement- and model-based methods support our ability to draw robust conclusions that are not dependent on the assumptions behind either methodology. The results show the significant potential for NO2 reductions if regulatory standards for light duty diesel vehicles were to be met under real-world operating conditions. Such reductions could help improve air quality by reducing NO2 exceedances in urban areas, but also have broader implications for improvements in human health and other benefits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makoto Kashiwagi; Garamszeghy, Mike; Lantes, Bertrand
Disposal of low- and intermediate-level activated waste generated at nuclear power plants is being planned or carried out in many countries. The radioactivity concentrations and/or total quantities of long-lived, difficult-to-measure nuclides (DTM nuclides), such as C-14, Ni-63, Nb-94, α-emitting nuclides, etc., are often restricted by the safety case for a final repository as determined by each country's safety regulations, and these concentrations or amounts are required to be known and declared. With respect to waste contaminated by contact with process water, the Scaling Factor method (SF method), which is empirically based on sampling and analysis data, has been applied as an important method for determining concentrations of DTM nuclides. This method was standardized by the International Organization for Standardization (ISO) and published in 2007 as ISO21238 'Scaling factor method to determine the radioactivity of low and intermediate-level radioactive waste packages generated at nuclear power plants' [1]. However, for activated metal waste with comparatively high concentrations of radioactivity, such as may be found in reactor control rods and internal structures, direct sampling and radiochemical analysis methods to evaluate the DTM nuclides are limited by access to the material and potentially high personnel radiation exposure. In this case, theoretical calculation methods in combination with empirical methods based on remote radiation surveys need to be used to best advantage for determining the disposal inventory of DTM nuclides while minimizing exposure to radiation workers. Pursuant to this objective, a standard for the theoretical evaluation of the radioactivity concentration of DTM nuclides in activated waste is in process through ISO TC85/SC5 (ISO Technical Committee 85: Nuclear energy, nuclear technologies, and radiological protection; Subcommittee 5: Nuclear fuel cycle). The project team for this ISO standard was formed in 2011 and is composed of experts from 11 countries. The project team has been conducting technical discussions on theoretical methods for determining concentrations of radioactivity, and has developed the draft International Standard ISO16966 'Theoretical activation calculation method to evaluate the radioactivity of activated waste generated at nuclear reactors' [2]. This paper describes the international standardization process developed by the ISO project team, and outlines the following two theoretical activity evaluation methods: the Point method and the Range method. (authors)
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages, such as: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: (1) ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; (2) ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; and (3) ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Faravan, Amir; Mohammadi, Nooredin; Alizadeh Ghavidel, Alireza; Toutounchi, Mohammad Zia; Ghanbari, Ameneh; Mazloomi, Mehran
2016-01-01
Introduction: Standards play a significant role in defining the minimum level of optimal and expected performance. Since perfusion technology staff play a leading role in providing quality services to patients undergoing open heart surgery with a cardiopulmonary bypass machine, this study aimed to assess how Iranian perfusion technology staff evaluate and manage patients during the cardiopulmonary bypass process and to compare their practice with the standards recommended by the American Society of Extracorporeal Technology. Methods: In this descriptive study, data were collected from 48 Iranian public hospitals and educational health centers through a researcher-created questionnaire. The questionnaire assessed the standards recommended by the American Society of Extracorporeal Technology. Results: Appropriate measures to prevent hemodilution, avoid unnecessary transfusion of blood and blood products, determine the initial dose of heparin based on one of the proposed methods, monitor anticoagulation based on ACT measurement, and determine additional doses of heparin during cardiopulmonary bypass based on ACT or protamine titration were carried out in only 4.2% of hospitals and health centers. Conclusion: Current practices of cardiopulmonary perfusion technology in Iran are inappropriate relative to the standards of the American Society of Cardiovascular Perfusion. This highlights the need for authorities to attend to validation programs and the development of caring standards on the one hand, and to continuously assess the use of these standards on the other. PMID:27489600
Standardization of Spore Inactivation Method for PMA-PhyloChip Analysis
NASA Technical Reports Server (NTRS)
Schrader, Michael
2011-01-01
In compliance with the Committee on Space Research (COSPAR) planetary protection policy, the National Aeronautics and Space Administration (NASA) monitors the total microbial burden of spacecraft as a means of minimizing the inadvertent transfer of viable contaminant microorganisms to extraterrestrial environments (forward contamination). NASA standard assay-based counts are used as a proxy for relative surface cleanliness, to estimate overall microbial burden, and to assess whether forward planetary protection risk criteria are met for a given mission; these criteria vary with the planetary body to be explored and with whether the mission includes life detection. Despite efforts to reduce the presence of microorganisms on spacecraft prior to launch, microbes have been isolated from spacecraft and associated surfaces within the extreme conditions of clean room facilities using state-of-the-art molecular technologies. Development of a more sensitive method that better enumerates all viable microorganisms from spacecraft and associated surfaces could support future life detection missions. Current culture-based (NASA standard spore assay) and nucleic-acid-based polymerase chain reaction (PCR) methods have significant shortcomings in this type of analysis. The overall goal of this project is to evaluate and validate a new molecular method based on the use of the deoxyribonucleic acid (DNA) intercalating agent propidium monoazide (PMA), used in combination with a DNA microarray (PhyloChip) that has been shown to identify very low levels of organisms on spacecraft-associated surfaces. PMA can penetrate only the membranes of dead cells; once inside, it intercalates into the DNA and, upon photolysis with visible light, forms stable DNA monoadducts, rendering the DNA unavailable for subsequent PCR analysis. The specific aim of this study is to standardize the spore inactivation method for PMA-PhyloChip analysis. We used spores of Bacillus subtilis 168 (a standard laboratory isolate) as the test organism.
Development of the Nurse Practitioner Standards for Practice Australia
Buckley, Thomas; Donoghue, Judith; Heartfield, Marie; Bryce, Julianne; Cox, Darlene; Waters, Donna; Gosby, Helen; Kelly, John; Dunn, Sandra V.
2015-01-01
This article describes the context and development of the new Nurse Practitioner Standards for Practice in Australia, which went into effect in January 2014. The researchers used a mixed-methods design to engage a broad range of stakeholders who brought both political and practice knowledge to the development of the new standards. Methods included interviews, focus groups, surveys, and work-based observation of nurse practitioner practice. Stakeholders varied in terms of their need for detail in the standards. Nonetheless, they invariably agreed that the standards should be clinically focussed attributes. The pillars common in many advanced practice nursing standards, such as practice, research, education, and leadership, were combined and expressed in a new and unique clinical attribute. PMID:26162455
ERIC Educational Resources Information Center
Nixon, Lisa
2013-01-01
The purpose of this mixed methods study was to determine the key implementation issues of a standards-based teacher evaluation system as perceived by campus administrators. The 80 campus administrators that participated in this study were from six public school districts located in southeastern Texas that serve students in grades Kindergarten…
Standards and Assessment: Coherence from the Teacher's Perspective
ERIC Educational Resources Information Center
Bonner, Sarah M.; Torres Rivera, Camila; Chen, Peggy P.
2018-01-01
We sought to understand how teachers' perspectives on standards-based instructional practices, classroom assessment, and external testing do or do not show coherence and alignment. Based on survey methods (n = 155) and interviews with a sample of secondary school teachers (n = 9) in a large urban district in the USA, we explored general trends and…
A Standards-Based Approach for Reporting Assessment Results in South Africa
ERIC Educational Resources Information Center
Kanjee, Anil; Moloi, Qetelo
2016-01-01
This article proposes the use of a standards-based approach to reporting results from large-scale assessment surveys in South Africa. The use of this approach is intended to address the key shortcomings observed in the current reporting framework prescribed in the national curriculum documents. Using the Angoff method and data from the Annual…
A rapid method for soil cement design : Louisiana slope value method.
DOT National Transportation Integrated Search
1964-03-01
The current procedure used by the Louisiana Department of Highways for laboratory design of cement stabilized soil base and subbase courses is taken from standard AASHO test methods, patterned after Portland Cement Association criteria. These methods...
An entropy-based statistic for genomewide association studies.
Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao
2005-07-01
Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard chi-square statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the differences in allele and haplotype frequencies to maintain statistical power with large numbers of marker loci. We investigate the relationship between the entropy-based test statistic and the standard chi-square statistic and show that, in most cases, the power of the entropy-based statistic is greater than that of the standard chi-square statistic. The distribution of the entropy-based statistic and the type I error rates are validated using simulation studies. Finally, we apply the new entropy-based test statistic to two real data sets, one for the COMT gene and schizophrenia and one for the MMP-2 gene and esophageal carcinoma, to evaluate the performance of the new method for genetic association studies. The results show that the entropy-based statistic obtained smaller P values than did the standard chi-square statistic.
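To make the contrast concrete, the following sketch computes a standard chi-square statistic and a simple entropy-based statistic from case/control allele counts. The entropy form shown (a scaled squared difference of Shannon entropies) is an illustrative stand-in, not the exact statistic of the paper, and the scaling choice is an assumption.

```python
import numpy as np
from scipy.stats import chi2_contingency

def chi_square_stat(case_counts, control_counts):
    """Standard chi-square statistic on an alleles-by-group contingency table."""
    table = np.vstack([case_counts, control_counts])
    stat, p, dof, _ = chi2_contingency(table)
    return stat, p

def shannon_entropy(freqs):
    """Shannon entropy of an allele-frequency vector (0*log 0 treated as 0)."""
    freqs = np.asarray(freqs, dtype=float)
    nz = freqs[freqs > 0]
    return -np.sum(nz * np.log(nz))

def entropy_stat(case_counts, control_counts):
    """Illustrative entropy-based statistic: squared difference of entropies,
    scaled by the harmonic mean of the two sample sizes (assumed form)."""
    n1, n2 = case_counts.sum(), control_counts.sum()
    p1, p2 = case_counts / n1, control_counts / n2
    scale = 2.0 * n1 * n2 / (n1 + n2)
    return scale * (shannon_entropy(p1) - shannon_entropy(p2)) ** 2

# toy example: allele counts (A, a) at one marker for cases vs. controls
cases = np.array([120, 80])
controls = np.array([95, 105])
print(chi_square_stat(cases, controls))
print(entropy_stat(cases, controls))
```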
Development of ASTM Standard for SiC-SiC Joint Testing Final Scientific/Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobsen, George; Back, Christina
2015-10-30
As the nuclear industry moves to advanced ceramic based materials for cladding and core structural materials for a variety of advanced reactors, new standards and test methods are required for material development and licensing purposes. For example, General Atomics (GA) is actively developing silicon carbide (SiC) based composite cladding (SiC-SiC) for its Energy Multiplier Module (EM2), a high efficiency gas cooled fast reactor. Through DOE funding via the advanced reactor concept program, GA developed a new test method for the nominal joint strength of an endplug sealed to advanced ceramic tubes, Fig. 1-1, at ambient and elevated temperatures, called the endplug pushout (EPPO) test. This test utilizes widely available universal mechanical testers coupled with clam shell heaters, and specimen size is relatively small, making it a viable post irradiation test method. The culmination of this effort was a draft of an ASTM test standard that will be submitted for approval to the ASTM C28 ceramic committee. Once the standard has been vetted by the ceramics test community, an industry wide standard methodology to test joined tubular ceramic components will be available for the entire nuclear materials community.
Lenselink, Eelke B; Ten Dijke, Niels; Bongers, Brandon; Papadatos, George; van Vlijmen, Herman W T; Kowalczyk, Wojtek; IJzerman, Adriaan P; van Westen, Gerard J P
2017-08-14
The increase of publicly available bioactivity data in recent years has fueled and catalyzed research in chemogenomics, data mining, and modeling approaches. As a direct result, over the past few years a multitude of different methods have been reported and evaluated, such as target fishing, nearest neighbor similarity-based methods, and Quantitative Structure Activity Relationship (QSAR)-based protocols. However, such studies are typically conducted on different datasets, using different validation strategies, and different metrics. In this study, different methods were compared using one single standardized dataset obtained from ChEMBL, which is made available to the public, using standardized metrics (BEDROC and Matthews Correlation Coefficient). Specifically, the performance of Naïve Bayes, Random Forests, Support Vector Machines, Logistic Regression, and Deep Neural Networks was assessed using QSAR and proteochemometric (PCM) methods. All methods were validated using both a random split validation and a temporal validation, with the latter being a more realistic benchmark of expected prospective execution. Deep Neural Networks are the top performing classifiers, highlighting the added value of Deep Neural Networks over other more conventional methods. Moreover, the best method ('DNN_PCM') performed significantly better, at almost one standard deviation above the mean performance. Furthermore, Multi-task and PCM implementations were shown to improve performance over single task Deep Neural Networks. Conversely, target prediction performed almost two standard deviations below the mean performance. Random Forests, Support Vector Machines, and Logistic Regression performed around mean performance. Finally, using an ensemble of DNNs, alongside additional tuning, enhanced the relative performance by another 27% (compared with unoptimized 'DNN_PCM'). Here, a standardized set to test and evaluate different machine learning algorithms in the context of multi-task learning is offered by providing the data and the protocols.
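As a rough illustration of the kind of benchmarking described, the sketch below trains a few conventional classifiers on synthetic data and scores them with the Matthews Correlation Coefficient; the dataset, features, and random split are placeholders and do not reproduce the ChEMBL benchmark, its descriptors, or its temporal validation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import matthews_corrcoef

# Surrogate data standing in for ChEMBL-derived compound descriptors.
X, y = make_classification(n_samples=2000, n_features=100, n_informative=30,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

models = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "svm": SVC(kernel="rbf", C=1.0),
    "logistic_regression": LogisticRegression(max_iter=1000),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    mcc = matthews_corrcoef(y_test, model.predict(X_test))
    print(f"{name}: MCC = {mcc:.3f}")
```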
Teramura, Hajime; Fukuda, Noriko; Okada, Yumiko; Ogihara, Hirokazu
2018-01-01
The four types of chromogenic selective media that are commercially available in Japan were compared for establishing a Japanese standard method for detecting Cronobacter spp. based on ISO/TS 22964:2006. When assessed using 9 standard Cronobacter spp. strains and 29 non-Cronobacter strains, Enterobacter sakazakii isolation agar, Chromocult™ Enterobacter sakazakii agar, CHROMagar™ E. sakazakii, and XM-sakazakii agar demonstrated excellent inclusivity and exclusivity. Using the ISO/TS 22964:2006 method, the recovered numbers of 38 Cronobacter spp. strains, including 29 C. sakazakii isolates obtained from each medium, were equivalent, indicating that there was no significant difference (p > 0.05) among the four types of chromogenic selective media. Thus, we demonstrated that these four chromogenic selective media are suitable alternatives when using the standard method for detecting Cronobacter spp. in Japan, based on ISO/TS 22964:2006.
Place, Benjamin J
2017-05-01
To address community needs, the National Institute of Standards and Technology has developed a candidate Standard Reference Material (SRM) for infant/adult nutritional formula based on milk and whey protein concentrates with isolated soy protein called SRM 1869 Infant/Adult Nutritional Formula. One major component of this candidate SRM is the fatty acid content. In this study, multiple extraction techniques were evaluated to quantify the fatty acids in this new material. Extraction methods that were based on lipid extraction followed by transesterification resulted in lower mass fraction values for all fatty acids than the values measured by methods utilizing in situ transesterification followed by fatty acid methyl ester extraction (ISTE). An ISTE method, based on the identified optimal parameters, was used to determine the fatty acid content of the new infant/adult nutritional formula reference material.
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIM specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.
Swezey, Robert; Shinn, Walter; Green, Carol; Drover, David R.; Hammer, Gregory B.; Schulman, Scott R.; Zajicek, Anne; Jett, David A.; Boss, Gerry R.
2013-01-01
Most hospital laboratories do not measure blood cyanide concentrations, and samples must be sent to reference laboratories. A simple method is needed for measuring cyanide in hospitals. The authors previously developed a method to quantify cyanide based on the high binding affinity of the vitamin B12 analog, cobinamide, for cyanide and a major spectral change observed for cyanide-bound cobinamide. This method is now validated in human blood, and the findings include a mean inter-assay accuracy of 99.1%, precision of 8.75% and a lower limit of quantification of 3.27 µM cyanide. The method was applied to blood samples from children treated with sodium nitroprusside and it yielded measurable results in 88 of 172 samples (51%), whereas the reference laboratory yielded results in only 19 samples (11%). In all 19 samples, the cobinamide-based method also yielded measurable results. The two methods showed reasonable agreement when analyzed by linear regression, but not when analyzed by a standard error of the estimate or paired t-test. Differences in results between the two methods may be because samples were assayed at different times on different sample types. The cobinamide-based method is applicable to human blood, and can be used in hospital laboratories and emergency rooms. PMID:23653045
Adaptive image coding based on cubic-spline interpolation
NASA Astrophysics Data System (ADS)
Jiang, Jian-Xing; Hong, Shao-Hua; Lin, Tsung-Ching; Wang, Lin; Truong, Trieu-Kien
2014-09-01
It has been shown that, at low bit rates, downsampling prior to coding and upsampling after decoding can achieve better compression performance than standard coding algorithms, e.g., JPEG and H.264/AVC. However, at high bit rates, the sampling-based schemes generate more distortion. Additionally, the maximum bit rate at which the sampling-based scheme outperforms the standard algorithm is image-dependent. In this paper, a practical adaptive image coding algorithm based on cubic-spline interpolation (CSI) is proposed. This algorithm adaptively selects the image coding method from CSI-based modified JPEG and standard JPEG under a given target bit rate utilizing the so-called ρ-domain analysis. The experimental results indicate that, compared with standard JPEG, the proposed algorithm shows better performance at low bit rates and maintains the same performance at high bit rates.
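The core idea of the sampling-based scheme can be sketched as follows: encode a downsampled image, upsample after decoding, and compare against direct JPEG at the same quality setting. Bicubic resampling here is only a stand-in for the paper's cubic-spline interpolation, and the input file name and quality values are illustrative assumptions.

```python
import io
import numpy as np
from PIL import Image

def psnr(a, b):
    """Peak signal-to-noise ratio between two 8-bit grayscale images."""
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

def jpeg_roundtrip(img, quality):
    """Encode and decode an image with standard JPEG at the given quality."""
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("L")

def sampled_roundtrip(img, quality, factor=2):
    """Downsample before coding, upsample after decoding (bicubic resampling
    stands in for the cubic-spline interpolation used in the paper)."""
    small = img.resize((img.width // factor, img.height // factor), Image.BICUBIC)
    decoded = jpeg_roundtrip(small, quality)
    return decoded.resize(img.size, Image.BICUBIC)

img = Image.open("test_image.png").convert("L")  # hypothetical input image
for q in (10, 30, 70):  # roughly low to high bit rate
    direct = psnr(np.array(img), np.array(jpeg_roundtrip(img, q)))
    sampled = psnr(np.array(img), np.array(sampled_roundtrip(img, q)))
    print(f"quality={q}: direct JPEG {direct:.2f} dB, sampled scheme {sampled:.2f} dB")
```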
Code of Federal Regulations, 2014 CFR
2014-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... accordance with § 35.1340. If encapsulation or enclosure is used as a method of abatement, ongoing lead-based...
Code of Federal Regulations, 2010 CFR
2010-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... accordance with § 35.1340. If encapsulation or enclosure is used as a method of abatement, ongoing lead-based...
Code of Federal Regulations, 2012 CFR
2012-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... accordance with § 35.1340. If encapsulation or enclosure is used as a method of abatement, ongoing lead-based...
Code of Federal Regulations, 2011 CFR
2011-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... accordance with § 35.1340. If encapsulation or enclosure is used as a method of abatement, ongoing lead-based...
Code of Federal Regulations, 2013 CFR
2013-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... accordance with § 35.1340. If encapsulation or enclosure is used as a method of abatement, ongoing lead-based...
Atiyeh, Bishara S
2007-01-01
Hypertrophic scars, resulting from alterations in the normal processes of cutaneous wound healing, are characterized by proliferation of dermal tissue with excessive deposition of fibroblast-derived extracellular matrix proteins, especially collagen, over long periods, and by persistent inflammation and fibrosis. Hypertrophic scars are among the most common and frustrating problems after injury. As current aesthetic surgical techniques become more standardized and results more predictable, a fine scar may be the demarcating line between acceptable and unacceptable aesthetic results. However, hypertrophic scars remain notoriously difficult to eradicate because of the high recurrence rates and the incidence of side effects associated with available treatment methods. This review explores the various treatment methods for hypertrophic scarring described in the literature, including evidence-based therapies, standard practices, and emerging methods, attempting to distinguish those with clearly proven efficacy from anecdotal reports about therapies of doubtful benefit, while trying to differentiate between prophylactic measures and actual treatment methods. Unfortunately, the distinction between hypertrophic scar treatments and keloid treatments is not obvious in most reports, making it difficult to assess the efficacy of hypertrophic scar treatment.
Rishikesh, N.; Quélennec, G.
1983-01-01
Vector resistance and other constraints have necessitated consideration of the use of alternative materials and methods in an integrated approach to vector control. Bacillus thuringiensis serotype H-14 is a promising biological control agent which acts as a conventional larvicide through its delta-endotoxin (active ingredient) and which now has to be suitably formulated for application in vector breeding habitats. The active ingredient in the formulations has so far not been chemically characterized or quantified and therefore recourse has to be taken to a bioassay method. Drawing on past experience and through the assistance mainly of various collaborating centres, the World Health Organization has standardized a bioassay method (described in the Annex), which gives consistent and reproducible results. The method permits the determination of the potency of a B.t. H-14 preparation through comparison with a standard powder. The universal adoption of the standardized bioassay method will ensure comparability of the results of different investigators. PMID:6601545
Parallelism measurement for base plate of standard artifact with multiple tactile approaches
NASA Astrophysics Data System (ADS)
Ye, Xiuling; Zhao, Yan; Wang, Yiwen; Wang, Zhong; Fu, Luhua; Liu, Changjie
2018-01-01
Nowadays, as workpieces become more precise and more specialized, their structures become more sophisticated and the accuracy demanded of artifacts increases, which places higher requirements on measuring accuracy and measuring methods. As an important means of obtaining workpiece dimensions, the coordinate measuring machine (CMM) has been widely used in many industries. In the course of calibrating a self-developed CMM with a self-made high-precision standard artifact, the parallelism of the base plate used for fixing the standard artifact was found to be an important factor affecting measurement accuracy. To measure the parallelism of the base plate, three tactile methods are employed using an existing high-precision CMM, gauge blocks, a dial gauge and a marble platform, and the measurement results are compared. The experiments show that all three methods reach micron-level accuracy and meet the measurement requirements. Moreover, the three approaches are suitable for different measurement conditions, providing a basis for rapid, high-precision measurement under different equipment conditions.
Analytic Methods in Investigative Geometry.
ERIC Educational Resources Information Center
Dobbs, David E.
2001-01-01
Suggests an alternative proof by analytic methods, which is more accessible than rigorous proof based on Euclid's Elements, in which students need only apply standard methods of trigonometry to the data without introducing new points or lines. (KHR)
Song, Mi; Chen, Zeng-Ping; Chen, Yao; Jin, Jing-Wen
2014-07-01
Liquid chromatography-mass spectrometry assays suffer from signal instability caused by the gradual fouling of the ion source, vacuum instability, aging of the ion multiplier, etc. To address this issue, in this contribution an internal standard was added into the mobile phase. The internal standard was therefore ionized and detected together with the analytes of interest by the mass spectrometer, ensuring that variations in measurement conditions and/or instrument have similar effects on the signal contributions of both the analytes of interest and the internal standard. Subsequently, based on this strategy of adding an internal standard to the mobile phase, a multiplicative effects model was developed for quantitative LC-MS assays and tested on a proof-of-concept model system: the determination of amino acids in water by LC-MS. The experimental results demonstrated that the proposed method could efficiently mitigate the detrimental effects of continuous signal variation, and achieved quantitative results with average relative predictive error values in the range of 8.0-15.0%, which were much more accurate than the corresponding results of the conventional internal standard method based on the peak height ratio and of the partial least squares method (their average relative predictive error values were as high as 66.3% and 64.8%, respectively). Therefore, it is expected that the proposed method can be developed and extended for quantitative LC-MS analysis of more complex systems. Copyright © 2014 Elsevier B.V. All rights reserved.
Cueto Díaz, Sergio; Ruiz Encinar, Jorge; García Alonso, J Ignacio
2014-09-24
We present a novel method for the purity assessment of peptide standards which is applicable to any water-soluble peptide. The method is based on the online ¹³C isotope dilution approach, in which the peptide is separated from its related impurities by liquid chromatography (LC) and the eluent is mixed post-column with a continuous flow of ¹³C-enriched sodium bicarbonate. An online oxidation step using sodium persulfate in acidic media at 99°C provides quantitative oxidation to ¹²CO₂ and ¹³CO₂, respectively, which are extracted into the gas phase with the help of a gas-permeable membrane. The measurement of the isotope ratio 44/45 in the mass spectrometer allows the construction of the mass flow chromatogram. As the only species that is finally measured in the mass spectrometer is CO₂, the peptide content in the standard can be quantified, on the basis of its carbon content, using a generic primary standard such as potassium hydrogen phthalate. The approach was validated by the analysis of a reference material (NIST 8327) and applied to the quantification of two commercial synthetic peptide standards. In that case, the results obtained were compared with those obtained using alternative methods, such as amino acid analysis and ICP-MS. The results obtained proved the value of the method for the fast, accurate and precise mass purity assignment of synthetic peptide standards. Copyright © 2014 Elsevier B.V. All rights reserved.
Methods for Environments and Contaminants: Criteria Air Pollutants
EPA’s Office of Air Quality Planning and Standards (OAQPS) has set primary (health-based) National Ambient Air Quality Standards (NAAQS) for six common air pollutants, often referred to as criteria air pollutants (or simply criteria pollutants).
NASA Astrophysics Data System (ADS)
Lu, Jun; Xiao, Jun; Gao, Dong Jun; Zong, Shu Yu; Li, Zhu
2018-03-01
In the production of Association of American Railroads (AAR) locomotive wheel-sets, the press-fit curve is the most important basis for judging the reliability of wheel-set assembly. In the past, most production enterprises relied mainly on manual inspection to judge assembly quality, and cases of misjudgment have occurred. For this reason, research on the standard was carried out, and automatic judgment of the press-fit curve was analyzed and designed, so as to provide guidance for locomotive wheel-set production based on the AAR standard.
Zapka, C; Leff, J; Henley, J; Tittl, J; De Nardo, E; Butler, M; Griggs, R; Fierer, N; Edmonds-Wilson, S
2017-03-28
Hands play a critical role in the transmission of microbiota on one's own body, between individuals, and on environmental surfaces. Effectively measuring the composition of the hand microbiome is important to hand hygiene science, which has implications for human health. Hand hygiene products are evaluated using standard culture-based methods, but standard test methods for culture-independent microbiome characterization are lacking. We sampled the hands of 50 participants using swab-based and glove-based methods prior to and following four hand hygiene treatments (using a nonantimicrobial hand wash, alcohol-based hand sanitizer [ABHS], a 70% ethanol solution, or tap water). We compared results among culture plate counts, 16S rRNA gene sequencing of DNA extracted directly from hands, and sequencing of DNA extracted from culture plates. Glove-based sampling yielded higher numbers of unique operational taxonomic units (OTUs) but had less diversity in bacterial community composition than swab-based sampling. We detected treatment-induced changes in diversity only by using swab-based samples ( P < 0.001); we were unable to detect changes with glove-based samples. Bacterial cell counts significantly decreased with use of the ABHS ( P < 0.05) and ethanol control ( P < 0.05). Skin hydration at baseline correlated with bacterial abundances, bacterial community composition, pH, and redness across subjects. The importance of the method choice was substantial. These findings are important to ensure improvement of hand hygiene industry methods and for future hand microbiome studies. On the basis of our results and previously published studies, we propose recommendations for best practices in hand microbiome research. IMPORTANCE The hand microbiome is a critical area of research for diverse fields, such as public health and forensics. The suitability of culture-independent methods for assessing effects of hygiene products on microbiota has not been demonstrated. This is the first controlled laboratory clinical hand study to have compared traditional hand hygiene test methods with newer culture-independent characterization methods typically used by skin microbiologists. This study resulted in recommendations for hand hygiene product testing, development of methods, and future hand skin microbiome research. It also demonstrated the importance of inclusion of skin physiological metadata in skin microbiome research, which is atypical for skin microbiome studies. Copyright © 2017 Zapka et al.
Optimizing fish sampling for fish - mercury bioaccumulation factors
Scudder Eikenberry, Barbara C.; Riva-Murray, Karen; Knightes, Christopher D.; Journey, Celeste A.; Chasar, Lia C.; Brigham, Mark E.; Bradley, Paul M.
2015-01-01
Fish Bioaccumulation Factors (BAFs; ratios of mercury (Hg) in fish (Hgfish) and water (Hgwater)) are used to develop Total Maximum Daily Load and water quality criteria for Hg-impaired waters. Both applications require representative Hgfish estimates and, thus, are sensitive to sampling and data-treatment methods. Data collected by fixed protocol from 11 streams in 5 states distributed across the US were used to assess the effects of Hgfish normalization/standardization methods and fish sample numbers on BAF estimates. Fish length, followed by weight, was most correlated to adult top-predator Hgfish. Site-specific BAFs based on length-normalized and standardized Hgfish estimates demonstrated up to 50% less variability than those based on non-normalized Hgfish. Permutation analysis indicated that length-normalized and standardized Hgfish estimates based on at least 8 trout or 5 bass resulted in mean Hgfish coefficients of variation less than 20%. These results are intended to support regulatory mercury monitoring and load-reduction program improvements.
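A minimal sketch of the length-standardization step might look like the following: regress log-transformed fish Hg on length, predict Hg at a common reference length, and form the BAF as the ratio to water Hg. The regression form, reference length, and example values are assumptions for illustration, not the study's fixed protocol.

```python
import numpy as np

def length_standardized_hg(lengths_mm, hg_fish, standard_length_mm=400.0):
    """Regress log(Hg) on fish length and predict Hg at a common standard
    length (simple linear form; model choice and standard length are
    illustrative assumptions)."""
    slope, intercept = np.polyfit(lengths_mm, np.log(hg_fish), 1)
    return float(np.exp(intercept + slope * standard_length_mm))

def bioaccumulation_factor(hg_fish_std, hg_water):
    """BAF as the ratio of standardized fish Hg to water Hg (consistent units)."""
    return hg_fish_std / hg_water

# toy site data: fish lengths (mm), fish Hg (mg/kg), water Hg (mg/L, illustrative)
lengths = np.array([310.0, 355.0, 402.0, 460.0, 515.0])
hg_fish = np.array([0.18, 0.22, 0.30, 0.41, 0.55])
hg_water = 1.2e-6

hg_std = length_standardized_hg(lengths, hg_fish)
print("length-standardized Hg:", round(hg_std, 3), "mg/kg")
print("BAF:", bioaccumulation_factor(hg_std, hg_water))
```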
Odibo, Anthony O; Francis, Andre; Cahill, Alison G; Macones, George A; Crane, James P; Gardosi, Jason
2011-03-01
To derive coefficients for developing a customized growth chart for a Mid-Western US population, and to estimate the association between pregnancy outcomes and smallness for gestational age (SGA) defined by the customized growth chart compared with a population-based growth chart for the USA. A retrospective cohort study of an ultrasound database using 54,433 pregnancies meeting inclusion criteria was conducted. Coefficients for customized centiles were derived using 42,277 pregnancies and compared with those obtained from other populations. Two adverse outcome indicators were defined (greater than 7 day stay in the neonatal unit and stillbirth [SB]), and the risk for each outcome was calculated for the groups of pregnancies defined as SGA by the population standard and SGA by the customized standard using 12,456 pregnancies for the validation sample. The growth potential expressed as weight at 40 weeks in this population was 3524 g (standard error: 402 g). In the validation population, 4055 cases of SGA were identified using both population and customized standards. The cases additionally identified as SGA by the customized method had a significantly increased risk of each of the adverse outcome categories. The sensitivity and specificity of those identified as SGA by customized method only for detecting pregnancies at risk for SB was 32.7% (95% confidence interval [CI] 27.0-38.8%) and 95.1% (95% CI: 94.7-95.0%) versus 0.8% (95% CI 0.1-2.7%) and 98.0% (95% CI 97.8-98.2%)for those identified by only the population-based method, respectively. SGA defined by customized growth potential is able to identify substantially more pregnancies at a risk for adverse outcome than the currently used national standard for fetal growth.
The Evolution of Strain Typing in the Mycobacterium tuberculosis Complex.
Merker, Matthias; Kohl, Thomas A; Niemann, Stefan; Supply, Philip
2017-01-01
Tuberculosis (TB) is a contagious disease with a complex epidemiology. Therefore, molecular typing (genotyping) of Mycobacterium tuberculosis complex (MTBC) strains is of primary importance to effectively guide outbreak investigations, define transmission dynamics and assist global epidemiological surveillance of the disease. Large-scale genotyping is also needed to get better insights into the biological diversity and the evolution of the pathogen. Thanks to its shorter turnaround and simple numerical nomenclature system, mycobacterial interspersed repetitive unit-variable-number tandem repeat (MIRU-VNTR) typing, based on 24 standardized plus 4 hypervariable loci, optionally combined with spoligotyping, has replaced IS6110 DNA fingerprinting over the last decade as a gold standard among classical strain typing methods for many applications. With the continuous progress and decreasing costs of next-generation sequencing (NGS) technologies, typing based on whole genome sequencing (WGS) is now increasingly performed for near complete exploitation of the available genetic information. However, some important challenges remain such as the lack of standardization of WGS analysis pipelines, the need of databases for sharing WGS data at a global level, and a better understanding of the relevant genomic distances for defining clusters of recent TB transmission in different epidemiological contexts. This chapter provides an overview of the evolution of genotyping methods over the last three decades, which culminated with the development of WGS-based methods. It addresses the relative advantages and limitations of these techniques, indicates current challenges and potential directions for facilitating standardization of WGS-based typing, and provides suggestions on what method to use depending on the specific research question.
Gantner, Martin; Schwarzmann, Günter; Sandhoff, Konrad; Kolter, Thomas
2014-12-01
Within recent years, ganglioside patterns have been increasingly analyzed by MS. However, internal standards for calibration are only available for gangliosides GM1, GM2, and GM3. For this reason, we prepared homologous internal standards bearing nonnatural fatty acids of the major mammalian brain gangliosides GM1, GD1a, GD1b, GT1b, and GQ1b, and of the tumor-associated gangliosides GM2 and GD2. The fatty acid moieties were incorporated after selective chemical or enzymatic deacylation of bovine brain gangliosides. For modification of the sphingoid bases, we developed a new synthetic method based on olefin cross metathesis. This method was used for the preparation of a lyso-GM1 and a lyso-GM2 standard. The total yield of this method was 8.7% for the synthesis of d17:1-lyso-GM1 from d20:1/18:0-GM1 in four steps. The title compounds are currently used as calibration substances for MS quantification and are also suitable for functional studies. Copyright © 2014 by the American Society for Biochemistry and Molecular Biology, Inc.
Vialaret, Jérôme; Picas, Alexia; Delaby, Constance; Bros, Pauline; Lehmann, Sylvain; Hirtz, Christophe
2018-06-01
Hepcidin-25 peptide is a biomarker which is known to have considerable clinical potential for diagnosing iron-related diseases. Developing analytical methods for the absolute quantification of hepcidin is still a real challenge, however, due to the sensitivity, specificity and reproducibility issues involved. In this study, we compare and discuss two MS-based assays for quantifying hepcidin, which differ only in the type of liquid chromatography involved (nano LC/MS versus standard LC/MS). The same sample preparation, the same internal standards and the same MS analyzer were used with both approaches. In the field of proteomics, nano LC chromatography is generally known to be more sensitive and less robust than standard LC methods. In this study, we established that the performance of the standard LC method is equivalent to that of our previously developed nano LC method. Since the analytical performances were very similar in both cases, the standard-flow platform provides the more suitable alternative for accurately determining hepcidin in clinical settings. Copyright © 2018 Elsevier B.V. All rights reserved.
Mortamais, Marion; Chevrier, Cécile; Philippat, Claire; Petit, Claire; Calafat, Antonia M; Ye, Xiaoyun; Silva, Manori J; Brambilla, Christian; Eijkemans, Marinus J C; Charles, Marie-Aline; Cordier, Sylvaine; Slama, Rémy
2012-04-26
Environmental epidemiology and biomonitoring studies typically rely on biological samples to assay the concentration of non-persistent exposure biomarkers. Between-participant variations in the sampling conditions of these biological samples constitute a potential source of exposure misclassification. Few studies have attempted to correct biomarker levels for this error. We aimed to assess the influence of sampling conditions on concentrations of urinary biomarkers of select phenols and phthalates, two widely produced families of chemicals, and to standardize biomarker concentrations on sampling conditions. Urine samples were collected between 2002 and 2006 among 287 pregnant women from the Eden and Pélagie cohorts, from which phthalate and phenol metabolite levels were assayed. We applied a 2-step standardization method based on regression residuals. First, the influence of sampling conditions (including sampling hour and duration of storage before freezing) and of creatinine levels on biomarker concentrations was characterized using adjusted linear regression models. In the second step, the model estimates were used to remove the variability in biomarker concentrations due to sampling conditions and to standardize concentrations as if all samples had been collected under the same conditions (e.g., same hour of urine collection). Sampling hour was associated with concentrations of several exposure biomarkers. After standardization for sampling conditions, median concentrations differed by between -38% (for 2,5-dichlorophenol) and +80% (for a metabolite of diisodecyl phthalate). However, at the individual level, standardized biomarker levels were strongly correlated (correlation coefficients above 0.80) with unstandardized measures. Sampling conditions, such as sampling hour, should be systematically collected in biomarker-based studies, in particular when the biomarker half-life is short. The 2-step standardization method based on regression residuals that we propose to limit the impact of heterogeneity in sampling conditions could be further tested in studies describing levels of biomarkers or their influence on health.
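A minimal sketch of the two-step residual standardization, assuming a simple linear model with sampling hour and storage duration as covariates (the actual covariate set, transformations, and reference conditions used in the study may differ):

```python
import numpy as np

def standardize_on_sampling_conditions(log_conc, sampling_hour, storage_days,
                                       reference_hour=10.0, reference_storage=1.0):
    """Two-step residual standardization (sketch): regress log concentration
    on sampling conditions, then add the residuals back to the value
    predicted under fixed reference conditions."""
    X = np.column_stack([np.ones_like(sampling_hour), sampling_hour, storage_days])
    beta, *_ = np.linalg.lstsq(X, log_conc, rcond=None)
    residuals = log_conc - X @ beta
    x_ref = np.array([1.0, reference_hour, reference_storage])
    return float(x_ref @ beta) + residuals

# toy data for one metabolite (log-transformed concentrations)
rng = np.random.default_rng(0)
hour = rng.uniform(7, 18, 200)
storage = rng.uniform(0, 5, 200)
log_conc = 2.0 - 0.05 * hour + 0.08 * storage + rng.normal(0, 0.4, 200)

standardized = standardize_on_sampling_conditions(log_conc, hour, storage)
print(np.corrcoef(log_conc, standardized)[0, 1])  # individual-level correlation stays high
```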
Visualization of medical data based on EHR standards.
Kopanitsa, G; Hildebrand, C; Stausberg, J; Englmeier, K H
2013-01-01
To organize efficient interaction between a doctor and an EHR, the data have to be presented in the most convenient way. Medical data presentation methods and models must be flexible in order to cover the needs of users with different backgrounds and requirements. Most visualization methods are doctor-oriented; however, there are indications that the involvement of patients can optimize healthcare. The research aims at specifying the state of the art of medical data visualization. The paper analyzes a number of projects and defines requirements for a generic ISO 13606 based data visualization method. In order to do so, it starts with a systematic search for studies on EHR user interfaces. In order to identify best practices, visualization methods were evaluated according to the following criteria: limits of application, customizability, and re-usability. The visualization methods were compared using the specified criteria. The review showed that the analyzed projects can contribute knowledge to the development of a generic visualization method. However, none of them proposed a model that meets all the necessary criteria for a re-usable standard based visualization method. The shortcomings were mostly related to the structure of current medical concept specifications. The analysis showed that medical data visualization methods use hardcoded GUIs, which gives little flexibility. Medical data visualization therefore has to turn from hardcoded user interfaces to generic methods. This requires a great effort because current standards are not suitable for organizing the management of visualization data. This contradiction between a generic method and a flexible, user-friendly data layout has to be overcome.
Du, Baoqiang; Dong, Shaofeng; Wang, Yanfeng; Guo, Shuting; Cao, Lingzhi; Zhou, Wei; Zuo, Yandi; Liu, Dan
2013-11-01
A wide-frequency and high-resolution frequency measurement method based on the quantized phase step law is presented in this paper. Utilizing a variation law of the phase differences, direct difference-frequency phase processing, and the phase group synchronization phenomenon, and combining an A/D converter with the adaptive phase-shifting principle, a counter gate is established at the phase coincidences at one-group intervals, which eliminates the ±1 counter error of the traditional frequency measurement method. More importantly, direct phase comparison, measurement, and control between any periodic signals have been realized without frequency normalization in this method. Experimental results show that sub-picosecond resolution can be easily obtained in frequency measurement, frequency standard comparison, and phase-locked control based on the phase quantization processing technique. The method may be widely used in navigation positioning, space techniques, communication, radar, astronomy, atomic frequency standards, and other high-tech fields.
High-efficiency power transfer for silicon-based photonic devices
NASA Astrophysics Data System (ADS)
Son, Gyeongho; Yu, Kyoungsik
2018-02-01
We demonstrate an efficient coupling of guided light of 1550 nm from a standard single-mode optical fiber to a silicon waveguide using the finite-difference time-domain method and propose a fabrication method of tapered optical fibers for efficient power transfer to silicon-based photonic integrated circuits. Adiabatically-varying fiber core diameters with a small tapering angle can be obtained using the tube etching method with hydrofluoric acid and standard single-mode fibers covered by plastic jackets. The optical power transmission of the fundamental HE11 and TE-like modes between the fiber tapers and the inversely-tapered silicon waveguides was calculated with the finite-difference time-domain method to be more than 99% at a wavelength of 1550 nm. The proposed method for adiabatic fiber tapering can be applied in quantum optics, silicon-based photonic integrated circuits, and nanophotonics. Furthermore, efficient coupling within the telecommunication C-band is a promising approach for quantum networks in the future.
NASA Astrophysics Data System (ADS)
Grunin, A. P.; Kalinov, G. A.; Bolokhovtsev, A. V.; Sai, S. V.
2018-05-01
This article reports on a novel method to improve the accuracy of positioning an object with a low-frequency hyperbolic radio navigation system such as eLoran. The method is based on the application of the standard Kalman filter. The influence of the filter parameters and of the type of movement on the accuracy of the vehicle position estimate is investigated. The accuracy of the method was evaluated by separating data from a semi-empirical movement model into different types of movement.
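A minimal constant-velocity Kalman filter over noisy two-dimensional position fixes illustrates the kind of filtering described; the state model, noise settings, and synthetic track below are assumptions, not the parameters tuned in the study.

```python
import numpy as np

def kalman_track(positions, dt=1.0, process_var=0.05, meas_var=25.0):
    """Constant-velocity Kalman filter over noisy 2-D position fixes.
    State: [x, vx, y, vy]; only positions are measured."""
    F = np.array([[1, dt, 0, 0],
                  [0, 1,  0, 0],
                  [0, 0,  1, dt],
                  [0, 0,  0, 1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 0, 1, 0]], dtype=float)
    Q = process_var * np.eye(4)
    R = meas_var * np.eye(2)
    x = np.array([positions[0][0], 0.0, positions[0][1], 0.0])
    P = np.eye(4) * 100.0
    smoothed = []
    for z in positions:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new position fix
        y = np.asarray(z, dtype=float) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
        smoothed.append((x[0], x[2]))
    return smoothed

# toy track: straight-line motion with heavy measurement noise
rng = np.random.default_rng(1)
truth = np.column_stack([np.arange(100) * 3.0, np.arange(100) * 1.5])
noisy = truth + rng.normal(0, 5.0, truth.shape)
print(kalman_track(noisy)[-1])
```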
The impact of heterogeneity in individual frailty on the dynamics of mortality.
Vaupel, J W; Manton, K G; Stallard, E
1979-08-01
Life table methods are developed for populations whose members differ in their endowment for longevity. Unlike standard methods, which ignore such heterogeneity, these methods use different calculations to construct cohort, period, and individual life tables. The results imply that standard methods overestimate current life expectancy and potential gains in life expectancy from health and safety interventions, while underestimating rates of individual aging, past progress in reducing mortality, and mortality differentials between pairs of populations. Calculations based on Swedish mortality data suggest that these errors may be important, especially in old age.
Absolute method of measuring magnetic susceptibility
Thorpe, A.; Senftle, F.E.
1959-01-01
An absolute method of standardization and measurement of the magnetic susceptibility of small samples is presented which can be applied to most techniques based on the Faraday method. Because the susceptibility is a function of the area under the curve of sample displacement versus distance of the magnet from the sample, the susceptibility can be measured simply without recourse to a standard sample. Typical results for a few substances are compared with reported values, and an error of less than 2% can be achieved. © 1959 The American Institute of Physics.
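The numerical step implied by the abstract, integrating the displacement-versus-distance curve, can be sketched as below; the trapezoid-rule integration is generic, and the calibration constant relating the area to susceptibility is a hypothetical placeholder rather than an actual instrument constant.

```python
import numpy as np

def susceptibility_from_curve(distance_mm, displacement_um, calibration_constant):
    """Integrate sample displacement vs. magnet distance (trapezoid rule) and
    scale by a calibration constant (placeholder) to obtain susceptibility."""
    widths = np.diff(distance_mm)
    mean_heights = (displacement_um[:-1] + displacement_um[1:]) / 2.0
    area = float(np.sum(mean_heights * widths))
    return calibration_constant * area

# toy curve: displacement decays as the magnet is moved away from the sample
d = np.linspace(0, 50, 26)            # magnet-sample distance, mm
disp = 12.0 * np.exp(-d / 15.0)       # measured displacement, micrometres
print(susceptibility_from_curve(d, disp, calibration_constant=1e-8))
```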
NASA Astrophysics Data System (ADS)
Tsai, Suh-Jen Jane; Shiue, Chia-Chann; Chang, Shiow-Ing
1997-07-01
The analytical characteristics of copper in nickel-base alloys have been investigated with electrothermal atomic absorption spectrometry. Deuterium background correction was employed. The effects of various chemical modifiers on the analysis of copper were investigated. Organic modifiers which included 2-(5-bromo-2-pyridylazo)-5-(diethylamino-phenol) (Br-PADAP), ammonium citrate, 1-(2-pyridylazo)-naphthol, 4-(2-pyridylazo)resorcinol, ethylenediaminetetraacetic acid and Triton X-100 were studied. Inorganic modifiers palladium nitrate, magnesium nitrate, aluminum chloride, ammonium dihydrogen phosphate, hydrogen peroxide and potassium nitrate were also applied in this work. In addition, zirconium hydroxide and ammonium hydroxide precipitation methods have also been studied. Interference effects were effectively reduced with Br-PADAP modifier. Aqueous standards were used to construct the calibration curves. The detection limit was 1.9 pg. Standard reference materials of nickel-base alloys were used to evaluate the accuracy of the proposed method. The copper contents determined with the proposed method agreed closely with the certified values of the reference materials. The recoveries were within the range 90-100% with relative standard deviation of less than 10%. Good precision was obtained.
A frequency standard via spectrum analysis and direct digital synthesis
NASA Astrophysics Data System (ADS)
Li, Dawei; Shi, Daiting; Hu, Ermeng; Wang, Yigen; Tian, Lu; Zhao, Jianye; Wang, Zhong
2014-11-01
We demonstrated a frequency standard based on a detuned coherent population beating phenomenon. In this phenomenon, the beat frequency between the radio frequency used for laser modulation and the hyperfine splitting can be obtained by digital signal processing technology. After analyzing the spectrum of the beat frequency, the fluctuation information is obtained and applied to compensate for the frequency shift and to generate the standard frequency by the direct digital synthesis method. A frequency instability of 2.6 × 10⁻¹² at 1000 s is observed in our preliminary experiment. By eliminating the phase-locking loop, the method will enable us to achieve a full-digital frequency standard with remarkable stability.
Chromý, Vratislav; Vinklárková, Bára; Šprongl, Luděk; Bittová, Miroslava
2015-01-01
We found previously that albumin-calibrated total protein in certified reference materials causes unacceptable positive bias in analysis of human sera. The simplest way to cure this defect is the use of human-based serum/plasma standards calibrated by the Kjeldahl method. Such standards, commutative with serum samples, will compensate for bias caused by lipids and bilirubin in most human sera. To find a suitable primary reference procedure for total protein in reference materials, we reviewed Kjeldahl methods adopted by laboratory medicine. We found two methods recommended for total protein in human samples: an indirect analysis based on total Kjeldahl nitrogen corrected for its nonprotein nitrogen and a direct analysis made on isolated protein precipitates. The methods found will be assessed in a subsequent article.
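The indirect Kjeldahl calculation referred to (total Kjeldahl nitrogen corrected for nonprotein nitrogen) reduces to simple arithmetic, sketched below; the nitrogen-to-protein conversion factor of 6.25 is a commonly used default and is only an assumption here, as are the example inputs.

```python
def total_protein_g_per_l(total_kjeldahl_n_g_per_l,
                          nonprotein_n_g_per_l,
                          conversion_factor=6.25):
    """Indirect Kjeldahl estimate: protein nitrogen (total Kjeldahl nitrogen
    minus nonprotein nitrogen) times a nitrogen-to-protein conversion factor
    (6.25 is a common default, used here as an assumption)."""
    protein_nitrogen = total_kjeldahl_n_g_per_l - nonprotein_n_g_per_l
    return protein_nitrogen * conversion_factor

# example with illustrative serum-like values (g/L of nitrogen)
print(total_protein_g_per_l(11.8, 0.35))  # about 71.6 g/L of total protein
```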
Validation of a standardized extraction method for formalin-fixed paraffin-embedded tissue samples.
Lagheden, Camilla; Eklund, Carina; Kleppe, Sara Nordqvist; Unger, Elizabeth R; Dillner, Joakim; Sundström, Karin
2016-07-01
Formalin-fixed paraffin-embedded (FFPE) samples can be DNA-extracted and used for human papillomavirus (HPV) genotyping. The xylene-based gold standard for extracting FFPE samples is laborious, suboptimal and involves health hazards for the personnel involved. To compare extraction with the standard xylene method to a xylene-free method used in an HPV LabNet Global Reference Laboratory at the Centers for Disease Control (CDC); based on a commercial method with an extra heating step. Fifty FFPE samples were randomly selected from a national audit of all cervical cancer cases diagnosed in Sweden during 10 years. For each case-block, a blank-block was sectioned, as a control for contamination. For xylene extraction, the standard WHO Laboratory Manual protocol was used. For the CDC method, the manufacturers' protocol was followed except for an extra heating step, 120°C for 20min. Samples were extracted and tested in parallel with β-globin real-time PCR, HPV16 real-time PCR and HPV typing using modified general primers (MGP)-PCR and Luminex assays. For a valid result the blank-block had to be betaglobin-negative in all tests and the case-block positive for beta-globin. Overall, detection was improved with the heating method and the amount of HPV-positive samples increased from 70% to 86% (p=0.039). For all samples where HPV type concordance could be evaluated, there was 100% type concordance. A xylene-free and robust extraction method for HPV-DNA typing in FFPE material is currently in great demand. Our proposed standardized protocol appears to be generally useful. Copyright © 2016. Published by Elsevier B.V.
Pediatric Psycho-oncology Care: Standards, Guidelines and Consensus Reports
Wiener, Lori; Viola, Adrienne; Koretski, Julia; Perper, Emily Diana; Patenaude, Andrea Farkas
2014-01-01
Objective To identify existing guidelines, standards, or consensus-based reports for psychosocial care of children with cancer and their families. Purpose Psychosocial standards of care for children with cancer can systematize the approach to care and create a replicable model that can be utilized in pediatric hospitals around the world. Determining gaps in existing standards in pediatric psycho-oncology can guide development of useful evidence- and consensus-based standards. Methods The MEDLINE and PubMed databases were searched by investigators at two major pediatric oncology centers for existing guidelines, consensus-based reports, or standards for psychosocial care of pediatric cancer patients and their families published in peer-reviewed journals in English between 1980 and 2013. Results We located 27 articles about psychosocial care that met inclusion criteria: 5 set forth standards, 19 guidelines and 3 were consensus-based reports. None were sufficiently up-to-date, significantly evidence-based, comprehensive and specific enough to serve as a current standard for psychosocial care for children with cancer and their families. Conclusion Despite calls by a number of international pediatric oncology and psycho-oncology professional organizations about the urgency of addressing the psychosocial needs of the child with cancer in order to reduce suffering, there remains a need for development of a widely acceptable, evidence- and consensus-based, comprehensive standard of care to guide provision of essential psychosocial services to all pediatric cancer patients. PMID:24906202
NASA Astrophysics Data System (ADS)
Huebner, Claudia S.
2016-10-01
As a consequence of fluctuations in the index of refraction of the air, atmospheric turbulence causes scintillation, spatial and temporal blurring as well as global and local image motion creating geometric distortions. To mitigate these effects many different methods have been proposed. Global as well as local motion compensation in some form or other constitutes an integral part of many software-based approaches. For the estimation of motion vectors between consecutive frames simple methods like block matching are preferable to more complex algorithms like optical flow, at least when challenged with near real-time requirements. However, the processing power of commercially available computers continues to increase rapidly and the more powerful optical flow methods have the potential to outperform standard block matching methods. Therefore, in this paper three standard optical flow algorithms, namely Horn-Schunck (HS), Lucas-Kanade (LK) and Farnebäck (FB), are tested for their suitability to be employed for local motion compensation as part of a turbulence mitigation system. Their qualitative performance is evaluated and compared with that of three standard block matching methods, namely Exhaustive Search (ES), Adaptive Rood Pattern Search (ARPS) and Correlation based Search (CS).
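A minimal example of Farnebäck flow estimation and the warping step it would feed in a motion-compensation pipeline is sketched below; the parameter values are typical OpenCV settings and the input video name is hypothetical, so this does not reproduce the evaluation in the paper.

```python
import cv2
import numpy as np

def farneback_flow(prev_gray, next_gray):
    """Dense optical flow (Farnebäck) between two grayscale frames.
    Returns an HxWx2 array of per-pixel (dx, dy) motion vectors."""
    return cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)

def warp_to_reference(frame, flow):
    """Warp a frame back along the estimated flow (simple local motion compensation)."""
    h, w = flow.shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    return cv2.remap(frame, map_x, map_y, interpolation=cv2.INTER_LINEAR)

cap = cv2.VideoCapture("turbulent_sequence.avi")  # hypothetical input video
ok, prev_frame = cap.read()
prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
ok, next_frame = cap.read()
next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)

flow = farneback_flow(prev_gray, next_gray)
compensated = warp_to_reference(next_gray, flow)
```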
Xiao, Meng; Kong, Fanrong; Jin, Ping; Wang, Qinning; Xiao, Kelin; Jeoffreys, Neisha; James, Gregory
2012-01-01
PCR ribotyping is the most commonly used Clostridium difficile genotyping method, but its utility is limited by lack of standardization. In this study, we analyzed four published whole genomes and tested an international collection of 21 well-characterized C. difficile ribotype 027 isolates as the basis for comparison of two capillary gel electrophoresis (CGE)-based ribotyping methods. There were unexpected differences between the 16S-23S rRNA intergenic spacer region (ISR) allelic profiles of the four ribotype 027 genomes, but six bands were identified in all four and a seventh in three genomes. All seven bands and another, not identified in any of the whole genomes, were found in all 21 isolates. We compared sequencer-based CGE (SCGE) with three different primer pairs to the Qiagen QIAxcel CGE (QCGE) platform. Deviations from individual reference/consensus band sizes were smaller for SCGE (0 to 0.2 bp) than for QCGE (4.2 to 9.5 bp). Compared with QCGE, SCGE more readily distinguished bands of similar length (more discriminatory), detected bands of larger size and lower intensity (more sensitive), and assigned band sizes more accurately and reproducibly, making it more suitable for standardization. Specifically, QCGE failed to identify the largest ISR amplicon. Based on several criteria, we recommend the primer set 16S-USA/23S-USA for use in a proposed standard SCGE method. Similar differences between SCGE and QCGE were found on testing of 14 isolates of four other C. difficile ribotypes. Based on our results, ISR profiles based on accurate sequencer-based band lengths would be preferable to agarose gel-based banding patterns for the assignment of ribotypes. PMID:22692737
Fission matrix-based Monte Carlo criticality analysis of fuel storage pools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farlotti, M.; Ecole Polytechnique, Palaiseau, F 91128; Larsen, E. W.
2013-07-01
Standard Monte Carlo transport procedures experience difficulties in solving criticality problems in fuel storage pools. Because of the strong neutron absorption between fuel assemblies, source convergence can be very slow, leading to incorrect estimates of the eigenvalue and the eigenfunction. This study examines an alternative fission matrix-based Monte Carlo transport method that takes advantage of the geometry of a storage pool to overcome this difficulty. The method uses Monte Carlo transport to build (essentially) a fission matrix, which is then used to calculate the criticality and the critical flux. This method was tested using a test code on a simple problem containing 8 assemblies in a square pool. The standard Monte Carlo method gave the expected eigenfunction in 5 cases out of 10, while the fission matrix method gave the expected eigenfunction in all 10 cases. In addition, the fission matrix method provides an estimate of the error in the eigenvalue and the eigenfunction, and it allows the user to control this error by running an adequate number of cycles. Because of these advantages, the fission matrix method yields a higher confidence in the results than standard Monte Carlo. We also discuss potential improvements of the method, including the potential for variance reduction techniques. (authors)
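Once a fission matrix has been tallied, the eigenvalue step reduces to a power iteration, as in the sketch below; the 4-region matrix shown is a synthetic example with weak inter-region coupling, not one produced by an actual Monte Carlo run.

```python
import numpy as np

def fission_matrix_eigen(F, tol=1e-10, max_iter=10000):
    """Power iteration on a fission matrix F, where F[i, j] is the expected
    number of fission neutrons produced in region i per fission neutron born
    in region j. Returns (k_eff, normalized fission source)."""
    n = F.shape[0]
    s = np.ones(n) / n
    k = 1.0
    for _ in range(max_iter):
        s_new = F @ s
        k_new = s_new.sum() / s.sum()
        s_new = s_new / s_new.sum()
        if abs(k_new - k) < tol and np.max(np.abs(s_new - s)) < tol:
            return k_new, s_new
        k, s = k_new, s_new
    return k, s

# synthetic 4-region example (weakly coupled regions, as in a storage pool)
F = np.array([[0.90, 0.05, 0.00, 0.00],
              [0.05, 0.88, 0.05, 0.00],
              [0.00, 0.05, 0.88, 0.05],
              [0.00, 0.00, 0.05, 0.90]])
k_eff, source = fission_matrix_eigen(F)
print(k_eff, source)
```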
NASA Astrophysics Data System (ADS)
Aschonitis, Vassilis G.; Papamichail, Dimitris; Demertzi, Kleoniki; Colombani, Nicolo; Mastrocicco, Micol; Ghirardini, Andrea; Castaldelli, Giuseppe; Fano, Elisa-Anna
2017-08-01
The objective of the study is to provide global grids (0.5°) of revised annual coefficients for the Priestley-Taylor (P-T) and Hargreaves-Samani (H-S) evapotranspiration methods after calibration based on the ASCE (American Society of Civil Engineers)-standardized Penman-Monteith method (the ASCE method includes two reference crops: short-clipped grass and tall alfalfa). The analysis also includes the development of a global grid of revised annual coefficients for solar radiation (Rs) estimations using the respective Rs formula of H-S. The analysis was based on global gridded climatic data of the period 1950-2000. The method for deriving annual coefficients of the P-T and H-S methods was based on partial weighted averages (PWAs) of their mean monthly values. This method estimates the annual values considering the amplitude of the parameter under investigation (ETo and Rs) giving more weight to the monthly coefficients of the months with higher ETo values (or Rs values for the case of the H-S radiation formula). The method also eliminates the effect of unreasonably high or low monthly coefficients that may occur during periods where ETo and Rs fall below a specific threshold. The new coefficients were validated based on data from 140 stations located in various climatic zones of the USA and Australia with expanded observations up to 2016. The validation procedure for ETo estimations of the short reference crop showed that the P-T and H-S methods with the new revised coefficients outperformed the standard methods reducing the estimated root mean square error (RMSE) in ETo values by 40 and 25 %, respectively. The estimations of Rs using the H-S formula with revised coefficients reduced the RMSE by 28 % in comparison to the standard H-S formula. Finally, a raster database was built consisting of (a) global maps for the mean monthly ETo values estimated by ASCE-standardized method for both reference crops, (b) global maps for the revised annual coefficients of the P-T and H-S evapotranspiration methods for both reference crops and a global map for the revised annual coefficient of the H-S radiation formula and (c) global maps that indicate the optimum locations for using the standard P-T and H-S methods and their possible annual errors based on reference values. The database can support estimations of ETo and solar radiation for locations where climatic data are limited and it can support studies which require such estimations on larger scales (e.g. country, continent, world). The datasets produced in this study are archived in the PANGAEA database (https://doi.org/10.1594/PANGAEA.868808) and in the ESRN database (http://www.esrn-database.org or http://esrn-database.weebly.com).
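For reference, the sketch below shows the standard Hargreaves-Samani and Priestley-Taylor formulas whose default coefficients (0.0023 and 1.26) are the quantities being recalibrated, together with a simplified ETo-weighted annual averaging in the spirit of the partial weighted averages described above. Variable names, units, and the exact weighting scheme are assumptions, not the paper's code.

# Sketch of the two standard ETo formulas and a simplified PWA-style annual
# coefficient. Units: temperatures in °C, radiation terms in MJ m-2 day-1.
import numpy as np

def eto_hargreaves_samani(tmean, tmax, tmin, ra_mj, c=0.0023):
    """Hargreaves-Samani ETo [mm/day]; ra_mj is extraterrestrial radiation."""
    ra_mm = ra_mj / 2.45                       # convert to equivalent mm/day
    return c * (tmean + 17.8) * np.sqrt(tmax - tmin) * ra_mm

def eto_priestley_taylor(delta, gamma, rn, g=0.0, alpha=1.26):
    """Priestley-Taylor ETo [mm/day]; delta, gamma in kPa/°C, rn and g in MJ m-2 day-1."""
    return alpha * (delta / (delta + gamma)) * (rn - g) / 2.45

def annual_coefficient_pwa(monthly_coeff, monthly_eto):
    """Annual coefficient as an ETo-weighted average of monthly coefficients
    (a simplified reading of the partial weighted average idea)."""
    w = np.asarray(monthly_eto, float)
    w = w / w.sum()
    return float(np.sum(w * np.asarray(monthly_coeff, float)))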
PRA and Risk Informed Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernsen, Sidney A.; Simonen, Fredric A.; Balkey, Kenneth R.
2006-01-01
The Boiler and Pressure Vessel Code (BPVC) of the American Society of Mechanical Engineers (ASME) has introduced a risk based approach into Section XI that covers Rules for Inservice Inspection of Nuclear Power Plant Components. The risk based approach requires application of probabilistic risk assessment (PRA). Because no industry consensus standard existed for PRAs, ASME has developed a standard to evaluate the quality level of an available PRA needed to support a given risk based application. The paper describes the PRA standard, Section XI application of PRAs, and plans for broader applications of PRAs to other ASME nuclear codes and standards. The paper addresses several specific topics of interest to Section XI. Important considerations are special methods (surrogate components) used to overcome the lack of PRA treatments of passive components in PRAs. The approach allows calculations of conditional core damage probabilities both for component failures that cause initiating events and for failures in standby systems that decrease the availability of these systems. The paper relates the explicit risk based methods of the new Section XI code cases to the implicit consideration of risk used in the development of Section XI. Other topics include the needed interactions of ISI engineers, plant operating staff, PRA specialists, and members of expert panels that review the risk based programs.
Burkhardt, Mark R.; Cinotto, Pete J.; Frahm, Galen W.; Woodworth, Mark T.; Pritt, Jeffrey W.
1995-01-01
A method for the determination of methylene blue active substances in whole-water samples by liquid-liquid extraction and spectrophotometric detection is described. Sulfate- and sulfonate-based surfactants are reacted with methylene blue to form a blue-colored complex. The complex is extracted into chloroform, back-washed with an acidified phosphate-based buffer solution, and measured against external standards with a probe spectrophotometer. The method detection limit for routine analysis is 0.02 milligram per liter. The precision is plus/minus 10 percent relative standard deviation. The positive bias from nitrate and chloride in U.S. Geological Survey method O-3111-83 for methylene blue active substances is minimized by adding a back-washing step.
Generalized Preconditioned Locally Harmonic Residual Eigensolver (GPLHR) v0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
VECHARYNSKI, EUGENE; YANG, CHAO
The software contains a MATLAB implementation of the Generalized Preconditioned Locally Harmonic Residual (GPLHR) method for solving standard and generalized non-Hermitian eigenproblems. The method is particularly useful for computing a subset of eigenvalues, and their eigen- or Schur vectors, closest to a given shift. The proposed method is based on block iterations and can take advantage of a preconditioner if it is available. It does not need to perform exact shift-and-invert transformation. Standard and generalized eigenproblems are handled in a unified framework.
Residual gravimetric method to measure nebulizer output.
Vecellio None, Laurent; Grimbert, Daniel; Bordenave, Joelle; Benoit, Guy; Furet, Yves; Fauroux, Brigitte; Boissinot, Eric; De Monte, Michele; Lemarié, Etienne; Diot, Patrice
2004-01-01
The aim of this study was to assess a residual gravimetric method based on weighing dry filters to measure the aerosol output of nebulizers. This residual gravimetric method was compared to assay methods based on spectrophotometric measurement of terbutaline (Bricanyl, Astra Zeneca, France), high-performance liquid chromatography (HPLC) measurement of tobramycin (Tobi, Chiron, U.S.A.), and electrochemical measurements of NaF (as defined by the European standard). Two breath-enhanced jet nebulizers, one standard jet nebulizer, and one ultrasonic nebulizer were tested. Output produced by the residual gravimetric method was calculated by weighing the filters both before and after aerosol collection and by filter drying corrected by the proportion of drug contained in total solute mass. Output produced by the electrochemical, spectrophotometric, and HPLC methods was determined after assaying the drug extraction filter. The results demonstrated a strong correlation between the residual gravimetric method (x axis) and assay methods (y axis) in terms of drug mass output (y = 1.00x - 0.02, r² = 0.99, n = 27). We conclude that a residual gravimetric method based on dry filters, when validated for a particular agent, is an accurate way of measuring aerosol output.
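The residual gravimetric calculation described above reduces to a small amount of arithmetic; the sketch below, with invented numbers, shows the dry-filter mass gain corrected by the drug's fraction of the total solute mass.

# Minimal sketch of the residual gravimetric calculation: the dry-filter mass
# gain is the total solute deposited, and the drug mass output is that gain
# multiplied by the drug's fraction of total solute mass. Numbers are made up.
def drug_mass_output(filter_mass_before_g, filter_mass_after_g,
                     drug_fraction_of_solute):
    solute_deposited_g = filter_mass_after_g - filter_mass_before_g
    return solute_deposited_g * drug_fraction_of_solute

# e.g. a nebulizer charge in which the drug is 20% of total solute mass
print(drug_mass_output(1.2500, 1.2625, 0.20))   # -> about 0.0025 g of drug on the filter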
ERIC Educational Resources Information Center
Duncombe, William; Yinger, John
This policy brief explains why performance focus and educational cost indexes must go hand in hand, discusses alternative methods for estimating educational cost indexes, and shows how these costs indexes can be incorporated into a performance-based state aid program. A shift to educational performance standards, whether these standards are…
Standards for space automation and robotics
NASA Technical Reports Server (NTRS)
Kader, Jac B.; Loftin, R. B.
1992-01-01
The AIAA's Committee on Standards for Space Automation and Robotics (COS/SAR) is charged with the identification of key functions and critical technologies applicable to multiple missions that reflect fundamental consideration of environmental factors. COS/SAR's standards/practices/guidelines implementation methods will be based on reliability, performance, and operations, as well as economic viability and life-cycle costs, simplicity, and modularity.
Standardized Photometric Calibrations for Panchromatic SSA Sensors
NASA Astrophysics Data System (ADS)
Castro, P.; Payne, T.; Battle, A.; Cole, Z.; Moody, J.; Gregory, S.; Dao, P.
2016-09-01
Panchromatic sensors used for Space Situational Awareness (SSA) have no standardized method for transforming the net flux detected by a CCD without a spectral filter into an exo-atmospheric magnitude in a standard magnitude system. Each SSA data provider appears to have their own method for computing the visual magnitude based on panchromatic brightness making cross-comparisons impossible. We provide a procedure in order to standardize the calibration of panchromatic sensors for the purposes of SSA. A technique based on theoretical modeling is presented that derives standard panchromatic magnitudes from the Johnson-Cousins photometric system defined by Arlo Landolt. We verify this technique using observations of Landolt standard stars and a Vega-like star to determine empirical panchromatic magnitudes and compare these to synthetically derived panchromatic magnitudes. We also investigate color terms caused by differences in the quantum efficiency (QE) between the Landolt standard system and panchromatic systems. We evaluate calibrated panchromatic satellite photometry by observing several GEO satellites and standard stars using three different sensors. We explore the effect of satellite color terms by comparing the satellite signatures. In order to remove other variables affecting the satellite photometry, two of the sensors are at the same site using different CCDs. The third sensor is geographically separate from the first two allowing for a definitive test of calibrated panchromatic satellite photometry.
Determination of viable legionellae in engineered water systems: Do we find what we are looking for?
Kirschner, Alexander K.T.
2016-01-01
In developed countries, legionellae are one of the most important water-based bacterial pathogens caused by management failure of engineered water systems. For routine surveillance of legionellae in engineered water systems and outbreak investigations, cultivation-based standard techniques are currently applied. However, in many cases culture-negative results are obtained despite the presence of viable legionellae, and clinical cases of legionellosis cannot be traced back to their respective contaminated water source. Among the various explanations for these discrepancies, the presence of viable but non-culturable (VBNC) Legionella cells has received increased attention in recent discussions and scientific literature. Alternative culture-independent methods to detect and quantify legionellae have been proposed in order to complement or even substitute the culture method in the future. Such methods should detect VBNC Legionella cells and provide a more comprehensive picture of the presence of legionellae in engineered water systems. However, it is still unclear whether and to what extent these VBNC legionellae are hazardous to human health. Current risk assessment models to predict the risk of legionellosis from Legionella concentrations in the investigated water systems contain many uncertainties and are mainly based on culture-based enumeration. If VBNC legionellae should be considered in future standard analysis, quantitative risk assessment models including VBNC legionellae must be proven to result in better estimates of human health risk than models based on cultivation alone. This review critically evaluates current methods to determine legionellae in the VBNC state, their potential to complement the standard culture-based method in the near future, and summarizes current knowledge on the threat that VBNC legionellae may pose to human health. PMID:26928563
A STANDARDIZED ASSESSMENT METHOD (SAM) FOR RIVERINE MACROINVERTEBRATES
A macroinvertebrate sampling method for large rivers based on desirable characteristics of existing nonwadeable methods was developed and tested. Six sites each were sampled on the Great Miami and Kentucky Rivers, reflecting a human disturbance gradient. Samples were collected ...
How far is it? Distance measurements and their consequences
NASA Astrophysics Data System (ADS)
Krełowski, Jacek
2017-08-01
Methods of measuring distances to objects in our Milky Way are briefly discussed. They are generally based on three principles: the use of a standard rod, the use of a standard candle, and the column density of interstellar matter. Weak and strong points of these methods are presented. The presence of gray extinction towards some objects is suggested, which makes the most universal method, the standard candle (spectroscopic parallax), very uncertain. It is hard to say whether gray extinction appears only in the form of circumstellar debris discs or is also present in the general interstellar medium. The application of the method of measuring column densities of interstellar gases suggests that the rotation curve of our Milky Way system is Keplerian rather than flat, which raises doubts as to whether any dark matter halo is present around our Galaxy. It is emphasized that the most universal method, i.e. that of the standard candle, used to estimate distances to cosmological objects, may suffer serious errors because of improper subtraction of extinction effects.
A new NIST primary standardization of 18F.
Fitzgerald, R; Zimmerman, B E; Bergeron, D E; Cessna, J C; Pibida, L; Moreira, D S
2014-02-01
A new primary standardization of 18F by NIST is reported. The standard is based on live-timed beta-gamma anticoincidence counting with confirmatory measurements by three other methods: (i) liquid scintillation (LS) counting using CIEMAT/NIST 3H efficiency tracing; (ii) triple-to-double coincidence ratio (TDCR) counting; and (iii) NaI integral counting and HPGe γ-ray spectrometry. The results are reported as calibration factors for NIST-maintained ionization chambers (including some "dose calibrators"). The LS-based methods reveal evidence for cocktail instability for one LS cocktail. Using an ionization chamber to link this work with previous NIST results, the new value differs from the previous reports by about 4%, but appears to be in good agreement with the key comparison reference value (KCRV) of 2005. © 2013 Published by Elsevier Ltd.
Rosenblum, Michael A; Laan, Mark J van der
2009-01-07
The validity of standard confidence intervals constructed in survey sampling is based on the central limit theorem. For small sample sizes, the central limit theorem may give a poor approximation, resulting in confidence intervals that are misleading. We discuss this issue and propose methods for constructing confidence intervals for the population mean tailored to small sample sizes. We present a simple approach for constructing confidence intervals for the population mean based on tail bounds for the sample mean that are correct for all sample sizes. Bernstein's inequality provides one such tail bound. The resulting confidence intervals have guaranteed coverage probability under much weaker assumptions than are required for standard methods. A drawback of this approach, as we show, is that these confidence intervals are often quite wide. In response to this, we present a method for constructing much narrower confidence intervals, which are better suited for practical applications, and that are still more robust than confidence intervals based on standard methods, when dealing with small sample sizes. We show how to extend our approaches to much more general estimation problems than estimating the sample mean. We describe how these methods can be used to obtain more reliable confidence intervals in survey sampling. As a concrete example, we construct confidence intervals using our methods for the number of violent deaths between March 2003 and July 2006 in Iraq, based on data from the study "Mortality after the 2003 invasion of Iraq: A cross sectional cluster sample survey," by Burnham et al. (2006).
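A minimal sketch of a Bernstein-type confidence interval for a bounded mean is given below. It inverts the usual two-sided Bernstein tail bound and, for simplicity, uses the worst-case variance bound for variables on [lower, upper]; this is one possible construction in the spirit of the abstract, not the authors' exact procedure.

# Sketch: two-sided confidence interval for the mean of i.i.d. observations
# bounded in [lower, upper], obtained by inverting Bernstein's inequality.
import math

def bernstein_ci(xs, alpha=0.05, lower=0.0, upper=1.0):
    """The variance term uses the conservative bound (upper - lower)^2 / 4;
    plugging in the sample variance is a common, less conservative alternative."""
    n = len(xs)
    mean = sum(xs) / n
    c = upper - lower                       # bound on |X_i - mu|
    var_bound = c * c / 4.0                 # worst-case variance on [lower, upper]
    L = math.log(2.0 / alpha)
    b = 2.0 * c * L / 3.0
    # solve n*eps^2 - b*eps - 2*var_bound*L = 0 for the half-width eps
    eps = (b + math.sqrt(b * b + 8.0 * n * var_bound * L)) / (2.0 * n)
    return max(lower, mean - eps), min(upper, mean + eps)

print(bernstein_ci([0.1, 0.0, 0.3, 0.2, 0.1, 0.0, 0.4, 0.2]))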
Use of the Budyko Framework to Estimate the Virtual Water Content in Shijiazhuang Plain, North China
NASA Astrophysics Data System (ADS)
Zhang, E.; Yin, X.
2017-12-01
One of the most challenging steps in implementing analysis of the virtual water content (VWC) of agricultural crops is how to properly assess the volume of consumptive water use (CWU) for crop production. In practice, CWU is considered equivalent to the crop evapotranspiration (ETc). Following the crop coefficient method, ETc can be calculated under standard or non-standard conditions by multiplying the reference evapotranspiration (ET0) by one or a few coefficients. However, when current crop growing conditions deviate from standard conditions, accurately determining the coefficients under non-standard conditions remains a complicated process and requires extensive field experimental data. Based on the regional surface water-energy balance, this research integrates the Budyko framework into the traditional crop coefficient approach to simplify the determination of the coefficients. This new method enables assessment of agricultural VWC at the regional scale using only hydrometeorological data and agricultural statistics. To demonstrate the new method, we apply it to the Shijiazhuang Plain, an agricultural irrigation area in the North China Plain. The VWC of winter wheat and summer maize is calculated, and we further subdivide VWC into blue and green water components. Compared with previous studies in this study area, the VWC calculated by the Budyko-based crop coefficient approach uses less data and agrees well with some of the previous research. This suggests that the new method may serve as a more convenient tool for assessing VWC.
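The accounting step that converts consumptive water use into virtual water content can be sketched as follows, using the conventional crop-coefficient relation ETc = Kc × ET0; all numbers are illustrative, and the coefficient values are not those derived from the Budyko framework in the study.

# Minimal sketch of the VWC accounting: consumptive water use is approximated
# by ETc = Kc * ET0 accumulated over the growing season, and VWC is that volume
# divided by the crop yield. All values below are made up.
def virtual_water_content(kc_by_period, et0_mm_by_period, yield_t_per_ha):
    etc_mm = sum(kc * et0 for kc, et0 in zip(kc_by_period, et0_mm_by_period))
    cwu_m3_per_ha = etc_mm * 10.0          # 1 mm of water over 1 ha = 10 m3
    return cwu_m3_per_ha / yield_t_per_ha  # m3 of water per tonne of crop

print(virtual_water_content([0.4, 1.15, 0.4], [120.0, 310.0, 90.0], 6.5))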
NASA Astrophysics Data System (ADS)
Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon
2015-10-01
An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion for the first time. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, the size-exclusion chromatography method was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained a SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with an ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be generally used for SI-traceable absolute quantification of proteins, especially pure-protein standards.
We compared three methods for estimating fungal species diversity in soil samples. A rapid screening method based on gross colony morphological features and color reference standards was compared with traditional fungal taxonomic methods and PCR-RFLP for estimation of ecological ...
NASA Astrophysics Data System (ADS)
Chaillat, Stéphanie; Desiderio, Luca; Ciarlet, Patrick
2017-12-01
In this work, we study the accuracy and efficiency of hierarchical matrix (H-matrix) based fast methods for solving dense linear systems arising from the discretization of the 3D elastodynamic Green's tensors. It is well known in the literature that standard H-matrix based methods, although very efficient tools for asymptotically smooth kernels, are not optimal for oscillatory kernels. H2-matrix and directional approaches have been proposed to overcome this problem. However the implementation of such methods is much more involved than the standard H-matrix representation. The central questions we address are twofold. (i) What is the frequency-range in which the H-matrix format is an efficient representation for 3D elastodynamic problems? (ii) What can be expected of such an approach to model problems in mechanical engineering? We show that even though the method is not optimal (in the sense that more involved representations can lead to faster algorithms) an efficient solver can be easily developed. The capabilities of the method are illustrated on numerical examples using the Boundary Element Method.
Gloss uniformity measurement update for ISO/IEC 19751
NASA Astrophysics Data System (ADS)
Ng, Yee S.; Cui, Chengwu; Kuo, Chunghui; Maggard, Eric; Mashtare, Dale; Morris, Peter
2005-01-01
To address the standardization issues of perceptually based image quality for printing systems, ISO/IEC JTC1/SC28, the standardization committee for office equipment, chartered the W1.1 project with the responsibility of drafting a proposal for an international standard for the evaluation of printed image quality [1]. An ISO draft standard [2], ISO/WD 19751-1, Office Equipment - Appearance-based image quality standards for printers - Part 1: Overview, Procedure and Common Methods, 2004, describes the overview of this multi-part appearance-based image quality standard. One of the ISO 19751 multi-part standard's tasks is to address the appearance-based gloss and gloss uniformity issues (in ISO 19751-2). This paper summarizes the current status and technical progress since the last two updates [3, 4]. In particular, we will be discussing our attempt to include 75 degree gloss (G75) objective measurement [5] in differential gloss and within-page gloss uniformity. The result of a round-robin experiment involving objective measurement of differential gloss using G60 and G75 gloss measurement geometries is described. The results of two perceptual-based round-robin experiments, relating to the effect of haze on the perception of gloss and to gloss artifacts (gloss streaks/bands, gloss graininess/mottle), are discussed.
Intrathoracic airway measurement: ex-vivo validation
NASA Astrophysics Data System (ADS)
Reinhardt, Joseph M.; Raab, Stephen A.; D'Souza, Neil D.; Hoffman, Eric A.
1997-05-01
High-resolution x-ray CT (HRCT) provides detailed images of the lungs and bronchial tree. HRCT-based imaging and quantitation of peripheral bronchial airway geometry provides a valuable tool for assessing regional airway physiology. Such measurements have been used to address physiological questions related to the mechanics of airway collapse in sleep apnea, the measurement of airway response to broncho-constriction agents, and to evaluate and track the progression of disease affecting the airways, such as asthma and cystic fibrosis. Significant attention has been paid to the measurement of extra- and intra-thoracic airways in 2D sections from volumetric x-ray CT. A variety of manual and semi-automatic techniques have been proposed for airway geometry measurement, including the use of standardized display window and level settings for caliper measurements, methods based on manual or semi-automatic border tracing, and more objective, quantitative approaches such as the use of the 'half-max' criterion. A recently proposed measurement technique uses a model-based deconvolution to estimate the location of the inner and outer airway walls. Validation using a plexiglass phantom indicates that the model-based method is more accurate than the half-max approach for thin-walled structures. In vivo validation of these airway measurement techniques is difficult because of the problems in identifying a reliable measurement 'gold standard.' In this paper we report on ex vivo validation of the half-max and model-based methods using an excised pig lung. The lung is sliced into thin sections of tissue and scanned using an electron beam CT scanner. Airways of interest are measured from the CT images, and also measured with a microscope and micrometer to obtain a measurement gold standard. The results show no significant difference between the model-based measurements and the gold standard, while the half-max estimates exhibited a measurement bias and were significantly different from the gold standard.
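The 'half-max' criterion referred to above can be illustrated on a one-dimensional intensity profile: wall edges are placed where the profile crosses half of its peak value, with linear interpolation between samples. This is a simplified sketch, not the full 2D measurement procedure.

# Sketch of the half-max edge criterion on a 1-D intensity profile.
import numpy as np

def half_max_crossings(profile):
    """Return interpolated sample positions where the profile crosses half of
    its peak value; adjacent crossings bracket the airway wall."""
    p = np.asarray(profile, dtype=float)
    half = 0.5 * p.max()
    crossings = []
    for i in range(len(p) - 1):
        lo, hi = p[i], p[i + 1]
        if (lo - half) * (hi - half) < 0:          # sign change -> crossing
            frac = (half - lo) / (hi - lo)         # linear interpolation
            crossings.append(i + frac)
    return crossings

# toy profile across an airway wall: lumen (dark) -> wall (bright) -> parenchyma
print(half_max_crossings([10, 12, 40, 90, 100, 85, 30, 15, 12]))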
Salganik, Matthew J; Fazito, Dimitri; Bertoni, Neilane; Abdo, Alexandre H; Mello, Maeve B; Bastos, Francisco I
2011-11-15
One of the many challenges hindering the global response to the human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) epidemic is the difficulty of collecting reliable information about the populations most at risk for the disease. Thus, the authors empirically assessed a promising new method for estimating the sizes of most at-risk populations: the network scale-up method. Using 4 different data sources, 2 of which were from other researchers, the authors produced 5 estimates of the number of heavy drug users in Curitiba, Brazil. The authors found that the network scale-up and generalized network scale-up estimators produced estimates 5-10 times higher than estimates made using standard methods (the multiplier method and the direct estimation method using data from 2004 and 2010). Given that equally plausible methods produced such a wide range of results, the authors recommend that additional studies be undertaken to compare estimates based on the scale-up method with those made using other methods. If scale-up-based methods routinely produce higher estimates, this would suggest that scale-up-based methods are inappropriate for populations most at risk of HIV/AIDS or that standard methods may tend to underestimate the sizes of these populations.
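For context, the basic network scale-up estimator has a very simple form: the hidden population size is the total population multiplied by the ratio of hidden-population alters reported to total personal network size. The sketch below shows only this basic estimator; the generalized estimator used in the study involves additional adjustments.

# Sketch of the basic network scale-up estimator. All inputs are illustrative.
def network_scale_up(hidden_alters, network_sizes, total_population):
    """hidden_alters[i]  : hidden-population members respondent i reports knowing
    network_sizes[i]  : respondent i's estimated personal network size
    total_population  : size of the population the networks are drawn from"""
    return total_population * sum(hidden_alters) / sum(network_sizes)

# toy numbers: 3 respondents in a city of 1.8 million
print(network_scale_up([2, 0, 1], [300, 250, 400], 1_800_000))   # about 5,684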
Graziano, Kazuko Uchikawa; Pereira, Marta Elisa Auler; Koda, Elaine
2016-01-01
Objective: to elaborate and apply a method to assess the efficacy of automated flexible endoscope reprocessors at a time when there is not an official method or trained laboratories to comply with the requirements described in specific standards for this type of health product in Brazil. Method: the present methodological study was developed based on the following theoretical references: International Organization for Standardization (ISO) standard ISO 15883-4/2008 and Brazilian Health Surveillance Agency (Agência Nacional de Vigilância Sanitária - ANVISA) Collegiate Board Resolution (Resolução de Diretoria Colegiada - RDC) no. 35/2010 and 15/2012. The proposed method was applied to a commercially available device using a high-level 0.2% peracetic acid-based disinfectant. Results: the proposed method of assessment was found to be robust when the recommendations made in the relevant legislation were incorporated with some adjustments to ensure their feasibility. Application of the proposed method provided evidence of the efficacy of the tested equipment for the high-level disinfection of endoscopes. Conclusion: the proposed method may serve as a reference for the assessment of flexible endoscope reprocessors, thereby providing solid ground for the purchase of this category of health products. PMID:27508915
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Recently, a class of no-gold-standard (NGS) techniques have been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
Near-infrared fluorescence image quality test methods for standardized performance evaluation
NASA Astrophysics Data System (ADS)
Kanniyappan, Udayakumar; Wang, Bohan; Yang, Charles; Ghassemi, Pejhman; Wang, Quanzeng; Chen, Yu; Pfefer, Joshua
2017-03-01
Near-infrared fluorescence (NIRF) imaging has gained much attention as a clinical method for enhancing visualization of cancers, perfusion and biological structures in surgical applications where a fluorescent dye is monitored by an imaging system. In order to address the emerging need for standardization of this innovative technology, it is necessary to develop and validate test methods suitable for objective, quantitative assessment of device performance. Towards this goal, we develop target-based test methods and investigate best practices for key NIRF imaging system performance characteristics including spatial resolution, depth of field and sensitivity. Characterization of fluorescence properties was performed by generating excitation-emission matrix properties of indocyanine green and quantum dots in biological solutions and matrix materials. A turbid, fluorophore-doped target was used, along with a resolution target for assessing image sharpness. Multi-well plates filled with either liquid or solid targets were generated to explore best practices for evaluating detection sensitivity. Overall, our results demonstrate the utility of objective, quantitative, target-based testing approaches as well as the need to consider a wide range of factors in establishing standardized approaches for NIRF imaging system performance.
On the influence of high-pass filtering on ICA-based artifact reduction in EEG-ERP.
Winkler, Irene; Debener, Stefan; Müller, Klaus-Robert; Tangermann, Michael
2015-01-01
Standard artifact removal methods for electroencephalographic (EEG) signals are either based on Independent Component Analysis (ICA) or they regress out ocular activity measured at electrooculogram (EOG) channels. Successful ICA-based artifact reduction relies on suitable pre-processing. Here we systematically evaluate the effects of high-pass filtering at different frequencies. Offline analyses were based on event-related potential data from 21 participants performing a standard auditory oddball task and an automatic artifactual component classifier method (MARA). As a pre-processing step for ICA, high-pass filtering between 1-2 Hz consistently produced good results in terms of signal-to-noise ratio (SNR), single-trial classification accuracy and the percentage of 'near-dipolar' ICA components. Relative to no artifact reduction, ICA-based artifact removal significantly improved SNR and classification accuracy. This was not the case for a regression-based approach to remove EOG artifacts.
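A minimal sketch of the evaluated pipeline is given below: high-pass filter the continuous EEG, fit ICA, flag artifactual components, and reconstruct the signal without them. Because MARA itself is a trained classifier, the sketch substitutes a simple EOG-correlation criterion for component selection; the channel layout, thresholds, and the use of scikit-learn's FastICA are assumptions.

# Sketch: high-pass filtering followed by ICA-based artifact rejection.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

def remove_eog_components(eeg, eog, fs, hp_hz=1.0, n_components=20, r_thresh=0.6):
    """eeg: (n_channels, n_samples), eog: (n_samples,). Returns cleaned EEG.
    Component selection here (correlation with EOG) is a simplified stand-in
    for a classifier such as MARA."""
    b, a = butter(4, hp_hz / (fs / 2.0), btype="highpass")
    eeg_hp = filtfilt(b, a, eeg, axis=1)           # high-pass before the ICA fit

    ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
    sources = ica.fit_transform(eeg_hp.T).T        # (n_components, n_samples)

    keep = []
    for k, s in enumerate(sources):
        r = np.corrcoef(s, eog)[0, 1]
        if abs(r) < r_thresh:                      # not EOG-like -> keep it
            keep.append(k)
    sources_clean = np.zeros_like(sources)
    sources_clean[keep] = sources[keep]
    return ica.inverse_transform(sources_clean.T).T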
Mokkink, Lidwine B.; Prinsen, Cecilia A. C.; Bouter, Lex M.; de Vet, Henrica C. W.; Terwee, Caroline B.
2016-01-01
Background: COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) is an initiative of an international multidisciplinary team of researchers who aim to improve the selection of outcome measurement instruments both in research and in clinical practice by developing tools for selecting the most appropriate available instrument. Method: In this paper these tools are described, i.e. the COSMIN taxonomy and definition of measurement properties; the COSMIN checklist to evaluate the methodological quality of studies on measurement properties; a search filter for finding studies on measurement properties; a protocol for systematic reviews of outcome measurement instruments; a database of systematic reviews of outcome measurement instruments; and a guideline for selecting outcome measurement instruments for Core Outcome Sets in clinical trials. Currently, we are updating the COSMIN checklist, particularly the standards for content validity studies. New standards for studies using Item Response Theory methods will also be developed. Additionally, in the future we want to develop standards for studies on the quality of non-patient reported outcome measures, such as clinician-reported outcomes and performance-based outcomes. Conclusions: In summary, we plead for more standardization in the use of outcome measurement instruments, for conducting high quality systematic reviews on measurement instruments in which the best available outcome measurement instrument is recommended, and for stopping the use of poor outcome measurement instruments. PMID:26786084
Evolution of microbiological analytical methods for dairy industry needs
Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence
2014-01-01
Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus, offer new perspectives to integration of microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that Polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675
Application of FT-IR Classification Method in Silica-Plant Extracts Composites Quality Testing
NASA Astrophysics Data System (ADS)
Bicu, A.; Drumea, V.; Mihaiescu, D. E.; Purcareanu, B.; Florea, M. A.; Trică, B.; Vasilievici, G.; Draga, S.; Buse, E.; Olariu, L.
2018-06-01
Our present work is concerned with the validation and quality testing of mesoporous silica - plant extract composites, in order to support the standardization of plant-based pharmaceutical products. The synthesis of the silica support was performed using a TEOS-based synthetic route and CTAB as a template, at room temperature and normal pressure. The silica support was analyzed by advanced characterization methods (SEM, TEM, BET, DLS and FT-IR), and loaded with Calendula officinalis and Salvia officinalis standardized extracts. Further desorption studies were performed in order to prove the sustained release properties of the final materials. Intermediate and final product identification was performed by an FT-IR classification method, using the MID-range of the IR spectra and statistically representative samples from repetitive synthetic stages. The obtained results recommend this analytical method as a fast and cost effective alternative to the classic identification methods.
Peng, Zhihang; Bao, Changjun; Zhao, Yang; Yi, Honggang; Xia, Letian; Yu, Hao; Shen, Hongbing; Chen, Feng
2010-01-01
This paper first applies the sequential cluster method to set up the classification standard of infectious disease incidence state based on the fact that there are many uncertainty characteristics in the incidence course. Then the paper presents a weighted Markov chain, a method which is used to predict the future incidence state. This method assumes the standardized self-coefficients as weights based on the special characteristics of infectious disease incidence being a dependent stochastic variable. It also analyzes the characteristics of infectious diseases incidence via the Markov chain Monte Carlo method to make the long-term benefit of decision optimal. Our method is successfully validated using existing incidents data of infectious diseases in Jiangsu Province. In summation, this paper proposes ways to improve the accuracy of the weighted Markov chain, specifically in the field of infection epidemiology. PMID:23554632
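One common reading of the weighted Markov chain procedure described above is sketched below: incidence values are first discretized into states (e.g. by the sequential cluster thresholds), step-l transition matrices are derived from the estimated one-step matrix, each lag's prediction is weighted by the normalized absolute autocorrelation coefficient at that lag, and the weighted probabilities are combined. Details such as the number of lags and the toy state sequence are assumptions, not the authors' data.

# Sketch of a weighted Markov chain prediction of the next incidence state.
import numpy as np

def weighted_markov_predict(states, n_states, max_lag=3):
    """states: sequence of discrete incidence states (0..n_states-1).
    Returns a probability vector for the next state."""
    states = np.asarray(states)

    # one-step transition matrix estimated from observed transitions
    P = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        P[a, b] += 1
    P = P / np.maximum(P.sum(axis=1, keepdims=True), 1)

    # weights: normalized absolute autocorrelation at lags 1..max_lag
    x = states.astype(float)
    acorr = np.array([abs(np.corrcoef(x[:-l], x[l:])[0, 1])
                      for l in range(1, max_lag + 1)])
    w = acorr / acorr.sum()

    # combine l-step predictions from the states observed l steps back
    prob = np.zeros(n_states)
    for l in range(1, max_lag + 1):
        Pl = np.linalg.matrix_power(P, l)
        prob += w[l - 1] * Pl[states[-l]]
    return prob

print(weighted_markov_predict([0, 1, 1, 2, 1, 0, 1, 2, 2, 1], n_states=3))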
Vela, J; Vitorica, J; Ruano, D
2001-12-01
We describe a fast and easy method for the synthesis of competitor molecules based on non-specific conditions of PCR. RT-competitive PCR is a sensitive technique that allows quantification of very low quantities of mRNA molecules in small tissue samples. This technique is based on the competition established between the native and standard templates for nucleotides, primers or other factors during PCR. Thus, the most critical parameter is the use of good internal standards to generate a standard curve from which the amount of native sequences can be properly estimated. At the present time different types of internal standards and methods for their synthesis have been described. Normally, most of these methods are time-consuming and require the use of different sets of primers, different rounds of PCR or specific modifications, such as site-directed mutagenesis, that need subsequent analysis of the PCR products. Using our method, we obtained in a single round of PCR and with the same primer pair, competitor molecules that were successfully used in RT-competitive PCR experiments. The principal advantage of this method is high versatility and economy. Theoretically it is possible to synthesize a specific competitor molecule for each primer pair used. Finally, using this method we have been able to quantify the increase in the expression of the beta(2) GABA(A) receptor subunit mRNA that occurs during rat hippocampus development.
Gold-standard evaluation of a folksonomy-based ontology learning model
NASA Astrophysics Data System (ADS)
Djuana, E.
2018-03-01
Folksonomy, as one result of the collaborative tagging process, has been acknowledged for its potential in improving the categorization and searching of web resources. However, folksonomy contains ambiguities such as synonymy and polysemy, as well as differences in abstraction or generality. To maximize its potential, methods for associating the tags of a folksonomy with semantics and structural relationships have been proposed, such as ontology learning. This paper evaluates our previous work in ontology learning according to a gold-standard evaluation approach, in comparison to a notable state-of-the-art work and several baselines. The results show that our method is comparable to the state-of-the-art work, which further validates our approach, previously validated using a task-based evaluation approach.
48 CFR 11.107 - Solicitation provision.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Solicitation provision. 11... transaction-based reporting method to report its use of voluntary consensus standards to the National... Use of Voluntary Consensus Standards and in Conformity Assessment Activities”). Use of the provision...
Investigating the Utility of a GPA Institutional Adjustment Index
ERIC Educational Resources Information Center
Didier, Thomas; Kreiter, Clarence D.; Buri, Russell; Solow, Catherine
2006-01-01
Background: Grading standards vary widely across undergraduate institutions. If, during the medical school admissions process, GPA is considered without reference to the institution attended, it will disadvantage applicants from undergraduate institutions employing rigorous grading standards. Method: A regression-based GPA institutional equating…
NASA Astrophysics Data System (ADS)
Kowalewski, M. G.; Janz, S. J.
2015-02-01
Methods of absolute radiometric calibration of backscatter ultraviolet (BUV) satellite instruments are compared as part of an effort to minimize pre-launch calibration uncertainties. An internally illuminated integrating sphere source has been used for the Shuttle Solar BUV, Total Ozone Mapping Spectrometer, Ozone Mapping Instrument, and Global Ozone Monitoring Experiment 2 using standardized procedures traceable to national standards. These sphere-based spectral responsivities agree to within the derived combined standard uncertainty of 1.87% relative to calibrations performed using an external diffuser illuminated by standard irradiance sources, the customary spectral radiance responsivity calibration method for BUV instruments. The combined standard uncertainty for these calibration techniques as implemented at the NASA Goddard Space Flight Center's Radiometric Calibration and Development Laboratory is shown to be less than 2% at 250 nm when using a single traceable calibration standard.
Shehata, Atef S.; Mukherjee, Pranab K.; Ghannoum, Mahmoud A.
2008-01-01
In this study, we determined the utility of a 2,3-bis(2-methoxy-4-nitro-5-[(sulfenylamino)carbonyl]-2H-tetrazolium hydroxide (XTT)-based assay for determining antifungal susceptibilities of dermatophytes to terbinafine, ciclopirox, and voriconazole in comparison to the Clinical and Laboratory Standards Institute (CLSI) M38-A2 method. Forty-eight dermatophyte isolates, including Trichophyton rubrum (n = 15), Trichophyton mentagrophytes (n = 7), Trichophyton tonsurans (n = 11), and Epidermophyton floccosum (n = 13), and two quality control strains, were tested. In the XTT-based method, MICs were determined spectrophotometrically at 490 nm after addition of XTT and menadione. For the CLSI method, the MICs were determined visually. With T. rubrum, the XTT assay revealed MIC ranges of 0.004 to >64 μg/ml, 0.125 to 0.25 μg/ml, and 0.008 to 0.025 μg/ml for terbinafine, ciclopirox, and voriconazole, respectively. Similar MIC ranges were obtained against T. rubrum by using the CLSI method. Additionally, when tested with T. mentagrophytes, T. tonsurans, and E. floccosum isolates, the XTT and CLSI methods resulted in comparable MIC ranges. Both methods revealed similar lowest drug concentrations that inhibited 90% of the isolates for the majority of tested drug-dermatophyte combinations. The levels of agreement within 1 dilution between both methods were as follows: 100% with terbinafine, 97.8% with ciclopirox, and 89.1% with voriconazole. However, the agreement within 2 dilutions between these two methods was 100% for all tested drugs. Our results revealed that the XTT assay can be a useful tool for antifungal susceptibility testing of dermatophytes. PMID:18832129
Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H
2015-12-01
An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
Hajian, Reza; Mousavi, Esmat; Shams, Nafiseh
2013-06-01
The net analyte signal standard addition method has been used for the simultaneous determination of sulphadiazine and trimethoprim by spectrophotometry in bovine milk samples and veterinary medicines. The method combines the advantages of the standard addition method with the net analyte signal concept, which enables the extraction of information concerning a certain analyte from spectra of multi-component mixtures. This method has some advantages, such as the use of the full spectrum; therefore it does not require a separate calibration and prediction step, and only a few measurements are required for the determination. Cloud point extraction, based on the phenomenon of solubilisation, was used for the extraction of sulphadiazine and trimethoprim from bovine milk. It is based on the induction of micellar organised media using Triton X-100 as the extraction solvent. At the optimum conditions, the norm of the NAS vectors increased linearly with concentration in the range of 1.0-150.0 μmol L(-1) for both sulphadiazine and trimethoprim. The limits of detection (LOD) for sulphadiazine and trimethoprim were 0.86 and 0.92 μmol L(-1), respectively. Copyright © 2012 Elsevier Ltd. All rights reserved.
Protein quantification using a cleavable reporter peptide.
Duriez, Elodie; Trevisiol, Stephane; Domon, Bruno
2015-02-06
Peptide and protein quantification based on isotope dilution and mass spectrometry analysis are widely employed for the measurement of biomarkers and in system biology applications. The accuracy and reliability of such quantitative assays depend on the quality of the stable-isotope labeled standards. Although the quantification using stable-isotope labeled peptides is precise, the accuracy of the results can be severely biased by the purity of the internal standards, their stability and formulation, and the determination of their concentration. Here we describe a rapid and cost-efficient method to recalibrate stable isotope labeled peptides in a single LC-MS analysis. The method is based on the equimolar release of a protein reference peptide (used as surrogate for the protein of interest) and a universal reporter peptide during the trypsinization of a concatenated polypeptide standard. The quality and accuracy of data generated with such concatenated polypeptide standards are highlighted by the quantification of two clinically important proteins in urine samples and compared with results obtained with conventional stable isotope labeled reference peptides. Furthermore, the application of the UCRP standards in complex samples is described.
Fortini, Martina; Migliorini, Marzia; Cherubini, Chiara; Cecchi, Lorenzo; Calamai, Luca
2017-04-01
The commercial value of virgin olive oils (VOOs) strongly depends on their classification, which is also based on the aroma of the oils, usually evaluated by a panel test. A reliable analytical method is still needed to evaluate the volatile organic compounds (VOCs) and support the standard panel test method. To date, the use of HS-SPME sampling coupled to GC-MS is generally accepted for the analysis of VOCs in VOOs. However, VOO is a challenging matrix due to the simultaneous presence of: i) compounds at ppm and ppb concentrations; ii) molecules belonging to different chemical classes; and iii) analytes with a wide range of molecular mass. Therefore, HS-SPME-GC-MS quantitation based on the external standard method, or on a single internal standard (ISTD) for data normalization, may be troublesome. In this work a multiple internal standard normalization is proposed to overcome these problems and improve the quantitation of VOO VOCs. As many as 11 ISTDs were used for the quantitation of 71 VOCs. For each VOC the most suitable ISTD was selected, and good linearity was obtained over a wide calibration range. Except for E-2-hexenal, calibration without an ISTD or with an unsuitable ISTD gave a narrower linear range than that obtained with a suitable ISTD, confirming the usefulness of multiple internal standard normalization for the correct quantitation of the VOC profile in VOOs. The method was validated for 71 VOCs and then applied to a series of lampante virgin olive oils and extra virgin olive oils. In light of our results, we propose the application of this analytical approach for routine quantitative analyses and to support sensorial analysis in the evaluation of positive and negative VOO attributes. Copyright © 2017 Elsevier B.V. All rights reserved.
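Internal-standard quantitation of the kind used in the multiple-ISTD normalization reduces to a ratio calculation once each analyte is paired with a suitable ISTD and a relative response factor has been calibrated. The sketch below is generic; the pairing, compound, and numbers are invented and are not taken from the paper.

# Minimal sketch of internal-standard (ISTD) quantitation: each VOC is paired
# with its best ISTD, a relative response factor (RRF) is taken from
# calibration, and unknowns are quantified from the analyte/ISTD peak-area ratio.
def quantify(area_analyte, area_istd, conc_istd, rrf):
    """Analyte concentration from peak areas and the calibrated RRF, where
    RRF = (area_analyte / area_istd) / (conc_analyte / conc_istd)."""
    return (area_analyte / area_istd) * conc_istd / rrf

# hypothetical pairing: hexanal quantified against a deuterated ISTD
print(quantify(area_analyte=5.2e6, area_istd=3.1e6, conc_istd=0.50, rrf=0.85))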
Wu, Jia; Heike, Carrie; Birgfeld, Craig; Evans, Kelly; Maga, Murat; Morrison, Clinton; Saltzman, Babette; Shapiro, Linda; Tse, Raymond
2016-11-01
Quantitative measures of facial form to evaluate treatment outcomes for cleft lip (CL) are currently limited. Computer-based analysis of three-dimensional (3D) images provides an opportunity for efficient and objective analysis. The purpose of this study was to define a computer-based standard of identifying the 3D midfacial reference plane of the face in children with unrepaired cleft lip for measurement of facial symmetry. The 3D images of 50 subjects (35 with unilateral CL, 10 with bilateral CL, five controls) were included in this study. Five methods of defining a midfacial plane were applied to each image, including two human-based (Direct Placement, Manual Landmark) and three computer-based (Mirror, Deformation, Learning) methods. Six blinded raters (three cleft surgeons, two craniofacial pediatricians, and one craniofacial researcher) independently ranked and rated the accuracy of the defined planes. Among computer-based methods, the Deformation method performed significantly better than the others. Although human-based methods performed best, there was no significant difference compared with the Deformation method. The average correlation coefficient among raters was .4; however, it was .7 and .9 when the angular difference between planes was greater than 6° and 8°, respectively. Raters can agree on the 3D midfacial reference plane in children with unrepaired CL using digital surface mesh. The Deformation method performed best among computer-based methods evaluated and can be considered a useful tool to carry out automated measurements of facial symmetry in children with unrepaired cleft lip.
Eckner, Karl F.
1998-01-01
A total of 338 water samples, 261 drinking water samples and 77 bathing water samples, obtained for routine testing were analyzed in duplicate by Swedish standard methods using multiple-tube fermentation or membrane filtration and by the Colilert and/or Enterolert methods. Water samples came from a wide variety of sources in southern Sweden (Skåne). The Colilert method was found to be more sensitive than Swedish standard methods for detecting coliform bacteria and of equal sensitivity for detecting Escherichia coli when all drinking water samples were grouped together. Based on these results, Swedac, the Swedish laboratory accreditation body, approved for the first time in Sweden use of the Colilert method at this laboratory for the analysis of all water sources not falling under public water regulations (A-krav). The coliform detection study of bathing water yielded anomalous results due to confirmation difficulties. E. coli detection in bathing water was similar by both the Colilert and Swedish standard methods as was fecal streptococcus and enterococcus detection by both the Enterolert and Swedish standard methods. PMID:9687478
Automatic Recognition of Fetal Facial Standard Plane in Ultrasound Image via Fisher Vector.
Lei, Baiying; Tan, Ee-Leng; Chen, Siping; Zhuo, Liu; Li, Shengli; Ni, Dong; Wang, Tianfu
2015-01-01
Acquisition of the standard plane is the prerequisite of biometric measurement and diagnosis during the ultrasound (US) examination. In this paper, a new algorithm is developed for the automatic recognition of the fetal facial standard planes (FFSPs) such as the axial, coronal, and sagittal planes. Specifically, densely sampled root scale invariant feature transform (RootSIFT) features are extracted and then encoded by Fisher vector (FV). The Fisher network with multi-layer design is also developed to extract spatial information to boost the classification performance. Finally, automatic recognition of the FFSPs is implemented by support vector machine (SVM) classifier based on the stochastic dual coordinate ascent (SDCA) algorithm. Experimental results using our dataset demonstrate that the proposed method achieves an accuracy of 93.27% and a mean average precision (mAP) of 99.19% in recognizing different FFSPs. Furthermore, the comparative analyses reveal the superiority of the proposed method based on FV over the traditional methods.
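As a rough illustration of the Fisher-vector pipeline summarized above, the sketch below fits a diagonal-covariance GMM to local descriptors, encodes each image as a simplified Fisher vector (gradients with respect to the GMM means only), and trains a linear SVM. It is not the authors' implementation: the descriptors are random stand-ins for RootSIFT features, the multi-layer Fisher network is omitted, and scikit-learn's LinearSVC is used in place of the SDCA solver mentioned in the abstract.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import LinearSVC

def fisher_vector(descriptors, gmm):
    """Simplified Fisher vector (gradients w.r.t. GMM means only)."""
    T = descriptors.shape[0]
    gamma = gmm.predict_proba(descriptors)               # (T, K) posteriors
    mu, sigma, w = gmm.means_, np.sqrt(gmm.covariances_), gmm.weights_
    fv = []
    for k in range(gmm.n_components):
        diff = (descriptors - mu[k]) / sigma[k]           # (T, D) whitened residuals
        g_mu = (gamma[:, k:k + 1] * diff).sum(axis=0)
        fv.append(g_mu / (T * np.sqrt(w[k])))
    fv = np.concatenate(fv)
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)              # L2 normalization

# Hypothetical data: each "image" is a set of local descriptors (RootSIFT stand-ins)
rng = np.random.default_rng(0)
train_desc = [rng.normal(size=(200, 64)) + y for y in (0, 1, 2) for _ in range(10)]
labels = [y for y in (0, 1, 2) for _ in range(10)]

gmm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0)
gmm.fit(np.vstack(train_desc))
X = np.array([fisher_vector(d, gmm) for d in train_desc])
clf = LinearSVC().fit(X, labels)
print(clf.score(X, labels))
```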
The Use of a Corpus in Contrastive Studies.
ERIC Educational Resources Information Center
Filipovic, Rudolf
1973-01-01
Before beginning the Serbocroatian-English Contrastive Project, it was necessary to determine whether to base the analysis on a corpus or on native intuitions. It seemed that the best method would combine the theoretical and the empirical. A translation method based on a corpus of text was adopted. The Brown University "Standard Sample of…
Development of a Pancake-Making Method for a Batter-Based Product
USDA-ARS?s Scientific Manuscript database
Cake and pancake are major batter-based products made with soft wheat flour. A standardized baking method for high-ratio cake has been widely used for evaluating the cake-baking performance of soft wheat flour. Chlorinated flour is used to make high-ratio cake, and the cake formula contains relative...
Developing a Competency-Based Pan-European Accreditation Framework for Health Promotion
ERIC Educational Resources Information Center
Battel-Kirk, Barbara; Van der Zanden, Gerard; Schipperen, Marielle; Contu, Paolo; Gallardo, Carmen; Martinez, Ana; Garcia de Sola, Silvia; Sotgiu, Alessandra; Zaagsma, Miriam; Barry, Margaret M.
2012-01-01
Background: The CompHP Pan-European Accreditation Framework for Health Promotion was developed as part of the CompHP Project that aimed to develop competency-based standards and an accreditation system for health promotion practice, education, and training in Europe. Method: A phased, multiple-method approach was employed to facilitate consensus…
An ecological method to understand agricultural standardization in peach orchard ecosystems
Wan, Nian-Feng; Zhang, Ming-Yi; Jiang, Jie-Xian; Ji, Xiang-Yun; Hao-Zhang
2016-01-01
While the worldwide standardization of agricultural production has been advocated and recommended, relatively little research has focused on the ecological significance of such a shift. The ecological concerns stemming from the standardization of agricultural production may require new methodology. In this study, we concentrated on how ecological two-sidedness and ecological processes affect the standardization of agricultural production which was divided into three phrases (pre-, mid- and post-production), considering both the positive and negative effects of agricultural processes. We constructed evaluation indicator systems for the pre-, mid- and post-production phases and here we presented a Standardization of Green Production Index (SGPI) based on the Full Permutation Polygon Synthetic Indicator (FPPSI) method which we used to assess the superiority of three methods of standardized production for peaches. The values of SGPI for pre-, mid- and post-production were 0.121 (Level IV, “Excellent” standard), 0.379 (Level III, “Good” standard), and 0.769 × 10−2 (Level IV, “Excellent” standard), respectively. Here we aimed to explore the integrated application of ecological two-sidedness and ecological process in agricultural production. Our results are of use to decision-makers and ecologists focusing on eco-agriculture and those farmers who hope to implement standardized agricultural production practices. PMID:26899360
An ecological method to understand agricultural standardization in peach orchard ecosystems.
Wan, Nian-Feng; Zhang, Ming-Yi; Jiang, Jie-Xian; Ji, Xiang-Yun; Hao-Zhang
2016-02-22
While the worldwide standardization of agricultural production has been advocated and recommended, relatively little research has focused on the ecological significance of such a shift. The ecological concerns stemming from the standardization of agricultural production may require new methodology. In this study, we concentrated on how ecological two-sidedness and ecological processes affect the standardization of agricultural production which was divided into three phrases (pre-, mid- and post-production), considering both the positive and negative effects of agricultural processes. We constructed evaluation indicator systems for the pre-, mid- and post-production phases and here we presented a Standardization of Green Production Index (SGPI) based on the Full Permutation Polygon Synthetic Indicator (FPPSI) method which we used to assess the superiority of three methods of standardized production for peaches. The values of SGPI for pre-, mid- and post-production were 0.121 (Level IV, "Excellent" standard), 0.379 (Level III, "Good" standard), and 0.769 × 10(-2) (Level IV, "Excellent" standard), respectively. Here we aimed to explore the integrated application of ecological two-sidedness and ecological process in agricultural production. Our results are of use to decision-makers and ecologists focusing on eco-agriculture and those farmers who hope to implement standardized agricultural production practices.
Application of micromechanics to the characterization of mortar by ultrasound.
Hernández, M G; Anaya, J J; Izquierdo, M A G; Ullate, L G
2002-05-01
Mechanical properties of concrete and mortar structures can be estimated by ultrasonic non-destructive testing. When the ultrasonic velocity is known, there are standardized methods based on treating the concrete as a homogeneous material. Cement composites, however, are heterogeneous and porous, and porosity has a negative effect on the mechanical properties of structures. This work studies the impact of porosity on mechanical properties by considering concrete a multiphase material. A micromechanical model is applied in which the material is considered to consist of two phases: a solid matrix and pores. From this model, a set of expressions is obtained that relates the acoustic velocity and Young's modulus of mortar. Experimental work is based on non-destructive and destructive procedures on mortar samples with varied porosity. A comparison is drawn between the micromechanical and standard methods, showing positive results for the method proposed here.
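The abstract does not reproduce the expressions linking velocity and stiffness, but the standardized homogeneous-material methods it refers to typically rest on the classical relation between longitudinal ultrasonic velocity and the elastic constants of an isotropic solid, shown here for orientation rather than as the paper's micromechanical model:

```latex
% v_p: longitudinal wave velocity, E: Young's modulus,
% \rho: density, \nu: Poisson's ratio (homogeneous, isotropic solid)
v_p = \sqrt{\frac{E\,(1-\nu)}{\rho\,(1+\nu)\,(1-2\nu)}}
\qquad\Longleftrightarrow\qquad
E = \rho\,v_p^{2}\,\frac{(1+\nu)\,(1-2\nu)}{1-\nu}
```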
A new evaluation tool to obtain practice-based evidence of worksite health promotion programs.
Dunet, Diane O; Sparling, Phillip B; Hersey, James; Williams-Piehota, Pamela; Hill, Mary D; Hanssen, Carl; Lawrenz, Frances; Reyes, Michele
2008-10-01
The Centers for Disease Control and Prevention developed the Swift Worksite Assessment and Translation (SWAT) evaluation method to identify promising practices in worksite health promotion programs. The new method complements research studies and evaluation studies of evidence-based practices that promote healthy weight in working adults. We used nationally recognized program evaluation standards of utility, feasibility, accuracy, and propriety as the foundation for our 5-step method: 1) site identification and selection, 2) site visit, 3) post-visit evaluation of promising practices, 4) evaluation capacity building, and 5) translation and dissemination. An independent, outside evaluation team conducted process and summative evaluations of SWAT to determine its efficacy in providing accurate, useful information and its compliance with evaluation standards. The SWAT evaluation approach is feasible in small and medium-sized workplace settings. The independent evaluation team judged SWAT favorably as an evaluation method, noting among its strengths its systematic and detailed procedures and service orientation. Experts in worksite health promotion evaluation concluded that the data obtained by using this evaluation method were sufficient to allow them to make judgments about promising practices. SWAT is a useful, business-friendly approach to systematic, yet rapid, evaluation that comports with program evaluation standards. The method provides a new tool to obtain practice-based evidence of worksite health promotion programs that help prevent obesity and, more broadly, may advance public health goals for chronic disease prevention and health promotion.
Method of gear fault diagnosis based on EEMD and improved Elman neural network
NASA Astrophysics Data System (ADS)
Zhang, Qi; Zhao, Wei; Xiao, Shungen; Song, Mengmeng
2017-05-01
Gear faults such as cracks and wear are usually difficult to diagnose because the fault information is weak; therefore, a gear fault diagnosis method based on the fusion of EEMD and an improved Elman neural network is proposed. A number of intrinsic mode function (IMF) components are obtained by decomposing the denoised fault signals with EEMD, and pseudo IMF components are eliminated using the correlation coefficient method to retain the effective IMF components. The energy of each effective component is calculated and used as the input feature of the Elman neural network; the improved Elman network extends the standard network by adding a feedback factor. Fault data for normal, broken-tooth, cracked, and worn gears were collected in the field and analyzed with the diagnostic method proposed in this paper. The results show that, compared with the standard Elman neural network, the improved Elman neural network achieves higher diagnostic efficiency.
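The front end of the diagnosis scheme above (EEMD decomposition, correlation-based rejection of pseudo-IMFs, and IMF energy features) can be sketched as follows. This is a simplified illustration under stated assumptions, not the authors' code: it relies on the third-party PyEMD package ("EMD-signal") for EEMD, uses a synthetic vibration signal, and leaves out the improved Elman neural network classifier.

```python
import numpy as np
# Assumes the third-party PyEMD package ("EMD-signal") is installed.
from PyEMD import EEMD

def eemd_energy_features(signal, corr_threshold=0.1):
    """Decompose a vibration signal with EEMD, drop pseudo-IMFs by correlation
    with the original signal, and return normalized energies of the kept IMFs."""
    imfs = EEMD().eemd(signal)                               # (n_imfs, n_samples)
    kept = [imf for imf in imfs
            if abs(np.corrcoef(imf, signal)[0, 1]) >= corr_threshold]
    energies = np.array([np.sum(imf ** 2) for imf in kept])
    return energies / energies.sum()                         # feature vector

# Hypothetical gear vibration signal: a tone plus noise
t = np.linspace(0, 1, 2048)
sig = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(t.size)
print(eemd_energy_features(sig))
```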
Wavelet images and Chou's pseudo amino acid composition for protein classification.
Nanni, Loris; Brahnam, Sheryl; Lumini, Alessandra
2012-08-01
The last decade has seen an explosion in the collection of protein data. To actualize the potential offered by this wealth of data, it is important to develop machine systems capable of classifying and extracting features from proteins. Reliable machine systems for protein classification offer many benefits, including the promise of finding novel drugs and vaccines. In developing our system, we analyze and compare several feature extraction methods used in protein classification that are based on the calculation of texture descriptors starting from a wavelet representation of the protein. We then feed these texture-based representations of the protein into an AdaBoost ensemble of neural networks or a support vector machine classifier. In addition, we perform experiments that combine our feature extraction methods with a standard method based on Chou's pseudo amino acid composition. Using several datasets, we show that our best approach outperforms standard methods. The Matlab code of the proposed protein descriptors is available at http://bias.csr.unibo.it/nanni/wave.rar.
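A minimal version of the wavelet-texture idea above is sketched below, assuming the PyWavelets package and treating subband energies as the texture descriptor; the paper uses richer descriptors and an AdaBoost/SVM stage that is not shown here.

```python
import numpy as np
import pywt   # assumes the PyWavelets package is installed

def wavelet_texture_features(image, wavelet="db1", level=2):
    """Energy of each wavelet subband as a simple texture descriptor."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    feats = [np.mean(coeffs[0] ** 2)]                      # approximation energy
    for (cH, cV, cD) in coeffs[1:]:                        # detail subbands
        feats.extend([np.mean(cH ** 2), np.mean(cV ** 2), np.mean(cD ** 2)])
    return np.array(feats)

# Hypothetical "protein image" (e.g., a 2D representation derived from the sequence)
img = np.random.rand(64, 64)
print(wavelet_texture_features(img))
```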
USEPA MANUAL OF METHODS FOR VIROLOGY
This chapter describes procedures for the detection of coliphages in water matrices. These procedures are based on those presented in the Supplement to the 20th Edition of Standard Methods for the Examination of Water and Wastewater and EPA Methods 1601 and 1602. Two quantitati...
Preparation method and quality control of multigamma volume sources with different matrices.
Listkowska, A; Lech, E; Saganowski, P; Tymiński, Z; Dziel, T; Cacko, D; Ziemek, T; Kołakowska, E; Broda, R
2018-04-01
The aim of the work was to develop new radioactive standard sources based on epoxy resins. The optimal proportions of the components and the homogeneity of the matrices were determined. The activity of multigamma sources prepared in Marinelli beakers was determined with reference to the National Standard of Radionuclides Activity in Poland. The differences between the radionuclide activity values determined using a calibrated gamma spectrometer and the activities of the standard solutions used are in most cases significantly lower than the measurement uncertainty limits. A source production method and a quality control procedure have been developed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Walking Distance Estimation Using Walking Canes with Inertial Sensors
Suh, Young Soo
2018-01-01
A walking distance estimation algorithm for cane users is proposed using an inertial sensor unit attached to various positions on the cane. A standard inertial navigation algorithm using an indirect Kalman filter was applied to update the velocity and position of the cane during movement. For quadripod canes, a standard zero-velocity measurement-updating method is proposed. For standard canes, a velocity-updating method based on an inverted pendulum model is proposed. The proposed algorithms were verified by three walking experiments with two different types of canes and different positions of the sensor module. PMID:29342971
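A highly simplified, one-dimensional sketch of the zero-velocity update idea used for quadripod canes is given below: acceleration is integrated to velocity and reset to zero whenever the sensor appears stationary. It is only an illustration with a synthetic signal and a naive stillness detector, not the indirect Kalman filter or the inverted-pendulum update described in the abstract.

```python
import numpy as np

def zupt_velocity(acc, dt, still_threshold=0.05):
    """Integrate 1-D acceleration to velocity, resetting to zero whenever the
    sensor is detected to be (nearly) stationary -- a crude stand-in for the
    zero-velocity measurement update of an error-state Kalman filter."""
    v = np.zeros_like(acc)
    for k in range(1, len(acc)):
        v[k] = v[k - 1] + acc[k] * dt
        window = acc[max(0, k - 10):k + 1]
        if np.std(window) < still_threshold:    # cane-on-ground (stance) phase
            v[k] = 0.0                           # zero-velocity update
    return v

# Hypothetical accelerometer trace (m/s^2) sampled at 100 Hz
dt = 0.01
acc = np.concatenate([np.zeros(50), 0.5 * np.ones(30), -0.5 * np.ones(30), np.zeros(50)])
vel = zupt_velocity(acc, dt)
print(np.trapz(vel, dx=dt))   # rough distance travelled over the trace
```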
Modified Standard Penetration Test–based Drilled Shaft Design Method for Weak Rocks (Phase 2 Study)
DOT National Transportation Integrated Search
2017-12-15
In this project, Illinois-specific design procedures were developed for drilled shafts founded in weak shale or rock. In particular, a modified standard penetration test was developed and verified to characterize the in situ condition of weak shales ...
Flegar-Meštrić, Zlata; Perkov, Sonja; Radeljak, Andrea
2016-03-26
Considering the fact that the results of laboratory tests provide useful information about the state of health of patients, determination of reference value is considered an intrinsic part in the development of laboratory medicine. There are still huge differences in the analytical methods used as well as in the associated reference intervals which could consequently significantly affect the proper assessment of patient health. In a constant effort to increase the quality of patients' care, there are numerous international initiatives for standardization and/or harmonization of laboratory diagnostics in order to achieve maximum comparability of laboratory test results and improve patient safety. Through the standardization and harmonization processes of analytical methods the ability to create unique reference intervals is achieved. Such reference intervals could be applied globally in all laboratories using methods traceable to the same reference measuring system and analysing the biological samples from the populations with similar socio-demographic and ethnic characteristics. In this review we outlined the results of the harmonization processes in Croatia in the field of population based reference intervals for clinically relevant blood and serum constituents which are in accordance with ongoing activity for worldwide standardization and harmonization based on traceability in laboratory medicine.
Flegar-Meštrić, Zlata; Perkov, Sonja; Radeljak, Andrea
2016-01-01
Considering the fact that the results of laboratory tests provide useful information about the state of health of patients, determination of reference value is considered an intrinsic part in the development of laboratory medicine. There are still huge differences in the analytical methods used as well as in the associated reference intervals which could consequently significantly affect the proper assessment of patient health. In a constant effort to increase the quality of patients’ care, there are numerous international initiatives for standardization and/or harmonization of laboratory diagnostics in order to achieve maximum comparability of laboratory test results and improve patient safety. Through the standardization and harmonization processes of analytical methods the ability to create unique reference intervals is achieved. Such reference intervals could be applied globally in all laboratories using methods traceable to the same reference measuring system and analysing the biological samples from the populations with similar socio-demographic and ethnic characteristics. In this review we outlined the results of the harmonization processes in Croatia in the field of population based reference intervals for clinically relevant blood and serum constituents which are in accordance with ongoing activity for worldwide standardization and harmonization based on traceability in laboratory medicine. PMID:27019800
Holschneider, Alexander; Hutson, John; Peña, Albert; Beket, Elhamy; Chatterjee, Subir; Coran, Arnold; Davies, Michael; Georgeson, Keith; Grosfeld, Jay; Gupta, Devendra; Iwai, Naomi; Kluth, Dieter; Martucciello, Giuseppe; Moore, Samuel; Rintala, Risto; Smith, E Durham; Sripathi, D V; Stephens, Douglas; Sen, Sudipta; Ure, Benno; Grasshoff, Sabine; Boemers, Thomas; Murphy, Feilin; Söylet, Yunus; Dübbers, Martin; Kunst, Marc
2005-10-01
Anorectal malformations (ARM) are common congenital anomalies seen throughout the world. Comparison of outcome data has been hindered because of confusion related to classification and assessment systems. The goals of the Krickenbeck Conference on ARM were to develop standards for an International Classification of ARM based on a modification of fistula type, adding rare and regional variants, and to design a system for comparable follow-up studies. Lesions were classified into major clinical groups based on the fistula location (perineal, recto-urethral, recto-vesical, vestibular), cloacal lesions, those with no fistula, and anal stenosis. Rare and regional variants included pouch colon, rectal atresia or stenosis, rectovaginal fistula, H-fistula and others. Groups would be analyzed according to the type of procedure performed, stratified for confounding associated conditions such as sacral anomalies and tethered cord. A standard method for postoperative assessment of continence was determined. A new international diagnostic classification system, operative groupings and a method of postoperative assessment of continence were developed by consensus of a large contingent of participants experienced in the management of patients with ARM. These methods should allow for a common standardization of diagnosis and comparison of postoperative results.
Aurumskjöld, Marie-Louise; Söderberg, Marcus; Stålhammar, Fredrik; von Steyern, Kristina Vult; Tingberg, Anders; Ydström, Kristina
2018-06-01
Background In pediatric patients, computed tomography (CT) is important in the medical chain of diagnosing and monitoring various diseases. Because children are more radiosensitive than adults, they require minimal radiation exposure. One way to achieve this goal is to implement new technical solutions, like iterative reconstruction. Purpose To evaluate the potential of a new, iterative, model-based method for reconstructing (IMR) pediatric abdominal CT at a low radiation dose and determine whether it maintains or improves image quality, compared to the current reconstruction method. Material and Methods Forty pediatric patients underwent abdominal CT. Twenty patients were examined with the standard dose settings and 20 patients were examined with a 32% lower radiation dose. Images from the standard examination were reconstructed with a hybrid iterative reconstruction method (iDose 4 ), and images from the low-dose examinations were reconstructed with both iDose 4 and IMR. Image quality was evaluated subjectively by three observers, according to modified EU image quality criteria, and evaluated objectively based on the noise observed in liver images. Results Visual grading characteristics analyses showed no difference in image quality between the standard dose examination reconstructed with iDose 4 and the low dose examination reconstructed with IMR. IMR showed lower image noise in the liver compared to iDose 4 images. Inter- and intra-observer variance was low: the intraclass coefficient was 0.66 (95% confidence interval = 0.60-0.71) for the three observers. Conclusion IMR provided image quality equivalent or superior to the standard iDose 4 method for evaluating pediatric abdominal CT, even with a 32% dose reduction.
NASA Astrophysics Data System (ADS)
Arkeman, Y.; Rizkyanti, R. A.; Hambali, E.
2017-05-01
Development of Indonesian palm-oil-based bioenergy faces an international challenge regarding sustainability, indicated by the establishment of standards on sustainable bioenergy. Currently, Indonesia has sustainability standards limited to palm-oil cultivation, while other standards lack appropriateness for Indonesian palm-oil-based bioenergy sustainability with regard to real conditions in Indonesia. Thus, Indonesia requires sustainability indicators for Indonesian palm-oil-based bioenergy to gain recognition and ease in marketing it. Determination of sustainability indicators was accomplished in three stages: preliminary analysis, indicator assessment (using a fuzzy inference system), and system validation. The Global Bioenergy Partnership (GBEP) standard was used for the assessment because it is general in use, internationally accepted, and contains a balanced proportion of environmental, economic, and social aspects. Results showed that the FIS method yielded 21 sustainability indicators. The system developed has an accuracy of 85%.
Method and Apparatus for Processing UDP Data Packets
NASA Technical Reports Server (NTRS)
Murphy, Brandon M. (Inventor)
2017-01-01
A method and apparatus for processing a plurality of data packets. A data packet is received. A determination is made as to whether a portion of the data packet follows a selected digital recorder standard protocol based on a header of the data packet. Raw data in the data packet is converted into human-readable information in response to a determination that the portion of the data packet follows the selected digital recorder standard protocol.
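The packet-processing logic described above (inspect the header, and convert the raw payload to human-readable form only if it matches the recorder protocol) might look roughly like the sketch below. The header layout, magic value and field sizes are entirely hypothetical, invented for illustration; the actual digital recorder standard protocol is not specified here.

```python
import struct

MAGIC = 0x52454344  # hypothetical 4-byte marker; not a real recorder-standard value

def process_packet(datagram):
    """Return a human-readable rendering of the payload if the (hypothetical)
    header matches the recorder protocol, otherwise None."""
    if len(datagram) < 8:
        return None
    magic, version, length = struct.unpack("!IHH", datagram[:8])
    if magic != MAGIC:                      # header does not follow the protocol
        return None
    payload = datagram[8:8 + length]
    return "v%d: %s" % (version, payload.hex(" "))   # raw bytes -> readable hex

print(process_packet(struct.pack("!IHH", MAGIC, 1, 3) + b"\x01\x02\x03"))
```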
Cordeiro, Fernando; Robouch, Piotr; de la Calle, Maria Beatriz; Emteborg, Håkan; Charoud-Got, Jean; Schmitz, Franz
2011-01-01
A collaborative study, International Evaluation Measurement Programme-25a, was conducted in accordance with international protocols to determine the performance characteristics of an analytical method for the determination of dissolved bromate in drinking water. The method should fulfill the analytical requirements of Council Directive 98/83/EC (referred to in this work as the Drinking Water Directive; DWD). The new draft standard method under investigation is based on ion chromatography followed by post-column reaction and UV detection. The collaborating laboratories used the Draft International Organization for Standardization (ISO)/Draft International Standard (DIS) 11206 document. The existing standard method (ISO 15061:2001) is based on ion chromatography using suppressed conductivity detection, in which a preconcentration step may be required for the determination of bromate concentrations as low as 3 to 5 microg/L. The new method includes a dilution step that reduces the matrix effects, thus allowing the determination of bromate concentrations down to 0.5 microg/L. Furthermore, the method aims to minimize any potential interference of chlorite ions. The collaborative study investigated different types of drinking water, such as soft, hard, and mineral water. Other types of water, such as raw water (untreated), swimming pool water, a blank (named river water), and a bromate standard solution, were included as test samples. All test matrixes except the swimming pool water were spiked with high-purity potassium bromate to obtain bromate concentrations ranging from 1.67 to 10.0 microg/L. Swimming pool water was not spiked, as this water was incurred with bromate. Test samples were dispatched to 17 laboratories from nine different countries. Sixteen participants reported results. The repeatability RSD (RSDr) ranged from 1.2 to 4.1%, while the reproducibility RSD (RSDR) ranged from 2.3 to 5.9%. These precision characteristics compare favorably with those of ISO 15061. A thorough comparison of the performance characteristics is presented in this report. All method performance characteristics obtained in the frame of this collaborative study indicate that the draft ISO/DIS 11206 standard method meets the requirements set down by the DWD. It can, therefore, be considered to fit its intended analytical purpose.
Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark
2016-01-01
Background The process of documentation in electronic health records (EHRs) is known to be time consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)–enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user's experience. Objective The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. Methods This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods ("protocols") of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. Results A total of 118 notes were documented across the 3 subject areas. The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. Conclusions In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. PMID:27793791
High sensitivity optical measurement of skin gloss
Ezerskaia, Anna; Ras, Arno; Bloemen, Pascal; Pereira, Silvania F.; Urbach, H. Paul; Varghese, Babu
2017-01-01
We demonstrate a low-cost optical method for measuring gloss properties with improved sensitivity in the low-gloss regime relevant for skin. The gloss estimation method is based, on the one hand, on the slope of the intensity gradient in the transition regime between specular and diffuse reflection and, on the other, on the sum of the intensities of pixels above a threshold, both derived from a camera image obtained using unpolarized white light illumination. We demonstrate the improved sensitivity of the two proposed methods using Monte Carlo simulations and experiments performed on ISO gloss calibration standards with an optical prototype. The performance and linearity of the method were compared with different professional gloss measurement devices based on the ratio of specular to diffuse intensity. We demonstrate the feasibility of in-vivo skin gloss measurements by quantifying the temporal evolution of skin gloss after application of standard paraffin cream bases on skin. The presented method opens new possibilities in the fields of cosmetology and dermatopharmacology for measuring skin gloss and the resorption kinetics and pharmacodynamics of various external agents. PMID:29026683
High sensitivity optical measurement of skin gloss.
Ezerskaia, Anna; Ras, Arno; Bloemen, Pascal; Pereira, Silvania F; Urbach, H Paul; Varghese, Babu
2017-09-01
We demonstrate a low-cost optical method for measuring gloss properties with improved sensitivity in the low-gloss regime relevant for skin. The gloss estimation method is based, on the one hand, on the slope of the intensity gradient in the transition regime between specular and diffuse reflection and, on the other, on the sum of the intensities of pixels above a threshold, both derived from a camera image obtained using unpolarized white light illumination. We demonstrate the improved sensitivity of the two proposed methods using Monte Carlo simulations and experiments performed on ISO gloss calibration standards with an optical prototype. The performance and linearity of the method were compared with different professional gloss measurement devices based on the ratio of specular to diffuse intensity. We demonstrate the feasibility of in-vivo skin gloss measurements by quantifying the temporal evolution of skin gloss after application of standard paraffin cream bases on skin. The presented method opens new possibilities in the fields of cosmetology and dermatopharmacology for measuring skin gloss and the resorption kinetics and pharmacodynamics of various external agents.
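As a rough numerical illustration of the two gloss indicators mentioned above, the sketch below computes (i) the steepest slope of the sorted intensity profile, standing in for the gradient of the specular-to-diffuse transition, and (ii) the summed intensity of pixels above a specular threshold. The image, threshold and exact definitions are illustrative assumptions, not the calibrated metrics of the published method.

```python
import numpy as np

def gloss_metrics(image, threshold=0.8):
    """Two simple gloss indicators from a grayscale camera image scaled to [0, 1]."""
    profile = np.sort(image.ravel())               # monotone intensity profile
    slope = np.max(np.diff(profile))               # steepest diffuse-to-specular jump
    specular_sum = image[image > threshold].sum()  # energy of specular highlights
    return slope, specular_sum

# Hypothetical image: diffuse background with a small specular highlight
img = 0.3 * np.ones((100, 100))
img[45:55, 45:55] = 0.95
print(gloss_metrics(img))
```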
Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel
2014-01-01
Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207
Turner, Cameron R; Miller, Derryl J; Coyne, Kathryn J; Corush, Joel
2014-01-01
Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species.
Stability analysis of spacecraft power systems
NASA Technical Reports Server (NTRS)
Halpin, S. M.; Grigsby, L. L.; Sheble, G. B.; Nelms, R. M.
1990-01-01
The problems in applying standard electric utility models, analyses, and algorithms to the study of the stability of spacecraft power conditioning and distribution systems are discussed. Both single-phase and three-phase systems are considered. Of particular concern are the load and generator models that are used in terrestrial power system studies, as well as the standard assumptions of load and topological balance that lead to the use of the positive sequence network. The standard assumptions regarding relative speeds of subsystem dynamic responses that are made in the classical transient stability algorithm, which forms the backbone of utility-based studies, are examined. The applicability of these assumptions to a spacecraft power system stability study is discussed in detail. In addition to the classical indirect method, the applicability of Liapunov's direct methods to the stability determination of spacecraft power systems is discussed. It is pointed out that while the proposed method uses a solution process similar to the classical algorithm, the models used for the sources, loads, and networks are, in general, more accurate. Some preliminary results are given for a linear-graph, state-variable-based modeling approach to the study of the stability of space-based power distribution networks.
Grate, Jay W; Gonzalez, Jhanis J; O'Hara, Matthew J; Kellogg, Cynthia M; Morrison, Samuel S; Koppenaal, David W; Chan, George C-Y; Mao, Xianglei; Zorba, Vassilia; Russo, Richard E
2017-09-08
Solid sampling and analysis methods, such as laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), are challenged by matrix effects and calibration difficulties. Matrix-matched standards for external calibration are seldom available and it is difficult to distribute spikes evenly into a solid matrix as internal standards. While isotopic ratios of the same element can be measured to high precision, matrix-dependent effects in the sampling and analysis process frustrate accurate quantification and elemental ratio determinations. Here we introduce a potentially general solid matrix transformation approach entailing chemical reactions in molten ammonium bifluoride (ABF) salt that enables the introduction of spikes as tracers or internal standards. Proof of principle experiments show that the decomposition of uranium ore in sealed PFA fluoropolymer vials at 230 °C yields, after cooling, new solids suitable for direct solid sampling by LA. When spikes are included in the molten salt reaction, subsequent LA-ICP-MS sampling at several spots indicate that the spikes are evenly distributed, and that U-235 tracer dramatically improves reproducibility in U-238 analysis. Precisions improved from 17% relative standard deviation for U-238 signals to 0.1% for the ratio of sample U-238 to spiked U-235, a factor of over two orders of magnitude. These results introduce the concept of solid matrix transformation (SMT) using ABF, and provide proof of principle for a new method of incorporating internal standards into a solid for LA-ICP-MS. This new approach, SMT-LA-ICP-MS, provides opportunities to improve calibration and quantification in solids based analysis. Looking forward, tracer addition to transformed solids opens up LA-based methods to analytical methodologies such as standard addition, isotope dilution, preparation of matrix-matched solid standards, external calibration, and monitoring instrument drift against external calibration standards.
Automated feature detection and identification in digital point-ordered signals
Oppenlander, Jane E.; Loomis, Kent C.; Brudnoy, David M.; Levy, Arthur J.
1998-01-01
A computer-based automated method to detect and identify features in digital point-ordered signals. The method is used for processing of non-destructive test signals, such as eddy current signals obtained from calibration standards. The signals are first automatically processed to remove noise and to determine a baseline. Next, features are detected in the signals using mathematical morphology filters. Finally, verification of the features is made using an expert system of pattern recognition methods and geometric criteria. The method has the advantage that standard features can be located without prior knowledge of the number or sequence of the features. Further advantages are that standard features can be differentiated from irrelevant signal features such as noise, and detected features are automatically verified by parameters extracted from the signals. The method proceeds fully automatically without initial operator set-up and without subjective operator feature judgement.
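A toy version of the morphology-based detection step described above can be written with a grey-scale opening acting as a baseline estimator (a "top-hat" filter); samples rising well above the opened signal are flagged as candidate features. The signal, structuring-element size and threshold are illustrative assumptions, and the expert-system verification stage is not shown.

```python
import numpy as np
from scipy.ndimage import grey_opening

def detect_peaks_morphology(signal, size=15, min_height=0.5):
    """Flag samples that rise above a grey-opening baseline (top-hat filtering)."""
    baseline = grey_opening(signal, size=size)      # suppresses narrow peaks
    tophat = signal - baseline                      # isolates peak-like features
    return np.where(tophat > min_height)[0]

# Hypothetical point-ordered calibration signal with three standard features
x = np.linspace(0, 10, 500)
sig = 0.1 * x + sum(np.exp(-((x - c) ** 2) / 0.01) for c in (2.0, 5.0, 8.0))
print(detect_peaks_morphology(sig))
```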
Validation of a Rapid Bacteria Endospore Enumeration System for Planetary Protection Application
NASA Astrophysics Data System (ADS)
Chen, Fei; Kern, Roger; Kazarians, Gayane; Venkateswaran, Kasthuri
NASA monitors spacecraft surfaces to assure that the presence of bacterial endospores meets strict criteria at launch, to minimize the risk of inadvertent contamination of the surface of Mars. Currently, the only approved method for enumerating the spores is a culture-based assay that requires three days to produce results. In order to meet the demanding schedules of spacecraft assembly, a more rapid spore detection assay is being considered as an alternate method to the NASA standard culture-based assay. The Millipore Rapid Microbiology Detection System (RMDS) has been used successfully for rapid bioburden enumeration in the pharmaceutical and food industries. The RMDS is rapid and simple, shows high sensitivity (to 1 colony forming unit [CFU]/sample), and correlates well with traditional culture-based methods. It combines membrane filtration, adenosine triphosphate (ATP) bioluminescence chemistry, and image analysis based on photon detection with a Charge Coupled Device (CCD) camera. In this study, we have optimized the assay conditions and evaluated the use of the RMDS as a rapid spore detection tool for NASA applications. In order to select for spores, the samples were subjected to a heat shock step before proceeding with the RMDS incubation protocol. Seven species of Bacillus (nine strains) that have been repeatedly isolated from clean room environments were assayed. All strains were detected by the RMDS in 5 hours and these assay times were repeatedly demonstrated along with low image background noise. Validation experiments to compare the Rapid Spore Assay (RSA) and the NASA standard assay (NSA) were also performed. The evaluation criteria were modeled after the FDA Guideline of Process Validation and Analytical Test Methods. This body of research demonstrates that the Rapid Spore Assay (RSA) is quick, and of equivalent sensitivity to the NASA standard assay, potentially reducing the assay time for bacterial endospores from over 72 hours to less than 8 hours. Accordingly, JPL has produced a report recommending that NASA adopt the RSA method as a suitable alternative to the NASA standard assay.
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
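For orientation, the sketch below implements some of the simple summary-statistic approximations referred to above, of the kind commonly cited in this literature (range-based and IQR-based standard deviation estimates, and a quartile-based mean estimate). They are rough rules of thumb under normality-type assumptions, not the specific formulas evaluated in the review.

```python
def sd_from_range(minimum, maximum):
    """Rough SD approximation from the range (commonly range/4 for moderate n)."""
    return (maximum - minimum) / 4.0

def sd_from_iqr(q1, q3):
    """Rough SD approximation from the interquartile range of a normal sample."""
    return (q3 - q1) / 1.35

def mean_from_quartiles(q1, median, q3):
    """Rough mean approximation from the median and quartiles."""
    return (q1 + median + q3) / 3.0

# Hypothetical trial reporting only median 12, quartiles 9 and 16, range 4-25
print(mean_from_quartiles(9, 12, 16), sd_from_range(4, 25), sd_from_iqr(9, 16))
```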
Computing tools for implementing standards for single-case designs.
Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E
2015-11-01
In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.
Small-Scale System for Evaluation of Stretch-Flangeability with Excellent Reliability
NASA Astrophysics Data System (ADS)
Yoon, Jae Ik; Jung, Jaimyun; Lee, Hak Hyeon; Kim, Hyoung Seop
2018-02-01
We propose a system for evaluating the stretch-flangeability of small-scale specimens based on the hole-expansion ratio (HER). The system has no size effect and shows excellent reproducibility, reliability, and economic efficiency. To verify the reliability and reproducibility of the proposed hole-expansion testing (HET) method, the deformation behavior of the conventional standard stretch-flangeability evaluation method was compared with the proposed method using finite-element method simulations. The distribution of shearing defects in the hole-edge region of the specimen, which has a significant influence on the HER, was investigated using scanning electron microscopy. The stretch-flangeability of several kinds of advanced high-strength steel determined using the conventional standard method was compared with that using the proposed small-scale HET method. It was verified that the deformation behavior, morphology and distribution of shearing defects, and stretch-flangeability results for the specimens were the same for the conventional standard method and the proposed small-scale stretch-flangeability evaluation system.
Small-Scale System for Evaluation of Stretch-Flangeability with Excellent Reliability
NASA Astrophysics Data System (ADS)
Yoon, Jae Ik; Jung, Jaimyun; Lee, Hak Hyeon; Kim, Hyoung Seop
2018-06-01
We propose a system for evaluating the stretch-flangeability of small-scale specimens based on the hole-expansion ratio (HER). The system has no size effect and shows excellent reproducibility, reliability, and economic efficiency. To verify the reliability and reproducibility of the proposed hole-expansion testing (HET) method, the deformation behavior of the conventional standard stretch-flangeability evaluation method was compared with the proposed method using finite-element method simulations. The distribution of shearing defects in the hole-edge region of the specimen, which has a significant influence on the HER, was investigated using scanning electron microscopy. The stretch-flangeability of several kinds of advanced high-strength steel determined using the conventional standard method was compared with that using the proposed small-scale HET method. It was verified that the deformation behavior, morphology and distribution of shearing defects, and stretch-flangeability results for the specimens were the same for the conventional standard method and the proposed small-scale stretch-flangeability evaluation system.
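For reference, the hole-expansion ratio (HER) on which the evaluation above is based is commonly defined (e.g., in the ISO 16630 hole-expanding test) as the relative increase of the punched hole diameter at the moment a through-thickness crack appears; the exact test conditions of the proposed small-scale system may differ.

```latex
% D_0: initial hole diameter, D_f: hole diameter when a through-thickness crack forms
\mathrm{HER}\,(\%) \;=\; \frac{D_f - D_0}{D_0}\times 100
```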
Karmarkar, S; Yang, X; Garber, R; Szajkovics, A; Koberda, M
2014-11-01
The USP monograph describes an HPLC method for seven impurities in the amiodarone drug substance using an L1 column, 4.6 mm × 150 mm, 5 μm packing (PF-listed ODS2 GL-Science Inertsil column) at 30°C with detection at 240 nm. The standard contains 0.01 mg/mL of amiodarone and USP-specified impurities D and E, with a resolution requirement of NLT 3.5 between peaks D and E. Impurities in a 5 mg/mL sample are quantitated against the standard. The impurity A peak elutes just before peak D. We observed two problems with the method: column lot-to-lot variability resulted in unresolved A, D, and E peaks, and peak D in the sample preparation eluted much later than that in the standard solution. Therefore, optimization experiments were conducted on the USP method following the QbD approach with Fusion AE™ software (S-Matrix Corporation). The resulting optimized conditions were within the allowable changes per USP 〈621〉. Lot-to-lot variability was negligible with the Atlantis T3 (Waters Corporation) L1 column. The peak D retention time remained constant from standard to sample. The optimized method was validated in terms of accuracy, precision, linearity, range, LOQ/LOD, specificity, robustness, equivalency to the USP method, and solution stability. The QbD-based development helped in generating a design space and an operating space, with knowledge of all method performance characteristics and limitations, and demonstrated method robustness within the operating space. Copyright © 2014 Elsevier B.V. All rights reserved.
Time-variant random interval natural frequency analysis of structures
NASA Astrophysics Data System (ADS)
Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin
2018-02-01
This paper presents a new robust method, namely the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random-interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the advantages of both methods in a way that dramatically reduces computational cost. The presented method is thus capable of accurately and efficiently investigating the day-to-day time-variant natural frequency of structures under the intrinsic creep effect of concrete, with both probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples, with a progressive relationship in terms of both structure type and uncertainty variables, are presented to demonstrate the applicability, accuracy and efficiency of the proposed method.
Eriksen, Jane N; Madsen, Pia L; Dragsted, Lars O; Arrigoni, Eva
2017-02-01
An improved UHPLC-DAD-based method was developed and validated for quantification of major carotenoids present in spinach, serum, chylomicrons, and feces. Separation was achieved with gradient elution within 12.5 min for six dietary carotenoids and the internal standard, echinenone. The proposed method provides, for all standard components, resolution > 1.1, linearity covering the target range (R > 0.99), LOQ < 0.035 mg/L, and intraday and interday RSDs < 2 and 10%, respectively. Suitability of the method was tested on biological matrices. Method precision (RSD%) for carotenoid quantification in serum, chylomicrons, and feces was below 10% for intra- and interday analysis, except for lycopene. Method accuracy was consistent with mean recoveries ranging from 78.8 to 96.9% and from 57.2 to 96.9% for all carotenoids, except for lycopene, in serum and feces, respectively. Additionally, an interlaboratory validation study on spinach at two institutions showed no significant differences in lutein or β-carotene content, when evaluated on four occasions.
HIPS: A new hippocampus subfield segmentation method.
Romero, José E; Coupé, Pierrick; Manjón, José V
2017-12-01
The importance of the hippocampus in the study of several neurodegenerative diseases such as Alzheimer's disease makes it a structure of great interest in neuroimaging. However, few segmentation methods have been proposed to measure its subfields due to its complex structure and the lack of high resolution magnetic resonance (MR) data. In this work, we present a new pipeline for automatic hippocampus subfield segmentation using two available hippocampus subfield delineation protocols that can work with both high and standard resolution data. The proposed method is based on multi-atlas label fusion technology that benefits from a novel multi-contrast patch match search process (using high resolution T1-weighted and T2-weighted images). The proposed method also includes as post-processing a new neural network-based error correction step to minimize systematic segmentation errors. The method has been evaluated on both high and standard resolution images and compared to other state-of-the-art methods showing better results in terms of accuracy and execution time. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Marchant, T. E.; Joshi, K. D.; Moore, C. J.
2018-03-01
Radiotherapy dose calculations based on cone-beam CT (CBCT) images can be inaccurate due to unreliable Hounsfield units (HU) in the CBCT. Deformable image registration of planning CT images to CBCT, and direct correction of CBCT image values are two methods proposed to allow heterogeneity corrected dose calculations based on CBCT. In this paper we compare the accuracy and robustness of these two approaches. CBCT images for 44 patients were used including pelvis, lung and head & neck sites. CBCT HU were corrected using a ‘shading correction’ algorithm and via deformable registration of planning CT to CBCT using either Elastix or Niftyreg. Radiotherapy dose distributions were re-calculated with heterogeneity correction based on the corrected CBCT and several relevant dose metrics for target and OAR volumes were calculated. Accuracy of CBCT based dose metrics was determined using an ‘override ratio’ method where the ratio of the dose metric to that calculated on a bulk-density assigned version of the same image is assumed to be constant for each patient, allowing comparison to the patient’s planning CT as a gold standard. Similar performance is achieved by shading corrected CBCT and both deformable registration algorithms, with mean and standard deviation of dose metric error less than 1% for all sites studied. For lung images, use of deformed CT leads to slightly larger standard deviation of dose metric error than shading corrected CBCT with more dose metric errors greater than 2% observed (7% versus 1%).
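One way to write the 'override ratio' check described above: for each image, the chosen dose metric D is computed both with heterogeneity correction and on a bulk-density-assigned copy of the same image, and the per-patient ratio is assumed constant, so the CBCT-based error relative to the planning-CT gold standard can be expressed as below (notation introduced here for illustration, not taken from the paper).

```latex
\varepsilon \;=\; \frac{D_{\mathrm{CBCT}} / D_{\mathrm{CBCT}}^{\mathrm{bulk}}}
                       {D_{\mathrm{CT}} / D_{\mathrm{CT}}^{\mathrm{bulk}}} \;-\; 1
```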
Recent progress in the development of ISO 19751
NASA Astrophysics Data System (ADS)
Farnand, Susan P.; Dalal, Edul N.; Ng, Yee S.
2006-01-01
A small number of general visual attributes have been recognized as essential in describing image quality. These include micro-uniformity, macro-uniformity, colour rendition, text and line quality, gloss, sharpness, and spatial adjacency or temporal adjacency attributes. The multiple-part International Standard discussed here was initiated by the INCITS W1 committee on the standardization of office equipment to address the need for unambiguously documented procedures and methods, which are widely applicable over the multiple printing technologies employed in office applications, for the appearance-based evaluation of these visually significant image quality attributes of printed image quality. 1,2 The resulting proposed International Standard, for which ISO/IEC WD 19751-1 3 presents an overview and an outline of the overall procedure and common methods, is based on a proposal that was predicated on the idea that image quality could be described by a small set of broad-based attributes. 4 Five ad hoc teams were established (now six since a sharpness team is in the process of being formed) to generate standards for one or more of these image quality attributes. Updates on the colour rendition, text and line quality, and gloss attributes are provided.
40 CFR 98.386 - Data reporting requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Coal-based Liquid Fuels § 98.386 Data reporting... measurement standard method or other industry standard practice used. For natural gas liquids, quantity shall... site, report the total annual quantity in metric tons or barrels. For natural gas liquids, quantity...
40 CFR 98.386 - Data reporting requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Coal-based Liquid Fuels § 98.386 Data reporting... measurement standard method or other industry standard practice used. For natural gas liquids, quantity shall... site, report the total annual quantity in metric tons or barrels. For natural gas liquids, quantity...
40 CFR 98.386 - Data reporting requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Coal-based Liquid Fuels § 98.386 Data reporting... measurement standard method or other industry standard practice used. For natural gas liquids, quantity shall... site, report the total annual quantity in metric tons or barrels. For natural gas liquids, quantity...
40 CFR 98.386 - Data reporting requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Coal-based Liquid Fuels § 98.386 Data reporting... measurement standard method or other industry standard practice used. For natural gas liquids, quantity shall... site, report the total annual quantity in metric tons or barrels. For natural gas liquids, quantity...
ERIC Educational Resources Information Center
Nelson, Catherine J.
2012-01-01
The author is a strong proponent of incorporating the Content and Process Standards (NCTM 2000) into the teaching of mathematics. For candidates in her methods course, she models research-based best practices anchored in the Standards. Her students use manipulatives, engage in problem-solving activities, listen to children's literature, and use…
A Stable Whole Building Performance Method for Standard 90.1-Part II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael I.; Eley, Charles
2016-06-01
In May of 2013 we introduced a new approach for compliance with Standard 90.1 that was under development based on the Performance Rating Method of Appendix G to Standard 90.1. Since then, the approach has been finalized through Addendum BM to Standard 90.1-2013 and will be published in the 2016 edition of the Standard. In the meantime, ASHRAE has published an advance copy of Appendix G including Addendum BM and several other addenda so that software developers and energy program administrators can get a preview of what is coming in the 2016 edition of the Standard. This article is an update on Addendum BM, summarizes changes made to the original concept as introduced in May of 2013, and provides an approach for developing performance targets for code compliance and beyond-code programs.
Li, Hui
2009-03-01
To construct standardized growth data and curves based on weight, length/height and head circumference for Chinese children under 7 years of age. Random cluster sampling was used. The fourth national growth survey of children under 7 years in the nine cities (Beijing, Harbin, Xi'an, Shanghai, Nanjing, Wuhan, Fuzhou, Guangzhou and Kunming) of China was performed in 2005, and from this survey, data of 69 760 urban healthy boys and girls were used to set up the database for weight-for-age, height-for-age (length was measured for children under 3 years) and head circumference-for-age. Anthropometric data were collected with rigorous methods and standardized procedures across study sites. The LMS method, based on the Box-Cox normal transformation and cubic spline smoothing, was chosen for fitting the raw data according to the study design and data features, and standardized values for any percentile and standard deviation were obtained from the L, M and S parameters. Length-for-age and height-for-age standards were constructed by fitting the same model, but the final curves reflected the 0.7 cm average difference between these two measurements. A set of systematic diagnostic tools was used to detect possible biases in estimated percentile or standard deviation curves, including the chi-square test, which was used as a reference to evaluate the goodness of fit. The 3rd, 10th, 25th, 50th, 75th, 90th and 97th smoothed percentiles and the -3, -2, -1, 0, +1, +2, +3 SD values and curves of weight-for-age, length/height-for-age and head circumference-for-age for boys and girls aged 0-7 years were produced. The Chinese child growth charts were slightly higher than the WHO child growth standards. The newly established growth charts represent the growth level of healthy and well-nourished Chinese children. The sample size was very large and national, the data were of high quality, and the smoothing method is internationally accepted. The new Chinese growth charts are recommended as the national child growth standards for use in China in the 21st century.
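The LMS parameters summarize each age-specific distribution, and percentile or SD values follow from a closed-form transformation. A minimal sketch of that calculation is shown below; the parameter values are invented for illustration and are not taken from the survey.

```python
import math

def lms_value(L, M, S, z):
    """Measurement (e.g. weight) at z-score z, given the LMS parameters for an age."""
    if abs(L) > 1e-8:
        return M * (1.0 + L * S * z) ** (1.0 / L)
    return M * math.exp(S * z)  # limiting case L = 0

# Hypothetical LMS parameters for a single age point (not from the survey):
L, M, S = -0.35, 9.6, 0.11
for z in (-2, -1, 0, 1, 2):
    print(z, round(lms_value(L, M, S, z), 2))
```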
2014-01-01
Background Levels of haemoglobin A1c (HbA1c) and blood lipids are important determinants of risk in patients with diabetes. Standard analysis methods based upon venous blood samples can be logistically challenging in resource-poor settings where much of the diabetes epidemic is occurring. Dried blood spots (DBS) provide a simple alternative method for sample collection but the comparability of data from analyses based on DBS is not well established. Methods We conducted a systematic review and meta-analysis to define the association of findings for HbA1c and blood lipids for analyses based upon standard methods compared to DBS. The Cochrane, Embase and Medline databases were searched for relevant reports and summary regression lines were estimated. Results 705 abstracts were found by the initial electronic search with 6 further reports identified by manual review of the full papers. 16 studies provided data for one or more outcomes of interest. There was a close agreement between the results for HbA1c assays based on venous and DBS samples (DBS = 0.9858 × venous + 0.3809), except for assays based upon affinity chromatography. Significant adjustment was required for assays of total cholesterol (DBS = 0.6807 × venous + 1.151) but results for triglycerides (DBS = 0.9557 × venous + 0.1427) were directly comparable. Conclusions For HbA1c and selected blood lipids, assays based on DBS samples are clearly associated with assays based on standard venous samples. There are, however, significant uncertainties about the nature of these associations and there is a need for standardisation of the sample collection, transportation, storage and analysis methods before the technique can be considered mainstream. This should be a research priority because better elucidation of metabolic risks in resource-poor settings, where venous sampling is infeasible, will be key to addressing the global epidemic of cardiovascular diseases. PMID:25045323
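Where DBS assays are the only practical option, the reported summary regression lines can be inverted to express a DBS result on the venous scale. The short sketch below does exactly that; the input values are made up for illustration.

```python
# Convert a DBS result to a venous-equivalent value by inverting the summary
# regression lines reported above (illustrative only, not a clinical tool).
def venous_equivalent(dbs_value, slope, intercept):
    return (dbs_value - intercept) / slope

# HbA1c:             DBS = 0.9858 * venous + 0.3809
# Total cholesterol: DBS = 0.6807 * venous + 1.151
print(venous_equivalent(6.4, 0.9858, 0.3809))   # HbA1c (%)
print(venous_equivalent(4.2, 0.6807, 1.151))    # total cholesterol (mmol/L)
```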
Reducing data friction through site-based data curation
NASA Astrophysics Data System (ADS)
Thomer, A.; Palmer, C. L.
2017-12-01
Much of geoscience research takes place at "scientifically significant sites": localities which have attracted a critical mass of scientific interest, and thereby merit protection by government bodies, as well as the preservation of specimen and data collections and the development of site-specific permitting requirements for access to the site and its associated collections. However, many data standards and knowledge organization schemas do not adequately describe key characteristics of the sites, despite their centrality to research projects. Through work conducted as part of the IMLS-funded Site-Based Data Curation (SBDC) project, we developed a Minimum Information Framework (MIF) for site-based science, in which "information about a site's structure" is considered a core class of information. Here we present our empirically-derived information framework, as well as the methods used to create it. We believe these approaches will lead to the development of more effective data repositories and tools, and thereby will reduce "data friction" in interdisciplinary, yet site-based, geoscience workflows. The Minimum Information Framework for Site-based Research was developed through work at two scientifically significant sites: the hot springs at Yellowstone National Park, which are key to geobiology research; and the La Brea Tar Pits, an important paleontology locality in Southern California. We employed diverse methods of participatory engagement, in which key stakeholders at our sites (e.g. curators, collections managers, researchers, permit officers) were consulted through workshops, focus groups, interviews, action research methods, and collaborative information modeling and systems analysis. These participatory approaches were highly effective in fostering on-going partnership among a diverse team of domain scientists, information scientists, and software developers. The MIF developed in this work may be viewed as a "proto-standard" that can inform future repository development and data standards. Further, the approaches used to develop the MIF represent an important step toward systematic methods of developing geoscience data standards. Finally, we argue that organizing data around aspects of a site makes data collections more accessible to a range of scientific communities.
Sample handling for mass spectrometric proteomic investigations of human sera.
West-Nielsen, Mikkel; Høgdall, Estrid V; Marchiori, Elena; Høgdall, Claus K; Schou, Christian; Heegaard, Niels H H
2005-08-15
Proteomic investigations of sera are potentially of value for diagnosis, prognosis, choice of therapy, and disease activity assessment by virtue of discovering new biomarkers and biomarker patterns. Much debate focuses on the biological relevance and the need for identification of such biomarkers while less effort has been invested in devising standard procedures for sample preparation and storage in relation to model building based on complex sets of mass spectrometric (MS) data. Thus, development of standardized methods for collection and storage of patient samples together with standards for transportation and handling of samples are needed. This requires knowledge about how sample processing affects MS-based proteome analyses and thereby how nonbiological biased classification errors are avoided. In this study, we characterize the effects of sample handling, including clotting conditions, storage temperature, storage time, and freeze/thaw cycles, on MS-based proteomics of human serum by using principal components analysis, support vector machine learning, and clustering methods based on genetic algorithms as class modeling and prediction methods. Using spiking to artificially create differentiable sample groups, this integrated approach yields data that--even when working with sample groups that differ more than may be expected in biological studies--clearly demonstrate the need for comparable sampling conditions for samples used for modeling and for the samples that are going into the test set group. Also, the study emphasizes the difference between class prediction and class comparison studies as well as the advantages and disadvantages of different modeling methods.
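As a rough illustration of the class-modelling workflow the study describes (dimension reduction followed by supervised classification), the sketch below combines principal components analysis with a linear support vector machine in scikit-learn. The spectra and group labels are synthetic placeholders; this is not the authors' pipeline.

```python
# Minimal PCA + SVM class-prediction sketch on synthetic "spectra".
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 500))        # 40 spectra, 500 m/z intensity bins
y = np.array([0] * 20 + [1] * 20)     # e.g. two sample-handling groups

model = make_pipeline(PCA(n_components=10), SVC(kernel="linear"))
print(cross_val_score(model, X, y, cv=5).mean())   # cross-validated accuracy
```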
Wu, Yan; He, Yi; He, Wenyi; Zhang, Yumei; Lu, Jing; Dai, Zhong; Ma, Shuangcheng; Lin, Ruichao
2014-03-01
Quantitative nuclear magnetic resonance spectroscopy (qNMR) has developed into an important tool in drug analysis, biomacromolecule detection, and metabolism studies. Compared with the mass balance method, the qNMR method has some advantages in the calibration of reference standards (RS): it determines the absolute amount of a sample, and another chemical compound and its certified reference material (CRM) can be used as an internal standard (IS) to obtain the purity of the sample. Protoberberine alkaloids have many biological activities and have been used as reference standards for the control of many herbal drugs. In the present study, qNMR methods were developed for the calibration of berberine hydrochloride, palmatine hydrochloride, tetrahydropalmatine, and phellodendrine hydrochloride with potassium hydrogen phthalate as IS. Method validation was carried out according to the guidelines for method validation of the Chinese Pharmacopoeia. The results of qNMR were compared with those of the mass balance method, and the differences between the results of the two methods were acceptable based on the analysis of estimated measurement uncertainties. Therefore, qNMR is an effective and reliable analysis method for the calibration of RS and can be used as a good complement to the mass balance method. Copyright © 2013 Elsevier B.V. All rights reserved.
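Purity determination against an internal standard in qNMR follows from the ratio of proton-normalized signal areas together with the molar masses and weighed masses. A minimal sketch of that arithmetic is shown below; the numeric inputs are hypothetical and only illustrate the calculation.

```python
def qnmr_purity(I_x, I_is, N_x, N_is, M_x, M_is, m_x, m_is, P_is):
    """
    I: integrated signal area, N: number of protons behind that signal,
    M: molar mass (g/mol), m: weighed mass (mg), P: purity (fraction).
    Returns the purity fraction of the analyte x against internal standard is.
    """
    return (I_x / I_is) * (N_is / N_x) * (M_x / M_is) * (m_is / m_x) * P_is

# Hypothetical numbers for an analyte calibrated against potassium hydrogen phthalate:
print(qnmr_purity(I_x=0.131, I_is=1.00, N_x=1, N_is=4,
                  M_x=371.81, M_is=204.22, m_x=10.1, m_is=10.5, P_is=0.9999))
```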
Takarabe, S; Yabuuchi, H; Morishita, J
2012-06-01
To investigate the usefulness of the standard deviation of pixel values in a whole mammary glands region and the percentage of a high-density mammary glands region to a whole mammary glands region as features for classification of mammograms into four categories based on the ACR BI-RADS breast composition. We used 36 digital mediolateral oblique view mammograms (18 patients) approved by our IRB. These images were classified into the four categories of breast composition by an experienced breast radiologist, and the results of the classification were regarded as a gold standard. First, the whole mammary region in a breast was divided into two regions, a high-density mammary glands region and a low/iso-density mammary glands region, by using a threshold value obtained from the pixel values corresponding to a pectoral muscle region. Then the percentage of the high-density mammary glands region to the whole mammary glands region was calculated. In addition, as a new method, the standard deviation of pixel values in the whole mammary glands region was calculated as an index based on the intermingling of mammary glands and fat. Finally, all mammograms were classified by using the combination of the percentage of the high-density mammary glands region and the standard deviation of each image. The agreement rate of the classification between our proposed method and the gold standard was 86% (31/36). This result signified that our method has the potential to classify mammograms. The combination of the standard deviation of pixel values in a whole mammary glands region and the percentage of a high-density mammary glands region to a whole mammary glands region was available as features to classify mammograms based on the ACR BI-RADS breast composition. © 2012 American Association of Physicists in Medicine.
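The two image features described above are straightforward to compute once the glands region and the pectoral-muscle-derived threshold are available. A minimal sketch follows; the function name, variable names and numbers are illustrative.

```python
import numpy as np

def breast_density_features(glands_pixels, threshold):
    """
    glands_pixels: 1-D array of pixel values inside the whole mammary glands region.
    threshold:     value derived from the pectoral muscle region (per the abstract).
    Returns (percent high-density area, standard deviation of pixel values).
    """
    high = glands_pixels >= threshold
    pct_high_density = 100.0 * high.sum() / glands_pixels.size
    return pct_high_density, glands_pixels.std()

# Synthetic example:
pixels = np.random.default_rng(0).normal(600.0, 80.0, size=50_000)
print(breast_density_features(pixels, threshold=650.0))
```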
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, J; Gao, H
2016-06-15
Purpose: Different from the conventional computed tomography (CT), spectral CT based on energy-resolved photon-counting detectors is able to provide unprecedented material composition. However, an important missing piece for accurate spectral CT is to incorporate the detector response function (DRF), which is distorted by factors such as pulse pileup and charge sharing. In this work, we propose material reconstruction methods for spectral CT with DRF. Methods: The polyenergetic X-ray forward model takes the DRF into account for accurate material reconstruction. Two image reconstruction methods are proposed: a direct method based on the nonlinear data fidelity from the DRF-based forward model, and a linear-data-fidelity based method that relies on spectral rebinning so that the corresponding DRF matrix is invertible. The image reconstruction problem is then regularized with an isotropic TV term and solved by the alternating direction method of multipliers. Results: The simulation results suggest that the proposed methods provided more accurate material compositions than the standard method without DRF. Moreover, the proposed method with linear data fidelity had improved reconstruction quality over the proposed method with nonlinear data fidelity. Conclusion: We have proposed material reconstruction methods for spectral CT with DRF, which provided more accurate material compositions than the standard methods without DRF. Moreover, the proposed method with linear data fidelity had improved reconstruction quality over the proposed method with nonlinear data fidelity. Jiulong Liu and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000), and the Shanghai Pujiang Talent Program (#14PJ1404500).
The Delphi Method: An Approach for Facilitating Evidence Based Practice in Athletic Training
ERIC Educational Resources Information Center
Sandrey, Michelle A.; Bulger, Sean M.
2008-01-01
Objective: The growing importance of evidence based practice in athletic training is necessitating academics and clinicians to be able to make judgments about the quality or lack of the body of research evidence and peer-reviewed standards pertaining to clinical questions. To assist in the judgment process, consensus methods, namely brainstorming,…
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Procedure for Mixing Base Fluids With Sediments (EPA Method 1646) 3 Appendix 3 to Subpart A of Part 435 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT...
ERIC Educational Resources Information Center
Owen-Stone, Deborah S.
2012-01-01
The purpose of this concurrent mixed methods study was to examine the collaborative relationship between scientists and science teachers and to incorporate and advocate scientific literacy based on past and current educational theories such as inquiry based teaching. The scope of this study included archived student standardized test scores,…
Effect of costing methods on unit cost of hospital medical services.
Riewpaiboon, Arthorn; Malaroje, Saranya; Kongsawatt, Sukalaya
2007-04-01
To explore the variance of unit costs of hospital medical services due to different costing methods employed in the analysis. Retrospective and descriptive study at Kaengkhoi District Hospital, Saraburi Province, Thailand, in the fiscal year 2002. The process started with a calculation of unit costs of medical services as a base case. After that, the unit costs were re-calculated based on various methods. Finally, the variations of the results obtained from various methods and the base case were computed and compared. The total annualized capital cost of buildings and capital items calculated by the accounting-based approach (averaging the capital purchase prices throughout their useful life) was 13.02% lower than that calculated by the economic-based approach (combination of depreciation cost and interest on undepreciated portion over the useful life). A change of discount rate from 3% to 6% results in a 4.76% increase of the hospital's total annualized capital cost. When the useful life of durable goods was changed from 5 to 10 years, the total annualized capital cost of the hospital decreased by 17.28% from that of the base case. Regarding alternative criteria of indirect cost allocation, unit cost of medical services changed by a range of -6.99% to +4.05%. We explored the effect on unit cost of medical services in one department. Various costing methods, including departmental allocation methods, ranged between -85% and +32% against those of the base case. Based on the variation analysis, the economic-based approach was suitable for capital cost calculation. For the useful life of capital items, appropriate duration should be studied and standardized. Regarding allocation criteria, single-output criteria might be more efficient than the combined-output and complicated ones. For the departmental allocation methods, micro-costing method was the most suitable method at the time of study. These different costing methods should be standardized and developed as guidelines since they could affect implementation of the national health insurance scheme and health financing management.
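The contrast between the accounting-based and economic-based treatment of capital costs described above can be made concrete with a simple annualization sketch. The capital recovery factor used here is one common way of combining depreciation with interest on the undepreciated portion; the price, useful life and discount rates are illustrative, not figures from the study.

```python
def annualized_cost_accounting(price, useful_life_years):
    # Straight-line averaging of the purchase price over its useful life.
    return price / useful_life_years

def annualized_cost_economic(price, useful_life_years, discount_rate):
    # Equivalent annual cost via the capital recovery factor, combining
    # depreciation with interest on the undepreciated portion.
    r, n = discount_rate, useful_life_years
    return price * r / (1.0 - (1.0 + r) ** (-n))

price = 100_000
print(annualized_cost_accounting(price, 5))        # 20000.0
print(annualized_cost_economic(price, 5, 0.03))    # ~21835
print(annualized_cost_economic(price, 5, 0.06))    # ~23740 (higher discount rate)
```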
Dong, Shuya; He, Jiao; Hou, Huiping; Shuai, Yaping; Wang, Qi; Yang, Wenling; Sun, Zheng; Li, Qing; Bi, Kaishun; Liu, Ran
2017-12-01
A novel, improved, and comprehensive method for quality evaluation and discrimination of Herba Leonuri has been developed and validated based on normal- and reversed-phase chromatographic methods. To identify Herba Leonuri, normal- and reversed-phase high-performance thin-layer chromatography fingerprints were obtained by comparing the colors and Rf values of the bands, and reversed-phase high-performance liquid chromatography fingerprints were obtained by using an Agilent Poroshell 120 SB-C18 within 28 min. By similarity analysis and hierarchical clustering analysis, we show that there are similar chromatographic patterns in Herba Leonuri samples, but significant differences in counterfeits and variants. To quantify the bio-active components of Herba Leonuri, reversed-phase high-performance liquid chromatography was performed to analyze syringate, leonurine, quercetin-3-O-robiniaglycoside, hyperoside, rutin, isoquercitrin, wogonin, and genkwanin simultaneously by the single standard to determine multi-components method, with rutin as internal standard. Meanwhile, normal-phase high-performance liquid chromatography was performed by using an Agilent ZORBAX HILIC Plus within 6 min to determine trigonelline and stachydrine, using trigonelline as internal standard. Innovatively, among these compounds, the bio-active components quercetin-3-O-robiniaglycoside and trigonelline were determined in Herba Leonuri for the first time. In general, the method integrating multi-chromatographic analyses offered an efficient way for the standardization and identification of Herba Leonuri. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Basye, Austin T.
A matrix element method analysis of the Standard Model Higgs boson, produced in association with two top quarks decaying to the lepton-plus-jets channel, is presented. Based on 20.3 fb⁻¹ of √s = 8 TeV data, produced at the Large Hadron Collider and collected by the ATLAS detector, this analysis utilizes multiple advanced techniques to search for ttH signatures with a 125 GeV Higgs boson decaying to two b-quarks. After categorizing selected events based on their jet and b-tag multiplicities, signal-rich regions are analyzed using the matrix element method. Resulting variables are then propagated to two parallel multivariate analyses utilizing Neural Networks and Boosted Decision Trees respectively. As no significant excess is found, an observed (expected) limit of 3.4 (2.2) times the Standard Model cross-section is determined at 95% confidence, using the CLs method, for the Neural Network analysis. For the Boosted Decision Tree analysis, an observed (expected) limit of 5.2 (2.7) times the Standard Model cross-section is determined at 95% confidence, using the CLs method. Corresponding unconstrained fits of the Higgs boson signal strength to the observed data yield measured ratios of the signal cross-section to the Standard Model prediction of μ = 1.2 ± 1.3 (total) ± 0.7 (stat.) for the Neural Network analysis, and μ = 2.9 ± 1.4 (total) ± 0.8 (stat.) for the Boosted Decision Tree analysis.
Legaz-García, María del Carmen; Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2012-01-01
Linking Electronic Healthcare Records (EHR) content to educational materials has been considered a key international recommendation to enable clinical engagement and to promote patient safety. It would help guide citizens to reliable information available on the web. In this paper, we describe an approach in that direction, based on the use of dual model EHR standards and standardized educational contents. The recommendation method is based on the semantic coverage of the learning content repository for a particular archetype, which is calculated by applying semantic web technologies such as ontologies and semantic annotations.
Open-source platform to benchmark fingerprints for ligand-based virtual screening
2013-01-01
Similarity-search methods using molecular fingerprints are an important tool for ligand-based virtual screening. A huge variety of fingerprints exist and their performance, usually assessed in retrospective benchmarking studies using data sets with known actives and known or assumed inactives, depends largely on the validation data sets used and the similarity measure used. Comparing new methods to existing ones in any systematic way is rather difficult due to the lack of standard data sets and evaluation procedures. Here, we present a standard platform for the benchmarking of 2D fingerprints. The open-source platform contains all source code, structural data for the actives and inactives used (drawn from three publicly available collections of data sets), and lists of randomly selected query molecules to be used for statistically valid comparisons of methods. This allows the exact reproduction and comparison of results for future studies. The results for 12 standard fingerprints together with two simple baseline fingerprints assessed by seven evaluation methods are shown together with the correlations between methods. High correlations were found between the 12 fingerprints and a careful statistical analysis showed that only the two baseline fingerprints were different from the others in a statistically significant way. High correlations were also found between six of the seven evaluation methods, indicating that despite their seeming differences, many of these methods are similar to each other. PMID:23721588
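To make the benchmarking task concrete, here is a minimal ligand-based similarity search of the kind such a platform evaluates, using RDKit Morgan fingerprints and Tanimoto similarity. The query molecule, library entries and fingerprint settings are arbitrary choices, not those of the published platform.

```python
# Rank a small library against a query by 2D-fingerprint Tanimoto similarity.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def fingerprint(smiles):
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

query = fingerprint("CC(=O)Oc1ccccc1C(=O)O")          # aspirin as the query
library = {"paracetamol": "CC(=O)Nc1ccc(O)cc1",
           "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O"}

ranked = sorted(((DataStructs.TanimotoSimilarity(query, fingerprint(s)), name)
                 for name, s in library.items()), reverse=True)
print(ranked)   # highest-similarity molecules first
```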
Broadband standard dipole antenna for antenna calibration
NASA Astrophysics Data System (ADS)
Koike, Kunimasa; Sugiura, Akira; Morikawa, Takao
1995-06-01
Calibration of EMI antennas is mostly performed by the standard antenna method at an open-field test site, using a specially designed dipole antenna as a reference. In order to develop broadband standard antennas, the antenna factors of shortened dipoles are theoretically investigated. First, the effects of the dipole length are analyzed using the induced emf method. Then, baluns and loads are examined to determine their influence on the antenna factors. It is found that transformer-type baluns are very effective for improving the height dependence of the antenna factors. Resistive loads are also useful for flattening the frequency dependence. Based on these studies, a specification is developed for a broadband standard antenna operating in the 30 to 150 MHz frequency range.
Viera, Mariela S; Rizzetti, Tiele M; de Souza, Maiara P; Martins, Manoel L; Prestes, Osmar D; Adaime, Martha B; Zanella, Renato
2017-12-01
In this study, a QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) method, optimized by a 2³ full factorial design, was developed for the determination of 72 pesticides in plant parts of carrot, corn, melon, rice, soy, silage, tobacco, cassava, lettuce and wheat by ultra-high-performance liquid chromatography tandem mass spectrometry (UHPLC-MS/MS). Considering the complexity of these matrices and the need to calibrate in matrix, a new calibration approach based on single level standard addition in the sample (SLSAS) was proposed in this work and compared with matrix-matched calibration (MMC), procedural standard calibration (PSC) and diluted standard addition calibration (DSAC). All approaches presented satisfactory validation parameters, with recoveries from 70 to 120% and relative standard deviations ≤ 20%. SLSAS was the most practical of the evaluated approaches and proved to be an effective way of calibration. Method limits of detection were between 4.8 and 48 μg kg⁻¹ and limits of quantification were from 16 to 160 μg kg⁻¹. Application of the method to different kinds of plants found residues of 20 pesticides, which were quantified with z-score values ≤ 2 in comparison with other calibration approaches. The proposed QuEChERS method combined with UHPLC-MS/MS analysis, using an easy and effective calibration procedure, presented satisfactory results for pesticide residue determination in different crop plants and is a good alternative for routine analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
Mao, De-Hua; Hu, Guang-Wei; Liu, Hui-Jie; Li, Zheng-Zui; Li, Zhi-Long; Tan, Zi-Fang
2014-02-01
The annual emergy and currency value of the main ecological services provided by returning cropland to lake in the Dongting Lake region from 1999 to 2010 was calculated based on emergy analysis. A calculation method for the ecological compensation standard was established by calculating the annual total emergy of the ecological service increment since the starting year of returning cropland to lake, and the annual ecological compensation standard and compensation area were analyzed from 1999 to 2010. The results indicated that the ecological compensation standard from 1999 to 2010 was 40.31-86.48 yuan·m⁻², with a mean of 57.33 yuan·m⁻². The ecological compensation standard increased year by year due to the ecological recovery following the return of cropland to lake. The ecological compensation standard in the research area grew rapidly and steadily after 2005, mainly due to the intensive economic development of Hunan Province, suggesting that the value of natural ecological resources will increase along with the development of society and economy. Applying emergy analysis to the ecological compensation standard can reveal the dynamics of the annual compensation standard, connect matter flow, energy flow and economic flow, and overcome the subjectivity and arbitrariness of environmental economic methods. The empirical research on the ecological compensation standard in the Dongting Lake region showed that emergy analysis is feasible and advanced.
Robust Mediation Analysis Based on Median Regression
Yuan, Ying; MacKinnon, David P.
2014-01-01
Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
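The core of the approach, estimating the a and b paths with median regression and forming the indirect effect a·b, can be sketched with statsmodels' quantile regression as below. The simulated data and model are only illustrative, not the authors' implementation.

```python
# Median-regression mediation sketch: M = a*X + e1, Y = b*M + c'*X + e2,
# with the indirect effect estimated as a*b. Errors are heavy-tailed on purpose.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.standard_t(df=3, size=n)
y = 0.4 * m + 0.2 * x + rng.standard_t(df=3, size=n)

a = sm.QuantReg(m, sm.add_constant(x)).fit(q=0.5).params[1]
b = sm.QuantReg(y, sm.add_constant(np.column_stack([m, x]))).fit(q=0.5).params[1]
print("indirect effect a*b ~", a * b)
```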
Simple and accurate quantification of BTEX in ambient air by SPME and GC-MS.
Baimatova, Nassiba; Kenessov, Bulat; Koziel, Jacek A; Carlsen, Lars; Bektassov, Marat; Demyanenko, Olga P
2016-07-01
Benzene, toluene, ethylbenzene and xylenes (BTEX) comprise one of the most ubiquitous and hazardous groups of ambient air pollutants of concern. Application of standard analytical methods for quantification of BTEX is limited by the complexity of sampling and sample preparation equipment, and by budget requirements. Methods based on SPME represent a simpler alternative, but still require complex calibration procedures. The objective of this research was to develop a simpler, low-budget, and accurate method for quantification of BTEX in ambient air based on SPME and GC-MS. Standard 20-mL headspace vials were used for field air sampling and calibration. To avoid challenges with obtaining and working with 'zero' air, slope factors of external standard calibration were determined using standard addition and inherently polluted lab air. For the polydimethylsiloxane (PDMS) fiber, differences between the slope factors of calibration plots obtained using lab and outdoor air were below 14%. The PDMS fiber provided higher precision during calibration, while the use of the Carboxen/PDMS fiber resulted in lower detection limits for benzene and toluene. To provide sufficient accuracy, the use of 20-mL vials requires triplicate sampling and analysis. The method was successfully applied for analysis of 108 ambient air samples from Almaty, Kazakhstan. Average concentrations of benzene, toluene, ethylbenzene and o-xylene were 53, 57, 11 and 14 µg m⁻³, respectively. The developed method can be modified for further quantification of a wider range of volatile organic compounds in air. In addition, the new method is amenable to automation. Copyright © 2016 Elsevier B.V. All rights reserved.
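The standard-addition idea used for calibration above amounts to spiking known amounts into already polluted air, fitting a line, and using the slope as the calibration factor; the x-intercept also estimates the native concentration. A minimal sketch with synthetic numbers follows.

```python
import numpy as np

added = np.array([0.0, 20.0, 40.0, 80.0])               # spiked concentration, µg/m³
response = np.array([1.05e5, 1.46e5, 1.88e5, 2.71e5])   # peak areas (synthetic)

slope, intercept = np.polyfit(added, response, 1)        # calibration slope factor
native_conc = intercept / slope                          # standard-addition estimate
print(f"slope = {slope:.1f} area per µg/m³, native concentration ~ {native_conc:.1f} µg/m³")
```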
Koivula, Lauri; Kapanen, Mika; Seppälä, Tiina; Collan, Juhani; Dowling, Jason A; Greer, Peter B; Gustafsson, Christian; Gunnlaugsson, Adalsteinn; Olsson, Lars E; Wee, Leonard; Korhonen, Juha
2017-12-01
Recent studies have shown that it is possible to conduct the entire radiotherapy treatment planning (RTP) workflow using only MR images. This study aims to develop a generalized intensity-based method to generate synthetic CT (sCT) images from standard T2-weighted (T2w) MR images of the pelvis. The study developed a generalized dual model HU conversion method to convert standard T2w MR image intensity values to synthetic HU values, separately inside and outside the atlas-segmented bone volume contour. The method was developed and evaluated with 20 and 35 prostate cancer patients, respectively. MR images with scanning sequences in clinical use were acquired with four different MR scanners from three vendors. For the generated sCT images of the 35 prostate patients, the mean (and maximal) HU differences in soft and bony tissue volumes were 16 ± 6 HU (34 HU) and -46 ± 56 HU (181 HU), respectively, against the true CT images. The average PTV mean dose difference in sCTs compared to that in true CTs was -0.6 ± 0.4% (-1.3%). The study provides a generalized method for sCT creation from standard T2w images of the pelvis. The method produced clinically acceptable dose calculation results for all the included scanners and MR sequences. Copyright © 2017 Elsevier B.V. All rights reserved.
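A dual-model intensity-to-HU conversion of the kind described, one mapping applied inside the bone contour and another outside it, can be sketched as below. The breakpoint tables are invented for illustration and are not the published model parameters.

```python
import numpy as np

def synthetic_ct(t2w, bone_mask, soft_curve, bone_curve):
    """
    Illustrative dual-model conversion: interpolate T2w intensities to HU with
    one curve outside the bone mask and another inside it.
    Each curve is (intensity breakpoints, HU values) for piecewise-linear interpolation.
    """
    sct = np.interp(t2w, *soft_curve).astype(np.float32)
    sct[bone_mask] = np.interp(t2w[bone_mask], *bone_curve)
    return sct

# Hypothetical conversion curves (not the published model):
soft = (np.array([0, 200, 400, 800]), np.array([-1000.0, -100.0, 20.0, 80.0]))
bone = (np.array([0, 150, 300, 800]), np.array([1200.0, 700.0, 300.0, 100.0]))

t2w = np.array([[120.0, 650.0], [90.0, 400.0]])
bone_mask = np.array([[True, False], [False, False]])
print(synthetic_ct(t2w, bone_mask, soft, bone))
```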
Monitoring for airborne allergens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burge, H.A.
1992-07-01
Monitoring for allergens can provide some information on the kinds and levels of exposure experienced by local patient populations, provided volumetric methods are used for sample collection and the analysis is accurate and consistent. Such data can also be used to develop standards for the specific environment and to begin to develop predictive models. Comparing outdoor allergen aerosols between different monitoring sites requires identical collection and analysis methods and some kind of rational standard, whether arbitrary or based on recognized health effects.
Sabzghabaei, Foroogh; Salajeghe, Mahla; Soltani Arabshahi, Seyed Kamran
2017-01-01
Background: In this study, ambulatory care training in Firoozgar hospital was evaluated against the Iranian national standards of undergraduate medical education related to ambulatory education, using the Baldrige Excellence Model. Moreover, using the gap analysis method, some suggestions were offered to promote education quality given the current condition of ambulatory education in Firoozgar hospital relative to the national standards. Methods: This descriptive analytic study was an evaluation research performed using the standard check lists published by the office of the undergraduate medical education council. Data were collected by surveying documents, interviewing, and observing the processes based on the Baldrige Excellence Model. After confirming the validity and reliability of the check lists, we evaluated the level of establishment of the national standards of undergraduate medical education in the clinics of this hospital in the following 4 domains: educational program, evaluation, training and research resources, and faculty members. Data were analyzed according to the national standards of undergraduate medical education related to ambulatory education and the Baldrige scoring table. Finally, the quality level of the current condition was rated as very appropriate, appropriate, medium, weak, or very weak. Results: In the educational program domain 62%, in evaluation 48%, in training and research resources 46%, in faculty members 68%, and overall 56% of the standards were appropriate. Conclusion: The most successful domains were educational program and faculty members, but the evaluation and the training and research resources domains had medium performance. Some domains and indicators were determined to be weak and their quality needed to be improved, so it is suggested to provide the necessary facilities and improvements by attending to the quality level of the national standards of ambulatory education. PMID:29951400
Product line cost estimation: a standard cost approach.
Cooper, J C; Suver, J D
1988-04-01
Product line managers often must make decisions based on inaccurate cost information. A method is needed to determine costs more accurately. By using a standard costing model, product line managers can better estimate the cost of intermediate and end products, and hence better estimate the costs of the product line.
ERIC Educational Resources Information Center
Henry, Gary T.; And Others
1992-01-01
A statistical technique is presented for developing performance standards based on benchmark groups. The benchmark groups are selected using a multivariate technique that relies on a squared Euclidean distance method. For each observation unit (a school district in the example), a unique comparison group is selected. (SLD)
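A minimal sketch of the benchmark-group idea: standardize the characteristics, compute squared Euclidean distances from the unit of interest, and take its nearest neighbours as the comparison group. The data, variable names and group size below are illustrative.

```python
import numpy as np

def benchmark_group(features, index, k=10):
    """Return the indices of the k units closest to `index` (excluding itself)."""
    z = (features - features.mean(axis=0)) / features.std(axis=0)   # standardize
    d2 = ((z - z[index]) ** 2).sum(axis=1)                          # squared Euclidean distance
    order = np.argsort(d2)
    return order[order != index][:k]

districts = np.random.default_rng(2).normal(size=(100, 5))   # 100 districts, 5 characteristics
print(benchmark_group(districts, index=7, k=10))
```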
Intrajudge Consistency Using the Angoff Standard-Setting Method.
ERIC Educational Resources Information Center
Plake, Barbara S.; Impara, James C.
This study investigated the intrajudge consistency of Angoff-based item performance estimates. The examination used was a certification examination in an emergency medicine specialty. Ten expert panelists rated the same 24 items twice during an operational standard setting study. Results indicate that the panelists were highly consistent, in terms…
NASA Astrophysics Data System (ADS)
Connick, Robert J.
Accurate measurement of normal incident transmission loss is essential for the acoustic characterization of building materials. In this research, a method of measuring normal incidence sound transmission loss proposed by Salissou et al. as a complement to standard E2611-09 of the American Society for Testing and Materials [Standard Test Method for Measurement of Normal Incidence Sound Transmission of Acoustical Materials Based on the Transfer Matrix Method (American Society for Testing and Materials, New York, 2009)] is verified. Two samples from the original literature are used to verify the method, as well as a Filtros™ sample. Following the verification, several nano-material Aerogel samples are measured.
Domain Decomposition Algorithms for First-Order System Least Squares Methods
NASA Technical Reports Server (NTRS)
Pavarino, Luca F.
1996-01-01
Least squares methods based on first-order systems have been recently proposed and analyzed for second-order elliptic equations and systems. They produce symmetric and positive definite discrete systems by using standard finite element spaces, which are not required to satisfy the inf-sup condition. In this paper, several domain decomposition algorithms for these first-order least squares methods are studied. Some representative overlapping and substructuring algorithms are considered in their additive and multiplicative variants. The theoretical and numerical results obtained show that the classical convergence bounds (on the iteration operator) for standard Galerkin discretizations are also valid for least squares methods.
NASA Astrophysics Data System (ADS)
Dementjev, Aleksandr S.; Jovaisa, A.; Silko, Galina; Ciegis, Raimondas
2005-11-01
Based on the developed efficient numerical methods for calculating the propagation of light beams, the alternative methods for measuring the beam radius and propagation ratio proposed in the international standard ISO 11146 are analysed. Calculations of the alternative beam propagation ratios Mi² performed for a number of test beams with a complicated spatial structure showed that the correlation coefficients ci used in the international standard do not establish a universal one-to-one relation between the alternative propagation ratios Mi² and the invariant propagation ratios Mσ² found by the method of moments.
Vanderlip, Erik R.; Cerimele, Joseph M.; Monroe-DeVita, Maria
2014-01-01
Objective This study compared program measures of assertive community treatment (ACT) with standards of accreditation for the patient-centered medical home (PCMH) to determine whether there were similarities in the infrastructure of the two methods of service delivery and whether high-fidelity ACT teams would qualify for medical home accreditation. Methods The authors compared National Committee for Quality Assurance PCMH standards with two ACT fidelity measures (the Dartmouth Assertive Community Treatment Scale and the Tool for Measurement of Assertive Community Treatment [TMACT]) and with national ACT program standards. Results PCMH standards pertaining to enhanced access and continuity, management of care, and self-care support demonstrated strong overlap across ACT measures. Standards for identification and management of populations, care coordination and follow-up, and quality improvement demonstrated less overlap. The TMACT and the program standards had sufficient overlap to score in the range of a level 1 PCMH, but no ACT measure sufficiently detailed methods of population-based screening and tracking of referrals to satisfy “must-pass” elements of the standards. Conclusions ACT measures and medical home standards had significant overlap in innate infrastructure. ACT teams following the program standards or undergoing TMACT fidelity review could have the necessary infrastructure to serve as medical homes if they were properly equipped to supervise general medical care and administer activities to improve management of chronic diseases. PMID:23820753
Hawkins, K A; Tulsky, D S
2001-11-01
Since memory performance expectations may be IQ-based, unidirectional base rate data for IQ-Memory Score discrepancies are provided in the WAIS-III/WMS-III Technical Manual. The utility of these data partially rests on the assumption that discrepancy base rates do not vary across ability levels. FSIQ stratified base rate data generated from the standardization sample, however, demonstrate substantial variability across the IQ spectrum. A superiority of memory score over FSIQ is typical at lower IQ levels, whereas the converse is true at higher IQ levels. These data indicate that the use of IQ-memory score unstratified "simple difference" tables could lead to erroneous conclusions for clients with low or high IQ. IQ stratified standardization base rate data are provided as a complement to the "predicted difference" method detailed in the Technical Manual.
A method for modelling GP practice level deprivation scores using GIS
Strong, Mark; Maheswaran, Ravi; Pearson, Tim; Fryers, Paul
2007-01-01
Background A measure of general practice level socioeconomic deprivation can be used to explore the association between deprivation and other practice characteristics. An area-based categorisation is commonly chosen as the basis for such a deprivation measure. Ideally a practice population-weighted area-based deprivation score would be calculated using individual level spatially referenced data. However, these data are often unavailable. One approach is to link the practice postcode to an area-based deprivation score, but this method has limitations. This study aimed to develop a Geographical Information Systems (GIS) based model that could better predict a practice population-weighted deprivation score in the absence of patient level data than simple practice postcode linkage. Results We calculated predicted practice level Index of Multiple Deprivation (IMD) 2004 deprivation scores using two methods that did not require patient level data. Firstly we linked the practice postcode to an IMD 2004 score, and secondly we used a GIS model derived using data from Rotherham, UK. We compared our two sets of predicted scores to "gold standard" practice population-weighted scores for practices in Doncaster, Havering and Warrington. Overall, the practice postcode linkage method overestimated "gold standard" IMD scores by 2.54 points (95% CI 0.94, 4.14), whereas our modelling method showed no such bias (mean difference 0.36, 95% CI -0.30, 1.02). The postcode-linked method systematically underestimated the gold standard score in less deprived areas, and overestimated it in more deprived areas. Our modelling method showed a small underestimation in scores at higher levels of deprivation in Havering, but showed no bias in Doncaster or Warrington. The postcode-linked method showed more variability when predicting scores than did the GIS modelling method. Conclusion A GIS based model can be used to predict a practice population-weighted area-based deprivation measure in the absence of patient level data. Our modelled measure generally had better agreement with the population-weighted measure than did a postcode-linked measure. Our model may also avoid an underestimation of IMD scores in less deprived areas, and overestimation of scores in more deprived areas, seen when using postcode linked scores. The proposed method may be of use to researchers who do not have access to patient level spatially referenced data. PMID:17822545
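The "gold standard" practice-level score described above is simply a patient-population-weighted mean of area deprivation scores over the areas where a practice's patients live. A minimal sketch follows; the area codes, patient counts and IMD scores are illustrative.

```python
def practice_weighted_imd(patients_per_area, imd_by_area):
    """Population-weighted mean IMD score for one practice."""
    total = sum(patients_per_area.values())
    return sum(n * imd_by_area[a] for a, n in patients_per_area.items()) / total

patients = {"E01000001": 820, "E01000002": 430, "E01000003": 150}  # illustrative areas
imd = {"E01000001": 34.2, "E01000002": 18.7, "E01000003": 51.0}
print(round(practice_weighted_imd(patients, imd), 1))   # ~31.2
```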
2011-01-01
Background Verbal autopsy methods are critically important for evaluating the leading causes of death in populations without adequate vital registration systems. With a myriad of analytical and data collection approaches, it is essential to create a high quality validation dataset from different populations to evaluate comparative method performance and make recommendations for future verbal autopsy implementation. This study was undertaken to compile a set of strictly defined gold standard deaths for which verbal autopsies were collected to validate the accuracy of different methods of verbal autopsy cause of death assignment. Methods Data collection was implemented in six sites in four countries: Andhra Pradesh, India; Bohol, Philippines; Dar es Salaam, Tanzania; Mexico City, Mexico; Pemba Island, Tanzania; and Uttar Pradesh, India. The Population Health Metrics Research Consortium (PHMRC) developed stringent diagnostic criteria including laboratory, pathology, and medical imaging findings to identify gold standard deaths in health facilities as well as an enhanced verbal autopsy instrument based on World Health Organization (WHO) standards. A cause list was constructed based on the WHO Global Burden of Disease estimates of the leading causes of death, potential to identify unique signs and symptoms, and the likely existence of sufficient medical technology to ascertain gold standard cases. Blinded verbal autopsies were collected on all gold standard deaths. Results Over 12,000 verbal autopsies on deaths with gold standard diagnoses were collected (7,836 adults, 2,075 children, 1,629 neonates, and 1,002 stillbirths). Difficulties in finding sufficient cases to meet gold standard criteria as well as problems with misclassification for certain causes meant that the target list of causes for analysis was reduced to 34 for adults, 21 for children, and 10 for neonates, excluding stillbirths. To ensure strict independence for the validation of methods and assessment of comparative performance, 500 test-train datasets were created from the universe of cases, covering a range of cause-specific compositions. Conclusions This unique, robust validation dataset will allow scholars to evaluate the performance of different verbal autopsy analytic methods as well as instrument design. This dataset can be used to inform the implementation of verbal autopsies to more reliably ascertain cause of death in national health information systems. PMID:21816095
de Vries, W H K; Veeger, H E J; Cutti, A G; Baten, C; van der Helm, F C T
2010-07-20
Inertial Magnetic Measurement Systems (IMMS) are becoming increasingly popular because they allow measurements outside the motion laboratory. The latest models enable long-term, accurate measurement of segment motion in terms of joint angles, if initial segment orientations can be accurately determined. The standard procedure for definition of segmental orientation is based on the measurement of positions of bony landmarks (BLM). However, IMMS do not deliver position information, so an alternative method to establish IMMS-based, anatomically understandable segment orientations is proposed. For five subjects, IMMS recordings were collected in a standard anatomical position for definition of static axes, and during a series of standardized motions for the estimation of kinematic axes of rotation. For all axes, the intra- and inter-individual dispersion was estimated. Subsequently, local coordinate systems (LCS) were constructed on the basis of the combination of IMMS axes with the lowest dispersion and compared with BLM-based LCS. The repeatability of the method appeared to be high; for every segment at least two axes could be determined with a dispersion of at most 3.8 degrees. Comparison of IMMS-based with BLM-based LCS yielded compatible results for the thorax, but less compatible results for the humerus, forearm and hand, where differences in orientation rose to 17.2 degrees. Although different from the 'gold standard' BLM-based LCS, IMMS-based LCS can be constructed repeatably, enabling the estimation of segment orientations outside the laboratory. A procedure for the definition of local reference frames using IMMS is proposed. 2010 Elsevier Ltd. All rights reserved.
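Constructing an orthonormal local coordinate system from two measured axes, as described above, can be sketched as follows. Which measured axis is treated as primary and which as secondary is a modelling choice, and the example vectors are arbitrary.

```python
import numpy as np

def local_coordinate_system(axis_primary, axis_secondary):
    """
    Build a right-handed orthonormal segment frame from two measured axes
    (e.g. a static axis and a kinematic rotation axis), keeping the primary
    axis exact and orthogonalizing the secondary one.
    """
    x = axis_primary / np.linalg.norm(axis_primary)
    z = np.cross(x, axis_secondary)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack([x, y, z])   # columns are the frame's unit axes

frame = local_coordinate_system(np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.2, 0.0]))
print(frame)
```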
Research on rebuilding the data information environment for aeronautical manufacturing enterprise
NASA Astrophysics Data System (ADS)
Feng, Xilan; Jiang, Zhiqiang; Zong, Xuewen; Shi, Jinfa
2005-12-01
The data environment of an integrated information system and the basic standards for information resource management are key to the effectiveness of remote collaborative design and manufacturing of complex products. A study project on rebuilding the data information environment for aeronautical manufacturing enterprises (Aero-ME) is put forward. First, the data environment of the integrated information system, the basic standards for information resource management, the basic establishment of corporate information, the development of the integrated information system, and information education are discussed in depth, based on the practical requirements for information resources and technology in contemporary Aero-ME. Then, an idea and method for rebuilding the corporate data environment based on I-CASE are put forward, together with an effective method and implementation approach for manufacturing enterprise informatization. This provides the foundation and assurance for rebuilding the corporate data environment and standardizing information resource management in the development of Aero-ME information engineering.
Application to Noninvasive Measurement of Blood Components Based on Infrared Spectroscopy
NASA Astrophysics Data System (ADS)
Tamura, Kazuto; Ishizawa, Hiroaki; Fujita, Keiichi; Kaneko, Wataru; Morikawa, Tomotaka; Toba, Eiji; Kobayashi, Hideo
Recently, lifestyle diseases (diabetes, hyperlipidemia, etc.) have been steadily increasing because of changes in diet, lack of exercise, increased alcohol intake, and increased stress. This is a matter of vital importance. Tens of millions of people in Japan are at risk of lifestyle diseases, so they must undergo blood tests to confirm that their physical condition is under control. Because they have to measure blood components repeatedly, the burden is heavy. This paper describes a new noninvasive measurement of blood components based on optical sensing, using Fourier transform infrared spectroscopy with attenuated total reflection. In order to address the influence of individual differences, the internal standard method was introduced. This paper describes the details of the internal standard method and its effect on the calibration of blood components. A significant improvement was obtained by using the internal standard.
Rønning, Helene Thorsen; Einarsen, Kristin; Asp, Tone Normann
2006-06-23
A simple and rapid method for the determination and confirmation of chloramphenicol in several food matrices by LC-MS/MS was developed. Following addition of d5-chloramphenicol as internal standard, meat, seafood, egg, honey and milk samples were extracted with acetonitrile. Chloroform was then added to remove water. After evaporation, the residues were reconstituted in methanol/water (3+4) before injection. After addition of the internal standard, the urine and plasma samples were applied to a Chem Elut extraction cartridge, eluted with ethyl acetate, and washed with hexane. These samples were also reconstituted in methanol/water (3+4) after evaporation. Using an MRM acquisition method in negative ionization mode, the transitions 321→152, 321→194 and 326→157 were used for quantification, confirmation and the internal standard, respectively. Quantification of chloramphenicol-positive samples, regardless of matrix, could be achieved with a common water-based calibration curve. The validation of the method was based on EU Decision 2002/657 and different ways of calculating CCα and CCβ were evaluated. The common CCα and CCβ for all matrices were 0.02 and 0.04 μg/kg for the 321→152 ion transition, and 0.02 and 0.03 μg/kg for the 321→194 ion transition. At a fortification level of 0.1 μg/kg the within-laboratory reproducibility is below 25%.
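Internal-standard quantification with a common water-based calibration curve amounts to calibrating the analyte/internal-standard area ratio against concentration and reading sample ratios off that line. The sketch below illustrates the arithmetic with invented calibration points.

```python
import numpy as np

cal_conc = np.array([0.05, 0.1, 0.3, 0.5, 1.0])         # µg/kg (synthetic)
cal_ratio = np.array([0.11, 0.21, 0.62, 1.04, 2.05])     # area(CAP) / area(d5-CAP)
slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)    # water-based calibration line

sample_ratio = 0.47                                      # measured in a sample
print((sample_ratio - intercept) / slope)                # estimated concentration, µg/kg
```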
Optimum SNR data compression in hardware using an Eigencoil array.
King, Scott B; Varosi, Steve M; Duensing, G Randy
2010-05-01
With the number of receivers available on clinical MRI systems now ranging from 8 to 32 channels, data compression methods are being explored to lessen the demands on the computer for data handling and processing. Although software-based methods of compression after reception lessen computational requirements, a hardware-based method before the receiver also reduces the number of receive channels required. An eight-channel Eigencoil array is constructed by placing a hardware radiofrequency signal combiner inline after preamplification, before the receiver system. The Eigencoil array produces signal-to-noise ratio (SNR) of an optimal reconstruction using a standard sum-of-squares reconstruction, with peripheral SNR gains of 30% over the standard array. The concept of "receiver channel reduction" or MRI data compression is demonstrated, with optimal SNR using only four channels, and with a three-channel Eigencoil, superior sum-of-squares SNR was achieved over the standard eight-channel array. A three-channel Eigencoil portion of a product neurovascular array confirms in vivo SNR performance and demonstrates parallel MRI up to R = 3. This SNR-preserving data compression method advantageously allows users of MRI systems with fewer receiver channels to achieve the SNR of higher-channel MRI systems. (c) 2010 Wiley-Liss, Inc.
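A software analogue of the hardware combination described above is channel compression via the eigenmodes of the measured noise covariance matrix, keeping only the strongest modes as "virtual channels". The sketch below uses synthetic data, and the array sizes and number of retained modes are arbitrary; the actual system performs the combination in hardware before the receivers.

```python
import numpy as np

rng = np.random.default_rng(3)
noise = rng.normal(size=(8, 5000))            # 8 channels of noise-only samples
cov = np.cov(noise)                            # 8 x 8 channel noise covariance
w, V = np.linalg.eigh(cov)                     # eigenvalues ascending, eigenvectors in columns
modes = V[:, ::-1][:, :4]                      # keep the 4 dominant eigenmodes

signals = rng.normal(size=(8, 256 * 256))      # 8-channel image-domain data (synthetic)
compressed = modes.T @ signals                 # 4 virtual channels
print(compressed.shape)                        # (4, 65536)
```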
Density-cluster NMA: A new protein decomposition technique for coarse-grained normal mode analysis.
Demerdash, Omar N A; Mitchell, Julie C
2012-07-01
Normal mode analysis has emerged as a useful technique for investigating protein motions on long time scales. This is largely due to the advent of coarse-graining techniques, particularly Hooke's Law-based potentials and the rotational-translational blocking (RTB) method for reducing the size of the force-constant matrix, the Hessian. Here we present a new method for domain decomposition for use in RTB that is based on hierarchical clustering of atomic density gradients, which we call Density-Cluster RTB (DCRTB). The method reduces the number of degrees of freedom by 85-90% compared with the standard blocking approaches. We compared the normal modes from DCRTB against standard RTB using 1-4 residues in sequence in a single block, with good agreement between the two methods. We also show that Density-Cluster RTB and standard RTB perform well in capturing the experimentally determined direction of conformational change. Significantly, we report superior correlation of DCRTB with B-factors compared with 1-4 residue per block RTB. Finally, we show significant reduction in computational cost for Density-Cluster RTB that is nearly 100-fold for many examples. Copyright © 2012 Wiley Periodicals, Inc.
Backe, Will J; Day, Thomas C; Field, Jennifer A
2013-05-21
A new analytical method was developed to quantify 26 newly-identified and 21 legacy (e.g. perfluoroalkyl carboxylates, perfluoroalkyl sulfonates, and fluorotelomer sulfonates) per- and polyfluorinated alkyl substances (PFAS) in groundwater and aqueous film-forming foam (AFFF) formulations. Prior to analysis, AFFF formulations were diluted into methanol and PFAS in groundwater were micro-liquid-liquid extracted. Methanolic dilutions of AFFF formulations and groundwater extracts were analyzed by large-volume injection (900 μL) high-performance liquid chromatography tandem mass spectrometry. Orthogonal chromatography was performed using cation exchange (silica) and anion exchange (propylamine) guard columns connected in series to a reverse-phase (C18) analytical column. Method detection limits for PFAS in groundwater ranged from 0.71 ng/L to 67 ng/L, and whole-method accuracy ranged from 96% to 106% for analytes for which matched authentic analytical standards were available. For analytes without authentic analytical standards, whole-method accuracy ranged from 78% to 144%, and whole-method precision was less than 15% relative standard deviation for all analytes. A demonstration of the method on groundwater samples from five military bases revealed eight of the 26 newly-identified PFAS present at concentrations up to 6900 ng/L. The newly-identified PFAS represent a minor fraction of the fluorinated chemicals in groundwater relative to legacy PFAS. The profiles of PFAS in groundwater differ from those found in fluorotelomer- and electrofluorination-based AFFF formulations, which potentially indicates environmental transformation of PFAS.
Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L
2012-10-01
Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
Echegaray, Sebastian; Nair, Viswam; Kadoch, Michael; Leung, Ann; Rubin, Daniel; Gevaert, Olivier; Napel, Sandy
2016-12-01
Quantitative imaging approaches compute features within images' regions of interest. Segmentation is rarely completely automatic, requiring time-consuming editing by experts. We propose a new paradigm, called "digital biopsy," that allows for the collection of intensity- and texture-based features from these regions at least 1 order of magnitude faster than the current manual or semiautomated methods. A radiologist reviewed automated segmentations of lung nodules from 100 preoperative volume computed tomography scans of patients with non-small cell lung cancer, and manually adjusted the nodule boundaries in each section, to be used as a reference standard, requiring up to 45 minutes per nodule. We also asked a different expert to generate a digital biopsy for each patient using a paintbrush tool to paint a contiguous region of each tumor over multiple cross-sections, a procedure that required an average of <3 minutes per nodule. We simulated additional digital biopsies using morphological procedures. Finally, we compared the features extracted from these digital biopsies with our reference standard using intraclass correlation coefficient (ICC) to characterize robustness. Comparing the reference standard segmentations to our digital biopsies, we found that 84/94 features had an ICC >0.7; comparing erosions and dilations, using a sphere of 1.5-mm radius, of our digital biopsies to the reference standard segmentations resulted in 41/94 and 53/94 features, respectively, with ICCs >0.7. We conclude that many intensity- and texture-based features remain consistent between the reference standard and our method while substantially reducing the amount of operator time required.
Quality assurance and management in microelectronics companies: ISO 9000 versus Six Sigma
NASA Astrophysics Data System (ADS)
Lupan, Razvan; Kobi, Abdessamad; Robledo, Christian; Bacivarov, Ioan; Bacivarov, Angelica
2009-01-01
A strategy for the implementation of the Six Sigma method as an improvement solution for the ISO 9000:2000 Quality Standard is proposed. Our approach is focused on integrating the DMAIC cycle of the Six Sigma method with the PDCA process approach, highly recommended by the standard ISO 9000:2000. The Six Sigma steps applied to each part of the PDCA cycle are presented in detail, giving some tools and training examples. Based on this analysis, the authors conclude that applying the Six Sigma philosophy to the Quality Standard implementation process is the best way to achieve optimal results in quality progress and therefore in customer satisfaction.
Optical frequency standards for gravitational wave detection using satellite velocimetry
NASA Astrophysics Data System (ADS)
Vutha, Amar
2015-04-01
Satellite Doppler velocimetry, building on the work of Kaufmann and Estabrook and Wahlquist, is a complementary technique to interferometric methods of gravitational wave detection. This method is based on the fact that the gravitational wave amplitude appears in the apparent Doppler shift of photons propagating from an emitter to a receiver. This apparent Doppler shift can be resolved provided that a frequency standard, capable of quickly averaging down to a high stability, is available. We present a design for a space-capable optical atomic frequency standard, and analyze the sensitivity of satellite Doppler velocimetry for gravitational wave astronomy in the milli-hertz frequency band.
Validation of a new ELISA method for in vitro potency testing of hepatitis A vaccines.
Morgeaux, S; Variot, P; Daas, A; Costanzo, A
2013-01-01
The goal of the project was to standardise a new in vitro method in replacement of the existing standard method for the determination of hepatitis A virus antigen content in hepatitis A vaccines (HAV) marketed in Europe. This became necessary due to issues with the method used previously, requiring the use of commercial test kits. The selected candidate method, not based on commercial kits, had already been used for many years by an Official Medicines Control Laboratory (OMCL) for routine testing and batch release of HAV. After a pre-qualification phase (Phase 1) that showed the suitability of the commercially available critical ELISA reagents for the determination of antigen content in marketed HAV present on the European market, an international collaborative study (Phase 2) was carried out in order to fully validate the method. Eleven laboratories took part in the collaborative study. They performed assays with the candidate standard method and, in parallel, for comparison purposes, with their own in-house validated methods where these were available. The study demonstrated that the new assay provides a more reliable and reproducible method when compared to the existing standard method. A good correlation of the candidate standard method with the in vivo immunogenicity assay in mice was shown previously for both potent and sub-potent (stressed) vaccines. Thus, the new standard method validated during the collaborative study may be implemented readily by manufacturers and OMCLs for routine batch release but also for in-process control or consistency testing. The new method was approved in October 2012 by Group of Experts 15 of the European Pharmacopoeia (Ph. Eur.) as the standard method for in vitro potency testing of HAV. The relevant texts will be revised accordingly. Critical reagents such as coating reagent and detection antibodies have been adopted by the Ph. Eur. Commission and are available from the EDQM as Ph. Eur. Biological Reference Reagents (BRRs).
ERIC Educational Resources Information Center
Snelling, Anastasia M.; Yezek, Jennifer
2012-01-01
Background: The study investigated how nutrient standards affected the number of kilocalories and grams of fat and saturated fat in competitive foods offered and sold in 3 high schools. Methods: The study is a quasi-experimental design with 3 schools serving as the units of assignment and analysis. The effect of the nutrient standards was measured…
Beichel, Reinhard R.; Van Tol, Markus; Ulrich, Ethan J.; Bauer, Christian; Chang, Tangel; Plichta, Kristin A.; Smith, Brian J.; Sunderland, John J.; Graham, Michael M.; Sonka, Milan; Buatti, John M.
2016-01-01
Purpose: The purpose of this work was to develop, validate, and compare a highly computer-aided method for the segmentation of hot lesions in head and neck 18F-FDG PET scans. Methods: A semiautomated segmentation method was developed, which transforms the segmentation problem into a graph-based optimization problem. For this purpose, a graph structure around a user-provided approximate lesion centerpoint is constructed and a suitable cost function is derived based on local image statistics. To handle frequently occurring situations that are ambiguous (e.g., lesions adjacent to each other versus lesion with inhomogeneous uptake), several segmentation modes are introduced that adapt the behavior of the base algorithm accordingly. In addition, the authors present approaches for the efficient interactive local and global refinement of initial segmentations that are based on the “just-enough-interaction” principle. For method validation, 60 PET/CT scans from 59 different subjects with 230 head and neck lesions were utilized. All patients had squamous cell carcinoma of the head and neck. A detailed comparison with the current clinically relevant standard manual segmentation approach was performed based on 2760 segmentations produced by three experts. Results: Segmentation accuracy measured by the Dice coefficient of the proposed semiautomated and standard manual segmentation approach was 0.766 and 0.764, respectively. This difference was not statistically significant (p = 0.2145). However, the intra- and interoperator standard deviations were significantly lower for the semiautomated method. In addition, the proposed method was found to be significantly faster and resulted in significantly higher intra- and interoperator segmentation agreement when compared to the manual segmentation approach. Conclusions: Lack of consistency in tumor definition is a critical barrier for radiation treatment targeting as well as for response assessment in clinical trials and in clinical oncology decision-making. The properties of the authors approach make it well suited for applications in image-guided radiation oncology, response assessment, or treatment outcome prediction. PMID:27277044
Adaptive Set-Based Methods for Association Testing
Su, Yu-Chen; Gauderman, W. James; Kiros, Berhane; Lewinger, Juan Pablo
2017-01-01
With a typical sample size of a few thousand subjects, a single genomewide association study (GWAS) using traditional one-SNP-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. While self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly ‘adapt’ to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a LASSO based test. PMID:26707371
Better Than Counting: Density Profiles from Force Sampling
NASA Astrophysics Data System (ADS)
de las Heras, Daniel; Schmidt, Matthias
2018-05-01
Calculating one-body density profiles in equilibrium via particle-based simulation methods involves counting of events of particle occurrences at (histogram-resolved) space points. Here, we investigate an alternative method based on a histogram of the local force density. Via an exact sum rule, the density profile is obtained with a simple spatial integration. The method circumvents the inherent ideal gas fluctuations. We have tested the method in Monte Carlo, Brownian dynamics, and molecular dynamics simulations. The results carry a statistical uncertainty smaller than that of the standard counting method, reducing therefore the computation time.
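The force-sampling estimator described above can be illustrated with a minimal Python sketch, assuming a one-dimensional geometry and the equilibrium sum rule kT dρ/dx = F(x), where F(x) is the average local force density; the function name and normalization are illustrative only, not the authors' code.

import numpy as np

def density_from_force_sampling(x, f, edges, n_configs, kT=1.0, rho0=0.0):
    # Histogram of the local force density: forces summed per bin, averaged
    # over configurations and divided by the bin width (the 1D "volume").
    force_sum, _ = np.histogram(x, bins=edges, weights=f)
    dx = np.diff(edges)
    force_density = force_sum / (n_configs * dx)
    # Assumed sum rule kT * d(rho)/dx = F(x): spatial integration of the
    # force density yields the density profile up to the offset rho0.
    return rho0 + np.cumsum(force_density * dx) / kT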
Information from the previously approved extended abstract: A standardized area source measurement method based on mobile tracer correlation was used for methane emissions assessment in 52 field deployments...
The Weakest Link: Library Catalogs.
ERIC Educational Resources Information Center
Young, Terrence E., Jr.
2002-01-01
Describes methods of correcting MARC records in online public access catalogs in school libraries. Highlights include in-house methods; professional resources; conforming to library cataloging standards; vendor services, including Web-based services; software specifically developed for record cleanup; and outsourcing. (LRW)
Teaching Prevention in Pediatrics.
ERIC Educational Resources Information Center
Cheng, Tina L.; Greenberg, Larrie; Loeser, Helen; Keller, David
2000-01-01
Reviews methods of teaching preventive medicine in pediatrics and highlights innovative programs. Methods of teaching prevention in pediatrics include patient interactions, self-directed learning, case-based learning, small-group learning, standardized patients, computer-assisted instruction, the Internet, student-centered learning, and lectures.…
Well-characterized and standardized methods are the foundation upon which monitoring of regulated and unregulated contaminants in drinking water are based. To obtain reliable, high quality data for trace analysis of contaminants, these methods must be rugged, selective and sensit...
75 FR 12753 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-17
... effective at improving health care quality. While evidence-based approaches for decisionmaking have become standard in healthcare, this has been limited in laboratory medicine. No single- evidence-based model for... (LMBP) initiative to develop new systematic evidence reviews methods for making evidence-based...
Code of Federal Regulations, 2014 CFR
2014-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... abatement of lead-based paint or lead-based paint hazards shall be performed in accordance with 40 CFR 745...
Code of Federal Regulations, 2013 CFR
2013-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... abatement of lead-based paint or lead-based paint hazards shall be performed in accordance with 40 CFR 745...
Code of Federal Regulations, 2011 CFR
2011-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... abatement of lead-based paint or lead-based paint hazards shall be performed in accordance with 40 CFR 745...
Code of Federal Regulations, 2010 CFR
2010-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... abatement of lead-based paint or lead-based paint hazards shall be performed in accordance with 40 CFR 745...
Code of Federal Regulations, 2012 CFR
2012-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint... abatement of lead-based paint or lead-based paint hazards shall be performed in accordance with 40 CFR 745...
Efficient physics-based tracking of heart surface motion for beating heart surgery robotic systems.
Bogatyrenko, Evgeniya; Pompey, Pascal; Hanebeck, Uwe D
2011-05-01
Tracking of beating heart motion in a robotic surgery system is required for complex cardiovascular interventions. A heart surface motion tracking method is developed, including a stochastic physics-based heart surface model and an efficient reconstruction algorithm. The algorithm uses the constraints provided by the model, which exploits the physical characteristics of the heart. The main advantage of the model is that it is more realistic than most standard heart models. Additionally, no explicit matching between the measurements and the model is required. The application of meshless methods significantly reduces the complexity of physics-based tracking. Based on the stochastic physical model of the heart surface, this approach considers the motion of the intervention area and is robust to occlusions and reflections. The tracking algorithm is evaluated in simulations and experiments on an artificial heart. Providing higher accuracy than the standard model-based methods, it successfully copes with occlusions and provides high performance even when not all measurements are available. Combining the physical and stochastic description of the heart surface motion ensures physically correct and accurate prediction. Automatic initialization of the physics-based cardiac motion tracking enables system evaluation in a clinical environment.
Multiplex cDNA quantification method that facilitates the standardization of gene expression data
Gotoh, Osamu; Murakami, Yasufumi; Suyama, Akira
2011-01-01
Microarray-based gene expression measurement is one of the major methods for transcriptome analysis. However, current microarray data are substantially affected by microarray platforms and RNA references because the microarray method can provide merely the relative amounts of gene expression levels. Therefore, valid comparisons of the microarray data require standardized platforms, internal and/or external controls and complicated normalizations. These requirements impose limitations on the extensive comparison of gene expression data. Here, we report an effective approach to removing these unfavorable limitations by measuring the absolute amounts of gene expression levels on common DNA microarrays. We have developed a multiplex cDNA quantification method called GEP-DEAN (Gene expression profiling by DCN-encoding-based analysis). The method was validated by using chemically synthesized DNA strands of known quantities and cDNA samples prepared from mouse liver, demonstrating that the absolute amounts of cDNA strands were successfully measured with a sensitivity of 18 zmol in a highly multiplexed manner in 7 h. PMID:21415008
Ringuet, Stephanie; Sassano, Lara; Johnson, Zackary I
2011-02-01
A sensitive, accurate and rapid analysis of major nutrients in aquatic systems is essential for monitoring and maintaining healthy aquatic environments. In particular, monitoring ammonium (NH(4)(+)) concentrations is necessary for maintenance of many fish stocks, while accurate monitoring and regulation of ammonium, orthophosphate (PO(4)(3-)), silicate (Si(OH)(4)) and nitrate (NO(3)(-)) concentrations are required for regulating algae production. Monitoring of wastewater streams is also required for many aquaculture, municipal and industrial wastewater facilities to comply with local, state or federal water quality effluent regulations. Traditional methods for quantifying these nutrient concentrations often require laborious techniques or expensive specialized equipment making these analyses difficult. Here we present four alternative microcolorimetric assays that are based on a standard 96-well microplate format and microplate reader that simplify the quantification of each of these nutrients. Each method uses small sample volumes (200 µL), has a detection limit ≤ 1 µM in freshwater and ≤ 2 µM in saltwater, precision of at least 8% and compares favorably with standard analytical procedures. Routine use of these techniques in the laboratory and at an aquaculture facility to monitor nutrient concentrations associated with microalgae growth demonstrates that they are rapid, accurate and highly reproducible among different users. These techniques offer an alternative to standard nutrient analyses and because they are based on the standard 96-well format, they significantly decrease the cost and time of processing while maintaining high precision and sensitivity.
DOT National Transportation Integrated Search
2017-09-01
The mechanistic-empirical pavement design method requires the elastic resilient modulus as the key input for characterization of geomaterials. Current density-based QA procedures do not measure resilient modulus. Additionally, the density-based metho...
A Fixed-Pattern Noise Correction Method Based on Gray Value Compensation for TDI CMOS Image Sensor.
Liu, Zhenwang; Xu, Jiangtao; Wang, Xinlei; Nie, Kaiming; Jin, Weimin
2015-09-16
In order to eliminate the fixed-pattern noise (FPN) in the output image of time-delay-integration CMOS image sensor (TDI-CIS), a FPN correction method based on gray value compensation is proposed. One hundred images are first captured under uniform illumination. Then, row FPN (RFPN) and column FPN (CFPN) are estimated based on the row-mean vector and column-mean vector of all collected images, respectively. Finally, RFPN are corrected by adding the estimated RFPN gray value to the original gray values of pixels in the corresponding row, and CFPN are corrected by subtracting the estimated CFPN gray value from the original gray values of pixels in the corresponding column. Experimental results based on a 128-stage TDI-CIS show that, after correcting the FPN in the image captured under uniform illumination with the proposed method, the standard-deviation of row-mean vector decreases from 5.6798 to 0.4214 LSB, and the standard-deviation of column-mean vector decreases from 15.2080 to 13.4623 LSB. Both kinds of FPN in the real images captured by TDI-CIS are eliminated effectively with the proposed method.
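A minimal Python sketch of the row/column-mean FPN estimation and gray-value compensation described above. The exact estimator (here, deviations of the row and column means from the global mean of the calibration stack) and the sign conventions are assumptions chosen so that the add/subtract correction matches the description; they are not taken from the published implementation.

import numpy as np

def estimate_fpn(calib_stack):
    # calib_stack: (n_images, rows, cols) captured under uniform illumination.
    mean_img = calib_stack.mean(axis=0)     # average over the calibration frames
    global_mean = mean_img.mean()
    row_means = mean_img.mean(axis=1)       # row-mean vector
    col_means = mean_img.mean(axis=0)       # column-mean vector
    rfpn = global_mean - row_means          # value to ADD per row
    cfpn = col_means - global_mean          # value to SUBTRACT per column
    return rfpn, cfpn

def correct_fpn(image, rfpn, cfpn):
    corrected = image + rfpn[:, None]       # row FPN: add per-row offset
    corrected = corrected - cfpn[None, :]   # column FPN: subtract per-column offset
    return corrected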
Perskvist, Nasrin; Norlin, Loreana; Dillner, Joakim
2015-04-01
This article addresses the important issue of the standardization of the biobank process. It reports on i) the implementation of standard operating procedures for the processing of liquid-based cervical cells, ii) the standardization of storage conditions, and iii) the ultimate establishment of nationwide standardized biorepositories for cervical specimens. Given the differences in the infrastructure and healthcare systems of various county councils in Sweden, these efforts were designed to develop standardized methods of biobanking across the nation. The standardization of cervical sample processing and biobanking is an important and widely acknowledged issue. Efforts to address these concerns will facilitate better patient care and improve research based on retrospective and prospective collections of patient samples and cohorts. The successful nationalization of the Cervical Cytology Biobank in Sweden is based on three vital issues: i) the flexibility of the system to adapt to other regional systems, ii) the development of the system based on national collaboration between the university and the county councils, and iii) stable governmental financing by the provider, the Biobanking and Molecular Resource Infrastructure of Sweden (BBMRI.se). We will share our experiences with biorepository communities to promote understanding of and advances in opportunities to establish a nationalized biobank which covers the healthcare of the entire nation.
Testing for intracycle determinism in pseudoperiodic time series.
Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A
2008-06-01
A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.
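The two-step procedure can be illustrated with a hedged Python sketch: linear detrending stands in for the preprocessing step, a crude one-step nearest-neighbour prediction error serves as the predictability statistic, and shuffled surrogates provide the comparison; none of these choices are claimed to be the authors' exact implementation.

import numpy as np

def one_step_nn_prediction_error(x, dim=3, lag=1):
    # Delay embedding followed by a one-step nearest-neighbour prediction;
    # the RMSE of the predictions is used as the predictability statistic.
    n = len(x) - dim * lag
    emb = np.array([x[i:i + dim * lag:lag] for i in range(n)])
    target = x[dim * lag:dim * lag + n]
    err = []
    for i in range(n):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                       # exclude the self-match
        j = int(np.argmin(d))
        err.append(target[i] - target[j])
    return float(np.sqrt(np.mean(np.square(err))))

def intracycle_determinism_test(x, n_surrogates=19, seed=0):
    rng = np.random.default_rng(seed)
    # Step 1 (preprocessing): remove a linear trend; seasonal adjustment of
    # the pseudoperiodic component is omitted here for brevity.
    t = np.arange(len(x))
    detrended = np.asarray(x) - np.polyval(np.polyfit(t, x, 1), t)
    # Step 2: compare predictability of the data with shuffled surrogates.
    stat = one_step_nn_prediction_error(detrended)
    surr = [one_step_nn_prediction_error(rng.permutation(detrended))
            for _ in range(n_surrogates)]
    return stat, surr, stat < min(surr)     # smaller error than all surrogates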
Li, Yongtao; Whitaker, Joshua S; McCarty, Christina L
2012-07-06
A large volume direct aqueous injection method was developed for the analysis of iodinated haloacetic acids in drinking water by using reversed-phase liquid chromatography/electrospray ionization/tandem mass spectrometry in the negative ion mode. Both the external and internal standard calibration methods were studied for the analysis of monoiodoacetic acid, chloroiodoacetic acid, bromoiodoacetic acid, and diiodoacetic acid in drinking water. The use of a divert valve technique for the mobile phase solvent delay, along with isotopically labeled analogs used as internal standards, effectively reduced and compensated for the ionization suppression typically caused by coexisting common inorganic anions. Under the optimized method conditions, the mean absolute and relative recoveries resulting from the replicate fortified deionized water and chlorinated drinking water analyses were 83-107% with a relative standard deviation of 0.7-11.7% and 84-111% with a relative standard deviation of 0.8-12.1%, respectively. The method detection limits resulting from the external and internal standard calibrations, based on seven fortified deionized water replicates, were 0.7-2.3 ng/L and 0.5-1.9 ng/L, respectively. Copyright © 2012 Elsevier B.V. All rights reserved.
Robust Skull-Stripping Segmentation Based on Irrational Mask for Magnetic Resonance Brain Images.
Moldovanu, Simona; Moraru, Luminița; Biswas, Anjan
2015-12-01
This paper proposes a new method for simple, efficient, and robust removal of the non-brain tissues in MR images based on an irrational mask for filtration within a binary morphological operation framework. The proposed skull-stripping segmentation is based on two irrational 3 × 3 and 5 × 5 masks, having the sum of their weights equal to the transcendental number π, as provided by the Gregory-Leibniz infinite series. This allows a lower rate of useful-pixel loss to be maintained. The proposed method has been tested in two ways. First, it has been validated as a binary method by comparing and contrasting with Otsu's, Sauvola's, Niblack's, and Bernsen's binary methods. Secondly, its accuracy has been verified against three state-of-the-art skull-stripping methods: the graph cuts method, the method based on the Chan-Vese active contour model, and the simplex mesh and histogram analysis skull stripping. The performance of the proposed method has been assessed using the Dice scores, overlap and extra fractions, and sensitivity and specificity as statistical methods. The gold standard has been provided by two neurologist experts. The proposed method has been tested and validated on 26 image series which contain 216 images from two publicly available databases: the Whole Brain Atlas and the Internet Brain Segmentation Repository, which include a highly variable sample population (with reference to age, sex, healthy/diseased). The approach performs accurately on both standardized databases. The main advantage of the proposed method is its robustness and speed.
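How such an "irrational mask" might be constructed can be sketched in Python by filling a mask with consecutive terms of the Gregory-Leibniz series 4(-1)^k/(2k+1), whose partial sums approach π. The actual layout and any normalization of the published 3 × 3 and 5 × 5 masks may differ, so this is only an assumption-laden illustration.

import numpy as np

def gregory_leibniz_mask(size):
    # Consecutive Gregory-Leibniz terms arranged row by row; the 3x3 and 5x5
    # partial sums are about 3.252 and 3.182, i.e. close to pi.
    k = np.arange(size * size)
    terms = 4.0 * (-1.0) ** k / (2 * k + 1)
    return terms.reshape(size, size)

print(gregory_leibniz_mask(3).sum(), gregory_leibniz_mask(5).sum())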
Li, S P; Qiao, C F; Chen, Y W; Zhao, J; Cui, X M; Zhang, Q W; Liu, X M; Hu, D J
2013-10-25
Root of Panax notoginseng (Burk.) F.H. Chen (Sanqi in Chinese) is one of traditional Chinese medicines (TCMs) based functional food. Saponins are the major bioactive components. The shortage of reference compounds or chemical standards is one of the main bottlenecks for quality control of TCMs. A novel strategy, i.e. standardized reference extract based qualification and single calibrated components directly quantitative estimation of multiple analytes, was proposed to easily and effectively control the quality of natural functional foods such as Sanqi. The feasibility and credibility of this methodology were also assessed with a developed fast HPLC method. Five saponins, including ginsenoside Rg1, Re, Rb1, Rd and notoginsenoside R1 were rapidly separated using a conventional HPLC in 20 min. The quantification method was also compared with individual calibration curve method. The strategy is feasible and credible, which is easily and effectively adapted for improving the quality control of natural functional foods such as Sanqi. Copyright © 2013 Elsevier B.V. All rights reserved.
The Same or Not the Same: Equivalence as an Issue in Educational Research
NASA Astrophysics Data System (ADS)
Lewis, Scott E.; Lewis, Jennifer E.
2005-09-01
In educational research, particularly in the sciences, a common research design calls for the establishment of a control and experimental group to determine the effectiveness of an intervention. As part of this design, it is often desirable to illustrate that the two groups were equivalent at the start of the intervention, based on measures such as standardized cognitive tests or student grades in prior courses. In this article we use SAT and ACT scores to illustrate a more robust way of testing equivalence. The method incorporates two one-sided t tests evaluating two null hypotheses, providing a stronger claim for equivalence than the standard method, which often does not address the possible problem of low statistical power. The two null hypotheses are based on the construction of an equivalence interval particular to the data, so the article also provides a rationale for and illustration of a procedure for constructing equivalence intervals. Our consideration of equivalence using this method also underscores the need to include sample sizes, standard deviations, and group means in published quantitative studies.
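A minimal Python sketch of the two one-sided t tests (TOST) approach described above, assuming pooled-variance two-sample t tests and a symmetric equivalence interval [-delta, +delta]; variable names are illustrative.

import numpy as np
from scipy import stats

def tost_equivalence(a, b, delta):
    # Two one-sided t tests for mean equivalence within [-delta, +delta].
    na, nb = len(a), len(b)
    diff = np.mean(a) - np.mean(b)
    sp = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                 / (na + nb - 2))
    se = sp * np.sqrt(1 / na + 1 / nb)
    df = na + nb - 2
    t_lower = (diff + delta) / se      # H0: diff <= -delta vs H1: diff > -delta
    t_upper = (diff - delta) / se      # H0: diff >= +delta vs H1: diff < +delta
    p_lower = stats.t.sf(t_lower, df)
    p_upper = stats.t.cdf(t_upper, df)
    p_tost = max(p_lower, p_upper)     # equivalence declared if p_tost < alpha
    return diff, p_tost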
NASA Technical Reports Server (NTRS)
Warner, Joseph D.; Theofylaktos, Onoufrios
2012-01-01
A method of determining the bit error rate (BER) of a digital circuit from the measurement of the analog S-parameters of the circuit has been developed. The method is based on the measurement of the noise and the standard deviation of the noise in the S-parameters. Once the standard deviation and the mean of the S-parameters are known, the BER of the circuit can be calculated using the normal Gaussian function.
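A hedged sketch of the Gaussian-tail calculation implied above: the BER is taken as the probability that a normally distributed quantity with the measured mean and standard deviation falls on the wrong side of a decision threshold. The exact mapping used by the authors is not reproduced here; this is the standard Q-function form.

import math

def ber_from_gaussian(mean, std, threshold=0.0):
    # Probability that Gaussian noise pushes the measured parameter across
    # the decision threshold: BER = Q(|mean - threshold| / std).
    q_arg = abs(mean - threshold) / std
    return 0.5 * math.erfc(q_arg / math.sqrt(2.0))

# e.g. a signal margin of 7 standard deviations gives a BER of roughly 1e-12
print(ber_from_gaussian(mean=0.7, std=0.1))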
NASA Astrophysics Data System (ADS)
Anees, Amir; Khan, Waqar Ahmad; Gondal, Muhammad Asif; Hussain, Iqtadar
2013-07-01
The aim of this work is to make use of the mean of absolute deviation (MAD) method for the evaluation process of substitution boxes used in the advanced encryption standard. In this paper, we use the MAD technique to analyze some popular and prevailing substitution boxes used in encryption processes. In particular, MAD is applied to advanced encryption standard (AES), affine power affine (APA), Gray, Lui J., Residue Prime, S8 AES, SKIPJACK, and Xyi substitution boxes.
[Undergraduate psychiatric training in Turkey].
Cıngı Başterzi, Ayşe Devrim; Tükel, Raşit; Uluşahin, Aylin; Coşkun, Bülent; Alkın, Tunç; Murat Demet, Mehmet; Konuk, Numan; Taşdelen, Bahar
2010-01-01
The current trend in medical education is to abandon the experience-based traditional model and embrace the competency-based education model (CBE). The basic principle behind CBE is standardization. The first step in standardization is to determine what students must know, what they must accomplish, and what attitude they should display, and the establishment of educational goals. One of the goals of the Psychiatric Association of Turkey, Psychiatric Training Section is to standardize psychiatric training in Turkish medical schools. This study aimed to determine the current state of undergraduate psychiatric training in Turkish medical schools. Questionnaires were sent to the psychiatry department chairs of 41 medical schools. Data were analyzed using descriptive statistical methods. Of the 41 department chairs that were sent the questionnaire, 29 (70%) completed and returned them, of which 16 (66.7%) reported that they had already defined goals and educational objectives for their undergraduate psychiatric training programs. The Core Education Program, prepared by the Turkish Medicine and Health Education Council, was predominately used at 9 (37.5%) medical schools. Pre-clinical and clinical training schedules varied between medical schools. In all, 3 of the medical schools did not offer internships in psychiatry. The majority of chairs emphasized the importance of mood disorders (49.9%) and anxiety disorders (40%), suggesting that these disorders should be treated by general practitioners. Computer technology was commonly used for lecturing; however, utilization of interactive and skill-based teaching methods was limited. The most commonly used evaluation methods were written examination (87.5%) during preclinical training and oral examination (91.6%) during clinical training. The most important finding of this study was the lack of a standardized curriculum for psychiatric training in Turkey. Standardization of psychiatric training in Turkish medical schools must be developed.
Yamamoto, Shingo; Tanooka, Masao; Ando, Kumiko; Yamano, Toshiko; Ishikura, Reiichi; Nojima, Michio; Hirota, Shozo; Shima, Hiroki
2009-12-01
To evaluate the diagnostic accuracy of computed tomography (CT)-based imaging methods for assessing renal vascular anatomy, imaging studies, including standard axial CT, three-dimensional volume-rendered CT (3DVR-CT), and a 3DVR-CT movie, were performed on 30 patients who underwent laparoscopic donor nephrectomy (10 right side, 20 left side) for predicting the location of the renal arteries and renal, adrenal, gonadal, and lumbar veins. These findings were compared with videos obtained during the operation. Two of 37 renal arteries observed intraoperatively were missed by standard axial CT and 3DVR-CT, whereas all arteries were identified by the 3DVR-CT movie. Two of 36 renal veins were missed by standard axial CT and 3DVR-CT, whereas 1 was missed by the 3DVR-CT movie. In 20 left renal hilar anatomical structures, 20 adrenal, 20 gonadal, and 22 lumbar veins were observed during the operation. Preoperatively, the standard axial CT, 3DVR-CT, and 3DVR-CT movie detected 11, 19, and 20 adrenal veins; 13, 14, and 19 gonadal veins; and 6, 11, and 15 lumbar veins, respectively. Overall, of 135 renal vascular structures, the standard axial CT, 3DVR-CT, and 3DVR-CT movie accurately detected 99 (73.3%), 113 (83.7%), and 126 (93.3%) vessels, respectively, which indicated that the 3DVR-CT movie demonstrated a significantly higher detection rate than other CT-based imaging methods (P < 0.05). The 3DVR-CT movie accurately provides essential information about the renal vascular anatomy before laparoscopic donor nephrectomy.
Meta-analysis of two studies in the presence of heterogeneity with applications in rare diseases.
Friede, Tim; Röver, Christian; Wandel, Simon; Neuenschwander, Beat
2017-07-01
Random-effects meta-analyses are used to combine evidence of treatment effects from multiple studies. Since treatment effects may vary across trials due to differences in study characteristics, heterogeneity in treatment effects between studies must be accounted for to achieve valid inference. The standard model for random-effects meta-analysis assumes approximately normal effect estimates and a normal random-effects model. However, standard methods based on this model ignore the uncertainty in estimating the between-trial heterogeneity. In the special setting of only two studies and in the presence of heterogeneity, we investigate here alternatives such as the Hartung-Knapp-Sidik-Jonkman method (HKSJ), the modified Knapp-Hartung method (mKH, a variation of the HKSJ method) and Bayesian random-effects meta-analyses with priors covering plausible heterogeneity values; R code to reproduce the examples is presented in an appendix. The properties of these methods are assessed by applying them to five examples from various rare diseases and by a simulation study. Whereas the standard method based on normal quantiles has poor coverage, the HKSJ and mKH generally lead to very long, and therefore inconclusive, confidence intervals. The Bayesian intervals on the whole show satisfying properties and offer a reasonable compromise between these two extremes. © 2016 The Authors. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
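A minimal Python sketch of the standard (normal-quantile) and HKSJ-type intervals for the two-study case, assuming a DerSimonian-Laird estimate of the between-trial heterogeneity; the Bayesian analyses discussed above are not reproduced here, and the example inputs are invented.

import numpy as np
from scipy import stats

def two_study_random_effects(y, se, level=0.95):
    # Random-effects meta-analysis of k = 2 estimates y with standard errors se.
    y, v = np.asarray(y, float), np.asarray(se, float) ** 2
    w = 1.0 / v
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe) ** 2)
    k = len(y)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)            # DerSimonian-Laird estimate
    w_star = 1.0 / (v + tau2)
    mu = np.sum(w_star * y) / np.sum(w_star)
    se_mu = np.sqrt(1.0 / np.sum(w_star))
    # Standard interval: normal quantile, ignoring uncertainty in tau^2.
    z = stats.norm.ppf(0.5 + level / 2)
    ci_std = (mu - z * se_mu, mu + z * se_mu)
    # HKSJ-type interval: rescaled variance with a t quantile on k - 1 df.
    q_star = np.sum(w_star * (y - mu) ** 2) / (k - 1)
    se_hk = np.sqrt(q_star) * se_mu
    t = stats.t.ppf(0.5 + level / 2, k - 1)
    ci_hksj = (mu - t * se_hk, mu + t * se_hk)
    return mu, tau2, ci_std, ci_hksj

# e.g. two log-odds-ratio estimates with their standard errors
print(two_study_random_effects([0.35, -0.10], [0.18, 0.22]))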
Faassen, Elisabeth J; Antoniou, Maria G; Beekman-Lukassen, Wendy; Blahova, Lucie; Chernova, Ekaterina; Christophoridis, Christophoros; Combes, Audrey; Edwards, Christine; Fastner, Jutta; Harmsen, Joop; Hiskia, Anastasia; Ilag, Leopold L; Kaloudis, Triantafyllos; Lopicic, Srdjan; Lürling, Miquel; Mazur-Marzec, Hanna; Meriluoto, Jussi; Porojan, Cristina; Viner-Mozzini, Yehudit; Zguna, Nadezda
2016-02-29
Exposure to β-N-methylamino-l-alanine (BMAA) might be linked to the incidence of amyotrophic lateral sclerosis, Alzheimer's disease and Parkinson's disease. Analytical chemistry plays a crucial role in determining human BMAA exposure and the associated health risk, but the performance of various analytical methods currently employed is rarely compared. A CYANOCOST initiated workshop was organized aimed at training scientists in BMAA analysis, creating mutual understanding and paving the way towards interlaboratory comparison exercises. During this workshop, we tested different methods (extraction followed by derivatization and liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) analysis, or directly followed by LC-MS/MS analysis) for trueness and intermediate precision. We adapted three workup methods for the underivatized analysis of animal, brain and cyanobacterial samples. Based on recovery of the internal standard D₃BMAA, the underivatized methods were accurate (mean recovery 80%) and precise (mean relative standard deviation 10%), except for the cyanobacterium Leptolyngbya. However, total BMAA concentrations in the positive controls (cycad seeds) showed higher variation (relative standard deviation 21%-32%), implying that D₃BMAA was not a good indicator for the release of BMAA from bound forms. Significant losses occurred during workup for the derivatized method, resulting in low recovery (<10%). Most BMAA was found in a trichloroacetic acid soluble, bound form and we recommend including this fraction during analysis.
NASA Astrophysics Data System (ADS)
Cameron, James F.; Fradkin, Leslie; Moore, Kathryn; Pohlers, Gerd
2000-06-01
Chemically amplified deep UV (CA-DUV) positive resists are the enabling materials for manufacture of devices at and below 0.18 micrometer design rules in the semiconductor industry. CA-DUV resists are typically based on a combination of an acid labile polymer and a photoacid generator (PAG). Upon UV exposure, a catalytic amount of a strong Bronsted acid is released and is subsequently used in a post-exposure bake step to deprotect the acid labile polymer. Deprotection transforms the acid labile polymer into a base soluble polymer and ultimately enables positive tone image development in dilute aqueous base. As CA-DUV resist systems continue to mature and are used in increasingly demanding situations, it is critical to develop a fundamental understanding of how robust these materials are. One of the most important factors to quantify is how much acid is photogenerated in these systems at key exposure doses. For the purpose of quantifying photoacid generation several methods have been devised. These include spectrophotometric methods, ion conductivity methods and most recently an acid-base type titration similar to the standard addition method. This paper compares many of these techniques. First, comparisons between the most commonly used acid sensitive dye, tetrabromophenol blue sodium salt (TBPB) and a less common acid sensitive dye, Rhodamine B base (RB) are made in several resist systems. Second, the novel acid-base type titration based on the standard addition method is compared to the spectrophotometric titration method. During these studies, the make up of the resist system is probed as follows: the photoacid generator and resist additives are varied to understand the impact of each of these resist components on the acid generation process.
NASA Astrophysics Data System (ADS)
Chen, Xiaogang; Wang, Yijun; Gao, Shangkai; Jung, Tzyy-Ping; Gao, Xiaorong
2015-08-01
Objective. Recently, canonical correlation analysis (CCA) has been widely used in steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) due to its high efficiency, robustness, and simple implementation. However, a method with which to make use of harmonic SSVEP components to enhance the CCA-based frequency detection has not been well established. Approach. This study proposed a filter bank canonical correlation analysis (FBCCA) method to incorporate fundamental and harmonic frequency components to improve the detection of SSVEPs. A 40-target BCI speller based on frequency coding (frequency range: 8-15.8 Hz, frequency interval: 0.2 Hz) was used for performance evaluation. To optimize the filter bank design, three methods (M1: sub-bands with equally spaced bandwidths; M2: sub-bands corresponding to individual harmonic frequency bands; M3: sub-bands covering multiple harmonic frequency bands) were proposed for comparison. Classification accuracy and information transfer rate (ITR) of the three FBCCA methods and the standard CCA method were estimated using an offline dataset from 12 subjects. Furthermore, an online BCI speller adopting the optimal FBCCA method was tested with a group of 10 subjects. Main results. The FBCCA methods significantly outperformed the standard CCA method. The method M3 achieved the highest classification performance. At a spelling rate of ˜33.3 characters/min, the online BCI speller obtained an average ITR of 151.18 ± 20.34 bits min-1. Significance. By incorporating the fundamental and harmonic SSVEP components in target identification, the proposed FBCCA method significantly improves the performance of the SSVEP-based BCI, and thereby facilitates its practical applications such as high-speed spelling.
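A hedged Python sketch of the filter bank CCA scoring step: the EEG epoch is band-pass filtered into sub-bands, the largest canonical correlation with a sin/cos reference at the candidate frequency and its harmonics is computed per sub-band, and the squared correlations are combined with weights n^(-a) + b. The sub-band edges, filter order, weight constants, and sampling-rate assumption (fs of roughly 250 Hz or higher) are illustrative defaults based on common FBCCA implementations, not necessarily the paper's exact settings.

import numpy as np
from scipy.signal import butter, filtfilt

def max_canonical_corr(X, Y):
    # Largest canonical correlation between the column spaces of X and Y.
    Qx, _ = np.linalg.qr(X - X.mean(0))
    Qy, _ = np.linalg.qr(Y - Y.mean(0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def sincos_reference(freq, fs, n_samples, n_harm=5):
    t = np.arange(n_samples) / fs
    ref = [f(2 * np.pi * (h + 1) * freq * t) for h in range(n_harm)
           for f in (np.sin, np.cos)]
    return np.column_stack(ref)

def fbcca_score(eeg, freq, fs, n_bands=5, a=1.25, b=0.25):
    # eeg: (n_samples, n_channels) epoch; returns the combined FBCCA score
    # for one candidate stimulation frequency.
    y = sincos_reference(freq, fs, eeg.shape[0])
    score = 0.0
    for n in range(1, n_bands + 1):
        lo = 8.0 * n                        # sub-band lower edge (Hz), assumed
        bz, az = butter(4, [lo / (fs / 2), 88.0 / (fs / 2)], btype='band')
        sub = filtfilt(bz, az, eeg, axis=0)
        rho = max_canonical_corr(sub, y)
        score += (n ** (-a) + b) * rho ** 2
    return score

# Classification: evaluate fbcca_score for every stimulus frequency
# (8-15.8 Hz in 0.2 Hz steps) and pick the frequency with the maximum score.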
Comparison of landmark-based and automatic methods for cortical surface registration
Pantazis, Dimitrios; Joshi, Anand; Jiang, Jintao; Shattuck, David; Bernstein, Lynne E.; Damasio, Hanna; Leahy, Richard M.
2009-01-01
Group analysis of structure or function in cerebral cortex typically involves as a first step the alignment of the cortices. A surface based approach to this problem treats the cortex as a convoluted surface and coregisters across subjects so that cortical landmarks or features are aligned. This registration can be performed using curves representing sulcal fundi and gyral crowns to constrain the mapping. Alternatively, registration can be based on the alignment of curvature metrics computed over the entire cortical surface. The former approach typically involves some degree of user interaction in defining the sulcal and gyral landmarks while the latter methods can be completely automated. Here we introduce a cortical delineation protocol consisting of 26 consistent landmarks spanning the entire cortical surface. We then compare the performance of a landmark-based registration method that uses this protocol with that of two automatic methods implemented in the software packages FreeSurfer and BrainVoyager. We compare performance in terms of discrepancy maps between the different methods, the accuracy with which regions of interest are aligned, and the ability of the automated methods to correctly align standard cortical landmarks. Our results show similar performance for ROIs in the perisylvian region for the landmark based method and FreeSurfer. However, the discrepancy maps showed larger variability between methods in occipital and frontal cortex and also that automated methods often produce misalignment of standard cortical landmarks. Consequently, selection of the registration approach should consider the importance of accurate sulcal alignment for the specific task for which coregistration is being performed. When automatic methods are used, the users should ensure that sulci in regions of interest in their studies are adequately aligned before proceeding with subsequent analysis. PMID:19796696
A review of the quantum current standard
NASA Astrophysics Data System (ADS)
Kaneko, Nobu-Hisa; Nakamura, Shuji; Okazaki, Yuma
2016-03-01
The electric current, voltage, and resistance standards are the most important standards related to electricity and magnetism. Of these three standards, only the ampere, which is the unit of electric current, is an International System of Units (SI) base unit. However, even with modern technology, relatively large uncertainty exists regarding the generation and measurement of current. As a result of various innovative techniques based on nanotechnology and novel materials, new types of junctions for quantum current generation and single-electron current sources have recently been proposed. These newly developed methods are also being used to investigate the consistency of the three quantum electrical effects, i.e. the Josephson, quantum Hall, and single-electron tunneling effects, which are also known as ‘the quantum metrology triangle’. This article describes recent research and related developments regarding current standards and quantum-metrology-triangle experiments.
[Standardization of the terms for Chinese herbal functions based on functional targeting].
Xiao, Bin; Tao, Ou; Gu, Hao; Wang, Yun; Qiao, Yan-Jiang
2011-03-01
Functional analysis concisely summarizes and concentrates on the therapeutic characteristics and features of Chinese herbal medicine. Standardization of the terms for Chinese herbal functions not only plays a key role in modern research and development of Chinese herbal medicine, but also has far-reaching clinical applications. In this paper, a new method for standardizing the terms for Chinese herbal function was proposed. Firstly, functional targets were collected. Secondly, the pathological conditions and the mode of action of every functional target were determined by analyzing the references. Thirdly, the relationships between the pathological condition and the mode of action were determined based on Chinese medicine theory and data. This three-step approach allows for standardization of the terms for Chinese herbal functions. Promoting the standardization of Chinese medicine terms will benefit the overall clinical application of Chinese herbal medicine.
Bissonnette, Luc; Maheux, Andrée F; Bergeron, Michel G
2017-01-01
The microbial assessment of potable/drinking water is done to ensure that the resource is free of fecal contamination indicators or waterborne pathogens. Culture-based methods for verifying the microbial safety are limited in the sense that a standard volume of water is generally tested for only one indicator (family) or pathogen. In this work, we describe a membrane filtration-based molecular microbiology method, CRENAME (Concentration Recovery Extraction of Nucleic Acids and Molecular Enrichment), exploiting molecular enrichment by whole genome amplification (WGA) to yield, in less than 4 h, a nucleic acid preparation which can be repetitively tested by real-time PCR for example, to provide multiparametric presence/absence tests (1 colony forming unit or microbial particle per standard volume of 100-1000 mL) for bacterial or protozoan parasite cells or particles susceptible to contaminate potable/drinking water.
Hybrid Differential Dynamic Programming with Stochastic Search
NASA Technical Reports Server (NTRS)
Aziz, Jonathan; Parker, Jeffrey; Englander, Jacob
2016-01-01
Differential dynamic programming (DDP) has been demonstrated as a viable approach to low-thrust trajectory optimization, namely with the recent success of NASA's Dawn mission. The Dawn trajectory was designed with the DDP-based Static Dynamic Optimal Control algorithm used in the Mystic software. Another recently developed method, Hybrid Differential Dynamic Programming (HDDP), is a variant of the standard DDP formulation that leverages both first-order and second-order state transition matrices in addition to nonlinear programming (NLP) techniques. Areas of improvement over standard DDP include constraint handling, convergence properties, continuous dynamics, and multi-phase capability. DDP is a gradient-based method and will converge to a solution near an initial guess. In this study, monotonic basin hopping (MBH) is employed as a stochastic search method to overcome this limitation, by augmenting the HDDP algorithm for a wider search of the solution space.
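A minimal Python sketch of the monotonic basin hopping outer loop described above, with scipy's general-purpose local minimizer standing in for the HDDP inner solve (an assumption for illustration only).

import numpy as np
from scipy.optimize import minimize, rosen

def monotonic_basin_hopping(objective, x0, n_hops=50, step=0.5, seed=0):
    # Perturb the current best solution, re-run a local (gradient-based)
    # solver, and accept only improvements (the monotonic acceptance rule).
    rng = np.random.default_rng(seed)
    best = minimize(objective, x0)                                  # initial local solve
    for _ in range(n_hops):
        trial_x0 = best.x + rng.normal(scale=step, size=best.x.size)  # random hop
        trial = minimize(objective, trial_x0)                         # local solve
        if trial.fun < best.fun:
            best = trial
    return best

# Usage on a multimodal stand-in objective:
result = monotonic_basin_hopping(rosen, x0=np.zeros(4))
print(result.fun, result.x)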
Identifying inaccuracy of MS Project using system analysis
NASA Astrophysics Data System (ADS)
Fachrurrazi; Husin, Saiful; Malahayati, Nurul; Irzaidi
2018-05-01
The problem encountered in the project owner's financial accounting report is the difference between the total project cost computed by MS Project and that computed according to the Indonesian Standard (SNI / Cost Estimating Standard Book of Indonesia). This is one of the MS Project problems concerning cost accuracy, so its cost data cannot be used in an integrated way across all project components. This study focuses on finding the causes of the inaccuracy of MS Project. The operational aims of this study are: (i) identifying the cost analysis procedures of both the current method (SNI) and MS Project; (ii) identifying the cost bias in each element of the cost analysis procedure; and (iii) analysing the cost differences (cost bias) in each element to identify the cause of the inaccuracy of MS Project relative to SNI. The method in this study is to compare the system analyses of MS Project and SNI. The results are: (i) the MS Project system limits the Work of Resources element to two decimal digits, which leads to its inaccuracy; the Work of Resources (referred to as effort) in MS Project corresponds to the multiplication of the Quantity of Activities and the Requirement of Resources in SNI; (ii) MS Project and SNI differ in their costing (cost estimation) methods, in which SNI uses Quantity-Based Costing (QBC) whereas MS Project uses Time-Based Costing (TBC). Based on this research, we recommend that contractors who use SNI make an adjustment for the Work of Resources in MS Project (with a correction index) so that it can be used in an integrated way with the project owner's financial accounting system. Further research will be conducted to improve MS Project as an integrated tool for all project participants.
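The effect of the two-decimal Work of Resources field can be illustrated with a small Python example; all figures are hypothetical and serve only to show how rounding the effort before costing produces the kind of bias described above.

# SNI-style quantity-based costing keeps the full-precision product of
# quantity and resource requirement, while MS Project stores Work to two
# decimal places before multiplying by the rate (illustrative numbers only).
quantity = 137.5          # e.g. m2 of an activity
requirement = 0.0433      # resource hours per unit (labour coefficient)
rate = 50_000.0           # cost per hour

work_exact = quantity * requirement          # 5.95375 h (quantity-based costing)
work_msp = round(work_exact, 2)              # 5.95 h (two-decimal Work field)

cost_sni = work_exact * rate
cost_msp = work_msp * rate
print(cost_sni - cost_msp)                   # 187.5 -> the per-item cost bias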
Standard plane localization in ultrasound by radial component model and selective search.
Ni, Dong; Yang, Xin; Chen, Xin; Chin, Chien-Ting; Chen, Siping; Heng, Pheng Ann; Li, Shengli; Qin, Jing; Wang, Tianfu
2014-11-01
Acquisition of the standard plane is crucial for medical ultrasound diagnosis. However, this process requires substantial experience and a thorough knowledge of human anatomy. Therefore, it is very challenging for novices and even time-consuming for experienced examiners. We proposed a hierarchical, supervised learning framework for automatically detecting the standard plane from consecutive 2-D ultrasound images. We tested this technique by developing a system that localizes the fetal abdominal standard plane from ultrasound video by detecting three key anatomical structures: the stomach bubble, umbilical vein and spine. We first proposed a novel radial component-based model to describe the geometric constraints of these key anatomical structures. We then introduced a novel selective search method which exploits the vessel probability algorithm to produce probable locations for the spine and umbilical vein. Next, using component classifiers trained by random forests, we detected the key anatomical structures at their probable locations within the regions constrained by the radial component-based model. Finally, a second-level classifier combined the results from the component detection to identify an ultrasound image as either a "fetal abdominal standard plane" or a "non-fetal abdominal standard plane." Experimental results on 223 fetal abdomen videos showed that the detection accuracy of our method was as high as 85.6% and significantly outperformed both the full abdomen and the separate anatomy detection methods without geometric constraints. The experimental results demonstrated that our system shows great promise for application to clinical practice. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
Schwartzenberger, Justin; Presson, Angela; Lyle, Adam; O'Farrell, Andrew; Tyser, Andrew R
2017-09-01
Obtaining remote patient-reported outcomes (PRO) is limited by low patient response rates and resource-intensive collection methods. We hypothesized that an e-mail-delivered Web-based data collection tool would outperform the traditional methods of telephone and standard mail for collecting long-term Boston Carpal Tunnel Questionnaire (BCTQ) scores at a minimum of 1 year following carpal tunnel release (CTR). We conducted a randomized trial of 969 patients who underwent CTR at a tertiary medical center within the past 5 years. Participants were randomized to the PRO collection methods of mail, telephone, and e-mail. The primary outcome was survey response rate at 1 year after surgery. Secondary analyses included data completeness and the effect of time from surgery, mode effects, and patient modality preference. At 1 year from surgery, the response rates were 64% for telephone and 42% for both mail and e-mail. Ninety-nine percent of telephone surveys were complete compared with 88% and 83% for mail and e-mail, respectively. There was no significant difference in the overall response rate at 1 or 5 years after surgery, nor in the BCTQ score between the modalities. A higher response rate and increased survey completeness were achieved by telephone contact methods compared with standard mailings or Web-based methods for PRO collection after CTR 1 to 5 years after surgery. A Web-based method demonstrated response rates equivalent to those of standard mail, was the most preferred modality, and offered logistical advantages such as automation and immediate integration with outcome databases. Obtaining PRO routinely after treatment may increase in importance. A Web-based interface may assist clinicians in decreasing the resource utilization typically associated with more traditional methods used to obtain outcome data. Copyright © 2017 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Validation of the tablet-administered Brief Assessment of Cognition (BAC App).
Atkins, Alexandra S; Tseng, Tina; Vaughan, Adam; Twamley, Elizabeth W; Harvey, Philip; Patterson, Thomas; Narasimhan, Meera; Keefe, Richard S E
2017-03-01
Computerized tests benefit from automated scoring procedures and standardized administration instructions. These methods can reduce the potential for rater error. However, especially in patients with severe mental illnesses, the equivalency of traditional and tablet-based tests cannot be assumed. The Brief Assessment of Cognition in Schizophrenia (BACS) is a pen-and-paper cognitive assessment tool that has been used in hundreds of research studies and clinical trials, and has normative data available for generating age- and gender-corrected standardized scores. A tablet-based version of the BACS called the BAC App has been developed. This study compared performance on the BACS and the BAC App in patients with schizophrenia and healthy controls. Test equivalency was assessed, and the applicability of paper-based normative data was evaluated. Results demonstrated the distributions of standardized composite scores for the tablet-based BAC App and the pen-and-paper BACS were indistinguishable, and the between-methods mean differences were not statistically significant. The discrimination between patients and controls was similarly robust. The between-methods correlations for individual measures in patients were r>0.70 for most subtests. When data from the Token Motor Test was omitted, the between-methods correlation of composite scores was r=0.88 (df=48; p<0.001) in healthy controls and r=0.89 (df=46; p<0.001) in patients, consistent with the test-retest reliability of each measure. Taken together, results indicate that the tablet-based BAC App generates results consistent with the traditional pen-and-paper BACS, and support the notion that the BAC App is appropriate for use in clinical trials and clinical practice. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Multidisciplinary life cycle metrics and tools for green buildings.
Helgeson, Jennifer F; Lippiatt, Barbara C
2009-07-01
Building sector stakeholders need compelling metrics, tools, data, and case studies to support major investments in sustainable technologies. Proponents of green building widely claim that buildings integrating sustainable technologies are cost effective, but often these claims are based on incomplete, anecdotal evidence that is difficult to reproduce and defend. The claims suffer from 2 main weaknesses: 1) buildings on which claims are based are not necessarily "green" in a science-based, life cycle assessment (LCA) sense and 2) measures of cost effectiveness often are not based on standard methods for measuring economic worth. Yet, the building industry demands compelling metrics to justify sustainable building designs. The problem is hard to solve because, until now, neither methods nor robust data supporting defensible business cases were available. The US National Institute of Standards and Technology (NIST) Building and Fire Research Laboratory is beginning to address these needs by developing metrics and tools for assessing the life cycle economic and environmental performance of buildings. Economic performance is measured with the use of standard life cycle costing methods. Environmental performance is measured by LCA methods that assess the "carbon footprint" of buildings, as well as 11 other sustainability metrics, including fossil fuel depletion, smog formation, water use, habitat alteration, indoor air quality, and effects on human health. Carbon efficiency ratios and other eco-efficiency metrics are established to yield science-based measures of the relative worth, or "business cases," for green buildings. Here, the approach is illustrated through a realistic building case study focused on different heating, ventilation, air conditioning technology energy efficiency. Additionally, the evolution of the Building for Environmental and Economic Sustainability multidisciplinary team and future plans in this area are described.
The U.S. Environmental Protection Agency (EPA) held a workshop in January 2003 on the detection of viruses in water using polymerase chain reaction (PCR)-based methods. Speakers were asked to address a series of specific questions, including whether a single standard method coul...
ERIC Educational Resources Information Center
Lanier, Paul; Kohl, Patrica L.; Benz, Joan; Swinger, Dawn; Moussette, Pam; Drake, Brett
2011-01-01
Objectives: The purpose of this study was to evaluate Parent-Child Interaction Therapy (PCIT) deployed in a community setting comparing in-home with the standard office-based intervention. Child behavior, parent stress, parent functioning, and attrition were examined. Methods: Using a quasi-experimental design, standardized measures at three time…
ERIC Educational Resources Information Center
Wall, Denise E.; Least, Christine; Gromis, Judy; Lohse, Barbara
2012-01-01
Background: Impact of a classroom-based, standardized intervention to address limited vegetable consumption of fourth graders was assessed. Methods: A 4-lesson, vegetable-focused intervention, revised from extant materials was repurposed for Pennsylvania fourth graders with lessons aligned with state academic standards. A reliability-tested survey…
Readability Levels of Health-Based Websites: From Content to Comprehension
ERIC Educational Resources Information Center
Schutten, Mary; McFarland, Allison
2009-01-01
Three of the national health education standards include decision-making, accessing information and analyzing influences. WebQuests are a popular inquiry-oriented method used by secondary teachers to help students achieve these content standards. While WebQuests support higher level thinking skills, the readability level of the information on the…
Test Anxiety and High-Stakes Test Performance between School Settings: Implications for Educators
ERIC Educational Resources Information Center
von der Embse, Nathaniel; Hasson, Ramzi
2012-01-01
With the enactment of standards-based accountability in education, high-stakes tests have become the dominant method for measuring school effectiveness and student achievement. Schools and educators are under increasing pressure to meet achievement standards. However, there are variables which may interfere with the authentic measurement of…
A Standards-Based Content Analysis of Selected Biological Science Websites
ERIC Educational Resources Information Center
Stewart, Joy E.
2010-01-01
The purpose of this study was to analyze the biology content, instructional strategies, and assessment methods of 100 biological science websites that were appropriate for Grade 12 educational purposes. For the analysis of each website, an instrument, developed from the National Science Education Standards (NSES) for Grade 12 Life Science coupled…
78 FR 9698 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-11
... effective at improving health care quality. While evidence-based approaches for decision-making have become standard in healthcare, this has been limited in laboratory medicine. No single-evidence-based model for... (LMBP) initiative to develop new systematic evidence reviews methods for making evidence-based...
High-performance modeling of plasma-based acceleration and laser-plasma interactions
NASA Astrophysics Data System (ADS)
Vay, Jean-Luc; Blaclard, Guillaume; Godfrey, Brendan; Kirchen, Manuel; Lee, Patrick; Lehe, Remi; Lobet, Mathieu; Vincenti, Henri
2016-10-01
Large-scale numerical simulations are essential to the design of plasma-based accelerators and laser-plasma interactions for ultra-high intensity (UHI) physics. The electromagnetic Particle-In-Cell (PIC) approach is the method of choice for self-consistent simulations, as it is based on first principles, captures all kinetic effects, and scales favorably to many cores on supercomputers. The standard PIC algorithm relies on second-order finite-difference discretization of the Maxwell and Newton-Lorentz equations. We present here novel formulations, based on very high-order pseudo-spectral Maxwell solvers, which enable near-total elimination of the numerical Cherenkov instability and increased accuracy over the standard PIC method for standard laboratory frame and Lorentz boosted frame simulations. We also present the latest implementations in the PIC modules Warp-PICSAR and FBPIC on the Intel Xeon Phi and GPU architectures. Examples of applications will be given on the simulation of laser-plasma accelerators and high-harmonic generation with plasma mirrors. Work supported by US-DOE Contracts DE-AC02-05CH11231 and by the European Commission through the Marie Skłodowska-Curie fellowship PICSSAR Grant Number 624543. Used resources of NERSC.
Kanazawa, Yuki; Ehara, Masahiro; Sommerfeld, Thomas
2016-03-10
Low-lying π* resonance states of DNA and RNA bases have been investigated by the recently developed projected complex absorbing potential (CAP)/symmetry-adapted cluster-configuration interaction (SAC-CI) method using a smooth Voronoi potential as CAP. In spite of the challenging CAP applications to higher resonance states of molecules of this size, the present calculations reproduce resonance positions observed by electron transmission spectra (ETS) provided the anticipated deviations due to vibronic effects and limited basis sets are taken into account. Moreover, for the standard nucleobases, the calculated positions and widths qualitatively agree with those obtained in previous electron scattering calculations. For guanine, both keto and enol forms were examined, and the calculated values of the keto form agree clearly better with the experimental findings. In addition to these standard bases, three modified forms of cytosine, which serve as epigenetic or biomarkers, were investigated: formylcytosine, methylcytosine, and chlorocytosine. Last, a strong correlation between the computed positions and the observed ETS values is demonstrated, clearly suggesting that the present computational protocol should be useful for predicting the π* resonances of congeners of DNA and RNA bases.
Spectral irradiance standard for the ultraviolet - The deuterium lamp
NASA Technical Reports Server (NTRS)
Saunders, R. D.; Ott, W. R.; Bridges, J. M.
1978-01-01
A set of deuterium lamps is calibrated as spectral irradiance standards in the 200-350-nm spectral region utilizing both a high accuracy tungsten spectral irradiance standard and a newly developed argon mini-arc spectral radiance standard. The method which enables a transfer from a spectral radiance to a spectral irradiance standard is described. The following characteristics of the deuterium lamp irradiance standard are determined: sensitivity to alignment; dependence on input power and solid angle; reproducibility; and stability. The absolute spectral radiance is also measured in the 167-330-nm region. Based upon these measurements, values of the spectral irradiance below 200 nm are obtained through extrapolation.
Deurenberg, Rikie; Vlayen, Joan; Guillo, Sylvie; Oliver, Thomas K; Fervers, Beatrice; Burgers, Jako
2008-03-01
Effective literature searching is particularly important for clinical practice guideline development. Sophisticated searching and filtering mechanisms are needed to help ensure that all relevant research is reviewed. To assess the methods used for the selection of evidence for guideline development by evidence-based guideline development organizations. A semistructured questionnaire assessing the databases, search filters and evaluation methods used for literature retrieval was distributed to eight major organizations involved in evidence-based guideline development. All of the organizations used search filters as part of guideline development. The MEDLINE database was the primary source accessed for literature retrieval. The OVID or SilverPlatter interfaces were used in preference to the freely accessed PubMed interface. The Cochrane Library, Embase, CINAHL and PsycINFO databases were also frequently used by the organizations. All organizations reported the intention to improve and validate their filters for finding literature specifically relevant for guidelines. In the first international survey of its kind, eight major guideline development organizations indicated a strong interest in identifying, improving and standardizing search filters to improve guideline development. It is to be hoped that this will result in the standardization of, and open access to, search filters, an improvement in literature searching outcomes and greater collaboration among guideline development organizations.
NASA Astrophysics Data System (ADS)
Nord, Mark; Cafiero, Carlo; Viviani, Sara
2016-11-01
Statistical methods based on item response theory are applied to experiential food insecurity survey data from 147 countries, areas, and territories to assess data quality and develop methods to estimate national prevalence rates of moderate and severe food insecurity at equal levels of severity across countries. Data were collected from nationally representative samples of 1,000 adults in each country. A Rasch-model-based scale was estimated for each country, and data were assessed for consistency with model assumptions. A global reference scale was calculated based on item parameters from all countries. Each country's scale was adjusted to the global standard, allowing for up to 3 of the 8 scale items to be considered unique in that country if their deviance from the global standard exceeded a set tolerance. With very few exceptions, data from all countries were sufficiently consistent with model assumptions to constitute reasonably reliable measures of food insecurity and were adjustable to the global standard with fair confidence. National prevalence rates of moderate-or-severe food insecurity assessed over a 12-month recall period ranged from 3 percent to 92 percent. The correlations of national prevalence rates with national income, health, and well-being indicators provide external validation of the food security measure.
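As a rough illustration of the adjustment step described above, the sketch below aligns a country's Rasch item severities to a global reference scale by equating means and standard deviations, and flags items whose residual deviation exceeds a tolerance as country-unique. The item values, the tolerance, and the `align` helper are hypothetical simplifications; the actual procedure used for the global food insecurity scale may differ.

```python
import numpy as np

# Hypothetical Rasch item severities (logits) for the 8 scale items.
global_ref = np.array([-1.9, -1.2, -0.6, -0.1, 0.4, 0.9, 1.2, 1.3])
country    = np.array([-2.1, -1.0, -0.7,  0.0, 0.5, 1.6, 1.1, 1.4])

TOLERANCE = 0.5  # maximum allowed deviation (logits) before an item is treated as unique

def align(country_items, reference_items):
    """Standardize country severities to the reference metric (equal mean and SD)."""
    scaled = (country_items - country_items.mean()) / country_items.std()
    return scaled * reference_items.std() + reference_items.mean()

adjusted = align(country, global_ref)
unique_items = np.where(np.abs(adjusted - global_ref) > TOLERANCE)[0]
print("adjusted severities:", np.round(adjusted, 2))
print("items flagged as country-unique:", unique_items)
```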
Improving data quality in the linked open data: a survey
NASA Astrophysics Data System (ADS)
Hadhiatma, A.
2018-03-01
The Linked Open Data (LOD) is a "web of data", a different paradigm from the "web of documents" commonly used today. However, the huge LOD still suffers from data quality problems such as incompleteness, inconsistency, and inaccuracy. Addressing data quality involves designing effective methods both to manage and to retrieve information at various data quality levels. Based on a review of papers and journals, addressing data quality requires standards for (1) identifying data quality problems, (2) assessing data quality for a given context, and (3) correcting data quality problems. However, most methods and strategies dealing with LOD data quality have not taken an integrative approach. Hence, based on those standards and an integrative approach, there are opportunities to improve LOD data quality in terms of incompleteness, inaccuracy and inconsistency by considering its schema and ontology, namely ontology refinement. Ontology refinement, in this sense, serves not only to improve data quality but also to enrich the LOD. It therefore requires (1) a standard for data quality assessment and evaluation that is appropriate to the LOD, and (2) a framework of methods based on statistical relational learning that can improve the correction of data quality problems as well as enrich the LOD.
Gómez, Fátima Somovilla; Lorza, Rubén Lostado; Bobadilla, Marina Corral; García, Rubén Escribano
2017-09-21
The kinematic behavior of models that are based on the finite element method (FEM) for modeling the human body depends greatly on an accurate estimate of the parameters that define such models. This task is complex, and any small difference between the actual biomaterial model and the simulation model based on FEM can be amplified enormously in the presence of nonlinearities. The current paper attempts to demonstrate how a combination of the FEM and the MRS methods with desirability functions can be used to obtain the material parameters that are most appropriate for use in defining the behavior of Finite Element (FE) models of the healthy human lumbar intervertebral disc (IVD). The FE model parameters were adjusted on the basis of experimental data from selected standard tests (compression, flexion, extension, shear, lateral bending, and torsion) and were developed as follows: First, three-dimensional parameterized FE models were generated on the basis of the mentioned standard tests. Then, 11 parameters were selected to define the proposed parameterized FE models. For each of the standard tests, regression models were generated using MRS to model the six stiffness and nine bulges of the healthy IVD models that were created by changing the parameters of the FE models. The optimal combination of the 11 parameters was based on three different adjustment criteria. The latter, in turn, were based on the combination of stiffness and bulges that were obtained from the standard test FE simulations. The first adjustment criteria considered stiffness and bulges to be equally important in the adjustment of FE model parameters. The second adjustment criteria considered stiffness as most important, whereas the third considered the bulges to be most important. The proposed adjustment methods were applied to a medium-sized human IVD that corresponded to the L3-L4 lumbar level with standard dimensions of width = 50 mm, depth = 35 mm, and height = 10 mm. Agreement between the kinematic behavior that was obtained with the optimized parameters and that obtained from the literature demonstrated that the proposed method is a powerful tool with which to adjust healthy IVD FE models when there are many parameters, stiffnesses, and bulges to which the models must adjust.
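The optimization described above relies on desirability functions to trade off many stiffness and bulge responses at once. The following sketch shows one common Derringer-style formulation, combining individual desirabilities through a geometric mean; the response names, target values, and tolerances are invented for illustration and are not the parameters used in the paper.

```python
import numpy as np

def desirability_target(y, target, tol):
    """Simple triangular desirability: 1 at the target, falling to 0 beyond +/- tol."""
    return max(0.0, 1.0 - abs(y - target) / tol)

# Hypothetical responses from one FE simulation: two stiffnesses and one bulge.
responses = {"axial_stiffness": 1450.0, "torsional_stiffness": 7.8, "bulge": 1.1}
targets   = {"axial_stiffness": (1500.0, 300.0),   # (experimental target, tolerance)
             "torsional_stiffness": (8.0, 2.0),
             "bulge": (1.0, 0.5)}

d = [desirability_target(responses[k], *targets[k]) for k in responses]
overall = float(np.prod(d) ** (1.0 / len(d)))   # geometric mean = overall desirability
print(f"individual desirabilities: {np.round(d, 3)}, overall: {overall:.3f}")
```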
Costing evidence for health care decision-making in Austria: A systematic review
Mayer, Susanne; Kiss, Noemi; Łaszewska, Agata
2017-01-01
Background With rising healthcare costs comes an increasing demand for evidence-informed resource allocation using economic evaluations worldwide. Furthermore, standardization of costing and reporting methods both at international and national levels are imperative to make economic evaluations a valid tool for decision-making. The aim of this review is to assess the availability and consistency of costing evidence that could be used for decision-making in Austria. It describes systematically the current economic evaluation and costing studies landscape focusing on the applied costing methods and their reporting standards. Findings are discussed in terms of their likely impacts on evidence-based decision-making and potential suggestions for areas of development. Methods A systematic literature review of English and German language peer-reviewed as well as grey literature (2004–2015) was conducted to identify Austrian economic analyses. The databases MEDLINE, EMBASE, SSCI, EconLit, NHS EED and Scopus were searched. Publication and study characteristics, costing methods, reporting standards and valuation sources were systematically synthesised and assessed. Results A total of 93 studies were included. 87% were journal articles, 13% were reports. 41% of all studies were full economic evaluations, mostly cost-effectiveness analyses. Based on relevant standards the most commonly observed limitations were that 60% of the studies did not clearly state an analytical perspective, 25% of the studies did not provide the year of costing, 27% did not comprehensively list all valuation sources, and 38% did not report all applied unit costs. Conclusion There are substantial inconsistencies in the costing methods and reporting standards in economic analyses in Austria, which may contribute to a low acceptance and lack of interest in economic evaluation-informed decision making. To improve comparability and quality of future studies, national costing guidelines should be updated with more specific methodological guidance and a national reference cost library should be set up to allow harmonisation of valuation methods. PMID:28806728
Beyond maximum entropy: Fractal pixon-based image reconstruction
NASA Technical Reports Server (NTRS)
Puetter, R. C.; Pina, R. K.
1994-01-01
We have developed a new Bayesian image reconstruction method that has been shown to be superior to the best implementations of other methods, including Goodness-of-Fit (e.g. Least-Squares and Lucy-Richardson) and Maximum Entropy (ME). Our new method is based on the concept of the pixon, the fundamental, indivisible unit of picture information. Use of the pixon concept provides an improved image model, resulting in an image prior which is superior to that of standard ME.
Percy, Andrew J; Simon, Romain; Chambers, Andrew G; Borchers, Christoph H
2014-06-25
Mass spectrometry (MS)-based protein quantitation is increasingly being employed to verify candidate protein biomarkers. Multiple or selected reaction monitoring-mass spectrometry (MRM-MS or SRM-MS) with isotopically labeled internal standards has proven to be a successful approach in that regard, but has yet to reach its full potential in terms of multiplexing and sensitivity. Here, we report the development of a new MRM method for the quantitation of 253 disease-associated proteins (represented by 625 interference-free peptides) in 13 LC fractions. This 2D RPLC/MRM-MS approach extends the depth and breadth of the assay by 2 orders of magnitude over pre-fractionation-free assays, with 31 proteins below 10 ng/mL and 41 proteins above 10 ng/mL now quantifiable. Standard flow rates are used in both chromatographic dimensions, and up-front depletion or antibody-based enrichment is not required. The LC separations utilize high and low pH conditions, with the former employing an ammonium hydroxide-based eluent, instead of the conventional ammonium formate, resulting in improved LC column lifetime and performance. The high sensitivity (determined concentration range: 15 mg/mL to 452 pg/mL) and robustness afforded by this method makes the full MRM panel, or subsets thereof, useful for the verification of disease-associated plasma protein biomarkers in patient samples. The described research extends the breadth and depth of protein quantitation in undepleted and non-enriched human plasma by employing standard-flow 2D RPLC/MRM-MS in conjunction with a complex mixture of isotopically labeled peptide standards. The proteins quantified are mainly putative biomarkers of non-communicable (i.e., non-infectious) disease (e.g., cardiovascular or cancer), which require pre-clinical verification and validation before clinical implementation. Based on the enhanced sensitivity and multiplexing, this quantitative plasma proteomic method should prove useful in future candidate biomarker verification studies. Copyright © 2014 Elsevier B.V. All rights reserved.
European validation of Real-Time PCR method for detection of Salmonella spp. in pork meat.
Delibato, Elisabetta; Rodriguez-Lazaro, David; Gianfranceschi, Monica; De Cesare, Alessandra; Comin, Damiano; Gattuso, Antonietta; Hernandez, Marta; Sonnessa, Michele; Pasquali, Frédérique; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Prukner-Radovcic, Estella; Horvatek Tomic, Danijela; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John E; Chemaly, Marianne; Le Gall, Francoise; González-García, Patricia; Lettini, Antonia Anna; Lukac, Maja; Quesne, Segolénè; Zampieron, Claudia; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Proroga, Yolande T R; Capuano, Federico; Manfreda, Gerardo; De Medici, Dario
2014-08-01
The classical microbiological method for detection of Salmonella spp. requires more than five days for final confirmation, and consequently there is a need for an alternative methodology for detection of this pathogen, particularly in those food categories with a short shelf-life. This study presents an international (European-level) ISO 16140-based validation study of a non-proprietary Real-Time PCR-based method that can generate final results the day following sample analysis. It is based on an ISO-compatible enrichment coupled to an easy and inexpensive DNA extraction and a consolidated Real-Time PCR assay. Thirteen laboratories from seven European countries participated in this trial, and pork meat was selected as the food model. The limit of detection observed was down to 10 CFU per 25 g of sample, showing excellent concordance and accordance values between samples and laboratories (100%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (100%) when the results obtained for the Real-Time PCR-based method were compared to those of the ISO 6579:2002 standard method. The results of this international trial demonstrate that the evaluated Real-Time PCR-based method represents an excellent alternative to the ISO standard. In fact, it shows an equally solid performance while dramatically reducing the duration of the analytical process, and it can easily be implemented routinely by the Competent Authorities and food industry laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
Schneiderman, Eva; Colón, Ellen; White, Donald J; St John, Samuel
2015-01-01
The purpose of this study was to compare the abrasivity of commercial dentifrices by two techniques: the conventional gold standard radiotracer-based Radioactive Dentin Abrasivity (RDA) method; and a newly validated technique based on V8 brushing that included a profilometry-based evaluation of dentin wear. This profilometry-based method is referred to as RDA-Profilometry Equivalent, or RDA-PE. A total of 36 dentifrices were sourced from four global dentifrice markets (Asia Pacific [including China], Europe, Latin America, and North America) and tested blindly using both the standard radiotracer (RDA) method and the new profilometry method (RDA-PE), taking care to follow specific details related to specimen preparation and treatment. Commercial dentifrices tested exhibited a wide range of abrasivity, with virtually all falling well under the industry accepted upper limit of 250; that is, 2.5 times the level of abrasion measured using an ISO 11609 abrasivity reference calcium pyrophosphate as the reference control. RDA and RDA-PE comparisons were linear across the entire range of abrasivity (r2 = 0.7102) and both measures exhibited similar reproducibility with replicate assessments. RDA-PE assessments were not just linearly correlated, but were also proportional to conventional RDA measures. The linearity and proportionality of the results of the current study support that both methods (RDA or RDA-PE) provide similar results and justify a rationale for making the upper abrasivity limit of 250 apply to both RDA and RDA-PE.
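The linearity and proportionality claims above can be checked with two simple statistics: a squared Pearson correlation and the slope of a regression forced through the origin. The sketch below does both on made-up paired abrasivity scores; the data are hypothetical and "slope near 1" as the proportionality check is an assumption, not the study's stated criterion.

```python
import numpy as np

# Hypothetical paired abrasivity scores for six dentifrices (not the study's data).
rda    = np.array([60.0, 90.0, 120.0, 150.0, 200.0, 240.0])   # radiotracer method
rda_pe = np.array([55.0, 95.0, 115.0, 160.0, 195.0, 250.0])   # profilometry method

# Linearity: squared Pearson correlation between the two measures.
r_squared = np.corrcoef(rda, rda_pe)[0, 1] ** 2

# Proportionality: slope of a regression through the origin should be close to 1.
slope_through_origin = float(rda @ rda_pe) / float(rda @ rda)

print(f"r^2 = {r_squared:.3f}")
print(f"slope through origin = {slope_through_origin:.3f}")
```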
British Thoracic Society quality standards for home oxygen use in adults
Suntharalingam, Jay; Wilkinson, Tom; Annandale, Joseph; Davey, Claire; Fielding, Rhea; Freeman, Daryl; Gibbons, Michael; Hardinge, Maxine; Hippolyte, Sabrine; Knowles, Vikki; Lee, Cassandra; MacNee, William; Pollington, Jacqueline; Vora, Vandana; Watts, Trefor; Wijesinghe, Meme
2017-01-01
Introduction The purpose of the quality standards document is to provide healthcare professionals, commissioners, service providers and patients with a guide to standards of care that should be met for home oxygen provision in the UK, together with measurable markers of good practice. Quality statements are based on the British Thoracic Society (BTS) Guideline for Home Oxygen Use in Adults. Methods Development of BTS Quality Standards follows the BTS process of quality standard production based on the National Institute for Health and Care Excellence process manual for the development of quality standards. Results 10 quality statements have been developed, each describing a key marker of high-quality, cost-effective care for home oxygen use, and each statement is supported by quality measures that aim to improve the structure, process and outcomes of healthcare. Discussion BTS Quality Standards for home oxygen use in adults form a key part of the range of supporting materials that the society produces to assist in the dissemination and implementation of a guideline’s recommendations. PMID:29018527
Menachery, Philby Babu; Noronha, Judith Angelitta; Fernanades, Sweety
2017-08-01
The 'Standard Days Method' is a fertility awareness-based method of family planning that identifies day 8 through day 19 of the menstrual cycle as fertile days during which a woman is likely to conceive with unprotected intercourse. The study aimed to determine the effectiveness of a promotional program on the 'Standard Days Method' in terms of improving knowledge scores and attitude scores. A pre-experimental one-group pretest-posttest research design was adopted. The sample included 365 female postgraduate students from selected colleges of Udupi Taluk, Karnataka. The data were collected using self-administered questionnaires, and the plan for the promotional program was established. The findings of the study were analyzed using descriptive and inferential statistics. The mean pretest and posttest knowledge scores were computed, and the mean knowledge score increased from 8.96 ± 3.84 to 32.64 ± 5.59. The promotional program on the 'Standard Days Method' was effective in improving the knowledge (p < 0.001) and attitude (p < 0.001) of the postgraduate students. The promotional program on the Standard Days Method of family planning was effective in improving the knowledge and attitude of the postgraduate female students. This will enable women to adopt this method, plan their pregnancies naturally, and reduce the side effects of using oral contraceptives.
Reboulet, James; Cunningham, Robert; Gunasekar, Palur G; Chapman, Gail D; Stevens, Sean C
2009-02-01
A whole body inhalation study of mixed jet fuel vapor and its aerosol necessitated the development of a method for preparing vapor only standards from the neat fuel. Jet fuel is a complex mixture of components which partitions between aerosol and vapor when aspirated based on relative volatility of the individual compounds. A method was desired which could separate the vapor portion from the aerosol component to prepare standards for the calibration of infrared spectrophotometers and a head space gas chromatography system. A re-circulating loop system was developed which provided vapor only standards whose composition matched those seen in an exposure system. Comparisons of nominal concentrations in the exposure system to those determined by infrared spectrophotometry were in 92-95% agreement. Comparison of jet fuel vapor concentrations determined by infrared spectrophotometry compared to head space gas chromatography yielded a 93% overall agreement in trial runs. These levels of agreement show the loop system to be a viable method for creating jet fuel vapor standards for calibrating instruments.
NASA Astrophysics Data System (ADS)
Clifford, Betsey A.
The Massachusetts Department of Elementary and Secondary Education (DESE) released proposed Science and Technology/Engineering standards in 2013 outlining the concepts that should be taught at each grade level. Previously, standards were written in grade spans and each district determined the method of implementation. There are two different methods used to teach middle school science: integrated and discipline-based. In the proposed standards, the Massachusetts DESE uses grade-by-grade standards with an integrated approach. It was not known whether there is a statistically significant difference in student achievement on the 8th grade science MCAS assessment for students taught with an integrated or a discipline-based approach. The results on the 8th grade science MCAS test from six public school districts from 2010 to 2013 were collected and analyzed. The methodology used was quantitative. Results of an ANOVA showed that there was no statistically significant difference in overall student achievement between the two curriculum models. Furthermore, there was no statistically significant difference for the various domains: Earth and Space Science, Life Science, Physical Science, and Technology/Engineering. This information is useful for districts hesitant to make the change from a discipline-based approach to an integrated approach. More research should be conducted on this topic with a larger sample size to better support the results.
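As a minimal illustration of the analysis described (a one-way ANOVA comparing achievement under the two curriculum models), the sketch below uses hypothetical scaled scores; the group sizes and values are invented, and a real analysis would use the district-level MCAS data.

```python
import numpy as np
from scipy import stats

# Hypothetical 8th-grade science scaled scores (not the study's data).
integrated       = np.array([238, 242, 245, 250, 236, 241, 247, 239], dtype=float)
discipline_based = np.array([240, 237, 244, 248, 235, 243, 246, 238], dtype=float)

f_stat, p_value = stats.f_oneway(integrated, discipline_based)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")  # p > 0.05 -> no significant difference
```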
Comparing generalized ensemble methods for sampling of systems with many degrees of freedom
Lincoff, James; Sasmal, Sukanya; Head-Gordon, Teresa
2016-11-03
Here, we compare two standard replica exchange methods using temperature and dielectric constant as the scaling variables for independent replicas against two new corresponding enhanced sampling methods based on non-equilibrium statistical cooling (temperature) or descreening (dielectric). We test the four methods on a rough 1D potential as well as for alanine dipeptide in water, for which their relatively small phase space allows for the ability to define quantitative convergence metrics. We show that both dielectric methods are inferior to the temperature enhanced sampling methods, and in turn show that temperature cool walking (TCW) systematically outperforms the standard temperature replica exchange (TREx) method. We extend our comparisons of the TCW and TREx methods to the 5 residue met-enkephalin peptide, in which we evaluate the Kullback-Leibler divergence metric to show that the rate of convergence between two independent trajectories is faster for TCW compared to TREx. Finally we apply the temperature methods to the 42 residue amyloid-β peptide in which we find non-negligible differences in the disordered ensemble using TCW compared to the standard TREx. All four methods have been made available as software through the OpenMM Omnia software consortium.
Comparing generalized ensemble methods for sampling of systems with many degrees of freedom.
Lincoff, James; Sasmal, Sukanya; Head-Gordon, Teresa
2016-11-07
We compare two standard replica exchange methods using temperature and dielectric constant as the scaling variables for independent replicas against two new corresponding enhanced sampling methods based on non-equilibrium statistical cooling (temperature) or descreening (dielectric). We test the four methods on a rough 1D potential as well as for alanine dipeptide in water, for which their relatively small phase space allows for the ability to define quantitative convergence metrics. We show that both dielectric methods are inferior to the temperature enhanced sampling methods, and in turn show that temperature cool walking (TCW) systematically outperforms the standard temperature replica exchange (TREx) method. We extend our comparisons of the TCW and TREx methods to the 5 residue met-enkephalin peptide, in which we evaluate the Kullback-Leibler divergence metric to show that the rate of convergence between two independent trajectories is faster for TCW compared to TREx. Finally we apply the temperature methods to the 42 residue amyloid-β peptide in which we find non-negligible differences in the disordered ensemble using TCW compared to the standard TREx. All four methods have been made available as software through the OpenMM Omnia software consortium (http://www.omnia.md/).
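A sketch of the convergence metric mentioned above: the Kullback-Leibler divergence between histograms of the same observable sampled by two independent trajectories. The synthetic "trajectories" below are random draws standing in for dihedral-angle samples; the binning and the `kl_divergence` helper are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(P || Q) over histogram bins."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical samples of a torsion angle from two independent trajectories.
rng = np.random.default_rng(0)
traj_a = rng.normal(-60, 15, 5000)   # stand-ins for sampled dihedral angles (degrees)
traj_b = rng.normal(-58, 16, 5000)

bins = np.linspace(-180, 180, 37)
hist_a, _ = np.histogram(traj_a, bins=bins)
hist_b, _ = np.histogram(traj_b, bins=bins)
print(f"D_KL between trajectories: {kl_divergence(hist_a, hist_b):.4f}")
```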
Lakshmi, Karunanidhi Santhana; Lakshmi, Sivasubramanian
2011-03-01
Simultaneous determination of valsartan and hydrochlorothiazide by the H-point standard additions method (HPSAM) and partial least squares (PLS) calibration is described. Absorbances at a pair of wavelengths, 216 and 228 nm, were monitored with the addition of standard solutions of valsartan. Results of applying HPSAM showed that valsartan and hydrochlorothiazide can be determined simultaneously at concentration ratios varying from 20:1 to 1:15 in a mixed sample. The proposed PLS method does not require chemical separation and spectral graphical procedures for quantitative resolution of mixtures containing the titled compounds. The calibration model was based on absorption spectra in the 200-350 nm range for 25 different mixtures of valsartan and hydrochlorothiazide. Calibration matrices contained 0.5-3 μg mL⁻¹ of both valsartan and hydrochlorothiazide. The standard error of prediction (SEP) for valsartan and hydrochlorothiazide was 0.020 and 0.038 μg mL⁻¹, respectively. Both proposed methods were successfully applied to the determination of valsartan and hydrochlorothiazide in several synthetic and real matrix samples.
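A minimal sketch of the PLS calibration idea, assuming synthetic two-component UV spectra over the 200-350 nm range and scikit-learn's PLSRegression; the simulated band shapes, noise level, and number of latent variables are invented for illustration and do not reproduce the reported SEP values.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Hypothetical UV spectra of 25 valsartan/hydrochlorothiazide mixtures (0.5-3 ug/mL each).
rng = np.random.default_rng(1)
wavelengths = np.arange(200, 351)
conc = rng.uniform(0.5, 3.0, size=(25, 2))            # concentrations of the two analytes

def pure_band(center, width):
    """Gaussian stand-in for a pure-component absorption band."""
    return np.exp(-((wavelengths - center) / width) ** 2)

A = (conc[:, [0]] * pure_band(216, 12) +              # valsartan-like band
     conc[:, [1]] * pure_band(228, 10) +              # hydrochlorothiazide-like band
     rng.normal(0, 0.002, size=(25, wavelengths.size)))

X_train, X_test, y_train, y_test = train_test_split(A, conc, random_state=0)
pls = PLSRegression(n_components=4).fit(X_train, y_train)
sep = np.sqrt(((pls.predict(X_test) - y_test) ** 2).mean(axis=0))
print("standard error of prediction per analyte (ug/mL):", np.round(sep, 3))
```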
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prokofiev, I.; Wiencek, T.; McGann, D.
1997-10-07
Powder metallurgy dispersions of uranium alloys and silicides in an aluminum matrix have been developed by the RERTR program as a new generation of proliferation-resistant fuels. Testing is done with miniplate-type fuel plates to simulate standard fuel with cladding and matrix in plate-type configurations. In order to seal the dispersion fuel plates, a diffusion bond must exist between the aluminum coverplates surrounding the fuel meat. Four different variations in the standard method for roll-bonding 6061 aluminum were studied. They included mechanical cleaning, addition of a getter material, modifications to the standard chemical etching, and welding methods. Aluminum test pieces were subjected to a bend test after each rolling pass. Results, based on 400 samples, indicate that at least a 70% reduction in thickness is required to produce a diffusion bond using the standard roll-bonding method versus a 60% reduction using the Type II method, in which the assembly was welded 100% and contained open 9 mm holes at frame corners.
Negeri, Zelalem F; Shaikh, Mateen; Beyene, Joseph
2018-05-11
Diagnostic or screening tests are widely used in medical fields to classify patients according to their disease status. Several statistical models for meta-analysis of diagnostic test accuracy studies have been developed to synthesize test sensitivity and specificity of a diagnostic test of interest. Because of the correlation between test sensitivity and specificity, modeling the two measures using a bivariate model is recommended. In this paper, we extend the current standard bivariate linear mixed model (LMM) by proposing two variance-stabilizing transformations: the arcsine square root and the Freeman-Tukey double arcsine transformation. We compared the performance of the proposed methods with the standard method through simulations using several performance measures. The simulation results showed that our proposed methods performed better than the standard LMM in terms of bias, root mean square error, and coverage probability in most of the scenarios, even when data were generated assuming the standard LMM. We also illustrated the methods using two real data sets. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
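The two variance-stabilizing transformations proposed above have standard closed forms, sketched below for per-study sensitivity counts; the counts are hypothetical, and the back-transformation and the mixed-model fitting itself are omitted.

```python
import numpy as np

def arcsine_sqrt(events, n):
    """Arcsine square-root transform of a proportion (e.g. sensitivity = TP / (TP + FN))."""
    return np.arcsin(np.sqrt(events / n))

def freeman_tukey(events, n):
    """Freeman-Tukey double-arcsine transform, which also stabilizes small counts."""
    return 0.5 * (np.arcsin(np.sqrt(events / (n + 1))) +
                  np.arcsin(np.sqrt((events + 1) / (n + 1))))

# Hypothetical per-study counts: true positives out of diseased subjects.
tp = np.array([45, 30, 12, 60])
n_diseased = np.array([50, 40, 15, 70])
print("arcsine sqrt :", np.round(arcsine_sqrt(tp, n_diseased), 3))
print("Freeman-Tukey:", np.round(freeman_tukey(tp, n_diseased), 3))
```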
ASSESSING AND COMBINING RELIABILITY OF PROTEIN INTERACTION SOURCES
LEACH, SONIA; GABOW, AARON; HUNTER, LAWRENCE; GOLDBERG, DEBRA S.
2008-01-01
Integrating diverse sources of interaction information to create protein networks requires strategies sensitive to differences in accuracy and coverage of each source. Previous integration approaches calculate reliabilities of protein interaction information sources based on congruity to a designated ‘gold standard.’ In this paper, we provide a comparison of the two most popular existing approaches and propose a novel alternative for assessing reliabilities which does not require a gold standard. We identify a new method for combining the resultant reliabilities and compare it against an existing method. Further, we propose an extrinsic approach to evaluation of reliability estimates, considering their influence on the downstream tasks of inferring protein function and learning regulatory networks from expression data. Results using this evaluation method show 1) our method for reliability estimation is an attractive alternative to those requiring a gold standard and 2) the new method for combining reliabilities is less sensitive to noise in reliability assignments than the similar existing technique. PMID:17990508
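The abstract does not spell out its combination rule, so purely as an illustration of combining per-source reliabilities, the sketch below uses a noisy-OR style rule that assumes independent sources; this is a common convention in interaction-network scoring, not necessarily the method proposed in the paper.

```python
from functools import reduce

def combine_noisy_or(reliabilities):
    """Combine per-source reliabilities assuming independent errors (noisy-OR rule)."""
    return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), reliabilities, 1.0)

# Hypothetical reliabilities of three sources reporting the same interaction.
sources = [0.6, 0.4, 0.3]
print(f"combined confidence: {combine_noisy_or(sources):.3f}")  # 1 - 0.4*0.6*0.7 = 0.832
```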
Conventionalism and Methodological Standards in Contending with Skepticism about Uncertainty
NASA Astrophysics Data System (ADS)
Brumble, K. C.
2012-12-01
What it means to measure and interpret confidence and uncertainty in a result is often particular to a specific scientific community and its methodology of verification. Additionally, methodology in the sciences varies greatly across disciplines and scientific communities. Understanding the accuracy of predictions of a particular science thus depends largely upon having an intimate working knowledge of the methods, standards, and conventions utilized and underpinning discoveries in that scientific field. Thus, valid criticism of scientific predictions and discoveries must be conducted by those who are literate in the field in question: they must have intimate working knowledge of the methods of the particular community and of the particular research under question. The interpretation and acceptance of uncertainty is one such shared, community-based convention. In the philosophy of science, this methodological and community-based way of understanding scientific work is referred to as conventionalism. By applying the conventionalism of historian and philosopher of science Thomas Kuhn to recent attacks upon methods of multi-proxy mean temperature reconstructions, I hope to illuminate how climate skeptics and their adherents fail to appreciate the need for community-based fluency in the methodological standards for understanding uncertainty shared by the wider climate science community. Further, I will flesh out a picture of climate science community standards of evidence and statistical argument following the work of philosopher of science Helen Longino. I will describe how failure to appreciate the conventions of professionalism and standards of evidence accepted in the climate science community results in the application of naïve falsification criteria. Appeal to naïve falsification in turn has allowed scientists outside the standards and conventions of the mainstream climate science community to consider themselves and to be judged by climate skeptics as valid critics of particular statistical reconstructions with naïve and misapplied methodological criticism. Examples will include the skeptical responses to multi-proxy mean temperature reconstructions and congressional hearings criticizing the work of Michael Mann et al.'s Hockey Stick.
An efficient auto TPT stitch guidance generation for optimized standard cell design
NASA Astrophysics Data System (ADS)
Samboju, Nagaraj C.; Choi, Soo-Han; Arikati, Srini; Cilingir, Erdem
2015-03-01
As the technology continues to shrink below 14nm, triple patterning lithography (TPT) is a worthwhile lithography methodology for printing dense layers such as Metal1. However, this increases the complexity of standard cell design, as it is very difficult to develop a TPT-compliant layout without compromising on area. Hence, this emphasizes the importance of having an accurate stitch generation methodology to meet the standard cell area requirement defined by the technology shrink factor. In this paper, we present an efficient auto TPT stitch guidance generation technique for optimized standard cell design. The basic idea is to first identify the conflicting polygons based on the Fix Guidance [1] solution developed by Synopsys. Fix Guidance is a reduced sub-graph containing the minimum set of edges along with the connecting polygons; by eliminating these edges in a design, 3-color conflicts can be resolved. Once the conflicting polygons are identified using this method, they are categorized into four types [2] (Type 1 to 4). The categorization is based on the number of interactions a polygon has with the coloring links and the triangle loops of the fix guidance. For each type, a keep-out region criterion is defined, based on which the final stitch guidance locations are generated. This technique provides the various possible stitch locations to the user and helps the user select the best stitch location considering both design flexibility (maximum pin access/small area) and process preferences. Based on this technique, a standard cell library for place and route (P and R) can be developed with colorless data and stitch markers defined by the designer using our proposed method. After P and R, the full chip (block) would contain only the colorless data and the standard cell stitch markers. These stitch markers are considered "must be stitch" candidates. Hence, during full-chip decomposition it is not necessary to generate and select stitch markers again for the complete data; therefore, the proposed method reduces the decomposition time significantly.
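Underlying the TPT conflict identification is the question of whether a layout's conflict graph admits a 3-coloring (one color per mask). The sketch below brute-forces that check on a tiny hypothetical conflict graph; a K4 of mutually interacting polygons cannot be 3-colored, so at least one polygon needs a stitch. The graph and the helper are illustrative only and unrelated to the Fix Guidance implementation.

```python
from itertools import product

def is_three_colorable(nodes, edges):
    """Brute-force check whether a small conflict graph admits a 3-coloring (3 masks)."""
    for assignment in product(range(3), repeat=len(nodes)):
        color = dict(zip(nodes, assignment))
        if all(color[u] != color[v] for u, v in edges):
            return True
    return False

# Hypothetical Metal1 conflict graph: four mutually spaced polygons (K4) cannot be
# assigned to three masks, so at least one polygon must be split with a stitch.
nodes = ["p1", "p2", "p3", "p4"]
edges = [("p1", "p2"), ("p1", "p3"), ("p1", "p4"),
         ("p2", "p3"), ("p2", "p4"), ("p3", "p4")]
print("3-colorable without stitches:", is_three_colorable(nodes, edges))  # False
```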
An evolutionary algorithm that constructs recurrent neural networks.
Angeline, P J; Saunders, G M; Pollack, J B
1994-01-01
Standard methods for simultaneously inducing the structure and weights of recurrent neural networks limit every task to an assumed class of architectures. Such a simplification is necessary since the interactions between network structure and function are not well understood. Evolutionary computations, which include genetic algorithms and evolutionary programming, are population-based search methods that have shown promise in many similarly complex tasks. This paper argues that genetic algorithms are inappropriate for network acquisition and describes an evolutionary program, called GNARL, that simultaneously acquires both the structure and weights for recurrent networks. GNARL's empirical acquisition method allows for the emergence of complex behaviors and topologies that are potentially excluded by the artificial architectural constraints imposed in standard network induction methods.
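In the spirit of the evolutionary program described (simultaneous structural and parametric mutation of recurrent networks), here is a toy sketch that evolves a small recurrent network on a made-up sequence task. The population size, mutation rates, task, and representation are all invented; this is not the GNARL algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(42)
N_HIDDEN, POP, GENS = 4, 20, 50
inputs     = np.array([1.0, 0.0, 1.0, 0.0])           # toy input sequence (hypothetical)
target_seq = np.array([0.2, 0.4, 0.6, 0.8])           # toy target outputs (hypothetical)

def new_individual():
    return {"W": rng.normal(0, 0.5, (N_HIDDEN, N_HIDDEN)),   # recurrent weights
            "w_in": rng.normal(0, 0.5, N_HIDDEN),
            "w_out": rng.normal(0, 0.5, N_HIDDEN),
            "mask": rng.random((N_HIDDEN, N_HIDDEN)) < 0.5}   # structure: which links exist

def fitness(ind):
    h, err = np.zeros(N_HIDDEN), 0.0
    for x, t in zip(inputs, target_seq):
        h = np.tanh((ind["W"] * ind["mask"]) @ h + ind["w_in"] * x)
        err += (ind["w_out"] @ h - t) ** 2
    return -err                                       # higher is better

def mutate(ind):
    child = {k: v.copy() for k, v in ind.items()}
    child["W"] += rng.normal(0, 0.1, child["W"].shape)        # parametric mutation
    if rng.random() < 0.3:                                    # structural mutation: toggle a link
        i, j = rng.integers(N_HIDDEN, size=2)
        child["mask"][i, j] = ~child["mask"][i, j]
    return child

population = [new_individual() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    population = population[:POP // 2] + [mutate(p) for p in population[:POP // 2]]

best = max(population, key=fitness)
print(f"best fitness after evolution: {fitness(best):.4f}")
```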
Several key issues on using 137Cs method for soil erosion estimation
USDA-ARS?s Scientific Manuscript database
This work was to examine several key issues of using the cesium-137 method to estimate soil erosion rates in order to improve and standardize the method. Based on the comprehensive review and synthesis of a large body of published literature and the author’s extensive research experience, several k...
A Comparison of Imputation Methods for Bayesian Factor Analysis Models
ERIC Educational Resources Information Center
Merkle, Edgar C.
2011-01-01
Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amenable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…
Valjevac, Salih; Ridjanovic, Zoran; Masic, Izet
2009-01-01
Introduction: The Agency for Healthcare Quality and Accreditation in the Federation of Bosnia and Herzegovina (AKAZ) is the authorized body in the field of healthcare quality and safety improvement and accreditation of healthcare institutions. Besides accreditation standards for hospitals and primary health care centers, AKAZ has also developed accreditation standards for family medicine teams. Methods: Software development was primarily based on the Accreditation Standards for Family Medicine Teams. Seven chapters/topics (1. Physical factors; 2. Equipment; 3. Organization and management; 4. Health promotion and illness prevention; 5. Clinical services; 6. Patient survey; and 7. Patient's rights and obligations) contain 35 standards describing the expected level of a family medicine team's quality. Based on the structure of the accreditation standards and the needs of different potential users, it was concluded that the software backbone should be a database containing all accreditation standards together with self-assessment and external assessment details. In this article we present the development of standardized software for self- and external evaluation of quality of service in family medicine, as well as plans for the future development of this software package. Conclusion: Electronic data gathering and storage enhance the management, access and overall use of information. During this project we concluded that software for self-assessment and external assessment is ideal for distributing the accreditation standards, for their review by family medicine team members, and for their self-assessment and external assessment. PMID:24109157
Kaufman, David R; Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark
2016-10-28
The process of documentation in electronic health records (EHRs) is known to be time consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)-enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user's experience. The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods ("protocols") of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. A total of 118 notes were documented across the 3 subject areas. The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. ©David R. Kaufman, Barbara Sheehan, Peter Stetson, Ashish R. Bhatt, Adele I. Field, Chirag Patel, James Mark Maisel. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 28.10.2016.
Marchetti, Bárbara V; Candotti, Cláudia T; Raupp, Eduardo G; Oliveira, Eduardo B C; Furlanetto, Tássia S; Loss, Jefferson F
The purpose of this study was to assess a radiographic method for spinal curvature evaluation in children, based on spinous processes, and identify its normality limits. The sample consisted of 90 radiographic examinations of the spines of children in the sagittal plane. Thoracic and lumbar curvatures were evaluated using angular (apex angle [AA]) and linear (sagittal arrow [SA]) measurements based on the spinous processes. The same curvatures were also evaluated using the Cobb angle (CA) method, which is considered the gold standard. For concurrent validity (AA vs CA), Pearson's product-moment correlation coefficient, root-mean-square error, Pitman-Morgan test, and Bland-Altman analysis were used. For reproducibility (AA, SA, and CA), the intraclass correlation coefficient, standard error of measurement, and minimal detectable change measurements were used. A significant correlation was found between CA and AA measurements, as was a low root-mean-square error. The mean difference between the measurements was 0° for thoracic and lumbar curvatures, and the mean standard deviations of the differences were ±5.9° and ±6.9°, respectively. The intraclass correlation coefficients of AA and SA were similar to or higher than those of the gold standard (CA). The standard error of measurement and minimal detectable change of the AA were always lower than those of the CA. This study determined the concurrent validity, as well as intra- and interrater reproducibility, of the radiographic measurements of kyphosis and lordosis in children. Copyright © 2017. Published by Elsevier Inc.
An objectively-analyzed method for measuring the useful penetration of x-ray imaging systems.
Glover, Jack L; Hudson, Lawrence T
2016-06-01
The ability to detect wires is an important capability of the cabinet x-ray imaging systems that are used in aviation security as well as the portable x-ray systems that are used by domestic law enforcement and military bomb squads. A number of national and international standards describe methods for testing this capability using the so-called useful penetration test metric, where wires are imaged behind different thicknesses of blocking material. Presently, these tests are scored based on human judgments of wire visibility, which are inherently subjective. We propose a new method in which the useful penetration capabilities of an x-ray system are objectively evaluated by an image processing algorithm operating on digital images of a standard test object. The algorithm advantageously applies the Radon transform for curve parameter detection that reduces the problem of wire detection from two dimensions to one. The sensitivity of the wire detection method is adjustable and we demonstrate how the threshold parameter can be set to give agreement with human-judged results. The method was developed to be used in technical performance standards and is currently under ballot for inclusion in a US national aviation security standard.
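The Radon-transform step lends itself to a compact illustration. The sketch below (Python, scikit-image) is a hypothetical reduction of the idea: a straight wire maps to a localized peak in the sinogram, and a simple peak-to-background score with an arbitrary threshold stands in for the calibrated, human-agreement-tuned threshold described in the abstract.

```python
# Minimal sketch of Radon-based line (wire) detection on a 2D image in which a
# wire appears as a faint straight line; not the authors' standardized algorithm.
import numpy as np
from skimage.transform import radon

def wire_detected(image, threshold=5.0):
    """Return (detected, score): a straight wire integrates to a localized peak
    in the sinogram, so peak height above the background is a detection score."""
    theta = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(image, theta=theta, circle=False)   # 2D image -> sinogram
    background = np.median(sinogram)
    noise = np.std(sinogram)
    score = (sinogram.max() - background) / (noise + 1e-12)
    return score > threshold, score

# Example: a synthetic 128x128 noisy image containing one faint horizontal wire.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (128, 128))
img[64, :] += 3.0                                        # the "wire"
print(wire_detected(img))
```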
An objectively-analyzed method for measuring the useful penetration of x-ray imaging systems
NASA Astrophysics Data System (ADS)
Glover, Jack L.; Hudson, Lawrence T.
2016-06-01
The ability to detect wires is an important capability of the cabinet x-ray imaging systems that are used in aviation security as well as the portable x-ray systems that are used by domestic law enforcement and military bomb squads. A number of national and international standards describe methods for testing this capability using the so-called useful penetration test metric, where wires are imaged behind different thicknesses of blocking material. Presently, these tests are scored based on human judgments of wire visibility, which are inherently subjective. We propose a new method in which the useful penetration capabilities of an x-ray system are objectively evaluated by an image processing algorithm operating on digital images of a standard test object. The algorithm advantageously applies the Radon transform for curve parameter detection that reduces the problem of wire detection from two dimensions to one. The sensitivity of the wire detection method is adjustable and we demonstrate how the threshold parameter can be set to give agreement with human-judged results. The method was developed to be used in technical performance standards and is currently under ballot for inclusion in an international aviation security standard.
Synthesis, characterization and biological studies of Schiff bases derived from heterocyclic moiety.
Shanty, Angamaly Antony; Philip, Jessica Elizabeth; Sneha, Eeettinilkunnathil Jose; Prathapachandra Kurup, Maliyeckal R; Balachandran, Sreedharannair; Mohanan, Puzhavoorparambil Velayudhan
2017-02-01
Some new Schiff bases (H1-H7) have been synthesized by the condensation of 2-aminophenol, 2-amino-4-nitrophenol, 2-amino-4-methylphenol, and 2-aminobenzimidazole with thiophene-2-carboxaldehyde and pyrrole-2-carboxaldehyde. The structures of the newly synthesized compounds were characterized by elemental analysis, FT-IR, ¹H NMR, UV-VIS, and single-crystal X-ray crystallography. The in vitro antibacterial activity of the synthesized compounds has been tested against Salmonella typhi, Bacillus coagulans, Bacillus pumilus, Escherichia coli, Bacillus circulans, Pseudomonas, Clostridium and Klebsiella pneumoniae by the disk diffusion method. The quantitative antimicrobial activity of the test compounds was evaluated using the resazurin-based microtiter dilution assay. Ampicillin was used as the standard antibiotic. The Schiff bases individually exhibited varying degrees of inhibitory effects on the growth of the tested bacterial species. The antioxidant activity of the synthesized compounds was determined by the 1,1-diphenyl-2-picrylhydrazyl (DPPH) method. IC50 values of the synthesized Schiff bases were calculated and compared with the standard BHA. Copyright © 2016 Elsevier Inc. All rights reserved.
Lin, Yuehe; Bennett, Wendy D.; Timchalk, Charles; Thrall, Karla D.
2004-03-02
Microanalytical systems based on a microfluidics/electrochemical detection scheme are described. Individual modules, such as microfabricated piezoelectrically actuated pumps and a microelectrochemical cell, were integrated onto portable platforms. This allowed rapid change-out and repair of individual components by incorporating "plug and play" concepts now standard in PCs. Different integration schemes were used for construction of the microanalytical systems based on microfluidics/electrochemical detection. In one scheme, all individual modules were integrated on the surface of the standard microfluidic platform based on a plug-and-play design. A microelectrochemical flow cell, which integrated three electrodes in a wall-jet design, was fabricated on a polymer substrate and then plugged directly into the microfluidic platform. Another integration scheme was based on a multilayer lamination method utilizing stacked modules with different functionality to achieve a compact microanalytical device. Application of the microanalytical system for detection of lead in, for example, river water and saliva samples using stripping voltammetry is described.
Predictive Methods for Dense Polymer Networks: Combating Bias with Bio-Based Structures
2016-03-16
Informatics Tools. Acknowledgements: Air Force Office of Scientific Research, Air Force Research Laboratory, Office of Naval Research, Strategic... Sources and Methods: Bio-based cyanate esters have been made from anethole, resveratrol, eugenol, cresol, lignin, vanillin, and even creosote oils. ...not large by informatics standards, it nonetheless represents a significant amount of synthetic effort. Because the data is limited, minimizing...
Creating More Seamless Connections between University-Based Coursework and School-Based Mentoring
ERIC Educational Resources Information Center
Schuster, Dwight
2014-01-01
New accreditation standards for teacher preparation programs call for more seamless and effective connections between methods courses, clinical experiences, and school-based mentoring. Intentional clinical experiences and intermediate instructional strategies can foster collaboration between teacher preparation programs and teacher leaders in K-12…
24 CFR 35.1355 - Ongoing lead-based paint maintenance and reevaluation activities.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities... 24 Housing and Urban Development 1 2012-04-01 2012-04-01 false Ongoing lead-based paint...
24 CFR 35.1355 - Ongoing lead-based paint maintenance and reevaluation activities.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities... 24 Housing and Urban Development 1 2013-04-01 2013-04-01 false Ongoing lead-based paint...
24 CFR 35.1355 - Ongoing lead-based paint maintenance and reevaluation activities.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Ongoing lead-based paint...
24 CFR 35.1355 - Ongoing lead-based paint maintenance and reevaluation activities.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard Evaluation and Hazard Reduction Activities... 24 Housing and Urban Development 1 2014-04-01 2014-04-01 false Ongoing lead-based paint...
WEIGHT OF EVIDENCE IN ECOLOGICAL ASSESSMENT
This document provides guidance on methods for weighing ecological evidence using a standard framework consisting of three steps: assemble evidence, weight evidence, and weigh the body of evidence. Use of the methods will improve the consistency and reliability of WoE-based asse...
Rapid assessment of urban wetlands: Do hydrogeomorphic classification and reference criteria work?
The Hydrogeomorphic (HGM) functional assessment method is predicated on the ability of a wetland classification method based on hydrology (HGM classification) and a visual assessment of disturbance and alteration to provide reference standards against which functions in individua...
NASA Astrophysics Data System (ADS)
Přibil, Jiří; Přibilová, Anna; Frollo, Ivan
2017-12-01
The paper focuses on two methods for evaluating the success of enhancement of speech signals recorded in an open-air magnetic resonance imager during phonation for 3D human vocal tract modeling. The first approach enables a comparison based on statistical analysis by ANOVA and hypothesis tests. The second method is based on classification by Gaussian mixture models (GMM). The performed experiments have confirmed that the proposed ANOVA and GMM classifiers for automatic evaluation of speech quality are functional and produce results fully comparable with the standard evaluation based on the listening test method.
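The GMM-based evaluation can be illustrated with a small, hypothetical sketch: one Gaussian mixture model is fitted per class of speech features, and a recording is assigned to whichever model gives the higher total log-likelihood. The feature values, model sizes, and class labels below are placeholders, not the paper's setup.

```python
# Hedged sketch: two-class GMM classifier for "enhanced" vs "noisy" speech feature
# frames (e.g., MFCC vectors); feature extraction itself is assumed to exist.
import numpy as np
from sklearn.mixture import GaussianMixture

def train_gmms(feats_enhanced, feats_noisy, n_components=8, seed=0):
    gmm_e = GaussianMixture(n_components=n_components, covariance_type="diag",
                            random_state=seed).fit(feats_enhanced)
    gmm_n = GaussianMixture(n_components=n_components, covariance_type="diag",
                            random_state=seed).fit(feats_noisy)
    return gmm_e, gmm_n

def classify(frames, gmm_e, gmm_n):
    # Sum of per-frame log-likelihoods under each class model; the higher wins.
    ll_e = gmm_e.score_samples(frames).sum()
    ll_n = gmm_n.score_samples(frames).sum()
    return "enhanced" if ll_e > ll_n else "noisy"

# Toy example with random 13-dimensional "MFCC" frames.
rng = np.random.default_rng(1)
gmm_e, gmm_n = train_gmms(rng.normal(0, 1, (500, 13)), rng.normal(0.5, 1, (500, 13)))
print(classify(rng.normal(0, 1, (200, 13)), gmm_e, gmm_n))
```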
NASA Astrophysics Data System (ADS)
Filatov, Michael; Cremer, Dieter
2002-01-01
A recently developed variationally stable quasi-relativistic method, which is based on the low-order approximation to the method of normalized elimination of the small component, was incorporated into density functional theory (DFT). The new method was tested for diatomic molecules involving Ag, Cd, Au, and Hg by calculating equilibrium bond lengths, vibrational frequencies, and dissociation energies. The method is easy to implement into standard quantum chemical programs and leads to accurate results for the benchmark systems studied.
Time delayed Ensemble Nudging Method
NASA Astrophysics Data System (ADS)
An, Zhe; Abarbanel, Henry
Optimal nudging based on time-delayed embedding theory has shown potential for analysis and data assimilation in previous literature. To extend its application and promote practical implementation, a new nudging assimilation method based on the time-delayed embedding space is presented, and its connection with other standard assimilation methods is studied. Results show that incorporating information from the time series of data can reduce the number of observations needed to preserve the quality of numerical prediction, making the method a potential alternative for data assimilation in large geophysical models.
Tensor-GMRES method for large sparse systems of nonlinear equations
NASA Technical Reports Server (NTRS)
Feng, Dan; Pulliam, Thomas H.
1994-01-01
This paper introduces a tensor-Krylov method, the tensor-GMRES method, for large sparse systems of nonlinear equations. This method is a coupling of tensor model formation and solution techniques for nonlinear equations with Krylov subspace projection techniques for unsymmetric systems of linear equations. Traditional tensor methods for nonlinear equations are based on a quadratic model of the nonlinear function, a standard linear model augmented by a simple second order term. These methods are shown to be significantly more efficient than standard methods both on nonsingular problems and on problems where the Jacobian matrix at the solution is singular. A major disadvantage of the traditional tensor methods is that the solution of the tensor model requires the factorization of the Jacobian matrix, which may not be suitable for problems where the Jacobian matrix is large and has a 'bad' sparsity structure for an efficient factorization. We overcome this difficulty by forming and solving the tensor model using an extension of a Newton-GMRES scheme. Like traditional tensor methods, we show that the new tensor method has significant computational advantages over the analogous Newton counterpart. Consistent with Krylov subspace based methods, the new tensor method does not depend on the factorization of the Jacobian matrix. As a matter of fact, the Jacobian matrix is never needed explicitly.
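The Newton-GMRES baseline that the tensor variant extends is available off the shelf; the sketch below uses SciPy's newton_krylov on a small nonlinear system. It illustrates only the standard method, not the tensor model or its second-order term.

```python
# Standard (non-tensor) Newton-Krylov/GMRES sketch using SciPy; the paper's
# tensor-GMRES method adds a second-order term on top of this kind of solver.
import numpy as np
from scipy.optimize import newton_krylov

def residual(u):
    # Bratu-type toy problem: u'' + exp(u) = 0 on (0, 1) with u(0) = u(1) = 0,
    # discretized by central differences on a uniform grid.
    n = u.size
    h2 = 1.0 / (n + 1) ** 2
    up = np.concatenate(([0.0], u, [0.0]))     # Dirichlet boundary values
    return (up[2:] - 2.0 * up[1:-1] + up[:-2]) / h2 + np.exp(u)

u0 = np.zeros(50)
sol = newton_krylov(residual, u0, method="gmres")
print("residual norm:", np.linalg.norm(residual(sol)))
```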
SU-E-J-221: A Novel Expansion Method for MRI Based Target Delineation in Prostate Radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruiz, B; East Carolina University, Greenville, NC; Feng, Y
Purpose: To compare a novel bladder/rectum carveout expansion method on MRI-delineated prostate to standard CT and expansion-based methods for maintaining prostate coverage while providing superior bladder and rectal sparing. Methods: Ten prostate cases were planned to include four trials: MRI vs CT delineated prostate/proximal seminal vesicles, and each image modality compared to both standard expansions (8mm 3D expansion and 5mm posterior, i.e. ∼8mm) and carveout method expansions (5mm 3D expansion, 4mm posterior for GTV-CTV excluding expansion into bladder/rectum followed by additional 5mm 3D expansion to PTV, i.e. ∼1cm). All trials were planned to a total dose of 7920 cGy via IMRT. Evaluation and comparison were made using the following criteria: QUANTEC constraints for bladder/rectum including analysis of low dose regions, changes in PTV volume, total control points, and maximum hot spot. Results: The ∼8mm MRI expansion consistently produced the most optimal plan with the lowest total control points and best bladder/rectum sparing. However, this scheme had the smallest prostate (average 22.9% reduction) and subsequent PTV volume, consistent with prior literature. ∼1cm MRI had an average PTV volume comparable to ∼8mm CT at 3.79% difference. Bladder QUANTEC constraints were on average less for the ∼1cm MRI as compared to the ∼8mm CT and observed as statistically significant with 2.64% reduction in V65. Rectal constraints appeared to follow the same trend. Case-by-case analysis showed variation in rectal V30 with MRI-delineated prostate being most favorable regardless of expansion type. ∼1cm MRI and ∼8mm CT had comparable plan quality. Conclusion: MRI-delineated prostate with standard expansions had the smallest PTV, leading to margins that may be too tight. The bladder/rectum carveout expansion method on MRI-delineated prostate was found to be superior to standard CT-based methods in terms of bladder and rectal sparing while maintaining prostate coverage. Continued investigation is warranted for further validation.
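A carveout-style expansion can be approximated on binary masks with simple morphology. The sketch below is a hypothetical, isotropic-margin simplification (the abstract's margins are anisotropic and planning-system based), using scipy.ndimage and arbitrary structure geometries.

```python
# Hedged sketch of a "carveout" margin expansion on binary masks (1 mm isotropic
# voxels assumed for simplicity); not the planning-system implementation above.
import numpy as np
from scipy import ndimage

def expand(mask, margin_vox):
    """Isotropic binary expansion by `margin_vox` voxels."""
    return ndimage.binary_dilation(mask, iterations=margin_vox)

def carveout_ptv(gtv, bladder, rectum, ctv_margin=5, ptv_margin=5):
    # GTV -> CTV: expand, but carve out any overlap with bladder/rectum.
    ctv = expand(gtv, ctv_margin) & ~(bladder | rectum)
    ctv |= gtv                              # never carve away the GTV itself
    # CTV -> PTV: plain isotropic setup-margin expansion.
    return expand(ctv, ptv_margin)

# Toy 3D example with spherical structures.
zz, yy, xx = np.ogrid[:60, :60, :60]
gtv = (zz - 30) ** 2 + (yy - 30) ** 2 + (xx - 30) ** 2 < 8 ** 2
bladder = (zz - 30) ** 2 + (yy - 14) ** 2 + (xx - 30) ** 2 < 12 ** 2
rectum = (zz - 30) ** 2 + (yy - 46) ** 2 + (xx - 30) ** 2 < 10 ** 2
print("PTV voxels:", int(carveout_ptv(gtv, bladder, rectum).sum()))
```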
A Kernel-free Boundary Integral Method for Elliptic Boundary Value Problems ⋆
Ying, Wenjun; Henriquez, Craig S.
2013-01-01
This paper presents a class of kernel-free boundary integral (KFBI) methods for general elliptic boundary value problems (BVPs). The boundary integral equations reformulated from the BVPs are solved iteratively with the GMRES method. During the iteration, the boundary and volume integrals involving Green's functions are approximated by structured grid-based numerical solutions, which avoids the need to know the analytical expressions of Green's functions. The KFBI method assumes that the larger regular domain, which embeds the original complex domain, can be easily partitioned into a hierarchy of structured grids so that fast elliptic solvers such as the fast Fourier transform (FFT) based Poisson/Helmholtz solvers or those based on geometric multigrid iterations are applicable. The structured grid-based solutions are obtained with standard finite difference method (FDM) or finite element method (FEM), where the right hand side of the resulting linear system is appropriately modified at irregular grid nodes to recover the formal accuracy of the underlying numerical scheme. Numerical results demonstrating the efficiency and accuracy of the KFBI methods are presented. It is observed that the number of GMRES iterations used by the method for solving isotropic and moderately anisotropic BVPs is independent of the sizes of the grids that are employed to approximate the boundary and volume integrals. With the standard second-order FEMs and FDMs, the KFBI method shows a second-order convergence rate in accuracy for all of the tested Dirichlet/Neumann BVPs when the anisotropy of the diffusion tensor is not too strong. PMID:23519600
Study on the criteria for assessing skull-face correspondence in craniofacial superimposition.
Ibáñez, Oscar; Valsecchi, Andrea; Cavalli, Fabio; Huete, María Isabel; Campomanes-Alvarez, Blanca Rosario; Campomanes-Alvarez, Carmen; Vicente, Ricardo; Navega, David; Ross, Ann; Wilkinson, Caroline; Jankauskas, Rimantas; Imaizumi, Kazuhiko; Hardiman, Rita; Jayaprakash, Paul Thomas; Ruiz, Elena; Molinero, Francisco; Lestón, Patricio; Veselovskaya, Elizaveta; Abramov, Alexey; Steyn, Maryna; Cardoso, Joao; Humpire, Daniel; Lusnig, Luca; Gibelli, Daniele; Mazzarelli, Debora; Gaudio, Daniel; Collini, Federica; Damas, Sergio
2016-11-01
Craniofacial superimposition has the potential to be used as an identification method when other traditional biological techniques are not applicable due to insufficient quality or absence of ante-mortem and post-mortem data. Despite having been used in many countries as a method of inclusion and exclusion for over a century, it lacks standards. Thus, the purpose of this research is to provide forensic practitioners with standard criteria for analysing skull-face relationships. Thirty-seven experts from 16 different institutions participated in this study, which consisted of evaluating 65 criteria for assessing skull-face anatomical consistency on a sample of 24 different skull-face superimpositions. An unbiased statistical analysis established the most objective and discriminative criteria. Results did not show strong associations; however, important insights to address the lack of standards were provided. In addition, a novel methodology for understanding and standardizing identification methods based on the observation of morphological patterns has been proposed. Crown Copyright © 2016. Published by Elsevier Ireland Ltd. All rights reserved.
Study on Quality Standard of Processed Curcuma Longa Radix
Zhao, Yongfeng; Quan, Liang; Zhou, Haiting; Cao, Dong; Li, Wenbing; Yang, Zhuo
2017-01-01
To control the quality of Curcuma Longa Radix by establishing quality standards, this paper added determinations of extract content and volatile oil content. Meanwhile, curcumin was selected as the internal marker, and the relative correlation factors (RCFs) of demethoxycurcumin and bisdemethoxycurcumin were established by high performance liquid chromatography (HPLC). The contents of the multiple components were calculated based on their RCFs. The rationality and feasibility of the methods were evaluated by comparing the quantitative results of the external standard method (ESM) and quantitative analysis of multicomponents by single marker (QAMS). Ethanol extracts ranged from 9.749 to 15.644%, and the mean value was 13.473%. The volatile oil ranged from 0.45 to 0.90 mL/100 g, and the mean value was 0.66 mL/100 g. This method was accurate and feasible and could provide a reference for further comprehensive and effective control of the quality standard of Curcuma Longa Radix and its processed products. PMID:29375640
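The QAMS arithmetic reduces to scaling the marker's calibration by a relative correlation factor. The sketch below uses one common form of that relation with entirely illustrative peak areas and RCF values, not the paper's measured ones.

```python
# Sketch of quantitative analysis of multi-components by single marker (QAMS):
# an analyte's content is estimated from the internal marker's calibration and a
# relative correlation factor (RCF). All numbers below are illustrative only.
def qams_content(area_analyte, area_marker, conc_marker, rcf):
    """Estimated analyte concentration: C_k = A_k / (RCF_k * A_marker) * C_marker."""
    return area_analyte / (rcf * area_marker) * conc_marker

# Curcumin is the marker; the two curcuminoid analytes use hypothetical RCFs.
area_marker, conc_marker = 152_000, 40.0      # peak area, ug/mL (hypothetical)
rcf = {"demethoxycurcumin": 0.92, "bisdemethoxycurcumin": 0.85}
areas = {"demethoxycurcumin": 61_000, "bisdemethoxycurcumin": 28_000}

for name, area in areas.items():
    print(name, round(qams_content(area, area_marker, conc_marker, rcf[name]), 2), "ug/mL")
```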
NASA Astrophysics Data System (ADS)
Ratnadewi; Pramono Adhie, Roy; Hutama, Yonatan; Saleh Ahmar, A.; Setiawan, M. I.
2018-01-01
Cryptography is a method used to create secure communication by manipulating messages as they are sent so that only the intended party can know their content. Two of the most commonly used cryptography methods to protect sent messages, especially in the form of text, are the DES and 3DES cryptography methods. This research explains the DES and 3DES cryptography methods and their use for securing data stored in smart cards that work in an NFC-based communication system. The topics covered in this research are the workings of the DES and 3DES cryptography methods in protecting data, and the software engineering, through the creation of an application in the C++ programming language, used to realize and test the performance of the DES and 3DES cryptography methods when writing encrypted data to smart cards and reading and decrypting data from smart cards. The execution time of writing data to and reading data from the smart card using the DES cryptography method is faster than using the 3DES cryptography method.
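As an illustration of the encrypt/decrypt round trip such an application performs before writing to and after reading from the card, the sketch below uses Python's pycryptodome package rather than the study's C++ implementation; the keys, mode, and message are placeholders.

```python
# Hedged illustration of DES vs 3DES encryption of a short text record
# (pycryptodome); ECB mode and random keys are used purely for demonstration,
# and the actual NFC card I/O from the study is not shown.
from Crypto.Cipher import DES, DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

message = b"Cardholder: 0001; balance: 125000"

# DES: 8-byte key, 8-byte block.
des_key = get_random_bytes(8)
des_ct = DES.new(des_key, DES.MODE_ECB).encrypt(pad(message, DES.block_size))
assert unpad(DES.new(des_key, DES.MODE_ECB).decrypt(des_ct), DES.block_size) == message

# 3DES (Triple DES): 24-byte key with adjusted parity, same 8-byte block.
tdes_key = DES3.adjust_key_parity(get_random_bytes(24))
tdes_ct = DES3.new(tdes_key, DES3.MODE_ECB).encrypt(pad(message, DES3.block_size))
assert unpad(DES3.new(tdes_key, DES3.MODE_ECB).decrypt(tdes_ct), DES3.block_size) == message

print("DES ciphertext bytes:", len(des_ct), "| 3DES ciphertext bytes:", len(tdes_ct))
```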
NASA Astrophysics Data System (ADS)
Herrington, Jason S.; Hays, Michael D.
2012-08-01
There is high demand for accurate and reliable airborne carbonyl measurement methods due to the human and environmental health impacts of carbonyls and their effects on atmospheric chemistry. Standardized 2,4-dinitrophenylhydrazine (DNPH)-based sampling methods are frequently applied for measuring gaseous carbonyls in the atmospheric environment. However, there are multiple shortcomings associated with these methods that detract from an accurate understanding of carbonyl-related exposure, health effects, and atmospheric chemistry. The purpose of this brief technical communication is to highlight these method challenges and their influence on national ambient monitoring networks, and to provide a logical path forward for accurate carbonyl measurement. This manuscript focuses on three specific carbonyl compounds of high toxicological interest: formaldehyde, acetaldehyde, and acrolein. Further method testing and development, the revision of standardized methods, and the plausibility of introducing novel technology for these carbonyls are considered elements of the path forward. The consolidation of this information is important because it seems clear that carbonyl data produced utilizing DNPH-based methods are being reported without acknowledgment of the method shortcomings or how best to address them.
Film-based delivery quality assurance for robotic radiosurgery: Commissioning and validation.
Blanck, Oliver; Masi, Laura; Damme, Marie-Christin; Hildebrandt, Guido; Dunst, Jürgen; Siebert, Frank-Andre; Poppinga, Daniela; Poppe, Björn
2015-07-01
Robotic radiosurgery demands comprehensive delivery quality assurance (DQA), but guidelines for commissioning of the DQA method is missing. We investigated the stability and sensitivity of our film-based DQA method with various test scenarios and routine patient plans. We also investigated the applicability of tight distance-to-agreement (DTA) Gamma-Index criteria. We used radiochromic films with multichannel film dosimetry and re-calibration and our analysis was performed in four steps: 1) Film-to-plan registration, 2) Standard Gamma-Index criteria evaluation (local-pixel-dose-difference ≤2%, distance-to-agreement ≤2 mm, pass-rate ≥90%), 3) Dose distribution shift until maximum pass-rate (Maxγ) was found (shift acceptance <1 mm), and 4) Final evaluation with tight DTA criteria (≤1 mm). Test scenarios consisted of purposefully introduced phantom misalignments, dose miscalibrations, and undelivered MU. Initial method evaluation was done on 30 clinical plans. Our method showed similar sensitivity compared to the standard End-2-End-Test and incorporated an estimate of global system offsets in the analysis. The simulated errors (phantom shifts, global robot misalignment, undelivered MU) were detected by our method while standard Gamma-Index criteria often did not reveal these deviations. Dose miscalibration was not detected by film alone, hence simultaneous ion-chamber measurement for film calibration is strongly recommended. 83% of the clinical patient plans were within our tight DTA tolerances. Our presented methods provide additional measurements and quality references for film-based DQA enabling more sensitive error detection. We provided various test scenarios for commissioning of robotic radiosurgery DQA and demonstrated the necessity to use tight DTA criteria. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
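The Gamma-Index evaluation step can be illustrated with a brute-force global gamma computation on 2D dose grids; the sketch below is an unoptimized, hypothetical example and not the authors' multichannel film workflow.

```python
# Minimal brute-force 2D gamma-index sketch (global dose normalization);
# illustrative only -- clinical film DQA uses calibrated, registered dose grids.
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dose_crit=0.02, dta_mm=2.0, cutoff=0.10):
    ny, nx = ref.shape
    yy, xx = np.mgrid[:ny, :nx]
    dmax = ref.max()
    dose_tol = dose_crit * dmax                       # global dose criterion
    search = int(np.ceil(2 * dta_mm / spacing_mm))    # limit the search window
    gammas = []
    for iy, ix in zip(*np.nonzero(ref > cutoff * dmax)):
        y0, y1 = max(0, iy - search), min(ny, iy + search + 1)
        x0, x1 = max(0, ix - search), min(nx, ix + search + 1)
        dist2 = ((yy[y0:y1, x0:x1] - iy) ** 2 + (xx[y0:y1, x0:x1] - ix) ** 2) * spacing_mm ** 2
        ddiff2 = (eval_[y0:y1, x0:x1] - ref[iy, ix]) ** 2
        gamma2 = dist2 / dta_mm ** 2 + ddiff2 / dose_tol ** 2
        gammas.append(np.sqrt(gamma2.min()))
    return 100.0 * np.mean(np.asarray(gammas) <= 1.0)

# Toy test: a Gaussian dose profile versus the same profile shifted by ~1 mm.
ref = np.exp(-((np.arange(101) - 50) ** 2) / 300.0)[None, :] * np.ones((101, 1))
shifted = np.roll(ref, 1, axis=1)
print("pass rate: %.1f %%" % gamma_pass_rate(ref, shifted, spacing_mm=1.0))
```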
Mentasti, Massimo; Tewolde, Rediat; Aslett, Martin; Harris, Simon R.; Afshar, Baharak; Underwood, Anthony; Harrison, Timothy G.
2016-01-01
Sequence-based typing (SBT), analogous to multilocus sequence typing (MLST), is the current “gold standard” typing method for investigation of legionellosis outbreaks caused by Legionella pneumophila. However, as common sequence types (STs) cause many infections, some investigations remain unresolved. In this study, various whole-genome sequencing (WGS)-based methods were evaluated according to published guidelines, including (i) a single nucleotide polymorphism (SNP)-based method, (ii) extended MLST using different numbers of genes, (iii) determination of gene presence or absence, and (iv) a kmer-based method. L. pneumophila serogroup 1 isolates (n = 106) from the standard “typing panel,” previously used by the European Society for Clinical Microbiology Study Group on Legionella Infections (ESGLI), were tested together with another 229 isolates. Over 98% of isolates were considered typeable using the SNP- and kmer-based methods. Percentages of isolates with complete extended MLST profiles ranged from 99.1% (50 genes) to 86.8% (1,455 genes), while only 41.5% produced a full profile with the gene presence/absence scheme. Replicates demonstrated that all methods offer 100% reproducibility. Indices of discrimination range from 0.972 (ribosomal MLST) to 0.999 (SNP based), and all values were higher than that achieved with SBT (0.940). Epidemiological concordance is generally inversely related to discriminatory power. We propose that an extended MLST scheme with ∼50 genes provides optimal epidemiological concordance while substantially improving the discrimination offered by SBT and can be used as part of a hierarchical typing scheme that should maintain backwards compatibility and increase discrimination where necessary. This analysis will be useful for the ESGLI to design a scheme that has the potential to become the new gold standard typing method for L. pneumophila. PMID:27280420
Adaptive Set-Based Methods for Association Testing.
Su, Yu-Chen; Gauderman, William James; Berhane, Kiros; Lewinger, Juan Pablo
2016-02-01
With a typical sample size of a few thousand subjects, a single genome-wide association study (GWAS) using traditional one single nucleotide polymorphism (SNP)-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. Although self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly "adapt" to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a least absolute shrinkage and selection operator (LASSO)-based test. © 2015 WILEY PERIODICALS, INC.
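The rank truncated product idea behind ARTP can be sketched compactly: for each candidate truncation point, combine the smallest p-values and rank the result against permutations, then take the best layer and assess it with a second level of permutation. The implementation below is a simplified, hypothetical version, not the authors' software.

```python
# Simplified ARTP-style set test. Per-permutation SNP p-values are assumed to be
# supplied by the caller (e.g., from permuting phenotypes and re-running tests).
import numpy as np

def rtp_stat(pvals, k):
    """Rank truncated product: -log of the product of the k smallest p-values."""
    return -np.sum(np.log(np.sort(pvals)[:k]))

def artp_pvalue(obs_pvals, perm_pvals, ks=(1, 5, 10)):
    """perm_pvals: (n_perm, n_snps) array of per-SNP p-values under permutation."""
    n_perm = perm_pvals.shape[0]
    obs_layer, perm_layer = [], []
    for k in ks:
        perm_stats = np.array([rtp_stat(row, k) for row in perm_pvals])
        obs_stat = rtp_stat(obs_pvals, k)
        # Layer p-value of the observed data, and of each permutation, at this k.
        obs_layer.append((np.sum(perm_stats >= obs_stat) + 1) / (n_perm + 1))
        perm_layer.append([np.mean(perm_stats >= s) for s in perm_stats])
    obs_min = min(obs_layer)                              # adaptive (best) layer
    perm_min = np.min(np.array(perm_layer), axis=0)       # each permutation's best layer
    return (np.sum(perm_min <= obs_min) + 1) / (n_perm + 1)

rng = np.random.default_rng(2)
obs = rng.uniform(size=20)
obs[:3] = [1e-4, 5e-4, 2e-3]                              # a few associated SNPs
perms = rng.uniform(size=(499, 20))
print("ARTP set p-value:", artp_pvalue(obs, perms))
```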
Computed tomography-based volumetric tool for standardized measurement of the maxillary sinus
Giacomini, Guilherme; Pavan, Ana Luiza Menegatti; Altemani, João Mauricio Carrasco; Duarte, Sergio Barbosa; Fortaleza, Carlos Magno Castelo Branco; Miranda, José Ricardo de Arruda
2018-01-01
Volume measurements of maxillary sinus may be useful to identify diseases affecting paranasal sinuses. However, literature shows a lack of consensus in studies measuring the volume. This may be attributable to different computed tomography data acquisition techniques, segmentation methods, focuses of investigation, among other reasons. Furthermore, methods for volumetrically quantifying the maxillary sinus are commonly manual or semiautomated, which require substantial user expertise and are time-consuming. The purpose of the present study was to develop an automated tool for quantifying the total and air-free volume of the maxillary sinus based on computed tomography images. The quantification tool seeks to standardize maxillary sinus volume measurements, thus allowing better comparisons and determinations of factors that influence maxillary sinus size. The automated tool utilized image processing techniques (watershed, threshold, and morphological operators). The maxillary sinus volume was quantified in 30 patients. To evaluate the accuracy of the automated tool, the results were compared with manual segmentation that was performed by an experienced radiologist using a standard procedure. The mean percent differences between the automated and manual methods were 7.19% ± 5.83% and 6.93% ± 4.29% for total and air-free maxillary sinus volume, respectively. Linear regression and Bland-Altman statistics showed good agreement and low dispersion between both methods. The present automated tool for maxillary sinus volume assessment was rapid, reliable, robust, accurate, and reproducible and may be applied in clinical practice. The tool may be used to standardize measurements of maxillary volume. Such standardization is extremely important for allowing comparisons between studies, providing a better understanding of the role of the maxillary sinus, and determining the factors that influence maxillary sinus size under normal and pathological conditions. PMID:29304130
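A minimal version of the threshold-and-morphology portion of such a tool is sketched below with scipy.ndimage; the HU threshold, voxel size, and region-of-interest handling are illustrative assumptions, and the published tool additionally uses watershed-based separation.

```python
# Hedged sketch of air-volume quantification inside a sinus region of interest
# from CT, using threshold + morphology; thresholds and spacing are illustrative.
import numpy as np
from scipy import ndimage

def air_volume_ml(ct_hu, roi_mask, voxel_mm3, air_threshold_hu=-400):
    """Volume (mL) of the largest connected air pocket inside the ROI."""
    air = (ct_hu < air_threshold_hu) & roi_mask
    air = ndimage.binary_opening(air, iterations=1)       # remove speckle
    labels, n = ndimage.label(air)
    if n == 0:
        return 0.0
    sizes = ndimage.sum(air, labels, index=range(1, n + 1))
    lumen = labels == (1 + int(np.argmax(sizes)))
    return lumen.sum() * voxel_mm3 / 1000.0

# Toy volume: a -1000 HU "air" sphere embedded in +40 HU soft tissue.
zz, yy, xx = np.ogrid[:80, :80, :80]
ct = np.full((80, 80, 80), 40.0)
ct[(zz - 40) ** 2 + (yy - 40) ** 2 + (xx - 40) ** 2 < 20 ** 2] = -1000.0
roi = np.ones_like(ct, dtype=bool)
print("air volume (mL):", round(air_volume_ml(ct, roi, voxel_mm3=0.5 ** 3), 2))
```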
Takeyoshi, Masahiro; Sawaki, Masakuni; Yamasaki, Kanji; Kimber, Ian
2003-09-30
The murine local lymph node assay (LLNA) is used for the identification of chemicals that have the potential to cause skin sensitization. However, it requires specific facility and handling procedures to accommodate a radioisotopic (RI) endpoint. We have developed non-radioisotopic (non-RI) endpoint of LLNA based on BrdU incorporation to avoid a use of RI. Although this alternative method appears viable in principle, it is somewhat less sensitive than the standard assay. In this study, we report investigations to determine the use of statistical analysis to improve the sensitivity of a non-RI LLNA procedure with alpha-hexylcinnamic aldehyde (HCA) in two separate experiments. Consequently, the alternative non-RI method required HCA concentrations of greater than 25% to elicit a positive response based on the criterion for classification as a skin sensitizer in the standard LLNA. Nevertheless, dose responses to HCA in the alternative method were consistent in both experiments and we examined whether the use of an endpoint based upon the statistical significance of induced changes in LNC turnover, rather than an SI of 3 or greater, might provide for additional sensitivity. The results reported here demonstrate that with HCA at least significant responses were, in each of two experiments, recorded following exposure of mice to 25% of HCA. These data suggest that this approach may be more satisfactory-at least when BrdU incorporation is measured. However, this modification of the LLNA is rather less sensitive than the standard method if employing statistical endpoint. Taken together the data reported here suggest that a modified LLNA in which BrdU is used in place of radioisotope incorporation shows some promise, but that in its present form, even with the use of a statistical endpoint, lacks some of the sensitivity of the standard method. The challenge is to develop strategies for further refinement of this approach.
Beichel, Reinhard R; Van Tol, Markus; Ulrich, Ethan J; Bauer, Christian; Chang, Tangel; Plichta, Kristin A; Smith, Brian J; Sunderland, John J; Graham, Michael M; Sonka, Milan; Buatti, John M
2016-06-01
The purpose of this work was to develop, validate, and compare a highly computer-aided method for the segmentation of hot lesions in head and neck 18F-FDG PET scans. A semiautomated segmentation method was developed, which transforms the segmentation problem into a graph-based optimization problem. For this purpose, a graph structure around a user-provided approximate lesion centerpoint is constructed and a suitable cost function is derived based on local image statistics. To handle frequently occurring situations that are ambiguous (e.g., lesions adjacent to each other versus lesion with inhomogeneous uptake), several segmentation modes are introduced that adapt the behavior of the base algorithm accordingly. In addition, the authors present approaches for the efficient interactive local and global refinement of initial segmentations that are based on the "just-enough-interaction" principle. For method validation, 60 PET/CT scans from 59 different subjects with 230 head and neck lesions were utilized. All patients had squamous cell carcinoma of the head and neck. A detailed comparison with the current clinically relevant standard manual segmentation approach was performed based on 2760 segmentations produced by three experts. Segmentation accuracy measured by the Dice coefficient of the proposed semiautomated and standard manual segmentation approaches was 0.766 and 0.764, respectively. This difference was not statistically significant (p = 0.2145). However, the intra- and interoperator standard deviations were significantly lower for the semiautomated method. In addition, the proposed method was found to be significantly faster and resulted in significantly higher intra- and interoperator segmentation agreement when compared to the manual segmentation approach. Lack of consistency in tumor definition is a critical barrier for radiation treatment targeting as well as for response assessment in clinical trials and in clinical oncology decision-making. The properties of the authors' approach make it well suited for applications in image-guided radiation oncology, response assessment, or treatment outcome prediction.
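For reference, the accuracy metric reported here, the Dice coefficient, can be computed as follows (toy masks, not study data).

```python
# Dice coefficient between two binary segmentation masks (illustrative helper).
import numpy as np

def dice(mask_a, mask_b):
    a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
    denom = a.sum() + b.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

auto = np.zeros((50, 50), bool); auto[10:30, 10:30] = True
manual = np.zeros((50, 50), bool); manual[12:32, 10:30] = True
print(round(dice(auto, manual), 3))   # two 20x20 squares shifted by 2 rows -> 0.9
```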
New Primary Standards for Establishing SI Traceability for Moisture Measurements in Solid Materials
NASA Astrophysics Data System (ADS)
Heinonen, M.; Bell, S.; Choi, B. Il; Cortellessa, G.; Fernicola, V.; Georgin, E.; Hudoklin, D.; Ionescu, G. V.; Ismail, N.; Keawprasert, T.; Krasheninina, M.; Aro, R.; Nielsen, J.; Oğuz Aytekin, S.; Österberg, P.; Skabar, J.; Strnad, R.
2018-01-01
A European research project METefnet addresses a fundamental obstacle to improving energy-intensive drying process control: due to ambiguous reference analysis methods and insufficient methods for estimating uncertainty in moisture measurements, the achievable accuracy in the past was limited and measurement uncertainties were largely unknown. This paper reports the developments in METefnet that provide a sound basis for the SI traceability: four new primary standards for realizing the water mass fraction were set up, analyzed and compared to each other. The operation of these standards is based on combining sample weighing with different water vapor detection techniques: cold trap, chilled mirror, electrolytic and coulometric Karl Fischer titration. The results show that an equivalence of 0.2 % has been achieved between the water mass fraction realizations and that the developed methods are applicable to a wide range of materials.
Melanins and melanogenesis: methods, standards, protocols.
d'Ischia, Marco; Wakamatsu, Kazumasa; Napolitano, Alessandra; Briganti, Stefania; Garcia-Borron, José-Carlos; Kovacs, Daniela; Meredith, Paul; Pezzella, Alessandro; Picardo, Mauro; Sarna, Tadeusz; Simon, John D; Ito, Shosuke
2013-09-01
Despite considerable advances in the past decade, melanin research still suffers from the lack of universally accepted and shared nomenclature, methodologies, and structural models. This paper stems from the joint efforts of chemists, biochemists, physicists, biologists, and physicians with recognized and consolidated expertise in the field of melanins and melanogenesis, who critically reviewed and experimentally revisited methods, standards, and protocols to provide for the first time a consensus set of recommended procedures to be adopted and shared by researchers involved in pigment cell research. The aim of the paper was to define an unprecedented frame of reference built on cutting-edge knowledge and state-of-the-art methodology, to enable reliable comparison of results among laboratories and new progress in the field based on standardized methods and shared information. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
A New Activity-Based Financial Cost Management Method
NASA Astrophysics Data System (ADS)
Qingge, Zhang
The standard activity-based financial cost management model is a new model of financial cost management that builds on the standard cost system and activity-based costing and integrates the advantages of the two. It provides more accurate and more adequate cost information by taking R&D expenses as the accounting starting point and after-sale service expenses as the terminal point, covering the whole production and operating process and the whole activity chain and value chain, with the aim of serving internal management and decision making.
A new method for calculating ecological flow: Distribution flow method
NASA Astrophysics Data System (ADS)
Tan, Guangming; Yi, Ran; Chang, Jianbo; Shu, Caiwen; Yin, Zhi; Han, Shasha; Feng, Zhiyong; Lyu, Yiwei
2018-04-01
A distribution flow method (DFM), together with its ecological flow index and evaluation grade standard, is proposed to study the ecological flow of rivers based on broadening kernel density estimation. The proposed DFM and its ecological flow index and evaluation grade standard are applied to the calculation of ecological flow in the middle reaches of the Yangtze River and compared with the traditional hydrological ecological flow calculation method, the flow evaluation method, and the calculated fish ecological flow. Results show that the DFM considers the intra- and inter-annual variations in natural runoff, thereby reducing the influence of extreme flows and uneven flow distributions during the year. This method also satisfies the actual runoff demand of river ecosystems, demonstrates superiority over the traditional hydrological methods, and shows high space-time applicability and application value.
Duff, Kevin
2012-01-01
Repeated assessments are a relatively common occurrence in clinical neuropsychology. The current paper will review some of the relevant concepts (e.g., reliability, practice effects, alternate forms) and methods (e.g., reliable change index, standardized regression-based change) that are used in repeated neuropsychological evaluations. The focus will be on the understanding and application of these concepts and methods in the evaluation of the individual patient through examples. Finally, some future directions for assessing change will be described. PMID:22382384
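Two of the methods named, the reliable change index and standardized regression-based change, reduce to short formulas; the sketch below uses illustrative test-retest statistics, not values from the review.

```python
# Sketch of two common change metrics for repeated testing; all numbers are
# illustrative norms, not values from the paper.
import math

def reliable_change_index(score1, score2, sd_baseline, r_test_retest):
    """Jacobson-Truax style RCI: change divided by the standard error of the difference."""
    sem = sd_baseline * math.sqrt(1.0 - r_test_retest)   # standard error of measurement
    sed = math.sqrt(2.0) * sem                            # standard error of the difference
    return (score2 - score1) / sed

def srb_z(score1, score2, slope, intercept, se_estimate):
    """Standardized regression-based change: observed retest vs predicted retest."""
    predicted = intercept + slope * score1
    return (score2 - predicted) / se_estimate

print("RCI:", round(reliable_change_index(25, 31, sd_baseline=5.0, r_test_retest=0.80), 2))
print("SRB z:", round(srb_z(25, 31, slope=0.9, intercept=4.0, se_estimate=3.0), 2))
```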
Research on Estimates of Xi’an City Life Garbage Pay-As-You-Throw Based on Two-part Tariff method
NASA Astrophysics Data System (ADS)
Yaobo, Shi; Xinxin, Zhao; Fuli, Zheng
2017-05-01
Domestic waste is a quasi-public good, and its pricing cannot be separated from the pricing principles of public economics. Based on the two-part tariff method for urban public utilities, this paper designs a pricing model to match the charging method and estimates the pay-as-you-throw standard using data from the past five years in Xi'an. Finally, this paper summarizes the main results and proposes corresponding policy recommendations.
Beyond single-stream with the Schrödinger method
NASA Astrophysics Data System (ADS)
Uhlemann, Cora; Kopp, Michael
2016-10-01
We investigate large-scale structure formation of collisionless dark matter in the phase-space description based on the Vlasov-Poisson equation. We present the Schrödinger method, originally proposed by Widrow and Kaiser (1993) as a numerical technique based on the Schrödinger-Poisson equation, as an analytical tool which is superior to the common standard pressureless fluid model. Whereas the dust model fails and develops singularities at shell crossing, the Schrödinger method encompasses multi-streaming and even virialization.
Martin, Jeffrey D.
2002-01-01
Correlation analysis indicates that for most pesticides and concentrations, pooled estimates of relative standard deviation rather than pooled estimates of standard deviation should be used to estimate variability because pooled estimates of relative standard deviation are less affected by heteroscedasticity. The median pooled relative standard deviation was calculated for all pesticides to summarize the typical variability for pesticide data collected for the NAWQA Program. The median pooled relative standard deviation was 15 percent at concentrations less than 0.01 micrograms per liter (µg/L), 13 percent at concentrations near 0.01 µg/L, 12 percent at concentrations near 0.1 µg/L, 7.9 percent at concentrations near 1 µg/L, and 2.7 percent at concentrations greater than 5 µg/L. Pooled estimates of standard deviation or relative standard deviation presented in this report are larger than estimates based on averages, medians, smooths, or regression of the individual measurements of standard deviation or relative standard deviation from field replicates. Pooled estimates, however, are the preferred method for characterizing variability because they provide unbiased estimates of the variability of the population. Assessments of variability based on standard deviation (rather than variance) underestimate the true variability of the population. Because pooled estimates of variability are larger than estimates based on other approaches, users of estimates of variability must be cognizant of the approach used to obtain the estimate and must use caution in the comparison of estimates based on different approaches.
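One common way to compute pooled standard deviation and pooled relative standard deviation from replicate sets weights the per-set estimates by their degrees of freedom; the sketch below uses that convention (the report's exact procedure may differ) with illustrative concentrations.

```python
# Sketch of pooled standard deviation and pooled relative standard deviation
# from field-replicate sets; the degrees-of-freedom weighting shown is a common
# convention and may differ in detail from the report's procedure.
import numpy as np

def pooled_sd(replicate_sets):
    num = sum((len(r) - 1) * np.var(r, ddof=1) for r in replicate_sets)
    dof = sum(len(r) - 1 for r in replicate_sets)
    return np.sqrt(num / dof)

def pooled_rsd(replicate_sets):
    rsd2 = [(np.std(r, ddof=1) / np.mean(r)) ** 2 for r in replicate_sets]
    dofs = [len(r) - 1 for r in replicate_sets]
    return 100.0 * np.sqrt(np.average(rsd2, weights=dofs))

# Three replicate pairs of a pesticide concentration (ug/L), illustrative values.
reps = [np.array([0.010, 0.012]), np.array([0.105, 0.118]), np.array([0.95, 1.02])]
print("pooled SD  :", round(pooled_sd(reps), 4))
print("pooled RSD : %.1f %%" % pooled_rsd(reps))
```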
Schiffman, Eric L; Truelove, Edmond L; Ohrbach, Richard; Anderson, Gary C; John, Mike T; List, Thomas; Look, John O
2010-01-01
The purpose of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Validation Project was to assess the diagnostic validity of this examination protocol. The aim of this article is to provide an overview of the project's methodology, descriptive statistics, and data for the study participant sample. This article also details the development of reliable methods to establish the reference standards for assessing criterion validity of the Axis I RDC/TMD diagnoses. The Axis I reference standards were based on the consensus of two criterion examiners independently performing a comprehensive history, clinical examination, and evaluation of imaging. Intersite reliability was assessed annually for criterion examiners and radiologists. Criterion examination reliability was also assessed within study sites. Study participant demographics were comparable to those of participants in previous studies using the RDC/TMD. Diagnostic agreement of the criterion examiners with each other and with the consensus-based reference standards was excellent with all kappas ≥ 0.81, except for osteoarthrosis (moderate agreement, k = 0.53). Intrasite criterion examiner agreement with reference standards was excellent (k ≥ 0.95). Intersite reliability of the radiologists for detecting computed tomography-disclosed osteoarthrosis and magnetic resonance imaging-disclosed disc displacement was good to excellent (k = 0.71 and 0.84, respectively). The Validation Project study population was appropriate for assessing the reliability and validity of the RDC/TMD Axis I and II. The reference standards used to assess the validity of Axis I TMD were based on reliable and clinically credible methods.
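Agreement in this study is summarized with kappa statistics; for reference, Cohen's kappa for two raters can be computed as below (toy diagnoses, not study data).

```python
# Cohen's kappa between two examiners' categorical diagnoses (toy example).
import numpy as np

def cohens_kappa(rater1, rater2):
    cats = sorted(set(rater1) | set(rater2))
    idx = {c: i for i, c in enumerate(cats)}
    table = np.zeros((len(cats), len(cats)))
    for a, b in zip(rater1, rater2):
        table[idx[a], idx[b]] += 1
    n = table.sum()
    po = np.trace(table) / n                              # observed agreement
    pe = (table.sum(0) * table.sum(1)).sum() / n ** 2     # chance agreement
    return (po - pe) / (1.0 - pe)

examiner1 = ["myalgia", "myalgia", "arthralgia", "none", "myalgia", "none"]
examiner2 = ["myalgia", "arthralgia", "arthralgia", "none", "myalgia", "none"]
print(round(cohens_kappa(examiner1, examiner2), 2))       # -> 0.75
```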
A standard telemental health evaluation model: the time is now.
Kramer, Greg M; Shore, Jay H; Mishkind, Matt C; Friedl, Karl E; Poropatich, Ronald K; Gahm, Gregory A
2012-05-01
The telehealth field has advanced historic promises to improve access, cost, and quality of care. However, the extent to which it is delivering on its promises is unclear as the scientific evidence needed to justify success is still emerging. Many have identified the need to advance the scientific knowledge base to better quantify success. One method for advancing that knowledge base is a standard telemental health evaluation model. Telemental health is defined here as the provision of mental health services using live, interactive video-teleconferencing technology. Evaluation in the telemental health field largely consists of descriptive and small pilot studies, is often defined by the individual goals of the specific programs, and is typically focused on only one outcome. The field should adopt new evaluation methods that consider the co-adaptive interaction between users (patients and providers), healthcare costs and savings, and the rapid evolution in communication technologies. Acceptance of a standard evaluation model will improve perceptions of telemental health as an established field, promote development of a sounder empirical base, promote interagency collaboration, and provide a framework for more multidisciplinary research that integrates measuring the impact of the technology and the overall healthcare aspect. We suggest that consideration of a standard model is timely given where telemental health is at in terms of its stage of scientific progress. We will broadly recommend some elements of what such a standard evaluation model might include for telemental health and suggest a way forward for adopting such a model.
CytometryML with DICOM and FCS
NASA Astrophysics Data System (ADS)
Leif, Robert C.
2018-02-01
Abstract: The Flow Cytometry Standard, FCS, and the Digital Imaging and Communications in Medicine standard, DICOM, are based on extensive, superb domain knowledge. However, they are isolated systems: they do not take advantage of data structures, require special programs to read and write the data, and lack the capability to interoperate or work with other standards, and FCS lacks many of the datatypes necessary for clinical laboratory data. The large overlap between imaging and flow cytometry provides strong evidence that both modalities should be covered by the same standard. Method: The XML Schema Definition Language, XSD 1.1, was used to translate FCS and/or DICOM objects. A MIFlowCyt file was tested with published values. Results: Previously, a significant part of an XML standard based upon a combination of FCS and DICOM has been implemented and validated with MIFlowCyt data. Strongly typed translations of FCS keywords have been constructed in XML. These keywords contain links to their DICOM and FCS equivalents.
NASA Astrophysics Data System (ADS)
Liu, G.; Wu, C.; Li, X.; Song, P.
2013-12-01
The 3D urban geological information system has been a major part of the national urban geological survey project of the China Geological Survey in recent years. Large amounts of multi-source and multi-subject data are to be stored in urban geological databases. Various models and vocabularies have been drafted and applied to urban geological data by industrial companies. Issues such as duplicate and ambiguous definitions of terms and different coding structures increase the difficulty of information sharing and data integration. To solve this problem, we proposed a national-standard-driven information classification and coding method to effectively store and integrate urban geological data, and we applied data dictionary technology to achieve structured and standard data storage. The overall purpose of this work is to set up a common data platform that provides an information sharing service. Research progress is as follows: (1) A unified classification and coding method for multi-source data based on national standards. Underlying national standards include GB 9649-88 for geology and GB/T 13923-2006 for geography. Current industrial models are compared with national standards to build a mapping table. The attributes of various urban geological data entity models are reduced to several categories according to their application phases and domains. Then a logical data model is set up as a standard format to design data file structures for a relational database. (2) A multi-level data dictionary for data standardization constraints. The levels of the data dictionary are designed as follows: the model data dictionary is used to manage system database files and enhance maintenance of the whole database system; the attribute dictionary organizes fields used in database tables; the term and code dictionary provides a standard for the urban information system by adopting appropriate classification and coding methods; and the comprehensive data dictionary manages system operation and security. (3) An extension of the system data management function based on the data dictionary. The data-item constrained-input function makes use of the standard term and code dictionary to obtain standard input results. The attribute dictionary organizes all the fields of an urban geological information database to ensure consistent term use for fields. The model dictionary is used to generate a database operation interface automatically with standard semantic content via the term and code dictionary. The above method and technology have been applied to the construction of the Fuzhou Urban Geological Information System, South-East China, with satisfactory results.
The need for performance criteria in evaluating the durability of wood products
Stan Lebow; Bessie Woodward; Patricia Lebow; Carol Clausen
2010-01-01
Data generated from wood-product durability evaluations can be difficult to interpret. Standard methods used to evaluate the potential long-term durability of wood products often provide little guidance on interpretation of test results. Decisions on acceptable performance for standardization and code compliance are based on the judgment of reviewers or committees....
Coordination and standardization of federal sedimentation activities
Glysson, G. Douglas; Gray, John R.
1997-01-01
- precipitation information critical to water resources management. Memorandum M-92-01 covers primarily freshwater bodies and includes activities, such as "development and distribution of consensus standards, field-data collection and laboratory analytical methods, data processing and interpretation, data-base management, quality control and quality assurance, and water- resources appraisals, assessments, and investigations." Research activities are not included.
Criteria for establishing water quality standards that are protective of all native biota are generally based upon laboratory toxicity tests. These tests utilize common model organisms that have established test methods. However, only a small portion of species have established ...
ERIC Educational Resources Information Center
Gibbone, Anne; Mercier, Kevin
2014-01-01
Teacher candidates' use of technology is a component of physical education teacher education (PETE) program learning goals and accreditation standards. The methods presented in this article can help teacher candidates to learn about and apply technology as an instructional tool prior to and during field or clinical experiences. The goal in…
Nelson, Bryant C; Sharpless, Katherine E
2003-01-29
Catechins are polyphenolic plant compounds (flavonoids) that may offer significant health benefits to humans. These benefits stem largely from their anticarcinogenic, antioxidant, and antimutagenic properties. Recent epidemiological studies suggest that the consumption of flavonoid-containing foods is associated with reduced risk of cardiovascular disease. Chocolate is a natural cocoa bean-based product that reportedly contains high levels of monomeric, oligomeric, and polymeric catechins. We have applied solid-liquid extraction and liquid chromatography coupled with atmospheric pressure chemical ionization-mass spectrometry to the identification and determination of the predominant monomeric catechins, (+)-catechin and (-)-epicatechin, in a baking chocolate Standard Reference Material (NIST Standard Reference Material 2384). (+)-Catechin and (-)-epicatechin are detected and quantified in chocolate extracts on the basis of selected-ion monitoring of their protonated [M + H](+) molecular ions. Tryptophan methyl ester is used as an internal standard. The developed method has the capacity to accurately quantify as little as 0.1 microg/mL (0.01 mg of catechin/g of chocolate) of either catechin in chocolate extracts, and the method has additionally been used to certify (+)-catechin and (-)-epicatechin levels in the baking chocolate Standard Reference Material. This is the first reported use of liquid chromatography/mass spectrometry for the quantitative determination of monomeric catechins in chocolate and the only report certifying monomeric catechin levels in a food-based Standard Reference Material.
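Quantification against an internal standard reduces to a response-factor calculation; the sketch below uses hypothetical peak areas and concentrations with tryptophan methyl ester as the internal standard, as in the abstract, and is not the certified procedure.

```python
# Sketch of single-point internal-standard quantification for LC-MS SIM data;
# all peak areas, concentrations, volumes, and the response factor are hypothetical.
def analyte_conc(area_analyte, area_istd, conc_istd, response_factor):
    """C_analyte = (A_analyte / A_istd) * C_istd / RF, with RF from a calibration run."""
    return (area_analyte / area_istd) * conc_istd / response_factor

# Response factor from a calibration standard of known catechin and IS amounts:
# RF = (A_cat / A_istd) * (C_istd / C_cat).
rf = (8.2e5 / 4.1e5) * (2.0 / 5.0)          # -> 0.8 (illustrative)

# Sample extract: solution concentration (ug/mL) converted to mg catechin per g chocolate.
c_ug_per_ml = analyte_conc(area_analyte=3.6e5, area_istd=4.0e5, conc_istd=2.0, response_factor=rf)
extract_volume_ml, sample_mass_g = 25.0, 1.0
print("(+)-catechin: %.3f mg/g" % (c_ug_per_ml * extract_volume_ml / 1000.0 / sample_mass_g))
```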
Modeling and Control of a Tailsitter with a Ducted Fan
NASA Astrophysics Data System (ADS)
Argyle, Matthew Elliott
There are two traditional aircraft categories: fixed-wing aircraft, which have long endurance and a high cruise airspeed, and rotorcraft, which can take off and land vertically. The tailsitter is a type of aircraft that has the strengths of both platforms, with no additional mechanical complexity, because it takes off and lands vertically on its tail and can transition the entire aircraft horizontally into high-speed flight. In this dissertation, we develop the entire control system for a tailsitter with a ducted fan. The standard method of computing the quaternion-based attitude error does not generate ideal trajectories for a hovering tailsitter in some situations. In addition, the only approach in the literature to mitigate this breaks down for large attitude errors. We develop an alternative quaternion-based error method which generates better trajectories than the standard approach and can handle large errors. We also derive a hybrid backstepping controller with almost global asymptotic stability based on this error method. Many common altitude and airspeed control schemes for a fixed-wing airplane assume that the altitude and airspeed dynamics are decoupled, which leads to errors. The Total Energy Control System (TECS) is an approach that controls altitude and airspeed by manipulating the total energy rate and energy distribution rate of the aircraft in a manner which accounts for the dynamic coupling. In this dissertation, a nonlinear controller based on the TECS principles, which can handle inaccurate thrust and drag models, is derived. Simulation results show that the nonlinear controller has better performance than standard PI TECS control schemes. Most constant-altitude transitions are accomplished by generating an optimal trajectory, and potentially actuator inputs, based on a high-fidelity model of the aircraft. While there are several approaches to mitigate the effects of modeling errors, these do not fully remove the requirement for an accurate model. In this dissertation, we develop two different approaches that can achieve near constant-altitude transitions for some types of aircraft. The first method, based on multiple LQR controllers, requires a high-fidelity model of the aircraft. However, the second method, based on the energy along the body axes, requires almost no aerodynamic information.
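For context, the sketch below shows the standard quaternion-based attitude error computation that the dissertation improves upon; the alternative error method itself is not reproduced here, and the scalar-first convention and function names are assumptions.

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of quaternions q ⊗ r, scalar-first convention."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def standard_attitude_error(q_desired, q_actual):
    """Standard quaternion attitude error q_e = q_d^{-1} ⊗ q for unit quaternions."""
    q_d_conj = q_desired * np.array([1.0, -1.0, -1.0, -1.0])  # inverse of a unit quaternion
    q_e = quat_mult(q_d_conj, q_actual)
    # Enforce the shorter rotation; the vector part drives the attitude controller.
    return q_e if q_e[0] >= 0 else -q_e
```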
Methods for determining time of death.
Madea, Burkhard
2016-12-01
Medicolegal death time estimation must determine the time since death reliably. Reliability can only be established empirically by statistical analysis of errors in field studies. Determining the time since death requires the calculation of measurable data along a time-dependent curve back to the starting point. Various methods are used to estimate the time since death. The current gold standard for death time estimation is a previously established nomogram method based on the two-exponential model of body cooling. Great experimental and practical achievements have been realized using this nomogram method. To reduce the margin of error of the nomogram method, a compound method was developed based on electrical and mechanical excitability of skeletal muscle, pharmacological excitability of the iris, rigor mortis, and postmortem lividity. Further increasing the accuracy of death time estimation involves the development of conditional probability distributions for death time estimation based on the compound method. Although many studies have evaluated chemical methods of death time estimation, such methods play a marginal role in daily forensic practice. However, increased precision of death time estimation has recently been achieved by considering various influencing factors (i.e., preexisting diseases, duration of the terminal episode, and ambient temperature). Putrefactive changes may be used for death time estimation in water-immersed bodies. Furthermore, recently developed technologies, such as ¹H magnetic resonance spectroscopy, can be used to quantitatively study decompositional changes. This review addresses the gold standard method of death time estimation in forensic practice and promising technological and scientific developments in the field.
Johnston, Jennifer M.
2014-01-01
The majority of biological processes mediated by G Protein-Coupled Receptors (GPCRs) take place on timescales that are not conveniently accessible to standard molecular dynamics (MD) approaches, notwithstanding the current availability of specialized parallel computer architectures and efficient simulation algorithms. Enhanced MD-based methods have started to assume an important role in the study of the rugged energy landscape of GPCRs by providing mechanistic details of complex receptor processes such as ligand recognition, activation, and oligomerization. We provide here an overview of these methods in their most recent application to the field. PMID:24158803
Fully automated motion correction in first-pass myocardial perfusion MR image sequences.
Milles, Julien; van der Geest, Rob J; Jerosch-Herold, Michael; Reiber, Johan H C; Lelieveldt, Boudewijn P F
2008-11-01
This paper presents a novel method for registration of cardiac perfusion magnetic resonance imaging (MRI). The presented method is capable of automatically registering perfusion data, using independent component analysis (ICA) to extract physiologically relevant features together with their time-intensity behavior. A time-varying reference image mimicking intensity changes in the data of interest is computed based on the results of that ICA. This reference image is used in a two-pass registration framework. Qualitative and quantitative validation of the method is carried out using 46 clinical-quality, short-axis perfusion MR datasets comprising 100 images each. Despite varying image quality and motion patterns in the evaluation set, validation of the method showed a reduction of the average left ventricle (LV) motion from 1.26+/-0.87 to 0.64+/-0.46 pixels. Time-intensity curves are also improved after registration, with an average error reduced from 2.65+/-7.89% to 0.87+/-3.88% between registered data and the manual gold standard. Comparison of clinically relevant parameters computed using registered data and the manual gold standard shows good agreement. Additional tests with a simulated free-breathing protocol showed robustness against considerable deviations from a standard breathing protocol. We conclude that this fully automatic ICA-based method shows an accuracy, robustness, and computation speed adequate for use in a clinical environment.
L'Empereur, Karen; Stadalius, Marilyn; Zhu, Yongdong; Mansoori, Bashir A; Isemura, Tsuguhide; Kaiser, Mary A; Knaup, Wolfgang; Noguchi, Masahiro
2008-08-01
Fluorotelomer-based acrylic polymers are applied to the surface of carpet to impart oil, stain, and water repellence properties. Concerns that fluorotelomer-based polymers are a possible source of "low level" exposure to humans, coupled with their widespread use, have prompted the need to develop a method to detect and measure perfluorooctanoate (PFO) in carpet. A liquid chromatography tandem mass spectrometry method for the determination of PFO in carpet using a dual-labeled 13C-perfluorooctanoic acid (13C-PFOA) internal standard is successfully developed and validated. Levels of PFO are determined using a gradient, reversed-phase high-performance liquid chromatography (HPLC) method with acetic acid acidified water-methanol, separated on a 50 mm Phenomenex Synergi Polar RP column. Ions monitored are 413 (parent) and 369 (daughter) for PFO and 415 (parent) and 370 (daughter) for the dual-labeled 13C-PFOA internal standard. Accuracy and precision over three days for 5 to 900 ng/g PFO in carpet ranged from 2.4% to 7.6% and 3.7% to 14.1%, respectively. Overall extraction efficiency for samples (n=30) fortified with 13C-PFOA at 20 ng/g and perfluorooctanoic acid (PFOA) at 5, 50, and 500 ng/g is 98.9%+/-8.1%. Specificity of the method was evaluated with two different carpet samples.
LABORATORY TOXICITY TESTS FOR EVALUATING POTENTIAL EFFECTS OF ENDOCRINE-DISRUPTING COMPOUNDS
The scope of the Laboratory Testing Work Group was to evaluate methods for testing aquatic and terrestrial invertebrates in the laboratory. Specifically, discussions focused on the following objectives: 1) assess the extent to which consensus-based standard methods and other pub...
Fast, Exact Bootstrap Principal Component Analysis for p > 1 million
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
2015-01-01
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
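A minimal sketch of the low-dimensional bootstrap idea follows, assuming subjects are stored as columns of X; the function and variable names are illustrative, and sign alignment of bootstrap components across resamples is omitted.

```python
import numpy as np

def fast_bootstrap_pca(X, n_boot=1000, n_comp=3, seed=0):
    """Sketch of exact bootstrap PCA carried out in the n-dimensional subspace.
    X is p x n with subjects as columns (names are assumptions, not the paper's API)."""
    rng = np.random.default_rng(seed)
    p, n = X.shape
    # One expensive SVD of the original (centered) data: Xc = U @ diag(d) @ Vt
    Xc = X - X.mean(axis=1, keepdims=True)
    U, d, Vt = np.linalg.svd(Xc, full_matrices=False)    # U: p x n basis of the subspace
    Y = np.diag(d) @ Vt                                   # n x n low-dimensional scores
    boot_coords = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                       # resample subjects with replacement
        Yb = Y[:, idx]
        Yb = Yb - Yb.mean(axis=1, keepdims=True)
        Ub, db, _ = np.linalg.svd(Yb, full_matrices=False)
        boot_coords.append(Ub[:, :n_comp])                # n x n_comp low-dim coordinates
    # p-dimensional bootstrap components are U @ Ub; they are formed only when needed,
    # e.g. to get voxel-wise standard errors, so nothing p-dimensional is stored per resample.
    return U, np.array(boot_coords)
```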
Lie, Octavian V; Papanastassiou, Alexander M; Cavazos, José E; Szabó, Ákos C
2015-10-01
Poor seizure outcomes after epilepsy surgery often reflect an incorrect localization of the epileptic sources by standard intracranial EEG interpretation because of limited electrode coverage of the epileptogenic zone. This study investigates whether, in such conditions, source modeling is able to provide more accurate source localization than the standard clinical method that can be used prospectively to improve surgical resection planning. Suboptimal epileptogenic zone sampling is simulated by subsets of the electrode configuration used to record intracranial EEG in a patient rendered seizure free after surgery. sLORETA and the clinical method solutions are applied to interictal spikes sampled with these electrode subsets and are compared for colocalization with the resection volume and displacement due to electrode downsampling. sLORETA provides often congruent and at times more accurate source localization when compared with the standard clinical method. However, with electrode downsampling, individual sLORETA solution locations can vary considerably and shift consistently toward the remaining electrodes. sLORETA application can improve source localization based on the clinical method but does not reliably compensate for suboptimal electrode placement. Incorporating sLORETA solutions based on intracranial EEG in surgical planning should proceed cautiously in cases where electrode repositioning is planned on clinical grounds.
Finite element techniques in computational time series analysis of turbulent flows
NASA Astrophysics Data System (ADS)
Horenko, I.
2009-04-01
In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions) and in econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of essential dynamics and identification of the phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. Standard filtering approaches (e.g., wavelet-based spectral methods) have in general unfeasible numerical complexity in high dimensions, and other standard methods (e.g., Kalman filter, MVAR, ARCH/GARCH) impose strong assumptions about the type of the underlying dynamics. An approach based on optimization of a specially constructed regularized functional (describing the quality of the data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example, and the results will be compared with those obtained by standard approaches. The importance of accounting for the mathematical assumptions used in the analysis will be pointed out in this example. Finally, applications to the analysis of meteorological and climate data will be presented.
Younghak Shin; Balasingham, Ilangko
2017-07-01
Colonoscopy is a standard method for screening polyps by highly trained physicians. Polyps missed during colonoscopy are a potential risk factor for colorectal cancer. In this study, we investigate an automatic polyp classification framework. We aim to compare two different approaches: a hand-crafted feature method and a convolutional neural network (CNN) based deep learning method. Combined shape and color features are used for hand-crafted feature extraction, and a support vector machine (SVM) is adopted for classification. For the CNN approach, a deep learning framework with three convolution and pooling layers is used for classification. The proposed framework is evaluated using three public polyp databases. From the experimental results, we show that the CNN-based deep learning framework achieves better classification performance than the hand-crafted feature based method, reaching over 90% classification accuracy, sensitivity, specificity and precision.
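The sketch below shows what a three-block convolution/pooling classifier of the kind described might look like; the framework choice (PyTorch), layer widths, and the 64x64 RGB input size are assumptions rather than details from the study.

```python
import torch.nn as nn

class PolypCNN(nn.Module):
    """Illustrative three-block convolution/pooling classifier for polyp vs. non-polyp
    patches; all layer sizes and the 64x64 RGB input are assumptions."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.classifier = nn.Linear(64 * 8 * 8, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))
```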
NASA Astrophysics Data System (ADS)
To, Anthony; Downs, Corey; Fu, Elain
2017-05-01
Wax printing has become a common method of fabricating channels in cellulose-based microfluidic devices. However, a limitation of wax printing is that it is restricted to relatively thin, smooth substrates that are compatible with processing by a commercial wax printer. In the current report, we describe a simple patterning method that extends the utility of wax printers for creating hydrophobic barriers on non-standard porous substrates via a process called wax transfer printing. We demonstrate the use of multiple wax transfer cycles to create well-defined, robust, and reproducible barriers in a thick cellulose substrate that is not compatible with feeding through a wax printer. We characterize the method for (i) wax spreading within the substrate as a function of heating time, (ii) the ability to create functional barriers in a substrate, and (iii) reproducibility in line width.
Hybrid Differential Dynamic Programming with Stochastic Search
NASA Technical Reports Server (NTRS)
Aziz, Jonathan; Parker, Jeffrey; Englander, Jacob A.
2016-01-01
Differential dynamic programming (DDP) has been demonstrated as a viable approach to low-thrust trajectory optimization, most notably with the recent success of NASA's Dawn mission. The Dawn trajectory was designed with the DDP-based Static/Dynamic Optimal Control algorithm used in the Mystic software [1]. Another recently developed method, Hybrid Differential Dynamic Programming (HDDP) [2, 3], is a variant of the standard DDP formulation that leverages both first-order and second-order state transition matrices in addition to nonlinear programming (NLP) techniques. Areas of improvement over standard DDP include constraint handling, convergence properties, continuous dynamics, and multi-phase capability. DDP is a gradient-based method and will converge to a solution near an initial guess. In this study, monotonic basin hopping (MBH) is employed as a stochastic search method to overcome this limitation, by augmenting the HDDP algorithm for a wider search of the solution space.
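A minimal sketch of monotonic basin hopping around a generic local optimizer follows; in HDDP the local step would be the DDP iteration itself, and the step size, hop count, and use of SciPy's general-purpose minimizer here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def monotonic_basin_hopping(cost, x0, n_hops=50, step=0.1, seed=0):
    """Monotonic basin hopping sketch: repeatedly perturb the best-known solution,
    run a local optimizer from the perturbed point, and keep the result only if it
    improves on the incumbent (hence "monotonic")."""
    rng = np.random.default_rng(seed)
    best = minimize(cost, x0)                      # local solve from the initial guess
    for _ in range(n_hops):
        perturbed = best.x + rng.normal(0.0, step, size=len(best.x))
        trial = minimize(cost, perturbed)          # local solve from the perturbed point
        if trial.fun < best.fun:                   # accept improvements only
            best = trial
    return best
```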
Harmonisation of microbial sampling and testing methods for distillate fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, G.C.; Hill, E.C.
1995-05-01
Increased incidence of microbial infection in distillate fuels has led to a demand for organisations such as the Institute of Petroleum to propose standards for microbiological quality, based on numbers of viable microbial colony forming units. Variations in quality requirements and in the spoilage significance of contaminating microbes, plus a tendency for temporal and spatial changes in the distribution of microbes, make such standards difficult to implement. The problem is compounded by a diversity in the procedures employed for sampling and testing for microbial contamination and in the interpretation of the data obtained. The following paper reviews these problems and describes the efforts of the Institute of Petroleum Microbiology Fuels Group to address these issues, and in particular to bring about harmonisation of sampling and testing methods. The benefits and drawbacks of available test methods, both laboratory based and on-site, are discussed.
Application of DNA-based methods in forensic entomology.
Wells, Jeffrey D; Stevens, Jamie R
2008-01-01
A forensic entomological investigation can benefit from a variety of widely practiced molecular genotyping methods. The most commonly used is DNA-based specimen identification. Other applications include the identification of insect gut contents and the characterization of the population genetic structure of a forensically important insect species. The proper application of these procedures demands that the analyst be technically expert. However, one must also be aware of the extensive list of standards and expectations that many legal systems have developed for forensic DNA analysis. We summarize the DNA techniques that are currently used in, or have been proposed for, forensic entomology and review established genetic analyses from other scientific fields that address questions similar to those in forensic entomology. We describe how accepted standards for forensic DNA practice and method validation are likely to apply to insect evidence used in a death or other forensic entomological investigation.
Hansen, William B; Derzon, James H; Reese, Eric L
2014-06-01
We propose a method for creating groups against which outcomes of local pretest-posttest evaluations of evidence-based programs can be judged. This involves assessing pretest markers for new and previously conducted evaluations to identify groups that have high pretest similarity. A database of 802 prior local evaluations provided six summary measures for analysis. The proximity between all groups on these variables is calculated as standardized proximities with values between 0 and 1. Five methods for creating standardized proximities are demonstrated. The approach allows proximity limits to be adjusted to find sufficient numbers of synthetic comparators. Several index cases are examined to assess the numbers of groups available to serve as comparators. Results show that most local evaluations would have sufficient numbers of comparators available for estimating program effects. This method holds promise as a tool for local evaluations to estimate relative effectiveness. © The Author(s) 2012.
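One way such standardized proximities could be computed is sketched below; the distance measure and rescaling shown are assumptions and are not necessarily among the five methods demonstrated in the paper.

```python
import numpy as np

def standardized_proximities(X):
    """Illustrative proximity: Euclidean distance on z-scored pretest measures,
    rescaled so 1 = identical groups and 0 = most dissimilar pair.
    X: (n_groups, n_measures) array of pretest summary measures."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    d = np.sqrt(((Z[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1))
    return 1.0 - d / d.max()

def comparators(prox, index_case, limit=0.9):
    """Groups whose standardized proximity to the index case meets the chosen limit."""
    candidates = np.where(prox[index_case] >= limit)[0]
    return candidates[candidates != index_case]
```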
Introducing criteria based audit into Ugandan maternity units.
Weeks, A D; Alia, G; Ononge, S; Mutungi, A; Otolorin, E O; Mirembe, F M
2004-02-01
Maternal mortality in Uganda has remained unchanged at 500/100 000 over the past 10 years despite concerted efforts to improve the standard of maternity care. It is especially difficult to improve standards in rural areas, where there is little money for improvements. Furthermore, staff may be isolated, poorly paid, disempowered, lacking in morale, and have few skills to bring about change. The intervention was a training programme to introduce criteria-based audit into rural Uganda, delivered through Makerere University Medical School, Mulago Hospital (a large government teaching hospital in Kampala), and Mpigi District (a rural area with 10 small health centres around a district hospital). It consisted of didactic teaching about criteria-based audit followed by practical work in the participants' own units, with ongoing support and follow-up workshops. Improvements were seen in many standards of care. Staff showed universal enthusiasm for the training, and many staff produced simple, cost-free improvements in their standard of care. Teaching of criteria-based audit to those providing health care in developing countries can produce low-cost improvements in the standards of care. Because the method is simple and can be used to provide improvements even without new funding, it has the potential to produce sustainable and cost-effective changes in the standard of health care. Follow-up is needed to prevent a waning of enthusiasm with time.
Using Willie's Acid-Base Box for Blood Gas Analysis
ERIC Educational Resources Information Center
Dietz, John R.
2011-01-01
In this article, the author describes a method developed by Dr. William T. Lipscomb for teaching blood gas analysis of acid-base status and provides three examples using Willie's acid-base box. Willie's acid-base box is constructed using three of the parameters of standard arterial blood gas analysis: (1) pH; (2) bicarbonate; and (3) CO2…
Metamodel-based inverse method for parameter identification: elastic-plastic damage model
NASA Astrophysics Data System (ADS)
Huang, Changwu; El Hami, Abdelkhalak; Radi, Bouchaïb
2017-04-01
This article proposes a metamodel-based inverse method for material parameter identification and applies it to elastic-plastic damage model parameter identification. An elastic-plastic damage model is presented and implemented in numerical simulation. The metamodel-based inverse method is proposed in order to overcome the computational cost disadvantage of the conventional inverse method. In the metamodel-based inverse method, a Kriging metamodel is constructed from an experimental design in order to model the relationship between the material parameters and the objective function values of the inverse problem, and the optimization procedure is then executed on the metamodel. Application of the presented material model and the proposed parameter identification method to the standard A 2017-T4 tensile test shows that the presented elastic-plastic damage model is adequate to describe the material's mechanical behaviour and that the proposed metamodel-based inverse method not only enhances the efficiency of parameter identification but also gives reliable results.
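A minimal sketch of the metamodel-based inverse workflow is given below, using a Gaussian process (Kriging) surrogate from scikit-learn in place of the authors' implementation; the design size, kernel, optimizer, and function names are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def identify_parameters(objective, bounds, n_design=40, seed=0):
    """Metamodel-based inverse identification sketch: a Kriging surrogate of the misfit
    between simulation and experiment is built from a space-filling design and then
    optimized instead of the expensive simulation. `objective(theta)` is assumed to
    run the finite element simulation and return the misfit value."""
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    # Latin hypercube experimental design over the parameter space
    design = qmc.scale(qmc.LatinHypercube(d=dim, seed=seed).random(n_design), lo, hi)
    y = np.array([objective(t) for t in design])         # expensive simulations
    gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
    gp.fit(design, y)
    # Optimize the cheap surrogate, starting from the best design point
    x0 = design[np.argmin(y)]
    res = minimize(lambda t: gp.predict(t.reshape(1, -1))[0], x0,
                   bounds=list(zip(lo, hi)), method="L-BFGS-B")
    return res.x
```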
Validated method for quantification of genetically modified organisms in samples of maize flour.
Kunert, Renate; Gach, Johannes S; Vorauer-Uhl, Karola; Engel, Edwin; Katinger, Hermann
2006-02-08
Sensitive and accurate testing for trace amounts of biotechnology-derived DNA from plant material is the prerequisite for detection of 1% or 0.5% genetically modified ingredients in food products or their raw materials. Compared to ELISA detection of expressed proteins, real-time PCR (RT-PCR) amplification offers easier sample preparation and lower detection limits. Of the different methods of DNA preparation, the CTAB method was chosen for its high flexibility in starting material and its generation of sufficient DNA of relevant quality. Previous RT-PCR data generated with the SYBR green detection method showed that that approach is highly sensitive to sample matrices and genomic DNA content, which influences the interpretation of results. Therefore, this paper describes a real-time DNA quantification based on the TaqMan probe method, showing high accuracy and sensitivity, with detection limits below 18 copies per sample, applicable and comparable to highly purified plasmid standards as well as to complex matrices of genomic DNA samples. The results were evaluated with ValiData for homogeneity of variance, linearity, accuracy of the standard curve, and standard deviation.
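For illustration, a standard-curve quantification step of the kind used with plasmid standards might look like the sketch below; the variable names and the linear Cq-versus-log-copies model are assumptions, not details of the validated method.

```python
import numpy as np

def copies_from_cq(cq_standards, copies_standards, cq_sample):
    """Standard-curve quantification sketch for real-time PCR: fit Cq against
    log10(copy number) of the plasmid dilution series and invert for the sample."""
    slope, intercept = np.polyfit(np.log10(copies_standards), cq_standards, 1)
    return 10 ** ((cq_sample - intercept) / slope)

# Example with made-up values: a 10-fold dilution series and a sample Cq of 30.1
# copies = copies_from_cq([22.1, 25.4, 28.7, 32.0], [1e5, 1e4, 1e3, 1e2], 30.1)
```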
Should the Standard Count Be Excluded from Neutron Probe Calibration?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Z. Fred
About six decades after its introduction, the neutron probe remains one of the most accurate methods for indirect measurement of soil moisture content. Traditionally, the calibration of a neutron probe involves the ratio of the neutron count in the soil to a standard count, which is the neutron count in a fixed environment such as the probe shield or a specially-designed calibration tank. The drawback of this count-ratio-based calibration is that the error in the standard count is carried through to all the measurements. An alternative calibration is to use the neutron counts only, not the ratio, with proper correction for radioactive decay and counting time. To evaluate both approaches, the shield counts of a neutron probe used for three decades were analyzed. The results show that the surrounding conditions have a substantial effect on the standard count. The error in the standard count also impacts the calculation of water storage and could indicate false consistency among replicates. The analysis of the shield counts indicates a negligible aging effect of the instrument over a period of 26 years. It is concluded that, by excluding the standard count, the count-based calibration is appropriate and sometimes even better than the ratio-based calibration. The count-based calibration is especially useful for historical data when the standard count was questionable or absent.
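The two calibration styles discussed can be sketched as follows; all coefficients, the reference counting time, and the Am-241 half-life used for the decay correction are illustrative assumptions rather than values from the report.

```python
def ratio_based_moisture(count, standard_count, a, b):
    """Traditional calibration: volumetric water content from the count ratio,
    theta = a * (count / standard_count) + b, with fitted coefficients a and b."""
    return a * (count / standard_count) + b

def count_based_moisture(count, count_time_s, elapsed_years, a, b,
                         half_life_years=432.0, ref_count_time_s=16.0):
    """Count-based calibration sketch: the raw count is corrected for counting time
    and for radioactive decay of the source before applying the linear calibration.
    The half-life shown is for Am-241; all numbers here are illustrative."""
    decay_correction = 2.0 ** (elapsed_years / half_life_years)   # undo source decay
    corrected = count * (ref_count_time_s / count_time_s) * decay_correction
    return a * corrected + b
```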
Development of Mycoplasma synoviae (MS) core genome multilocus sequence typing (cgMLST) scheme.
Ghanem, Mostafa; El-Gazzar, Mohamed
2018-05-01
Mycoplasma synoviae (MS) is a poultry pathogen with reported increased prevalence and virulence in recent years. MS strain identification is essential for prevention and control efforts and for epidemiological outbreak investigations. Multiple multilocus sequence typing schemes have been developed for MS, yet the resolution of these schemes can be limited for outbreak investigation. The cost of whole genome sequencing has become close to that of sequencing the seven MLST targets; however, there is no standardized method for typing MS strains based on whole genome sequences. In this paper, we propose a core genome multilocus sequence typing (cgMLST) scheme as a standardized and reproducible method for typing MS based on whole genome sequences. A diverse set of 25 MS whole genome sequences was used to identify 302 core genome genes as cgMLST targets (35.5% of the MS genome), and 44 whole genome sequences of MS isolates from six countries on four continents were typed using this scheme. cgMLST-based phylogenetic trees displayed a high degree of agreement with core genome SNP based analysis and available epidemiological information. cgMLST allowed evaluation of two conventional MLST schemes of MS. The high discriminatory power of cgMLST allowed differentiation between samples of the same conventional MLST type. cgMLST represents a standardized, accurate, highly discriminatory, and reproducible method for differentiation between MS isolates. Like conventional MLST, it provides stable and expandable nomenclature, allowing the typing results to be compared and shared between different laboratories worldwide. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil
2016-01-01
Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated - from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The on-going research detailed in this paper seeks to investigate three concepts. First, that the ability to utilize a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based-data interoperability in the pursuit of the MBE vision. Finally, it also seeks to explore the interaction between CAD and CMM processes and determine if the concept of feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based-data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based-data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, the gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method to model-based-data interoperability, which would provide maximum value and impact to industry.
GRID-BASED EXPLORATION OF COSMOLOGICAL PARAMETER SPACE WITH SNAKE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikkelsen, K.; Næss, S. K.; Eriksen, H. K., E-mail: kristin.mikkelsen@astro.uio.no
2013-11-10
We present a fully parallelized grid-based parameter estimation algorithm for investigating multidimensional likelihoods, called Snake, and apply it to cosmological parameter estimation. The basic idea is to map out the likelihood grid-cell by grid-cell according to decreasing likelihood, and stop when a certain threshold has been reached. This approach improves vastly on the 'curse of dimensionality' problem plaguing standard grid-based parameter estimation simply by disregarding grid cells with negligible likelihood. The main advantages of this method compared to standard Metropolis-Hastings Markov Chain Monte Carlo methods include (1) trivial extraction of arbitrary conditional distributions; (2) direct access to Bayesian evidences; (3) better sampling of the tails of the distribution; and (4) nearly perfect parallelization scaling. The main disadvantage is, as in the case of brute-force grid-based evaluation, a dependency on the number of parameters, N_par. One of the main goals of the present paper is to determine how large N_par can be while still maintaining reasonable computational efficiency; we find that N_par = 12 is well within the capabilities of the method. The performance of the code is tested by comparing cosmological parameters estimated using Snake and the WMAP-7 data with those obtained using CosmoMC, the current standard code in the field. We find fully consistent results, with similar computational expenses but shorter wall time due to the perfect parallelization scheme.
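A toy sketch of the grid-mapping idea follows; the frontier-based expansion, the stopping rule in log-likelihood units, and the function names are simplifications and assumptions, not the published algorithm.

```python
import heapq

def snake_grid_map(loglike, seed, threshold=12.0):
    """Sketch of the Snake idea: explore a parameter grid cell by cell in order of
    decreasing likelihood, expanding neighbours of visited cells and stopping once
    every remaining frontier cell is more than `threshold` log-units below the peak.
    `loglike(cell)` evaluates ln L at an integer grid index given as a tuple."""
    visited = {seed: loglike(seed)}
    frontier = [(-visited[seed], seed)]
    best = visited[seed]
    while frontier:
        neg_ll, cell = heapq.heappop(frontier)
        if -neg_ll < best - threshold:            # everything left is negligible
            break
        best = max(best, -neg_ll)
        for dim in range(len(cell)):              # expand the 2 * N_par grid neighbours
            for step in (-1, 1):
                nb = cell[:dim] + (cell[dim] + step,) + cell[dim + 1:]
                if nb not in visited:
                    visited[nb] = loglike(nb)
                    heapq.heappush(frontier, (-visited[nb], nb))
    return visited                                 # {cell: ln L} over the mapped region
```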
A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach.
Tipton, Elizabeth; Shuster, Jonathan
2017-10-15
Bland-Altman method comparison studies are common in the medical sciences and are used to compare a new measure to a gold-standard (often costlier or more invasive) measure. The distribution of these differences is summarized by two statistics, the 'bias' and standard deviation, and these measures are combined to provide estimates of the limits of agreement (LoA). When these LoA are within the bounds of clinically insignificant differences, the new non-invasive measure is preferred. Very often, multiple Bland-Altman studies have been conducted comparing the same two measures, and random-effects meta-analysis provides a means to pool these estimates. We provide a framework for the meta-analysis of Bland-Altman studies, including methods for estimating the LoA and measures of uncertainty (i.e., confidence intervals). Importantly, these LoA are likely to be wider than those typically reported in Bland-Altman meta-analyses. Frequently, Bland-Altman studies report results based on repeated measures designs but do not properly adjust for this design in the analysis. Meta-analyses of Bland-Altman studies frequently exclude these studies for this reason. We provide a meta-analytic approach that allows inclusion of estimates from these studies. This includes adjustments to the estimate of the standard deviation and a method for pooling the estimates based upon robust variance estimation. An example is included based on a previously published meta-analysis. Copyright © 2017 John Wiley & Sons, Ltd.
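A highly simplified sketch of per-study LoA computation and pooling is given below; it uses a DerSimonian-Laird model for the bias and a naive pooling of within-study variances, which only approximates the framework proposed in the paper.

```python
import numpy as np

def study_limits_of_agreement(differences):
    """Per-study Bland-Altman summaries: bias, SD of the differences, and 95% LoA."""
    d = np.asarray(differences, dtype=float)
    bias, sd = d.mean(), d.std(ddof=1)
    return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

def pooled_limits_of_agreement(biases, sds, ns):
    """Very simplified pooling sketch (not the paper's full method): the bias is pooled
    with a DerSimonian-Laird random-effects model, within-study SDs are pooled on the
    variance scale, and between-study heterogeneity is added back so the pooled LoA
    are wider than a naive fixed-effect interval."""
    biases, sds, ns = map(np.asarray, (biases, sds, ns))
    v = sds**2 / ns                               # variance of each study's bias estimate
    w = 1.0 / v
    q = np.sum(w * (biases - np.sum(w * biases) / w.sum())**2)
    tau2 = max(0.0, (q - (len(biases) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
    w_re = 1.0 / (v + tau2)
    mu = np.sum(w_re * biases) / w_re.sum()       # pooled bias
    sd_pool = np.sqrt(np.average(sds**2, weights=ns - 1) + tau2)
    return mu, (mu - 1.96 * sd_pool, mu + 1.96 * sd_pool)
```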
Code of Federal Regulations, 2010 CFR
2010-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint...) The HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing (Guidelines...
Code of Federal Regulations, 2012 CFR
2012-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint...) The HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing (Guidelines...
Code of Federal Regulations, 2011 CFR
2011-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint...) The HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing (Guidelines...
Code of Federal Regulations, 2013 CFR
2013-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint...) The HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing (Guidelines...
Code of Federal Regulations, 2014 CFR
2014-04-01
... Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint...) The HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing (Guidelines...
ROOT BIOMASS ALLOCATION IN THE WORLD'S UPLAND FORESTS
Because the world's forests play a major role in regulating nutrient and carbon cycles, there is much interest in estimating their biomass. Estimates of aboveground biomass based on well-established methods are relatively abundant; estimates of root biomass based on standard meth...
Viscosity measuring using microcantilevers
Oden, Patrick Ian
2001-01-01
A method for the measurement of the viscosity of a fluid uses a micromachined cantilever mounted on a moveable base. As the base is rastered while in contact with the fluid, the deflection of the cantilever is measured and the viscosity determined by comparison with standards.
NASA Astrophysics Data System (ADS)
Ali, Nauman; Ismail, Muhammad; Khan, Adnan; Khan, Hamayun; Haider, Sajjad; Kamal, Tahseen
2018-01-01
In this work, we have developed simple, sensitive and inexpensive methods for the spectrophotometric determination of urea in urine samples using silver nanoparticles (AgNPs). The standard addition and 2nd order derivative methods were adopted for this purpose. AgNPs were prepared by chemical reduction of AgNO3 with hydrazine using 1,3-di-(1H-imidazol-1-yl)-2-propanol (DIPO) as a stabilizing agent in aqueous medium. The proposed methods are based on the complexation of AgNPs with urea. Using this concept, urea in the urine samples was successfully determined by spectrophotometric methods. The results showed high percent recoveries with their associated relative standard deviations (RSD). The recoveries of urea in the three urine samples by the spectrophotometric standard addition method were 99.2% ± 5.37, 96.3% ± 4.49, and 104.88% ± 4.99, and those of the spectrophotometric 2nd order derivative method were 115.3% ± 5.2, 103.4% ± 2.6, and 105.93% ± 0.76. The results show that these methods can open doors for a potential role of AgNPs in the clinical determination of urea in urine, blood, and other biological and non-biological fluids.
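The standard addition calculation itself can be sketched as follows; the example readings are invented for illustration and are not data from the study.

```python
import numpy as np

def standard_addition_concentration(added_conc, absorbance):
    """Standard addition sketch: the response A = m * (C_added + C_x) is linear in the
    added concentration, so the unknown C_x equals intercept/slope of the fitted line.
    Assumes measurements lie in the linear response range."""
    slope, intercept = np.polyfit(np.asarray(added_conc, float),
                                  np.asarray(absorbance, float), 1)
    return intercept / slope

# Example with made-up readings after spiking 0, 2, 4, 6 mg/dL of urea:
# c_x = standard_addition_concentration([0, 2, 4, 6], [0.21, 0.35, 0.49, 0.63])  # ~3 mg/dL
```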
A Multi-Center Space Data System Prototype Based on CCSDS Standards
NASA Technical Reports Server (NTRS)
Rich, Thomas M.
2016-01-01
Deep space missions beyond Earth orbit will require new methods of data communications in order to compensate for increasing RF propagation delay. The Consultative Committee for Space Data Systems (CCSDS) standard protocols Spacecraft Monitor & Control (SM&C), Asynchronous Message Service (AMS), and Delay/Disruption Tolerant Networking (DTN) provide such a method. The maturity level of this protocol set is, however, insufficient for mission inclusion at this time. This prototype is intended to provide experience which will raise the Technology Readiness Level (TRL) of these protocols.
Filling the voids in the SRTM elevation model — A TIN-based delta surface approach
NASA Astrophysics Data System (ADS)
Luedeling, Eike; Siebert, Stefan; Buerkert, Andreas
The Digital Elevation Model (DEM) derived from NASA's Shuttle Radar Topography Mission is the most accurate near-global elevation model that is publicly available. However, it contains many data voids, mostly in mountainous terrain. This problem is particularly severe in the rugged Oman Mountains. This study presents a method to fill these voids using a fill surface derived from Russian military maps. For this we developed a new method, which is based on Triangular Irregular Networks (TINs). For each void, we extracted points around the edge of the void from the SRTM DEM and the fill surface. TINs were calculated from these points and converted to a base surface for each dataset. The fill base surface was subtracted from the fill surface, and the result added to the SRTM base surface. The fill surface could then seamlessly be merged with the SRTM DEM. For validation, we compared the resulting DEM to the original SRTM surface, to the fill DEM and to a surface calculated by the International Center for Tropical Agriculture (CIAT) from the SRTM data. We calculated the differences between measured GPS positions and the respective surfaces for 187,500 points throughout the mountain range (ΔGPS). Comparison of the means and standard deviations of these values showed that for the void areas, the fill surface was most accurate, with a standard deviation of the ΔGPS from the mean ΔGPS of 69 m, and only little accuracy was lost by merging it to the SRTM surface (standard deviation of 76 m). The CIAT model was much less accurate in these areas (standard deviation of 128 m). The results show that our method is capable of transferring the relative vertical accuracy of a fill surface to the void areas in the SRTM model, without introducing uncertainties about the absolute elevation of the fill surface. It is well suited for datasets with varying altitude biases, which is a common problem of older topographic information.
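A minimal sketch of the delta-surface step follows, using SciPy's Delaunay-based linear interpolator as the TIN base surfaces; the array names and this particular library choice are assumptions, not the authors' implementation.

```python
from scipy.interpolate import LinearNDInterpolator

def fill_void(void_xy, edge_xy, srtm_edge_z, fill_edge_z, fill_at_void):
    """TIN-based delta-surface sketch: base surfaces are linear (TIN) interpolations of
    the void-edge elevations from each dataset; the fill surface is shifted by the
    difference between the two base surfaces so it merges seamlessly with the SRTM DEM.
    void_xy: (m, 2) coordinates inside the void; edge_xy: (k, 2) points around its edge."""
    srtm_base = LinearNDInterpolator(edge_xy, srtm_edge_z)
    fill_base = LinearNDInterpolator(edge_xy, fill_edge_z)
    delta = srtm_base(void_xy) - fill_base(void_xy)   # local vertical adjustment
    return fill_at_void + delta                       # adjusted elevations inside the void
```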
Combinatorics of transformations from standard to non-standard bases in Brauer algebras
NASA Astrophysics Data System (ADS)
Chilla, Vincenzo
2007-05-01
Transformation coefficients between standard bases for irreducible representations of the Brauer centralizer algebra $\mathfrak{B}_f(x)$ and split bases adapted to the $\mathfrak{B}_{f_1}(x) \times \mathfrak{B}_{f_2}(x) \subset \mathfrak{B}_f(x)$ subalgebra ($f_1 + f_2 = f$) are considered. After providing the suitable combinatorial background, based on the definition of the i-coupling relation on nodes of the subduction grid, we introduce a generalized version of the subduction graph which extends the one given in Chilla (2006 J. Phys. A: Math. Gen. 39 7657) for symmetric groups. Thus, we can describe the structure of the subduction system arising from the linear method and give an outline of the form of the solution space. An ordering relation on the grid is also given and then, as in the case of symmetric groups, the choices of the phases and of the free factors governing the multiplicity separations are discussed.
Standardized reporting of functioning information on ICF-based common metrics.
Prodinger, Birgit; Tennant, Alan; Stucki, Gerold
2018-02-01
In clinical practice and research, a variety of clinical data collection tools are used to collect information on people's functioning for clinical practice, research, and national health information systems. Reporting on ICF-based common metrics enables standardized documentation of functioning information in national health information systems. The objective of this methodological note on applying the ICF in rehabilitation is to demonstrate how to report functioning information collected with a data collection tool on ICF-based common metrics. We first specify the requirements for the standardized reporting of functioning information. Secondly, we introduce the methods needed for transforming functioning data to ICF-based common metrics. Finally, we provide an example. The requirements for standardized reporting are as follows: 1) a common conceptual framework to enable content comparability between any health information; and 2) a measurement framework so that scores from two or more clinical data collection tools can be directly compared. The methods needed to achieve these requirements are the ICF Linking Rules and the Rasch measurement model. Using data collected incorporating the 36-item Short Form Health Survey (SF-36), the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0), and the Stroke Impact Scale 3.0 (SIS 3.0), the application of standardized reporting based on common metrics is demonstrated. A subset of items from the three tools, linked to common chapters of the ICF (d4 Mobility, d5 Self-care and d6 Domestic life), was entered as "super items" into the Rasch model. Good fit was achieved with no residual local dependency and a unidimensional metric. A transformation table allows for comparison between scales, and between a scale and the reporting common metric. Being able to report functioning information collected with commonly used clinical data collection tools on ICF-based common metrics enables clinicians and researchers to continue using their tools while still being able to compare and aggregate the information within and across tools.
Faassen, Elisabeth J.; Antoniou, Maria G.; Beekman-Lukassen, Wendy; Blahova, Lucie; Chernova, Ekaterina; Christophoridis, Christophoros; Combes, Audrey; Edwards, Christine; Fastner, Jutta; Harmsen, Joop; Hiskia, Anastasia; Ilag, Leopold L.; Kaloudis, Triantafyllos; Lopicic, Srdjan; Lürling, Miquel; Mazur-Marzec, Hanna; Meriluoto, Jussi; Porojan, Cristina; Viner-Mozzini, Yehudit; Zguna, Nadezda
2016-01-01
Exposure to β-N-methylamino-l-alanine (BMAA) might be linked to the incidence of amyotrophic lateral sclerosis, Alzheimer’s disease and Parkinson’s disease. Analytical chemistry plays a crucial role in determining human BMAA exposure and the associated health risk, but the performance of the various analytical methods currently employed is rarely compared. A CYANOCOST-initiated workshop was organized, aimed at training scientists in BMAA analysis, creating mutual understanding and paving the way towards interlaboratory comparison exercises. During this workshop, we tested different methods (extraction followed by derivatization and liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) analysis, or extraction directly followed by LC-MS/MS analysis) for trueness and intermediate precision. We adapted three workup methods for the underivatized analysis of animal, brain and cyanobacterial samples. Based on recovery of the internal standard D3BMAA, the underivatized methods were accurate (mean recovery 80%) and precise (mean relative standard deviation 10%), except for the cyanobacterium Leptolyngbya. However, total BMAA concentrations in the positive controls (cycad seeds) showed higher variation (relative standard deviation 21%–32%), implying that D3BMAA was not a good indicator for the release of BMAA from bound forms. Significant losses occurred during workup for the derivatized method, resulting in low recovery (<10%). Most BMAA was found in a trichloroacetic acid soluble, bound form, and we recommend including this fraction during analysis. PMID:26938542
De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos
2014-06-01
Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
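The core Poisson step can be sketched as follows; the confidence-interval construction shown is one reasonable choice and not necessarily the one adopted in the paper.

```python
import numpy as np
from scipy.stats import beta

def poisson_copies_per_reaction(n_negative, n_reactions=42, alpha=0.05):
    """Poisson sketch for repetitive-sampling Alu-HIV PCR: if a reaction is positive
    whenever it contains at least one integrated template, the mean number of templates
    per reaction is lambda = -ln(fraction of negative reactions). A Clopper-Pearson
    interval on the negative fraction is propagated to lambda.
    Assumes at least one negative reaction (otherwise lambda is unbounded)."""
    k, n = n_negative, n_reactions
    lam = -np.log(k / n)
    p_lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    p_hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    # A lower negative fraction means more templates, so the bounds swap.
    return lam, (-np.log(p_hi), -np.log(p_lo) if p_lo > 0 else np.inf)
```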
Al Feteisi, Hajar; Achour, Brahim; Rostami-Hodjegan, Amin; Barber, Jill
2015-01-01
Drug-metabolizing enzymes and transporters play an important role in drug absorption, distribution, metabolism and excretion and, consequently, they influence drug efficacy and toxicity. Quantification of drug-metabolizing enzymes and transporters in various tissues is therefore essential for comprehensive elucidation of drug absorption, distribution, metabolism and excretion. Recent advances in liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) have improved the quantification of pharmacologically relevant proteins. This report presents an overview of mass spectrometry-based methods currently used for the quantification of drug-metabolizing enzymes and drug transporters, mainly focusing on applications and cost associated with various quantitative strategies based on stable isotope-labeled standards (absolute quantification peptide standards, quantification concatemers, protein standards for absolute quantification) and label-free analysis. In mass spectrometry, there is no simple relationship between signal intensity and analyte concentration. Proteomic strategies are therefore complex and several factors need to be considered when selecting the most appropriate method for an intended application, including the number of proteins and samples. Quantitative strategies require appropriate mass spectrometry platforms, yet choice is often limited by the availability of appropriate instrumentation. Quantitative proteomics research requires specialist practical skills and there is a pressing need to dedicate more effort and investment to training personnel in this area. Large-scale multicenter collaborations are also needed to standardize quantitative strategies in order to improve physiologically based pharmacokinetic models.
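For illustration, the basic stable-isotope-dilution calculation common to AQUA/QconCAT-style workflows is sketched below; it is a generic formula, not a description of any specific assay mentioned in the review.

```python
def absolute_amount(light_area, heavy_area, heavy_amount_fmol):
    """Stable-isotope-dilution sketch: the analyte (light) amount equals the
    light/heavy peak-area ratio multiplied by the spiked amount of the labeled
    standard peptide, assuming equal ionization of the two forms."""
    return light_area / heavy_area * heavy_amount_fmol
```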
Faravan, Amir; Mohammadi, Nooredin; Alizadeh Ghavidel, Alireza; Toutounchi, Mohammad Zia; Ghanbari, Ameneh; Mazloomi, Mehran
2016-01-01
Standards play a significant role in defining the minimum level of optimal and expected performance. Since perfusion technology staff play a leading role in providing quality services to patients undergoing open heart surgery with a cardiopulmonary bypass machine, this study aimed to assess how Iranian perfusion technology staff evaluate and manage patients during cardiopulmonary bypass and to compare their practice with the standards recommended by the American Society of Extracorporeal Technology. In this descriptive study, data were collected from 48 Iranian public hospitals and educational health centers through a researcher-created questionnaire. The data collection questionnaire assessed the standards recommended by the American Society of Extracorporeal Technology. Findings showed that the full set of appropriate measures (preventing hemodilution, avoiding unnecessary transfusion of blood and blood products, determining the initial dose of heparin based on one of the proposed methods, monitoring anticoagulation based on ACT measurement, and determining additional doses of heparin during cardiopulmonary bypass based on ACT or protamine titration) was carried out by the perfusion technology staff in only 4.2% of hospitals and health centers. Current practices of cardiopulmonary perfusion technology in Iran fall short of the standards of the American Society of Cardiovascular Perfusion. This highlights the need for the authorities to attend to validation programs and to the development of caring standards, as well as to the continuous assessment of the use of these standards.
Stuart, Elizabeth A.; Lee, Brian K.; Leacy, Finbarr P.
2013-01-01
Objective: Examining covariate balance is the prescribed method for determining when propensity score methods are successful at reducing bias. This study assessed the performance of various balance measures, including a proposed balance measure based on the prognostic score (also known as the disease-risk score), to determine which balance measures best correlate with bias in the treatment effect estimate. Study Design and Setting: The correlations of multiple common balance measures with bias in the treatment effect estimate produced by weighting by the odds, subclassification on the propensity score, and full matching on the propensity score were calculated. Simulated data were used, based on realistic data settings. Settings included both continuous and binary covariates and continuous covariates only. Results: The standardized mean difference in prognostic scores, the mean standardized mean difference, and the mean t-statistic all had high correlations with bias in the effect estimate. Overall, prognostic scores displayed the highest correlations of all the balance measures considered. Prognostic score measure performance was generally not affected by model misspecification and performed well under a variety of scenarios. Conclusion: Researchers should consider using prognostic score-based balance measures for assessing the performance of propensity score methods for reducing bias in non-experimental studies. PMID:23849158
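A sketch of the standardized mean difference, the balance measure applied both to individual covariates and to the prognostic score, is given below; the pooled-SD denominator is a common convention and may differ from the paper's exact definition.

```python
import numpy as np

def standardized_mean_difference(x_treat, x_control):
    """Standardized mean difference used as a balance measure: difference in group
    means divided by the pooled standard deviation."""
    x_treat, x_control = np.asarray(x_treat, float), np.asarray(x_control, float)
    pooled_sd = np.sqrt((x_treat.var(ddof=1) + x_control.var(ddof=1)) / 2.0)
    return (x_treat.mean() - x_control.mean()) / pooled_sd

# Prognostic-score balance (illustrative): fit an outcome model in the control group
# only, predict the "disease risk" score for everyone, then check the SMD of that
# score between treated and control groups after weighting, subclassification, or matching.
```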
Aligning ERP systems with companies' real needs: an `Operational Model Based' method
NASA Astrophysics Data System (ADS)
Mamoghli, Sarra; Goepp, Virginie; Botta-Genoulaz, Valérie
2017-02-01
Enterprise Resource Planning (ERP) systems offer standard functionalities that have to be configured and customised by a specific company depending on its own requirements. A consistent alignment is therefore an essential success factor of ERP projects. To manage this alignment, an 'Operational Model Based' method is proposed. It is based on the design and the matching of models, and conforms to the modelling views and constructs of the ISO 19439 and 19440 enterprise-modelling standards. It is characterised by: (1) a predefined design and matching order of the models; (2) the formalisation, in terms of modelling constructs, of alignment and misalignment situations; and (3) their association with a set of decisions in order to mitigate the misalignment risk. Thus, a comprehensive understanding of the alignment management during ERP projects is given. Unlike existing methods, this one includes decisions related to the organisational changes an ERP system can induce, as well as criteria on which the best decision can be based. In this way, it provides effective support and guidance to companies implementing ERP systems, as the alignment process is detailed and structured. The method is applied on the ERP project of a Small and Medium Enterprise, showing that it can be used even in contexts where the ERP project expertise level is low.
The role of the Standard Days Method in modern family planning services in developing countries.
Lundgren, Rebecka I; Karra, Mihira V; Yam, Eileen A
2012-08-01
The mere availability of family planning (FP) services is not sufficient to improve reproductive health; services must also be of adequate quality. The introduction of new contraceptive methods is a means of improving quality of care. The Standard Days Method (SDM) is a new fertility-awareness-based contraceptive method that has been successfully added to reproductive health care services around the world. Framed by the Bruce-Jain quality-of-care paradigm, this paper describes how the introduction of SDM in developing country settings can improve the six elements of quality while contributing to the intrinsic variety of available methods. SDM meets the needs of women and couples who opt not to use other modern methods. SDM providers are sensitised to the potential of fertility-awareness-based contraception as an appropriate choice for these clients. SDM requires the involvement of both partners and thus offers a natural entry point for providers to further explore partner communication, intimate partner violence, condoms, and HIV/STIs. SDM introduction broadens the range of FP methods available to couples in developing countries. SDM counselling presents an opportunity for FP providers to discuss important interpersonal and reproductive health issues with potential users.