SETs: stand evaluation tools: II. tree value conversion standards for hardwood sawtimber
Joseph J. Mendel; Paul S. DeBald; Martin E. Dale
1976-01-01
Tree quality index tables are presented for 12 important hardwood species of the oak-hickory forest. From these, tree value conversion standards are developed for each species, log grade, merchantable height, and diameter at breast height. The method of calculating tree value conversion standards and adapting them to different conditions is explained. A computer...
Tree value conversion standards revisited
Paul S. DeBald; Martin E. Dale
1991-01-01
Updated tree value conversion standards (TVCS) are presented for 12 important hardwood species of the oak-hickory forest. These updated standards-developed for each species by butt-log grade, merchantable height, and diameter at breast height-reflect the changes in lumber prices and in conversion costs which have occurred since 1976 when the original TVCS were...
McDonough, Ian M; Bui, Dung C; Friedman, Michael C; Castel, Alan D
2015-10-01
The perceived value of information can influence one's motivation to successfully remember that information. This study investigated how information value can affect memory search and evaluation processes (i.e., retrieval monitoring). In Experiment 1, participants studied unrelated words associated with low, medium, or high values. Subsequent memory tests required participants to selectively monitor retrieval for different values. False memory effects were smaller when searching memory for high-value than low-value words, suggesting that people more effectively monitored more important information. In Experiment 2, participants studied semantically-related words, and the need for retrieval monitoring was reduced at test by using inclusion instructions (i.e., endorsement of any word related to the studied words) compared with standard instructions. Inclusion instructions led to increases in false recognition for low-value, but not for high-value words, suggesting that under standard-instruction conditions retrieval monitoring was less likely to occur for important information. Experiment 3 showed that words retrieved with lower confidence were associated with more effective retrieval monitoring, suggesting that the quality of the retrieved memory influenced the degree and effectiveness of monitoring processes. Ironically, unless encouraged to do so, people were less likely to carefully monitor important information, even though people want to remember important memories most accurately. Copyright © 2015 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Davis, Jan Ellen Pfeiffer
2011-01-01
In 2009, a PK-12 public school district board of education approved a teacher evaluation tool developed by a volunteer team of teachers and administrators. The Learning Based Teacher Evaluation (LBTE) was constructed with six broad standards and fifteen specific criteria. The standards and criteria were assumed important to professional practice,…
[Comparative analysis on industrial standardization degree of Chinese and Korean ginseng].
Chu, Qiao; Xi, Xing-Jun; Wang, He-Yan; Si, Ding-Hua; Tang, Fei; Lan, Tao
2017-05-01
Panax ginseng is a well-known medicinal plant all over the world, with high nutritional and medicinal value. China and South Korea are the major countries for ginseng cultivation, production, and exportation. China's ginseng production accounts for more than half of the world total, but its output value is less than that of Korea, and the standardization of the ginseng industry plays an important role in this gap. This paper analyzes the Chinese and Korean national ginseng standards and standardization processes in detail, comparing the categories, standard contents, index selection, age, and implementation and promotion status of the Chinese and Korean ginseng standards. The disadvantages in the development of ginseng industry standardization are identified, and we give our advice on the revision and implementation of China's ginseng industry standards, hoping to enhance the competitiveness of China's ginseng industry. Copyright© by the Chinese Pharmaceutical Association.
Standards: The Keys to Domestic and International Competitiveness.
ERIC Educational Resources Information Center
Hunter, Robert D.
1993-01-01
Demonstrates the importance of standards for the competitiveness of U.S. companies and for international trade. The value of standards in research and development, marketing, design, purchasing, manufacturing, installation, and service is explained. Examples of specific standards and their application to the computer industry are included. (10…
HOW STUDENTS USE VALUES IN DECISION-MAKING.
ERIC Educational Resources Information Center
VARENHORST, BARBARA B.
High school students are examining and internalizing values. Their fear of commitment to anything deviating from accepted values may hinder their clarification of personal goals and values. The decision-making process should be an important part of the guidance program. Standard questionnaires do not provide information about specific factors…
Banks, Debbie
2014-10-01
The purpose of this paper is to discuss the importance of standard representational photography in clinical photography and its role in maintaining the gold standard in the medical illustration profession. It is important that professionals are aware of potential threats to good practice; many such areas are identified in the article.
Vitamin D measurement standardization: The way out of the chaos.
Binkley, N; Dawson-Hughes, B; Durazo-Arvizu, R; Thamm, M; Tian, L; Merkel, J M; Jones, J C; Carter, G D; Sempos, C T
2017-10-01
Substantial variability is associated with laboratory measurement of serum total 25-hydroxyvitamin D [25(OH)D]. The resulting chaos impedes development of consensus 25(OH)D values to define stages of vitamin D status. As resolving this situation requires standardized measurement of 25(OH)D, the Vitamin D Standardization Program (VDSP) developed methodology to standardize 25(OH)D measurement to the gold standard reference measurement procedures of NIST, Ghent University and CDC. Importantly, VDSP developed protocols for standardizing 25(OH)D values from prior research based on availability of stored serum samples. The effect of such retrospective standardization on the prevalence of "low" vitamin D status is reported here for the Third National Health and Nutrition Examination Survey (NHANES III, 1988-1994) and the German Health Interview and Examination Survey for Children and Adolescents (KIGGS, 2003-2006): standardized 25(OH)D values were lower than the original values in NHANES III and higher in KIGGS. In NHANES III the percentage with values below 30, 50 and 75 nmol/L increased from 4% to 6%, 22% to 31% and 55% to 71%, respectively, whereas in KIGGS after standardization the percentage below 30, 50, and 70 nmol/L decreased from 28% to 13%, 64% to 47% and 87% to 85%, respectively. Moreover, in a hypothetical example, depending on whether the 25(OH)D assay was positively or negatively biased by 12%, the 25(OH)D concentration which maximally suppressed PTH could vary from 20 to 35 ng/mL. These examples underscore the challenges (perhaps impossibility) of developing vitamin D guidelines using unstandardized 25(OH)D data. Retrospective 25(OH)D standardization can be applied to old studies where stored serum samples exist. As a way forward, we suggest an international effort to identify key prior studies with stored samples for re-analysis and standardization, initially to define the 25(OH)D level associated with vitamin D deficiency (rickets/osteomalacia). Subsequent work could focus on defining inadequacy. Finally, the examples reported here highlight the importance of suspending publication of meta-analyses based on unstandardized 25(OH)D results. Published by Elsevier Ltd.
The Emerging Importance of Business Process Standards in the Federal Government
2006-02-23
delivers enough value for its commercialization into the general industry. Today, we are seeing standards such as SOA, BPMN and BPEL hit that... Process Modeling Notation (BPMN) and the Business Process Execution Language (BPEL). BPMN provides a standard representation for capturing and... execution. The combination of BPMN and BPEL offers organizations the potential to standardize processes in a distributed environment, enabling
Computation of Standard Errors
Dowd, Bryan E; Greene, William H; Norton, Edward C
2014-01-01
Objectives We discuss the problem of computing the standard errors of functions involving estimated parameters and provide the relevant computer code for three different computational approaches using two popular computer packages. Study Design We show how to compute the standard errors of several functions of interest: the predicted value of the dependent variable for a particular subject, and the effect of a change in an explanatory variable on the predicted value of the dependent variable for an individual subject and average effect for a sample of subjects. Empirical Application Using a publicly available dataset, we explain three different methods of computing standard errors: the delta method, Krinsky–Robb, and bootstrapping. We provide computer code for Stata 12 and LIMDEP 10/NLOGIT 5. Conclusions In most applications, choice of the computational method for standard errors of functions of estimated parameters is a matter of convenience. However, when computing standard errors of the sample average of functions that involve both estimated parameters and nonstochastic explanatory variables, it is important to consider the sources of variation in the function's values. PMID:24800304
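The abstract names three computational approaches. Below is a minimal sketch of all three for a single predicted probability from a logistic regression, using NumPy and statsmodels; the simulated data, the model, and the evaluation point x0 are illustrative assumptions, not the paper's empirical application or its Stata/LIMDEP code.

```python
# Minimal sketch (not the paper's code): delta-method, Krinsky-Robb, and
# bootstrap standard errors for a predicted probability from a logit model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-(0.5 + 1.0 * x)))).astype(int)
X = sm.add_constant(x)

fit = sm.Logit(y, X).fit(disp=False)
beta, V = fit.params, fit.cov_params()            # estimates and covariance

x0 = np.array([1.0, 0.3])                         # one hypothetical subject
p0 = 1 / (1 + np.exp(-x0 @ beta))                 # predicted probability

# 1) Delta method: Var(g(beta)) ~ grad' V grad; grad of logistic = p(1-p)*x0
grad = p0 * (1 - p0) * x0
se_delta = np.sqrt(grad @ V @ grad)

# 2) Krinsky-Robb: draw parameters from N(beta, V) and recompute g
draws = rng.multivariate_normal(beta, V, size=2000)
se_kr = np.std(1 / (1 + np.exp(-draws @ x0)), ddof=1)

# 3) Nonparametric bootstrap over subjects
boot = [1 / (1 + np.exp(-x0 @ sm.Logit(y[i], X[i]).fit(disp=False).params))
        for i in (rng.integers(0, n, n) for _ in range(300))]
se_boot = np.std(boot, ddof=1)

print(f"p0={p0:.3f} SE: delta={se_delta:.4f} "
      f"Krinsky-Robb={se_kr:.4f} bootstrap={se_boot:.4f}")
```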
The Measurement of Values: Effects of Different Assessment Procedures
ERIC Educational Resources Information Center
Feather, N. T.
1973-01-01
Rating and pair-comparison procedures for assessing the importance of terminal and instrumental values were compared with the standard ranking procedure developed by Rokeach. Effects of order of presentation of the value sets were also investigated. Neither procedure nor order had a replicable effect, though some sex differences were apparent. (TO)
Li, Zijian; Jennings, Aaron A.
2017-01-01
Worldwide, jurisdictions are making efforts to regulate pesticide standard values in residential soil, drinking water, air, and agricultural commodities to lower the risk of pesticide impacts on human health. Because humans may be exposed to pesticides in many ways, such as ingestion, inhalation, and dermal contact, it is important to examine pesticide standards by considering all major exposure pathways. In this study, analysis of implied maximum dose limits for commonly used historical and current pesticides was adopted to examine whether worldwide pesticide standard values are sufficient to protect human health. The analysis shows that only the U.S. has regulated pesticide standards in air. Only 4% of the total number of implied maximum dose limits is based on all three major exposure routes. For chlorpyrifos, at least 77.5% of the implied maximum dose limits are above the acceptable daily intake. Most jurisdictions have not yet provided pesticide standards for all major exposure routes, and some of the existing standards are not sufficient to protect human health. PMID:29546224
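As a rough illustration of the dose-aggregation idea described above (not the study's actual model), the sketch below converts medium-specific standards into implied daily doses via intake rates and body weight, sums across pathways, and compares the total with an acceptable daily intake (ADI). All standard values, intake rates, and the ADI are hypothetical placeholders.

```python
# Illustrative only: implied dose from medium-specific standards vs. an ADI.
# The standards, intake assumptions and ADI below are generic placeholders,
# not values taken from the cited study.
def implied_dose(standard, intake_rate, body_weight_kg=70.0):
    """Daily dose (mg/kg-day) implied by a medium standard.
    standard: concentration limit (mg/kg soil, mg/L water, mg/m3 air)
    intake_rate: daily intake of that medium (kg/day, L/day, m3/day)."""
    return standard * intake_rate / body_weight_kg

pathways = {
    "soil (2 mg/kg, 100 mg soil/day)": implied_dose(2.0, 100e-6),
    "water (0.03 mg/L, 2 L/day)":      implied_dose(0.03, 2.0),
    "air (0.001 mg/m3, 20 m3/day)":    implied_dose(0.001, 20.0),
}
total = sum(pathways.values())
ADI = 0.01  # mg/kg-day, hypothetical acceptable daily intake
for pathway, dose in pathways.items():
    print(f"{pathway}: {dose:.5f} mg/kg-day")
print(f"total implied dose {total:.5f} vs ADI {ADI}: "
      f"{'exceeds' if total > ADI else 'within'} ADI")
```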
Tibiofemoral wear in standard and non-standard squat: implication for total knee arthroplasty.
Fekete, Gusztáv; Sun, Dong; Gu, Yaodong; Neis, Patric Daniel; Ferreira, Ney Francisco; Innocenti, Bernardo; Csizmadia, Béla M
2017-01-01
Due to more resilient biomaterials, problems related to wear in total knee replacements (TKRs) have decreased but not disappeared. Among the design-related factors, wear is still the second most important mechanical factor that limits the lifetime of TKRs, and it is also highly influenced by the local kinematics of the knee. During wear experiments, a constant load and a constant slide-roll ratio are frequently applied in tribo-tests, beside other important parameters. Nevertheless, numerous studies have demonstrated that a constant slide-roll ratio is not an accurate approach when TKR wear is modelled, and that instead of a constant load, a flexion-angle-dependent tibiofemoral force should be included in the wear model to obtain realistic results. A new analytical wear model, based upon Archard's law, is introduced, which can determine the effect of the tibiofemoral force and the varying slide-roll ratio on wear in the tibiofemoral connection under standard and non-standard squat movements. The calculated total wear with constant slide-roll during standard squat was 5.5 times higher than the reference value, while if total wear includes varying slide-roll during standard squat, the calculated wear was approximately 6.25 times higher. With regard to non-standard squat, total wear with constant slide-roll was 4.16 times higher than the reference value; if total wear included varying slide-roll, the calculated wear was approximately 4.75 times higher. It was demonstrated that the augmented force parameter alone caused a 65% higher wear volume, while the slide-roll ratio itself increased wear volume by 15% compared to the reference value. These results indicate that the force component has the major effect on wear propagation, and that non-standard squat should be proposed for TKR patients as a rehabilitation exercise.
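A hedged sketch of the kind of Archard-type calculation the abstract describes: wear is accumulated over a squat cycle from a flexion-dependent force and a (constant or varying) slide-roll ratio. The force profile, slide-roll profiles, rolling-length proxy, and wear coefficient below are invented placeholders, not the parameters of the cited analytical model.

```python
# Hedged sketch of an Archard-type wear estimate over one squat cycle:
# wear volume ~ (k/H) * sum over the cycle of F(theta) * sliding(theta).
import numpy as np

theta = np.linspace(0.0, np.pi / 2, 500)          # flexion angle, 0-90 deg
F = 300.0 + 2200.0 * np.sin(theta) ** 2           # tibiofemoral force [N]
k_over_H = 1.0e-7                                 # wear coeff./hardness (placeholder)
roll = 0.002 * np.ones_like(theta)                # rolling increment per step [m]

def wear_volume(slide_roll):
    """Archard's law summed over the cycle for a given slide-roll profile."""
    sliding = slide_roll * roll                   # sliding increment [m]
    return np.sum(k_over_H * F * sliding)

V_const = wear_volume(np.full_like(theta, 0.4))        # constant slide-roll
V_vary = wear_volume(0.2 + 0.4 * theta / theta.max())  # flexion-dependent
print(f"constant ratio: {V_const:.3e}, varying ratio: {V_vary:.3e}, "
      f"varying/constant: {V_vary / V_const:.2f}")
```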
Physiological pharmacokinetic/pharmacodynamic models require Vmax, Km values for the metabolism of OPs by tissue enzymes. Current literature values cannot be easily used in OP PBPK models (i.e., parathion and chlorpyrifos) because standard methodologies were not used in their ...
The Future Is Performance Assessment
ERIC Educational Resources Information Center
French, Dan
2017-01-01
As more people question the value of standardized testing, the public appetite for a change in the accountability system grows. A 2016 national survey found that "voters consider standardized tests the least important factor in measuring the performance of students," preferring instead to have a multiple-measures data dashboard of…
Potassium Isotopic Compositions of NIST Potassium Standards and 40Ar/39Ar Mineral Standards
NASA Technical Reports Server (NTRS)
Morgan, Leah; Tappa, Mike; Ellam, Rob; Mark, Darren; Higgins, John; Simon, Justin I.
2013-01-01
Knowledge of the isotopic ratios of standards, spikes, and reference materials is fundamental to the accuracy of many geochronological methods. For example, the 238U/235U ratio relevant to U-Pb geochronology was recently re-determined [1] and shown to differ significantly from the previously accepted value employed during age determinations. These underlying values are fundamental to accurate age calculations in many isotopic systems, and uncertainty in these values can represent a significant (and often unrecognized) portion of the uncertainty budget for determined ages. The potassium isotopic composition of mineral standards, or neutron flux monitors, is a critical, but often overlooked component in the calculation of K-Ar and 40Ar/39Ar ages. It is currently assumed that all terrestrial materials have abundances indistinguishable from that of NIST SRM 985 [2]; this is apparently a reasonable assumption at the 0.25 per mille level (1s) [3]. The 40Ar/39Ar method further relies on the assumption that standards and samples (including primary and secondary standards) have indistinguishable 40K/39K values. We will present data establishing the potassium isotopic compositions of NIST isotopic K SRM 985, elemental K SRM 999b, and 40Ar/39Ar biotite mineral standard GA1550 (sample MD-2). Stable isotopic compositions (41K/39K) were measured by the peak shoulder method with high resolution MC-ICP-MS (Thermo Scientific NEPTUNE Plus), using the accepted value of NIST isotopic SRM 985 [2] for fractionation [4] corrections [5]. 40K abundances were measured by TIMS (Thermo Scientific TRITON), using 41K/39K values from ICP-MS measurements (or, for SRM 985, values from [2]) for internal fractionation corrections. Collectively these data represent an important step towards a metrologically traceable calibration of 40K concentrations in primary 40Ar/39Ar mineral standards and improve uncertainties by ca. an order of magnitude in the potassium isotopic compositions of standards.
The Real Value of Teachers: If Good Teachers Matter, Why Don't We Act like It?
ERIC Educational Resources Information Center
Carey, Kevin
2009-01-01
Almost nobody, it seems, disputes the importance of effective teachers--including teachers themselves. However, principals actually do none of the things they do when they value something as highly as most people say they value good teachers. By looking at scores on year-end standardized tests by teachers, principals think they have a pretty good…
Gómez-Campos, Rossana; Lee Andruske, Cinthya; Hespanhol, Jefferson; Sulla Torres, Jose; Arruda, Miguel; Luarte-Rocha, Cristian; Cossio-Bolaños, Marco Antonio
2015-01-01
The measurement of waist circumference (WC) is considered to be an important means to control overweight and obesity in children and adolescents. The objectives of the study were to (a) compare the WC measurements of Chilean students with the international CDC-2012 standard and other international standards, and (b) propose a specific measurement value for the WC of Chilean students based on age and sex. A total of 3892 students (6 to 18 years old) were assessed. Weight, height, body mass index (BMI), and WC were measured. WC was compared with the CDC-2012 international standard. Percentiles were constructed based on the LMS method. Chilean males had a greater WC during infancy. Subsequently, in late adolescence, males showed values lower than those of the international standards. Chilean females demonstrated values similar to the standards until the age of 12. Subsequently, females showed lower values. The 85th and 95th percentiles were adopted as cutoff points for evaluating overweight and obesity based on age and sex. The WC of Chilean students differs from the CDC-2012 curves. The regional norms proposed are a means to identify children and adolescents with a high risk of suffering from overweight and obesity disorders. PMID:26184250
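Since the percentiles above were constructed with the LMS method, a small sketch of that transform may help: a measurement is converted to a z-score from age- and sex-specific L (skewness), M (median), and S (coefficient of variation) parameters. The L, M, S values and cutoff below are illustrative placeholders, not the Chilean or CDC-2012 reference parameters.

```python
# Sketch of the LMS z-score transform used to build growth references:
# z = ((X/M)**L - 1) / (L*S) for L != 0, and ln(X/M)/S when L == 0.
from math import log
from statistics import NormalDist

def lms_zscore(x, L, M, S):
    """Convert a measurement x to a z-score given LMS parameters."""
    return log(x / M) / S if L == 0 else ((x / M) ** L - 1.0) / (L * S)

L, M, S = -0.5, 62.0, 0.12         # hypothetical WC reference at one age/sex
wc = 71.5                          # measured waist circumference, cm
z = lms_zscore(wc, L, M, S)
pct = NormalDist().cdf(z) * 100
print(f"z = {z:.2f}, percentile = {pct:.1f} "
      f"({'at or above the 85th cutoff' if pct >= 85 else 'below the 85th cutoff'})")
```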
Disruptive Innovation: Value-Based Health Plans
Vogenberg, F. Randy
2008-01-01
Value and a Complex Healthcare Market: What Is Value to an Employer? "Worth in usefulness or importance to the possessor; utility or merit" (American Heritage Dictionary); "A principle, standard, or quality considered worthwhile or desirable" (American Heritage Stedman's Medical Dictionary); "A fair return or equivalent in goods, services, or money for something exchanged" (Merriam-Webster's Dictionary of Law). PMID:25128808
On the long-term stability of calibration standards in different matrices.
Kandić, A; Vukanac, I; Djurašević, M; Novković, D; Šešlak, B; Milošević, Z
2012-09-01
In order to assure Quality Control in accordance with ISO/IEC 17025, it was important, from a metrological point of view, to examine the long-term stability of previously prepared calibration standards. A comprehensive reconsideration of efficiency curves with respect to the ageing of calibration standards is presented in this paper. The calibration standards were re-used after a period of 5 years, and analysis of the results showed discrepancies in efficiency values. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Konopík, P.; Džugan, J.; Bucki, T.; Rzepa, S.; Rund, M.; Procházka, R.
2017-02-01
Absorbed energy obtained from Charpy impact tests is one of the most important values in many applications, for example in the residual lifetime assessment of components in service. Minimal absorbed energy is often the crucial value for extending the service life of components, e.g. turbines, boilers and steam lines. Using portable electric discharge sampling equipment (EDSE), it is possible to sample experimental material non-destructively and subsequently produce mini-Charpy specimens. This paper presents a new approach to correlating sub-size and standard Charpy test results.
ERIC Educational Resources Information Center
Osteen, Philip J.
2011-01-01
Edicts within the Council on Social Work Education's 2008 Educational Policy and Accreditation Standards address the importance of understanding the intersection of personal and professional values. Twenty MSW students, chosen on the basis of diverse cultural and personal characteristics, were interviewed about their motivations for pursuing a MSW…
Effect of poor control of film processors on mammographic image quality.
Kimme-Smith, C; Sun, H; Bassett, L W; Gold, R H
1992-11-01
With the increasingly stringent standards of image quality in mammography, film processor quality control is especially important. Current methods are not sufficient for ensuring good processing. The authors used a sensitometer and densitometer system to evaluate the performance of 22 processors at 16 mammographic facilities. Standard sensitometric values of two films were established, and processor performance was assessed for variations from these standards. Developer chemistry of each processor was analyzed and correlated with its sensitometric values. Ten processors were retested, and nine were found to be out of calibration. The developer components of hydroquinone, sulfites, bromide, and alkalinity varied the most, and low concentrations of hydroquinone were associated with lower average gradients at two facilities. Use of the sensitometer and densitometer system helps identify out-of-calibration processors, but further study is needed to correlate sensitometric values with developer component values. The authors believe that present quality control would be improved if sensitometric or other tests could be used to identify developer components that are out of calibration.
Kassam, Alisha; Skiadaresis, Julia; Habib, Sharifa; Alexander, Sarah; Wolfe, Joanne
2013-03-01
The National Consensus Project (NCP) published a set of standards for quality palliative care delivery. A key step before applying these guidelines to pediatric oncology is to evaluate how much families and clinicians value these standards. We aimed to determine which elements of palliative care are considered important according to bereaved parents and pediatric oncology clinicians and to determine accessibility of these elements. We administered questionnaires to 75 bereaved parents (response rate, 54%) and 48 pediatric oncology clinicians (response rate, 91%) at a large teaching hospital. Outcome measures included importance ratings and accessibility of core elements of palliative care delivery. Fifteen of 20 core elements were highly valued by both parents and clinicians (defined as > 60% of parents and clinicians reporting the item as important). Compared with clinicians, parents gave higher ratings to receiving cancer-directed therapy during the last month of life (P < .01) and involvement of a spiritual mentor (P = .03). Of the valued elements, only three were accessible more than 60% of the time according to clinicians and parents. Valued elements least likely to be accessible included a direct admission policy to hospital, sibling support, and parent preparation for medical aspects surrounding death. Parents and clinicians highly value a majority of palliative care elements described in the NCP framework. Children with advanced cancer may not be receiving key elements of palliative care despite parents and clinicians recognizing them as important. Evaluation of barriers to provision of quality palliative care and strategies for overcoming them are critical.
Michaud, Ginette Y
2005-01-01
In the field of clinical laboratory medicine, standardization is aimed at increasing the trueness and reliability of measured values. Standardization relies on the use of written standards, reference measurement procedures and reference materials. These are important tools for the design and validation of new tests, and for establishing the metrological traceability of diagnostic assays. Their use supports the translation of research technologies into new diagnostic assays and leads to more rapid advances in science and medicine, as well as improvements in the quality of patient care. The various standardization tools are described, as are the procedures by which written standards, reference procedures and reference materials are developed. Recent efforts to develop standards for use in the field of molecular diagnostics are discussed. The recognition of standardization tools by the FDA and other regulatory authorities is noted as evidence of their important role in ensuring the safety and performance of in vitro diagnostic devices.
Value-Based Medicine and Pharmacoeconomics.
Brown, Gary C; Brown, Melissa M
2016-01-01
Pharmacoeconomics is assuming increasing importance in the pharmaceutical field since it is entering the public policy arena in many countries. Among the variants of pharmacoeconomic analysis are cost-minimization, cost-benefit, cost-effectiveness and cost-utility analyses. The latter is the most versatile and sophisticated in that it integrates the patient benefit (patient value) conferred by a drug in terms of improvement in length and/or quality of life. It also incorporates the costs expended for that benefit, as well as the dollars returned to patients and society from the use of a drug (financial value). Unfortunately, one cost-utility analysis in the literature is generally not comparable to another because of the lack of standardized formats and standardized input variables (costs, cost perspective, quality-of-life measurement instruments, quality-of-life respondents, discounting and so forth). Thus, millions of variants can be used. Value-based medicine® (VBM) cost-utility analysis standardizes these variants so that one VBM analysis is comparable to another. This system provides a highly rational methodology that allows providers and patients to quantify and compare the patient value and financial value gains associated with the use of pharmaceutical agents for example. © 2016 S. Karger AG, Basel.
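A minimal worked example of the cost-utility arithmetic the abstract refers to: the incremental cost per quality-adjusted life year (QALY) gained. All numbers are hypothetical, not drawn from any study cited here.

```python
# Hypothetical cost-utility comparison of a new drug vs. standard care.
# QALYs combine length and quality of life; all numbers are invented.
def icur(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-utility ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

ratio = icur(cost_new=48_000, cost_old=30_000, qaly_new=6.1, qaly_old=5.5)
print(f"${ratio:,.0f} per QALY gained")   # -> $30,000 per QALY gained
```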
Hietala, P; Wolfová, M; Wolf, J; Kantanen, J; Juga, J
2014-02-01
Improving the feed efficiency of dairy cattle has a substantial effect on the economic efficiency and on the reduction of harmful environmental effects of dairy production through lower feeding costs and emissions from dairy farming. To assess the economic importance of feed efficiency in the breeding goal for dairy cattle, the economic values for the current breeding goal traits and the additional feed efficiency traits for Finnish Ayrshire cattle under production circumstances in 2011 were determined. The derivation of economic values was based on a bioeconomic model in which the profit of the production system was calculated, using the generated steady state herd structure. Considering beef production from dairy farms, 2 marketing strategies for surplus calves were investigated: (A) surplus calves were sold at a young age and (B) surplus calves were fattened on dairy farms. Both marketing strategies were unprofitable when subsidies were not included in the revenues. When subsidies were taken into account, a positive profitability was observed in both marketing strategies. The marginal economic values for residual feed intake (RFI) of breeding heifers and cows were -25.5 and -55.8 €/kg of dry matter per day per cow and year, respectively. The marginal economic value for RFI of animals in fattening was -29.5 €/kg of dry matter per day per cow and year. To compare the economic importance among traits, the standardized economic weight of each trait was calculated as the product of the marginal economic value and the genetic standard deviation; the standardized economic weight expressed as a percentage of the sum of all standardized economic weights was called relative economic weight. When not accounting for subsidies, the highest relative economic weight was found for 305-d milk yield (34% in strategy A and 29% in strategy B), which was followed by protein percentage (13% in strategy A and 11% in strategy B). The third most important traits were calving interval (9%) and mature weight of cows (11%) in strategy A and B, respectively. The sums of the relative economic weights over categories for RFI were 6 and 7% in strategy A and B, respectively. Under production conditions in 2011, the relative economic weights for the studied feed efficiency traits were low. However, it is possible that the relative importance of feed efficiency traits in the breeding goal will increase in the future due to increasing requirements to mitigate the environmental impact of milk production. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
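The standardization described above is a simple product-and-share calculation: standardized economic weight = marginal economic value × genetic standard deviation, and the relative weight is each trait's share of the sum of standardized weights. The sketch below illustrates this; the genetic standard deviations and most marginal values are invented placeholders (only the RFI marginal value is quoted from the abstract), so the resulting shares will not match the paper's.

```python
# Sketch of the trait-weight standardization described in the abstract.
traits = {                      # (marginal value, genetic SD) - placeholders
    "305-d milk yield": (0.10, 900.0),
    "protein %":        (120.0, 0.25),
    "calving interval": (-3.0, 8.0),
    "RFI (cows)":       (-55.8, 0.35),   # marginal value quoted in abstract
}
std = {t: value * sd for t, (value, sd) in traits.items()}
total = sum(abs(w) for w in std.values())
for trait, w in std.items():
    print(f"{trait:18s} standardized: {w:8.1f}   relative: {abs(w)/total:5.1%}")
```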
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salata, C; David, M; Almeida, C de
2014-06-15
Purpose: To compare absorbed dose to water standards for HDR brachytherapy dosimetry developed by the Radiological Science Laboratory of Rio de Janeiro State University (LCR) and the National Research Council, Canada (NRC). Methods: The two institutions have separately developed absorbed dose standards based on the Fricke dosimetry system. There are important differences between the two standards, including: preparation and read-out of the Fricke solution, irradiation geometry of the Fricke holder in relation to the Ir-192 source, and determination of the G-value to be used at Ir-192 energies. All measurements for both standards were made directly at the NRC laboratory (i.e., no transfer instrument was used) using a single Ir-192 source (microSelectron v2). In addition, the NRC group has established a self-consistent method to determine the G-value for Ir-192, based on an interpolation between G-values obtained at Co-60 and 250 kVp X-rays, and this measurement was repeated using the LCR Fricke solution to investigate possible systematic uncertainties. Results: G-values for Co-60 and 250 kVp x-rays, obtained using the LCR Fricke system, agreed with the NRC values within 0.5 % and 1 % respectively, indicating that the general assumption of universal G-values is appropriate in this case. The standard uncertainty in the determination of G for Ir-192 is estimated to be 0.6 %. For the comparison of absorbed dose measurements at the reference point for Ir-192 (1 cm depth in water, perpendicular to the seed long-axis), the ratio Dw(NRC)/Dw(LCR) was found to be 1.011 with a combined standard uncertainty of 1.7 %, k=1. Conclusion: The agreement in the absorbed dose to water values for the LCR and NRC systems is very encouraging. Combined with the lower uncertainty in this approach compared to the present air-kerma approach, these results reaffirm the use of Fricke solution as a potential primary standard for HDR Ir-192 brachytherapy.
Toward Developing a Relative Value Scale for Medical and Surgical Services
Hsiao, William C.; Stason, William B.
1979-01-01
A methodology has been developed to determine the relative values of surgical procedures and medical office visits on the basis of resource costs. The time taken to perform the service and the complexity of that service are the most critical variables. Interspecialty differences in the opportunity costs of training and overhead expenses are also considered. Results indicate some important differences between the relative values based on resource costs and existing standards, prevailing Medicare charges, and California Relative Value Study values. Most dramatic are discrepancies between existing reimbursement levels and resource cost values for office visits compared to surgical procedures. These vary from procedure to procedure and specialty to specialty but indicate that, on the average, office visits are undervalued (or surgical procedures overvalued) by four- to five-fold. After standardizing the variations in the complexity of different procedures, the hourly reimbursement rate in 1978 ranged from $40 for a general practitioner to $200 for surgical specialists. PMID:10309112
Al-Raees, Ghada Y; Al-Amer, Maryam A; Musaiger, Abdulrahman O; D'Souza, Reshma
2009-01-01
A cross-sectional study was carried out on Bahraini preschoolers aged 2-5 years (354 males and 344 females) to determine the prevalence of overweight and obesity using the World Health Organization and the International Obesity Task Force cut-off values. Weight and height were recorded and body mass index (BMI) was calculated to determine the proportion of overweight and obesity. Using the World Health Organization percentile cut-off values, overweight (12.3%) and obesity (8.4%) were higher in females between 2 and <4 years of age, whereas the proportions of both overweight (8.4%) and obesity (7.2%) were higher in males between 4 and <6 years of age. Relative to the International Obesity Task Force indicators, the World Health Organization cut-off values produced nearly a 2-fold increase in both overweight and obesity at most ages. It is therefore important to ensure that the same cut-off reference values are used to define overweight and obesity, particularly in preschoolers. Shifting to the new World Health Organization child growth standards may have important implications for child health programmes.
Ponce, Camille; Kaczorowski, Flora; Perpoint, Thomas; Miailhes, Patrick; Sigal, Alain; Javouhey, Etienne; Gillet, Yves; Jacquin, Laurent; Douplat, Marion; Tazarourte, Karim; Potinet, Véronique; Simon, Bruno; Lavoignat, Adeline; Bonnot, Guillaume; Sow, Fatimata; Bienvenu, Anne-Lise; Picot, Stéphane
2017-01-01
Background: Sensitive and easy-to-perform methods for the diagnosis of malaria are not yet available. Improving the limit of detection and following the requirements for certification are issues to be addressed in both endemic and non-endemic settings. The aim of this study was to test whether loop-mediated isothermal amplification of DNA (LAMP) may be an alternative to microscopy or real-time PCR for the screening of imported malaria cases in non-endemic area. Results: 310 blood samples associated with 829 suspected cases of imported malaria were tested during a one year period. Microscopy (thin and thick stained blood slides, reference standard) was used for the diagnosis. Real-time PCR was used as a standard of truth, and LAMP (Meridian Malaria Plus) was used as an index test in a prospective study conducted following the Standards for Reporting Diagnosis Accuracy Studies. In the 83 positive samples, species identification was P. falciparum (n = 66), P. ovale (n = 9), P. vivax (n = 3) P. malariae (n = 3) and 2 co-infections with P. falciparum + P.malariae. Using LAMP methods, 93 samples gave positive results, including 4 false-positives. Sensitivity, specificity, positive predictive value and negative predictive value for LAMP tests were 100%, 98.13%, 95.51%, and 100% compared to PCR. Conclusion: High negative predictive value, and limit of detection suggest that LAMP can be used for screening of imported malaria cases in non-endemic countries when expert microscopists are not immediately available. However, the rare occurrence of non-valid results and the need for species identification and quantification of positive samples preclude the use of LAMP as a single reference method. PMID:29251261
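The reported sensitivity, specificity, PPV, and NPV follow directly from a 2×2 table against the reference method (real-time PCR). The small sketch below recomputes them; the counts are chosen so that the four percentages reproduce the figures quoted in the abstract and are not taken from the paper's results table.

```python
# Diagnostic accuracy from a 2x2 table against the reference (real-time PCR).
# Counts are illustrative, chosen to reproduce the quoted percentages.
tp, fp, fn, tn = 85, 4, 0, 210

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"sensitivity {sensitivity:.2%}, specificity {specificity:.2%}, "
      f"PPV {ppv:.2%}, NPV {npv:.2%}")
# -> sensitivity 100.00%, specificity 98.13%, PPV 95.51%, NPV 100.00%
```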
Low-cost space flight for attached payloads
NASA Astrophysics Data System (ADS)
Perkins, Frederick W.
1991-07-01
An important addition to the emerging commercial space sector is Standard Space Platforms Corporation's comprehensive low-cost flight service delivery system for small and developmental payloads. Standard provides a privately funded, proprietary, value-added transportation service which dramatically reduces cost and program duration for compliant payloads. It also provides a business-to-business service which is compatible with business investment decision timing and technology development cycles.
Developing the Precision Magnetic Field for the E989 Muon g-2 Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Matthias W.
The experimental value of (g-2)_μ historically has been and contemporarily remains an important probe into the Standard Model and proposed extensions. Previous measurements of (g-2)_μ exhibit a persistent statistical tension with calculations using the Standard Model, implying that the theory may be incomplete and constraining possible extensions. The Fermilab Muon g-2 experiment, E989, endeavors to increase the precision over previous experiments by a factor of four and probe more deeply into the tension with the Standard Model. The (g-2)_μ experimental implementation measures two spin precession frequencies defined by the magnetic field, proton precession and muon precession. The value of (g-2)_μ is derived from a relationship between the two frequencies. The precision of magnetic field measurements and the overall magnetic field uniformity achieved over the muon storage volume are then two undeniably important aspects of the experiment in minimizing uncertainty. The current thesis details the methods employed to achieve magnetic field goals and results of the effort.
ERIC Educational Resources Information Center
Holt/Hale, Shirley Ann; Persse, Dan
2015-01-01
It is during the early educational years that skills are developed, habits are formed, and values are shaped. The skills for a lifetime of physical activity are developed through quality teaching, deliberate practice, assessment and reflection. Research supports the importance of elementary physical education experiences and the importance of…
Interference between light and heavy neutrinos for 0νββ decay in the left–right symmetric model
Ahmed, Fahim; Neacsu, Andrei; Horoi, Mihai
2017-03-31
Neutrinoless double-beta decay is proposed as an important low energy phenomenon that could test beyond the Standard Model physics. There are several potentially competing beyond the Standard Model mechanisms that can induce the process. It thus becomes important to disentangle the different processes. In the present study we consider the interference effect between the light left-handed and heavy right-handed Majorana neutrino exchange mechanisms. The decay rate, and consequently, the phase-space factors for the interference term are derived, based on the left–right symmetric model. The numerical values for the interference phase-space factors for several nuclides are calculated, taking into consideration the relativistic Coulomb distortion of the electron wave function and finite-size of the nucleus. As a result, the variation of the interference effect with the Q-value of the process is studied.
Makino, Tomoki; Yamasaki, Makoto; Tanaka, Koji; Tatsumi, Mitsuaki; Takiguchi, Shuji; Hatazawa, Jun; Mori, Masaki; Doki, Yuichiro
2017-10-01
There is no consensus strategy for treatment of T4 esophageal cancer, and because of this, a better evaluation of treatment response is crucial to establish personalized therapies. This study aimed to establish a useful system for evaluating treatment response in T4 esophageal cancer. This study included 130 patients with cT4 esophageal cancer without distant metastasis who underwent 18 F-fluorodeoxyglucose-positron emission tomography before and after a series of induction treatments comprising chemoradiation or chemotherapy. We evaluated the maximal standardized uptake value and treatment response. The mean ± standard deviation of standardized uptake value in the primary tumor before and after induction treatments were 13.8 ± 4.4 and 5.4 ± 4.1, respectively, and the mean standardized uptake value decrease was 58.4%. The most significant difference in survival between positron emission tomography-primary tumor responders and nonresponders was at a decrease of 60% standardized uptake value, based on every 10% stepwise cutoff analysis (2-year cause-specific survival: 60.2 vs 23.5%; hazard ratio = 2.705; P < .0001). With this cutoff value, the resectability (P = .0307), pathologic response (P = .0004), and pT stage (P < .0001) were associated with positron emission tomography-primary tumor response. Univariate analysis of 2-year cause-specific survival indicated a correlation between cause-specific survival and clinical stages according to TNM classification, esophageal perforation, positron emission tomography-primary tumor response, lymph node status evaluated by positron emission tomography before and after induction treatments, and operative resection. Multivariate analysis further identified positron emission tomography-primary tumor response (hazard ratio = 2.354; P = .0107), lymph node status evaluated by positron emission tomography after induction treatments (hazard ratio = 1.966; P = .0089), and operative resection (hazard ratio = 2.012; P = .0245) as independent prognostic predictors. Positron emission tomography evaluation of the response of primary and metastatic lesions to induction treatments is important to formulate treatment strategies for cT4 esophageal cancer. Copyright © 2017 Elsevier Inc. All rights reserved.
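The 60% cutoff above applies to the relative decrease in the maximal standardized uptake value (SUVmax). A small sketch of that arithmetic; the first value pair uses the cohort means quoted in the abstract, the second is invented.

```python
# Percent decrease in SUVmax after induction treatment, classified against
# the 60% cutoff discussed in the abstract.
def suv_response(suv_before, suv_after, cutoff=60.0):
    decrease = 100.0 * (suv_before - suv_after) / suv_before
    return decrease, decrease >= cutoff

for before, after in [(13.8, 5.4), (12.0, 7.5)]:   # example values
    d, responder = suv_response(before, after)
    print(f"SUV {before} -> {after}: {d:.1f}% decrease, "
          f"{'responder' if responder else 'nonresponder'}")
```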
Poorchangizi, Batool; Farokhzadian, Jamileh; Abbaszadeh, Abbas; Mirzaee, Moghaddameh; Borhani, Fariba
2017-03-01
Today, nurses are required to have knowledge and awareness concerning professional values as standards to provide safe and high-quality ethical care. Nurses' perspective on professional values affects decision-making and patient care. Therefore, the present study aimed to investigate the importance of professional values from clinical nurses' perspective. The present cross-sectional study was conducted in 2016 in four educational hospitals of Kerman University of Medical Sciences, Iran. Data were collected via the Persian version of the Nursing Professional Values Scale-Revised (NPVS-R) by Weis and Schank. Sampling was conducted through the use of the stratified random sampling method, and 250 clinical nurses participated in the study. Results indicated that the total score of the nurses' professional values was high (102.57 ± 11.94). From the nurses' perspective, items such as "Maintaining confidentiality of patients" and "Safeguarding patients' right to privacy" were more important, whereas "Recognizing role of professional nursing associations in shaping healthcare policy" and "Participating in nursing research and/or implementing research findings appropriate to practice" were less important. A statistically significant relationship was observed between NPVS-R mean scores and nurses' age, work experience, and participation in professional ethical training (P < 0.05). Although the total score related to the clinical nurses' perspective on professional values was high, the importance of certain values was at a lower level. Owing to emerging ethical challenges, it is indispensable to design educational programs to improve nurses' awareness and understanding of the comprehensive importance of professional values. Furthermore, it is recommended that mixed methods studies be conducted in order to design an instrument to evaluate the use of values in nursing practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blake, Thomas A.; Johnson, Timothy J.; Tonkyn, Russell G.
Infrared integrating sphere measurements of solid samples are important in providing reference data for contact, standoff and remote sensing applications. At the Pacific Northwest National Laboratory (PNNL) we have developed protocols to measure both the directional-hemispherical and diffuse (d) reflectances of powders, liquids, and disks of powders and solid materials using a commercially available, matte gold-coated integrating sphere and Fourier transform infrared spectrometer. Detailed descriptions of the sphere alignment and its use for making these reflectance measurements are given. Diffuse reflectance values were found to be dependent on the bidirectional reflection distribution function (BRDF) of the sample and the solid angle intercepted by the sphere's specular exclusion port. To determine how well the sphere and protocols produce quantitative reflectance data, measurements were made of three diffuse and two specular standards prepared by the National Institute of Standards and Technology (NIST, USA), LabSphere Infragold and Spectralon standards, hand-loaded sulfur and talc powder samples, and water. The five NIST standards behaved as expected: the three diffuse standards had a high degree of "diffuseness" (ratio of diffuse to directional-hemispherical reflectance D > 0.9), whereas the two specular standards had D ≤ 0.03. The average absolute differences between the NIST and PNNL measurements of the NIST standards for both directional-hemispherical and diffuse reflectances are on the order of 0.01 reflectance units. Other quantitative differences between the PNNL-measured and calibration (where available) or literature reflectance values for these standards and materials are given and the possible origins of discrepancies are discussed. Random uncertainties and estimates of systematic uncertainties are presented. Corrections necessary to provide better agreement between the PNNL reflectance values as measured for the NIST standards and the NIST reflectance values for these same standards are also discussed.
An ecological compensation standard based on emergy theory for the Xiao Honghe River Basin.
Guan, Xinjian; Chen, Moyu; Hu, Caihong
2015-01-01
The calculation of an ecological compensation standard is an important, but also difficult aspect of current ecological compensation research. In this paper, the factors affecting the ecological-economic system in the Xiao Honghe River Basin, China, including the flow of energy, materials, and money, were calculated using the emergy analysis method. A consideration of the relationships between the ecological-economic value of water resources and ecological compensation allowed the ecological-economic value to be calculated. On this basis, the amount of water needed for dilution was used to develop a calculation model for the ecological compensation standard of the basin. Using the Xiao Honghe River Basin as an example, the value of water resources and the ecological compensation standard were calculated using this model according to the emission levels of the main pollutant in the basin, chemical oxygen demand. The compensation standards calculated for the research areas in Xipin, Shangcai, Pingyu, and Xincai were 34.91 yuan/m3, 32.97 yuan/m3, 35.99 yuan/m3, and 34.70 yuan/m3, respectively, and such research output would help to generate and support new approaches to the long-term ecological protection of the basin and improvement of the ecological compensation system.
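One possible way to combine the quantities mentioned above (a hedged reading, not the paper's model): the water volume needed to dilute the annual COD load to the standard concentration, multiplied by the per-cubic-metre ecological-economic value of water. The COD load and the standard and background concentrations below are invented; the unit value is taken from the range reported for the basin.

```python
# Hedged illustration of a dilution-based compensation calculation.
# Pollutant load and concentrations are invented placeholders.
def dilution_volume(load_kg, c_standard_mg_l, c_background_mg_l):
    """m3 of water needed to dilute a pollutant load down to the standard."""
    usable = (c_standard_mg_l - c_background_mg_l) * 1e-3   # kg per m3
    return load_kg / usable

vol = dilution_volume(load_kg=5.0e5, c_standard_mg_l=20.0, c_background_mg_l=12.0)
unit_value = 34.91            # yuan/m3, value reported for the Xipin area
print(f"dilution water {vol:.3e} m3 -> compensation {vol * unit_value:.3e} yuan")
```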
Role of the standard deviation in the estimation of benchmark doses with continuous data.
Gaylor, David W; Slikker, William
2004-12-01
For continuous data, risk is defined here as the proportion of animals with values above a large percentile, e.g., the 99th percentile or below the 1st percentile, for the distribution of values among control animals. It is known that reducing the standard deviation of measurements through improved experimental techniques will result in less stringent (higher) doses for the lower confidence limit on the benchmark dose that is estimated to produce a specified risk of animals with abnormal levels for a biological effect. Thus, a somewhat larger (less stringent) lower confidence limit is obtained that may be used as a point of departure for low-dose risk assessment. It is shown in this article that it is important for the benchmark dose to be based primarily on the standard deviation among animals, s(a), apart from the standard deviation of measurement errors, s(m), within animals. If the benchmark dose is incorrectly based on the overall standard deviation among average values for animals, which includes measurement error variation, the benchmark dose will be overestimated and the risk will be underestimated. The bias increases as s(m) increases relative to s(a). The bias is relatively small if s(m) is less than one-third of s(a), a condition achieved in most experimental designs.
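A sketch of the bias described above, under the simplifying assumptions of a normally distributed endpoint, a linear dose-response for the mean, and a constant standard deviation: abnormal values are defined as exceeding the control 99th percentile, and the benchmark dose (BMD) is the dose whose mean shift produces a specified extra risk. Using the overall SD among animal means (which contains measurement-error variation) instead of the animal-to-animal SD s(a) inflates the estimated BMD. The slope and SD values are illustrative only.

```python
# Illustrative BMD calculation showing the inflation described above.
from statistics import NormalDist

N = NormalDist()

def bmd(extra_risk, sd, slope, background_risk=0.01):
    """Dose at which P(value > control 99th percentile) = background + extra,
    assuming mean(d) = mean0 + slope*d with constant SD."""
    cutoff_z = N.inv_cdf(1 - background_risk)            # control 99th pct
    z_needed = N.inv_cdf(1 - (background_risk + extra_risk))
    shift = (cutoff_z - z_needed) * sd                   # required mean shift
    return shift / slope

s_a, s_m, slope = 1.0, 0.8, 0.5                          # placeholder values
sd_obs = (s_a**2 + s_m**2) ** 0.5                        # SD of animal means
print(f"BMD using s_a only:    {bmd(0.10, s_a, slope):.2f}")
print(f"BMD using inflated SD: {bmd(0.10, sd_obs, slope):.2f}  (overestimated)")
```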
The surveillance of nursing standards: an organisational case study.
Cooke, Hannah
2006-11-01
Quality assurance has acquired increasing prominence in contemporary healthcare systems and there has been an 'explosion' of audit activity. Some authors have begun to investigate the impact of audit activity on organisational and professional cultures. This paper considers data from a wider study of the management of the 'problem' nurse. Nurses and managers had contrasting perceptions of the value of different methods of assessing ward standards and their views are presented here. The study involved organisational case studies in three healthcare Trusts in the north of England. The fieldwork for this study was funded by the United Kingdom Central Council for Nursing, Midwifery and Health Visiting under their research scholarship programme. Multiple methods were employed including observation, interviewing and documentary analysis. A total of 144 informal interviews were carried out with ward nurses and their managers. The study demonstrated different viewpoints regarding the surveillance of nursing standards at top management, middle management and ward levels. The paper considers the discrepancies between these different viewpoints. None of the participants placed a high value on audit as a method of assessing ward standards. Complaints data and informal methods were more highly valued by managers. Ward nurses stressed the importance of presence and vigilance in assuring high standards of nursing care.
Surveying drinking water quality (Balikhlou River, Ardabil Province, Iran)
NASA Astrophysics Data System (ADS)
Aalipour erdi, Mehdi; Gasempour niari, Hassan; Mousavi Meshkini, Seyyed Reza; Foroug, Somayeh
2018-03-01
Considering the importance of the Balikhlou River as one of the most important water sources for the cities of Ardabil, Nir and Sarein, maintaining the water quality of this river is one of the most important goals at the provincial and national levels. The river covers a wide area and provides agricultural, industrial and drinking water for the residents. Thus, surveying the quality of this river is important for planning and managing the region. This study examined the quality of the river through eight physicochemical parameters (SO4, NO3, BOD5, TDS, turbidity, pH, EC, COD) in the high- and low-water seasons of 2013, against international and national standards. For this purpose, samples were taken at five stations along the river and analyzed with t tests in SPSS. Results showed that the differences of TDS and EC from the WHO standards and of TDS from the Iranian standard in the low-water season, and of pH and EC from the WHO standards in the high-water season, were not significant; however, pH, SO4, turbidity and NO3 differed significantly (at the 5% to 1% level) from both standards, as did EC from the WHO standard in the low-water season, and pH, EC, SO4, turbidity and NO3 in the high-water season, indicating that these parameters remain at ideal, low levels for different uses.
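A minimal sketch of the kind of comparison described above: a one-sample t-test of station measurements against a guideline value. The measured concentrations are invented; SciPy is used instead of SPSS purely for illustration.

```python
# One-sample t-test of river measurements against a drinking-water standard.
# Measured values are invented placeholders, not the study's data.
import numpy as np
from scipy import stats

no3 = np.array([8.2, 9.1, 7.5, 10.3, 8.8])   # mg/L at five stations (invented)
who_limit = 50.0                              # WHO nitrate guideline, mg/L

t, p = stats.ttest_1samp(no3, who_limit)
print(f"mean {no3.mean():.1f} mg/L vs limit {who_limit}: t={t:.2f}, p={p:.4f}")
print("significantly different from the limit" if p < 0.05
      else "no significant difference from the limit")
```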
Quality Management Systems in the Clinical Laboratories in Latin America
2015-01-01
The implementation of management systems in accordance with standards like ISO 9001:2008 (1,2) in clinical laboratories has conferred an added value of reliability and therefore a very significant contribution to patient safety. ISO 9001:2008 (1), a certification standard, and ISO 15189:2012 (2), an accreditation standard, have both generated institutional memory where they have been implemented, transforming the culture toward correct execution, control and follow-up, the evidence needed, and the importance of record keeping. PMID:27683495
STANDARD BIG BANG NUCLEOSYNTHESIS UP TO CNO WITH AN IMPROVED EXTENDED NUCLEAR NETWORK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coc, Alain; Goriely, Stephane; Xu, Yi
Primordial or big bang nucleosynthesis (BBN) is one of the three strong pieces of evidence for the big bang model, together with the expansion of the universe and the cosmic microwave background radiation. In this study, we improve the standard BBN calculations taking into account new nuclear physics analyses and enlarge the nuclear network up to sodium. This is, in particular, important to evaluate the primitive value of the CNO mass fraction, which could affect Population III stellar evolution. For the first time we list the complete network of more than 400 reactions with references to the origin of the rates, including ≈270 reaction rates calculated using the TALYS code. Together with the cosmological light elements, we calculate the primordial beryllium, boron, carbon, nitrogen, and oxygen nuclei. We performed a sensitivity study to identify the important reactions for CNO, 9Be, and boron nucleosynthesis. We re-evaluated those important reaction rates using experimental data and/or theoretical evaluations. The results are compared with previous calculations: the primordial beryllium abundance increases by a factor of four compared to its previous evaluation, but we note a stability for B/H and for the CNO/H abundance ratio, which remains close to its previous value of 0.7 × 10^-15. On the other hand, the extension of the nuclear network has not changed the 7Li value, so its abundance is still 3-4 times greater than its observed spectroscopic value.
Marko-Varga, György; Végvári, Ákos; Welinder, Charlotte; Lindberg, Henrik; Rezeli, Melinda; Edula, Goutham; Svensson, Katrin J; Belting, Mattias; Laurell, Thomas; Fehniger, Thomas E
2012-11-02
Biobanks are a major resource to access and measure biological constituents that can be used to monitor the status of health and disease, both in unique individual samples and within populations. Most "omic" activities rely on access to these collections of stored samples to provide the basis for establishing the ranges and frequencies of expression. Furthermore, information about the relative abundance and form of protein constituents found in stored samples provides an important historical index for comparative studies of inherited, epidemic, and developing disease. Standardization of sample quality, form, and analysis is an important unmet need and requirement for gaining the full benefit from collected samples. Coupled to this standard is the provision of annotation describing clinical status and metadata of measurements of clinical phenotype that characterize the sample. Today we have not yet achieved consensus on how to collect, manage, and build biobank archives in order to reach goals where these efforts are translated into value for the patient. Several initiatives (OBBR, ISBER, BBMRI) that disseminate best practice examples for biobanking are expected to play an important role in ensuring that the integrity of biosamples stored for periods of one or several decades is preserved. These developments will be of great value and importance to programs such as the Chromosome-Centric Human Proteome Project (C-HPP) that will associate protein expression in healthy and disease states with genetic foci along each of the human chromosomes.
Biochemical thermodynamics: applications of Mathematica.
Alberty, Robert A
2006-01-01
The most efficient way to store thermodynamic data on enzyme-catalyzed reactions is to use matrices of species properties. Since equilibrium in enzyme-catalyzed reactions is reached at specified pH values, the thermodynamics of the reactions is discussed in terms of transformed thermodynamic properties. These transformed thermodynamic properties are complicated functions of temperature, pH, and ionic strength that can be calculated from the matrices of species values. The most important of these transformed thermodynamic properties is the standard transformed Gibbs energy of formation of a reactant (sum of species). It is the most important because when this function of temperature, pH, and ionic strength is known, all the other standard transformed properties can be calculated by taking partial derivatives. The species database in this package contains data matrices for 199 reactants. For 94 of these reactants, standard enthalpies of formation of species are known, and so standard transformed Gibbs energies, standard transformed enthalpies, standard transformed entropies, and average numbers of hydrogen atoms can be calculated as functions of temperature, pH, and ionic strength. For reactions between these 94 reactants, the changes in these properties can be calculated over a range of temperatures, pHs, and ionic strengths, and so can apparent equilibrium constants. For the other 105 reactants, only standard transformed Gibbs energies of formation and average numbers of hydrogen atoms at 298.15 K can be calculated. The loading of this package provides functions of pH and ionic strength at 298.15 K for standard transformed Gibbs energies of formation and average numbers of hydrogen atoms for 199 reactants. It also provides functions of temperature, pH, and ionic strength for the standard transformed Gibbs energies of formation, standard transformed enthalpies of formation, standard transformed entropies of formation, and average numbers of hydrogen atoms for 94 reactants. Thus loading this package makes available 774 mathematical functions for these properties. These functions can be added and subtracted to obtain changes in these properties in biochemical reactions and apparent equilibrium constants.
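To make the pseudoisomer calculation concrete, the following is a minimal Python sketch (not Alberty's Mathematica package) of the pH transform and the log-sum-exp combination of species described above; the species values in the example are invented, and ionic-strength corrections are omitted for brevity.

```python
import numpy as np

R = 8.31446e-3  # kJ mol^-1 K^-1

def transformed_gibbs(species_dfG, species_nH, pH, T=298.15):
    """Standard transformed Gibbs energy of formation (kJ/mol) of a reactant
    (pseudoisomer group) at a specified pH, following Alberty's treatment.
    Ionic-strength corrections are omitted for brevity.

    species_dfG : standard Gibbs energies of formation of the species (kJ/mol)
    species_nH  : numbers of hydrogen atoms in each species
    """
    dfG = np.asarray(species_dfG, dtype=float)
    nH = np.asarray(species_nH, dtype=float)
    # pH transform: remove the contribution of the hydrogen component
    dfG_prime = dfG + nH * R * T * np.log(10.0) * pH
    # Combine the pseudoisomers with a Boltzmann-weighted (log-sum-exp) average
    return -R * T * np.log(np.sum(np.exp(-dfG_prime / (R * T))))

# Illustrative (made-up) two-species reactant, e.g. an acid/base pair
print(transformed_gibbs([-1096.1, -1137.3], [12, 13], pH=7.0))
```

Taking the partial derivatives of this function with respect to temperature and pH would then yield the other standard transformed properties mentioned above.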
When the value of gold is zero.
Chase, J Geoffrey; Moeller, Knut; Shaw, Geoffrey M; Schranz, Christoph; Chiew, Yeong Shiong; Desaive, Thomas
2014-06-27
This manuscript presents the concerns around the increasingly common problem of not having readily available or useful "gold standard" measurements. This issue is particularly important in critical care where many measurements used in decision making are surrogates of what we would truly wish to use. However, the question is broad, important and applicable in many other areas. In particular, a gold standard measurement often exists, but is not clinically (or ethically in some cases) feasible. The question is how does one even begin to develop new measurements or surrogates if one has no gold standard to compare with? We raise this issue concisely with a specific example from mechanical ventilation, a core bread and butter therapy in critical care that is also a leading cause of length of stay and cost of care. Our proposed solution centers around a hierarchical validation approach that we believe would ameliorate ethics issues around radiation exposure that make current gold standard measures clinically infeasible, and thus provide a pathway to create a (new) gold standard.
Robinson, Angela; Spencer, Anne; Moffatt, Peter
2015-04-01
There has been recent interest in using the discrete choice experiment (DCE) method to derive health state utilities for use in quality-adjusted life year (QALY) calculations, but challenges remain. We set out to develop a risk-based DCE approach to derive utility values for health states that allowed 1) utility values to be anchored directly to normal health and death and 2) worse than dead health states to be assessed in the same manner as better than dead states. Furthermore, we set out to estimate alternative models of risky choice within a DCE model. A survey was designed that incorporated a risk-based DCE and a "modified" standard gamble (SG). Health state utility values were elicited for 3 EQ-5D health states assuming "standard" expected utility (EU) preferences. The DCE model was then generalized to allow for rank-dependent expected utility (RDU) preferences, thereby allowing for probability weighting. A convenience sample of 60 students was recruited and data collected in small groups. Under the assumption of "standard" EU preferences, the utility values derived within the DCE corresponded fairly closely to the mean results from the modified SG. Under the assumption of RDU preferences, the utility values estimated are somewhat lower than under the assumption of standard EU, suggesting that the latter may be biased upward. Applying the correct model of risky choice is important whether a modified SG or a risk-based DCE is deployed. It is, however, possible to estimate a probability weighting function within a DCE and estimate "unbiased" utility values directly, which is not possible within a modified SG. We conclude by setting out the relative strengths and weaknesses of the 2 approaches in this context. © The Author(s) 2014.
Mining Hierarchies and Similarity Clusters from Value Set Repositories.
Peterson, Kevin J; Jiang, Guoqian; Brue, Scott M; Shen, Feichen; Liu, Hongfang
2017-01-01
A value set is a collection of permissible values used to describe a specific conceptual domain for a given purpose. By helping to establish a shared semantic understanding across use cases, these artifacts are important enablers of interoperability and data standardization. As the size of repositories cataloging these value sets expand, knowledge management challenges become more pronounced. Specifically, discovering value sets applicable to a given use case may be challenging in a large repository. In this study, we describe methods to extract implicit relationships between value sets, and utilize these relationships to overlay organizational structure onto value set repositories. We successfully extract two different structurings, hierarchy and clustering, and show how tooling can leverage these structures to enable more effective value set discovery.
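A toy sketch of how such implicit relationships might be extracted (not the authors' method): Jaccard similarity between code sets as input to clustering, and proper-subset containment as a subsumption hierarchy. The example value sets are invented.

```python
from itertools import combinations

# Invented value sets: name -> set of permissible codes
value_sets = {
    "diabetes_broad": {"E10", "E11", "E13", "O24"},
    "diabetes_type2": {"E11"},
    "hypertension":   {"I10", "I11", "I12"},
}

def jaccard(a, b):
    # Similarity between two code sets, 0 (disjoint) to 1 (identical)
    return len(a & b) / len(a | b)

# Pairwise similarities, usable as input to any standard clustering routine
sims = {(x, y): jaccard(value_sets[x], value_sets[y])
        for x, y in combinations(value_sets, 2)}

# Hierarchy edges: a child whose codes form a proper subset of a parent's codes
edges = [(child, parent) for child in value_sets for parent in value_sets
         if child != parent and value_sets[child] < value_sets[parent]]

print(sims)
print(edges)
```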
A Weak Value Based QKD Protocol Robust Against Detector Attacks
NASA Astrophysics Data System (ADS)
Troupe, James
2015-03-01
We propose a variation of the BB84 quantum key distribution protocol that utilizes the properties of weak values to insure the validity of the quantum bit error rate estimates used to detect an eavesdropper. The protocol is shown theoretically to be secure against recently demonstrated attacks utilizing detector blinding and control and should also be robust against all detector based hacking. Importantly, the new protocol promises to achieve this additional security without negatively impacting the secure key generation rate as compared to that originally promised by the standard BB84 scheme. Implementation of the weak measurements needed by the protocol should be very feasible using standard quantum optical techniques.
New directions in analyses of parenting contributions to children's acquisition of values.
Grusec, J E; Goodnow, J J; Kuczynski, L
2000-01-01
Traditional theories of how children acquire values or standards of behavior have emphasized the importance of specific parenting techniques or styles and have acknowledged the importance of a responsive parent-child relationship, but they have failed to differentiate among forms of responsiveness, have stressed internalization of values as the desired outcome, and have limited their scope to a small set of parenting strategies or methods. This paper outlines new directions for research. It acknowledges the central importance of parents and argues for research that (1) demonstrates that parental understanding of a particular child's characteristics and situation rather than use of specific strategies or styles is the mark of effective parenting; (2) traces the differential impact of varieties of parent responsiveness; (3) assesses the conditions surrounding the fact that parents have goals other than internalization when socializing their children, and evaluates the impact of that fact; and (4) considers a wider range of parenting strategies.
Dunkle, Jennifer; Flynn, Perry
2012-05-01
The Common Core State Standards initiative within public school education is designed to provide uniform guidelines for academic standards, including more explicit language targets. Speech-language pathologists (SLPs) are highly qualified language experts who may find new leadership roles within their clinical practice using the Common Core Standards. However, how SLPs actually use the standards in clinical practice still needs to be examined. This article seeks to discover the social context of organizations and organizational change in relation to clinical practice. Specifically, this article presents the diffusion of innovations theory to explain how initiatives move from ideas to institutionalization, and the importance of the social context in which these initiatives are introduced. Next, the values of both SLPs and organizations will be discussed. Finally, this article provides information on how to effect organizational change through the value of an affirmative, socially based theoretical perspective and methodology, appreciative inquiry.
Nominal Values for Selected Solar and Planetary Quantities: IAU 2015 Resolution B3
NASA Astrophysics Data System (ADS)
Prša, Andrej; Harmanec, Petr; Torres, Guillermo; Mamajek, Eric; Asplund, Martin; Capitaine, Nicole; Christensen-Dalsgaard, Jørgen; Depagne, Éric; Haberreiter, Margit; Hekker, Saskia; Hilton, James; Kopp, Greg; Kostov, Veselin; Kurtz, Donald W.; Laskar, Jacques; Mason, Brian D.; Milone, Eugene F.; Montgomery, Michele; Richards, Mercedes; Schmutz, Werner; Schou, Jesper; Stewart, Susan G.
2016-08-01
In this brief communication we provide the rationale for and the outcome of the International Astronomical Union (IAU) resolution vote at the XXIXth General Assembly in Honolulu, Hawaii, in 2015, on recommended nominal conversion constants for selected solar and planetary properties. The problem addressed by the resolution is a lack of established conversion constants between solar and planetary values and SI units: a missing standard has caused a proliferation of solar values (e.g., solar radius, solar irradiance, solar luminosity, solar effective temperature, and solar mass parameter) in the literature, with cited solar values typically based on best estimates at the time of paper writing. As precision of observations increases, a set of consistent values becomes increasingly important. To address this, an IAU Working Group on Nominal Units for Stellar and Planetary Astronomy formed in 2011, uniting experts from the solar, stellar, planetary, exoplanetary, and fundamental astronomy, as well as from general standards fields to converge on optimal values for nominal conversion constants. The effort resulted in the IAU 2015 Resolution B3, passed at the IAU General Assembly by a large majority. The resolution recommends the use of nominal solar and planetary values, which are by definition exact and are expressed in SI units. These nominal values should be understood as conversion factors only, not as the true solar/planetary properties or current best estimates. Authors and journal editors are urged to join in using the standard values set forth by this resolution in future work and publications to help minimize further confusion.
Present tendencies in equipment noise normalization. [permissible sound level standards
NASA Technical Reports Server (NTRS)
Sternberg, A.
1974-01-01
The importance of equipment noise normalization within the complex of measures aimed at reducing noise in work spaces is outlined, together with the necessity of correlating these norms with the criteria that establish the noxious levels of noise for humans.
Budjan, Johannes; Sauter, Elke A; Zoellner, Frank G; Lemke, Andreas; Wambsganss, Jens; Schoenberg, Stefan O; Attenberger, Ulrike I
2018-01-01
Background Functional techniques like diffusion-weighted imaging (DWI) are gaining more and more importance in liver magnetic resonance imaging (MRI). Diffusion kurtosis imaging (DKI) is an advanced technique that might help to overcome current limitations of DWI. Purpose To evaluate DKI for the differentiation of hepatic lesions in comparison to conventional DWI at 3 Tesla. Material and Methods Fifty-six consecutive patients were examined using a routine abdominal MR protocol at 3 Tesla which included DWI with b-values of 50, 400, 800, and 1000 s/mm². Apparent diffusion coefficient maps were calculated applying a standard mono-exponential fit, while a non-Gaussian kurtosis fit was used to obtain DKI maps. ADC as well as kurtosis-corrected diffusion (D) values were quantified by region of interest analysis and compared between lesions. Results Sixty-eight hepatic lesions (hepatocellular carcinoma [HCC] [n = 25]; hepatic adenoma [n = 4]; cysts [n = 18]; hepatic hemangioma [HH] [n = 18]; and focal nodular hyperplasia [n = 3]) were identified. Differentiation of malignant and benign lesions was possible based on both DWI ADC as well as DKI D-values (P values were in the range of 0.04 to <0.0001). Conclusion In vivo abdominal DKI calculated using standard b-values is feasible and enables quantitative differentiation between malignant and benign liver lesions. Assessment of conventional ADC values leads to similar results when using b-values below 1000 s/mm² for DKI calculation.
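For illustration, the two signal models mentioned above can be fitted as follows; this is a generic sketch with hypothetical ROI signal intensities, not the study's analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

b = np.array([50., 400., 800., 1000.])   # b-values in s/mm^2, as in the protocol above
S = np.array([980., 700., 510., 430.])   # hypothetical ROI signal intensities

def mono_exp(b, S0, adc):
    # Standard mono-exponential model: S = S0 * exp(-b * ADC)
    return S0 * np.exp(-b * adc)

def kurtosis(b, S0, D, K):
    # Non-Gaussian kurtosis model: S = S0 * exp(-b*D + (b*D)^2 * K / 6)
    return S0 * np.exp(-b * D + (b ** 2) * (D ** 2) * K / 6.0)

(S0_m, adc), _ = curve_fit(mono_exp, b, S, p0=[S[0], 1e-3])
(S0_k, D, K), _ = curve_fit(kurtosis, b, S, p0=[S[0], 1e-3, 1.0])
print(f"ADC = {adc:.2e} mm^2/s, D = {D:.2e} mm^2/s, K = {K:.2f}")
```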
A New Approach to Defining Human Touch Temperature Standards
NASA Technical Reports Server (NTRS)
Ungar, Eugene; Stroud, Kenneth
2010-01-01
Defining touch temperature limits for skin contact with both hot and cold objects is important to prevent pain and skin damage, which may affect task performance or become a safety concern. Pain and skin damage depend on the skin temperature during contact, which depends on the contact thermal conductance, the object's initial temperature, and its material properties. However, previous spacecraft standards have incorrectly defined touch temperature limits in terms of a single object temperature value for all materials, or have provided limited material-specific values which do not cover the gamut of likely designs. A new approach has been developed for updated NASA standards, which defines touch temperature limits in terms of skin temperature at pain onset for bare skin contact with hot and cold objects. The authors have developed an analytical verification method for safe hot and cold object temperatures for contact times from 1 second to infinity.
A New Approach to Defining Human Touch Temperature Standards
NASA Technical Reports Server (NTRS)
Ungar, Eugene; Stroud, Kenneth
2009-01-01
Defining touch temperature limits for skin contact with both hot and cold objects is important to prevent pain and skin damage, which may affect task performance or become a safety concern. Pain and skin damage depend on the resulting skin temperature during contact, which depends on the object's initial temperature, its material properties and its ability to transfer heat. However, previous spacecraft standards have incorrectly defined touch temperature limits in terms of a single object temperature value for all materials, or have provided limited material-specific values which do not cover the gamut of most designs. A new approach is being used in new NASA standards, which defines touch temperature limits in terms of skin temperature at pain onset for bare skin contact with hot and cold objects. The authors have developed an analytical verification method for safe hot and cold object temperatures for contact times from 1 second to infinity.
Method of calculation overall equipment effectiveness in fertilizer factory
NASA Astrophysics Data System (ADS)
Siregar, I.; Muchtar, M. A.; Rahmat, R. F.; Andayani, U.; Nasution, T. H.; Sari, R. M.
2018-02-01
This research was conducted at a fertilizer company in Sumatra that produces fertilizer in large quantities to meet consumer demand. The company faces recurring issues related to the performance and effectiveness of its machinery and equipment: the machines run every day without a break, and as a result not all products meet the quality standards set by the company. Therefore, to measure and improve the performance of the machines in the Urea-1 plant unit as a whole, the Overall Equipment Effectiveness (OEE) method was used; OEE is a key element of Total Productive Maintenance (TPM) for measuring the effectiveness of machines so that action can be taken to maintain that level. In July, August and September the OEE values were above the 85% standard, whereas in October, November and December they did not reach the standard. The low OEE values were caused by reduced machine availability, because production shutdowns due to machine breakdowns lasted long enough to cut into the available production time.
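The OEE calculation itself is the product of three loss factors. The sketch below shows the standard formulation with purely illustrative numbers; the paper does not publish its raw availability, performance and quality data.

```python
def oee(run_time, planned_time, total_count, ideal_rate, good_count):
    """Overall Equipment Effectiveness as the product of three ratios.
    All inputs are hypothetical illustration values.
    """
    availability = run_time / planned_time                 # downtime losses
    performance = total_count / (run_time * ideal_rate)    # speed losses
    quality = good_count / total_count                     # defect losses
    return availability * performance * quality

# e.g. 620 of 720 planned hours run, 29,000 t produced at an ideal 50 t/h, 27,500 t in spec
print(f"OEE = {oee(620, 720, 29_000, 50, 27_500):.1%}")    # compare against the 85% target
```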
A test of the reward-value hypothesis.
Smith, Alexandra E; Dalecki, Stefan J; Crystal, Jonathon D
2017-03-01
Rats retain source memory (memory for the origin of information) over a retention interval of at least 1 week, whereas their spatial working memory (radial maze locations) decays within approximately 1 day. We have argued that different forgetting functions dissociate memory systems. However, the two tasks, in our previous work, used different reward values. The source memory task used multiple pellets of a preferred food flavor (chocolate), whereas the spatial working memory task provided access to a single pellet of standard chow-flavored food at each location. Thus, according to the reward-value hypothesis, enhanced performance in the source memory task stems from enhanced encoding/memory of a preferred reward. We tested the reward-value hypothesis by using a standard 8-arm radial maze task to compare spatial working memory accuracy of rats rewarded with either multiple chocolate or chow pellets at each location using a between-subjects design. The reward-value hypothesis predicts superior accuracy for high-valued rewards. We documented equivalent spatial memory accuracy for high- and low-value rewards. Importantly, a 24-h retention interval produced equivalent spatial working memory accuracy for both flavors. These data are inconsistent with the reward-value hypothesis and suggest that reward value does not explain our earlier findings that source memory survives unusually long retention intervals.
Managers’ Perceptions of the Value and Impact of HAZWOPER Worker Health and Safety Training
Riley, Kevin; Slatin, Craig; Rice, Carol; Rosen, Mitchel; Weidner, B. Louise; Fleishman, Jane; Alerding, Linda; Delp, Linda
2018-01-01
Background Worker training is a core component of the OSHA Hazardous Waste Operations and Emergency Response (HAZWOPER) standard, but few studies have considered what motivates managers to provide HAZWOPER training to employees or what they value in that training. Methods In 2012, four university-based programs conducted an exploratory survey of managers who sent employees to HAZWOPER courses. Results from 109 respondents were analyzed. Results Forty-two percent of respondents cited regulations as the most important reason to provide HAZWOPER training; many indicated they would provide less training if there were no standard in place. Three-quarters (74%) reported training had improved workplace conditions. Fewer than half said they were likely to involve trained employees in aspects of the organization’s H&S program. Discussion Compliance with regulatory requirements is an important factor shaping managers’ training delivery decisions. Managers recognize positive impacts of training. These impacts could be enhanced by further leveraging employee H&S knowledge and skills. PMID:26010141
Managers' perceptions of the value and impact of HAZWOPER worker health and safety training.
Riley, Kevin; Slatin, Craig; Rice, Carol; Rosen, Mitchel; Weidner, B Louise; Fleishman, Jane; Alerding, Linda; Delp, Linda
2015-07-01
Worker training is a core component of the OSHA Hazardous Waste Operations and Emergency Response (HAZWOPER) standard, but few studies have considered what motivates managers to provide HAZWOPER training to employees or what they value in that training. In 2012, four university-based programs conducted an exploratory survey of managers who sent employees to HAZWOPER courses. Results from 109 respondents were analyzed. Forty-two percent of respondents cited regulations as the most important reason to provide HAZWOPER training; many indicated they would provide less training if there were no standard in place. Three-quarters (74%) reported training had improved workplace conditions. Fewer than half said they were likely to involve trained employees in aspects of the organization's H&S program. Compliance with regulatory requirements is an important factor shaping managers' training delivery decisions. Managers recognize positive impacts of training. These impacts could be enhanced by further leveraging employee H&S knowledge and skills. © 2015 Wiley Periodicals, Inc.
2008-03-01
maturity models and ISO standards, specifically CMMI, CMMI-ACQ and ISO 12207. Also, the improvement group supplemented their selection of these ... compliant with the technologies and standards that are important to the business. Lockheed Martin IS&GS has integrated CMMI, EIA 632, ISO 12207, and Six ... geographically dispersed organization. [Siviy 07-1] Northrop Grumman Mission Systems has integrated CMMI, ISO 9001, AS9100, and Six Sigma, as well as a
Machine Learning to Improve the Effectiveness of ANRS in Predicting HIV Drug Resistance.
Singh, Yashik
2017-10-01
Human immunodeficiency virus infection and acquired immune deficiency syndrome (HIV/AIDS) is one of the major burdens of disease in developing countries, and the standard-of-care treatment includes prescribing antiretroviral drugs. However, antiretroviral drug resistance is inevitable due to the selective pressure associated with the high mutation rate of HIV. Determining antiretroviral resistance can be done by phenotypic laboratory tests or by computer-based interpretation algorithms. Computer-based algorithms have been shown to have many advantages over laboratory tests. The ANRS (Agence Nationale de Recherches sur le SIDA) algorithm is regarded as a gold standard for interpreting HIV drug resistance from genomic mutations. The aim of this study was to improve the prediction of the ANRS gold standard in predicting HIV drug resistance. Genome sequences and HIV drug resistance measures were obtained from the Stanford HIV database (http://hivdb.stanford.edu/). Feature selection was used to determine the most important mutations associated with resistance prediction. These mutations were added to the ANRS rules, and the difference in prediction ability was measured. This study uncovered important mutations that were not associated with the original ANRS rules. On average, the ANRS algorithm was improved by 79% ± 6.6%. The positive predictive value improved by 28%, and the negative predictive value improved by 10%. The study shows that there is a significant improvement in the prediction ability of the ANRS gold standard.
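A minimal sketch of the general idea, run on synthetic data rather than the Stanford HIVdb export, is shown below: mutations are ranked by mutual information with the resistance label, and the top-ranked candidates are allowed to augment an existing rule-based call. The column names, number of selected mutations and augmentation rule are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import mutual_info_classif

# Synthetic stand-in for a resistance dataset: binary mutation indicators plus a label
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.integers(0, 2, size=(500, 30)),
                 columns=[f"mut_{i}" for i in range(30)])
y = (X["mut_3"] + X["mut_17"] + rng.random(500) > 1.5).astype(int)

# Rank mutations by association with the resistance label
scores = pd.Series(mutual_info_classif(X, y, discrete_features=True), index=X.columns)
extra = scores.sort_values(ascending=False).head(5).index   # candidates to add to the rules

def augmented_call(row, rule_based_call):
    # Flag resistance if the existing rules fire OR any newly selected mutation is present
    return bool(rule_based_call) or bool(row[extra].any())

print(list(extra))
```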
Hughes Halbert, Chanita; Barg, Frances K; Weathers, Benita; Delmoor, Ernestine; Coyne, James; Wileyto, E Paul; Arocho, Justin; Mahler, Brandon; Malkowicz, S Bruce
2007-07-01
Although cultural values are increasingly being recognized as important determinants of psychological and behavioral outcomes following cancer diagnosis and treatment, empirical data are not available on cultural values among men. This study evaluated differences in cultural values related to religiosity, temporal orientation, and collectivism among African American and European American men. Participants were 119 African American and European American men who were newly diagnosed with early-stage and locally advanced prostate cancer. Cultural values were evaluated by self-report using standardized instruments during a structured telephone interview. After controlling for sociodemographic characteristics, African American men reported significantly greater levels of religiosity (Beta = 24.44, P < .001) compared with European American men. African American men (Beta = 6.30, P < .01) also reported significantly greater levels of future temporal orientation. In addition, men with more aggressive disease (eg, higher Gleason scores) (Beta = 5.11, P < .01) and those who were pending treatment (Beta = -6.42, P < .01) reported significantly greater levels of future temporal orientation. These findings demonstrate that while ethnicity is associated with some cultural values, clinical experiences with prostate cancer may also be important. This underscores the importance of evaluating the effects of both ethnicity and clinical factors in research on the influence of cultural values on cancer prevention and control.
Li, Zijian
2018-01-01
Regulations for pesticides in soil are important for controlling human health risk; humans can be exposed to pesticides by ingesting soil, inhaling soil dust, and through dermal contact. Previous studies focused on analyses of numerical standard values for pesticides and evaluated the same pesticide using different standards among different jurisdictions. To understand the health consequences associated with pesticide soil standard values, lifetime theoretical maximum contribution and risk characterization factors were used in this study to quantify the severity of damage using disability-adjusted life years (DALYs) under the maximum "legal" exposure to persistent organic pollutant (POP) pesticides that are commonly regulated by the Stockholm Convention. Results show that computed soil characterization factors for some pesticides present lognormal distributions, and some of them have DALY values higher than 1000.0 per million population (e.g., the DALY for dichlorodiphenyltrichloroethane [DDT] is 14,065 in the Netherlands, which exceeds the tolerable risk of uncertainty upper bound of 1380.0 DALYs). Health risk characterization factors computed from national jurisdictions illustrate that values can vary over eight orders of magnitude. Further, the computed characterization factors can vary over four orders of magnitude within the same national jurisdiction. These data indicate that there is little agreement regarding pesticide soil regulatory guidance values (RGVs) among worldwide national jurisdictions or even RGV standard values within the same jurisdiction. Among these POP pesticides, lindane has the lowest median (0.16 DALYs) and geometric mean (0.28 DALYs) risk characterization factors, indicating that worldwide national jurisdictions provide relatively conservative soil RGVs for lindane. In addition, we found that some European nations and members of the former Union of Soviet Socialist Republics share the same pesticide RGVs and data clusters for the computed characterization factors. Copyright © 2017 Elsevier Ltd. All rights reserved.
Pharmacoeconomics and macular degeneration.
Brown, Gary C; Brown, Melissa M; Brown, Heidi; Godshalk, Ashlee N
2007-05-01
To describe pharmacoeconomics and its relationship to drug interventions. Pharmacoeconomics is the branch of economics which applies cost-minimization, cost-benefit, cost-effectiveness and cost-utility analyses to compare the economics of different pharmaceutical products or to compare drug therapy to other treatments. Among the four instruments, cost-utility analysis is the most sophisticated, relevant and clinically applicable as it measures the value conferred by drugs for the monies expended. Value-based medicine incorporates cost-utility principles but with strict standardization of all input and output parameters to allow the comparability of analyses, unlike the current situation in the healthcare literature. Pharmacoeconomics is assuming an increasingly important role with regard to whether drugs are listed on the drug formulary of a country or province. It has been estimated that the application of standardized, value-based medicine drug analyses can save over 35% from a public healthcare insurer drug formulary while maintaining or improving patient care.
Li, Longhai; Feng, Cindy X; Qiu, Shi
2017-06-30
An important statistical task in disease mapping problems is to identify divergent regions with unusually high or low risk of disease. Leave-one-out cross-validatory (LOOCV) model assessment is the gold standard for estimating predictive p-values that can flag such divergent regions. However, actual LOOCV is time-consuming because one needs to rerun a Markov chain Monte Carlo analysis for each posterior distribution in which an observation is held out as a test case. This paper introduces a new method, called integrated importance sampling (iIS), for estimating LOOCV predictive p-values with only Markov chain samples drawn from the posterior based on the full data set. The key step in iIS is that we integrate away the latent variables associated with the test observation with respect to their conditional distribution, without reference to the actual observation. By following the general theory of importance sampling, the formula used by iIS can be proved to be equivalent to the LOOCV predictive p-value. We compare iIS and three other existing methods in the literature on two disease mapping datasets. Our empirical results show that the predictive p-values estimated with iIS are almost identical to the predictive p-values estimated with actual LOOCV and outperform those given by the three existing methods, namely, posterior predictive checking, ordinary importance sampling, and the ghosting method by Marshall and Spiegelhalter (2003). Copyright © 2017 John Wiley & Sons, Ltd.
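For orientation, the sketch below implements the ordinary importance sampling baseline that the paper compares against (not iIS itself, which additionally integrates out the latent variables of the held-out region), using a toy Poisson disease-count example.

```python
import numpy as np
from scipy.stats import poisson

def is_loo_pvalue(lik_i, ppred_exceed_i):
    """Ordinary importance-sampling estimate of the leave-one-out predictive
    p-value P(y_rep >= y_i | y_{-i}) for one region, from S full-data posterior draws.

    lik_i          : array (S,) of p(y_i | theta_s) under each posterior draw
    ppred_exceed_i : array (S,) of P(y_rep >= y_i | theta_s)
    """
    w = 1.0 / lik_i                       # IS weights targeting the leave-one-out posterior
    return np.sum(w * ppred_exceed_i) / np.sum(w)

# Toy example: theta_s are posterior draws of region i's Poisson rate, y_i its observed count
theta_s = np.random.default_rng(1).gamma(5.0, 1.0, size=2000)
y_i = 9
p = is_loo_pvalue(poisson.pmf(y_i, theta_s), 1 - poisson.cdf(y_i - 1, theta_s))
print(p)
```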
ERIC Educational Resources Information Center
Bassi, Laurie J.; McMurrer, Daniel P.
2006-01-01
Human capital--the productive capacity that is embedded in people--is one of the most important contributors to the growth in nations' output and standard of living. Globalisation and technological change have increased the importance of human capital in recent years, to the point that there are now only two options to sustain high profits and…
Melamed, Alexander; Vittinghoff, Eric; Sriram, Usha; Schwartz, Ann V.; Kanaya, Alka M.
2010-01-01
The relationship between bone mineral density (BMD) and fracture risk is not well-established for non-white populations. There is no established BMD reference standard for South Asians. Dual energy x-ray absorptiometry (DXA) was used to measure BMD at total hip and lumbar spine in 150 US-based South Asian Indians. For each subject T-scores were calculated using BMD reference values based on US white, North Indian and South Indian populations, and the resulting WHO BMD category assignments were compared. Reference standards derived from Indian populations classified a larger proportion of US-based Indians as normal than did US white-based standards. The percentage of individuals reclassified when changing between reference standards varied by skeletal site and reference population origin, ranging from 13% (95% CI, 7–18%), when switching from US-white- to North Indian-based standards for total hip, to 40% (95% CI, 32–48%), when switching from US white to South Indian reference values for lumbar spine. These findings illustrate that choice of reference standard has a significant effect on the diagnosis of osteoporosis in South Asians, and underscore the importance of future research to quantify the relationship between BMD and fracture risk in this population. PMID:20663699
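The sensitivity of the diagnosis to the reference population follows directly from the T-score definition. The sketch below shows how the same hypothetical measured BMD can fall into different WHO categories under two assumed reference means and standard deviations; the numbers are invented for illustration only.

```python
def t_score(bmd, ref_mean, ref_sd):
    # T-score: number of SDs from the young-adult mean of the chosen reference population
    return (bmd - ref_mean) / ref_sd

def who_category(t):
    # WHO classification based on the T-score
    if t >= -1.0:
        return "normal"
    if t > -2.5:
        return "osteopenia"
    return "osteoporosis"

# Hypothetical subject: the same measured BMD classified against two reference standards
bmd = 0.80                                        # g/cm^2, total hip (illustrative)
print(who_category(t_score(bmd, 0.94, 0.12)))     # e.g. a US white-based reference
print(who_category(t_score(bmd, 0.90, 0.11)))     # e.g. an Indian-based reference
```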
Wang, Haiyin; Jin, Chunlin; Jiang, Qingwu
2017-11-20
Traditional Chinese medicine (TCM) is an important part of China's medical system. Due to the prolonged low price of TCM procedures and the lack of an effective mechanism for dynamic price adjustment, the development of TCM has markedly lagged behind Western medicine. The World Health Organization (WHO) has emphasized the need to enhance the development of alternative and traditional medicine when creating national health care systems. The establishment of scientific and appropriate mechanisms to adjust the price of medical procedures in TCM is crucial to promoting the development of TCM. This study has examined incorporating value indicators and data on basic manpower expended, time spent, technical difficulty, and the degree of risk in the latest standards for the price of medical procedures in China, and this study also offers a price adjustment model with the relative price ratio as a key index. This study examined 144 TCM procedures and found that prices of TCM procedures were mainly based on the value of medical care provided; on average, medical care provided accounted for 89% of the price. Current price levels were generally low and the current price accounted for 56% of the standardized value of a procedure, on average. Current price levels accounted for a markedly lower standardized value of acupuncture, moxibustion, special treatment with TCM, and comprehensive TCM procedures. This study selected a total of 79 procedures and adjusted them by priority. The relationship between the price of TCM procedures and the suggested price was significantly optimized (p < 0.01). This study suggests that adjustment of the price of medical procedures based on a standardized value parity model is a scientific and suitable method of price adjustment that can serve as a reference for other provinces and municipalities in China and other countries and regions that mainly have fee-for-service (FFS) medical care.
The Business Value of Superior Energy Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKane, Aimee; Scheihing, Paul; Evans, Tracy
Industrial facilities participating in the U.S. Department of Energy’s (US DOE) Superior Energy Performance (SEP) program are finding that it provides them with significant business value. This value starts with the implementation of the ISO 50001 energy management system standard, which provides an internationally-relevant framework for integration of energy management into an organization’s business processes. The resulting structure emphasizes effective use of available data and supports continual improvement of energy performance. International relevance is particularly important for companies with a global presence or trading interests, providing them with access to supporting ISO standards and a growing body of certified companies representing the collective knowledge of communities of practice. This paper examines the business value of SEP, a voluntary program that builds on ISO 50001, inviting industry to demonstrate an even greater commitment through third-party verification of energy performance improvement to a specified level of achievement. Information from 28 facilities that have already achieved SEP certification will illustrate key findings concerning both the value and the challenges from SEP/ISO 50001 implementation. These include the facilities’ experience with implementation, internal and external value of third-party verification of energy performance improvement; attractive payback periods and the importance of SEP tools and guidance. US DOE is working to bring the program to scale, including the Enterprise-Wide Accelerator (SEP for multiple facilities in a company), the Ratepayer-Funded Program Accelerator (supporting tools for utilities and program administrators to include SEP in their program offerings), and expansion of the program to other sectors and industry supply chains.
ERIC Educational Resources Information Center
Sylwester, Robert
1998-01-01
From fine-tuning muscular systems to integrating emotion and logic, the arts have important biological value. Motion and emotion are central to the arts and life itself. It is counterproductive to promote high performance standards while displacing skill development with computer technologies and reducing arts programs that move students from…
Garrido, G; González, D; Lemus, Y; Delporte, C; Delgado, R
2006-06-01
A standard aqueous extract of Mangifera indica L., used in Cuba as an antioxidant under the brand name VIMANG, was tested in vivo for its anti-inflammatory activity, using commonly accepted assays. The standard extract of M. indica, administered orally (50-200 mg/kg body wt.), reduced ear edema induced by arachidonic acid (AA) and phorbol myristate acetate (PMA) in mice. In the PMA model, M. indica extract also reduced myeloperoxidase (MPO) activity. In vitro studies were performed using the macrophage cell line J774 stimulated with the pro-inflammatory stimuli lipopolysaccharide-interferon gamma (LPS-IFNγ) or calcium ionophore A23187 to determine prostaglandin PGE2 or leukotriene LTB4 release, respectively. The extract inhibited the induction of PGE2 and LTB4 with IC50 values of 21.7 and 26.0 µg/ml, respectively. Mangiferin (a glucosylxanthone isolated from the extract) also inhibited these AA metabolites (PGE2, IC50 value = 17.2 µg/ml; LTB4, IC50 value = 2.1 µg/ml). These results represent an important contribution to the elucidation of the mechanism involved in the anti-inflammatory and anti-nociceptive effects reported for the standard extract of M. indica VIMANG.
Great Lakes prey fish populations: A cross-basin overview of status and trends in 2008
Gorman, Owen T.; Bunnell, David B.
2009-01-01
Assessments of prey fishes in the Great Lakes have been conducted annually since the 1970s by the Great Lakes Science Center, sometimes assisted by partner agencies. Prey fish assessments differ among lakes in the proportion of a lake covered, seasonal timing, bottom trawl gear used, sampling design, and the manner in which the trawl is towed (across or along bottom contours). Because each assessment is unique in one or more important aspects, a direct comparison of prey fish catches among lakes is problematic. All of the assessments, however, produce indices of abundance or biomass that can be standardized to facilitate comparisons of trends among lakes and to illustrate present status of the populations. We present indices of abundance for important prey fishes in the Great Lakes standardized to the highest value for a time series within each lake: cisco (Coregonus artedi), bloater (C. hoyi), rainbow smelt (Osmerus mordax), and alewife (Alosa pseudoharengus). We also provide indices for round goby (Neogobius melanostomus), an invasive fish presently spreading throughout the basin. Our intent is to provide a short, informal report emphasizing data presentation rather than synthesis; for this reason we intentionally avoid use of tables and cited references. For each lake, standardized relative indices for annual biomass and density estimates of important prey fishes were calculated as the fraction relative to the largest value observed in the time series. To determine whether basin-wide trends were apparent for each species, we first ranked standardized index values within each lake. When comparing ranked index values from three or more lakes, we calculated the Kendall coefficient of concordance (W), which can range from 0 (complete discordance or disagreement among trends) to 1 (complete concordance or agreement among trends). The P-value for W provides the probability of agreement across the lakes. When comparing ranked index values from two lakes, we calculated the Kendall correlation coefficient (τ), which ranges from -1 (inverse association, perfect disagreement) to 1 (direct association, perfect agreement). Here, the P-value for τ provides the probability of either inverse or direct association between the lakes. First, we present trends in relative biomass of age-1 and older prey fishes to show changes in populations within each lake. Then, we present standardized indices of numerical abundance of a single age class to show changes in relative year-class strength within each lake. Indices of year-class strength reliably reflect the magnitude of the cohort size at subsequent ages. However, because of differences in survey timing across lakes, the age class that is used for each species to index year-class strength varies across lakes and, just as surveys differ among lakes, methods for determining fish age-class differ also. For Lake Superior cisco, bloater, smelt, and Lake Michigan alewife, year-class strengths are based on aged fish and age-length keys, and for all other combinations of lakes and species, age-classes are assigned based on fish length cut-offs. Our intent with this report is to provide a cross-lakes view of population trends but not to propose reasons for those trends.
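As a concrete illustration of the concordance statistics described above, the sketch below takes made-up index values already standardized to each lake's maximum, ranks them within lakes, and computes Kendall's W (for three or more lakes) and Kendall's τ (for two lakes); it is not the authors' code.

```python
import numpy as np
from scipy.stats import kendalltau, rankdata

# rows = lakes, columns = years; each row is an index standardized to its own maximum
idx = np.array([[0.2, 0.5, 1.0, 0.7, 0.3],
                [0.1, 0.6, 0.9, 1.0, 0.4],
                [0.3, 0.4, 1.0, 0.8, 0.2]])

ranks = np.apply_along_axis(rankdata, 1, idx)      # rank years within each lake
m, n = ranks.shape
S = np.sum((ranks.sum(axis=0) - m * (n + 1) / 2) ** 2)
W = 12 * S / (m ** 2 * (n ** 3 - n))               # 0 = complete discordance, 1 = perfect agreement

tau, p = kendalltau(idx[0], idx[1])                # two-lake comparison
print(W, tau, p)
```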
An enhanced fast scanning algorithm for image segmentation
NASA Astrophysics Data System (ADS)
Ismael, Ahmed Naser; Yusof, Yuhanis binti
2015-12-01
Segmentation is an essential process that separates an image into regions with similar characteristics or features, transforming the image for better analysis and evaluation. An important benefit of segmentation is the identification of regions of interest in a particular image. Various algorithms have been proposed for image segmentation, including the Fast Scanning algorithm, which has been applied to food, sport and medical images. It scans all pixels in the image and clusters each pixel according to its upper and left neighbor pixels; clustering is performed by merging a pixel with a similar neighbor based on a fixed threshold. Such an approach can lead to weak reliability and poor shape matching of the produced segments. This paper proposes an adaptive threshold function to be used in the clustering process of the Fast Scanning algorithm. The function is based on the gray values of the image's pixels and their variance; pixel levels above the threshold are mapped to intensity values between 0 and 1, and the remaining values are set to zero. The proposed enhanced Fast Scanning algorithm is applied to images of public and private transportation in Iraq, and it is evaluated by comparing its output images with those of the standard Fast Scanning algorithm. The results showed that the proposed algorithm is faster than the standard Fast Scanning algorithm.
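The clustering step of the Fast Scanning algorithm can be sketched as follows. This is a simplified version with a fixed threshold (the paper's contribution is an adaptive threshold based on gray values and variance), it does not merge the upper and left clusters when both match, and the cluster-mean update is deliberately crude.

```python
import numpy as np

def fast_scan(img, threshold):
    """Simplified Fast Scanning clustering: raster-scan the image and merge each
    pixel into the upper or left neighbor's cluster when the gray-level
    difference is within the threshold, otherwise start a new cluster.
    img: 2-D gray-level image. Returns an integer label map.
    """
    h, w = img.shape
    labels = np.zeros((h, w), dtype=int)
    means = {}                              # running mean gray level per cluster
    next_label = 1
    for y in range(h):
        for x in range(w):
            up = labels[y - 1, x] if y > 0 else 0
            left = labels[y, x - 1] if x > 0 else 0
            cand = [c for c in (up, left) if c and abs(float(img[y, x]) - means[c]) <= threshold]
            if cand:
                c = cand[0]                 # merge into the first matching neighbor cluster
            else:
                c = next_label              # no similar neighbor: open a new cluster
                means[c] = float(img[y, x])
                next_label += 1
            labels[y, x] = c
            means[c] = 0.5 * (means[c] + float(img[y, x]))   # crude running update
    return labels
```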
The truly remarkable universality of half a standard deviation: confirmation through another look.
Norman, Geoffrey R; Sloan, Jeff A; Wyrwich, Kathleen W
2004-10-01
In this issue of Expert Review of Pharmacoeconomics and Outcomes Research, Farivar, Liu, and Hays present their findings in 'Another look at the half standard deviation estimate of the minimally important difference in health-related quality of life scores' (hereafter referred to as 'Another look'). These researchers have re-examined the May 2003 Medical Care article 'Interpretation of changes in health-related quality of life: the remarkable universality of half a standard deviation' (hereafter referred to as 'Remarkable') in the hope of supporting their hypothesis that the minimally important difference in health-related quality of life measures is undoubtedly closer to 0.3 standard deviations than 0.5. Nonetheless, despite their extensive wranglings with the exclusion of many articles that we included in our review; the inclusion of articles that we did not include in our review; and the recalculation of effect sizes using the absolute value of the mean differences, in our opinion, the results of the 'Another look' article confirm the same findings in the 'Remarkable' paper.
Vandvik, Per Olav; Alonso-Coello, Pablo; Akl, Elie A; Thornton, Judith; Rigau, David; Adams, Katie; O'Connor, Paul; Guyatt, Gordon; Kristiansen, Annette
2017-01-01
Objectives To investigate practicing physicians' preferences, perceived usefulness and understanding of a new multilayered guideline presentation format—compared to a standard format—as well as conceptual understanding of trustworthy guideline concepts. Design Participants attended a standardised lecture in which they were presented with a clinical scenario and randomised to view a guideline recommendation in a multilayered format or standard format after which they answered multiple-choice questions using clickers. Both groups were also presented and asked about guideline concepts. Setting Mandatory educational lectures in 7 non-academic and academic hospitals, and 2 settings involving primary care in Lebanon, Norway, Spain and the UK. Participants 181 practicing physicians in internal medicine (156) and general practice (25). Interventions A new digitally structured, multilayered guideline presentation format and a standard narrative presentation format currently in widespread use. Primary and secondary outcome measures Our primary outcome was preference for presentation format. Understanding, perceived usefulness and perception of absolute effects were secondary outcomes. Results 72% (95% CI 65 to 79) of participants preferred the multilayered format and 16% (95% CI 10 to 22) preferred the standard format. A majority agreed that recommendations (multilayered 86% vs standard 91%, p value=0.31) and evidence summaries (79% vs 77%, p value=0.76) were useful in the context of the clinical scenario. 72% of participants randomised to the multilayered format vs 58% for standard formats reported correct understanding of the recommendations (p value=0.06). Most participants elected an appropriate clinical action after viewing the recommendations (98% vs 92%, p value=0.10). 82% of the participants considered absolute effect estimates in evidence summaries helpful or crucial. Conclusions Clinicians clearly preferred a novel multilayered presentation format to the standard format. Whether the preferred format improves decision-making and has an impact on patient important outcomes merits further investigation. PMID:28188149
A site-based approach to delivering rangeland ecosystem services
USDA-ARS?s Scientific Manuscript database
Rangeland ecosystems are capable of providing an array of ecosystem services important to the wellbeing of society. Some of these services (e.g. meat, fibre) are transported to markets and their quantity, quality and value are established via a set of widely accepted standards. Other services (e.g. ...
A Perspective on Effective Schools. Education Brief.
ERIC Educational Resources Information Center
Shulman, Lee S.
Although social science has contributed much to the study of schools, the perspectives of earlier thinkers about school effectiveness can fill some of our present need, too, by emphasizing the most important social values and then applying them as educational standards. Former generations viewed good schools normatively rather than empirically…
Torey, Angeline; Sasidharan, Sreenivasan; Yeng, Chen; Latha, Lachimanan Yoga
2010-05-10
Quality control standardization of the various medicinal plants used in traditional medicine is becoming more important today in view of the commercialization of formulations based on these plants. An attempt at standardization of Cassia spectabilis leaf has been carried out with respect to authenticity, assay and chemical constituent analysis. The authentication involved many parameters, including gross morphology, microscopy of the leaves and functional group analysis by Fourier Transform Infrared (FTIR) spectroscopy. The assay part of standardization involved determination of the minimum inhibitory concentration (MIC) of the extract, which could help assess the chemical effects and establish curative values. The MIC of the C. spectabilis leaf extracts was investigated using the Broth Dilution Method. The extracts showed an MIC value of 6.25 mg/mL, independent of the extraction time. The chemical constituent aspect of standardization involves quantification of the main chemical components in C. spectabilis. The GC-MS method used for quantification of 2,4-(1H,3H)-pyrimidinedione in the extract was rapid, accurate, precise, linear (R² = 0.8685), rugged and robust. Hence this method was suitable for quantification of this component in C. spectabilis. The standardization of C. spectabilis is needed to facilitate marketing of medicinal plants, with a view to promoting the export of valuable Malaysian traditional medicinal plants such as C. spectabilis.
Valente, Marta Sofia; Pedro, Paulo; Alonso, M Carmen; Borrego, Juan J; Dionísio, Lídia
2010-03-01
Monitoring the microbiological quality of water used for recreational activities is very important to human public health. Although the sanitary quality of recreational marine waters could be evaluated by standard methods, they are time-consuming and need confirmation. For these reasons, faster and more sensitive methods, such as the defined substrate-based technology, have been developed. In the present work, we have compared the standard method of membrane filtration using Tergitol-TTC agar for total coliforms and Escherichia coli, and Slanetz and Bartley agar for enterococci, and the IDEXX defined substrate technology for these faecal pollution indicators to determine the microbiological quality of natural recreational waters. ISO 17994:2004 standard was used to compare these methods. The IDEXX for total coliforms and E. coli, Colilert, showed higher values than those obtained by the standard method. Enterolert test, for the enumeration of enterococci, showed lower values when compared with the standard method. It may be concluded that more studies to evaluate the precision and accuracy of the rapid tests are required in order to apply them for routine monitoring of marine and freshwater recreational bathing areas. The main advantages of these methods are that they are more specific, feasible and simpler than the standard methodology.
Rolling capital: managing investments in a value-based care world.
Jasuta, Lynette
2016-06-01
The importance of capital planning is increasing as the healthcare industry moves toward value-based care. Replacing unwieldy and inflexible traditional capital planning processes with a rolling capital planning approach can result in: Greater standardization, facilitating better strategic planning across the whole system. Reduced labor intensity in the planning and budgeting process. Reduced costs through being able to plan better for replacement purchases and take advantage of group purchasing and bundling opportunities. Increased transparency in the decision-making process.
Comparison of wheat and rye flour solutions for skin prick testing: a multi-centre study (Stad 1).
van Kampen, V; Merget, R; Rabstein, S; Sander, I; Bruening, T; Broding, H C; Keller, C; Muesken, H; Overlack, A; Schultze-Werninghaus, G; Walusiak, J; Raulf-Heimsoth, M
2009-12-01
Skin prick testing (SPT) is the basic method for diagnosing IgE-mediated allergies. However, skin reactivity is related to the quality of allergen extracts, which are often poorly defined for occupational allergens. To compare wheat and rye flour SPT solutions from different producers. Standardized SPTs were performed in seven allergy centres with wheat and rye flour solutions from four producers in 125 symptomatic bakers. Optimal cut-off levels for weal sizes were assessed with the Youden Index. Comparisons between SPT results of different solutions were made with flour-specific IgE (sIgE) as the gold standard. Sensitivities, specificities, positive and negative predictive values, and test efficiencies were calculated and compared with McNemar and χ²-tests. The influence of the choice of the gold standard (sIgE or challenge) test was examined for 95 subjects. Additionally, SPT solutions were analysed for protein and antigen content. The optimal cut-off level for all SPT solutions was a weal size of ≥1.5 mm. While differences between wheat and rye flours were small, differences between producers were important. Variability of sensitivities (0.31-0.96), negative predictive values (0.42-0.91), and test efficiencies (0.54-0.90) was higher than variation of specificities (0.74-1.00) and positive predictive values (0.88-1.00). Similar results were obtained when using challenge test results as the gold standard. Variability could be explained by the different antigen contents of the SPT solutions. There is a wide variability of SPT solutions for wheat and rye flour from different producers, mainly with respect to sensitivities, negative predictive values, and test efficiencies. Improvement and standardization of SPT solutions used for the diagnosis of baker's asthma are highly recommended.
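The diagnostic quantities reported above are straightforward to compute once a cut-off is fixed. The sketch below evaluates sensitivity, specificity, predictive values, test efficiency and the Youden index over candidate weal-size cut-offs on invented data, with sIgE positivity as the reference standard; it is an illustration, not the study's analysis.

```python
import numpy as np

def diagnostics(weal_mm, sige_positive, cutoff):
    """Sensitivity, specificity, PPV, NPV, test efficiency and Youden index of an
    SPT solution at a given weal-size cut-off, with sIgE as the reference standard."""
    test_pos = np.asarray(weal_mm) >= cutoff
    truth = np.asarray(sige_positive, dtype=bool)
    tp = np.sum(test_pos & truth);  fp = np.sum(test_pos & ~truth)
    fn = np.sum(~test_pos & truth); tn = np.sum(~test_pos & ~truth)
    sens, spec = tp / (tp + fn), tn / (tn + fp)
    ppv, npv = tp / (tp + fp), tn / (tn + fn)
    efficiency = (tp + tn) / len(truth)
    return sens, spec, ppv, npv, efficiency, sens + spec - 1   # last value = Youden index

# Invented weal sizes (mm) and sIgE results; pick the cut-off maximizing the Youden index
weal = np.array([0., 1.5, 3., 0.5, 4., 2., 0., 1.5])
sige = np.array([0, 1, 1, 0, 1, 1, 0, 0])
best = max(np.arange(0.5, 3.5, 0.5), key=lambda c: diagnostics(weal, sige, c)[-1])
print(best, diagnostics(weal, sige, best))
```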
Quantifying the relative irreplaceability of important bird and biodiversity areas.
Di Marco, Moreno; Brooks, Thomas; Cuttelod, Annabelle; Fishpool, Lincoln D C; Rondinini, Carlo; Smith, Robert J; Bennun, Leon; Butchart, Stuart H M; Ferrier, Simon; Foppen, Ruud P B; Joppa, Lucas; Juffe-Bignoli, Diego; Knight, Andrew T; Lamoreux, John F; Langhammer, Penny F; May, Ian; Possingham, Hugh P; Visconti, Piero; Watson, James E M; Woodley, Stephen
2016-04-01
World governments have committed to increase the global protected areas coverage by 2020, but the effectiveness of this commitment for protecting biodiversity depends on where new protected areas are located. Threshold- and complementarity-based approaches have been independently used to identify important sites for biodiversity. We brought together these approaches by performing a complementarity-based analysis of irreplaceability in important bird and biodiversity areas (IBAs), which are sites identified using a threshold-based approach. We determined whether irreplaceability values are higher inside than outside IBAs and whether any observed difference depends on known characteristics of the IBAs. We focused on 3 regions with comprehensive IBA inventories and bird distribution atlases: Australia, southern Africa, and Europe. Irreplaceability values were significantly higher inside than outside IBAs, although differences were much smaller in Europe than elsewhere. Higher irreplaceability values in IBAs were associated with the presence and number of restricted-range species; number of criteria under which the site was identified; and mean geographic range size of the species for which the site was identified (trigger species). In addition, IBAs were characterized by higher irreplaceability values when using proportional species representation targets, rather than fixed targets. There were broadly comparable results when measuring irreplaceability for trigger species and when considering all bird species, which indicates a good surrogacy effect of the former. Recently, the International Union for Conservation of Nature has convened a consultation to consolidate global standards for the identification of key biodiversity areas (KBAs), building from existing approaches such as IBAs. Our results informed this consultation, and in particular a proposed irreplaceability criterion that will allow the new KBA standard to draw on the strengths of both threshold- and complementarity-based approaches. © 2015 Society for Conservation Biology.
Jiang, H J; Zhang, J M; Fu, W M; Zheng, Z; Luo, W; Zheng, Y X; Zhu, J M
2016-06-07
To investigate some important issues in the diagnosis and treatment of idiopathic normal-pressure hydrocephalus (iNPH), such as standardized pre-operative assessment, the initial pressure value of the diverter pump, and pressure regulation during follow-up. Twenty-six iNPH patients (21 males) who were treated in the Department of Neurosurgery of the 2nd Affiliated Hospital of Zhejiang University School of Medicine from 2011 to 2015 were analyzed retrospectively. The average age was 60.5 years. The analysis focused on the treatment process of iNPH, the initial pressure value of the diverter pump, the choice of diverter pump, and pressure regulation during follow-up. As a result, 24 cases (92.3%) had a good prognosis based on their imaging and clinical manifestations. Based on the literature and their clinical experience, the department established a diagnosis and treatment procedure for iNPH and a pressure regulation procedure for the follow-up of iNPH. Moreover, it is proposed that choosing an anti-gravity diverter pump and setting an initial pressure value 20 mmH2O below the pre-surgical cerebrospinal fluid pressure may be beneficial for prognosis. This standardized diagnosis and treatment procedure for iNPH is practical and effective.
Clinical tooth preparations and associated measuring methods: a systematic review.
Tiu, Janine; Al-Amleh, Basil; Waddell, J Neil; Duncan, Warwick J
2015-03-01
The geometries of tooth preparations are important features that aid in the retention and resistance of cemented complete crowns. The clinically relevant values and the methods used to measure these are not clear. The purpose of this systematic review was to retrieve, organize, and critically appraise studies measuring clinical tooth preparation parameters, specifically the methodology used to measure the preparation geometry. A database search was performed in Scopus, PubMed, and ScienceDirect with an additional hand search on December 5, 2013. The articles were screened for inclusion and exclusion criteria, and information regarding the total occlusal convergence (TOC) angle, margin design, and associated measuring methods was extracted. The values and associated measuring methods were tabulated. A total of 1006 publications were initially retrieved. After removing duplicates and filtering by using exclusion and inclusion criteria, 983 articles were excluded. Twenty-three articles reported clinical tooth preparation values. Twenty articles reported the TOC, 4 articles reported margin designs, 4 articles reported margin angles, and 3 articles reported the abutment height of preparations. A variety of methods were used to measure these parameters. TOC values seem to be the most important preparation parameter. Recommended TOC values have increased over the past 4 decades from an unachievable 2- to 5-degree taper to a more realistic 10 to 22 degrees. Recommended values are more likely to be achieved under experimental conditions if crown preparations are performed outside of the mouth. We recommend that a standardized measurement method based on the cross sections of crown preparations and standardized reporting be developed for future studies analyzing preparation geometry. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Role and Value of the Corporate Medical Director.
Pawlecki, J Brent; Burton, Wayne N; Christensen, Cherryl; Crighton, K Andrew; Heron, Richard; Hudson, T Warner; Hymel, Pamela A; Roomes, David
2018-05-01
The role of the corporate medical director (CMD) has evolved over the last 300 years since Ramazzini first identified diseases of Italian workers in the early 1700s. Since then, there has been a gradual blurring of the boundaries between private and workplace health concerns. Today's CMD must have intimate knowledge of their corporation's industry and the businesses that they support, particularly the occupational and environmental programs that comply with all local, state, and/or national standards and regulations. Leading companies not only measure compliance with such standards but also may hold programs to their own internal corporate global standards even if these go beyond local government requirements. This document will explore in greater depth the strength and importance that the CMD brings to business operations to support a healthy, engaged, and high-performing workforce. Part 1 describes the role and value of the CMD, while Part 2 provides collective wisdom for the new CMD from current and past highly experienced CMDs.
NASA Astrophysics Data System (ADS)
Tarnapowicz, Dariusz; German-Galkin, Sergiej
2018-03-01
The decisive source of air pollution emissions in ports is berthed ships, primarily because of the operation of ships' autonomous generator sets. One way of reducing air pollution emissions in ports is to supply ships from the inland electricity system. The main problem with connecting ships to the inland network is that the voltage levels and frequencies of these networks (which differ between countries) do not match the voltage levels and frequencies present in the ship's network. It is also important that the source power can range from a few hundred kW up to several MW. In order to realize a universal "Shore to Ship" system that allows the connection of ships to the inland electricity network, international standardization is necessary. This article presents the current recommendations, standards and regulations for the design of "Shore to Ship" systems.
NASA Technical Reports Server (NTRS)
Johnson, V. J.; Mc Carty, R. D.; Roder, H. M.
1970-01-01
Integrated tables of pressure, volume, and temperature for the saturated liquid, from the triple point to the critical point of the gases, have been developed. Tables include definition of saturated liquid curve. Values are presented in metric and practical units. Advantages of the new tables are discussed.
Agility and Speed Standards for Student Teenager Wrestlers
ERIC Educational Resources Information Center
Bayraktar, Isik
2017-01-01
The processes of talent identification and development provide serious advantages for athletes' success. Interpreting the current situation in an athlete's education becomes more valuable when objective assessment is used, in other words when norm values are applied. Monitoring talent against norms is important for each sport…
Engineering Education: A Clear Content Base for Standards
ERIC Educational Resources Information Center
Grubbs, Michael E.; Strimel, Greg J.; Huffman, Tanner
2018-01-01
Interest in engineering at the P-12 level has increased in recent years, largely in response to STEM educational reform. Despite greater attention to the value, importance, and use of engineering for teaching and learning, the educational community has engaged minimally in its deliberate and coherent study. Specifically, few efforts have been…
Social Accountability of Medical Schools: Do Accreditation Standards Help Promote the Concept?
ERIC Educational Resources Information Center
Abdalla, Mohamed Elhassan
2014-01-01
The social accountability of medical schools is an emerging concept in medical education. This issue calls for the consideration of societal needs in all aspects of medical programmes, including the values of relevance, quality, cost-effectiveness and equity. Most importantly, these needs must be defined collaboratively with people themselves.…
Developing Pre-Professional Identity in Undergraduates through Work-Integrated Learning
ERIC Educational Resources Information Center
Jackson, Denise
2017-01-01
Pre-professional identity is a complex phenomenon spanning awareness of and connection with the skills, qualities, behaviours, values and standards of a student's chosen profession, as well as one's understanding of professional self in relation to the broader general self. It is an important, yet under-explored, aspect of graduate employability…
USDA-ARS?s Scientific Manuscript database
Roasting is of central importance to peanut flavor. Standard industry practice is to roast peanuts to a specific surface color (Hunter L-value) for a given application; however, equivalent surface colors can be generated using different temperature/time roast combinations. To better understand the e...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-22
... INFORMATION CONTACT: John Drury or Ericka Ukrow, AD/CVD Operations, Office 7, Import Administration... conducted sales and cost verifications between June 18 and 28, 2012 of the questionnaire responses submitted by Al Jazeera. We used standard verification procedures, including examination of relevant accounting...
Standard Generalized Markup Language for self-defining structured reports.
Kahn, C E
1999-01-01
Structured reporting is the process of using standardized data elements and predetermined data-entry formats to record observations. The Standard Generalized Markup Language (SGML; International Standards Organization (ISO) 8879:1986), an open, internationally accepted standard for document interchange, was used to encode medical observations acquired in an Internet-based structured reporting system. The resulting report is self-documenting: it includes a definition of its allowable data fields and values encoded as a report-specific SGML document type definition (DTD). The data-entry forms, DTD, and report document instances are based on report specifications written in a simple, SGML-based language designed for that purpose. Reporting concepts can be linked with those of external vocabularies such as the Unified Medical Language System (UMLS) Metathesaurus. The use of open standards such as SGML is an important step in the creation of open, universally comprehensible structured reports.
Threshold network of a financial market using the P-value of correlation coefficients
NASA Astrophysics Data System (ADS)
Ha, Gyeong-Gyun; Lee, Jae Woo; Nobi, Ashadun
2015-06-01
Threshold methods in financial networks are important tools for obtaining information about the financial state of a market. Previously, absolute thresholds on correlation coefficients have been used; however, these have no relation to the length of the time window. We assign a threshold value that depends on the size of the time window by using the P-value concept from statistics. We construct a threshold network (TN) at the same threshold value for two different time window sizes in the Korean Composite Stock Price Index (KOSPI). We measure network properties, such as the edge density, clustering coefficient, assortativity coefficient, and modularity. We determine that a significant difference exists between the network properties of the two time windows at the same threshold, especially during crises. This implies that the market information depends on the length of the time window when constructing the TN. We apply the same technique to Standard and Poor's 500 (S&P500) and observe similar results.
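As an illustration of the approach described above, the sketch below uses synthetic return data; the fixed p-value threshold and the simple edge-density measure are assumptions, not the authors' exact procedure. It keeps only edges whose Pearson correlation is significant at a chosen P-value and shows how the resulting edge density can differ for short and long time windows at the same threshold.

```python
import numpy as np
from scipy import stats

def threshold_network(returns, p_threshold=0.01):
    """Build an adjacency matrix keeping only edges whose Pearson correlation
    is significant at p < p_threshold; return the adjacency and edge density."""
    n_stocks = returns.shape[1]
    adj = np.zeros((n_stocks, n_stocks), dtype=int)
    for i in range(n_stocks):
        for j in range(i + 1, n_stocks):
            r, p = stats.pearsonr(returns[:, i], returns[:, j])
            if p < p_threshold:
                adj[i, j] = adj[j, i] = 1
    possible = n_stocks * (n_stocks - 1) / 2
    density = adj.sum() / 2 / possible
    return adj, density

# Synthetic log-return matrices (time x stocks) standing in for index data.
rng = np.random.default_rng(0)
window_short = rng.normal(size=(25, 40))    # short time window
window_long = rng.normal(size=(250, 40))    # long time window
for w, label in [(window_short, "short"), (window_long, "long")]:
    _, dens = threshold_network(w)
    print(label, "window edge density:", round(dens, 3))
```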
The role of decision analysis in informed consent: choosing between intuition and systematicity.
Ubel, P A; Loewenstein, G
1997-03-01
An important goal of informed consent is to present information to patients so that they can decide which medical option is best for them, according to their values. Research in cognitive psychology has shown that people are rapidly overwhelmed by having to consider more than a few options in making choices. Decision analysis provides a quantifiable way to assess patients' values, and it eliminates the burden of integrating these values with probabilistic information. In this paper we evaluate the relative importance of intuition and systematicity in informed consent. We point out that there is no gold standard for optimal decision making in decisions that hinge on patient values. We also point out that in some such situations it is too early to assume that the benefits of systematicity outweigh the benefits of intuition. Research is needed to address the question of which situations favor the use of intuitive approaches of decision making and which call for a more systematic approach.
High preservation of DNA standards diluted in 50% glycerol.
Schaudien, Dirk; Baumgärtner, Wolfgang; Herden, Christiane
2007-09-01
Standard curves are important tools in real-time quantitative polymerase chain reaction (PCR) to precisely analyze gene expression patterns under physiologic and pathologic conditions. Handling of DNA standards often implies multiple cycles of freezing and thawing that might affect DNA stability and integrity. This in turn might influence the reliability and reproducibility of quantitative measurements in real-time PCR assays. In this study, 3 DNA standards, namely murine tumor necrosis factor (TNF) alpha, interferon (IFN) gamma, and kainate-1 receptor, were diluted in 50% glycerol or water after 1, 4, and 16 cycles of freezing and thawing, and the amplified copy numbers after real-time PCR were compared. The standards diluted in water showed a reduction to 83%, 55%, and 50% after 4 cycles, and to 24%, 5%, and 4% after 16 cycles, for the kainate-1 receptor, TNF alpha, and IFN gamma standards, respectively, when compared with a single cycle of freezing and thawing. Interestingly, all cDNA samples diluted in 50% glycerol were amplified in comparable copy numbers even after 16 cycles of freezing and thawing. The effect of the standards undergoing different cycles of freezing and thawing on sample values was demonstrated by amplifying cDNA obtained from Borna disease virus-infected and noninfected TNF-transgenic mouse brain. This revealed significant differences in measured cDNA copy numbers when using water-diluted DNA standards. In contrast, sample values did not vary when using glycerol-diluted standards that were frozen and thawed 16 times. In conclusion, glycerol storage of DNA standards represents a suitable tool for the accurate and reproducible quantification of cDNA samples in real-time PCR analysis.
AWS breaks new ground with soldering specification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vianco, Paul Thomas
Joining technologies continue to advance with new materials, process innovations, and inspection techniques. An increasing number of high-valued, high-reliability applications -- from boilers and ship hulls to rocket motors and medical devices -- have required the development of industry standards and specifications in order to ensure that the best design and manufacturing practices are being used to produce safe, durable products and assemblies. Standards writing has always had an important role at the American Welding Society (AWS). The AWS standards and specifications cover such topics as filler materials, joining processes, inspection techniques, and qualification methods that are used in welding and brazing technologies. These AWS standards and specifications, all of which are approved by the American National Standards Institute (ANSI), have also provided the basis for many similar documents used in Europe and in Pacific Rim countries.
NASA Astrophysics Data System (ADS)
Sperling, A.; Meyer, M.; Pendsa, S.; Jordan, W.; Revtova, E.; Poikonen, T.; Renoux, D.; Blattner, P.
2018-04-01
Proper characterization of test setups used in industry for testing and traceable measurement of lighting devices by the substitution method is an important task. According to new standards for testing LED lamps, luminaires and modules, uncertainty budgets are requested because in many cases the properties of the device under test differ from the transfer standard used, which may cause significant errors, for example if a LED-based lamp is tested or calibrated in an integrating sphere which was calibrated with a tungsten lamp. This paper introduces a multiple transfer standard, which was designed not only to transfer a single calibration value (e.g. luminous flux) but also to characterize test setups used for LED measurements with additional provided and calibrated output features to enable the application of the new standards.
Magnetic properties comparison of mass standards among seventeen national metrology institutes
NASA Astrophysics Data System (ADS)
Becerra, L. O.; Berry, J.; Chang, C. S.; Chapman, G. D.; Chung, J. W.; Davis, R. S.; Field, I.; Fuchs, P.; Jacobsson, U.; Lee, S. M.; Loayza, V. M.; Madec, T.; Matilla, C.; Ooiwa, A.; Scholz, F.; Sutton, C.; van Andel, I.
2006-10-01
The ubiquitous technology of magnetic force compensation of gravitational forces acting on artifacts on the pans of modern balances and comparators has brought with it the problem of magnetic leakage from the compensation coils. Leaking magnetic fields, as well as those due to the surroundings of the balance, can interact with the artifact whose mass is to be determined, causing erroneous values to be observed. For this reason, and to comply with normative standards, it has become important for mass metrologists to evaluate the magnetic susceptibility and any remanent magnetization that mass standards may possess. This paper describes a comparison of measurements of these parameters among seventeen national metrology institutes. The measurements are made on three transfer standards whose magnetic parameters span the range that might be encountered in stainless steel mass standards.
Direct and indirect drift assessment means. Part 2: wind tunnel experiments.
Nuyttens, D; De Schampheleire, M; Baetens, K; Sonck, B
2008-01-01
Wind tunnel measurements, performed at Silsoe Research Institute (SRI), were used to measure airborne and fallout spray volumes under directly comparable and repeatable conditions for single and static nozzles. Based on these measurements, drift potential reduction percentages (DPRP), expressing the percentage reduction of the drift potential compared with the reference spraying, were calculated following three approaches. The first approach was based on the calculation of the first moment of the airborne spray profile (DPRPv1). In the second and third approaches, the surfaces under the measured airborne (DPRPv2) and fallout (DPRP(H)) deposit curves were used. These DPRP values express the percentage reduction of the drift potential compared with the reference spraying. Ten different spray nozzles were tested. The results showed the expected fallout profiles with the highest deposits closest to the nozzle and a systematic decrease with distance from the nozzle. For the airborne deposit profiles, the highest deposits were found at the lowest collectors with an important systematic decrease with increasing height. For the same nozzle size and spray pressure, DPRP values are generally higher for the air inclusion nozzles, followed by the low-drift nozzles and the standard flat fan nozzles, and the effect of nozzle type is most important for smaller nozzle sizes. In general, the bigger the ISO nozzle size, the higher the DPRP values. Comparing results from the three different approaches, namely DPRPv1, DPRPv2 and DPRP(H), some interesting conclusions can be drawn. For the standard flat fan nozzles, DPRPv1 values were the highest, followed by DPRPv2 and DPRP(H), while for the low-drift nozzles the opposite results were found. For the air inclusion nozzles, there was a relatively good agreement between DPRPv1, DPRPv2 and DPRP(H) values. All of this is important in the interpretation of wind tunnel data for different nozzle types and sampling methodologies.
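A hedged sketch of the DPRP arithmetic may help. Assuming DPRP = 100 x (1 - value for the candidate nozzle / value for the reference nozzle), with the "value" taken either as the first moment of the airborne profile (DPRPv1) or as the area under a deposit curve (DPRPv2 or DPRP(H)), the following Python fragment uses hypothetical collector heights and deposits; it is an illustration, not the SRI protocol itself.

```python
import numpy as np

def first_moment(heights_m, airborne_deposit):
    """First moment of the airborne spray profile (deposit weighted by height)."""
    return np.trapz(np.asarray(airborne_deposit) * np.asarray(heights_m), heights_m)

def curve_area(x, deposit):
    """Surface under a measured deposit curve (airborne or fallout)."""
    return np.trapz(deposit, x)

def dprp(candidate_value, reference_value):
    """Drift potential reduction percentage relative to the reference spraying."""
    return 100.0 * (1.0 - candidate_value / reference_value)

# Hypothetical airborne profiles (deposit vs. collector height) for a reference
# flat-fan nozzle and a candidate air-inclusion nozzle.
heights = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
reference = np.array([9.0, 7.5, 6.0, 4.5, 3.2, 2.1, 1.3, 0.8, 0.4, 0.2])
candidate = np.array([4.0, 3.0, 2.2, 1.5, 1.0, 0.6, 0.3, 0.2, 0.1, 0.05])

print("DPRPv1 (first moment):", round(dprp(first_moment(heights, candidate),
                                            first_moment(heights, reference)), 1))
print("DPRPv2 (airborne area):", round(dprp(curve_area(heights, candidate),
                                             curve_area(heights, reference)), 1))
```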
Wilson, S.A.; Ridley, W.I.; Koenig, A.E.
2002-01-01
The requirements of standard materials for LA-ICP-MS analysis have been difficult to meet for the determination of trace elements in sulfides. We describe a method for the production of synthetic sulfides by precipitation from solution. The method is detailed by the production of approximately 200 g of a material, PS-1, with a suite of chalcophilic trace elements in an Fe-Zn-Cu-S matrix. Preliminary composition data, together with an evaluation of the homogeneity for individual elements, suggests that this type of material meets the requirements for a sulfide calibration standard that allows for quantitative analysis. Contamination of the standard with Na suggests that H2S gas may prove a better sulfur source for future experiments. We recommend that calibration data be collected in whatever mode is closest to that employed for the analysis of the unknown material, because of variable fractionation effects as a function of analytical mode. For instance, if individual spot analyses are attempted on unknown sample, then a raster of several individual spot analyses, not a continuous scan, should be collected and averaged for the standard. Hg and Au are exceptions to the above and calibration data should always be collected in a scanning mode. Au is more heterogeneously distributed than other trace metals and large-area scans are required to provide an average value for calibration purposes. We emphasize that the values given in Table 1 are preliminary values. Further chemical characterization of this standard, through a round-robin analysis program, will allow the USGS to provide both certified and recommended values for individual elements. The USGS has developed PS-1 as a potential new LA-ICP-MS standard for use by the analytical community, and requests for this material should be addressed to S. Wilson. However, it is stressed that an important aspect of the method described here is the flexibility for individual investigators to produce sulfides with a wide range of trace metals in variable matrices. For example, PS-1 is not well suited to the analysis of galena, and it would be relatively straightforward for other standards to be developed with Pb present in the matrix as a major constituent. These standards can be made easily and cheaply in a standard wet chemistry laboratory using equipment and chemicals that are readily available.
Kaminsky, Leonard A; Imboden, Mary T; Arena, Ross; Myers, Jonathan
2017-02-01
The importance of cardiorespiratory fitness (CRF) is well established. This report provides newly developed standards for CRF reference values derived from cardiopulmonary exercise testing (CPX) using cycle ergometry in the United States. Ten laboratories in the United States experienced in CPX administration with established quality control procedures contributed to the "Fitness Registry and the Importance of Exercise: A National Database" (FRIEND) Registry from April 2014 through May 2016. Data from 4494 maximal (respiratory exchange ratio, ≥1.1) cycle ergometer tests from men and women (20-79 years) from 27 states, without cardiovascular disease, were used to develop these reference values. Percentiles of maximum oxygen consumption (VO2max) for men and women were determined for each decade from age 20 years through age 79 years. Comparisons of VO2max were made to reference data established with treadmill CPX data in the FRIEND Registry and previously published reports. As expected, there were significant differences between sex and age groups for VO2max (P<.01). For cycle tests within the FRIEND Registry, the 50th percentile VO2max of men and women aged 20 to 29 years declined from 41.9 and 31.0 mL O2/kg/min to 19.5 and 14.8 mL O2/kg/min for ages 70 to 79 years, respectively. The rate of decline in this cohort was approximately 10% per decade. The FRIEND Registry reference data will be useful in providing more accurate interpretations for the US population of CPX-measured VO2max from exercise tests using cycle ergometry compared with previous approaches based on estimations of standard differences from treadmill testing reference values. Copyright © 2016 Mayo Foundation for Medical Education and Research. All rights reserved.
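The decade-by-decade percentile tabulation described above can be sketched as follows; the data below are entirely synthetic, and the binning and percentile choices are assumptions rather than the FRIEND Registry's actual processing pipeline.

```python
import numpy as np

def decade_percentiles(ages, vo2max, percentiles=(10, 25, 50, 75, 90)):
    """Group VO2max values into age decades (20-29, ..., 70-79) and return
    the requested percentiles for each decade."""
    ages = np.asarray(ages)
    vo2max = np.asarray(vo2max)
    table = {}
    for lo in range(20, 80, 10):
        mask = (ages >= lo) & (ages < lo + 10)
        if mask.any():
            table[f"{lo}-{lo + 9}"] = np.percentile(vo2max[mask], percentiles)
    return table

# Synthetic example standing in for registry data (mL O2/kg/min).
rng = np.random.default_rng(1)
ages = rng.integers(20, 80, size=2000)
vo2 = 55 - 0.45 * ages + rng.normal(0, 6, size=2000)  # roughly a decline of ~10% per decade
for decade, vals in decade_percentiles(ages, vo2).items():
    print(decade, np.round(vals, 1))
```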
Size-dependent standard deviation for growth rates: Empirical results and theoretical modeling
NASA Astrophysics Data System (ADS)
Podobnik, Boris; Horvatic, Davor; Pammolli, Fabio; Wang, Fengzhong; Stanley, H. Eugene; Grosse, I.
2008-05-01
We study annual logarithmic growth rates R of various economic variables such as exports, imports, and foreign debt. For each of these variables we find that the distributions of R can be approximated by double exponential (Laplace) distributions in the central parts and power-law distributions in the tails. For each of these variables we further find a power-law dependence of the standard deviation σ(R) on the average size of the economic variable with a scaling exponent surprisingly close to that found for the gross domestic product (GDP) [Phys. Rev. Lett. 81, 3275 (1998)]. By analyzing annual logarithmic growth rates R of wages of 161 different occupations, we find a power-law dependence of the standard deviation σ(R) on the average value of the wages with a scaling exponent β≈0.14 close to those found for the growth of exports, imports, debt, and the growth of the GDP. In contrast to these findings, we observe for payroll data collected from 50 states of the USA that the standard deviation σ(R) of the annual logarithmic growth rate R increases monotonically with the average value of payroll. However, also in this case we observe a power-law dependence of σ(R) on the average payroll with a scaling exponent β≈-0.08 . Based on these observations we propose a stochastic process for multiple cross-correlated variables where for each variable (i) the distribution of logarithmic growth rates decays exponentially in the central part, (ii) the distribution of the logarithmic growth rate decays algebraically in the far tails, and (iii) the standard deviation of the logarithmic growth rate depends algebraically on the average size of the stochastic variable.
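A minimal sketch of how such a scaling exponent can be estimated from data is given below; the values are synthetic, and the quantile binning and least-squares fit in log-log space are assumptions, not the authors' exact method.

```python
import numpy as np

def scaling_exponent(sizes, growth_rates, n_bins=10):
    """Estimate beta in sigma(R) ~ <S>^(-beta): bin units by average size,
    compute the standard deviation of R in each bin, and fit log(sigma)
    against log(size) by least squares."""
    sizes = np.asarray(sizes, dtype=float)
    growth_rates = np.asarray(growth_rates, dtype=float)
    edges = np.quantile(sizes, np.linspace(0, 1, n_bins + 1))
    log_s, log_sigma = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (sizes >= lo) & (sizes <= hi)
        if mask.sum() > 1:
            log_s.append(np.log(sizes[mask].mean()))
            log_sigma.append(np.log(growth_rates[mask].std()))
    slope, _ = np.polyfit(log_s, log_sigma, 1)
    return -slope  # beta

# Synthetic example: the spread of R decays with size with beta ~ 0.15.
rng = np.random.default_rng(2)
size = np.exp(rng.uniform(0, 10, size=5000))
rate = rng.normal(0, size ** -0.15)
print("estimated beta:", round(scaling_exponent(size, rate), 3))
```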
Friedl, Karl E
2004-10-01
Weight control is an important early intervention in diabetes, but the nature of the association between weight and disordered metabolism has been confused because fat mass and its distribution are only partly associated with increasing body size. Weight, fat, and regional fat placement, specifically in the abdominal site, may each have distinctly different associations with diabetes risk. Abdominal circumference may be the common marker of poor fitness habits and of increased risk for metabolic diseases such as diabetes. This is an important question for public health policy as well as for occupational standards such as those of the military, which are intended to promote fitness for military missions and include strength and aerobic capacity, as well as military appearance considerations. U.S. soldiers are heavier than ever before, reflecting both increased muscle and fat components. They also have better health care than ever before and are required to exercise regularly, and even the oldest soldiers are required to remain below body fat limits that are more stringent than the current median values of the U.S. population over age 40. The body fat standards assessed by circumference-based equations are 20-26% and 30-36%, for various age groups of men and women, respectively, and the upper limits align with threshold values of waist circumference recommended in national health goals. The basis and effects of the Army standards are presented in this paper. U.S. Army body fat standards may offer practical and reasonable health guidelines suitable for all active Americans that might help stem the increasing prevalence of obesity that is predicted to increase the prevalence of Type 2 diabetes.
Field reliability of Ricor microcoolers
NASA Astrophysics Data System (ADS)
Pundak, N.; Porat, Z.; Barak, M.; Zur, Y.; Pasternak, G.
2009-05-01
Over the past 25 years Ricor has fielded in excess of 50,000 Stirling cryocoolers, among which approximately 30,000 units are of the micro integral rotary-driven type. The statistical population of the fielded units is counted in hundreds to thousands per application category. In contrast to MTTF values gathered and presented on the basis of standard reliability demonstration tests, where the failure of the weakest component dictates the end of product life, field reliability, where design and workmanship failures are counted and considered, is usually reported as the number of failures per million hours of operation. These values are important and relevant to the prediction of service capabilities and to service planning.
40 CFR 80.1405 - What are the Renewable Fuel Standards?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Renewable Fuel Standards? (a) (1) Renewable Fuel Standards for 2010. (i) The value of the cellulosic biofuel... shall be 1.10 percent. (iii) The value of the advanced biofuel standard for 2010 shall be 0.61 percent... Standards for 2011. (i) The value of the cellulosic biofuel standard for 2011 shall be 0.003 percent. (ii...
40 CFR 80.1405 - What are the Renewable Fuel Standards?
Code of Federal Regulations, 2013 CFR
2013-07-01
... Renewable Fuel Standards? (a) (1) Renewable Fuel Standards for 2010. (i) The value of the cellulosic biofuel... shall be 1.10 percent. (iii) The value of the advanced biofuel standard for 2010 shall be 0.61 percent... Standards for 2011. (i) The value of the cellulosic biofuel standard for 2011 shall be 0.003 percent. (ii...
40 CFR 80.1405 - What are the Renewable Fuel Standards?
Code of Federal Regulations, 2014 CFR
2014-07-01
... Renewable Fuel Standards? (a) (1) Renewable Fuel Standards for 2010. (i) The value of the cellulosic biofuel... shall be 1.10 percent. (iii) The value of the advanced biofuel standard for 2010 shall be 0.61 percent... Standards for 2011. (i) The value of the cellulosic biofuel standard for 2011 shall be 0.003 percent. (ii...
Lee, Joohee; Kim, Jinseok; Lim, Hyunsung
2010-07-01
The purpose of the current study was to examine factors that influence rape myths among Korean college students. This study was particularly interested in the ways in which attitudes toward women and sexual double standard affect the relationship between gender and rape myths. Although the incidence of rape is a common concern in many current societies, within each society, the specific components of rape myths reflect the cultural values and norms of that particular society. A sample of 327 college students in South Korea completed the Korean Rape Myth Acceptance Scale-Revised, the Attitudes Toward Women Scale, and the Sexual Double Standard Scale. Structural equation modeling (SEM) was used to test hypothesized models. Results revealed that in three of the four models, rape survivor myths, rape perpetrator myths, and myths about the impact of rape, attitudes toward women were a more important predictor of rape myths than gender or sexual double standard. In the rape spontaneity myths model, on the other hand, sexual double standard was a more important predictor than gender or attitudes toward women. This study provides valuable information that can be useful in developing culturally specific rape prevention and victim intervention programs.
Tonkin, Matthew J.; Tiedeman, Claire; Ely, D. Matthew; Hill, Mary C.
2007-01-01
The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one or more parameters is added.
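The linear theory behind the OPR statistic can be sketched as follows. Assuming the prediction standard deviation s_z = sqrt(z V z^T) with parameter covariance V = (X^T omega X)^-1, where X holds observation sensitivities, omega the observation weights, and z the prediction sensitivities, the percent change in s_z when observations are omitted gives an OPR-style leverage measure. The fragment below is a hedged illustration with hypothetical sensitivities, not the OPR-PPR program itself.

```python
import numpy as np

def prediction_std(X, omega, z):
    """Linear-theory prediction standard deviation:
    s_z = sqrt(z V z^T), with parameter covariance V = (X^T omega X)^-1."""
    V = np.linalg.inv(X.T @ omega @ X)
    return float(np.sqrt(z @ V @ z))

def opr_percent_change(X, omega, z, omit_rows):
    """Percent increase in the prediction standard deviation when the listed
    observations (rows of X) are omitted from the calibration data set."""
    keep = np.setdiff1d(np.arange(X.shape[0]), omit_rows)
    s_full = prediction_std(X, omega, z)
    s_reduced = prediction_std(X[keep], omega[np.ix_(keep, keep)], z)
    return 100.0 * (s_reduced - s_full) / s_full

# Hypothetical sensitivities: 8 observations, 3 parameters, one prediction.
rng = np.random.default_rng(3)
X = rng.normal(size=(8, 3))          # d(observation)/d(parameter)
omega = np.eye(8)                    # observation weights
z = rng.normal(size=3)               # d(prediction)/d(parameter)
print("OPR for omitting observation 0: %.1f%%" % opr_percent_change(X, omega, z, [0]))
```

The PPR statistic follows the same linear pattern, except that potential prior information on parameters is added to the calculation rather than observations being removed, which decreases the prediction standard deviation.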
Is there another major constituent in the atmosphere of Mars?. [radiogenic argon
NASA Technical Reports Server (NTRS)
Wood, G. P.
1974-01-01
In view of the possible finding of several tens of percent of inert gas in the atmosphere of Mars by an instrument on the descent module of the USSR's Mars 6 spacecraft, the likelihood of the correctness of this result was examined. The basis for the well-known fact that the most likely candidate is radiogenic argon is described. It is shown that, for the two important methods of investigating the atmosphere, Earth-based CO2 infrared absorption spectroscopy and S-band occultation, about 20% argon can be accommodated within the estimated 1-standard-deviation uncertainties of these methods. Within the estimated 3-standard-deviation uncertainties, more than 35% is possible. It is also stated that even with 35% argon the maximum value of the heat transfer rate on the Viking 75 entry vehicle does not exceed the design value.
Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik
2009-12-01
The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).
A Constructivist Approach to Business Ethics: Developing a Student Code of Professional Conduct
ERIC Educational Resources Information Center
Willey, Lorrie; Burke, Debra D.
2011-01-01
Business ethics may be defined as "the principles, values and standards that guide behavior in the world of business." The importance of ethical awareness in business transactions and education is widely recognized, and evidence shows that ethics education can influence decision making in the workplace. As a result, colleges of business often…
ERIC Educational Resources Information Center
Applis, Stefan
2016-01-01
The educational standards in geography in the German-speaking world separately refer to the areas of competence of judgment and evaluation and thus attach outstanding importance to reflective value orientation in geography classes. The tasks and challenges that arise from that for geography teachers will be investigated in a…
ERIC Educational Resources Information Center
Lee, Kwangyhuyn; Weimer, Debbi
2002-01-01
Michigan is designing a new accountability system that combines high standards and statewide testing within a school accreditation framework. Sound assessment techniques are critical if the accountability system is to provide relevant information to schools and policymakers. One important component of a sound assessment system is measurement of…
Quantifying expert diagnosis variability when grading tumor-infiltrating lymphocytes
NASA Astrophysics Data System (ADS)
Toro, Paula; Corredor, Germán.; Wang, Xiangxue; Arias, Viviana; Velcheti, Vamsidhar; Madabhushi, Anant; Romero, Eduardo
2017-11-01
Tumor-infiltrating lymphocytes (TILs) have proved to play an important role in predicting prognosis, survival, and response to treatment in patients with a variety of solid tumors. Unfortunately, there is currently no standardized methodology to quantify the infiltration grade. The aim of this work is to evaluate the variability among TIL reports given by a group of pathologists who examined a set of digitized non-small cell lung cancer samples (n=60). Twenty-eight pathologists evaluated varying numbers of histopathological images. The agreement among pathologists was evaluated by computing the Kappa coefficient and the standard deviation of their estimations. Furthermore, TIL reports were correlated with patient prognosis and survival using Pearson's correlation coefficient. Overall, the agreement among experts grading TILs in the dataset is low, since Kappa values remain below 0.4 and the standard deviation values demonstrate that in none of the images was there full consensus. Finally, the correlation coefficient for each pathologist also reveals a low association between the pathologists' predictions and the prognosis/survival data. The results suggest the need to define standardized, objective, and effective strategies to evaluate TILs so that they can be used as a biomarker in daily routine.
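As an illustration of the agreement measure used above, the following sketch computes Cohen's kappa = (p_observed - p_expected) / (1 - p_expected) for a simplified two-rater case with hypothetical TIL grades, rather than the study's multi-rater setting.

```python
import numpy as np

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels:
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    a = np.asarray(rater_a)
    b = np.asarray(rater_b)
    labels = np.union1d(a, b)
    p_obs = np.mean(a == b)
    p_exp = sum(np.mean(a == lab) * np.mean(b == lab) for lab in labels)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical TIL grades (0 = low, 1 = intermediate, 2 = high) from two pathologists.
grades_a = [0, 1, 2, 1, 0, 2, 1, 1, 0, 2, 2, 1]
grades_b = [0, 2, 2, 1, 1, 2, 0, 1, 0, 1, 2, 1]
print("kappa = %.2f" % cohens_kappa(grades_a, grades_b))
```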
Evaluation of the 235 U resonance parameters to fit the standard recommended values
Leal, Luiz; Noguere, Gilles; Paradela, Carlos; ...
2017-09-13
A great deal of effort has been dedicated to the revision of the standard values in connection with the neutron interaction for some actinides. While standard data compilations have been available for decades, nuclear data evaluations included in existing nuclear data libraries (ENDF, JEFF, JENDL, etc.) do not follow the standard recommended values. Indeed, the majority of evaluations for major actinides do not conform to the standards whatsoever. In particular, for the n + 235U interaction the only value in agreement with the standard is the thermal fission cross section. We performed a resonance re-evaluation of the n + 235U interaction in order to address the issues regarding standard values in the energy range from 10^-5 eV to 2250 eV. Recently, 235U fission cross-section measurements have been performed at the CERN Neutron Time-of-Flight facility (TOF), known as n_TOF, in the energy range from 0.7 eV to 10 keV. The data were normalized according to the recommended standard of the fission integral in the energy range 7.8 eV to 11 eV. As a result, the n_TOF averaged fission cross sections above 100 eV are in good agreement with the standard recommended values. The n_TOF data were included in the 235U resonance analysis that was performed with the code SAMMY. In addition to the average standard values related to the fission cross section, standard thermal values for fission, capture, and elastic cross sections were also included in the evaluation. Our paper presents the procedure used for re-evaluating the 235U resonance parameters including the recommended standard values as well as new cross section measurements.
Evaluation of the 235U resonance parameters to fit the standard recommended values
NASA Astrophysics Data System (ADS)
Leal, Luiz; Noguere, Gilles; Paradela, Carlos; Durán, Ignacio; Tassan-Got, Laurent; Danon, Yaron; Jandel, Marian
2017-09-01
A great deal of effort has been dedicated to the revision of the standard values in connection with the neutron interaction for some actinides. While standard data compilations have been available for decades, nuclear data evaluations included in existing nuclear data libraries (ENDF, JEFF, JENDL, etc.) do not follow the standard recommended values. Indeed, the majority of evaluations for major actinides do not conform to the standards whatsoever. In particular, for the n + 235U interaction the only value in agreement with the standard is the thermal fission cross section. A resonance re-evaluation of the n + 235U interaction has been performed to address the issues regarding standard values in the energy range from 10^-5 eV to 2250 eV. Recently, 235U fission cross-section measurements have been performed at the CERN Neutron Time-of-Flight facility (TOF), known as n_TOF, in the energy range from 0.7 eV to 10 keV. The data were normalized according to the recommended standard of the fission integral in the energy range 7.8 eV to 11 eV. As a result, the n_TOF averaged fission cross sections above 100 eV are in good agreement with the standard recommended values. The n_TOF data were included in the 235U resonance analysis that was performed with the code SAMMY. In addition to the average standard values related to the fission cross section, standard thermal values for fission, capture, and elastic cross sections were also included in the evaluation. This paper presents the procedure used for re-evaluating the 235U resonance parameters including the recommended standard values as well as new cross section measurements.
Durability, value, and reliability of selected electric powered wheelchairs.
Fass, Megan V; Cooper, Rory A; Fitzgerald, Shirley G; Schmeler, Mark; Boninger, Michael L; Algood, S David; Ammer, William A; Rentschler, Andrew J; Duncan, John
2004-05-01
To compare the durability, value, and reliability of selected electric powered wheelchairs (EPWs), purchased in 1998. Engineering standards tests of quality and performance. A rehabilitation engineering center. Fifteen EPWs: 3 each of the Jazzy, Quickie, Lancer, Arrow, and Chairman models. Not applicable. Wheelchairs were evaluated for durability (lifespan), value (durability, cost), and reliability (rate of repairs) using 2-drum and curb-drop machines in accordance with the standards of the American National Standards Institute and Rehabilitation Engineering and Assistive Technology Society of North America. The 5 brands differed significantly (P
Value-based purchasing of medical devices.
Obremskey, William T; Dail, Teresa; Jahangir, A Alex
2012-04-01
Health care in the United States is known for its continued innovation and production of new devices and techniques. While the intention of these devices is to improve the delivery and outcome of patient care, they do not always achieve this goal. As new technologies enter the market, hospitals and physicians must determine which of these new devices to incorporate into practice, and it is important these devices bring value to patient care. We provide a model of a physician-engaged process to decrease cost and increase review of physician preference items. We describe the challenges, implementation, and outcomes of cost reduction and product stabilization of a value-based process for purchasing medical devices at a major academic medical center. We implemented a physician-driven committee that standardized and utilized evidence-based, clinically sound, and financially responsible methods for introducing or consolidating new supplies, devices, and technology for patient care. This committee worked with institutional finance and administrative leaders to accomplish its goals. Utilizing this physician-driven committee, we provided access to new products, standardized some products, decreased costs of physician preference items 11% to 26% across service lines, and achieved savings of greater than $8 million per year. The implementation of a facility-based technology assessment committee that critically evaluates new technology can decrease hospital costs on implants and standardize some product lines.
Brown, Melissa M; Brown, Gary C; Lieske, Heidi B; Lieske, P Alexander
2012-05-01
This analysis discusses the comparative effectiveness and cost-effectiveness of vitreoretinal interventions, measured in quality-adjusted life years (QALYs) and percentage patient value (PPV gain, or improvement in quality of life and/or length of life). The material is relevant since the Patient Protection and Affordable Care Act enacted by Congress with the support of the President has emphasized the critical importance of patient-based preferences. The majority of preference-based comparative effectiveness and cost-effectiveness analyses of vitreoretinal interventions in the US healthcare literature are Value-Based Medicine analyses, and thus comparable. These interventions confer a mean patient (human) value gain (improvement in quality of life) of 8.3% [SD 6.3%, 95% confidence interval (CI) ± 2.6%]. The average cost-utility of these vitreoretinal interventions is US$23 026/QALY (SD US$24 508, 95% CI ± US$8770). Most vitreoretinal interventions are very cost effective using a conventional US standard of US$50 000/QALY as the upper anchor for a very cost-effective intervention, and the World Health Organization standard of approximately US$142 200/QALY as the upper anchor for a cost-effective intervention. Most vitreoretinal interventions confer considerable patient value and are very cost effective. Further standardization across healthcare is needed in the preference-based, comparative and cost-utility (cost-effectiveness) arena. The metrics of PPV (percentage patient value) gain and US$/PPV (dollars expended per percentage patient value gain) or financial value gain may be more user-friendly than the QALY.
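The cost-utility arithmetic referred to above can be made concrete with a small sketch; the cost and QALY figures below are illustrative assumptions, not values from the cited analyses. Dollars expended divided by QALYs gained gives US$/QALY, and dollars divided by the percentage patient value gain gives US$/PPV.

```python
def cost_per_qaly(total_cost_usd, qalys_gained):
    """Cost-utility ratio in US$ per quality-adjusted life-year gained."""
    return total_cost_usd / qalys_gained

def cost_per_percent_value(total_cost_usd, percent_patient_value_gain):
    """US$ expended per percentage-point of patient value (PPV) gained."""
    return total_cost_usd / percent_patient_value_gain

# Illustrative intervention: $18,000 total discounted cost, 0.78 QALYs gained,
# 8.3% patient value gain (figures are hypothetical, not from the cited analyses).
cost = 18000.0
print("US$/QALY:", round(cost_per_qaly(cost, 0.78)))
print("US$/PPV:", round(cost_per_percent_value(cost, 8.3)))
```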
Beyond the exchange--the future of B2B.
Wise, R; Morrison, D
2000-01-01
Using the Internet to facilitate business-to-business commerce promises many benefits, such as dramatic cost reductions and greater access to buyers and sellers. Yet little is known about how B2B e-commerce will evolve. The authors argue that changes in the financial services industry over the past two decades provide important clues. Exchanges, they say, are not the primary source of value in information-intensive markets; value tends to accumulate among a diverse group of specialists that focus on such tasks as packaging, standard setting, arbitrage, and information management. Because scale and liquidity are vitally important to efficient trading, today's exchanges will consolidate into a relatively small set of mega-exchanges. Originators will handle the origination and aggregation of complex transactions before sending them on to mega-exchanges for execution. E-speculators, seeking to capitalize on an abundance of market information, will tend to concentrate where relatively standardized products can be transferred easily among a large group of buyers. In many markets, a handful of independent solution providers with well-known brand names and solid reputations will thrive alongside mega-exchanges. Sell-side asset exchanges will create the networks and provide the tools to allow suppliers to trade orders among themselves, sometimes after initial transactions with customers are made on the mega-exchanges. For many companies, traditional skills in such areas as product development, manufacturing, and marketing may become relatively less important, while the ability to understand and capitalize on market dynamics may become considerably more important.
Development of NASA Technical Standards Program Relative to Enhancing Engineering Capabilities
NASA Technical Reports Server (NTRS)
Gill, Paul S.; Vaughan, William W.
2003-01-01
The enhancement of engineering capabilities is an important aspect of any organization, especially those engaged in aerospace development activities. Technical standards are one of the key elements of this endeavor. The NASA Technical Standards Program was formed in 1997 in response to the NASA Administrator's directive to develop an Agencywide Technical Standards Program. The Program's principal objective involved converting Center-unique technical standards into Agencywide standards and the adoption/endorsement of non-Government technical standards in lieu of government standards. In the process, the potential for further enhancement of the Agency's engineering capabilities was noted, namely the value of Agencywide access to the necessary full-text technical standards, standards update notifications, and the integration of lessons learned with technical standards, all available to the user from one Website. This was accomplished and is now being enhanced based on feedback from the Agency's engineering staff and supporting contractors. This paper addresses the development experiences with the NASA Technical Standards Program and the enhancement of the Agency's engineering capabilities provided by the Program's products. Metrics are provided on significant aspects of the Program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelly, George; Haring, Adrian; Spooner, Ted
To help address the industry's needs for assuring the value and reducing the risk of investments in PV power plants, the International Electrotechnical Commission (IEC) has established a new conformity assessment system for renewable energy (IECRE). Important efforts are presently underway to define the requirements for various types of PV system certificates and to publish the international standards upon which these certifications will be based. This paper presents a detailed analysis of the interrelationship of these activities and the timing for initiation of IECRE PV system certifications.
Status of the BL2 beam measurement of the neutron lifetime
NASA Astrophysics Data System (ADS)
Hoogerheide, Shannon Fogwell; BL2 Collaboration
2017-09-01
Neutron beta decay is the simplest example of nuclear beta decay and a precise value of the neutron lifetime is important for consistency tests of the Standard Model and Big Bang Nucleosynthesis models. A new measurement of the neutron lifetime, utilizing the beam method, is underway at the National Institute of Standards and Technology Center for Neutron Research with a projected uncertainty of 1 s. A review of the beam method and the technical improvements in this experiment will be presented. The status of the experiment, as well as preliminary measurements, beam characteristics, and early data will be discussed.
NASA Astrophysics Data System (ADS)
Goeritno, Arief; Rasiman, Syofyan
2017-06-01
A performance examination of the bulk oil circuit breaker (BOCB), as influenced by its parameters, at the Bogor Baru substation of the State Electricity Company (PLN) has been carried out. It is found that (1) the dielectric strength of the oil still qualifies it as an insulating and cooling medium, because the average measured value is still above the minimum allowed, the minimum limit being 80 kV/2.5 cm or 32 kV/cm; (2) the simultaneity of the circuit breaker's contacts is still acceptable, so that the BOCB can still be operated, because the time difference between the highest and lowest values when the BOCB's contacts are opened/closed is less than 10 milliseconds (Δt < 10 ms), meeting the PLN standards as recommended by Alsthom; and (3) the resistance parameters comply with the standards, where (i) the insulation resistance is far above the allowed threshold, the minimum being 2,000 MΩ (per ANSI standards) or 2,000 MΩ (per PLN standards), and (ii) the contact resistance is within the allowed threshold, the limit being 350 µΩ (per ANSI standards) or 200 µΩ (per PLN standards). The grounding resistance is equal to the maximum limit specified, the maximum standard being 0.5 Ω (per the PLN standard).
Ibrahim, Irwani; Yau, Ying Wei; Ong, Lizhen; Chan, Yiong Huak; Kuan, Win Sen
2015-03-01
Arterial punctures are important procedures performed by emergency physicians in the assessment of ill patients. However, arterial punctures are painful and can create anxiety and needle phobia in patients. The pain scores of radial arterial punctures were compared between an insulin needle and the standard 23-gauge hypodermic needle. In a randomized controlled crossover design, healthy volunteers were recruited to undergo bilateral radial arterial punctures. They were assigned to receive either the insulin or the standard needle for the first puncture, using blocked randomization. The primary outcome was the pain score measured on a 100-mm visual analogue scale (VAS) for pain, and secondary outcomes were the rate of hemolysis, mean potassium values, and procedural complications immediately and 24 hours postprocedure. Fifty healthy volunteers were included in the study. The mean (±standard deviation) VAS score for punctures with the insulin needle was lower than for the standard needle (23 ± 22 mm vs. 39 ± 24 mm; mean difference = -15 mm; 95% confidence interval = -22 mm to -7 mm; p < 0.001). The rate of hemolysis and mean potassium value were greater in samples obtained using the insulin needle compared to the standard needle (31.3% vs. 11.6%, p = 0.035; and 4.6 ± 0.7 mmol/L vs. 4.2 ± 0.5 mmol/L, p = 0.002). Procedural complications were lower for punctures with the insulin needle both immediately postprocedure (0% vs. 24%; p < 0.001) and at 24 hours postprocedure (5.4% vs. 34.2%; p = 0.007). Arterial punctures using insulin needles cause less pain and fewer procedural complications compared to standard needles. However, due to the higher rate of hemolysis, their use should be limited to conditions that do not require a concurrent potassium value in the same blood sample. © 2015 by the Society for Academic Emergency Medicine.
Finley, B L; Scott, P K; Mayhall, D A
1994-08-01
It has recently been suggested that "standard" data distributions for key exposure variables should be developed wherever appropriate for use in probabilistic or "Monte Carlo" exposure analyses. Soil-on-skin adherence estimates represent an ideal candidate for development of a standard data distribution: There are several readily available studies which offer a consistent pattern of reported results, and more importantly, soil adherence to skin is likely to vary little from site-to-site. In this paper, we thoroughly review each of the published soil adherence studies with respect to study design, sampling, and analytical methods, and level of confidence in the reported results. Based on these studies, probability density functions (PDF) of soil adherence values were examined for different age groups and different sampling techniques. The soil adherence PDF developed from adult data was found to resemble closely the soil adherence PDF based on child data in terms of both central tendency (mean = 0.49 and 0.63 mg-soil/cm2-skin, respectively) and 95th percentile values (1.6 and 2.4 mg-soil/cm2-skin, respectively). Accordingly, a single, "standard" PDF is presented based on all data collected for all age groups. This standard PDF is lognormally distributed; the arithmetic mean and standard deviation are 0.52 +/- 0.9 mg-soil/cm2-skin. Since our review of the literature indicates that soil adherence under environmental conditions will be minimally influenced by age, sex, soil type, or particle size, this PDF should be considered applicable to all settings. The 50th and 95th percentile values of the standard PDF (0.25 and 1.7 mg-soil/cm2-skin, respectively) are very similar to recent U.S. EPA estimates of "average" and "upper-bound" soil adherence (0.2 and 1.0 mg-soil/cm2-skin, respectively).
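For readers who want to reuse the reported "standard" PDF, the sketch below recovers lognormal parameters from the arithmetic mean and standard deviation given above and checks the quoted percentiles; the consistency check is our own calculation, not a result from the paper.

```python
# Sketch: convert the reported arithmetic mean and standard deviation
# (0.52 +/- 0.9 mg-soil/cm2-skin) into lognormal parameters and compute percentiles.
import math

mean, sd = 0.52, 0.9
sigma2 = math.log(1.0 + (sd / mean) ** 2)   # variance of ln(X)
mu = math.log(mean) - 0.5 * sigma2          # mean of ln(X)
sigma = math.sqrt(sigma2)

p50 = math.exp(mu)                          # median of a lognormal
p95 = math.exp(mu + 1.6449 * sigma)         # 95th percentile (z = 1.6449)
print(f"50th ~ {p50:.2f}, 95th ~ {p95:.2f} mg-soil/cm2-skin")
# Prints roughly 0.26 and 1.8, consistent with the reported 0.25 and 1.7.
```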
Analysis of Global Urban Temperature Trends and Urbanization Impacts
NASA Astrophysics Data System (ADS)
Lee, K. I.; Ryu, J.; Jeon, S. W.
2018-04-01
Due to urbanization, green space in urban areas is shrinking while concrete and asphalt pavement are increasing, so urban climates differ from those of non-urban areas. In addition, long-term macroscopic studies of urban climate change are becoming more important as global urbanization affects global warming. To do this, it is necessary to analyze the effect of urbanization on the temporal change in urban temperature using the same temperature data and standards for urban areas around the world. In this study, time series analysis was performed on the maximum, minimum, mean, and standard deviation values of surface temperature from 1980 to 2010, and the effect of urbanization was analyzed through linear regression analysis with variables (population, night light, NDVI, urban area). As a result, the minimum value of the surface temperature of urban areas increased at a rate of 0.28 K decade-1 over the past 31 years, the maximum value at a rate of 0.372 K decade-1, and the mean value at a rate of 0.208 K decade-1, while the standard deviation decreased at a rate of 0.023 K decade-1. The change of surface temperature in urban areas is affected by urbanization related to land cover, such as the decrease of greenery and the increase of paved area, whereas socioeconomic variables are less influential than NDVI in this study. This study is expected to provide an approach for future research and policy planning on urban temperature change and urbanization impacts.
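The kind of decadal trend quoted above is an ordinary least-squares slope of an annual temperature series. The minimal sketch below uses a synthetic series, not the study's data, to show how such a "K per decade" number is obtained.

```python
# Minimal trend estimate on a synthetic annual surface-temperature series (K).
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1980, 2011)
temps = 290.0 + 0.028 * (years - 1980) + rng.normal(0, 0.1, years.size)

slope_per_year = np.polyfit(years, temps, 1)[0]   # OLS slope, K per year
print(f"trend ~ {slope_per_year * 10:.3f} K per decade")
```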
DESIGN NOTE: New apparatus for haze measurement for transparent media
NASA Astrophysics Data System (ADS)
Yu, H. L.; Hsiao, C. C.; Liu, W. C.
2006-08-01
Precise measurement of luminous transmittance and haze of transparent media is increasingly important to the LCD industry. Currently there are at least three documentary standards for measuring transmission haze. Unfortunately, none of those standard methods by itself can obtain the precise values for the diffuse transmittance (DT), total transmittance (TT) and haze. This note presents a new apparatus capable of precisely measuring all three variables simultaneously. Compared with current structures, the proposed design contains one more compensatory port. For optimal design, the light trap absorbs the beam completely, light scattered by the instrument is zero and the interior surface of the integrating sphere, baffle, as well as the reflectance standard, are of equal characteristic. The accurate values of the TT, DT and haze can be obtained using the new apparatus. Even if the design is not optimal, the measurement errors of the new apparatus are smaller than those of other methods especially for high sphere reflectance. Therefore, the sphere can be made of a high reflectance material for the new apparatus to increase the signal-to-noise ratio.
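For orientation, the relation between the three quantities measured by the apparatus is sketched below, assuming the common definition of transmission haze as the ratio of diffuse to total transmittance (as in ASTM D1003); the function name and example values are illustrative.

```python
# Assumed common relation: haze (%) = 100 * diffuse transmittance / total transmittance.
def haze_percent(diffuse_transmittance, total_transmittance):
    return 100.0 * diffuse_transmittance / total_transmittance

print(haze_percent(0.02, 0.91))  # ~2.2 % haze for TT = 91 %, DT = 2 %
```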
Confidence Intervals for Proportion Estimates in Complex Samples. Research Report. ETS RR-06-21
ERIC Educational Resources Information Center
Oranje, Andreas
2006-01-01
Confidence intervals are an important tool to indicate uncertainty of estimates and to give an idea of probable values of an estimate if a different sample from the population was drawn or a different sample of measures was used. Standard symmetric confidence intervals for proportion estimates based on a normal approximation can yield bounds…
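As an illustration of the "standard symmetric" interval mentioned above, the sketch below contrasts the normal-approximation (Wald) interval with the Wilson score interval for a simple random sample; it does not include the complex-sample adjustments that are the subject of the report, and the numbers are invented.

```python
# Wald (symmetric, normal-approximation) vs. Wilson score interval for a proportion.
import math

def wald_ci(p_hat, n, z=1.96):
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

def wilson_ci(p_hat, n, z=1.96):
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

print(wald_ci(0.05, 40))    # lower bound can fall below 0
print(wilson_ci(0.05, 40))  # stays within [0, 1]
```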
ERIC Educational Resources Information Center
Levitt, Steven D.; List, John A.; Neckermann, Susanne; Sadoff, Sally
2012-01-01
A long line of research on behavioral economics has established the importance of factors that are typically absent from the standard economic framework: reference dependent preferences, hyperbolic preferences, and the value placed on non-financial rewards. To date, these insights have had little impact on the way the educational system operates.…
A History of School Design and Its Indoor Environmental Standards, 1900 to Today
ERIC Educational Resources Information Center
Baker, Lindsay
2012-01-01
Public education is one of the central tasks of a democratic society, and the buildings that house this important task not only shape the way one teaches, but provide icons and symbols for the values people hold common as a society. Perhaps unsurprisingly, this context has placed school buildings squarely in a position of debate and innovation…
ERIC Educational Resources Information Center
Fedorova, Yevhenia
2014-01-01
The prospects for cultivating citizenship in special needs students as a prerequisite for the entry of Ukraine into the European Community are described. Priority is given to compliance with European democratic sociocultural standards and humanistic values, among which the most important are changes in attitudes towards disabled people,…
Extraction of Oleic Acid from Moroccan Olive Mill Wastewater
Elkacmi, Reda; Kamil, Noureddine; Bennajah, Mounir; Kitane, Said
2016-01-01
The production of olive oil in Morocco has recently grown considerably owing to its economic and nutritional importance, favored by the country's climate. After the extraction of olive oil by pressing or centrifuging, the obtained liquid contains oil and vegetation water, which is subsequently separated by decanting or centrifugation. Despite its treatment throughout the extraction process, this olive mill wastewater (OMW) still contains a very important oily residue, always regarded as a reject stream. The oil separated from OMW cannot be intended for food because of its high acidity of 3.397%, which exceeds the international standard for human consumption defined by the Codex Alimentarius, proving its poor quality. This work adds value to what would normally be regarded as waste by extracting oleic acid as a high-value product, using the technique of inclusion with urea for the elimination of saturated and unsaturated fatty acids through four successive crystallizations at 4°C and 20°C, giving a final phase with an oleic acid purity of 95.49%; in addition, a biodegradable soap and a high-quality glycerin are produced by saponification and transesterification reactions. PMID:26933663
Hermoso, Maria; Tabacchi, Garden; Iglesia-Altaba, Iris; Bel-Serrat, Silvia; Moreno-Aznar, Luis A; García-Santos, Yurena; García-Luzardo, Ma del Rosario; Santana-Salguero, Beatriz; Peña-Quintana, Luis; Serra-Majem, Lluis; Moran, Victoria Hall; Dykes, Fiona; Decsi, Tamás; Benetou, Vassiliki; Plada, Maria; Trichopoulou, Antonia; Raats, Monique M; Doets, Esmée L; Berti, Cristiana; Cetin, Irene; Koletzko, Berthold
2010-10-01
This paper presents a review of the current knowledge regarding the macro- and micronutrient requirements of infants and discusses issues related to these requirements during the first year of life. The paper also reviews the current reference values used in European countries and the methodological approaches used to derive them by a sample of seven European and international authoritative committees from which background scientific reports are available. Throughout the paper, the main issues contributing to disparities in micronutrient reference values for infants are highlighted. The identification of these issues in relation to the specific physiological aspects of infants is important for informing future initiatives aimed at providing standardized approaches to overcome variability of micronutrient reference values across Europe for this age group. © 2010 Blackwell Publishing Ltd.
Postinflationary Higgs relaxation and the origin of matter-antimatter asymmetry.
Kusenko, Alexander; Pearce, Lauren; Yang, Louis
2015-02-13
The recent measurement of the Higgs boson mass implies a relatively slow rise of the standard model Higgs potential at large scales, and a possible second minimum at even larger scales. Consequently, the Higgs field may develop a large vacuum expectation value during inflation. The relaxation of the Higgs field from its large postinflationary value to the minimum of the effective potential represents an important stage in the evolution of the Universe. During this epoch, the time-dependent Higgs condensate can create an effective chemical potential for the lepton number, leading to a generation of the lepton asymmetry in the presence of some large right-handed Majorana neutrino masses. The electroweak sphalerons redistribute this asymmetry between leptons and baryons. This Higgs relaxation leptogenesis can explain the observed matter-antimatter asymmetry of the Universe even if the standard model is valid up to the scale of inflation, and any new physics is suppressed by that high scale.
Experimentally observed conformation-dependent geometry and hidden strain in proteins.
Karplus, P. A.
1996-01-01
A database has been compiled documenting the peptide conformations and geometries from 70 diverse proteins refined at 1.75 Å or better. Analysis of the well-ordered residues within the database shows phi,psi-distributions that have more fine structure than is generally observed. Also, clear evidence is presented that the peptide covalent geometry depends on conformation, with the interpeptide N-Cα-C bond angle varying by nearly ±5 degrees from its standard value. The observed deviations from standard peptide geometry are greatest near the edges of well-populated regions, consistent with strain occurring in these conformations. Minimization of such hidden strain could be an important factor in the thermostability of proteins. These empirical data describing how equilibrium peptide geometry varies as a function of conformation confirm and extend quantum mechanics calculations, and have predictive value that will aid both theoretical and experimental analyses of protein structure. PMID:8819173
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bamberger, Judith A.; Piepel, Gregory F.; Enderlin, Carl W.
Understanding how uncertainty manifests itself in complex experiments is important for developing the testing protocol and interpreting the experimental results. This paper describes experimental and measurement uncertainties, and how they can depend on the order in which experimental tests are performed. Experiments with pulse-jet mixers in tanks at three scales were conducted to characterize the performance of transient-developing periodic flows in Newtonian slurries. Other test parameters included the simulant, solids concentration, and nozzle exit velocity. Critical suspension velocity and cloud height were the metrics used to characterize Newtonian slurry flow associated with mobilization and mixing. During testing, near-replicate and near-repeat tests were conducted. The experimental results were used to quantify the combined experimental and measurement uncertainties using standard deviations and percent relative standard deviations (%RSD). The uncertainties in critical suspension velocity and cloud height tend to increase with the values of these responses. Hence, the %RSD values are the more appropriate summary measure of near-replicate testing and measurement uncertainty.
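The summary measure used above is simple to reproduce; the sketch below computes %RSD for a set of near-replicate results, with illustrative numbers rather than data from the paper.

```python
# Percent relative standard deviation (%RSD) of near-replicate results.
import statistics

def percent_rsd(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

cloud_heights = [0.82, 0.87, 0.79, 0.85]   # illustrative near-replicate cloud heights (m)
print(f"%RSD = {percent_rsd(cloud_heights):.1f}%")
```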
Antioxidant Activity in the Extracts of Two Edible Aroids
Mandal, P.; Misra, T. K.; Singh, I. D.
2010-01-01
Two neglected species of Araceae, Alocasia macrorhiza (Linn.) G. Don and Alocasia fornicata (Roxb.) Schott, are important as food and ethnomedicine in Asia and Africa. Their bioefficacy is documented in the Ayurveda. The solvent extracts of different edible parts of these two species, such as rhizomes, leaves, roots and stolons, were screened for in vitro antioxidant properties using standard procedures. In the 2,2-diphenyl-1-picrylhydrazyl (DPPH) antioxidant inhibition assay, the successive extracts in hexane, benzene, toluene, chloroform, diethyl ether, ethyl acetate and the water fraction exhibited IC50 values in the order roots>rhizome>leaves for Alocasia macrorhiza and leaves>stolon for Alocasia fornicata, respectively. Maximum antioxidant activity was observed in the diethyl ether extracts of both species. The IC50 values were comparable with those of quercetin and ascorbic acid used as standards. These results suggest that the two aroid species have antioxidant activity in their edible parts and should be extracted using diethyl ether as solvent. PMID:20582198
The importance of values in evidence-based medicine.
Kelly, Michael P; Heath, Iona; Howick, Jeremy; Greenhalgh, Trisha
2015-10-12
Evidence-based medicine (EBM) has always required integration of patient values with 'best' clinical evidence. It is widely recognized that scientific practices and discoveries, including those of EBM, are value-laden. But to date, the science of EBM has focused primarily on methods for reducing bias in the evidence, while the role of values in the different aspects of the EBM process has been almost completely ignored. In this paper, we address this gap by demonstrating how a consideration of values can enhance every aspect of EBM, including: prioritizing which tests and treatments to investigate, selecting research designs and methods, assessing effectiveness and efficiency, supporting patient choice and taking account of the limited time and resources available to busy clinicians. Since values are integral to the practice of EBM, it follows that the highest standards of EBM require values to be made explicit, systematically explored, and integrated into decision making. Through 'values based' approaches, EBM's connection to the humanitarian principles upon which it was founded will be strengthened.
Perich, C; Ricós, C; Alvarez, V; Biosca, C; Boned, B; Cava, F; Doménech, M V; Fernández-Calle, P; Fernández-Fernández, P; García-Lario, J V; Minchinela, J; Simón, M; Jansen, R
2014-05-15
Current external quality assurance schemes have been classified into six categories, according to their ability to verify the degree of standardization of the participating measurement procedures. SKML (Netherlands) is a Category 1 EQA scheme (commutable EQA materials with values assigned by reference methods), whereas SEQC (Spain) is a Category 5 scheme (replicate analyses of non-commutable materials with no values assigned by reference methods). The results obtained by a group of Spanish laboratories participating in a pilot study organized by SKML are examined, with the aim of pointing out the improvements over our current scheme that a Category 1 program could provide. Imprecision and bias are calculated for each analyte and laboratory, and compared with quality specifications derived from biological variation. Of the 26 analytes studied, 9 had results comparable with those from reference methods, and 10 analytes did not have comparable results. The remaining 7 analytes measured did not have available reference method values, and in these cases, comparison with the peer group showed comparable results. The reasons for disagreement in the second group can be summarized as: use of non-standard methods (IFCC without exogenous pyridoxal phosphate for AST and ALT, Jaffé kinetic at low-normal creatinine concentrations and with eGFR); non-commutability of the reference material used to assign values to the routine calibrator (calcium, magnesium and sodium); use of reference materials without established commutability instead of reference methods for AST and GGT, and lack of a systematic effort by manufacturers to harmonize results. Results obtained in this work demonstrate the important role of external quality assurance programs using commutable materials with values assigned by reference methods to correctly monitor the standardization of laboratory tests with consequent minimization of risk to patients. Copyright © 2013 Elsevier B.V. All rights reserved.
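The comparison of imprecision and bias against specifications derived from biological variation can be illustrated with the widely cited "desirable" specifications (allowable CV ≤ 0.5·CVi; allowable bias ≤ 0.25·√(CVi² + CVg²)); whether these are the exact criteria used in the study is an assumption, and the numbers below are placeholders.

```python
# Hedged sketch: compare a laboratory's CV and bias against "desirable"
# specifications derived from within- (CVi) and between-subject (CVg)
# biological variation. All input values are illustrative placeholders.
import math

def desirable_specs(cv_within, cv_between):
    return 0.5 * cv_within, 0.25 * math.sqrt(cv_within**2 + cv_between**2)

def evaluate(lab_cv, lab_bias, cv_within, cv_between):
    max_cv, max_bias = desirable_specs(cv_within, cv_between)
    return {"imprecision_ok": lab_cv <= max_cv, "bias_ok": abs(lab_bias) <= max_bias}

# Example: an analyte with CVi = 6 %, CVg = 15 %; lab CV = 2.5 %, bias = +3 %
print(evaluate(2.5, 3.0, 6.0, 15.0))
```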
The option value of innovative treatments for non-small cell lung cancer and renal cell carcinoma.
Thornton Snider, Julia; Batt, Katharine; Wu, Yanyu; Tebeka, Mahlet Gizaw; Seabury, Seth
2017-10-01
To develop a model of the option value a therapy provides by enabling patients to live to see subsequent innovations and to apply the model to the case of nivolumab in renal cell carcinoma (RCC) and non-small cell lung cancer (NSCLC). A model of the option value of nivolumab in RCC and NSCLC was developed and estimated. Data from the Surveillance, Epidemiology, and End Results (SEER) cancer registry and published clinical trial results were used to estimate survival curves for metastatic cancer patients with RCC, squamous NSCLC, or nonsquamous NSCLC. To estimate the conventional value of nivolumab, survival with the pre-nivolumab standard of care was compared with survival with nivolumab assuming no future innovation. To estimate the option value of nivolumab, long-term survival trends in RCC and squamous and nonsquamous NSCLC were measured in SEER to forecast mortality improvements that nivolumab patients may live to see. Compared with the previous standard of care, nivolumab extended life expectancy by 6.3 months in RCC, 7.5 months in squamous NSCLC, and 4.5 months in nonsquamous NSCLC, according to conventional methods. Accounting for expected future mortality trends, nivolumab patients are likely to gain an additional 1.2 months in RCC, 0.4 months in squamous NSCLC, and 0.5 months in nonsquamous NSCLC. These option values correspond to 18%, 5%, and 10% of the conventional value of nivolumab, respectively. Option value is important when valuing therapies like nivolumab that extend life in a rapidly evolving area of care.
Value of ecosystem hydropower service and its impact on the payment for ecosystem services.
Fu, B; Wang, Y K; Xu, P; Yan, K; Li, M
2014-02-15
Hydropower is an important service provided by ecosystems. We surveyed all the hydropower plants in the Zagunao River Basin, Southwest China. We then assessed the hydropower service using the InVEST (The Integrated Value and Tradeoff of Ecosystem Service Tools) model. Finally, we discussed the implications for ecological compensation. The results showed that: 1) the hydropower service value of ecosystems in the Zagunao River Basin is 216.29 Euro/hm² on average; the high-value area with more than 475.65 Euro/hm² covers about 750.37 km², accounting for 16.12% of the whole watershed but providing 53.47% of the whole watershed's service value; 2) the ecosystem is an ecological reservoir with a great regulation capacity; dams cannot completely replace the reservoir-like water conservation function of ecosystems and carry high economic and environmental costs that must be paid as well, so compensation for water conservation services should become an important basis for ecological compensation of hydropower development; 3) in current PES cases, the standard of compensation is generally low; cascade development makes the value of upstream ecosystem services more prominent, reflecting a differential rent value, and differentiated ecological compensation should be based on the distribution of ecosystem service values. Copyright © 2013 Elsevier B.V. All rights reserved.
Yanagita, Satoshi; Imahana, Masato; Suwa, Kazuaki; Sugimura, Hitomi; Nishiki, Masayuki
2016-01-01
The Japanese Society of Radiological Technology (JSRT) standard digital image database contains many useful cases of chest X-ray images and has been used in much state-of-the-art research. However, the pixel values of all the images were simply digitized as relative density values using a scanned-film digitizer. As a result, the pixel values are completely different from the standardized display system input value of Digital Imaging and Communications in Medicine (DICOM), called the presentation value (P-value), which can maintain visual consistency when images are observed on displays with different luminance. Therefore, we converted all the images in the JSRT standard digital image database to DICOM format and then converted the pixel values to P-values using an original program developed by ourselves. Consequently, the JSRT standard digital image database has been modified so that the visual consistency of images is maintained among displays with different luminance.
Sleno, Lekha; Volmer, Dietrich A
2006-01-01
Growing interest in the ability to conduct quantitative assays for small molecules by matrix-assisted laser desorption/ionization (MALDI) has been the driving force for several recent studies. The present work investigates internal standards for these analyses using a high-repetition-rate MALDI triple quadrupole instrument. Certain physicochemical properties are assessed for predicting possible matches of internal standards for different small molecules. The importance of an internal standard having a molecular weight similar to that of its analyte is seen through experiments with a series of acylcarnitines, which have a fixed charge site and growing alkyl chain length. Both acetyl- and hexanoyl-carnitine were systematically assessed with several other acylcarnitine compounds as internal standards. The results clearly demonstrate that closely matched molecular weights between analyte and internal standard are essential for acceptable quantitation results. Using alpha-cyano-4-hydroxycinnamic acid as the organic matrix, the similarity between analyte and internal standard remains the most important parameter, and not necessarily their even distribution within the solid sample spot. Several 4-quinolone antibiotics as well as a diverse group of pharmaceutical drugs were tested as internal standards for the 4-quinolone ciprofloxacin. Quantitative results were interpreted using the solution-phase properties, log D and pKa, of these molecules. Their distribution coefficients, log D, are demonstrated to be a fundamental parameter for similar crystallization patterns of analyte and internal standard. In the end, it was also possible to quantify ciprofloxacin using a drug from a different compound class, namely quinidine, which has a log D value similar to that of the analyte. Copyright 2006 John Wiley & Sons, Ltd.
Leivada, Evelina; Papadopoulou, Elena; Pavlou, Natalia
2017-01-01
Findings from the field of experimental linguistics have shown that a native speaker may judge a variant that is part of her grammar as unacceptable, but still use it productively in spontaneous speech. The process of eliciting acceptability judgments from speakers of non-standard languages is sometimes clouded by factors akin to prescriptive notions of grammatical correctness. It has been argued that standardization enhances the ability to make clear-cut judgments, while non-standardization may result in grammatical hybridity, often manifested in the form of functionally equivalent variants in the repertoire of a single speaker. Recognizing the importance of working with corpora of spontaneous speech, this work investigates patterns of variation in the spontaneous production of five neurotypical, adult speakers of a non-standard variety in terms of three variants, each targeting one level of linguistic analysis: syntax, morphology, and phonology. The results reveal the existence of functionally equivalent variants across speakers and levels of analysis. We first discuss these findings in relation to the notions of competing, mixed, and fused grammars, and then we flesh out the implications that different values of the same variant carry for parametric approaches to Universal Grammar. We observe that intraspeaker realizations of different values of the same variant within the same syntactic environment are incompatible with the 'triggering-a-single-value' approach of parametric models, but we argue that they are compatible with the concept of Universal Grammar itself. Since the analysis of these variants is ultimately a way of investigating the status of Universal Grammar primitives, we conclude that claims about the alleged unfalsifiability of (the contents of) Universal Grammar are unfounded.
Leung, Gary N W; Ho, Emmie N M; Kwok, W Him; Leung, David K K; Tang, Francis P W; Wan, Terence S M; Wong, April S Y; Wong, Colton H F; Wong, Jenny K Y; Yu, Nola H
2007-09-07
Quantitative determination, particularly for threshold substances in biological samples, is much more demanding than qualitative identification. A proper assessment of any quantitative determination is the measurement uncertainty (MU) associated with the determined value. The International Standard ISO/IEC 17025, "General requirements for the competence of testing and calibration laboratories", has more prescriptive requirements on the MU than its superseded document, ISO/IEC Guide 25. Under the 2005 or 1999 versions of the new standard, an estimation of the MU is mandatory for all quantitative determinations. To comply with the new requirement, a protocol was established in the authors' laboratory in 2001. The protocol has since evolved based on our practical experience, and a refined version was adopted in 2004. This paper describes our approach in establishing the MU, as well as some other important considerations, for the quantification of threshold substances in biological samples as applied in the area of doping control for horses. The testing of threshold substances can be viewed as a compliance test (or testing to a specified limit). As such, it should only be necessary to establish the MU at the threshold level. The steps in a "Bottom-Up" approach adopted by us are similar to those described in the EURACHEM/CITAC guide, "Quantifying Uncertainty in Analytical Measurement". They involve first specifying the measurand, including the relationship between the measurand and the input quantities upon which it depends. This is followed by identifying all applicable uncertainty contributions using a "cause and effect" diagram. The magnitude of each uncertainty component is then calculated and converted to a standard uncertainty. A recovery study is also conducted to determine if the method bias is significant and whether a recovery (or correction) factor needs to be applied. All standard uncertainties with values greater than 30% of the largest one are then used to derive the combined standard uncertainty. Finally, an expanded uncertainty is calculated at 99% one-tailed confidence level by multiplying the standard uncertainty with an appropriate coverage factor (k). A sample is considered positive if the determined concentration of the threshold substance exceeds its threshold by the expanded uncertainty. In addition, other important considerations, which can have a significant impact on quantitative analyses, will be presented.
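A minimal numerical sketch of the "Bottom-Up" workflow described above is given below: components above 30% of the largest are combined in quadrature, expanded with a coverage factor for roughly 99% one-tailed confidence, and compared against the threshold. The component values, threshold, and coverage factor are invented for illustration and are not from the laboratory's protocol.

```python
# Combined and expanded uncertainty at a hypothetical threshold, plus the
# compliance decision described in the text. All numbers are placeholders.
import math

threshold = 2.0                          # hypothetical threshold concentration
components = [0.04, 0.05, 0.012, 0.03]   # standard uncertainties at the threshold

# Keep only components larger than 30 % of the largest one, then combine in quadrature.
largest = max(components)
kept = [u for u in components if u > 0.3 * largest]
combined_u = math.sqrt(sum(u**2 for u in kept))

k = 2.33                                 # ~99 % one-tailed coverage factor
expanded_u = k * combined_u

measured = 2.18
print(f"U = {expanded_u:.3f}; positive: {measured > threshold + expanded_u}")
```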
Woo, Sungmin; Suh, Chong Hyun; Kim, Sang Youn; Cho, Jeong Yeon; Kim, Seung Hyup
2018-01-01
The purpose of this study was to perform a head-to-head comparison between high-b-value (> 1000 s/mm²) and standard-b-value (800-1000 s/mm²) DWI regarding diagnostic performance in the detection of prostate cancer. The MEDLINE and EMBASE databases were searched up to April 1, 2017. The analysis included diagnostic accuracy studies in which high- and standard-b-value DWI were used for prostate cancer detection with histopathologic examination as the reference standard. Methodologic quality was assessed with the revised Quality Assessment of Diagnostic Accuracy Studies tool. Sensitivity and specificity of all studies were calculated and were pooled and plotted in a hierarchic summary ROC plot. Meta-regression and multiple-subgroup analyses were performed to compare the diagnostic performances of high- and standard-b-value DWI. Eleven studies (789 patients) were included. High-b-value DWI had greater pooled sensitivity (0.80 [95% CI, 0.70-0.87]; p = 0.03) and specificity (0.92 [95% CI, 0.87-0.95]; p = 0.01) than standard-b-value DWI (sensitivity, 0.78 [95% CI, 0.66-0.86]; specificity, 0.87 [95% CI, 0.77-0.93]) (p < 0.01). Multiple-subgroup analyses showed that specificity was consistently higher for high- than for standard-b-value DWI (p ≤ 0.05). Sensitivity was significantly higher for high- than for standard-b-value DWI only in the following subgroups: peripheral zone only, transition zone only, multiparametric protocol (DWI and T2-weighted imaging), visual assessment of DW images, and per-lesion analysis (p ≤ 0.04). In a head-to-head comparison, high-b-value DWI had significantly better sensitivity and specificity for the detection of prostate cancer than did standard-b-value DWI. Multiple-subgroup analyses showed that specificity was consistently superior for high-b-value DWI.
Status and Analysis on Effects of Energy Efficiency Standards for Industrial Boilers in China
NASA Astrophysics Data System (ADS)
Liu, Ren; Chen, Lili; Liu, Meng; Ding, Qing; Zhao, Yuejin
2017-11-01
Energy conservation and environmental protection are basic policies of China and an important part of ecological civilization construction. Industrial boilers in China are characterized by large numbers, wide distribution, high energy consumption and heavy environmental pollution, which are key problems for energy conservation and environmental protection in China. Meanwhile, industrial boilers are important equipment for the national economy and people's daily life, and energy conservation runs through all segments, from type selection, purchase, installation and acceptance to fuel management, operation, maintenance and service. Since 2009, China has implemented national mandatory standards and regulations for industrial boilers such as GB 24500-2009, The Minimum Allowable Values of Energy Efficiency and Energy Efficiency Grades of Industrial Boilers, and TSG G002-2010, Supervision Regulation on Energy-Saving Technology for Boilers, which have clearly promoted energy conservation in industrial boilers; however, some problems remain amid the rapid development of energy-conservation technologies for industrial boilers. In this paper, the implementation of energy efficiency standards for industrial boilers in China and its significance are analyzed based on survey data, and some suggestions are proposed for the energy efficiency standards for industrial boilers.
NASA Astrophysics Data System (ADS)
Chaturvedi, K.; Willenborg, B.; Sindram, M.; Kolbe, T. H.
2017-10-01
Semantic 3D city models play an important role in solving complex real-world problems and are being adopted by many cities around the world. A wide range of application and simulation scenarios directly benefit from the adoption of international standards such as CityGML. However, most of the simulations involve properties whose values vary with respect to time, and current-generation semantic 3D city models do not support time-dependent properties explicitly. In this paper, the details of solar potential simulations operating on the CityGML standard are provided, assessing and estimating solar energy production for the roofs and facades of 3D building objects in different ways. Furthermore, the paper demonstrates how the time-dependent simulation results are better represented inline within 3D city models using the so-called Dynamizer concept. This concept not only allows representing the simulation results in standardized ways, but also delivers a method to enhance static city models with such dynamic property values, making the city models truly dynamic. The Dynamizer concept has been implemented as an Application Domain Extension of the CityGML standard within the OGC Future City Pilot Phase 1. The results are given in this paper.
Assay Dilution Factors Confound Measures of Total Antioxidant Capacity in Polyphenol-Rich Juices
Bolling, Bradley W.; Chen, Ya-Yen; Kamil, Alison G.; Chen, C-Y. Oliver
2016-01-01
The extent to which sample dilution factor (DF) affects total antioxidant capacity (TAC) values is poorly understood. Thus, we examined the impact of DF on the ORAC, FRAP, DPPH, and total phenols (TP) assays using pomegranate juice (PJ), grape juice (GJ), selected flavonoids, ascorbic acid, and ellagic acid. For ORAC, GJ was comparable to PJ at DF 750, but at DF 2000, the ORAC value of GJ was 40% more than that of PJ. Increasing the DF increased GJ and PJ DPPH, TP, and FRAP values by 11% and 14%, respectively. Increased test concentrations of quercetin and catechin resulted in 51% and 126% greater ORAC values, but decreased that of naringenin by 68%. Flavonoids, but not ellagic acid or ascorbic acid, may contribute to the dilution effect on the variation of final TAC values. Thus, reporting TAC or TP using a single DF may introduce uncertainty about the confidence of TAC assay values, especially when comparing different juices. These results underscore the importance of using compatible test standards for reporting TAC values. PMID:22251245
Assessing values for health: numeracy matters.
Woloshin, S; Schwartz, L M; Moncur, M; Gabriel, S; Tosteson, A N
2001-01-01
Patients' values are fundamental to decision models, cost-effectiveness analyses, and pharmacoeconomic analyses. The standard methods used to assess how patients value different health states are inherently quantitative. People without strong quantitative skills (i.e., low numeracy) may not be able to complete these tasks in a meaningful way. To determine whether the validity of utility assessments depends on the respondent's level of numeracy, the authors conducted in-person interviews and written surveys and assessed utility for current health for 96 women volunteers. Numeracy was measured using a previously validated 3-item scale. The authors examined the correlation between self-reported health and utility for current health (assessed using the standard gamble, time trade-off, and visual analog techniques) across levels of numeracy. For half of the women, the authors also assessed standard gamble utility for 3 imagined health states (breast cancer, heart disease, and osteoporosis) and asked how much the women feared each disease. Respondent ages ranged from 50 to 79 years (mean = 63), all were high school graduates, and 52% had a college or postgraduate degree. Twenty-six percent answered 0 or only 1 of the numeracy questions correctly, 37% answered 2 correctly, and 37% answered all 3 correctly. Among women with the lowest level of numeracy, the correlation between utility for current health and self-reported health was in the wrong direction (i.e., worse health valued higher than better health): for standard gamble, Spearman r = -0.16, P = 0.44; for time trade-off, Spearman r = -0.13, P = 0.54. Among the most numerate women, the authors observed a fair to moderate positive correlation with both standard gamble (Spearman r = 0.22, P = 0.19) and time trade-off (Spearman r = 0.50, P = 0.002). In contrast, using the visual analog scale, the authors observed a substantial correlation in the expected direction at all levels of numeracy (Spearman r = 0.82, 0.50, and 0.60 for women answering 0-1, 2, and 3 numeracy questions, respectively; all Ps ≤ 0.003). With regard to the imagined health states, the most feared disease had the lowest utility for 35% of the women with the lowest numeracy compared to 76% of the women with the highest numeracy (P = 0.03). The validity of standard utility assessments is related to the subject's facility with numbers. Limited numeracy may be an important barrier to meaningfully assessing patients' values using the standard gamble and time trade-off techniques.
Symbols as Substance in National Civics Standards.
ERIC Educational Resources Information Center
Merelman, Richard M.
1996-01-01
Criticizes the national civics standards for emphasizing shared political values over political participation, oversimplifying the relationships among U.S. political values, and relying upon elite statements to identify those values. Characterizes the proposed standards as a symbolic ritual for reinforcing cultural hegemony. (MJP)
John Tipton; Gretchen Moisen; Paul Patterson; Thomas A. Jackson; John Coulston
2012-01-01
There are many factors that will determine the final cost of modeling and mapping tree canopy cover nationwide. For example, applying a normalization process to Landsat data used in the models is important in standardizing reflectance values among scenes and eliminating visual seams in the final map product. However, normalization at the national scale is expensive and...
Nursing competency standards in primary health care: an integrative review.
Halcomb, Elizabeth; Stephens, Moira; Bryce, Julianne; Foley, Elizabeth; Ashley, Christine
2016-05-01
This paper reports an integrative review of the literature on nursing competency standards for nurses working in primary health care and, in particular, general practice. Internationally, there is growing emphasis on building a strong primary health care nursing workforce to meet the challenges of rising chronic and complex disease. However, there has been limited emphasis on examining the nursing workforce in this setting. Integrative review. A comprehensive search of relevant electronic databases using keywords (e.g. 'competencies', 'competen*' and 'primary health care', 'general practice' and 'nurs*') was combined with searching of the Internet using the Google scholar search engine. Experts were approached to identify relevant grey literature. Key websites were also searched and the reference lists of retrieved sources were followed up. The search focussed on English language literature published since 2000. Limited published literature reports on competency standards for nurses working in general practice and primary health care. Of the literature that is available, there are differences in the reporting of how the competency standards were developed. A number of common themes were identified across the included competency standards, including clinical practice, communication, professionalism and health promotion. Many competency standards also included teamwork, education, research/evaluation, information technology and the primary health care environment. Given the potential value of competency standards, further work is required to develop and test robust standards that can communicate the skills and knowledge required of nurses working in primary health care settings to policy makers, employers, other health professionals and consumers. Competency standards are important tools for communicating the role of nurses to consumers and other health professionals, as well as defining this role for employers, policy makers and educators. Understanding the content of competency standards internationally is an important step to understanding this growing workforce. © 2016 John Wiley & Sons Ltd.
Certification of reference materials for the determination of alkylphenols.
Hanari, Nobuyasu; Ishikawa, Keiichiro; Shimizu, Yoshitaka; Otsuka, Satoko; Iwasawa, Ryoko; Fujiki, Naomi; Numata, Masahiko; Yarita, Takashi; Kato, Kenji
2015-04-01
Certified reference materials (CRMs) are playing an increasingly important role in national and international standardizing activities. In Japan, primary standard solutions for analyses of endocrine disrupters are supplied under the national standards dissemination system named the Japan Calibration Service System (JCSS). For the traceability on reference materials used for preparation of the primary standard solutions based on the JCSS, the National Metrology Institute of Japan, National Institute of Advanced Industrial Science and Technology (NMIJ/AIST) has developed and certified high-purity reference materials of alkylphenols as NMIJ CRMs, such as 4-n-nonylphenol, 4-tert-octylphenol, 4-n-heptylphenol, 4-tert-butylphenol, and 2,4-dichlorophenol. Thereafter, it is essential to determine the alkylphenols by using these solutions based on the JCSS for environmental monitoring and risk assessments because analytical values obtained by using the solutions can ensure the reliability and traceability of the chemical analyses.
Heidari, M.; Ranjithan, S.R.
1998-01-01
In using non-linear optimization techniques for estimation of parameters in a distributed ground water model, the initial values of the parameters and prior information about them play important roles. In this paper, the genetic algorithm (GA) is combined with the truncated-Newton search technique to estimate groundwater parameters for a confined steady-state ground water model. Use of prior information about the parameters is shown to be important in estimating correct or near-correct values of parameters on a regional scale. The amount of prior information needed for an accurate solution is estimated by evaluation of the sensitivity of the performance function to the parameters. For the example presented here, it is experimentally demonstrated that only one piece of prior information of the least sensitive parameter is sufficient to arrive at the global or near-global optimum solution. For hydraulic head data with measurement errors, the error in the estimation of parameters increases as the standard deviation of the errors increases. Results from our experiments show that, in general, the accuracy of the estimated parameters depends on the level of noise in the hydraulic head data and the initial values used in the truncated-Newton search technique.
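The hybrid strategy described above can be illustrated very schematically: a crude population-based global search supplies the starting point, and a truncated-Newton local search refines it. In the sketch below, scipy's "TNC" solver stands in for the truncated-Newton step and the "forward model" is a toy exponential fit, not a groundwater model.

```python
# Toy illustration of a GA-style global search followed by truncated-Newton
# refinement; the model, data, and bounds are invented for illustration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x_obs = np.linspace(0.0, 1.0, 20)
true_params = np.array([2.0, 0.5])
h_obs = true_params[0] * np.exp(-true_params[1] * x_obs) + rng.normal(0, 0.01, x_obs.size)

def misfit(p):
    h_sim = p[0] * np.exp(-p[1] * x_obs)
    return np.sum((h_sim - h_obs) ** 2)

# "GA" stage reduced to its essence: sample a population, keep the fittest member.
population = rng.uniform([0.1, 0.0], [5.0, 2.0], size=(200, 2))
best_start = min(population, key=misfit)

# Gradient-based refinement with the truncated-Newton algorithm.
result = minimize(misfit, best_start, method="TNC", bounds=[(0.1, 5.0), (0.0, 2.0)])
print(result.x)   # close to the true parameters [2.0, 0.5]
```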
Value-Based Emergency Management.
Corrigan, Zachary; Winslow, Walter; Miramonti, Charlie; Stephens, Tim
2016-02-01
This article touches on the complex and decentralized network that is the US health care system and how important it is to include emergency management in this network. By aligning the overarching incentives of opposing health care organizations, emergency management can become resilient to up-and-coming changes in reimbursement, staffing, and network ownership. Coalitions must grasp the opportunity created by changes in value-based purchasing and impending Centers for Medicare and Medicaid Services emergency management rules to engage payers, physicians, and executives. Hope and faith in doing good is no longer enough for preparedness and health care coalitions; understanding how physicians are employed and health care is delivered and paid for is now necessary. Incentivizing preparedness through value-based compensation systems will become the new standard for emergency management.
Burst strength of tubing and casing based on twin shear unified strength theory.
Lin, Yuanhua; Deng, Kuanhai; Sun, Yongxing; Zeng, Dezhi; Liu, Wanying; Kong, Xiangwei; Singh, Ambrish
2014-01-01
The internal pressure strength of tubing and casing often cannot satisfy the design requirements in high-pressure, high-temperature, high-H2S gas wells. Also, the practical safety coefficient of some wells is lower than the design standard according to the current API 5C3 standard, which creates difficulties for the design. ISO 10400:2007 provides a model that can calculate the burst strength of tubing and casing better than the API 5C3 standard, but its accuracy is not satisfactory because about 50 percent of the predicted values are markedly higher than the real burst values. Therefore, to improve the strength design of tubing and casing, this paper derives the plastic limit pressure of tubing and casing under internal pressure by applying the twin shear unified strength theory. Based on an investigation of how the yield-to-tensile strength ratio and mechanical properties influence the burst strength of tubing and casing, a more precise calculation model of tubing and casing burst strength has been established that accounts for material hardening and the intermediate principal stress. Numerical and experimental comparisons show that the new burst strength model is much closer to the real burst values than other models. The research results provide an important reference for optimizing the tubing and casing design of deep and ultra-deep wells.
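For orientation only, the sketch below shows the conventional API 5C3-style internal yield (Barlow) rating that such models are typically compared against; it is not the twin-shear unified strength solution derived in the paper, and the example pipe dimensions are illustrative.

```python
# Conventional Barlow-type internal yield pressure with the usual 87.5 %
# wall-tolerance factor; NOT the twin-shear model from the paper.
def api_internal_yield_pressure(yield_strength_mpa, wall_mm, od_mm, tolerance=0.875):
    """p = tolerance * 2 * Yp * t / D."""
    return tolerance * 2.0 * yield_strength_mpa * wall_mm / od_mm

# Example: 177.8 mm OD, 10.36 mm wall, Yp ~ 758 MPa (P110-grade casing)
print(f"{api_internal_yield_pressure(758.0, 10.36, 177.8):.1f} MPa")
```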
NASA Astrophysics Data System (ADS)
2017-11-01
To deal with these problems investigators usually rely on a calibration method that makes use of a substance with an accurately known set of interatomic distances. The procedure consists of carrying out a diffraction experiment on the chosen calibrating substance, determining the value of the distances with use of the nominal (meter) value of the voltage, and then correcting the nominal voltage by an amount that produces the distances in the calibration substance. Examples of gases that have been used for calibration are carbon dioxide, carbon tetrachloride, carbon disulfide, and benzene; solids such as zinc oxide smoke (powder) deposited on a screen or slit have also been used. The question implied by the use of any standard molecule is, how accurate are the interatomic distance values assigned to the standard? For example, a solid calibrant is subject to heating by the electron beam, possibly producing unknown changes in the lattice constants, and polyatomic gaseous molecules require corrections for vibrational averaging ("shrinkage") effects that are uncertain at best. It has lately been necessary for us to investigate this matter in connection with on-going studies of several molecules in which size is the most important issue. These studies indicated that our usual method for retrieval of data captured on film needed improvement. The following is an account of these two issues - the accuracy of the distances assigned to the chosen standard molecule, and the improvements in our methods of retrieving the scattered intensity data.
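To first order, the calibration procedure described above amounts to a scale factor: the ratio of the accepted distance in the standard to the distance obtained with the nominal voltage is applied to the distances in the molecule under study. The sketch below uses purely illustrative numbers; the actual analysis corrects the accelerating voltage (and hence the electron wavelength) rather than rescaling distances directly.

```python
# First-order sketch of scale-factor calibration against a standard substance.
r_standard_accepted = 1.1600    # accepted interatomic distance in the calibrant (illustrative)
r_standard_measured = 1.1632    # same distance reduced with the nominal voltage (illustrative)

scale = r_standard_accepted / r_standard_measured
r_unknown_measured = 2.4510     # a distance in the molecule under study (illustrative)
print(f"corrected distance ~ {scale * r_unknown_measured:.4f}")
```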
Validity Study of a Jump Mat Compared to the Reference Standard Force Plate.
Rogan, Slavko; Radlinger, Lorenz; Imhasly, Caroline; Kneubuehler, Andrea; Hilfiker, Roger
2015-12-01
In the field of vertical jump diagnostics, force plates (FP) are the reference standard. Recently, despite a lack of evidence, jump mats have been used increasingly. Important factors in favor of jump mats are their low cost and portability. This validity study compared the Haynl-Elektronik jump mat (HE jump mat) with the reference standard force plate. Ten healthy volunteers participated, and each participant completed three series of five drop jumps (DJ). The parameters ground contact time (GCT) and vertical jump height (VJH) from the HE jump mat and the FP were used to evaluate concurrent validity. The following statistical calculations were performed: Pearson's correlation (r), Bland-Altman plots (standard and trend-adjusted), and regression equations. The Bland-Altman plots suggest that the HE jump mat measures shorter contact times and greater jump heights than the FP. The trend-adjusted Bland-Altman plot shows larger mean differences and wider spreads of the confidence limits at longer GCT. For VJH, the mean differences and the spreads of the confidence limits remain relatively constant throughout the range. The following regression equations were created to bring the values as close as possible to the true value: GCT = 5.920385 + 1.072293 × [value HE jump mat] and VJH = -1.73777 + 1.011156 × [value HE jump mat]. The HE jump mat can be recommended with regard to validity, within these constraints. In this study, only a subset of the quality criteria was examined; for a final recommendation, the HE jump mat should also be examined against the other quality criteria (test-retest reliability, sensitivity to change).
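The reported regression equations can be applied directly to jump-mat readings, as in the sketch below; the units of the inputs (assumed here to be milliseconds for GCT and centimetres for VJH) and the example values are assumptions, not stated in the abstract.

```python
# Applying the correction equations reported above to HE jump mat readings.
def corrected_gct(gct_mat):
    return 5.920385 + 1.072293 * gct_mat   # assumed units: ms

def corrected_vjh(vjh_mat):
    return -1.73777 + 1.011156 * vjh_mat   # assumed units: cm

print(corrected_gct(180.0), corrected_vjh(32.0))
```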
New Quality Standards of Testing Idlers for Highly Effective Belt Conveyors
NASA Astrophysics Data System (ADS)
Król, Robert; Gladysiewicz, Lech; Kaszuba, Damian; Kisielewski, Waldemar
2017-12-01
The paper presents the results of research and analyses of belt conveyor idlers' rotational resistance, which is one of the key factors indicating idler quality. Moreover, idlers' rotational resistance is an important factor in the total resistance to motion of a belt conveyor. The evaluation of the technical condition of belt conveyor idlers is carried out in accordance with current national and international standards, which determine the methodology of measurements and the acceptable values of measured idler parameters. The requirements defined by the standards, which determine the suitability of idlers for a specific application, have maintained the same parameter values over long periods of time, despite the development of knowledge on idlers and the quality of presently manufactured idlers. Nowadays the need to implement new, efficient and economically justified solutions for belt conveyor transportation systems characterized by long routes and energy efficiency is often discussed as one of the goals for the future of belt conveyors. One of the basic conditions for achieving this goal is to use only carefully selected idlers with low rotational resistance under the full range of operational loads and high durability. It is therefore necessary to develop new guidelines for the evaluation of the technical condition of belt conveyor idlers in accordance with current standards, and to improve existing and develop new methods of idler testing. The changes should in particular concern updating the parameter values used for the evaluation of the technical condition of belt conveyor idlers in relation to the operational challenges of belt conveyors and the growing demands on their energy efficiency.
Determination of Gibbs energies of formation in aqueous solution using chemical engineering tools.
Toure, Oumar; Dussap, Claude-Gilles
2016-08-01
Standard Gibbs energies of formation are of primary importance in the field of biothermodynamics. In the absence of directly measured values, thermodynamic calculations are required to determine the missing data. For several biochemical species, this study shows that knowledge of the standard Gibbs energy of formation of the pure compound (in the gaseous, solid or liquid state) makes it possible to determine the corresponding standard Gibbs energy of formation in aqueous solution. To do so, using chemical engineering tools (thermodynamic tables and a model that predicts activity coefficients, solvation Gibbs energies and pKa data), it becomes possible to determine the partial chemical potential of neutral and charged components under real metabolic conditions, even in concentrated mixtures. Copyright © 2016 Elsevier Ltd. All rights reserved.
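A minimal sketch of the kind of bookkeeping this implies, assuming the common decomposition ΔfG°(aq) ≈ ΔfG°(pure) + ΔsolvG° + RT ln(γ·m/m°); the decomposition, the symbols and the numerical inputs below are illustrative assumptions, not values or the exact model from the study.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def gibbs_formation_aq(dG_f_pure_kJ, dG_solv_kJ, gamma, molality, T=298.15, m_std=1.0):
    """Rough estimate of a standard Gibbs energy of formation in aqueous solution (kJ/mol)
    from the pure-compound value, a solvation Gibbs energy and an activity term.
    All numeric inputs here are hypothetical placeholders."""
    activity_term_kJ = R * T * math.log(gamma * molality / m_std) / 1000.0
    return dG_f_pure_kJ + dG_solv_kJ + activity_term_kJ

# Illustrative numbers only
print(round(gibbs_formation_aq(dG_f_pure_kJ=-390.0, dG_solv_kJ=-25.0, gamma=0.8, molality=0.1), 1))
```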
Minimal clinically important difference of the Modified Fatigue Impact Scale in Parkinson's disease.
Kluger, Benzi M; Garimella, Sanjana; Garvan, Cynthia
2017-10-01
Fatigue is a common and debilitating symptom of Parkinson's disease (PD) with no evidence-based treatments. While several fatigue scales are partially validated in PD, the minimal clinically important difference (MCID) is unknown for any of them, although it is an important psychometric value for designing and interpreting therapeutic trials. We therefore sought to determine the MCID for the Modified Fatigue Impact Scale (MFIS). This is a secondary analysis of data from 94 PD participants in an acupuncture trial for PD fatigue. Standard psychometric approaches were used to establish validity, and an anchor-based approach was used to determine the MCID. The MFIS demonstrated good concurrent validity with other outcome measures and high internal consistency. MCID values were found to be 13.8, 6.8 and 6.2 for the MFIS total, MFIS cognitive, and MFIS physical subscores, respectively. The MFIS is a valid multidimensional measure of fatigue in PD with a demonstrable MCID. Copyright © 2017 Elsevier Ltd. All rights reserved.
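A minimal sketch of one common form of an anchor-based MCID estimate (the mean change score among participants rated "minimally improved" on an external anchor); this is a generic illustration, not necessarily the exact procedure used in the study, and the data below are invented.

```python
from statistics import mean

def anchor_based_mcid(change_scores, anchor_ratings, minimal_label="minimally improved"):
    """Mean change score among participants whose anchor rating is 'minimally improved'."""
    changes = [c for c, a in zip(change_scores, anchor_ratings) if a == minimal_label]
    return mean(changes) if changes else float("nan")

# Hypothetical MFIS change scores paired with a global-impression anchor
changes = [18, 5, 14, 12, 2, 16, 9]
anchors = ["much improved", "no change", "minimally improved", "minimally improved",
           "no change", "much improved", "minimally improved"]
print(round(anchor_based_mcid(changes, anchors), 1))  # mean change in the minimally-improved group
```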
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spickett, Jeffery, E-mail: J.Spickett@curtin.edu.au; Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia; Katscherian, Dianne
The approaches used for setting or reviewing air quality standards vary from country to country. The purpose of this research was to consider the potential to improve decision-making through integration of HIA into the processes to review and set air quality standards used in Australia. To assess the value of HIA in this policy process, its strengths and weaknesses were evaluated aligned with review of international processes for setting air quality standards. Air quality standard setting programmes elsewhere have either used HIA or have amalgamated and incorporated factors normally found within HIA frameworks. They clearly demonstrate the value of a more formalised HIA process for setting air quality standards in Australia. The following elements should be taken into consideration when using HIA in standard setting. (a) The adequacy of a mainly technical approach in current standard setting procedures to consider social determinants of health. (b) The importance of risk assessment criteria and information within the HIA process. The assessment of risk should consider equity, the distribution of variations in air quality in different locations and the potential impacts on health. (c) The uncertainties in extrapolating evidence from one population to another or to subpopulations, especially the more vulnerable, due to differing environmental factors and population variables. (d) The significance of communication with all potential stakeholders on issues associated with the management of air quality. In Australia there is also an opportunity for HIA to be used in conjunction with the NEPM to develop local air quality standard measures. The outcomes of this research indicated that the use of HIA for air quality standard setting at the national and local levels would prove advantageous. -- Highlights: • Health Impact Assessment framework has been applied to a policy development process. • HIA process was evaluated for application in air quality standard setting. • Advantages of HIA in the air quality standard setting process are demonstrated.
Rabeprazole can overcome the impact of CYP2C19 polymorphism on quadruple therapy.
Kuo, Chao-Hung; Wang, Sophie S W; Hsu, Wen-Hung; Kuo, Fu-Chen; Weng, Bi-Chuang; Li, Chia-Jung; Hsu, Ping-I; Chen, Angela; Hung, Wen-Chun; Yang, Yuan-Chieh; Wang, Wen-Ming; Wu, Deng-Chyang
2010-08-01
The prospective study was designed to clarify the impact of CYP2C19 on quadruple therapies and to survey the efficacy of rabeprazole-based quadruple therapy for Helicobacter pylori infection after failure of standard triple therapies. From January 2007 to March 2009, 1055 H. pylori-infected patients received standard triple regimens (proton-pump inhibitor (PPI), clarithromycin, and amoxicillin). Helicobacter pylori eradication was achieved in 865 (81.9%) subjects. One hundred ninety eradication-failure patients were enrolled and randomly assigned to receive a 7-day eradication therapy. Ninety-six patients were treated with esomeprazole-based quadruple rescue therapy (EB), while 94 patients were treated with rabeprazole-based quadruple rescue therapy (RB). Follow-up endoscopy was performed 16 weeks later to assess the treatment response. Patients' responses, CYP2C19 genotypes, and antibiotic resistance were also examined. Intention-to-treat analysis showed eradication rates of 72.9% for EB (95% CI: 64.9-80.9%) and 78.7% for RB (95% CI: 72.5-84.9%) (p = 0.543). Per-protocol results were 75.3% for EB (95% CI: 70.3-80.3%) and 85.1% for RB (95% CI: 80.6-89.6%) (p = 0.0401). Both regimens had similar compliance (p = 0.155) and adverse events (p = 0.219). We also surveyed the patients without resistance to any of the antibiotics; RB still showed a better outcome than EB. Our data showed that an esomeprazole-based regimen and the CYP2C19 homozygous extensive metabolizer (Hom EM) genotype were important predictors of eradication failure. In quadruple therapy, rabeprazole-based regimens had better efficacy than esomeprazole-based regimens, and CYP2C19 polymorphism also played an important role. It seems advisable to change the PPI to rabeprazole in second-line quadruple therapy.
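As a small worked example of how eradication rates and 95% confidence intervals of this kind are often computed, the sketch below uses a normal-approximation (Wald) interval; the exact interval method used in the study is not stated, so that choice and the round counts are assumptions.

```python
import math

def proportion_ci(successes: int, n: int, z: float = 1.96):
    """Point estimate and normal-approximation (Wald) 95% CI for a proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Illustration with counts close to the reported RB intention-to-treat rate (74/94 ~ 78.7%)
p, lo, hi = proportion_ci(successes=74, n=94)
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```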
Fast analytical spectral filtering methods for magnetic resonance perfusion quantification.
Reddy, Kasireddy V; Mitra, Abhishek; Yalavarthy, Phaneendra K
2016-08-01
Deconvolution in perfusion-weighted imaging (PWI) plays an important role in quantifying MR perfusion parameters. The application of PWI to stroke and brain tumor studies has become standard clinical practice. The standard approaches for this deconvolution are oscillatory-limited singular value decomposition (oSVD) and frequency domain deconvolution (FDD). FDD is widely recognized as the fastest approach currently available for deconvolution of MR perfusion data. In this work, two fast deconvolution methods (namely analytical Fourier filtering and analytical Showalter spectral filtering) are proposed. Through systematic evaluation, the proposed methods are shown to be computationally efficient and quantitatively accurate compared to FDD and oSVD.
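A minimal sketch of generic frequency-domain deconvolution with simple Tikhonov-style regularization, recovering a flow-scaled residue function from an arterial input function and a tissue curve; this is a textbook-style illustration, not the analytical filtering methods proposed in the paper, and all signals below are synthetic.

```python
import numpy as np

def fdd_deconvolve(aif, tissue_curve, dt, reg=0.1):
    """Frequency-domain deconvolution with a simple regularized inverse filter.
    Returns an estimate of the flow-scaled residue function."""
    A = np.fft.fft(aif) * dt                     # AIF spectrum (include sampling interval)
    C = np.fft.fft(tissue_curve)                 # tissue concentration spectrum
    H = np.conj(A) / (np.abs(A) ** 2 + reg * np.max(np.abs(A)) ** 2)
    return np.real(np.fft.ifft(C * H))

# Synthetic example: gamma-variate AIF convolved with an exponential residue function
dt = 1.0
t = np.arange(64) * dt
aif = (t ** 3) * np.exp(-t / 1.5)
residue = 0.6 * np.exp(-t / 4.0)                 # "true" flow-scaled residue function
tissue = np.convolve(aif, residue)[:64] * dt
estimate = fdd_deconvolve(aif, tissue, dt, reg=0.05)
print(round(float(estimate.max()), 3))           # roughly recovers the true peak of 0.6
```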
Medically Inappropriate or Futile Treatment: Deliberation and Justification
Misak, Cheryl J.; White, Douglas B.; Truog, Robert D.
2016-01-01
This paper reframes the futility debate, moving away from the question “Who decides when to end what is considered to be a medically inappropriate or futile treatment?” and toward the question “How can society make policy that will best account for the multitude of values and conflicts involved in such decision-making?” It offers a pragmatist moral epistemology that provides us with (1) a clear justification of why it is important to take best standards, norms, and physician judgment seriously and (2) a clear justification of why ample opportunity must be made for patients, families, and society to challenge those standards and norms. PMID:26681796
Dehghani, Mansooreh; Anushiravani, Amir; Hashemi, Hassan; Shamsedini, Narges
2014-06-01
Expanding cities with rapid economic development have experienced increased energy consumption, leading to numerous environmental problems for their residents. The aim of this study was to investigate the correlation between air pollution and mortality due to cardiovascular and respiratory diseases in Shiraz. This is an analytical cross-sectional study in which the correlation between major air pollutants (carbon monoxide [CO], sulfur dioxide [SO2], nitrogen dioxide [NO2] and particulate matter with a diameter of less than 10 μm [PM10]) and climatic parameters (temperature and relative humidity) with the number of deaths from cardiopulmonary disease in Shiraz from March 2011 to January 2012 was investigated. Data on the concentrations of air pollutants were obtained from the Shiraz Environmental Organization. Information about climatic parameters was collected from the database of Iran's Meteorological Organization. The numbers of deaths from cardiopulmonary disease in Shiraz were provided by the Department of Health, Shiraz University of Medical Sciences. We used a non-parametric correlation test to analyze the relationships among these parameters. The results demonstrated that, across all recorded data, the average monthly pollutant standards index (PSI) values for PM10 were higher than the standard limits, while the average monthly PSI values for NO2 were lower than the standard. There was no significant relationship between the number of cardiopulmonary deaths and the air pollutants (P > 0.05). Air pollution can aggravate chronic cardiopulmonary disease. In the current study, one of the most important air pollutants in Shiraz was PM10. Mechanical processes, such as wind blowing dust in from neighboring countries, are the most important factor raising PM10 in Shiraz to alarming levels. The average monthly PSI values of air pollutants such as NO2, CO, and SO2 were lower than the standard limits. Moreover, there was no significant correlation between the average monthly variation in PSI of NO2, CO, PM10, and SO2 and the number of cardiopulmonary deaths in Shiraz.
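A minimal sketch of the kind of non-parametric correlation test used here (Spearman's rank correlation is a common choice; the abstract does not name the specific test, so that is an assumption), run on synthetic monthly data.

```python
from scipy.stats import spearmanr

# Hypothetical monthly series: PM10 PSI values and cardiopulmonary death counts
pm10_psi = [120, 135, 98, 150, 110, 142, 105, 128, 133, 118, 125]
deaths   = [42, 47, 39, 44, 41, 45, 40, 43, 46, 41, 44]

rho, p_value = spearmanr(pm10_psi, deaths)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```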
Li, R; Li, C T; Zhao, S M; Li, H X; Li, L; Wu, R G; Zhang, C C; Sun, H Y
2017-04-01
To establish a query table of IBS critical values and identification power for detection systems with different numbers of STR loci under different false-judgment standards. Samples from 267 pairs of full siblings and 360 pairs of unrelated individuals were collected, and 19 autosomal STR loci were genotyped with the Goldeneye™ 20A system. Full siblings were determined using the IBS scoring method according to the 'Regulation for biological full sibling testing'. The critical values and identification power for detection systems with different numbers of STR loci under different false-judgment standards were calculated by theoretical methods. According to the formal IBS scoring criteria, the identification power for distinguishing full siblings from unrelated individuals was 0.7640 and the rate of false judgment was 0. The results of the theoretical calculation were consistent with the sample observations. The query table of IBS critical values for full-sibling identification with detection systems of different numbers of STR loci was successfully established. The IBS scoring method defined by the regulation has high detection efficiency and a low false-judgment rate, providing a relatively conservative result. The query table of IBS critical values provides important reference data for judging the results of full-sibling testing and has considerable practical value. Copyright© by the Editorial Department of Journal of Forensic Medicine
Lloyd, C H; Yearn, J A; Cowper, G A; Blavier, J; Vanderdonckt, M
2004-07-01
The setting expansion is an important property of a phosphate-bonded investment material. This research was undertaken to investigate a test that might be suitable for its measurement in a Standard. In the 'Casting-Ring Test', the investment sample is contained in a steel ring and expands to displace a precisely positioned pin. Variables with the potential to alter routine reproduction of the value were investigated. The vacuum-mixer model is a production laboratory variable that must not be ignored, and for this reason experiments were repeated using a different vacuum mixer located at a second test site. Restraint by the rigid ring material increased expansion, while force on the pin reduced it. Expansion was specific to the lining selected. Increased environmental temperature decreased the final value. Expansion was still taking place at the time at which its value might be measured. However, when these factors are set, the reproducibility of values for setting expansion was good at both test sites (coefficient of variation 14% at most). The results revealed that, with the control that is available, reliable routine measurement is possible in a Standard test. The inter-laboratory variable, vacuum-mixer model, produced significant differences and should be the subject of further investigation.
Warren, Joseph D; Smith, Joy N
2007-07-01
The density and sound speed of two coastal, gelatinous zooplankton, Mnemiopsis leidyi (a ctenophore) and Cyanea capillata (lion's mane jellyfish), were measured. These parameters are important inputs to acoustic scattering models. Two different methods were used to measure the density of individual animals: one used a balance and graduated cylinder to determine the mass and displacement volume of the animal, the other varied the density of the solution in which the animal was immersed. When the same animal was measured using both methods, density values were within 1% of each other. A travel-time difference method was used to measure the sound speed within the animals. The densities of both zooplankton decreased slightly as the animals increased in length, mass, and volume. The ratios of animal density and sound speed to the surrounding seawater (g and h, respectively) are reported for both animals. For Mnemiopsis leidyi ranging in length from 1 to 5 cm, the mean values (± standard deviation) of g and h were 1.009 (± 0.004) and 1.007 (± 0.001). For Cyanea capillata ranging in bell diameter from 2 to 11 cm, the mean value (± standard deviation) of g and the single value of h were 1.009 (± 0.004) and 1.0004.
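A small worked example of the contrast ratios g and h used in acoustic scattering models (g = animal density / seawater density, h = animal sound speed / seawater sound speed); the seawater reference values and animal values below are illustrative assumptions, not measurements from the study.

```python
def contrast_ratios(rho_animal, c_animal, rho_seawater=1025.0, c_seawater=1500.0):
    """Density contrast g and sound-speed contrast h relative to seawater."""
    return rho_animal / rho_seawater, c_animal / c_seawater

# Illustrative animal values chosen so g and h land near the reported ~1.009 and ~1.007
g, h = contrast_ratios(rho_animal=1034.2, c_animal=1510.5)
print(f"g = {g:.3f}, h = {h:.3f}")
```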
Age-Related Changes and Reference Values of Bicaudate Ratio and Sagittal Brainstem Diameters on MRI.
Garbade, Sven F; Boy, Nikolas; Heringer, Jana; Kölker, Stefan; Harting, Inga
2018-06-05
Cranial magnetic resonance imaging (MRI) plays an important role in the diagnosis of neurometabolic diseases, and, in addition, temporal patterns of signal and volume changes allow insight into the underlying pathogenesis. While assessment of volume changes by visual inspection is subjective, volumetric approaches are often not feasible with rare neurometabolic diseases, where MRIs are often acquired with different scanners and protocols. Linear surrogate parameters of brain volume, for example the bicaudate ratio, present a robust alternative that can be derived from standard imaging sequences. Because of continuing postnatal brain and skull development and later brain involution, it is, however, necessary to compare patient values with age-adapted normal values. In this article, we present age-dependent normal values derived from 993 standard scans of patients with normal MRI findings (age range: 0-80 years; mean = 19.9; median = 12.8 years) for the bicaudate ratio as a measure of global supratentorial volume, as well as the maximal anteroposterior diameters of the mesencephalon, pons, and medulla oblongata as parameters of brainstem volume. The provided data allow quantitative, objective assessment of brain volume changes instead of the usually performed visual, and therefore subjective, assessment. Georg Thieme Verlag KG Stuttgart · New York.
A proof for Rhiel's range estimator of the coefficient of variation for skewed distributions.
Rhiel, G Steven
2007-02-01
This research study provides a proof that the coefficient of variation (CV(high-low)) calculated from the highest and lowest values in a set of data is applicable to specific skewed distributions with varying means and standard deviations. Earlier, Rhiel provided values for d(n), the standardized mean range, and a(n), an adjustment for bias in the range estimator of μ. These values are used in estimating the coefficient of variation from the range for skewed distributions. The d(n) and a(n) values were specified for specific skewed distributions with a fixed mean and standard deviation. In this proof it is shown that the d(n) and a(n) values are applicable to the specific skewed distributions when the mean and standard deviation take on differing values. This gives the researcher confidence in using this statistic for skewed distributions regardless of the mean and standard deviation.
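A minimal sketch of a range-based CV estimate of the general form the abstract describes, assuming the standard deviation is estimated as (high − low)/d(n) and a bias adjustment a(n) is applied to the mean estimate; the exact functional form and the constants used below are illustrative assumptions, not Rhiel's published values.

```python
def cv_high_low(high, low, sample_mean, d_n, a_n=1.0):
    """Range-based coefficient of variation: sigma estimated from the range,
    with a(n) applied as a bias adjustment. Form and constants are assumed."""
    sigma_hat = (high - low) / d_n
    mu_hat = sample_mean * a_n
    return sigma_hat / mu_hat

# Illustrative numbers only: hypothetical d(n) and a(n) for a sample of n = 20
print(round(cv_high_low(high=9.8, low=1.2, sample_mean=4.5, d_n=3.73, a_n=1.0), 3))
```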
Yurttutan, Nursel; Bakacak, Murat; Kızıldağ, Betül
2017-09-29
Endothelial dysfunction, vasoconstriction, and oxidative stress are described in the pathophysiology of pre-eclampsia, but its aetiology has not been clearly established. To examine whether there is a difference between the placentas of pre-eclamptic pregnant women and those of a control group in terms of their T2 star values. Case-control study. Twenty patients diagnosed with pre-eclampsia and 22 healthy controls were included in this study. The placentas obtained after births performed via Caesarean section were taken into the magnetic resonance imaging area in plastic bags within the first postnatal hour, and imaging was performed with a modified DIXON-Quant sequence. Average values were obtained by performing T2 star measurements at four locations on the placentas. T2 star values measured in the placentas of the control group were found to be significantly lower than those of the pre-eclampsia group (p<0.01). While the mean T2 star value in the pre-eclamptic group was 37.48 ms (standard deviation ± 11.3), this value was 28.74 ms (standard deviation ± 8.08) in the control group. The cut-off value for the T2 star value maximising diagnostic accuracy was 28.59 ms (area under the curve: 0.741; 95% confidence interval: 0.592-0.890); sensitivity and specificity were 70% and 63.6%, respectively. In this study, the T2 star value, which is an indicator of iron content, was found to be significantly lower in the control group than in the pre-eclampsia group. This may be related to the reduction in blood flow to the placenta due to endothelial dysfunction and vasoconstriction, which are important in pre-eclampsia pathophysiology.
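A small sketch of how sensitivity and specificity at a chosen T2 star cut-off are typically computed from two groups of measurements; the group values below are synthetic, and the convention that a value above the cut-off counts as "test positive" for pre-eclampsia is an assumption made for illustration.

```python
def sens_spec_at_cutoff(cases, controls, cutoff):
    """Sensitivity and specificity, assuming values above the cutoff are called positive."""
    tp = sum(v > cutoff for v in cases)
    fn = len(cases) - tp
    tn = sum(v <= cutoff for v in controls)
    fp = len(controls) - tn
    return tp / (tp + fn), tn / (tn + fp)

# Synthetic T2* values (ms): cases = pre-eclampsia placentas, controls = healthy placentas
cases    = [41.2, 35.6, 29.9, 44.0, 27.1, 38.5, 30.2, 50.1, 26.0, 33.3]
controls = [25.4, 30.1, 27.8, 22.9, 31.5, 26.3, 24.7, 28.0, 29.2, 21.6]
sens, spec = sens_spec_at_cutoff(cases, controls, cutoff=28.59)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```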
Rakszegi, Marianna; Löschenberger, Franziska; Hiltbrunner, Jürg; Vida, Gyula; Mikó, Péter
2016-06-01
An assessment was previously made of the effects of organic and low-input field management systems on the physical, grain compositional and processing quality of wheat and on the performance of varieties developed using different breeding methods ("Comparison of quality parameters of wheat varieties with different breeding origin under organic and low-input conventional conditions" [1]). Here, accompanying data are provided on the performance and stability analysis of the genotypes using the coefficient of variation and the 'ranking' and 'which-won-where' plots of GGE biplot analysis for the most important quality traits. Broad-sense heritability was also evaluated and is given for the most important physical and quality properties of the seed in organic and low-input management systems, while mean values and standard deviation of the studied properties are presented separately for organic and low-input fields.
9 CFR 439.20 - Criteria for maintaining accreditation.
Code of Federal Regulations, 2011 CFR
2011-01-01
... deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is...) Variability: The absolute value of the standardized difference between the accredited laboratory's result and... constant, is used in place of the absolute value of the standardized difference to determine the CUSUM-V...
9 CFR 439.20 - Criteria for maintaining accreditation.
Code of Federal Regulations, 2013 CFR
2013-01-01
... deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is...) Variability: The absolute value of the standardized difference between the accredited laboratory's result and... constant, is used in place of the absolute value of the standardized difference to determine the CUSUM-V...
9 CFR 439.20 - Criteria for maintaining accreditation.
Code of Federal Regulations, 2012 CFR
2012-01-01
... deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is...) Variability: The absolute value of the standardized difference between the accredited laboratory's result and... constant, is used in place of the absolute value of the standardized difference to determine the CUSUM-V...
9 CFR 439.20 - Criteria for maintaining accreditation.
Code of Federal Regulations, 2014 CFR
2014-01-01
... deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is...) Variability: The absolute value of the standardized difference between the accredited laboratory's result and... constant, is used in place of the absolute value of the standardized difference to determine the CUSUM-V...
9 CFR 439.20 - Criteria for maintaining accreditation.
Code of Federal Regulations, 2010 CFR
2010-01-01
... deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is...) Variability: The absolute value of the standardized difference between the accredited laboratory's result and... constant, is used in place of the absolute value of the standardized difference to determine the CUSUM-V...
Anti-inflammatory drugs and prediction of new structures by comparative analysis.
Bartzatt, Ronald
2012-01-01
Nonsteroidal anti-inflammatory drugs (NSAIDs) are a group of agents important for their analgesic, anti-inflammatory, and antipyretic properties. This study presents several approaches to predicting and elucidating new molecular structures of NSAIDs based on 36 known and proven anti-inflammatory compounds. For these 36 NSAIDs, the mean value of Log P is 3.338 (standard deviation = 1.237), the mean polar surface area is 63.176 Å² (standard deviation = 20.951 Å²), and the mean molecular weight is 292.665 (standard deviation = 55.627). Nine molecular properties are determined for these 36 NSAID agents, including Log P, number of -OH and -NHn groups, violations of the Rule of 5, number of rotatable bonds, and numbers of oxygens and nitrogens. Statistical analysis of these nine molecular properties provides numerical parameters to conform to in the design of novel NSAID drug candidates. Multiple regression analysis is accomplished using these properties of the 36 agents, followed by examples of predicted molecular weight based on minimum and maximum property values. Hierarchical cluster analysis indicated that licofelone, tolfenamic acid, meclofenamic acid, droxicam, and aspirin are substantially distinct from all remaining NSAIDs. Analysis of similarity (ANOSIM) produced R = 0.4947, which indicates a low to moderate level of dissimilarity between these 36 NSAIDs. Non-hierarchical K-means cluster analysis separated the 36 NSAIDs into four groups whose members show the greatest similarity. Likewise, discriminant analysis divided the 36 agents into two groups showing the greatest level of distinction (discrimination) based on the nine properties. Together, these two multivariate methods provide investigators with a means to compare novel drug designs with the 36 proven compounds and to ascertain which of those they most resemble in pharmacodynamics. In addition, artificial neural network modeling is demonstrated as an approach to predict numerous molecular properties of new drug designs, based on training from the 36 proven NSAIDs. Comprehensive and effective approaches are presented in this study for the design of new NSAID-type agents, which are so very important for inhibition of the COX-2 and COX-1 isoenzymes.
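A minimal sketch of the kind of K-means grouping described, run on a small table of hypothetical molecular descriptors (Log P, polar surface area, molecular weight); the descriptor rows are invented and the scaling choice is an assumption, so this only illustrates the workflow, not the paper's actual analysis.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical descriptor rows: [Log P, polar surface area (A^2), molecular weight]
descriptors = np.array([
    [3.5, 49.3, 230.3],
    [1.2, 63.6, 180.2],
    [4.3, 68.5, 296.1],
    [2.8, 59.3, 254.3],
    [5.1, 45.6, 379.4],
    [3.9, 72.9, 318.1],
])
scaled = StandardScaler().fit_transform(descriptors)   # put properties on a common scale
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(labels)  # cluster membership for each hypothetical compound
```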
Space Weather Outreach: Connection to STEM Standards
NASA Astrophysics Data System (ADS)
Dusenbery, P. B.
2008-12-01
Many scientists are studying the Sun-Earth system and attempting to provide timely, accurate, and reliable space environment observations and forecasts. Research programs and missions serve as an ideal focal point for creating educational content, making this an ideal time to inform the public about the importance and value of space weather research. In order to take advantage of this opportunity, the Space Science Institute (SSI) is developing a comprehensive Space Weather Outreach program to reach students, educators, and other members of the public, and share with them the exciting discoveries from this important scientific discipline. The Space Weather Outreach program has the following five components: (1) the Space Weather Center Website that includes online educational games; (2) Small Exhibits for Libraries, Shopping Malls, and Science Centers; (3) After-School Programs; (4) Professional Development Workshops for Educators, and (5) an innovative Evaluation and Education Research project. Its overarching goal is to inspire, engage, and educate a broad spectrum of the public and make strategic and innovative connections between informal and K-12 education communities. An important factor in the success of this program will be its alignment with STEM standards especially those related to science and mathematics. This presentation will describe the Space Weather Outreach program and how standards are being used in the development of each of its components.
Comparison between presepsin and procalcitonin in early diagnosis of neonatal sepsis.
Iskandar, Agustin; Arthamin, Maimun Z; Indriana, Kristin; Anshory, Muhammad; Hur, Mina; Di Somma, Salvatore
2018-05-09
Neonatal sepsis remains worldwide one of the leading causes of morbidity and mortality in both term and preterm infants. Lower mortality rates are related to timely diagnostic evaluation and prompt initiation of empiric antibiotic therapy. Blood culture, the gold-standard examination for sepsis, has several limitations for early diagnosis, so sepsis biomarkers could play an important role in this regard. This study was aimed at comparing the value of the two biomarkers presepsin and procalcitonin in the early diagnosis of neonatal sepsis. This was a prospective cross-sectional study performed in Saiful Anwar General Hospital, Malang, Indonesia, on 51 neonates who fulfilled the criteria of systemic inflammatory response syndrome (SIRS), with blood culture as the diagnostic gold standard for sepsis. At receiver operating characteristic (ROC) curve analysis, using a presepsin cutoff of 706.5 pg/mL, the following values were obtained: sensitivity = 85.7%, specificity = 68.8%, positive predictive value = 85.7%, negative predictive value = 68.8%, positive likelihood ratio = 2.75, negative likelihood ratio = 0.21, and accuracy = 80.4%. On the other hand, with a procalcitonin cutoff value of 161.33 pg/mL, the following values were obtained: sensitivity = 68.6%, specificity = 62.5%, positive predictive value = 80%, negative predictive value = 47.6%, positive likelihood ratio = 1.83, negative likelihood ratio = 0.5, and accuracy = 66.7%. In the early diagnosis of neonatal sepsis, compared with procalcitonin, presepsin seems to provide better early diagnostic value, with consequent possible faster therapeutic decision making and a possible positive impact on the outcome of neonates.
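A small sketch showing how these diagnostic indices follow from a 2×2 table of biomarker result versus blood-culture status; the counts below are hypothetical (chosen so the indices come out near the reported presepsin values) and only illustrate the arithmetic.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic indices from a 2x2 table (index test vs. gold standard)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    lr_pos = sens / (1 - spec)
    lr_neg = (1 - sens) / spec
    acc = (tp + tn) / (tp + fp + fn + tn)
    return sens, spec, ppv, npv, lr_pos, lr_neg, acc

# Hypothetical counts for 51 neonates: 35 culture-positive, 16 culture-negative
print([round(x, 2) for x in diagnostic_metrics(tp=30, fp=5, fn=5, tn=11)])
```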
Precision half-life measurement of 11C: The most precise mirror transition Ft value
NASA Astrophysics Data System (ADS)
Valverde, A. A.; Brodeur, M.; Ahn, T.; Allen, J.; Bardayan, D. W.; Becchetti, F. D.; Blankstein, D.; Brown, G.; Burdette, D. P.; Frentz, B.; Gilardy, G.; Hall, M. R.; King, S.; Kolata, J. J.; Long, J.; Macon, K. T.; Nelson, A.; O'Malley, P. D.; Skulski, M.; Strauss, S. Y.; Vande Kolk, B.
2018-03-01
Background: The precise determination of the Ft value in T = 1/2 mixed mirror decays is an important avenue for testing the standard model of the electroweak interaction through the determination of Vud in nuclear β decays. 11C is an interesting case, as its low mass and small QEC value make it particularly sensitive to violations of the conserved vector current hypothesis. The present dominant source of uncertainty in the 11C Ft value is the half-life. Purpose: A high-precision measurement of the 11C half-life was performed, and a new world average half-life was calculated. Method: 11C was created by transfer reactions and separated using the TwinSol facility at the Nuclear Science Laboratory at the University of Notre Dame. It was then implanted into a tantalum foil, and β counting was used to determine the half-life. Results: The new half-life, t1/2 = 1220.27(26) s, is consistent with the previous values but significantly more precise. A new world average was calculated, t1/2(world) = 1220.41(32) s, and a new estimate for the Gamow-Teller to Fermi mixing ratio ρ is presented along with standard model correlation parameters. Conclusions: The new 11C world-average half-life allows the calculation of an Ft(mirror) value that is now the most precise value for all superallowed mixed mirror transitions. This gives a strong impetus for an experimental determination of ρ, to allow for the determination of Vud from this decay.
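A small sketch of the inverse-variance weighted average that is commonly used to combine measurements into a world value; the abstract does not spell out the averaging procedure, so treating it as a plain inverse-variance mean is an assumption, and the older half-life values listed below are placeholders (only the new 1220.27(26) s point comes from the abstract).

```python
import math

def weighted_average(values, uncertainties):
    """Inverse-variance weighted mean and its uncertainty."""
    weights = [1.0 / u ** 2 for u in uncertainties]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    return mean, math.sqrt(1.0 / sum(weights))

# New measurement from the abstract plus hypothetical earlier half-lives (seconds)
values        = [1220.27, 1221.0, 1219.8]
uncertainties = [0.26, 0.9, 1.1]
mean, err = weighted_average(values, uncertainties)
print(f"combined value ~ {mean:.2f} +/- {err:.2f} s")
```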
Performance specifications of critical results management.
Piva, Elisa; Sciacovelli, Laura; Pelloso, Michela; Plebani, Mario
2017-07-01
Formerly defined "critical values", the importance of critical results (CRs) management in patient care has grown in recent years. According to the George Lundberg definition the result becomes "critical" when, exceeding actionable thresholds, it suggests imminent danger for the patient, unless appropriate therapy is initiated promptly. As required in most important accreditation standards, such as the ISO:15,189 or the Joint Commission standards, a quality reporting system should deliver the correct result to the appropriate clinician in a time-frame that ensures patient safety. From this point of view, medical laboratories should implement a process that assures the most effective communication in a timely manner, to the referring physician or care team member. Failure in communication, particularly in this type of situation, continues to be one of the most common factors contributing to the occurrence of adverse events. In the last few decades, Information Technology (IT) in Health Care has become increasingly important. The ability to interface radiology, anatomic pathology or laboratory information systems with electronic medical records is now a real opportunity, offering much safer communication than in the past. Future achievements on performance criteria and quality indicators for the notification of CRs, should ensure a comparable examination across different institutions, adding value to clinical laboratories in controlling post-analytical processes that concern patient safety. Therefore, the novel approach to CRs should combine quality initiatives, IT solutions and a culture to strengthen professional interaction. Copyright © 2017. Published by Elsevier Inc.
Energetic studies and phase diagram of thioxanthene.
Freitas, Vera L S; Monte, Manuel J S; Santos, Luís M N B F; Gomes, José R B; Ribeiro da Silva, Maria D M C
2009-11-19
The molecular stability of thioxanthene, a key species from which very important compounds with industrial relevance are derived, has been studied by a combination of several experimental techniques and computational approaches. The standard (p° = 0.1 MPa) molar enthalpy of formation of crystalline thioxanthene (117.4 ± 4.1 kJ·mol⁻¹) was determined from the experimental standard molar energy of combustion, in oxygen, measured by rotating-bomb combustion calorimetry at T = 298.15 K. The enthalpy of sublimation was determined by a direct method, using the vacuum drop microcalorimetric technique, and also by an indirect method, using a static apparatus, where the vapor pressures at different temperatures were measured. The latter technique was used for both crystalline and undercooled liquid samples, and the phase diagram of thioxanthene near the triple point was obtained (triple point coordinates T = 402.71 K and p = 144.7 Pa). From the two methods, a mean value for the standard (p° = 0.1 MPa) molar enthalpy of sublimation, at T = 298.15 K (101.3 ± 0.8 kJ·mol⁻¹), was derived. From the latter value and from the enthalpy of formation of the solid, the standard (p° = 0.1 MPa) enthalpy of formation of gaseous thioxanthene was calculated as 218.7 ± 4.2 kJ·mol⁻¹. Standard ab initio molecular orbital calculations were performed using the G3(MP2)//B3LYP composite procedure and several homodesmotic reactions in order to derive the standard molar enthalpy of formation of thioxanthene. The ab initio results are in excellent agreement with the experimental data.
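A small worked check of the thermochemical cycle implied here: the gas-phase enthalpy of formation is the crystalline value plus the sublimation enthalpy, with the uncertainties combined in quadrature (a standard assumption for independent uncertainties).

```python
import math

def gas_phase_enthalpy(dfH_cr, u_cr, dsubH, u_sub):
    """Delta_f H(g) = Delta_f H(cr) + Delta_sub H, uncertainties added in quadrature."""
    return dfH_cr + dsubH, math.sqrt(u_cr ** 2 + u_sub ** 2)

value, unc = gas_phase_enthalpy(117.4, 4.1, 101.3, 0.8)
print(f"{value:.1f} +/- {unc:.1f} kJ/mol")  # reproduces the reported 218.7 +/- 4.2 kJ/mol
```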
Difference of refraction values between standard autorefractometry and Plusoptix.
Bogdănici, Camelia Margareta; Săndulache, Codrina Maria; Vasiliu, Rodica; Obadă, Otilia
2016-01-01
Aim: Comparison of the objective refraction measurement results determined with the Topcon KR-8900 standard autorefractometer and the Plusoptix A09 photo-refractometer in children. Material and methods: A prospective transversal study was performed in the Department of Ophthalmology of "Sf. Spiridon" Hospital in Iași on 90 eyes of 45 pediatric patients, with a mean age of 8.82 ± 3.52 years, examined with noncycloplegic measurements provided by the Plusoptix A09 and cycloplegic and noncycloplegic measurements provided by the Topcon KR-8900 standard autorefractometer. The clinical parameters compared were the following: spherical equivalent (SE), spherical and cylindrical values, and cylinder axis. Astigmatism was recorded and evaluated with the cylindrical value in minus form after transposition. The statistical analysis was performed with paired t-tests and Pearson's correlation analysis. All the data were analyzed with the SPSS statistical package 19 (SPSS for Windows, Chicago, IL). Results: Plusoptix A09 noncycloplegic values were relatively equal between the eyes, with slightly lower values compared to noncycloplegic autorefractometry. Mean (± SD) measurements provided by the Plusoptix A09 were the following: spherical power 1.11 ± 1.52, cylindrical power 0.80 ± 0.80, and spherical equivalent 0.71 ± 1.39. The noncycloplegic autorefractometer mean (± SD) measurements were spherical power 1.12 ± 1.63, cylindrical power 0.79 ± 0.77 and spherical equivalent 0.71 ± 1.58. The cycloplegic autorefractometer mean (± SD) measurements were spherical power 2.08 ± 1.95, cylindrical power 0.82 ± 0.85 and spherical equivalent 1.68 ± 1.87. 32% of the eyes were hyperopic, 2.67% were myopic, 65.33% had astigmatism, and 30% of the eyes had amblyopia. Conclusions: Noncycloplegic objective refraction values were similar to those determined by autorefractometry. The Plusoptix has an important role in ophthalmological screening, but did not detect higher refractive errors, justifying cycloplegic autorefractometry.
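A one-line worked example of the spherical equivalent, the derived quantity compared here; SE is conventionally the sphere plus half the cylinder (a standard optometric formula), and the sphere and cylinder values used below are illustrative.

```python
def spherical_equivalent(sphere_d: float, cylinder_d: float) -> float:
    """Spherical equivalent in diopters: sphere + cylinder / 2."""
    return sphere_d + cylinder_d / 2.0

# Illustrative refraction: +1.25 D sphere with -0.75 D cylinder
print(spherical_equivalent(1.25, -0.75))  # -> 0.875 D
```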
Chemical Identity Crisis: Glass and Glassblowing in the Identification of Organic Compounds.
Jackson, Catherine M
2015-04-01
This essay explains why and how nineteenth-century chemists sought to stabilize the melting and boiling points of organic substances as reliable characteristics of identity and purity and how, by the end of the century, they established these values as 'Constants of Nature'. Melting and boiling points as characteristic values emerge from this study as products of laboratory standardization, developed by chemists in their struggle to classify, understand and control organic nature. A major argument here concerns the role played by the introduction of organic synthesis in driving these changes. Synthetic organic chemistry vastly increased the number of known organic substances, precipitating the chemical identity crisis of my title. Successful natural product synthesis, moreover, depended on chemists' ability to demonstrate the absolute identity of synthetic product and natural target--something late nineteenth-century chemists eventually achieved by making reliable, replicable melting and boiling point measurements. In the period before the establishment of national standards laboratories, chemists and scientific glassblowers worked together to standardize melting and boiling points as physical constants, such collaborations highlighting the essential importance of chemical glassware and glassblowing skill in the development of nineteenth-century organic chemistry.
Ja'afar, Fairuzeta; Leow, Chee Hau; Garbin, Valeria; Sennoga, Charles A; Tang, Meng-Xing; Seddon, John M
2015-11-01
Microbubble (MB) contrast-enhanced ultrasonography is a promising tool for targeted molecular imaging. It is important to determine the MB surface charge accurately as it affects the MB interactions with cell membranes. In this article, we report the surface charge measurement of SonoVue, Definity and Optison. We compare the performance of the widely used laser Doppler electrophoresis with an in-house micro-electrophoresis system. By optically tracking MB electrophoretic velocity in a microchannel, we determined the zeta potentials of MB samples. Using micro-electrophoresis, we obtained zeta potential values for SonoVue, Definity and Optison of -28.3, -4.2 and -9.5 mV, with relative standard deviations of 5%, 48% and 8%, respectively. In comparison, laser Doppler electrophoresis gave -8.7, +0.7 and +15.8 mV with relative standard deviations of 330%, 29,000% and 130%, respectively. We found that the reliability of laser Doppler electrophoresis is compromised by MB buoyancy. Micro-electrophoresis determined zeta potential values with a 10-fold improvement in relative standard deviation. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
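A minimal sketch of converting a tracked electrophoretic velocity into a zeta potential via the Smoluchowski relation ζ = ημ_e/ε with μ_e = v/E; using the Smoluchowski limit is an assumption (the paper's exact conversion is not stated), and the field strength and drift velocity below are invented.

```python
EPS0 = 8.854e-12     # vacuum permittivity, F/m
EPS_R_WATER = 78.5   # relative permittivity of water near 25 C
ETA_WATER = 0.89e-3  # dynamic viscosity of water, Pa*s

def zeta_smoluchowski(velocity_m_s, field_V_m, eta=ETA_WATER, eps_r=EPS_R_WATER):
    """Zeta potential (volts) from electrophoretic velocity in a known field,
    assuming the Smoluchowski (thin double layer) limit."""
    mobility = velocity_m_s / field_V_m          # electrophoretic mobility, m^2/(V*s)
    return eta * mobility / (eps_r * EPS0)

# Hypothetical microchannel tracking result: 22 um/s drift in a 1 kV/m field
zeta = zeta_smoluchowski(velocity_m_s=-22e-6, field_V_m=1000.0)
print(f"zeta ~ {zeta * 1000:.1f} mV")  # lands near the reported SonoVue value of about -28 mV
```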
An environmental decision framework applied to marine engine control technologies.
Corbett, James J; Chapman, David
2006-06-01
This paper develops a decision framework for considering emission control technologies on marine engines, informed by standard decision theory, with an open structure that may be adapted by operators with specific vessel and technology attributes different from those provided here. Attributes relate objectives important to choosing control technologies with specific alternatives that may meet several of the objectives differently. The transparent framework enables multiple stakeholders to understand how different subjective judgments and varying attribute properties may result in different technology choices. Standard scoring techniques ensure that attributes are not biased by subjective scoring and that weights are the primary quantitative input where subjective preferences are exercised. An expected value decision structure is adopted that considers probabilities (likelihood) that a given alternative can meet its claims; alternative decision criteria are discussed. Capital and annual costs are combined using a net present value approach. An iterative approach is advocated that allows for screening and disqualifying alternatives that do not meet minimum conditions for acceptance, such as engine warranty or U.S. Coast Guard requirements. This decision framework assists vessel operators in considering explicitly important attributes and in representing choices clearly to other stakeholders concerned about reducing air pollution from vessels. This general decision structure may also be applied similarly to other environmental controls in marine applications.
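As a generic illustration of combining capital and annual costs with a net present value approach, the sketch below discounts recurring costs to the present; the discount rate, horizon, and cost figures are hypothetical, not values from the framework.

```python
def npv_of_costs(capital_cost, annual_cost, discount_rate, years):
    """Net present value of a control technology's costs: capital now plus discounted annual costs."""
    return capital_cost + sum(annual_cost / (1 + discount_rate) ** t for t in range(1, years + 1))

# Hypothetical marine engine control option: $500k installed, $60k/yr operating, 7% over 10 years
print(round(npv_of_costs(500_000, 60_000, 0.07, 10)))
```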
Zhuang, Ping; Lu, Huanping; Li, Zhian; Zou, Bi; McBride, Murray B
2014-01-01
The objective of this study was to investigate the levels of Cd, Pb, Cu and Zn in the environment and several important food sources grown and consumed in the vicinity of Dabaoshan mine in Southern China, and evaluate potential health risks among local residents. The Cd, Pb, Cu and Zn concentrations of arable soils and well water near the mines exceeded the quality standard values. The concentrations of Cd and Pb in some food crops (rice grain, vegetable and soybean) samples were significantly higher than the maximum permissible level. The Cd and Pb concentrations in half of the chicken and fish meat samples were higher than the national standard. The residents living near Dabaoshan mine had higher Cd and Pb levels in hair than those of a non-exposed population. The intake of rice was identified as a major contributor to the estimated daily intake of these metals by the residents. The hazard index values for adults and children were 10.25 and 11.11, respectively, with most of the estimated risks coming from the intake of home-grown rice and vegetables. This study highlights the importance of multiple pathways in studying health risk assessment of heavy metal exposure in China.
Ridley, W.I.; Stetson, S.J.
2006-01-01
There are seven stable isotopes of Hg that can be fractionated as a result of inorganic and organic interactions. Important inorganic reactions involve speciation changes resulting from variations in environmental redox conditions, and phase changes resulting from variations in temperature and/or atmospheric pressure. Important organic reactions include methylation and demethylation, reactions that are bacterially mediated, and complexing with organic anions in soils. The measurement of Hg isotopes by multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS) is now sufficiently precise and sensitive that it is potentially possible to develop the systematics of Hg isotopic fractionation. This provides an opportunity to evaluate the utility of Hg isotopes in identifying source processes, transport mechanisms, and sinks. New values are provided for 201Hg/198Hg, 200Hg/198Hg and 199Hg/198Hg for three standard materials (IRMM-AE639, SRM 1641c, SRM 3133) that can be used to make inter-laboratory data comparisons, and these values are tabulated with published isotopic information. Overall, the isotopic data for these standards agree to approximately 0.2‰. The paper reviews Hg isotope studies that deal with hydrothermal ore deposits, sediments, coal and organic complexing. © 2006 Elsevier Ltd. All rights reserved.
A counterfactual p-value approach for benefit-risk assessment in clinical trials.
Zeng, Donglin; Chen, Ming-Hui; Ibrahim, Joseph G; Wei, Rachel; Ding, Beiying; Ke, Chunlei; Jiang, Qi
2015-01-01
Clinical trials generally allow various efficacy and safety outcomes to be collected for health interventions. Benefit-risk assessment is an important issue when evaluating a new drug. Currently, there is a lack of standardized and validated benefit-risk assessment approaches in drug development due to various challenges. To quantify benefits and risks, we propose a counterfactual p-value (CP) approach. Our approach considers a spectrum of weights for weighting benefit-risk values and computes the extreme probabilities of observing the weighted benefit-risk value in one treatment group as if patients were treated in the other treatment group. The proposed approach is applicable to single benefit and single risk outcome as well as multiple benefit and risk outcomes assessment. In addition, the prior information in the weight schemes relevant to the importance of outcomes can be incorporated in the approach. The proposed CPs plot is intuitive with a visualized weight pattern. The average area under CP and preferred probability over time are used for overall treatment comparison and a bootstrap approach is applied for statistical inference. We assess the proposed approach using simulated data with multiple efficacy and safety endpoints and compare its performance with a stochastic multi-criteria acceptability analysis approach.
NASA Astrophysics Data System (ADS)
Yogish, H.; Chandrashekara, K.; Pramod Kumar, M. R.
2012-11-01
India is looking at renewable alternative sources of energy to reduce its dependence on imported crude oil. As India imports 70% of its crude oil, the country has been greatly affected by increasing costs and uncertainty. Biodiesel derived by two-step acid transesterification of mixed non-edible oils from Jatropha curcas and Pongamia (karanja) can meet the requirements of diesel fuel in the coming years. In the present study, different proportions of methanol, sodium hydroxide and sulfuric acid, and variations of reaction time and reaction temperature, were adopted in order to optimize the experimental conditions for maximum biodiesel yield. The preliminary studies revealed that the biodiesel yield varied widely in the range of 75-95% using the laboratory scale reactor; an average yield of 95% was obtained. The fuel and chemical properties of the biodiesel, namely kinematic viscosity, specific gravity, density, flash point, fire point, calorific value, pH, acid value, iodine value, sulfur content, water content, glycerin content and sulfated ash, were found to be within the limits suggested by the Bureau of Indian Standards (BIS 15607:2005). The optimum combination of methanol, sodium hydroxide, sulfuric acid, reaction time and reaction temperature was established.
1993-05-01
This Principles and Practices Board project was undertaken in response to the frequent requests from HFMA members for a standard calculation of "days of revenue in receivables." The board's work on this project indicated that every element of the calculation required standards, which is what this statement provides. Since there have been few standards for accounts receivable related to patient services, the industry follows a variety of practices, which often differ from each other. This statement is intended to provide a framework for enhanced external comparison of accounts receivable related to patient services, and thereby improve management information related to this very important asset. Thus, the standards described in this statement represent long-term goals for gradual transition of recordkeeping practices and not a sudden or revolutionary change. The standards described in this statement will provide the necessary framework for the most meaningful external comparisons. Furthermore, management's understanding of deviations from these standards will immediately assist in analysis of differences in data between providers.
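As a generic illustration of the kind of calculation the statement standardizes, the commonly used form of days of revenue in receivables is net patient accounts receivable divided by average daily net patient service revenue; the exact elements to include and the averaging period are precisely what such standards specify, so the figures and choices below are placeholder assumptions.

```python
def days_in_receivables(net_patient_ar, net_patient_revenue, period_days=365):
    """Days of revenue in receivables = net AR / (net patient revenue / days in period)."""
    average_daily_revenue = net_patient_revenue / period_days
    return net_patient_ar / average_daily_revenue

# Placeholder figures: $12.5M net AR against $90M annual net patient service revenue
print(round(days_in_receivables(12_500_000, 90_000_000), 1))  # ~50.7 days
```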
U.S. Geological Survey, remote sensing, and geoscience data: Using standards to serve us all
Benson, Michael G.; Faundeen, John L.
2000-01-01
The U.S. Geological Survey (USGS) advocates the use of standards with geosciences and remotely sensed data and metadata for its own purposes and those of its customers. In activities that range from archiving data to making a product, the incorporation of standards makes these functions repeatable and understandable. More important, when accepted standards are followed, data discovery and sharing can be more efficient and the overall value to society can be expanded. The USGS archives many terabytes of digital geoscience and remotely sensed data. Several million photographs are also available to the research community. To manage these vast holdings and ensure that strict preservation and high usability criteria are observed, the USGS uses standards within the archival, data management, public access and ordering, and data distribution areas. The USGS uses Federal and international standards in performing its role as the U.S. National Satellite Land Remote Sensing Data Archive and in its mission as the long-term archive and production center for aerial photographs and cartographic data covering the United States.
Kalateh Sadati, Ahmad; Iman, Mohammad Taghi; Bagheri Lankarani, Kamran
2014-07-01
Despite its benefits and importance, clinical counseling affects the patient both psychologically and socially. Illness labeling not only leads to many problems for the patient and his/her family but also imposes high costs on the health care system. Among various factors, the doctor-patient relationship has an important role in clinical counseling and its medical approach. The goal of this study is to evaluate the nature of clinical counseling based on a critical approach. The context of the research is the second major medical training center in Shiraz, Iran. In this study, Critical Conversation Analysis was used, based on the methodologies of critical theories. Among about 50 consultation meetings digitally recorded, 33 were selected for this study. Results show that the nature of the doctor-patient relationship in these cases is based on a paternalistic model. On the other hand, in all consultations, the important values legitimated by the physicians were medical paraclinical standards. Paternalism on one hand and standardization on the other lead to dependency of patients on the clinic. Although the paraclinical standards cannot be disregarded, clinical counseling and the doctor-patient relationship need to reduce this dominance over counseling by grounding it in the interpretation of human relations, paying attention to patients' social and economic differences and to biosocial and biocultural differences, and focusing on clinical examinations.
Determination of the Characteristic Values and Variation Ratio for Sensitive Soils
NASA Astrophysics Data System (ADS)
Milutinovici, Emilia; Mihailescu, Daniel
2017-12-01
In 2008, Romania adopted Eurocode 7, part II, regarding geotechnical investigations, as SR EN 1997-2/2008. A previous standard already existed in Romania, and by using mathematical statistics in the determination of the calculation values, the requirements of the Eurocode can be taken into consideration. The setting of the characteristic and calculation values of geotechnical parameters was finally addressed in Romania at the end of 2010 in standard NP 122-2010, "Norm regarding determination of the characteristic and calculation values of the geotechnical parameters". This standard allows the use of data already known from the analysed area when setting the calculation values of geotechnical parameters. Although this possibility exists, it is not easily exercised in Romania, since there is no centralized system for the information coming from geotechnical studies performed for various objectives of private or national interest. Every company performing geotechnical studies tries to organize its own database, but unfortunately none of them can use existing centralized data. When determining the calculation values, an important role is played by the variation ratio of the characteristic values of a geotechnical parameter. There are recommendations in the mentioned Norm that could be taken into account regarding the limits of the variation ratio, but these values are given only for Quaternary-age soils, normally consolidated, with a content of organic material < 5%. All of the difficult soils are excluded from the Norm, even though they exist and affect construction foundations on more than half of Romania's surface. One type of difficult soil, extremely widespread on Romania's territory, is the contractile soil (with high swelling and shrinkage, very sensitive to seasonal moisture variations). This type of material covers and influences construction foundations in over one third of Romania's territory. This work is intended as a step towards determining the limits of the variation ratios for the contractile soil category, for the geotechnical parameters most used in Romanian engineering practice, namely the consistency index and the cohesion.
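A minimal sketch of the statistics involved: the variation ratio (coefficient of variation) of a parameter and a Eurocode-7-style characteristic value taken as the mean reduced by a statistical margin; the factor k_n used below is a generic placeholder, not a value from NP 122-2010 or SR EN 1997-2, and the test results are hypothetical.

```python
from statistics import mean, stdev

def variation_ratio(values):
    """Coefficient of variation of a geotechnical parameter."""
    return stdev(values) / mean(values)

def characteristic_value(values, k_n=0.5, lower=True):
    """Cautious estimate of the mean: X_k = mean * (1 -/+ k_n * V).
    k_n is a placeholder; in practice it depends on sample size and prior knowledge."""
    v = variation_ratio(values)
    return mean(values) * (1 - k_n * v if lower else 1 + k_n * v)

# Hypothetical cohesion test results (kPa) from one site
cohesion = [42.0, 47.5, 39.0, 44.0, 41.5, 45.5]
print(round(variation_ratio(cohesion), 3), round(characteristic_value(cohesion), 1))
```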
42 CFR 495.348 - Procurement standards.
Code of Federal Regulations, 2013 CFR
2013-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION STANDARDS FOR THE ELECTRONIC HEALTH RECORD TECHNOLOGY INCENTIVE... solicit nor accept gratuities, favors, or anything of monetary value from contractors, or parties to sub... or the gift is an unsolicited item of nominal value. (5) The standards of conduct provide for...
42 CFR 495.348 - Procurement standards.
Code of Federal Regulations, 2012 CFR
2012-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION STANDARDS FOR THE ELECTRONIC HEALTH RECORD TECHNOLOGY INCENTIVE... solicit nor accept gratuities, favors, or anything of monetary value from contractors, or parties to sub... or the gift is an unsolicited item of nominal value. (5) The standards of conduct provide for...
42 CFR 495.348 - Procurement standards.
Code of Federal Regulations, 2014 CFR
2014-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION STANDARDS FOR THE ELECTRONIC HEALTH RECORD TECHNOLOGY INCENTIVE... solicit nor accept gratuities, favors, or anything of monetary value from contractors, or parties to sub... or the gift is an unsolicited item of nominal value. (5) The standards of conduct provide for...
Measurement of diffusion coefficients from solution rates of bubbles
NASA Technical Reports Server (NTRS)
Krieger, I. M.
1979-01-01
The rate of solution of a stationary bubble is limited by the diffusion of dissolved gas molecules away from the bubble surface. Diffusion coefficients computed from measured rates of solution give mean values higher than accepted literature values, with standard errors as high as 10% for a single observation. Better accuracy is achieved with sparingly soluble gases, small bubbles, and highly viscous liquids. Accuracy correlates with the Grashof number, indicating that free convection is the major source of error. Accuracy should, therefore, be greatly increased in a gravity-free environment. The fact that the bubble will need no support is an additional important advantage of Spacelab for this measurement.
Flexible displays as key for high-value and unique automotive design
NASA Astrophysics Data System (ADS)
Isele, Robert
2011-03-01
Within the last few years, the car industry has changed very fast. Information and communication have become more important, and displays are now standard in nearly every car. But this is not the only trend that can be recognized in this industry. CO2 emissions, fuel prices, and the increasing traffic inside mega cities have initiated a big change in customer behavior. The big battle for the car industry will move into the interior extremely fast, and premium cars need more innovative design icons. Flexible displays are one big step that enables totally different designs and a new value for the driver experience.
Waters, Peter M
2015-01-01
The continuing increases in health care expenditures, as well as the importance of providing safe, effective, timely, patient-centered care, have brought government and commercial payer pressure on hospitals and providers to document the value of the care they deliver. This article introduces work at Boston Children's Hospital on time-driven activity-based accounting to determine the cost of care delivery, combined with Systemic Clinical Assessment and Management Plans to reduce variation and improve outcomes. The focus so far has been on distal radius fracture care for children and adolescents.
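A minimal sketch of the time-driven activity-based costing arithmetic mentioned here: the cost of an episode is the sum over process steps of step time multiplied by the capacity cost rate of the resource performing it; the steps, times, and rates below are hypothetical, not Boston Children's Hospital figures.

```python
def tdabc_cost(steps):
    """steps: list of (minutes, cost_rate_per_minute) pairs for each process step."""
    return sum(minutes * rate for minutes, rate in steps)

# Hypothetical distal radius fracture visit: (minutes, $/minute capacity cost rate)
visit = [
    (10, 0.80),   # intake / registration
    (15, 1.20),   # radiology technologist + imaging room
    (12, 6.50),   # orthopedic surgeon evaluation
    (20, 1.50),   # cast application by technician
]
print(f"episode cost ~ ${tdabc_cost(visit):.2f}")
```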
Work as a cultural and personal value: attitudes towards work in Polish society.
Skarzyńska, Krystyna
2002-01-01
The meaning of work for Poles is analyzed here from 2 perspectives: macrosocial and individual. From the macrosocial perspective work attitudes are explained by 3 factors: traditional Polish Catholicism, cultural patterns (influence of noble class tradition), and experience of "real socialism." From an individual perspective some psychological and demographic predictors of an autonomous (intrinsic) work attitude are empirically tested. The autonomous attitude towards work is understood here as treating work as an important autonomous value versus only an instrumental means for earning money. The data was collected by means of standardized interviews run on a representative random sample of adult working Poles, N = 1340.
NASA Astrophysics Data System (ADS)
Doustdar, O.; Wyszynski, M. L.; Mahmoudi, H.; Tsolakis, A.
2016-09-01
Bio-fuel produced from renewable sources is considered one of the most viable alternatives for the replacement of mineral diesel fuel in compression ignition engines. There are several options for producing biomass-derived fuels, involving chemical, biological and thermochemical processes. One of the best options is Fischer-Tropsch synthesis (FTS), which has an extensive history of gasoline and diesel production from coal and natural gas. FTS fuel could be one of the best answers to fuel emissions because of its high quality. FTS experiments were carried out under 16 different operating conditions in a mini structured vertical downdraft fixed-bed reactor. Instead of biomass gasification, a simulated N2-rich syngas from a cylinder containing 33% H2 and 50% N2 was used. The FT liquid products were analyzed by GC-MS to determine their hydrocarbon distributions. The calorific value and lubricity of the liquid FT product were measured and compared with commercial diesel fuel. Lubricity has become an important quality, particularly for biodiesel, because the higher pressures in new diesel fuel injection (DFI) technology demand better lubrication from the fuel, and the calorific value, which is the amount of energy released in combustion, plays a very important role in CI engines. The results show that the prepared FT fuel has desirable properties and complies with standard values. The lubricity of the FT samples, measured according to the ASTM D6079 standard, varies from 286 μm to 417 μm (HFRR scar diameter), below the limit of 520 μm. The net calorific value of the FT fuels varies from 9.89 MJ/kg to 43.29 MJ/kg, with six of the samples below the EN 14213 limit of 35 MJ/kg. The effect of reaction conditions on FT fuel properties was investigated, showing that the liquid product obtained at higher Fischer-Tropsch reaction pressure has better properties.
McLaren, Stuart J; Page, Wyatt H; Parker, Lou; Rushton, Martin
2013-12-19
An evaluation of 28 commercially available toys imported into New Zealand revealed that 21% of these toys do not meet the acoustic criteria in the ISO standard, ISO 8124-1:2009 Safety of Toys, adopted by Australia and New Zealand as AS/NZS ISO 8124.1:2010. While overall the 2010 standard provided a greater level of protection than the earlier 2002 standard, there was one high risk toy category where the 2002 standard provided greater protection. A secondary set of toys from the personal collections of children known to display atypical methods of play with toys, such as those with autism spectrum disorders (ASD), was part of the evaluation. Only one of these toys cleanly passed the 2010 standard, with the remainder failing or showing a marginal-pass. As there is no tolerance level stated in the standards to account for interpretation of data and experimental error, a value of +2 dB was used. The findings of the study indicate that the current standard is inadequate in providing protection against excessive noise exposure. Amendments to the criteria have been recommended that apply to the recently adopted 2013 standard. These include the integration of the new approaches published in the recently amended European standard (EN 71) on safety of toys.
Attributions of poverty among social work and non-social work students in Croatia.
Ljubotina, Olja Druzić; Ljubotina, Damir
2007-10-01
To investigate how students in Croatia perceive causes of poverty and to examine the differences in attributions of poverty between students of social work, economics, and agriculture. The study included 365 participants, students of social work (n=143), economics (n=137), and agriculture (n=82). We used the newly developed Attribution of Poverty Scale, consisting of 4 factors, as follows: individual causes of poverty (eg, lack of skills and capabilities, lack of effort, poor money management, alcohol abuse); micro-environmental causes (eg, poor family, region, single parenthood); structural/societal causes (eg, poor economy, consequences of political transition, war); and fatalistic causes (eg, bad luck, fate, God's will). We also used a questionnaire that measured 5 dimensions of students' personal values: humanistic values, family values, striving for self-actualization, traditional values, and hedonistic values. In both questionnaires, items were rated on a 5-point Likert-type scale. Students of all three faculties put most emphasis on structural causes of poverty (mean+/-standard deviation=3.54+/-0.76 on a 1-5 scale), followed by environmental (3.18+/-0.60), individual (2.95+/-0.68), and fatalistic causes (1.81+/-0.74). Social work students perceived individual factors as significantly less important causes of poverty (ANOVA, F-value=12.55, P<0.001) than students of economics and agriculture. We found a correlation between humanistic values and perceived structural (r=0.267, P<0.001) and micro-environmental causes of poverty (r=0.185, P<0.001), and also between traditional values and structural (r=0.168, P<0.001), micro-environmental (r=0.170, P<0.001), and fatalistic causes of poverty (r=0.149, P<0.001). Students see structural/societal factors, such as poor economy and political transition as main causes of poverty in Croatia. Individual factors connected with individual's personal characteristics were considered less important, while luck and fate were considered as least important. Students of social work perceived individual causes to be less important than students of agriculture and economics. Students with strong humanistic and traditional values put more emphasis on external sources of poverty.
Characterizing the mammography technologist workforce in North Carolina
Henderson, Louise M.; Marsh, Mary W.; Benefield, Thad; Pearsall, Elizabeth; Durham, Danielle; Schroeder, Bruce F.; Bowling, J. Michael; Viglione, Cheryl A.; Yankaskas, Bonnie C.
2016-01-01
Background Mammography technologists’ level of training, years of experience, and feedback on technique may play an important role in the breast cancer screening process. However, very little information on the mammography technologist workforce exists. Methods In 2013, we conducted a mailed survey to 912 mammography technologists working in 224 Mammography Quality Standards Act accredited facilities in North Carolina. Using standard survey methodology we developed and implemented a questionnaire focused on the education and training, work experiences, and workplace interactions of mammography technologists. We aggregated responses using survey weights to account for non-response. We describe and compare lead (administrative responsibilities) and non-lead (supervised by another technologist) mammography technologist characteristics, testing for differences using t-tests and chi-square tests. Results A total of 433 mammography technologists responded (survey response rate=47.5%; 95% confidence interval:44.2%-50.7%), including 128 lead and 305 non-lead technologists. Most mammography technologists were non-Hispanic, white, females and the average age was 48 years. Approximately 93% of lead and non-lead technologists had mammography specific training but <4% had sonography certification and 3% had MRI certification. Lead technologists reported more years performing screening mammography (p-value=0.02) and film mammography (p-value=0.03), more administrative hours (p-value<0.0001), and more workplace autonomy (p-value=0.002) than non-lead technologists. Non-lead technologists were more likely to report performing diagnostic mammograms (p-value=0.0004) or other breast imaging (p-value=0.001), discuss image quality with a peer (p-value=0.013), and have frequent face-to-face interaction with radiologists (p-value=0.03). Conclusion Our findings offer insights into mammography technologists’ training and work experiences, highlighting variability in technologist characteristics between lead and non-lead technologists. PMID:26614888
Kocovsky, Patrick M.; Rudstam, Lars G.; Yule, Daniel L.; Warner, David M.; Schaner, Ted; Pientka, Bernie; Deller, John W.; Waterfield, Holly A.; Witzel, Larry D.; Sullivan, Patrick J.
2013-01-01
Standardized methods of data collection and analysis ensure quality and facilitate comparisons among systems. We evaluated the importance of three recommendations from the Standard Operating Procedure for hydroacoustics in the Laurentian Great Lakes (GLSOP) on density estimates of target species: noise subtraction; setting volume backscattering strength (Sv) thresholds from user-defined minimum target strength (TS) of interest (TS-based Sv threshold); and calculations of an index for multiple targets (Nv index) to identify and remove biased TS values. Eliminating noise had the predictable effect of decreasing density estimates in most lakes. Using the TS-based Sv threshold decreased fish densities in the middle and lower layers in the deepest lakes with abundant invertebrates (e.g., Mysis diluviana). Correcting for biased in situ TS increased measured density up to 86% in the shallower lakes, which had the highest fish densities. The current recommendations by the GLSOP significantly influence acoustic density estimates, but the degree of importance is lake dependent. Applying GLSOP recommendations, whether in the Laurentian Great Lakes or elsewhere, will improve our ability to compare results among lakes. We recommend further development of standards, including minimum TS and analytical cell size, for reducing the effect of biased in situ TS on density estimates.
Farnbach, Sara; Eades, Anne-Maree; Gwynn, Josephine D; Glozier, Nick; Hackett, Maree L
2018-06-14
Objectives and importance of study: Values and ethics: guidelines for ethical conduct in Aboriginal and Torres Strait Islander health research (Values and ethics) describes key values that should underpin Aboriginal and Torres Strait Islander (Indigenous)-focused health research. It is unclear how research teams address this document in primary health care research. We systematically review the primary health care literature focusing on Indigenous social and emotional wellbeing (SEWB) to identify how Values and ethics and community preferences for standards of behaviour (local protocols) are addressed during research. Systematic review in accordance with PRISMA Guidelines and MOOSE Guidelines for Meta-Analyses and Systematic Reviews of Observational Studies. We searched four databases and one Indigenous-specific website for qualitative, quantitative and mixed-method studies published since Values and ethics was implemented (2003). Included studies were conducted in primary health care services, focused on Indigenous SEWB and were conducted by research teams. Using standard data extraction forms, we identified actions taken (reported by authors or identified by us) relating to Values and ethics and local protocols. A total of 25 studies were included. Authors of two studies explicitly mentioned the Values and ethics document, but neither reported how their actions related to the document's values. In more than half the studies, we identified at least three actions relating to the values. Some actions related to multiple values, including use of culturally sensitive research processes and involving Indigenous representatives in the research team. Local protocols were rarely reported. Addressing Values and ethics appears to improve research projects. The academic community should focus on culturally sensitive research processes, relationship building and developing the Indigenous research workforce, to facilitate acceptable research that affects health outcomes. For Values and ethics to achieve its full impact and to improve learning between research teams, authors should be encouraged to report how the principles are addressed during research, including barriers and enablers that are encountered.
Analysis of the landscape complexity and heterogeneity of the Pantanal wetland.
Miranda, C S; Gamarra, R M; Mioto, C L; Silva, N M; Conceição Filho, A P; Pott, A
2018-05-01
This is the first report on the analysis of habitat complexity and heterogeneity of the Pantanal wetland. The Pantanal encompasses a peculiar mosaic of environments, so it is important to evaluate and monitor this area with regard to biodiversity conservation. Our objective was to indirectly measure the habitat complexity and heterogeneity of the mosaic forming the sub-regions of the Pantanal by means of remote sensing. We obtained free Normalized Difference Vegetation Index (NDVI) images from the MODIS sensor and calculated the mean value (complexity) and standard deviation (heterogeneity) for each sub-region in the years 2000, 2008 and 2015. The sub-regions of Poconé, Canoeira, Paraguai and Aquidauana presented the highest values of complexity (mean NDVI), between 0.69 and 0.64 in the evaluated years. The highest horizontal heterogeneity (NDVI standard deviation) was observed in the sub-region of Tuiuiú, with values of 0.19 in the years 2000 and 2015, and 0.21 in the year 2008. We conclude that the use of NDVI to estimate landscape parameters is an efficient tool for assessment and monitoring of the complexity and heterogeneity of the Pantanal habitats, applicable in other regions.
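The complexity/heterogeneity bookkeeping reduces to a per-region mean and standard deviation of NDVI; a small illustrative sketch (random stand-in data and hypothetical region labels, not the study's rasters):

    import numpy as np

    rng = np.random.default_rng(0)
    ndvi = rng.uniform(0.2, 0.9, size=(500, 500))                      # stand-in for a MODIS NDVI scene
    subregion = rng.choice(["Poconé", "Tuiuiú", "Aquidauana"], size=(500, 500))

    for name in np.unique(subregion):
        values = ndvi[subregion == name]
        print(f"{name}: complexity = {values.mean():.2f}, heterogeneity = {values.std():.2f}")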
Standardisation of costs: the Dutch Manual for Costing in economic evaluations.
Oostenbrink, Jan B; Koopmanschap, Marc A; Rutten, Frans F H
2002-01-01
The lack of a uniform costing methodology is often considered a weakness of economic evaluations that hinders the interpretation and comparison of studies. Standardisation is therefore an important topic within the methodology of economic evaluations and in national guidelines that formulate the formal requirements for studies to be considered when deciding on the reimbursement of new medical therapies. Recently, the Dutch Manual for Costing: Methods and Standard Costs for Economic Evaluations in Health Care (further referred to as "the manual") has been published, in addition to the Dutch guidelines for pharmacoeconomic research. The objectives of this article are to describe the main content of the manual and to discuss some key issues of the manual in relation to the standardisation of costs. The manual introduces a six-step procedure for costing. These steps concern: (1) the scope of the study; (2) the choice of cost categories; (3) the identification of units; (4) the measurement of resource use; (5) the monetary valuation of units; and (6) the calculation of unit costs. Each step consists of a number of choices and these together define the approach taken. In addition to a description of the costing process, five key issues regarding the standardisation of costs are distinguished. These are the use of basic principles, methods for measurement and valuation, standard costs (average prices of healthcare services), standard values (values that can be used within unit cost calculations), and the reporting of outcomes. The use of the basic principles, standard values and minimal requirements for reporting outcomes, as defined in the manual, are obligatory in studies that support submissions to acquire reimbursement for new pharmaceuticals. Whether to use standard costs, and the choice of a particular method to measure or value costs, is left mainly to the investigator, depending on the specific study setting. In conclusion, several instruments are available to increase standardisation in costing methodology among studies. These instruments have to be used in such a way that a balance is found between standardisation and the specific setting in which a study is performed. The way in which the Dutch manual tries to reach this balance can serve as an illustration for other countries.
Competence in chronic mental illness: the relevance of practical wisdom.
Widdershoven, Guy A M; Ruissen, Andrea; van Balkom, Anton J L M; Meynen, Gerben
2017-06-01
The concept of competence is central to healthcare because informed consent can only be obtained from a competent patient. The standard approach to competence focuses on cognitive abilities. Several authors have challenged this approach by emphasising the role of emotions and values. Combining cognition, emotion and values, we suggest an approach which is based on the notion of practical wisdom. This focuses on knowledge and on determining what is important in a specific situation and finding a balance between various values, which are enacted in an individual's personal life. Our approach is illustrated by two cases of patients with obsessive-compulsive disorder. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
A Framework for Enhancing the Value of Research for Dissemination and Implementation
Glasgow, Russell E.; Carpenter, Christopher R.; Grimshaw, Jeremy M.; Rabin, Borsika A.; Fernandez, Maria E.; Brownson, Ross C.
2015-01-01
A comprehensive guide that identifies critical evaluation and reporting elements necessary to move research into practice is needed. We propose a framework that highlights the domains required to enhance the value of dissemination and implementation research for end users. We emphasize the importance of transparent reporting on the planning phase of research in addition to delivery, evaluation, and long-term outcomes. We highlight key topics for which well-established reporting and assessment tools are underused (e.g., cost of intervention, implementation strategy, adoption) and where such tools are inadequate or lacking (e.g., context, sustainability, evolution) within the context of existing reporting guidelines. Consistent evaluation of and reporting on these issues with standardized approaches would enhance the value of research for practitioners and decision-makers. PMID:25393182
Study on small molecular organic compounds pyrolysed from rubber seed oil and its sodium soap.
Fernando, T L D; Prashantha, M A B; Amarasinghe, A D U S
2016-01-01
Rubber seed oil (RSO) and its sodium soap were pyrolysed in a batch reactor to obtain low molar mass organic substances. The pyrolytic oil of RSO was redistilled and the distillates were characterized by GC-MS and FTIR. Density, acid value, saponification value and ester value were also measured according to the ASTM standard methods. A similar analysis was done for samples taken out at different time intervals from the reaction mixture. Industrially important low molar mass alkanes, alkenes, aromatics, cyclic compounds and carboxylic acids were identified in the pyrolysis process of rubber seed oil. Pyrolysis of the sodium soap of rubber seed oil, however, gave a mixture of hydrocarbons in the range of C14-C17 and hence has more applications as a fuel.
Cultural competence in the era of evidence-based practice.
Engebretson, Joan; Mahoney, Jane; Carlson, Elizabeth D
2008-01-01
Cultural competence has become an important concern for contemporary health care delivery, with ethical and legal implications. Numerous educational approaches have been developed to orient clinicians, and standards and position statements promoting cultural competence have been published by both the American Medical Association and the American Nurses Association. Although a number of health care regulatory agencies have developed standards or recommendations, clinical application to patient care has been challenging. These challenges include the abstract nature of the concept, essentializing culture to race or ethnicity, and the attempts to associate culture with health disparities. To make cultural competence relevant to clinical practice, we linked a cultural competency continuum that identifies the levels of cultural competency (cultural destructiveness, cultural incapacity, cultural blindness, cultural precompetence, and cultural proficiency) to well-established values in health care. This situates cultural competence and proficiency in alignment with patient-centered care. A model integrating the cultural competency continuum with the components of evidence-based care (i.e., best research practice, clinical expertise, and patient's values and circumstances) is presented.
NASA Astrophysics Data System (ADS)
Erfaisalsyah, M. H.; Mansur, A.; Khasanah, A. U.
2017-11-01
For a company engaged in the textile field, selecting the suppliers of raw materials for production is an important part of supply chain management and can affect the company's business processes. This study aims to identify the best suppliers of yarn raw material for PC. PKBI based on several criteria. In this study, an integration of the Analytical Hierarchy Process (AHP) and the Standardized Unitless Rating (SUR) is used to assess the performance of the suppliers. AHP provides the relative weight of each criterion, while SUR yields the ranked performance value of each supplier. The resulting supplier ranking can be used to identify the strengths and weaknesses of each supplier with respect to the performance criteria. From the final result, it can be determined which suppliers should improve their performance in order to create long-term cooperation with the company.
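One common way to combine the two methods (a sketch with invented judgments and scores, not the study's data): AHP weights are taken as the normalized principal eigenvector of a pairwise-comparison matrix, each criterion's raw scores are standardized to unitless values, and the weighted sum ranks the suppliers.

    import numpy as np

    # Pairwise comparison of 3 criteria (e.g., price, quality, delivery) - assumed judgments
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    vals, vecs = np.linalg.eig(A)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    w = w / w.sum()                                    # AHP criterion weights

    # Raw supplier scores per criterion (rows = suppliers, higher = better), hypothetical
    scores = np.array([[78.0, 8.2, 4.0],
                       [85.0, 7.5, 5.0],
                       [70.0, 9.0, 3.5]])
    z = (scores - scores.mean(axis=0)) / scores.std(axis=0, ddof=1)   # standardized unitless ratings
    print("weights:", np.round(w, 3), "| ranking scores:", np.round(z @ w, 3))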
The probability of probability and research truths.
Fatovich, Daniel M; Phillips, Michael
2017-04-01
The foundation of much medical research rests on the statistical significance of the P-value, but we have fallen prey to the seductive certainty of significance. Other scientific disciplines work to a different standard. This may partly explain why medical reversal is an increasing phenomenon, whereby new studies (based on the 0.05 standard) overturn previous significant findings. This has generated a crisis in the rigour of evidence-based medicine, as many people erroneously believe that a P < 0.05 means the treatment effect is clinically important. However, statistics are not facts about the world. Nor should they be based on an arbitrary threshold that arose for historical reasons. This arbitrary threshold encourages an unthinking automatic response that contributes to industry's influence on medical research. Examples from emergency medicine practice illustrate these themes. Study replication needs to be valued as much as discovery. Careful and thoughtful unbiased thinking about the results we do have is undervalued. © 2017 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
The Role of Religiousness and Gender in Sexuality Among Mexican Adolescents.
Espinosa-Hernández, Graciela; Bissell-Havran, Joanna; Nunn, Anna
2015-01-01
This study examined the role of religiousness and gender in age at first intercourse, and sexual expectations and values in Mexico, a macrocontext where the majority is Mexican and Catholic (83%). Participants were Catholic and nonreligious adolescents (54% girls) attending middle (71%) or high school. Findings indicated that Catholic adolescents engaged in sexual intercourse at later ages than nonreligious adolescents. Both religious attendance and importance of religion and values in sexual decision making were associated with more conservative sexual values. Boys who were raised Catholic were more likely to endorse female virginity values and were less likely to expect to wait to have sex until married than nonreligious boys. These associations were not significant among girls. Catholic boys may be more likely to internalize sexual double standard beliefs regarding premarital sex than nonreligious boys. This study expands our understanding of the role of religiousness in Mexican adolescents' sexuality.
Regulations on consume and commercialization of food irradiation in Mexico
NASA Astrophysics Data System (ADS)
Bustos Ramírez, Ma. Emilia; Jiménez Pérez, Jesús
1995-02-01
A Mexican standard for food irradiation is ready for final publication after the authority received and reviewed public comments on the draft published in April 1994. The standard establishes the radiation doses for different classes of food, based on ICGFI recommendations. Also included are controls for sampling, packaging, labelling, transportation, process inspection, and conformity with international regulations. The results of the economic cost-benefit analysis of the application of the standard show that the net present value is positive. The method of calculation is presented, explaining the assumptions considered for the estimation of the total annual savings and surveillance costs. A final version of the research program report on radiation quarantine treatment of Mexican mangoes will be used for the petition to APHIS to amend quarantine procedures to permit importation into the USA of irradiated products.
Extraction of Coastlines with Fuzzy Approach Using SENTINEL-1 SAR Image
NASA Astrophysics Data System (ADS)
Demir, N.; Kaynarca, M.; Oy, S.
2016-06-01
Coastlines are important features for water resources, sea products, energy resources, etc. Coastlines change dynamically, so automated methods are necessary for analysing and detecting changes along them. In this study, a Sentinel-1 C-band SAR image has been used to extract the coastline with a fuzzy logic approach. The SAR image has VH polarisation and 10 m x 10 m spatial resolution, and covers a 57 sq km area in the south-east of Puerto Rico. Radiometric calibration is applied to reduce atmospheric and orbit errors, and a speckle filter is used to reduce noise. The image is then terrain-corrected using the SRTM digital surface model. Classification of SAR imagery is a challenging task, since SAR and optical sensors have very different properties; even between different bands of SAR sensors, the images look very different, so classification of SAR imagery is difficult with traditional unsupervised methods. In this study, a fuzzy approach has been applied to distinguish coastal pixels from land surface pixels. The mean, median and standard deviation are calculated for use as parameters in the fuzzy approach. The Mean-Standard-deviation (MS) Large membership function is used because large amounts of land and ocean pixels dominate the SAR image, with large mean and standard deviation values. The pixel values are multiplied by 1000 to simplify the calculations. The mean is calculated as 23 and the standard deviation as 12 for the whole image. The multiplier parameters are selected as a: 0.58 and b: 0.05 to maximize the land surface membership. The result is evaluated, first, using airborne LIDAR data for the areas where a LIDAR dataset is available and, second, against a manually digitized coastline. Laser points below 0.5 m are classified as ocean points. The 3D alpha-shapes algorithm is used to detect the coastline points from the LIDAR data, and minimum distances are calculated between the LIDAR coastline points and the extracted coastline. The statistics of these distances are as follows: the mean is 5.82 m, the standard deviation is 5.83 m and the median value is 4.08 m. Second, the extracted coastline is also evaluated against lines created manually on the SAR image. Both lines are converted to dense points at 1 m intervals, and the closest distances are calculated between the points from the extracted coastline and the manually created coastline. The mean is 5.23 m, the standard deviation is 4.52 m and the median value is 4.13 m for the calculated distances. The evaluation values are within the accuracy of the SAR data used, for both quality assessment approaches.
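For reference, the MS Large membership function as commonly defined (e.g., in ArcGIS Spatial Analyst) is mu(x) = 1 - (b*s)/(x - a*m + b*s) for x > a*m and 0 otherwise, with m the mean and s the standard deviation. A sketch using the parameters quoted above (the pixel values themselves are hypothetical):

    import numpy as np

    def ms_large(x, m, s, a, b):
        mu = 1.0 - (b * s) / (x - a * m + b * s)
        return np.where(x > a * m, np.clip(mu, 0.0, 1.0), 0.0)

    pixels = np.array([2.0, 10.0, 14.0, 23.0, 40.0, 60.0])   # scaled (x1000) backscatter values
    print(ms_large(pixels, m=23.0, s=12.0, a=0.58, b=0.05).round(3))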
Fluid-flow-rate metrology: laboratory uncertainties and traceabilities
NASA Astrophysics Data System (ADS)
Mattingly, G. E.
1991-03-01
Increased concerns for improved fluid flowrate measurement are driving the fluid metering community - meter manufacturers and users alike - to search for better verification and documentation for their fluid measurements. These concerns affect both our domestic and international marketplaces; they permeate our technologies - aerospace, chemical processing, automotive, bioengineering, etc. They involve public health and safety, and they impact our national defense. These concerns are based upon the rising value of fluid resources and products and the importance of critical material accountability. These values directly impact the accuracy needs of fluid buyers and sellers in custody transfers. These concerns impact the designers and operators of chemical process systems, where control and productivity optimization depend critically upon measurement precision. Public health and safety depend upon the quality of numerous pollutant measurements, both liquid and gaseous. The performance testing of engines - both automotive and aircraft - is critically based upon accurate fuel measurements, for both liquid and oxidizer streams. Fluid flowrate measurements are established differently from their counterparts in length and mass measurement systems, because the latter have the benefit of "identity" standards. For rate measurement systems the metrology is based upon "derived standards". These use facilities and transfer standards which are designed, built, characterized and used to constitute basic measurement capabilities and quantify performance - accuracy and precision. Because "identity standards" do not exist for flow measurements, facsimiles or equivalents must
Do volunteer community-based preceptors value students' feedback?
Dent, M Marie; Boltri, John; Okosun, Ike S
2004-11-01
A key component of educational practice is to provide feedback and evaluation to teachers and learners to improve the teaching and learning process. The purpose of this study was to determine whether volunteer community preceptors value evaluation and feedback by students as much as they value other resources or rewards. In Fall 1999, a questionnaire concerning the resources and rewards of preceptorship was mailed to 236 community preceptors affiliated with the Mercer University School of Medicine, Macon, Georgia. Preceptors were asked to rate 20 factors on a five-point Likert scale (5 = very important to 1 = not very important). The mean values were compared using t-tests. One hundred sixty-eight preceptors (71%) completed questionnaires. Preceptors rated evaluation and feedback from students significantly higher (p < .001) than all other factors (mean = 4.02, standard deviation [SD] = .87). Continuing medical education for teaching was the next most highly valued factor (mean = 3.67, SD = 1.14). Preceptors rated financial compensation the lowest (mean = 2.01, SD = 1.19) of all factors. The high rank of feedback and evaluation from students persisted across gender, specialty, length of time as a preceptor, practice location, and years practicing medicine. This study demonstrates that feedback and evaluation from students is highly valued. The knowledge that community-based preceptors highly value feedback and evaluation from students should stimulate medical school programs to provide feedback and evaluation to preceptors that will enhance the educational outcomes for both faculty and learners.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaznovsky, A. P., E-mail: kaznovskyap@atech.ru; Kasiyanov, K. G.; Ryasnyj, S. I.
2015-01-15
A classification of the equipment important for the safety of nuclear power plants is proposed in terms of its dynamic behavior under seismic loading. An extended bank of data from dynamic tests over the entire range of thermal and mechanical equipment in generating units with VVER-1000 and RBMK-1000 reactors is analyzed. Results are presented from a study of the statistical behavior of the distribution of vibrational frequencies and damping decrements with the “small perturbation” factor that affects the measured damping decrements taken into account. A need to adjust the regulatory specifications for choosing the values of the damping decrements with specified inertial loads on equipment owing to seismic effects during design calculations is identified. Minimum values of the decrements are determined and proposed for all types of equipment as functions of the directions and natural vibration frequencies of the dynamic interactions, to be adopted as conservative standard values in the absence of actual experimental data in the course of design studies of seismic resistance.
Cost-effectiveness analyses and their role in improving healthcare strategies.
Rodriguez, Maria I; Caughey, Aaron B
2013-12-01
In this era of healthcare reform, attention is focused on increasing the quality of care and access to services, while simultaneously reducing the cost. Economic evaluations can play an important role in translating research to evidence-based practice and policy. Cost-effectiveness analysis (CEA) and its utility for clinical and policy decision making among U.S. obstetricians and gynecologists is reviewed. Three case examples demonstrating the value of this methodology in decision making are considered. A discussion of the methodologic principles of CEA, the advantages, and the limitations of the methodology are presented. CEA can play an important role in evidence-based decision making, with value for clinicians and policy makers alike. These studies are of particular interest in the field of obstetrics and gynecology, in which uncertainty from epidemiologic or clinical trials exists, or multiple perspectives need to be considered (maternal, neonatal, and societal). As with all research, it is essential that economic evaluations are conducted according to established methodologic standards. Interpretation and application of results should occur with a clear understanding of both the value and the limitations of economic evaluations.
NASA Astrophysics Data System (ADS)
Fournelle, J.; Hanchar, J. M.
2013-12-01
It is not commonly recognized as such, but the accurate measurement of Hf in zircon is not a trivial analytical issue. This is important to assess because Hf is often used as an internal standard for trace element analyses of zircon by LA-ICPMS. The issues pertaining to accuracy revolve around: (1) whether the Hf Ma or the La line is used; (2) what accelerating voltage is applied if Zr La is also measured; and (3) what standard for Hf is used. Wiedenbeck et al.'s (2004) study of the 91500 zircon demonstrated the spread (in accuracy) of possible EPMA values for six EPMA labs, 2 of which used Hf Ma, 3 used Hf La, and one used Hf Lb, with standards ranging from HfO2, a ZrO2-HfO2 compound, and Hf metal to hafnon. Wiedenbeck et al. used the ID-TIMS value (0.695 wt.% Hf) as the correct value, and not one of the EPMA labs came close to it (3 were low and 3 were high). Those data suggest: (1) that there is a systematic underestimation of the 0.695 wt% Hf (ID-TIMS) value if Hf Ma is used, most likely an issue with the matrix correction, as the analytical lines and absorption edges of Zr La, Si Ka and Hf Ma are rather tightly packed in the electromagnetic spectrum, and mass absorption coefficients are easily in error (e.g., Donovan's determination of the MAC of Hf by Si Ka of 5061 differs from the typically used Henke value of 5449; Donovan et al., 2002); and (2) that for utilization of the Hf La line, the second-order Zr Ka line interferes with Hf La if the accelerating voltage is greater than 17.99 keV. If this higher accelerating voltage is used and differential-mode PHA is applied, only a portion of the interference is removed (e.g., removal of escape peaks), causing an overestimation of the Hf content. Unfortunately, it is virtually impossible to apply an interference correction in this case, as an Hf-free Zr probe standard cannot be found. We have examined many of the combinations used by those six EPMA labs and concluded that the optimal EPMA is done with Hf La at an accelerating voltage under 18 keV (17 keV is optimal), and with synthetic stoichiometric hafnon as the standard. We have developed useful standards that are to be distributed to the community for researchers working on this problem; they can be obtained from the second author at jhanchar@mun.ca. The standards include synthetic stoichiometric undoped zircon and hafnon, and synthetic zircon doped with 2 wt.% Hf. References: Donovan et al. (2002) Probe for Windows: User's Guide and Reference. Wiedenbeck, M., et al. (2004) Further characterisation of the 91500 zircon crystal, Geostandards and Geoanalytical Research, 28: 9-39.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoogheem, T.J.; Woods, L.A.
The Monsanto Analytical Testing (MAT) program was devised and implemented in order to provide analytical standards to Monsanto manufacturing plants involved in the self-monitoring of plant discharges as required by National Pollutant Discharge Elimination System (NPDES) permit conditions. Standards are prepared and supplied at concentration levels normally observed at each individual plant. These levels were established by canvassing all Monsanto plants having NPDES permits and by determining which analyses and concentrations were most appropriate. Standards are prepared by Monsanto's Environmental Sciences Center (ESC) using Environmental Protection Agency (EPA) methods. Eleven standards are currently available, each in three concentrations. Standards are issued quarterly in a company internal round-robin program or on a per request basis or both. Since initiation of the MAT program in 1981, the internal round-robin program has become an integral part of Monsanto's overall Good Laboratory Practices (GLP) program. Overall, results have shown that the company's plant analytical personnel can accurately analyze and report standard test samples. More importantly, such personnel have gained increased confidence in their ability to report accurate values for compounds regulated in their respective plant NPDES permits. 3 references, 3 tables.
Toccalino, Patricia L.; Norman, Julia E.; Phillips, Robyn H.; Kauffman, Leon J.; Stackelberg, Paul E.; Nowell, Lisa H.; Krietzman, Sandra J.; Post, Gloria B.
2004-01-01
A state-scale pilot effort was conducted to evaluate a Health-Based Screening Level (HBSL) approach developed for communicating findings from the U.S. Geological Survey (USGS) National Water-Quality Assessment Program in a human-health context. Many aquifers sampled by USGS are used as drinking-water sources, and water-quality conditions historically have been assessed by comparing measured contaminant concentrations to established drinking-water standards and guidelines. Because drinking-water standards and guidelines do not exist for many analyzed contaminants, HBSL values were developed collaboratively by the USGS, U.S. Environmental Protection Agency (USEPA), New Jersey Department of Environmental Protection, and Oregon Health & Science University, using USEPA toxicity values and USEPA Office of Water methodologies. The main objective of this report is to demonstrate the use of HBSL approach as a tool for communicating water-quality data in a human-health context by conducting a retrospective analysis of ground-water quality data from New Jersey. Another important objective is to provide guidance on the use and interpretation of HBSL values and other human-health benchmarks in the analyses of water-quality data in a human-health context. Ground-water samples collected during 1996-98 from 30 public-supply, 82 domestic, and 108 monitoring wells were analyzed for 97 pesticides and 85 volatile organic compounds (VOCs). The occurrence of individual pesticides and VOCs was evaluated in a human-health context by calculating Benchmark Quotients (BQs), defined as ratios of measured concentrations of regulated compounds (that is, compounds with Federal or state drinking-water standards) to Maximum Contaminant Level (MCL) values and ratios of measured concentrations of unregulated compounds to HBSL values. Contaminants were identified as being of potential human-health concern if maximum detected concentrations were within a factor of 10 of the associated MCL or HBSL (that is, maximum BQ value (BQmax) greater than or equal to 0.1) in any well type (public supply, domestic, monitoring). Most (57 of 77) pesticides and VOCs with human-health benchmarks were detected at concentrations well below these levels (BQmax less than 0.1) for all three well types; however, BQmax values ranged from 0.1 to 3,000 for 6 pesticides and 14 VOCs. Of these 20 contaminants, one pesticide (dieldrin) and three VOCs (1,2-dibromoethane, tetrachloroethylene, and trichloroethylene) both (1) were measured at concentrations that met or exceeded MCL or HBSL values, and (2) were detected in more than 10 percent of samples collected from raw ground water used as sources of drinking water (public-supply and (or) domestic wells) and, therefore, are particularly relevant to human health. The occurrence of multiple pesticides and VOCs in individual wells also was evaluated in a human-health context because at least 53 different contaminants were detected in each of the three well types. To assess the relative human-health importance of the occurrence of multiple contaminants in different wells, the BQ values for all contaminants in a given well were summed. The median ratio of the maximum BQ to the sum of all BQ values for each well ranged from 0.83 to 0.93 for all well types, indicating that the maximum BQ makes up the majority of the sum for most wells. Maximum and summed BQ values were statistically greater for individual public-supply wells than for individual domestic and monitoring wells. 
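The Benchmark Quotient arithmetic described above is simple to reproduce; a hedged sketch with hypothetical concentrations and benchmarks (BQ = measured concentration divided by the MCL for regulated compounds or the HBSL for unregulated ones, flagged when BQ >= 0.1, and summed per well):

    # Hypothetical well data: (measured concentration, applicable benchmark), both in ug/L
    detections = {
        "dieldrin":            (0.004, 0.002),   # unregulated -> HBSL
        "tetrachloroethylene": (2.0,   5.0),     # regulated   -> MCL
        "atrazine":            (0.05,  3.0),
    }

    bqs = {name: conc / bench for name, (conc, bench) in detections.items()}
    flagged = [name for name, bq in bqs.items() if bq >= 0.1]
    print("BQ values:", {k: round(v, 3) for k, v in bqs.items()})
    print("potential human-health concern:", flagged, "| summed BQ:", round(sum(bqs.values()), 3))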
The HBSL approach is an effective tool for placing water-quality data in a human-health context. For 79 of the 182 compounds analyzed in this study, no USEPA drinking-water standards or guidelines exist, but new HBSL values were calculated for 39 of these 79 compounds. The new HBSL values increased the number of detected pesticides and VOCs with human-health benchmarks from 65 to 77 (of 97 detected compounds), thereby expanding the basis for interpreting contaminant-occu
NASA Astrophysics Data System (ADS)
Gonzalez, Yvonne Lynne
This qualitative, descriptive multiple case study took place in a Southwest Texas city bordering Mexico. The study examined specific resources and practices used in four different exemplary-rated elementary school campuses, with standardized test data reflecting 93% or more of their 5th-grade Hispanic student population passing the state mandated standardized science test. The sample for this study included one principal, one assistant principal, and three 5th-grade teachers from each campus. In total, the sample participants consisted of four principals, four assistant principals, and 12 5th-grade teachers. Data collection involved conducting in-depth, semi-structured interviews guided by five literature-based, researcher-generated questions. Fifth grade teachers and administrators reflected and reported upon their pedagogy for best practices in helping Hispanic students achieve success. Analysis of the data revealed eight themes: (a) successful schools have committed teachers, an environment conducive to learning, and incorporate best practices that work for all students; (b) curriculum alignment is very important; (c) teachers have access to a variety of resources; (d) teacher collaboration and planning is very important; (e) science camps, science reviews, and hands-on centers are effective in preparing students for the standardized test; (f) the most effective instructional practices include high emphasis on vocabulary, hands-on and differentiated instruction, and the 5E Model; (g) teachers see value in self-contained, dual-language classes; and (h) professional development and performance feedback are important to educators. The results of this study provide educational leaders with specific science instructional resources, practices, and interventions proven effective for the 5 th-grade Hispanic student population in passing the science state standardized test.
Service quality and perceived customer value in community pharmacies.
Guhl, Dennis; Blankart, Katharina E; Stargardt, Tom
2018-01-01
A patient's perception of the service provided by a health care provider is essential for the successful delivery of health care. This study examines the value created by community pharmacies-defined as perceived customer value-in the prescription drug market through varying elements of service quality. We develop a path model that describes the relationship between service elements and perceived customer value. We then analyze the effect of perceived customer value on customer satisfaction and loyalty. We use data obtained from 289 standardized interviews on respondents' prescription fill in the last six months in Germany. The service elements personal interaction (path coefficient: 0.31), physical aspect (0.12), store policy (0.24), and availability (0.1) have a positive significant effect on perceived customer value. Consultation and reliability have no significant influence. We further find a strong positive interdependency between perceived customer value, customer satisfaction (0.75), and customer loyalty (0.71). Thus, pharmacies may enhance customer satisfaction and loyalty if they consider the customer perspective and focus on the relevant service elements. To enhance benefit, personal interaction appears to be most important to address appropriately.
Wickramasinghe, Kremlin; Rayner, Mike; Goldacre, Michael; Townsend, Nick; Scarborough, Peter
2017-01-01
Objectives: The aim of this modelling study was to estimate the expected changes in the nutritional quality and greenhouse gas emissions (GHGEs) of primary school meals due to the adoption of new mandatory food-based standards for school meals. Setting: Nationally representative random sample of 136 primary schools in England was selected for the Primary School Food Survey (PSFS) with 50% response rate. Participants: A sample of 6690 primary students from PSFS who consumed school meals. Outcome measures: Primary School Food Plan (SFP) nutritional impact was assessed using both macronutrient and micronutrient quality. The environmental impact was measured by GHGEs. Methods: The scenario tested was one in which every meal served in schools met more than half of the food-based standards mentioned in the SFP (SFP scenario). We used findings from a systematic review to assign GHGE values for each food item in the data set. The GHGE value and nutritional quality of SFP scenario meals was compared with the average primary school meal in the total PSFS data set (pre-SFP scenario). Prior to introduction of the SFP (pre-SFP scenario), the primary school meals had mandatory nutrient-based guidelines. Results: The percentage of meals that met the protein standard increased in the SFP scenario and the proportion of meals that met the standards for important micronutrients (eg, iron, calcium, vitamin A and C) also increased. However, the SFP scenario did not improve the salt, saturated fat and free sugar levels. The mean GHGE value of meals which met the SFP standards was 0.79 (95% CI 0.77 to 0.81) kgCO2e compared with a mean value of 0.72 (0.71 to 0.74) kgCO2e for all meals. Adopting the SFP would increase the total emissions associated with primary school meals by 22 000 000 kgCO2e per year. Conclusions: The universal adoption of the new food-based standards, without reformulation, would result in an increase in the GHGEs of school meals and improve some aspects of the nutritional quality, but it would not improve the average salt, sugar and saturated fat content levels. PMID:28381419
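The per-meal GHGE bookkeeping amounts to assigning an emission factor to each food item and summing over the meal; a short illustrative sketch (factors and portions invented, not the review's values):

    ghge_per_kg = {"beef": 27.0, "chicken": 6.9, "potatoes": 0.5,
                   "vegetables": 2.0, "milk": 1.3}             # kgCO2e per kg (assumed)

    meals = [
        {"beef": 0.05, "potatoes": 0.12, "vegetables": 0.08, "milk": 0.20},
        {"chicken": 0.07, "potatoes": 0.10, "vegetables": 0.10, "milk": 0.20},
    ]  # edible portions in kg

    per_meal = [sum(ghge_per_kg[f] * amt for f, amt in meal.items()) for meal in meals]
    print("kgCO2e per meal:", [round(x, 2) for x in per_meal],
          "| mean:", round(sum(per_meal) / len(per_meal), 2))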
Estimating missing daily temperature extremes in Jaffna, Sri Lanka
NASA Astrophysics Data System (ADS)
Thevakaran, A.; Sonnadara, D. U. J.
2018-04-01
The accuracy of reconstructing missing daily temperature extremes in the Jaffna climatological station, situated in the northern part of the dry zone of Sri Lanka, is presented. The adopted method utilizes standard departures of daily maximum and minimum temperature values at four neighbouring stations, Mannar, Anuradhapura, Puttalam and Trincomalee to estimate the standard departures of daily maximum and minimum temperatures at the target station, Jaffna. The daily maximum and minimum temperatures from 1966 to 1980 (15 years) were used to test the validity of the method. The accuracy of the estimation is higher for daily maximum temperature compared to daily minimum temperature. About 95% of the estimated daily maximum temperatures are within ±1.5 °C of the observed values. For daily minimum temperature, the percentage is about 92. By calculating the standard deviation of the difference in estimated and observed values, we have shown that the error in estimating the daily maximum and minimum temperatures is ±0.7 and ±0.9 °C, respectively. To obtain the best accuracy when estimating the missing daily temperature extremes, it is important to include Mannar which is the nearest station to the target station, Jaffna. We conclude from the analysis that the method can be applied successfully to reconstruct the missing daily temperature extremes in Jaffna where no data is available due to frequent disruptions caused by civil unrests and hostilities in the region during the period, 1984 to 2000.
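A minimal sketch of the standard-departure estimation (all climatological means, standard deviations and observations below are invented): each neighbour's value is converted to a standardized departure, the departures are averaged, and the average is rescaled with the target station's statistics.

    import numpy as np

    # (mean, std) of daily maximum temperature at the neighbouring stations, deg C (assumed)
    clim = {"Mannar": (33.1, 1.8), "Anuradhapura": (32.4, 2.1),
            "Puttalam": (31.9, 1.9), "Trincomalee": (32.8, 2.3)}
    jaffna_mean, jaffna_std = 31.5, 1.7        # target-station climatology (assumed)

    obs = {"Mannar": 35.0, "Anuradhapura": 34.2, "Puttalam": 33.5, "Trincomalee": 34.9}

    z = np.mean([(obs[s] - m) / sd for s, (m, sd) in clim.items()])
    print(f"estimated Jaffna Tmax: {jaffna_mean + z * jaffna_std:.1f} deg C")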
ERIC Educational Resources Information Center
Rianawati
2017-01-01
This paper is based on the importance of moral character, which is formed by cultural values and national character and can also be found in the various subjects contained in the national curriculum standards. The foundation of moral development is the Qur'an and Hadith, as enacted in National Education Law No. 20/2003, which stated…
Gries, Katharine S; Regier, Dean A; Ramsey, Scott D; Patrick, Donald L
2017-06-01
To develop a statistical model generating utility estimates for prostate cancer specific health states, using preference weights derived from the perspectives of prostate cancer patients, men at risk for prostate cancer, and society. Utility estimate values were calculated using standard gamble (SG) methodology. Study participants valued 18 prostate-specific health states with the five attributes: sexual function, urinary function, bowel function, pain, and emotional well-being. Appropriateness of model (linear regression, mixed effects, or generalized estimating equation) to generate prostate cancer utility estimates was determined by paired t-tests to compare observed and predicted values. Mixed-corrected standard SG utility estimates to account for loss aversion were calculated based on prospect theory. 132 study participants assigned values to the health states (n = 40 men at risk for prostate cancer; n = 43 men with prostate cancer; n = 49 general population). In total, 792 valuations were elicited (six health states for each 132 participants). The most appropriate model for the classification system was a mixed effects model; correlations between the mean observed and predicted utility estimates were greater than 0.80 for each perspective. Developing a health-state classification system with preference weights for three different perspectives demonstrates the relative importance of main effects between populations. The predicted values for men with prostate cancer support the hypothesis that patients experiencing the disease state assign higher utility estimates to health states and there is a difference in valuations made by patients and the general population.
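A hedged sketch of the kind of mixed-effects fit described (synthetic data, hypothetical attribute names): attribute main effects are estimated while repeated standard-gamble valuations by the same participant share a random intercept.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_part, n_states = 40, 6
    df = pd.DataFrame({
        "participant": np.repeat(np.arange(n_part), n_states),
        "sexual": rng.integers(0, 2, n_part * n_states),
        "urinary": rng.integers(0, 2, n_part * n_states),
        "pain": rng.integers(0, 2, n_part * n_states),
    })
    df["utility"] = (0.90 - 0.10 * df["sexual"] - 0.08 * df["urinary"]
                     - 0.12 * df["pain"] + rng.normal(0, 0.05, len(df)))

    model = smf.mixedlm("utility ~ sexual + urinary + pain", df, groups=df["participant"])
    print(model.fit().summary())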
Simpson, Christine A; Cusano, Anna Maria; Bihuniak, Jessica; Walker, Joanne; Insogna, Karl L
2015-04-01
The Vitamin D Standardization Program (VDSP) has identified ID-LC/MS/MS as the reference method procedure (RMP) for 25(OH) vitamin D and NIST Standard SRM2972 as the standard reference material (SRM). As manufacturers align their products to the RMP and NIST standard, a concern is that results obtained in aligned assays will be divergent from those obtained with pre-alignment assays. The Immunodiagnostic Systems Ltd., chemiluminescent, 25(OH) vitamin D iSYS platform assay, was recently harmonized to the RMP. To determine the impact of standardization on results obtained with iSYS reagents, 119 single donor serum samples from eight different disease categories were analyzed in four non-standardized and two standardized iSYS assays. There were strong correlations between the four non-standardized and two standardized assays with Spearman's rank r values between 0.975 and 0.961 and four of the eight r values were >0.97. R(2) values for the eight best-fit linear regression equations ranging between 0.947 and 0.916. None of the slopes were found to be significantly different from one another. Bland-Altman plots showed that the bias was comparable when each of the four non-standardized assays was compared to either of the standardized assays. When the data were segregated in values between 6 and 49ng/mL (15-122nmol/L) or between 50 and 100ng/mL (125-250nmol/L) significant associations remained between results obtained with non-standardized and standardized calibrators regardless of the absolute value. When five recent DEQAS unknowns were analyzed in one non-standardized and one standardized assay the mean percent difference from the NIST target in values obtained using standardized vs. non-standardized calibrators were not significantly different. Finally, strong and statistically significant associations between the results were obtained using non-standardized and standardized assays for six of eight clinical conditions. The only exceptions were hypocalcemia and breast cancer, which likely reflect the small sample sizes for each of these diseases. These initial data provide confidence that the move to a NIST standardized assay will have little impact on results obtained with the iSYS platform. This article is part of a Special Issue entitled '17th Vitamin D Workshop'. Copyright © 2014 Elsevier Ltd. All rights reserved.
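For illustration, the two comparisons used above (Spearman rank correlation and Bland-Altman bias with 95% limits of agreement) on synthetic paired 25(OH)D results; none of the numbers come from the study.

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    standardized = rng.uniform(6, 100, 119)                         # ng/mL, hypothetical sera
    non_standardized = 1.03 * standardized + rng.normal(0, 3, 119)

    rho, _ = spearmanr(non_standardized, standardized)
    diff = non_standardized - standardized
    bias, half_width = diff.mean(), 1.96 * diff.std(ddof=1)
    print(f"Spearman r = {rho:.3f}; bias = {bias:.2f} ng/mL; "
          f"LoA = {bias - half_width:.2f} to {bias + half_width:.2f} ng/mL")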
Kim, Jae Hyun; Yun, Sungha; Hwang, Seung-sik; Shim, Jung Ok; Chae, Hyun Wook; Lee, Yeoun Joo; Lee, Ji Hyuk; Kim, Soon Chul; Lim, Dohee; Yang, Sei Won
2018-01-01
Growth charts are curves or tables that facilitate the visualization of anthropometric parameters, and are widely used as an important indicator when evaluating the growth status of children and adolescents. The latest version of the Korean National Growth Charts, released in 2007, raised concerns regarding the inclusion of data from both breastfed and formula-fed infants, higher body mass index (BMI) values in boys, and smaller 3rd percentile values in height-for-age charts. Thus, new growth charts have been developed to improve the previous version. The 2006 World Health Organization Child Growth Standards, regarded as the standard for breastfed infants and children, were introduced for children aged 0–35 months. For children and adolescents aged 3–18 years, these new growth charts include height-for-age, weight-for-age, BMI-for-age, weight-for-height, and head circumference-for-age charts, and were developed using data obtained in 1997 and 2005. Data sets and exclusion criteria were applied differently for the development of the different growth charts. BMI-for-age charts were adjusted to decrease the 95th percentile values of BMI. Criteria for obesity were simplified and defined as a BMI of ≥95th percentile for age and sex. The 3rd percentile values for height-for-age charts were also increased. Additional percentile lines (1st and 99th) and growth charts with standard deviation lines were introduced. The 2017 Korean National Growth Charts are recommended for the evaluation of body size and growth of Korean children and adolescents for use in clinics and the public health sector in Korea. PMID:29853938
Winners love winning and losers love money.
Kassam, Karim S; Morewedge, Carey K; Gilbert, Daniel T; Wilson, Timothy D
2011-05-01
Salience and satisfaction are important factors in determining the comparisons that people make. We hypothesized that people make salient comparisons first, and then make satisfying comparisons only if salient comparisons leave them unsatisfied. This hypothesis suggests an asymmetry between winning and losing. For winners, comparison with a salient alternative (i.e., losing) brings satisfaction. Therefore, winners should be sensitive only to the relative value of their outcomes. For losers, comparison with a salient alternative (i.e., winning) brings little satisfaction. Therefore, losers should be drawn to compare outcomes with additional standards, which should make them sensitive to both relative and absolute values of their outcomes. In Experiment 1, participants won one of two cash prizes on a scratch-off ticket. Winners were sensitive to the relative value of their prizes, whereas losers were sensitive to both the relative and the absolute values of their prizes. In Experiment 2, losers were sensitive to the absolute value of their prize only when they had sufficient cognitive resources to engage in effortful comparison.
NASA Astrophysics Data System (ADS)
Moustris, Konstantinos; Tsiros, Ioannis X.; Tseliou, Areti; Nastos, Panagiotis
2018-04-01
The present study deals with the development and application of artificial neural network (ANN) models to estimate the values of a complex human thermal comfort-discomfort index associated with urban heat and cool island conditions inside various urban clusters, using only air temperature data from a standard meteorological station as inputs. The index used in the study is the Physiologically Equivalent Temperature (PET) index, which requires as inputs, among others, air temperature, relative humidity, wind speed, and radiation (short- and long-wave components). For the estimation of hourly PET values, ANN models were developed, appropriately trained, and tested. Model results are compared to values calculated by the PET index based on field monitoring data for various urban clusters (street, square, park, courtyard, and gallery) in the city of Athens (Greece) during an extreme hot weather summer period. For the evaluation of the predictive ability of the developed ANN models, several statistical evaluation indices were applied: the mean bias error, the root mean square error, the index of agreement, the coefficient of determination, the true predictive rate, the false alarm rate, and the success index. According to the results, ANNs present a remarkable ability to estimate hourly PET values within various urban clusters using only hourly values of air temperature. This is very important in cases where human thermal comfort-discomfort conditions have to be analyzed and the only available parameter is air temperature.
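As a hedged sketch of the evaluation step described above, the snippet below computes three of the named indices (mean bias error, root mean square error, and Willmott's index of agreement) for a pair of observed and ANN-predicted PET series; the numeric arrays are hypothetical placeholders rather than study data.

```python
# Sketch of three of the evaluation indices named above, computed between
# observed PET values and ANN estimates. The arrays are hypothetical.
import numpy as np

observed = np.array([31.2, 34.5, 38.1, 40.3, 36.7])    # PET, degrees C
predicted = np.array([30.8, 35.1, 37.4, 41.0, 36.2])   # ANN output, degrees C

mbe = np.mean(predicted - observed)                   # mean bias error
rmse = np.sqrt(np.mean((predicted - observed) ** 2))  # root mean square error

# Willmott's index of agreement d (0 to 1, with 1 = perfect agreement)
obs_mean = observed.mean()
d = 1.0 - np.sum((predicted - observed) ** 2) / np.sum(
    (np.abs(predicted - obs_mean) + np.abs(observed - obs_mean)) ** 2
)

print(f"MBE = {mbe:.2f} C, RMSE = {rmse:.2f} C, d = {d:.3f}")
```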
78 FR 52679 - Safety Standard for Cigarette Lighters; Adjusted Customs Value for Cigarette Lighters
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-26
... Customs Value for Cigarette Lighters AGENCY: Consumer Product Safety Commission. ACTION: Final rule... refillable lighters that use butane or similar fuels and have a Customs Value or ex-factory price below a threshold value (initially set at $2.00 in 1993). The standard provides that the initial $2.00 value adjusts...
Formulae Based on Biomathematics to Estimate the Standard Value of Fetal Growth of Japanese.
Miyagi, Yasunari; Tada, Katsuhiko; Takayoshi, Riko; Oguni, Nobutsugu; Sato, Yasushi; Shibata, Maki; Kiyokawa, Machiko; Hashimoto, Tadashi; Takada, Tomoyoshi; Oda, Takashi; Miyake, Takahito
2018-04-01
We devised biomathematics-based formulae to estimate the standard values of fetal growth of Japanese fetuses after 22 weeks' gestation. The growth rate of each of the bi-parietal diameter (BPD), abdominal circumference (AC), femur length (FL), and estimated fetal body weight (EFBW) at a given gestational time was assumed to be proportional to the product of the current value and its remaining distance from an unknown maximum value (i.e., logistic growth). The EFBW was also assumed to follow a multiple logistic function of BPD, AC and FL, fitted to the standard values of Japanese fetuses published by the Japan Society of Ultrasonics in Medicine. The Mann-Whitney test was used for statistical analysis. The values as a function of gestational day, t, were as follows: BPD(t) = 99.6/(1 + exp(2.725 - 0.01837*t)) (mm); AC(t) = 39.7/(1 + exp(2.454 - 0.01379*t)) (cm); FL(t) = 79.6/(1 + exp(2.851 - 0.01710*t)) (mm); EFBW(t) = 8045.1/(1 + exp(6.028 - 0.06582*BPD(t) - 0.1469*AC(t) + 0.07377*FL(t))) (g). EFBW as a function of BPD, AC and FL was as follows: EFBW = 8045.1/(1 + exp(4.747 + 0.02584*BPD + 0.1010*AC - 0.1416*FL)) (g). When the BPD, AC and FL were at -2 standard deviations (SD), -1 SD, the mean, and +2 SD, the EFBW values calculated by the formula were statistically closer to the standard values than those from conventional formulae, with p-values of 4.871×10⁻⁷, 4.228×10⁻⁷, 9.777×10⁻⁷ and 0.028, respectively. The formulae based on biomathematics might be useful for estimating the fetal growth standard values.
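The published curves quoted above translate directly into code. The sketch below is a plain transcription of the formulae as printed, with t in gestational days; the example evaluation at 30 weeks is an illustrative choice, not a value taken from the paper.

```python
# Direct transcription of the formulae quoted above; t is the gestational
# day. The example evaluation at 30 weeks is illustrative only.
import math

def bpd(t):  # bi-parietal diameter, mm
    return 99.6 / (1 + math.exp(2.725 - 0.01837 * t))

def ac(t):   # abdominal circumference, cm
    return 39.7 / (1 + math.exp(2.454 - 0.01379 * t))

def fl(t):   # femur length, mm
    return 79.6 / (1 + math.exp(2.851 - 0.01710 * t))

def efbw_from_t(t):  # estimated fetal body weight, g, as a function of t
    return 8045.1 / (1 + math.exp(
        6.028 - 0.06582 * bpd(t) - 0.1469 * ac(t) + 0.07377 * fl(t)))

def efbw(bpd_mm, ac_cm, fl_mm):  # EFBW, g, from measured biometry
    return 8045.1 / (1 + math.exp(
        4.747 + 0.02584 * bpd_mm + 0.1010 * ac_cm - 0.1416 * fl_mm))

t = 30 * 7  # 30 weeks of gestation, expressed in days
print(f"BPD = {bpd(t):.1f} mm, AC = {ac(t):.1f} cm, FL = {fl(t):.1f} mm, "
      f"EFBW = {efbw_from_t(t):.0f} g")
```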
Efficient Estimation of the Standardized Value
ERIC Educational Resources Information Center
Longford, Nicholas T.
2009-01-01
We derive an estimator of the standardized value which, under the standard assumptions of normality and homoscedasticity, is more efficient than the established (asymptotically efficient) estimator and discuss its gains for small samples. (Contains 1 table and 3 figures.)
Professional nursing values: A concept analysis.
Schmidt, Bonnie J; McArthur, Erin C
2018-01-01
The aim of this concept analysis is to clarify the meaning of professional nursing values. In a time of increasing ethical dilemmas, it is essential that nurses internalize professional values to develop and maintain a professional identity. However, nursing organizations and researchers provide different conceptions of professional nursing values, leading to a lack of clarity as to the meaning and attributes of this construct. Walker and Avant's (2011) method was used to guide an analysis of this concept. Resources published from 1973 to 2016 were identified via electronic databases and hand-searching of reference lists. A review of the literature was completed and the data were analyzed to identify uses of the concept; the defining attributes of the concept; borderline, related, contrary, and illegitimate examples; antecedents and consequences; and empirical referents. Professional nursing values were defined as important professional nursing principles of human dignity, integrity, altruism, and justice that serve as a framework for standards, professional practice, and evaluation. Further research is needed in the development and testing of professional nursing values theory, and the reassessment of values instruments. Core professional values that are articulated may help unify the profession and demonstrate the value of nursing to the public. © 2017 Wiley Periodicals, Inc.
Skela-Savič, Brigita; Hvalič-Touzery, Simona; Pesjak, Katja
2017-08-01
To establish the connection between values, competencies, selected job characteristics and evidence-based practice use. Nurses rarely apply evidence-based practice in everyday work. A recent body of research has looked at various variables explaining the use of evidence-based practice, but not at values and competencies. A cross-sectional, non-experimental, quantitative explorative research design was used. Standardized instruments were employed (Nurse Professional Values Scale-R, Nurse Competence Scale, Evidence-Based Practice Beliefs and Implementation Scale). The sample included 780 nurses from 20 Slovenian hospitals. The data were collected in 2015. The study identifies two new variables contributing to a better understanding of beliefs about and implementation of evidence-based practice, thus broadening the existing research evidence. These are the values of activism and professionalism and competencies aimed at the development and professionalization of nursing. Values of caring, trust and justice and competencies expected in everyday practice do not influence the beliefs and implementation of evidence-based practice. Respondents ascribed less importance to values connected with activism and professionalism and to competencies connected with the development of professionalism. Nurses agree that evidence-based practice is useful in their clinical work, but they lack the knowledge to implement it in practice. Evidence-based practice implementation in nursing practice is low. Study results stress the importance of increasing knowledge and skills related to the professional values of activism and professionalism and to competencies connected with nursing development. The study expands the current understanding of evidence-based practice use and provides invaluable insight for nursing managers, higher education managers and the national nursing association. © 2017 John Wiley & Sons Ltd.
Suliman, Huda S; Fecura, Stephen E; Baskin, Jonathan; Kalns, John E
2011-06-01
Heat and moisture exchangers (HMEs) are used for airway humidification in mechanically ventilated patients and have been evaluated only under hospital conditions. U.S. Air Force aeromedical evacuation transports are performed under rugged conditions further complicated by the cold and dry environment in military aircraft, and HMEs are used to provide airway humidification for patients. This study evaluated 10 commercial HMEs using a test system that simulated aeromedical evacuation conditions. Although the American National Standards Institute recommends that inspired air for mechanically ventilated patients have an absolute humidity of ≥30 mg/L, the highest absolute humidity achieved by any HME was approximately 20 mg/L. Although none of the HMEs were able to maintain a temperature high enough to achieve the American National Standards Institute humidity standard, the clinical significance of this standard may be less important than the relative humidity maintained in the respired air, especially on evacuation flights of short duration.
Haiden, N; Pimpel, B; Assadian, O; Binder, C; Kreissl, A; Repa, A; Thanhäuser, M; Roberts, C D; Berger, A
2016-03-01
Bacterial counts in 1466 expressed breast milk (EBM) samples from women following one of two infection control regimens (standard vs strict) were investigated. Overall, 12% of samples yielded Gram-negative bacteria, with no significant differences between the standard [11.9% (94/788)] and strict [12.1% (82/678)] regimens (P = 0.92). Significantly more samples were contaminated when expressed at home (standard regimen home/hospital: 17.9% vs 6.1%; strict regimen home/hospital: 19.6% vs 3.4%; P < 0.001). Bacterial contamination of EBM was not associated with the regimen, but was associated with the location of breast milk expression. Attempts to improve personal hygiene during milk collection seem to be of limited value. Good hygiene of collection and storage equipment is likely to be the most important way to ensure the microbiological quality of EBM. Copyright © 2016 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
Utanohara, Yuri; Hayashi, Ryo; Yoshikawa, Mineka; Yoshida, Mitsuyoshi; Tsuga, Kazuhiro; Akagawa, Yasumasa
2008-09-01
It is clinically important to evaluate tongue function in terms of rehabilitation of swallowing and eating ability. We have developed a disposable tongue pressure measurement device designed for clinical use. In this study we used this device to determine standard values of maximum tongue pressure in adult Japanese. Eight hundred fifty-three subjects (408 male, 445 female; 20-79 years) were selected for this study. All participants had no history of dysphagia and maintained occlusal contact in the premolar and molar regions with their own teeth. A balloon-type disposable oral probe was used to measure tongue pressure by asking subjects to compress it onto the palate for 7 s with maximum voluntary effort. Values were recorded three times for each subject, and the mean values were defined as maximum tongue pressure. Although maximum tongue pressure was higher for males than for females in the 20-49-year age groups, there was no significant difference between males and females in the 50-79-year age groups. The maximum tongue pressure of the seventies age group was significantly lower than that of the twenties to fifties age groups. It may be concluded that maximum tongue pressures were reduced with primary aging. Males may become weaker with age at a faster rate than females; however, further decreases in strength were in parallel for male and female subjects.
On the Distribution of Protein Refractive Index Increments
Zhao, Huaying; Brown, Patrick H.; Schuck, Peter
2011-01-01
The protein refractive index increment, dn/dc, is an important parameter underlying the concentration determination and the biophysical characterization of proteins and protein complexes in many techniques. In this study, we examine the widely used assumption that most proteins have dn/dc values in a very narrow range, and reappraise the prediction of dn/dc of unmodified proteins based on their amino acid composition. Applying this approach in large scale to the entire set of known and predicted human proteins, we obtain, for the first time, to our knowledge, an estimate of the full distribution of protein dn/dc values. The distribution is close to Gaussian with a mean of 0.190 ml/g (for unmodified proteins at 589 nm) and a standard deviation of 0.003 ml/g. However, small proteins <10 kDa exhibit a larger spread, and almost 3000 proteins have values deviating by more than two standard deviations from the mean. Due to the widespread availability of protein sequences and the potential for outliers, the compositional prediction should be convenient and provide greater accuracy than an average consensus value for all proteins. We discuss how this approach should be particularly valuable for certain protein classes where a high dn/dc is coincidental to structural features, or may be functionally relevant such as in proteins of the eye. PMID:21539801
40 CFR 80.1405 - What are the Renewable Fuel Standards?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Renewable Fuel Standards? (a) Renewable Fuel Standards for 2011. (1) The value of the cellulosic biofuel... be 0.69 percent. (3) The value of the advanced biofuel standard for 2011 shall be 0.78 percent. (4... ER10MY10.003 ER10MY10.004 Where: StdCB,i = The cellulosic biofuel standard for year i, in percent. StdBBD,i...
Nyokabi, Simon; Birner, Regina; Bett, Bernard; Isuyi, Linda; Grace, Delia; Güttler, Denise; Lindahl, Johanna
2018-03-01
Zoonotic diseases, transmitted from animals to humans, are a public health challenge in developing countries. Livestock value chain actors have an important role to play as the first line of defence in safeguarding public health. However, although the livelihood and economic impacts of zoonoses are widely known, adoption of biosecurity measures aimed at preventing zoonoses is low, particularly among actors in informal livestock value chains in low- and middle-income countries. The main objective of this study was to investigate knowledge of zoonoses and adoption of biosecurity measures by livestock and milk value chain actors in Bura, Tana River County, in Kenya, where cattle, camels, sheep and goats are the main livestock kept. The study utilised a mixed methods approach, with a questionnaire survey administered to 154 value chain actors. Additional information was elicited through key informant interviews and participatory methods with relevant stakeholders outside the value chain. Our results showed low levels of knowledge of zoonoses and low adherence to food safety standards: only 37% of milk traders knew about brucellosis, despite a sero-prevalence of 9% in the small ruminants tested in this study, and no slaughterhouse worker knew about Q fever. Actors had little formal education (between 0 and 10%) and lacked training in food safety and biosecurity measures. Adoption of biosecurity measures by value chain actors was very low or non-existent, with only 11% of butchers wearing gloves. There was a gendered dimension, evidenced by markedly different participation in value chains and lower adoption rates and knowledge levels among female actors. Finally, cultural and religious practices were shown to play an important role in exposure and transmission of diseases, influencing perceptions of and attitudes to risks and the adoption of biosecurity measures.
Comparing interval estimates for small sample ordinal CFA models
Natesan, Prathiba
2015-01-01
Robust maximum likelihood (RML) and asymptotically generalized least squares (AGLS) methods have been recommended for fitting ordinal structural equation models. Studies show that some of these methods underestimate standard errors. However, these studies have not investigated the coverage and bias of interval estimates. An estimate with a reasonable standard error could still be severely biased. This can only be known by systematically investigating the interval estimates. The present study compares Bayesian, RML, and AGLS interval estimates of factor correlations in ordinal confirmatory factor analysis (CFA) models for small sample data. Six sample sizes, 3 factor correlations, and 2 factor score distributions (multivariate normal and multivariate mildly skewed) were studied. Two Bayesian prior specifications, informative and relatively less informative, were studied. Undercoverage of confidence intervals and underestimation of standard errors were common in non-Bayesian methods. Underestimated standard errors may lead to inflated Type-I error rates. Non-Bayesian intervals were more positively than negatively biased; that is, most intervals that did not contain the true value were greater than the true value. Some non-Bayesian methods had non-converging and inadmissible solutions for small samples and non-normal data. Bayesian empirical standard error estimates for informative and relatively less informative priors were closer to the average standard errors of the estimates. The coverage of Bayesian credibility intervals was closer to what was expected, with overcoverage in a few cases. Although some Bayesian credibility intervals were wider, they reflected the nature of statistical uncertainty that comes with the data (e.g., small sample). Bayesian point estimates were also more accurate than non-Bayesian estimates. The results illustrate the importance of analyzing the coverage and bias of interval estimates, and how ignoring interval estimates can be misleading. Therefore, editors and policymakers should continue to emphasize the inclusion of interval estimates in research. PMID:26579002
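The notions of interval coverage and directional bias discussed above can be illustrated with a toy simulation. The sketch below uses ordinary t-based confidence intervals for a mean rather than the ordinal CFA models of the study, so it is a conceptual illustration only.

```python
# Toy illustration (not the study's CFA models): simulate many small samples,
# build a nominal 95% t-interval for the mean in each, and tally coverage and
# the direction of the misses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_mean, n, reps = 0.5, 20, 10_000
covered = misses_above = 0

for _ in range(reps):
    x = rng.normal(true_mean, 1.0, size=n)
    se = x.std(ddof=1) / np.sqrt(n)
    tcrit = stats.t.ppf(0.975, df=n - 1)
    lo, hi = x.mean() - tcrit * se, x.mean() + tcrit * se
    if lo <= true_mean <= hi:
        covered += 1
    elif lo > true_mean:  # the whole interval lies above the true value
        misses_above += 1

print(f"coverage = {covered / reps:.3f}")
print(f"misses lying entirely above the true value: {misses_above}")
```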
Gil, Emilio; Llorens, Jordi; Gallart, Montserrat; Gil-Ribes, Jesús A; Miranda-Fuentes, Antonio
2018-06-15
The current standard for field measurements of spray drift (ISO 22866) is the only official standard for drift measurements under field conditions for all types of crops, including bushes and trees. A series of field trials following all the requirements established in the standard were arranged in a traditional olive grove in Córdoba (southern Spain). The aims of the study were to evaluate the applicability of the current standard procedure to the particular conditions of traditional olive tree plantations, to evaluate the critical requirements for performing the tests, and to obtain a specific drift curve for such an important and specific crop as olive trees in traditional plantations, considering the enormous area covered by this type of crop all around the world. Results showed that the field trials involve a very complex process due to the particular conditions of the crop and the very precise environmental requirements. Furthermore, the trials offered a very low level of repeatability, as the drift values varied significantly from one spray application to the next, with the obtained results being closely related to the wind speed, even when the standard minimum value of 1 m·s⁻¹ was met. The placement of the collectors with respect to the position of the isolated trees was determined to be critical, since it substantially modifies the ground deposit in the first 5 m. Even so, a new drift curve for olive trees in traditional plantations has been defined, providing a useful tool for regulatory purposes. Conclusions indicated that a deep review of the official standard is needed to allow its application to the most relevant orchard/fruit crops. Copyright © 2018 Elsevier B.V. All rights reserved.
JOMJUNYONG, K.; RUNGSIYAKULL, P.; RUNGSIYAKULL, C.; AUNMEUNGTONG, W.; CHANTARAMUNGKORN, M.; KHONGKHUNTHIAN, P.
2017-01-01
SUMMARY Introduction. Although many previous studies have reported a high success rate for short dental implants, prosthetic design still plays an important role in long-term implant treatment results. This study aims to evaluate the stress distribution characteristics of various prosthetic designs on standard or short implants in the posterior maxilla. Materials and methods. Six finite element models were simulated, representing missing first and second maxillary molars. A standard implant (PW+ implant: 5.0×10 mm) and a short implant (PW+ implant: 5.0×6.0 mm) were applied under the various prosthetic conditions. The peri-implant maximum bone stress (von Mises stress) was evaluated when a 200 N, 30° oblique load was applied. Type III bone was approximated and complete osseous integration was assumed. Results. The maximum von Mises stress was located at the cortical bone around the implant neck in all models. Every standard implant model showed better stress distribution. Stress values and the stress concentration area decreased in the cortical and cancellous bone when implants were splinted, in both the standard and short implant models. In the models without replacement of the second molar, the stress area in the cortical bone around the first molar implant was more intensive. Moreover, in these models, the stress also spread to the second premolar in both the standard and short implant models. Conclusions. The length of the implant and the prosthetic design both affect the stress value and the distribution of stress to the cortical and cancellous bone around the implant. PMID:29682254
Code of Federal Regulations, 2010 CFR
2010-07-01
... measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values..., calculated as specified in section 5 of this appendix. The design values for the primary NAAQS are: (1) The annual mean value for a monitoring site for one year (referred to as the “annual primary standard design...
Code of Federal Regulations, 2014 CFR
2014-07-01
... measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values..., calculated as specified in section 5 of this appendix. The design values for the primary NAAQS are: (1) The annual mean value for a monitoring site for one year (referred to as the “annual primary standard design...
Code of Federal Regulations, 2012 CFR
2012-07-01
... measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values..., calculated as specified in section 5 of this appendix. The design values for the primary NAAQS are: (1) The annual mean value for a monitoring site for one year (referred to as the “annual primary standard design...
Code of Federal Regulations, 2011 CFR
2011-07-01
... measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values..., calculated as specified in section 5 of this appendix. The design values for the primary NAAQS are: (1) The annual mean value for a monitoring site for one year (referred to as the “annual primary standard design...
Code of Federal Regulations, 2013 CFR
2013-07-01
... measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values..., calculated as specified in section 5 of this appendix. The design values for the primary NAAQS are: (1) The annual mean value for a monitoring site for one year (referred to as the “annual primary standard design...
A Standard Atmosphere of the Antarctic Plateau
NASA Technical Reports Server (NTRS)
Mahesh, Ashwin; Lubin, Dan
2004-01-01
Climate models often rely on standard atmospheres to represent various regions; these broadly capture the important physical and radiative characteristics of regional atmospheres, and become benchmarks for simulations by researchers. The high Antarctic plateau is a significant region of the earth for which such standard atmospheres are as yet unavailable. Moreover, representative profiles from atmospheres over other regions of the planet, including from the northern high latitudes, are not comparable to the atmosphere over the Antarctic plateau, and are therefore only of limited value as substitutes in climate models. Using data from radiosondes, ozonesondes and satellites along with other observations from South Pole station, typical seasonal atmospheric profiles for the high plateau are compiled. Proper representations of rapidly changing ozone concentrations (during the ozone hole) and the effect of surface elevation on tropospheric temperatures are discussed. The differences between standard profiles developed here and the most similar standard atmosphere that already exists - namely, the Arctic Winter profile - suggest that these new profiles will be extremely useful to make accurate representations of the atmosphere over the high plateau.
Beam uniformity of flat top lasers
NASA Astrophysics Data System (ADS)
Chang, Chao; Cramer, Larry; Danielson, Don; Norby, James
2015-03-01
Many beams output from standard commercial lasers are multi-mode, with each mode having a different shape and width. They show an overall non-homogeneous energy distribution across the spot. There may be satellite structures, halos, and other deviations from beam uniformity. However, many scientific, industrial, and medical applications require a flat-top spatial energy distribution, high uniformity in the plateau region, and the complete absence of hot spots. Reliable standard methods for the evaluation of beam quality are therefore of great importance: they are required for correct characterization of the laser for its intended application and for tight quality control in laser manufacturing. The International Organization for Standardization (ISO) has published standard procedures and definitions for this purpose. These procedures have not been widely adopted by commercial laser manufacturers, largely because they are unreliable: an unrepresentative single-pixel value can seriously distort the result. We propose a metric of beam uniformity, a method of beam profile visualization, procedures to automatically detect hot spots and beam structures, and application examples from our high-energy laser production.
Value-Based Standards Guide Sexism Inferences for Self and Others.
Mitamura, Chelsea; Erickson, Lynnsey; Devine, Patricia G
2017-09-01
People often disagree about what constitutes sexism, and these disagreements can be both socially and legally consequential. It is unclear, however, why or how people come to different conclusions about whether something or someone is sexist. Previous research on judgments about sexism has focused on the perceiver's gender and attitudes, but neither of these variables identifies comparative standards that people use to determine whether any given behavior (or person) is sexist. Extending Devine and colleagues' values framework (Devine, Monteith, Zuwerink, & Elliot, 1991; Plant & Devine, 1998), we argue that, when evaluating others' behavior, perceivers rely on the morally-prescriptive values that guide their own behavior toward women. In a series of 3 studies we demonstrate that (1) people's personal standards for sexism in their own and others' behavior are each related to their values regarding sexism, (2) these values predict how much behavioral evidence people need to infer sexism, and (3) people with stringent, but not lenient, value-based standards get angry and try to regulate a sexist perpetrator's behavior to reduce sexism. Furthermore, these personal values are related to all outcomes in the present work above and beyond other person characteristics previously used to predict sexism inferences. We discuss the implications of differing value-based standards for explaining and reconciling disputes over what constitutes sexist behavior.
Ivanova, Maria V; Isaev, Dmitry Yu; Dragoy, Olga V; Akinina, Yulia S; Petrushevskiy, Alexey G; Fedina, Oksana N; Shklovsky, Victor M; Dronkers, Nina F
2016-12-01
A growing literature is pointing towards the importance of white matter tracts in understanding the neural mechanisms of language processing, and determining the nature of language deficits and recovery patterns in aphasia. Measurements extracted from diffusion-weighted (DW) images provide comprehensive in vivo measures of local microstructural properties of fiber pathways. In the current study, we compared microstructural properties of major white matter tracts implicated in language processing in each hemisphere (these included arcuate fasciculus (AF), superior longitudinal fasciculus (SLF), inferior longitudinal fasciculus (ILF), inferior frontal-occipital fasciculus (IFOF), uncinate fasciculus (UF), and corpus callosum (CC), and corticospinal tract (CST) for control purposes) between individuals with aphasia and healthy controls and investigated the relationship between these neural indices and language deficits. Thirty-seven individuals with aphasia due to left hemisphere stroke and eleven age-matched controls were scanned using DW imaging sequences. Fractional anisotropy (FA), mean diffusivity (MD), radial diffusivity (RD), axial diffusivity (AD) values for each major white matter tract were extracted from DW images using tract masks chosen from standardized atlases. Individuals with aphasia were also assessed with a standardized language test in Russian targeting comprehension and production at the word and sentence level. Individuals with aphasia had significantly lower FA values for left hemisphere tracts and significantly higher values of MD, RD and AD for both left and right hemisphere tracts compared to controls, all indicating profound impairment in tract integrity. Language comprehension was predominantly related to integrity of the left IFOF and left ILF, while language production was mainly related to integrity of the left AF. In addition, individual segments of these three tracts were differentially associated with language production and comprehension in aphasia. Our findings highlight the importance of fiber pathways in supporting different language functions and point to the importance of temporal tracts in language processing, in particular, comprehension. Copyright © 2016 Elsevier Ltd. All rights reserved.
Standardized Emission Quantification and Control of Costs for Environmental Measures
NASA Astrophysics Data System (ADS)
Walter, J.; Hustedt, M.; Wesling, V.; Barcikowski, S.
Laser welding and soldering are important industrial joining processes. LGACs (Laser-Generated Air Contaminants) give rise to costs for environmental measures during the production of complex metallic components (steel, aluminium, magnesium, alloys). The hazardous potential of such processes has been assessed by analyzing the specific emissions with respect to relevant threshold limit values (TLVs). Avoiding and controlling the emissions caused by laser processing of metals or metal composites is an important task. The experimental results significantly facilitate the planning of appropriate exhaust systems for laser processing. The quantified costs for environmental measures account for a significant percentage of the total manufacturing costs.
Autonomy, religion and clinical decisions: findings from a national physician survey.
Lawrence, R E; Curlin, F A
2009-04-01
Patient autonomy has been promoted as the most important principle to guide difficult clinical decisions. To examine whether practising physicians indeed value patient autonomy above other considerations, physicians were asked to weight patient autonomy against three other criteria that often influence doctors' decisions. Associations between physicians' religious characteristics and their weighting of the criteria were also examined. Mailed survey in 2007 of a stratified random sample of 1000 US primary care physicians, selected from the American Medical Association masterfile. Physicians were asked how much weight should be given to the following: (1) the patient's expressed wishes and values, (2) the physician's own judgment about what is in the patient's best interest, (3) standards and recommendations from professional medical bodies and (4) moral guidelines from religious traditions. Response rate 51% (446/879). Half of physicians (55%) gave the patient's expressed wishes and values "the highest possible weight". In comparative analysis, 40% gave patient wishes more weight than the other three factors, and 13% ranked patient wishes behind some other factor. Religious doctors tended to give less weight to the patient's expressed wishes. For example, 47% of doctors with high intrinsic religious motivation gave patient wishes the "highest possible weight", versus 67% of those with low (OR 0.5; 95% CI 0.3 to 0.8). Doctors believe patient wishes and values are important, but other considerations are often equally or more important. This suggests that patient autonomy does not guide physicians' decisions as much as is often recommended in the ethics literature.
Addressing current and future challenges for the NHS: the role of good leadership.
Elton, Lotte
2016-10-03
Purpose This paper aims to describe and analyse some of the ways in which good leadership can enable those working within the National Health Service (NHS) to weather the changes and difficulties likely to arise in the coming years, and takes the format of an essay written by the prize-winner of the Faculty of Medical Leadership and Management's Student Prize. The Faculty of Medical Leadership and Management ran its inaugural Student Prize in 2015-2016, which aimed at medical students with an interest in medical leadership. In running the Prize, the Faculty hoped to foster an enthusiasm for and understanding of the importance of leadership in medicine. Design/methodology/approach The Faculty asked entrants to discuss the role of good leadership in addressing the current and future challenges faced by the NHS, making reference to the Leadership and Management Standards for Medical Professionals published by the Faculty in 2015. These standards were intended to help guide current and future leaders and were grouped into three categories, namely, self, team and corporate responsibility. Findings This paper highlights the political nature of health care in the UK and the increasing impetus on medical professionals to navigate debates on austerity measures and health-care costs, particularly given the projected deficit in NHS funding. It stresses the importance of building organisational cultures prizing transparency to prevent future breaches in standards of care and the value of patient-centred approaches in improving satisfaction for both patients and staff. Identification of opportunities for collaboration and partnership is emphasised as crucial to assuage the burden that lack of appropriate social care places on clinical services. Originality/value This paper offers a novel perspective - that of a medical student - on the complex issues faced by the NHS over the coming years and utilises a well-regarded set of standards in conceptualising the role that health professionals have to play in leading the NHS.
Binding, N; Schilder, K; Czeschinski, P A; Witting, U
1998-08-01
The 2,4-dinitrophenylhydrazine (2,4-DNPH) derivatization method mainly used for the determination of airborne formaldehyde was extended for acetaldehyde, acetone, 2-butanone, and cyclohexanone, the next four carbonyl compounds of industrial importance. Sampling devices and sampling conditions were adjusted for the respective limit value regulations. Analytical reliability criteria were established and compared to those of other recommended methods. With a minimum analytical range from one tenth to the 3-fold limit value in all cases and with relative standard deviations below 5%, the adjusted method meets all requirements for the reliable quantification of the four compounds in workplace air as well as in ambient air.
Modification of surface morphology of Ti6Al4V alloy manufactured by Laser Sintering
NASA Astrophysics Data System (ADS)
Draganovská, Dagmar; Ižariková, Gabriela; Guzanová, Anna; Brezinová, Janette; Koncz, Juraj
2016-06-01
The paper deals with the evaluation of the relation between roughness parameters of Ti6Al4V alloy produced by DMLS and modified by abrasive blasting. Two types of blasting abrasive were used - white corundum and Zirblast - at three levels of air pressure. The effect of pressure on the values of the individual roughness parameters and the influence of the blasting media on these parameters for samples blasted by white corundum and Zirblast were evaluated by ANOVA. Based on the measured values, the correlation matrix was constructed, and the statistical significance of the correlations between the monitored parameters was determined from it. The correlation coefficients were also determined.
Measurement of the transverse polarization of electrons emitted in free-neutron decay.
Kozela, A; Ban, G; Białek, A; Bodek, K; Gorel, P; Kirch, K; Kistryn, St; Kuźniak, M; Naviliat-Cuncic, O; Pulut, J; Severijns, N; Stephan, E; Zejma, J
2009-05-01
Both components of the transverse polarization of electrons (σT1, σT2) emitted in the beta-decay of polarized, free neutrons have been measured. The T-odd, P-odd correlation coefficient quantifying σT2, perpendicular to the neutron polarization and electron momentum, was found to be R = 0.008 ± 0.015 ± 0.005. This value is consistent with time reversal invariance and significantly improves limits on the relative strength of imaginary scalar couplings in the weak interaction. The value obtained for the correlation coefficient associated with σT1, N = 0.056 ± 0.011 ± 0.005, agrees with the Standard Model expectation, providing an important sensitivity test of the experimental setup.
Kalateh Sadati, Ahmad; Iman, Mohammad Taghi; Bagheri Lankarani, Kamran
2014-01-01
Background: Despite its benefits and importance, clinical counseling affects the patient both psychologically and socially. Illness labeling not only leads to many problems for the patient and his/her family but also imposes high costs on the health care system. Among various factors, the doctor-patient relationship has an important role in clinical counseling and its medical approach. The goal of this study is to evaluate the nature of clinical counseling based on a critical approach. Methods: The context of the research is the second major medical training center in Shiraz, Iran. In this study, Critical Conversation Analysis was used, based on the methodologies of critical theories. Of about 50 consultation meetings digitally recorded, 33 were selected for this study. Results: The results show that the doctor-patient relationship in these cases is based on a paternalistic model. Moreover, in all consultations, the values legitimated by physicians were medical paraclinical standards. Paternalism on the one hand and standardization on the other lead to patients' dependency on the clinic. Conclusion: Although paraclinical standards cannot be disregarded, clinical counseling and the doctor-patient relationship need to reduce this dominance over counseling by interpreting human relations, paying attention to social and economic differences among people as well as biosocial and biocultural differences, and focusing on clinical examination. We also need to accept that medicine is an art of interaction that cannot be reduced to instrumental and linear methods of treating the body. PMID:25349858
Zhang, Z L; Li, J P; Li, G; Ma, X C
2017-02-09
Objective: To establish and validate a computer program to aid the detection of dental proximal caries in cone beam computed tomography (CBCT) images. Methods: According to the characteristics of caries lesions in X-ray images, a computer-aided detection program for proximal caries was established with Matlab and Visual C++. The process of caries lesion detection included image import and preprocessing, measuring the average gray value of the air region, choosing the region of interest and calculating its gray value, and defining the caries areas. The program was used to examine 90 proximal surfaces from 45 extracted human teeth collected from Peking University School and Hospital of Stomatology. The teeth were scanned with a CBCT scanner (Promax 3D). The proximal surfaces of the teeth were detected by the caries detection program and scored by a human observer for the extent of lesions on a 6-level scale. With histologic examination serving as the reference standard, the performances of the caries detection program and the human observer were assessed with receiver operating characteristic (ROC) curves. Student's t-test was used to analyze the difference between the areas under the ROC curves (AUC) for the caries detection program and the human observer. Spearman correlation coefficients were used to analyze the detection accuracy for caries depth. Results: For the diagnosis of proximal caries in CBCT images, the AUC values of the human observer and the caries detection program were 0.632 and 0.703, respectively, a statistically significant difference (P = 0.023). The correlation between program performance and the gold standard (rs = 0.525) was higher than that between observer performance and the gold standard (rs = 0.457), and the difference between the correlation coefficients was statistically significant (P = 0.000). Conclusions: A program that automatically detects dental proximal caries lesions could improve the diagnostic value of CBCT images.
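A minimal sketch of the gray-value workflow described in the Methods (air-region reference value, region-of-interest gray values, thresholding) is shown below. It is not the authors' Matlab/Visual C++ program; the file name, ROI coordinates, and the 0.6 threshold fraction are assumptions made purely for illustration.

```python
# Rough sketch of the gray-value workflow described above (not the authors'
# program). The file name, ROI coordinates, and the 0.6 threshold fraction
# are assumptions for illustration only.
import numpy as np
from PIL import Image

slice_img = np.asarray(Image.open("cbct_slice.png").convert("L"), dtype=float)

air_roi = slice_img[0:40, 0:40]              # background air region
air_gray = air_roi.mean()                    # reference gray value of air

tooth_roi = slice_img[120:180, 200:260]      # proximal surface of interest
enamel_gray = np.percentile(tooth_roi, 95)   # bright, presumably sound tissue

# Pixels whose gray value has dropped most of the way from sound tissue
# toward the air reference are flagged as candidate (demineralized) caries.
threshold = air_gray + 0.6 * (enamel_gray - air_gray)
caries_mask = tooth_roi < threshold
print(f"candidate caries pixels: {int(caries_mask.sum())}")
```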
Alexithymia in the German general population.
Franz, Matthias; Popp, Kerstin; Schaefer, Ralf; Sitte, Wolfgang; Schneider, Christine; Hardt, Jochen; Decker, Oliver; Braehler, Elmar
2008-01-01
The Toronto Alexithymia Scale (TAS-20) is used worldwide as a valid measurement of alexithymia. Until now, population-based standardization and cut-off values for the German TAS-20 version have not been available. This study provides these by means of a representative German sample and by investigating the factorial structure of the TAS-20. Data were generated from a representative random sample of the German general population (1,859 subjects aged between 20 and 69). The TAS-20 sum score was normally distributed. The mean value was 49.5 (SD=9.3) in men and 48.2 (SD=9.2) in women. Being divorced or single and having low social status were associated with elevated sum scores. Ten percent of the population exceeded the TAS-20 sum score threshold of ≥61. The 66th percentile was 53 for men and 52 for women. Factor analysis identified three factors that match the scales of the English original version. An additional fourth factor ("importance of emotional introspection") was extracted. The total variance explained by these four factors was 52.27%. The sum score of the German TAS-20 version is suited for the standardized measurement of alexithymia. For selecting alexithymic individuals in experimental studies, the cut-off of ≥61 is possibly too restrictive. Therefore, we propose the 66th percentile for the identification of highly alexithymic individuals. The TAS-20 sum score is associated with important socio-demographic variables. The factorial structure is reliable; the fourth factor ("importance of emotional introspection") provides differentiation of content and allows for enhanced explanation of variance.
[The International Standards for Tuberculosis Care (ISTC): what is the importance for Japan?].
Fujiwara, Paula I
2008-07-01
In 2005, the World Health Assembly resolved that all Member States should ensure that all persons with tuberculosis (TB) "have access to the universal standard of care based on proper diagnosis, treatment and reporting consistent with the DOTS strategy..." The purpose of the International Standards for Tuberculosis Care (ISTC) is to define the widely accepted level of care of persons either suspected of, or diagnosed with, TB by all health practitioners, especially those in the private sector, who often lack the guidance and systematic evaluation of outcomes provided by government programs. Since their publication in 2006 on World TB Day, the standards have been endorsed by the major international health organizations as well as many country-level professional societies. The intention is to complement local and national control policies consistent with those of the World Health Organization: they are not intended to replace local guidelines, but are written to accommodate local differences in practice. The ISTC comprise seventeen evidence-based standards on tuberculosis diagnosis and treatment, as well as the responsibility of the public health sector. These are based on the basic principles of TB care: prompt and accurate diagnosis, standardized treatment regimens of proven efficacy, appropriate treatment support and supervision, monitoring of response to treatment and the carrying out of essential public health responsibilities. The relevance of the ISTC to the Japanese context is highlighted, in terms of when persons should be suspected of having TB; the appropriate diagnostic modalities, including the use of chest radiographs; the advantages of fixed-dose combinations; the importance of follow-up laboratory tests to document response to treatment; the importance of record-keeping and reporting to public health authorities; the value of HIV testing of TB patients and the use of anti-retrovirals for those dually infected; and the assessment of drug resistance and the appropriate treatment of multidrug-resistant tuberculosis. Finally, some proposals are made on the way forward for Japan.
NASA Astrophysics Data System (ADS)
Gneiser, Martin; Heidemann, Julia; Klier, Mathias; Landherr, Andrea; Probst, Florian
Online social networks have been gaining increasing economic importance in light of the rising number of their users. Numerous recent acquisitions priced at enormous amounts have illustrated this development and revealed the need for adequate business valuation models. The value of an online social network is largely determined by the value of its users, the relationships between these users, and the resulting network effects. Therefore, the interconnectedness of a user within the network has to be considered explicitly to obtain a reasonable estimate of the economic value. Established standard business valuation models, however, do not sufficiently take these aspects into account. Thus, we propose a measure based on the PageRank algorithm to quantify users' interconnectedness in an online social network. This is a first but indispensable step towards an adequate economic valuation of online social networks.
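A minimal sketch of the underlying idea, scoring each user's interconnectedness with the PageRank algorithm, is shown below. The tiny friendship graph, the use of the networkx library, and the default damping factor are illustrative assumptions; the authors' measure is presumably a tailored variant rather than this off-the-shelf computation.

```python
# Sketch of the basic idea: score each user's interconnectedness in a social
# graph with PageRank. The friendship graph, the use of networkx, and the
# damping factor are illustrative assumptions, not the authors' model.
import networkx as nx

friendships = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("carol", "dave"), ("dave", "erin"), ("erin", "alice"),
]
g = nx.Graph(friendships)

scores = nx.pagerank(g, alpha=0.85)  # 0.85 is the conventional damping factor

# Higher scores indicate better-connected users, who under the proposed
# measure contribute more to the network's economic value.
for user, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{user:>6s}: {score:.3f}")
```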
Watershed-based Morphometric Analysis: A Review
NASA Astrophysics Data System (ADS)
Sukristiyanti, S.; Maria, R.; Lestiana, H.
2018-02-01
Drainage basin/watershed analysis based on morphometric parameters is very important for watershed planning. Morphometric analysis of a watershed is the best method to identify the relationships among the various aspects of the area. Although many technical papers have dealt with this area of study, there is no particular standard classification and implication for each parameter, and evaluating the value of every morphometric parameter can be confusing. This paper deals with the meaning of the values of the various morphometric parameters, with adequate contextual information. A critical review is presented on each classification, the range of values, and their implications. Besides classification and its impact, the authors are also concerned with the quality of the input data, either in data preparation or in the scale/detail level of mapping. This review paper hopefully can give a comprehensive explanation to assist upcoming research dealing with morphometric analysis.
AIAA spacecraft GN&C interface standards initiative: Overview
NASA Technical Reports Server (NTRS)
Challoner, A. Dorian
1995-01-01
The American Institute of Aeronautics and Astronautics (AIAA) has undertaken an important standards initiative in the area of spacecraft guidance, navigation, and control (GN&C) subsystem interfaces. The objective of this effort is to establish standards that will promote interchangeability of major GN&C components, thus enabling substantially lower spacecraft development costs. Although initiated by developers of conventional spacecraft GN&C, it is anticipated that interface standards will also be of value in reducing the development costs of micro-engineered spacecraft. The standardization targets are specifically limited to interfaces only, including information (i.e. data and signal), power, mechanical, thermal, and environmental interfaces between various GN&C components and between GN&C subsystems and other subsystems. The current emphasis is on information interfaces between various hardware elements (e.g., between star trackers and flight computers). The poster presentation will briefly describe the program, including the mechanics and schedule, and will publicize the technical products as they exist at the time of the conference. In particular, the rationale for the adoption of the AS1773 fiber-optic serial data bus and the status of data interface standards at the application layer will be presented.
Hyaluronan Tumor Cell Interactions in Prostate Cancer Growth and Survival
2006-12-01
prognostic value of CD44 standard and variant v3 and v6 isoforms in prostate cancer. Eur Urol, 2001. 39(2): p. 138-44. 32. De Marzo, A.M., et al., CD44...subcutaneous injection model [24] and in orthotopic or intrafemoral bone injection models (see progress report below). Importantly, the addition of...expression from these cells, completely reverses growth inhibition [24]. CD44 and Rhamm – Two Hyaladherins with Overlapping Function: The two most
Translations on USSR Military Affairs, Number 1297
1977-09-01
standards. The platoon leader assured me that the exercise would be "up to snuff." In actual fact it was not. They were working, for example, on meeting...television film faced a difficult task—to choose the most important and the most valuable from that tremendous archive and documentary material which...in the television films is of equal value: some things turned out better, others worse, but the main hero did not disappear from the screen—the
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.
2004-01-01
Modern engineering design practices are tending more toward the treatment of design parameters as random variables as opposed to fixed, or deterministic, values. The probabilistic design approach attempts to account for the uncertainty in design parameters by representing them as a distribution of values rather than as a single value. The motivations for this effort include preventing excessive overdesign as well as assessing and assuring reliability, both of which are important for aerospace applications. However, the determination of the probability distribution is a fundamental problem in reliability analysis. A random variable is often defined by the parameters of the theoretical distribution function that gives the best fit to experimental data. In many cases the distribution must be assumed from very limited information or data. Often the types of information that are available or reasonably estimated are the minimum, maximum, and most likely values of the design parameter. For these situations the beta distribution model is very convenient because the parameters that define the distribution can be easily determined from these three pieces of information. Widely used in the field of operations research, the beta model is very flexible and is also useful for estimating the mean and standard deviation of a random variable given only the aforementioned three values. However, an assumption is required to determine the four parameters of the beta distribution from only these three pieces of information (some of the more common distributions, like the normal, lognormal, gamma, and Weibull distributions, have two or three parameters). The conventional method assumes that the standard deviation is a certain fraction of the range. The beta parameters are then determined by solving a set of equations simultaneously. A new method developed in-house at the NASA Glenn Research Center assumes a value for one of the beta shape parameters based on an analogy with the normal distribution (ref.1). This new approach allows for a very simple and direct algebraic solution without restricting the standard deviation. The beta parameters obtained by the new method are comparable to the conventional method (and identical when the distribution is symmetrical). However, the proposed method generally produces a less peaked distribution with a slightly larger standard deviation (up to 7 percent) than the conventional method in cases where the distribution is asymmetric or skewed. The beta distribution model has now been implemented into the Fast Probability Integration (FPI) module used in the NESSUS computer code for probabilistic analyses of structures (ref. 2).
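For orientation, the sketch below shows the conventional PERT-style fit of a beta distribution to a minimum, most likely and maximum value (mean = (a + 4m + b)/6, standard deviation = range/6, shape parameters by the method of moments). It is only a generic illustration of the "three pieces of information" idea, not the in-house NASA Glenn method, which instead fixes one shape parameter by analogy with the normal distribution.

```python
# Conventional PERT-style fit of a beta distribution to (min, most likely, max).
# This is a generic sketch only; it is NOT the NASA Glenn method described in
# the abstract, which assumes a value for one shape parameter instead.

def beta_from_three_points(a, m, b):
    mean = (a + 4.0 * m + b) / 6.0          # classic PERT mean
    sd = (b - a) / 6.0                      # conventional assumption: sd = range/6
    mu = (mean - a) / (b - a)               # rescale to the unit interval
    var = (sd / (b - a)) ** 2
    common = mu * (1.0 - mu) / var - 1.0    # method-of-moments shape parameters
    alpha, beta = mu * common, (1.0 - mu) * common
    return alpha, beta, mean, sd

print(beta_from_three_points(a=10.0, m=14.0, b=22.0))   # illustrative inputs
```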
Detection of 12.5% and 25% Salt Reduction in Bread in a Remote Indigenous Australian Community
McMahon, Emma; Clarke, Rozlynne; Jaenke, Rachael; Brimblecombe, Julie
2016-01-01
Food reformulation is an important strategy to reduce the excess salt intake observed in remote Indigenous Australia. We aimed to examine whether 12.5% and 25% salt reduction in bread is detectable, and, if so, whether acceptability is changed, in a sample of adults living in a remote Indigenous community in the Northern Territory of Australia. Convenience samples were recruited for testing of reduced-salt (300 and 350 mg Na/100 g) versus Standard (~400 mg Na/100 g) white and wholemeal breads (n = 62 for white; n = 72 for wholemeal). Triangle testing was used to examine whether participants could detect a difference between the breads. Liking of each bread was also measured; standard consumer acceptability questionnaires were modified to maximise cultural appropriateness and understanding. Participants were unable to detect a difference between Standard and reduced-salt breads (all p values > 0.05 when analysed using binomial probability). Further, as expected, liking of the breads was not changed with salt reduction (all p values > 0.05 when analysed using ANOVA). Reducing salt in products commonly purchased in remote Indigenous communities has potential as an equitable, cost-effective and sustainable strategy to reduce population salt intake and reduce risk of chronic disease, without the barriers associated with strategies that require individual behaviour change. PMID:26999196
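The binomial analysis mentioned for the triangle test can be sketched as follows: under the null hypothesis that the breads are indistinguishable, each panellist identifies the odd sample by chance with probability 1/3. The counts in the example are hypothetical, not the study's data.

```python
from scipy.stats import binom

# Triangle-test analysis sketch: correct identifications under the null
# hypothesis follow Binomial(n, 1/3). Counts below are hypothetical.

def triangle_test_p_value(correct, n):
    """One-sided P(X >= correct) for X ~ Binomial(n, 1/3)."""
    return binom.sf(correct - 1, n, 1.0 / 3.0)

print(triangle_test_p_value(correct=25, n=62))   # e.g. 25 of 62 correct picks
```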
Ozarda, Yesim; Ichihara, Kiyoshi; Barth, Julian H; Klee, George
2013-05-01
The reference intervals (RIs) given in laboratory reports have an important role in aiding clinicians in interpreting test results in reference to values of healthy populations. In this report, we present a proposed protocol and standard operating procedures (SOPs) for common use in conducting multicenter RI studies on a national or international scale. The protocols and consensus on their contents were refined through discussions in recent C-RIDL meetings. The protocol describes in detail (1) the scheme and organization of the study, (2) the target population, inclusion/exclusion criteria, ethnicity, and sample size, (3) health status questionnaire, (4) target analytes, (5) blood collection, (6) sample processing and storage, (7) assays, (8) cross-check testing, (9) ethics, (10) data analyses, and (11) reporting of results. In addition, the protocol proposes the common measurement of a panel of sera when no standard materials exist for harmonization of test results. It also describes the requirements of the central laboratory, including the method of cross-check testing between the central laboratory of each country and local laboratories. This protocol and the SOPs remain largely exploratory and may require a reevaluation from the practical point of view after their implementation in the ongoing worldwide study. The paper is mainly intended to be a basis for discussion in the scientific community.
1993-10-01
continues to consider that Sound value compensation, in the detrimental liability for payment of the change to fair value compensation in abstract...in concession the new standard contract generally contract. Depending on the buildings. The fair value compensation provides for a redefined "fair"...standard contract also changes book concessioner sound value concession operators is of benefit to value to fair value in most compensation. For example
Mattos, Jose L; Schlosser, Rodney J; Mace, Jess C; Smith, Timothy L; Soler, Zachary M
2018-05-02
Olfactory-specific quality of life (QOL) can be measured using the Questionnaire of Olfactory Disorders Negative Statements (QOD-NS). Changes in the QOD-NS after treatment can be difficult to interpret since there is no standardized definition of clinically meaningful improvement. Patients with chronic rhinosinusitis (CRS) completed the QOD-NS. Four distribution-based methods were used to calculate the minimal clinically important difference (MCID): (1) one-half standard deviation (SD); (2) standard error of the mean (SEM); (3) Cohen's effect size (d) of the smallest unit of change; and (4) minimal detectable change (MDC). We also averaged all 4 of the scores together. Finally, the likelihood of achieving a MCID after sinus surgery using these methods, as well as average QOD-NS scores, was stratified by normal vs abnormal baseline QOD-NS scores. Outcomes were examined on 128 patients. The mean ± SD improvement in QOD-NS score after surgery was 4.3 ± 11.0 for the entire cohort and 9.6 ± 12.9 for those with abnormal baseline scores (p < 0.001). The MCID values using the different techniques were: (1) SD = 6.5; (2) SEM = 3.1; (3) d = 2.6; and (4) MDC = 8.6. The MCID score was 5.2 on average. For the total cohort analysis, the likelihood of reporting a MCID ranged from 26% to 51%, and 49% to 70% for patients reporting preoperative abnormal olfaction. Distribution-based MCID values of the QOD-NS range between 2.6 and 8.6 points, with an average of 5.2. When stratified by preoperative QOD-NS scores the majority of patients reporting abnormal preoperative QOD-NS scores achieved a MCID. © 2018 ARS-AAOA, LLC.
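A rough sketch of the distribution-based calculations is given below. The reliability coefficient feeding the SEM is hypothetical (the abstract does not report one), and the effect-size variant uses the common 0.2 × SD "small effect" convention as a stand-in for the paper's exact formula.

```python
import math

# Distribution-based MCID estimates, sketched under stated assumptions:
# the reliability coefficient is hypothetical, and the effect-size variant
# uses 0.2*SD as a common convention rather than the paper's exact formula.

def mcid_estimates(sd, reliability):
    half_sd = 0.5 * sd                                   # (1) one-half SD
    sem = sd * math.sqrt(1.0 - reliability)              # (2) standard error of measurement
    small_effect = 0.2 * sd                              # (3) effect-size based (assumed convention)
    mdc = 1.96 * math.sqrt(2.0) * sem                    # (4) minimal detectable change (95%)
    return {"half_sd": half_sd, "sem": sem, "d_based": small_effect, "mdc": mdc}

print(mcid_estimates(sd=11.0, reliability=0.92))   # SD from the cohort; reliability assumed
```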
Mancia, G; Ferrari, A; Gregorini, L; Parati, G; Pomidossi, G; Bertinieri, G; Grassi, G; Zanchetti, A
1980-12-01
1. Intra-arterial blood pressure and heart rate were recorded for 24 h in ambulant hospitalized patients of variable age who had normal blood pressure or essential hypertension. Mean 24 h values, standard deviations and variation coefficients were obtained as the averages of values separately analysed for 48 consecutive half-hour periods. 2. In older subjects standard deviation and variation coefficient for mean arterial pressure were greater than in younger subjects with similar pressure values, whereas standard deviation and variation coefficient for heart rate were smaller. 3. In hypertensive subjects standard deviation for mean arterial pressure was greater than in normotensive subjects of similar ages, but this was not the case for variation coefficient, which was slightly smaller in the former than in the latter group. Normotensive and hypertensive subjects showed no difference in standard deviation and variation coefficient for heart rate. 4. In both normotensive and hypertensive subjects standard deviation and even more so variation coefficient were slightly or not related to arterial baroreflex sensitivity as measured by various methods (phenylephrine, neck suction etc.). 5. It is concluded that blood pressure variability increases and heart rate variability decreases with age, but that changes in variability are not so obvious in hypertension. Also, differences in variability among subjects are only marginally explained by differences in baroreflex function.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Jaromy; Sun Zaijing; Wells, Doug
2009-03-10
Photon activation analysis detected elements in two NIST standards that did not have reported concentration values. A method is currently being developed to infer these concentrations by using scaling parameters and the appropriate known quantities within the NIST standard itself. Scaling parameters include: threshold, peak and endpoint energies; photo-nuclear cross sections for specific isotopes; Bremsstrahlung spectrum; target thickness; and photon flux. Photo-nuclear cross sections and energies for the unknown elements must also be known. With these quantities, the same integral was performed for both the known and unknown elements, resulting in an inference of the concentration of the unreported element based on the reported value. Since Rb and Mn were elements that were reported in the standards, and because they had well-identified peaks, they were used as the standards of inference to determine concentrations of the unreported elements As, I, Nb, Y, and Zr. This method was tested by choosing other known elements within the standards and inferring a value based on the stated procedure. The reported value of Mn in the first NIST standard was 403±15 ppm and the reported value of Ca in the second NIST standard was 87000 ppm (no reported uncertainty). The inferred concentrations were 370±23 ppm and 80200±8700 ppm, respectively.
Sacristán, José A; Lizan, Luís; Comellas, Marta; Garrido, Pilar; Avendaño, Cristina; Cruz-Hernández, Juan J; Espinosa, Javier; Dilla, Tatiana
2016-11-01
The purpose of this study was to explore the main factors explaining the relative weight of the different attributes that determine the value of oncologic treatments from the different perspectives of healthcare policy makers (HCPM), oncologists, patients and the general population in Spain. Structured interviews were conducted to assess: (1) the importance of the attributes on treatment choice when comparing a new cancer drug with a standard cancer treatment; (2) the importance of survival, quality of life (QoL), costs and innovation in cancer; and (3) the most worrying side effects related to cancer drugs. A total of 188 individuals participated in the study. For all participants, when choosing treatments, the best rated characteristics were greater efficacy, greater safety, treatment adaptation to patients' individual requirements and the rapid reincorporation of patients to their daily activities. There were important differences among participants in their opinion about survival, QoL and cost. In general, oncologists, patients, and the general population gave greater value to gains in QoL than healthcare policy makers. Compared to other participants healthcare policy makers gave greater importance to the economic impact related to oncology treatments. Gains in QoL, survival, safety, cost and innovation are perceived differently by different groups of stakeholders. It is recommended to consider the perspective of different stakeholders in the assessment of a new cancer drugs to obtain more informed decisions when deciding on the most appropriate treatment to use. Eli Lilly & Co, Madrid (Spain).
A National Trial on Differences in Cerebral Perfusion Pressure Values by Measurement Location.
McNett, Molly M; Bader, Mary Kay; Livesay, Sarah; Yeager, Susan; Moran, Cristina; Barnes, Arianna; Harrison, Kimberly R; Olson, DaiWai M
2018-04-01
Cerebral perfusion pressure (CPP) is a key parameter in management of brain injury with suspected impaired cerebral autoregulation. CPP is calculated by subtracting intracranial pressure (ICP) from mean arterial pressure (MAP). Despite consensus on importance of CPP monitoring, substantial variations exist on anatomical reference points used to measure arterial MAP when calculating CPP. This study aimed to identify differences in CPP values based on measurement location when using phlebostatic axis (PA) or tragus (Tg) as anatomical reference points. The secondary study aim was to determine impact of differences on patient outcomes at discharge. This was a prospective, repeated measures, multi-site national trial. Adult ICU patients with neurological injury necessitating ICP and CPP monitoring were consecutively enrolled from seven sites. Daily MAP/ICP/CPP values were gathered with the arterial transducer at the PA, followed by the Tg as anatomical reference points. A total of 136 subjects were enrolled, resulting in 324 paired observations. There were significant differences for CPP when comparing values obtained at PA and Tg reference points (p < 0.000). Differences remained significant in repeated measures model when controlling for clinical factors (mean CPP-PA = 80.77, mean CPP-Tg = 70.61, p < 0.000). When categorizing CPP as binary endpoint, 18.8% of values were identified as adequate with PA values, yet inadequate with CPP values measured at the Tg. Findings identify numerical differences for CPP based on anatomical reference location and highlight importance of a standard reference point for both clinical practice and future trials to limit practice variations and heterogeneity of findings.
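The underlying arithmetic can be sketched as follows: CPP = MAP − ICP, with the arterial MAP lowered by roughly 0.74 mmHg per centimetre of vertical offset when the transducer is referenced at the higher tragus rather than the phlebostatic axis (a hydrostatic approximation; the 10 cm offset in the example is hypothetical, not a study measurement).

```python
# CPP arithmetic sketch. CPP = MAP - ICP; MAP referenced to the tragus is lower
# than MAP referenced to the phlebostatic axis by roughly 0.74 mmHg per cm of
# vertical offset (hydrostatic approximation; the 10 cm offset is hypothetical).

MMHG_PER_CM = 0.74

def cpp(map_mmhg, icp_mmhg):
    return map_mmhg - icp_mmhg

def map_at_tragus(map_at_pa, vertical_offset_cm):
    return map_at_pa - MMHG_PER_CM * vertical_offset_cm

map_pa, icp = 95.0, 15.0
print(cpp(map_pa, icp))                          # CPP with MAP zeroed at the PA
print(cpp(map_at_tragus(map_pa, 10.0), icp))     # CPP with MAP referenced to the tragus
```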
The PKRC's Value as a Professional Development Model Validated
ERIC Educational Resources Information Center
Larson, Dale
2013-01-01
After a brief review of the 4-H professional development standards, a new model for determining the value of continuing professional development is introduced and applied to the 4-H standards. The validity of the 4-H standards is affirmed. 4-H Extension professionals are encouraged to celebrate the strength of their standards and to engage the…
Miura, Tsutomu; Chiba, Koichi; Kuroiwa, Takayoshi; Narukawa, Tomohiro; Hioki, Akiharu; Matsue, Hideaki
2010-09-15
Neutron activation analysis (NAA) coupled with an internal standard method was applied for the determination of As in the certified reference material (CRM) of arsenobetaine (AB) standard solutions to verify their certified values. Gold was used as an internal standard to compensate for the difference in neutron exposure within an irradiation capsule and to improve the sample-to-sample repeatability. Application of the internal standard method also significantly improved the linearity of the calibration curve up to 1 microg of As. The analytical reliability of the proposed method was evaluated by k(0)-standardization NAA. The analytical results for As in the AB standard solutions BCR-626 and NMIJ CRM 7901-a were (499±55) mg kg(-1) (k=2) and (10.16±0.15) mg kg(-1) (k=2), respectively. These values were found to be 15-20% higher than the certified values. The between-bottle variation of BCR-626 was much larger than the expanded uncertainty of the certified value, although that of NMIJ CRM 7901-a was almost negligible. Copyright (c) 2010 Elsevier B.V. All rights reserved.
Developing Resources for Teaching Ethics in Geoscience
NASA Astrophysics Data System (ADS)
Mogk, David W.; Geissman, John W.
2014-11-01
Ethics education is an increasingly important component of the pre-professional training of geoscientists. Geoethics encompasses the values and professional standards required of geoscientists to work responsibly in any geoscience profession and in service to society. Funding agencies (e.g., the National Science Foundation, the National Institutes of Health) require training of graduate students in the responsible conduct of research; employers are increasingly expecting their workers to have basic training in ethics; and the public demands the highest standards of ethical conduct by scientists. However, there is currently no formal course of instruction in ethics in the geoscience curriculum, and few faculty members have the experience, resources, and sometimes willingness required to teach ethics as a component of their geoscience courses.
Organochlorine pesticides residues in bottled drinking water from Mexico City.
Díaz, Gilberto; Ortiz, Rutilio; Schettino, Beatriz; Vega, Salvador; Gutiérrez, Rey
2009-06-01
This work describes concentrations of organochlorine pesticides in bottled drinking water (BDW) in Mexico City. The results of 36 samples (18 samples each of the 1.5 L and 19 L presentations) showed the presence of seven pesticides (HCH isomers, heptachlor, aldrin, and p,p'-DDE) in bottled water when compared with the drinking water standards set by NOM-127-SSA1-1994, EPA, and the World Health Organization. The concentrations of the majority of organochlorine pesticides were within drinking water standards (0.01 ng/mL), except for beta-HCH in the BW 3, 5, and 6 samples, with values of 0.121, 0.136, and 0.192 ng/mL, respectively. It is important to monitor bottled drinking water to protect human health.
Progress toward a new beam measurement of the neutron lifetime
NASA Astrophysics Data System (ADS)
Hoogerheide, Shannon Fogwell
2016-09-01
Neutron beta decay is the simplest example of nuclear beta decay. A precise value of the neutron lifetime is important for consistency tests of the Standard Model and Big Bang Nucleosynthesis models. The beam neutron lifetime method requires the absolute counting of the decay protons in a neutron beam of precisely known flux. Recent work has resulted in improvements in both the neutron and proton detection systems that should permit a significant reduction in systematic uncertainties. A new measurement of the neutron lifetime using the beam method will be performed at the National Institute of Standards and Technology Center for Neutron Research. The projected uncertainty of this new measurement is 1 s. An overview of the measurement and the technical improvements will be discussed.
Qi, Da; Zhang, Huaizhong; Fan, Jun; Perkins, Simon; Pisconti, Addolorata; Simpson, Deborah M; Bessant, Conrad; Hubbard, Simon; Jones, Andrew R
2015-09-01
The mzQuantML standard has been developed by the Proteomics Standards Initiative for capturing, archiving and exchanging quantitative proteomic data, derived from mass spectrometry. It is a rich XML-based format, capable of representing data about two-dimensional features from LC-MS data, and peptides, proteins or groups of proteins that have been quantified from multiple samples. In this article we report the development of an open source Java-based library of routines for mzQuantML, called the mzqLibrary, and associated software for visualising data called the mzqViewer. The mzqLibrary contains routines for mapping (peptide) identifications on quantified features, inference of protein (group)-level quantification values from peptide-level values, normalisation and basic statistics for differential expression. These routines can be accessed via the command line, via a Java programming interface access or a basic graphical user interface. The mzqLibrary also contains several file format converters, including import converters (to mzQuantML) from OpenMS, Progenesis LC-MS and MaxQuant, and exporters (from mzQuantML) to other standards or useful formats (mzTab, HTML, csv). The mzqViewer contains in-built routines for viewing the tables of data (about features, peptides or proteins), and connects to the R statistical library for more advanced plotting options. The mzqLibrary and mzqViewer packages are available from https://code.google.com/p/mzq-lib/. © 2015 The Authors. PROTEOMICS Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
The new agreement of the international RIGA consensus conference on nasal airway function tests.
Vogt, K; Bachmann-Harildstad, G; Lintermann, A; Nechyporenko, A; Peters, F; Wernecke, K D
2018-01-21
The report reflects an agreement based on the consensus conference of the International Standardization Committee on the Objective Assessment of the Nasal Airway in Riga, 2nd Nov. 2016. The aim of the conference was to address the existing nasal airway function tests and to take into account physical, mathematical and technical correctness as a basis of international standardization, as well as the requirements of the Council Directive 93/42/EEC of 14 June 1993 concerning medical devices. Rhinomanometry, acoustic rhinometry, peak nasal inspiratory flow, Odiosoft-Rhino, optical rhinometry, 24-h measurements, computational fluid dynamics, nasometry and the mirror test were evaluated for important diagnostic criteria, which are the precision of the equipment, including calibration and the software applied; validity, with sensitivity, specificity, and positive and negative predictive values; reliability, with intra-individual and inter-individual reproducibility; and responsiveness in clinical studies. For rhinomanometry, the logarithmic effective resistance was set as the parameter of high diagnostic relevance. In acoustic rhinometry, the area of interest for the minimal cross-sectional area will need further standardization. Peak nasal inspiratory flow is a reproducible and fast test, which showed a high range of mean values in different studies. The state of the art in computational fluid dynamics for simulation of the airway still depends on high-performance computing hardware and will, after standardization of the simulation software and of both the software and hardware for imaging protocols, certainly deliver a better understanding of nasal airflow.
Weighted mining of massive collections of p-values by convex optimization.
Dobriban, Edgar
2018-06-01
Researchers in data-rich disciplines (think of computational genomics and observational cosmology) often wish to mine large bodies of p-values looking for significant effects, while controlling the false discovery rate or family-wise error rate. Increasingly, researchers also wish to prioritize certain hypotheses, for example, those thought to have larger effect sizes, by upweighting, and to impose constraints on the underlying mining, such as monotonicity along a certain sequence. We introduce Princessp, a principled method for performing weighted multiple testing by constrained convex optimization. Our method elegantly allows one to prioritize certain hypotheses through upweighting and to discount others through downweighting, while constraining the underlying weights involved in the mining process. When the p-values derive from monotone likelihood ratio families such as the Gaussian means model, the new method allows exact solution of an important optimal weighting problem previously thought to be non-convex and computationally infeasible. Our method scales to massive data set sizes. We illustrate the applications of Princessp on a series of standard genomics data sets and offer comparisons with several previous 'standard' methods. Princessp offers both ease of operation and the ability to scale to extremely large problem sizes. The method is available as open-source software from github.com/dobriban/pvalue_weighting_matlab (accessed 11 October 2017).
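To illustrate how hypothesis weights enter multiple testing, the sketch below implements a generic weighted Benjamini-Hochberg step-up procedure with weights normalised to average one; it is not the Princessp convex optimization itself, and the p-values and weights are made up.

```python
import numpy as np

# Generic weighted Benjamini-Hochberg step-up procedure, shown only to
# illustrate how hypothesis weights enter multiple testing; this is not the
# Princessp optimization itself. Weights are assumed to average to one.

def weighted_bh(p_values, weights, q=0.05):
    p = np.asarray(p_values, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w * len(w) / w.sum()                 # normalise weights to mean 1
    adj = p / w                              # upweighted hypotheses get "smaller" p-values
    order = np.argsort(adj)
    m = len(p)
    thresh = q * np.arange(1, m + 1) / m
    passed = adj[order] <= thresh
    rejected = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()      # largest index meeting the step-up criterion
        rejected[order[: k + 1]] = True
    return rejected

p = [0.001, 0.02, 0.04, 0.3, 0.8]
w = [2.0, 1.0, 1.0, 0.5, 0.5]                # prior weights, e.g. from effect-size guesses
print(weighted_bh(p, w, q=0.05))
```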
NASA Astrophysics Data System (ADS)
Leijenaar, Ralph T. H.; Nalbantov, Georgi; Carvalho, Sara; van Elmpt, Wouter J. C.; Troost, Esther G. C.; Boellaard, Ronald; Aerts, Hugo J. W. L.; Gillies, Robert J.; Lambin, Philippe
2015-08-01
FDG-PET-derived textural features describing intra-tumor heterogeneity are increasingly investigated as imaging biomarkers. As part of the process of quantifying heterogeneity, image intensities (SUVs) are typically resampled into a reduced number of discrete bins. We focused on the implications of the manner in which this discretization is implemented. Two methods were evaluated: (1) RD, dividing the SUV range into D equally spaced bins, where the intensity resolution (i.e. bin size) varies per image; and (2) RB, maintaining a constant intensity resolution B. Clinical feasibility was assessed on 35 lung cancer patients, imaged before and in the second week of radiotherapy. Forty-four textural features were determined for different D and B for both imaging time points. Feature values depended on the intensity resolution, and of the two assessed methods, RB was shown to allow for a meaningful inter- and intra-patient comparison of feature values. Overall, patients ranked differently according to feature values (used as a surrogate for textural feature interpretation) between the two discretization methods. Our study shows that the manner of SUV discretization has a crucial effect on the resulting textural features and the interpretation thereof, emphasizing the importance of standardized methodology in tumor texture analysis.
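The two discretization schemes can be sketched as follows; the bin count, bin width and SUV values are illustrative only and do not reproduce the study's settings.

```python
import numpy as np

# Sketch of the two SUV discretisation schemes compared in the abstract:
# RD resamples into a fixed number of bins D (bin size varies per image),
# RB uses a fixed bin size B (number of bins varies). Values are illustrative.

def discretize_fixed_bin_number(suv, d=64):
    suv = np.asarray(suv, dtype=float)
    edges = np.linspace(suv.min(), suv.max(), d + 1)
    return np.clip(np.digitize(suv, edges[1:-1]), 0, d - 1) + 1   # bins 1..D

def discretize_fixed_bin_size(suv, b=0.5):
    suv = np.asarray(suv, dtype=float)
    return np.floor(suv / b).astype(int) + 1                      # bin size B in SUV units

suv_values = np.array([0.8, 2.3, 4.7, 9.1, 12.6])
print(discretize_fixed_bin_number(suv_values, d=8))
print(discretize_fixed_bin_size(suv_values, b=0.5))
```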
A high-fidelity Monte Carlo evaluation of CANDU-6 safety parameters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Y.; Hartanto, D.
2012-07-01
Important safety parameters such as the fuel temperature coefficient (FTC) and the power coefficient of reactivity (PCR) of the CANDU-6 (CANada Deuterium Uranium) reactor have been evaluated by using a modified MCNPX code. For accurate analysis of the parameters, the DBRC (Doppler Broadening Rejection Correction) scheme was implemented in MCNPX in order to account for the thermal motion of the heavy uranium nucleus in neutron-U scattering reactions. In this work, a standard fuel lattice has been modeled, the fuel is depleted by using MCNPX, and the FTC value is evaluated for several burnup points, including the mid-burnup representing a near-equilibrium core. The Doppler effect has been evaluated by using several cross section libraries such as ENDF/B-VI, ENDF/B-VII, JEFF, and JENDL. The PCR value is also evaluated at mid-burnup conditions to characterize the safety features of the equilibrium CANDU-6 reactor. To improve the reliability of the Monte Carlo calculations, a huge number of neutron histories is considered in this work, and the standard deviation of the k-inf values is only 0.5-1 pcm. It has been found that the FTC is significantly enhanced by accounting for the Doppler broadening of scattering resonances and that the PCR is clearly improved. (authors)
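The reactivity-coefficient arithmetic behind an FTC estimate can be sketched as below; the k-eff values and fuel temperatures are hypothetical placeholders, not results from this CANDU-6 analysis.

```python
# Reactivity-coefficient arithmetic sketch: the fuel temperature coefficient is
# the change in reactivity per unit fuel temperature change. k-eff values and
# temperatures below are hypothetical, not results from the CANDU-6 study.

def reactivity_pcm(k_eff):
    """rho = (k - 1)/k, expressed in pcm (1 pcm = 1e-5)."""
    return (k_eff - 1.0) / k_eff * 1.0e5

def fuel_temperature_coefficient(k_cold, k_hot, t_cold_k, t_hot_k):
    return (reactivity_pcm(k_hot) - reactivity_pcm(k_cold)) / (t_hot_k - t_cold_k)

print(fuel_temperature_coefficient(k_cold=1.00120, k_hot=1.00045,
                                   t_cold_k=687.0, t_hot_k=987.0))   # pcm/K
```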
Standards vs. Customization: Finding the Balance
ERIC Educational Resources Information Center
Cuban, Larry
2012-01-01
Many practitioners (and the public) highly value standardizing curriculum and instruction for students. They believe that common standards and instruction will produce equal opportunity--a value dear to most policymakers and educators, and to Americans in general. Yet educators and the public also prize individual excellence. Differentiating the…
Valuing vaccines using value of statistical life measures.
Laxminarayan, Ramanan; Jamison, Dean T; Krupnick, Alan J; Norheim, Ole F
2014-09-03
Vaccines are effective tools to improve human health, but resources to pursue all vaccine-related investments are lacking. Benefit-cost and cost-effectiveness analysis are the two major methodological approaches used to assess the impact, efficiency, and distributional consequences of disease interventions, including those related to vaccinations. Childhood vaccinations can have important non-health consequences for productivity and economic well-being through multiple channels, including school attendance, physical growth, and cognitive ability. Benefit-cost analysis would capture such non-health benefits; cost-effectiveness analysis does not. Standard cost-effectiveness analysis may grossly underestimate the benefits of vaccines. A specific willingness-to-pay measure is based on the notion of the value of a statistical life (VSL), derived from trade-offs people are willing to make between fatality risk and wealth. Such methods have been used widely in the environmental and health literature to capture the broader economic benefits of improving health, but reservations remain about their acceptability. These reservations remain mainly because the methods may reflect ability to pay, and hence be discriminatory against the poor. However, willingness-to-pay methods can be made sensitive to income distribution by using appropriate income-sensitive distributional weights. Here, we describe the pros and cons of these methods and how they compare against standard cost-effectiveness analysis using pure health metrics, such as quality-adjusted life years (QALYs) and disability-adjusted life years (DALYs), in the context of vaccine priorities. We conclude that if appropriately used, willingness-to-pay methods will not discriminate against the poor, and they can capture important non-health benefits such as financial risk protection, productivity gains, and economic wellbeing. Copyright © 2014 Elsevier Ltd. All rights reserved.
Lisková, Anna; Krivánková, Ludmila
2005-12-01
Accurate determination of pK(a) values is important for proper characterization of newly synthesized molecules. In this work we have used CZE for determination of the pK(a) values of new compounds prepared from intermediates, 2-, 3- and 4-(2-chloro-acetylamino)-phenoxyacetic acids, by substituting chloride for 2-oxo-pyrrolidine, 2-oxo-piperidine or 2-oxo-azepane. These substances are expected to have a cognition-enhancing activity and a free-radical scavenging effect. Measurements were performed in a polyacrylamide-coated fused-silica capillary of 0.075 mm ID using direct UV detection at 254 nm. Three electrolyte systems were used for the measurements to eliminate effects of potential interactions between the tested compounds and components of the BGE. In the pH range 2.7-5.4, chloride, formate, acetate and phosphate were used as BGE co-ions, and sodium, beta-alanine and epsilon-aminocaproate as counterions. Mobility standards were measured simultaneously with the tested compounds for calculation of correct electrophoretic mobilities. Several approaches to the calculation of the pK(a) values were used. The values of pK(a) were determined by standard point-to-point calculation using the Henderson-Hasselbalch equation. Mobility and pH data were also evaluated by using nonlinear regression. A three-parameter sigmoidal function fitted the experimental data with correlation coefficients higher than 0.99. Results from the CZE measurements were compared with spectrophotometric measurements performed in sodium formate buffer solutions and evaluated at the wavelength where the highest absorbance difference for varying pH was recorded. The experimental pK(a) values were compared with corresponding values calculated by the SPARC online calculator. The results of all three methods were in good correlation.
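The nonlinear-regression approach can be sketched for a monoprotic acid, where the effective mobility follows mu_eff = mu_A / (1 + 10^(pKa − pH)); the mobility/pH data below are synthetic, not the paper's measurements, and the fit uses a generic least-squares routine rather than the authors' software.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of estimating pKa by non-linear regression of effective mobility
# against pH for a monoprotic acid; the mobility/pH data are synthetic.

def effective_mobility(ph, mu_anion, pka):
    """mu_eff = mu_A- / (1 + 10**(pKa - pH)) for HA <-> A- + H+."""
    return mu_anion / (1.0 + 10.0 ** (pka - ph))

ph = np.array([2.7, 3.2, 3.7, 4.2, 4.7, 5.4])
mu = np.array([-3.1, -7.4, -13.8, -19.6, -23.0, -24.8])   # 1e-9 m^2/(V*s), synthetic

(mu_anion_fit, pka_fit), _ = curve_fit(effective_mobility, ph, mu, p0=[-25.0, 3.8])
print(round(pka_fit, 2))
```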
Precision measurements of the RSA method using a phantom model of hip prosthesis.
Mäkinen, Tatu J; Koort, Jyri K; Mattila, Kimmo T; Aro, Hannu T
2004-04-01
Radiostereometric analysis (RSA) has become one of the recommended techniques for pre-market evaluation of new joint implant designs. In this study we evaluated the effect of repositioning of X-ray tubes and phantom model on the precision of the RSA method. In precision measurements, we utilized mean error of rigid body fitting (ME) values as an internal control for examinations. ME value characterizes relative motion among the markers within each rigid body and is conventionally used to detect loosening of a bone marker. Three experiments, each consisting of 10 double examinations, were performed. In the first experiment, the X-ray tubes and the phantom model were not repositioned between one double examination. In experiments two and three, the X-ray tubes were repositioned between one double examination. In addition, the position of the phantom model was changed in experiment three. Results showed that significant differences could be found in 2 of 12 comparisons when evaluating the translation and rotation of the prosthetic components. Repositioning procedures increased ME values mimicking deformation of rigid body segments. Thus, ME value seemed to be a more sensitive parameter than migration values in this study design. These results confirmed the importance of standardized radiographic technique and accurate patient positioning for RSA measurements. Standardization and calibration procedures should be performed with phantom models in order to avoid unnecessary radiation dose of the patients. The present model gives the means to establish and to follow the intra-laboratory precision of the RSA method. The model is easily applicable in any research unit and allows the comparison of the precision values in different laboratories of multi-center trials.
He, Li; Wang, Lan; Li, Lun; Liu, Xiaoyan; Yu, Yijun; Zeng, Xiaoyun; Li, Huanhuan; Gu, Ye
2018-01-01
Non-pharmacological therapies, especially the physical maneuvers, are viewed as important and promising strategies for reducing syncope recurrences in vasovagal syncope (VVS) patients. We observed the efficacy of a modified Valsalva maneuver (MVM) in VVS patients. 72 VVS patients with syncope history and positive head-up tilt table testing (HUTT) results were randomly divided into conventional treatment group (NVM group, n = 36) and conventional treatment plus standard MVM for 30 days group (MVM group, n = 36). Incidence of recurrent syncope after 12 months (6.5% vs. 41.2%, P<0.01) and rate of positive HUTT after 30 days (9.7% vs.79.4%, P<0.01) were significantly lower in MVM group than in NVM group. HRV results showed that low frequency (LF), LF/ high frequency (HF), standard deviation of NN intervals (SDNN) and standard deviation of all 5-min average NN intervals (SDANN) values were significantly lower in the NVM and MVM groups than in the control group at baseline. After 30 days treatment, LF, LF/HF, SDNN, SDANN values were significantly higher compared to baseline in MVM group. Results of Cox proportional hazard model showed that higher SDNN and SDANN values at 30 days after intervention were protective factors, while positive HUTT at 30 days after intervention was risk factor for recurrent syncope. Our results indicate that 30 days MVM intervention could effectively reduce the incidence of recurrent syncope up to 12 months in VVS patients, possibly through improving sympathetic function of VVS patients.
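The HRV indices named above can be computed as sketched below; the NN-interval series is synthetic and much shorter than a real 24-hour recording.

```python
import numpy as np

# HRV time-domain sketch: SDNN is the standard deviation of all NN intervals,
# SDANN the standard deviation of the 5-minute average NN intervals.
# The NN series below is synthetic, not patient data.

def sdnn(nn_ms):
    return float(np.std(nn_ms, ddof=1))

def sdann(nn_ms, segment_s=300):
    nn_ms = np.asarray(nn_ms, dtype=float)
    t = np.cumsum(nn_ms) / 1000.0                 # beat times in seconds
    seg = (t // segment_s).astype(int)            # assign beats to 5-min segments
    means = [nn_ms[seg == s].mean() for s in np.unique(seg)]
    return float(np.std(means, ddof=1))

rng = np.random.default_rng(0)
nn = rng.normal(820, 45, size=4000)               # synthetic NN intervals in ms
print(sdnn(nn), sdann(nn))
```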
Harvey, Stephen B; Krimer, Paula M; Correa, Maria T; Hanes, Martha A
2008-07-01
Plasma biochemical and hematologic values are important parameters for assessing animal health and experimental results. Although normal reference values for many rodent species have been published, there is a dearth of similar information for the genus Microtus. In addition, most studies use a mean and standard deviation to establish reference intervals, but doing so is not the recommendation of the Clinical and Laboratory Standards Institute (formerly the National Committee on Clinical Laboratory Standards) or the International Federation of Clinical Chemistry and Laboratory Medicine. The purpose of this study was to establish normal reference parameters for plasma biochemistry and hematology in mature pine voles (Microtus pinetorum) by using the nonparametric rank percentile method as recommended by the 2 laboratory medicine organizations mentioned. Samples of cardiac blood from a closed colony of pine voles were collected at euthanasia and evaluated under rodent settings on 2 automated hematology analyzers from 2 different manufacturers and on the same type of automated biochemistry analyzer. There were no clinically significant differences between the sexes; younger animals had a lower hematocrit, higher mean corpuscular volume, and lower mean corpuscular hemoglobin concentration than did older animals. Only platelet counts differed when comparing hematologic values from the different analyzers. Relative to rats and mice, pine voles have a lower mean corpuscular volume and higher red blood cell count, higher blood urea nitrogen, much higher alanine aminotransferase, and lower glucose and phosphorus concentrations. Hematology and plasma biochemical results obtained in this study are considered representative for healthy adult laboratory pine voles under similar environmental conditions.
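The nonparametric rank percentile method amounts to taking the central 95% of the sorted observations, as in the sketch below; the analyte values are synthetic, not the pine-vole data.

```python
import numpy as np

# Nonparametric rank-percentile reference interval sketch (central 95%),
# in the spirit of the CLSI recommendation cited in the abstract; the
# analyte values below are synthetic, not the pine-vole measurements.

def reference_interval(values, lower_pct=2.5, upper_pct=97.5):
    values = np.sort(np.asarray(values, dtype=float))
    return (np.percentile(values, lower_pct), np.percentile(values, upper_pct))

rng = np.random.default_rng(1)
alt_u_per_l = rng.lognormal(mean=4.0, sigma=0.3, size=120)   # hypothetical ALT values
print(reference_interval(alt_u_per_l))
```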
Searle, Nancy S; Teal, Cayla R; Richards, Boyd F; Friedland, Joan A; Weigel, Nancy L; Hernandez, Rachael A; Lomax, James W; Coburn, Michael; Nelson, Elizabeth A
2012-07-01
The authors provide the rationale, design, and description of a unique teaching award that has enhanced Baylor College of Medicine's teaching environment and become highly valued by the promotions and tenure (P&T) committee in determining a faculty member's readiness for promotion. This award is self-nominating and standards-based. The primary purpose for development of the award was to provide the Baylor community and the P&T committee a method to understand and value the scholarship of teaching to the same degree that they understand and value the scholarship of discovery. The authors also present results from an internal evaluation of the program that included a survey and interviews. Between the inception of the award in 2001 and the internal review conducted in 2010, the award could have had an influence on the promotion of 130 of the recipients. Of the 130, 88 (65.6%) received this award before gaining their current rank (χ2(1) = 16.3, P < .001). Stakeholders, including department chairs and members of the P&T committee, agreed that this award is valuable to those seeking promotion. Individual recipients stated that the award is good for the institution by encouraging reflection on teaching; increasing the recognition, importance, and value of teaching; encouraging the improvement of teaching skills; and providing a better understanding to others about what medical teachers really do. Of the 214 open-ended responses to survey questions of award recipients, more than half the comments were about the value of the award and its positive effect on promotion.
Okuyucu, Kursat; Ozaydın, Sukru; Alagoz, Engin; Ozgur, Gokhan; Oysul, Fahrettin Guven; Ozmen, Ozlem; Tuncel, Murat; Ozturk, Mustafa; Arslan, Nuri
2016-01-01
Background: Non-Hodgkin's lymphomas arising from tissues other than primary lymphatic organs are termed primary extranodal lymphomas. Most studies have evaluated metabolic tumor parameters in different organs and histopathologic variants of this disease, generally for treatment response. In this study, we aimed to evaluate the prognostic value of metabolic tumor parameters derived from initial FDG-PET/CT in patients with a medley of primary extranodal lymphomas. Patients and methods: There were 67 patients with primary extranodal lymphoma for whom FDG-PET/CT was requested for primary staging. Quantitative PET/CT parameters: maximum standardized uptake value (SUVmax), average standardized uptake value (SUVmean), metabolic tumor volume (MTV) and total lesion glycolysis (TLG) were used to estimate disease-free survival and overall survival. Results: SUVmean, MTV and TLG were found statistically significant after multivariate analysis. SUVmean remained significant after ROC curve analysis. Sensitivity and specificity were calculated as 88% and 64%, respectively, when the cut-off value of SUVmean was chosen as 5.15. After investigation of primary presentation sites and histopathological variants according to recurrence, there was no difference amongst the variants. The primary site of extranodal lymphoma, however, is statistically important (p = 0.014). Testis and central nervous system lymphomas have higher recurrence rates (62.5% and 73%, respectively). Conclusions: High SUVmean, MTV and TLG values obtained from primary staging FDG-PET/CT are potential risk factors for both disease-free survival and overall survival in primary extranodal lymphoma. SUVmean is the most significant amongst them for estimating recurrence/metastasis. PMID:27904443
Schwappach, David LB
2002-01-01
Background: Health economic analysis aimed at informing policy makers and supporting resource allocation decisions has to evaluate not only improvements in health but also avoided decline. Little is known, however, about whether the "direction" in which changes in health are experienced is important to the public when prioritizing among patients. This experimental study investigates the social value people place on avoiding (further) health decline when it is directly compared to curative treatments in resource allocation decisions. Methods: 127 individuals completed an interactive survey that was published on the World Wide Web. They were confronted with a standard gamble (SG) and three person trade-off tasks, either comparing improvements in health (PTO-Up), avoided decline (PTO-Down), or both, contrasting health changes of equal magnitude differing in the direction in which they are experienced (PTO-WAD). Finally, a direct priority ranking of various interventions was obtained. Results: Participants strongly prioritized improving patients' health rather than avoiding decline. The mean substitution rate between health improvements and avoided decline (WAD) ranged between 0.47 and 0.64, dependent on the intervention. Weighting PTO values according to the direction in which changes in health are experienced improved their accuracy in predicting a direct prioritization ranking. Health state utilities obtained by the standard gamble method do not seem to reflect social values in resource allocation contexts. Conclusion: Results suggest that the utility of being cured of a given health state might not be a good approximation for the societal value of avoiding this health state, especially in cases of competition between preventive and curative interventions. PMID:11879529
Summary of Aquifer Test Data for Arkansas - 1940-2006
Pugh, Aaron L.
2008-01-01
As demands on Arkansas's ground water continue to increase, decision-makers need all available information to ensure the sustainability of this important natural resource. From 1940 through 2006, the U.S. Geological Survey has conducted over 300 aquifer tests in Arkansas. Many of these data have never been published. This report presents the results from 206 of these aquifer tests from 21 different hydrogeologic units spread across 51 Arkansas counties. Ten of the hydrogeologic units are within the Atlantic Plain of Arkansas and consist mostly of unconsolidated and semi-consolidated deposits. The remaining 11 units are within the Interior Highlands and consist mainly of consolidated rock. Descriptive statistics are reported for each hydrogeologic unit with two or more tests, including the mean, minimum, median, maximum and standard deviation values for specific capacity, transmissivity, hydraulic conductivity, and storage coefficient. Hydraulic conductivity values for the major water-bearing hydrogeologic units are estimated because few conductivity values are recorded in the original records. Nearly all estimated hydraulic conductivity values agree with published hydraulic conductivity values based on the hydrogeologic unit material types. Similarly, because few specific capacity values were available in the original aquifer test records, specific capacity values are estimated for individual wells.
Human Rights and Values Education: Using the International Standards.
ERIC Educational Resources Information Center
Reardon, Betty A.
1994-01-01
Asserts that, in teaching about human rights, the international standards should be the fundamental core of the content and values to be communicated. Recommends that teachers should use the Universal Declaration of Human Rights as the standard by which the actions of individuals and governments should be compared. (CFR)
High background in Luminex® assay for HLA antibody screening: Interest of Adsorb Out™.
Zerrouki, Asmae; Ouadghiri, Sanae; Benseffaj, Nadia; Razine, Rachid; Essakalli, Malika
2016-05-01
The Luminex® technology has become an integral component of clinical decision-making and of the diagnosis of transplanted organ rejection. Despite the superior sensitivity of this technology, it is not completely problem free. We have observed in these bead-based assays that the sera of some patients give a high negative control bead (NC) value, which makes assessing HLA antibodies difficult. Treatment of sera with the Adsorb Out™ reagent may reduce this high background. In this study, we evaluate the effect of Adsorb Out™ on the NC's MFI value by comparing treated and untreated patient sera. HLA antibody screening was performed on 3011 sera. These sera came from patients awaiting and undergoing renal transplant from different Moroccan hospitals. The sera were analyzed using the standard protocol for Luminex® antibody screening. Sera with a high NC value were pre-incubated with Adsorb Out™ and analyzed on Luminex®. 3% of the studied samples had a high NC value. Adsorb Out™ decreased the NC value and brought it back to the normal range in 62.2% of treated sera. It had no effect in 12.3%. The Adsorb Out™ effect depended only on the NC value, independently of age, storage date, sex and immunization. The Adsorb Out™ reagent has an important effect in decreasing the NC value of sera. However, it has no effect in some patients' sera. In these cases another treatment, such as EDTA or DTT, could be tried. Because non-specific binding may be caused by multiple patient-specific factors, it would be important to search for correlations between these factors and NC values. Copyright © 2016 Elsevier B.V. All rights reserved.
Simulated annealing two-point ray tracing
NASA Astrophysics Data System (ADS)
Velis, Danilo R.; Ulrych, Tadeusz J.
We present a new method for solving the two-point seismic ray tracing problem based on Fermat's principle. The algorithm overcomes some well-known difficulties that arise in standard ray shooting and bending methods. Problems related to (1) the selection of new take-off angles and (2) local minima in multipathing cases are overcome by using an efficient simulated annealing (SA) algorithm. At each iteration, the ray is propagated from the source by solving a standard initial value problem. The last portion of the raypath is then forced to pass through the receiver. Using SA, the total traveltime is then globally minimized by obtaining the initial conditions that produce the absolute minimum path. The procedure is suitable for tracing rays through 2D complex structures, although it can be extended to deal with 3D velocity media. Not only direct waves but also reflected and head waves can be incorporated in the scheme. One important advantage is its simplicity, inasmuch as any available or user-preferred initial value solver can be used. A number of clarifying examples of multipathing in 2D media are examined.
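A minimal sketch of the simulated annealing idea is given below: a Metropolis acceptance rule with geometric cooling globally minimises a one-dimensional objective standing in for traveltime as a function of take-off angle. The toy objective, proposal width and cooling schedule are illustrative assumptions, not the paper's ray-tracing formulation.

```python
import math
import random

# Minimal simulated-annealing sketch: globally minimise a one-dimensional
# objective standing in for traveltime as a function of take-off angle.
# Toy objective and cooling schedule are illustrative assumptions only.

def traveltime(angle_deg):                     # multimodal toy objective
    x = math.radians(angle_deg)
    return 1.0 + 0.3 * math.sin(5.0 * x) + 0.05 * (x - 0.6) ** 2

def anneal(objective, lo, hi, t0=1.0, cooling=0.995, steps=5000, seed=0):
    rng = random.Random(seed)
    x = rng.uniform(lo, hi)
    f = objective(x)
    best_x, best_f, t = x, f, t0
    for _ in range(steps):
        cand = min(hi, max(lo, x + rng.gauss(0.0, 5.0)))      # perturb take-off angle
        fc = objective(cand)
        if fc < f or rng.random() < math.exp((f - fc) / t):   # Metropolis acceptance
            x, f = cand, fc
            if f < best_f:
                best_x, best_f = x, f
        t *= cooling                                          # geometric cooling
    return best_x, best_f

print(anneal(traveltime, lo=0.0, hi=90.0))
```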
NASA Astrophysics Data System (ADS)
Luo, B.; Ren, F.; Fitch, R. C.; Gillespie, J. K.; Jenkins, T.; Sewell, J.; Via, D.; Crespo, A.; Baca, A. G.; Briggs, R. D.; Gotthold, D.; Birkhahn, R.; Peres, B.; Pearton, S. J.
2003-06-01
A comparison was made of the specific contact resistivity and morphology of Ti/Al/Pt/WSi/Ti/Au and Ti/Al/Pt/W/Ti/Au ohmic contacts to AlGaN/GaN heterostructures relative to the standard Ti/Al/Pt/Au metallization. The W- and WSi-based contacts show comparable specific resistivities to that of the standard contact on similar layer structures, reaching minimum values of ~10⁻⁵ Ω cm² after annealing in the range 850-900 °C. However, the W- and WSi-based contacts exhibit much smoother surface morphologies, even after 950 °C annealing. For example, the root-mean-square roughness of the Ti/Al/Pt/WSi/Ti/Au contact annealed at 950 °C was unchanged from the as-deposited value, whereas the Ti/Al/Pt/Au contact shows significant deterioration of the morphology under these conditions. The improved thermal stability of the W- and WSi-based contacts is important for maintaining edge acuity during high-temperature operation.
The accuracy and efficiency of electronic screening for recruitment into a clinical trial on COPD.
Schmickl, Christopher N; Li, Man; Li, Guangxi; Wetzstein, Marnie M; Herasevich, Vitaly; Gajic, Ognjen; Benzo, Roberto P
2011-10-01
Participant recruitment is an important process in successful conduct of randomized controlled trials. To facilitate enrollment into a National Institutes of Health-sponsored clinical trial involving patients with chronic obstructive pulmonary disease (COPD), we developed and prospectively validated an automated electronic screening tool based on boolean free-text search of admission notes in electronic medical records. During a 2-week validation period, all patients admitted to prespecified general medical services were screened for eligibility by both the electronic screening tool and a COPD nurse. Group discussion was the gold standard for confirmation of true-positive results. Compared with the gold standard, electronic screening yielded 100% sensitivity, 92% specificity, 100% negative predictive value, and 72% positive predictive value. Compared with traditional manual screening, electronic screening demonstrated time-saving potential of 76%. Thus, the electronic screening tool accurately identifies potential study subjects and improves efficiency of patient accrual for a clinical trial on COPD. This method may be expanded into other institutional and clinical settings. Copyright © 2011 Elsevier Ltd. All rights reserved.
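The reported accuracy figures follow from standard 2 × 2 confusion-matrix arithmetic, sketched below with hypothetical counts chosen only to give numbers of the same flavour as those in the abstract.

```python
# Diagnostic-accuracy arithmetic for a screening tool, from a 2x2 confusion
# matrix. The counts are hypothetical, chosen only to yield figures of the
# same flavour as the sensitivity/specificity/PPV/NPV reported above.

def screening_metrics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

print(screening_metrics(tp=13, fp=5, tn=60, fn=0))   # hypothetical 2-week tally
```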
Satherley, Nicole; Milojev, Petar; Greaves, Lara M.; Huang, Yanshu; Osborne, Danny; Bulbulia, Joseph; Sibley, Chris G.
2015-01-01
This study examines attrition rates over the first four years of the New Zealand Attitudes and Values Study, a longitudinal national panel sample of New Zealand adults. We report the base rate and covariates for the following four distinct classes of respondents: explicit withdrawals, lost respondents, intermittent respondents and constant respondents. A multinomial logistic regression examined an extensive range of demographic and socio-psychological covariates (among them the Big-Six personality traits) associated with membership in these classes (N = 5,814). Results indicated that men, Māori and Asian peoples were less likely to be constant respondents. Conscientiousness and Honesty-Humility were also positively associated with membership in the constant respondent class. Notably, the effect sizes for the socio-psychological covariates of panel attrition tended to match or exceed those of standard demographic covariates. This investigation broadens the focus of research on panel attrition beyond demographics by including a comprehensive set of socio-psychological covariates. Our findings show that core psychological covariates convey important information about panel attrition, and are practically important to the management of longitudinal panel samples like the New Zealand Attitudes and Values Study. PMID:25793746
New Data Bases and Standards for Gravity Anomalies
NASA Astrophysics Data System (ADS)
Keller, G. R.; Hildenbrand, T. G.; Webring, M. W.; Hinze, W. J.; Ravat, D.; Li, X.
2008-12-01
Ever since the use of high-precision gravimeters emerged in the 1950s, gravity surveys have been an important tool for geologic studies. Recent developments that make geologically useful measurements from airborne and satellite platforms, the ready availability of the Global Positioning System that provides precise vertical and horizontal control, improved global data bases, and the increased availability of processing and modeling software have accelerated the use of the gravity method. As a result, efforts are being made to improve the gravity databases publicly available to the geoscience community by expanding their holdings and increasing the accuracy and precision of the data in them. Specifically, the North American Gravity Database as well as the individual databases of Canada, Mexico, and the United States are being revised using new formats and standards to improve their coverage, standardization, and accuracy. An important part of this effort is revision of the procedures and standards for calculating gravity anomalies, taking into account the enhanced computational power available, modern satellite-based positioning technology, improved terrain databases, and increased interest in more accurately defining the different components of gravity anomalies. The most striking revision is the use of a single internationally accepted reference ellipsoid for the horizontal and vertical datums of gravity stations as well as for the computation of the theoretical gravity value. The new standards hardly affect the interpretation of local anomalies, but they do improve regional anomalies in that long-wavelength artifacts are removed. Most importantly, such new standards can be consistently applied to gravity database compilations of nations, continents, and even the entire world. Although many types of gravity anomalies have been described, they fall into three main classes. The primary class incorporates planetary effects, which are analytically prescribed, to derive the predicted or modeled gravity, and thus anomalies of this class are termed planetary. The most primitive version of a gravity anomaly is simply the difference between the value of gravity predicted by the effect of the reference ellipsoid and the observed gravity. When the height of the gravity station increases, the ellipsoidal gravity anomaly decreases because of the increased distance of the measurement from the anomaly-producing masses. The two primary anomalies in geophysics, which are appropriately classified as planetary anomalies, are the Free-air and Bouguer gravity anomalies. They employ models that account for planetary effects on gravity, including the topography of the earth. A second class of anomaly, geological anomalies, includes the modeled gravity effect of known or assumed masses, leading to the predicted gravity by using geological data such as densities and crustal thickness. The third class of anomaly, filtered anomalies, removes arbitrary gravity effects of largely unknown sources that are empirically or analytically determined from the nature of the gravity anomalies by filtering.
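As a rough illustration of the planetary anomalies mentioned above, the following sketch computes Free-air and simple Bouguer anomalies using textbook constants (WGS84 normal gravity, a 0.3086 mGal/m free-air gradient, and a 2.67 g/cm³ Bouguer slab density). These are conventional illustrative values, not the revised standards described in the abstract.

```python
import math

def normal_gravity_mgal(lat_deg):
    """Theoretical (normal) gravity on the WGS84 ellipsoid, in mGal
    (Somigliana closed form with standard WGS84 constants)."""
    ge, k, e2 = 978032.53359, 0.00193185265241, 0.00669437999014
    s2 = math.sin(math.radians(lat_deg)) ** 2
    return ge * (1.0 + k * s2) / math.sqrt(1.0 - e2 * s2)

def free_air_anomaly(g_obs_mgal, lat_deg, h_m):
    """Observed gravity minus normal gravity, plus the conventional
    free-air correction of ~0.3086 mGal per metre of elevation."""
    return g_obs_mgal - normal_gravity_mgal(lat_deg) + 0.3086 * h_m

def simple_bouguer_anomaly(g_obs_mgal, lat_deg, h_m, rho=2.67):
    """Free-air anomaly minus the infinite-slab (Bouguer) correction,
    2*pi*G*rho*h ~= 0.04193 * rho * h mGal (rho in g/cm^3, h in m)."""
    return free_air_anomaly(g_obs_mgal, lat_deg, h_m) - 0.04193 * rho * h_m

# Example with invented station values: latitude 40 N, elevation 1500 m.
print(free_air_anomaly(979300.0, 40.0, 1500.0),
      simple_bouguer_anomaly(979300.0, 40.0, 1500.0))
```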
2012-12-01
calibrated using a certified mineral or pure metal standard and counting times are chosen to provide 3-sigma detection limits of between 100-200 ppm... also submit "blind" duplicates for analyses. The precision of the data generated by the "EMPA point count" will be evaluated by calculating RPD values... important to consider the variation in results among all samples studied for a particular media, since the overall particle count is very large. Data
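Assuming RPD in this excerpt stands for relative percent difference, the usual agreement metric for blind duplicate analyses, it can be computed as in the following sketch (the example values are invented):

```python
def relative_percent_difference(a, b):
    """Relative percent difference between duplicate measurements:
    |a - b| divided by their mean, expressed as a percentage."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

# e.g. duplicate counts of 152 and 148 ppm give an RPD of about 2.7%
print(relative_percent_difference(152, 148))
```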
Determination of the heat capacities of Lithium/BCX (bromide chloride in thionyl chloride) batteries
NASA Technical Reports Server (NTRS)
Kubow, Stephen A.; Takeuchi, Kenneth J.; Takeuchi, Esther S.
1989-01-01
Heat capacities of twelve different Lithium/BCX (BrCl in thionyl chloride) batteries in sizes AA, C, D, and DD were determined. Procedures and measurement results are reported. The procedure allowed simple, reproducible, and precise determinations of heat capacities of industrially important Lithium/BCX cells, without interfering with performance of the cells. Use of aluminum standards allowed the accuracy of the measurements to be maintained. The measured heat capacities were within 5 percent of calculated heat capacity values.
The Hpp Rule with Memory and the Density Classification Task
NASA Astrophysics Data System (ADS)
Alonso-Sanz, Ramón
This article considers an extension to the standard framework of cellular automata that implements memory capability in cells. It is shown that the important HPP block rule behaves as an excellent classifier of the density in the initial configuration when applied to cells endowed with a weighted memory of their previous states. If the weighting assigns the highest weights to the most recent state values, the HPP rule surpasses the performance of the best two-dimensional density classifiers reported in the literature.
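A minimal sketch of the memory mechanism described above: each cell's state history is collapsed into an effective state using geometrically decreasing weights, so the most recent states count most, and the update rule is then applied to effective states rather than raw current states. The weighting parameter alpha and the simple majority rule are illustrative stand-ins, not the actual two-dimensional HPP lattice-gas rule.

```python
import numpy as np

def weighted_memory_state(history, alpha=0.8):
    """Collapse a cell's 0/1 state history (oldest first) into one effective
    state: the most recent state gets weight alpha**0, the one before alpha**1,
    and so on, so recent states dominate for alpha < 1. The effective state is
    1 when the weighted average of past states exceeds 0.5 (ties simplified)."""
    history = np.asarray(history, dtype=float)
    weights = alpha ** np.arange(len(history))[::-1]  # most recent -> largest weight
    return int(np.sum(weights * history) / np.sum(weights) > 0.5)

def majority_step(effective_states):
    """Toy update rule (plain majority vote) applied to effective states."""
    return int(np.sum(effective_states) > len(effective_states) / 2)

# Example: a neighbourhood of three cells, each with a four-step history.
histories = [[0, 0, 1, 1], [1, 0, 0, 0], [0, 1, 1, 1]]
effective = [weighted_memory_state(h) for h in histories]
print(effective, majority_step(effective))
```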
NASA Technical Reports Server (NTRS)
Stutzman, W. L.; Dishman, W. K.
1982-01-01
A simple attenuation model (SAM) is presented for estimating rain-induced attenuation along an earth-space path. The rain model uses an effective spatial rain distribution which is uniform for low rain rates and which has an exponentially shaped horizontal rain profile for high rain rates. When compared to other models, the SAM performed well in the important region of low percentages of time, and had the lowest percent standard deviation of all percent time values tested.
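The following sketch illustrates the general idea of path-integrated rain attenuation with a power-law specific attenuation and an exponentially decaying horizontal rain profile above a threshold rain rate. The coefficients and profile parameters below are placeholders chosen for illustration only, not the actual SAM formulation or values from the paper.

```python
import numpy as np

def path_attenuation_db(rain_rate, path_km, a=0.0101, b=1.276,
                        r_break=10.0, gamma=1.0 / 22.0, n=500):
    """Illustrative rain attenuation over a path (dB): specific attenuation
    a*R**b (dB/km) integrated along the path, with a uniform rain profile at
    low rain rates and an exponentially decaying horizontal profile
    R(x) = R0 * exp(-gamma * ln(R0 / r_break) * x) above the break point."""
    x = np.linspace(0.0, path_km, n)
    if rain_rate <= r_break:
        profile = np.full_like(x, rain_rate)
    else:
        profile = rain_rate * np.exp(-gamma * np.log(rain_rate / r_break) * x)
    dx = x[1] - x[0]
    return float(np.sum(a * profile ** b) * dx)   # simple Riemann sum

print(path_attenuation_db(rain_rate=5.0, path_km=6.0))    # low-rate, uniform case
print(path_attenuation_db(rain_rate=50.0, path_km=6.0))   # high-rate, decaying case
```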
NASA Technical Reports Server (NTRS)
Beij, K Hilding
1933-01-01
This report presents a concise survey of the measurement of air speed and ground speed on board aircraft. Special attention is paid to the pitot-static air-speed meter which is the standard in the United States for airplanes. Air-speed meters of the rotating vane type are also discussed in considerable detail on account of their value as flight test instruments and as service instruments for airships. Methods of ground-speed measurement are treated briefly, with reference to the more important instruments. A bibliography on air-speed measurement concludes the report.
Dental technician pneumoconiosis mimicking lung cancer.
Uyar, Meral; Sokucu, Oral; Sanli, Maruf; Filiz, Ayten; Ali Ikidag, Mehmet; Feridun Isik, Ahmet; Bakir, Kemal
2015-09-01
A 47-year-old man was referred for assessment of bilateral lymph node enlargement identified on a routine chest radiograph. Positron emission tomography showed high standardized uptake values (SUVmax: 20.5) in right supraclavicular, right intercostal, and multiple mediastinal lymph nodes. Biopsy samples obtained from the right upper and left lower paratracheal nodes by mediastinoscopy revealed granulomatous inflammation. Clinical and laboratory findings indicated a diagnosis of dental technician pneumoconiosis. The patient is alive and well 3 years after diagnosis. This case highlights the importance of obtaining an occupational history.
Carbon Sequestration in Created and Natural Tidal Marshes of the Florida Panhandle
NASA Astrophysics Data System (ADS)
Rainville, K. M.; Davis, J.; Currin, C.
2016-12-01
Salt marshes are widely understood to be efficient at storing carbon in sediments (so-called blue carbon) through the production of roots and rhizomes. These marshes are also able to trap sediments from incoming tides, slowly increasing their elevation over time. These qualities have led to a great deal of interest in the creation and preservation of salt marshes for offsetting changes associated with anthropogenic CO2 emissions. Determining the value of marshes in terms of CO2 offsets requires detailed knowledge of sediment carbon storage rates, but to date, measured rates of carbon storage in created salt marsh sediments are sparse. We measured carbon storage in natural and created marshes along the Northern Gulf Coast of Florida. The created marshes were in 'living shoreline' projects and ranged in age from 8 to 28 years. Dominant plant cover of the marshes included Spartina alterniflora and Juncus spp. At all sites, sediment cores (22-75 cm in depth) were collected, extruded in 5 cm increments, and carbon content was determined by elemental analysis. Measured C storage rates in the created marshes ranged from 60 to 130 g C m⁻² yr⁻¹ and decreased with marsh age. A decrease in storage rates over time is evidence of continued decomposition of stored carbon as sediments age, an important factor to consider when estimating the value of a given marsh for CO2 offsets. The rates measured in Florida are well below previously published average values (~200 g m⁻² yr⁻¹) and also below the default value allowed for carbon crediting through the verified carbon standard (146 g m⁻² yr⁻¹), but similar to those measured in created marshes in North Carolina. In addition, factors such as dominant plant type, water inundation, temperature, latitude, belowground biological activity and biomass can affect the carbon storage rates of marshes in geographically distinct regions. This makes it especially important to determine carbon storage rates on a local scale rather than simply adopting a verified carbon standard default. These data add to the geographic coverage over which documented C storage rates are currently available and suggest that locally determined rates are necessary for accurate carbon accounting.
Assessing the Added Value of Dynamical Downscaling Using the Standardized Precipitation Index
In this study, the Standardized Precipitation Index (SPI) is used to ascertain the added value of dynamical downscaling over the contiguous United States. WRF is used as a regional climate model (RCM) to dynamically downscale reanalysis fields to compare values of SPI over drough...
Call to action: Better care, better health, and greater value in college health.
Ciotoli, Carlo; Smith, Allison J; Keeling, Richard P
2018-03-05
It is time for action by leaders across higher education to strengthen quality improvement (QI) in college health, in pursuit of better care, better health, and increased value - goals closely linked to students' learning and success. The size and importance of the college student population; the connections between wellbeing, and therefore QI, and student success; the need for improved standards and greater accountability; and the positive contributions of QI to employee satisfaction and professionalism all warrant a widespread commitment to building greater capacity and capability for QI in college health. This report aims to inspire, motivate, and challenge college health professionals and their colleagues, campus leaders, and national entities to take both immediate and sustainable steps to bring QI to the forefront of college health practice - and, by doing so, to elevate care, health, and value of college health as a key pathway to advancing student success.
Exposure to formaldehyde in health care: an evaluation of the white blood count differential.
Sancini, Angela; Rosati, Maria Valeria; De Sio, Simone; Casale, Teodorico; Caciari, Tiziana; Samperi, Ilaria; Sacco, Carmina; Fortunato, Bruna Rita; Pimpinella, Benedetta; Andreozzi, Giorgia; Tomei, Gianfranco; Tomei, Francesco
2014-01-01
The aim of our study is to assess whether occupational exposure to formaldehyde can cause alterations in leukocyte blood values in health care workers employed in a large hospital, compared with a control group. We studied employees in operating rooms and in laboratories of Pathological Anatomy, Molecular Biology, Molecular Neurobiology, Parasitology and Experimental Oncology (exposed to formaldehyde) and employees of the Department of Internal Medicine (not exposed). The sample studied comprised 86 workers exposed to formaldehyde and 86 workers not exposed. All subjects underwent a clinical-anamnestic examination, and the following values were measured for all subjects: total white blood cells, lymphocytes, monocytes and granulocytes (eosinophils, basophils, neutrophils). Statistical analysis of the data was based on calculation of the mean, the standard deviation and the distribution into classes according to the nature of each variable. Differences were considered significant when p was < 0.05. The mean and the distribution of values of white blood cells, lymphocytes, monocytes and eosinophils were significantly higher in male subjects exposed to formaldehyde than in those not exposed. No significant differences were found between exposed and non-exposed female subjects. The results underline the importance of a careful risk assessment of workers exposed to formaldehyde and the use of appropriate preventive measures. Health care workers trained and informed about the risks to which they are exposed should observe good standards of behavior and, where it is not possible to use alternative materials, indoor concentrations of formaldehyde should never exceed occupational limit values.
Perspective: the revolution is upon us.
Sierles, Frederick S
2010-05-01
Profound socioeconomic pressures on medical student education have been catalogued extensively. These pressures include shortages of teaching patients, teacher shortages, conflicting systems, and financial problems. Many of these problems have been caused by an unregulated free market affecting medicine overall, with market values sometimes overshadowing the academic values of education, research, and patient care. This has caused profound changes in the conduct of medical student education. Particularly important has been a reduction in the "gold standard" of teaching: direct student-teacher and supervised student-patient interaction, replaced by a potpourri of online and simulated modules. The aggregate of these changes constitutes a revolution that challenges whether medical schools, school buildings, classes, and dedicated faculty are even necessary. The author posits several recommendations in response to this revolution: (1) recognize the revolution as such, and carefully guide or abort it, lest its outcome be inadequate, inauthentic, or corrupt, (2) prioritize academic rather than business values, (3) ensure that funds allotted for education are used for education, (4) insist that medical schools, not industry, teach students, (5) value authentic education more than simulation, (6) adopt learner-centered teaching without misusing it, (7) maintain acceptable class attendance without requiring it, (8) provide, from the first school day, authentic, patient-centered medical education characterized by vertical integration, humanism, early patient exposure, biopsychosocial orientation, and physician role modeling, (9) ensure that third- and fourth-year students have rich patient-care responsibility, and (10) keep tenure. These actions would permit the preservation of an educational gold standard that justifies medical education's cost.
Suárez, Inmaculada; Coto, Baudilio
2015-08-14
Average molecular weights and polydispersity indexes are among the most important parameters considered in polymer characterization. Usually, gel permeation chromatography (GPC) and multi-angle light scattering (MALS) are used for this determination, but GPC values are overestimated due to the dispersion introduced by the column separation. Several procedures have been proposed to correct this effect, usually involving more complex calibration processes. In this work, a new method of calculation has been considered that includes diffusion effects. An equation for the concentration profile due to diffusion effects along the GPC column was considered to be a Fickian function, and polystyrene narrow standards were used to determine effective diffusion coefficients. The molecular weight distribution function of mono- and polydisperse polymers was interpreted as a sum of several Fickian functions representing a sample formed by only a few kinds of polymer chains with specific molecular weights and diffusion coefficients. The proposed model accurately fits the concentration profile along the whole elution time range, as checked by the computed standard deviation. Molecular weights obtained by this new method are similar to those obtained by MALS or traditional GPC, while polydispersity index values are intermediate between those obtained by traditional GPC combined with the Universal Calibration method and the MALS method. Values of the Pearson and Lin coefficients show an improvement in the correlation of polydispersity index values determined by the GPC and MALS methods when diffusion coefficients and the new method are used. Copyright © 2015 Elsevier B.V. All rights reserved.
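To illustrate the kind of multi-component fit described above, the sketch below fits a synthetic elution profile with a sum of Gaussian peaks standing in for the paper's Fickian functions; all peak parameters, the time axis and the noise level are invented for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

def multi_peak_profile(t, *params):
    """Sum of Gaussian peaks (a stand-in for Fickian concentration profiles).
    params is a flat sequence of (amplitude, centre, width) triplets."""
    y = np.zeros_like(t, dtype=float)
    for i in range(0, len(params), 3):
        a, c, w = params[i:i + 3]
        y += a * np.exp(-((t - c) ** 2) / (2.0 * w ** 2))
    return y

# Synthetic elution profile: two overlapping peaks plus noise.
t = np.linspace(10, 30, 400)
true = multi_peak_profile(t, 1.0, 18.0, 1.2, 0.6, 22.0, 1.8)
rng = np.random.default_rng(0)
signal = true + rng.normal(scale=0.01, size=t.size)

# Fit and report the recovered peak parameters and the residual SD.
p0 = [0.9, 17.5, 1.0, 0.5, 22.5, 1.5]
popt, _ = curve_fit(multi_peak_profile, t, signal, p0=p0)
print(popt, np.std(signal - multi_peak_profile(t, *popt)))
```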
Singla, Ashish; Kundu, Hansa; P., Basavaraj; Singh, Shilpi; Singh, Khushboo; Jain, Swati
2014-01-01
Introduction: Quality of drinking water is a powerful environmental determinant of health. The main objective of introducing bottled water in the society was its better safety, taste and convenience over tap water. The present study was conducted to assess the physicochemical and bacterial qualities of bottled water and sachet water available in various markets of Delhi. Materials and Methods: Sixteen water bottles and four water sachets were selected through stratified random sampling from various public places in Delhi and their analysis was done at the National Test House, Ghaziabad. Results were then compared with national (IS10500, IS14543) and international (WHO, FDA, USEPA) standards. Results: Bottled water showed better quality than sachet water. The mean value of copper (0.0746 mg/l) in bottles exceeded the standard values of IS10500 and IS14543 (0.05), while the mean value of lead (0.008 mg/l) exceeded the FDA standard value (0.005). When the results of sachets were compared with those of the standards, the mean values of selenium (0.1195 mg/l) and lead (0.862 mg/l) were found to exceed the values of both Indian and international standards. For the biological parameter, i.e. coliform count, the mean value for bottles was 0 (nil), whereas the mean value for sachets was 16.75, which showed the unhealthy nature of sachets. Conclusion: The parameters tested in the present study revealed an excess of various chemical and bacterial parameters in drinking water, which could pose serious threats to consumers. These results suggest a more stringent standardization of the bottled water market with special attention to quality, identity and licensing by the concerned authorities, to safeguard the health of consumers. PMID:24783149
Leivada, Evelina; Papadopoulou, Elena; Pavlou, Natalia
2017-01-01
Findings from the field of experimental linguistics have shown that a native speaker may judge a variant that is part of her grammar as unacceptable, but still use it productively in spontaneous speech. The process of eliciting acceptability judgments from speakers of non-standard languages is sometimes clouded by factors akin to prescriptive notions of grammatical correctness. It has been argued that standardization enhances the ability to make clear-cut judgments, while non-standardization may result in grammatical hybridity, often manifested in the form of functionally equivalent variants in the repertoire of a single speaker. Recognizing the importance of working with corpora of spontaneous speech, this work investigates patterns of variation in the spontaneous production of five neurotypical, adult speakers of a non-standard variety in terms of three variants, each targeting one level of linguistic analysis: syntax, morphology, and phonology. The results reveal the existence of functionally equivalent variants across speakers and levels of analysis. We first discuss these findings in relation to the notions of competing, mixed, and fused grammars, and then we flesh out the implications that different values of the same variant carry for parametric approaches to Universal Grammar. We observe that intraspeaker realizations of different values of the same variant within the same syntactic environment are incompatible with the ‘triggering-a-single-value’ approach of parametric models, but we argue that they are compatible with the concept of Universal Grammar itself. Since the analysis of these variants is ultimately a way of investigating the status of Universal Grammar primitives, we conclude that claims about the alleged unfalsifiability of (the contents of) Universal Grammar are unfounded. PMID:28790953
Traceable quantum sensing and metrology relying upon a quantum electrical triangle principle
NASA Astrophysics Data System (ADS)
Fang, Yan; Wang, Hengliang; Yang, Xinju; Wei, Jingsong
2016-11-01
Hybrid quantum state engineering in quantum communication and imaging [1-2] requires traceable quantum sensing and metrology, which are especially critical to the quantum internet [3] and to precision measurements [4] that are important across all fields of science and technology. We aim to set up a mode of traceable quantum sensing and metrology. We developed a method that specially transforms an atomic force microscope (AFM) and a scanning tunneling microscope (STM) into a conducting atomic force microscope (C-AFM) with a feedback control loop, in which the quantum entanglement enabling higher precision relies on a set-point, a visible-light laser-beam-controlled interferometer with a surface standard for the z axis, diffractometers with lateral standards for the x-y axes, four-quadrant photodiode detectors, a scanner and its imaging software, a phase-locked pre-amplifier, a cantilever with a kHz Pt/Au conducting tip, a double-barrier tunneling junction model, an STM circuit with frequency modulation, and a quantum electrical triangle principle involving the single-electron tunneling effect, the quantum Hall effect and the Josephson effect [5]. The average and standard deviation of repeated C-AFM measurements on a 1 nm high local micro-region of a nanomedicine-crystal hybrid quantum state engineering surface, and of its differential pA-level current and voltage (dI/dV) in the time domain, were converted into the SI unit of conductance, the siemens (S): an indicated value of 0.86×10⁻¹² S (n = 6), whose relative standard uncertainty was superior to the relative standard uncertainty reference value of 2.3×10⁻¹⁰ S for the 2012 CODATA quantized conductance [6]. It is concluded that traceable quantum sensing and metrology is emerging.
On prediction of genetic values in marker-assisted selection.
Lange, C; Whittaker, J C
2001-01-01
We suggest a new approximation for the prediction of genetic values in marker-assisted selection. The new approximation is compared to the standard approach. It is shown that the new approach will often provide substantially better prediction of genetic values; furthermore the new approximation avoids some of the known statistical problems of the standard approach. The advantages of the new approach are illustrated by a simulation study in which the new approximation outperforms both the standard approach and phenotypic selection. PMID:11729177
Autonomy, religion and clinical decisions: findings from a national physician survey
Lawrence, R E; Curlin, F A
2010-01-01
Background Patient autonomy has been promoted as the most important principle to guide difficult clinical decisions. To examine whether practising physicians indeed value patient autonomy above other considerations, physicians were asked to weight patient autonomy against three other criteria that often influence doctors’ decisions. Associations between physicians’ religious characteristics and their weighting of the criteria were also examined. Methods Mailed survey in 2007 of a stratified random sample of 1000 US primary care physicians, selected from the American Medical Association masterfile. Physicians were asked how much weight should be given to the following: (1) the patient’s expressed wishes and values, (2) the physician’s own judgment about what is in the patient’s best interest, (3) standards and recommendations from professional medical bodies and (4) moral guidelines from religious traditions. Results Response rate 51% (446/879). Half of physicians (55%) gave the patient’s expressed wishes and values “the highest possible weight”. In comparative analysis, 40% gave patient wishes more weight than the other three factors, and 13% ranked patient wishes behind some other factor. Religious doctors tended to give less weight to the patient’s expressed wishes. For example, 47% of doctors with high intrinsic religious motivation gave patient wishes the “highest possible weight”, versus 67% of those with low (OR 0.5; 95% CI 0.3 to 0.8). Conclusions Doctors believe patient wishes and values are important, but other considerations are often equally or more important. This suggests that patient autonomy does not guide physicians’ decisions as much as is often recommended in the ethics literature. PMID:19332575
Assessment of an undergraduate psychiatry course in an African setting.
Baig, Benjamin J; Beaglehole, Anna; Stewart, Robert C; Boeing, Leonie; Blackwood, Douglas H; Leuvennink, Johan; Kauye, Felix
2008-04-22
International reports recommend improving the amount and quality of training for mental health workers in low- and middle-income countries. The Scotland-Malawi Mental Health Education Project (SMMHEP) has been established to support the teaching of psychiatry to medical students at the University of Malawi. While such supportive medical education initiatives appear valuable anecdotally, little quantitative evidence exists to demonstrate whether they can deliver comparable educational standards. This study aimed to assess the effectiveness of an undergraduate psychiatry course given by UK psychiatrists in Malawi by studying the performance of University of Malawi and Edinburgh University medical students on an MCQ examination paper. An undergraduate psychiatry course followed by an MCQ exam was delivered by the SMMHEP to 57 Malawi medical students. The same MCQ exam was given to 71 Edinburgh University medical students who subsequently sat their own Edinburgh University examination. There were no significant differences between Edinburgh students' performance on the Malawi exam and on their own Edinburgh University exam (p = 0.65). This suggests that the Malawi exam is of a comparable standard to the Edinburgh exam. Malawi students' marks ranged from 52.4%-84.6%. Importantly, 84.4% of Malawi students scored above 60% on their exam, which would equate to a hypothetical pass by UK university standards. The support of an undergraduate course in an African setting by high-income-country specialists can attain a high percentage pass rate by UK standards. Although didactic teaching has been surpassed by more novel educational methods, in resource-poor countries it remains an effective and cost-effective method of attaining an important educational standard.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-18
... pollution in fresh water systems can significantly negatively impact aquatic life and long-term ecosystem... Water Quality Standards for the State of Florida's Streams and Downstream Protection Values for Lakes... its numeric water quality standards for nutrients in Florida that were promulgated and published on...
NASA Astrophysics Data System (ADS)
Lee, H.; Sheen, D.; Kim, S.
2013-12-01
The b-value in the Gutenberg-Richter relation is an important parameter widely used not only in the interpretation of regional tectonic structure but also in seismic hazard analysis. In this study, we tested four methods for estimating a stable b-value from a small number of events using the Monte Carlo method. One is the least-squares method (LSM), which minimizes the observation error. The others are based on the maximum likelihood method (MLM), which maximizes the likelihood function: Utsu's (1965) method for continuous magnitudes and an infinite maximum magnitude, Page's (1968) method for continuous magnitudes and a finite maximum magnitude, and Weichert's (1980) method for interval magnitudes and a finite maximum magnitude. A synthetic parent population earthquake catalog of one million events from magnitude 2.0 to 7.0 with an interval of 0.1 was generated for the Monte Carlo simulation. Samples, whose number was increased from 25 to 1000, were extracted randomly from the parent population. The resampling procedure was applied 1000 times with different random seed numbers. The mean and the standard deviation of the b-value were estimated for each sample group with the same number of samples. As expected, the more samples were used, the more stable the b-value obtained. However, with a small number of events, the LSM generally gave a low b-value with a large standard deviation, while the MLMs gave more accurate and stable values. It was found that Utsu's (1965) method gives the most accurate and stable b-value even with a small number of events. It was also found that the selection of the minimum magnitude can be critical for estimating the correct b-value with Utsu's (1965) and Page's (1968) methods if magnitudes are binned into intervals. Therefore, we applied Utsu's (1965) method to estimate the b-value for two instrumental earthquake catalogs that contain events which occurred around the southern part of the Korean Peninsula from 1978 to 2011. With a careful choice of the minimum magnitude, the b-values of the earthquake catalogs of the Korea Meteorological Administration and Kim (2012) are estimated to be 0.72 and 0.74, respectively.
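For concreteness, the sketch below implements the two families of estimators being compared: the Aki/Utsu maximum-likelihood b-value (with the usual half-bin correction for binned magnitudes) and a least-squares fit to the cumulative frequency-magnitude curve, applied to a synthetic Gutenberg-Richter catalogue with b = 1. It is a generic illustration, not the authors' simulation code.

```python
import numpy as np

def b_value_mle(mags, m_min, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value; dm is the magnitude bin width
    (half-bin correction). Use dm=0 for continuous magnitudes."""
    mags = np.asarray(mags, dtype=float)
    mags = mags[mags >= m_min]
    return np.log10(np.e) / (mags.mean() - (m_min - dm / 2.0))

def b_value_lsq(mags, m_min, dm=0.1):
    """Least-squares b-value: minus the slope of log10(cumulative count)
    versus magnitude."""
    mags = np.asarray(mags, dtype=float)
    mags = mags[mags >= m_min]
    edges = np.arange(m_min, mags.max() + dm, dm)
    cum = np.array([(mags >= m).sum() for m in edges])
    keep = cum > 0
    slope, _ = np.polyfit(edges[keep], np.log10(cum[keep]), 1)
    return -slope

# Synthetic catalogue with b = 1.0, complete above M 2.0: magnitudes above the
# minimum are exponentially distributed with scale log10(e)/b.
rng = np.random.default_rng(1)
mags = 2.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=200)
print(b_value_mle(mags, 2.0, dm=0.0), b_value_lsq(mags, 2.0))
```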
Product or waste? Importation and end-of-life processing of computers in Peru.
Kahhat, Ramzy; Williams, Eric
2009-08-01
This paper considers the importation of used personal computers (PCs) in Peru and domestic practices in their production, reuse, and end-of-life processing. The empirical pillars of this study are analysis of government data describing trade in used and new computers and surveys and interviews of computer sellers, refurbishers, and recyclers. The United States is the primary source of used PCs imported to Peru. Analysis of shipment value (as measured by trade statistics) shows that 87-88% of imported used computers had a price higher than the ideal recycle value of constituent materials. The official trade in end-of-life computers is thus driven by reuse as opposed to recycling. The domestic reverse supply chain of PCs is well developed with extensive collection, reuse, and recycling. Environmental problems identified include open burning of copper-bearing wires to remove insulation and landfilling of CRT glass. Distinct from informal recycling in China and India, printed circuit boards are usually not recycled domestically but exported to Europe for advanced recycling or to China for (presumably) informal recycling. It is notable that purely economic considerations lead to circuit boards being exported to Europe where environmental standards are stringent, presumably due to higher recovery of precious metals.
Coors, Marilyn E; Matthew, Thomas L; Matthew, Dayna B
2015-10-01
At the invitation of the Rwandan Government, Team Heart, a team of American healthcare professionals, performs volunteer rheumatic heart disease (RHD) surgery in Rwanda every year, and confronts ethical concerns that call for cultural sensitivity. This article describes how five standard bioethical precepts are applied in practice in medical volunteerism related to RHD surgery in Rwanda. The content for the applied precepts stems from semiscripted, transcribed conversations with the authors, two Rwandan cardiologists, a Rwandan nurse and a Rwandan premedical student. The conversations revealed that the criteria for RHD surgical selection in Rwanda are analogous to the patient-selection process involving material scarcity in the USA. Rwandan notions of benefit and harm focus more attention on structural issues, such as shared benefit, national reputation and expansion of expertise, than traditional Western notions. Harm caused by inadequate patient follow-up remains a critical concern. Gender disparities regarding biological and social implications of surgical valve choices impact considerations of justice. Individual agency remains important, but not central to Rwandan concepts of justice, transparency and respect, particularly regarding women. The Rwandan understanding of standard bioethical precepts is substantively similar to the traditionally recognised interpretation with important contextual differences. The communal importance of improving the health of a small number of individuals may be underestimated in previous literature. Moreover, openness and the incorporation of Rwandan stakeholders in difficult ethical choices and long-term contributions to indigenous medical capacity appear to be valued by Rwandans. These descriptions of applied precepts are applicable to different medical missions in other emerging nations following a similar process of inclusion. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
How to model a negligible probability under the WTO sanitary and phytosanitary agreement?
Powell, Mark R
2013-06-01
Since the 1997 EC-Hormones decision, World Trade Organization (WTO) Dispute Settlement Panels have wrestled with the question of what constitutes a negligible risk under the Sanitary and Phytosanitary Agreement. More recently, the 2010 WTO Australia-Apples Panel focused considerable attention on the appropriate quantitative model for a negligible probability in a risk assessment. The 2006 Australian Import Risk Analysis for Apples from New Zealand translated narrative probability statements into quantitative ranges. The uncertainty about a "negligible" probability was characterized as a uniform distribution with a minimum value of zero and a maximum value of 10⁻⁶. The Australia-Apples Panel found that the use of this distribution would tend to overestimate the likelihood of "negligible" events and indicated that a triangular distribution with a most probable value of zero and a maximum value of 10⁻⁶ would correct the bias. The Panel observed that the midpoint of the uniform distribution is 5 × 10⁻⁷ but did not consider that the triangular distribution has an expected value of 3.3 × 10⁻⁷. Therefore, if this triangular distribution is the appropriate correction, the magnitude of the bias found by the Panel appears modest. The Panel's detailed critique of the Australian risk assessment, and the conclusions of the WTO Appellate Body about the materiality of flaws found by the Panel, may have important implications for the standard of review for risk assessments under the WTO SPS Agreement. © 2012 Society for Risk Analysis.
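The Panel's two numbers can be reproduced directly from the properties of the two candidate distributions, as in the short calculation below.

```python
# The two candidate models for a "negligible" probability discussed above.
a, b = 0.0, 1e-6                        # minimum and maximum probability
uniform_mean = (a + b) / 2.0            # 5.0e-07: the midpoint noted by the Panel
mode = 0.0
triangular_mean = (a + mode + b) / 3.0  # ~3.3e-07: expected value of the triangular model
print(uniform_mean, triangular_mean)
```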
Liu, Hsin-yi; Pearlman, Jonathan; Cooper, Rosemarie; Hong, Eun-kyoung; Wang, Hongwu; Salatin, Benjamin; Cooper, Rory A
2010-01-01
Previous studies found that select titanium ultralight rigid wheelchairs (TURWs) had fewer equivalent cycles and less value than select aluminum ultralight folding wheelchairs (AUFWs). The causes of premature failure of TURWs were not clear because the TURWs had different frame material and design than the AUFWs. We tested 12 aluminum ultralight rigid wheelchairs (AURWs) with similar frame designs and dimensions as the TURWs using the American National Standards Institute/Rehabilitation Engineering and Assistive Technology Society of North America and International Organization for Standardization wheelchair standards and hypothesized that the AURWs would be more durable than the TURWs. Across wheelchair models, no significant differences were found in the test results between the AURWs and TURWs, except in their overall length. Tire pressure, tube-wall thickness, and tube manufacturing were proposed to be the factors affecting wheelchair durability through comparison of the failure modes, frames, and components. The frame material did not directly affect the performance of AURWs and TURWs, but proper wheelchair manufacture and design based on mechanical properties are important.
Statistical Data Editing in Scientific Articles.
Habibzadeh, Farrokh
2017-07-01
Scientific journals are important scholarly forums for sharing research findings. Editors have important roles in safeguarding standards of scientific publication and should be familiar with correct presentation of results, among other core competencies. Editors do not have access to the raw data and should thus rely on clues in the submitted manuscripts. To identify probable errors, they should look for inconsistencies in presented results. Common statistical problems that can be picked up by a knowledgeable manuscript editor are discussed in this article. Manuscripts should contain a detailed section on the statistical analyses of the data. Numbers should be reported with appropriate precision. The standard error of the mean (SEM) should not be reported as an index of data dispersion. The mean (standard deviation [SD]) and the median (interquartile range [IQR]) should be used for the description of normally and non-normally distributed data, respectively. If possible, it is better to report 95% confidence intervals (CIs) for statistics, at least for the main outcome variables. P values should be presented, and interpreted with caution, if there is a hypothesis. To advance the knowledge and skills of their members, associations of journal editors would do well to develop training courses on basic statistics and research methodology for non-experts. This would in turn improve research reporting and safeguard the body of scientific evidence. © 2017 The Korean Academy of Medical Sciences.
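A small sketch of the recommended summary statistics follows, using invented data; it shows mean (SD) and median (IQR) as descriptive measures, the SEM as a measure of the precision of the mean rather than of data dispersion, and a t-based 95% CI.

```python
import numpy as np
from scipy import stats

def describe(x):
    """Descriptive statistics in the forms recommended above."""
    x = np.asarray(x, dtype=float)
    n = x.size
    mean, sd = x.mean(), x.std(ddof=1)
    sem = sd / np.sqrt(n)                 # SEM: precision of the mean, not data spread
    median = np.median(x)
    q1, q3 = np.percentile(x, [25, 75])   # IQR for non-normally distributed data
    ci = stats.t.interval(0.95, df=n - 1, loc=mean, scale=sem)  # 95% CI of the mean
    return {"mean (SD)": (mean, sd), "median (IQR)": (median, (q1, q3)),
            "SEM": sem, "95% CI": ci}

print(describe([5.1, 4.8, 5.6, 5.0, 4.9, 5.3]))   # invented measurements
```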
Farmer views on calving difficulty consequences on dairy and beef farms.
Martin-Collado, D; Hely, F; Byrne, T J; Evans, R; Cromie, A R; Amer, P R
2017-02-01
Calving difficulty (CD) is a key functional trait with significant influence on herd profitability and animal welfare. Breeding plays an important role in managing CD both at the farm and at the industry level. An alternative to the economic value approach for determining the CD penalty is to complement the economic models with an analysis of the farmer-perceived on-farm impacts of CD. The aim of this study was to explore dairy and beef farmer views and perceptions of the economic and non-economic on-farm consequences of CD, to ultimately inform future genetic selection tools for the beef and dairy industries in Ireland. A standardised quantitative online survey was released to all farmers with e-mail addresses on the Irish Cattle Breeding Federation database. In total, 271 farmers completed the survey (173 beef farmers and 98 dairy farmers). Both dairy and beef farmers considered CD a very important issue with economic and non-economic components. However, CD was seen as more problematic by dairy farmers, who mostly preferred to slightly reduce its incidence, than by beef farmers, who tended to support increases in calf value even if this implied a slight increase in CD incidence. Farm size was found to be related to dairy farmer views of CD, with farmers from larger farms considering CD more problematic than farmers from smaller farms. The CD breeding value was reported to be critical for selecting beef sires to mate with either beef or dairy cows, whereas when selecting dairy sires, CD had lower importance than breeding values for other traits. There was considerable variability in the importance farmers give to CD breeding values that could not be explained by the farm type or the type of sire used, which might be related to farmers' non-economic motives. The economic value farmers associate with an incremental increase in CD rises substantially as the CD level considered increases. This non-linear relationship cannot be reflected in a standard linear index weighting. The results of this paper provide key underpinning support for the development of non-linear index weightings for CD in Irish national indexes.
Ma, Weina; Yang, Liu; Lv, Yanni; Fu, Jia; Zhang, Yanmin; He, Langchong
2017-06-23
The equilibrium dissociation constant (KD) of drug-membrane receptor affinity is the basic parameter that reflects the strength of the interaction. The cell membrane chromatography (CMC) method is an effective technique for studying the characteristics of drug-membrane receptor affinity. In this study, a CMC relative standard method for the determination of drug-membrane receptor affinity was established and used to analyze the relative KD values of drugs binding to membrane receptors (the epidermal growth factor receptor and the angiotensin II receptor). The KD values obtained by the CMC relative standard method had a strong correlation with those obtained by the frontal analysis method. Additionally, the KD values obtained by the CMC relative standard method correlated with the pharmacological activity of the drugs being evaluated. The CMC relative standard method is a convenient and effective method to evaluate drug-membrane receptor affinity. Copyright © 2017 Elsevier B.V. All rights reserved.
Pessiglione, Mathias
2017-01-01
A standard view in neuroeconomics is that to make a choice, an agent first assigns subjective values to the available options, and then compares them to select the best. In choice tasks, these cardinal values are typically inferred from the preferences expressed by subjects between options presented in pairs. Alternatively, cardinal values can be directly elicited by asking subjects to place a cursor on an analog scale (rating task) or to exert a force on a power grip (effort task). These tasks can vary in many respects: notably, they can be more or less costly and consequential. Here, we compared the value functions elicited by choice, rating and effort tasks on options composed of two monetary amounts: one for the subject (gain) and one for a charity (donation). Bayesian model selection showed that despite important differences between the three tasks, they all elicited the same value function, with similar weighting of gain and donation, but variable concavity. Moreover, value functions elicited by the different tasks could predict choices with equivalent accuracy. Our finding therefore suggests that comparable value functions can account for various motivated behaviors, beyond economic choice. Nevertheless, we report slight differences in the computational efficiency of parameter estimation that may guide the design of future studies. PMID:29161252
Addressing the hidden curriculum: understanding educator professionalism.
Glicken, Anita Duhl; Merenstein, Gerald B
2007-02-01
Several authors agree that student observations of behaviors are a far greater influence than prescriptions for behavior offered in the classroom. While these authors stress the importance of modeling of professional relationships with patients and colleagues, at times they have fallen short of acknowledging the importance of the values inherent in the role of the professional educator. This includes relationships and concomitant behaviors that stem from the responsibilities of being an educator based on expectations of institutional and societal culture. While medical professionals share standards of medical practice in exercising medical knowledge, few have obtained formal training in the knowledge, skills and attitudes requisite for teaching excellence. Attention needs to be paid to the professionalization of medical educators as teachers, a professionalization process that parallels and often intersects the values and behaviors of medical practice but remains a distinct and important body of knowledge and skills unto itself. Enhancing educator professionalism is a critical issue in educational reform, increasing accountability for meeting student needs. Assumptions regarding educator professionalism are subject to personal and cultural interpretation, warranting additional dialogue and research as we work to expand definitions and guidelines that assess and reward educator performance.
How to compare cross-lagged associations in a multilevel autoregressive model.
Schuurman, Noémi K; Ferrer, Emilio; de Boer-Sonnenschein, Mieke; Hamaker, Ellen L
2016-06-01
By modeling variables over time it is possible to investigate the Granger-causal cross-lagged associations between variables. By comparing the standardized cross-lagged coefficients, the relative strength of these associations can be evaluated in order to determine important driving forces in the dynamic system. The aim of this study was twofold: first, to illustrate the added value of a multilevel multivariate autoregressive modeling approach for investigating these associations over more traditional techniques; and second, to discuss how the coefficients of the multilevel autoregressive model should be standardized for comparing the strength of the cross-lagged associations. The hierarchical structure of multilevel multivariate autoregressive models complicates standardization, because subject-based statistics or group-based statistics can be used to standardize the coefficients, and each method may result in different conclusions. We argue that in order to make a meaningful comparison of the strength of the cross-lagged associations, the coefficients should be standardized within persons. We further illustrate the bivariate multilevel autoregressive model and the standardization of the coefficients, and we show that disregarding individual differences in dynamics can prove misleading, by means of an empirical example on experienced competence and exhaustion in persons diagnosed with burnout. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
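A minimal sketch of the within-person standardization argued for above: a cross-lagged coefficient for the effect of x at t-1 on y at t is rescaled by that person's own standard deviations. The function name, the coefficient value and the example series are hypothetical.

```python
import numpy as np

def standardize_cross_lagged(phi_xy, x_series, y_series):
    """Standardize an unstandardized cross-lagged coefficient phi_xy
    (effect of x at t-1 on y at t) using this person's own SDs:
        phi_std = phi_xy * SD(x) / SD(y).
    Group-based SDs would mix within- and between-person variance."""
    return phi_xy * np.std(x_series, ddof=1) / np.std(y_series, ddof=1)

# One person's (invented) daily competence and exhaustion scores:
competence = np.array([3.1, 3.4, 2.9, 3.6, 3.2, 3.0, 3.5])
exhaustion = np.array([2.0, 1.8, 2.4, 1.6, 2.1, 2.3, 1.7])
print(standardize_cross_lagged(0.25, competence, exhaustion))
```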
Hoffman, Robert A; Wang, Lili; Bigos, Martin; Nolan, John P
2012-09-01
Results from a standardization study cosponsored by the International Society for Advancement of Cytometry (ISAC) and the US National Institute of Standards and Technology (NIST) are reported. The study evaluated the variability of assigning intensity values to fluorophore standard beads by bead manufacturers and the variability of cross calibrating the standard beads to stained polymer beads (hard-dyed beads) using different flow cytometers. Hard dyed beads are generally not spectrally matched to the fluorophores used to stain cells, and spectral response varies among flow cytometers. Thus if hard dyed beads are used as fluorescence calibrators, one expects calibration for specific fluorophores (e.g., FITC or PE) to vary among different instruments. Using standard beads surface-stained with specific fluorophores (FITC, PE, APC, and Pacific Blue™), the study compared the measured intensity of fluorophore standard beads to that of hard dyed beads through cross calibration on 133 different flow cytometers. Using robust CV as a measure of variability, the variation of cross calibrated values was typically 20% or more for a particular hard dyed bead in a specific detection channel. The variation across different instrument models was often greater than the variation within a particular instrument model. As a separate part of the study, NIST and four bead manufacturers used a NIST supplied protocol and calibrated fluorophore solution standards to assign intensity values to the fluorophore beads. Values assigned to the reference beads by different groups varied by orders of magnitude in most cases, reflecting differences in instrumentation used to perform the calibration. The study concluded that the use of any spectrally unmatched hard dyed bead as a general fluorescence calibrator must be verified and characterized for every particular instrument model. Close interaction between bead manufacturers and NIST is recommended to have reliable and uniformly assigned fluorescence standard beads. Copyright © 2012 International Society for Advancement of Cytometry.
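Robust CV is not defined in the abstract; one common convention in cytometry, assumed here, estimates a Gaussian-equivalent SD from the interquartile range and divides it by the median, as sketched below with invented intensity values.

```python
import numpy as np

def robust_cv(intensities):
    """Robust CV (%): a Gaussian-equivalent SD estimated from the IQR
    (IQR / 1.349, i.e. 0.7413 * IQR) divided by the median."""
    x = np.asarray(intensities, dtype=float)
    q1, q3 = np.percentile(x, [25, 75])
    return 100.0 * 0.7413 * (q3 - q1) / np.median(x)

# Cross-calibrated intensity values for one bead/channel across instruments:
print(robust_cv([980, 1010, 1050, 1200, 990, 1100, 1150, 940]))
```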
Numerical analysis of whole-body cryotherapy chamber design improvement
NASA Astrophysics Data System (ADS)
Yerezhep, D.; Tukmakova, A. S.; Fomin, V. E.; Masalimov, A.; Asach, A. V.; Novotelnova, A. V.; Baranov, A. Yu
2018-05-01
Whole-body cryotherapy (WBC) is a state-of-the-art method that uses cold for the treatment and prevention of diseases. The process involves exposing the human body to cryogenic gas in a special cryochamber. The temperature field in the chamber is of great importance, since local over-cooling of the integument may occur. A numerical simulation of WBC has been carried out. A modification of the chamber design has been proposed in order to increase the uniformity of the internal temperature field. The results have been compared with those obtained for a standard chamber design. The temperature gradient formed in the chamber containing a curved wall of a certain height was reduced by almost a factor of two compared with the results obtained for the standard design. The proposed modification may increase both the safety and the comfort of cryotherapy.
Mayer, Dieter; Rancic, Zoran; Pfammatter, Thomas; Hechelhammer, Lukas; Veith, Frank J; Donas, Konstantin; Lachat, Mario
2010-01-01
The value of emergency endovascular aneurysm repair (EVAR) in the setting of ruptured abdominal aortic aneurysm remains controversial owing to differing results. However, interpretation of published results remains difficult as there is a lack of generally accepted protocols or standard operating procedures. Furthermore, such protocols and standard operating procedures often are reported incompletely or not at all, thereby making interpretation of results difficult. We herein report our integrated logistic system for the endovascular treatment of ruptured abdominal aortic aneurysms. Important components of this system are prehospital logistics, in-hospital treatment logistics, and aftercare. Further studies should include details about all of these components, and a description of these logistic components must be included in all future studies of emergency EVAR for ruptured abdominal aortic aneurysms.
Progress toward a new beam measurement of the neutron lifetime
NASA Astrophysics Data System (ADS)
Hoogerheide, Shannon Fogwell; BL2 Collaboration
2017-01-01
Neutron beta decay is the simplest example of nuclear beta decay. A precise value of the neutron lifetime is important for consistency tests of the Standard Model and Big Bang Nucleosynthesis models. The beam neutron lifetime method requires the absolute counting of the decay protons in a neutron beam of precisely known flux. Recent work has resulted in improvements in both the neutron and proton detection systems that should permit a significant reduction in systematic uncertainties. A new measurement of the neutron lifetime using the beam method is underway at the National Institute of Standards and Technology Center for Neutron Research. The projected uncertainty of this new measurement is 1 s. An overview of the measurement, its current status, and the technical improvements will be discussed.
Conditions of transfer and quality of food.
Southern, K J; Rasekh, J G; Hemphill, F E; Thaler, A M
2006-08-01
Many factors contribute to the production of safe foods of animal origin. Initiatives for an integrated approach to food safety recognise the importance of optimising transportation conditions to ensure on-farm interventions are preserved. Physical, microbial, and environmental hazards during the transportation process may adversely affect the safety and quality of meat, poultry, and egg products. Additionally, the stress level in animals can be raised by transportation conditions, potentially causing increased pathogen shedding in carrier animals which exposes other animals to possible contamination. The physiological effects of stress on animals can reduce the quality of meat, poultry, and egg products produced by the animals, thus decreasing the economic value of the animal. Increased globalisation of markets provides an incentive for transportation standards of food animals within a country as well as transportation standards between countries.
Edge-Based Image Compression with Homogeneous Diffusion
NASA Astrophysics Data System (ADS)
Mainberger, Markus; Weickert, Joachim
It is well-known that edges contain semantically important image information. In this paper we present a lossy compression method for cartoon-like images that exploits information at image edges. These edges are extracted with the Marr-Hildreth operator followed by hysteresis thresholding. Their locations are stored in a lossless way using JBIG. Moreover, we encode the grey or colour values at both sides of each edge by applying quantisation, subsampling and PAQ coding. In the decoding step, information outside these encoded data is recovered by solving the Laplace equation, i.e. we inpaint with the steady state of a homogeneous diffusion process. Our experiments show that the suggested method outperforms the widely-used JPEG standard and can even beat the advanced JPEG2000 standard for cartoon-like images.
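A minimal sketch of the decoding step described above: the grey values stored along the edges are kept fixed and the remaining pixels are filled with the steady state of homogeneous diffusion, i.e. a numerical solution of the Laplace equation. The Jacobi iteration and the wrap-around boundary handling are simplifications for illustration, not the paper's solver.

```python
import numpy as np

def homogeneous_diffusion_inpaint(known, mask, n_iter=2000):
    """Fill unknown pixels with the steady state of homogeneous diffusion,
    i.e. solve the Laplace equation with the stored edge values as Dirichlet
    data. `known` holds grey values, `mask` is True where values were encoded."""
    u = known.astype(float)
    for _ in range(n_iter):                     # Jacobi iterations
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                      np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(mask, known, avg)          # keep encoded pixels fixed
    return u

# Toy example: grey values known only along two vertical "edges".
img = np.zeros((32, 32)); img[:, 5] = 50.0; img[:, 25] = 200.0
mask = np.zeros_like(img, dtype=bool); mask[:, 5] = True; mask[:, 25] = True
print(homogeneous_diffusion_inpaint(img, mask)[16, 15])   # smoothly interpolated value
```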
Gregg, Daniel; Wheeler, Sarah Ann
2018-08-15
To date, the majority of environmental assets studied in the economic valuation literature clearly have high amenity and recreational use values. However, there are many cases where small, but nevertheless unique and important, ecosystems survive as islands amongst large areas of modified, productive, or urban landscapes. Development encroaches on the landscape, and as urban landscapes become more concentrated these types of conservation islands will become increasingly important. Previous experience with economic valuation suggests that the lower total values for smaller contributions to conservation are more liable to be swamped by survey and hypothetical biases. Hence there needs to be more understanding of approaches to economic valuation for small and isolated environmental assets, in particular regarding the control of stated-preference biases. This study applied the recently developed method of Inferred Valuation (IV) to a small private wetland in South-East Australia, and compared willingness-to-pay values with estimates from a standard Contingent Valuation (CV) approach. We found that hypothetical bias did seem to be slightly lower with the IV method. However, other methods, such as the use of log-normal transformations and median measures, significantly mitigate apparent hypothetical biases and are easier to apply, allowing use of the well-tested CV method. Copyright © 2018 Elsevier Ltd. All rights reserved.
2013-01-01
Background Measurements of the morphology of the ankle joint, performed mostly for surgical planning of total ankle arthroplasty and for collecting data for total ankle prosthesis design, are often made on planar radiographs, and therefore can be very sensitive to the positioning of the joint during imaging. The current study aimed to compare ankle morphological measurements using CT-generated 2D images with gold standard values obtained from 3D CT data; to determine the sensitivity of the 2D measurements to mal-positioning of the ankle during imaging; and to quantify the repeatability of the 2D measurements under simulated positioning conditions involving random errors. Method Fifty-eight cadaveric ankles fixed in the neutral joint position (standard pose) were CT scanned, and the data were used to simulate lateral and frontal radiographs under various positioning conditions using digitally reconstructed radiographs (DRR). Results and discussion In the standard pose for imaging, most ankle morphometric parameters measured using 2D images were highly correlated (R > 0.8) to the gold standard values defined by the 3D CT data. For measurements made on the lateral views, the only parameters sensitive to rotational pose errors were longitudinal distances between the most anterior and the most posterior points of the tibial mortise and the tibial profile, which have important implications for determining the optimal cutting level of the bone during arthroplasty. Measurements of the trochlea tali width on the frontal views underestimated the standard values by up to 31.2%, with only a moderate reliability, suggesting that pre-surgical evaluations based on the trochlea tali width should be made with caution in order to avoid inappropriate selection of prosthesis sizes. Conclusions While highly correlated with 3D morphological measurements, some 2D measurements were affected by the bone poses in space during imaging, which may affect surgical decision-making in total ankle arthroplasty, including the amount of bone resection and the selection of the implant sizes. The linear regression equations for the relationship between 2D and 3D measurements will be helpful for correcting the errors in 2D morphometric measurements for clinical applications. PMID:24359413
Standardization of the finished product: Habbe Irqun Nisa - A Unani anti-inflammatory formulation.
Husain, S Farhan; Ahmad, Irshad; Shamsi, Shariq
2012-07-01
Habb (pill) is one of the important dosage forms of the Unani system of medicine. A number of effective formulations are manufactured in the form of Habb because of its various advantages. Among these, Habbe Irqun Nisa (HI) is a popular anti-inflammatory formulation used in the treatment of Warame Mafasil (arthritis) and Irqun Nisa (sciatica). Nowadays, with the increased incidence of these diseases, many non-steroidal anti-inflammatory drugs (NSAIDs) are being used in their treatment. Owing to the adverse effects of these drugs, the use of herbal medicines is seen as a better alternative. The basic requirement for the development of the Unani system of medicine is the standardization of single and compound drugs. HI is mentioned in the National Formulary of Unani Medicine and was selected for the present study. HI was prepared manually with the powder of crude drugs, passed through sieve no. 100 and mixed with 1% w/w of gum acacia in mucilage form. It was then dried at 60°C for 90 min and tested for its standardization on different physicochemical parameters, e.g. organoleptic properties, pH values, moisture content, ash values, friability, hardness, weight variation, disintegration time, and thin layer chromatography (TLC). The data generated by this study will make it a validated product and will help in the quality control of other finished products in future research.
[Physicochemical composition of bottled drinking water marketed in Ouagadougou (Burkina Faso)].
Some, Issa Touridomon; Banao, Issouf; Gouado, Inocent; Tapsoba, Théophile Lincoln
2009-01-01
The bottled drinking water marketed in urban areas includes natural mineral water, spring water, and treated drinking water. Their physicochemical qualities depend on the type and quantity of their components and define their safe use. Bottled water is widely consumed in Ouagadougou (Burkina Faso), and many brand names exist. Although many publications have examined the microbiological qualities of such water, no study has examined the physicochemical quality of water from Burkina Faso. This study, conducted from March 2005 through January 2006, aimed to assess the physicochemical composition of drinking water sold in Ouagadougou to facilitate better choices and use by consumers. Results showed that all the water analyzed in Ouagadougou is soft (TH < 50 ppm) or moderately soft (50 < TH < 200 ppm) and weakly mineralized (total dissolved solids < 500 mg/L, sulfates [SO(2-)(4)] < 200 mg/L, [Ca(2+)] < 150 mg/L, [Mg(2+)] < 50 mg/L, and [HCO(3)(-)] < 600 mg/L). Some imported water, however, is hard and highly mineralized. French standards do not set limit values for the natural mineral water parameters described above, and much of the water sold in Ouagadougou was natural mineral water. The spring water met potability standards, except for the Montagne d'Arrée brand, which had a pH value of 5.8, below the WHO standard range of 6.5 < pH < 8.5.
NASA Astrophysics Data System (ADS)
Iwaki, Y.
2010-07-01
The quality assurance (QA) of measurands has been discussed for many years in Quality Engineering (QE), and further discussion of the relevant ISO standards is needed. The aim is to find the root fault elements that degrade measurement accuracy and to remove them. Accuracy assurance requires investigation of the Reference Material (RM) used for calibration and improvement of the accuracy of data processing; this research addresses accuracy improvement in the data-processing stage. In many cases, more than one fault element relevant to measurement accuracy lies hidden. QE assumes the frequency with which each fault state is generated and resolves the fault factors from the highest rank downward, first by Failure Mode and Effects Analysis (FMEA); it then investigates the root cause of each fault element by Root Cause Analysis (RCA) and Fault Tree Analysis (FTA), and ranks the elements generating the assumed faults. Assurance of the accuracy of measurement results has now become a duty in Proficiency Testing (PT). For QA, ISO guidance on accuracy assurance was established in 1993 by the ISO-GUM (Guide to the Expression of Uncertainty in Measurement) [1]. The analysis method of ISO-GUM shifts from Analysis of Variance (ANOVA) to Exploratory Data Analysis (EDA); EDA calculates step by step, following the law of propagation of uncertainty, until the required assurance performance is obtained. If the true value is unknown, ISO-GUM substitutes a reference value; the reference value is set up by EDA and checked with the Key Comparison (KC) method, which compares a null hypothesis against a frequency hypothesis. The ISO-GUM assurance operation proceeds in order from the standard uncertainties, to the combined uncertainty of the many fault elements, to the expanded uncertainty used for assurance. The assured value is obtained by multiplying the final expanded uncertainty [2] by the coverage factor k, which is calculated from the effective degrees of freedom (EFD), for which the number of samples is important; the degrees of freedom are based on a maximum-likelihood approach with an improved information criterion (AIC) for Quality Control (QC). The assurance performance of ISO-GUM is decided by setting the confidence interval [3]. The results of research on the decision level / minimum detectable concentration (DL/MDC) were also able to profit from this operation. QE was developed for the QC of industry; however, such analyses have usually been processed by regression analysis, treating the frequency distribution of a statistic as normal. The occurrence probability of the statistics of fault elements that accompany natural phenomena is, in many cases, non-normal, and a non-normal distribution requires the assurance value to be obtained by methods other than the Type B statistical treatment of ISO-GUM. Fusing these approaches with the improvement of workers through QE has become important for preserving the reliability of measurement accuracy and safety. This research applies the approach to the results of Blood Chemical Analysis (BCA) in the field of clinical testing.
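As a concrete illustration of the ISO-GUM chain from standard uncertainties to an expanded, coverage-factor-based assurance value, the following minimal Python sketch combines a few hypothetical uncertainty components, computes the effective degrees of freedom with the Welch-Satterthwaite formula, and derives a coverage factor k for approximately 95 % coverage. All component values, sensitivity coefficients and degrees of freedom are invented for illustration and are not taken from the abstract.

```python
import numpy as np
from scipy import stats

# Hypothetical standard uncertainties u_i, sensitivity coefficients c_i and
# degrees of freedom nu_i (Type B components often get effectively infinite dof).
u = np.array([0.12, 0.08, 0.05])
c = np.array([1.0, 1.0, 2.0])
dof = np.array([9.0, 14.0, np.inf])

contrib = (c * u) ** 2
u_c = np.sqrt(contrib.sum())                    # combined standard uncertainty

# Welch-Satterthwaite effective degrees of freedom
nu_eff = u_c ** 4 / np.sum(contrib ** 2 / dof)

# Coverage factor k for ~95 % coverage from the t-distribution, expanded uncertainty U
k = stats.t.ppf(0.975, nu_eff)
U = k * u_c
print(f"u_c = {u_c:.3f}, nu_eff = {nu_eff:.1f}, k = {k:.2f}, U = {U:.3f}")
```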
Palm-Based Standard Reference Materials for Iodine Value and Slip Melting Point
Tarmizi, Azmil Haizam Ahmad; Lin, Siew Wai; Kuntom, Ainie
2008-01-01
This work describes the study protocols for the production of Palm-Based Standard Reference Materials for iodine value and slip melting point. Thirty-three laboratories collaborated in the inter-laboratory proficiency tests for characterization of the iodine value, and thirty-two laboratories in those for the slip melting point. The iodine value and slip melting point of palm oil, palm olein and palm stearin were determined in accordance with MPOB Test Methods p3.2:2004 and p4.2:2004, respectively. The consensus values and their uncertainties were based on the acceptability of the statistical agreement of results obtained from the collaborating laboratories. The consensus values and uncertainties for iodine value were 52.63 ± 0.14 Wijs in palm oil, 56.77 ± 0.12 Wijs in palm olein and 33.76 ± 0.18 Wijs in palm stearin. For the slip melting points, the consensus values and uncertainties were 35.6 ± 0.3 °C in palm oil, 22.7 ± 0.4 °C in palm olein and 53.4 ± 0.2 °C in palm stearin. Repeatability and reproducibility relative standard deviations were good and acceptable, with values well below 10%. The Palm-Based Standard Reference Materials remained stable at temperatures of −20 °C, 0 °C, 6 °C and 24 °C upon storage for one year. PMID:19609396
Narrative Interest Standard: A Novel Approach to Surrogate Decision-Making for People With Dementia.
Wilkins, James M
2017-06-17
Dementia is a common neurodegenerative process that can significantly impair decision-making capacity as the disease progresses. When a person is found to lack capacity to make a decision, a surrogate decision-maker is generally sought to aid in decision-making. Typical bases for surrogate decision-making include the substituted judgment standard and the best interest standard. Given the heterogeneous and progressive course of dementia, however, these standards for surrogate decision-making are often insufficient in providing guidance for the decision-making for a person with dementia, escalating the likelihood of conflict in these decisions. In this article, the narrative interest standard is presented as a novel and more appropriate approach to surrogate decision-making for people with dementia. Through case presentation and ethical analysis, the standard mechanisms for surrogate decision-making for people with dementia are reviewed and critiqued. The narrative interest standard is then introduced and discussed as a dementia-specific model for surrogate decision-making. Through incorporation of elements of a best interest standard in focusing on the current benefit-burden ratio and elements of narrative to provide context, history, and flexibility for values and preferences that may change over time, the narrative interest standard allows for elaboration of an enriched context for surrogate decision-making for people with dementia. More importantly, however, a narrative approach encourages the direct contribution from people with dementia in authoring the story of what matters to them in their lives.
Recommendations for postmarketing surveillance studies in haemophilia and other bleeding disorders.
Lassila, R; Rothschild, C; De Moerloose, P; Richards, M; Perez, R; Gajek, H
2005-07-01
Prospective surveillance studies to monitor drug safety in the postapproval period are rarely employed systematically, although they are of greatest value for caregivers, drug users and regulatory authorities. Safety issues have affected not only conventional pharmaceuticals, but also especially coagulation factors in haemophilia treatment. The reputation of postmarketing surveillance (PMS) studies has been questionable, mainly due to their misuse to solicit prescriptions. Other weaknesses include inappropriate design, lack of standardized observation, limited follow-up periods, absence of rigour in identifying potential adverse drug effects, and infrequent publication. Although well-designed clinical trials represent the gold standard for generating sound clinical evidence, a number of aspects would make PMS studies valuable, if properly conducted. One of their main advantages is broader inclusion, and absence of an 'experimental' design. Lack of proper guidelines, and standardization may constitute a reason for the generally low quality of PMS studies. This paper proposes guidelines for haemophilia-specific PMS studies, in order to improve the acceptance of a basically valuable tool. In the absence of consistent regulatory guidance it will be especially important that the design and supervision of PMS studies involves physicians from the beginning. This will not only make such studies more scientifically relevant, but also help to implement them into daily clinical practice. Specifically in haemophilia, PMS studies may provide valuable data on clinical outcomes, or Quality of Life, which is of great importance when considering adequate standards of care in haemophilia patients.
Evaluation of Radiopacity of Bulk-fill Flowable Composites Using Digital Radiography.
Tarcin, B; Gumru, B; Peker, S; Ovecoglu, H S
2016-01-01
New flowable composites that may be bulk-filled in layers up to 4 mm are indicated as a base beneath posterior composite restorations. Sufficient radiopacity is one of the several important requirements such materials should meet. The aim of this study was to evaluate the radiopacity of bulk-fill flowable composites and to provide a comparison with conventional flowable composites using digital imaging. Ten standard specimens (5 mm in diameter, 1 mm in thickness) were prepared from each of four different bulk-fill flowable composites and nine different conventional flowable composites. Radiographs of the specimens were taken together with 1-mm-thick tooth slices and an aluminum step wedge using a digital imaging system. For the radiographic exposures, a storage phosphor plate and a dental x-ray unit at 70 kVp and 8 mA were used. The object-to-focus distance was 30 cm, and the exposure time was 0.2 seconds. The gray values of the materials were measured using the histogram function of the software available with the system, and radiopacity was calculated as the equivalent thickness of aluminum. The data were analyzed statistically (p<0.05). All of the tested bulk-fill flowable composites showed significantly higher radiopacity values in comparison with those of enamel, dentin, and most of the conventional flowable composites (p<0.05). Venus Bulk Fill (Heraeus Kulzer) provided the highest radiopacity value, whereas Arabesk Flow (Voco) showed the lowest. The order of the radiopacity values for the bulk-fill flowable composites was as follows: Venus Bulk Fill (Heraeus Kulzer) ≥ X-tra Base (Voco) > SDR (Dentsply DeTrey) ≥ Filtek Bulk Fill (3M ESPE). To conclude, the bulk-fill flowable restorative materials, which were tested in this study using digital radiography, met the minimum standard of radiopacity specified by the International Standards Organization.
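Converting measured grey values to aluminium-equivalent thickness is, in essence, an interpolation on the step-wedge calibration curve. The short sketch below (in Python, with invented grey values rather than the study's data) fits a polynomial calibration to the step-wedge readings and evaluates a specimen reading against it.

```python
import numpy as np

# Hypothetical step-wedge calibration: mean grey value per aluminium step thickness (mm)
al_thickness = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
al_grey = np.array([58, 74, 88, 100, 110, 119, 127, 134], dtype=float)

# Smooth calibration curve: thickness as a 2nd-order polynomial of grey value
coeffs = np.polyfit(al_grey, al_thickness, deg=2)

def equivalent_al(mean_grey: float) -> float:
    """Aluminium-equivalent thickness (mm) for a specimen's mean grey value."""
    return float(np.polyval(coeffs, mean_grey))

# Example: a 1-mm-thick specimen whose histogram gives a mean grey value of 105
print(f"Radiopacity of about {equivalent_al(105.0):.2f} mm Al per 1 mm of material")
```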
Greenberg, Dan; Hammerman, Ariel; Vinker, Shlomo; Shani, Adi; Yermiahu, Yuval; Neumann, Peter J
2013-01-01
We determined how Israeli oncologists and family physicians value life-prolongation versus quality-of-life (QOL)-enhancing outcomes attributable to cancer and congestive heart failure interventions. We presented physicians with two scenarios involving a hypothetical patient with metastatic cancer expected to survive 12 months with current treatment. In a life-prolongation scenario, we suggested that a new treatment increases survival at an incremental cost of $50,000 over the standard of care. Participants were asked what minimum improvement in median survival the new therapy would need to provide for them to recommend it over the standard of care. In the QOL-enhancing scenario, we asked the maximum willingness to pay for an intervention that leads to the same survival as the standard treatment, but increases patient's QOL from 50 to 75 (on a 0-100 scale). We replicated these scenarios by substituting a patient with congestive heart failure instead of metastatic cancer. We derived the incremental cost-effectiveness ratio per quality-adjusted life-year (QALY) gained threshold implied by each response. In the life-prolongation scenario, the cost-effectiveness thresholds implied by oncologists were $150,000/QALY and $100,000/QALY for cancer and CHF, respectively. Cost-effectiveness thresholds implied by family physicians were $50,000/QALY regardless of the disease type. Willingness to pay for the QOL-enhancing scenarios was $60,000/QALY and did not differ by physicians' specialty or disease. Our findings suggest that family physicians value life-prolonging and QOL-enhancing interventions roughly equally, while oncologists value interventions that extend survival more highly than those that improve only QOL. These findings may have important implications for coverage and reimbursement decisions of new technologies. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
A comparative assessment of statistical methods for extreme weather analysis
NASA Astrophysics Data System (ADS)
Schlögl, Matthias; Laaha, Gregor
2017-04-01
Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standardly used annual maxima series in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, the results question the general assumption that the threshold excess approach (employing partial duration series, PDS) is superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas the opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may far outweigh the possible gain of information from including additional extreme events. This effect was visible neither from the square-root criterion nor from the standardly used graphical diagnosis (mean residual life plot), but only from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing the AMS and PDS approaches simultaneously in order to select the best suited approach. This will make the analyses more robust, in cases where threshold selection and dependency introduce biases to the PDS approach, but also in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing the performance of extreme events we recommend conditional performance measures that focus on rare events only, in addition to the standardly used unconditional indicators. The findings of this study are of relevance for a broad range of environmental variables, including meteorological and hydrological quantities.
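A minimal SciPy sketch of the two approaches compared in the study is given below: a GEV distribution fitted to annual maxima (AMS) and a generalized Pareto distribution fitted to threshold excesses (PDS), each used to estimate a 100-year return level. The data are synthetic, both fits use maximum likelihood only (the robust L-moment estimation favoured in the study would need a dedicated package such as lmoments3), and the fixed 99.5th-percentile threshold merely stands in for proper threshold selection.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
daily = rng.gumbel(loc=20.0, scale=8.0, size=50 * 365)   # synthetic daily series, 50 years
years = daily.reshape(50, 365)
T = 100                                                  # target return period (years)

# Block maxima (AMS): fit a GEV by maximum likelihood
ams = years.max(axis=1)
c_gev, loc_gev, scale_gev = stats.genextreme.fit(ams)
rl_ams = stats.genextreme.ppf(1 - 1 / T, c_gev, loc_gev, scale_gev)

# Threshold excesses (PDS): fit a GPD to exceedances above a high threshold
u = np.quantile(daily, 0.995)
excess = daily[daily > u] - u
c_gpd, _, scale_gpd = stats.genpareto.fit(excess, floc=0)
lam = excess.size / 50                                   # mean number of exceedances per year
rl_pds = u + stats.genpareto.ppf(1 - 1 / (lam * T), c_gpd, loc=0, scale=scale_gpd)

print(f"100-yr return level: AMS/GEV {rl_ams:.1f}, PDS/GPD {rl_pds:.1f}")
```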
Nondestructive detection of pork quality based on dual-band VIS/NIR spectroscopy
NASA Astrophysics Data System (ADS)
Wang, Wenxiu; Peng, Yankun; Li, Yongyu; Tang, Xiuying; Liu, Yuanyuan
2015-05-01
With rising living standards and changes in dietary structure, consumer demand for better meat quality continues to grow. Colour, pH value, and cooking loss are important quality attributes when evaluating meat, and nondestructive, simultaneous detection of multiple quality parameters is in demand in the production and processing of meat and meat products. The objectives of this research were to compare the effectiveness of two spectral bands for rapid, nondestructive and simultaneous detection of pork quality attributes. Reflectance spectra of 60 chilled pork samples were collected from a dual-band visible/near-infrared spectroscopy system covering 350-1100 nm and 1000-2600 nm. Colour, pH value and cooking loss were then determined by standard methods as reference values. The standard normal variate transform (SNVT) was employed to eliminate spectral noise. A spectrum connection method was put forward for effective integration of the dual-band spectra to make full use of the available information. Partial least squares regression (PLSR) and principal component analysis (PCA) were applied to establish prediction models based on the single-band and dual-band spectra. The experimental results showed that the PLSR model based on dual-band spectral information was superior to the models based on single-band spectral information, with lower root mean square error (RMSE) and higher accuracy. The PLSR model based on the dual-band spectrum (using the overlapping part of the first band) yielded the best prediction, with correlation coefficients of validation (Rv) of 0.9469, 0.9495, 0.9180, 0.9054 and 0.8789 for L*, a*, b*, pH value and cooking loss, respectively. This is mainly because the dual-band spectrum provides sufficient and comprehensive information reflecting the quality attributes. Data fusion from the dual-band spectrum could significantly improve the prediction performance for pork quality parameters. The research also indicated that multi-band spectral information fusion has potential for comprehensively evaluating other quality and safety attributes of pork.
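The modelling step can be sketched as follows: SNV pretreatment of each band, concatenation of the two bands into a single spectrum per sample, and cross-validated PLS regression against a reference attribute. The arrays below are random placeholders for the reflectance spectra and pH references, and the plain concatenation stands in for the paper's spectrum connection method.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

def snv(spectra: np.ndarray) -> np.ndarray:
    """Standard normal variate transform applied to each spectrum (row)."""
    return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

rng = np.random.default_rng(0)
n = 60
band1 = rng.normal(size=(n, 751))      # placeholder for 350-1100 nm reflectance
band2 = rng.normal(size=(n, 801))      # placeholder for 1000-2600 nm reflectance
ph = 5.5 + 0.5 * rng.random(n)         # placeholder reference pH values

X = np.hstack([snv(band1), snv(band2)])          # naive dual-band concatenation
pred = cross_val_predict(PLSRegression(n_components=8), X, ph, cv=10).ravel()

rmse = np.sqrt(np.mean((pred - ph) ** 2))
print(f"RMSE = {rmse:.3f}, R = {np.corrcoef(pred, ph)[0, 1]:.3f}")
```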
Commutability of Cytomegalovirus WHO International Standard in Different Matrices
Jones, Sara; Webb, Erika M.; Barry, Catherine P.; Choi, Won S.; Abravaya, Klara B.; Schneider, George J.
2016-01-01
Commutability of quantitative standards allows patient results to be compared across molecular diagnostic methods and laboratories. This is critical to establishing quantitative thresholds for use in clinical decision-making. A matrix effect associated with the 1st cytomegalovirus (CMV) WHO international standard (IS) was identified using the Abbott RealTime CMV assay. A commutability study was performed to compare the CMV WHO IS and patient specimens diluted in plasma and whole blood. Patient specimens showed similar CMV DNA quantitation values regardless of the diluent or extraction procedure used. The CMV WHO IS, on the other hand, exhibited a matrix effect. The CMV concentration reported for the WHO IS diluted in plasma was within the 95% prediction interval established with patient samples. In contrast, the reported DNA concentration of the CMV WHO IS diluted in whole blood was reduced approximately 0.4 log copies/ml, and values fell outside the 95% prediction interval. Calibrating the assay by using the CMV WHO IS diluted in whole blood would introduce a bias for CMV whole-blood quantitation; samples would be reported as having higher measured concentrations, by approximately 0.4 log IU/ml. Based on the commutability study with patient samples, the RealTime CMV assay was standardized based on the CMV WHO IS diluted in plasma. A revision of the instructions for use of the CMV WHO IS should be considered to alert users of the potential impact from the diluent matrix. The identification of a matrix effect with the CMV WHO IS underscores the importance of assessing commutability of the IS in order to achieve consistent results across methods. PMID:27030491
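A commutability assessment of this kind typically checks whether the reference material falls inside the 95% prediction interval of the regression fitted to patient samples measured under both conditions. A hedged sketch with simulated paired log10 results is shown below; it uses ordinary least squares, whereas commutability studies often prefer Deming or weighted regression, and all numbers are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.uniform(2, 6, 40)                      # patient samples, condition A (log10)
y = x + rng.normal(0, 0.15, x.size)            # same samples, condition B (log10)

res = stats.linregress(x, y)
n = x.size
resid = y - (res.intercept + res.slope * x)
s = np.sqrt(np.sum(resid ** 2) / (n - 2))      # residual standard error

def prediction_interval(x0: float, alpha: float = 0.05):
    """95% prediction interval for a new observation at x0."""
    se = s * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))
    t = stats.t.ppf(1 - alpha / 2, n - 2)
    yhat = res.intercept + res.slope * x0
    return yhat - t * se, yhat + t * se

# Is a candidate standard commutable? E.g. measured at 4.0 (A) and 3.6 (B) log10 units.
lo, hi = prediction_interval(4.0)
print(f"PI at 4.0: [{lo:.2f}, {hi:.2f}]; value 3.6 inside? {lo <= 3.6 <= hi}")
```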
DOE Office of Scientific and Technical Information (OSTI.GOV)
Landis, E.R.; Rohrbacher, T.J.; Gluskoter, H.
1999-07-01
As part of the activities conducted under the U.S.-Hungarian Science and Technology Fund, a total of 39 samples from five coal mines in Hungary were selected for standard coal analyses and major, minor and trace element analysis. The mine areas sampled were selected to provide a spectrum of coal quality information for comparison with other coal areas in central Europe and worldwide. All of the areas are of major importance in the energy budget of Hungary. The five sample sites contain coal in rocks of Jurassic, Cretaceous, Eocene, Miocene, and Pliocene age. The coals, from four underground mines and one surface mine, range in rank from high volatile bituminous to lignite B. Most of the coal produced from the mines sampled is used to generate electricity. Some of the power plants that utilize the coals also provide heat for domestic and process usage. The standard coal analysis program is based on tests performed in accordance with standards of the American Society for Testing and Materials (ASTM). Proximate and ultimate analyses were supplemented by determinations of the heating value, equilibrium moisture, forms of sulfur, free-swelling index, ash fusion temperatures (both reducing and oxidizing), apparent specific gravity and Hardgrove grindability index. The major, minor and trace element analyses were performed in accordance with standardized procedures of the U.S. Geological Survey. The analytical results will be available in the International Coal Quality Data Base of the USGS. The results of the program provide data for comparison with test data from Europe and information of value to potential investors or cooperators in the coal industry of Hungary and Central Europe.
Pulse oximetry-derived respiratory rate in general care floor patients.
Addison, Paul S; Watson, James N; Mestek, Michael L; Ochs, James P; Uribe, Alberto A; Bergese, Sergio D
2015-02-01
Respiratory rate is recognized as a clinically important parameter for monitoring respiratory status on the general care floor (GCF). Currently, intermittent manual assessment of respiratory rate is the standard of care on the GCF. This technique has several clinically-relevant shortcomings, including the following: (1) it is not a continuous measurement, (2) it is prone to observer error, and (3) it is inefficient for the clinical staff. We report here on an algorithm designed to meet clinical needs by providing respiratory rate through a standard pulse oximeter. Finger photoplethysmograms were collected from a cohort of 63 GCF patients monitored during free breathing over a 25-min period. These were processed using a novel in-house algorithm based on continuous wavelet-transform technology within an infrastructure incorporating confidence-based averaging and logical decision-making processes. The computed oximeter respiratory rates (RRoxi) were compared to an end-tidal CO2 reference rate (RRETCO2). RRETCO2 ranged from a lowest recorded value of 4.7 breaths per minute (brpm) to a highest value of 32.0 brpm. The mean respiratory rate was 16.3 brpm with standard deviation of 4.7 brpm. Excellent agreement was found between RRoxi and RRETCO2, with a mean difference of -0.48 brpm and standard deviation of 1.77 brpm. These data demonstrate that our novel respiratory rate algorithm is a potentially viable method of monitoring respiratory rate in GCF patients. This technology provides the means to facilitate continuous monitoring of respiratory rate, coupled with arterial oxygen saturation and pulse rate, using a single non-invasive sensor in low acuity settings.
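The underlying signal-processing idea, that respiration modulates the amplitude of the photoplethysmogram, can be illustrated with a far simpler pipeline than the wavelet-and-confidence-averaging algorithm described in the paper: extract the envelope of a synthetic PPG and locate the dominant frequency within a plausible respiratory band. This sketch is illustrative only and is not the authors' method.

```python
import numpy as np
from scipy.signal import hilbert, welch

fs = 100.0                                       # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
rr_true = 16 / 60.0                              # 16 breaths per minute
# Synthetic PPG: 1.2 Hz cardiac pulsation, amplitude-modulated by respiration
ppg = (1 + 0.2 * np.sin(2 * np.pi * rr_true * t)) * np.sin(2 * np.pi * 1.2 * t)

envelope = np.abs(hilbert(ppg))                  # respiratory-induced amplitude modulation
f, pxx = welch(envelope - envelope.mean(), fs=fs, nperseg=4096)
band = (f > 0.05) & (f < 1.0)                    # 3-60 brpm search band
rr_est = f[band][np.argmax(pxx[band])] * 60
print(f"Estimated respiratory rate: {rr_est:.1f} brpm")
```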
Fu, Jiali; Hu, Zhaochu; Zhang, Wen; Yang, Lu; Liu, Yongsheng; Li, Ming; Zong, Keqing; Gao, Shan; Hu, Shenghong
2016-03-10
The sulfur isotope is an important geochemical tracer in diverse fields of geosciences. In this study, the effects of three different cone combinations with the addition of N2 on the performance of in situ S isotope analyses were investigated in detail. The signal intensities of S isotopes were improved by a factor of 2.3 and 3.6 using the X skimmer cone combined with the standard sample cone or the Jet sample cone, respectively, compared with the standard arrangement (H skimmer cone combined with the standard sample cone). This signal enhancement is important for the improvement of the precision and accuracy of in situ S isotope analysis at high spatial resolution. Different cone combinations have a significant effect on the mass bias and mass bias stability for S isotopes. Poor precisions of S isotope ratios were obtained using the Jet and X cones combination at their corresponding optimum makeup gas flow when using Ar plasma only. The addition of 4-8 ml min(-1) nitrogen to the central gas flow in laser ablation MC-ICP-MS was found to significantly enlarge the mass bias stability zone at their corresponding optimum makeup gas flow in these three different cone combinations. The polyatomic interferences of OO, SH, OOH were also significantly reduced, and the interference free plateaus of sulfur isotopes became broader and flatter in the nitrogen mode (N2 = 4 ml min(-1)). However, the signal intensity of S was not increased by the addition of nitrogen in this study. The laser fluence and ablation mode had significant effects on sulfur isotope fractionation during the analysis of sulfides and elemental sulfur by laser ablation MC-ICP-MS. The matrix effect among different sulfides and elemental sulfur was observed, but could be significantly reduced by line scan ablation in preference to single spot ablation under the optimized fluence. It is recommended that the d90 values of the particles in pressed powder pellets for accurate and precise S isotope analysis should be less than 10 μm. Under the selected optimized analytical conditions, excellent agreements between the determined values and the reference values were achieved for the IAEA-S series standard reference materials and a set of six well-characterized, isotopic homogeneous sulfide standards (PPP-1, MoS2, MASS-1, P-GBW07267, P-GBW07268, P-GBW07270), validating the capability of the developed method for providing high-quality in situ S isotope data in sulfides and elemental sulfur. Copyright © 2016. Published by Elsevier B.V.
Inter-annual and spatial variability of Hamon potential evapotranspiration model coefficients
McCabe, Gregory J.; Hay, Lauren E.; Bock, Andy; Markstrom, Steven L.; Atkinson, R. Dwight
2015-01-01
Monthly calibrated values of the Hamon PET coefficient (C) are determined for 109,951 hydrologic response units (HRUs) across the conterminous United States (U.S.). The calibrated coefficient values are determined by matching calculated mean monthly Hamon PET to mean monthly free-water surface evaporation. For most locations and months the calibrated coefficients are larger than the standard value reported by Hamon. The largest changes in the coefficients were for the late winter/early spring and fall months, whereas the smallest changes were for the summer months. Comparisons of PET computed using the standard value of C and computed using calibrated values of C indicate that for most of the conterminous U.S. PET is underestimated using the standard Hamon PET coefficient, except for the southeastern U.S.
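One commonly cited form of the Hamon equation exposes the coefficient C explicitly, so calibrated monthly values can simply replace the standard one. The sketch below uses that form; the exact formulation and unit conventions vary between implementations, so both the equation variant and the example inputs should be treated as illustrative rather than as the study's formulation.

```python
import numpy as np

def hamon_pet(temp_c: float, daylength_hr: float, coeff: float = 0.55) -> float:
    """Hamon potential evapotranspiration (mm/day) for one common form of the equation.

    coeff is the Hamon coefficient C; 0.55 is the commonly quoted standard value,
    which the study replaces with monthly calibrated values per hydrologic response unit.
    """
    d = daylength_hr / 12.0                       # daylength in units of 12 h
    pt = 4.95 * np.exp(0.062 * temp_c) / 100.0    # saturated vapour density (g/m^3) / 100
    pet_inches = coeff * d ** 2 * pt              # inches/day in this formulation
    return pet_inches * 25.4                      # convert to mm/day

# Example: mid-summer conditions with the standard vs. a larger calibrated coefficient
print(hamon_pet(24.0, 14.5), hamon_pet(24.0, 14.5, coeff=0.65))
```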
IEEE Standards activities: A year in review. Annual activities report 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-09-01
For IEEE Standards and the constituency it serves, 1996 was a milestone year. It was a period that signaled the beginning of the Standards Program of the future. This program responds to worldwide standards issues and to the technical, market and competitive strategies of industrial sectors. It represents technological innovation, global participation in electrotechnology standards development and dedication to the on-going advancement and promotion of new concepts and technology. Instrumental in ensuring IEEE's growth opportunities and leadership position in electrotechnology standards development and dissemination is IEEE membership. The value of members and their involvement in IEEE Standards is vital to IEEE's ability to continuously be the best professional association in the world. To fulfill this vision, IEEE set forth five goals -- increased globalization, career enhancement, electronic dissemination of existing products and services, organizational improvements and public responsibility. Key in the process for the achievement of these goals is to make certain that IEEE members are well informed about opportunities and benefits the present system of standardization offers, and what the future system will offer. With the member in mind, the Standards Board reviewed its strategic direction and operational structures, and its ability to deliver products and services that are needed now and, more importantly, that will be needed in the future. This Annual Activities Report provides a comprehensive picture of 1996 accomplishments and performance. It gives readers a broad picture of the Standards Board's activities, the evolving role of IEEE Standards and the technological opportunities that the Board encourages.
Carrying capacity as "informed judgment": The values of science and the science of values
Robert E. Manning
2001-01-01
Contemporary carrying capacity frameworks, such as Limits of Acceptable Change and Visitor Experience and Resource Protection, rely on formulation of standards of quality, which are defined as minimum acceptable resource and social conditions in parks and wilderness. Formulation of standards of quality involves elements of both science and values, and both of these...
Nutrient intake values (NIVs): a recommended terminology and framework for the derivation of values.
King, Janet C; Vorster, Hester H; Tome, Daniel G
2007-03-01
Although most countries and regions around the world set recommended nutrient intake values for their populations, there is no standardized terminology or framework for establishing these standards. Different terms used for various components of a set of dietary standards are described in this paper and a common set of terminology is proposed. The recommended terminology suggests that the set of values be called nutrient intake values (NIVs) and that the set be composed of three different values. The average nutrient requirement (ANR) reflects the median requirement for a nutrient in a specific population. The individual nutrient level (INLx) is the recommended level of nutrient intake for all healthy people in the population, which is set at a certain level x above the mean requirement. For example, a value set at 2 standard deviations above the mean requirement would cover the needs of 98% of the population and would be INL98. The third component of the NIVs is an upper nutrient level (UNL), which is the highest level of daily nutrient intake that is likely to pose no risk of adverse health effects for almost all individuals in a specified life-stage group. The proposed framework for deriving a set of NIVs is based on a statistical approach for determining the midpoint of a distribution of requirements for a set of nutrients in a population (the ANR), the standard deviation of the requirements, and an individual nutrient level that assures health at some point above the mean, e.g., 2 standard deviations. Ideally, a second set of distributions of risk of excessive intakes is used as the basis for a UNL.
Jacobson, Magdalena; Wallgren, Per; Nordengrahn, Ann; Merza, Malik; Emanuelson, Ulf
2011-04-01
Lawsonia intracellularis is a common cause of chronic diarrhoea and poor performance in young growing pigs. Diagnosis of this obligate intracellular bacterium is based on the demonstration of the microbe or microbial DNA in tissue specimens or faecal samples, or the demonstration of L. intracellularis-specific antibodies in sera. The aim of the present study was to evaluate a blocking ELISA for the detection of serum antibodies to L. intracellularis, by comparison with the previously widely used immunofluorescent antibody test (IFAT). Sera were collected from 176 pigs aged 8-12 weeks originating from 24 herds with or without problems with diarrhoea and poor performance in young growing pigs. Sera were analyzed by the blocking ELISA and by IFAT. Bayesian modelling techniques were used to account for the absence of a gold standard test, and the results of the blocking ELISA were modelled against the IFAT with a "2 dependent tests, 2 populations, no gold standard" model. At the finally selected cut-off value of 35 percent inhibition (PI), the diagnostic sensitivity of the blocking ELISA was 72% and the diagnostic specificity was 93%. The positive predictive value was 0.82 and the negative predictive value was 0.89 at the observed prevalence of 33.5%. The sensitivity and specificity as evaluated by Bayesian statistical techniques differed from those previously reported. Properties of diagnostic tests may well vary between countries, laboratories and populations of animals. In the absence of a true gold standard, the importance of validating new methods with appropriate statistical methods and with respect to the target population must be emphasized.
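For reference, the relationship between sensitivity, specificity, prevalence and the predictive values quoted in the abstract is a one-line calculation. Because the published sensitivity and specificity are rounded, the sketch below reproduces the reported predictive values only approximately.

```python
def predictive_values(se: float, sp: float, prevalence: float) -> tuple[float, float]:
    """Positive and negative predictive values from sensitivity, specificity and prevalence."""
    ppv = se * prevalence / (se * prevalence + (1 - sp) * (1 - prevalence))
    npv = sp * (1 - prevalence) / (sp * (1 - prevalence) + (1 - se) * prevalence)
    return ppv, npv

# Rounded figures reported for the blocking ELISA at the PI 35 cut-off
ppv, npv = predictive_values(se=0.72, sp=0.93, prevalence=0.335)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")   # close to, but not exactly, the reported 0.82 and 0.89
```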
Harvey, Stephen B; Krimer, Paula M; Correa, Maria T; Hanes, Martha A
2008-01-01
Plasma biochemical and hematologic values are important parameters for assessing animal health and experimental results. Although normal reference values for many rodent species have been published, there is a dearth of similar information for the genus Microtus. In addition, most studies use a mean and standard deviation to establish reference intervals, which is not the recommendation of the Clinical and Laboratory Standards Institute (formerly the National Committee on Clinical Laboratory Standards) or the International Federation of Clinical Chemistry and Laboratory Medicine. The purpose of this study was to establish normal reference parameters for plasma biochemistry and hematology in mature pine voles (Microtus pinetorum) by using the nonparametric rank percentile method recommended by these two laboratory medicine organizations. Samples of cardiac blood from a closed colony of pine voles were collected at euthanasia and evaluated under rodent settings on two automated hematology analyzers from different manufacturers and on the same type of automated biochemistry analyzer. There were no clinically significant differences between the sexes; younger animals had a lower hematocrit, higher mean corpuscular volume, and lower mean corpuscular hemoglobin concentration than did older animals. Only platelet counts differed when comparing hematologic values from the different analyzers. Relative to rats and mice, pine voles have a lower mean corpuscular volume and higher red blood cell count, higher blood urea nitrogen, much higher alanine aminotransferase, and lower glucose and phosphorus concentrations. The hematology and plasma biochemical results obtained in this study are considered representative for healthy adult laboratory pine voles under similar environmental conditions. PMID:18702449
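The nonparametric rank percentile method recommended by the laboratory medicine organizations amounts to taking the central 95% of the observed distribution. A minimal sketch with simulated analyte values follows; the full CLSI guidance additionally calls for at least 120 reference individuals, outlier screening and confidence intervals around the limits, none of which is shown here.

```python
import numpy as np

rng = np.random.default_rng(0)
alt = rng.lognormal(mean=4.5, sigma=0.3, size=120)   # hypothetical ALT values (U/L)

# Nonparametric rank percentile method: central 95 % reference interval
lower, upper = np.percentile(alt, [2.5, 97.5])
print(f"Reference interval: {lower:.0f}-{upper:.0f} U/L (n = {alt.size})")
```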
Ford, Emily; Adams, Jon; Graves, Nicholas
2012-01-01
Objective An economic model was developed to evaluate the cost-effectiveness of hawthorn extract as an adjunctive treatment for heart failure in Australia. Methods A Markov model of chronic heart failure was developed to compare the costs and outcomes of standard treatment and standard treatment with hawthorn extract. Health states were defined by the New York Heart Association (NYHA) classification system and death. For any given cycle, patients could remain in the same NYHA class, experience an improvement or deterioration in NYHA class, be hospitalised or die. Model inputs were derived from the published medical literature, and the output was quality-adjusted life years (QALYs). Probabilistic sensitivity analysis was conducted. The expected value of perfect information (EVPI) and the expected value of partial perfect information (EVPPI) were calculated to establish the value of further research and the ideal target for such research. Results Hawthorn extract increased costs by $1866.78 and resulted in a gain of 0.02 QALYs. The incremental cost-effectiveness ratio was $85 160.33 per QALY. The cost-effectiveness acceptability curve indicated that at a threshold of $40 000 the new treatment had a 0.29 probability of being cost-effective. The average incremental net monetary benefit (NMB) was −$1791.64; the average NMB for the standard treatment was $92 067.49, and for hawthorn extract $90 275.84. Additional research is potentially cost-effective if the research is not proposed to cost more than $325 million. Utilities form the most important target parameter group for further research. Conclusions Hawthorn extract is not currently considered to be cost-effective as an adjunctive treatment for heart failure in Australia. Further research in the area of utilities is warranted. PMID:22942231
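The headline quantities can be checked with simple arithmetic, as sketched below using the rounded figures quoted in the abstract; the rounding of the incremental QALY gain explains why the recomputed ratio does not exactly match the published ICER.

```python
# Arithmetic check with the rounded figures from the abstract
delta_cost = 1866.78        # incremental cost of adding hawthorn extract ($)
delta_qaly = 0.02           # incremental QALYs (rounded in the abstract)
wtp = 40_000                # willingness-to-pay threshold ($/QALY)

icer = delta_cost / delta_qaly        # ~93,000 with rounded inputs; the paper reports $85,160/QALY
inmb = wtp * delta_qaly - delta_cost  # incremental net monetary benefit at the threshold
print(f"ICER of roughly ${icer:,.0f}/QALY, incremental NMB of roughly ${inmb:,.0f}")
```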
Physical fitness reference standards in European children: the IDEFICS study.
De Miguel-Etayo, P; Gracia-Marco, L; Ortega, F B; Intemann, T; Foraita, R; Lissner, L; Oja, L; Barba, G; Michels, N; Tornaritis, M; Molnár, D; Pitsiladis, Y; Ahrens, W; Moreno, L A
2014-09-01
A low fitness status during childhood and adolescence is associated with important health-related outcomes, such as increased future risk for obesity and cardiovascular diseases, impaired skeletal health, reduced quality of life and poor mental health. Fitness reference values for adolescents from different countries have been published, but there is a scarcity of reference values for pre-pubertal children in Europe, using harmonised measures of fitness in the literature. The IDEFICS study offers a good opportunity to establish normative values of a large set of fitness components from eight European countries using common and well-standardised methods in a large sample of children. Therefore, the aim of this study is to report sex- and age-specific fitness reference standards in European children. Children (10,302) aged 6-10.9 years (50.7% girls) were examined. The test battery included: the flamingo balance test, back-saver sit-and-reach test (flexibility), handgrip strength test, standing long jump test (lower-limb explosive strength) and 40-m sprint test (speed). Moreover, cardiorespiratory fitness was assessed by a 20-m shuttle run test. Percentile curves for the 1st, 3rd, 10th, 25th, 50th, 75th, 90th, 97th and 99th percentiles were calculated using the General Additive Model for Location Scale and Shape (GAMLSS). Our results show that boys performed better than girls in speed, lower- and upper-limb strength and cardiorespiratory fitness, and girls performed better in balance and flexibility. Older children performed better than younger children, except for cardiorespiratory fitness in boys and flexibility in girls. Our results provide for the first time sex- and age-specific physical fitness reference standards in European children aged 6-10.9 years.
Shear-wave elastography of the testis in the healthy man - determination of standard values.
Trottmann, M; Marcon, J; D'Anastasi, M; Bruce, M F; Stief, C G; Reiser, M F; Buchner, A; Clevert, D A
2016-01-01
Real-time shear-wave elastography (SWE) is a newly developed technique for the sonographic quantification of tissue elasticity, which is already used in the assessment of breast and thyroid lesions. Owing to the limited overlying tissue, the testes are ideally suited for assessment using shear-wave elastography. To our knowledge, no published data exist on real-time SWE of the testes. Sixty-six male volunteers (mean age 51.86 ± 18.82 years, range 20-86) with no known testicular pathology underwent standard B-mode sonography and multi-frame shear-wave elastography of both testes using the Aixplorer® ultrasound system (SuperSonic Imagine, Aix-en-Provence, France). Three measurements were performed for each testis: one in the upper pole, one in the middle portion and one in the lower pole. The results were statistically evaluated using multivariate analysis. Mean shear-wave velocity values were similar in the inferior and superior parts of the testicle (1.15 m/s) and were significantly lower in the centre (0.90 m/s). These values were age-independent. Testicular stiffness in the upper pole was significantly lower than in the rest of the testis with increasing volume (p = 0.007). Real-time shear-wave elastography proved to be feasible in the assessment of testicular stiffness. It is important to consider the measurement region, as standard values differ between the centre and the testicular periphery. Further studies with more subjects may be required to define the normal range of values for each age group.
Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons
2014-01-01
Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
NASA Astrophysics Data System (ADS)
Alves, Julio Cesar L.; Poppi, Ronei J.
2013-02-01
This paper reports the application of piecewise direct standardization (PDS) for matrix correction in front-face fluorescence spectroscopy of solids when different excipients are used in a pharmaceutical preparation based on a mixture of acetylsalicylic acid (ASA), paracetamol (acetaminophen) and caffeine. As verified in earlier studies, the use of different excipients and their ratio can cause a displacement, a change in fluorescence intensity or a change in band profile. To overcome this important drawback, a standardization strategy was adopted to convert all the excitation-emission fluorescence spectra into those used for model development. An excitation-emission matrix (EEM) with excitation and emission wavelengths ranging from 265 to 405 nm and 300 to 480 nm, respectively, was used. Excellent results were obtained using unfolded partial least squares (U-PLS), with RMSEP values of 8.2 mg/g, 10.9 mg/g and 2.7 mg/g for ASA, paracetamol and caffeine, respectively, and with relative errors less than 5% for the three analytes.
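The core of piecewise direct standardization is a banded transformation matrix built from local regressions of a small window of "slave" channels onto each "master" channel. A stripped-down NumPy sketch is shown below; it omits the additive offset term and the local PCR/PLS models often used in practice, and the demo spectra are synthetic vectors rather than unfolded excitation-emission matrices.

```python
import numpy as np

def pds_transform(master: np.ndarray, slave: np.ndarray, window: int = 5) -> np.ndarray:
    """Build a PDS matrix F so that slave @ F approximates master.

    master, slave: transfer-sample spectra measured under the two conditions,
    both of shape (n_samples, n_channels).
    """
    n_channels = master.shape[1]
    half = window // 2
    F = np.zeros((n_channels, n_channels))
    for j in range(n_channels):
        lo, hi = max(0, j - half), min(n_channels, j + half + 1)
        b, *_ = np.linalg.lstsq(slave[:, lo:hi], master[:, j], rcond=None)
        F[lo:hi, j] = b                      # banded column of local regression coefficients
    return F

# Hypothetical demo: "slave" spectra are shifted, scaled, noisy copies of the master spectra
rng = np.random.default_rng(0)
master = rng.random((10, 50))
slave = 0.9 * np.roll(master, 1, axis=1) + 0.01 * rng.normal(size=master.shape)
F = pds_transform(master, slave, window=5)
print("mean residual after standardization:", np.abs(slave @ F - master).mean())
```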
Packing Fraction of a Two-dimensional Eden Model with Random-Sized Particles
NASA Astrophysics Data System (ADS)
Kobayashi, Naoki; Yamazaki, Hiroshi
2018-01-01
We have performed a numerical simulation of a two-dimensional Eden model with random-size particles. In the present model, the particle radii are generated from a Gaussian distribution with mean μ and standard deviation σ. First, we have examined the bulk packing fraction for the Eden cluster and investigated the effects of the standard deviation and the total number of particles NT. We show that the bulk packing fraction depends on the number of particles and the standard deviation. In particular, for the dependence on the standard deviation, we have determined the asymptotic value of the bulk packing fraction in the limit of the dimensionless standard deviation. This value is larger than the packing fraction obtained in a previous study of the Eden model with uniform-size particles. Secondly, we have investigated the packing fraction of the entire Eden cluster including the effect of the interface fluctuation. We find that the entire packing fraction depends on the number of particles while it is independent of the standard deviation, in contrast to the bulk packing fraction. In a similar way to the bulk packing fraction, we have obtained the asymptotic value of the entire packing fraction in the limit NT → ∞. The obtained value of the entire packing fraction is smaller than that of the bulk value. This fact suggests that the interface fluctuation of the Eden cluster influences the packing fraction.
Confidence Limits for the Indirect Effect: Distribution of the Product and Resampling Methods
ERIC Educational Resources Information Center
MacKinnon, David P.; Lockwood, Chondra M.; Williams, Jason
2004-01-01
The most commonly used method to test an indirect effect is to divide the estimate of the indirect effect by its standard error and compare the resulting z statistic with a critical value from the standard normal distribution. Confidence limits for the indirect effect are also typically based on critical values from the standard normal…
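Of the resampling alternatives mentioned, the percentile bootstrap is the simplest to sketch: resample cases with replacement, re-estimate the indirect effect a*b each time, and take the empirical 2.5th and 97.5th percentiles as confidence limits. The simulated data and ordinary least-squares estimates below are illustrative only, and the distribution-of-the-product method is not shown.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)               # mediator M
y = 0.4 * m + 0.1 * x + rng.normal(size=n)     # outcome Y

def indirect_effect(x, m, y):
    """a*b estimate: a from the regression of M on X, b from Y on M controlling for X."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), m, x])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return a * coef[1]

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                # resample cases with replacement
    boot.append(indirect_effect(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"a*b = {indirect_effect(x, m, y):.3f}, 95 % percentile CI [{lo:.3f}, {hi:.3f}]")
```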
USDA-ARS?s Scientific Manuscript database
The impact of standardizing the originally measured serum total 25-hydroxyvitamin D [25(OH)D] values from Third National Health and Nutrition Examination Survey (NHANES III, 1988-1994) on the association between 25(OH)D and rate of all-cause mortality was evaluated. Values were standardized to gold ...
NASA Astrophysics Data System (ADS)
Akhmetova, I. G.; Chichirova, N. D.
2017-11-01
A precise determination of normative and actual heat losses is currently a pressing problem. The existing methods (experimental measurement, metering devices, and mathematical modelling) all have drawbacks. The heat losses arising during heat-carrier transport affect the tariff structure of heat supply organizations. Determining this quantity also supports the proper selection of main and auxiliary equipment capacity and of the temperature chart of the heat supply network, as well as the choice of heating system structure under decentralization. Calculating the actual heat losses and comparing them with the normative values justifies work on improving the heat networks by replacing piping or its insulation. To determine the causes of discrepancies between normative and actual heat losses, thermal tests of the actual heat losses were carried out on 124 sections of the heat networks in Kazan. Based on the results, a mathematical model for the normative determination of heat losses was developed and tested; it differs from the existing one by accounting for the type of piping insulation. Applying this factor brings the calculated normative heat losses closer to their actual values. This is of great importance for enterprises operating distribution networks that, because of their configuration and extent, do not have the technical ability to perform thermal testing.
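For orientation, the normative heat loss of an insulated pipe section is at heart a conduction calculation through the insulation layer. The sketch below gives the textbook per-metre estimate for a cylindrical insulation layer; it is not the normative method referred to in the abstract, convective and contact resistances are ignored, and all dimensions and conductivities are illustrative.

```python
import math

def pipe_heat_loss_per_m(t_fluid: float, t_ambient: float,
                         r_pipe: float, r_insul: float, k_insul: float) -> float:
    """Steady-state conductive heat loss per metre of an insulated pipe (W/m).

    t_fluid, t_ambient: fluid and ambient temperatures (degrees C)
    r_pipe, r_insul:    outer radii of the pipe and of the insulation (m)
    k_insul:            thermal conductivity of the insulation (W/(m*K))
    """
    resistance = math.log(r_insul / r_pipe) / (2 * math.pi * k_insul)
    return (t_fluid - t_ambient) / resistance

# Illustrative numbers: DN100 pipe with 40 mm of mineral wool, 70/10 degrees C
q_nominal = pipe_heat_loss_per_m(70, 10, 0.057, 0.097, 0.045)
q_degraded = pipe_heat_loss_per_m(70, 10, 0.057, 0.097, 0.09)   # e.g. waterlogged insulation
print(f"{q_nominal:.0f} W/m nominal vs {q_degraded:.0f} W/m with degraded insulation")
```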
Li, Qing-bo; Liu, Jie-qiang; Li, Xiang
2012-03-01
A small non-invasive measurement system for human blood glucose has been developed that achieves fast, real-time, non-invasive measurement. The device consists of four main parts: the fixture, the light system, the data acquisition and processing system, and the spectrometer. A new light-source driving scheme was proposed that meets the requirements of the light source under a variety of spectral acquisition conditions. An integrated fixture design was proposed, which not only simplifies the optical structure of the system but also improves the reproducibility of the measurement conditions. The micro control system mainly performs control, data processing, and data storage. As the most important component, the DSP TMS320F2812 microprocessor offers low power consumption, high processing speed, and high computing capability. Wavelet denoising is used to pretreat the spectral data, which reduces the loss of incident light information and improves the signal-to-noise ratio. The kernel partial least squares method was adopted to build the mathematical model, which improves the precision of the system. In the calibration experiment of the system, the standard values were measured by One-Touch. The correlation coefficient between the system's blood glucose values and the standard values is 0.95, and the root mean square error of measurement is 0.6 mmol L(-1).
NASA Astrophysics Data System (ADS)
Loftus, Pete; Giudice, Seb
2014-08-01
Measurements underpin the engineering decisions that allow products to be designed, manufactured, operated, and maintained. The quality of measured data therefore needs to be systematically assured so that decision makers can proceed with confidence, and the use of standards is one way of achieving this. This paper explores the relevance of international documentary standards to the assessment of measurement system capability in the High Value Manufacturing (HVM) industry. An internal measurement standard that supplements these standards is presented, and recommendations are made for a cohesive effort to develop the international standards to provide consistency in such industrial applications.
Life cycle assessment and grid electricity: what do we know and what can we know?
Weber, Christopher L; Jiaramillo, Paulina; Marriott, Joe; Samaras, Constantine
2010-03-15
The generation and distribution of electricity comprises nearly 40% of U.S. CO(2) emissions, as well as large shares of SO(2), NO(x), small particulates, and other toxins. Thus, correctly accounting for these electricity-related environmental releases is of great importance in life cycle assessment of products and processes. Unfortunately, there is no agreed-upon protocol for accounting for the environmental emissions associated with electricity, as well as significant uncertainty in the estimates. Here, we explore the limits of current knowledge about grid electricity in LCA and carbon footprinting for the U.S. electrical grid, and show that differences in standards, protocols, and reporting organizations can lead to important differences in estimates of CO(2), SO(2), and NO(x) emissions factors. We find considerable divergence in published values for grid emissions factors in the U.S. We discuss the implications of this divergence and list recommendations for a standardized approach to accounting for air pollution emissions in life cycle assessment and policy analyses in a world with incomplete and uncertain information.
Palacios Gruson, Martha; Barazza, Fabio; Murith, Christophe; Ryf, Salome
2015-04-01
The current revision of the Swiss Radiological Protection Ordinance aims to bring Swiss legislation in line with new international standards. In future, the control of radon exposure in dwellings will be based on a reference level of 300 Bq m(-3). Since this value is exceeded in >10 % of the buildings so far investigated nationwide, the new strategy requires the development of efficient measures to reduce radon-related health risks at an acceptable cost. The minimisation of radon concentrations in new buildings is therefore of great importance. This can be achieved, for example, through the enforcement of building regulations and the education of construction professionals. With regard to radon mitigation in existing buildings, synergies with the ongoing renewal of the building stock should be exploited. In addition, the dissemination of knowledge about radon and its risks needs to be focused on specific target groups, e.g. notaries, who play an important information role in real estate transactions. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marous, L; Muryn, J; Liptak, C
2016-06-15
Purpose: Monte Carlo simulation is a frequently used technique for assessing patient dose in CT. The accuracy of a Monte Carlo program is often validated using the standard CT dose index (CTDI) phantoms by comparing simulated and measured CTDI{sub 100}. To achieve good agreement, many input parameters in the simulation (e.g., energy spectrum and effective beam width) need to be determined. However, not all the parameters have equal importance. Our aim was to assess the relative importance of the various factors that influence the accuracy of simulated CTDI{sub 100}. Methods: A Monte Carlo program previously validated for a clinical CT system was used to simulate CTDI{sub 100}. For the standard CTDI phantoms (32 and 16 cm in diameter), CTDI{sub 100} values from central and four peripheral locations at 70 and 120 kVp were first simulated using a set of reference input parameter values (treated as the truth). To emulate the situation in which the input parameter values used by the researcher may deviate from the truth, additional simulations were performed in which intentional errors were introduced into the input parameters, the effects of which on simulated CTDI{sub 100} were analyzed. Results: At 38.4-mm collimation, errors in effective beam width up to 5.0 mm showed negligible effects on simulated CTDI{sub 100} (<1.0%). Likewise, errors in acrylic density of up to 0.01 g/cm{sup 3} resulted in small CTDI{sub 100} errors (<2.5%). In contrast, errors in spectral HVL produced more significant effects: slight deviations (±0.2 mm Al) produced errors up to 4.4%, whereas more extreme deviations (±1.4 mm Al) produced errors as high as 25.9%. Lastly, ignoring the CT table introduced errors up to 13.9%. Conclusion: Monte Carlo simulated CTDI{sub 100} is insensitive to errors in effective beam width and acrylic density but sensitive to errors in spectral HVL. To obtain accurate results, the CT table should not be ignored. This work was supported by a Faculty Research and Development Award from Cleveland State University.
In vivo electric conductivity of cervical cancer patients based on B₁⁺ maps at 3T MRI.
Balidemaj, E; de Boer, P; van Lier, A L H M W; Remis, R F; Stalpers, L J A; Westerveld, G H; Nederveen, A J; van den Berg, C A T; Crezee, J
2016-02-21
The in vivo electric conductivity (σ) values of tissue are essential for accurate electromagnetic simulations and specific absorption rate (SAR) assessment for applications such as thermal dose computations in hyperthermia. Currently used σ-values are mostly based on ex vivo measurements. In this study the conductivity of human muscle, bladder content and cervical tumors is acquired non-invasively in vivo using MRI. The conductivity of 20 cervical cancer patients was measured with the MR-based electric properties tomography method on a standard 3T MRI system. The average in vivo σ-value of muscle is 14% higher than the value currently used in human simulation models. The σ-value of bladder content is an order of magnitude higher than the value for bladder wall tissue that is used for the complete bladder in many models. Our findings are confirmed by various in vivo animal studies from the literature. In cervical tumors, the observed average conductivity was 13% higher than the literature value reported for cervical tissue. Considerable deviations were found between the electrical conductivity observed in this study and the commonly used values for SAR assessment, emphasizing the importance of acquiring in vivo conductivity for more accurate SAR assessment in various applications.
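For context, the sketch below shows the Helmholtz-based electric properties tomography relation commonly used for such conductivity reconstructions; whether the study used exactly this form is an assumption, and the function, voxel handling, and Larmor frequency below are illustrative.

```python
import numpy as np

MU0 = 4e-7 * np.pi                      # vacuum permeability (H/m)
OMEGA_3T = 2 * np.pi * 127.7e6          # approximate proton Larmor angular frequency at 3 T

def ept_conductivity(b1p, voxel_m):
    """Helmholtz-based EPT: sigma = Im(laplacian(B1+)/B1+) / (mu0 * omega).

    b1p: complex ndarray holding the B1+ map; voxel_m: isotropic voxel size (m).
    Valid only where electrical properties are locally homogeneous.
    """
    lap = sum(np.gradient(np.gradient(b1p, voxel_m, axis=ax), voxel_m, axis=ax)
              for ax in range(b1p.ndim))
    return np.imag(lap / b1p) / (MU0 * OMEGA_3T)
```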
Martinez, Alexa; Roberts, Glenn; Garzarella, Katherine; Lutz, Michael; Caswell, Michael
2013-04-01
The purpose of these clinical trials was to determine if 300 W and 150 W xenon arc solar simulators (SSs) deliver the same sun protection factor (SPF) and UVA protection factor (PFA). First, the SPF of the P7 control standard and of the P2 control standard was determined, testing 20 subjects using the method described in the Food and Drug Administration (FDA) Final Monograph and using 150 W and 300 W SSs. In the second clinical trial, the PFA of the Japanese Cosmetic Industry Association (JCIA) control standard and of the P2 control standard was determined, testing 10 subjects using the method described in the JCIA Technical Bulletin and using 150 W and 300 W SSs. The SPF values for P7 control standard determined using the 150 W and 300 W SSs were 4.54 ± 0.35 and 4.61 ± 0.32, respectively. The SPF values for P2 control standard determined using the 150 W and 300 W SSs were 17.0 ± 0.9 and 16.7 ± 0.9, respectively. The resultant PFA values for JCIA control standard determined using the 150 W and 300 W SSs were 4.06 ± 0.70 and 4.06 ± 0.70, respectively. The resultant PFA values for P2 control standard determined using the 150 W and 300 W SSs were 3.28 ± 0.25 and 3.44 ± 0.39, respectively. As the values are essentially identical for SPF and for PFA, the 150 W and 300 W SSs can be used interchangeably for SPF and PFA determinations. © 2013 John Wiley & Sons A/S.
Physical Principles of Development of the State Standard of Biological Cell Polarizability
NASA Astrophysics Data System (ADS)
Shuvalov, G. V.; Generalov, K. V.; Generalov, V. M.; Kruchinina, M. V.; Koptev, E. S.; Minin, O. V.; Minin, I. V.
2018-03-01
A new state standard of biological cell polarizability based on micron-size latex particles has been developed. Polystyrene is suggested as the standard material. Values of the polarizability calculated for erythrocytes and values of the polarizability of micron-size spherical latex particles measured with measuring-computing complexes agree within the limits of satisfactory relative error. The Standard allows the unit of polarizability measurement [m3] to be assigned to cells and erythrocytes for the needs of medicine.
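One common convention that yields polarizability in m3 is the Clausius-Mossotti volume polarizability of a dielectric sphere; it is offered here only as a hedged illustration, since whether the standard uses exactly this convention is an assumption.

```python
# Volume polarizability (m^3) of a homogeneous dielectric sphere in a suspending medium,
# in the Clausius-Mossotti convention. All numbers below are illustrative.
def sphere_polarizability_m3(radius_m, eps_particle, eps_medium):
    return radius_m**3 * (eps_particle - eps_medium) / (eps_particle + 2.0 * eps_medium)

# Example: a latex sphere of 0.5 um radius in water (relative permittivities assumed).
print(sphere_polarizability_m3(0.5e-6, 2.5, 80.0))
```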
Heinemann, Lutz
2018-04-01
At the 2017 10th annual International Conference on Advanced Technologies and Treatments for Diabetes (ATTD) in Paris, France, four speakers presented their perspectives on the roles of continuous glucose monitoring (CGM) and of blood glucose monitoring (BGM) in patient management within one symposium. These presentations included discussions of the differences in the accuracy of CGM and BGM, a clinical perspective on the physiological reasons behind differences in CGM and BGM values, and an overview of the impact of variations in device accuracy on patients with diabetes. Subsequently a short summary of these presentations is given, highlighting the value of good accuracy of BGM or CGM systems and the ongoing need for standardization. The important role of both BGM and CGM in patient management was a theme across all presentations.
The autopsy: a professional responsibility in assuring quality of care.
Burton, Elizabeth C
2002-01-01
Forty years ago, the value of autopsies was widely recognized as new diseases were discovered or clarified and scientific technology advanced greatly. Despite the autopsy's strong foundation, its value is not currently being properly conveyed to physicians or patients. Although autopsy-related policy exists, these policies have had little effect on increasing or even maintaining adequate autopsy rates. More recently, the autopsy has fallen on hard times, with US hospital rates now below 5%. The reasons for the decline in rates are multifaceted and include a lack of direct reimbursement for the procedure, lack of defined minimum rate standards, overconfidence in diagnostic technology, and the fear of litigation. Regardless of the reasons for the declining rates, the ethical and professional reasons for increasing the number of autopsies are far more important.
[Conservative calibration of a clearance monitor system for waste material from nuclear medicine].
Wanke, Carsten; Geworski, Lilli
2014-09-01
Clearance monitor systems are used for gross gamma measurements of waste potentially contaminated with radioactivity. These measurements are intended to ensure that legal requirements, e.g. the clearance criteria of the German radiation protection ordinance, are met. This means that measurement results may overestimate, but must not underestimate, the true values. This paper describes a pragmatic way of using a calibrated Cs-137 point source to generate a conservative calibration for the clearance monitor system used at the Medizinische Hochschule Hannover (MHH). The most important nuclides used in nuclear medicine are considered. The measurement result reliably overestimates the true value of the activity present in the waste. The calibration complies with the demands for conservativeness and traceability to national standards. Copyright © 2014. Published by Elsevier GmbH.
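A minimal sketch of the conservative-calibration idea as read from the abstract (not the authors' exact procedure): convert the gross count rate to activity using the lowest plausible counting efficiency among the nuclides expected in the waste, so the reported activity can only overestimate the truth. The function name and efficiencies are hypothetical.

```python
# Conservative activity estimate from a gross-gamma count rate.
def conservative_activity_bq(net_count_rate_cps, efficiencies_cps_per_bq):
    worst_case_eff = min(efficiencies_cps_per_bq.values())   # least sensitive nuclide
    return net_count_rate_cps / worst_case_eff               # can only overestimate

nuclide_eff = {"Tc-99m": 0.012, "I-131": 0.018, "F-18": 0.025}   # hypothetical values
print(conservative_activity_bq(5.0, nuclide_eff))                # Bq, guaranteed >= truth
```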
Use of photovoltaic detector for photocatalytic activity estimation
NASA Astrophysics Data System (ADS)
Das, Susanta Kumar; Satapathy, Pravakar; Rao, P. Sai Shruti; Sabar, Bilu; Panda, Rudrashish; Khatua, Lizina
2018-05-01
Photocatalysis is a very important process and has numerous applications. Generally, to estimate the photocatalytic activity of a newly grown material, its reaction rate constant is evaluated with respect to a standard commercial TiO2 nanoparticle such as Degussa P25. Here a photovoltaic detector in conjunction with a laser is used to determine this rate constant. This method is tested using Zinc Orthotitanate (Zn2TiO4) nanoparticles prepared by solid state reaction, and the reaction rate constant is found to be six times higher than that of P25. The value is close to that obtained with a conventional system. Our proposed system is much more cost-effective than the conventional one and has the potential to do real-time monitoring of the photocatalytic activity.
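Assuming pseudo-first-order kinetics, as is typical in such comparisons, the relative rate constant against P25 can be obtained from a least-squares fit of ln(C0/C) versus time; the concentration data below are placeholders.

```python
import numpy as np

# Fit the pseudo-first-order rate constant k from ln(C0/C) = k*t, then compare materials.
def rate_constant(t_min, conc):
    return np.polyfit(t_min, np.log(conc[0] / np.asarray(conc, dtype=float)), 1)[0]

t = [0, 10, 20, 30, 40]                     # irradiation time (min)
c_sample = [1.00, 0.55, 0.30, 0.17, 0.09]   # hypothetical normalized concentrations
c_p25    = [1.00, 0.90, 0.82, 0.74, 0.67]
print(rate_constant(t, c_sample) / rate_constant(t, c_p25))   # activity relative to P25
```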
Understanding the value of emergency care: a framework incorporating stakeholder perspectives.
Sharp, Adam L; Cobb, Enesha M; Dresden, Scott M; Richardson, Derek K; Sabbatini, Amber K; Sauser, Kori; Kocher, Keith E
2014-09-01
In the face of escalating spending, measuring and maximizing the value of health services has become an important focus of health reform. Recent initiatives aim to incentivize high-value care through provider and hospital payment reform, but the role of the emergency department (ED) remains poorly defined. To achieve an improved understanding of the value of emergency care, we have developed a framework that incorporates the perspectives of stakeholders in the delivery of health services. A pragmatic review of the literature informed the design of this framework to standardize the definition of value in emergency care and discuss outcomes and costs from different stakeholder perspectives. The viewpoint of patient, provider, payer, health system, and society is each used to assess value for emergency medical conditions. We found that the value attributed to emergency care differs substantially by stakeholder perspective. Potential targets to improve ED value may be aimed at improving outcomes or controlling costs, depending on the acuity of the clinical condition. The value of emergency care varies by perspective, and a better understanding is achieved when specific outcomes and costs can be identified, quantified, and measured. Using this framework can help stakeholders find common ground to prioritize which costs and outcomes to target for research, quality improvement efforts, and future health policy impacting emergency care. Copyright © 2014 Elsevier Inc. All rights reserved.
42 CFR 493.1251 - Standard: Procedure manual.
Code of Federal Regulations, 2013 CFR
2013-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...
42 CFR 493.1251 - Standard: Procedure manual.
Code of Federal Regulations, 2014 CFR
2014-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...
42 CFR 493.1251 - Standard: Procedure manual.
Code of Federal Regulations, 2010 CFR
2010-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...
Code of Federal Regulations, 2012 CFR
2012-07-01
...-hour SO2 concentration values measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values are the metrics (i.e., statistics) that are compared to the NAAQS levels to determine compliance, calculated as specified in section 5 of this appendix. The design value...
Code of Federal Regulations, 2014 CFR
2014-07-01
...-hour SO2 concentration values measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values are the metrics (i.e., statistics) that are compared to the NAAQS levels to determine compliance, calculated as specified in section 5 of this appendix. The design value...
Code of Federal Regulations, 2011 CFR
2011-07-01
...-hour SO2 concentration values measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values are the metrics (i.e., statistics) that are compared to the NAAQS levels to determine compliance, calculated as specified in section 5 of this appendix. The design value...
Code of Federal Regulations, 2013 CFR
2013-07-01
...-hour SO2 concentration values measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values are the metrics (i.e., statistics) that are compared to the NAAQS levels to determine compliance, calculated as specified in section 5 of this appendix. The design value...
Code of Federal Regulations, 2010 CFR
2010-07-01
...-hour SO2 concentration values measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values are the metrics (i.e., statistics) that are compared to the NAAQS levels to determine compliance, calculated as specified in section 5 of this appendix. The design value...
40 CFR Appendix R to Part 50 - Interpretation of the National Ambient Air Quality Standards for Lead
Code of Federal Regulations, 2012 CFR
2012-07-01
... determine the design value. (B) The “below NAAQS level” test is as follows: Data substitution will be... the recalculated (“test”) result including the high values, shall be used to determine the design... (local standard time), that are used in NAAQS computations. Design value is the site-level metric (i.e...
40 CFR Appendix R to Part 50 - Interpretation of the National Ambient Air Quality Standards for Lead
Code of Federal Regulations, 2013 CFR
2013-07-01
... determine the design value. (B) The “below NAAQS level” test is as follows: Data substitution will be... the recalculated (“test”) result including the high values, shall be used to determine the design... (local standard time), that are used in NAAQS computations. Design value is the site-level metric (i.e...
40 CFR Appendix R to Part 50 - Interpretation of the National Ambient Air Quality Standards for Lead
Code of Federal Regulations, 2014 CFR
2014-07-01
... determine the design value. (B) The “below NAAQS level” test is as follows: Data substitution will be... the recalculated (“test”) result including the high values, shall be used to determine the design... (local standard time), that are used in NAAQS computations. Design value is the site-level metric (i.e...
40 CFR Appendix R to Part 50 - Interpretation of the National Ambient Air Quality Standards for Lead
Code of Federal Regulations, 2010 CFR
2010-07-01
... determine the design value. (B) The “below NAAQS level” test is as follows: Data substitution will be... the recalculated (“test”) result including the high values, shall be used to determine the design... (local standard time), that are used in NAAQS computations. Design value is the site-level metric (i.e...
40 CFR Appendix R to Part 50 - Interpretation of the National Ambient Air Quality Standards for Lead
Code of Federal Regulations, 2011 CFR
2011-07-01
... determine the design value. (B) The “below NAAQS level” test is as follows: Data substitution will be... the recalculated (“test”) result including the high values, shall be used to determine the design... (local standard time), that are used in NAAQS computations. Design value is the site-level metric (i.e...
Comparison of the nutrient-based standards for school lunches among South Korea, Japan, and Taiwan.
Kim, Meeyoung; Abe, Satoko; Zhang, Chengyu; Kim, Soyoung; Choi, Jiyu; Hernandez, Emely; Nozue, Miho; Yoon, Jihyun
2017-01-01
Nutritional standards are important guidelines for providing students with nutritionally-balanced school meals. This study compared nutrient-based school lunch standards regulated by South Korea, Japan, and Taiwan. The data were collected from relevant literature and websites of each country during September 2014. The number of classification groups of target students was 8, 5, and 5 for South Korea, Japan, and Taiwan, respectively. Gender was considered across all age groups in South Korea but only for high school students in Taiwan. Gender was not considered in Japan. Along with energy, the number of nutrients included in the standards for South Korea, Japan and Taiwan was 9, 12, and 4, respectively. The standards for all three countries included protein and fat among macronutrients. The standards for South Korea and Japan included vitamin A, B-1, B-2, and C, while the standards for Taiwan did not include any vitamins. Calcium was the only mineral commonly included in the three standards. The proportions of recommended daily intakes used as reference values for each nutrient differed among the countries. Japan differentiated the proportions among 33%, 40%, or 50%, reflecting the target students' intake status of the respective nutrients. Taiwan set either two-fifths or one-third of the recommended daily intakes. South Korea applied one-third of the recommended daily intake for all selected nutrients. This study could provide valuable information for countries developing nutrient-based standards for school lunches, and for South Korea, Japan, and Taiwan in the process of reforming their nutrient-based standards.
Markham, Wolfgang A; Young, Robert; Sweeting, Helen; West, Patrick; Aveyard, Paul
2012-07-01
Previous studies found lower substance use in schools achieving better examination and truancy results than expected, given their pupil populations (high value-added schools). This study examines whether these findings are replicated in West Scotland and whether school ethos indicators focussing on pupils' perceptions of schooling (environment, involvement, engagement and teacher-pupil relations) mediate the associations. Teenagers from forty-one schools (S2, aged 13, n = 2268; S4, aged 15, n = 2096) previously surveyed in primary school (aged 11, n = 2482) were surveyed in the late 1990s. School value-added scores were derived from standardised residuals of two regression equations separately predicting from pupils' socio-demographic characteristics (1) proportions of pupils passing five Scottish Standard Grade Examinations, and (2) half-day truancy loss. Outcomes were current smoking, monthly drinking, ever illicit drug use. Random effects logistic regression models adjusted for potential pupil-level confounders were used to assess (1) associations between substance use and school-level value-added scores and (2) whether these associations were mediated by pupils' perceptions of schooling or other school-level factors (school roll, religious denomination and mean aggregated school-level ethos scores). Against expectations, value-added education was positively associated with smoking (Odds Ratios [95% confidence intervals] for one standard deviation increase in value-added scores were 1.28 [1.02-1.61] in S2 and 1.13 [1.00-1.27] in S4) and positively but weakly and non-significantly associated with drinking and drug use. Engagement and positive teacher-pupil relations were strongly and negatively associated with all substance use outcomes at both ages. Other school-level factors appeared weakly and largely non-significantly related to substance use. Value-added scores were unrelated to school ethos measures and no ethos measure mediated associations between value-added education and substance use. We conclude that substance use in Scotland is more likely in high value-added schools, among disengaged students and those with poorer student-teacher relationships. Understanding the underpinning mechanisms is a potentially important public health concern. Copyright © 2012 Elsevier Ltd. All rights reserved.
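A minimal sketch of the value-added idea described above (not the authors' model): regress a school-level outcome on pupil socio-demographic composition and take the standardised residual as the value-added score. Column names and data are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical school-level data: outcome and two composition variables.
schools = pd.DataFrame({
    "exam_pass_rate": [0.42, 0.55, 0.61, 0.48, 0.70],
    "pct_free_meals": [0.35, 0.22, 0.18, 0.30, 0.10],
    "pct_single_parent": [0.28, 0.20, 0.15, 0.25, 0.12],
})

# Predicted pass rate given composition; the standardised residual is the value-added score.
fit = smf.ols("exam_pass_rate ~ pct_free_meals + pct_single_parent", data=schools).fit()
resid = fit.resid
schools["value_added"] = (resid - resid.mean()) / resid.std()
print(schools)
```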
Challenges and opportunities in synthetic biology for chemical engineers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, YZ; Lee, JK; Zhao, HM
Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement. (C) 2012 Elsevier Ltd. All rights reserved.
[Digitalization - management between 0 and 1].
NASA Astrophysics Data System (ADS)
Friedrich, Stefan; Rachholz, Josef
2017-09-01
Digitization, as the process of expressing actions and values by the codes 0 and 1, has already become part of our lives. Digitization enables enterprises to improve production and sales and to increase production volume. However, no standard digitization strategy has yet been developed. Even in a digitized business process management system, the most important position remains with the human being. Improving software products, their availability, and education in the introduction and use of information technology is therefore a decisive factor in the development of management and other current processes.
Brasil, Edikarlos M; Canavieira, Luciana M; Cardoso, Érica T C; Silva, Edilene O; Lameira, Jerônimo; Nascimento, José L M; Eifler-Lima, Vera L; Macchi, Barbarella M; Sriram, Dharmarajan; Bernhardt, Paul V; Silva, José Rogério Araújo; Williams, Craig M; Alves, Cláudio N
2017-11-01
Inhibition of mushroom tyrosinase was observed with synthetic dihydropyrano[3,2-b]chromenediones. Among them, DHPC04 displayed the most potent tyrosinase inhibitory activity, with a Ki value of 4 μM, comparable to the reference standard inhibitor kojic acid. A kinetic study suggested that these synthetic heterocyclic compounds behave as competitive inhibitors for the L-DOPA binding site of the enzyme. Furthermore, molecular modeling provided important insight into the mechanism of binding interactions with the tyrosinase copper active site. © 2017 John Wiley & Sons A/S.
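For reference, the textbook competitive-inhibition rate law is consistent with the kinetics reported above: the apparent Km rises by a factor (1 + [I]/Ki) while Vmax is unchanged. The substrate, Vmax, and Km values below are illustrative, with Ki = 4 μM as reported.

```python
# Michaelis-Menten rate in the presence of a competitive inhibitor.
def competitive_rate(s, vmax, km, inhibitor, ki):
    return vmax * s / (km * (1.0 + inhibitor / ki) + s)

# Example: substrate 1 mM, Km 0.5 mM, inhibitor at its Ki (4 uM) doubles the apparent Km.
print(competitive_rate(s=1e-3, vmax=1.0, km=5e-4, inhibitor=4e-6, ki=4e-6))
```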
The 5 Clinical Pillars of Value for Total Joint Arthroplasty in a Bundled Payment Paradigm.
Kim, Kelvin; Iorio, Richard
2017-06-01
Our large, urban, tertiary, university-based institution reflects on its 4-year experience with Bundled Payments for Care Improvement. We will describe the importance of 5 clinical pillars that have contributed to the early success of our bundled payment initiative. We are convinced that value-based care delivered through bundled payment initiatives is the best method to optimize patient outcomes while rewarding surgeons and hospitals for adapting to the evolving healthcare reforms. We summarize a number of experiences and lessons learned since the implementation of Bundled Payments for Care Improvement at our institution. Our experience has led to the development of more refined clinical pathways and coordination of care through evidence-based approaches. We have established that the success of the bundled payment program rests on the following 5 main clinical pillars: (1) optimizing patient selection and comorbidities; (2) optimizing care coordination, patient education, shared decision making, and patient expectations; (3) using a multimodal pain management protocol and minimizing narcotic use to facilitate rapid rehabilitation; (4) optimizing blood management, and standardizing venous thromboembolic disease prophylaxis treatment by risk standardizing patients and minimizing the use of aggressive anticoagulation; and (5) minimizing post-acute facility and resource utilization, and maximizing home resources for patient recovery. From our extensive experience with bundled payment models, we have established 5 clinical pillars of value for bundled payments. Our hope is that these principles will help ease the transition to value-based care for less-experienced healthcare systems. Copyright © 2017 Elsevier Inc. All rights reserved.
Pharmacognostic studies of the leaves and stem of Careya arborea Roxb.
Gupta, Prakash Chandra; Sharma, Nisha; Rao, Ch V
2012-01-01
Objective: To study the detailed pharmacognostic profile of the leaves and stem of Careya arborea (C. arborea) Roxb. (Lecythidaceae), an important medicinal plant in the Indian system of medicine. Methods: Leaf and stem samples of C. arborea were studied by macroscopical, microscopical, physicochemical, phytochemical and fluorescence analysis of the plant powder, and by other methods for standardization recommended by WHO. Results: Macroscopically, the leaves are simple and broadly obovate, with an acuminate apex, crenate-dentate margin and petioles 0.1–1.8 cm long. Microscopically, the leaf showed a large median vascular bundle covered with a fibrous bundle sheath and a cup-shaped arrangement of xylem; the presence of cortical vascular bundles, patches of sclerenchyma, phloem fibers in groups and brown-pigment-containing cells in the stem are some of the diagnostic features noted from the anatomical study. Powder microscopy of the leaf revealed the presence of parenchyma cells, xylem with pitted vessels and epidermis with anisocytic stomata. The investigations also included leaf surface data, quantitative leaf microscopy and fluorescence analysis. Physicochemical parameters such as loss on drying, swelling index, extractive values and ash values were also determined; the total ash of the stem bark was about two times higher than that of the leaf, and the water-soluble extractive value of both leaf and stem bark was two times higher than the alcohol-soluble extractive value. Preliminary phytochemical screening showed the presence of triterpenoids, saponins, tannins and flavonoids. Conclusions: The results of the study can serve as a valuable source of information and provide suitable standards for identification of this plant material in future investigations and applications. PMID:23569939
Chopin, Joshua; Kumar, Pankaj; Miklavcic, Stanley J
2018-01-01
One of the main challenges associated with image-based field phenotyping is the variability of illumination. During a single day's imaging session, or between different sessions on different days, the sun moves in and out of cloud cover and has varying intensity. How is one to know from consecutive images alone if a plant has become darker over time, or if the weather conditions have simply changed from clear to overcast? This is a significant problem to address as colour is an important phenotypic trait that can be measured automatically from images. In this work we use an industry standard colour checker to balance the colour in images within and across every day of a field trial conducted over four months in 2016. By ensuring that the colour checker is present in every image we are afforded a 'ground truth' to correct for varying illumination conditions across images. We employ a least squares approach to fit a quadratic model for correcting RGB values of an image in such a way that the observed values of the colour checker tiles align with their true values after the transformation. The proposed method is successful in reducing the error between observed and reference colour chart values in all images. Furthermore, the standard deviation of mean canopy colour across multiple days is reduced significantly after colour correction is applied. Finally, we use a number of examples to demonstrate the usefulness of accurate colour measurements in recording phenotypic traits and analysing variation among varieties and treatments.
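A minimal sketch of the correction step as described (the exact quadratic basis used in the study is an assumption): fit, by least squares, a polynomial model mapping the observed colour-checker RGB values to their known reference values, then apply the fitted transform to every pixel.

```python
import numpy as np

# Quadratic colour-correction model fitted on the colour-checker tiles.
def _design(rgb):
    r, g, b = rgb.T
    return np.column_stack([np.ones_like(r), r, g, b, r*g, r*b, g*b, r*r, g*g, b*b])

def fit_colour_transform(observed_rgb, reference_rgb):
    coeffs, *_ = np.linalg.lstsq(_design(observed_rgb), reference_rgb, rcond=None)
    return coeffs                                   # shape (10, 3)

def apply_colour_transform(rgb, coeffs):
    return _design(rgb) @ coeffs

# Demo with simulated data: 24 checker tiles seen under a dimmer, shifted illumination.
rng = np.random.default_rng(1)
ref = rng.uniform(0, 255, size=(24, 3))
obs = 0.8 * ref + 10 + rng.normal(0, 2, ref.shape)
coeffs = fit_colour_transform(obs, ref)
print(np.abs(apply_colour_transform(obs, coeffs) - ref).mean())   # small residual error
```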
Assessment of an undergraduate psychiatry course in an African setting
Baig, Benjamin J; Beaglehole, Anna; Stewart, Robert C; Boeing, Leonie; Blackwood, Douglas H; Leuvennink, Johan; Kauye, Felix
2008-01-01
Background: International reports recommend improvements in the amount and quality of training for mental health workers in low- and middle-income countries. The Scotland-Malawi Mental Health Education Project (SMMHEP) has been established to support the teaching of psychiatry to medical students in the University of Malawi. While, anecdotally, supportive medical education initiatives appear to be of value, little quantitative evidence exists to demonstrate whether such initiatives can deliver comparable educational standards. This study aimed to assess the effectiveness of an undergraduate psychiatry course given by UK psychiatrists in Malawi by studying University of Malawi and Edinburgh University medical students' performance on an MCQ examination paper. Methods: An undergraduate psychiatry course followed by an MCQ exam was delivered by the SMMHEP to 57 Malawi medical students. The same MCQ exam was given to 71 Edinburgh University medical students who subsequently sat their own Edinburgh University examination. Results: There were no significant differences between Edinburgh students' performance on the Malawi exam and their own Edinburgh University exam (p = 0.65). This suggests that the Malawi exam is of a comparable standard to the Edinburgh exam. Malawi students' marks ranged from 52.4% to 84.6%. Importantly, 84.4% of Malawi students scored above 60% on their exam, which would equate to a hypothetical pass by UK university standards. Conclusion: The support of an undergraduate course in an African setting by high-income country specialists can attain a high percentage pass rate by UK standards. Although didactic teaching has been surpassed by more novel educational methods, in resource-poor countries it remains an effective and cost-effective method of attaining an important educational standard. PMID:18430237
Assessing the Added Value of Dynamical Downscaling Using ...
In this study, the Standardized Precipitation Index (SPI) is used to ascertain the added value of dynamical downscaling over the contiguous United States. WRF is used as a regional climate model (RCM) to dynamically downscale reanalysis fields to compare values of SPI over drought timescales that have implications for agriculture and water resources planning. The regional climate generated by WRF shows the largest improvement over reanalysis in SPI correlation with observations as the drought timescale increases. This suggests that dynamically downscaled fields may be more reliable than larger-scale fields for water resource applications (e.g., water storage within reservoirs). WRF improves the timing and intensity of moderate to extreme wet and dry periods, even in regions with homogeneous terrain. This study also examines changes in SPI from the extreme drought of 1988 and three "drought busting" tropical storms. Each of those events illustrates the importance of using downscaling to resolve the spatial extent of droughts. The analysis of the "drought busting" tropical storms demonstrates that, while the impact of these storms on ending prolonged droughts is improved by the RCM relative to the reanalysis, it remains underestimated. These results illustrate the importance and some limitations of using RCMs to project drought.
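For context, here is a basic SPI calculation in the commonly used gamma-distribution formulation; the study's exact implementation (including the treatment of zero-precipitation periods) is an assumption, and the precipitation values are placeholders.

```python
import numpy as np
from scipy import stats

# SPI sketch: fit a gamma distribution to accumulated precipitation for one timescale,
# then map the fitted CDF values to standard-normal quantiles.
precip_3month = np.array([80., 120., 60., 200., 150., 95., 170., 40., 130., 110.])  # mm

shape, loc, scale = stats.gamma.fit(precip_3month, floc=0)
spi = stats.norm.ppf(stats.gamma.cdf(precip_3month, shape, loc=loc, scale=scale))
print(np.round(spi, 2))   # negative values indicate drier-than-median conditions
```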
Klämpfl, Tobias G; Isbary, Georg; Shimizu, Tetsuji; Li, Yang-Fang; Zimmermann, Julia L; Stolz, Wilhelm; Schlegel, Jürgen; Morfill, Gregor E; Schmidt, Hans-Ulrich
2012-08-01
Physical cold atmospheric surface microdischarge (SMD) plasma operating in ambient air has promising properties for the sterilization of sensitive medical devices where conventional methods are not applicable. Furthermore, SMD plasma could revolutionize the field of disinfection at health care facilities. The antimicrobial effects on Gram-negative and Gram-positive bacteria of clinical relevance, as well as the fungus Candida albicans, were tested. Thirty seconds of plasma treatment led to a 4 to 6 log(10) CFU reduction on agar plates. C. albicans was the hardest to inactivate. The sterilizing effect on standard bioindicators (bacterial endospores) was evaluated on dry test specimens that were wrapped in Tyvek coupons. The experimental D(23 °C) values for Bacillus subtilis, Bacillus pumilus, Bacillus atrophaeus, and Geobacillus stearothermophilus were determined as 0.3 min, 0.5 min, 0.6 min, and 0.9 min, respectively. These decimal reduction times (D values) are distinctly lower than D values obtained with other reference methods. Importantly, the high inactivation rate was independent of the material of the test specimen. Possible inactivation mechanisms for relevant microorganisms are briefly discussed, emphasizing the important role of neutral reactive plasma species and pointing to recent diagnostic methods that will contribute to a better understanding of the strong biocidal effect of SMD air plasma.
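The decimal reduction time quoted above follows the standard definition: the exposure time producing a tenfold drop in viable count, i.e. the negative inverse slope of log10(N) versus time. A minimal calculation with hypothetical survivor counts:

```python
import numpy as np

# D value from a log-linear survivor curve.
def d_value(time_min, counts):
    slope = np.polyfit(time_min, np.log10(counts), 1)[0]
    return -1.0 / slope

# Hypothetical survivor counts at 0, 0.5, 1.0 and 1.5 min of treatment.
print(d_value([0.0, 0.5, 1.0, 1.5], [1.0e6, 2.0e5, 4.0e4, 8.0e3]))   # ~0.72 min
```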
Music-Based Magnetic Resonance Fingerprinting to Improve Patient Comfort During MRI Exams
Ma, Dan; Pierre, Eric Y.; Jiang, Yun; Schluchter, Mark D.; Setsompop, Kawin; Gulani, Vikas; Griswold, Mark A.
2015-01-01
Purpose: The unpleasant acoustic noise is an important drawback of almost every magnetic resonance imaging scan. Instead of reducing the acoustic noise to improve patient comfort, a method is proposed to mitigate the noise problem by producing musical sounds directly from the switching magnetic fields while simultaneously quantifying multiple important tissue properties. Theory and Methods: MP3 music files were converted to arbitrary encoding gradients, which were then used with varying flip angles and TRs in both 2D and 3D MRF exams. This new acquisition method, named MRF-Music, was used to quantify T1, T2 and proton density maps simultaneously while providing pleasing sounds to the patients. Results: The MRF-Music scans were shown to significantly improve the patients' comfort during the MRI scans. The T1 and T2 values measured from the phantom are in good agreement with those from the standard spin echo measurements. T1 and T2 values from the brain scan are also close to previously reported values. Conclusions: The MRF-Music sequence provides a significant improvement in patient comfort as compared to the MRF scan and other fast imaging techniques such as EPI and TSE scans. It is also a fast and accurate quantitative method that quantifies multiple relaxation parameters simultaneously. PMID:26178439
Asher, Anthony L; Kerezoudis, Panagiotis; Mummaneni, Praveen V; Bisson, Erica F; Glassman, Steven D; Foley, Kevin T; Slotkin, Jonathan; Potts, Eric A; Shaffrey, Mark E; Shaffrey, Christopher I; Coric, Domagoj; Knightly, John J; Park, Paul; Fu, Kai-Ming; Devin, Clinton J; Archer, Kristin R; Chotai, Silky; Chan, Andrew K; Virk, Michael S; Bydon, Mohamad
2018-01-01
OBJECTIVE Patient-reported outcomes (PROs) play a pivotal role in defining the value of surgical interventions for spinal disease. The concept of minimum clinically important difference (MCID) is considered the new standard for determining the effectiveness of a given treatment and describing patient satisfaction in response to that treatment. The purpose of this study was to determine the MCID associated with surgical treatment for degenerative lumbar spondylolisthesis. METHODS The authors queried the Quality Outcomes Database registry from July 2014 through December 2015 for patients who underwent posterior lumbar surgery for grade I degenerative spondylolisthesis. Recorded PROs included scores on the Oswestry Disability Index (ODI), EQ-5D, and numeric rating scale (NRS) for leg pain (NRS-LP) and back pain (NRS-BP). Anchor-based (using the North American Spine Society satisfaction scale) and distribution-based (half a standard deviation, small Cohen's effect size, standard error of measurement, and minimum detectable change [MDC]) methods were used to calculate the MCID for each PRO. RESULTS A total of 441 patients (80 who underwent laminectomies alone and 361 who underwent fusion procedures) from 11 participating sites were included in the analysis. The changes in functional outcome scores between baseline and the 1-year postoperative evaluation were as follows: 23.5 ± 17.4 points for ODI, 0.24 ± 0.23 for EQ-5D, 4.1 ± 3.5 for NRS-LP, and 3.7 ± 3.2 for NRS-BP. The different calculation methods generated a range of MCID values for each PRO: 3.3-26.5 points for ODI, 0.04-0.3 points for EQ-5D, 0.6-4.5 points for NRS-LP, and 0.5-4.2 points for NRS-BP. The MDC approach appeared to be the most appropriate for calculating MCID because it provided a threshold greater than the measurement error and was closest to the average change difference between the satisfied and not-satisfied patients. On subgroup analysis, the MCID thresholds for laminectomy-alone patients were comparable to those for the patients who underwent arthrodesis as well as for the entire cohort. CONCLUSIONS The MCID for PROs was highly variable depending on the calculation technique. The MDC seems to be a statistically and clinically sound method for defining the appropriate MCID value for patients with grade I degenerative lumbar spondylolisthesis. Based on this method, the MCID values are 14.3 points for ODI, 0.2 points for EQ-5D, 1.7 points for NRS-LP, and 1.6 points for NRS-BP.
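A short sketch of the distribution-based thresholds named above, using common textbook formulas; the registry's exact implementation and the test-retest reliability value are assumptions, and the baseline scores are hypothetical.

```python
import numpy as np

# Distribution-based MCID candidates for a patient-reported outcome (e.g., ODI).
baseline = np.array([40.0, 55.0, 32.0, 60.0, 47.0, 51.0, 38.0])   # hypothetical scores
retest_r = 0.90                                                    # assumed reliability

sd = baseline.std(ddof=1)
half_sd = 0.5 * sd                                  # half a standard deviation
cohen_small = 0.2 * sd                              # small Cohen's effect size
sem = sd * np.sqrt(1.0 - retest_r)                  # standard error of measurement
mdc95 = 1.96 * np.sqrt(2.0) * sem                   # minimum detectable change (95%)
print(half_sd, cohen_small, sem, mdc95)
```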
Improving cancer patient emergency room utilization: A New Jersey state assessment.
Scholer, Anthony J; Mahmoud, Omar M; Ghosh, Debopyria; Schwartzman, Jacob; Farooq, Mohammed; Cabrera, Javier; Wieder, Robert; Adam, Nabil R; Chokshi, Ravi J
2017-12-01
Due to its increasing incidence and its major contribution to healthcare costs, cancer is a major public health problem in the United States. The impact across different services is not well documented, and utilization of emergency departments (ED) by cancer patients is not well characterized. The aim of our study was to identify factors that can be addressed to improve the appropriate delivery of quality cancer care, thereby reducing ED utilization, decreasing hospitalizations and reducing the related healthcare costs. The New Jersey State Inpatient and Emergency Department Databases were used to identify the primary outcome variables: patient disposition and readmission rates. The independent variables were demographics, payer and clinical characteristics. Multivariable unconditional logistic regression models using clinical and demographic data were used to predict hospital admission or emergency department return. A total of 37,080 emergency department visits were cancer-related; the most common diagnosis was lung cancer (30.0%) and the most common presentation was pain. The disposition of patients who visit the ED due to cancer-related issues is significantly affected by race (African American OR=0.6, p value=0.02 and Hispanic OR=0.5, p value=0.02, respectively), age 65 to 75 years (SNF/ICF OR 2.35, p value=0.00 and Home Healthcare Service OR 5.15, p value=0.01, respectively), number of diagnoses (OR 1.26, p value=0.00), insurance payer (SNF/ICF OR 2.2, p value=0.02 and Home Healthcare Services OR 2.85, p value=0.07, respectively) and type of cancer (breast OR 0.54, p value=0.01, prostate OR 0.56, p value=0.01, uterine OR 0.37, p value=0.02, and other OR 0.62, p value=0.05, respectively). In addition, comorbidities increased the likelihood of death, transfer to SNF/ICF, or utilization of home healthcare services (OR 1.6, p value=0.00, OR 1.18, p value=0.00, and OR 1.16, p value=0.04, respectively). Readmission is significantly affected by race (African Americans OR 0.41, standard error 0.08, p value=0.001 and Hispanics OR 0.29, standard error 0.11, p value=0.01, respectively), income (Quartile 2 OR 0.98, standard error 0.14, p value=0.01, Quartile 3 OR 1.07, standard error 0.13, p value=0.01, and Quartile 4 OR 0.88, standard error 0.12, p value=0.01, respectively), and type of cancer (prostate OR 0.25, standard error 0.09, p value=0.001). Web-based symptom questionnaires, patient navigators, end-of-life nursing and clinical cancer pathways can identify, guide and prompt early initiation of treatment before progression of symptoms in cancer patients most likely to visit the ED, thereby improving cancer patient satisfaction and outcomes and reducing healthcare costs. Published by Elsevier Ltd.
The value of mainstreaming human rights into health impact assessment.
MacNaughton, Gillian; Forman, Lisa
2014-09-26
Health impact assessment (HIA) is increasingly being used to predict the health and social impacts of domestic and global laws, policies and programs. In a comprehensive review of HIA practice in 2012, the authors indicated that, given the diverse range of HIA practice, there is an immediate need to reconsider the governing values and standards for HIA implementation [1]. This article responds to this call for governing values and standards for HIA. It proposes that international human rights standards be integrated into HIA to provide a universal value system backed up by international and domestic laws and mechanisms of accountability. The idea of mainstreaming human rights into HIA is illustrated with the example of impact assessments that have been carried out to predict the potential effects of intellectual property rights in international trade agreements on the availability and affordability of medicines. The article concludes by recommending international human rights standards as a legal and ethical framework for HIA that will enhance the universal values of nondiscrimination, participation, transparency and accountability and bring legitimacy and coherence to HIA practice as well.
Alternative prediction methods of protein and energy evaluation of pig feeds.
Święch, Ewa
2017-01-01
Precise knowledge of the actual nutritional value of individual feedstuffs and complete diets for pigs is important for efficient livestock production. Methods of assessment of protein and energy values in pig feeds have been briefly described. In vivo determination of the protein and energy values of feeds in pigs is time-consuming, expensive and very often requires the use of surgically-modified animals. There is a need for more simple, rapid, inexpensive and reproducible methods for routine feed evaluation. Protein and energy values of pig feeds can be estimated using the following alternative methods: 1) prediction equations based on chemical composition; 2) animal models such as rats, cockerels and growing pigs for adult animals; 3) rapid methods, such as the mobile nylon bag technique and in vitro methods. Alternative methods developed for predicting the total tract and ileal digestibility of nutrients, including amino acids, in feedstuffs and diets for pigs have been reviewed. This article focuses on two in vitro methods that can be used for the routine evaluation of amino acid ileal digestibility and energy value of pig feeds, and on factors affecting digestibility determined in vivo in pigs and by alternative methods. Validation of alternative methods has been carried out by comparing the results obtained using these methods with those acquired in vivo in pigs. In conclusion, the energy and protein values of pig feeds may be estimated with satisfactory precision in rats and by the two- or three-step in vitro methods, which provide equations for the calculation of standardized ileal digestibility of amino acids and metabolizable energy content. The use of alternative methods of feed evaluation is an important way to reduce stressful animal experiments.
Yang, Zhihui; Luo, Shuang; Wei, Zongsu; Ye, Tiantian; Spinney, Richard; Chen, Dong; Xiao, Ruiyang
2016-04-01
The second-order rate constants (k) of hydroxyl radical (·OH) with polychlorinated biphenyls (PCBs) in the gas phase are of scientific and regulatory importance for assessing their global distribution and fate in the atmosphere. Due to the limited number of measured k values, there is a need to model the k values for unknown PCB congeners. In the present study, we developed a quantitative structure-activity relationship (QSAR) model with quantum chemical descriptors using a sequential approach, including correlation analysis, principal component analysis, multi-linear regression, validation, and estimation of the applicability domain. The result indicates that a single descriptor, polarizability (α), plays an important role in determining the reactivity, with a global standardized function of lnk = -0.054 × α − 19.49 at 298 K. In order to validate the QSAR-predicted k values and expand the current k value database for PCB congeners, an independent method, density functional theory (DFT), was employed to calculate the kinetics and thermodynamics of the gas-phase ·OH oxidation of 2,4',5-trichlorobiphenyl (PCB31), 2,2',4,4'-tetrachlorobiphenyl (PCB47), 2,3,4,5,6-pentachlorobiphenyl (PCB116), 3,3',4,4',5,5'-hexachlorobiphenyl (PCB169), and 2,3,3',4,5,5',6-heptachlorobiphenyl (PCB192) at 298 K at the B3LYP/6-311++G**//B3LYP/6-31+G** level of theory. The QSAR-predicted and DFT-calculated k values for ·OH oxidation of these PCB congeners exhibit excellent agreement with the experimental k values, indicating the robustness and predictive power of the single-descriptor QSAR model we developed. Copyright © 2015 Elsevier Ltd. All rights reserved.
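The reported global relation can be applied directly; the polarizability value below is a placeholder, and its units (and hence the units of k) follow the paper's convention, which is not restated in the abstract.

```python
import numpy as np

# Global single-descriptor QSAR relation from the abstract: lnk = -0.054*alpha - 19.49 at 298 K.
def qsar_ln_k(alpha):
    return -0.054 * alpha - 19.49

alpha_example = 230.0              # hypothetical polarizability (units as in the paper)
print(np.exp(qsar_ln_k(alpha_example)))
```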
Jiménez-Sotelo, Paola; Hernández-Martínez, Maylet; Osorio-Revilla, Guillermo; Meza-Márquez, Ofelia Gabriela; García-Ochoa, Felipe; Gallardo-Velázquez, Tzayhrí
2016-07-01
Avocado oil is a high-value and nutraceutical oil whose authentication is very important, since the addition of low-cost oils could lower its beneficial properties. Mid-FTIR spectroscopy combined with chemometrics was used to detect and quantify adulteration of avocado oil with sunflower and soybean oils in a ternary mixture. Thirty-seven laboratory-prepared adulterated samples and 20 pure avocado oil samples were evaluated. The adulterant oil amount ranged from 2% to 50% (w/w) in avocado oil. A soft independent modelling of class analogy (SIMCA) model was developed to discriminate between pure and adulterated samples. The model showed recognition and rejection rates of 100% and proper classification in external validation. A partial least squares (PLS) algorithm was used to estimate the percentage of adulteration. The PLS model showed values of R(2) > 0.9961, standard errors of calibration (SEC) in the range of 0.3963-0.7881, standard errors of prediction (SEP, estimated) between 0.6483 and 0.9707, and good prediction performance in external validation. The results showed that mid-FTIR spectroscopy could be an accurate and reliable technique for qualitative and quantitative analysis of avocado oil in ternary mixtures.
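A minimal sketch of the PLS quantification step (the SIMCA classification step is omitted, and all data are random placeholders, so this shows only the workflow, not the study's model):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# X would hold mid-FTIR spectra (samples x wavenumbers) and y the adulteration level (%).
rng = np.random.default_rng(0)
X = rng.normal(size=(57, 600))            # placeholder spectra
y = rng.uniform(0, 50, size=57)           # placeholder adulteration levels (% w/w)

pls = PLSRegression(n_components=10)      # number of latent variables is an assumption
pls.fit(X, y)
y_pred = pls.predict(X).ravel()
sec = np.sqrt(np.mean((y - y_pred) ** 2)) # rough standard error of calibration
print(sec)
```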
Are renewables portfolio standards cost-effective emission abatement policy?
Dobesova, Katerina; Apt, Jay; Lave, Lester B
2005-11-15
Renewables portfolio standards (RPS) could be an important policy instrument for 3P and 4P control. We examine the costs of renewable power, accounting for the federal production tax credit, the market value of a renewable credit, and the value of producing electricity without emissions of SO2, NOx, mercury, and CO2. We focus on Texas, which has a large RPS and is the largest U.S. electricity producer and one of the largest emitters of pollutants and CO2. We estimate the private and social costs of wind generation in an RPS compared with the current cost of fossil generation, accounting for the pollution and CO2 emissions. We find that society paid about 5.7 cent/kWh more for wind power, counting the additional generation, transmission, intermittency, and other costs. The higher cost includes credits amounting to 1.1 cent/kWh in reduced SO2, NOx, and Hg emissions. These pollution reductions and lower CO2 emissions could be attained at about the same cost using pulverized coal (PC) or natural gas combined cycle (NGCC) plants with carbon capture and sequestration (CCS); the reductions could be obtained more cheaply with an integrated coal gasification combined cycle (IGCC) plant with CCS.
[Legionella spp. contamination in indoor air: preliminary results of an Italian multicenter study].
Montagna, Maria Teresa; De Giglio, Osvalda; Napoli, Christian; Cannova, Lucia; Cristina, Maria Luisa; Deriu, Maria Grazia; Delia, Santi Antonino; Giuliano, Ada; Guida, Marco; Laganà, Pasqualina; Liguori, Giorgio; Mura, Ida; Pennino, Francesca; Rossini, Angelo; Tardivo, Stefano; Torre, Ida; Torregrossa, Maria Valeria; Villafrate, Maria Rosaria; Albertini, Roberto; Pasquarella, Cesira
2014-01-01
To propose a standardized protocol for the evaluation of Legionella contamination in air. A bathroom having a Legionella contamination in water >1,000 cfu/l was selected in 10 different healthcare facilities. Air contamination was assessed by active (Surface Air System, SAS) and passive (Index of Microbial Air, IMA) sampling for 8 hours, about 1 m away from the floor and 50 cm from the tap water. Two hundred liters of air were sampled by SAS every 12 min, after flushing water for 2 min. The IMA value was calculated as the mean value of colony forming units/16 plates exposed during sampling (2 plates/hour). Water contamination was evaluated at T0, after 4 and 8 hours, according to the standard methods. Air contamination by Legionella was found in three healthcare facilities (one with active and two with passive sampling), showing a concomitant tap water contamination (median=40,000; range 1,100-43,000 cfu/l). The remaining seven hospitals isolated Legionella spp. exclusively from water samples (median=8,000; range 1,200-70,000 cfu/l). Our data suggest that environmental Legionella contamination cannot be assessed only through the air sampling, even in the presence of an important water contamination.
[Aquatic Ecological Index based on freshwater (ICE(RN-MAE)) for the Rio Negro watershed, Colombia].
Forero, Laura Cristina; Longo, Magnolia; John Jairo, Ramirez; Guillermo, Chalar
2014-04-01
Available indices to assess the ecological status of rivers in Colombia are mostly based on subjective hypotheses about macroinvertebrate tolerance to pollution, which have important limitations. Here we present the application of a method to establish an index of ecological quality for lotic systems in Colombia. The index, based on macroinvertebrate abundance and physicochemical variables, was developed as an alternative to the BMWP-Col index. The method consists of determining an environmental gradient from correlations between physicochemical variables and abundance. The scores obtained at each sampling point are then used in a weighted-averaging (WA) model, in which abundances are weighted to estimate the optimum and tolerance values of each taxon; using this information we estimated the index of ecological quality based on macroinvertebrate abundance (ICE(RN-MAE)) at each sampling site. Subsequently, we classified all sites using the index and concentrations of total phosphorus (TP) in a cluster analysis. Using the mean, maximum, minimum and standard deviation of TP and ICE(RN-MAE), we defined threshold values corresponding to three categories of ecological status: good, fair and critical.
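A minimal sketch of the weighted-averaging step described above: each taxon's optimum is the abundance-weighted mean of the site scores, and its tolerance is the abundance-weighted standard deviation. The arrays are placeholders.

```python
import numpy as np

# Site scores along the environmental gradient (one per site) and taxon abundances
# (sites x taxa); both are hypothetical.
site_scores = np.array([-1.2, -0.4, 0.1, 0.8, 1.5])
abundance = np.array([[12, 0], [30, 2], [18, 9], [4, 25], [0, 40]], dtype=float)

optima = (abundance * site_scores[:, None]).sum(axis=0) / abundance.sum(axis=0)
tolerance = np.sqrt((abundance * (site_scores[:, None] - optima) ** 2).sum(axis=0)
                    / abundance.sum(axis=0))
print(optima, tolerance)
```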
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, N. J., E-mail: n.stone@physics.ox.ac.uk
The most recent tabulations of nuclear magnetic dipole and electric quadrupole moments have been prepared and published by the Nuclear Data Section of the IAEA, Vienna [N. J. Stone, Report No. INDC(NDS)-0650 (2013); Report No. INDC(NDS)-0658 (2014)]. The first of these is a table of recommended quadrupole moments for all isotopes in which all experimental results are made consistent with a limited number of adopted standards for each element; the second is a combined listing of all measurements of both moments. Both tables cover all isotopes and energy levels. In this paper, the considerations relevant to the preparation of both tables are described, together with observations as to the importance and (where appropriate) application of necessary corrections to achieve the “best” values. Some discussion of experimental methods is included with emphasis on their precision. The aim of the published quadrupole moment table is to provide a standard reference in which the value given for each moment is the best available and for which full provenance is given. A table of recommended magnetic dipole moments is in preparation, with the same objective in view.
Limits of linearity and detection for some drugs of abuse.
Needleman, S B; Romberg, R W
1990-01-01
The limits of linearity (LOL) and detection (LOD) are important factors in establishing the reliability of an analytical procedure for accurately assaying drug concentrations in urine specimens. Multiple analyses of analyte over an extended range of concentrations provide a measure of the ability of the analytical procedure to correctly identify known quantities of drug in a biofluid matrix. Each of the seven drugs of abuse gives linear analytical responses from concentrations at or near its LOD to concentrations several-fold higher than those generally encountered in the drug screening laboratory. The upper LOL exceeds the Department of the Navy (DON) cutoff values by factors of approximately 2 to 160. The LOD varies from 0.4 to 5.0% of the DON cutoff value for each drug. The limit of quantitation (LOQ) is calculated as the LOD + 7 SD. The range for LOL is greater for drugs analyzed with deuterated internal standards than for those using conventional internal standards. For THC acid, cocaine, PCP, and morphine, LOLs are 8- to 160-fold greater than the defined cutoff concentrations. For the other drugs, the LOLs are only 2- to 4-fold greater than the defined cutoff concentrations.
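A minimal sketch of the limit arithmetic follows. The LOQ = LOD + 7 SD relation is the one stated above; estimating the LOD from blank replicates as mean + 3 SD, and taking the SD in the LOQ relation from low-level replicates, are common conventions assumed here rather than details from the paper, and all replicate values are hypothetical.

    import statistics

    # Detection/quantitation limit arithmetic. LOQ = LOD + 7*SD is the relation stated in the
    # abstract; estimating LOD from blanks as mean + 3*SD and taking the SD from low-level
    # replicates are assumed conventions. All replicate responses are hypothetical.
    blank_replicates = [0.8, 1.1, 0.9, 1.0, 1.2, 0.7, 1.0]
    low_level_replicates = [10.2, 9.8, 10.5, 9.9, 10.1, 10.4]

    lod = statistics.mean(blank_replicates) + 3 * statistics.stdev(blank_replicates)
    loq = lod + 7 * statistics.stdev(low_level_replicates)
    print(f"LOD = {lod:.2f}, LOQ = {loq:.2f} (instrument response units)")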
Brown, Gary C; Brown, Melissa M; Brown, Heidi C; Kindermann, Sylvia; Sharma, Sanjay
2007-01-01
To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy with verteporfin confers a 7.8% to 10.7% value gain for the treatment of classic subfoveal choroidal neovascularization. Intravitreal ranibizumab therapy confers greater than a 15% value gain for the treatment of subfoveal occult and minimally classic subfoveal choroidal neovascularization. The majority of cost-utility studies performed on interventions for neovascular macular degeneration are value-based medicine studies and thus are comparable. Value-based analyses of neovascular ARMD monotherapies demonstrate the power of value-based medicine to improve quality of care and concurrently maximize the efficacy of healthcare resource use in public policy. The comparability of value-based medicine cost-utility analyses has important implications for overall practice standards and public policy. The adoption of value-based medicine standards can greatly facilitate the goal of higher-quality care and maximize the best use of healthcare funds.
The nutrient value of Imbrasia belina Lepidoptera: Saturnidae (madora).
Onigbinde, A O; Adamolekun, B
1998-05-01
Objective: To determine the pattern of consumption of Imbrasia belina (madora) and other edible insects and also to compare the nutrient values of madora larvae and two of its variants (Anaphe venata and Cirina forda) to those of some conventional sources of protein. Setting: University of Zimbabwe. Subjects: 100 workers who admitted to a history of entomophagy. Main outcome measures: Popularity score of madora compared with those of other edible insects, and approximate compositions of nutrients in the larvae compared with standard proteins. Results: Most respondents (65%) were introduced to entomophagy by their parents. Termites were the most frequently consumed, followed by madora. More respondents ate insects because of their perceived nutritional value than because of their relative availability. There was no association of entomophagy with significant side effects. The protein, fat and mineral contents of the larvae were superior to those of beef and chicken. There were no major differences in the nutrient composition of the three Lepidoptera variants. Conclusion: The high nutrient value and low cost of these larvae make them an important protein supplement, especially for people in the low income group.
Love as a regulative ideal in surrogate decision making.
Stonestreet, Erica Lucast
2014-10-01
This discussion aims to give a normative theoretical basis for a "best judgment" model of surrogate decision making rooted in a regulative ideal of love. Currently, there are two basic models of surrogate decision making for incompetent patients: the "substituted judgment" model and the "best interests" model. The former draws on the value of autonomy and responds with respect; the latter draws on the value of welfare and responds with beneficence. It can be difficult to determine which of these two models is more appropriate for a given patient, and both approaches may seem inadequate for a surrogate who loves the patient. The proposed "best judgment" model effectively draws on the values incorporated in each of the traditional standards, but does so because these values are important to someone who loves a patient, since love responds to the patient as the specific person she is. © The Author 2014. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Commercial Discount Rate Estimation for Efficiency Standards Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fujita, K. Sydny
2016-04-13
Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards is a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the Life-Cycle Cost model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).
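A minimal sketch of why the chosen discount rate matters in an LCC comparison: the present value of a constant stream of annual energy-cost savings is set against the incremental first cost of the more efficient design. All dollar amounts, lifetimes, and rates below are hypothetical placeholders, not DOE rulemaking inputs.

    # Life-cycle cost sketch: present value of a constant annual operating-cost saving at a
    # given commercial discount rate, compared against the incremental first cost. All
    # dollar amounts, lifetimes, and rates are hypothetical placeholders.
    def present_value(annual_saving, discount_rate, lifetime_years):
        """Discounted sum of a constant annual saving over the equipment lifetime."""
        return sum(annual_saving / (1 + discount_rate) ** t
                   for t in range(1, lifetime_years + 1))

    incremental_first_cost = 400.0   # extra purchase price of the more efficient unit ($)
    annual_energy_saving = 60.0      # operating-cost saving per year ($)

    for rate in (0.03, 0.05, 0.07):
        net = present_value(annual_energy_saving, rate, lifetime_years=15) - incremental_first_cost
        print(f"discount rate {rate:.0%}: net LCC saving = ${net:,.0f}")

A higher assumed discount rate shrinks the present value of future savings, which is why the commercial discount rate estimate can change the fraction of customers projected to achieve LCC savings.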
Infant obesity and severe obesity growth patterns in the first two years of life.
Gittner, Lisaann S; Ludington-Hoe, Susan M; Haller, Harold S
2014-04-01
Distinguishing an obesity growth pattern that originates during infancy is clinically important. Infancy-based obesity prevention interventions may be needed while precursors of later health are forming. Infant obesity and severe obesity growth patterns in the first 2 years are described and distinguished from a normal weight growth pattern. A retrospective chart review was conducted. Body mass index (BMI) growth patterns from birth to 2 years are described for children categorized at 5 years as normal weight (n = 61), overweight (n = 47), obese (n = 41) and severely obese (n = 72) cohorts using WHO reference standards. BMI values were calculated at birth; 1 week; 2, 4, 6, 9, 12, 15, and 18 months; and 2 and 5 years. Graphs from the longitudinal analysis of variance of mean BMI values identified the earliest significant divergence of a cohort's average BMI pattern from the other cohorts' patterns. ANOVA and Pearson product-moment correlations were also performed. Statistically significant differences in BMI values and differences in growth patterns between cohorts were evident as early as 2-6 months post-birth. Children who were obese or severely obese at 5 years demonstrated a BMI pattern within the first 2 years of life that differed from that of children who were normal weight at 5 years. The earliest significant correlation between early BMI values and the 5-year BMI value was at 4 months post-birth. The study fills an important gap by demonstrating early onset of an infant obesity growth pattern in full-term children who were healthy throughout their first 5 years of life.
Hands, B; Parker, H E; Rose, E; Larkin, D
2016-03-01
Perceptions of the effects of physical activity could facilitate or deter future participation. This study explored the differences between gender and motor competence at 14 years of age in the perceptions of likelihood and importance of physical activity outcomes. The sample comprised 1582 14-year-old adolescents (769 girls) from the Western Australian Pregnancy Cohort (Raine) Study. Four motor competence groups were formed from a standardized Neuromuscular Developmental Index score (McCarron 1997). Perceptions of the likelihood and the importance of 15 physical activity outcomes were measured by a questionnaire developed for the NSW Schools Fitness and Physical Activity Survey (Booth et al. 1997). Gender (two) × motor competence (four) analyses of variance and Tukey post hoc were conducted on outcome scores (P < 0.02) using SPSS version 17. Gender differences were found in the perceived likelihood and importance of physical activity outcomes within competition, social friendships and injury domains. Motor competence was significant in the perceived likelihood of physical health (P < 0.001), psychosocial (P < 0.009) and competition (P < 0.002) outcomes, with lower perceptions by the least competent groups. Significantly lower importance was perceived for academic outcomes for 14 year olds categorized with low compared with high motor competence (P < 0.005). Regardless of motor competence and gender, the same health and fun outcomes were ranked the highest in likelihood and the highest in importance. Although level of motor competence at 14 years affected the perceived likelihood of health, social and fun outcomes from future participation in physical activity, adolescents highly valued these outcomes, whereas gender affected competition and winning, outcomes that were less valued. Physical activity that promotes these key and valued outcomes may encourage young people's ongoing involvement in physical activity, especially for those at risk of low participation. © 2015 John Wiley & Sons Ltd.
Impact of the hard-coded parameters on the hydrologic fluxes of the land surface model Noah-MP
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Samaniego, Luis; Clark, Martyn; Wulfmeyer, Volker; Attinger, Sabine; Thober, Stephan
2016-04-01
Land surface models incorporate a large number of processes, described by physical, chemical and empirical equations. The process descriptions contain a number of parameters that can be soil or plant type dependent and are typically read from tabulated input files. Land surface models may have, however, process descriptions that contain fixed, hard-coded numbers in the computer code, which are not identified as model parameters. Here we searched for hard-coded parameters in the computer code of the land surface model Noah with multiple process options (Noah-MP) to assess the importance of the fixed values in restricting the model's agility during parameter estimation. We found 139 hard-coded values in all Noah-MP process options, which are mostly spatially constant values. This is in addition to the 71 standard parameters of Noah-MP, which mostly get distributed spatially by given vegetation and soil input maps. We performed a Sobol' global sensitivity analysis of Noah-MP to variations of the standard and hard-coded parameters for a specific set of process options. Forty-two standard parameters and 75 hard-coded parameters were active with the chosen process options. The sensitivities of the hydrologic output fluxes latent heat and total runoff, as well as their component fluxes, were evaluated at twelve catchments of the Eastern United States with very different hydro-meteorological regimes. Noah-MP's hydrologic output fluxes are sensitive to two thirds of its standard parameters. The most sensitive parameter is, however, a hard-coded value in the formulation of soil surface resistance for evaporation, which proved to be oversensitive in other land surface models as well. Surface runoff is sensitive to almost all hard-coded parameters of the snow processes and the meteorological inputs. These parameter sensitivities diminish in total runoff. Assessing these parameters in model calibration would require detailed snow observations or the calculation of hydrologic signatures of the runoff data. Latent heat and total runoff exhibit very similar sensitivities towards standard and hard-coded parameters in Noah-MP because of their tight coupling via the water balance. It should therefore be comparable to calibrate Noah-MP either against latent heat observations or against river runoff data. Latent heat and total runoff are sensitive to both plant and soil parameters. Calibrating only a sub-set of parameters, for example only soil parameters, thus limits the ability to derive realistic model parameters. It is thus recommended to include the most sensitive hard-coded model parameters that were exposed in this study when calibrating Noah-MP.
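For readers unfamiliar with the workflow, the sketch below runs a Sobol' analysis on a toy function with a mix of "standard" and exposed "hard-coded" parameters using the SALib package; the function, parameter names, and bounds are illustrative stand-ins, not Noah-MP process equations.

    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    # Sobol' sensitivity analysis on a toy stand-in: two "standard" parameters and one
    # exposed "hard-coded" parameter. The function, names, and bounds are illustrative only.
    problem = {
        "num_vars": 3,
        "names": ["veg_param", "soil_param", "hardcoded_surface_resistance"],
        "bounds": [[0.1, 1.0], [0.1, 2.0], [10.0, 200.0]],
    }

    def toy_flux(x):
        veg, soil, resistance = x
        return veg * soil / (1.0 + resistance / 50.0)   # arbitrary nonlinear response

    X = saltelli.sample(problem, 1024)                  # N*(2D+2) parameter sets
    Y = np.apply_along_axis(toy_flux, 1, X)
    Si = sobol.analyze(problem, Y)
    for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
        print(f"{name:30s} first-order = {s1:.2f}  total = {st:.2f}")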
Use of benefit-cost analysis in establishing Federal radiation protection standards: a review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, L.E.
1979-10-01
This paper complements other work which has evaluated the cost impacts of radiation standards on the nuclear industry. It focuses on the approaches to valuation of the health and safety benefits of radiation standards and the actual and appropriate processes of benefit-cost comparison. A brief historical review of the rationale(s) for the levels of radiation standards prior to 1970 is given. The Nuclear Regulatory Commission (NRC) established numerical design objectives for light water reactors (LWRs). The process of establishing these numerical design criteria below the radiation protection standards set in 10 CFR 20 is reviewed. EPA's 40 CFR 190 environmental standards for the uranium fuel cycle have lower values than NRC's radiation protection standards in 10 CFR 20. The task of allocating EPA's 40 CFR 190 standards to the various portions of the fuel cycle was left to the implementing agency, NRC. So whether or not EPA's standards for the uranium fuel cycle are more stringent for LWRs than NRC's numerical design objectives depends on how EPA's standards are implemented by NRC. In setting the numerical levels in Appendix I to 10 CFR 50 and 40 CFR 190, NRC and EPA, respectively, focused on the costs of compliance with various levels of radiation control. A major portion of the paper is devoted to a review and critique of the available methods for valuing health and safety benefits. All current approaches try to estimate a constant value of life and use this to value the expected number of lives saved. This paper argues that it is more appropriate to seek a value of a reduction in risks to health and life that varies with the extent of these risks. Additional research to do this is recommended. (DC)
Astrophysical tests for radiative decay of neutrinos and fundamental physics implications
NASA Technical Reports Server (NTRS)
Stecker, F. W.; Brown, R. W.
1981-01-01
The radiative lifetime tau for the decay of massive neutrinos was calculated using various physical models for neutrino decay. The results were then related to the astrophysical problem of the detectability of the decay photons from cosmic neutrinos. Conversely, the astrophysical data were used to place lower limits on tau. These limits are all well below predicted values. However, an observed feature at approximately 1700 Å in the ultraviolet background radiation at high galactic latitudes may be from the decay of neutrinos with mass approximately 14 eV. This would require a decay rate much larger than the predictions of standard models but could be indicative of a decay rate possible in composite models or other new physics. Thus an important test for substructure in leptons and quarks or other physics beyond the standard electroweak model may have been found.
XAFS Data Interchange: A single spectrum XAFS data file format
NASA Astrophysics Data System (ADS)
Ravel, B.; Newville, M.
2016-05-01
We propose a standard data format for the interchange of XAFS data. The XAFS Data Interchange (XDI) standard is meant to encapsulate a single spectrum of XAFS along with relevant metadata. XDI is a text-based format with a simple syntax which clearly delineates metadata from the data table in a way that is easily interpreted both by a computer and by a human. The metadata header is inspired by the format of an electronic mail header, representing metadata names and values as an associative array. The data table is represented as columns of numbers. This format can be imported as is into most existing XAFS data analysis, spreadsheet, or data visualization programs. Along with a specification and a dictionary of metadata types, we provide an application-programming interface written in C and bindings for programming dynamic languages.
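To make the header-plus-columns idea concrete, the fragment below shows an illustrative file in the same spirit (name/value metadata in a comment header, numeric columns beneath) and a minimal parser; the metadata names and the end-of-header marker are hypothetical illustrations and are not quoted from the XDI specification or its metadata dictionary.

    import io

    # Minimal parser for a header-plus-column-data text file of the kind described: "Name: value"
    # metadata lines are read into an associative array (dict), the rest is a numeric table.
    # The metadata names and the end-of-header marker below are hypothetical illustrations.
    sample = """# Element.symbol: Cu
    # Element.edge: K
    # Column.1: energy eV
    # Column.2: mu
    # ///
    8970.0  0.112
    8980.0  0.734
    8990.0  1.021
    """

    def parse(text):
        metadata, rows = {}, []
        for line in io.StringIO(text):
            line = line.strip()
            if line.startswith("#"):
                body = line.lstrip("#").strip()
                if ":" in body:
                    key, value = body.split(":", 1)
                    metadata[key.strip()] = value.strip()
            elif line:
                rows.append([float(v) for v in line.split()])
        return metadata, rows

    meta, data = parse(sample)
    print(meta["Element.symbol"], len(data), "data rows")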
Henderson, Amanda; Winch, Sarah
2008-01-01
Leadership strategies are important in enabling the nursing profession to reach its optimum standards in the practice environment. To compare and contrast the central tenets of contemporary quality initiatives that are commensurate with enabling the environment so that best practice can occur. Democratic leadership, accessible and relevant education and professional development, the incorporation of evidence into practice and the ability of facilities to be responsive to change are core considerations for the successful maintenance of practice standards that are consistent with best nursing practice. While different concerns of management drive the adoption of contemporary approaches, there are many similarities in how these approaches are translated into action in the clinical setting. Managers should focus on core principles of professional nursing that add value to practice rather than business processes.
Submicrosecond characteristics of lightning return-stroke currents
NASA Technical Reports Server (NTRS)
Leteinturier, Christiane; Hamelin, Joel H.; Eybert-Berard, Andre
1991-01-01
The authors describe the experimental results obtained during 1987 and 1988 triggered-lightning experiments in Florida. Seventy-four simultaneous submicrosecond time-resolved measurements of triggered return-stroke current (I) and current derivative (dI/dt) were made in Florida in 1987 and 1988. Peak currents ranged from about 5 to 76 kA, peak dI/dt amplitude from 13 to 411 kA/microsec and rise time from 90 to 1000 ns. The mean peak dI/dt values of 110 kA/microsec were 2-3 times higher than data from instrumented towers and peak I and dI/dt appear to be positively correlated. These data confirm previous experiments and conclusions supported by forty measurements. They are important in order to define, for example, standards for lightning protection. Present standards give a dI/dt maximum of 140 kA/microsec.
Analysis and design of a standardized control module for switching regulators
NASA Astrophysics Data System (ADS)
Lee, F. C.; Mahmoud, M. F.; Yu, Y.; Kolecki, J. C.
1982-07-01
Three basic switching regulators: buck, boost, and buck/boost, employing a multiloop standardized control module (SCM) were characterized by a common small signal block diagram. Employing the unified model, regulator performances such as stability, audiosusceptibility, output impedance, and step load transient are analyzed and key performance indexes are expressed in simple analytical forms. More importantly, the performance characteristics of all three regulators are shown to enjoy common properties due to the unique SCM control scheme which nullifies the positive zero and provides adaptive compensation to the moving poles of the boost and buck/boost converters. This allows a simple unified design procedure to be devised for selecting the key SCM control parameters for an arbitrarily given power stage configuration and parameter values, such that all regulator performance specifications can be met and optimized concurrently in a single design attempt.
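The "positive zero" that the SCM scheme nullifies can be made concrete with the textbook continuous-conduction-mode small-signal results (standard expressions, not taken from this paper): the buck control-to-output transfer function contains no right-half-plane zero, whereas the boost stage (and similarly the buck/boost) contributes one whose frequency moves with duty cycle D and load R, which is why adaptive compensation is needed.

    % Standard CCM small-signal forms (textbook results, not from this paper)
    G_{vd}^{\mathrm{buck}}(s) = \frac{V_g}{1 + s\,L/R + s^{2}LC},
    \qquad
    \omega_{z,\mathrm{RHP}}^{\mathrm{boost}} = \frac{(1-D)^{2}R}{L}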
Towards Dynamic Contrast Specific Ultrasound Tomography
NASA Astrophysics Data System (ADS)
Demi, Libertario; van Sloun, Ruud J. G.; Wijkstra, Hessel; Mischi, Massimo
2016-10-01
We report on the first study demonstrating the ability of a recently-developed, contrast-enhanced, ultrasound imaging method, referred to as cumulative phase delay imaging (CPDI), to image and quantify ultrasound contrast agent (UCA) kinetics. Unlike standard ultrasound tomography, which exploits changes in speed of sound and attenuation, CPDI is based on a marker specific to UCAs, thus enabling dynamic contrast-specific ultrasound tomography (DCS-UST). For breast imaging, DCS-UST will lead to a more practical, faster, and less operator-dependent imaging procedure compared to standard echo-contrast, while preserving accurate imaging of contrast kinetics. Moreover, a linear relation between CPD values and ultrasound second-harmonic intensity was measured (coefficient of determination = 0.87). DCS-UST can find clinical applications as a diagnostic method for breast cancer localization, adding important features to multi-parametric ultrasound tomography of the breast.
Taha, Muhammad; Arbin, Mastura; Ahmat, Norizan; Imran, Syahrul; Rahim, Fazal
2018-04-01
Due to the great biological importance of β-glucuronidase inhibitors, in this study we synthesized a library of novel benzothiazole derivatives (1-30), characterized them by different spectroscopic methods, and evaluated them for β-glucuronidase inhibitory potential. Among the series, sixteen compounds, i.e. 1-6, 8, 9, 11, 14, 15, 20-23 and 26, showed outstanding inhibitory potential, with IC50 values ranging between 16.50 ± 0.26 and 59.45 ± 1.12 µM, compared with the standard d-saccharic acid 1,4-lactone (48.4 ± 1.25 µM). Except for compounds 8 and 23, all active analogs showed better potential than the standard. A structure-activity relationship has been established. Copyright © 2018 Elsevier Inc. All rights reserved.
Board Certification: Going Back to the Future.
Nora, Lois Margaret; McGreal, Sylvia Fonte; Nugent, Samantha Guastella
2016-09-01
The authors present snapshots of board certification in 1916, the year that the American Board of Ophthalmology was founded, 60 years later in 1976 as periodic recertification emerged, and speculation about what certification might look like in 2036. The concept of board certification and continuous certification in the medical specialties took shape at the beginning of the 20th century with the convergence of a new system of assessment, the emergence of certifying boards, and the creation of the American Board of Medical Specialties (ABMS). The importance of self-regulation is emphasized as are the principles underlying board certification and the standards that guide it to support its continued relevance as a valued credential and symbol of the highest standard in the practice of medicine. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Mental and social health during and after acute emergencies: emerging consensus?
van Ommeren, Mark; Saxena, Shekhar; Saraceno, Benedetto
2005-01-01
Mental health care programmes during and after acute emergencies in resource-poor countries have been considered controversial. There is no agreement on the public health value of the post-traumatic stress disorder concept and no agreement on the appropriateness of vertical (separate) trauma-focused services. A range of social and mental health intervention strategies and principles seem, however, to have the broad support of expert opinion. Despite continuing debate, there is emerging agreement on what entails good public health practice in respect of mental health. In terms of early interventions, this agreement is exemplified by the recent inclusion of a "mental and social aspects of health" standard in the Sphere handbook's revision on minimal standards in disaster response. This affirmation of emerging agreement is important and should give clear messages to health planners. PMID:15682252
Trevethan, Robert
2017-01-01
Within the context of screening tests, it is important to avoid misconceptions about sensitivity, specificity, and predictive values. In this article, therefore, foundations are first established concerning these metrics along with the first of several aspects of pliability that should be recognized in relation to those metrics. Clarification is then provided about the definitions of sensitivity, specificity, and predictive values and why researchers and clinicians can misunderstand and misrepresent them. Arguments are made that sensitivity and specificity should usually be applied only in the context of describing a screening test's attributes relative to a reference standard; that predictive values are more appropriate and informative in actual screening contexts, but that sensitivity and specificity can be used for screening decisions about individual people if they are extremely high; that predictive values need not always be high and might be used to advantage by adjusting the sensitivity and specificity of screening tests; that, in screening contexts, researchers should provide information about all four metrics and how they were derived; and that, where necessary, consumers of health research should have the skills to interpret those metrics effectively for maximum benefit to clients and the healthcare system.
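A short sketch of these four metrics, and of why predictive values (unlike sensitivity and specificity) shift with prevalence, is given below; the counts and the prevalence values are made up for illustration.

    # The four screening metrics from a 2x2 table, plus predictive values recomputed for
    # different prevalences at fixed sensitivity and specificity. All counts are made up.
    def screening_metrics(tp, fp, fn, tn):
        return {
            "sensitivity": tp / (tp + fn),   # positives detected among the diseased
            "specificity": tn / (tn + fp),   # negatives among the healthy
            "ppv": tp / (tp + fp),           # diseased among the test-positives
            "npv": tn / (tn + fn),           # healthy among the test-negatives
        }

    def predictive_values(sensitivity, specificity, prevalence):
        ppv = (sensitivity * prevalence) / (
            sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
        npv = (specificity * (1 - prevalence)) / (
            specificity * (1 - prevalence) + (1 - sensitivity) * prevalence)
        return ppv, npv

    print(screening_metrics(tp=90, fp=50, fn=10, tn=850))
    for prevalence in (0.01, 0.10, 0.30):
        print(prevalence, predictive_values(0.90, 0.94, prevalence))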
A NEW GUI FOR GLOBAL ORBIT CORRECTION AT THE ALS USING MATLAB
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pachikara, J.; Portmann, G.
2007-01-01
Orbit correction is a vital procedure at particle accelerators around the world. The orbit correction routine currently used at the Advanced Light Source (ALS) is somewhat cumbersome, so a new Graphical User Interface (GUI) has been developed using MATLAB. The correction algorithm uses a singular value decomposition method for calculating the required corrector magnet changes for correcting the orbit. The application has been successfully tested at the ALS. The GUI display provided important information regarding the orbit, including the orbit errors before and after correction, the amount of corrector magnet strength change, and the standard deviation of the orbit error with respect to the number of singular values used. The use of more singular values resulted in better correction of the orbit error but at the expense of enormous corrector magnet strength changes. The results showed an inverse relationship between the peak-to-peak values of the orbit error and the number of singular values used. The GUI helps the ALS physicists and operators understand the specific behavior of the orbit. The application is convenient to use and is a substantial improvement over the previous orbit correction routine in terms of user friendliness and compactness.
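The trade-off described above (more singular values give a smaller residual orbit error but larger corrector strength changes) can be sketched with a truncated SVD pseudoinverse; the response matrix and orbit vector below are random stand-ins, not ALS data.

    import numpy as np

    # Truncated-SVD orbit correction sketch: solve R @ dtheta ~ -orbit with a pseudoinverse
    # that keeps only the first n_sv singular values. More singular values reduce the residual
    # orbit error but demand larger corrector changes. R and the orbit are random stand-ins.
    rng = np.random.default_rng(0)
    R = rng.normal(size=(40, 20))      # BPM-by-corrector orbit response matrix (made up)
    orbit = rng.normal(size=40)        # measured orbit error at the BPMs (made up)

    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    for n_sv in (5, 10, 20):
        s_inv = np.zeros_like(s)
        s_inv[:n_sv] = 1.0 / s[:n_sv]
        dtheta = -(Vt.T * s_inv) @ (U.T @ orbit)     # corrector magnet strength changes
        residual = orbit + R @ dtheta                # predicted orbit after correction
        print(n_sv, round(np.ptp(residual), 3), round(np.abs(dtheta).max(), 3))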
Bonin, Katija; McGuffin, Merrylee; Presutti, Roseanna; Harth, Tamara; Mesci, Aruz; Feldman-Stewart, Deb; Chow, Edward; Di Prospero, Lisa; Vesprini, Danny; Rakovitch, Eileen; Lee, Justin; Paszat, Lawrence; Doherty, Mary; Soliman, Hany; Ackerman, Ida; Cao, Xingshan; Kiss, Alex; Szumacher, Ewa
2018-02-01
This study was conducted to elucidate the preference of patients with early breast cancer for standard whole breast irradiation (WBI) or partial breast irradiation (PBI) following lumpectomy, as well as to identify the factors important to patients when making their treatment decisions. Based on relevant literature and ASTRO consensus statement guidelines, an educational tool and questionnaire were developed. Consenting, eligible women reviewed the educational tool and completed the trade-off questionnaire. Descriptive statistics were calculated, and chi-square tests and a logistic regression model were performed. Of the 90 patients who completed the study, 62% preferred WBI, 30% preferred PBI, 4% required more information, and 3% had no preference. Of the patients who chose WBI, 58% preferred hypofractionated RT, whereas 25% preferred the conventional RT regimen. The majority of patients rated recurrence rate [WBI = 55/55 (100%), PBI = 26/26 (100%)] and survival [WBI = 54/55 (98%), PBI = 26/26 (100%)] as important factors contributing to their choice of treatment preference. Financial factors [WBI = 21/55 (38%), PBI = 14/26 (53%)] and convenience [WBI = 36/54 (67%), PBI = 18/26 (69%)] were rated as important less frequently. Significantly more patients who preferred WBI rated standard method of treatment as important compared with patients who preferred PBI [WBI = 52/54 (96%), PBI = 16/26 (61%), χ² = 16.63, p = 0.001]. The majority of patients with early breast cancer surveyed for this study preferred WBI as an adjuvant treatment post lumpectomy, yet a sizeable minority preferred PBI. This preference was associated with the importance patients place on standard treatment. These results will help medical professionals treat patients according to patient values.
Valuation of medical resource units collected in health economic studies.
Copley-Merriman, C; Lair, T J
1994-01-01
This paper reviews the issues that are critical for the valuation of medical resources in the context of health economic studies. There are several points to consider when undertaking the valuation of medical resources. The perspective of the analysis should be established before determining the valuation process. Future costs should be discounted to present values, and time and effort spent in assigning a monetary value to a medical resource should be proportional to its importance in the analysis. Prices vary considerably based on location of the service and the severity of the illness episode. Because of the wide variability in pricing data, sensitivity analysis is an important component of validation of study results. A variety of data sources have been applied to the valuation of medical resources. Several types of data are reviewed in this paper, including claims data, national survey data, administrative data, and marketing research data. Valuation of medical resources collected in clinical trials is complex because of the lack of standardization of the data sources. A national pricing data source for health economic valuation would greatly facilitate study analysis and make comparisons between results more meaningful.
Neville, David C A; Alonzi, Dominic S; Butters, Terry D
2012-04-13
Hydrophilic interaction liquid chromatography (HILIC) of fluorescently labelled oligosaccharides is used in many laboratories to analyse complex oligosaccharide mixtures. Separations are routinely performed using a TSK gel-Amide 80 HPLC column, and retention times of different oligosaccharide species are converted to glucose unit (GU) values that are determined with reference to an external standard. However, if retention times were to be compared with an internal standard, consistent and more accurate GU values would be obtained. We present a method to perform internal standard-calibrated HILIC of fluorescently labelled oligosaccharides. The method relies on co-injection of 4-aminobenzoic acid ethyl ester (4-ABEE)-labelled internal standard and detection by UV absorption, with 2-AA (2-aminobenzoic acid)-labelled oligosaccharides. 4-ABEE is a UV chromophore and a fluorophore, but there is no overlap of the fluorescent spectrum of 4-ABEE with the commonly used fluorescent reagents. The dual nature of 4-ABEE allows for accurate calculation of the delay between UV and fluorescent signals when determining the GU values of individual oligosaccharides. The GU values obtained are inherently more accurate as slight differences in gradients that can influence retention are negated by use of an internal standard. Therefore, this paper provides the first method for determination of HPLC-derived GU values of fluorescently labelled oligosaccharides using an internal calibrant. Copyright © 2012 Elsevier B.V. All rights reserved.
Clarifying atomic weights: A 2016 four-figure table of standard and conventional atomic weights
Coplen, Tyler B.; Meyers, Fabienne; Holden, Norman E.
2017-01-01
To indicate that atomic weights of many elements are not constants of nature, in 2009 and 2011 the Commission on Isotopic Abundances and Atomic Weights (CIAAW) of the International Union of Pure and Applied Chemistry (IUPAC) replaced single-value standard atomic weight values with atomic weight intervals for 12 elements (hydrogen, lithium, boron, carbon, nitrogen, oxygen, magnesium, silicon, sulfur, chlorine, bromine, and thallium); for example, the standard atomic weight of nitrogen became the interval [14.00643, 14.00728]. CIAAW recognized that some users of atomic weight data only need representative values for these 12 elements, such as for trade and commerce. For this purpose, CIAAW provided conventional atomic weight values, such as 14.007 for nitrogen, and these values can serve in education when a single representative value is needed, such as for molecular weight calculations. Because atomic weight values abridged to four figures are preferred by many educational users and are no longer provided by CIAAW as of 2015, we provide a table containing both standard atomic weight values and conventional atomic weight values abridged to four figures for the chemical elements. A retrospective review of changes in four-digit atomic weights since 1961 indicates that changes in these values are due to more accurate measurements over time or to the recognition of the impact of natural isotopic fractionation in normal terrestrial materials upon atomic weight values of many elements. Use of the unit “u” (unified atomic mass unit on the carbon mass scale) with atomic weight is incorrect because the quantity atomic weight is dimensionless, and the unit “amu” (atomic mass unit on the oxygen scale) is an obsolete term: Both should be avoided.
Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C
2018-01-01
This paper reports on the results of a multisite collaborative project launched by the MRI subgroup of the Quantitative Imaging Network to assess current capability and provide future guidelines for generating standard parametric diffusion map Digital Imaging and Communications in Medicine (DICOM) objects in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and with the true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata that describe the sources of the detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, and ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
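For orientation, the simplest per-voxel ADC estimate from two b-values is sketched below (a mono-exponential model; the paper compares two models, and the site-specific implementations may differ). The images and b-values are synthetic.

    import numpy as np

    # Simplest per-voxel ADC estimate from two b-values (mono-exponential model):
    # ADC = ln(S_b0 / S_b1) / (b1 - b0). Signals and b-values below are synthetic.
    b0, b1 = 0.0, 1000.0                  # s/mm^2
    true_adc = 1.1e-3                     # mm^2/s, chosen for illustration
    S_b0 = np.full((4, 4), 500.0)         # synthetic b=0 image
    S_b1 = S_b0 * np.exp(-b1 * true_adc)  # synthetic diffusion-weighted image

    adc_map = np.log(S_b0 / S_b1) / (b1 - b0)   # parametric map in mm^2/s
    print(adc_map[0, 0])                        # ~1.1e-3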
Professional Values Among Female Nursing Students in Saudi Arabia.
Allari, Rabia S; Ismaile, Samantha; Househ, Mowafa
2017-01-01
Professional values are essential to nursing practice because they guide standards for working, provide a structure for evaluating behavior, and influence decision making. The purpose of this study is to explore the perception of Saudi female nursing students on professional values and to assess the correlation between their perception of professional values and their year of academic study. We used a cross-sectional descriptive design in which a survey was administered to 150 Saudi female nurses living in Riyadh. Results show that Saudi female nurses have a high perception of professional values relating to confidentiality, privacy, moral and legal rights, health and safety, and the work environment, whereas they have a low perception of the value of participating in professional nursing activities, utilizing research in practice, peer review, public policy, and engaging in ongoing self-evaluation. There was a positive correlation between different professional values and academic years. The highest correlations were for the items related to caring and trust more than activism, because nursing students at higher academic levels viewed the relationship with patients as more important than advancing health care systems through public policy, research, and professional organizations. In conclusion, nursing program administrators should emphasize the development of professional values through a role-modeling approach and promote activism and professional values through the arrangement of meetings, exchange forums, and conferences with other nurses, managers, policy makers, innovators, and researchers within the nursing field.
Vaginismus and dyspareunia: relationship with general and sex-related moral standards.
Borg, Charmaine; de Jong, Peter J; Weijmar Schultz, Willibrord
2011-01-01
Relatively strong adherence to conservative values and/or relatively strict sex-related moral standards logically restricts the sexual repertoire and will lower the threshold for experiencing negative emotions in a sexual context. In turn, this may generate withdrawal and avoidance behavior, which is at the nucleus of vaginismus. The aim was to examine whether strong adherence to conservative morals and/or strict sexual standards may indeed be involved in vaginismus. The Schwartz Value Survey (SVS) was used to investigate the individual's value pattern, and the Sexual Disgust Questionnaire (SDQ) to index the willingness to perform certain sexual activities as an indirect measure of sex-related moral standards. The SVS and SDQ were completed by three groups: women diagnosed with vaginismus (N=24), a group of women diagnosed with dyspareunia (N=24), and a healthy control group of women without sexual complaints (N=32). Specifically, the vaginismus group showed relatively low scores on liberal values together with comparatively high scores on conservative values. Additionally, the vaginismus group was more restricted in its readiness to perform particular sex-related behaviors than the control group. The dyspareunia group, on both the SVS and the SDQ, scored between the vaginismus and control groups, but did not differ significantly from either group. The findings are consistent with the view that low liberal and high conservative values, along with restricted sexual standards, are involved in the development/maintenance of vaginismus. © 2010 International Society for Sexual Medicine.
Scialla, Michele A; Canter, Kimberly S; Chen, Fang Fang; Kolb, E Anders; Sandler, Eric; Wiener, Lori; Kazak, Anne E
2018-03-01
With published evidence-based Standards for Psychosocial Care for Children with Cancer and their Families, it is important to know the current status of their implementation. This paper presents data on delivery of psychosocial care related to the Standards in the United States. Pediatric oncologists, psychosocial leaders, and administrators in pediatric oncology from 144 programs completed an online survey. Participants reported on the extent to which psychosocial care consistent with the Standards was implemented and was comprehensive and state of the art. They also reported on specific practices and services for each Standard and the extent to which psychosocial care was integrated into broader medical care. Participants indicated that psychosocial care consistent with the Standards was usually or always provided at their center for most of the Standards. However, only half of the oncologists (55.6%) and psychosocial leaders (45.6%) agreed or strongly agreed that their psychosocial care was comprehensive and state of the art. Types of psychosocial care provided included evidence-based and less established approaches but were most often provided when problems were identified, rather than proactively. The perception of state of the art care was associated with practices indicative of integrated psychosocial care and the extent to which the Standards are currently implemented. Many oncologists and psychosocial leaders perceive that the delivery of psychosocial care at their center is consistent with the Standards. However, care is quite variable, with evidence for the value of more integrated models of psychosocial services. © 2017 Wiley Periodicals, Inc.
Lee, Joonkoo; Gereffi, Gary; Beauvais, Janet
2012-01-01
The rise of private food standards has brought forth an ongoing debate about whether they work as a barrier for smallholders and hinder poverty reduction in developing countries. This paper uses a global value chain approach to explain the relationship between value chain structure and agrifood safety and quality standards and to discuss the challenges and possibilities this entails for the upgrading of smallholders. It maps four potential value chain scenarios depending on the degree of concentration in the markets for agrifood supply (farmers and manufacturers) and demand (supermarkets and other food retailers) and discusses the impact of lead firms and key intermediaries on smallholders in different chain situations. Each scenario is illustrated with case examples. Theoretical and policy issues are discussed, along with proposals for future research in terms of industry structure, private governance, and sustainable value chains. PMID:21149723
Koenig, Bruce E; Lacey, Douglas S
2014-07-01
In this research project, nine small digital audio recorders were tested using five sets of 30-min recordings at all available recording modes, with consistent audio material, identical source and microphone locations, and identical acoustic environments. The averaged direct current (DC) offset values and standard deviations were measured for 30-sec and 1-, 2-, 3-, 6-, 10-, 15-, and 30-min segments. The research found an inverse association between segment lengths and the standard deviation values and that lengths beyond 30 min may not meaningfully reduce the standard deviation values. This research supports previous studies indicating that measured averaged DC offsets should only be used for exclusionary purposes in authenticity analyses and exhibit consistent values when the general acoustic environment and microphone/recorder configurations were held constant. Measured average DC offset values from exemplar recorders may not be directly comparable to those of submitted digital audio recordings without exactly duplicating the acoustic environment and microphone/recorder configurations. © 2014 American Academy of Forensic Sciences.
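The measurement described above can be sketched as computing per-segment mean sample values (the averaged DC offset) and their standard deviation across segments of increasing length; the waveform here is synthetic (a small constant offset plus noise), not an exemplar recording.

    import numpy as np

    # Averaged DC offset and its standard deviation over non-overlapping segments of a
    # digital audio signal. The waveform is synthetic (constant offset plus noise).
    fs = 8000                                            # sample rate in Hz (arbitrary)
    x = 0.002 + 0.01 * np.random.default_rng(1).normal(size=fs * 1800)   # 30 min of samples

    def segment_dc_stats(signal, fs, segment_seconds):
        n = fs * segment_seconds
        segments = signal[: len(signal) // n * n].reshape(-1, n)
        means = segments.mean(axis=1)                    # per-segment averaged DC offset
        return means.mean(), means.std(ddof=1)           # overall mean and standard deviation

    for seconds in (30, 60, 180, 600):
        print(seconds, segment_dc_stats(x, fs, seconds))

Longer segments yield smaller standard deviations of the per-segment means, which mirrors the inverse association reported above.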
NASA Astrophysics Data System (ADS)
Lamberty, Andrée; Franks, Katrin; Braun, Adelina; Kestens, Vikram; Roebben, Gert; Linsinger, Thomas P. J.
2011-12-01
The Institute for Reference Materials and Measurements has organised an interlaboratory comparison (ILC) to allow the participating laboratories to demonstrate their proficiency in particle size and zeta potential measurements on monomodal aqueous suspensions of silica nanoparticles in the 10-100 nm size range. The main goal of this ILC was to identify competent collaborators for the production of certified nanoparticle reference materials. 38 laboratories from four different continents participated in the ILC with different methods for particle sizing and determination of zeta potential. Most of the laboratories submitted particle size results obtained with centrifugal liquid sedimentation (CLS), dynamic light scattering (DLS) or electron microscopy (EM), or zeta potential values obtained via electrophoretic light scattering (ELS). The results of the laboratories were evaluated using method-specific z scores, calculated on the basis of consensus values from the ILC. For CLS (13 results) and EM (13 results), all reported values were within the ±2 | z| interval. For DLS, 25 of the 27 results reported were within the ±2 | z| interval, the two other results were within the ±3 | z| interval. The standard deviations of the corresponding laboratory mean values varied between 3.7 and 6.5%, which demonstrates satisfactory interlaboratory comparability of CLS, DLS and EM particle size values. From the received test reports, a large discrepancy was observed in terms of the laboratory's quality assurance systems, which are equally important for the selection of collaborators in reference material certification projects. Only a minority of the participating laboratories is aware of all the items that are mandatory in test reports compliant to ISO/IEC 17025 (ISO General requirements for the competence of testing and calibration laboratories. International Organisation for Standardization, Geneva, 2005b). The absence of measurement uncertainty values in the reports, for example, hindered the calculation of zeta scores.
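The method-specific z score used to rate each laboratory is z = (x_lab − x_consensus)/σ; the sketch below applies the conventional proficiency-testing interpretation (|z| ≤ 2 satisfactory, 2 < |z| ≤ 3 questionable), with placeholder consensus and σ values rather than the ILC's.

    # Method-specific z score for one laboratory: z = (x_lab - x_consensus) / sigma.
    # Consensus value and sigma below are placeholders, not the ILC's values; the
    # |z| <= 2 / <= 3 interpretation follows conventional proficiency-testing practice.
    consensus_size_nm = 42.0
    sigma_nm = 2.0

    def z_score(lab_mean_nm):
        return (lab_mean_nm - consensus_size_nm) / sigma_nm

    for lab_value in (41.2, 45.1, 48.9):
        z = z_score(lab_value)
        flag = "satisfactory" if abs(z) <= 2 else ("questionable" if abs(z) <= 3 else "unsatisfactory")
        print(f"{lab_value} nm -> z = {z:+.1f} ({flag})")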
Determination of graphene's edge energy using hexagonal graphene quantum dots and PM7 method.
Vorontsov, Alexander V; Tretyakov, Evgeny V
2018-05-18
Graphene quantum dots (GQDs) are important for a variety of applications and designs, and the shapes of GQDs depend on the energy of their boundaries. To date, many methods have been developed for the preparation of GQDs with the required boundaries, shapes and edge terminations. However, research on the properties of GQDs and their applications is limited by the unavailability of these compounds in pure form. In the present computational study, the standard enthalpy of formation, the standard enthalpy of formation of edges and the standard enthalpy of hydrogenation are studied for hexagonal GQDs with purely zigzag and armchair edges, in non-passivated and H-passivated forms, using the semiempirical quantum chemistry method PM7. The standard enthalpy of formation of the edge is found to remain constant for the GQDs studied, in the 1 to 6 nm size range, and the enthalpies of edge C atoms are 32.4 and 35.5 kcal mol-1 for armchair and zigzag edges, respectively. In contrast to some literature data, the standard enthalpy of formation of hydrogenated edges is far from zero, with values of 7.3 and 8.0 kcal mol-1 per edge C atom for armchair and zigzag edges, respectively. The standard enthalpy of hydrogenation is found to be -10.2 and -9.72 eV nm-1 for the armchair and zigzag edges, respectively.
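One hedged way to read the edge-enthalpy bookkeeping above (our notation, not necessarily the authors' exact scheme) is to partition the computed heat of formation into an interior per-atom term and an edge per-atom term, and to solve for the edge term across dots of increasing size:

```latex
% Sketch of the partition implied above (notation ours): \Delta h_{int} and
% \Delta h_{edge} are per-atom enthalpy contributions of interior and edge carbons.
\begin{align}
  \Delta H_f^{\circ}(\mathrm{GQD}) &\approx N_{\mathrm{int}}\,\Delta h_{\mathrm{int}}
      + N_{\mathrm{edge}}\,\Delta h_{\mathrm{edge}}, \\
  \Delta h_{\mathrm{edge}} &\approx
      \frac{\Delta H_f^{\circ}(\mathrm{GQD}) - N_{\mathrm{int}}\,\Delta h_{\mathrm{int}}}{N_{\mathrm{edge}}}.
\end{align}
```

The reported near-constancy of the per-edge-atom values (32.4 and 35.5 kcal mol-1) across 1-6 nm dots is what justifies treating the edge term as a size-independent edge energy.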
Gallardo-Moreno, Amparo M; Vadillo-Rodríguez, Virginia; Perera-Núñez, Julia; Bruque, José M; González-Martín, M Luisa
2012-07-21
The electrical characterization of surfaces in terms of the zeta potential (ζ), i.e., the electric potential contributing to the interaction potential energy, is of major importance in a wide variety of industrial, environmental and biomedical applications in which the integration of any material with the surrounding media is initially mediated by the physico-chemical properties of its outer surface layer. Among the existing electrokinetic techniques for obtaining ζ, streaming potential (V(str)) and streaming current (I(str)) measurements are important when dealing with flat-extended samples. Mostly dielectric materials have been subjected to this type of analysis, and only a few papers in the literature address the electrokinetic characterization of conducting materials. Nevertheless, a standardized procedure is typically followed to calculate ζ from the measured data and, importantly, this paper shows that such a procedure leads to incorrect zeta potential values when conductors are investigated. In any case, assessment of a reliable numerical value of ζ requires careful consideration of the origin of the input data and the characteristics of the experimental setup. In particular, it is shown that using the cell resistance (R) typically obtained through a.c. signals (R(a.c.)), which is needed for the calculation of ζ, always leads to an underestimation of the zeta potential values obtained from streaming potential measurements. The use of R(EK), derived from the V(str)/I(str) ratio, leads to reliable values of ζ when dielectrics are investigated. For metals, the contribution of the sample's conductivity to the cell resistance causes an underestimation of R(EK), which leads to unrealistic values of ζ. For the electrical characterization of conducting samples, I(str) measurements therefore constitute a better choice. In general, the findings gathered in this manuscript establish a measurement protocol for obtaining reliable zeta potentials of dielectrics and conductors based on the intrinsic electrokinetic behavior of both types of samples.
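A hedged sketch of the standard Helmholtz-Smoluchowski evaluation discussed above; the channel geometry, pressure slopes and resistances are hypothetical numbers chosen only to illustrate how the choice of cell resistance shifts the streaming-potential result relative to the streaming-current one.

```python
# Helmholtz-Smoluchowski estimates of zeta from streaming current and streaming
# potential. All numbers are hypothetical and only illustrate the relationship
# described in the abstract; they are not the authors' data.
ETA = 1.0e-3                      # viscosity of water, Pa*s
EPS_R, EPS_0 = 78.5, 8.854e-12    # relative permittivity, vacuum permittivity (F/m)

def zeta_from_streaming_current(dI_dp, length, area):
    """zeta [V] from the streaming-current slope dI/dp [A/Pa] and channel geometry."""
    return dI_dp * ETA * length / (EPS_R * EPS_0 * area)

def zeta_from_streaming_potential(dV_dp, length, area, cell_resistance):
    """zeta [V] from the streaming-potential slope dV/dp [V/Pa]; needs the cell resistance."""
    return dV_dp * ETA * length / (EPS_R * EPS_0 * area * cell_resistance)

# Hypothetical flat channel and measured slopes:
L, A = 20e-3, 10e-3 * 0.1e-3          # channel length (m) and cross-section (m^2)
dI_dp, dV_dp = 2.0e-12, 8.0e-8        # A/Pa and V/Pa
R_ek, R_ac = 4.0e4, 7.5e4             # ohm: V(str)/I(str) ratio vs. a.c.-derived resistance

print(zeta_from_streaming_current(dI_dp, L, A))           # ~58 mV
print(zeta_from_streaming_potential(dV_dp, L, A, R_ek))   # ~58 mV, consistent with I(str)
print(zeta_from_streaming_potential(dV_dp, L, A, R_ac))   # ~31 mV, the underestimate noted above
```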
McBride, Murray B.; Shayler, Hannah A.; Spliethoff, Henry M.; Mitchell, Rebecca G.; Marquez-Bravo, Lydia G.; Ferenz, Gretchen S.; Russell-Anelli, Jonathan M.; Casey, Linda; Bachman, Sharon
2014-01-01
Paired vegetable/soil samples from New York City and Buffalo, NY, gardens were analyzed for lead (Pb), cadmium (Cd) and barium (Ba). Vegetable aluminum (Al) was measured to assess soil adherence. Soil and vegetable metal concentrations did not correlate, and vegetable concentrations varied by crop type. Pb was below health-based guidance values (EU standards) in virtually all fruits, while 47% of root crops and 9% of leafy greens exceeded guidance values; over half the vegetables exceeded the 95th percentile of market-basket concentrations for Pb. Vegetable Pb correlated with Al, indicating that soil particle adherence/incorporation was more important than Pb uptake via roots. Cd was similar to market-basket concentrations and below guidance values in nearly all samples. Vegetable Ba was much higher than Pb or Cd, although soil Ba was lower than soil Pb. The poor relationship between vegetable and soil metal concentrations is attributable to particulate contamination of vegetables and to soil characteristics that influence phytoavailability. PMID:25163429
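As a purely illustrative aside (hypothetical numbers, not the study's data), the kind of paired-sample check described above can be sketched as follows:

```python
# Correlate paired soil and vegetable Pb concentrations and flag samples above a
# guidance value. Values and the guidance limit below are hypothetical.
from scipy.stats import pearsonr

soil_pb = [150, 480, 920, 260, 1300, 75]         # mg/kg soil, hypothetical
veg_pb = [0.04, 0.11, 0.06, 0.30, 0.09, 0.02]    # mg/kg fresh weight, hypothetical
guidance_value = 0.10                            # hypothetical guidance limit for the crop class

r, p = pearsonr(soil_pb, veg_pb)
print(f"soil-vegetable Pb correlation: r = {r:.2f} (p = {p:.2f})")
print("samples exceeding guidance:", [i for i, v in enumerate(veg_pb) if v > guidance_value])
```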
NASA Astrophysics Data System (ADS)
Samadi; Wajizah, S.; Munawar, A. A.
2018-02-01
Feed is an important factor in animal production. The purpose of this study is to apply the near-infrared spectroscopy (NIRS) method to determine feed nutritive values. NIRS spectra were acquired for feed samples in the wavelength range of 1000-2500 nm, with 32 scans per spectrum and a 0.2 nm wavelength interval. Spectral data were corrected by the de-trending (DT) and standard normal variate (SNV) methods. Prediction models for in vitro dry matter digestibility (IVDMD) and in vitro organic matter digestibility (IVOMD) were established using principal component regression (PCR) and validated using leave-one-out cross-validation (LOOCV). Prediction performance was quantified using the correlation coefficient (r) and the residual predictive deviation (RPD) index. The results showed that IVDMD and IVOMD can be predicted from SNV-corrected spectra with r = 0.93 and RPD = 2.78 for IVDMD, and r = 0.90 and RPD = 2.35 for IVOMD. In conclusion, the NIRS technique appears feasible for predicting animal feed nutritive values.
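A minimal sketch of the modelling pipeline described above (SNV correction, principal component regression, leave-one-out cross-validation), using hypothetical data; the number of principal components and the array shapes are assumptions, not values from the study.

```python
# SNV + PCR + LOOCV sketch. X holds one spectrum per row; y holds the reference
# IVDMD values. All data below are randomly generated placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer

def snv(X):
    """Standard normal variate: centre and scale each spectrum individually."""
    X = np.asarray(X, dtype=float)
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

# Principal component regression = PCA scores fed to ordinary least squares.
pcr = make_pipeline(FunctionTransformer(snv), PCA(n_components=10), LinearRegression())

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 700))               # 40 hypothetical spectra, 700 wavelength points
y = rng.normal(loc=60, scale=5, size=40)     # hypothetical IVDMD reference values (%)

y_cv = cross_val_predict(pcr, X, y, cv=LeaveOneOut())
r = np.corrcoef(y, y_cv)[0, 1]
rpd = y.std(ddof=1) / np.sqrt(np.mean((y - y_cv) ** 2))   # RPD = SD of reference / RMSE of CV
print(f"r = {r:.2f}, RPD = {rpd:.2f}")
```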
Color Image Segmentation Based on Statistics of Location and Feature Similarity
NASA Astrophysics Data System (ADS)
Mori, Fumihiko; Yamada, Hiromitsu; Mizuno, Makoto; Sugano, Naotoshi
The process of “image segmentation and extracting remarkable regions” is an important research subject for image understanding; however, algorithms based on global features are rarely found. The requirement for such an image segmentation algorithm is to reduce over-segmentation and over-unification as much as possible. We developed an algorithm that uses the multidimensional convex hull based on density as the global feature. Concretely, we propose a new algorithm in which regions are expanded according to region statistics, such as the mean value, standard deviation, maximum value and minimum value of pixel location, brightness and color elements, and in which these statistics are updated as the regions grow. We also introduce a new concept of conspicuity degree and apply the method to 21 varied images to examine its effectiveness. The remarkable object regions extracted by the presented system coincided closely with those pointed out by the sixty-four subjects who participated in the psychological experiment.
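As a rough illustration of statistics-driven region growing of the kind described above (simplified to colour statistics only, and not the authors' algorithm), a region can absorb neighbouring pixels that lie close to its running mean while its statistics are updated after each accepted pixel.

```python
# Hedged region-growing sketch: a region absorbs 4-connected neighbours whose
# colour lies within k standard deviations of the region's running mean; the
# mean and standard deviation are recomputed as pixels are accepted.
import numpy as np

def grow_region(image, seed, k=2.0, min_sigma=8.0):
    """image: HxWx3 float array; seed: (row, col). Returns a boolean region mask."""
    h, w, _ = image.shape
    mask = np.zeros((h, w), dtype=bool)
    stack = [seed]
    values = [image[seed]]
    mask[seed] = True
    while stack:
        r, c = stack.pop()
        mean = np.mean(values, axis=0)
        sigma = max(float(np.mean(np.std(values, axis=0))), min_sigma)
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if np.linalg.norm(image[nr, nc] - mean) <= k * sigma:
                    mask[nr, nc] = True
                    values.append(image[nr, nc])
                    stack.append((nr, nc))
    return mask

# Hypothetical usage: mask = grow_region(img.astype(float), seed=(120, 200))
```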
Identification and Quantitation of Flavanols and Proanthocyanidins in Foods: How Good are the Datas?
Kelm, Mark A.; Hammerstone, John F.; Schmitz, Harold H.
2005-01-01
Evidence suggesting that dietary polyphenols, flavanols, and proanthocyanidins in particular offer significant cardiovascular health benefits is rapidly increasing. Accordingly, reliable and accurate methods are needed to provide the qualitative and quantitative food composition data necessary for high-quality epidemiological and clinical research. Measurements of flavonoids and proanthocyanidins have employed a range of analytical techniques, with various colorimetric assays still popular for estimating total polyphenolic content in foods and other biological samples despite advances made with more sophisticated analyses. More crudely, estimates of polyphenol content as well as antioxidant activity are also reported as values relating to radical scavenging activity. High-performance liquid chromatography (HPLC) is the method of choice for quantitative analysis of individual polyphenols such as flavanols and proanthocyanidins. At present, qualitative information regarding proanthocyanidin structure is obtained by chemical methods such as thiolysis and by HPLC-mass spectrometry (MS) techniques. The lack of appropriate standards is the single most important factor limiting these analyses. However, with ever-expanding research on flavanols, proanthocyanidins, and health, and the importance of their future inclusion in food composition databases, the need for standards becomes more critical. At present, sufficiently well-characterized standard material is available for select flavanols and proanthocyanidins, and construction of at least a limited food composition database is feasible. PMID:15712597
Variations in Scientific Data Production: What Can We Learn from #Overlyhonestmethods?
Bezuidenhout, Louise
2015-12-01
In recent months the hashtag #overlyhonestmethods has steadily been gaining popularity. Posts under this hashtag--presumably by scientists--detail aspects of daily scientific research that differ considerably from the idealized interpretation of scientific experimentation as standardized, objective and reproducible. Over and above its entertainment value, the popularity of this hashtag raises two important points for those who study both science and scientists. Firstly, the posts highlight that the generation of data through experimentation is often far less standardized than is commonly assumed. Secondly, the popularity of the hashtag, together with its relatively blasé reception by the scientific community, reveals that the actions reported in the tweets are far from shocking and may indeed be considered just "part of scientific research". Such observations give considerable pause for thought and suggest that current conceptions of data might be limited by failing to recognize this "inherent variability" within the actions of generation--and thus within data themselves. Is it possible, we must ask, that epistemic virtues such as standardization, consistency, reportability and reproducibility need to be reevaluated? Such considerations are, of course, of particular importance to data-sharing discussions and the Open Data movement. This paper suggests that the notion of a "moral professionalism" for data generation and sharing needs to be considered in more detail if the inherent variability of data is to be addressed in any meaningful manner.