Current Issues in the Design and Information Content of Instrument Approach Charts
DOT National Transportation Integrated Search
1995-03-01
This report documents an analysis and interview effort conducted to identify common operational errors made using current Instrument Approach Plates (IAP), Standard Terminal Arrival Route (STAR) charts, Standard Instrument Departure (SID) charts,...
Neutrino oscillation processes in a quantum-field-theoretical approach
NASA Astrophysics Data System (ADS)
Egorov, Vadim O.; Volobuev, Igor P.
2018-05-01
It is shown that neutrino oscillation processes can be consistently described in the framework of quantum field theory using only the plane wave states of the particles. Namely, the oscillating electron survival probabilities in experiments with neutrino detection by charged-current and neutral-current interactions are calculated in the quantum field-theoretical approach to neutrino oscillations based on a modification of the Feynman propagator in the momentum representation. The approach is most similar to the standard Feynman diagram technique. It is found that the oscillating distance-dependent probabilities of detecting an electron in experiments with neutrino detection by charged-current and neutral-current interactions exactly coincide with the corresponding probabilities calculated in the standard approach.
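For context, the standard two-flavor electron-neutrino survival probability that these quantum-field-theoretical calculations are stated to reproduce has the familiar textbook form (not quoted from the paper itself):

```latex
P_{\nu_e \to \nu_e}(L) \;=\; 1 \;-\; \sin^2 2\theta \,\sin^2\!\left(\frac{\Delta m^2 \, L}{4E}\right)
```

where \(\theta\) is the mixing angle, \(\Delta m^2\) the squared-mass splitting, \(L\) the source-detector distance, and \(E\) the neutrino energy.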
We demonstrate an approach for evaluating the level of protection attained using a variety of forms and levels of past, current, and proposed Air Quality Standards (AQSs). The U.S. Clean Air Act requires the establishment of ambient air quality standards to protect health and pub...
A Standards-Based Approach for Reporting Assessment Results in South Africa
ERIC Educational Resources Information Center
Kanjee, Anil; Moloi, Qetelo
2016-01-01
This article proposes the use of a standards-based approach to reporting results from large-scale assessment surveys in South Africa. The use of this approach is intended to address the key shortcomings observed in the current reporting framework prescribed in the national curriculum documents. Using the Angoff method and data from the Annual…
The LOM Approach--A CALL for Concern?
ERIC Educational Resources Information Center
Armitage, Nicholas; Bowerman, Chris
2005-01-01
The LOM (Learning Object Model) approach to courseware design seems to be driven by a desire to increase access to education as well as use technology to enable a higher staff-student ratio than is currently possible. The LOM standard involves the use of standard metadata descriptions of content and adaptive content engines to deliver the…
Interpreting the Right to an Education as a Norm Referenced Adequacy Standard
ERIC Educational Resources Information Center
Pijanowski, John
2016-01-01
Our current conceptions of educational adequacy emerged out of an era dominated by equity-based school resource litigation. During that time of transitioning between successful litigation strategies, legal opinions provided clues as to how future courts might view a norm-referenced approach to establishing an adequacy standard--an approach that…
NASA Technical Reports Server (NTRS)
Sun, Yushi; Sun, Changhong; Zhu, Harry; Wincheski, Buzz
2006-01-01
Stress corrosion cracking in the relief radius area of a space shuttle primary reaction control thruster is an issue of concern. The current approach for monitoring of potential crack growth is nondestructive inspection (NDI) of the remaining thickness (RT) to the acoustic cavities using an eddy current or remote field eddy current probe. EDM manufacturers have difficulty in providing accurate RT calibration standards. Significant error in the RT values of NDI calibration standards could lead to a mistaken judgment of the cracking condition of a thruster under inspection. A tool based on the eddy current principle has been developed to measure the RT at each acoustic cavity of a calibration standard in order to validate that the standard meets the sample design criteria.
Keeping up with Our Students: The Evolution of Technology and Standards in Art Education
ERIC Educational Resources Information Center
Patton, Ryan M.; Buffington, Melanie L.
2016-01-01
This article addresses the standards of technology in the visual arts, arguing the standards function as de facto policy, the guidelines that shape what teachers teach. In this study, we investigate how art education standards approach technology as a teaching tool and artmaking medium, analyzing the current National Visual Arts Standards, the…
A Thresholds Concepts Approach to the Standards Revision
ERIC Educational Resources Information Center
Hofer, Amy R.; Brunetti, Korey; Townsend, Lori
2013-01-01
Thirteen years after being adopted, the Association of College and Research Libraries' (ACRL's) "Information Literacy Competency Standards for Higher Education" are due for a retrofit. The current "Standards" do not account for the post-Google information landscape in which a blizzard of emerging technologies and unprecedented…
Indicators of standards of the quality of the visitor experience at a heavily-used national park
Robert E. Manning; David W. Lime; Richard F. McMonagle
1995-01-01
Contemporary approaches to determining and managing carrying capacity of national parks and similar areas focus on indicators and standards of quality. The National Park Service is currently developing the Visitor Experience and Resource Protection process which adopts this approach to carrying capacity. This process is being applied to Arches National Park, Utah,...
A Vertical Approach to Math Instruction
ERIC Educational Resources Information Center
Gojak, Linda
2012-01-01
In the current era of mathematics standards, whether they are Common Core State Standards or other state standards, effective vertical mathematics teams offer an opportunity for teachers to grow professionally through shared experiences, for leadership to grow among the faculty, and for the school to change its perspective on the teaching and…
Standardization of Assays That Detect Anti-Rubella Virus IgG Antibodies
Grangeot-Keros, Liliane; Vauloup-Fellous, Christelle
2015-01-01
SUMMARY Rubella virus usually causes a mild infection in humans but can cause congenital rubella syndrome (CRS). Vaccination programs have significantly decreased primary rubella virus infection and CRS; however, vaccinated individuals usually have lower levels of rubella virus IgG than those with natural infections. Rubella virus IgG is quantified with enzyme immunoassays that have been calibrated against the World Health Organization (WHO) international standard and report results in international units per milliliter. It is recognized that the results reported by these assays are not standardized. This investigation into the reasons for the lack of standardization found that the current WHO international standard (RUB-1-94) fails to meet three key metrological principles. The standard is not a pure analyte but is composed of pooled human immunoglobulin. It was not calibrated by certified reference methods; rather, superseded tests were used. Finally, no measurement uncertainty estimations have been provided. There is an analytical and clinical consequence to the lack of standardization of rubella virus IgG assays, which leads to misinterpretation of results. The current approach to standardization of rubella virus IgG assays has not achieved the desired results. A new approach is required. PMID:26607813
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-20
... every $100 of current generally applicable leverage exposure based on a group of advanced approaches... approaches adopted by the agencies in July, 2013 (2013 revised capital approaches), the agencies established... organizations subject to the advanced approaches risk-based capital rules. In this notice of proposed rulemaking...
ERIC Educational Resources Information Center
Sugahara, Satoshi; Wilson, Rachel
2013-01-01
The development and implementation of the International Education Standards (IES) for professional accountants is currently an important issue in accounting education and for educators interested in a shift toward international education standards more broadly. The purpose of this study is to investigate professional and research discourse…
Data standards for clinical research data collection forms: current status and challenges.
Richesson, Rachel L; Nadkarni, Prakash
2011-05-01
Case report forms (CRFs) are used for structured-data collection in clinical research studies. Existing CRF-related standards encompass structural features of forms and data items, content standards, and specifications for using terminologies. This paper reviews existing standards and discusses their current limitations. Because clinical research is highly protocol-specific, forms-development processes are more easily standardized than is CRF content. Tools that support retrieval and reuse of existing items will enable standards adoption in clinical research applications. Such tools will depend upon formal relationships between items and terminological standards. Future standards adoption will depend upon standardized approaches for bridging generic structural standards and domain-specific content standards. Clinical research informatics can help define tools requirements in terms of workflow support for research activities, reconcile the perspectives of varied clinical research stakeholders, and coordinate standards efforts toward interoperability across healthcare and research data collection.
Towards a Framework for Developing Semantic Relatedness Reference Standards
Pakhomov, Serguei V.S.; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B.; Ruggieri, Alexander; Chute, Christopher G.
2010-01-01
Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the “moderate” range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. PMID:21044697
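A minimal sketch of the kind of rater analysis this abstract describes (clustering raters to expose systematic differences and flag outliers), assuming a ratings matrix of 101 term pairs by 13 raters. The data are synthetic and the method generic; this is not the authors' released R code:

```python
import numpy as np
from scipy.stats import spearmanr
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical ratings: 101 term pairs (rows) x 13 raters (columns).
rng = np.random.default_rng(0)
ratings = rng.integers(1, 11, size=(101, 13)).astype(float)

# Rater-by-rater Spearman correlations (columns are the variables).
corr, _ = spearmanr(ratings)

# Convert to a correlation distance and cluster with average linkage.
dist = squareform(1.0 - corr, checks=False)
tree = linkage(dist, method="average")
clusters = fcluster(tree, t=3, criterion="maxclust")
print("rater cluster assignments:", clusters)

# Raters whose mean correlation with the others is unusually low are
# candidates for the "groups of potential outliers" the abstract mentions.
mean_corr = (corr.sum(axis=0) - 1.0) / (corr.shape[0] - 1)
print("possible outliers:", np.where(mean_corr < mean_corr.mean() - 2 * mean_corr.std())[0])
```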
Current approaches to norms research
John L. Heywood
2000-01-01
The dialogue session was a continuation of a debate about norms and the application of normative standards to wilderness management that has taken place throughout the 1990s at national meetings and in the research literature. Researchers who have made significant contributions to the normative approach to wilderness recreation management presented three approaches to...
49 CFR 236.310 - Signal governing approach to home signal.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 4 2012-10-01 2012-10-01 false Signal governing approach to home signal. 236.310... Standards § 236.310 Signal governing approach to home signal. A signal shall be provided on main track to govern the approach with the current of traffic to any home signal except where the home signal is the...
49 CFR 236.310 - Signal governing approach to home signal.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 4 2010-10-01 2010-10-01 false Signal governing approach to home signal. 236.310... Standards § 236.310 Signal governing approach to home signal. A signal shall be provided on main track to govern the approach with the current of traffic to any home signal except where the home signal is the...
49 CFR 236.310 - Signal governing approach to home signal.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 4 2011-10-01 2011-10-01 false Signal governing approach to home signal. 236.310... Standards § 236.310 Signal governing approach to home signal. A signal shall be provided on main track to govern the approach with the current of traffic to any home signal except where the home signal is the...
A Measure of Failure: The Political Origins of Standardized Testing
ERIC Educational Resources Information Center
Garrison, Mark J.
2009-01-01
How did standardized tests become the measure of performance in our public schools? In this compelling work, Mark J. Garrison attempts to answer this question by analyzing the development of standardized testing, from the days of Horace Mann and Alfred Binet to the current scene. Approaching the issue from a sociohistorical perspective, the author…
Ramsingh, Brigit
2014-07-01
Following the Second World War, the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) teamed up to construct an International Codex Alimentarius (or 'food code') which emerged in 1963. The Codex Committee on Food Hygiene (CCFH) was charged with the task of developing microbial hygiene standards, although it found itself embroiled in debate with the WHO over the nature these standards should take. The WHO was increasingly relying upon the input of biometricians and especially the International Commission on Microbial Specifications for Foods (ICMSF) which had developed statistical sampling plans for determining the microbial counts in the final end products. The CCFH, however, was initially more focused on a qualitative approach which looked at the entire food production system and developed codes of practice as well as more descriptive end-product specifications which the WHO argued were 'not scientifically correct'. Drawing upon historical archival material (correspondence and reports) from the WHO and FAO, this article examines this debate over microbial hygiene standards and suggests that there are many lessons from history which could shed light upon current debates and efforts in international food safety management systems and approaches.
Assembling Appliances Standards from a Basket of Functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siderious, Hans-Paul; Meier, Alan
2014-08-11
Rapid innovation in product design challenges the current methodology for setting standards and labels, especially for electronics, software and networking. Major problems include defining the product, measuring its energy consumption, and choosing the appropriate metric and level for the standard. Most governments have tried to solve these problems by defining ever more specific product subcategories, along with their corresponding test methods and metrics. An alternative approach would treat each energy-using product as something that delivers a basket of functions. Then separate standards would be constructed for the individual functions that can be defined, tested, and evaluated. Case studies of thermostats, displays and network equipment are presented to illustrate the problems with the classical approach for setting standards and indicate the merits and drawbacks of the alternative. The functional approach appears best suited to products whose primary purpose is processing information and that have multiple functions.
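A toy sketch of the "basket of functions" arithmetic the abstract proposes: a product's allowance is assembled by summing per-function allowances rather than matching the product to a single category-wide limit. All function names and wattage values below are illustrative assumptions, not values from the study:

```python
# Illustrative per-function allowances (watts); names and values are assumptions.
FUNCTION_ALLOWANCES_W = {
    "display_per_dm2": 0.30,   # scaled by screen area in dm^2
    "network_standby": 2.0,
    "thermostat_control": 1.0,
}

def product_allowance(functions):
    """Sum the allowances of the functions a product delivers,
    each scaled by its quantity (e.g. screen area)."""
    return sum(FUNCTION_ALLOWANCES_W[name] * qty for name, qty in functions.items())

# A networked 30 dm^2 display that also hosts a thermostat function:
print(product_allowance({"display_per_dm2": 30.0,
                         "network_standby": 1.0,
                         "thermostat_control": 1.0}), "W")  # -> 12.0 W
```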
Bodner, Todd E.
2017-01-01
Wilkinson and the Task Force on Statistical Inference (1999) recommended that researchers include information on the practical magnitude of effects (e.g., using standardized effect sizes) to distinguish between the statistical and practical significance of research results. To date, however, researchers have not widely incorporated this recommendation into the interpretation and communication of the conditional effects and differences in conditional effects underlying statistical interactions involving a continuous moderator variable where at least one of the involved variables has an arbitrary metric. This article presents a descriptive approach to investigate two-way statistical interactions involving continuous moderator variables where the conditional effects underlying these interactions are expressed in standardized effect size metrics (i.e., standardized mean differences and semi-partial correlations). This approach permits researchers to evaluate and communicate the practical magnitude of particular conditional effects and differences in conditional effects using conventional and proposed guidelines, respectively, for the standardized effect size and therefore provides the researcher important supplementary information lacking under current approaches. The utility of this approach is demonstrated with two real data examples and important assumptions underlying the standardization process are highlighted. PMID:28484404
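A hedged sketch of the descriptive approach this abstract outlines: fit a two-way interaction with a continuous moderator, then express the conditional group difference at several moderator values as a d-type standardized effect. The variable names, synthetic data, and the use of the residual SD as the standardizer are illustrative assumptions, not the article's exact procedure:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data: binary group x, continuous moderator m, outcome y.
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({"x": rng.integers(0, 2, n), "m": rng.normal(0.0, 1.0, n)})
df["y"] = 0.4 * df.x + 0.3 * df.m + 0.5 * df.x * df.m + rng.normal(0.0, 1.0, n)

res = smf.ols("y ~ x * m", data=df).fit()
b = res.params
sd = np.sqrt(res.mse_resid)  # residual SD used as the standardizer (an assumption)

# Conditional group difference at m = mean -1 SD, mean, and +1 SD,
# reported as a d-type standardized mean difference.
for m0 in (df.m.mean() - df.m.std(), df.m.mean(), df.m.mean() + df.m.std()):
    diff = b["x"] + b["x:m"] * m0
    print(f"at m = {m0:+.2f}: raw difference = {diff:.3f}, d = {diff / sd:.3f}")
```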
Report on Pairing-based Cryptography
Moody, Dustin; Peralta, Rene; Perlner, Ray; Regenscheid, Andrew; Roginsky, Allen; Chen, Lily
2015-01-01
This report summarizes study results on pairing-based cryptography. The main purpose of the study is to form NIST’s position on standardizing and recommending pairing-based cryptography schemes currently published in research literature and standardized in other standard bodies. The report reviews the mathematical background of pairings. This includes topics such as pairing-friendly elliptic curves and how to compute various pairings. It includes a brief introduction to existing identity-based encryption (IBE) schemes and other cryptographic schemes using pairing technology. The report provides a complete study of the current status of standard activities on pairing-based cryptographic schemes. It explores different application scenarios for pairing-based cryptography schemes. As an important aspect of adopting pairing-based schemes, the report also considers the challenges inherent in validation testing of cryptographic algorithms and modules. Based on the study, the report suggests an approach for including pairing-based cryptography schemes in the NIST cryptographic toolkit. The report also outlines several questions that will require further study if this approach is followed. PMID:26958435
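The mathematical core the report reviews is the bilinear pairing; for reference, bilinearity is the property (standard material, not a quotation from the report):

```latex
e : G_1 \times G_2 \to G_T, \qquad
e(g^{a}, h^{b}) \;=\; e(g, h)^{\,ab} \quad \text{for all } a, b \in \mathbb{Z}_q,
```

which is exactly what identity-based encryption and the other pairing-based schemes the report surveys exploit.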
Analytical evaluation of current starch methods used in the international sugar industry: Part I.
Cole, Marsha; Eggleston, Gillian; Triplett, Alexa
2017-08-01
Several analytical starch methods exist in the international sugar industry to mitigate starch-related processing challenges and assess the quality of traded end-products. These methods use iodometric chemistry, mostly potato starch standards, and utilize similar solubilization strategies, but they had not been comprehensively compared. In this study, industrial starch methods were compared to the USDA Starch Research method using simulated raw sugars. Type of starch standard, solubilization approach, iodometric reagents, and wavelength detection affected total starch determination in simulated raw sugars. Simulated sugars containing potato starch were more accurately detected by the industrial methods, whereas those containing corn starch, a better model for sugarcane starch, were only accurately measured by the USDA Starch Research method. Use of a potato starch standard curve over-estimated starch concentrations. Among the variables studied, starch standard, solubilization approach, and wavelength detection had the greatest effect on the sensitivity, accuracy/precision, and limits of detection/quantification of the current industry starch methods. Published by Elsevier Ltd.
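A small sketch of why the choice of starch standard matters: calibrating on one starch and reading another starch's absorbance off that curve biases the estimate. The slopes below are invented for illustration (arranged so the potato-calibrated curve over-estimates, consistent with the abstract's finding); they are not the study's measured values:

```python
import numpy as np

# Concentrations of the calibration standards (mg/kg starch), illustrative.
conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])

# Assumed absorbance responses of the starch-iodine complex; slopes are
# invented so that the potato curve is the shallower one.
abs_potato = 0.0028 * conc + 0.010
abs_corn = 0.0040 * conc + 0.010

# Calibrate on potato starch only (the practice the study critiques).
slope, intercept = np.polyfit(conc, abs_potato, 1)

# A corn-starch sample (a better model for sugarcane starch) with a true
# concentration of 200 mg/kg, read off the potato curve:
true_c = 200.0
measured_abs = 0.0040 * true_c + 0.010
estimated_c = (measured_abs - intercept) / slope
print(f"true {true_c:.0f} mg/kg -> estimated {estimated_c:.0f} mg/kg")  # over-estimate
```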
Disciplinary Literacy through the Lens of the Next Generation Science Standards
ERIC Educational Resources Information Center
Houseal, Ana; Gillis, Victoria; Helmsing, Mark; Hutchison, Linda
2016-01-01
The current discussion among adolescent literacy researchers describes two positions at either end of a continuum: a generalist content area reading approach and a disciplinary literacy approach. Within the field, there are misunderstandings about the disciplinary literacy approach and claims that adolescents are ill suited to the kinds of…
Towards a framework for developing semantic relatedness reference standards.
Pakhomov, Serguei V S; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B; Ruggieri, Alexander; Chute, Christopher G
2011-04-01
Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the "moderate" range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. Copyright © 2010 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Alkraiji, Abdullah; Jackson, Thomas; Murray, Ian
2011-01-01
Purpose: This paper seeks to carry out a critical study of health data standards and adoption process with a focus on Saudi Arabia. Design/methodology/approach: Many developed nations have initiated programs to develop, promote, adopt and customise international health data standards to the local needs. The current status of, and future plans for,…
Pförtner, T-K
2016-06-01
A common indicator for the measurement of relative poverty is the disposable income of a household. Current research introduces the living standard approach as an alternative concept for describing and measuring relative poverty. This study compares both approaches with regard to the subjective health status of the German population, and provides theoretical implications for the use of the income and living standard approaches in health research. Analyses are based on the German Socio-Economic Panel (GSOEP) from the year 2011, which includes 12,290 private households and 21,106 survey members. Self-rated health was based on a subjective assessment of general health status. Income poverty is based on the equivalised disposable income and is applied to a threshold of 60% of the median-based average income. A person is denoted as deprived (inadequate living standard) if 3 or more out of 11 living standard items are lacking for financial reasons. To calculate the discriminant power of both poverty indicators, descriptive analyses and stepwise logistic regression models were applied separately for men and women, adjusted for age, residence, nationality, educational level, occupational status and marital status. The results of the stepwise regression revealed a stronger poverty-health relationship for the living standard indicator. After adjusting for all control variables and the respective poverty indicator, income poverty was not statistically significantly associated with poor subjective health status among men (OR Men: 1.33; 95% CI: 1.00-1.77) or women (OR Women: 0.98; 95% CI: 0.78-1.22). In contrast, the association between deprivation and subjective health status was statistically significant for men (OR Men: 2.00; 95% CI: 1.57-2.52) and women (OR Women: 2.11; 95% CI: 1.76-2.64). The results of the present study indicate that the income and living standard approaches measure different dimensions of poverty. Compared to the income approach, the living standard approach captures more severe material shortages and is relatively robust to gender differences. This study expands the current debate about complementary research on the association between poverty and health. © Georg Thieme Verlag KG Stuttgart · New York.
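A schematic of the comparison the study reports: logistic models of poor subjective health on each poverty indicator, summarized as odds ratios with 95% confidence intervals. The data, variable names, and coefficients below are synthetic stand-ins, not the GSOEP:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey: two binary poverty indicators plus age.
rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "income_poor": rng.integers(0, 2, n),
    "deprived": rng.integers(0, 2, n),
    "age": rng.integers(18, 80, n),
})
# Deprivation is given the stronger true effect, mirroring the reported pattern.
lin = -2.0 + 0.1 * df.income_poor + 0.7 * df.deprived + 0.02 * df.age
df["poor_health"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

res = smf.logit("poor_health ~ income_poor + deprived + age", data=df).fit(disp=0)
table = pd.concat(
    [np.exp(res.params).rename("OR"),
     np.exp(res.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})], axis=1)
print(table)  # ORs near 1 for income_poor, clearly above 1 for deprived
```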
A Binary Approach to Define and Classify Final Ecosystem Goods and Services
The ecosystem services literature decries the lack of consistency and standards in the application of ecosystem services as well as the inability of current approaches to explicitly link ecosystem services to human well-being. Recently, SEEA and CICES have conceptually identifie...
Bianchi, Paolo Pietro; Petz, Wanda; Luca, Fabrizio; Biffi, Roberto; Spinoglio, Giuseppe; Montorsi, Marco
2014-01-01
The current standard treatment for rectal cancer is based on a multimodality approach with preoperative radiochemotherapy in advanced cases and complete surgical removal through total mesorectal excision (TME). The most frequent surgical approach is traditional open surgery, as laparoscopic TME requires high technical skill, a long learning curve, and is not widespread, still being confined to centers with great experience in minimally invasive techniques. Nevertheless, in several studies, the laparoscopic approach, when compared to open surgery, has shown some better short-term clinical outcomes and at least comparable oncologic results. Robotic surgery for the treatment of rectal cancer is an emerging technique, which could overcome some of the technical difficulties posed by standard laparoscopy, but evidence from the literature regarding its oncologic safety and clinical outcomes is still lacking. This brief review analyses the current status of minimally invasive surgery for rectal cancer therapy, focusing on oncologic safety and the new robotic approach. PMID:24834429
New ANSI standard for thyroid phantom
Mallett, Michael W.; Bolch, Wesley E.; Fulmer, Philip C.; ...
2015-08-01
A new ANSI standard titled “Thyroid Phantom Used in Occupational Monitoring” (Health Physics Society 2014) has been published. The standard establishes the criteria for acceptable design, fabrication, or modeling of a phantom suitable for calibrating in vivo monitoring systems to measure photon-emitting radionuclides deposited in the thyroid. The current thyroid phantom standard was drafted in 1973 (ANSI N44.3-1973), last reviewed in 1984, and a revision of the standard to cover a more modern approach was deemed warranted.
Mendiratta, Prateek; Armstrong, Andrew J; George, Daniel J
2007-01-01
Prostate cancer is a common cause of death in men and remains incurable in the metastatic setting. In 2004, 2 landmark trials using docetaxel-based chemotherapy, TAX 327 and SWOG 99-16, showed a survival benefit for the first time in metastatic, hormone-refractory prostate cancer. Current research suggests that several distinct mechanisms of androgen-refractory disease may converge in patients with disease progression on androgen deprivation therapy. These findings have identified several potential targets for therapeutic intervention. Current standard and investigational treatment options for this disease are discussed, including chemotherapy and rapidly evolving therapies in phase II/III trials involving antiangiogenic therapies, signal transduction inhibitors, immunomodulatory agents, and nuclear receptor targets. In light of a growing array of treatment options and an increasingly chronic natural history, this review supports a multidisciplinary care approach to these patients, including medical oncologists, urologists, and radiation oncologists, to optimize survival and quality of life. PMID:17387372
ERIC Educational Resources Information Center
Weber, Arnold R.
1990-01-01
The trend of tuition increases is excessive in terms of both economic standards and the special mission and characteristics of higher education. It is clouding the current state of higher education. Although there are sound reasons for tuition increases, the current approach to setting tuition should be altered and moderated. (MSE)
77 FR 28543 - Nationwide Health Information Network: Conditions for Trusted Exchange
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
... expansion, electronic exchange has been governed by a patchwork of contractual relationships, procurement.... Consequently, this ad-hoc governance approach has led to asymmetries in the policies and technical standards... This request for information (RFI) reflects ONC's current thinking regarding the approach ONC should...
ERIC Educational Resources Information Center
Shewchuk, Richard M.; Schmidt, Hilary J.; Benarous, Alexandra; Bennett, Nancy L.; Abdolrasulnia, Maziar; Casebeer, Linda L.
2007-01-01
Introduction: Rapidly expanding science and mandates for maintaining credentials place increasing demands on continuing medical education (CME) activities to provide information that is current and relevant to patient care. Quality may be seen as the perceived level of service measured against consumer expectations. Standard tools have not been…
Curriculum Standards of Technological and Vocational Education in Taiwan, R.O.C.
ERIC Educational Resources Information Center
Lee, Lung-Sheng Steven; Hwang, Jenq-Jye
In Taiwan, curriculum standards for senior vocational schools and junior colleges are administered and promulgated by the Ministry of Education approximately every 10 years. Curricula for institutes of technology are principally school based. As a result of critiques of the current top-down or administration-based approach system of curriculum…
Writing Instruction in First Grade: An Observational Study
ERIC Educational Resources Information Center
Coker, David L., Jr.; Farley-Ripple, Elizabeth; Jackson, Allison F.; Wen, Huijing; MacArthur, Charles A.; Jennings, Austin S.
2016-01-01
As schools work to meet the ambitious Common Core State Standards in writing (Common Core State Standards Initiation, 2010), instructional approaches are likely to be examined. However, there is little research that describes the current state of instruction. This study was designed to expand the empirical base on writing instruction in first…
A new approach to counting measurements: Addressing the problems with ISO-11929
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klumpp, John Allan; Poudel, Deepesh; Miller, Guthrie
We present an alternative approach to making counting measurements of radioactivity which offers probabilistic interpretations of the measurements. Unlike the approach in the current international standard (ISO-11929), our approach, which uses an assumed prior probability distribution of the true amount in the sample, is able to answer the question of interest for most users of the standard: “what is the probability distribution of the true amount in the sample, given the data?” The final interpretation of the measurement requires information not necessarily available at the measurement stage. However, we provide an analytical formula for what we term the “measurement strength” that depends only on measurement-stage count quantities. Here, we show that, when the sources are rare, the posterior odds that the sample true value exceeds ε are the measurement strength times the prior odds, independently of ε, the prior odds, and the distribution of the calibration coefficient. We recommend that the measurement lab immediately follow up on unusually high samples using an “action threshold” on the measurement strength which is similar to the decision threshold recommended by the current standard. Finally, we further recommend that the measurement lab perform large background studies in order to characterize non-constancy of background, including possible time correlation of background.
A new approach to counting measurements: Addressing the problems with ISO-11929
Klumpp, John Allan; Poudel, Deepesh; Miller, Guthrie
2017-12-23
We present an alternative approach to making counting measurements of radioactivity which offers probabilistic interpretations of the measurements. Unlike the approach in the current international standard (ISO-11929), our approach, which uses an assumed prior probability distribution of the true amount in the sample, is able to answer the question of interest for most users of the standard: “what is the probability distribution of the true amount in the sample, given the data?” The final interpretation of the measurement requires information not necessarily available at the measurement stage. However, we provide an analytical formula for what we term the “measurement strength” that depends only on measurement-stage count quantities. Here, we show that, when the sources are rare, the posterior odds that the sample true value exceeds ε are the measurement strength times the prior odds, independently of ε, the prior odds, and the distribution of the calibration coefficient. We recommend that the measurement lab immediately follow up on unusually high samples using an “action threshold” on the measurement strength which is similar to the decision threshold recommended by the current standard. Finally, we further recommend that the measurement lab perform large background studies in order to characterize non-constancy of background, including possible time correlation of background.
A new approach to counting measurements: Addressing the problems with ISO-11929
NASA Astrophysics Data System (ADS)
Klumpp, John; Miller, Guthrie; Poudel, Deepesh
2018-06-01
We present an alternative approach to making counting measurements of radioactivity which offers probabilistic interpretations of the measurements. Unlike the approach in the current international standard (ISO-11929), our approach, which uses an assumed prior probability distribution of the true amount in the sample, is able to answer the question of interest for most users of the standard: "what is the probability distribution of the true amount in the sample, given the data?" The final interpretation of the measurement requires information not necessarily available at the measurement stage. However, we provide an analytical formula for what we term the "measurement strength" that depends only on measurement-stage count quantities. We show that, when the sources are rare, the posterior odds that the sample true value exceeds ε are the measurement strength times the prior odds, independently of ε, the prior odds, and the distribution of the calibration coefficient. We recommend that the measurement lab immediately follow up on unusually high samples using an "action threshold" on the measurement strength which is similar to the decision threshold recommended by the current standard. We further recommend that the measurement lab perform large background studies in order to characterize non-constancy of background, including possible time correlation of background.
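The key relation stated in all three versions of this abstract can be written compactly: with S the measurement strength (computable from measurement-stage count quantities alone) and ν the sample's true value, in the rare-source limit

```latex
\frac{P(\nu > \varepsilon \mid \text{data})}{P(\nu \le \varepsilon \mid \text{data})}
\;=\; S \,\times\, \frac{P(\nu > \varepsilon)}{P(\nu \le \varepsilon)},
```

independently of ε, the prior odds, and the distribution of the calibration coefficient.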
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-10
... that plan for any mergers, (3) obtain prior written approvals for the use of certain approaches for... of its continuing effort to reduce paperwork and respondent burden, invites the general public and... Rules: Standardized Approach for Risk-Weighted Assets; Market Discipline and Disclosure Requirements (77...
Uncertainty and instream flow standards
Castleberry, D.; Cech, J.; Erman, D.; Hankin, D.; Healey, M.; Kondolf, M.; Mengel, M.; Mohr, M.; Moyle, P.; Nielsen, Jennifer L.; Speed, T.; Williams, J.
1996-01-01
Several years ago, Science published an important essay (Ludwig et al. 1993) on the need to confront the scientific uncertainty associated with managing natural resources. The essay did not discuss instream flow standards explicitly, but its arguments apply. At an April 1995 workshop in Davis, California, all 12 participants agreed that currently no scientifically defensible method exists for defining the instream flows needed to protect particular species of fish or aquatic ecosystems (Williams, in press). We also agreed that acknowledging this fact is an essential step in dealing rationally and effectively with the problem.Practical necessity and the protection of fishery resources require that new instream flow standards be established and that existing standards be revised. However, if standards cannot be defined scientifically, how can this be done? We join others in recommending the approach of adaptive management. Applied to instream flow standards, this approach involves at least three elements.
[Laboratory diagnosis of mucormycosis].
Garcia-Hermoso, Dea
2013-03-01
Mucormycoses are deep infections caused by ubiquitous filamentous fungi of the order Mucorales. The disease occurs mostly in immunocompromised patients, diabetics, or solid organ transplant recipients. There are currently no specific diagnostic guidelines for mucormycosis. The histological examination and culture of the clinical sample remain the most useful approaches for diagnosis. Furthermore, alternative methods to fungal culture have yet to be standardized. Here we review the current microbiological approaches used for the diagnosis and identification of Mucorales. © 2013 médecine/sciences – Inserm / SRMS.
Considerations of Unmanned Aircraft Classification for Civil Airworthiness Standards
NASA Technical Reports Server (NTRS)
Maddalon, Jeffrey M.; Hayhurst, Kelly J.; Morris, A. Terry; Verstynen, Harry A.
2013-01-01
The use of unmanned aircraft in the National Airspace System (NAS) has been characterized as the next great step forward in the evolution of civil aviation. Although use of unmanned aircraft systems (UAS) in military and public service operations is proliferating, civil use of UAS remains limited in the United States today. This report focuses on one particular regulatory challenge: classifying UAS to assign airworthiness standards. Classification is useful for ensuring that meaningful differences in design are accommodated by certification to different standards, and that aircraft with similar risk profiles are held to similar standards. This paper provides observations related to how the current regulations for classifying manned aircraft, based on dimensions of aircraft class and operational aircraft categories, could apply to UAS. This report finds that existing aircraft classes are well aligned with the types of UAS that currently exist; however, the operational categories are more difficult to align to proposed UAS use in the NAS. Specifically, the factors used to group manned aircraft into similar risk profiles do not necessarily capture all relevant UAS risks. UAS classification is investigated through gathering approaches to classification from a broad spectrum of organizations, and then identifying and evaluating the classification factors from these approaches. This initial investigation concludes that factors in addition to those currently used today to group manned aircraft for the purpose of assigning airworthiness standards will be needed to adequately capture risks associated with UAS and their operations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, C.T.
1994-03-01
This paper presents a comparison of several qualitatively different approaches to Total Quality Management (TQM). The continuum ranges from management approaches that are primarily standards -- with specific guidelines, but few theoretical concepts -- to approaches that are primarily philosophical, with few specific guidelines. The approaches to TQM discussed in this paper include the International Organization for Standardization (ISO) 9000 Standard, the Malcolm Baldrige National Quality Award, Senge's Learning Organization, Watkins and Marsick's approach to organizational learning, Covey's Seven Habits of Highly Effective People, and Deming's Fourteen Points for Management. Some of these approaches (Deming and ISO 9000) are then compared to the DOE's official position on quality management and conduct of operations (DOE Orders 5700.6C and 5480.19). Using a tabular format, it is shown that while 5700.6C (Quality Assurance) maps well to many of the current approaches to TQM, DOE's principal guide to management, Order 5480.19 (Conduct of Operations), has many significant conflicts with some of the modern approaches to continuous quality improvement.
Standard Chinese: A Modular Approach. Student Workbook. Module 3: Money; Module 4: Directions.
ERIC Educational Resources Information Center
Defense Language Inst., Monterey, CA.
Texts in spoken Standard Chinese were developed to improve and update Chinese materials and to reflect current usage in Beijing and Taipei. The focus is on communicating in Chinese in practical situations. The overall course is organized into 10 situational modules, student workbooks, and resource modules. This workbook covers the money and…
Standard Chinese: A Modular Approach. Student Text. Module 3: Money; Module 4: Directions.
ERIC Educational Resources Information Center
Defense Language Inst., Monterey, CA.
Texts in spoken Standard Chinese were developed to improve and update Chinese materials to reflect current usage in Beijing and Taipei. The focus is on communicating in practical situations, and the texts summarize and supplement tapes. The overall course is organized into 10 situational modules, student workbooks, and resource modules. This text…
ERIC Educational Resources Information Center
Defense Language Inst., Monterey, CA.
Texts in spoken Standard Chinese were developed to improve and update Chinese materials to reflect current usage in Beijing and Taipei. The focus is on communicating in Chinese in practical situations, and the texts summarize and supplement tapes. The overall course is organized into 10 situational modules, student workbooks, and resource modules.…
ERIC Educational Resources Information Center
Defense Language Inst., Monterey, CA.
Texts in spoken Standard Chinese were developed to improve and update Chinese materials to reflect current usage in Beijing and Taipei. The focus is on communicating in Chinese in practical situations. The overall course is organized into 10 modules, student workbooks, and resource modules. This workbook covers the orientation and biographic…
Integrating Apps with the Core Arts Standards in the 21st-Century Elementary Music Classroom
ERIC Educational Resources Information Center
Heath-Reynolds, Julia; VanWeelden, Kimberly
2015-01-01
The implementation of the National Core Arts Standards has amplified the need for multiple approaches and opportunities for student responses and may compel music educators to use new tools. There are currently over one million available apps, and with the popularity of smart devices, student access to technology is increasing exponentially. Music…
Information risk and security modeling
NASA Astrophysics Data System (ADS)
Zivic, Predrag
2005-03-01
This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines, and NSA configuration guidelines, along with the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITSM, and architectural guidelines such as ISO7498-2 will be explained. Business-process-level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics, the research presentation will explore the appropriate usage of these standards. The paper will discuss standards approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. The proposed approach's 3D visual presentation and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and a defined risk and security space for modeling and measurement.
Age Analysis of Public Library Collections. Final Report.
ERIC Educational Resources Information Center
Wallace, Danny P.; And Others
The use of information regarding the ages of library items is a standard component of many approaches to weeding library collections, and has a long history in the literature of collection management. Current and past approaches to using aging information to make weeding decisions make use of very arbitrary decision criteria. This study examined…
Professional Competence of Teachers in the Age of Globalization
ERIC Educational Resources Information Center
Orazbayeva, Kuldarkhan O.
2016-01-01
Current challenges of globalization in a democratic post-industrial information society make the competency-based approach a standard in the creation of the global educational environment. This study describes the special aspects of the integration of the competency-based approach into the educational theory and practice of post-Soviet countries,…
Computation of ancestry scores with mixed families and unrelated individuals.
Zhou, Yi-Hui; Marron, James S; Wright, Fred A
2018-03-01
The issue of robustness to family relationships in computing genotype ancestry scores such as eigenvector projections has received increased attention in genetic association, and is particularly challenging when sets of both unrelated individuals and closely related family members are included. The current standard is to compute loadings (left singular vectors) using unrelated individuals and to compute projected scores for remaining family members. However, projected ancestry scores from this approach suffer from shrinkage toward zero. We consider two main novel strategies: (i) matrix substitution based on decomposition of a target family-orthogonalized covariance matrix, and (ii) using family-averaged data to obtain loadings. We illustrate the performance via simulations, including resampling from 1000 Genomes Project data, and analysis of a cystic fibrosis dataset. The matrix substitution approach has similar performance to the current standard, but is simple and uses only a genotype covariance matrix, while the family-average method shows superior performance. Our approaches are accompanied by novel ancillary approaches that provide considerable insight, including individual-specific eigenvalue scree plots. © 2017 The Authors. Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
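A compact sketch of the "current standard" projection step this abstract critiques, using synthetic genotypes: loadings are fit on unrelated individuals and family members are projected onto them, which visibly shrinks the projected scores toward zero. Matrix shapes and the number of axes are arbitrary choices for illustration:

```python
import numpy as np

# Hypothetical genotype matrices (individuals x SNPs), synthetic noise here.
rng = np.random.default_rng(3)
G_unrel = rng.normal(size=(500, 2000))   # unrelated reference set
G_fam = rng.normal(size=(60, 2000))      # family members to place

# Center columns using the unrelated set's statistics.
mu = G_unrel.mean(axis=0)
Xu, Xf = G_unrel - mu, G_fam - mu

# Loadings (right singular vectors) from the unrelated individuals only...
U, s, Vt = np.linalg.svd(Xu, full_matrices=False)
V = Vt[:10].T                            # top 10 ancestry axes

# ...then project everyone onto them. Out-of-sample (family) scores shrink
# relative to the scores of the individuals used to fit the loadings.
scores_unrel = Xu @ V
scores_fam = Xf @ V
print("in-sample SDs: ", scores_unrel.std(axis=0)[:3])
print("projected SDs:", scores_fam.std(axis=0)[:3])   # noticeably smaller
```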
Real-time image sequence segmentation using curve evolution
NASA Astrophysics Data System (ADS)
Zhang, Jun; Liu, Weisong
2001-04-01
In this paper, we describe a novel approach to image sequence segmentation and its real-time implementation. This approach uses the 3D structure tensor to produce a more robust frame difference signal and uses curve evolution to extract whole objects. Our algorithm is implemented on a standard PC running the Windows operating system with video capture from a USB camera that is a standard Windows video capture device. Using the Windows standard video I/O functionalities, our segmentation software is highly portable and easy to maintain and upgrade. In its current implementation on a Pentium 400, the system can perform segmentation at 5 frames/sec with a frame resolution of 160 by 120.
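A rough sketch of a 3D (spatiotemporal) structure tensor over an image stack, the ingredient this abstract says yields a more robust frame-difference signal. The smoothing scale and the motion measure at the end are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def structure_tensor_3d(frames, sigma=1.5):
    """Spatiotemporal (3D) structure tensor of an image sequence.

    frames: float array of shape (T, H, W). Returns the six unique
    tensor components, each Gaussian-smoothed over (t, y, x).
    """
    gt, gy, gx = np.gradient(frames.astype(float))
    comps = {}
    for name, a, b in [("xx", gx, gx), ("yy", gy, gy), ("tt", gt, gt),
                       ("xy", gx, gy), ("xt", gx, gt), ("yt", gy, gt)]:
        comps[name] = gaussian_filter(a * b, sigma)
    return comps

# One plausible motion indicator: temporal gradient energy normalized by
# total gradient energy, smoothed by the tensor integration, which is less
# noise-sensitive than a raw frame difference.
T, H, W = 16, 120, 160
video = np.random.default_rng(4).random((T, H, W))
c = structure_tensor_3d(video)
motion = c["tt"] / (c["xx"] + c["yy"] + c["tt"] + 1e-9)
```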
Development of a space universal modular architecture (SUMO)
NASA Astrophysics Data System (ADS)
Collins, Bernie F.
This concept paper proposes that the space community should develop and implement a universal standard for spacecraft modularity - to improve interoperability of spacecraft components. Pursuing a global industry consensus standard for open and modular spacecraft architecture will encourage trade, remove standards-related market barriers, and in the long run increase both value provided to customers and profitability of the space industrial sector. This concept paper sets out: (1) the goals for a SUMO standard and how it will benefit the space community; (2) background on spacecraft modularity and existing related standards; (3) the proposed technical scope of the current standardization effort; and (4) an approach for creating a SUMO standard.
Simon, Heather; Baker, Kirk R; Akhtar, Farhan; Napelenok, Sergey L; Possiel, Norm; Wells, Benjamin; Timin, Brian
2013-03-05
In setting primary ambient air quality standards, the EPA's responsibility under the law is to establish standards that protect public health. As part of the current review of the ozone National Ambient Air Quality Standard (NAAQS), the US EPA evaluated the health exposure and risks associated with ambient ozone pollution using a statistical approach to adjust recent air quality to simulate just meeting the current standard level, without specifying emission control strategies. One drawback of this purely statistical concentration rollback approach is that it does not take into account spatial and temporal heterogeneity of ozone response to emissions changes. The application of the higher-order decoupled direct method (HDDM) in the community multiscale air quality (CMAQ) model is discussed here to provide an example of a methodology that could incorporate this variability into the risk assessment analyses. Because this approach includes a full representation of the chemical production and physical transport of ozone in the atmosphere, it does not require assumed background concentrations, which have been applied to constrain estimates from past statistical techniques. The CMAQ-HDDM adjustment approach is extended to measured ozone concentrations by determining typical sensitivities at each monitor location and hour of the day based on a linear relationship between first-order sensitivities and hourly ozone values. This approach is demonstrated by modeling ozone responses for monitor locations in Detroit and Charlotte to domain-wide reductions in anthropogenic NOx and VOCs emissions. As seen in previous studies, ozone response calculated using HDDM compared well to brute-force emissions changes up to approximately a 50% reduction in emissions. A new stepwise approach is developed here to apply this method to emissions reductions beyond 50% allowing for the simulation of more stringent reductions in ozone concentrations. Compared to previous rollback methods, this application of modeled sensitivities to ambient ozone concentrations provides a more realistic spatial response of ozone concentrations at monitors inside and outside the urban core and at hours of both high and low ozone concentrations.
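The HDDM adjustment underlying this approach is, in its usual form, a Taylor expansion of ozone in the fractional emission perturbation ε (standard DDM/HDDM notation, not quoted from the paper):

```latex
O_3(\varepsilon) \;\approx\; O_3^{(0)} \;+\; \varepsilon\, S^{(1)} \;+\; \frac{\varepsilon^{2}}{2}\, S^{(2)},
```

where \(S^{(1)}\) and \(S^{(2)}\) are the first- and second-order sensitivity coefficients. The stepwise scheme described in the abstract effectively re-applies the expansion in increments so that reductions beyond roughly 50% stay within its range of validity.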
Data Sharing to Improve Close Approach Monitoring and Safety of Flight
NASA Astrophysics Data System (ADS)
Chan, Joseph; DalBello, Richard; Hope, Dean; Wauthier, Pascal; Douglas, Tim; Inghram, Travis
2009-03-01
Individual satellite operators have done a good job of developing the internal protocols and procedures to ensure the safe operation of their fleets. However, data sharing among operators for close approach monitoring is conducted in an ad-hoc manner during relocations, and there is currently no standardized agreement among operators on the content, format, and distribution protocol for data sharing. Crowding in geostationary orbit, participation by new commercial actors, government interest in satellite constellations, and highly maneuverable spacecraft all suggest that satellite operators will need to begin a dialogue on standard communication protocols and procedure to improve situation awareness. We will give an overview of the current best practices among different operators for close approach monitoring and discuss the concept of an active data center to improve data sharing, conjunction monitoring, and avoidance among satellite operators. We will also report on the progress and lessons learned from a Data Center prototype conducted by several operators over a one year period.
NASA Occupant Protection Standards Development
NASA Technical Reports Server (NTRS)
Somers, Jeffrey; Gernhardt, Michael; Lawrence, Charles
2012-01-01
Historically, spacecraft landing systems have been tested with human volunteers, because analytical methods for estimating injury risk were insufficient. These tests were conducted with flight-like suits and seats to verify the safety of the landing systems. Currently, NASA uses the Brinkley Dynamic Response Index to estimate injury risk, although applying it to the NASA environment has drawbacks: (1) it does not indicate the severity or anatomical location of injury, and (2) it is unclear if the model applies to NASA applications. Because of these limitations, a new validated, analytical approach was desired. Leveraging the current state of the art in automotive and racing safety, a new approach was developed. The approach has several aspects: (1) define the acceptable level of injury risk by injury severity; (2) determine the appropriate human surrogate for testing and modeling; (3) mine existing human injury data to determine appropriate Injury Assessment Reference Values (IARV); (4) rigorously validate the IARVs with sub-injurious human testing; and (5) use the validated IARVs to update standards and vehicle requirements.
Current and Emerging Therapies for Lupus Nephritis
Parikh, Samir V.
2016-01-01
The introduction of corticosteroids and later, cyclophosphamide dramatically improved survival in patients with proliferative lupus nephritis, and combined administration of these agents became the standard-of-care treatment for this disease. However, treatment failures were still common and the rate of progression to ESRD remained unacceptably high. Additionally, treatment was associated with significant morbidity. Therefore, as patient survival improved, the goals for advancing lupus nephritis treatment shifted to identifying therapies that could improve long-term renal outcomes and minimize treatment-related toxicity. Unfortunately, progress has been slow and the current approaches to the management of lupus nephritis continue to rely on high-dose corticosteroids plus a broad-spectrum immunosuppressive agent. Over the past decade, an improved understanding of lupus nephritis pathogenesis fueled several clinical trials of novel drugs, but none have been found to be superior to the combination of a cytotoxic agent and corticosteroids. Despite these trial failures, efforts to translate mechanistic advances into new treatment approaches continue. In this review, we discuss current therapeutic strategies for lupus nephritis, briefly review recent advances in understanding the pathogenesis of this disease, and describe emerging approaches developed on the basis of these advances that promise to improve upon the standard-of-care lupus nephritis treatments. PMID:27283496
NASA Technical Reports Server (NTRS)
Grantham, W. D.; Smith, P. M.; Deal, P. L.
1980-01-01
Piloted-simulator studies were conducted to determine takeoff and landing operating procedures for a supersonic cruise research transport concept that result in predicted noise levels which meet current Federal Aviation Administration (FAA) certification standards. With the use of standard FAA noise certification test procedures, the subject simulated aircraft did not meet the FAA traded-noise-level standards during takeoff and landing. However, with the use of advanced procedures, this aircraft meets the traded-noise-level standards for flight crews with average skills. The advanced takeoff procedures developed involved violating some of the current Federal Aviation Regulations (FAR), but it was not necessary to violate any FAR noise-test conditions during landing approach. Noise contours were also determined for some of the simulated takeoffs and landings in order to indicate the noise-reduction advantages of using operational procedures other than standard.
Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization
NASA Astrophysics Data System (ADS)
Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad
2017-02-01
Malay Twitter messages deviate markedly from the standard language. Malay Tweets are currently in wide use, especially across the Malay archipelago. It is therefore important to build a normalization system that can translate Malay Tweet language into standard Malay. Most research in natural language processing has focused on normalizing English Twitter messages, while few studies have addressed normalizing Malay Tweets. This paper proposes an approach to normalize Malay Twitter messages based on hybrid dictionary modelling methods. The approach normalizes noisy Malay Twitter messages such as colloquial language, novel words, and interjections into standard Malay. The research will use a language model and an N-gram model.
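As a rough illustration of the hybrid dictionary-plus-language-model idea, the sketch below normalizes a toy Malay tweet: a lexicon proposes standard-Malay candidates for each noisy token, and a bigram count picks the candidate that best fits the preceding word. The lexicon entries and counts are invented for the example; a real system would learn them from annotated Malay tweet corpora.

```python
import re

# Toy normalization lexicon and bigram counts (hypothetical values).
LEXICON = {"x": ["tak", "x"], "sgt": ["sangat"], "yg": ["yang"], "ni": ["ini"]}
BIGRAMS = {("saya", "tak"): 9, ("tak", "sangat"): 1, ("saya", "x"): 1}

def candidates(token):
    return LEXICON.get(token, [token])

def normalize(tweet):
    """Greedy hybrid normalization: the dictionary proposes standard-Malay
    candidates per token and a bigram count selects the candidate that best
    follows the previous word."""
    out = []
    for tok in re.findall(r"\w+", tweet.lower()):
        prev = out[-1] if out else "<s>"
        best = max(candidates(tok), key=lambda c: BIGRAMS.get((prev, c), 0))
        out.append(best)
    return " ".join(out)

print(normalize("saya x sgt suka"))  # -> "saya tak sangat suka"
```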
ERIC Educational Resources Information Center
Defense Language Inst., Monterey, CA.
Texts in spoken Standard Chinese were developed to improve and update Chinese materials to reflect current usage in Beijing and Taipei. The focus is on communicating in Chinese in practical situations. The overall course is organized into 10 situational modules, student workbooks for each module, and resource modules. This text contains resource…
ERIC Educational Resources Information Center
Li, Deping; Oranje, Andreas
2007-01-01
Two versions of a general method for approximating standard error of regression effect estimates within an IRT-based latent regression model are compared. The general method is based on Binder's (1983) approach, accounting for complex samples and finite populations by Taylor series linearization. In contrast, the current National Assessment of…
Pereira, Paulo; Westgard, James O; Encarnação, Pedro; Seghatchian, Jerard; de Sousa, Gracinda
2015-04-01
The screening laboratory has a critical role in post-transfusion safety. Its success and efficiency depend on the management system used. Even though the European Union directive 2002/98/EC requires a quality management system in blood establishments, its requirements for screening laboratories are generic. Complementary approaches are needed to implement a quality management system focused on screening laboratories. This article briefly discusses the current good manufacturing practices and good laboratory practices, as well as the trends in quality management system standards. ISO 9001 is widely accepted in some European Union blood establishments as the quality management standard; however, this is not synonymous with its successful application. The ISO "risk-based thinking" is interrelated with the quality risk-management process of the EuBIS "Standards and criteria for the inspection of blood establishments". ISO 15189 should be the next step in the quality assurance of a screening laboratory, since it is focused on the medical laboratory. To standardize the quality management systems in blood establishments' screening laboratories, new national and European claims focused on technical requirements following ISO 15189 are needed. Copyright © 2015 Elsevier Ltd. All rights reserved.
Johnson, Karl D
2003-03-01
GASB has proposed new standards that will affect the way in which governments report postemployment health care benefits in audited external financial statements, resulting in more complete and transparent reporting by employers and plans and more relevant and useful information for the users of governmental financial reports. This article provides an overview of current financial reporting standards and practice, the financial reporting objectives of the project, the proposed measurement approach, noteworthy specific proposals, and the projected timetable for completion of the project and implementation of the new standards.
Future Concepts for Realtime Data Interfaces for Control Centers
NASA Technical Reports Server (NTRS)
Kearney, Mike W., III
2004-01-01
Existing methods of exchanging realtime data between the major control centers in the International Space Station program have resulted in a patchwork of local formats being imposed on each Mission Control Center. This puts the burden on a data customer to comply with the proprietary data formats of each data supplier. This has increased the cost and complexity for each participant, limited access to mission data and hampered the development of efficient and flexible operations concepts. Ideally, a universal format should be promoted in the industry to prevent the unnecessary burden of each center processing a different data format standard for every external interface with another center. With the broad acceptance of XML and other conventions used in other industries, it is now time for the Aerospace industry to fully engage and establish such a standard. This paper will briefly consider the components that would be required by such a standard (XML schema, data dictionaries, etc.) in order to accomplish the goal of a universal low-cost interface, and acquire broad industry acceptance. We will then examine current approaches being developed by standards bodies and other groups. The current state of CCSDS panel work will be reviewed, with a survey of the degree of industry acceptance. Other widely accepted commercial approaches will be considered, sometimes complementary to the standards work, but sometimes not. The question is whether de facto industry standards are in concert with, or in conflict with, the direction of the standards bodies. And given that state of affairs, the author will consider whether a new program establishing its Mission Control Center should implement a data interface based on those standards. The author proposes that broad industry support to unify the various efforts will enable collaboration between control centers and space programs to a wider degree than is currently available. This will reduce the cost for programs to provide realtime access to their data, hence reducing the cost of access to space, and benefiting the industry as a whole.
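To make the idea concrete, a universal realtime interface of the kind described would pair an agreed XML schema with data dictionaries giving each parameter a common identifier and units. The snippet below, using Python's standard library, shows what a single telemetry sample might look like; the element and attribute names are hypothetical illustrations, not an actual CCSDS schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical universal realtime-exchange message; names are illustrative.
msg = ET.Element("telemetry", center="MCC-H", epoch="2004-001T12:00:00Z")
param = ET.SubElement(msg, "parameter", id="ISS_CABIN_PRESS", units="kPa")
param.text = "101.2"
print(ET.tostring(msg, encoding="unicode"))
```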
Current National Approach to Healthcare ICT Standardization: Focus on Progress in New Zealand.
Park, Young-Taek; Atalag, Koray
2015-07-01
Many countries try to deliver high-quality healthcare services efficiently at lower and manageable costs, and healthcare information and communication technology (ICT) standardisation may play an important role in this. New Zealand provides a good model of healthcare ICT standardisation. The purpose of this study was to review the current state and progress of healthcare ICT standardisation in New Zealand. This study reviewed reports regarding healthcare ICT standardisation in New Zealand. We also investigated relevant websites related to healthcare ICT standards, most of which were run by the government. We then summarised the governance structure, the standardisation processes, and their outputs regarding the current status of healthcare ICT standards in New Zealand. New Zealand government bodies have established a set of healthcare ICT standards and clear guidelines and procedures for healthcare ICT standardisation. The government has actively participated in various enactments of healthcare ICT standards, from the inception of ideas to their eventual retirement. Great achievements in eHealth have already been realised, and various standards are currently utilised at all levels of healthcare, regionally and nationally. Standard clinical terminologies, such as the International Classification of Diseases (ICD) and the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED-CT), have been adopted, and Health Level Seven (HL7) standards are actively used in health information exchanges. The government of New Zealand has well-organised ICT institutions, guidelines, and regulations, as well as various programs, such as e-Medications and integrated care services. Local district health boards directly running hospitals have effectively adopted various new ICT standards. They might already be benefiting from improved efficiency resulting from healthcare ICT standardisation.
NASA Technical Reports Server (NTRS)
Kezirian, Michael; Cook, Anthony; Dick, Brandon; Phoenix, S. Leigh
2012-01-01
To supply oxygen and nitrogen to the International Space Station, a COPV tank is being developed to meet requirements beyond those which have been flown. In order to "ship full" and support compatibility with a range of launch site operations, the vessel was designed for certification to International Organization for Standardization (ISO) standards, which take a different approach than current NASA certification approaches. These requirements had to be met in addition to existing NASA certification standards. Initial risk-reduction development tests have been successful. Qualification is in progress.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendell, Mark J.; Fisk, William J.
Background - The goal of this project, with a focus on commercial buildings in California, was to develop a new framework for evidence-based minimum ventilation rate (MVR) standards that protect occupants in buildings while also considering energy use and cost. This was motivated by research findings suggesting that current prescriptive MVRs in commercial buildings do not provide occupants with fully safe and satisfactory indoor environments. Methods - The project began with a broad review in several areas: the diverse strategies now used for standards or guidelines for MVRs or for environmental contaminant exposures, current knowledge about adverse human effects associated with VRs, and current knowledge about contaminants in commercial buildings, including their presence, their adverse human effects, and their relationships with VRs. Based on a synthesis of the reviewed information, new principles and approaches are proposed for setting evidence-based VR standards for commercial buildings, considering a range of human effects including health, performance, and acceptability of air. Results - A review and evaluation is first presented of current approaches to setting prescriptive building ventilation standards and setting acceptable limits for human contaminant exposures in outdoor air and occupational settings. Recent research on approaches to setting acceptable levels of environmental exposures in evidence-based MVR standards is also described. From a synthesis and critique of these materials, a set of principles for setting MVRs is presented, along with an example approach based on these principles. The approach combines two sequential strategies. In a first step, an acceptable threshold is set for each adverse outcome that has a demonstrated relationship to VRs, as an increase from a (low) outcome level at a high reference ventilation rate (RVR, the VR needed to attain the best achievable levels of the adverse outcome); MVRs required to meet each specific outcome threshold are estimated; and the highest of these MVRs, which would then meet all outcome thresholds, is selected as the target MVR. In a second step, implemented only if the target MVR from step 1 is judged impractically high, costs and benefits are estimated and this information is used in a risk management process. Four human outcomes with substantial quantitative evidence of relationships to VRs are identified for initial consideration in setting MVR standards. These are: building-related symptoms (sometimes called sick building syndrome symptoms), poor perceived indoor air quality, and diminished work performance, all with data relating them directly to VRs; and cancer and non-cancer chronic outcomes, related indirectly to VRs through specific VR-influenced indoor contaminants. In an application of step 1 for offices using a set of example outcome thresholds, a target MVR of 9 L/s (19 cfm) per person was needed. Because this target MVR was close to MVRs in current standards, use of a cost/benefit process seemed unnecessary. Selection of more stringent thresholds for one or more human outcomes, however, could raise the target MVR to 14 L/s (30 cfm) per person or higher, triggering the step 2 risk management process. Consideration of outdoor air pollutant effects would add further complexity to the framework.
For balancing the objective and subjective factors involved in setting MVRs in a cost-benefit process, it is suggested that a diverse group of stakeholders make the determination after assembling as much quantitative data as possible.
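The first step of the proposed framework lends itself to a short sketch: for each outcome with a demonstrated VR relationship, find the lowest VR that holds the outcome within its allowed increment above its level at the reference ventilation rate, then take the maximum across outcomes as the target MVR. The outcome-response functions and thresholds below are illustrative placeholders, not the report's fitted relationships.

```python
import numpy as np

RVR = 30.0  # reference ventilation rate, L/s per person

def sbs_symptoms(vr):          # building-related symptom prevalence (toy curve)
    return 0.15 + 0.9 / vr

def dissatisfied_iaq(vr):      # fraction dissatisfied with air quality (toy curve)
    return 0.10 + 1.5 / vr

# (outcome function, allowed increase above its level at the RVR)
OUTCOMES = [(sbs_symptoms, 0.05), (dissatisfied_iaq, 0.08)]

def target_mvr(outcomes, vrs=np.linspace(2, 30, 2000)):
    mvrs = []
    for f, allowed in outcomes:
        threshold = f(RVR) + allowed      # acceptable outcome level
        ok = vrs[f(vrs) <= threshold]     # VRs meeting this outcome's threshold
        mvrs.append(ok.min())             # lowest VR meeting the threshold
    return max(mvrs)                      # target MVR meets every threshold

print(round(target_mvr(OUTCOMES), 1), "L/s per person")
```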
Standardising Responsibility? The Significance of Interstitial Spaces.
Wickson, Fern; Forsberg, Ellen-Marie
2015-10-01
Modern society is characterised by rapid technological development that is often socially controversial and plagued by extensive scientific uncertainty concerning its socio-ecological impacts. Within this context, the concept of 'responsible research and innovation' (RRI) is currently rising to prominence in international discourse concerning science and technology governance. As this emerging concept of RRI begins to be enacted through instruments, approaches, and initiatives, it is valuable to explore what it is coming to mean for and in practice. In this paper we draw attention to a realm that is often backgrounded in the current discussions of RRI but which has a highly significant impact on scientific research, innovation and policy: namely, the interstitial space of international standardization. Drawing on the case of nanoscale sciences and technologies to make our argument, we present examples of how international standards are already entangled in the development of RRI and yet, how the process of international standardization itself largely fails to embody the norms proposed as characterizing RRI. We suggest that although current models for RRI provide a promising attempt to make research and innovation more responsive to societal needs, ethical values and environmental challenges, such approaches will need to encompass and address a greater diversity of innovation system agents and spaces if they are to prove successful in their aims.
Health effects of indoor odorants.
Cone, J E; Shusterman, D
1991-01-01
People assess the quality of the air indoors primarily on the basis of its odors and on their perception of associated health risk. The major current contributors to indoor odorants are human occupant odors (body odor), environmental tobacco smoke, volatile building materials, bio-odorants (particularly mold and animal-derived materials), air fresheners, deodorants, and perfumes. These are most often present as complex mixtures, making measurement of the total odorant problem difficult. There is no current method of measuring human body odor, other than by human panel studies of expert judges of air quality. Human body odors have been quantitated in terms of the "olf" which is the amount of air pollution produced by the average person. Another quantitative unit of odorants is the "decipol," which is the perceived level of pollution produced by the average human ventilated by 10 L/sec of unpolluted air or its equivalent level of dissatisfaction from nonhuman air pollutants. The standard regulatory approach, focusing on individual constituents or chemicals, is not likely to be successful in adequately controlling odorants in indoor air. Besides the current approach of setting minimum ventilation standards to prevent health effects due to indoor air pollution, a standard based on the olf or decipol unit might be more efficacious as well as simpler to measure. PMID:1821378
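The olf/decipol bookkeeping described above reduces to a one-line formula: since one decipol is defined as the pollution from one olf ventilated by 10 L/s of unpolluted air, perceived air quality scales as C = 10·G/Q. A minimal sketch, assuming a purely additive source model:

```python
def perceived_air_quality(source_strength_olf, clean_airflow_ls):
    """Perceived air quality in decipol: C = 10 * G / Q, where G is the
    total source strength in olf (one olf = one average person) and Q is
    the clean-air ventilation rate in L/s."""
    return 10.0 * source_strength_olf / clean_airflow_ls

# Example: 20 occupants (about 20 olf) in a space ventilated with
# 200 L/s of clean outdoor air -> 1.0 decipol.
print(perceived_air_quality(20, 200))
```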
The Elements of an Effective Software Development Plan - Software Development Process Guidebook
2011-11-11
...standards and practices required for all XMPL software development. This SDP implements the <corporate> Standard Software Process (SSP), as tailored... • Developing and integrating reusable software products • Approach to managing COTS/Reuse software implementation • COTS/Reuse software selection... final selection and submit to change board for approval. MAINTENANCE: Monitor current products for obsolescence or end of support; track new...
Toward a standard lexicon for ecosystem services
The complex, widely dispersed, and cumulative environmental challenges currently facing society require holistic, transdisciplinary approaches to resolve. The concept of ecosystem services (ES) has become more widely accepted both as a framework that cuts across the dimensions of...
Various approaches in EPR identification of gamma-irradiated plant foodstuffs: A review.
Aleksieva, Katerina I; Yordanov, Nicola D
2018-03-01
Irradiation of food is becoming a preferred method worldwide for sterilization and for extending shelf life. For the purposes of trade, and with regard to the rights of consumers, irradiated foodstuffs must be marked, and appropriate methods are needed for unambiguous identification of radiation treatment. One-third of the current European Union standards for identifying irradiated foods use electron paramagnetic resonance (EPR) spectroscopy. On the other hand, the current standards for irradiated foods of plant origin have some weaknesses, which have led to the development of new methodologies for the identification of irradiated food. New approaches for EPR identification of radiation treatment of herbs and spices, where the specific signal is absent or has disappeared after irradiation, are discussed. Direct EPR measurements of dried fruits and vegetables and different pretreatments for fresh samples are reviewed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Patient Safety Incident Reporting: Current Trends and Gaps Within the Canadian Health System.
Boucaud, Sarah; Dorschner, Danielle
2016-01-01
Patient safety incidents are a national-level phenomenon, requiring a pan-Canadian approach to ensure that incidents are reported and lessons are learned and broadly disseminated. This work explores the variation in current provincial and local approaches to reporting through a literature review. Trends are consolidated and recommendations are offered to foster better alignment of existing systems. These include adopting a common terminology, defining the patient role in reporting, increasing system users' perception of safety and further investigating the areas of home and community care in ensuring standard approaches at the local level. These steps can promote alignment, reducing barriers to a future pan-Canadian reporting and learning system.
Antibody-mediated delivery of therapeutics for cancer therapy.
Parakh, Sagun; Parslow, Adam C; Gan, Hui K; Scott, Andrew M
2016-01-01
Antibody-conjugated therapies (ACTs) combine the specificity of monoclonal antibodies to target cancer cells directly with highly potent payloads, often resulting in superior efficacy and/or reduced toxicity. This represents a new approach to the treatment of cancer. There have been highly promising clinical trial results using this approach with improvements in linker and payload technology. The breadth of current trials examining ACTs in haematological malignancies and solid tumours indicate the potential for clinical impact. This review will provide an overview of ACTs currently in clinical development as well as the principles of antibody delivery and types of payloads used, including cytotoxic drugs, radiolabelled isotopes, nanoparticle-based siRNA particles and immunotoxins. The focus of much of the clinical activity in ACTs has, understandably, been on their use as a monotherapy or in combination with standard of care drugs. This will continue, as will the search for better targets, linkers and payloads. Increasingly, as these drugs enter routine clinical care, important questions will arise regarding how to optimise ACT treatment approaches, including investigation of resistance mechanisms, biomarker and patient selection strategies, understanding of the unique toxicities of these drugs, and combinatorial approaches with standard therapies as well as emerging therapeutic agents like immunotherapy.
Evans, Travis C; Britton, Jennifer C
2018-09-01
Abnormal threat-related attention in anxiety disorders is most commonly assessed and modified using the dot-probe paradigm; however, poor psychometric properties of reaction-time measures may contribute to inconsistencies across studies. Typically, standard attention measures are derived using average reaction-times obtained in experimentally-defined conditions. However, current approaches based on experimentally-defined conditions are limited. In this study, the psychometric properties of a novel response-based computation approach to analyze dot-probe data are compared to standard measures of attention. 148 adults (19.19 ± 1.42 years, 84 women) completed a standardized dot-probe task including threatening and neutral faces. We generated both standard and response-based measures of attention bias, attentional orientation, and attentional disengagement. We compared overall internal consistency, number of trials necessary to reach internal consistency, test-retest reliability (n = 72), and criterion validity obtained using each approach. Compared to standard attention measures, response-based measures demonstrated uniformly high levels of internal consistency with relatively few trials and varying improvements in test-retest reliability. Additionally, response-based measures demonstrated specific evidence of anxiety-related associations above and beyond both standard attention measures and other confounds. Future studies are necessary to validate this approach in clinical samples. Response-based attention measures demonstrate superior psychometric properties compared to standard attention measures, which may improve the detection of anxiety-related associations and treatment-related changes in clinical samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
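For readers unfamiliar with the standard (experimentally-defined) measures the paper compares against, the sketch below computes them from trial-level reaction times. The condition labels and the bias/orientation/disengagement contrasts follow common dot-probe conventions; the response-based computation introduced in the paper is not reproduced here.

```python
import numpy as np

def dotprobe_scores(rt, cond):
    """Standard condition-mean dot-probe indices from reaction times (ms).
    Conditions: 'cong'   - probe replaces the threat face,
                'incong' - probe replaces the neutral face in a threat pair,
                'nn'     - neutral-neutral baseline trials."""
    m = {c: np.mean([r for r, cc in zip(rt, cond) if cc == c])
         for c in ("cong", "incong", "nn")}
    return {"bias": m["incong"] - m["cong"],           # attention bias
            "orientation": m["nn"] - m["cong"],        # faster orienting to threat
            "disengagement": m["incong"] - m["nn"]}    # slower disengaging from threat

rts = [512, 498, 530, 541, 505, 520]
conds = ["cong", "cong", "incong", "incong", "nn", "nn"]
print(dotprobe_scores(rts, conds))
```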
Vajuvalli, Nithin N; Nayak, Krupa N; Geethanath, Sairam
2014-01-01
Dynamic Contrast Enhanced Magnetic Resonance Imaging (DCE-MRI) is widely used in the diagnosis of cancer and is also a promising tool for monitoring tumor response to treatment. The Tofts model has become a standard for the analysis of DCE-MRI. The process of curve fitting employed in the Tofts equation to obtain the pharmacokinetic (PK) parameters is time-consuming for high resolution scans. The current work demonstrates a frequency-domain approach applied to the standard Tofts equation to speed up the process of curve fitting in order to obtain the pharmacokinetic parameters. The results obtained show that using the frequency-domain approach, the process of curve fitting is computationally more efficient than the time-domain approach.
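The speed-up rests on the standard Tofts equation being a convolution, Ct(t) = Ktrans · Cp(t) ⊗ exp(-kep·t), which becomes a pointwise product in the frequency domain. Below is a minimal sketch of one model evaluation via FFT; the input function and parameter values are illustrative, and the paper's specific fitting pipeline is not reproduced.

```python
import numpy as np

def tofts_fft(t, cp, ktrans, kep):
    """Evaluate the standard Tofts model Ct(t) = Ktrans * Cp(t) (*) exp(-kep*t)
    via FFT, so each trial evaluation inside curve fitting costs O(n log n)
    instead of the O(n^2) time-domain convolution.  Zero padding avoids
    circular wrap-around."""
    dt = t[1] - t[0]
    n = 2 * len(t)
    kernel = ktrans * np.exp(-kep * t)
    ct = np.fft.irfft(np.fft.rfft(cp, n) * np.fft.rfft(kernel, n), n) * dt
    return ct[: len(t)]

# Illustrative setup (parameter values hypothetical):
t = np.linspace(0, 5, 300)           # minutes
cp = 5.0 * t * np.exp(-t / 0.5)      # toy arterial input function
ct = tofts_fft(t, cp, ktrans=0.25, kep=0.6)
```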
Predicting ESI/MS Signal Change for Anions in Different Solvents.
Kruve, Anneli; Kaupmees, Karl
2017-05-02
LC/ESI/MS is a technique widely used for qualitative and quantitative analysis in various fields. However, quantification is currently possible only for compounds for which the standard substances are available, as the ionization efficiency of different compounds in ESI source differs by orders of magnitude. In this paper we present an approach for quantitative LC/ESI/MS analysis without standard substances. This approach relies on accurately predicting the ionization efficiencies in ESI source based on a model, which uses physicochemical parameters of analytes. Furthermore, the model has been made transferable between different mobile phases and instrument setups by using a suitable set of calibration compounds. This approach has been validated both in flow injection and chromatographic mode with gradient elution.
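The sketch below illustrates the general shape of such a standard-substance-free workflow: a regression from physicochemical descriptors to log ionization efficiency, followed by a calibration-compound line that transfers predictions to a new instrument or mobile phase. The descriptors, coefficients, and measured values are invented for illustration and do not come from the paper's model.

```python
import numpy as np

# Training data: rows are compounds, columns are descriptors (e.g., logP and
# a charge-delocalization index -- hypothetical choices and values).
train_X = np.array([[2.1, 0.35], [3.4, 0.10], [1.2, 0.55], [4.0, 0.05]])
train_y = np.array([1.8, 2.9, 0.9, 3.5])       # measured log ionization efficiency

coef, *_ = np.linalg.lstsq(np.c_[train_X, np.ones(len(train_X))], train_y, rcond=None)

def predict_log_ie(descriptors):
    """Linear model: log IE = descriptors . weights + intercept."""
    return np.dot(descriptors, coef[:-1]) + coef[-1]

# Calibration transfer: responses of calibration compounds measured on the
# new setup, regressed against their predicted log IE, give slope/intercept.
pred_cal = predict_log_ie(np.array([[2.5, 0.2], [3.1, 0.4]]))
meas_cal = np.array([2.3, 3.2])                # new-instrument responses (log scale)
slope, intercept = np.polyfit(pred_cal, meas_cal, 1)

analyte = predict_log_ie(np.array([2.8, 0.3]))
print(slope * analyte + intercept)             # transferred log IE estimate
```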
Orientation and mobility training for adults with low vision: a new standardized approach
Ballemans, Judith; Kempen, Gertrudis IJM
2013-01-01
Background: Orientation and mobility training aims to facilitate independent functioning and participation in the community of people with low vision. Objective: (1) To gain insight into current practice regarding orientation and mobility training, and (2) to develop a theory-driven standardized version of this training to teach people with low vision how to orientate and be safe in terms of mobility. Study of current practice: Insight into current practice and its strengths and weaknesses was obtained via reviewing the literature, observing orientation and mobility training sessions (n = 5) and interviewing Dutch mobility trainers (n = 18). Current practice was mainly characterized by an individual, face-to-face orientation and mobility training session concerning three components: crystallizing client’s needs, providing information and training skills. A weakness was the lack of a (structured) protocol based on evidence or theory. New theory-driven training: A new training protocol comprising two face-to-face sessions and one telephone follow-up was developed. Its content is partly based on the components of current practice, yet techniques from theoretical frameworks (e.g. social-cognitive theory and self-management) are incorporated. Discussion: A standardized, tailor-made orientation and mobility training for using the identification cane is available. The new theory-driven standardized training is generally applicable for teaching the use of every low-vision device. Its acceptability and effectiveness are currently being evaluated in a randomized controlled trial. PMID:22734105
Programmable, very low noise current source.
Scandurra, G; Cannatà, G; Giusi, G; Ciofi, C
2014-12-01
We propose a new approach for the realization of very low noise programmable current sources mainly intended for application in the field of low frequency noise measurements. The design is based on a low noise Junction Field Effect Transistor (JFET) acting as a high impedance current source, and programmability is obtained by resorting to a low noise, programmable floating voltage source that allows the sourced current to be set at the desired value. The floating voltage source is obtained by exploiting the properties of a standard photovoltaic MOSFET driver. Proper filtering and a control network employing super-capacitors reduce the low frequency output noise to that due to the low noise JFET down to frequencies as low as 100 mHz while, at the same time, allowing the desired current to be set by means of a standard DA converter with an accuracy better than 1%. A prototype of the system capable of supplying currents from a few hundreds of μA up to a few mA demonstrates the effectiveness of the approach we propose. When delivering a DC current of about 2 mA, the power spectral density of the current fluctuations at the output is found to be less than 25 pA/√Hz at 100 mHz and less than 6 pA/√Hz for f > 1 Hz, resulting in an RMS noise in the bandwidth from 0.1 to 10 Hz of less than 14 pA.
Pakzad-Vaezi, Kaivon; Mehta, Hemal; Mammo, Zaid; Tufail, Adnan
2016-07-01
Myopic choroidal neovascularization (CNV) is the most common cause of CNV in those under 50 years of age. It is a significant cause of visual loss in those with pathologic myopia. The current standard of care involves therapy with intravitreal inhibitors of vascular endothelial growth factor (VEGF). The epidemiology of myopia, high myopia, pathologic myopia, and myopic CNV is reviewed, along with a brief discussion of historical treatments. The pharmacology of the three most commonly used anti-VEGF agents is discussed, with an emphasis on the licensed drugs, ranibizumab and aflibercept. A comprehensive clinical approach to diagnosis and treatment of myopic CNV is presented. The current standard of care for myopic CNV is intravitreal inhibition of VEGF, with ranibizumab and aflibercept licensed for intraocular use. The diagnosis, OCT features of disease activity and retreatment algorithm for myopic CNV is different from wet age-related macular degeneration. In the long-term, myopic CNV may be associated with gradual, irreversible visual loss due to progressive chorioretinal atrophy, for which there is currently no treatment.
Burns, Lucinda; Coleman-Cowger, Victoria H.; Breen, Courtney
2016-01-01
Substance use in pregnancy can have adverse effects on mother and fetus alike. Australia and the US are countries with high levels of substance use and policies advising abstinence, although the Australian approach occurs within a broader framework of harm minimization. Less attention has been paid to treatment of the mothers’ substance use and what is considered gold standard. This is despite evidence that prior substance use in pregnancy is the most important factor in predicting future substance use in pregnancy. This paper draws together information from both the peer-reviewed and gray literature to provide a contemporary overview of patterns and outcomes of the three main drugs, alcohol, tobacco, and cannabis, used in Australia and the US during pregnancy and discusses what are considered gold standard screening and treatment approaches for these substances. This paper does not set out to be a comprehensive review of the area but rather aims to provide a concise summary of current guidelines for policy makers and practitioners who provide treatment for women who use substances in pregnancy. PMID:27980414
Pursiainen, S; Vorwerk, J; Wolters, C H
2016-12-21
The goal of this study is to develop focal, accurate and robust finite element method (FEM) based approaches which can predict the electric potential on the surface of the computational domain given its structure and internal primary source current distribution. While conducting an EEG evaluation, the placement of source currents to the geometrically complex grey matter compartment is a challenging but necessary task to avoid forward errors attributable to tissue conductivity jumps. Here, this task is approached via a mathematically rigorous formulation, in which the current field is modeled via divergence conforming H(div) basis functions. Both linear and quadratic functions are used while the potential field is discretized via the standard linear Lagrangian (nodal) basis. The resulting model includes dipolar sources which are interpolated into a random set of positions and orientations utilizing two alternative approaches: the position based optimization (PBO) and the mean position/orientation (MPO) method. These results demonstrate that the present dipolar approach can reach or even surpass, at least in some respects, the accuracy of two classical reference methods, the partial integration (PI) and St. Venant (SV) approach which utilize monopolar loads instead of dipolar currents.
Fast analytical spectral filtering methods for magnetic resonance perfusion quantification.
Reddy, Kasireddy V; Mitra, Abhishek; Yalavarthy, Phaneendra K
2016-08-01
The deconvolution in perfusion weighted imaging (PWI) plays an important role in quantifying the MR perfusion parameters. The application of PWI to stroke and brain tumor studies has become standard clinical practice. The standard approaches for this deconvolution are oscillatory-limited singular value decomposition (oSVD) and frequency domain deconvolution (FDD). The FDD is widely recognized as the fastest approach currently available for deconvolution of MR perfusion data. In this work, two fast deconvolution methods (namely analytical Fourier filtering and analytical Showalter spectral filtering) are proposed. Through systematic evaluation, the proposed methods are shown to be computationally efficient and quantitatively accurate compared to FDD and oSVD.
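A minimal sketch of frequency-domain deconvolution is shown below: the tissue curve is divided by the arterial input function in Fourier space, with a simple threshold filter standing in for the analytical spectral filters proposed in the paper. The signal shapes and the cutoff are illustrative assumptions.

```python
import numpy as np

def fdd_deconvolve(aif, conc, dt, snr_cutoff=0.1):
    """Frequency-domain deconvolution of a tissue concentration curve by the
    arterial input function (AIF).  Frequencies where the AIF spectrum is
    weak (noise-dominated) are zeroed -- a crude stand-in for a proper
    spectral filter.  Returns the scaled residue function and its peak,
    commonly used as the perfusion estimate."""
    n = 2 * len(aif)                           # zero-pad to avoid wrap-around
    A, C = np.fft.rfft(aif, n), np.fft.rfft(conc, n)
    mask = np.abs(A) > snr_cutoff * np.abs(A).max()
    R = np.zeros_like(A)
    R[mask] = C[mask] / A[mask]
    residue = np.fft.irfft(R, n)[: len(aif)] / dt
    return residue, residue.max()

# Toy example with synthetic curves (shapes are illustrative):
t = np.linspace(0, 60, 120)                    # seconds
aif = np.exp(-((t - 15) / 5) ** 2)             # toy arterial input function
conc = np.convolve(aif, 0.6 * np.exp(-t / 8))[: len(t)] * (t[1] - t[0])
residue, cbf = fdd_deconvolve(aif, conc, t[1] - t[0])
```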
Narrative Interest Standard: A Novel Approach to Surrogate Decision-Making for People With Dementia.
Wilkins, James M
2017-06-17
Dementia is a common neurodegenerative process that can significantly impair decision-making capacity as the disease progresses. When a person is found to lack capacity to make a decision, a surrogate decision-maker is generally sought to aid in decision-making. Typical bases for surrogate decision-making include the substituted judgment standard and the best interest standard. Given the heterogeneous and progressive course of dementia, however, these standards for surrogate decision-making are often insufficient in providing guidance for the decision-making for a person with dementia, escalating the likelihood of conflict in these decisions. In this article, the narrative interest standard is presented as a novel and more appropriate approach to surrogate decision-making for people with dementia. Through case presentation and ethical analysis, the standard mechanisms for surrogate decision-making for people with dementia are reviewed and critiqued. The narrative interest standard is then introduced and discussed as a dementia-specific model for surrogate decision-making. Through incorporation of elements of a best interest standard in focusing on the current benefit-burden ratio and elements of narrative to provide context, history, and flexibility for values and preferences that may change over time, the narrative interest standard allows for elaboration of an enriched context for surrogate decision-making for people with dementia. More importantly, however, a narrative approach encourages the direct contribution from people with dementia in authoring the story of what matters to them in their lives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolker, Eugene
Our project focused primarily on analysis of different types of data produced by global high-throughput technologies, data integration of gene annotation, and gene and protein expression information, as well as on getting a better functional annotation of Shewanella genes. Specifically, four of our numerous major activities and achievements include the development of: statistical models for identification and expression proteomics, superior to currently available approaches (including our own earlier ones); approaches to improve gene annotations on the whole-organism scale; standards for annotation, transcriptomics and proteomics approaches; and generalized approaches for data integration of gene annotation, gene and protein expression information.
Leveraging Genomics for Head and Neck Cancer Treatment.
Kemmer, J D; Johnson, D E; Grandis, J R
2018-06-01
The genomic landscape of head and neck squamous cell carcinoma (HNSCC) has been recently elucidated. Key epigenetic and genetic characteristics of this cancer have been reported and substantiated in multiple data sets, including those distinctive to the growing subset of human papilloma virus (HPV)-associated tumors. This increased understanding of the molecular underpinnings of HNSCC has not resulted in new approaches to treatment. Three Food and Drug Administration-approved molecular targeting agents are currently available to treat recurrent/metastatic disease, but these have exhibited efficacy only in subsets of HNSCC patients, and thus surgery, chemotherapy, and/or radiation remain as standard approaches. The lack of predictive biomarkers to any therapy represents an obstacle to achieving the promise of precision medicine. This review aims to familiarize the reader with current insights into the HNSCC genomic landscape, discuss the currently approved and promising molecular targeting agents under exploration in laboratories and clinics, and consider precision medicine approaches to HNSCC.
An overview of quantitative approaches in Gestalt perception.
Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H
2016-09-01
Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.
Endoscopic management of benign biliary strictures.
Rustagi, Tarun; Jamidar, Priya A
2015-01-01
Benign biliary strictures are a common indication for endoscopic retrograde cholangiopancreatography (ERCP). Endoscopic management has evolved over the last 2 decades as the current standard of care. The most common etiologies of strictures encountered are following surgery and those related to chronic pancreatitis. High-quality cross-sectional imaging provides a road map for endoscopic management. Currently, sequential placement of multiple plastic biliary stents represents the preferred approach. There is an increasing role for the treatment of these strictures using covered metal stents, but due to conflicting reports of efficacies as well as cost and complications, this approach should only be entertained following careful consideration. Optimal management of strictures is best achieved using a team approach with the surgeon and interventional radiologist playing an important role.
Heller myotomy for achalasia. From the open to the laparoscopic approach.
Allaix, Marco E; Patti, Marco G
2015-07-01
The last three decades have witnessed a progressive evolution in the surgical treatment of esophageal achalasia, with a shift from open to a minimally invasive Heller myotomy. The laparoscopic approach is currently the standard of care with better short-term outcomes and similar long-term functional results when compared to open surgery. More recently, the laparoscopic single-site approach and the use of the robot have been proposed to further improve the surgical outcome in achalasia patients.
Education in Disaster Management and Emergencies: Defining a New European Course.
Khorram-Manesh, Amir; Ashkenazi, Michael; Djalali, Ahmadreza; Ingrassia, Pier Luigi; Friedl, Tom; von Armin, Gotz; Lupesco, Olivera; Kaptan, Kubilay; Arculeo, Chris; Hreckovski, Boris; Komadina, Radko; Fisher, Philipp; Voigt, Stefan; James, James; Gursky, Elin
2015-06-01
Unremitting natural disasters, deliberate threats, pandemics, and humanitarian suffering resulting from conflict situations necessitate swift and effective response paradigms. The European Union's (EU) increasing visibility as a disaster response enterprise suggests the need not only for financial contribution but also for instituting a coherent disaster response approach and management structure. The DITAC (Disaster Training Curriculum) project identified deficiencies in current responder training approaches and analyzed the characteristics and content required for a new, standardized European course in disaster management and emergencies. Over 35 experts from within and outside the EU representing various organizations and specialties involved in disaster management composed the DITAC Consortium. These experts were also organized into 5 specifically tasked working groups. Extensive literature reviews were conducted to identify requirements and deficiencies and to craft a new training concept based on research trends and lessons learned. A pilot course and program dissemination plan was also developed. The lack of standardization was repeatedly highlighted as a serious deficiency in current disaster training methods, along with gaps in the command, control, and communication levels. A blended and competency-based teaching approach using exercises combined with lectures was recommended to improve intercultural and interdisciplinary integration. The goal of a European disaster management course should be to standardize and enhance intercultural and inter-agency performance across the disaster management cycle. A set of minimal standards and evaluation metrics can be achieved through consensus, education, and training in different units. The core of the training initiative will be a unit that presents a realistic situation "scenario-based training."
NASA Technical Reports Server (NTRS)
Rutishauser, David; Donohue, George L.; Haynie, Rudolph C.
2003-01-01
This paper presents data and a proposed new aircraft wake vortex separation standard that argues for a fundamental re-thinking of international practice. The current static standard, under certain atmospheric conditions, presents an unnecessary restriction on system capacity. A new approach, that decreases aircraft separation when atmospheric conditions dictate, is proposed based upon the availability of new instrumentation and a better understanding of wake physics.
An Earthling to an Astronaut: Medical Challenges
NASA Technical Reports Server (NTRS)
Davis, Jeffrey R.
2011-01-01
Humans can travel safely into space in low Earth orbit (LEO) or to near-Earth objects if several medical, physiological, environmental, and human factors risks are mitigated. Research must be performed in order to set standards in these four areas, and current NASA standards are contained in the Space Flight Human System Standards volumes 1 and 2, and crew medical certification standards. These three sets of standards drive all of the clinical, biomedical research and environmental technology development for the NASA human space flight program. These standards also drive the identification of specific risks to crew health and safety, and we currently manage 65 human system risks within the human space flight program. Each risk has a specific program of research, technology development, and development of operational procedures to mitigate the risks. Some of the more important risks that will be discussed in this talk include exposure to radiation, behavioral health challenges due to confinement in a closed cabin, physiological changes such as loss of bone, muscle and exercise capability, reduction in immune system capability, environmental threats of maintaining an adequate atmosphere and water for drinking, avoidance of toxic or infectious material, protection of hearing, and human factors issues of equipment and task design. A nutritious and varied food supply must also be provided. All of these risks will be discussed, along with current strategies for mitigating them for long-duration human space flight. In mitigating these 65 human system risks, novel approaches to problem solving must be employed to find the most appropriate research and technology based applications. Some risk mitigations are developed internally to NASA while others are found through research grants, technology procurements, and more recently open innovation techniques to seek solutions from the global technical community. Examples and results will be presented from all of these approaches, including the more recent use of prizes to stimulate innovation.
Analgesic Use in Nonhuman Primates Undergoing Neurosurgical Procedures
DiVincenti, Louis
2013-01-01
Animals experiencing major invasive surgery during biomedical research must receive appropriate and sufficient analgesia. The concept of pain management in veterinary medicine has evolved over the past several decades, and a multimodal, preemptive approach to postoperative analgesia is the current standard of care. Here, the pathophysiology of pain and a multimodal approach to analgesia for neurosurgical procedures is discussed, with emphasis on those involving nonhuman primates. PMID:23562027
MUSQA: a CS method to build a multi-standard quality management system
NASA Astrophysics Data System (ADS)
Cros, Elizabeth; Sneed, Isabelle
2002-07-01
CS Communication & Systèmes, through its long quality management experience, has been able to build and evolve its Quality Management System according to client requirements, norms, standards and models (ISO, DO178, ECSS, CMM, ...), evolving norms (the transition from ISO 9001:1994 to ISO 9001:2000) and the TQM approach currently being deployed. The aim of this paper is to show how, from this enriching and instructive experience, CS has defined and formalised its method: MuSQA (Multi-Standard Quality Approach). This method makes it possible to build a new Quality Management System or to simplify and unify an existing one. The MuSQA objective is to provide any organisation with an open Quality Management System, one which can evolve easily and prove a useful instrument for everyone, operational as well as non-operational staff.
Risk Analysis as Regulatory Science: Toward The Establishment of Standards
Murakami, Michio
2016-01-01
Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional ‘Standard I’, which has a paternalistic orientation, and ‘Standard II’, established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. PMID:27475751
Scialla, Michele A; Canter, Kimberly S; Chen, Fang Fang; Kolb, E Anders; Sandler, Eric; Wiener, Lori; Kazak, Anne E
2018-03-01
With published evidence-based Standards for Psychosocial Care for Children with Cancer and their Families, it is important to know the current status of their implementation. This paper presents data on delivery of psychosocial care related to the Standards in the United States. Pediatric oncologists, psychosocial leaders, and administrators in pediatric oncology from 144 programs completed an online survey. Participants reported on the extent to which psychosocial care consistent with the Standards was implemented and was comprehensive and state of the art. They also reported on specific practices and services for each Standard and the extent to which psychosocial care was integrated into broader medical care. Participants indicated that psychosocial care consistent with the Standards was usually or always provided at their center for most of the Standards. However, only half of the oncologists (55.6%) and psychosocial leaders (45.6%) agreed or strongly agreed that their psychosocial care was comprehensive and state of the art. Types of psychosocial care provided included evidence-based and less established approaches but were most often provided when problems were identified, rather than proactively. The perception of state of the art care was associated with practices indicative of integrated psychosocial care and the extent to which the Standards are currently implemented. Many oncologists and psychosocial leaders perceive that the delivery of psychosocial care at their center is consistent with the Standards. However, care is quite variable, with evidence for the value of more integrated models of psychosocial services. © 2017 Wiley Periodicals, Inc.
An Experimental Investigation of Cognitive Defusion
ERIC Educational Resources Information Center
Pilecki, Brian C.; McKay, Dean
2012-01-01
The current study compared cognitive defusion with other strategies in reducing the impact of experimentally induced negative emotional states. Sixty-seven undergraduates were assigned to one of three conditions (cognitive defusion, thought suppression, or control) and instructed in standardized approaches relevant to each condition before viewing…
Strategy for standardization of preeclampsia research study design.
Myatt, Leslie; Redman, Christopher W; Staff, Anne Cathrine; Hansson, Stefan; Wilson, Melissa L; Laivuori, Hannele; Poston, Lucilla; Roberts, James M
2014-06-01
Preeclampsia remains a major problem worldwide for mothers and babies. Despite intensive study, we have not been able to improve the management or early recognition of preeclampsia. At least part of this is because of failure to standardize the approach to studying this complex syndrome. It is possible that within the syndrome there may be different phenotypes with pathogenic pathways that differ between the subtypes. The capacity to recognize and to exploit different subtypes is of obvious importance for prediction, prevention, and treatment. We present a strategy for research to study preeclampsia, which will allow discrimination of such possible subtypes and also allow comparison and perhaps combinations of findings in different studies by standardized data and biosample collection. To make studies relevant to current clinical practice, the definition of preeclampsia can be that currently used and accepted. However, more importantly, sufficient data should be collected to allow other diagnostic criteria to be used and applied retrospectively. To that end, we present what we consider to be the minimum requirements for a data set in a study of preeclampsia that will facilitate comparisons. We also present a comprehensive or optimal data set for in-depth investigation of pathophysiology. As we approach the definition of phenotypes of preeclampsia by clinical and biochemical criteria, adherence to standardized protocols will hasten our understanding of the causes of preeclampsia and development of targeted treatment strategies.
Massive juvenile nasopharyngeal angiofibroma: ode to the open surgical approach.
Meher, Ravi; Arora, Nikhil; Bhargava, Eishaan Kamta; Juneja, Ruchika
2017-08-01
The management of juvenile nasopharyngeal angiofibroma has undergone a significant evolution, with more surgeons moving towards minimally invasive endoscopic approaches. Although considered the standard of care by most, an endoscopic approach may not be sufficient for extensive tumours, as exemplified by the current case of a young man presenting with the largest juvenile nasopharyngeal angiofibroma described in the English literature to date, which was eventually excised via an anterior external approach. © BMJ Publishing Group Ltd (unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Molecular and Nonmolecular Diagnostic Methods for Invasive Fungal Infections
Arvanitis, Marios; Anagnostou, Theodora; Fuchs, Beth Burgwyn; Caliendo, Angela M.
2014-01-01
SUMMARY Invasive fungal infections constitute a serious threat to an ever-growing population of immunocompromised individuals and other individuals at risk. Traditional diagnostic methods, such as histopathology and culture, which are still considered the gold standards, have low sensitivity, which underscores the need for the development of new means of detecting fungal infectious agents. Indeed, novel serologic and molecular techniques have been developed and are currently under clinical evaluation. Tests like the galactomannan antigen test for aspergillosis and the β-glucan test for invasive Candida spp. and molds, as well as other antigen and antibody tests, for Cryptococcus spp., Pneumocystis spp., and dimorphic fungi, have already been established as important diagnostic approaches and are implemented in routine clinical practice. On the other hand, PCR and other molecular approaches, such as matrix-assisted laser desorption ionization (MALDI) and fluorescence in situ hybridization (FISH), have proved promising in clinical trials but still need to undergo standardization before their clinical use can become widespread. The purpose of this review is to highlight the different diagnostic approaches that are currently utilized or under development for invasive fungal infections and to identify their performance characteristics and the challenges associated with their use. PMID:24982319
Legal ecotones: A comparative analysis of riparian policy protection in the Oregon Coast Range, USA.
Boisjolie, Brett A; Santelmann, Mary V; Flitcroft, Rebecca L; Duncan, Sally L
2017-07-15
Waterways of the USA are protected under the public trust doctrine, placing responsibility on the state to safeguard public resources for the benefit of current and future generations. This responsibility has led to the development of management standards for lands adjacent to streams. In the state of Oregon, policy protection for riparian areas varies by ownership (e.g., federal, state, or private), land use (e.g., forest, agriculture, rural residential, or urban) and stream attributes, creating varying standards for riparian land-management practices along the stream corridor. Here, we compare state and federal riparian land-management standards in four major policies that apply to private and public lands in the Oregon Coast Range. We use a standard template to categorize elements of policy protection: (1) the regulatory approach, (2) policy goals, (3) stream attributes, and (4) management standards. All four policies have similar goals for achieving water-quality standards, but differ in their regulatory approach. Plans for agricultural lands rely on outcome-based standards to treat pollution, in contrast with the prescriptive policy approaches for federal, state, and private forest lands, which set specific standards with the intent of preventing pollution. Policies also differ regarding the stream attributes considered when specifying management standards. Across all policies, 25 categories of unique standards are identified. Buffer widths vary from 0 to ∼152 m, with no buffer requirements for streams in agricultural areas or small, non-fish-bearing, seasonal streams on private forest land; narrow buffer requirements for small, non-fish-bearing perennial streams on private forest land (3 m); and the widest buffer requirements for fish-bearing streams on federal land (two site-potential tree-heights, up to an estimated 152 m). Results provide insight into how ecosystem concerns are addressed by variable policy approaches in multi-ownership landscapes, an important consideration for recovery-planning efforts for threatened species. Copyright © 2017 Elsevier Ltd. All rights reserved.
Vehicle track segmentation using higher order random fields
Quach, Tu -Thach
2017-01-09
Here, we present an approach to segment vehicle tracks in coherent change detection images, a product of combining two synthetic aperture radar images taken at different times. The approach uses multiscale higher order random field models to capture track statistics, such as curvatures and their parallel nature, that are not currently utilized in existing methods. These statistics are encoded as 3-by-3 patterns at different scales. The model can complete disconnected tracks often caused by sensor noise and various environmental effects. Coupling the model with a simple classifier, our approach is effective at segmenting salient tracks. We improve the F-measure on a standard vehicle track data set to 0.963, up from 0.897 obtained by the current state-of-the-art method.
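The 3-by-3 pattern statistics lend themselves to a compact sketch. The following is illustrative only: the encoding, the naive downsampling, and the toy data are assumptions, not the authors' implementation. It counts binary 3×3 patterns of a track mask at several scales, the kind of feature a higher-order random field model could attach potentials to.

```python
import numpy as np

def patterns_3x3(mask: np.ndarray, scales=(1, 2, 4)):
    """Count binary 3x3 patterns in a track mask at several scales.

    Each 3x3 window is encoded as a 9-bit integer (512 possible
    patterns); a higher-order model could score each pattern, e.g.
    favouring elongated, parallel structures. Hypothetical helper,
    not the paper's implementation.
    """
    counts = {}
    weights = (2 ** np.arange(9)).reshape(3, 3)
    for s in scales:
        coarse = mask[::s, ::s].astype(np.uint16)  # naive downsampling
        hist = np.zeros(512, dtype=np.int64)
        for i in range(coarse.shape[0] - 2):
            for j in range(coarse.shape[1] - 2):
                code = int((coarse[i:i + 3, j:j + 3] * weights).sum())
                hist[code] += 1
        counts[s] = hist
    return counts

# Toy example: a diagonal "track" in a 16x16 image.
mask = np.eye(16, dtype=np.uint8)
hist = patterns_3x3(mask)
print(hist[1].argmax())  # dominant pattern code at scale 1 (empty window, code 0)
```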
Alternative approach for fire suppression of class A, B and C fires in gloveboxes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberger, Mark S; Tsiagkouris, James A
2011-02-10
Department of Energy (DOE) Orders and National Fire Protection Association (NFPA) Codes and Standards require fire suppression in gloveboxes. Several potential solutions have been and are currently being considered at Los Alamos National Laboratory (LANL). The objective is to provide reliable, minimally invasive, and seismically robust fire suppression capable of extinguishing Class A, B, and C fires; achieve compliance with DOE and NFPA requirements; and provide value-added improvements to fire safety in gloveboxes. This report provides a brief summary of current approaches and also documents the successful fire tests conducted to prove that one approach, specifically Fire Foe™ tubes, is capable of achieving the requirement to provide reliable fire protection in gloveboxes in a cost-effective manner.
NASA Astrophysics Data System (ADS)
Kehlenbeck, Matthias; Breitner, Michael H.
Business users define calculated facts based on the dimensions and facts contained in a data warehouse. These business calculation definitions contain necessary knowledge regarding quantitative relations for deep analyses and for the production of meaningful reports. The business calculation definitions are largely independent of implementation and organization, but no automated procedures exist to facilitate their exchange across organization and implementation boundaries. Each organization currently has to map its own business calculations to analysis and reporting tools separately. This paper presents an innovative approach based on standard Semantic Web technologies. This approach facilitates the exchange of business calculation definitions and allows for their automatic linking to specific data warehouses through semantic reasoning. A novel standard proxy server which enables the immediate application of exchanged definitions is introduced. Benefits of the approach are shown in a comprehensive case study.
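As a rough sketch of what an exchangeable, implementation-independent calculation definition could look like with standard Semantic Web tooling (the vocabulary, names, and measure concepts below are invented for illustration, not the authors' ontology):

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# Hypothetical vocabulary for calculation definitions (an assumption).
CALC = Namespace("http://example.org/calc#")

g = Graph()
g.bind("calc", CALC)

# "Gross margin = revenue - cost of goods sold", defined against
# abstract measure concepts rather than warehouse-specific columns.
gm = CALC.GrossMargin
g.add((gm, RDF.type, CALC.CalculatedFact))
g.add((gm, CALC.operator, Literal("subtract")))
g.add((gm, CALC.minuend, CALC.Revenue))
g.add((gm, CALC.subtrahend, CALC.CostOfGoodsSold))

# Serialize for exchange; a receiving organization could map
# calc:Revenue and calc:CostOfGoodsSold onto its own warehouse
# measures via semantic reasoning, as the paper proposes.
print(g.serialize(format="turtle"))
```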
Kranz, J; Sommer, K-J; Steffens, J
2014-05-01
Patient safety and risk/complication management rank among the current megatrends in modern medicine, which has undoubtedly become more complex. In time-critical, error-prone and difficult situations, which occur repeatedly in everyday clinical practice, guidelines are inappropriate for acting rapidly and intelligently. The establishment and consistent use of standard operating procedures, as in commercial aviation, offers a possible strategic approach. These medical decision-making aids - quick reference cards - are short, optimized instructions that enable a standardized procedure in the event of medical claims.
Current Approaches in the Treatment of Relapsed and Refractory Acute Myeloid Leukemia
Ramos, Nestor R.; Mo, Clifton C.; Karp, Judith E.; Hourigan, Christopher S.
2015-01-01
The limited sensitivity of the historical treatment response criteria for acute myeloid leukemia (AML) has resulted in a different paradigm for treatment compared with most other cancers presenting with widely disseminated disease. Initial cytotoxic induction chemotherapy is often able to reduce tumor burden to a level sufficient to meet the current criteria for “complete” remission. Nevertheless, most AML patients ultimately die from their disease, most commonly as clinically evident relapsed AML. Despite a variety of available salvage therapy options, prognosis in patients with relapsed or refractory AML is generally poor. In this review, we outline the commonly utilized salvage cytotoxic therapy interventions and then highlight novel investigational efforts currently in clinical trials using both pathway-targeted agents and immunotherapy-based approaches. We conclude that there is no current standard of care for adult relapsed or refractory AML other than offering referral to an appropriate clinical trial. PMID:25932335
Thrust reverser analysis for implementation in the Aviation Environmental Design Tool (AEDT)
DOT National Transportation Integrated Search
2007-06-01
This letter report presents an updated implementation for thrust reversers in AEDT. Currently, thrust reverser is applied to all STANDARD approach profiles in the Integrated Noise Model (INM) as 60% of the max rated thrust for jets and 40% for props o...
A Clinical Approach to Antioxidant Therapy: Hypertonic Fluid Resuscitation Trial
2003-06-01
Fragmentary abstract text: ... limited forward surgical care and delayed evacuation. Current fluid resuscitation standard of care: by virtue of clinical experience, low cost ... bleeding, thereby potentially increasing mortality. Indeed, evidence from experimental animal studies suggests that small-volume hypotensive ...
Is it time to reassess current safety standards for glyphosate-based herbicides?
Blumberg, Bruce; Antoniou, Michael N; Benbrook, Charles M; Carroll, Lynn; Colborn, Theo; Everett, Lorne G; Hansen, Michael; Landrigan, Philip J; Lanphear, Bruce P; Mesnage, Robin; vom Saal, Frederick S; Welshons, Wade V; Myers, John Peterson
2017-01-01
Use of glyphosate-based herbicides (GBHs) increased ∼100-fold from 1974 to 2014. Additional increases are expected due to widespread emergence of glyphosate-resistant weeds, increased application of GBHs, and preharvest uses of GBHs as desiccants. Current safety assessments rely heavily on studies conducted over 30 years ago. We have considered information on GBH use, exposures, mechanisms of action, toxicity and epidemiology. Human exposures to glyphosate are rising, and a number of in vitro and in vivo studies challenge the basis for the current safety assessment of glyphosate and GBHs. We conclude that current safety standards for GBHs are outdated and may fail to protect public health or the environment. To improve safety standards, the following are urgently needed: (1) human biomonitoring for glyphosate and its metabolites; (2) prioritisation of glyphosate and GBHs for hazard assessments, including toxicological studies that use state-of-the-art approaches; (3) epidemiological studies, especially of occupationally exposed agricultural workers, pregnant women and their children and (4) evaluations of GBHs in commercially used formulations, recognising that herbicide mixtures likely have effects that are not predicted by studying glyphosate alone. PMID:28320775
An Introduction to Message-Bus Architectures for Space Systems
NASA Technical Reports Server (NTRS)
Smith, Danford; Gregory, Brian
2005-01-01
This course presents technical and programmatic information on the development of message-based architectures for space mission ground and flight software systems. Message-based architecture approaches provide many significant advantages over the more traditional socket-based, one-of-a-kind integrated system development approaches. The course provides an overview of publish/subscribe concepts, the use of common isolation-layer APIs, approaches to message standardization, and other technical topics. Several examples of currently operational systems are discussed, along with possible changes to those systems, and time is provided for questions and answers.
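As a rough sketch of the publish/subscribe concept behind such message-bus architectures (a generic illustration; the class, topic names, and messages are invented and not part of any NASA system):

```python
from collections import defaultdict
from typing import Callable

class MessageBus:
    """Toy in-process publish/subscribe bus.

    Components register interest in a topic instead of opening
    point-to-point sockets to each other; the bus decouples
    publishers from subscribers behind one isolation-layer API.
    """
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)

bus = MessageBus()
# A display and an archiver both receive the same message without the
# publisher knowing either exists (topic name is hypothetical).
bus.subscribe("telemetry.housekeeping", lambda m: print("display:", m))
bus.subscribe("telemetry.housekeeping", lambda m: print("archive:", m))
bus.publish("telemetry.housekeeping", {"bus_voltage": 28.1})
```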
Mignogna, Joseph; Stanley, Melinda A.; Davila, Jessica; Wear, Jackie; Amico, K. Rivet; Giordano, Thomas P.
2012-01-01
Although peer interventionists have been successful in medication treatment-adherence interventions, their role in complex behavior-change approaches to promote entry and reentry into HIV care requires further investigation. The current study sought to describe and test the feasibility of a standardized peer-mentor training program used for MAPPS (Mentor Approach for Promoting Patient Self-Care), a study designed to increase engagement and attendance at HIV outpatient visits among high-risk HIV inpatients using HIV-positive peer interventionists to deliver a comprehensive behavioral change intervention. Development of MAPPS and its corresponding training program included collaborations with mentors from a standing outpatient mentor program. The final training program included (1) a half-day workshop; (2) practice role-plays; and (3) formal, standardized patient role-plays, using trained actors with “real-time” video observation (and ratings from trainers). Mentor training occurred over a 6-week period and required demonstration of adherence and skill, as rated by MAPPS trainers. Although time intensive, ultimate certification of mentors suggested the program was both feasible and effective. Survey data indicated mentors thought highly of the training program, while objective rating data from trainers indicated mentors were able to understand and display standards associated with intervention fidelity. Data from the MAPPS training program provide preliminary evidence that peer mentors can be trained to levels necessary to ensure intervention fidelity, even within moderately complex behavioral-change interventions. Although additional research is needed due to limitations of the current study (e.g., limited generalizability due to sample size and limited breadth of clinical training opportunities), data from the current trial suggest that training programs such as MAPPS appear both feasible and effective. PMID:22248331
A service-oriented approach to assessing the infrastructure value index.
Amaral, R; Alegre, H; Matos, J S
Many national and regional administrations are currently facing challenges to ensure the long-term sustainability of urban water services, as infrastructures continue to accumulate alarming levels of deferred maintenance and rehabilitation. The infrastructure value index (IVI) has proven to be an effective tool to support long-term planning, in particular by facilitating the ability to communicate and to create awareness. It is given by the ratio between the current value of an infrastructure and its replacement cost. Current value is commonly estimated according to an asset-oriented approach, which is based on the concept of the useful life of individual components. The standard values assumed for the useful lives can vary significantly, which leads to correspondingly different valuations. Furthermore, with water companies increasingly focused on the customer, effective service-centric asset management is essential now more than ever. This paper shows results of on-going research work, which aims to explore a service-oriented approach for assessing the IVI. The paper presents the fundamentals underlying this approach, discusses and compares results obtained from both perspectives and points to challenges that still need to be addressed.
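As a worked illustration of the index described above (the ratio follows the abstract; the component data and the straight-line depreciation are assumptions made for the sketch):

```python
def infrastructure_value_index(components):
    """IVI = current value of the infrastructure / replacement cost.

    Asset-oriented sketch: each component's current value is its
    replacement cost scaled by remaining useful life (straight-line
    depreciation assumed for illustration only).
    """
    replacement = sum(c["replacement_cost"] for c in components)
    current = sum(
        c["replacement_cost"] * max(0.0, 1.0 - c["age"] / c["useful_life"])
        for c in components
    )
    return current / replacement

# Hypothetical pipe cohort: the same network valued under two different
# assumed useful lives shows how strongly the standard value matters.
pipes_60y = [{"replacement_cost": 1e6, "age": 40, "useful_life": 60}]
pipes_80y = [{"replacement_cost": 1e6, "age": 40, "useful_life": 80}]
print(infrastructure_value_index(pipes_60y))  # ~0.33
print(infrastructure_value_index(pipes_80y))  # 0.50
```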
SLAE–CPS: Smart Lean Automation Engine Enabled by Cyber-Physical Systems Technologies
Ma, Jing; Wang, Qiang; Zhao, Zhibiao
2017-01-01
In the context of Industry 4.0, the demand for the mass production of highly customized products will lead to complex products and an increasing demand for production system flexibility. Simply implementing lean production-based human-centered production or high automation to improve system flexibility is insufficient. Currently, lean automation (Jidoka) that utilizes cyber-physical systems (CPS) is considered a cost-efficient and effective approach for improving system flexibility under shrinking global economic conditions. Therefore, a smart lean automation engine enabled by CPS technologies (SLAE–CPS), which is based on an analysis of Jidoka functions and the smart capacity of CPS technologies, is proposed in this study to provide an integrated and standardized approach to design and implement a CPS-based smart Jidoka system. A set of comprehensive architecture and standardized key technologies should be presented to achieve the above-mentioned goal. Therefore, a distributed architecture that joins service-oriented architecture, agent, function block (FB), cloud, and Internet of things is proposed to support the flexible configuration, deployment, and performance of SLAE–CPS. Then, several standardized key techniques are proposed under this architecture. The first one is for converting heterogeneous physical data into uniform services for subsequent abnormality analysis and detection. The second one is a set of Jidoka scene rules, which is abstracted based on the analysis of the operator, machine, material, quality, and other factors in different time dimensions. These Jidoka rules can support executive FBs in performing different Jidoka functions. Finally, supported by the integrated and standardized approach of our proposed engine, a case study is conducted to verify the current research results. The proposed SLAE–CPS can serve as an important reference value for combining the benefits of innovative technology and proper methodology. PMID:28657577
Jones, W Schuyler; Krucoff, Mitchell W; Morales, Pablo; Wilgus, Rebecca W; Heath, Anne H; Williams, Mary F; Tcheng, James E; Marinac-Dabic, J Danica; Malone, Misti L; Reed, Terrie L; Fukaya, Rie; Lookstein, Robert; Handa, Nobuhiro; Aronow, Herbert D; Bertges, Daniel J; Jaff, Michael R; Tsai, Thomas T; Smale, Joshua A; Zaugg, Margo J; Thatcher, Robert J; Cronenwett, Jack L
2018-01-25
The current state of evaluating patients with peripheral artery disease, and more specifically of evaluating medical devices used for peripheral vascular intervention (PVI), remains challenging because of the heterogeneity of the disease process, the multiple physician specialties that perform PVI, the multitude of devices available to treat peripheral artery disease, and the lack of consensus about the best treatment approaches. Because PVI core data elements are not standardized across clinical care, clinical trials, and registries, aggregation of data across different data sources and physician specialties is currently not feasible. Methods and Results: Under the auspices of the U.S. Food and Drug Administration's Medical Device Epidemiology Network initiative and its PASSION (Predictable and Sustainable Implementation of the National Registries) program, in conjunction with other efforts to align clinical data standards, the Registry Assessment of Peripheral Interventional Devices (RAPID) workgroup was convened. RAPID is a collaborative, multidisciplinary effort to develop a consensus lexicon and to promote interoperability across clinical care, clinical trials, and national and international registries of PVI. The current manuscript presents the initial work from RAPID to standardize clinical data elements and definitions, to establish a framework within electronic health records and health information technology procedural reporting systems, and to implement an informatics-based approach to promote the conduct of pragmatic clinical trials and registry efforts in PVI. Ultimately, we hope this work will facilitate and improve device evaluation and surveillance for patients, clinicians, health outcomes researchers, industry, policymakers, and regulators.
O'Connor Mooney, Rory; Davis, Niall Francis; Hoey, David; Hogan, Lisa; McGloughlin, Timothy M; Walsh, Michael T
2016-01-01
To investigate the repeatability of automatic decellularisation of porcine aortae using a non-enzymatic approach, addressing current limitations associated with other automatic decellularisation processes. Individual porcine aortae (n = 3) were resected and every third segment (n = 4) was allocated to one of three different groups: a control or a manually or automatically decellularised group. Manual and automatic decellularisation was performed using Triton X-100 (2% v/v) and sodium deoxycholate. Protein preservation and the elimination of a galactosyl-α(1,3)galactose (GAL) epitope were measured using immunohistochemistry and protein binding assays. The presence of residual DNA was determined with gel electrophoresis and spectrophotometry. Scaffold integrity was characterised with scanning electron microscopy and uni-axial tensile testing. Manual and automatic results were compared to one another, to control groups and to current gold standards. The results were comparable to those of current gold standard decellularisation techniques. Successful repeatability was achieved, both manually and automatically, with little effect on mechanical characteristics. Complete acellularity was not confirmed in either decellularisation group. Protein preservation was consistent in both the manually and automatically decellularised groups and between each individual aorta. Elimination of GAL was not achieved. Repeatable automatic decellularisation of porcine aortae is feasible using a Triton X-100-sodium deoxycholate protocol. Protein preservation was satisfactory; however, gold standard thresholds for permissible residual DNA levels were not achieved. Future research will focus on addressing this issue by optimisation of the existing protocol for thick tissues. © 2016 S. Karger AG, Basel.
Management of obstructive sleep apnea in the indigent population: a deviation of standard of care?
Hamblin, John S; Sandulache, Vlad C; Alapat, Philip M; Takashima, Masayoshi
2014-03-01
Patients with obstructive sleep apnea (OSA) typically are best managed via a multidisciplinary approach, involving otolaryngologists, sleep psychologists/psychiatrists, pulmonologists, neurologists, oral surgeons, and sleep-trained dentists. By utilizing these resources, one can fashion a treatment individualized to the patient, giving rise to the holistic phrase "personalized medicine." Unfortunately, in situations and environments with limited resources, the treatment options in an otolaryngologist's armamentarium are restricted--typically to continuous positive airway pressure (CPAP) versus sleep surgery. However, a recent patient encounter highlighted here shows how a hospital's reimbursement policy effectively steered a patient's medical management toward sleep surgery. This occurred although the current gold standard for the initial treatment of OSA is CPAP. Changing the course of medical/surgical management by selectively restricting funding is a cause for concern, especially when it prompts patients to choose a treatment option that is not considered the current standard of care.
A Community Standard: Equivalency of Healthcare in Australian Immigration Detention.
Essex, Ryan
2017-08-01
The Australian government has long maintained that the standard of healthcare provided in its immigration detention centres is broadly comparable with health services available within the Australian community. Drawing on the literature from prison healthcare, this article examines (1) whether the principle of equivalency is being applied in Australian immigration detention and (2) whether this standard of care is achievable given Australia's current policies. This article argues that the principle of equivalency is not being applied and that this standard of health and healthcare will remain unachievable in Australian immigration detention without significant reform. Alternate approaches to addressing the well documented issues related to health and healthcare in Australian immigration detention are discussed.
An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.
Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2010-10-01
The communication between health information systems of hospitals and primary care organizations is currently an important challenge to improve the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual model approach, which distinguishes information and knowledge, the latter being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.
Dyadic OPTION: Measuring perceptions of shared decision-making in practice.
Melbourne, Emma; Roberts, Stephen; Durand, Marie-Anne; Newcombe, Robert; Légaré, France; Elwyn, Glyn
2011-04-01
Current models of the medical consultation emphasize shared decision-making (SDM), whereby the expertise of both the doctor and the patient is recognised and seen to contribute equally to the consultation. The evidence regarding the desirability and effectiveness of the SDM approach is often conflicting. It is proposed that these conflicts are due to the nature of assessment, with current assessments taken from the perspective of an outside observer. To empirically assess perceived involvement in the medical consultation using the dyadic OPTION instrument, 36 simulated medical consultations were organised between general practitioners and standardized patients, using the observer OPTION and the newly developed dyadic OPTION instruments. SDM behaviours observed in the consultations were seen to depend on both members of the doctor-patient dyad, rather than each in isolation; thus a dyadic approach to measurement is supported. The current study highlights the necessity for a dyadic approach to assessment and introduces a novel research instrument: the dyadic OPTION instrument. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Up Front: Students Are Chafing under "Test Stress."
ERIC Educational Resources Information Center
American School Board Journal, 2001
2001-01-01
Although President Bush favors continuous testing, headlines reflect an intense, growing antitesting sentiment. One standard does not fit all, current systems are malfunctioning, and kids are short-changed. A recent report says abstinence-only sex education is ineffective; high teen birth rates underline the need for comprehensive approaches. (MLH)
Hassanein, Tarek
2017-04-01
Hepatic encephalopathy is a devastating complication of end-stage liver disease. In its severe grades it requires intervention beyond the standard medical approaches. In this article we review the role of liver support systems in managing hepatic encephalopathy.
On Present State of Teaching Russian Language in Russia
ERIC Educational Resources Information Center
Tekucheva, Irina V.; Gromova, Liliya Y.
2016-01-01
The article discusses the current state of teaching Russian language, discovers the nature of philological education, outlines the main problems of the implementation of the standard in school practice, analyzes the problems of formation of universal educational actions within the context of the implementation of cognitive-communicative approach,…
New Themes and Approaches in Second Language Motivation Research.
ERIC Educational Resources Information Center
Dornyei, Zoltan
2001-01-01
Provides an overview of the current themes and research directions in second language motivation research. Argues that the initial research inspiration and standard-setting empirical work on second language motivation originating from Canada has borne fruit by educating a new generation of international scholars who have created a colorful mixture…
Your Science Classroom: Becoming an Elementary/Middle School Science Teacher
ERIC Educational Resources Information Center
Goldston, M. Jenice; Downey, Laura
2012-01-01
Designed around a practical "practice-what-you-teach" approach to methods instruction, "Your Science Classroom: Becoming an Elementary/Middle School Science Teacher" is based on current constructivist philosophy, organized around 5E inquiry, and guided by the National Science Education Teaching Standards. Written in a reader-friendly style, the…
The Impact of the AACTE-Microsoft Grant on Elementary Reading & Writing
ERIC Educational Resources Information Center
Borgia, Laurel; Cheek, Earl H., Jr.
2005-01-01
Accountability for student learning and support of evidence-based instructional approaches are critical responsibilities for teachers. Both are particularly significant with the current reliance on state standards, assessment tests and the No Child Left Behind Act (Shanahan 2002). Every elementary teacher must have research-based resources to help…
ARTISTIC Critique: A Practical Approach to Viewing Dance
ERIC Educational Resources Information Center
Harris, Typhani
2013-01-01
Content Literacy, 21st Century Skills, and Common Core Standards are quickly becoming the buzz of current public education initiatives. As these new policies dictate educational reform, public schools are hustling to find meaning, definition, and accountability for these future expectations. Content literacy goes beyond the ability to read and…
Positive animal welfare states and reference standards for welfare assessment.
Mellor, D J
2015-01-01
Developments in affective neuroscience and behavioural science during the last 10-15 years have together made it increasingly apparent that sentient animals are potentially much more sensitive to their environmental and social circumstances than was previously thought to be the case. It therefore seems likely that both the range and magnitude of welfare trade-offs that occur when animals are managed for human purposes have been underestimated even when minimalistic but arguably well-intentioned attempts have been made to maintain high levels of welfare. In light of these neuroscience-supported behaviour-based insights, the present review considers the extent to which the use of currently available reference standards might draw attention to these previously neglected areas of concern. It is concluded that the natural living orientation cannot provide an all-embracing or definitive welfare benchmark because of its primary focus on behavioural freedom. However assessments of this type, supported by neuroscience insights into behavioural motivation, may now carry greater weight when used to identify management practices that should be avoided, discontinued or substantially modified. Using currently accepted baseline standards as welfare reference points may result in small changes being accorded greater significance than would be the case if they were compared with higher standards, and this could slow the progress towards better levels of welfare. On the other hand, using "what animals want" as a reference standard has the appeal of focusing on the specific resources or conditions the animals would choose themselves and can potentially improve their welfare more quickly than the approach of making small increments above baseline standards. It is concluded that the cautious use of these approaches in different combinations could lead to recommendations that would more effectively promote positive welfare states in hitherto neglected areas of concern.
A process-based standard for the Solar Energetic Particle Event Environment
NASA Astrophysics Data System (ADS)
Gabriel, Stephen
For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several models available should be selected as the standard. Most of these discussions at the ISO WG4 meetings and conferences have centred on the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a 'process-based' standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with the presence of different models, which not only could have quite distinct modelling approaches but could also be based on different data sets. In essence, a process-based standard overcomes these issues by allowing there to be more than one model rather than a single standard model; however, any such model has to be completely transparent, in that the data set and the modelling techniques used have to be not only clearly and unambiguously defined but also subject to peer review. If the model meets all of these requirements, then it should be acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? It does not remove all of the differences, but it allows something which so far has been impossible without ambiguities and disagreement: a comparison of the results of the various models. To date, one of the problems (if not the major one) in comparing the results of the various SEPE statistical models has been caused by two things: 1) the data set and 2) the definition of an event. Because unravelling the dependencies of the outputs of different statistical models on these two parameters is extremely difficult if not impossible, comparison of the results from the different models is currently also extremely difficult and can lead to controversies, especially over which model is the correct one. Hence, when it comes to using these models for engineering purposes to calculate, for example, the radiation dose for a particular mission, the user, who in all likelihood is not an expert in this field, could be given two (or even more) very different environments and find it impossible to know how to select one (or even how to compare them). What is proposed, then, is a process-based standard which, in common with nearly all of the current models, is composed of three elements: a standard data set, a standard event definition and a resulting standard event list. The standard event list is the output of this standard and can then be used with any of the existing (or indeed future) models that are based on events. This standard event list is completely traceable and transparent and represents a reference event list for the whole community. When coupled with a statistical model, the results when compared will depend only on the statistical model and not on the data set or event definition.
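To make the role of the event definition concrete, here is a minimal sketch of turning a flux time series into a standard event list. The threshold, gap rule, and data are invented for illustration and are not from any ISO draft; the point is that changing either parameter changes the event list, and hence every statistical model fitted to it.

```python
import numpy as np

def extract_events(flux, threshold=10.0, min_gap=2):
    """Turn a flux time series into an event list of (start, end) indices.

    An 'event' here starts when flux exceeds `threshold` and ends once
    it stays below for `min_gap` consecutive samples; both parameters
    are invented for illustration.
    """
    events, start, below = [], None, 0
    for t, f in enumerate(flux):
        if f > threshold:
            if start is None:
                start = t
            below = 0
        elif start is not None:
            below += 1
            if below >= min_gap:
                events.append((start, t - min_gap))
                start, below = None, 0
    if start is not None:
        events.append((start, len(flux) - 1))
    return events

rng = np.random.default_rng(0)
flux = rng.lognormal(mean=1.0, sigma=1.5, size=200)
print(extract_events(flux, threshold=10.0))
print(extract_events(flux, threshold=30.0))  # a different "standard", a different list
```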
Hysteroscopic Sterilization: History and Current Methods
Greenberg, James A
2008-01-01
For many practicing obstetrician-gynecologists, tubal ligation was the gold standard by which female sterilization techniques were measured. Yet gynecologic surgeons have simultaneously sought to occlude the fallopian tubes transcervically to avoid discomfort and complications associated with transabdominal approaches. In this review, the history of transcervical sterilization is discussed. Past, current, and upcoming techniques are reviewed. This article focuses on interval sterilization techniques, thus removing post-vaginal and post-cesarean delivery tubal ligations from the discussion. PMID:19015762
Thomson, G R; Penrith, M-L; Atkinson, M W; Thalwitzer, S; Mancuso, A; Atkinson, S J; Osofsky, S A
2013-12-01
A case is made for greater emphasis to be placed on value chain management as an alternative to geographically based disease risk mitigation for trade in commodities and products derived from animals. The geographic approach is dependent upon achievement of freedom in countries or zones from infectious agents that cause so-called transboundary animal diseases, while value chain-based risk management depends upon mitigation of animal disease hazards potentially associated with specific commodities or products irrespective of the locality of production. This commodity-specific approach is founded on the same principles upon which international food safety standards are based, viz. hazard analysis critical control points (HACCP). Broader acceptance of a value chain approach enables animal disease risk management to be combined with food safety management by the integration of commodity-based trade and HACCP methodologies and thereby facilitates 'farm to fork' quality assurance. The latter is increasingly recognized as indispensable to food safety assurance and is therefore a pre-condition to safe trade. The biological principles upon which HACCP and commodity-based trade are based are essentially identical, potentially simplifying sanitary control in contrast to current separate international sanitary standards for food safety and animal disease risks that are difficult to reconcile. A value chain approach would not only enable more effective integration of food safety and animal disease risk management of foodstuffs derived from animals but would also ameliorate adverse environmental and associated socio-economic consequences of current sanitary standards based on the geographic distribution of animal infections. This is especially the case where vast veterinary cordon fencing systems are relied upon to separate livestock and wildlife as is the case in much of southern Africa. A value chain approach would thus be particularly beneficial to under-developed regions of the world such as southern Africa specifically and sub-Saharan Africa more generally where it would reduce incompatibility between attempts to expand and commercialize livestock production and the need to conserve the subcontinent's unparalleled wildlife and wilderness resources. © 2013 Blackwell Verlag GmbH.
Localized renal cell carcinoma management: an update.
Heldwein, Flavio L; McCullough, T Casey; Souto, Carlos A V; Galiano, Marc; Barret, Eric
2008-01-01
To review the current modalities of treatment for localized renal cell carcinoma. A literature search for keywords: renal cell carcinoma, radical nephrectomy, nephron sparing surgery, minimally invasive surgery, and cryoablation was performed for the years 2000 through 2008. The most relevant publications were examined. New epidemiologic data and current treatment of renal cancer were covered. Concerning the treatment of clinically localized disease, the literature supports the standardization of partial nephrectomy and laparoscopic approaches as therapeutic options with better functional results and oncologic success comparable to standard radical resection. Promising initial results are now available for minimally invasive therapies, such as cryotherapy and radiofrequency ablation. Active surveillance has been reported with acceptable results, including for those who are poor surgical candidates. This review covers current advances in radical and conservative treatments of localized kidney cancer. The current status of nephron-sparing surgery, ablative therapies, and active surveillance based on natural history has resulted in great progress in the management of localized renal cell carcinoma.
The uses and implications of standards in general practice consultations.
Lippert, Maria Laura; Reventlow, Susanne; Kousgaard, Marius Brostrøm
2017-01-01
Quality standards play an increasingly important role in primary care through their inscription in various technologies for improving professional practice. While 'hard' biomedical standards have been the most common and debated, current quality development initiatives increasingly seek to include standards for the 'softer' aspects of care. This article explores the consequences of both kinds of quality standards for chronic care consultations. The article presents findings from an explorative qualitative field study in Danish general practice where a standardized technology for quality development has been introduced. Data from semi-structured interviews and observations among 17 general practitioners were analysed using an iterative analytical approach, which served to identify important variations in the uses and impacts of the technology. The most pronounced impact of the technology was observed among general practitioners who strictly adhered to the procedural standards on the interactional aspects of care. Thus, when allowed to function as an overall frame for consultations, those standards supported adherence to general recommendations regarding which elements to be included in chronic disease consultations. However, at the same time, adherence to those standards was observed to narrow the focus of doctor-patient dialogues and to divert general practitioners' attention from patients' personal concerns. Similar consequences of quality standards have previously been framed as manifestations of an inherent conflict between principles of patient-centredness and formal biomedical quality standards. However, this study suggests that standards on the 'softer' aspects of care may just as well interfere with a clinical approach relying on situated and attentive interactions with patients.
Parallel Approach Separation and Controller Performance
1989-11-01
Fragmentary abstract text: ... adjacent runways from the current minimum of 2.0 to 1.5 nautical miles (nmi). The possible impact of this alteration included changes in the nature and ... for aircraft in trail on the same approach. Objectives include determining whether a change in separation standard affects controller work effort and, if so, how. ... practice effects should be minimal; they can evaluate the realism of the simulation; they are better able to evaluate the impact of any changes.
Constraints and spandrels of interareal connectomes
Rubinov, Mikail
2016-01-01
Interareal connectomes are whole-brain wiring diagrams of white-matter pathways. Recent studies have identified modules, hubs, module hierarchies and rich clubs as structural hallmarks of these wiring diagrams. An influential current theory postulates that connectome modules are adequately explained by evolutionary pressures for wiring economy, but that the other hallmarks are not explained by such pressures and are therefore less trivial. Here, we use constraint network models to test these postulates in current gold-standard vertebrate and invertebrate interareal-connectome reconstructions. We show that empirical wiring-cost constraints inadequately explain connectome module organization, and that simultaneous module and hub constraints induce the structural byproducts of hierarchies and rich clubs. These byproducts, known as spandrels in evolutionary biology, include the structural substrate of the default-mode network. Our results imply that currently standard connectome characterizations are based on circular analyses or double dipping, and we emphasize an integrative approach to future connectome analyses for avoiding such pitfalls. PMID:27924867
Future challenges to protecting public health from drinking-water contaminants.
Murphy, Eileen A; Post, Gloria B; Buckley, Brian T; Lippincott, Robert L; Robson, Mark G
2012-04-01
Over the past several decades, human health protection for chemical contaminants in drinking water has been accomplished by development of chemical-specific standards. This approach alone is not feasible to address current issues of the occurrence of multiple contaminants in drinking water, some of which have little health effects information, and water scarcity. In this article, we describe the current chemical-specific paradigm for regulating chemicals in drinking water and discuss some potential additional approaches currently being explored to focus more on sustaining quality water for specific purposes. Also discussed are strategies being explored by the federal government to screen more efficiently the toxicity of large numbers of chemicals to prioritize further intensive testing. Water reuse and water treatment are described as sustainable measures for managing water resources for potable uses as well as other uses such as irrigation.
An evolution of orchiopexy: historical aspect.
Park, Kwanjin; Choi, Hwang
2010-03-01
The history of treatment for cryptorchidism dates back more than 200 years. This review is intended to highlight some historical aspects that led to our current surgical treatment of this condition. The medical and historical surgical literature pertaining to cryptorchidism was reviewed. Data sources were PubMed, Embase, conference proceedings, and bibliographies. No language, date, or publication status restrictions were imposed. The study of cryptorchidism began with the anatomical descriptions of Baron Albrecht von Haller and John Hunter. Attempts at surgical correction of the undescended testis began in the early 1800s, culminating in the first successful orchiopexy by Thomas Annandale in 1877. Max Schüller, Arthur Dean Bevan and Lattimer contributed to the establishment of current techniques for standard orchiopexy. Later, laparoscopy, the high inguinal incision (Jones' approach) and the scrotal approach were added to the list of current orchiopexy techniques.
Long-term Outcomes of Elective Surgery for Diverticular Disease: A Call for Standardization.
Biondi, Alberto; Santullo, Francesco; Fico, Valeria; Persiani, Roberto
2016-10-01
To date, the appropriate management of diverticular disease is still controversial. The American Society of Colon and Rectal Surgeons has declared that the decision between a conservative and a surgical approach should be made by case-by-case evaluation. There is still a lack of evidence in the literature about long-term outcomes after elective sigmoid resection for diverticular disease. Considering the potentially key role of the surgical technique in long-term outcomes, surgeons need to define strict rules to standardize the surgical technique. Currently there are five areas of debate in elective surgery for diverticular disease: the laparoscopic versus open approach, the site of the proximal and distal colonic division, the vascular approach and the mobilization of the splenic flexure. The purpose of this paper is to review existing knowledge about these technical aspects, which represent how the surgeon is able to affect the long-term results.
Fennell, Philip; Goldstein, Robert Lloyd
2006-01-01
Legal approaches to civil commitment in the United States and the United Kingdom are compared. A concise overview of the historical evolution of civil commitment in both countries precedes a discussion of the present scheme of commitment standards in each system. These current standards in U.S. and U.K. jurisdictions are then applied to a hypothetical case of delusional disorder. A discussion of the constructive use of civil commitment in patients with delusional disorder who may be dangerous focuses on its value as a preventive measure against potential harm to self or others, as well as the pros and cons of coercive assessment and treatment. Despite the many differences in approach to commitment, the authors concur that in both countries the patient with delusional disorder was committable before the commission of a serious criminal offense.
NASA Astrophysics Data System (ADS)
Zhao, Ye; Hsieh, Yu-Te; Belshaw, Nick
2015-04-01
Silicon (Si) stable isotopes have been used in a broad range of geochemical and cosmochemical applications. A precise and accurate determination of Si isotopes is desirable to distinguish their small natural variations (< 0.2‰) in many of these studies. In the past decade, the advent of the MC-ICP-MS has spurred a remarkable improvement in the precision and accuracy of Si isotopic analysis. The correction of instrumental mass fractionation is one crucial aspect of the analysis of Si isotopes. Two options are currently available: the sample-standard bracketing approach and the Mg doping approach. However, there has been a debate over the validity of the Mg doping approach. Some studies (Cardinal et al., 2003; Engström et al., 2006) favoured it over the sample-standard bracketing approach, whereas other studies (e.g. De La Rocha, 2002) considered it unsuitable. This study investigates the Mg doping approach on both the Nu Plasma II and the Nu Plasma 1700. Experiments were performed in both the wet plasma and the dry plasma modes, using a number of different combinations of cones. A range of different Mg to Si ratios as well as different matrices have been used in the experiments. A sample-standard bracketing approach has also been adopted for the Si mass fractionation correction to compare with the Mg doping approach. By assessing the mass fractionation behaviours of both Si and Mg under different instrument settings, this study aims to identify the factors which may affect the Mg doping approach and answer some key questions in the debate.
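For reference, the two corrections under discussion can be written out explicitly. The delta notation below is the standard convention; the exponential-law sign convention shown is one common choice, and the pairing of the 25Mg/24Mg ratio with 30Si/28Si is an assumption made for this sketch rather than something stated in the abstract.

```latex
% Delta notation for Si isotope ratios (per mil), as used with
% sample-standard bracketing.
\[
\delta^{30}\mathrm{Si} =
\left(
  \frac{\left(^{30}\mathrm{Si}/^{28}\mathrm{Si}\right)_{\mathrm{sample}}}
       {\left(^{30}\mathrm{Si}/^{28}\mathrm{Si}\right)_{\mathrm{standard}}} - 1
\right) \times 1000
\]

% Mg doping: the fractionation exponent beta is derived from the admixed
% Mg and applied to Si (exponential law, one common sign convention;
% the m_i are isotope masses).
\[
\beta =
\frac{\ln\!\left[\left(^{25}\mathrm{Mg}/^{24}\mathrm{Mg}\right)_{\mathrm{true}}
      \big/ \left(^{25}\mathrm{Mg}/^{24}\mathrm{Mg}\right)_{\mathrm{meas}}\right]}
     {\ln\!\left(m_{25}/m_{24}\right)},
\qquad
\left(\frac{^{30}\mathrm{Si}}{^{28}\mathrm{Si}}\right)_{\!\mathrm{corr}} =
\left(\frac{^{30}\mathrm{Si}}{^{28}\mathrm{Si}}\right)_{\!\mathrm{meas}}
\left(\frac{m_{30}}{m_{28}}\right)^{\!\beta}
\]
```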
Glines, Wayne M; Markham, Anna
2018-05-01
Seventy-five years after the Hanford Site was initially created as the primary plutonium production site for atomic weapons development under the Manhattan Project, the American Nuclear Society and the Health Physics Society are sponsoring a conference from 30 September through 3 October 2018, in Pasco, Washington, titled "Applicability of Radiation Response Models to Low Dose Protection Standards." The goal of this conference is to use current scientific data to update the approach to regulating low-level radiation doses; i.e., to answer a quintessential question of radiation protection: how best to develop radiation protection standards that protect human populations against detrimental effects while allowing the beneficial uses of radiation and radioactive materials. Previous conferences (e.g., the "Wingspread Conference" and the "Arlie Conference") have attempted to address this question, but now, almost 20 y later, the key issues, goals, conclusions, and recommendations of those two conferences remain and are as relevant as they were then. Despite the best efforts of the conference participants and increased knowledge and understanding of the science underlying radiation effects in human populations, the bases of current radiation protection standards have evolved little. This 2018 conference seeks to provide a basis and path forward for evolving radiation protection standards to be more reflective of current knowledge and understanding of low dose response models.
A Palliative Approach to Dialysis Care: A Patient-Centered Transition to the End of Life
Moss, Alvin H.; Cohen, Lewis M.; Fischer, Michael J.; Germain, Michael J.; Jassal, S. Vanita; Perl, Jeffrey; Weiner, Daniel E.; Mehrotra, Rajnish
2014-01-01
As the importance of providing patient-centered palliative care for patients with advanced illnesses gains attention, standard dialysis delivery may be inconsistent with the goals of care for many patients with ESRD. Many dialysis patients with life expectancy of <1 year may desire a palliative approach to dialysis care, which focuses on aligning patient treatment with patients’ informed preferences. This commentary elucidates what comprises a palliative approach to dialysis care and describes its potential and appropriate use. It also reviews the barriers to integrating such an approach into the current clinical paradigm of care and existing infrastructure and outlines system-level changes needed to accommodate such an approach. PMID:25104274
Toward Clarity in Clinical Vitamin D Status Assessment: 25(OH)D Assay Standardization.
Binkley, Neil; Carter, Graham D
2017-12-01
Widespread variation in 25-hydroxyvitamin D (25(OH)D) assays continues to compromise efforts to develop clinical and public health guidelines regarding vitamin D status. The Vitamin D Standardization Program helps alleviate this problem. Reference measurement procedures and standard reference materials have been developed to allow current, prospective, and retrospective standardization of 25(OH)D results. Despite advances in 25(OH)D measurement, substantial variability in clinical laboratory 25(OH)D measurement persists. Existing guidelines have not used standardized data and, as a result, it seems unlikely that consensus regarding definitions of vitamin D deficiency, inadequacy, sufficiency, and excess will soon be reached. Until evidence-based consensus is reached, a reasonable clinical approach is advocated. Copyright © 2017 Elsevier Inc. All rights reserved.
Rothschild, Uta; Muller, Laurent; Lechner, Axel; Schlösser, Hans A; Beutner, Dirk; Läubli, Heinz; Zippelius, Alfred; Rothschild, Sacha I
2018-05-14
Head and neck squamous cell carcinoma (HNSCC) is a frequent tumour arising from multiple anatomical subsites in the head and neck region. The treatment for early-stage disease is generally single modality, either surgery or radiotherapy. The treatment for locally advanced tumours is multimodal. For recurrent/metastatic HNSCC, palliative chemotherapy is the standard of care. The prognosis is limited, and novel treatment approaches are urgently needed. HNSCC evades immune responses through multiple resistance mechanisms. It is particularly characterised by an immunosuppressive environment, which includes the release of immunosuppressive factors, the activation and expansion of immune cells with inhibitory activity, and decreased tumour immunogenicity. An in-depth understanding of these mechanisms has led to the rational design of immunotherapeutic approaches and clinical trials. Currently, only immune checkpoint inhibitors, namely monoclonal antibodies targeting the immune inhibitory receptor programmed cell death 1 (PD-1) and its ligand PD-L1, have proven clinical efficacy in randomised phase III trials. The PD-1 inhibitor nivolumab is the only drug approved for platinum-refractory recurrent/metastatic HNSCC. However, many more immunotherapeutic treatment options are currently under investigation. Ongoing trials are investigating immunotherapeutic approaches in the curative setting as well as combination therapies that pair different immunotherapeutic approaches. This review article summarises current knowledge of the role of the immune system in the development and progression of HNSCC, and provides a comprehensive overview of the development of immunotherapeutic approaches.
Cool Apps: Building Cryospheric Data Applications With Standards-Based Service Oriented Architecture
NASA Astrophysics Data System (ADS)
Collins, J. A.; Truslove, I.; Billingsley, B. W.; Oldenburg, J.; Brodzik, M.; Lewis, S.; Liu, M.
2012-12-01
The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-specific RESTful services. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal) which depends on many of the aforementioned services, and clearly exhibits many of the advantages of building applications atop a service-oriented architecture. This presentation outlines the architectural approach and components and open standards and protocols adopted at NSIDC, demonstrates the interactions and uses of public and internal service interfaces currently powering applications including the IceBridge Data Portal, and outlines the benefits and challenges of this approach.
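As a concrete illustration of consuming one of these standards-based services, the sketch below uses the OWSLib Python library to query a WMS endpoint. The URL and layer name are placeholders, not NSIDC's actual service addresses.

```python
# Minimal WMS consumption sketch with OWSLib; endpoint and layer are
# hypothetical stand-ins for a standards-based data service.
from owslib.wms import WebMapService

wms = WebMapService('https://example.org/wms', version='1.1.1')
print(list(wms.contents))  # discover the layers the service advertises

img = wms.getmap(layers=['sea_ice_extent'],   # hypothetical layer name
                 srs='EPSG:4326',
                 bbox=(-180, 60, 180, 90),    # a polar region of interest
                 size=(600, 300),
                 format='image/png')
with open('ice.png', 'wb') as f:
    f.write(img.read())
```

Because the interface is an open standard, the same client code works against any compliant WMS server, which is precisely what lets off-the-shelf tools and brokered applications be assembled quickly.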
Current trends in molecular diagnostics of chronic myeloid leukemia.
Vinhas, Raquel; Cordeiro, Milton; Pedrosa, Pedro; Fernandes, Alexandra R; Baptista, Pedro V
2017-08-01
Nearly 1.5 million people worldwide suffer from chronic myeloid leukemia (CML), characterized by the genetic translocation t(9;22)(q34;q11.2), involving the fusion of the Abelson oncogene (ABL1) with the breakpoint cluster region (BCR) gene. Early diagnosis coupled with current therapeutics allows for a treatment success rate of 90%, which has focused research on the development of novel diagnostic approaches. In this review, we present a critical perspective on current strategies for CML diagnostics, comparing them to gold standard methodologies and with an eye on future trends in nanotheranostics.
Exact joint density-current probability function for the asymmetric exclusion process.
Depken, Martin; Stinchcombe, Robin
2004-07-23
We study the asymmetric simple exclusion process with open boundaries and derive the exact form of the joint probability function for the occupation number and the current through the system. We further consider the thermodynamic limit, showing that the resulting distribution is non-Gaussian and that the density fluctuations have a discontinuity at the continuous phase transition, while the current fluctuations are continuous. The derivations are performed by using the standard operator algebraic approach and by the introduction of new operators satisfying a modified version of the original algebra. Copyright 2004 The American Physical Society
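For background, the "standard operator algebraic approach" referred to here is the matrix-product (DEHP) ansatz for the open asymmetric exclusion process; its defining relations, for injection rate α and extraction rate β, are sketched below (the modified operators introduced in this paper are not reproduced here).

```latex
% Steady-state weights of the open TASEP: occupied sites contribute D,
% empty sites E, with boundary vectors <W| and |V>.
P(\tau_1,\dots,\tau_N)
  = \frac{\langle W|\prod_{i=1}^{N}\bigl(\tau_i D + (1-\tau_i)E\bigr)|V\rangle}
         {\langle W|(D+E)^{N}|V\rangle},
\qquad
DE = D + E,\qquad
\langle W|E = \tfrac{1}{\alpha}\langle W|,\qquad
D|V\rangle = \tfrac{1}{\beta}|V\rangle .
```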
Quality assurance and accreditation.
1997-01-01
In 1996, the Joint Commission International (JCI), which is a partnership between the Joint Commission on Accreditation of Healthcare Organizations and Quality Healthcare Resources, Inc., became one of the contractors of the Quality Assurance Project (QAP). JCI recognizes the link between accreditation and quality, and uses a collaborative approach to help a country develop national quality standards that will improve patient care, satisfy patient-centered objectives, and serve the interest of all affected parties. The implementation of good standards provides support for the good performance of professionals, introduces new ideas for improvement, enhances the quality of patient care, reduces costs, increases efficiency, strengthens public confidence, improves management, and enhances the involvement of the medical staff. Such good standards are objective and measurable; achievable with current resources; adaptable to different institutions and cultures; and demonstrate autonomy, flexibility, and creativity. The QAP offers the opportunity to approach accreditation through research efforts, training programs, and regulatory processes. QAP work in the area of accreditation has been targeted for Zambia, where the goal is to provide equal access to cost-effective, quality health care; Jordan, where a consensus process for the development of standards, guidelines, and policies has been initiated; and Ecuador, where JCI has been asked to help plan an approach to the evaluation and monitoring of the health care delivery system.
Christensen, Leif; Karle, Hans; Nystrup, Jørgen
2007-09-01
An outcome-based approach to medical education, as compared to a process/content orientation, is currently being discussed intensively. In this article, the process and outcome interrelationship in medical education is discussed, with specific emphasis on its relation to the definition of standards in basic medical education. Perceptions of outcome have always been an integrated element of curricular planning. The present debate underlines the need for a stronger focus on learning objectives and outcome assessment in many medical schools around the world. The need to maintain an integrated approach of process/content and outcome is underlined in this paper. A worry is expressed about the taxonomy of learning in purely outcome-based medical education, in which student assessment can become a major determinant of the learning process, leaving control of the medical curriculum to medical examiners. Moreover, curricula that favour reductionism by stating everything in terms of instrumental outcomes or competences risk lowering quality and becoming prey to political interference. Standards based on outcome alone raise unresolved problems in relation to the licensure requirements of medical doctors. It is argued that the alleged dichotomy between process/content and outcome seems artificial, and that the formulation of standards in medical education must follow a comprehensive line in curricular planning.
An update on 'dose calibrator' settings for nuclides used in nuclear medicine.
Bergeron, Denis E; Cessna, Jeffrey T
2018-06-01
Most clinical measurements of radioactivity, whether for therapeutic or imaging nuclides, rely on commercial re-entrant ionization chambers ('dose calibrators'). The National Institute of Standards and Technology (NIST) maintains a battery of representative calibrators and works to link calibration settings ('dial settings') to primary radioactivity standards. Here, we provide a summary of NIST-determined dial settings for 22 radionuclides. We collected previously published dial settings and determined some new ones using either the calibration curve method or the dialing-in approach. The dial settings with their uncertainties are collected in a comprehensive table. In general, current manufacturer-provided calibration settings give activities that agree with National Institute of Standards and Technology standards to within a few percent.
Advances in diagnostic and treatment modalities for intracranial tumors.
Dickinson, P J
2014-01-01
Intracranial neoplasia is a common clinical condition in domestic companion animals, particularly in dogs. Application of advances in standard diagnostic and therapeutic modalities together with a broad interest in the development of novel translational therapeutic strategies in dogs has resulted in clinically relevant improvements in outcome for many canine patients. This review highlights the status of current diagnostic and therapeutic approaches to intracranial neoplasia and areas of novel treatment currently in development. Copyright © 2014 by the American College of Veterinary Internal Medicine.
Non-Intrusive Load Monitoring Assessment: Literature Review and Laboratory Protocol
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butner, R. Scott; Reid, Douglas J.; Hoffman, Michael G.
2013-07-01
To evaluate the accuracy of NILM technologies, a literature review was conducted to identify any test protocols or standardized testing approaches currently in use. The literature review indicated that no consistent conventions were in place for measuring the accuracy of these technologies. Consequently, PNNL developed a testing protocol and metrics to provide the basis for quantifying and analyzing the accuracy of commercially available NILM technologies. This report discusses the results of the literature review and the proposed test protocol and metrics in more detail.
Introduction to Message-Bus Architectures for Space Systems
NASA Technical Reports Server (NTRS)
Smith, Dan; Gregory, Brian
2005-01-01
This course presents technical and programmatic information on the development of message-based architectures for space mission ground and flight software systems. Message-based architecture approaches provide many significant advantages over the more traditional socket-based one-of-a-kind integrated system development approaches. The course provides an overview of publish/subscribe concepts, the use of common isolation layer API's, approaches to message standardization, and other technical topics. Several examples of currently operational systems are discussed and possible changes to the system development process are presented. Benefits and lessons learned will be discussed and time for questions and answers will be provided.
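To make the publish/subscribe concept concrete, here is a toy in-process message bus in Python; it is an illustrative sketch, not the flight or ground middleware discussed in the course.

```python
# Toy publish/subscribe bus: publishers and subscribers only share topic
# names, never direct references, which is the decoupling that message-based
# architectures exploit.
from collections import defaultdict
from typing import Callable

class MessageBus:
    def __init__(self) -> None:
        self._subscribers = defaultdict(list)  # topic -> list of handlers

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)

bus = MessageBus()
bus.subscribe('telemetry.power', lambda m: print('battery voltage:', m['v']))
bus.publish('telemetry.power', {'v': 28.1})  # new publishers need no rewiring
```

A component can be added, replaced, or moved behind a network transport without modifying its peers, which is the advantage over one-of-a-kind socket links that the abstract highlights.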
NASA Astrophysics Data System (ADS)
Peckerar, Martin C.; Marrian, Christie R.
1995-05-01
Standard matrix inversion methods of e-beam proximity correction are compared with a variety of pseudoinverse approaches based on gradient descent. It is shown that the gradient descent methods can be modified using 'regularizers' (terms added to the cost function minimized during gradient descent). This modification solves the 'negative dose' problem in a mathematically sound way. Different techniques are contrasted using a weighted error measure approach. It is shown that the regularization approach leads to the highest quality images. In some cases, ignoring negative doses yields results which are worse than employing an uncorrected dose file.
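As a sketch of the regularization idea, the snippet below minimizes a ridge-penalized exposure cost by gradient descent; the proximity matrix, target pattern, and penalty are hypothetical, and the paper's actual regularizer may differ from the simple quadratic one used here.

```python
# Regularized gradient descent for dose assignment: minimize
# ||A d - t||^2 + lam * ||d||^2, where A is a toy proximity (blur) matrix,
# t the target exposure, and d the dose vector. The penalty term damps the
# oscillations that push unregularized least-squares inverses toward
# unphysical negative doses.
import numpy as np

rng = np.random.default_rng(0)
n = 50
idx = np.arange(n)
A = np.eye(n) + 0.3 * np.exp(-0.5 * (np.subtract.outer(idx, idx) / 3.0) ** 2)
t = rng.random(n)

lam, step = 0.1, 0.01
d = np.zeros(n)
for _ in range(2000):
    grad = 2 * A.T @ (A @ d - t) + 2 * lam * d  # gradient of the full cost
    d -= step * grad

print(float(d.min()))  # ridge shrinkage suppresses large negative excursions
```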
ERIC Educational Resources Information Center
Jani, Jayshree S.; Pierce, Dean; Ortiz, Larry; Sowbel, Lynda
2011-01-01
This article provides an assessment of the current situation in social work education regarding the teaching of content on diversity, with a focus on implications for social work theory, practice, and education. The article provides a critical analysis of the historical development of approaches to teaching diversity content in social work…
Regional-scale air quality models are being used to demonstrate attainment of the ozone air quality standard. In current regulatory applications, a regional-scale air quality model is applied for a base year and a future year with reduced emissions using the same meteorological ...
An Open and Scalable Learning Infrastructure for Food Safety
ERIC Educational Resources Information Center
Manouselis, Nikos; Thanopoulos, Charalampos; Vignare, Karen; Geith, Christine
2013-01-01
In the last several years, a variety of approaches and tools have been developed for giving access to open educational resources (OER) related to food safety, security, and food standards, as well to various targeted audiences (e.g., farmers, agronomists). The aim of this paper is to present a technology infrastructure currently in demonstration…
Why Teach Science with an Interdisciplinary Approach: History, Trends, and Conceptual Frameworks
ERIC Educational Resources Information Center
You, Hye Sun
2017-01-01
This study aims to describe the history of interdisciplinary education and the current trends and to elucidate the conceptual framework and values that support interdisciplinary science teaching. Many science educators have perceived the necessity for a crucial paradigm shift towards interdisciplinary learning as shown in science standards.…
The Dependency Axiom and the Relation between Agreement and Movement
ERIC Educational Resources Information Center
Linares Scarcerieau, Carlo Andrei
2012-01-01
Agreement and movement go hand in hand in a number of constructions across languages, and this correlation has played an important role in syntactic theory. The current standard approach to this "movement-agreement connection" is the Agree+EPP model, whose EPP component has often been questioned on conceptual grounds. The goal of this…
A Probablistic Diagram to Guide Chemical Design with Reduced Potency to Incur Cytotoxicity
Toxicity is a concern with many chemicals currently in commerce, and with new chemicals that are introduced each year. The standard approach to testing chemicals is to run studies in laboratory animals (e.g. rats, mice, dogs), but because of the expense of these studies and conce...
Student Interest in Engineering Design-Based Science
ERIC Educational Resources Information Center
Selcen Guzey, S.; Moore, Tamara J.; Morse, Gillian
2016-01-01
Current reform efforts in science education around the world call on teachers to use integrated approaches to teach science. As a part of such reform efforts in the United States, engineering practices and engineering design have been identified in K-12 science education standards. However, relatively little is known about effective ways…
Opportunities and Possibilities: Philosophical Hermeneutics and the Educational Researcher
ERIC Educational Resources Information Center
Agrey, Loren G.
2014-01-01
The opportunities that philosophical hermeneutics provides as a research tool are explored, and it is shown that this qualitative research method can be employed as a valuable tool by the educational researcher. Used as an alternative to the standard quantitative approach to educational research, which is currently the dominant paradigm of data…
ERIC Educational Resources Information Center
Huang, Shaobo; Mejia, Joel Alejandro; Becker, Kurt; Neilson, Drew
2015-01-01
Improving high school physics teaching and learning is important to the long-term success of science, technology, engineering, and mathematics (STEM) education. Efforts are currently in place to develop an understanding of science among high school students through formal and informal educational experiences in engineering design activities…
ERIC Educational Resources Information Center
Patalino, Marianne
Problems in current course evaluation methods are discussed and an alternative method is described for the construction, analysis, and interpretation of a test to evaluate instructional programs. The method presented represents a different approach to the traditional overreliance on standardized achievement tests and the total scores they provide.…
Integrating the Demonstration Orientation and Standards-Based Models of Achievement Goal Theory
ERIC Educational Resources Information Center
Wynne, Heather Marie
2014-01-01
Achievement goal theory, and thus the empirical measures stemming from the research, are currently divided between two conceptual approaches, namely the reason-based versus aims-based models of achievement goals. The factor structure and predictive utility of goal constructs from the Patterns of Adaptive Learning Strategies (PALS) and the latest two versions…
Decoding the Disciplines: An Approach to Scientific Thinking
ERIC Educational Resources Information Center
Pinnow, Eleni
2016-01-01
The Decoding the Disciplines methodology aims to teach students to think like experts in discipline-specific tasks. The central aspect of the methodology is to identify a bottleneck in the course content: a particular topic that a substantial number of students struggle to master. The current study compared the efficacy of standard lecture and…
Alhumaidan, Hiba; Cheves, Tracey; Holme, Stein; Sweeney, Joseph D
2011-10-01
The processing of whole blood-derived platelet-rich plasma (PRP) to a platelet concentrate and platelet-poor plasma is currently performed within 8 hours to comply with the requirements to manufacture fresh frozen plasma. Maintaining PRP at room temperature for a longer period can have the advantage of shifting the completion of component manufacture onto day shifts. Pairs of ABO-identical prepooled platelets were manufactured by the PRP method, using either the current approach, with platelet storage in a CLX HP container (Pall Medical, Covina, CA) and plasma, or a novel approach, with an 18- to 24-hour room temperature hold of the PRP and the manufacture of pooled platelets in a glucose-containing additive solution (AS) with storage in a new ELX container (Pall Medical). Standard in vitro assays were performed on days 2, 5, and 7. The results showed that the AS platelets in ELX have in vitro characteristics that are equivalent or superior to those of the standard product.
2009-01-01
being done, in part, in response to Executive Order 13327, which mandates a pragmatic and consistent approach to Federal agency management of real...move forward. The U.S. Army Research and Development Center, Construction Engineering Research Laboratory was tasked with surveying a number of...assessment in use within USACE. (All rely on a deficiency-based approach, i.e., deviations from standards or from known benchmarks, to inspection.); (2
Water impacts of CO2 emission performance standards for fossil fuel-fired power plants.
Talati, Shuchi; Zhai, Haibo; Morgan, M Granger
2014-10-21
We employ an integrated systems modeling tool to assess the water impacts of the new source performance standards recently proposed by the U.S. Environmental Protection Agency for limiting CO2 emissions from coal- and gas-fired power plants. The implementation of amine-based carbon capture and storage (CCS) for 40% CO2 capture to meet the current proposal will increase plant water use by roughly 30% in supercritical pulverized coal-fired power plants. The specific amount of added water use varies with power plant and CCS designs. More stringent emission standards than the current proposal would require CO2 emission reductions for natural gas combined-cycle (NGCC) plants via CCS, which would also increase plant water use. When examined over a range of possible future emission standards from 1100 to 300 lb CO2/MWh gross, new baseload NGCC plants consume roughly 60-70% less water than coal-fired plants. A series of adaptation approaches to secure low-carbon energy production and improve the electric power industry's water management in the face of future policy constraints are discussed both quantitatively and qualitatively.
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research can be applicable to any wind tunnel check standard testing program.
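A minimal sketch of the combined approach: fit a regression model to each check-standard run, then track a fitted coefficient on a Shewhart-style control chart. The data, model form, and limits below are hypothetical illustrations, not Langley's actual procedure.

```python
# Track a regression coefficient (here a lift-curve slope) across repeated
# check-standard runs with 3-sigma control limits.
import numpy as np

rng = np.random.default_rng(1)
slopes = []
for run in range(20):                    # 20 hypothetical check-standard runs
    alpha = np.linspace(-4, 4, 9)        # angle-of-attack sweep, degrees
    cl = 0.10 + 0.11 * alpha + rng.normal(0, 0.005, alpha.size)
    slopes.append(np.polyfit(alpha, cl, 1)[0])  # fitted coefficient per run

slopes = np.asarray(slopes)
center, sigma = slopes.mean(), slopes.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma
flagged = np.flatnonzero((slopes > ucl) | (slopes < lcl))
print(center, (lcl, ucl), flagged)       # out-of-control runs, if any
```

Charting model coefficients rather than raw readings is what gives the "higher level of understanding" the abstract mentions: a drifting slope points at the process, not at any single measurement.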
Malone, Matthew; Goeres, Darla M; Gosbell, Iain; Vickery, Karen; Jensen, Slade; Stoodley, Paul
2017-02-01
Biofilms are now widely accepted as a cause of chronic infection in human health and disease. Typically, biofilms show remarkable tolerance to many forms of treatment and to the host immune response. This has led to a vast increase in research to identify new (and sometimes old) anti-biofilm strategies that demonstrate effectiveness against these tolerant phenotypes. Areas covered: Unfortunately, a standardized methodological approach to biofilm models has not been adopted, leading to a large disparity between testing conditions. This has made it almost impossible to compare data across multiple laboratories, leaving large gaps in the evidence. Furthermore, many biofilm models testing anti-biofilm strategies aimed at the medical arena have not considered the matter of relevance to an intended application. This may explain why some in vitro models, based on methodological designs that do not consider relevance to an intended application, fail when applied in vivo at the clinical level. Expert commentary: This review will explore the issues that need to be considered in developing performance standards for anti-biofilm therapeutics and provide a rationale for the need to standardize models/methods that are clinically relevant. We also provide some rationale as to why no standards currently exist.
Is it time to reassess current safety standards for glyphosate-based herbicides?
Vandenberg, Laura N; Blumberg, Bruce; Antoniou, Michael N; Benbrook, Charles M; Carroll, Lynn; Colborn, Theo; Everett, Lorne G; Hansen, Michael; Landrigan, Philip J; Lanphear, Bruce P; Mesnage, Robin; Vom Saal, Frederick S; Welshons, Wade V; Myers, John Peterson
2017-06-01
Use of glyphosate-based herbicides (GBHs) increased ∼100-fold from 1974 to 2014. Additional increases are expected due to widespread emergence of glyphosate-resistant weeds, increased application of GBHs, and preharvest uses of GBHs as desiccants. Current safety assessments rely heavily on studies conducted over 30 years ago. We have considered information on GBH use, exposures, mechanisms of action, toxicity and epidemiology. Human exposures to glyphosate are rising, and a number of in vitro and in vivo studies challenge the basis for the current safety assessment of glyphosate and GBHs. We conclude that current safety standards for GBHs are outdated and may fail to protect public health or the environment. To improve safety standards, the following are urgently needed: (1) human biomonitoring for glyphosate and its metabolites; (2) prioritisation of glyphosate and GBHs for hazard assessments, including toxicological studies that use state-of-the-art approaches; (3) epidemiological studies, especially of occupationally exposed agricultural workers, pregnant women and their children and (4) evaluations of GBHs in commercially used formulations, recognising that herbicide mixtures likely have effects that are not predicted by studying glyphosate alone. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Jones, W Schuyler; Krucoff, Mitchell W; Morales, Pablo; Wilgus, Rebecca W; Heath, Anne H; Williams, Mary F; Tcheng, James E; Marinac-Dabic, J Danica; Malone, Misti L; Reed, Terrie L; Fukaya, Rie; Lookstein, Robert A; Handa, Nobuhiro; Aronow, Herbert D; Bertges, Daniel J; Jaff, Michael R; Tsai, Thomas T; Smale, Joshua A; Zaugg, Margo J; Thatcher, Robert J; Cronenwett, Jack L
2018-02-01
The current state of evaluating patients with peripheral artery disease, and more specifically of evaluating medical devices used for peripheral vascular intervention (PVI), remains challenging because of the heterogeneity of the disease process, the multiple physician specialties that perform PVI, the multitude of devices available to treat peripheral artery disease, and the lack of consensus about the best treatment approaches. Because PVI core data elements are not standardized across clinical care, clinical trials, and registries, aggregation of data across different data sources and physician specialties is currently not feasible. Under the auspices of the U.S. Food and Drug Administration's Medical Device Epidemiology Network initiative and its PASSION (Predictable and Sustainable Implementation of the National Registries) program, in conjunction with other efforts to align clinical data standards, the Registry Assessment of Peripheral Interventional Devices (RAPID) workgroup was convened. RAPID is a collaborative, multidisciplinary effort to develop a consensus lexicon and to promote interoperability across clinical care, clinical trials, and national and international registries of PVI. The current manuscript presents the initial work from RAPID to standardize clinical data elements and definitions, to establish a framework within electronic health records and health information technology procedural reporting systems, and to implement an informatics-based approach to promote the conduct of pragmatic clinical trials and registry efforts in PVI. Ultimately, we hope this work will facilitate and improve device evaluation and surveillance for patients, clinicians, health outcomes researchers, industry, policymakers, and regulators. Copyright © 2017 Society for Vascular Surgery. All rights reserved.
For Researchers on Obesity: Historical Review of Extra Body Weight Definitions.
Komaroff, Marina
2016-01-01
Rationale. The concept of obesity has been known since the ancient world; however, the current standard definition of obesity was endorsed only about a decade ago. There is a need for researchers to understand the multiple approaches to defining obesity and how and why the standard definition was developed. The review will help readers grasp the complexity of the problem and can lead to novel hypotheses in obesity research. Objective. This paper focuses on the objective of understanding the historical background on the development of "reference and standard tables" of weight as a platform for the definition of normal versus abnormal body weight. Methods. A systematic literature review was performed to chronologically summarize the definition of body weight from the time of Hippocrates until 2010. Conclusion. This paper presents the historical background on the development of "reference and standard tables" of weight as a platform for the definition of normal versus abnormal body weight. Knowledge of historical approaches to the concept of obesity can motivate researchers to find new hypotheses and utilize the appropriate obesity assessments to address their objectives.
Development of a New Departure Aversion Standard for Light Aircraft
NASA Technical Reports Server (NTRS)
Borer, Nicholas K.
2017-01-01
The Federal Aviation Administration (FAA) and European Aviation Safety Agency (EASA) have recently established new light aircraft certification rules that introduce significant changes to the current regulations. The changes include moving from prescriptive design requirements to performance-based standards, transferring many of the acceptable means of compliance out of the rules and into consensus standards. In addition, the FAA/EASA rules change the performance requirements associated with some of the more salient safety issues regarding light aircraft. One significant change is the elimination of spin recovery demonstration. The new rules now call for enhanced stall warning and aircraft handling characteristics that demonstrate resistance to inadvertent departure from controlled flight. The means of compliance with these changes in a safe, cost-effective manner is a challenging problem. This paper discusses existing approaches to reducing the likelihood of departure from controlled flight and introduces a new approach, dubbed Departure Aversion, which allows applicants to tailor the amount of departure resistance, stall warning, and enhanced safety equipment to meet the new proposed rules. The Departure Aversion approach gives applicants the freedom to select the most cost-effective portfolio for their design, while meeting the safety intent of the new rules, by ensuring that any combination of the selected approaches will be at a higher equivalent level of safety than today's status quo.
Florio, Tullio; Barbieri, Federica
2012-10-01
Glioblastoma is the most prevalent and malignant form of brain cancer, but the currently available multimodality treatments yield poor survival improvement. Thus, innovative therapeutic strategies represent the challenging topic for glioblastoma management. Multidisciplinary advances supporting current standard-of-care therapies, and investigational trials that reveal potential drug targets for glioblastoma, are reviewed. A radical change in glioblastoma therapeutic approaches could arise from the identification of cancer stem cells, putative tumor-initiating cells involved in tumor initiation, progression, and resistance, as an innovative drug target. The still-controversial identification of markers and molecular regulators in glioma tumor-initiating cells, and novel approaches targeting these cells, are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Evaluation of Esophageal Motor Function With High-resolution Manometry
2013-01-01
For several decades esophageal manometry has been the test of choice to evaluate disorders of esophageal motor function. The recent introduction of high-resolution manometry for the study of esophageal motor function simplified the performance of esophageal manometry and revealed previously unidentified patterns of normal and abnormal esophageal motor function. Presentation of pressure data as color contour plots, or esophageal pressure topography, led to the development of new tools for analyzing and classifying esophageal motor patterns. The current standard, and still developing, approach for doing this is the Chicago classification. While this methodical approach is improving our diagnosis of esophageal motor disorders, it currently does not address all motor abnormalities. We will explore the Chicago classification and the disorders that it does not address. PMID:23875094
Novel optical strategies for biodetection
NASA Astrophysics Data System (ADS)
Sakamuri, Rama M.; Wolfenden, Mark S.; Anderson, Aaron S.; Swanson, Basil I.; Schmidt, Jurgen S.; Mukundan, Harshini
2013-09-01
Although bio-detection strategies have significantly evolved in the past decade, they still suffer from many disadvantages. For one, current approaches still require confirmation of pathogen viability by culture, the 'gold-standard' method, which can take several days to yield results. Second, current methods typically target protein and nucleic acid signatures and cannot be applied to other biochemical categories of biomarkers (e.g., lipidated sugars). Lipidated sugars (e.g., lipopolysaccharide, lipoarabinomannan) are bacterial virulence factors that are significant to pathogenicity. Herein, we present two different optical strategies for biodetection to address these two limitations. We have exploited bacterial iron sequestration mechanisms to develop a simple, specific assay for the selective detection of viable bacteria, without the need for culture. We are currently working on the use of this technology for the differential detection of two different bacteria, using siderophores. Second, we have developed a novel strategy termed 'membrane insertion' for the detection of amphiphilic biomarkers (e.g., lipidated glycans) that cannot be detected by conventional approaches. We have extended this technology to the detection of small-molecule amphiphilic virulence factors, such as phenolic glycolipid-1 from leprosy, which could not be directly detected before. Together, these strategies address two critical limitations in current biodetection approaches. We are currently working on the optimization of these methods and their extension to real-world clinical samples.
A Fully Automated Approach to Spike Sorting.
Chung, Jason E; Magland, Jeremy F; Barnett, Alex H; Tolosa, Vanessa M; Tooker, Angela C; Lee, Kye Y; Shah, Kedar G; Felix, Sarah H; Frank, Loren M; Greengard, Leslie F
2017-09-13
Understanding the detailed dynamics of neuronal networks will require the simultaneous measurement of spike trains from hundreds of neurons (or more). Currently, approaches to extracting spike times and labels from raw data are time consuming, lack standardization, and involve manual intervention, making it difficult to maintain data provenance and assess the quality of scientific results. Here, we describe an automated clustering approach and associated software package that addresses these problems and provides novel cluster quality metrics. We show that our approach has accuracy comparable to or exceeding that achieved using manual or semi-manual techniques with desktop central processing unit (CPU) runtimes faster than acquisition time for up to hundreds of electrodes. Moreover, a single choice of parameters in the algorithm is effective for a variety of electrode geometries and across multiple brain regions. This algorithm has the potential to enable reproducible and automated spike sorting of larger scale recordings than is currently possible. Copyright © 2017 Elsevier Inc. All rights reserved.
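The front end of any such pipeline is event detection; the sketch below shows a common robust-threshold detector, offered as general background rather than as the authors' algorithm (which centers on the clustering stage).

```python
# Robust threshold spike detection: estimate noise with the median absolute
# deviation, flag negative-going crossings, and enforce a refractory gap.
import numpy as np

def detect_spikes(trace, fs, thresh_sd=5.0, refractory_ms=1.0):
    mad_sd = np.median(np.abs(trace - np.median(trace))) / 0.6745  # robust SD
    thresh = -thresh_sd * mad_sd
    crossings = np.flatnonzero((trace[1:] < thresh) & (trace[:-1] >= thresh))
    min_gap = int(refractory_ms * fs / 1000.0)
    kept, last = [], -min_gap
    for c in crossings:
        if c - last >= min_gap:   # drop re-triggers inside the refractory gap
            kept.append(c)
            last = c
    return np.asarray(kept)
```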
New techniques for assessing response after hypofractionated radiotherapy for lung cancer
Mattonen, Sarah A.; Huang, Kitty; Ward, Aaron D.; Senan, Suresh
2014-01-01
Hypofractionated radiotherapy (HFRT) is an effective and increasingly-used treatment for early stage non-small cell lung cancer (NSCLC). Stereotactic ablative radiotherapy (SABR) is a form of HFRT and delivers biologically effective doses (BEDs) in excess of 100 Gy10 in 3-8 fractions. Excellent long-term outcomes have been reported; however, response assessment following SABR is complicated as radiation induced lung injury can appear similar to a recurring tumor on CT. Current approaches to scoring treatment responses include Response Evaluation Criteria in Solid Tumors (RECIST) and positron emission tomography (PET), both of which appear to have a limited role in detecting recurrences following SABR. Novel approaches to assess response are required, but new techniques should be easily standardized across centers, cost effective, with sensitivity and specificity that improves on current CT and PET approaches. This review examines potential novel approaches, focusing on the emerging field of quantitative image feature analysis, to distinguish recurrence from fibrosis after SABR. PMID:24688782
Schacherer, Lindsey J; Xie, Weiping; Owens, Michaela A; Alarcon, Clara; Hu, Tiger X
2016-09-01
Liquid chromatography coupled with tandem mass spectrometry is increasingly used for protein detection in transgenic crops research. Currently this is achieved with protein reference standards, which may take significant time or effort to obtain, so there is a need for rapid protein detection without protein reference standards. A sensitive and specific method was developed to detect target proteins in transgenic maize leaf crude extract at concentrations as low as ∼30 ng mg(-1) dry leaf without the need for reference standards or any sample enrichment. A hybrid Q-TRAP mass spectrometer was used to monitor all potential tryptic peptides of the target proteins in both transgenic and non-transgenic samples. The multiple reaction monitoring-initiated detection and sequencing (MIDAS) approach was used for initial peptide/protein identification via Mascot database search. Further confirmation was achieved by direct comparison between transgenic and non-transgenic samples. Definitive confirmation was provided by running the same experiments on synthetic peptides or protein standards, if available. A targeted proteomic mass spectrometry method using the MIDAS approach is an ideal methodology for the detection of new proteins in early stages of transgenic crop research and development, when neither protein reference standards nor antibodies are available. © 2016 Society of Chemical Industry.
Laboratory techniques and rhythmometry
NASA Technical Reports Server (NTRS)
Halberg, F.
1973-01-01
Some of the procedures used for the analysis of rhythms are illustrated, notably as these apply to current medical and biological practice. For a quantitative approach to medical and broader socio-ecologic goals, the chronobiologist gathers numerical objective reference standards for rhythmic biophysical, biochemical, and behavioral variables. These biological reference standards can be derived by specialized computer analyses of largely self-measured (until eventually automatically recorded) time series (autorhythmometry). Objective numerical values for individual and population parameters of reproductive cycles can be obtained concomitantly with characteristics of about-yearly (circannual), about-daily (circadian) and other rhythms.
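One such specialized analysis is the least-squares cosinor fit long associated with Halberg's chronobiology work; the sketch below fits a single 24 h (circadian) component to hypothetical self-measured data.

```python
# Single-component cosinor: fit y ~ M + A*cos(2*pi*t/period + phi) and
# report MESOR (rhythm-adjusted mean), amplitude, and acrophase.
import numpy as np

def cosinor(t_hours, y, period=24.0):
    w = 2 * np.pi * t_hours / period
    X = np.column_stack([np.ones_like(w), np.cos(w), np.sin(w)])
    m, b, c = np.linalg.lstsq(X, y, rcond=None)[0]
    return m, np.hypot(b, c), np.arctan2(-c, b)   # MESOR, amplitude, phase

t = np.arange(0, 72, 1.0)    # three days of hourly self-measurements
y = 36.8 + 0.4 * np.cos(2 * np.pi * (t - 16) / 24) \
    + np.random.default_rng(2).normal(0, 0.05, t.size)
print(cosinor(t, y))          # recovers ~36.8, ~0.4, and a 16 h peak time
```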
State of the art of prostatic arterial embolization for benign prostatic hyperplasia.
Petrillo, Mario; Pesapane, Filippo; Fumarola, Enrico Maria; Emili, Ilaria; Acquasanta, Marzia; Patella, Francesca; Angileri, Salvatore Alessio; Rossi, Umberto G; Piacentini, Igor; Granata, Antonio Maria; Ierardi, Anna Maria; Carrafiello, Gianpaolo
2018-04-01
Prostatectomy via open surgery or transurethral resection of the prostate (TURP) is the standard treatment for benign prostatic hyperplasia (BPH). However, several patients present contraindications to the standard approach: individuals older than 60 years with urinary tract infection, strictures, post-operative pain, incontinence or urinary retention, sexual dysfunction, or blood loss are not good candidates for surgery. Prostatic artery embolization (PAE) is emerging as a viable method for patients unsuitable for surgery. In this article, we report results on the technical and clinical success and the safety of the procedure to define its current status.
A working paradigm for the treatment of obesity in gastrointestinal practice
Acosta, Andres; Camilleri, Michael
2017-01-01
Obesity is a chronic, relapsing, multi-factorial disease characterized by abnormal or excessive adipose tissue accumulation that may impair health and increase disease risks. Despite the ever-increasing prevalence and economic and societal burden, the current approaches to treat obesity are not standardized or generally effective. In this manuscript, we describe a current working paradigm developed by a consensus approach for the multidisciplinary treatment of obesity in the GI practice. Obesity should be managed as a continuum of care focusing on weight loss, weight loss maintenance and prevention of weight regain. This approach needs to be disseminated throughout the health care system, coordinated by a multidisciplinary team and include gastroenterologists who are in a unique position to address obesity. Gastroenterologists are in the front line of managing the morbidity resulting from obesity, and have expertise in use of the essential tools to manage obesity: nutrition, pharmacology, endoscopy and surgery. PMID:28936110
The fit and implementation of sexual harassment law to workplace evaluations.
Wiener, Richard L; Hackney, Amy; Kadela, Karen; Rauch, Shannon; Seib, Hope; Warren, Laura; Hurt, Linda E
2002-08-01
Three studies used videotaped harassment complaints to examine the impact of legal standards on the evaluation of social-sexual conduct at work. Study 1 demonstrated that without legal instructions, college students' judgment strategies were highly variable. Study 2 compared 2 current legal standards, the "severity or pervasiveness test" and a proposed utilitarian alternative (i.e., the rational woman approach). Undergraduate participants taking the perspective of the complainant were more sensitive to offensive conduct than were those adopting an objective perspective. Although the utilitarian alternative further increased sensitivity on some measures, it failed to produce a principled judgment strategy. Finally, Study 3 examined the kinds of errors that full-time workers make when applying the "severity or pervasiveness" test to examine more closely the sensitivity of the subjective approach.
Pooling biomarker data from different studies of disease risk, with a focus on endogenous hormones
Key, Timothy J; Appleby, Paul N; Allen, Naomi E; Reeves, Gillian K
2010-01-01
Large numbers of observations are needed to provide adequate power in epidemiological studies of biomarkers and cancer risk. However, there are currently few large mature studies with adequate numbers of cases with biospecimens available. Therefore pooling biomarker measures from different studies is a valuable approach, enabling investigators to make robust estimates of risk and to examine associations in subgroups of the population. The ideal situation is to have standardized methods in all studies so that the biomarker data can be pooled in their original units. However, even when the studies do not have standardized methods, as with existing studies on hormones and cancer, a simple approach using study-specific quantiles or percentage increases can provide substantial information on the relationship of the biomarker with cancer risk. PMID:20233851
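A sketch of the study-specific quantile idea: each measurement is replaced by the quartile it occupies within its own study, so between-assay calibration differences drop out before pooling. Data below are hypothetical.

```python
# Convert raw biomarker values to within-study quartile categories (1-4)
# before pooling across studies with non-standardized assays.
import numpy as np

def to_study_quartiles(values, study_ids):
    values = np.asarray(values, dtype=float)
    study_ids = np.asarray(study_ids)
    quartile = np.empty(values.size, dtype=int)
    for s in np.unique(study_ids):
        mask = study_ids == s
        cuts = np.percentile(values[mask], [25, 50, 75])  # study-specific cuts
        quartile[mask] = 1 + np.searchsorted(cuts, values[mask])
    return quartile

# Two studies whose assays sit on very different scales:
print(to_study_quartiles([1.2, 3.4, 2.2, 9.9, 40.0, 55.0, 70.0, 120.0],
                         ['a', 'a', 'a', 'a', 'b', 'b', 'b', 'b']))
# -> [1 3 2 4 1 2 3 4]: comparable categories despite the scale difference
```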
NASA Astrophysics Data System (ADS)
Koshti, Ajay M.
2018-03-01
Like other NDE methods, eddy current surface crack detectability is determined using a probability of detection (POD) demonstration. The POD demonstration involves eddy current testing of surface crack specimens with known crack sizes. The reliably detectable flaw size, denoted by a90/95, is determined by statistical analysis of the POD test data. The surface crack specimens shall be made from a similar material with electrical conductivity close to the part conductivity. A calibration standard with electro-discharge machined (EDM) notches is typically used in eddy current testing for surface crack detection. The calibration standard conductivity shall be within +/- 15% of the part conductivity. This condition is also applicable to the POD demonstration crack set. Here, a case is considered where the conductivity of the crack specimens available for POD testing differs by more than 15% from that of the part to be inspected. Therefore, a direct POD demonstration of the reliably detectable flaw size is not applicable, and additional testing is necessary to use the demonstrated POD test data. An approach is provided to estimate the reliably detectable flaw size in eddy current testing for a part made from material A using POD crack specimens made from material B with different conductivity. The approach uses additional test data obtained on EDM notch specimens made from materials A and B. EDM notch test data from the two materials are used to create a transfer function between the demonstrated a90/95 size on crack specimens made of material B and the estimated a90/95 size for a part made of material A. Two methods are given. For method A, the a90/95 crack size for material B is given and POD data are available; the objective is to determine the a90/95 crack size for material A using the same relative decision threshold that was used for material B. For method B, the target crack size a90/95 for material A is known; the objective is to determine the decision threshold for inspecting material A.
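One simplified reading of the notch-based transfer function is interpolation between signal-response curves: find the signal that the a90/95-sized flaw produces on material B, then find the notch size on material A that yields the same signal. All response values and sizes below are hypothetical, and the paper's actual methods additionally involve the relative decision thresholds described above.

```python
# Toy transfer of a90/95 from material B to material A via EDM notch
# response curves (signal amplitude vs. notch size for each material).
import numpy as np

notch_sizes = np.array([0.5, 1.0, 1.5, 2.0, 2.5])   # mm, hypothetical
resp_B = np.array([12.0, 25.0, 38.0, 52.0, 66.0])   # signal on material B
resp_A = np.array([9.0, 19.0, 30.0, 41.0, 53.0])    # signal on material A

a9095_B = 1.2                                        # demonstrated on B, mm
signal = np.interp(a9095_B, notch_sizes, resp_B)     # expected B signal
a9095_A = np.interp(signal, resp_A, notch_sizes)     # size on A matching it
print(round(float(a9095_A), 2))                      # ~1.51 mm estimate
```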
Zweerink, Alwin; Allaart, Cornelis P; Kuijer, Joost P A; Wu, LiNa; Beek, Aernout M; van de Ven, Peter M; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick; van Rossum, Albert C; Nijveldt, Robin
2017-12-01
Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. • Myocardial strain analysis could potentially improve patient selection for CRT. • Currently a well validated clinical approach to derive segmental strains is lacking. • The novel SLICE technique derives segmental strains from standard CMR cine images. • SLICE-derived strain markers of CRT response showed close agreement with CMR-TAG. • Future studies will focus on the prognostic value of SLICE in CRT candidates.
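The core quantity behind any segment-length technique of this kind is the Lagrangian strain of each segment; in a notation of our own choosing (not necessarily the paper's):

```latex
% Segmental Lagrangian strain, with L(t_0) the end-diastolic segment length
% measured between anatomical landmarks on the short-axis cine:
\varepsilon(t) = \frac{L(t) - L(t_0)}{L(t_0)} \times 100\%
```

Circumferential shortening then appears as negative strain, which is what permits comparison against CMR-TAG circumferential strain.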
Addressing unwarranted clinical variation: A rapid review of current evidence.
Harrison, Reema; Manias, Elizabeth; Mears, Stephen; Heslop, David; Hinchcliff, Reece; Hay, Liz
2018-05-15
Unwarranted clinical variation (UCV) can be described as variation that can only be explained by differences in health system performance. There is a lack of clarity regarding how to define and identify UCV and, once identified, to determine whether it is sufficiently problematic to warrant action. As such, the implementation of systemic approaches to reducing UCV is challenging. A review of approaches to understand, identify, and address UCV was undertaken to determine how conceptual and theoretical frameworks currently attempt to define UCV, the approaches used to identify UCV, and the evidence of their effectiveness. Rapid evidence assessment (REA) methodology was used. A range of text words, synonyms, and subject headings were developed for the major concepts of unwarranted clinical variation, standards (and deviation from these standards), and health care environment. Two electronic databases (Medline and Pubmed) were searched from January 2006 to April 2017, in addition to hand searching of relevant journals, reference lists, and grey literature. Results were merged using reference-management software (Endnote) and duplicates removed. Inclusion criteria were independently applied to potentially relevant articles by 3 reviewers. Findings were presented in a narrative synthesis to highlight key concepts addressed in the published literature. A total of 48 relevant publications were included in the review; 21 articles were identified as eligible from the database search, 4 from hand searching published work and 23 from the grey literature. The search process highlighted the voluminous literature reporting clinical variation internationally; yet, there is a dearth of evidence regarding systematic approaches to identifying or addressing UCV. Wennberg's classification framework is commonly cited in relation to classifying variation, but no single approach is agreed upon to systematically explore and address UCV. The instances of UCV that warrant investigation and action are largely determined at a systems level currently, and stakeholder engagement in this process is limited. Lack of consensus on an evidence-based definition for UCV remains a substantial barrier to progress in this field. © 2018 John Wiley & Sons, Ltd.
A palliative approach to dialysis care: a patient-centered transition to the end of life.
Grubbs, Vanessa; Moss, Alvin H; Cohen, Lewis M; Fischer, Michael J; Germain, Michael J; Jassal, S Vanita; Perl, Jeffrey; Weiner, Daniel E; Mehrotra, Rajnish
2014-12-05
As the importance of providing patient-centered palliative care for patients with advanced illnesses gains attention, standard dialysis delivery may be inconsistent with the goals of care for many patients with ESRD. Many dialysis patients with life expectancy of <1 year may desire a palliative approach to dialysis care, which focuses on aligning patient treatment with patients' informed preferences. This commentary elucidates what comprises a palliative approach to dialysis care and describes its potential and appropriate use. It also reviews the barriers to integrating such an approach into the current clinical paradigm of care and existing infrastructure and outlines system-level changes needed to accommodate such an approach. Copyright © 2014 by the American Society of Nephrology.
Standardization and Optimization of Computed Tomography Protocols to Achieve Low-Dose
Chin, Cynthia; Cody, Dianna D.; Gupta, Rajiv; Hess, Christopher P.; Kalra, Mannudeep K.; Kofler, James M.; Krishnam, Mayil S.; Einstein, Andrew J.
2014-01-01
The increase in radiation exposure due to CT scans has been of growing concern in recent years. CT scanners differ in their capabilities and various indications require unique protocols, but there remains room for standardization and optimization. In this paper we summarize approaches to reduce dose, as discussed in lectures comprising the first session of the 2013 UCSF Virtual Symposium on Radiation Safety in Computed Tomography. The experience of scanning at low dose in different body regions, for both diagnostic and interventional CT procedures, is addressed. An essential primary step is justifying the medical need for each scan. General guiding principles for reducing dose include tailoring a scan to a patient, minimizing scan length, use of tube current modulation, minimizing tube current, minimizing tube potential, iterative reconstruction, and periodic review of CT studies. Organized efforts for standardization have been spearheaded by professional societies such as the American Association of Physicists in Medicine. Finally, all team members should demonstrate an awareness of the importance of minimizing dose. PMID:24589403
The AIST Managed Cloud Environment
NASA Astrophysics Data System (ADS)
Cook, S.
2016-12-01
ESTO is currently developing and implementing the AIST Managed Cloud Environment (AMCE) to offer cloud computing services to ESTO-funded PIs to conduct their project research. AIST will provide projects access to a cloud computing framework that incorporates NASA security, technical, and financial standards, on which projects can freely store, run, and process data. Currently, many projects led by research groups outside of NASA lack awareness of the requirements or the resources to implement NASA standards in their research, which limits the likelihood of infusing the work into NASA applications. Offering this environment to PIs will allow them to conduct their project research using the many benefits of cloud computing. In addition to the well-known cost and time savings, it also provides scalability and flexibility. The AMCE will facilitate infusion and end-user access by ensuring standardization and security. This approach will ultimately benefit ESTO, the science community, and the research, allowing the technology developments to have quicker and broader applications.
Gupta, Veer; Henriksen, Kim; Edwards, Melissa; Jeromin, Andreas; Lista, Simone; Bazenet, Chantal; Soares, Holly; Lovestone, Simon; Hampel, Harald; Montine, Thomas; Blennow, Kaj; Foroud, Tatiana; Carrillo, Maria; Graff-Radford, Neill; Laske, Christoph; Breteler, Monique; Shaw, Leslie; Trojanowski, John Q.; Schupf, Nicole; Rissman, Robert A.; Fagan, Anne M.; Oberoi, Pankaj; Umek, Robert; Weiner, Michael W.; Grammas, Paula; Posner, Holly; Martins, Ralph
2015-01-01
The lack of readily available biomarkers is a significant hindrance towards progressing to effective therapeutic and preventative strategies for Alzheimer's disease (AD). Blood-based biomarkers have the potential to overcome access and cost barriers and greatly facilitate advanced neuroimaging and cerebrospinal fluid biomarker approaches. Despite the fact that preanalytical processing is the largest source of variability in laboratory testing, there are currently no standardized preanalytical guidelines. The current international working group provides the initial starting point for such guidelines in the form of standard operating procedures (SOPs). It is anticipated that these guidelines will be updated as additional research findings become available. The statement provides (1) a synopsis of selected preanalytical methods utilized in many international AD cohort studies, (2) initial draft guidelines/SOPs for preanalytical methods, and (3) a list of required methodological information and protocols to be made available for publications in the field in order to foster cross-validation across cohorts and laboratories. PMID:25282381
Schoenberg, Mike R; Lange, Rael T; Brickell, Tracey A; Saklofske, Donald H
2007-04-01
Neuropsychologic evaluation requires current test performance be contrasted against a comparison standard to determine if change has occurred. An estimate of premorbid intelligence quotient (IQ) is often used as a comparison standard. The Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) is a commonly used intelligence test. However, there is no method to estimate premorbid IQ for the WISC-IV, limiting the test's utility for neuropsychologic assessment. This study develops algorithms to estimate premorbid Full Scale IQ scores. Participants were the American WISC-IV standardization sample (N = 2172). The sample was randomly divided into 2 groups (development and validation). The development group was used to generate 12 algorithms. These algorithms were accurate predictors of WISC-IV Full Scale IQ scores in healthy children and adolescents. These algorithms hold promise as a method to predict premorbid IQ for patients with known or suspected neurologic dysfunction; however, clinical validation is required.
Assessment of the magnetic field exposure due to the battery current of digital mobile phones.
Jokela, Kari; Puranen, Lauri; Sihvonen, Ari-Pekka
2004-01-01
Hand-held digital mobile phones generate pulsed magnetic fields associated with the battery current. The peak value and the waveform of the battery current were measured for seven different models of digital mobile phones, and the results were used to approximate the magnetic flux density and induced currents in the phone user's head. A simple circular loop model was used for the magnetic field source, and a homogeneous sphere consisting of average brain tissue equivalent material simulated the head. The broadband magnetic flux density and the maximal induced current density were compared with the ICNIRP guidelines using two different approaches. In the first approach the relative exposure was determined separately at each frequency and the exposure ratios were summed to obtain the total exposure (multiple-frequency rule). In the second approach the waveform was weighted in the time domain with a simple low-pass RC filter and the peak value was divided by a peak limit, both derived from the guidelines (weighted peak approach). With the maximum transmitting power (2 W) the measured peak current varied from 1 to 2.7 A. The ICNIRP exposure ratio based on the current density varied from 0.04 to 0.14 for the weighted peak approach and from 0.08 to 0.27 for the multiple-frequency rule. The latter values are considerably greater than the corresponding exposure ratios of 0.005 (min) to 0.013 (max) obtained by applying the frequency-component evaluation presented in the new IEEE standard. Hence, the exposure does not seem to exceed the guidelines. The computed peak magnetic flux density substantially exceeded the derived peak reference level of ICNIRP, but it should be noted that in a near-field exposure the external field strengths are not valid indicators of exposure. Currently, no biological data exist to give a reason for concern about the health effects of magnetic field pulses from mobile phones.
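The two evaluation approaches can be sketched in a few lines. Below is a hedged Python illustration; the pulse waveform, reference levels, and weighting corner frequency are assumed placeholders, not the values used in the study:

```python
import numpy as np

# Illustrative sketch of the two ICNIRP evaluation approaches; waveform,
# reference levels L(f), and the RC weighting corner are assumed values.
fs = 1.0e6                                   # sample rate [Hz]
t = np.arange(0, 4.6e-3, 1 / fs)             # one GSM frame (217 Hz)
i = np.where(t < 0.577e-3, 2.0, 0.0)         # crude 2 A current burst model
B = 1e-3 * i                                 # assume B scales with current [T]

def ref_level(f):
    """Assumed piecewise ICNIRP-style reference level for B [T]."""
    return np.where(f < 800.0, 6.25e-6, 5.0e-3 / np.maximum(f, 1.0))

# (a) multiple-frequency rule: sum exposure ratios over the spectrum
spec = np.fft.rfft(B) / B.size
f = np.fft.rfftfreq(B.size, 1 / fs)
er_multi = np.sum(2.0 * np.abs(spec[1:]) / ref_level(f[1:]))

# (b) weighted peak: first-order RC low-pass in time, then peak over limit
fc = 800.0
alpha = 1.0 / (1.0 + fs / (2.0 * np.pi * fc))
w = np.zeros_like(B)
for n in range(1, B.size):                   # simple recursive RC filter
    w[n] = w[n - 1] + alpha * (B[n] - w[n - 1])
er_peak = np.abs(w).max() / 6.25e-6          # limit at/below the corner
print(er_multi, er_peak)
```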
Setting Standards for Reporting and Quantification in Fluorescence-Guided Surgery.
Hoogstins, Charlotte; Burggraaf, Jan Jaap; Koller, Marjory; Handgraaf, Henricus; Boogerd, Leonora; van Dam, Gooitzen; Vahrmeijer, Alexander; Burggraaf, Jacobus
2018-05-29
Intraoperative fluorescence imaging (FI) is a promising technique that could potentially guide oncologic surgeons toward more radical resections and thus improve clinical outcome. Despite the increase in the number of clinical trials, fluorescent agents and imaging systems for intraoperative FI, a standardized approach for imaging system performance assessment and post-acquisition image analysis is currently unavailable. We conducted a systematic, controlled comparison between two commercially available imaging systems using a novel calibration device for FI systems and various fluorescent agents. In addition, we analyzed fluorescence images from previous studies to evaluate signal-to-background ratio (SBR) and determinants of SBR. Using the calibration device, imaging system performance could be quantified and compared, exposing relevant differences in sensitivity. Image analysis demonstrated a profound influence of background noise and the selection of the background on SBR. In this article, we suggest clear approaches for the quantification of imaging system performance assessment and post-acquisition image analysis, attempting to set new standards in the field of FI.
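A minimal sketch of the SBR computation at issue, showing how the choice of background region alone changes the reported ratio (the image and ROIs are synthetic):

```python
import numpy as np

# Synthetic image: noise floor plus a bright lesion and a brighter
# near-lesion region mimicking scattered light.
rng = np.random.default_rng(0)
img = rng.normal(100.0, 10.0, (256, 256))
img[100:120, 100:120] += 400.0            # fluorescent lesion
img[80:95, 100:120] += 60.0               # scatter near the lesion

def sbr(image, signal_mask, background_mask):
    """Mean signal in the lesion ROI over mean in the chosen background."""
    return image[signal_mask].mean() / image[background_mask].mean()

sig = np.zeros(img.shape, dtype=bool); sig[100:120, 100:120] = True
near = np.zeros_like(sig); near[80:95, 100:120] = True
far = np.zeros_like(sig); far[200:220, 200:220] = True
print(sbr(img, sig, near))                # lower SBR: inflated background
print(sbr(img, sig, far))                 # higher SBR: clean background
```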
NASA Astrophysics Data System (ADS)
Hagar, Amit
Among the alternatives to non-relativistic quantum mechanics (NRQM) there are those that give different predictions than quantum mechanics in yet-untested circumstances, while remaining compatible with current empirical findings. In order to test these predictions, one must isolate one's system from environmentally induced decoherence, which, on the standard view of NRQM, is the dynamical mechanism responsible for the 'apparent' collapse in open quantum systems. Recent advances in condensed-matter physics may lead in the near future to experimental setups that will allow one to test the two hypotheses, namely genuine collapse vs. decoherence, and hence make progress toward a solution to the quantum measurement problem. Yet those philosophers and physicists who advocate an information-theoretic approach to the foundations of quantum mechanics are still unwilling to acknowledge the empirical character of the issue at stake. Here I argue that in doing so they are displaying an unwarranted double standard.
Mission Systems Open Architecture Science and Technology (MOAST) program
NASA Astrophysics Data System (ADS)
Littlejohn, Kenneth; Rajabian-Schwart, Vahid; Kovach, Nicholas; Satterthwaite, Charles P.
2017-04-01
The Mission Systems Open Architecture Science and Technology (MOAST) program is an AFRL effort that is developing and demonstrating Open System Architecture (OSA) component prototypes, along with methods and tools, to strategically evolve current OSA standards and technical approaches, promote affordable capability evolution, reduce integration risk, and address emerging challenges [1]. Within the context of open architectures, the program is conducting advanced research and concept development in the following areas: (1) Evolution of standards; (2) Cyber-Resiliency; (3) Emerging Concepts and Technologies; (4) Risk Reduction Studies and Experimentation; and (5) Advanced Technology Demonstrations. Current research includes the development of methods, tools, and techniques to characterize the performance of OMS data interconnection methods for representative mission system applications; of particular interest are the OMS Critical Abstraction Layer (CAL), the Avionics Service Bus (ASB), and the Bulk Data Transfer interconnects. The program is also developing and demonstrating cybersecurity countermeasure techniques to detect and mitigate cyberattacks against open-architecture-based mission systems and to ensure continued mission operations. Focus is on cybersecurity techniques that augment traditional cybersecurity controls and those currently defined within the Open Mission Systems and UCI standards. AFRL is also developing code generation tools and simulation tools to support evaluation and experimentation of OSA-compliant implementations.
Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems
NASA Technical Reports Server (NTRS)
Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.
2005-01-01
The current standards for handling uncertainty in control systems use interval bounds for the definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods such as mu-analysis can lead to overly conservative controller design, because worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system, and of a non-collocated mass-spring system, show the added information provided by this hybrid analysis.
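The contrast between interval-bounded and probabilistic analysis can be sketched with plain Monte Carlo (a stand-in for the paper's more efficient hybrid reliability method; the parameter distributions and second-order model are assumed):

```python
import numpy as np

# Propagate probabilistic parameter uncertainty through classical
# second-order step-response metrics instead of interval bounds.
rng = np.random.default_rng(1)
n = 20000
wn = rng.normal(10.0, 0.5, n)               # natural frequency [rad/s], assumed
zeta = np.clip(rng.normal(0.3, 0.05, n), 1e-3, 0.999)  # damping ratio, assumed

# Metrics for wn^2 / (s^2 + 2*zeta*wn*s + wn^2) under a unit step:
overshoot = 100 * np.exp(-np.pi * zeta / np.sqrt(1 - zeta**2))  # percent
ts = 4.0 / (zeta * wn)                      # 2% settling-time approximation [s]

print("overshoot: mean %.1f%%, std %.1f%%" % (overshoot.mean(), overshoot.std()))
print("P(overshoot > 40%%): %.3f" % (overshoot > 40).mean())
print("settling time: mean %.2f s, std %.2f s" % (ts.mean(), ts.std()))
```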
Searching for Physics Beyond the Standard Model and Beyond
NASA Astrophysics Data System (ADS)
Abdullah, Mohammad
The hierarchy problem, convolved with the various known puzzles in particle physics, grants us a great outlook of new physics soon to be discovered. We present multiple approaches to searching for physics beyond the standard model. First, two models with a minimal amount of theoretical guidance are analyzed using existing or simulated LHC data. Then, an extension of the Minimal Supersymmetric Standard Model (MSSM) is studied with an emphasis on the cosmological implications as well as the current and future sensitivity of colliders, direct detection and indirect detection experiments. Finally, a more complete model of the MSSM is presented through which we attempt to resolve tension with observations within the context of gauge mediated supersymmetry breaking.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sidhu, K.S.
1991-06-01
The primary objective of a standard setting process is to arrive at a drinking water concentration at which exposure to a contaminant would result in no known or potential adverse effect on human health. The drinking water standards also serve as guidelines to prevent pollution of water sources and may be applicable in some cases as regulatory remediation levels. Risk assessment methods, along with various decision-making parameters, are used to establish drinking water standards. For carcinogens classified in Groups A and B by the United States Environmental Protection Agency (USEPA), the standards are set by using nonthreshold cancer risk models. The linearized multistage model is commonly used for computation of potency factors for carcinogenic contaminants. The acceptable excess risk level may vary from 10(-6) to 10(-4). For noncarcinogens, a threshold model approach based on application of an uncertainty factor is used to arrive at a reference dose (RfD). The RfD approach may also be used for carcinogens classified in Group C by the USEPA. The RfD approach with an additional uncertainty factor of 10 for carcinogenicity has been applied in the formulation of risk assessment for Group C carcinogens. The assumptions commonly used in arriving at drinking water standards are human life expectancy, 70 years; average human body weight, 70 kg; human daily drinking water consumption, 2 liters; and contribution of exposure to the contaminant from drinking water (expressed as a part of the total environmental exposure), 20%. Currently, there are over 80 USEPA existing or proposed primary standards for organic and inorganic contaminants in drinking water. Some of the state versus federal needs and viewpoints are discussed.
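Using the default assumptions stated above (70 kg body weight, 2 L/day consumption, 20% relative source contribution), the two calculations look roughly as follows; the toxicity values are invented for illustration:

```python
# Worked sketch of the two standard-setting calculations described above.
BW, IR, RSC = 70.0, 2.0, 0.20            # kg, L/day, relative source contribution

# Noncarcinogen (threshold / RfD approach); NOAEL and UF are assumed:
noael, uf = 5.0, 1000.0                  # mg/kg-day and composite uncertainty factor
rfd = noael / uf
mcl_noncarcinogen = rfd * BW * RSC / IR  # mg/L
print("noncarcinogen standard: %.4f mg/L" % mcl_noncarcinogen)

# Group A/B carcinogen (nonthreshold, linearized multistage model);
# q1 is an assumed potency (slope) factor:
q1 = 0.1                                 # per mg/kg-day
for risk in (1e-6, 1e-4):
    conc = risk * BW / (q1 * IR)         # risk-specific concentration, mg/L
    print("risk %.0e -> %.2e mg/L" % (risk, conc))
```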
ERIC Educational Resources Information Center
van Loo, Jasper B.; Rocco, Tonette S.
2008-01-01
We discuss how economic theory has analyzed the effects of being GLBT (Gay, Lesbian, Bisexual or Transgender). We find that economics has focused on finding earnings differentials between GLBT and heterosexuals. The issue is, however, whether the standard analytical techniques available in economics, can be applied to sexual minorities. A number…
State-Level Support for Comprehensive School Reform: Implications for Policy and Practice
ERIC Educational Resources Information Center
Lane, Brett; Gracia, Susan
2005-01-01
In the current context of standards-based reform and heightened accountability for school performance, state education agencies (SEAs) have an important, but not yet well-articulated, role to play in local school improvement efforts. This article starts to articulate such a role by examining the variety of approaches and strategies used by 7 SEAs…
Distance Learning Approaches in the Mathematical Training of Pedagogical Institutes's Students
ERIC Educational Resources Information Center
Fomina, Tatyana; Vorobjev, Grigory; Kalitvin, Vladimir
2016-01-01
Nowadays, information technologies are more and more widely used in the mathematical education system. The generalization of experience and its implementation by means of open source software is of current interest. It is also important to address this problem due to the transfer to the new FSES (Federal State Education Standards) of high…
ERIC Educational Resources Information Center
Wallace, Maria F. G.
2017-01-01
Traditionally, the humanist narrative of "novice versus veteran teacher" serves as an efficient way to understand and produce an experienced workforce in the current neoliberal educational climate. This standard approach to workforce development has hit science teachers especially hard as they work within a culture of crisis for ensuring…
A Legal Approach to Tackling Contract Cheating?
ERIC Educational Resources Information Center
Draper, Michael J.; Newton, Philip M.
2017-01-01
The phenomenon of contract cheating presents, potentially, a serious threat to the quality and standards of Higher Education around the world. There have been suggestions, cited below, to tackle the problem using legal means, but we find that current laws are not fit for this purpose. In this article we present a proposal for a specific new law to…
Proposals for a Dynamic Library. Technical Report.
ERIC Educational Resources Information Center
Salton, Gerard
The current library environment is first examined, and an attempt is made to explain why the standard approaches to the library problem have been less productive than had been anticipated. A new design is then introduced for modern library operations based on a two-fold strategy: on the input side, the widest possible utilization should be made of…
Questioning the Role of "21st-Century Skills" in Arts Education Advocacy Discourse
ERIC Educational Resources Information Center
Logsdon, Leann F.
2013-01-01
The revised Core Arts Standards offer music educators the chance to examine the contradictions that currently permeate the arts advocacy discourse. This article examines the emphasis on 21st-century workplace skills in claims made by arts advocacy proponents. An alternative approach focuses instead on lifelong learning in the arts and the array of…
Examining a Public Montessori School's Response to the Pressures of High-Stakes Accountability
ERIC Educational Resources Information Center
Block, Corrie Rebecca
2015-01-01
A public Montessori school is expected to demonstrate high student scores on standardized assessments to succeed in the current school accountability era. A problem for a public Montessori elementary school is how to make sense of the school's high-stakes assessment scores in terms of Montessori's unique educational approach. This case study…
Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition
2013-06-01
The approach defines a model view for exchanging building information models (BIM) at the coordinated design stage of building construction, building on the open standard for exchanging Building Information Modeling (BIM) data, which defines hundreds of classes for common use in software, and on Construction Operations Building information exchange (COBie) specifications.
Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Rzepa, Henry S
2015-01-01
We describe three different procedures based on metadata standards for enabling automated retrieval of scientific data from digital repositories utilising the persistent identifier of the dataset with optional specification of the attributes of the data document such as filename or media type. The procedures are demonstrated using the JSmol molecular visualizer as a component of a web page and Avogadro as a stand-alone modelling program. We compare our methods for automated retrieval of data from a standards-compliant data repository with those currently in operation for a selection of existing molecular databases and repositories. Our methods illustrate the importance of adopting a standards-based approach of using metadata declarations to increase access to and discoverability of repository-based data.
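One concrete, widely deployed instance of such metadata-driven retrieval is HTTP content negotiation against a persistent identifier; the sketch below (with a placeholder DOI) is illustrative and is not the authors' exact procedure:

```python
import requests

# Resolve a DOI and request machine-readable citation metadata via the
# doi.org content-negotiation service. The DOI is a hypothetical placeholder;
# repository-specific attribute queries (e.g. selecting a file by name or
# media type) vary by repository API.
doi = "10.5061/dryad.example"
resp = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=30,
)
if resp.ok:
    meta = resp.json()            # title, authors, publisher, ...
    print(meta.get("title"))
```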
Proteomics data exchange and storage: the need for common standards and public repositories.
Jiménez, Rafael C; Vizcaíno, Juan Antonio
2013-01-01
Both the existence of data standards and public databases or repositories have been key factors behind the development of the existing "omics" approaches. In this book chapter we first review the main existing mass spectrometry (MS)-based proteomics resources: PRIDE, PeptideAtlas, GPMDB, and Tranche. Second, we report on the current status of the different proteomics data standards developed by the Proteomics Standards Initiative (PSI): the formats mzML, mzIdentML, mzQuantML, TraML, and PSI-MI XML are then reviewed. Finally, we present an easy way to query and access MS proteomics data in the PRIDE database, as a representative of the existing repositories, using the workflow management system (WMS) tool Taverna. Two different publicly available workflows are explained and described.
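For a flavor of programmatic access to one of the PSI formats listed above, the following sketch reads MS/MS spectra from an mzML file with the pyteomics library (the file name is a placeholder; this is a generic illustration, not the chapter's Taverna workflow):

```python
from pyteomics import mzml

# Iterate over spectra in an mzML file and pick out tandem (MS2) scans.
with mzml.read("example.mzML") as spectra:
    for spectrum in spectra:
        if spectrum.get("ms level") == 2:
            mz = spectrum["m/z array"]
            inten = spectrum["intensity array"]
            print(spectrum["id"], len(mz), inten.max())
            break
```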
Probabilistic consensus scoring improves tandem mass spectrometry peptide identification.
Nahnsen, Sven; Bertsch, Andreas; Rahnenführer, Jörg; Nordheim, Alfred; Kohlbacher, Oliver
2011-08-05
Database search is a standard technique for identifying peptides from their tandem mass spectra. To increase the number of correctly identified peptides, we suggest a probabilistic framework that allows the combination of scores from different search engines into a joint consensus score. Central to the approach is a novel method to estimate scores for peptides not found by an individual search engine. This approach allows the estimation of p-values for each candidate peptide and their combination across all search engines. The consensus approach works better than any single search engine across all the different instrument types considered in this study. Improvements vary strongly from platform to platform and from search engine to search engine. Compared to the industry standard MASCOT, our approach can identify up to 60% more peptides. The software for consensus predictions is implemented in C++ as part of OpenMS, a software framework for mass spectrometry. The source code is available in the current development version of OpenMS and can easily be used as a command line application or via the graphical pipeline designer TOPPAS.
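A generic way to combine per-engine p-values into one consensus score is Fisher's method, sketched below; this is a stand-in for, not a reproduction of, the paper's estimator, which also handles peptides missed by individual engines:

```python
import numpy as np
from scipy import stats

def consensus_p(pvalues):
    """Combine independent per-engine p-values for one candidate peptide."""
    p = np.asarray(pvalues, dtype=float)
    chi2 = -2.0 * np.log(p).sum()            # Fisher's combining statistic
    return stats.chi2.sf(chi2, df=2 * len(p))

# e.g. three engines scored the same peptide-spectrum match:
print(consensus_p([0.01, 0.04, 0.20]))
```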
An alternative approach for computing seismic response with accidental eccentricity
NASA Astrophysics Data System (ADS)
Fan, Xuanhua; Yin, Jiacong; Sun, Shuli; Chen, Pu
2014-09-01
Accidental eccentricity is a non-standard assumption for the seismic design of tall buildings. Taking it into consideration requires reanalysis of seismic resistance, which demands either time-consuming computation of the natural vibration of eccentric structures or finding a static displacement solution by applying an approximated equivalent torsional moment for each eccentric case. This study proposes an alternative modal response spectrum analysis (MRSA) approach to calculate seismic responses with accidental eccentricity. The proposed approach, called the Rayleigh Ritz Projection-MRSA (RRP-MRSA), is developed based on MRSA and two strategies: (a) an RRP method to obtain a fast calculation of approximate modes of eccentric structures; and (b) an approach to assemble the mass matrices of eccentric structures. The efficiency of RRP-MRSA is tested via engineering examples and compared with the standard MRSA (ST-MRSA) and one approximate method, i.e., the equivalent torsional moment hybrid MRSA (ETM-MRSA). Numerical results show that RRP-MRSA not only achieves almost the same precision as ST-MRSA, and is much better than ETM-MRSA, but is also more economical. Thus, RRP-MRSA can take the place of current accidental eccentricity computations in seismic design.
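The core RRP idea, reusing the nominal modes as a Ritz basis so that each eccentric variant only requires a small projected eigenproblem, can be sketched as follows (toy matrices, not the paper's structural models):

```python
import numpy as np
from scipy.linalg import eigh

n, m = 200, 12
rng = np.random.default_rng(2)
A = rng.normal(size=(n, n))
K = A @ A.T + n * np.eye(n)            # toy stiffness matrix (SPD)
M = np.eye(n)                          # nominal mass matrix

w2, Phi = eigh(K, M)                   # full eigensolve, done once
V = Phi[:, :m]                         # Ritz basis: lowest nominal modes

dM = np.zeros((n, n)); dM[0, 0] = 0.05 # mass perturbation modeling eccentricity
Kr, Mr = V.T @ K @ V, V.T @ (M + dM) @ V
w2_r, Q = eigh(Kr, Mr)                 # small m-by-m projected eigenproblem
modes = V @ Q                          # approximate modes of eccentric variant
print(np.sqrt(w2[:4]), np.sqrt(w2_r[:4]))
```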
Experimental Evaluation of Unicast and Multicast CoAP Group Communication
Ishaq, Isam; Hoebeke, Jeroen; Moerman, Ingrid; Demeester, Piet
2016-01-01
The Internet of Things (IoT) is expanding rapidly to new domains in which embedded devices play a key role and gradually outnumber traditionally-connected devices. These devices are often constrained in their resources and are thus unable to run standard Internet protocols. The Constrained Application Protocol (CoAP) is a new alternative standard protocol that implements the same principles as the Hypertext Transfer Protocol (HTTP), but is tailored towards constrained devices. In many IoT application domains, devices need to be addressed in groups in addition to being addressable individually. Two main approaches are currently being proposed in the IoT community for CoAP-based group communication. The main difference between the two approaches lies in the underlying communication type: multicast versus unicast. In this article, we experimentally evaluate those two approaches using two wireless sensor testbeds and under different test conditions. We highlight the pros and cons of each of them and propose combining these approaches in a hybrid solution to better suit certain use case requirements. Additionally, we provide a solution for multicast-based group membership management using CoAP. PMID:27455262
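The unicast approach amounts to one request per group member, as in the hedged aiocoap sketch below (hostnames and resource path are placeholders); the multicast approach replaces the loop with a single request to a group address such as ff05::fd, with the complications the article evaluates:

```python
import asyncio
from aiocoap import Context, Message, GET

# Unicast CoAP group read: one GET per member of the group.
MEMBERS = ["coap://[2001:db8::1]", "coap://[2001:db8::2]"]

async def read_group(path="/sensors/temperature"):
    ctx = await Context.create_client_context()
    for base in MEMBERS:
        req = Message(code=GET, uri=base + path)
        resp = await ctx.request(req).response
        print(base, resp.code, resp.payload.decode(errors="replace"))

asyncio.run(read_group())
```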
NASA Astrophysics Data System (ADS)
Badini, L.; Grassi, F.; Pignari, S. A.; Spadacini, G.; Bisognin, P.; Pelissou, P.; Marra, S.
2016-05-01
This work presents a theoretical rationale for the substitution of radiated-susceptibility (RS) verifications defined in current aerospace standards with an equivalent conducted-susceptibility (CS) test procedure based on bulk current injection (BCI) up to 500 MHz. Statistics is used to overcome the lack of knowledge about uncontrolled or uncertain setup parameters, with particular reference to the common-mode impedance of the equipment. The BCI test level is investigated so as to ensure correlation of the currents injected in the equipment under test via CS and RS. In particular, an over-testing probability quantifies the severity of the BCI test with respect to the RS test.
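The statistical idea can be sketched with a small Monte Carlo experiment; the circuit model and the distributions below are invented placeholders, not the paper's characterization of equipment impedance:

```python
import numpy as np

# Estimate the probability that a fixed BCI injection level drives more
# current into the equipment than the RS field would, given an uncertain
# common-mode impedance Zcm.
rng = np.random.default_rng(3)
n = 100_000
Zcm = rng.lognormal(mean=np.log(100.0), sigma=0.8, size=n)   # ohms, assumed

V_bci = 10.0                             # injected probe voltage (assumed)
I_bci = V_bci / (50.0 + Zcm)             # injected current, 50-ohm source model

I_rs = 20e-3 * rng.uniform(0.5, 1.5, n)  # field-induced current (assumed)

over_testing_probability = (I_bci > I_rs).mean()
print(over_testing_probability)
```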
Addressing current and future challenges for the NHS: the role of good leadership.
Elton, Lotte
2016-10-03
Purpose: This paper aims to describe and analyse some of the ways in which good leadership can enable those working within the National Health Service (NHS) to weather the changes and difficulties likely to arise in the coming years; it takes the format of an essay written by the prize-winner of the Faculty of Medical Leadership and Management's Student Prize. The Faculty of Medical Leadership and Management ran its inaugural Student Prize in 2015-2016, aimed at medical students with an interest in medical leadership. In running the Prize, the Faculty hoped to foster an enthusiasm for and understanding of the importance of leadership in medicine. Design/methodology/approach: The Faculty asked entrants to discuss the role of good leadership in addressing the current and future challenges faced by the NHS, making reference to the Leadership and Management Standards for Medical Professionals published by the Faculty in 2015. These standards were intended to help guide current and future leaders and were grouped into three categories, namely, self, team and corporate responsibility. Findings: This paper highlights the political nature of health care in the UK and the increasing impetus on medical professionals to navigate debates on austerity measures and health-care costs, particularly given the projected deficit in NHS funding. It stresses the importance of building organisational cultures prizing transparency to prevent future breaches in standards of care, and the value of patient-centred approaches in improving satisfaction for both patients and staff. Identification of opportunities for collaboration and partnership is emphasised as crucial to assuage the burden that lack of appropriate social care places on clinical services. Originality/value: This paper offers a novel perspective - that of a medical student - on the complex issues faced by the NHS over the coming years and utilises a well-regarded set of standards in conceptualising the role that health professionals have to play in leading the NHS.
A META-COMPOSITE SOFTWARE DEVELOPMENT APPROACH FOR TRANSLATIONAL RESEARCH
Sadasivam, Rajani S.; Tanik, Murat M.
2013-01-01
Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users’ needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements. PMID:23504436
A meta-composite software development approach for translational research.
Sadasivam, Rajani S; Tanik, Murat M
2013-06-01
Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users' needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements.
Synthesis: Intertwining product and process
NASA Technical Reports Server (NTRS)
Weiss, David M.
1990-01-01
Synthesis is a proposed systematic process for rapidly creating different members of a program family. Family members are described by variations in their requirements. Requirements variations are mapped to variations on a standard design to generate production quality code and documentation. The approach is made feasible by using principles underlying design for change. Synthesis incorporates ideas from rapid prototyping, application generators, and domain analysis. The goals of Synthesis and the Synthesis process are discussed. The technology needed and the feasibility of the approach are also briefly discussed. The status of current efforts to implement Synthesis methodologies is presented.
Formation Flying With Decentralized Control in Libration Point Orbits
NASA Technical Reports Server (NTRS)
Folta, David; Carpenter, J. Russell; Wagner, Christoph
2000-01-01
A decentralized control framework is investigated for applicability of formation flying control in libration orbits. The decentralized approach, being non-hierarchical, processes only direct measurement data, in parallel with the other spacecraft. Control is accomplished via linearization about a reference libration orbit with standard control using a Linear Quadratic Regulator (LQR) or the GSFC control algorithm. Both are linearized about the current state estimate as with the extended Kalman filter. Based on this preliminary work, the decentralized approach appears to be feasible for upcoming libration missions using distributed spacecraft.
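A minimal sketch of the LQR step named above, assuming a toy double-integrator stand-in for the dynamics linearized about the reference libration orbit (weights are invented):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Linearized dynamics x' = A x + B u about the reference orbit (toy model).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])             # position/velocity double integrator
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])               # state weights (assumed)
R = np.array([[0.1]])                  # control weight (assumed)

P = solve_continuous_are(A, B, Q, R)   # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)        # feedback gain: u = -K (x - x_ref)
print(K)
```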
Hermoso, Maria; Tabacchi, Garden; Iglesia-Altaba, Iris; Bel-Serrat, Silvia; Moreno-Aznar, Luis A; García-Santos, Yurena; García-Luzardo, Ma del Rosario; Santana-Salguero, Beatriz; Peña-Quintana, Luis; Serra-Majem, Lluis; Moran, Victoria Hall; Dykes, Fiona; Decsi, Tamás; Benetou, Vassiliki; Plada, Maria; Trichopoulou, Antonia; Raats, Monique M; Doets, Esmée L; Berti, Cristiana; Cetin, Irene; Koletzko, Berthold
2010-10-01
This paper presents a review of the current knowledge regarding the macro- and micronutrient requirements of infants and discusses issues related to these requirements during the first year of life. The paper also reviews the current reference values used in European countries and the methodological approaches used to derive them by a sample of seven European and international authoritative committees from which background scientific reports are available. Throughout the paper, the main issues contributing to disparities in micronutrient reference values for infants are highlighted. The identification of these issues in relation to the specific physiological aspects of infants is important for informing future initiatives aimed at providing standardized approaches to overcome variability of micronutrient reference values across Europe for this age group. © 2010 Blackwell Publishing Ltd.
The NASA F-106B Storm Hazards Program
NASA Technical Reports Server (NTRS)
Neely, W. R., Jr.; Fisher, B. D.
1983-01-01
During the NASA LRC Storm Hazards Program, 698 thunderstorm penetrations were made from 1980 to 1983 with an F-106B aircraft in order to record direct lightning strike data and the associated flight conditions. It was found that each of the three composite fin caps tested experienced multiple lightning attachments with only minor cosmetic damage. The maximum current level was only 20 kA, which is well below the design standard of 200 kA; however, indications are that the current rate-of-rise standard has been approached and may be exceeded in a major strike. The peak lightning strike rate occurred at ambient temperatures between -40 and -45 C, while most previously reported strikes have occurred at or near the freezing level. No significant operational difficulties or major aircraft damage resulting from the thunderstorm penetrations were found.
Microfluidic Transduction Harnesses Mass Transport Principles to Enhance Gene Transfer Efficiency.
Tran, Reginald; Myers, David R; Denning, Gabriela; Shields, Jordan E; Lytle, Allison M; Alrowais, Hommood; Qiu, Yongzhi; Sakurai, Yumiko; Li, William C; Brand, Oliver; Le Doux, Joseph M; Spencer, H Trent; Doering, Christopher B; Lam, Wilbur A
2017-10-04
Ex vivo gene therapy using lentiviral vectors (LVs) is a proven approach to treat and potentially cure many hematologic disorders and malignancies, but remains stymied by cumbersome, cost-prohibitive, and scale-limited production processes that cannot meet the demands of current clinical protocols. Limitations in LV manufacture, coupled with inefficient transduction protocols requiring significant excess amounts of vector, currently prevent widespread implementation. Herein, we describe a microfluidic, mass transport-based approach that overcomes the diffusion limitations of current transduction platforms to enhance LV gene transfer kinetics and efficiency. This novel ex vivo LV transduction platform is flexible in design, easy to use, scalable, and compatible with standard cell transduction reagents and LV preparations. Using hematopoietic cell lines, primary human T cells, and primary hematopoietic stem and progenitor cells (HSPCs) of both murine (Sca-1+) and human (CD34+) origin, microfluidic transduction using clinically processed LVs occurs up to 5-fold faster and requires as little as one-twentieth of the LV. As an in vivo validation of the microfluidic-based transduction technology, HSPC gene therapy was performed in hemophilia A mice using limiting amounts of LV. Compared to the standard static well-based transduction protocols, only animals transplanted with microfluidic-transduced cells displayed clotting levels restored to normal. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
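The mass-transport argument can be illustrated with a Stokes-Einstein back-of-envelope calculation (the particle radius, medium viscosity, and fluid heights below are assumed, not the paper's values):

```python
import numpy as np

# Lentivirus diffusion is slow, so shrinking the fluid height h over the
# cells cuts the time for vectors to reach them roughly as t ~ h^2 / (2 D).
kB, T, eta = 1.380649e-23, 310.0, 0.7e-3      # J/K, K, Pa*s (37 C medium)
r = 50e-9                                      # assumed LV particle radius [m]
D = kB * T / (6 * np.pi * eta * r)             # Stokes-Einstein diffusivity

for h in (2e-3, 50e-6):                        # well plate vs microchannel
    print(f"h = {h*1e6:7.0f} um  ->  t ~ {h**2 / (2*D) / 3600:8.1f} h")
```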
Van Broeck, G; Van Langenhove, H; Nieuwejaers, B
2001-01-01
Until now there has been little uniformity in the approach to odour nuisance problems in Flanders. A switch to a standardised and scientifically underpinned approach is essential and is currently in full development. This paper mainly discusses the results of five years of research on odour concentration standard development in Flanders, executed in the period 1996-2000. The research was focused on five pilot sectors: pig farms, slaughterhouses, paint spray installations, sewage treatment plants and textile plants. The general approach of the method to determine the dose-response relation was found to be sufficient. The methodology used is fully described in the paper presented by Van Broeck and Van Langenhove at the CIWEM and IAWQ Joint International Conference on Control and Prevention of Odours in the Water Industry in September 1999. For each location (16 locations in total) an unambiguous dose-response relation could be derived (rising nuisance for rising concentrations). In most cases, a "no effect" level could be determined. The background percentage nuisance fluctuated between 0 and 15%. For the slaughterhouse, paint spray installation and sewage treatment plant sectors, no effect levels of 0.5, 2.0 and 0.5 sniffing units m(-3), respectively, as the 98th percentile were determined (sniffing units are odour concentrations measured by means of field sniffing measurements). For the textile plant and pig farm sectors, no unambiguous no effect level was found. Currently research is undertaken to translate the no effect levels into odour standards. Other initiatives taken to underpin the Flemish odour regulations, such as the development of an odour source inventory and a complaint handling system, are also briefly discussed.
SNL/CA Facilities Management Design Standards Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabb, David; Clark, Eva
2014-12-01
At Sandia National Laboratories in California (SNL/CA), the design, construction, operation, and maintenance of facilities is guided by industry standards, a graded approach, and the systematic analysis of life cycle benefits received for costs incurred. The design of the physical plant must ensure that the facilities are "fit for use," and provide conditions that effectively, efficiently, and safely support current and future mission needs. In addition, SNL/CA applies sustainable design principles, using an integrated whole-building design approach, from site planning to facility design, construction, and operation to ensure building resource efficiency and the health and productivity of occupants. The safety and health of the workforce and the public, any possible effects on the environment, and compliance with building codes take precedence over project issues, such as performance, cost, and schedule.
Ford, Kristina L.; Zeng, Wei; Heazlewood, Joshua L.; ...
2015-08-28
The analysis of post-translational modifications (PTMs) by proteomics is regarded as a technically challenging undertaking. While in recent years approaches to examine and quantify protein phosphorylation have greatly improved, the analysis of many protein modifications, such as glycosylation, is still regarded as problematic. Limitations in the standard proteomics workflow, such as use of suboptimal peptide fragmentation methods, can significantly prevent the identification of glycopeptides. The current generation of tandem mass spectrometers has made available a variety of fragmentation options, many of which are becoming standard features on these instruments. Lastly, we have used three common fragmentation techniques, namely CID, HCD, and ETD, to analyze a glycopeptide and highlight how an integrated fragmentation approach can be used to identify the modified residue and characterize the N-glycan on a peptide.
Optimal Force Control of Vibro-Impact Systems for Autonomous Drilling Applications
NASA Technical Reports Server (NTRS)
Aldrich, Jack B.; Okon, Avi B.
2012-01-01
The need to maintain optimal energy efficiency is critical during the drilling operations performed on current and future planetary rover missions. Specifically, this innovation seeks to solve the following problem. Given a spring-loaded percussive drill driven by a voice-coil motor, one needs to determine the optimal input voltage waveform (periodic function) and the optimal hammering period that minimize the dissipated energy, while ensuring that the hammer-to-rock impacts are made with sufficient (user-defined) impact velocity (or impact energy). To solve this problem, it was first observed that when voice-coil-actuated percussive drills are driven at high power, it is of paramount importance to ensure that the electrical current of the device remains in phase with the velocity of the hammer. Otherwise, negative work is performed and the drill experiences a loss of performance (i.e., reduced impact energy) and an increase in Joule heating (i.e., reduction in energy efficiency). This observation has motivated many drilling products to incorporate the standard bang-bang control approach for driving their percussive drills. However, the bang-bang control approach is significantly less efficient than the optimal energy-efficient control approach solved herein. To obtain this solution, the standard tools of classical optimal control theory were applied. It is worth noting that these tools inherently require the solution of a two-point boundary value problem (TPBVP), i.e., a system of differential equations where half the equations have unknown boundary conditions. Typically, the TPBVP is impossible to solve analytically for high-dimensional dynamic systems. However, for the case of the spring-loaded vibro-impactor, this approach yields the exact optimal control solution as the sum of four analytic functions whose coefficients are determined using a simple, easy-to-implement algorithm. Once the optimal control waveform is determined, it can be used optimally in the context of both open-loop and closed-loop control modes (using standard real-time control hardware).
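The TPBVP structure can be made concrete for a simplified linear voice-coil model; the sketch below applies Pontryagin's stationarity condition and scipy's solve_bvp with invented parameters, and is not the paper's four-analytic-function solution:

```python
import numpy as np
from scipy.integrate import solve_bvp

# Minimize integral of i^2 (Joule heating) for m x'' = -k x - c v + K i,
# subject to reaching a required position and impact velocity at time T.
m, k, c, K = 0.1, 2000.0, 0.5, 5.0     # mass, spring, damping, motor constant
T = 0.05                                # time horizon [s]
x0, v0 = 0.0, 0.0                       # start at rest
xT, vT = 0.01, 1.5                      # required terminal state (assumed)

def ode(t, y):
    x, v, l1, l2 = y                    # states and costates
    i = -l2 * K / (2 * m)               # stationarity: dH/di = 0
    return np.vstack([v,
                      (-k * x - c * v + K * i) / m,
                      l2 * k / m,       # l1' = -dH/dx
                      -l1 + l2 * c / m])  # l2' = -dH/dv

def bc(ya, yb):
    return np.array([ya[0] - x0, ya[1] - v0, yb[0] - xT, yb[1] - vT])

t = np.linspace(0, T, 101)
sol = solve_bvp(ode, bc, t, np.zeros((4, t.size)))
i_opt = -sol.sol(t)[3] * K / (2 * m)    # optimal current waveform over one period
print(sol.status, i_opt.max())
```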
Limited access atrial septal defect closure and the evolution of minimally invasive surgery.
Izzat, M B; Yim, A P; El-Zufari, M H
1998-04-01
While minimizing the "invasiveness" in general surgery has been equated with minimizing "access", what constitutes minimally invasive intra-cardiac surgery remains controversial. Many surgeons doubt the benefits of minimizing access when the need for cardiopulmonary bypass cannot be waived. Recognizing that median sternotomy itself does entail significant morbidity, we investigated the value of alternative approaches to median sternotomy using atrial septal defect closure as our investigative model. We believe that some, but not all minimal access approaches are associated with reduced postoperative morbidity and enhanced recovery. Our current strategy is to use a mini-sternotomy approach in adult patients, whereas conventional median sternotomy remains our standard approach in the pediatric population. Considerable clinical experiences coupled with documented clinical benefits are fundamental before a certain approach is adopted in routine practice.
ERIC Educational Resources Information Center
Blasberg, Jonathan S.; Hewitt, Paul L.; Flett, Gordon L.; Sherry, Simon B.; Chen, Chang
2016-01-01
In the current research, we illustrate the impact that item wording has on the content of personality scales and how differences in item wording influence empirical results. We present evidence indicating that items in certain scales used to measure "adaptive" perfectionism fail to capture the disabling all-or-nothing approach that is…
ERIC Educational Resources Information Center
Rusk, Kara
2012-01-01
For teachers who work with students with severe disabilities, it is a challenge to find ways to incorporate the core content and academic standards with the functional skills that the student will need to be independent as he or she transitions into adulthood. "Current federal requirements challenge educators, to bring about achievement of a…
ERIC Educational Resources Information Center
Kelava, Augustin; Nagengast, Benjamin
2012-01-01
Structural equation models with interaction and quadratic effects have become a standard tool for testing nonlinear hypotheses in the social sciences. Most of the current approaches assume normally distributed latent predictor variables. In this article, we present a Bayesian model for the estimation of latent nonlinear effects when the latent…
[Medical nutrition in Alzheimer's: the trials].
Scheltens, Philip; Twisk, Jos W R
2013-01-01
We describe the small but statistically significant effects of the medical nutrition diet 'Souvenaid' on memory in early Alzheimer's disease in two published randomised clinical trials. We specifically discuss the design and statistical approach, which were predefined and meet current standards in the field. Further research is needed to substantiate the long term effects and learn more about the mode of action of Souvenaid.
Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)
2001-01-01
The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Of particular interest are analytical tools for elastic-plastic fracture analysis, a regime that is currently handled empirically for the Space Shuttle External Tank (ET) through simulated service testing of pre-cracked panels.
ERIC Educational Resources Information Center
Mars, Matthew M.; Ball, Anna L.
2016-01-01
The mainstream agricultural literacy movement has been mostly focused on school-based learning through formal curricula and standardized non-formal models (e.g., FFA, 4-H). The purpose of the current study is to qualitatively explore through a grounded theory approach, the development, sharing, and translation of diverse forms of agricultural…
Wee, Eugene J.H.; Wang, Yuling; Tsao, Simon Chang-Hao; Trau, Matt
2016-01-01
Sensitive and accurate identification of specific DNA mutations can influence clinical decisions. However accurate diagnosis from limiting samples such as circulating tumour DNA (ctDNA) is challenging. Current approaches based on fluorescence such as quantitative PCR (qPCR) and more recently, droplet digital PCR (ddPCR) have limitations in multiplex detection, sensitivity and the need for expensive specialized equipment. Herein we describe an assay capitalizing on the multiplexing and sensitivity benefits of surface-enhanced Raman spectroscopy (SERS) with the simplicity of standard PCR to address the limitations of current approaches. This proof-of-concept method could reproducibly detect as few as 0.1% (10 copies, CV < 9%) of target sequences thus demonstrating the high sensitivity of the method. The method was then applied to specifically detect three important melanoma mutations in multiplex. Finally, the PCR/SERS assay was used to genotype cell lines and ctDNA from serum samples, where results were subsequently validated with ddPCR. With ddPCR-like sensitivity and accuracy yet at the convenience of standard PCR, we believe this multiplex PCR/SERS method could find wide applications in both diagnostics and research. PMID:27446486
Wee, Eugene J H; Wang, Yuling; Tsao, Simon Chang-Hao; Trau, Matt
2016-01-01
Sensitive and accurate identification of specific DNA mutations can influence clinical decisions. However accurate diagnosis from limiting samples such as circulating tumour DNA (ctDNA) is challenging. Current approaches based on fluorescence such as quantitative PCR (qPCR) and more recently, droplet digital PCR (ddPCR) have limitations in multiplex detection, sensitivity and the need for expensive specialized equipment. Herein we describe an assay capitalizing on the multiplexing and sensitivity benefits of surface-enhanced Raman spectroscopy (SERS) with the simplicity of standard PCR to address the limitations of current approaches. This proof-of-concept method could reproducibly detect as few as 0.1% (10 copies, CV < 9%) of target sequences thus demonstrating the high sensitivity of the method. The method was then applied to specifically detect three important melanoma mutations in multiplex. Finally, the PCR/SERS assay was used to genotype cell lines and ctDNA from serum samples, where results were subsequently validated with ddPCR. With ddPCR-like sensitivity and accuracy yet at the convenience of standard PCR, we believe this multiplex PCR/SERS method could find wide applications in both diagnostics and research.
Gearbox Reliability Collaborative Phase 3 Gearbox 2 Test Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Link, H.; Keller, J.; Guo, Y.
2013-04-01
Gearboxes in wind turbines have not been achieving their expected design life even though they commonly meet or exceed the design criteria specified in current design standards. One of the basic premises of the National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) is that the low gearbox reliability results from the absence of critical elements in the design process or insufficient design tools. Key goals of the GRC are to improve design approaches and analysis tools and to recommend practices and test methods resulting in improved design standards for wind turbine gearboxes that lower the cost of energy (COE) through improved reliability. The GRC uses a combined gearbox testing, modeling and analysis approach, along with a database of information from gearbox failures collected from overhauls, and an investigation of gearbox condition monitoring techniques, to improve wind turbine operations and maintenance practices. This plan covers testing of Gearbox 2 (GB2) using the two-speed turbine controller that has been used in prior testing. The test series will investigate non-torque loads, high-speed shaft misalignment, and reproduction of field conditions in the dynamometer. It will also include vibration testing using an eddy-current brake on the gearbox's high-speed shaft.
A review on digital ECG formats and the relationships between them.
Trigo, Jesús Daniel; Alesanco, Alvaro; Martínez, Ignacio; García, José
2012-05-01
A plethora of digital ECG formats have been proposed and implemented. This heterogeneity hinders the design and development of interoperable systems and entails critical integration issues for healthcare information systems. This paper aims to provide a comprehensive overview of the current state of affairs of the interoperable exchange of digital ECG signals. This includes 1) a review of existing digital ECG formats, 2) a collection of applications and cardiology settings using such formats, 3) a compilation of the relationships between such formats, and 4) a reflection on the current situation and foreseeable future of the interoperable exchange of digital ECG signals. The objectives have been approached by completing and updating previous reviews on the topic through appropriate database mining. 39 digital ECG formats, 56 applications, tools or implementation experiences, 47 mappings/converters, and 6 relationships between such formats have been found in the literature. The creation and generalization of a single standardized ECG format is a desirable goal. However, this unification requires political commitment and international cooperation among different standardization bodies. Ongoing ontology-based approaches covering the ECG domain have recently emerged as a promising alternative for reaching fully fledged ECG interoperability in the near future.
Exposure Patterns and Health Effects Associated with Swimming and Surfing in Polluted Marine Waters
NASA Astrophysics Data System (ADS)
Grant, S. B.
2007-05-01
Marine bathing beaches are closed to the public whenever water quality fails to meet State and Federal standards. In this talk I will explore the science (and lack thereof!) behind these beach closures, including the health effects data upon which standards are based, shortcomings of the current approach used for testing and notification, and the high degree of spatial and temporal heterogeneity associated with human exposure to pollutants in these systems. The talk will focus on examples from Huntington Beach, where the speaker has conducted research over the past several years.
New approach to flavor symmetry and an extended naturalness principle
NASA Astrophysics Data System (ADS)
Barr, S. M.
2010-09-01
A class of nonsupersymmetric extensions of the standard model is proposed in which there is a multiplicity of light scalar doublets in a multiplet of a nonabelian family group with the standard model Higgs doublet. Anthropic tuning makes the latter light, and consequently the other scalar doublets remain light because of the family symmetry. The family symmetry greatly constrains the pattern of flavor-changing neutral-current (FCNC) interactions and proton-decay operators coming from scalar exchange. Such models show that useful constraints on model building can come from an extended naturalness principle when the electroweak scale is anthropically tuned.
Eliason, Michele J
2014-01-01
There is currently no consensus on the best ways to define and operationalize research concepts related to sexuality and gender. This article explores some of the ways that sex/gender and sexuality terms have been used in health-related research and in keyword searches in the health sciences. Reasons for the diversity of terms and measurement approaches are explored and arguments for and against standardizing the language are presented. The article ends with recommendations for beginning a productive dialogue among health researchers to create some consistency in the terminology used to assess sexuality and gender.
Alternative approaches to conventional antiepileptic drugs in the management of paediatric epilepsy
Kneen, R; Appleton, R E
2006-01-01
Over the last two decades, there has been a rapid expansion in the number and types of available antiepileptic drugs (AEDs), but there is increasing concern amongst parents and carers about their unwanted side effects. Seizure control is achieved in approximately 75% of children treated with conventional AEDs, but non‐conventional (or non‐standard) medical treatments, surgical procedures, dietary approaches, and other non‐pharmacological treatment approaches may have a role to play in those with intractable seizures or AED toxicity. Many of the approaches are largely common sense and are already incorporated into our current practice, including, for example, avoidance techniques and lifestyle advice, while others require further investigation or appear to be impractical in children. PMID:17056869
Computational intelligence approaches for pattern discovery in biological systems.
Fogel, Gary B
2008-07-01
Biology, chemistry and medicine are faced by tremendous challenges caused by an overwhelming amount of data and the need for rapid interpretation. Computational intelligence (CI) approaches such as artificial neural networks, fuzzy systems and evolutionary computation are being used with increasing frequency to contend with this problem, in light of noise, non-linearity and temporal dynamics in the data. Such methods can be used to develop robust models of processes either on their own or in combination with standard statistical approaches. This is especially true for database mining, where modeling is a key component of scientific understanding. This review provides an introduction to current CI methods, their application to biological problems, and concludes with a commentary about the anticipated impact of these approaches in bioinformatics.
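As a minimal taste of the evolutionary-computation branch mentioned above, the sketch below fits a noisy nonlinear dose-response model with a tiny (mu + lambda) evolution strategy (entirely synthetic data):

```python
import numpy as np

# Fit y = a / (1 + exp(-(x - b))) to noisy data without gradients.
rng = np.random.default_rng(4)
x = np.linspace(0, 10, 50)
y = 3.0 / (1 + np.exp(-(x - 5.0))) + rng.normal(0, 0.1, x.size)

def loss(p):
    a, b = p
    return np.mean((a / (1 + np.exp(-(x - b))) - y) ** 2)

pop = rng.normal(0, 2, (30, 2))                     # initial population
for _ in range(200):
    scores = np.array([loss(p) for p in pop])
    parents = pop[np.argsort(scores)[:10]]          # mu best survive
    children = parents[rng.integers(0, 10, 20)] + rng.normal(0, 0.2, (20, 2))
    pop = np.vstack([parents, children])            # mu + lambda scheme
print(pop[np.argmin([loss(p) for p in pop])])       # best (a, b) found
```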
Smith, James; Ross, Kirstin; Whiley, Harriet
2016-12-08
Foodborne illness is a global public health burden. Over the past decade in Australia, despite advances in microbiological detection and control methods, there has been an increase in the incidence of foodborne illness. Therefore improvements in the regulation and implementation of food safety policy are crucial for protecting public health. In 2000, Australia established a national food safety regulatory system, which included the adoption of a mandatory set of food safety standards. These were in line with international standards and moved away from a "command and control" regulatory approach to an "outcomes-based" approach using risk assessment. The aim was to achieve national consistency and reduce foodborne illness without unnecessarily burdening businesses. Evidence demonstrates that a risk based approach provides better protection for consumers; however, sixteen years after the adoption of the new approach, the rates of food borne illness are still increasing. Currently, food businesses are responsible for producing safe food and regulatory bodies are responsible for ensuring legislative controls are met. Therefore there is co-regulatory responsibility and liability and implementation strategies need to reflect this. This analysis explores the challenges facing food regulation in Australia and explores the rationale and evidence in support of this new regulatory approach.
Smith, James; Ross, Kirstin; Whiley, Harriet
2016-01-01
Foodborne illness is a global public health burden. Over the past decade in Australia, despite advances in microbiological detection and control methods, there has been an increase in the incidence of foodborne illness. Therefore improvements in the regulation and implementation of food safety policy are crucial for protecting public health. In 2000, Australia established a national food safety regulatory system, which included the adoption of a mandatory set of food safety standards. These were in line with international standards and moved away from a “command and control” regulatory approach to an “outcomes-based” approach using risk assessment. The aim was to achieve national consistency and reduce foodborne illness without unnecessarily burdening businesses. Evidence demonstrates that a risk based approach provides better protection for consumers; however, sixteen years after the adoption of the new approach, the rates of food borne illness are still increasing. Currently, food businesses are responsible for producing safe food and regulatory bodies are responsible for ensuring legislative controls are met. Therefore there is co-regulatory responsibility and liability and implementation strategies need to reflect this. This analysis explores the challenges facing food regulation in Australia and explores the rationale and evidence in support of this new regulatory approach. PMID:27941657
Modeling healthcare authorization and claim submissions using the openEHR dual-model approach
2011-01-01
Background: The TISS standard is a set of mandatory forms and electronic messages for healthcare authorization and claim submissions among healthcare plans and providers in Brazil. It is not based on formal models as the new generation of health informatics standards suggests. The objective of this paper is to model the TISS in terms of the openEHR archetype-based approach and integrate it into a patient-centered EHR architecture. Methods: Three approaches were adopted to model TISS. In the first approach, a set of archetypes was designed using ENTRY subclasses. In the second one, a set of archetypes was designed using exclusively ADMIN_ENTRY and CLUSTERs as their root classes. In the third approach, the openEHR ADMIN_ENTRY is extended with classes designed for authorization and claim submissions, and an ISM_TRANSITION attribute is added to the COMPOSITION class. Another set of archetypes was designed based on this model. For all three approaches, templates were designed to represent the TISS forms. Results: The archetypes based on the openEHR RM (Reference Model) can represent all TISS data structures. The extended model adds subclasses and an attribute to the COMPOSITION class to represent information on authorization and claim submissions. The archetypes based on all three approaches have similar structures, although rooted in different classes. The extended openEHR RM model is more semantically aligned with the concepts involved in a claim submission, but may disrupt interoperability with other systems, and the current tools must be adapted to deal with it. Conclusions: Modeling the TISS standard by means of the openEHR approach makes it aligned with ISO recommendations and provides a solid foundation on which the TISS can evolve. Although there are few administrative archetypes available, the openEHR RM is expressive enough to represent the TISS standard. This paper focuses on the TISS but its results may be extended to other billing processes. A complete communication architecture to simulate the exchange of TISS data between systems according to the openEHR approach still needs to be designed and implemented. PMID:21992670
Schoolcraft, William; Meseguer, Marcos
2017-10-01
Infertility affects over 70 million couples globally. Access to, and interest in, assisted reproductive technologies is growing worldwide, with more couples seeking medical intervention to conceive, in particular by IVF. Despite numerous advances in IVF techniques since its first success in 1978, almost half of the patients treated remain childless. The multifactorial nature of IVF treatment means that success is dependent on many variables. Therefore, it is important to examine how each variable can be optimized to achieve the best possible outcomes for patients. The current approach to IVF is fragmented, with various protocols in use. A systematic approach to establishing optimum best practices may improve IVF success and live birth rates. Our vision of the future is that technological advancements in the laboratory setting are standardized and universally adopted to enable a gold standard of care. Implementation of best practices for laboratory procedures will enable clinicians to generate high-quality gametes, and to produce and identify gametes and embryos of maximum viability and implantation potential, which should contribute to improving take-home healthy baby rates. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Dependability of technical items: Problems of standardization
NASA Astrophysics Data System (ADS)
Fedotova, G. A.; Voropai, N. I.; Kovalev, G. F.
2016-12-01
This paper is concerned with problems that arose in the development of a new version of the Interstate Standard GOST 27.002, "Industrial product dependability. Terms and definitions". This Standard covers a wide range of technical items and is used in numerous regulations, specifications, and standard and technical documentation. The currently applicable State Standard GOST 27.002-89 was introduced in 1990. Its development involved the participation of scientists and experts from different technical areas, and its draft was debated before different audiences and constantly refined, so it was a high-quality document. However, after 25 years of its application it has become necessary to develop a new version of the Standard that reflects the current understanding of industrial dependability, accounting for the changes taking place in Russia in the production, management and development of various technical systems and facilities. The development of a new version of the Standard makes it possible to generalize, on a terminological level, the knowledge and experience in the reliability of technical items accumulated over a quarter of a century in different industries and reliability research schools, and to account for domestic and foreign experience of standardization. Working on the new version of the Standard, we have faced a number of issues and problems of harmonization with the International Standard IEC 60050-192, caused above all by different approaches to the use of terms and by differences in the mentalities of experts from different countries. The paper focuses on the problems related to the chapter "Maintenance, restoration and repair", whose term definitions proved difficult for the developers to harmonize both with experts and with the International Standard, mainly because the Russian concept and practice of maintenance and repair differ from foreign ones.
In-Situ Transfer Standard and Coincident-View Intercomparisons for Sensor Cross-Calibration
NASA Technical Reports Server (NTRS)
Thome, Kurt; McCorkel, Joel; Czapla-Myers, Jeff
2013-01-01
There exist numerous methods for accomplishing on-orbit calibration. Methods include the reflectance-based approach relying on measurements of surface and atmospheric properties at the time of a sensor overpass as well as invariant scene approaches relying on knowledge of the temporal characteristics of the site. The current work examines typical cross-calibration methods and discusses the expected uncertainties of the methods. Data from the Advanced Land Imager (ALI), Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Enhanced Thematic Mapper Plus (ETM+), Moderate Resolution Imaging Spectroradiometer (MODIS), and Thematic Mapper (TM) are used to demonstrate the limits of relative sensor-to-sensor calibration as applied to current sensors, while Landsat-5 TM and Landsat-7 ETM+ are used to evaluate the limits of in situ site characterizations for SI-traceable cross calibration. The current work examines the difficulties in trending results from cross-calibration approaches, taking into account sampling issues, site-to-site variability, and accuracy of the method. Special attention is given to the differences caused in the cross-comparison of sensors in radiance space as opposed to reflectance space. The results show that cross calibrations with absolute uncertainties less than 1.5 percent (1 sigma) are currently achievable even for sensors without coincident views.
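The radiance-versus-reflectance distinction mentioned above can be made concrete with the standard conversion from at-sensor radiance to top-of-atmosphere reflectance. The sketch below uses this textbook formula with made-up band values; none of the numbers are drawn from the paper.

```python
import numpy as np

def toa_reflectance(L, esun, d_au, sza_deg):
    """Convert at-sensor radiance L (W m^-2 sr^-1 um^-1) to TOA reflectance.
    esun: mean exoatmospheric solar irradiance for the band (W m^-2 um^-1)
    d_au: Earth-Sun distance in astronomical units
    sza_deg: solar zenith angle in degrees
    """
    return np.pi * L * d_au**2 / (esun * np.cos(np.radians(sza_deg)))

# Comparing two sensors over the same target in radiance space couples the
# result to each band's assumed solar irradiance; reflectance space removes
# the shared solar term but inherits errors in the ESUN values themselves.
L_ref, L_test = 82.4, 80.9                       # illustrative band radiances
r_ref  = toa_reflectance(L_ref, 1536.0, 1.0041, 31.2)
r_test = toa_reflectance(L_test, 1547.0, 1.0041, 31.5)
print("radiance ratio:    %.4f" % (L_test / L_ref))
print("reflectance ratio: %.4f" % (r_test / r_ref))
```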
Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto; Marcos, Mar; Legaz-García, María del Carmen; Moner, David; Torres-Sospedra, Joaquín; Esteban-Gil, Angel; Martínez-Salvador, Begoña; Robles, Montserrat
2013-01-01
Background The secondary use of electronic healthcare records (EHRs) often requires the identification of patient cohorts. In this context, an important problem is the heterogeneity of clinical data sources, which can be overcome with the combined use of standardized information models, virtual health records, and semantic technologies, since each of them contributes to solving aspects related to the semantic interoperability of EHR data. Objective To develop methods allowing for a direct use of EHR data for the identification of patient cohorts leveraging current EHR standards and semantic web technologies. Materials and methods We propose to take advantage of the best features of working with EHR standards and ontologies. Our proposal is based on our previous results and experience working with both technological infrastructures. Our main principle is to perform each activity at the abstraction level with the most appropriate technology available. This means that part of the processing will be performed using archetypes (ie, data level) and the rest using ontologies (ie, knowledge level). Our approach will start working with EHR data in proprietary format, which will be first normalized and elaborated using EHR standards and then transformed into a semantic representation, which will be exploited by automated reasoning. Results We have applied our approach to protocols for colorectal cancer screening. The results comprise the archetypes, ontologies, and datasets developed for the standardization and semantic analysis of EHR data. Anonymized real data have been used and the patients have been successfully classified by the risk of developing colorectal cancer. Conclusions This work provides new insights into how archetypes and ontologies can be effectively combined for EHR-driven phenotyping. The methodological approach can be applied to other problems provided that suitable archetypes, ontologies, and classification rules can be designed. PMID:23934950
Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto; Marcos, Mar; Legaz-García, María del Carmen; Moner, David; Torres-Sospedra, Joaquín; Esteban-Gil, Angel; Martínez-Salvador, Begoña; Robles, Montserrat
2013-12-01
The secondary use of electronic healthcare records (EHRs) often requires the identification of patient cohorts. In this context, an important problem is the heterogeneity of clinical data sources, which can be overcome with the combined use of standardized information models, virtual health records, and semantic technologies, since each of them contributes to solving aspects related to the semantic interoperability of EHR data. To develop methods allowing for a direct use of EHR data for the identification of patient cohorts leveraging current EHR standards and semantic web technologies. We propose to take advantage of the best features of working with EHR standards and ontologies. Our proposal is based on our previous results and experience working with both technological infrastructures. Our main principle is to perform each activity at the abstraction level with the most appropriate technology available. This means that part of the processing will be performed using archetypes (ie, data level) and the rest using ontologies (ie, knowledge level). Our approach will start working with EHR data in proprietary format, which will be first normalized and elaborated using EHR standards and then transformed into a semantic representation, which will be exploited by automated reasoning. We have applied our approach to protocols for colorectal cancer screening. The results comprise the archetypes, ontologies, and datasets developed for the standardization and semantic analysis of EHR data. Anonymized real data have been used and the patients have been successfully classified by the risk of developing colorectal cancer. This work provides new insights into how archetypes and ontologies can be effectively combined for EHR-driven phenotyping. The methodological approach can be applied to other problems provided that suitable archetypes, ontologies, and classification rules can be designed.
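A compressed illustration of the two abstraction levels described in these abstracts, with invented field names and simplified rules standing in for the actual archetypes and ontology classes: the data level normalizes a proprietary record into an archetype-like structure, and the knowledge level applies a classification rule of the kind an OWL reasoner would evaluate over the semantic representation.

```python
# Data level: normalize a proprietary record into an archetype-like structure.
def normalize(raw: dict) -> dict:
    return {
        "colonoscopy_finding": raw.get("FINDING", "").lower(),
        "polyp_count": int(raw.get("N_POLYPS", 0)),
        "largest_polyp_mm": float(raw.get("MAX_SIZE_MM", 0)),
    }

# Knowledge level: a risk-classification rule, expressed here as plain Python
# rather than description logic; thresholds are illustrative only.
def crc_risk(entry: dict) -> str:
    if entry["polyp_count"] >= 3 or entry["largest_polyp_mm"] >= 10:
        return "high"
    if entry["polyp_count"] > 0:
        return "intermediate"
    return "low"

record = {"FINDING": "Adenoma", "N_POLYPS": "4", "MAX_SIZE_MM": "12"}
print(crc_risk(normalize(record)))   # -> "high"
```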
NASA Astrophysics Data System (ADS)
Lezcano, Jean-Marc; Adachihara, Hatsuo; Prunier, Marc
European higher education organizations are encouraged to implement quality management practices. Existing quality standards and frameworks do not capitalize on an important set of best practices addressing the delivery of educational services. The ISO/IEC 20000 standard, which grew out of IT service management, is widening its field of application and may represent an interesting alternative. A specific approach is needed to apprehend the particular nature of educational services, to consider the cooperating systemic roles of the educational system and the learning system, and to define ISO/IEC 20000 vocabulary and concepts adapted to the domain. ISO/IEC 20000 may provide an answer to European Standard Guideline compliance and improve the management of educational services. The current experimentation is expected to cast light on the complexity, practicality and effectiveness of the use of ISO/IEC 20000 in a first field of "non-IT" services.
Analysis of view synthesis prediction architectures in modern coding standards
NASA Astrophysics Data System (ADS)
Tian, Dong; Zou, Feng; Lee, Chris; Vetro, Anthony; Sun, Huifang
2013-09-01
Depth-based 3D formats are currently being developed as extensions to both the AVC and HEVC standards. The availability of depth information facilitates the generation of intermediate views for advanced 3D applications and displays, and also enables more efficient coding of the multiview input data through view synthesis prediction (VSP) techniques. This paper outlines several approaches that have been explored to realize view synthesis prediction in modern video coding standards such as AVC and HEVC. The benefits and drawbacks of various architectures are analyzed in terms of performance, complexity, and other design considerations. It is concluded that block-based VSP for multiview video signals provides attractive coding gains with complexity comparable to that of traditional motion/disparity compensation.
Informatics in clinical research in oncology: current state, challenges, and a future perspective.
Chahal, Amar P S
2011-01-01
The informatics landscape of clinical trials in oncology has changed significantly in the last 10 years. The current state of the infrastructure for clinical trial management, execution, and data management is reviewed. The systems, their functionality, the users, and the standards available to researchers are discussed from the perspective of the oncologist-researcher. Challenges in complexity and in the processing of information are outlined. These challenges include the lack of communication and information interchange between systems, the lack of simplified standards, and the lack of implementation of and adherence to the standards that are available. The National Cancer Institute's Common Terminology Criteria for Adverse Events (CTCAE) are cited as a successful standard in oncology, and HTTP on the Internet is referenced for its simplicity. Differences in the management of information standards between industries are discussed. Possible future advances in oncology clinical research informatics are addressed. These advances include strategic policy review of standards and the implementation of actions to make standards free, ubiquitous, simple, and easily interpretable; the need to change from a local data-capture- or transaction-driven model to a large-scale data-interpretation model that provides higher value to the oncologist and the patient; and the need for information technology investment in a readily available digital educational model for clinical research in oncology that is customizable for individual studies. These new approaches, together with changes in information delivery to mobile platforms, will set the stage for the next decade in clinical research informatics.
Minimum current principle and variational method in theory of space charge limited flow
NASA Astrophysics Data System (ADS)
Rokhlenko, A.
2015-10-01
In the spirit of the principle of least action, when a perturbation is applied to a physical system, the system reacts by modifying its state to "agree" with the perturbation through a minimal change of its initial state. In particular, electron field emission should produce the minimum current consistent with the boundary conditions. This current can be found theoretically by solving the corresponding equations using different techniques. We apply here the variational method for the current calculation, which can be quite effective even with a short set of trial functions. The approach to a better result can be monitored by the total current, which should decrease when we are on the right track. Here, we present only an illustration for simple geometries of devices with electron flow. The development of these methods can be useful when the emitter and/or anode shapes make the use of standard approaches difficult. Though direct numerical calculations, including the particle-in-cell technique, are very effective, theoretical calculations can provide important insight into the general features of flow formation and can sometimes even be realized by simpler routines.
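As a concrete benchmark for such variational calculations, consider the classic planar diode, for which the space-charge-limited current is known in closed form (the Child–Langmuir law); a trial potential can be judged by how closely its current approaches this exact value from above. The fragment below states the standard textbook result and is not taken from the paper itself.

```latex
% Planar-diode benchmark for space-charge-limited flow: the Child--Langmuir law.
% The exact potential is \varphi(x) = V (x/d)^{4/3}, whose current density is
\[
  J_{\mathrm{CL}} \;=\; \frac{4\varepsilon_0}{9}\,\sqrt{\frac{2e}{m}}\;\frac{V^{3/2}}{d^{2}},
\]
% with gap spacing d and anode voltage V. Under the minimum-current principle,
% trial functions yielding a larger total current signal a poorer approximation.
```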
An efficiency-decay model for Lumen maintenance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bobashev, Georgiy; Baldasaro, Nicholas G.; Mills, Karmann C.
Proposed is a multicomponent model for the estimation of light-emitting diode (LED) lumen maintenance using test data acquired in accordance with the test standard of the Illuminating Engineering Society of North America, i.e., LM-80-08. Lumen maintenance data acquired with this test do not always follow exponential decay, particularly data collected in the first 1000 h or under low-stress (e.g., low-temperature) conditions. This deviation from true exponential behavior makes it difficult to use the full data set in models for the estimation of the lumen-maintenance decay coefficient. As a result, critical information relevant to the early life or low-stress operation of LED light sources may be missed. We present an efficiency-decay model approach in which all lumen maintenance data can be used to provide an alternative estimate of the decay rate constant. The approach considers a combined model wherein one part describes an initial "break-in" period and another part describes the decay in lumen maintenance. During the break-in period, several mechanisms within the LED can act to produce a small (typically < 10%) increase in luminous flux. The effect of the break-in period and its longevity is more likely to be present at low ambient temperatures and currents, where the discrepancy between a standard TM-21 approach and our proposed model is largest. For high temperatures and currents, the difference between the estimates becomes negligible. Finally, our approach makes use of all the collected data and avoids producing unrealistic estimates of the decay coefficient.
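A minimal sketch of fitting a two-part break-in-plus-decay model of the kind described above, assuming an illustrative functional form (a saturating break-in efficiency multiplying an exponential decay) and synthetic LM-80-style data; the paper's actual model equations may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def lumen_model(t, beta, tau, alpha):
    """Combined model: break-in efficiency term times exponential decay.
    beta: break-in amplitude (typically < 0.1), tau: break-in time constant (h),
    alpha: long-term decay rate (1/h). Functional form is an illustrative
    reading of the two-part model, not the published equations."""
    return (1.0 + beta * (1.0 - np.exp(-t / tau))) * np.exp(-alpha * t)

# Synthetic normalized luminous flux: a small early rise, then slow decay.
t   = np.array([0, 500, 1000, 2000, 4000, 6000, 8000, 10000.0])   # hours
phi = np.array([1.000, 1.030, 1.042, 1.038, 1.012, 0.985, 0.958, 0.930])

(beta, tau, alpha), _ = curve_fit(lumen_model, t, phi, p0=[0.05, 800, 1e-5])
print(f"break-in {beta:.3f}, tau {tau:.0f} h, decay {alpha:.2e} per hour")
```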
Nevers, Meredith B.; Whitman, Richard L.
2011-01-01
Efforts to improve public health protection in recreational swimming waters have focused on obtaining real-time estimates of water quality. Current monitoring techniques rely on the time-intensive culturing of fecal indicator bacteria (FIB) from water samples, but rapidly changing FIB concentrations result in management errors that lead to the public being exposed to high FIB concentrations (type II error) or beaches being closed despite acceptable water quality (type I error). Empirical predictive models may provide a rapid solution, but their effectiveness at improving health protection has not been adequately assessed. We sought to determine if emerging monitoring approaches could effectively reduce the risk of illness exposure by minimizing management errors. We examined four monitoring approaches (inactive, current protocol, a single predictive model for all beaches, and individual models for each beach) with increasing refinement at 14 Chicago beaches using historical monitoring and hydrometeorological data and compared management outcomes using different standards for decision-making. Predictability (R2) of FIB concentration improved with model refinement at all beaches but one. Predictive models did not always reduce the number of management errors and therefore the overall illness burden. Use of a Chicago-specific single-sample standard, rather than the widely used default of 235 E. coli CFU/100 ml, together with predictive modeling resulted in the greatest number of open beach days without any increase in public health risk. These results emphasize that emerging monitoring approaches such as empirical models are not equally applicable at all beaches, and combining monitoring approaches may expand beach access.
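The management-error bookkeeping described above can be illustrated with a toy hindcast: a fitted regression predicts log10 E. coli from hydrometeorological covariates, a closure decision is made against a single-sample standard, and the decision is scored against the next-day culture result. All coefficients and observations below are invented.

```python
import numpy as np

# Hypothetical fitted model: log10 E. coli from turbidity (NTU),
# wave height (m), and 48-h rainfall (mm). Coefficients are made up.
def predict_log_ecoli(turbidity, wave_height, rain_48h):
    return 1.2 + 0.015 * turbidity + 0.35 * wave_height + 0.08 * rain_48h

standard = 235.0   # single-sample E. coli standard, CFU/100 ml
obs      = np.array([120.0, 180.0, 95.0, 580.0])            # next-day culture
covs     = [(18, 0.4, 0), (55, 1.1, 6), (12, 0.3, 1), (70, 1.6, 9)]

for (turb, wave, rain), truth in zip(covs, obs):
    pred = 10 ** predict_log_ecoli(turb, wave, rain)
    closed, exceeded = pred > standard, truth > standard
    if closed and not exceeded:
        print("type I error (needless closure)")
    elif not closed and exceeded:
        print("type II error (missed exceedance)")
    else:
        print("correct decision")
```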
An efficiency-decay model for Lumen maintenance
Bobashev, Georgiy; Baldasaro, Nicholas G.; Mills, Karmann C.; ...
2016-08-25
Proposed is a multicomponent model for the estimation of light-emitting diode (LED) lumen maintenance using test data acquired in accordance with the test standard of the Illuminating Engineering Society of North America, i.e., LM-80-08. Lumen maintenance data acquired with this test do not always follow exponential decay, particularly data collected in the first 1000 h or under low-stress (e.g., low-temperature) conditions. This deviation from true exponential behavior makes it difficult to use the full data set in models for the estimation of the lumen-maintenance decay coefficient. As a result, critical information relevant to the early life or low-stress operation of LED light sources may be missed. We present an efficiency-decay model approach in which all lumen maintenance data can be used to provide an alternative estimate of the decay rate constant. The approach considers a combined model wherein one part describes an initial "break-in" period and another part describes the decay in lumen maintenance. During the break-in period, several mechanisms within the LED can act to produce a small (typically < 10%) increase in luminous flux. The effect of the break-in period and its longevity is more likely to be present at low ambient temperatures and currents, where the discrepancy between a standard TM-21 approach and our proposed model is largest. For high temperatures and currents, the difference between the estimates becomes negligible. Finally, our approach makes use of all the collected data and avoids producing unrealistic estimates of the decay coefficient.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Busch, John; Deringer, Joseph
1998-10-01
Energy efficiency standards for buildings have been adopted in over forty countries. This policy mechanism is pursued by governments as a means of increasing energy efficiency in the buildings sector, which typically accounts for about a third of most nations' energy consumption and half of their electricity consumption. This study reports on experience with the implementation of energy standards for commercial buildings in a number of countries and U.S. states. It is conducted from the perspective of providing useful input to the Government of the Philippines' (GOP) current effort at implementing its building energy standard. While the impetus for this work is technical assistance to the Philippines, the intent is to shed light on the broader issues attending implementation of building energy standards that would be applicable there and elsewhere. The background on the GOP building energy standard is presented, followed by the objectives for the study, the approach used to collect and analyze information about other jurisdictions' implementation experience, results, and conclusions and recommendations.
Medical home implementation: a sensemaking taxonomy of hard and soft best practices.
Hoff, Timothy
2013-12-01
The patient-centered medical home (PCMH) model of care is currently a central focus of U.S. health system reform, but less is known about the model's implementation in the practice of everyday primary care. Understanding its implementation is key to ensuring the approach's continued support and success nationally. This article addresses this gap through a qualitative examination of the best practices associated with PCMH implementation for older adult patients in primary care. I used a multicase, comparative study design that relied on a sensemaking approach and fifty-one in-depth interviews with physicians, nurses, and clinic support staff working in six accredited medical homes located in various geographic areas. My emphasis was on gaining descriptive insights into the staff's experiences delivering medical home care to older adult patients in particular and then analyzing how these experiences shaped the staff's thinking, learning, and future actions in implementing medical home care. I found two distinct taxonomies of implementation best practices, which I labeled "hard" and "soft" because of their differing emphasis and content. Hard implementation practices are normative activities and structural interventions that align well with existing national standards for medical home care. Soft best practices are more relational in nature and derive from the existing practice social structure and everyday interactions between staff and patients. Currently, external stakeholders are less apt to recognize, encourage, or incentivize soft best practices. The results suggest that there may be no standardized, one-size-fits-all approach to making medical home implementation work, particularly for special patient populations such as the elderly. My study also raises the issue of broadening current PCMH assessments and reward systems to include implementation practices that contain heavy social and relational components of care, in addition to the emphasis now placed on building structural supports for medical home work. Further study of these softer implementation practices and a continued call for qualitative methodological approaches that gain insight into everyday practice behavior are warranted. © 2013 Milbank Memorial Fund.
Standards Advisor-Advanced Information Technology for Advanced Information Delivery
NASA Technical Reports Server (NTRS)
Hawker, J. Scott
2003-01-01
Developers of space systems must deal with an increasing amount of information in responding to extensive requirements and standards from numerous sources. Accessing these requirements and standards, understanding them, comparing them, negotiating them and responding to them is often an overwhelming task. There are resources to aid the space systems developer, such as lessons learned and best practices. Again, though, accessing, understanding, and using this information is often more difficult than helpful. This results in space systems that: 1. Do not meet all their requirements. 2. Do not incorporate prior engineering experience. 3. Cost more to develop. 4. Take longer to develop. The NASA Technical Standards Program (NTSP) web site at http://standards.nasa.gov has made significant improvements in making standards, lessons learned, and related material available to space systems developers agency-wide. The Standards Advisor was conceived to take the next steps beyond the current product, applying evolving information technology to further improve information delivery to space systems developers. This report describes the features of the Standards Advisor and suggests a technical approach to its development.
Immunosuppressive therapy for transplant-ineligible aplastic anemia patients.
Schrezenmeier, Hubert; Körper, Sixten; Höchsmann, Britta
2015-02-01
Aplastic anemia is a rare life-threatening bone marrow failure that is characterized by bicytopenia or pancytopenia in the peripheral blood and a hypoplastic or aplastic bone marrow. The patients are at risk of infection and hemorrhage due to neutropenia and thrombocytopenia and suffer from symptoms of anemia. The main treatment approaches are allogeneic stem cell transplantation and immunosuppression. Here, we review current standard immunosuppression and the attempts that have been made in the past two decades to improve results. A review of recent developments also reveals that progress in the field is sometimes decided not only by the advent of new drugs, good ideas and well-designed clinical trials but also by the marketing considerations of pharmaceutical companies: aplastic anemia experts unfortunately had to face the situation that efficient drugs were withdrawn simply for marketing reasons. We will discuss the current options and challenges in first-line treatment and in the management of relapsing and refractory patients, with an emphasis on adult patients. Some promising new approaches are currently under investigation in prospective, randomized trials.
Gearbox Reliability Collaborative Analytic Formulation for the Evaluation of Spline Couplings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Yi; Keller, Jonathan; Errichello, Robert
2013-12-01
Gearboxes in wind turbines have not been achieving their expected design life; however, they commonly meet and exceed the design criteria specified in current standards in the gear, bearing, and wind turbine industry as well as third-party certification criteria. The cost of gearbox replacements and rebuilds, as well as the downtime associated with these failures, has elevated the cost of wind energy. The National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) was established by the U.S. Department of Energy in 2006; its key goal is to understand the root causes of premature gearbox failures and improve their reliability using a combined approach of dynamometer testing, field testing, and modeling. As part of the GRC program, this paper investigates the design of the spline coupling often used in modern wind turbine gearboxes to connect the planetary and helical gear stages. Aside from transmitting the driving torque, another common function of the spline coupling is to allow the sun gear to float between the planets. The amount the sun gear can float is determined by the spline design and the sun shaft flexibility subject to the operational loads. Current standards address spline coupling design requirements in varying detail. This report provides additional insight beyond these current standards for quickly evaluating spline coupling designs.
Biomedical information retrieval across languages.
Daumke, Philipp; Markó, Kornél; Poprat, Michael; Schulz, Stefan; Klar, Rüdiger
2007-06-01
This work presents a new dictionary-based approach to biomedical cross-language information retrieval (CLIR) that addresses many of the general and domain-specific challenges in current CLIR research. Our method is based on a multilingual lexicon that was generated partly manually and partly automatically, and currently covers six European languages. It contains morphologically meaningful word fragments, termed subwords. Using subwords instead of entire words significantly reduces the number of lexical entries necessary to sufficiently cover a specific language and domain. Mediation between queries and documents is based on these subwords as well as on lists of word-n-grams that are generated from large monolingual corpora and constitute possible translation units. The translations are then sent to a standard Internet search engine. This process makes our approach an effective tool for searching the biomedical content of the World Wide Web in different languages. We evaluate this approach using the OHSUMED corpus, a large medical document collection, within a cross-language retrieval setting.
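A toy rendering of the subword mediation idea, with an invented two-language lexicon mapping morphologically meaningful fragments to shared interlingual codes; greedy longest-match segmentation then lets an English and a German term resolve to the same code sequence. The entries and codes are illustrative, not the system's actual lexicon.

```python
# Invented lexicon: subwords from two languages map to shared codes.
SUBWORDS = {
    "gastr": "#stomach", "enter": "#intestine", "itis": "#inflammation",
    "magen": "#stomach", "darm": "#intestine", "entzuendung": "#inflammation",
}

def segment(word: str) -> list:
    """Greedy longest-match segmentation into interlingual subword codes."""
    codes, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):          # try longest match first
            if word[i:j] in SUBWORDS:
                codes.append(SUBWORDS[word[i:j]])
                i = j
                break
        else:
            i += 1                                  # skip uncovered character
    return codes

# An English and a German term map to the same code sequence, so a query in
# one language can retrieve documents written in the other.
print(segment("gastroenteritis"))       # ['#stomach', '#intestine', '#inflammation']
print(segment("magendarmentzuendung"))  # ['#stomach', '#intestine', '#inflammation']
```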
Use of benefit-cost analysis in establishing Federal radiation protection standards: a review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, L.E.
1979-10-01
This paper complements other work which has evaluated the cost impacts of radiation standards on the nuclear industry. It focuses on the approaches to valuation of the health and safety benefits of radiation standards and the actual and appropriate processes of benefit-cost comparison. A brief historical review of the rationale(s) for the levels of radiation standards prior to 1970 is given. The Nuclear Regulatory Commission (NRC) established numerical design objectives for light water reactors (LWRs). The process of establishing these numerical design criteria below the radiation protection standards set in 10 CFR 20 is reviewed. EPA's 40 CFR 190 environmental standards for the uranium fuel cycle have lower values than NRC's radiation protection standards in 10 CFR 20. The task of allocating EPA's 40 CFR 190 standards to the various portions of the fuel cycle was left to the implementing agency, NRC. So whether or not EPA's standards for the uranium fuel cycle are more stringent for LWRs than NRC's numerical design objectives depends on how EPA's standards are implemented by NRC. In setting the numerical levels in Appendix I to 10 CFR 50 and in 40 CFR 190, NRC and EPA, respectively, focused on the costs of compliance with various levels of radiation control. A major portion of the paper is devoted to a review and critique of the available methods for valuing health and safety benefits. All current approaches try to estimate a constant value of life and use this to value the expected number of lives saved. This paper argues that it is more appropriate to seek a value of a reduction in risks to health and life that varies with the extent of these risks. Additional research to do this is recommended. (DC)
The rise of genomic profiling in ovarian cancer
Previs, Rebecca A.; Sood, Anil K.; Mills, Gordon B.; Westin, Shannon N.
2017-01-01
Introduction Next-generation sequencing and advances in ‘omics technology have rapidly increased our understanding of the molecular landscape of epithelial ovarian cancers. Areas covered Once characterized only by histologic appearance and clinical behavior, we now understand many of the molecular phenotypes that underlie the different ovarian cancer subtypes. While the current approach to treatment involves standard cytotoxic therapies after cytoreductive surgery for all ovarian cancers regardless of histologic or molecular characteristics, focus has shifted beyond a ‘one size fits all’ approach to ovarian cancer. Expert commentary Genomic profiling offers potentially ‘actionable’ opportunities for development of targeted therapies and a more individualized approach to treatment with concomitant improved outcomes and decreased toxicity. PMID:27828713
An Update on Modern Approaches to Localized Esophageal Cancer
Welsh, James; Amini, Arya; Likhacheva, Anna; Erasmus, Jeremy; Gomez, Daniel; Davila, Marta; Mehran, Reza J; Komaki, Ritsuko; Liao, Zhongxing; Hofstetter, Wayne L; Bhutani, Manoop; Ajani, Jaffer A
2014-01-01
Esophageal cancer treatment continues to be a topic of wide debate. Based on improvements in chemotherapy drugs, surgical techniques, and radiotherapy, esophageal cancer treatment approaches are becoming more specific to the stage of the tumor and the overall performance status of the patient. While surgery continues to be the standard treatment option for localized disease, the current direction favors multimodality treatment including both radiation and chemotherapy with surgery. In the next few years, we will continue to see improvements in radiation techniques and proton treatment, more minimally invasive surgical approaches that minimize postoperative side effects, and the discovery of molecular biomarkers to help deliver more specifically targeted medication to treat esophageal cancers. PMID:21365188
Advanced Imaging Technologies for the Detection of Dysplasia and Early Cancer in Barrett Esophagus
Espino, Alberto; Cirocco, Maria; DaCosta, Ralph
2014-01-01
Advanced esophageal adenocarcinomas arising from Barrett esophagus (BE) are tumors with an increasing incidence and poor prognosis. The aim of endoscopic surveillance of BE is to detect dysplasia, particularly high-grade dysplasia and intramucosal cancers that can subsequently be treated endoscopically before progression to invasive cancer with lymph node metastases. Current surveillance practice standards require the collection of random 4-quadrant biopsy specimens over every 1 to 2 cm of BE (Seattle protocol) to detect dysplasia with the assistance of white light endoscopy, in addition to performing targeted biopsies of recognizable lesions. This approach is labor-intensive but should currently be considered state of the art. Chromoendoscopy, virtual chromoendoscopy (e.g., narrow band imaging), and confocal laser endomicroscopy, in addition to high-definition standard endoscopy, might increase the diagnostic yield for the detection of dysplastic lesions. Until these modalities have been demonstrated to enhance efficiency or cost effectiveness, the standard protocol will remain careful examination using conventional off-the-shelf high-resolution endoscopes, combined with a longer inspection time, which is associated with increased detection of dysplasia. PMID:24570883
Quality management of eLearning for medical education: current situation and outlook
Abrusch, Jasmin; Marienhagen, Jörg; Böckers, Anja; Gerhardt-Szép, Susanne
2015-01-01
Introduction: In 2008, the German Council of Science advised universities to establish a quality management system (QMS) that conforms to international standards. The system was to be implemented within 5 years, i.e., by 2014 at the latest. The aim of the present study was to determine whether a QMS suitable for the electronic learning (eLearning) domain of medical education, to be used across Germany, has meanwhile been identified. Methods: We approached all medical universities in Germany (n=35), using an anonymous questionnaire (8 domains, 50 items). Results: Our results (response rate 46.3%) indicated very hesitant application of QMS in eLearning and a major information deficit at the various institutions. Conclusions: The authors conclude that, within the limitations of this study, there seems to be a considerable need to improve current knowledge on QMS for eLearning, and that clear guidelines and standards for their implementation should be further defined. PMID:26038685
Quality management of eLearning for medical education: current situation and outlook.
Abrusch, Jasmin; Marienhagen, Jörg; Böckers, Anja; Gerhardt-Szép, Susanne
2015-01-01
In 2008, the German Council of Science advised universities to establish a quality management system (QMS) that conforms to international standards. The system was to be implemented within 5 years, i.e., by 2014 at the latest. The aim of the present study was to determine whether a QMS suitable for the electronic learning (eLearning) domain of medical education, to be used across Germany, has meanwhile been identified. We approached all medical universities in Germany (n=35), using an anonymous questionnaire (8 domains, 50 items). Our results (response rate 46.3%) indicated very hesitant application of QMS in eLearning and a major information deficit at the various institutions. The authors conclude that, within the limitations of this study, there seems to be a considerable need to improve current knowledge on QMS for eLearning, and that clear guidelines and standards for their implementation should be further defined.
Internet use, misuse, and addiction in adolescents: current issues and challenges.
Greydanus, Donald E; Greydanus, Megan M
2012-01-01
The Internet has revolutionized education and social communication in the 21st century. This article reviews the growing literature identifying a number of adolescents and young adults with pathologically excessive Internet use leading to many potential consequences. Current research dilemmas in this area include the fact that Internet addiction is a broad topic with no standard definition and no standard measurement tools. Management of youth with identified problematic Internet use or misuse centers on behavioral therapy and treatment of comorbidities. Pharmacologic approaches are limited at this time but are undergoing research, such as the use of opioid antagonists and antidepressants in adults with pathological gambling. Efforts should be expanded to educate all adolescents regarding not only the benefits but also the potential negative consequences of Internet use. It is vital that we do this for Generation Z; Generation Alpha will soon benefit or suffer from the efforts we make today.
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
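For context, the linear shape and color standardization that UNITY generalizes is conventionally written as the Tripp relation; the fragment below states it and sketches, under stated assumptions, how a nonlinear variant replaces the linear terms.

```latex
% The conventional linear standardization (Tripp relation): the distance
% modulus of a SN Ia is estimated from its light curve as
\[
  \mu \;=\; m_B \;-\; \bigl(M_B \;-\; \alpha\,x_1 \;+\; \beta\,c\bigr),
\]
% where m_B is the peak apparent magnitude, x_1 the light-curve shape
% parameter, and c the color. A nonlinear standardization replaces the
% terms \alpha x_1 and \beta c with more flexible functions of x_1 and c
% (e.g., broken-linear forms), with the functional form inferred jointly
% with the cosmology in the Bayesian fit.
```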
Current challenges in diagnostic imaging of venous thromboembolism.
Huisman, Menno V; Klok, Frederikus A
2015-01-01
Because the clinical diagnosis of deep-vein thrombosis and pulmonary embolism is nonspecific, integrated diagnostic approaches for patients with suspected venous thromboembolism have been developed over the years, involving both non-invasive bedside tools (clinical decision rules and D-dimer blood tests) for patients with low pretest probability and diagnostic techniques (compression ultrasound for deep-vein thrombosis and computed tomography pulmonary angiography for pulmonary embolism) for those with a high pretest probability. This combination has led to standardized diagnostic algorithms with proven safety for excluding venous thrombotic disease. At the same time, it has become apparent that, as a result of the natural history of venous thrombosis, there are special patient populations in which the current standard diagnostic algorithms are not sufficient. In this review, we present 3 evidence-based patient cases to underline recent developments in the imaging diagnosis of venous thromboembolism. © 2015 by The American Society of Hematology. All rights reserved.
Wake Vortex Advisory System (WakeVAS) Concept of Operations
NASA Technical Reports Server (NTRS)
Rutishauser, David; Lohr, Gary; Hamilton, David; Powers, Robert; McKissick, Burnell; Adams, Catherine; Norris, Edward
2003-01-01
NASA Langley Research Center has a long history of aircraft wake vortex research, with the most recent accomplishment of demonstrating the Aircraft VOrtex Spacing System (AVOSS) at Dallas/Fort Worth International Airport in July 2000. The AVOSS was a concept for integrating technologies to provide dynamic, wake-safe reduced spacing for single-runway arrivals, as compared to current separation standards applied during instrument approaches. AVOSS included state-of-the-art weather sensors, wake sensors, and a wake behavior prediction algorithm. Using real-time data, AVOSS averaged a 6% potential throughput increase over current standards. This report describes a Concept of Operations for applying the technologies demonstrated in the AVOSS to a variety of terminal operations to mitigate wake vortex capacity constraints. A discussion of the technological issues and open research questions that must be addressed to design a Wake Vortex Advisory System (WakeVAS) is included.
Robotic surgical systems in maxillofacial surgery: a review
Liu, Hang-Hang; Li, Long-Jiang; Shi, Bin; Xu, Chun-Wei; Luo, En
2017-01-01
Since the beginning of the twenty-first century, robotic surgery has been used in multiple oral surgical procedures for the treatment of head and neck tumors and non-malignant diseases. With the assistance of robotic surgical systems, maxillofacial surgery is performed with less blood loss, fewer complications, shorter hospitalization and better cosmetic results than standard open surgery. However, the application of robotic surgery techniques to the treatment of head and neck diseases remains in an experimental stage, and the long-term effects on surgical morbidity, oncologic control and quality of life are yet to be established. More well-designed studies are needed before this approach can be recommended as a standard treatment paradigm. Nonetheless, robotic surgical systems will inevitably be extended to maxillofacial surgery. This article reviews the current clinical applications of robotic surgery in the head and neck region and highlights the benefits and limitations of current robotic surgical systems. PMID:28660906
NASA Astrophysics Data System (ADS)
Rajagopal, Deepak
2013-06-01
The absence of a globally consistent and binding commitment to reducing greenhouse gas emissions provides a rationale for partial policies, such as renewable energy mandates and product emission standards, that target the lifecycle emissions of the regulated products or services. While appealing in principle, regulation of lifecycle emissions presents several practical challenges. Using biofuels as an illustrative example, we highlight some outstanding issues in the design and implementation of lifecycle-based policies and discuss potential remedies. We review the literature on emissions due to price effects in fuel markets, which are akin to emissions due to indirect land use change but are, unlike the latter, ignored under all current lifecycle emissions-based regulations. We distinguish hard and soft approaches to regulating indirect emissions and discuss their implications.
An approach for software-driven and standard-based support of cross-enterprise tumor boards.
Mangesius, Patrick; Fischer, Bernd; Schabetsberger, Thomas
2015-01-01
For tumor boards, the networking of different medical disciplines' expertise continues to gain importance. However, interdisciplinary tumor boards spread across several institutions are rarely supported by information technology tools today. The aim of this paper is to outline an approach for a tumor board management system prototype. For analyzing the requirements, an incremental process was used. The requirements were surveyed using the Informal Conversational Interview and documented with Use Case Diagrams defined by the Unified Modeling Language (UML). Analyses of current EHR standards were conducted to evaluate technical requirements. Functional and technical requirements of clinical conference applications were evaluated and documented. In several steps, workflows were derived and application mockups were created. Although there is a vast amount of common understanding concerning how clinical conferences should be conducted and how their workflows should be structured, these are hardly standardized at either the functional or the technical level. This results in drawbacks for participants and patients. Using modern EHR technologies based on profiles such as IHE Cross-Enterprise Document Sharing (XDS), these deficits could be overcome.
Standardizing of Pathology in Patients Receiving Neoadjuvant Chemotherapy.
Bossuyt, Veerle; Symmans, W Fraser
2016-10-01
The use of neoadjuvant systemic therapy for the treatment of breast cancer patients is increasing. Pathologic response in the form of pathologic complete response (pCR), together with grading systems of partial response such as the residual cancer burden (RCB) system, gives valuable prognostic information for patients and is used as a primary endpoint in clinical trials. The breast cancer and pathology communities are responding with efforts to standardize pathology in patients receiving neoadjuvant chemotherapy. In this review, we summarize the challenges that post-neoadjuvant systemic therapy surgical specimens pose and how pathologists and the multidisciplinary team can work together to optimize handling of these specimens. Multidisciplinary communication is essential. A single, standardized approach to macroscopic and microscopic pathologic examination makes it possible to provide reliable response information. This approach employs a map of tissue sections to correlate clinical, gross, microscopic, and imaging findings in order to report the presence of pCR (ypT0 ypN0 and ypT0/is ypN0) versus residual disease, the ypT and ypN stage using the current AJCC/UICC staging system, and the RCB.
A contact-free respiration monitor for smart bed and ambulatory monitoring applications.
Hart, Adam; Tallevi, Kevin; Wickland, David; Kearney, Robert E; Cafazzo, Joseph A
2010-01-01
The development of a contact-free respiration monitor has a broad range of clinical applications in the home and hospital setting. Current approaches suffer from a variety of problems including unreliability, low sensitivity, and high cost. This work describes a novel approach to contact-free respiration monitoring that addresses these shortcomings by employing a highly sensitive capacitance sensor to detect variations in capacitive coupling caused by breathing. A prototype system consisting of a synthetic-metallic pad, sensor electronics, and an iPhone interface was built and its performance compared experimentally to the gold standard technique (Respiratory Inductance Plethysmography) on both a healthy volunteer and a SimMan robotic mannequin. The prototype sensor effectively captured respiratory movements over breathing rates of 5-55 bpm, achieving an average spectral correlation with the gold standard of 0.88 (CI: 0.86-0.90) for the SimMan and 0.95 (CI: 0.95-0.96) for the healthy volunteer.
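One plausible reading of the spectral-correlation figure of merit, assuming Welch power spectra restricted to the 5-55 bpm band and a Pearson correlation between the two spectra; the authors' exact definition may differ, and the signals below are synthetic.

```python
import numpy as np
from scipy.signal import welch

fs = 50.0                                    # Hz, assumed sampling rate
t = np.arange(0, 60, 1 / fs)
ref = np.sin(2 * np.pi * 0.25 * t)                        # 15 bpm reference
cap = np.sin(2 * np.pi * 0.25 * t + 0.3) + 0.2 * np.random.randn(t.size)

# Welch power spectra of the reference and the capacitive sensor signal.
f, p_ref = welch(ref, fs=fs, nperseg=1024)
_, p_cap = welch(cap, fs=fs, nperseg=1024)

# Restrict to the physiological band of 5-55 breaths per minute.
band = (f >= 5 / 60) & (f <= 55 / 60)
r = np.corrcoef(p_ref[band], p_cap[band])[0, 1]
print(f"spectral correlation: {r:.2f}")
```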
Short-course versus long-course chemoradiation in rectal cancer--time to change strategies?
Palta, Manisha; Willett, Christopher G; Czito, Brian G
2014-09-01
There is significant debate regarding the optimal neoadjuvant regimen for resectable rectal cancer patients. Short-course radiotherapy, a standard approach throughout most of northern Europe, is generally defined as 25 Gy in 5 fractions over the course of 1 week without the concurrent administration of chemotherapy. Long-course radiotherapy is typically defined as 45 to 50.4 Gy in 25-28 fractions with the administration of concurrent 5-fluoropyrimidine-based chemotherapy and is the standard approach in other parts of Europe and the United States. At present, two randomized trials have compared outcomes of short-course radiotherapy with long-course chemoradiation, showing no difference in their respective study endpoints. Late toxicity data are lacking given limited follow-up. Although the ideal neoadjuvant regimen is controversial, our current bias is long-course chemoradiation to treat patients with locally advanced, resectable rectal cancer.
Spanish adaptation of the European guidelines for the evaluation and treatment of actinic keratosis.
Ferrándiz, C; Fonseca-Capdevila, E; García-Diez, A; Guillén-Barona, C; Belinchón-Romero, I; Redondo-Bellón, P; Moreno-Giménez, J C; Senán, R
2014-05-01
Current trends in our setting indicate that the prevalence of actinic keratosis and similar diseases will increase in coming years and impose a greater burden on health care resources. A long list of clinical features must be taken into account when approaching the treatment of actinic keratosis. Until recently, therapeutic approaches focused solely on ablative procedures and the treatment of individual lesions and did not take into account areas of field cancerization. Now that the therapeutic arsenal has grown, standardized criteria are needed to guide the optimal choice of treatment for each patient. The elaboration of evidence-based consensus recommendations for the diagnosis and treatment of actinic keratosis generates knowledge that will help clinicians to deliver the highest level of care possible, standardizing decision-making processes and enhancing awareness among all the health professionals involved in the care pathway. Copyright © 2013 Elsevier España, S.L. and AEDV. All rights reserved.
[Safer operating theatre: easier said than done].
Kalkman, C J
2008-10-18
The Netherlands Health Care Inspectorate recently changed its approach to quality of care and patient safety from a reactive to a firmly proactive style. In two reports, the current perioperative processes in Dutch hospitals were scrutinised. Despite a highly motivated workforce, the inspectorate detected a lack of standardisation, incomplete or inaccessible patient data, poor adherence to hygiene standards and gaps during transfer of care in both the preoperative and intraoperative stages of surgery. The inspectorate mandates rapid implementation of various new patient safety approaches, including the use of checklists, 'time-outs' before the start of surgery, double-checking of intravenous drugs and improved compliance with hygiene standards, as well as a strict definition of the roles and responsibilities of team members. Implementation will require major changes within the processes and culture of operating theatres in Dutch hospitals. Such a change is unlikely to be completed within the short timeframe allowed by the inspectorate.
An Ontology Based Approach to Information Security
NASA Astrophysics Data System (ADS)
Pereira, Teresa; Santos, Henrique
The semantic structuring of knowledge based on ontology approaches has been increasingly adopted by experts from diverse domains. Recently, ontologies have moved from the philosophical and metaphysical disciplines into use in the construction of models to describe a specific theory of a domain. The development and use of ontologies promote the creation of a unique standard to represent concepts within a specific knowledge domain. In the scope of information security systems, the use of an ontology to formalize and represent the concepts of security information challenges the mechanisms and techniques currently used. This paper intends to present a conceptual implementation model of an ontology defined in the security domain. The model presented contains the semantic concepts based on the information security standard
APPROACHES TO THE ASSESSMENT OF AROUSALS AND SLEEP DISTURBANCE IN CHILDREN
Paruthi, Shalini; Chervin, Ronald D.
2010-01-01
Childhood arousals, awakenings, and sleep disturbances during the night are common problems for both patients and their families. Additionally, inadequate sleep may contribute to daytime sleepiness, behavioral problems, and other important consequences of pediatric sleep disorders. Arousals, awakenings, and sleep disturbances can be quantified by routine polysomnography, and arousal scoring is generally performed as part of the standard polysomnogram. Here we review current approaches to quantification of arousals and sleep disturbances and examine outcomes that have been associated with these measures. Initial data suggest that computer-assisted identification of nonvisible arousals, cyclic alternating patterns, or respiratory cycle-related EEG changes may complement what can be accomplished by human scorers. Focus on contiguous bouts of sleep or specific sleep stages may prove similarly useful. Incorporation of autonomic arousal measures—such as heart rate variability, pulse transit time, or peripheral arterial tone—into standard reports may additionally capture subtle sleep fragmentation. PMID:20620104
Design principles for shift current photovoltaics
Cook, Ashley M.; M. Fregoso, Benjamin; de Juan, Fernando; ...
2017-01-25
While the basic principles of conventional solar cells are well understood, little attention has gone towards maximizing the efficiency of photovoltaic devices based on shift currents. By analysing effective models, here we outline simple design principles for the optimization of shift currents for frequencies near the band gap. This method allows us to express the band edge shift current in terms of a few model parameters and to show it depends explicitly on wavefunctions in addition to standard band structure. We use our approach to identify two classes of shift current photovoltaics, ferroelectric polymer films and single-layer orthorhombic monochalcogenides such as GeS, which display the largest band edge responsivities reported so far. Moreover, exploring the parameter space of the tight-binding models that describe them we find photoresponsivities that can exceed 100 mA W−1. Our results illustrate the great potential of shift current photovoltaics to compete with conventional solar cells.
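For orientation, shift current is a second-order DC response to illumination; the schematic expressions below give the standard form of the response and of the shift vector, whose dependence on Berry connections is the wavefunction dependence the abstract refers to. Conventions and prefactors vary between references.

```latex
% Shift current is a second-order DC response to illumination. Schematically,
% for light polarized along b, the DC current along a is
\[
  J^{a} \;=\; 2\,\sigma^{abb}(0;\omega,-\omega)\,E_b(\omega)\,E_b(-\omega),
\]
% where the response \sigma^{abb} integrates, over the Brillouin zone, the
% joint density of states weighted by the shift vector
\[
  R^{a}_{nm}(\mathbf{k}) \;=\; \partial_{k_a}\phi_{nm}(\mathbf{k})
  \;+\; A^{a}_{nn}(\mathbf{k}) \;-\; A^{a}_{mm}(\mathbf{k}),
\]
% with A the Berry connection and \phi_{nm} the phase of the interband
% matrix element; this is the explicit wavefunction dependence noted above.
```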
Design principles for shift current photovoltaics
Cook, Ashley M.; M. Fregoso, Benjamin; de Juan, Fernando; Coh, Sinisa; Moore, Joel E.
2017-01-01
While the basic principles of conventional solar cells are well understood, little attention has gone towards maximizing the efficiency of photovoltaic devices based on shift currents. By analysing effective models, here we outline simple design principles for the optimization of shift currents for frequencies near the band gap. Our method allows us to express the band edge shift current in terms of a few model parameters and to show it depends explicitly on wavefunctions in addition to standard band structure. We use our approach to identify two classes of shift current photovoltaics, ferroelectric polymer films and single-layer orthorhombic monochalcogenides such as GeS, which display the largest band edge responsivities reported so far. Moreover, exploring the parameter space of the tight-binding models that describe them we find photoresponsivities that can exceed 100 mA W−1. Our results illustrate the great potential of shift current photovoltaics to compete with conventional solar cells. PMID:28120823
Design principles for shift current photovoltaics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Ashley M.; M. Fregoso, Benjamin; de Juan, Fernando
While the basic principles of conventional solar cells are well understood, little attention has gone towards maximizing the efficiency of photovoltaic devices based on shift currents. By analysing effective models, here we outline simple design principles for the optimization of shift currents for frequencies near the band gap. This method allows us to express the band edge shift current in terms of a few model parameters and to show it depends explicitly on wavefunctions in addition to standard band structure. We use our approach to identify two classes of shift current photovoltaics, ferroelectric polymer films and single-layer orthorhombic monochalcogenides such as GeS, which display the largest band edge responsivities reported so far. Moreover, exploring the parameter space of the tight-binding models that describe them we find photoresponsivities that can exceed 100 mA W−1. Our results illustrate the great potential of shift current photovoltaics to compete with conventional solar cells.
Current-Sensitive Path Planning for an Underactuated Free-Floating Ocean Sensorweb
NASA Technical Reports Server (NTRS)
Dahl, Kristen P.; Thompson, David R.; McLaren, David; Chao, Yi; Chien, Steve
2011-01-01
This work investigates multi-agent path planning in strong, dynamic currents using thousands of highly under-actuated vehicles. We address the specific task of path planning for a global network of ocean-observing floats. These submersibles are typified by the Argo global network consisting of over 3000 sensor platforms. They can control their buoyancy to float at depth for data collection or rise to the surface for satellite communications. Currently, floats drift at a constant depth regardless of the local currents. However, accurate current forecasts have become available which present the possibility of intentionally controlling floats' motion by dynamically commanding them to linger at different depths. This project explores the use of these current predictions to direct float networks to some desired final formation or position. It presents multiple algorithms for such path optimization and demonstrates their advantage over the standard approach of constant-depth drifting.
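A minimal sketch of the planning idea follows: since the float can only choose its depth, the planner picks, at each interval, the depth whose forecast current carries the float closest to the goal. The forecast interface and numbers below are hypothetical placeholders for illustration, not the algorithms evaluated in the paper.

import numpy as np

def plan_depths(start, goal, forecast, depths, n_steps, dt):
    # forecast(t, pos, depth) -> 2-D current vector in m/s (a hypothetical
    # stand-in for a real ocean-model forecast product).
    pos = np.asarray(start, dtype=float)
    goal = np.asarray(goal, dtype=float)
    schedule = []
    for step in range(n_steps):
        t = step * dt
        # Score each candidate depth by the distance to the goal after
        # drifting with that depth's current for one planning interval.
        def miss(depth):
            drift = np.asarray(forecast(t, pos, depth)) * dt
            return np.linalg.norm(goal - (pos + drift))
        best = min(depths, key=miss)
        pos = pos + np.asarray(forecast(t, pos, best)) * dt
        schedule.append(best)
    return schedule, pos

# Example: depth-sheared flow, eastward at the surface, westward at depth.
forecast = lambda t, pos, depth: np.array([0.2 - 0.4 * depth / 1000.0, 0.0])
schedule, final_pos = plan_depths((0.0, 0.0), (10000.0, 0.0), forecast,
                                  depths=[0.0, 500.0, 1000.0], n_steps=24, dt=3600.0)

This greedy one-step lookahead is far simpler than the multi-agent optimization the abstract describes, but it captures why current forecasts make an underactuated float steerable at all.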
ERIC Educational Resources Information Center
CROMWELL, RUE L.
Four instruments were developed and standardized to measure early experience, current behavior, treatment approaches, and prognosis of emotionally disturbed children--the Rating/Ranking Scale of Child Behavior (R/R Scale), the Parent Practices Inventory (PPI), the Scale on Procedures in Dealing with Children (PDC), and the Child History Code…
ERIC Educational Resources Information Center
Bitsika, Vicki
2005-01-01
The number of students who are labeled as having some form of behavioural disorder which requires specialized assistance in the regular school setting is growing. Current approaches to working with these students are often based on the standardized application of treatments designed to modify general symptoms rather than specific behaviours. It is…
Current management of penetrating torso trauma: nontherapeutic is not good enough anymore.
Ball, Chad G
2014-04-01
A highly organized approach to the evaluation and treatment of penetrating torso injuries based on regional anatomy provides rapid diagnostic and therapeutic consistency. It also minimizes delays in diagnosis, missed injuries and nontherapeutic laparotomies. This review discusses an optimal sequence of structured rapid assessments that allow the clinician to rapidly proceed to gold standard therapies with a minimal risk of associated morbidity.
Management of intracerebral pressure in the neurosciences critical care unit.
Marshall, Scott A; Kalanuria, Atul; Markandaya, Manjunath; Nyquist, Paul A
2013-07-01
Management of intracranial pressure in neurocritical care remains a potentially valuable target for improvements in therapy and patient outcomes. Surrogate markers of increased intracranial pressure, invasive monitors, and standard therapy, as well as promising new approaches to improve cerebral compliance are discussed, and a current review of the literature addressing this metric in neuroscience critical care is provided. Published by Elsevier Inc.
Lithium-Ion Small Cell Battery Shorting Study
NASA Technical Reports Server (NTRS)
Pearson, Chris; Curzon, David; Blackmore, Paul; Rao, Gopalakrishna
2004-01-01
AEA performed a hard short study on various cell configurations whilst monitoring voltage, current and temperature. Video recording was also done to verify the evidence for cell venting. The presentation summarizes the results of the study, including video footage of typical samples. The need for diode protection in manned applications is identified. The standard AEA approach of using fused connectors during AIT for unmanned applications is also described.
Policies and practices of beach monitoring in the Great Lakes, USA: a critical review
Nevers, Meredith B.; Whitman, Richard L.
2010-01-01
Beaches throughout the Great Lakes are monitored for fecal indicator bacteria (typically Escherichia coli) in order to protect the public from potential sewage contamination. Currently, there is no universal standard for sample collection and analysis or results interpretation. Monitoring policies are developed by individual beach management jurisdictions, and applications are highly variable across and within lakes, states, and provinces. Extensive research has demonstrated that sampling decisions for time, depth, number of replicates, frequency of sampling, and laboratory analysis all influence the results, as do calculations of the mean and interpretation of the results in policy decisions. Additional shortcomings to current monitoring approaches include the appropriateness and reliability of currently used indicator bacteria and the overall goal of these monitoring programs. Current research is attempting to circumvent these complex issues by developing new tools and methods for beach monitoring. In this review, we highlight the variety of sampling routines used across the Great Lakes and the extensive body of research that challenges comparisons among beaches. We also assess the future of Great Lakes monitoring and the advantages and disadvantages of establishing standards that are evenly applied across all beaches.
Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A
2017-01-01
Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. PMID:29106476
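As a concrete sketch of the linear TSRI estimator and a bootstrap standard error (one of the modified standard errors considered above), the following assumes a single instrument and a continuous exposure; the variable names are illustrative, and the analytic Newey and Terza corrections are not reproduced here.

import numpy as np

def tsri_linear(g, x, y):
    # g, x, y: 1-D numpy arrays (instrument, exposure, outcome).
    # Stage 1: regress exposure x on instrument g; keep the residuals.
    G = np.column_stack([np.ones_like(g), g])
    resid = x - G @ np.linalg.lstsq(G, x, rcond=None)[0]
    # Stage 2: regress outcome y on x plus the stage-1 residuals;
    # the coefficient on x is the causal-effect estimate.
    X = np.column_stack([np.ones_like(x), x, resid])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

def bootstrap_se(g, x, y, n_boot=2000, seed=0):
    # Re-estimate BOTH stages in each resample, so the standard error
    # reflects first-stage uncertainty (unlike the naive unadjusted SE).
    rng = np.random.default_rng(seed)
    n = len(y)
    draws = [tsri_linear(g[idx], x[idx], y[idx])
             for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    return np.std(draws, ddof=1)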
Semantic e-Science in Space Physics - A Case Study
NASA Astrophysics Data System (ADS)
Narock, T.; Yoon, V.; Merka, J.; Szabo, A.
2009-05-01
Several search and retrieval systems for space physics data are currently under development in NASA's heliophysics data environment. We present a case study of two such systems, and describe our efforts in implementing an ontology to aid in data discovery. In doing so we highlight the various aspects of knowledge representation and show how they led to our ontology design, creation, and implementation. We discuss advantages that scientific reasoning allows, as well as difficulties encountered in current tools and standards. Finally, we present a space physics research project conducted with and without e-Science and contrast the two approaches.
Chisholm, A; Ang-Chen, P; Peters, S; Hart, J; Beenstock, J
2018-05-30
National Health Service England encourages staff to use everyday interactions with patients to discuss healthy lifestyle changes as part of the 'Making Every Contact Count' (MECC) approach. Although healthcare, government and public health organisations are now expected to adopt this approach, evidence is lacking about how MECC is currently implemented in practice. This study explored the views and experiences of those involved in designing, delivering and evaluating MECC. We conducted a qualitative study using semi-structured interviews with 13 public health practitioners with a range of roles in implementing MECC across England. Interviews were conducted via telephone, transcribed verbatim and analysed using an inductive thematic approach. Four key themes emerged identifying factors accounting for variations in MECC implementation: (i) 'design, quality and breadth of training', (ii) 'outcomes attended to and measured', (iii) 'engagement levels of trainees and trainers' and (iv) 'system-level influences'. MECC is considered a valuable public health approach but because organisations interpret MECC differently, staff training varies in nature. Practitioners believe that implementation can be improved, and an evidence-base underpinning MECC developed, by sharing experiences more widely, introducing standardization to staff training and finding better methods for assessing meaningful outcomes.
An approach for configuring space photovoltaic tandem arrays based on cell layer performance
NASA Technical Reports Server (NTRS)
Flora, C. S.; Dillard, P. A.
1991-01-01
Meeting solar array performance goals of 300 W/kg requires use of solar cells with orbital efficiencies greater than 20 percent. Only multijunction cells and cell layers operating in tandem produce this required efficiency. An approach for defining solar array design concepts that use tandem cell layers involves the following: transforming cell layer performance at standard test conditions to on-orbit performance; optimizing circuit configuration with tandem cell layers; evaluating circuit sensitivity to cell current mismatch; developing array electrical design around selected circuit; and predicting array orbital performance including seasonal variations.
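The first step listed, transforming standard-test-condition (STC) performance to on-orbit values, is commonly approximated by a linear intensity and temperature correction; the relation below is the generic textbook form, offered only as an illustration of the step, not as the authors' specific model:

\[
P_{\mathrm{orbit}} \approx P_{\mathrm{STC}}\,
\frac{G_{\mathrm{orbit}}}{G_{\mathrm{STC}}}
\left[\,1 + \beta\,(T_{\mathrm{cell}} - T_{\mathrm{STC}})\,\right],
\]

where \(G\) is the incident irradiance (the AM0 spectrum on orbit versus the test source), \(T_{\mathrm{cell}}\) the operating temperature, and \(\beta\) the (negative) power temperature coefficient of the cell stack.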
Minimally invasive surgery for rectal cancer: Are we there yet?
Champagne, Bradley J; Makhija, Rohit
2011-01-01
Laparoscopic colon surgery for select cancers is slowly evolving as the standard of care but minimally invasive approaches for rectal cancer have been viewed with significant skepticism. This procedure has been performed by select surgeons at specialized centers and concerns over local recurrence, sexual dysfunction and appropriate training measures have further hindered widespread acceptance. Data for laparoscopic rectal resection now supports its continued implementation and widespread usage by experienced surgeons for select patients. The current controversies regarding technical approaches have created ambiguity amongst opinion leaders and are also addressed in this review. PMID:21412496
Evaluation Criteria for Micro-CAI: A Psychometric Approach
Wallace, Douglas; Slichter, Mark; Bolwell, Christine
1985-01-01
The increased use of microcomputer-based instructional programs has resulted in a greater need for third-party evaluation of the software. This in turn has prompted the development of micro-CAI evaluation tools. The present project sought to develop a prototype instrument to assess the impact of CAI program presentation characteristics on students. Data analysis and scale construction were conducted using standard item reliability analyses and factor analytic techniques. Adequate subscale reliabilities and factor structures were found, suggesting that a psychometric approach to CAI evaluation may possess some merit. Efforts to assess the utility of the resultant instrument are currently underway.
Transanal Total Mesorectal Excision: A Novel Approach to Rectal Surgery
Suwanabol, Pasithorn A.; Maykel, Justin A.
2017-01-01
Less invasive approaches continue to be explored and refined for diseases of the colon and rectum. The current gold standard for the surgical treatment of rectal cancer, total mesorectal excision (TME), is a technically precise yet demanding procedure, with success measured by both oncologic and functional outcomes (including bowel, urinary, and sexual function). To date, the minimally invasive approach to rectal cancer has not yet been perfected, leaving ample opportunity for rectal surgeons to innovate. Transanal TME has recently emerged as a safe and effective technique for both benign and malignant diseases of the rectum. While widespread acceptance of this surgical approach remains tempered at this time due to lack of long-term oncologic outcome data, short-term outcomes are promising and there is great excitement surrounding the promise of this technique. PMID:28381943
Comparison of Predictive Modeling Methods of Aircraft Landing Speed
NASA Technical Reports Server (NTRS)
Diallo, Ousmane H.
2012-01-01
Expected increases in air traffic demand have stimulated the development of air traffic control tools intended to assist the air traffic controller in accurately and precisely spacing aircraft landing at congested airports. Such tools will require an accurate landing-speed prediction to increase throughput while decreasing necessary controller interventions for avoiding separation violations. There are many practical challenges to developing an accurate landing-speed model that has acceptable prediction errors. This paper discusses the development of a near-term implementation, using readily available information, to estimate/model final approach speed from the top of the descent phase of flight to the landing runway. As a first approach, all variables found to contribute directly to the landing-speed prediction model are used to build a multi-regression technique of the response surface equation (RSE). Data obtained from operations of a major airline for a passenger transport aircraft type to the Dallas/Fort Worth International Airport are used to predict the landing speed. The approach was promising because it decreased the standard deviation of the landing-speed error prediction by at least 18% from the standard deviation of the baseline error, depending on the gust condition at the airport. However, when the number of variables is reduced to the most likely obtainable at other major airports, the RSE model shows little improvement over the existing methods. Consequently, a neural network that relies on a nonlinear regression technique is utilized as an alternative modeling approach. For the reduced number of variables cases, the standard deviation of the neural network model's errors represents more than a 5% reduction compared to the RSE model errors, and at least a 10% reduction over the baseline predicted landing-speed error standard deviation. Overall, the constructed models predict the landing speed more accurately and precisely than the current state-of-the-art.
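The two modeling approaches compared above can be sketched in a few lines; the data here are synthetic stand-ins (the airline data are not public), and the predictor set is hypothetical:

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(1)
n = 2000
# Hypothetical predictors: landing weight (kg), headwind (kt), gust (kt), flap setting.
X = np.column_stack([rng.uniform(50e3, 90e3, n), rng.uniform(0, 30, n),
                     rng.uniform(0, 15, n), rng.choice([30.0, 40.0], n)])
# Synthetic landing speed (kt) with a mild nonlinearity and noise.
y = (0.0009 * X[:, 0] + 0.3 * X[:, 1] + 0.02 * X[:, 2] ** 2
     - 0.5 * X[:, 3] + rng.normal(0, 2.5, n))
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Response surface equation: a second-order polynomial regression.
rse = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X_tr, y_tr)
# Neural network: a small multilayer perceptron on standardized inputs.
nn = make_pipeline(StandardScaler(),
                   MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000,
                                random_state=0)).fit(X_tr, y_tr)

for name, model in [("RSE", rse), ("NN", nn)]:
    print(name, "error std (kt):", (y_te - model.predict(X_te)).std().round(2))

The paper's comparison metric, the standard deviation of the landing-speed prediction error, is exactly what the final loop prints.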
The need for data standards in zoomorphology.
Vogt, Lars; Nickel, Michael; Jenner, Ronald A; Deans, Andrew R
2013-07-01
eScience is a new approach to research that focuses on data mining and exploration rather than data generation or simulation. This new approach is arguably a driving force for scientific progress and requires data to be openly available, easily accessible via the Internet, and compatible with each other. eScience relies on modern standards for the reporting and documentation of data and metadata. Here, we suggest necessary components (i.e., content, concept, nomenclature, format) of such standards in the context of zoomorphology. We document the need for using data repositories to prevent data loss and how publication practice is currently changing, with the emergence of dynamic publications and the publication of digital datasets. Subsequently, we demonstrate that in zoomorphology the scientific record is still limited to published literature and that zoomorphological data are usually not accessible through data repositories. The underlying problem is that zoomorphology lacks the standards for data and metadata. As a consequence, zoomorphology cannot participate in eScience. We argue that the standardization of morphological data requires i) a standardized framework for terminologies for anatomy and ii) a formalized method of description that allows computer-parsable morphological data to be communicable, compatible, and comparable. The role of controlled vocabularies (e.g., ontologies) for developing respective terminologies and methods of description is discussed, especially in the context of data annotation and semantic enhancement of publications. Finally, we introduce the International Consortium for Zoomorphology Standards, a working group that is open to everyone and whose aim is to stimulate and synthesize dialog about standards. It is the Consortium's ultimate goal to assist the zoomorphology community in developing modern data and metadata standards, including anatomy ontologies, thereby facilitating the participation of zoomorphology in eScience. Copyright © 2013 Wiley Periodicals, Inc.
Innovative approach to teaching communication skills to nursing students.
Zavertnik, Jean Ellen; Huff, Tanya A; Munro, Cindy L
2010-02-01
This study assessed the effectiveness of a learner-centered simulation intervention designed to improve the communication skills of preprofessional sophomore nursing students. An innovative teaching strategy in which communication skills are taught to nursing students by using trained actors who served as standardized family members in a clinical learning laboratory setting was evaluated using a two-group posttest design. In addition to current standard education, the intervention group received a formal training session presenting a framework for communication and a 60-minute practice session with the standardized family members. Four domains of communication - introduction, gathering of information, imparting information, and clarifying goals and expectations - were evaluated in the control and intervention groups in individual testing sessions with a standardized family member. The intervention group performed better than the control group in all four tested domains related to communication skills, and the difference was statistically significant in the domain of gathering information (p = 0.0257). Copyright 2010, SLACK Incorporated.
Ash, Susan; O'Connor, Jackie; Anderson, Sarah; Ridgewell, Emily; Clarke, Leigh
2015-06-01
The requirement for an allied health workforce is expanding as the global burden of disease increases internationally. To safely meet the demand for an expanded workforce of orthotist/prosthetists in Australia, competency-based standards, which are up-to-date and evidence-based, are required. The aims of this study were to determine the minimum level for entry into the orthotic/prosthetic profession; to develop entry-level competency standards for the profession; and to validate the developed entry-level competency standards within the profession nationally, using an evidence-based approach. A mixed-methods research design was applied, using a three-step sequential exploratory design, where step 1 involved collecting and analyzing qualitative data from two focus groups; step 2 involved exploratory instrument development and testing, developing the draft competency standards; and step 3 involved quantitative data collection and analysis - a Delphi survey. In stage 1 (steps 1 and 2), the two focus groups - an expert and a recent graduate group of Australian orthotist/prosthetists - were led by an experienced facilitator, to identify gaps in the current competency standards and then to outline a key purpose, and work roles and tasks for the profession. The resulting domains and activities of the first draft of the competency standards were synthesized using thematic analysis. In stage 2 (step 3), the draft competency standards were circulated to a purposive sample of the membership of the Australian Orthotic Prosthetic Association, using three rounds of Delphi survey. A project reference group of orthotist/prosthetists reviewed the results of both stages. In stage 1, the expert (n = 10) and the new graduate (n = 8) groups separately identified work roles and tasks, which formed the initial draft of the competency standards. Further drafts were refined and performance criteria added by the project reference group, resulting in the final draft competency standards. In stage 2, the final draft competency standards were circulated to 56 members (n = 44 final round) of the Association, who agreed on the key purpose, 6 domains, 18 activities, and 68 performance criteria of the final competency standards. This study outlines a rigorous and evidence-based mixed-methods approach for developing and endorsing professional competency standards, which is representative of the views of the profession of orthotist/prosthetists.
Norris, Susan L.; Holmer, Haley K.; Burda, Brittany U.; Ogden, Lauren A.; Fu, Rongwei
2012-01-01
Background: Conflict of interest (COI) of clinical practice guideline (CPG) sponsors and authors is an important potential source of bias in CPG development. The objectives of this study were to describe the COI policies for organizations currently producing a significant number of CPGs, and to determine if these policies meet 2011 Institute of Medicine (IOM) standards. Methodology/Principal Findings: We identified organizations with five or more guidelines listed in the National Guideline Clearinghouse between January 1, 2009 and November 5, 2010. We obtained the COI policy for each organization from publicly accessible sources, most often the organization's website, and compared those policies to IOM standards related to COI. 37 organizations fulfilled our inclusion criteria, of which 17 (46%) had a COI policy directly related to CPGs. These COI policies varied widely with respect to types of COI addressed, from whom disclosures were collected, monetary thresholds for disclosure, approaches to management, and updating requirements. Not one organization's policy adhered to all seven of the IOM standards that were examined, and nine organizations did not meet a single one of the standards. Conclusions/Significance: COI policies among organizations producing a large number of CPGs currently do not measure up to IOM standards related to COI disclosure and management. CPG developers need to make significant improvements in these policies and their implementation in order to optimize the quality and credibility of their guidelines. PMID:22629391
Paramedic specialization: a strategy for better out-of-hospital care.
Caffrey, Sean M; Clark, John R; Bourn, Scott; Cole, Jim; Cole, John S; Mandt, Maria; Murray, Jimm; Sibold, Harry; Stuhlmiller, David; Swanson, Eric R
2014-01-01
Demographic, economic, and political forces are driving significant change in the US health care system. Paramedics are a health profession currently providing advanced emergency care and medical transportation throughout the United States. As the health care system demands more team-based care in nonacute, community, interfacility, and tactical response settings, specialized paramedic practitioners could be a valuable and well-positioned resource to meet these needs. Currently, there is limited support for specialty certifications that demand appropriate education, training, or experience standards before specialized practice by paramedics. A fragmented approach to specialty paramedic practice currently exists across our country in which states, regulators, nonprofit organizations, and other health care professions influence and regulate the practice of paramedicine. Multiple other medical professions, however, have already developed effective systems over the last century that can be easily adapted to the practice of paramedicine. Paramedicine practitioners need to organize a profession-based specialty board to organize and standardize a specialty certification system that can be used on a national level. Copyright © 2014 Air Medical Journal Associates. Published by Elsevier Inc. All rights reserved.
Johnston, Jennifer M.
2014-01-01
The majority of biological processes mediated by G Protein-Coupled Receptors (GPCRs) take place on timescales that are not conveniently accessible to standard molecular dynamics (MD) approaches, notwithstanding the current availability of specialized parallel computer architectures and efficient simulation algorithms. Enhanced MD-based methods have started to assume an important role in the study of the rugged energy landscape of GPCRs by providing mechanistic details of complex receptor processes such as ligand recognition, activation, and oligomerization. We provide here an overview of these methods in their most recent application to the field. PMID:24158803
Multi-mission space science data processing systems - Past, present, and future
NASA Technical Reports Server (NTRS)
Stallings, William H.
1990-01-01
Packetized telemetry that is consistent with international Consultative Committee for Space Data Systems (CCSDS) standards has been baselined for future NASA missions such as Space Station Freedom. Some experiences from past and present multimission systems are examined, including current experiences in implementing a CCSDS standard packetized data processing system, relative to the effectiveness of the multimission approach in lowering life cycle cost and the complexity of meeting new mission needs. It is shown that the continued effort toward standardization of telemetry and processing support will permit the development of multimission systems needed to meet the increased requirements of future NASA missions.
Strategic Deployment of Clinical Models.
Goossen, William
2016-01-01
The selection, implementation, and certification of electronic health records (EHR) could benefit from the required use of one of the established clinical model approaches. For the lifelong record of data about individuals, issues arise about the permanence and preservation of data during or even beyond a lifetime. Current EHRs do not fully adhere to pertinent standards for clinical data, even though it has been known for some 20-plus years that standardization of health data is a cornerstone for patient safety, interoperability, data retrieval for various purposes and the lifelong preservation of such data. This paper briefly introduces the issues and gives a brief recommendation for future work in this area.
Englar, Ryane E
Experiential learning through the use of standardized patients (SPs) is the primary way by which human medical schools teach clinical communication. The profession of veterinary medicine has followed suit in response to new graduates' and their employers' concerns that veterinary interpersonal skills are weak and unsatisfactory. As a result, standardized clients (SCs) are increasingly relied upon as invaluable teaching tools within veterinary curricula to advance relationship-centered care in the context of a clinical scenario. However, there is little to no uniformity in the approach that various colleges of veterinary medicine take when designing simulation-based education (SBE). A further complication is that programs with pre-conceived curricula must now make room for training in clinical communication. Curricular time constraints challenge veterinary colleges to individually decide how best to utilize SCs in what time is available. Because it is a new program, Midwestern University College of Veterinary Medicine (MWU CVM) has had the flexibility and the freedom to prioritize an innovative approach to SBE. The author discusses the SBE that is currently underway at MWU CVM, which incorporates 27 standardized client encounters over eight consecutive pre-clinical quarters. Prior to entering clinical rotations, MWU CVM students are exposed to a variety of simulation formats, species, clients, settings, presenting complaints, and communication tasks. These represent key learning opportunities for students to practice clinical communication, develop self-awareness, and strategize their approach to future clinical experiences.
An Evaluation of Unit and ½ Mass Correction Approaches as a ...
Rare earth elements (REE) and certain alkaline earths can produce M+2 interferences in ICP-MS because they have sufficiently low second ionization energies. Four REEs (150Sm, 150Nd, 156Gd and 156Dy) produce false positives on 75As and 78Se and 132Ba can produce a false positive on 66Zn. Currently, US EPA Method 200.8 does not address these as sources of false positives. Additionally, these M+2 false positives are typically enhanced if collision cell technology is utilized to reduce polyatomic interferences associated with ICP-MS detection. A preliminary evaluation indicates that instrumental tuning conditions can impact the observed M+2/M+1 ratio and in turn the false positives generated on Zn, As and Se. Both unit and ½ mass approaches will be evaluated to correct for these false positives relative to the benchmark concentrations estimates from a triple quadrupole ICP-MS using standard solutions. The impact of matrix on these M+2 corrections will be evaluated over multiple analysis days with a focus on evaluating internal standards that mirror the matrix induced shifts in the M+2 ion transmission. The goal of this evaluation is to move away from fixed M+2 corrective approaches and move towards sample specific approaches that mimic the sample matrix induced variability while attempting to address intra-day variability of the M+2 correction factors through the use of internal standards. Oral Presentation via webinar for EPA Laboratory Technical Informati
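The arithmetic of a sample-specific M+2 correction is simple; all numbers below are hypothetical placeholders for illustration, not method values:

# Correct m/z 75 (As+) for doubly charged 150Sm appearing at the same m/z.
def corrected_as75(counts_75, counts_147sm, mpp_ratio, sm150_to_sm147=7.38 / 14.99):
    # counts_147sm : signal on an interference-free Sm isotope (147Sm+),
    #                scaled to an estimated 150Sm+ signal via the natural
    #                abundance ratio (150Sm 7.38%, 147Sm 14.99%).
    # mpp_ratio    : M++/M+ ratio measured on a pure Sm standard under the
    #                day's tune; the abstract's point is that this ratio
    #                shifts with tuning and matrix, so internal standards
    #                that track the shift are preferable to a fixed value.
    est_sm150_plus = counts_147sm * sm150_to_sm147
    return counts_75 - est_sm150_plus * mpp_ratio

print(corrected_as75(counts_75=5000.0, counts_147sm=2.0e5, mpp_ratio=0.02))

With these illustrative inputs, roughly 2,000 of the 5,000 counts at m/z 75 would be attributed to 150Sm++ rather than to arsenic.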
NASA Technical Reports Server (NTRS)
Burt, Eric A.; Tjoelker, R. L.
2007-01-01
A recent long-term comparison between the compensated multi-pole Linear Ion Trap Standard (LITS) and the laser-cooled primary standards via GPS carrier phase time transfer showed a deviation of less than 2.7×10^−17 per day. A subsequent evaluation of potential drift contributors in the LITS showed that the leading candidates are fluctuations in background gases and the neon buffer gas. The current vacuum system employs a "flow-through" turbomolecular pump and a diaphragm fore pump. Here we consider the viability of a "sealed" vacuum system pumped by a non-evaporable getter for long-term ultra-stable clock operation. Initial tests suggest that both further stability improvement and longer mean-time-between-maintenance can be achieved using this approach.
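To put the quoted bound in perspective: a constant fractional-frequency drift D accumulates a quadratic time error, approximately

\[
\Delta t \approx \tfrac{1}{2}\,D\,N^{2}\times 86\,400\ \mathrm{s},
\]

so D = 2.7×10^−17 per day over N = 365 days corresponds to Δt ≈ ½ × 2.7×10^−17 × 365² × 86 400 s ≈ 1.6×10^−7 s, i.e. on the order of 0.16 μs of accumulated time error per year (an illustrative back-of-envelope figure, not a number from the report).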
A modular approach for automated sample preparation and chemical analysis
NASA Technical Reports Server (NTRS)
Clark, Michael L.; Turner, Terry D.; Klingler, Kerry M.; Pacetti, Randolph
1994-01-01
Changes in international relations, especially within the past several years, have dramatically affected the programmatic thrusts of the U.S. Department of Energy (DOE). The DOE now is addressing the environmental cleanup required as a result of 50 years of nuclear arms research and production. One major obstacle in the remediation of these areas is the chemical determination of potentially contaminated material using currently acceptable practices. Process bottlenecks and exposure to hazardous conditions pose problems for the DOE. One proposed solution is the application of modular automated chemistry using Standard Laboratory Modules (SLM) to perform Standard Analysis Methods (SAM). The Contaminant Analysis Automation (CAA) Program has developed standards and prototype equipment that will accelerate the development of modular chemistry technology and is transferring this technology to private industry.
Implementing the Next Generation Science Standards: Impacts on Geoscience Education
NASA Astrophysics Data System (ADS)
Wysession, M. E.
2014-12-01
This is a critical time for the geoscience community. The Next Generation Science Standards (NGSS) have been released and are now being adopted by states (a dozen states and Washington, DC, at the time of writing this), with dramatic implications for national K-12 science education. Curriculum developers and textbook companies are working hard to construct educational materials that match the new standards, which emphasize a hands-on practice-based approach that focuses on working directly with primary data and other forms of evidence. While the set of 8 science and engineering practices of the NGSS lend themselves well to the observation-oriented approach of much of the geosciences, there is currently not a sufficient number of geoscience educational modules and activities geared toward the K-12 levels, and geoscience research organizations need to be mobilizing their education & outreach programs to meet this need. It is a rare opportunity that will not come again in this generation. There are other significant issues surrounding the implementation of the NGSS. The NGSS involves a year of Earth and space science at the high school level, but there does not exist a sufficient workforce of geoscience teachers to meet this need. The form and content of the geoscience standards are also very different from past standards, moving away from a memorization and categorization approach and toward a complex Earth Systems Science approach. Combined with the shift toward practice-based teaching, this means that significant professional development will therefore be required for the existing K-12 geoscience education workforce. How the NGSS are to be assessed is another significant question, with an NRC report providing some guidance but leaving many questions unanswered. There is also an uneasy relationship between the NGSS and the Common Core of math and English, and the recent push-back against the Common Core in many states may impact the implementation of the NGSS.
A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World
NASA Astrophysics Data System (ADS)
Wright, D. J.; Sankaran, S.
2015-12-01
In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO-TC 211 have done yeoman service to promote a standards-centric approach to manage the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are numerous. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used. And to this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition and can therefore seem to fall short of user expectations. It seems therefore that organizations are left with a series of perceived orthogonal requirements when they want to pursue interoperability. They want a robust but agile solution, a mature approach that also needs to satisfy the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms that are built on fundamental "open" principles and which align with popular mainstream patterns. We seek to explore data-, metadata- and web service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability considerations, and appealing developer frameworks that can grow the audience. The path to tread is not new, and the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we all take for granted as part of our everyday use of technology.
Henderson, Silja E K; Elsass, Peter; Berliner, Peter
2016-07-01
The primary objective of this paper is to examine and inform the mental health and psychosocial support standards of the 2011 edition of the Sphere Project's Humanitarian Charter and Minimum Standards in Humanitarian Response. This is done through a qualitative analysis of internal evaluation documents, reflecting four long-term humanitarian psychosocial programmes in different countries in post-tsunami Asia. The analysis yielded three overall conclusions. First, the Sphere standards on mental health and psychosocial support generally are highly relevant to long-term psychosocial interventions after disasters such as the Indian Ocean tsunami of 26 December 2004, and their application in such settings may improve the quality of the response. Second, some of the standards in the current Sphere handbook may lack sufficient guidance to ensure the quality of humanitarian response required. Third, the long-term intervention approach poses specific challenges to programming, a problem that could be addressed by including additional guidance in the publication. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
When the value of gold is zero.
Chase, J Geoffrey; Moeller, Knut; Shaw, Geoffrey M; Schranz, Christoph; Chiew, Yeong Shiong; Desaive, Thomas
2014-06-27
This manuscript presents the concerns around the increasingly common problem of not having readily available or useful "gold standard" measurements. This issue is particularly important in critical care, where many measurements used in decision making are surrogates of what we would truly wish to use. However, the question is broad, important and applicable in many other areas. In particular, a gold standard measurement often exists, but is not clinically (or ethically in some cases) feasible. The question is: how does one even begin to develop new measurements or surrogates if one has no gold standard to compare with? We raise this issue concisely with a specific example from mechanical ventilation, a core bread-and-butter therapy in critical care that is also a leading cause of length of stay and cost of care. Our proposed solution centers on a hierarchical validation approach that we believe would ameliorate ethics issues around radiation exposure that make current gold standard measures clinically infeasible, and thus provide a pathway to create a (new) gold standard.
Kwak, Jin Il; Nam, Sun-Hwa; An, Youn-Joo
2018-02-01
Since the Korean Ministry of the Environment established the Master Plan for Water Environment (2006-2015), the need to revise the water quality standards (WQSs) has driven government projects to expand the standards for the protection of human health and aquatic ecosystems. This study aimed to provide an historical overview of how these WQSs were established, amended, and expanded over the past 10 years in Korea. Here, major projects related to national monitoring in rivers and the amendment of WQSs were intensely reviewed, including projects on the categorization of hazardous chemicals potentially discharged into surface water, the chemical ranking and scoring methodology for surface water (CRAFT, Chemical RAnking of surFace water polluTants), whole effluent toxicity (WET) management systems, the 4th, 5th, and 6th revisions of the water quality standards for the protection of human health, and efforts toward developing the 7th revision. In this review, we assimilated the past and current status as well as future perspectives of Korean surface WQSs. This research provides information that aids our understanding of how surface WQSs have been expanded, and how scientific approaches to ensure water quality have been applied at each step of the process in Korea.
Towards life cycle sustainability assessment of cities. A review of background knowledge.
Albertí, Jaume; Balaguera, Alejandra; Brodhag, Christian; Fullana-I-Palmer, Pere
2017-12-31
This article analyses whether existing LCA and sustainability methods can be used in the assessment of a city or an urban region. The approach is performed through the review of current existing LCA-based and sustainability standards and guidelines. A focus is put on those LCA-based standards specially designed for the built environment. Moreover, a review of non-LCA-based standards, indices and guides for the assessment of the sustainability of countries, cities or urban regions is done. The purpose is to check if these assessment tools can provide good results in the absence of LCA-based assessments for cities and urban regions. This review demonstrates the lack of consensus in the definition of both the city and its boundaries, which hinders the development of useful sustainability standards. Furthermore, it is concluded that current sustainability assessment tools miss at least one of these aspects: (i) a holistic point of view, (ii) focus on various environmental impacts, (iii) a Life Cycle (LC) perspective, and (iv) the possibility to compare the results among different cities or urban regions. From the LCA perspective, the deficiencies found also include the need for a definition of the function, functional unit (FU), and reference flow (RF) of neighbourhoods, cities, and urban regions. Copyright © 2017 Elsevier B.V. All rights reserved.
Developing accreditation for community based surgery: the Irish experience.
Ní Riain, Ailís; Collins, Claire; O'Sullivan, Tony
2018-02-05
Purpose: Carrying out minor surgery procedures in the primary care setting is popular with patients, cost effective and delivers at least as good outcomes as those performed in the hospital setting. This paper aims to describe the central role of clinical leadership in developing an accreditation system for general practitioners (GPs) undertaking community-based surgery in the Irish national setting, where no mandatory accreditation process currently exists. Design/methodology/approach: In all, 24 GPs were recruited to the GP network. Ten pilot standards were developed addressing GPs' experience and training, clinical activity and practice supporting infrastructure, and tested using information and document review, prospective collection of clinical data and a practice inspection visit. Two additional components were incorporated into the project (patient satisfaction survey and self-audit). A multi-modal evaluation was undertaken. A majority of GPs was included at all stages of the project, in line with the principles of action learning. The steering group had a majority of GPs with relevant expertise and representation of all other actors in the minor surgery arena. The GP research network contributed to each stage of the project. The project lead was a GP with minor surgery experience. Quantitative data collected were analysed using Predictive Analytic SoftWare. Krueger's framework analysis approach was used to analyse the qualitative data. Findings: A total of 9 GPs achieved all standards at initial review, 14 successfully completed corrective actions and 1 GP did not achieve the required standard. Standards were then amended to reflect findings and a supporting framework was developed. Originality/value: The flexibility of the action-learning approach and the clinical leadership design allowed for the development of robust quality standards in a short timeframe.
PILOT STUDY: CCQM-P43: Tributyltin and dibutyltin in sediment
NASA Astrophysics Data System (ADS)
Wolff Briche, Céline S. J.; Wahlen, Raimund; Sturgeon, Ralph E.
2006-01-01
The pilot study CCQM-P43 was undertaken to allow the assessment of the current capabilities of interested National Metrology Institutes (NMIs) (those which are members of the CCQM) and selected outside 'expert' laboratories for quantification of (C4H9)2Sn+ (DBT) and (C4H9)3Sn+ (TBT) in a prepared marine sediment. It was organised in parallel to the key comparison CCQM-K28, in which only NMIs determined TBT. This exercise was sanctioned by the 8th CCQM meeting, 18-19 April 2002, as an activity of the Inorganic Analysis Working Group and was jointly coordinated by the Institute for National Measurement Standards of the National Research Council of Canada (NRC) and LGC, UK. A total of 13 laboratories initially indicated interest (nine NMIs and four external laboratories). Only one external laboratory utilised a standard calibration approach based on natural abundance TBT and DBT standards, whereas all NMIs relied on isotope dilution mass spectrometry for quantitation (one NMI used ID-MS and an internal standard approach for the analysis of DBT). For this purpose, species-specific 117Sn-enriched TBT and DBT standards were supplied by LGC. No sample preparation methodology was prescribed by the coordinating laboratories and, as a consequence, a variety of approaches was adopted by the participants, including mechanical shaking, sonication, accelerated solvent extraction, microwave assisted extraction and heating in combination with Grignard derivatization, ethylation and direct sampling. Detection techniques included ICP-MS (coupled to GC or HPLC), GC-MS and GC-AED. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the Mutual Recognition Arrangement (MRA).
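For reference, the species-specific isotope dilution used by the NMIs reduces, in its simplest single-IDMS form, to

\[
c_x = c_y\,\frac{m_y}{m_x}\cdot\frac{A_y - R_m B_y}{R_m B_x - A_x},
\]

where \(c_x\) is the analyte content of the sample, \(c_y\) that of the 117Sn-enriched spike, \(m_x\) and \(m_y\) the blended masses, \(A\) and \(B\) the abundances of the reference and spike isotopes in each material, and \(R_m\) the isotope-amount ratio measured in the blend. This is the generic textbook relation; the participants' exact working equations (including the exact-matching double-IDMS variants many NMIs prefer) differ in detail.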
Neutrino oscillations and Non-Standard Interactions
NASA Astrophysics Data System (ADS)
Farzan, Yasaman; Tórtola, Mariam
2018-02-01
Current neutrino experiments are measuring the neutrino mixing parameters with an unprecedented accuracy. The upcoming generation of neutrino experiments will be sensitive to subdominant oscillation effects that can give information on the yet-unknown neutrino parameters: the Dirac CP-violating phase, the mass ordering and the octant of θ_{23}. Determining the exact values of neutrino mass and mixing parameters is crucial to test neutrino models and flavor symmetries designed to predict these neutrino parameters. In the first part of this review, we summarize the current status of the neutrino oscillation parameter determination. We consider the most recent data from all solar experiments and the atmospheric data from Super-Kamiokande, IceCube and ANTARES. We also implement the data from the reactor neutrino experiments KamLAND, Daya Bay, RENO and Double Chooz as well as the long baseline neutrino data from MINOS, T2K and NOvA. If in addition to the standard interactions, neutrinos have subdominant yet-unknown Non-Standard Interactions (NSI) with matter fields, extracting the values of these parameters will suffer from new degeneracies and ambiguities. We review such effects and formulate the conditions on the NSI parameters under which the precision measurement of neutrino oscillation parameters can be distorted. Like standard weak interactions, the non-standard interactions can be categorized into two groups: Charged Current (CC) NSI and Neutral Current (NC) NSI. Our focus will be mainly on neutral current NSI because it is possible to build a class of models that give rise to sizeable NC NSI with discernible effects on neutrino oscillation. These models are based on a new U(1) gauge symmetry with a gauge boson of mass ≲ 10 MeV. The UV-complete model should, of course, be electroweak invariant, which in general implies that, along with neutrinos, charged fermions also acquire new interactions on which there are strong bounds. We enumerate the bounds that already exist on the electroweak symmetric models and demonstrate that it is possible to build viable models avoiding all these bounds. In the end, we review methods to test these models and suggest approaches to break the degeneracies in deriving neutrino mass parameters caused by NSI.
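For orientation, NC NSI are conventionally parameterized by dimensionless couplings ε_{αβ} that enter the matter part of the effective Hamiltonian in the flavor basis:

\[
H = \frac{1}{2E}\,U\,\mathrm{diag}\!\left(0,\,\Delta m^{2}_{21},\,\Delta m^{2}_{31}\right)U^{\dagger}
+ V_{CC}
\begin{pmatrix}
1+\varepsilon_{ee} & \varepsilon_{e\mu} & \varepsilon_{e\tau}\\
\varepsilon_{e\mu}^{*} & \varepsilon_{\mu\mu} & \varepsilon_{\mu\tau}\\
\varepsilon_{e\tau}^{*} & \varepsilon_{\mu\tau}^{*} & \varepsilon_{\tau\tau}
\end{pmatrix},
\qquad V_{CC}=\sqrt{2}\,G_F\,n_e .
\]

This is the standard parameterization in the NSI literature; the degeneracies discussed above arise because suitable ε entries can mimic shifts in the mass ordering, the θ_{23} octant, or the CP phase.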
Development of procedures for programmable proximity aperture lithography
NASA Astrophysics Data System (ADS)
Whitlow, H. J.; Gorelick, S.; Puttaraksa, N.; Napari, M.; Hokkanen, M. J.; Norarat, R.
2013-07-01
Programmable proximity aperture lithography (PPAL) with MeV ions has been used in Jyväskylä and Chiang Mai universities for a number of years. Here we describe a number of innovations and procedures that have been incorporated into the LabView-based software. The basic operation involves the coordination of the beam blanker and five motor-actuated translators with high accuracy, close to the minimum step size, with proper anti-collision algorithms. By using special approaches, such as writing calibration patterns, linearisation of position and careful backlash correction, the absolute accuracy of the aperture size and position can be improved beyond the standard afforded by the repeatability of the translator end-point switches. Another area of consideration has been the fluence control procedures. These involve control of the uniformity of the beam, with different approaches to fluence measurement, such as simultaneous measurement of the aperture current and of the ion current passing through the aperture using a Faraday cup. Microfluidic patterns may contain many elements that make up mixing sections, reaction chambers, separation columns and fluid reservoirs. To facilitate conception and planning we have implemented a .svg file interpreter that allows the use of scalable vector graphics files produced by standard drawing software for generation of patterns made up of rectangular elements.
Low cost paths to binary optics
NASA Technical Reports Server (NTRS)
Nelson, Arthur; Domash, Lawrence
1993-01-01
Application of binary optics has been limited to a few major laboratories because of the limited availability of fabrication facilities such as e-beam machines and the lack of standardized design software. Foster-Miller has attempted to identify low cost approaches to medium-resolution binary optics using readily available computer and fabrication tools, primarily for the use of students and experimenters in optical computing. An early version of our system, MacBEEP, made use of an optimized laser film recorder from the commercial typesetting industry with 10 micron resolution. This report is an update on our current efforts to design and build a second generation MacBEEP, which aims at 1 micron resolution and multiple phase levels. Trials included a low cost scanning electron microscope in microlithography mode, and alternative laser inscribers or photomask generators. Our current software approach is based on Mathematica and PostScript compatibility.
[Precision Oncology and "Molecular Tumor Boards" - Concepts, Chances and Challenges].
Holch, Julian Walter; Westphalen, Christoph Benedikt; Hiddemann, Wolfgang; Heinemann, Volker; Jung, Andreas; Metzeler, Klaus Hans
2017-11-01
Recent developments in genomics allow a more and more comprehensive genetic analysis of human malignancies, and have sparked hopes that this will contribute to the development of novel targeted, effective and well-tolerated therapies.While targeted therapies have improved the prognosis of many cancer patients with certain tumor types, "precision oncology" also brings along new challenges. Highly personalized treatment strategies require new strategies for clinical trials and translation into routine clinical practice. We review the current technical approaches for "universal genetic testing" in cancer, and potential pitfalls in the interpretation of such data. We then provide an overview of the available evidence supporting treatment strategies based on extended genetic analysis. Based on the available data, we conclude that "precision oncology" approaches that go beyond the current standard of care should be pursued within the framework of an interdisciplinary "molecular tumor board", and preferably within clinical trials. © Georg Thieme Verlag KG Stuttgart · New York.
Codony, Francesc; Pérez, Leonardo Martín; Adrados, Bárbara; Agustí, Gemma; Fittipaldi, Mariana; Morató, Jordi
2012-01-01
Culture-based methods for fecal indicator microorganisms are the standard protocol to assess potential health risk from drinking water systems. However, these traditional fecal indicators are inappropriate surrogates for disinfection-resistant fecal pathogens and the indigenous pathogens that grow in drinking water systems. There is now a range of molecular-based methods, such as quantitative PCR, which allow detection of a variety of pathogens and alternative indicators. Hence, in addition to targeting total Escherichia coli (i.e., dead and alive) for the detection of fecal pollution, various amoebae may be suitable to indicate the potential presence of pathogenic amoeba-resisting microorganisms, such as Legionellae. Therefore, monitoring amoeba levels by quantitative PCR could be a useful tool for directly and indirectly evaluating health risk and could also be a complementary approach to current microbial quality control strategies for drinking water systems.
Contemporary approach to stroke prevention in atrial fibrillation: Risks, benefits, and new options.
Stock, Jonathan; Malm, Brian J
2018-04-04
Atrial fibrillation is a common diagnosis affecting nearly 3 million adults in the United States. Morbidity and mortality in these patients is driven largely by the associated increased risk of thromboembolic complications, especially stroke. Atrial fibrillation is a stronger risk factor than hypertension, coronary disease, or heart failure and is associated with an approximately five-fold increased risk. Mitigating stroke risk can be challenging and requires accurate assessment of stroke risk factors and careful selection of appropriate therapy. Anticoagulation, including the more recently introduced direct oral anticoagulants, is the standard of care for most patients. In addition, emerging non-pharmacologic mechanical interventions are playing an expanding role in reducing stroke risk in select patients. In this review we highlight the current approach to stroke risk stratification in atrial fibrillation and discuss in detail the mechanism, risks, and benefits of current and evolving therapies. Copyright © 2018 Elsevier Inc. All rights reserved.
Biologically Based Restorative Management of Tooth Wear
Kelleher, Martin G. D.; Bomfim, Deborah I.; Austin, Rupert S.
2012-01-01
The prevalence and severity of tooth wear is increasing in industrialised nations. Yet, there is no high-level evidence to support or refute any therapeutic intervention. In the absence of such evidence, many currently prevailing management strategies for tooth wear may be failing in their duty of care to first and foremost improve the oral health of patients with this disease. This paper promotes biologically sound approaches to the management of tooth wear on the basis of current best evidence of the aetiology and clinical features of this disease. The relative risks and benefits of the varying approaches to managing tooth wear are discussed with reference to long-term follow-up studies. Using reference to ethical standards such as “The Daughter Test”, this paper presents case reports of patients with moderate-to-severe levels of tooth wear managed in line with these biologically sound principles. PMID:22315608
User-Defined Data Distributions in High-Level Programming Languages
NASA Technical Reports Server (NTRS)
Diaconescu, Roxana E.; Zima, Hans P.
2006-01-01
One of the characteristic features of today's high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
Koren, Danny; Seidman, Larry J; Goldsmith, Morris; Harvey, Phillip D
2006-01-01
While the role of impaired cognition in accounting for functional outcome in schizophrenia is generally established by now, the overlap is far from complete. Moreover, little is known about the potential mechanisms that bridge between cognition and functional outcome. The aim of this article is to aid in closing this gap by presenting a novel, more ecologically valid approach for neuropsychological assessment. The new approach is motivated by the view that metacognitive processes of self-monitoring and self-regulation are fundamental determinants of competent functioning in the real world. The new approach incorporates experimental psychological concepts and paradigms used to study metacognition into current standard neuropsychological assessment procedures. Preliminary empirical data that support and demonstrate the utility of the new approach for assessment, as well as remediation efforts, in schizophrenia are presented and discussed. PMID:16397202
NASA Astrophysics Data System (ADS)
Jog, Mayank V.; Smith, Robert X.; Jann, Kay; Dunn, Walter; Lafon, Belen; Truong, Dennis; Wu, Allan; Parra, Lucas; Bikson, Marom; Wang, Danny J. J.
2016-10-01
Transcranial direct current stimulation (tDCS) is an emerging non-invasive neuromodulation technique that applies mA currents at the scalp to modulate cortical excitability. Here, we present a novel magnetic resonance imaging (MRI) technique, which detects magnetic fields induced by tDCS currents. This technique is based on Ampere’s law and exploits the linear relationship between direct current and induced magnetic fields. Following validation on a phantom with a known path of electric current and induced magnetic field, the proposed MRI technique was applied to a human limb (to demonstrate in-vivo feasibility using simple biological tissue) and human heads (to demonstrate feasibility in standard tDCS applications). The results show that the proposed technique detects tDCS induced magnetic fields as small as a nanotesla at millimeter spatial resolution. Through measurements of magnetic fields linearly proportional to the applied tDCS current, our approach opens a new avenue for direct in-vivo visualization of tDCS target engagement.
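The linear current-field relationship the technique exploits can be sanity-checked with Ampere's law for an idealized geometry. A minimal sketch, assuming a long straight current path as a crude stand-in for the distributed tDCS current (real head currents spread out, so actual fields are of this order or smaller):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def b_field_long_wire(current_a: float, distance_m: float) -> float:
    """Field magnitude at a given distance from a long straight current path
    (Ampere's law); a crude stand-in for a tDCS current pathway."""
    return MU0 * current_a / (2 * math.pi * distance_m)

# A typical 2 mA tDCS current viewed from ~2 cm away:
b = b_field_long_wire(2e-3, 0.02)
print(f"{b * 1e9:.1f} nT")  # ~20 nT -- nanotesla scale, as reported
```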
Simplified Approach Charts Improve Data Retrieval Performance
Stewart, Michael; Laraway, Sean; Jordan, Kevin; Feary, Michael S.
2016-01-01
The effectiveness of different instrument approach charts to deliver minimum visibility and altitude information during airport equipment outages was investigated. Eighteen pilots flew simulated instrument approaches in three conditions: (a) normal operations using a standard approach chart (standard-normal), (b) equipment outage conditions using a standard approach chart (standard-outage), and (c) equipment outage conditions using a prototype decluttered approach chart (prototype-outage). Errors and retrieval times in identifying minimum altitudes and visibilities were measured. The standard-outage condition produced significantly more errors and longer retrieval times versus the standard-normal condition. The prototype-outage condition had significantly fewer errors and shorter retrieval times than did the standard-outage condition. The prototype-outage condition produced significantly fewer errors but similar retrieval times when compared with the standard-normal condition. Thus, changing the presentation of minima may reduce risk and increase safety in instrument approaches, specifically with airport equipment outages. PMID:28491009
Nooh, Ahmed Mohamed; Abdeldayem, Hussein Mohammed; Ben-Affan, Othman
2017-05-01
The objective of this study was to assess the effectiveness and safety of the reverse breech extraction approach in Caesarean section for obstructed labour, and to compare it with the standard approach of pushing the fetal head up through the vagina. This randomised controlled trial included 192 women. In 96, the baby was delivered by the 'reverse breech extraction approach', and in the remaining 96, by the 'standard approach'. Extension of the uterine incision occurred in 18 participants (18.8%) in the reverse breech extraction group and 46 (47.9%) in the standard approach group (p = .0003). Two women (2.1%) in the reverse breech extraction group needed blood transfusion, versus 11 (11.5%) in the standard approach group (p = .012). Pyrexia developed in 3 participants (3.1%) in the reverse breech extraction group and 19 (19.8%) in the standard approach group (p = .0006). Wound infection occurred in 2 women (2.1%) in the reverse breech extraction group and 12 (12.5%) in the standard approach group (p = .007). An Apgar score <7 at 5 minutes was noted in 8 babies (8.3%) in the reverse breech extraction group and 21 (21.9%) in the standard approach group (p = .015). In conclusion, reverse breech extraction in Caesarean section for obstructed labour is an effective and safe alternative to the standard approach of pushing the fetal head up through the vagina.
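For readers who want to sanity-check the reported contrasts, here is a minimal sketch using the abstract's own counts for extension of the uterine incision (18/96 vs 46/96); the paper's exact statistical test is not stated, so a Fisher exact test is assumed for illustration:

```python
from scipy.stats import fisher_exact

# Extension of the uterine incision: 18/96 (reverse breech extraction)
# vs 46/96 (standard approach), as reported in the abstract.
table = [[18, 96 - 18],
         [46, 96 - 46]]
odds_ratio, p = fisher_exact(table)
print(f"odds ratio ~ {odds_ratio:.2f}, two-sided p ~ {p:.2g}")
```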
On the recovery of electric currents in the liquid core of the Earth
NASA Astrophysics Data System (ADS)
Kuslits, Lukács; Prácser, Ernő; Lemperger, István
2017-04-01
Inverse geodynamo modelling has become a standard method to get a more accurate image of the processes within the outer core. In this poster, excerpts from the preliminary results of another approach are presented, which centres on the possibility of recovering the currents within the liquid core directly, using Main Magnetic Field data. The flow of charge can be approximated by systems of various geometries. Based on previous geodynamo simulations, current coils can furnish a good initial geometry for such an estimation. The presentation introduces our preliminary test results and a study of the reliability of the applied inversion algorithm for different numbers of coils, distributed in a grid symbolising the domain between the inner-core and core-mantle boundaries. We shall also present inverted current structures derived from Main Field model data.
NASA Astrophysics Data System (ADS)
Chaudhari, Rajan; Heim, Andrew J.; Li, Zhijun
2015-05-01
As evidenced by the three rounds of the G-protein coupled receptor (GPCR) Dock competitions, improving homology modeling methods for helical transmembrane proteins, including the GPCRs, based on templates of low sequence identity remains an eminent challenge. Current approaches addressing this challenge adopt the philosophy of "modeling first, refinement next". In the present work, we developed an alternative modeling approach through the novel application of available multiple templates. First, conserved inter-residue interactions are derived from each additional template through conservation analysis of each template-target pairwise alignment. Then, these interactions are converted into distance restraints and incorporated in the homology modeling process. This approach was applied to modeling of the human β2 adrenergic receptor using bovine rhodopsin and the human protease-activated receptor 1 as templates, and improved model quality was demonstrated compared to homology models generated by standard single-template and multiple-template methods. This "refined restraints first, modeling next" method provides a fast and complementary alternative to current modeling approaches. It allows rational identification and implementation of additional conserved distance restraints extracted from multiple templates and/or experimental data, and has the potential to be applicable to modeling of all helical transmembrane proteins.
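A minimal sketch of the "refined restraints first" idea under invented data structures: contacts observed in every template-target alignment are converted into target-numbered distance bounds. All names and numbers are hypothetical; a real implementation would feed the resulting bounds to a modeling package's restraint machinery:

```python
# Toy sketch: derive distance restraints from inter-residue contacts that are
# conserved across template-target alignments. Data structures are invented
# for illustration; real work would use a modeling package's restraint API.

def conserved_restraints(template_contacts, alignments, tolerance=1.5):
    """template_contacts: {template: {(res_i, res_j): distance_angstrom}}
    alignments: {template: {template_res: target_res or None}}
    Returns target-numbered restraints kept only if mapped in every template."""
    restraints = {}
    for name, contacts in template_contacts.items():
        aln = alignments[name]
        for (i, j), d in contacts.items():
            ti, tj = aln.get(i), aln.get(j)
            if ti is None or tj is None:
                continue
            restraints.setdefault((ti, tj), []).append(d)
    # Keep pairs supported by all templates; restrain to mean +/- tolerance.
    n = len(template_contacts)
    return {pair: (sum(ds) / len(ds) - tolerance, sum(ds) / len(ds) + tolerance)
            for pair, ds in restraints.items() if len(ds) == n}

contacts = {"rhodopsin": {(110, 210): 8.2}, "PAR1": {(95, 187): 8.6}}
aln = {"rhodopsin": {110: 113, 210: 211}, "PAR1": {95: 113, 187: 211}}
print(conserved_restraints(contacts, aln))  # {(113, 211): (~6.9, ~9.9)}
```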
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farooq, M.O.
1988-01-01
The failure of the standard Growth Approach to economic development to solve the problems of underdevelopment in LDCs has caused an alternative approach, Basic Needs Approach (BNA), to attain prominence in development thought. BNA emphasizes poverty-minimizing growth. Its strategy of direct attack on poverty has better potential for LDCs' development and fulfillment of their populations' basic needs than the trickle-down mechanism of the Growth Approach. BNA requires, among other things, (a) suitable rural financial markets (RFMs) as parts of the overall financial system, and (b) indigenous technological capabilities. The financial system, if it functions as a central element in an institutionalized technology policy, can link technology-related institutions that generate, evaluate, and promote appropriate technologies (ATs) with RFMs that can support adoption and diffusion of ATs in the agro-rural sector. The above argument uses Bangladesh as a case for illustration. In the light of an institutional framework presented, examined, and extended in this dissertation, it is found that Bangladesh currently does not have an institutionalized technology policy. The current organizational framework and policies related to technological development are not conducive to BNA.
Pfoertner, Timo-Kolja; Andress, Hans-Juergen; Janssen, Christian
2011-08-01
The current study introduces the living standard concept as an alternative approach to measuring poverty and compares its explanatory power with that of an income-based poverty measure with regard to the subjective health status of the German population. Analyses are based on the German Socio-Economic Panel (2001, 2003 and 2005) and refer to binary logistic regressions of poor subjective health status with regard to each poverty condition, its duration, and its causal influence from a previous time point. To calculate the discriminant power of both poverty indicators, the indicators were first considered separately in regression models and subsequently both were included simultaneously. The analyses reveal a stronger poverty-health relationship for the living standard indicator. An inadequate living standard in 2005, longer spells of an inadequate living standard between 2001, 2003 and 2005, and an inadequate living standard at a previous time point are significantly more strongly associated with poor subjective health than income poverty. Our results challenge conventional measurements of the relationship between poverty and health, which has probably been underestimated by income measures so far.
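A hedged sketch of this modeling strategy on hypothetical data (the variable names and codings below are invented stand-ins for the SOEP measures), following the same separate-then-simultaneous sequence of binary logistic regressions:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis frame: binary indicators for poor subjective health,
# income poverty, and an inadequate living standard.
df = pd.DataFrame({
    "poor_health":    [1, 0, 1, 0, 1, 0, 0, 1, 1, 0],
    "income_poor":    [1, 0, 0, 1, 1, 0, 0, 0, 1, 0],
    "low_living_std": [1, 0, 1, 0, 0, 1, 0, 1, 1, 0],
})

# Each indicator separately first, then both simultaneously,
# mirroring the study's modeling sequence.
for formula in ("poor_health ~ income_poor",
                "poor_health ~ low_living_std",
                "poor_health ~ income_poor + low_living_std"):
    fit = smf.logit(formula, data=df).fit(disp=0)
    print(formula, "->", fit.params.to_dict())
```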
Gordon, Sarah; Daneshian, Mardas; Bouwstra, Joke; Caloni, Francesca; Constant, Samuel; Davies, Donna E; Dandekar, Gudrun; Guzman, Carlos A; Fabian, Eric; Haltner, Eleonore; Hartung, Thomas; Hasiwa, Nina; Hayden, Patrick; Kandarova, Helena; Khare, Sangeeta; Krug, Harald F; Kneuer, Carsten; Leist, Marcel; Lian, Guoping; Marx, Uwe; Metzger, Marco; Ott, Katharina; Prieto, Pilar; Roberts, Michael S; Roggen, Erwin L; Tralau, Tewes; van den Braak, Claudia; Walles, Heike; Lehr, Claus-Michael
2015-01-01
Models of the outer epithelia of the human body - namely the skin, the intestine and the lung - have found valid applications in both research and industrial settings as attractive alternatives to animal testing. A variety of approaches to model these barriers are currently employed in such fields, ranging from the utilization of ex vivo tissue to reconstructed in vitro models, and further to chip-based technologies, synthetic membrane systems and, of increasing current interest, in silico modeling approaches. An international group of experts in the field of epithelial barriers was convened from academia, industry and regulatory bodies to present both the current state of the art of non-animal models of the skin, intestinal and pulmonary barriers in their various fields of application, and to discuss research-based, industry-driven and regulatory-relevant future directions for both the development of new models and the refinement of existing test methods. Issues of model relevance and preference, validation and standardization, acceptance, and the need for simplicity versus complexity were focal themes of the discussions. The outcomes of workshop presentations and discussions, in relation to both current status and future directions in the utilization and development of epithelial barrier models, are presented by the attending experts in the current report.
Broschard, Thomas H; Glowienke, Susanne; Bruen, Uma S; Nagao, Lee M; Teasdale, Andrew; Stults, Cheryl L M; Li, Kim L; Iciek, Laurie A; Erexson, Greg; Martin, Elizabeth A; Ball, Douglas J
2016-11-01
Leachables from pharmaceutical container closure systems can present potential safety risks to patients. Extractables studies may be performed as a risk mitigation activity to identify potential leachables for dosage forms with a high degree of concern associated with the route of administration. To address safety concerns, approaches to the toxicological safety evaluation of extractables and leachables have been developed and applied by pharmaceutical and biologics manufacturers. Details of these approaches may differ depending on the nature of the final drug product, including its application, formulation, route of administration, and length of use. Current regulatory guidelines and industry standards provide general guidance on compound-specific safety assessments but do not provide a comprehensive approach to safety evaluations of leachables and/or extractables. This paper provides a perspective on approaches to safety evaluations by reviewing and applying general concepts and integrating key steps in the toxicological evaluation of individual extractables or leachables. These include application of structure-activity relationship studies, development of permitted daily exposure (PDE) values, and use of safety threshold concepts. Case studies are provided. The concepts presented seek to encourage discussion in the scientific community, and are not intended to represent a final opinion or "guidelines." Copyright © 2016 Elsevier Inc. All rights reserved.
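As one concrete ingredient, a PDE value is conventionally derived by dividing a no-observed-adverse-effect level by a product of modifying factors, in the spirit of ICH guidance. A minimal sketch, with all numeric values below being invented placeholders rather than recommended factors:

```python
# Hedged sketch of a permitted-daily-exposure style calculation:
# PDE = NOAEL x body weight / product of modifying factors.
# All numbers below are hypothetical placeholders, not recommended values.

def pde_mg_per_day(noael_mg_per_kg_day: float,
                   body_weight_kg: float = 50.0,
                   f1: float = 5.0,   # interspecies extrapolation
                   f2: float = 10.0,  # inter-individual variability
                   f3: float = 1.0,   # study duration
                   f4: float = 1.0,   # severity of toxicity
                   f5: float = 1.0):  # LOAEL-to-NOAEL or data gaps
    return noael_mg_per_kg_day * body_weight_kg / (f1 * f2 * f3 * f4 * f5)

# Hypothetical leachable with a rat NOAEL of 5 mg/kg/day:
print(f"PDE ~ {pde_mg_per_day(5.0):.2f} mg/day")  # 5 * 50 / 50 = 5.00 mg/day
```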
Introduction to ISO 15189: a blueprint for quality systems in veterinary laboratories.
Freeman, Kathleen P; Bauer, Natali; Jensen, Asger L; Thoresen, Stein
2006-06-01
A trend in human and veterinary medical laboratory management is to achieve accreditation based on international standards. The International Organization for Standardization (ISO) 15189 standard is the first developed especially for accreditation of medical laboratories, and emphasizes the laboratory-client interface. European veterinary laboratories seeking to train candidates for the certification examination of the European College of Veterinary Clinical Pathology (ECVCP) require approval by the ECVCP Laboratory Standards Committee, which bases its evaluation in part on adherence to quality systems described in the ISO 15189 standards. The purpose of this article was to introduce the latest ISO quality standard and describe its application to veterinary laboratories in Europe, specifically as pertains to accreditation of laboratories involved in training veterinary clinical pathologists. Between 2003 and 2006, the Laboratory Standards Committee reviewed 12 applications from laboratories (3 commercial and 9 university) involved in training veterinary clinical pathologists. Applicants were asked to provide a description of the facilities for training and testing, current methodology and technology, health and safety policy, quality assurance policy (including internal quality control and participation in an external quality assurance program), written standard operating procedures (SOPs) and policies, a description of the laboratory information system, and personnel and training. Also during this time period multiple informal and formal discussions among ECVCP diplomates took place as to current practices and perceived areas of concern with regard to laboratory accreditation requirements. Areas in which improvement most often was needed in veterinary laboratories applying for ECVCP accreditation were the written quality plan, defined quality requirements for the tests performed, written SOPs and policies, training records, ongoing audits and competency assessments, and processes for identifying and addressing opportunities for improvement. Recommendations were developed for a stepwise approach towards achieving ISO 15189 standards, including 3 levels of quality components. The ISO 15189 standard provides a sound framework for veterinary laboratories aspiring to meet international quality standards.
Mertz, Marcel; Strech, Daniel
2014-12-04
Clinical practice guidelines (CPGs), a core tool to foster medical professionalism, differ widely in whether and how they address disease-specific ethical issues (DSEIs), and current manuals for CPG development are silent on this issue. The implementation of an explicit method faces two core challenges: first, it adds further complexity to CPG development and requires human and financial resources. Second, in contrast to the in-depth treatment of ethical issues that is standard in bioethics, the inclusion of DSEIs in CPGs needs to be more pragmatic, reductive, and simplistic, but without rendering the resulting recommendations useless or insufficiently justified. This paper outlines a six-step approach, EthicsGuide, for the systematic and transparent inclusion of ethical issues and recommendations in CPGs. The development of EthicsGuide is based on (a) methodological standards in evidence-based CPG development, (b) principles of bioethics, (c) research findings on how DSEIs are currently addressed in CPGs, and (d) findings from two proof-of-concept analyses of the EthicsGuide approach. The six steps are 1) determine the DSEI spectrum and the need for ethical recommendations; 2) develop statements on which to base ethical recommendations; 3) categorize, classify, condense, and paraphrase the statements; 4) write recommendations in a standard form; 5) validate and justify recommendations, making any necessary modifications; and 6) address consent. All six steps necessarily come into play when including DSEIs in CPGs. If DSEIs are not explicitly addressed, they are unavoidably dealt with implicitly. We believe that as ethicists gain greater involvement in decision-making about health, personal rights, or economic issues, they should make their methods transparent and replicable by other researchers; and as ethical issues become more widely reflected in CPGs, CPG developers have to learn how to address them in a methodologically adequate way. The approach proposed should serve as a basis for further discussion on how to reach these goals. It breaks open the black box of what ethicists implicitly do when they develop recommendations. Further interdisciplinary discussion and pilot tests are needed to explore the minimal requirements that guarantee a simplified procedure which is still acceptable and does not become mere window dressing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spickett, Jeffery; Katscherian, Dianne
The approaches used for setting or reviewing air quality standards vary from country to country. The purpose of this research was to consider the potential to improve decision-making through integration of HIA into the processes to review and set air quality standards used in Australia. To assess the value of HIA in this policy process, its strengths and weaknesses were evaluated aligned with review of international processes for setting air quality standards. Air quality standard setting programmes elsewhere have either used HIA or have amalgamated and incorporated factors normally found within HIA frameworks. They clearly demonstrate the value of a formalised HIA process for setting air quality standards in Australia. The following elements should be taken into consideration when using HIA in standard setting. (a) The adequacy of a mainly technical approach in current standard setting procedures to consider social determinants of health. (b) The importance of risk assessment criteria and information within the HIA process. The assessment of risk should consider equity, the distribution of variations in air quality in different locations and the potential impacts on health. (c) The uncertainties in extrapolating evidence from one population to another or to subpopulations, especially the more vulnerable, due to differing environmental factors and population variables. (d) The significance of communication with all potential stakeholders on issues associated with the management of air quality. In Australia there is also an opportunity for HIA to be used in conjunction with the NEPM to develop local air quality standard measures. The outcomes of this research indicated that the use of HIA for air quality standard setting at the national and local levels would prove advantageous. -- Highlights: • Health Impact Assessment framework has been applied to a policy development process. • HIA process was evaluated for application in air quality standard setting. • Advantages of HIA in the air quality standard setting process are demonstrated.
Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.
Peeler, E J; Reese, R A; Thrush, M A
2015-10-01
The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. © 2013 Crown copyright. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.
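A toy sketch of a quantitative scenario tree that does account for trade volume, as the text argues IRA should; every probability distribution and the trade volume below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quantitative scenario tree for one commodity pathway.
n_consignments = 10_000                 # assumed annual trade volume
p_entry = rng.beta(2, 200, 100_000)     # uncertain P(hazard present at entry)
p_exposure = rng.beta(3, 300, 100_000)  # uncertain P(exposure | entry)

# Expected introductions per year: volume x P(entry) x P(exposure | entry).
introductions = n_consignments * p_entry * p_exposure
print(f"median ~ {np.median(introductions):.2f} introductions/year, "
      f"95th pct ~ {np.percentile(introductions, 95):.2f}")
```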
Barzman, Drew H; Ni, Yizhao; Griffey, Marcus; Patel, Bianca; Warren, Ashaki; Latessa, Edward; Sorter, Michael
2017-09-01
School violence has increased over the past decade, and innovative, sensitive, and standardized approaches to assess school violence risk are needed. In this feasibility study, we piloted a standardized, sensitive, and rapid school violence risk approach based on manual annotation. Manual annotation is the process of analyzing a student's transcribed interview to extract information (e.g., key words) relevant to school violence risk levels that are associated with students' behaviors, attitudes, feelings, use of technology (social media and video games), and other activities. We first implemented school violence risk assessments to evaluate risk levels by interviewing the student and parent separately at the school or the hospital to complete our novel school safety scales. We completed 25 risk assessments, resulting in 25 transcribed interviews of 12-18 year olds from 15 schools in Ohio and Kentucky. We then analyzed structured professional judgments, language, and patterns associated with school violence risk levels by using manual annotation and statistical methodology. To analyze the student interviews, we initiated the development of an annotation guideline to extract key information associated with students' behaviors, attitudes, feelings, use of technology, and other activities. Statistical analysis was applied to associate the significant categories with students' risk levels to identify key factors, which will help with developing action steps to reduce risk. In a future study, we plan to recruit more subjects in order to fully develop the manual annotation, which will result in a more standardized and sensitive approach to school violence assessments.
Current options in inguinal hernia repair in adult patients
Kulacoglu, H
2011-01-01
Inguinal hernia is a very common problem. Surgical repair is the current approach, whereas asymptomatic or minimally symptomatic hernias may be good candidates for watchful waiting. Prophylactic antibiotics can be used in centers with a high rate of wound infection. Local anesthesia is a suitable and economic option for open repairs, and should be popularized in the day-case setting. Numerous repair methods have been described to date. Mesh repairs are superior to "nonmesh" tissue-suture repairs. Lichtenstein repair and endoscopic/laparoscopic techniques have similar efficacy. Standard polypropylene mesh is still the choice, whereas use of partially absorbable lightweight meshes seems to have some advantages. PMID:22435019
NASA Technical Reports Server (NTRS)
O'Brien, T. Kevin; Johnston, William M.; Toland, Gregory J.
2010-01-01
Mode II interlaminar fracture toughness and delamination onset and growth characterization data were generated for IM7/8552 graphite epoxy composite materials from two suppliers for use in fracture mechanics analyses. Both the fracture toughness testing and the fatigue testing were conducted using the End-notched Flexure (ENF) test. The ENF test for mode II fracture toughness is currently under review by ASTM as a potential standard test method. This current draft ASTM protocol was used as a guide to conduct the tests on the IM7/8552 material. This report summarizes the test approach, methods, procedures and results of this characterization effort.
Quinn, Bridget A; Lee, Nathaniel A; Kegelman, Timothy P; Bhoopathi, Praveen; Emdad, Luni; Das, Swadesh K; Pellecchia, Maurizio; Sarkar, Devanand; Fisher, Paul B
2015-01-01
With therapies that date back to the 1950s, and few newly approved treatments in the last 20 years, pancreatic cancer remains a significant challenge for the development of novel therapeutics. Current regimens have successfully extended patient survival, although they still lead to prognoses measured in months rather than years. The genetic diversity inherent in pancreatic tumors forms the roadblocks that must be overcome in future therapeutics. Recent insight into the genetic patterns found in tumor cells may provide clues leading to better understanding of the challenges hindering the development of treatments. Here, we review currently used drugs and established combination therapies that comprise the standard of care for a highly recalcitrant disease. Novel approaches can improve upon current therapies in a variety of ways. Enhancing specificity, such that growth inhibition and cytotoxic effects act preferentially on tumor cells, is one approach to advance treatments. This can be accomplished through the targeting of extracellular markers specific to cancer cells. Additionally, enlisting natural defenses and overcoming tumor-driven immune suppression could prove to be a useful tactic. Recent studies utilizing these approaches have yielded promising results and could contribute to an ongoing effort battling a particularly difficult cancer. © 2015 Elsevier Inc. All rights reserved.
Nursing constraint models for electronic health records: a vision for domain knowledge governance.
Hovenga, Evelyn; Garde, Sebastian; Heard, Sam
2005-12-01
Various forms of electronic health records (EHRs) are currently being introduced in several countries. Nurses are primary stakeholders and need to ensure that their information and knowledge needs are being met by such systems, including information sharing between health care providers to enable them to improve the quality and efficiency of health care service delivery for all subjects of care. The latest international EHR standards have adopted the openEHR approach of two-level modelling. The first level is a stable information model determining structure, while the second level consists of constraint models or 'archetypes' that reflect the specifications or clinician rules for how clinical information needs to be represented to enable unambiguous data sharing. The current state of play in terms of international health informatics standards development activities is providing the nursing profession with a unique opportunity and challenge. Much work has been undertaken internationally in the area of nursing terminologies and evidence-based practice. This paper argues that to make the most of these emerging technologies and EHRs we must now concentrate on developing a process to identify, document, implement, manage and govern our nursing domain knowledge, as well as contribute to the development of relevant international standards. It is argued that one comprehensive nursing terminology, such as the ICNP or SNOMED CT, is simply too complex and too difficult to maintain. As the openEHR archetype approach does not rely heavily on big standardised terminologies, it offers more flexibility during standardisation of clinical concepts and it ensures open, future-proof electronic health records. We conclude that it is highly desirable for the nursing profession to adopt this openEHR approach as a means of documenting and governing the nursing profession's domain knowledge. It is essential for the nursing profession to develop its domain knowledge constraint models (archetypes) collaboratively in an international context.
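A schematic illustration of two-level modelling in plain Python (emphatically not openEHR Archetype Definition Language): the level-1 structure stays generic and stable, while a separate level-2 constraint model encodes the clinician rules. All names below are hypothetical:

```python
# Toy two-level-modelling sketch: a generic information model (level 1) plus a
# separate constraint model, or "archetype" (level 2), that restricts it.

information_model_entry = {         # level 1: generic, stable structure
    "name": "pressure_ulcer_stage",  # a hypothetical nursing concept
    "value": "stage II",
}

archetype = {                        # level 2: clinician-authored constraints
    "name": "pressure_ulcer_stage",
    "allowed_values": {"stage I", "stage II", "stage III", "stage IV"},
}

def validate(entry: dict, constraint: dict) -> bool:
    """Data sharing stays unambiguous: an entry is accepted only if it
    conforms to the published constraint model."""
    return (entry["name"] == constraint["name"]
            and entry["value"] in constraint["allowed_values"])

print(validate(information_model_entry, archetype))  # True
```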
Economic Evaluations of Strategies to Prevent Hospital-Acquired Pressure Injuries.
Ocampo, Wrechelle; Cheung, Amanda; Baylis, Barry; Clayden, Nancy; Conly, John M; Ghali, William A; Ho, Chester H; Kaufman, Jaime; Stelfox, Henry T; Hogan, David B
2017-07-01
To provide information from a review of the literature about economic evaluations of preventive strategies for pressure injuries (PIs). This continuing education activity is intended for physicians, physician assistants, nurse practitioners, and nurses with an interest in skin and wound care. After participating in this educational activity, the participant should be better able to: 1. Identify the purpose and methods used for this study. 2. Compare costs and effectiveness related to preventive strategies for PIs. BACKGROUND: Pressure injuries (PIs) are a common and resource-intensive challenge for acute care hospitals worldwide. While a number of preventive strategies have the potential to reduce the cost of hospital-acquired PIs, it is unclear what approach is the most effective. The authors performed a narrative review of the literature on economic evaluations of preventive strategies to survey current findings and identify important factors in economic assessments. DATA SOURCES: Ovid, MEDLINE, NHS Economic Evaluation Databases, and the Cochrane Database of Systematic Reviews. SELECTION CRITERIA: Potentially relevant original research articles and systematic reviews were considered. Selection criteria included articles that were written in English, provided data on cost or economic evaluations of preventive strategies for PIs in acute care, and were published between January 2004 and September 2015. Data were abstracted from the articles using a standardized approach to evaluate how the items on the Consolidated Health Economic Evaluation Reporting Standards checklist were addressed. The searches identified 192 references. Thirty-three original articles were chosen for full-text reviews. Nineteen of these articles provided clear descriptions of interventions, study methods, and outcomes considered. Limitations in the available literature prevent firm conclusions from being reached about the relative economic merits of the various approaches to the prevention of PIs. The authors' review revealed a need for additional high-quality studies that adhere to commonly used standards of both currently utilized and emerging ways to prevent hospital-acquired PIs.
Frequency-domain full-waveform inversion with non-linear descent directions
NASA Astrophysics Data System (ADS)
Geng, Yu; Pan, Wenyong; Innanen, Kristopher A.
2018-05-01
Full-waveform inversion (FWI) is a highly non-linear inverse problem, normally solved iteratively, with each iteration involving an update constructed through linear operations on the residuals. Incorporating a flexible degree of non-linearity within each update may have important consequences for convergence rates, determination of low model wavenumbers and discrimination of parameters. We examine one approach for doing so, wherein higher order scattering terms are included within the sensitivity kernel during the construction of the descent direction, adjusting it away from that of the standard Gauss-Newton approach. These scattering terms are naturally admitted when we construct the sensitivity kernel by varying not the current but the to-be-updated model at each iteration. Linear and/or non-linear inverse scattering methodologies allow these additional sensitivity contributions to be computed from the current data residuals within any given update. We show that in the presence of pre-critical reflection data, the error in a second-order non-linear update to a background of s0 is, in our scheme, proportional to at most (Δs/s0)^3 in the actual parameter jump Δs causing the reflection. In contrast, the error in a standard Gauss-Newton FWI update is proportional to (Δs/s0)^2. For numerical implementation of more complex cases, we introduce a non-linear frequency-domain scheme, with an inner and an outer loop. A perturbation is determined from the data residuals within the inner loop, and a descent direction based on the resulting non-linear sensitivity kernel is computed in the outer loop. We examine the response of this non-linear FWI using acoustic single-parameter synthetics derived from the Marmousi model. The inverted results vary depending on data frequency ranges and initial models, but we conclude that the non-linear FWI has the capability to generate high-resolution model estimates in both shallow and deep regions, and to converge rapidly, relative to a benchmark FWI approach involving the standard gradient.
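The quoted error scalings can be reproduced in a toy scalar analogue (emphatically not the authors' waveform scheme): invert d = Δs/(2·s0 + Δs) for the jump Δs with a first-order and a second-order update, then fit the log-log slope of the error; all numbers are invented:

```python
import numpy as np

s0 = 2.0
jumps = s0 * np.array([0.01, 0.02, 0.04, 0.08])   # parameter jumps ds
data = jumps / (2 * s0 + jumps)                   # exact toy "reflection" data

# First-order (Gauss-Newton-like) update: d ~ ds/(2 s0)  ->  ds1 = 2 s0 d
ds_first = 2 * s0 * data
# Second-order update: solve d = x/(2 s0) - x^2/(4 s0^2) for the smaller root
ds_second = s0 * (1 - np.sqrt(1 - 4 * data))

for name, est in (("1st-order", ds_first), ("2nd-order", ds_second)):
    err = np.abs(est - jumps)
    slope = np.polyfit(np.log(jumps / s0), np.log(err), 1)[0]
    print(f"{name}: error ~ (ds/s0)^{slope:.2f}")  # ~2 and ~3 respectively
```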
McLaren, Stuart J.; Page, Wyatt H.; Parker, Lou; Rushton, Martin
2013-01-01
An evaluation of 28 commercially available toys imported into New Zealand revealed that 21% of these toys do not meet the acoustic criteria in the ISO standard, ISO 8124-1:2009 Safety of Toys, adopted by Australia and New Zealand as AS/NZS ISO 8124.1:2010. While overall the 2010 standard provided a greater level of protection than the earlier 2002 standard, there was one high risk toy category where the 2002 standard provided greater protection. A secondary set of toys from the personal collections of children known to display atypical methods of play with toys, such as those with autism spectrum disorders (ASD), was part of the evaluation. Only one of these toys cleanly passed the 2010 standard, with the remainder failing or showing a marginal-pass. As there is no tolerance level stated in the standards to account for interpretation of data and experimental error, a value of +2 dB was used. The findings of the study indicate that the current standard is inadequate in providing protection against excessive noise exposure. Amendments to the criteria have been recommended that apply to the recently adopted 2013 standard. These include the integration of the new approaches published in the recently amended European standard (EN 71) on safety of toys. PMID:24452254
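The pass/marginal-pass/fail classification with the +2 dB allowance reduces to a simple comparison. A minimal sketch; the 80 dB limit below is illustrative only, since the actual ISO 8124 criteria vary by toy category and measurement condition:

```python
def classify_toy(measured_db: float, limit_db: float,
                 tolerance_db: float = 2.0) -> str:
    """Pass/fail call against an acoustic limit with the +2 dB allowance the
    study applied for interpretation of data and experimental error."""
    if measured_db <= limit_db:
        return "pass"
    if measured_db <= limit_db + tolerance_db:
        return "marginal-pass"
    return "fail"

# Hypothetical close-to-the-ear toy against an illustrative 80 dB limit:
for level in (78.0, 81.0, 84.5):
    print(level, "->", classify_toy(level, 80.0))
```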
Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A
2017-11-01
Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
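A minimal sketch of one of the compared remedies, bootstrapping the entire two-stage procedure, on simulated data (all parameters are invented; the Newey and Terza analytic corrections are not implemented here):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2_000

# Simulated Mendelian-randomization-style data (all parameters invented):
g = rng.binomial(2, 0.3, n)                  # genotype instrument
u = rng.normal(size=n)                       # unobserved confounder
x = 0.5 * g + u + rng.normal(size=n)         # exposure
y = (rng.random(n) < 1 / (1 + np.exp(-(0.4 * x + u)))).astype(float)

def tsri_logistic(g, x, y):
    """Two-stage residual inclusion: stage-1 OLS residuals enter stage 2."""
    stage1 = sm.OLS(x, sm.add_constant(g)).fit()
    X2 = sm.add_constant(np.column_stack([x, stage1.resid]))
    return sm.Logit(y, X2).fit(disp=0).params[1]  # log-odds estimate for x

est = tsri_logistic(g, x, y)
# Bootstrap the *whole* two-stage procedure, rather than reporting
# unadjusted or heteroscedasticity-robust standard errors:
boot = [tsri_logistic(g[idx], x[idx], y[idx])
        for idx in (rng.integers(0, n, n) for _ in range(200))]
print(f"estimate {est:.3f}, bootstrap SE {np.std(boot, ddof=1):.3f}")
```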
Reflection: moving from a mandatory ritual to meaningful professional development.
Murdoch-Eaton, Deborah; Sandars, John
2014-03-01
Reflection has become established as a key principle underpinning maintenance of standards within professional education and practice. A requirement to evidence reflection within performance review is intended to develop a transformative approach to practice, identify developmental goals, and ultimately, improve healthcare. However, some applications have taken an excessively instrumental approach to the evidencing of reflection, and while they have provided useful templates or framing devices for recording individualistic reflective practice, they potentially have distorted the original intentions. This article revisits the educational theory underpinning the importance of reflection for enhancing performance and considers how to enhance its value within current paediatric practice.
A DYNAMIC DENSITY FUNCTIONAL THEORY APPROACH TO DIFFUSION IN WHITE DWARFS AND NEUTRON STAR ENVELOPES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diaw, A.; Murillo, M. S.
2016-09-20
We develop a multicomponent hydrodynamic model based on moments of the Born–Bogolyubov–Green–Kirkwood–Yvon hierarchy equations for physical conditions relevant to astrophysical plasmas. These equations incorporate strong correlations through a density functional theory closure, while transport enters through a relaxation approximation. This approach enables the introduction of Coulomb coupling correction terms into the standard Burgers equations. The diffusive currents for these strongly coupled plasmas are self-consistently derived. The settling of impurities and its impact on cooling can be greatly affected by strong Coulomb coupling, which we show can be quantified using the direct correlation function.
Approaches to advance scientific understanding of macrosystems ecology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levy, Ofir; Ball, Becky; Bond-Lamberty, Benjamin
Macrosystem ecological studies inherently investigate processes that interact across multiple spatial and temporal scales, requiring intensive sampling and massive amounts of data from diverse sources to incorporate complex cross-scale and hierarchical interactions. Inherent challenges associated with these characteristics include high computational demands, data standardization and assimilation, identification of important processes and scales without prior knowledge, and the need for large, cross-disciplinary research teams that conduct long-term studies. Therefore, macrosystem ecology studies must utilize a unique set of approaches that are capable of encompassing these methodological characteristics and associated challenges. Several case studies demonstrate innovative methods used in current macrosystem ecology studies.
Characterizing multi-pollutant air pollution in China: Comparison of three air quality indices.
Hu, Jianlin; Ying, Qi; Wang, Yungang; Zhang, Hongliang
2015-11-01
Multi-pollutant air pollution (i.e., several pollutants reaching very high concentrations simultaneously) frequently occurs in many regions across China. Air quality index (AQI) is used worldwide to inform the public about levels of air pollution and associated health risks. The current AQI approach used in China is based on the maximum value of individual pollutants, and does not consider the combined health effects of exposure to multiple pollutants. In this study, two novel alternative indices--aggregate air quality index (AAQI) and health-risk based air quality index (HAQI)--were calculated based on data collected in six megacities of China (Beijing, Shanghai, Guangzhou, Shijiazhuang, Xi'an, and Wuhan) during 2013 to 2014. Both AAQI and HAQI take into account the combined health effects of various pollutants, and the HAQI considers the exposure (or concentration)-response relationships of pollutants. AAQI and HAQI were compared to AQI to examine the effectiveness of the current AQI in characterizing multi-pollutant air pollution in China. The AAQI and HAQI values are higher than the AQI on days when two or more pollutants simultaneously exceed the Chinese Ambient Air Quality Standards (CAAQS) 24-hour Grade II standards. The results of the comparison of the classification of risk categories based on the three indices indicate that the current AQI approach underestimates the severity of health risk associated with exposure to multi-pollutant air pollution. For the AQI-based risk category of 'unhealthy', 96% and 80% of the days would be 'very unhealthy' or 'hazardous' if based on AAQI and HAQI, respectively; and for the AQI-based risk category of 'very unhealthy', 67% and 75% of the days would be 'hazardous' if based on AAQI and HAQI, respectively. The results suggest that the general public, especially sensitive population groups such as children and the elderly, should take more stringent actions than those currently suggested based on the AQI approach during high air pollution events. Sensitivity studies were conducted to examine the assumptions used in the AAQI and HAQI approaches. Results show that AAQI is sensitive to the choice of the pollutant-irrelevant constant. HAQI is sensitive to the choice of both the threshold values and the pollutants included in the total risk calculation. Copyright © 2015 Elsevier Ltd. All rights reserved.
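The max-of-sub-indices behaviour criticized here is easy to see in code. A sketch using a piecewise-linear sub-index for 24-h PM2.5 (breakpoints quoted from memory from China's HJ 633-2012 scheme, so treat them as approximate; the other pollutants' sub-indices are assumed values):

```python
import numpy as np

# Illustrative individual AQI (IAQI) breakpoints for 24-h PM2.5 in China;
# quoted from memory, so treat as approximate.
PM25_BP = [0, 35, 75, 115, 150, 250, 350, 500]   # ug/m3
IAQI_BP = [0, 50, 100, 150, 200, 300, 400, 500]

def iaqi_pm25(conc: float) -> float:
    """Piecewise-linear sub-index for a 24-h PM2.5 concentration."""
    return float(np.interp(conc, PM25_BP, IAQI_BP))

# The reported AQI is the *maximum* sub-index over pollutants -- which is
# what lets simultaneously elevated pollutants go under-weighted:
sub_indices = {"PM2.5": iaqi_pm25(120.0), "O3": 130.0, "NO2": 95.0}  # O3/NO2 assumed
print(max(sub_indices.values()), max(sub_indices, key=sub_indices.get))
```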
Surgical quality assessment. A simplified approach.
DeLong, D L
1991-10-01
The current approach to QA primarily involves taking action when problems are discovered and designing a documentation system that records the delivery of quality care. Involving the entire staff helps eliminate problems before they occur. By keeping abreast of current problems and soliciting input from staff members, the QA at our hospital has improved dramatically. The cross-referencing of JCAHO and AORN standards on the assessment form and the single-sheet reporting form expedite the evaluation process and simplify record keeping. The bulletin board increases staff members' understanding of QA and boosts morale and participation. A sound and effective QA program does not require reorganizing an entire department, nor should it invoke negative connotations. Developing an effective QA program merely requires rethinking current processes. The program must meet the department's specific needs, and although many departments concentrate on documentation, auditing charts does not give a complete picture of the quality of care delivered. The QA committee must employ a variety of data collection methods on multiple indicators to ensure an accurate representation of the care delivered, and it must not overlook any issues that directly affect patient outcomes.
Recent developments in optical detection methods for microchip separations.
Götz, Sebastian; Karst, Uwe
2007-01-01
This paper summarizes the features and performances of optical detection systems currently applied in order to monitor separations on microchip devices. Fluorescence detection, which delivers very high sensitivity and selectivity, is still the most widely applied method of detection. Instruments utilizing laser-induced fluorescence (LIF) and lamp-based fluorescence along with recent applications of light-emitting diodes (LED) as excitation sources are also covered in this paper. Since chemiluminescence detection can be achieved using extremely simple devices which no longer require light sources and optical components for focusing and collimation, interesting approaches based on this technique are presented, too. Although UV/vis absorbance is a detection method that is commonly used in standard desktop electrophoresis and liquid chromatography instruments, it has not yet reached the same level of popularity for microchip applications. Current applications of UV/vis absorbance detection to microchip separations and innovative approaches that increase sensitivity are described. This article, which contains 85 references, focuses on developments and applications published within the last three years, points out exciting new approaches, and provides future perspectives on this field.
Robotic single port cholecystectomy: current data and future perspectives.
Angelou, Anastasios; Skarmoutsos, Athanasios; Margonis, Georgios A; Moris, Demetrios; Tsigris, Christos; Pikoulis, Emmanouil
2017-04-01
Minimally invasive techniques are used more and more frequently. Since the conventional laparoscopic approach has been the gold standard, surgeons, in their effort to further reduce the invasiveness of conventional laparoscopic cholecystectomy, have adopted the single-incision approach. The widespread adoption of robotics has led to the inevitable hybridization of robotic technology with laparoendoscopic single-site surgery (LESS). As a result, employment of the da Vinci surgical system may allow greater surgical maneuverability, improving ergonomics. A review of the English literature was conducted to evaluate all robotic single port cholecystectomies performed to date. Demographic data, operative parameters, postoperative outcomes and materials used for the operation were collected and assessed. A total of 12 studies, including 501 patients, were analyzed. Demographics and clinical characteristics of the patients were heterogeneous, but in most studies a mean BMI <30 was recorded. Intraoperative metrics such as operative time, estimated blood loss and conversion rate were comparable with those in multiport conventional laparoscopy. Robotic single port cholecystectomy is a safe and feasible alternative to the conventional multiport laparoscopic or manual robotic approach. However, current data do not suggest a superiority of robotic SILC over other established methods.
Current and cutting-edge interventions for the treatment of obese patients.
Vairavamurthy, Jenanan; Cheskin, Lawrence J; Kraitchman, Dara L; Arepally, Aravind; Weiss, Clifford R
2017-08-01
The number of people classified as obese, defined by the World Health Organization as having a body mass index ≥30, has been rising since the 1980s. Obesity is associated with comorbidities such as hypertension, diabetes mellitus, and nonalcoholic fatty liver disease. The current treatment paradigm emphasizes lifestyle modifications, including diet and exercise; however this approach produces only modest weight loss for many patients. When lifestyle modifications fail, the current "gold standard" therapy for obesity is bariatric surgery, including Roux-en-Y gastric bypass, sleeve gastrectomy, duodenal switch, and placement of an adjustable gastric band. Though effective, bariatric surgery can have severe short- and long-term complications. To fill the major gap in invasiveness between lifestyle modification and surgery, researchers have been developing pharmacotherapies and minimally invasive endoscopic techniques to treat obesity. Recently, interventional radiologists developed a percutaneous transarterial catheter-directed therapy targeting the hormonal function of the stomach. This review describes the current standard obesity treatments (including diet, exercise, and surgery), as well as newer endoscopic bariatric procedures and pharmacotherapies to help patients lose weight. We present data from two ongoing human trials of a new interventional radiology procedure for weight loss, bariatric embolization. Copyright © 2017 Elsevier B.V. All rights reserved.
Lindsey, Brock A; Markel, Justin E; Kleinerman, Eugenie S
2017-06-01
Osteosarcoma (OS) is the most common primary malignancy of bone and patients with metastatic disease or recurrences continue to have very poor outcomes. Unfortunately, little prognostic improvement has been generated from the last 20 years of research and a new perspective is warranted. OS is extremely heterogeneous in both its origins and manifestations. Although multiple associations have been made between the development of osteosarcoma and race, gender, age, various genomic alterations, and exposure situations among others, the etiology remains unclear and controversial. Noninvasive diagnostic methods include serum markers like alkaline phosphatase and a growing variety of imaging techniques including X-ray, computed tomography, magnetic resonance imaging, and positron emission as well as combinations thereof. Still, biopsy and microscopic examination are required to confirm the diagnosis and carry additional prognostic implications such as subtype classification and histological response to neoadjuvant chemotherapy. The current standard of care combines surgical and chemotherapeutic techniques, with a multitude of experimental biologics and small molecules currently in development and some in clinical trial phases. In this review, in addition to summarizing the current understanding of OS etiology, diagnostic methods, and the current standard of care, our group describes various experimental therapeutics and provides evidence to encourage a potential paradigm shift toward the introduction of immunomodulation, which may offer a more comprehensive approach to battling cancer pleomorphism.
What is missing between model and Aura MLS observations in mesospheric OH?
NASA Astrophysics Data System (ADS)
Wang, S.; Li, K. F.; Zeng, Z.; Sander, S. P.; Shia, R. L.; Yung, Y. L.
2017-12-01
Recent Aura Microwave Limb Sounder observations show higher mesospheric OH levels than earlier versions and previous satellite observations. The current photochemical model with standard chemistry is not able to accurately simulate MLS OH in the mesosphere. In particular, the model significantly underestimates OH over the altitude range of 60-80 km. In the standard middle atmospheric chemistry, HOx over this altitude range is controlled mainly through the reactions H2O + hv (< 205 nm) → H + OH; H + O2 + M → HO2 + M; and OH + HO2 → H2O + O2. In an attempt to resolve the model-observation discrepancy, we adjust the rate coefficients of these reactions within recommended uncertainty ranges using an objective Bayesian approach. However, reasonable perturbations to these reactions are not capable of resolving the mesospheric discrepancy without introducing disagreements in other regions of the atmosphere. We explore possible new reactions in the Earth's atmosphere that are not included in current standard models. Some candidate reactions and their potential impacts on mesospheric HOx chemistry will be discussed. Our results urge new laboratory studies of these candidate reactions, whose rate coefficients have never been measured for atmospheric conditions.
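A schematic of the Bayesian rate-coefficient adjustment idea, with a placeholder steady-state expression standing in for the full photochemical model; the observation, its error, and the prior width are all invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hedged sketch: scale the rate of the loss reaction OH + HO2 -> H2O + O2
# within a log-normal prior and score a toy steady-state prediction against
# an "observed" OH value. The model below is a placeholder, not the
# authors' photochemical model.
def oh_model(k3_scale, production=1.0, k3_ref=1.0):
    # Toy steady state: production balanced by OH + HO2 loss, [OH] ~ sqrt(P/k3)
    return np.sqrt(production / (k3_ref * k3_scale))

obs_oh, obs_sigma = 1.15, 0.05                 # invented observation and error
scales = rng.lognormal(0.0, np.log(1.3), 50_000)  # ~30% 1-sigma prior width
likelihood = np.exp(-0.5 * ((oh_model(scales) - obs_oh) / obs_sigma) ** 2)
posterior_mean = np.average(scales, weights=likelihood)
print(f"posterior k3 scale ~ {posterior_mean:.2f}")  # <1: slower loss raises OH
```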
Hofmann-Amtenbrink, Margarethe; Grainger, David W; Hofmann, Heinrich
2015-10-01
Although nanoparticle research has been ongoing for more than 30 years, the development of the methods and standard protocols required for safety and efficacy testing for human use is still in progress. The review covers questions on toxicity, safety, risk and legal issues over the lifecycle of inorganic nanoparticles for medical applications. The following topics were covered: (i) in vitro tests may give only a very first indication of possible toxicity, as current methods largely neglect interactions at the systemic level; (ii) the science-driven and the regulation-driven approaches do not really fit together for deciding whether or not a nanoparticle should be further developed and may receive a kind of "safety label"; (iii) cost and time of development are the limiting factors for the drug pipeline. Knowing which property of a nanoparticle makes it toxic, it may be feasible to re-engineer the particle for higher safety (safety by design). Testing the safety and efficacy of nanoparticles for human use is still in need of standardization. In this concise review, the authors describe and discuss the current unresolved issues surrounding the application of inorganic nanoparticles for medical applications. Copyright © 2015 Elsevier Inc. All rights reserved.
Toward a 35-year North American Precipitation and Surface Reanalysis
NASA Astrophysics Data System (ADS)
Gasset, N.; Fortin, V.
2017-12-01
In support of the International Watersheds Initiative (IWI) of the International Joint Commission (IJC), a 35-year precipitation and surface reanalysis covering North America at 3-hour and 15-km resolution is currently being developed at the Canadian Meteorological Centre (CMC). A deterministic reforecast / dynamical downscaling approach is followed, in which a global reanalysis (ERA-Interim) is used as the initial condition of the Global Environmental Multi-scale model (GEM). Moreover, the latter is coupled with precipitation and surface data assimilation systems, i.e. the Canadian Precipitation Analysis (CaPA) and the Canadian Land Data Assimilation System (CaLDAS). While optimized to be more computationally efficient in the context of a reforecast experiment, all systems used are closely related to model versions and configurations currently run operationally at CMC, meaning they have undergone a strict and thorough validation procedure. As a proof of concept, and in order to identify the optimal set-up before producing the 35-year reanalysis, several configurations of the approach are evaluated for the years 2010-2014 using both the standard CMC validation methodology and more dedicated scores, such as comparisons against currently available products (the North American Regional Reanalysis, MERRA-Land and the newly released ERA5 reanalysis). Special attention is dedicated to the evaluation of analysed variables, i.e. precipitation, snow depth, surface/ground temperature and moisture over the whole domain of interest. Results from these preliminary samples are very encouraging, and the optimal set-up is identified. The coupled approach, i.e. GEM+CaPA/CaLDAS, always shows clear improvements over classical reforecast and dynamical downscaling where surface observations are present. Furthermore, results are in line with or better than currently available products and the reference CMC operational approach that was operated from 2012 to 2016 (GEM 3.3, 10-km resolution). This reanalysis will allow for bias correction of current estimates and forecasts, and help decision makers understand and communicate by how much the current forecasted state of the system differs from the recent past.
Osis, Sean T; Hettinga, Blayne A; Ferber, Reed
2016-05-01
An ongoing challenge in the application of gait analysis to clinical settings is the standardized detection of temporal events, with unobtrusive and cost-effective equipment, for a wide range of gait types. The purpose of the current study was to investigate a targeted machine learning approach for the prediction of timing for foot strike (or initial contact) and toe-off, using only kinematics, for walking, forefoot running, and heel-toe running. Data were categorized by gait type and split into a training set (∼30%) and a validation set (∼70%). A principal component analysis was performed, and separate linear models were trained and validated for foot strike and toe-off, using ground reaction force data as a gold standard for event timing. Results indicate the model predicted both foot strike and toe-off timing to within 20 ms of the gold standard for more than 95% of cases in walking and running gaits. The machine learning approach provides robust timing predictions for clinical use, and may offer a flexible methodology to handle new events and gait types. Copyright © 2016 Elsevier B.V. All rights reserved.
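A minimal sketch of this kind of targeted event-detection pipeline, assuming per-cycle kinematic features and gold-standard event times have already been extracted (all data and names below are placeholders, not the study's):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 60))   # placeholder kinematic feature matrix
    y_fs = rng.normal(size=500)      # placeholder foot-strike time offsets (s)

    # ~30% training / ~70% validation, mirroring the split reported above
    X_tr, X_va, y_tr, y_va = train_test_split(X, y_fs, train_size=0.3,
                                              random_state=0)

    pca = PCA(n_components=10).fit(X_tr)   # reduce kinematics to components
    model = LinearRegression().fit(pca.transform(X_tr), y_tr)

    pred = model.predict(pca.transform(X_va))
    within_20ms = np.mean(np.abs(pred - y_va) < 0.020)
    print(f"events within 20 ms of gold standard: {within_20ms:.1%}")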
Automatic outdoor monitoring system for photovoltaic panels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stefancich, Marco; Simpson, Lin; Chiesa, Matteo
Long-term acquisition of solar panel performance parameters, for panels operated at maximum power point in their real environment, is of critical importance in the photovoltaic research sector. However, few options exist for the characterization of non-standard panels such as concentrated photovoltaic systems, heavily soiled or shaded panels or those operating under non-standard spectral illumination; certainly, it is difficult to find such a measurement system that is flexible and affordable enough to be adopted by the smaller research institutes or universities. We present here an instrument aiming to fill this gap, autonomously tracking and maintaining any solar panel at maximum power point while continuously monitoring its operational parameters and dissipating the produced energy without connection to the power grid. The instrument allows periodic acquisition of current-voltage curves to verify the employed maximum power point tracking approach. At the same time, with hardware schematics and software code being provided, it provides a flexible open development environment for the monitoring of non-standard generators like concentrator photovoltaic systems and to test novel power tracking approaches. The key issues, and the corresponding solutions, encountered in the design are analyzed in detail and the relevant schematics presented.
Automatic outdoor monitoring system for photovoltaic panels.
Stefancich, Marco; Simpson, Lin; Chiesa, Matteo
2016-05-01
Long-term acquisition of solar panel performance parameters, for panels operated at maximum power point in their real environment, is of critical importance in the photovoltaic research sector. However, few options exist for the characterization of non-standard panels such as concentrated photovoltaic systems, heavily soiled or shaded panels or those operating under non-standard spectral illumination; certainly, it is difficult to find such a measurement system that is flexible and affordable enough to be adopted by the smaller research institutes or universities. We present here an instrument aiming to fill this gap, autonomously tracking and maintaining any solar panel at maximum power point while continuously monitoring its operational parameters and dissipating the produced energy without connection to the power grid. The instrument allows periodic acquisition of current-voltage curves to verify the employed maximum power point tracking approach. At the same time, with hardware schematics and software code being provided, it provides a flexible open development environment for the monitoring of non-standard generators like concentrator photovoltaic systems and to test novel power tracking approaches. The key issues, and the corresponding solutions, encountered in the design are analyzed in detail and the relevant schematics presented.
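The abstract does not specify the tracking algorithm the instrument employs, so the following is only a generic sketch of a perturb-and-observe loop of the kind such trackers commonly use, with a toy current-voltage model standing in for a real panel:

    # Minimal perturb-and-observe MPPT loop (illustrative, not the paper's code).
    def panel_current(v):
        """Toy I-V model: near-constant current that collapses near V_oc."""
        return max(0.0, 5.0 * (1.0 - (v / 40.0) ** 12))

    def track_mpp(v=20.0, dv=0.2, steps=200):
        p_prev = v * panel_current(v)
        step = dv
        for _ in range(steps):
            v += step
            p = v * panel_current(v)
            if p < p_prev:       # power dropped: reverse perturbation direction
                step = -step
            p_prev = p
        return v, p_prev

    v_mpp, p_mpp = track_mpp()
    print(f"operating point ~{v_mpp:.1f} V, {p_mpp:.1f} W")  # ~32 V for this toy model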
The Space Communications Protocol Standards Program
NASA Technical Reports Server (NTRS)
Jeffries, Alan; Hooke, Adrian J.
1994-01-01
In the fall of 1992 NASA and the Department of Defense chartered a technical team to explore the possibility of developing a common set of space data communications standards for potential dual use across the U.S. national space mission support infrastructure. The team focused on the data communications needs of those activities associated with on-line control of civil and military spacecraft. A two-pronged approach was adopted: a top-down survey of representative civil and military space data communications requirements was conducted, and a bottom-up analysis of available standard data communications protocols was performed. A striking intersection of civil and military space mission requirements emerged, and an equally striking consensus on the approach towards joint civil and military space protocol development was reached. The team concluded that wide segments of the U.S. civil and military space communities have common needs for: (1) an efficient file transfer protocol; (2) various flavors of underlying data transport service; (3) an optional data protection mechanism to assure end-to-end security of message exchange; and (4) an efficient internetworking protocol. These recommendations led to initiating a program to develop a suite of protocols based on these findings. This paper describes the current status of this program.
Ertefaie, Ashkan; Shortreed, Susan; Chakraborty, Bibhas
2016-06-15
Q-learning is a regression-based approach that uses longitudinal data to construct dynamic treatment regimes, which are sequences of decision rules that use patient information to inform future treatment decisions. An optimal dynamic treatment regime is composed of a sequence of decision rules that indicate how to individualize treatment using the patients' baseline and time-varying characteristics so as to optimize the final outcome. Constructing optimal dynamic regimes using Q-learning depends heavily on the assumption that the regression models at each decision point are correctly specified; yet model checking in the context of Q-learning has been largely overlooked in the current literature. In this article, we show that residual plots obtained from standard Q-learning models may fail to adequately check the quality of the model fit. We present a modified Q-learning procedure that accommodates residual analyses using standard tools. We present simulation studies showing the advantage of the proposed modification over standard Q-learning. We illustrate this new Q-learning approach using data collected from a sequential multiple assignment randomized trial of patients with schizophrenia. Copyright © 2016 John Wiley & Sons, Ltd.
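A schematic two-stage Q-learning fit with linear models illustrates the setup (simulated data; this sketches the standard procedure, not the authors' modification). Note that the stage-1 pseudo-outcome is a maximum over stage-2 predictions, which is one reason standard residual plots can be hard to interpret:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    n = 300
    x1, a1 = rng.normal(size=n), rng.choice([-1, 1], n)  # baseline covariate, stage-1 treatment
    x2, a2 = rng.normal(size=n), rng.choice([-1, 1], n)  # time-varying covariate, stage-2 treatment
    y = x1 + 0.5 * a2 * x2 + rng.normal(size=n)          # final outcome

    # Stage 2: regress outcome on history; the optimal rule maximizes over a2
    H2 = np.column_stack([x1, a1, x2, a2, a2 * x2])
    q2 = LinearRegression().fit(H2, y)
    v2 = np.maximum(
        q2.predict(np.column_stack([x1, a1, x2, np.ones(n), x2])),    # a2 = +1
        q2.predict(np.column_stack([x1, a1, x2, -np.ones(n), -x2])))  # a2 = -1

    # Stage 1: regress the stage-2 value (pseudo-outcome) on stage-1 history
    H1 = np.column_stack([x1, a1, a1 * x1])
    q1 = LinearRegression().fit(H1, v2)
    resid = q1.predict(H1) - v2   # residuals whose plots the article examines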
NASA Astrophysics Data System (ADS)
Bouchpan-Lerust-Juéry, L.
2007-08-01
Current and next generation on-board computer systems tend to implement real-time embedded control applications (e.g. the Attitude and Orbit Control Subsystem (AOCS), the Packet Utilization Standard (PUS), spacecraft autonomy...) which must meet high standards of reliability and predictability as well as safety. Meeting these requirements demands a considerable amount of effort and cost from the space software industry. This paper, in its first part, presents a free open-source integrated solution for developing RTAI applications spanning analysis, design, simulation and direct implementation using open-source-based code generation, and in its second part summarises the suggested approach, its results and the conclusions for further work.
Practical Considerations for Clinical PET/MR Imaging.
Galgano, Samuel; Viets, Zachary; Fowler, Kathryn; Gore, Lael; Thomas, John V; McNamara, Michelle; McConathy, Jonathan
2018-01-01
Clinical PET/MR imaging is currently performed at a number of centers around the world as part of routine standard of care. This article focuses on issues and considerations for a clinical PET/MR imaging program, focusing on routine standard-of-care studies. Although local factors influence how clinical PET/MR imaging is implemented, the approaches and considerations described here intend to apply to most clinical programs. PET/MR imaging provides many more options than PET/computed tomography with diagnostic advantages for certain clinical applications but with added complexity. A recurring theme is matching the PET/MR imaging protocol to the clinical application to balance diagnostic accuracy with efficiency. Copyright © 2017 Elsevier Inc. All rights reserved.
Fuels characterization studies. [jet fuels
NASA Technical Reports Server (NTRS)
Seng, G. T.; Antoine, A. C.; Flores, F. J.
1980-01-01
Current analytical techniques used in the characterization of broadened-property fuels are briefly described. Included are liquid chromatography, gas chromatography, and nuclear magnetic resonance spectroscopy. High-performance liquid chromatographic group-type methods development is being approached from several directions, including development of aromatic fraction standards and the elimination of standards through removal or partial removal of the alkene and aromatic fractions, or through the use of whole-fuel refractive index values. More sensitive methods for alkene determination using an ultraviolet-visible detector are also being pursued. Some of the more successful gas chromatographic physical property determinations for petroleum-derived fuels are the distillation curve (simulated distillation), heat of combustion, hydrogen content, API gravity, viscosity, flash point, and (to a lesser extent) freezing point.
Technology Innovations to Improve Biomass Cookstoves to Meet Tier 4 Standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Still, Dean K; Hatfield, Micheal S
Protecting public health has become a major motivation for investigating how improved cook stoves might function as a viable intervention. Currently, the great majority of cookstoves for sale in the developing world were not designed for this purpose; instead, success was based on criteria such as reduced fuel use, affordability, and ease of use. With DOE funding, Aprovecho Research Center spent three years creating stoves using an iterative development and modeling approach, resulting in four stoves that in lab tests met the World Health Organization (2014) intermediate rate vented targets for PM2.5 and for CO.
Practical Considerations for Clinical PET/MR Imaging.
Galgano, Samuel; Viets, Zachary; Fowler, Kathryn; Gore, Lael; Thomas, John V; McNamara, Michelle; McConathy, Jonathan
2017-05-01
Clinical PET/MR imaging is currently performed at a number of centers around the world as part of routine standard of care. This article focuses on issues and considerations for a clinical PET/MR imaging program, focusing on routine standard-of-care studies. Although local factors influence how clinical PET/MR imaging is implemented, the approaches and considerations described here intend to apply to most clinical programs. PET/MR imaging provides many more options than PET/computed tomography with diagnostic advantages for certain clinical applications but with added complexity. A recurring theme is matching the PET/MR imaging protocol to the clinical application to balance diagnostic accuracy with efficiency. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wellman, Dawn M.; Freshley, Mark D.; Truex, Michael J.
Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases technically impractical to achieve. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable establishing a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered on developing and refining conceptual models in conjunction with assessing risks and potential endpoints, as part of a system-based assessment that integrates site data with scientific understanding of the processes that control the distribution and transport of contaminants in the subsurface and the pathways to receptors. This system-based assessment, and subsequent implementation of the remediation strategy with appropriate monitoring, are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. These challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches.
Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.
Blobel, B G M E; Engel, K; Pharow, P
2006-01-01
To meet the challenge of high-quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. For enabling semantic interoperability, a unified process for defining and implementing the architecture, i.e. the structure and functions of the cooperating systems' components, as well as the approach for knowledge representation, i.e. the information used and its interpretation, algorithms, etc., has to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantically interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is close collaboration between the teams involved, guaranteeing convergence between the competing approaches.
Vu, Dai Long; Ranglová, Karolína; Hájek, Jan; Hrouzek, Pavel
2018-05-01
Quantification of selenated amino acids currently relies on methods employing inductively coupled plasma mass spectrometry (ICP-MS). Although very accurate, these methods do not allow the simultaneous determination of standard amino acids, hampering comparison of the content of selenated versus non-selenated species such as methionine (Met) and selenomethionine (SeMet). This paper reports two approaches for the simultaneous quantification of Met and SeMet. In the first approach, standard enzymatic hydrolysis employing Protease XIV was applied for the preparation of samples. The second approach utilized methanesulfonic acid (MA) for the hydrolysis of samples, either in a reflux system or in a microwave oven, followed by derivatization with diethyl ethoxymethylenemalonate. The prepared samples were then analyzed by multiple-reaction-monitoring high-performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS). Both approaches provided platforms for the accurate determination of the selenium/sulfur substitution rate in Met. Moreover, the second approach also provided accurate simultaneous quantification of Met and SeMet with a low limit of detection, a low limit of quantification and a wide linearity range, comparable to the commonly used gas chromatography mass spectrometry (GC-MS) method or ICP-MS. The novel method was validated using certified reference material in conjunction with the GC-MS reference method. Copyright © 2018. Published by Elsevier B.V.
Otero, P; Hersh, W
2011-01-01
Web 3.0 is transforming the World Wide Web by allowing knowledge and reasoning to be gleaned from its content. We describe a new scenario in education and training known as "Education 3.0" that can help promote learning in health informatics in a collaborative way, and review the current standards available for curricula and learning activities in Biomedical and Health Informatics (BMHI) in a Web 3.0 scenario. "Education 3.0" can provide open educational resources created and reused across different institutions and improved by means of international collaborative knowledge, powered by the use of e-learning. Currently there are standards that could be used to identify and deliver educational content in BMHI in the semantic web era, such as the Resource Description Framework (RDF), the Web Ontology Language (OWL) and the Sharable Content Object Reference Model (SCORM). In addition, there are other standards to support healthcare education and training. There are few published experiences of the use of standards in e-learning in BMHI. Web 3.0 can propose new approaches to building the BMHI workforce, so there is a need to build tools as knowledge infrastructure to leverage it. The usefulness of standards for the content and competencies of training programs in BMHI needs more experience and research so as to promote the interoperability and sharing of resources in this growing discipline.
Quantum Fragment Based ab Initio Molecular Dynamics for Proteins.
Liu, Jinfeng; Zhu, Tong; Wang, Xianwei; He, Xiao; Zhang, John Z H
2015-12-08
Developing ab initio molecular dynamics (AIMD) methods for practical application in protein dynamics is of significant interest. Due to the large size of biomolecules, applying standard quantum chemical methods to compute energies for dynamic simulation is computationally prohibitive. In this work, a fragment-based ab initio molecular dynamics approach is presented for practical application in protein dynamics studies. In this approach, the energy and forces of the protein are calculated by a recently developed electrostatically embedded generalized molecular fractionation with conjugate caps (EE-GMFCC) method. For simulation in explicit solvent, mechanical embedding is introduced to treat protein interaction with explicit water molecules. This AIMD approach has been applied to MD simulations of the small benchmark protein Trp-cage (with 20 residues and 304 atoms) in both the gas phase and in solution. Comparison to the simulation result using the AMBER force field shows that the AIMD gives a more stable protein structure in the simulation, indicating that the quantum chemical energy is more reliable. Importantly, the present fragment-based AIMD simulation captures quantum effects, including electrostatic polarization and charge transfer, that are missing in standard classical MD simulations. The current approach is linear-scaling, trivially parallel, and applicable to performing AIMD simulations of proteins of large size.
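Schematically, fragmentation methods of this family approximate the total energy as a sum over capped fragments minus the double-counted conjugate-cap pairs; the generic MFCC form is sketched below (EE-GMFCC adds electrostatic embedding and interaction corrections on top of this, which are not reproduced here):

    % Generic MFCC-type fragmentation energy (schematic; notation illustrative)
    E_{\mathrm{protein}} \;\approx\;
        \sum_{i=1}^{N} E\bigl(\mathrm{frag}_i^{\,\mathrm{capped}}\bigr)
      \;-\; \sum_{j=1}^{N-1} E\bigl(\mathrm{concap}_j\bigr)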
Multifractal Analysis for Nutritional Assessment
Park, Youngja; Lee, Kichun; Ziegler, Thomas R.; Martin, Greg S.; Hebbar, Gautam; Vidakovic, Brani; Jones, Dean P.
2013-01-01
The concept of multifractality is currently used to describe the self-similar and complex scaling properties observed in numerous biological signals. Fractals are geometric objects or dynamic variations which exhibit some degree of similarity (irregularity) to the original object over a wide range of scales. This approach determines the irregularity of a biologic signal as an indicator of adaptability, the capability to respond to unpredictable stress, and health. In the present work, we propose the application of multifractal analysis of wavelet-transformed proton nuclear magnetic resonance (1H NMR) spectra of plasma to determine nutritional insufficiency. For validation of this method on the 1H NMR signal of human plasma, the standard deviation from the classical statistical approach, and the Hurst exponent (H), left slope and partition function from multifractal analysis, were extracted from 1H NMR spectra to test whether multifractal indices could discriminate healthy subjects from unhealthy, intensive care unit patients. After validation, the multifractal approach was applied to spectra of plasma from a modified crossover study of sulfur amino acid insufficiency and tested for associations with blood lipids. The results showed that standard deviation and H, but not left slope, were significantly different for sulfur amino acid sufficiency and insufficiency. Quadratic discriminant analysis of H, left slope and the partition function showed 78% overall classification accuracy according to sulfur amino acid status. Triglycerides and apolipoprotein C3 were significantly correlated with a multifractal model containing H, left slope, and standard deviation, and cholesterol and high-sensitivity C-reactive protein were significantly correlated with H. In conclusion, multifractal analysis of 1H NMR spectra provides a new approach to characterizing nutritional status. PMID:23990878
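As an illustration of one of the extracted indices, the Hurst exponent can be estimated from a series with the simple aggregated-variance method sketched below (the study itself used wavelet-based multifractal analysis; the data here are synthetic):

    # Aggregated-variance Hurst estimate: for self-similar series, the
    # variance of block means scales as m^(2H - 2) with block size m.
    import numpy as np

    def hurst_aggvar(x, scales=(2, 4, 8, 16, 32)):
        x = np.asarray(x, float)
        var = []
        for m in scales:
            blocks = x[: len(x) // m * m].reshape(-1, m).mean(axis=1)
            var.append(blocks.var())
        slope = np.polyfit(np.log(scales), np.log(var), 1)[0]  # = 2H - 2
        return 1.0 + slope / 2.0

    rng = np.random.default_rng(2)
    print(hurst_aggvar(rng.normal(size=4096)))  # ~0.5 for white noise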
Diagnostic approach to patients with suspected pulmonary embolism: a report from the real world
Saro, G; Campo, J; Hernandez, M; Anta, M; Olmos, J; Gonzalez-Macias, J; Riancho, J
1999-01-01
This study was carried out to examine the diagnostic approach to patients with suspected pulmonary embolism (PE) in a university hospital. A retrospective case record review of 251 patients with suspected pulmonary embolism was carried out according to a standard protocol, which looked at the utilisation of imaging techniques and compared clinical diagnoses with a standardised diagnosis established according to current recommendations. Isotopic lung scan was the most commonly used technique (73%), followed by leg vein sonography (36%) and contrast venography (31%). Lung arteriography was done in only 7% of patients. Among the 205 patients with a clinical diagnosis of PE, 115 (56%) would be diagnosed as having PE according to the standard criteria, 84 (41%) would be unclassified, and six (3%) would not be regarded as having PE. Among patients who were diagnosed as having PE and received anticoagulant therapy, 32% did not have the diagnosis confirmed by an imaging technique. Most of these had a non-diagnostic lung scan which, despite evidence to the contrary, seemed to be interpreted as confirmation of PE. We conclude that clinicians do not seem to follow current recommendations when approaching patients with suspected PE. In particular, there is an over-reliance on lung scans, and the significance of non-diagnostic scans was often misinterpreted. Arteriography was underused. These results emphasise the need to take measures to implement practice guidelines and to explore the usefulness of newer non-invasive techniques. Keywords: pulmonary embolism; diagnosis; lung scan; imaging techniques; audit PMID:10533633
Meeting EHR security requirements: SeAAS approach.
Katt, Basel; Trojer, Thomas; Breu, Ruth; Schabetsberger, Thomas; Wozak, Florian
2010-01-01
In the last few years, Electronic Health Record (EHR) systems have received great attention in the literature as well as in industry. They are expected to lead to health care savings, increase health care quality and reduce medical errors. This interest has been accompanied by the development of different standards and frameworks to meet EHR challenges. One of the most important initiatives developed to solve problems of EHR is IHE (Integrating the Healthcare Enterprise), which adopts a distributed approach to storing and managing healthcare data. IHE aims at standardizing the way healthcare systems exchange information in distributed environments. For this purpose it defines several so-called Integration Profiles that specify the interactions and the interfaces (Transactions) between various healthcare systems (Actors) or entities. Security was also considered in a few profiles that tackled the main security requirements, mainly authentication and audit trails. The security profiles of IHE currently suffer from two drawbacks. First, they apply an end-point security methodology, which has recently been shown to be insufficient and cumbersome in distributed and heterogeneous environments. Second, the current security profiles for more complex security requirements are oversimplified, vague and do not consider architectural design. This has recently changed to some extent, e.g., with the introduction of newly published white papers regarding privacy [5] and access control [9]. In order to solve the first problem, we utilize results of previous studies conducted in the area of security-aware IHE-based systems and the state-of-the-art Security-as-a-Service approach as a convenient methodology to group domain-wide security needs and overcome the end-point security shortcomings.
Hendry, Shona; Salgado, Roberto; Gevaert, Thomas; Russell, Prudence A; John, Tom; Thapa, Bibhusal; Christie, Michael; van de Vijver, Koen; Estrada, M V; Gonzalez-Ericsson, Paula I; Sanders, Melinda; Solomon, Benjamin; Solinas, Cinzia; Van den Eynden, Gert G G M; Allory, Yves; Preusser, Matthias; Hainfellner, Johannes; Pruneri, Giancarlo; Vingiani, Andrea; Demaria, Sandra; Symmans, Fraser; Nuciforo, Paolo; Comerma, Laura; Thompson, E A; Lakhani, Sunil; Kim, Seong-Rim; Schnitt, Stuart; Colpaert, Cecile; Sotiriou, Christos; Scherer, Stefan J; Ignatiadis, Michail; Badve, Sunil; Pierce, Robert H; Viale, Giuseppe; Sirtaine, Nicolas; Penault-Llorca, Frederique; Sugie, Tomohagu; Fineberg, Susan; Paik, Soonmyung; Srinivasan, Ashok; Richardson, Andrea; Wang, Yihong; Chmielik, Ewa; Brock, Jane; Johnson, Douglas B; Balko, Justin; Wienert, Stephan; Bossuyt, Veerle; Michiels, Stefan; Ternes, Nils; Burchardi, Nicole; Luen, Stephen J; Savas, Peter; Klauschen, Frederick; Watson, Peter H; Nelson, Brad H; Criscitiello, Carmen; O'Toole, Sandra; Larsimont, Denis; de Wind, Roland; Curigliano, Giuseppe; André, Fabrice; Lacroix-Triki, Magali; van de Vijver, Mark; Rojo, Federico; Floris, Giuseppe; Bedri, Shahinaz; Sparano, Joseph; Rimm, David; Nielsen, Torsten; Kos, Zuzana; Hewitt, Stephen; Singh, Baljit; Farshid, Gelareh; Loibl, Sibylle; Allison, Kimberly H; Tung, Nadine; Adams, Sylvia; Willard-Gallo, Karen; Horlings, Hugo M; Gandhi, Leena; Moreira, Andre; Hirsch, Fred; Dieci, Maria V; Urbanowicz, Maria; Brcic, Iva; Korski, Konstanty; Gaire, Fabien; Koeppen, Hartmut; Lo, Amy; Giltnane, Jennifer; Rebelatto, Marlon C; Steele, Keith E; Zha, Jiping; Emancipator, Kenneth; Juco, Jonathan W; Denkert, Carsten; Reis-Filho, Jorge; Loi, Sherene; Fox, Stephen B
2017-11-01
Assessment of the immune response to tumors is growing in importance as the prognostic implications of this response are increasingly recognized, and as immunotherapies are evaluated and implemented in different tumor types. However, many different approaches can be used to assess and describe the immune response, which limits efforts at implementation as a routine clinical biomarker. In part 1 of this review, we have proposed a standardized methodology to assess tumor-infiltrating lymphocytes (TILs) in solid tumors, based on the International Immuno-Oncology Biomarkers Working Group guidelines for invasive breast carcinoma. In part 2 of this review, we discuss the available evidence for the prognostic and predictive value of TILs in common solid tumors, including carcinomas of the lung, gastrointestinal tract, genitourinary system, gynecologic system, and head and neck, as well as primary brain tumors, mesothelioma and melanoma. The particularities and different emphases in TIL assessment in different tumor types are discussed. The standardized methodology we propose can be adapted to different tumor types and may be used as a standard against which other approaches can be compared. Standardization of TIL assessment will help clinicians, researchers and pathologists to conclusively evaluate the utility of this simple biomarker in the current era of immunotherapy.
White, Helen E; Hedges, John; Bendit, Israel; Branford, Susan; Colomer, Dolors; Hochhaus, Andreas; Hughes, Timothy; Kamel-Reid, Suzanne; Kim, Dong-Wook; Modur, Vijay; Müller, Martin C; Pagnano, Katia B; Pane, Fabrizio; Radich, Jerry; Cross, Nicholas C P; Labourier, Emmanuel
2013-06-01
Current guidelines for managing Philadelphia-positive chronic myeloid leukemia include monitoring the expression of the BCR-ABL1 (breakpoint cluster region/c-abl oncogene 1, non-receptor tyrosine kinase) fusion gene by quantitative reverse-transcription PCR (RT-qPCR). Our goal was to establish and validate reference panels to mitigate the interlaboratory imprecision of quantitative BCR-ABL1 measurements and to facilitate global standardization on the international scale (IS). Four-level secondary reference panels were manufactured under controlled and validated processes with synthetic Armored RNA Quant molecules (Asuragen) calibrated to reference standards from the WHO and the NIST. Performance was evaluated in IS reference laboratories and with non-IS-standardized RT-qPCR methods. For most methods, percent ratios for BCR-ABL1 e13a2 and e14a2 relative to ABL1 or BCR were robust at 4 different levels and linear over 3 logarithms, from 10% to 0.01% on the IS. The intraassay and interassay imprecision was <2-fold overall. Performance was stable across 3 consecutive lots, in multiple laboratories, and over a period of 18 months to date. International field trials demonstrated the commutability of the reagents and their accurate alignment to the IS within the intra- and interlaboratory imprecision of IS-standardized methods. The synthetic calibrator panels are robust, reproducibly manufactured, analytically calibrated to the WHO primary standards, and compatible with most BCR-ABL1 RT-qPCR assay designs. The broad availability of secondary reference reagents will further facilitate interlaboratory comparative studies and independent quality assessment programs, which are of paramount importance for worldwide standardization of BCR-ABL1 monitoring results and the optimization of current and new therapeutic approaches for chronic myeloid leukemia. © 2013 American Association for Clinical Chemistry.
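The report-out arithmetic behind IS standardization is simple: a raw percent ratio is scaled by a laboratory-specific conversion factor onto the International Scale. A minimal sketch with illustrative numbers (not data from the study):

    # %IS = 100 * (BCR-ABL1 / control gene transcripts) * laboratory CF
    def percent_is(bcr_abl1_copies, control_copies, cf):
        return 100.0 * bcr_abl1_copies / control_copies * cf

    # e.g. 150 BCR-ABL1 vs 120,000 ABL1 transcripts with CF = 0.8 -> 0.1% IS
    print(percent_is(150, 120_000, 0.8))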
Allaire, Brett T; DePaolis Kaluza, M Clara; Bruno, Alexander G; Samelson, Elizabeth J; Kiel, Douglas P; Anderson, Dennis E; Bouxsein, Mary L
2017-01-01
Current standard methods to quantify disc height, namely distortion-compensated Roentgen analysis (DCRA), have mostly been utilized in the lumbar and cervical spine and have strict exclusion criteria. Specifically, discs adjacent to a vertebral fracture are excluded from measurement, thus limiting the use of DCRA in studies that include older populations with a high prevalence of vertebral fractures. We therefore developed and tested a modified DCRA algorithm that does not depend on vertebral shape. Participants included 1186 men and women from the Framingham Heart Study Offspring and Third Generation Multidetector CT Study. Lateral CT scout images were used to place 6 morphometry points around each vertebra at 13 vertebral levels in each participant. Disc heights were calculated from these morphometry points using DCRA methodology and our modified version of DCRA, which requires information from fewer morphometry points than the standard DCRA. Modified DCRA and standard DCRA measures of disc height are highly correlated, with concordance correlation coefficients above 0.999. Both measures demonstrate good inter- and intra-operator reproducibility. 13.9% of available disc heights were not evaluable or were excluded using the standard DCRA algorithm, while only 3.3% of disc heights were not evaluable using our modified DCRA algorithm. Using our modified DCRA algorithm, it is not necessary to exclude vertebrae with fracture or other deformity from disc height measurements as in the standard DCRA. Modified DCRA also yields measurements identical to those of the standard DCRA. Thus, the use of modified DCRA for quantitative assessment of disc height will lead to less missing data without any loss of accuracy, making it a preferred alternative to the current standard methodology.
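A toy version of a point-based disc-height computation, assuming paired endplate points are available (the modified DCRA's actual point usage and normalization differ in detail; names and values are illustrative):

    # Mean gap between the inferior endplate of the upper vertebra and the
    # superior endplate of the lower vertebra, from paired (x, y) points.
    import numpy as np

    def disc_height(inferior_pts, superior_pts):
        inferior = np.asarray(inferior_pts, float)
        superior = np.asarray(superior_pts, float)
        return np.linalg.norm(inferior - superior, axis=1).mean()

    upper_inf = [(0.0, 10.0), (15.0, 10.5), (30.0, 10.2)]  # anterior, mid, posterior
    lower_sup = [(0.0, 4.0), (15.0, 4.4), (30.0, 4.1)]
    print(disc_height(upper_inf, lower_sup))  # ~6.1, in the same units as the points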
Medical Home Implementation: A Sensemaking Taxonomy of Hard and Soft Best Practices
Hoff, Timothy
2013-01-01
Context The patient-centered medical home (PCMH) model of care is currently a central focus of U.S. health system reform, but less is known about the model's implementation in the practice of everyday primary care. Understanding its implementation is key to ensuring the approach's continued support and success nationally. This article addresses this gap through a qualitative examination of the best practices associated with PCMH implementation for older adult patients in primary care. Methods I used a multicase, comparative study design that relied on a sensemaking approach and fifty-one in-depth interviews with physicians, nurses, and clinic support staff working in six accredited medical homes located in various geographic areas. My emphasis was on gaining descriptive insights into the staff's experiences delivering medical home care to older adult patients in particular and then analyzing how these experiences shaped the staff's thinking, learning, and future actions in implementing medical home care. Findings I found two distinct taxonomies of implementation best practices, which I labeled “hard” and “soft” because of their differing emphasis and content. Hard implementation practices are normative activities and structural interventions that align well with existing national standards for medical home care. Soft best practices are more relational in nature and derive from the existing practice social structure and everyday interactions between staff and patients. Currently, external stakeholders are less apt to recognize, encourage, or incentivize soft best practices. Conclusions The results suggest that there may be no standardized, one-size-fits-all approach to making medical home implementation work, particularly for special patient populations such as the elderly. My study also raises the issue of broadening current PCMH assessments and reward systems to include implementation practices that contain heavy social and relational components of care, in addition to the emphasis now placed on building structural supports for medical home work. Further study of these softer implementation practices and a continued call for qualitative methodological approaches that gain insight into everyday practice behavior are warranted. PMID:24320169
Current projects in Pre-analytics: where to go?
Sapino, Anna; Annaratone, Laura; Marchiò, Caterina
2015-01-01
The current clinical practice of tissue handling and sample preparation is multifaceted and lacks strict standardisation: this scenario leads to significant variability in the quality of clinical samples. Poor tissue preservation has a detrimental effect, leading to morphological artefacts, hampering the reproducibility of immunocytochemical and molecular diagnostic results (protein expression, DNA gene mutations, RNA gene expression) and affecting research outcomes with irreproducible gene expression and post-transcriptional data. Altogether, this limits the opportunity to share and pool national databases into common European databases. At the European level, standardisation of pre-analytical steps is just at the beginning, and issues regarding bio-specimen collection and management are still debated. A joint public-private project on the standardisation of tissue handling in pre-analytical procedures has recently been funded in Italy with the aim of proposing novel approaches to the neglected issue of pre-analytical procedures. In this chapter, we show how investing in pre-analytics may have an impact both on public health problems and on practical innovation in solid tumour processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basso, T.
Public-private partnerships have been a mainstay of the U.S. Department of Energy and the National Renewable Energy Laboratory (DOE/NREL) approach to research and development. These partnerships also include technology development that enables grid modernization and distributed energy resources (DER) advancement, especially renewable energy systems integration with the grid. Through DOE/NREL and industry support of Institute of Electrical and Electronics Engineers (IEEE) standards development, the IEEE 1547 series of standards has helped shape the way utilities and other businesses have worked together to realize increasing amounts of DER interconnected with the distribution grid. And more recently, the IEEE 2030 series of standards is helping to further realize greater implementation of communications and information technologies that provide interoperability solutions for enhanced integration of DER and loads with the grid. For these standards development partnerships, for approximately $1 of federal funding, industry partnering has contributed $5. In this report, the status update is presented for the American National Standards IEEE 1547 and IEEE 2030 series of standards. A short synopsis of the history of the 1547 standards is first presented, then the current status and future direction of the ongoing standards development activities are discussed.
Fibrinolysis standards: a review of the current status.
Thelwell, C
2010-07-01
Biological standards are used to calibrate measurements of components of the fibrinolytic system, either for assigning potency values to therapeutic products or to determine levels in human plasma as an indicator of thrombotic risk. Traditionally, WHO International Standards are calibrated in International Units based on consensus values from collaborative studies. The International Unit is defined by the response activity of a given amount of the standard in a bioassay, independent of the method used. Assay validity is based on the assumption that both standard and test preparation contain the same analyte and that the response in an assay is a true function of this analyte. This principle is reflected in the diversity of source materials used to prepare fibrinolysis standards, which has depended on the contemporary preparations they were employed to measure. With advancing recombinant technology and improved analytical techniques, a reference system based on reference materials and associated reference methods has been recommended for future fibrinolysis standards. Careful consideration and scientific judgement must, however, be applied when deciding on an approach to develop a new standard, with decisions based on the suitability of a standard to serve its purpose and not just on satisfying a metrological ideal. Copyright © 2010 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendell, Mark J.; Apte, Mike G.
This report considers the question of whether the California Energy Commission should incorporate the ASHRAE 62.1 ventilation standard into the Title 24 ventilation rate (VR) standards, thus allowing buildings to follow the Indoor Air Quality Procedure. This, in contrast to the current prescriptive standard, allows the option of using ventilation rate as one of several strategies, which might include source reduction and air cleaning, to meet specified targets for indoor air concentrations and occupant acceptability. The research findings reviewed in this report suggest that a revised approach to a ventilation standard for commercial buildings is necessary, because the current prescriptive ASHRAE 62.1 Ventilation Rate Procedure (VRP) apparently does not provide occupants with either sufficiently acceptable or sufficiently health-protective air quality. One possible solution would be a dramatic increase in the minimum ventilation rates (VRs) prescribed by a VRP. This solution, however, is not feasible for at least three reasons: the current need to reduce energy use rather than increase it further, the problem of polluted outdoor air in many cities, and the apparent limited ability of increased VRs to reduce all indoor airborne contaminants of concern (per Hodgson, 2003). Any feasible solution is thus likely to include methods of pollutant reduction other than increased outdoor air ventilation, e.g., source reduction or air cleaning. The alternative 62.1 Indoor Air Quality Procedure (IAQP) offers multiple possible benefits in this direction over the VRP, but seems too limited by insufficient specifications and inadequate available data to provide adequate protection for occupants. Ventilation system designers rarely choose to use it, finding it too arbitrary and requiring much non-engineering judgment and information that is not readily available. This report suggests strategies to revise the current ASHRAE IAQP to reduce its current limitations. These strategies, however, would make it more complex and more prescriptive, and would require substantial research. One practical intermediate strategy to save energy would be an alternate VRP, allowing VRs lower than currently prescribed as long as indoor VOC concentrations were no higher than with VRs prescribed under the current VRP. This kind of hybrid, with source reduction and use of air cleaning optional but permitted, could eventually evolve into a fully developed IAQP as data, materials, and air-cleaning technology allowed gradual lowering of allowable concentrations. Ultimately, it seems that VR standards must evolve to resemble the IAQP, especially in California, where buildings must achieve zero net energy use within 20 years.
The Selective Use of Radiation Therapy in Rectal Cancer Patients.
Martella, Andrew; Willett, Christopher; Palta, Manisha; Czito, Brian
2018-04-11
Colorectal cancer has a high global incidence, and standard treatment employs a multimodality approach. In addition to cure, minimizing treatment-related toxicity and improving the therapeutic ratio is a common goal. The following article addresses the potential for omitting radiotherapy in select rectal cancer patients. Omission of radiotherapy in rectal cancer is analyzed in the context of historical findings, as well as more recent data describing risk stratification of stage II-III disease, surgical optimization, imaging limitations, improvements in systemic chemotherapeutic agents, and contemporary studies evaluating selective omission of radiotherapy. A subset of rectal cancer patients exists that may be considered at low to intermediate risk for locoregional recurrence. With appropriate staging, surgical technique, and possibly improved systemic therapy, it may be feasible to selectively omit radiotherapy in these patients. Current imaging limitations, as well as evidence of increased locoregional recurrence following radiotherapy omission, lead us to continue supporting the standard treatment approach of neoadjuvant chemoradiation therapy followed by surgical resection until additional improvements and prospective evidence can support otherwise.
Gregg, Evan O.; Minet, Emmanuel
2013-01-01
There are established guidelines for bioanalytical assay validation and qualification of biomarkers. In this review, they were applied to a panel of urinary biomarkers of tobacco smoke exposure as part of a “fit for purpose” approach to the assessment of exposure to smoke constituents in groups of tobacco product smokers. Clinical studies have allowed the identification of a group of tobacco exposure biomarkers demonstrating a good dose-response relationship, whilst others, such as dihydroxybutyl mercapturic acid and 2-carboxy-1-methylethylmercapturic acid, did not reproducibly discriminate smokers and non-smokers. Furthermore, there are currently no agreed common reference standards to measure absolute concentrations, and few inter-laboratory trials have been performed to establish consensus values for interim standards. Thus, we also discuss in this review additional requirements for the generation of robust data on urinary biomarkers, including toxicant metabolism and disposition, and method validation and qualification for use in tobacco product comparison studies. PMID:23902266
Paraboloid magnetospheric magnetic field model and the status of the model as an ISO standard
NASA Astrophysics Data System (ADS)
Alexeev, I.
A reliable representation of the magnetic field is crucial in the framework of radiation belt modelling, especially for disturbed conditions. The empirical model developed by Tsyganenko (T96) is constructed by minimizing the rms deviation from a large magnetospheric data base, and the applicability of the T96 model is limited mainly to quiet conditions in the solar wind along the Earth's orbit. Contrary to the internal planetary field, the external magnetospheric magnetic field sources are much more time-dependent. For this reason, the paraboloid magnetospheric model is constructed by a more accurate and physically consistent approach, in which each source of the magnetic field has its own relaxation timescale and a driving function based on an individual best-fit combination of solar wind and IMF parameters. This approach is based on a priori information about the structure of the global magnetospheric current systems, and each current system is included as a separate block module in the magnetospheric model. As shown by spacecraft magnetometer data, three current systems are the main contributors to the external magnetospheric magnetic field: the magnetopause currents, the ring current and the tail current sheet. The paraboloid model is based on an analytical solution of the Laplace equation for each of these large-scale current systems in the magnetosphere.
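The modular structure described above can be written schematically as a sum of analytical contributions, one per current system, each with its own time-dependent driving parameters (notation illustrative):

    % Schematic decomposition assumed by the paraboloid model:
    % MP = magnetopause currents, RC = ring current, TC = tail current sheet
    \mathbf{B}_{\mathrm{ext}}(\mathbf{r}, t) \;=\;
        \mathbf{B}_{\mathrm{MP}}(\mathbf{r}, t)
      + \mathbf{B}_{\mathrm{RC}}(\mathbf{r}, t)
      + \mathbf{B}_{\mathrm{TC}}(\mathbf{r}, t)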
Angiogenic therapy for cardiac repair based on protein delivery systems.
Formiga, F R; Tamayo, E; Simón-Yarza, T; Pelacho, B; Prósper, F; Blanco-Prieto, M J
2012-05-01
Cardiovascular diseases remain the leading cause of morbidity and mortality in developed countries and are a major problem not only in the western nations but also in developing countries. Current standard approaches for treating patients with ischemic heart disease include angioplasty or bypass surgery. However, a large number of patients cannot be treated using these procedures. Novel curative approaches under investigation include gene, cell, and protein therapy. This review focuses on potential growth factors for cardiac repair. The role of these growth factors in the angiogenic process and their therapeutic implications are reviewed. Issues concerning growth factor delivery are presented in relation to protein stability, dosage, routes, and safety. Finally, different approaches for controlled growth factor delivery are discussed as novel protein delivery platforms for cardiac regeneration.
Carling, Christopher
2013-08-01
Academic and practitioner interest in the physical performance of male professional soccer players in the competition setting determined via time-motion analyses has grown substantially over the last four decades leading to a substantial body of published research and aiding development of a more systematic evidence-based framework for physical conditioning. Findings have forcibly shaped contemporary opinions in the sport with researchers and practitioners frequently emphasising the important role that physical performance plays in match outcomes. Time-motion analyses have also influenced practice as player conditioning programmes can be tailored according to the different physical demands identified across individual playing positions. Yet despite a more systematic approach to physical conditioning, data indicate that even at the very highest standards of competition, the contemporary player is still susceptible to transient and end-game fatigue. Over the course of this article, the author suggests that a more pragmatic approach to interpreting the current body of time-motion analysis data and its application in the practical setting is nevertheless required. Examples of this are addressed using findings in the literature to examine (a) the association between competitive physical performance and 'success' in professional soccer, (b) current approaches to interpreting differences in time-motion analysis data across playing positions, and (c) whether data can realistically be used to demonstrate the occurrence of fatigue in match-play. Gaps in the current literature and directions for future research are also identified.
How to manage MTTF larger than 30,000hr on rotary cryocoolers
NASA Astrophysics Data System (ADS)
Cauquil, Jean-Marc; Seguineau, Cédric; Martin, Jean-Yves; Van-Acker, Sébastien; Benschop, Tonny
2017-05-01
Cooled IR detectors are used in a wide range of applications. Most of the time, the cryocooler is one of the components dimensioning the lifetime of the system. Indeed, Stirling coolers are mechanical systems in which wear occurs on millimetric mechanisms, and the exponential law classically used in electronics for Mean Time To Failure (MTTF) calculation cannot be directly applied to mechanical devices. With new applications for thermal sensors, such as border surveillance, increased reliability has become mandatory for rotary coolers: current needs exceed several tens of thousands of hours of continuous cooling. Thales Cryogenics has carried out specific developments on this topic for both linear and rotary applications. The time needed to validate process changes through suitable experimental designs is hardly affordable when following a robust and rigorous standard scientific approach, so the targeted MTTF led us to adopt an innovative approach to keep development phases in line with the expected time to market. This innovative approach is today widespread across all Thales Cryogenics rotary products and has resulted in a proven increase of MTTF for the RM2, RM3 and, recently, the RM1. This paper focuses on the current MTTF figures measured on the RM1, RM2 and RM3. After explaining the limits of a conventional approach, the paper describes the current method. Finally, the authors explain how these principles are taken into account for the new SWaP rotary cooler of Thales Cryogénie SAS.
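The unsuitability of the exponential law for wear-dominated mechanisms can be illustrated with a Weibull wear-out model, whose MTTF is eta * Gamma(1 + 1/beta); all values below are illustrative, not Thales figures:

    # Weibull MTTF: a shape parameter beta > 1 models wear-out, which the
    # memoryless exponential law (beta = 1) cannot capture.
    from math import gamma

    def weibull_mttf(eta_hours, beta):
        return eta_hours * gamma(1.0 + 1.0 / beta)

    print(weibull_mttf(33_000, 2.5))  # ~29,300 h for a wear-out failure mode
    print(weibull_mttf(33_000, 1.0))  # beta = 1 recovers the exponential law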
Research on resistance characteristics of YBCO tape under short-time DC large current impact
NASA Astrophysics Data System (ADS)
Zhang, Zhifeng; Yang, Jiabin; Qiu, Qingquan; Zhang, Guomin; Lin, Liangzhen
2017-06-01
Research on the resistance characteristics of YBCO tape under short-time DC large-current impact is the foundation for developing a DC superconducting fault current limiter (SFCL) for voltage-source-converter-based high-voltage direct current systems (VSC-HVDC), one of the valid approaches to solving the problems of renewable energy integration. An SFCL can limit DC short-circuit currents and enhance the interrupting capability of DC circuit breakers. In this paper, under short-time DC large-current impacts, the resistance of bare YBCO tape is studied to determine the resistance-temperature behaviour and the maximum impact current. The influence of insulation on the resistance-temperature characteristics of YBCO tape is studied by comparison tests of bare and insulated tape at 77 K. The influence of operating temperature on the tape is also studied under subcooled liquid nitrogen conditions. Regarding the current-impact safety of YBCO tape, critical current degradation and peak temperature are analyzed and used as judgment criteria. The test results are helpful for developing SFCLs for VSC-HVDC.
Osteotomy models - the current status on pain scoring and management in small rodents.
Lang, Annemarie; Schulz, Anja; Ellinghaus, Agnes; Schmidt-Bleek, Katharina
2016-12-01
Fracture healing is a complex regeneration process which produces new bone tissue without scar formation. However, fracture healing disorders occur in approximately 10% of human patients and cause severe pain and reduced quality of life. Recently, the development of more standardized, sophisticated and commercially available osteosynthesis techniques reflecting clinical approaches has increased the use of small rodents such as rats and mice in bone healing research dramatically. Nevertheless, there is no standard for pain assessment, especially in these species, and consequently limited information regarding the welfare aspects of osteotomy models. Moreover, the selection of analgesics is restricted for osteotomy models since non-steroidal anti-inflammatory drugs (NSAIDs) are known to affect the initial, inflammatory phase of bone healing. Therefore, opioids such as buprenorphine and tramadol are often used. However, dosage data in the literature are varied. Within this review, we clarify the background of osteotomy models, explain the current status and challenges of animal welfare assessment, and provide an example score sheet including model specific parameters. Furthermore, we summarize current refinement options and present a brief outlook on further 3R research. © The Author(s) 2016.
The currency and tempo of extinction.
Regan, H M; Lupia, R; Drinnan, A N; Burgman, M A
2001-01-01
This study examines estimates of extinction rates for the current purported biotic crisis and from the fossil record. Studies that compare current and geological extinctions sometimes use metrics that confound different sources of error and reflect different features of extinction processes. The per taxon extinction rate is a standard measure in paleontology that avoids some of the pitfalls of alternative approaches. Extinction rates reported in the conservation literature are rarely accompanied by measures of uncertainty, despite many elements of the calculations being subject to considerable error. We quantify some of the most important sources of uncertainty and carry them through the arithmetic of extinction rate calculations using fuzzy numbers. The results emphasize that estimates of current and future rates rely heavily on assumptions about the tempo of extinction and on extrapolations among taxa. Available data are unlikely to be useful in measuring magnitudes or trends in current extinction rates.
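A minimal sketch of carrying uncertainty through a per-taxon rate calculation with triangular fuzzy numbers (illustrative values and a hypothetical tri_div helper; the study's fuzzy arithmetic is more complete):

    # Triangular fuzzy numbers as (lo, mode, hi); dividing by a positive fuzzy
    # number pairs the minimum with the divisor's maximum and vice versa.
    def tri_div(a, b):
        return (a[0] / b[2], a[1] / b[1], a[2] / b[0])

    extinctions = (200.0, 500.0, 900.0)      # recorded plus inferred extinctions
    taxon_years = (1.0e6, 1.5e6, 2.0e6)      # species-years of observation
    rate = tri_div(extinctions, taxon_years) # extinctions per taxon per year
    print([f"{r:.2e}" for r in rate])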
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; David I. Gertman; Jeffrey C. Joe
2005-09-01
An ongoing issue within human-computer interaction (HCI) is the need for simplified or "discount" methods. The current economic slowdown has necessitated innovative methods that are results-driven and cost-effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task-analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.
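For readers unfamiliar with SPAR-H, its quantification step scales a nominal human error probability (NHEP) by performance shaping factor (PSF) multipliers, with an adjustment that keeps the result bounded when several PSFs are negative; a sketch with illustrative values, based on the published SPAR-H formula rather than the paper's HCI extension:

    # SPAR-H-style human error probability with the NUREG/CR-6883 adjustment
    # applied when three or more PSFs are rated negative (values illustrative).
    def spar_h_hep(nhep, psf_multipliers, negative_psfs=0):
        comp = 1.0
        for m in psf_multipliers:
            comp *= m
        if negative_psfs >= 3:
            return nhep * comp / (nhep * (comp - 1.0) + 1.0)
        return min(nhep * comp, 1.0)

    print(spar_h_hep(0.01, [10, 5, 2], negative_psfs=3))  # ~0.50 rather than 1.0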
Electrical Stimulation Technologies for Wound Healing
Kloth, Luther C.
2014-01-01
Objective: To discuss the physiological bases for using exogenously applied electric field (EF) energy to enhance wound healing with conductive electrical stimulation (ES) devices. Approach: To describe the types of electrical currents that have been reported to enhance chronic wound-healing rate and closure. Results: Commercial ES devices that generate direct current (DC), and mono and biphasic pulsed current waveforms represent the principal ES technologies which are reported to enhance wound healing. Innovation: Wafer-thin, disposable ES technologies (wound dressings) that utilize mini or micro-batteries to deliver low-level DC for wound healing and antibacterial wound-treatment purposes are commercially available. Microfluidic wound-healing chips are currently being used with greater accuracy to investigate the EF effects on cellular electrotaxis. Conclusion: Numerous clinical trials described in subsequent sections of this issue have demonstrated that ES used adjunctively with standard wound care (SWC), enhances wound healing rate faster than SWC alone. PMID:24761348
International Standards for Genomes, Transcriptomes, and Metagenomes
Mason, Christopher E.; Afshinnekoo, Ebrahim; Tighe, Scott; Wu, Shixiu; Levy, Shawn
2017-01-01
Challenges and biases in preparing, characterizing, and sequencing DNA and RNA can have significant impacts on research in genomics across all kingdoms of life, including experiments in single-cells, RNA profiling, and metagenomics (across multiple genomes). Technical artifacts and contamination can arise at each point of sample manipulation, extraction, sequencing, and analysis. Thus, the measurement and benchmarking of these potential sources of error are of paramount importance as next-generation sequencing (NGS) projects become more global and ubiquitous. Fortunately, a variety of methods, standards, and technologies have recently emerged that improve measurements in genomics and sequencing, from the initial input material to the computational pipelines that process and annotate the data. Here we review current standards and their applications in genomics, including whole genomes, transcriptomes, mixed genomic samples (metagenomes), and the modified bases within each (epigenomes and epitranscriptomes). These standards, tools, and metrics are critical for quantifying the accuracy of NGS methods, which will be essential for robust approaches in clinical genomics and precision medicine. PMID:28337071
CCSDS - Advancing Spaceflight Technology for International Collaboration
NASA Technical Reports Server (NTRS)
Kearney, Mike; Kiely, Aaron; Yeh, Penshu; Gerner, Jean-Luc; Calzolari, Gian-Paolo; Gifford, Kevin; Merri, Mario; Weiss, Howard
2010-01-01
The Consultative Committee for Space Data Systems (CCSDS) has been developing data and communications standards since 1982, with the objective of providing interoperability for enabling international collaboration for spaceflight missions. As data and communications technology has advanced, CCSDS has progressed to capitalize on existing products when available and suitable for spaceflight, and to develop innovative new approaches when available products fail. The current scope of the CCSDS architecture spans the end-to-end data architecture of a spaceflight mission, with ongoing efforts to develop and standardize cutting-edge technology. This manuscript describes the overall architecture, the position of CCSDS in the standards and international mission community, and some CCSDS processes. It then highlights in detail several of the most interesting and critical technical areas in work right now, and how they support collaborative missions. Special topics include: Delay/Disruption Tolerant Networking (DTN), Asynchronous Message Service (AMS), Multispectral/Hyperspectral Data Compression (MHDC), Coding and Synchronization, Onboard Wireless, Spacecraft Monitor and Control, Navigation, Security, and Time Synchronization/Correlation. Broad international participation in development of CCSDS standards is encouraged.
Pang, Xiao-Na; Li, Zhao-Jie; Chen, Jing-Yu; Gao, Li-Juan; Han, Bei-Zhong
2017-03-01
Standards and regulations related to spirit drinks have been established by different countries and international organizations to ensure the safety and quality of spirits. Here, we introduce the principles of food safety and quality standards for alcoholic beverages and then compare the key indicators used in the distinct standards of the Codex Alimentarius Commission, the European Union, the People's Republic of China, the United States, Canada, and Australia. We also discuss in detail the "maximum level" of the following main contaminants of spirit drinks: methanol, higher alcohols, ethyl carbamate, hydrocyanic acid, heavy metals, mycotoxins, phthalates, and aldehydes. Furthermore, the control measures used for potential hazards are introduced. Harmonization of the current requirements based on comprehensive scope analysis and the risk assessment approach will enhance both the trade and quality of distilled spirits. This review article provides valuable information that will enable producers, traders, governments, and researchers to increase their knowledge of spirit drink safety requirements, control measures, and research trends.
Pellegrin, Karen L; Miyamura, Jill B; Ma, Carolyn; Taniguchi, Ronald
2016-01-01
Current race/ethnicity categories established by the U.S. Office of Management and Budget are neither reliable nor valid for understanding health disparities or for tracking improvements in this area. In Hawaii, statewide hospitals have collaborated to collect race/ethnicity data using a standardized method consistent with recommended practices that overcome the problems with the federal categories. The purpose of this observational study was to determine the impact of this collaboration on key measures of race/ethnicity documentation. After this collaborative effort, the number of standardized categories available across hospitals increased from 6 to 34, and the percent of inpatients with documented race/ethnicity increased from 88 to 96%. This improved standardized methodology is now the foundation for tracking population health indicators statewide and focusing quality improvement efforts. The approach used in Hawaii can serve as a model for other states and regions. Ultimately, the ability to standardize data collection methodology across states and regions will be needed to track improvements nationally.
Nicotine reduction as an increase in the unit price of cigarettes: A behavioral economics approach
Smith, Tracy T.; Sved, Alan F.; Hatsukami, Dorothy K.; Donny, Eric C.
2015-01-01
Urgent action is needed to reduce the harm caused by smoking. Product standards that reduce the addictiveness of cigarettes are now possible both in the U.S. and in countries party to the Framework Convention on Tobacco Control. Specifically, standards that required substantially reduced nicotine content in cigarettes could enable cessation in smokers and prevent future smoking among current non-smokers. Behavioral economics uses principles from the field of microeconomics to characterize how consumption of a reinforcer changes as a function of the unit price of that reinforcer (unit price = cost / reinforcer magnitude). A nicotine reduction policy might be considered an increase in the unit price of nicotine because smokers are paying more per unit of nicotine. This perspective allows principles from behavioral economics to be applied to nicotine reduction research questions, including how nicotine consumption, smoking behavior, use of other tobacco products, and use of other drugs of abuse are likely to be affected. This paper reviews the utility of this approach and evaluates the notion that a reduction in nicotine content is equivalent to a reduction in the reinforcement value of smoking—an assumption made by the unit price approach. PMID:25025523
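The unit price framing is easy to make concrete. The sketch below uses the definition given in the abstract (unit price = cost / reinforcer magnitude) with hypothetical pack prices and nicotine yields; it shows how a nicotine content standard raises the cost per milligram of nicotine even when the retail price is unchanged.

```python
# A minimal sketch of the unit price framing. Prices and nicotine
# yields are hypothetical, chosen only to illustrate the arithmetic.

def unit_price(cost_per_pack, mg_nicotine_per_cigarette, cigarettes_per_pack=20):
    """Dollars paid per milligram of nicotine."""
    return cost_per_pack / (mg_nicotine_per_cigarette * cigarettes_per_pack)

current = unit_price(8.00, 1.2)    # conventional cigarettes
reduced = unit_price(8.00, 0.05)   # very low nicotine content standard

print(f"current: ${current:.2f}/mg")  # $0.33/mg
print(f"reduced: ${reduced:.2f}/mg")  # $8.00/mg -> a ~24x unit price increase
```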
Nicotine reduction as an increase in the unit price of cigarettes: a behavioral economics approach.
Smith, Tracy T; Sved, Alan F; Hatsukami, Dorothy K; Donny, Eric C
2014-11-01
Urgent action is needed to reduce the harm caused by smoking. Product standards that reduce the addictiveness of cigarettes are now possible both in the U.S. and in countries party to the Framework Convention on Tobacco Control. Specifically, standards that required substantially reduced nicotine content in cigarettes could enable cessation in smokers and prevent future smoking among current non-smokers. Behavioral economics uses principles from the field of microeconomics to characterize how consumption of a reinforcer changes as a function of the unit price of that reinforcer (unit price=cost/reinforcer magnitude). A nicotine reduction policy might be considered an increase in the unit price of nicotine because smokers are paying more per unit of nicotine. This perspective allows principles from behavioral economics to be applied to nicotine reduction research questions, including how nicotine consumption, smoking behavior, use of other tobacco products, and use of other drugs of abuse are likely to be affected. This paper reviews the utility of this approach and evaluates the notion that a reduction in nicotine content is equivalent to a reduction in the reinforcement value of smoking-an assumption made by the unit price approach. Copyright © 2014 Elsevier Inc. All rights reserved.
Marinkovich, Matt; Wallace, Chelsea; Morris, Pat J; Rideout, Bruce; Pye, Geoffrey W
2016-03-01
The preshipment examination, with associated transmissible disease testing, has become standard practice in the movement of animals between zoos. An alternative disease risk-based approach, based on a comprehensive surveillance program including necropsy and preventive medicine examination testing and data, has been in practice since 2006 between the San Diego Zoo and San Diego Zoo Safari Park. A retrospective analysis, evaluating comprehensive necropsy data and preshipment testing over a 5-yr study period, was performed to determine the viability of this model for use with sending animals to other institutions. Animals (607 birds, 704 reptiles and amphibians, and 341 mammals) were shipped to 116 Association of Zoos and Aquariums (AZA)-accredited and 29 non-AZA-accredited institutions. The evaluation showed no evidence of the specific transmissible diseases tested for during the preshipment exam being present within the San Diego Zoo collection. We suggest that a risk-based animal and institution-specific approach to transmissible disease preshipment testing is more cost effective and is in the better interest of animal welfare than the current industry standard of dogmatic preshipment testing.
An ecological approach to problems of Dark Energy, Dark Matter, MOND and Neutrinos
NASA Astrophysics Data System (ADS)
Zhao, Hong Sheng
2008-11-01
Modern astronomical data on galaxy and cosmological scales have revealed powerfully the existence of certain dark sectors of fundamental physics, i.e., the existence of particles and fields outside the standard models and inaccessible to current experiments. Various approaches are taken to modify/extend the standard models. Generic theories introduce multiple decoupled fields A, B, C, each responsible for the effects of DM (cold supersymmetric particles), DE (Dark Energy), and MG (Modified Gravity), respectively. Some theories adopt vanilla combinations like AB, BC, or CA, and assume A, B, C belong to decoupled sectors of physics. MOND-like MG and Cold DM are often taken as antagonistic frameworks, e.g., in the muddled debate around the Bullet Cluster. Here we argue that these ad hoc divisions of sectors miss important clues from the data. The data actually suggest that the physics of all dark sectors is likely linked together by a self-interacting oscillating field, which governs a chameleon-like dark fluid, appearing as DM, DE, and MG in different settings. It is timely to consider an interdisciplinary approach across all semantic boundaries of dark sectors, treating the dark stress as one identity and hence accounting for several "coincidences" naturally.
Valverde-Som, Lucia; Ruiz-Samblás, Cristina; Rodríguez-García, Francisco P; Cuadros-Rodríguez, Luis
2018-02-09
Virgin olive oil is the only food product for which sensory analysis is regulated to classify it in different quality categories. To harmonize the results of the sensorial method, the use of standards or reference materials is crucial. The stability of sensory reference materials is required to enable their suitable control, aiming to confirm that their specific target values are maintained on an ongoing basis. Currently, such stability is monitored by means of sensory analysis, and the sensory panels are in the paradoxical situation of controlling the standards that are devoted to controlling the panels. In the present study, several approaches based on similarity analysis are exploited. For each approach, the specific methodology to build a proper multivariate control chart to monitor the stability of the sensory properties is explained and discussed. The normalized Euclidean and Mahalanobis distances, the so-called nearness and hardiness indices respectively, have been defined as new similarity indices to range the values from 0 to 1. Also, the squared mean from Hotelling's T²-statistic and Q²-statistic has been proposed as another similarity index. © 2018 Society of Chemical Industry.
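The distance-based indices lend themselves to a compact numerical sketch. The scaling below (mapping a distance d to 1/(1+d)) is one plausible way to confine values to [0, 1] and is an assumption on my part, as are the attribute profile, panel covariance, and sample values; the index names follow the abstract.

```python
import numpy as np

# A sketch of similarity indices for monitoring a sensory reference
# material against its target profile. The normalization and all data
# are illustrative assumptions, not the authors' exact definitions.

def nearness_index(x, reference, scale):
    """Normalized-Euclidean similarity; 1 = identical to the reference."""
    d = np.linalg.norm((x - reference) / scale)
    return 1.0 / (1.0 + d)

def hardiness_index(x, reference, cov):
    """Mahalanobis similarity against the reference covariance."""
    diff = x - reference
    d2 = float(diff @ np.linalg.inv(cov) @ diff)
    return 1.0 / (1.0 + np.sqrt(d2))

ref = np.array([5.0, 3.2, 1.1])    # e.g. fruity, bitter, pungent medians
cov = np.diag([0.25, 0.16, 0.09])  # panel variability per attribute
sample = np.array([5.3, 3.0, 1.2])

print(nearness_index(sample, ref, np.sqrt(np.diag(cov))))
print(hardiness_index(sample, ref, cov))
```

Plotting such an index over time, with control limits derived from historical panel data, gives the multivariate control chart the study describes.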
Comparison of two head-up displays in simulated standard and noise abatement night visual approaches
NASA Technical Reports Server (NTRS)
Cronn, F.; Palmer, E. A., III
1975-01-01
Situation and command head-up displays were evaluated for both standard and two-segment noise abatement night visual approaches in a fixed-base simulation of a DC-8 transport aircraft. The situation display provided glide slope and pitch attitude information. The command display provided glide slope information and flight path commands to capture a 3 deg glide slope. Landing approaches were flown in both zero wind and wind shear conditions. For both standard and noise abatement approaches, the situation display provided greater glidepath accuracy in the initial phase of the landing approaches, whereas the command display was more effective in the final approach phase. Glidepath accuracy was greater for the standard approaches than for the noise abatement approaches in all phases of the landing approach. Most of the pilots preferred the command display and the standard approach. Substantial agreement was found between each pilot's judgment of his performance and his actual performance.
XML-based approaches for the integration of heterogeneous bio-molecular data.
Mesiti, Marco; Jiménez-Ruiz, Ernesto; Sanz, Ismael; Berlanga-Llavori, Rafael; Perlasca, Paolo; Valentini, Giorgio; Manset, David
2009-10-15
Today's public database infrastructure spans a very large collection of heterogeneous biological data, opening new opportunities for molecular biology, bio-medical, and bioinformatics research, but also raising new problems for their integration and computational processing. In this paper we survey the most interesting and novel approaches for the representation, integration, and management of different kinds of biological data by exploiting XML and the related recommendations and approaches. Moreover, we present new and interesting cutting-edge approaches for the appropriate management of heterogeneous biological data represented through XML. XML has succeeded in the integration of heterogeneous biomolecular information, and has established itself as the syntactic glue for biological data sources. Nevertheless, a large variety of XML-based data formats have been proposed, which hinders the effective integration of bioinformatics data schemes. The adoption of a few semantically rich standard formats is urgent to achieve a seamless integration of the current biological resources.
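As a concrete illustration of the integration task, the toy example below parses one invented XML record into a uniform internal representation; real formats (e.g., SBML or UniProt XML) each define their own schemas, so the element names here are purely for the example.

```python
import xml.etree.ElementTree as ET

# A toy illustration of XML-based integration: map one record from a
# (hypothetical) source schema into a uniform internal representation.

record = """
<protein id="P12345">
  <name>Hypothetical kinase</name>
  <organism taxid="9606">Homo sapiens</organism>
  <xref db="GO" acc="GO:0004672"/>
  <xref db="PDB" acc="1ABC"/>
</protein>
"""

root = ET.fromstring(record)
entry = {
    "id": root.get("id"),
    "name": root.findtext("name"),
    "organism": root.findtext("organism"),
    # Collect cross-references into a uniform (db, accession) form that
    # downstream integration code can treat identically across sources.
    "xrefs": [(x.get("db"), x.get("acc")) for x in root.findall("xref")],
}
print(entry)
```

The integration problem the survey describes is exactly that each source requires its own such mapping, which is why a few shared, semantically rich formats would help.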
Paritosh, Kunwar; Kushwaha, Sandeep K.; Yadav, Monika; Pareek, Nidhi; Chawade, Aakash
2017-01-01
Food wastage and its accumulation are becoming a critical problem around the globe due to continuous increase of the world population. The exponential growth in food waste is imposing serious threats to our society like environmental pollution, health risk, and scarcity of dumping land. There is an urgent need to take appropriate measures to reduce food waste burden by adopting standard management practices. Currently, various kinds of approaches are investigated in waste food processing and management for societal benefits and applications. Anaerobic digestion approach has appeared as one of the most ecofriendly and promising solutions for food wastes management, energy, and nutrient production, which can contribute to world's ever-increasing energy requirements. Here, we have briefly described and explored the different aspects of anaerobic biodegrading approaches for food waste, effects of cosubstrates, effect of environmental factors, contribution of microbial population, and available computational resources for food waste management researches. PMID:28293629
Beyond Fine Tuning: Adding capacity to leverage few labels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodas, Nathan O.; Shaffer, Kyle J.; Yankov, Artem
2017-12-09
In this paper we present a technique to train neural network models on small amounts of data. Current methods for training neural networks on small amounts of rich data typically rely on strategies such as fine-tuning a pre-trained neural network or the use of domain-specific hand-engineered features. Here we take the approach of treating network layers, or entire networks, as modules, and combine pre-trained modules with untrained modules to learn the shift in distributions between data sets. The central impact of using a modular approach comes from adding new representations to a network, as opposed to replacing representations via fine-tuning. Using this technique, we are able to surpass results using standard fine-tuning transfer learning approaches, and we are also able to significantly increase performance over such approaches when using smaller amounts of data.
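A minimal sketch of the modular idea follows: keep a pre-trained module frozen and train a new, added module alongside it, rather than fine-tuning the original weights. The tiny architecture, shapes, and data are illustrative assumptions, not the paper's networks.

```python
import torch
import torch.nn as nn

# A minimal sketch of adding capacity instead of fine-tuning: a frozen
# pre-trained module contributes fixed representations, a new trainable
# module learns the shift between data sets, and a head combines both.

pretrained = nn.Sequential(nn.Linear(32, 64), nn.ReLU())  # stand-in for a trained extractor
for p in pretrained.parameters():
    p.requires_grad = False  # frozen: representations are kept, not replaced

adapter = nn.Sequential(nn.Linear(32, 64), nn.ReLU())     # new capacity, trained from scratch
head = nn.Linear(128, 2)                                  # classifier over both representations

def forward(x):
    # Concatenate old (frozen) and new (trainable) representations.
    return head(torch.cat([pretrained(x), adapter(x)], dim=-1))

params = list(adapter.parameters()) + list(head.parameters())
opt = torch.optim.Adam(params, lr=1e-3)

x, y = torch.randn(8, 32), torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(forward(x), y)
loss.backward()
opt.step()
print(float(loss))
```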
Paritosh, Kunwar; Kushwaha, Sandeep K; Yadav, Monika; Pareek, Nidhi; Chawade, Aakash; Vivekanand, Vivekanand
2017-01-01
Food wastage and its accumulation are becoming a critical problem around the globe due to continuous increase of the world population. The exponential growth in food waste is imposing serious threats to our society like environmental pollution, health risk, and scarcity of dumping land. There is an urgent need to take appropriate measures to reduce food waste burden by adopting standard management practices. Currently, various kinds of approaches are investigated in waste food processing and management for societal benefits and applications. Anaerobic digestion approach has appeared as one of the most ecofriendly and promising solutions for food wastes management, energy, and nutrient production, which can contribute to world's ever-increasing energy requirements. Here, we have briefly described and explored the different aspects of anaerobic biodegrading approaches for food waste, effects of cosubstrates, effect of environmental factors, contribution of microbial population, and available computational resources for food waste management researches.
'Mendelian randomization': an approach for exploring causal relations in epidemiology.
Gupta, V; Walia, G K; Sachdeva, M P
2017-04-01
To assess the current status of the Mendelian randomization (MR) approach in effectively influencing observational epidemiology for examining causal relationships. Narrative review of studies related to the principle, strengths, limitations, and achievements of the MR approach. Observational epidemiological studies have repeatedly produced apparently beneficial associations that were later discarded when tested by standard randomized controlled trials (RCTs). MR is a technique that is more feasible, closely parallels RCTs, and has the potential to establish causal relationships between modifiable exposures and disease outcomes. The technique uses genetic variants related to modifiable traits/exposures as instruments for detecting causal and directional associations with outcomes. In the last decade, the MR approach has developed methodologically and progressed to a stage of high acceptance among epidemiologists, and it is gradually expanding the landscape of causal relationships in non-communicable chronic diseases. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
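At its core, MR with a single variant reduces to the Wald ratio: the variant's effect on the outcome divided by its effect on the exposure. The sketch below illustrates this with invented summary statistics; the simplified standard error is a first-order approximation that ignores uncertainty in the gene-exposure estimate.

```python
# A minimal sketch of single-variant Mendelian randomization using the
# Wald ratio. All numbers are illustrative, not from any real study.

def wald_ratio(beta_gene_outcome, beta_gene_exposure, se_gene_outcome):
    """Causal effect estimate of exposure on outcome via one variant."""
    effect = beta_gene_outcome / beta_gene_exposure
    # First-order (delta method) standard error, ignoring exposure SE.
    se = se_gene_outcome / abs(beta_gene_exposure)
    return effect, se

# Hypothetical: a SNP raises the exposure by 0.30 SD and raises the
# outcome log-odds by 0.12.
effect, se = wald_ratio(0.12, 0.30, 0.02)
print(f"causal log-odds per SD exposure: {effect:.2f} (SE {se:.2f})")
```

Because the genotype is assigned at conception, this estimate is protected from the confounding and reverse causation that plague conventional observational associations, which is the analogy to randomization.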
Improving hospital weekend handover: a user-centered, standardised approach.
Mehra, Avi; Henein, Christin
2014-01-01
Clinical handover remains one of the most perilous procedures in medicine (1). Weekend handover has emerged as a key area of concern, with high variability in handover processes across hospitals (1, 2, 4, 5-10). Studying weekend handover processes within medicine at an acute teaching hospital revealed wide variability in documented content and structure. A total of 12 different pro formas were in use by the medical day team to hand over to the weekend on-call team. A Likert survey of doctors revealed that 93% felt the current handover system needed improvement, with 71% stating that it did not ensure patient safety (chi-squared, p<0.001, n=32). Semi-structured interviews of doctors identified common themes, including "a lack of consistency in approach", "poor standardization", and "high variability". To address these concerns, a standardized handover pro forma was developed using Royal College of Physicians (RCP) guidelines (2), with direct end-user input. Results following implementation revealed a considerable improvement in documented ceiling of care, urgency of task, and team member assignment, with 100% uptake of the new pro forma at both 4-week and 6-month post-implementation analyses. 88% of doctors surveyed perceived that the new pro forma improved patient safety (p<0.01, n=25), with 62% highlighting that it allowed doctors to work more efficiently. Results also revealed that 44% felt further improvements were needed, highlighting electronic solutions and handover training as main priorities. Handover briefing was subsequently incorporated into junior doctor induction, and education modules were delivered, with good feedback. Following collaboration with key stakeholders and with end-user input, integrated electronic handover software was designed and funding secured. The software is currently under final development. Introducing a standardized handover pro forma can be an effective initial step in improving weekend handover. Handover education and end-user involvement are key to improving the process. Electronic handover solutions have been shown to significantly increase the quality of handover and are worth considering (9, 10).
Hendrickson, Carolyn M; Dobbins, Sarah; Redick, Brittney J; Greenberg, Molly D; Calfee, Carolyn S; Cohen, Mitchell Jay
2015-09-01
Adherence to rigorous research protocols for identifying acute respiratory distress syndrome (ARDS) after trauma is variable. To examine how misclassification of ARDS may bias observational studies in trauma populations, we evaluated the agreement of two methods for adjudicating ARDS after trauma: the current gold standard, direct review of chest radiographs, and review of dictated radiology reports, a commonly used alternative. This nested cohort study included 123 mechanically ventilated patients between 2005 and 2008 with at least one PaO2/FIO2 less than 300 within the first 8 days of admission. Two blinded physician investigators adjudicated ARDS by two methods. The investigators directly reviewed all chest radiographs to evaluate for bilateral infiltrates. Several months later, blinded to their previous assessments, they adjudicated ARDS using a standardized rubric to classify radiology reports. A κ statistic was calculated. Regression analyses quantified the association between established risk factors as well as important clinical outcomes and ARDS determined by the aforementioned methods, as well as hypoxemia as a surrogate marker. The observed agreement between ARDS adjudicated by direct review of chest radiographs and ARDS adjudicated by review of radiology reports was κ = 0.47. Both the magnitude and direction of bias on the estimates of association between ARDS and established risk factors as well as clinical outcomes varied by method of adjudication. Classification of ARDS by review of dictated radiology reports had only moderate agreement with the current gold standard, ARDS adjudicated by direct review of chest radiographs. While the misclassification of ARDS had varied effects on the estimates of associations with established risk factors, it tended to weaken the association of ARDS with important clinical outcomes. A standardized approach to ARDS adjudication after trauma by direct review of chest radiographs will minimize misclassification bias in future observational studies. Diagnostic study, level II.
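The agreement statistic reported here, Cohen's κ, is computed from the 2x2 table of the two adjudications: observed agreement corrected for the agreement expected by chance. The sketch below uses invented counts, not the study's data.

```python
# A minimal sketch of Cohen's kappa for two binary ARDS adjudications,
# computed from a 2x2 table. Counts are illustrative only.

def cohens_kappa(a, b, c, d):
    """Kappa from a 2x2 table: a=both yes, b=yes/no, c=no/yes, d=both no."""
    n = a + b + c + d
    observed = (a + d) / n
    p_yes = ((a + b) / n) * ((a + c) / n)  # chance agreement on "yes"
    p_no = ((c + d) / n) * ((b + d) / n)   # chance agreement on "no"
    expected = p_yes + p_no
    return (observed - expected) / (1 - expected)

# 123 patients: radiograph review vs. report review (hypothetical split).
print(round(cohens_kappa(40, 15, 18, 50), 2))
```

Values near 0.4-0.6 are conventionally read as "moderate" agreement, which is why the authors conclude that report-based adjudication is an imperfect substitute for direct radiograph review.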
Modified Drop Tower Impact Tests for American Football Helmets.
Rush, G Alston; Prabhu, R; Rush, Gus A; Williams, Lakiesha N; Horstemeyer, M F
2017-02-19
A modified National Operating Committee on Standards for Athletic Equipment (NOCSAE) test method for American football helmet drop impact test standards is presented that would provide better assessment of a helmet's on-field impact performance by including a faceguard on the helmet. In this study, a merger of faceguard and helmet test standards is proposed. The need for a more robust systematic approach to football helmet testing procedures is emphasized by comparing representative results of the Head Injury Criterion (HIC), Severity Index (SI), and peak acceleration values for different helmets at different helmet locations under modified NOCSAE standard drop tower tests. Essentially, these comparative drop test results revealed that the faceguard adds a stiffening kinematic constraint to the shell that lessens total energy absorption. The current NOCSAE standard test methods can be improved to represent on-field helmet hits by attaching the faceguards to helmets and by including two new helmet impact locations (Front Top and Front Top Boss). The reported football helmet test method gives a more accurate representation of a helmet's performance and its ability to mitigate on-field impacts while promoting safer football helmets.
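Both metrics named above are functionals of the measured head acceleration pulse: the Severity Index integrates a(t)^2.5 over the pulse, and HIC maximizes a windowed average acceleration raised to the 2.5 power. The sketch below computes them for a synthetic pulse; the waveform and window length are illustrative, not NOCSAE test data.

```python
import numpy as np

# A sketch of the two injury metrics, computed from a head acceleration
# trace a(t) in g. The trace here is synthetic; real NOCSAE tests use
# measured drop-tower data.

def severity_index(t, a):
    """Gadd Severity Index: integral of a(t)^2.5 over the pulse."""
    return np.trapz(a ** 2.5, t)

def hic(t, a, max_window=0.015):
    """Head Injury Criterion: worst windowed average acceleration,
    max over (t1, t2) of [(1/(t2-t1)) * integral a dt]^2.5 * (t2-t1)."""
    best = 0.0
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            dt = t[j] - t[i]
            if dt > max_window:
                break
            avg = np.trapz(a[i:j + 1], t[i:j + 1]) / dt
            best = max(best, avg ** 2.5 * dt)
    return best

t = np.linspace(0, 0.02, 201)                 # 20 ms pulse
a = 150 * np.exp(-((t - 0.01) / 0.003) ** 2)  # ~150 g Gaussian peak
print(f"SI  = {severity_index(t, a):.0f}")
print(f"HIC = {hic(t, a):.0f}")
```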
Data Standards for Flow Cytometry
SPIDLEN, JOSEF; GENTLEMAN, ROBERT C.; HAALAND, PERRY D.; LANGILLE, MORGAN; MEUR, NOLWENN LE; OCHS, MICHAEL F.; SCHMITT, CHARLES; SMITH, CLAYTON A.; TREISTER, ADAM S.; BRINKMAN, RYAN R.
2009-01-01
Flow cytometry (FCM) is an analytical tool widely used for cancer and HIV/AIDS research, and treatment, stem cell manipulation and detecting microorganisms in environmental samples. Current data standards do not capture the full scope of FCM experiments and there is a demand for software tools that can assist in the exploration and analysis of large FCM datasets. We are implementing a standardized approach to capturing, analyzing, and disseminating FCM data that will facilitate both more complex analyses and analysis of datasets that could not previously be efficiently studied. Initial work has focused on developing a community-based guideline for recording and reporting the details of FCM experiments. Open source software tools that implement this standard are being created, with an emphasis on facilitating reproducible and extensible data analyses. As well, tools for electronic collaboration will assist the integrated access and comprehension of experiments to empower users to collaborate on FCM analyses. This coordinated, joint development of bioinformatics standards and software tools for FCM data analysis has the potential to greatly facilitate both basic and clinical research—impacting a notably diverse range of medical and environmental research areas. PMID:16901228
Hg0 and HgCl2 Reference Gas Standards: NIST Traceability ...
EPA and NIST have collaborated to establish the necessary procedures for establishing the required NIST traceability of commercially provided Hg0 and HgCl2 reference generators. This presentation will discuss the approach of a joint EPA/NIST study to accurately quantify the true concentrations of Hg0 and HgCl2 reference gases produced from high-quality, NIST-traceable, commercial Hg0 and HgCl2 generators. This presentation will also discuss the availability of HCl and Hg0 compressed reference gas standards as a result of EPA's recently approved Alternative Methods 114 and 118. Gaseous elemental mercury (Hg0) and oxidized mercury (HgCl2) reference standards are integral to the use of mercury continuous emissions monitoring systems (Hg CEMS) for regulatory compliance emissions monitoring. However, a quantitative disparity of approximately 7-10% has been observed between commercial Hg0 and HgCl2 reference gases, which currently limits the use of HgCl2 reference gas standards. Resolving this disparity would enable the expanded use of HgCl2 reference gas standards for regulatory compliance purposes.
Open Science: tools, approaches, and implications.
Neylon, Cameron; Wu, Shirley
2009-01-01
Open Science is gathering pace both as a grass roots effort amongst scientists to enable them to share the outputs of their research more effectively, and as a policy initiative for research funders to gain a greater return on their investment. In this workshop, we will discuss the current state of the art in collaborative research tools, the social challenges facing those adopting and advocating more openness, and the development of standards, policies and best practices for Open Science.
Lin, Steve; Scales, Damon C
2016-06-28
High-quality cardiopulmonary resuscitation (CPR) has been shown to improve survival outcomes after cardiac arrest. The current standard in studies evaluating CPR quality is to measure CPR process measures, for example, chest compression rate, depth, and fraction. Published studies evaluating CPR feedback devices have yielded mixed results. Newer approaches that seek to optimize CPR by measuring physiological endpoints during the resuscitation may lead to individualized patient care and improved patient outcomes.
Acupuncture for the Trauma Spectrum Response: Scientific Foundations, Challenges to Implementation
2011-01-01
The current approach to management of these injuries follows the standard medical model that attempts to isolate the pathophysiological locations and...
Mapping modern software process engineering techniques onto an HEP development environment
NASA Astrophysics Data System (ADS)
Wellisch, J. P.
2003-04-01
One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We present the CMS approach to software process improvement in this process R&D, describe lessons learned and mistakes made, and demonstrate the benefits gained and the current status of the software processes established in CMS off-line software.
Space Life Sciences at NASA: Spaceflight Health Policy and Standards
NASA Technical Reports Server (NTRS)
Davis, Jeffrey R.; House, Nancy G.
2006-01-01
In January 2004, the President proposed a new initiative, the Vision for Space Exploration. To accomplish the goals within the Vision, physicians and researchers at Johnson Space Center are establishing spaceflight health standards. These standards include fitness-for-duty (FFD) criteria, permissible exposure limits (PELs), and permissible outcome limits (POLs). POLs delineate an acceptable maximum decrement or change in a physiological or behavioral parameter as the result of exposure to the space environment. For example, a cardiovascular fitness-for-duty standard might be a measurable clinical parameter minimum that allows successful performance of all required duties. An example of a permissible exposure limit for radiation might be the quantifiable limit of exposure over a given length of time (e.g., lifetime radiation exposure). An example of a permissible outcome limit might be the length of microgravity exposure that would minimize bone loss. The purpose of spaceflight health standards is to promote operational and vehicle design requirements, aid medical decision making during space missions, and guide the development of countermeasures. Standards will be based on scientific and clinical evidence, including research findings, lessons learned from previous space missions, studies conducted in space analog environments, current standards of medical practice, risk management data, and expert recommendations. To focus the research community on the needs of exploration missions, NASA has developed the Bioastronautics Roadmap, NASA's approach to the identification of risks to human space flight; its revised baseline was released in February 2005. This document was reviewed by the Institute of Medicine in November 2004, and the final report was received in October 2005. The roadmap defines the most important research and operational needs that will be used to set policy and standards (define acceptable risk) and to implement an overall risk management and analysis process. Currently NASA is drafting spaceflight health standards for neurosensory alterations, space radiation exposure, behavioral health, muscle atrophy, cardiovascular fitness, immunological compromise, bone demineralization, and nutrition.
Beichel, Reinhard R; Van Tol, Markus; Ulrich, Ethan J; Bauer, Christian; Chang, Tangel; Plichta, Kristin A; Smith, Brian J; Sunderland, John J; Graham, Michael M; Sonka, Milan; Buatti, John M
2016-06-01
The purpose of this work was to develop, validate, and compare a highly computer-aided method for the segmentation of hot lesions in head and neck 18F-FDG PET scans. A semiautomated segmentation method was developed, which transforms the segmentation problem into a graph-based optimization problem. For this purpose, a graph structure around a user-provided approximate lesion centerpoint is constructed and a suitable cost function is derived based on local image statistics. To handle frequently occurring situations that are ambiguous (e.g., lesions adjacent to each other versus lesion with inhomogeneous uptake), several segmentation modes are introduced that adapt the behavior of the base algorithm accordingly. In addition, the authors present approaches for the efficient interactive local and global refinement of initial segmentations that are based on the "just-enough-interaction" principle. For method validation, 60 PET/CT scans from 59 different subjects with 230 head and neck lesions were utilized. All patients had squamous cell carcinoma of the head and neck. A detailed comparison with the current clinically relevant standard manual segmentation approach was performed based on 2760 segmentations produced by three experts. Segmentation accuracy measured by the Dice coefficient of the proposed semiautomated and standard manual segmentation approach was 0.766 and 0.764, respectively. This difference was not statistically significant (p = 0.2145). However, the intra- and interoperator standard deviations were significantly lower for the semiautomated method. In addition, the proposed method was found to be significantly faster and resulted in significantly higher intra- and interoperator segmentation agreement when compared to the manual segmentation approach. Lack of consistency in tumor definition is a critical barrier for radiation treatment targeting as well as for response assessment in clinical trials and in clinical oncology decision-making. The properties of the authors approach make it well suited for applications in image-guided radiation oncology, response assessment, or treatment outcome prediction.
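The accuracy figure quoted is a Dice coefficient, a simple overlap ratio between two segmentation masks: twice the intersection divided by the sum of the mask sizes. A minimal sketch with synthetic masks follows.

```python
import numpy as np

# A minimal sketch of the overlap metric used above: the Dice
# coefficient between a computed segmentation and a reference mask.

def dice(mask_a, mask_b):
    """2|A∩B| / (|A|+|B|) for boolean arrays; 1.0 means perfect overlap."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both empty: define as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

auto = np.zeros((64, 64), dtype=bool); auto[20:40, 20:40] = True
manual = np.zeros((64, 64), dtype=bool); manual[22:42, 22:42] = True
print(f"Dice = {dice(auto, manual):.3f}")
```

Note that the study's key finding is not the Dice value itself but the lower intra- and interoperator variability of the semiautomated method at the same accuracy.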
Beichel, Reinhard R.; Van Tol, Markus; Ulrich, Ethan J.; Bauer, Christian; Chang, Tangel; Plichta, Kristin A.; Smith, Brian J.; Sunderland, John J.; Graham, Michael M.; Sonka, Milan; Buatti, John M.
2016-01-01
Purpose: The purpose of this work was to develop, validate, and compare a highly computer-aided method for the segmentation of hot lesions in head and neck 18F-FDG PET scans. Methods: A semiautomated segmentation method was developed, which transforms the segmentation problem into a graph-based optimization problem. For this purpose, a graph structure around a user-provided approximate lesion centerpoint is constructed and a suitable cost function is derived based on local image statistics. To handle frequently occurring situations that are ambiguous (e.g., lesions adjacent to each other versus lesion with inhomogeneous uptake), several segmentation modes are introduced that adapt the behavior of the base algorithm accordingly. In addition, the authors present approaches for the efficient interactive local and global refinement of initial segmentations that are based on the “just-enough-interaction” principle. For method validation, 60 PET/CT scans from 59 different subjects with 230 head and neck lesions were utilized. All patients had squamous cell carcinoma of the head and neck. A detailed comparison with the current clinically relevant standard manual segmentation approach was performed based on 2760 segmentations produced by three experts. Results: Segmentation accuracy measured by the Dice coefficient of the proposed semiautomated and standard manual segmentation approach was 0.766 and 0.764, respectively. This difference was not statistically significant (p = 0.2145). However, the intra- and interoperator standard deviations were significantly lower for the semiautomated method. In addition, the proposed method was found to be significantly faster and resulted in significantly higher intra- and interoperator segmentation agreement when compared to the manual segmentation approach. Conclusions: Lack of consistency in tumor definition is a critical barrier for radiation treatment targeting as well as for response assessment in clinical trials and in clinical oncology decision-making. The properties of the authors approach make it well suited for applications in image-guided radiation oncology, response assessment, or treatment outcome prediction. PMID:27277044
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beichel, Reinhard R., E-mail: reinhard-beichel@uiowa.edu; Iowa Institute for Biomedical Imaging, University of Iowa, Iowa City, Iowa 52242; Department of Internal Medicine, University of Iowa, Iowa City, Iowa 52242
Purpose: The purpose of this work was to develop, validate, and compare a highly computer-aided method for the segmentation of hot lesions in head and neck 18F-FDG PET scans. Methods: A semiautomated segmentation method was developed, which transforms the segmentation problem into a graph-based optimization problem. For this purpose, a graph structure around a user-provided approximate lesion centerpoint is constructed and a suitable cost function is derived based on local image statistics. To handle frequently occurring situations that are ambiguous (e.g., lesions adjacent to each other versus lesion with inhomogeneous uptake), several segmentation modes are introduced that adapt the behavior of the base algorithm accordingly. In addition, the authors present approaches for the efficient interactive local and global refinement of initial segmentations that are based on the “just-enough-interaction” principle. For method validation, 60 PET/CT scans from 59 different subjects with 230 head and neck lesions were utilized. All patients had squamous cell carcinoma of the head and neck. A detailed comparison with the current clinically relevant standard manual segmentation approach was performed based on 2760 segmentations produced by three experts. Results: Segmentation accuracy measured by the Dice coefficient of the proposed semiautomated and standard manual segmentation approach was 0.766 and 0.764, respectively. This difference was not statistically significant (p = 0.2145). However, the intra- and interoperator standard deviations were significantly lower for the semiautomated method. In addition, the proposed method was found to be significantly faster and resulted in significantly higher intra- and interoperator segmentation agreement when compared to the manual segmentation approach. Conclusions: Lack of consistency in tumor definition is a critical barrier for radiation treatment targeting as well as for response assessment in clinical trials and in clinical oncology decision-making. The properties of the authors approach make it well suited for applications in image-guided radiation oncology, response assessment, or treatment outcome prediction.
Thermoelectric converters for alternating current standards
NASA Astrophysics Data System (ADS)
Anatychuk, L. I.; Taschuk, D. D.
2012-06-01
Thermoelectric converters of alternating current remain the instruments of choice for standards equipment. This work presents the results of the design and manufacture of an alternating current converter for a military standard of alternating current in Ukraine. Results of simulations of the temperature distribution in converter elements and ways of optimization to improve the accuracy of alternating current signal reproduction are presented, together with the results of metrological trials. The quality of a thermoelectric material specially created for alternating current metrology is verified. The converter was used in an alternating current standard for the frequency range from 10 Hz to 30 MHz. The efficiency of using thermoelectric signal converters in measuring instruments is confirmed.
The future of high-grade glioma: Where we are and where are we going
Rhun, Emilie Le; Taillibert, Sophie; Chamberlain, Marc C.
2015-01-01
High-grade gliomas (HGG) are optimally treated with maximum safe surgery, followed by radiotherapy (RT) and/or systemic chemotherapy (CT). Recently, the treatment of newly diagnosed anaplastic glioma (AG) has changed, particularly in patients with 1p19q codeleted tumors. Results of trials currently ongoing are likely to determine the best standard of care for patients with noncodeleted AG tumors. Trials in AG illustrate the importance of molecular characterization, which is germane to both prognosis and treatment. In contrast, efforts to improve the current standard of care for newly diagnosed glioblastoma (GB) with, for example, the addition of bevacizumab (BEV) have been largely disappointing, and furthermore molecular characterization has not changed therapy except in elderly patients. Novel approaches, such as vaccine-based immunotherapy, for newly diagnosed GB are currently being pursued in multiple clinical trials. Recurrent disease, an event inevitable in nearly all patients with HGG, continues to be a challenge. Both recurrent GB and AG are managed in a similar manner, and when feasible re-resection is often suggested, notwithstanding limited data to suggest benefit from repeat surgery. Occasional patients may be candidates for re-irradiation, but again there is a paucity of data to commend this therapy, and only a minority of selected patients are eligible for this approach. Consequently, systemic therapy continues to be the most often utilized treatment in recurrent HGG. Choice of therapy, however, varies and revolves around re-challenge with temozolomide (TMZ), use of a nitrosourea (most often lomustine; CCNU), or BEV, the most frequently used angiogenic inhibitor. Nevertheless, no clear standard recommendation regarding the preferred agent or combination of agents is available. Prognosis after progression of a HGG remains poor, with an unmet need to improve therapy. PMID:25722939
NASA Technical Reports Server (NTRS)
Lee, Hyung B.; Ghia, Urmila; Bayyuk, Sami; Oberkampf, William L.; Roy, Christopher J.; Benek, John A.; Rumsey, Christopher L.; Powers, Joseph M.; Bush, Robert H.; Mani, Mortaza
2016-01-01
Computational fluid dynamics (CFD) and other advanced modeling and simulation (M&S) methods are increasingly relied on for predictive performance, reliability and safety of engineering systems. Analysts, designers, decision makers, and project managers, who must depend on simulation, need practical techniques and methods for assessing simulation credibility. The AIAA Guide for Verification and Validation of Computational Fluid Dynamics Simulations (AIAA G-077-1998 (2002)), originally published in 1998, was the first engineering standards document available to the engineering community for verification and validation (V&V) of simulations. Much progress has been made in these areas since 1998. The AIAA Committee on Standards for CFD is currently updating this Guide to incorporate in it the important developments that have taken place in V&V concepts, methods, and practices, particularly with regard to the broader context of predictive capability and uncertainty quantification (UQ) methods and approaches. This paper will provide an overview of the changes and extensions currently underway to update the AIAA Guide. Specifically, a framework for predictive capability will be described for incorporating a wide range of error and uncertainty sources identified during the modeling, verification, and validation processes, with the goal of estimating the total prediction uncertainty of the simulation. The Guide's goal is to provide a foundation for understanding and addressing major issues and concepts in predictive CFD. However, this Guide will not recommend specific approaches in these areas as the field is rapidly evolving. It is hoped that the guidelines provided in this paper, and explained in more detail in the Guide, will aid in the research, development, and use of CFD in engineering decision-making.
Beyond the Standard Model: The pragmatic approach to the gauge hierarchy problem
NASA Astrophysics Data System (ADS)
Mahbubani, Rakhi
The current favorite solution to the gauge hierarchy problem, the Minimal Supersymmetric Standard Model (MSSM), is looking increasingly fine-tuned, as recent results from LEP-II have pushed it to regions of its parameter space where a light higgs seems unnatural. Given this fact it seems sensible to explore other approaches to this problem; we study three alternatives here. The first is a Little Higgs theory, in which the Higgs particle is realized as the pseudo-Goldstone boson of an approximate global chiral symmetry and so is naturally light. We analyze precision electroweak observables in the Minimal Moose model, one example of such a theory, and look for regions in its parameter space that are consistent with current limits on these. It is also possible to find a solution within a supersymmetric framework by adding to the MSSM superpotential a λSH_uH_d term and UV completing with new strong dynamics under which S is a composite before λ becomes non-perturbative. This allows us to increase the MSSM tree-level higgs mass bound to a value that alleviates the supersymmetric fine-tuning problem with elementary higgs fields, maintaining gauge coupling unification in a natural way. Finally we try an entirely different tack, in which we do not attempt to solve the hierarchy problem, but rather assume that the tuning of the higgs can be explained in some unnatural way, from environmental considerations for instance. With this philosophy in mind we study in detail the low-energy phenomenology of the minimal extension to the Standard Model with a dark matter candidate and gauge coupling unification, consisting of additional fermions with the quantum numbers of SUSY higgsinos, and a singlet.
Alali, Aziz S; McCredie, Victoria A; Mainprize, Todd G; Gomez, David; Nathens, Avery B
2017-10-01
Outcome after severe traumatic brain injury (TBI) differs substantially between hospitals. Explaining this variation begins with understanding the differences in structures and processes of care, particularly at intensive care units (ICUs) where acute TBI care takes place. We invited trauma medical directors (TMDs) from 187 centers participating in the American College of Surgeons Trauma Quality Improvement Program (ACS TQIP) to complete a survey. The survey domains included ICU model, type, availability of specialized units, staff, training programs, standard protocols and order sets, approach to withdrawal of life support, and perceived level of neurosurgeons' engagement in the ICU management of TBI. One hundred forty-two TMDs (76%) completed the survey. Severe TBI patients are admitted to dedicated neurocritical care units in 52 hospitals (37%), trauma ICUs in 44 hospitals (31%), general ICUs in 34 hospitals (24%), and surgical ICUs in 11 hospitals (8%). Fifty-seven percent are closed units. Board-certified intensivists directed 89% of ICUs, whereas 17% were led by neurointensivists. Sixty percent of ICU directors were general surgeons. Thirty-nine percent of hospitals had critical care fellowships and 11% had neurocritical care fellowships. Fifty-nine percent of ICUs had standard order sets and 61% had standard protocols specific for TBI, with the most common protocol relating to intracranial pressure management (53%). Only 43% of TMDs were satisfied with the current level of neurosurgeons' engagement in the ICU management of TBI; 46% believed that neurosurgeons should be more engaged; 11% believed they should be less engaged. In the largest survey of North American ICUs caring for TBI patients, there is substantial variation in the current approaches to ICU care for TBI, highlighting multiple opportunities for comparative effectiveness research.
The future of high-grade glioma: Where we are and where are we going.
Le Rhun, Emilie; Taillibert, Sophie; Chamberlain, Marc C
2015-01-01
High-grade gliomas (HGG) are optimally treated with maximum safe surgery, followed by radiotherapy (RT) and/or systemic chemotherapy (CT). Recently, the treatment of newly diagnosed anaplastic glioma (AG) has changed, particularly in patients with 1p19q codeleted tumors. Results of trials currently ongoing are likely to determine the best standard of care for patients with noncodeleted AG tumors. Trials in AG illustrate the importance of molecular characterization, which is germane to both prognosis and treatment. In contrast, efforts to improve the current standard of care for newly diagnosed glioblastoma (GB) with, for example, the addition of bevacizumab (BEV) have been largely disappointing, and furthermore molecular characterization has not changed therapy except in elderly patients. Novel approaches, such as vaccine-based immunotherapy, for newly diagnosed GB are currently being pursued in multiple clinical trials. Recurrent disease, an event inevitable in nearly all patients with HGG, continues to be a challenge. Both recurrent GB and AG are managed in a similar manner, and when feasible re-resection is often suggested, notwithstanding limited data to suggest benefit from repeat surgery. Occasional patients may be candidates for re-irradiation, but again there is a paucity of data to commend this therapy, and only a minority of selected patients are eligible for this approach. Consequently, systemic therapy continues to be the most often utilized treatment in recurrent HGG. Choice of therapy, however, varies and revolves around re-challenge with temozolomide (TMZ), use of a nitrosourea (most often lomustine; CCNU), or BEV, the most frequently used angiogenic inhibitor. Nevertheless, no clear standard recommendation regarding the preferred agent or combination of agents is available. Prognosis after progression of a HGG remains poor, with an unmet need to improve therapy.
Zhang, Yi-Fan; Gou, Ling; Zhou, Tian-Shu; Lin, De-Nan; Zheng, Jing; Li, Ye; Li, Jing-Song
2017-08-01
Chronic diseases are complex and persistent clinical conditions that require close collaboration among patients and health care providers in the implementation of long-term and integrated care programs. However, current solutions focus partially on intensive interventions at hospitals rather than on continuous and personalized chronic disease management. This study aims to fill this gap by providing computerized clinical decision support during follow-up assessments of chronically ill patients at home. We proposed an ontology-based framework to integrate patient data, medical domain knowledge, and patient assessment criteria for chronic disease patient follow-up assessments. A clinical decision support system was developed to implement this framework for automatic selection and adaptation of standard assessment protocols to suit patient personal conditions. We evaluated our method in the case study of type 2 diabetic patient follow-up assessments. The proposed framework was instantiated using real data from 115,477 follow-up assessment records of 36,162 type 2 diabetic patients. Standard evaluation criteria were automatically selected and adapted to the particularities of each patient. Assessment results were generated as a general typing of patient overall condition and detailed scoring for each criterion, providing important indicators to the case manager about possible inappropriate judgments, in addition to raising patient awareness of their disease control outcomes. Using historical data as the gold standard, our system achieved a rate of accuracy of 99.93% and completeness of 95.00%. This study contributes to improving the accessibility, efficiency and quality of current patient follow-up services. It also provides a generic approach to knowledge sharing and reuse for patient-centered chronic disease management. Copyright © 2017 Elsevier Inc. All rights reserved.
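The selection-and-adaptation step can be caricatured with a single rule. The sketch below is invented for illustration (the thresholds, fields, and rule are assumptions on my part), whereas the paper drives this logic from an ontology integrating patient data, domain knowledge, and assessment criteria.

```python
# A toy sketch of adapting a standard assessment protocol to a patient.
# Thresholds and rules are illustrative assumptions, not the paper's.

STANDARD_HBA1C_TARGET = 7.0  # percent; a common default for type 2 diabetes

def adapted_hba1c_target(age, has_cvd, long_duration):
    """Relax the glycemic criterion for older or higher-risk patients."""
    if age >= 65 or has_cvd or long_duration:
        return 8.0
    return STANDARD_HBA1C_TARGET

def assess(patient):
    target = adapted_hba1c_target(
        patient["age"], patient["cvd"], patient["duration_yr"] > 10
    )
    return {"target": target, "criterion_met": patient["hba1c"] <= target}

print(assess({"age": 72, "cvd": False, "duration_yr": 15, "hba1c": 7.6}))
# -> criterion met against the adapted 8.0% target, not the 7.0% default
```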
Geometric representation methods for multi-type self-defining remote sensing data sets
NASA Technical Reports Server (NTRS)
Anuta, P. E.
1980-01-01
Efficient and convenient representation of remote sensing data is highly important for effective utilization. The task of merging different data types is currently dealt with by treating each case as an individual problem. A description is provided of work carried out to standardize the multidata merging process. The basic concept of the new approach is that of the self-defining data set (SDDS). The creation of a standard is proposed, such that data of interest in a large number of earth resources remote sensing applications would be in a format that allows convenient and automatic merging. Attention is given to details regarding the multidata merging problem, a geometric description of multitype data sets, image reconstruction from track-type data, a data set generation system, and an example multitype data set.
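The SDDS concept can be illustrated with a file whose header describes its own layout, so a reader needs nothing outside the file to interpret the data. The format below is invented for the example and is not the paper's specification.

```python
import json, struct

# A toy self-defining data set: a length-prefixed JSON header describes
# the geometry and layout of the raw samples that follow. Header fields
# are invented for illustration.

header = {
    "type": "raster",
    "bands": ["red", "nir", "thermal"],
    "rows": 2, "cols": 2,
    "dtype": "float32",
    "geo": {"lat0": 40.42, "lon0": -86.91, "pixel_deg": 0.001},
}

pixels = [0.12, 0.45, 0.33, 0.08, 0.41, 0.30, 0.11, 0.47, 0.35, 0.09, 0.44, 0.31]

# Write: 4-byte header length, JSON header, then raw big-endian floats.
hdr = json.dumps(header).encode()
blob = struct.pack(">I", len(hdr)) + hdr + struct.pack(f">{len(pixels)}f", *pixels)

# Read back using only information contained in the file itself.
(n,) = struct.unpack_from(">I", blob, 0)
meta = json.loads(blob[4:4 + n])
count = meta["rows"] * meta["cols"] * len(meta["bands"])
data = struct.unpack_from(f">{count}f", blob, 4 + n)
print(meta["bands"], data[:3])
```

A merging program can then combine any two such files by reading their geometry descriptions, which is exactly the automatic merging the standard is meant to enable.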
Wise, Stephen A; Poster, Dianne L; Kucklick, John R; Keller, Jennifer M; Vanderpol, Stacy S; Sander, Lane C; Schantz, Michele M
2006-10-01
For the past 25 years the National Institute of Standards and Technology (NIST) has developed certified reference materials (CRMs), known as standard reference materials (SRMs), for determination of organic contaminants in environmental matrices. Assignment of certified concentrations has usually been based on combining results from two or more independent analytical methods. The first-generation environmental-matrix SRMs were issued with certified concentrations for a limited number (5 to 10) of polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs). Improvements in the analytical certification approach significantly expanded the number and classes of contaminants determined. Environmental-matrix SRMs currently available include air and diesel particulate matter, coal tar, marine and river sediment, mussel tissue, fish oil and tissue, and human serum, with concentrations typically assigned for 50 to 90 organic contaminants, for example PAHs, nitro-substituted PAHs, PCBs, chlorinated pesticides, and polybrominated diphenyl ethers (PBDEs).
Robust tuning of robot control systems
NASA Technical Reports Server (NTRS)
Minis, I.; Uebel, M.
1992-01-01
The computed torque control problem is examined for a robot arm with flexible, geared joint drive systems, which are typical of many industrial robots. The standard computed torque algorithm is not directly applicable to this class of manipulators because of the dynamics introduced by the joint drive system. The proposed approach combines a computed torque algorithm with a torque controller at each joint. Three such control schemes are proposed. The first uses the joint torque control system currently implemented on the robot arm and a novel form of the computed torque algorithm. The other two use the standard computed torque algorithm together with a novel torque control system based on model-following techniques. Standard tasks and performance indices are used to evaluate the performance of the controllers, using both numerical simulations and experiments. The study shows that all three proposed systems lead to improved tracking performance over a conventional PD controller.
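The standard computed torque law referenced above is τ = M(q)(q̈_d + K_v ė + K_p e) + h(q, q̇), where e = q_d − q and h lumps Coriolis, friction, and gravity terms. A minimal sketch follows with placeholder dynamics; the paper's contribution layers joint torque controllers on top of this law to handle the flexible, geared drives.

```python
import numpy as np

# A minimal sketch of the standard computed torque law for a rigid
# manipulator. The 2-joint "dynamics" below are placeholders, not a
# real robot model.

def computed_torque(q, qd, q_des, qd_des, qdd_des, M, h, Kp, Kv):
    """Feedback-linearizing joint torques for tracking q_des(t)."""
    e, e_dot = q_des - q, qd_des - qd
    return M(q) @ (qdd_des + Kv @ e_dot + Kp @ e) + h(q, qd)

# Toy model: constant inertia matrix, gravity/Coriolis lumped into h.
M = lambda q: np.array([[2.0, 0.3], [0.3, 1.0]])
h = lambda q, qd: np.array([0.5 * np.sin(q[0]), 0.1 * qd[1]])
Kp, Kv = np.diag([100.0, 100.0]), np.diag([20.0, 20.0])

tau = computed_torque(
    q=np.array([0.10, -0.20]), qd=np.zeros(2),
    q_des=np.array([0.15, -0.10]), qd_des=np.zeros(2), qdd_des=np.zeros(2),
    M=M, h=h, Kp=Kp, Kv=Kv,
)
print(tau)
```

With geared, flexible joints, the motor torque no longer equals the link torque, which is why the paper closes an inner torque loop at each joint before applying this outer law.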
Wei, Zhenglun Alan; Sonntag, Simon Johannes; Toma, Milan; Singh-Gryzbon, Shelly; Sun, Wei
2018-04-19
The governing international standard for the development of prosthetic heart valves is International Organization for Standardization (ISO) 5840. This standard requires the assessment of the thrombus potential of transcatheter heart valve substitutes using an integrated thrombus evaluation. Besides experimental flow field assessment and ex vivo flow testing, computational fluid dynamics is a critical component of this integrated approach. This position paper is intended to provide and discuss best practices for the setup of a computational model, numerical solving, post-processing, data evaluation, and reporting as they relate to transcatheter heart valve substitutes. This paper is not intended to be a review of current computational technology; instead, it represents the position of the ISO working group, consisting of experts from academia and industry, with regard to considerations for the computational fluid dynamics assessment of transcatheter heart valve substitutes.
Expert system verification and validation study. Delivery 3A and 3B: Trip summaries
NASA Technical Reports Server (NTRS)
French, Scott
1991-01-01
Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how these might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation.' The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.
Pain and pain management in dermatology.
Beiteke, Ulrike; Bigge, Stefan; Reichenberger, Christina; Gralow, Ingrid
2015-10-01
It is estimated that 23 million Germans suffer from chronic pain. A recent survey has revealed that 30% of chronic pain patients are dissatisfied with their pain management. Furthermore, five million Germans suffer from neuropathic pain, 20% of whom are inadequately treated. Pain is also a symptom of many dermatologic diseases, which is mostly somatic and may be classified as mild in the majority of cases. Nevertheless, research on the quality of life (QoL) has increasingly shown a marked impairment of QoL by moderate pain such as in psoriatic arthritis. Severe pain is associated with herpes zoster (shingles), leg ulcers, and pyoderma gangrenosum. This article addresses the basics of pain classification and, in a short excerpt, pain transduction/transmission and modulation. The use of standardized diagnostic scales is recommended for the purpose of recording and monitoring pain intensity, which allows for the optimization of therapy and consistent interdisciplinary communication. Any dermatology residency program includes the acquisition of knowledge and skills in pain management. This review therefore aims to present fundamental therapeutic concepts based on the expanded WHO analgesic ladder, and describes a step-wise therapeutic approach and combination therapies. The article focuses on the pain management of the above-mentioned severely painful, conservatively treated dermatoses. Besides well-established therapeutic agents and current therapeutic standards, it discusses specific options based on guidelines (where available). Current knowledge on peri- and postoperative pain management is briefly outlined. This article addresses: ▸ the fundamentals of the classification and neurophysiology of pain; ▸ standards for pain documentation in children and adults; ▸ general standards for pharmaceutical pain management; ▸ current specific treatment options for postherpetic neuralgia, leg ulcers, and pyoderma gangrenosum in conjunction with the expanded WHO analgesic ladder. © 2015 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.
Towards improving the NASA standard soil moisture retrieval algorithm and product
NASA Astrophysics Data System (ADS)
Mladenova, I. E.; Jackson, T. J.; Njoku, E. G.; Bindlish, R.; Cosh, M. H.; Chan, S.
2013-12-01
Soil moisture mapping using passive microwave remote sensing techniques has proven to be one of the most effective ways of acquiring reliable global soil moisture information on a routine basis. An important step in this direction was made by the launch of the Advanced Microwave Scanning Radiometer (AMSR-E) on NASA's Earth Observing System Aqua satellite. Along with the standard NASA algorithm and operational AMSR-E product, the easy access and availability of the AMSR-E data promoted the development and distribution of alternative retrieval algorithms and products. Several evaluation studies have demonstrated issues with the standard NASA AMSR-E product, such as a dampened temporal response and a limited range of the final retrievals, and noted that the available global passive-based algorithms, even though based on the same electromagnetic principles, produce different results in terms of accuracy and temporal dynamics. Our goal is to identify the theoretical causes of the reduced sensitivity of the NASA AMSR-E product and outline ways to improve the operational NASA algorithm, if possible. Properly identifying the underlying reasons for the above-mentioned features of the NASA AMSR-E product and the differences between the alternative algorithms requires a careful examination of the theoretical basis of each approach, specifically the simplifying assumptions and parametrization approaches adopted by each algorithm to reduce the dimensionality of the unknowns and characterize the observing system. Statistically based error analyses, which are useful and necessary, provide information on the relative accuracy of each product but give very little information on the theoretical causes, knowledge that is essential for algorithm improvement. Thus, we are currently examining the possibility of improving the standard NASA AMSR-E global soil moisture product by conducting a thorough theoretically based review of, and inter-comparisons between, several well-established global retrieval techniques. A detailed discussion focused on the theoretical basis of each approach and each algorithm's sensitivity to assumptions and parametrization approaches will be presented. USDA is an equal opportunity provider and employer.
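For context, most of these passive retrieval algorithms start from the zeroth-order tau-omega radiative transfer model (standard form, not specific to any one algorithm):

$$T_{B,p} = T_s\,e_p\,\Gamma + T_c\,(1-\omega)\,(1-\Gamma)\left(1 + r_p\,\Gamma\right), \qquad \Gamma = e^{-\tau/\cos\theta},$$

where $T_s$ and $T_c$ are soil and canopy temperatures, $e_p = 1 - r_p$ is the rough-soil emissivity at polarization $p$ (tied to soil moisture through the soil dielectric constant), $\tau$ is the vegetation optical depth, and $\omega$ the single-scattering albedo. The algorithms differ mainly in how they parametrize $\tau$, $\omega$, surface roughness, and temperature, which is exactly where the simplifying assumptions discussed above enter.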
The role of quality assurance in future midwifery practice.
Dawson, J
1993-08-01
Recent recommendations have been made which would give midwives a more central role in maternity care and a greater degree of independence than they currently enjoy. This paper argues that midwives' current attitudes to quality assurance are incompatible with this enhanced role. Research conducted in three health districts is described, which explored the perceptions of nurses, midwives and managers towards quality assurance. The findings indicate that quality assurance (in whatever form that concept is operationalized) is a demonstration of accountability. For managers this accountability is primarily for the service as a whole, whilst nurses and midwives view their accountability as being owed to patients/clients. The main methodology which the study identified as being used for monitoring nursing care was the development and auditing of explicit standards. This approach has been actively promoted by the Royal College of Nursing, enabling nurses to regain control of the purely professional aspects of the nursing profession. Midwives in the study districts showed a marked reluctance to adopt such a strategy, taking the view that as independent practitioners consensus standards would be unacceptable. It is argued that this attitude is inconsistent with the basic principle that professionals are accountable for both demonstrating and developing the quality of professional practice. It is further suggested that midwives currently have an opportunity to regain professional control of midwifery practice, which will be lost unless they are prepared to take responsibility for evaluating the standards for which they are accountable.
A systematic review of approaches to refeeding in patients with anorexia nervosa.
Garber, Andrea K; Sawyer, Susan M; Golden, Neville H; Guarda, Angela S; Katzman, Debra K; Kohn, Michael R; Le Grange, Daniel; Madden, Sloane; Whitelaw, Melissa; Redgrave, Graham W
2016-03-01
Given the importance of weight restoration for recovery in patients with anorexia nervosa (AN), we examined approaches to refeeding in adolescents and adults across treatment settings. Systematic review of PubMed, PsycINFO, Scopus, and Clinical Trials databases (1960-2015) using the terms refeeding, weight restoration, hypophosphatemia, anorexia nervosa, anorexia, and anorexic. Of 948 screened abstracts, 27 met the inclusion criteria: participants had AN; a reproducible refeeding approach; and weight gain, hypophosphatemia, or cognitive/behavioral outcomes. Twenty-six studies (96%) were observational/prospective or retrospective and performed in hospital. Twelve studies published since 2010 examined approaches starting with higher calories than currently recommended (≥1400 kcal/d). The evidence supports 8 conclusions: 1) In mildly and moderately malnourished patients, lower calorie refeeding is too conservative; 2) Both meal-based approaches and combined nasogastric+meals can administer higher calories; 3) Higher calorie refeeding has not been associated with increased risk for the refeeding syndrome under close medical monitoring with electrolyte correction; 4) In severely malnourished inpatients, there is insufficient evidence to change the current standard of care; 5) Parenteral nutrition is not recommended; 6) Nutrient compositions within recommended ranges are appropriate; 7) More research is needed in non-hospital settings; 8) The long-term impact of different approaches is unknown. Findings support higher calorie approaches to refeeding in mildly and moderately malnourished patients under close medical monitoring; however, the safety, long-term outcomes, and feasibility outside of hospital have not been established. Further research is also needed on refeeding approaches in severely malnourished patients, methods of delivery, nutrient compositions, and treatment settings. © 2015 Wiley Periodicals, Inc.
Risk-based audit selection of dairy farms.
van Asseldonk, M A P M; Velthuis, A G J
2014-02-01
Dairy farms are audited in the Netherlands on numerous process standards. Each farm is audited once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of poor process standards. To select farms for an audit that present higher risks, a statistical analysis was conducted to test the relationship between the outcome of farm audits and bulk milk laboratory results before the audit. The analysis comprised 28,358 farm audits and all conducted laboratory tests of bulk milk samples 12 mo before the audit. The overall outcome of each farm audit was classified as approved or rejected. Laboratory results included somatic cell count (SCC), total bacterial count (TBC), antimicrobial drug residues (ADR), level of butyric acid spores (BAB), freezing point depression (FPD), level of free fatty acids (FFA), and cleanliness of the milk (CLN). The bulk milk laboratory results were significantly related to audit outcomes. Rejected audits are likely to occur on dairy farms with higher mean levels of SCC, TBC, ADR, and BAB. Moreover, in a multivariable model, maxima for TBC, SCC, and FPD as well as standard deviations for TBC and FPD are risk factors for negative audit outcomes. The efficiency curve of a risk-based selection approach, on the basis of the derived regression results, dominated the current random selection approach. To capture 25, 50, or 75% of the population with poor process standards (i.e., audit outcome of rejected), respectively, only 8, 20, or 47% of the population had to be sampled based on a risk-based selection approach. Milk quality information can thus be used to preselect high-risk farms to be audited more frequently. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
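As an illustration of risk-based selection of this kind (a minimal sketch on synthetic data, not the authors' model; the feature names stand in for the paper's bulk-milk indicators such as SCC, TBC, and FPD):

```python
# Rank farms by the predicted probability of a rejected audit and
# sample the top of the list first.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))            # e.g. mean/max SCC, TBC, FPD
p = 1 / (1 + np.exp(-(X @ [1.2, 0.8, 0.5, 0.3, 0.2])))
y = rng.binomial(1, p)                    # 1 = audit rejected

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]

order = np.argsort(-risk)                 # audit highest-risk farms first
top20 = order[: int(0.20 * len(order))]
print(f"rejections captured in top 20%: {y[top20].sum() / y.sum():.0%}")
```

The efficiency curve reported in the abstract is exactly this trade-off: the fraction of rejected audits captured as a function of the fraction of the population sampled, compared against random selection.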
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there exists no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
Integrated Medical Model (IMM) 4.0 Verification and Validation (VV) Testing (HRP IWS 2016)
NASA Technical Reports Server (NTRS)
Walton, M; Kerstman, E.; Arellano, J.; Boley, L.; Reyes, D.; Young, M.; Garcia, Y.; Saile, L.; Myers, J.
2016-01-01
Timeline, partial treatment, and alternate medication capabilities were added to the IMM to improve the fidelity of the model and enhance its decision support capabilities. Using standard design reference missions, IMM VV testing compared outputs from the current operational IMM (v3) with those from the model with the added functionalities (v4). These new capabilities were examined in a comparative, stepwise approach as follows: a) comparison of the current operational IMM v3 with the enhanced functionality of timeline alone (IMM 4.T), b) comparison of IMM 4.T with timeline and partial treatment (IMM 4.TPT), and c) comparison of IMM 4.TPT with timeline, partial treatment, and alternate medication (IMM 4.0).
The Pot Calling the Kettle Black? A Comparison of Measures of Current Tobacco Use
ROSENMAN, ROBERT
2014-01-01
Researchers often use the discrepancy between self-reported and biochemically assessed active smoking status to argue that self-reported smoking status is not reliable, ignoring the limitations of biochemically assessed measures and treating them as the gold standard in their comparisons. Here, we employ econometric techniques to compare the accuracy of self-reported and biochemically assessed current tobacco use, taking into account measurement errors in both methods. Our approach allows estimating and comparing the sensitivity and specificity of each measure without directly observing true smoking status. The results, robust to several alternative specifications, suggest that there is no clear reason to think that one measure dominates the other in accuracy. PMID:25587199
NASA Astrophysics Data System (ADS)
Ahmed, Shamim; Miorelli, Roberto; Calmon, Pierre; Anselmi, Nicola; Salucci, Marco
2018-04-01
This paper describes a Learning-By-Examples (LBE) technique for performing quasi-real-time flaw localization and characterization within a conductive tube based on Eddy Current Testing (ECT) signals. Within the framework of LBE, the combination of full-factorial (i.e., GRID) sampling and Partial Least Squares (PLS) feature extraction (i.e., GRID-PLS) techniques is applied to generate a suitable training set in the offline phase. Support Vector Regression (SVR) is utilized for model development and inversion during the offline and online phases, respectively. The performance and robustness of the proposed GRID-PLS/SVR strategy on a noisy test set is evaluated and compared with the standard GRID/SVR approach.
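A minimal sketch of such an offline/online LBE pipeline, assuming scikit-learn and synthetic data in place of the simulated ECT training set:

```python
# Offline phase: compress raw ECT signals with PLS, then train an SVR
# on the extracted features (here, predicting flaw position).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
signals = rng.normal(size=(256, 400))    # 256 simulated ECT scans
flaw_pos = rng.uniform(0, 1, size=256)   # target: flaw location

model = make_pipeline(PLSRegression(n_components=8), SVR(C=10.0))
model.fit(signals, flaw_pos)

# Online phase: invert a new measurement in quasi real time.
print(model.predict(signals[:1]))
```

The PLS step plays the dimensionality-reduction role described in the abstract; the trained SVR is then cheap enough to evaluate online.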
McLaren, Donald G.; Ries, Michele L.; Xu, Guofan; Johnson, Sterling C.
2012-01-01
Functional MRI (fMRI) allows one to study task-related regional responses and to perform task-dependent connectivity analysis using psychophysiological interaction (PPI) methods. The latter affords the additional opportunity to understand how brain regions interact in a task-dependent manner. The current implementation of PPI in Statistical Parametric Mapping (SPM8) is configured primarily to assess connectivity differences between two task conditions, whereas in practice fMRI tasks frequently employ more than two conditions. Here we evaluate how a generalized form of context-dependent PPI (gPPI; http://www.nitrc.org/projects/gppi), which is configured to automatically accommodate more than two task conditions in the same PPI model by spanning the entire experimental space, compares to the standard implementation in SPM8. These comparisons are made using both simulations and an empirical dataset. In the simulated dataset, we compare the interaction beta estimates to their expected values and assess model fit using the Akaike Information Criterion (AIC). We found that interaction beta estimates in gPPI were robust to different simulated data models, were not different from the expected beta value, and had better model fits than when using standard PPI (sPPI) methods. In the empirical dataset, we compare the model fit of the gPPI approach to sPPI. We found that the gPPI approach improved model fit compared to sPPI. Several regions became non-significant with gPPI; these regions all showed significantly better model fits with gPPI. There were also several regions where task-dependent connectivity was only detected using gPPI methods, again with improved model fit. Regions that were detected with all methods had more similar model fits. These results suggest that gPPI may have greater sensitivity and specificity than the standard implementation in SPM. This notion is tempered slightly as there is no gold standard; however, data simulations with a known outcome support our conclusions about gPPI. In sum, the generalized form of context-dependent PPI has increased flexibility of statistical modeling, and potentially improves model fit, specificity to true negative findings, and sensitivity to true positive findings. PMID:22484411
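A toy illustration of the core idea (not the SPM/gPPI code itself; HRF convolution and deconvolution are deliberately omitted): one interaction regressor is formed per task condition rather than a single two-condition contrast regressor.

```python
import numpy as np

n_scans, conditions = 200, ["A", "B", "C"]
rng = np.random.default_rng(2)
seed = rng.normal(size=n_scans)                     # seed region timecourse
onsets = {c: (rng.random(n_scans) < 0.3).astype(float) for c in conditions}

# gPPI-style design: psychological, interaction, and seed regressors,
# with the interaction terms spanning the whole experimental space.
design = {f"psy_{c}": onsets[c] for c in conditions}
design.update({f"ppi_{c}": seed * onsets[c] for c in conditions})
design["seed"] = seed

X = np.column_stack(list(design.values()))          # one column per regressor
y = rng.normal(size=n_scans)                        # stand-in voxel timecourse
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(design.keys(), beta.round(2))))
```

With three conditions, the standard two-condition implementation would have to collapse or omit part of this space, which is the mis-specification the gPPI comparison targets.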
A novel approach to Hough Transform for implementation in fast triggers
NASA Astrophysics Data System (ADS)
Pozzobon, Nicola; Montecassiano, Fabio; Zotto, Pierluigi
2016-10-01
Telescopes of position-sensitive detectors are common layouts in charged particle tracking, and programmable logic devices, such as FPGAs, represent a viable choice for the real-time reconstruction of track segments in such detector arrays. A compact implementation of the Hough Transform for fast triggers in High Energy Physics, exploiting a parameter reduction method, is proposed, targeting a reduction of the required storage or computing resources in current or near-future state-of-the-art FPGA devices, while retaining high resolution over a wide range of track parameters. The proposed approach is compared to a standard Hough Transform, with particular emphasis on their application to muon detectors. In both cases, an original readout implementation is modeled.
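For contrast, a minimal standard Hough Transform in the usual (theta, rho) parameterization looks as follows (a software sketch, not the proposed FPGA implementation):

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=128, rho_max=100.0):
    """Accumulate votes in (theta, rho) space for straight-line candidates."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=np.int32)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)   # one sinusoid per hit
        bins = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (bins >= 0) & (bins < n_rho)
        acc[np.arange(n_theta)[ok], bins[ok]] += 1
    return thetas, acc

# Hits on the line y = x (peak expected at theta = 3*pi/4, rho = 0):
pts = [(t, t) for t in range(-20, 21, 4)]
thetas, acc = hough_lines(pts)
i, j = np.unravel_index(acc.argmax(), acc.shape)
print(f"best theta ~ {thetas[i]:.2f} rad")
```

The accumulator scales as n_theta x n_rho, which is precisely the storage cost the parameter-reduction method in the paper aims to shrink for FPGA deployment.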
[Current strategy in PCI for CTO].
Asakura, Yasushi
2011-02-01
Recently, CTO PCI has come into wide use all over the world and has become standardized. The first step is an antegrade approach using a single wire. The second strategy is the parallel wire technique, and the next is a retrograde approach. In this method, retrograde wiring with a Corsair microcatheter is performed first. If it is successful, externalization is established using a 300 cm wire, and this system provides strong back-up support. If it fails, the reverse CART technique is the next step. IVUS-guided wiring is a last resort, in which the second wire is manipulated under IVUS guidance. Currently, the initial success rate is more than 90% with these methods.
Don't panic--prepare: towards crisis-aware models of emergency department operations.
Ceglowski, Red; Churilov, Leonid; Wasserheil, Jeff
2005-12-01
The existing models of Emergency Department (ED) operations that are based on the "flow-shop" management logic do not provide adequate decision support in dealing with ED overcrowding crises. A conceptually different, crisis-aware approach to ED modelling and operational decision support is introduced in this paper. It is based on Perrow's theory of "normal accidents" and calls for recognizing the inevitable nature of ED overcrowding crises within the current health system setup. Managing the crisis before it happens--a standard approach in the crisis management field--should become an integral part of ED operations management. The potential implications of adopting such a crisis-aware perspective for health services research and ED management are outlined.
Consensus Treatment Plans for New-Onset Systemic Juvenile Idiopathic Arthritis
DeWitt, Esi Morgan; Kimura, Yukiko; Beukelman, Timothy; Nigrovic, Peter A.; Onel, Karen; Prahalad, Sampath; Schneider, Rayfel; Stoll, Matthew L.; Angeles-Han, Sheila; Milojevic, Diana; Schikler, Kenneth N.; Vehe, Richard K.; Weiss, Jennifer E.; Weiss, Pamela; Ilowite, Norman T.; Wallace, Carol A.
2012-01-01
Objective There is wide variation in therapeutic approaches to systemic juvenile idiopathic arthritis (sJIA) among North American rheumatologists. Understanding the comparative effectiveness of the diverse therapeutic options available for treatment of sJIA can result in better health outcomes. The Childhood Arthritis and Rheumatology Research Alliance (CARRA) developed consensus treatment plans and standardized assessment schedules for use in clinical practice to facilitate such studies. Methods Case-based surveys were administered to CARRA members to identify prevailing treatments for new-onset sJIA. A 2-day consensus conference in April 2010 employed a modified nominal group technique to formulate preliminary treatment plans and determine important data elements for collection. Follow-up surveys were employed to refine the plans and assess clinical acceptability. Results The initial case-based survey identified significant variability among current treatment approaches for new-onset sJIA, underscoring the utility of standardized plans to evaluate comparative effectiveness. We developed four consensus treatment plans for the first 9 months of therapy, as well as case definitions and clinical and laboratory monitoring schedules. The four treatment regimens included glucocorticoids only, or therapy with methotrexate, anakinra, or tocilizumab, with or without glucocorticoids. This approach was approved by >78% of the CARRA membership. Conclusion Four standardized treatment plans were developed for new-onset sJIA. Coupled with data collection at defined intervals, use of these treatment plans will create the opportunity to evaluate comparative effectiveness in an observational setting to optimize initial management of sJIA. PMID:22290637
Tolomeo, Concettina Tina; Major, Nili E; Szondy, Mary V; Bazzy-Asaad, Alia
At our institution, there is a six-bed Pediatric Respiratory Care Unit for technology-dependent infants and children with a tracheostomy tube. A lack of consistency in patient care and parent/guardian education prompted our group to critically evaluate the services we provided by revisiting our teaching protocol and instituting a new model of care in the Unit. The aims of this quality improvement (QI) project were to standardize care and skills proficiency training for parents of infants with a tracheostomy tube in preparation for discharge to home. After conducting a current-state survey of key unit stakeholders, we initiated a multidisciplinary QI project to answer the question: could a standardized approach to care and training lead to a decrease in parent/guardian training time, a decrease in length of stay, and/or an increase in developmental interventions for infants with tracheostomy tubes? A convenience sample of infants with a tracheostomy tube admitted to the Pediatric Respiratory Care Unit was included in the study. Descriptive statistics were used to analyze the results. Through this QI approach, we were able to decrease the time required by parents to achieve proficiency in the care of a technology-dependent infant, decrease the length of stay for these infants, and increase referrals of the infants for developmental assessment. These outcomes have implications for how to approach deficiencies in patient care and make changes that lead to sustained improvements. Copyright © 2016 Elsevier Inc. All rights reserved.
Damas, S; Wilkinson, C; Kahana, T; Veselovskaya, E; Abramov, A; Jankauskas, R; Jayaprakash, P T; Ruiz, E; Navarro, F; Huete, M I; Cunha, E; Cavalli, F; Clement, J; Lestón, P; Molinero, F; Briers, T; Viegas, F; Imaizumi, K; Humpire, D; Ibáñez, O
2015-12-01
Craniofacial superimposition, although in existence for a century, is still a controversial technique within the scientific community. Objective and unbiased validation studies over a significant number of cases are required to establish a more solid picture of its reliability. However, there is a lack of protocols and standards in the application of the technique, leading to contradictory information concerning reliability. Instead of following a uniform methodology, every expert tends to apply his or her own approach to the problem, based on the available technology and deep knowledge of human craniofacial anatomy, soft tissues, and their relationships. The aim of this study was to assess the reliability of different craniofacial superimposition methodologies and the corresponding technical approaches to this type of identification. With all the data generated, some of the most representative experts in craniofacial identification joined in a discussion intended to identify and agree on the most important issues that have to be considered to properly employ the craniofacial superimposition technique. As a consequence, the consortium has produced the current manuscript, which can be considered the first standard in the field, including good and bad practices, sources of error and uncertainty, technological requirements and desirable features, and finally a common scale for craniofacial matching evaluation. This document is intended to be part of a more complete framework for craniofacial superimposition, to be developed during the FP7-funded project MEPROCS, which will favour and standardize its proper application. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
7 CFR 220.8 - Nutrition standards and menu planning approaches for breakfasts.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 4 2011-01-01 2011-01-01 false Nutrition standards and menu planning approaches for... NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS SCHOOL BREAKFAST PROGRAM § 220.8 Nutrition standards and menu planning approaches for breakfasts. (a) What are the nutrition standards for...
7 CFR 220.8 - Nutrition standards and menu planning approaches for breakfasts.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 4 2012-01-01 2012-01-01 false Nutrition standards and menu planning approaches for... NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS SCHOOL BREAKFAST PROGRAM § 220.8 Nutrition standards and menu planning approaches for breakfasts. (a) What are the nutrition standards for...
7 CFR 220.8 - Nutrition standards and menu planning approaches for breakfasts.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 4 2010-01-01 2010-01-01 false Nutrition standards and menu planning approaches for... NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS SCHOOL BREAKFAST PROGRAM § 220.8 Nutrition standards and menu planning approaches for breakfasts. (a) What are the nutrition standards for...
A review of the quantum current standard
NASA Astrophysics Data System (ADS)
Kaneko, Nobu-Hisa; Nakamura, Shuji; Okazaki, Yuma
2016-03-01
The electric current, voltage, and resistance standards are the most important standards related to electricity and magnetism. Of these three standards, only the ampere, which is the unit of electric current, is an International System of Units (SI) base unit. However, even with modern technology, relatively large uncertainty exists regarding the generation and measurement of current. As a result of various innovative techniques based on nanotechnology and novel materials, new types of junctions for quantum current generation and single-electron current sources have recently been proposed. These newly developed methods are also being used to investigate the consistency of the three quantum electrical effects, i.e. the Josephson, quantum Hall, and single-electron tunneling effects, which are also known as ‘the quantum metrology triangle’. This article describes recent research and related developments regarding current standards and quantum-metrology-triangle experiments.
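For reference, the three quantum electrical effects relate frequency to voltage, resistance, and current through fundamental constants (standard relations, summarized here):

$$V = \frac{n\,f_{\mathrm{J}}}{K_{\mathrm{J}}},\quad K_{\mathrm{J}} = \frac{2e}{h}; \qquad R = \frac{R_{\mathrm{K}}}{i},\quad R_{\mathrm{K}} = \frac{h}{e^{2}}; \qquad I = e\,f_{\mathrm{SET}}.$$

Closing the triangle through Ohm's law, $I = V/R$, gives $e\,f_{\mathrm{SET}} = \tfrac{1}{2}\,n\,i\,e\,f_{\mathrm{J}}$, i.e., a consistency check expressed purely in terms of measured frequencies and integer quantum numbers; any significant deviation would signal a problem with one of the three effects.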
Clinical challenges in thyroid disease: Time for a new approach?
Juby, A G; Hanly, M G; Lukaczer, D
2016-05-01
Thyroid disease is common, and its prevalence is rising. Traditional diagnosis and monitoring rely on thyroid stimulating hormone (TSH) levels. This does not always result in improvement of hypothyroid symptoms, to the disappointment of both patients and physicians. A non-traditional therapeutic approach would include evaluation of GI function as well as a dietary history and micronutrient evaluation. This approach also includes assessment of thyroid peroxidase (TPO) antibodies and T3, T4, and reverse T3 levels, and in some cases may require specific T3 supplementation in addition to standard T4 therapy. Both high and low TSH levels on treatment are associated with particular medical risks: in the case of high TSH these are primarily cardiac, whereas for low TSH they predominantly concern bone health. This article discusses these important clinical issues in more detail, with some practical tips, especially for an approach to the "non-responders" to the current traditional therapeutic approach. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Abuhamad, Alfred; Zhao, Yili; Abuhamad, Sharon; Sinkovskaya, Elena; Rao, Rashmi; Kanaan, Camille; Platt, Lawrence
2016-01-01
This study aims to validate the feasibility and accuracy of a new standardized six-step approach to the performance of the focused basic obstetric ultrasound examination, and to compare the new approach to the regular approach performed in the scheduled obstetric ultrasound examination. The new standardized six-step approach, which evaluates fetal presentation, fetal cardiac activity, presence of multiple pregnancy, placental localization, amniotic fluid volume, and biometric measurements, was prospectively performed on 100 pregnant women between 18(+0) and 27(+6) weeks of gestation and another 100 pregnant women between 28(+0) and 36(+6) weeks of gestation. The agreement of findings for each of the six steps was evaluated against the regular approach. In all ultrasound examinations performed, substantial to perfect agreement (kappa values between 0.64 and 1.00) was observed between the new standardized six-step approach and the regular approach. The new standardized six-step approach to the focused basic obstetric ultrasound examination can be performed successfully and accurately between 18(+0) and 36(+6) weeks of gestation. This standardized approach can be of significant benefit in limited-resource settings and in point-of-care obstetric ultrasound applications.
Low Resolution Picture Transmission (LRPT) Demonstration System. Phase II; 1.0
NASA Technical Reports Server (NTRS)
Fong, Wai; Yeh, Pen-Shu; Duran, Steve; Sank, Victor; Nyugen, Xuan; Xia, Wei; Day, John H. (Technical Monitor)
2002-01-01
Low-Resolution Picture Transmission (LRPT) is a proposed standard for direct broadcast transmission of satellite weather images. The standard is a joint effort by the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT) and NOAA. As a digital transmission scheme, its purpose is to replace the current analog Automatic Picture Transmission (APT) system for use on the Meteorological Operational (METOP) satellites. GSFC has been tasked to build an LRPT Demonstration System (LDS). Its main objective is to develop and demonstrate the feasibility of a low-cost receiver utilizing a PC as the primary processing component and to determine the performance of the protocol in a simulated Radio Frequency (RF) environment. The approach consists of two phases.
Design techniques for low-voltage analog integrated circuits
NASA Astrophysics Data System (ADS)
Rakús, Matej; Stopjaková, Viera; Arbet, Daniel
2017-08-01
In this paper, a review and analysis of different design techniques for (ultra) low-voltage integrated circuits (ICs) is performed. This analysis shows that the most suitable design methods for low-voltage analog IC design in a standard CMOS process include techniques using bulk-driven MOS transistors, dynamic-threshold MOS transistors, and MOS transistors operating in the weak or moderate inversion regions. The main advantage of such techniques is that there is no need for any modification of the standard CMOS structure or process. Basic circuit building blocks like differential amplifiers or current mirrors designed using these approaches are able to operate with a power supply voltage of 600 mV (or even lower), which is the key feature for integrated systems in modern portable applications.
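For context, weak-inversion operation exploits the exponential subthreshold characteristic of the MOS transistor (standard long-channel form):

$$I_D \approx I_0\,\frac{W}{L}\,\exp\!\left(\frac{V_{GS}-V_{TH}}{n\,V_T}\right)\left(1 - \exp\!\left(-\frac{V_{DS}}{V_T}\right)\right), \qquad V_T = \frac{kT}{q} \approx 26\ \text{mV at room temperature},$$

so useful transconductance is available at gate overdrives at or below zero, which is what makes sub-1 V supplies workable; the trade-offs are reduced speed and increased sensitivity to device mismatch.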
The price of palliative care: toward a complete accounting of costs and benefits.
Boni-Saenz, Alexander A; Dranove, David; Emanuel, Linda L; Lo Sasso, Anthony T
2005-02-01
In this article, currently accepted standards for cost-benefit analysis of health care interventions are outlined, and a framework to evaluate palliative care within these standards is provided. Recent publications on the economic implications of palliative care are reviewed, which are only the "tip of the iceberg" of the potential costs and benefits. Using this framework, the authors offer guidelines for performing comprehensive cost-benefit analyses of palliative care and conclude that many of the issues beneath the surface may be substantial and deserving of closer scrutiny. Methods for gathering relevant cost-benefit information are detailed, along with potential obstacles to implementation. This approach is applicable to palliative care in general, including palliative care for elders.
A single blue nanorod light emitting diode.
Hou, Y; Bai, J; Smith, R; Wang, T
2016-05-20
We report a light emitting diode (LED) consisting of a single InGaN/GaN nanorod fabricated by a cost-effective top-down approach from a standard LED wafer. The device demonstrates high performance with a reduced quantum-confined Stark effect compared with a standard planar counterpart fabricated from the same wafer, confirmed by optical and electrical characterization. A current density as high as 5414 A cm⁻² is achieved without significant damage to the device, owing to the high internal quantum efficiency. The efficiency droop is mainly ascribed to Auger recombination, which was studied using an ABC model. Our work provides a potential method for fabricating compact light sources for advanced photonic integrated circuits without involving expensive or time-consuming fabrication facilities.
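The ABC model referenced here expresses internal quantum efficiency as a function of carrier density $n$ (standard form):

$$\mathrm{IQE}(n) = \frac{B n^{2}}{A n + B n^{2} + C n^{3}},$$

where $A$, $B$, and $C$ are the Shockley-Read-Hall, radiative, and Auger recombination coefficients, respectively. Droop at high current density follows from the $C n^{3}$ term outgrowing $B n^{2}$, which is why Auger recombination is implicated in the analysis above.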
Use of short-term toxicity data for prediction of long-term health effects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartley, W.R.; Ohanian, E.V.
1988-01-01
Under the Safe Drinking Water Act Amendments of 1986, the US Environmental Protection Agency determines Maximum Contaminant Level Goals (MCLGs) and enforceable Maximum Contaminant Levels (MCLs) or provides lifetime health advisories (HAs) in the absence of regulatory standards. The critical value for calculation of the lifetime level is the reference dose (RfD). The RfD is an estimate of a lifetime dose which is likely to be without significant risk to human populations. The RfD is determined by dividing the no-observed-adverse-effect level (NOAEL) or the lowest-observed-adverse-effect level (LOAEL) by an uncertainty factor (UF). The NOAEL or LOAEL is determined from toxicological or epidemiological studies. For many chemicals, human toxicological or epidemiological data are not available, and chronic mammalian studies are sometimes unavailable. Faced with the need to provide guidance for the increasing number of chemicals threatening our drinking water sources, this paper considers the possibility of providing provisional RfDs using data from toxicological studies of less than ninety days' duration. The current UF approach is reviewed along with some proposed mathematical models for extrapolation of NOAELs from dose-response data. The current UF approach to developing the RfD is protective and conservative. More research is needed on the relationship of short- and long-term toxicity data to improve our current approach.
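The calculation itself is straightforward (standard formula; the example numbers below are illustrative, not taken from this report):

$$\mathrm{RfD} = \frac{\mathrm{NOAEL\ (or\ LOAEL)}}{UF}$$

For instance, a 90-day study NOAEL of 10 mg/kg/day combined with a composite UF of 1,000 (10 for interspecies extrapolation × 10 for intraspecies variability × 10 for subchronic-to-chronic extrapolation) would yield a provisional RfD of 0.01 mg/kg/day.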
Tomás-Callejas, Alejandro; López-Velasco, Gabriela; Valadez, Angela M; Sbodio, Adrian; Artés-Hernández, Francisco; Danyluk, Michelle D; Suslow, Trevor V
2012-02-01
Standard postharvest unit operations that rely on copious water contact, such as fruit unloading and washing, approach the criteria for a true critical control point in fresh tomato production. Performance data for approved sanitizers that reflect commercial systems are needed to set standards for audit compliance. This study was conducted to evaluate the efficacy of chlorine dioxide (ClO₂) for water disinfection as an objective assessment of recent industry-adopted standards for dump tank and flume management in fresh tomato packing operations. On-site assessments were conducted during eight temporally distinct shifts in two Florida packinghouses and one California packinghouse. Microbiological analyses of incoming and washed fruit and of dump and flume system water were evaluated. Water temperature, pH, turbidity, conductivity, and oxidation-reduction potential (ORP) were monitored. The reduction in populations of mesophilic and coliform bacteria on fruit was not significant, and populations were significantly higher (P < 0.05) after washing. Escherichia coli was near the limit of detection in dump tanks but consistently below the detection limit in flumes. Turbidity and conductivity increased with loads of incoming tomatoes. Water temperature varied during daily operations, but pH and ORP mostly remained constant. The industry-standard positive temperature differential of 5.5°C between water and fruit pulp was not maintained in tanks during the full daily operation. ORP values were significantly higher in the flume than in the dump tank. A positive correlation was found between ORP and temperature, and negative correlations were found between ORP and turbidity, total mesophilic bacteria, and coliforms. This study provides in-plant data indicating that ClO₂ can be an effective sanitizer in flume and spray-wash systems, but current operational limitations restrict its performance in dump tanks. ClO₂ alone is therefore unlikely to allow the fresh tomato industry to meet its microbiological quality goals under typical commercial conditions.
Weiner, Michael W; Veitch, Dallas P; Aisen, Paul S; Beckett, Laurel A; Cairns, Nigel J; Green, Robert C; Harvey, Danielle; Jack, Clifford R; Jagust, William; Morris, John C; Petersen, Ronald C; Salazar, Jennifer; Saykin, Andrew J; Shaw, Leslie M; Toga, Arthur W; Trojanowski, John Q
2017-05-01
The overall goal of the Alzheimer's Disease Neuroimaging Initiative (ADNI) is to validate biomarkers for Alzheimer's disease (AD) clinical trials. ADNI-3, which began on August 1, 2016, is a 5-year renewal of the current ADNI-2 study. ADNI-3 will follow current and additional subjects with normal cognition, mild cognitive impairment, and AD using innovative technologies such as tau imaging, magnetic resonance imaging sequences for connectivity analyses, and a highly automated immunoassay platform and mass spectroscopy approach for cerebrospinal fluid biomarker analysis. A Systems Biology/pathway approach will be used to identify genetic factors for subject selection/enrichment. Amyloid positron emission tomography scanning will be standardized using the Centiloid method. The Brain Health Registry will help recruit subjects and monitor subject cognition. Multimodal analyses will provide insight into AD pathophysiology and disease progression. ADNI-3 will aim to inform AD treatment trials and facilitate development of AD disease-modifying treatments. Copyright © 2016 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weiner, Michael W.; Veitch, Dallas P.; Aisen, Paul S.
Overall, the goal of the Alzheimer's Disease Neuroimaging Initiative (ADNI) is to validate biomarkers for Alzheimer's disease (AD) clinical trials. ADNI-3, which began on August 1, 2016, is a 5-year renewal of the current ADNI-2 study. ADNI-3 will follow current and additional subjects with normal cognition, mild cognitive impairment, and AD using innovative technologies such as tau imaging, magnetic resonance imaging sequences for connectivity analyses, and a highly automated immunoassay platform and mass spectroscopy approach for cerebrospinal fluid biomarker analysis. A Systems Biology/pathway approach will be used to identify genetic factors for subject selection/enrichment. Amyloid positron emission tomography scanning will be standardized using the Centiloid method. The Brain Health Registry will help recruit subjects and monitor subject cognition. Multimodal analyses will provide insight into AD pathophysiology and disease progression. Finally, ADNI-3 will aim to inform AD treatment trials and facilitate development of AD disease-modifying treatments.
Hydroacoustic propagation grids for the CTBT knowledge database: BBN technical memorandum W1303
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Angell
1998-05-01
The Hydroacoustic Coverage Assessment Model (HydroCAM) has been used to develop components of the hydroacoustic knowledge database required by operational monitoring systems, particularly the US National Data Center (NDC). The database, which consists of travel time, amplitude correction, and travel time standard deviation grids, is planned to support source location, discrimination, and estimation functions of the monitoring network. The grids will also be used under the current BBN subcontract to support an analysis of the performance of the International Monitoring System (IMS) and national sensor systems. This report describes the format and contents of the hydroacoustic knowledge database grids, and the procedures and model parameters used to generate these grids. Comparisons between the knowledge grids, measured data, and other modeled results are presented to illustrate the strengths and weaknesses of the current approach. A recommended approach for augmenting the knowledge database with a database of expected spectral/waveform characteristics is provided in the final section of the report.
Lopez-Gordo, M. A.; Sanchez-Morillo, D.; Valle, F. Pelayo
2014-01-01
Electroencephalography (EEG) emerged in the second decade of the 20th century as a technique for recording the neurophysiological response. Since then, there has been little variation in the physical principles that sustain the signal acquisition probes, otherwise called electrodes. Currently, new advances in technology have opened unexpected fields of application beyond the clinical, for which aspects such as usability and gel-free operation are first-order priorities. Thanks to new advances in materials and integrated electronic systems technologies, a new generation of dry electrodes has been developed to fulfill this need. In this manuscript, we review current approaches to developing dry EEG electrodes for clinical and other applications, including information about measurement methods and evaluation reports. We conclude that, although a broad and non-homogeneous diversity of approaches has been evaluated without a consensus in procedures and methodology, their performances are not far from those obtained with wet electrodes, which are considered the gold standard, thus enabling the former to be a useful tool in a variety of novel applications. PMID:25046013
Wallace, T.J.; Torre, T.; Grob, M.; Yu, J.; Avital, I.; Brücher, BLDM; Stojadinovic, A.; Man, Y.G.
2014-01-01
Prostate cancer is the most commonly diagnosed non-cutaneous neoplasm in men in the United States and the second leading cause of cancer mortality. One in 7 men will be diagnosed with prostate cancer during their lifetime. As a result, monitoring treatment response is of vital importance. The cornerstone of current approaches to monitoring treatment response remains the prostate-specific antigen (PSA). However, with the limitations of PSA come challenges in our ability to monitor treatment success. Defining PSA response differs depending on the individual treatment rendered, potentially making it difficult for those not trained in urologic oncology to understand. Furthermore, standard treatment response criteria do not apply to prostate cancer, further complicating the issue of treatment response. Historically, prostate cancer has been difficult to image, and no single modality has been consistently relied upon to measure treatment response. However, with newer imaging modalities and advances in our understanding and utilization of specific biomarkers, the future for monitoring treatment response in prostate cancer looks bright. PMID:24396494
Delivery of large biopharmaceuticals from cardiovascular stents: a review
Takahashi, Hironobu; Letourneur, Didier; Grainger, David W.
2008-01-01
This review focuses on the new and emerging large-molecule bioactive agents delivered from stent surfaces in drug-eluting stents (DES) to inhibit vascular restenosis in the context of interventional cardiology. New therapeutic agents representing proteins, nucleic acids (small interfering RNAs and large DNA plasmids), viral delivery vectors and even engineered cell therapies require specific delivery designs distinct from traditional smaller molecule approaches on DES. While small molecules are currently the clinical standard for coronary stenting, extension of the DES to other lesion types, the peripheral vasculature, and non-vascular therapies will seek to deliver an increasingly sophisticated armada of drug types. This review describes many of the larger molecule and biopharmaceutical approaches reported recently for stent-based delivery with the challenges associated with formulating and delivering these drug classes compared to the current small molecule drugs. It also includes perspectives on possible future applications that may improve safety and efficacy and facilitate diversification of the DES to other clinical applications. PMID:17929968
Mahapatra, Dwarikanath; Schueffler, Peter; Tielbeek, Jeroen A W; Buhmann, Joachim M; Vos, Franciscus M
2013-10-01
Increasing incidence of Crohn's disease (CD) in the Western world has made its accurate diagnosis an important medical challenge. The current reference standard for diagnosis, colonoscopy, is time-consuming and invasive while magnetic resonance imaging (MRI) has emerged as the preferred noninvasive procedure over colonoscopy. Current MRI approaches assess rate of contrast enhancement and bowel wall thickness, and rely on extensive manual segmentation for accurate analysis. We propose a supervised learning method for the identification and localization of regions in abdominal magnetic resonance images that have been affected by CD. Low-level features like intensity and texture are used with shape asymmetry information to distinguish between diseased and normal regions. Particular emphasis is laid on a novel entropy-based shape asymmetry method and higher-order statistics like skewness and kurtosis. Multi-scale feature extraction renders the method robust. Experiments on real patient data show that our features achieve a high level of accuracy and perform better than two competing methods.
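A minimal sketch of per-region intensity features of this kind (illustrative only; the paper's texture features and entropy-based shape-asymmetry measure are not reproduced here):

```python
# Intensity statistics including higher-order moments (skewness,
# kurtosis), computed at multiple Gaussian smoothing scales.
import numpy as np
from scipy.stats import skew, kurtosis
from scipy.ndimage import gaussian_filter

def region_features(patch, scales=(0.5, 1.0, 2.0)):
    feats = []
    for s in scales:
        sm = gaussian_filter(patch.astype(float), sigma=s)
        feats += [sm.mean(), sm.std(), skew(sm.ravel()), kurtosis(sm.ravel())]
    return np.array(feats)

patch = np.random.default_rng(3).normal(size=(32, 32))
print(region_features(patch).shape)   # 3 scales x 4 stats = 12 features
```

Feature vectors of this form, one per candidate region, would then feed the supervised classifier that separates diseased from normal bowel regions.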
True versus Apparent Malaria Infection Prevalence: The Contribution of a Bayesian Approach
Claes, Filip; Van Hong, Nguyen; Torres, Kathy; Mao, Sokny; Van den Eede, Peter; Thi Thinh, Ta; Gamboa, Dioni; Sochantha, Tho; Thang, Ngo Duc; Coosemans, Marc; Büscher, Philippe; D'Alessandro, Umberto; Berkvens, Dirk; Erhart, Annette
2011-01-01
Aims To present a new approach for estimating the “true prevalence” of malaria and apply it to datasets from Peru, Vietnam, and Cambodia. Methods Bayesian models were developed for estimating both the malaria prevalence using different diagnostic tests (microscopy, PCR & ELISA), without the need for a gold standard, and the tests' characteristics. Several sources of information, i.e., data, expert opinions, and other sources of knowledge, can be integrated into the model. This approach, which results in an optimal and harmonized estimate of malaria infection prevalence with no conflict between the different sources of information, was tested on data from Peru, Vietnam, and Cambodia. Results Malaria sero-prevalence was relatively low in all sites, with ELISA showing the highest estimates. The sensitivity of microscopy and ELISA was statistically lower in Vietnam than in the other sites. Similarly, the specificities of microscopy, ELISA, and PCR were significantly lower in Vietnam than in the other sites. In Vietnam and Peru, microscopy was closer to the “true” estimate than the other two tests, while, as expected, ELISA, with its lower specificity, usually overestimated the prevalence. Conclusions Bayesian methods are useful for analyzing prevalence results when no gold standard diagnostic test is available. Though some results are expected, e.g., PCR being more sensitive than microscopy, a standardized and context-independent quantification of the diagnostic tests' characteristics (sensitivity and specificity) and the underlying malaria prevalence may be useful for comparing different sites. Indeed, the use of a single diagnostic technique could strongly bias the prevalence estimation. This limitation can be circumvented by using a Bayesian framework taking into account the imperfect characteristics of the currently available diagnostic tests. As discussed in the paper, this approach may further support global malaria burden estimation initiatives. PMID:21364745
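A minimal numerical illustration of the underlying correction (the point version is the classical Rogan-Gladen estimator; the Monte Carlo loop merely propagates Beta-prior uncertainty and is far simpler than the paper's joint multi-test Bayesian model):

```python
# Correct an apparent prevalence for an imperfect test with
# sensitivity Se and specificity Sp:  p_app = p*Se + (1-p)*(1-Sp).
import numpy as np

rng = np.random.default_rng(4)
pos, n = 120, 1000                        # hypothetical survey counts
p_app = pos / n

se = rng.beta(90, 10, 10_000)             # prior belief: Se around 90%
sp = rng.beta(95, 5, 10_000)              # prior belief: Sp around 95%
p_true = np.clip((p_app + sp - 1) / (se + sp - 1), 0, 1)

print(f"apparent prevalence: {p_app:.3f}")
print(f"true prevalence: {np.median(p_true):.3f} "
      f"[{np.percentile(p_true, 2.5):.3f}, {np.percentile(p_true, 97.5):.3f}]")
```

With several imperfect tests applied to the same subjects, the full Bayesian model can estimate prevalence and all test characteristics jointly, which is what removes the need for a gold standard.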
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinmann, Vera; Chakraborty, Rupak; Rekemeyer, Paul H.
2016-08-31
As novel absorber materials are developed and screened for their photovoltaic (PV) properties, the challenge remains to reproducibly test promising candidates for high-performing PV devices. Many early-stage devices are prone to shunting due to pinholes in the absorber layer, producing 'false-negative' results. Here, we demonstrate a device engineering solution toward a robust device architecture, using a two-step absorber deposition approach. We use tin sulfide (SnS) as a test absorber material. The SnS bulk is processed at high temperature (400 °C) to stimulate grain growth, followed by a much thinner, low-temperature (200 °C) absorber deposition. At the lower process temperature, the thin absorber overlayer contains significantly smaller, densely packed grains, which are likely to provide a continuous coating and fill pinholes in the underlying absorber bulk. We compare this two-step approach to the more standard approach of using a semi-insulating buffer layer directly on top of the annealed absorber bulk, and we demonstrate a more than 3.5x higher shunt resistance Rsh with a smaller standard error σ_Rsh. Electron-beam-induced current (EBIC) measurements indicate a lower density of pinholes in the SnS absorber bulk when using the two-step absorber deposition approach. We correlate those findings to improvements in the device performance and device performance reproducibility.
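One common way to quantify shunting, sketched below on synthetic data (an assumption for illustration, not necessarily the authors' extraction procedure), is the inverse slope of the dark J-V curve near 0 V, where shunt conduction dominates:

```python
import numpy as np

v = np.linspace(-0.05, 0.05, 21)              # volts, near short circuit
r_sh_true = 500.0                             # ohm*cm^2, synthetic ground truth
j = v / r_sh_true + np.random.default_rng(5).normal(0, 1e-6, v.size)

slope = np.polyfit(v, j, 1)[0]                # dJ/dV ~ 1/Rsh near 0 V
print(f"Rsh ~ {1 / slope:.0f} ohm*cm^2")
```

A pinhole-ridden absorber shows up directly as a low Rsh extracted this way, which is why the 3.5x improvement with a smaller spread is the headline device metric above.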
Past and present in abdominal surgery management for Cushing's syndrome.
Vilallonga, Ramon; Zafon, Carles; Fort, José Manuel; Mesa, Jordi; Armengol, Manel
2014-01-01
Data on specific abdominal surgery in Cushing's syndrome are infrequent and are usually included in adrenalectomy reports. The current literature supports the feasibility and reproducibility of surgical adrenalectomy for patients diagnosed with non-functioning tumours and functioning adrenal tumours, including pheochromocytoma, Conn's syndrome, and Cushing's syndrome. Medical treatment for Cushing's syndrome is feasible, but follow-up findings or the clinical situation often force the patient to undergo a surgical procedure. Laparoscopic surgery has become the gold standard in a broad spectrum of pathologies, and laparoscopic adrenalectomy is now a standard procedure. However, despite the different characteristics and clinical disorders related to laparoscopically removed adrenal tumours, the intraoperative and postoperative outcomes do not significantly differ in most cases between the different groups of patients, techniques, and types of tumours. Tumour size, hormonal type, and the surgeon's experience may be factors that predict intraoperative and postoperative complications. Both transabdominal and retroperitoneal approaches can be considered, and outcomes for Cushing's syndrome do not differ depending on the surgical approach. Novel technologies and approaches such as single-port surgery or robotic surgery have proven to be safe and feasible. Laparoscopic adrenalectomy is a safe and feasible approach to adrenal pathology, providing patients with all the benefits of minimally invasive surgery. Single-port access and robotic surgery can be performed, but more data are required to define their proper role among the different surgical approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, Courtney A., E-mail: courtney.schultz@colostate.edu
Cumulative effects analysis (CEA) allows natural resource managers to understand the status of resources in historical context, learn from past management actions, and adapt future activities accordingly. U.S. federal agencies are required to complete CEA as part of environmental impact assessment under the National Environmental Policy Act (NEPA). Past research on CEA as part of NEPA has identified significant deficiencies in CEA practice, suggested methodologies for handling difficult aspects of CEA, and analyzed the rise in litigation over CEA in U.S. courts. This article provides a review of the literature and legal standards related to CEA as it is done under NEPA and then examines current practice on a U.S. National Forest, utilizing qualitative methods in order to provide a detailed understanding of current approaches to CEA. Research objectives were to understand current practice, investigate ongoing challenges, and identify impediments to improvement. Methods included a systematic review of a set of NEPA documents and semi-structured interviews with practitioners, scientists, and members of the public. Findings indicate that the primary challenges associated with CEA include: issues of both geographic and temporal scale of analysis, confusion over the purpose of the requirement, the lack of monitoring data, and problems coordinating and disseminating data. Improved monitoring strategies and programmatic analyses could support improved CEA practice.
Endoscopic ultrasound-guided techniques for diagnosing pancreatic mass lesions: Can we do better?
Storm, Andrew C; Lee, Linda S
2016-01-01
The diagnostic approach to a possible pancreatic mass lesion relies first upon various non-invasive imaging modalities, including computed tomography, ultrasound, and magnetic resonance imaging techniques. Once a suspect lesion has been identified, tissue acquisition for characterization of the lesion is often paramount in developing an individualized therapeutic approach. Given the high prevalence and mortality associated with pancreatic cancer, an ideal approach to diagnosing pancreatic mass lesions would be safe, highly sensitive, and reproducible across various practice settings. Tools, in addition to radiologic imaging, currently employed in the initial evaluation of a patient with a pancreatic mass lesion include serum tumor markers, endoscopic retrograde cholangiopancreatography, and endoscopic ultrasound-guided fine needle aspiration (EUS-FNA). EUS-FNA has grown to become the gold standard in tissue diagnosis of pancreatic lesions. PMID:27818584
Pyeloplasty techniques using minimally invasive surgery (MIS) in pediatric patients.
Turrà, Francesco; Escolino, Maria; Farina, Alessandra; Settimi, Alessandro; Esposito, Ciro; Varlet, François
2016-10-01
Hydronephrosis is the most common presentation of ureteropelvic junction (UPJ) obstruction. We reviewed the literature, collecting data from Medline, to evaluate the current status of the minimally invasive surgery (MIS) approach to pyeloplasty. Since the first pyeloplasty was described in 1939, several techniques have been applied to correct UPJ obstruction, but Anderson-Hynes dismembered pyeloplasty is established as the gold standard, to date also as an MIS technique. According to the literature, several studies underline the safety and effectiveness of this approach for both trans- and retro-peritoneal routes, with a success rate between 81-100% and an operative time between 90-228 min. These studies have demonstrated the safety and efficacy of this procedure in the management of UPJ obstruction in children. Whether the transperitoneal approach is better than the retroperitoneal one is still debated. A long learning curve is needed, especially for suturing and knotting.