DOE Office of Scientific and Technical Information (OSTI.GOV)
Spickett, Jeffery, E-mail: J.Spickett@curtin.edu.au; Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia; Katscherian, Dianne
The approaches used for setting or reviewing air quality standards vary from country to country. The purpose of this research was to consider the potential to improve decision-making through integration of HIA into the processes used in Australia to review and set air quality standards. To assess the value of HIA in this policy process, its strengths and weaknesses were evaluated alongside a review of international processes for setting air quality standards. Air quality standard setting programmes elsewhere have either used HIA or have incorporated factors normally found within HIA frameworks. They clearly demonstrate the value of a formalised HIA process for setting air quality standards in Australia. The following elements should be taken into consideration when using HIA in standard setting. (a) The adequacy of a mainly technical approach in current standard setting procedures to consider social determinants of health. (b) The importance of risk assessment criteria and information within the HIA process. The assessment of risk should consider equity, the distribution of variations in air quality in different locations and the potential impacts on health. (c) The uncertainties in extrapolating evidence from one population to another or to subpopulations, especially the more vulnerable, due to differing environmental factors and population variables. (d) The significance of communication with all potential stakeholders on issues associated with the management of air quality. In Australia there is also an opportunity for HIA to be used in conjunction with the NEPM to develop local air quality standard measures. The outcomes of this research indicated that the use of HIA for air quality standard setting at the national and local levels would prove advantageous. -- Highlights: • Health Impact Assessment framework has been applied to a policy development process. • HIA process was evaluated for application in air quality standard setting. • Advantages of HIA in the air quality standard setting process are demonstrated.
Standard Setting as Psychometric Due Process: Going a Little Further Down an Uncertain Road.
ERIC Educational Resources Information Center
Cizek, Gregory J.
The concept of due process provides an analogy for the process of standard setting that emphasizes many of the procedural and substantive elements of the process over technical and statistical concerns. Surely such concerns can and should continue to be addressed. However, a sound rationale for standard setting does not rest on this foundation.…
State Standard-Setting Processes in Brief. State Academic Standards: Standard-Setting Processes
ERIC Educational Resources Information Center
Thomsen, Jennifer
2014-01-01
Concerns about academic standards, whether created by states from scratch or adopted by states under the Common Core State Standards (CCSS) banner, have drawn widespread media attention and are at the top of many state policymakers' priority lists. Recently, a number of legislatures have required additional steps, such as waiting periods for…
Adopting Cut Scores: Post-Standard-Setting Panel Considerations for Decision Makers
ERIC Educational Resources Information Center
Geisinger, Kurt F.; McCormick, Carina M.
2010-01-01
Standard-setting studies utilizing procedures such as the Bookmark or Angoff methods are just one component of the complete standard-setting process. Decision makers ultimately must determine what they believe to be the most appropriate standard or cut score to use, employing the input of the standard-setting panelists as one piece of information…
A Comparison of Web-Based Standard Setting and Monitored Standard Setting.
ERIC Educational Resources Information Center
Harvey, Anne L.; Way, Walter D.
Standard setting, when carefully done, can be an expensive and time-consuming process. The modified Angoff method and the benchmark method, as utilized in this study, employ representative panels of judges to provide recommended passing scores to standard setting decision-makers. It has been considered preferable to have the judges meet in a…
Construct Maps as a Foundation for Standard Setting
ERIC Educational Resources Information Center
Wyse, Adam E.
2013-01-01
Construct maps are tools that display how the underlying achievement construct upon which one is trying to set cut-scores is related to other information used in the process of standard setting. This article reviews what construct maps are, uses construct maps to provide a conceptual framework to view commonly used standard-setting procedures (the…
The Effect of Various Factors on Standard Setting.
ERIC Educational Resources Information Center
Norcini, John J.; And Others
1988-01-01
Two studies of medical certification examinations were undertaken to assess standard setting using Angoff's Method. Results indicate that (1) specialization within broad content areas does not affect an expert's estimates of the performance of the borderline group; and (2) performance data should be provided during the standard-setting process.…
Implementing standard setting into the Conjoint MAFP/FRACGP Part 1 examination - Process and issues.
Chan, S C; Mohd Amin, S; Lee, T W
2016-01-01
The College of General Practitioners of Malaysia and the Royal Australian College of General Practitioners held the first Conjoint Member of the College of General Practitioners (MCGP)/Fellow of the Royal Australian College of General Practitioners (FRACGP) examination in 1982; it was later renamed the Conjoint MAFP/FRACGP examination. The examination assesses competency for safe independent general practice and for practice as family medicine specialists in Malaysia. A defensible, standard-set pass mark is therefore imperative to separate the competent from the incompetent. This paper discusses the process and issues encountered in implementing standard setting for the Conjoint Part 1 examination. Critical to success in standard setting were the judges' understanding of the modified Angoff method, the definition of the borderline candidate's characteristics, and the composition of the panel of judges. Difficulties in these areas were overcome by repeated hands-on training, provision of detailed guidelines and careful selection of judges. In December 2013, 16 judges successfully standard set the Conjoint Part 1 examinations, with high inter-rater reliability: Cronbach's alpha coefficient 0.926 (Applied Knowledge Test) and 0.921 (Key Feature Problems).
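For readers unfamiliar with the reliability figure quoted above, the following is a minimal sketch of how Cronbach's alpha can be computed from an item-by-judge matrix of Angoff ratings. The data are invented for illustration and do not reproduce the study's ratings.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for an (items x judges) matrix of ratings.

    alpha = k/(k-1) * (1 - sum(per-judge variances) / variance(item totals)),
    where k is the number of judges.
    """
    k = ratings.shape[1]                          # number of judges
    judge_var = ratings.var(axis=0, ddof=1)       # variance of each judge's ratings
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of per-item rating totals
    return k / (k - 1) * (1 - judge_var.sum() / total_var)

# Invented example: 5 items, 4 judges giving Angoff probability estimates.
rng = np.random.default_rng(0)
base = rng.uniform(0.4, 0.9, size=(5, 1))           # underlying item difficulty
ratings = base + rng.normal(0, 0.03, size=(5, 4))   # each judge's noisy estimate
print(f"alpha = {cronbach_alpha(ratings):.3f}")
```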
ERIC Educational Resources Information Center
Mee, Janet; Clauser, Brian E.; Margolis, Melissa J.
2013-01-01
Despite being widely used and frequently studied, the Angoff standard setting procedure has received little attention with respect to an integral part of the process: how judges incorporate examinee performance data in the decision-making process. Without performance data, subject matter experts have considerable difficulty accurately making the…
Variation in passing standards for graduation-level knowledge items at UK medical schools.
Taylor, Celia A; Gurnell, Mark; Melville, Colin R; Kluth, David C; Johnson, Neil; Wass, Val
2017-06-01
Given the absence of a common passing standard for students at UK medical schools, this paper compares independently set standards for common 'one from five' single-best-answer (multiple-choice) items used in graduation-level applied knowledge examinations and explores potential reasons for any differences. A repeated cross-sectional study was conducted. Participating schools were sent a common set of graduation-level items (55 in 2013-2014; 60 in 2014-2015). Items were selected against a blueprint and subjected to a quality review process. Each school employed its own standard-setting process for the common items. The primary outcome was the passing standard for the common items by each medical school set using the Angoff or Ebel methods. Of 31 invited medical schools, 22 participated in 2013-2014 (71%) and 30 (97%) in 2014-2015. Schools used a mean of 49 and 53 common items in 2013-2014 and 2014-2015, respectively, representing around one-third of the items in the examinations in which they were embedded. Data from 19 (61%) and 26 (84%) schools, respectively, met the inclusion criteria for comparison of standards. There were statistically significant differences in the passing standards set by schools in both years (effect sizes (f²): 0.041 in 2013-2014 and 0.218 in 2014-2015; both p < 0.001). The interquartile range of standards was 5.7 percentage points in 2013-2014 and 6.5 percentage points in 2014-2015. There was a positive correlation between the relative standards set by schools in the 2 years (Pearson's r = 0.57, n = 18, p = 0.014). Time allowed per item, method of standard setting and timing of examination in the curriculum did not have a statistically significant impact on standards. Independently set standards for common single-best-answer items used in graduation-level examinations vary across UK medical schools. Further work to examine standard-setting processes in more detail is needed to help explain this variability and develop methods to reduce it. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
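As a companion to the statistics reported in this abstract, here is a minimal sketch of two of the descriptive comparisons involved: the interquartile range of standards within a year and the year-to-year Pearson correlation. The school-level passing standards below are invented, not the study's data.

```python
import numpy as np
from scipy import stats

# Invented passing standards (percent correct) set by eight schools in each year.
y1 = np.array([62.1, 60.5, 58.9, 64.0, 61.2, 59.8, 63.3, 57.7])  # 2013-2014
y2 = np.array([63.0, 59.9, 58.1, 65.2, 62.4, 58.6, 64.1, 56.9])  # 2014-2015

# Spread of standards across schools, in percentage points.
iqr1 = np.percentile(y1, 75) - np.percentile(y1, 25)

# Consistency of schools' relative standards across the two years.
r, p = stats.pearsonr(y1, y2)
print(f"IQR 2013-14 = {iqr1:.1f} points; Pearson r = {r:.2f} (p = {p:.3f})")
```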
A primer on standards setting as it applies to surgical education and credentialing.
Cendan, Juan; Wier, Daryl; Behrns, Kevin
2013-07-01
Surgical technological advances in the past three decades have led to dramatic reductions in the morbidity associated with abdominal procedures and permanently altered the surgical practice landscape. Significant changes continue apace including surgical robotics, natural orifice-based surgery, and single-incision approaches. These disruptive technologies have on occasion been injurious to patients, and high-stakes assessment before adoption of new technologies would be reasonable. We reviewed the drivers for well-established psychometric techniques available for the standards-setting process. We present a series of examples that are relevant in the surgical domain including standards setting for knowledge and skills assessments. Defensible standards for knowledge and procedural skills will likely become part of surgical clinical practice. Understanding the methodology for determining standards should position the surgical community to assist in the process and lead within their clinical settings as standards are considered that may affect patient safety and physician credentialing.
ERIC Educational Resources Information Center
Cravens, Xiu Chen; Goldring, Ellen B.; Porter, Andrew C.; Polikoff, Morgan S.; Murphy, Joseph; Elliott, Stephen N.
2013-01-01
Purpose: Performance evaluation informs professional development and helps school personnel improve student learning. Although psychometric literature indicates that a rational, sound, and coherent standard-setting process adds to the credibility of an assessment, few studies have empirically examined the decision-making process. This article…
Software Development Standard Processes (SDSP)
NASA Technical Reports Server (NTRS)
Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.;
2011-01-01
A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PERFORMANCE STANDARDS SPECIAL PERMANENT PROGRAM PERFORMANCE STANDARDS-IN SITU PROCESSING § 828.1 Scope. This part sets forth special environmental protection performance, reclamation and design standards for in situ processing activities. [44 FR 15455, Mar. 13, 1979] ...
Code of Federal Regulations, 2012 CFR
2012-07-01
... PERFORMANCE STANDARDS SPECIAL PERMANENT PROGRAM PERFORMANCE STANDARDS-IN SITU PROCESSING § 828.1 Scope. This part sets forth special environmental protection performance, reclamation and design standards for in situ processing activities. [44 FR 15455, Mar. 13, 1979] ...
Code of Federal Regulations, 2011 CFR
2011-07-01
... PERFORMANCE STANDARDS SPECIAL PERMANENT PROGRAM PERFORMANCE STANDARDS-IN SITU PROCESSING § 828.1 Scope. This part sets forth special environmental protection performance, reclamation and design standards for in situ processing activities. [44 FR 15455, Mar. 13, 1979] ...
Code of Federal Regulations, 2013 CFR
2013-07-01
... PERFORMANCE STANDARDS SPECIAL PERMANENT PROGRAM PERFORMANCE STANDARDS-IN SITU PROCESSING § 828.1 Scope. This part sets forth special environmental protection performance, reclamation and design standards for in situ processing activities. [44 FR 15455, Mar. 13, 1979] ...
ERIC Educational Resources Information Center
Stone, Gregory Ethan; Koskey, Kristin L. K.; Sondergeld, Toni A.
2011-01-01
Typical validation studies on standard setting models, most notably the Angoff and modified Angoff models, have ignored construct development, a critical aspect associated with all conceptualizations of measurement processes. Stone compared the Angoff and objective standard setting (OSS) models and found that Angoff failed to define a legitimate…
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
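As a hedged illustration of the control-chart analysis described, the sketch below computes Shewhart individuals-chart limits from a series of check-standard repeat runs. The run values are invented, and the moving-range estimate of sigma is the textbook convention for individuals charts, not necessarily the report's exact procedure.

```python
import numpy as np

def shewhart_limits(x: np.ndarray):
    """Individuals control chart limits from repeated check-standard results.

    Sigma is estimated from the average moving range: sigma ~= MR-bar / 1.128,
    where 1.128 is the d2 constant for subgroups of size 2.
    """
    center = x.mean()
    mr_bar = np.abs(np.diff(x)).mean()   # average moving range of successive runs
    sigma = mr_bar / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

# Invented check-standard force coefficients from repeat runs.
runs = np.array([0.5012, 0.5009, 0.5015, 0.5011, 0.5008, 0.5014, 0.5010])
lcl, cl, ucl = shewhart_limits(runs)
print(f"LCL = {lcl:.4f}  CL = {cl:.4f}  UCL = {ucl:.4f}")
```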
42 CFR 410.143 - Requirements for approved accreditation organizations.
Code of Federal Regulations, 2013 CFR
2013-10-01
...) Notice of any proposed changes in its accreditation standards and requirements or evaluation process. If... enforcement of its standards to a set of quality standards (described in § 410.144) and processes when any of the following conditions exist: (i) CMS imposes new requirements or changes its process for approving...
ERIC Educational Resources Information Center
Bloom, Robert; And Others
A study of the processes for establishing the principles and policies of measurement and disclosure in preparing financial reports examines differences in these processes in the United States, Canada, and England. Information was drawn from international accounting literature on standard setting. The differences and similarities in the…
Proficiency Standards and Cut-Scores for Language Proficiency Tests.
ERIC Educational Resources Information Center
Moy, Raymond H.
The problem of standard setting on language proficiency tests is often approached by the use of norms derived from the group being tested, a process commonly known as "grading on the curve." One particular problem with this ad hoc method of standard setting is that it will usually result in a fluctuating standard dependent on the particular group…
Space Generic Open Avionics Architecture (SGOAA) standard specification
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1994-01-01
This standard establishes the Space Generic Open Avionics Architecture (SGOAA). The SGOAA includes a generic functional model, a processing structural model, and an architecture interface model. This standard defines the requirements for applying these models to the development of spacecraft core avionics systems. The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture models to the design of a specific avionics hardware/software processing system. This standard defines a generic set of system interface points to facilitate identification of critical services and interfaces. It establishes the requirement for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics functions and processing structural models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
Meleskie, Jessica; Eby, Don
2009-01-01
Standardized, preprinted or computer-generated physician orders are an attractive project for organizations that wish to improve the quality of patient care. The successful development and maintenance of order sets is a major undertaking. This article recounts the collaborative experience of the Grey Bruce Health Network in adapting and implementing an existing set of physician orders for use in its three hospital corporations. An Order Set Committee composed of primarily front-line staff was given authority over the order set development, approval and implementation processes. This arrangement bypassed the traditional approval process and facilitated the rapid implementation of a large number of order sets in a short time period.
The Introduction of Standardized External Testing in Ukraine: Challenges and Successes
ERIC Educational Resources Information Center
Kovalchuk, Serhiy; Koroliuk, Svitlana
2012-01-01
Standardized external testing (SET) began to be implemented in Ukraine in 2008 as an instrument for combating corruption in higher education and ensuring fair university admission. This article examines the conditions and processes that led to the introduction of SET, overviews its implementation over three years (2008-10), analyzes SET and…
Grol, R
1990-01-01
The Nederlands Huisartsen Genootschap (NHG), the college of general practitioners in the Netherlands, has begun a national programme of standard setting for the quality of care in general practice. When the standards have been drawn up and assessed they are disseminated via the journal Huisarts en Wetenschap. In a survey, carried out among a randomized sample of 10% of all general practitioners, attitudes towards national standard setting in general and to the first set of standards (diabetes care) were studied. The response was 70% (453 doctors). A majority of the respondents said they were well informed about the national standard setting initiatives instigated by the NHG (71%) and about the content of the first standards (77%). The general practitioners had a positive attitude towards the setting of national standards for quality of care, and this was particularly true for doctors who were members of the NHG. Although a large majority of doctors said they agreed with most of the guidelines in the diabetes standards fewer respondents were actually working to the guidelines and some of the standards are certain to meet with a lot of resistance. A better knowledge of the standards and a more positive attitude to the process of national standard setting correlated with a more positive attitude to the guidelines formulated in the diabetes standards. The results could serve as a starting point for an exchange of views about standard setting in general practice in other countries. PMID:2265001
Fulton, James L.
1992-01-01
Spatial data analysis has become an integral component in many surface and sub-surface hydrologic investigations within the U.S. Geological Survey (USGS). Currently, one of the largest costs in applying spatial data analysis is the cost of developing the needed spatial data. Therefore, guidelines and standards are required for the development of spatial data in order to allow for data sharing and reuse; this eliminates costly redevelopment. In order to attain this goal, the USGS is expanding efforts to identify guidelines and standards for the development of spatial data for hydrologic analysis. Because of the variety of project and database needs, the USGS has concentrated on developing standards for documenting spatial data sets to aid in the assessment of data set quality and the compatibility of different data sets. An interim data set documentation standard (1990) has been developed that provides a mechanism for associating a wide variety of information with a data set, including data about source material, data automation and editing procedures used, projection parameters, data statistics, descriptions of features and feature attributes, information on organizational contacts, lists of operations performed on the data, and free-form comments and notes about the data, made at various times in the evolution of the data set. The interim data set documentation standard has been automated using a commercial geographic information system (GIS) and data set documentation software developed by the USGS. Where possible, USGS-developed software is used to enter data into the data set documentation file automatically. The GIS software closely associates a data set with its data set documentation file; the documentation file is retained with the data set whenever it is modified, copied, or transferred to another computer system. The Water Resources Division of the USGS is continuing to develop spatial data and data processing standards, with emphasis on standards needed to support hydrologic analysis, hydrologic data processing, and publication of hydrologic thematic maps. There is a need for the GIS vendor community to develop data set documentation tools similar to those developed by the USGS, or to incorporate USGS-developed tools in their software.
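To make the documentation standard concrete, here is a minimal sketch of a record holding the kinds of information the interim standard associates with a data set. All field names and values are hypothetical illustrations, not the USGS file format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataSetDocumentation:
    """Illustrative record mirroring the information categories listed above
    (source material, automation/editing, projection, statistics, features,
    contacts, operations log, comments). Field names are hypothetical."""
    source_material: str
    automation_and_editing: str
    projection_parameters: dict
    data_statistics: dict
    feature_descriptions: List[str]
    organizational_contact: str
    operations_log: List[str] = field(default_factory=list)  # grows as the data evolve
    comments: List[str] = field(default_factory=list)

doc = DataSetDocumentation(
    source_material="1:24,000 USGS topographic quadrangle, 1987",
    automation_and_editing="Manually digitized; edited to 2 m snap tolerance",
    projection_parameters={"projection": "UTM", "zone": 13, "datum": "NAD27"},
    data_statistics={"feature_count": 1204},
    feature_descriptions=["stream centerlines with attribute 'reach_id'"],
    organizational_contact="USGS Water Resources Division",
)
# The documentation file travels with the data set and records each operation.
doc.operations_log.append("1992-01-15: clipped to basin boundary")
print(doc.operations_log)
```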
ERIC Educational Resources Information Center
Hack, David
This report on telephone networks and computer networks in a global context focuses on the processes and organizations through which the standards that make this possible are set. The first of five major sections presents descriptions of the standardization process, including discussions of the various kinds of standards, advantages and…
Setting Standards and Primary School Teachers' Experiences of the Process
ERIC Educational Resources Information Center
Scherman, Vanessa; Zimmerman, Lisa; Howie, Sarah J.; Bosker, Roel
2014-01-01
In South Africa, very few standard-setting exercises are carried out in education and, if they are, teachers are not involved in their execution. As a result, there is no clear understanding of what the standard is and how it was arrived at. This situation is compounded when teachers are held accountable when learners do not meet the prescribed…
Comparison of data used for setting occupational exposure limits.
Schenk, Linda
2010-01-01
It has previously been shown that occupational exposure limits (OELs) for the same substance can vary significantly between different standard-setters. The work presented in this paper identifies the steps in the process towards establishing an OEL and how variations in those processes could account for these differences. The study selects for further scrutiny substances for which the levels of OELs vary by a factor of 100, focussing on 45 documents concerning 14 substances from eight standard-setters. Several of the OELs studied were more than 20 years old and based on outdated knowledge. Furthermore, different standard-setters sometimes based their OELs on different sets of data, and data availability alone could not explain all differences in the selection of data sets used by standard-setters. While the interpretation of key studies did not differ significantly in standard-setters' documentations, the evaluations of the key studies' quality did. Also, differences concerning the critical effect coincided with differences in the level of OELs for half of the substances.
Standards for Title VII Evaluations: Accommodation for Reality Constraints.
ERIC Educational Resources Information Center
Yap, Kim Onn
Two separate sets of minimum standards designed to guide the evaluation of bilingual projects are proposed. The first set relates to the process in which the evaluation activities are conducted. They include: validity of assessment procedures, validity and reliability of evaluation instruments, representativeness of findings, use of procedures for…
Diversification and Challenges of Software Engineering Standards
NASA Technical Reports Server (NTRS)
Poon, Peter T.
1994-01-01
The author poses certain questions in this paper: 'In the future, should there be just one software engineering standards set? If so, how can we work towards that goal? What are the challenges of internationalizing standards?' Based on the author's personal view, the statement of his position is as follows: 'There should NOT be just one set of software engineering standards in the future. At the same time, there should NOT be the proliferation of standards, and the number of sets of standards should be kept to a minimum. It is important to understand the diversification of the areas which are spanned by the software engineering standards.' The author goes on to describe the diversification of processes, the diversification in the national and international character of standards organizations, the diversification of the professional organizations producing standards, the diversification of the types of businesses and industries, and the challenges of internationalizing standards.
Increasing patient safety and efficiency in transfusion therapy using formal process definitions.
Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L
2007-01-01
The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration as well as in automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (ie, at the point of care) has essentially remained unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete. For example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative to these informal descriptions is to use formal process definitions, which can serve as the basis for a variety of analyses because these formal definitions offer precision in the representation of all possible ways that a process can be carried out in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes. The use of such formal definitions to prospectively identify potential error and improve the transfusion process has not previously been reported. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.
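To illustrate the distinction this abstract draws between informal flow charts and formal process definitions, the sketch below encodes a few transfusion steps with an explicit handler for each anticipated exception, so failure paths are part of the definition rather than left implicit. It is an invented toy in Python, not the authors' formalism.

```python
# Each step names its normal successor and a handler for each anticipated
# exception; a flow chart typically shows only the normal path.
TRANSFUSION_PROCESS = {
    "verify_order":       {"next": "check_patient_id",
                           "on_exception": {"order_incomplete": "contact_physician"}},
    "check_patient_id":   {"next": "check_blood_product",
                           "on_exception": {"id_mismatch": "stop_and_reverify"}},
    "check_blood_product": {"next": "administer",
                           "on_exception": {"product_expired": "return_to_blood_bank"}},
    "administer":         {"next": None,
                           "on_exception": {"transfusion_reaction": "stop_and_treat"}},
}

def run(process, step, events=()):
    """Walk the process, diverting to a handler when an exception event occurs."""
    events = iter(events)
    while step is not None:
        event = next(events, None)
        handlers = process[step]["on_exception"]
        if event in handlers:
            print(f"{step}: exception '{event}' -> {handlers[event]}")
            return
        print(f"{step}: ok")
        step = process[step]["next"]

# Normal first step, then an ID mismatch surfaces at the bedside check.
run(TRANSFUSION_PROCESS, "verify_order", events=[None, "id_mismatch"])
```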
This data set consists of Census Designated Place and Federal Information Processing Standard (FIPS) Populated Place boundaries for the State of Arizona which were extracted from the 1992 U.S. Census Bureau TIGER line files.
NASA Astrophysics Data System (ADS)
Fauzi, Ilham; Muharram Hasby, Fariz; Irianto, Dradjad
2018-03-01
Although the government is able to make mandatory standards that must be obeyed by industry, the respective industries themselves often have difficulty fulfilling the requirements described in those standards. This is especially true of many small and medium sized enterprises that lack the capital to invest in standard-compliant equipment and machinery. This study aims to develop a set of measurement tools for evaluating the readiness of production technology with respect to the requirements of a product standard, based on the quality function deployment (QFD) method. By combining the QFD methodology, the UNESCAP Technometric model [9] and the Analytic Hierarchy Process (AHP), the model is used to measure a firm's capability to fulfil a government standard in the toy making industry. Expert opinions from both the governmental officers responsible for setting and implementing standards and the industry practitioners responsible for managing manufacturing processes are collected and processed to find the technological capabilities that the firm should improve in order to fulfil the existing standard. This study showed that the proposed model can be used successfully to measure the gap between the requirements of the standard and the readiness of the technoware technological component in a particular firm.
A Mapmark method of standard setting as implemented for the National Assessment Governing Board.
Schulz, E Matthew; Mitzel, Howard C
2011-01-01
This article describes a Mapmark standard setting procedure, developed under contract with the National Assessment Governing Board (NAGB). The procedure enhances the bookmark method with spatially representative item maps, holistic feedback, and an emphasis on independent judgment. A rationale for these enhancements, and for the bookmark method, is presented, followed by a detailed description of the materials and procedures used in a meeting to set standards for the 2005 National Assessment of Educational Progress (NAEP) in Grade 12 mathematics. The use of difficulty-ordered content domains to provide holistic feedback is a particularly novel feature of the method. Process evaluation results comparing Mapmark to Angoff-based methods previously used for NAEP standard setting are also presented.
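Because Mapmark builds on the bookmark method, a small sketch of the core bookmark computation may help: under a Rasch model, the page where a judge places the bookmark maps to a cut score on the ability scale at the point where a borderline examinee answers the bookmarked item with a chosen response probability (0.67 is the common convention). The item difficulties below are invented, and this is a simplified illustration of the general technique, not NAGB's operational procedure.

```python
import math

def bookmark_cut(item_difficulties, bookmark_page, rp=0.67):
    """Map a bookmark placement to a cut score on the Rasch ability scale.

    Items are ordered easiest to hardest; the cut is the ability theta at
    which P(correct on bookmarked item) = rp: theta = b + ln(rp / (1 - rp)).
    """
    b = sorted(item_difficulties)[bookmark_page - 1]  # bookmarked item's difficulty
    return b + math.log(rp / (1 - rp))

# Invented Rasch difficulties; the judge bookmarks the 7th-easiest item.
difficulties = [-1.8, -1.2, -0.9, -0.4, 0.0, 0.3, 0.7, 1.1, 1.6, 2.2]
print(f"cut score (theta) = {bookmark_cut(difficulties, 7):.2f}")
```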
Unified System Of Data On Materials And Processes
NASA Technical Reports Server (NTRS)
Key, Carlo F.
1989-01-01
Wide-ranging sets of data for aerospace industry described. Document describes Materials and Processes Technical Information System (MAPTIS), computerized set of integrated data bases for use by NASA and aerospace industry. Stores information in standard format for fast retrieval in searches and surveys of data. Helps engineers select materials and verify their properties. Promotes standardized nomenclature as well as standardized tests and presentation of data. Document is in format of photographic projection slides used in lectures. Presents examples of reports from various data bases.
A case of standardization? Implementing health promotion guidelines in Denmark.
Rod, Morten Hulvej; Høybye, Mette Terp
2016-09-01
Guidelines are increasingly used in an effort to standardize and systematize health practices at the local level and to promote evidence-based practice. The implementation of guidelines frequently faces problems, however, and standardization processes may in general have other outcomes than the ones envisioned by the makers of standards. In 2012, the Danish National Health Authorities introduced a set of health promotion guidelines that were meant to guide the decision making and priority setting of Denmark's 98 local governments. The guidelines provided recommendations for health promotion policies and interventions and were structured according to risk factors such as alcohol, smoking and physical activity. This article examines the process of implementation of the new Danish health promotion guidelines. The article is based on qualitative interviews and participant observation, focusing on the professional practices of health promotion officers in four local governments as well as the field of Danish health promotion more generally. The analysis highlights practices and episodes related to the implementation of the guidelines and takes inspiration from Timmermans and Epstein's sociology of standards and standardization. It remains an open question whether or not the guidelines lead to more standardized policies and interventions, but we suggest that the guidelines promote a risk factor-oriented approach as the dominant frame for knowledge, reasoning, decision making and priority setting in health promotion. We describe this process as a case of epistemic standardization. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Identifying and Evaluating External Validity Evidence for Passing Scores
ERIC Educational Resources Information Center
Davis-Becker, Susan L.; Buckendahl, Chad W.
2013-01-01
A critical component of the standard setting process is collecting evidence to evaluate the recommended cut scores and their use for making decisions and classifying students based on test performance. Kane (1994, 2001) proposed a framework by which practitioners can identify and evaluate evidence of the results of the standard setting from (1)…
The coming commoditization of processes.
Davenport, Thomas H
2005-06-01
Despite the much-ballyhooed increase in outsourcing, most companies are in do-it-yourself mode for the bulk of their processes, in large part because there's no way to compare outside organizations' capabilities with those of internal functions. Given the lack of comparability, it's almost surprising that anyone outsources today. But it's not surprising that cost is by far companies' primary criterion for evaluating outsourcers or that many companies are dissatisfied with their outsourcing relationships. A new world is coming, says the author, and it will lead to dramatic changes in the shape and structure of corporations. A broad set of process standards will soon make it easy to determine whether a business capability can be improved by outsourcing it. Such standards will also help businesses compare service providers and evaluate the costs versus the benefits of outsourcing. Eventually these costs and benefits will be so visible to buyers that outsourced processes will become a commodity, and prices will drop significantly. The low costs and low risk of outsourcing will accelerate the flow of jobs offshore, force companies to reassess their strategies, and change the basis of competition. The speed with which some businesses have already adopted process standards suggests that many previously unscrutinized areas are ripe for change. In the field of technology, for instance, the Carnegie Mellon Software Engineering Institute has developed a global standard for software development processes, called the Capability Maturity Model (CMM). For companies that don't have process standards in place, it makes sense for them to create standards by working with customers, competitors, software providers, businesses that processes may be outsourced to, and objective researchers and standard-setters. Setting standards is likely to lead to the improvement of both internal and outsourced processes.
ERIC Educational Resources Information Center
Meuter, Matthew L.; Chapman, Kenneth J.; Toy, Daniel; Wright, Lauren K.; McGowan, William
2009-01-01
This article describes a standardization process for an introductory marketing course with multiple sections. The authors first outline the process used to develop a standardized set of marketing concepts to be used in all introductory marketing classes. They then discuss the benefits to both students and faculty that occur as a result of…
The Vanderbilt Professional Nursing Practice Program, part 3: managing an advancement process.
Steaban, Robin; Fudge, Mitzie; Leutgens, Wendy; Wells, Nancy
2003-11-01
Consistency of performance standards across multiple clinical settings is an essential component of a credible advancement system. Our advancement process incorporates a central committee, composed of nurses from all clinical settings within the institution, to ensure consistency of performance in inpatient, outpatient, and procedural settings. An analysis of nurses advanced during the first 18 months of the program indicates that performance standards are applicable to nurses in all clinical settings. The first article (September 2003) in this 3-part series described the foundation for and the philosophical background of the Vanderbilt Professional Nursing Practice Program (VPNPP), the career advancement program underway at Vanderbilt University Medical Center. Part 2 described the development of the evaluation tools used in the VPNPP, the implementation and management of this new system, program evaluation, and improvements since the program's inception. The purpose of this article is to review the advancement process, review the roles of those involved in the process, and to describe outcomes and lessons learned.
Finding-specific display presets for computed radiography soft-copy reading.
Andriole, K P; Gould, R G; Webb, W R
1999-05-01
Much work has been done to optimize the display of cross-sectional modality imaging examinations for soft-copy reading (i.e., window/level tissue presets, and format presentations such as tile and stack modes, four-on-one, nine-on-one, etc). Less attention has been paid to the display of digital forms of the conventional projection x-ray. The purpose of this study is to assess the utility of providing presets for computed radiography (CR) soft-copy display, based not on the window/level settings, but on processing applied to the image optimized for visualization of specific findings, pathologies, etc (i.e., pneumothorax, tumor, tube location). It is felt that digital display of CR images based on finding-specific processing presets has the potential to: speed reading of digital projection x-ray examinations on soft copy; improve diagnostic efficacy; standardize display across examination type, clinical scenario, important key findings, and significant negatives; facilitate image comparison; and improve confidence in and acceptance of soft-copy reading. Clinical chest images are acquired using an Agfa-Gevaert (Mortsel, Belgium) ADC 70 CR scanner and Fuji (Stamford, CT) 9000 and AC2 CR scanners. Those demonstrating pertinent findings are transferred over the clinical picture archiving and communications system (PACS) network to a research image processing station (Agfa PS5000), where the optimal image-processing settings per finding, pathologic category, etc, are developed in conjunction with a thoracic radiologist, by manipulating the multiscale image contrast amplification (Agfa MUSICA) algorithm parameters. Soft-copy display of images processed with finding-specific settings are compared with the standard default image presentation for 50 cases of each category. Comparison is scored using a 5-point scale with the positive scale denoting the standard presentation is preferred over the finding-specific processing, the negative scale denoting the finding-specific processing is preferred over the standard presentation, and zero denoting no difference. Processing settings have been developed for several findings including pneumothorax and lung nodules, and clinical cases are currently being collected in preparation for formal clinical trials. Preliminary results indicate a preference for the optimized-processing presentation of images over the standard default, particularly by inexperienced radiology residents and referring clinicians.
International standards for programmes of training in intensive care medicine in Europe.
2011-03-01
To develop internationally harmonised standards for programmes of training in intensive care medicine (ICM). Standards were developed by using consensus techniques. A nine-member nominal group of European intensive care experts developed a preliminary set of standards. These were revised and refined through a modified Delphi process involving 28 European national coordinators representing national training organisations using a combination of moderated discussion meetings, email, and a Web-based tool for determining the level of agreement with each proposed standard, and whether the standard could be achieved in the respondent's country. The nominal group developed an initial set of 52 possible standards which underwent four iterations to achieve maximal consensus. All national coordinators approved a final set of 29 standards in four domains: training centres, training programmes, selection of trainees, and trainers' profiles. Only three standards were considered immediately achievable by all countries, demonstrating a willingness to aspire to quality rather than merely setting a minimum level. Nine proposed standards which did not achieve full consensus were identified as potential candidates for future review. This preliminary set of clearly defined and agreed standards provides a transparent framework for assuring the quality of training programmes, and a foundation for international harmonisation and quality improvement of training in ICM.
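As an illustration of the kind of agreement data a Delphi process like this produces, the sketch below tallies hypothetical coordinator votes against an assumed consensus threshold. Both the standards named and the 80% threshold are invented for illustration; the study's actual decision rule is not stated here.

```python
# Hypothetical ballot for one Delphi round: 1 = agree, 0 = disagree,
# one vote per national coordinator (28 in the study).
votes = {
    "24h consultant cover in training centres": [1] * 25 + [0] * 3,
    "minimum 3 months of paediatric ICM":       [1] * 20 + [0] * 8,
}

THRESHOLD = 0.80  # assumed consensus level, for illustration only
for standard, ballots in votes.items():
    agreement = sum(ballots) / len(ballots)
    verdict = "retain" if agreement >= THRESHOLD else "revise and re-circulate"
    print(f"{agreement:5.0%}  {verdict}  - {standard}")
```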
CR softcopy display presets based on optimum visualization of specific findings
NASA Astrophysics Data System (ADS)
Andriole, Katherine P.; Gould, Robert G.; Webb, W. R.
1999-07-01
The purpose of this research is to assess the utility of providing presets for computed radiography (CR) softcopy display, based not on the window/level settings, but on image processing applied to the image based on optimization for visualization of specific findings, pathologies, etc. Clinical chest images are acquired using an Agfa ADC 70 CR scanner, and transferred over the PACS network to an image processing station which has the capability to perform multiscale contrast equalization. The optimal image processing settings per finding are developed in conjunction with a thoracic radiologist by manipulating the multiscale image contrast amplification algorithm parameters. Softcopy display of images processed with finding-specific settings are compared with the standard default image presentation for fifty cases of each category. Comparison is scored using a five point scale with positive one and two denoting the standard presentation is preferred over the finding-specific presets, negative one and two denoting the finding-specific preset is preferred over the standard presentation, and zero denoting no difference. Presets have been developed for pneumothorax and clinical cases are currently being collected in preparation for formal clinical trials. Subjective assessments indicate a preference for the optimized-preset presentation of images over the standard default, particularly by inexperienced radiology residents and referring clinicians.
Process Improvement of Reactive Dye Synthesis Using Six Sigma Concept
NASA Astrophysics Data System (ADS)
Suwanich, Thanapat; Chutima, Parames
2017-06-01
This research focuses on a problem in the reactive dye synthesis process of a global manufacturer in Thailand that produces various chemicals for reactive dye products supplied to global industries such as chemicals, textiles and garments. The product named “Reactive Blue Base” is selected for this study because it has the highest demand and its current chemical yield shows high variation, i.e. a yield range of 90.4% - 99.1% (S.D. = 2.405 and Cpk = -0.08) with an average yield of 94.5% (lower than the 95% standard set by the company). The Six Sigma concept is applied with the aim of increasing the yield and reducing the variation of this process. This approach is suitable since it provides a systematic guideline with five improvement phases (DMAIC) to tackle the problem effectively and find appropriate parameter settings for the process. Under the new parameter settings, the process yield variation is reduced to a range of 96.5% - 98.5% (S.D. = 0.525 and Cpk = 1.83) and the average yield is increased to 97.5% (higher than the 95% standard set by the company).
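For context on the Cpk figures quoted, here is a minimal sketch of the capability index computed against the company's 95% lower yield limit: Cpk is the distance from the process mean to the nearest specification limit, in units of three standard deviations. Using the rounded means and standard deviations from the abstract it lands near, but not exactly on, the reported values, which presumably used unrounded data.

```python
def cpk(mean, sd, lsl=None, usl=None):
    """Process capability index: distance from the mean to the nearest
    specification limit, divided by 3 sigma."""
    sides = []
    if usl is not None:
        sides.append((usl - mean) / (3 * sd))
    if lsl is not None:
        sides.append((mean - lsl) / (3 * sd))
    return min(sides)

# One-sided lower spec: the company's 95% minimum yield target.
print(f"before: Cpk = {cpk(94.5, 2.405, lsl=95):.2f}")  # approx -0.07 (reported -0.08)
print(f"after:  Cpk = {cpk(97.5, 0.525, lsl=95):.2f}")  # approx  1.59 (reported  1.83)
```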
ERIC Educational Resources Information Center
Abdel-Messih, Ibrahim Adib; El-Setouhy, Maged; Crouch, Michael M.; Earhart, Kenneth C.
2008-01-01
Research is conducted in a variety of cultural settings. Ethical standards developed in Europe and the Americas are increasingly applied in these settings, many of which are culturally different from the countries in which these standards originated. To overcome these cultural differences, investigators may be tempted to deviate from ethical…
ERIC Educational Resources Information Center
Kontos, Pia C.; Miller, Karen-Lee; Mitchell, Gail J.
2010-01-01
Purpose: The Resident Assessment Instrument-Minimum Data Set (RAI/MDS) is an interdisciplinary standardized process that informs care plan development in nursing homes. This standardized process has failed to consistently result in individualized care planning, which may suggest problems with content and planning integrity. We examined the…
Accreditation in the Professions: Implications for Educational Leadership Preparation Programs
ERIC Educational Resources Information Center
Pavlakis, Alexandra; Kelley, Carolyn
2016-01-01
Program accreditation is a process based on a set of professional expectations and standards meant to signal competency and credibility. Although accreditation has played an important role in shaping educational leadership preparation programs, recent revisions to accreditation processes and standards have highlighted attention to the purposes,…
Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J
2012-11-09
A new alternative data processing tool set, metAlignID, is developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software is developed to run multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and is as such capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries; NIST = National Institute of Standards and Technology) and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch wise. The overall time needed for conversion together with processing and searching of 30 raw data sets for 560 compounds is routinely within an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than the standard processing and it also adds a quantitative estimate. The software tool set in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry shows great potential as a highly-automated and fast multi-residue instrumental screening method. Copyright © 2012 Elsevier B.V. All rights reserved.
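The batch flow described (convert to netCDF, preprocess, search a target library, report per file, one worker per core) can be sketched generically as below. All helper functions are stubbed stand-ins labeled as such; metAlignID's actual interfaces are not reproduced here.

```python
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

# Hypothetical stand-ins for the real steps (conversion, metAlign-style
# preprocessing, NIST-compatible library search); stubbed so the sketch runs.
def convert_to_netcdf(path):           # stands in for the fast converter
    return f"{path.stem}.cdf"
def preprocess(cdf):                   # stands in for metAlign peak/noise reduction
    return ["peak1", "peak2"]
def search_library(peaks, lib):        # stands in for the target-library search
    return [("chlorpyrifos", 0.12)]    # (compound, indicative concentration)

def process_one(raw_path: Path) -> str:
    hits = search_library(preprocess(convert_to_netcdf(raw_path)), "targets.lib")
    return f"{raw_path.name}: {len(hits)} target compounds found"

if __name__ == "__main__":
    raw_files = [Path(f"run_{i}.raw") for i in range(4)]  # placeholder batch
    with ProcessPoolExecutor() as pool:   # one worker per core, as in the tool
        for line in pool.map(process_one, raw_files):
            print(line)
```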
Education Technology Standards Self-Efficacy (ETSSE) Scale: A Validity and Reliability Study
ERIC Educational Resources Information Center
Simsek, Omer; Yazar, Taha
2016-01-01
Problem Statement: The educational technology standards for teachers set by the International Society for Technology in Education (the ISTE Standards-T) represent an important framework for using technology effectively in teaching and learning processes. These standards are widely used by universities, educational institutions, and schools. The…
Space Generic Open Avionics Architecture (SGOAA) standard specification
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1993-01-01
The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture interface model to the design of a specific avionics hardware/software system. This standard defines a generic set of system interface points to facilitate identification of critical interfaces and establishes the requirements for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics system and processing architecture models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
A process-based standard for the Solar Energetic Particle Event Environment
NASA Astrophysics Data System (ADS)
Gabriel, Stephen
For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several models available should be selected as the standard. Most of these discussions at the ISO WG4 meetings and conferences have centred on the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a 'process-based' standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with the presence of not only different models, which could have quite distinct modelling approaches, but also different underlying data sets. In essence, a process-based standard overcomes these issues by allowing there to be more than one model rather than a single standard model; however, any such model has to be completely transparent: the data set and the modelling techniques used have not only to be clearly and unambiguously defined but also to be subject to peer review. If the model meets all of these requirements then it should be acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? In a sense, it does not remove all of the differences but only some of them; most importantly, it allows something which so far has been impossible without ambiguities and disagreement: a comparison of the results of the various models. To date, one of the problems (if not the major one) in comparing the results of the various SEPE statistical models has been caused by two things: 1) the data set and 2) the definition of an event. Because unravelling the dependencies of the outputs of different statistical models on these two parameters is extremely difficult, if not impossible, comparison of the results from the different models is currently also extremely difficult and can lead to controversies, especially over which model is the correct one; hence, when these models are used for engineering purposes to calculate, for example, the radiation dose for a particular mission, the user, who is in all likelihood not an expert in this field, could be given two (or even more) very different environments and find it impossible to know how to select one (or even how to compare them). What is proposed, then, is a process-based standard which, in common with nearly all of the current models, is composed of three elements: a standard data set, a standard event definition and a resulting standard event list. A standard event list is the output of this standard and can then be used with any of the existing (or indeed future) models that are based on events. This standard event list is completely traceable and transparent and represents a reference event list for the whole community. When coupled with a statistical model, the results when compared will depend only on the statistical model and not on the data set or event definition.
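The proposal amounts to: a fixed data set plus an unambiguous event definition yields a reproducible event list. The sketch below extracts events from a flux time series with a simple threshold-crossing definition; the threshold echoes the common >10 pfu convention for >10 MeV protons, but both the definition and the data here are illustrative assumptions, not the standard's content.

```python
def extract_events(times, flux, threshold=10.0):
    """Return (start, end) pairs: an event begins when flux reaches the
    threshold and ends when it falls back below it."""
    events, start = [], None
    for t, f in zip(times, flux):
        if f >= threshold and start is None:
            start = t                      # event onset
        elif f < threshold and start is not None:
            events.append((start, t))      # event end
            start = None
    if start is not None:
        events.append((start, times[-1]))  # event still in progress at end of data
    return events

times = list(range(10))
flux = [1, 3, 12, 40, 25, 8, 2, 15, 11, 4]   # invented proton flux (pfu)
print(extract_events(times, flux))           # [(2, 5), (7, 9)]
```

With the data set and event definition fixed and published, any two statistical models can then be compared on the same standard event list, which is the impasse the abstract says the process-based approach removes.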
Unifying K-12 Learning Processes: Integrating Curricula through Learning
ERIC Educational Resources Information Center
Bosse, Michael J.; Fogarty, Elizabeth A.
2011-01-01
This study was designed to examine whether a set of cross-curricular learning processes could be found in the respective K-12 US national standards for math, language arts, foreign language, science, social studies, fine arts, and technology. Using a qualitative research methodology, the standards from the national associations for these content…
76 FR 23714 - Railroad Safety Appliance Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-28
... would include each of the elements that would be necessary to allow it to make an informed decision on a... for the review and approval of existing industry standards. This process will permit railroad industry.... Section 238.230 borrows the process set out in Sec. 238.21. It allows a recognized representative of the...
The Next Generation Science Standards: The Features and Challenges
ERIC Educational Resources Information Center
Pruitt, Stephen L.
2014-01-01
Beginning in January of 2010, the Carnegie Corporation of New York funded a two-step process to develop a new set of state developed science standards intended to prepare students for college and career readiness in science. These new internationally benchmarked science standards, the Next Generation Science Standards (NGSS) were completed in…
Standards Setting and Federal Information Policy: The Escrowed Encryption Standard (EES).
ERIC Educational Resources Information Center
Gegner, Karen E.; Veeder, Stacy B.
1994-01-01
Examines the standards process used for developing the Escrowed Encryption Standard (EES) and its possible impact on national communication and information policies. Discusses the balance between national security and law enforcement concerns versus privacy rights and economic competitiveness in the area of foreign trade and export controls. (67…
The Alignment of CEC/DEC and NAEYC Personnel Preparation Standards
ERIC Educational Resources Information Center
Chandler, Lynette K.; Cochran, Deborah C.; Christensen, Kimberly A.; Dinnebeil, Laurie A.; Gallagher, Peggy A.; Lifter, Karin; Stayton, Vicki D.; Spino, Margie
2012-01-01
This article describes the process for alignment of the personnel preparation standards developed by the Council for Exceptional Children and Division for Early Childhood with the standards developed by the National Association for the Education of Young Children. The results identify areas of convergence across the two sets of standards and areas…
40 CFR 471.63 - New source performance standards (NSPS).
Code of Federal Regulations, 2014 CFR
2014-07-01
... GUIDELINES AND STANDARDS (CONTINUED) NONFERROUS METALS FORMING AND METAL POWDERS POINT SOURCE CATEGORY Titanium Forming Subcategory § 471.63 New source performance standards (NSPS). Any new source subject to... wastewater pollutants from titanium process wastewater shall not exceed the values set forth below: (a...
40 CFR 471.63 - New source performance standards (NSPS).
Code of Federal Regulations, 2012 CFR
2012-07-01
... GUIDELINES AND STANDARDS (CONTINUED) NONFERROUS METALS FORMING AND METAL POWDERS POINT SOURCE CATEGORY Titanium Forming Subcategory § 471.63 New source performance standards (NSPS). Any new source subject to... wastewater pollutants from titanium process wastewater shall not exceed the values set forth below: (a...
40 CFR 461.43 - New source performance standards (NSPS).
Code of Federal Regulations, 2010 CFR
2010-07-01
... GUIDELINES AND STANDARDS BATTERY MANUFACTURING POINT SOURCE CATEGORY Leclanche Subcategory § 461.43 New... subject to this subpart shall not exceed the standards set forth below: (1) Subpart D—Foliar Battery... process wastewater pollutants from any battery manufacturing operation other than those battery...
40 CFR 461.43 - New source performance standards (NSPS).
Code of Federal Regulations, 2011 CFR
2011-07-01
... GUIDELINES AND STANDARDS BATTERY MANUFACTURING POINT SOURCE CATEGORY Leclanche Subcategory § 461.43 New... subject to this subpart shall not exceed the standards set forth below: (1) Subpart D—Foliar Battery... process wastewater pollutants from any battery manufacturing operation other than those battery...
Getting to Know Governmental GAAP.
ERIC Educational Resources Information Center
Bissell, George E.
1987-01-01
Presents the history and an overview of how generally accepted accounting principles (GAAP) are established and by what process the standards are created. School business officials are invited to participate in the Governmental Accounting Standards Board (GASB), established as the standard setting body for state and local governments. (MLF)
2016 Standard Scenarios Report: A U.S. Electricity Sector Outlook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Wesley; Mai, Trieu; Logan, Jeffrey
The National Renewable Energy Laboratory is conducting a study sponsored by the Office of Energy Efficiency and Renewable Energy (EERE) that aims to document and implement an annual process designed to identify a realistic and timely set of input assumptions (e.g., technology cost and performance, fuel costs), and a diverse set of potential futures (standard scenarios), initially for electric sector analysis.
Increasing Consistency and Transparency in Considering Costs and Benefits in the Rulemaking Process
Advance notice of proposed rulemaking on standardizing the terminology and the degree of specificity provided in each law regarding the nature and scope of cost and benefit considerations when setting pollution standards.
Proofs without Words: A Visual Application of Reasoning and Proof
ERIC Educational Resources Information Center
Bell, Carol J.
2011-01-01
Reasoning and Proof is one of the Process Standards set forth in National Council of Teachers of Mathematics' (NCTM) "Principles and Standards for School Mathematics." Thus, it is important to give students opportunities to build their reasoning skills and aid their understanding of the proof process. Teaching students how to do proofs is a…
Cataloguing Standards; The Report of the Canadian Task Group on Cataloguing Standards.
ERIC Educational Resources Information Center
National Library of Canada, Ottawa (Ontario).
Following the recommendations of the National Conference on Cataloguing Standards held at the National Library of Canada in May 1970, a Canadian Task Group on Cataloguing Standards was set up to study and identify present deficiencies in the organizing and processing of Canadian material, and the cataloging problems of Canadian libraries, and to…
Code of Federal Regulations, 2010 CFR
2010-07-01
... the requirements set forth in §§ 312.23 through 312.31: (a) The procedures of ASTM International... Site Assessment Process.” (b) The procedures of ASTM International Standard E2247-08 entitled “Standard... or Rural Property.” This standard is available from ASTM International at http://www.astm.org, 1-610...
Control of Air Pollution from Aviation: The Emission Standard Setting Process.
1981-01-01
[Table-of-contents fragment; recoverable headings: VIII-2 Organic Emissions from Gas Turbine Engines; VIII-3 The Reactivity of Aircraft Compared with Other Emission Sources; ...Setting Process; VIII-1 Gas Turbine Pollutant Formation and Decomposition; A-4-3 Aircraft Gas Turbine Pollution Considerations; A-4-4 Primary Zone Enrichment, Delayed Dilution, and Airblast]
Fast interrupt platform for extended DOS
NASA Technical Reports Server (NTRS)
Duryea, T. W.
1995-01-01
Extended DOS offers a unique combination: a simple operating system that allows direct access to the interrupt tables, 32-bit protected-mode access to a 4096-MByte address space, and the use of industry-standard C compilers. The drawback is that fast interrupt handling requires both 32-bit and 16-bit versions of each real-time process interrupt handler to avoid mode switches on interrupts. A set of tools has been developed that automates the process of transforming the output of a standard 32-bit C compiler into 16-bit interrupt code that directly handles the real-mode interrupts. The entire system compiles from one set of source code via a makefile, which boosts productivity by making management of the compile-link cycle very simple. The software components are in the form of classes written mostly in C. A foreground process, written as a conventional application that can use the standard C libraries, can communicate with the background real-time classes via a message-passing mechanism. The platform thus enables the integration of high-performance real-time processing into a conventional application framework.
Transportable GPU (General Processor Units) chip set technology for standard computer architectures
NASA Astrophysics Data System (ADS)
Fosdick, R. E.; Denison, H. C.
1982-11-01
The USAFR-developed GPU Chip Set has been utilized by Tracor to implement both USAF and Navy Standard 16-Bit Airborne Computer Architectures. Both configurations are currently being delivered into DOD full-scale development programs. Leadless Hermetic Chip Carrier packaging has facilitated implementation of both architectures on single 4 1/2 x 5 substrates. The CMOS and CMOS/SOS implementations of the GPU Chip Set have allowed both CPU implementations to use less than 3 watts of power each. Recent efforts by Tracor for USAF have included the definition of a next-generation GPU Chip Set that will retain the application-proven architecture of the current chip set while offering the added cost advantages of transportability across ISO-CMOS and CMOS/SOS processes and across numerous semiconductor manufacturers using a newly-defined set of common design rules. The Enhanced GPU Chip Set will increase speed by an approximate factor of 3 while significantly reducing chip counts and costs of standard CPU implementations.
NASA Astrophysics Data System (ADS)
Moyle, Steve
Collaborative Data Mining is a setting where the Data Mining effort is distributed to multiple collaborating agents - human or software. The objective of the collaborative Data Mining effort is to produce solutions to the tackled Data Mining problem which are considered better, by some metric, than those solutions that would have been achieved by individual, non-collaborating agents. The solutions require evaluation, comparison, and approaches for combination. Collaboration requires communication, and implies some form of community. The human form of collaboration is a social task. Organizing communities in an effective manner is non-trivial and often requires well-defined roles and processes. Data Mining, too, benefits from a standard process. This chapter explores the standard Data Mining process CRISP-DM utilized in a collaborative setting.
Geometric representation methods for multi-type self-defining remote sensing data sets
NASA Technical Reports Server (NTRS)
Anuta, P. E.
1980-01-01
Efficient and convenient representation of remote sensing data is highly important for its effective utilization. The task of merging different data types is currently dealt with by treating each case as an individual problem. A description is provided of work carried out to standardize the multidata merging process. The basic concept of the new approach is that of the self-defining data set (SDDS). The creation of a standard is proposed. This standard would be such that data which may be of interest in a large number of earth resources remote sensing applications would be in a format which allows convenient and automatic merging. Attention is given to details regarding the multidata merging problem, a geometric description of multitype data sets, image reconstruction from track-type data, a data set generation system, and an example multitype data set.
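As an illustration of the self-defining idea, the sketch below shows a data set that carries its own geometric and radiometric description, so that a merging tool can test compatibility automatically. All field names and values are invented for illustration and are not the SDDS format proposed in the report.

```python
# A minimal self-defining data set: the header travels with the data,
# so a merging tool can align grids and channels without external
# documentation. Field names here are illustrative assumptions.
sdds = {
    "header": {
        "sensor": "example-scanner",
        "channels": ["band_1", "band_2"],
        "units": "radiance (W m^-2 sr^-1 um^-1)",
        "grid": {"rows": 2, "cols": 2, "pixel_size_m": 30.0,
                 "origin": (430500.0, 4633500.0),
                 "projection": "UTM zone 13N"},
    },
    "data": [[0.21, 0.25], [0.19, 0.30]],
}

def can_merge(a, b):
    """Two self-defining sets are mergeable if their grids coincide."""
    return a["header"]["grid"] == b["header"]["grid"]

print(can_merge(sdds, sdds))   # True: identical geometry
```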
Table of Historical Carbon Monoxide (CO) National Ambient Air Quality Standards (NAAQS)
See the history of limits to the level of carbon monoxide (CO) in ambient air, set through the NAAQS review and rulemaking process under the Clean Air Act. This includes both primary and secondary standards.
This regulation sets standards for the protection of public health, safety, and the environment from radiological and non-radiological hazards from uranium and thorium ore processing and disposal of associated wastes.
Table of Historical Nitrogen Dioxide National Ambient Air Quality Standards (NAAQS)
See the history of limits to the level of nitrogen dioxide (NO2) in ambient air, set through the NAAQS review and rulemaking process under the Clean Air Act. This includes both primary and secondary standards.
Table of Historical Sulfur Dioxide National Ambient Air Quality Standards (NAAQS)
See the history of limits to the level of sulfur dioxide (SO2) in ambient air, set through the NAAQS review and rulemaking process under the Clean Air Act. This includes both primary and secondary standards.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendell, Mark J.; Fisk, William J.
Background - The goal of this project, with a focus on commercial buildings in California, was to develop a new framework for evidence-based minimum ventilation rate (MVR) standards that protect occupants in buildings while also considering energy use and cost. This was motivated by research findings suggesting that current prescriptive MVRs in commercial buildings do not provide occupants with fully safe and satisfactory indoor environments. Methods - The project began with a broad review in several areas: the diverse strategies now used for standards or guidelines for MVRs or for environmental contaminant exposures, current knowledge about adverse human effects associated with VRs, and current knowledge about contaminants in commercial buildings, including their presence, their adverse human effects, and their relationships with VRs. Based on a synthesis of the reviewed information, new principles and approaches are proposed for setting evidence-based VR standards for commercial buildings, considering a range of human effects including health, performance, and acceptability of air. Results - A review and evaluation is first presented of current approaches to setting prescriptive building ventilation standards and setting acceptable limits for human contaminant exposures in outdoor air and occupational settings. Recent research on approaches to setting acceptable levels of environmental exposures in evidence-based MVR standards is also described. From a synthesis and critique of these materials, a set of principles for setting MVRs is presented, along with an example approach based on these principles. The approach combines two sequential strategies. In a first step, an acceptable threshold is set for each adverse outcome that has a demonstrated relationship to VRs, as an increase from a (low) outcome level at a high reference ventilation rate (RVR, the VR needed to attain the best achievable levels of the adverse outcome); MVRs required to meet each specific outcome threshold are estimated; and the highest of these MVRs, which would then meet all outcome thresholds, is selected as the target MVR. In a second step, implemented only if the target MVR from step 1 is judged impractically high, costs and benefits are estimated and this information is used in a risk management process. Four human outcomes with substantial quantitative evidence of relationships to VRs are identified for initial consideration in setting MVR standards. These are: building-related symptoms (sometimes called sick building syndrome symptoms), poor perceived indoor air quality, and diminished work performance, all with data relating them directly to VRs; and cancer and non-cancer chronic outcomes, related indirectly to VRs through specific VR-influenced indoor contaminants. In an application of step 1 for offices using a set of example outcome thresholds, a target MVR of 9 L/s (19 cfm) per person was needed. Because this target MVR was close to MVRs in current standards, use of a cost/benefit process seemed unnecessary. Selection of more stringent thresholds for one or more human outcomes, however, could raise the target MVR to 14 L/s (30 cfm) per person or higher, triggering the step 2 risk management process. Consideration of outdoor air pollutant effects would add further complexity to the framework.
For balancing the objective and subjective factors involved in setting MVRs in a cost-benefit process, it is suggested that a diverse group of stakeholders make the determination after assembling as much quantitative data as possible.
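The two-step logic lends itself to a compact sketch. The per-outcome MVR values below are hypothetical placeholders; only the max-then-escalate structure reflects the framework described above.

```python
def target_mvr(outcome_mvrs, practical_limit=14.0):
    """Step 1 of the framework: the MVR that satisfies every
    outcome-specific threshold is the largest of the per-outcome MVRs.
    Step 2 (cost/benefit risk management) is triggered only if that
    target is judged impractically high. All numbers are illustrative.
    """
    target = max(outcome_mvrs.values())
    needs_step2 = target > practical_limit
    return target, needs_step2

# Hypothetical per-outcome MVRs in L/s per person:
mvrs = {"building_related_symptoms": 9.0,
        "perceived_air_quality": 7.5,
        "work_performance": 8.0,
        "chronic_health_outcomes": 6.0}
print(target_mvr(mvrs))   # (9.0, False): close to current standards
```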
Management Reporting Standards for Educational Institutions: Fund Raising and Related Activities.
ERIC Educational Resources Information Center
National Association of College and University Business Officers, Washington, DC.
A set of definitions is presented to create common cost and gift reporting standards for fund raisers and business officers at colleges, universities, and independent secondary schools. The objective is to bring clarity and consistency to the gift reporting process. In addition, the standards and management report formats provide useful tools…
Educators' Perspectives: Survey on the 2009 CEC Advanced Content Standards
ERIC Educational Resources Information Center
Othman, Lama Bergstrand; Kieran, Laura; Anderson, Christine J.
2015-01-01
Educators who pursue an advanced degree or certification in special education must learn and master the Advanced Content Standards as set forth by the Council for Exceptional Children. These six content standards were validated by the CEC to guide educators through the process of assuming an advanced role in special education teaching or…
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report documents the development and testing of a set of recommendations generated to serve as a primary basis for the Congressionally-mandated residential standard. This report treats only the residential building recommendations.
DOT National Transportation Integrated Search
1999-01-01
Significant changes to standards and regulations that influence metropolitan transportation planning in many areas were made in 1997. Specifically, the U.S. EPA issued both a new set of National Ambient Air Quality Standards(NAAQS) and major revision...
NASA Astrophysics Data System (ADS)
Peckham, S. D.
2013-12-01
Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
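For readers unfamiliar with BMI, the following is a rough sketch of the interface's flavor: control functions plus self-description functions wrapped around a toy model. The method names follow the published Basic Model Interface, but the set shown is heavily abridged, and the diffusion model, grid and standard-name usage are illustrative assumptions.

```python
class BmiHeatDiffusion:
    """Abridged BMI-style wrapper around a toy 1-D diffusion model."""

    # --- control functions: make the model fully caller-controllable ---
    def initialize(self, config_file=None):
        self.temperature = [0.0, 0.0, 100.0, 0.0, 0.0]
        self.alpha, self.time, self.dt = 0.25, 0.0, 1.0

    def update(self):
        """Advance the state variables by one time step (explicit scheme)."""
        t = self.temperature
        self.temperature = [
            t[i] if i in (0, len(t) - 1)      # fixed boundary values
            else t[i] + self.alpha * (t[i - 1] - 2 * t[i] + t[i + 1])
            for i in range(len(t))
        ]
        self.time += self.dt

    def finalize(self):
        self.temperature = None

    # --- description functions: make the model self-describing ---
    def get_output_var_names(self):
        return ("land_surface__temperature",)  # a CSDMS-style standard name

    def get_var_units(self, name):
        return "degrees_celsius"

    def get_value(self, name):
        return list(self.temperature)

model = BmiHeatDiffusion()
model.initialize()
for _ in range(10):
    model.update()
print(model.get_value("land_surface__temperature"))
model.finalize()
```

A framework never needs to know this is a diffusion model: it drives the standard methods and matches the standard names against other components' inputs.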
Enhanced Cumulative Sum Charts for Monitoring Process Dispersion
Abujiya, Mu’azu Ramat; Riaz, Muhammad; Lee, Muhammad Hisyam
2015-01-01
The cumulative sum (CUSUM) control chart is widely used in industry for the detection of small and moderate shifts in process location and dispersion. For efficient monitoring of process variability, we present several CUSUM control charts for monitoring changes in the standard deviation of a normal process. The newly developed control charts, based on well-structured sampling techniques (extreme ranked set sampling, extreme double ranked set sampling and double extreme ranked set sampling), have significantly enhanced the CUSUM chart's ability to detect a wide range of shifts in process variability. The relative performances of the proposed CUSUM scale charts are evaluated in terms of the average run length (ARL) and the standard deviation of the run length for point shifts in variability. Moreover, for overall performance, we employ the average ratio ARL and the average extra quadratic loss. A comparison of the proposed CUSUM control charts with the classical CUSUM R chart, the classical CUSUM S chart, the fast initial response (FIR) CUSUM R chart, the FIR CUSUM S chart, the ranked set sampling (RSS) based CUSUM R chart and the RSS based CUSUM S chart, among others, is presented. An illustrative example using a real dataset is given to demonstrate the practical application of the proposed schemes. PMID:25901356
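As a baseline for what the proposed charts improve upon, here is a minimal sketch of a classical two-sided CUSUM on standardized subgroup standard deviations. The reference value k and decision interval h are conventional textbook choices, and the standardization uses a rough large-sample approximation; this is not the ranked-set-sampling scheme developed in the paper.

```python
import math, random

def cusum_dispersion(samples, sigma0, k=0.5, h=4.0):
    """Two-sided textbook CUSUM on standardized subgroup standard
    deviations. Returns the index of the first out-of-control
    subgroup, or None if no signal occurs.
    """
    c_plus = c_minus = 0.0
    for i, subgroup in enumerate(samples):
        n = len(subgroup)
        mean = sum(subgroup) / n
        s = math.sqrt(sum((x - mean) ** 2 for x in subgroup) / (n - 1))
        # Rough large-sample standard error of s is sigma/sqrt(2(n-1)).
        z = (s - sigma0) / (sigma0 / math.sqrt(2 * (n - 1)))
        c_plus = max(0.0, c_plus + z - k)    # upward shift accumulator
        c_minus = max(0.0, c_minus - z - k)  # downward shift accumulator
        if c_plus > h or c_minus > h:
            return i
    return None

random.seed(1)
# In-control sd 1.0 for 20 subgroups, then a shift to sd 1.8:
data = [[random.gauss(0, 1.0 if i < 20 else 1.8) for _ in range(5)]
        for i in range(40)]
print(cusum_dispersion(data, sigma0=1.0))   # signals shortly after index 20
```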
Karapetyan, Karen; Batchelor, Colin; Sharpe, David; Tkachenko, Valery; Williams, Antony J
2015-01-01
There are presently hundreds of online databases hosting millions of chemical compounds and associated data. Because of the number of cheminformatics software tools that can be used to produce the data, subtle differences between the various cheminformatics platforms, and the naivety of some software users, a myriad of issues can exist with chemical structure representations online. In order to help facilitate validation and standardization of chemical structure datasets from various sources, we have delivered a freely available internet-based platform to the community for the processing of chemical compound datasets. The chemical validation and standardization platform (CVSP) both validates and standardizes chemical structure representations according to sets of systematic rules. The chemical validation algorithms detect issues with submitted molecular representations using pre-defined or user-defined dictionary-based molecular patterns that are chemically suspicious or potentially require manual review. Each identified issue is assigned one of three levels of severity (Information, Warning, or Error) in order to conveniently inform the user of the need to browse and review subsets of their data. The validation process covers atoms and bonds (e.g., flagging query atoms and bonds), valences, and stereochemistry. The standard form of submission of collections of data, the SDF file, allows the user to map the data fields to predefined CVSP fields for the purpose of cross-validating associated SMILES and InChIs with the connection tables contained within the SDF file. This platform has been applied to the analysis of a large number of data sets prepared for deposition to our ChemSpider database and in preparation of data for the Open PHACTS project. In this work we review the results of the automated validation of the DrugBank dataset, a popular drug and drug target database utilized by the community, and the ChEMBL 17 data set. The CVSP web site is located at http://cvsp.chemspider.com/. A platform for the validation and standardization of chemical structure representations of various formats has been developed and made available to the community to assist and encourage the processing of chemical structure files to produce more homogeneous compound representations for exchange and interchange between online databases. While the CVSP platform is designed with flexibility inherent in the rules that can be used for processing the data, we have produced a recommended rule set based on our own experiences with large data sets such as DrugBank, ChEMBL, and data sets from ChemSpider.
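The severity-tiered rule idea can be sketched generically as follows. The rules shown are toy record-completeness checks, not CVSP's dictionary-based molecular pattern matching, and the field names are assumptions.

```python
from enum import Enum

class Severity(Enum):
    INFORMATION = 1
    WARNING = 2
    ERROR = 3

# Each rule pairs a predicate with a severity, mirroring CVSP's
# Information/Warning/Error scheme; the checks themselves are invented.
RULES = [
    ("empty structure",      lambda rec: not rec.get("molfile"),  Severity.ERROR),
    ("missing InChI",        lambda rec: not rec.get("inchi"),    Severity.WARNING),
    ("no synonyms supplied", lambda rec: not rec.get("synonyms"), Severity.INFORMATION),
]

def validate(record):
    """Return the list of (rule name, severity) issues for one record."""
    return [(name, sev) for name, test, sev in RULES if test(record)]

print(validate({"molfile": "...", "inchi": "", "synonyms": ["aspirin"]}))
# -> [('missing InChI', <Severity.WARNING: 2>)]
```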
Cortesi, Marilisa; Bandiera, Lucia; Pasini, Alice; Bevilacqua, Alessandro; Gherardi, Alessandro; Furini, Simone; Giordano, Emanuele
2017-01-01
Quantifying gene expression at the single cell level is fundamental for the complete characterization of synthetic gene circuits, due to the significant impact of noise and inter-cellular variability on the system's functionality. Commercial set-ups that allow the acquisition of fluorescent signal at single cell level (flow cytometers or quantitative microscopes) are expensive apparatuses that are hardly affordable by small laboratories. A protocol that makes a standard optical microscope able to acquire quantitative, single cell, fluorescent data from a bacterial population transformed with synthetic gene circuitry is presented. Single cell fluorescence values, acquired with a microscope set-up and processed with custom-made software, are compared with results that were obtained with a flow cytometer in a bacterial population transformed with the same gene circuitry. The high correlation between data from the two experimental set-ups, with a correlation coefficient computed over the tested dynamic range > 0.99, proves that a standard optical microscope, when coupled with appropriate software for image processing, can be used for quantitative single-cell fluorescence measurements. The calibration of the set-up, together with its validation, is described. The experimental protocol described in this paper makes quantitative measurement of single cell fluorescence accessible to laboratories equipped with standard optical microscope set-ups. Our method allows for affordable measurement and quantification of intercellular variability, a better understanding of which will improve our comprehension of cellular behaviors and the design of synthetic gene circuits. All the required software is freely available to the synthetic biology community (MUSIQ, Microscope flUorescence SIngle cell Quantification).
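A heavily simplified sketch of the quantification step is shown below: background subtraction, global thresholding and connected-component labelling, followed by a per-cell mean intensity. The published protocol's calibration against flow cytometry and its correction steps are omitted, and the threshold rule and minimum area are assumptions.

```python
import numpy as np
from scipy import ndimage

def single_cell_fluorescence(image, background, min_area=20):
    """Quantify per-cell fluorescence from one microscope frame.

    `image` and `background` are 2-D arrays of equal shape. Cells are
    segmented with a crude global threshold (mean + 2 sd); each
    connected component above `min_area` pixels yields one mean value.
    """
    corrected = image.astype(float) - background          # background subtraction
    mask = corrected > corrected.mean() + 2 * corrected.std()
    labels, n = ndimage.label(mask)                       # connected components = candidate cells
    idx = range(1, n + 1)
    means = ndimage.mean(corrected, labels, index=idx)    # per-cell mean intensity
    areas = ndimage.sum(mask, labels, index=idx)          # per-cell pixel count
    return [m for m, a in zip(means, areas) if a >= min_area]
```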
Ni, Kuei-Jung
2013-01-01
Most international health-related standards are voluntary per se. However, the incorporation of international standard-making into WTO agreements like the SPS Agreement has drastically changed the status and effectiveness of the standards. WTO members are urged to follow international standards, even when not required to comply fully with them. Indeed, such standards have attained great influence in the trade system. Yet evidence shows that the credibility of the allegedly scientific approach of these international standard-setting institutions, especially the Codex Alimentarius Commission (Codex) governing food safety standards, has been eroded and diluted by industrial and political influences. Its decision-making is no longer based on consensus, but voting. The adoption of new safety limits for the veterinary drug ractopamine in 2012, by a very close vote, is simply another instance of the problematic operations of the Codex. These dynamics have led skeptics to question the legitimacy of the standard setting body and to propose solutions to rectify the situation. Prior WTO rulings have yet to pay attention to the defect in the decision-making processes of the Codex. Nevertheless, the recent Appellate Body decision on Hormones II is indicative of a deferential approach to national measures that are distinct from Codex formulas. The ruling also rejects the reliance on those experts who authored the Codex standards to assess new measures of the European Community. This approach provides an opportunity to contemplate what the proper relationship between the WTO and Codex ought to be. Through a critical review of WTO rulings and academic proposals, this article aims to analyze how the WTO ought to define such interactions and respond to the politicized standard-making process in an optimal manner. This article argues that building a more systematic approach and normative basis for WTO judicial review of standard-setting decisions and the selection of technical experts would be instrumental to strengthening the mutual supports between the WTO and international standard-setting organizations, and may help avoid the introduction of a prejudice toward a justified science finding.
Human Integration Design Processes (HIDP)
NASA Technical Reports Server (NTRS)
Boyer, Jennifer
2014-01-01
The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference missions. The HIDP is a reference document that is intended to be used during the development of crewed space systems and operations to guide human-systems development process activities.
USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOL IN POLLUTION PREVENTION
Computer-Aided Process Engineering has become established in industry as a design tool with the establishment of the CAPE-OPEN software specifications for process simulation environments. CAPE-OPEN provides a set of "middleware" standards that enable software developers to acces...
Software Safety Risk in Legacy Safety-Critical Computer Systems
NASA Technical Reports Server (NTRS)
Hill, Janice L.; Baggs, Rhoda
2007-01-01
Safety standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system. Process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective and some cover just the software in the system; NASA-STD-8719.13B, Software Safety Standard, is the current standard of interest. NASA programs/projects will have their own set of safety requirements derived from the standard. Safety cases: a) a documented demonstration that a system complies with the specified safety requirements; b) evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]; c) problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.
Digital health technology and trauma: development of an app to standardize care.
Hsu, Jeremy M
2015-04-01
Standardized practice results in less variation, thereby reducing errors and improving outcomes. Optimal trauma care is achieved through standardization, as is evidenced by the widespread adoption of the Advanced Trauma Life Support approach. The challenge for an individual institution is how to educate and promulgate these standardized processes widely and efficiently. In today's world, digital health technology must be considered in the process. The aim of this study was to describe the process of developing an app that includes standardized trauma algorithms. The objective of the app was to allow easy, real-time access to trauma algorithms and therefore reduce omissions and errors. A set of trauma algorithms, relevant to the local setting, was derived from the best available evidence. After obtaining grant funding, a collaborative endeavour was undertaken with an external specialist app development company. The process required 6 months to translate the existing trauma algorithms into an app. The app contains 32 separate trauma algorithms, each formatted as a single-page flow diagram. It utilizes specific smartphone features such as 'pinch to zoom', jump-words and pop-ups to allow rapid access to the desired information. Improvements in trauma care outcomes result from reducing variation. By incorporating digital health technology, a trauma app has been developed, allowing easy and intuitive access to evidence-based algorithms. © 2015 Royal Australasian College of Surgeons.
ERIC Educational Resources Information Center
Piskunova, Elena; Sokolova, Irina; Kalimullin, Aydar
2016-01-01
In the article, the problem of correspondence of educational standards of higher pedagogical education and teacher professional standards in Russia is actualized. Modern understanding of the quality of vocational education suggests that in the process of education the student develops a set of competencies that will enable him or her to carry out…
ERIC Educational Resources Information Center
Ellwein, Mary Catherine; Glass, Gene V.
A qualitative case study involving five educational institutions assessed the use of competency testing as a prerequisite for high school graduation, criterion for admission into college, criterion for teacher certification, and statewide assessment tool. Focus was on persons and processes involved in setting educational standards associated with…
Foundations for High Achievement: Safety, Civility, Literacy. K-12 Public Education.
ERIC Educational Resources Information Center
Colorado State Dept. of Education, Denver. Research and Evaluation Unit.
The state of Colorado has set high standards for students based on three fundamental principles: safety, civility, and literacy. How these standards were integrated into the schools is the subject of this report. It opens with an overview of the foundations of academic success and the process involved in implementing standards-based education. The…
Code of Federal Regulations, 2012 CFR
2012-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.68 Processing. (a) Sterile system. All administration and transfer sets inserted into blood containers used for processing Source Plasma intended for manufacturing into injectable or noninjectable products and all interior surfaces of plasma containers used for...
Code of Federal Regulations, 2011 CFR
2011-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.68 Processing. (a) Sterile system. All administration and transfer sets inserted into blood containers used for processing Source Plasma intended for manufacturing into injectable or noninjectable products and all interior surfaces of plasma containers used for...
Code of Federal Regulations, 2013 CFR
2013-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.68 Processing. (a) Sterile system. All administration and transfer sets inserted into blood containers used for processing Source Plasma intended for manufacturing into injectable or noninjectable products and all interior surfaces of plasma containers used for...
Code of Federal Regulations, 2014 CFR
2014-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.68 Processing. (a) Sterile system. All administration and transfer sets inserted into blood containers used for processing Source Plasma intended for manufacturing into injectable or noninjectable products and all interior surfaces of plasma containers used for...
NASA Astrophysics Data System (ADS)
Peckham, Scott
2016-04-01
Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders, time interpolators and unit converters) as necessary to mediate the differences between them so they can work together. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model or data set to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. Recent efforts to bring powerful uncertainty analysis and inverse modeling toolkits such as DAKOTA into modeling frameworks will also be described. This talk will conclude with an overview of several related modeling projects that have been funded by NSF's EarthCube initiative, namely the Earth System Bridge, OntoSoft and GeoSemantics projects.
Safe Handling of Snakes in an ED Setting.
Cockrell, Melanie; Swanson, Kristofer; Sanders, April; Prater, Samuel; von Wenckstern, Toni; Mick, JoAnn
2017-01-01
Efforts to improve consistency in the management of snakes and venomous snake bites in the emergency department (ED) can improve patient and staff safety and outcomes, as well as improve surveillance data accuracy. The emergency department at a large academic medical center identified an opportunity to implement a standardized process for snake disposal and identification to reduce staff exposure to venom from snakes that patients brought with them to the ED. A local snake consultation vendor and a zoo herpetologist assisted with development of a process for snake identification and disposal. All snakes have been identified and securely disposed of using the newly implemented process, and no safety incidents have been reported. Other emergency department settings may consider developing a standardized process for snake disposal using specialized consultants combined with local resources and suppliers to promote employee and patient safety. Copyright © 2017 Emergency Nurses Association. Published by Elsevier Inc. All rights reserved.
Using Text Sets to Facilitate Critical Thinking in Sixth Graders
ERIC Educational Resources Information Center
Scales, Roya Q.; Tracy, Kelly N.
2017-01-01
This case study examines features and processes of a sixth grade teacher (Jane) utilizing text sets as a tool for facilitating critical thinking. Jane's strong vision and student-centered beliefs informed her use of various texts to teach language arts as she worked to address demands of the Common Core State Standards. Text sets promoted multiple…
32 CFR 865.110 - Decision process.
Code of Federal Regulations, 2010 CFR
2010-07-01
... in applying the standards set forth in this regulation. (b) The presiding officer is responsible for... and 32 CFR part 70: available official military records, documentary evidence submitted by or on... Standards: (1) When the DRB determines that an applicant's discharge was improper, the DRB will determine...
Magnetic Field Experiment Data Analysis System
NASA Technical Reports Server (NTRS)
Holland, D. B.; Zanetti, L. J.; Suther, L. L.; Potemra, T. A.; Anderson, B. J.
1995-01-01
The Johns Hopkins University Applied Physics Laboratory (JHU/APL) Magnetic Field Experiment Data Analysis System (MFEDAS) has been developed to process and analyze satellite magnetic field experiment data from the TRIAD, MAGSAT, AMPTE/CCE, Viking, Polar BEAR, DMSP, HILAT, UARS, and Freja satellites. The MFEDAS provides extensive data management and analysis capabilities. The system is based on standard data structures and a standard user interface. The MFEDAS has two major elements: (1) a set of satellite unique telemetry processing programs for uniform and rapid conversion of the raw data to a standard format and (2) the program Magplot which has file handling, data analysis, and data display sections. This system is an example of software reuse, allowing new data sets and software extensions to be added in a cost effective and timely manner. Future additions to the system will include the addition of standard format file import routines, modification of the display routines to use a commercial graphics package based on X-Window protocols, and a generic utility for telemetry data access and conversion.
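The two-stage structure (satellite-unique converters feeding a common standard format) can be sketched as a converter registry. The satellite name, raw frame layout and scale factors below are invented placeholders, not the actual MFEDAS format.

```python
# Per-satellite converters all emit one standard record format that a
# single analysis/plotting stage consumes. Field names are illustrative.
STANDARD_FIELDS = ("time", "b_x", "b_y", "b_z")

def convert_example_sat(raw_frame):
    """Satellite-unique stage: raw telemetry frame -> standard record."""
    t, bx, by, bz = raw_frame          # made-up raw layout
    return dict(zip(STANDARD_FIELDS, (t, bx * 0.1, by * 0.1, bz * 0.1)))

CONVERTERS = {"EXAMPLE-SAT": convert_example_sat}

def ingest(satellite, frames):
    """Common stage: every mission funnels through the same interface,
    so a new data set only requires registering a new converter."""
    conv = CONVERTERS[satellite]
    return [conv(f) for f in frames]

print(ingest("EXAMPLE-SAT", [(0.0, 10, 20, 30)]))
```

The design choice mirrors the abstract's point about software reuse: the analysis side never changes when a new satellite is added.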
A data types profile suitable for use with ISO EN 13606.
Sun, Shanghua; Austin, Tony; Kalra, Dipak
2012-12-01
ISO EN 13606 is a five part International Standard specifying how Electronic Healthcare Record (EHR) information should be communicated between different EHR systems and repositories. Part 1 of the standard defines an information model for representing the EHR information itself, including the representation of types of data value. A later International Standard, ISO 21090:2010, defines a comprehensive set of models for data types needed by all health IT systems. This latter standard is vast, and duplicates some of the functions already handled by ISO EN 13606 part 1. A profile (sub-set) of ISO 21090 would therefore be expected to provide EHR system vendors with a more specially tailored set of data types to implement and avoid the risk of providing more than one modelling option for representing the information properties. This paper describes the process and design decisions made for developing a data types profile for EHR interoperability.
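To illustrate what a profile might look like, the sketch below keeps two ISO 21090 data types in heavily reduced form. The type names PQ (physical quantity) and CD (coded value) come from the standard; the attribute selection is an assumption about what a 13606-oriented profile might retain, not the profile the paper defines.

```python
from dataclasses import dataclass

@dataclass
class PQ:                    # physical quantity, reduced to two attributes
    value: float
    unit: str                # ideally a UCUM unit code

@dataclass
class CD:                    # coded value, reduced to three attributes
    code: str
    code_system: str
    display_name: str = ""

# Hypothetical usage in an EHR entry:
weight = PQ(72.5, "kg")
diagnosis = CD("38341003", "http://snomed.info/sct", "Hypertension")
```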
Study unique artistic Lopburi Province for design brass tea set of Bantahkrayang community
NASA Astrophysics Data System (ADS)
Pliansiri, V.; Seviset, S.
2017-07-01
The objectives of this study were as follows: 1) to study the production process of the handcrafted brass tea set; and 2) to design and develop the handcrafted brass tea set. The design process began with mutual analytical processes and a conceptual framework for product design: Quality Function Deployment, the Theory of Inventive Problem Solving, Principles of Craft Design, and Principles of Reverse Engineering. Experts in the fields of industrial product design and brass handicraft products evaluated the brass tea set design, and a prototype was created and assessed by a sample of consumers who had previously bought brass tea sets from the Bantahkrayang Community. The statistics used were percentage, mean (x̄) and standard deviation (S.D.). Consumer satisfaction with the handcrafted brass tea set was at a high level.
Internal audit in a microbiology laboratory.
Mifsud, A J; Shafi, M S
1995-01-01
AIM--To set up a programme of internal laboratory audit in a medical microbiology laboratory. METHODS--A model of laboratory based process audit is described. Laboratory activities were examined in turn by specimen type. Standards were set using laboratory standard operating procedures; practice was observed using a purpose designed questionnaire and the data were analysed by computer; performance was assessed at laboratory audit meetings; and the audit circle was closed by re-auditing topics after an interval. RESULTS--Improvements in performance scores (objective measures) and in staff morale (subjective impression) were observed. CONCLUSIONS--This model of process audit could be applied, with amendments to take local practice into account, in any microbiology laboratory. PMID:7665701
Information system life-cycle and documentation standards, volume 1
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
The Software Management and Assurance Program (SMAP) Information System Life-Cycle and Documentation Standards Document describes the Version 4 standard information system life-cycle in terms of processes, products, and reviews. The description of the products includes detailed documentation standards. The standards in this document set can be applied to the life-cycle, i.e., to each phase in the system's development, and to the documentation of all NASA information systems. This provides consistency across the agency as well as visibility into the completeness of the information recorded. An information system is software-intensive, but consists of any combination of software, hardware, and operational procedures required to process, store, or transmit data. This document defines a standard life-cycle model and content for associated documentation.
Kim, Andrew H; Roberts, Charlotte; Feagan, Brian G; Banerjee, Rupa; Bemelman, Willem; Bodger, Keith; Derieppe, Marc; Dignass, Axel; Driscoll, Richard; Fitzpatrick, Ray; Gaarentstroom-Lunt, Janette; Higgins, Peter D; Kotze, Paulo Gustavo; Meissner, Jillian; O'Connor, Marian; Ran, Zhi-Hua; Siegel, Corey A; Terry, Helen; van Deen, Welmoed K; van der Woude, C Janneke; Weaver, Alandra; Yang, Suk-Kyun; Sands, Bruce E; Vermeire, Séverine; Travis, Simon Pl
2018-03-28
Success in delivering value-based healthcare involves measuring outcomes that matter most to patients. Our aim was to develop a minimum Standard Set of patient-centred outcome measures for inflammatory bowel disease [IBD], for use in different healthcare settings. An international working group [n = 25] representing patients, patient associations, gastroenterologists, surgeons, specialist nurses, IBD registries and patient-reported outcome measure [PROM] methodologists participated in a series of teleconferences incorporating a modified Delphi process. Systematic review of existing literature, registry data, patient focus groups and open review periods were used to reach consensus on a minimum set of standard outcome measures and risk adjustment variables. Similar methodology has been used in 21 other disease areas [www.ichom.org]. A minimum Standard Set of outcomes was developed for patients [aged ≥16] with IBD. Outcome domains included survival and disease control [survival, disease activity/remission, colorectal cancer, anaemia], disutility of care [treatment-related complications], healthcare utilization [IBD-related admissions, emergency room visits] and patient-reported outcomes [including quality of life, nutritional status and impact of fistulae] measured at baseline and at 6 or 12 month intervals. A single PROM [IBD-Control questionnaire] was recommended in the Standard Set and minimum risk adjustment data collected at baseline and annually were included: demographics, basic clinical information and treatment factors. A Standard Set of outcome measures for IBD has been developed based on evidence, patient input and specialist consensus. It provides an international template for meaningful, comparable and easy-to-interpret measures as a step towards achieving value-based healthcare in IBD.
ERIC Educational Resources Information Center
Loch, John R.
2003-01-01
Outlines problems in continuing higher education, suggesting that it lacks (1) a standard name; (2) a unified voice on national issues; (3) a standard set of roles and functions; (4) a standard title for the chief administrative officer; (5) an accreditation body and process; and (6) resolution of the centralization/decentralization issue. (SK)
Neuss, Michael N; Gilmore, Terry R; Belderson, Kristin M; Billett, Amy L; Conti-Kalchik, Tara; Harvey, Brittany E; Hendricks, Carolyn; LeFebvre, Kristine B; Mangu, Pamela B; McNiff, Kristen; Olsen, MiKaela; Schulmeister, Lisa; Von Gehr, Ann; Polovich, Martha
2016-12-01
Purpose To update the ASCO/Oncology Nursing Society (ONS) Chemotherapy Administration Safety Standards and to highlight standards for pediatric oncology. Methods The ASCO/ONS Chemotherapy Administration Safety Standards were first published in 2009 and updated in 2011 to include inpatient settings. A subsequent 2013 revision expanded the standards to include the safe administration and management of oral chemotherapy. A joint ASCO/ONS workshop with stakeholder participation, including that of the Association of Pediatric Hematology Oncology Nurses and American Society of Pediatric Hematology/Oncology, was held on May 12, 2015, to review the 2013 standards. An extensive literature search was subsequently conducted, and public comments on the revised draft standards were solicited. Results The updated 2016 standards presented here include clarification and expansion of existing standards to include pediatric oncology and to introduce new standards: most notably, two-person verification of chemotherapy preparation processes, administration of vinca alkaloids via minibags in facilities in which intrathecal medications are administered, and labeling of medications dispensed from the health care setting to be taken by the patient at home. The standards were reordered and renumbered to align with the sequential processes of chemotherapy prescription, preparation, and administration. Several standards were separated into their respective components for clarity and to facilitate measurement of adherence to a standard. Conclusion As oncology practice has changed, so have chemotherapy administration safety standards. Advances in technology, cancer treatment, and education and training have prompted the need for periodic review and revision of the standards. Additional information is available at http://www.asco.org/chemo-standards .
A Systems Engineering Capability Maturity Model, Version 1.1,
1995-11-01
of a sequence of actions to be taken to perform a given task. [SECMM] 1. A set of activities ( ISO 12207 ). 2. A set of practices that address the...standards One of the design goals of the SE-CMM effort was to capture the salient concepts from emerging standards and initiatives (e.g.; ISO 9001...history for the SE-CMM: Version Designator Content Change Notes Release 1 • architecture rationale • Process Areas • ISO (SPICE) BPG 0.05 summary
32 CFR 724.804 - Decision process.
Code of Federal Regulations, 2010 CFR
2010-07-01
... a case-by-case basis in applying the standard set forth in subpart I. (b) The presiding officer is... of the NDRB: available official records, documentary evidence submitted by or on behalf of an... available service records. (f) Application of standards. (1) When the NDRB determines that an applicant's...
Zerillo, Jessica A; Schouwenburg, Maartje G; van Bommel, Annelotte C M; Stowell, Caleb; Lippa, Jacob; Bauer, Donna; Berger, Ann M; Boland, Gilles; Borras, Josep M; Buss, Mary K; Cima, Robert; Van Cutsem, Eric; van Duyn, Eino B; Finlayson, Samuel R G; Hung-Chun Cheng, Skye; Langelotz, Corinna; Lloyd, John; Lynch, Andrew C; Mamon, Harvey J; McAllister, Pamela K; Minsky, Bruce D; Ngeow, Joanne; Abu Hassan, Muhammad R; Ryan, Kim; Shankaran, Veena; Upton, Melissa P; Zalcberg, John; van de Velde, Cornelis J; Tollenaar, Rob
2017-05-01
Global health systems are shifting toward value-based care in an effort to drive better outcomes in the setting of rising health care costs. This shift requires a common definition of value, starting with the outcomes that matter most to patients. The International Consortium for Health Outcomes Measurement (ICHOM), a nonprofit initiative, was formed to define standard sets of outcomes by medical condition. In this article, we report the efforts of ICHOM's working group in colorectal cancer. The working group was composed of multidisciplinary oncology specialists in medicine, surgery, radiation therapy, palliative care, nursing, and pathology, along with patient representatives. Through a modified Delphi process during 8 months (July 8, 2015 to February 29, 2016), ICHOM led the working group to a consensus on a final recommended standard set. The process was supported by a systematic PubMed literature review (1042 randomized clinical trials and guidelines from June 3, 2005, to June 3, 2015), a patient focus group (11 patients with early and metastatic colorectal cancer convened during a teleconference in August 2015), and a patient validation survey (among 276 patients with and survivors of colorectal cancer between October 15, 2015, and November 4, 2015). After consolidating findings of the literature review and focus group meeting, a list of 40 outcomes was presented to the WG and underwent voting. The final recommendation includes outcomes in the following categories: survival and disease control, disutility of care, degree of health, and quality of death. Selected case-mix factors were recommended to be collected at baseline to facilitate comparison of results across treatments and health care professionals. A standardized set of patient-centered outcome measures to inform value-based health care in colorectal cancer was developed. Pilot efforts are under way to measure the standard set among members of the working group.
Reliability Analysis and Standardization of Spacecraft Command Generation Processes
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Grenander, Sven; Evensen, Ken
2011-01-01
• In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes.
• The literature review conducted during the standardization process revealed that very few atomic-level human activities are associated with even a broad set of missions.
• Applicable human reliability metrics for performing these atomic-level tasks are available.
• The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated.
• The PRA models are executed using data from human reliability data banks.
• The Periodic Table is related to the PRA models via Fault Links.
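A minimal sketch of how atomic-level task reliabilities might roll up into a command-error probability is shown below, using an OR gate over independent basic events. The task names and human error probabilities are illustrative stand-ins, not values from an actual human reliability data bank.

```python
def or_gate(probabilities):
    """Probability that at least one independent basic event occurs:
    1 minus the probability that none of them occur."""
    p_none = 1.0
    for p in probabilities:
        p_none *= (1.0 - p)
    return 1.0 - p_none

# Hypothetical atomic-level command-generation tasks and their human
# error probabilities (illustrative numbers only):
task_heps = {"transcribe parameter": 3e-3,
             "select command from list": 1e-3,
             "verify against checklist": 1e-2}

print(f"P(command error) ~ {or_gate(task_heps.values()):.4f}")
```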
The Mediating Relation between Symbolic and Nonsymbolic Foundations of Math Competence
Price, Gavin R.; Fuchs, Lynn S.
2016-01-01
This study investigated the relation between symbolic and nonsymbolic magnitude processing abilities with 2 standardized measures of math competence (WRAT Arithmetic and KeyMath Numeration) in 150 3rd-grade children (mean age 9.01 years). Participants compared sets of dots and pairs of Arabic digits with numerosities 1–9 for relative numerical magnitude. In line with previous studies, performance on both symbolic and nonsymbolic magnitude processing was related to math ability. Performance metrics combining reaction time and accuracy, as well as Weber fractions, were entered into mediation models with standardized math test scores. Results showed that symbolic magnitude processing ability fully mediates the relation between nonsymbolic magnitude processing and math ability, regardless of the performance metric or standardized test. PMID:26859564
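The mediation model reported here can be sketched with ordinary least squares: estimate the x-to-m path (a), the total x-to-y effect (c), and the direct effect (c') with the mediator included; full mediation corresponds to c' near zero with the indirect effect a*b carrying the relation. The simulated data below are illustrative only, not the study's data or its exact estimation procedure.

```python
import numpy as np

def mediation(x, m, y):
    """Product-of-coefficients mediation sketch:
    x = nonsymbolic score, m = symbolic score (mediator), y = math score.
    Returns (total effect c, direct effect c', indirect effect a*b)."""
    X1 = np.column_stack([np.ones_like(x), x])
    a = np.linalg.lstsq(X1, m, rcond=None)[0][1]      # x -> m
    c = np.linalg.lstsq(X1, y, rcond=None)[0][1]      # x -> y (total)
    X2 = np.column_stack([np.ones_like(x), x, m])
    coefs = np.linalg.lstsq(X2, y, rcond=None)[0]
    c_prime, b = coefs[1], coefs[2]                   # x -> y given m; m -> y
    return c, c_prime, a * b

rng = np.random.default_rng(0)
x = rng.normal(size=150)
m = 0.6 * x + rng.normal(scale=0.8, size=150)         # simulated full mediation
y = 0.7 * m + rng.normal(scale=0.8, size=150)
print(mediation(x, m, y))   # direct effect c' should be near zero
```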
Blueprint for Bologna: University of Prishtina and the European Union
ERIC Educational Resources Information Center
Epp, Juanita Ross; Epp, Walter
2010-01-01
Countries hoping to demonstrate an adequate educational infrastructure need a national framework that meets Bologna requirements, a national accreditation agency which sets out the approved framework, and national accreditation processes by which individual institutions can be measured against the standards set by the national accreditation…
Modified Confidence Intervals for the Mean of an Autoregressive Process.
1985-08-01
There are several standard methods of setting confidence intervals in simulations, including the regenerative method, batch means, and time series methods. We focus on improved confidence intervals for the mean of an autoregressive process, and as such our...
ISO 19115 Experiences in NASA's Earth Observing System (EOS) ClearingHOuse (ECHO)
NASA Astrophysics Data System (ADS)
Cechini, M. F.; Mitchell, A.
2011-12-01
Metadata is an important entity in the process of cataloging, discovering, and describing earth science data. As science research and the gathered data increase in complexity, so do the complexity and importance of descriptive metadata. To meet these growing needs, the required metadata models utilize richer and more mature metadata attributes. Categorizing, standardizing, and promulgating these metadata models to a politically, geographically, and scientifically diverse community is a difficult process. An integral component of metadata management within NASA's Earth Observing System Data and Information System (EOSDIS) is the Earth Observing System (EOS) ClearingHOuse (ECHO). ECHO is the core metadata repository for the EOSDIS data centers, providing a centralized mechanism for metadata and data discovery and retrieval. ECHO has undertaken an internal restructuring to meet the changing needs of scientists, the consistent advancement in technology, and the advent of new standards such as ISO 19115. These improvements were based on the following tenets for data discovery and retrieval:
• There exists a set of 'core' metadata fields recommended for data discovery.
• There exists a set of users who will require the entire metadata record for advanced analysis.
• There exists a set of users who will require only a 'core' set of metadata fields for discovery.
• There will never be a cessation of new formats or a total retirement of all old formats.
• Users should be presented metadata in a consistent format of their choosing.
To address the items listed above, ECHO's new metadata processing paradigm takes the following approach:
• Identify a cross-format set of 'core' metadata fields necessary for discovery.
• Implement format-specific indexers to extract the 'core' metadata fields into an optimized query capability.
• Archive the original metadata in its entirety for presentation to users requiring the full record.
• Provide on-demand translation of 'core' metadata to any supported result format.
Lessons learned by the ECHO team while implementing its new metadata approach to support usage of the ISO 19115 standard will be presented. These lessons learned highlight some discovered strengths and weaknesses in the ISO 19115 standard as it is introduced to an existing metadata processing system.
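A minimal sketch of this indexing pattern follows, with hypothetical field names and deliberately simplified record structures; it illustrates the idea of format-specific indexers feeding a shared discovery index, not ECHO's actual schema or code.

```python
# Illustrative sketch (not ECHO's implementation): format-specific indexers
# map heterogeneous metadata records onto a shared set of 'core' fields,
# while the full original record is archived for on-demand retrieval.
from typing import Callable, Dict

# Hypothetical cross-format 'core' fields used for discovery.
CORE_FIELDS = ("title", "temporal_extent", "spatial_extent", "platform")

def index_echo10(record: Dict) -> Dict:
    """Extract core fields from a (simplified, hypothetical) ECHO10-style record."""
    return {
        "title": record.get("DataSetId"),
        "temporal_extent": record.get("Temporal"),
        "spatial_extent": record.get("Spatial"),
        "platform": record.get("Platform"),
    }

def index_iso19115(record: Dict) -> Dict:
    """Extract core fields from a (simplified, hypothetical) ISO 19115-style record."""
    ident = record.get("identificationInfo", {})
    return {
        "title": ident.get("citation", {}).get("title"),
        "temporal_extent": ident.get("extent", {}).get("temporal"),
        "spatial_extent": ident.get("extent", {}).get("geographic"),
        "platform": record.get("acquisitionInformation", {}).get("platform"),
    }

INDEXERS: Dict[str, Callable[[Dict], Dict]] = {
    "echo10": index_echo10,
    "iso19115": index_iso19115,
}

def ingest(record: Dict, fmt: str, archive: list, index: list) -> None:
    archive.append(record)               # keep the full record for advanced users
    index.append(INDEXERS[fmt](record))  # optimized, cross-format discovery index
```

New formats then require only a new indexer function, while the archived originals remain available for on-demand translation.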
A Comparison of Approaches for Setting Proficiency Standards.
ERIC Educational Resources Information Center
Koffler, Stephen L.
This research compared the cut-off scores estimated from an empirical procedure (Contrasting group method) to those determined from a more theoretical process (Nedelsky method). A methodological and statistical framework was also provided for analysis of the data to obtain the most appropriate standard using the empirical procedure. Data were…
Distance Education and Training Council Constitution and Bylaws. 2012 Edition
ERIC Educational Resources Information Center
Distance Education and Training Council, 2012
2012-01-01
The mission of the Distance Education and Training Council (hereinafter referred to as the Council or DETC) is to promote, by means of standard-setting, evaluation, and consultation processes, the development and maintenance of high educational and ethical standards in education and training programs delivered through distance learning. The…
Rethinking the 2000 ACRL Standards: Some Things to Consider
ERIC Educational Resources Information Center
Kuhlthau, Carol C.
2013-01-01
I propose three "rethinks" to consider in recasting the ACRL Standards for information literacy for the coming decades. First, rethink the concept of information need. Second, rethink the notion that information literacy is composed of a set of abilities for "extracting information." Third, rethink the holistic process of…
40 CFR 406.11 - Specialized definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... STANDARDS GRAIN MILLS POINT SOURCE CATEGORY Corn Wet Milling Subcategory § 406.11 Specialized definitions... and methods of analysis set forth in 40 CFR part 401 shall apply to this subpart. (b) The term corn shall mean the shelled corn delivered to a plant before processing. (c) The term standard bushel shall...
Automated process planning system
NASA Technical Reports Server (NTRS)
Mann, W.
1978-01-01
Program helps process engineers set up manufacturing plans for machined parts. The system allows one to develop and store a library of similar parts' characteristics, as related to a particular facility. This information is then used in an interactive system to help develop manufacturing plans that meet required standards.
Architecture for Survivable Systems Processing (ASSP)
NASA Technical Reports Server (NTRS)
Wood, Richard J.
1992-01-01
The Architecture for Survivable Systems Processing (ASSP) program is a two phase program whose objective is the derivation, specification, development and validation of an open system architecture capable of supporting advanced processing needs of space, ground, and launch vehicle operations. The output of the first phase is a set of hardware and software standards and specifications defining this architecture at three levels. The second phase will validate these standards and develop the technology necessary to achieve strategic hardness, packaging density, throughput requirements, and interoperability/interchangeability.
The OSHA standard setting process: role of the occupational health nurse.
Klinger, C S; Jones, M L
1994-08-01
1. Occupational health nurses are the health professionals most often involved with the worker who suffers as a result of ineffective or non-existent safety and health standards. 2. Occupational health nurses are familiar with health and safety standards, but may not understand or participate in the rulemaking process used to develop them. 3. Knowing the eight basic steps of rulemaking and actively participating in the process empowers occupational health nurses to influence national policy decisions affecting the safety and health of millions of workers. 4. By actively participating in rulemaking activities, occupational health nurses also improve the quality of occupational health nursing practice and enhance the image of the nursing profession.
Adding Bite to the Bark: Using LibGuides2 Migration as Impetus to Introduce Strong Content Standards
ERIC Educational Resources Information Center
Fritch, Melia; Pitts, Joelle E.
2016-01-01
The authors discuss the long-term accumulation of unstandardized and inaccessible content within the Libguides system and the decision-making process to create and implement a set of standards using the migration to the LibGuides2 platform as a vehicle for change. Included in the discussion are strategies for the creation of standards and…
Health IT for Patient Safety and Improving the Safety of Health IT.
Magrabi, Farah; Ong, Mei-Sing; Coiera, Enrico
2016-01-01
Alongside their benefits, health IT applications can pose new risks to patient safety. Problems with IT have been linked to many different types of clinical error, including errors in the prescribing and administration of medications, as well as wrong-patient and wrong-site errors and delays in procedures. There is also growing concern about the risks of data breach and cyber-security. IT-related clinical errors have their origins in the processes undertaken to design, build, implement and use software systems in a broader sociotechnical context. Safety can be improved with greater standardization of clinical software and by improving the quality of processes at different points in the technology life cycle, spanning design, build, implementation and use in clinical settings. Oversight processes can be set up at a regional or national level to ensure that clinical software systems meet specific standards. Certification and regulation are two mechanisms to improve oversight. In the absence of clear standards, guidelines are useful to promote safe design and implementation practices. Processes to identify and mitigate hazards can be formalised via a safety management system. Minimizing new patient safety risks is critical to realizing the benefits of IT.
Collaborative Outcome Measurement: Development of the Nationally Standardized Minimum Data Set
ERIC Educational Resources Information Center
Stephens, Barry C.; Kirchner, Corinne; Orr, Alberta L.; Suvino, Dawn; Rogers, Priscilla
2009-01-01
This article discusses the challenging process of developing a common data set for independent living programs serving older adults who are visually impaired. The three-year project, which included collaborative efforts among many stakeholders that encompass diverse program models, resulted in the development of the Internet-based Nationally…
Terrestrial indicators and measurements: Selection process and preliminary recommendations
USDA-ARS?s Scientific Manuscript database
The objective of this project is to identify a small set of core indicators and measurements that can be applied across rangeland, forest and riparian ecosystems managed by the BLM. A set of core indicators quantified using standardized measurements allows data to be integrated across field office, ...
Common Approach to Geoprocessing of UAV Data across Application Domains
NASA Astrophysics Data System (ADS)
Percivall, G. S.; Reichardt, M.; Taylor, T.
2015-08-01
UAVs are a disruptive technology bringing new geographic data and information to many application domains. UASs are similar to other geographic imagery systems, so existing frameworks are applicable, but the diversity of UAV platforms and of available sensors presents challenges in the processing and creation of geospatial products. Efficient processing and dissemination of the data is achieved using software and systems that implement open standards. The challenges identified point to the need to use existing standards and to extend them. Results from the use of the OGC Sensor Web Enablement set of standards are presented. Next steps in the progress of UAVs and UASs may follow the path of open data, open source and open standards.
State trends in ecological risk assessment and standard setting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siegel, M R; Fowler, K M; Bilyard, G R
1993-02-01
The purposes of this paper are (1) to identify key states' activities and plans related to setting cleanup standards using the ecological risk assessment process, and (2) to discuss the impacts these actions may have on the US Department of Energy's (DOE's) environmental restoration program. This report is prepared as part of a larger task, the purpose of which is to identify and assess state regulatory trends and legal developments that may impact DOE's environmental restoration program. Results of this task are intended to provide DOE with advance notice of potentially significant regulatory developments so as to enhance DOE's ability to influence these developments and to incorporate possible regulatory and policy changes into its planning process.
D'Agostino, Fabio; Vellone, Ercole; Tontini, Francesco; Zega, Maurizio; Alvaro, Rosaria
2012-01-01
The aim of a nursing data set is to provide useful information for assessing the level of care and the state of health of the population. Currently, both in Italy and in other countries, this data is incomplete due to the lack of structured nursing documentation, making it indispensable to develop a Nursing Minimum Data Set (NMDS) using standard nursing language to evaluate care, costs and health requirements. The aim of the project described is to create a computer system using standard nursing terms with dedicated software which will aid the decision-making process and provide the relative documentation. This will make it possible to monitor nursing activity and costs and their impact on patients' health; adequate training and involvement of nursing staff will play a fundamental role.
An Approach for Meeting Customer Standards Under Executive Order 12862
LOGISTICS MANAGEMENT INSTITUTE
1994-05-01
Summary: Executive Order 12862, Setting...search Centers all operate and manage wind tunnels for both NASA and industry customers. Nonetheless, a separate wind-tunnel process should be...could include the manager of the process, selected members of the manager's staff, a key customer, and a survey expert. The manager and staff would...
Chen, Elizabeth S; Maloney, Francine L; Shilmayster, Eugene; Goldberg, Howard S
2009-11-14
A systematic and standard process for capturing information within free-text clinical documents could facilitate opportunities for improving quality and safety of patient care, enhancing decision support, and advancing data warehousing across an enterprise setting. At Partners HealthCare System, the Medical Language Processing (MLP) services project was initiated to establish a component-based architectural model and processes to facilitate putting MLP functionality into production for enterprise consumption, promote sharing of components, and encourage reuse. Key objectives included exploring the use of an open-source framework called the Unstructured Information Management Architecture (UIMA) and leveraging existing MLP-related efforts, terminology, and document standards. This paper describes early experiences in defining the infrastructure and standards for extracting, encoding, and structuring clinical observations from a variety of clinical documents to serve enterprise-wide needs.
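The component-based idea can be sketched in miniature: independent annotator components each contribute typed annotations over the raw text, and a pipeline composes them. The sketch below is an illustration of the architecture only, written in Python rather than UIMA's actual Java API, and every name in it is hypothetical rather than taken from the MLP services project.

```python
# Minimal sketch of a component-based clinical text pipeline: independent
# annotators scan the document and contribute typed annotations. Illustrative
# only; not the MLP services' components and not UIMA's API.
import re
from dataclasses import dataclass

@dataclass
class Annotation:
    type: str
    begin: int
    end: int
    text: str

def bp_annotator(text):
    """Find blood-pressure observations such as '120/80 mmHg'."""
    for m in re.finditer(r"\b(\d{2,3})/(\d{2,3})\s*mmHg\b", text):
        yield Annotation("BloodPressure", m.start(), m.end(), m.group())

def med_annotator(text, lexicon=("aspirin", "metformin")):
    """Find medication mentions against a small (hypothetical) lexicon."""
    for med in lexicon:
        for m in re.finditer(rf"\b{med}\b", text, re.IGNORECASE):
            yield Annotation("Medication", m.start(), m.end(), m.group())

def run_pipeline(text, annotators):
    annotations = []
    for annotator in annotators:      # components are independent and reusable
        annotations.extend(annotator(text))
    return sorted(annotations, key=lambda a: a.begin)

note = "Pt on metformin; BP 138/86 mmHg at today's visit."
for a in run_pipeline(note, [bp_annotator, med_annotator]):
    print(a)
```

Because each annotator is self-contained, components can be shared and reused across the enterprise, which is the architectural point the abstract makes.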
Aerobic Digestion. Biological Treatment Process Control. Instructor's Guide.
ERIC Educational Resources Information Center
Klopping, Paul H.
This unit on aerobic sludge digestion covers the theory of the process, system components, factors that affect the process performance, standard operational concerns, indicators of steady-state operations, and operational problems. The instructor's guide includes: (1) an overview of the unit; (2) lesson plan; (3) lecture outline (keyed to a set of…
Foerster, Rebecca M; Poth, Christian H; Behler, Christian; Botsch, Mario; Schneider, Werner X
2016-11-21
Neuropsychological assessment of human visual processing capabilities strongly depends on visual testing conditions including room lighting, stimuli, and viewing-distance. This limits standardization, threatens reliability, and prevents the assessment of core visual functions such as visual processing speed. Increasingly available virtual reality devices allow to address these problems. One such device is the portable, light-weight, and easy-to-use Oculus Rift. It is head-mounted and covers the entire visual field, thereby shielding and standardizing the visual stimulation. A fundamental prerequisite to use Oculus Rift for neuropsychological assessment is sufficient test-retest reliability. Here, we compare the test-retest reliabilities of Bundesen's visual processing components (visual processing speed, threshold of conscious perception, capacity of visual working memory) as measured with Oculus Rift and a standard CRT computer screen. Our results show that Oculus Rift allows to measure the processing components as reliably as the standard CRT. This means that Oculus Rift is applicable for standardized and reliable assessment and diagnosis of elementary cognitive functions in laboratory and clinical settings. Oculus Rift thus provides the opportunity to compare visual processing components between individuals and institutions and to establish statistical norm distributions.
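Test-retest reliability here is simply the agreement between two measurement sessions across participants. A minimal illustration on synthetic data (the study's actual estimators for the visual processing parameters are more involved):

```python
# Minimal illustration of test-retest reliability: correlate session-1 and
# session-2 estimates of a processing-speed parameter across participants.
# Synthetic data; parameter values are invented for illustration.
import numpy as np

rng = np.random.default_rng(6)
true_speed = rng.normal(30, 8, 40)            # one latent value per participant
session1 = true_speed + rng.normal(0, 3, 40)  # measurement noise, session 1
session2 = true_speed + rng.normal(0, 3, 40)  # measurement noise, session 2

r = np.corrcoef(session1, session2)[0, 1]     # test-retest correlation
print(f"test-retest r = {r:.2f}")
```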
NASA Astrophysics Data System (ADS)
Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.
2012-12-01
Over the last decades, the use of Geographic Information (GI) has gained importance, in the public as well as the private sector. But even though many spatial data and related information exist, data sets are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain and prepare them for use in applications. Spatial Data Infrastructures (SDI) have therefore been developed to enhance the access, use and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the nineties many SDI initiatives have seen the light. Ultimately, all these initiatives aim to enhance the flow of spatial data between organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes requires technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI-standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. Therefore the objective of the research is to develop a quantitative framework to assess the impact of GI-standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research builds upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI-performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques which are frequently used to measure the performance of production processes (Anupindi et al., 2006). Key to reaching the research objectives is a correct design of the test cases. The major challenge is to set up the analytical framework for analyzing the impact of GI-standards on process performance, to define the appropriate indicators and to choose the right test cases. To do so, it is proposed to define the test cases as 8 pairs of organizations (see figure). The paper will present the state of the art of performance measurement in the context of work processes, propose a series of SMART indicators for describing the set-up and measuring the performance, define the test case set-up and suggest criteria for the selection of the test cases, i.e. the organizational pairs. References: Anupindi, R., Chopra, S., Deshmukh, S.D., Van Mieghem, J.A., & Zemel, E. (2006). Managing Business Process Flows: Principles of Operations Management. New Jersey, USA: Prentice Hall. Dessers, D., Crompvoets, J., Janssen, K., Vancauwenberghe, G., Vandenbroucke, D. & Vanhaverbeke, L. (2011). SDI at work: The Spatial Zoning Plans Case. Leuven, Belgium: Katholieke Universiteit Leuven.
ERIC Educational Resources Information Center
Shirazi, Mandana; Sadeghi, Majid; Emami, A.; Kashani, A. Sabouri; Parikh, Sagar; Alaeddini, F.; Arbabi, Mohammad; Wahlstrom, Rolf
2011-01-01
Objective: Standardized patients (SPs) have been developed to measure practitioner performance in actual practice settings, but results have not been fully validated for psychiatric disorders. This study describes the process of creating reliable and valid SPs for unannounced assessment of general-practitioners' management of depression disorders…
Go With the Flow, on Jupiter and Snow. Coherence from Model-Free Video Data Without Trajectories
NASA Astrophysics Data System (ADS)
AlMomani, Abd AlRahman R.; Bollt, Erik
2018-06-01
Viewing a data set such as the clouds of Jupiter, coherence is readily apparent to human observers, especially the Great Red Spot, but also other great storms and persistent structures. There are now many different mathematical definitions and perspectives describing coherent structures, but we take an image processing perspective here. We describe the inference of coherent sets of a fluidic system directly from image data, without attempting to first model the underlying flow fields, an approach related to the image processing concept of motion tracking. In contrast to standard spectral methods for image processing, which are generally built on a symmetric affinity matrix and hence standard spectral graph theory, we need a non-symmetric affinity, which arises naturally from the underlying arrow of time. We develop an anisotropic, directed diffusion operator corresponding to flow on a directed graph, built from a directed affinity matrix designed with coherence in mind, and the corresponding spectral graph theory from the graph Laplacian. Our methodology is not offered as more accurate than traditional methods of finding coherent sets; rather, our approach works with alternative kinds of data sets, in the absence of a vector field. Our examples include partitioning the weather and cloud structures of Jupiter, a lake-effect snow event near Potsdam, NY, on Earth, and the benchmark double-gyre test system.
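A hedged numerical sketch of the directed-affinity idea follows. The kernel, its scale, and the simple symmetrized Laplacian are illustrative stand-ins, not the paper's exact directed diffusion operator.

```python
# Sketch: affinities between regions in frame t and frame t+1 respect the
# arrow of time, so the affinity matrix A is not symmetric; a spectral
# partition of the resulting graph separates coherent sets. Illustrative
# simplification, not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)
x_t = rng.random((50, 2))                                   # region centroids, frame t
drift = np.array([0.1, 0.0])                                # a coherent translation
x_t1 = x_t + drift + 0.01 * rng.standard_normal((50, 2))    # frame t+1

# Directed affinity: how strongly region i at time t maps onto region j at t+1.
d = np.linalg.norm(x_t[:, None, :] - x_t1[None, :, :], axis=-1)
A = np.exp(-d**2 / 0.01)                                    # A[i, j] != A[j, i] in general

P = A / A.sum(axis=1, keepdims=True)                        # row-stochastic transition matrix
L = np.eye(len(P)) - 0.5 * (P + P.T)                        # simple symmetrized Laplacian surrogate

# The eigenvector for the second-smallest eigenvalue yields a 2-way partition.
w, v = np.linalg.eigh(L)
labels = (v[:, 1] > 0).astype(int)
print(np.bincount(labels))                                  # sizes of the two candidate sets
```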
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sidhu, K.S.
1991-06-01
The primary objective of a standard setting process is to arrive at a drinking water concentration at which exposure to a contaminant would result in no known or potential adverse effect on human health. The drinking water standards also serve as guidelines to prevent pollution of water sources and may be applicable in some cases as regulatory remediation levels. Risk assessment methods, along with various decision-making parameters, are used to establish drinking water standards. For carcinogens classified in Groups A and B by the United States Environmental Protection Agency (USEPA), the standards are set by using nonthreshold cancer risk models. The linearized multistage model is commonly used for computation of potency factors for carcinogenic contaminants. The acceptable excess risk level may vary from 10(-6) to 10(-4). For noncarcinogens, a threshold model approach based on application of an uncertainty factor is used to arrive at a reference dose (RfD). The RfD approach may also be used for carcinogens classified in Group C by the USEPA. The RfD approach, with an additional uncertainty factor of 10 for carcinogenicity, has been applied in the formulation of risk assessments for Group C carcinogens. The assumptions commonly used in arriving at drinking water standards are: human life expectancy, 70 years; average human body weight, 70 kg; human daily drinking water consumption, 2 liters; and contribution of exposure to the contaminant from drinking water (expressed as a part of the total environmental exposure), 20%. Currently, there are over 80 USEPA existing or proposed primary standards for organic and inorganic contaminants in drinking water. Some of the state versus federal needs and viewpoints are discussed.
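The stated assumptions translate into simple standard-setting arithmetic. A worked sketch using the defaults above (the RfD and slope-factor values are hypothetical, chosen only to show the calculation):

```python
# Worked illustration of the arithmetic described above, using the stated
# defaults: 70 kg body weight, 2 L/day intake, 20% relative source contribution.
BW, DWI, RSC = 70.0, 2.0, 0.20            # kg, L/day, unitless

def noncarcinogen_standard(rfd_mg_per_kg_day):
    dwel = rfd_mg_per_kg_day * BW / DWI   # drinking water equivalent level, mg/L
    return dwel * RSC                     # apportion 20% of total exposure to water

def carcinogen_standard(slope_factor, risk=1e-6):
    # risk = slope_factor * dose and dose = C * DWI / BW, so solve for C (mg/L)
    return risk * BW / (slope_factor * DWI)

print(noncarcinogen_standard(0.005))      # hypothetical RfD 0.005 mg/kg/day -> 0.035 mg/L
print(carcinogen_standard(0.1))           # hypothetical SF 0.1 (mg/kg/day)^-1 at 1e-6 risk
```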
Holistic nursing as a specialty: holistic nursing - scope and standards of practice.
Mariano, Carla
2007-06-01
This article describes the Holistic Nursing: Scope and Standards of Practice. It defines holistic nursing, its five core values, and its practice standards. These include holistic philosophy, theory, and ethics; holistic caring process; holistic communication, therapeutic environment, and cultural diversity; holistic education and research; and holistic nurse self-care. Educational preparation for holistic nursing and settings in which holistic nurses practice are also explored.
Zheng, Hong; Clausen, Morten Rahr; Dalsgaard, Trine Kastrup; Mortensen, Grith; Bertram, Hanne Christine
2013-08-06
We describe a time-saving protocol for the processing of LC-MS-based metabolomics data by optimizing parameter settings in XCMS and threshold settings for removing noisy and low-intensity peaks, using design of experiment (DoE) approaches including Plackett-Burman design (PBD) for screening and central composite design (CCD) for optimization. A reliability index, based on evaluation of the linear response to a dilution series, was used as a parameter for the assessment of data quality. After identifying the significant parameters in the XCMS software by PBD, CCD was applied to determine their values by maximizing the reliability and group indexes. Optimal settings by DoE resulted in improvements of 19.4% and 54.7% in the reliability index for a standard mixture and human urine, respectively, as compared with the default setting, and a total of 38 h was required to complete the optimization. Moreover, threshold settings were optimized by using CCD for further improvement. The approach combining optimal parameter settings and the threshold method improved the reliability index about 9.5 times for a standard mixture and 14.5 times for human urine data, which required a total of 41 h. Validation results also showed improvements in the reliability index of about 5-7 times even for urine samples from different subjects. It is concluded that the proposed methodology can be used as a time-saving approach for improving the processing of LC-MS-based metabolomics data.
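One plausible reading of the reliability index is the fraction of detected features whose intensities respond linearly to the dilution factor. An illustrative Python sketch of that reading (the authors' own pipeline targets XCMS, and their exact definition may differ):

```python
# Hedged sketch of a dilution-series reliability index: the fraction of
# features whose intensities correlate strongly with the dilution factor.
import numpy as np

def reliability_index(intensities, dilutions, r_threshold=0.9):
    """intensities: (n_features, n_dilutions) peak table; dilutions: (n_dilutions,)."""
    reliable = 0
    for row in intensities:
        r = np.corrcoef(row, dilutions)[0, 1]
        if r > r_threshold:
            reliable += 1
    return reliable / len(intensities)

dilutions = np.array([1.0, 0.5, 0.25, 0.125])
rng = np.random.default_rng(1)
good = np.outer(rng.uniform(1e4, 1e6, 80), dilutions)   # features tracking dilution
noise = rng.uniform(1e3, 1e5, (20, 4))                  # noisy, non-linear peaks
table = np.vstack([good, noise])
print(reliability_index(table, dilutions))              # fraction judged reliable
```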
Standard Specimen Reference Set: Pancreatic — EDRN Public Portal
The primary objective of the EDRN Pancreatic Cancer Working Group Proposal is to create a reference set consisting of well-characterized serum/plasma specimens to use as a resource for the development of biomarkers for the early detection of pancreatic adenocarcinoma. The testing of biomarkers on the same sample set permits direct comparison among them, thereby allowing the development of a biomarker panel that can be evaluated in a future validation study. Additionally, the establishment of an infrastructure with core data elements and standardized operating procedures for specimen collection, processing and storage will provide the necessary preparatory platform for larger validation studies when the appropriate marker/panel for pancreatic adenocarcinoma has been identified.
NASA Astrophysics Data System (ADS)
Gaikwad, Akshay; Rehal, Diksha; Singh, Amandeep; Arvind; Dorai, Kavita
2018-02-01
We present the NMR implementation of a scheme for selective and efficient quantum process tomography without ancilla. We generalize this scheme such that it can be implemented efficiently using only a set of measurements involving product operators. The method allows us to estimate any element of the quantum process matrix to a desired precision, provided a set of quantum states can be prepared efficiently. Our modified technique requires fewer experimental resources as compared to the standard implementation of selective and efficient quantum process tomography, as it exploits the special nature of NMR measurements to allow us to compute specific elements of the process matrix by a restrictive set of subsystem measurements. To demonstrate the efficacy of our scheme, we experimentally tomograph the processes corresponding to "no operation," a controlled-NOT (CNOT), and a controlled-Hadamard gate on a two-qubit NMR quantum information processor, with high fidelities.
The Role of Metadata Standards in EOSDIS Search and Retrieval Applications
NASA Technical Reports Server (NTRS)
Pfister, Robin
1999-01-01
Metadata standards play a critical role in data search and retrieval systems. Metadata tie software to data so the data can be processed, stored, searched, retrieved and distributed. Without metadata these actions are not possible. The process of populating metadata to describe science data is an important service to the end user community, so that a user who is unfamiliar with the data can easily find and learn about a particular dataset before an order decision is made. Once a good set of standards is in place, the accuracy with which data search can be performed depends on the degree to which metadata standards are adhered to during product definition. NASA's Earth Observing System Data and Information System (EOSDIS) provides examples of how metadata standards are used in data search and retrieval.
Convergence Toward Common Standards in Machine-Readable Cataloging *
Gull, C. D.
1969-01-01
The adoption of the MARC II format for the communication of bibliographic information by the three National Libraries of the U.S.A. makes it possible for those libraries to converge on the remaining necessary common standards for machine-readable cataloging. Three levels of standards are identified: fundamental, the character set; intermediate, MARC II; and detailed, the codes for identifying data elements. The convergence on these standards implies that the National Libraries can create and operate a Joint Bibliographic Data Bank requiring standard book numbers and universal serial numbers for identifying monographs and serials and that the system will thoroughly process contributed catalog entries before adding them to the Data Bank. There is reason to hope that the use of the MARC II format will facilitate catalogers' decision processes. PMID:5782261
Assembling Appliances Standards from a Basket of Functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siderious, Hans-Paul; Meier, Alan
2014-08-11
Rapid innovation in product design challenges the current methodology for setting standards and labels, especially for electronics, software and networking. Major problems include defining the product, measuring its energy consumption, and choosing the appropriate metric and level for the standard. Most governments have tried to solve these problems by defining ever more specific product subcategories, along with their corresponding test methods and metrics. An alternative approach would treat each energy-using product as something that delivers a basket of functions. Then separate standards would be constructed for the individual functions that can be defined, tested, and evaluated. Case studies of thermostats, displays and network equipment are presented to illustrate the problems with the classical approach for setting standards and indicate the merits and drawbacks of the alternative. The functional approach appears best suited to products whose primary purpose is processing information and that have multiple functions.
Eblen, Denise R; Barlow, Kristina E; Naugle, Alecia Larew
2006-11-01
The U.S. Food Safety and Inspection Service (FSIS) pathogen reduction-hazard analysis critical control point systems final rule, published in 1996, established Salmonella performance standards for broiler chicken, cow and bull, market hog, and steer and heifer carcasses and for ground beef, chicken, and turkey meat. In 1998, the FSIS began testing to verify that establishments are meeting performance standards. Samples are collected in sets in which the number of samples is defined but varies according to product class. A sample set fails when the number of positive Salmonella samples exceeds the maximum number of positive samples allowed under the performance standard. Salmonella sample sets collected at 1,584 establishments from 1998 through 2003 were examined to identify factors associated with failure of one or more sets. Overall, 1,282 (80.9%) of establishments never had failed sets. In establishments that did experience set failure(s), the failed sets were generally collected early in the establishment's testing history, with the exception of broiler establishments, where failures occurred both early and late in the course of testing. Small establishments were more likely to have experienced a set failure than were large or very small establishments, and broiler establishments were more likely to have failed than were ground beef, market hog, or steer-heifer establishments. Agency response to failed Salmonella sample sets, in the form of in-depth verification reviews and related establishment-initiated corrective actions, has likely contributed to declines in the number of establishments that failed sets. A focus on food safety measures in small establishments and broiler processing establishments should further reduce the number of sample sets that fail to meet the Salmonella performance standard.
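The pass/fail rule described above lends itself to simple binomial arithmetic. In the sketch below, the set size of 51 samples with at most 12 positives (the historical broiler standard, cited here as an assumption rather than from this abstract) illustrates how within-set prevalence drives the failure probability:

```python
# Illustrative arithmetic for the sample-set rule: a set fails when positives
# exceed the allowed maximum. The n=51 / max 12 values are an assumption
# reflecting the historical broiler standard, not figures from this study.
from math import comb

def fail_probability(p, n=51, max_pos=12):
    """P(more than max_pos positives among n samples) at within-set prevalence p."""
    return 1 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(max_pos + 1))

print(f"{fail_probability(0.20):.3f}")  # an establishment operating right at 20% prevalence
```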
Korzynska, Anna; Roszkowiak, Lukasz; Pijanowska, Dorota; Kozlowski, Wojciech; Markiewicz, Tomasz
2014-01-01
The aim of this study is to compare digital images of tissue biopsies captured with an optical microscope using the bright-field technique under various light conditions. The range of colour variation in tissue samples immunohistochemically stained with 3,3'-Diaminobenzidine and Haematoxylin is immense and comes from various sources. One of them is an inadequate setting of the camera's white balance for the microscope light's colour temperature. Although this type of error can be easily handled during image acquisition, it can also be eliminated with colour adjustment algorithms. The dependence of colour variation on the microscope light temperature and the camera settings is examined as introductory research for the process of automatic colour standardization. Six fields of view with empty space among the tissue samples were selected for analysis. Each field of view was acquired 225 times with various microscope light temperatures and camera white balance settings. Fourteen randomly chosen images were corrected and compared with the reference image by the following methods: mean square error, structural similarity (SSIM) and visual assessment by a viewer. For two types of background and two types of object, the statistical image descriptors (range, median, mean and its standard deviation of chromaticity on the a and b channels of CIELab colour space and of luminance L) and the local colour variability for object-specific areas were calculated. The results were averaged over the 6 images acquired with the same light conditions and camera settings for each sample. The analysis of the results leads to the following conclusions: (1) images collected with the white balance setting adjusted to the light colour temperature cluster in a certain area of chromatic space; (2) white balance correction of images collected with camera settings not matched to the light temperature moves the image descriptors into the proper chromatic space, but simultaneously changes the luminance. Colour unification in the sense of colour fidelity can therefore be handled in a separate introductory stage before automatic image analysis.
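A simple stand-in for the colour adjustment algorithms mentioned above is the classic gray-world white-balance correction, sketched here for illustration (the study's own correction method may differ):

```python
# Gray-world white balance: scale each RGB channel so the channel means agree,
# removing a global colour cast. Illustrative, not the authors' algorithm.
import numpy as np

def gray_world(img):
    """img: float RGB array in [0, 1], shape (H, W, 3)."""
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel averages
    gain = means.mean() / means               # scale channels toward the gray mean
    return np.clip(img * gain, 0.0, 1.0)

rng = np.random.default_rng(2)
img = rng.random((4, 4, 3)) * np.array([1.0, 0.9, 0.7])   # simulated warm colour cast
print(gray_world(img).reshape(-1, 3).mean(axis=0))        # roughly equal channel means
```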
Development of a core set of outcome measures for OAB treatment.
Foust-Wright, Caroline; Wissig, Stephanie; Stowell, Caleb; Olson, Elizabeth; Anderson, Anita; Anger, Jennifer; Cardozo, Linda; Cotterill, Nikki; Gormley, Elizabeth Ann; Toozs-Hobson, Philip; Heesakkers, John; Herbison, Peter; Moore, Kate; McKinney, Jessica; Morse, Abraham; Pulliam, Samantha; Szonyi, George; Wagg, Adrian; Milsom, Ian
2017-12-01
Standardized measures enable the comparison of outcomes across providers and treatments, giving valuable information for improving care quality and efficacy. The aim of this project was to define a minimum standard set of outcome measures and case-mix factors for evaluating the care of patients with overactive bladder (OAB). The International Consortium for Health Outcomes Measurement (ICHOM) convened an international working group (WG) of leading clinicians and patients to engage in a structured method for developing a core outcome set. Consensus was determined by a modified Delphi process, and discussions were supported by both literature review and patient input. The standard set measures outcomes of care for adults seeking treatment for OAB, excluding residents of long-term care facilities. The WG focused on the outcome domains identified as most important to patients: symptom burden and bother, physical functioning, emotional health, impact of symptoms and treatment on quality of life, and success of treatment. Demographic information and case-mix factors that may affect these outcomes were also included. The standardized outcome set for evaluating clinical care is appropriate for use by all health providers caring for patients with OAB, regardless of specialty or geographic location, and provides key data for quality improvement activities and research.
Studies of the physical, yield and failure behavior of aliphatic polyketones
NASA Astrophysics Data System (ADS)
Karttunen, Nicole Renee
This thesis describes an investigation into the multiaxial yield and failure behavior of an aliphatic polyketone terpolymer. The behavior is studied as a function of: stress state, strain rate, temperature, and sample processing conditions. Results of this work include: elucidation of the behavior of a recently commercialized polymer, increased understanding of the effects listed above, insight into the effects of processing conditions on the morphology of the polyketone, and a description of yield strength of this material as a function of stress state, temperature, and strain rate. The first portion of work focuses on the behavior of a set of samples that are extruded under "common" processing conditions. Following this reference set of tests, the effect of testing this material at different temperatures is studied. A total of four different temperatures are examined. In addition, the effect of altering strain rate is examined. Testing is performed under pseudo-strain rate control at constant nominal octahedral shear strain rate for each failure envelope. A total of three different rates are studied. An extension of the first portion of work involves modeling the yield envelope. This is done by combining two approaches: continuum level and molecular level. The use of both methods allows the description of the yield envelope as a function of stress state, strain rate and temperature. The second portion of work involves the effects of processing conditions. For this work, additional samples are extruded with different shear and thermal histories than the "standard" material. One set of samples is processed with shear rates higher and lower than the standard. A second set is processed at higher and lower cooling rates than the standard. In order to understand the structural cause for changes in behavior with processing conditions, morphological characterization is performed on these samples. In particular, the effect on spherulitic structure is important. Residual stresses are also determined to be important to the behavior of the samples. Finally, an investigation into the crystalline structure of a family of aliphatic polyketones is performed. The effects of side group concentration and size are described.
Specifications for a Federal Information Processing Standard Data Dictionary System
NASA Technical Reports Server (NTRS)
Goldfine, A.
1984-01-01
The development of a software specification that Federal agencies may use in evaluating and selecting data dictionary systems (DDS) is discussed. To supply the flexibility needed by widely different applications and environments in the Federal Government, the Federal Information Processing Standard (FIPS) specifies a core DDS together with an optional set of modules. The focus and status of the development project are described. Functional specifications for the FIPS DDS are examined for the dictionary, the dictionary schema, and the dictionary processing system. The DDS user interfaces and DDS software interfaces are discussed as well as dictionary administration.
Compressed Sensing Quantum Process Tomography for Superconducting Quantum Gates
NASA Astrophysics Data System (ADS)
Rodionov, Andrey
An important challenge in quantum information science and quantum computing is the experimental realization of high-fidelity quantum operations on multi-qubit systems. Quantum process tomography (QPT) is a procedure devised to fully characterize a quantum operation. We first present the results of the estimation of the process matrix for superconducting multi-qubit quantum gates using the full data set employing various methods: linear inversion, maximum likelihood, and least-squares. To alleviate the problem of exponential resource scaling needed to characterize a multi-qubit system, we next investigate a compressed sensing (CS) method for QPT of two-qubit and three-qubit quantum gates. Using experimental data for two-qubit controlled-Z gates, taken with both Xmon and superconducting phase qubits, we obtain estimates for the process matrices with reasonably high fidelities compared to full QPT, despite using significantly reduced sets of initial states and measurement configurations. We show that the CS method still works when the amount of data is so small that the standard QPT would have an underdetermined system of equations. We also apply the CS method to the analysis of the three-qubit Toffoli gate with simulated noise, and similarly show that the method works well for a substantially reduced set of data. For the CS calculations we use two different bases in which the process matrix is approximately sparse (the Pauli-error basis and the singular value decomposition basis), and show that the resulting estimates of the process matrices match with reasonably high fidelity. For both two-qubit and three-qubit gates, we characterize the quantum process by its process matrix and average state fidelity, as well as by the corresponding standard deviation defined via the variation of the state fidelity for different initial states. We calculate the standard deviation of the average state fidelity both analytically and numerically, using a Monte Carlo method. Overall, we show that CS QPT offers a significant reduction in the needed amount of experimental data for two-qubit and three-qubit quantum gates.
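Linear inversion, the simplest of the estimation methods mentioned above, can be sketched for a single qubit in the Pauli transfer matrix picture. The simulated Z-rotation gate below stands in for experimental data; this is a minimal illustration, not the paper's multi-qubit or compressed-sensing machinery.

```python
# Minimal sketch of linear-inversion QPT for one qubit: prepare an
# informationally complete set of inputs, record output Pauli expectations,
# and invert the linear relation to get the Pauli transfer matrix (PTM).
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1, -1]).astype(complex)
paulis = [I, X, Y, Z]

def pauli_vec(rho):
    """Vector of Pauli expectation values Tr(P rho)."""
    return np.array([np.trace(P @ rho).real for P in paulis])

theta = 0.3                                   # 'unknown' gate: exp(-i theta Z / 2)
U = np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])
channel = lambda rho: U @ rho @ U.conj().T    # ideal simulation stands in for experiment

# Informationally complete inputs: |0>, |1>, |+>, |+i>
kets = [np.array([1, 0]), np.array([0, 1]),
        np.array([1, 1]) / np.sqrt(2), np.array([1, 1j]) / np.sqrt(2)]
rhos = [np.outer(k, k.conj()) for k in kets]

S_in = np.column_stack([pauli_vec(r) for r in rhos])
S_out = np.column_stack([pauli_vec(channel(r)) for r in rhos])
R = S_out @ np.linalg.inv(S_in)               # 4x4 Pauli transfer matrix

print(np.round(R, 3))                         # shows the rotation block in the X-Y plane
```

With noisy or reduced data this direct inversion becomes ill-conditioned, which is exactly the regime where the maximum-likelihood, least-squares, and compressed-sensing estimators discussed in the abstract matter.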
2012-01-01
Background: Optimization of the clinical care process by integration of evidence-based knowledge is one of the active components in care pathways. When studying the impact of a care pathway by using a cluster-randomized design, standardization of the care pathway intervention is crucial. This methodology paper describes the development of the clinical content of an evidence-based care pathway for in-hospital management of chronic obstructive pulmonary disease (COPD) exacerbation in the context of a cluster-randomized controlled trial (cRCT) on care pathway effectiveness. Methods: The clinical content of a care pathway for COPD exacerbation was developed based on recognized process design and guideline development methods. Subsequently, based on the COPD case study, a generalized eight-step method was designed to support the development of the clinical content of an evidence-based care pathway. Results: A set of 38 evidence-based key interventions and a set of 24 process and 15 outcome indicators were developed in eight different steps. Nine Belgian multidisciplinary teams piloted both the set of key interventions and indicators. The key intervention set was judged by the teams as being valid and clinically applicable. In addition, the pilot study showed that the indicators were feasible for the involved clinicians and patients. Conclusions: The set of 38 key interventions and the set of process and outcome indicators were found to be appropriate for the development and standardization of the clinical content of the COPD care pathway in the context of a cRCT on pathway effectiveness. The developed eight-step method may facilitate multidisciplinary teams caring for other patient populations in designing the clinical content of their future care pathways. PMID:23190552
A novel way of integrating rule-based knowledge into a web ontology language framework.
Gamberger, Dragan; Krstaçić, Goran; Jović, Alan
2013-01-01
Web ontology language (OWL), used in combination with the Protégé visual interface, is a modern standard for development and maintenance of ontologies and a powerful tool for knowledge presentation. In this work, we describe a novel possibility to use OWL also for the conceptualization of knowledge presented by a set of rules. In this approach, rules are represented as a hierarchy of actionable classes with necessary and sufficient conditions defined by the description logic formalism. The advantages are that: the set of the rules is not an unordered set anymore, the concepts defined in descriptive ontologies can be used directly in the bodies of rules, and Protégé presents an intuitive tool for editing the set of rules. Standard ontology reasoning processes are not applicable in this framework, but experiments conducted on the rule sets have demonstrated that the reasoning problems can be successfully solved.
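A small hypothetical example makes the rule-as-class idea concrete: the body of a rule becomes the necessary and sufficient conditions of an actionable class, written as a description logic equivalence axiom (the clinical vocabulary here is invented for illustration, not taken from the authors' ontology):

```latex
% Hypothetical rule: IF a patient has diabetes AND takes an anticoagulant
% THEN classify the patient as high risk.
% Recast as an OWL class with necessary-and-sufficient conditions:
\[
\mathit{HighRiskPatient} \equiv \mathit{Patient}
  \sqcap \exists\, \mathit{hasCondition}.\mathit{Diabetes}
  \sqcap \exists\, \mathit{takesDrug}.\mathit{Anticoagulant}
\]
\]
```

Because the conditions are necessary and sufficient, concepts from existing descriptive ontologies slot directly into the rule body, which is the integration benefit the abstract describes.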
North Carolina Public Schools Facility Standards. A Guide for Planning School Facilities.
ERIC Educational Resources Information Center
Knott, Gerald H.; Lora, James M.; Acker, Marjorie L.; Taynton, Steven; Logan, Gladys B.; Harrell, Ronald C.
The State of North Carolina has developed a planning guide for those in the process of building, enlarging, or renovating school facilities. This guide defines and describes the educational spaces needed to support a modern, comprehensive educational program and sets minimal standards for the types and sizes of spaces required. It serves as a…
A System Evaluation Theory Analyzing Value and Results Chain for Institutional Accreditation in Oman
ERIC Educational Resources Information Center
Paquibut, Rene Ymbong
2017-01-01
Purpose: This paper aims to apply the system evaluation theory (SET) to analyze the institutional quality standards of Oman Academic Accreditation Authority using the results chain and value chain tools. Design/methodology/approach: In systems thinking, the institutional standards are connected as input, process, output and feedback and leads to…
Van Hecke, Wim; Sijbers, Jan; De Backer, Steve; Poot, Dirk; Parizel, Paul M; Leemans, Alexander
2009-07-01
Although many studies are starting to use voxel-based analysis (VBA) methods to compare diffusion tensor images between healthy and diseased subjects, it has been demonstrated that VBA results depend heavily on parameter settings and implementation strategies, such as the applied coregistration technique, smoothing kernel width, statistical analysis, etc. In order to investigate the effect of different parameter settings and implementations on the accuracy and precision of the VBA results quantitatively, ground truth knowledge regarding the underlying microstructural alterations is required. To address the lack of such a gold standard, simulated diffusion tensor data sets are developed, which can model an array of anomalies in the diffusion properties of a predefined location. These data sets can be employed to evaluate the numerous parameters that characterize the pipeline of a VBA algorithm and to compare the accuracy, precision, and reproducibility of different post-processing approaches quantitatively. We are convinced that the use of these simulated data sets can improve the understanding of how different diffusion tensor image post-processing techniques affect the outcome of VBA. In turn, this may possibly lead to a more standardized and reliable evaluation of diffusion tensor data sets of large study groups with a wide range of white matter altering pathologies. The simulated DTI data sets will be made available online (http://www.dti.ua.ac.be).
Statistical Process Control: Going to the Limit for Quality.
ERIC Educational Resources Information Center
Training, 1987
1987-01-01
Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)
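A minimal sketch of the idea, assuming the common Shewhart-style limits of the mean plus or minus three standard deviations (the article itself does not specify a formula):

```python
# Statistical process control in miniature: set control limits from process
# data, then flag measurements outside them as signals to investigate.
import numpy as np

rng = np.random.default_rng(3)
measurements = rng.normal(10.0, 0.2, 100)      # in-control production data
measurements[57] = 11.2                        # inject a process upset

mean = measurements.mean()
sigma = measurements.std(ddof=1)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma  # upper/lower control limits

out_of_control = np.flatnonzero((measurements > ucl) | (measurements < lcl))
print(f"limits: [{lcl:.2f}, {ucl:.2f}], flagged samples: {out_of_control}")
```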
Proceduralism and Bureaucracy: Due Process in the School Setting
ERIC Educational Resources Information Center
Kirp, David L.
1976-01-01
The likely consequences of applying traditional due process standards, especially formal adversary hearings, to the public school are examined. The ruling in Goss v. Lopez suggests that fair treatment can still be expected if the hearings are treated as opportunities for candid and informal exchange rather than prepunishment ceremonies. (LBH)
Advocating the Broad Use of the Decision Tree Method in Education
ERIC Educational Resources Information Center
Gomes, Cristiano Mauro Assis; Almeida, Leandro S.
2017-01-01
Predictive studies have been widely undertaken in the field of education to provide strategic information about the extensive set of processes related to teaching and learning, as well as about what variables predict certain educational outcomes, such as academic achievement or dropout. As in any other area, there is a set of standard techniques…
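A minimal sketch of the decision tree method on a synthetic dropout-prediction task follows; the variable names and the generating rule are invented for illustration only.

```python
# Decision tree in an educational setting: predict dropout from two
# illustrative predictors. Data are synthetic; names are hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(4)
n = 300
attendance = rng.uniform(0.4, 1.0, n)                     # share of classes attended
gpa = rng.uniform(1.0, 4.0, n)
dropout = ((attendance < 0.6) & (gpa < 2.0)).astype(int)  # synthetic ground-truth rule

X = np.column_stack([attendance, gpa])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, dropout)
print(export_text(tree, feature_names=["attendance", "gpa"]))  # human-readable rules
```

The printed rule structure is the appeal of the method in education: the splits read as interpretable if-then statements about the predictors.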
A management, leadership, and board road map to transforming care for patients.
Toussaint, John
2013-01-01
Over the last decade I have studied 115 healthcare organizations in 11 countries, examining them from the boardroom to the patient bedside. In that time, I have observed one critical element missing from just about every facility: a set of standards that could reliably produce zero-defect care for patients. This lack of standards is largely rooted in the Sloan management approach, a top-down management and leadership structure devoid of standardized accountability. This article offers an alternative approach: management by process, an operating system that engages frontline staff in decisions and imposes standards and processes on the act of managing. Organizations that have adopted management by process have seen quality improve and costs decrease because the people closest to the work are expected to identify problems and solve them. Also detailed are the leadership behaviors required for an organization to successfully implement the management-by-process operating system and the board of trustees' role in supporting the transformation.
Evaluation of Standards for Access Control Enabling PHR-S Federation.
Mense, Alexander; Urbauer, Philipp; Sauermann, Stefan
2017-01-01
The adoption of the Internet of Things (IoT) and mobile applications in healthcare may transform the healthcare industry by offering better disease tracking and management as well as patient empowerment. Unfortunately, almost all of these new systems set up their own ecosystem, and to be really valuable for the care process they need to be integrated or federated with user-managed access control services based on international standards and profiles to enable interoperability. Thus, this work presents the results of an evaluation of available specifications for federated authorization, based on a set of basic requirements.
Evaluation and implementation of chemotherapy regimen validation in an electronic health record.
Diaz, Amber H; Bubalo, Joseph S
2014-12-01
Computerized provider order entry of chemotherapy regimens is quickly becoming the standard for prescribing chemotherapy in both inpatient and ambulatory settings. One of the difficulties with implementation of chemotherapy regimen computerized provider order entry lies in verifying the accuracy and completeness of all regimens built in the system library. Our goal was to develop, implement, and evaluate a process for validating chemotherapy regimens in an electronic health record. We describe our experience developing and implementing a process for validating chemotherapy regimens in the setting of a standard, commercially available computerized provider order entry system. The pilot project focused on validating chemotherapy regimens in the adult inpatient oncology setting and adult ambulatory hematologic malignancy setting. A chemotherapy regimen validation process was defined as a result of the pilot project. Over a 27-week pilot period, 32 chemotherapy regimens were validated using the process we developed. Results of the study suggest that by validating chemotherapy regimens, the amount of time spent by pharmacists in daily chemotherapy review was decreased. In addition, the number of pharmacist modifications required to make regimens complete and accurate were decreased. Both physician and pharmacy disciplines showed improved satisfaction and confidence levels with chemotherapy regimens after implementation of the validation system. Chemotherapy regimen validation required a considerable amount of planning and time but resulted in increased pharmacist efficiency and improved provider confidence and satisfaction.
Zhang, Zhongqi; Zhang, Aming; Xiao, Gang
2012-06-05
Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust the variations in the extent of back exchange from run to run. Second, a designed unique peptide (PPPI) with slow intrinsic HDX rate is employed as another internal standard to reflect the possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model to simulate the deuterium labeling and back exchange process. The HDX model is implemented into the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set starting from ion detection and peptide identification to final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most possible protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody by increasing concentrations of guanidine.
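The first correction can be illustrated with a toy calculation: the fast-exchanging internal standards should be fully deuterated, so any shortfall in their measured uptake estimates that run's back-exchange loss. A hedged sketch (the numbers and the simple rescaling are illustrative, not the authors' full comprehensive HDX model):

```python
# Toy back-exchange correction: a fast-exchanging internal standard peptide
# should reach full deuteration, so its measured/maximum ratio estimates the
# fraction of label retained in this run; analyte uptake is rescaled by it.
def correct_uptake(measured_d, standard_measured_d, standard_max_d):
    recovery = standard_measured_d / standard_max_d   # e.g. 0.7 => 30% back exchange
    return measured_d / recovery

# Analyte peptide observed at 3.5 deuterons; standard retained 2.8 of 4.0.
print(correct_uptake(3.5, 2.8, 4.0))                  # -> 5.0 deuterons before losses
```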
ERIC Educational Resources Information Center
Diamond, Esther E.
The growing demand for program evaluation in the 1970s gave rise to a need for a comprehensive, carefully developed, objective set of guiding principles for the evaluation process, from initial planning to final report. The Joint Committee on Standards for Educational Evaluation was established to meet this need. This broad-based group, representing…
ERIC Educational Resources Information Center
Ryan, David L.
2010-01-01
While research in academic and professional information technology (IT) journals addresses the need for strategic alignment and defined IT processes, there is little research on what factors should be considered when implementing specific IT hardware standards in an organization. The purpose of this study was to develop a set of factors for…
A Survey of Leadership Standards for Professional Preparation of Public School Principals in Kuwait
ERIC Educational Resources Information Center
Alansari, Amal EEHE
2012-01-01
Problem: Over the last decade, the Ministry of Education in Kuwait undertook the responsibility of reforming the Kuwaiti education system. While it noted the importance of school principals in this reform process, it has not yet focused on the development of school leaders through formal preparation. There were no standards set to guide school…
Lozano, Valeria A; Ibañez, Gabriela A; Olivieri, Alejandro C
2009-10-05
In the presence of analyte-background interactions and a significant background signal, both second-order multivariate calibration and standard addition are required for successful analyte quantitation achieving the second-order advantage. This report discusses a modified second-order standard addition method, in which the test data matrix is subtracted from the standard addition matrices, and quantitation proceeds via the classical external calibration procedure. It is shown that this novel data processing method allows one to apply not only parallel factor analysis (PARAFAC) and multivariate curve resolution-alternating least-squares (MCR-ALS), but also the recently introduced and more flexible partial least-squares (PLS) models coupled to residual bilinearization (RBL). In particular, the multidimensional variant N-PLS/RBL is shown to produce the best analytical results. The comparison is carried out with the aid of a set of simulated data, as well as two experimental data sets: one aimed at the determination of salicylate in human serum in the presence of naproxen as an additional interferent, and the second one devoted to the analysis of danofloxacin in human serum in the presence of salicylate.
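The matrix-subtraction step lends itself to a compact illustration. The following Python sketch uses invented toy data (a rank-one analyte signature plus noise) to show how subtracting the test-sample matrix from each standard-addition matrix reduces quantitation to an ordinary external-calibration fit; it is a schematic of the idea, not the PARAFAC/MCR-ALS/N-PLS/RBL processing used in the paper.

```python
import numpy as np

# Toy data: each standard-addition measurement is a (20 x 15) data matrix;
# subtracting the test-sample matrix leaves signal proportional to the
# added analyte, so a classical external-calibration line can be fit.
rng = np.random.default_rng(0)
profile = np.outer(np.linspace(0, 1, 20), np.linspace(1, 0, 15))  # analyte signature
added = np.array([0.0, 1.0, 2.0, 4.0])             # added concentrations
c_test = 1.5                                       # unknown, to be recovered
matrices = [(c_test + a) * profile + 0.01 * rng.standard_normal(profile.shape)
            for a in added]

test = matrices[0]                                 # test sample (no addition)
scores = [np.sum(m - test) for m in matrices[1:]]  # signal after subtraction
slope = np.polyfit(added[1:], scores, 1)[0]        # external-calibration slope
print(c_test, np.sum(test) / slope)                # recovered concentration
```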
Recommendations for Selecting Drug-Drug Interactions for Clinical Decision Support
Tilson, Hugh; Hines, Lisa E.; McEvoy, Gerald; Weinstein, David M.; Hansten, Philip D.; Matuszewski, Karl; le Comte, Marianne; Higby-Baker, Stefanie; Hanlon, Joseph T.; Pezzullo, Lynn; Vieson, Kathleen; Helwig, Amy L.; Huang, Shiew-Mei; Perre, Anthony; Bates, David W.; Poikonen, John; Wittie, Michael A.; Grizzle, Amy J.; Brown, Mary; Malone, Daniel C.
2016-01-01
Purpose To recommend principles for including drug-drug interactions (DDIs) in clinical decision support. Methods A conference series was conducted to improve clinical decision support (CDS) for DDIs. The Content Workgroup met monthly by webinar from January 2013 to February 2014, with two in-person meetings to reach consensus. The workgroup consisted of 20 experts in pharmacology, drug information, and CDS from academia, government agencies, health information technology (IT) vendors, and healthcare organizations. Workgroup members addressed four key questions: (1) What process should be used to develop and maintain a standard set of DDIs?; (2) What information should be included in a knowledgebase of standard DDIs?; (3) Can/should a list of contraindicated drug pairs be established?; and (4) How can DDI alerts be more intelligently filtered? Results To develop and maintain a standard set of DDIs for CDS in the United States, we recommend a transparent, systematic, and evidence-driven process with graded recommendations by a consensus panel of experts and oversight by a national organization. We outline key DDI information needed to help guide clinician decision-making. We recommend judicious classification of DDIs as contraindicated, as only a small set of drug combinations are truly contraindicated. Finally, we recommend more research to identify methods to safely reduce repetitive and less relevant alerts. Conclusion A systematic ongoing process is necessary to select DDIs for alerting clinicians. We anticipate that our recommendations can lead to consistent and clinically relevant content for interruptive DDIs, and thus reduce alert fatigue and improve patient safety. PMID:27045070
Butt, Muhammad Arif; Akram, Muhammad
2016-01-01
We present a new intuitionistic fuzzy rule-based decision-making system, based on intuitionistic fuzzy sets, for a process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of every process in the ready queue. Once the dp of every process has been calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is dispatched to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
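As a rough illustration of the scheduling flow described above, the Python sketch below computes a toy dynamic priority from nice value and burst time and sorts the ready queue by it. The membership and non-membership functions are invented for illustration; the paper's intuitionistic fuzzy inference engine is more elaborate.

```python
# Toy intuitionistic fuzzy priority: membership mu favors short burst time,
# non-membership nu penalizes low priority (high nice value); the hesitation
# margin (1 - mu - nu) is characteristic of intuitionistic fuzzy sets.

def dynamic_priority(nice, burst):
    """Compute a toy dynamic priority dp from nice value and burst time."""
    mu = max(0.0, 1.0 - burst / 100.0)            # degree process "deserves CPU"
    nu = max(0.0, min(1.0, (nice + 20) / 40.0))   # degree it does not
    hesitation = max(0.0, 1.0 - mu - nu)          # intuitionistic hesitation
    return mu - nu + 0.5 * hesitation

ready_queue = [("editor", -5, 12), ("batch", 10, 80), ("daemon", 0, 30)]
ready_queue.sort(key=lambda p: dynamic_priority(p[1], p[2]), reverse=True)
print([p[0] for p in ready_queue])  # highest-dp process is dispatched first
```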
Evaluating environmental survivability of optical coatings
NASA Astrophysics Data System (ADS)
Joseph, Shay; Yadlovker, Doron; Marcovitch, Orna; Zipin, Hedva
2009-05-01
In this paper we report ongoing research to correlate optical coating survivability with military (MIL) standards. For this purpose, 8 different types of coatings were deposited on 1" substrates of sapphire, multi-spectral ZnS (MS-ZnS), germanium, silicon and BK7. All coatings underwent MIL-standard evaluation as defined by customer specifications and passed successfully. Two other sets were left to age for 12 months at two different locations, one near central Tel-Aviv and one by the shoreline of the Mediterranean Sea. A third set was aged for 2000 hours in a special environmental chamber simulating temperature, humidity and ultra-violet (UV) radiation conditions simultaneously. Measurements of optical transmission before and after aging from all 3 sets reveal, in some cases, major transmission loss indicating severe coating damage. The different aging methods and their relation to the MIL standards are discussed in detail. The most pronounced conclusion is that MIL standards alone are not sufficient for predicting the lifetime of an externally mounted coated optical element; they are only useful for certifying the coating process and for comparison between coatings.
Bayesian non-parametric inference for stochastic epidemic models using Gaussian Processes.
Xu, Xiaoguang; Kypraios, Theodore; O'Neill, Philip D
2016-10-01
This paper considers novel Bayesian non-parametric methods for stochastic epidemic models. Many standard modeling and data analysis methods use underlying assumptions (e.g. concerning the rate at which new cases of disease will occur) which are rarely challenged or tested in practice. To relax these assumptions, we develop a Bayesian non-parametric approach using Gaussian Processes, specifically to estimate the infection process. The methods are illustrated with both simulated and real data sets, the former illustrating that the methods can recover the true infection process quite well in practice, and the latter illustrating that the methods can be successfully applied in different settings. © The Author 2016. Published by Oxford University Press.
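A minimal sketch of the core modeling idea, assuming a squared-exponential kernel and a log link (both illustrative choices; the abstract does not specify them): a Gaussian Process prior is placed on a latent function whose exponential gives a time-varying infection rate.

```python
import numpy as np

# Hand-rolled sketch of a GP prior on a time-varying infection rate.
# Kernel choice, grid, and hyperparameters are illustrative assumptions.

def sq_exp_kernel(t, length=5.0, var=1.0):
    d = t[:, None] - t[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

t = np.linspace(0, 50, 101)                        # epidemic time grid (days)
K = sq_exp_kernel(t) + 1e-8 * np.eye(t.size)       # jitter for stability
rng = np.random.default_rng(1)
f = rng.multivariate_normal(np.zeros(t.size), K)   # latent GP draw
beta = np.exp(f)                                   # positivity via log link
# beta(t) would drive the stochastic epidemic model's infection process;
# posterior inference (e.g. MCMC) would then update f given outbreak data.
print(beta[:5])
```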
Use of Virtual Reality Tools for Vestibular Disorders Rehabilitation: A Comprehensive Analysis.
Bergeron, Mathieu; Lortie, Catherine L; Guitton, Matthieu J
2015-01-01
Classical peripheral vestibular disorders rehabilitation is a long and costly process. While virtual reality settings have repeatedly been suggested as possible tools to help the rehabilitation process, no systematic study had been conducted. We systematically reviewed the current literature to analyze the published protocols documenting the use of virtual reality settings for peripheral vestibular disorders rehabilitation. There is an important diversity of settings and protocols involving virtual reality for the treatment of this pathology, and evaluation of symptoms is often not standardized. However, our results unveil a clear effect of virtual reality-based rehabilitation on patients' symptoms, assessed by objective tools such as the DHI (mean decrease of 27 points), shifting the perceived symptom handicap from moderate to mild impact on life. Furthermore, we detected a relationship between the duration of exposure to virtual reality environments and the magnitude of the therapeutic effects, suggesting that virtual reality treatments should last at least 150 minutes of cumulated exposure to ensure positive outcomes. Virtual reality offers a pleasant and safe environment for the patient. Future studies should standardize evaluation tools, document putative side effects further, compare virtual reality to conventional physical therapy, and evaluate the economic costs and benefits of such strategies.
The evolving story of information assurance at the DoD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, Philip LaRoche
2007-01-01
This document is a review of five documents on information assurance from the Department of Defense (DoD), namely 5200.40, 8510.1-M, 8500.1, 8500.2, and an "interim" document on DIACAP [9]. The five documents divide into three sets: (1) 5200.40 & 8510.1-M, (2) 8500.1 & 8500.2, and (3) the interim DIACAP document. The first two sets describe the certification and accreditation process known as "DITSCAP"; the last two sets describe the certification and accreditation process known as "DIACAP" (the second set applies to both processes). Each set of documents describes (1) a process, (2) a systems classification, and (3) a measurement standard. Appendices in this report (a) list the Phases, Activities, and Tasks of DITSCAP, (b) note the discrepancies between 5200.40 and 8510.1-M concerning DITSCAP Tasks and the System Security Authorization Agreement (SSAA), (c) analyze the DIACAP constraints on role fusion and on reporting, (d) map terms shared across the documents, and (e) review three additional documents on information assurance, namely DCID 6/3, NIST 800-37, and COBIT®.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Nan; Khanna, Nina Zheng; Fridley, David
Over the last twenty years, with growing policy emphasis on improving energy efficiency and reducing environmental pollution and carbon emissions, China has implemented a series of new minimum energy performance standards (MEPS) and mandatory and voluntary energy labels to improve appliance energy efficiency. As China begins planning for the next phase of standards and labeling (S&L) program development under the 12th Five Year Plan, an evaluation of recent program developments and future directions is needed to identify gaps that still exist when compared with international best practices. A review of China's S&L program development and implementation, in comparison with major findings from international experience, reveals that there are still areas for improvement, particularly when compared to the success factors observed across leading international S&L programs. China currently lacks a formalized regulatory process for standard-setting and does not have any legal or regulatory guidance on elements of S&L development such as stakeholder participation or the issue of legal precedence between conflicting national, industrial and local standards. Consequently, China's laws regarding standard-setting and management of the mandatory energy label program could be updated, as they have not been amended or revised recently and no longer reflect the current situation. While China uses similar principles for choosing target products as the U.S., Australia, the EU and Japan, including high energy consumption, a mature industry and testing procedures, and stakeholder support, recent MEPS revisions have generally aimed only at eliminating the least efficient 20% of the market. Setting a firm principle based on maximizing energy savings that are technically feasible and economically justified may help improve the stringency of China's MEPS program and reduce the need for frequent revisions. China also lacks robust survey data and relies primarily on market research data in relatively simple techno-economic analyses used to determine its efficiency standard levels, rather than the specific sets of analyses and tools used internationally. Based on international experience, inclusion of more detailed energy consumption surveys in the Chinese national census surveys and statistical reporting systems could help provide the necessary data for more comprehensive standard-setting analyses. In terms of stakeholder participation in the standards development process, participation in China is limited to membership on the technical committees responsible for developing or revising standards and generally does not include environmental groups, consumer associations, utilities and other NGOs. Increasing stakeholder involvement to broader interest groups could help garner more support and feedback in the S&L implementation process. China has emerged as a leader in a national verification testing scheme with complementary pilot check-testing projects, but it still faces challenges with insufficient funding, low awareness among some local regulatory agencies, resistance to check-testing by some manufacturers, limited product sampling scope, and testing inconsistency and incomparability of results. Thus, further financial and staff resources and capacity building will be needed to overcome these remaining challenges and to expand impact evaluations that assess the actual effectiveness of implementation and enforcement.
Darrah, J; Wiart, L; Magill-Evans, J; Ray, L; Andersen, J
2012-01-01
Family-centred service, functional goal setting and co-ordination of a child's move between programmes are important concepts of rehabilitation services for children with cerebral palsy identified in the literature. We examined whether these three concepts could be objectively identified in programmes providing services to children with cerebral palsy in Alberta, Canada. Programme managers (n = 37) and occupational and physical therapists (n = 54) representing 59 programmes participated in individual 1-h semi-structured interviews. Thirty-nine parents participated in eleven focus groups or two individual interviews. Evidence of family-centred values in mission statements and advisory boards was evaluated. Therapists were asked to identify three concepts of family-centred service and to complete the Measures of Process of Care for Service Providers. Therapists also identified therapy goals for children based on clinical case scenarios. The goals were coded using the components of the International Classification of Functioning Disability and Health. Programme managers and therapists discussed the processes in their programmes for goal setting and for preparing children and their families for their transition to other programmes. Parents reflected on their experiences with their child's rehabilitation related to family-centredness, goal setting and co-ordination between programmes. All respondents expressed commitment to the three concepts, but objective indicators of family-centred processes were lacking in many programmes. In most programmes, the processes to implement the three concepts were informal rather than standardized. Both families and therapists reported limited access to general information regarding community supports. Lack of formal processes for delivery of family-centred service, goal-setting and co-ordination between children's programmes may result in inequitable opportunities for families to participate in their children's rehabilitation despite attending the same programme. Standardized programme processes and policies may provide a starting point to ensure that all families have equitable opportunities to participate in their child's rehabilitation programme. © 2010 Blackwell Publishing Ltd.
Peer Review: Promoting Efficient School District Operations
ERIC Educational Resources Information Center
Hale, Jason S.
2010-01-01
Many professions recognize the benefits of peer reviews to assess processes and operations because peers can more easily identify one another's inefficiencies and provide some kind of intervention. Generally, the goal of the peer review process is to verify whether the work satisfies the standards set by the industry. A number of states have begun…
What Makes School Ethnography "Ethnographic?"
ERIC Educational Resources Information Center
Erickson, Frederick
The main point of this essay is that ethnography is an inquiry process guided by a point of view, rather than a reporting process guided by a standard technique or set of techniques. The essay suggests the application of Malinowski's theories and methods to an ethnology of the school and indicates reasons why traditional ethnography is inadequate to the…
Development of the private practice management standards for psychology.
Mathews, Rebecca; Stokes, David; Littlefield, Lyn; Collins, Leah
2011-01-01
This paper describes the process of developing a set of private practice management standards to support Australian psychologists and promote high-quality services to the public. A review of the literature was conducted to identify management standards relevant to psychology, which were further developed in consultation with a panel of experts in psychology or in the development of standards. Forty-three psychologists in independent private practice took part in either a survey (n=22) to provide feedback on the relevance of, and their compliance with, the identified standards, or a 6-month pilot study (n=21) in which a web-based self-assessment instrument evaluating the final set of standards and performance indicators was implemented in their practices to investigate self-reported change in management procedures. The pilot study demonstrated good outcomes for practitioners when evaluation of compliance with the standards was operationalized in a self-assessment format. Study results are based on a small sample size. Nevertheless, the standards were found to be relevant and useful, providing an initial version of management standards that have relevance to the practice of psychology in Australia, along with a system for evaluating psychological service provision to ensure best practice in service delivery. © 2010 National Association for Healthcare Quality.
NASA Astrophysics Data System (ADS)
Polyakova, Marina; Rubin, Gennadiy
2017-07-01
The modern theory of technological and economic development is based on long-term cycles. It has been shown that the technological structure of the economy can be subdivided into groups of technological complexes interrelated by similar technological links, so-called technological modes. A technological mode is defined as a complex of interrelated production units of a similar technological level that develop simultaneously. In order to ensure the competitiveness of products under new, changing conditions, it is necessary to make sure that they meet all the regulatory requirements specified in standards. But the fast-changing situation on merchandise markets causes an imbalance between growing customer requirements and the technological capabilities of the manufacturer. This makes the development of standardization even more urgent, both from the point of view of establishing current positions and from the point of view of promising development trends in technology. In this paper, scientific and engineering principles for developing standardization as a science are described. It is shown that further development of standardization rests on the principles of advanced standardization, the main idea of which is to set prospective requirements for an innovative product. Modern approaches to advanced standardization are presented. The complexity of the negotiation procedure between customer and manufacturer as a whole, and of achieving consensus in particular, makes it necessary to find conceptually new approaches to developing mathematical models. The developed methodology represents the process of achieving consensus between customer and manufacturer during the development of standard norms as a decreasing S-curve: at the end of the negotiation process, there is no difference between the customer's and the manufacturer's positions. This makes it possible to base the assessment on a differential equation relating the rate of change of the quality assessment to the distance of the estimated parameter value from the best value toward the worst one. The obtained mathematical model can be used in standardization practice to decrease the time needed for setting standard norms.
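Since the abstract describes the model only qualitatively, the following LaTeX fragment sketches one plausible reading of it as a decreasing logistic S-curve; the explicit form is an assumption for illustration, not the authors' published equation.

```latex
% Illustrative assumption: the disagreement D(t) between customer and
% manufacturer positions decays along a decreasing S-curve (logistic form),
% with the rate tied to how far the estimated parameter value q(t) sits
% from the best value q_best relative to the worst value q_worst.
\[
\frac{dD}{dt} \;=\; -k\,\frac{q_{\mathrm{worst}} - q(t)}{q_{\mathrm{worst}} - q_{\mathrm{best}}}\,
D\!\left(1 - \frac{D}{D_0}\right),
\qquad
D(t) \xrightarrow[t \to \infty]{} 0,
\]
% so D starts near the initial disagreement D_0, falls fastest midway
% through the negotiation, and vanishes when consensus is reached.
```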
The quality of care in occupational therapy: an assessment of selected Michigan hospitals.
Kirchman, M M
1979-07-01
In this study, a methodology was developed and tested for assessing the quality of care in occupational therapy between educational and noneducational clinical settings, as measured by process and outcome. An instrument was constructed for an external audit of the hospital record. Standards drafted by the investigator were established as normative by a panel of experts for use in judging the programs. Hospital records of 84 patients with residual hemiparesis or hemiplegia in three noneducational settings and of 100 patients with similar diagnoses in two educational clinical settings from selected Michigan facilities were chosen by proportionate stratified random sampling. The process study showed that occupational therapy was of significantly higher quality in the educational settings. The outcome study did not show significant differences between types of settings. Implications for education and practice are discussed.
Guidelines on Good Clinical Laboratory Practice
Ezzelle, J.; Rodriguez-Chavez, I. R.; Darden, J. M.; Stirewalt, M.; Kunwar, N.; Hitchcock, R.; Walter, T.; D’Souza, M. P.
2008-01-01
A set of Good Clinical Laboratory Practice (GCLP) standards that embraces both the research and clinical aspects of GLP was developed utilizing a variety of collected regulatory and guidance material. We describe eleven core elements that constitute the GCLP standards, with the objective of filling a gap in laboratory guidance, based on IND sponsor requirements, for conducting laboratory testing using specimens from human clinical trials. These GCLP standards provide guidance on implementing GLP requirements that are critical for laboratory operations, such as performance of protocol-mandated safety assays, peripheral blood mononuclear cell processing, and immunological or endpoint assays from biological interventions on IND-registered clinical trials. The expectation is that compliance with the GCLP standards, monitored annually by external audits, will allow research and development laboratories to maintain data integrity and to provide immunogenicity, safety, and product efficacy data that are repeatable, reliable, auditable and can be easily reconstructed in a research setting. PMID:18037599
NASA Astrophysics Data System (ADS)
Flores, Jorge L.; García-Torales, G.; Ponce Ávila, Cristina
2006-08-01
This paper describes an in situ image recognition system designed to inspect the quality of chocolate pops during their production. The essence of the recognition system is the localization of events (i.e., defects) in the input images that affect the quality standards of the pops. To this end, processing modules based on correlation filtering and image segmentation are employed to measure the quality standards. We designed the correlation filter and defined a set of features from the correlation plane. The desired values for these parameters are obtained by exploiting information about the objects to be rejected, in order to find the optimal discrimination capability of the system. Based on this set of features, a pop can be correctly classified. The efficacy of the system has been tested thoroughly under laboratory conditions using at least 50 images containing 3 different types of possible defects.
The Production Data Approach for Full Lifecycle Management
NASA Astrophysics Data System (ADS)
Schopf, J.
2012-04-01
The amount of data generated by scientists is growing exponentially, and studies have shown [Koe04] that un-archived data sets have a resource half-life that is only a fraction of that of electronically archived resources. Most groups still lack standard approaches and procedures for data management. Arguably, however, scientists know something about building software. A recent article in Nature [Mer10] stated that 45% of research scientists spend more time now developing software than they did 5 years ago, and 38% spent at least one fifth of their time developing software. Fox argues [Fox10] that a simple release of data is not the correct approach to data curation. In addition, just as software is used in a wide variety of ways never initially envisioned by its developers, we are seeing this to an even greater extent with data sets. In order to address the need for better data preservation and access, we propose that data sets should be managed in a fashion similar to building production-quality software. These production data sets are not simply published once, but go through a cyclical process, including phases such as design, development, verification, deployment, support, analysis, and then development again, thereby supporting the full lifecycle of a data set. The process involved in academically produced software changes over time with respect to issues such as how much it is used outside the development group, but factors in aspects such as knowing who is using the code, enabling multiple developers to contribute to code development with common procedures, formal testing and release processes, developing documentation, and licensing. When we work with data, whether as a collection source, as someone tagging data, or as someone re-using it, many of the lessons learned in building production software are applicable. Table 1 compares production software elements to production data elements.

Table 1: Comparison of production software and production data.

| Production Software | Production Data |
| End-user considerations | End-user considerations |
| Multiple coders: repository with check-in procedures; coding standards | Multiple producers/collectors: local archive with check-in procedure; metadata standards |
| Formal testing | Formal testing |
| Bug tracking and fixes | Bug tracking and fixes, QA/QC |
| Documentation | Documentation |
| Formal release process | Formal release process to external archive |
| License | Citation/usage statement |

The full presentation of this abstract will include a detailed discussion of these issues so that researchers can produce usable and accessible data sets as a first step toward reproducible science. By creating production-quality data sets, we extend the potential of our data, both in terms of usability and usefulness to ourselves and other researchers. The more we treat data with formal processes and release cycles, the more relevant and useful it can be to the scientific community.
Standardised assessment of functioning in ADHD: consensus on the ICF Core Sets for ADHD.
Bölte, Sven; Mahdi, Soheil; Coghill, David; Gau, Susan Shur-Fen; Granlund, Mats; Holtmann, Martin; Karande, Sunil; Levy, Florence; Rohde, Luis A; Segerer, Wolfgang; de Vries, Petrus J; Selb, Melissa
2018-02-12
Attention-deficit/hyperactivity disorder (ADHD) is associated with significant impairments in social, educational, and occupational functioning, as well as specific strengths. Currently, there is no internationally accepted standard for assessing the functioning of individuals with ADHD. WHO's International Classification of Functioning, Disability and Health, child and youth version (ICF), can serve as a conceptual basis for such a standard. The objective of this study was to develop a comprehensive, a common brief, and three age-appropriate brief ICF Core Sets for ADHD. Using a standardised methodology, four international preparatory studies generated 132 second-level ICF candidate categories that served as the basis for developing the ADHD Core Sets. Using these categories and following an iterative consensus process, 20 ADHD experts from nine professional disciplines, representing all six WHO regions, selected the most relevant categories to constitute the ADHD Core Sets. The consensus process resulted in 72 second-level ICF categories forming the Comprehensive ICF Core Set, representing 8 body functions, 35 activities and participation, and 29 environmental categories. A Common Brief Core Set including 38 categories was also defined. Age-specific brief Core Sets included a 47-category preschool version for ages 0-5 years, a 55-category school-age version for ages 6-16 years, and a 52-category version for older adolescents and adults aged 17 years and above. The ICF Core Sets for ADHD mark a milestone toward an internationally standardised functional assessment of ADHD across the lifespan, and across educational, administrative, clinical, and research settings.
Use of benefit-cost analysis in establishing Federal radiation protection standards: a review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, L.E.
1979-10-01
This paper complements other work which has evaluated the cost impacts of radiation standards on the nuclear industry. It focuses on the approaches to valuation of the health and safety benefits of radiation standards and the actual and appropriate processes of benefit-cost comparison. A brief historical review of the rationale(s) for the levels of radiation standards prior to 1970 is given. The Nuclear Regulatory Commission (NRC) established numerical design objectives for light water reactors (LWRs). The process of establishing these numerical design criteria below the radiation protection standards set in 10 CFR 20 is reviewed. EPA's 40 CFR 190 environmental standards for the uranium fuel cycle have lower values than NRC's radiation protection standards in 10 CFR 20. The task of allocating EPA's 40 CFR 190 standards to the various portions of the fuel cycle was left to the implementing agency, NRC. So whether or not EPA's standards for the uranium fuel cycle are more stringent for LWRs than NRC's numerical design objectives depends on how EPA's standards are implemented by NRC. In setting the numerical levels in Appendix I to 10 CFR 50 and in 40 CFR 190, NRC and EPA, respectively, focused on the costs of compliance with various levels of radiation control. A major portion of the paper is devoted to a review and critique of the available methods for valuing health and safety benefits. All current approaches try to estimate a constant value of life and use this to value the expected number of lives saved. This paper argues that it is more appropriate to seek a value of a reduction in risks to health and life that varies with the extent of these risks. Additional research to do this is recommended. (DC)
A standard satellite control reference model
NASA Technical Reports Server (NTRS)
Golden, Constance
1994-01-01
This paper describes a Satellite Control Reference Model that provides the basis for an approach to identify where standards would be beneficial in supporting space operations functions. The background and context for the development of the model and the approach are described. A process for using this reference model to trace top level interoperability directives to specific sets of engineering interface standards that must be implemented to meet these directives is discussed. Issues in developing a 'universal' reference model are also identified.
Magee, Michelle F
2007-05-15
Evolving elements of best practices for providing targeted glycemic control in the hospital setting, clinical performance measurement, basal-bolus plus correction-dose insulin regimens, components of standardized subcutaneous (s.c.) insulin order sets, and strategies for implementation and cost justification of glycemic control initiatives are discussed. Best practices for targeted glycemic control should address accurate documentation of hyperglycemia, initial patient assessment, the management plan, the target blood glucose range, blood glucose monitoring frequency, maintenance of glycemic control, criteria for glucose management consultations, and standardized insulin order sets and protocols. Establishing clinical performance measures, including desirable processes and outcomes, can help ensure the success of targeted hospital glycemic control initiatives. The basal-bolus plus correction-dose regimen for insulin administration is used to mimic the normal physiologic pattern of endogenous insulin secretion. Standardized insulin order sets and protocols are being used to minimize the risk of error in insulin therapy. Components of standardized s.c. insulin order sets include specification of the hyperglycemia diagnosis, finger stick blood glucose monitoring frequency and timing, the target blood glucose concentration range, cutoff values for excessively high or low blood glucose concentrations that warrant alerting the physician, basal and prandial or nutritional (i.e., bolus) insulin, correction doses, hypoglycemia treatment, and perioperative or procedural dosage adjustments. The endorsement of hospital administrators and key physician and nursing leaders is needed for glycemic control initiatives. Initiatives may be cost justified on the basis of billings for clinical diabetes management services and/or the return on investment accrued from reductions in hospital length of stay and readmissions, and from accurate documentation and coding of unrecognized or uncontrolled diabetes and diabetes complications. Standardized insulin order sets and protocols may minimize the risk of insulin errors. The endorsement of these protocols by administrators, physicians, nurses, and pharmacists is also needed for success.
2013-06-01
ABBREVIATIONS ANSI American National Standards Institute ASIS American Society of Industrial Security CCTV Closed Circuit Television CONOPS...is globally recognized for the development and maintenance of standards. ASTM defines a specification as an explicit set of requirements...www.rkb.us/saver/. One of the SAVER reports titled CCTV Technology Handbook has a chapter on system design. The report uses terms like functional
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-14
... Industrial Process Cooling X Towers. R Gasoline Distribution X S Pulp & Paper MACT I X T Halogenated Solvent.... IIII Auto & Light Duty Truck (Surface X Coating). JJJJ Paper & Other Webs (Surface X Coating). KKKK... subparts B, H, I, K, Q, R, T, and W. For the part 63 NESHAPs, this includes the NESHAPs set forth in the...
Electric Power: Contemporary Issues and the Federal Role in Oversight and Regulation.
1981-12-21
Regulatory Commission NRECA National Rural Electric Cooperative Asso- ciation PURPA Public Utility Regulatory Policies Act of 1978 REA Rural...energy efficiency standards for certain products and processes, and sets standards for solar energy and conservation in Federal buildings. PURPA --the...conservation, efficient use of facilities and resources, and equitable rates to electric consumers. PURPA also (1) encourages the use of cogeneration and
Progress in the development of paper-based diagnostics for low-resource point-of-care settings
Byrnes, Samantha; Thiessen, Gregory; Fu, Elain
2014-01-01
This Review focuses on recent work in the field of paper microfluidics that specifically addresses the goal of translating the multistep processes that are characteristic of gold-standard laboratory tests to low-resource point-of-care settings. A major challenge is to implement multistep processes with the robust fluid control required to achieve the necessary sensitivity and specificity of a given application in a user-friendly package that minimizes equipment. We review key work in the areas of fluidic controls for automation in paper-based devices, readout methods that minimize dedicated equipment, and power and heating methods that are compatible with low-resource point-of-care settings. We also highlight a focused set of recent applications and discuss future challenges. PMID:24256361
Study of materials for space processing
NASA Technical Reports Server (NTRS)
Lal, R. B.
1975-01-01
Materials were selected for device applications and their commercial use. Experimental arrangements were also made for electrical characterization of single crystals using electrical resistivity and Hall effect measurements. The experimental set-up was tested with some standard samples.
Prue-Owens, Kathy; Watkins, Miko; Wolgast, Kelly A
2011-01-01
The Patient CaringTouch System emerged from a comprehensive assessment and gap analysis of clinical nursing capabilities in the Army. The Patient CaringTouch System now provides the framework and set of standards by which we drive excellence in quality nursing care for our patients and excellence in quality of life for our nurses in Army Medicine. As part of this enterprise transformation, we placed particular emphasis on the delivery of nursing care at the bedside as well as the integration of a formal professional peer feedback process in support of individual nurse practice enhancement. The Warrior Care Imperative Action Team was chartered to define and establish the standards for care teams in the clinical settings and the process by which we established formal peer feedback for our professional nurses. This back-to-basics approach is a cornerstone of the Patient CaringTouch System implementation and sustainment.
ERIC Educational Resources Information Center
Hantula, Donald A.
1995-01-01
Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…
ERIC Educational Resources Information Center
Stowers, Robert H.; Barker, Randolph T.
2010-01-01
This article explores the uses of coaching and mentoring as they apply to organizational communication professors. The authors contend that these professors already are proficient at coaching and mentoring and the coaching and mentoring processes are routinely undertaken as part of their standard university teaching responsibilities. As coaches,…
The Effect and Importance of Technology in the Research Process
ERIC Educational Resources Information Center
Cuff, Ed
2014-01-01
From elementary schooling to doctoral-level education, technology has become an integral part of the learning process in and out of the classroom. With the implementation of the Common Core Learning Standards, the skills required for research are more valuable than ever, for they are required to succeed in a college setting, as well as in the…
Homeland Security Lessons for the United States
2004-06-01
international standard for AML / CFT practices is set by the forty Recommendations of the Financial Action Task Force, or FATF, an inter-governmental...to foster sound AML / CFT practices. Singapore has a strong tradition for rigorous supervision of financial institutions. The two aspects of this...supervisory process with regards to AML / CFT are: issuing detailed guidelines to financial institutions, setting out their obligations with respect to
Halabi, Sam F; Lin, Ching-Fu
An extensive global system of private food regulation is under construction, one that exceeds conventional regulation thought of as being driven by public authorities like FDA and USDA in the U.S. or the Food Standards Agency in the UK. Agrifood and grocer organizations, in concert with some farming groups, have been the primary designers of this new food regulatory regime. These groups have established alliances that compete with national regulators in complex ways. This article analyzes the relationship between public and private sources of food safety regulation by examining standards adopted by the Codex Alimentarius Commission, a food safety organization jointly run by the Food and Agricultural Organization and the World Health Organization, and by GlobalG.A.P., a farm assurance program created in the late 1990s by supermarket chains and their major suppliers that has now expanded into a global certifying coalition. While Codex standards are adopted, often as written, by national food safety regulators who are the principal drivers of the standard-setting process, customers for agricultural products in many countries now demand evidence of GlobalG.A.P. certification as a prerequisite for doing business. This article tests not only the durability and strength of private sector standard setting in the food safety system, but also the desirability of that system as an alternative to formal, governmental processes embodied, for our purposes, in the standards adopted by Codex. In many cases, official standards and GlobalG.A.P. standards clash in ways that implicate not only food safety but the flow of agricultural products in the global trading system. The article analyzes current weaknesses in both regimes and possibilities for change that will better reconcile the two competing systems.
Lessons Learned on Quality (of) Standards
NASA Astrophysics Data System (ADS)
Gerlich, Rainer; Gerlich, Ralf
2011-08-01
Standards are used to describe and ensure the quality of products, services and processes throughout almost all branches of industry, including the field of software engineering. Contractors and suppliers are obligated by their customers and certification authorities to follow a certain set of standards during development. For example, a customer can more easily participate in and control the contractor's process when a standard process is enforced. However, as with any requirement, a standard may also impede the contractor or supplier in assuring the actual quality of the product, in the sense of fitness for the purpose intended by the customer. This is the case when a standard prescribes specific quality assurance activities requiring a considerable amount of effort while other more efficient but equivalent or even superior approaches are blocked. Improvement of the cost-to-quality ratio beyond minuscule advances is then heavily impeded. While in some parts being too specific in defining the mechanisms of the enforced process, standards are sometimes too weak in defining the principles or goals of controlling product quality. This paper therefore addresses the following issues: (1) Which conclusions can be drawn about the quality and efficiency of a standard? (2) If and how is it possible to improve or evolve a standard? (3) How well does a standard guide a user toward high quality of the end product? One conclusion is that the analyzed standards do interfere with technological innovation, even though they leave considerable freedom for concretization and are understood as technology-independent. Another conclusion is that standards are not only a matter of quality but also a matter of industrial competitiveness, given the resulting costs and time-to-market. When the costs induced by a standard are not commensurate with the achievable quality, industry suffers a significant disadvantage.
Data preprocessing methods of FT-NIR spectral data for the classification cooking oil
NASA Astrophysics Data System (ADS)
Ruah, Mas Ezatul Nadia Mohd; Rasaruddin, Nor Fazila; Fong, Sim Siong; Jaafar, Mohd Zuli
2014-12-01
This recent work describes data pre-processing methods for FT-NIR spectroscopy datasets of cooking oil and its quality parameters using chemometric methods. Pre-processing of near-infrared (NIR) spectral data has become an integral part of chemometric modelling. Hence, this work is dedicated to investigating the utility and effectiveness of pre-processing algorithms, namely row scaling, column scaling and single scaling with Standard Normal Variate (SNV). Combinations of these scaling methods affect both exploratory analysis and classification via Principal Component Analysis (PCA) plots. The samples were divided into palm oil and non-palm cooking oil. The classification model was built using FT-NIR cooking oil spectra datasets in absorbance mode over the range 4000cm-1-14000cm-1. A Savitzky-Golay derivative was applied before developing the classification model. The data were then separated into a training set and a test set using the Duplex method, with the size of each class kept equal to 2/3 of the class with the minimum number of samples. The t-statistic was then employed as a variable-selection method to determine which variables contribute significantly to the classification models. Data pre-processing was evaluated using the modified silhouette width (mSW), PCA, and the percentage correctly classified (%CC). The results show that different pre-processing strategies lead to substantially different model performance. The effects of the pre-processing methods, i.e. row scaling, column standardisation and single scaling with Standard Normal Variate, are indicated by mSW and %CC. With a two-PC model, all five classifiers gave high %CC except Quadratic Distance Analysis.
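For concreteness, the following Python sketch implements the SNV row-scaling step described above on invented toy spectra; the column-standardisation variant would operate along the other axis.

```python
import numpy as np

# Minimal sketch of Standard Normal Variate (SNV) row scaling: each
# spectrum (row) is centered and scaled by its own standard deviation,
# removing multiplicative scatter effects before PCA. Data are invented.

def snv(spectra):
    """Apply SNV to a (samples x wavelengths) matrix, row by row."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, ddof=1, keepdims=True)
    return (spectra - mean) / std

rng = np.random.default_rng(0)
X = rng.random((5, 200)) * rng.random((5, 1)) * 3   # toy spectra, varied scale
X_snv = snv(X)
print(X_snv.mean(axis=1).round(6), X_snv.std(axis=1, ddof=1).round(6))
# Each row now has mean ~0 and unit variance; column standardisation
# (per-wavelength scaling) would be the analogous operation along axis=0.
```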
Gómez-Benito, Juana; Guilera, Georgina; Barrios, Maite; Rojo, Emilio; Pino, Oscar; Gorostiaga, Arantxa; Balluerka, Nekane; Hidalgo, María Dolores; Padilla, José Luis; Benítez, Isabel; Selb, Melissa
2017-07-30
Based on the International Classification of Functioning, Disability and Health (ICF), this paper presents the results of the process to develop the Comprehensive and Brief Core Sets for schizophrenia, which allow functioning in persons with schizophrenia to be described comprehensively. Twenty health professionals from diverse backgrounds participated in a formal and iterative decision-making process during an international consensus conference to develop these Core Sets. The conference was carried out based on evidence gathered from four preparatory studies (a systematic literature review, a qualitative study, an expert survey, and an empirical study). The first step of this decision-making and consensus process comprised discussions and voting in working groups and plenary sessions to develop the comprehensive version. The categories of the Comprehensive ICF Core Set for schizophrenia served as the basis for the second step, a ranking and cutoff procedure to decide on the brief version. Of the 184 candidate categories identified in the preparatory studies, 97 categories were included in the Comprehensive Core Set for schizophrenia. A total of 25 categories were selected to constitute the Brief Core Set. The formal decision-making and consensus process, integrating evidence from four preparatory studies and expert opinion, led to the first version of the Core Sets for schizophrenia. The Comprehensive and Brief Core Sets for schizophrenia may provide a common language among different health professionals and researchers, and a basic international standard for measuring, reporting, and assessing the functioning of persons with schizophrenia. Implications for rehabilitation: Schizophrenia is a chronic mental disorder that has a tremendous impact on the functioning and daily life of persons living with the disorder. The International Classification of Functioning, Disability and Health (ICF) offers an internationally recognized standard for describing the functioning status of these individuals. The Core Sets for schizophrenia have potential use in supporting rehabilitation practice, such as planning mental health services and other interventions, defining rehabilitation goals, and documenting patient care. The Core Sets for schizophrenia may also be used to promote interdisciplinary coordination and facilitate communication between members of a multidisciplinary rehabilitation team. Rehabilitation research is another potential area of application of the Core Sets for schizophrenia. This is valuable, since rehabilitation research provides crucial evidence for optimizing rehabilitation practice.
Unified transform architecture for AVC, AVS, VC-1 and HEVC high-performance codecs
NASA Astrophysics Data System (ADS)
Dias, Tiago; Roma, Nuno; Sousa, Leonel
2014-12-01
A unified architecture for fast and efficient computation of the set of two-dimensional (2-D) transforms adopted by the most recent state-of-the-art digital video standards is presented in this paper. In contrast to other designs with similar functionality, the presented architecture is supported by a scalable, modular and completely configurable processing structure. This flexible structure not only allows the architecture to be easily reconfigured to support different transform kernels, but also permits resizing to efficiently support transforms of different orders (e.g. order-4, order-8, order-16 and order-32). Consequently, it is not only highly suitable for realizing high-performance multi-standard transform cores, but it also offers highly efficient implementations of specialized processing structures addressing only the reduced subset of transforms used by a specific video standard. The experimental results obtained by prototyping several configurations of this processing structure in a Xilinx Virtex-7 FPGA show the superior performance and hardware efficiency provided by the proposed unified architecture for implementing transform cores for the Advanced Video Coding (AVC), Audio Video coding Standard (AVS), VC-1 and High Efficiency Video Coding (HEVC) standards. In addition, these results also demonstrate the ability of this processing structure to realize multi-standard transform cores supporting all the standards mentioned above, capable of processing the 8k Ultra High Definition Television (UHDTV) video format (7,680 × 4,320 at 30 fps) in real time.
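The row-column decomposition that such a unified architecture exploits can be sketched in a few lines: one routine applies any separable 2-D transform, and reconfiguration amounts to swapping the kernel matrix. The Python sketch below uses the HEVC-style 4x4 integer core transform matrix as an example; it illustrates the dataflow only, not the FPGA processing structure.

```python
import numpy as np

# HEVC-style 4x4 integer core transform matrix; a different standard or
# order would simply substitute its own kernel into the same routine.
H4 = np.array([[64,  64,  64,  64],
               [83,  36, -36, -83],
               [64, -64, -64,  64],
               [36, -83,  83, -36]])

def transform_2d(block, kernel):
    """Row-column separable 2-D transform: kernel * block * kernel^T."""
    return kernel @ block @ kernel.T

residual = np.arange(16).reshape(4, 4)   # toy residual block
coeffs = transform_2d(residual, H4)      # forward transform coefficients
print(coeffs)
```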
Effect of standards on new equipment design by new international standards and industry restraints
NASA Astrophysics Data System (ADS)
Endelman, Lincoln L.
1991-01-01
The use of international standards to further trade is one of the objectives of creating a standard. By making form, fit and function compatible, the free interchange of manufactured goods can proceed without hindrance. Unfortunately, by setting up standards that are peculiar to a particular country or district, it is possible to exclude competition from a group of manufacturers. A major effort is now underway to develop international laser standards. In the May 1990 issue of Laser Focus World, Donald R. Johnson, the director of industrial technology services for the National Institute of Standards and Technology (NIST, formerly the National Bureau of Standards), is quoted as follows: "The common means of protectionism has been through certification for the market place." The article goes on to say that Mr. Johnson expects this tradition to continue and that the new European Community (EC) will demand not just safety standards but performance standards as well, and that the American laser industry must move very quickly on this issue or risk being left behind the European standards bandwagon. The article continues: laser companies must get involved in the actual standards negotiating process if they are to have a say in future policy, and a single set of standards would reduce the need to repeatedly recalibrate products for different national markets. As a member of ISO TC-72 SC9 I am
Barnes, Rebecca; Albert, Monique; Damaraju, Sambasivarao; de Sousa-Hitzler, Jean; Kodeeswaran, Sugy; Mes-Masson, Anne-Marie; Watson, Peter; Schacter, Brent
2013-12-01
Despite the integral role of biorepositories in fueling translational research and the advancement of medicine, there are significant gaps in harmonization of biobanking practices, resulting in variable biospecimen collection, storage, and processing. This significantly impacts accurate downstream analysis and, in particular, creates a problem for biorepository networks or consortia. The Canadian Tumour Repository Network (CTRNet; www.ctrnet.ca) is a consortium of Canadian tumor biorepositories that aims to enhance biobanking capacity and quality through standardization. To minimize the issue of variable biobanking practices throughout its network, CTRNet has developed and maintained a comprehensive set of 45 standard operating procedures (SOPs). There were four key elements to the CTRNet SOP development process: 1) an SOP development team was formed from members across CTRNet to co-produce each SOP; 2) a principal author was appointed with responsibility for overall coordination of the SOP development process; 3) the CTRNet Management Committee (composed of principal investigators for each member biorepository) reviewed/revised each SOP completed by the development team; and 4) external expert reviewers provided feedback and recommendations on each SOP. Once final Management Committee approval was obtained, the ratified SOP was published on the CTRNet website for public access. Since the SOPs were first published on the CTRNet website (June 2008), there have been approximately 15,000 downloads of one or more CTRNet SOPs/Policies by users from over 60 countries. In accordance with biobanking best practices, CTRNet performs an exhaustive review of its SOPs at set intervals, to coincide with each granting cycle. The last revision was completed in May 2012.
An overview of the National Space Science data Center Standard Information Retrieval System (SIRS)
NASA Technical Reports Server (NTRS)
Shapiro, A.; Blecher, S.; Verson, E. E.; King, M. L. (Editor)
1974-01-01
A general overview is given of the National Space Science Data Center (NSSDC) Standard Information Retrieval System. Described in general terms are the information system that contains the data files and the software system that processes and manipulates the files maintained at the Data Center. Emphasis is placed on providing users with an overview of the capabilities and uses of the NSSDC Standard Information Retrieval System (SIRS). The examples given are taken from the files at the Data Center. Detailed information about NSSDC data files is documented in a set of File Users Guides, with one user's guide prepared for each file processed by SIRS. Detailed information about SIRS itself is presented in the SIRS Users Guide.
Core Outcome Set-STAndards for Development: The COS-STAD recommendations.
Kirkham, Jamie J; Davis, Katherine; Altman, Douglas G; Blazeby, Jane M; Clarke, Mike; Tunis, Sean; Williamson, Paula R
2017-11-01
The use of core outcome sets (COS) ensures that researchers measure and report those outcomes that are most likely to be relevant to users of their research. Several hundred COS projects have been systematically identified to date, but there has been no formal quality assessment of these studies. The Core Outcome Set-STAndards for Development (COS-STAD) project aimed to identify minimum standards for the design of a COS study agreed upon by an international group, while other specific guidance exists for the final reporting of COS development studies (Core Outcome Set-STAndards for Reporting [COS-STAR]). An international group of experienced COS developers, methodologists, journal editors, potential users of COS (clinical trialists, systematic reviewers, and clinical guideline developers), and patient representatives produced the COS-STAD recommendations to help improve the quality of COS development and support the assessment of whether a COS had been developed using a reasonable approach. An open survey of experts generated an initial list of items, which was refined by a 2-round Delphi survey involving nearly 250 participants representing key stakeholder groups. Participants assigned importance ratings for each item using a 1-9 scale. Consensus that an item should be included in the set of minimum standards was defined as at least 70% of the voting participants from each stakeholder group providing a score between 7 and 9. The Delphi survey was followed by a consensus discussion with the study management group representing multiple stakeholder groups. COS-STAD contains 11 standards that constitute the minimum design recommendations for all COS development projects. The recommendations focus on 3 key domains: the scope, the stakeholders, and the consensus process. The COS-STAD project has established 11 minimum standards to be followed by COS developers when planning their projects and by users when deciding whether a COS has been developed using reasonable methods.
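The consensus rule above is fully specified: a 1-9 importance scale, a 7-9 "include" band, and a 70% threshold that must be met within every stakeholder group. As a minimal illustrative sketch in Python (the function name and data layout are assumptions, not part of COS-STAD), the rule can be written as:

def item_reaches_consensus(ratings_by_group, lo=7, hi=9, threshold=0.70):
    """Return True if, in every stakeholder group, at least `threshold` of
    the voting participants rated the item between `lo` and `hi` (1-9 scale)."""
    for ratings in ratings_by_group.values():
        votes = [r for r in ratings if r is not None]  # ignore abstentions
        if not votes:
            return False  # a group with no votes cannot support consensus
        share = sum(lo <= r <= hi for r in votes) / len(votes)
        if share < threshold:
            return False
    return True

# Example: both groups clear the 70% bar, so the item enters the standard set.
ratings = {
    "trialists": [9, 8, 7, 7, 6, 9, 8, 8, 7, 9],  # 90% in 7-9
    "patients": [7, 8, 9, 9, 5, 7, 8, 8],         # 87.5% in 7-9
}
print(item_reaches_consensus(ratings))  # True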
Patel, Sanjay R.; Weng, Jia; Rueschman, Michael; Dudley, Katherine A.; Loredo, Jose S.; Mossavar-Rahmani, Yasmin; Ramirez, Maricelle; Ramos, Alberto R.; Reid, Kathryn; Seiger, Ashley N.; Sotres-Alvarez, Daniela; Zee, Phyllis C.; Wang, Rui
2015-01-01
Study Objectives: While actigraphy is considered objective, the process of setting rest intervals to calculate sleep variables is subjective. We sought to evaluate the reproducibility of actigraphy-derived measures of sleep using a standardized algorithm for setting rest intervals. Design: Observational study. Setting: Community-based. Participants: A random sample of 50 adults aged 18–64 years free of severe sleep apnea participating in the Sueño sleep ancillary study to the Hispanic Community Health Study/Study of Latinos. Interventions: N/A. Measurements and Results: Participants underwent 7 days of continuous wrist actigraphy and completed daily sleep diaries. Studies were scored twice by each of two scorers. Rest intervals were set using a standardized hierarchical approach based on event marker, diary, light, and activity data. Sleep/wake status was then determined for each 30-sec epoch using a validated algorithm, and this was used to generate 11 variables: mean nightly sleep duration, nap duration, 24-h sleep duration, sleep latency, sleep maintenance efficiency, sleep fragmentation index, sleep onset time, sleep offset time, sleep midpoint time, standard deviation of sleep duration, and standard deviation of sleep midpoint. Intra-scorer intraclass correlation coefficients (ICCs) were high, ranging from 0.911 to 0.995 across all 11 variables. Similarly, inter-scorer ICCs were high, also ranging from 0.911 to 0.995, and mean inter-scorer differences were small. Bland-Altman plots did not reveal any systematic disagreement in scoring. Conclusions: With use of a standardized algorithm to set rest intervals, scoring of actigraphy for the purpose of generating a wide array of sleep variables is highly reproducible. Citation: Patel SR, Weng J, Rueschman M, Dudley KA, Loredo JS, Mossavar-Rahmani Y, Ramirez M, Ramos AR, Reid K, Seiger AN, Sotres-Alvarez D, Zee PC, Wang R. Reproducibility of a standardized actigraphy scoring algorithm for sleep in a US Hispanic/Latino population. SLEEP 2015;38(9):1497–1503. PMID:25845697
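The agreement statistics reported above are intraclass correlation coefficients computed across repeated scorings of the same studies. As a rough sketch of how such a coefficient can be obtained from a subjects-by-scorers matrix, the following NumPy code implements ICC(3,1) (two-way mixed, consistency, per Shrout and Fleiss); the particular ICC form and the demo values are assumptions, since the study does not state which variant was used.

import numpy as np

def icc_consistency(scores):
    """scores: (n_subjects, k_scorers) array holding one sleep variable."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    # Between-subjects mean square.
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    # Residual mean square after removing subject and scorer effects.
    resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0, keepdims=True) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Two scorers' sleep-duration values (minutes) for five participants (made up).
demo = [[420, 418], [385, 390], [450, 449], [402, 404], [365, 363]]
print(round(icc_consistency(demo), 3))  # ~0.996, i.e. near-perfect agreement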
Standard setting: the crucial issues. A case study of accounting & auditing.
Nowakowski, J R
1982-01-01
A study of standard-setting efforts in accounting and auditing is reported. The study reveals four major areas of concern in a professional standard-setting effort: (1) issues related to the rationale for setting standards, (2) issues related to the standard-setting board and its support structure, (3) issues related to the content of standards and rules for generating them, and (4) issues that deal with how standards are put to use. Principles derived from the study of accounting and auditing are provided to illuminate and assess standard-setting efforts in evaluation.
NASA Astrophysics Data System (ADS)
Mayernik, M. S.; Daniels, M.; Eaker, C.; Strand, G.; Williams, S. F.; Worley, S. J.
2012-12-01
Data sets exist within scientific research and knowledge networks as both technical and non-technical entities. Establishing the quality of data sets is a multi-faceted task that encompasses many automated and manual processes. Data sets have always been essential for science research, but now need to be more visible as first-class scholarly objects at national, international, and local levels. Many initiatives are establishing procedures to publish and curate data sets, as well as to promote professional rewards for researchers that collect, create, manage, and preserve data sets. Traditionally, research quality has been assessed by peer review of textual publications, e.g. journal articles, conference proceedings, and books. Citation indices then provide standard measures of productivity used to reward individuals for their peer-reviewed work. Whether a similar peer review process is appropriate for assessing and ensuring the quality of data sets remains an open question. How does the traditional process of peer review apply to data sets? This presentation will describe current work being done at the National Center for Atmospheric Research (NCAR) in the context of the Peer REview for Publication & Accreditation of Research Data in the Earth sciences (PREPARDE) project. PREPARDE is assessing practices and processes for data peer review, with the goal of developing recommendations. NCAR data management teams perform various kinds of quality assessment and review of data sets prior to making them publicly available. The poster will investigate how notions of peer review relate to the types of data review already in place at NCAR. We highlight the data set characteristics and management/archiving processes that challenge the traditional peer review processes by using a number of questions as probes, including: Who is qualified to review data sets? What formal and informal documentation is necessary to allow someone outside of a research team to review a data set? What data set review can be done pre-publication, and what must be done post-publication? What components of the data set review processes can be automated, and what components will always require human expertise and evaluation?
EPA Collaboration on International Air Pollution Standards for Aircraft
EPA has collaborated with the United Nation’s International Civil Aviation Organization (ICAO) to set a timeframe for initiating the U.S. domestic regulatory process for addressing greenhouse gas emissions from aircraft under the Clean Air Act.
Boosting standard order sets utilization through clinical decision support.
Li, Haomin; Zhang, Yinsheng; Cheng, Haixia; Lu, Xudong; Duan, Huilong
2013-01-01
Well-designed standard order sets have the potential to integrate and coordinate care by communicating best practices through multiple disciplines, levels of care, and services. However, several challenges affect the benefits expected from standard order sets. To boost standard order set utilization, a problem-oriented knowledge delivery solution was proposed in this study to facilitate access to standard order sets and evaluation of their treatment effects. In this solution, standard order sets were created along with diagnostic rule sets that can trigger a CDS-based reminder to help clinicians quickly discover hidden clinical problems and corresponding standard order sets during ordering. Those rule sets also provide indicators for targeted evaluation of standard order sets during treatment. A prototype system was developed based on this solution and will be presented at Medinfo 2013.
Defense Facility Condition: Revised Guidance Needed to Improve Oversight of Assessments and Ratings
2016-06-01
are to implement the standardized process in part by assessing the condition of buildings, pavement, and rail using the same set of software tools...facility to current standards; costs for labor, equipment, materials, and currency exchange rates overseas; costs for project planning and design ...example, the services are to assess the condition of buildings, pavement, and rail using Sustainment Management System software tools developed by the
Process Improvement in a Radically Changing Organization
NASA Technical Reports Server (NTRS)
Varga, Denise M.; Wilson, Barbara M.
2007-01-01
This presentation describes how the NASA Glenn Research Center planned and implemented a process improvement effort in response to a radically changing environment. As a result of a presidential decision to redefine the Agency's mission, many ongoing projects were canceled and future workload would be awarded based on relevance to the Exploration Initiative. NASA imposed a new Procedural Requirements standard on all future software development, and the Center needed to redesign its processes from CMM Level 2 objectives to meet the new standard and position itself for CMMI. The intended audience for this presentation is systems/software developers and managers in a large, research-oriented organization that may need to respond to imposed standards while also pursuing CMMI Maturity Level goals. A set of internally developed tools will be presented, including an overall Process Improvement Action Item database, a formal inspection/peer review tool, metrics collection spreadsheet, and other related technologies. The Center also found a need to charter Technical Working Groups (TWGs) to address particular Process Areas. In addition, a Marketing TWG was needed to communicate the process changes to the development community, including an innovative web site portal.
Kay, Jack F
2016-05-01
The Codex Committee on Residues of Veterinary Drugs in Food (CCRVDF) fulfils a number of functions revolving around standard setting. The core activities of the CCRVDF include agreeing priorities for assessing veterinary drug residues, recommending maximum residue limits for veterinary drugs in foods of animal origin, considering methods of sampling and analyses, and developing codes of practice. Draft standards are developed and progress through an agreed series of steps common to all Codex Alimentarius Commission Committees. Meetings of the CCRVDF are held at approximately 18-month intervals. To ensure effective progress is made with meetings at this frequency, the CCRVDF makes use of a number of management tools. These include circular letters to interested parties, physical and electronic drafting groups between plenary sessions, meetings of interested parties immediately prior to sessions, as well as breakout groups within sessions and detailed discussions within the CCRVDF plenary sessions. A range of these approaches is required to assist advances within the standards setting process and can be applied to other Codex areas and international standard setting more generally. Copyright © 2016 John Wiley & Sons, Ltd.
A model-driven approach to information security compliance
NASA Astrophysics Data System (ADS)
Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena
2017-06-01
The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be approached holistically, combining assets that support corporate systems in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding in it the mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.
A Framework for Categorizing Important Project Variables
NASA Technical Reports Server (NTRS)
Parsons, Vickie S.
2003-01-01
While substantial research has led to theories concerning the variables that affect project success, no universal set of such variables has been acknowledged as the standard. The identification of a specific set of controllable variables is needed to minimize project failure. Much has been hypothesized about the need to match project controls and management processes to individual projects in order to increase the chance for success. However, an accepted taxonomy for facilitating this matching process does not exist. This paper surveyed existing literature on classification of project variables. After an analysis of those proposals, a simplified categorization is offered to encourage further research.
Democratizing Implementation and Innovation in Mental Health Care.
Saxe, Glenn; Acri, Mary
2017-03-01
Improvements in the quality of mental health care in the United States depend on the successful implementation of evidence-based treatments (EBT's) in typical settings of care. Unfortunately, there is little evidence that EBT's are used in ways that would approximate their established fidelity standards in such settings. This article describes an approach to more successful implementation of EBT's via a collaborative process between intervention developers and intervention users (e.g. providers, administrators, consumers) called Lead-user Innovation. Lead-user Innovation democratizes the implementation process by integrating the expertise of lead-users in the delivery, adaptation, innovation and evaluation of EBT's.
Wiltz, Jennifer L; Blanck, Heidi M; Lee, Brian; Kocot, S Lawrence; Seeff, Laura; McGuire, Lisa C; Collins, Janet
2017-10-26
Electronic information technology standards facilitate high-quality, uniform collection of data for improved delivery and measurement of health care services. Electronic information standards also aid information exchange between secure systems that link health care and public health for better coordination of patient care and better-informed population health improvement activities. We developed international data standards for healthy weight that provide common definitions for electronic information technology. The standards capture healthy weight data on the "ABCDs" of a visit to a health care provider that addresses initial obesity prevention and care: assessment, behaviors, continuity, identify resources, and set goals. The process of creating healthy weight standards consisted of identifying needs and priorities, developing and harmonizing standards, testing the exchange of data messages, and demonstrating use-cases. Healthy weight products include 2 message standards, 5 use-cases, 31 LOINC (Logical Observation Identifiers Names and Codes) question codes, 7 healthy weight value sets, 15 public-private engagements with health information technology implementers, and 2 technical guides. A logic model and action steps outline activities toward better data capture, interoperable systems, and information use. Sharing experiences and leveraging this work in the context of broader priorities can inform the development of electronic information standards for similar core conditions and guide strategic activities in electronic systems.
Blanck, Heidi M.; Lee, Brian; Kocot, S. Lawrence; Seeff, Laura; McGuire, Lisa C.; Collins, Janet
2017-01-01
Electronic information technology standards facilitate high-quality, uniform collection of data for improved delivery and measurement of health care services. Electronic information standards also aid information exchange between secure systems that link health care and public health for better coordination of patient care and better-informed population health improvement activities. We developed international data standards for healthy weight that provide common definitions for electronic information technology. The standards capture healthy weight data on the “ABCDs” of a visit to a health care provider that addresses initial obesity prevention and care: assessment, behaviors, continuity, identify resources, and set goals. The process of creating healthy weight standards consisted of identifying needs and priorities, developing and harmonizing standards, testing the exchange of data messages, and demonstrating use-cases. Healthy weight products include 2 message standards, 5 use-cases, 31 LOINC (Logical Observation Identifiers Names and Codes) question codes, 7 healthy weight value sets, 15 public–private engagements with health information technology implementers, and 2 technical guides. A logic model and action steps outline activities toward better data capture, interoperable systems, and information use. Sharing experiences and leveraging this work in the context of broader priorities can inform the development of electronic information standards for similar core conditions and guide strategic activities in electronic systems. PMID:29072985
Automatic Lung-RADS™ classification with a natural language processing system.
Beyer, Sebastian E; McKee, Brady J; Regis, Shawn M; McKee, Andrea B; Flacke, Sebastian; El Saadawi, Gilan; Wald, Christoph
2017-09-01
Our aim was to train a natural language processing (NLP) algorithm to capture imaging characteristics of lung nodules reported in a structured CT report and suggest the applicable Lung-RADS™ (LR) category. Our study included structured, clinical reports of consecutive CT lung screening (CTLS) exams performed from 08/2014 to 08/2015 at an ACR accredited Lung Cancer Screening Center. All patients screened were at high-risk for lung cancer according to the NCCN Guidelines®. All exams were interpreted by one of three radiologists credentialed to read CTLS exams using LR, with a standard reporting template. Training and test sets consisted of consecutive exams. Lung screening exams were divided into two groups: three training sets (500, 120, and 383 reports each) and one final evaluation set (498 reports). NLP algorithm results were compared with the gold standard of the LR category assigned by the radiologist. The sensitivity/specificity of the NLP algorithm to correctly assign LR categories for suspicious nodules (LR 4) and positive nodules (LR 3/4) were 74.1%/98.6% and 75.0%/98.8%, respectively. The majority of mismatches occurred in cases where pulmonary findings not currently addressed by LR were present. Misclassifications also resulted from the failure to identify exams as follow-up and the failure to completely characterize part-solid nodules. In a sub-group analysis among structured reports with standardized language, the sensitivity and specificity to detect LR 4 nodules were 87.0% and 99.5%, respectively. An NLP system can accurately suggest the appropriate LR category from CTLS exam findings when standardized reporting is used.
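To reproduce this style of evaluation, the sketch below computes sensitivity and specificity by comparing NLP-suggested categories against the radiologist-assigned gold standard, once with LR 4 as positive (suspicious nodules) and once with LR 3/4 as positive, mirroring the two analyses above. The toy labels are invented; only the positive-category definitions come from the abstract.

def sens_spec(gold, predicted, positive):
    pairs = list(zip(gold, predicted))
    tp = sum(g in positive and p in positive for g, p in pairs)
    fn = sum(g in positive and p not in positive for g, p in pairs)
    tn = sum(g not in positive and p not in positive for g, p in pairs)
    fp = sum(g not in positive and p in positive for g, p in pairs)
    return tp / (tp + fn), tn / (tn + fp)

gold      = [1, 2, 4, 3, 4, 2, 1, 4, 3, 2]  # radiologist LR categories
predicted = [1, 2, 4, 3, 2, 2, 1, 4, 4, 2]  # NLP-suggested LR categories
print(sens_spec(gold, predicted, positive={4}))     # suspicious: LR 4
print(sens_spec(gold, predicted, positive={3, 4}))  # positive: LR 3/4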
Automatic Lung-RADS™ classification with a natural language processing system
Beyer, Sebastian E.; Regis, Shawn M.; McKee, Andrea B.; Flacke, Sebastian; El Saadawi, Gilan; Wald, Christoph
2017-01-01
Background Our aim was to train a natural language processing (NLP) algorithm to capture imaging characteristics of lung nodules reported in a structured CT report and suggest the applicable Lung-RADS™ (LR) category. Methods Our study included structured, clinical reports of consecutive CT lung screening (CTLS) exams performed from 08/2014 to 08/2015 at an ACR accredited Lung Cancer Screening Center. All patients screened were at high-risk for lung cancer according to the NCCN Guidelines®. All exams were interpreted by one of three radiologists credentialed to read CTLS exams using LR, with a standard reporting template. Training and test sets consisted of consecutive exams. Lung screening exams were divided into two groups: three training sets (500, 120, and 383 reports each) and one final evaluation set (498 reports). NLP algorithm results were compared with the gold standard of the LR category assigned by the radiologist. Results The sensitivity/specificity of the NLP algorithm to correctly assign LR categories for suspicious nodules (LR 4) and positive nodules (LR 3/4) were 74.1%/98.6% and 75.0%/98.8%, respectively. The majority of mismatches occurred in cases where pulmonary findings not currently addressed by LR were present. Misclassifications also resulted from the failure to identify exams as follow-up and the failure to completely characterize part-solid nodules. In a sub-group analysis among structured reports with standardized language, the sensitivity and specificity to detect LR 4 nodules were 87.0% and 99.5%, respectively. Conclusions An NLP system can accurately suggest the appropriate LR category from CTLS exam findings when standardized reporting is used. PMID:29221286
GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research
NASA Astrophysics Data System (ADS)
Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.
2015-05-01
To date, the most common way to deal with geographical information and processes still appears to be consuming local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards to exchange data on the World Wide Web, enhanced in Europe by the INSPIRE Directive, is bound to change the way people (among them research scientists, especially in environmental sciences) make use of, and manage, spatial data. A clever use of OGC standards can help scientists to better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it using the Spatial Data Infrastructure (SDI) GéoSAS, the SDI set up for researchers' needs in our department. It is based on the existing open-source, modular and interoperable Spatial Data Architecture geOrchestra.
Use of simulated data sets to evaluate the fidelity of metagenomic processing methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mavromatis, K; Ivanova, N; Barry, Kerrie
2007-01-01
Metagenomics is a rapidly emerging field of research for studying microbial communities. To evaluate methods presently used to process metagenomic sequences, we constructed three simulated data sets of varying complexity by combining sequencing reads randomly selected from 113 isolate genomes. These data sets were designed to model real metagenomes in terms of complexity and phylogenetic composition. We assembled sampled reads using three commonly used genome assemblers (Phrap, Arachne and JAZZ), and predicted genes using two popular gene-finding pipelines (fgenesb and CRITICA/GLIMMER). The phylogenetic origins of the assembled contigs were predicted using one sequence similarity-based (BLAST hit distribution) and two sequence composition-based (PhyloPythia, oligonucleotide frequencies) binning methods. We explored the effects of the simulated community structure and method combinations on the fidelity of each processing step by comparison to the corresponding isolate genomes. The simulated data sets are available online to facilitate standardized benchmarking of tools for metagenomic analysis.
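A minimal sketch of the data-set construction step described above: reads are drawn at random from a pool of isolate genomes, and each read keeps a record of its source genome so that the true origin is available when benchmarking assembly and binning. The read length, toy sequences, and abundance weighting are illustrative assumptions, not the paper's actual parameters.

import random

def sample_reads(genomes, n_reads, read_len=800, weights=None, seed=42):
    """genomes: dict mapping genome id -> sequence string."""
    rng = random.Random(seed)
    ids = list(genomes)
    reads = []
    for _ in range(n_reads):
        gid = rng.choices(ids, weights=weights)[0]        # pick a source genome
        seq = genomes[gid]
        start = rng.randrange(max(1, len(seq) - read_len))
        reads.append((gid, seq[start:start + read_len]))  # keep origin as truth
    return reads

toy = {"genomeA": "ACGT" * 5000, "genomeB": "TTGCA" * 4000}
reads = sample_reads(toy, n_reads=100, read_len=100, weights=[3, 1])
print(len(reads), reads[0][0])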
Setting the Scope of Concept Inventories for Introductory Computing Subjects
ERIC Educational Resources Information Center
Goldman, Ken; Gross, Paul; Heeren, Cinda; Herman, Geoffrey L.; Kaczmarczyk, Lisa; Loui, Michael C.; Zilles, Craig
2010-01-01
A concept inventory is a standardized assessment tool intended to evaluate a student's understanding of the core concepts of a topic. In order to create a concept inventory it is necessary to accurately identify these core concepts. A Delphi process is a structured multi-step process that uses a group of experts to achieve a consensus opinion. We…
ERIC Educational Resources Information Center
Heine, Angela; Tamm, Sascha; Wissmann, Jacqueline; Jacobs, Arthur M.
2011-01-01
Whether and in what way enumeration processes differ for small and large sets of objects is still a matter of debate. In order to shed light on this issue, EEG data were obtained from 60 normally developing elementary school children. Adopting a standard non-symbolic numerical comparison paradigm allowed us to manipulate numerical distance between…
Standard operating procedures for clinical research departments.
Kee, Ashley Nichole
2011-01-01
A set of standard operating procedures (SOPs) provides a clinical research department with clear roles, responsibilities, and processes to ensure compliance, accuracy, and timeliness of data. SOPs also serve as a standardized training program for new employees. A practice may have an employee who can assist in the development of SOPs. There are also consultants who specialize in working with a practice to develop and write practice-specific SOPs. Making SOPs a priority will save a practice time and money in the long run and make the research practice more attractive to corporate study sponsors.
A New Feedback-Based Method for Parameter Adaptation in Image Processing Routines.
Khan, Arif Ul Maula; Mikut, Ralf; Reischl, Markus
2016-01-01
The parametrization of automatic image processing routines is time-consuming if a lot of image processing parameters are involved. An expert can tune parameters sequentially to get desired results. This may not be productive for applications with difficult image analysis tasks, e.g. when high noise and shading levels in an image are present or images vary in their characteristics due to different acquisition conditions. Parameters are required to be tuned simultaneously. We propose a framework to improve standard image segmentation methods by using feedback-based automatic parameter adaptation. Moreover, we compare algorithms by implementing them in a feedforward fashion and then adapting their parameters. This comparison is proposed to be evaluated by a benchmark data set that contains challenging image distortions in an increasing fashion. This promptly enables us to compare different standard image segmentation algorithms in a feedback vs. feedforward implementation by evaluating their segmentation quality and robustness. We also propose an efficient way of performing automatic image analysis when only abstract ground truth is present. Such a framework evaluates robustness of different image processing pipelines using a graded data set. This is useful for both end-users and experts.
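A minimal sketch of the feedback loop described above, under stated assumptions: a single threshold parameter of a toy segmentation routine is adjusted by hill climbing on a quality score computed from the routine's own output, with an abstract ground truth (a target foreground fraction) standing in for pixel-level labels. The real framework tunes many parameters simultaneously; this only illustrates the feedback principle.

import numpy as np

def segment(image, threshold):
    return image > threshold  # toy segmentation: global thresholding

def quality(mask, expected_fill=0.3):
    # Abstract ground truth: closeness of the foreground fraction to a target.
    return -abs(mask.mean() - expected_fill)

def adapt_threshold(image, t0=0.5, step=0.05, iters=50):
    t = t0
    best = quality(segment(image, t))
    for _ in range(iters):  # feedback: re-evaluate quality after each change
        for cand in (t - step, t + step):
            q = quality(segment(image, cand))
            if q > best:
                t, best = cand, q
    return t

img = np.random.default_rng(0).random((64, 64))
print(round(adapt_threshold(img), 2))  # threshold yielding ~30% foreground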
A New Feedback-Based Method for Parameter Adaptation in Image Processing Routines
Mikut, Ralf; Reischl, Markus
2016-01-01
The parametrization of automatic image processing routines is time-consuming if a lot of image processing parameters are involved. An expert can tune parameters sequentially to get desired results. This may not be productive for applications with difficult image analysis tasks, e.g. when high noise and shading levels in an image are present or images vary in their characteristics due to different acquisition conditions. Parameters are required to be tuned simultaneously. We propose a framework to improve standard image segmentation methods by using feedback-based automatic parameter adaptation. Moreover, we compare algorithms by implementing them in a feedforward fashion and then adapting their parameters. This comparison is proposed to be evaluated by a benchmark data set that contains challenging image distortions in an increasing fashion. This promptly enables us to compare different standard image segmentation algorithms in a feedback vs. feedforward implementation by evaluating their segmentation quality and robustness. We also propose an efficient way of performing automatic image analysis when only abstract ground truth is present. Such a framework evaluates robustness of different image processing pipelines using a graded data set. This is useful for both end-users and experts. PMID:27764213
Correa, Cassia Bellotto; Pires, Juliana Rico; Fernandes-Filho, Romeu Belon; Sartori, Rafael; Vaz, Luis Geraldo
2009-07-01
The influence of fatigue and the fluoride ion corrosion process on Streptococcus mutans adherence to commercially pure titanium (Cp Ti) implant/component set surfaces was studied. Thirty Nobel implants and 30 Neodent implants were used. Each commercial brand was divided into three groups. Group A: control; Group B: sets submitted to fatigue (10⁵ cycles, 15 Hz, 150 N); and Group C: sets submitted to fluoride (1500 ppm, pH 5.5) and fatigue, simulating a mean use of 5 years in the oral medium. Afterward, the sets were contaminated with standard strains of S. mutans (NTCC 1023) and analyzed by scanning electron microscopy (SEM) and colony-forming unit counts (CFU/mL). By SEM, bacterial adherence was verified only in group C in both brands. By CFU/mL counts, S. mutans was statistically higher in both brands in group C than in groups A and B (p < 0.05, ANOVA). The process of corrosion by fluoride ions on Cp Ti implant/component sets allowed greater S. mutans adherence than in the absence of corrosion and with the fatigue process in isolation.
Processing of meteorological data with ultrasonic thermoanemometers
NASA Astrophysics Data System (ADS)
Telminov, A. E.; Bogushevich, A. Ya.; Korolkov, V. A.; Botygin, I. A.
2017-11-01
The article describes a software system intended to support scientific research on the atmosphere during the processing of data gathered by multi-level ultrasonic complexes for automated monitoring of meteorological and turbulent parameters in the ground layer of the atmosphere. The system processes files containing data sets of instantaneous values of temperature, the three orthogonal components of wind speed, humidity and pressure. The processing task is executed in multiple stages. During the first stage, the system executes the researcher's query for meteorological parameters. At the second stage, the system computes a series of standard statistical properties of the meteorological fields, such as means, variance (dispersion), standard deviation, skewness (asymmetry) and excess kurtosis coefficients, correlations, etc. The third stage prepares for computing the parameters of atmospheric turbulence. The computation results are displayed to the user and stored on the hard drive.
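A minimal sketch of the second-stage statistics listed above (mean, variance, standard deviation, skewness, excess kurtosis, and correlation) applied to simulated instantaneous readings; the signal models and series length are assumptions.

import numpy as np

def basic_stats(x):
    x = np.asarray(x, dtype=float)
    m = x.mean()
    d = x - m
    var = (d ** 2).mean()          # dispersion
    sd = var ** 0.5                # standard deviation
    return {
        "mean": m,
        "variance": var,
        "std": sd,
        "skewness": (d ** 3).mean() / sd ** 3,             # asymmetry coefficient
        "excess_kurtosis": (d ** 4).mean() / sd ** 4 - 3.0,
    }

rng = np.random.default_rng(1)
temp = 20.0 + 0.5 * rng.standard_normal(6000)  # simulated temperature series
u = 3.0 + 0.8 * rng.standard_normal(6000)      # one wind-speed component
print(basic_stats(temp))
print("corr(T, u):", np.corrcoef(temp, u)[0, 1])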
Christensen, Leif; Karle, Hans; Nystrup, Jørgen
2007-09-01
An outcome-based approach to medical education, as compared to a process/content orientation, is currently being discussed intensively. In this article, the interrelationship of process and outcome in medical education is discussed, with specific emphasis on the relation to the definition of standards in basic medical education. Perceptions of outcome have always been an integrated element of curricular planning. The present debate underlines the need for a stronger focus on learning objectives and outcome assessment in many medical schools around the world. The need to maintain an integrated approach to process/content and outcome is underlined in this paper. A worry is expressed about the taxonomy of learning in purely outcome-based medical education, in which student assessment can be a major determinant of the learning process, leaving control of the medical curriculum to medical examiners. Moreover, curricula which favour reductionism by stating everything in terms of instrumental outcomes or competences face a risk of lowering quality and become prey to political interference. Standards based on outcome alone raise unresolved problems in relation to the licensure requirements of medical doctors. It is argued that the alleged dichotomy between process/content and outcome seems artificial, and that the formulation of standards in medical education must follow a comprehensive line in curricular planning.
Transputer parallel processing at NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Ellis, Graham K.
1989-01-01
The transputer parallel processing lab at NASA Lewis Research Center (LeRC) consists of 69 processors (transputers) that can be connected into various networks for use in general purpose concurrent processing applications. The main goal of the lab is to develop concurrent scientific and engineering application programs that will take advantage of the computational speed increases available on a parallel processor over the traditional sequential processor. Current research involves the development of basic programming tools. These tools will help standardize program interfaces to specific hardware by providing a set of common libraries for applications programmers. The thrust of the current effort is in developing a set of tools for graphics rendering/animation. The applications programmer currently has two options for on-screen plotting. One option can be used for static graphics displays and the other can be used for animated motion. The option for static display involves the use of 2-D graphics primitives that can be called from within an application program. These routines perform the standard 2-D geometric graphics operations in real-coordinate space as well as allowing multiple windows on a single screen.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Metal Halide Lamp Ballasts and Fixtures Energy Conservation Standards § 431.329 Enforcement. Process for Metal Halide Lamp Ballasts. This section sets forth procedures DOE will follow in pursuing alleged... with the following statistical sampling procedures for metal halide lamp ballasts, with the methods...
EPA's Policy on Evaluating Risk to Children
It is the policy of EPA to consider the risks to infants and children consistently and explicitly as a part of risk assessments generated during its decision making process, including the setting of standards to protect public health and the environment.
Ontological simulation for educational process organisation in a higher educational institution
NASA Astrophysics Data System (ADS)
Berestneva, O. G.; Marukhina, O. V.; Bahvalov, S. V.; Fisochenko, O. N.; Berestneva, E. V.
2017-01-01
Following the new-generation standards requires forming a task list connected with the planning and organization of the academic process and with the formation of the structure and content of degree programmes. Even when planning the structure and content of an academic process, one meets problems concerning the need to assess the correspondence between degree programmes and the demands of educational and professional standards, and to consider today's job market and students' demands. The paper presents examples of ontological simulation for solving problems of educational process organization in a higher educational institution and describes the development of the models. The article presents two examples: ontological simulation for planning an educational process in a higher educational institution, and ontological simulation for describing the competences of an IT specialist. The paper concludes that the application of ontologies is promising for the formalization of educational process organization in a higher educational institution.
Thiermann, A
2004-04-01
To maximise the benefits of globalisation, countries and their stakeholders must become familiar with and adhere to the rights and obligations set out by the World Trade Organization under the Agreement on Sanitary and Phytosanitary Measures. Furthermore, for trade in animals and animal products, they must adhere to the standards, guidelines and recommendations established by the OIE (World Organisation for Animal Health), which also encourages participation of countries in the standard-setting process. Only after implementing these requirements and strengthening veterinary infrastructures and surveillance and monitoring systems will countries be able to fully benefit from the new international trade rules.
Globalization, international trade and animal health: the new roles of OIE.
Thiermann, Alejandro B
2005-02-01
In order for countries and their stakeholders to maximize the benefits of globalization, they must become familiar with, and must adhere to, the rights and obligations set out by the World Trade Organization (WTO) under the Agreement on Sanitary and Phytosanitary Measures (SPS). For the purpose of trade in animals and animal products, they must also adhere to the standards, guidelines and recommendations established by the World Organisation for Animal Health (OIE). Countries are also encouraged to participate in this standard-setting process of the OIE. Only after implementing these requirements and after strengthening the veterinary infrastructures and their surveillance and monitoring systems will countries be able to fully benefit from these new international trade rules.
Byrne, Karen M; Levy, Kimberly Y; Reese, Erika M
2016-05-01
Maintaining an in-compliance clinical laboratory takes continuous awareness and review of standards, regulations, and best practices. A strong quality assurance program and well-informed leaders who maintain professional networks can aid in this necessary task. This article will discuss a process that laboratories can follow to interpret, understand, and comply with the rules and standards set by laboratory accreditation bodies. Published by Oxford University Press on behalf of the American Society for Clinical Pathology, 2016. This work is written by US Government employees and is in the public domain in the United States.
Processing Techniques for Intelligibility Improvement to Speech with Co-Channel Interference.
1983-09-01
processing was found to be always less than in the original unprocessed co-channel signal; also, as the length of the comb filter increased, the... (Signal Technology Inc., Goleta, CA; B. A. Hanson et al.; Report R.83-225, September 1983)
Setting up an ethics of ecosystem research structure based on the precautionary principle.
Farmer, Michael C
2013-01-01
Ethical practices in ecological field research differ from those in laboratory research in more than the technical setting and the important distinction between population-level and individual-based concerns. The number of stakeholders affected by the conduct of field research is far larger; private landholders, public water utilities, public land managers, local industries, and communities large and small are only some of those who may be impacted. As research review boards move to establish specific ethical practices for field biologists, the process of identifying appropriate standards will affect the degree to which research will ultimately be disrupted. Standards that lead to research protocols that alienate key interests are not likely to be sustainable. Already, standards that have conflicted with the primary values of a key interest have resulted in disruptions to research and scientific progress. One way to manage this problem of deeply competing interests is to avoid the deepest offenses to any relevant interest group in the design of a proposed study. This is an application of the precautionary principle and is likely to generate a more sustainable balance among competing interests. Unfortunately, this process is also likely to be a never-ending, consensus-seeking process. Fortunately, scientists can have enormous influence on the process if they choose to engage in it early. If scientists use their expertise to function as honest brokers among affected interests, their own interest in scientific research progress is likely to be better met.
Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project
NASA Technical Reports Server (NTRS)
Wallick, Michael N.; Mittman, David S.; Shams, Khawaja, S.; Bachmann, Andrew G.; Ludowise, Melissa
2013-01-01
This software simplifies the process of having to set up an Eclipse IDE programming environment for the members of the cross-NASA center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software automatically has the source code repositories and other vital information and settings included. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error-prone. This software is built once by a single user and tested, allowing other developers to simply download and use the software.
ARM Data File Standards Version: 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kehoe, Kenneth; Beus, Sherman; Cialella, Alice
2014-04-01
The Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth's atmosphere in diverse climate regimes. The result is a diverse set of data containing observational and derived data, currently accumulating at a rate of 30 TB of data and 150,000 different files per month (http://www.archive.arm.gov/stats/storage2.html). Continuing the current processing while scaling it to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable the development of automated analysis and discovery tools for the ever-growing volumes of data. They will also enable consistent analysis of the multiyear data, allow for the development of automated monitoring and data health status tools, and facilitate the development of future capabilities for delivering data on demand that can be tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy that includes required and recommended standards.
Federal COBOL Compiler Testing Service Compiler Validation Request Information.
1977-05-09
background of the Federal COBOL Compiler Testing Service, which was set up by a memorandum of agreement between the National Bureau of Standards and the...Federal Standard, and the requirement of COBOL compiler validation in the procurement process. It also contains a list of all software products...produced by the Software Development Division in support of the FCCTS as well as the Validation Summary Reports produced as a result of discharging the
Setting Emissions Standards Based on Technology Performance
In setting national emissions standards, EPA sets emissions performance levels rather than mandating use of a particular technology. The law mandates that EPA use numerical performance standards whenever feasible in setting national emissions standards.
Silva, Raquel V S; Tessarolo, Nathalia S; Pereira, Vinícius B; Ximenes, Vitor L; Mendes, Fábio L; de Almeida, Marlon B B; Azevedo, Débora A
2017-03-01
The elucidation of bio-oil composition is important to evaluate the processes of biomass conversion and its upgrading, and to suggest the proper use for each sample. Comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry (GC×GC-TOFMS) is a widely applied analytical approach for bio-oil investigation due to the higher separation and resolution capacity of this technique. This work addresses the issue of analytical performance to assess the comprehensive characterization of real bio-oil samples via GC×GC-TOFMS. The approach was applied to the individual quantification of compounds of real thermal (PWT), catalytic process (CPO), and hydrodeoxygenation process (HDO) bio-oils. Quantification was performed with reliability using the analytical curves of oxygenated and hydrocarbon standards as well as the deuterated internal standards. The limit of quantification was set at 1 ng µL⁻¹ for major standards, except for hexanoic acid, which was set at 5 ng µL⁻¹. The GC×GC-TOFMS method provided good precision (<10%) and excellent accuracy (recovery range of 70-130%) for the quantification of individual hydrocarbons and oxygenated compounds in real bio-oil samples. Sugars, furans, and alcohols appear as the major constituents of the PWT, CPO, and HDO samples, respectively. In order to obtain bio-oils with better quality, the catalytic pyrolysis process may be a better option than hydrogenation due to the effective reduction of oxygenated compound concentrations and the lower cost of the process, when hydrogen is not required to promote deoxygenation in the catalytic pyrolysis process. Copyright © 2016 Elsevier B.V. All rights reserved.
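A minimal sketch of quantification with an analytical curve and an internal standard, as described above: fit a linear calibration of the analyte/internal-standard response ratio against concentration, invert it for an unknown sample, and suppress results below the limit of quantification. The compound responses and peak areas are invented; only the 1 ng µL⁻¹ LOQ value echoes the abstract.

import numpy as np

# Calibration: analyte/internal-standard peak-area ratio at known levels (ng/µL).
conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0])
ratio = np.array([0.021, 0.105, 0.213, 0.529, 1.048])
slope, intercept = np.polyfit(conc, ratio, 1)  # least-squares calibration line

def quantify(area_analyte, area_internal_std, loq=1.0):
    c = (area_analyte / area_internal_std - intercept) / slope
    return c if c >= loq else None  # below LOQ: do not report

print(quantify(area_analyte=4.2e5, area_internal_std=1.0e6))  # ~20 ng/µL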
Introducing Python tools for magnetotellurics: MTpy
NASA Astrophysics Data System (ADS)
Krieger, L.; Peacock, J.; Inverarity, K.; Thiel, S.; Robertson, K.
2013-12-01
Within the framework of geophysical exploration techniques, the magnetotelluric method (MT) is relatively immature: it is still not as widely spread as other geophysical methods like seismology, and its processing schemes and data formats are not thoroughly standardized. As a result, the file handling and processing software within the academic community is mainly based on a loose collection of codes, which are sometimes highly adapted to the respective local specifications. Although tools for the estimation of the frequency dependent MT transfer function, as well as inversion and modelling codes, are available, the standards and software for handling MT data are generally not unified throughout the community. To overcome problems that arise from missing standards, and to simplify the general handling of MT data, we have developed the software package "MTpy", which allows the handling, processing, and imaging of magnetotelluric data sets. It is written in Python and the code is open-source. The setup of this package follows the modular approach of successful software packages like GMT or Obspy. It contains sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides pure Python classes and functions, MTpy provides wrappers and convenience scripts to call external software, e.g. modelling and inversion codes. Even though still under development, MTpy already contains ca. 250 functions that work on raw and preprocessed data. However, as our aim is not to produce a static collection of software, we rather introduce MTpy as a flexible framework, which will be dynamically extended in the future. It then has the potential to help standardise processing procedures and at the same time be a versatile supplement for existing algorithms. We introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing utilising MTpy on an example data set collected over a geothermal exploration site in South Australia. Figure: workflow of MT data processing. Within the structural diagram, the MTpy sub-packages are shown in red (time series data processing), green (handling of EDI files and impedance tensor data), yellow (connection to modelling/inversion algorithms), black (impedance tensor interpretation, e.g. by phase tensor calculations), and blue (generation of visual representations, e.g. pseudo sections or resistivity models).
German MedicalTeachingNetwork (MDN) implementing national standards for teacher training.
Lammerding-Koeppel, M; Ebert, T; Goerlitz, A; Karsten, G; Nounla, C; Schmidt, S; Stosch, C; Dieter, P
2016-01-01
An increasing demand for proof of professionalism in higher education drives quality assurance (QA) and improvement in medical education. A wide range of teacher trainings is available to medical staff in Germany. Cross-institutional approval of individual certificates is usually a difficult and time-consuming task for institutions, and in cases of non-acceptance it may hinder medical teachers' professional mobility. The faculties of medicine aimed to develop a comprehensive national framework, to promote standards for formal faculty development programmes across institutions, and to foster the professionalization of medical teaching. Addressing the above challenges in a joint approach, the faculties set up the national MedicalTeacherNetwork (MDN). Great importance is attached to working out nationally concerted standards for faculty development and an agreed-upon quality control process across Germany. Medical teachers benefit from the portability of faculty development credentials from one faculty of medicine to another within the MDN system. The report outlines the process of setting up the MDN and the national faculty development programme in Germany. Success factors, strengths and limitations are discussed from institutional, individual and general perspectives. Faculties engaged in similar developments might be encouraged to transfer the MDN concept to their countries.
Barnard, Juliana G; Dempsey, Amanda F; Brewer, Sarah E; Pyrzanowski, Jennifer; Mazzoni, Sara E; O'Leary, Sean T
2017-01-01
Many young and middle-aged women receive their primary health care from their obstetrician-gynecologists. A recent change to vaccination recommendations during pregnancy has forced the integration of new clinical processes at obstetrician-gynecology practices. Evidence-based best practices for vaccination delivery include the establishment of vaccination standing orders. As part of an intervention to increase adoption of evidence-based vaccination strategies for women in safety-net and private obstetrician-gynecology settings, we conducted a qualitative study to identify the facilitators and barriers experienced by obstetrician-gynecology sites when establishing vaccination standing orders. At 6 safety-net and private obstetrician-gynecology practices, 51 semistructured interviews were completed by trained qualitative researchers over 2 years with clinical staff and vaccination program personnel. Standardized qualitative research methods were used during data collection and team-based data analysis to identify major themes and subthemes within the interview data. All study practices achieved partial to full implementation of vaccine standing orders for human papillomavirus, tetanus-diphtheria-pertussis, and influenza vaccines. Facilitating factors for vaccine standing order adoption included process standardization, acceptance of a continual modification process, and staff training. Barriers to vaccine standing order adoption included practice- and staff-level competing demands, pregnant women's preference for medical providers to discuss vaccine information with them, and staff hesitation in determining HPV vaccine eligibility. With guidance and commitment to integration of new processes, obstetrician-gynecology practices are able to establish vaccine standing orders for pregnant and nonpregnant women. Attention to certain process barriers can aid the adoption of processes to support the delivery of vaccinations in the obstetrician-gynecology practice setting, and provide access to preventive health care for many women. Copyright © 2016 Elsevier Inc. All rights reserved.
[The standardization of medical care and the training of medical personnel].
Korbut, V B; Tyts, V V; Boĭshenko, V A
1997-09-01
The training of medical specialists at all levels (medical orderly, doctor's assistant, general practitioner, doctor) should be based on medical care standards. Preliminary studies in the field of military medicine standards have demonstrated that the medical service of the Armed Forces of Russia needs standards for medical resources, standards for structure and organization, and technology standards. Standards for the resources of the military medical service should reflect the requirements for the qualification of all medical specialists; for the equipment and materiel of medical set-ups; and for field medical systems, drugs, etc. Standards for structure and organization should include requirements for command and control systems in military formations' and task forces' medical services and their information support; for health-care and evacuation functions; and for sanitary control, anti-epidemic measures and personnel health protection. The development of technology standards could improve and regulate health care procedures in the process of evacuation. The development of standards will help solve the problem of the database for the military medicine education system and medical research.
Aaltonen, T.
2015-03-17
Production of the Υ(1S) meson in association with a vector boson is a rare process in the standard model, with a cross section predicted to be below the sensitivity of the Tevatron. Observation of this process could signify contributions not described by the standard model or reveal limitations of the current nonrelativistic quantum-chromodynamic models used to calculate the cross section. We perform a search for this process using the full Run II data set collected by the CDF II detector, corresponding to an integrated luminosity of 9.4 fb⁻¹. Our search considers the Υ→μμ decay and the decay of the W and Z bosons into muons and electrons. In these purely leptonic decay channels, we observe one ΥW candidate with an expected background of 1.2±0.5 events, and one ΥZ candidate with an expected background of 0.1±0.1 events. Both observations are consistent with the predicted background contributions. The resulting upper limits on the cross section for Υ+W/Z production are the most sensitive reported from a single experiment and place restrictions on potential contributions from non-standard-model physics.
Automated Data Submission for the Data Center
NASA Astrophysics Data System (ADS)
Wright, D.; Beaty, T.; Wei, Y.; Shanafield, H.; Santhana Vannan, S. K.
2014-12-01
Data centers struggle with difficulties related to data submission. Data are acquired through many avenues by many people. Many data submission activities involve intensive manual processes. During the submission process, data end up on varied storage devices. The situation can easily become chaotic. Collecting information on the status of pending data sets is arduous. For data providers, the submission process can be inconsistent and confusing. Scientists generally provide data from previous projects, and archival can be a low priority. Incomplete or poor documentation accompanies many data sets. However, complicated questionnaires deter busy data providers. At the ORNL DAAC, we have semi-automated the data set submission process to create a uniform data product and provide a consistent data provider experience. The formalized workflow makes archival faster for the data center and data set submission easier for data providers. Software modules create a flexible, reusable submission package. Formalized data set submission provides several benefits to the data center. A single data upload area provides one point of entry and ensures data are stored in a consistent location. A central dashboard records pending data set submissions in a single table and simplifies reporting. Flexible role management allows team members to readily coordinate and increases efficiency. Data products and metadata become uniform and easily maintained. As data and metadata standards change, modules can be modified or re-written without affecting workflow. While each data center has unique challenges, the data ingestion process is generally the same: get data from the provider, scientist, or project and capture metadata pertinent to that data. The ORNL DAAC data set submission workflow and software modules can be reused entirely or in part by other data centers looking for a data set submission solution. These data set submission modules will be available on NASA's Earthdata Code Collaborative and by request.
SU-B-213-02: Development of CAMPEP Standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckham, W.
2015-06-15
The North American medical physics community validates the education received by medical physicists and the clinical qualifications for medical physicists through accreditation of educational programs and certification of medical physicists. Medical physics educational programs (graduate education and residency education) are accredited by the Commission on Accreditation of Medical Physics Education Programs (CAMPEP), whereas medical physicists are certified by several organizations, the most familiar of which is the American Board of Radiology (ABR). In order for an educational program to become accredited or a medical physicist to become certified, the applicant must meet certain specified standards set by the appropriate organization. In this Symposium, representatives from both CAMPEP and the ABR will describe the process by which standards are established as well as the process by which qualifications of candidates for accreditation or certification are shown to be compliant with these standards. The Symposium will conclude with a panel discussion. Learning Objectives: Recognize the difference between accreditation of an educational program and certification of an individual Identify the two organizations primarily responsible for these tasks Describe the development of educational standards Describe the process by which examination questions are developed GS is Executive Secretary of CAMPEP.
SU-B-213-05: Development of ABR Certification Standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seibert, J.
2015-06-15
The North American medical physics community validates the education received by medical physicists and the clinical qualifications for medical physicists through accreditation of educational programs and certification of medical physicists. Medical physics educational programs (graduate education and residency education) are accredited by the Commission on Accreditation of Medical Physics Education Programs (CAMPEP), whereas medical physicists are certified by several organizations, the most familiar of which is the American Board of Radiology (ABR). In order for an educational program to become accredited or a medical physicist to become certified, the applicant must meet certain specified standards set by the appropriate organization. In this Symposium, representatives from both CAMPEP and the ABR will describe the process by which standards are established as well as the process by which qualifications of candidates for accreditation or certification are shown to be compliant with these standards. The Symposium will conclude with a panel discussion. Learning Objectives: Recognize the difference between accreditation of an educational program and certification of an individual Identify the two organizations primarily responsible for these tasks Describe the development of educational standards Describe the process by which examination questions are developed GS is Executive Secretary of CAMPEP.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1996-10-01
This programmatic environmental impact statement (PEIS) was prepared for the Uranium Mill Tailings Remedial Action (UMTRA) Ground Water Project to comply with the National Environmental Policy Act (NEPA). This PEIS provides an analysis of the potential impacts of the alternatives and ground water compliance strategies as well as potential cumulative impacts. On November 8, 1978, Congress enacted the Uranium Mill Tailings Radiation Control Act (UMTRCA) of 1978, a Public Law codified at 42 USC §7901 et seq. Congress found that uranium mill tailings " ... may pose a potential and significant radiation health hazard to the public, and that every reasonable effort should be made to provide for stabilization, disposal, and control in a safe, and environmentally sound manner of such tailings in order to prevent or minimize other environmental hazards from such tailings." Congress authorized the Secretary of Energy to designate inactive uranium processing sites for remedial action by the U.S. Department of Energy (DOE). Congress also directed the U.S. Environmental Protection Agency (EPA) to set the standards to be followed by the DOE for this process of stabilization, disposal, and control. On January 5, 1983, EPA published standards (40 CFR Part 192) for the disposal and cleanup of residual radioactive materials. On September 3, 1985, the U.S. Court of Appeals for the Tenth Circuit set aside and remanded to EPA the ground water provisions of the standards. The EPA proposed new standards to replace the remanded sections and changed other sections of 40 CFR Part 192. These proposed standards were published in the Federal Register on September 24, 1987 (52 FR 36000). Section 108 of the UMTRCA requires that DOE comply with EPA's proposed standards in the absence of final standards. The Ground Water Project was planned under the proposed standards. On January 11, 1995, EPA published the final rule, with which the DOE must now comply. The PEIS and the Ground Water Project are in accordance with the final standards. The EPA reserves the right to modify the ground water standards, if necessary, based on changes in EPA drinking water standards. Appendix A contains a copy of the 1983 EPA ground water compliance standards, the 1987 proposed changes to the standards, and the 1995 final rule. Under UMTRA, DOE is responsible for bringing the designated processing sites into compliance with the EPA ground water standards and complying with all other applicable standards and requirements. The U.S. Nuclear Regulatory Commission (NRC) must concur with DOE's actions. States are full participants in the process. The DOE also must consult with any affected Indian tribes and the Bureau of Indian Affairs. Uranium processing activities at most of the inactive mill sites resulted in the contamination of ground water beneath and, in some cases, downgradient of the sites. This contaminated ground water often has elevated levels of constituents such as, but not limited to, uranium and nitrates. The purpose of the UMTRA Ground Water Project is to eliminate or reduce to acceptable levels the potential health and environmental consequences of milling activities by meeting the EPA ground water standards.
Forrest, Sarah M; Challis, John H; Winter, Samantha L
2014-06-01
Approximate entropy (ApEn) is frequently used to identify changes in the complexity of isometric force records with ageing and disease. Different signal acquisition and processing parameters have been used, making comparison or confirmation of results difficult. This study determined the effect of sampling and parameter choices by examining changes in ApEn values across a range of submaximal isometric contractions of the first dorsal interosseus. Reducing the sample rate by decimation changed both the value and pattern of ApEn values dramatically. The pattern of ApEn values across the range of effort levels was not sensitive to the filter cut-off frequency, or to the criterion used to extract the section of data for analysis. The complexity increased with increasing effort level using a fixed 'r' value (which accounts for measurement noise) but decreased with increasing effort level when 'r' was set to 0.1 of the standard deviation of force. It is recommended that isometric force records be sampled at frequencies >200 Hz, that the template length ('m') be set to 2, and that 'r' be set to the measurement system noise or to 0.1 SD, depending on the physiological process to be distinguished. It is demonstrated that changes in ApEn across effort levels are related to changes in force gradation strategy. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.
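For readers who want to reproduce the parameter sensitivity described above, the following is a minimal NumPy rendering of the standard (Pincus) approximate-entropy calculation. The test signal and the self-match convention are illustrative assumptions, not the authors' code.

    # Minimal ApEn sketch; parameter choices follow the abstract (m = 2,
    # 'r' as an absolute noise level or 0.1 x SD of the force record).
    import numpy as np

    def apen(x, m=2, r=None):
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.1 * x.std()              # 'r' relative to signal SD
        def phi(m):
            n = len(x) - m + 1
            # Embed the series: rows are length-m templates.
            emb = np.array([x[i:i + m] for i in range(n)])
            # Chebyshev distance between every pair of templates.
            d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
            c = (d <= r).sum(axis=1) / n   # self-matches included, as in Pincus
            return np.log(c).mean()
        return phi(m) - phi(m + 1)

    # Example: ApEn of a noisy isometric force plateau sampled above 200 Hz.
    force = 10 + 0.2 * np.random.randn(2000)
    print(apen(force, m=2))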
An accelerated training method for back propagation networks
NASA Technical Reports Server (NTRS)
Shelton, Robert O. (Inventor)
1993-01-01
The principal objective is to provide a training procedure for a feed-forward, back-propagation neural network which greatly accelerates the training process. A set of orthogonal singular vectors is determined from the input matrix such that the standard deviations of the projections of the input vectors along these singular vectors, as a set, are substantially maximized, thus providing an optimal means of presenting the input data. Novelty exists in the method of extracting, from the set of input data, a set of features which can serve to represent the input data in a simplified manner, thus greatly reducing the time and expense of training the system.
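A compact sketch of the preprocessing idea in this abstract: compute singular vectors of the centered input matrix and project the inputs onto the leading directions, which maximize the standard deviations of the projections. This is ordinary SVD/PCA machinery applied to invented toy data, not the patented procedure itself.

    # SVD-based feature extraction before network training (illustrative).
    import numpy as np

    X = np.random.randn(500, 64)            # 500 training inputs, 64 raw features
    Xc = X - X.mean(axis=0)                 # center so variance = spread of projections
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    k = 8                                    # keep the k directions with largest spread
    features = Xc @ Vt[:k].T                 # simplified representation fed to the network
    print(features.shape, s[:k] / np.sqrt(len(X) - 1))  # per-direction standard deviations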
Improved Protocols for Illumina Sequencing
Bronner, Iraad F.; Quail, Michael A.; Turner, Daniel J.; Swerdlow, Harold
2013-01-01
In this unit, we describe a set of improvements we have made to the standard Illumina protocols to make the sequencing process more reliable in a high-throughput environment, reduce amplification bias, narrow the distribution of insert sizes, and reliably obtain high yields of data. PMID:19582764
Gallo, Stephen A; Carpenter, Afton S; Glisson, Scott R
2013-01-01
Teleconferencing as a setting for scientific peer review is an attractive option for funding agencies, given the substantial environmental and cost savings. Despite this, there is a paucity of published data validating teleconference-based peer review compared to the face-to-face process. Our aim was to conduct a retrospective analysis of scientific peer review data to investigate whether review setting has an effect on review process and outcome measures. We analyzed reviewer scoring data from a research program that had recently modified the review setting from face-to-face to a teleconference format with minimal changes to the overall review procedures. This analysis included approximately 1600 applications over a 4-year period: two years of face-to-face panel meetings compared to two years of teleconference meetings. The average overall scientific merit scores, score distribution, standard deviations and reviewer inter-rater reliability statistics were measured, as well as reviewer demographics and length of time discussing applications. The data indicate that few differences are evident between face-to-face and teleconference settings with regard to average overall scientific merit score, scoring distribution, standard deviation, reviewer demographics or inter-rater reliability. However, some difference was found in the discussion time. These findings suggest that most review outcome measures are unaffected by review setting, which would support the trend of using teleconference reviews rather than face-to-face meetings. However, further studies are needed to assess any correlations among discussion time, application funding and the productivity of funded research projects.
The Gel Electrophoresis Markup Language (GelML) from the Proteomics Standards Initiative
Gibson, Frank; Hoogland, Christine; Martinez-Bartolomé, Salvador; Medina-Aunon, J. Alberto; Albar, Juan Pablo; Babnigg, Gyorgy; Wipat, Anil; Hermjakob, Henning; Almeida, Jonas S; Stanislaus, Romesh; Paton, Norman W; Jones, Andrew R
2011-01-01
The Human Proteome Organisation’s Proteomics Standards Initiative (HUPO-PSI) has developed the GelML data exchange format for representing gel electrophoresis experiments performed in proteomics investigations. The format closely follows the reporting guidelines for gel electrophoresis, which are part of the Minimum Information About a Proteomics Experiment (MIAPE) set of modules. GelML supports the capture of metadata (such as experimental protocols) and data (such as gel images) resulting from gel electrophoresis so that laboratories can be compliant with the MIAPE Gel Electrophoresis guidelines, while allowing such data sets to be exchanged or downloaded from public repositories. The format is sufficiently flexible to capture data from a broad range of experimental processes, and complements other PSI formats for mass spectrometry data and the results of protein and peptide identifications to capture entire gel-based proteome workflows. GelML has resulted from the open standardisation process of PSI consisting of both public consultation and anonymous review of the specifications. PMID:20677327
Bölte, Sven; de Schipper, Elles; Holtmann, Martin; Karande, Sunil; de Vries, Petrus J; Selb, Melissa; Tannock, Rosemary
2014-12-01
In the study of health and quality of life in attention deficit/hyperactivity disorder (ADHD), it is of paramount importance to include assessment of functioning. The International Classification of Functioning, Disability and Health (ICF) provides a comprehensive, universally accepted framework for the description of functioning in relation to health conditions. In this paper, the authors outline the process to develop ICF Core Sets for ADHD. ICF Core Sets are subgroups of ICF categories selected to capture the aspects of functioning that are most likely to be affected in specific disorders. The ICF categories that will be included in the ICF Core Sets for ADHD will be determined at an ICF Core Set Consensus Conference, wherein evidence from four preliminary studies (a systematic review, an expert survey, a patient and caregiver qualitative study, and a clinical cross-sectional study) will be integrated. Comprehensive and Brief ICF Core Sets for ADHD will be developed with the goal of providing useful standards for research and clinical practice, and to generate a common language for the description of functioning in ADHD in different areas of life and across the lifespan.
Educators' insights in using chronicle diabetes: a data management system for diabetes education.
Wang, Jing; Siminerio, Linda M
2013-01-01
Diabetes educators lack data systems to monitor diabetes self-management education processes and programs. The purpose of this study was to explore diabetes educators' insights in using a diabetes education data management program: the Chronicle Diabetes system. We conducted one focus group with 8 diabetes educators who use the Chronicle system in western Pennsylvania. The focus group was audiotaped and transcribed verbatim. Themes were categorized according to system facilitators and barriers in using Chronicle. Educators reported 4 facilitator and 4 barrier features. System facilitators include (1) ability to extract data from Chronicle for education program recognition, (2) a central location for collecting and documenting all patient and education data, (3) capability to monitor behavioral goal setting and clinical outcomes, and (4) use of a patient snapshot report that automatically summarizes behavioral goal setting and an education plan. Barriers reported are (1) initially time-consuming data entry, (2) Health Insurance Portability and Accountability Act privacy concerns for e-mailing or downloading reports, (3) need for special features (e.g., ability to attach a food diary), and (4) need to enhance existing features to standardize the goal-setting process and incorporate psychosocial content. Educators favor capabilities for documenting program requirements, goal setting, and patient summaries. Barriers that need to be overcome are the amount of time needed for data entry, privacy, and special features. Diabetes educators conclude that a data management system such as Chronicle facilitates the education process and affords ease in documentation of meeting diabetes self-management education standards and recognition requirements.
NASA Astrophysics Data System (ADS)
Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.
2010-12-01
Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data, and limit data users’ abilities to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create the data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as the Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools publicizes the data sets and makes them more available across the Internet. The ORNL DAAC has also created a Web-based graphical user interface called the Spatial Data Access Tool (SDAT) that utilizes OGC Web service standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize a data set prior to download. Google Earth visualizations of the data sets are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set showed a roughly 10-fold increase in downloads through OGC Web services in comparison to the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
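As a hedged illustration of consuming such services programmatically, the snippet below uses OWSLib, a widely used Python client for OGC services. The endpoint URL and layer name are placeholders, not actual ORNL DAAC identifiers, and the owslib package must be installed.

    # Fetch a map from a (hypothetical) OGC WMS endpoint with OWSLib.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/wms", version="1.1.1")  # placeholder URL
    print(list(wms.contents))                 # layers advertised by the service
    img = wms.getmap(layers=["example_layer"],  # placeholder layer name
                     srs="EPSG:4326",
                     bbox=(-180, -90, 180, 90),
                     size=(720, 360),
                     format="image/png")
    open("map.png", "wb").write(img.read())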
Fulga, Netta
2013-06-01
Quality management and accreditation in the analytical laboratory setting are developing rapidly and becoming the standard worldwide. Quality management refers to all the activities used by organizations to ensure product or service consistency. Accreditation is a formal recognition by an authoritative regulatory body that a laboratory is competent to perform examinations and report results. The Motherisk Drug Testing Laboratory is licensed to operate at the Hospital for Sick Children in Toronto, Ontario. The laboratory performs toxicology tests of hair and meconium samples for research and clinical purposes. Most of the samples are involved in a chain of custody cases. Establishing a quality management system and achieving accreditation became mandatory by legislation for all Ontario clinical laboratories since 2003. The Ontario Laboratory Accreditation program is based on International Organization for Standardization 15189-Medical laboratories-Particular requirements for quality and competence, an international standard that has been adopted as a national standard in Canada. The implementation of a quality management system involves management commitment, planning and staff education, documentation of the system, validation of processes, and assessment against the requirements. The maintenance of a quality management system requires control and monitoring of the entire laboratory path of workflow. The process of transformation of a research/clinical laboratory into an accredited laboratory, and the benefits of maintaining an effective quality management system, are presented in this article.
NASA Astrophysics Data System (ADS)
Wibawa, Teja A.; Lehodey, Patrick; Senina, Inna
2017-02-01
Geo-referenced catch and fishing effort data of the bigeye tuna fisheries in the Indian Ocean over 1952-2014 were analyzed and standardized to facilitate population dynamics modeling studies. During this 62-year historical period of exploitation, many changes occurred both in the fishing techniques and in the monitoring of activity. This study includes a series of processing steps used for standardization of spatial resolution, conversion and standardization of catch and effort units, raising of geo-referenced catch to the nominal catch level, screening and correction of outliers, and detection of major catchability changes over long time series of fishing data, i.e., the Japanese longline fleet operating in the tropical Indian Ocean. A total of 30 fisheries were finally determined from the longline, purse seine and other-gears data sets, of which 10 longline and 4 purse seine fisheries represented 96 % of the whole historical geo-referenced catch. Nevertheless, one-third of the total nominal catch is still not included due to a total lack of geo-referenced information and would need to be processed separately, according to the requirements of the study. The geo-referenced records of catch, fishing effort and associated length frequency samples of all fisheries are available at doi:10.1594/PANGAEA.864154.
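Two of the processing steps listed above, unit conversion of catch and screening of outliers, can be sketched with pandas as follows. The column names, conversion factor and the 3-sigma rule are illustrative assumptions, not the authors' exact procedure.

    # Toy catch/effort standardization: convert catch-in-number to tonnes,
    # then flag outliers beyond 3 standard deviations (values invented).
    import pandas as pd

    df = pd.DataFrame({
        "year": [1995, 1995, 1996, 1996],
        "catch_n": [1200, 15, 980, 40],        # catch in number of fish
        "mean_weight_kg": [45.0, 45.0, 47.0, 47.0],
    })
    df["catch_t"] = df["catch_n"] * df["mean_weight_kg"] / 1000.0  # to tonnes

    z = (df["catch_t"] - df["catch_t"].mean()) / df["catch_t"].std()
    df["outlier"] = z.abs() > 3                # screen for later correction
    print(df)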
American National Standards: The Consensus Process
NASA Technical Reports Server (NTRS)
Schafer, Thom
2000-01-01
Since the early 20th Century, technical and professional societies have developed standards within their areas of expertise addressing aspects of their industries which they feel would benefit from a degree of standardization. From the beginning, the use of these standards was strictly voluntary. It did not take jurisdictional authorities long, however, to recognize that application of these voluntary standards enhanced public safety, as well as leveling the playing field in trade. Hence, laws were passed mandating their use. Purchasers of goods and services also recognized the advantages of standardization, and began requiring the use of standards in their procurement contracts. But how do jurisdictions and purchasers know that the standard they are mandating is a broad-based industry standard, or a narrowly focused set of rules which only apply to one company or institution, thereby giving them an unfair advantage? The answer is "consensus", and a unified approach in achieving it.
Cantrill, Richard C
2008-01-01
Methods of analysis for products of modern biotechnology are required for national and international trade in seeds, grain and food in order to meet the labeling or import/export requirements of different nations and trading blocs. Although many methods were developed by the originators of transgenic events, governments, universities, and testing laboratories, trade is less complicated if there exists a set of international consensus-derived analytical standards. In any analytical situation, multiple methods may exist for testing for the same analyte. These methods may be supported by regional preferences and regulatory requirements. However, tests need to be sensitive enough to determine low levels of these traits in commodity grain for regulatory purposes and also to indicate the purity of seeds containing these traits. The International Organization for Standardization (ISO) and its European counterpart have worked to produce a suite of standards through open, balanced and consensus-driven processes. Presently, these standards are approaching the time for their first review. In fact, ISO 21572, the "protein standard", has already been circulated for systematic review. In order to expedite the review and revision of the nucleic acid standards, an ISO Technical Specification (ISO/TS 21098) was drafted to set the criteria for the inclusion of precision data from collaborative studies into the annexes of these standards.
Compliance with minimum information guidelines in public metabolomics repositories
Spicer, Rachel A.; Salek, Reza; Steinbeck, Christoph
2017-01-01
The Metabolomics Standards Initiative (MSI) guidelines were first published in 2007. These guidelines provided reporting standards for all stages of metabolomics analysis: experimental design, biological context, chemical analysis and data processing. Since 2012, a series of public metabolomics databases and repositories, which accept the deposition of metabolomic datasets, have arisen. In this study, the compliance of 399 public data sets from four major metabolomics data repositories with the biological-context MSI reporting standards was evaluated. None of the reporting standards was complied with in every publicly available study, and adherence rates varied greatly, from 0 to 97%. The plant minimum reporting standards were the most complied with, and the microbial and in vitro standards were the least. Our results indicate the need for reassessment and revision of the existing MSI reporting standards. PMID:28949328
Developing core outcome measurement sets for clinical trials: OMERACT filter 2.0.
Boers, Maarten; Kirwan, John R; Wells, George; Beaton, Dorcas; Gossec, Laure; d'Agostino, Maria-Antonietta; Conaghan, Philip G; Bingham, Clifton O; Brooks, Peter; Landewé, Robert; March, Lyn; Simon, Lee S; Singh, Jasvinder A; Strand, Vibeke; Tugwell, Peter
2014-07-01
Lack of standardization of outcome measures limits the usefulness of clinical trial evidence to inform health care decisions. This can be addressed by agreeing on a minimum core set of outcome measures per health condition, containing measures relevant to patients and decision makers. Since 1992, the Outcome Measures in Rheumatology (OMERACT) consensus initiative has successfully developed core sets for many rheumatologic conditions, actively involving patients since 2002. Its expanding scope required an explicit formulation of its underlying conceptual framework and process. Literature searches and iterative consensus process (surveys and group meetings) of stakeholders including patients, health professionals, and methodologists within and outside rheumatology. To comprehensively sample patient-centered and intervention-specific outcomes, a framework emerged that comprises three core "Areas," namely Death, Life Impact, and Pathophysiological Manifestations; and one strongly recommended Resource Use. Through literature review and consensus process, core set development for any specific health condition starts by identifying at least one core "Domain" within each of the Areas to formulate the "Core Domain Set." Next, at least one applicable measurement instrument for each core Domain is identified to formulate a "Core Outcome Measurement Set." Each instrument must prove to be truthful (valid), discriminative, and feasible. In 2012, 96% of the voting participants (n=125) at the OMERACT 11 consensus conference endorsed this model and process. The OMERACT Filter 2.0 explicitly describes a comprehensive conceptual framework and a recommended process to develop core outcome measurement sets for rheumatology likely to be useful as a template in other areas of health care. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Tissue allograft coding and traceability in USM Tissue Bank, Malaysia.
Sheikh Ab Hamid, Suzina; Abd Rahman, Muhamad Nor Firdaus
2010-11-01
In Malaysia, tissue banking activities began at the Universiti Sains Malaysia (USM) Tissue Bank in the early 1990s. Since then, a few other bone banks have been set up in other government hospitals and institutions. However, these banks are not governed by a national authority. In addition, there is no requirement set by the national regulatory authority on coding and traceability for donated human tissues for transplantation. Hence, the USM Tissue Bank has taken the initiative to adopt a system that enables the traceability of tissues between the donor, the processed tissue and the recipient, based on international standards for tissue banks. The traceability trail has been effective, and the bank is certified as compliant with the international standard ISO 9001:2008.
NASA Astrophysics Data System (ADS)
Dell'Acqua, Fabio; Iannelli, Gianni Cristian; Kerekes, John; Lisini, Gianni; Moser, Gabriele; Ricardi, Niccolo; Pierce, Leland
2016-08-01
The issue of homogeneity in performance assessment of proposed algorithms for information extraction is generally perceived in the Earth Observation (EO) domain as well. Different authors propose different datasets to test their algorithms, and it is frequently difficult for the reader to assess which is better for a specific application, given the wide variability in test sets, which makes direct comparison of, e.g., accuracy values less meaningful than one would desire. With our work, we make a modest contribution to easing this problem by making it possible to automatically distribute a limited set of possible "standard" open datasets, together with some ground truth information, and to automatically assess the processing results provided by users.
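The kind of automated assessment the paper argues for reduces, in the simplest case, to scoring a submitted classification against ground truth on a shared data set. A toy sketch with invented labels:

    # Overall accuracy and Cohen's kappa from a confusion matrix (toy data).
    import numpy as np

    truth = np.array([0, 0, 1, 1, 2, 2, 2, 1])
    pred  = np.array([0, 1, 1, 1, 2, 2, 0, 1])
    k = truth.max() + 1
    cm = np.zeros((k, k), int)
    np.add.at(cm, (truth, pred), 1)          # confusion matrix
    po = np.trace(cm) / cm.sum()             # overall accuracy
    pe = (cm.sum(0) * cm.sum(1)).sum() / cm.sum() ** 2
    print("accuracy:", po, "kappa:", (po - pe) / (1 - pe))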
OzFlux data: network integration from collection to curation
NASA Astrophysics Data System (ADS)
Isaac, Peter; Cleverly, James; McHugh, Ian; van Gorsel, Eva; Ewenz, Cacilia; Beringer, Jason
2017-06-01
Measurement of the exchange of energy and mass between the surface and the atmospheric boundary-layer by the eddy covariance technique has undergone great change in the last 2 decades. Early studies of these exchanges were confined to brief field campaigns in carefully controlled conditions followed by months of data analysis. Current practice is to run tower-based eddy covariance systems continuously over several years due to the need for continuous monitoring as part of a global effort to develop local-, regional-, continental- and global-scale budgets of carbon, water and energy. Efficient methods of processing the increased quantities of data are needed to maximise the time available for analysis and interpretation. Standardised methods are needed to remove differences in data processing as possible contributors to observed spatial variability. Furthermore, public availability of these data sets assists with undertaking global research efforts. The OzFlux data path has been developed (i) to provide a standard set of quality control and post-processing tools across the network, thereby facilitating inter-site integration and spatial comparisons; (ii) to increase the time available to researchers for analysis and interpretation by reducing the time spent collecting and processing data; (iii) to propagate both data and metadata to the final product; and (iv) to facilitate the use of the OzFlux data by adopting a standard file format and making the data available from web-based portals. Discovery of the OzFlux data set is facilitated through incorporation in FLUXNET data syntheses and the publication of collection metadata via the RIF-CS format. This paper serves two purposes. The first is to describe the data sets, along with their quality control and post-processing, for the other papers of this Special Issue. The second is to provide an example of one solution to the data collection and curation challenges that are encountered by similar flux tower networks worldwide.
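As a flavor of what a standardized quality-control step in such a pipeline can look like, here is a minimal sketch of a plausibility-range check plus a rolling-median spike test on a synthetic half-hourly flux series. The thresholds and flag values are invented for illustration and are not the OzFlux defaults.

    # Range check and despiking on a synthetic 30-minute CO2 flux series.
    import numpy as np
    import pandas as pd

    t = pd.date_range("2015-01-01", periods=48 * 7, freq="30min")
    fc = pd.Series(np.random.normal(-2, 1, len(t)), index=t)   # umol m-2 s-1
    fc.iloc[100] = 60.0                                        # an artificial spike

    qc = pd.Series(0, index=t)                  # 0 = good data
    qc[(fc < -50) | (fc > 50)] = 1              # plausibility-range flag
    resid = (fc - fc.rolling(13, center=True).median()).abs()
    qc[resid > 5 * resid.std()] = 2             # spike flag
    print(qc.value_counts())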
Importance of implementing an analytical quality control system in a core laboratory.
Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T
2015-01-01
The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to detect whether the objectives set are being fulfilled and, in case of errors, to take corrective actions and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment (at 6-month intervals) to determine compliance with pre-determined specifications (Stockholm Consensus). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators and its systematic, random and total errors at regular intervals in order to ensure that they meet pre-determined specifications and, if not, apply the appropriate corrective actions. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.
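The core arithmetic of such an internal-control protocol, imprecision, bias and total error from repeated control measurements, fits in a few lines. The control values, target and the 1.65 coverage multiplier below are common conventions used here as assumptions, not this laboratory's specifications.

    # CV (random error), bias (systematic error) and total error from QC data.
    import numpy as np

    control = np.array([5.1, 5.0, 4.9, 5.2, 5.0, 4.8, 5.1])  # daily QC results, mmol/L
    target = 5.0                                             # assigned control value

    mean = control.mean()
    cv = 100 * control.std(ddof=1) / mean            # random error, %
    bias = 100 * (mean - target) / target            # systematic error, %
    total_error = abs(bias) + 1.65 * cv              # compare to allowable total error
    print(f"CV={cv:.2f}%  bias={bias:.2f}%  TE={total_error:.2f}%")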
Standardization of domestic frying processes by an engineering approach.
Franke, K; Strijowski, U
2011-05-01
An approach was developed to enable better standardization of the domestic frying of potato products. For this purpose, 5 domestic fryers differing in heating power and oil capacity were used. A well-defined frying process using a highly standardized model product and a broad range of frying conditions was carried out in these fryers, and the development of browning, an important quality parameter, was measured. Product-to-oil ratio, oil temperature, and frying time were varied. Quite different color changes were measured in the different fryers although the same frying process parameters were applied. The specific energy consumption for water evaporation (spECWE) during frying, related to product amount, was determined for all frying processes to define an engineering parameter characterizing the frying process. A quasi-linear regression approach was applied to calculate this parameter from the frying process settings and fryer properties. The high significance of the regression coefficients and a coefficient of determination close to unity confirmed the suitability of this approach. Based on this regression equation, curves for standard frying conditions (SFC curves) were calculated which describe the frying conditions required to obtain the same level of spECWE in the different domestic fryers. Comparison of browning results from the different fryers operated at conditions near the SFC curves confirmed the applicability of the approach. © 2011 Institute of Food Technologists®
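The regression step described above can be sketched as an ordinary least-squares fit of spECWE against process settings and fryer properties. The predictor set and numbers below are invented for illustration, not the study's data.

    # Least-squares fit of spECWE from frying settings (toy values).
    import numpy as np

    # columns: oil temperature (C), product-to-oil ratio (-), heating power (kW)
    X = np.array([[160, 0.05, 1.8], [175, 0.10, 1.8],
                  [160, 0.10, 2.2], [175, 0.05, 2.2]])
    y = np.array([2.9, 3.4, 3.1, 3.2])               # measured spECWE, MJ/kg water
    A = np.column_stack([np.ones(len(X)), X])        # add intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    print("intercept and coefficients:", coef)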
Perez-Diaz de Cerio, David; Hernández, Ángela; Valenzuela, Jose Luis; Valdovinos, Antonio
2017-01-01
The purpose of this paper is to evaluate from a real perspective the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have investigated the discovery process through analytical and simulation models, according to the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and correspond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage. PMID:28273801
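To see why the configurable parameters matter, a toy simulation of the idealized discovery process can be written as follows. It ignores the scanning gaps and decoding losses the paper measures, as well as packet duration and the three advertising channels; all defaults are illustrative, with the 0-10 ms advDelay taken from the BLE specification.

    # Toy single-advertiser discovery-latency simulation (idealized model).
    import random

    def discovery_latency(adv_interval=100.0, scan_interval=100.0, scan_window=50.0):
        t = random.uniform(0, adv_interval)            # first advertising event, ms
        while True:
            if (t % scan_interval) < scan_window:      # event inside a scan window
                return t
            t += adv_interval + random.uniform(0, 10)  # advDelay per BLE spec

    trials = [discovery_latency() for _ in range(10000)]
    print("mean discovery latency: %.1f ms" % (sum(trials) / len(trials)))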
NASA Astrophysics Data System (ADS)
Ruscher, P. H.
2008-05-01
Introduction This paper will discuss the process that went into the development of new teacher standards for Florida's K-12 science benchmarks over 2007-2008. Florida developed its first set of science standards at about the same time that the National Science Education Standards debuted, in the mid-1990s, and the two documents had little in common with each other, particularly with regard to the controversial issue of evolution; the Florida document was also quite weak in its treatment of earth/space (or geoscience) content expectations. The revision process created panels of Framers and Writers (with much overlap) and led to the creation of a draft set of documents in the fall of 2007 after much electronic and face-to-face collaboration at several meetings. The public was then invited to comment on the draft, and the comments came fast and furious (some really were, in fact, furious). But most were highly complimentary, and external professional reviewers lauded the changes from Florida's existing "mile-wide, inch-deep" standards to a much more reasonable core group of standards. Over a 60-day period, over 20,000 individual comments were submitted, and over 100,000 numerical ratings (on a 5-point scale) were entered. In January 2008, these comments were reviewed, and the review culminated in a final draft of the standards, presented to the State Department of Education and its Commissioner in late January. The process became fraught with political pressure late, however, as anti-evolutionists led an assault on some aspects of the Life Sciences standards, with repercussions in particular for the treatment of fossil evidence in the Earth/Space Science standards. The talk will summarize the evolution that this pressure forced the standards to undergo. Nature of Science There is an expanded section of Nature of Science benchmarks and standards, based on over twenty years of research in science education, that cuts across all standard areas (life, physical, and earth/space). This body of knowledge exists at all levels from kindergarten to 12th grade, and serves to assure that science is inquiry-based, if not directly experientially based, encourages laboratory and field work in science, and serves to elevate science teaching. Impacts on the Florida Science FCAT (Florida Comprehensive Assessment Test) will also be discussed. Geoscience Components Our efforts concentrated on all aspects of Earth/Space Science, including astronomy, cosmology, hydrology, geology, climatology, meteorology, and oceanography (and various other sub-disciplines one could name). We include societal impacts such as the impact of the space program on Florida, disaster mitigation and preparation, and resource utilization. Linkages to physical and life sciences are explicit, allowing for the creation of new crosscutting curricula that might provide interesting new challenges for implementers at the district (e.g., county) level.
Kilinc Balci, F. Selcen
2016-01-01
Although they play an important role in infection prevention and control, textile materials and personal protective equipment (PPE) used in health care settings are known to be one of the sources of cross-infection. Gowns are recommended to prevent transmission of infectious diseases in certain settings; however, laboratory and field studies have produced mixed results of their efficacy. PPE used in health care is regulated as either class I (low risk) or class II (intermediate risk) devices in the United States. Many organizations have published guidelines for the use of PPE, including isolation gowns, in health care settings. In addition, the Association for the Advancement of Medical Instrumentation published a guidance document on the selection of gowns and a classification standard on liquid barrier performance for both surgical and isolation gowns. However, there is currently no existing standard specific to isolation gowns that considers not only the barrier resistance but also a wide array of end user desired attributes. As a result, infection preventionists and purchasing agents face several difficulties in the selection process, and end users have limited or no information on the levels of protection provided by isolation gowns. Lack of knowledge about the performance of protective clothing used in health care became more apparent during the 2014 Ebola epidemic. This article reviews laboratory studies, regulations, guidelines and standards pertaining to isolation gowns, characterization problems, and other potential barriers of isolation gown selection and use. PMID:26391468
Issues and Methods for Standard-Setting.
ERIC Educational Resources Information Center
Hambleton, Ronald K.; And Others
Issues involved in standard setting along with methods for standard setting are reviewed, with specific reference to their relevance for criterion referenced testing. Definitions are given of continuum and state models, and traditional and normative standard setting procedures. Since continuum models are considered more appropriate for criterion…
Evaluation of Apache Hadoop for parallel data analysis with ROOT
NASA Astrophysics Data System (ADS)
Lehrack, S.; Duckeck, G.; Ebke, J.
2014-06-01
The Apache Hadoop software is a Java-based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be split into arbitrary chunks, and must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives for distributing analysis data were discussed: either the data were stored in HDFS and processed with MapReduce, or the data were accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as the execution back-end. The focus of the measurements was, on the one hand, to safely store analysis data on HDFS with reasonable data rates and, on the other hand, to process data quickly and reliably with MapReduce. In the evaluation of HDFS, read/write data rates from the local Hadoop cluster were measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses were used and event rates were compared to PROOF.
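One common way to feed unsplittable binary files to MapReduce, consistent with the approach described above, is to make the input records file paths rather than file contents, so each ROOT file is processed whole by one task. Below is a sketch of such a Hadoop Streaming mapper using PyROOT; the tree name "events", the path list, and the assumption that ROOT is installed on the worker nodes are all illustrative.

    #!/usr/bin/env python
    # Streaming mapper: reads ROOT file paths from stdin, emits per-file
    # event counts as tab-separated key/value pairs.
    import sys
    import ROOT  # PyROOT, assumed available on the worker nodes

    for line in sys.stdin:
        path = line.strip()
        if not path:
            continue
        f = ROOT.TFile.Open(path)        # also works for xrootd/dCache URLs
        tree = f.Get("events")           # hypothetical tree name
        print("%s\t%d" % (path, tree.GetEntries()))
        f.Close()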
Soils Project Risk-Based Corrective Action Evaluation Process with ROTC 1 and ROTC 2, Revision 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Patrick; Sloop, Christina
2012-04-01
This document formally defines and clarifies the NDEP-approved process the NNSA/NSO Soils Activity uses to fulfill the requirements of the FFACO and state regulations. This process is used to establish FALs in accordance with the risk-based corrective action (RBCA) process stipulated in Chapter 445 of the Nevada Administrative Code (NAC) as described in the ASTM International (ASTM) Method E1739-95 (NAC, 2008; ASTM, 1995). It is designed to provide a set of consistent standards for chemical and radiological corrective actions.
End-of-fabrication CMOS process monitor
NASA Technical Reports Server (NTRS)
Buehler, M. G.; Allen, R. A.; Blaes, B. R.; Hannaman, D. J.; Lieneweg, U.; Lin, Y.-S.; Sayah, H. R.
1990-01-01
A set of test 'modules' for verifying the quality of a complementary metal oxide semiconductor (CMOS) process at the end of the wafer fabrication is documented. By electrical testing of specific structures, over thirty parameters are collected characterizing interconnects, dielectrics, contacts, transistors, and inverters. Each test module contains a specification of its purpose, the layout of the test structure, the test procedures, the data reduction algorithms, and exemplary results obtained from 3-, 2-, or 1.6-micrometer CMOS/bulk processes. The document is intended to establish standard process qualification procedures for Application Specific Integrated Circuits (ASIC's).
Schulze, H Georg; Turner, Robin F B
2013-04-01
Raman spectra often contain undesirable, randomly positioned, intense, narrow-bandwidth, positive, unidirectional spectral features generated when cosmic rays strike charge-coupled device cameras. These must be removed prior to analysis, but doing so manually is not feasible for large data sets. We developed a quick, simple, effective, semi-automated procedure to remove cosmic ray spikes from spectral data sets that contain large numbers of relatively homogenous spectra. Although some inhomogeneous spectral data sets can be accommodated--it requires replacing excessively modified spectra with the originals and removing their spikes with a median filter instead--caution is advised when processing such data sets. In addition, the technique is suitable for interpolating missing spectra or replacing aberrant spectra with good spectral estimates. The method is applied to baseline-flattened spectra and relies on fitting a third-order (or higher) polynomial through all the spectra at every wavenumber. Pixel intensities in excess of a threshold of 3× the noise standard deviation above the fit are reduced to the threshold level. Because only two parameters (with readily specified default values) might require further adjustment, the method is easily implemented for semi-automated processing of large spectral sets.
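The described procedure translates almost directly into NumPy: at each wavenumber, fit a third-order polynomial across the spectrum index and reduce intensities exceeding the fit by more than 3x the noise standard deviation to the threshold level. Estimating the noise SD from the fit residuals, as done below when no instrument value is supplied, is an assumption for this sketch.

    # Per-wavenumber polynomial despiking across a set of similar spectra.
    import numpy as np

    def despike(spectra, order=3, k=3.0, noise_sd=None):
        # spectra: (n_spectra x n_wavenumbers), baseline-flattened
        n, _ = spectra.shape
        idx = np.arange(n)
        out = spectra.copy()
        for j in range(spectra.shape[1]):             # loop over wavenumbers
            coef = np.polyfit(idx, spectra[:, j], order)
            fit = np.polyval(coef, idx)
            sd = noise_sd if noise_sd is not None else (spectra[:, j] - fit).std()
            thresh = fit + k * sd
            out[:, j] = np.minimum(out[:, j], thresh)  # reduce spikes to threshold
        return out

    cleaned = despike(np.abs(np.random.randn(20, 500)) + 1.0)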
NASA Astrophysics Data System (ADS)
Jöckel, P.; Sander, R.; Kerkweg, A.; Tost, H.; Lelieveld, J.
2005-02-01
The development of a comprehensive Earth System Model (ESM) to study the interactions between chemical, physical, and biological processes requires coupling of the different domains (land, ocean, atmosphere, ...). One strategy is to link existing domain-specific models with a universal coupler, i.e. an independent standalone program organizing the communication between other programs. In many cases, however, a much simpler approach is more feasible. We have developed the Modular Earth Submodel System (MESSy). It comprises (1) a modular interface structure to connect submodels to a base model, (2) an extendable set of such submodels for miscellaneous processes, and (3) a coding standard. MESSy is therefore not a coupler in the classical sense, but exchanges data between a base model and several submodels within one comprehensive executable. The internal complexity of the submodels is controllable in a transparent and user-friendly way. This provides remarkable new possibilities to study feedback mechanisms (by two-way coupling). Note that the MESSy and coupler approaches can be combined. For instance, an atmospheric model implemented according to the MESSy standard could easily be coupled to an ocean model by means of an external coupler. The vision is to ultimately form a comprehensive ESM which includes a large set of submodels, and a base model which contains only a central clock and runtime control. This can be reached stepwise, since each process can be included independently. Starting from an existing model, process submodels can be reimplemented according to the MESSy standard. This procedure guarantees the availability of a state-of-the-art model for scientific applications at any time during the development. In principle, MESSy can be implemented into any kind of model, either global or regional. So far, the MESSy concept has been applied to the general circulation model ECHAM5 and a number of process box models.
Undergraduate Experiment with Fractal Diffraction Gratings
ERIC Educational Resources Information Center
Monsoriu, Juan A.; Furlan, Walter D.; Pons, Amparo; Barreiro, Juan C.; Gimenez, Marcos H.
2011-01-01
We present a simple diffraction experiment with fractal gratings based on the triadic Cantor set. Diffraction by fractals is proposed as a motivating strategy for students of optics in the potential applications of optical processing. Fraunhofer diffraction patterns are obtained using standard equipment present in most undergraduate physics…
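Because the Fraunhofer pattern is proportional to the squared modulus of the Fourier transform of the grating transmittance, the experiment has a direct numerical companion. The sketch below builds a triadic Cantor grating and computes its far-field intensity with an FFT; the sampling choices are arbitrary.

    # Triadic Cantor grating and its Fraunhofer diffraction pattern via FFT.
    import numpy as np

    def cantor(level, base=np.array([1.0])):
        for _ in range(level):
            gap = np.zeros_like(base)
            base = np.concatenate([base, gap, base])  # keep the outer thirds
        return base

    t = np.repeat(cantor(4), 64)                  # transmittance, level-4 grating
    pattern = np.abs(np.fft.fftshift(np.fft.fft(t, 8192))) ** 2
    print("central peak intensity:", pattern.max())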
Drugs and alcohol in civil aviation accident pilot fatalities from 2004-2008.
DOT National Transportation Integrated Search
2011-09-01
The FAA Office of Aerospace Medicine sets medical standards needed to protect the public and pilots from death : or injury due to incapacitation of the pilot. As a part of this process, toxicology testing is performed by the FAA : on almost every pil...
ERIC Educational Resources Information Center
Frashure, K. M.; Chen, R. F.; Stephen, R. A.; Bolmer, T.; Lavin, M.; Strohschneider, D.; Maichle, R.; Micozzi, N.; Cramer, C.
2007-01-01
Demonstrating wave processes quantitatively in the classroom using standard classroom tools (such as Slinkys and wave tanks) can be difficult. For example, waves often travel too fast for students to actually measure amplitude or wavelength. Also, when teaching propagating waves, reflections from the ends set up standing waves, which can confuse…
A Leader's Guide to Mathematics Curriculum Topic Study
ERIC Educational Resources Information Center
Keeley, Page; Mundry, Susan; Tobey, Cheryl Rose; Carroll, Catherine E.
2012-01-01
The Curriculum Topic Study (CTS) process, funded by the National Science Foundation, supports teachers in improving practice by connecting standards and research to curriculum, instruction, and assessment. Designed for facilitators, this guide provides a robust set of professional development tools, templates, and designs to strengthen mathematics…
13 CFR 134.801 - Scope of rules.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Dispute Resolution Process (EDRP). Standard Operating Procedure (SOP) 37 71 sets out the EDRP. It is... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Scope of rules. 134.801 Section 134.801 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION RULES OF PROCEDURE GOVERNING...
13 CFR 134.801 - Scope of rules.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Dispute Resolution Process (EDRP). Standard Operating Procedure (SOP) 37 71 sets out the EDRP. It is... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Scope of rules. 134.801 Section 134.801 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION RULES OF PROCEDURE GOVERNING...
13 CFR 134.801 - Scope of rules.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Dispute Resolution Process (EDRP). Standard Operating Procedure (SOP) 37 71 sets out the EDRP. It is... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Scope of rules. 134.801 Section 134.801 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION RULES OF PROCEDURE GOVERNING...
13 CFR 134.801 - Scope of rules.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Dispute Resolution Process (EDRP). Standard Operating Procedure (SOP) 37 71 sets out the EDRP. It is... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Scope of rules. 134.801 Section 134.801 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION RULES OF PROCEDURE GOVERNING...
ERIC Educational Resources Information Center
Bailey, Anthony
2013-01-01
The nominal group technique (NGT) is a structured process to gather information from a group. The technique was first described in 1975 and has since become a widely used standard to facilitate working groups. The NGT is effective for generating large numbers of creative new ideas and for group priority setting. This paper describes the process of…
NASA Technical Reports Server (NTRS)
Moe, Karen L.; Perkins, Dorothy C.; Szczur, Martha R.
1987-01-01
The user support environment (USE), which is a set of software tools for a flexible standard interactive user interface to the Space Station systems, platforms, and payloads, is described in detail. Included in the USE concept are a user interface language, a run time environment and user interface management system, support tools, and standards for human interaction methods. The goals and challenges of the USE are discussed, as well as a methodology based on prototype demonstrations for involving users in the process of validating the USE concepts. Prototyping the key concepts and salient features of the proposed user interface standards greatly enhances the users' ability to respond.
7 CFR 28.107 - Original cotton standards and reserve sets.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 2 2014-01-01 2014-01-01 false Original cotton standards and reserve sets. 28.107... CONTAINER REGULATIONS COTTON CLASSING, TESTING, AND STANDARDS Regulations Under the United States Cotton Standards Act Practical Forms of Cotton Standards § 28.107 Original cotton standards and reserve sets. (a...
7 CFR 28.107 - Original cotton standards and reserve sets.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Original cotton standards and reserve sets. 28.107... CONTAINER REGULATIONS COTTON CLASSING, TESTING, AND STANDARDS Regulations Under the United States Cotton Standards Act Practical Forms of Cotton Standards § 28.107 Original cotton standards and reserve sets. (a...
7 CFR 28.107 - Original cotton standards and reserve sets.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Original cotton standards and reserve sets. 28.107... CONTAINER REGULATIONS COTTON CLASSING, TESTING, AND STANDARDS Regulations Under the United States Cotton Standards Act Practical Forms of Cotton Standards § 28.107 Original cotton standards and reserve sets. (a...
7 CFR 28.107 - Original cotton standards and reserve sets.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 2 2013-01-01 2013-01-01 false Original cotton standards and reserve sets. 28.107... CONTAINER REGULATIONS COTTON CLASSING, TESTING, AND STANDARDS Regulations Under the United States Cotton Standards Act Practical Forms of Cotton Standards § 28.107 Original cotton standards and reserve sets. (a...
7 CFR 28.107 - Original cotton standards and reserve sets.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 2 2012-01-01 2012-01-01 false Original cotton standards and reserve sets. 28.107... CONTAINER REGULATIONS COTTON CLASSING, TESTING, AND STANDARDS Regulations Under the United States Cotton Standards Act Practical Forms of Cotton Standards § 28.107 Original cotton standards and reserve sets. (a...
Electro-deposition painting process improvement of cab truck by Six Sigma concept
NASA Astrophysics Data System (ADS)
Kawitu, Kitiya; Chutima, Parames
2017-06-01
The case study company is a manufacturer of trucks currently facing high rework costs because the thickness of the electro-deposited paint (EDP) on the truck cab is below the standard. In addition, the process capability is very low. The Six Sigma concept, consisting of five phases (DMAIC), is applied to determine new parameter settings for each significant controllable factor. After the improvement, the EDP thickness of the truck cab increases from 17.88 μ to 20 μ (the standard being 20 ± 3 μ). Moreover, the process capability indexes (Cp and Cpk) increase from 0.9 to 1.43 and from 0.27 to 1.43, respectively. This improvement could save about 1.6M THB per year in rework costs.
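For readers unfamiliar with the capability indexes quoted above, the sketch below shows the standard Cp/Cpk arithmetic on synthetic thickness data; the sample values and spec limits (20 ± 3) are illustrative, not the study's raw data.

```python
import numpy as np

def process_capability(samples, lsl, usl):
    """Standard short-term capability indices: Cp ignores centering,
    Cpk penalizes a mean that drifts toward either spec limit."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Synthetic data only: EDP thickness spec of 20 +/- 3 micrometres.
rng = np.random.default_rng(0)
before = rng.normal(17.88, 1.10, 50)   # off-centre and wide -> low Cpk
after = rng.normal(20.00, 0.70, 50)    # centred and tighter -> higher Cpk
print(process_capability(before, lsl=17.0, usl=23.0))
print(process_capability(after, lsl=17.0, usl=23.0))
```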
Nutescu, Edith A; Wittkowsky, Ann K; Burnett, Allison; Merli, Geno J; Ansell, Jack E; Garcia, David A
2013-05-01
To provide recommendations for optimized anticoagulant therapy in the inpatient setting and outline broad elements that need to be in place for effective management of anticoagulant therapy in hospitalized patients; the guidelines are designed to promote optimization of patient clinical outcomes while minimizing the risks for potential anticoagulation-related errors and adverse events. The medical literature was reviewed using MEDLINE (1946-January 2013), EMBASE (1980-January 2013), and PubMed (1947-January 2013) for topics and key words including, but not limited to, standards of practice, national guidelines, patient safety initiatives, and regulatory requirements pertaining to anticoagulant use in the inpatient setting. Non-English-language publications were excluded. Specific MeSH terms used include algorithms, anticoagulants/administration and dosage/adverse effects/therapeutic use, clinical protocols/standards, decision support systems, drug monitoring/methods, humans, inpatients, efficiency/organizational, outcome and process assessment (health care), patient care team/organization and administration, program development/standards, quality improvement/organization and administration, thrombosis/drug therapy, thrombosis/prevention and control, risk assessment/standards, patient safety/standards, and risk management/methods. Because of this document's scope, the medical literature was searched using a variety of strategies. When possible, recommendations are supported by available evidence; however, because this paper deals with processes and systems of care, high-quality evidence (eg, controlled trials) is unavailable. In these cases, recommendations represent the consensus opinion of all authors and are endorsed by the Board of Directors of the Anticoagulation Forum, an organization dedicated to optimizing anticoagulation care. The board is composed of physicians, pharmacists, and nurses with demonstrated expertise and experience in the management of patients receiving anticoagulation therapy. Recommendations for delivering optimized inpatient anticoagulation therapy were developed collaboratively by the authors and are summarized in 8 key areas: (1) process, (2) accountability, (3) integration, (4) standards of practice, (5) provider education and competency, (6) patient education, (7) care transitions, and (8) outcomes. Recommendations are intended to inform the development of coordinated care systems containing elements with demonstrated benefit in improvement of anticoagulation therapy outcomes. Recommendations for delivering optimized inpatient anticoagulation therapy are intended to apply to all clinicians involved in the care of hospitalized patients receiving anticoagulation therapy. Anticoagulants are high-risk medications associated with a significant rate of medication errors among hospitalized patients. Several national organizations have introduced initiatives to reduce the likelihood of patient harm associated with the use of anticoagulants. Health care organizations are under increasing pressure to develop systems to ensure the safe and effective use of anticoagulants in the inpatient setting. This document provides consensus guidelines for anticoagulant therapy in the inpatient setting and serves as a companion document to prior guidelines relevant for outpatients.
Using standardized patients to evaluate hospital-based intervention outcomes.
Li, Li; Lin, Chunqing; Guan, Jihui
2014-06-01
The standardized patient approach has proved to be an effective training tool for medical educators. This article explains the process of employing standardized patients in an HIV stigma reduction intervention in healthcare settings in China. The study was conducted in 40 hospitals in two provinces of China. One year after the stigma reduction intervention, standardized patients made unannounced visits to participating hospitals, randomly approached service providers on duty and presented symptoms related to HIV and disclosed HIV-positive test results. After each visit, the standardized patients evaluated their providers' attitudes and behaviours using a structured checklist. Standardized patients also took open-ended observation notes about their experience and the evaluation process. Seven standardized patients conducted a total of 217 assessments (108 from 20 hospitals in the intervention condition; 109 from 20 hospitals in the control condition). Based on a comparative analysis, the intervention hospitals received a better rating than the control hospitals in terms of general impression and universal precaution compliance as well as a lower score on stigmatizing attitudes and behaviours toward the standardized patients. Standardized patients are a useful supplement to traditional self-report assessments, particularly for measuring intervention outcomes that are sensitive or prone to social desirability. Published by Oxford University Press on behalf of the International Epidemiological Association © The Author 2013; all rights reserved.
Mining functionally relevant gene sets for analyzing physiologically novel clinical expression data.
Turcan, Sevin; Vetter, Douglas E; Maron, Jill L; Wei, Xintao; Slonim, Donna K
2011-01-01
Gene set analyses have become a standard approach for increasing the sensitivity of transcriptomic studies. However, analytical methods incorporating gene sets require the availability of pre-defined gene sets relevant to the underlying physiology being studied. For novel physiological problems, relevant gene sets may be unavailable or existing gene set databases may bias the results towards only the best-studied of the relevant biological processes. We describe a successful attempt to mine novel functional gene sets for translational projects where the underlying physiology is not necessarily well characterized in existing annotation databases. We choose targeted training data from public expression data repositories and define new criteria for selecting biclusters to serve as candidate gene sets. Many of the discovered gene sets show little or no enrichment for informative Gene Ontology terms or other functional annotation. However, we observe that such gene sets show coherent differential expression in new clinical test data sets, even if derived from different species, tissues, and disease states. We demonstrate the efficacy of this method on a human metabolic data set, where we discover novel, uncharacterized gene sets that are diagnostic of diabetes, and on additional data sets related to neuronal processes and human development. Our results suggest that our approach may be an efficient way to generate a collection of gene sets relevant to the analysis of data for novel clinical applications where existing functional annotation is relatively incomplete.
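A minimal illustration of the kind of "coherent differential expression" scoring the authors describe, using a planted gene set in synthetic data; the scoring function here is a generic mean t-statistic, not the paper's exact selection criterion.

```python
import numpy as np
from scipy import stats

def gene_set_score(expr_case, expr_ctrl, gene_idx):
    """Toy coherence score for one candidate gene set: mean per-gene
    t-statistic between case and control samples.
    expr_*: genes x samples matrices; gene_idx: rows in the gene set."""
    t, _ = stats.ttest_ind(expr_case[gene_idx], expr_ctrl[gene_idx], axis=1)
    return np.mean(t)  # large |score| = the set moves coherently one way

rng = np.random.default_rng(1)
case = rng.normal(0.0, 1.0, (1000, 20))
ctrl = rng.normal(0.0, 1.0, (1000, 20))
case[:50] += 0.8                                       # plant a coherent set
print(gene_set_score(case, ctrl, np.arange(50)))       # clearly positive
print(gene_set_score(case, ctrl, np.arange(50, 100)))  # near zero
```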
NASA Astrophysics Data System (ADS)
Holzweissig, Martin Joachim; Lackmann, Jan; Konrad, Stefan; Schaper, Mirko; Niendorf, Thomas
2015-07-01
The current work elucidates an improvement of the mechanical properties of tool-quenched low-alloy steel by employing extremely short austenitization durations utilizing a press heating arrangement. Specifically, the influence of different austenitization treatments—involving austenitization durations ranging from 3 to 15 seconds—on the mechanical properties of low-alloy steel in comparison to an industrial standard furnace process was examined. A thorough set of experiments was conducted to investigate the role of different austenitization durations and temperatures on the resulting mechanical properties such as hardness, bending angle, tensile strength, and strain at fracture. The most important finding is that the hardness, the bending angle, and the tensile strength increase with shortened austenitization durations. Furthermore, the ductility of the steels exhibits almost no difference between the short austenitization durations and the standard furnace process. The enhancement of the mechanical properties imposed by the short heat treatments investigated is related to a refinement of microstructural features as compared to the standard furnace process.
Fox, W.E.; McCollum, D.W.; Mitchell, J.E.; Swanson, L.E.; Kreuter, U.P.; Tanaka, J.A.; Evans, G.R.; Theodore, Heintz H.; Breckenridge, R.P.; Geissler, P.H.
2009-01-01
Currently, there is no standard method to assess the complex systems in rangeland ecosystems. Decision makers need baselines to create a common language of current rangeland conditions and standards for continued rangeland assessment. The Sustainable Rangeland Roundtable (SRR), a group of private and public organizations and agencies, has created a forum to discuss rangeland sustainability and assessment. The SRR has worked to integrate social, economic, and ecological disciplines related to rangelands and has identified a standard set of indicators that can be used to assess rangeland sustainability. As part of this process, SRR has developed a two-tiered conceptual framework from a systems perspective to study the validity of indicators and the relationships among them. The first tier categorizes rangeland characteristics into four states. The second tier defines processes affecting these states through time and space. The framework clearly shows that the processes affect and are affected by each other. © 2009 Taylor & Francis Group, LLC.
Sales, Anne E; Bostrom, Anne-Marie; Bucknall, Tracey; Draper, Kellie; Fraser, Kimberly; Schalm, Corinne; Warren, Sharon
2012-02-01
Standardized resident or client assessments, including the Resident Assessment Instrument (RAI), have been available in long term care and home care settings (continuing care sector) in many jurisdictions for a number of years. Although using these data can make quality improvement activities more efficient and less costly, there has not been a review of the literature reporting quality improvement interventions using standardized data. To address 2 questions: (1) How have RAI and other standardized data been used in process or quality improvement activities in the continuing care sector? and (2) Has the use of RAI and similar data resulted in improvements to resident or other outcomes? Searches using a combination of keyword and controlled vocabulary term searches were conducted in MEDLINE, Cumulative Index to Nursing and Allied Health Literature (CINAHL), EMBASE, the Cochrane Library, and PsycINFO. ELIGIBILITY CRITERIA, PARTICIPANTS, AND INTERVENTIONS: English language publications from database inception to October 2008 were included. Eligibility criteria included the following: (1) set in continuing care (long-term care facility or home care), (2) involved some form of intervention designed to improve quality or process of care, and (3) used standardized data in the quality or process improvement intervention. After reviewing the articles, we grouped the studies according to the type of intervention used to initiate process improvement. Four different intervention types were identified. We organized the results and discussion by these 4 intervention types. Key word searches identified 713 articles, of which we excluded 639 on abstract review because they did not meet inclusion criteria. A further 50 articles were excluded on full-text review, leaving a total of 24 articles. Of the 24 studies, 10 used a defined process improvement model, 8 used a combination of interventions (multimodal), 5 implemented new guidelines or protocols, and 1 used an education intervention. The most frequently cited issues contributing to unsuccessful quality improvement interventions were lack of staff, high staff turnover, and limited time available to train staff in ways that would improve client care. Innovative strategies and supporting research are required to determine how to intervene successfully to improve quality in these settings characterized by low staffing levels and predominantly nonprofessional staff. Research on how to effectively enable practitioners to use data to improve quality of care, and ultimately quality of life, needs to be a priority. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.
Legal consequences of standard setting for competitive athletes with cardiovascular abnormalities.
Weistart, J C
1985-12-01
This paper addresses the issue of whether establishing consensus standards for the treatment of particular medical conditions increases a physician's exposure to legal liability. The conclusion reached is that the legal effects of standard setting, rather than representing a significant threat of liability, should be seen as beneficial to the medical profession. A fundamental point is that the legal test for liability is entirely dependent on the medical profession's definition of what constitutes adequate care. The law incorporates the standard of care defined by the medical profession and does not impose an external norm. In the absence of formally stated standards, the process of defining relevant medical criteria will involve a great deal of uncertainty. Outcomes of legal contests will be affected by such extraneous factors as the relative experience of the lawyers involved, their access to knowledgeable expert witnesses, and their strategic decisions made with respect to tactics and procedures. Establishment of formal standards has the salutary effect of limiting the influence of these factors and thus reducing the randomness of the results reached. Formal standards also have the advantage of being easily replicated in unrelated proceedings and thereby contribute to the development of a consistent, evenly applied rule of liability. Finally, even if formal standards are either more, or less, progressive than the actual state of medical practice, there is relatively little risk that they will produce untoward results.
Reliability and Validity of 10 Different Standard Setting Procedures.
ERIC Educational Resources Information Center
Halpin, Glennelle; Halpin, Gerald
Research indicating that different cut-off points result from the use of different standard-setting techniques leaves decision makers with a disturbing dilemma: Which standard-setting method is best? This investigation of the reliability and validity of 10 different standard-setting approaches was designed to provide information that might help…
Kushniruk, Andre; Borycki, Elizabeth
2017-01-01
In recent years there has been considerable discussion around the need for certification and regulation of healthcare information technology (IT). In particular, the usability of the products being developed needs to be evaluated. This has included the application of standards designed to ensure the process of system development is user-centered and takes usability into consideration while a product is being developed. In addition to this, in healthcare, organizations in the United States and Europe have also addressed the need and requirement for product certification. However, despite these efforts there are continued reports of unusable and unsafe implementations. In this paper we discuss the need not only to include (and require) usability testing in the one-time development process of health IT products (such as EHRs), but also to develop specific usability standards and requirements for usability testing during the implementation of vendor products (i.e., post product development) in healthcare settings. It is further argued that health IT products that may have been certified regarding their development process will still require application of usability testing in the process of implementing them in real hospital settings in order to ensure usability and safety. This is needed in order to ensure that the final result of both product development and implementation processes takes into account and applies the latest usability principles and methods.
Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML).
Park, J; Lechevalier, D; Ak, R; Ferguson, M; Law, K H; Lee, Y-T T; Rachuri, S
2017-01-01
This paper describes Gaussian process regression (GPR) models presented in predictive model markup language (PMML). PMML is an extensible-markup-language (XML) -based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models taking the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distribution for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input and output relationships without predefining a set of basis functions, and predicting a target output with uncertainty quantification. GPR is being employed to various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid employment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain.
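As a hands-on illustration of what a GPR model contributes (a predictive mean plus the confidence information that PMML 4.3 can now carry), here is a small sketch using scikit-learn rather than PMML; the data are synthetic and the kernel choice is an assumption for demonstration only.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy manufacturing-style data: noisy observations of a smooth target.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, 30).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0, 0.1, 30)

# No basis functions are predefined; the kernel encodes the input-output
# relationship, and WhiteKernel absorbs observation noise.
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, y)

# The predictive distribution supplies exactly what a GPR representation
# must carry: a mean estimate plus its uncertainty.
X_new = np.linspace(0, 10, 5).reshape(-1, 1)
mean, std = gpr.predict(X_new, return_std=True)
for x, m, s in zip(X_new.ravel(), mean, std):
    print(f"x={x:4.1f}  prediction={m:+.3f}  95% bound=+/-{1.96 * s:.3f}")
```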
Screening Methodologies to Support Risk and Technology ...
The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have largely completed the required “Maximum Achievable Control Technology” (MACT) standards. In the second stage of the regulatory process, EPA must review each MACT standard at least every eight years and revise them as necessary, “taking into account developments in practices, processes and control technologies.” We call this requirement the “technology review.” EPA is also required to complete a one-time assessment of the health and environmental risks that remain after sources come into compliance with MACT. This residual risk review also must be done within 8 years of setting the initial MACT standard. If additional risk reductions are necessary to protect public health with an ample margin of safety or to prevent adverse environmental effects, EPA must develop standards to address these remaining risks. Because the risk review is an important component of the RTR process, EPA is seeking SAB input on the scientific credibility of specific enhancements made to our risk assessment methodologies, particularly with respect to screening methodologies, since the last SAB review was completed in 2010. These enhancements to our risk methodologies are outlined in the document title
Ghafoor, Virginia L; Silus, Lauren S
2011-03-15
The development of a policy, evidence-based standard orders, and monitoring for palliative sedation therapy (PST) is described. Concerns regarding PST at the University of Minnesota Medical Center (UMMC) arose and needed to be addressed in a formal process. A multidisciplinary group consisting of palliative care physicians, nurse practitioners, clinical nurse specialists, and clinical pharmacy specialists reached consensus on the practice model and medications to be used for PST. Major elements of the plan included the development and implementation of an institutional policy for palliative sedation; standard orders for patient care, sedation, and monitoring; education for staff, patients, and patients' family members; and quality-assurance monitoring. A literature review was performed to identify research and guidelines defining the practice of PST. Policy content includes the use of a standard order set linking patient care, medication administration, the monitoring of sedation, and symptom management. Approval of the policy involved several UMMC committees. An evaluation matrix was used to determine critical areas for PST monitoring and to guide development of a form to monitor quality. A retrospective chart audit using the quality-assurance monitoring form assessed baseline sedation medication and patient outcomes. Assessment of compliance began in the fall of 2008, after the policy and standard orders were approved by the UMMC medical executive committee. In 2008, two cases of PST were monitored using the standardized form. PST cases will be continually monitored and analyzed. Development of policy, standard orders, and quality-assurance monitoring for PST required a formal multidisciplinary process. A continuing process-improvement effort is critical to defining institutional policy, educational goals, and outcome metrics for PST.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sjöstrand, Torbjörn; Ask, Stefan; Christiansen, Jesper R.
The Pythia program is a standard tool for the generation of events in high-energy collisions, comprising a coherent set of physics models for the evolution from a few-body hard process to a complex multiparticle final state. It contains a library of hard processes, models for initial- and final-state parton showers, matching and merging methods between hard processes and parton showers, multiparton interactions, beam remnants, string fragmentation and particle decays. It also has a set of utilities and several interfaces to external programs. Pythia 8.2 is the second main release after the complete rewrite from Fortran to C++, and has now reached such a maturity that it offers a complete replacement for most applications, notably for LHC physics studies. Lastly, the many new features should allow an improved description of data.
Raster profile development for the spatial data transfer standard
Szemraj, John A.
1993-01-01
The Spatial Data Transfer Standard (SDTS), recently approved as Federal Information Processing Standard (FIPS) Publication 173, is designed to transfer various types of spatial data. Implementing all of the standard's options at one time is impractical. Profiles, or limited subsets of the SDTS, are the mechanisms by which the standard will be implemented. The development of a raster profile is being coordinated by the U.S. Geological Survey's (USGS) SDTS Task Force. This raster profile is intended to accommodate digital georeferenced image data and regularly spaced, georeferenced gridded data. The USGS's digital elevation models (DEMs) and digital orthophoto quadrangles (DOQs), National Oceanic and Atmospheric Administration's (NOAA) advanced very high resolution radiometer (AVHRR) and Landsat data, and National Aeronautics and Space Administration's (NASA) Earth observing system (EOS) data are among the candidate data sets for this profile. Other raster profiles, designed to support nongeoreferenced and other types of "raw" sensor data, will be considered in the future. As with the Topological Vector Profile (TVP) for the SDTS, development of the raster profile includes designing a prototype profile, testing the prototype profile using sample data sets, and finally, requesting and receiving FIPS approval.
New IEEE standard enables data collection for medical applications.
Kennelly, R J; Wittenber, J
1994-01-01
The IEEE has gone to ballot on a "Standard for Medical Device Communications", IEEE P1073. The lower layer, hardware portions of the standard are expected to be approved by the IEEE Standards Board at their December 11-13, 1994 meeting. Other portions of the standard are in the initial stages of the IEEE ballot process. The intent of the standard is to allow hospitals and other users to interface medical electronic devices to host computer systems in a standard, interchangeable manner. The standard is optimized for acute care environments such as ICUs, operating rooms, and emergency rooms. [1] IEEE General Committee and Subcommittee work has been on-going since 1984. Significant amounts of work have been done to discover and meet the needs of the patient care setting. Surveys performed in 1989 identified the following four key user requirements for medical device communications: 1) Frequent reconfiguration of the network. 2) Allow "plug and play" operation by users. 3) Associate devices with a specific bed and patient. 4) Support a wide range of hospital computer system topologies. Additionally, the most critical difference in the acute care setting is patient safety, which has an overall effect on the standard. The standard that went to ballot meets these requirements. The standard is based on existing ISO standards. P1073 is compliant with the OSI seven layer model. P1073 specifies the entire communication stack, from object-oriented software to hospital unique connectors. The standard will be able to be put forward as a true international standard, much in the way that the IEEE 802.x family of standards (like Ethernet) was presented as draft ISO standards. (ABSTRACT TRUNCATED AT 250 WORDS)
USDA-ARS?s Scientific Manuscript database
Agricultural research increasingly seeks to quantify complex interactions of processes for a wide range of environmental conditions and crop management scenarios, leading to investigation where multiple sets of experimental data are examined using tools such as simulation and regression. The use of ...
ERIC Educational Resources Information Center
Anderson, W. Steve
Faithfully following a fixed set of procedures has long been standard practice in writing resumes. This process produces a passable document with a minimum of effort expended. There are, however, advantages to be gained from taking a conceptual approach to resume preparation. First, it can help provide a framework for these procedures, giving the…
36 CFR 223.187 - Determinations of unprocessed timber.
Code of Federal Regulations, 2010 CFR
2010-07-01
... processed into any one of the following: (1) Lumber or construction timbers, except western red cedar... grades, sawn on 4 sides, not intended for remanufacture. To determine whether such lumber or construction... generally recognized by the industry as setting a selling standard; and, (ii) A statement by the...
Leadership for Literacy: Teachers Raising Expectations and Opportunities
ERIC Educational Resources Information Center
Chilla, Nicole A.; Waff, Diane; Cook, Heleny
2007-01-01
The public is deeply concerned that students in urban settings are not achieving at high levels. Over the past twenty years, large urban districts have attempted to restructure massive school systems using educational policymaking processes that have focused on school structures, standards-driven curriculum, and test-based accountability measures.…
The Imperfect Art of Designing Online Courses
ERIC Educational Resources Information Center
Berrett, Dan
2012-01-01
Growing pressure to provide more virtual instruction is spurring efforts to design large courses that balance standardization of content with flexibility for instructors. Each course uses a common template, which sets out lesson objectives, lecture material, practice activities, and assessments. The process results in a single version of each…
40 CFR 410.31 - Specialized definitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 28 2010-07-01 2010-07-01 true Specialized definitions. 410.31 Section... STANDARDS TEXTILE MILLS POINT SOURCE CATEGORY Low Water Use Processing Subcategory § 410.31 Specialized definitions. In addition to the definitions set forth in 40 CFR part 401 and § 410.01 of this part, the...
40 CFR 410.31 - Specialized definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 29 2011-07-01 2009-07-01 true Specialized definitions. 410.31 Section... STANDARDS TEXTILE MILLS POINT SOURCE CATEGORY Low Water Use Processing Subcategory § 410.31 Specialized definitions. In addition to the definitions set forth in 40 CFR part 401 and § 410.01 of this part, the...
78 FR 59817 - Revision to United States Marshals Service Fees for Services
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-30
.... Federal Cost Accounting and Fee Setting Standards and Guidelines Being Used When developing fees for... imputed rents on land, buildings, and equipment;'' (c) ``management and supervisory costs;'' and (d... current costs to the United States Marshals Service for service of process in federal court proceedings. A...
Simpson, Deborah M; Beynon, Robert J
2012-09-01
Systems biology requires knowledge of the absolute amounts of proteins in order to model biological processes and simulate the effects of changes in specific model parameters. Quantification concatamers (QconCATs) are established as a method to provide multiplexed absolute peptide standards for a set of target proteins in isotope dilution standard experiments. Two or more quantotypic peptides representing each of the target proteins are concatenated into a designer gene that is metabolically labelled with stable isotopes in Escherichia coli or other cellular or cell-free systems. Co-digestion of a known amount of QconCAT with the target proteins generates a set of labelled reference peptide standards for the unlabelled analyte counterparts, and by using an appropriate mass spectrometry platform, comparison of the intensities of the peptide ratios delivers absolute quantification of the encoded peptides and in turn the target proteins for which they are surrogates. In this review, we discuss the criteria and difficulties associated with surrogate peptide selection and provide examples in the design of QconCATs for quantification of the proteins of the nuclear factor κB pathway.
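The quantification step itself reduces to simple isotope dilution arithmetic, sketched below; the intensity values and spike amount are invented for illustration, and the function name is hypothetical.

```python
def analyte_amount(light_intensity, heavy_intensity, standard_amount_fmol):
    """Isotope dilution: the unlabelled analyte peptide and its labelled
    QconCAT-derived standard co-elute, so the light/heavy intensity ratio
    times the known spiked amount gives the absolute analyte amount."""
    return (light_intensity / heavy_intensity) * standard_amount_fmol

# Illustrative values only: 50 fmol of labelled standard co-digested,
# with a light/heavy ratio of 1.8 observed for one quantotypic peptide.
print(analyte_amount(light_intensity=1.8e6,
                     heavy_intensity=1.0e6,
                     standard_amount_fmol=50.0))  # -> 90.0 fmol
```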
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palanisamy, Giri
The U.S. Department of Energy (DOE)’s Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth atmosphere in diverse climate regimes. The result is a huge archive of diverse data sets containing observational and derived data, currently accumulating at a rate of 30 terabytes (TB) of data and 150,000 different files per month (http://www.archive.arm.gov/stats/). Continuing the current processing while scaling this to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever growing data volumes. It will enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and allow future capabilities of delivering data on demand that can be tailored explicitly for the user needs. This analysis ability will only be possible if the data follows a minimum set of standards. This document proposes a hierarchy of required and recommended standards.
NASA Technical Reports Server (NTRS)
Dehghani, Navid; Tankenson, Michael
2006-01-01
This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission set of ground data processing components providing uplink, downlink, and data management capabilities, which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed based on a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry standard messaging bus is used to transfer information among system components. Components generate standard messages, which are used to capture system information as well as triggers to support the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data and for supporting automation of many mission operations processes.
Extraction of UMLS® Concepts Using Apache cTAKES™ for German Language.
Becker, Matthias; Böckmann, Britta
2016-01-01
Automatic information extraction of medical concepts and classification with semantic standards from medical reports is useful for standardization and for clinical research. This paper presents an approach to UMLS concept extraction with a customized natural language processing pipeline for German clinical notes using Apache cTAKES. The objective is to test whether the natural language processing tool is suitable for identifying UMLS concepts in German-language text and mapping them to SNOMED-CT. The German UMLS database and German OpenNLP models extended the natural language processing pipeline, so the pipeline can normalize to domain ontologies such as SNOMED-CT using the German concepts. For testing, the ShARe/CLEF eHealth 2013 training dataset translated into German was used. The implemented algorithms were tested with a set of 199 German reports, obtaining an average F1 measure of 0.36 without German stemming and without pre- and post-processing of the reports.
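The reported F1 measure comes from the usual precision/recall arithmetic over extracted concepts; a minimal sketch follows, with hypothetical document and UMLS concept identifiers.

```python
def precision_recall_f1(predicted, gold):
    """Micro-average concept-extraction evaluation: both arguments are
    sets of (document_id, concept_id) pairs."""
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical annotations for two reports (CUIs are placeholders).
gold = {(1, "C0011849"), (1, "C0020538"), (2, "C0027051")}
pred = {(1, "C0011849"), (2, "C0027051"), (2, "C0038454")}
print(precision_recall_f1(pred, gold))  # (0.667, 0.667, 0.667)
```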
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael I.; Hart, Philip R.
2016-03-01
Appendix G, the Performance Rating Method in ASHRAE Standard 90.1, has been updated to make two significant changes for the 2016 edition, to be published in October of 2016. First, it allows Appendix G to be used as a third path for compliance with the standard in addition to rating beyond code building performance. This prevents modelers from having to develop separate building models for code compliance and beyond code programs. Using this new version of Appendix G to show compliance with the 2016 edition of the standard, the proposed building design needs to have a performance cost index (PCI) less than targets shown in a new table based on building type and climate zone. The second change is that the baseline design is now fixed at a stable level of performance set approximately equal to the 2004 code. Rather than changing the stringency of the baseline with each subsequent edition of the standard, compliance with new editions will simply require a reduced PCI (a PCI of zero is a net-zero building). Using this approach, buildings of any era can be rated using the same method. The intent is that any building energy code or beyond code program can use this methodology and merely set the appropriate PCI target for their needs. This report discusses the process used to set performance criteria for compliance with ASHRAE Standard 90.1-2016 and suggests a method for demonstrating compliance with other codes and beyond code programs.
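A simplified reading of the PCI compliance test described above can be sketched in a few lines; the cost figures and the 0.55 target are hypothetical, since real targets come from the standard's lookup table by building type and climate zone, and the ratio shown is a simplification of the Appendix G calculation.

```python
def performance_cost_index(proposed_annual_cost, baseline_annual_cost):
    """Simplified PCI: the proposed design's annual energy cost relative
    to the fixed (2004-era) baseline. A PCI of zero would be a net-zero
    building; lower is better."""
    return proposed_annual_cost / baseline_annual_cost

# Hypothetical example: target PCI of 0.55 for some building type and
# climate zone.
pci = performance_cost_index(proposed_annual_cost=41_000.0,
                             baseline_annual_cost=100_000.0)
print(pci, pci <= 0.55)   # 0.41 True -> complies under this toy target
```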
Comparison of Web-Based and Face-to-Face Standard Setting Using the Angoff Method
ERIC Educational Resources Information Center
Katz, Irvin R.; Tannenbaum, Richard J.
2014-01-01
Web-based standard setting holds promise for reducing the travel and logistical inconveniences of traditional, face-to-face standard setting meetings. However, because there are few published reports of setting standards via remote meeting technology, little is known about the practical potential of the approach, including technical feasibility of…
Demands Upon Children Regarding Quality of Achievement: Standard Setting in Preschool Classrooms.
ERIC Educational Resources Information Center
Potter, Ellen F.
Focusing particularly on messages transmitted by socializing agents in preschool settings, this exploratory study investigates (1) the incidence of communication events in which standards for achievement are expressed, (2) the nature of the standards, and (3) variations across settings in the nature of standard-setting events. The relationship of…
Science and Art of Setting Performance Standards and Cutoff Scores in Kinesiology
ERIC Educational Resources Information Center
Zhu, Weimo
2013-01-01
Setting standards and cutoff scores is essential to any measurement and evaluation practice. Two evaluation frameworks, norm-referenced (NR) and criterion-referenced (CR), have often been used for setting standards. Although setting fitness standards based on the NR evaluation is relatively easy as long as a nationally representative sample can be…
Standard Setting: A Systematic Approach to Interpreting Student Learning.
ERIC Educational Resources Information Center
DeMars, Christine E.; Sundre, Donna L.; Wise, Steven L.
2002-01-01
Describes workshops designed to set standards for freshman technological literacy at James Madison University (Virginia). Results indicated that about 30% of incoming freshmen could meet the standards set initially; by the end of the year, an additional 50-60% could meet them. Provides recommendations for standard setting in a general education…
Impact of the hard-coded parameters on the hydrologic fluxes of the land surface model Noah-MP
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Samaniego, Luis; Clark, Martyn; Wulfmeyer, Volker; Attinger, Sabine; Thober, Stephan
2016-04-01
Land surface models incorporate a large number of processes, described by physical, chemical and empirical equations. The process descriptions contain a number of parameters that can be soil or plant type dependent and are typically read from tabulated input files. Land surface models may have, however, process descriptions that contain fixed, hard-coded numbers in the computer code, which are not identified as model parameters. Here we searched for hard-coded parameters in the computer code of the land surface model Noah with multiple process options (Noah-MP) to assess the importance of the fixed values on restricting the model's agility during parameter estimation. We found 139 hard-coded values in all Noah-MP process options, which are mostly spatially constant values. This is in addition to the 71 standard parameters of Noah-MP, which mostly get distributed spatially by given vegetation and soil input maps. We performed a Sobol' global sensitivity analysis of Noah-MP to variations of the standard and hard-coded parameters for a specific set of process options. 42 standard parameters and 75 hard-coded parameters were active with the chosen process options. The sensitivities of the hydrologic output fluxes latent heat and total runoff as well as their component fluxes were evaluated. These sensitivities were evaluated at twelve catchments of the Eastern United States with very different hydro-meteorological regimes. Noah-MP's hydrologic output fluxes are sensitive to two thirds of its standard parameters. The most sensitive parameter is, however, a hard-coded value in the formulation of soil surface resistance for evaporation, which proved to be oversensitive in other land surface models as well. Surface runoff is sensitive to almost all hard-coded parameters of the snow processes and the meteorological inputs. These parameter sensitivities diminish in total runoff. Assessing these parameters in model calibration would require detailed snow observations or the calculation of hydrologic signatures of the runoff data. Latent heat and total runoff exhibit very similar sensitivities to standard and hard-coded parameters in Noah-MP because of their tight coupling via the water balance. It should therefore be comparable to calibrate Noah-MP either against latent heat observations or against river runoff data. Latent heat and total runoff are sensitive to both plant and soil parameters. Calibrating a parameter subset of only soil parameters, for example, thus limits the ability to derive realistic model parameters. It is thus recommended to include the most sensitive hard-coded model parameters that were exposed in this study when calibrating Noah-MP.
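For readers who want to reproduce the flavor of the analysis, a Sobol' study of this kind can be set up with the SALib package as sketched below; a cheap stand-in function replaces Noah-MP (the authors ran the real model at this point), and the parameter names and bounds are invented.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical problem definition: two "standard" and one "hard-coded"
# parameter; names and bounds are illustrative only.
problem = {
    "num_vars": 3,
    "names": ["soil_resistance", "snow_albedo", "root_depth"],
    "bounds": [[1.0, 10.0], [0.4, 0.9], [0.1, 2.0]],
}

X = saltelli.sample(problem, 1024)   # Saltelli sampling scheme

def toy_flux(p):                     # stand-in for a latent heat simulation
    return 5.0 / p[0] + 0.5 * p[1] + 0.05 * p[2] ** 2

Y = np.array([toy_flux(row) for row in X])
Si = sobol.analyze(problem, Y)       # first-order and total Sobol' indices
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:16s} first-order={s1:.2f} total={st:.2f}")
```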
Processing on weak electric signals by the autoregressive model
NASA Astrophysics Data System (ADS)
Ding, Jinli; Zhao, Jiayin; Wang, Lanzhou; Li, Qiao
2008-10-01
An autoregressive (AR) model of the weak electric signals in two plants was set up for the first time. The AR model forecasts 10 values of the weak electric signals well. The work will construct a standard set of AR model coefficients for the plant electric signal and the environmental factors, which can serve as the reference settings for an intelligent automatic control system that exploits the adaptive characteristics of plants to save energy in agricultural production.
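A minimal version of the paper's procedure on a synthetic series: fit an autoregressive model and forecast 10 values. The statsmodels AutoReg API is used here as a stand-in; the AR order, coefficients, and noise level are illustrative, not the plant-signal data.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

# Synthetic stand-in for a weak electric signal: a toy AR(2) process.
rng = np.random.default_rng(0)
signal = np.zeros(300)
for t in range(2, 300):
    signal[t] = 0.6 * signal[t-1] - 0.2 * signal[t-2] + rng.normal(0, 0.05)

model = AutoReg(signal, lags=2).fit()
print(model.params)   # fitted AR coefficients: the values that would
                      # enter a standard reference set

# Forecast the next 10 values, as in the paper.
forecast = model.predict(start=len(signal), end=len(signal) + 9)
print(forecast)
```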
Mulkerin, Daniel L; Bergsbaken, Jason J; Fischer, Jessica A; Mulkerin, Mary J; Bohler, Aaron M; Mably, Mary S
2016-10-01
Use of oral chemotherapy is expanding and offers advantages while posing unique safety challenges. ASCO and the Oncology Nursing Society jointly published safety standards for administering chemotherapy that offer a framework for improving oral chemotherapy practice at the University of Wisconsin Carbone Cancer Center. With the goal of improving safety, quality, and uniformity within our oral chemotherapy practice, we conducted a gap analysis comparing our practice against ASCO/Oncology Nursing Society guidelines. Areas for improvement were addressed by multidisciplinary workgroups that focused on education, workflows, and information technology. Recommendations and process changes included defining chemotherapy, standardizing patient and caregiver education, mandating the use of comprehensive electronic order sets, and standardizing documentation for dose modification. Revised processes allow pharmacists to review all orders for oral chemotherapy, and they support monitoring adherence and toxicity by using a library of scripted materials. Between August 2015 and January 2016, revised processes were implemented across the University of Wisconsin Carbone Cancer Center clinics. The following are key performance indicators: 92.5% of oral chemotherapy orders (n = 1,216) were initiated within comprehensive electronic order sets (N = 1,315), 89.2% compliance with informed consent was achieved, 14.7% of orders (n = 193) required an average of 4.4 minutes review time by the pharmacist, and 100% compliance with first-cycle monitoring of adherence and toxicity was achieved. We closed significant gaps between institutional practice and published standards for our oral chemotherapy practice and experienced steady improvement and sustainable performance in key metrics. We created an electronic definition of oral chemotherapies that allowed us to leverage our electronic health records. We believe our tools are broadly applicable.
Managing Interoperability for GEOSS - A Report from the SIF
NASA Astrophysics Data System (ADS)
Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.
2009-04-01
The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.
[Output standard in the mental health services of Reggio Emilia, Italy. Methodological issues].
Grassi, G
2000-01-01
The project "Output Standards" of the Mental Health Department (MHD) of Reggio Emilia was set up to define outputs and quality standards, to guarantee transparency, and to facilitate organizational improvement. The MHD started an interprofessional working group that defined the MHD outputs as well as the process, quality characteristics, indicators, and standards for each output. The MHD Director validated the group's results. The MHD defined 9 outputs with their indicators and standards and consequently modified its data registration system, the way free and partially charged services are supplied, and its budget indicators. As a result, a new instrument for management and quality control has been provided. The author maintains that defining outputs, indicators, and standards will allow several services of the Department to be compared and made homogeneous, and will guarantee and improve quality.
Large Terrain Continuous Level of Detail 3D Visualization Tool
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan
2012-01-01
This software solved the problem of displaying terrains that are usually too large to be displayed on standard workstations in real time. The software can visualize terrain data sets composed of billions of vertices, and can display these data sets at greater than 30 frames per second. The Large Terrain Continuous Level of Detail 3D Visualization Tool allows large terrains, which can be composed of billions of vertices, to be visualized in real time. It utilizes a continuous level of detail technique called clipmapping to support this. It offloads much of the work involved in breaking up the terrain into levels of details onto the GPU (graphics processing unit) for faster processing.
Evaluation of a standard shade guide for color change after disinfection.
Pohjola, Randall M; Hackman, Steven T; Browning, William D
2007-09-01
To determine whether surface disinfectants cause a change in the shade perception of a standard Classic Vitapan shade guide. Consistency in shade selection for dental restorations involves many factors, one of the most important being the shade tabs used in the selection process. Ten shade tabs each of shades B2, D2, C1, and A3.5 were selected from the Classic Vitapan shade guide (Vident). All tabs were measured with the EasyShade device (Vident) at baseline. Three tabs of each shade were set aside as controls. The other 7 tabs of each shade were treated with the surface disinfectant Cavicide (Metrex Research) for 480 cycles to simulate a year's usage. After each 480 cycles, all the tabs were again measured with the EasyShade. This process was repeated to simulate 2 and 3 years of use. The data were analyzed to calculate the CIEDE2000 (delta E 2000) color difference for any change. A statistically significant increase was observed in the value (L*) and chroma (C*) after 2 and 3 years of simulated treatments. These changes were not perceptible to the clinician. The authors suggest that 1 standard shade guide be set aside to compare against those in clinical use to determine when they should be replaced.
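For readers unfamiliar with the metric, delta E 2000 is a color difference computed from L*a*b* coordinates. The full CIEDE2000 formula is lengthy, so the sketch below uses the simpler Euclidean CIE76 difference on hypothetical tab readings to illustrate the idea; the numbers are invented, not from the paper:

```python
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean colour difference in CIELAB (CIE76); the study itself used
    the more elaborate CIEDE2000 formula, which corrects for perceptual
    non-uniformity but takes the same inputs (L*, a*, b*)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical EasyShade readings for one tab at baseline and after
# simulated disinfection cycles (values are illustrative, not the paper's).
baseline = (71.2, 1.5, 18.9)
after_3_years = (72.0, 1.4, 18.3)
print(round(delta_e_cie76(baseline, after_3_years), 2))
# A delta E below roughly 3.3 is commonly treated as clinically imperceptible.
```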
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Stanley T.
2007-01-01
This thesis describes the first search for Standard Model Higgs boson production in association with a top-antitop quark pair in proton-antiproton collisions at a centre-of-mass energy of 1.96 TeV. The integrated luminosity for this search corresponds to 319 pb⁻¹ of data recorded by the Collider Detector at Fermilab. We outline the event selection criteria, evaluate the event acceptance, and estimate backgrounds from Standard Model sources. The events observed satisfying our event selection are consistent with the 2.16 ± 0.66 events expected from background processes. No significant excess of events above background is thus observed, and we set 95% confidence level upper limits on the production cross section for this process as a function of the Higgs mass. For a Higgs boson mass of 115 GeV/c², we find that σ($t\bar{t}H$) × BR(H → $b\bar{b}$) < 690 fb at 95% C.L. These are the first limits set for $t\bar{t}H$ production. This search also allows us to anticipate the challenges and necessary strategies for future searches for $t\bar{t}H$ production.
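The limit-setting machinery in the thesis is more sophisticated than can be shown here, but the flavor of a counting-experiment upper limit can be illustrated with a simple flat-prior Bayesian calculation; the observed count below is illustrative only, not the thesis's result:

```python
import numpy as np
from scipy.stats import poisson

def bayes_upper_limit(n_obs: int, b: float, cl: float = 0.95) -> float:
    """Upper limit on a signal mean s for a counting experiment with known
    background b, using a flat prior on s (a textbook simplification; the
    thesis's actual limit-setting procedure is more sophisticated)."""
    s_grid = np.linspace(0.0, 50.0, 50001)
    likelihood = poisson.pmf(n_obs, s_grid + b)   # P(n_obs | s + b)
    posterior = np.cumsum(likelihood)             # numerical integral over s
    posterior /= posterior[-1]
    return float(s_grid[np.searchsorted(posterior, cl)])

# Illustrative numbers only: 2 events observed over an expected background
# of 2.16 yields a limit on the signal of a few events.
print(round(bayes_upper_limit(2, 2.16), 2))
```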
NASA Astrophysics Data System (ADS)
Chartosias, Marios
Acceptance of Carbon Fiber Reinforced Polymer (CFRP) structures requires a robust surface preparation method with improved process controls capable of ensuring high bond quality. Surface preparation in a production clean room environment prior to applying adhesive for bonding would minimize the risk of contamination and reduce cost. Plasma treatment is a robust surface preparation process capable of being applied in a production clean room environment, with process parameters that are easily controlled and documented. Repeatable and consistent processing is enabled through the development of a process parameter window using techniques such as Design of Experiments (DOE) tailored to specific adhesive and substrate bonding applications. Insight from the respective plasma treatment Original Equipment Manufacturers (OEMs), together with screening tests, distinguished critical process factors from non-factors and set the associated factor levels prior to execution of the DOE. Results from mode I Double Cantilever Beam (DCB) testing per the ASTM D 5528 [1] standard and DOE statistical analysis software are used to produce a regression model and determine appropriate optimum settings for each factor.
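As an illustration of the DOE-to-regression step described above, the sketch below fits a main-effects model to a hypothetical two-level factorial design; the factor names and response values are invented, not the study's data:

```python
import numpy as np

# Hypothetical two-level DOE for plasma treatment (coded -1/+1 levels for
# power, speed, and nozzle height -- factor names are assumptions) with
# mode I fracture toughness G_Ic as the response.
X = np.array([
    [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
    [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
], dtype=float)
g1c = np.array([310, 355, 300, 370, 295, 340, 285, 360], dtype=float)  # J/m^2, made up

# Main-effects regression: g1c ~ b0 + b1*power + b2*speed + b3*height
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, g1c, rcond=None)
print(dict(zip(["intercept", "power", "speed", "height"], coef.round(1))))
# The largest coefficients flag the critical factors; optimum settings sit
# at the levels that maximize the predicted response.
```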
An International Standard Set of Patient-Centered Outcome Measures After Stroke.
Salinas, Joel; Sprinkhuizen, Sara M; Ackerson, Teri; Bernhardt, Julie; Davie, Charlie; George, Mary G; Gething, Stephanie; Kelly, Adam G; Lindsay, Patrice; Liu, Liping; Martins, Sheila C O; Morgan, Louise; Norrving, Bo; Ribbers, Gerard M; Silver, Frank L; Smith, Eric E; Williams, Linda S; Schwamm, Lee H
2016-01-01
Value-based health care aims to bring together patients and health systems to maximize the ratio of quality over cost. To enable assessment of healthcare value in stroke management, an international standard set of patient-centered stroke outcome measures was defined for use in a variety of healthcare settings. A modified Delphi process was implemented with an international expert panel representing patients, advocates, and clinical specialists in stroke outcomes, stroke registers, global health, epidemiology, and rehabilitation to reach consensus on the preferred outcome measures, included populations, and baseline risk adjustment variables. Patients presenting to a hospital with ischemic stroke or intracerebral hemorrhage were selected as the target population for these recommendations, with the inclusion of transient ischemic attacks optional. Outcome categories recommended for assessment were survival and disease control, acute complications, and patient-reported outcomes. Patient-reported outcomes proposed for assessment at 90 days were pain, mood, feeding, self-care, mobility, communication, cognitive functioning, social participation, ability to return to usual activities, and health-related quality of life, with mobility, feeding, self-care, and communication also collected at discharge. One instrument was able to collect most patient-reported subdomains (9/16, 56%). Minimum data collection for risk adjustment included patient demographics, premorbid functioning, stroke type and severity, vascular and systemic risk factors, and specific treatment/care-related factors. A consensus stroke measure Standard Set was developed as a simple, pragmatic method to increase the value of stroke care. The set should be validated in practice when used for monitoring and comparisons across different care settings. © 2015 The Authors.
Ganz, David A.; Alkema, Gretchen E.; Wu, Shinyi
2013-01-01
Systematic evidence reviews support the efficacy of physical activity programs and multifactorial strategies for fall prevention. However, community settings where fall prevention programs occur often differ substantially from the research settings in which efficacy was first demonstrated. Because of these differences, alternative approaches are needed to judge the adequacy of fall prevention activities occurring as part of standard medical care or community efforts. This paper uses the World Health Organization Innovative Care for Chronic Conditions (ICCC) framework to rethink how fall prevention programs might be implemented routinely in both medical and community settings. We highlight examples of innovative programs and policies that provide fall prevention strategies consistent with the ICCC framework, and provide evidence where available on the effects of these strategies on processes and outcomes of care. We close by proposing a “no wrong door” approach to fall prevention and management, in which older adults who are found to be at risk for falls in either a medical or community setting are linked to a standard fall risk evaluation across three domains (physical activity, medical risks and home safety). PMID:18676787
Graph theory for feature extraction and classification: a migraine pathology case study.
Jorge-Hernandez, Fernando; Garcia Chimeno, Yolanda; Garcia-Zapirain, Begonya; Cabrera Zubizarreta, Alberto; Gomez Beldarrain, Maria Angeles; Fernandez-Ruanova, Begonya
2014-01-01
Graph theory is widely used as a representational form and characterization of brain connectivity networks, as is machine learning for classifying groups based on features extracted from images. Many of these studies use different techniques, such as preprocessing, correlations, features or algorithms. This paper proposes an automatic tool to perform a standard process using images from a Magnetic Resonance Imaging (MRI) machine. The process includes pre-processing, building the graph for each subject with different correlations and atlases, extracting relevant features according to the literature, and finally providing a set of machine learning algorithms that can produce analyzable results for physicians or specialists. In order to verify the process, a set of images from prescription drug abusers and patients with migraine was used. The proper functioning of the tool was thus demonstrated, with classification success rates of 87% and 92%, depending on the classifier used.
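A rough sketch of such a pipeline, assuming networkx and scikit-learn and using synthetic data in place of MRI-derived regional time series, might be:

```python
import numpy as np
import networkx as nx
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def graph_features(time_series: np.ndarray, threshold: float = 0.5):
    """Build a connectivity graph from regional time series (one row per
    brain region) by thresholding the correlation matrix, then extract a
    few standard graph-theoretic features."""
    corr = np.corrcoef(time_series)
    adj = (np.abs(corr) > threshold).astype(int)
    np.fill_diagonal(adj, 0)
    g = nx.from_numpy_array(adj)
    return [
        nx.density(g),
        nx.average_clustering(g),
        np.mean([d for _, d in g.degree()]),
    ]

# Synthetic stand-in data: 20 subjects x 16 regions x 100 time points.
X = np.array([graph_features(rng.normal(size=(16, 100))) for _ in range(20)])
y = rng.integers(0, 2, size=20)  # 0 = control, 1 = migraine (labels made up)
clf = RandomForestClassifier(random_state=0).fit(X, y)
print(clf.score(X, y))  # training accuracy on toy data, not a real result
```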
Development of a Replacement for Trichloroethylene in the Two-Stage Cleaning Process
1992-12-01
Isopropyl alcohol, d-limonene, and a synthetic mineral spirits were chosen as candidate replacements for trichloroethylene. The approach selected was to soil test specimens with representative soils, clean them by the candidate processes, and compare cleanliness using Auger-determined carbon/iron ratios; specimens cleaned by the candidates were found to be as clean as those cleaned by the standard two-stage process.
[Global immunization policies and recommendations: objectives and process].
Duclos, Philippe; Okwo-Bele, Jean-Marie
2007-04-01
The World Health Organization (WHO) has a dual mandate of providing global policies, standards and norms as well as support for member countries in applying such policies and standards to national programmes, with the aim of improving health. The vaccine world is changing, and with it the demands and expectations of global and national policy makers, donors, and other interested parties. Changes pertain to new vaccine and technology developments, vaccine safety issues, regulation and approval of vaccines, and increased funding flowing through new financing mechanisms. This places a special responsibility on WHO to respond effectively. WHO has recently reviewed and optimized its policy-making structure for vaccines and immunization and adjusted it to the new Global Immunization Vision and Strategy, which broadens the scope of immunization efforts to all age groups and vaccines, with emphasis on integration of immunization delivery with other health interventions. This includes an extended consultation process to promptly generate evidence-based recommendations, ensuring transparency of the decision-making process, and enhanced communication efforts. This article presents the objectives and impact of the process set up to develop global immunization policies, norms, standards and recommendations. The landscape of key advisory committees contributing to this process is described, including the Strategic Advisory Group of Experts, the Global Advisory Committee on Vaccine Safety and the Expert Committee on Biological Standardization. The elaboration of WHO vaccine position papers is also described.
Standards for reporting qualitative research: a synthesis of recommendations.
O'Brien, Bridget C; Harris, Ilene B; Beckman, Thomas J; Reed, Darcy A; Cook, David A
2014-09-01
Standards for reporting exist for many types of quantitative research, but currently none exist for the broad spectrum of qualitative research. The purpose of the present study was to formulate and define standards for reporting qualitative research while preserving the requisite flexibility to accommodate various paradigms, approaches, and methods. The authors identified guidelines, reporting standards, and critical appraisal criteria for qualitative research by searching PubMed, Web of Science, and Google through July 2013; reviewing the reference lists of retrieved sources; and contacting experts. Specifically, two authors reviewed a sample of sources to generate an initial set of items that were potentially important in reporting qualitative research. Through an iterative process of reviewing sources, modifying the set of items, and coding all sources for items, the authors prepared a near-final list of items and descriptions and sent this list to five external reviewers for feedback. The final items and descriptions included in the reporting standards reflect this feedback. The Standards for Reporting Qualitative Research (SRQR) consists of 21 items. The authors define and explain key elements of each item and provide examples from recently published articles to illustrate ways in which the standards can be met. The SRQR aims to improve the transparency of all aspects of qualitative research by providing clear standards for reporting qualitative research. These standards will assist authors during manuscript preparation, editors and reviewers in evaluating a manuscript for potential publication, and readers when critically appraising, applying, and synthesizing study findings.
An innovative system for 3D clinical photography in the resource-limited settings.
Baghdadchi, Saharnaz; Liu, Kimberly; Knapp, Jacquelyn; Prager, Gabriel; Graves, Susannah; Akrami, Kevan; Manuel, Rolanda; Bastos, Rui; Reid, Erin; Carson, Dennis; Esener, Sadik; Carson, Joseph; Liu, Yu-Tsueng
2014-06-15
Kaposi's sarcoma (KS) is the most frequently occurring cancer in Mozambique among men and the second most frequently occurring cancer among women. Effective therapeutic treatments for KS are poorly understood in this area. There is an unmet need to develop a simple but accurate tool for improved monitoring and diagnosis in a resource-limited setting. Standardized clinical photographs have been considered to be an essential part of the evaluation. When a therapeutic response is achieved, nodular KS often exhibits a reduction of the thickness without a change in the base area of the lesion. To evaluate the vertical space along with other characteristics of a KS lesion, we have created an innovative imaging system with a consumer light-field camera attached to a miniature "photography studio" adaptor. The image file can be further processed by computational methods for quantification. With this novel imaging system, each high-quality 3D image was consistently obtained with a single camera shot at bedside by minimally trained personnel. After computational processing, all-focused photos and measurable 3D parameters were obtained. More than 80 KS image sets were processed in a semi-automated fashion. In this proof-of-concept study, the feasibility of using a simple, low-cost and user-friendly system has been established for a future clinical study to monitor KS therapeutic response. This 3D imaging system can also be applied to obtain standardized clinical photographs for other diseases.
Gallo, Stephen A.; Carpenter, Afton S.; Glisson, Scott R.
2013-01-01
Teleconferencing as a setting for scientific peer review is an attractive option for funding agencies, given the substantial environmental and cost savings. Despite this, there is a paucity of published data validating teleconference-based peer review compared to the face-to-face process. Our aim was to conduct a retrospective analysis of scientific peer review data to investigate whether review setting has an effect on review process and outcome measures. We analyzed reviewer scoring data from a research program that had recently modified the review setting from face-to-face to a teleconference format with minimal changes to the overall review procedures. This analysis included approximately 1600 applications over a 4-year period: two years of face-to-face panel meetings compared to two years of teleconference meetings. The average overall scientific merit scores, score distribution, standard deviations and reviewer inter-rater reliability statistics were measured, as well as reviewer demographics and length of time discussing applications. The data indicate that few differences are evident between face-to-face and teleconference settings with regard to average overall scientific merit score, scoring distribution, standard deviation, reviewer demographics or inter-rater reliability. However, some difference was found in the discussion time. These findings suggest that most review outcome measures are unaffected by review setting, which would support the trend of using teleconference reviews rather than face-to-face meetings. However, further studies are needed to assess any correlations among discussion time, application funding and the productivity of funded research projects. PMID:23951223
Maserat, Elham; Seied Farajollah, Seiede Sedigheh; Safdari, Reza; Ghazisaeedi, Marjan; Aghdaei, Hamid Asadzadeh; Zali, Mohammad Reza
2015-01-01
Colorectal cancer is a major cause of morbidity and mortality throughout the world. Colorectal cancer screening is an optimal way of reducing morbidity and mortality, and a clinical decision support system (CDSS) plays an important role in predicting the success of screening processes. A CDSS is a computer-based information system that improves the delivery of preventive care services. The aim of this article was to detail the engineering of information requirements and the workflow design of a CDSS for a colorectal cancer screening program. In the first stage, a screening minimum data set was determined; screening programs in developed and developing countries were analyzed to identify this data set. Information deficiencies and gaps were then determined using a checklist. The second stage was a qualitative survey with a semi-structured interview as the study tool, covering the perspectives of 15 users and stakeholders on the CDSS workflow. Finally, the workflow of the DSS for the control program was designed according to standard clinical practice guidelines and these perspectives. The screening minimum data set of the national colorectal cancer screening program was defined in five sections: colonoscopy, surgery, pathology, genetics, and pedigree data sets. Deficiencies and information gaps were analyzed, a standard screening work process was designed, and the DSS workflow and data entry stage were determined. A CDSS facilitates complex decision making for screening and has a key role in designing optimal interactions between colonoscopy, pathology and laboratory departments. Workflow analysis is also useful for identifying data reconciliation strategies to address documentation gaps. Following the recommendations of a CDSS should improve the quality of colorectal cancer screening.
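The five-section minimum data set lends itself to a simple typed schema. The sketch below is illustrative only; the field names are assumptions, not the published data elements:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Minimal sketch of the five-section screening minimum data set named in
# the abstract; individual fields are illustrative assumptions.

@dataclass
class ColonoscopyRecord:
    date: str
    findings: str
    polyps_removed: int = 0

@dataclass
class ScreeningMinimumDataSet:
    patient_id: str
    colonoscopy: Optional[ColonoscopyRecord] = None
    surgery: List[str] = field(default_factory=list)
    pathology: List[str] = field(default_factory=list)
    genetics: List[str] = field(default_factory=list)
    pedigree: List[str] = field(default_factory=list)

record = ScreeningMinimumDataSet(patient_id="P-001")
record.colonoscopy = ColonoscopyRecord(date="2015-03-02", findings="normal")
```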
Reporting standards for studies of diagnostic test accuracy in dementia
Noel-Storr, Anna H.; McCleery, Jenny M.; Richard, Edo; Ritchie, Craig W.; Flicker, Leon; Cullum, Sarah J.; Davis, Daniel; Quinn, Terence J.; Hyde, Chris; Rutjes, Anne W.S.; Smailagic, Nadja; Marcus, Sue; Black, Sandra; Blennow, Kaj; Brayne, Carol; Fiorivanti, Mario; Johnson, Julene K.; Köpke, Sascha; Schneider, Lon S.; Simmons, Andrew; Mattsson, Niklas; Zetterberg, Henrik; Bossuyt, Patrick M.M.; Wilcock, Gordon
2014-01-01
Objective: To provide guidance on standards for reporting studies of diagnostic test accuracy for dementia disorders. Methods: An international consensus process on reporting standards in dementia and cognitive impairment (STARDdem) was established, focusing on studies presenting data from which sensitivity and specificity were reported or could be derived. A working group led the initiative through 4 rounds of consensus work, using a modified Delphi process and culminating in a face-to-face consensus meeting in October 2012. The aim of this process was to agree on how best to supplement the generic standards of the STARD statement to enhance their utility and encourage their use in dementia research. Results: More than 200 comments were received during the wider consultation rounds. The areas at most risk of inadequate reporting were identified and a set of dementia-specific recommendations to supplement the STARD guidance were developed, including better reporting of patient selection, the reference standard used, avoidance of circularity, and reporting of test-retest reliability. Conclusion: STARDdem is an implementation of the STARD statement in which the original checklist is elaborated and supplemented with guidance pertinent to studies of cognitive disorders. Its adoption is expected to increase transparency, enable more effective evaluation of diagnostic tests in Alzheimer disease and dementia, contribute to greater adherence to methodologic standards, and advance the development of Alzheimer biomarkers. PMID:24944261
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaboud, M.; Aad, G.; Abbott, B.
A search for W' bosons in events with one lepton (electron or muon) and missing transverse momentum is presented. The search uses 3.2 fb⁻¹ of pp collision data collected at √s = 13 TeV by the ATLAS experiment at the LHC in 2015. The transverse mass distribution is examined and no significant excess of events above the level expected from Standard Model processes is observed. Upper limits on the W' boson cross-section times branching ratio to leptons are set as a function of the W' mass. Within the Sequential Standard Model, W' masses below 4.07 TeV are excluded at the 95% confidence level. This extends the limit set using LHC data at √s = 8 TeV by around 800 GeV.
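The transverse mass examined in this search is a standard, well-defined quantity; a small helper that computes it, with an invented example event, is:

```python
import math

def transverse_mass(pt_lepton: float, met: float, dphi: float) -> float:
    """Transverse mass of the lepton + missing-momentum system, the usual
    discriminating variable in W' -> lepton + neutrino searches:
    mT = sqrt(2 * pT(lepton) * ET(miss) * (1 - cos(dphi)))."""
    return math.sqrt(2.0 * pt_lepton * met * (1.0 - math.cos(dphi)))

# Illustrative event (numbers made up): a 1.8 TeV lepton back-to-back with
# 1.8 TeV of missing transverse momentum gives mT of 3.6 TeV.
print(round(transverse_mass(1800.0, 1800.0, math.pi), 1))  # GeV
```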
2013-02-25
This final rule sets forth standards for health insurance issuers consistent with title I of the Patient Protection and Affordable Care Act, as amended by the Health Care and Education Reconciliation Act of 2010, referred to collectively as the Affordable Care Act. Specifically, this final rule outlines Exchange and issuer standards related to coverage of essential health benefits and actuarial value. This rule also finalizes a timeline for qualified health plans to be accredited in Federally-facilitated Exchanges and amends regulations providing an application process for the recognition of additional accrediting entities for purposes of certification of qualified health plans.
Lopez, Amy
2014-10-01
Information and communication technologies (ICTs) are becoming essential to social work practice by providing increased treatment possibilities and reducing barriers to service. While recognizing the importance of ICTs in practice, social work practitioners have had concerns about ethical use. In response, NASW compiled the Standards for Technology and Social Work Practice. While the guidelines set the groundwork, they were not embedded in a process that would allow them to adapt to the swift pace of ICT changes. This article reviews the current Standards, evaluates how these have been implemented by practitioners, and offers suggestions for updates.
Daskalou, Efstratia; Galli-Tsinopoulou, Assimina; Karagiozoglou-Lampoudi, Thomais; Augoustides-Savvopoulou, Persefone
2016-01-01
Malnutrition is a frequent finding in pediatric health care settings in the form of undernutrition or excess body weight. Its increasing prevalence and impact on overall health status, which is reflected in the adverse outcomes, renders imperative the application of commonly accepted and evidence-based practices and tools by health care providers. Nutrition risk screening on admission and nutrition status evaluation are key points during clinical management of hospitalized pediatric patients, in order to prevent health deterioration that can lead to serious complications and growth consequences. In addition, anthropometric data based on commonly accepted universal growth standards can give accurate results for nutrition status. Both nutrition risk screening and nutrition status assessment are techniques that should be routinely implemented, based on commonly accepted growth standards and methodology, and linked to clinical outcomes. The aim of the present review was to address the issue of hospital malnutrition in pediatric settings in terms of prevalence, outline nutrition status evaluation and the nutrition screening process using different criteria and available tools, and present its relationship with outcome measures. Key teaching points:
• Malnutrition (underweight or excess body weight) is a frequent imbalance in pediatric settings that affects physical growth and results in undesirable clinical outcomes.
• Anthropometry interpretation through growth charts and nutrition screening are cornerstones for the assessment of malnutrition. To date, no commonly accepted anthropometric criteria or nutrition screening tools are used in hospitalized pediatric patients.
• Commonly accepted nutrition status and screening processes based on the World Health Organization's growth standards can contribute to the overall hospital nutrition care of pediatric patients.
Validation results of specifications for motion control interoperability
NASA Astrophysics Data System (ADS)
Szabo, Sandor; Proctor, Frederick M.
1997-01-01
The National Institute of Standards and Technology (NIST) is participating in the Department of Energy Technologies Enabling Agile Manufacturing (TEAM) program to establish interface standards for machine tool, robot, and coordinate measuring machine controllers. At NIST, the focus is to validate potential application programming interfaces (APIs) that make it possible to exchange machine controller components with a minimal impact on the rest of the system. This validation is taking place in the enhanced machine controller (EMC) consortium and is in cooperation with users and vendors of motion control equipment. An area of interest is motion control, including closed-loop control of individual axes and coordinated path planning. Initial tests of the motion control APIs are complete. The APIs were implemented on two commercial motion control boards that run on two different machine tools. The results for a baseline set of APIs look promising, but several issues were raised. These include resolving differing approaches in how motions are programmed and defining a standard measurement of performance for motion control. This paper starts with a summary of the process used in developing a set of specifications for motion control interoperability. Next, the EMC architecture and its classification of motion control APIs into two classes, Servo Control and Trajectory Planning, are reviewed. Selected APIs are presented to explain the basic functionality and some of the major issues involved in porting the APIs to other motion controllers. The paper concludes with a summary of the main issues and ways to continue the standards process.
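The paper does not publish the API signatures themselves, but the two-class split it describes can be sketched as abstract interfaces; all method names below are illustrative guesses, not the actual EMC/TEAM specifications:

```python
from abc import ABC, abstractmethod
from typing import Sequence

# Hedged sketch of the two API classes named in the paper -- Servo Control
# and Trajectory Planning. Method names and signatures are assumptions.

class ServoControl(ABC):
    """Closed-loop control of an individual axis."""
    @abstractmethod
    def set_gains(self, p: float, i: float, d: float) -> None: ...
    @abstractmethod
    def command_position(self, axis: int, position: float) -> None: ...
    @abstractmethod
    def read_feedback(self, axis: int) -> float: ...

class TrajectoryPlanner(ABC):
    """Coordinated multi-axis path planning."""
    @abstractmethod
    def move_linear(self, target: Sequence[float], feed_rate: float) -> None: ...
    @abstractmethod
    def move_circular(self, center: Sequence[float],
                      target: Sequence[float], feed_rate: float) -> None: ...

# A vendor motion board would be wrapped in concrete subclasses, letting the
# rest of the controller be exchanged with minimal impact -- the
# interoperability goal the paper set out to validate.
```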
Automated feature detection and identification in digital point-ordered signals
Oppenlander, Jane E.; Loomis, Kent C.; Brudnoy, David M.; Levy, Arthur J.
1998-01-01
A computer-based automated method to detect and identify features in digital point-ordered signals. The method is used for processing non-destructive test signals, such as eddy current signals obtained from calibration standards. The signals are first automatically processed to remove noise and to determine a baseline. Next, features are detected in the signals using mathematical morphology filters. Finally, the features are verified using an expert system of pattern recognition methods and geometric criteria. The method has the advantage that standard features can be located without prior knowledge of the number or sequence of the features. Further advantages are that standard features can be differentiated from irrelevant signal features such as noise, and detected features are automatically verified by parameters extracted from the signals. The method proceeds fully automatically, without initial operator set-up and without subjective operator judgement of features.
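A minimal sketch of the noise-removal, baseline, and morphology-detection steps, assuming scipy and a synthetic signal in place of a real eddy current trace:

```python
import numpy as np
from scipy.ndimage import grey_opening, uniform_filter1d

rng = np.random.default_rng(1)

# Synthetic point-ordered signal: drifting baseline, noise, and two narrow
# "standard" features (a real input would be an eddy current trace).
x = np.linspace(0, 10, 1000)
signal = 0.3 * x + rng.normal(0, 0.05, x.size)
signal[300:310] += 2.0
signal[700:710] += 2.0

smoothed = uniform_filter1d(signal, size=9)   # noise removal
baseline = grey_opening(smoothed, size=51)    # morphological baseline estimate
residual = smoothed - baseline

# Features are contiguous runs where the residual exceeds a threshold; a
# subsequent verification step (the paper's expert system) would check
# geometric criteria on each run.
feature_idx = np.flatnonzero(residual > 1.0)
print(np.split(feature_idx, np.flatnonzero(np.diff(feature_idx) > 1) + 1))
```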
The caBIG Terminology Review Process
Cimino, James J.; Hayamizu, Terry F.; Bodenreider, Olivier; Davis, Brian; Stafford, Grace A.; Ringwald, Martin
2009-01-01
The National Cancer Institute (NCI) is developing an integrated biomedical informatics infrastructure, the cancer Biomedical Informatics Grid (caBIG®), to support collaboration within the cancer research community. A key part of the caBIG architecture is the establishment of terminology standards for representing data. In order to evaluate the suitability of existing controlled terminologies, the caBIG Vocabulary and Data Elements Workspace (VCDE WS) working group has developed a set of criteria that serve to assess a terminology's structure, content, documentation, and editorial process. This paper describes the evolution of these criteria and the results of their use in evaluating four standard terminologies: the Gene Ontology (GO), the NCI Thesaurus (NCIt), the Common Terminology for Adverse Events (known as CTCAE), and the laboratory portion of the Logical Observation Identifiers Names and Codes (LOINC). The resulting caBIG criteria are presented as a matrix that may be applicable to any terminology standardization effort. PMID:19154797
SU-B-213-03: Evaluation of Graduate Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, B.
2015-06-15
The North American medical physics community validates the education received by medical physicists and the clinical qualifications for medical physicists through accreditation of educational programs and certification of medical physicists. Medical physics educational programs (graduate education and residency education) are accredited by the Commission on Accreditation of Medical Physics Education Programs (CAMPEP), whereas medical physicists are certified by several organizations, the most familiar of which is the American Board of Radiology (ABR). In order for an educational program to become accredited or a medical physicist to become certified, the applicant must meet certain specified standards set by the appropriate organization. In this Symposium, representatives from both CAMPEP and the ABR will describe the process by which standards are established as well as the process by which qualifications of candidates for accreditation or certification are shown to be compliant with these standards. The Symposium will conclude with a panel discussion. Learning Objectives: Recognize the difference between accreditation of an educational program and certification of an individual; Identify the two organizations primarily responsible for these tasks; Describe the development of educational standards; Describe the process by which examination questions are developed. GS is Executive Secretary of CAMPEP.
SU-B-213-04: Evaluation of Residency Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reft, C.
2015-06-15
The North American medical physics community validates the education received by medical physicists and the clinical qualifications for medical physicists through accreditation of educational programs and certification of medical physicists. Medical physics educational programs (graduate education and residency education) are accredited by the Commission on Accreditation of Medical Physics Education Programs (CAMPEP), whereas medical physicists are certified by several organizations, the most familiar of which is the American Board of Radiology (ABR). In order for an educational program to become accredited or a medical physicist to become certified, the applicant must meet certain specified standards set by the appropriate organization. In this Symposium, representatives from both CAMPEP and the ABR will describe the process by which standards are established as well as the process by which qualifications of candidates for accreditation or certification are shown to be compliant with these standards. The Symposium will conclude with a panel discussion. Learning Objectives: Recognize the difference between accreditation of an educational program and certification of an individual; Identify the two organizations primarily responsible for these tasks; Describe the development of educational standards; Describe the process by which examination questions are developed. GS is Executive Secretary of CAMPEP.
SU-B-213-06: Development of ABR Examination Questions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allison, J.
2015-06-15
The North American medical physics community validates the education received by medical physicists and the clinical qualifications for medical physicists through accreditation of educational programs and certification of medical physicists. Medical physics educational programs (graduate education and residency education) are accredited by the Commission on Accreditation of Medical Physics Education Programs (CAMPEP), whereas medical physicists are certified by several organizations, the most familiar of which is the American Board of Radiology (ABR). In order for an educational program to become accredited or a medical physicist to become certified, the applicant must meet certain specified standards set by the appropriate organization. In this Symposium, representatives from both CAMPEP and the ABR will describe the process by which standards are established as well as the process by which qualifications of candidates for accreditation or certification are shown to be compliant with these standards. The Symposium will conclude with a panel discussion. Learning Objectives: Recognize the difference between accreditation of an educational program and certification of an individual; Identify the two organizations primarily responsible for these tasks; Describe the development of educational standards; Describe the process by which examination questions are developed. GS is Executive Secretary of CAMPEP.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Starkschall, G.
2015-06-15
The North American medical physics community validates the education received by medical physicists and the clinical qualifications for medical physicists through accreditation of educational programs and certification of medical physicists. Medical physics educational programs (graduate education and residency education) are accredited by the Commission on Accreditation of Medical Physics Education Programs (CAMPEP), whereas medical physicists are certified by several organizations, the most familiar of which is the American Board of Radiology (ABR). In order for an educational program to become accredited or a medical physicist to become certified, the applicant must meet certain specified standards set by the appropriate organization. In this Symposium, representatives from both CAMPEP and the ABR will describe the process by which standards are established as well as the process by which qualifications of candidates for accreditation or certification are shown to be compliant with these standards. The Symposium will conclude with a panel discussion. Learning Objectives: Recognize the difference between accreditation of an educational program and certification of an individual; Identify the two organizations primarily responsible for these tasks; Describe the development of educational standards; Describe the process by which examination questions are developed. GS is Executive Secretary of CAMPEP.
Sexual orientation data collection and progress toward Healthy People 2010.
Sell, R L; Becker, J B
2001-06-01
Without scientifically obtained data and published reports, it is difficult to raise awareness and acquire adequate resources to address the health concerns of lesbian, gay, and bisexual Americans. The Department of Health and Human Services must recognize gaps in its information systems regarding sexual orientation data and take immediate steps to monitor and eliminate health disparities as delineated in Healthy People 2010. A paper supported by funding from the Office of the Assistant Secretary for Planning and Evaluation explores these concerns and suggests that the department (1) create work groups to examine the collection of sexual orientation data; (2) create a set of guiding principles to govern the process of selecting standard definitions and measures; (3) recognize that racial/ethnic, immigrant-status, age, socioeconomic, and geographic differences must be taken into account when standard measures of sexual orientation are selected; (4) select a minimum set of standard sexual orientation measures; and (5) develop a long-range strategic plan for the collection of sexual orientation data.
Fluorescence intensity positivity classification of Hep-2 cells images using fuzzy logic
NASA Astrophysics Data System (ADS)
Sazali, Dayang Farzana Abang; Janier, Josefina Barnachea; May, Zazilah Bt.
2014-10-01
Indirect immunofluorescence (IIF) is the gold standard for antinuclear autoantibody (ANA) testing using Hep-2 cells to determine specific diseases. Different classifier algorithms have been proposed in previous works; however, there is still no validated standard set for classifying fluorescence intensity. This paper presents the use of fuzzy logic to classify fluorescence intensity and to determine the positivity of Hep-2 cell serum samples. The algorithm involves pre-processing the images by filtering noise and smoothing, converting the red, green and blue (RGB) color space to the CIELAB (LAB) color space of a luminosity layer "L" and chromaticity layers "a" and "b", extracting the mean values of the lightness and chromaticity layer "a", and classifying them with a fuzzy logic algorithm based on the standard score ranges of ANA fluorescence intensity. Using 100 data sets of positive and intermediate fluorescence intensity to test performance, the fuzzy logic classifier obtained accuracies of 85% and 87% for the intermediate and positive classes, respectively.
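A condensed sketch of the described pipeline, assuming scikit-image for the RGB-to-LAB conversion and hand-rolled triangular membership functions (the breakpoints are placeholders, not the paper's calibrated score ranges):

```python
import numpy as np
from skimage import color

def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_intensity(rgb_image: np.ndarray) -> str:
    """Convert an RGB Hep-2 image to CIELAB, extract the mean lightness (L)
    and chromaticity (a) features used in the paper, and grade positivity
    with fuzzy rules. Membership breakpoints are illustrative placeholders."""
    lab = color.rgb2lab(rgb_image)
    mean_l = lab[..., 0].mean()
    mean_a = lab[..., 1].mean()  # extracted as in the paper; this toy rule grades on L alone
    mu_positive = triangular(mean_l, 40, 70, 100)
    mu_intermediate = triangular(mean_l, 10, 40, 70)
    return "positive" if mu_positive >= mu_intermediate else "intermediate"

# Toy input: a bright, green-dominated fluorescence image in [0, 1] floats.
img = np.zeros((64, 64, 3))
img[..., 1] = 0.8
print(classify_intensity(img))
```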
Gratacós, Jordi; Luelmo, Jesús; Rodríguez, Jesús; Notario, Jaume; Marco, Teresa Navío; de la Cueva, Pablo; Busquets, Manel Pujol; Font, Mercè García; Joven, Beatriz; Rivera, Raquel; Vega, Jose Luis Alvarez; Álvarez, Antonio Javier Chaves; Parera, Ricardo Sánchez; Carrascosa, Jose Carlos Ruiz; Martínez, Fernando José Rodríguez; Sánchez, José Pardo; Olmos, Carlos Feced; Pujol, Conrad; Galindez, Eva; Barrio, Silvia Pérez; Arana, Ana Urruticoechea; Hergueta, Mercedes; Coto, Pablo; Queiro, Rubén
2018-06-01
To define and prioritize standards of care and quality indicators of multidisciplinary care for patients with psoriatic arthritis (PsA). A systematic literature review on PsA standards of care and quality indicators was performed. An expert panel of rheumatologists and dermatologists who provide multidisciplinary care was established. In a consensus meeting, the expert group discussed and developed the standards of care and quality indicators and graded their priority, agreement and also feasibility (only for quality indicators), following qualitative methodology and a Delphi process. Afterwards, these results were discussed with 2 focus groups, 1 with patients and another with health managers. A descriptive analysis is presented. We obtained 25 standards of care (9 of structure, 9 of process, 7 of results) and 24 quality indicators (2 of structure, 5 of process, 17 of results). Standards of care include relevant aspects of the multidisciplinary care of PsA patients, such as an appropriate physical infrastructure and technical equipment, access to nursing care, labs and imaging techniques, other health professionals and treatments, and the development of care plans. Regarding quality indicators, the definition of multidisciplinary care model objectives and referral criteria, the establishment of responsibilities and coordination among professionals, and the active evaluation of patients and data collection were given a high priority. Patients considered all of them important. This set of standards of care and quality indicators for the multidisciplinary care of patients with PsA should help improve the quality of care in these patients.
Use of simulated data sets to evaluate the fidelity of Metagenomicprocessing methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mavromatis, Konstantinos; Ivanova, Natalia; Barry, Kerri
2006-12-01
Metagenomics is a rapidly emerging field of research for studying microbial communities. To evaluate methods presently used to process metagenomic sequences, we constructed three simulated data sets of varying complexity by combining sequencing reads randomly selected from 113 isolate genomes. These data sets were designed to model real metagenomes in terms of complexity and phylogenetic composition. We assembled sampled reads using three commonly used genome assemblers (Phrap, Arachne and JAZZ), and predicted genes using two popular gene finding pipelines (fgenesb and CRITICA/GLIMMER). The phylogenetic origins of the assembled contigs were predicted using one sequence similarity-based (BLAST hit distribution) and two sequence composition-based (PhyloPythia, oligonucleotide frequencies) binning methods. We explored the effects of the simulated community structure and method combinations on the fidelity of each processing step by comparison to the corresponding isolate genomes. The simulated data sets are available online to facilitate standardized benchmarking of tools for metagenomic analysis.
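The construction of such benchmark sets can be sketched in a few lines; the uniform sampling and fixed read length below are simplifying assumptions (real Sanger reads vary in length, and the paper's community structures were deliberately skewed):

```python
import random

def simulate_metagenome(genomes, n_reads, read_len=800, seed=42):
    """Build a simulated metagenomic read set by sampling fixed-length reads
    at random positions from a collection of isolate genome sequences --
    the same basic construction as the paper's benchmark data sets."""
    rng = random.Random(seed)
    names = list(genomes)
    reads = []
    for _ in range(n_reads):
        name = rng.choice(names)  # pick a source genome
        seq = genomes[name]
        start = rng.randrange(0, max(1, len(seq) - read_len))
        reads.append((name, seq[start:start + read_len]))
    return reads

# Toy genomes; in the paper, reads came from 113 sequenced isolates.
toy = {"orgA": "ACGT" * 5000, "orgB": "GGCC" * 5000}
reads = simulate_metagenome(toy, n_reads=100, read_len=200)
print(len(reads), reads[0][0], len(reads[0][1]))
```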
Higher Education Faculty Engagement in a Modified Mapmark Standard Setting
ERIC Educational Resources Information Center
Horst, S. Jeanne; DeMars, Christine E.
2016-01-01
The Mapmark standard setting method was adapted to a higher education setting in which faculty leaders were highly involved. Eighteen university faculty members participated in a day-long standard setting for a general education communications test. In Round 1, faculty set initial cut-scores for each of four student learning objectives. In Rounds…
Current Climate Data Set Documentation Standards: Somewhere between Anagrams and Full Disclosure
NASA Astrophysics Data System (ADS)
Fleig, A. J.
2008-12-01
In the 17th century, scientists concerned with establishing primacy for their discoveries while maintaining control of their intellectual property often published their results as anagrams. Robert Hooke's initial publication in 1676 of his law of elasticity in the form "ceiiinosssttuv", which he revealed two years later as "Ut tensio, sic vis" or "as the extension, so the force", is one of the better known examples, although Galileo, Newton, and many others used the same approach. Fortunately, the idea of open publication in scientific journals subject to peer review as a cornerstone of the scientific method gradually became established and is now the norm. Unfortunately, though, even peer-reviewed publication does not necessarily lead to full disclosure. One example occurs in the production, review and distribution of large-scale data sets of climate variables. Validation papers describe how the data were made in concept but do not provide adequate documentation of the process. Complete provenance of the resulting data sets, including descriptions of the exact input files, processing environment, and actual processing code, is not required as part of the production and archival effort. A user of the data may be assured by the publication and peer review that the data are considered good and usable for scientific investigation, but will not know exactly how the data set was made. The problem with this lack of knowledge may be most apparent when considering questions of climate change. Future measurements of the same geophysical parameter will surely be derived from a different observational system than the one used in creating today's data sets. An obvious task in assessing change between the present and the future data set will be to determine how much of the change is because the parameter changed and how much is because the measurement system changed. This will be hard to do without complete knowledge of how the predecessor data set was made. Automated techniques are being developed that will simplify the creation of much of the provenance information, but there are both cultural and infrastructure problems that discourage provision of complete documentation. It is time to reconsider what the standards for production and documentation of data sets should be. There is only a short window before the loss of knowledge about current data sets associated with human mortality becomes irreversible.
Curtivo, Cátia Panizzon Dal; Funghi, Nathália Bitencourt; Tavares, Guilherme Diniz; Barbosa, Sávio Fujita; Löbenberg, Raimar; Bou-Chacra, Nádia Araci
2015-05-01
In this work, a near-infrared spectroscopy (NIRS) method was used to evaluate the uniformity of dosage units of three commercial batches of captopril 25 mg tablets. The performance of the calibration method was assessed by determination of the Q value (0.9986), the standard error of estimation (C-set SEE = 1.956), the standard error of prediction (V-set SEP = 2.076) and the consistency (106.1%). These results indicated the adequacy of the selected model. The method validation revealed agreement between the reference high-pressure liquid chromatography (HPLC) method and the NIRS method. The process evaluation using the NIRS method showed that the variability was due to common causes and that the process delivered predictable results consistently. Cp and Cpk values were 2.05 and 1.80, respectively. These results revealed a process that was not centered on the target average (100% w/w) within the specified range (85-115%). The probability of failure was 21 in 100 million captopril tablets. NIRS combined with multivariate calibration by partial least squares (PLS) regression allowed the development of a methodology for evaluating the uniformity of dosage units of captopril 25 mg tablets. The statistical process control strategy associated with the NIRS method as a process analytical technology (PAT) played a critical role in understanding the sources and degree of variation and their impact on the process. This approach led to better process understanding and provided a sound scientific basis for its continuous improvement.
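The reported Cp and Cpk follow directly from the specification window and the process mean and spread. A back-of-envelope check, with a mean and sigma inferred for illustration (the abstract does not report them), approximately reproduces the published values:

```python
def process_capability(mean: float, std: float, lsl: float, usl: float):
    """Process capability (Cp) and capability adjusted for centering (Cpk)
    for a specification window [lsl, usl]:
    Cp  = (USL - LSL) / (6 * sigma)
    Cpk = min(USL - mean, mean - LSL) / (3 * sigma)"""
    cp = (usl - lsl) / (6 * std)
    cpk = min(usl - mean, mean - lsl) / (3 * std)
    return cp, cpk

# The captopril assay window is 85-115% of label claim. A mean of ~101.8%
# and sigma of ~2.44% (inferred for illustration) reproduce the reported
# Cp = 2.05 and Cpk = 1.80, consistent with a capable but off-center process.
cp, cpk = process_capability(mean=101.8, std=2.44, lsl=85.0, usl=115.0)
print(round(cp, 2), round(cpk, 2))
```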
Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R
2016-06-01
Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of applying LSS to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunization process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of a different vaccination at each station. After project implementation, the average immunization lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control, with a capability index of 1.18 and a performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
Scale structure: Processing Minimum Standard and Maximum Standard Scalar Adjectives
Frazier, Lyn; Clifton, Charles; Stolterfoht, Britta
2008-01-01
Gradable adjectives denote a function that takes an object and returns a measure of the degree to which the object possesses some gradable property (Kennedy, 1999). Scales, ordered sets of degrees, have begun to be studied systematically in semantics (Kennedy, to appear, Kennedy & McNally, 2005, Rotstein & Winter, 2004). We report four experiments designed to investigate the processing of absolute adjectives with a maximum standard (e.g., clean) and their minimum standard antonyms (dirty). The central hypothesis is that the denotation of an absolute adjective introduces a ‘standard value’ on a scale as part of the normal comprehension of a sentence containing the adjective (the “Obligatory Scale” hypothesis). In line with the predictions of Kennedy and McNally (2005) and Rotstein and Winter (2004), maximum standard adjectives and minimum standard adjectives systematically differ from each other when they are combined with minimizing modifiers like slightly, as indicated by speeded acceptability judgments. An eye movement recording study shows that, as predicted by the Obligatory Scale hypothesis, the penalty due to combining slightly with a maximum standard adjective can be observed during the processing of the sentence; the penalty is not the result of some after-the-fact inferencing mechanism. Further, a type of ‘quantificational variability effect’ may be observed when a quantificational adverb (mostly) is combined with a minimum standard adjective in sentences like The dishes are mostly dirty, which may receive either a degree interpretation (e.g. 80% dirty) or a quantity interpretation (e.g., 80% of the dishes are dirty). The quantificational variability results provide suggestive support for the Obligatory Scale hypothesis by showing that the standard of a scalar adjective influences the preferred interpretation of other constituents in the sentence. PMID:17376422
Ambient-temperature incubation for the field detection of Escherichia coli in drinking water.
Brown, J; Stauber, C; Murphy, J L; Khan, A; Mu, T; Elliott, M; Sobsey, M D
2011-04-01
Escherichia coli is the pre-eminent microbiological indicator used to assess safety of drinking water globally. The cost and equipment requirements for processing samples by standard methods may limit the scale of water quality testing in technologically less developed countries and other resource-limited settings, however. We evaluate here the use of ambient-temperature incubation in detection of E. coli in drinking water samples as a potential cost-saving and convenience measure with applications in regions with high (>25°C) mean ambient temperatures. This study includes data from three separate water quality assessments: two in Cambodia and one in the Dominican Republic. Field samples of household drinking water were processed in duplicate by membrane filtration (Cambodia), Petrifilm™ (Cambodia) or Colilert® (Dominican Republic) on selective media at both standard incubation temperature (35–37°C) and ambient temperature, using up to three dilutions and three replicates at each dilution. Matched sample sets were well correlated with 80% of samples (n = 1037) within risk-based microbial count strata (E. coli CFU 100 ml−1 counts of <1, 1–10, 11–100, 101–1000, >1000), and a pooled coefficient of variation of 17% (95% CI 15–20%) for paired sample sets across all methods. These results suggest that ambient-temperature incubation of E. coli in at least some settings may yield sufficiently robust data for water safety monitoring where laboratory or incubator access is limited.
Mossman, Kenneth L
2009-08-01
Standard-setting agencies such as the U.S. Nuclear Regulatory Commission and the U.S. Environmental Protection Agency depend on advice from external expert advisory groups on matters of public policy and standard-setting. Authoritative bodies including the National Research Council and the National Council on Radiation Protection and Measurements provide analyses and recommendations that enable the technical and scientific soundness in decision-making. In radiological protection the nature of the scientific evidence is such that risk assessment at radiation doses typically encountered in environmental and occupational settings is highly uncertain, and several policy alternatives are scientifically defensible. The link between science and policy is problematic. The fundamental issue is the failure to properly consider risk assessment, risk communication, and risk management and then consolidate them in a process that leads to sound policy. Authoritative bodies should serve as unbiased brokers of policy choices by providing balanced and objective scientific analyses. As long as the policy-decision environment is characterized by high scientific uncertainty and a lack of values consensus, advisory groups should present unbiased evaluations of all scientifically plausible alternatives and recommend selection criteria that decision makers can use in the policy-setting process. To do otherwise (e.g., by serving as single position advocates) weakens decision-making by eliminating options and narrowing discussions of scientific perspectives. Understanding uncertainties and the limitations on available scientific information and conveying such information to policy makers remain key challenges for the technical and policy communities.
Anand, K; Saini, Ks; Chopra, Y; Binod, Sk
2010-07-01
'Medical devices' include everything from highly sophisticated, computerized medical equipment right down to simple wooden tongue depressors. Regulations embody the public's expectations for how buildings and facilities are expected to perform and as such represent public policy. Regulators, who develop and enforce regulations, are empowered to act in the public's interest to set this policy and are ultimately responsible to the public in this regard. Standardization contributes to the basic infrastructure that underpins society, including health and environment, while promoting sustainability and good regulatory practice. The international organizations that produce International Standards are the International Electrotechnical Commission (IEC), the International Organization for Standardization (ISO), and the International Telecommunication Union (ITU). With the increasing globalization of markets, International Standards (as opposed to regional or national standards) have become critical to the trading process, ensuring a level playing field for exports and ensuring that imports meet internationally recognized levels of performance and safety. Standards are developed in response to sectors and stakeholders that express a clearly established need for them. An industry sector or other stakeholder group typically communicates its requirement for standards to one of the national members. To be accepted for development, a proposed work item must receive the majority support of the participating members, who verify the global relevance of the proposed item. The regulatory authority (RA) should provide a method for the recognition of international voluntary standards and for public notification of such recognition. The process of recognition may vary from country to country; recognition may occur through periodic publication of lists of standards that a regulatory authority has found will meet the Essential Principles. In conclusion, international standards, such as basic standards, group standards, and product standards, are a tool for harmonizing regulatory processes to assure the safety, quality, and performance of medical devices. Standards represent the opinion of experts from all interested parties, including industry, regulators, users, and others.
Godson, Richard H.
1974-01-01
GEOPAC consists of a series of subroutines designed primarily to process potential-field geophysical data, although other types of data can also be used with the program. The package contains routines to reduce, store, process, and display information in two-dimensional or three-dimensional form. Input and output formats are standardized, and temporary disk storage permits data sets to be processed by several subroutines in one job step. The subroutines are link-edited in an overlay mode to form one program, and they can be executed by submitting a card containing the subroutine name in the input stream.
SAR matrices: automated extraction of information-rich SAR tables from large compound data sets.
Wassermann, Anne Mai; Haebel, Peter; Weskamp, Nils; Bajorath, Jürgen
2012-07-23
We introduce the SAR matrix data structure that is designed to elucidate SAR patterns produced by groups of structurally related active compounds, which are extracted from large data sets. SAR matrices are systematically generated and sorted on the basis of SAR information content. Matrix generation is computationally efficient and enables processing of large compound sets. The matrix format is reminiscent of SAR tables, and SAR patterns revealed by different categories of matrices are easily interpretable. The structural organization underlying matrix formation is more flexible than standard R-group decomposition schemes. Hence, the resulting matrices capture SAR information in a comprehensive manner.
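To make the matrix idea concrete, the following minimal Python sketch organizes a handful of toy (scaffold, substituent, potency) records into a SAR-matrix-like table; the SMILES strings and potency values are invented for illustration, and the published method's systematic fragmentation and SAR-information-based sorting are not reproduced here.

```python
from collections import defaultdict

# Hypothetical illustration of a SAR-matrix-like table: rows are core
# scaffolds, columns are R-group substituents, cells hold compound potency.
compounds = [
    ("c1ccccc1", "OH",  6.2),   # (scaffold, substituent, pKi) -- toy data
    ("c1ccccc1", "Cl",  7.1),
    ("c1ccncc1", "OH",  5.4),
    ("c1ccncc1", "NH2", 8.0),
]

matrix = defaultdict(dict)
for scaffold, r_group, potency in compounds:
    matrix[scaffold][r_group] = potency

# Print the matrix in a SAR-table-like layout; empty cells mark unexplored
# scaffold/substituent combinations ("virtual compounds").
r_groups = sorted({r for row in matrix.values() for r in row})
print("scaffold".ljust(12), *(r.ljust(6) for r in r_groups))
for scaffold, row in matrix.items():
    cells = [f"{row[r]:.1f}".ljust(6) if r in row else "-".ljust(6) for r in r_groups]
    print(scaffold.ljust(12), *cells)
```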
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-25
... to set forth clearer standards and curtail its discretion with respect to breaking erroneous trades... date if it finds such longer period to be appropriate and publishes its reasons for so finding, the... transparency and certainty to the process of breaking trades, and the comment letters that have been submitted...
ERIC Educational Resources Information Center
Wright, Lynne
2003-01-01
Increasingly, coordinators are undertaking a scrutiny of work in order to check standards in their subjects. This is done frequently for English and mathematics, where annual targets for attainment in year 6 have to be set, but less so for science. Carrying out a scrutiny of work can be a daunting and time-consuming process. Faced with a pile of…
Cost Finding Principles and Procedures. Preliminary Field Review Edition. Technical Report 26.
ERIC Educational Resources Information Center
Ziemer, Gordon; And Others
This report is part of the larger Cost Finding Principles Project designed to develop a uniform set of standards, definitions, and alternative procedures that will use accounting and statistical data to find the full cost of resources utilized in the process of producing institutional outputs. This technical report describes preliminary procedures…
ERIC Educational Resources Information Center
Ravitch, Sharon M.
2014-01-01
Within the ever-developing, intersecting, and overlapping contexts of globalization, top-down policy, mandates, and standardization of public and higher education, many conceptualize and position practitioner research as a powerful stance and a tool of social, communal, and educational transformation, a set of methodological processes that…
Interactive Digital Signal Processor
NASA Technical Reports Server (NTRS)
Mish, W. H.
1985-01-01
The Interactive Digital Signal Processor (IDSP) consists of a set of time series analysis "operators" based on various algorithms commonly used for digital signal analysis. Processing of a digital signal time series to extract information is usually achieved by applying a number of fairly standard operations. IDSP is an excellent teaching tool for demonstrating the application of time series operators to artificially generated signals.
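In the spirit of IDSP's operator chains, the short sketch below (a generic illustration, not the original code) generates an artificial signal and applies two fairly standard operators, demeaning and an FFT power spectrum; the sampling rate and signal parameters are arbitrary choices.

```python
import numpy as np

# A minimal sketch in the spirit of IDSP's time-series "operators":
# generate an artificial signal, then chain a few standard operations.
fs = 100.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)

def demean(x):                              # operator 1: remove the mean
    return x - x.mean()

def power_spectrum(x, fs):                  # operator 2: FFT power spectrum
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2 / x.size
    return freqs, power

freqs, power = power_spectrum(demean(signal), fs)
print(f"dominant frequency: {freqs[np.argmax(power)]:.1f} Hz")  # ~5 Hz
```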
Expanding horizons. Integrating environmental health in occupational health nursing.
Rogers, B; Cox, A R
1998-01-01
1. Environmental hazards are ubiquitous. Many exist in the workplace or occur as a result of work process exposures. 2. Environmental health is a natural component of the expanding practice of occupational health nursing. 3. AAOHN's vision for occupational and environmental health will continue to set the standard and provide leadership in the specialty.
Standard and Robust Methods in Regression Imputation
ERIC Educational Resources Information Center
Moraveji, Behjat; Jafarian, Koorosh
2014-01-01
The aim of this paper is to introduce new imputation algorithms for estimating missing values in larger official-statistics data sets during data pre-processing, including data containing outliers. The goal is to propose a new algorithm called IRMI (iterative robust model-based imputation). This algorithm is able to deal with all challenges like…
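The sketch below illustrates the general idea of iterative model-based imputation on toy data; it is a plain least-squares simplification, not the published IRMI algorithm, which adds robust regression and further refinements.

```python
import numpy as np

# A simplified sketch of iterative model-based imputation (the general idea
# behind algorithms such as IRMI; the published method adds robust
# regression and other refinements not shown here).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[:, 2] += 2 * X[:, 0] - X[:, 1]             # make the columns related
mask = rng.random(X.shape) < 0.1             # ~10% missing values
X[mask] = np.nan

Xi = np.where(np.isnan(X), np.nanmean(X, axis=0), X)  # initial fill: column means
for _ in range(10):                          # iterate until (roughly) stable
    for j in range(X.shape[1]):
        miss = np.isnan(X[:, j])
        if not miss.any():
            continue
        others = np.delete(Xi, j, axis=1)
        A = np.c_[np.ones(len(Xi)), others]  # regress column j on the others
        beta, *_ = np.linalg.lstsq(A[~miss], X[~miss, j], rcond=None)
        Xi[miss, j] = A[miss] @ beta         # re-impute from the fitted model

print("first imputed values:", np.round(Xi[mask], 2)[:5])
```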
49 CFR 192.227 - Qualification of welders.
Code of Federal Regulations, 2010 CFR
2010-10-01
... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Welding of Steel in Pipelines § 192.227 Qualification... earlier edition. (b) A welder may qualify to perform welding on pipe to be operated at a pressure that... process to be used, under the test set forth in section I of Appendix C of this part. Each welder who is...
Foundations for Young Children to the Indiana Academic Standards.
ERIC Educational Resources Information Center
Indiana State Dept. of Education, Indianapolis.
Noting that young children need early childhood settings supporting the development of the full range of capacities that will serve as a foundation for future school learning, and that adults have an opportunity and an obligation to assist children in becoming active participants in the learning process, this document details foundations to…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
... regulations allow. Because manufacturers have not yet developed respiratory protection for this occupational...-approved filtering facepiece respirators which are not designed for this use, or no respiratory protection... respiratory protective devices designed for the inhalation hazards of this occupational setting. On July 10...
Corporate corruption of science--the case of chromium(VI).
Egilman, David; Scout
2006-01-01
Corporate infiltration of a panel convened to set standards for chromium(VI) in California, buttressed by the engineered production of dubious "scientific" literature advancing industry's goal, succeeded in skewing the panel's decision to protect industry profits rather than public health. This situation demonstrates the insidious and effective influence of industry on the regulatory process.
Neutral model analysis of landscape patterns from mathematical morphology
Kurt H. Riitters; Peter Vogt; Pierre Soille; Jacek Kozak; Christine Estreguil
2007-01-01
Mathematical morphology encompasses methods for characterizing land-cover patterns in ecological research and biodiversity assessments. This paper reports a neutral model analysis of patterns in the absence of a structuring ecological process, to help set standards for comparing and interpreting patterns identified by mathematical morphology on real land-cover maps. We...
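The following toy sketch (not the authors' code) shows the flavor of a neutral-model experiment: a structure-free random binary map is generated, and mathematical morphology separates core from edge pixels; the map size and 60% cover fraction are arbitrary assumptions.

```python
import numpy as np
from scipy import ndimage

# A minimal neutral-model sketch: generate a random (structure-free) binary
# land-cover map and classify "core" versus "edge" cover pixels with
# mathematical morphology, as a baseline against which real maps are judged.
rng = np.random.default_rng(42)
cover = rng.random((256, 256)) < 0.6          # neutral map: i.i.d. pixels

core = ndimage.binary_erosion(cover)          # interior pixels survive erosion
edge = cover & ~core                          # cover pixels that are not core
print(f"cover: {cover.mean():.3f}, core: {core.mean():.3f}, edge: {edge.mean():.3f}")
# Repeating this over many random maps gives the expected pattern metrics
# under the neutral model, i.e. a standard for interpreting real maps.
```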
Dobecki, Marek
2012-01-01
This paper reviews the requirements for measurement methods of chemical agents in the air at workstations. European standards, which have the status of Polish standards, comprise requirements and information on sampling strategy, measuring techniques, types of samplers, sampling pumps, and methods of occupational exposure evaluation for a given technological process. Measurement methods, including air sampling and the analytical procedure in a laboratory, should be appropriately validated before their intended use. In the validation process, selected methods are tested and an uncertainty budget is set up. This paper presents the validation procedure that should be implemented in the laboratory, together with suitable statistical tools and the major components of uncertainty to be taken into consideration. Methods of quality control, including sampling and laboratory analyses, are discussed. The relative expanded uncertainty of each measurement, expressed as a percentage, should not exceed the limit values set depending on the type of occupational exposure (short-term or long-term) and the magnitude of exposure to chemical agents in the work environment.
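As a rough illustration of an uncertainty budget, the sketch below combines a few assumed uncertainty components by root-sum-of-squares and reports the relative expanded uncertainty; all component values and the coverage factor k = 2 are invented for illustration.

```python
import math

# A minimal sketch of an uncertainty budget: combine major uncertainty
# components by root-sum-of-squares and report the relative expanded
# uncertainty. All component values below are invented for illustration.
result = 48.0            # measured concentration (e.g. mg/m3)
u_sampling = 1.5         # standard uncertainty of air sampling
u_calibration = 0.8      # standard uncertainty of calibration
u_repeatability = 1.0    # standard uncertainty of analytical repeatability

u_combined = math.sqrt(u_sampling**2 + u_calibration**2 + u_repeatability**2)
k = 2                    # coverage factor for ~95% confidence
U_rel = k * u_combined / result * 100
print(f"relative expanded uncertainty: {U_rel:.1f}%")  # compare to the set limit
```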
Sjöstrand, Torbjörn; Ask, Stefan; Christiansen, Jesper R.; ...
2015-02-11
The Pythia program is a standard tool for the generation of events in high-energy collisions, comprising a coherent set of physics models for the evolution from a few-body hard process to a complex multiparticle final state. It contains a library of hard processes, models for initial- and final-state parton showers, matching and merging methods between hard processes and parton showers, multiparton interactions, beam remnants, string fragmentation and particle decays. It also has a set of utilities and several interfaces to external programs. Pythia 8.2 is the second main release after the complete rewrite from Fortran to C++, and has now reached such a maturity that it offers a complete replacement for most applications, notably for LHC physics studies. Lastly, the many new features should allow an improved description of data.
Recommendations for selecting drug-drug interactions for clinical decision support.
Tilson, Hugh; Hines, Lisa E; McEvoy, Gerald; Weinstein, David M; Hansten, Philip D; Matuszewski, Karl; le Comte, Marianne; Higby-Baker, Stefanie; Hanlon, Joseph T; Pezzullo, Lynn; Vieson, Kathleen; Helwig, Amy L; Huang, Shiew-Mei; Perre, Anthony; Bates, David W; Poikonen, John; Wittie, Michael A; Grizzle, Amy J; Brown, Mary; Malone, Daniel C
2016-04-15
Recommendations for including drug-drug interactions (DDIs) in clinical decision support (CDS) are presented. A conference series was conducted to improve CDS for DDIs. A work group consisting of 20 experts in pharmacology, drug information, and CDS from academia, government agencies, health information vendors, and healthcare organizations was convened to address (1) the process to use for developing and maintaining a standard set of DDIs, (2) the information that should be included in a knowledge base of standard DDIs, (3) whether a list of contraindicated drug pairs can or should be established, and (4) how to more intelligently filter DDI alerts. We recommend a transparent, systematic, and evidence-driven process with graded recommendations by a consensus panel of experts and oversight by a national organization. We outline key DDI information needed to help guide clinician decision-making. We recommend judicious classification of DDIs as contraindicated and more research to identify methods to safely reduce repetitive and less-relevant alerts. An expert panel with a centralized organizer or convener should be established to develop and maintain a standard set of DDIs for CDS in the United States. The process should be evidence driven, transparent, and systematic, with feedback from multiple stakeholders for continuous improvement. The scope of the expert panel's work should be carefully managed to ensure that the process is sustainable. Support for research to improve DDI alerting in the future is also needed. Adoption of these steps may lead to consistent and clinically relevant content for interruptive DDIs, thus reducing alert fatigue and improving patient safety. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
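A hypothetical sketch of the severity-based filtering the recommendations describe appears below: only contraindicated pairs trigger an interruptive alert, while lesser interactions surface passively. The drug names and severities are invented placeholders, not clinical knowledge.

```python
# A hypothetical sketch of severity-based DDI alert filtering: only the most
# serious interactions interrupt the clinician, reducing alert fatigue.
# Drug pairs and severities are invented examples, not a clinical knowledge base.
DDI_KB = {
    frozenset({"drugA", "drugB"}): "contraindicated",
    frozenset({"drugA", "drugC"}): "monitor",
}

def check_order(new_drug, active_meds):
    alerts = []
    for med in active_meds:
        severity = DDI_KB.get(frozenset({new_drug, med}))
        if severity == "contraindicated":
            alerts.append((med, severity, "interruptive"))   # hard stop
        elif severity is not None:
            alerts.append((med, severity, "passive"))        # informational
    return alerts

print(check_order("drugA", ["drugB", "drugC", "drugD"]))
```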
Usvyat, Denis; Civalleri, Bartolomeo; Maschio, Lorenzo; Dovesi, Roberto; Pisani, Cesare; Schütz, Martin
2011-06-07
The atomic orbital basis set limit is approached in periodic correlated calculations for solid LiH. The valence correlation energy is evaluated at the level of the local periodic second order Møller-Plesset perturbation theory (MP2), using basis sets of progressively increasing size, and also employing "bond"-centered basis functions in addition to the standard atom-centered ones. Extended basis sets, which contain linear dependencies, are processed only at the MP2 stage via a dual basis set scheme. The local approximation (domain) error has been consistently eliminated by expanding the orbital excitation domains. As a final result, it is demonstrated that the complete basis set limit can be reached for both HF and local MP2 periodic calculations, and a general scheme is outlined for the definition of high-quality atomic-orbital basis sets for solids. © 2011 American Institute of Physics
[Ecological security of wastewater treatment processes: a review].
Yang, Sai; Hua, Tao
2013-05-01
Although the conventional indicators of treated wastewater may meet discharge requirements and reuse standards, this does not mean the effluent is harmless. From a sustainability point of view, comprehensive toxicity should be considered when discharge standards are set, to ensure ecological and human safety. In order to improve the ecological security of wastewater treatment processes, toxicity reduction should be considered when selecting and optimizing treatment processes. This paper reviews research on the ecological security of wastewater treatment processes, with a focus on the purposes of various treatment processes, including processes for special wastewater treatment, for wastewater reuse, and for the safety of receiving waters. Conventional biological treatment combined with advanced oxidation technologies can enhance toxicity reduction on top of pollutant removal, and is worthy of further study. For processes aimed at wastewater reuse, the integration of different process units can combine the advantages of conventional pollutant removal and toxicity reduction. For processes aimed at the ecological security of receiving waters, the emphasis should be put on optimizing process parameters and process unit selection for toxicity reduction. Suggestions regarding problems in current research and future research directions are put forward.
Transforming Oncology Care: Developing a Strategy and Measuring Success.
Reid Ponte, Patricia; Berry, Donna; Buswell, Lori; Gross, Anne; Hayes, Carolyn; Kostka, Judy; Poyner-Reed, Mary; West, Colleen
2016-05-01
To examine accountability and performance measurement in health care and present a case study that illustrates the link between goal setting and measurement and how a strategic plan can provide a framework for metric selection. National reports, literature review and institutional experience. Nurse leaders and clinicians in oncology settings are challenged to anticipate future trends in oncology care and create a culture, infrastructure, and practice environment that supports innovation, advancement of oncology nursing practice and excellence in patient- and family-centered care. Performance metrics assessing key processes and outcomes of care are essential to meet this challenge. With an increasing number of national organizations offering their version of key quality standards and metrics, it is critical for nurses to have a formal process in place to determine and implement the measures most useful in guiding change for a particular clinical setting. Copyright © 2016 Elsevier Inc. All rights reserved.
Selb, Melissa; Gimigliano, Francesca; Prodinger, Birgit; Stucki, Gerold; Pestelli, Germano; Iocco, Maurizio; Boldrini, Paolo
2017-04-01
As part of international efforts to develop and implement national models including the specification of ICF-based clinical data collection tools, the Italian rehabilitation community initiated a project to develop simple, intuitive descriptions of the ICF Rehabilitation Set, highlighting the core concept of each category in user-friendly language. This paper outlines the Italian experience in developing simple, intuitive descriptions of the ICF Rehabilitation Set as an ICF-based clinical data collection tool for Italy. Consensus process. Expert conference. Multidisciplinary group of rehabilitation professionals. The first of a two-stage consensus process involved developing an initial proposal for simple, intuitive descriptions of each ICF Rehabilitation Set category, based on descriptions generated in a similar process in China. Stage two involved a consensus conference. Divided into three working groups, participants discussed and voted (vote A) on whether the initially proposed description of each ICF Rehabilitation Set category was simple and intuitive enough for use in daily practice. Afterwards, the categories with descriptions considered ambiguous, i.e. not simple and intuitive enough, were divided among the working groups, which were asked to propose a new description for the allocated categories. These proposals were then voted on (vote B) in a plenary session. The last step of the consensus conference required each working group to develop a new proposal for each of the categories whose descriptions were still considered ambiguous. Participants then voted (final vote) for which of the three proposed descriptions they preferred. Nineteen clinicians from diverse rehabilitation disciplines and various regions of Italy participated in the consensus process. Three ICF categories already achieved consensus in vote A, while 20 ICF categories were accepted in vote B. The remaining 7 categories were decided in the final vote. The findings were discussed in light of current efforts toward developing strategies for ICF implementation, specifically for the application of an ICF-based clinical data collection tool, not only for Italy but also for the rest of Europe. The descriptions are promising as minimal standards for monitoring the impact of interventions and for standardized reporting of functioning as a relevant outcome in rehabilitation.
Ayuso-Mateos, José L; Avila, Carolina C; Anaya, Celia; Cieza, Alarcos; Vieta, Eduard
2013-01-01
The International Classification of Functioning, Disability and Health (ICF) is a tool of the World Health Organization (WHO) designed to be a guide to identify and classify relevant domains of human experience affected by health conditions. The purpose of this article is to describe the process for the development of two Core Sets for bipolar disorder (BD) in the framework of the ICF. The Comprehensive ICF Core Set for BD intends to be a guide for multidisciplinary assessment of patients diagnosed with this condition, while the Brief ICF Core Set for BD will be useful when rating aspects of patient's experience for clinical practice or epidemiological studies. An international consensus conference involving a sample of experts with different professional backgrounds was performed using the nominal group technique. Various preparatory studies identified a set of 743 potential ICF categories to be included in the Core Sets. A total of 38 ICF categories were selected to be included in the Comprehensive Core Set for BD. A total of 19 ICF categories from the Comprehensive Core Set were chosen as the most significant to constitute the Brief Core Set for BD. The formal consensus process integrating evidence and expert opinion on the ICF led to the formal adoption of the ICF Core Sets for BD. The most important categories included are representative of the characteristics usually associated with BD. The next phase of this ICF project is to conduct a formal validation process to establish its applicability in clinical settings. Implications for Rehabilitation Bipolar disorder (BD) is a prevalent condition that has a great impact on people who suffer it, not only in health but also in daily functioning and quality of life. No standard has been defined so far regarding the problems in functioning of persons with BDs. The process described in this article defines the set of areas of functioning to be addressed in clinical assessments of persons with BD and establish the starting point for the development of condition-specific outcome measures.
The RISC (Reduced Instruction Set Computer) Architecture and Computer Performance Evaluation.
1986-03-01
time where the main emphasis of the evaluation process is put on the software. The model is intended to provide a tool for computer architects to use...program, or 3) Was to be implemented in random logic more effectively than the equivalent sequence of software instructions. Both data and address...definition is the IEEE standard 729-1983 stating Computer Architecture as: "The process of defining a collection of hardware and software components and
Detecting letters in continuous text: effects of display size.
Healy, A F; Oliver, W L; McNamara, T P
1987-05-01
In three letter detection experiments, subjects responded to each instance of the letter t in continuous text typed in a standard paragraph, typed with one to four words per line, or shown for a fixed duration on a computer screen either one or four words at a time. In the multiword and the standard paragraph conditions, errors were greatest and latencies longest on the word the when it was correctly spelled. This effect was diminished or reversed in the one-word conditions. These findings support a set of unitization hypotheses about the reading process, according to which subjects do not process the constituent letters of a word once that word has been identified unless no other word is in view.
CCD Strömvil Photometry of M 37
NASA Astrophysics Data System (ADS)
Boyle, R. P.; Janusz, R.; Kazlauskas, A.; Philip, A. G. Davis
2001-12-01
We have been working on a program of setting up standards in the Strömvil photometric system and have been doing CCD photometry of globular and open clusters. A previous paper (Boyle et al. BAAS, AAS Meeting #193, #68.08) described the results of observations made in the open cluster M 67, which we are setting up as one of the prime standard fields for Strömvil photometry. Now we discuss our observations of M 37, made on the Vatican Advanced Technology Telescope on Mt. Graham, Arizona. One of us (R.J.) has automated the data processing by a novel method. The Strömvil group is multinational. With this innovative automated, yet interactive, processing method, the same processing steps are applied systematically in IRAF by capturing them in HTML files and submitting them to the IRAF command language. Use of the mouse avoids errors and accelerates the processing from raw data frames to calibrated photometry. From several G2 V stars in M 67 we have calculated mean color indices and compared them to stars in M 37 to identify candidate G2 V stars there. Identifying such stars relates to the search for terrestrial exoplanets. Ultimately we will use the calibrated Strömvil indices to make photometric determinations of log g and Teff.
Standards and Students with Disabilities: Reality or Virtual Reality? Brief Report 8.
ERIC Educational Resources Information Center
Saint Cloud State Univ., MN.
This Brief Report highlights current activities focused on setting standards in education, and examines whether students with disabilities are considered when standards are set. Types of standards are distinguished, including performance standards, delivery standards, and content standards. Information on organizations developing standards in…
Data standards can boost metabolomics research, and if there is a will, there is a way.
Rocca-Serra, Philippe; Salek, Reza M; Arita, Masanori; Correa, Elon; Dayalan, Saravanan; Gonzalez-Beltran, Alejandra; Ebbels, Tim; Goodacre, Royston; Hastings, Janna; Haug, Kenneth; Koulman, Albert; Nikolski, Macha; Oresic, Matej; Sansone, Susanna-Assunta; Schober, Daniel; Smith, James; Steinbeck, Christoph; Viant, Mark R; Neumann, Steffen
2016-01-01
Thousands of articles using metabolomics approaches are published every year. With the increasing amounts of data being produced, mere description of investigations as text in manuscripts is not sufficient to enable re-use anymore: the underlying data needs to be published together with the findings in the literature to maximise the benefit from public and private expenditure and to take advantage of an enormous opportunity to improve scientific reproducibility in metabolomics and cognate disciplines. Reporting recommendations in metabolomics started to emerge about a decade ago and were mostly concerned with inventories of the information that had to be reported in the literature for consistency. In recent years, metabolomics data standards have developed extensively, to include the primary research data, derived results and the experimental description and importantly the metadata in a machine-readable way. This includes vendor independent data standards such as mzML for mass spectrometry and nmrML for NMR raw data that have both enabled the development of advanced data processing algorithms by the scientific community. Standards such as ISA-Tab cover essential metadata, including the experimental design, the applied protocols, association between samples, data files and the experimental factors for further statistical analysis. Altogether, they pave the way for both reproducible research and data reuse, including meta-analyses. Further incentives to prepare standards compliant data sets include new opportunities to publish data sets, but also require a little "arm twisting" in the author guidelines of scientific journals to submit the data sets to public repositories such as the NIH Metabolomics Workbench or MetaboLights at EMBL-EBI. In the present article, we look at standards for data sharing, investigate their impact in metabolomics and give suggestions to improve their adoption.
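As a small illustration of the metadata side, the sketch below writes a tab-delimited study table in the general ISA-Tab column style (Source Name, Sample Name, Protocol REF, Factor Value[...]); it is indicative only, since a valid ISA-Tab archive also requires investigation and assay files with prescribed fields.

```python
import csv

# A minimal sketch of ISA-Tab-style metadata: a tab-delimited study table
# linking samples to protocols and experimental factors. Illustrative of the
# general column pattern only; a complete, valid ISA-Tab archive has
# additional investigation/study/assay files and required fields.
rows = [
    {"Source Name": "mouse_01", "Sample Name": "liver_01",
     "Protocol REF": "extraction", "Factor Value[treatment]": "control"},
    {"Source Name": "mouse_02", "Sample Name": "liver_02",
     "Protocol REF": "extraction", "Factor Value[treatment]": "drug"},
]
with open("s_study.txt", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=rows[0].keys(), delimiter="\t")
    writer.writeheader()
    writer.writerows(rows)
```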
2012-01-01
Introduction Promoting health equity is a key goal of many public health systems. However, little is known about how equity is conceptualized in such systems, particularly as standards of public health practice are established. As part of a larger study examining the renewal of public health in two Canadian provinces, Ontario and British Columbia (BC), we undertook an analysis of relevant public health documents related to equity. The aim of this paper is to discuss how equity is considered within documents that outline standards for public health. Methods A research team consisting of policymakers and academics identified key documents related to the public health renewal process in each province. The documents were analyzed using constant comparative analysis to identify key themes related to the conceptualization and integration of health equity as part of public health renewal in Ontario and BC. Documents were coded inductively with higher levels of abstraction achieved through multiple readings. Sets of questions were developed to guide the analysis throughout the process. Results In both sets of provincial documents health inequities were defined in a similar fashion, as the consequence of unfair or unjust structural conditions. Reducing health inequities was an explicit goal of the public health renewal process. In Ontario, addressing “priority populations” was used as a proxy term for health equity and the focus was on existing programs. In BC, the incorporation of an equity lens enhanced the identification of health inequities, with a particular emphasis on the social determinants of health. In both, priority was given to reducing barriers to public health services and to forming partnerships with other sectors to reduce health inequities. Limits to the accountability of public health to reduce health inequities were identified in both provinces. Conclusion This study contributes to understanding how health equity is conceptualized and incorporated into standards for local public health. As reflected in their policies, both provinces have embraced the importance of reducing health inequities. Both conceptualized this process as rooted in structural injustices and the social determinants of health. Differences in the conceptualization of health equity likely reflect contextual influences on the public health renewal processes in each jurisdiction. PMID:22632097
Comparing Pattern Recognition Feature Sets for Sorting Triples in the FIRST Database
NASA Astrophysics Data System (ADS)
Proctor, D. D.
2006-07-01
Pattern recognition techniques have been used with increasing success for coping with the tremendous amounts of data being generated by automated surveys. Usually this process involves construction of training sets, the typical examples of data with known classifications. Given a feature set, along with the training set, statistical methods can be employed to generate a classifier. The classifier is then applied to process the remaining data. Feature set selection, however, is still an issue. This paper presents techniques developed for accommodating data for which a substantive portion of the training set cannot be classified unambiguously, a typical case for low-resolution data. Significance tests on the sort-ordered, sample-size-normalized vote distribution of an ensemble of decision trees are introduced as a method of evaluating the relative quality of feature sets. The technique is applied to comparing feature sets for sorting a particular radio galaxy morphology, bent-doubles, from the Faint Images of the Radio Sky at Twenty Centimeters (FIRST) database. Also examined are alternative functional forms for feature sets. Associated standard deviations provide the means to evaluate the effect of the number of folds, the number of classifiers per fold, and the sample size on the resulting classifications. The technique may also be applied to situations for which, although accurate classifications are available, the feature set is clearly inadequate but it is nonetheless desirable to make the best of available information.
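The toy sketch below (synthetic data, not the FIRST catalog) illustrates the core idea: train a tree ensemble on each candidate feature set and compare the sort-ordered vote distributions; scikit-learn's predict_proba serves here as the per-sample vote fraction.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# A rough sketch of comparing feature sets via the sort-ordered vote
# distribution of a tree ensemble (toy data, not the FIRST catalog).
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def sorted_votes(feature_idx):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train[:, feature_idx], y_train)
    votes = clf.predict_proba(X_test[:, feature_idx])[:, 1]  # fraction of trees voting class 1
    return np.sort(votes)                                    # sort-ordered vote distribution

votes_a = sorted_votes([0, 1, 2, 3])      # candidate feature set A
votes_b = sorted_votes([4, 5, 6, 7])      # candidate feature set B
# A sharper (more bimodal) sorted vote curve indicates a feature set that
# separates the classes more decisively; significance tests on the two
# curves can then rank the feature sets.
print(votes_a[:5], votes_b[:5])
```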
Whittaker, P J; Gollins, H J; Roaf, E J
2014-03-01
Infant male circumcision is practised by many groups for religious and cultural reasons. Prompted by a desire to minimize the complication rate and to help parents identify good quality providers, a quality assurance (QA) process for infant male circumcision providers has been developed in Greater Manchester. Local stakeholders agreed a set of minimum standards, and providers were invited to submit evidence of their practice in relation to these standards. With the participation of parents, community groups, faith groups, healthcare staff and safeguarding partners, an information leaflet for parents was produced. Engagement work with local community groups, faith groups, providers and healthcare staff was vital to ensure that the resources are accessible to parents and that providers continue to engage in the process. Providers that met the QA standards have been listed on a local website. Details of the website are included in the information leaflet distributed by maternity services, health visitors, primary care and community and faith groups. The leaflet is available in seven languages. Local QA processes can be used to encourage and identify good practice and to support parents who need to access services outside the remit of the National Health Service.
Dishwashing water recycling system and related water quality standards for military use.
Church, Jared; Verbyla, Matthew E; Lee, Woo Hyoung; Randall, Andrew A; Amundsen, Ted J; Zastrow, Dustin J
2015-10-01
As the demand for reliable and safe water supplies increases, both water quality and available quantity are being challenged by population growth and climate change. Greywater reuse is becoming a common practice worldwide; however, in remote locations of limited water supply, such as those encountered in military installations, it is desirable to expand its classification to include dishwashing water to maximize the conservation of fresh water. Given that no standards for dishwashing greywater reuse by the military are currently available, the current study determined a specific set of water quality standards for dishwater recycling systems for U.S. military field operations. A tentative water reuse standard for dishwashing water was developed based on federal and state regulations and guidelines for non-potable water, and the developed standard was cross-evaluated by monitoring water quality data from a full-scale dishwashing water recycling system using an innovative electrocoagulation and ultrafiltration process. Quantitative microbial risk assessment (QMRA) was also performed based on exposure scenarios derived from literature data. As a result, a specific set of dishwashing water reuse standards for field analysis (simple, but accurate) was finalized as follows: turbidity (<1 NTU), Escherichia coli (<50 cfu/mL), and pH (6-9). UV254 was recommended as a surrogate for organic contaminants (e.g., BOD5), but requires further calibration steps for validation. The developed specific water standard is the first for dishwashing water reuse and will be expected to ensure that water quality is safe for field operations, but not so stringent that design complexity, cost, and operational and maintenance requirements will not be feasible for field use. In addition, the parameters can be monitored using simple equipment in a field setting with only modest training requirements and real-time or rapid sample turn-around. This standard may prove useful in future development of civilian guidelines. Copyright © 2015 Elsevier B.V. All rights reserved.
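The finalized field standard lends itself to a simple automated check; the sketch below encodes the three stated limits (turbidity < 1 NTU, E. coli < 50 cfu/mL, pH 6-9) and flags failing samples. The function and field names are illustrative.

```python
# A simple field-check sketch against the dishwashing-water reuse standard
# developed in the study: turbidity < 1 NTU, E. coli < 50 cfu/mL, pH 6-9.
STANDARD = {
    "turbidity_ntu": lambda v: v < 1.0,
    "e_coli_cfu_per_ml": lambda v: v < 50,
    "ph": lambda v: 6.0 <= v <= 9.0,
}

def check_sample(sample):
    failures = [name for name, ok in STANDARD.items() if not ok(sample[name])]
    return ("PASS", []) if not failures else ("FAIL", failures)

print(check_sample({"turbidity_ntu": 0.4, "e_coli_cfu_per_ml": 10, "ph": 7.2}))
print(check_sample({"turbidity_ntu": 2.3, "e_coli_cfu_per_ml": 80, "ph": 7.0}))
```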
NASA Astrophysics Data System (ADS)
McDonald, Michael C.; Kim, H. K.; Henry, J. R.; Cunningham, I. A.
2012-03-01
The detective quantum efficiency (DQE) is widely accepted as a primary measure of x-ray detector performance in the scientific community. A standard method for measuring the DQE, based on IEC 62220-1, requires the system to have a linear response, meaning that the detector output signals are proportional to the incident x-ray exposure. However, many systems have a non-linear response due to characteristics of the detector, or post-processing of the detector signals, that cannot be disabled and may involve unknown algorithms considered proprietary by the manufacturer. For these reasons, the DQE has not been considered a practical candidate for routine quality assurance testing in a clinical setting. In this article we describe a method that can be used to measure the DQE of both linear and non-linear systems that employ only linear image processing algorithms. The method was validated on a cesium iodide-based flat-panel system that simultaneously stores a raw (linear) and processed (non-linear) image for each exposure. It was found that the resulting DQE was equivalent, within measurement precision, to a conventional standards-compliant DQE, and that the gray-scale inversion and linear edge enhancement did not affect the DQE result. While not IEC 62220-1 compliant, the method may be adequate for QA programs.
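For orientation, the sketch below evaluates the standard frequency-dependent definition DQE(f) = MTF(f)^2 / (q · NNPS(f)) on synthetic curves; the MTF model, fluence q, and NNPS values are placeholders, not measured detector data or the paper's method.

```python
import numpy as np

# A sketch of the frequency-dependent DQE from its standard definition,
# DQE(f) = MTF(f)^2 / (q * NNPS(f)), where q is the incident photon fluence
# and NNPS is the normalised noise power spectrum. The MTF/NNPS arrays
# below are synthetic placeholders, not measured detector data.
f = np.linspace(0.05, 3.0, 60)            # spatial frequency, cycles/mm
mtf = np.exp(-0.5 * f)                    # toy MTF model
q = 2.5e5                                 # photons per mm^2 (assumed fluence)
nnps = 4.0e-6 * (1 + 0.2 * f)             # toy normalised NPS, mm^2

dqe = mtf**2 / (q * nnps)
print(f"DQE at lowest frequency: {dqe[0]:.2f}")
```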
Product pricing in the Solar Array Manufacturing Industry - An executive summary of SAMICS
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.
1978-01-01
Capabilities, methodology, and a description of input data to the Solar Array Manufacturing Industry Costing Standards (SAMICS) are presented. SAMICS was developed to provide a standardized procedure and data base for comparing manufacturing processes of Low-cost Solar Array (LSA) subcontractors, guide the setting of research priorities, and assess the progress of LSA toward its hundred-fold cost reduction goal. SAMICS can be used to estimate manufacturing costs and product prices and to determine the impact of inflation, taxes, and interest rates, but it is limited in that it ignores the effects of market supply and demand and assumes that all factories operate in a production-line mode. The SAMICS methodology defines the industry structure, hypothetical supplier companies, and manufacturing processes, and maintains a body of standardized data which is used to compute the final product price. The input data include the product description, the process characteristics, the equipment cost factors, and production data for the preparation of detailed cost estimates. Activities validating that SAMICS produces realistic price estimates and cost breakdowns are described.
Oellrich, Anika; Collier, Nigel; Smedley, Damian; Groza, Tudor
2015-01-01
Electronic health records and scientific articles possess differing linguistic characteristics that may impact the performance of natural language processing tools developed for one or the other. In this paper, we investigate the performance of four extant concept recognition tools: the clinical Text Analysis and Knowledge Extraction System (cTAKES), the National Center for Biomedical Ontology (NCBO) Annotator, the Biomedical Concept Annotation System (BeCAS) and MetaMap. Each of the four concept recognition systems is applied to four different corpora: the i2b2 corpus of clinical documents, a PubMed corpus of Medline abstracts, a clinical trials corpus and the ShARe/CLEF corpus. In addition, we assess the individual system performances with respect to one gold standard annotation set, available for the ShARe/CLEF corpus. Furthermore, we built a silver standard annotation set from the individual systems' output and assess the quality as well as the contribution of individual systems to the quality of the silver standard. Our results demonstrate that mainly the NCBO annotator and cTAKES contribute to the silver standard corpora (F1-measures in the range of 21% to 74%) and their quality (best F1-measure of 33%), independent from the type of text investigated. While BeCAS and MetaMap can contribute to the precision of silver standard annotations (precision of up to 42%), the F1-measure drops when combined with NCBO Annotator and cTAKES due to a low recall. In conclusion, the performances of individual systems need to be improved independently from the text types, and the leveraging strategies to best take advantage of individual systems' annotations need to be revised. The textual content of the PubMed corpus, accession numbers for the clinical trials corpus, and assigned annotations of the four concept recognition systems as well as the generated silver standard annotation sets are available from http://purl.org/phenotype/resources. The textual content of the ShARe/CLEF (https://sites.google.com/site/shareclefehealth/data) and i2b2 (https://i2b2.org/NLP/DataSets/) corpora needs to be requested with the individual corpus providers.
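A minimal sketch of the silver-standard construction by majority vote is shown below; annotations are reduced to (span, concept) pairs and the system outputs are invented, whereas the actual harmonisation must also reconcile overlapping spans and differing vocabularies.

```python
from collections import Counter

# A minimal sketch of building a "silver standard" by majority voting over
# the annotations of several concept recognition systems. Annotations are
# simplified to (span, concept) pairs with invented outputs; real
# harmonisation must also reconcile overlapping spans and terminologies.
system_outputs = {
    "cTAKES":  {((0, 8), "C0011849"), ((15, 24), "C0020538")},
    "NCBO":    {((0, 8), "C0011849"), ((30, 38), "C0027051")},
    "BeCAS":   {((0, 8), "C0011849"), ((15, 24), "C0020538")},
    "MetaMap": {((15, 24), "C0020538")},
}

votes = Counter(ann for anns in system_outputs.values() for ann in anns)
threshold = len(system_outputs) // 2 + 1          # strict majority
silver = {ann for ann, n in votes.items() if n >= threshold}
print(silver)   # annotations agreed on by at least 3 of the 4 systems
```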
Henderson, Amanda; Winch, Sarah
2008-01-01
Leadership strategies are important in enabling the nursing profession to reach optimum standards in the practice environment. To compare and contrast the central tenets of contemporary quality initiatives that are commensurate with enabling the environment so that best practice can occur. Democratic leadership, accessible and relevant education and professional development, the incorporation of evidence into practice and the ability of facilities to be responsive to change are core considerations for the successful maintenance of practice standards that are consistent with best nursing practice. While different concerns of management drive the adoption of contemporary approaches, there are many similarities in how these approaches are translated into action in the clinical setting. Managers should focus on core principles of professional nursing that add value to practice rather than business processes.
NASA Astrophysics Data System (ADS)
Rimskog, Magnus; O'Loughlin, Brian J.
2007-02-01
Silex Microsystems handles a wide range of customized MEMS components. This paper describes Silex's MEMS foundry work model for providing customized MEMS-based solutions in a cost-effective and well-controlled manner. Factors for success are the capabilities to translate a customer product concept into manufacturing processes in the wafer fab, using standard process modules and production equipment. A well-controlled system increases the likelihood of first-batch success and enables fast ramp-up into volume production. The following success factors can be listed: strong, enduring relationships with the customers; highly qualified, well-experienced specialists working closely with the customer; process solutions and building blocks ready to use out of a library; addressing manufacturing issues in the early design phase; in-house know-how to meet demands for volume manufacturing; access to a wafer fab with high capacity, good organization, high availability of equipment, and short lead times; and process development done in the manufacturing environment using production equipment for easy ramp-up to volume production. The article covers a method of working that addresses these factors: maintaining long and enduring relationships with customers, utilizing MEMS expertise and working closely with customers to translate their product ideas into MEMS components; having stable process solutions for features such as low-ohmic vias, spiked electrodes, cantilevers, silicon optical mirrors, micro needles, etc., which can be used and modified for customer needs; and using a structured development and design methodology in order to handle hundreds of process modules, with standard run sheets set up. It is also very important to do real-time process development in the manufacturing line, as it minimizes the lead time for the ramp-up of production, and to have access to a state-of-the-art wafer fab which is well organized, controlled and flexible, with high capacity and short lead times for prototypes. It is crucial to have intimate control of processes, equipment, organization, production flow control and WIP. This has been addressed by using a fully computerized control and reporting system.
MTpy - Python Tools for Magnetotelluric Data Processing and Analysis
NASA Astrophysics Data System (ADS)
Krieger, Lars; Peacock, Jared; Thiel, Stephan; Inverarity, Kent; Kirkby, Alison; Robertson, Kate; Soeffky, Paul; Didana, Yohannes
2014-05-01
We present the Python package MTpy, which provides functions for the processing, analysis, and handling of magnetotelluric (MT) data sets. MT is a relatively immature and not widely applied geophysical method in comparison to other geophysical techniques such as seismology. As a result, the data processing within the academic MT community is not thoroughly standardised and is often based on a loose collection of software, adapted to the respective local specifications. We have developed MTpy to overcome problems that arise from missing standards, and to provide a simplification of the general handling of MT data. MTpy is written in Python, and the open-source code is freely available from a GitHub repository. The setup follows the modular approach of successful geoscience software packages such as GMT or Obspy. It contains sub-packages and modules for the various tasks within the standard work-flow of MT data processing and interpretation. In order to allow the inclusion of already existing and well established software, MTpy provides not only pure Python classes and functions, but also wrapping command-line scripts to run standalone tools, e.g. modelling and inversion codes. Our aim is to provide a flexible framework, which is open for future dynamic extensions. MTpy has the potential to promote the standardisation of processing procedures and at the same time be a versatile supplement for existing algorithms. Here, we introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing, interpretation, and visualisation utilising MTpy on example data sets collected over different regions of Australia and the USA.
Quality of haemophilia care in The Netherlands: new standards for optimal care.
Leebeek, Frank W G; Fischer, Kathelijn
2014-04-01
In the Netherlands, the first formal haemophilia comprehensive care centre was established in 1964, and Dutch haemophilia doctors have been organised since 1972. Although several steps were taken to centralise haemophilia care and maintain quality of care, treatment was still delivered in many hospitals, and formal criteria for haemophilia treatment centres as well as a national haemophilia registry were lacking. In collaboration with patients and other stakeholders, Dutch haemophilia doctors have undertaken a formal process to draft new quality standards for the haemophilia treatment centres. First, a project group including doctors, nurses, patients and the institute for harmonisation of quality standards undertook a literature study on quality standards and performed explorative visits to several haemophilia treatment centres in the Netherlands. Afterwards, concept standards were defined and validated in two treatment centres. Next, the concept standards were evaluated by haemophilia doctors, patients, health insurance representatives and regulators. Finally, the final version of the standards of care was approved by the Central Body of Experts on quality standards in clinical care and the Dutch Ministry of Health. A team of expert auditors has been trained and, together with an independent auditor, will perform audits in haemophilia centres applying for formal certification. Concomitantly, a national registry for haemophilia and allied disorders is being set up. It is expected that these processes will lead to further concentration and improved quality of haemophilia care in the Netherlands.
MacDougall, Margaret
2015-10-31
The principal aim of this study is to provide an account of variation in UK undergraduate medical assessment styles and corresponding standard setting approaches with a view to highlighting the importance of a UK national licensing exam in recognizing a common standard. Using a secure online survey system, response data were collected during the period 13 - 30 January 2014 from selected specialists in medical education assessment, who served as representatives for their respective medical schools. Assessment styles and corresponding choices of standard setting methods vary markedly across UK medical schools. While there is considerable consensus on the application of compensatory approaches, individual schools display their own nuances through use of hybrid assessment and standard setting styles, uptake of less popular standard setting techniques and divided views on norm referencing. The extent of variation in assessment and standard setting practices across UK medical schools validates the concern that there is a lack of evidence that UK medical students achieve a common standard on graduation. A national licensing exam is therefore a viable option for benchmarking the performance of all UK undergraduate medical students.
Rose, Bonnie E; Hill, Walter E; Umholtz, Robert; Ransom, Gerri M; James, William O
2002-06-01
The Food Safety and Inspection Service (FSIS) issued Pathogen Reduction; Hazard Analysis and Critical Control Point (HACCP) Systems; Final Rule (the PR/HACCP rule) on 25 July 1996. To verify that industry PR/HACCP systems are effective in controlling the contamination of raw meat and poultry products with human disease-causing bacteria, this rule sets product-specific Salmonella performance standards that must be met by slaughter establishments and establishments producing raw ground products. These performance standards are based on the prevalence of Salmonella as determined from the FSIS's nationwide microbial baseline studies and are expressed in terms of the maximum number of Salmonella-positive samples that are allowed in a given sample set. From 26 January 1998 through 31 December 2000, federal inspectors collected 98,204 samples and 1,502 completed sample sets for Salmonella analysis from large, small, and very small establishments that produced at least one of seven raw meat and poultry products: broilers, market hogs, cows and bulls, steers and heifers, ground beef, ground chicken, and ground turkey. Salmonella prevalence in most of the product categories was lower after the implementation of PR/HACCP than in pre-PR/HACCP baseline studies and surveys conducted by the FSIS. The results of 3 years of testing at establishments of all sizes combined show that >80% of the sample sets met the following Salmonella prevalence performance standards: 20.0% for broilers, 8.7% for market hogs, 2.7% for cows and bulls, 1.0% for steers and heifers, 7.5% for ground beef, 44.6% for ground chicken, and 49.9% for ground turkey. The decreased Salmonella prevalences may partly reflect industry improvements, such as improved process control, incorporation of antimicrobial interventions, and increased microbial-process control monitoring, in conjunction with PR/HACCP implementation.
Wei, Yaxing; Liu, Shishi; Huntzinger, Deborah N.; ...
2014-12-05
Ecosystems are important and dynamic components of the global carbon cycle, and terrestrial biospheric models (TBMs) are crucial tools in further understanding how terrestrial carbon is stored and exchanged with the atmosphere across a variety of spatial and temporal scales. Improving TBM skills, and quantifying and reducing their estimation uncertainties, pose significant challenges. The Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) is a formal multi-scale and multi-model intercomparison effort set up to tackle these challenges. The MsTMIP protocol prescribes standardized environmental driver data that are shared among model teams to facilitate model-model and model-observation comparisons. In this article, we describe the global and North American environmental driver data sets prepared for the MsTMIP activity, both to support their use in MsTMIP and to make these data, along with the processes used in selecting and processing them, accessible to a broader audience. Based on project needs and lessons learned from past model intercomparison activities, we compiled climate, atmospheric CO2 concentrations, nitrogen deposition, land use and land cover change (LULCC), C3/C4 grasses fractions, major crops, phenology and soil data into a standard format for global (0.5° x 0.5° resolution) and regional (North American: 0.25° x 0.25° resolution) simulations. In order to meet the needs of MsTMIP, improvements were made to several of the original environmental data sets by improving the quality and/or changing their spatial and temporal coverage and resolution. The resulting standardized model driver data sets are being used by over 20 different models participating in MsTMIP. Lastly, the data are archived at the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC, http://daac.ornl.gov) to provide long-term data management and distribution.
Garcia Hejl, Carine; Ramirez, Jose Manuel; Vest, Philippe; Chianea, Denis; Renard, Christophe
2014-09-01
Laboratories working towards accreditation by the International Standards Organization (ISO) 15189 standard are required to demonstrate the validity of their analytical methods. The different guidelines set by various accreditation organizations make it difficult to provide objective evidence that an in-house method is fit for the intended purpose. Moreover, the required performance characteristic tests and acceptance criteria are not always detailed. The laboratory must choose the most suitable validation protocol and set the acceptance criteria. Therefore, we propose a validation protocol to evaluate the performance of an in-house method. As an example, we validated the process for the detection and quantification of lead in whole blood by electrothermal absorption spectrometry. The fundamental parameters tested were selectivity, calibration model, precision, accuracy (and uncertainty of measurement), contamination, stability of the sample, reference interval, and analytical interference. We have developed a protocol that has been applied successfully to quantify lead in whole blood by electrothermal atomic absorption spectrometry (ETAAS). In particular, our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics.
Balonov, M; Kashparov, V; Nikolaenko, A; Berkovskyy, V; Fesenko, S
2018-06-01
The article critically examines the practice of post-Chernobyl standardisation of radionuclide concentrations (mainly 137Cs and 90Sr) in food products (FPs) in the USSR and the successor countries of Belarus, Russia and Ukraine. Recommendations are given on potential harmonisation of these standards of radionuclide concentrations in FPs among the three countries, taking into account substantial international experience. We propose to reduce the number of product groups for standardisation purposes from the current amount of several dozens to three to five groups to optimise radiation control and increase the transparency of the process. We recommend five product groups for the standardisation of 137Cs and three groups for 90Sr in food in radiocontaminated areas. The values of standards for individual product groups are recommended to be set proportionally to the measured specific activity in each of these groups, which will reduce unreasonable food rejection. The standards might be set for the entire country, and could be also used to control imports from other countries as well as exports to other countries. The developed recommendations were transferred in 2015-2016 to the regulatory authorities of the three countries.
Standardization of pitch-range settings in voice acoustic analysis.
Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C
2009-05-01
Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
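The sketch below conveys the idea of preset, sex-specific pitch windows driving an automated analysis; it uses a bare-bones autocorrelation estimator, and the window values (75-300 Hz male, 100-500 Hz female) are common rule-of-thumb ranges assumed for illustration, not the study's actual script parameters.

```python
import numpy as np

# A bare-bones autocorrelation pitch estimator with sex-specific preset
# pitch windows, sketching the kind of preset parameters such scripts use.
# The window values are rule-of-thumb ranges, not the study's settings.
PITCH_WINDOWS = {"male": (75, 300), "female": (100, 500)}   # Hz

def estimate_f0(x, fs, sex):
    lo, hi = PITCH_WINDOWS[sex]
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[x.size - 1:]        # autocorrelation
    lag_min, lag_max = int(fs / hi), int(fs / lo)            # preset search window
    lag = lag_min + np.argmax(ac[lag_min:lag_max])
    return fs / lag

fs = 16000
t = np.arange(0, 0.1, 1 / fs)
voiced = np.sin(2 * np.pi * 120 * t) + 0.3 * np.sin(2 * np.pi * 240 * t)
print(f"estimated F0: {estimate_f0(voiced, fs, 'male'):.0f} Hz")   # ~120 Hz
```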
A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.
Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V
2016-07-01
In the age of the information superhighway, big data play a significant role in information processing, extraction, retrieval and management. In computational biology, the continuous challenge is to manage the biological data. Data mining techniques are sometimes inadequate for new space and time requirements. Thus, it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that enfolds the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalance data reduction using a k-nearest neighbor (K-NN) classification approach has been introduced. The pivotal objective of this work is to illustrate real training data sets with a reduced number of elements or instances. These reduced data sets will ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) data set that consists of 90 million pairs has been used. The proposed model reduces imbalanced data sets derived from large-scale data sets without loss of accuracy. The obtained results show that the MapReduce-based K-NN classifier provided accurate results for big DNA data. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
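The single-machine sketch below mimics the MapReduce pattern for K-NN on toy data: each map task finds local candidates within its partition and the reduce step merges them into the global k nearest; a real deployment would run the map tasks on distributed partitions, and none of the names correspond to the authors' implementation.

```python
import heapq
import numpy as np

# A single-machine sketch of the MapReduce pattern for K-NN: each "map"
# task finds local candidate neighbours within its data partition, and the
# "reduce" step merges them into the global k nearest. A real deployment
# runs the map tasks on distributed partitions (e.g. Hadoop/Spark).
rng = np.random.default_rng(1)
data = rng.normal(size=(100_000, 4))                 # stand-in for a huge data set
partitions = np.array_split(data, 10)                # distributed splits
query, k = np.zeros(4), 5

def map_task(partition):
    d = np.linalg.norm(partition - query, axis=1)    # local distances
    idx = np.argpartition(d, k)[:k]                  # local k candidates
    return list(zip(d[idx], map(tuple, partition[idx])))

def reduce_task(candidate_lists):
    return heapq.nsmallest(k, (c for lst in candidate_lists for c in lst))

neighbours = reduce_task(map(map_task, partitions))
print([round(d, 3) for d, _ in neighbours])
```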
The Next Generation Science Standards: A potential revolution for geoscience education
NASA Astrophysics Data System (ADS)
Wysession, Michael E.
2014-05-01
The first and only set of nationally distributed U.S. K-12 science education standards has been adopted by many states across America, with the potential to be adopted by many more. Earth and space science plays a prominent role in the new standards, with particular emphasis on critical Earth issues such as climate change, sustainability, and human impacts on Earth systems. In the states that choose to adopt the Next Generation Science Standards (NGSS), American youth will have a rigorous practice-based formal education in these important areas. Much work needs to be done to ensure the adoption and adequate implementation of the NGSS by a majority of American states, however, and there are many things that Earth and space scientists can do to help facilitate the process.
NASA Astrophysics Data System (ADS)
Sharonov, M. A.; Sharonova, O. V.; Sharonova, V. P.
2018-03-01
The article attempts to build a model, using Euler circles (Venn diagrams), to illustrate the methodological impact of the recent Federal Law 283-FZ "On the independent evaluation of qualifications" and the new generation 3++ Federal State Educational Standards of higher education on the educational process in Russia. In modern economic conditions, the ability to correctly assess the role of professional standards, treated as a set whose degree of intersection with the approximate basic educational program and the Federal State Educational Standards can be measured, becomes an important factor on which will depend not only the demand for graduates in the labor market but also the possibility of the proposed program passing professional and public accreditation.
Yiu, Sean; Tom, Brian Dm
2017-01-01
Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. However, in practice, the high dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicates model fitting. Thus, non-standard computationally intensive procedures based on simulating the marginal likelihood have so far only been proposed. In this paper, we describe an efficient method of implementation by demonstrating how the high dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and when it is of interest to directly model the overall marginal mean. The methodology is applied on a psoriatic arthritis data set concerning functional disability.
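The computational trick described, rewriting the high-dimensional integral as a multivariate normal CDF, can be illustrated numerically; the AR(1)-style covariance and threshold values below are placeholders, not the paper's fitted two-part model.

```python
# A d-dimensional integral of a normal density over a rectangle is just a
# multivariate normal CDF, which scipy evaluates directly.
import numpy as np
from scipy.stats import multivariate_normal

d = 10                                    # number of repeated visits
rho, sigma = 0.6, 1.0                     # assumed process parameters
idx = np.arange(d)
cov = sigma**2 * rho ** np.abs(np.subtract.outer(idx, idx))

# The likelihood factor for a run of "zero" visits in a two-part model is
# P(Z_1 <= u_1, ..., Z_d <= u_d): it collapses to one MVN CDF evaluation.
u = np.full(d, 0.5)
prob = multivariate_normal(mean=np.zeros(d), cov=cov).cdf(u)
print(f"10-dimensional integral evaluated as an MVN CDF: {prob:.4f}")
```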
Johnson, Timothy C.; Versteeg, Roelof J.; Ward, Andy; Day-Lewis, Frederick D.; Revil, André
2010-01-01
Electrical geophysical methods have found wide use in the growing discipline of hydrogeophysics for characterizing the electrical properties of the subsurface and for monitoring subsurface processes in terms of the spatiotemporal changes in subsurface conductivity, chargeability, and source currents they govern. Presently, multichannel and multielectrode data collection systems can collect large data sets in relatively short periods of time. Practitioners, however, often are unable to fully utilize these large data sets and the information they contain because of standard desktop-computer processing limitations. These limitations can be addressed by utilizing the storage and processing capabilities of parallel computing environments. We have developed a parallel distributed-memory forward and inverse modeling algorithm for analyzing resistivity and time-domain induced polarization (IP) data. The primary components of the parallel computations include distributed computation of the pole solutions in forward mode, distributed storage and computation of the Jacobian matrix in inverse mode, and parallel execution of the inverse equation solver. We have tested the corresponding parallel code in three efforts: (1) resistivity characterization of the Hanford 300 Area Integrated Field Research Challenge site in Hanford, Washington, U.S.A., (2) resistivity characterization of a volcanic island in the southern Tyrrhenian Sea in Italy, and (3) resistivity and IP monitoring of biostimulation at a Superfund site in Brandywine, Maryland, U.S.A. Inverse analysis of each of these data sets would be limited or impossible in a standard serial computing environment, which underscores the need for parallel high-performance computing to fully utilize the potential of electrical geophysical methods in hydrogeophysical applications.
What Does it Mean to Publish Data in Earth System Science Data Journal?
NASA Astrophysics Data System (ADS)
Carlson, D.; Pfeiffenberger, H.
2015-12-01
The availability of more than 120 data sets in ESSD represents an unprecedented effort by providers, data centers and ESSD. ESSD data sets and their accompanying data descriptions undergo rigorous review. The data sets reside at any of more than 20 cooperating data centers. The ESSD publication process depends on but challenges the concepts of digital object identification and exacerbates the varied interpretations of the phrase 'data publication'. ESSD adopts the digital object identifier (DOI). Key questions apply to DOIs and other identifiers. How will persistent identifiers point accurately to distributed or replicated data? How should data centers and data publishers use identifier technologies to ensure authenticity and integrity? Should metadata associated with identifiers distinguish among raw, quality controlled and derived data processing levels, or indicate license or copyright status? Data centers publish data sets according to internal metadata standards but without indicators of quality control. Publication in this sense indicates availability. National data portals compile, serve and publish data products as a service to national researchers and, often, to meet national requirements. Publication in this second case indicates availability in a national context; the data themselves may still reside at separate data centers. Data journals such as ESSD or Scientific Data publish peer-reviewed, quality controlled data sets. These data sets almost always reside at a separate data center - the journal and the center maintain explicit identifier linkages. Data journals add quality to the feature of availability. A single data set processed through these layers will generate three independent DOIs, but the DOIs will provide little information about availability or quality. Could the data world learn from the URL world to consider additions? Suffixes? Could we use our experience with processing levels or data maturity to propose and agree on such extensions?
EU accession: A policy window opportunity for nursing?
De Raeve, Paul; Rafferty, Anne-Marie; Bariball, Louise; Young, Ruth; Boiko, Olga
2017-03-01
European enlargement has been studied in a wide range of policy areas within and beyond health. Yet the impact of EU enlargement upon one of the largest health professions, nursing, has been largely neglected. This paper aims to explore nurse leadership using a comparative case study method in two former Communist countries, Romania and Croatia. Specifically, it considers the extent to which engagement in the EU accession policy-making process provided a policy window for the leaders to formulate and implement a professional agenda while negotiating EU accession. Findings of qualitative interviews and documentary analysis indicate that the mechanisms used to facilitate the accession process were not successful in achieving compliance with the education standards in the Community Acquis, as highlighted in the criteria on the mutual recognition of professional qualifications set out in Directive 2005/36/EC. EU accession capacity building and accession funds were not deployed efficiently to upgrade Romanian and Croatian nursing education towards meeting EU standards. Conflicting views on accession held by the various nursing stakeholders (nursing regulator, nursing union, governmental chief nurse and the professional association) inhibited the setting of a common policy agenda to achieve compliance with EU standards. The study findings suggest a need to critically review EU accession mechanisms and better align leadership at all governance levels. Copyright © 2017 Elsevier B.V. All rights reserved.
Revision of the design of a standard for the dimensions of school furniture.
Molenbroek, J F M; Kroon-Ramaekers, Y M T; Snijders, C J
2003-06-10
In this study an anthropometric design process was followed. The aim was to improve the fit of school furniture sizes for European children. It was demonstrated statistically that the draft of a European standard does not cover the target population. No literature on design criteria for sizes exists, and in practice it is common to calculate the fit for only the mean values (P50). The calculations reported here used body dimensions of Dutch children, measured by the authors' Department, and used data from German and British national standards. A design process was followed that contains several steps, including: Target group, Anthropometric model and Percentage exclusion. The criteria developed in this study are (1) a fit on the basis of 1% exclusion (P1 or P99), and (2) a prescription based on popliteal height. Based on this new approach it was concluded that prescription of a set size should be based on popliteal height rather than body height. The draft standard, prEN 1729, can be improved with this approach. A European standard for school furniture should include the exception that for Dutch children an extra large size is required.
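The 1%-exclusion criterion amounts to reading P1 and P99 of the relevant body dimension off its distribution. A minimal sketch, with an invented mean and standard deviation for popliteal height rather than the Dutch survey values:

```python
from scipy.stats import norm

# Assumed distribution of popliteal height (cm) in the target group.
mean_pop, sd_pop = 44.0, 2.5

p1 = norm.ppf(0.01, loc=mean_pop, scale=sd_pop)   # 1% excluded at the bottom
p99 = norm.ppf(0.99, loc=mean_pop, scale=sd_pop)  # 1% excluded at the top
print(f"sizes should cover popliteal heights {p1:.1f}-{p99:.1f} cm")
```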
Coch, Donna; Benoit, Clarisse
2015-01-01
We investigated whether and how standardized behavioral measures of reading and electrophysiological measures of reading were related in 72 typically developing, late elementary school children. Behavioral measures included standardized tests of spelling, phonological processing, vocabulary, comprehension, naming speed, and memory. Electrophysiological measures were composed of the amplitude of the N400 component of the event-related potential waveform elicited by real words, pseudowords, nonpronounceable letter strings, and strings of letter-like symbols (false fonts). The only significant brain-behavior correlations were between standard scores on the vocabulary test and N400 mean amplitude to real words (r = −.272) and pseudowords (r = −.235). We conclude that, while these specific sets of standardized behavioral and electrophysiological measures both provide an index of reading, for the most part, they are independent and draw upon different underlying processing resources. [T]o completely analyze what we do when we read… would be to describe very many of the most intricate workings of the human mind, as well as to unravel the tangled story of the most remarkable specific performance that civilization has learned in all its history (Huey, 1908/1968, p. 3). PMID:26346715
[A medical consumable material management information system].
Tang, Guoping; Hu, Liang
2014-05-01
Medical consumables are essential supplies for carrying out medical work; they come in a wide range of varieties and are used in large quantities. How to manage them feasibly and efficiently is a topic of general concern. This article discusses how to design a medical consumable material management information system with a set of standardized processes that brings together medical supplies administrators, suppliers and clinical departments. An advanced management mode, enterprise resource planning (ERP), was applied throughout the system design process.
Blood Sampling and Preparation Procedures for Proteomic Biomarker Studies of Psychiatric Disorders.
Guest, Paul C; Rahmoune, Hassan
2017-01-01
A major challenge in proteomic biomarker discovery and validation for psychiatric diseases is the inherent biological complexity underlying these conditions. There are also many technical issues which hinder this process such as the lack of standardization in sampling, processing and storage of bio-samples in preclinical and clinical settings. This chapter describes a reproducible procedure for sampling blood serum and plasma that is specifically designed for maximizing data quality output in two-dimensional gel electrophoresis, multiplex immunoassay and mass spectrometry profiling studies.
[Quality control in herbal supplements].
Oelker, Luisa
2005-01-01
Quality and safety of food and herbal supplements are the result of a set of different elements, such as good manufacturing practice and process control. The process control must be active and able to identify and correct all possible hazards. The main and most utilized instrument is the hazard analysis critical control point (HACCP) system, the correct application of which can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.
An Integrated Earth Science, Astronomy, and Physics Course for Elementary Education Majors
ERIC Educational Resources Information Center
Plotnick, Roy E.; Varelas, Maria; Fan, Qian
2009-01-01
Physical World is a one-semester course designed for elementary education majors, that integrates earth science, astronomy, and physics. The course is part of a four-course set that explores science concepts, processes, and skills, along with the nature of scientific practice, that are included in state and national standards for elementary school…
ERIC Educational Resources Information Center
Kelly, Heather A.; Walters, Allison M.
2016-01-01
Comprehensive data are essential to answer questions from prospective students, parents, and private and public entities about the cost of college and students' return on investment, as well as to demonstrate how colleges and universities are helping to prepare the future workforce. An evolutionary data-collection process, efforts to improve the…
ERIC Educational Resources Information Center
Zavadsky, Heather
2014-01-01
The role of state education agencies (SEAs) has shifted significantly from low-profile, compliance activities like managing federal grants to engaging in more complex and politically charged tasks like setting curriculum standards, developing accountability systems, and creating new teacher evaluation systems. The move from compliance-monitoring…
Institutional Effectiveness as Process and Practice in the American Community College
ERIC Educational Resources Information Center
Manning, Terri Mulkins
2011-01-01
The six regional accrediting agencies in the United States have created a set of standards based on best practices in colleges and universities. The evolving perception of an effective institution is one that uses data, assessment, and evaluation results to improve programs and services and strives for a high level of institutional quality. While…
Advertising Practitioner's Ethical Decision-Making: The Utilitarian Viewpoint.
ERIC Educational Resources Information Center
Overstreet, Charles William
A study compared the decision making process of large and small advertising agencies to determine if the size of the agency, in terms of gross annual billing, had any effect on adherence to the rules set forth in the American Association of Advertising's Standards of Practice. Forty agency employees, 20 from agencies with billings less than $2.5…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
... (NANC) recommending a set of standard thresholds and intervals for non-simple ports and ``projects... comment on whether the thresholds and processing timelines for non-simple ports and projects are...: Interested parties may submit comments, identified by WC Docket No. 07-244 and CC Docket No. 95-116, by any...
ERIC Educational Resources Information Center
Tong, Xiuhong; McBride, Catherine
2017-01-01
Is dyslexia in Chinese for Chinese-English bilinguals associated with difficulties in reading English, given differences in L1 and L2 orthographies? Among 11 Hong Kong Chinese adolescents with dyslexia, who were diagnosed by professional psychologists using the diagnostic criteria set out in a standardized test, and 14 adolescents without…
SU-E-I-27: Establishing Target Exposure Index Values for Computed Radiography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, N; Tchou, P; Belcher, K
2014-06-01
Purpose: To develop a standard set of target exposure index (TEI) values to be applied to Agfa Computed Radiography (CR) readers in accordance with International Electrotechnical Commission (IEC) standard 62494-1 (ed. 1.0). Methods: A large data cohort was collected from six USAF Medical Treatment Facilities that exclusively use Agfa CR readers. Dose monitoring statistics were collected from each reader. The data were analyzed based on anatomic region, view, and processing speed class. The Agfa-specific exposure metric, logarithmic mean (LGM), was converted to exposure index (EI) for each data set. The optimum TEI value was determined by minimizing the number of studies that fell outside the acceptable deviation index (DI) range of ±2 for phototimed techniques or ±3 for fixed techniques. An anthropomorphic radiographic phantom was used to corroborate the TEI recommendations. Images were acquired of several anatomic regions and views using standard techniques. The images were then evaluated by two radiologists as either acceptable or unacceptable. The acceptable image with the lowest exposure and EI value was compared to the recommended TEI values using a passing DI range. Results: Target EI values were determined for a comprehensive list of anatomic regions and views. Conclusion: Target EI values must be established on each CR unit in order to provide a positive feedback system for the technologist. This system will serve as a mechanism to prevent under- or overexposure of patients. The TEI recommendations are a first attempt at a large-scale process improvement with the goal of setting reasonable and standardized TEI values. The implementation and effectiveness of the recommended TEI values should be monitored and adjustments made as necessary.
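The deviation-index screening described follows directly from the IEC 62494-1 definition DI = 10·log10(EI/TEI); the sample values below are invented for illustration.

```python
import math

def deviation_index(ei, target_ei):
    """DI = 10 * log10(EI / TEI), per IEC 62494-1."""
    return 10.0 * math.log10(ei / target_ei)

def exposure_in_range(ei, target_ei, phototimed=True):
    # Acceptable windows used in the study: |DI| <= 2 (phototimed), <= 3 (fixed).
    limit = 2.0 if phototimed else 3.0
    return abs(deviation_index(ei, target_ei)) <= limit

print(deviation_index(500, 400))        # ~0.97: inside either DI window
```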
Development of Indicators to Assess Quality of Care for Prostate Cancer.
Nag, Nupur; Millar, Jeremy; Davis, Ian D; Costello, Shaun; Duthie, James B; Mark, Stephen; Delprado, Warick; Smith, David; Pryor, David; Galvin, David; Sullivan, Frank; Murphy, Áine C; Roder, David; Elsaleh, Hany; Currow, David; White, Craig; Skala, Marketa; Moretti, Kim L; Walker, Tony; De Ieso, Paolo; Brooks, Andrew; Heathcote, Peter; Frydenberg, Mark; Thavaseelan, Jeffery; Evans, Sue M
2016-02-20
The development, monitoring, and reporting of indicator measures that describe standard of care provide the gold standard for assessing quality of care and patient outcomes. Although indicator measures have been reported, little evidence of their use in measuring and benchmarking performance is available. A standard set, defining numerator, denominator, and risk adjustments, will enable global benchmarking of quality of care. To develop a set of indicators to enable assessment and reporting of quality of care for men with localised prostate cancer (PCa). Candidate indicators were identified from the literature. An international panel was invited to participate in a modified Delphi process. Teleconferences were held before and after each voting round to provide instruction and to review results. Panellists were asked to rate each proposed indicator on a Likert scale of 1-9 in a two-round iterative process. Calculations required to report on the endorsed indicators were evaluated and modified to reflect the data capture of the Prostate Cancer Outcomes Registry-Australia and New Zealand (PCOR-ANZ). A total of 97 candidate indicators were identified, of which 12 were endorsed. The set includes indicators covering pre-, intra-, and post-treatment of PCa care, within the limits of the data captured by PCOR-ANZ. The 12 endorsed quality measures enable international benchmarking on the quality of care of men with localised PCa. Reporting on these indicators enhances safety and efficacy of treatment, reduces variation in care, and can improve patient outcomes. PCa has the highest incidence of all cancers in men. Early diagnosis and relatively high survival rates mean issues of quality of care and best possible health outcomes for patients are important. This paper identifies 12 important measurable quality indicators in PCa care. Copyright © 2016 European Association of Urology. Published by Elsevier B.V. All rights reserved.
Machine-Learning Algorithms to Code Public Health Spending Accounts
Leider, Jonathon P.; Resnick, Beth A.; Alfonso, Y. Natalia; Bishai, David
2017-01-01
Objectives: Government public health expenditure data sets require time- and labor-intensive manipulation to summarize results that public health policy makers can use. Our objective was to compare the performances of machine-learning algorithms with manual classification of public health expenditures to determine if machines could provide a faster, cheaper alternative to manual classification. Methods: We used machine-learning algorithms to replicate the process of manually classifying state public health expenditures, using the standardized public health spending categories from the Foundational Public Health Services model and a large data set from the US Census Bureau. We obtained a data set of 1.9 million individual expenditure items from 2000 to 2013. We collapsed these data into 147 280 summary expenditure records, and we followed a standardized method of manually classifying each expenditure record as public health, maybe public health, or not public health. We then trained 9 machine-learning algorithms to replicate the manual process. We calculated recall, precision, and coverage rates to measure the performance of individual and ensembled algorithms. Results: Compared with manual classification, the machine-learning random forests algorithm produced 84% recall and 91% precision. With algorithm ensembling, we achieved our target criterion of 90% recall by using a consensus ensemble of ≥6 algorithms while still retaining 93% coverage, leaving only 7% of the summary expenditure records unclassified. Conclusions: Machine learning can be a time- and cost-saving tool for estimating public health spending in the United States. It can be used with standardized public health spending categories based on the Foundational Public Health Services model to help parse public health expenditure information from other types of health-related spending, provide data that are more comparable across public health organizations, and evaluate the impact of evidence-based public health resource allocation. PMID:28363034
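The consensus-ensemble step can be sketched as a simple voting rule: a record is classified only when at least a quorum of trained classifiers agree, trading some coverage for recall and precision. The quorum of six echoes the paper; the array shapes and random votes are stand-ins for real model outputs.

```python
import numpy as np

def consensus_classify(predictions, quorum=6):
    """predictions: (n_models, n_records) array of 0/1 votes.
    Returns 1 or 0 where >= quorum models agree, else -1 (unclassified)."""
    pos = predictions.sum(axis=0)
    neg = predictions.shape[0] - pos
    out = np.full(predictions.shape[1], -1)
    out[pos >= quorum] = 1
    out[neg >= quorum] = 0
    return out

votes = (np.random.default_rng(1).random((9, 1000)) > 0.5).astype(int)
labels = consensus_classify(votes)
print(f"coverage: {(labels != -1).mean():.0%}")   # share of records classified
```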
Sequence tagging reveals unexpected modifications in toxicoproteomics
Dasari, Surendra; Chambers, Matthew C.; Codreanu, Simona G.; Liebler, Daniel C.; Collins, Ben C.; Pennington, Stephen R.; Gallagher, William M.; Tabb, David L.
2010-01-01
Toxicoproteomic samples are rich in posttranslational modifications (PTMs) of proteins. Identifying these modifications via standard database searching can incur significant performance penalties. Here we describe the latest developments in TagRecon, an algorithm that leverages inferred sequence tags to identify modified peptides in toxicoproteomic data sets. TagRecon identifies known modifications more effectively than the MyriMatch database search engine. TagRecon outperformed state-of-the-art software in recognizing unanticipated modifications from LTQ, Orbitrap, and QTOF data sets. We developed user-friendly software for detecting persistent mass shifts from samples. We follow a three-step strategy for detecting unanticipated PTMs in samples. First, we identify the proteins present in the sample with a standard database search. Next, identified proteins are interrogated for unexpected PTMs with a sequence tag-based search. Finally, additional evidence is gathered for the detected mass shifts with a refinement search. Application of this technology to toxicoproteomic data sets revealed unintended cross-reactions between proteins and sample processing reagents. Twenty-five proteins in rat liver showed signs of oxidative stress when exposed to potentially toxic drugs. These results demonstrate the value of mining toxicoproteomic data sets for modifications. PMID:21214251
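The persistent-mass-shift idea can be sketched as histogramming observed-minus-theoretical peptide masses and keeping shifts that recur across many spectra; the residue-mass table is truncated and the binning is an assumption, not TagRecon's implementation.

```python
from collections import Counter

# Monoisotopic residue masses (Da); table truncated for brevity.
RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "C": 103.00919}
WATER = 18.01056

def peptide_mass(seq):
    return sum(RESIDUE[aa] for aa in seq) + WATER

def persistent_shifts(observations, min_count=10, bin_width=0.01):
    """observations: iterable of (sequence, observed_mass) pairs.
    Returns {mass_shift: count} for shifts seen at least min_count times."""
    bins = Counter(round((mass - peptide_mass(seq)) / bin_width)
                   for seq, mass in observations)
    return {b * bin_width: n for b, n in bins.items() if n >= min_count}
```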
The undergraduate research fellows program: a unique model to promote engagement in research.
Vessey, Judith A; DeMarco, Rosanna F
2008-01-01
Well-educated nurses with research expertise are needed to advance evidence-based nursing practice. A primary goal of undergraduate nursing curricula is to create meaningful participatory experiences that help students develop a research skill set and that support rapid career advancement of gifted young graduates interested in nursing research and faculty careers. Three research enrichment models used in conjunction with standard research content are reviewed: undergraduate honors programs, research assistant work-for-hire programs, and research work/mentorship programs. The development and implementation of one research work/mentorship program, the Boston College undergraduate research fellows program (UGRF), is explicated. This process included surveying previous UGRFs, followed by creating a retreat and seminars to address specific research skill sets. The research skill sets included (a) how to develop a research team, (b) accurate data retrieval, (c) ethical considerations, (d) the research process, (e) data management, (f) successful writing of abstracts, and (g) creating effective poster presentations. Outcomes include evidence of involvement in research productivity and valuing of evidence-based practice through the UGRF mentorship process with faculty partners.
In-Office Endoscopic Laryngeal Laser Procedures: A Patient Safety Initiative.
Anderson, Jennifer; Bensoussan, Yael; Townsley, Richard; Kell, Erika
2018-05-01
Objective To review complications of in-office endoscopic laryngeal laser procedures after implementation of a standardized safety protocol. Methods A retrospective review was conducted of the first 2 years of in-office laser procedures at St Michael's Hospital after the introduction of a standardized safety protocol. The protocol included patient screening and a procedure checklist with standardized reporting of processes, medications, and complications. Primary outcomes measured were complication rates of in-office laryngeal laser procedures. Secondary outcomes included hemodynamic changes, local anesthetic dose, laser settings, total laser/procedure time, and incidence of sedation. Results A total of 145 in-office KTP procedures performed on 65 patients were reviewed. In 98% of cases, the safety protocol was fully implemented. The overall complication rate was 4.8%. No major complications were encountered. Minor complications included vasovagal episodes and patient intolerance. The rate of patient intolerance resulting in early termination of the anticipated procedure was 13.1%. The total local anesthetic dose averaged 172.9 mg lidocaine per procedure. The mean amount of laser energy dispersed was 261.2 J, with a mean total procedure time of 48.3 minutes. Sixteen percent of patients had preprocedure sedation. Vital signs were found to vary modestly. Systolic blood pressure was lower postprocedure in 13.8% of patients and symptomatic in 4.1%. Discussion The review of our standardized safety protocol has revealed that in-office laser treatment for laryngeal pathology has extremely low complication rates with safe patient outcomes. Implications for Practice The trend of shifting procedures out of the operating room into the office/clinic setting requires new processes designed to promote patient safety.
Setting, Evaluating, and Maintaining Certification Standards with the Rasch Model.
ERIC Educational Resources Information Center
Grosse, Martin E.; Wright, Benjamin D.
1986-01-01
Based on the standard setting procedures of the American Board of Preventive Medicine for their Core Test, this article describes how Rasch measurement can facilitate using test content judgments in setting a standard. Rasch measurement can then be used to evaluate and improve the precision of the standard and to hold it constant across time.…
Citizen Observatories: A Standards Based Architecture
NASA Astrophysics Data System (ADS)
Simonis, Ingo
2015-04-01
A number of large-scale research projects are currently under way exploring the various components of citizen observatories, e.g. CITI-SENSE (http://www.citi-sense.eu), Citclops (http://citclops.eu), COBWEB (http://cobwebproject.eu), OMNISCIENTIS (http://www.omniscientis.eu), and WeSenseIt (http://www.wesenseit.eu). Common to all projects is the motivation to develop a platform enabling effective participation by citizens in environmental projects, while considering important aspects such as security, privacy, long-term storage and availability, accessibility of raw and processed data, and its proper integration into catalogues and international exchange and collaboration systems such as GEOSS or INSPIRE. This paper describes the software architecture implemented for setting up crowdsourcing campaigns using standardized components, interfaces, security features, and distribution capabilities. It illustrates the Citizen Observatory Toolkit, a software suite that allows crowdsourcing campaigns to be defined, registered and unregistered participants to be invited, and raw and quality-enhanced crowdsourcing data and derived products to be analyzed, processed, and visualized. The Citizen Observatory Toolkit is not a single software product. Instead, it is a framework of components that are built using internationally adopted standards wherever possible (e.g. OGC standards from Sensor Web Enablement, GeoPackage, and Web Mapping and Processing Services, as well as security and metadata/cataloguing standards), defines profiles of those standards where necessary (e.g. SWE O&M profile, SensorML profile), and implements design decisions based on the motivation to maximize interoperability and reusability of all components. The toolkit contains tools to set up, manage and maintain crowdsourcing campaigns, allows building on-demand apps optimized for the specific sampling focus, supports offline and online sampling modes using modern cell phones with built-in sensing technologies, automates the upload of the raw data, and handles conflation services to match quality requirements and analysis challenges. The Citizen Observatory Toolkit is currently developed as part of the COBWEB research project. COBWEB is partially funded by the European Programme FP7/2007-2013 under grant agreement n° 308513, part of the topic ENV.2012.6.5-1 "Developing community based environmental monitoring and information systems using innovative and novel earth observation applications".
Another HISA--the new standard: health informatics--service architecture.
Klein, Gunnar O; Sottile, Pier Angelo; Endsleff, Frederik
2007-01-01
In addition to standing for the Health Informatics Society of Australia, HISA is the acronym of the new European standard: Health Informatics - Service Architecture. This EN 12967 standard has been developed by CEN, the federation of 29 national standards bodies in Europe. The standard defines the essential elements of a service-oriented architecture and a methodology for localization that is particularly useful for large healthcare organizations. It is based on the Open Distributed Processing (ODP) framework from ISO 10746 and contains the following parts: Part 1: Enterprise viewpoint. Part 2: Information viewpoint. Part 3: Computational viewpoint. This standard is now also the starting point for consideration of an international standard in ISO/TC 215. The basic principle, a set of health-specific middleware services forming a common platform for the various applications of regional health information systems or large integrated hospital information systems, is well established, following a previous prestandard. Examples of large-scale deployments in Sweden, Denmark and Italy are described.
A Study on the Development of Service Quality Index for Incheon International Airport
NASA Technical Reports Server (NTRS)
Lee, Kang Seok; Lee, Seung Chang; Hong, Soon Kil
2003-01-01
The main purpose of this study is to develop an Omnibus Monitoring System (OMS) for internal management that enables standards to be established, matters needing improvement to be identified, and their treatment to be assessed in a systematic way. This is done by developing subjective and objective estimation tools based on use importance, perceived level, and a composite index for each principal service item at the international airport. The study was directed at developing a metric analysis tool that utilizes quantitative secondary data, analyses perceived data from airport user surveys, systematizes the data collection, input, and analysis process, visualizes results graphically, plans service encounters and assigns control responsibility, and ensures competitiveness against minimum international standards. A pre-investigation plan was established on the basis of the existing foreign literature and on-site inspection of international airports. Two tasks were then executed together on the basis of this pre-investigation: developing subjective estimation standards for departing passengers, arriving passengers, and airport residents, and developing objective standards as a complementary method. The study proceeded towards regular and ad hoc monitoring of airport services by developing a software system for operating the standards, after establishing the reliability and feasibility of the estimation standards both substantively and statistically.
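A use-importance weighted composite index of the kind described can be computed in a few lines; the item names and numbers below are illustrative only.

```python
def composite_index(items):
    """items: list of (importance_weight, perceived_score) pairs."""
    total_weight = sum(w for w, _ in items)
    return sum(w * s for w, s in items) / total_weight

# e.g. check-in, security screening, signage for the departing party
departing = [(0.9, 4.2), (0.7, 3.8), (0.5, 4.5)]
print(f"departing-party service index: {composite_index(departing):.2f}")
```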
An update on 'dose calibrator' settings for nuclides used in nuclear medicine.
Bergeron, Denis E; Cessna, Jeffrey T
2018-06-01
Most clinical measurements of radioactivity, whether for therapeutic or imaging nuclides, rely on commercial re-entrant ionization chambers ('dose calibrators'). The National Institute of Standards and Technology (NIST) maintains a battery of representative calibrators and works to link calibration settings ('dial settings') to primary radioactivity standards. Here, we provide a summary of NIST-determined dial settings for 22 radionuclides. We collected previously published dial settings and determined some new ones using either the calibration curve method or the dialing-in approach. The dial settings with their uncertainties are collected in a comprehensive table. In general, current manufacturer-provided calibration settings give activities that agree with NIST standards to within a few percent.
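A minimal sketch of the dialing-in approach: record the indicated activity of a standardized source at several dial settings and interpolate to the setting that reproduces the certified activity. All numbers below are invented for illustration.

```python
import numpy as np

dials = np.array([410.0, 420.0, 430.0, 440.0, 450.0])      # candidate settings
indicated = np.array([512.0, 505.1, 498.4, 491.9, 485.5])  # readings (MBq)
certified = 500.0                                          # standard's value (MBq)

# Indicated activity falls as the dial rises here, so reverse for np.interp.
best = np.interp(certified, indicated[::-1], dials[::-1])
print(f"dial setting ~{best:.0f} reproduces the certified activity")
```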
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veazey, G.W.; Schake, A.R.; Shalek, P.D.
1996-10-01
The process used at TA-55 to cement transuranic (TRU) waste has experienced several problems with the gypsum-based cement currently being used. Specifically, the waste form could not reliably pass the Waste Isolation Pilot Plant (WIPP) prohibition on free liquid and the Environmental Protection Agency (EPA) Toxicity Characteristic Leaching Procedure (TCLP) standard for chromium. This report describes the project to develop a portland cement-based waste form that ensures compliance with these standards, as well as other performance standards consisting of homogeneous mixing, moderate hydration temperature, timely initial set, and structural durability. Testing was conducted using the two most common waste streams requiring cementation as of February 1994, lean residue (LR)- and oxalate filtrate (OX)-based evaporator bottoms (EV). A formulation with a pH of 10.3 to 12.1 and a minimum cement-to-liquid (C/L) ratio of 0.80 kg/L for OX-based EV and 0.94 kg/L for LR-based EV was found to pass the performance standards chosen for this project. The implementation of the portland process should result in a yearly raw-materials cost savings of approximately $27,000 over the gypsum process.
Data management in clinical research: An overview
Krishnankutty, Binny; Bellary, Shantala; Kumar, Naveen B.R.; Moodahadu, Latha S.
2012-01-01
Clinical Data Management (CDM) is a critical phase in clinical research, which leads to generation of high-quality, reliable, and statistically sound data from clinical trials. This helps to produce a drastic reduction in time from drug development to marketing. Team members of CDM are actively involved in all stages of clinical trial right from inception to completion. They should have adequate process knowledge that helps maintain the quality standards of CDM processes. Various procedures in CDM including Case Report Form (CRF) designing, CRF annotation, database designing, data-entry, data validation, discrepancy management, medical coding, data extraction, and database locking are assessed for quality at regular intervals during a trial. In the present scenario, there is an increased demand to improve the CDM standards to meet the regulatory requirements and stay ahead of the competition by means of faster commercialization of product. With the implementation of regulatory compliant data management tools, CDM team can meet these demands. Additionally, it is becoming mandatory for companies to submit the data electronically. CDM professionals should meet appropriate expectations and set standards for data quality and also have a drive to adapt to the rapidly changing technology. This article highlights the processes involved and provides the reader an overview of the tools and standards adopted as well as the roles and responsibilities in CDM. PMID:22529469
NASA Astrophysics Data System (ADS)
Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey
2015-04-01
A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created using dedicated structures such as a static random-access memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was then used as an input for choosing locations for critical-dimension scanning electron microscope images and for extracting specific layout parameters, which were then input to SPICE compact-modeling simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to an N-well placed too close to a P-well. Based on this example, for process monitoring and variability analyses, we extensively used this method to analyze transistor gates having different shapes. In addition, an analysis of a large area of a high-density standard cell library was done. Another set of monitoring runs focused on a high-density SRAM array is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified unsymmetrical transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted based on contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data showed the successful in-field implementation of our methodology as a useful process monitoring method.
Drake-Lee, A B; Skinner, D; Hawthorne, M; Clarke, R
2009-10-01
'High stakes' postgraduate medical examinations should conform to current educational standards. In the UK and Ireland, national assessments in surgery are devised and managed through the examination structure of the Royal Colleges of Surgeons. Their efforts are not reported in the medical education literature. In the current paper, we aim to clarify this process. The aim was to replace the clinical section of the Diploma of Otorhinolaryngology with an Objective Structured Clinical Examination (OSCE), and to set the level of the assessment at one year of postgraduate training in the specialty. After 'blueprinting' against the whole curriculum, an OSCE comprising 25 stations was divided into six clinical stations and 19 other stations exploring written case histories, instruments, test results, written communication skills and interpretation skills. The pass mark was set using a modified borderline method and other methods, and statistical analysis of the results was performed. The results of nine examinations between May 2004 and May 2008 are presented. The pass mark varied between 68 and 82 per cent. Internal consistency was good, with a Cronbach's alpha value of 0.99 for all examinations and split-half statistics varying from 0.96 to 0.99. Different standard-setting methods gave similar pass marks. We have developed, and report here, a summative OSCE for doctors training in otorhinolaryngology. The objectives and standards for setting a high-quality assessment were met.
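The two psychometric calculations mentioned, a borderline-method pass mark and Cronbach's alpha, can be sketched as follows; the data shapes are assumptions, and real OSCE scoring sheets are richer than this.

```python
import numpy as np

def borderline_pass_mark(scores, global_grades):
    """Mean station score of candidates judged 'borderline' by examiners."""
    scores = np.asarray(scores, dtype=float)
    mask = np.asarray(global_grades) == "borderline"
    return scores[mask].mean()

def cronbach_alpha(station_scores):
    """station_scores: (n_candidates, n_stations) score matrix.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = station_scores.shape[1]
    item_variances = station_scores.var(axis=0, ddof=1).sum()
    total_variance = station_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)
```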
He, Li; Xu, Zongda; Fan, Xing; Li, Jing; Lu, Hongwei
2017-05-01
This study develops a meta-modeling-based mathematical programming approach with flexibility in environmental standards. It integrates numerical simulation, meta-modeling analysis, and fuzzy programming within a general framework. A set of meta-models relating remediation strategies to remediation performance substantially reduces the computational effort of the simulation and optimization process. To prevent over-optimistic or over-pessimistic optimization strategies, the satisfaction level achieved under a flexible standard indicates the degree to which the environmental standard is met. The proposed approach is applied to a naphthalene-contaminated site in China. Results show that a longer remediation period corresponds to a lower total pumping rate and that a stringent risk standard implies a high total pumping rate. The wells located near, or down-gradient of, the contaminant sources have the most significant efficiency among all remediation schemes.
Lebreton, Florian; Routier, Guillaume; Héas, Stephane; Bodin, Dominique
2010-08-01
The article explores the process of "sportification", i.e., the transformation of a physical activity into a sport regulated by a set of rules and standards and legitimized by supervisory institutions, through two original practices: parkour and urban golf. To study these practices, we combined the contributions of urban sociology and the contemporary sociology of sport while respecting the methodological principles of qualitative sociology. A first point concerns the process of sportification itself, its definition, its various stages, and the role played by stakeholders' communication in public space. Cultural mediation shows how the institutionalization of these movements into "sports" resulted at the same time in a reconfiguration of the physical practices themselves. Recent events illustrate this ongoing reconfiguration, and we detail them. Finally, we show the effects of this process on the definition of urban culture and sport: greater visibility of the activities, enhanced cooperation with media and cultural actors, and, in the case of parkour, a polarization between different types of practice centred on a confrontation between two of its founders.
NASA Technical Reports Server (NTRS)
Parker, Bradford H.
2011-01-01
In April 2008, NASA-STD-5009 established a requirement that only sensitivity level 4 penetrants are acceptable for NASA Standard Level liquid penetrant inspections. Having NASA contractors change existing processes or perform demonstration tests to certify sensitivity level 3 penetrants posed a potentially huge cost to the Agency. This study was conducted to directly compare the probability of detection (POD) of sensitivity level 3 and level 4 penetrants using both Method A and Method D inspection processes. POD demonstration tests were performed on 6061-Al, Haynes 188 and Ti-6Al-4V crack panel sets. The study results strongly support the conclusion that sensitivity level 3 penetrants are acceptable for NASA Standard Level inspections.
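A POD comparison of this kind is typically summarized by fitting a hit/miss POD curve against crack size and reading off a characteristic size such as a90; the logistic fit below is a hedged sketch in the spirit of MIL-HDBK-1823-style analyses, not the study's actual code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def a90(crack_sizes, hits):
    """Fit POD vs log crack size; return the size detected 90% of the time.
    crack_sizes: (n,) lengths; hits: (n,) 1 = detected, 0 = missed."""
    model = LogisticRegression()
    model.fit(np.log(crack_sizes).reshape(-1, 1), hits)
    logit_90 = np.log(0.9 / 0.1)                 # log-odds at POD = 0.9
    # Solve intercept + coef * log(a) = logit_90 for a.
    return np.exp((logit_90 - model.intercept_[0]) / model.coef_[0, 0])
```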
Industry Initiated Core Safety Attributes for Human Spaceflight for the 7th IAASS Conference
NASA Technical Reports Server (NTRS)
Mango, Edward J.
2014-01-01
Now that the NASA Commercial Crew Program (CCP) is beginning its full certification contract for crew transportation to the International Space Station (ISS), is it time for industry to embrace a minimum set of core safety attributes? Those attributes can then be evolved into an industry-led set of basic safety standards and requirements. After 50 years of human space travel sponsored by governments, there are two basic conditions that now exist within the international space industry. First, there is enough of a space-faring history to encourage the space industry to design, develop and operate human spaceflight systems without government contracts for anything other than services. Second, industry is capable of defining and enforcing a set of industry-based safety attributes and standards for human spaceflight to low-Earth orbit (LEO). This paper will explore both of these basic conditions with a focus on the safety attributes and standards. In the United States, the Federal Aviation Administration (FAA) is now starting to dialogue with industry about the basic safety principles and attributes needed for potential future regulatory oversight. This process is not yet formalized and will take a number of years once approval is given to move forward. Therefore, throughout the next few years, it is an excellent time and opportunity for industry to collaborate and develop the core set of attributes and standards. As industry engages and embraces a common set of safety attributes, government agencies like the FAA and NASA can use that industry-based product to strengthen their efforts on a safe commercial spaceflight foundation for the future. As the commercial space industry takes the lead role in establishing core safety attributes, and then enforcing those attributes, the entire planet can move away from governmental control of design and development and let industry expand safe and successful space operations in LEO. At that point the governmental agencies can focus on oversight of the industry's defined standards and enforcement for the common welfare of the space-faring populace and overall public safety.
2015-01-01
Objectives The principal aim of this study is to provide an account of variation in UK undergraduate medical assessment styles and corresponding standard setting approaches with a view to highlighting the importance of a UK national licensing exam in recognizing a common standard. Methods Using a secure online survey system, response data were collected during the period 13-30 January 2014 from selected specialists in medical education assessment, who served as representatives for their respective medical schools. Results Assessment styles and corresponding choices of standard setting methods vary markedly across UK medical schools. While there is considerable consensus on the application of compensatory approaches, individual schools display their own nuances through use of hybrid assessment and standard setting styles, uptake of less popular standard setting techniques and divided views on norm referencing. Conclusions The extent of variation in assessment and standard setting practices across UK medical schools validates the concern that there is a lack of evidence that UK medical students achieve a common standard on graduation. A national licensing exam is therefore a viable option for benchmarking the performance of all UK undergraduate medical students. PMID:26520472
Non-prescription medicines: a process for standards development and testing in community pharmacy.
Benrimoj, Shalom Charlie I; Gilbert, Andrew; Quintrell, Neil; Neto, Abilio C de Almeida
2007-08-01
The objective of the study was to develop and test standards of practice for handling non-prescription medicines. In consultation with pharmacy registering authorities, key professional and consumer groups and selected community pharmacists, standards of practice were developed in the areas of Resource Management; Professional Practice; Pharmacy Design and Environment; and Rights and Needs of Customers. These standards defined and described minimum professional activities required in the provision of non-prescription medicines at a consistent and measurable level of practice. Seven standards were described and further defined by 20 criteria, including practice indicators. The Standards were tested in 40 community pharmacies in two States and, after further adaptation, endorsed by all Australian pharmacy registering authorities and major Australian pharmacy and consumer organisations. The consultation process effectively engaged practicing pharmacists in developing standards that enable community pharmacists to meet their legislative and professional responsibilities. Community pharmacies were audited against a set of standards of practice for handling non-prescription medicines developed in this project. Pharmacies were audited on the Standards at baseline, mid-intervention and post-intervention. Behavior of community pharmacists and their staff in relation to these standards was measured by conducting pseudo-patron visits to participating pharmacies. The testing process demonstrated a significant improvement in the quality of service delivered by staff in community pharmacies in the management of requests involving non-prescription medicines. The use of pseudo-patron visits as a training tool with immediate feedback was an acceptable and effective method of achieving changes in practice. Feedback from staff in the pharmacies regarding the pseudo-patron visits was very positive. Results demonstrated the methodology employed was effective in increasing overall compliance with the Standards from a rate of 47.4% to 70.0% (P < 0.01). This project led to a recommendation for the development and execution of a national implementation strategy.
Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing
NASA Technical Reports Server (NTRS)
Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane
2012-01-01
Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, for disasters, response requires rapid access to large data volumes, substantial storage space and high performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation describes work conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes incorporating sample Unmanned Airborne System data was developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open-source process code developed on a local prototype platform, and then transitioning this code, with its associated environment requirements, to an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
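The band-math functions named (NDVI, NDMI) are one-liners over the image stack; the band ordering below is an assumption, and a real pipeline would read georeferenced rasters (e.g. with rasterio) rather than random arrays.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + 1e-9)      # epsilon avoids 0/0

def ndmi(nir, swir):
    """Normalized Difference Moisture Index."""
    return (nir - swir) / (nir + swir + 1e-9)

stack = np.random.rand(5, 512, 512)   # assumed [blue, green, red, nir, swir]
vegetation = ndvi(red=stack[2], nir=stack[3])
moisture = ndmi(nir=stack[3], swir=stack[4])
```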
Smith, Roger; Patel, Vipul; Satava, Richard
2014-09-01
There is a need for a standardized curriculum for training and assessment of robotic surgeons to proficiency, followed by high-stakes testing (HST) for certification. To standardize the curriculum and certification of robotic surgeons, a series of consensus conferences attended by 14 leading international surgical societies have been used to compile the outcomes measures and curriculum that should form the basis for a Fundamentals of Robotic Surgery (FRS) programme. A set of 25 outcomes measures and a curriculum for teaching the skills needed to safely use current generation surgical robotic systems has been developed and accepted by a committee of experienced robotic surgeons across 14 specialties. A standardized process for certifying the skills of a robotic surgeon has begun to emerge. The work described here documents both the processes used for developing educational material and the educational content of a robotic curriculum. Copyright © 2013 John Wiley & Sons, Ltd.
Health level 7 development framework for medication administration.
Kim, Hwa Sun; Cho, Hune
2009-01-01
We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process based on an object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings. A standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Health Level 7 Development Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model for understanding international-standard methodology by healthcare professionals and nursing practitioners with the objective of modeling healthcare information systems.
Normative Databases for Imaging Instrumentation.
Realini, Tony; Zangwill, Linda M; Flanagan, John G; Garway-Heath, David; Patella, Vincent M; Johnson, Chris A; Artes, Paul H; Gaddie, Ian B; Fingeret, Murray
2015-08-01
To describe the process by which imaging devices undergo reference database development and regulatory clearance. The limitations and potential improvements of reference (normative) data sets for ophthalmic imaging devices will be discussed. A symposium was held in July 2013 in which a series of speakers discussed issues related to the development of reference databases for imaging devices. Automated imaging has become widely accepted and used in glaucoma management. The ability of such instruments to discriminate healthy from glaucomatous optic nerves, and to detect glaucomatous progression over time is limited by the quality of reference databases associated with the available commercial devices. In the absence of standardized rules governing the development of reference databases, each manufacturer's database differs in size, eligibility criteria, and ethnic make-up, among other key features. The process for development of imaging reference databases may be improved by standardizing eligibility requirements and data collection protocols. Such standardization may also improve the degree to which results may be compared between commercial instruments.
Normative Databases for Imaging Instrumentation
Realini, Tony; Zangwill, Linda; Flanagan, John; Garway-Heath, David; Patella, Vincent Michael; Johnson, Chris; Artes, Paul; Ben Gaddie, I.; Fingeret, Murray
2015-01-01
Purpose To describe the process by which imaging devices undergo reference database development and regulatory clearance. The limitations and potential improvements of reference (normative) data sets for ophthalmic imaging devices will be discussed. Methods A symposium was held in July 2013 in which a series of speakers discussed issues related to the development of reference databases for imaging devices. Results Automated imaging has become widely accepted and used in glaucoma management. The ability of such instruments to discriminate healthy from glaucomatous optic nerves, and to detect glaucomatous progression over time is limited by the quality of reference databases associated with the available commercial devices. In the absence of standardized rules governing the development of reference databases, each manufacturer’s database differs in size, eligibility criteria, and ethnic make-up, among other key features. Conclusions The process for development of imaging reference databases may be improved by standardizing eligibility requirements and data collection protocols. Such standardization may also improve the degree to which results may be compared between commercial instruments. PMID:25265003
On Digital Simulation of Multicorrelated Random Processes and Its Applications. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Sinha, A. K.
1973-01-01
Two methods are described to simulate, on a digital computer, a set of correlated, stationary, Gaussian time series with zero mean from a given matrix of power spectral densities and cross-spectral densities. The first method is based upon trigonometric series with random amplitudes and deterministic phase angles. The random amplitudes are generated using a standard random number generator subroutine. An example is given corresponding to three components of wind velocity at two different spatial locations, for a total of six correlated time series. In the second method, the whole process is carried out using the Fast Fourier Transform approach. This method gives more accurate results and runs about twenty times faster for a set of six correlated time series.
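A minimal sketch of the second (FFT-based) approach follows, under common assumptions: a one-sided cross-spectral density matrix that is Hermitian positive definite at each frequency, factored by Cholesky decomposition and driven by complex Gaussian noise. Scaling conventions vary between texts, and this is not the thesis' exact algorithm.

```python
import numpy as np

def simulate_correlated(S, dt, rng=None):
    """Simulate zero-mean, stationary, correlated Gaussian series from a
    one-sided cross-spectral matrix S[k] (shape nf x n x n) sampled at
    frequencies k/(nt*dt). Assumes each S[k] is Hermitian positive definite."""
    rng = rng or np.random.default_rng()
    nf, n, _ = S.shape
    nt = 2 * (nf - 1)
    df = 1.0 / (nt * dt)
    X = np.zeros((nf, n), dtype=complex)
    for k in range(1, nf - 1):               # skip DC and Nyquist for simplicity
        H = np.linalg.cholesky(S[k])         # S = H H*
        w = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
        X[k] = H @ w * np.sqrt(df / 2.0)     # so that E[X X*] = S * df / 2
    return np.fft.irfft(X, n=nt, axis=0) * nt   # (nt x n) real time series
```

For the wind example in the abstract, S would be a 6 x 6 matrix at each frequency, one row/column per velocity component.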
DHM simulation in virtual environments: a case-study on control room design.
Zamberlan, M; Santos, V; Streit, P; Oliveira, J; Cury, R; Negri, T; Pastura, F; Guimarães, C; Cid, G
2012-01-01
This paper presents the workflow developed for the application of serious games in the design of complex cooperative work settings. The project was based on ergonomic studies and the development of a control room within a participative design process. Our main concerns were the 3D virtual human representation acquired from 3D scanning, human interaction, workspace layout, and equipment designed in accordance with ergonomics standards. Using the Unity3D platform to build the virtual environment, the virtual human model can be controlled by users in a dynamic scenario in order to evaluate the new work settings and simulate work activities. The results obtained showed that this virtual technology can drastically change the design process by improving the level of interaction between final users, managers, and the human factors team.
Measuring housing quality in the absence of a monetized real estate market.
Rindfuss, Ronald R; Piotrowski, Martin; Thongthai, Varachai; Prasartkul, Pramote
2007-03-01
Measuring housing quality or value or both has been a weak component of demographic and development research in less developed countries that lack an active real estate (housing) market. We describe a new method based on a standardized subjective rating process. It is designed to be used in settings that do not have an active, monetized housing market. The method is applied in an ongoing longitudinal study in north-east Thailand and could be straightforwardly used in many other settings. We develop a conceptual model of the process whereby households come to reside in high-quality or low-quality housing units. We use this theoretical model in conjunction with longitudinal data to show that the new method of measuring housing quality behaves as theoretically expected, thus providing evidence of face validity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clough, Katy; Figueras, Pau; Finkel, Hal
In this work, we introduce GRChombo: a new numerical relativity code which incorporates full adaptive mesh refinement (AMR) using block-structured Berger-Rigoutsos grid generation. The code supports non-trivial 'many-boxes-in-many-boxes' mesh hierarchies and massive parallelism through the Message Passing Interface. GRChombo evolves the Einstein equation using the standard BSSN formalism, with an option to turn on CCZ4 constraint damping if required. The AMR capability permits the study of a range of new physics which has previously been computationally infeasible in a full 3+1 setting, while also significantly simplifying the process of setting up the mesh for these problems. We show that GRChombo can stably and accurately evolve standard spacetimes such as binary black hole mergers and scalar collapses into black holes, demonstrate the performance characteristics of our code, and discuss various physics problems which stand to benefit from the AMR technique.
Aaboud, M.; Aad, G.; Abbott, B.; ...
2016-09-28
A search for W′ bosons in events with one lepton (electron or muon) and missing transverse momentum is presented. The search uses 3.2 fb⁻¹ of pp collision data collected at √s = 13 TeV by the ATLAS experiment at the LHC in 2015. The transverse mass distribution is examined and no significant excess of events above the level expected from Standard Model processes is observed. Upper limits on the W′ boson cross-section times branching ratio to leptons are set as a function of the W′ mass. Within the Sequential Standard Model, W′ masses below 4.07 TeV are excluded at the 95% confidence level. This extends the limit set using LHC data at √s = 8 TeV by around 800 GeV.
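For context, the transverse mass distribution examined in such searches is conventionally built from the lepton transverse momentum, the missing transverse momentum, and the azimuthal angle between them; this is the standard definition, supplied here for the reader rather than quoted from the record:

```latex
m_T = \sqrt{2\, p_T^{\ell}\, E_T^{\mathrm{miss}} \left(1 - \cos\Delta\phi_{\ell,\mathrm{miss}}\right)}
```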
Toolkit for visualization of the cellular structure and organelles in Aspergillus niger.
Buren, Emiel B J Ten; Karrenbelt, Michiel A P; Lingemann, Marit; Chordia, Shreyans; Deng, Ying; Hu, JingJing; Verest, Johanna M; Wu, Vincen; Gonzalez, Teresita J Bello; Heck, Ruben G A van; Odoni, Dorett I; Schonewille, Tom; Straat, Laura van der; Graaff, Leo H de; Passel, Mark W J van
2014-12-19
Aspergillus niger is a filamentous fungus that is extensively used in industrial fermentations for protein expression and the production of organic acids. Inherent biosynthetic capabilities, such as the capacity to secrete these biomolecules in high amounts, make A. niger an attractive production host. Although A. niger is renowned for this ability, knowledge of the molecular components that underlie its production capacity, intercellular trafficking processes, and secretion mechanisms is far from complete. Here, we introduce a standardized set of tools consisting of an N-terminal GFP-actin fusion and a codon-optimized eforRed chromoprotein. Expression of the GFP-actin construct facilitates visualization of the actin filaments of the cytoskeleton, whereas expression of the chromoprotein construct results in a clearly distinguishable red phenotype. These experimentally validated constructs constitute the first set of standardized A. niger biomarkers, which can be used to study morphology, intercellular trafficking, and secretion phenomena.
The World Hypertension League: where now and where to in salt reduction
Lackland, Daniel T.; Lisheng, Liu; Zhang, Xin-Hua; Nilsson, Peter M.; Niebylski, Mark L.
2015-01-01
High dietary salt is a leading risk for death and disability, largely by causing increased blood pressure. Other associated health risks include gastric and renal cell cancers, osteoporosis, renal stones, increased disease activity in multiple sclerosis, headache, increased body fat, and Meniere’s disease. The World Hypertension League (WHL) has prioritized advocacy for salt reduction. WHL resources and actions include a non-governmental organization policy statement, a dietary salt fact sheet, development of standardized nomenclature, a call for quality research, collaboration in a weekly salt science update, development of a process to set recommended dietary salt research standards and regular literature reviews, development of adoptable slide sets to support WHL positions and resources, and critique of weak research studies on dietary salt. The WHL plans to continue to work with multiple governmental and non-governmental organizations to promote dietary salt reduction towards the World Health Organization (WHO) recommendations. PMID:26090335
Scoring and setting pass/fail standards for an essay certification examination in nurse-midwifery.
Fullerton, J T; Greener, D L; Gross, L J
1992-03-01
Examination for certification or licensure of health professionals (credentialing) in the United States almost exclusively uses the multiple-choice format. The certification examination for entry into the practice of the profession of nurse-midwifery has, however, used a modified essay format throughout its twenty-year history. The examination has recently undergone a revision in the method for score interpretation and for pass/fail decision-making. The revised method, described in this paper, has important implications for all health professional credentialing agencies that use modified essay, oral, or practical methods of competency assessment. This paper describes criterion-referenced scoring, the process of constructing the essay items, the methods for assuring validity and reliability for the examination, and the manner of standard setting. In addition, two alternative methods for increasing the validity of the pass/fail decision are evaluated, and the rationale for decision-making about marginal candidates is described.
Pan, Qing; Yao, Jialiang; Wang, Ruofan; Cao, Ping; Ning, Gangmin; Fang, Luping
2017-08-01
The vessels in the microcirculation continually adjust their structure to meet the functional requirements of different tissues. A previously developed theoretical model can reproduce this process of vascular structural adaptation and thus support the study of microcirculatory physiology. Until now, however, the model has lacked appropriate methods for setting its parameter values, which has limited further applications. This study proposed an improved quantum-behaved particle swarm optimization (QPSO) algorithm for setting the parameter values in this model. The optimization was performed on a real mesenteric microvascular network of the rat. The results showed that the improved QPSO was superior to standard particle swarm optimization, standard QPSO, and the previously reported Downhill algorithm. We conclude that the improved QPSO leads to better agreement between mathematical simulation and animal experiment, rendering the model more reliable for future physiological studies.
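For readers unfamiliar with the optimizer, here is a minimal sketch of standard QPSO (the mean-best, local-attractor update commonly attributed to Sun et al.), applied to a generic objective; it is illustrative only, not the paper's improved variant or its vascular-adaptation objective.

```python
import numpy as np

def qpso(f, bounds, n_particles=30, iters=200, rng=None):
    """Minimal standard QPSO sketch: minimize f over box bounds."""
    rng = rng or np.random.default_rng()
    lo, hi = np.asarray(bounds, dtype=float).T
    d = lo.size
    x = rng.uniform(lo, hi, (n_particles, d))
    pbest = x.copy()
    pval = np.array([f(p) for p in pbest])
    g = pbest[pval.argmin()].copy()
    for t in range(iters):
        beta = 1.0 - 0.5 * t / iters              # contraction-expansion coefficient
        mbest = pbest.mean(axis=0)                # mean of personal bests
        for i in range(n_particles):
            phi = rng.random(d)
            p = phi * pbest[i] + (1.0 - phi) * g  # local attractor
            u = rng.uniform(1e-12, 1.0, d)
            sign = np.where(rng.random(d) < 0.5, 1.0, -1.0)
            x[i] = np.clip(p + sign * beta * np.abs(mbest - x[i]) * np.log(1.0 / u),
                           lo, hi)
            v = f(x[i])
            if v < pval[i]:
                pbest[i], pval[i] = x[i].copy(), v
        g = pbest[pval.argmin()].copy()
    return g, float(pval.min())

# Example: minimize the 4-dimensional sphere function.
best, val = qpso(lambda z: float((z ** 2).sum()), bounds=[(-5.0, 5.0)] * 4)
```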
NASA Astrophysics Data System (ADS)
Aaboud, M.; Aad, G.; Abbott, B.; Abdallah, J.; Abdinov, O.; Abeloos, B.; Aben, R.; Abouzeid, O. S.; Abraham, N. L.; Abramowicz, H.; Abreu, H.; Abreu, R.; Abulaiti, Y.; Acharya, B. S.; Adamczyk, L.; Adams, D. L.; Adelman, J.; Adomeit, S.; Adye, T.; Affolder, A. A.; Agatonovic-Jovin, T.; Agricola, J.; Aguilar-Saavedra, J. A.; Ahlen, S. P.; Ahmadov, F.; Aielli, G.; Akerstedt, H.; Åkesson, T. P. A.; Akimov, A. V.; Alberghi, G. L.; Albert, J.; Albrand, S.; Alconada Verzini, M. J.; Aleksa, M.; Aleksandrov, I. N.; Alexa, C.; Alexander, G.; Alexopoulos, T.; Alhroob, M.; Ali, B.; Aliev, M.; Alimonti, G.; Alison, J.; Alkire, S. P.; Allbrooke, B. M. M.; Allen, B. W.; Allport, P. P.; Aloisio, A.; Alonso, A.; Alonso, F.; Alpigiani, C.; Alstaty, M.; Alvarez Gonzalez, B.; Álvarez Piqueras, D.; Alviggi, M. G.; Amadio, B. T.; Amako, K.; Amaral Coutinho, Y.; Amelung, C.; Amidei, D.; Amor Dos Santos, S. P.; Amorim, A.; Amoroso, S.; Amundsen, G.; Anastopoulos, C.; Ancu, L. S.; Andari, N.; Andeen, T.; Anders, C. F.; Anders, G.; Anders, J. K.; Anderson, K. J.; Andreazza, A.; Andrei, V.; Angelidakis, S.; Angelozzi, I.; Anger, P.; Angerami, A.; Anghinolfi, F.; Anisenkov, A. V.; Anjos, N.; Annovi, A.; Antel, C.; Antonelli, M.; Antonov, A.; Anulli, F.; Aoki, M.; Aperio Bella, L.; Arabidze, G.; Arai, Y.; Araque, J. P.; Arce, A. T. H.; Arduh, F. A.; Arguin, J.-F.; Argyropoulos, S.; Arik, M.; Armbruster, A. J.; Armitage, L. J.; Arnaez, O.; Arnold, H.; Arratia, M.; Arslan, O.; Artamonov, A.; Artoni, G.; Artz, S.; Asai, S.; Asbah, N.; Ashkenazi, A.; Åsman, B.; Asquith, L.; Assamagan, K.; Astalos, R.; Atkinson, M.; Atlay, N. B.; Augsten, K.; Avolio, G.; Axen, B.; Ayoub, M. K.; Azuelos, G.; Baak, M. A.; Baas, A. E.; Baca, M. J.; Bachacou, H.; Bachas, K.; Backes, M.; Backhaus, M.; Bagiacchi, P.; Bagnaia, P.; Bai, Y.; Baines, J. T.; Baker, O. K.; Baldin, E. M.; Balek, P.; Balestri, T.; Balli, F.; Balunas, W. K.; Banas, E.; Banerjee, Sw.; Bannoura, A. A. E.; Barak, L.; Barberio, E. L.; Barberis, D.; Barbero, M.; Barillari, T.; Barisits, M.-S.; Barklow, T.; Barlow, N.; Barnes, S. L.; Barnett, B. M.; Barnett, R. M.; Barnovska, Z.; Baroncelli, A.; Barone, G.; Barr, A. J.; Barranco Navarro, L.; Barreiro, F.; Barreiro Guimarães da Costa, J.; Bartoldus, R.; Barton, A. E.; Bartos, P.; Basalaev, A.; Bassalat, A.; Bates, R. L.; Batista, S. J.; Batley, J. R.; Battaglia, M.; Bauce, M.; Bauer, F.; Bawa, H. S.; Beacham, J. B.; Beattie, M. D.; Beau, T.; Beauchemin, P. H.; Bechtle, P.; Beck, H. P.; Becker, K.; Becker, M.; Beckingham, M.; Becot, C.; Beddall, A. J.; Beddall, A.; Bednyakov, V. A.; Bedognetti, M.; Bee, C. P.; Beemster, L. J.; Beermann, T. A.; Begel, M.; Behr, J. K.; Belanger-Champagne, C.; Bell, A. S.; Bella, G.; Bellagamba, L.; Bellerive, A.; Bellomo, M.; Belotskiy, K.; Beltramello, O.; Belyaev, N. L.; Benary, O.; Benchekroun, D.; Bender, M.; Bendtz, K.; Benekos, N.; Benhammou, Y.; Benhar Noccioli, E.; Benitez, J.; Benjamin, D. P.; Bensinger, J. R.; Bentvelsen, S.; Beresford, L.; Beretta, M.; Berge, D.; Bergeaas Kuutmann, E.; Berger, N.; Beringer, J.; Berlendis, S.; Bernard, N. R.; Bernius, C.; Bernlochner, F. U.; Berry, T.; Berta, P.; Bertella, C.; Bertoli, G.; Bertolucci, F.; Bertram, I. A.; Bertsche, C.; Bertsche, D.; Besjes, G. J.; Bessidskaia Bylund, O.; Bessner, M.; Besson, N.; Betancourt, C.; Bethke, S.; Bevan, A. J.; Bhimji, W.; Bianchi, R. M.; Bianchini, L.; Bianco, M.; Biebel, O.; Biedermann, D.; Bielski, R.; Biesuz, N. 
V.; Biglietti, M.; Bilbao de Mendizabal, J.; Bilokon, H.; Bindi, M.; Binet, S.; Bingul, A.; Bini, C.; Biondi, S.; Bjergaard, D. M.; Black, C. W.; Black, J. E.; Black, K. M.; Blackburn, D.; Blair, R. E.; Blanchard, J.-B.; Blanco, J. E.; Blazek, T.; Bloch, I.; Blocker, C.; Blum, W.; Blumenschein, U.; Blunier, S.; Bobbink, G. J.; Bobrovnikov, V. S.; Bocchetta, S. S.; Bocci, A.; Bock, C.; Boehler, M.; Boerner, D.; Bogaerts, J. A.; Bogavac, D.; Bogdanchikov, A. G.; Bohm, C.; Boisvert, V.; Bokan, P.; Bold, T.; Boldyrev, A. S.; Bomben, M.; Bona, M.; Boonekamp, M.; Borisov, A.; Borissov, G.; Bortfeldt, J.; Bortoletto, D.; Bortolotto, V.; Bos, K.; Boscherini, D.; Bosman, M.; Bossio Sola, J. D.; Boudreau, J.; Bouffard, J.; Bouhova-Thacker, E. V.; Boumediene, D.; Bourdarios, C.; Boutle, S. K.; Boveia, A.; Boyd, J.; Boyko, I. R.; Bracinik, J.; Brandt, A.; Brandt, G.; Brandt, O.; Bratzler, U.; Brau, B.; Brau, J. E.; Braun, H. M.; Breaden Madden, W. D.; Brendlinger, K.; Brennan, A. J.; Brenner, L.; Brenner, R.; Bressler, S.; Bristow, T. M.; Britton, D.; Britzger, D.; Brochu, F. M.; Brock, I.; Brock, R.; Brooijmans, G.; Brooks, T.; Brooks, W. K.; Brosamer, J.; Brost, E.; Broughton, J. H.; Bruckman de Renstrom, P. A.; Bruncko, D.; Bruneliere, R.; Bruni, A.; Bruni, G.; Bruni, L. S.; Brunt, B. H.; Bruschi, M.; Bruscino, N.; Bryant, P.; Bryngemark, L.; Buanes, T.; Buat, Q.; Buchholz, P.; Buckley, A. G.; Budagov, I. A.; Buehrer, F.; Bugge, M. K.; Bulekov, O.; Bullock, D.; Burckhart, H.; Burdin, S.; Burgard, C. D.; Burghgrave, B.; Burka, K.; Burke, S.; Burmeister, I.; Burr, J. T. P.; Busato, E.; Büscher, D.; Büscher, V.; Bussey, P.; Butler, J. M.; Buttar, C. M.; Butterworth, J. M.; Butti, P.; Buttinger, W.; Buzatu, A.; Buzykaev, A. R.; Cabrera Urbán, S.; Caforio, D.; Cairo, V. M.; Cakir, O.; Calace, N.; Calafiura, P.; Calandri, A.; Calderini, G.; Calfayan, P.; Caloba, L. P.; Calvente Lopez, S.; Calvet, D.; Calvet, S.; Calvet, T. P.; Camacho Toro, R.; Camarda, S.; Camarri, P.; Cameron, D.; Caminal Armadans, R.; Camincher, C.; Campana, S.; Campanelli, M.; Camplani, A.; Campoverde, A.; Canale, V.; Canepa, A.; Cano Bret, M.; Cantero, J.; Cantrill, R.; Cao, T.; Capeans Garrido, M. D. M.; Caprini, I.; Caprini, M.; Capua, M.; Caputo, R.; Carbone, R. M.; Cardarelli, R.; Cardillo, F.; Carli, I.; Carli, T.; Carlino, G.; Carminati, L.; Caron, S.; Carquin, E.; Carrillo-Montoya, G. D.; Carter, J. R.; Carvalho, J.; Casadei, D.; Casado, M. P.; Casolino, M.; Casper, D. W.; Castaneda-Miranda, E.; Castelijn, R.; Castelli, A.; Castillo Gimenez, V.; Castro, N. F.; Catinaccio, A.; Catmore, J. R.; Cattai, A.; Caudron, J.; Cavaliere, V.; Cavallaro, E.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Cerda Alberich, L.; Cerio, B. C.; Cerqueira, A. S.; Cerri, A.; Cerrito, L.; Cerutti, F.; Cerv, M.; Cervelli, A.; Cetin, S. A.; Chafaq, A.; Chakraborty, D.; Chan, S. K.; Chan, Y. L.; Chang, P.; Chapman, J. D.; Charlton, D. G.; Chatterjee, A.; Chau, C. C.; Chavez Barajas, C. A.; Che, S.; Cheatham, S.; Chegwidden, A.; Chekanov, S.; Chekulaev, S. V.; Chelkov, G. A.; Chelstowska, M. A.; Chen, C.; Chen, H.; Chen, K.; Chen, S.; Chen, S.; Chen, X.; Chen, Y.; Cheng, H. C.; Cheng, H. J.; Cheng, Y.; Cheplakov, A.; Cheremushkina, E.; Cherkaoui El Moursli, R.; Chernyatin, V.; Cheu, E.; Chevalier, L.; Chiarella, V.; Chiarelli, G.; Chiodini, G.; Chisholm, A. S.; Chitan, A.; Chizhov, M. V.; Choi, K.; Chomont, A. R.; Chouridou, S.; Chow, B. K. B.; Christodoulou, V.; Chromek-Burckhart, D.; Chudoba, J.; Chuinard, A. J.; Chwastowski, J. 
J.; Chytka, L.; Ciapetti, G.; Ciftci, A. K.; Cinca, D.; Cindro, V.; Cioara, I. A.; Ciocca, C.; Ciocio, A.; Cirotto, F.; Citron, Z. H.; Citterio, M.; Ciubancan, M.; Clark, A.; Clark, B. L.; Clark, M. R.; Clark, P. J.; Clarke, R. N.; Clement, C.; Coadou, Y.; Cobal, M.; Coccaro, A.; Cochran, J.; Coffey, L.; Colasurdo, L.; Cole, B.; Colijn, A. P.; Collot, J.; Colombo, T.; Compostella, G.; Conde Muiño, P.; Coniavitis, E.; Connell, S. H.; Connelly, I. A.; Consorti, V.; Constantinescu, S.; Conti, G.; Conventi, F.; Cooke, M.; Cooper, B. D.; Cooper-Sarkar, A. M.; Cormier, K. J. R.; Cornelissen, T.; Corradi, M.; Corriveau, F.; Corso-Radu, A.; Cortes-Gonzalez, A.; Cortiana, G.; Costa, G.; Costa, M. J.; Costanzo, D.; Cottin, G.; Cowan, G.; Cox, B. E.; Cranmer, K.; Crawley, S. J.; Cree, G.; Crépé-Renaudin, S.; Crescioli, F.; Cribbs, W. A.; Crispin Ortuzar, M.; Cristinziani, M.; Croft, V.; Crosetti, G.; Cuhadar Donszelmann, T.; Cummings, J.; Curatolo, M.; Cúth, J.; Cuthbert, C.; Czirr, H.; Czodrowski, P.; D'Amen, G.; D'Auria, S.; D'Onofrio, M.; da Cunha Sargedas de Sousa, M. J.; da Via, C.; Dabrowski, W.; Dado, T.; Dai, T.; Dale, O.; Dallaire, F.; Dallapiccola, C.; Dam, M.; Dandoy, J. R.; Dang, N. P.; Daniells, A. C.; Dann, N. S.; Danninger, M.; Dano Hoffmann, M.; Dao, V.; Darbo, G.; Darmora, S.; Dassoulas, J.; Dattagupta, A.; Davey, W.; David, C.; Davidek, T.; Davies, M.; Davison, P.; Dawe, E.; Dawson, I.; Daya-Ishmukhametova, R. K.; de, K.; de Asmundis, R.; de Benedetti, A.; de Castro, S.; de Cecco, S.; de Groot, N.; de Jong, P.; de la Torre, H.; de Lorenzi, F.; de Maria, A.; de Pedis, D.; de Salvo, A.; de Sanctis, U.; de Santo, A.; de Vivie de Regie, J. B.; Dearnaley, W. J.; Debbe, R.; Debenedetti, C.; Dedovich, D. V.; Dehghanian, N.; Deigaard, I.; Del Gaudio, M.; Del Peso, J.; Del Prete, T.; Delgove, D.; Deliot, F.; Delitzsch, C. M.; Deliyergiyev, M.; Dell'Acqua, A.; Dell'Asta, L.; Dell'Orso, M.; Della Pietra, M.; Della Volpe, D.; Delmastro, M.; Delsart, P. A.; Demarco, D. A.; Demers, S.; Demichev, M.; Demilly, A.; Denisov, S. P.; Denysiuk, D.; Derendarz, D.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K.; Deterre, C.; Dette, K.; Deviveiros, P. O.; Dewhurst, A.; Dhaliwal, S.; di Ciaccio, A.; di Ciaccio, L.; di Clemente, W. K.; di Donato, C.; di Girolamo, A.; di Girolamo, B.; di Micco, B.; di Nardo, R.; di Simone, A.; di Sipio, R.; di Valentino, D.; Diaconu, C.; Diamond, M.; Dias, F. A.; Diaz, M. A.; Diehl, E. B.; Dietrich, J.; Diglio, S.; Dimitrievska, A.; Dingfelder, J.; Dita, P.; Dita, S.; Dittus, F.; Djama, F.; Djobava, T.; Djuvsland, J. I.; Do Vale, M. A. B.; Dobos, D.; Dobre, M.; Doglioni, C.; Dohmae, T.; Dolejsi, J.; Dolezal, Z.; Dolgoshein, B. A.; Donadelli, M.; Donati, S.; Dondero, P.; Donini, J.; Dopke, J.; Doria, A.; Dova, M. T.; Doyle, A. T.; Drechsler, E.; Dris, M.; Du, Y.; Duarte-Campderros, J.; Duchovni, E.; Duckeck, G.; Ducu, O. A.; Duda, D.; Dudarev, A.; Duffield, E. M.; Duflot, L.; Duguid, L.; Dührssen, M.; Dumancic, M.; Dunford, M.; Duran Yildiz, H.; Düren, M.; Durglishvili, A.; Duschinger, D.; Dutta, B.; Dyndal, M.; Eckardt, C.; Ecker, K. M.; Edgar, R. C.; Edwards, N. C.; Eifert, T.; Eigen, G.; Einsweiler, K.; Ekelof, T.; El Kacimi, M.; Ellajosyula, V.; Ellert, M.; Elles, S.; Ellinghaus, F.; Elliot, A. A.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Emeliyanov, D.; Enari, Y.; Endner, O. C.; Endo, M.; Ennis, J. S.; Erdmann, J.; Ereditato, A.; Ernis, G.; Ernst, J.; Ernst, M.; Errede, S.; Ertel, E.; Escalier, M.; Esch, H.; Escobar, C.; Esposito, B.; Etienvre, A. 
I.; Etzion, E.; Evans, H.; Ezhilov, A.; Fabbri, F.; Fabbri, L.; Facini, G.; Fakhrutdinov, R. M.; Falciano, S.; Falla, R. J.; Faltova, J.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farina, C.; Farina, E. M.; Farooque, T.; Farrell, S.; Farrington, S. M.; Farthouat, P.; Fassi, F.; Fassnacht, P.; Fassouliotis, D.; Faucci Giannelli, M.; Favareto, A.; Fawcett, W. J.; Fayard, L.; Fedin, O. L.; Fedorko, W.; Feigl, S.; Feligioni, L.; Feng, C.; Feng, E. J.; Feng, H.; Fenyuk, A. B.; Feremenga, L.; Fernandez Martinez, P.; Fernandez Perez, S.; Ferrando, J.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferreira de Lima, D. E.; Ferrer, A.; Ferrere, D.; Ferretti, C.; Ferretto Parodi, A.; Fiedler, F.; Filipčič, A.; Filipuzzi, M.; Filthaut, F.; Fincke-Keeler, M.; Finelli, K. D.; Fiolhais, M. C. N.; Fiorini, L.; Firan, A.; Fischer, A.; Fischer, C.; Fischer, J.; Fisher, W. C.; Flaschel, N.; Fleck, I.; Fleischmann, P.; Fletcher, G. T.; Fletcher, R. R. M.; Flick, T.; Floderus, A.; Flores Castillo, L. R.; Flowerdew, M. J.; Forcolin, G. T.; Formica, A.; Forti, A.; Foster, A. G.; Fournier, D.; Fox, H.; Fracchia, S.; Francavilla, P.; Franchini, M.; Francis, D.; Franconi, L.; Franklin, M.; Frate, M.; Fraternali, M.; Freeborn, D.; Fressard-Batraneanu, S. M.; Friedrich, F.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Fullana Torregrosa, E.; Fusayasu, T.; Fuster, J.; Gabaldon, C.; Gabizon, O.; Gabrielli, A.; Gabrielli, A.; Gach, G. P.; Gadatsch, S.; Gadomski, S.; Gagliardi, G.; Gagnon, L. G.; Gagnon, P.; Galea, C.; Galhardo, B.; Gallas, E. J.; Gallop, B. J.; Gallus, P.; Galster, G.; Gan, K. K.; Gao, J.; Gao, Y.; Gao, Y. S.; Garay Walls, F. M.; García, C.; García Navarro, J. E.; Garcia-Sciveres, M.; Gardner, R. W.; Garelli, N.; Garonne, V.; Gascon Bravo, A.; Gatti, C.; Gaudiello, A.; Gaudio, G.; Gaur, B.; Gauthier, L.; Gavrilenko, I. L.; Gay, C.; Gaycken, G.; Gazis, E. N.; Gecse, Z.; Gee, C. N. P.; Geich-Gimbel, Ch.; Geisen, M.; Geisler, M. P.; Gemme, C.; Genest, M. H.; Geng, C.; Gentile, S.; Gentsos, C.; George, S.; Gerbaudo, D.; Gershon, A.; Ghasemi, S.; Ghazlane, H.; Ghneimat, M.; Giacobbe, B.; Giagu, S.; Giannetti, P.; Gibbard, B.; Gibson, S. M.; Gignac, M.; Gilchriese, M.; Gillam, T. P. S.; Gillberg, D.; Gilles, G.; Gingrich, D. M.; Giokaris, N.; Giordani, M. P.; Giorgi, F. M.; Giorgi, F. M.; Giraud, P. F.; Giromini, P.; Giugni, D.; Giuli, F.; Giuliani, C.; Giulini, M.; Gjelsten, B. K.; Gkaitatzis, S.; Gkialas, I.; Gkougkousis, E. L.; Gladilin, L. K.; Glasman, C.; Glatzer, J.; Glaysher, P. C. F.; Glazov, A.; Goblirsch-Kolb, M.; Godlewski, J.; Goldfarb, S.; Golling, T.; Golubkov, D.; Gomes, A.; Gonçalo, R.; Goncalves Pinto Firmino da Costa, J.; Gonella, G.; Gonella, L.; Gongadze, A.; González de La Hoz, S.; Gonzalez Parra, G.; Gonzalez-Sevilla, S.; Goossens, L.; Gorbounov, P. A.; Gordon, H. A.; Gorelov, I.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Goshaw, A. T.; Gössling, C.; Gostkin, M. I.; Goudet, C. R.; Goujdami, D.; Goussiou, A. G.; Govender, N.; Gozani, E.; Graber, L.; Grabowska-Bold, I.; Gradin, P. O. J.; Grafström, P.; Gramling, J.; Gramstad, E.; Grancagnolo, S.; Gratchev, V.; Gravila, P. M.; Gray, H. M.; Graziani, E.; Greenwood, Z. D.; Grefe, C.; Gregersen, K.; Gregor, I. M.; Grenier, P.; Grevtsov, K.; Griffiths, J.; Grillo, A. A.; Grimm, K.; Grinstein, S.; Gris, Ph.; Grivaz, J.-F.; Groh, S.; Grohs, J. P.; Gross, E.; Grosse-Knetter, J.; Grossi, G. C.; Grout, Z. 
J.; Guan, L.; Guan, W.; Guenther, J.; Guescini, F.; Guest, D.; Gueta, O.; Guido, E.; Guillemin, T.; Guindon, S.; Gul, U.; Gumpert, C.; Guo, J.; Guo, Y.; Gupta, R.; Gupta, S.; Gustavino, G.; Gutierrez, P.; Gutierrez Ortiz, N. G.; Gutschow, C.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haber, C.; Hadavand, H. K.; Haddad, N.; Hadef, A.; Haefner, P.; Hageböck, S.; Hajduk, Z.; Hakobyan, H.; Haleem, M.; Haley, J.; Halladjian, G.; Hallewell, G. D.; Hamacher, K.; Hamal, P.; Hamano, K.; Hamilton, A.; Hamity, G. N.; Hamnett, P. G.; Han, L.; Hanagaki, K.; Hanawa, K.; Hance, M.; Haney, B.; Hanisch, S.; Hanke, P.; Hanna, R.; Hansen, J. B.; Hansen, J. D.; Hansen, M. C.; Hansen, P. H.; Hara, K.; Hard, A. S.; Harenberg, T.; Hariri, F.; Harkusha, S.; Harrington, R. D.; Harrison, P. F.; Hartjes, F.; Hartmann, N. M.; Hasegawa, M.; Hasegawa, Y.; Hasib, A.; Hassani, S.; Haug, S.; Hauser, R.; Hauswald, L.; Havranek, M.; Hawkes, C. M.; Hawkings, R. J.; Hayden, D.; Hays, C. P.; Hays, J. M.; Hayward, H. S.; Haywood, S. J.; Head, S. J.; Heck, T.; Hedberg, V.; Heelan, L.; Heim, S.; Heim, T.; Heinemann, B.; Heinrich, J. J.; Heinrich, L.; Heinz, C.; Hejbal, J.; Helary, L.; Hellman, S.; Helsens, C.; Henderson, J.; Henderson, R. C. W.; Heng, Y.; Henkelmann, S.; Henriques Correia, A. M.; Henrot-Versille, S.; Herbert, G. H.; Hernández Jiménez, Y.; Herten, G.; Hertenberger, R.; Hervas, L.; Hesketh, G. G.; Hessey, N. P.; Hetherly, J. W.; Hickling, R.; Higón-Rodriguez, E.; Hill, E.; Hill, J. C.; Hiller, K. H.; Hillier, S. J.; Hinchliffe, I.; Hines, E.; Hinman, R. R.; Hirose, M.; Hirschbuehl, D.; Hobbs, J.; Hod, N.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoenig, F.; Hohn, D.; Holmes, T. R.; Homann, M.; Hong, T. M.; Hooberman, B. H.; Hopkins, W. H.; Horii, Y.; Horton, A. J.; Hostachy, J.-Y.; Hou, S.; Hoummada, A.; Howarth, J.; Hrabovsky, M.; Hristova, I.; Hrivnac, J.; Hryn'ova, T.; Hrynevich, A.; Hsu, C.; Hsu, P. J.; Hsu, S.-C.; Hu, D.; Hu, Q.; Huang, Y.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huffman, T. B.; Hughes, E. W.; Hughes, G.; Huhtinen, M.; Huo, P.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Iakovidis, G.; Ibragimov, I.; Iconomidou-Fayard, L.; Ideal, E.; Idrissi, Z.; Iengo, P.; Igonkina, O.; Iizawa, T.; Ikegami, Y.; Ikeno, M.; Ilchenko, Y.; Iliadis, D.; Ilic, N.; Ince, T.; Introzzi, G.; Ioannou, P.; Iodice, M.; Iordanidou, K.; Ippolito, V.; Ishijima, N.; Ishino, M.; Ishitsuka, M.; Ishmukhametov, R.; Issever, C.; Istin, S.; Ito, F.; Iturbe Ponce, J. M.; Iuppa, R.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jabbar, S.; Jackson, B.; Jackson, M.; Jackson, P.; Jain, V.; Jakobi, K. B.; Jakobs, K.; Jakobsen, S.; Jakoubek, T.; Jamin, D. O.; Jana, D. K.; Jansen, E.; Jansky, R.; Janssen, J.; Janus, M.; Jarlskog, G.; Javadov, N.; Javůrek, T.; Jeanneau, F.; Jeanty, L.; Jejelava, J.; Jeng, G.-Y.; Jennens, D.; Jenni, P.; Jentzsch, J.; Jeske, C.; Jézéquel, S.; Ji, H.; Jia, J.; Jiang, H.; Jiang, Y.; Jiggins, S.; Jimenez Pena, J.; Jin, S.; Jinaru, A.; Jinnouchi, O.; Johansson, P.; Johns, K. A.; Johnson, W. J.; Jon-And, K.; Jones, G.; Jones, R. W. L.; Jones, S.; Jones, T. J.; Jongmanns, J.; Jorge, P. M.; Jovicevic, J.; Ju, X.; Juste Rozas, A.; Köhler, M. K.; Kaczmarska, A.; Kado, M.; Kagan, H.; Kagan, M.; Kahn, S. J.; Kajomovitz, E.; Kalderon, C. W.; Kaluza, A.; Kama, S.; Kamenshchikov, A.; Kanaya, N.; Kaneti, S.; Kanjir, L.; Kantserov, V. A.; Kanzaki, J.; Kaplan, B.; Kaplan, L. S.; Kapliy, A.; Kar, D.; Karakostas, K.; Karamaoun, A.; Karastathis, N.; Kareem, M. 
J.; Karentzos, E.; Karnevskiy, M.; Karpov, S. N.; Karpova, Z. M.; Karthik, K.; Kartvelishvili, V.; Karyukhin, A. N.; Kasahara, K.; Kashif, L.; Kass, R. D.; Kastanas, A.; Kataoka, Y.; Kato, C.; Katre, A.; Katzy, J.; Kawagoe, K.; Kawamoto, T.; Kawamura, G.; Kazama, S.; Kazanin, V. F.; Keeler, R.; Kehoe, R.; Keller, J. S.; Kempster, J. J.; Kentaro, K.; Keoshkerian, H.; Kepka, O.; Kerševan, B. P.; Kersten, S.; Keyes, R. A.; Khader, M.; Khalil-Zada, F.; Khanov, A.; Kharlamov, A. G.; Khoo, T. J.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kido, S.; Kim, H. Y.; Kim, S. H.; Kim, Y. K.; Kimura, N.; Kind, O. M.; King, B. T.; King, M.; King, S. B.; Kirk, J.; Kiryunin, A. E.; Kishimoto, T.; Kisielewska, D.; Kiss, F.; Kiuchi, K.; Kivernyk, O.; Kladiva, E.; Klein, M. H.; Klein, M.; Klein, U.; Kleinknecht, K.; Klimek, P.; Klimentov, A.; Klingenberg, R.; Klinger, J. A.; Klioutchnikova, T.; Kluge, E.-E.; Kluit, P.; Kluth, S.; Knapik, J.; Kneringer, E.; Knoops, E. B. F. G.; Knue, A.; Kobayashi, A.; Kobayashi, D.; Kobayashi, T.; Kobel, M.; Kocian, M.; Kodys, P.; Koffas, T.; Koffeman, E.; Koi, T.; Kolanoski, H.; Kolb, M.; Koletsou, I.; Komar, A. A.; Komori, Y.; Kondo, T.; Kondrashova, N.; Köneke, K.; König, A. C.; Kono, T.; Konoplich, R.; Konstantinidis, N.; Kopeliansky, R.; Koperny, S.; Köpke, L.; Kopp, A. K.; Korcyl, K.; Kordas, K.; Korn, A.; Korol, A. A.; Korolkov, I.; Korolkova, E. V.; Kortner, O.; Kortner, S.; Kosek, T.; Kostyukhin, V. V.; Kotwal, A.; Kourkoumeli-Charalampidi, A.; Kourkoumelis, C.; Kouskoura, V.; Kowalewska, A. B.; Kowalewski, R.; Kowalski, T. Z.; Kozakai, C.; Kozanecki, W.; Kozhin, A. S.; Kramarenko, V. A.; Kramberger, G.; Krasnopevtsev, D.; Krasny, M. W.; Krasznahorkay, A.; Kraus, J. K.; Kravchenko, A.; Kretz, M.; Kretzschmar, J.; Kreutzfeldt, K.; Krieger, P.; Krizka, K.; Kroeninger, K.; Kroha, H.; Kroll, J.; Kroseberg, J.; Krstic, J.; Kruchonak, U.; Krüger, H.; Krumnack, N.; Kruse, A.; Kruse, M. C.; Kruskal, M.; Kubota, T.; Kucuk, H.; Kuday, S.; Kuechler, J. T.; Kuehn, S.; Kugel, A.; Kuger, F.; Kuhl, A.; Kuhl, T.; Kukhtin, V.; Kukla, R.; Kulchitsky, Y.; Kuleshov, S.; Kuna, M.; Kunigo, T.; Kupco, A.; Kurashige, H.; Kurochkin, Y. A.; Kus, V.; Kuwertz, E. S.; Kuze, M.; Kvita, J.; Kwan, T.; Kyriazopoulos, D.; La Rosa, A.; La Rosa Navarro, J. L.; La Rotonda, L.; Lacasta, C.; Lacava, F.; Lacey, J.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Lammers, S.; Lampl, W.; Lançon, E.; Landgraf, U.; Landon, M. P. J.; Lang, V. S.; Lange, J. C.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Laplace, S.; Lapoire, C.; Laporte, J. F.; Lari, T.; Lasagni Manghi, F.; Lassnig, M.; Laurelli, P.; Lavrijsen, W.; Law, A. T.; Laycock, P.; Lazovich, T.; Lazzaroni, M.; Le, B.; Le Dortz, O.; Le Guirriec, E.; Le Quilleuc, E. P.; Leblanc, M.; Lecompte, T.; Ledroit-Guillon, F.; Lee, C. A.; Lee, S. C.; Lee, L.; Lefebvre, G.; Lefebvre, M.; Legger, F.; Leggett, C.; Lehan, A.; Lehmann Miotto, G.; Lei, X.; Leight, W. A.; Leisos, A.; Leister, A. G.; Leite, M. A. L.; Leitner, R.; Lellouch, D.; Lemmer, B.; Leney, K. J. C.; Lenz, T.; Lenzi, B.; Leone, R.; Leone, S.; Leonidopoulos, C.; Leontsinis, S.; Lerner, G.; Leroy, C.; Lesage, A. A. J.; Lester, C. G.; Levchenko, M.; Levêque, J.; Levin, D.; Levinson, L. J.; Levy, M.; Lewis, D.; Leyko, A. M.; Leyton, M.; Li, B.; Li, H.; Li, H. L.; Li, L.; Li, L.; Li, Q.; Li, S.; Li, X.; Li, Y.; Liang, Z.; Liberti, B.; Liblong, A.; Lichard, P.; Lie, K.; Liebal, J.; Liebig, W.; Limosani, A.; Lin, S. C.; Lin, T. H.; Lindquist, B. 
E.; Lionti, A. E.; Lipeles, E.; Lipniacka, A.; Lisovyi, M.; Liss, T. M.; Lister, A.; Litke, A. M.; Liu, B.; Liu, D.; Liu, H.; Liu, H.; Liu, J.; Liu, J. B.; Liu, K.; Liu, L.; Liu, M.; Liu, M.; Liu, Y. L.; Liu, Y.; Livan, M.; Lleres, A.; Llorente Merino, J.; Lloyd, S. L.; Lo Sterzo, F.; Lobodzinska, E.; Loch, P.; Lockman, W. S.; Loebinger, F. K.; Loevschall-Jensen, A. E.; Loew, K. M.; Loginov, A.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Long, B. A.; Long, J. D.; Long, R. E.; Longo, L.; Looper, K. A.; Lopes, L.; Lopez Mateos, D.; Lopez Paredes, B.; Lopez Paz, I.; Lopez Solis, A.; Lorenz, J.; Lorenzo Martinez, N.; Losada, M.; Lösel, P. J.; Lou, X.; Lounis, A.; Love, J.; Love, P. A.; Lu, H.; Lu, N.; Lubatti, H. J.; Luci, C.; Lucotte, A.; Luedtke, C.; Luehring, F.; Lukas, W.; Luminari, L.; Lundberg, O.; Lund-Jensen, B.; Luzi, P. M.; Lynn, D.; Lysak, R.; Lytken, E.; Lyubushkin, V.; Ma, H.; Ma, L. L.; Ma, Y.; Maccarrone, G.; Macchiolo, A.; MacDonald, C. M.; Maček, B.; Machado Miguens, J.; Madaffari, D.; Madar, R.; Maddocks, H. J.; Mader, W. F.; Madsen, A.; Maeda, J.; Maeland, S.; Maeno, T.; Maevskiy, A.; Magradze, E.; Mahlstedt, J.; Maiani, C.; Maidantchik, C.; Maier, A. A.; Maier, T.; Maio, A.; Majewski, S.; Makida, Y.; Makovec, N.; Malaescu, B.; Malecki, Pa.; Maleev, V. P.; Malek, F.; Mallik, U.; Malon, D.; Malone, C.; Maltezos, S.; Malyukov, S.; Mamuzic, J.; Mancini, G.; Mandelli, B.; Mandelli, L.; Mandić, I.; Maneira, J.; Manhaes de Andrade Filho, L.; Manjarres Ramos, J.; Mann, A.; Manousos, A.; Mansoulie, B.; Mansour, J. D.; Mantifel, R.; Mantoani, M.; Manzoni, S.; Mapelli, L.; Marceca, G.; March, L.; Marchiori, G.; Marcisovsky, M.; Marjanovic, M.; Marley, D. E.; Marroquim, F.; Marsden, S. P.; Marshall, Z.; Marti-Garcia, S.; Martin, B.; Martin, T. A.; Martin, V. J.; Martin Dit Latour, B.; Martinez, M.; Martinez Outschoorn, V. I.; Martin-Haugh, S.; Martoiu, V. S.; Martyniuk, A. C.; Marx, M.; Marzin, A.; Masetti, L.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. L.; Massa, I.; Massa, L.; Mastrandrea, P.; Mastroberardino, A.; Masubuchi, T.; Mättig, P.; Mattmann, J.; Maurer, J.; Maxfield, S. J.; Maximov, D. A.; Mazini, R.; Mazza, S. M.; Mc Fadden, N. C.; Mc Goldrick, G.; Mc Kee, S. P.; McCarn, A.; McCarthy, R. L.; McCarthy, T. G.; McClymont, L. I.; McDonald, E. F.; McFayden, J. A.; McHedlidze, G.; McMahon, S. J.; McPherson, R. A.; Medinnis, M.; Meehan, S.; Mehlhase, S.; Mehta, A.; Meier, K.; Meineck, C.; Meirose, B.; Melini, D.; Mellado Garcia, B. R.; Melo, M.; Meloni, F.; Mengarelli, A.; Menke, S.; Meoni, E.; Mergelmeyer, S.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F. S.; Messina, A.; Metcalfe, J.; Mete, A. S.; Meyer, C.; Meyer, C.; Meyer, J.-P.; Meyer, J.; Meyer Zu Theenhausen, H.; Miano, F.; Middleton, R. P.; Miglioranzi, S.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikuž, M.; Milesi, M.; Milic, A.; Miller, D. W.; Mills, C.; Milov, A.; Milstead, D. A.; Minaenko, A. A.; Minami, Y.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L. M.; Mistry, K. P.; Mitani, T.; Mitrevski, J.; Mitsou, V. A.; Miucci, A.; Miyagawa, P. S.; Mjörnmark, J. U.; Moa, T.; Mochizuki, K.; Mohapatra, S.; Molander, S.; Moles-Valls, R.; Monden, R.; Mondragon, M. C.; Mönig, K.; Monk, J.; Monnier, E.; Montalbano, A.; Montejo Berlingen, J.; Monticelli, F.; Monzani, S.; Moore, R. W.; Morange, N.; Moreno, D.; Moreno Llácer, M.; Morettini, P.; Mori, D.; Mori, T.; Morii, M.; Morinaga, M.; Morisbak, V.; Moritz, S.; Morley, A. K.; Mornacchi, G.; Morris, J. D.; Mortensen, S. 
S.; Morvaj, L.; Mosidze, M.; Moss, J.; Motohashi, K.; Mount, R.; Mountricha, E.; Mouraviev, S. V.; Moyse, E. J. W.; Muanza, S.; Mudd, R. D.; Mueller, F.; Mueller, J.; Mueller, R. S. P.; Mueller, T.; Muenstermann, D.; Mullen, P.; Mullier, G. A.; Munoz Sanchez, F. J.; Murillo Quijada, J. A.; Murray, W. J.; Musheghyan, H.; Muškinja, M.; Myagkov, A. G.; Myska, M.; Nachman, B. P.; Nackenhorst, O.; Nagai, K.; Nagai, R.; Nagano, K.; Nagasaka, Y.; Nagata, K.; Nagel, M.; Nagy, E.; Nairz, A. M.; Nakahama, Y.; Nakamura, K.; Nakamura, T.; Nakano, I.; Namasivayam, H.; Naranjo Garcia, R. F.; Narayan, R.; Narrias Villar, D. I.; Naryshkin, I.; Naumann, T.; Navarro, G.; Nayyar, R.; Neal, H. A.; Nechaeva, P. Yu.; Neep, T. J.; Nef, P. D.; Negri, A.; Negrini, M.; Nektarijevic, S.; Nellist, C.; Nelson, A.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Neubauer, M. S.; Neumann, M.; Neves, R. M.; Nevski, P.; Newman, P. R.; Nguyen, D. H.; Nguyen Manh, T.; Nickerson, R. B.; Nicolaidou, R.; Nielsen, J.; Nikiforov, A.; Nikolaenko, V.; Nikolic-Audit, I.; Nikolopoulos, K.; Nilsen, J. K.; Nilsson, P.; Ninomiya, Y.; Nisati, A.; Nisius, R.; Nobe, T.; Nodulman, L.; Nomachi, M.; Nomidis, I.; Nooney, T.; Norberg, S.; Nordberg, M.; Norjoharuddeen, N.; Novgorodova, O.; Nowak, S.; Nozaki, M.; Nozka, L.; Ntekas, K.; Nurse, E.; Nuti, F.; O'Grady, F.; O'Neil, D. C.; O'Rourke, A. A.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Obermann, T.; Ocariz, J.; Ochi, A.; Ochoa, I.; Ochoa-Ricoux, J. P.; Oda, S.; Odaka, S.; Ogren, H.; Oh, A.; Oh, S. H.; Ohm, C. C.; Ohman, H.; Oide, H.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olariu, A.; Oleiro Seabra, L. F.; Olivares Pino, S. A.; Oliveira Damazio, D.; Olszewski, A.; Olszowska, J.; Onofre, A.; Onogi, K.; Onyisi, P. U. E.; Oreglia, M. J.; Oren, Y.; Orestano, D.; Orlando, N.; Orr, R. S.; Osculati, B.; Ospanov, R.; Otero Y Garzon, G.; Otono, H.; Ouchrif, M.; Ould-Saada, F.; Ouraou, A.; Oussoren, K. P.; Ouyang, Q.; Owen, M.; Owen, R. E.; Ozcan, V. E.; Ozturk, N.; Pachal, K.; Pacheco Pages, A.; Pacheco Rodriguez, L.; Padilla Aranda, C.; Pagáčová, M.; Pagan Griso, S.; Paige, F.; Pais, P.; Pajchel, K.; Palacino, G.; Palazzo, S.; Palestini, S.; Palka, M.; Pallin, D.; Palma, A.; St. Panagiotopoulou, E.; Pandini, C. E.; Panduro Vazquez, J. G.; Pani, P.; Panitkin, S.; Pantea, D.; Paolozzi, L.; Papadopoulou, Th. D.; Papageorgiou, K.; Paramonov, A.; Paredes Hernandez, D.; Parker, A. J.; Parker, M. A.; Parker, K. A.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pascuzzi, V. R.; Pasqualucci, E.; Passaggio, S.; Pastore, Fr.; Pásztor, G.; Pataraia, S.; Pater, J. R.; Pauly, T.; Pearce, J.; Pearson, B.; Pedersen, L. E.; Pedersen, M.; Pedraza Lopez, S.; Pedro, R.; Peleganchuk, S. V.; Pelikan, D.; Penc, O.; Peng, C.; Peng, H.; Penwell, J.; Peralva, B. S.; Perego, M. M.; Perepelitsa, D. V.; Perez Codina, E.; Perini, L.; Pernegger, H.; Perrella, S.; Peschke, R.; Peshekhonov, V. D.; Peters, K.; Peters, R. F. Y.; Petersen, B. A.; Petersen, T. C.; Petit, E.; Petridis, A.; Petridou, C.; Petroff, P.; Petrolo, E.; Petrov, M.; Petrucci, F.; Pettersson, N. E.; Peyaud, A.; Pezoa, R.; Phillips, P. W.; Piacquadio, G.; Pianori, E.; Picazio, A.; Piccaro, E.; Piccinini, M.; Pickering, M. A.; Piegaia, R.; Pilcher, J. E.; Pilkington, A. D.; Pin, A. W. J.; Pinamonti, M.; Pinfold, J. L.; Pingel, A.; Pires, S.; Pirumov, H.; Pitt, M.; Plazak, L.; Pleier, M.-A.; Pleskot, V.; Plotnikova, E.; Plucinski, P.; Pluth, D.; Poettgen, R.; Poggioli, L.; Pohl, D.; Polesello, G.; Poley, A.; Policicchio, A.; Polifka, R.; Polini, A.; Pollard, C. 
S.; Polychronakos, V.; Pommès, K.; Pontecorvo, L.; Pope, B. G.; Popeneciu, G. A.; Popovic, D. S.; Poppleton, A.; Pospisil, S.; Potamianos, K.; Potrap, I. N.; Potter, C. J.; Potter, C. T.; Poulard, G.; Poveda, J.; Pozdnyakov, V.; Pozo Astigarraga, M. E.; Pralavorio, P.; Pranko, A.; Prell, S.; Price, D.; Price, L. E.; Primavera, M.; Prince, S.; Proissl, M.; Prokofiev, K.; Prokoshin, F.; Protopopescu, S.; Proudfoot, J.; Przybycien, M.; Puddu, D.; Purohit, M.; Puzo, P.; Qian, J.; Qin, G.; Qin, Y.; Quadt, A.; Quayle, W. B.; Queitsch-Maitland, M.; Quilty, D.; Raddum, S.; Radeka, V.; Radescu, V.; Radhakrishnan, S. K.; Radloff, P.; Rados, P.; Ragusa, F.; Rahal, G.; Raine, J. A.; Rajagopalan, S.; Rammensee, M.; Rangel-Smith, C.; Ratti, M. G.; Rauscher, F.; Rave, S.; Ravenscroft, T.; Ravinovich, I.; Raymond, M.; Read, A. L.; Readioff, N. P.; Reale, M.; Rebuzzi, D. M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Rehnisch, L.; Reichert, J.; Reisin, H.; Rembser, C.; Ren, H.; Rescigno, M.; Resconi, S.; Rezanova, O. L.; Reznicek, P.; Rezvani, R.; Richter, R.; Richter, S.; Richter-Was, E.; Ricken, O.; Ridel, M.; Rieck, P.; Riegel, C. J.; Rieger, J.; Rifki, O.; Rijssenbeek, M.; Rimoldi, A.; Rimoldi, M.; Rinaldi, L.; Ristić, B.; Ritsch, E.; Riu, I.; Rizatdinova, F.; Rizvi, E.; Rizzi, C.; Robertson, S. H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, J. E. M.; Robson, A.; Roda, C.; Rodina, Y.; Rodriguez Perez, A.; Rodriguez Rodriguez, D.; Roe, S.; Rogan, C. S.; Røhne, O.; Romaniouk, A.; Romano, M.; Romano Saez, S. M.; Romero Adam, E.; Rompotis, N.; Ronzani, M.; Roos, L.; Ros, E.; Rosati, S.; Rosbach, K.; Rose, P.; Rosenthal, O.; Rosien, N.-A.; Rossetti, V.; Rossi, E.; Rossi, L. P.; Rosten, J. H. N.; Rosten, R.; Rotaru, M.; Roth, I.; Rothberg, J.; Rousseau, D.; Royon, C. R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Rubbo, F.; Rudolph, M. S.; Rühr, F.; Ruiz-Martinez, A.; Rurikova, Z.; Rusakovich, N. A.; Ruschke, A.; Russell, H. L.; Rutherfoord, J. P.; Ruthmann, N.; Ryabov, Y. F.; Rybar, M.; Rybkin, G.; Ryu, S.; Ryzhov, A.; Rzehorz, G. F.; Saavedra, A. F.; Sabato, G.; Sacerdoti, S.; Sadrozinski, H. F.-W.; Sadykov, R.; Safai Tehrani, F.; Saha, P.; Sahinsoy, M.; Saimpert, M.; Saito, T.; Sakamoto, H.; Sakurai, Y.; Salamanna, G.; Salamon, A.; Salazar Loyola, J. E.; Salek, D.; Sales de Bruin, P. H.; Salihagic, D.; Salnikov, A.; Salt, J.; Salvatore, D.; Salvatore, F.; Salvucci, A.; Salzburger, A.; Sammel, D.; Sampsonidis, D.; Sanchez, A.; Sánchez, J.; Sanchez Martinez, V.; Sandaker, H.; Sandbach, R. L.; Sander, H. G.; Sandhoff, M.; Sandoval, C.; Sandstroem, R.; Sankey, D. P. C.; Sannino, M.; Sansoni, A.; Santoni, C.; Santonico, R.; Santos, H.; Santoyo Castillo, I.; Sapp, K.; Sapronov, A.; Saraiva, J. G.; Sarrazin, B.; Sasaki, O.; Sasaki, Y.; Sato, K.; Sauvage, G.; Sauvan, E.; Savage, G.; Savard, P.; Sawyer, C.; Sawyer, L.; Saxon, J.; Sbarra, C.; Sbrizzi, A.; Scanlon, T.; Scannicchio, D. A.; Scarcella, M.; Scarfone, V.; Schaarschmidt, J.; Schacht, P.; Schachtner, B. M.; Schaefer, D.; Schaefer, R.; Schaeffer, J.; Schaepe, S.; Schaetzel, S.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schamberger, R. D.; Scharf, V.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Schiavi, C.; Schier, S.; Schillo, C.; Schioppa, M.; Schlenker, S.; Schmidt-Sommerfeld, K. R.; Schmieden, K.; Schmitt, C.; Schmitt, S.; Schmitz, S.; Schneider, B.; Schnoor, U.; Schoeffel, L.; Schoening, A.; Schoenrock, B. D.; Schopf, E.; Schott, M.; Schovancova, J.; Schramm, S.; Schreyer, M.; Schuh, N.; Schulte, A.; Schultens, M. 
J.; Schultz-Coulon, H.-C.; Schulz, H.; Schumacher, M.; Schumm, B. A.; Schune, Ph.; Schwartzman, A.; Schwarz, T. A.; Schwegler, Ph.; Schweiger, H.; Schwemling, Ph.; Schwienhorst, R.; Schwindling, J.; Schwindt, T.; Sciolla, G.; Scuri, F.; Scutti, F.; Searcy, J.; Seema, P.; Seidel, S. C.; Seiden, A.; Seifert, F.; Seixas, J. M.; Sekhniaidze, G.; Sekhon, K.; Sekula, S. J.; Seliverstov, D. M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Serkin, L.; Sessa, M.; Seuster, R.; Severini, H.; Sfiligoj, T.; Sforza, F.; Sfyrla, A.; Shabalina, E.; Shaikh, N. W.; Shan, L. Y.; Shang, R.; Shank, J. T.; Shapiro, M.; Shatalov, P. B.; Shaw, K.; Shaw, S. M.; Shcherbakova, A.; Shehu, C. Y.; Sherwood, P.; Shi, L.; Shimizu, S.; Shimmin, C. O.; Shimojima, M.; Shiyakova, M.; Shmeleva, A.; Shoaleh Saadi, D.; Shochet, M. J.; Shojaii, S.; Shrestha, S.; Shulga, E.; Shupe, M. A.; Sicho, P.; Sickles, A. M.; Sidebo, P. E.; Sidiropoulou, O.; Sidorov, D.; Sidoti, A.; Siegert, F.; Sijacki, Dj.; Silva, J.; Silverstein, S. B.; Simak, V.; Simard, O.; Simic, Lj.; Simion, S.; Simioni, E.; Simmons, B.; Simon, D.; Simon, M.; Sinervo, P.; Sinev, N. B.; Sioli, M.; Siragusa, G.; Sivoklokov, S. Yu.; Sjölin, J.; Skinner, M. B.; Skottowe, H. P.; Skubic, P.; Slater, M.; Slavicek, T.; Slawinska, M.; Sliwa, K.; Slovak, R.; Smakhtin, V.; Smart, B. H.; Smestad, L.; Smiesko, J.; Smirnov, S. Yu.; Smirnov, Y.; Smirnova, L. N.; Smirnova, O.; Smith, M. N. K.; Smith, R. W.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snyder, S.; Sobie, R.; Socher, F.; Soffer, A.; Soh, D. A.; Sokhrannyi, G.; Solans Sanchez, C. A.; Solar, M.; Soldatov, E. Yu.; Soldevila, U.; Solodkov, A. A.; Soloshenko, A.; Solovyanov, O. V.; Solovyev, V.; Sommer, P.; Son, H.; Song, H. Y.; Sood, A.; Sopczak, A.; Sopko, V.; Sorin, V.; Sosa, D.; Sotiropoulou, C. L.; Soualah, R.; Soukharev, A. M.; South, D.; Sowden, B. C.; Spagnolo, S.; Spalla, M.; Spangenberg, M.; Spanò, F.; Sperlich, D.; Spettel, F.; Spighi, R.; Spigo, G.; Spiller, L. A.; Spousta, M.; St. Denis, R. D.; Stabile, A.; Stamen, R.; Stamm, S.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stanescu-Bellu, M.; Stanitzki, M. M.; Stapnes, S.; Starchenko, E. A.; Stark, G. H.; Stark, J.; Staroba, P.; Starovoitov, P.; Stärz, S.; Staszewski, R.; Steinberg, P.; Stelzer, B.; Stelzer, H. J.; Stelzer-Chilton, O.; Stenzel, H.; Stewart, G. A.; Stillings, J. A.; Stockton, M. C.; Stoebe, M.; Stoicea, G.; Stolte, P.; Stonjek, S.; Stradling, A. R.; Straessner, A.; Stramaglia, M. E.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strauss, M.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Stroynowski, R.; Strubig, A.; Stucci, S. A.; Stugu, B.; Styles, N. A.; Su, D.; Su, J.; Subramaniam, R.; Suchek, S.; Sugaya, Y.; Suk, M.; Sulin, V. V.; Sultansoy, S.; Sumida, T.; Sun, S.; Sun, X.; Sundermann, J. E.; Suruliz, K.; Susinno, G.; Sutton, M. R.; Suzuki, S.; Svatos, M.; Swiatlowski, M.; Sykora, I.; Sykora, T.; Ta, D.; Taccini, C.; Tackmann, K.; Taenzer, J.; Taffard, A.; Tafirout, R.; Taiblum, N.; Takai, H.; Takashima, R.; Takeshita, T.; Takubo, Y.; Talby, M.; Talyshev, A. A.; Tan, K. G.; Tanaka, J.; Tanaka, R.; Tanaka, S.; Tannenwald, B. B.; Tapia Araya, S.; Tapprogge, S.; Tarem, S.; Tartarelli, G. F.; Tas, P.; Tasevsky, M.; Tashiro, T.; Tassi, E.; Tavares Delgado, A.; Tayalati, Y.; Taylor, A. C.; Taylor, G. N.; Taylor, P. T. E.; Taylor, W.; Teischinger, F. A.; Teixeira-Dias, P.; Temming, K. K.; Temple, D.; Ten Kate, H.; Teng, P. K.; Teoh, J. J.; Tepel, F.; Terada, S.; Terashi, K.; Terron, J.; Terzo, S.; Testa, M.; Teuscher, R. 
J.; Theveneaux-Pelzer, T.; Thomas, J. P.; Thomas-Wilsker, J.; Thompson, E. N.; Thompson, P. D.; Thompson, A. S.; Thomsen, L. A.; Thomson, E.; Thomson, M.; Tibbetts, M. J.; Ticse Torres, R. E.; Tikhomirov, V. O.; Tikhonov, Yu. A.; Timoshenko, S.; Tipton, P.; Tisserant, S.; Todome, K.; Todorov, T.; Todorova-Nova, S.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tolley, E.; Tomlinson, L.; Tomoto, M.; Tompkins, L.; Toms, K.; Tong, B.; Torrence, E.; Torres, H.; Torró Pastor, E.; Toth, J.; Touchard, F.; Tovey, D. R.; Trefzger, T.; Tricoli, A.; Trigger, I. M.; Trincaz-Duvoid, S.; Tripiana, M. F.; Trischuk, W.; Trocmé, B.; Trofymov, A.; Troncon, C.; Trottier-McDonald, M.; Trovatelli, M.; Truong, L.; Trzebinski, M.; Trzupek, A.; Tseng, J. C.-L.; Tsiareshka, P. V.; Tsipolitis, G.; Tsirintanis, N.; Tsiskaridze, S.; Tsiskaridze, V.; Tskhadadze, E. G.; Tsui, K. M.; Tsukerman, I. I.; Tsulaia, V.; Tsuno, S.; Tsybychev, D.; Tudorache, A.; Tudorache, V.; Tuna, A. N.; Tupputi, S. A.; Turchikhin, S.; Turecek, D.; Turgeman, D.; Turra, R.; Turvey, A. J.; Tuts, P. M.; Tyndel, M.; Ucchielli, G.; Ueda, I.; Ughetto, M.; Ukegawa, F.; Unal, G.; Undrus, A.; Unel, G.; Ungaro, F. C.; Unno, Y.; Unverdorben, C.; Urban, J.; Urquijo, P.; Urrejola, P.; Usai, G.; Usanova, A.; Vacavant, L.; Vacek, V.; Vachon, B.; Valderanis, C.; Valdes Santurio, E.; Valencic, N.; Valentinetti, S.; Valero, A.; Valery, L.; Valkar, S.; Vallecorsa, S.; Valls Ferrer, J. A.; van den Wollenberg, W.; van der Deijl, P. C.; van der Geer, R.; van der Graaf, H.; van Eldik, N.; van Gemmeren, P.; van Nieuwkoop, J.; van Vulpen, I.; van Woerden, M. C.; Vanadia, M.; Vandelli, W.; Vanguri, R.; Vaniachine, A.; Vankov, P.; Vardanyan, G.; Vari, R.; Varnes, E. W.; Varol, T.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vasquez, J. G.; Vazeille, F.; Vazquez Schroeder, T.; Veatch, J.; Veloce, L. M.; Veloso, F.; Veneziano, S.; Ventura, A.; Venturi, M.; Venturi, N.; Venturini, A.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vest, A.; Vetterli, M. C.; Viazlo, O.; Vichou, I.; Vickey, T.; Vickey Boeriu, O. E.; Viehhauser, G. H. A.; Viel, S.; Vigani, L.; Vigne, R.; Villa, M.; Villaplana Perez, M.; Vilucchi, E.; Vincter, M. G.; Vinogradov, V. B.; Vittori, C.; Vivarelli, I.; Vlachos, S.; Vlasak, M.; Vogel, M.; Vokac, P.; Volpi, G.; Volpi, M.; von der Schmitt, H.; von Toerne, E.; Vorobel, V.; Vorobev, K.; Vos, M.; Voss, R.; Vossebeld, J. H.; Vranjes, N.; Vranjes Milosavljevic, M.; Vrba, V.; Vreeswijk, M.; Vuillermet, R.; Vukotic, I.; Vykydal, Z.; Wagner, P.; Wagner, W.; Wahlberg, H.; Wahrmund, S.; Wakabayashi, J.; Walder, J.; Walker, R.; Walkowiak, W.; Wallangen, V.; Wang, C.; Wang, C.; Wang, F.; Wang, H.; Wang, H.; Wang, J.; Wang, J.; Wang, K.; Wang, R.; Wang, S. M.; Wang, T.; Wang, T.; Wang, W.; Wang, X.; Wanotayaroj, C.; Warburton, A.; Ward, C. P.; Wardrope, D. R.; Washbrook, A.; Watkins, P. M.; Watson, A. T.; Watson, M. F.; Watts, G.; Watts, S.; Waugh, B. M.; Webb, S.; Weber, M. S.; Weber, S. W.; Webster, J. S.; Weidberg, A. R.; Weinert, B.; Weingarten, J.; Weiser, C.; Weits, H.; Wells, P. S.; Wenaus, T.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, M. D.; Werner, P.; Wessels, M.; Wetter, J.; Whalen, K.; Whallon, N. L.; Wharton, A. M.; White, A.; White, M. J.; White, R.; Whiteson, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik-Fuchs, L. A. M.; Wildauer, A.; Wilk, F.; Wilkens, H. G.; Williams, H. H.; Williams, S.; Willis, C.; Willocq, S.; Wilson, J. A.; Wingerter-Seez, I.; Winklmeier, F.; Winston, O. 
J.; Winter, B. T.; Wittgen, M.; Wittkowski, J.; Wolter, M. W.; Wolters, H.; Worm, S. D.; Wosiek, B. K.; Wotschack, J.; Woudstra, M. J.; Wozniak, K. W.; Wu, M.; Wu, M.; Wu, S. L.; Wu, X.; Wu, Y.; Wyatt, T. R.; Wynne, B. M.; Xella, S.; Xu, D.; Xu, L.; Yabsley, B.; Yacoob, S.; Yakabe, R.; Yamaguchi, D.; Yamaguchi, Y.; Yamamoto, A.; Yamamoto, S.; Yamanaka, T.; Yamauchi, K.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, H.; Yang, Y.; Yang, Z.; Yao, W.-M.; Yap, Y. C.; Yasu, Y.; Yatsenko, E.; Yau Wong, K. H.; Ye, J.; Ye, S.; Yeletskikh, I.; Yen, A. L.; Yildirim, E.; Yorita, K.; Yoshida, R.; Yoshihara, K.; Young, C.; Young, C. J. S.; Youssef, S.; Yu, D. R.; Yu, J.; Yu, J. M.; Yu, J.; Yuan, L.; Yuen, S. P. Y.; Yusuff, I.; Zabinski, B.; Zaidan, R.; Zaitsev, A. M.; Zakharchuk, N.; Zalieckas, J.; Zaman, A.; Zambito, S.; Zanello, L.; Zanzi, D.; Zeitnitz, C.; Zeman, M.; Zemla, A.; Zeng, J. C.; Zeng, Q.; Zengel, K.; Zenin, O.; Ženiš, T.; Zerwas, D.; Zhang, D.; Zhang, F.; Zhang, G.; Zhang, H.; Zhang, J.; Zhang, L.; Zhang, R.; Zhang, R.; Zhang, X.; Zhang, Z.; Zhao, X.; Zhao, Y.; Zhao, Z.; Zhemchugov, A.; Zhong, J.; Zhou, B.; Zhou, C.; Zhou, L.; Zhou, L.; Zhou, M.; Zhou, N.; Zhu, C. G.; Zhu, H.; Zhu, J.; Zhu, Y.; Zhuang, X.; Zhukov, K.; Zibell, A.; Zieminska, D.; Zimine, N. I.; Zimmermann, C.; Zimmermann, S.; Zinonos, Z.; Zinser, M.; Ziolkowski, M.; Živković, L.; Zobernig, G.; Zoccoli, A.; Zur Nedden, M.; Zwalinski, L.; Atlas Collaboration
2016-11-01
A search for W′ bosons in events with one lepton (electron or muon) and missing transverse momentum is presented. The search uses 3.2 fb⁻¹ of pp collision data collected at √s = 13 TeV by the ATLAS experiment at the LHC in 2015. The transverse mass distribution is examined and no significant excess of events above the level expected from Standard Model processes is observed. Upper limits on the W′ boson cross-section times branching ratio to leptons are set as a function of the W′ mass. Within the Sequential Standard Model, W′ masses below 4.07 TeV are excluded at the 95% confidence level. This extends the limit set using LHC data at √s = 8 TeV by around 800 GeV.
Adding fling effects to processed ground‐motion time histories
Kamai, Ronnie; Abrahamson, Norman A.; Graves, Robert
2014-01-01
Fling is the engineering term for the effects, in recorded ground motions near a fault, of the permanent tectonic offset caused by the rupturing fault. It is expressed as a one-sided pulse in ground velocity and a nonzero final displacement at the end of shaking. Standard processing of earthquake time histories removes some of the fling effects that may be required for engineering applications. A method to parameterize the fling-step time history and superimpose it onto traditionally processed time histories was developed by Abrahamson (2002). In this paper, we first present an update to the Abrahamson (2002) fling-step models, in which the fling step is parameterized as a single cycle of a sine wave. Parametric models are presented for the sine-wave amplitude (Dsite) and period (Tf). The expressions for Dsite and Tf are derived from an extensive set of finite-fault simulations conducted on the Southern California Earthquake Center broadband platform (see Data and Resources). The simulations were run with the Graves and Pitarka (2010) hybrid simulation method and included strike-slip and reverse scenarios for magnitudes of 6.0-8.2 and dips of 30° through 90°. Next, an improved approach for developing design ground motions with fling effects is presented, which addresses the problem of double-counting intermediate-period components that were not removed by the standard ground-motion processing. Finally, the results are validated against a set of 84 empirical recordings containing fling.
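A minimal sketch of the single-sine-cycle parameterization described above follows, under the assumption that the sine cycle is applied in acceleration, which yields the one-sided velocity pulse and nonzero final displacement the abstract describes. The amplitude and period values here are illustrative; in the paper they would come from the parametric models for Dsite and Tf.

```python
import numpy as np

def fling_step(d_site, t_f, dt=0.01):
    """One full sine cycle in acceleration over duration t_f, scaled so the
    final displacement equals d_site. Assumed form, not the paper's code."""
    t = np.arange(0.0, t_f + dt, dt)
    amp = 2.0 * np.pi * d_site / t_f ** 2        # chosen so disp(t_f) = d_site
    w = 2.0 * np.pi / t_f
    acc = amp * np.sin(w * t)
    vel = (amp / w) * (1.0 - np.cos(w * t))      # one-sided velocity pulse
    disp = (amp / w) * (t - np.sin(w * t) / w)   # ramps up to d_site
    return t, acc, vel, disp

t, a, v, d = fling_step(d_site=0.5, t_f=2.0)     # 0.5 m offset over 2 s
# v is never negative and d[-1] is ~0.5, matching the fling signature; the
# displacement would then be superimposed on the processed record.
```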
A Versatile Multichannel Digital Signal Processing Module for Microcalorimeter Arrays
NASA Astrophysics Data System (ADS)
Tan, H.; Collins, J. W.; Walby, M.; Hennig, W.; Warburton, W. K.; Grudberg, P.
2012-06-01
Different techniques have been developed for reading out microcalorimeter sensor arrays: individual outputs for small arrays, and time-division, frequency-division, or code-division multiplexing for large arrays. Typically, raw waveform data are first read out from the arrays using one of these techniques and then stored on computer hard drives for offline optimum filtering, leading not only to requirements for large storage space but also to limitations on the achievable count rate. A read-out module capable of processing microcalorimeter signals in real time is therefore highly desirable. We have developed multichannel digital signal processing electronics that are capable of on-board, real-time processing of microcalorimeter sensor signals from multiplexed or individual pixel arrays. It is a 3U PXI module consisting of a standardized core processor board and a set of daughter boards. Each daughter board is designed to interface a specific type of microcalorimeter array to the core processor. The combination of the standardized core plus this set of easily designed and modified daughter boards results in a versatile data acquisition module that not only can easily expand to future detector systems but is also low cost. In this paper, we first present the core processor/daughter board architecture, and then report the performance of an 8-channel daughter board, which digitizes individual pixel outputs at 1 MSPS with 16-bit precision. We also introduce a time-division multiplexing daughter board, which takes in time-division multiplexed signals through fiber-optic cables and then processes the digital signals to generate energy spectra in real time.
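As background for what "offline optimum filtering" involves, here is a minimal sketch of the textbook frequency-domain optimal filter for pulse-height (energy) estimation, assuming a known pulse template and noise power spectrum; a real-time implementation in module firmware, as described above, would differ considerably.

```python
import numpy as np

def optimal_filter_energy(pulse, template, noise_psd):
    """Estimate pulse amplitude by weighting each frequency bin with the
    template conjugate over the noise PSD (whitened matched filter)."""
    P = np.fft.rfft(pulse)
    S = np.fft.rfft(template)
    w = np.conj(S) / noise_psd
    return float(np.real(np.sum(w * P)) / np.real(np.sum(w * S)))

# Illustrative check: a noisy, scaled template should return ~ the scale.
rng = np.random.default_rng(1)
n = 1024
t = np.arange(n)
template = np.exp(-t / 200.0) - np.exp(-t / 20.0)
noise_psd = np.ones(n // 2 + 1)                  # white-noise assumption
pulse = 3.0 * template + 0.01 * rng.standard_normal(n)
print(optimal_filter_energy(pulse, template, noise_psd))  # ~ 3.0
```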
Lessons Learned for Collaborative Clinical Content Development
Collins, S.A.; Bavuso, K.; Zuccotti, G.; Rocha, R.A.
2013-01-01
Background Site-specific content configuration of vendor-based Electronic Health Records (EHRs) is a vital step in the development of standardized and interoperable content that can be used for clinical decision support, reporting, care coordination, and information exchange. The multi-site, multi-stakeholder Acute Care Documentation (ACD) project at Partners Healthcare Systems (PHS) aimed to develop highly structured clinical content with adequate breadth and depth to meet the needs of all types of acute care clinicians at two academic medical centers. The Knowledge Management (KM) team at PHS led the informatics and knowledge management effort for the project. Objectives We aimed to evaluate the role, governance, and project management processes and resources for the KM team's effort as part of the standardized clinical content creation. Methods We employed the Centers for Disease Control and Prevention's six-step Program Evaluation Framework to guide our evaluation. We administered a forty-four-question, open-ended, semi-structured, voluntary survey to gather focused, credible evidence from members of the KM team. Qualitative open coding was performed to identify themes for lessons learned and concluding recommendations. Results Six surveys were completed. Qualitative data analysis informed five lessons learned and thirty specific recommendations associated with them. The five lessons learned are: 1) Assess and meet knowledge needs and set expectations at the start of the project; 2) Define an accountable decision-making process; 3) Increase team meeting moderation skills; 4) Ensure adequate resources and competency training with online asynchronous collaboration tools; 5) Develop focused, goal-oriented teams and supportive, consultative, service-based teams. Conclusions Knowledge management requirements for the development of standardized clinical content within a vendor-based EHR among multi-stakeholder teams and sites include: 1) assessing and meeting informatics knowledge needs, 2) setting expectations and standardizing the process for decision-making, and 3) ensuring the availability of adequate resources and competency training. PMID:23874366
Rabatin, Joseph S; Lipkin, Mack; Rubin, Alan S; Schachter, Allison; Nathan, Michael; Kalet, Adina
2004-05-01
We describe a specific mentoring approach in an academic general internal medicine setting, based on audiotapes and transcripts of all mentoring sessions held during one year. In advance, the mentor recorded his model. During the year, the mentee kept a process journal. Qualitative analysis revealed the development of an intimate relationship based on empathy, trust, and honesty. The mentor's model was explicitly intended to develop independence, initiative, improved thinking, skills, and self-reflection. The mentor's methods included extensive and varied use of questioning, active listening, standard setting, and frequent feedback. During the mentoring, the mentee evolved as a teacher, enhanced the creativity in his teaching, and matured as a person. Specific accomplishments included a national workshop on professional writing, an innovative approach to inpatient attending, a new teaching skills curriculum for a residency program, and this study. A mentoring model stressing safety, intimacy, honesty, setting of high standards, praxis, and detailed planning and feedback was associated with mentee excitement, personal and professional growth and development, concrete accomplishments, and a commitment to teaching.
Utility of an Algorithm to Increase the Accuracy of Medication History in an Obstetrical Setting.
Corbel, Aline; Baud, David; Chaouch, Aziz; Beney, Johnny; Csajka, Chantal; Panchaud, Alice
2016-01-01
In an obstetrical setting, inaccurate medication histories at hospital admission may result in failure to identify potentially harmful treatments for patients and/or their fetus(es). This prospective study was conducted to assess average concordance rates between (1) a medication list obtained with a one-page structured medication history algorithm developed for the obstetrical setting and (2) the medication list reported in medical records and obtained by open-ended questions based on standard procedures. Each list was converted into a concordance rate using a best possible medication history as the reference (information obtained from interviews with patients, prescribers, and community pharmacists). The algorithm-based method obtained a higher average concordance rate than the standard method: 90.2% [95% CI 85.8–94.3] versus 24.6% [95% CI 15.3–34.4] (p < 0.01). Our algorithm-based method strongly enhanced the accuracy of the medication history in our obstetric population without using substantial resources. Its implementation is an effective first step in the medication reconciliation process, which has been recognized as a very important component of patients' drug safety.
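A concordance rate of this kind reduces to the fraction of reference medications that a given history-taking method recovers. The sketch below uses naive name matching, an assumption on our part, since the study's exact matching rules are not given in the abstract:

```python
def concordance_rate(obtained, reference):
    """Percentage of reference medications captured by a history-taking
    method, comparing medications as normalized name strings."""
    ref = {m.strip().lower() for m in reference}
    got = {m.strip().lower() for m in obtained}
    return 100.0 * len(got & ref) / len(ref) if ref else 100.0

# Example: two of three reference medications were captured
print(concordance_rate(["Folic acid", "Levothyroxine"],
                       ["folic acid", "levothyroxine", "iron"]))  # ~66.7
```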
NASA Technical Reports Server (NTRS)
1972-01-01
The IDAPS (Image Data Processing System) is a user-oriented, computer-based language and control system which provides a framework or standard for implementing image data processing applications, simplifies the set-up of image processing runs so that the system may be used without a working knowledge of computer programming or operation, streamlines operation of the image processing facility, and allows multiple applications to be run in sequence without operator interaction. The control system loads the operators, interprets the input, constructs the necessary parameters for each application, and calls the application. The overlay feature of the IBSYS loader (IBLDR) provides the means of running multiple operators which would otherwise overflow core storage.
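The control pattern described (interpret run input, construct each application's parameters, dispatch applications in sequence without operator interaction) can be sketched in modern terms as follows. This is a hypothetical illustration with invented names, not the original IBSYS implementation:

```python
# Registry of image-processing applications the control system can dispatch.
APPLICATIONS = {}

def register(name):
    def wrap(func):
        APPLICATIONS[name] = func
        return func
    return wrap

@register("contrast_stretch")          # illustrative application only
def contrast_stretch(infile, outfile, low="2%", high="98%"):
    print(f"stretching {infile} -> {outfile} ({low}..{high})")

def run_sequence(run_cards):
    """Each 'card' is 'APPNAME key=value ...'; the controller parses it,
    builds the parameter set, and calls the application in order."""
    for card in run_cards:
        name, *pairs = card.split()
        params = dict(p.split("=", 1) for p in pairs)
        APPLICATIONS[name](**params)

run_sequence(["contrast_stretch infile=a.img outfile=b.img"])
```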
Sharma, H S S; Reinard, N
2004-12-01
Flax fiber must be mechanically prepared to improve the fineness and homogeneity of the sliver before chemical processing and wet-spinning. The changes in fiber characteristics are monitored by an airflow method, which is labor intensive and requires 90 minutes to process one sample. This investigation was carried out to develop robust visible and near-infrared calibrations that can be used as a rapid tool for quality assessment of input fibers and of changes in fineness at the doubling (blending), first, second, third, and fourth drawing frames, and at the roving stage. The partial least squares (PLS) and principal component regression (PCR) methods were employed to generate models from different segments of the spectra (400-1100, 1100-1700, 1100-2498, 1700-2498, and 400-2498 nm) and a calibration set consisting of 462 samples obtained from the six processing stages. The calibrations were successfully validated with an independent set of 97 samples, and standard errors of prediction of 2.32 and 2.62 dtex were achieved with the best PLS (400-2498 nm) and PCR (1100-2498 nm) models, respectively. An optimized PLS model of the visible-near-infrared (vis-NIR) spectra explained 97% of the variation (R² = 0.97) in the sample set, with a standard error of calibration (SEC) of 2.45 dtex and a standard error of cross-validation (SECV) of 2.51 dtex (R² = 0.96). The mean error of the reference airflow method was 1.56 dtex, which is more accurate than the NIR calibration. The improvement in fiber fineness of the validation set obtained from the six production lines was predicted with an error range of -6.47 to +7.19 dtex for input fibers, -1.44 to +5.77 dtex for blended fibers at the doubling, and -4.72 to +3.59 dtex at the drawing frame stages. This level of precision is adequate for wet-spinners to monitor the fineness of input fibers and of fibers during preparation. The advantage of vis-NIR spectroscopy is its potential capability to assess fineness and other important quality characteristics of a fiber sample simultaneously in less than 30 minutes; the disadvantages are the expensive instrumentation and the operating expertise required compared with the reference method. These factors need to be considered by the industry before installing an off-line NIR system for predicting quality parameters of input materials and changes in fiber characteristics during mechanical processing.
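A PLS calibration evaluated by the standard error of prediction on an independent validation set can be reproduced in outline with a standard library. The sketch below substitutes synthetic spectra for the study's 462 calibration and 97 validation samples, so the printed SEP only demonstrates the workflow, not the reported accuracy:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic stand-ins: rows are samples, columns are wavelengths.
rng = np.random.default_rng(0)
X_cal = rng.normal(size=(462, 700))   # calibration spectra
y_cal = rng.normal(25.0, 5.0, 462)    # reference fineness, dtex
X_val = rng.normal(size=(97, 700))    # independent validation spectra
y_val = rng.normal(25.0, 5.0, 97)

pls = PLSRegression(n_components=10).fit(X_cal, y_cal)
resid = y_val - pls.predict(X_val).ravel()
sep = np.sqrt(np.mean(resid**2))      # standard error of prediction
print(f"SEP = {sep:.2f} dtex")
```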
Developing accreditation for community based surgery: the Irish experience.
Ní Riain, Ailís; Collins, Claire; O'Sullivan, Tony
2018-02-05
Purpose Carrying out minor surgery procedures in the primary care setting is popular with patients, cost-effective, and delivers at least as good outcomes as those performed in the hospital setting. This paper aims to describe the central role of clinical leadership in developing an accreditation system for general practitioners (GPs) undertaking community-based surgery in the Irish national setting, where no mandatory accreditation process currently exists. Design/methodology/approach In all, 24 GPs were recruited to the GP network. Ten pilot standards were developed, addressing GPs' experience and training, clinical activity, and practice supporting infrastructure, and were tested using information and document review, prospective collection of clinical data, and a practice inspection visit. Two additional components were incorporated into the project (a patient satisfaction survey and self-audit). A multi-modal evaluation was undertaken. A majority of GPs were included at all stages of the project, in line with the principles of action learning. The steering group had a majority of GPs with relevant expertise and representation of all other actors in the minor surgery arena. The GP research network contributed to each stage of the project. The project lead was a GP with minor surgery experience. Quantitative data were analysed using Predictive Analytics SoftWare (PASW). Krueger's framework analysis approach was used to analyse the qualitative data. Findings A total of 9 GPs achieved all standards at initial review, 14 successfully completed corrective actions, and 1 GP did not achieve the required standard. The standards were then amended to reflect the findings, and a supporting framework was developed. Originality/value The flexibility of the action-learning approach and the clinical leadership design allowed for the development of robust quality standards in a short timeframe.
Multivariate regression model for predicting lumber grade volumes of northern red oak sawlogs
Daniel A. Yaussy; Robert L. Brisbin
1983-01-01
A multivariate regression model was developed to predict green board-foot yields for the seven common factory lumber grades processed from northern red oak (Quercus rubra L.) factory grade logs. The model uses the standard log measurements of grade, scaling diameter, length, and percent defect. It was validated with an independent data set. The model...
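The model's general form, a single multivariate regression mapping the four standard log measurements to seven grade-yield responses at once, can be sketched as follows with synthetic data; predictor ranges and values are illustrative, not the 1983 data set:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = np.column_stack([
    rng.integers(1, 4, 200),        # log grade (1-3)
    rng.uniform(10.0, 30.0, 200),   # scaling diameter, inches
    rng.uniform(8.0, 16.0, 200),    # length, feet
    rng.uniform(0.0, 30.0, 200),    # percent defect
])
Y = rng.uniform(0.0, 100.0, (200, 7))  # board-foot yields, 7 lumber grades

# One multivariate fit predicts all seven grade yields jointly.
model = LinearRegression().fit(X, Y)
print(model.predict([[2, 18.5, 12, 10]]).round(1))  # yields for one log
```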
ERIC Educational Resources Information Center
Penton, John
Designed to provide information about reading in New Zealand, this report offers an overview of theory and practice in that area. Among the topics discussed are: the current concern about reading standards; developmental reading; effective methods of reading instruction; research into the nature of the reading process; preparation of teachers of…
Federal Register (FR) notice and fact sheet concerning an Advance Notice of Proposed Rulemaking that provides information on the process for setting an international CO2 emissions standard for aircraft at the International Civil Aviation Organization (ICAO)