Scanning for safety: an integrated approach to improved bar-code medication administration.
Early, Cynde; Riha, Chris; Martin, Jennifer; Lowdon, Karen W; Harvey, Ellen M
2011-03-01
This is a review of lessons learned in the postimplementation evaluation of a bar-code medication administration technology implemented at a major tertiary-care hospital in 2001. In 2006, with a bar-code medication administration scan compliance rate of 82%, a near-miss sentinel event prompted review of this technology as part of an institutional recommitment to a "culture of safety." Multifaceted problems with bar-code medication administration created an environment in which safeguards were circumvented, as demonstrated by an increase in manual overrides to ensure timely medication administration. A multiprofessional team composed of nursing, pharmacy, human resources, quality, and technical services was formed. Each step in the bar-code medication administration process was reviewed. Technology, process, and educational solutions were identified and implemented systematically. Overall compliance with bar-code medication administration rose from 82% to 97%, which resulted in a calculated cost avoidance of more than $2.8 million over the time frame of the project.
Zhong, Qiu-Yue; Karlson, Elizabeth W; Gelaye, Bizu; Finan, Sean; Avillach, Paul; Smoller, Jordan W; Cai, Tianxi; Williams, Michelle A
2018-05-29
We examined the comparative performance of structured, diagnostic codes vs. natural language processing (NLP) of unstructured text for screening suicidal behavior among pregnant women in electronic medical records (EMRs). Women aged 10-64 years with at least one diagnostic code related to pregnancy or delivery (N = 275,843) from Partners HealthCare were included as our "datamart." Diagnostic codes related to suicidal behavior were applied to the datamart to screen women for suicidal behavior. Among women without any diagnostic codes related to suicidal behavior (n = 273,410), 5880 women were randomly sampled, of whom 1120 had at least one mention of terms related to suicidal behavior in clinical notes. NLP was then used to process clinical notes for the 1120 women. Chart reviews were performed for subsamples of women. Using diagnostic codes, 196 pregnant women were screened positive for suicidal behavior, among whom 149 (76%) had confirmed suicidal behavior by chart review. Using NLP among those without diagnostic codes, 486 pregnant women were screened positive for suicidal behavior, among whom 146 (30%) had confirmed suicidal behavior by chart review. The use of NLP substantially improves the sensitivity of screening suicidal behavior in EMRs. However, the prevalence of confirmed suicidal behavior was lower among women who did not have diagnostic codes for suicidal behavior but screened positive by NLP. NLP should be used together with diagnostic codes for future EMR-based phenotyping studies for suicidal behavior.
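As an illustration of the trade-off this abstract reports, here is a minimal sketch of the screening arithmetic using the counts given above; the helper name `ppv` is ours, not the authors'.

```python
# Minimal sketch (not from the paper): positive predictive value of each
# screening route, using the counts reported in the abstract.

def ppv(confirmed: int, screened_positive: int) -> float:
    """Fraction of screen-positives confirmed by chart review."""
    return confirmed / screened_positive

# Diagnostic codes: 196 screened positive, 149 confirmed by chart review.
print(f"codes: PPV = {ppv(149, 196):.0%}")  # -> 76%
# NLP among code-negative women: 486 screened positive, 146 confirmed.
print(f"NLP:   PPV = {ppv(146, 486):.0%}")  # -> 30%
# NLP recovers 146 confirmed cases the codes missed, at the cost of a lower
# PPV -- hence the recommendation to combine both methods.
```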
Technology Infusion of CodeSonar into the Space Network Ground Segment
NASA Technical Reports Server (NTRS)
Benson, Markland J.
2009-01-01
This slide presentation reviews the applicability of CodeSonar to the Space Network software. CodeSonar is a commercial off-the-shelf system that analyzes programs written in C, C++, or Ada for defects in the code. Software engineers use CodeSonar results as an input to the existing source code inspection process. The study focuses on large-scale software developed using formal processes. The systems studied are mission critical in nature, but some use commodity computer systems.
A review of predictive coding algorithms.
Spratling, M W
2017-03-01
Predictive coding is a leading theory of how the brain performs probabilistic inference. However, there are a number of distinct algorithms which are described by the term "predictive coding". This article provides a concise review of these different predictive coding algorithms, highlighting their similarities and differences. Five algorithms are covered: linear predictive coding which has a long and influential history in the signal processing literature; the first neuroscience-related application of predictive coding to explaining the function of the retina; and three versions of predictive coding that have been proposed to model cortical function. While all these algorithms aim to fit a generative model to sensory data, they differ in the type of generative model they employ, in the process used to optimise the fit between the model and sensory data, and in the way that they are related to neurobiology. Copyright © 2016 Elsevier Inc. All rights reserved.
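To make the first of these algorithms concrete, here is a toy linear predictive coding sketch (our own illustration, not from the review): it fits prediction coefficients to a synthetic autoregressive signal by least squares, leaving a low-variance residual to encode.

```python
import numpy as np

# Toy linear predictive coding (LPC): predict each sample as a linear
# combination of the previous p samples, then code only the residual.
rng = np.random.default_rng(0)
n, p = 1000, 2
x = np.zeros(n)
e = 0.1 * rng.standard_normal(n)
for t in range(2, n):                      # synthetic AR(2) source signal
    x[t] = 1.3 * x[t-1] - 0.4 * x[t-2] + e[t]

# Least-squares fit of the predictor coefficients from lagged samples.
X = np.column_stack([x[p-1-k : n-1-k] for k in range(p)])
y = x[p:]
a, *_ = np.linalg.lstsq(X, y, rcond=None)

residual = y - X @ a
print("coefficients:", a.round(2))          # close to [1.3, -0.4]
print("residual var / signal var:",
      round(residual.var() / x.var(), 4))   # residual is far cheaper to code
```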
Support for Systematic Code Reviews with the SCRUB Tool
NASA Technical Reports Server (NTRS)
Holzmann, Gerald J.
2010-01-01
SCRUB is a code review tool that supports both large, team-based software development efforts (e.g., for mission software) as well as individual tasks. The tool was developed at JPL to support a new, streamlined code review process that combines human-generated review reports with program-generated review reports from a customizable range of state-of-the-art source code analyzers. The leading commercial tools include Codesonar, Coverity, and Klocwork, each of which can achieve a reasonably low rate of false-positives in the warnings that they generate. The time required to analyze code with these tools can vary greatly. In each case, however, the tools produce results that would be difficult to realize with human code inspections alone. There is little overlap in the results produced by the different analyzers, and each analyzer used generally increases the effectiveness of the overall effort. The SCRUB tool allows all reports to be accessed through a single, uniform interface (see figure) that facilitates browsing code and reports. Improvements over existing software include significant simplification and leveraging of a range of commercial, static source code analyzers in a single, uniform framework. The tool runs as a small stand-alone application, avoiding the security problems related to tools based on Web browsers. A developer or reviewer, for instance, must have already obtained access rights to a code base before that code can be browsed and reviewed with the SCRUB tool. The tool cannot open any files or folders to which the user does not already have access. This means that the tool does not need to enforce or administer any additional security policies. The analysis results presented through the SCRUB tool's user interface are always computed off-line, given that, especially for larger projects, this computation can take longer than appropriate for interactive tool use. The recommended code review process that is supported by the SCRUB tool consists of three phases: Code Review, Developer Response, and Closeout Resolution. In the Code Review phase, all tool-based analysis reports are generated, and specific comments from expert code reviewers are entered into the SCRUB tool. In the second phase, Developer Response, the developer is asked to respond to each comment and tool report that was produced, either agreeing or disagreeing to provide a fix that addresses the issue that was raised. In the third phase, Closeout Resolution, all disagreements are discussed in a meeting of all parties involved, and a resolution is made for all disagreements. The first two phases generally take one week each, and the third phase is concluded in a single closeout meeting.
Garvin, Jennifer Hornung; Redd, Andrew; Bolton, Dan; Graham, Pauline; Roche, Dominic; Groeneveld, Peter; Leecaster, Molly; Shen, Shuying; Weiner, Mark G.
2013-01-01
Introduction International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes capture comorbidities that can be used to risk adjust nonrandom patient groups. We explored the accuracy of capturing comorbidities associated with one risk adjustment method, the Elixhauser Comorbidity Measure (ECM), in patients with chronic heart failure (CHF) at one Veterans Affairs (VA) medical center. We explored potential reasons for the differences found between the original codes assigned and conditions found through retrospective review. Methods This descriptive, retrospective study used a cohort of patients discharged with a principal diagnosis coded as CHF from one VA medical center in 2003. One admission per patient was used in the study; with multiple admissions, only the first admission was analyzed. We compared the assignment of original codes assigned to conditions found in a retrospective, manual review of the medical record conducted by an investigator with coding expertise as well as by physicians. Members of the team experienced with assigning ICD-9-CM codes and VA coding processes developed themes related to systemic reasons why chronic conditions were not coded in VA records using applied thematic techniques. Results In the 181-patient cohort, 388 comorbid conditions were identified; 305 of these were chronic conditions, originally coded at the time of discharge with an average of 1.7 comorbidities related to the ECM per patient. The review by an investigator with coding expertise revealed a total of 937 comorbidities resulting in 618 chronic comorbid conditions with an average of 3.4 per patient; physician review found 872 total comorbidities with 562 chronic conditions (average 3.1 per patient). The agreement between the original and the retrospective coding review was 88 percent. The kappa statistic for the original and the retrospective coding review was 0.375 with a 95 percent confidence interval (CI) of 0.352 to 0.398. The kappa statistic for the retrospective coding review and physician review was 0.849 (CI, 0.823–0.875). The kappa statistic for the original coding and the physician review was 0.340 (CI, 0.316–0.364). Several systemic factors were identified, including familiarity with inpatient VA and non-VA guidelines, the quality of documentation, and operational requirements to complete the coding process within short time frames and to identify the reasons for movement within a given facility. Conclusion Comorbidities within the ECM representing chronic conditions were significantly underrepresented in the original code assignment. Contributing factors potentially include prioritization of codes related to acute conditions over chronic conditions; coders’ professional training, educational level, and experience; and the limited number of codes allowed in initial coding software. This study highlights the need to evaluate systemic causes of underrepresentation of chronic conditions to improve the accuracy of risk adjustment used for health services research, resource allocation, and performance measurement. PMID:24159270
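For readers unfamiliar with the agreement statistic reported above, here is a minimal sketch of Cohen's kappa for two paired ratings; the ten-chart example is toy data and does not reproduce the study's records.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two paired label sequences (any hashable labels)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    expected = sum(count_a[l] * count_b[l]
                   for l in set(count_a) | set(count_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Toy example: presence (1) / absence (0) of a comorbidity in ten charts,
# as judged by the original coding and by a retrospective review.
original      = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
retrospective = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(original, retrospective), 3))  # 0.31, fair agreement
```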
The barriers to clinical coding in general practice: a literature review.
de Lusignan, S
2005-06-01
Clinical coding is variable in UK general practice. The reasons for this remain undefined. This review explains why there are no readily available alternatives to recording structured clinical data and reviews the barriers to recording structured clinical data. Methods used included a literature review of bibliographic databases, university health informatics departments, and national and international medical informatics associations. The results show that the current state of development of computers and data processing means there is no practical alternative to coding data. The identified barriers to clinical coding are: the limitations of the coding systems and terminologies and the skill gap in their use; recording structured data in the consultation takes time and is distracting; the level of motivation of primary care professionals; and the priority within the organization. A taxonomy is proposed to describe the barriers to clinical coding. This can be used to identify barriers to coding and facilitate the development of strategies to overcome them.
Dual Coding, Reasoning and Fallacies.
ERIC Educational Resources Information Center
Hample, Dale
1982-01-01
Develops the theory that a fallacy is not a comparison of a rhetorical text to a set of definitions but a comparison of one person's cognition with another's. Reviews Paivio's dual coding theory, relates nonverbal coding to reasoning processes, and generates a limited fallacy theory based on dual coding theory. (PD)
The histone codes for meiosis.
Wang, Lina; Xu, Zhiliang; Khawar, Muhammad Babar; Liu, Chao; Li, Wei
2017-09-01
Meiosis is a specialized process that produces haploid gametes from diploid cells by a single round of DNA replication followed by two successive cell divisions. It involves many special events, such as programmed DNA double-strand break (DSB) formation, homologous recombination, and crossover formation and resolution. These events are associated with dynamically regulated chromosomal structures; the dynamic transcriptional regulation and chromatin remodeling are mainly modulated by histone modifications, termed 'histone codes'. The purpose of this review is to summarize the histone codes that are required for meiosis during spermatogenesis and oogenesis, involving meiosis resumption, meiotic asymmetric division and other cellular processes. We not only systematically review the functional roles of histone codes in meiosis but also discuss future trends and perspectives in this field. © 2017 Society for Reproduction and Fertility.
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2014 CFR
2014-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2013 CFR
2013-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2010 CFR
2010-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2011 CFR
2011-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2012 CFR
2012-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
Clinical code set engineering for reusing EHR data for research: A review.
Williams, Richard; Kontopantelis, Evangelos; Buchan, Iain; Peek, Niels
2017-06-01
The construction of reliable, reusable clinical code sets is essential when re-using Electronic Health Record (EHR) data for research. Yet code set definitions are rarely transparent and their sharing is almost non-existent. There is a lack of methodological standards for the management (construction, sharing, revision and reuse) of clinical code sets which needs to be addressed to ensure the reliability and credibility of studies which use code sets. To review methodological literature on the management of sets of clinical codes used in research on clinical databases and to provide a list of best practice recommendations for future studies and software tools. We performed an exhaustive search for methodological papers about clinical code set engineering for re-using EHR data in research. This was supplemented with papers identified by snowball sampling. In addition, a list of e-phenotyping systems was constructed by merging references from several systematic reviews on this topic, and the processes adopted by those systems for code set management was reviewed. Thirty methodological papers were reviewed. Common approaches included: creating an initial list of synonyms for the condition of interest (n=20); making use of the hierarchical nature of coding terminologies during searching (n=23); reviewing sets with clinician input (n=20); and reusing and updating an existing code set (n=20). Several open source software tools (n=3) were discovered. There is a need for software tools that enable users to easily and quickly create, revise, extend, review and share code sets and we provide a list of recommendations for their design and implementation. Research re-using EHR data could be improved through the further development, more widespread use and routine reporting of the methods by which clinical codes were selected. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
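A toy sketch of the two most common practices the review counts, seeding a code set from synonyms and expanding it through the terminology hierarchy; the miniature code table and helper names are invented for illustration and belong to no real ICD release.

```python
# Toy code set construction: seed with synonyms for the condition of
# interest, then expand through the terminology's hierarchy by code prefix.
# The four-row "terminology" below is invented for illustration only.
TERMINOLOGY = {
    "I50":   "Heart failure",
    "I50.1": "Left ventricular failure",
    "I50.9": "Heart failure, unspecified",
    "I11.0": "Hypertensive heart disease with heart failure",
}

SYNONYMS = ["heart failure", "ventricular failure"]

def build_code_set(terminology, synonyms, seed_prefixes=("I50",)):
    by_synonym = {c for c, desc in terminology.items()
                  if any(s in desc.lower() for s in synonyms)}
    by_hierarchy = {c for c in terminology
                    if c.startswith(tuple(seed_prefixes))}
    return sorted(by_synonym | by_hierarchy)

code_set = build_code_set(TERMINOLOGY, SYNONYMS)
print(code_set)   # ['I11.0', 'I50', 'I50.1', 'I50.9'], ready for clinician review
```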
Mapping Saldana's Coding Methods onto the Literature Review Process
ERIC Educational Resources Information Center
Onwuegbuzie, Anthony J.; Frels, Rebecca K.; Hwang, Eunjin
2016-01-01
Onwuegbuzie and Frels (2014) provided a step-by-step guide illustrating how discourse analysis can be used to analyze literature. However, more works of this type are needed to address the way that counselor researchers conduct literature reviews. Therefore, we present a typology for coding and analyzing information extracted for literature…
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2013 CFR
2013-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2012 CFR
2012-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2014 CFR
2014-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2011 CFR
2011-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2010 CFR
2010-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
Processes of code status transitions in hospitalized patients with advanced cancer.
El-Jawahri, Areej; Lau-Min, Kelsey; Nipp, Ryan D; Greer, Joseph A; Traeger, Lara N; Moran, Samantha M; D'Arpino, Sara M; Hochberg, Ephraim P; Jackson, Vicki A; Cashavelly, Barbara J; Martinson, Holly S; Ryan, David P; Temel, Jennifer S
2017-12-15
Although hospitalized patients with advanced cancer have a low chance of surviving cardiopulmonary resuscitation (CPR), the processes by which they change their code status from full code to do not resuscitate (DNR) are unknown. We conducted a mixed-methods study on a prospective cohort of hospitalized patients with advanced cancer. Two physicians used a consensus-driven medical record review to characterize processes that led to code status order transitions from full code to DNR. In total, 1047 hospitalizations were reviewed among 728 patients. Admitting clinicians did not address code status in 53% of hospitalizations, resulting in code status orders of "presumed full." In total, 275 patients (26.3%) transitioned from full code to DNR, and 48.7% (134 of 275 patients) of those had an order of "presumed full" at admission; however, upon further clarification, the patients expressed that they had wished to be DNR before the hospitalization. We identified 3 additional processes leading to order transition from full code to DNR: acute clinical deterioration (15.3%), discontinuation of cancer-directed therapy (17.1%), and education about the potential harms/futility of CPR (15.3%). Compared with discontinuing therapy and education, transitions because of acute clinical deterioration were associated with less patient involvement (P = .002), a shorter time to death (P < .001), and a greater likelihood of inpatient death (P = .005). One-half of code status order changes among hospitalized patients with advanced cancer were because of full code orders in patients who had a preference for DNR before hospitalization. Transitions due to acute clinical deterioration were associated with less patient engagement and a higher likelihood of inpatient death. Cancer 2017;123:4895-902. © 2017 American Cancer Society.
Measuring diagnoses: ICD code accuracy.
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-10-01
To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.
ERIC Educational Resources Information Center
Wang, Yanqing; Li, Hang; Feng, Yuqiang; Jiang, Yu; Liu, Ying
2012-01-01
The traditional assessment approach, in which one single written examination counts toward a student's total score, no longer meets new demands of programming language education. Based on a peer code review process model, we developed an online assessment system called "EduPCR" and used a novel approach to assess the learning of computer…
Standardized Radiation Shield Design Methods: 2005 HZETRN
NASA Technical Reports Server (NTRS)
Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.
2006-01-01
Research conducted by the Langley Research Center through 1995 resulted in the HZETRN code, which provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code to be available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.
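As a deliberately crude illustration of the kind of design-trade output a transport code feeds (dose behind shielding of a given areal density), here is a single-exponential toy model; HZETRN itself solves the Boltzmann transport equation with nuclear fragmentation, and every number below is an invented assumption, not HZETRN output.

```python
import numpy as np

# Toy shield trade study: dose behind a slab, modeled as a single
# exponential in areal density. This placeholder only shows the shape of
# the design question; the dose and attenuation length are invented.
dose_free_space = 50.0   # cSv/yr, assumed unshielded annual dose
atten_length = 25.0      # g/cm^2, assumed effective attenuation length

for x in np.linspace(0.0, 40.0, 5):           # shield areal density, g/cm^2
    dose = dose_free_space * np.exp(-x / atten_length)
    print(f"{x:5.1f} g/cm^2 -> {dose:5.1f} cSv/yr")
```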
A systematic literature review of automated clinical coding and classification systems.
Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R
2010-01-01
Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.
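A minimal sketch of the per-document evaluation such studies typically report, comparing system-assigned codes against a human-coded gold standard; the toy codes and helper names are ours.

```python
def precision_recall_f1(predicted: set, gold: set):
    """Set-based evaluation of the codes assigned to one document."""
    tp = len(predicted & gold)                    # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy example: codes from an automated system vs. codes from human coders.
predicted = {"I50.9", "E11.9", "N18.3"}
gold = {"I50.9", "N18.3", "I10"}
print(precision_recall_f1(predicted, gold))   # (0.667, 0.667, 0.667)
```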
Swinburn, Boyd; Vandevijvere, Stefanie; Woodward, Alistair; Hornblow, Andrew; Richardson, Ann; Burlingame, Barbara; Borman, Barry; Taylor, Barry; Breier, Bernhard; Arroll, Bruce; Drummond, Bernadette; Grant, Cameron; Bullen, Chris; Wall, Clare; Mhurchu, Cliona Ni; Cameron-Smith, David; Menkes, David; Murdoch, David; Mangin, Dee; Lennon, Diana; Sarfati, Diana; Sellman, Doug; Rush, Elaine; Sopoaga, Faafetai; Thomson, George; Devlin, Gerry; Abel, Gillian; White, Harvey; Coad, Jane; Hoek, Janet; Connor, Jennie; Krebs, Jeremy; Douwes, Jeroen; Mann, Jim; McCall, John; Broughton, John; Potter, John D; Toop, Les; McCowan, Lesley; Signal, Louise; Beckert, Lutz; Elwood, Mark; Kruger, Marlena; Farella, Mauro; Baker, Michael; Keall, Michael; Skeaff, Murray; Thomson, Murray; Wilson, Nick; Chandler, Nicholas; Reid, Papaarangi; Priest, Patricia; Brunton, Paul; Crampton, Peter; Davis, Peter; Gendall, Philip; Howden-Chapman, Philippa; Taylor, Rachael; Edwards, Richard; Beaglehole, Robert; Doughty, Robert; Scragg, Robert; Gauld, Robin; McGee, Robert; Jackson, Rod; Hughes, Roger; Mulder, Roger; Bonita, Ruth; Kruger, Rozanne; Casswell, Sally; Derrett, Sarah; Ameratunga, Shanthi; Denny, Simon; Hales, Simon; Pullon, Sue; Wells, Susan; Cundy, Tim; Blakely, Tony
2017-02-17
Reducing the exposure of children and young people to the marketing of unhealthy foods is a core strategy for reducing the high overweight and obesity prevalence in this population. The Advertising Standards Authority (ASA) has recently reviewed its self-regulatory codes and proposed a revised single code on advertising to children. This article evaluates the proposed code against eight criteria for an effective code, which were included in a submission to the ASA review process from over 70 New Zealand health professors. The evaluation found that the proposed code largely represents no change or uncertain change from the existing codes, and cannot be expected to provide substantial protection for children and young people from the marketing of unhealthy foods. Government regulations will be needed to achieve this important outcome.
The Development of a Discipline Code for Sue Bennett College.
ERIC Educational Resources Information Center
McLendon, Sandra F.
A Student Discipline Code (SDC) was developed to govern student life at Sue Bennett College (SBC), Kentucky, a private two-year college affiliated with the Methodist Church. Steps taken in the process included the following: a review of relevant literature on student discipline; examination of discipline codes from six other educational…
A Theoretical Analysis of Learning with Graphics--Implications for Computer Graphics Design.
ERIC Educational Resources Information Center
ChanLin, Lih-Juan
This paper reviews the literature pertinent to learning with graphics. Dual coding theory provides an explanation of how graphics are stored and processed in semantic memory. The level of processing theory suggests how graphics can be employed in learning to encourage deeper processing. In addition to dual coding theory and level of processing…
Dual Coding Theory, Word Abstractness, and Emotion: A Critical Review of Kousta et al. (2011)
ERIC Educational Resources Information Center
Paivio, Allan
2013-01-01
Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyack, B.E.; Dhir, V.K.; Gieseke, J.A.
1992-03-01
MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. The newest version of MELCOR is Version 1.8.1, July 1991. MELCOR development has reached the point that the United States Nuclear Regulatory Commission sponsored a broad technical review by recognized experts to determine or confirm the technical adequacy of the code for the serious and complex analyses it is expected to perform. For this purpose, an eight-member MELCOR Peer Review Committee was organized. The Committee has completed its review of the MELCOR code: the review process and findings of the MELCOR Peer Review Committee are documented in this report. The Committee has determined that recommendations in five areas are appropriate: (1) MELCOR numerics, (2) models missing from MELCOR Version 1.8.1, (3) existing MELCOR models needing revision, (4) the need for expanded MELCOR assessment, and (5) documentation.
How to review 4 million lines of ATLAS code
NASA Astrophysics Data System (ADS)
Stewart, Graeme A.; Lampl, Walter;
2017-10-01
As the ATLAS Experiment prepares to move to a multi-threaded framework (AthenaMT) for Run 3, we are faced with the problem of how to migrate 4 million lines of C++ source code. This code has been written over the past 15 years and has often been adapted, re-written or extended to meet the changing requirements and circumstances of LHC data taking. The code was developed by different authors, many of whom are no longer active, and under the deep assumption that processing ATLAS data would be done in a serial fashion. In order to understand the scale of the problem faced by the ATLAS software community, and to plan appropriately for the significant efforts posed by the new AthenaMT framework, ATLAS embarked on a wide-ranging review of our offline code, covering all areas of activity: event generation, simulation, trigger, reconstruction. We discuss the difficulties in even logistically organising such reviews in an already busy community, how to examine areas in sufficient depth to identify key areas in need of upgrade, and yet also finish the reviews in a timely fashion. We show how the reviews were organised and how the outputs were captured in a way that the sub-system communities could then tackle the problems uncovered on a realistic timeline. Further, we discuss how the review has influenced the overall planning for the Run 3 ATLAS offline code.
Auditory spatial processing in the human cortex.
Salminen, Nelli H; Tiitinen, Hannu; May, Patrick J C
2012-12-01
The auditory system codes spatial locations in a way that deviates from the spatial representations found in other modalities. This difference is especially striking in the cortex, where neurons form topographical maps of visual and tactile space but where auditory space is represented through a population rate code. In this hemifield code, sound source location is represented in the activity of two widely tuned opponent populations, one tuned to the right and the other to the left side of auditory space. Scientists are only beginning to uncover how this coding strategy adapts to various spatial processing demands. This review presents the current understanding of auditory spatial processing in the cortex. To this end, the authors consider how various implementations of the hemifield code may exist within the auditory cortex and how these may be modulated by the stimulation and task context. As a result, a coherent set of neural strategies for auditory spatial processing emerges.
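A toy numerical sketch of the hemifield code described above: two broadly tuned opponent populations, one preferring each side of space, with source laterality read out from their rate difference. The sigmoidal tuning and its parameters are our assumptions for illustration.

```python
import numpy as np

# Toy hemifield code: two opponent populations broadly tuned to azimuth,
# one preferring rightward and one leftward locations. Sigmoid tuning and
# the 20-degree slope constant are illustrative assumptions.
def rate_right(azimuth_deg):
    return 1.0 / (1.0 + np.exp(-azimuth_deg / 20.0))

def rate_left(azimuth_deg):
    return rate_right(-azimuth_deg)   # mirror-image tuning

for az in (-60, -20, 0, 20, 60):
    diff = rate_right(az) - rate_left(az)
    # The signed rate difference varies monotonically with azimuth, so a
    # downstream decoder can recover source location from it.
    print(f"azimuth {az:+4d} deg: rate difference {diff:+.2f}")
```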
NASA Principal Center for Review of Clean Air Act Regulations
NASA Technical Reports Server (NTRS)
Clark-Ingram, Marceia; Munafo, Paul M. (Technical Monitor)
2002-01-01
The Clean Air Act (CAA) regulations have greatly impacted materials and processes utilized in the manufacture of aerospace hardware. Code JE, NASA's Environmental Management Division at NASA Headquarters, recognized the need for a formal, Agency-wide review process for CAA regulations. Marshall Space Flight Center (MSFC) was selected as the 'Principal Center for Review of Clean Air Act Regulations'. This presentation describes the centralized support provided by MSFC for the management and leadership of NASA's CAA regulation review process.
Siemann, Julia; Petermann, Franz
2018-01-01
This review reconciles past findings on numerical processing with key assumptions of the most prominent model of arithmetic in the literature, the Triple Code Model (TCM). This is accomplished by reporting diverse findings in the literature, ranging from behavioral studies on basic arithmetic operations, through neuroimaging studies on numerical processing, to developmental studies concerned with arithmetic acquisition, with a special focus on developmental dyscalculia (DD). We evaluate whether these studies corroborate the model and discuss possible reasons for contradictory findings. A separate section is dedicated to the transfer of TCM to arithmetic development and to alternative accounts focusing on developmental questions of numerical processing. We conclude with recommendations for future directions of arithmetic research, raising questions that require answers in models of healthy as well as abnormal mathematical development. This review assesses the leading model in the field of arithmetic processing (the Triple Code Model) by presenting knowledge from interdisciplinary research. It examines the contradictory findings that have been reported and integrates the resulting opposing viewpoints. The focus is on the development of arithmetic expertise as well as abnormal mathematical development. The original aspect of this article is that it points to a gap in research on these topics and provides possible solutions for future models. Copyright © 2017 Elsevier Ltd. All rights reserved.
Long Non-Coding RNAs Regulating Immunity in Insects
Satyavathi, Valluri; Ghosh, Rupam; Subramanian, Srividya
2017-01-01
Recent advances in modern technology have led to the understanding that not all genetic information is coded into protein and that the genomes of every organism, including insects, produce non-coding RNAs that can control different biological processes. Among the RNAs identified in the last decade, long non-coding RNAs (lncRNAs) represent a hidden layer of internal signals that can regulate gene expression in physiological, pathological, and immunological processes. Evidence shows the importance of lncRNAs in the regulation of host–pathogen interactions. In this review, an attempt has been made to view the role of lncRNAs in regulating immune responses in insects. PMID:29657286
Team interaction during surgery: a systematic review of communication coding schemes.
Tiferes, Judith; Bisantz, Ann M; Guru, Khurshid A
2015-05-15
Communication problems have been systematically linked to human errors in surgery and a deep understanding of the underlying processes is essential. Although a number of tools exist to assess nontechnical skills, methods to study communication and other team-related processes are far from being standardized, making comparisons challenging. We conducted a systematic review to analyze methods used to study events in the operating room (OR) and to develop a synthesized coding scheme for OR team communication. Six electronic databases were accessed to search for articles that collected individual events during surgery and included detailed coding schemes. Additional articles were added based on cross-referencing. That collection was then classified based on type of events collected, environment type (real or simulated), number of procedures, type of surgical task, team characteristics, method of data collection, and coding scheme characteristics. All dimensions within each coding scheme were grouped based on emergent content similarity. Categories drawn from articles, which focused on communication events, were further analyzed and synthesized into one common coding scheme. A total of 34 of 949 articles met the inclusion criteria. The methodological characteristics and coding dimensions of the articles were summarized. A priori coding was used in nine studies. The synthesized coding scheme for OR communication included six dimensions as follows: information flow, period, statement type, topic, communication breakdown, and effects of communication breakdown. The coding scheme provides a standardized coding method for OR communication, which can be used to develop a priori codes for future studies especially in comparative effectiveness research. Copyright © 2015 Elsevier Inc. All rights reserved.
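As an illustration of how the synthesized scheme might be operationalized for annotation, here is a small data structure covering the six dimensions named above; the field types and example values are our assumptions, not the paper's definitions.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class StatementType(Enum):
    # Example values only; the paper defines the full category set.
    QUESTION = "question"
    COMMAND = "command"
    RESPONSE = "response"

@dataclass
class CommunicationEvent:
    """One coded OR communication event, per the six synthesized dimensions."""
    information_flow: str                 # e.g. "surgeon -> scrub nurse"
    period: str                           # e.g. "intraoperative"
    statement_type: StatementType
    topic: str                            # e.g. "instrument request"
    breakdown: bool                       # communication breakdown observed?
    breakdown_effect: Optional[str]       # consequence, if any

event = CommunicationEvent("surgeon -> scrub nurse", "intraoperative",
                           StatementType.COMMAND, "instrument request",
                           False, None)
print(event)
```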
77 FR 58978 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-25
...and investigations, committee meetings, agency decisions and rulings, delegations of authority..., telephone, fax, email, customer code, agency code, purchase order number, credit card number/exp. date and... submitting it to APFO. Information collected is used to process fiscal obligations, communicate with the...
Preparing Protocols for Institutional Review Boards.
ERIC Educational Resources Information Center
Lyons, Charles M.
1983-01-01
Introduces the process by which Institutional Review Boards (IRBs) review proposals for research involving human subjects. Describes the composition of IRBs. Presents the Nuremberg code, the elements of informed consent, the judging criteria for proposals, and a sample protocol format. References newly published regulations governing research with…
The National Transport Code Collaboration Module Library
NASA Astrophysics Data System (ADS)
Kritz, A. H.; Bateman, G.; Kinsey, J.; Pankin, A.; Onjun, T.; Redd, A.; McCune, D.; Ludescher, C.; Pletzer, A.; Andre, R.; Zakharov, L.; Lodestro, L.; Pearlstein, L. D.; Jong, R.; Houlberg, W.; Strand, P.; Wiley, J.; Valanju, P.; John, H. St.; Waltz, R.; Mandrekas, J.; Mau, T. K.; Carlsson, J.; Braams, B.
2004-12-01
This paper reports on the progress in developing a library of code modules under the auspices of the National Transport Code Collaboration (NTCC). Code modules are high-quality, fully documented software packages with a clearly defined interface. The modules provide a variety of functions, such as implementing numerical physics models; performing ancillary functions such as I/O or graphics; or providing tools for dealing with common issues in scientific programming, such as portability of Fortran codes. Researchers in the plasma community submit code modules, and a review procedure is followed to ensure adherence to programming and documentation standards. The review process is designed to provide added confidence with regard to the use of the modules and to allow users and independent reviewers to validate the claims of the modules' authors. All modules include source code; clear instructions for compilation of binaries on a variety of target architectures; and test cases with well-documented input and output. All the NTCC modules and ancillary information, such as current standards and documentation, are available from the NTCC Module Library Website http://w3.pppl.gov/NTCC. The goal of the project is to develop a resource of value to builders of integrated modeling codes and to plasma physics researchers generally. Currently, there are more than 40 modules in the module library.
A code inspection process for security reviews
NASA Astrophysics Data System (ADS)
Garzoglio, Gabriele
2010-04-01
In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.
Alarcon, Gene M; Gamble, Rose F; Ryan, Tyler J; Walter, Charles; Jessup, Sarah A; Wood, David W; Capiola, August
2018-07-01
Computer programs are a ubiquitous part of modern society, yet little is known about the psychological processes that underlie reviewing code. We applied the heuristic-systematic model (HSM) to investigate the influence of computer code comments on perceptions of code trustworthiness. The study explored the influence of validity, placement, and style of comments in code on trustworthiness perceptions and time spent on code. Results indicated valid comments led to higher trust assessments and more time spent on the code. Properly placed comments led to lower trust assessments and had a marginal effect on time spent on code; however, the effect was no longer significant after controlling for effects of the source code. Low style comments led to marginally higher trustworthiness assessments, but high style comments led to longer time spent on the code. Several interactions were also found. Our findings suggest the relationship between code comments and perceptions of code trustworthiness is not as straightforward as previously thought. Additionally, the current paper extends the HSM to the programming literature. Copyright © 2018 Elsevier Ltd. All rights reserved.
Recent Developments in the Application of Biologically Inspired Computation to Chemical Sensing
NASA Astrophysics Data System (ADS)
Marco, S.; Gutierrez-Gálvez, A.
2009-05-01
Biological olfaction outperforms chemical instrumentation in specificity, response time, detection limit, coding capacity, time stability, robustness, size, power consumption, and portability. This biological function provides outstanding performance due, to a large extent, to the unique architecture of the olfactory pathway, which combines a high degree of redundancy, an efficient combinatorial coding along with unmatched chemical information processing mechanisms. The last decade has witnessed important advances in the understanding of the computational primitives underlying the functioning of the olfactory system. In this work, the state of the art concerning biologically inspired computation for chemical sensing will be reviewed. Instead of reviewing the whole body of computational neuroscience of olfaction, we restrict this review to the application of models to the processing of real chemical sensor data.
NASA Technical Reports Server (NTRS)
Caruso, Salvadore V.; Clark-Ingram, Marceia A.
2000-01-01
This paper presents a memorandum of agreement on Clean Air Regulations. NASA Headquarters (Code JE and Code M) has asked MSFC to serve as principal center for review of Clean Air Act (CAA) regulations. The purpose of the principal center is to provide centralized support to NASA Headquarters for the management and leadership of NASA's CAA regulation review process and to identify the potential impact of proposed CAA regulations on NASA program hardware and supporting facilities. The materials and processes utilized in the manufacture of NASA's programmatic hardware contain HAPs (Hazardous Air Pollutants), VOCs (Volatile Organic Compounds), and ODCs (Ozone Depleting Chemicals). This paper is presented in viewgraph form.
Oladinrin, Olugbenga Timo; Ho, Christabel Man-Fong
2016-08-01
Several researchers have identified codes of ethics (CoEs) as tools that stimulate positive ethical behavior by shaping the organisational decision-making process, but few have considered the information needed for code implementation. Beyond being a legal and moral responsibility, ethical behavior needs to become an organisational priority, which requires an alignment process that integrates employee behavior with the organisation's ethical standards. This paper discusses processes for the responsible implementation of CoEs based on an extensive review of the literature. The internationally recognized European Foundation for Quality Management Excellence Model (EFQM model) is proposed as a suitable framework for assessing an organisation's ethical performance, including CoE embeddedness. The findings presented herein have both practical and research implications. They will encourage construction practitioners to shift their attention from ethical policies to possible enablers of CoE implementation and serve as a foundation for further research on ethical performance evaluation using the EFQM model. This is the first paper to discuss the model's use in the context of ethics in construction practice.
Coveney, John; Herbert, Danielle L; Hill, Kathy; Mow, Karen E; Graves, Nicholas; Barnett, Adrian
2017-01-01
In Australia, the peer review process for competitive funding is usually conducted by a peer review group in conjunction with prior assessment from external assessors. This process is quite mysterious to those outside it. The purpose of this research was to throw light on grant review panels (sometimes called the 'black box') through an examination of the impact of panel procedures, panel composition and panel dynamics on the decision-making in the grant review process. A further purpose was to compare experience of a simplified review process with more conventional processes used in assessing grant proposals in Australia. This project was one aspect of a larger study into the costs and benefits of a simplified peer review process. The Queensland University of Technology (QUT)-simplified process was compared with the National Health and Medical Research Council's (NHMRC) more complex process. Grant review panellists involved in both processes were interviewed about their experience of the decision-making process that assesses the excellence of an application. All interviews were recorded and transcribed. Each transcription was de-identified and returned to the respondent for review. Final transcripts were read repeatedly and coded, and similar codes were amalgamated into categories that were used to build themes. Final themes were shared with the research team for feedback. Two major themes arose from the research: (1) assessing grant proposals and (2) factors influencing the fairness, integrity and objectivity of review. Issues such as the quality of writing in a grant proposal, comparison of the two review methods, the purpose and use of the rebuttal, assessing the financial value of funded projects, the importance of the experience of the panel membership and the role of track record and the impact of group dynamics on the review process were all discussed. The research also examined the influence of research culture on decision-making in grant review panels. One of the aims of this study was to compare a simplified review process with more conventional processes. Generally, participants were supportive of the simplified process. Transparency in the grant review process will result in better appreciation of the outcome. Despite the provision of clear guidelines for peer review, reviewing processes are likely to be subjective to the extent that different reviewers apply different rules. The peer review process will come under more scrutiny as funding for research becomes even more competitive. There is justification for further research on the process, especially of a kind that taps more deeply into the 'black box' of peer review.
Phonological coding during reading.
Leinenger, Mallorie
2014-11-01
The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
Dual coding theory, word abstractness, and emotion: a critical review of Kousta et al. (2011).
Paivio, Allan
2013-02-01
Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of abstract concepts. Their dual coding critique is, however, based on impoverished and, in some respects, incorrect interpretations of the theory and its implications. This response corrects those gaps and misinterpretations and summarizes research findings that show predicted variations in the effects of dual coding variables in different tasks and contexts. Especially emphasized is an empirically supported dual coding theory of emotion that goes beyond the Kousta et al. emphasis on emotion in abstract semantics. (c) 2013 APA, all rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ditmars, J.D.; Walbridge, E.W.; Rote, D.M.
1983-10-01
Repository performance assessment is analysis that identifies events and processes that might affect a repository system for isolation of radioactive waste, examines their effects on barriers to waste migration, and estimates the probabilities of their occurrence and their consequences. In 1983 Battelle Memorial Institute's Office of Nuclear Waste Isolation (ONWI) prepared two plans - one for performance assessment for a waste repository in salt and one for verification and validation of performance assessment technology. At the request of the US Department of Energy's Salt Repository Project Office (SRPO), Argonne National Laboratory reviewed those plans and prepared this report to advise SRPO of specific areas where ONWI's plans for performance assessment might be improved. This report presents a framework for repository performance assessment that clearly identifies the relationships among the disposal problems, the processes underlying the problems, the tools for assessment (computer codes), and the data. In particular, the relationships among important processes and 26 model codes available to ONWI are indicated. A common suggestion for computer code verification and validation is the need for specific and unambiguous documentation of the results of performance assessment activities. A major portion of this report consists of status summaries of 27 model codes indicated as potentially useful by ONWI. The code summaries focus on three main areas: (1) the code's purpose, capabilities, and limitations; (2) status of the elements of documentation and review essential for code verification and validation; and (3) proposed application of the code for performance assessment of salt repository systems. 15 references, 6 figures, 4 tables.
An overview of data acquisition, signal coding and data analysis techniques for MST radars
NASA Technical Reports Server (NTRS)
Rastogi, P. K.
1986-01-01
An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, which are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing are discussed, including signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques.
Jones, B E; South, B R; Shao, Y; Lu, C C; Leng, J; Sauer, B C; Gundlapalli, A V; Samore, M H; Zeng, Q
2018-01-01
Identifying pneumonia using diagnosis codes alone may be insufficient for research on clinical decision making. Natural language processing (NLP) may enable the inclusion of cases missed by diagnosis codes. This article (1) develops an NLP tool that identifies the clinical assertion of pneumonia from physician emergency department (ED) notes, and (2) compares classification methods using diagnosis codes versus NLP against a gold standard of manual chart review to identify patients initially treated for pneumonia. Among a national population of ED visits occurring between 2006 and 2012 across the Veterans Affairs health system, we extracted 811 physician documents containing search terms for pneumonia for training, and 100 random documents for validation. Two reviewers annotated span- and document-level classifications of the clinical assertion of pneumonia. An NLP tool using a support vector machine was trained on the enriched documents. We extracted diagnosis codes assigned in the ED and upon hospital discharge and calculated performance characteristics for diagnosis codes, NLP, and NLP plus diagnosis codes against manual review in training and validation sets. Among the training documents, 51% contained clinical assertions of pneumonia; in the validation set, 9% were classified with pneumonia, of which 100% contained pneumonia search terms. After enriching with search terms, the NLP system alone demonstrated a recall/sensitivity of 0.72 (training) and 0.55 (validation), and a precision/positive predictive value (PPV) of 0.89 (training) and 0.71 (validation). ED-assigned diagnostic codes demonstrated lower recall/sensitivity (0.48 and 0.44) but higher precision/PPV (0.95 in training, 1.0 in validation); the NLP system identified more "possible-treated" cases than diagnostic coding. An approach combining NLP and ED-assigned diagnostic coding classification achieved the best performance (sensitivity 0.89 and PPV 0.80). System-wide application of NLP to clinical text can increase capture of initial diagnostic hypotheses, an important inclusion when studying diagnosis and clinical decision-making under uncertainty. Schattauer GmbH Stuttgart.
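A minimal sketch of the combined approach is given below, assuming scikit-learn and a simple bag-of-words SVM; the note texts, labels, and the ICD-9 screen set are invented placeholders, not the study's data or its actual feature design.

# Sketch: SVM note classifier OR'd with ED diagnosis codes, loosely following
# the combined approach described above. All data and code sets are
# illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

PNEUMONIA_ICD9 = {"480", "481", "482", "483", "485", "486"}  # hypothetical screen set

notes = ["cxr shows rll infiltrate, will treat for pneumonia",
         "no acute cardiopulmonary process, likely viral uri"]
labels = [1, 0]  # 1 = clinical assertion of pneumonia, from manual annotation

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(notes, labels)

def classify_visit(note_text, ed_codes):
    """Flag the visit if either the NLP model or an ED diagnosis code fires."""
    nlp_hit = clf.predict([note_text])[0] == 1
    code_hit = any(code.split(".")[0] in PNEUMONIA_ICD9 for code in ed_codes)
    return nlp_hit or code_hit

The OR-combination mirrors the finding above that NLP plus ED-assigned codes captured more initially treated cases than either source alone.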
Crosstalk between the Notch signaling pathway and non-coding RNAs in gastrointestinal cancers
Pan, Yangyang; Mao, Yuyan; Jin, Rong; Jiang, Lei
2018-01-01
The Notch signaling pathway is one of the main signaling pathways that mediates direct contact between cells, and is essential for normal development. It regulates various cellular processes, including cell proliferation, apoptosis, migration, invasion, angiogenesis and metastasis. It additionally serves an important function in tumor progression. Non-coding RNAs mainly include small microRNAs, long non-coding RNAs and circular RNAs. At present, a large body of literature supports the biological significance of non-coding RNAs in tumor progression. It is also becoming increasingly evident that cross-talk exists between Notch signaling and non-coding RNAs. The present review summarizes the current knowledge of Notch-mediated gastrointestinal cancer cell processes, and the effect of the crosstalk between the three major types of non-coding RNAs and the Notch signaling pathway on the fate of gastrointestinal cancer cells. PMID:29285185
Budisan, Liviuta; Gulei, Diana; Zanoaga, Oana Mihaela; Irimie, Alexandra Iulia; Chira, Sergiu; Braicu, Cornelia; Gherman, Claudia Diana; Berindan-Neagoe, Ioana
2017-01-01
Phytochemicals are natural compounds synthesized as secondary metabolites in plants, representing an important source of molecules with a wide range of therapeutic applications. These natural agents are important regulators of key pathological processes/conditions, including cancer, as they are able to modulate the expression of coding and non-coding transcripts with an oncogenic or tumour suppressor role. These natural agents are currently exploited for the development of therapeutic strategies alone or in tandem with conventional treatments for cancer. The aim of this paper is to review the recent studies regarding the role of these natural phytochemicals in different processes related to cancer inhibition, including apoptosis activation, angiogenesis and metastasis suppression. From the large palette of phytochemicals we selected epigallocatechin gallate (EGCG), caffeic acid phenethyl ester (CAPE), genistein, morin and kaempferol, due to their increased activity in modulating multiple coding and non-coding genes, targeting the main hallmarks of cancer. PMID:28587155
SU-E-T-103: Development and Implementation of Web Based Quality Control Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Studinski, R; Taylor, R; Angers, C
Purpose: Historically many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both these approaches represent significant logistical challenges, and are not predisposed to data review and approval. It has been our group's aim to develop and implement web based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and CyberKnife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.
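Since the abstract names its stack (Python and Django), a rough sketch of what such a system's data layer might look like follows; these models are illustrative assumptions only, not QATrack+'s actual schema, which is available in its public repository.

# Illustrative Django models for centrally storing and reviewing QC results.
# Field and class names are hypothetical.
from django.db import models

class TestDefinition(models.Model):
    name = models.CharField(max_length=255)
    unit = models.CharField(max_length=64)            # e.g. a specific linac
    tolerance = models.FloatField(null=True, blank=True)

class TestResult(models.Model):
    test = models.ForeignKey(TestDefinition, on_delete=models.CASCADE)
    value = models.FloatField()
    performed_on = models.DateTimeField()
    reviewed = models.BooleanField(default=False)     # formalized approval step
    reviewed_by = models.CharField(max_length=128, blank=True)

    def within_tolerance(self, reference):
        """Simple process-control check of a result against a reference value."""
        if self.test.tolerance is None:
            return True
        return abs(self.value - reference) <= self.test.tolerance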
Designing and maintaining an effective chargemaster.
Abbey, D C
2001-03-01
The chargemaster is the central repository of charges and associated coding information used to develop claims. But this simple description belies the chargemaster's true complexity. The chargemaster's role in the coding process differs from department to department, and not all codes provided on a claim form are necessarily included in the chargemaster, as codes for complex services may need to be developed and reviewed by coding staff. In addition, with the rise of managed care, the chargemaster increasingly is being used to track utilization of supplies and services. To ensure that the chargemaster performs all of its functions effectively, hospitals should appoint a chargemaster coordinator, supported by a chargemaster review team, to oversee the design and maintenance of the chargemaster. Important design issues that should be considered include the principle of "form follows function," static versus dynamic coding, how modifiers should be treated, how charges should be developed, how to incorporate physician fee schedules into the chargemaster, the interface between the chargemaster and cost reports, and how to include statistical information for tracking utilization.
Chan, Vincy; Thurairajah, Pravheen; Colantonio, Angela
2013-11-13
Although healthcare administrative data are commonly used for traumatic brain injury research, there is currently no consensus or consistency on using the International Classification of Diseases version 10 codes to define traumatic brain injury among children and youth. This protocol is for a systematic review of the literature to explore the range of International Classification of Diseases version 10 codes that are used to define traumatic brain injury in this population. The databases MEDLINE, MEDLINE In-Process, Embase, PsychINFO, CINAHL, SPORTDiscus, and Cochrane Database of Systematic Reviews will be systematically searched. Grey literature will be searched using Grey Matters and Google. Reference lists of included articles will also be searched. Articles will be screened using predefined inclusion and exclusion criteria and all full-text articles that meet the predefined inclusion criteria will be included for analysis. The study selection process and reasons for exclusion at the full-text level will be presented using a PRISMA study flow diagram. Information on the data source of included studies, year and location of study, age of study population, range of incidence, and study purpose will be abstracted into a separate table and synthesized for analysis. All International Classification of Diseases version 10 codes will be listed in tables and the codes that are used to define concussion, acquired traumatic brain injury, head injury, or head trauma will be identified. The identification of the optimal International Classification of Diseases version 10 codes to define this population in administrative data is crucial, as it has implications for policy, resource allocation, planning of healthcare services, and prevention strategies. It also allows for comparisons across countries and studies. This protocol is for a review that identifies the range and most common diagnoses used to conduct surveillance for traumatic brain injury in children and youth. This is an important first step in reaching an appropriate definition using International Classification of Diseases version 10 codes and can inform future work on reaching consensus on the codes to define traumatic brain injury for this vulnerable population.
NASA Astrophysics Data System (ADS)
Qiu, Kun; Zhang, Chongfu; Ling, Yun; Wang, Yibo
2007-11-01
This paper proposes, for the first time to the best of our knowledge, an all-optical label processing scheme using multiple optical orthogonal codes sequences (MOOCS) for optical packet switching (OPS) (MOOCS-OPS) networks. In this scheme, the multiple optical orthogonal codes (MOOC) from multiple-groups optical orthogonal codes (MGOOC) are permuted and combined to obtain the MOOCS for the optical labels, which effectively enlarge the capacity of available optical codes for optical labels. Optical label processing (OLP) schemes are reviewed and analyzed, the principles of MOOCS-based optical labels for OPS networks are given and analyzed, and the MOOCS-OPS topology and the key realization units of the MOOCS-based optical label packets are then studied in detail. The performance of this novel all-optical label processing technology is analyzed and the corresponding simulation is performed. These analyses and results show that the proposed scheme can overcome the shortage of available optical orthogonal code (OOC)-based optical labels caused by the limited number of single OOCs with short code lengths, and indicate that the MOOCS-OPS scheme is feasible.
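For orientation, the correlation constraints that define an optical orthogonal code are sketched below as they are standardly given in the OOC literature; the notation is generic and not taken from this paper.

% An (n, w, \lambda_a, \lambda_c) optical orthogonal code is a family
% \mathcal{C} of \{0,1\} sequences of length n and Hamming weight w with
% bounded auto- and cross-correlations (indices taken modulo n):
\begin{align}
  \sum_{t=0}^{n-1} x_t\, x_{t+\tau} &\le \lambda_a
    \quad \forall x \in \mathcal{C},\ \tau \not\equiv 0 \pmod{n}, \\
  \sum_{t=0}^{n-1} x_t\, y_{t+\tau} &\le \lambda_c
    \quad \forall x \ne y \in \mathcal{C},\ \forall \tau.
\end{align}
% Small \lambda values keep label collisions rare but sharply limit the number
% of codewords, which is the scarcity the MOOCS construction works around.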
An evaluation of computer assisted clinical classification algorithms.
Chute, C G; Yang, Y; Buntrock, J
1994-01-01
The Mayo Clinic has a long tradition of indexing patient records in high resolution and volume. Several algorithms have been developed which promise to help human coders in the classification process. We evaluate variations on code browsers and free-text indexing systems with respect to their speed and error rates in our production environment. The more sophisticated indexing systems save measurable time in the coding process, but suffer from incompleteness which requires a back-up system or human verification. Expert Network does the best job of rank-ordering clinical text, potentially enabling the creation of thresholds for the pass-through of computer-coded data without human review.
The role of CFD in the design process
NASA Astrophysics Data System (ADS)
Jennions, Ian K.
1994-05-01
Over the last decade the role played by CFD codes in turbomachinery design has changed remarkably. While convergence/stability or even the existence of unique solutions was discussed fervently ten years ago, CFD codes now form a valuable part of an overall integrated design system and have caused us to re-think much of what we do. The geometric and physical complexities addressed have also evolved, as have the number of software houses competing with in-house developers to provide solutions to daily design problems. This paper reviews how GE Aircraft Engines (GEAE) uses CFD in the turbomachinery design process and examines many of the issues faced in successful code implementation.
An assessment of space shuttle flight software development processes
NASA Technical Reports Server (NTRS)
1993-01-01
In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.
Decoding the function of nuclear long non-coding RNAs.
Chen, Ling-Ling; Carmichael, Gordon G
2010-06-01
Long non-coding RNAs (lncRNAs) are mRNA-like, non-protein-coding RNAs that are pervasively transcribed throughout eukaryotic genomes. Rather than silently accumulating in the nucleus, many of these are now known or suspected to play important roles in nuclear architecture or in the regulation of gene expression. In this review, we highlight some recent progress in how lncRNAs regulate these important nuclear processes at the molecular level. Copyright 2010 Elsevier Ltd. All rights reserved.
The role of verbalization in the Rorschach response process: a review.
Gold, J M
1987-01-01
Traditional Rorschach theory has consistently overlooked the linguistic aspect of the response process. Most conceptualizations focus on the perceptual and cognitive aspects of the process, never examining the subject's need to find a linguistic representation for the inner, perceptual process. This review examines the traditional formulations and suggests that new research from information-processing, neuropsychological, and dual-coding memory theory paradigms offer new possibilities for Rorschach research that would incorporate an appreciation of the unique cognitive demand of the test--the linking of percept with language.
Jones, Natalie; Schneider, Gary; Kachroo, Sumesh; Rotella, Philip; Avetisyan, Ruzan; Reynolds, Matthew W
2012-01-01
The Food and Drug Administration's (FDA) Mini-Sentinel pilot program initially aims to conduct active surveillance to refine safety signals that emerge for marketed medical products. A key facet of this surveillance is to develop and understand the validity of algorithms for identifying health outcomes of interest (HOIs) from administrative and claims data. This paper summarizes the process and findings of the algorithm review of acute respiratory failure (ARF). PubMed and Iowa Drug Information Service searches were conducted to identify citations applicable to the ARF HOI. Level 1 abstract reviews and Level 2 full-text reviews were conducted to find articles using administrative and claims data to identify ARF, including validation estimates of the coding algorithms. Our search revealed a deficiency of literature focusing on ARF algorithms and validation estimates. Only two studies provided codes for ARF, each using related yet different ICD-9 codes (i.e., ICD-9 codes 518.8, "other diseases of lung," and 518.81, "acute respiratory failure"). Neither study provided validation estimates. Research needs to be conducted on designing validation studies to test ARF algorithms and estimating their predictive power, sensitivity, and specificity. Copyright © 2012 John Wiley & Sons, Ltd.
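In practice, a claims screen built from the two codes above reduces to a set-membership test over each encounter's diagnosis list. The sketch below is purely illustrative (the record layout is assumed), and, as the review stresses, such a screen has unknown sensitivity and predictive value until validated against chart review.

# Sketch: screening claims for ARF with the two ICD-9 codes found in the
# review (518.8 "other diseases of lung", 518.81 "acute respiratory failure").
# Record layout is hypothetical.
ARF_CODES = {"518.8", "518.81"}

def flag_arf(claim):
    """True if any diagnosis code on the claim matches the ARF code set."""
    return any(code in ARF_CODES for code in claim["dx_codes"])

claims = [
    {"id": 1, "dx_codes": ["518.81", "428.0"]},
    {"id": 2, "dx_codes": ["486"]},
]
flagged = [c["id"] for c in claims if flag_arf(c)]  # -> [1]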
A Spanish version for the new ERA-EDTA coding system for primary renal disease.
Zurriaga, Óscar; López-Briones, Carmen; Martín Escobar, Eduardo; Saracho-Rotaeche, Ramón; Moina Eguren, Íñigo; Pallardó Mateu, Luis; Abad Díez, José María; Sánchez Miret, José Ignacio
2015-01-01
The European Renal Association and the European Dialysis and Transplant Association (ERA-EDTA) have issued an English-language new coding system for primary kidney disease (PKD) aimed at solving the problems that were identified in the list of "Primary renal diagnoses" that has been in use for over 40 years. In the context of the Registro Español de Enfermos Renales (Spanish Registry of Renal Patients [REER]), the need for a translation and adaptation of terms, definitions and notes for the new ERA-EDTA codes was perceived in order to help those who have Spanish as their working language when using such codes. Bilingual nephrologists contributed a professional translation and were involved in a terminological adaptation process, which included a number of phases to contrast translation outputs. Codes, paragraphs, definitions and diagnostic criteria were reviewed, and the agreements and disagreements that arose for each term were labelled. Finally, the version accepted by a majority of reviewers was adopted. Wide agreement was reached in the first review phase, with only 5 points of discrepancy remaining, which were resolved in the final phase. Translation and adaptation into Spanish represent an improvement that will help to introduce and use the new coding system for PKD, as it can help reduce the time devoted to coding and the period of adaptation of health workers to the new codes. Copyright © 2015 The Authors. Published by Elsevier España, S.L.U. All rights reserved.
Lisman, John E; Jensen, Ole
2013-03-20
Theta and gamma frequency oscillations occur in the same brain regions and interact with each other, a process called cross-frequency coupling. Here, we review evidence for the following hypothesis: that the dual oscillations form a code for representing multiple items in an ordered way. This form of coding has been most clearly demonstrated in the hippocampus, where different spatial information is represented in different gamma subcycles of a theta cycle. Other experiments have tested the functional importance of oscillations and their coupling. These involve correlation of oscillatory properties with memory states, correlation with memory performance, and effects of disrupting oscillations on memory. Recent work suggests that this coding scheme coordinates communication between brain regions and is involved in sensory as well as memory processes. Copyright © 2013 Elsevier Inc. All rights reserved.
Management Sciences Division Annual Report (9th)
1992-01-01
Contents include: Actuarial Process Consolidation and Review; Malfunction Code Reduction; Sun Work Stations. ... Information System (WSMIS). Dyna-METRIC is used for wartime supply support capability assessments. The Aircraft Sustainability Model (ASM) is the ...
Depathologising gender diversity in childhood in the process of ICD revision and reform.
Suess Schwend, Amets; Winter, Sam; Chiam, Zhan; Smiley, Adam; Cabral Grinspan, Mauro
2018-01-24
From 2007 on, the World Health Organisation (WHO) has been revising its diagnostic manual, the International Statistical Classification of Diseases and Related Health Problems (ICD), with approval of ICD-11 due in 2018. The ICD revision has prompted debates on diagnostic classifications related to gender diversity and gender development processes, and specifically on the 'Gender incongruence of childhood' (GIC) code. These debates have taken place at a time an emergent trans depathologisation movement is becoming increasingly international, and regional and international human rights bodies are recognising gender identity as a source of discrimination. With reference to the history of diagnostic classification of gender diversity in childhood, this paper conducts a literature review of academic, activist and institutional documents related to the current discussion on the merits of retaining or abandoning the GIC code. Within this broader discussion, the paper reviews in more detail recent publications arguing for the abandonment of this diagnostic code drawing upon clinical, bioethical and human rights perspectives. The review indicates that gender diverse children engaged in exploring their gender identity and expression do not benefit from diagnosis. Instead they benefit from support from their families, their schools and from society more broadly.
Chan, Vincy; Thurairajah, Pravheen; Colantonio, Angela
2015-02-04
Although healthcare administrative data are commonly used for traumatic brain injury (TBI) research, there is currently no consensus or consistency on the International Classification of Diseases Version 10 (ICD-10) codes used to define TBI among children and youth internationally. This study systematically reviewed the literature to explore the range of ICD-10 codes that are used to define TBI in this population. The identification of the range of ICD-10 codes to define this population in administrative data is crucial, as it has implications for policy, resource allocation, planning of healthcare services, and prevention strategies. The databases MEDLINE, MEDLINE In-Process, Embase, PsychINFO, CINAHL, SPORTDiscus, and Cochrane Database of Systematic Reviews were systematically searched. Grey literature was searched using Grey Matters and Google. Reference lists of included articles were also searched for relevant studies. Two reviewers independently screened all titles and abstracts using pre-defined inclusion and exclusion criteria. A full text screen was conducted on articles that met the first screen inclusion criteria. All full text articles that met the pre-defined inclusion criteria were included for analysis in this systematic review. A total of 1,326 publications were identified through the predetermined search strategy and 32 articles/reports met all eligibility criteria for inclusion in this review. Five articles specifically examined children and youth aged 19 years or under with TBI. ICD-10 case definitions ranged from the broad injuries to the head codes (ICD-10 S00 to S09) to concussion only (S06.0). There was overwhelming consensus on the inclusion of ICD-10 code S06, intracranial injury, while codes S00 (superficial injury of the head), S03 (dislocation, sprain, and strain of joints and ligaments of head), and S05 (injury of eye and orbit) were only used by articles that examined head injury, none of which specifically examined children and youth. This review provides evidence for discussion on how best to use ICD codes for different goals. This is an important first step in reaching an appropriate definition and can inform future work on reaching consensus on the ICD-10 codes to define TBI for this vulnerable population.
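A surveillance query can be parameterized by the ICD-10 blocks a given case definition includes; the sketch below contrasts the broad S00 to S09 definition with a concussion-only definition (the record layout and helper names are hypothetical).

# Sketch: TBI case definitions as ICD-10 code-prefix sets.
BROAD_TBI = tuple(f"S0{i}" for i in range(10))  # "S00" ... "S09"
NARROW_TBI = ("S06.0",)                          # concussion only

def matches(dx_codes, prefixes):
    """True if any diagnosis code starts with one of the definition's prefixes."""
    return any(code.startswith(prefixes) for code in dx_codes)

record = {"age": 12, "dx_codes": ["S06.5", "S01.0"]}
matches(record["dx_codes"], BROAD_TBI)   # True: S06.5 is an intracranial injury
matches(record["dx_codes"], NARROW_TBI)  # False: not a concussion code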
Sada, Yvonne; Hou, Jason; Richardson, Peter; El-Serag, Hashem; Davila, Jessica
2016-02-01
Accurate identification of hepatocellular cancer (HCC) cases from automated data is needed for efficient and valid quality improvement initiatives and research. We validated HCC International Classification of Diseases, 9th Revision (ICD-9) codes, and evaluated whether natural language processing by the Automated Retrieval Console (ARC) for document classification improves HCC identification. We identified a cohort of patients with ICD-9 codes for HCC during 2005-2010 from Veterans Affairs administrative data. Pathology and radiology reports were reviewed to confirm HCC. The positive predictive value (PPV), sensitivity, and specificity of ICD-9 codes were calculated. A split validation study of pathology and radiology reports was performed to develop and validate ARC algorithms. Reports were manually classified as diagnostic of HCC or not. ARC generated document classification algorithms using the Clinical Text Analysis and Knowledge Extraction System. ARC performance was compared with manual classification. PPV, sensitivity, and specificity of ARC were calculated. A total of 1138 patients with HCC were identified by ICD-9 codes. On the basis of manual review, 773 had HCC. The HCC ICD-9 code algorithm had a PPV of 0.67, sensitivity of 0.95, and specificity of 0.93. For a random subset of 619 patients, we identified 471 pathology reports for 323 patients and 943 radiology reports for 557 patients. The pathology ARC algorithm had PPV of 0.96, sensitivity of 0.96, and specificity of 0.97. The radiology ARC algorithm had PPV of 0.75, sensitivity of 0.94, and specificity of 0.68. A combined approach of ICD-9 codes and natural language processing of pathology and radiology reports improves HCC case identification in automated data.
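The performance figures reported above all derive from the same confusion-matrix arithmetic; the small helpers below make the definitions explicit (the example counts are placeholders, not the study's data).

# PPV, sensitivity, and specificity from confusion-matrix counts, as used to
# evaluate the ICD-9 and ARC classifiers above. Example counts are placeholders.
def ppv(tp, fp):
    return tp / (tp + fp)

def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

print(ppv(90, 10), sensitivity(90, 5), specificity(80, 10))
# 0.9 0.947... 0.888...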
NASA Technical Reports Server (NTRS)
Lacey, J. C., Jr.; Mullins, D. W., Jr.
1983-01-01
A survey is presented of the literature on the experimental evidence for the genetic code assignments and the chemical reactions involved in the process of protein synthesis. In view of the enormous number of theoretical models that have been advanced to explain the origin of the genetic code, attention is confined to experimental studies. Since genetic coding has significance only within the context of protein synthesis, it is believed that the problem of the origin of the code must be dealt with in terms of the origin of the process of protein synthesis. It is contended that the answers must lie in the nature of the molecules, amino acids and nucleotides, the affinities they might have for one another, and the effect that those affinities must have on the chemical reactions that are related to primitive protein synthesis. The survey establishes that for the bulk of amino acids, there is a direct and significant correlation between the hydrophobicity rank of the amino acids and the hydrophobicity rank of their anticodonic dinucleotides.
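The abstract does not name the statistic behind its rank-correlation claim; for orientation, rank data of this kind are conventionally compared with Spearman's coefficient, reproduced here in generic notation.

% Spearman's rank correlation between two rankings of n items, where d_i is
% the difference between the two ranks assigned to item i:
\begin{equation}
  \rho = 1 - \frac{6 \sum_{i=1}^{n} d_i^{2}}{n\,(n^{2}-1)}.
\end{equation}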
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sartori, E.; Roussin, R.W.
This paper presents a brief review of computer codes concerned with checking, plotting, processing and using covariances of neutron cross-section data. It concentrates on those available from the computer code information centers of the United States and the OECD/Nuclear Energy Agency. Emphasis is also placed on codes using covariances for specific applications such as uncertainty analysis, data adjustment and data consistency analysis. Recent evaluations contain neutron cross section covariance information for all isotopes of major importance for technological applications of nuclear energy. It is therefore important that the available software tools needed for taking advantage of this information are widely known, as they permit the determination of better safety margins and allow the optimization of more economical designs of nuclear energy systems.
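For readers new to covariance applications, the uncertainty analysis mentioned above typically rests on the first-order "sandwich" propagation rule, stated here in generic notation rather than that of any particular code.

% First-order propagation of a cross-section covariance matrix C to the
% variance of a derived response R, with sensitivity vector S:
\begin{equation}
  \operatorname{var}(R) = S^{\mathsf{T}} C\, S,
  \qquad
  S_i = \frac{\partial R}{\partial x_i},
\end{equation}
% where the x_i are the cross-section parameters. This rule underlies the
% uncertainty analysis, data adjustment and consistency checks performed by
% the covariance-processing codes reviewed above.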
Review of the 9th NLTE code comparison workshop
NASA Astrophysics Data System (ADS)
Piron, R.; Gilleron, F.; Aglitskiy, Y.; Chung, H.-K.; Fontes, C. J.; Hansen, S. B.; Marchuk, O.; Scott, H. A.; Stambulchik, E.; Ralchenko, Yu.
2017-06-01
We review the 9th NLTE code comparison workshop, which was held in the Jussieu campus, Paris, from November 30th to December 4th, 2015. This time, the workshop was mainly focused on a systematic investigation of iron NLTE steady-state kinetics and emissivity, over a broad range of temperature and density. Through these comparisons, topics such as modeling of the dielectronic processes, density effects or the effect of an external radiation field were addressed. The K-shell spectroscopy of iron plasmas was also addressed, notably through the interpretation of tokamak and laser experimental spectra.
Moore, Brian C J
2003-03-01
To review how the properties of sounds are "coded" in the normal auditory system and to discuss the extent to which cochlear implants can and do represent these codes. Data are taken from published studies of the response of the cochlea and auditory nerve to simple and complex stimuli, in both the normal and the electrically stimulated ear. REVIEW CONTENT: The review describes: 1) the coding in the normal auditory system of overall level (which partly determines perceived loudness), spectral shape (which partly determines perceived timbre and the identity of speech sounds), periodicity (which partly determines pitch), and sound location; 2) the role of the active mechanism in the cochlea, and particularly the fast-acting compression associated with that mechanism; 3) the neural response patterns evoked by cochlear implants; and 4) how the response patterns evoked by implants differ from those observed in the normal auditory system in response to sound. A series of specific issues is then discussed, including: 1) how to compensate for the loss of cochlear compression; 2) the effective number of independent channels in a normal ear and in cochlear implantees; 3) the importance of independence of responses across neurons; 4) the stochastic nature of normal neural responses; 5) the possible role of across-channel coincidence detection; and 6) potential benefits of binaural implantation. Current cochlear implants do not adequately reproduce several aspects of the neural coding of sound in the normal auditory system. Improved electrode arrays and coding systems may lead to improved coding and, it is hoped, to better performance.
ERIC Educational Resources Information Center
Wood, Brenna K.; Drogan, Robin R.; Janney, Donna M.
2014-01-01
Reviewers analyzed studies published from 1990 to 2012 to determine early childhood practitioner involvement in functional behavioral assessment (FBA) and function-based behavioral intervention plans (BIP) for children with challenging behavior, age 6 and younger. Coding of 30 studies included practitioner involvement in FBA and BIP processes,…
NASA Astrophysics Data System (ADS)
Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens
2015-04-01
Recent investments in HPC, cloud and Petascale data stores have dramatically increased the scale and resolution at which earth science challenges can now be tackled. These new infrastructures are highly parallelised, and to fully utilise them and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity and cost of the new infrastructures mean any software deployed has to be reliable, trusted and reusable. Increasingly software is available via open source repositories, but these usually only enable code to be discovered and downloaded. As a user it is hard for a scientist to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund the development of software, to gain credit for the effort, IP, time and dollars spent, and facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework will have five separate, but connected components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. The registration process should include information about licensing, the hardware environments the code can be run on, appropriate validation (testing) procedures and the critical dependencies. 2) The Review component targets verification of the software, typically against a set of benchmark cases, by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate journals (e.g., Geoscientific Model Development) to help users know which codes to trust. 3) Referencing will be accomplished by linking the Software Framework to groups such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on information supplied in the registration process, benchmark cases described in the review and other relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing provenance workflow engines that automatically capture information relating to a particular run of the software, including identification of all input and output artefacts, and all elements and transactions within that workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it and greatly facilitate sharing, reuse and reinstallation of code. Properly designed, it could scale out to massively parallel systems and be accessed nationally and internationally for multiple use cases, including supercomputer centres, cloud facilities, and local computers.
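The Register component described above asks each code to declare licensing, supported environments, dependencies and a validation procedure; a minimal sketch of such a registry record follows, with every field name and value invented for illustration.

# Sketch of a trusted-framework registry entry. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class RegistryEntry:
    name: str
    repository_url: str
    license: str
    environments: list = field(default_factory=list)  # e.g. ["HPC", "cloud"]
    dependencies: dict = field(default_factory=dict)  # package -> version constraint
    validation_procedure: str = ""                    # benchmark suite for the Review step

entry = RegistryEntry(
    name="example-solver",
    repository_url="https://example.org/example-solver",
    license="Apache-2.0",
    environments=["HPC", "cloud"],
    dependencies={"mpi": ">=3.0"},
    validation_procedure="benchmarks/regression_suite",
)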
Developing and modifying behavioral coding schemes in pediatric psychology: a practical guide.
Chorney, Jill MacLaren; McMurtry, C Meghan; Chambers, Christine T; Bakeman, Roger
2015-01-01
To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Noel, Jonathan K; Babor, Thomas F
2017-01-01
Exposure to alcohol marketing is considered to be potentially harmful to adolescents. In addition to statutory regulation, industry self-regulation is a common way to protect adolescents from alcohol marketing exposures. This paper critically reviews research designed to evaluate the effectiveness of the alcohol industry's compliance procedures to manage complaints when alcohol marketing is considered to have violated a self-regulatory code. Peer-reviewed papers were identified through four literature search engines: PubMed, SCOPUS, PsychINFO and CINAHL. Non-peer-reviewed reports produced by public health agencies, alcohol research centers, non-governmental organizations, government research centers and national industry advertising associations were also included. The search process yielded three peer-reviewed papers, seven non-peer reviewed reports published by academic institutes and non-profit organizations and 20 industry reports. The evidence indicates that the complaint process lacks standardization across countries, industry adjudicators may be trained inadequately or biased and few complaints are upheld against advertisements pre-determined to contain violations of a self-regulatory code. The current alcohol industry marketing complaint process used in a wide variety of countries may be ineffective at removing potentially harmful content from the market-place. The process of determining the validity of complaints employed by most industry groups appears to suffer from serious conflict of interest and procedural weaknesses that could compromise objective adjudication of even well-documented complaints. In our opinion the current system of self-regulation needs major modifications if it is to serve public health objectives, and more systematic evaluations of the complaint process are needed. © 2016 Society for the Study of Addiction.
Danforth, Kim N; Early, Megan I; Ngan, Sharon; Kosco, Anne E; Zheng, Chengyi; Gould, Michael K
2012-08-01
Lung nodules are commonly encountered in clinical practice, yet little is known about their management in community settings. An automated method for identifying patients with lung nodules would greatly facilitate research in this area. Using members of a large, community-based health plan from 2006 to 2010, we developed a method to identify patients with lung nodules, by combining five diagnostic codes, four procedural codes, and a natural language processing algorithm that performed free text searches of radiology transcripts. An experienced pulmonologist reviewed a random sample of 116 radiology transcripts, providing a reference standard for the natural language processing algorithm. With the use of an automated method, we identified 7112 unique members as having one or more incident lung nodules. The mean age of the patients was 65 years (standard deviation 14 years). There were slightly more women (54%) than men, and Hispanics and non-whites comprised 45% of the lung nodule cohort. Thirty-six percent were never smokers whereas 11% were current smokers. Fourteen percent of the patients were subsequently diagnosed with lung cancer. The sensitivity and specificity of the natural language processing algorithm for identifying the presence of lung nodules were 96% and 86%, respectively, compared with clinician review. Among the true positive transcripts in the validation sample, only 35% were solitary and unaccompanied by one or more associated findings, and 56% measured 8 to 30 mm in diameter. A combination of diagnostic codes, procedural codes, and a natural language processing algorithm for free text searching of radiology reports can accurately and efficiently identify patients with incident lung nodules, many of whom are subsequently diagnosed with lung cancer.
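At its core, the free-text component of such a method is a pattern search over radiology transcripts with some handling of negation; the toy matcher below illustrates the idea only, and its patterns and crude negation check are simplistic assumptions, not the health plan's validated algorithm.

# Toy free-text screen for lung nodule mentions in radiology transcripts.
import re

NODULE = re.compile(r"\b(nodules?|nodular density|pulmonary mass)\b", re.I)
NEGATED = re.compile(r"\bno (?:evidence of )?(?:pulmonary )?nodules?\b", re.I)

def mentions_nodule(transcript):
    """True if a nodule term appears and is not plainly negated."""
    return bool(NODULE.search(transcript)) and not NEGATED.search(transcript)

mentions_nodule("8 mm nodule in the right upper lobe")  # True
mentions_nodule("No pulmonary nodules identified.")     # False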
The systematic review as a research process in music therapy.
Hanson-Abromeit, Deanna; Sena Moore, Kimberly
2014-01-01
Music therapists are challenged to present evidence on the efficacy of music therapy treatment and incorporate the best available research evidence to make informed healthcare and treatment decisions. Higher standards of evidence can come from a variety of sources including systematic reviews. To define and describe a range of research review methods using examples from music therapy and related literature, with emphasis on the systematic review. In addition, the authors provide a detailed overview of methodological processes for conducting and reporting systematic reviews in music therapy. The systematic review process is described in five steps. Step 1 identifies the research plan and operationalized research question(s). Step 2 illustrates the identification and organization of the existing literature related to the question(s). Step 3 details coding of data extracted from the literature. Step 4 explains the synthesis of coded findings and analysis to answer the research question(s). Step 5 describes the strength of evidence evaluation and results presentation for practice recommendations. Music therapists are encouraged to develop and conduct systematic reviews. This methodology contributes to review outcome credibility and can determine how information is interpreted and used by clinicians, clients or patients, and policy makers. A systematic review is a methodologically rigorous research method used to organize and evaluate extant literature related to a clinical problem. Systematic reviews can assist music therapists in managing the ever-increasing literature, making well-informed evidence based practice and research decisions, and translating existing music-based and nonmusic based literature to clinical practice and research development. © the American Music Therapy Association 2014. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
The MCNP6 Analytic Criticality Benchmark Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-06-16
Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
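Verification against an analytic suite reduces to comparing each computed multiplication factor with its exact value within a stated tolerance; in the schematic harness below the case names, benchmark values and run_case hook are all hypothetical stand-ins for an actual transport-code execution.

# Schematic verification loop over analytic criticality benchmarks.
ANALYTIC_KEFF = {
    "slab-2grp-case1": 1.0,     # placeholder exact solutions
    "sphere-1grp-case7": 1.0,
}

def verify(run_case, rel_tol=1e-4):
    """Return the cases whose computed k_eff misses the analytic value."""
    failures = []
    for case, exact in ANALYTIC_KEFF.items():
        computed = run_case(case)  # would wrap an MCNP (or similar) run
        if abs(computed - exact) / exact > rel_tol:
            failures.append((case, computed, exact))
    return failures  # an empty list means the suite verified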
[Long non-coding RNAs in plants].
Xiaoqing, Huang; Dandan, Li; Juan, Wu
2015-04-01
Long non-coding RNAs (lncRNAs), which are longer than 200 nucleotides in length, widely exist in organisms and function in a variety of biological processes. Currently, most of lncRNAs found in plants are transcribed by RNA polymerase Ⅱ and mediate gene expression through multiple mechanisms, such as target mimicry, transcription interference, histone methylation and DNA methylation, and play important roles in flowering, male sterility, nutrition metabolism, biotic and abiotic stress and other biological processes as regulators in plants. In this review, we summarize the databases, prediction methods, and possible functions of plant lncRNAs discovered in recent years.
What Consumers Say About Nursing Homes in Online Reviews.
Kellogg, Caitlyn; Zhu, Yujun; Cardenas, Valeria; Vazquez, Katalina; Johari, Kayla; Rahman, Anna; Enguidanos, Susan
2018-04-20
Although patient-centered care is an expressed value of our healthcare system, no studies have examined what consumers say in online reviews about nursing homes (NHs). Insight into themes addressed in these reviews could inform improvement efforts that promote patient-centered NH care. We analyzed nursing home (NH) Yelp reviews. From a list of all NHs in California, we drew a purposeful sample of 51 NHs, selecting facilities representing a range of geographical areas and occupancy rates. Two research teams analyzed the reviews using grounded theory to identify codes and tracked how frequently each code was mentioned. We evaluated 264 reviews, identifying 24 codes, grouped under five categories: quality of staff care and staffing; physical facility and setting; resident safety and security; clinical care quality; and financial issues. More than half (53.41%) of Yelp reviewers posted comments related to staff attitude and caring and nearly a third (29.2%) posted comments related to staff responsiveness. Yelp reviewers also often posted about NHs' physical environment. Infrequently mentioned were the quality of health care provided and concerns about resident safety and security. Our results are consistent with those from related studies. Yelp reviewers focus on NH aspects that are not evaluated in most other NH rating systems. The federal Nursing Home Compare website, for instance, does not report measures of staff attitudes or the NH's physical setting. Rather, it reports measures of staffing levels and clinical processes and outcomes. We recommend that NH consumers consult both types of rating systems because they provide complementary information.
A content review of cognitive process measures used in pain research within adult populations.
Day, M A; Lang, C P; Newton-John, T R O; Ehde, D M; Jensen, M P
2017-01-01
Previous research suggests that measures of cognitive process may be confounded by the inclusion of items that also assess cognitive content. The primary aims of this content review were to: (1) identify the domains of cognitive processes assessed by measures used in pain research; and (2) determine if pain-specific cognitive process measures with adequate psychometric properties exist. PsychInfo, CINAHL, PsycArticles, MEDLINE, and Academic Search Complete databases were searched to identify the measures of cognitive process used in pain research. Identified measures were double coded and the measure's items were rated as: (1) cognitive content; (2) cognitive process; (3) behavioural/social; and/or (4) emotional coping/responses to pain. A total of 319 scales were identified; of these, 29 were coded as providing an un-confounded assessment of cognitive process, and 12 were pain-specific. The cognitive process domains assessed in these measures are Absorption, Dissociation, Reappraisal, Distraction/Suppression, Acceptance, Rumination, Non-Judgment, and Enhancement. Pain-specific, un-confounded measures were identified for: Dissociation, Reappraisal, Distraction/Suppression, and Acceptance. Psychometric properties of all 319 scales are reported in supplementary material. To understand the importance of cognitive processes in influencing pain outcomes as well as explaining the efficacy of pain treatments, valid and pain-specific cognitive process measures that are not confounded with non-process domains (e.g., cognitive content) are needed. The findings of this content review suggest that future research focused on developing cognitive process measures is critical in order to advance our understanding of the mechanisms that underlie effective pain treatment. Many cognitive process measures used in pain research contain a 'mix' of items that assess cognitive process, cognitive content, and behavioural/emotional responses. Databases searched: PsychInfo, CINAHL, PsycArticles, MEDLINE and Academic Search Complete. This review describes the domains assessed by measures assessing cognitive processes in pain research, as well as the strengths and limitations of these measures. © 2016 European Pain Federation - EFIC®.
Finite-block-length analysis in classical and quantum information theory.
Hayashi, Masahito
2017-01-01
Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.
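For context, the hallmark result of this literature (a standard formula, e.g., the Gaussian approximation associated with Polyanskiy, Poor, and Verdú and with Hayashi, not quoted from the abstract itself): for a memoryless channel with capacity C and channel dispersion V, the maximal code size M*(n, ε) achievable at blocklength n and error probability ε satisfies

```latex
\log M^*(n,\varepsilon) \;=\; nC \;-\; \sqrt{nV}\, Q^{-1}(\varepsilon) \;+\; O(\log n),
```

where Q^{-1} is the inverse of the Gaussian tail function. The square-root correction term is precisely the finite-size effect such reviews survey; it vanishes relative to nC only as the blocklength grows without bound.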
Minozzi, Silvia; Armaroli, Paola; Espina, Carolina; Villain, Patricia; Wiseman, Martin; Schüz, Joachim; Segnan, Nereo
2015-12-01
The European Code Against Cancer is a set of recommendations to give advice on cancer prevention. Its 4th edition is an update of the 3rd edition, from 2003. Working Groups of independent experts from different fields of cancer prevention were appointed to review the recommendations, supported by a Literature Group to provide scientific and technical support in the assessment of the scientific evidence, through systematic reviews of the literature. Common procedures were developed to guide the experts in identifying, retrieving, assessing, interpreting and summarizing the scientific evidence in order to revise the recommendations. The Code strictly followed the concept of providing advice to European Union citizens based on the current best available science. The advice, if followed, would be expected to reduce cancer risk, referring both to avoiding or reducing exposure to carcinogenic agents or changing behaviour related to cancer risk and to participating in medical interventions able to avert specific cancers or their consequences. The information sources and procedures for the review of the scientific evidence are described here in detail. The 12 recommendations of the 4th edition of the European Code Against Cancer were ultimately approved by a Scientific Committee of leading European cancer and public health experts. Copyright © 2015 International Agency for Research on Cancer. Published by Elsevier Ltd. All rights reserved.
Jeong, Dahn; Presseau, Justin; ElChamaa, Rima; Naumann, Danielle N; Mascaro, Colin; Luconi, Francesca; Smith, Karen M; Kitto, Simon
2018-04-10
This scoping review explored the barriers and facilitators that influence engagement in and implementation of self-directed learning (SDL) in continuing professional development (CPD) for physicians in Canada. This review followed the six-stage scoping review framework of Arksey and O'Malley and of Daudt et al. In 2015, the authors searched eight online databases for English-language Canadian articles published January 2005-December 2015. To chart and analyze the data from the 17 included studies, they employed a two-step analysis process of conventional content analysis followed by directed coding guided by the Theoretical Domains Framework (TDF). Conventional content analysis generated five categories of barriers and facilitators: individual, program, technological, environmental, and workplace/organizational. Directed coding guided by the TDF allowed analysis of barriers and facilitators to behavior change according to two key groups: physicians engaging in SDL and SDL developers designing and implementing SDL programs. Of the 318 total barriers and facilitators coded, 290 (91.2%) were coded for physicians and 28 (8.8%) for SDL developers. The majority (209; 65.7%) were coded in four key TDF domains: environmental context and resources, social influences, beliefs about consequences, and behavioral regulation. This scoping review identified five categories of barriers and facilitators in the literature and four key TDF domains where most factors related to behavior change of physicians and SDL developers regarding SDL programs in CPD were coded. There was a significant gap in the literature about factors that may contribute to SDL developers' capacity to design and implement SDL programs in CPD.
Cognitive Evaluation Theory: An Experimental Test of Processes and Outcomes.
1981-06-26
[Report documentation page residue.] The recoverable abstract fragments describe cognitive evaluation theory as a framework for explaining the detrimental effects of performance-contingent rewards on intrinsically motivated behaviors, together with a review of the literature. The work was supported in part by the Organizational Effectiveness Research Program, Office of Naval Research (Code 452), under Contract No. N0014-79-C-0750, NR 170-892.
[Long non-coding RNAs in the pathophysiology of atherosclerosis].
Novak, Jan; Vašků, Julie Bienertová; Souček, Miroslav
2018-01-01
The human genome contains about 22 000 protein-coding genes that are transcribed into an even larger number of messenger RNAs (mRNA). Interestingly, the results of the ENCODE project from 2012 show that, although up to 90 % of our genome is actively transcribed, protein-coding mRNAs make up only 2-3 % of the total amount of transcribed RNA. The rest of the RNA transcripts are not translated into proteins, which is why they are referred to as "non-coding RNAs". Earlier, non-coding RNA was considered "the dark matter of the genome", or "junk" whose genes had accumulated in our DNA during the course of evolution. Today we know that non-coding RNAs fulfil a variety of regulatory functions in our body - they intervene in epigenetic processes from chromatin remodelling to histone methylation, in the transcription process itself, and in post-transcriptional processes. Long non-coding RNAs (lncRNA) are the class of non-coding RNAs longer than 200 nucleotides (non-coding RNAs shorter than 200 nucleotides are called small non-coding RNAs). lncRNAs represent a large and widely varied group of molecules with diverse regulatory functions. They can be identified in all conceivable cell types and tissues, and even in the extracellular space, including blood and specifically plasma. Their levels change during organogenesis, they are specific to different tissues, and their changes accompany the development of different illnesses, including atherosclerosis. This review article presents lncRNAs in general and then focuses on some of their specific representatives in relation to the process of atherosclerosis (i.e., we describe lncRNA involvement in the biology of endothelial cells, vascular smooth muscle cells, and immune cells), and we further describe the possible clinical potential of lncRNAs, whether in the diagnostics or therapy of atherosclerosis and its clinical manifestations. Key words: atherosclerosis - lincRNA - lncRNA - MALAT - MIAT.
Sada, Yvonne; Hou, Jason; Richardson, Peter; El-Serag, Hashem; Davila, Jessica
2013-01-01
Background: Accurate identification of hepatocellular cancer (HCC) cases from automated data is needed for efficient and valid quality improvement initiatives and research. We validated HCC ICD-9 codes, and evaluated whether natural language processing (NLP) by the Automated Retrieval Console (ARC) for document classification improves HCC identification. Methods: We identified a cohort of patients with ICD-9 codes for HCC during 2005–2010 from Veterans Affairs administrative data. Pathology and radiology reports were reviewed to confirm HCC. The positive predictive value (PPV), sensitivity, and specificity of ICD-9 codes were calculated. A split validation study of pathology and radiology reports was performed to develop and validate ARC algorithms. Reports were manually classified as diagnostic of HCC or not. ARC generated document classification algorithms using the Clinical Text Analysis and Knowledge Extraction System. ARC performance was compared to manual classification. PPV, sensitivity, and specificity of ARC were calculated. Results: 1138 patients with HCC were identified by ICD-9 codes. Based on manual review, 773 had HCC. The HCC ICD-9 code algorithm had a PPV of 0.67, sensitivity of 0.95, and specificity of 0.93. For a random subset of 619 patients, we identified 471 pathology reports for 323 patients and 943 radiology reports for 557 patients. The pathology ARC algorithm had a PPV of 0.96, sensitivity of 0.96, and specificity of 0.97. The radiology ARC algorithm had a PPV of 0.75, sensitivity of 0.94, and specificity of 0.68. Conclusion: A combined approach of ICD-9 codes and NLP of pathology and radiology reports improves HCC case identification in automated data. PMID:23929403
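As a hedged illustration of the validation arithmetic reported above (not the study's code), all three metrics derive from a 2x2 table of algorithm output versus chart review; the counts below are hypothetical:

```python
# Minimal sketch: validation metrics for a case-identification algorithm
# scored against manual chart review. Counts are hypothetical.

def validation_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """PPV, sensitivity, and specificity from a 2x2 confusion table."""
    return {
        "ppv": tp / (tp + fp),          # confirmed cases / screened positive
        "sensitivity": tp / (tp + fn),  # screened positive / all true cases
        "specificity": tn / (tn + fp),  # screened negative / all non-cases
    }

print(validation_metrics(tp=96, fp=4, fn=4, tn=96))
# -> {'ppv': 0.96, 'sensitivity': 0.96, 'specificity': 0.96}
```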
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-26
... authority for use in the PSD permitting process. See 75 FR 64864 at 64899. EPA is currently developing... ethanol by natural fermentation under the North American Industry Classification System (NAICS) codes...
What is Bottom-Up and What is Top-Down in Predictive Coding?
Rauss, Karsten; Pourtois, Gilles
2013-01-01
Everyone knows what bottom-up is, and how it is different from top-down. At least one is tempted to think so, given that both terms are ubiquitously used, but only rarely defined in the psychology and neuroscience literature. In this review, we highlight the problems and limitations of our current understanding of bottom-up and top-down processes, and we propose a reformulation of this distinction in terms of predictive coding. PMID:23730295
Review Of Piping And Pressure Vessel Code Design Criteria. Technical Report 217.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
1969-04-18
This Technical Report summarizes a review of the design philosophies and criteria of the ASME Boiler and Pressure Vessel Code and the USASI Code for Pressure Piping. It traces the history of the Codes since their inception and critically reviews their present status. Recommendations are made concerning the applicability of the Codes to the special needs of LMFBR liquid sodium piping.
ERIC Educational Resources Information Center
Donaldson, William S.; Stephens, Thomas M.
1979-01-01
Sections address the RFP/IFB (Request for Proposals/Invitation for Bids) process and procedures for selecting the "best" contract. In reviewing the federal procurement code and the recent decision of the General Accounting Office, particularly regarding the NIMIS (National Instructional Materials Information System) contract, inconsistencies…
Reactive transport modeling in fractured rock: A state-of-the-science review
NASA Astrophysics Data System (ADS)
MacQuarrie, Kerry T. B.; Mayer, K. Ulrich
2005-10-01
The field of reactive transport modeling has expanded significantly in the past two decades and has assisted in resolving many issues in Earth Sciences. Numerical models allow for detailed examination of coupled transport and reactions, or more general investigation of controlling processes over geologic time scales. Reactive transport models serve to provide guidance in field data collection and, in particular, enable researchers to link modeling and hydrogeochemical studies. In this state-of-the-science review, the key objectives were to examine the applicability of reactive transport codes for exploring issues of redox stability to depths of several hundreds of meters in sparsely fractured crystalline rock, with a focus on the Canadian Shield setting. A conceptual model of oxygen ingress and redox buffering, within a Shield environment at time and space scales relevant to nuclear waste repository performance, is developed through a review of previous research. This conceptual model describes geochemical and biological processes and mechanisms materially important to understanding redox buffering capacity and radionuclide mobility in the far-field. Consistent with this model, reactive transport codes should ideally be capable of simulating the effects of changing recharge water compositions as a result of long-term climate change, and fracture-matrix interactions that may govern water-rock interaction. Other aspects influencing the suitability of reactive transport codes include the treatment of various reaction and transport time scales, the ability to apply equilibrium or kinetic formulations simultaneously, the need to capture feedback between water-rock interactions and porosity-permeability changes, and the representation of fractured crystalline rock environments as discrete fracture or dual continuum media. A review of modern multicomponent reactive transport codes indicates a relatively high level of maturity. Within the Yucca Mountain nuclear waste disposal program, reactive transport codes of varying complexity have been applied to investigate the migration of radionuclides and the geochemical evolution of host rock around the planned disposal facility. Through appropriate near- and far-field application of dual continuum codes, this example demonstrates how reactive transport models have been applied to assist in constraining historic water infiltration rates, interpreting the sealing of flow paths due to mineral precipitation, and investigating post-closure geochemical monitoring strategies. Natural analogue modeling studies, although few in number, are also of key importance as they allow the comparison of model results with hydrogeochemical and paleohydrogeological data over geologic time scales.
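As a toy illustration of the transport-reaction coupling such codes resolve (a hedged sketch, not any production reactive transport code; all parameters are hypothetical), consider oxygen advecting along a 1D flow path while being consumed by a first-order reaction with reduced minerals, solved by simple operator splitting:

```python
# 1D advection + first-order O2 consumption via operator splitting.
# Illustrative only; real multicomponent codes couple many species,
# equilibrium and kinetic reactions, and porosity-permeability feedback.
import numpy as np

nx, dx, dt = 100, 1.0, 0.5        # grid cells, spacing (m), time step (yr)
v, k = 1.0, 0.05                  # pore velocity (m/yr), rate constant (1/yr)
c = np.zeros(nx)                  # dissolved O2 (mol/m^3)
c_in = 0.25                       # recharge-water O2 at the inflow boundary

for _ in range(200):
    # Transport step: explicit upwind advection (CFL = v*dt/dx = 0.5)
    c[1:] -= v * dt / dx * (c[1:] - c[:-1])
    c[0] = c_in
    # Reaction step: first-order consumption by reducing minerals
    c *= np.exp(-k * dt)

print(c[:10].round(4))            # O2 front attenuates with distance
```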
Discussion on LDPC Codes and Uplink Coding
NASA Technical Reports Server (NTRS)
Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio
2007-01-01
This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error-correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts showing the LDPC decoder's sensitivity to symbol scaling errors are reviewed, as well as a chart comparing the performance of several frame synchronizer algorithms with that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.
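For readers unfamiliar with LDPC codes, the defining object is a sparse parity-check matrix H: a word r is a valid codeword exactly when every parity check is satisfied. A minimal sketch follows (illustrative only, not the workgroup's codes; the toy H is the small, dense (7,4) Hamming parity-check matrix rather than a genuinely sparse LDPC matrix):

```python
# Syndrome check at the heart of LDPC decoding: r is a codeword iff
# H @ r = 0 (mod 2). Real LDPC matrices are large and sparse; this toy
# H is the (7,4) Hamming parity-check matrix for illustration.
import numpy as np

H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def syndrome(H: np.ndarray, r: np.ndarray) -> np.ndarray:
    """Mod-2 syndrome; a nonzero entry flags an unsatisfied parity check."""
    return H.dot(r) % 2

r = np.array([1, 0, 1, 1, 0, 1, 0])
print(syndrome(H, r))              # -> [0 0 0]: r satisfies every check

r_err = r.copy()
r_err[0] ^= 1                      # flip one bit in transit
print(syndrome(H, r_err))          # -> [1 1 0]: checks 0 and 1 now fail
```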
Zhu, Vivienne J; Walker, Tina D; Warren, Robert W; Jenny, Peggy B; Meystre, Stephane; Lenert, Leslie A
2017-01-01
Quality reporting that relies on coded administrative data alone may not completely and accurately depict providers' performance. To assess this concern with a test case, we developed and evaluated a natural language processing (NLP) approach to identify falls risk screenings documented in clinical notes of patients without coded falls risk screening data. Extracting information from 1,558 clinical notes (mainly progress notes) from 144 eligible patients, we generated a lexicon of 38 keywords relevant to falls risk screening, 26 terms for pre-negation, and 35 terms for post-negation. The NLP algorithm identified 62 (out of the 144) patients whose falls risk screening was documented only in clinical notes and not coded. Manual review confirmed 59 patients as true positives and 77 patients as true negatives. Our NLP approach scored 0.92 for precision, 0.95 for recall, and 0.93 for F-measure. These results support the concept of utilizing NLP to enhance healthcare quality reporting. PMID:29854264
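A minimal sketch of the keyword-plus-negation screening idea described above; the terms, window size, and logic are hypothetical stand-ins, not the study's 38-keyword lexicon:

```python
# Illustrative keyword matching with a pre-negation window. The keyword
# and negation lists here are hypothetical, not the study's lexicon.
import re

KEYWORDS = ["morse fall scale", "falls risk", "fall risk"]   # hypothetical
PRE_NEGATIONS = ["no", "denies", "without"]                  # hypothetical

def screened_positive(note: str, window: int = 3) -> bool:
    """Flag a note if any keyword occurs without a nearby pre-negation."""
    text = note.lower()
    for kw in KEYWORDS:
        for m in re.finditer(re.escape(kw), text):
            preceding = text[:m.start()].split()[-window:]
            if not any(neg in preceding for neg in PRE_NEGATIONS):
                return True
    return False

print(screened_positive("Morse Fall Scale completed; score 45."))  # True
print(screened_positive("Patient denies falls risk concerns."))    # False
```

A production system would add post-negation handling (terms that follow the keyword), tokenization, and section awareness, which is roughly what the 26 pre-negation and 35 post-negation terms above suggest the study did.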
García-Betances, Rebeca I; Huerta, Mónica K
2012-01-01
A comparative review is presented of available technologies suitable for automatic reading of patient identification bracelet tags. Existing technologies' backgrounds, characteristics, advantages, and disadvantages are described in relation to their possible use by public health care centers with budgetary limitations. A comparative assessment is presented of suitable automatic identification systems based on graphic codes, both one- (1D) and two-dimensional (2D), printed on labels, as well as those based on radio frequency identification (RFID) tags. The analysis looks at the tradeoffs of these technologies to provide guidance to hospital administrators looking to deploy patient identification technology. The results suggest that affordable automatic patient identification systems can be easily and inexpensively implemented using 2D codes printed on low-cost bracelet labels, which can then be read and automatically decoded by ordinary mobile smart phones. Because of mobile smart phones' present versatility and ubiquity, the implementation and operation of 2D code, and especially Quick Response® (QR) Code, technology emerges as a very attractive alternative for automating patient identification processes in low-budget situations.
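As a hedged sketch of how cheaply such a workflow can be assembled (assuming the open-source Python `qrcode` package, installed with `pip install qrcode[pil]`; the payload layout is a hypothetical example, not a standard):

```python
# Generate a 2D-coded bracelet label with commodity software; any
# camera-equipped smart phone can decode the result.
import qrcode

payload = "PID:00012345|NAME:DOE,JANE|DOB:1970-01-01"  # hypothetical format
img = qrcode.make(payload)         # returns a PIL image of the QR symbol
img.save("bracelet_label.png")     # print onto a low-cost adhesive label
```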
Progress towards a world-wide code of conduct
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, J.A.N.; Berleur, J.
1994-12-31
In this paper the work of the International Federation for Information Processing (IFIP) Task Group on Ethics is described and the recommendations presented to the General Assembly are reviewed. While a common code of ethics or conduct has not been recommended for consideration by the member societies of IFIP, a set of guidelines for the establishment and evaluation of codes has been produced and procedures for the assistance of code development have been established within IFIP. This paper proposes that the data collected by the Task Group and the proposed guidelines can be used as a tool for the study of codes of practice, providing a teachable, learnable educational module in courses related to the ethics of computing and computation, and looks at the next steps in bringing ethical awareness to the IT community.
Metabolic Free Energy and Biological Codes: A 'Data Rate Theorem' Aging Model.
Wallace, Rodrick
2015-06-01
A famous argument by Maturana and Varela (Autopoiesis and cognition. Reidel, Dordrecht, 1980) holds that the living state is cognitive at every scale and level of organization. Since it is possible to associate many cognitive processes with 'dual' information sources, pathologies can sometimes be addressed using statistical models based on the Shannon Coding, the Shannon-McMillan Source Coding, the Rate Distortion, and the Data Rate Theorems, which impose necessary conditions on information transmission and system control. Deterministic-but-for-error biological codes do not directly invoke cognition, but may be essential subcomponents within larger cognitive processes. A formal argument, however, places such codes within a similar framework, with metabolic free energy serving as a 'control signal' stabilizing biochemical code-and-translator dynamics in the presence of noise. Demand beyond available energy supply triggers punctuated destabilization of the coding channel, affecting essential biological functions. Aging, normal or prematurely driven by psychosocial or environmental stressors, must interfere with the routine operation of such mechanisms, initiating the chronic diseases associated with senescence. Amyloid fibril formation, intrinsically disordered protein logic gates, and cell surface glycan/lectin 'kelp bed' logic gates are reviewed from this perspective. The results generalize beyond coding machineries having easily recognizable symmetry modes, and strip a layer of mathematical complication from the study of phase transitions in nonequilibrium biological systems.
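The abstract leans on the Data Rate Theorem without stating it. In its standard control-theoretic form (a contextual note, not quoted from the paper), stabilizing a noisy linear system x_{t+1} = A x_t + B u_t + w_t over a communication channel requires the channel rate R to exceed the system's intrinsic rate of instability:

```latex
R \;>\; \sum_{\{i \,:\, |\lambda_i(A)| \ge 1\}} \log_2 |\lambda_i(A)|,
```

where the sum runs over the unstable eigenvalues of A. On this reading, the "punctuated destabilization" described above corresponds to metabolic demand pushing the effective channel rate below this bound.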
Proposed standards for peer-reviewed publication of computer code
USDA-ARS?s Scientific Manuscript database
Computer simulation models are mathematical abstractions of physical systems. In the area of natural resources and agriculture, these physical systems encompass selected interacting processes in plants, soils, animals, or watersheds. These models are scientific products and have become important i...
Analysis of Design-Build Processes, Best Practices, and Applications to the Department of Defense
2006-06-01
NAVFAC design-build processes published in trade journals, books , magazines, internet articles, and DoD policy. In their book , Contract Management...literature review concentrates on recent articles published in books , trade magazines, and on the internet to determine design-build processes and...Keith Molenaar ) Design-build projects under the State of California’s Public Contract Code (Legaltips.org, 2006) requires the owner, for example the
Codes of environmental management practice: Assessing their potential as a tool for change
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nash, J.; Ehrenfeld, J.
1997-12-31
Codes of environmental management practice emerged as a tool of environmental policy in the late 1980s. Industry and other groups have developed codes for two purposes: to change the environmental behavior of participating firms and to increase public confidence in industry`s commitment to environmental protection. This review examines five codes of environmental management practice: Responsible Care, the International Chamber of Commerce`s Business Charter for Sustainable Development, ISO 14000, the CERES Principles, and The Natural Step. The first three codes have been drafted and promoted primarily by industry; the others have been developed by non-industry groups. These codes have spurred participatingmore » firms to introduce new practices, including the institution of environmental management systems, public environmental reporting, and community advisory panels. The extent to which codes are introducing a process of cultural change is considered in terms of four dimensions: new consciousness, norms, organization, and tools. 94 refs., 3 tabs.« less
Adams, Bradley J; Aschheim, Kenneth W
2016-01-01
Comparison of antemortem and postmortem dental records is a leading method of victim identification, especially for incidents involving a large number of decedents. This process may be expedited with computer software that provides a ranked list of best possible matches. This study provides a comparison of the most commonly used conventional coding and sorting algorithms used in the United States (WinID3) with a simplified coding format that utilizes an optimized sorting algorithm. The simplified system consists of seven basic codes and utilizes an optimized algorithm based largely on the percentage of matches. To perform this research, a large reference database of approximately 50,000 antemortem and postmortem records was created. For most disaster scenarios, the proposed simplified codes, paired with the optimized algorithm, performed better than WinID3, which uses more complex codes. The detailed coding system does show better performance with extremely large numbers of records and/or significant body fragmentation. © 2015 American Academy of Forensic Sciences.
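A hedged sketch of ranking antemortem records by percentage of matching tooth codes, in the spirit of the optimized algorithm described above; the code letters and records below are hypothetical, not the study's seven-code scheme:

```python
# Rank antemortem (AM) records against one postmortem (PM) record by
# the percentage of comparable teeth whose codes agree. Illustrative only.

def match_percentage(postmortem: dict, antemortem: dict) -> float:
    """Percentage of teeth present in both records with identical codes."""
    comparable = [t for t in postmortem if t in antemortem]
    if not comparable:
        return 0.0
    hits = sum(postmortem[t] == antemortem[t] for t in comparable)
    return 100.0 * hits / len(comparable)

pm = {1: "V", 2: "F", 3: "M", 14: "C"}          # tooth number -> code
am_records = {"A001": {1: "V", 2: "F", 3: "M", 14: "V"},
              "A002": {1: "M", 2: "F", 3: "V", 14: "C"}}

ranked = sorted(am_records,
                key=lambda k: match_percentage(pm, am_records[k]),
                reverse=True)
print(ranked)  # -> ['A001', 'A002']: best candidate identifications first
```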
The analysis of verbal interaction sequences in dyadic clinical communication: a review of methods.
Connor, Martin; Fletcher, Ian; Salmon, Peter
2009-05-01
To identify methods available for sequential analysis of dyadic verbal clinical communication and to review their methodological and conceptual differences. Critical review, based on literature describing sequential analyses of clinical and other relevant social interaction. Dominant approaches are based on analysis of communication according to its precise position in the series of utterances that constitute event-coded dialogue. For practical reasons, methods focus on very short-term processes, typically the influence of one party's speech on what the other says next. Studies of longer-term influences are rare. Some analyses have statistical limitations, particularly in disregarding heterogeneity between consultations, patients or practitioners. Additional techniques, including ones that can use information about timing and duration of speech from interval-coding are becoming available. There is a danger that constraints of commonly used methods shape research questions and divert researchers from potentially important communication processes including ones that operate over a longer-term than one or two speech turns. Given that no one method can model the complexity of clinical communication, multiple methods, both quantitative and qualitative, are necessary. Broadening the range of methods will allow the current emphasis on exploratory studies to be balanced by tests of hypotheses about clinically important communication processes.
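To make the "one or two speech turns" idea concrete, here is a minimal sketch of lag-1 sequential analysis over event-coded dialogue (illustrative only; the codes are invented, and real schemes such as the Roter Interaction Analysis System are far richer):

```python
# First-order (lag-1) transition probabilities between utterance codes:
# how likely is each code to immediately follow another?
from collections import Counter, defaultdict

utterances = ["DR:question", "PT:disclosure", "DR:empathy",
              "PT:disclosure", "DR:question", "PT:disclosure"]

pairs = Counter(zip(utterances, utterances[1:]))   # adjacent code pairs
totals = defaultdict(int)
for (a, _b), n in pairs.items():
    totals[a] += n

for (a, b), n in sorted(pairs.items()):
    print(f"P({b} | {a}) = {n / totals[a]:.2f}")
```

The statistical caution raised above applies directly to such counts: pooling pairs across consultations ignores heterogeneity between patients and practitioners, and nothing in a lag-1 table captures influences that unfold over longer stretches of the encounter.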
NASA's Principal Center for Review of Clean Air Act Regulations
NASA Technical Reports Server (NTRS)
Clark-Ingram, Marceia
2003-01-01
Marshall Space Flight Center (MSFC) was selected as the Principal Center for review of Clean Air Act (CAA) regulations. The CAA Principal Center is tasked to: 1) Provide centralized support to NASA/HDQ Code JE for the management and leadership of NASA's CAA regulation review process; 2) Identify potential impact from proposed CAA regulations to NASA program hardware and supporting facilities. The Shuttle Environmental Assurance Initiative, one of the responsibilities of the NASA CAA Working Group (WG), is described in part of this viewgraph presentation.
Image Coding Based on Address Vector Quantization.
NASA Astrophysics Data System (ADS)
Feng, Yushu
Image coding is finding increased application in teleconferencing, archiving, and remote sensing. This thesis investigates the potential of Vector Quantization (VQ), a relatively new source coding technique, for compression of monochromatic and color images. Extensions of the Vector Quantization technique to the Address Vector Quantization method have been investigated. In Vector Quantization, the image data to be encoded are first processed to yield a set of vectors. A codeword from the codebook which best matches the input image vector is then selected. Compression is achieved by replacing the image vector with the index of the codeword which produced the best match; the index is sent to the channel. Reconstruction of the image is done by using a table lookup technique, where the label is simply used as an address for a table containing the representative vectors. A codebook of representative vectors (codewords) is generated using an iterative clustering algorithm such as K-means or the generalized Lloyd algorithm. A review of different Vector Quantization techniques is given in Chapter 1. Chapter 2 gives an overview of codebook design methods, including the use of the Kohonen neural network to design codebooks. During the encoding process, the correlation of the addresses is considered, and Address Vector Quantization is developed for color image and monochrome image coding. Address VQ, which includes static and dynamic processes, is introduced in Chapter 3. In order to overcome the problems in Hierarchical VQ, Multi-layer Address Vector Quantization is proposed in Chapter 4. This approach gives the same performance as the normal VQ scheme, but the bit rate is about 1/2 to 1/3 that of the normal VQ method. In Chapter 5, a Dynamic Finite State VQ, based on a probability transition matrix used to select the best subcodebook to encode the image, is developed. In Chapter 6, a new adaptive vector quantization scheme suitable for color video coding, called "A Self-Organizing Adaptive VQ Technique", is presented. In addition to Chapters 2 through 6, which report on new work, this dissertation includes one chapter (Chapter 1) and part of Chapter 2 which review previous work on VQ and image coding, respectively. Finally, a short discussion of directions for further research is presented in conclusion.
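A compact sketch of the baseline (memoryless) VQ pipeline the dissertation builds on: K-means codebook design, index transmission, and table-lookup reconstruction. Illustrative only; the address-based extensions additionally exploit correlations among the transmitted indices:

```python
# Plain vector quantization: design a codebook with a generalized-Lloyd /
# K-means iteration, transmit codeword indices, reconstruct by lookup.
import numpy as np

rng = np.random.default_rng(0)
vectors = rng.random((1000, 16))            # e.g., 4x4 image blocks

def kmeans_codebook(data, k=32, iters=20):
    """Alternate nearest-codeword assignment and centroid update."""
    codebook = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        d = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                codebook[j] = data[labels == j].mean(0)
    return codebook

codebook = kmeans_codebook(vectors)
d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
indices = d.argmin(1)          # transmitted: 5 bits per 16-sample block
reconstructed = codebook[indices]           # decoder: table lookup
print(float(((vectors - reconstructed) ** 2).mean()))  # distortion
```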
The NPG 7120.5A Electronic Review Process
NASA Technical Reports Server (NTRS)
McBrayer, Robert; Ives, Mark
1998-01-01
The use of electronics to review a document is well within the technical realm of today's state-of-the-art workplace. File servers and web site interaction are common tools for many NASA employees. The electronic comment processing described here was developed for the NPG 7120.5A review to augment the existing NASA Online Directives Information System (NODIS). The NODIS system is NASA's official system for formal review, approval, and storage of NASA Directives. The electronic review process worked so well that NASA and other agencies may want to consider it as one of our "best practices." It was participatory decision making at its very best, a process that attracted dozens of very good ideas to improve the document as well as the way we manage projects. The revision of NPG 7120.5A has significant implications for the way all elements of the Agency accomplish program and project management. Therefore, the review of NPG 7120.5A was an Agencywide effort with high visibility, heavy participation, and a short schedule. The level of involvement created interest in supplementing the formal NODIS system with a system to collect comments efficiently and to allow the Centers and Codes to review and consolidate their comments into the official system in a short period of time. In addition, the Program Management Council Working Group (PMCWG), responsible for the revision of the document and the disposition of official comments, needed an electronic system to manage the disposition of comments, obtain PMCWG consensus on each disposition, and coordinate each disposition with the appropriate Headquarters Code that had submitted the official comment. The combined NASA and contractor talents and resources provided a system that supplemented the NODIS system and its operating personnel to produce a thorough review and approval of NPG 7120.5A on April 3, 1998, 7.5 months after the start of the process, against an original six-month schedule. All milestones occurred on time, except for completion of comment disposition, which required an additional 30 days. Approval of the document occurred sixteen days after completion of the "Purple Package."
Quality of coding diagnoses in emergency departments: effects on mapping the public's health.
Aharonson-Daniel, Limor; Schwartz, Dagan; Hornik-Lurie, Tzipi; Halpern, Pinchas
2014-01-01
Emergency department (ED) attendees reflect the health of the population served by that hospital and the availability of health care services in the community. Our objective was to examine the quality and accuracy of diagnoses recorded in the ED in order to appraise their potential utility as a gauge of the population's medical needs. Using the Delphi process, a preliminary list of health indicators generated by an expert focus group was converted to a query to the Ministry of Health's database. In parallel, medical charts were reviewed in four hospitals to compare the handwritten diagnosis in the medical record with that recorded on the standard diagnosis "pick list" coding sheet. Quantity and quality of coding were assessed using explicit criteria. During 2010 a total of 17,761 charts were reviewed; diagnoses were not coded in 42% of them. The accuracy of existing coding was excellent (mismatch 1%-5%). A database query (2,670,300 visits to 28 hospitals in 2009) demonstrated the potential benefits of these data as indicators of regional health needs. The findings suggest that an increase in the provision of community care may reduce ED attendance. Information on ED visits can be used to support health care planning. A "pick list" form with common diagnoses can facilitate quality recording of diagnoses in a busy ED, profiling the population's health needs in order to optimize care. Better compliance with the directive to code diagnoses is desired.
Revisiting place and temporal theories of pitch
2014-01-01
The nature of pitch and its neural coding have been studied for over a century. A popular debate has revolved around the question of whether pitch is coded via “place” cues in the cochlea, or via timing cues in the auditory nerve. In the most recent incarnation of this debate, the role of temporal fine structure has been emphasized in conveying important pitch and speech information, particularly because the lack of temporal fine structure coding in cochlear implants might explain some of the difficulties faced by cochlear implant users in perceiving music and pitch contours in speech. In addition, some studies have postulated that hearing-impaired listeners may have a specific deficit related to processing temporal fine structure. This article reviews some of the recent literature surrounding the debate, and argues that much of the recent evidence suggesting the importance of temporal fine structure processing can also be accounted for using spectral (place) or temporal-envelope cues. PMID:25364292
Theory of Mind: A Neural Prediction Problem
Koster-Hale, Jorie; Saxe, Rebecca
2014-01-01
Predictive coding posits that neural systems make forward-looking predictions about incoming information. Neural signals contain information not about the currently perceived stimulus, but about the difference between the observed and the predicted stimulus. We propose to extend the predictive coding framework from high-level sensory processing to the more abstract domain of theory of mind; that is, to inferences about others’ goals, thoughts, and personalities. We review evidence that, across brain regions, neural responses to depictions of human behavior, from biological motion to trait descriptions, exhibit a key signature of predictive coding: reduced activity to predictable stimuli. We discuss how future experiments could distinguish predictive coding from alternative explanations of this response profile. This framework may provide an important new window on the neural computations underlying theory of mind. PMID:24012000
Pop-Bica, Cecilia; Gulei, Diana; Cojocneanu-Petric, Roxana; Braicu, Cornelia; Petrut, Bogdan; Berindan-Neagoe, Ioana
2017-01-01
The mortality and morbidity that characterize bladder cancer compel this malignancy into the category of hot topics in terms of biomolecular research. Therefore, a better knowledge of the specific molecular mechanisms that underlie the development and progression of bladder cancer is demanded. Tumor heterogeneity among patients with similar diagnosis, as well as intratumor heterogeneity, generates difficulties in terms of targeted therapy. Furthermore, late diagnosis represents an ongoing issue, significantly reducing the response to therapy and, inevitably, the overall survival. The role of non-coding RNAs in bladder cancer emerged in the last decade, revealing that microRNAs (miRNAs) may act as tumor suppressor genes, respectively oncogenes, but also as biomarkers for early diagnosis. Regarding other types of non-coding RNAs, especially long non-coding RNAs (lncRNAs) which are extensively reviewed in this article, their exact roles in tumorigenesis are—for the time being—not as evident as in the case of miRNAs, but, still, clearly suggested. Therefore, this review covers the non-coding RNA expression profile of bladder cancer patients and their validated target genes in bladder cancer cell lines, with repercussions on processes such as proliferation, invasiveness, apoptosis, cell cycle arrest, and other molecular pathways which are specific for the malignant transformation of cells. PMID:28703782
An audit of alcohol brand websites.
Gordon, Ross
2011-11-01
The study investigated the nature and content of alcohol brand websites in the UK. The research involved an audit of the websites of the 10 leading alcohol brands by sales in the UK across four categories: lager, spirits, Flavoured Alcoholic Beverages and cider/perry. Each site was visited twice over a 1-month period with site features and content recorded using a pro-forma. The content of websites was then reviewed against the regulatory codes governing broadcast advertising of alcohol. It was found that 27 of 40 leading alcohol brands had a dedicated website. Sites featured sophisticated content, including sports and music sections, games, downloads and competitions. Case studies of two brand websites demonstrate the range of content features on such sites. A review of the application of regulatory codes covering traditional advertising found some content may breach the codes. Study findings illustrate the sophisticated range of content accessible on alcohol brand websites. When applying regulatory codes covering traditional alcohol marketing channels it is apparent that some content on alcohol brand websites would breach the codes. This suggests the regulation of alcohol brand websites may be an issue requiring attention from policymakers. Further research in this area would help inform this process. © 2010 Australasian Professional Society on Alcohol and other Drugs.
Facts and updates about cardiovascular non-coding RNAs in heart failure.
Thum, Thomas
2015-09-01
About 11% of all deaths include heart failure as a contributing cause. The annual cost of heart failure amounts to US $34 billion in the United States alone. With the exception of heart transplantation, there is no curative therapy available. Only occasionally does a new area in science develop into a completely new research field. The topic of non-coding RNAs, including microRNAs, long non-coding RNAs, and circular RNAs, is such a field. In this short review, we will discuss the latest developments concerning non-coding RNAs in cardiovascular disease. MicroRNAs are short regulatory non-coding endogenous RNA species that are involved in virtually all cellular processes. Long non-coding RNAs also regulate gene and protein levels, however, by much more complicated and diverse mechanisms. In general, non-coding RNAs have been shown to be of great value as therapeutic targets in adverse cardiac remodelling and also as diagnostic and prognostic biomarkers for heart failure. In the future, non-coding RNA-based therapeutics are likely to enter the clinical reality, offering a new treatment approach for heart failure.
PRISM Software: Processing and Review Interface for Strong‐Motion Data
Jones, Jeanne M.; Kalkan, Erol; Stephens, Christopher D.; Ng, Peter
2017-01-01
A continually increasing number of high‐quality digital strong‐motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey, as well as data from regional seismic networks within the United States, calls for automated processing of strong‐motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong‐motion records. When used without AQMS, PRISM provides batch‐processing capabilities. The PRISM software is platform independent (coded in Java), open source, and does not depend on any closed‐source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a review tool, which is a graphical user interface for manual review, edit, and processing. To facilitate use by non‐NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand‐alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible to accommodate implementation of new processing techniques. All the computing features have been thoroughly tested.
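For orientation, two of the most basic operations in any strong-motion processing chain are sketched below with synthetic data. This is illustrative only, not PRISM's actual algorithms, which include far more careful baseline correction, filtering, and quality checks:

```python
# Toy strong-motion processing: remove the baseline (mean) offset from
# an acceleration record, then integrate to velocity. Data are synthetic.
import numpy as np

dt = 0.01                                   # sample interval (s)
t = np.arange(0, 10, dt)
accel = np.sin(2 * np.pi * 1.5 * t) + 0.02  # synthetic record with offset

accel -= accel.mean()                       # baseline (mean) removal
vel = np.cumsum((accel[:-1] + accel[1:]) / 2) * dt  # trapezoidal integration
print(float(abs(vel).max()))                # peak velocity of the toy record
```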
Long non-coding RNAs involved in autophagy regulation
Yang, Lixian; Wang, Hanying; Shen, Qi; Feng, Lifeng; Jin, Hongchuan
2017-01-01
Autophagy degrades non-functioning or damaged proteins and organelles to maintain cellular homeostasis in a physiological or pathological context. Autophagy can be protective or detrimental, depending on its activation status and other conditions. Therefore, autophagy has a crucial role in a myriad of pathophysiological processes. From the perspective of autophagy-related (ATG) genes, the molecular dissection of autophagy process and the regulation of its level have been largely unraveled. However, the discovery of long non-coding RNAs (lncRNAs) provides a new paradigm of gene regulation in almost all important biological processes, including autophagy. In this review, we highlight recent advances in autophagy-associated lncRNAs and their specific autophagic targets, as well as their relevance to human diseases such as cancer, cardiovascular disease, diabetes and cerebral ischemic stroke. PMID:28981093
PEGASUS 5: An Automated Pre-Processor for Overset-Grid CFD
NASA Technical Reports Server (NTRS)
Rogers, Stuart E.; Suhs, Norman; Dietz, William; Rogers, Stuart; Nash, Steve; Chan, William; Tramel, Robert; Onufer, Jeff
2006-01-01
This viewgraph presentation reviews the use and requirements of PEGASUS 5. PEGASUS 5 is a code which performs a pre-processing step for the Overset CFD method. The code prepares the overset volume grids for the flow solver by computing the domain connectivity database and blanking out grid points which are contained inside a solid body. PEGASUS 5 successfully automates most of the overset process. It leads to a dramatic reduction in user input over previous generations of overset software. It can also lead to an order of magnitude reduction in both turn-around time and user expertise requirements. It is not, however, a "black-box" procedure; care must be taken to examine the resulting grid system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee
2016-09-01
This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.
Time Analysis: Still an Important Accountability Tool.
ERIC Educational Resources Information Center
Fairchild, Thomas N.; Seeley, Tracey J.
1994-01-01
Reviews benefits to school counselors of conducting a time analysis. Describes time analysis system that authors have used, including case illustration of how authors used data to effect counseling program changes. System described followed process outlined by Fairchild: identifying services, devising coding system, keeping records, synthesizing…
Pattern formation in early embryogenesis of Xenopus laevis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mglinets, V.A.
1995-07-01
Establishment of egg polarity, separation of germ layers, and the appearance of animal-vegetal, dorsoventral, and anteroposterior axes in Xenopus laevis embryos are considered. The control of these processes by genes coding for growth factors, proto-oncogenes, and homeobox-containing genes is also reviewed.
How do health care organizations take on best practices? A scoping literature review.
Innis, Jennifer; Dryden-Palmer, Karen; Perreira, Tyrone; Berta, Whitney
2015-12-01
The aims of this scoping literature review are to examine and summarize the organizational-level factors, context, and processes that influence the use of evidence-based practice in healthcare organizations. A scoping literature review was done to answer the question: What is known from the existing empirical literature about factors, context, and processes that influence the uptake, implementation, and sustainability of evidence-based practice in healthcare organizations? This review used the Arksey and O'Malley framework to describe findings and to identify gaps in the existing research literature. Inclusion and exclusion criteria were developed to screen studies. Relevant studies published between January 1991 and March 2014 were identified using four electronic databases. Study abstracts were screened for eligibility by two reviewers. Following this screening process, full-text articles were reviewed to determine the eligibility of the studies by the primary author. Eligible studies were then analyzed by coding findings with descriptive labels to distinguish elements that appeared relevant to this literature review. Coding was used to form categories, and these categories led to the development of themes. Thirty studies met the eligibility criteria for this literature review. The themes identified were: the process organizations use to select evidence-based practices for adoption, use of a needs assessment, linkage to the organization's strategic direction, organizational culture, the organization's internal social networks, resources (including education and training, presence of information technology, financial resources, resources for patient care, and staff qualifications), leadership, the presence of champions, standardization of processes, role clarity of staff, and the presence of social capital. Several gaps were identified by this review. There is a lack of research on how evidence-based practices may be sustained by organizations. Most of the research done to date has been cross-sectional. Longitudinal research would give insight into the relationship between organizational characteristics and the uptake, implementation, and sustainability of evidence-based practice. In addition, although it is clear that financial resources are required to implement evidence-based practice, existing studies contain a lack of detail about the cost of adopting and using new practices. This scoping review contains a number of implications for healthcare administrators, managers, and providers to consider when adopting and implementing evidence-based practices in healthcare organizations.
Imprinted and X-linked non-coding RNAs as potential regulators of human placental function
Buckberry, Sam; Bianco-Miotto, Tina; Roberts, Claire T
2014-01-01
Pregnancy outcome is inextricably linked to placental development, which is strictly controlled temporally and spatially through mechanisms that are only partially understood. However, increasing evidence suggests non-coding RNAs (ncRNAs) direct and regulate a considerable number of biological processes and therefore may constitute a previously hidden layer of regulatory information in the placenta. Many ncRNAs, including both microRNAs and long non-coding transcripts, show almost exclusive or predominant expression in the placenta compared with other somatic tissues and display altered expression patterns in placentas from complicated pregnancies. In this review, we explore the results of recent genome-scale and single gene expression studies using human placental tissue, but include studies in the mouse where human data are lacking. Our review focuses on the ncRNAs epigenetically regulated through genomic imprinting or X-chromosome inactivation and includes recent evidence surrounding the H19 lincRNA, the imprinted C19MC cluster microRNAs, and X-linked miRNAs associated with pregnancy complications. PMID:24081302
The Big Entity of New RNA World: Long Non-Coding RNAs in Microvascular Complications of Diabetes.
Raut, Satish K; Khullar, Madhu
2018-01-01
A major part of the genome is known to be transcribed into non-protein coding RNAs (ncRNAs), such as microRNA and long non-coding RNA (lncRNA). The importance of ncRNAs is being increasingly recognized in physiological and pathological processes. lncRNAs are a novel class of ncRNAs that do not code for proteins and are important regulators of gene expression. In the past, these molecules were thought to be transcriptional "noise" with low levels of evolutionary conservation. However, recent studies provide strong evidence indicating that lncRNAs are (i) regulated during various cellular processes, (ii) exhibit cell type-specific expression, (iii) localize to specific organelles, and (iv) associated with human diseases. Emerging evidence indicates an aberrant expression of lncRNAs in diabetes and diabetes-related microvascular complications. In the present review, we discuss the current state of knowledge of lncRNAs, their genesis from genome, and the mechanism of action of individual lncRNAs in the pathogenesis of microvascular complications of diabetes and therapeutic approaches.
Work motivation in health care: a scoping literature review.
Perreira, Tyrone A; Innis, Jennifer; Berta, Whitney
2016-12-01
The aim of this scoping literature review was to examine and summarize the factors, context, and processes that influence work motivation of health care workers. A scoping literature review was done to answer the question: What is known from the existing empirical literature about factors, context, and processes that influence work motivation of health care workers? This scoping review used the Arksey and O'Malley framework to describe and summarize findings. Inclusion and exclusion criteria were developed to screen studies. Relevant studies published between January 2005 and May 2016 were identified using five electronic databases. Study abstracts were screened for eligibility by two reviewers. Following this screening process, full-text articles were reviewed to determine the eligibility of the studies. Eligible studies were then evaluated by coding findings with descriptive labels to distinguish elements that appeared pertinent to this review. Coding was used to form groups, and these groups led to the development of themes. Twenty-five studies met the eligibility criteria for this literature review. The themes identified were work performance, organizational justice, pay, status, personal characteristics, work relationships (including bullying), autonomy, organizational identification, training, and meaningfulness of work. Most of the research involved the use of surveys. There is a need for more qualitative research and for the use of case studies to examine work motivation in health care organizations. All of the studies were cross-sectional. Longitudinal research would provide insight into how work motivation changes, and how it can be influenced and shaped. Several implications for practice were identified. There is a need to ensure that health care workers have access to training opportunities, and that autonomy is optimized. To improve work motivation, there is a need to address bullying and hostile behaviours in the workplace. Addressing the factors that influence work motivation in health care settings has the potential to influence the care that patients receive.
Levels of Syntactic Realization in Oral Reading.
ERIC Educational Resources Information Center
Brown, Eric
Two contrasting theories of reading are reviewed in light of recent research in psycholinguistics. A strictly "visual" model of fluent reading is contrasted with several mediational theories where auditory or articulatory coding is deemed necessary for comprehension. Surveying the research in visual information processing, oral reading,…
Role of non-coding RNAs in non-aging-related neurological disorders.
Vieira, A S; Dogini, D B; Lopes-Cendes, I
2018-06-11
Protein coding sequences represent only 2% of the human genome. Recent advances have demonstrated that a significant portion of the genome is actively transcribed as non-coding RNA molecules. These non-coding RNAs are emerging as key players in the regulation of biological processes, and act as "fine-tuners" of gene expression. Neurological disorders are caused by a wide range of genetic mutations, epigenetic and environmental factors, and the exact pathophysiology of many of these conditions is still unknown. It is currently recognized that dysregulations in the expression of non-coding RNAs are present in many neurological disorders and may be relevant in the mechanisms leading to disease. In addition, circulating non-coding RNAs are emerging as potential biomarkers with great potential impact in clinical practice. In this review, we discuss mainly the role of microRNAs and long non-coding RNAs in several neurological disorders, such as epilepsy, Huntington disease, fragile X-associated ataxia, spinocerebellar ataxias, amyotrophic lateral sclerosis (ALS), and pain. In addition, we give information about the conditions where microRNAs have demonstrated to be potential biomarkers such as in epilepsy, pain, and ALS.
Decision-making in schizophrenia: A predictive-coding perspective.
Sterzer, Philipp; Voss, Martin; Schlagenhauf, Florian; Heinz, Andreas
2018-05-31
Dysfunctional decision-making has been implicated in the positive and negative symptoms of schizophrenia. Decision-making can be conceptualized within the framework of hierarchical predictive coding as the result of a Bayesian inference process that uses prior beliefs to infer states of the world. According to this idea, prior beliefs encoded at higher levels in the brain are fed back as predictive signals to lower levels. Whenever these predictions are violated by the incoming sensory data, a prediction error is generated and fed forward to update beliefs encoded at higher levels. Well-documented impairments in cognitive decision-making support the view that these neural inference mechanisms are altered in schizophrenia. There is also extensive evidence relating the symptoms of schizophrenia to aberrant signaling of prediction errors, especially in the domain of reward and value-based decision-making. Moreover, the idea of altered predictive coding is supported by evidence for impaired low-level sensory mechanisms and motor processes. We review behavioral and neural findings from these research areas and provide an integrated view suggesting that schizophrenia may be related to a pervasive alteration in predictive coding at multiple hierarchical levels, including cognitive and value-based decision-making processes as well as sensory and motor systems. We relate these findings to decision-making processes and propose that varying degrees of impairment in the implicated brain areas contribute to the variety of psychotic experiences. Copyright © 2018 Elsevier Inc. All rights reserved.
Jones, Jeanne; Kalkan, Erol; Stephens, Christopher
2017-02-23
A continually increasing number of high-quality digital strong-motion records from stations of the National Strong-Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the United States, call for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong-motion records. When used without AQMS, PRISM provides batch-processing capabilities. The PRISM version 1.0.0 is platform independent (coded in Java), open source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine and a review tool that has a graphical user interface (GUI) to manually review, edit, and process records. To facilitate use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible in order to accommodate new processing techniques. This report provides a thorough description and examples of the record processing features supported by PRISM. All the computing features of PRISM have been thoroughly tested.
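A minimal sketch of the batch-processing pattern described above may help: records are processed automatically and only those failing simple quality checks are flagged for human review. All names, thresholds, and the fake data below are illustrative assumptions, not PRISM's actual API.

    import numpy as np

    def process_record(accel, dt, pga_flag_g=0.9, snr_flag=3.0):
        """Baseline-correct one record and decide whether it needs human review.
        Thresholds are hypothetical; real QC criteria would be tuned per network."""
        t = np.arange(len(accel)) * dt
        baseline = np.polyval(np.polyfit(t, accel, 1), t)  # linear baseline correction
        corrected = accel - baseline
        pga = np.max(np.abs(corrected)) / 9.81             # peak ground acceleration, g
        noise = np.std(corrected[: int(1.0 / dt)]) + 1e-12 # pre-event window noise proxy
        snr = np.max(np.abs(corrected)) / noise
        flagged = pga > pga_flag_g or snr < snr_flag
        return corrected, {"pga_g": pga, "snr": snr, "flagged": flagged}

    # Batch mode: process everything, route flagged records to the review tool.
    records = {f"rec{i}": np.random.randn(2000) * 0.05 for i in range(3)}  # fake data
    for name, accel in records.items():
        _, qc = process_record(accel, dt=0.01)
        print(name, qc)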
Jones, Lyell K; Ney, John P
2016-12-01
Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.
Schneider, Gary; Kachroo, Sumesh; Jones, Natalie; Crean, Sheila; Rotella, Philip; Avetisyan, Ruzan; Reynolds, Matthew W
2012-01-01
The Food and Drug Administration's (FDA) Mini-Sentinel pilot program aims to conduct active surveillance to refine safety signals that emerge for marketed medical products. A key facet of this surveillance is to develop and understand the validity of algorithms for identifying health outcomes of interest (HOIs) from administrative and claims data. This paper summarizes the process and findings of the algorithm review of erythema multiforme and related conditions. PubMed and Iowa Drug Information Service searches were conducted to identify citations applicable to the erythema multiforme HOI. Level 1 abstract reviews and Level 2 full-text reviews were conducted to find articles that used administrative and claims data to identify erythema multiforme, Stevens-Johnson syndrome, or toxic epidermal necrolysis and that included validation estimates of the coding algorithms. Our search revealed limited literature focusing on erythema multiforme and related conditions that provided administrative and claims data-based algorithms and validation estimates. Only four studies provided validated algorithms and all studies used the same International Classification of Diseases code, 695.1. Approximately half of cases subjected to expert review were consistent with erythema multiforme and related conditions. Updated research needs to be conducted on designing validation studies that test algorithms for erythema multiforme and related conditions and that take into account recent changes in the diagnostic coding of these diseases. Copyright © 2012 John Wiley & Sons, Ltd.
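As a rough illustration of the screen-then-validate pattern these Mini-Sentinel reviews describe, the sketch below filters a claims table on the ICD-9 code reported in the literature (695.1) and computes a positive predictive value against chart review. The DataFrame, column names, and review results are invented for illustration.

    import pandas as pd

    claims = pd.DataFrame({
        "patient_id": [1, 2, 3, 4],
        "icd9_code": ["695.1", "250.00", "695.1", "695.1"],
    })
    # Hypothetical expert chart-review results for the screened patients.
    chart_review = {1: True, 3: False, 4: True}

    screened = claims.loc[claims["icd9_code"] == "695.1", "patient_id"].unique()
    confirmed = sum(chart_review.get(pid, False) for pid in screened)
    ppv = confirmed / len(screened)  # positive predictive value of the algorithm
    print(f"screened={len(screened)}, confirmed={confirmed}, PPV={ppv:.2f}")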
Spectral fitting, shock layer modeling, and production of nitrogen oxides and excited nitrogen
NASA Technical Reports Server (NTRS)
Blackwell, H. E.
1991-01-01
An analysis was made of N2 emission from an 8.72 MJ/kg shock layer at the 2.54, 1.91, and 1.27 cm positions, and vibrational state distributions, temperatures, and relative electronic state populations were obtained from the data sets. Other recorded arc jet N2 and air spectral data were reviewed and NO emission characteristics were studied. A review of the operational procedures of the DSMC code was made. Information on other appropriate codes and modifications, including ionization, was compiled, and the applicability of the reviewed codes to the task requirements was determined. A review was also made of the computational procedures used in the CFD codes of Li and other codes on JSC computers. An analysis was made of the problems associated with integrating task-specific chemical kinetics into CFD codes.
The bioelectric code: An ancient computational medium for dynamic control of growth and form.
Levin, Michael; Martyniuk, Christopher J
2018-02-01
What determines large-scale anatomy? DNA does not directly specify geometrical arrangements of tissues and organs, and a process of encoding and decoding for morphogenesis is required. Moreover, many species can regenerate and remodel their structure despite drastic injury. The ability to obtain the correct target morphology from a diversity of initial conditions reveals that the morphogenetic code implements a rich system of pattern-homeostatic processes. Here, we describe an important mechanism by which cellular networks implement pattern regulation and plasticity: bioelectricity. All cells, not only nerves and muscles, produce and sense electrical signals; in vivo, these processes form bioelectric circuits that harness individual cell behaviors toward specific anatomical endpoints. We review emerging progress in reading and re-writing anatomical information encoded in bioelectrical states, and discuss the approaches to this problem from the perspectives of information theory, dynamical systems, and computational neuroscience. Cracking the bioelectric code will enable much-improved control over biological patterning, advancing basic evolutionary developmental biology as well as enabling numerous applications in regenerative medicine and synthetic bioengineering. Copyright © 2017 Elsevier B.V. All rights reserved.
The job of 'ethics committees'.
Moore, Andrew; Donnelly, Andrew
2015-11-13
What should authorities establish as the job of ethics committees and review boards? Two answers are: (1) review of proposals for consistency with the duly established and applicable code and (2) review of proposals for ethical acceptability. The present paper argues that these two jobs come apart in principle and in practice. On grounds of practicality, publicity and separation of powers, it argues that the relevant authorities do better to establish code-consistency review and not ethics-consistency review. It also rebuts bad code and independence arguments for the opposite view. It then argues that authorities at present variously specify both code-consistency and ethics-consistency jobs, but most are also unclear on this issue. The paper then argues that they should reform the job of review boards and ethics committees, by clearly establishing code-consistency review and disestablishing ethics-consistency review, and through related reform of the basic orientation, focus, name, and expertise profile of these bodies and their actions.
The Technology Review 10: Emerging Technologies that Will Change the World.
ERIC Educational Resources Information Center
Technology Review, 2001
2001-01-01
Identifies 10 emerging areas of technology that will soon have a profound impact on the economy and on how people live and work: brain-machine interfaces; flexible transistors; data mining; digital rights management; biometrics; natural language processing; microphotonics; untangling code; robot design; and microfluidics. In each area, one…
75 FR 8995 - Submission for OMB Review: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-26
... currently approved collection. Title of Collection: Occupational Code Assignment (OCA). OMB Control Number... (OCA), is provided as a public service for the states as well as for others who use occupational information. The OCA process is designed to help users relate an occupational specialty or a job title or to...
Jones, Natalie; Schneider, Gary; Kachroo, Sumesh; Rotella, Philip; Avetisyan, Ruzan; Reynolds, Matthew W
2012-01-01
The Food and Drug Administration's Mini-Sentinel pilot program initially aimed to conduct active surveillance to refine safety signals that emerge for marketed medical products. A key facet of this surveillance is to develop and understand the validity of algorithms for identifying health outcomes of interest (HOIs) from administrative and claims data. This paper summarizes the process and findings of the algorithm review of pulmonary fibrosis and interstitial lung disease. PubMed and Iowa Drug Information Service Web searches were conducted to identify citations applicable to the pulmonary fibrosis/interstitial lung disease HOI. Level 1 abstract reviews and Level 2 full-text reviews were conducted to find articles using administrative and claims data to identify pulmonary fibrosis and interstitial lung disease, including validation estimates of the coding algorithms. Our search revealed a deficiency of literature focusing on pulmonary fibrosis and interstitial lung disease algorithms and validation estimates. Only five studies provided codes; none provided validation estimates. Because interstitial lung disease includes a broad spectrum of diseases, including pulmonary fibrosis, the scope of these studies varied, as did the corresponding diagnostic codes used. Research needs to be conducted on designing validation studies to test pulmonary fibrosis and interstitial lung disease algorithms and estimating their predictive power, sensitivity, and specificity. Copyright © 2012 John Wiley & Sons, Ltd.
Schneider, Gary; Kachroo, Sumesh; Jones, Natalie; Crean, Sheila; Rotella, Philip; Avetisyan, Ruzan; Reynolds, Matthew W
2012-01-01
The Food and Drug Administration's Mini-Sentinel pilot program initially aims to conduct active surveillance to refine safety signals that emerge for marketed medical products. A key facet of this surveillance is to develop and understand the validity of algorithms for identifying health outcomes of interest from administrative and claims data. This article summarizes the process and findings of the algorithm review of anaphylaxis. PubMed and Iowa Drug Information Service searches were conducted to identify citations applicable to the anaphylaxis health outcome of interest. Level 1 abstract reviews and Level 2 full-text reviews were conducted to find articles using administrative and claims data to identify anaphylaxis and including validation estimates of the coding algorithms. Our search revealed limited literature focusing on anaphylaxis that provided administrative and claims data-based algorithms and validation estimates. Only four studies identified via literature searches provided validated algorithms; however, two additional studies were identified by Mini-Sentinel collaborators and were incorporated. The International Classification of Diseases, Ninth Revision, codes varied, as did the positive predictive value, depending on the cohort characteristics and the specific codes used to identify anaphylaxis. Research needs to be conducted on designing validation studies to test anaphylaxis algorithms and estimating their predictive power, sensitivity, and specificity. Copyright © 2012 John Wiley & Sons, Ltd.
Shepherd, Jonathan; Frampton, Geoff K; Pickett, Karen; Wyatt, Jeremy C
2018-01-01
To investigate methods and processes for timely, efficient and good quality peer review of research funding proposals in health. A two-stage evidence synthesis: (1) a systematic map to describe the key characteristics of the evidence base, followed by (2) a systematic review of the studies stakeholders prioritised as relevant from the map on the effectiveness and efficiency of peer review 'innovations'. Standard processes included literature searching, duplicate inclusion criteria screening, study keyword coding, data extraction, critical appraisal and study synthesis. A total of 83 studies from 15 countries were included in the systematic map. The evidence base is diverse, investigating many aspects of the systems for, and processes of, peer review. The systematic review included eight studies from Australia, Canada, and the USA, evaluating a broad range of peer review innovations. These studies showed that simplifying the process by shortening proposal forms, using smaller reviewer panels, or expediting processes can speed up the review process and reduce costs, but this might come at the expense of peer review quality, a key aspect that has not been assessed. Virtual peer review using videoconferencing or teleconferencing appears promising for reducing costs by avoiding the need for reviewers to travel, but again any consequences for quality have not been adequately assessed. There is increasing international research activity into the peer review of health research funding. The studies reviewed had methodological limitations and variable generalisability to research funders. Given these limitations it is not currently possible to recommend immediate implementation of these innovations. However, many appear promising based on existing evidence, and could be adapted as necessary by funders and evaluated. Where feasible, experimental evaluation, including randomised controlled trials, should be conducted, evaluating impact on effectiveness, efficiency and quality.
Pesch, Megan H; Lumeng, Julie C
2017-12-15
Behavioral coding of videotaped eating and feeding interactions can provide researchers with rich observational data and unique insights into eating behaviors, food intake, food selection as well as interpersonal and mealtime dynamics of children and their families. Unlike self-report measures of eating and feeding practices, the coding of videotaped eating and feeding behaviors can allow for the quantitative and qualitative examinations of behaviors and practices that participants may not self-report. While this methodology is increasingly more common, behavioral coding protocols and methodology are not widely shared in the literature. This has important implications for validity and reliability of coding schemes across settings. Additional guidance on how to design, implement, code and analyze videotaped eating and feeding behaviors could contribute to advancing the science of behavioral nutrition. The objectives of this narrative review are to review methodology for the design, operationalization, and coding of videotaped behavioral eating and feeding data in children and their families, and to highlight best practices. When capturing eating and feeding behaviors through analysis of videotapes, it is important for the study and coding to be hypothesis-driven. Study design considerations include how to best capture the target behaviors through selection of a controlled experimental laboratory environment versus home mealtime, duration of video recording, number of observations to achieve reliability across eating episodes, as well as technical issues in video recording and sound quality. Study design must also take into account plans for coding the target behaviors, which may include behavior frequency, duration, categorization or qualitative descriptors. Coding scheme creation and refinement occur through an iterative process. Reliability between coders can be challenging to achieve but is paramount to the scientific rigor of the methodology. The analysis approach depends on how the data were coded and collapsed. Behavioral coding of videotaped eating and feeding behaviors can capture rich data "in-vivo" that is otherwise unobtainable from self-report measures. While data collection and coding are time-intensive, the data yielded can be extremely valuable. Additional sharing of methodology and coding schemes around eating and feeding behaviors could advance the science and field.
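Since the review stresses that inter-coder reliability is paramount, a small worked example may help: Cohen's kappa is one standard chance-corrected agreement statistic for two coders assigning categorical labels to the same events. The labels below are made-up example data, not from the review.

    import numpy as np

    def cohens_kappa(labels_a, labels_b):
        """Chance-corrected agreement between two coders' categorical labels."""
        labels_a, labels_b = np.asarray(labels_a), np.asarray(labels_b)
        cats = np.unique(np.concatenate([labels_a, labels_b]))
        po = np.mean(labels_a == labels_b)  # observed agreement
        pe = sum(np.mean(labels_a == c) * np.mean(labels_b == c) for c in cats)
        return (po - pe) / (1 - pe)         # kappa: 1 = perfect, 0 = chance level

    coder1 = ["bite", "refuse", "bite", "prompt", "bite", "refuse"]
    coder2 = ["bite", "refuse", "prompt", "prompt", "bite", "bite"]
    print(f"kappa = {cohens_kappa(coder1, coder2):.2f}")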
Nummer, Brian A
2013-11-01
Kombucha is a fermented beverage made from brewed tea and sugar. The taste is slightly sweet and acidic and it may have residual carbon dioxide. Kombucha is consumed in many countries as a health beverage and it is gaining in popularity in the U.S. Consequently, many retailers and food service operators are seeking to brew this beverage on site. As a fermented beverage, kombucha would be categorized in the Food and Drug Administration model Food Code as a specialized process and would require a variance with submission of a food safety plan. This special report was created to assist both operators and regulators in preparing or reviewing a kombucha food safety plan.
Representational geometry: integrating cognition, computation, and the brain
Kriegeskorte, Nikolaus; Kievit, Rogier A.
2013-01-01
The cognitive concept of representation plays a key role in theories of brain information processing. However, linking neuronal activity to representational content and cognitive theory remains challenging. Recent studies have characterized the representational geometry of neural population codes by means of representational distance matrices, enabling researchers to compare representations across stages of processing and to test cognitive and computational theories. Representational geometry provides a useful intermediate level of description, capturing both the information represented in a neuronal population code and the format in which it is represented. We review recent insights gained with this approach in perception, memory, cognition, and action. Analyses of representational geometry can compare representations between models and the brain, and promise to explain brain computation as transformation of representational similarity structure. PMID:23876494
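For readers unfamiliar with representational distance matrices, the sketch below shows the basic computation in the spirit of this approach: pairwise dissimilarities between condition-wise response patterns, then a rank correlation comparing a model RDM to a "brain" RDM. The random data, sizes, and distance choice are assumptions for illustration only.

    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    brain = rng.standard_normal((8, 100))                # 8 conditions x 100 units
    model = brain + 0.5 * rng.standard_normal((8, 100))  # a noisy "model" of the code

    rdm_brain = squareform(pdist(brain, metric="correlation"))  # 1 - Pearson r
    rdm_model = squareform(pdist(model, metric="correlation"))

    # Compare only lower triangles: RDMs are symmetric with zero diagonals.
    tri = np.tril_indices(8, k=-1)
    rho, _ = spearmanr(rdm_brain[tri], rdm_model[tri])
    print(f"model-brain RDM similarity (Spearman rho) = {rho:.2f}")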
Deciphering Neural Codes of Memory during Sleep
Chen, Zhe; Wilson, Matthew A.
2017-01-01
Memories of experiences are stored in the cerebral cortex. Sleep is critical for consolidating hippocampal memory of wake experiences into the neocortex. Understanding representations of neural codes of hippocampal-neocortical networks during sleep would reveal important circuit mechanisms underlying memory consolidation and provide novel insights into memory and dreams. Although sleep-associated ensemble spike activity has been investigated, identifying the content of memory in sleep remains challenging. Here, we revisit important experimental findings on sleep-associated memory (i.e., neural activity patterns in sleep that reflect memory processing) and review computational approaches for analyzing sleep-associated neural codes (SANC). We focus on two analysis paradigms for sleep-associated memory, and propose a new unsupervised learning framework (“memory first, meaning later”) for unbiased assessment of SANC. PMID:28390699
Mixture and odorant processing in the olfactory systems of insects: a comparative perspective.
Clifford, Marie R; Riffell, Jeffrey A
2013-11-01
Natural olfactory stimuli are often complex mixtures of volatiles, of which the identities and ratios of constituents are important for odor-mediated behaviors. Despite this importance, the mechanism by which the olfactory system processes this complex information remains an area of active study. In this review, we describe recent progress in how odorants and mixtures are processed in the brain of insects. We use a comparative approach toward contrasting olfactory coding and the behavioral efficacy of mixtures in different insect species, and organize these topics around four sections: (1) Examples of the behavioral efficacy of odor mixtures and the olfactory environment; (2) mixture processing in the periphery; (3) mixture coding in the antennal lobe; and (4) evolutionary implications and adaptations for olfactory processing. We also include pertinent background information about the processing of individual odorants and comparative differences in wiring and anatomy, as these topics have been richly investigated and inform the processing of mixtures in the insect olfactory system. Finally, we describe exciting studies that have begun to elucidate the role of the processing of complex olfactory information in evolution and speciation.
Collaborating in the context of co-location: a grounded theory study.
Wener, Pamela; Woodgate, Roberta L
2016-03-10
Most individuals with mental health concerns seek care from their primary care provider, who may lack comfort, knowledge, and time to provide care. Interprofessional collaboration between providers improves access to primary mental health services and increases primary care providers' comfort offering these services. Building and sustaining interprofessional relationships is foundational to collaborative practice in primary care settings. However, little is known about the relationship building process within these collaborative relationships. The purpose of this grounded theory study was to gain a theoretical understanding of the interprofessional collaborative relationship-building process to guide health care providers and leaders as they integrate mental health services into primary care settings. Forty primary and mental health care providers completed a demographic questionnaire and participated in either an individual or group interview. Interviews were audio-recorded and transcribed verbatim. Transcripts were reviewed several times and then individually coded. Codes were reviewed and similar codes were collapsed to form categories using constant comparison. All codes and categories were discussed amongst the researchers, and the final categories and core category were agreed upon using constant comparison and consensus. A four-stage developmental interprofessional collaborative relationship-building model explained the emergent core category of Collaboration in the Context of Co-location. The four stages included 1) Looking for Help, 2) Initiating Co-location, 3) Fitting-in, and 4) Growing Reciprocity. A patient focus and communication strategies were essential processes throughout the interprofessional collaborative relationship-building process. Building interprofessional collaborative relationships amongst health care providers is essential to delivering mental health services in primary care settings. This developmental model describes the process of how these relationships are co-created and supported by the health care region. Furthermore, the model emphasizes that all providers must develop and sustain a patient focus and communication strategies that are flexible. Applying this model, health care providers can guide the creation and sustainability of primary care interprofessional collaborative relationships. Moreover, this model may guide health care leaders and policy makers as they initiate interprofessional collaborative practice in other health care settings.
NASA Astrophysics Data System (ADS)
Selker, J. S.; Roques, C.; Higgins, C. W.; Good, S. P.; Hut, R.; Selker, A.
2015-12-01
The confluence of 3-dimensional printing, low-cost solid-state sensors, low-cost low-power digital controllers (e.g., Arduinos), and open-source publishing (e.g., GitHub) is poised to transform environmental sensing. The Open-Source Published Environmental Sensing (OPENS) laboratory has launched and is available for all to use. OPENS combines cutting-edge technologies and makes them available to the global environmental sensing community. OPENS includes a Maker lab space where people may collaborate in person or virtually via an on-line forum for the publication and discussion of environmental sensing technology (Corvallis, Oregon, USA; please feel free to request a free reservation for space and equipment use). The physical lab houses a test-bed for sensors, as well as a complete classical machine shop, 3-D printers, electronics development benches, and workstations for code development. OPENS will provide a web-based formal publishing framework wherein global students and scientists can peer-review publish (with DOI) novel and evolutionary advancements in environmental sensor systems. This curated and peer-reviewed digital collection will include complete sets of "printable" parts and operating computer code for sensing systems. The physical lab will include all of the machines required to produce these sensing systems. These tools can be addressed in person or virtually, creating a truly global venue for advancement in monitoring earth's environment and agricultural systems. In this talk we will present an example of the design and publication process, using the design and data from the OPENS-Permeameter. The publication includes 3-D printing code, Arduino (or other control/logging platform) operational code, sample data sets, and a full discussion of the design set in the scientific context of previous related devices. Editors for the peer-review process are currently sought - contact John.Selker@Oregonstate.edu or Clement.Roques@Oregonstate.edu.
NASA Astrophysics Data System (ADS)
Hartling, K.; Ciungu, B.; Li, G.; Bentoumi, G.; Sur, B.
2018-05-01
Monte Carlo codes such as MCNP and Geant4 rely on a combination of physics models and evaluated nuclear data files (ENDF) to simulate the transport of neutrons through various materials and geometries. The grid representation used to represent the final-state scattering energies and angles associated with neutron scattering interactions can significantly affect the predictions of these codes. In particular, the default thermal scattering libraries used by MCNP6.1 and Geant4.10.3 do not accurately reproduce the ENDF/B-VII.1 model in simulations of the double-differential cross section for thermal neutrons interacting with hydrogen nuclei in a thin layer of water. However, agreement between model and simulation can be achieved within the statistical error by re-processing ENDF/B-VII.1 thermal scattering libraries with the NJOY code. The structure of the thermal scattering libraries and sampling algorithms in MCNP and Geant4 are also reviewed.
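To make the grid-resolution point concrete, here is a toy sketch of inverse-transform sampling of a final-state energy from a tabulated distribution: coarsening the tabulation grid visibly biases the sampled spectrum. The Maxwellian-like toy PDF and the grid sizes are assumptions for illustration, not an ENDF evaluation or the codes' actual sampling routines.

    import numpy as np

    def sample_from_grid(grid, pdf_vals, n, rng):
        """Inverse-transform sampling via linear interpolation of the tabulated CDF."""
        cdf = np.cumsum(pdf_vals)
        cdf = cdf / cdf[-1]
        return np.interp(rng.random(n), cdf, grid)

    pdf = lambda e: e * np.exp(-e / 0.0253)  # toy thermal spectrum, energy in eV
    rng = np.random.default_rng(1)
    for n_pts in (8, 64, 512):               # coarse -> fine tabulation grids
        grid = np.linspace(1e-4, 0.3, n_pts)
        samples = sample_from_grid(grid, pdf(grid), 100_000, rng)
        print(f"{n_pts:4d} grid points: mean sampled E = {samples.mean():.4f} eV")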
Computing Challenges in Coded Mask Imaging
NASA Technical Reports Server (NTRS)
Skinner, Gerald
2009-01-01
This slide presentation reviews the complications and challenges in developing computer systems for coded mask imaging telescopes. The coded mask technique is used when there is no other way to build the telescope (i.e., when wide fields of view, energies too high for focusing optics or too low for Compton/tracker techniques, and very good angular resolution are required). The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position-sensitive detectors used for coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern. The correlation with the mask pattern is described. The matrix approach is reviewed, and other approaches to image reconstruction are described. Included in the presentation is a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, comparison of the EXIST/HET with the SWIFT/BAT, and details of the design of the EXIST/HET.
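The correlation method mentioned above is easy to demonstrate. In the sketch below, each sky pixel casts a shifted shadow of the mask on the detector; cross-correlating the detector image with a rebalanced mask (+1 open, -1 closed, which cancels the flat background for a half-open mask) recovers an estimate of the sky. The random mask, toy sources, cyclic geometry, and noise level are assumptions for illustration.

    import numpy as np
    from scipy.signal import convolve2d, correlate2d

    rng = np.random.default_rng(2)
    mask = (rng.random((33, 33)) < 0.5).astype(float)  # ~50% open random mask
    sky = np.zeros((33, 33))
    sky[8, 8], sky[20, 25] = 1.0, 0.7                  # two point sources

    # Forward model: detector = sky shadowed through the mask (cyclic toy geometry).
    detector = convolve2d(sky, mask, mode="same", boundary="wrap")
    detector += 0.05 * rng.standard_normal(detector.shape)

    # Balanced correlation decoding: open elements +1, closed elements -1.
    decoder = 2 * mask - 1
    estimate = correlate2d(detector, decoder, mode="same", boundary="wrap")
    peak = np.unravel_index(np.argmax(estimate), estimate.shape)
    print("brightest reconstructed pixel:", peak)      # should match the (8, 8) source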
Wills, Peter R
2016-03-13
This article reviews contributions to this theme issue covering the topic 'DNA as information' in relation to the structure of DNA, the measure of its information content, the role and meaning of information in biology and the origin of genetic coding as a transition from uninformed to meaningful computational processes in physical systems. © 2016 The Author(s).
76 FR 24034 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
... discussed at workgroup meetings. In turn, CMS' HCPCS workgroup reaches a decision as to whether a change... Level II Codes. As a result, the National Panel was delineated and CMS continued with the decision-making process under its current structure, the CMS HCPCS Workgroup (herein referred to as ``the...
Scaling GDL for Multi-cores to Process Planck HFI Beams Monte Carlo on HPC
NASA Astrophysics Data System (ADS)
Coulais, A.; Schellens, M.; Duvert, G.; Park, J.; Arabas, S.; Erard, S.; Roudier, G.; Hivon, E.; Mottet, S.; Laurent, B.; Pinter, M.; Kasradze, N.; Ayad, M.
2014-05-01
After reviewing the major progress made in GDL (now at version 0.9.4) on performance and plotting capabilities since the ADASS XXI paper (Coulais et al. 2012), we detail how a large code for the Planck HFI beams Monte Carlo was successfully transposed from IDL to GDL on HPC systems.
Chronic myelogenous leukemia in eastern Pennsylvania: an assessment of registry reporting.
Mertz, Kristen J; Buchanich, Jeanine M; Washington, Terri L; Irvin-Barnwell, Elizabeth A; Woytowitz, Donald V; Smith, Roy E
2015-01-01
Chronic myelogenous leukemia (CML) has been reportable to the Pennsylvania Cancer Registry (PCR) since the 1980s, but the completeness of reporting is unknown. This study assessed CML reporting in eastern Pennsylvania where a cluster of another myeloproliferative neoplasm was previously identified. Cases were identified from 2 sources: 1) PCR case reports for residents of Carbon, Luzerne, or Schuylkill County with International Classification of Diseases for Oncology, Third Edition (ICD-O-3) codes 9875 (CML, BCR-ABL+), 9863 (CML, NOS), and 9860 (myeloid leukemia) and date of diagnosis 2001-2009, and 2) review of billing records at hematology practices. Participants were interviewed and their medical records were reviewed by board-certified hematologists. PCR reports included 99 cases coded 9875 or 9863 and 9 cases coded 9860; 2 additional cases were identified by review of billing records. Of the 110 identified cases, 93 were mailed consent forms, 23 consented, and 12 medical records were reviewed. Hematologists confirmed 11 of 12 reviewed cases as CML cases; all 11 confirmed cases were BCR/ABL positive, but only 1 was coded as positive (code 9875). Very few unreported CML cases were identified, suggesting relatively complete reporting to the PCR. Cases reviewed were accurately diagnosed, but ICD-O-3 coding often did not reflect BCR-ABL-positive tests. Cancer registry abstracters should look for these test results and code accordingly.
The Effects of Bar-coding Technology on Medication Errors: A Systematic Literature Review.
Hutton, Kevin; Ding, Qian; Wellman, Gregory
2017-02-24
Adoption of bar-coding technology has risen drastically in U.S. health systems in the past decade. However, few studies have addressed the impact of bar-coding technology with strong prospective methodologies, and the research that has been conducted covers both in-pharmacy and bedside implementations. This systematic literature review examines the effectiveness of bar-coding technology in preventing medication errors and the types of medication errors that may be prevented in the hospital setting. A systematic search of databases was performed from 1998 to December 2016. Studies measuring the effect of bar-coding technology on medication errors were included in a full-text review. Studies with outcomes other than medication errors, such as efficiency or workarounds, were excluded. The outcomes were measured and findings were summarized for each retained study. A total of 2603 articles were initially identified, and 10 studies, which used prospective before-and-after study designs, were fully reviewed in this article. Of the 10 included studies, 9 took place in the United States, whereas the remaining one was conducted in the United Kingdom. One research article focused on bar-coding implementation in a pharmacy setting, whereas the other 9 focused on bar coding within patient care areas. All 10 studies showed overall positive effects associated with bar-coding implementation. The results of this review show that bar-coding technology may reduce medication errors in hospital settings, particularly by preventing targeted wrong dose, wrong drug, wrong patient, unauthorized drug, and wrong route errors.
Non-coding RNAs in cardiac fibrosis: emerging biomarkers and therapeutic targets.
Chen, Zhongxiu; Li, Chen; Lin, Ke; Cai, Huawei; Ruan, Weiqiang; Han, Junyang; Rao, Li
2017-12-14
Non-coding RNAs (ncRNAs) are a class of RNA molecules that do not encode proteins. ncRNAs are involved in cell proliferation, apoptosis, differentiation, metabolism, and other physiological processes as well as the pathogenesis of diseases. Cardiac fibrosis is increasingly recognized as a common final pathway in advanced heart diseases. Many studies have shown that the occurrence and development of cardiac fibrosis is closely related to the regulation of ncRNAs. This review will highlight recent updates regarding the involvement of ncRNAs in cardiac fibrosis, and their potential as emerging biomarkers and therapeutic targets.
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.
Purple L1 Milestone Review Panel TotalView Debugger Functionality and Performance for ASC Purple
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfe, M
2006-12-12
ASC code teams require a robust software debugging tool to help developers quickly find bugs in their codes and get their codes running. Development debugging commonly runs up to 512 processes. Production jobs run up to full ASC Purple scale, and at times require introspection while running. Developers want a debugger that runs on all their development and production platforms and that works with all compilers and runtimes used with ASC codes. The TotalView Multiprocess Debugger made by Etnus was specified for ASC Purple to address this needed capability. The ASC Purple environment builds on the environment seen by TotalView on ASCI White. The debugger must now operate with the Power5 CPU, Federation switch, AIX 5.3 operating system including large pages, IBM compilers 7 and 9, POE 4.2 parallel environment, and rs6000 SLURM resource manager. Users require robust, basic debugger functionality with acceptable performance at development debugging scale. A TotalView installation must be provided at the beginning of the early user access period that meets these requirements. A functional enhancement, fast conditional data watchpoints, and a scalability enhancement, capability up to 8192 processes, are to be demonstrated.
Piepers, Daniel W.; Robbins, Rachel A.
2012-01-01
It is widely agreed that the human face is processed differently from other objects. However there is a lack of consensus on what is meant by a wide array of terms used to describe this “special” face processing (e.g., holistic and configural) and the perceptually relevant information within a face (e.g., relational properties and configuration). This paper will review existing models of holistic/configural processing, discuss how they differ from one another conceptually, and review the wide variety of measures used to tap into these concepts. In general we favor a model where holistic processing of a face includes some or all of the interrelations between features and has separate coding for features. However, some aspects of the model remain unclear. We propose the use of moving faces as a way of clarifying what types of information are included in the holistic representation of a face. PMID:23413184
The effect of multiple internal representations on context-rich instruction
NASA Astrophysics Data System (ADS)
Lasry, Nathaniel; Aulls, Mark W.
2007-11-01
We discuss n-coding, a theoretical model of multiple internal mental representations. The n-coding construct is developed from a review of cognitive and imaging data that demonstrates the independence of information processed along different modalities such as verbal, visual, kinesthetic, logico-mathematic, and social modalities. A study testing the effectiveness of the n-coding construct in classrooms is presented. Four sections differing in the level of n-coding opportunities were compared. Besides a traditional-instruction section used as a control group, each of the remaining three sections was given context-rich problems, which differed by the level of n-coding opportunities designed into their laboratory environment. To measure the effectiveness of the construct, problem-solving skills were assessed as conceptual learning using the force concept inventory. We also developed several new measures that take students' confidence in concepts into account. Our results show that the n-coding construct is useful in designing context-rich environments and can be used to increase learning gains in problem solving, conceptual knowledge, and concept confidence. Specifically, when using props in designing context-rich problems, we find n-coding to be a useful construct in guiding which additional dimensions need to be attended to.
Processing concrete words: fMRI evidence against a specific right-hemisphere involvement.
Fiebach, Christian J; Friederici, Angela D
2004-01-01
Behavioral, patient, and electrophysiological studies have been taken as support for the assumption that processing of abstract words is confined to the left hemisphere, whereas concrete words are processed also by right-hemispheric brain areas. These are thought to provide additional information from an imaginal representational system, as postulated in the dual-coding theory of memory and cognition. Here we report new event-related fMRI data on the processing of concrete and abstract words in a lexical decision task. While abstract words activated a subregion of the left inferior frontal gyrus (BA 45) more strongly than concrete words, specific activity for concrete words was observed in the left basal temporal cortex. These data as well as data from other neuroimaging studies reviewed here are not compatible with the assumption of a specific right-hemispheric involvement for concrete words. The combined findings rather suggest a revised view of the neuroanatomical bases of the imaginal representational system assumed in the dual-coding theory, at least with respect to word recognition.
Efficiency turns the table on neural encoding, decoding and noise.
Deneve, Sophie; Chalk, Matthew
2016-04-01
Sensory neurons are usually described with an encoding model, for example, a function that predicts their response from the sensory stimulus using a receptive field (RF) or a tuning curve. However, central to theories of sensory processing is the notion of 'efficient coding'. We argue here that efficient coding implies a completely different neural coding strategy. Instead of a fixed encoding model, neural populations would be described by a fixed decoding model (i.e. a model reconstructing the stimulus from the neural responses). Because the population solves a global optimization problem, individual neurons are variable, but not noisy, and have no truly invariant tuning curve or receptive field. We review recent experimental evidence and implications for neural noise correlations, robustness and adaptation. Copyright © 2016. Published by Elsevier Ltd.
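To make the contrast concrete, here is a toy sketch in the spirit of this "fixed decoder" view: the stimulus estimate is a fixed linear readout of spikes, and each neuron fires only when its spike reduces the reconstruction error. Which neuron fires on a given trial is then variable even though the population reconstruction is reliable. The network size, decoder weights, and spike cost are invented parameters, not the authors' model.

    import numpy as np

    rng = np.random.default_rng(3)
    n_neurons, dim = 50, 5
    D = rng.standard_normal((dim, n_neurons))  # fixed decoding weights (the decoder)
    D /= np.linalg.norm(D, axis=0)             # unit-norm decoding vectors

    def encode(x, steps=200):
        """Greedy spike coding: emit the spike that most reduces ||x - D s||^2."""
        s = np.zeros(n_neurons)
        for _ in range(steps):
            residual = x - D @ s
            gains = D.T @ residual - 0.5       # 0.5 acts as a per-spike cost
            j = int(np.argmax(gains))
            if gains[j] <= 0:                  # no spike improves the code: stop
                break
            s[j] += 1
        return s

    x = rng.standard_normal(dim)               # a random stimulus
    s = encode(x)
    print("reconstruction error:", round(float(np.linalg.norm(x - D @ s)), 3))
    print("spikes used:", int(s.sum()))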
Kimia, Amir A; Savova, Guergana; Landschaft, Assaf; Harper, Marvin B
2015-07-01
Electronically stored clinical documents may contain both structured and unstructured data. The use of structured clinical data varies by facility, but clinicians are familiar with coded data such as International Classification of Diseases, Ninth Revision, and Systematized Nomenclature of Medicine-Clinical Terms codes, and commonly with other data including patient chief complaints or laboratory results. Most electronic health records store much more clinical information as unstructured data; for example, clinical narratives such as the history of present illness, procedure notes, and clinical decision making are stored as free text. Despite the importance of this information, electronic capture or retrieval of unstructured clinical data has been challenging. The field of natural language processing (NLP) is undergoing rapid development, and existing tools can be successfully used for quality improvement, research, healthcare coding, and even billing compliance. In this brief review, we provide examples of successful uses of NLP using emergency medicine physician visit notes for various projects and the challenges of retrieving specific data, and finally present practical methods that can run on a standard personal computer as well as high-end state-of-the-art funded processes run by leading NLP informatics researchers.
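As a flavor of the kind of lightweight method that runs on a standard personal computer, the sketch below does regular-expression concept matching on a visit note with a crude negation check. The term list, note text, and 40-character negation window are invented illustrations, not the authors' pipeline.

    import re

    TERMS = re.compile(r"\b(head injury|loss of consciousness|vomiting)\b", re.I)
    NEGATION = re.compile(r"\b(no|denies|without)\b[^.]{0,40}$", re.I)

    def extract_concepts(note):
        """Return (term, affirmed/negated) pairs found in a clinical note."""
        hits = []
        for m in TERMS.finditer(note):
            window = note[max(0, m.start() - 40):m.start()]  # text before the match
            negated = bool(NEGATION.search(window))
            hits.append((m.group(0).lower(), "negated" if negated else "affirmed"))
        return hits

    note = ("HPI: 7yo fell from scooter with brief loss of consciousness. "
            "Denies vomiting. No further head injury since arrival.")
    print(extract_concepts(note))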
Amoroso, P J; Smith, G S; Bell, N S
2000-04-01
Accurate injury cause data are essential for injury prevention research. U.S. military hospitals, unlike civilian hospitals, use the NATO STANAG system for cause-of-injury coding. Reported deficiencies in civilian injury cause data suggested a need to specifically evaluate the STANAG. The Total Army Injury and Health Outcomes Database (TAIHOD) was used to evaluate worldwide Army injury hospitalizations, especially STANAG Trauma, Injury, and Place of Occurrence coding. We conducted a review of hospital procedures at Tripler Army Medical Center (TAMC) including injury cause and intent coding, potential crossover between acute injuries and musculoskeletal conditions, and data for certain hospital patients who are not true admissions. We also evaluated the use of free-text injury comment fields in three hospitals. Army-wide review of injury records coding revealed full compliance with cause coding, although nonspecific codes appeared to be overused. A small but intensive single hospital records review revealed relatively poor intent coding but good activity and cause coding. Data on specific injury history were present on most acute injury records and 75% of musculoskeletal conditions. Place of Occurrence coding, although inherently nonspecific, was over 80% accurate. Review of text fields produced additional details of the injuries in over 80% of cases. STANAG intent coding specificity was poor, while coding of cause of injury was at least comparable to civilian systems. The strengths of military hospital data systems are an exceptionally high compliance with injury cause coding, the availability of free text, and capture of all population hospital records without regard to work-relatedness. Simple changes in procedures could greatly improve data quality.
Development and Use of Health-Related Technologies in Indigenous Communities: Critical Review
Jacklin, Kristen; O'Connell, Megan E
2017-01-01
Background Older Indigenous adults encounter multiple challenges as their age intersects with health inequities. Research suggests that a majority of older Indigenous adults prefer to age in place, and they will need culturally safe assistive technologies to do so. Objective The aim of this critical review was to examine literature concerning use, adaptation, and development of assistive technologies for health purposes by Indigenous peoples. Methods Working within Indigenous research methodologies and from a decolonizing approach, searches of peer-reviewed academic and gray literature dated to February 2016 were conducted using keywords related to assistive technology and Indigenous peoples. Sources were reviewed and coded thematically. Results Of the 34 sources captured, only 2 concerned technology specifically for older Indigenous adults. Studies detailing technology with Indigenous populations of all ages originated primarily from Canada (n=12), Australia (n=10), and the United States (n=9) and were coded to four themes: meaningful user involvement and community-based processes in development, the digital divide, Indigenous innovation in technology, and health technology needs as holistic and interdependent. Conclusions A key finding is the necessity of meaningful user involvement in technology development, especially in communities struggling with the digital divide. In spite of, or perhaps because of this divide, Indigenous communities are enthusiastically adapting mobile technologies to suit their needs in creative, culturally specific ways. This enthusiasm and creativity, coupled with the extensive experience many Indigenous communities have with telehealth technologies, presents opportunity for meaningful, culturally safe development processes. PMID:28729237
USDA-ARS?s Scientific Manuscript database
A specific class of endogenous, non-coding RNAs, classified as microRNAs (miRNAs), has been identified. It has been found that miRNAs are associated with many biological processes and disease states, including all stages of cancer from initiation to tumor promotion and progression. These studies d...
ERIC Educational Resources Information Center
Taylor, Karen A.
This review of the literature and annotated bibliography summarizes the available research relating to teaching programming to high school students. It is noted that, while the process of programming a computer could be broken down into five steps--problem definition, algorithm design, code writing, debugging, and documentation--current research…
Statement Before the U. S. Senate Committee on Commerce Subcommittee on Communications.
ERIC Educational Resources Information Center
Schneider, Alfred R.
The American Broadcasting Company's (ABC) Department of Standards and Practices follows a precise and detailed series of steps in its review of material presented over the network, to assure its conformity with the Television Code of the National Association of Broadcasters. In this process, special attention is given to programs which contain…
Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses
ERIC Educational Resources Information Center
Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan
2013-01-01
Given the increasing importance of soft skills in the computing profession, there is good reason to provide students withmore opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…
Differential expression and emerging functions of non-coding RNAs in cold adaptation.
Frigault, Jacques J; Morin, Mathieu D; Morin, Pier Jr
2017-01-01
Several species undergo substantial physiological and biochemical changes to confront the harsh conditions associated with winter. Small mammalian hibernators and cold-hardy insects are examples of natural models of cold adaptation that have been amply explored. While the molecular picture associated with cold adaptation has started to become clearer in recent years, notably through the use of high-throughput experimental approaches, the underlying cold-associated functions attributed to several non-coding RNAs, including microRNAs (miRNAs) and long non-coding RNAs (lncRNAs), remain to be better characterized. Nevertheless, key pioneering work has provided clues on the likely relevance of these molecules in cold adaptation. With an emphasis on mammalian hibernation and insect cold hardiness, this work first reviews various molecular changes documented so far in these processes. The cascades leading to miRNA and lncRNA production as well as the mechanisms of action of these non-coding RNAs are subsequently described. Finally, we present examples of differentially expressed non-coding RNAs in models of cold adaptation and elaborate on the potential significance of this modulation with respect to low-temperature adaptation.
Quantum image pseudocolor coding based on the density-stratified method
NASA Astrophysics Data System (ADS)
Jiang, Nan; Wu, Wenya; Wang, Luo; Zhao, Na
2015-05-01
Pseudocolor processing is a branch of image enhancement. It dyes grayscale images into color images to make them more attractive or to highlight regions of interest. This paper proposes a quantum image pseudocolor coding scheme based on the density-stratified method, which defines a colormap and maps density values from gray to color in parallel according to that colormap. Firstly, two data structures, the quantum image GQIR and the quantum colormap QCR, are reviewed or proposed. Then, the quantum density-stratified algorithm is presented. Based on them, the quantum realization in the form of circuits is given. The main advantages of the quantum version of pseudocolor processing over the classical approach are that it needs less memory and can speed up the computation. Two kinds of examples help us to describe the scheme further. Finally, future work is discussed.
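For readers unfamiliar with density stratification, a classical analogue is easy to sketch: bin the gray levels into strata and map each stratum to a colormap entry. The quantum scheme performs this mapping in parallel on superposed pixels; the sketch below shows only the underlying classical transform, with an invented 4-entry colormap and toy image.

    import numpy as np

    gray = np.random.default_rng(4).integers(0, 256, size=(4, 4))  # toy 8-bit image
    colormap = np.array([[0, 0, 128],    # darkest stratum  -> dark blue
                         [0, 192, 0],    #                  -> green
                         [255, 255, 0],  #                  -> yellow
                         [255, 0, 0]])   # brightest stratum -> red
    strata = np.digitize(gray, bins=[64, 128, 192])  # 4 density strata
    pseudocolor = colormap[strata]                   # (4, 4, 3) RGB image
    print(pseudocolor.shape, pseudocolor[0, 0])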
Long Noncoding RNAs: a New Regulatory Code in Metabolic Control
Zhao, Xu-Yun; Lin, Jiandie D.
2015-01-01
Long noncoding RNAs (lncRNAs) are emerging as an integral part of the regulatory information encoded in the genome. LncRNAs possess the unique capability to interact with nucleic acids and proteins and exert discrete effects on numerous biological processes. Recent studies have delineated multiple lncRNA pathways that control metabolic tissue development and function. The expansion of the regulatory code that links nutrient and hormonal signals to tissue metabolism gives new insights into the genetic and pathogenic mechanisms underlying metabolic disease. This review discusses lncRNA biology with a focus on its role in the development, signaling, and function of key metabolic tissues. PMID:26410599
MODEST: A Tool for Geodesy and Astronomy
NASA Technical Reports Server (NTRS)
Sovers, Ojars J.; Jacobs, Christopher S.; Lanyi, Gabor E.
2004-01-01
Features of the JPL VLBI modeling and estimation software "MODEST" are reviewed. Its main advantages include thoroughly documented model physics, portability, and detailed error modeling. Two unique models are included: modeling of source structure and modeling of both spatial and temporal correlations in tropospheric delay noise. History of the code parallels the development of the astrometric and geodetic VLBI technique and the software retains many of the models implemented during its advancement. The code has been traceably maintained since the early 1980s, and will continue to be updated with recent IERS standards. Scripts are being developed to facilitate user-friendly data processing in the era of e-VLBI.
Mikhailov, Alexander T; Torrado, Mario
2018-05-12
There is growing evidence that putative gene regulatory networks including cardio-enriched transcription factors, such as PITX2, TBX5, ZFHX3, and SHOX2, and their effector/target genes along with downstream non-coding RNAs can play a potentially important role in the process of adaptive and maladaptive atrial rhythm remodeling. In turn, expression of atrial fibrillation-associated transcription factors is under the control of upstream regulatory non-coding RNAs. This review broadly explores gene regulatory mechanisms associated with susceptibility to atrial fibrillation, with key examples from both animal models and patients, within the context of both cardiac transcription factors and non-coding RNAs. These two systems appear to have multiple levels of cross-regulation and act coordinately to achieve effective control of atrial rhythm effector gene expression. Perturbations of a dynamic expression balance between transcription factors and corresponding non-coding RNAs can provoke the development or promote the progression of atrial fibrillation. We also outline deficiencies in current models and discuss ongoing studies to clarify remaining mechanistic questions. An understanding of the function of transcription factors and non-coding RNAs in gene regulatory networks associated with atrial fibrillation risk will enable the development of innovative therapeutic strategies.
Bowers, Jeffrey S
2009-01-01
A fundamental claim associated with parallel distributed processing (PDP) theories of cognition is that knowledge is coded in a distributed manner in mind and brain. This approach rejects the claim that knowledge is coded in a localist fashion, in which words, objects, and simple concepts (e.g., "dog") are coded with their own dedicated representations. One of the putative advantages of this approach is that the theories are biologically plausible. Indeed, advocates of the PDP approach often highlight the close parallels between distributed representations learned in connectionist models and neural coding in the brain, and often dismiss localist (grandmother cell) theories as biologically implausible. The author reviews a range of data that strongly challenge this claim and shows that localist models provide a better account of single-cell recording studies. The author also contrasts localist and alternative distributed coding schemes (sparse and coarse coding) and argues that the common rejection of grandmother cell theories in neuroscience is due to a misunderstanding about how localist models behave. The author concludes that the localist representations embedded in theories of perception and cognition are consistent with neuroscience; biology only calls into question the distributed representations often learned in PDP models.
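The contrast at issue can be made concrete with a toy sketch (our own illustration, not a model from the paper): a localist code dedicates one unit per concept, while a distributed code spreads each concept over shared units and needs a readout step.

```python
import numpy as np

# Localist: one dedicated unit per concept (a "grandmother cell" per item).
concepts = ["dog", "cat", "car"]
localist = np.eye(len(concepts))          # "dog" -> [1, 0, 0]

# Distributed: each concept is a pattern of activity over shared units.
rng = np.random.default_rng(0)
distributed = rng.normal(size=(3, 8))     # "dog" -> a pattern over 8 units

# A localist unit signals its concept directly; a distributed code needs a
# readout, e.g. the nearest stored pattern by dot product.
probe = distributed[0] + 0.1 * rng.normal(size=8)     # noisy "dog" input
print(concepts[int(np.argmax(distributed @ probe))])  # -> 'dog'
```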
Forsyth, Stewart
2013-06-01
Infant feeding policy and practice continues to be a contentious area of global health care. The infant formula industry is widely considered to be the bête noire, with frequent claims that it adopts marketing and sales practices that are not compliant with the WHO Code. However, failure to resolve these issues over three decades suggests that there may be wider systemic failings. Review of published papers, commentaries and reports relating to the implementation and governance of the WHO Code with specific reference to issues of non-compliance. The analysis set out in this paper indicates that there are systemic failings at all levels of the implementation and monitoring process, including the failure of WHO to successfully 'urge' governments to implement the Code in its entirety; a lack of political will by Member States to implement and monitor the Code; and a lack of formal and transparent governance structures. Non-compliance with the WHO Code is not confined to the infant formula industry, and several actions are identified, including the need to address issues of partnership working and the establishment of governance systems that are robust, independent and transparent.
Vo, Elaine; Davila, Jessica A; Hou, Jason; Hodge, Krystle; Li, Linda T; Suliburk, James W; Kao, Lillian S; Berger, David H; Liang, Mike K
2013-08-01
Large databases provide a wealth of information for researchers, but identifying patient cohorts often relies on the use of current procedural terminology (CPT) codes. In particular, studies of stoma surgery have been limited by the accuracy of CPT codes in identifying and differentiating ileostomy procedures from colostomy procedures. It is important to make this distinction because the prevalence of complications associated with stoma formation and reversal differs dramatically between types of stoma. Natural language processing (NLP) is a technique that allows text-based searching of clinical documents. The Automated Retrieval Console is NLP-based software that allows investigators to design and perform NLP-assisted document classification. In this study, we evaluated the role of CPT codes and NLP in differentiating ileostomy from colostomy procedures. Using CPT codes, we conducted a retrospective study that identified all patients undergoing a stoma-related procedure at a single institution between January 2005 and December 2011. All operative reports during this time were reviewed manually to abstract the following variables: formation or reversal and ileostomy or colostomy. Sensitivity and specificity for validation of the CPT codes against the master surgery schedule were calculated. Operative reports were evaluated by use of NLP to differentiate ileostomy- from colostomy-related procedures. Sensitivity and specificity for identifying patients with ileostomy or colostomy procedures were calculated for CPT codes and NLP for the entire cohort. CPT codes performed well in identifying stoma procedures (sensitivity 87.4%, specificity 97.5%). A total of 664 stoma procedures were identified by CPT codes between 2005 and 2011. The CPT codes were adequate in identifying stoma formation (sensitivity 97.7%, specificity 72.4%) and stoma reversal (sensitivity 74.1%, specificity 98.7%), but they were inadequate in identifying ileostomy (sensitivity 35.0%, specificity 88.1%) and colostomy (sensitivity 75.2%, specificity 80.9%). NLP performed with greater sensitivity, specificity, and accuracy than CPT codes in identifying stoma procedures and stoma types. Major differences where NLP outperformed CPT codes included identifying ileostomy (specificity 95.8%, sensitivity 88.3%, and accuracy 91.5%) and colostomy (97.6%, 90.5%, and 92.8%, respectively). CPT codes can effectively identify patients who have had stoma procedures and are adequate in distinguishing between formation and reversal; however, CPT codes cannot differentiate ileostomy from colostomy. NLP can be used to differentiate between ileostomy- and colostomy-related procedures. The role of NLP in conjunction with electronic medical records in data retrieval warrants further investigation. Published by Mosby, Inc.
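For readers unfamiliar with the reported metrics, the sketch below shows how sensitivity and specificity are computed from confusion-matrix counts; the counts are invented for illustration and are not taken from the study.

```python
# Standard definitions of the metrics used throughout the abstract above.
def sensitivity(tp, fn):
    return tp / (tp + fn)   # true positive rate

def specificity(tn, fp):
    return tn / (tn + fp)   # true negative rate

# Hypothetical counts for ileostomy identification by CPT codes.
tp, fn, tn, fp = 35, 65, 88, 12
print(f"sensitivity {sensitivity(tp, fn):.1%}, specificity {specificity(tn, fp):.1%}")
# -> sensitivity 35.0%, specificity 88.0%
```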
Birth, coming of age and death: The intriguing life of long noncoding RNAs.
Samudyata; Castelo-Branco, Gonçalo; Bonetti, Alessandro
2018-07-01
Mammalian genomes are pervasively transcribed, with long noncoding RNAs being the most abundant fraction. Recent studies have highlighted the central role played by these transcripts in several physiological and pathological processes. Despite several metabolic features shared between coding and noncoding transcripts, these two classes of RNAs exhibit multiple differences regarding their biogenesis and processing. Here we review such distinctions, focusing on the unique features of specific long noncoding RNAs. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Mandala, Mahender Arjun
A cornerstone of design and design education is frequent situated feedback. With increasing class sizes and shrinking financial and human resources, providing rich feedback to students becomes increasingly difficult. In the field of writing, web-based peer review--the process of utilizing equal-status learners within a class to provide feedback to each other on their work using networked computing systems--has been shown to be a reliable and valid source of feedback in addition to improving student learning. Designers communicate in myriad ways, using the many languages of design and combining visual and descriptive information. This complex discourse of design intent makes peer reviews by design students ambiguous and often not helpful to the receivers of the feedback. Furthermore, engaging students in the review process itself is often difficult. Teams can draw on individual diversity and may help novice designers collectively resolve complex tasks. However, teams often incur production losses and may be affected by individual biases. In the current work, we look at utilizing collaborative teams of reviewers, working collectively and synchronously, to generate web-based peer reviews in a sophomore engineering design class. Students participated in a cross-over design, conducting peer reviews as individuals and as collaborative teams in parallel sequences. Raters coded the feedback generated on the basis of its appropriateness and accuracy. Self-report surveys and passive observation of teams conducting reviews captured student opinion on the process, its value, and the contrasting experience they had conducting team and individual reviews. We found that team reviews generated better-quality feedback than individual reviews. Furthermore, students preferred conducting reviews in teams, finding the process 'fun' and engaging. We observed several learning benefits of using collaboration in reviewing, including improved understanding of the assessment criteria, roles, and expectations, and increased team reflection. These results provide insight into how instructors and researchers can improve the review process and form a basis for future research in this area. With respect to facilitating the peer review process in design-based classrooms, we also present recommendations, supported by research and practical experience, for effective review system design and implementation in the classroom.
Decoding the ubiquitous role of microRNAs in neurogenesis.
Nampoothiri, Sreekala S; Rajanikant, G K
2017-04-01
Neurogenesis generates fledgling neurons that mature to form an intricate neuronal circuitry. The controversy over adult neurogenesis was largely resolved in the past decade, and the field became one of the most widely explored domains for identifying multifaceted mechanisms bridging neurodevelopment and neuropathology. Neurogenesis encompasses multiple processes including neural stem cell proliferation, neuronal differentiation, and cell fate determination. Each neurogenic process is specifically governed by manifold signaling pathways, several growth factors, and coding and non-coding RNAs. A class of small non-coding RNAs, microRNAs (miRNAs), is ubiquitously expressed in the brain and has emerged as a potent regulator of neurogenesis. It functions by fine-tuning the expression of specific neurogenic gene targets at the post-transcriptional level and modulates the development of mature neurons from neural progenitor cells. Besides the commonly discussed intrinsic factors, neuronal morphogenesis is also under the control of several extrinsic temporal cues, which in turn are regulated by miRNAs. This review highlights the Dicer-controlled switch from neurogenesis to gliogenesis, miRNA regulation of neuronal maturation, and the differential expression of miRNAs in response to various extrinsic cues affecting neurogenesis.
microRNA Therapeutics in Cancer - An Emerging Concept.
Shah, Maitri Y; Ferrajoli, Alessandra; Sood, Anil K; Lopez-Berestein, Gabriel; Calin, George A
2016-10-01
MicroRNAs (miRNAs) are an evolutionarily conserved class of small, regulatory non-coding RNAs that negatively regulate the expression of protein-coding genes and other non-coding transcripts. miRNAs have been established as master regulators of cellular processes, and they play a vital role in tumor initiation, progression and metastasis. Further, widespread deregulation of microRNAs has been reported in several cancers, with several microRNAs playing oncogenic or tumor suppressive roles. Based on these findings, miRNAs have emerged as promising therapeutic tools for cancer management. In this review, we focus on the roles of miRNAs in tumorigenesis, the miRNA-based therapeutic strategies currently being evaluated for use in cancer, and the advantages of and current challenges to their use in the clinic. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Deciphering Neural Codes of Memory during Sleep.
Chen, Zhe; Wilson, Matthew A
2017-05-01
Memories of experiences are stored in the cerebral cortex. Sleep is critical for the consolidation of hippocampal memory of wake experiences into the neocortex. Understanding representations of neural codes of hippocampal-neocortical networks during sleep would reveal important circuit mechanisms in memory consolidation and provide novel insights into memory and dreams. Although sleep-associated ensemble spike activity has been investigated, identifying the content of memory in sleep remains challenging. Here we revisit important experimental findings on sleep-associated memory (i.e., neural activity patterns in sleep that reflect memory processing) and review computational approaches to the analysis of sleep-associated neural codes (SANCs). We focus on two analysis paradigms for sleep-associated memory and propose a new unsupervised learning framework ('memory first, meaning later') for unbiased assessment of SANCs. Copyright © 2017 Elsevier Ltd. All rights reserved.
Staggering of angular momentum distribution in fission
NASA Astrophysics Data System (ADS)
Tamagno, Pierre; Litaize, Olivier
2018-03-01
We review here the role of angular momentum distributions in the fission process. To do so, the algorithm implemented in the FIFRELIN code is detailed, with special emphasis on the place of fission fragment angular momenta. The Rayleigh distribution usually used for the angular momentum distribution is presented and the related model derivation is recalled. Arguments are given to justify why this distribution should not hold at low excitation energy of the fission fragments. An alternative ad hoc expression taking low-lying collectiveness into account is presented, as implemented in the FIFRELIN code. Yet no dramatic impact has been found on the observables currently provided by the code. To quantify the magnitude of the impact of the low-lying staggering in the angular momentum distribution, a textbook case is considered: the decay of the 144Ba nucleus at low excitation energy.
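For orientation, a minimal sketch of the kind of discrete spin distribution described above follows, with an ad hoc even-odd staggering factor standing in for the low-lying-collectiveness correction; sigma, eps, and the J grid are illustrative choices, not FIFRELIN's actual parameters.

```python
import numpy as np

# Rayleigh-like discrete spin distribution,
# P(J) ~ (2J+1) * exp(-J(J+1) / (2*sigma**2)),
# with an ad hoc even-odd staggering factor eps (illustrative only).
def spin_probabilities(sigma, eps=0.0, j_max=30):
    j = np.arange(j_max + 1, dtype=float)
    p = (2 * j + 1) * np.exp(-j * (j + 1) / (2 * sigma**2))
    p *= np.where(j % 2 == 0, 1 + eps, 1 - eps)   # stagger even vs. odd J
    return p / p.sum()

rng = np.random.default_rng(1)
p = spin_probabilities(sigma=6.0, eps=0.3)
print(rng.choice(len(p), size=5, p=p))   # five sampled fragment spins
```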
Generating Code Review Documentation for Auto-Generated Mission-Critical Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2009-01-01
Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements and formally verifies that the auto-generated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.
Sonea, Laura; Buse, Mihail; Gulei, Diana; Onaciu, Anca; Simon, Ioan; Braicu, Cornelia; Berindan-Neagoe, Ioana
2018-05-01
Lung cancer remains the leading cause of cancer-related mortality worldwide and needs to be further investigated to improve these dramatic, unfavorable statistics. Non-coding RNAs (ncRNAs) have been shown to be important cellular regulatory factors, and the alteration of their expression levels has been correlated with an extensive number of pathologies. Specifically, their expression profiles are correlated with the development and progression of lung cancer, generating great interest for further investigation. This review focuses on the complex role of non-coding RNAs, namely miRNAs, piwi-interacting RNAs, small nucleolar RNAs, long non-coding RNAs and circular RNAs, in the process of developing novel biomarkers for diagnostic and prognostic factors that can then be utilized for personalized therapies toward this devastating disease. To support the concept of personalized medicine, we focus on the roles of miRNAs in lung cancer tumorigenesis, their use as diagnostic and prognostic biomarkers, and their application for patient therapy.
Cătană, Cristina- Sorina; Pichler, Martin; Giannelli, Gianluigi; Mader, Robert M; Berindan-Neagoe, Ioana
2017-04-25
In a continuous and mutual exchange of information, cancer cells are invariably exposed to microenvironment transformation. This continuous alteration of the genetic, molecular and cellular peritumoral stroma background has become as critical as the management of primary tumor progression events in cancer cells. The communication between stroma and tumor cells within the extracellular matrix is one of the triggers in colon and liver carcinogenesis. All non-coding RNAs, including long non-coding RNAs, microRNAs and ultraconserved genes, play a critical role in almost all cancers and are responsible for the modulation of the tumor microenvironment in several malignant processes such as initiation, progression and dissemination. This review details the involvement of non-coding RNAs in the evolution of human colorectal carcinoma and hepatocellular carcinoma in relationship with the microenvironment. Recent research has shown that a considerable number of dysregulated non-coding RNAs could be valuable diagnostic and prognostic biomarkers in cancer. Therefore, more in-depth knowledge of the role non-coding RNAs play in stroma-tumor communication and of the complex regulatory mechanisms between ultraconserved genes and microRNAs supports the validation of future effective therapeutic targets in patients suffering from hepatocellular and colorectal carcinoma, two distinctive entities which share many common non-coding RNAs.
Increasing Flexibility in Energy Code Compliance: Performance Packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Philip R.; Rosenberg, Michael I.
Energy codes and standards have provided significant increases in building efficiency over the last 38 years, since the first national energy code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. As the code matures, the prescriptive path becomes more complicated, and also more restrictive. It is likely that an approach that considers the building as an integrated system will be necessary to achieve the next real gains in building efficiency. Performance code paths are increasing in popularity; however, there remains significant design team overhead in following the performance path, especially for smaller buildings. This paper focuses on the development of one alternative format, prescriptive packages. A method for developing building-specific prescriptive packages is reviewed, based on multiple runs of prototypical building models that feed a parametric decision analysis to determine sets of packages with equivalent energy performance. The approach is designed to be cost-effective and flexible for the design team while achieving a desired level of energy efficiency performance. A demonstration of the approach based on mid-sized office buildings with two HVAC system types is shown, along with a discussion of potential applicability in the energy code process.
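The package-selection step can be pictured with a minimal sketch: from parametric simulation runs of a prototype building, keep the measure combinations whose modeled energy use meets the target, and offer each as an equivalent package. The measure names and numbers below are invented for illustration.

```python
# Parametric runs of a prototype building model (invented values).
runs = [
    {"wall_R": 13, "lpd_w_per_ft2": 0.9, "eui_kbtu_ft2": 48.0},
    {"wall_R": 19, "lpd_w_per_ft2": 1.1, "eui_kbtu_ft2": 48.5},
    {"wall_R": 19, "lpd_w_per_ft2": 0.9, "eui_kbtu_ft2": 45.2},
    {"wall_R": 13, "lpd_w_per_ft2": 1.1, "eui_kbtu_ft2": 52.3},
]

target_eui = 49.0  # energy-use intensity achieved by the reference design

# Every run at or below the target is one compliant, equivalent package.
packages = [r for r in runs if r["eui_kbtu_ft2"] <= target_eui]
for p in packages:
    print(p)
```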
Ionisation Feedback in Star and Cluster Formation Simulations
NASA Astrophysics Data System (ADS)
Ercolano, Barbara; Gritschneder, Matthias
2011-04-01
Feedback from photoionisation may dominate on parsec scales in massive star-forming regions. Such feedback may inhibit or enhance the star formation efficiency and sustain or even drive turbulence in the parent molecular cloud. Photoionisation feedback may also provide a mechanism for the rapid expulsion of gas from young clusters' potentials, often invoked as the main cause of `infant mortality'. There is currently no agreement, however, with regard to the efficiency of this process and how environment may affect the direction (positive or negative) in which it proceeds. The study of the photoionisation process as part of hydrodynamical simulations is key to understanding these issues; however, due to the computational demand of the problem, crude approximations for the radiation transfer are often employed. We will briefly review some of the most commonly used approximations and discuss their major drawbacks. We will then present the results of detailed tests carried out using the detailed photoionisation code mocassin and the SPH+ionisation code iVINE, aimed at understanding the error introduced by the simplified photoionisation algorithms. This is particularly relevant as a number of new codes have recently been developed along those lines. We will finally propose a new approach that should allow the photoionisation problem to be treated efficiently and self-consistently for complex radiation and density fields.
Global Precipitation Measurement: GPM Microwave Imager (GMI) Algorithm Development Approach
NASA Technical Reports Server (NTRS)
Stocker, Erich Franz
2009-01-01
This slide presentation reviews the approach to the development of the Global Precipitation Measurement algorithm. This presentation includes information about the responsibilities for the development of the algorithm, and the calibration. Also included is information about the orbit, and the sun angle. The test of the algorithm code will be done with synthetic data generated from the Precipitation Processing System (PPS).
Improving the Performance of Online Learning Teams--A Discourse Analysis
ERIC Educational Resources Information Center
Liu, Ying Chieh; Burn, Janice M.
2007-01-01
This paper compares the processes of Face-To-Face (FTF) teams and Online Learning Teams (OLTs) and proposes methods to improve the performance of OLTs. An empirical study reviewed the performance of fifteen FTF teams and OLTs and their communication patterns were coded by the TEMPO system developed by Futoran et al. (1989) in order to develop a…
Efficient and Robust Signal Approximations
2009-05-01
Remark: permutation matrices are both orthogonal and doubly-stochastic [62]. We will now show how to further simplify the Robust Coding... Keywords: signal processing, image compression, independent component analysis, sparse
REDUCED PROTECTIVE CLOTHING DETERMINATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
BROWN, R.L.
2003-06-13
This technical basis document defines the conditions under which reduced protective clothing can be allowed, specifies what constitutes reduced protective clothing, and documents the regulatory review determining that the process is compliant with the Tank Farm Radiological Control Manual (TFRCM) and Title 10, Part 835, of the Code of Federal Regulations (10CFR835). The criteria, standards, and requirements contained in this document apply only to Tank Farm Contractor (TFC) facilities.
Olier, Ivan; Springate, David A; Ashcroft, Darren M; Doran, Tim; Reeves, David; Planner, Claire; Reilly, Siobhan; Kontopantelis, Evangelos
2016-01-01
The use of Electronic Health Records databases for medical research has become mainstream. In the UK, increasing use of Primary Care Databases is largely driven by almost complete computerisation and uniform standards within the National Health Service. Electronic Health Records research often begins with the development of a list of clinical codes with which to identify cases with a specific condition. We present a methodology and accompanying Stata and R commands (pcdsearch/Rpcdsearch) to help researchers in this task. We present severe mental illness as an example. We used the Clinical Practice Research Datalink, a UK Primary Care Database in which clinical information is largely organised using Read codes, a hierarchical clinical coding system. Pcdsearch is used to identify potentially relevant clinical codes and/or product codes from word-stubs and code-stubs suggested by clinicians. The returned code-lists are reviewed and codes relevant to the condition of interest are selected. The final code-list is then used to identify patients. We identified 270 Read codes linked to SMI and used them to identify cases in the database. We observed that our approach identified cases that would have been missed with a simpler approach using SMI registers defined within the UK Quality and Outcomes Framework. We described a framework for researchers of Electronic Health Records databases, for identifying patients with a particular condition or matching certain clinical criteria. The method is invariant to coding system or database and can be used with SNOMED CT, ICD or other medical classification code-lists.
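The core search step can be sketched generically (this is our own illustration, not the pcdsearch/Rpcdsearch implementation): scan a code dictionary for entries whose description matches a clinician-suggested word-stub or whose code begins with a code-stub, then hand the results back for clinical review. The dictionary rows are example Read-code-style entries.

```python
import re

# Example code dictionary rows (code, description) in a Read-code-like style.
code_dictionary = [
    ("E11..", "Non-insulin dependent diabetes mellitus"),
    ("Eu20.", "[X]Schizophrenia"),
    ("Eu31.", "[X]Bipolar affective disorder"),
]

def search(word_stubs, code_stubs):
    """Return entries matching any word-stub (in the description) or code-stub."""
    pattern = re.compile("|".join(word_stubs), re.IGNORECASE)
    return [
        (code, desc) for code, desc in code_dictionary
        if pattern.search(desc) or any(code.startswith(s) for s in code_stubs)
    ]

# Both Eu20. and Eu31. are returned for review; E11.. is not.
print(search(word_stubs=["schizo", "bipolar"], code_stubs=["Eu3"]))
```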
VICTORIA: A mechanistic model for radionuclide behavior in the reactor coolant system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaperow, J.H.; Bixler, N.E.
1996-12-31
VICTORIA is the U.S. Nuclear Regulatory Commission's (NRC's) mechanistic, best-estimate code for analysis of fission product release from the core and subsequent transport in the reactor vessel and reactor coolant system. VICTORIA requires thermal-hydraulic data (i.e., temperatures, pressures, and velocities) as input. In the past, these data have been taken from the results of calculations from thermal-hydraulic codes such as SCDAP/RELAP5, MELCOR, and MAAP. Validation and assessment of VICTORIA 1.0 have been completed. An independent peer review of VICTORIA, directed by Brookhaven National Laboratory and supported by experts in the areas of fuel release, fission product chemistry, and aerosol physics, has been undertaken. This peer review, which will independently assess the code's capabilities, is nearing completion, with the peer review committee's final report expected in December 1996. A limited amount of additional development is expected as a result of the peer review. Following this additional development, the NRC plans to release VICTORIA 1.1 and an updated and improved code manual. Future plans mainly involve use of the code for plant calculations to investigate specific safety issues as they arise. Also, the code will continue to be used in support of the Phebus experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manteufel, R.D.; Ahola, M.P.; Turner, D.R.
A literature review has been conducted to determine the state of knowledge available in the modeling of coupled thermal (T), hydrologic (H), mechanical (M), and chemical (C) processes relevant to the design and/or performance of the proposed high-level waste (HLW) repository at Yucca Mountain, Nevada. The review focuses on identifying coupling mechanisms between individual processes and assessing their importance (i.e., whether the coupling is important, potentially important, or negligible). The significance of considering THMC-coupled processes lies in whether or not the processes impact the design and/or performance objectives of the repository. A review such as reported here is useful in identifying which coupled effects will be important, hence which coupled effects will need to be investigated by the US Nuclear Regulatory Commission in order to assess the assumptions, data, analyses, and conclusions in the design and performance assessment of a geologic repository. Although this work stems from regulatory interest in the design of the geologic repository, it should be emphasized that the repository design implicitly considers all of the repository performance objectives, including those associated with the time after permanent closure. The scope of this review is considered beyond previous assessments in that it attempts (with the current state of knowledge) to determine which couplings are important, and identifies which computer codes are currently available to model coupled processes.
A Review of Contemporary Diversity Literature in Pharmacy Education.
Bush, Antonio A; McLaughlin, Jacqueline E; White, Carla
2017-09-01
Objective. To review and categorize published educational research concerning diversity within colleges and schools of pharmacy. Methods. The Three Models of Organizational Diversity Capabilities in Higher Education framework was used to guide the review efforts. Of the 593 documents retrieved, 11 met the inclusion criteria for review. Each included article was individually reviewed and coded according to the framework. Results. The reviewed articles were primarily influenced by contemporary drivers of change (eg, shifting demographics in the United States), focused on enhancing the compositional diversity of colleges and schools of pharmacy, examined the experiences of underrepresented groups, and suggested process improvement recommendations. Conclusion. There is limited published educational research concerning diversity within schools and colleges of pharmacy. Contemporary drivers of change are influencing this research, but more attention must be given to the focus of the research, individuals targeted, and recommendations suggested.
Utter, Garth H; Miller, Preston R; Mowery, Nathan T; Tominaga, Gail T; Gunter, Oliver; Osler, Turner M; Ciesla, David J; Agarwal, Suresh K; Inaba, Kenji; Aboutanos, Michel B; Brown, Carlos V R; Ross, Steven E; Crandall, Marie L; Shafi, Shahid
2015-05-01
The American Association for the Surgery of Trauma (AAST) recently established a grading system for uniform reporting of anatomic severity of several emergency general surgery (EGS) diseases. There are five grades of severity for each disease, ranging from I (lowest severity) to V (highest severity). However, the grading process requires manual chart review. We sought to evaluate whether International Classification of Diseases, 9th and 10th Revisions, Clinical Modification (ICD-9-CM, ICD-10-CM) codes might allow estimation of AAST grades for EGS diseases. The Patient Assessment and Outcomes Committee of the AAST reviewed all available ICD-9-CM and ICD-10-CM diagnosis codes relevant to 16 EGS diseases with available AAST grades. We then matched grades for each EGS disease with one or more ICD codes. We used the Official Coding Guidelines for ICD-9-CM and ICD-10-CM and the American Hospital Association's "Coding Clinic for ICD-9-CM" for coding guidance. The ICD codes did not allow for matching all five AAST grades of severity for each of the 16 diseases. With ICD-9-CM, six diseases mapped into four categories of severity (instead of five), another six diseases into three categories of severity, and four diseases into only two categories of severity. With ICD-10-CM, five diseases mapped into four categories of severity, seven diseases into three categories, and four diseases into two categories. Two diseases mapped into discontinuous categories of grades (two in ICD-9-CM and one in ICD-10-CM). Although resolution is limited, ICD-9-CM and ICD-10-CM diagnosis codes might have some utility in roughly approximating the severity of the AAST grades in the absence of more precise information. These ICD mappings should be validated and refined before widespread use to characterize EGS disease severity. In the long-term, it may be desirable to develop alternatives to ICD-9-CM and ICD-10-CM codes for routine collection of disease severity characteristics.
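The kind of code-to-grade matching described above can be pictured with a small lookup table. The ICD-10-CM appendicitis codes below are real, but the grouping into grade ranges is our own simplification for illustration, not the committee's published mapping; note how several AAST grades collapse into a single code, which is precisely the loss of resolution the study reports.

```python
# Illustrative (not official) mapping of ICD-10-CM appendicitis codes to
# approximate AAST grade ranges (min_grade, max_grade).
ICD10_TO_AAST_APPENDICITIS = {
    "K35.80": (1, 2),  # unspecified acute appendicitis       -> grade I-II
    "K35.3":  (3, 3),  # acute, with localized peritonitis    -> grade III
    "K35.2":  (4, 5),  # acute, with generalized peritonitis  -> grade IV-V
}

def aast_grade_range(icd10_code):
    """Return the coarse AAST grade range implied by an ICD-10-CM code."""
    return ICD10_TO_AAST_APPENDICITIS.get(icd10_code, "unmappable")

print(aast_grade_range("K35.2"))  # (4, 5): severity is bounded, not exact
```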
Rohwer, Anke; Schoonees, Anel; Young, Taryn
2014-11-02
This paper describes the process, our experience and the lessons learnt in doing document reviews of health science curricula. Since we could not find relevant literature to guide us on how to approach these reviews, we feel that sharing our experience would benefit researchers embarking on similar projects. We followed a rigorous, transparent, pre-specified approach that included the preparation of a protocol, a pre-piloted data extraction form and coding schedule. Data were extracted, analysed and synthesised. Quality checks were included at all stages of the process. The main lessons we learnt related to time and project management, continuous quality assurance, selecting the software that meets the needs of the project, involving experts as needed and disseminating the findings to relevant stakeholders. A complete curriculum evaluation comprises, apart from a document review, interviews with students and lecturers to assess the learnt and taught curricula respectively. Rigorous methods must be used to ensure an objective assessment.
Development and Use of Health-Related Technologies in Indigenous Communities: Critical Review.
Jones, Louise; Jacklin, Kristen; O'Connell, Megan E
2017-07-20
Older Indigenous adults encounter multiple challenges as their age intersects with health inequities. Research suggests that a majority of older Indigenous adults prefer to age in place, and they will need culturally safe assistive technologies to do so. The aim of this critical review was to examine literature concerning use, adaptation, and development of assistive technologies for health purposes by Indigenous peoples. Working within Indigenous research methodologies and from a decolonizing approach, searches of peer-reviewed academic and gray literature dated to February 2016 were conducted using keywords related to assistive technology and Indigenous peoples. Sources were reviewed and coded thematically. Of the 34 sources captured, only 2 concerned technology specifically for older Indigenous adults. Studies detailing technology with Indigenous populations of all ages originated primarily from Canada (n=12), Australia (n=10), and the United States (n=9) and were coded to four themes: meaningful user involvement and community-based processes in development, the digital divide, Indigenous innovation in technology, and health technology needs as holistic and interdependent. A key finding is the necessity of meaningful user involvement in technology development, especially in communities struggling with the digital divide. In spite of, or perhaps because of, this divide, Indigenous communities are enthusiastically adapting mobile technologies to suit their needs in creative, culturally specific ways. This enthusiasm and creativity, coupled with the extensive experience many Indigenous communities have with telehealth technologies, presents opportunities for meaningful, culturally safe development processes. ©Louise Jones, Kristen Jacklin, Megan E O'Connell. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 20.07.2017.
Information processing. [in human performance
NASA Technical Reports Server (NTRS)
Wickens, Christopher D.; Flach, John M.
1988-01-01
Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).
Heery, Richard; Finn, Stephen P.; Cuffe, Sinead; Gray, Steven G.
2017-01-01
Epithelial mesenchymal transition (EMT), the adoption by epithelial cells of a mesenchymal-like phenotype, is a process co-opted by carcinoma cells in order to initiate invasion and metastasis. In addition, it is becoming clear that EMT is instrumental in both the development of drug resistance by tumour cells and the generation and maintenance of cancer stem cells. EMT is thus a pivotal process during tumour progression and poses a major barrier to the successful treatment of cancer. Non-coding RNAs (ncRNA) often utilize epigenetic programs to regulate both gene expression and chromatin structure. One type of ncRNA, long non-coding RNAs (lncRNAs), has become increasingly recognized as highly dysregulated in cancer and as playing a variety of roles in tumourigenesis. Indeed, over the last few years, lncRNAs have rapidly emerged as key regulators of EMT in cancer. In this review, we discuss the lncRNAs that have been associated with the EMT process in cancer and the variety of molecular mechanisms and signalling pathways through which they regulate EMT, and finally discuss how these EMT-regulating lncRNAs impact both anti-cancer drug resistance and the cancer stem cell phenotype. PMID:28430163
Stakeholder analysis for adopting a personal health record standard in Korea.
Kang, Min-Jeoung; Jung, Chai Young; Kim, Soyoun; Boo, Yookyung; Lee, Yuri; Kim, Sundo
Interest in health information exchanges (HIEs) is increasing. Several countries have adopted core health data standards with appropriate strategies. This study was conducted to determine the feasibility of a continuity of care record (CCR) as the standard for an electronic version of the official transfer note and the HIE in Korean healthcare. A technical review of the CCR standard and analysis of stakeholders' views were undertaken. Transfer notes were reviewed and matched with CCR standard categories. The standard for the Korean coding system was selected. Stakeholder analysis included an online survey of members of the Korean Society of Medical Informatics, a public hearing to derive opinions of consumers, doctors, vendors, academic societies and policy makers about the policy process, and a focus group meeting with EMR vendors to determine which HIE objects were technically applicable. Data objects in the official transfer note form matched CCR standards. Korean Classification of Diseases, Korean Standard Terminology of Medicine, Electronic Data Interchange code (EDI code), Logical Observation Identifiers Names and Codes, and Korean drug codes (KD code) were recommended as the Korean coding standard. 'Social history', 'payers', and 'encounters' were mostly marked as optional or unnecessary sections, and 'allergies', 'alerts', 'medication list', 'problems/diagnoses', 'results', and 'procedures' as mandatory. Unlike the US, 'social history' was considered optional and 'advance directives' mandatory. At the public hearing there was some objection from the Korean Medical Association to the HIE on legal grounds in terms of intellectual property and patients' personal information. Other groups showed positive or neutral responses. Focus group members divided CCR data objects into three phases based on predicted adoption time in CCR: (i) immediate adoption; (ii) short-term adoption ('alerts', 'family history'); and (iii) long-term adoption ('results', 'advanced directives', 'functional status', 'medical equipment', 'vital signs', 'plan of care', 'social history'). There were no technical problems in generating the CCR standard document from EMRs. Matters of concern that arose from the study results should be resolved with time and consultation.
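A CCR document is an XML record, so generating one from an EMR amounts to emitting the agreed sections with coded entries. The sketch below uses Python's standard library to emit a CCR-style 'Problems' section; the element names follow the general shape of the ASTM CCR schema but are abbreviated for illustration, and a conformant document would need the full schema, identifiers, and actor links.

```python
import xml.etree.ElementTree as ET

# Emit a minimal CCR-style XML fragment for one coded problem entry.
ccr = ET.Element("ContinuityOfCareRecord")
body = ET.SubElement(ccr, "Body")
problems = ET.SubElement(body, "Problems")
problem = ET.SubElement(problems, "Problem")
desc = ET.SubElement(problem, "Description")
ET.SubElement(desc, "Text").text = "Type 2 diabetes mellitus"
code = ET.SubElement(desc, "Code")
ET.SubElement(code, "Value").text = "E11"         # illustrative KCD-style code
ET.SubElement(code, "CodingSystem").text = "KCD"  # Korean Classification of Diseases

print(ET.tostring(ccr, encoding="unicode"))
```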
Boles, D B
1989-01-01
Three attributes of words are their imageability, concreteness, and familiarity. From a literature review and several experiments, I previously concluded (Boles, 1983a) that only familiarity affects the overall near-threshold recognition of words, and that none of the attributes affects right-visual-field superiority for word recognition. Here these conclusions are modified by two experiments demonstrating a critical mediating influence of intentional versus incidental memory instructions. In Experiment 1, subjects were instructed to remember the words they were shown, for subsequent recall. The results showed effects of both imageability and familiarity on overall recognition, as well as an effect of imageability on lateralization. In Experiment 2, word-memory instructions were deleted and the results essentially reinstated the findings of Boles (1983a). It is concluded that right-hemisphere imagery processes can participate in word recognition under intentional memory instructions. Within the dual coding theory (Paivio, 1971), the results argue that both discrete and continuous processing modes are available, that the modes can be used strategically, and that continuous processing can occur prior to response stages.
Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla
ERIC Educational Resources Information Center
Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman
2015-01-01
This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…
Professional codes in a changing nursing context: literature review.
Meulenbergs, Tom; Verpeet, Ellen; Schotsmans, Paul; Gastmans, Chris
2004-05-01
Professional codes played a definitive role during a specific period of time, when the professional context of nursing was characterized by increasing professionalization. Today, however, this professional context has changed. This paper reports on a study which aimed to explore the meaning of professional codes in the current context of the nursing profession. A literature review on professional codes and the nursing profession was carried out. The literature was systematically investigated using the electronic databases PubMed and The Philosopher's Index, and the keywords nursing codes, professional codes in nursing, ethics codes/ethical codes, professional ethics. Due to the nursing profession's growing multidisciplinary nature, the increasing dominance of economic discourse, and the intensified legal framework in which health care professionals need to operate, the context of nursing is changing. In this changed professional context, nursing professional codes have to accommodate the increasing ethical demands placed upon the profession. Therefore, an ethicization of these codes is desirable, and their moral objectives need to be revalued.
An engineer's view on genetic information and biological evolution.
Battail, Gérard
2004-01-01
We develop ideas on genome replication introduced in Battail [Europhys. Lett. 40 (1997) 343]. Starting with the hypothesis that the genome replication process uses error-correcting means, and the auxiliary hypothesis that nested codes are used to this end, we first review the concepts of redundancy and error-correcting codes. Then we show that these hypotheses imply that distinct species exist with a hierarchical taxonomy, that there is a trend of evolution towards complexity, and that evolution proceeds by discrete jumps. At least the first two features may be considered biological facts, so, in the absence of direct evidence, they provide indirect support for the hypothesized error-correction system. The very high redundancy of genomes makes such a system possible. In order to explain how it could be implemented, we suggest that soft codes and replication decoding, which are briefly described, are plausible candidates. Experimentally proven properties of long-range correlation in the DNA message substantiate this claim.
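The redundancy/error-correction idea the paper builds on can be shown with a deliberately simple toy: a repetition code, standing in here for the hypothesized soft codes and replication decoding. Each symbol is copied n times, and decoding takes a majority vote, so up to (n-1)//2 errors per symbol are corrected.

```python
from collections import Counter

def encode(message, n=3):
    """Repeat each symbol n times (redundancy)."""
    return "".join(ch * n for ch in message)

def decode(received, n=3):
    """Majority-vote each block of n symbols (error correction)."""
    blocks = [received[i:i + n] for i in range(0, len(received), n)]
    return "".join(Counter(b).most_common(1)[0][0] for b in blocks)

codeword = encode("ACGT")     # 'AAACCCGGGTTT'
corrupted = "AAACCCGAGTTT"    # one substitution in the 'G' block
print(decode(corrupted))      # -> 'ACGT', the error is corrected
```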
A systems engineering initiative for NASA's space communications
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda S.; Hei, Donald J., Jr.; Kelly, Angelita C.; Lightfoot, Patricia C.; Bell, Holland T.; Cureton-Snead, Izeller E.; Hurd, William J.; Scales, Charles H.
1993-01-01
In addition to but separate from the Red and Blue Teams commissioned by the NASA Administrator, NASA's Associate Administrator for Space Communications commissioned a Blue Team to review the Office of Space Communications (Code O) Core Program and determine how the program could be conducted faster, better, and cheaper, without compromising safety. Since there was no corresponding Red Team for the Code O Blue Team, the Blue Team assumed a Red Team independent attitude and challenged the status quo. The Blue Team process and results are summarized. The Associate Administrator for Space Communications subsequently convened a special management session to discuss the significance and implications of the Blue Team's report and to lay the groundwork and teamwork for the next steps, including the transition from engineering systems to systems engineering. The methodology and progress toward realizing the Code O Family vision and accomplishing the systems engineering initiative for NASA's space communications are presented.
BSPS Program (ESI-Mass Spectrometry) Biological Sample Data Analysis; Disruption of Bacteria Spores
2005-10-01
the original usage of 'translational' as a broad description of the entire process by which the polymer of the three-letter code in the mRNA is...translated. There is an extensive review of post-translational modifications of proteins by Finn Wold (1981) [24], given as in vivo chemical modifications... thiolation, biotin, bromination, carbamylation, deamidation, methylation, glucosylation, lipoyl, phosphorylation, pyridoxal phosphate
Uncertainty Analysis Principles and Methods
2007-09-01
error source. The Data Processor converts binary coded numbers to values, performs D/A curve fitting, and applies any correction factors that may be... describes the stages or modules involved in the measurement process. We now need to identify all relevant error sources and develop the mathematical...
Mechanisms of Long Non-Coding RNAs in the Assembly and Plasticity of Neural Circuitry.
Wang, Andi; Wang, Junbao; Liu, Ying; Zhou, Yan
2017-01-01
The mechanisms underlying the developmental processes and functional dynamics of neural circuits are far from understood. Long non-coding RNAs (lncRNAs) have emerged as essential players in defining the identities of neural cells and in modulating neural activities. In this review, we summarize the latest advances concerning the roles and mechanisms of lncRNAs in the assembly, maintenance and plasticity of neural circuitry, as well as lncRNAs' implications in neurological disorders. We also discuss technical advances and challenges in studying the functions and mechanisms of lncRNAs in neural circuitry. Finally, we propose that lncRNA studies will advance our understanding of how neural circuits develop and function in physiology and disease conditions.
The emerging roles of long non-coding RNA in gallbladder cancer tumorigenesis.
Chen, Bing; Li, Ya; He, Yuting; Xue, Chen; Xu, Feng
2018-05-04
Accumulating evidence suggests that long non-coding RNAs (lncRNAs) have important regulatory functions in gallbladder cancer (GBC) tumorigenesis and can serve as potential novel markers and/or targets for GBC. In this review, we critically discuss the emerging alterations of lncRNAs in GBC, lncRNA-induced epigenetic regulation, the interaction of lncRNAs with microRNAs, and the effects of lncRNAs on tumor-related signaling pathways. Additionally, the contributions of lncRNAs to the epithelial-mesenchymal transition process and to energy metabolism reprogramming in GBC are also addressed. This may pave new ways towards the determination of GBC pathogenesis and lead to the development of new preventive and therapeutic strategies for GBC.
Non-coding RNA networks underlying cognitive disorders across the lifespan
Qureshi, Irfan A.; Mehler, Mark F.
2011-01-01
Non-coding RNAs (ncRNAs) and their associated regulatory networks are increasingly being implicated in mediating a complex repertoire of neurobiological functions. Cognitive and behavioral processes are proving to be no exception. Here, we discuss the emergence of many novel, diverse, and rapidly expanding classes and subclasses of short and long ncRNAs. We briefly review the life cycles and molecular functions of these ncRNAs. We also examine how ncRNA circuitry mediates brain development, plasticity, stress responses, and aging and highlight its potential roles in the pathophysiology of cognitive disorders, including neural developmental and age-associated neurodegenerative diseases as well as those that manifest throughout the lifespan. PMID:21411369
Open Rotor Noise Prediction Methods at NASA Langley- A Technology Review
NASA Technical Reports Server (NTRS)
Farassat, F.; Dunn, Mark H.; Tinetti, Ana F.; Nark, Douglas M.
2009-01-01
Open rotors are once again under consideration for propulsion of future airliners because of their high efficiency. The noise generated by these propulsion systems must meet today's stringent noise standards to reduce community impact. In this paper we review the open rotor noise prediction methods available at NASA Langley. We discuss three codes called ASSPIN (Advanced Subsonic-Supersonic Propeller Induced Noise), FW-Hpds (Ffowcs Williams-Hawkings with penetrable data surface) and the FSC (Fast Scattering Code). The first two codes are time domain codes and the third is a frequency domain code. The capabilities of these codes and the input data requirements as well as the output data are presented. Plans for further improvements of these codes are discussed. In particular, a method based on equivalent sources is outlined to eliminate spurious signals in the FW-Hpds code.
Matus, Bethany A; Bridges, Kayla M; Logomarsino, John V
2018-06-21
Individualized feeding care plans and safe handling of milk (human or formula) are critical in promoting growth, immune function, and neurodevelopment in the preterm infant. Feeding errors and disruptions or limitations to feeding processes in the neonatal intensive care unit (NICU) are associated with negative safety events. Feeding errors include contamination of milk and delivery of incorrect or expired milk and may result in adverse gastrointestinal illnesses. The purpose of this review was to evaluate the effect(s) of centralized milk preparation, use of trained technicians, use of bar code-scanning software, and collaboration between registered dietitians and registered nurses on feeding safety in the NICU. A systematic review of the literature was completed, and 12 articles were selected as relevant to the search criteria. Study quality was evaluated using the Downs and Black scoring tool. An evaluation of human studies indicated that the use of centralized milk preparation, trained technicians, bar code-scanning software, and possibly registered dietitian involvement decreased feeding-associated error in the NICU. A state-of-the-art NICU includes a centralized milk preparation area staffed by trained technicians, care supported by bar code-scanning software, and utilization of a registered dietitian to improve patient safety. These resources will provide nurses with more time to focus on nursing-specific neonatal care. Further research is needed to evaluate the impact of factors related to feeding safety in the NICU as well as potential financial benefits of these quality improvement opportunities.
DRG benchmarking study establishes national coding norms.
Vaul, J H
1998-05-01
With the increase in fraud and abuse investigations, healthcare financial managers should examine their organization's medical record coding procedures. The Federal government and third-party payers are looking specifically for improper billing of outpatient services, unbundling of procedures to increase payment, assignment of higher-paying DRG codes for inpatient claims, and other abuses. A recent benchmarking study of Medicare Provider Analysis and Review (MEDPAR) data has established national norms for hospital coding and case mix based on DRGs and has revealed that the majority of atypical coding cases fall into six DRG pairs. Organizations with a greater percentage of atypical cases--those more likely to be scrutinized by Federal investigators--will want to conduct a suitable review and be sure appropriate documentation exists to justify the coding.
Leveraging the NLM map from SNOMED CT to ICD-10-CM to facilitate adoption of ICD-10-CM.
Cartagena, F Phil; Schaeffer, Molly; Rifai, Dorothy; Doroshenko, Victoria; Goldberg, Howard S
2015-05-01
Develop and test web services to retrieve and identify the most precise ICD-10-CM code(s) for a given clinical encounter. Facilitate creation of user interfaces that 1) provide an initial shortlist of candidate codes, ideally visible on a single screen; and 2) enable code refinement. To satisfy our high-level use cases, the analysis and design process involved reviewing available maps and crosswalks, designing the rule adjudication framework, determining necessary metadata, retrieving related codes, and iteratively improving the code refinement algorithm. The Partners ICD-10-CM Search and Mapping Services (PI-10 Services) are SOAP web services written using Microsoft's .NET 4.0 Framework, Windows Communication Foundation, and SQL Server 2012. The services cover 96% of the Partners problem list subset of SNOMED CT codes that map to ICD-10-CM codes and can return up to 76% of the 69,823 billable ICD-10-CM codes prior to creation of custom mapping rules. We consider ways to increase 1) the coverage ratio of the Partners problem list subset of SNOMED CT codes and 2) the upper bound of returnable ICD-10-CM codes by creating custom mapping rules. Future work will investigate the utility of the transitive closure of SNOMED CT codes and other methods to assist in custom rule creation and, ultimately, to provide more complete coverage of ICD-10-CM codes. ICD-10-CM will be easier for clinicians to manage if applications display short lists of candidate codes from which clinicians can subsequently select a code for further refinement. The PI-10 Services support ICD-10 migration by implementing this paradigm and enabling users to consistently and accurately find the best ICD-10-CM code(s) without translation from ICD-9-CM. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.
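The shortlist-then-refine pattern can be sketched in a few lines. Everything below (map contents, function name, example codes) is invented for illustration and is not the PI-10 rule framework itself:

# Hypothetical excerpt of a SNOMED CT -> ICD-10-CM candidate map.
SNOMED_TO_ICD10CM = {
    "44054006": [  # SNOMED CT: diabetes mellitus type 2
        ("E11.9", "Type 2 diabetes mellitus without complications"),
        ("E11.65", "Type 2 diabetes mellitus with hyperglycemia"),
    ],
}

def shortlist(snomed_code, max_items=10):
    """Return an initial, single-screen list of candidate ICD-10-CM codes."""
    return SNOMED_TO_ICD10CM.get(snomed_code, [])[:max_items]

for code, label in shortlist("44054006"):
    print(code, "-", label)  # the user then refines to the final billable code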
McEvoy, Fintan J; Shen, Nicholas W; Nielsen, Dorte H; Buelund, Lene E; Holm, Peter
2017-02-01
Communicating radiological reports to peers has pedagogical value. Students may be uneasy with the process due to a lack of communication and peer review skills or a failure to see value in it. We describe a communication exercise with peer review in an undergraduate veterinary radiology course. The computer code used to manage the course and deliver images online is reported, and we provide links to the executable files. We tested whether undergraduate peer review of radiological reports has validity and describe student impressions of the learning process. Peer review scores for student-generated radiological reports were compared to scores obtained in the summative multiple choice (MCQ) examination for the course. Student satisfaction was measured using a bespoke questionnaire. There was a weak positive correlation (Pearson correlation coefficient = 0.32, p < 0.01) between the peer review scores students received and the scores they obtained in the MCQ examination. The difference in peer review scores received by students grouped according to their level of course performance (high vs. low) was statistically significant (p < 0.05). No correlation was found between the peer review scores awarded by the students and the scores they obtained in the MCQ examination (Pearson correlation coefficient = 0.17, p = 0.14). In conclusion, we have created a realistic radiology imaging exercise with readily available software. The peer review scores are valid in that, to a limited degree, they reflect students' subsequent examination performance. Students valued the process of learning to communicate radiological findings but did not fully appreciate the value of peer review.
Kinetic models of gene expression including non-coding RNAs
NASA Astrophysics Data System (ADS)
Zhdanov, Vladimir P.
2011-03-01
In cells, genes are transcribed into mRNAs, and the latter are translated into proteins. Due to the feedbacks between these processes, the kinetics of gene expression may be complex even in the simplest genetic networks. The corresponding models have already been reviewed in the literature. A new avenue in this field is related to the recognition that the conventional scenario of gene expression is fully applicable only to prokaryotes whose genomes consist of tightly packed protein-coding sequences. In eukaryotic cells, in contrast, such sequences are relatively rare, and the rest of the genome includes numerous transcript units representing non-coding RNAs (ncRNAs). During the past decade, it has become clear that such RNAs play a crucial role in gene expression and accordingly influence a multitude of cellular processes both in the normal state and during diseases. The numerous biological functions of ncRNAs are based primarily on their abilities to silence genes via pairing with a target mRNA and subsequently preventing its translation or facilitating degradation of the mRNA-ncRNA complex. Many other abilities of ncRNAs have been discovered as well. Our review is focused on the available kinetic models describing the mRNA, ncRNA and protein interplay. In particular, we systematically present the simplest models without kinetic feedbacks, models containing feedbacks and predicting bistability and oscillations in simple genetic networks, and models describing the effect of ncRNAs on complex genetic networks. Mathematically, the presentation is based primarily on temporal mean-field kinetic equations. The stochastic and spatio-temporal effects are also briefly discussed.
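As a concrete illustration of the mean-field level of description used in such models, a minimal silencing scheme, in which an ncRNA pairs with an mRNA and the resulting duplex is degraded, can be written as (the notation here is an assumption for illustration, not taken from the review):

\frac{dN_m}{dt} = k_m - d_m N_m - r N_m N_s, \qquad
\frac{dN_s}{dt} = k_s - d_s N_s - r N_m N_s, \qquad
\frac{dN_p}{dt} = k_p N_m - d_p N_p,

where N_m, N_s and N_p are the mRNA, ncRNA and protein populations, the k's and d's are synthesis and degradation rate constants, and r is the rate constant of mRNA-ncRNA association with mutual degradation. Adding feedback, for example making k_s a function of N_p, is the ingredient that can produce bistability and oscillations in such schemes.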
General review of the MOSTAS computer code for wind turbines
NASA Technical Reports Server (NTRS)
Dungundji, J.; Wendell, J. H.
1981-01-01
The MOSTAS computer code for wind turbine analysis is reviewed, and the techniques and methods used in its analyses are described. Impressions of its strengths and weaknesses are given, and recommendations are made for its application, modification, and further development. Basic techniques used in wind turbine stability and response analyses for systems with constant and periodic coefficients are also reviewed.
A Review of Contemporary Diversity Literature in Pharmacy Education
Bush, Antonio A.; White, Carla
2017-01-01
Objective. To review and categorize published educational research concerning diversity within colleges and schools of pharmacy. Methods. The Three Models of Organizational Diversity Capabilities in Higher Education framework was used to guide the review efforts. Of the 593 documents retrieved, 11 met the inclusion criteria for review. Each included article was individually reviewed and coded according to the framework. Results. The reviewed articles were primarily influenced by contemporary drivers of change (eg, shifting demographics in the United States), focused on enhancing the compositional diversity of colleges and schools of pharmacy, examined the experiences of underrepresented groups, and suggested process improvement recommendations. Conclusion. There is limited published educational research concerning diversity within schools and colleges of pharmacy. Contemporary drivers of change are influencing this research, but more attention must be given to the focus of the research, individuals targeted, and recommendations suggested. PMID:29109561
Botsis, Taxiarchis; Foster, Matthew; Kreimeyer, Kory; Pandey, Abhishek; Forshee, Richard
2017-01-01
Literature review is critical but time-consuming in the post-market surveillance of medical products. We focused on the safety signal of intussusception after the vaccination of infants with the Rotashield Vaccine in 1999 and retrieved all PubMed abstracts for rotavirus vaccines published after January 1, 1998. We used the Event-based Text-mining of Health Electronic Records system, the MetaMap tool, and the National Center for Biomedical Ontologies Annotator to process the abstracts and generate coded terms stamped with the date of publication. Data were analyzed in the Pattern-based and Advanced Network Analyzer for Clinical Evaluation and Assessment to evaluate the intussusception-related findings before and after the release of the new rotavirus vaccines in 2006. The tight connection of intussusception with the historical signal in the first period and the absence of any safety concern for the new vaccines in the second period were verified. We demonstrated the feasibility of semi-automated solutions that may assist medical reviewers in monitoring biomedical literature.
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
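A minimal sketch of the metamodeling idea follows, with an invented stand-in for an expensive analysis code (assumptions: one input variable and a quadratic response surface; nothing here is from the paper):

import numpy as np

def expensive_code(x):
    # Stand-in for a detailed analysis code with non-trivial run time.
    return np.sin(3.0 * x) + 0.5 * x**2

X = np.linspace(-1.0, 1.0, 7)        # small design of experiments
y = expensive_code(X)                # a handful of expensive runs

coeffs = np.polyfit(X, y, deg=2)     # quadratic response-surface fit
surrogate = np.poly1d(coeffs)

# The surrogate is orders of magnitude cheaper to evaluate, e.g. inside an
# optimization loop; its error should be checked at hold-out points.
print(surrogate(0.37), expensive_code(0.37))

Kriging would replace the quadratic fit with a Gaussian-process interpolator, but the sample-fit-query workflow is the same.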
Using natural language processing to identify problem usage of prescription opioids.
Carrell, David S; Cronkite, David; Palmer, Roy E; Saunders, Kathleen; Gross, David E; Masters, Elizabeth T; Hylan, Timothy R; Von Korff, Michael
2015-12-01
Accurate and scalable surveillance methods are critical to understand widespread problems associated with misuse and abuse of prescription opioids and for implementing effective prevention and control measures. Traditional diagnostic coding incompletely documents problem use. Relevant information for each patient is often obscured in vast amounts of clinical text. We developed and evaluated a method that combines natural language processing (NLP) and computer-assisted manual review of clinical notes to identify evidence of problem opioid use in electronic health records (EHRs). We used the EHR data and text of 22,142 patients receiving chronic opioid therapy (≥70 days' supply of opioids per calendar quarter) during 2006-2012 to develop and evaluate an NLP-based surveillance method and compare it to traditional methods based on International Classification of Diseases, Ninth Revision (ICD-9) codes. We developed a 1288-term dictionary for clinician mentions of opioid addiction, abuse, misuse or overuse, and an NLP system to identify these mentions in unstructured text. The system distinguished affirmative mentions from those that were negated or otherwise qualified. We applied this system to 7,336,445 electronic chart notes of the 22,142 patients. Trained abstractors using a custom computer-assisted software interface manually reviewed 7751 chart notes (from 3156 patients) selected by the NLP system and classified each note as to whether or not it contained textual evidence of problem opioid use. Traditional diagnostic codes for problem opioid use were found for 2240 (10.1%) patients. NLP-assisted manual review identified an additional 728 (3.1%) patients with evidence of clinically diagnosed problem opioid use in clinical notes. Inter-rater reliability among pairs of abstractors reviewing notes was high, with kappa=0.86 and 97% agreement for one pair, and kappa=0.71 and 88% agreement for another pair. Scalable, semi-automated NLP methods can efficiently and accurately identify evidence of problem opioid use in vast amounts of EHR text. Incorporating such methods into surveillance efforts may increase prevalence estimates by as much as one-third relative to traditional methods. Copyright © 2015. Published by Elsevier Ireland Ltd.
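The affirmative-versus-negated distinction the system draws can be illustrated with a deliberately simplified, NegEx-style matcher; the terms and cue words below are placeholders, not the authors' 1288-term dictionary:

import re

TERMS = ["opioid abuse", "opioid misuse", "opioid addiction"]   # placeholders
NEGATION_CUES = ["no", "denies", "without", "negative for"]     # placeholders

def affirmative_mentions(note, window=40):
    """Yield (term, offset) for dictionary terms not preceded by a negation cue."""
    lowered = note.lower()
    for term in TERMS:
        for m in re.finditer(re.escape(term), lowered):
            context = lowered[max(0, m.start() - window):m.start()]
            if not any(re.search(r"\b" + re.escape(cue) + r"\b", context)
                       for cue in NEGATION_CUES):
                yield term, m.start()

note = "Patient denies opioid misuse; note documents opioid addiction."
print(list(affirmative_mentions(note)))   # the negated mention is suppressed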
Code Mixing and Modernization across Cultures.
ERIC Educational Resources Information Center
Kamwangamalu, Nkonko M.
A review of recent studies addressed the functional uses of code mixing across cultures. Expressions of code mixing (CM) are not random; in fact, a number of functions of code mixing can easily be delineated, for example, the concept of "modernization." "Modernization" is viewed with respect to how bilingual code mixers perceive…
Figueiredo, Rafael L F; Singhal, Sonica; Dempster, Laura; Hwang, Stephen W; Quinonez, Carlos
2015-01-01
Emergency department (ED) visits for nontraumatic dental conditions (NTDCs) may be a sign of unmet need for dental care. The objective of this study was to determine the accuracy of International Classification of Diseases codes (ICD-10-CA) for ED visits for NTDCs. ED visits in 2008-2009 at one hospital in Toronto were identified if the discharge diagnosis in the administrative database system was an ICD-10-CA code for a NTDC (K00-K14). A random sample of 100 visits was selected, and the medical records for these visits were reviewed by a dentist. The descriptions of the clinical signs and symptoms were evaluated, and a diagnosis was assigned. This diagnosis was compared with the diagnosis assigned by the physician and the code assigned to the visit. The 100 ED visits reviewed were associated with 16 different ICD-10-CA codes for NTDCs. Only 2 percent of these visits were clearly caused by trauma. The code K0887 (toothache) was the most frequent diagnostic code (31 percent). We found 43.3 percent disagreement on the discharge diagnosis reported by the physician, and 58.0 percent disagreement on the code assigned in the administrative database by the abstractor, compared with what was suggested by the dentist reviewing the chart. There are substantial discrepancies between the ICD-10-CA diagnosis assigned in administrative databases and the diagnosis assigned by a dentist reviewing the chart retrospectively. However, ICD-10-CA codes can be used to accurately identify ED visits for NTDCs. © 2015 American Association of Public Health Dentistry.
Methods for Coding Tobacco-Related Twitter Data: A Systematic Review
Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai
2017-01-01
Background As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. Objective The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Methods Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. Results E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter’s Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Conclusions Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter’s databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. PMID:28363883
Schneider, Gary; Kachroo, Sumesh; Jones, Natalie; Crean, Sheila; Rotella, Philip; Avetisyan, Ruzan; Reynolds, Matthew W
2012-01-01
The Food and Drug Administration's Mini-Sentinel pilot program aims to conduct active surveillance to refine safety signals that emerge for marketed medical products. A key facet of this surveillance is to develop and understand the validity of algorithms for identifying health outcomes of interest from administrative and claims data. This article summarizes the process and findings of the algorithm review of hypersensitivity reactions. PubMed and Iowa Drug Information Service searches were conducted to identify citations applicable to the hypersensitivity reactions of health outcomes of interest. Level 1 abstract reviews and Level 2 full-text reviews were conducted to find articles using administrative and claims data to identify hypersensitivity reactions and including validation estimates of the coding algorithms. We identified five studies that provided validated hypersensitivity-reaction algorithms. Algorithm positive predictive values (PPVs) for various definitions of hypersensitivity reactions ranged from 3% to 95%. PPVs were high (i.e. 90%-95%) when both exposures and diagnoses were very specific. PPV generally decreased when the definition of hypersensitivity was expanded, except in one study that used data mining methodology for algorithm development. The ability of coding algorithms to identify hypersensitivity reactions varied, with decreasing performance occurring with expanded outcome definitions. This examination of hypersensitivity-reaction coding algorithms provides an example of surveillance bias resulting from outcome definitions that include mild cases. Data mining may provide tools for algorithm development for hypersensitivity and other health outcomes. Research needs to be conducted on designing validation studies to test hypersensitivity-reaction algorithms and estimating their predictive power, sensitivity, and specificity. Copyright © 2012 John Wiley & Sons, Ltd.
Rushton, A; White, L; Heap, A; Heneghan, N; Goodwin, P
2016-01-01
Objectives To develop an optimised 1:1 physiotherapy intervention that reflects best practice, with flexibility to tailor management to individual patients, thereby ensuring patient-centred practice. Design Mixed-methods combining evidence synthesis, expert review and focus groups. Setting Secondary care involving 5 UK specialist spinal centres. Participants A purposive panel of clinical experts from the 5 spinal centres, comprising spinal surgeons, inpatient and outpatient physiotherapists, provided expert review of the draft intervention. Purposive samples of patients (n=10) and physiotherapists (n=10) (inpatient/outpatient physiotherapists managing patients with lumbar discectomy) were invited to participate in the focus groups at 1 spinal centre. Methods A draft intervention, developed from 2 systematic reviews, a survey of current practice, and research related to stratified care, was circulated to the panel of clinical experts. Lead physiotherapists collaborated with physiotherapy and surgeon colleagues to provide feedback that informed the intervention presented at 2 focus groups investigating acceptability to patients and physiotherapists. The focus groups were facilitated by an experienced facilitator and recorded in written and tape-recorded forms by an observer. Tape recordings were transcribed verbatim. Data analysis, conducted by 2 independent researchers, employed an iterative and constant comparative process of (1) initial descriptive coding to identify categories and subsequent themes, and (2) deeper, interpretive coding and thematic analysis enabling concepts to emerge and overarching pattern codes to be identified. Results The intervention reflected best available evidence and provided flexibility to ensure patient-centred care. The intervention comprised up to 8 sessions of 1:1 physiotherapy over 8 weeks, starting 4 weeks postsurgery. The intervention was acceptable to patients and physiotherapists. Conclusions A rigorous process informed an optimised 1:1 physiotherapy intervention post-lumbar discectomy that reflects best practice. The developed intervention was agreed on by the 5 spinal centres for implementation in a randomised controlled trial to evaluate its effectiveness. PMID:26916690
Southern, Danielle A; Burnand, Bernard; Droesler, Saskia E; Flemons, Ward; Forster, Alan J; Gurevich, Yana; Harrison, James; Quan, Hude; Pincus, Harold A; Romano, Patrick S; Sundararajan, Vijaya; Kostanjsek, Nenad; Ghali, William A
2017-03-01
Existing administrative data patient safety indicators (PSIs) have been limited by uncertainty around the timing of onset of included diagnoses. We undertook de novo PSI development through a data-driven approach that drew upon "diagnosis timing" information available in some countries' administrative hospital data. Administrative database analysis and modified Delphi rating process. All hospitalized adults in Canada in 2009. We queried all hospitalizations for ICD-10-CA diagnosis codes arising during hospital stay. We then undertook a modified Delphi panel process to rate the extent to which each of the identified diagnoses has a potential link to suboptimal quality of care. We grouped the identified quality/safety-related diagnoses into relevant clinical categories. Lastly, we queried Alberta hospital discharge data to assess the frequency of the newly defined PSI events. Among 2,416,413 national hospitalizations, we found 2590 unique ICD-10-CA codes flagged as having arisen after admission. Seven panelists evaluated these in a 2-round review process, and identified a listing of 640 ICD-10-CA diagnosis codes judged to be linked to suboptimal quality of care and thus appropriate for inclusion in PSIs. These were then grouped by patient safety experts into 18 clinically relevant PSI categories. We then analyzed data on 2,381,652 Alberta hospital discharges from 2005 through 2012, and found that 134,299 (5.2%) hospitalizations had at least 1 PSI diagnosis. The resulting work creates a foundation for a new set of PSIs for routine large-scale surveillance of hospital and health system performance.
Electrophysiological Evidence for the Sources of the Masking Level Difference.
Fowler, Cynthia G
2017-08-16
The purpose of this article is to review evidence from auditory evoked potential studies and describe the contributions of the auditory brainstem and cortex to the generation of the masking level difference (MLD). A literature review was performed, focusing on the auditory brainstem, middle, and late latency responses used in protocols similar to those used to generate the behavioral MLD. Temporal coding of the signals necessary for generating the MLD occurs in the auditory periphery and brainstem. Brainstem disorders up to wave III of the auditory brainstem response (ABR) can disrupt the MLD. The full MLD requires input to the generators of the auditory late latency potentials to produce all characteristics of the MLD; these characteristics include threshold differences for various binaural signal and noise conditions. Studies using central auditory lesions are beginning to identify the cortical effects on the MLD. The MLD requires auditory processing from the periphery to cortical areas. A healthy auditory periphery and brainstem code temporal synchrony, which is essential for the ABR. Threshold differences require engaging cortical function beyond the primary auditory cortex. More studies using cortical lesions and evoked potentials or imaging should clarify the specific cortical areas involved in the MLD.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-01
... parts of the National Board Inspection Code at http://www.nationalboard.org. DATES: The comment period... edition of the National Board Inspection Code for public review at www.nationalboard.org. Both documents...
Beller, Elaine; Clark, Justin; Tsafnat, Guy; Adams, Clive; Diehl, Heinz; Lund, Hans; Ouzzani, Mourad; Thayer, Kristina; Thomas, James; Turner, Tari; Xia, Jun; Robinson, Karen; Glasziou, Paul
2018-05-19
Systematic reviews (SR) are vital to health care, but have become complicated and time-consuming due to the rapid expansion of evidence to be synthesised. Fortunately, many tasks of systematic reviews have the potential to be automated or may be assisted by automation. Recent advances in natural language processing, text mining and machine learning have produced new algorithms that can accurately mimic human endeavour in systematic review activity, faster and more cheaply. Automation tools need to be able to work together, to exchange data and results. Therefore, we initiated the International Collaboration for the Automation of Systematic Reviews (ICASR) to bring together all the parts needed to automate systematic review production. The first meeting was held in Vienna in October 2015. We established a set of principles to enable tools to be developed and integrated into toolkits. This paper sets out the principles devised at that meeting, which cover the need for improvement in efficiency of SR tasks, automation across the spectrum of SR tasks, continuous improvement, adherence to high quality standards, flexibility of use and combining components, the need for collaboration and varied skills, the desire for open source, shared code and evaluation, and a requirement for replicability through rigorous and open evaluation. Automation has great potential to improve the speed of systematic reviews. Considerable work is already being done on many of the steps involved in a review. The 'Vienna Principles' set out in this paper aim to guide a more coordinated effort, which will allow the integration of work by separate teams and build on the experience, code and evaluations done by the many teams working across the globe.
NASA Technical Reports Server (NTRS)
Duda, James L.; Barth, Suzanna C
2005-01-01
The VIIRS sensor provides measurements for 22 Environmental Data Records (EDRs) addressing the atmosphere, ocean surface temperature, ocean color, land parameters, aerosols, imaging for clouds and ice, and more. That is, the VIIRS collects visible and infrared radiometric data of the Earth's atmosphere, ocean, and land surfaces. Data types include atmospheric, clouds, Earth radiation budget, land/water and sea surface temperature, ocean color, and low light imagery. This wide scope of measurements calls for the preparation of a multiplicity of Algorithm Theoretical Basis Documents (ATBDs), and, additionally, for intermediate products such as the cloud mask. Furthermore, the VIIRS interacts with three or more other sensors. This paper addresses selected and crucial elements of the process being used to convert and test an immense volume of maturing and changing science code into the initial operational source code in preparation for the launch of NPP. The integrity of the original science code is maintained and enhanced via baseline comparisons when re-hosted, in addition to multiple planned code performance reviews.
Janamian, Tina; Jackson, Claire L; Glasson, Nicola; Nicholson, Caroline
2014-08-04
To review the available literature to identify the major challenges and barriers to implementation and adoption of the patient-centred medical home (PCMH) model, topical in current Australian primary care reforms. Systematic review of peer-reviewed literature. PubMed and Embase databases were searched in December 2012 for studies published in English between January 2007 and December 2012. Studies of any type were included if they defined PCMH using the Patient-Centered Primary Care Collaborative Joint Principles, and reported data on challenges and barriers to implementation and adoption of the PCMH model. One researcher with content knowledge in the area abstracted data relating to the review objective and study design from eligible articles. A second researcher reviewed the abstracted data alongside the original article to check for accuracy and completeness. Thematic synthesis was used in three stages: free line-by-line coding of data; organisation of "free codes" into related areas to construct "descriptive" themes; and development of "analytical" themes. The main barriers identified related to: challenges with the transformation process; difficulties associated with change management; challenges in implementing and using an electronic health record that administers principles of PCMH; challenges with funding and appropriate payment models; insufficient resources and infrastructure within practices; and inadequate measures of performance. This systematic review documents the key challenges and barriers to implementing the PCMH model in United States family practice. It provides valuable evidence for Australian clinicians, policymakers, and organisations approaching adoption of PCMH elements within reform initiatives in this country.
Practices in Code Discoverability: Astrophysics Source Code Library
NASA Astrophysics Data System (ADS)
Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.
2012-09-01
Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now holds over 340 codes and continues to grow; in 2011, the ASCL added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.
NASA Technical Reports Server (NTRS)
Thomson, F.
1972-01-01
The additional processing performed on data collected over the Rhode River Test Site and Forestry Site in November 1970 is reported. The techniques and procedures used to obtain the processed results are described. Thermal data collected over three approximately parallel lines of the site were contoured, and the results color coded, for the purpose of delineating important scene constituents and to identify trees attacked by pine bark beetles. Contouring work and histogram preparation are reviewed and the important conclusions from the spectral analysis and recognition computer (SPARC) signature extension work are summarized. The SPARC setup and processing records are presented and recommendations are made for future data collection over the site.
An experiment to assess the cost-benefits of code inspections in large scale software development
NASA Technical Reports Server (NTRS)
Porter, A.; Siy, H.; Toman, C. A.; Votta, L. G.
1994-01-01
This experiment (currently in progress) is designed to measure costs and benefits of different code inspection methods. It is being performed with a real development team writing software for a commercial product. The dependent variables for each code unit's inspection are the elapsed time and the number of defects detected. We manipulate the method of inspection by randomly assigning reviewers, varying the number of reviewers and the number of teams, and, when using more than one team, randomly assigning author repair and non-repair of detected defects between code inspections. After collecting and analyzing the first 17 percent of the data, we have discovered several interesting facts about reviewers, about the defects recorded during reviewer preparation and during the inspection collection meeting, and about the repairs that are eventually made. (1) Only 17 percent of the defects that reviewers record in their preparations are true defects that are later repaired. (2) Defects recorded at the inspection meetings fall into three categories: 18 percent false positives requiring no author repair, 57 percent soft maintenance where the author makes changes only for readability or code standard enforcement, and 25 percent true defects requiring repair. (3) The median elapsed calendar time for code inspections is 10 working days - 8 working days before the collection meeting and 2 after. (4) In the collection meetings, 31 percent of the defects discovered by reviewers during preparation are suppressed. (5) Finally, 33 percent of the true defects recorded are discovered at the collection meetings and not during any reviewer's preparation. The results to date suggest that inspections with two sessions (two different teams) of two reviewers per session (2sX2p) are the most effective. These two-session inspections may be performed with author repair or with no author repair between the two sessions. We are finding that the two-session, two-person with repair (2sX2pR) inspections are the most expensive, taking 15 working days of calendar time from the time the code is ready for review until author repair is complete, whereas two-session, two-person with no repair (2sX2pN) inspections take only 10 working days, but find about 10 percent fewer defects.
Taste buds: cells, signals and synapses
Roper, Stephen D.; Chaudhari, Nirupa
2018-01-01
The past decade has witnessed a consolidation and refinement of the extraordinary progress made in taste research. This Review describes recent advances in our understanding of taste receptors, taste buds, and the connections between taste buds and sensory afferent fibres. The article discusses new findings regarding the cellular mechanisms for detecting tastes, new data on the transmitters involved in taste processing and new studies that address longstanding arguments about taste coding. PMID:28655883
ERIC Educational Resources Information Center
Morris, Suzanne E.
2010-01-01
This paper provides a review of institutional authorship policies as required by the "Australian Code for the Responsible Conduct of Research" (the "Code") (National Health and Medical Research Council (NHMRC), the Australian Research Council (ARC) & Universities Australia (UA) 2007), and assesses them for Code compliance.…
RNA G-quadruplexes: emerging mechanisms in disease
Cammas, Anne
2017-01-01
Abstract RNA G-quadruplexes (G4s) are formed by G-rich RNA sequences in protein-coding (mRNA) and non-coding (ncRNA) transcripts that fold into a four-stranded conformation. Experimental studies and bioinformatic predictions support the view that these structures are involved in different cellular functions associated with both DNA processes (telomere elongation, recombination and transcription) and RNA post-transcriptional mechanisms (including pre-mRNA processing, mRNA turnover, targeting and translation). An increasing number of different diseases have been associated with the inappropriate regulation of RNA G4s, exemplifying the potential importance of these structures for human health. Here, we review the different molecular mechanisms underlying the link between RNA G4s and human diseases by proposing several overlapping models of deregulation emerging from recent research, including (i) sequestration of RNA-binding proteins, (ii) aberrant expression or localization of RNA G4-binding proteins, (iii) repeat associated non-AUG (RAN) translation, (iv) mRNA translational blockade and (v) disabling of protein–RNA G4 complexes. This review also provides a comprehensive survey of functional RNA G4s and their mechanisms of action. Finally, we highlight future directions for research aimed at improving our understanding of RNA G4-mediated regulatory mechanisms linked to diseases. PMID:28013268
Optimizing the post-graduate institutional program evaluation process.
Lypson, Monica L; Prince, Mark E P; Kasten, Steven J; Osborne, Nicholas H; Cohan, Richard H; Kowalenko, Terry; Dougherty, Paul J; Reynolds, R Kevin; Spires, M Catherine; Kozlow, Jeffrey H; Gitlin, Scott D
2016-02-17
Reviewing program educational efforts is an important component of postgraduate medical education program accreditation. The post-graduate review process has evolved over time to include centralized oversight based on accreditation standards. The institutional review process and its impact on participating faculty are topics not well described in the literature. We conducted multiple Plan-Do-Study-Act (PDSA) cycles to identify and implement areas for change to improve productivity in our institutional program review committee. We also conducted one focus group and six in-person interviews with 18 committee members to explore their perspectives on the committee's evolution. One author (MLL), working with a PhD-level research associate, reviewed the transcripts, performed the initial thematic coding, and identified and categorized themes. These themes were confirmed by all participating committee members upon review of a detailed summary. Emergent themes were triangulated with the University of Michigan Medical School's Admissions Executive Committee (AEC). We present an overview of new practices adopted in the educational program evaluation process at the University of Michigan Health System, including standardization of meetings, inclusion of resident members, development of area content experts, solicitation of committed committee members, transition from paper to electronic committee materials, and a focus on continuous improvement. Faculty and resident committee members identified multiple areas of improvement, including the ability to provide high quality reviews of training programs, personal and professional development, and improved feedback from program trainees. A standing committee that utilizes the expertise of a group of committed faculty members and includes formal resident membership has significant advantages over ad hoc or other organizational structures for program evaluation committees.
Turbulence Modeling: Progress and Future Outlook
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.; Huang, George P.
1996-01-01
Progress in the development of the hierarchy of turbulence models for Reynolds-averaged Navier-Stokes codes used in aerodynamic applications is reviewed. Steady progress is demonstrated, but transfer of the modeling technology has not kept pace with the development and demands of the computational fluid dynamics (CFD) tools. An examination of the process of model development leads to recommendations for a mid-course correction involving close coordination between modelers, CFD developers, and application engineers. In instances where the old process is changed and cooperation enhanced, timely transfer is realized. A turbulence modeling information database is proposed to refine the process and open it to greater participation among modeling and CFD practitioners.
Research on pre-processing of QR Code
NASA Astrophysics Data System (ADS)
Sun, Haixing; Xia, Haojie; Dong, Ning
2013-10-01
The QR code (Quick Response Code) encodes many kinds of information because of its advantages: large storage capacity, high reliability, ultra-high-speed reading, small printing size, and efficient representation of Chinese characters. In order to obtain a clearer binarized image from a complex background and improve the recognition rate of QR codes, this paper investigates pre-processing methods for QR codes and presents algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by modifying Sauvola's adaptive thresholding method. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
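A minimal sketch of the Sauvola-style local thresholding that such pre-processing builds on (window size and the parameters k and R are illustrative defaults, not the paper's tuned values):

import numpy as np
from scipy.ndimage import uniform_filter

def sauvola_binarize(gray, window=15, k=0.2, R=128.0):
    """Binarize with the Sauvola threshold T = m * (1 + k * (s / R - 1)),
    where m and s are the local mean and standard deviation."""
    gray = gray.astype(np.float64)
    mean = uniform_filter(gray, size=window)
    mean_sq = uniform_filter(gray * gray, size=window)
    std = np.sqrt(np.maximum(mean_sq - mean**2, 0.0))
    threshold = mean * (1.0 + k * (std / R - 1.0))
    # Pixels above the local threshold become white (background), the rest black.
    return (gray > threshold).astype(np.uint8) * 255

# Usage: binary = sauvola_binarize(gray_image), followed by locating the
# finder patterns, geometric correction, and decoding.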
Ethics committees in New Zealand.
Gillett, Grant; Douglass, Alison
2012-12-01
The ethical review of research in New Zealand after the Cartwright Report of 1988 produced a major change in safeguards for and empowerment of participants in health care research. Several reforms since then have streamlined some processes but also seriously weakened some of the existing safeguards. The latest reforms, against the advice of various ethics bodies and the New Zealand Law Society, further reduced and attenuated the role of ethics committees, so that New Zealand has moved from being a world leader in ethical review processes to there being serious doubt about whether it conforms to international conventions and codes. The latest round of reforms, seemingly driven by narrow economic aspirations, anecdote and innuendo, occurred without any clear evidence of dysfunction in the system or any plans for the resourcing required to improve the quality of ethical review or to audit the process. It is of serious concern to both ethicists and medical lawyers in New Zealand that such hasty and poorly researched changes, which threaten the hard-won gains of the Cartwright reforms, should have been made.
Experiences of living with motor neurone disease: a review of qualitative research.
Sakellariou, Dikaios; Boniface, Gail; Brown, Paul
2013-10-01
This review sought to answer the question "what is known about people's experiences of living with motor neurone disease (MND)?". The review followed the guidelines of the Centre for Reviews and Dissemination. Twenty articles met the inclusion criteria and their results were analysed thematically. Data were managed and coded using the software package NVIVO and the analysis was performed in two stages, with the first stage aiming to develop descriptive themes offering an overview of the included data. During the second stage, analytical themes were developed with the explicit aim of answering the review question. The themes that emerged point to the following: (a) people with MND develop experiential knowledge that helps them to live with the disease and (b) while people with MND believe they do not have any control over the disease, they try to have control over their lives through active choices, e.g. how and when to use adaptive equipment. This review highlights the decision-making and knowledge-generating processes used by people with MND. Further research is required to explore these processes and their implications for the care of people with MND. The decision-making process of MND patients regarding their care is complex and takes into account the social as well as the medical elements of the disease. Exploring the practical knowledge that patients develop can offer insights into appropriate care for MND patients.
Gallacher, Katie; Jani, Bhautesh; Morrison, Deborah; Macdonald, Sara; Blane, David; Erwin, Patricia; May, Carl R; Montori, Victor M; Eton, David T; Smith, Fiona; Batty, G David; Batty, David G; Mair, Frances S
2013-01-28
Treatment burden can be defined as the self-care practices that patients with chronic illness must perform to respond to the requirements of their healthcare providers, as well as the impact that these practices have on patient functioning and well being. Increasing levels of treatment burden may lead to suboptimal adherence and negative outcomes. Systematic review of the qualitative literature is a useful method for exploring the patient experience of care, in this case the experience of treatment burden. There is no consensus on methods for qualitative systematic review. This paper describes the methodology used for qualitative systematic reviews of the treatment burdens identified in three different common chronic conditions, using stroke as our exemplar. Qualitative studies in peer reviewed journals seeking to understand the patient experience of stroke management were sought. Searches were limited to English-language papers published from 2000 onwards. An exhaustive search strategy was employed, consisting of a scoping search, database searches (Scopus, CINAHL, Embase, Medline & PsycINFO) and reference, footnote and citation searching. Papers were screened, data extracted, quality appraised and analysed by two individuals, with a third party for disagreements. Data analysis was carried out using a coding framework underpinned by Normalization Process Theory (NPT). A total of 4364 papers were identified, of which 54 were included in the review. Of these, 51 (94%) were retrieved from our database search. Methodological issues included: creating an appropriate search strategy; investigating a topic not previously conceptualised; sorting through irrelevant data within papers; the quality appraisal of qualitative research; and the use of NPT as a novel method of data analysis, shown to be a useful method for the purposes of this review. The creation of our search strategy may be of particular interest to other researchers carrying out synthesis of qualitative studies. Importantly, the successful use of NPT to inform a coding frame for the analysis of qualitative data describing processes relating to self-management highlights the potential of a new method for analyses of qualitative data within systematic reviews.
Advanced technology development for image gathering, coding, and processing
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.
1990-01-01
Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.
Mertz, Marcel; Sofaer, Neema; Strech, Daniel
2014-09-27
The systematic review of reasons (SRR) is a new way to obtain comprehensive information about specific ethical topics. One such review was carried out for the question of why post-trial access to trial drugs should or need not be provided. The objective of this study was to empirically validate this review using an author check method. The article also reports on methodological challenges faced by our study. We emailed a questionnaire to the 64 corresponding authors of the papers that were assessed in the review of reasons on post-trial access. The questionnaire consisted of all quotations ("reason mentions") that were identified by the review as representing a reason in a given author's publication, together with a set of codings for the quotations. The authors were asked to rate the correctness of the codings. We received 19 responses, of which only 13 were completed questionnaires. In total, 98 quotations and their related codes in the 13 questionnaires were checked by the addressees. For 77 quotations (79%), all codings were deemed correct; for 21 quotations (21%), some codings were deemed to need correction. Most corrections were minor and did not imply a complete misunderstanding of the citation. This first attempt to validate a review of reasons leads to four crucial methodological questions relevant to the future conduct of such validation studies: 1) How can a description of a reason be deemed incorrect? 2) Do the limited findings of this author check study enable us to determine whether the core results of the analysed SRR are valid? 3) Why did the majority of surveyed authors refrain from commenting on our understanding of their reasoning? 4) How can the method for validating reviews of reasons be improved?
Batch Model for Batched Timestamps Data Analysis with Application to the SSA Disability Program
Yue, Qingqi; Yuan, Ao; Che, Xuan; Huynh, Minh; Zhou, Chunxiao
2016-01-01
The Office of Disability Adjudication and Review (ODAR) is responsible for holding hearings, issuing decisions, and reviewing appeals as part of the Social Security Administration's disability determination process. In order to control and process cases, ODAR established a Case Processing and Management System (CPMS) in December 2003 to record management information. The CPMS provides a detailed case status history for each case. Due to the large number of appeal requests and limited resources, the number of pending claims at ODAR was over one million cases by March 31, 2015. Our National Institutes of Health (NIH) team collaborated with SSA and developed a Case Status Change Model (CSCM) project to meet ODAR's urgent need to reduce backlogs and improve the hearings and appeals process. One of the key issues in our CSCM project is to estimate the expected service time and its variation for each case status code. The challenge is that the system's recorded job departure times may not be the true job completion times. As the CPMS timestamp data for case status codes showed apparent batch patterns, we proposed a batch model and applied the constrained least squares method to estimate the mean service times and the variances. We also proposed a batch search algorithm to determine the optimal batch partition, as no batch partition was given in the real data. Simulation studies were conducted to evaluate the performance of the proposed methods. Finally, we applied the method to analyze real CPMS data from ODAR/SSA. PMID:27747132
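The constrained least squares step can be illustrated on synthetic data (a toy setup, not the CPMS schema): each batch's recorded elapsed time is modelled as a weighted sum of the unknown mean service times of the status codes it contains, and the means are estimated under a nonnegativity constraint:

import numpy as np
from scipy.optimize import nnls

# A[i, j] = number of jobs with status code j completed in batch i.
A = np.array([[3, 1, 0],
              [1, 2, 2],
              [0, 4, 1],
              [2, 0, 3]], dtype=float)

true_means = np.array([5.0, 2.0, 8.0])             # hours, simulation only
rng = np.random.default_rng(0)
y = A @ true_means + rng.normal(0.0, 0.5, size=4)  # observed batch times

est_means, _ = nnls(A, y)   # least squares subject to est_means >= 0
print(est_means)            # close to [5, 2, 8]

The batch search algorithm in the paper additionally chooses the partition of timestamps into batches; here the partition is taken as given.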
Validation of ICD-9-CM coding algorithm for improved identification of hypoglycemia visits.
Ginde, Adit A; Blanc, Phillip G; Lieberman, Rebecca M; Camargo, Carlos A
2008-04-01
Accurate identification of hypoglycemia cases by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes will help to describe epidemiology, monitor trends, and propose interventions for this important complication in patients with diabetes. Prior hypoglycemia studies utilized incomplete search strategies and may be methodologically flawed. We sought to validate a new ICD-9-CM coding algorithm for accurate identification of hypoglycemia visits. This was a multicenter, retrospective cohort study using a structured medical record review at three academic emergency departments from July 1, 2005 to June 30, 2006. We prospectively derived a coding algorithm to identify hypoglycemia visits using ICD-9-CM codes (250.3, 250.8, 251.0, 251.1, 251.2, 270.3, 775.0, 775.6, and 962.3). We reviewed charts to confirm hypoglycemia among cases identified by the candidate ICD-9-CM codes during the study period. The case definition for hypoglycemia was a documented blood glucose of 3.9 mmol/l or lower, or an emergency physician-charted diagnosis of hypoglycemia. We evaluated individual components and calculated the positive predictive value. We reviewed 636 charts identified by the candidate ICD-9-CM codes and confirmed 436 (64%) cases of hypoglycemia by chart review. Diabetes with other specified manifestations (250.8), often excluded in prior hypoglycemia analyses, identified 83% of hypoglycemia visits, and unspecified hypoglycemia (251.2) identified 13% of hypoglycemia visits. The absence of any predetermined co-diagnosis codes improved the positive predictive value of code 250.8 from 62% to 92%, while excluding only 10 (2%) true hypoglycemia visits. Although prior analyses included only the first-listed ICD-9 code, more than one-quarter of identified hypoglycemia visits were outside this primary diagnosis field. Overall, the proposed algorithm had an 89% positive predictive value (95% confidence interval, 86-92) for detecting hypoglycemia visits. The proposed algorithm improves on prior strategies to identify hypoglycemia visits in administrative data sets and will enhance the ability to study the epidemiology of, and design interventions for, this important complication of diabetes care.
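The headline statistic is straightforward to reproduce from the confusion counts; the function below uses a normal-approximation confidence interval, and the counts are hypothetical placeholders rather than the study's data:

import math

def ppv_with_ci(confirmed, flagged, z=1.96):
    """Positive predictive value with a normal-approximation 95% CI."""
    p = confirmed / flagged
    se = math.sqrt(p * (1.0 - p) / flagged)
    return p, (p - z * se, p + z * se)

# Hypothetical counts: 89 chart-confirmed cases among 100 algorithm-flagged visits.
ppv, (lo, hi) = ppv_with_ci(confirmed=89, flagged=100)
print(f"PPV = {ppv:.0%} (95% CI {lo:.0%}-{hi:.0%})")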
Reformation of Regulatory Technical Standards for Nuclear Power Generation Equipments in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikio Kurihara; Masahiro Aoki; Yu Maruyama
2006-07-01
Comprehensive reformation of the regulatory system has been introduced in Japan in order to apply recent technical progress in a timely manner. 'The Technical Standards for Nuclear Power Generation Equipments', known as Ordinance No. 62 of the Ministry of International Trade and Industry, which is used for the detailed design, construction and operating stages of Nuclear Power Plants, was modified to performance specifications, with consensus codes and standards being used as prescriptive specifications, in order to facilitate prompt review of the Ordinance in response to technological innovation. The activities on modification were performed by the Nuclear and Industrial Safety Agency (NISA), the regulatory body in Japan, with the support of the Japan Nuclear Energy Safety Organization (JNES), a technical support organization. The revised Ordinance No. 62 was issued on July 1, 2005 and has been enforced from January 1, 2006. During the period from issuance to enforcement, JNES prepared an enforceable regulatory guide that complies with each provision of the Ordinance No. 62, and also made technical assessments to endorse the applicability of consensus codes and standards, in response to NISA's request. Some consensus codes and standards were re-assessed since they were already used in regulatory review of the construction plans submitted by licensees. Other consensus codes and standards were newly assessed for endorsement. In cases where proper consensus codes or standards were not available, details of regulatory requirements were described in the regulatory guide as immediate measures. At the same time, appropriate standards developing bodies were requested to prepare those consensus codes or standards. A supplementary note providing background information on the modification, applicable examples, etc. was prepared for the convenience of users of the Ordinance No. 62. This paper presents the activities on modification and their results, following NISA's presentation at ICONE-13, which introduced the framework of the performance specifications and the modification process of the Ordinance No. 62. (authors)
Challenges in Coding Adverse Events in Clinical Trials: A Systematic Review
Schroll, Jeppe Bennekou; Maund, Emma; Gøtzsche, Peter C.
2012-01-01
Background Misclassification of adverse events in clinical trials can sometimes have serious consequences. Therefore, each of the many steps involved, from a patient's adverse experience to presentation in tables in publications, should be as standardised as possible, minimising the scope for interpretation. Adverse events are categorised by a predefined dictionary, e.g. MedDRA, which is updated biannually with many new categories. The objective of this paper is to study interobserver variation and other challenges of coding. Methods Systematic review using PRISMA. We searched PubMed, EMBASE and The Cochrane Library. All studies were screened for eligibility by two authors. Results Our search returned 520 unique studies of which 12 were included. Only one study investigated interobserver variation. It reported that 12% of the codes were evaluated differently by two coders. Independent physicians found that 8% of all the codes deviated from the original description. Other studies found that product summaries could be greatly affected by the choice of dictionary. With the introduction of MedDRA, it seems to have become harder to identify adverse events statistically because each code is divided in subgroups. To account for this, lumping techniques have been developed but are rarely used, and guidance on when to use them is vague. An additional challenge is that adverse events are censored if they already occurred in the run-in period of a trial. As there are more than 26 ways of determining whether an event has already occurred, this can lead to bias, particularly because data analysis is rarely performed blindly. Conclusion There is a lack of evidence that coding of adverse events is a reliable, unbiased and reproducible process. The increase in categories has made detecting adverse events harder, potentially compromising safety. It is crucial that readers of medical publications are aware of these challenges. Comprehensive interobserver studies are needed. PMID:22911755
Keeping abreast with long non-coding RNAs in mammary gland development and breast cancer
Hansji, Herah; Leung, Euphemia Y.; Baguley, Bruce C.; Finlay, Graeme J.; Askarian-Amiri, Marjan E.
2014-01-01
The majority of the human genome is transcribed, even though only 2% of transcripts encode proteins. Non-coding transcripts were originally dismissed as evolutionary junk or transcriptional noise, but with the development of whole genome technologies, these non-coding RNAs (ncRNAs) are emerging as molecules with vital roles in regulating gene expression. While shorter ncRNAs have been extensively studied, the functional roles of long ncRNAs (lncRNAs) are still being elucidated. Studies over the last decade show that lncRNAs are emerging as new players in a number of diseases including cancer. Potential roles in both oncogenic and tumor suppressive pathways in cancer have been elucidated, but the biological functions of the majority of lncRNAs remain to be identified. Accumulated data are identifying the molecular mechanisms by which lncRNA mediates both structural and functional roles. LncRNA can regulate gene expression at both transcriptional and post-transcriptional levels, including splicing and regulating mRNA processing, transport, and translation. Much current research is aimed at elucidating the function of lncRNAs in breast cancer and mammary gland development, and at identifying the cellular processes influenced by lncRNAs. In this paper we review current knowledge of lncRNAs contributing to these processes and present lncRNA as a new paradigm in breast cancer development. PMID:25400658
microRNAs Databases: Developmental Methodologies, Structural and Functional Annotations.
Singh, Nagendra Kumar
2017-09-01
microRNA (miRNA) is an endogenous, evolutionarily conserved non-coding RNA involved in post-transcriptional gene repression and mRNA cleavage through formation of the RNA-induced silencing complex (RISC). In the RISC, miRNA base-pairs with its target mRNA together with the Argonaute protein complex, causing gene repression or endonucleolytic cleavage of the mRNA; dysregulation of this process contributes to many diseases and syndromes. Following the discovery of the miRNAs lin-4 and let-7, large numbers of miRNAs acting in various biological and metabolic processes were discovered by low-throughput and high-throughput experimental techniques along with computational approaches. miRNAs are important non-coding RNAs for understanding the complex biological phenomena of organisms because they control gene regulation. This paper reviews miRNA databases, with structural and functional annotations, developed by various researchers. These databases contain structural and functional information on animal, plant and virus miRNAs, including miRNA-associated diseases, stress resistance in plants, the biological processes in which miRNAs participate, the effects of miRNA interactions with drugs and the environment, the effects of variants on miRNAs, miRNA gene expression analyses, and miRNA sequences and structures. This review focuses on the developmental methodology of miRNA databases, such as the computational tools and methods used to extract miRNA annotations from different resources or through experiment. It also discusses the user interface design of each database along with its current entries and miRNA annotations (pathways, gene ontology, disease ontology, etc.). An integrated schematic diagram of the database construction process is also drawn, along with tabular and graphical comparisons of the various types of entries in different databases. The aim of this paper is to present the importance of miRNA-related resources in a single place.
Short-term synaptic plasticity and heterogeneity in neural systems
NASA Astrophysics Data System (ADS)
Mejias, J. F.; Kappen, H. J.; Longtin, A.; Torres, J. J.
2013-01-01
We review some recent results on neural dynamics and information processing which arise when considering several biophysical factors of interest, in particular, short-term synaptic plasticity and neural heterogeneity. The inclusion of short-term synaptic plasticity leads to enhanced long-term memory capacities, a higher robustness of memory to noise, and irregularity in the duration of the so-called up cortical states. On the other hand, considering some level of neural heterogeneity in neuron models allows neural systems to optimize information transmission in rate coding and temporal coding, two strategies commonly used by neurons to codify information in many brain areas. In all these studies, analytical approximations can be made to explain the underlying dynamics of these neural systems.
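The entry above does not state which synapse model underlies these results; a common formalism for short-term depression and facilitation in this literature is the Tsodyks-Markram model, sketched here event-by-event with illustrative parameter values only:

```python
import math

def tm_synapse(spike_times, U=0.2, tau_d=0.2, tau_f=0.6):
    """Event-driven Tsodyks-Markram synapse: returns the relative PSC
    amplitude u*x at each presynaptic spike. U, tau_d (depression) and
    tau_f (facilitation) are illustrative values, in seconds."""
    u, x, last_t = U, 1.0, None
    amps = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_d)  # resources recover toward 1
            u = U + (u - U) * math.exp(-dt / tau_f)      # facilitation decays toward U
        u = u + U * (1.0 - u)   # facilitation jump on each spike
        amps.append(u * x)      # relative PSC amplitude at this spike
        x = x * (1.0 - u)       # resources consumed by release
        last_t = t
    return amps

# A 20 Hz burst followed by a late spike: depression within the burst,
# partial recovery afterwards.
print(tm_synapse([0.0, 0.05, 0.10, 0.15, 0.5]))
```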
Do humans make good decisions?
Summerfield, Christopher; Tsetsos, Konstantinos
2014-01-01
Human performance on perceptual classification tasks approaches that of an ideal observer, but economic decisions are often inconsistent and intransitive, with preferences reversing according to the local context. We discuss the view that suboptimal choices may result from the efficient coding of decision-relevant information, a strategy that allows expected inputs to be processed with higher gain than unexpected inputs. Efficient coding leads to ‘robust’ decisions that depart from optimality but maximise the information transmitted by a limited-capacity system in a rapidly-changing world. We review recent work showing that when perceptual environments are variable or volatile, perceptual decisions exhibit the same suboptimal context-dependence as economic choices, and propose a general computational framework that accounts for findings across the two domains. PMID:25488076
Software engineering for ESO's VLT project
NASA Astrophysics Data System (ADS)
Filippi, G.
1994-12-01
This paper reports on the experience at the European Southern Observatory on the application of software engineering techniques to a 200 man-year control software project for the Very Large Telescope (VLT). This shall provide astronomers, before the end of the century, with one of the most powerful telescopes in the world. From the definition of the general model, described in the software management plan, specific activities have been and will be defined: standards for documents and for code development, design approach using a CASE tool, the process of reviewing both documentation and code, quality assurance, test strategy, etc. The initial choices, the current implementation and the future planned activities are presented and, where feedback is already available, pros and cons are discussed.
Canonical microcircuits for predictive coding
Bastos, Andre M.; Usrey, W. Martin; Adams, Rick A.; Mangun, George R.; Fries, Pascal; Friston, Karl J.
2013-01-01
Summary This review considers the influential notion of a canonical (cortical) microcircuit in light of recent theories about neuronal processing. Specifically, we conciliate quantitative studies of microcircuitry and the functional logic of neuronal computations. We revisit the established idea that message passing among hierarchical cortical areas implements a form of Bayesian inference – paying careful attention to the implications for intrinsic connections among neuronal populations. By deriving canonical forms for these computations, one can associate specific neuronal populations with specific computational roles. This analysis discloses a remarkable correspondence between the microcircuitry of the cortical column and the connectivity implied by predictive coding. Furthermore, it provides some intuitive insights into the functional asymmetries between feedforward and feedback connections and the characteristic frequencies over which they operate. PMID:23177956
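As an intuition pump only, not the circuit-level model of the paper: prediction-error message passing can be reduced to a single-layer linear sketch in which error units compute e = x - Wv and state units integrate the error until predictions match the input. The generative weights W are assumed known here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear generative model: x = W @ v + noise. Inference refines the
# cause estimate v by gradient descent on squared prediction error, i.e.
# error units compute e = x - W @ v and state units integrate W.T @ e.
W = rng.normal(size=(8, 3))                   # assumed-known generative weights
v_true = np.array([1.0, -0.5, 2.0])
x = W @ v_true + 0.01 * rng.normal(size=8)    # sensory input

v = np.zeros(3)                               # initial cause estimate
lr = 0.05                                     # integration rate
for _ in range(500):
    e = x - W @ v                             # prediction error (feedback comparison)
    v = v + lr * (W.T @ e)                    # ascending error drives state update

print(np.round(v, 2))                         # converges close to v_true
```

The paper's point is that the cortical column appears wired to carry exactly these two message types, predictions descending and errors ascending, between distinct populations.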
Summary of papers on current and anticipated uses of thermal-hydraulic codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The author reviews a range of recent papers which discuss possible uses and future development needs for thermal/hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted. They are: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low pressure transients; ensure that future code development includes assessment of code uncertainties as integral part of code verification and validation; provide extensive user guidelines or structure the code so that the 'user effect' is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices).
Accuracy of clinical coding from 1210 appendicectomies in a British district general hospital.
Bhangu, Aneel; Nepogodiev, Dmitri; Taylor, Caroline; Durkin, Natalie; Patel, Rajan
2012-01-01
The primary aim of this study was to assess the accuracy of clinical coding in identifying negative appendicectomies. The secondary aim was to analyse trends over time in rates of simple, complex (gangrenous or perforated) and negative appendicectomies. Retrospective review of 1210 patients undergoing emergency appendicectomy during a five-year period (2006-2010). Histopathology reports were taken as the gold standard for diagnosis and compared to clinical coding lists. Clinical coding is the process by which non-medical administrators apply standardised diagnostic codes to patients, based upon clinical notes at discharge. These codes then contribute to national databases. Statistical analysis included correlation studies and regression analyses. Clinical coding had only moderate correlation with histopathology, with an overall kappa of 0.421. Annual kappa values varied between 0.378 and 0.500. Overall, 14% of patients were incorrectly coded as having had appendicitis when in fact they had a histopathologically normal appendix (153/1107), whereas 4% were falsely coded as having received a negative appendicectomy when they had appendicitis (48/1107). There was an overall significant fall and then rise in the rate of simple appendicitis (B coefficient -0.239 (95% confidence interval -0.426, -0.051), p = 0.014) but no change in the rate of complex appendicitis (B coefficient 0.008 (-0.015, 0.031), p = 0.476). Clinical coding for negative appendicectomy was unreliable. Negative rates may be higher than suspected. This has implications for the validity of national database analyses. Using this form of data as a quality indicator for appendicitis should be reconsidered until its quality is improved. Copyright © 2012 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
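For readers unfamiliar with the kappa statistic this audit reports, a minimal sketch. The misclassified counts (153 and 48) come from the abstract, but the split of the remaining concordant cases across the two agreement cells is an assumption, so the resulting kappa is only close to, not identical with, the reported 0.421:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table:
    table[i][j] = cases rated category i by rater 1, category j by rater 2."""
    n = sum(sum(row) for row in table)
    k = len(table)
    p_obs = sum(table[i][i] for i in range(k)) / n
    row = [sum(table[i]) for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_exp = sum(row[i] * col[i] for i in range(k)) / n**2
    return (p_obs - p_exp) / (1 - p_exp)

# Rows: clinical coding (appendicitis yes/no); columns: histopathology.
# 153 and 48 are the abstract's misclassifications; 806/100 is an assumed
# split of the 906 concordant cases out of 1107 patients.
table = [[806, 153],
         [48, 100]]
print(round(cohens_kappa(table), 3))  # ~0.40, near the reported 0.421
```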
Imran, Noreen; Seet, Boon-Chong; Fong, A C M
2015-01-01
Distributed video coding (DVC) is a relatively new video coding architecture originated from two fundamental theorems namely, Slepian-Wolf and Wyner-Ziv. Recent research developments have made DVC attractive for applications in the emerging domain of wireless video sensor networks (WVSNs). This paper reviews the state-of-the-art DVC architectures with a focus on understanding their opportunities and gaps in addressing the operational requirements and application needs of WVSNs.
Baldo, Brian A; Kelley, Ann E
2007-04-01
The idea that nucleus accumbens (Acb) dopamine transmission contributes to the neural mediation of reward, at least in a general sense, has achieved wide acceptance. Nevertheless, debate remains over the precise nature of dopamine's role in reward and even over the nature of reward itself. In the present article, evidence is reviewed from studies of food intake, feeding microstructure, instrumental responding for food reinforcement, and dopamine efflux associated with feeding, which suggests that reward processing in the Acb is best understood as an interaction among distinct processes coded by discrete neurotransmitter systems. In agreement with several theories of Acb dopamine function, it is proposed here that allocation of motor effort in seeking food or food-associated conditioned stimuli can be dissociated from computations relevant to the hedonic evaluation of food during the consummatory act. The former appears to depend upon Acb dopamine transmission and the latter upon striatal opioid peptide release. Moreover, dopamine transmission may play a role in 'stamping in' associations between motor acts and goal attainment and perhaps also neural representations corresponding to rewarding outcomes. Finally, evidence is reviewed that amino acid transmission specifically in the Acb shell acts as a central 'circuit breaker' to flexibly enable or terminate the consummatory act, via descending connections to hypothalamic feeding control systems. The heuristic framework outlined above may help explain why dopamine-compromising manipulations that strongly diminish instrumental goal-seeking behaviors leave consummatory activity relatively unaffected.
Radiation Transport Tools for Space Applications: A Review
NASA Technical Reports Server (NTRS)
Jun, Insoo; Evans, Robin; Cherng, Michael; Kang, Shawn
2008-01-01
This slide presentation contains a brief discussion of nuclear transport codes widely used in the space radiation community for shielding and scientific analyses. Seven radiation transport codes are addressed. The two general methods (i.e., the Monte Carlo method and the deterministic method) are briefly reviewed.
System theory in industrial patient monitoring: an overview.
Baura, G D
2004-01-01
Patient monitoring refers to the continuous observation of repeating events of physiologic function to guide therapy or to monitor the effectiveness of interventions, and is used primarily in the intensive care unit and operating room. Commonly processed signals are the electrocardiogram, intraarterial blood pressure, arterial saturation of oxygen, and cardiac output. To this day, the majority of physiologic waveform processing in patient monitors is conducted using heuristic curve fitting. However in the early 1990s, a few enterprising engineers and physicians began using system theory to improve their core processing. Applications included improvement of signal-to-noise ratio, either due to low signal levels or motion artifact, and improvement in feature detection. The goal of this mini-symposium is to review the early work in this emerging field, which has led to technologic breakthroughs. In this overview talk, the process of system theory algorithm research and development is discussed. Research for industrial monitors involves substantial data collection, with some data used for algorithm training and the remainder used for validation. Once the algorithms are validated, they are translated into detailed specifications. Development then translates these specifications into DSP code. The DSP code is verified and validated per the Good Manufacturing Practices mandated by FDA.
Selection platforms for directed evolution in synthetic biology
Tizei, Pedro A.G.; Csibra, Eszter; Torres, Leticia; Pinheiro, Vitor B.
2016-01-01
Life on Earth is incredibly diverse. Yet, underneath that diversity, there are a number of constants and highly conserved processes: all life is based on DNA and RNA; the genetic code is universal; biology is limited to a small subset of potential chemistries. A vast amount of knowledge has been accrued through describing and characterizing enzymes, biological processes and organisms. Nevertheless, much remains to be understood about the natural world. One of the goals in Synthetic Biology is to recapitulate biological complexity from simple systems made from biological molecules–gaining a deeper understanding of life in the process. Directed evolution is a powerful tool in Synthetic Biology, able to bypass gaps in knowledge and capable of engineering even the most highly conserved biological processes. It encompasses a range of methodologies to create variation in a population and to select individual variants with the desired function–be it a ligand, enzyme, pathway or even whole organisms. Here, we present some of the basic frameworks that underpin all evolution platforms and review some of the recent contributions from directed evolution to synthetic biology, in particular methods that have been used to engineer the Central Dogma and the genetic code. PMID:27528765
Coding pulmonary sepsis and mortality statistics in Rio de Janeiro, RJ.
Cardoso, Bruno Baptista; Kale, Pauline Lorena
2016-01-01
This study aimed to describe "pulmonary sepsis" reported as a cause of death, to measure its association with pneumonia, and to assess the significance for mortality statistics of coding rules that include the diagnosis of pneumonia on death certificates (DCs) mentioning pulmonary sepsis, in Rio de Janeiro, Brazil, in 2011. DCs with mention of pulmonary sepsis were identified, regardless of the underlying cause of death. Medical records related to the certificates mentioning "pulmonary sepsis" were reviewed and physicians were interviewed to measure the association between pulmonary sepsis and pneumonia. A simulation was performed on the mortality data by inserting the International Classification of Diseases (ICD-10) code for pneumonia into the certificates with pulmonary sepsis. "Pulmonary sepsis" constituted 30.9% of reported sepsis, and pneumonia was not reported on 51.3% of these DCs. Pneumonia was registered in 82.8% of the sampled medical records. Among the physicians interviewed, 93.3% declared pneumonia to be the most common cause of "pulmonary sepsis." The simulation of the coding process resulted in a different underlying cause of death for 7.8% of the deaths with sepsis reported and 2.4% of all deaths, regardless of the original cause. The conclusion is that "pulmonary sepsis" is frequently associated with pneumonia and that the addition of the ICD-10 code for pneumonia to DCs could affect mortality statistics, highlighting the need to improve mortality coding rules.
A review and empirical study of the composite scales of the Das–Naglieri cognitive assessment system
McCrea, Simon M
2009-01-01
Alexander Luria’s model of the working brain consisting of three functional units was formulated through the examination of hundreds of focal brain-injury patients. Several psychometric instruments based on Luria’s syndrome analysis and accompanying qualitative tasks have been developed since the 1970s. In the mid-1970s, JP Das and colleagues defined a specific cognitive processes model based directly on Luria’s two coding units termed simultaneous and successive by studying diverse cross-cultural, ability, and socioeconomic strata. The cognitive assessment system is based on the PASS model of cognitive processes and consists of four composite scales of Planning–Attention–Simultaneous–Successive (PASS) devised by Naglieri and Das in 1997. Das and colleagues developed the two new scales of planning and attention to more closely model Luria’s theory of higher cortical functions. In this paper a theoretical review of Luria’s theory, Das and colleagues elaboration of Luria’s model, and the neural correlates of PASS composite scales based on extant studies is summarized. A brief empirical study of the neuropsychological specificity of the PASS composite scales in a sample of 33 focal cortical stroke patients using cluster analysis is then discussed. Planning and simultaneous were sensitive to right hemisphere lesions. These findings were integrated with recent functional neuroimaging studies of PASS scales. In sum it was found that simultaneous is strongly dependent on dual bilateral occipitoparietal interhemispheric coordination whereas successive demonstrated left frontotemporal specificity with some evidence of interhemispheric coordination across the prefrontal cortex. Hence, support for the validity of the PASS composite scales was found as well as for the axiom of the independence of code content from code type originally specified in 1994 by Das, Naglieri, and Kirby. PMID:22110322
Authorship Attribution of Source Code
ERIC Educational Resources Information Center
Tennyson, Matthew F.
2013-01-01
Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…
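The truncated entry does not name a specific method; a common baseline in this literature is character n-gram features feeding a simple classifier, since n-grams capture layout and naming habits. A sketch on toy data (the corpus, authors, and choice of classifier are all assumptions for illustration):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy corpus: source fragments with known authors. Character n-grams pick
# up stylistic habits such as spacing, brace style, and identifier shapes.
sources = ["for(int i=0;i<n;i++){sum+=a[i];}",
           "for (int i = 0; i < n; i++) {\n    sum += a[i];\n}",
           "while(k--) total+=buf[k];",
           "while (k-- > 0) {\n    total += buf[k];\n}"]
authors = ["alice", "bob", "alice", "bob"]

clf = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
    MultinomialNB())
clf.fit(sources, authors)
print(clf.predict(["if(x>0){y+=x;}"]))  # dense style -> likely "alice"
```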
One Speaker, Two Languages. Cross-Disciplinary Perspectives on Code-Switching.
ERIC Educational Resources Information Center
Milroy, Lesley, Ed.; Muysken, Pieter, Ed.
Fifteen articles review code-switching in the four major areas: policy implications in specific institutional and community settings; perspectives of social theory of code-switching as a form of speech behavior in particular social contexts; the grammatical analysis of code-switching, including factors that constrain switching even within a…
Bellis, Jennifer R; Kirkham, Jamie J; Nunn, Anthony J; Pirmohamed, Munir
2014-12-17
National Health Service (NHS) hospitals in the UK use a system of coding for patient episodes. The coding system used is the International Classification of Disease (ICD-10). There are ICD-10 codes which may be associated with adverse drug reactions (ADRs) and there is a possibility of using these codes for ADR surveillance. This study aimed to determine whether ADRs prospectively identified in children admitted to a paediatric hospital were coded appropriately using ICD-10. The electronic admission abstract for each patient with at least one ADR was reviewed. A record was made of whether the ADR(s) had been coded using ICD-10. Of 241 ADRs, 76 (31.5%) were coded using at least one ICD-10 ADR code. Of the oncology ADRs, 70/115 (61%) were coded using an ICD-10 ADR code compared with 6/126 (4.8%) non-oncology ADRs (difference in proportions 56%, 95% CI 46.2% to 65.8%; p < 0.001). The majority of ADRs detected in a prospective study at a paediatric centre would not have been identified if the study had relied on ICD-10 codes as a single means of detection. Data derived from administrative healthcare databases are not reliable for identifying ADRs by themselves, but may complement other methods of detection.
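The reported 56% difference and its confidence interval follow from a standard normal-approximation calculation on the abstract's own counts (70/115 oncology vs. 6/126 non-oncology); a sketch:

```python
import math

def diff_proportions_ci(x1, n1, x2, n2, z=1.96):
    """Normal-approximation 95% CI for the difference p1 - p2."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p1 - p2
    return d, d - z * se, d + z * se

# Oncology ADRs coded: 70/115; non-oncology ADRs coded: 6/126 (from the abstract).
d, lo, hi = diff_proportions_ci(70, 115, 6, 126)
print(f"difference = {d:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
# ~56% with an interval close to the reported (46.2%, 65.8%)
```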
Processing module operating methods, processing modules, and communications systems
McCown, Steven Harvey; Derr, Kurt W.; Moore, Troy
2014-09-09
A processing module operating method includes using a processing module physically connected to a wireless communications device, requesting that the wireless communications device retrieve encrypted code from a web site and receiving the encrypted code from the wireless communications device. The wireless communications device is unable to decrypt the encrypted code. The method further includes using the processing module, decrypting the encrypted code, executing the decrypted code, and preventing the wireless communications device from accessing the decrypted code. Another processing module operating method includes using a processing module physically connected to a host device, executing an application within the processing module, allowing the application to exchange user interaction data communicated using a user interface of the host device with the host device, and allowing the application to use the host device as a communications device for exchanging information with a remote device distinct from the host device.
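As a loose illustration of the claimed flow, not the patented implementation: the essential property is that the decryption key exists only inside the processing module, so the host or wireless device only ever handles ciphertext. A sketch using the cryptography package's Fernet recipe, with the fetch step stubbed out:

```python
from cryptography.fernet import Fernet

# The key lives only inside the processing module; the host/wireless
# device ferries ciphertext it cannot decrypt.
module_key = Fernet.generate_key()
module_cipher = Fernet(module_key)

# Publisher side (in this toy setup, the module's key is shared with the
# publisher out-of-band).
code_blob = module_cipher.encrypt(b"print('running inside module')")

def host_fetch():
    """Stands in for the wireless device retrieving encrypted code from a
    web site; it only ever sees encrypted bytes."""
    return code_blob

# Inside the module: decrypt and execute; plaintext never leaves this scope.
plaintext = module_cipher.decrypt(host_fetch())
exec(plaintext)  # prints: running inside module
```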
Siew, Edward D; Basu, Rajit K; Wunsch, Hannah; Shaw, Andrew D; Goldstein, Stuart L; Ronco, Claudio; Kellum, John A; Bagshaw, Sean M
2016-01-01
The purpose of this review is to report how administrative data have been used to study AKI, identify current limitations, and suggest how these data sources might be enhanced to address knowledge gaps in the field. The objectives were: 1) to review the existing evidence base on how AKI is coded across administrative datasets; 2) to identify limitations, gaps in knowledge, and major barriers to scientific progress in AKI related to coding in administrative data; 3) to discuss how administrative data for AKI might be enhanced to enable "communication" and "translation" within and across administrative jurisdictions; and 4) to suggest how administrative databases might be configured to inform 'registry-based' pragmatic studies. Sources of information were a literature review of English-language articles identified through a PubMed search, focusing on studies that validated AKI coding in administrative data or used administrative data to describe the epidemiology of AKI, and the Acute Dialysis Quality Initiative (ADQI) Consensus Conference, September 6-7, 2015, Banff, Canada. The population of interest was hospitalized patients with AKI. The coding structure for AKI in many administrative datasets limits understanding of true disease burden (especially less severe AKI), its temporal trends, and clinical phenotyping. Important opportunities exist to improve the quality and coding of AKI data to better address critical knowledge gaps in AKI and improve care. A modified Delphi consensus-building process was used, in which a review of the literature and summary statements were developed through a series of alternating breakout and plenary sessions. Administrative codes for AKI are limited by poor sensitivity, lack of standardization to classify severity, and poor contextual phenotyping. These limitations are further hampered by reduced awareness of AKI among providers and the subjective nature of reporting. While an idealized definition of AKI may be difficult to implement, improving standardization of reporting by using laboratory-based definitions and providing complementary information on the context in which AKI occurs are possible. Administrative databases may also help enhance the conduct of, and inform, clinical or registry-based pragmatic studies. Data sources were largely restricted to North America and Europe. Administrative data are rapidly growing and evolving, and represent an unprecedented opportunity to address knowledge gaps in AKI. Progress will require continued efforts to improve awareness of the impact of AKI on public health, engage key stakeholders, and develop tangible strategies to reconfigure infrastructure to improve the reporting and phenotyping of AKI. WHY IS THIS REVIEW IMPORTANT?: Rapid growth in the size and availability of administrative data has enhanced the clinical study of acute kidney injury (AKI). However, significant limitations exist in coding that hinder our ability to better understand its epidemiology and address knowledge gaps. The following consensus-based review discusses how administrative data have been used to study AKI, identifies current limitations, and suggests how these data sources might be enhanced to improve the future study of this disease. WHAT ARE THE KEY MESSAGES?: The current coding structure of administrative data is hindered by a lack of sensitivity, a lack of standardization to properly classify severity, and limited clinical phenotyping. These limitations, combined with reduced awareness of AKI and the subjective nature of reporting, limit understanding of disease burden across settings and time periods. As administrative data become more sophisticated and complex, important opportunities exist to employ more objective criteria to diagnose and stage AKI, as well as to improve contextual phenotyping, which can help address knowledge gaps and improve care.
Stalfors, J; Enoksson, F; Hermansson, A; Hultcrantz, M; Robinson, Å; Stenfeldt, K; Groth, A
2013-04-01
To investigate the internal validity of the diagnosis code used at discharge after treatment of acute mastoiditis. Retrospective national re-evaluation study of patient records from 1993-2007, with comparison against the original ICD codes. All ENT departments at university hospitals and one large county hospital department in Sweden. A total of 1966 records were reviewed for patients with ICD codes for in-patient treatment of acute (529), chronic (44) and unspecified mastoiditis (21) and acute otitis media (1372). ICD codes were reviewed by the authors with a defined protocol for the clinical diagnosis of acute mastoiditis. Those not satisfying the diagnosis were given an alternative diagnosis. Of 529 records with ICD coding for acute mastoiditis, 397 (75%) were found to meet the definition of acute mastoiditis used in this study, while 18% were not diagnosed as having any type of mastoiditis after review. Review of the in-patients treated for acute otitis media identified an additional 60 cases fulfilling the definition of acute mastoiditis. Overdiagnosis was common, and many patients with a diagnostic code indicating acute mastoiditis had been treated for external otitis or otorrhoea with transmyringeal drainage. The internal validity of the diagnosis of acute mastoiditis depends on the use of standardised, well-defined criteria. Reliability of diagnosis is fundamental for the comparison of results from different studies. Inadequate reliability in the diagnosis of acute mastoiditis also affects calculations of incidence rates and statistical power and may affect the conclusions drawn from the results. © 2013 Blackwell Publishing Ltd.
Coding in Stroke and Other Cerebrovascular Diseases.
Korb, Pearce J; Jones, William
2017-02-01
Accurate coding is critical for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of coding principles for patients with strokes and other cerebrovascular diseases and includes an illustrative case as a review of coding principles in a patient with acute stroke.
Colorectal surgeons teaching general surgery residents: current challenges and opportunities.
Schmitz, Connie C; Chow, Christopher J; Rothenberger, David A
2012-09-01
Effective teaching for general surgery residents requires that faculty members with colorectal expertise actively engage in the education process and fully understand the current context for residency training. In this article, we review important national developments with respect to graduate medical education that impact resident supervision, curriculum implementation, resident assessment, and program evaluation. We argue that establishing a culture of respect and professionalism in today's teaching environment is one of the most important legacies that surgical educators can leave for the coming generation. Faculty role modeling and the process of socializing residents is highlighted. We review the American College of Surgeons' Code of Professional Conduct, summarize some of the current strategies for teaching and assessing professionalism, and reflect on principles of motivation that apply to resident training both for the trainee and the trainer.
Practice management education during surgical residency.
Jones, Kory; Lebron, Ricardo A; Mangram, Alicia; Dunn, Ernest
2008-12-01
Surgical education has undergone radical changes in the past decade. The introductions of laparoscopic surgery and endovascular techniques have required program directors to alter surgical training. The 6 competencies are now in place. One issue that still needs to be addressed is the business aspect of surgical practice. Often residents complete their training with minimal or no knowledge on coding of charges or basic aspects on how to set up a practice. We present our program, which has been in place over the past 2 years and is designed to teach the residents practice management. The program begins with a series of 10 lectures given monthly beginning in August. Topics include an introduction to types of practices available, negotiating a contract, managed care, and marketing the practice. Both medical and surgical residents attend these conferences. In addition, the surgical residents meet monthly with the business office to discuss billing and coding issues. These are didactic sessions combined with in-house chart reviews of surgical coding. The third phase of the practice management plan has the coding team along with the program director attend the outpatient clinic to review in real time the evaluation and management coding of clinic visits. Resident evaluations were completed for each of the practice management lectures. The responses were recorded on a Likert scale. The scores ranged from 4.1 to 4.8 (average, 4.3). Highest scores were given to lectures concerning negotiating employee agreements, recruiting contracts, malpractice insurance, and risk management. The medical education department has tracked resident coding compliance over the past 2 years. Surgical coding compliance increased from 36% to 88% over a 12-month period. The program director who participated in the educational process increased his accuracy from 50% to 90% over the same time period. When residents finish their surgical training they need to be ready to enter the world of business. These needs will be present whether pursuing a career in academic medicine or the private sector. A program that focuses on the business aspect of surgery enables the residents to better navigate the future while helping to fulfill the systems-based practice competency.
Methods for Coding Tobacco-Related Twitter Data: A Systematic Review.
Lienemann, Brianna A; Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai
2017-03-31
As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter's Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter's databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. ©Brianna A Lienemann, Jennifer B Unger, Tess Boley Cruz, Kar-Hai Chu. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 31.03.2017.
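Hand-coding, one of the two coding methods the review identifies, is often preceded by a keyword pre-pass over collected tweets; a toy sketch of such a pre-pass (the categories and patterns are illustrative assumptions, not the reviewed studies' codebooks):

```python
import re

# Toy rule-based first pass before hand-coding: tag each tweet with
# candidate categories; human coders then confirm or correct the labels.
CATEGORIES = {
    "e-cigarette": re.compile(r"\b(e-?cig|vape|vaping|juul)\b", re.I),
    "cessation":   re.compile(r"\b(quit|quitting|stopped smoking)\b", re.I),
    "promotion":   re.compile(r"\b(sale|discount|coupon|promo)\b", re.I),
}

def precode(tweet):
    """Return every candidate category whose pattern matches the tweet."""
    return [cat for cat, pat in CATEGORIES.items() if pat.search(tweet)] or ["uncoded"]

print(precode("Big vape sale this weekend, 20% off with promo code"))
# ['e-cigarette', 'promotion']
```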
Basak, Jolly; Nithin, Chandran
2015-01-01
Non-coding RNAs (ncRNAs) have emerged as versatile master regulators of biological functions in recent years. MicroRNAs (miRNAs) are small endogenous ncRNAs of 18-24 nucleotides in length that originate from long self-complementary precursors. Besides their direct involvement in developmental processes, plant miRNAs play key roles in gene regulatory networks and varied biological processes. Long ncRNAs (lncRNAs), in turn, are a large and diverse class of transcribed ncRNAs whose length exceeds 200 nucleotides. Plant lncRNAs are transcribed by different RNA polymerases and show diverse structural features; they too are important regulators of gene expression in diverse biological processes. The last decade has seen a breakthrough in genome-editing technology: the CRISPR-Cas9 (clustered regularly interspaced short palindromic repeats/CRISPR-associated protein 9) system. CRISPR loci are transcribed into ncRNAs that form a functional complex with Cas9 and guide it to cleave complementary invading DNA. The CRISPR-Cas technology has been successfully applied in model plants such as Arabidopsis and tobacco and in important crops like wheat, maize, and rice. However, all these studies have focused on protein-coding genes; information about targeting non-coding genes is scarce. Hitherto, the CRISPR-Cas technology has been used almost exclusively in vertebrate systems to engineer miRNAs/lncRNAs, and it remains relatively unexplored in plants. While briefly covering miRNAs, lncRNAs, and applications of the CRISPR-Cas technology in humans and animals, this review elaborates several strategies to overcome the challenges of applying the CRISPR-Cas technology to editing ncRNAs in plants, and discusses the future perspectives of this field.
Audigé, Laurent; Cornelius, Carl-Peter; Ieva, Antonio Di; Prein, Joachim
2014-01-01
Validated trauma classification systems are the sole means to provide the basis for reliable documentation and evaluation of patient care, which will open the gateway to evidence-based procedures and healthcare in the coming years. With the support of AO Investigation and Documentation, a classification group was established to develop and evaluate a comprehensive classification system for craniomaxillofacial (CMF) fractures. Blueprints for fracture classification in the major constituents of the human skull were drafted and then evaluated by a multispecialty group of experienced CMF surgeons and a radiologist in a structured process during iterative agreement sessions. At each session, surgeons independently classified the radiological imaging of up to 150 consecutive cases with CMF fractures. During subsequent review meetings, all discrepancies in the classification outcome were critically appraised for clarification and improvement until consensus was reached. The resulting CMF classification system is structured in a hierarchical fashion with three levels of increasing complexity. The most elementary level 1 simply distinguishes four fracture locations within the skull: mandible (code 91), midface (code 92), skull base (code 93), and cranial vault (code 94). Levels 2 and 3 focus on further defining the fracture locations and for fracture morphology, achieving an almost individual mapping of the fracture pattern. This introductory article describes the rationale for the comprehensive AO CMF classification system, discusses the methodological framework, and provides insight into the experiences and interactions during the evaluation process within the core groups. The details of this system in terms of anatomy and levels are presented in a series of focused tutorials illustrated with case examples in this special issue of the Journal. PMID:25489387
Fisher, Brian T; Harris, Tracey; Torp, Kari; Seif, Alix E; Shah, Ami; Huang, Yuan-Shung V; Bailey, L Charles; Kersun, Leslie S; Reilly, Anne F; Rheingold, Susan R; Walker, Dana; Li, Yimei; Aplenc, Richard
2014-01-01
Acute lymphoblastic leukemia (ALL) accounts for almost one quarter of pediatric cancer in the United States. Despite cooperative group therapeutic trials, there remains a paucity of large cohort data on which to conduct epidemiology and comparative effectiveness research studies. We designed a 3-step process utilizing International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9) discharge diagnosis codes and chemotherapy exposure data contained in the Pediatric Health Information System administrative database to establish a cohort of children with de novo ALL. This process was validated by chart review at 1 of the pediatric centers. An ALL cohort of 8733 patients was identified with a sensitivity of 88% [95% confidence interval (CI), 83%-92%] and a positive predictive value of 93% (95% CI, 89%-96%). The 30-day all-cause inpatient case fatality rate using this 3-step process was 0.80% (95% CI, 0.63%-1.01%), which was significantly different from the case fatality rate of 1.40% (95% CI, 1.23%-1.60%) when ICD-9 codes alone were used. This is the first report of the assembly and validation of a cohort of de novo ALL patients from a database representative of free-standing children's hospitals across the United States. Our data demonstrate that the use of ICD-9 codes alone to establish cohorts will lead to substantial patient misclassification and result in biased outcome estimates. Systematic methods beyond the use of just ICD-9 codes must be used before analysis to establish accurate cohorts of patients with malignancy. A similar approach should be followed when establishing future cohorts from administrative data.
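The validation arithmetic above reduces to screen-versus-chart-review proportions. A sketch with Wilson score intervals; the cell counts are illustrative placeholders chosen to be consistent with the published 88% sensitivity and 93% PPV, since the abstract does not give the raw counts:

```python
import math

def wilson_ci(x, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion x/n."""
    p = x / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Illustrative chart-review counts (assumed, not from the paper):
tp, fp, fn = 186, 14, 25
sens, ppv = tp / (tp + fn), tp / (tp + fp)
print(f"sensitivity {sens:.0%}, CI {wilson_ci(tp, tp + fn)}")  # ~88% (83%-92%)
print(f"PPV {ppv:.0%}, CI {wilson_ci(tp, tp + fp)}")           # ~93%
```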
Review of codes, standards, and regulations for natural gas locomotives.
DOT National Transportation Integrated Search
2014-06-01
This report identified, collected, and summarized relevant international codes, standards, and regulations with potential : applicability to the use of natural gas as a locomotive fuel. Few international or country-specific codes, standards, and regu...
Gomes, Clarissa P C; de Gonzalo-Calvo, David; Toro, Rocio; Fernandes, Tiago; Theisen, Daniel; Wang, Da-Zhi; Devaux, Yvan
2018-05-23
There is overwhelming evidence that regular exercise training is protective against cardiovascular disease (CVD), the main cause of death worldwide. Despite the benefits of exercise, the intricacies of their underlying molecular mechanisms remain largely unknown. Non-coding RNAs (ncRNAs) have been recognized as a major regulatory network governing gene expression in several physiological processes and appeared as pivotal modulators in a myriad of cardiovascular processes under physiological and pathological conditions. However, little is known about ncRNA expression and role in response to exercise. Revealing the molecular components and mechanisms of the link between exercise and health outcomes will catalyse discoveries of new biomarkers and therapeutic targets. Here we review the current understanding of the ncRNA role in exercise-induced adaptations focused on the cardiovascular system and address their potential role in clinical applications for CVD. Finally, considerations and perspectives for future studies will be proposed. © 2018 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.
Castillo, Jonathan; Stueve, Theresa R.; Marconett, Crystal N.
2017-01-01
Previously thought of as junk transcripts and pseudogene remnants, long non-coding RNAs (lncRNAs) have come into their own over the last decade as an essential component of cellular activity, regulating a plethora of functions within multicellular organisms. lncRNAs are now known to participate in development, cellular homeostasis, immunological processes, and the development of disease. With the advent of next generation sequencing technology, hundreds of thousands of lncRNAs have been identified. However, movement beyond mere discovery to the understanding of molecular processes has been stymied by the complicated genomic structure, tissue-restricted expression, and diverse regulatory roles lncRNAs play. In this review, we will focus on lncRNAs involved in lung cancer, the most common cause of cancer-related death in the United States and worldwide. We will summarize their various methods of discovery, provide consensus rankings of deregulated lncRNAs in lung cancer, and describe in detail the limited functional analysis that has been undertaken so far. PMID:29113413
Interoceptive inference: From computational neuroscience to clinic.
Owens, Andrew P; Allen, Micah; Ondobaka, Sasha; Friston, Karl J
2018-04-22
The central and autonomic nervous systems can be defined by their anatomical, functional and neurochemical characteristics, but neither functions in isolation. For example, fundamental components of autonomically mediated homeostatic processes are afferent interoceptive signals reporting the internal state of the body and efferent signals acting on interoceptive feedback assimilated by the brain. Recent predictive coding (interoceptive inference) models formulate interoception in terms of embodied predictive processes that support emotion and selfhood. We propose that interoception may serve as a way to investigate holistic nervous system function and dysfunction in disorders of brain, body and behaviour. We appeal to predictive coding and (active) interoceptive inference to describe the homeostatic functions of the central and autonomic nervous systems. We do so by (i) reviewing the active inference formulation of interoceptive and autonomic function, (ii) surveying clinical applications of this formulation and (iii) describing how it offers an integrative approach to human physiology, particularly interactions between the central and peripheral nervous systems in health and disease. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.
Esophageal function testing: Billing and coding update.
Khan, A; Massey, B; Rao, S; Pandolfino, J
2018-01-01
Esophageal function testing is being increasingly utilized in diagnosis and management of esophageal disorders. There have been several recent technological advances in the field to allow practitioners the ability to more accurately assess and treat such conditions, but there has been a relative lack of education in the literature regarding the associated Common Procedural Terminology (CPT) codes and methods of reimbursement. This review, commissioned and supported by the American Neurogastroenterology and Motility Society Council, aims to summarize each of the CPT codes for esophageal function testing and show the trends of associated reimbursement, as well as recommend coding methods in a practical context. We also aim to encourage many of these codes to be reviewed on a gastrointestinal (GI) societal level, by providing evidence of both discrepancies in coding definitions and inadequate reimbursement in this new era of esophageal function testing. © 2017 John Wiley & Sons Ltd.
The Astrophysics Source Code Library by the numbers
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, Peter; Berriman, G. Bruce; DuPrie, Kimberly; Mink, Jessica; Nemiroff, Robert; Ryan, PW; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Wallin, John; Warmels, Rein
2018-01-01
The Astrophysics Source Code Library (ASCL, ascl.net) was founded in 1999 by Robert Nemiroff and John Wallin. ASCL editors seek both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and add entries for the found codes to the library. Software authors can submit their codes to the ASCL as well. This ensures a comprehensive listing covering a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL is indexed by both NASA’s Astrophysics Data System (ADS) and Web of Science, making software used in research more discoverable. This presentation covers the growth in the ASCL’s number of entries, the number of citations to its entries, and in which journals those citations appear. It also discusses what changes have been made to the ASCL recently, and what its plans are for the future.
Billing, coding, and documentation in the critical care environment.
Fakhry, S M
2000-06-01
Optimal conduct of modern-day physician practices involves a thorough understanding and application of the principles of documentation, coding, and billing. Physicians' role in these activities can no longer be secondary. Surgeons practicing critical care must be well versed in these concepts and their effective application to ensure that they are competitive in an increasingly difficult and demanding environment. Health care policies and regulations continue to evolve, mandating constant education of practicing physicians and their staffs and surgical residents who also will have to function in this environment. Close, collaborative relationships between physicians and individuals well versed in the concepts of documentation, coding, and billing are indispensable. Similarly, ongoing educational and review processes (whether internal or consultative from outside sources) not only can decrease the possibility of unfavorable outcomes from audit but also will likely enhance practice efficiency and cash flow. A financially viable practice is certainly a prerequisite for a surgical critical care practice to achieve its primary goal of excellence in patient care.
Methylation of miRNA genes and oncogenesis.
Loginov, V I; Rykov, S V; Fridman, M V; Braga, E A
2015-02-01
Interaction between microRNA (miRNA) and messenger RNA of target genes at the posttranscriptional level provides fine-tuned dynamic regulation of cell signaling pathways. Each miRNA can be involved in regulating hundreds of protein-coding genes, and, conversely, a number of different miRNAs usually target a structural gene. Epigenetic gene inactivation associated with methylation of promoter CpG-islands is common to both protein-coding genes and miRNA genes. Here, data on functions of miRNAs in development of tumor-cell phenotype are reviewed. Genomic organization of promoter CpG-islands of the miRNA genes located in inter- and intragenic areas is discussed. The literature and our own results on frequency of CpG-island methylation in miRNA genes from tumors are summarized, and data regarding a link between such modification and changed activity of miRNA genes and, consequently, protein-coding target genes are presented. Moreover, the impact of miRNA gene methylation on key oncogenetic processes as well as affected signaling pathways is discussed.
3DHZETRN: Inhomogeneous Geometry Issues
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.
2017-01-01
Historical methods for assessing radiation exposure inside complicated geometries for space applications were limited by computational constraints and lack of knowledge associated with nuclear processes occurring over a broad range of particles and energies. Various methods were developed and utilized to simplify geometric representations and enable coupling with simplified but efficient particle transport codes. Recent transport code development efforts, leading to 3DHZETRN, now enable such approximate methods to be carefully assessed to determine if past exposure analyses and validation efforts based on those approximate methods need to be revisited. In this work, historical methods of representing inhomogeneous spacecraft geometry for radiation protection analysis are first reviewed. Two inhomogeneous geometry cases, previously studied with 3DHZETRN and Monte Carlo codes, are considered with various levels of geometric approximation. Fluence, dose, and dose equivalent values are computed in all cases and compared. It is found that although these historical geometry approximations can induce large errors in neutron fluences up to 100 MeV, errors on dose and dose equivalent are modest (<10%) for the cases studied here.
Identification and role of regulatory non-coding RNAs in Listeria monocytogenes.
Izar, Benjamin; Mraheil, Mobarak Abu; Hain, Torsten
2011-01-01
Bacterial regulatory non-coding RNAs control numerous mRNA targets that direct a plethora of biological processes, such as adaptation to environmental changes, growth and virulence. Recently developed high-throughput techniques, such as genomic tiling arrays and RNA-Seq, have made it possible to investigate prokaryotic cis- and trans-acting regulatory RNAs, including sRNAs, asRNAs, untranslated regions (UTRs) and riboswitches. As a result, we have obtained a more comprehensive view of the complexity and plasticity of prokaryotic genome biology. Listeria monocytogenes has been used as a model intracellular pathogen in several studies, which revealed the presence of about 180 regulatory RNAs in the listerial genome. A regulatory role of non-coding RNAs in survival, virulence and adaptation mechanisms of L. monocytogenes was confirmed in subsequent experiments, providing insight into a multifaceted modulatory function of RNA/mRNA interference. In this review, we discuss the identification of regulatory RNAs by high-throughput techniques and their functional role in L. monocytogenes.
Audit of Clinical Coding of Major Head and Neck Operations
Mitra, Indu; Malik, Tass; Homer, Jarrod J; Loughran, Sean
2009-01-01
INTRODUCTION Within the NHS, operations are coded using the Office of Population Censuses and Surveys (OPCS) classification system. These codes, together with diagnostic codes, are used to generate Healthcare Resource Group (HRG) codes, which correlate to a payment bracket. The aim of this study was to determine whether allocated procedure codes for major head and neck operations were correct and reflective of the work undertaken. HRG codes generated were assessed to determine accuracy of remuneration. PATIENTS AND METHODS The coding of consecutive major head and neck operations undertaken in a tertiary referral centre over a retrospective 3-month period was assessed. Procedure codes were initially ascribed by professional hospital coders. Operations were then recoded by the surgical trainee in liaison with the head of clinical coding. The initial and revised procedure codes were compared and used to generate HRG codes, to determine whether the payment banding had altered. RESULTS A total of 34 cases were reviewed. The number of procedure codes generated initially by the clinical coders was 99, whereas the revised codes generated 146. Of the original codes, 47 of 99 (47.4%) were incorrect. In 19 of the 34 cases reviewed (55.9%), the HRG code remained unchanged, thus resulting in the correct payment. Six cases were never coded, equating to £15,300 in lost payment. CONCLUSIONS These results highlight the inadequacy of this system to reward hospitals for the work carried out within the NHS in a fair and consistent manner. The current coding system was found to be complicated, ambiguous and inaccurate, resulting in loss of remuneration. PMID:19220944
System Design Description for the TMAD Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finfrock, S.H.
This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System.
Pre- and Post-Processing Tools to Streamline the CFD Process
NASA Technical Reports Server (NTRS)
Dorney, Suzanne Miller
2002-01-01
This viewgraph presentation provides information on software development tools to facilitate the use of CFD (Computational Fluid Dynamics) codes. The specific CFD codes FDNS and CORSAIR are profiled, and uses for software development tools with these codes during pre-processing, interim-processing, and post-processing are explained.
ICD-10 codes used to identify adverse drug events in administrative data: a systematic review.
Hohl, Corinne M; Karpov, Andrei; Reddekopp, Lisa; Doyle-Waters, Mimi; Stausberg, Jürgen
2014-01-01
Adverse drug events, the unintended and harmful effects of medications, are important outcome measures in health services research. Yet no universally accepted set of International Classification of Diseases (ICD) revision 10 codes or coding algorithms exists to ensure their consistent identification in administrative data. Our objective was to synthesize a comprehensive set of ICD-10 codes used to identify adverse drug events. We developed a systematic search strategy and applied it to five electronic reference databases. We searched relevant medical journals, conference proceedings, electronic grey literature and bibliographies of relevant studies, and contacted content experts for unpublished studies. One author reviewed the titles and abstracts for inclusion and exclusion criteria. Two authors reviewed eligible full-text articles and abstracted data in duplicate. Data were synthesized in a qualitative manner. Of 4241 titles identified, 41 were included. We found a total of 827 ICD-10 codes that have been used in the medical literature to identify adverse drug events. The median number of codes used to search for adverse drug events was 190 (IQR 156-289) with a large degree of variability between studies in the numbers and types of codes used. Authors commonly used external injury (Y40.0-59.9) and disease manifestation codes. Only two papers reported on the sensitivity of their code set. Substantial variability exists in the methods used to identify adverse drug events in administrative data. Our work may serve as a point of reference for future research and consensus building in this area.
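As a rough illustration of how such a code set is applied in practice, the sketch below screens a claims table for the external-injury range (Y40.0–Y59.9) highlighted in the review. This is not the authors' algorithm: the table, column names and example codes are invented, and a real study would use a full curated code set rather than a single range.

```python
import pandas as pd

# Hypothetical administrative data: one row per coded diagnosis.
claims = pd.DataFrame({
    "patient_id": [101, 101, 102, 103],
    "icd10_code": ["Y40.5", "I10", "Y57.9", "T36.0"],
})

def is_external_injury_ade(code: str) -> bool:
    """Flag codes in the Y40.0-Y59.9 external-injury range cited by the review."""
    if not code.startswith("Y"):
        return False
    try:
        category = int(code[1:3])  # e.g. "Y40.5" -> 40
    except ValueError:
        return False
    return 40 <= category <= 59

claims["possible_ade"] = claims["icd10_code"].apply(is_external_injury_ade)
print(claims[claims["possible_ade"]])
```

In a full study this boolean screen would be followed by chart review, since, as the review notes, very few published code sets report their sensitivity.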
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-19
...: Notice. SUMMARY: The DOE participates in the code development process of the International Code Council... notice outlines the process by which DOE produces code change proposals, and participates in the ICC code development process. FOR FURTHER INFORMATION CONTACT: Jeremiah Williams, U.S. Department of Energy, Office of...
Watson, Jessica; Nicholson, Brian D; Hamilton, Willie; Price, Sarah
2017-11-22
Analysis of routinely collected electronic health record (EHR) data from primary care relies on the creation of codelists to define clinical features of interest. To improve scientific rigour, transparency and replicability, we describe and demonstrate a standardised reproducible methodology for clinical codelist development. We describe a three-stage process: first, clear a priori definition of the clinical feature of interest using reliable clinical resources; second, development of a list of potential codes using statistical software to comprehensively search all available codes; and third, a modified Delphi process to reach consensus between primary care practitioners on the most relevant codes, including the generation of an 'uncertainty' variable to allow sensitivity analysis. These methods are illustrated by developing a codelist for shortness of breath in a primary care EHR sample, including modifiable syntax for commonly used statistical software. The codelist was used to estimate the frequency of shortness of breath in a cohort of 28 216 patients aged over 18 years who received an incident diagnosis of lung cancer between 1 January 2000 and 30 November 2016 in the Clinical Practice Research Datalink (CPRD). Of 78 candidate codes, 29 were excluded as inappropriate. Complete agreement was reached for 44 (90%) of the remaining codes, with partial disagreement over 5 (10%). 13 091 episodes of shortness of breath were identified in the cohort of 28 216 patients. Sensitivity analysis demonstrates that the codes with the greatest uncertainty tend to be rarely used in clinical practice. Although initially time consuming, a rigorous and reproducible method for codelist generation 'future-proofs' findings, and an auditable, modifiable syntax for codelist generation enables sharing and replication of EHR studies. Published codelists should be badged by quality and should report the methods of codelist generation, including: the definitions and justifications associated with each codelist; the syntax or search method; the number of candidate codes identified; and the categorisation of codes after Delphi review.
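The paper publishes modifiable syntax for commonly used statistical software; as a loose illustration only, here is the shape stage two of such a search might take in Python. The dictionary file, column names and search terms below are placeholders, not the authors' published syntax.

```python
import pandas as pd

# Placeholder for a licensed medical-code dictionary (e.g. a code browser
# export); columns are assumed to be 'code' and 'description'.
codes = pd.read_csv("code_dictionary.csv")

# Stage 2: comprehensively search all available codes for candidate terms.
terms = ["breathless", "short of breath", "shortness of breath", "dyspnoea"]
pattern = "|".join(terms)
candidates = codes[codes["description"].str.contains(pattern, case=False, na=False)].copy()

# Stage 3 bookkeeping: after the modified Delphi round, codes with partial
# disagreement get an 'uncertainty' flag to support sensitivity analyses.
candidates["uncertain"] = False
candidates.to_csv("shortness_of_breath_candidates.csv", index=False)
```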
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farnham, Irene; Rehfeldt, Kenneth
Preemptive reviews (PERs) of Underground Test Area (UGTA) Activity corrective action unit (CAU) studies are an important and long-maintained quality improvement process. The CAU-specific PER committees provide internal technical review of ongoing work throughout the CAU lifecycle. The reviews, identified in the UGTA Quality Assurance Plan (QAP) (Sections 1.3.5.1 and 3.2), assure work is comprehensive, accurate, in keeping with the state of the art, and consistent with CAU goals. PER committees review various products, including data, documents, software/codes, analyses, and models. PER committees may also review technical briefings, including Federal Facility Agreement and Consent Order (FFACO)-required presentations to the Nevada Division of Environmental Protection (NDEP) and presentations supporting key technical decisions (e.g., investigation plans and approaches). PER committees provide technical recommendations to support regulatory decisions that are the responsibility of the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office (NNSA/NFO) and NDEP.
A combined Fuzzy and Naive Bayesian strategy can be used to assign event codes to injury narratives.
Marucci-Wellman, H; Lehto, M; Corns, H
2011-12-01
Bayesian methods show promise for classifying injury narratives from large administrative datasets into cause groups. This study examined a combined approach in which two Bayesian models (Fuzzy and Naïve) were used either to classify a narrative or to select it for manual review. Injury narratives were extracted from claims filed with a workers' compensation insurance provider between January 2002 and December 2004. Narratives were separated into a training set (n=11,000) and a prediction set (n=3,000). Expert coders assigned two-digit Bureau of Labor Statistics Occupational Injury and Illness Classification event codes to each narrative. Fuzzy and Naïve Bayesian models were developed using manually classified cases in the training set. Two semi-automatic machine coding strategies were evaluated. The first strategy assigned cases for manual review if the Fuzzy and Naïve models disagreed on the classification. The second strategy selected additional cases for manual review from the set on which the models agreed, using prediction strength, to reach a level of 50% computer coding and 50% manual coding. When agreement alone was used as the filtering strategy, the majority of narratives were coded by the computer (n=1,928, 64%), leaving 36% for manual review. The overall combined (human plus computer) sensitivity was 0.90, and the positive predictive value (PPV) was >0.90 for 11 of 18 two-digit event categories. Implementing the second strategy improved results, with an overall sensitivity of 0.95 and PPV >0.90 for 17 of 18 categories. A combined Naïve-Fuzzy Bayesian approach can classify some narratives with high accuracy and identify others most beneficial for manual review, reducing the burden on human coders.
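A toy sketch of the triage idea, where disagreement between two independent classifiers routes a narrative to manual review, is given below. sklearn's MultinomialNB and ComplementNB stand in for the paper's Naïve and Fuzzy Bayesian models (the Fuzzy model is not a standard library component), and the narratives and labels are invented.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import ComplementNB, MultinomialNB

# Invented training narratives with event-code labels.
train_text = ["fell from ladder", "slipped on wet floor",
              "struck by falling box", "cut hand on blade"]
train_y = ["fall", "fall", "struck_by", "cut"]
new_text = ["fell down stairs", "hand caught in press"]

vec = CountVectorizer()
X_train = vec.fit_transform(train_text)
X_new = vec.transform(new_text)

# Two independent Bayesian classifiers stand in for the Fuzzy/Naive pair.
model_a = MultinomialNB().fit(X_train, train_y)
model_b = ComplementNB().fit(X_train, train_y)

for text, a, b in zip(new_text, model_a.predict(X_new), model_b.predict(X_new)):
    # Strategy 1: keep the computer-assigned code only when the models agree.
    print(text, "->", a if a == b else "MANUAL REVIEW")
```

The paper's second strategy additionally pulls low-prediction-strength cases out of the agreement set until the human/computer split reaches 50/50.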
A History of Instructional Methods in Uncontracted and Contracted Braille
ERIC Educational Resources Information Center
D'Andrea, Frances Mary
2009-01-01
This literature review outlines the history of the braille code as used in the United States and Canada, illustrating how both the code itself and instructional strategies for teaching it changed over time. The review sets the stage for the research questions of the recently completed Alphabetic Braille and Contracted Braille Study.
Roadway contributing factors in traffic crashes.
DOT National Transportation Integrated Search
2014-09-01
This project involved an evaluation of the codes which relate to roadway contributing factors. This included a review of relevant codes used in other states. Crashes with related codes were summarized and analyzed. A sample of crash sites was ins...
Properties of a certain stochastic dynamical system, channel polarization, and polar codes
NASA Astrophysics Data System (ADS)
Tanaka, Toshiyuki
2010-06-01
A new family of codes, called polar codes, has recently been proposed by Arikan. Polar codes are of theoretical importance because they are provably capacity-achieving with low-complexity encoding and decoding. We first discuss basic properties of a certain stochastic dynamical system, on the basis of which properties of channel polarization and polar codes are reviewed, with emphasis on our recent results.
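For readers unfamiliar with the dynamical system in question, the sketch below simulates the textbook special case of a binary erasure channel (BEC); the BEC specialization and the numbers are our illustration, not the paper's general treatment. For a BEC whose erasure probability is z, one polarization step yields a better channel with parameter z² and a worse one with parameter 2z − z²; iterating drives z to 0 or 1, and the fraction of near-perfect channels approaches the capacity 1 − ε.

```python
import random

def polarized_good_fraction(eps: float, depth: int, trials: int = 100_000) -> float:
    """Simulate the erasure-probability recursion behind channel polarization."""
    good = 0
    for _ in range(trials):
        z = eps
        for _ in range(depth):
            # Each step applies the 'plus' or 'minus' transform with prob 1/2.
            z = z * z if random.random() < 0.5 else 2 * z - z * z
        if z < 1e-3:  # channel has polarized to (almost) noiseless
            good += 1
    return good / trials

# Should approach the BEC capacity 1 - eps = 0.75.
print(polarized_good_fraction(eps=0.25, depth=20))
```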
Non-coding RNAs in cancer brain metastasis
Wu, Kerui; Sharma, Sambad; Venkat, Suresh; Liu, Keqin; Zhou, Xiaobo; Watabe, Kounosuke
2017-01-01
More than 90% of cancer deaths are attributed to metastatic disease, and the brain is one of the major metastatic sites of melanoma, colon, renal, lung and breast cancers. Despite recent advances in targeted therapy for cancer, the incidence of brain metastasis is increasing. One reason is that most therapeutic drugs cannot penetrate the blood-brain barrier, so tumor cells find the brain to be a sanctuary site. In this review, we describe the pathophysiology of brain metastases to introduce the latest understanding of metastatic brain malignancies. This review also focuses particularly on non-coding RNAs and their roles in cancer brain metastasis. Furthermore, we discuss the roles of extracellular vesicles, as they are known to transport information between cells to initiate cancer cell-microenvironment communication. The potential clinical translation of non-coding RNAs as tools for diagnosis and treatment is also discussed. Finally, the computational aspects of non-coding RNA detection, sequence and structure calculation, and the epigenetic regulation of non-coding RNA in brain metastasis are discussed. PMID:26709907
Ribonucleoprotein complexes in neurologic diseases.
Ule, Jernej
2008-10-01
Ribonucleoprotein (RNP) complexes regulate the tissue-specific RNA processing and transport that increase the coding capacity of our genome and the ability to respond quickly and precisely to diverse sets of signals. This review focuses on three proteins that are part of RNP complexes in most cells of our body: TAR DNA-binding protein (TDP-43), the survival motor neuron protein (SMN), and fragile-X mental retardation protein (FMRP). In particular, the review asks why these ubiquitous proteins are primarily associated with defects in specific regions of the central nervous system. To understand this question, it is important to understand the role of genetic and cellular environment in causing the defect in the protein, as well as how the defective protein leads to misregulation of specific target RNAs. Two approaches for comprehensive analysis of defective RNA-protein interactions are presented. The first approach defines the RNA code, or the collection of proteins that bind to a certain cis-acting RNA site to produce a predictable outcome. The second approach defines the RNA map, or the summary of positions on target RNAs where binding of a particular RNA-binding protein leads to a predictable outcome. As we learn more about the RNA codes and maps that guide the action of the dynamic RNP world in our brain, possibilities for new treatments of neurologic diseases are bound to emerge.
Non-coding RNAs—Novel targets in neurotoxicity
Tal, Tamara L.; Tanguay, Robert L.
2012-01-01
Over the past ten years, non-coding RNAs (ncRNAs) have emerged as pivotal players in fundamental physiological and cellular processes and have been increasingly implicated in cancer, immune disorders, and cardiovascular, neurodegenerative, and metabolic diseases. MicroRNAs (miRNAs) represent a class of ncRNA molecules that function as negative regulators of post-transcriptional gene expression. miRNAs are predicted to regulate 60% of all human protein-coding genes and as such play key roles in cellular and developmental processes, human health, and disease. Relative to counterparts that lack binding sites for miRNAs, genes encoding proteins that are post-transcriptionally regulated by miRNAs are twice as likely to be sensitive to environmental chemical exposure. Not surprisingly, miRNAs have been recognized as targets or effectors of nervous system, developmental, hepatic, and carcinogenic toxicants, and have been identified as putative regulators of phase I xenobiotic-metabolizing enzymes. In this review, we give an overview of the types of ncRNAs and highlight their roles in neurodevelopment, neurological disease, activity-dependent signaling, and drug metabolism. We then delve into specific examples that illustrate their importance as mediators, effectors, or adaptive agents of neurotoxicants or neuroactive pharmaceutical compounds. Finally, we identify a number of outstanding questions regarding ncRNAs and neurotoxicity. PMID:22394481
Attachment-related psychodynamics.
Shaver, Phillip R; Mikulincer, Mario
2002-09-01
Because there has been relatively little communication and cross-fertilization between the two major lines of research on adult attachment, one based on coded narrative assessments of defensive processes, the other on simple self-reports of 'attachment style' in close relationships, we here explain and review recent work based on a combination of self-report and other kinds of method, including behavioral observations and unconscious priming techniques. The review indicates that considerable progress has been made in testing central hypotheses derived from attachment theory and in exploring unconscious, psychodynamic processes related to affect-regulation and attachment-system activation. The combination of self-report assessment of attachment style and experimental manipulation of other theoretically pertinent variables allows researchers to test causal hypotheses. We present a model of normative and individual-difference processes related to attachment and identify areas in which further research is needed and likely to be successful. One long-range goal is to create a more complete theory of personality built on attachment theory and other object relations theories.
A Comparison of Source Code Plagiarism Detection Engines
NASA Astrophysics Data System (ADS)
Lancaster, Thomas; Culwin, Fintan
2004-06-01
Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years and there are many available engines and services. This paper reviews the literature on the major modern detection engines, providing a comparison of them based upon the metrics and techniques they deploy. Generally the most common and effective techniques are seen to involve tokenising student submissions then searching pairs of submissions for long common substrings, an example of what is defined to be a paired structural metric. Computing academics are recommended to use one of the two Web-based detection engines, MOSS and JPlag. It is shown that whilst detection is well established there are still places where further research would be useful, particularly where visual support of the investigation process is possible.
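A stripped-down sketch of the paired structural metric described above: tokenise two submissions, then measure their longest common run of tokens. Real engines such as JPlag use greedy string tiling over normalised token streams, so treat this as a didactic simplification; the sample snippets are invented.

```python
import re

def tokenize(source: str) -> list:
    # Crude lexer; production engines also normalise identifiers and
    # literals so that renaming variables cannot hide copied structure.
    return re.findall(r"[A-Za-z_]\w*|[^\sA-Za-z_]", source)

def longest_common_token_run(a: list, b: list) -> int:
    """Longest common substring of two token sequences, via dynamic programming."""
    best, prev = 0, [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                cur[j] = prev[j - 1] + 1
                best = max(best, cur[j])
        prev = cur
    return best

s1 = "total = 0\nfor x in items:\n    total += x"
s2 = "sum_ = 0\nfor y in values:\n    sum_ += y"
print(longest_common_token_run(tokenize(s1), tokenize(s2)))
```

Because the raw-token run here is short even though the two snippets are structurally identical, the example also shows why identifier normalisation is the step that makes such metrics effective.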
The Astrophysics Source Code Library: An Update
NASA Astrophysics Data System (ADS)
Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.
2012-01-01
The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. The ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments involving the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now holds over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) has added an average of 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL, examines its current state and benefits and the means of and requirements for including codes, and outlines its future plans.
Long non-coding RNA and Polycomb: an intricate partnership in cancer biology.
Achour, Cyrinne; Aguilo, Francesca
2018-06-01
High-throughput analyses have revealed that the vast majority of the transcriptome does not code for proteins. These non-translated transcripts, when larger than 200 nucleotides, are termed long non-coding RNAs (lncRNAs), and play fundamental roles in diverse cellular processes. LncRNAs are subject to dynamic chemical modification, adding another layer of complexity to our understanding of the potential roles that lncRNAs play in health and disease. Many lncRNAs regulate transcriptional programs by influencing the epigenetic state through direct interactions with chromatin-modifying proteins. Among these proteins, Polycomb repressive complexes 1 and 2 (PRC1 and PRC2) have been shown to be recruited by lncRNAs to silence target genes. Aberrant expression, deficiency or mutation of both lncRNA and Polycomb have been associated with numerous human diseases, including cancer. In this review, we have highlighted recent findings regarding the concerted mechanism of action of Polycomb group proteins (PcG), acting together with some classically defined lncRNAs, including X-inactive specific transcript (XIST), antisense non-coding RNA in the INK4 locus (ANRIL), metastasis associated lung adenocarcinoma transcript 1 (MALAT1), and HOX transcript antisense RNA (HOTAIR).
Supplementing Public Health Inspection via Social Media
Schomberg, John P.; Haimson, Oliver L.; Hayes, Gillian R.; Anton-Culver, Hoda
2016-01-01
Foodborne illness is prevented by inspection and surveillance conducted by health departments across America. Appropriate restaurant behavior is enforced and monitored via public health inspections. However, surveillance coverage provided by state and local health departments is insufficient to prevent the rising number of foodborne illness outbreaks. To address this need for improved surveillance coverage, we conducted a supplementary form of public health surveillance using social media data: Yelp.com restaurant reviews in the city of San Francisco. Yelp is a social media site where users post reviews and rate restaurants they have personally visited. Presence of keywords related to health code regulations and foodborne illness symptoms, number of restaurant reviews, number of Yelp stars, and restaurant price range were included in a model predicting a restaurant's likelihood of health code violation, measured by the assigned San Francisco public health code rating. For a list of major health code violations, see S1 Table. We built the predictive model using 71,360 Yelp reviews of restaurants in the San Francisco Bay Area. The predictive model was able to predict health code violations in 78% of the restaurants receiving serious citations in our pilot study of 440 restaurants. Training and validation data sets each drew data from 220 restaurants in San Francisco. Keyword analysis of free text within Yelp not only improved detection of high-risk restaurants, but also served to identify specific risk factors related to health code violation. To further validate our model, we applied the model generated in our pilot study to Yelp data from 1,542 restaurants in San Francisco. The model achieved 91% sensitivity, 74% specificity, an area under the receiver operator curve of 98%, and a positive predictive value of 29% (given a substandard health code rating prevalence of 10%). When our model was applied to restaurant reviews in New York City, we achieved 74% sensitivity, 54% specificity, an area under the receiver operator curve of 77%, and a positive predictive value of 25% (given a prevalence of 12%). Model accuracy improved when reviews ranked highest by Yelp were utilized. Our results indicate that public health surveillance can be improved by using social media data to identify restaurants at high risk of health code violation. Additionally, using highly ranked Yelp reviews improves predictive power and limits the number of reviews needed to generate predictions. Use of this approach as an adjunct to current risk ranking of restaurants prior to inspection may enhance detection of restaurants engaging in high-risk practices that may previously have gone undetected. This model represents a step forward in the integration of social media into meaningful public health interventions. PMID:27023681
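To make the modelling concrete, here is a minimal sketch of a classifier of the same general family, combining keyword hits with review metadata; the features, numbers and model settings are invented, and the paper's actual specification may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented per-restaurant features:
# [keyword hits, number of reviews, Yelp stars, price tier]
X = np.array([
    [4, 120, 2.5, 1],
    [0,  80, 4.5, 3],
    [6, 300, 3.0, 2],
    [1,  40, 4.0, 2],
    [5,  60, 2.0, 1],
    [0, 150, 4.5, 3],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = serious violation found at inspection

model = LogisticRegression().fit(X, y)

# Rank an unseen restaurant by predicted violation risk to prioritise inspection.
print(model.predict_proba([[3, 90, 3.0, 1]])[:, 1])
```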
2014-01-01
Background The systematic review of reasons is a new way to obtain comprehensive information about specific ethical topics. One such review was carried out for the question of why post-trial access to trial drugs should or need not be provided. The objective of this study was to empirically validate this review using an author check method. The article also reports on methodological challenges faced by our study. Methods We emailed a questionnaire to the 64 corresponding authors of those papers that were assessed in the review of reasons on post-trial access. The questionnaire consisted of all quotations (“reason mentions”) that were identified by the review to represent a reason in a given author’s publication, together with a set of codings for the quotations. The authors were asked to rate the correctness of the codings. Results We received 19 responses, from which only 13 were completed questionnaires. In total, 98 quotations and their related codes in the 13 questionnaires were checked by the addressees. For 77 quotations (79%), all codings were deemed correct, for 21 quotations (21%), some codings were deemed to need correction. Most corrections were minor and did not imply a complete misunderstanding of the citation. Conclusions This first attempt to validate a review of reasons leads to four crucial methodological questions relevant to the future conduct of such validation studies: 1) How can a description of a reason be deemed incorrect? 2) Do the limited findings of this author check study enable us to determine whether the core results of the analysed SRR are valid? 3) Why did the majority of surveyed authors refrain from commenting on our understanding of their reasoning? 4) How can the method for validating reviews of reasons be improved? PMID:25262532
Building Standards and Codes for Energy Conservation
ERIC Educational Resources Information Center
Gross, James G.; Pierlert, James H.
1977-01-01
Current activity intended to lead to energy conservation measures in building codes and standards is reviewed by members of the Office of Building Standards and Codes Services of the National Bureau of Standards. For journal availability see HE 508 931. (LBH)
2015-07-06
geotechnical recommendations to aid in design of a retaining wall structure in case this alternative is considered in the future. Based on a telephone...potentially involve significant impacts on the environment must be reviewed in accordance with the National Environmental Policy Act (NEPA) and all other...Code of Federal Regulations (CFR) 989, Environmental Impact Analysis Process (EIAP). The analyses focus on the following environmental resources: noise
ERIC Educational Resources Information Center
Baker, Opal Ruth
Research on Spanish/English code switching is reviewed and the definitions and categories set up by the investigators are examined. Their methods of locating, limiting, and classifying true code switches, and the terms used and results obtained, are compared. It is found that in these studies, conversational (intra-discourse) code switching is…
NASA Technical Reports Server (NTRS)
Tatchell, D. G.
1979-01-01
A code, CATHY3/M, was prepared and demonstrated by application to a sample case. The preparation is reviewed, a summary of the capabilities and main features of the code is given, and the sample case results are discussed. Recommendations for future use and development of the code are provided.
Methods of treating complex space vehicle geometry for charged particle radiation transport
NASA Technical Reports Server (NTRS)
Hill, C. W.
1973-01-01
Current methods of treating complex geometry models for space radiation transport calculations are reviewed. The geometric techniques used in three computer codes are outlined. Evaluations of geometric capability and speed are provided for these codes. Although no code development work is included, several suggestions for significantly improving complex geometry codes are offered.
Nguyen, Vivian M; Haddaway, Neal R; Gutowsky, Lee F G; Wilson, Alexander D M; Gallagher, Austin J; Donaldson, Michael R; Hammerschlag, Neil; Cooke, Steven J
2015-01-01
Delays in peer-reviewed publication may have consequences for both the assessment of scientific prowess in academia and the communication of important information to the knowledge receptor community. We present an analysis of the perspectives of authors publishing in conservation biology journals regarding the importance of speed in peer review and how to improve review times. Authors were invited to take part in an online questionnaire, and the data were subjected to both qualitative (open coding, categorizing) and quantitative analyses (generalized linear models). We received 637 responses to 6,547 e-mail invitations sent. Peer-review speed was generally perceived as slow, with authors experiencing a typical turnaround time of 14 weeks while their perceived optimal review time was six weeks. Male and younger respondents seem to have higher expectations of review speed than female and older respondents. The majority of participants attributed lengthy review times to 'stress' on the peer-review system (i.e., reviewer and editor fatigue), while editor persistence and journal prestige were believed to speed up the review process. Negative consequences of lengthy review times appear to be greater for early career researchers and can also affect author morale (e.g. motivation or frustration). Competition among colleagues was also of concern to respondents. Incentivizing peer review was among the top suggested alterations to the system, along with training graduate students in peer review, increased editorial persistence, and changes to the norms of peer review such as opening the peer-review process to the public. It is clear that authors surveyed in this study view the peer-review system as under stress, and we encourage scientists and publishers to push the envelope for new peer-review models.
An evaluation of the effect of JPEG, JPEG2000, and H.264/AVC on CQR codes decoding process
NASA Astrophysics Data System (ADS)
Vizcarra Melgar, Max E.; Farias, Mylène C. Q.; Zaghetto, Alexandre
2015-02-01
This paper presents a binary matrix code based on the QR Code (Quick Response Code), denoted the CQR Code (Colored Quick Response Code), and evaluates the effect of JPEG, JPEG2000 and H.264/AVC compression on the decoding process. The proposed CQR Code has three additional colors (red, green and blue), which enables twice as much storage capacity as the traditional black-and-white QR Code. Using the Reed-Solomon error-correcting code, the CQR Code model has a theoretical correction capability of 38.41%. The goal of this paper is to evaluate the effect that degradations inserted by common image compression algorithms have on the decoding process. Results show that a successful decoding process can be achieved for compression rates up to 0.3877 bits/pixel, 0.1093 bits/pixel and 0.3808 bits/pixel for the JPEG, JPEG2000 and H.264/AVC formats, respectively. The algorithm that presents the best performance is H.264/AVC, followed by JPEG2000 and JPEG.
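The correction capability quoted above comes from Reed-Solomon coding, which can correct up to half as many symbol errors as there are parity symbols. The snippet below demonstrates that property with the third-party reedsolo package; the package choice, payload and version-dependent return value are our assumptions, not details from the paper.

```python
from reedsolo import RSCodec  # pip install reedsolo

rsc = RSCodec(10)  # 10 parity bytes: corrects up to 5 byte errors at unknown positions

payload = b"CQR demo payload"
codeword = rsc.encode(payload)

# Corrupt a few bytes, roughly as lossy compression artifacts might.
corrupted = bytearray(codeword)
for pos in (0, 7, 15):
    corrupted[pos] ^= 0xFF

decoded, _, _ = rsc.decode(bytes(corrupted))  # reedsolo >= 1.0 returns a 3-tuple
print(decoded == payload)  # True: errors within capacity are fully corrected
```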
ERIC Educational Resources Information Center
Salisbury, Amy L.; Fallone, Melissa Duncan; Lester, Barry
2005-01-01
This review provides an overview and definition of the concept of neurobehavior in human development. Two neurobehavioral assessments used by the authors in current fetal and infant research are discussed: the NICU Network Neurobehavioral Assessment Scale and the Fetal Neurobehavior Coding System. This review will present how the two assessments…
Cantonese-English Code-Switching Research in Hong Kong: A Y2K Review.
ERIC Educational Resources Information Center
Li, David C. S.
2000-01-01
Reviews the major works on code switching in Hong Kong to date. Four context-specific motivations commonly found in the Hong Kong Chinese Press--euphemism, specificity, bilingual punning, and principle of economy--are adduced to show that English is one of the important linguistic resources used by Chinese Hongkongers to fulfill a variety of…
Code of Federal Regulations, 2013 CFR
2013-01-01
... hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... processing and hermetically sealed containers; closures; code marking; heat processing; incubation. (a... storage and transportation as evidenced by the incubation test. (h) Lots of canned products shall be...
Code of Federal Regulations, 2012 CFR
2012-01-01
... hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... processing and hermetically sealed containers; closures; code marking; heat processing; incubation. (a... storage and transportation as evidenced by the incubation test. (h) Lots of canned products shall be...
Code of Federal Regulations, 2011 CFR
2011-01-01
... hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... processing and hermetically sealed containers; closures; code marking; heat processing; incubation. (a... storage and transportation as evidenced by the incubation test. (h) Lots of canned products shall be...
Code of Federal Regulations, 2014 CFR
2014-01-01
... hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... processing and hermetically sealed containers; closures; code marking; heat processing; incubation. (a... storage and transportation as evidenced by the incubation test. (h) Lots of canned products shall be...
Code of Federal Regulations, 2010 CFR
2010-01-01
... hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... processing and hermetically sealed containers; closures; code marking; heat processing; incubation. (a... storage and transportation as evidenced by the incubation test. (h) Lots of canned products shall be...
77 FR 17460 - Multistakeholder Process To Develop Consumer Data Privacy Codes of Conduct
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-26
.... 120214135-2203-02] RIN 0660-XA27 Multistakeholder Process To Develop Consumer Data Privacy Codes of Conduct... request for public comments on the multistakeholder process to develop consumer data privacy codes of...-multistakeholder-process without change. All personal identifying information (for example, name, address...
NASA Astrophysics Data System (ADS)
Larmat, C. S.; Delorey, A.; Rougier, E.; Knight, E. E.; Steedman, D. W.; Bradley, C. R.
2017-12-01
This presentation reports numerical modeling efforts to improve knowledge of the processes that affect seismic wave generation and propagation from underground explosions, with a focus on Rg waves. The numerical model is based on the coupling of hydrodynamic simulation codes (Abaqus, CASH and HOSS) with a 3D full waveform propagation code, SPECFEM3D. Validation datasets are provided by the Source Physics Experiment (SPE), a series of highly instrumented chemical explosions at the Nevada National Security Site with yields from 100 kg to 5000 kg. A first series of explosions in a granite emplacement has just been completed, and a second series in an alluvium emplacement is planned for 2018. The long-term goal of this research is to review and improve existing seismic source models (e.g., Mueller & Murphy, 1971; Denny & Johnson, 1991) using the first-principles calculations that the coupled-code capability provides. The hydrodynamic codes Abaqus, CASH and HOSS model the shocked, hydrodynamic region via equations of state for the explosive, borehole stemming and jointed/weathered granite. A new material model for unconsolidated alluvium has been developed and validated against past nuclear explosions, including the 10 kT 1965 Merlin event (Perret, 1971; Perret and Bass, 1975). We use the efficient Spectral Element Method code SPECFEM3D (e.g., Komatitsch, 1998; 2002) and Geologic Framework Models to model the evolution of the wavefield as it propagates across complex 3D structures. The coupling interface is a series of grid points of the SEM mesh situated at the edge of the hydrodynamic code domain. We will present validation tests and waveforms modeled for several SPE tests, which provide evidence that the damage processes occurring in the vicinity of the explosions create secondary seismic sources. These sources interfere with the original explosion moment and reduce the apparent seismic moment at the origin of Rg waves by up to 20%.
Emergency readmissions to paediatric surgery and urology: The impact of inappropriate coding.
Peeraully, R; Henderson, K; Davies, B
2016-04-01
Introduction In England, emergency readmissions within 30 days of hospital discharge after an elective admission are not reimbursed if they do not meet Payment by Results (PbR) exclusion criteria. However, coding errors could inappropriately penalise hospitals. We aimed to assess the accuracy of coding for emergency readmissions. Methods Emergency readmissions attributed to paediatric surgery and urology between September 2012 and August 2014 to our tertiary referral centre were retrospectively reviewed. Payment by Results (PbR) coding data were obtained from the hospital's Family Health Directorate. Clinical details were obtained from contemporaneous records. All readmissions were categorised as appropriately coded (postoperative or nonoperative) or inappropriately coded (planned surgical readmission, unrelated surgical admission, unrelated medical admission or coding error). Results Over the 24-month period, 241 patients were coded as 30-day readmissions, with 143 (59%) meeting the PbR exclusion criteria. Of the remaining 98 (41%) patients, 24 (25%) were inappropriately coded as emergency readmissions. These readmissions resulted in 352 extra bed days, of which 117 (33%) were attributable to inappropriately coded cases. Conclusions One-quarter of non-excluded emergency readmissions were inappropriately coded, accounting for one-third of additional bed days. As a stay on a paediatric ward costs up to £500 a day, the potential cost to our institution due to inappropriate readmission coding was over £50,000. Diagnoses and the reason for admission for each care episode should be accurately documented and coded, and readmission data should be reviewed at a senior clinician level.
Citrin, Rebecca; Horowitz, Joseph P; Reilly, Anne F; Li, Yimei; Huang, Yuan-Shung; Getz, Kelly D; Seif, Alix E; Fisher, Brian T; Aplenc, Richard
2017-01-01
Mature B-cell non-Hodgkin lymphoma (B-NHL) constitutes a collection of relatively rare pediatric malignancies. In order to utilize administrative data to perform large-scale epidemiologic studies within this population, a two-step process was used to assemble a 12-year cohort of B-NHL patients treated between 2004 and 2015 within the Pediatric Health Information System database. Patients were identified by ICD-9 codes, and their chemotherapy data were then manually reviewed against standard B-NHL treatment regimens. A total of 1,409 patients were eligible for cohort inclusion. This process was validated at a single center, utilizing both an institutional tumor registry and medical record review as the gold standards. The validation demonstrated appropriate sensitivity (91.5%) and positive predictive value (95.1%) to allow for the future use of this cohort for epidemiologic and comparative effectiveness research.
Epigenetic impacts of endocrine disruptors in the brain
Walker, Deena M.; Gore, Andrea C.
2017-01-01
The acquisition of reproductive competence is organized and activated by steroid hormones acting upon the hypothalamus during critical windows of development. This review describes the potential role of epigenetic processes, particularly DNA methylation, in the regulation of sexual differentiation of the hypothalamus by hormones. We examine disruption of these processes by endocrine-disrupting chemicals (EDCs) in an age-, sex-, and region-specific manner, focusing on how perinatal EDCs act through epigenetic mechanisms to reprogram DNA methylation and sex steroid hormone receptor expression throughout life. These receptors are necessary for brain sexual differentiation and their altered expression may underlie disrupted reproductive physiology and behavior. Finally, we review the literature on histone modifications and non-coding RNA involvement in brain sexual differentiation and their perturbation by EDCs. By putting these data into a sex and developmental context we conclude that perinatal EDC exposure alters the developmental trajectory of reproductive neuroendocrine systems in a sex-specific manner. PMID:27663243
On the importance of cotranscriptional RNA structure formation
Lai, Daniel; Proctor, Jeff R.; Meyer, Irmtraud M.
2013-01-01
The expression of genes, both coding and noncoding, can be significantly influenced by RNA structural features of their corresponding transcripts. There is by now mounting experimental and some theoretical evidence that structure formation in vivo starts during transcription and that this cotranscriptional folding determines the functional RNA structural features that are being formed. Several decades of research in bioinformatics have resulted in a wide range of computational methods for predicting RNA secondary structures. Almost all state-of-the-art methods in terms of prediction accuracy, however, completely ignore the process of structure formation and focus exclusively on the final RNA structure. This review hopes to bridge this gap. We summarize the existing evidence for cotranscriptional folding and then review the different, currently used strategies for RNA secondary-structure prediction. Finally, we propose a range of ideas on how state-of-the-art methods could be potentially improved by explicitly capturing the process of cotranscriptional structure formation. PMID:24131802
Rushton, A; White, L; Heap, A; Calvert, M; Heneghan, N; Goodwin, P
2016-02-25
To develop an optimised 1:1 physiotherapy intervention that reflects best practice, with flexibility to tailor management to individual patients, thereby ensuring patient-centred practice. Mixed methods combining evidence synthesis, expert review and focus groups. Secondary care involving 5 UK specialist spinal centres. A purposive panel of clinical experts from the 5 spinal centres, comprising spinal surgeons and inpatient and outpatient physiotherapists, provided expert review of the draft intervention. Purposive samples of patients (n=10) and physiotherapists (n=10) (inpatient/outpatient physiotherapists managing patients with lumbar discectomy) were invited to participate in the focus groups at 1 spinal centre. A draft intervention, developed from 2 systematic reviews, a survey of current practice, and research related to stratified care, was circulated to the panel of clinical experts. Lead physiotherapists collaborated with physiotherapy and surgeon colleagues to provide feedback that informed the intervention presented at 2 focus groups investigating acceptability to patients and physiotherapists. The focus groups were facilitated by an experienced facilitator and recorded in written and tape-recorded forms by an observer. Tape recordings were transcribed verbatim. Data analysis, conducted by 2 independent researchers, employed an iterative and constant comparative process of (1) initial descriptive coding to identify categories and subsequent themes, and (2) deeper, interpretive coding and thematic analysis enabling concepts to emerge and overarching pattern codes to be identified. The intervention reflected the best available evidence and provided flexibility to ensure patient-centred care. The intervention comprised up to 8 sessions of 1:1 physiotherapy over 8 weeks, starting 4 weeks postsurgery. The intervention was acceptable to patients and physiotherapists. A rigorous process informed an optimised 1:1 physiotherapy intervention post-lumbar discectomy that reflects best practice. The developed intervention was agreed on by the 5 spinal centres for implementation in a randomised controlled trial to evaluate its effectiveness.
Towards acute pediatric status epilepticus intervention teams: Do we need "Seizure Codes"?
Stredny, Coral M; Abend, Nicholas S; Loddenkemper, Tobias
2018-05-01
To identify areas of treatment delay and barriers to care in pediatric status epilepticus, review ongoing quality improvement initiatives, and provide suggestions for further innovations to improve and standardize these patient care processes. Narrative review of current status epilepticus management algorithms, anti-seizure medication administration and outcomes associated with delays, and initiatives to improve time to treatment. Articles reviewing or reporting quality improvement initiatives were identified through a PubMed search with keywords "status epilepticus," "quality improvement," "guideline adherence," and/or "protocol;" references of included articles were also reviewed. Rapid initiation and escalation of status epilepticus treatment has been associated with shortened seizure duration and more favorable outcomes. Current evidence-based guidelines for management of status epilepticus propose medication algorithms with suggested times for each management step. However, time to antiseizure medication administration for pediatric status epilepticus remains delayed in both the pre- and in-hospital settings. Barriers to timely treatment include suboptimal preventive care, inaccurate seizure detection, infrequent or restricted use of home rescue medications by caregivers and pre-hospital emergency personnel, delayed summoning and arrival of emergency personnel, and use of inappropriately dosed medications. Ongoing quality improvement initiatives in the pre- and in-hospital settings targeting these barriers are reviewed. Improved preventive care, seizure detection, and rescue medication education may advance pre-hospital management, and we propose the use of acute status epilepticus intervention teams to initiate and incorporate in-hospital interventions as time-sensitive "Seizure Code" emergencies.
Location Based Service in Indoor Environment Using Quick Response Code Technology
NASA Astrophysics Data System (ADS)
Hakimpour, F.; Zare Zardiny, A.
2014-10-01
Today, with the widespread use of smartphones, larger screens and the integration of Global Positioning System (GPS) technology, location-based services (LBS) are attracting more public users than ever. Based on the position of users, they can receive the desired information from different LBS providers. Any LBS system generally includes five main parts: mobile devices, a communication network, a positioning system, a service provider and a data provider. Many advances have been made in each of these parts; however, user positioning, especially in indoor environments, remains an essential and critical issue in LBS. It is well known that GPS performs too poorly inside buildings to provide usable indoor positioning. On the other hand, current indoor positioning technologies such as RFID or WiFi networks require their own hardware and software infrastructures. In this paper, we propose a new method to overcome these challenges using Quick Response (QR) Code technology. A QR Code is a 2D matrix barcode consisting of black modules arranged in a square grid. A QR Code can be scanned and decoded by any camera-enabled mobile phone with barcode reader software installed. This paper reviews the capabilities of QR Code technology and then discusses the advantages of using QR Codes in an indoor LBS (ILBS) system in comparison to other technologies. Finally, some prospects of using QR Codes are illustrated through the implementation of a scenario. The most important advantages of using this technology in ILBS are easy implementation, low cost, quick data retrieval, the possibility of printing the QR Code on different products, and no need for complicated hardware and software infrastructures.
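As a sketch of the tag-generation side of such a system, the snippet below writes an indoor position into a QR Code image using the third-party qrcode package; the JSON payload schema is our invention, not the scheme proposed in the paper.

```python
import json
import qrcode  # pip install qrcode[pil]

# Hypothetical payload: each printed tag stores its own indoor position,
# so a phone that scans it learns where it is with no GPS, Wi-Fi or RFID.
tag = {"building": "library", "floor": 2, "room": "B217", "x": 12.4, "y": 33.1}

img = qrcode.make(json.dumps(tag))
img.save("tag_B217.png")
```

A scanning app would decode the JSON and hand the coordinates to the navigation layer, which is what makes the approach attractive: the only infrastructure is printed paper.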
Consolidated principles for screening based on a systematic review and consensus process.
Dobrow, Mark J; Hagens, Victoria; Chafe, Roger; Sullivan, Terrence; Rabeneck, Linda
2018-04-09
In 1968, Wilson and Jungner published 10 principles of screening that often represent the de facto starting point for screening decisions today; 50 years on, are these principles still the right ones? Our objectives were to review published work that presents principles for population-based screening decisions since Wilson and Jungner's seminal publication, and to conduct a Delphi consensus process to assess the review results. We conducted a systematic review and modified Delphi consensus process. We searched multiple databases for articles published in English in 1968 or later that were intended to guide population-based screening decisions, described development and modification of principles, and presented principles as a set or list. Identified sets were compared for basic characteristics (e.g., number, categorization), a citation analysis was conducted, and principles were iteratively synthesized and consolidated into categories to assess evolution. Participants in the consensus process assessed the level of agreement with the importance and interpretability of the consolidated screening principles. We identified 41 sets and 367 unique principles. Each unique principle was coded to 12 consolidated decision principles that were further categorized as disease/condition, test/intervention or program/system principles. Program or system issues were the focus of 3 of Wilson and Jungner's 10 principles, but comprised almost half of all unique principles identified in the review. The 12 consolidated principles were assessed through 2 rounds of the consensus process, leading to specific refinements to improve their relevance and interpretability. No gaps or missing principles were identified. Wilson and Jungner's principles are remarkably enduring, but increasingly reflect a truncated version of contemporary thinking on screening that does not fully capture subsequent focus on program or system principles. Ultimately, this review and consensus process provides a comprehensive and iterative modernization of guidance to inform population-based screening decisions. © 2018 Joule Inc. or its licensors.
2013-01-01
Background Significant efforts have been made to address the problem of identifying short genes in prokaryotic genomes. However, most known methods are not effective in detecting short genes. Because of the limited information contained in short DNA sequences, it is very difficult to accurately distinguish between protein coding and non-coding sequences in prokaryotic genomes. We have developed a new Iteratively Adaptive Sparse Partial Least Squares (IASPLS) algorithm as the classifier to improve the accuracy of the identification process. Results For testing, we chose the short coding and non-coding sequences from seven prokaryotic organisms. We used seven feature sets (including GC content, Z-curve, etc.) of short genes. In comparison with the GeneMarkS, Metagene, Orphelia, and Heuristic Approach methods, our model achieved the best prediction performance in identification of short prokaryotic genes. Even when we focused on the very short length group ([60–100 nt)), our model provided sensitivity as high as 83.44% and specificity as high as 92.8%. These values are two or three times higher than those of three of the other methods, while Metagene fails to recognize genes in this length range. The experiments also proved that IASPLS can improve the identification accuracy in comparison with other widely used classifiers, i.e. Logistic, Random Forest (RF) and K nearest neighbors (KNN). The accuracy using IASPLS was improved by 5.90% or more in comparison with the other methods. In addition to the improvements in accuracy, IASPLS required ten times less computer time than KNN or RF. Conclusions We conclude that our method is preferable for application as an automated method of short gene classification. Its linearity and easily optimized parameters make it practicable for predicting short genes of newly-sequenced or under-studied species. Reviewers This article was reviewed by Alexey Kondrashov, Rajeev Azad (nominated by Dr J. Peter Gogarten) and Yuriy Fofanov (nominated by Dr Janet Siefert). PMID:24067167
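As a rough sketch of the classification setup (not IASPLS itself, whose iterative adaptive sparsity is not reproduced here), the following computes two of the cited feature sets, GC content and the Z-curve components, and thresholds the score of an ordinary partial least squares regression; all sequences and labels are toy placeholders.

```python
# Sketch of PLS-based coding/non-coding classification on two of the cited
# feature sets (GC content, Z-curve). Plain PLSRegression stands in for the
# paper's IASPLS; sequences and labels below are toy placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def gc_content(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

def z_curve(seq):
    # Purine/pyrimidine, amino/keto, and weak/strong H-bond disparities.
    a, c, g, t = (seq.count(b) for b in "ACGT")
    n = len(seq)
    return ((a + g - c - t) / n, (a + c - g - t) / n, (a + t - g - c) / n)

def features(seq):
    return [gc_content(seq), *z_curve(seq)]

seqs = ["ATGGCGTGCGGA", "ATATATATATTA", "GCGGCGAGGCTG", "TTTTAATATATA"]
labels = np.array([1, 0, 1, 0])              # 1 = coding, 0 = non-coding (toy)

X = np.array([features(s) for s in seqs])
pls = PLSRegression(n_components=2).fit(X, labels)
predicted = (pls.predict(X).ravel() > 0.5).astype(int)  # threshold the PLS score
print(predicted)
```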
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.
1979-09-01
A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and on providing quantitative uncertainty estimates. Documentation includes a review of the method, the structure of the code, input formats, and examples.
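The abstract does not reproduce the equations, but the generalized least-squares update typically applied by evaluation codes of this kind can be sketched as follows (notation assumed here, not taken from the FERRET documentation):

```latex
% Generalized least-squares adjustment (sketch; notation assumed):
% prior parameters x_0 with covariance C_0, measurements y with covariance V,
% and a sensitivity matrix G mapping parameters to measured quantities.
\hat{x} = x_0 + C_0 G^{\mathsf{T}}\left(G C_0 G^{\mathsf{T}} + V\right)^{-1}\left(y - G x_0\right),
\qquad
\hat{C} = C_0 - C_0 G^{\mathsf{T}}\left(G C_0 G^{\mathsf{T}} + V\right)^{-1} G C_0 .
```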
Klann, Jeffrey G; Phillips, Lori C; Turchin, Alexander; Weiler, Sarah; Mandl, Kenneth D; Murphy, Shawn N
2015-12-11
Interoperable phenotyping algorithms, needed to identify patient cohorts meeting eligibility criteria for observational studies or clinical trials, require medical data in a consistent structured, coded format. Data heterogeneity limits such algorithms' applicability. Existing approaches are often not widely interoperable, or have low sensitivity due to reliance on the lowest common denominator (ICD-9 diagnoses). In the Scalable Collaborative Infrastructure for a Learning Healthcare System (SCILHS) we endeavor to use the widely-available Current Procedural Terminology (CPT) procedure codes with ICD-9. Unfortunately, CPT changes drastically year-to-year - codes are retired and replaced. Longitudinal analysis requires grouping retired and current codes. BioPortal provides a navigable CPT hierarchy, which we imported into the Informatics for Integrating Biology and the Bedside (i2b2) data warehouse and analytics platform. However, this hierarchy does not include retired codes. We compared BioPortal's 2014AA CPT hierarchy with Partners Healthcare's SCILHS datamart, comprising three million patients' data over 15 years. 573 CPT codes were not present in 2014AA (6.5 million occurrences). No existing terminology provided hierarchical linkages for these missing codes, so we developed a method that automatically places missing codes in the most specific "grouper" category, using the numerical similarity of CPT codes. Two informaticians reviewed the results. We incorporated the final table into our i2b2 SCILHS/PCORnet ontology, deployed it at seven sites, and performed a gap analysis and an evaluation against several phenotyping algorithms. The reviewers found the method placed the code correctly with 97% precision when considering only miscategorizations ("correctness precision") and 52% precision using a gold standard of optimal placement ("optimality precision"). High correctness precision meant that codes were placed in a reasonable hierarchical position that a reviewer can quickly validate. Lower optimality precision meant that codes were often not placed in the optimal hierarchical subfolder. The seven sites encountered few occurrences of codes outside our ontology, 93% of which comprised just four codes. Our hierarchical approach correctly grouped retired and non-retired codes in most cases and extended the temporal reach of several important phenotyping algorithms. We developed a simple, easily-validated, automated method to place retired CPT codes into the BioPortal CPT hierarchy. This complements existing hierarchical terminologies, which do not include retired codes. The approach's utility is confirmed by the high correctness precision and successful grouping of retired with non-retired codes.
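A minimal sketch of the placement idea: assign a missing code to the grouper whose numeric CPT range contains it or, failing that, lies numerically closest. The grouper labels and ranges below are hypothetical; the real BioPortal hierarchy is far richer.

```python
# Sketch of numeric-similarity placement of a retired CPT code into a grouper.
# Grouper labels and ranges below are hypothetical, for illustration only.
GROUPERS = {
    "Surgery/Cardiovascular System": (33010, 37799),
    "Radiology/Diagnostic Imaging": (70010, 76499),
    "Pathology/Chemistry": (82009, 84999),
}

def place_retired_code(code):
    n = int(code)
    def distance(rng):
        low, high = rng
        # Zero if the code falls inside the range, else gap to the nearest bound.
        return 0 if low <= n <= high else min(abs(n - low), abs(n - high))
    return min(GROUPERS, key=lambda name: distance(GROUPERS[name]))

print(place_retired_code("33968"))   # -> Surgery/Cardiovascular System
print(place_retired_code("78990"))   # nearest range wins for out-of-range codes
```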
Varieties of numerical abilities.
Dehaene, S
1992-08-01
This paper provides a tutorial introduction to numerical cognition, with a review of essential findings and current points of debate. A tacit hypothesis in cognitive arithmetic is that numerical abilities derive from human linguistic competence. One aim of this special issue is to confront this hypothesis with current knowledge of number representations in animals, infants, normal and gifted adults, and brain-lesioned patients. First, the historical evolution of number notations is presented, together with the mental processes for calculating and transcoding from one notation to another. While these domains are well described by formal symbol-processing models, this paper argues that such is not the case for two other domains of numerical competence: quantification and approximation. The evidence for counting, subitizing and numerosity estimation in infants, children, adults and animals is critically examined. Data are also presented which suggest a specialization for processing approximate numerical quantities in animals and humans. A synthesis of these findings is proposed in the form of a triple-code model, which assumes that numbers are mentally manipulated in an arabic, verbal or analogical magnitude code depending on the requested mental operation. Only the analogical magnitude representation seems available to animals and preverbal infants.
Potential coordination role between O-GlcNAcylation and epigenetics.
Wu, Donglu; Cai, Yong; Jin, Jingji
2017-10-01
Dynamic changes of the post-translational O-GlcNAc modification (O-GlcNAcylation) are controlled by O-linked β-N-acetylglucosamine (O-GlcNAc) transferase (OGT) and the glycoside hydrolase O-GlcNAcase (OGA) in cells. O-GlcNAcylation often occurs on serine (Ser) and threonine (Thr) residues of specific substrate proteins via the addition of an O-GlcNAc group by OGT. O-GlcNAcylation is not only involved in many fundamental cellular processes but also plays an important role in cancer development through various mechanisms. Recently, accumulating data have revealed that O-GlcNAcylation of histones or non-histone proteins can initiate subsequent biological processes, suggesting that O-GlcNAcylation, as a 'protein code' or 'histone code', may provide recognition platforms or executive instructions for the subsequent recruitment of proteins to carry out specific functions. In this review, we summarize the interaction of O-GlcNAcylation and epigenetic changes, introduce recent research findings that link crosstalk between O-GlcNAcylation and epigenetic changes, and speculate on the potential coordination role of O-GlcNAcylation with epigenetic changes in intracellular biological processes.
A systems neurophysiology approach to voluntary event coding.
Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian
2016-07-15
Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward, with the "Theory of Event Coding" (TEC) being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which sub-processes, dissociable by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that the processes involved in voluntary event coding included predominantly stimulus categorization, feature unbinding and response selection, which were reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). On a systems neurophysiology level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but not the other processes. The perceptual categorization stage appears central for voluntary event-file coding. Copyright © 2016 Elsevier Inc. All rights reserved.
Development and validation of a registry-based definition of eosinophilic esophagitis in Denmark
Dellon, Evan S; Erichsen, Rune; Pedersen, Lars; Shaheen, Nicholas J; Baron, John A; Sørensen, Henrik T; Vyberg, Mogens
2013-01-01
AIM: To develop and validate a case definition of eosinophilic esophagitis (EoE) in the linked Danish health registries. METHODS: For case definition development, we queried the Danish medical registries from 2006-2007 to identify candidate cases of EoE in Northern Denmark. All International Classification of Diseases-10 (ICD-10) and prescription codes were obtained, and archived pathology slides were obtained and re-reviewed to determine case status. We used an iterative process to select inclusion/exclusion codes, refine the case definition, and optimize sensitivity and specificity. We then re-queried the registries from 2008-2009 to yield a validation set. The case definition algorithm was applied, and sensitivity and specificity were calculated. RESULTS: Of the 51 and 49 candidate cases identified in both the development and validation sets, 21 and 24 had EoE, respectively. Characteristics of EoE cases in the development set [mean age 35 years; 76% male; 86% dysphagia; 103 eosinophils per high-power field (eos/hpf)] were similar to those in the validation set (mean age 42 years; 83% male; 67% dysphagia; 77 eos/hpf). Re-review of archived slides confirmed that the pathology coding for esophageal eosinophilia was correct in greater than 90% of cases. Two registry-based case algorithms based on pathology, ICD-10, and pharmacy codes were successfully generated in the development set, one that was sensitive (90%) and one that was specific (97%). When these algorithms were applied to the validation set, they remained sensitive (88%) and specific (96%). CONCLUSION: Two registry-based definitions, one highly sensitive and one highly specific, were developed and validated for the linked Danish national health databases, making future population-based studies feasible. PMID:23382628
Recent advances in lossless coding techniques
NASA Astrophysics Data System (ADS)
Yovanof, Gregory S.
Current lossless techniques are reviewed with reference to both sequential data files and still images. Two major groups of sequential algorithms, dictionary and statistical techniques, are discussed. In particular, attention is given to Lempel-Ziv coding, Huffman coding, and arithmetic coding. The subject of lossless compression of imagery is briefly discussed. Finally, examples of practical implementations of lossless algorithms and some simulation results are given.
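Of the statistical techniques named, Huffman coding is the simplest to sketch: merge the two least-frequent symbols repeatedly to build a binary tree, then read prefix-free codewords off the root-to-leaf paths. The toy coder below illustrates the idea and is not code from the review.

```python
# Toy Huffman coder: frequency-sorted merging via a min-heap, then a tree walk.
import heapq
from collections import Counter

def huffman_codes(data):
    freq = Counter(data)
    if len(freq) == 1:                       # degenerate single-symbol input
        return {next(iter(freq)): "0"}
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    next_id = len(heap)                      # unique ids keep tuples comparable
    while len(heap) > 1:                     # merge two least-frequent nodes
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next_id, (left, right)))
        next_id += 1
    codes = {}
    def walk(node, prefix):                  # read codewords off the tree
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

text = "abracadabra"
table = huffman_codes(text)
bits = "".join(table[ch] for ch in text)
print(table, f"{len(bits)} bits vs {8 * len(text)} uncompressed")
```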
A unified coding strategy for processing faces and voices
Yovel, Galit; Belin, Pascal
2013-01-01
Both faces and voices are rich in socially-relevant information, which humans are remarkably adept at extracting, including a person's identity, age, gender, affective state, personality, etc. Here, we review accumulating evidence from behavioral, neuropsychological, electrophysiological, and neuroimaging studies which suggest that the cognitive and neural processing mechanisms engaged by perceiving faces or voices are highly similar, despite the very different nature of their sensory input. The similarity between the two mechanisms likely facilitates the multi-modal integration of facial and vocal information during everyday social interactions. These findings emphasize a parsimonious principle of cerebral organization, where similar computational problems in different modalities are solved using similar solutions. PMID:23664703
Measuring the complexity of design in real-time imaging software
NASA Astrophysics Data System (ADS)
Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.
2007-02-01
Due to the intricacies in the algorithms involved, the design of imaging software is considered to be more complex than that of non-image processing software (Sangwan et al., 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image processing and non-image processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity between imaging applications and non-image processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provide a greatly simplified approach to measuring software complexity. Therefore it may be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply a new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.
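As a concrete example of one metric in that lineage, McCabe's cyclomatic complexity can be approximated by counting branch points in a function's syntax tree; the sketch below uses Python's ast module and simplifies the full metric (for instance, multi-operand boolean expressions are counted once).

```python
# Approximate McCabe cyclomatic complexity: 1 + number of branch points found
# by walking the abstract syntax tree. A simplification of the full metric.
import ast

BRANCHES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
            ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source):
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCHES) for node in ast.walk(tree))

src = """
def clamp(x, lo, hi):
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x
"""
print(cyclomatic_complexity(src))  # -> 3: two ifs plus the linear path
```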
Chan, Teresa; Orlich, Donika; Kulasegaram, Kulamakan; Sherbino, Jonathan
2013-01-01
To define the important elements of an emergency department (ED) consultation request and to develop a simple model of the process. From March to September 2010, 61 physicians (21 emergency medicine [EM], 20 general surgery [GS], 20 internal medicine [IM]; 31 residents, 30 attending staff) were questioned about how junior learners should be taught about ED consultation. Two investigators independently reviewed focus group and interview transcripts using grounded theory to generate an index of themes until saturation was reached. Disagreements were resolved by consensus, yielding an inventory of themes and subthemes. All transcripts were coded using this index of themes; 30% of transcripts were coded in duplicate to determine agreement. A total of 245 themes and subthemes were identified. The agreement between reviewers was 77%. Important themes in the process were as follows: initial preparation and review of investigations by the EM physician (overall endorsement 87% [range 70-100% in different groups]); identification of involved parties (patient and involved physicians) (100%); hypothesis of the patient's diagnosis (75% [range 62-83%]) or question for the consulting physician (70% [range 55-95%]); urgency (100%) and stability (74% [range 62-80%]); questions from the consultant (100%); discussion/communication (98% [range 95-100%]); and feedback (98% [range 95-100%]). These components were reorganized into a simple framework (PIQUED). Each clinical specialty significantly contributed to the model (χ² = 7.9; p = 0.019). Each group contributed uniquely to the final list of important elements (percent contributions: EM, 57%; GS, 41%; IM, 64%). We define important elements of an ED consultation with input from emergency and consulting physicians. We propose a model that organizes these elements into a simple framework (PIQUED) that may be valuable for junior learners.
NASA Astrophysics Data System (ADS)
Neakrase, Lynn; Hornung, Danae; Sweebe, Kathrine; Huber, Lyle; Chanover, Nancy J.; Stevenson, Zena; Berdis, Jodi; Johnson, Joni J.; Beebe, Reta F.
2017-10-01
The Research and Analysis programs within NASA's Planetary Science Division now require archiving of resultant data with the Planetary Data System (PDS) or an equivalent archive. The PDS Atmospheres Node is developing an online environment for assisting data providers with this task. The Educational Labeling System for Atmospheres (ELSA) is being built with Django/Python to provide an easier environment for facilitating not only communication with the PDS node, but also the process of learning, developing, submitting, and reviewing archive bundles under the new PDS4 archiving standard. Under the PDS4 standard, data are archived in bundles, collections, and basic products that form an organizational hierarchy of interconnected labels describing the data and the relationships between the data and its documentation. PDS4 labels are implemented using Extensible Markup Language (XML), an international standard for managing metadata. Potential data providers entering the ELSA environment can learn more about PDS4, plan and develop label templates, and build their archive bundles. ELSA provides an interface to tailor label templates, aiding in the creation of the required internal Logical Identifiers (URN - Uniform Resource Names) and Context References (missions, instruments, targets, facilities, etc.). The underlying structure of ELSA uses Django/Python code that makes maintaining and updating the interface easy for our undergraduate and graduate students. The ELSA environment will soon provide an interface for using the tailored templates in a pipeline to produce entire collections of labeled products, essentially building the user's archive bundle. Once the pieces of the archive bundle are assembled, ELSA provides options for queuing the completed bundle for peer review. The peer review process has also been streamlined for online access and tracking to help make the archiving process with PDS as transparent as possible. We discuss the current status of ELSA and provide examples of its implementation.
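For flavor, a toy version of the labeling step is sketched below: PDS4 labels are schema-governed XML whose products carry URN-style logical identifiers. The element names follow the general PDS4 pattern, but the output is illustrative only, not a valid ELSA-produced label.

```python
# Toy XML label with a PDS4-style logical identifier. Real PDS4 labels are
# schema-governed; this minimal structure is illustrative, not valid PDS4.
import xml.etree.ElementTree as ET

def make_label(bundle, collection, product):
    lid = f"urn:nasa:pds:{bundle}:{collection}:{product}"  # URN-style LID
    root = ET.Element("Product_Observational")
    ident = ET.SubElement(root, "Identification_Area")
    ET.SubElement(ident, "logical_identifier").text = lid
    ET.SubElement(ident, "version_id").text = "1.0"
    return ET.tostring(root, encoding="utf-8")

print(make_label("atmos_demo", "data_calibrated", "orbit_0001").decode())
```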
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-26
... costs and benefits of the rule and to identify any relevant changes in technology that have occurred... access to care; Whether the public health benefits of an action have been realized; Whether the public or... reviewing under E.O. 13563 is the Bar Code Final Rule. The Agency plans to reassess its costs and benefits...
ERIC Educational Resources Information Center
Pon-Barry, Heather; Packard, Becky Wai-Ling; St. John, Audrey
2017-01-01
A dilemma within computer science departments is developing sustainable ways to expand capacity within introductory computer science courses while remaining committed to inclusive practices. Training near-peer mentors for peer code review is one solution. This paper describes the preparation of near-peer mentors for their role, with a focus on…
Recent Developments in the Code RITRACKS (Relativistic Ion Tracks)
NASA Technical Reports Server (NTRS)
Plante, Ianik; Ponomarev, Artem L.; Blattnig, Steve R.
2018-01-01
The code RITRACKS (Relativistic Ion Tracks) was developed to simulate detailed stochastic radiation track structures of ions of different types and energies. Many new capabilities have been added to the code in recent years. Several options were added to specify the times at which the tracks appear in the irradiated volume, allowing the simulation of dose-rate effects. The code has been used to simulate energy deposition in several targets: spherical, ellipsoidal and cylindrical. More recently, density changes as well as a spherical shell were implemented for spherical targets, in order to simulate energy deposition in walled tissue-equivalent proportional counters. RITRACKS is used as a part of the new program BDSTracks (Biological Damage by Stochastic Tracks) to simulate several types of chromosome aberrations in various irradiation conditions. The simulation of damage to various DNA structures (linear and chromatin fiber) by direct and indirect effects has been improved and is ongoing. Many improvements were also made to the graphical user interface (GUI), including the addition of several labels allowing changes of units. A new GUI has been added to display the electron ejection vectors. The parallel calculation capabilities, notably the pre- and post-simulation processing on Windows and Linux machines, have been reviewed to make them more portable between different systems. The calculation part is currently maintained in an Atlassian Stash® repository for code tracking and possibly future collaboration.
Converting Panax ginseng DNA and chemical fingerprints into two-dimensional barcode.
Cai, Yong; Li, Peng; Li, Xi-Wen; Zhao, Jing; Chen, Hai; Yang, Qing; Hu, Hao
2017-07-01
In this study, we investigated how to convert the Panax ginseng DNA sequence code and chemical fingerprints into a two-dimensional code. In order to improve the compression efficiency, GATC2Bytes and digital merger compression algorithms are proposed. HPLC chemical fingerprint data of 10 groups of P. ginseng from Northeast China and the internal transcribed spacer 2 (ITS2) sequence code as the DNA sequence code were prepared for conversion. In order to convert such data into a two-dimensional code, the following six steps were performed: First, the chemical fingerprint characteristic data sets were obtained through the inflection filtering algorithm. Second, these data sets were precompressed. Third, the P. ginseng DNA (ITS2) sequence codes were precompressed. Fourth, the precompressed chemical fingerprint data and the DNA (ITS2) sequence code were combined in accordance with the set data format. Fifth, the combined data were compressed with Zlib, an open-source data compression library. Finally, the compressed data were used to generate a two-dimensional code called a quick response (QR) code. Through the abovementioned conversion process, it can be seen that the number of bytes needed for storing P. ginseng chemical fingerprints and its DNA (ITS2) sequence code can be greatly reduced. After GATC2Bytes algorithm processing, the ITS2 compression rate reaches 75%, and the chemical fingerprint compression rate exceeds 99.65% via the filtration and digital merger compression algorithms. The overall compression ratio therefore exceeds 99.36%. The capacity of the resulting QR code is around 0.5 kB, which can easily and successfully be read and identified by any smartphone. P. ginseng chemical fingerprints and its DNA (ITS2) sequence code can thus form a QR code after data processing, and the QR code can therefore be a perfect carrier of information on the authenticity and quality of P. ginseng. This study provides a theoretical basis for the development of a quality traceability system for traditional Chinese medicine based on two-dimensional codes.
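The final two steps of the pipeline, Zlib compression and QR generation, can be sketched as follows; the sequence and fingerprint strings are placeholders, the GATC2Bytes and digital merger preprocessing is not reproduced, and `qrcode` is a third-party Python package.

```python
# Sketch of the final two steps (Zlib compression, QR generation). The
# GATC2Bytes/digital-merger preprocessing is not reproduced; the data below
# are stand-ins. Requires `pip install qrcode[pil]`.
import zlib
import qrcode

its2 = "ATCGATCGGGCC" * 20                   # stand-in for the ITS2 sequence
fingerprint = "12.3,15.8,19.4,22.1"          # stand-in for HPLC peak data
payload = f"{its2}|{fingerprint}".encode()

compressed = zlib.compress(payload, level=9)
print(len(payload), "->", len(compressed), "bytes")

img = qrcode.make(compressed)                # QR byte mode can carry binary data
img.save("ginseng_qr.png")
```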
Complying with current Joint Commission Statement of Conditions (SOC) requirements.
Erickson, D; Berek, B; Mills, G
1997-01-01
This Technical Document has been developed to provide the reader with insight into the Joint Commission on Accreditation of Healthcare Organizations' (JCAHO) Statement of Conditions (SOC) process and recent changes for completing the SOC for Business Occupancies. The intent of this document is not to replace the instructions in Part 1 of the SOC or to give a complete review of the National Fire Protection Association's (NFPA) Life Safety Code for health care or business occupancies, but rather to complement them.
Energy and technology review: Engineering modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabayan, H.S.; Goudreau, G.L.; Ziolkowski, R.W.
1986-10-01
This report presents information concerning: Modeling Canonical Problems in Electromagnetic Coupling Through Apertures; Finite-Element Codes for Computing Electrostatic Fields; Finite-Element Modeling of Electromagnetic Phenomena; Modeling Microwave-Pulse Compression in a Resonant Cavity; Lagrangian Finite-Element Analysis of Penetration Mechanics; Crashworthiness Engineering; Computer Modeling of Metal-Forming Processes; Thermal-Mechanical Modeling of Tungsten Arc Welding; Modeling Air Breakdown Induced by Electromagnetic Fields; Iterative Techniques for Solving Boltzmann's Equations for p-Type Semiconductors; Semiconductor Modeling; and Improved Numerical-Solution Techniques in Large-Scale Stress Analysis.
Factor Information Retrieval System version 2.0 (FIRE) (for microcomputers). Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
FIRE Version 2.0 contains EPA's recommended emission estimation factors for criteria and toxic air pollutants. FIRE consists of: (1) an EPA internal repository system that contains the emission factor data identified and collected, and (2) an external distribution system that contains only EPA's recommended factors. The emission factors, compiled from a review of the literature, are identified by pollutant name, CAS number, process and emission source descriptions, SIC code, SCC, and control status. The factors are rated for quality using AP-42 rating criteria.
Epigenetic Therapy in Lung Cancer - Role of microRNAs.
Rothschild, Sacha I
2013-01-01
Lung cancer is the leading cause of cancer deaths worldwide. microRNAs (miRNAs) are a class of small non-coding RNA species that have been implicated in the control of many fundamental cellular and physiological processes such as cellular differentiation, proliferation, apoptosis, and stem cell maintenance. Some miRNAs have been categorized as "oncomiRs" as opposed to "tumor suppressor miRs." This review focuses on the role of miRNAs in the lung cancer carcinogenesis and their potential as diagnostic, prognostic, or predictive markers.
Data processing with microcode designed with source coding
McCoy, James A; Morrison, Steven E
2013-05-07
Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.
Software engineering and automatic continuous verification of scientific software
NASA Astrophysics Data System (ADS)
Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.
2011-12-01
Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering among scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employs best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code, and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparison to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development and advocate similar procedures for other scientific code applications.
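The method of manufactured solutions mentioned above can be stated compactly; this is a generic sketch, not Fluidity's specific test setup:

```latex
% Method of manufactured solutions (generic sketch). For a PDE operator L,
% pick an analytic field u_m, move its residual to the source term, and check
% that the discrete solution u_h converges to u_m at the scheme's design order p:
\mathcal{L}(u) = s, \qquad s := \mathcal{L}(u_m), \qquad
\lVert u_h - u_m \rVert = \mathcal{O}\!\left(h^{p}\right) \quad \text{as } h \to 0 .
```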
Re-engineering NASA's space communications to remain viable in a constrained fiscal environment
NASA Astrophysics Data System (ADS)
Hornstein, Rhoda Shaller; Hei, Donald J., Jr.; Kelly, Angelita C.; Lightfoot, Patricia C.; Bell, Holland T.; Cureton-Snead, Izeller E.; Hurd, William J.; Scales, Charles H.
1994-11-01
Along with the Red and Blue Teams commissioned by the NASA Administrator in 1992, NASA's Associate Administrator for Space Communications commissioned a Blue Team to review the Office of Space Communications (Code O) Core Program and determine how the program could be conducted faster, better, and cheaper. Since there was no corresponding Red Team for the Code O Blue Team, the Blue Team assumed a Red Team independent attitude and challenged the status quo, including current work processes, functional distinctions, interfaces, and information flow, as well as traditional management and system development practices. The Blue Team's unconstrained, non-parochial, and imaginative look at NASA's space communications program produced a simplified representation of the space communications infrastructure that transcends organizational and functional boundaries, in addition to existing systems and facilities. Further, the Blue Team adapted the 'faster, better, cheaper' charter to be relevant to the multi-mission, continuous nature of the space communications program and to serve as a gauge for improving customer services concurrent with achieving more efficient operations and infrastructure life cycle economies. This simplified representation, together with the adapted metrics, offers a future view and process model for reengineering NASA's space communications to remain viable in a constrained fiscal environment. Code O remains firm in its commitment to improve productivity, effectiveness, and efficiency. In October 1992, the Associate Administrator reconstituted the Blue Team as the Code O Success Team (COST) to serve as a catalyst for change. In this paper, the COST presents the chronicle and significance of the simplified representation and adapted metrics, and their application during the FY 1993-1994 activities.
Tompkins, Connie A.
2009-01-01
This article reviews and evaluates leading accounts of narrative comprehension deficits in adults with focal damage to the right cerebral hemisphere (RHD). It begins with a discussion of models of comprehension, which explain how comprehension proceeds through increasingly complex levels of representation. These models include two phases of comprehension processes: broad activation of information, followed by pruning and focusing of the interpretation of meaning based on context. The potential effects of RHD on each processing phase are reviewed, focusing on factors that range from relatively specific (e.g., how the right versus the left hemisphere activates word meanings; how the right hemisphere is involved in inferencing) to more general (the influence of cognitive resource factors; the role of suppression of contextually irrelevant information). Next, two specific accounts of RHD comprehension difficulties, coarse coding and suppression deficit, are described. These have been construed as opposing processes, but a possible reconciliation is proposed, related to the different phases of comprehension and the extent of meaning activation. Finally, the article addresses the influences of contextual constraint on language processing and the continuity of literal and nonliteral language processing, two areas in which future developments may assist our clinical planning. PMID:20011667
Little, Elizabeth A; Presseau, Justin; Eccles, Martin P
2015-06-17
Behavioural theory can be used to better understand the effects of behaviour change interventions targeting healthcare professional behaviour to improve quality of care. However, the explicit use of theory is rarely reported, despite interventions inevitably involving at least an implicit idea of what factors to target to implement change. There is a quality of care gap in the post-fracture investigation (bone mineral density (BMD) scanning) and management (bisphosphonate prescription) of patients at risk of osteoporosis. We aimed to use the Theoretical Domains Framework (TDF) within a systematic review of interventions to improve quality of care in post-fracture investigation. Our objectives were to explore which theoretical factors the interventions in the review may have been targeting and how this might be related to the size of the effect on rates of BMD scanning and osteoporosis treatment with bisphosphonate medication. A behavioural scientist and a clinician independently coded TDF domains in intervention and control groups. Quantitative analyses explored the relationship between intervention effect size and both the total number of times domains were targeted and the number of different domains targeted. Nine randomised controlled trials (RCTs) (10 interventions) were analysed. The five theoretical domains most frequently coded as being targeted by the interventions in the review were "memory, attention and decision processes", "knowledge", "environmental context and resources", "social influences" and "beliefs about consequences". Each intervention targeted a combination of at least four of these five domains. Analyses identified an inverse relationship between the effect size for BMD scanning and both the number of times domains were coded and the number of different domains coded, but no such relationship for bisphosphonate prescription, suggesting that the more domains the intervention targeted, the lower the observed effect size. When explicit use of theory to inform interventions is absent, it is possible to retrospectively identify the likely targeted factors using theoretical frameworks such as the TDF. In osteoporosis management, this suggested that several likely determinants of healthcare professional behaviour appear not yet to have been considered in implementation interventions. This approach may serve as a useful basis for using theory-based frameworks such as the TDF to retrospectively identify targeted factors within systematic reviews of implementation interventions in other implementation contexts.
PRISM, Processing and Review Interface for Strong Motion Data Software
NASA Astrophysics Data System (ADS)
Kalkan, E.; Jones, J. M.; Stephens, C. D.; Ng, P.
2016-12-01
A continually increasing number of high-quality digital strong-motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the U.S., calls for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. PRISM automates the processing of strong-motion records by providing batch-processing capabilities. The PRISM software is platform-independent (coded in Java), open-source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a graphical user interface (GUI) for manual review and processing. To facilitate the use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and GUI components) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X and Windows. PRISM was designed to be flexible and extensible in order to accommodate implementation of new processing techniques. Input to PRISM currently is limited to data files in the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS) V0 format, so that all retrieved acceleration time series need to be converted to this format. Output products include COSMOS V1, V2 and V3 files as: (i) raw acceleration time series in physical units with mean removed (V1), (ii) baseline-corrected and filtered acceleration, velocity, and displacement time series (V2), and (iii) response spectra, Fourier amplitude spectra and common earthquake-engineering intensity measures (V3). A thorough description of the record processing features supported by PRISM is presented with examples and validation results. All computing features have been thoroughly tested.
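A bare-bones sketch of a V0-to-V2-style pass (mean removal, band-pass filtering, integration to velocity and displacement) is given below; PRISM's production algorithms, such as adaptive baseline correction, are more elaborate, and the sampling rate and corner frequencies here are assumptions.

```python
# Sketch of a V0 -> V2-style processing pass: mean removal ("V1"), zero-phase
# band-pass filtering, and integration to velocity/displacement ("V2").
# The synthetic record, sampling rate, and corner frequencies are assumptions.
import numpy as np
from scipy import signal
from scipy.integrate import cumulative_trapezoid

dt = 0.01                                    # 100 samples/s (assumed)
t = np.arange(0, 60, dt)
raw = np.sin(2 * np.pi * 1.5 * t) + 0.02 * np.random.randn(t.size) + 0.3

accel = raw - raw.mean()                     # "V1": mean removed
sos = signal.butter(4, [0.1, 25.0], btype="bandpass", fs=1 / dt, output="sos")
accel_f = signal.sosfiltfilt(sos, accel)     # zero-phase band-pass, "V2" style

vel = cumulative_trapezoid(accel_f, dx=dt, initial=0.0)
disp = cumulative_trapezoid(vel, dx=dt, initial=0.0)
print(accel_f[:3], vel[-1], disp[-1])
```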
Digital signal processing of the phonocardiogram: review of the most recent advancements.
Durand, L G; Pibarot, P
1995-01-01
The objective of the present paper is to provide a detailed review of the most recent developments in instrumentation and signal processing of digital phonocardiography and heart auscultation. After a short introduction, the paper presents a brief history of heart auscultation and phonocardiography, which is followed by a summary of the basic theories and controversies regarding the genesis of the heart sounds. The application of spectral analysis and the potential of new time-frequency representations and cardiac acoustic mapping to resolve the controversies and better understand the genesis and transmission of heart sounds and murmurs within the heart-thorax acoustic system are reviewed. The most recent developments in the application of linear predictive coding, spectral analysis, time-frequency representation techniques, and pattern recognition for the detection and follow-up of native and prosthetic valve degeneration and dysfunction are also presented in detail. New areas of research and clinical applications and areas of potential future developments are then highlighted. The final section is a discussion about a multidegree of freedom theory on the origin of the heart sounds and murmurs, which is completed by the authors' conclusion.
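As a small example of the time-frequency representations the review surveys, the following computes a spectrogram of a synthetic two-component heart-sound-like signal; the frequencies and envelopes are invented for illustration.

```python
# Spectrogram of a synthetic "lub-dub"-like signal: two Gaussian-windowed
# tones standing in for the first and second heart sounds. Parameters invented.
import numpy as np
from scipy import signal

fs = 2000                                    # Hz, assumed sampling rate
t = np.arange(0, 1.0, 1 / fs)
s1 = np.sin(2 * np.pi * 50 * t) * np.exp(-((t - 0.10) / 0.020) ** 2)
s2 = np.sin(2 * np.pi * 70 * t) * np.exp(-((t - 0.45) / 0.015) ** 2)
pcg = s1 + s2 + 0.01 * np.random.randn(t.size)

f, tt, Sxx = signal.spectrogram(pcg, fs=fs, nperseg=256, noverlap=192)
print(Sxx.shape)  # frequency bins x time frames, ready for plotting
```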
Country Report on Building Energy Codes in Australia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shui, Bin; Evans, Meredydd; Somasundaram, Sriram
2009-04-02
This report is part of a series of reports on building energy efficiency codes in countries associated with the Asian Pacific Partnership (APP) - Australia, South Korea, Japan, China, India, and the United States of America (U.S.). This reports gives an overview of the development of building energy codes in Australia, including national energy policies related to building energy codes, history of building energy codes, recent national projects and activities to promote building energy codes. The report also provides a review of current building energy codes (such as building envelope, HVAC, and lighting) for commercial and residential buildings in Australia.
Qiu, Guo-Hua
2016-01-01
In this review, the protective function of the abundant non-coding DNA in the eukaryotic genome is discussed from the perspective of genome defense against exogenous nucleic acids. Peripheral non-coding DNA has been proposed to act as a bodyguard that protects the genome and the central protein-coding sequences from ionizing radiation-induced DNA damage. In the proposed mechanism of protection, the radicals generated by water radiolysis in the cytosol and IR energy are absorbed, blocked and/or reduced by peripheral heterochromatin; then, the DNA damage sites in the heterochromatin are removed and expelled from the nucleus to the cytoplasm through nuclear pore complexes, most likely through the formation of extrachromosomal circular DNA. To strengthen this hypothesis, this review summarizes the experimental evidence supporting the protective function of non-coding DNA against exogenous nucleic acids. Based on these data, I hypothesize herein about the presence of an additional line of defense formed by small RNAs in the cytosol in addition to their bodyguard protection mechanism in the nucleus. Therefore, exogenous nucleic acids may be initially inactivated in the cytosol by small RNAs generated from non-coding DNA via mechanisms similar to the prokaryotic CRISPR-Cas system. Exogenous nucleic acids may enter the nucleus, where some are absorbed and/or blocked by heterochromatin and others integrate into chromosomes. The integrated fragments and the sites of DNA damage are removed by repetitive non-coding DNA elements in the heterochromatin and excluded from the nucleus. Therefore, the normal eukaryotic genome and the central protein-coding sequences are triply protected by non-coding DNA against invasion by exogenous nucleic acids. This review provides evidence supporting the protective role of non-coding DNA in genome defense. Copyright © 2016 Elsevier B.V. All rights reserved.
Emergency readmissions to paediatric surgery and urology: The impact of inappropriate coding
Peeraully, R; Henderson, K; Davies, B
2016-01-01
Introduction In England, emergency readmissions within 30 days of hospital discharge after an elective admission are not reimbursed if they do not meet Payment by Results (PbR) exclusion criteria. However, coding errors could inappropriately penalise hospitals. We aimed to assess the accuracy of coding for emergency readmissions. Methods Emergency readmissions attributed to paediatric surgery and urology between September 2012 and August 2014 to our tertiary referral centre were retrospectively reviewed. Payment by Results (PbR) coding data were obtained from the hospital’s Family Health Directorate. Clinical details were obtained from contemporaneous records. All readmissions were categorised as appropriately coded (postoperative or nonoperative) or inappropriately coded (planned surgical readmission, unrelated surgical admission, unrelated medical admission or coding error). Results Over the 24-month period, 241 patients were coded as 30-day readmissions, with 143 (59%) meeting the PbR exclusion criteria. Of the remaining 98 (41%) patients, 24 (25%) were inappropriately coded as emergency readmissions. These readmissions resulted in 352 extra bed days, of which 117 (33%) were attributable to inappropriately coded cases. Conclusions One-quarter of non-excluded emergency readmissions were inappropriately coded, accounting for one-third of additional bed days. As a stay on a paediatric ward costs up to £500 a day, the potential cost to our institution due to inappropriate readmission coding was over £50,000. Diagnoses and the reason for admission for each care episode should be accurately documented and coded, and readmission data should be reviewed at a senior clinician level. PMID:26924486
Can dialysis patients be accurately identified using healthcare claims data?
Taneja, Charu; Berger, Ariel; Inglese, Gary W; Lamerato, Lois; Sloand, James A; Wolff, Greg G; Sheehan, Michael; Oster, Gerry
2014-01-01
While health insurance claims data are often used to estimate the costs of renal replacement therapy in patients with end-stage renal disease (ESRD), the accuracy of methods used to identify patients receiving dialysis - especially peritoneal dialysis (PD) and hemodialysis (HD) - in these data is unknown. The study population consisted of all persons aged 18-63 years in a large US integrated health plan with ESRD and dialysis-related billing codes (i.e., diagnosis, procedures) on healthcare encounters between January 1, 2005, and December 31, 2008. Using billing codes for all healthcare encounters within 30 days of each patient's first dialysis-related claim ("index encounter"), we attempted to designate each study subject as either a "PD patient" or an "HD patient." Using alternative windows of ±30 days, ±90 days, and ±180 days around the index encounter, we reviewed patients' medical records to determine the dialysis modality actually received. We calculated the positive predictive value (PPV) for each dialysis-related billing code, using information in patients' medical records as the "gold standard." We identified a total of 233 patients with evidence of ESRD and receipt of dialysis in healthcare claims data. Based on examination of billing codes, 43 and 173 study subjects were designated PD patients and HD patients, respectively (14 patients had evidence of PD and HD, and modality could not be ascertained for 31 patients). The PPV of codes used to identify PD patients was low based on a ±30-day medical record review window (34.9%), and increased with use of ±90-day and ±180-day windows (both 67.4%). The PPV of codes used to identify HD patients was uniformly high - 86.7% based on ±30-day review, 90.8% based on ±90-day review, and 93.1% based on ±180-day review. While HD patients could be accurately identified using billing codes in healthcare claims data, case identification was much more problematic for patients receiving PD. Copyright © 2014 International Society for Peritoneal Dialysis.
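The validation arithmetic reduces to a ratio of chart-confirmed cases to claims-flagged cases; in the sketch below the counts are back-calculated (with rounding) from the abstract's percentages and its 43 PD and 173 HD designations.

```python
# PPV = chart-confirmed cases / claims-flagged cases (gold standard: chart
# review). Confirmed counts are back-calculated from the reported percentages.
def ppv(confirmed, flagged):
    return confirmed / flagged

pd_flagged, hd_flagged = 43, 173
print(f"PD, +/-30-day window: {ppv(15, pd_flagged):.1%}")    # ~34.9%
print(f"PD, +/-90/180-day:    {ppv(29, pd_flagged):.1%}")    # ~67.4%
print(f"HD, +/-30-day window: {ppv(150, hd_flagged):.1%}")   # ~86.7%
```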
14 CFR 1215.108 - Defining user service requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... to NASA Headquarters, Code OX, Space Network Division, Washington, DC 20546. Upon review and... submitted in writing to both NASA Headquarters, Code OX, Space Network Division, and GSFC, Code 501.... Request for services within priority groups shall be negotiated with non-NASA users on a first come, first...
Discontinued Codes in The USDA Food and Nutrient Database for Dietary Studies
USDA-ARS?s Scientific Manuscript database
For each new version of the Food and Nutrient Database for Dietary Studies (FNDDS), foods and beverages, portions, and nutrient values are reviewed and updated. New food and beverage codes are added based on changes in consumption and the marketplace; additionally, codes are discontinued. To date,...
Rationale for Student Dress Codes: A Review of School Handbooks
ERIC Educational Resources Information Center
Freeburg, Elizabeth W.; Workman, Jane E.; Lentz-Hees, Elizabeth S.
2004-01-01
Through dress codes, schools establish rules governing student appearance. This study examined stated rationales for dress and appearance codes in secondary school handbooks; 182 handbooks were received. Of 150 handbooks containing a rationale, 117 related dress and appearance regulations to students' right to a non-disruptive educational…
O’Doherty, John P.
2015-01-01
Neural correlates of value have been extensively reported in a diverse set of brain regions. However, in many cases it is difficult to determine whether a particular neural response pattern corresponds to a value-signal per se as opposed to an array of alternative non-value related processes, such as outcome-identity coding, informational coding, encoding of autonomic and skeletomotor consequences, alongside previously described “salience” or “attentional” effects. Here, I review a number of experimental manipulations that can be used to test for value, and I identify the challenges in ascertaining whether a particular neural response is or is not a value signal. Finally, I emphasize that some non-value related signals may be especially informative as a means of providing insight into the nature of the decision-making related computations that are being implemented in a particular brain region. PMID:24726573
[Acute myocardial infarction with ST-segment elevation: Code I].
Borrayo-Sánchez, Gabriela; Rosas-Peralta, Martín; Pérez-Rodríguez, Gilberto; Ramírez-Árias, Erick; Almeida-Gutiérrez, Eduardo; Arriaga-Dávila, José de Jesús
2018-01-01
Code Infarction is a strategy for the timely treatment of acute myocardial infarction (AMI) with ST-segment elevation. This strategy has been shown to increase the survival and quality of life of patients suffering this event around the world. Streamlining the management and disposition processes to reduce the time to effective reperfusion is undoubtedly a continuous challenge. In the Instituto Mexicano del Seguro Social (IMSS), mortality due to AMI has been reduced by more than 50%, a historic result that deserves much attention. Nonetheless, continuous improvement and wider coverage of this strategy in our country are the key factors that will change the natural history of the leading cause of death in Mexico. This review focuses on current strategies for the management of patients with acute myocardial infarction.
Spatial transform coding of color images.
NASA Technical Reports Server (NTRS)
Pratt, W. K.
1971-01-01
The application of the transform-coding concept to the coding of color images represented by three primary color planes of data is discussed. The principles of spatial transform coding are reviewed and the merits of various methods of color-image representation are examined. A performance analysis is presented for the color-image transform-coding system. Results of a computer simulation of the coding system are also given. It is shown that, by transform coding, the chrominance content of a color image can be coded with an average of 1.0 bits per element or less without serious degradation. If luminance coding is also employed, the average rate reduces to about 2.0 bits per element or less.
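A compact numerical sketch of the idea, using a two-dimensional DCT as the spatial transform on a single 8x8 chrominance block; the block size and quantizer step are illustrative assumptions, not the paper's design.

```python
# Spatial transform coding on one chrominance block: 2D DCT, coarse
# quantization, inverse DCT. Block size and quantizer step are illustrative.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
chroma = rng.normal(128, 20, size=(8, 8))    # stand-in 8x8 chrominance block

coeffs = dctn(chroma, norm="ortho")
step = 24.0                                  # coarse quantizer step (assumed)
quantized = np.round(coeffs / step)          # what would be entropy-coded
reconstructed = idctn(quantized * step, norm="ortho")

kept = np.count_nonzero(quantized)
print(f"{kept}/64 coefficients survive; max error "
      f"{np.abs(chroma - reconstructed).max():.1f} gray levels")
```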
Quicklook overview of model changes in Melcor 2.2: Rev 6342 to Rev 9496
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, Larry L.
2017-05-01
MELCOR 2.2 is a significant official release of the MELCOR code with many new models and model improvements. This report provides the code user with a quick review and characterization of new models added, changes to existing models, the effect of code changes during this code development cycle (rev 6342 to rev 9496), and a preview of validation results with this code version. More detailed information can be found in the code's Subversion logs as well as in the User Guide and Reference Manuals.
Quality improvement in neurology: AAN Parkinson disease quality measures
Cheng, E.M.; Tonn, S.; Swain-Eng, R.; Factor, S.A.; Weiner, W.J.; Bever, C.T.
2010-01-01
Background: Measuring the quality of health care is a fundamental step toward improving health care and is increasingly used in pay-for-performance initiatives and maintenance of certification requirements. Measure development to date has focused on primary care and common conditions such as diabetes; thus, the number of measures that apply to neurologic care is limited. The American Academy of Neurology (AAN) identified the need for neurologists to develop measures of neurologic care and to establish a process to accomplish this. Objective: To adapt and test the feasibility of a process for independent development by the AAN of measures for neurologic conditions for national measurement programs. Methods: A process that has been used nationally for measure development was adapted for use by the AAN. Topics for measure development are chosen based upon national priorities, available evidence base from a systematic literature search, gaps in care, and the potential impact for quality improvement. A panel composed of subject matter and measure development methodology experts oversees the development of the measures. Recommendation statements and their corresponding level of evidence are reviewed and considered for development into draft candidate measures. The candidate measures are refined by the expert panel during a 30-day public comment period and by review by the American Medical Association for Current Procedural Terminology (CPT) II codes. All final AAN measures are approved by the AAN Board of Directors. Results: Parkinson disease (PD) was chosen for measure development. A review of the medical literature identified 258 relevant recommendation statements. A 28-member panel approved 10 quality measures for PD that included full specifications and CPT II codes. Conclusion: The AAN has adapted a measure development process that is suitable for national measurement programs and has demonstrated its capability to independently develop quality measures. GLOSSARY AAN = American Academy of Neurology; ABPN = American Board of Psychiatry and Neurology; AMA = American Medical Association; CPT II = Current Procedural Terminology; PCPI = Physician Consortium for Performance Improvement; PD = Parkinson disease; PMAG = Performance Measurement Advisory Group; PQRI = Physician Quality Reporting Initiative; QMR = Quality Measurement and Reporting Subcommittee. PMID:21115958
Validation of Living Donor Nephrectomy Codes
Lam, Ngan N.; Lentine, Krista L.; Klarenbach, Scott; Sood, Manish M.; Kuwornu, Paul J.; Naylor, Kyla L.; Knoll, Gregory A.; Kim, S. Joseph; Young, Ann; Garg, Amit X.
2018-01-01
Background: Use of administrative data for outcomes assessment in living kidney donors is increasing given the rarity of complications and challenges with loss to follow-up. Objective: To assess the validity of living donor nephrectomy in health care administrative databases compared with the reference standard of manual chart review. Design: Retrospective cohort study. Setting: 5 major transplant centers in Ontario, Canada. Patients: Living kidney donors between 2003 and 2010. Measurements: Sensitivity and positive predictive value (PPV). Methods: Using administrative databases, we conducted a retrospective study to determine the validity of diagnostic and procedural codes for living donor nephrectomies. The reference standard was living donor nephrectomies identified through the province’s tissue and organ procurement agency, with verification by manual chart review. Operating characteristics (sensitivity and PPV) of various algorithms using diagnostic, procedural, and physician billing codes were calculated. Results: During the study period, there were a total of 1199 living donor nephrectomies. Overall, the best algorithm for identifying living kidney donors was the presence of 1 diagnostic code for kidney donor (ICD-10 Z52.4) and 1 procedural code for kidney procurement/excision (1PC58, 1PC89, 1PC91). Compared with the reference standard, this algorithm had a sensitivity of 97% and a PPV of 90%. The diagnostic and procedural codes performed better than the physician billing codes (sensitivity 60%, PPV 78%). Limitations: The donor chart review and validation study was performed in Ontario and may not be generalizable to other regions. Conclusions: An algorithm consisting of 1 diagnostic and 1 procedural code can be reliably used to conduct health services research that requires the accurate determination of living kidney donors at the population level. PMID:29662679
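As a sketch of how the best-performing algorithm might be applied in practice, the rule reduces to a simple conjunction of one diagnostic and one procedural code (the column names below are hypothetical; this is not the authors' code):

```python
# Flag a record as a living donor nephrectomy when it carries the diagnostic
# code Z52.4 AND at least one of the procedural codes 1PC58/1PC89/1PC91.
import pandas as pd

PROC_CODES = {"1PC58", "1PC89", "1PC91"}

def flag_donor_nephrectomy(df: pd.DataFrame) -> pd.Series:
    has_dx = df["icd10_codes"].apply(lambda codes: "Z52.4" in codes)
    has_proc = df["proc_codes"].apply(lambda codes: bool(PROC_CODES & set(codes)))
    return has_dx & has_proc

records = pd.DataFrame({
    "icd10_codes": [["Z52.4", "N18.3"], ["I10"]],
    "proc_codes": [["1PC58"], ["1PC91"]],
})
print(flag_donor_nephrectomy(records))  # True, False
```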
Latina food patterns in the United States: a qualitative metasynthesis.
Gerchow, Lauren; Tagliaferro, Barbara; Squires, Allison; Nicholson, Joey; Savarimuthu, Stella M; Gutnick, Damara; Jay, Melanie
2014-01-01
Obesity disproportionately affects Latinas living in the United States, and cultural food patterns contribute to this health concern. The aim of this study was to synthesize the qualitative results of research regarding Latina food patterns in order to (a) identify common patterns across Latino culture and within Latino subcultures and (b) inform future research by determining gaps in the literature. A systematic search of three databases produced 13 studies (15 manuscripts) that met the inclusion criteria for review. The Critical Appraisal Skills Program tool and the recommendations of Squires for evaluating translation methods in qualitative research were applied to appraise study quality. Authors coded through directed content analysis and an adaptation of the Joanna Briggs Institute Qualitative Assessment and Review Instrument coding template to extract themes. Coding focused on food patterns, obesity, population breakdown, immigration, acculturation, and barriers and facilitators to healthy eating. Other themes and categories emerged from this process to complement this approach. Major findings included the following: (a) Immigration driven changes in scheduling, food choice, socioeconomic status, and family dynamics shape the complex psychology behind healthy food choices for Latina women; (b) in Latina populations, barriers and facilitators to healthy lifestyle choices around food are complex; and (c) there is a clear need to differentiate Latino populations by country of origin in future qualitative studies on eating behavior. Healthcare providers need to recognize the complex influences behind eating behaviors among immigrant Latinas in order to design effective behavior change and goal-setting programs to support healthy lifestyles.
NASA Astrophysics Data System (ADS)
Wang, Zhi-peng; Zhang, Shuai; Liu, Hong-zhao; Qin, Yi
2014-12-01
Based on a phase retrieval algorithm and the QR code, a new optical encryption technology that needs to record only one intensity distribution is proposed. In this encryption process, a QR code is first generated from the information to be encrypted; the generated QR code is then placed in the input plane of a 4-f system and subjected to double random phase encryption. Because only one intensity distribution in the output plane is recorded as the ciphertext, the encryption process is greatly simplified. In the decryption process, the corresponding QR code is retrieved using a phase retrieval algorithm. A priori information about the QR code is used as a support constraint in the input plane, which helps solve the stagnation problem. The original information can be recovered without distortion by scanning the QR code. The encryption process can be implemented either optically or digitally, and the decryption process uses a digital method. In addition, the security of the proposed optical encryption technology is analyzed. Theoretical analysis and computer simulations show that this optical encryption system is invulnerable to various attacks and suitable for harsh transmission conditions.
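A minimal numerical sketch of the double random phase encoding step that the scheme builds on is given below, assuming a binary stand-in for the QR image; the phase-retrieval decryption with a QR support constraint is only indicated in comments, and none of this reproduces the authors' implementation:

```python
# 4-f double random phase encoding (DRPE): multiply the input by a random
# phase, Fourier transform, multiply by a second random phase, inverse
# transform, and record only the output intensity as the ciphertext.
import numpy as np

rng = np.random.default_rng(1)
qr = rng.integers(0, 2, (128, 128)).astype(float)   # stand-in QR pattern

phi1 = np.exp(2j * np.pi * rng.random(qr.shape))    # input-plane phase key
phi2 = np.exp(2j * np.pi * rng.random(qr.shape))    # Fourier-plane phase key

field = np.fft.ifft2(np.fft.fft2(qr * phi1) * phi2)
ciphertext = np.abs(field) ** 2                     # the only recorded data

# With both phase keys and the full complex field, decryption is a direct
# inversion (shown below). With only the intensity, a Gerchberg-Saxton-style
# phase-retrieval loop alternates between the measured output amplitude and
# the known QR structure as an input-plane support constraint.
recovered = np.abs(np.fft.ifft2(np.fft.fft2(field) / phi2) / phi1)
assert np.allclose(recovered, qr, atol=1e-8)
```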
Topaz, Maxim; Lai, Kenneth; Dowding, Dawn; Lei, Victor J; Zisberg, Anna; Bowles, Kathryn H; Zhou, Li
2016-12-01
Electronic health records are being increasingly used by nurses, with up to 80% of the health data recorded as free text. However, only a few studies have developed nursing-relevant tools that help busy clinicians to identify information they need at the point of care. This study developed and validated one of the first automated natural language processing applications to extract wound information (wound type, pressure ulcer stage, wound size, anatomic location, and wound treatment) from free-text clinical notes. First, two human annotators manually reviewed a purposeful training sample (n=360) and a random test sample (n=1100) of clinical notes (including 50% discharge summaries and 50% outpatient notes), identified wound cases, and created a gold standard dataset. We then trained and tested our natural language processing system (known as MTERMS) to process the wound information. Finally, we assessed our automated approach by comparing system-generated findings against the gold standard. We also compared the prevalence of wound cases identified from free-text data with coded diagnoses in the structured data. The testing dataset included 101 notes (9.2%) with wound information. The overall system performance was good (F-measure, a composite measure of the system's accuracy, of 92.7%), with the best results for wound treatment (F-measure=95.7%) and the poorest results for wound size (F-measure=81.9%). Only 46.5% of wound notes had a structured code for a wound diagnosis. The natural language processing system achieved good performance on a subset of randomly selected discharge summaries and outpatient notes. In more than half of the wound notes, there were no coded wound diagnoses, which highlights the significance of using natural language processing to enrich clinical decision making. Our future steps will include expansion of the application's information coverage to other relevant wound factors and validation of the model with external data. Copyright © 2016 Elsevier Ltd. All rights reserved.
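For reference, the F-measure cited throughout is the standard harmonic mean of precision and recall:

```latex
F = \frac{2 \cdot \mathrm{precision} \cdot \mathrm{recall}}{\mathrm{precision} + \mathrm{recall}}
```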
A Short Review of Ablative-Material Response Models and Simulation Tools
NASA Technical Reports Server (NTRS)
Lachaud, Jean; Magin, Thierry E.; Cozmuta, Ioana; Mansour, Nagi N.
2011-01-01
A review of the governing equations and boundary conditions used to model the response of ablative materials submitted to a high-enthalpy flow is proposed. The heritage of model-development efforts undertaken in the 1960s is extremely clear: the bases of the models used in the community are mathematically equivalent. Most of the material-response codes implement a single model in which the equation parameters may be modified to model different materials or conditions. The level of fidelity of the models implemented in design tools only slightly varies. Research and development codes are generally more advanced but often not as robust. The capabilities of each of these codes are summarized in a color-coded table along with research and development efforts currently in progress.
HOTAIR: An Oncogenic Long Non-Coding RNA in Human Cancer.
Tang, Qing; Hann, Swei Sunny
2018-05-24
Long non-coding RNAs (lncRNAs) represent a novel class of noncoding RNAs that are longer than 200 nucleotides, lack protein-coding potential, and function as novel master regulators in various human diseases, including cancer. Accumulating evidence shows that lncRNAs are dysregulated and implicated in various aspects of cellular homeostasis, such as proliferation, apoptosis, mobility, invasion, metastasis, chromatin remodeling, gene transcription, and post-transcriptional processing. However, the mechanisms by which lncRNAs regulate various biological functions in human diseases have yet to be determined. HOX antisense intergenic RNA (HOTAIR) is a recently discovered lncRNA that plays a critical role in various aspects of cancer, such as proliferation, survival, migration, drug resistance, and genomic stability. In this review, we briefly introduce the concept, identification, and biological functions of HOTAIR. We then describe its involvement in tumorigenesis, growth, invasion, cancer stem cell differentiation, metastasis, and drug resistance. We also discuss emerging insights into the role of HOTAIR as a potential biomarker and therapeutic target for novel treatment paradigms in cancer. © 2018 The Author(s). Published by S. Karger AG, Basel.
Knowledge management: Role of the the Radiation Safety Information Computational Center (RSICC)
NASA Astrophysics Data System (ADS)
Valentine, Timothy
2017-09-01
The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.
Zucchelli, Silvia; Patrucco, Laura; Persichetti, Francesca; Gustincich, Stefano; Cotella, Diego
2016-01-01
Mammalian cells are an indispensable tool for the production of recombinant proteins in contexts where function depends on post-translational modifications. Among them, Chinese Hamster Ovary (CHO) cells are the primary factories for the production of therapeutic proteins, including monoclonal antibodies (MAbs). To improve expression and stability, several methodologies have been adopted, including methods based on media formulation, selective pressure and cell- or vector engineering. This review presents current approaches aimed at improving mammalian cell factories that are based on the enhancement of translation. Among well-established techniques (codon optimization and improvement of mRNA secondary structure), we describe SINEUPs, a family of antisense long non-coding RNAs that are able to increase translation of partially overlapping protein-coding mRNAs. By exploiting their modular structure, SINEUP molecules can be designed to target virtually any mRNA of interest, and thus to increase the production of secreted proteins. Thus, synthetic SINEUPs represent a new versatile tool to improve the production of secreted proteins in biomanufacturing processes.
X-ray backscatter radiography with lower open fraction coded masks
NASA Astrophysics Data System (ADS)
Muñoz, André A. M.; Vella, Anna; Healy, Matthew J. F.; Lane, David W.; Jupp, Ian; Lockley, David
2017-09-01
Single-sided radiographic imaging would find great utility in medical, aerospace and security applications. While coded apertures can be used to form such an image from backscattered X-rays, they suffer from near-field limitations that introduce noise. Several theoretical studies have indicated that for an extended source the image's signal-to-noise ratio may be optimised by using a low open fraction (<0.5) mask. However, few experimental results have been published for such low open fraction patterns, and details of their formulation are often unavailable or ambiguous. In this paper we address this process for two types of low open fraction mask, the dilute URA and the Singer set array. For the dilute URA, the procedure for producing multiple 2D array patterns from given 1D binary sequences (Barker codes) is explained. Their point spread functions are calculated and their imaging properties are critically reviewed. These results are then compared to those from the Singer set, and experimental exposures are presented for both types of pattern; their prospects for near-field imaging are discussed.
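As a quick illustration of why Barker codes serve as 1D building blocks for such masks (this shows only the sequence property, not the dilute-URA construction itself):

```python
# The aperiodic autocorrelation of the length-13 Barker code has a peak of 13
# and sidelobes of magnitude <= 1, i.e., a sharp point spread function.
import numpy as np

barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])
acf = np.correlate(barker13, barker13, mode="full")
print(acf)   # peak of 13 at the center, all sidelobes in {-1, 0, 1}
assert acf.max() == 13
assert np.all(np.abs(np.delete(acf, len(acf) // 2)) <= 1)
```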
Visual adaptation and face perception
Webster, Michael A.; MacLeod, Donald I. A.
2011-01-01
The appearance of faces can be strongly affected by the characteristics of faces viewed previously. These perceptual after-effects reflect processes of sensory adaptation that are found throughout the visual system, but which have been considered only relatively recently in the context of higher level perceptual judgements. In this review, we explore the consequences of adaptation for human face perception, and the implications of adaptation for understanding the neural-coding schemes underlying the visual representation of faces. The properties of face after-effects suggest that they, in part, reflect response changes at high and possibly face-specific levels of visual processing. Yet, the form of the after-effects and the norm-based codes that they point to show many parallels with the adaptations and functional organization that are thought to underlie the encoding of perceptual attributes like colour. The nature and basis for human colour vision have been studied extensively, and we draw on ideas and principles that have been developed to account for norms and normalization in colour vision to consider potential similarities and differences in the representation and adaptation of faces. PMID:21536555
Pediatric emergency room visits for nontraumatic dental disease.
Graham, D B; Webb, M D; Seale, N S
2000-01-01
This study described the incidence and predisposing, enabling, and need factors of outpatients in a pediatric ER who sought care for nontraumatic preventable dental disease and analyzed treatment rendered by attending physicians and associated hospital charges for treatment. Chart review of outpatients discharged from the ER of a children's hospital during 1996-97, using ICD-9 diagnostic codes for dental caries, periapical abscess and facial cellulitis yielded the data for this investigation. During 1996-97, 149 patients made 159 ER visits. The most common diagnoses were ICD-9 codes 521.0 for dental caries (48%) and 522.5 for periapical abscess (47%). Medicaid recipients used the ER at an intermediate level between patients with no payor source and those with private insurance. Almost one-half of the accounts changed status during the billing process, with the majority being entered as private pay upon admission, but changing to bad debt or charity after the registration records were processed and collection was attempted. Most patients were treated empirically by the ER physicians according to their presenting signs/symptoms. This study confirmed that parents utilize the ER as their child's primary dental care source.
Modes of Visual Recognition and Perceptually Relevant Sketch-based Coding for Images
NASA Technical Reports Server (NTRS)
Jobson, Daniel J.
1991-01-01
A review of visual recognition studies is used to define two levels of information requirements. These two levels are related to two primary subdivisions of the spatial frequency domain of images and reflect two distinct different physical properties of arbitrary scenes. In particular, pathologies in recognition due to cerebral dysfunction point to a more complete split into two major types of processing: high spatial frequency edge based recognition vs. low spatial frequency lightness (and color) based recognition. The former is more central and general while the latter is more specific and is necessary for certain special tasks. The two modes of recognition can also be distinguished on the basis of physical scene properties: the highly localized edges associated with reflectance and sharp topographic transitions vs. smooth topographic undulation. The extreme case of heavily abstracted images is pursued to gain an understanding of the minimal information required to support both modes of recognition. Here the intention is to define the semantic core of transmission. This central core of processing can then be fleshed out with additional image information and coding and rendering techniques.
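A minimal sketch of the two-channel split described above, assuming a Gaussian low-pass as the dividing filter (an illustrative choice, not the paper's coder):

```python
# Split an image into a low-spatial-frequency lightness component and a
# high-spatial-frequency edge residual, mirroring the two recognition modes.
import numpy as np
from scipy.ndimage import gaussian_filter

image = np.random.default_rng(2).random((64, 64))
lowpass = gaussian_filter(image, sigma=4)   # smooth topographic undulation
highpass = image - lowpass                  # localized edge-like detail
```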
Evaluation of three coding schemes designed for improved data communication
NASA Technical Reports Server (NTRS)
Snelsire, R. W.
1974-01-01
Three coding schemes designed for improved data communication are evaluated. Four block codes are evaluated relative to a quality function, which is a function of both the amount of data rejected and the error rate. The Viterbi maximum likelihood decoding algorithm as a decoding procedure is reviewed. This evaluation is obtained by simulating the system on a digital computer. Short constraint length rate 1/2 quick-look codes are studied, and their performance is compared to general nonsystematic codes.
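For readers unfamiliar with the decoding procedure reviewed here, the following is a self-contained sketch of Viterbi maximum-likelihood decoding for a textbook rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal); the report's actual codes and channel simulations are not reproduced:

```python
# Hard-decision Viterbi decoding over a binary symmetric channel, using the
# Hamming distance as the branch metric.
K = 3
GENS = (0b111, 0b101)                     # generator polynomials 7, 5 (octal)
N_STATES = 1 << (K - 1)

def encode(bits):
    state, out = 0, []
    for u in bits:
        reg = (u << (K - 1)) | state      # newest bit in the high position
        out += [bin(reg & g).count("1") & 1 for g in GENS]
        state = reg >> 1                  # shift register: drop oldest bit
    return out

def viterbi_decode(received):
    metrics = [0.0] + [float("inf")] * (N_STATES - 1)
    paths = [[] for _ in range(N_STATES)]
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metrics = [float("inf")] * N_STATES
        new_paths = [None] * N_STATES
        for state in range(N_STATES):
            if metrics[state] == float("inf"):
                continue                  # state not yet reachable
            for u in (0, 1):
                reg = (u << (K - 1)) | state
                expected = [bin(reg & g).count("1") & 1 for g in GENS]
                nxt = reg >> 1
                m = metrics[state] + sum(a != b for a, b in zip(r, expected))
                if m < new_metrics[nxt]:
                    new_metrics[nxt] = m
                    new_paths[nxt] = paths[state] + [u]
        metrics, paths = new_metrics, new_paths
    best = min(range(N_STATES), key=lambda s: metrics[s])
    return paths[best]

msg = [1, 0, 1, 1, 0, 0]                  # trailing zeros flush the register
coded = encode(msg)
coded[3] ^= 1                             # inject one channel bit error
assert viterbi_decode(coded) == msg       # the single error is corrected
```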
On origin of genetic code and tRNA before translation
2011-01-01
Background Synthesis of proteins is based on the genetic code - a nearly universal assignment of codons to amino acids (aas). A major challenge to the understanding of the origins of this assignment is the archetypal "key-lock vs. frozen accident" dilemma. Here we re-examine this dilemma in light of 1) the fundamental veto on "foresight evolution", 2) modular structures of tRNAs and aminoacyl-tRNA synthetases, and 3) the updated library of aa-binding sites in RNA aptamers successfully selected in vitro for eight amino acids. Results The aa-binding sites of arginine, isoleucine and tyrosine contain both their cognate triplets, anticodons and codons. We have noticed that these cases might be associated with palindrome-dinucleotides. For example, one-base shift to the left brings arginine codons CGN, with CG at 1-2 positions, to the respective anticodons NCG, with CG at 2-3 positions. Formally, the concomitant presence of codons and anticodons is also expected in the reverse situation, with codons containing palindrome-dinucleotides at their 2-3 positions, and anticodons exhibiting them at 1-2 positions. A closer analysis reveals that, surprisingly, RNA binding sites for Arg, Ile and Tyr "prefer" (exactly as in the actual genetic code) the anticodon(2-3)/codon(1-2) tetramers to their anticodon(1-2)/codon(2-3) counterparts, despite the seemingly perfect symmetry of the latter. However, since in vitro selection of aa-specific RNA aptamers apparently had nothing to do with translation, this striking preference provides strong new support to the notion of the genetic code emerging before translation, in response to catalytic (and possibly other) needs of ancient RNA life. Consistent with the pre-translation origin of the code, we propose here a new model of tRNA origin by the gradual, Fibonacci process-like, elongation of a tRNA molecule from a primordial coding triplet and 5'DCCA3' quadruplet (D is a base-determinator) to the eventual 76 base-long cloverleaf-shaped molecule. Conclusion Taken together, our findings necessarily imply that primordial tRNAs, tRNA aminoacylating ribozymes, and (later) the translation machinery in general have been co-evolving to "fit" the (likely already defined) genetic code, rather than the opposite way around. Coding triplets in this primal pre-translational code were likely similar to the anticodons, with second and third nucleotides being more important than the less specific first one. Later, when the code was expanding in co-evolution with the translation apparatus, the importance of 2-3 nucleotides of coding triplets "transferred" to the 1-2 nucleotides of their complements, thus distinguishing anticodons from codons. This evolutionary primacy of anticodons in genetic coding makes the hypothesis of primal stereo-chemical affinity between amino acids and cognate triplets, the hypothesis of coding coenzyme handles for amino acids, the hypothesis of tRNA-like genomic 3' tags suggesting that tRNAs originated in replication, and the hypothesis of ancient ribozymes-mediated operational code of tRNA aminoacylation not mutually contradicting but rather co-existing in harmony. Reviewers This article was reviewed by Eugene V. Koonin, Wentao Ma (nominated by Juergen Brosius) and Anthony Poole. PMID:21342520
Bayesian decision support for coding occupational injury data.
Nanda, Gaurav; Grattan, Kathleen M; Chu, MyDzung T; Davis, Letitia K; Lehto, Mark R
2016-06-01
Studies on autocoding injury data have found that machine learning algorithms perform well for categories that occur frequently but often struggle with rare categories. Therefore, manual coding, although resource-intensive, cannot be eliminated. We propose a Bayesian decision support system to autocode a large portion of the data, filter cases for manual review, and assist human coders by presenting them top k prediction choices and a confusion matrix of predictions from Bayesian models. We studied the prediction performance of Single-Word (SW) and Two-Word-Sequence (TW) Naïve Bayes models on a sample of data from the 2011 Survey of Occupational Injury and Illness (SOII). We used the agreement in prediction results of SW and TW models, and various prediction strength thresholds for autocoding and filtering cases for manual review. We also studied the sensitivity of the top k predictions of the SW model, TW model, and SW-TW combination, and then compared the accuracy of the manually assigned codes to SOII data with that of the proposed system. The accuracy of the proposed system, assuming well-trained coders reviewing a subset of only 26% of cases flagged for review, was estimated to be comparable (86.5%) to the accuracy of the original coding of the data set (range: 73%-86.8%). Overall, the TW model had higher sensitivity than the SW model, and the accuracy of the prediction results increased when the two models agreed, and for higher prediction strength thresholds. The sensitivity of the top five predictions was 93%. The proposed system seems promising for coding injury data as it offers comparable accuracy and less manual coding. Accurate and timely coded occupational injury data is useful for surveillance as well as prevention activities that aim to make workplaces safer. Copyright © 2016 Elsevier Ltd and National Safety Council. All rights reserved.
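A hedged sketch of the proposed routing logic, with scikit-learn stand-ins for the Single-Word and Two-Word-Sequence models (toy data and an assumed 0.9 threshold; the authors' SOII models are not reproduced):

```python
# Autocode only when the unigram and bigram naive Bayes models agree with
# strong confidence; otherwise flag for manual review with top-k suggestions.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

narratives = ["fell from ladder", "cut finger on saw", "slipped on wet floor"]
codes = ["fall", "cut", "fall"]

sw = make_pipeline(CountVectorizer(ngram_range=(1, 1)), MultinomialNB())
tw = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
sw.fit(narratives, codes)
tw.fit(narratives, codes)

def route(text, threshold=0.9, k=2):
    p_sw, p_tw = sw.predict_proba([text])[0], tw.predict_proba([text])[0]
    best_sw, best_tw = sw.classes_[p_sw.argmax()], tw.classes_[p_tw.argmax()]
    if best_sw == best_tw and min(p_sw.max(), p_tw.max()) >= threshold:
        return ("autocode", best_sw)
    topk = sw.classes_[np.argsort(p_sw)[::-1][:k]]   # suggestions for coders
    return ("manual_review", list(topk))

print(route("worker fell off scaffold"))
```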
Why and how Mastering an Incremental and Iterative Software Development Process
NASA Astrophysics Data System (ADS)
Dubuc, François; Guichoux, Bernard; Cormery, Patrick; Mescam, Jean Christophe
2004-06-01
One of the key issues regularly mentioned in the current software crisis of the space domain is related to the software development process that must be performed while the system definition is not yet frozen. This is especially true for complex systems like launchers or space vehicles. Several more or less mature solutions are under study by EADS SPACE Transportation and are going to be presented in this paper. The basic principle is to develop the software through an iterative and incremental process instead of the classical waterfall approach, with the following advantages: - It permits systematic management and incorporation of requirements changes over the development cycle at minimal cost. As far as possible, the most dimensioning requirements are analyzed and developed in priority, validating the architecture concept very early without the details. - A software prototype is very quickly available. It improves the communication between system and software teams, as it enables the teams to check very early and efficiently their common understanding of the system requirements. - It allows the software team to complete a whole development cycle very early, and thus to become quickly familiar with the software development environment (methodology, technology, tools...). This is particularly important when the team is new, or when the environment has changed since the previous development. In any case, it greatly improves the learning curve of the software team. These advantages seem very attractive, but efficiently mastering an iterative development process is not so easy and raises a number of difficulties, such as: - How to freeze one configuration of the system definition as a development baseline, while most of the system requirements are completely and naturally unstable? - How to distinguish stable/unstable and dimensioning/standard requirements? - How to plan the development of each increment? - How to link classical waterfall development milestones with an iterative approach: when should the classical reviews be performed: Software Specification Review? Preliminary Design Review? Critical Design Review? Code Review? Etc. Several solutions envisaged or already deployed by EADS SPACE Transportation will be presented, both from a methodological and a technological point of view: - How the MELANIE EADS ST internal methodology improves the concurrent engineering activities between GNC, software and simulation teams in a very iterative and reactive way. - How the CMM approach can help by better formalizing Requirements Management and Planning processes. - How automatic code generation with "certified" tools (SCADE) can still dramatically shorten the development cycle. The presentation will then conclude with an evaluation of the cost and schedule reduction based on a pilot application, comparing figures from two similar projects: one using the classical waterfall process, the other an iterative and incremental approach.
Software Certification - Coding, Code, and Coders
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Holzmann, Gerard J.
2011-01-01
We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.
Grimshaw, Paul; McGowan, Linda; McNichol, Elaine
2016-10-10
Purpose: For leadership and management of Western health systems, good-quality relationships are a fundamental cornerstone of organising health and social care (H&SC) delivery, delivering benefits across organisations and communities. The purpose of this paper is to explore the extant management and H&SC literature, grounded in older people's care, to reveal behaviours, processes and practices that, if readily identified across a context, will support healthy relationships across the "whole system" of stakeholders. Design/methodology/approach: An academic/practitioner group designed and guided a scoping literature review of the H&SC and broader management literature to identify and extract important behaviours, processes and practices underlying the support of high-quality relationships. A search strategy was agreed, key health and management databases were interrogated, and 51 papers were selected for inclusion. Working with the practitioners, the selected papers were coded and then organised into emergent themes. Findings: The paper outlines the relational behaviours, processes and practice elements that should be present within an older people's care community to support a healthy relational environment. These elements are presented under the five emergent literature themes of integrity, compassion, respect, fairness and trust. These five topics are examined in detail. A way forward for building statements using the review material, which may be applied to reveal relational patterns within older people's care, is also explored and outlined. Research limitations/implications: All literature reviews are subject to practical decisions around time, budget, scope and depth constraints; therefore, potentially relevant papers may have been missed in the review process. The scoping review process adapted here does not attempt to weight the evidence behind the primary research. Originality/value: This paper contributes to a growing need for designers of health systems to more fully understand, measure and draw on the value of relationships to help bridge the gap between diminishing resources and the expanding demand on H&SC services.
Emmorey, Karen; Petrich, Jennifer; Gollan, Tamar H.
2012-01-01
Bilinguals who are fluent in American Sign Language (ASL) and English often produce code-blends - simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization times (Experiment 2) for code-blends versus ASL signs and English words produced alone. In production, code-blending did not slow lexical retrieval for ASL and actually facilitated access to low-frequency signs. However, code-blending delayed speech production because bimodal bilinguals synchronized English and ASL lexical onsets. In comprehension, code-blending speeded access to both languages. Bimodal bilinguals’ ability to produce code-blends without any cost to ASL implies that the language system either has (or can develop) a mechanism for switching off competition to allow simultaneous production of close competitors. Code-blend facilitation effects during comprehension likely reflect cross-linguistic (and cross-modal) integration at the phonological and/or semantic levels. The absence of any consistent processing costs for code-blending illustrates a surprising limitation on dual-task costs and may explain why bimodal bilinguals code-blend more often than they code-switch. PMID:22773886
A qualitative study of DRG coding practice in hospitals under the Thai Universal Coverage scheme.
Pongpirul, Krit; Walker, Damian G; Winch, Peter J; Robinson, Courtland
2011-04-08
In the Thai Universal Coverage health insurance scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group-based retrospective payment, for which the quality of the diagnosis and procedure codes is crucial. However, there has been limited understanding of which health care professions are involved and how the diagnosis and procedure coding is actually done within hospital settings. The objective of this study is to detail hospital coding structure and process, to describe the roles of key hospital staff, and to examine other related internal dynamics in Thai hospitals that affect the quality of data submitted for inpatient care reimbursement. Research involved qualitative semi-structured interviews with 43 participants at 10 hospitals chosen to represent a range of hospital sizes (small/medium/large), locations (urban/rural), and types (public/private). Hospital coding practice has structural and process components. While the structural component includes human resources, hospital committees, and information technology infrastructure, the process component comprises all activities from patient discharge to submission of the diagnosis and procedure codes. At least eight health care professional disciplines are involved in the coding process, which comprises seven major steps, each of which involves different hospital staff: 1) Discharge Summarization, 2) Completeness Checking, 3) Diagnosis and Procedure Coding, 4) Code Checking, 5) Relative Weight Challenging, 6) Coding Report, and 7) Internal Audit. The hospital coding practice can be affected by at least five main factors: 1) Internal Dynamics, 2) Management Context, 3) Financial Dependency, 4) Resource and Capacity, and 5) External Factors. Hospital coding practice comprises both structural and process components, involves many health care professional disciplines, and varies greatly across hospitals as a result of five main factors. PMID:21477310
New upper bounds on the rate of a code via the Delsarte-MacWilliams inequalities
NASA Technical Reports Server (NTRS)
Mceliece, R. J.; Rodemich, E. R.; Rumsey, H., Jr.; Welch, L. R.
1977-01-01
An upper bound on the rate of a binary code as a function of minimum code distance (using a Hamming code metric) is arrived at from Delsarte-MacWilliams inequalities. The upper bound so found is asymptotically less than Levenshtein's bound, and a fortiori less than Elias' bound. Appendices review properties of Krawtchouk polynomials and Q-polynomials utilized in the rigorous proofs.
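In its asymptotic form, this result is usually quoted as the first McEliece-Rodemich-Rumsey-Welch (MRRW) bound: for relative minimum distance δ between 0 and 1/2,

```latex
R(\delta) \le H\!\left(\frac{1}{2} - \sqrt{\delta(1-\delta)}\right),
\qquad H(x) = -x\log_2 x - (1-x)\log_2(1-x)
```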
Henry, Kenneth S.; Heinz, Michael G.
2013-01-01
People with sensorineural hearing loss have substantial difficulty understanding speech under degraded listening conditions. Behavioral studies suggest that this difficulty may be caused by changes in auditory processing of the rapidly-varying temporal fine structure (TFS) of acoustic signals. In this paper, we review the presently known effects of sensorineural hearing loss on processing of TFS and slower envelope modulations in the peripheral auditory system of mammals. Cochlear damage has relatively subtle effects on phase locking by auditory-nerve fibers to the temporal structure of narrowband signals under quiet conditions. In background noise, however, sensorineural loss does substantially reduce phase locking to the TFS of pure-tone stimuli. For auditory processing of broadband stimuli, sensorineural hearing loss has been shown to severely alter the neural representation of temporal information along the tonotopic axis of the cochlea. Notably, auditory-nerve fibers innervating the high-frequency part of the cochlea grow increasingly responsive to low-frequency TFS information and less responsive to temporal information near their characteristic frequency (CF). Cochlear damage also increases the correlation of the response to TFS across fibers of varying CF, decreases the traveling-wave delay between TFS responses of fibers with different CFs, and can increase the range of temporal modulation frequencies encoded in the periphery for broadband sounds. Weaker neural coding of temporal structure in background noise and degraded coding of broadband signals along the tonotopic axis of the cochlea are expected to contribute considerably to speech perception problems in people with sensorineural hearing loss. PMID:23376018
NASA Astrophysics Data System (ADS)
Korchagova, V. N.; Kraposhin, M. V.; Marchevsky, I. K.; Smirnova, E. V.
2017-11-01
A droplet impact on a deep pool can induce macro-scale or micro-scale effects like a crown splash, a high-speed jet, formation of secondary droplets or thin liquid films, etc. The outcome depends on the diameter and velocity of the droplet, the liquid properties, the effects of external forces, and other factors that can be captured by a set of dimensionless criteria. In the present research, we considered the droplet and the pool to consist of the same viscous incompressible liquid. We took surface tension into account but neglected gravity. We used two open-source codes (OpenFOAM and Gerris) for our computations and review their suitability for simulating the free-surface flow processes that may take place after a droplet impact on a pool. Both codes simulated several modes of droplet impact. We estimated the effect of the liquid properties through the Reynolds number and the Weber number. Numerical simulation enabled us to find the boundaries between different modes of droplet impact on a deep pool and to plot the corresponding mode maps. The ratio of liquid density to that of the surrounding gas induces several changes in the mode maps; increasing this density ratio suppresses the crown splash.
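For a droplet of diameter D impacting at velocity U into a liquid of density ρ, dynamic viscosity μ, and surface tension σ, the two dimensionless criteria used here are conventionally defined as:

```latex
\mathrm{Re} = \frac{\rho U D}{\mu}, \qquad \mathrm{We} = \frac{\rho U^2 D}{\sigma}
```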
A Subsonic Aircraft Design Optimization With Neural Network and Regression Approximators
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Coroneos, Rula M.; Guptill, James D.; Hopkins, Dale A.; Haller, William J.
2004-01-01
The Flight-Optimization-System (FLOPS) code encountered difficulty in analyzing a subsonic aircraft. The limitation made the design optimization problematic. The deficiencies have been alleviated through use of neural network and regression approximations. The insight gained from using the approximators is discussed in this paper. The FLOPS code is reviewed. Analysis models are developed and validated for each approximator. The regression method appears to hug the data points, while the neural network approximation follows a mean path. For an analysis cycle, the approximate model required milliseconds of central processing unit (CPU) time versus seconds by the FLOPS code. Performance of the approximators was satisfactory for aircraft analysis. A design optimization capability has been created by coupling the derived analyzers to the optimization test bed CometBoards. The approximators were efficient reanalysis tools in the aircraft design optimization. Instability encountered in the FLOPS analyzer was eliminated. The convergence characteristics were improved for the design optimization. The CPU time required to calculate the optimum solution, measured in hours with the FLOPS code was reduced to minutes with the neural network approximation and to seconds with the regression method. Generation of the approximators required the manipulation of a very large quantity of data. Design sensitivity with respect to the bounds of aircraft constraints is easily generated.
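The surrogate-modeling idea can be sketched with generic scikit-learn stand-ins (synthetic data; not the actual FLOPS coupling or the CometBoards test bed):

```python
# Fit a polynomial regression and a small neural network to samples of an
# expensive analysis, then "reanalyze" in effectively zero time. As noted
# above, the regression tends to chase the sampled points while the network
# tracks a smoother mean path.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 40).reshape(-1, 1)           # design variable samples
y = np.sin(2 * np.pi * x).ravel() + 0.1 * rng.standard_normal(40)

poly = make_pipeline(PolynomialFeatures(degree=9), LinearRegression()).fit(x, y)
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=20000,
                   random_state=0).fit(x, y)

x_new = np.array([[0.37]])
print(poly.predict(x_new), net.predict(x_new))     # near-instant reanalysis
```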
Mull, Hillary J; Graham, Laura A; Morris, Melanie S; Rosen, Amy K; Richman, Joshua S; Whittle, Jeffery; Burns, Edith; Wagner, Todd H; Copeland, Laurel A; Wahl, Tyler; Jones, Caroline; Hollis, Robert H; Itani, Kamal M F; Hawn, Mary T
2018-04-18
Postoperative readmission data are used to measure hospital performance, yet the extent to which these readmissions reflect surgical quality is unknown. To establish expert consensus on whether reasons for postoperative readmission are associated with the quality of surgery in the index admission. In a modified Delphi process, a panel of 14 experts in medical and surgical readmissions comprising physicians and nonphysicians from Veterans Affairs (VA) and private-sector institutions reviewed 30-day postoperative readmissions from fiscal years 2008 through 2014 associated with inpatient surgical procedures performed at a VA medical center between October 1, 2007, and September 30, 2014. The consensus process was conducted from January through May 2017. Reasons for readmission were grouped into categories based on International Classification of Diseases, Ninth Revision (ICD-9) diagnosis codes. Panelists were given the proportion of readmissions coded by each reason and median (interquartile range) days to readmission. They answered the question, "Does the readmission reason reflect possible surgical quality of care problems in the index admission?" on a scale of 1 (never related) to 5 (directly related) in 3 rounds of consensus building. The consensus process was completed in May 2017 and data were analyzed in June 2017. Consensus on proportion of ICD-9-coded readmission reasons that reflected quality of surgical procedure. In 3 Delphi rounds, the 14 panelists achieved consensus on 50 reasons for readmission; 12 panelists also completed group telephone calls between rounds 1 and 2. Readmissions with diagnoses of infection, sepsis, pneumonia, hemorrhage/hematoma, anemia, ostomy complications, acute renal failure, fluid/electrolyte disorders, or venous thromboembolism were considered associated with surgical quality and accounted for 25 521 of 39 664 readmissions (64% of readmissions; 7.5% of 340 858 index surgical procedures). The proportion of readmissions considered to be not associated with surgical quality varied by procedure, ranging from 21% (613 of 2331) of readmissions after lower-extremity amputations to 47% (745 of 1598) of readmissions after cholecystectomy. One-third of postoperative readmissions are unlikely to reflect problems with surgical quality. Future studies should test whether restricting readmissions to those with specific ICD-9 codes might yield a more useful quality measure.
Awareness Becomes Necessary Between Adaptive Pattern Coding of Open and Closed Curvatures
Sweeny, Timothy D.; Grabowecky, Marcia; Suzuki, Satoru
2012-01-01
Visual pattern processing becomes increasingly complex along the ventral pathway, from the low-level coding of local orientation in the primary visual cortex to the high-level coding of face identity in temporal visual areas. Previous research using pattern aftereffects as a psychophysical tool to measure activation of adaptive feature coding has suggested that awareness is relatively unimportant for the coding of orientation, but awareness is crucial for the coding of face identity. We investigated where along the ventral visual pathway awareness becomes crucial for pattern coding. Monoptic masking, which interferes with neural spiking activity in low-level processing while preserving awareness of the adaptor, eliminated open-curvature aftereffects but preserved closed-curvature aftereffects. In contrast, dichoptic masking, which spares spiking activity in low-level processing while wiping out awareness, preserved open-curvature aftereffects but eliminated closed-curvature aftereffects. This double dissociation suggests that adaptive coding of open and closed curvatures straddles the divide between weakly and strongly awareness-dependent pattern coding. PMID:21690314
Domestic animals as models for biomedical research
Andersson, Leif
2016-01-01
Domestic animals are unique models for biomedical research due to their long history (thousands of years) of strong phenotypic selection. This process has enriched for novel mutations that have contributed to phenotype evolution in domestic animals. The characterization of such mutations provides insights in gene function and biological mechanisms. This review summarizes genetic dissection of about 50 genetic variants affecting pigmentation, behaviour, metabolic regulation, and the pattern of locomotion. The variants are controlled by mutations in about 30 different genes, and for 10 of these our group was the first to report an association between the gene and a phenotype. Almost half of the reported mutations occur in non-coding sequences, suggesting that this is the most common type of polymorphism underlying phenotypic variation since this is a biased list where the proportion of coding mutations are inflated as they are easier to find. The review documents that structural changes (duplications, deletions, and inversions) have contributed significantly to the evolution of phenotypic diversity in domestic animals. Finally, we describe five examples of evolution of alleles, which means that alleles have evolved by the accumulation of several consecutive mutations affecting the function of the same gene. PMID:26479863
Proctor, Sherrie L; Romano, Maria
2016-09-01
Shortages of school psychologists and the underrepresentation of minorities in school psychology represent longstanding concerns. Scholars recommend that one way to address both issues is to recruit individuals from racially and ethnically diverse backgrounds into school psychology. The purpose of this study was to explore the characteristics and minority focused findings of school psychology recruitment studies conducted from 1994 to 2014. Using an electronic search that included specified databases, subject terms and study inclusion criteria along with a manual search of 10 school psychology focused journals, the review yielded 10 published, peer-reviewed recruitment studies focused primarily on school psychology over the 20-year span. Two researchers coded these 10 studies using a rigorous coding process that included a high level of inter rater reliability. Results suggest that the studies utilized varied methodologies, primarily sampled undergraduate populations, and mostly included White participants. Five studies focused on minority populations specifically. These studies indicate that programs should actively recruit minority undergraduates and offer financial support to attract minority candidates. Implications suggest a need for more recruitment research focused on minority populations and the implementation and evaluation of minority recruitment models. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Zelingher, Julian; Ash, Nachman
2013-05-01
The Israeli healthcare system has undergone major processes for the adoption of health information technologies (HIT), and enjoys high levels of utilization in hospital and ambulatory care. Coding is an essential infrastructure component of HIT, and its purpose is to represent data in a simplified and common format, enhancing its manipulation by digital systems. Proper coding of data enables efficient identification, storage, retrieval and communication of data. Utilization of uniform coding systems by different organizations enables data interoperability between them, facilitating communication and integrating data elements originating in different information systems from various organizations. Current needs in Israel for health data coding include recording and reporting of diagnoses for hospitalized patients, outpatients and visitors to the Emergency Department, coding of procedures and operations, coding of pathology findings, reporting of discharge diagnoses and causes of death, billing codes, organizational data warehouses and national registries. New national projects for clinical data integration, obligatory reporting of quality indicators and new Ministry of Health (MOH) requirements for HIT necessitate a high level of interoperability that can be achieved only through the adoption of uniform coding. Additional pressures were introduced by the USA decision to stop the maintenance of the ICD-9-CM codes that are also used by Israeli healthcare, and the adoption of ICD-10-CM and ICD-10-PCS as the main coding system for billing purposes. The USA has also mandated utilization of SNOMED-CT as the coding terminology for the Electronic Health Record problem list, and for reporting quality indicators to the CMS. Hence, the Israeli MOH has recently decided that discharge diagnoses will be reported using ICD-10-CM codes, and SNOMED-CT will be used to code the clinical information in the EHR. We reviewed the characteristics, strengths and weaknesses of these two coding systems. In summary, the adoption of ICD-10-CM is in line with the USA decision to abandon ICD-9-CM, and the Israeli healthcare system could benefit from USA healthcare efforts in this direction. The large content of SNOMED-CT and its sophisticated hierarchical data structure will enable advanced clinical decision support and quality improvement applications.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... Request; Bar Code Label Requirement for Human Drug and Biological Products AGENCY: Food and Drug... and clearance. Bar Code Label Requirement for Human Drug and Biological Products--(OMB Control Number... that required human drug product and biological product labels to have bar codes. The rule required bar...
Ethical conduct for research: a code of scientific ethics
Marcia Patton-Mallory; Kathleen Franzreb; Charles Carll; Richard Cline
2000-01-01
The USDA Forest Service recently developed and adopted a code of ethical conduct for scientific research and development. The code addresses issues related to research misconduct, such as fabrication, falsification, or plagiarism in proposing, performing, or reviewing research or in reporting research results, as well as issues related to professional misconduct, such...
The Lambert Code: Can We Define Best Practice?
ERIC Educational Resources Information Center
Shattock, Michael
2004-01-01
The article explores the proposals put forward in the Lambert Report for reforms in university governance. It compares the recommendation for a Code with the analogous Combined Code, which regulates corporate governance in companies, and draws a distinction between attempts, from the Cadbury Report in 1992 to the Higgs Review in 2003, to create board…
Sequential Syndrome Decoding of Convolutional Codes
NASA Technical Reports Server (NTRS)
Reed, I. S.; Truong, T. K.
1984-01-01
The algebraic structure of convolutional codes is reviewed and sequential syndrome decoding is applied to these codes. These concepts are then used to realize, by example, actual sequential decoding using the stack algorithm. The Fano metric for use in sequential decoding is modified so that it can be utilized to sequentially find the minimum-weight error sequence.
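As an illustration of the stack algorithm referenced above, the following is a minimal Python sketch of sequential decoding for a toy rate-1/2 convolutional code (generators (7, 5) octal). The simplified Fano-like metric, the bias value, and all names are our own illustrative choices, not taken from the paper.

```python
import heapq

# Toy rate-1/2 convolutional code, constraint length 3, generators (7, 5) octal.
G = [0b111, 0b101]

def encode_step(state, bit):
    """Shift one input bit into the 2-bit register; return (next_state, 2 coded bits)."""
    reg = (bit << 2) | state
    out = [bin(reg & g).count("1") & 1 for g in G]   # parity taps per generator
    return reg >> 1, out

def stack_decode(received, n_bits, bias=0.5):
    """Stack-algorithm sequential decoding with a simplified Fano-like metric:
    a matching channel bit scores +(1 - bias), a mismatch scores -(1 + bias)."""
    stack = [(0.0, 0, 0, ())]          # (-metric, depth, state, decoded bits); min-heap
    while stack:
        neg_metric, depth, state, bits = heapq.heappop(stack)
        if depth == n_bits:            # best-metric path reached full length: done
            return list(bits)
        r = received[2 * depth: 2 * depth + 2]
        for b in (0, 1):               # extend the best node with both input bits
            nstate, out = encode_step(state, b)
            matches = sum(o == ri for o, ri in zip(out, r))
            branch = matches * (1 - bias) - (2 - matches) * (1 + bias)
            heapq.heappush(stack, (neg_metric - branch, depth + 1, nstate, bits + (b,)))

msg = [1, 0, 1, 1, 0]
state, coded = 0, []
for b in msg:
    state, out = encode_step(state, b)
    coded += out
coded[3] ^= 1                          # inject a single channel error
assert stack_decode(coded, len(msg)) == msg
```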
76 FR 12600 - Review of the Emergency Alert System
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-08
... appropriate, various administrative procedures for national tests, including test codes to be used and pre... administrative procedures for national tests, including test codes to be used and pre-test outreach. B. Summary... test codes to be used and pre-test outreach, the Commission has instructed the Bureau to factor in the...
Code-Mixing as a Bilingual Instructional Strategy
ERIC Educational Resources Information Center
Jiang, Yih-Lin Belinda; García, Georgia Earnest; Willis, Arlette Ingram
2014-01-01
This study investigated code-mixing practices, specifically the use of L2 (English) in an L1 (Chinese) class in a U.S. bilingual program. Our findings indicate that the code-mixing practices made and prompted by the teacher served five pedagogical functions: (a) to enhance students' bilingualism and bilingual learning, (b) to review and…
Comparison of injury severity between AIS 2005 and AIS 1990 in a large injury database
Barnes, J; Hassan, A; Cuerden, R; Cookson, R; Kohlhofer, J
2009-01-01
The aim of this study is to investigate the differences in car occupant injury severity recorded in AIS 2005 compared to AIS 1990 and to outline the likely effects on future data analysis findings. Occupant injury data in the UK Co-operative Crash Injury Study (CCIS) database were coded for the period February 2006 to November 2007 using both AIS 1990 and AIS 2005. Data for 1,994 occupants with more than 6,000 coded injuries were reviewed at the AIS and MAIS severity levels and by body region to determine changes between the two coding methodologies. Overall, there was a general trend for fewer injuries to be coded at AIS 4+ severity and more injuries to be coded at AIS 2 severity. When these injury trends were reviewed in more detail, the body regions contributing most to these changes in severity were the head, thorax and extremities. This is one of the first studies to examine the implications for large databases of changing to an updated method for coding injuries. PMID:20184835
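The MAIS comparison described above amounts to taking, for each occupant, the maximum AIS severity over all injuries under each dictionary and cross-tabulating the two results. A minimal sketch with hypothetical double-coded records (the field names are our own, not CCIS's):

```python
from collections import Counter

# Hypothetical occupants whose injuries were double-coded under both dictionaries.
occupants = {
    "case_001": [{"ais90": 4, "ais05": 3}, {"ais90": 2, "ais05": 2}],
    "case_002": [{"ais90": 3, "ais05": 3}, {"ais90": 1, "ais05": 2}],
}

def mais(injuries, key):
    """MAIS: the maximum AIS severity over all of an occupant's injuries."""
    return max(injury[key] for injury in injuries)

# Cross-tabulate each occupant's MAIS under AIS 1990 against AIS 2005.
shift = Counter((mais(inj, "ais90"), mais(inj, "ais05")) for inj in occupants.values())
for (m90, m05), n in sorted(shift.items()):
    print(f"MAIS90={m90} -> MAIS05={m05}: {n} occupant(s)")
```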
28 CFR 71.42 - Judicial review.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Judicial review. 71.42 Section 71.42....42 Judicial review. Section 3805 of title 31, United States Code, authorizes judicial review by an... assessments under this part and specifies the procedures for such review. ...
77 FR 7559 - Certification Process for State Capital Counsel Systems
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-13
...Section 2265 of title 28, United States Code, instructs the Attorney General to promulgate regulations establishing a certification procedure for States seeking to qualify for the special Federal habeas corpus review provisions for capital cases under chapter 154 of title 28. The benefits of chapter 154--including expedited timing and limits on the scope of Federal habeas review of State judgments--are available to States on the condition that they provide counsel to indigent capital defendants in State postconviction proceedings pursuant to mechanisms that satisfy certain statutory requirements. This supplemental notice of proposed rulemaking (supplemental notice) requests public comment concerning five changes that the Department is considering to a previously published proposed rule for the chapter 154 certification procedure.
HSFs, Stress Sensors and Sculptors of Transcription Compartments and Epigenetic Landscapes.
Miozzo, Federico; Sabéran-Djoneidi, Délara; Mezger, Valérie
2015-12-04
Starting as a paradigm for stress responses, the study of the transcription factor (TF) family of heat shock factors (HSFs) has expanded quickly and widely over recent decades, thanks to their fascinating and significant involvement in a variety of pathophysiological processes, including development, reproduction, neurodegeneration and carcinogenesis. HSFs, originally defined as classical TFs, strikingly appear to play a central and often pioneering role in reshaping the epigenetic landscape. In this review, we describe how HSFs are able to sense the epigenetic environment, and we review recent data that support their role as sculptors of the chromatin landscape through their complex interplay with chromatin remodelers, histone-modifying enzymes and non-coding RNAs. Copyright © 2015 Elsevier Ltd. All rights reserved.
Diabetes Mellitus Coding Training for Family Practice Residents.
Urse, Geraldine N
2015-07-01
Although physicians regularly use numeric coding systems such as the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to describe patient encounters, coding errors are common. One of the most complicated diagnoses to code is diabetes mellitus. The ICD-9-CM currently has 39 separate codes for diabetes mellitus; this number will be expanded to more than 50 with the introduction of ICD-10-CM in October 2015. To assess the effect of a 1-hour focused presentation on ICD-9-CM codes on diabetes mellitus coding. A 1-hour focused lecture on the correct use of diabetes mellitus codes for patient visits was presented to family practice residents at Doctors Hospital Family Practice in Columbus, Ohio. To assess resident knowledge of the topic, a pretest and posttest were given to residents before and after the lecture, respectively. Medical records of all patients with diabetes mellitus who were cared for at the hospital 6 weeks before and 6 weeks after the lecture were reviewed and compared for the use of diabetes mellitus ICD-9 codes. Eighteen residents attended the lecture and completed the pretest and posttest. The mean (SD) percentage of correct answers was 72.8% (17.1%) for the pretest and 84.4% (14.6%) for the posttest, for an improvement of 11.6 percentage points (P≤.035). The percentage of total available codes used did not substantially change from before to after the lecture, but the use of the generic ICD-9-CM code for diabetes mellitus type II controlled (250.00) declined (58 of 176 [33%] to 102 of 393 [26%]) and the use of other codes increased, indicating a greater variety in codes used after the focused lecture. After a focused lecture on diabetes mellitus coding, resident coding knowledge improved. Review of medical record data did not reveal an overall change in the number of diabetic codes used after the lecture but did reveal a greater variety in the codes used.
Country Report on Building Energy Codes in Canada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shui, Bin; Evans, Meredydd
2009-04-06
This report is part of a series of reports on building energy efficiency codes in countries associated with the Asia-Pacific Partnership (APP) - Australia, South Korea, Japan, China, India, and the United States of America. This report gives an overview of the development of building energy codes in Canada, including national energy policies related to building energy codes, the history of building energy codes, and recent national projects and activities to promote building energy codes. The report also provides a review of current building energy code provisions (such as building envelope, HVAC, lighting, and water heating) for commercial and residential buildings in Canada.
Woolgar, Alexandra; Williams, Mark A; Rich, Anina N
2015-04-01
Selective attention is fundamental for human activity, but the details of its neural implementation remain elusive. One influential theory, the adaptive coding hypothesis (Duncan, 2001, An adaptive coding model of neural function in prefrontal cortex, Nature Reviews Neuroscience 2:820-829), proposes that single neurons in certain frontal and parietal regions dynamically adjust their responses to selectively encode relevant information. This selective representation may in turn support selective processing in more specialized brain regions such as the visual cortices. Here, we use multi-voxel decoding of functional magnetic resonance images to demonstrate selective representation of attended (and not distractor) objects in frontal, parietal, and visual cortices. In addition, we highlight a critical role for task demands in determining which brain regions exhibit selective coding. Strikingly, representation of attended objects in frontoparietal cortex was highest under conditions of high perceptual demand, when stimuli were hard to perceive and coding in early visual cortex was weak. Coding in early visual cortex varied as a function of attention and perceptual demand, while coding in higher visual areas was sensitive to the allocation of attention but robust to changes in perceptual difficulty. Consistent with high-profile reports, peripherally presented objects could also be decoded from activity at the occipital pole, a region which corresponds to the fovea. Our results emphasize the flexibility of frontoparietal and visual systems. They support the hypothesis that attention enhances the multi-voxel representation of information in the brain, and suggest that the engagement of this attentional mechanism depends critically on current task demands. Copyright © 2015 Elsevier Inc. All rights reserved.
Development and feasibility testing of the Pediatric Emergency Discharge Interaction Coding Scheme.
Curran, Janet A; Taylor, Alexandra; Chorney, Jill; Porter, Stephen; Murphy, Andrea; MacPhee, Shannon; Bishop, Andrea; Haworth, Rebecca
2017-08-01
Discharge communication is an important aspect of high-quality emergency care. This study addresses the gap in knowledge on how to describe discharge communication in a paediatric emergency department (ED). The objective of this feasibility study was to develop and test a coding scheme to characterize discharge communication between health-care providers (HCPs) and caregivers who visit the ED with their children. The Pediatric Emergency Discharge Interaction Coding Scheme (PEDICS) and coding manual were developed following a review of the literature and an iterative refinement process involving HCP observations, inter-rater assessments and team consensus. The coding scheme was pilot-tested through observations of HCPs across a range of shifts in one urban paediatric ED. Overall, 329 patient observations were carried out across 50 observational shifts. Inter-rater reliability was evaluated in 16% of the observations. The final version of the PEDICS contained 41 communication elements. Kappa scores were greater than .60 for the majority of communication elements. The most frequently observed communication elements were under the Introduction node and the least frequently observed were under the Social Concerns node. HCPs initiated the majority of the communication. The PEDICS addresses an important gap in the discharge communication literature. The tool is useful for mapping patterns of discharge communication between HCPs and caregivers. Results from our pilot test identified deficits in specific areas of discharge communication that could impact adherence to discharge instructions. The PEDICS would benefit from further testing with a different sample of HCPs. © 2017 The Authors. Health Expectations Published by John Wiley & Sons Ltd.
Review of current nuclear fallout codes.
Auxier, Jerrad P; Auxier, John D; Hall, Howard L
2017-05-01
The importance of developing a robust nuclear forensics program to combat the illicit use of nuclear material that may be used in an improvised nuclear device is widely accepted. In order to decrease the threat to public safety and improve governmental response, government agencies have developed fallout-analysis codes to predict fallout particle size, dose, and dispersion following a detonation. This paper reviews the different codes that have been developed for predicting fallout from both chemical and nuclear weapons. This work should decrease the response time required for the government to respond to such an event. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Katz, Daniel S.; Choi, Sou-Cheng T.; Wilkins-Diehr, Nancy; Chue Hong, Neil; Venters, Colin C.; Howison, James; Seinstra, Frank; Jones, Matthew; Cranston, Karen; Clune, Thomas L.; de Val-Borro, Miguel; Littauer, Richard
2016-02-01
This technical report records and discusses the Second Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2). The report includes a description of the alternative, experimental submission and review process, two workshop keynote presentations, a series of lightning talks, a discussion on sustainability, and five discussions from the topic areas of exploring sustainability; software development experiences; credit & incentives; reproducibility & reuse & sharing; and code testing & code review. For each topic, the report includes a list of tangible actions that were proposed and that would lead to potential change. The workshop recognized that reliance on scientific software is pervasive in all areas of world-leading research today. The workshop participants then proceeded to explore different perspectives on the concept of sustainability. Key enablers and barriers of sustainable scientific software were identified from their experiences. In addition, recommendations with new requirements such as software credit files and software prize frameworks were outlined for improving practices in sustainable software engineering. There was also broad consensus that formal training in software development or engineering was rare among the practitioners. Significant strides need to be made in building a sense of community via training in software and technical practices, on increasing their size and scope, and on better integrating them directly into graduate education programs. Finally, journals can define and publish policies to improve reproducibility, whereas reviewers can insist that authors provide sufficient information and access to data and software to allow them to reproduce the results in the paper. Hence, a list of criteria is compiled for journals to provide to reviewers so as to make it easier to review software submitted for publication as a "Software Paper."
Maritz, Roxanne; Aronsky, Dominik; Prodinger, Birgit
2017-09-20
The International Classification of Functioning, Disability and Health (ICF) is the World Health Organization's standard for describing health and health-related states. Examples of how the ICF has been used in Electronic Health Records (EHRs) have not been systematically summarized and described yet. To provide a systematic review of peer-reviewed literature about the ICF's use in EHRs, including related challenges and benefits. Peer-reviewed literature, published between January 2001 and July 2015 was retrieved from Medline®, CINAHL®, Scopus®, and ProQuest® Social Sciences using search terms related to ICF and EHR concepts. Publications were categorized according to three groups: Requirement specification, development and implementation. Information extraction was conducted according to a qualitative content analysis method, deductively informed by the evaluation framework for Health Information Systems: Human, Organization and Technology-fit (HOT-fit). Of 325 retrieved articles, 17 publications were included; 4 were categorized as requirement specification, 7 as development, and 6 as implementation publications. Information regarding the HOT-fit evaluation framework was summarized. Main benefits of using the ICF in EHRs were its unique comprehensive perspective on health and its interdisciplinary focus. Main challenges included the fact that the ICF is not structured as a formal terminology as well as the need for a reduced number of ICF codes for more feasible and practical use. Different approaches and technical solutions exist for integrating the ICF in EHRs, such as combining the ICF with other existing standards for EHR or selecting ICF codes with natural language processing. Though the use of the ICF in EHRs is beneficial as this review revealed, the ICF could profit from further improvements such as formalizing the knowledge representation in the ICF to support and enhance interoperability.
The identification of incident cancers in UK primary care databases: a systematic review.
Rañopa, Michael; Douglas, Ian; van Staa, Tjeerd; Smeeth, Liam; Klungel, Olaf; Reynolds, Robert; Bhaskaran, Krishnan
2015-01-01
UK primary care databases are frequently used in observational studies with cancer outcomes. We aimed to systematically review methods used by such studies to identify and validate incident cancers of the breast, colorectum, and prostate. Medline and Embase (1980-2013) were searched for UK primary care database studies with incident breast, colorectal, or prostate cancer outcomes. Data on the methods used for case ascertainment were extracted and summarised. Questionnaires were sent to corresponding authors to obtain details about case ascertainment. Eighty-four studies of breast (n = 51), colorectal (n = 54), and prostate cancer (n = 31) were identified; 30 examined >1 cancer type. Among the 84 studies, 57 defined cancers using only diagnosis codes, while 27 required further evidence such as chemotherapy. Few studies described methods used to create cancer code lists (n = 5); or made lists available directly (n = 5). Twenty-eight code lists were received on request from study authors. All included malignant neoplasm diagnosis codes, but there was considerable variation in the specific codes included which was not explained by coding dictionary changes. Code lists also varied in terms of other types of codes included, such as in-situ, cancer morphology, history of cancer, and secondary/suspected/borderline cancer codes. In UK primary care database studies, methods for identifying breast, colorectal, and prostate cancers were often unclear. Code lists were often unavailable, and where provided, we observed variation in the individual codes and types of codes included. Clearer reporting of methods and publication of code lists would improve transparency and reproducibility of studies. Copyright © 2014 John Wiley & Sons, Ltd.
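For readers unfamiliar with code-list-based case ascertainment, the core operation is a lookup of each patient's diagnosis events against the list, taking the earliest match as the incident date. A minimal sketch with hypothetical codes and records (the actual study code lists, as the review notes, were often unavailable):

```python
from datetime import date

# Hypothetical diagnosis code list and patient event streams; real studies would
# apply published Read/ICD code lists to a primary care database.
breast_cancer_codes = {"B34..", "B34z."}   # illustrative codes only

records = {
    "patient_A": [("2010-03-01", "H33.."), ("2011-06-12", "B34..")],
    "patient_B": [("2012-01-20", "G30..")],
}

def first_incident(events, code_list):
    """Earliest date of any event whose code is on the list, else None."""
    hits = sorted(date.fromisoformat(d) for d, code in events if code in code_list)
    return hits[0] if hits else None

for pid, events in records.items():
    print(pid, first_incident(events, breast_cancer_codes))   # A: 2011-06-12, B: None
```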
A Qualitative Analysis of the Navy’s HSI Billet Structure
2008-06-01
...subspecialty code. The research results support the hypothesis that the work requirements of the July 2007 data set of 4600P-coded billets (billets...
Epigenetics: a new frontier in dentistry.
Williams, S D; Hughes, T E; Adler, C J; Brook, A H; Townsend, G C
2014-06-01
In 2007, only four years after the completion of the Human Genome Project, the journal Science announced that epigenetics was the 'breakthrough of the year'. Time magazine placed it second in the top 10 discoveries of 2009. While our genetic code (i.e. our DNA) contains all of the information to produce the elements we require to function, our epigenetic code determines when and where genes in the genetic code are expressed. Without the epigenetic code, the genetic code is like an orchestra without a conductor. Although there is now a substantial amount of published research on epigenetics in medicine and biology, epigenetics in dental research is in its infancy. However, epigenetics promises to become increasingly relevant to dentistry because of the role it plays in gene expression during development and subsequently potentially influencing oral disease susceptibility. This paper provides a review of the field of epigenetics aimed specifically at oral health professionals. It defines epigenetics, addresses the underlying concepts and provides details about specific epigenetic molecular mechanisms. Further, we discuss some of the key areas where epigenetics is implicated, and review the literature on epigenetics research in dentistry, including its relevance to clinical disciplines. This review considers some implications of epigenetics for the future of dental practice, including a 'personalized medicine' approach to the management of common oral diseases. © 2014 Australian Dental Association.
Najjar, Peter; Kachalia, Allen; Sutherland, Tori; Beloff, Jennifer; David-Kasdan, Jo Ann; Bates, David W; Urman, Richard D
2015-01-01
The AHRQ Patient Safety Indicators (PSIs) are used for calculation of risk-adjusted postoperative rates for adverse events. The payers and quality consortiums are increasingly requiring public reporting of hospital performance on these metrics. We discuss processes designed to improve the accuracy and clinical utility of PSI reporting in practice. The study was conducted at a 793-bed tertiary care academic medical center where PSI processes have been aggressively implemented to track patient safety events at discharge. A three-phased approach to improving administrative data quality was implemented. The initiative consisted of clinical review of all PSIs, documentation improvement, and provider outreach including active querying for patient safety events. This multidisciplinary effort to develop a streamlined process for PSI calculation reduced the reporting of miscoded PSIs and increased the clinical utility of PSI monitoring. Over 4 quarters, 4 of 41 (10%) PSI-11 and 9 of 138 (7%) PSI-15 errors were identified on review of clinical documentation and appropriate adjustments were made. A multidisciplinary, phased approach leveraging existing billing infrastructure for robust metric coding, ongoing clinical review, and frontline provider outreach is a novel and effective way to reduce the reporting of false-positive outcomes and improve the clinical utility of PSIs.
Roland, Carl L; Lake, Joanita; Oderda, Gary M
2016-12-01
We conducted a systematic review to evaluate the worldwide English-language literature published from 2009 to 2014 on the prevalence of opioid misuse/abuse in retrospective databases where International Classification of Diseases (ICD) codes were used. Inclusion criteria for the studies were use of a retrospective database; measurement of abuse, dependence, and/or poisoning using ICD codes; a stated prevalence or one that could be derived; and a documented time frame. A meta-analysis was not performed. A qualitative narrative synthesis was used, and 16 studies were included for data abstraction. ICD code use varies; 10 studies used ICD codes that encompassed all three terms: abuse, dependence, or poisoning. Eight studies limited determination of misuse/abuse to an opioid-user population. Abuse prevalence among opioid users in commercial databases using all three terms of ICD codes varied depending on the opioid: 21 per 1000 persons (reformulated extended-release oxymorphone; 2011-2012) to 113 per 1000 persons (immediate-release opioids; 2010-2011). Abuse prevalence in general populations using all three ICD code terms ranged from 1.15 per 1000 persons (commercial; 6 months of 2010) to 8.7 per 1000 persons (Medicaid; 2002-2003). Prevalence increased over time. When similar ICD codes are used, the highest prevalence is in US government-insured populations. Limiting the population to continuous opioid users increases prevalence. Prevalence varies depending on the ICD codes used, population, time frame, and years studied. Researchers using ICD codes to determine opioid abuse prevalence need to be aware of these cautions and limitations.
Auto-Coding UML Statecharts for Flight Software
NASA Technical Reports Server (NTRS)
Benowitz, Edward G; Clark, Ken; Watney, Garth J.
2006-01-01
Statecharts have been used as a means to communicate behaviors in a precise manner between system engineers and software engineers. Hand-translating a statechart to code, as done on some previous space missions, introduces the possibility of errors in the transformation from chart to code. To improve auto-coding, we have developed a process that generates flight code from UML statecharts. Our process is being used for the flight software on the Space Interferometer Mission (SIM).
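The flavor of statechart-derived code can be suggested with a small sketch: a transition table mapping (state, event) pairs to (next state, action), one common shape for auto-generated state machines. This is purely illustrative; the states, events, and structure are our assumptions, not the SIM auto-coder's actual output (and the flight code is not Python):

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    ACQUIRING = auto()
    TRACKING = auto()

# Transition table of the kind an auto-coder might emit from a statechart:
# (current state, event) -> (next state, action). All names illustrative.
TRANSITIONS = {
    (State.IDLE, "start"): (State.ACQUIRING, lambda: print("begin acquisition")),
    (State.ACQUIRING, "lock"): (State.TRACKING, lambda: print("target locked")),
    (State.TRACKING, "drop"): (State.ACQUIRING, lambda: print("reacquiring")),
}

class StateMachine:
    def __init__(self):
        self.state = State.IDLE

    def dispatch(self, event):
        """Run one transition; events with no entry in the table are discarded."""
        key = (self.state, event)
        if key in TRANSITIONS:
            self.state, action = TRANSITIONS[key]
            action()

sm = StateMachine()
for ev in ("start", "lock", "drop"):
    sm.dispatch(ev)
```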
NASA Technical Reports Server (NTRS)
Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush
2006-01-01
This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are to: (1) show a plan for using uplink coding and describe its benefits; (2) define possible solutions and their applicability to different types of uplink, including emergency uplink; (3) concur on the conclusions so that a plan to use the proposed uplink system can proceed; (4) identify the need for the development of appropriate technology and its infusion in the DSN; and (5) gain advocacy to implement uplink coding in flight projects. Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).
The Plasma Simulation Code: A modern particle-in-cell code with patch-based load-balancing
NASA Astrophysics Data System (ADS)
Germaschewski, Kai; Fox, William; Abbott, Stephen; Ahmadi, Narges; Maynard, Kristofor; Wang, Liang; Ruhl, Hartmut; Bhattacharjee, Amitava
2016-08-01
This work describes the Plasma Simulation Code (PSC), an explicit, electromagnetic particle-in-cell code with support for different-order particle shape functions. We review the basic components of the particle-in-cell method as well as the computational architecture of the PSC code that supports modular algorithms and data structures. We then describe and analyze in detail a distinguishing feature of PSC: patch-based load balancing using space-filling curves, which is shown to lead to major efficiency gains over unbalanced methods and a previously used simpler balancing method.
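The idea behind space-filling-curve load balancing is to order patches along a locality-preserving curve and then cut the ordered list into contiguous chunks of roughly equal load, one per rank. A minimal sketch using a Z-order (Morton) curve; PSC's actual algorithm and data structures differ:

```python
def morton(ix, iy, bits=8):
    """Interleave the bits of 2-D patch indices: a position on a Z-order curve."""
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (2 * b) | ((iy >> b) & 1) << (2 * b + 1)
    return key

def balance(patches, loads, n_ranks):
    """Order patches along the curve, then cut into contiguous chunks of similar load."""
    order = sorted(range(len(patches)), key=lambda i: morton(*patches[i]))
    target = sum(loads) / n_ranks
    assignment, rank, acc = {}, 0, 0.0
    for i in order:
        if acc >= target and rank < n_ranks - 1:   # current rank is full: move on
            rank, acc = rank + 1, 0.0
        assignment[patches[i]] = rank
        acc += loads[i]
    return assignment

# A 4x4 patch grid with heavier (hypothetical) particle load on the diagonal.
patches = [(ix, iy) for ix in range(4) for iy in range(4)]
loads = [1.0 + 3.0 * (ix == iy) for ix, iy in patches]
print(balance(patches, loads, n_ranks=4))
```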
Air Traffic Controller Working Memory: Considerations in Air Traffic Control Tactical Operations
1993-09-01
Contents fragment: 1. Information Processing System; 2. Air Traffic Controller Memory; 2.1 Memory Codes (2.1.1 Visual Codes, 2.1.2 Phonetic Codes, 2.1.3 Semantic Codes). ...raise an awareness of the memory requirements of ATC tactical operations by presenting information on working memory processes that are relevant to... working memory permeates every aspect of the controller's ability to process air traffic information and control live traffic.
Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert
2015-05-28
System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here, with Markov Chain Monte Carlo (MCMC) sampling, feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This "function factorization" Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
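The essential loop of emulator-based calibration can be sketched as: fit a GP to a modest number of code runs, then run MCMC against the observation using the cheap GP prediction in place of the code. The sketch below uses a standard GP (scikit-learn) rather than the paper's FFGP, ignores emulator variance in the likelihood, and substitutes a toy one-parameter function for the system code; all names and values are illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def system_code(theta):
    """Toy stand-in for an expensive system code (monotonic, so identifiable)."""
    return theta + 0.3 * np.sin(3.0 * theta)

# 1) Train the emulator on a modest number of code runs.
theta_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
gp.fit(theta_train, system_code(theta_train).ravel())

# 2) Calibrate against a noisy observation with Metropolis MCMC on the emulator.
y_obs, sigma = system_code(np.array([[1.3]])).item() + 0.05, 0.1

def log_post(theta):
    if not 0.0 <= theta <= 2.0:                 # uniform prior on [0, 2]
        return -np.inf
    mu = gp.predict(np.array([[theta]]))[0]    # cheap emulator call, not the code
    return -0.5 * ((y_obs - mu) / sigma) ** 2

theta, samples = 1.0, []
for _ in range(5000):
    prop = theta + 0.1 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
print("posterior mean:", np.mean(samples[1000:]))  # near theta = 1.3, shifted by the noise
```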
The Integrated Medical Model: Outcomes from Independent Review
NASA Technical Reports Server (NTRS)
Myers, J.; Garcia, Y.; Griffin, D.; Arellano, J.; Boley, L.; Goodenow, D. A.; Kerstman, E.; Reyes, D.; Saile, L.; Walton, M.;
2017-01-01
In 2016, the Integrated Medical Model (IMM) v4.0 underwent an extensive external review in preparation for transition to operational status. In order to ensure impartiality of the review process, the Exploration Medical Capabilities Element of NASA's Human Research Program convened the review through the Systems Review Office at NASA Goddard Space Flight Center (GSFC). The review board convened by GSFC consisted of persons from both NASA and academia with expertise in the fields of statistics, epidemiology, modeling, software development, aerospace medicine, and project management (see Figure 1). The board reviewed software and code standards, as well as the evidence pedigree associated with both the input and outcome information. The board also assessed the model's verification, validation, sensitivity to parameters, and ability to answer operational questions. This talk will discuss the process for designing the review, how the review progressed, and the findings from the board, as well as summarize the IMM project responses to those findings. Overall, the board found that the IMM is scientifically sound, represents a necessary, comprehensive approach to identifying medical and environmental risks facing astronauts in long-duration missions, and is an excellent tool for communication between engineers and physicians. The board also found that the IMM and its customer(s) should convene an additional review of the IMM data sources and develop a sustainable approach to augment, peer review, and maintain the information utilized in the IMM. The board found this critically important because medical knowledge continues to evolve. Delivery of IMM v4.0 to the Crew Health and Safety (CHS) Program will occur in 2017. Once delivered for operational decision support, IMM v4.0 will provide CHS with additional quantitative capability to assess astronaut medical risks and the medical capabilities required to help drive down overall mission risks.
Zhang, Fei-Fei; Luo, Yu-Hao; Wang, Hui; Zhao, Liang
2016-01-01
Long non-coding RNAs (lncRNAs), a newly discovered class of ncRNA molecules, have been widely accepted as crucial regulators of various diseases including cancer. Increasing numbers of studies have demonstrated that lncRNAs are involved in diverse physiological and pathophysiological processes, such as cell cycle progression, chromatin remodeling, gene transcription, and posttranscriptional processing. Aberrant expression of lncRNAs frequently occurs in gastrointestinal cancer and plays emerging roles in cancer metastasis. In this review, we focus on and outline the regulatory functions of recently identified metastasis-associated lncRNAs, and evaluate the potential roles of lncRNAs as novel diagnostic biomarkers and therapeutic targets in gastrointestinal cancer. PMID:27818589
Overview of research on Bombyx mori microRNA
Wang, Xin; Tang, Shun-ming; Shen, Xing-jia
2014-01-01
MicroRNAs (miRNAs) constitute some of the most significant regulatory factors acting at the post-transcriptional level of gene expression, contributing to the modulation of a large number of physiological processes such as development, metabolism, and disease occurrence. This review comprehensively and retrospectively explores the literature investigating silkworm, Bombyx mori L. (Lepidoptera: Bombycidae), miRNAs published to date, including their discovery, identification, expression profiling analysis, target gene prediction, and the functional analysis of both miRNAs and their targets. It may provide experimental considerations and approaches for the future study of miRNAs and benefit the elucidation of the mechanisms of miRNAs involved in silkworm developmental processes and the intracellular activities of other unknown non-coding RNAs. PMID:25368077
The function and failure of sensory predictions.
Bansal, Sonia; Ford, Judith M; Spering, Miriam
2018-04-23
Humans and other primates are equipped with neural mechanisms that allow them to automatically make predictions about future events, facilitating processing of expected sensations and actions. Prediction-driven control and monitoring of perceptual and motor acts are vital to normal cognitive functioning. This review provides an overview of corollary discharge mechanisms involved in predictions across sensory modalities and discusses consequences of predictive coding for cognition and behavior. Converging evidence now links impairments in corollary discharge mechanisms to neuropsychiatric symptoms such as hallucinations and delusions. We review studies supporting a prediction-failure hypothesis of perceptual and cognitive disturbances. We also outline neural correlates underlying prediction function and failure, highlighting similarities across the visual, auditory, and somatosensory systems. In linking basic psychophysical and psychophysiological evidence of visual, auditory, and somatosensory prediction failures to neuropsychiatric symptoms, our review furthers our understanding of disease mechanisms. © 2018 New York Academy of Sciences.
Pierce, Hannah L; Stafford, Julia M; Daube, Mike
2017-07-26
Young people in Australia are frequently exposed to alcohol marketing. Leading health organisations recommend legislative controls on alcohol advertising as part of a comprehensive approach to reduce alcohol-related harm. However, Australia relies largely on industry self-regulation. This paper describes the development and implementation of the Alcohol Advertising Review Board (AARB), a world-first public health advocacy initiative that encourages independent regulation of alcohol advertising. The AARB reviews complaints about alcohol advertising, and uses strategies such as media advocacy, community engagement and communicating with policy makers to highlight the need for effective regulation. In 4 years of operation, the AARB has received more complaints than the self-regulatory system across a similar period. There has been encouraging movement towards stronger regulation of alcohol advertising. Key lessons include the importance of a strong code, credible review processes, gathering support from reputable organisations, and consideration of legal risks and sustainability. The AARB provides a unique model that could be replicated elsewhere.
NASA Technical Reports Server (NTRS)
Sanchez, Jose Enrique; Auge, Estanislau; Santalo, Josep; Blanes, Ian; Serra-Sagrista, Joan; Kiely, Aaron
2011-01-01
A new standard for image coding is being developed by the MHDC working group of the CCSDS, targeting onboard compression of multi- and hyper-spectral imagery captured by aircraft and satellites. The proposed standard is based on the "Fast Lossless" adaptive linear predictive compressor, adapted to better address the constraints of onboard scenarios. In this paper, we present a review of the state of the art in this field and provide an experimental comparison of the coding performance of the emerging standard in relation to other state-of-the-art coding techniques. Our own independent implementation of the MHDC Recommended Standard, as well as of some of the other techniques, has been used to provide extensive results over the vast corpus of test images from the CCSDS-MHDC.
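The core of a "Fast Lossless"-style compressor is an adaptive linear predictor whose integer residuals are then entropy coded. A much-simplified scalar sketch: the actual standard predicts across spectral bands with a sign-based weight update, whereas the normalized-LMS update and parameters here are our own simplification, and entropy coding is omitted.

```python
import numpy as np

def predictive_encode(samples, order=3, mu=0.5):
    """Adaptive linear prediction: predict each sample from the previous `order`
    samples, update the weights with a normalized-LMS rule, and emit integer
    residuals. Entropy coding of the residuals (e.g. Golomb/Rice) is omitted."""
    w, history = np.zeros(order), np.zeros(order)
    residuals = []
    for s in samples:
        pred = int(round(float(w @ history)))
        err = s - pred                      # the decoder recovers s as pred + err
        w += mu * err * history / (1.0 + history @ history)   # NLMS update
        history = np.roll(history, 1)
        history[0] = s
        residuals.append(err)
    return residuals

signal = [int(100 + 10 * np.sin(t / 3.0)) for t in range(60)]
residuals = predictive_encode(signal)
print(residuals[:8], "...", residuals[-8:])   # residuals shrink as weights adapt
```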
Certification of medical librarians, 1949--1977 statistical analysis.
Schmidt, D
1979-01-01
The Medical Library Association's Code for Training and Certification of Medical Librarians was in effect from 1949 to August 1977, a period during which 3,216 individuals were certified. Statistics on each type of certificate granted each year are provided. Because 54.5% of those granted certification were awarded it in the last three-year, two-month period of the code's existence, these applications are reviewed in greater detail. Statistics on MLA membership, sex, residence, library school, and method of meeting requirements are detailed. Questions relating to certification under the code now in existence are raised. PMID:427287
Improving the medical records department processes by lean management.
Ajami, Sima; Ketabi, Saeedeh; Sadeghian, Akram; Saghaeinnejad-Isfahani, Sakine
2015-01-01
Lean management is a process improvement technique used to identify wasteful actions and processes and eliminate them. The benefits of Lean for healthcare organizations are, first, that the quality of the outcomes in terms of mistakes and errors improves, and second, that the amount of time taken through the whole process significantly improves. The purpose of this paper is to improve the Medical Records Department (MRD) processes at Ayatolah-Kashani Hospital in Isfahan, Iran by utilizing Lean management. This research was an applied, interventional study. The data were collected by brainstorming, observation, interview, and workflow review. The study population included MRD staff and other expert staff within the hospital who were stakeholders and users of the MRD. The MRD staff were initially taught the concepts of Lean management and then formed into the MRD Lean team. The team then identified and reviewed the current processes; subsequently, they identified wastes and values and proposed solutions. The findings showed that across the MRD units (Archive, Coding, Statistics, and Admission), 17 current processes, 28 wastes, and 11 values were identified. In addition, the team offered 27 suggestions for eliminating the wastes. The MRD is a critical department for the hospital information system and, therefore, the continuous improvement of its services and processes, through scientific methods such as Lean management, is essential. The study represents one of the few attempts to eliminate wastes in the MRD.
Bosse, Stefan
2015-01-01
Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code levels and supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
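The ATG behavioral model can be pictured as activities (functions over the agent's data state) connected by condition-guarded transitions. A high-level Python sketch of that interpretation, not the platform's stack-machine code format; the activity names and conditions are illustrative:

```python
# Activities are functions over the agent's data state; the ATG maps each
# activity to condition-guarded transitions. All names are illustrative.
def sense(agent):  agent["reading"] = agent["sensor"]()
def decide(agent): agent["alarm"] = agent["reading"] > agent["threshold"]
def act(agent):    print("alarm!" if agent["alarm"] else "ok")

ATG = {
    sense:  [(lambda a: True, decide)],
    decide: [(lambda a: a["alarm"], act), (lambda a: not a["alarm"], sense)],
    act:    [(lambda a: True, sense)],
}

def run(agent, start, steps):
    """Execute the current activity, then follow the first transition whose
    condition holds on the agent's data state."""
    activity = start
    for _ in range(steps):
        activity(agent)
        activity = next(nxt for cond, nxt in ATG[activity] if cond(agent))

readings = iter([1, 5, 2, 7])
agent = {"sensor": lambda: next(readings), "threshold": 4, "reading": 0, "alarm": False}
run(agent, sense, steps=6)   # prints "alarm!" after the reading of 5
```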
NASA Astrophysics Data System (ADS)
Trejos, Sorayda; Fredy Barrera, John; Torroba, Roberto
2015-08-01
We present for the first time an optical encrypting-decrypting protocol for recovering messages without speckle noise. This is a digital holographic technique using a 2f scheme to process QR code entries. In the procedure, the letters used to compose eventual messages are individually converted into QR codes, and each QR code is then divided into portions. Through a holographic technique, we store each processed portion. After filtering and repositioning, we add all processed data to create a single pack, thus simplifying the handling and recovery of multiple QR code images and representing the first multiplexing procedure applied to processed QR codes. All QR codes are recovered in a single step and in the same plane, showing neither cross-talk nor noise problems as in other methods. Experiments have been conducted using an interferometric configuration, and comparisons between unprocessed and recovered QR codes have been performed, showing differences between them due to the processing involved. Recovered QR codes can be successfully scanned, thanks to their noise tolerance. Finally, the appropriate sequence in the scanning of the recovered QR codes yields a noiseless retrieved message. Additionally, to procure maximum security, the multiplexed pack can be multiplied by a digital diffuser so as to encrypt it. The encrypted pack is easily decoded by multiplying it by the complex conjugate of the diffuser. As this is a digital operation, no noise is added. Therefore, this technique is threefold robust, involving multiplexing, encryption, and the need for a sequence to retrieve the outcome.
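The diffuser step described in the final sentences reduces to pointwise multiplication by a unit-modulus random-phase mask, which the complex conjugate exactly cancels. A minimal numerical sketch, with random data standing in for the holographic pack:

```python
import numpy as np

rng = np.random.default_rng(42)

# Random data stands in for the multiplexed pack of processed QR-code portions.
pack = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))

# Random-phase diffuser: unit modulus, so multiplication loses no information.
diffuser = np.exp(2j * np.pi * rng.random((64, 64)))

encrypted = pack * diffuser                 # encryption: pointwise multiplication
recovered = encrypted * np.conj(diffuser)   # decryption: the conjugate cancels the phase

print(np.allclose(recovered, pack))         # True: the digital operation adds no noise
```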
Implementation of a Post-Code Pause: Extending Post-Event Debriefing to Include Silence.
Copeland, Darcy; Liska, Heather
2016-01-01
This project arose out of a need to address two issues at our hospital: we lacked a formal debriefing process for code/trauma events and the emergency department wanted to address the psychological and spiritual needs of code/trauma responders. We developed a debriefing process for code/trauma events that intentionally included mechanisms to facilitate recognition, acknowledgment, and, when needed, responses to the psychological and spiritual needs of responders. A post-code pause process was implemented in the emergency department with the aims of standardizing a debriefing process, encouraging a supportive team-based culture, improving transition back to "normal" activities after responding to code/trauma events, and providing responders an opportunity to express reverence for patients involved in code/trauma events. The post-code pause process incorporates a moment of silence and the addition of two simple questions to a traditional operational debrief. Implementation of post-code pauses was feasible despite the fast paced nature of the department. At the end of the 1-year pilot period, staff members reported increases in feeling supported by peers and leaders, their ability to pay homage to patients, and having time to regroup prior to returning to their assignment. There was a decrease in the number of respondents reporting having thoughts or feelings associated with the event within 24 hr. The pauses create a mechanism for operational team debriefing, provide an opportunity for staff members to honor their work and their patients, and support an environment in which the psychological and spiritual effects of responding to code/trauma events can be acknowledged.
TU-AB-BRC-12: Optimized Parallel Monte Carlo Dose Calculations for Secondary MU Checks
DOE Office of Scientific and Technical Information (OSTI.GOV)
French, S; Nazareth, D; Bellor, M
Purpose: Secondary MU checks are an important tool used during a physics review of a treatment plan. Commercial software packages offer varying degrees of theoretical dose calculation accuracy, depending on the modality involved. Dose calculations of VMAT plans are especially prone to error due to the large approximations involved. Monte Carlo (MC) methods are not commonly used due to their long run times. We investigated two methods to increase the computational efficiency of MC dose simulations with the BEAMnrc code. Distributed computing resources, along with optimized code compilation, will allow for accurate and efficient VMAT dose calculations. Methods: The BEAMnrc package was installed on a high performance computing cluster accessible to our clinic. MATLAB and PYTHON scripts were developed to convert a clinical VMAT DICOM plan into BEAMnrc input files. The BEAMnrc installation was optimized by running the VMAT simulations through profiling tools which indicated the behavior of the constituent routines in the code, e.g. the bremsstrahlung splitting routine, and the specified random number generator. This information aided in determining the most efficient parallel compiling configuration for the specific CPUs available on our cluster, resulting in the fastest VMAT simulation times. Our method was evaluated with calculations involving 10^8 - 10^9 particle histories, which are sufficient to verify patient dose using VMAT. Results: Parallelization allowed the calculation of patient dose on the order of 10-15 hours with 100 parallel jobs. Due to the compiler optimization process, further speed increases of 23% were achieved when compared with the open-source compiler BEAMnrc packages. Conclusion: Analysis of the BEAMnrc code allowed us to optimize the compiler configuration for VMAT dose calculations. In future work, the optimized MC code, in conjunction with the parallel processing capabilities of BEAMnrc, will be applied to provide accurate and efficient secondary MU checks.
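The history-splitting strategy behind such parallel MC runs can be sketched in a few lines: divide the total histories across jobs, seed each job's random stream independently, and combine the batch results. This illustrates the principle only, not the actual BEAMnrc job scripts; `run_batch` and its toy scoring are hypothetical.

```python
from concurrent.futures import ProcessPoolExecutor
import random

def run_batch(seed, n_histories):
    """Hypothetical stand-in for one MC job: an independently seeded batch of
    histories, each scoring a toy 'dose' contribution."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n_histories))

def parallel_dose(total_histories, n_jobs):
    per_job = total_histories // n_jobs
    with ProcessPoolExecutor(max_workers=n_jobs) as pool:
        # Distinct seeds keep the per-job random-number streams independent.
        doses = pool.map(run_batch, range(n_jobs), [per_job] * n_jobs)
        return sum(doses) / total_histories

if __name__ == "__main__":
    print(parallel_dose(total_histories=1_000_000, n_jobs=10))   # ~0.5 for this toy
```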
An overview of bacterial nomenclature with special reference to plant pathogens.
Young, J M
2008-12-01
The nomenclature of plant pathogenic bacteria is regulated by the International Code of Nomenclature of Prokaryotes and the International Standards for Naming Pathovars of Phytopathogenic Bacteria. The object of these regulations is to ensure that nomenclature is unambiguous, with correct designations in genera and species and, for many plant pathogens, in infrasubspecies as pathovars. Failure to apply these regulations or to apply them carelessly introduces confusion and misunderstanding over the intended identity of particular pathogens. In this review, bacterial nomenclature is introduced in the context of general communication, with a brief history of the origins of modern bacterial nomenclature. A critical overview of the Code pays most attention to those Rules that are relevant to naming new taxa and new combinations, with comments on common misunderstandings. There follows an account of the application of infrasubspecies, specifically of pathovars as regulated by the Standards for Naming Pathovars. Both the Code and Standards, written almost 30 years ago in response to the exigencies of the time, could be revised to improve clarity. It is not possible for either the Code or the Standards to give formal guidance to the process of translation of pathovars, governed by the Standards, to higher taxonomic ranks, governed by the Code. If the introduction of ambiguity of names is to be avoided in making such translations, then it is the responsibility of individual bacteriologists to consider carefully the nomenclatural implications and outcomes of their proposals.
ERIC Educational Resources Information Center
Leach, Mark M.; Oakland, Thomas
2007-01-01
Ethics codes are designed to protect the public by prescribing behaviors professionals are expected to exhibit. Although test use is universal, albeit reflecting strong Western influences, previous studies that examine the degree issues pertaining to test development and use and that are addressed in ethics codes of national psychological…
Review and verification of CARE 3 mathematical model and code
NASA Technical Reports Server (NTRS)
Rose, D. M.; Altschul, R. E.; Manke, J. W.; Nelson, D. L.
1983-01-01
The CARE III mathematical model and code verification performed by Boeing Computer Services are documented. The mathematical model was verified for permanent and intermittent faults; the transient fault model was not addressed. The code verification was performed on CARE III, Version 3. A CARE III Version 4, which corrects deficiencies identified in Version 3, is being developed.
Electronic data processing codes for California wildland plants
Merton J. Reed; W. Robert Powell; Bur S. Bal
1963-01-01
Systematized codes for plant names are helpful to a wide variety of workers who must record the identity of plants in the field. We have developed such codes for a majority of the vascular plants encountered on California wildlands and have published the codes in pocket size, using photo-reductions of the output from data processing machines. A limited number of the...
A Looking-Glass of Non-Coding RNAs in Oral Cancer
Irimie, Alexandra Iulia; Braicu, Cornelia; Sonea, Laura; Zimta, Alina Andreea; Diudea, Diana; Buduru, Smaranda; Berindan-Neagoe, Ioana
2017-01-01
Oral cancer is a multifactorial pathology characterized by the lack of efficient treatment and accurate diagnostic tools, mainly due to late diagnosis; therefore, reliable biomarkers for the timely detection of the disease and patient stratification are required. Non-coding RNAs (ncRNAs) are key elements in the physiological and pathological processes of various cancers, which is also reflected in oral cancer development and progression. A better understanding of their role could give a more thorough perspective on future treatment options for this cancer type. This review offers a glimpse into ncRNA involvement in oral cancer, which can help the medical community tap into the world of ncRNAs and lay the ground for more powerful diagnostic, prognostic and treatment tools for oral cancer that will ultimately help build a brighter future for these patients. PMID:29206174
Laminar fMRI and computational theories of brain function.
Stephan, K E; Petzschner, F H; Kasper, L; Bayer, J; Wellstein, K V; Stefanics, G; Pruessmann, K P; Heinzle, J
2017-11-02
Recently developed methods for functional MRI at the resolution of cortical layers (laminar fMRI) offer a novel window into neurophysiological mechanisms of cortical activity. Beyond physiology, laminar fMRI also offers an unprecedented opportunity to test influential theories of brain function. Specifically, hierarchical Bayesian theories of brain function, such as predictive coding, assign specific computational roles to different cortical layers. Combined with computational models, laminar fMRI offers a unique opportunity to test these proposals noninvasively in humans. This review provides a brief overview of predictive coding and related hierarchical Bayesian theories, summarises their predictions with regard to layered cortical computations, examines how these predictions could be tested by laminar fMRI, and considers methodological challenges. We conclude by discussing the potential of laminar fMRI for clinically useful computational assays of layer-specific information processing. Copyright © 2017 Elsevier Inc. All rights reserved.
clearScience: Infrastructure for Communicating Data-Intensive Science.
Bot, Brian M; Burdick, David; Kellen, Michael; Huang, Erich S
2013-01-01
Progress in biomedical research requires effective scientific communication to one's peers and to the public. Current research routinely encompasses large datasets and complex analytic processes, and the constraints of traditional journal formats limit useful transmission of these elements. We are constructing a framework through which authors can provide not only the narrative of what was done, but also the primary and derivative data, the source code, the compute environment, and web-accessible virtual machines. This infrastructure allows authors to "hand their machine," prepopulated with libraries, data, and code, to those interested in reviewing or building off of their work. This project, "clearScience," seeks to provide an integrated system that accommodates the ad hoc nature of discovery in the data-intensive sciences and seamless transitions from working to reporting. We demonstrate that rather than merely describing the science being reported, one can deliver the science itself.
tRNA-Derived Small RNA: A Novel Regulatory Small Non-Coding RNA.
Li, Siqi; Xu, Zhengping; Sheng, Jinghao
2018-05-10
Deep analysis of next-generation sequencing data unveils numerous small non-coding RNAs with distinct functions. Recently, fragments derived from tRNA, named as tRNA-derived small RNA (tsRNA), have attracted broad attention. There are mainly two types of tsRNAs, including tRNA-derived stress-induced RNA (tiRNA) and tRNA-derived fragment (tRF), which differ in the cleavage position of the precursor or mature tRNA transcript. Emerging evidence has shown that tsRNAs are not merely tRNA degradation debris but have been recognized to play regulatory roles in many specific physiological and pathological processes. In this review, we summarize the biogeneses of various tsRNAs, present the emerging concepts regarding functions and mechanisms of action of tsRNAs, highlight the potential application of tsRNAs in human diseases, and put forward the current problems and future research directions.
MicroRNAs in large herpesvirus DNA genomes: recent advances.
Sorel, Océane; Dewals, Benjamin G
2016-08-01
MicroRNAs (miRNAs) are small non-coding RNAs (ncRNAs) that regulate gene expression. They alter mRNA translation through base-pair complementarity, leading to regulation of genes during both physiological and pathological processes. Viruses have evolved mechanisms to take advantage of the host cells to multiply and/or persist over the lifetime of the host. Herpesviridae are a large family of double-stranded DNA viruses that are associated with a number of important diseases, including lymphoproliferative diseases. Herpesviruses establish lifelong latent infections through modulation of the interface between the virus and its host. A number of reports have identified miRNAs in a very large number of human and animal herpesviruses suggesting that these short non-coding transcripts could play essential roles in herpesvirus biology. This review will specifically focus on the recent advances on the functions of herpesvirus miRNAs in infection and pathogenesis.
[Epigenetics of plant vernalization regulated by non-coding RNAs].
Zhang, Shao-Feng; Li, Xiao-Rong; Sun, Chuan-Bao; He, Yu-Ke
2012-07-01
Many higher plants must experience a period of winter cold to accomplish the transition from vegetative to reproductive growth. This biological process is called vernalization. Some crops such as wheat (Triticum aestivum L.) and oilseed rape (Brassica napus L.) produce seeds as edible organs, and therefore special measures of rotation and cultivation are necessary for plants to go through an early vernalization for flower differentiation and development, whereas other crops such as Chinese cabbage (Brassica rapa ssp. pekinensis) and cabbage (Brassica oleracea L.) produce leafy heads as edible organs, and additional practices should be taken to avoid vernalization for prolonged, fully vegetative growth. Before vernalization, flowering is repressed by the action of a gene called Flowering Locus C (FLC). This paper reviewed the function of non-coding RNAs and some proteins including VRN1, VRN2, and VIN3 in epigenetic regulation of FLC during vernalization.
New technologies accelerate the exploration of non-coding RNAs in horticultural plants
Liu, Degao; Mewalal, Ritesh; Hu, Rongbin; Tuskan, Gerald A; Yang, Xiaohan
2017-01-01
Non-coding RNAs (ncRNAs), that is, RNAs not translated into proteins, are crucial regulators of a variety of biological processes in plants. While protein-encoding genes have been relatively well-annotated in sequenced genomes, accounting for a small portion of the genome space in plants, the universe of plant ncRNAs is rapidly expanding. Recent advances in experimental and computational technologies have generated a great momentum for discovery and functional characterization of ncRNAs. Here we summarize the classification and known biological functions of plant ncRNAs, review the application of next-generation sequencing (NGS) technology and ribosome profiling technology to ncRNA discovery in horticultural plants and discuss the application of new technologies, especially the new genome-editing tool clustered regularly interspaced short palindromic repeat (CRISPR)/CRISPR-associated protein 9 (Cas9) systems, to functional characterization of plant ncRNAs. PMID:28698797
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Continuing the tradition established in prior years, this panel encompasses one of the broadest ranges of topics and issues of any panel at the Summer Study. It includes papers addressing all sectors, low-income residential to industrial, and views energy efficiency from many perspectives including programmatic, evaluation, codes, standards, legislation, technical transfer, economic development, and least-cost planning. The papers represent work being performed in most geographic regions of the United States and in the international arena, specifically Thailand, China, Europe, and Scandinavia. This delightful smorgasbord has been organized, based on general content area, into the following eight sessions: (1) new directions for low-income weatherization; (2) pursuing efficiency through legislation and standards; (3) international perspectives on energy efficiency; (4) technical transfer strategies; (5) government energy policy; (6) commercial codes and standards; (7) innovative programs; and (8) state-of-the-art review. For these conference proceedings, individual papers are processed separately for the Energy Data Base.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steiner, J.L.; Lime, J.F.; Elson, J.S.
One-dimensional TRAC transient calculations of the process inherent ultimate safety (PIUS) advanced reactor design were performed for a pump-trip SCRAM. The TRAC calculations showed that the reactor power response and shutdown were in qualitative agreement with the one-dimensional analyses presented in the PIUS Preliminary Safety Information Document (PSID) submitted by Asea Brown Boveri (ABB) to the US Nuclear Regulatory Commission for preapplication safety review. The PSID analyses were performed with the ABB-developed RIGEL code. The TRAC-calculated phenomena and trends were also similar to those calculated with another one-dimensional PIUS model, the Brookhaven National Laboratory developed PIPA code. A TRAC pump-trip SCRAM transient has also been calculated with a TRAC model containing a multi-dimensional representation of the PIUS internal flow structures and core region. The results obtained using the TRAC fully one-dimensional PIUS model are compared to the RIGEL, PIPA, and TRAC multi-dimensional results.
Cai, Tianxi; Karlson, Elizabeth W.
2013-01-01
Objectives To test whether data extracted from full text patient visit notes from an electronic medical record (EMR) would improve the classification of PsA compared to an algorithm based on codified data. Methods From the > 1,350,000 adults in a large academic EMR, all 2318 patients with a billing code for PsA were extracted and 550 were randomly selected for chart review and algorithm training. Using codified data and phrases extracted from narrative data using natural language processing, 31 predictors were extracted and three random forest algorithms trained using coded, narrative, and combined predictors. The receiver operating characteristic (ROC) curve was used to identify the optimal algorithm, and a cut point was chosen to achieve the maximum sensitivity possible at a 90% positive predictive value (PPV). The algorithm was then used to classify the remaining 1768 charts and finally validated in a random sample of 300 cases predicted to have PsA. Results The PPV of a single PsA code was 57% (95% CI 55%–58%). Using a combination of coded data and NLP, the random forest algorithm reached a PPV of 90% (95% CI 86%–93%) at a sensitivity of 87% (95% CI 83%–91%) in the training data. The PPV was 93% (95% CI 89%–96%) in the validation set. Adding NLP predictors to codified data increased the area under the ROC curve (p < 0.001). Conclusions Using NLP with text notes from electronic medical records improved the performance of the prediction algorithm significantly. Random forests were a useful tool to accurately classify psoriatic arthritis cases to enable epidemiological research. PMID:20701955
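As a rough sketch of the kind of pipeline this abstract describes (synthetic data, not the authors' implementation), the code below trains a random forest on combined codified and NLP predictors and then picks the most sensitive probability cut point that still meets a 90% PPV (precision) target. All feature layouts and counts are placeholders.

```python
# Sketch: random forest phenotyping with a PPV-constrained cut point.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_recall_curve

rng = np.random.default_rng(0)
n = 550                                  # size of the chart-reviewed training set
X_coded = rng.poisson(2.0, (n, 15))      # e.g., billing-code counts (placeholder)
X_nlp = rng.poisson(1.0, (n, 16))        # e.g., NLP concept-mention counts (placeholder)
X = np.hstack([X_coded, X_nlp])          # 31 combined predictors
y = rng.integers(0, 2, n)                # chart-review gold standard (placeholder)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
probs = rf.predict_proba(X)[:, 1]

# Choose the lowest threshold whose precision (PPV) is at least 0.90,
# i.e., the cut point maximizing sensitivity subject to the PPV constraint.
precision, recall, thresholds = precision_recall_curve(y, probs)
ok = np.where(precision[:-1] >= 0.90)[0]
cut = thresholds[ok[0]] if ok.size else 0.5
print(f"chosen cut point: {cut:.3f}")
```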
Summary of Pressure Gain Combustion Research at NASA
NASA Technical Reports Server (NTRS)
Perkins, H. Douglas; Paxson, Daniel E.
2018-01-01
NASA has undertaken a systematic exploration of many different facets of pressure gain combustion over the last 25 years in an effort to exploit the inherent thermodynamic advantage of pressure gain combustion over the constant pressure combustion process used in most aerospace propulsion systems. Applications as varied as small-scale UAVs, rotorcraft, subsonic transports, hypersonics and launch vehicles have been considered. In addition to studying pressure gain combustor concepts such as wave rotors, pulse detonation engines, pulsejets, and rotating detonation engines, NASA has studied inlets, nozzles, ejectors and turbines which must also process unsteady flow in an integrated propulsion system. Other design considerations such as acoustic signature, combustor material life and heat transfer that are unique to pressure gain combustors have also been addressed in NASA research projects. In addition to a wide range of experimental studies, a number of computer codes, from 0-D up through 3-D, have been developed or modified to specifically address the analysis of unsteady flow fields. Loss models have also been developed and incorporated into these codes that improve the accuracy of performance predictions and decrease computational time. These codes have been validated numerous times across a broad range of operating conditions, and it has been found that once validated for one particular pressure gain combustion configuration, these codes are readily adaptable to the others. All in all, the documentation of this work has encompassed approximately 170 NASA technical reports, conference papers and journal articles to date. These publications are very briefly summarized herein, providing a single point of reference for all of NASA's pressure gain combustion research efforts. This documentation does not include the significant contributions made by NASA research staff to the programs of other agencies, universities, industrial partners and professional society committees through serving as technical advisors, technical reviewers and research consultants.
Cell-assembly coding in several memory processes.
Sakurai, Y
1998-01-01
The present paper discusses why the cell assembly, i.e., an ensemble population of neurons with flexible functional connections, is a tenable view of the basic code for information processes in the brain. The main properties indicating the reality of cell-assembly coding are overlaps of neurons among different assemblies and connection dynamics within and among the assemblies. The former can be detected as multiple functions of individual neurons in processing different kinds of information. Individual neurons appear to be involved in multiple information processes. The latter can be detected as changes of functional synaptic connections in processing different kinds of information. Correlations of activity among some of the recorded neurons appear to change in multiple information processes. Recent experiments have compared several different memory processes (tasks) and detected these two main properties, indicating cell-assembly coding of memory in the working brain. The first experiment compared different types of processing of identical stimuli, i.e., working memory and reference memory of auditory stimuli. The second experiment compared identical processes of different types of stimuli, i.e., discriminations of simple auditory, simple visual, and configural auditory-visual stimuli. The third experiment compared identical processes of different types of stimuli with or without temporal processing of stimuli, i.e., discriminations of elemental auditory, configural auditory-visual, and sequential auditory-visual stimuli. Some possible features of the cell-assembly coding, especially "dual coding" by individual neurons and cell assemblies, are discussed for future experimental approaches. Copyright 1998 Academic Press.
Mechanism on brain information processing: Energy coding
NASA Astrophysics Data System (ADS)
Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa
2006-09-01
Motivated by the experimental finding that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a new theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Because the energy coding model reveals mechanisms of brain information processing based upon known biophysical properties, they can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, they estimate that the theory has very important consequences for quantitative research of cognitive function.
Hamming and Accumulator Codes Concatenated with MPSK or QAM
NASA Technical Reports Server (NTRS)
Divsalar, Dariush; Dolinar, Samuel
2009-01-01
In a proposed coding-and-modulation scheme, a high-rate binary data stream would be processed as follows: 1. The input bit stream would be demultiplexed into multiple bit streams. 2. The multiple bit streams would be processed simultaneously into a high-rate outer Hamming code that would comprise multiple short constituent Hamming codes - a distinct constituent Hamming code for each stream. 3. The streams would be interleaved. The interleaver would have a block structure that would facilitate parallelization for high-speed decoding. 4. The interleaved streams would be further processed simultaneously into an inner two-state, rate-1 accumulator code that would comprise multiple constituent accumulator codes - a distinct accumulator code for each stream. 5. The resulting bit streams would be mapped into symbols to be transmitted by use of a higher-order modulation - for example, M-ary phase-shift keying (MPSK) or quadrature amplitude modulation (QAM). The novelty of the scheme lies in the concatenation of the multiple-constituent Hamming and accumulator codes and the corresponding parallel architectures of the encoder and decoder circuitry (see figure) needed to process the multiple bit streams simultaneously. As in the cases of other parallel-processing schemes, one advantage of this scheme is that the overall data rate could be much greater than the data rate of each encoder and decoder stream and, hence, the encoder and decoder could handle data at an overall rate beyond the capability of the individual encoder and decoder circuits.
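The numbered steps above map onto a short encoder sketch. The code below is my illustration, not NASA's design: it uses an arbitrary Hamming(7,4) generator matrix, Gray-labeled QPSK in place of a general MPSK/QAM mapper, and a single stream rather than the parallel multi-stream architecture, but it shows the outer Hamming encoding, block interleaving, rate-1 accumulation, and symbol mapping in sequence.

```python
# Sketch of the serial concatenation: Hamming(7,4) -> interleaver ->
# rate-1 two-state accumulator (running XOR, i.e., differential encoder) -> QPSK.
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],   # Hamming(7,4) generator matrix (one common choice)
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

def hamming_encode(bits):
    """Outer code: encode 4-bit blocks into 7-bit Hamming codewords."""
    blocks = bits.reshape(-1, 4)
    return (blocks @ G % 2).astype(np.uint8).ravel()

def interleave(bits, rows=7):
    """Block interleaver: write row-wise, read column-wise."""
    return bits.reshape(rows, -1).T.ravel()

def accumulate(bits):
    """Inner rate-1 accumulator: y[k] = x[k] XOR y[k-1]."""
    return np.bitwise_xor.accumulate(bits)

def qpsk_map(bits):
    """Map bit pairs to Gray-labeled unit-energy QPSK symbols."""
    pairs = bits.reshape(-1, 2)
    return ((1 - 2 * pairs[:, 0]) + 1j * (1 - 2 * pairs[:, 1])) / np.sqrt(2)

rng = np.random.default_rng(0)
data = rng.integers(0, 2, 56, dtype=np.uint8)   # length divisible by 4 (and by 7 after coding)
symbols = qpsk_map(accumulate(interleave(hamming_encode(data))))
print(symbols[:4])
```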
Robertson, Eden G; Wakefield, Claire E; Signorelli, Christina; Cohn, Richard J; Patenaude, Andrea; Foster, Claire; Pettit, Tristan; Fardell, Joanna E
2018-07-01
We conducted a systematic review to identify the strategies that have been recommended in the literature to facilitate shared decision-making regarding enrollment in pediatric oncology clinical trials. We searched seven databases for peer-reviewed literature, published 1990-2017. Of 924 articles identified, 17 studies were eligible for the review. We assessed study quality using the 'Mixed-Methods Appraisal Tool'. We coded the results and discussions of papers line-by-line using NVivo software. We categorized strategies thematically. Five main themes emerged: 1) decision-making as a process; 2) individuality of the process; 3) information provision; 4) the role of communication; and 5) decision and psychosocial support. Families should have adequate time to make a decision. Healthcare professionals (HCPs) should elicit parents' and patients' preferences for level of information and decision involvement. Information should be clear and provided in multiple modalities. Articles also recommended providing training for HCPs and access to psychosocial support for families. High quality, individually-tailored information, open communication and psychosocial support appear vital in supporting decision-making regarding enrollment in clinical trials. These data will usefully inform future decision-making interventions/tools to support families making clinical trial decisions. A solid evidence-base for effective strategies which facilitate shared decision-making is needed. Copyright © 2018 Elsevier B.V. All rights reserved.
Natural Language Interface for Safety Certification of Safety-Critical Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2011-01-01
Model-based design and automated code generation are being used increasingly at NASA. The trend is to move beyond simulation and prototyping to actual flight code, particularly in the guidance, navigation, and control domain. However, there are substantial obstacles to more widespread adoption of code generators in such safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. The AutoCert generator plug-in supports the certification of automatically generated code by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews.
Reaction-diffusion systems in natural sciences and new technology transfer
NASA Astrophysics Data System (ADS)
Keller, André A.
2012-12-01
Diffusion mechanisms in natural sciences and innovation management involve partial differential equations (PDEs). This is due to their spatio-temporal dimensions. Functional semi-discretized PDEs (with lattice spatial structures or time delays) may be even more adapted to real-world problems. In the modeling process, PDEs can also formalize behaviors, such as the logistic growth of populations with migration, and the adopters' dynamics of new products in innovation models. In biology, these events are related to variations in the environment, population densities and overcrowding, migration and spreading of humans, animals, plants and other cells and organisms. In chemical reactions, molecules of different species interact locally and diffuse. In the management of new technologies, the diffusion processes of innovations in the marketplace (e.g., the mobile phone) are a major subject. These innovation diffusion models refer mainly to epidemic models. This contribution introduces the modeling process using PDEs and reviews the essential features of the dynamics and control in biological, chemical and new technology transfer. The paper is essentially user-oriented, presenting basic nonlinear evolution equations, delay PDEs, several analytical and numerical solution methods, a range of solutions, and the use of mathematical packages, notebooks and codes. The computations are carried out using the software Wolfram Mathematica® 7 and C++ codes.
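As one concrete instance of the model class surveyed here, the sketch below (mine, not drawn from the paper's notebooks) integrates the Fisher-KPP equation u_t = D u_xx + r u(1 - u), i.e., logistic growth with diffusion, by explicit finite differences; the grid, coefficients, and initial front are all illustrative choices.

```python
# Sketch: explicit finite-difference solution of the Fisher-KPP
# reaction-diffusion equation on [0, L] with zero-flux boundaries.
import numpy as np

D, r = 1.0, 1.0                     # diffusion coefficient, growth rate (assumed)
L, nx = 50.0, 501                   # domain and grid
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D                # time step within the explicit stability bound
x = np.linspace(0, L, nx)
u = np.where(x < 5.0, 1.0, 0.0)     # initial population front at the left edge

for _ in range(4000):
    lap = np.empty_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2   # interior Laplacian
    lap[0] = 2 * (u[1] - u[0]) / dx**2                    # zero-flux (Neumann) ends
    lap[-1] = 2 * (u[-2] - u[-1]) / dx**2
    u += dt * (D * lap + r * u * (1 - u))                 # explicit Euler update

# The front travels rightward at roughly 2*sqrt(D*r), a classic KPP result.
print(f"front position ~ {np.argmax(u < 0.5) * dx:.1f}")
```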
A Code of Ethics and Integrity for HRD Research and Practice.
ERIC Educational Resources Information Center
Hatcher, Tim; Aragon, Steven R.
2000-01-01
Describes the rationale for a code of ethics and integrity in human resource development (HRD). Outlines the Academy of Human Resource Development's standards. Reviews ethical issues faced by the HRD profession. (SK)
Carnahan, Ryan M; Kee, Vicki R
2012-01-01
This paper aimed to systematically review algorithms to identify transfusion-related ABO incompatibility reactions in administrative data, with a focus on studies that have examined the validity of the algorithms. A literature search was conducted using PubMed, Iowa Drug Information Service database, and Embase. A Google Scholar search was also conducted because of the difficulty identifying relevant studies. Reviews were conducted by two investigators to identify studies using data sources from the USA or Canada because these data sources were most likely to reflect the coding practices of Mini-Sentinel data sources. One study was found that validated International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes representing transfusion reactions. None of these cases were ABO incompatibility reactions. Several studies consistently used ICD-9-CM code 999.6, which represents ABO incompatibility reactions, and a technical report identified the ICD-10 code for these reactions. One study included the E-code E8760 for mismatched blood in transfusion in the algorithm. Another study reported finding no ABO incompatibility reaction codes in the Healthcare Cost and Utilization Project Nationwide Inpatient Sample database, which contains data of 2.23 million patients who received transfusions, raising questions about the sensitivity of administrative data for identifying such reactions. Two studies reported perfect specificity, with sensitivity ranging from 21% to 83%, for the code identifying allogeneic red blood cell transfusions in hospitalized patients. There is no information to assess the validity of algorithms to identify transfusion-related ABO incompatibility reactions. Further information on the validity of algorithms to identify transfusions would also be useful. Copyright © 2012 John Wiley & Sons, Ltd.
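For illustration only, a screening query built from the codes this review reports (ICD-9-CM 999.6 and E-code E8760) might look like the sketch below; the claims-table layout and column names are my assumptions, not a Mini-Sentinel schema.

```python
# Sketch: flag claims records carrying either diagnosis code associated
# with transfusion-related ABO incompatibility, as candidates for chart review.
import pandas as pd

claims = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "dx_codes": [["285.1"], ["999.6"], ["E8760", "285.1"], ["998.11"]],  # toy data
})

ABO_CODES = {"999.6", "E8760"}  # ABO incompatibility reaction; mismatched blood

screened = claims[claims["dx_codes"].apply(lambda codes: bool(ABO_CODES & set(codes)))]
print(screened["patient_id"].tolist())   # candidates for chart review
```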
Manley, Ray; Satiani, Bhagwan
2009-11-01
With the widening gap between overhead expenses and reimbursement, management of the revenue cycle is a critical part of a successful vascular surgery practice. It is important to review the data on all the components of the revenue cycle: payer contracting, appointment scheduling, preregistration, registration process, coding and capturing charges, proper billing of patients and insurers, follow-up of accounts receivable, and finally using appropriate benchmarking. The industry benchmarks used should be those of peers in identical groups. Warning signs of poor performance are discussed enabling the practice to formulate a performance improvement plan.
Epigenetic Therapy in Lung Cancer – Role of microRNAs
Rothschild, Sacha I.
2013-01-01
Lung cancer is the leading cause of cancer deaths worldwide. microRNAs (miRNAs) are a class of small non-coding RNA species that have been implicated in the control of many fundamental cellular and physiological processes such as cellular differentiation, proliferation, apoptosis, and stem cell maintenance. Some miRNAs have been categorized as “oncomiRs” as opposed to “tumor suppressor miRs.” This review focuses on the role of miRNAs in the lung cancer carcinogenesis and their potential as diagnostic, prognostic, or predictive markers. PMID:23802096
Automatic Processing of Reactive Polymers
NASA Technical Reports Server (NTRS)
Roylance, D.
1985-01-01
A series of process-modeling computer codes was examined. The codes use finite element techniques to determine the time-dependent process parameters operative during nonisothermal reactive flows such as those that occur in reaction injection molding or composites fabrication. The use of these analytical codes to perform experimental control functions is examined; since the models can determine the state of all variables everywhere in the system, they can be used in a manner similar to currently available experimental probes. A small but well-instrumented reaction vessel in which fiber-reinforced plaques are cured under computer control and data acquisition was used. The finite element codes were also extended to treat this particular process.
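As a hint of the state such process models track, the sketch below is a deliberately simplified, zero-dimensional stand-in for the finite element codes discussed: it integrates an assumed nth-order Arrhenius cure-kinetics law for a single material point under a prescribed temperature ramp. Every parameter value is a placeholder, not measured kinetics.

```python
# Sketch: nth-order cure kinetics, d(alpha)/dt = A*exp(-E/(R*T))*(1-alpha)^n,
# for one material point under a linear temperature ramp.
import numpy as np

A, E, R, n = 1.0e5, 60e3, 8.314, 1.5      # assumed kinetic parameters (SI units)
T0, ramp = 300.0, 2.0                      # start at 300 K, ramp 2 K/min

dt, t_end = 1.0, 3600.0                    # 1 s steps for one hour
alpha, t = 0.0, 0.0
while t < t_end:
    T = T0 + ramp * t / 60.0               # prescribed temperature (K)
    rate = A * np.exp(-E / (R * T)) * (1.0 - alpha) ** n
    alpha = min(alpha + dt * rate, 1.0)    # explicit Euler, clipped at full cure
    t += dt
print(f"degree of cure after 1 h: {alpha:.3f}")
```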
Decision Making and the IACUC: Part 1—Protocol Information Discussed at Full-Committee Reviews
Silverman, Jerald; Lidz, Charles W; Clayfield, Jonathan C; Murray, Alexandra; Simon, Lorna J; Rondeau, Richard G
2015-01-01
IACUC protocols can be reviewed by either the full committee or designated members. Both review methods use the principles of the 3 Rs (reduce, refine, replace) as the overarching paradigm, with federal regulations and policies providing more detailed guidance. The primary goal of this study was to determine the frequency of topics discussed by IACUCs during full-committee reviews and whether the topics included those required for consideration by the IACUC (for example, pain and distress, number of animals used, availability of alternatives, skill and experience of researchers). We recorded and transcribed the discussions of 87 protocols undergoing full-committee review at 10 academic institutions. Each transcript was coded to capture the key concepts of the discussion and analyzed for the frequency of the codes mentioned. Pain and distress was the code mentioned most often, followed by the specific procedures performed, the study design, and the completeness of the protocol form. Infrequently mentioned topics were alternatives to animal use or to painful or distressful procedures, the importance of the research, and preliminary data. Not all of the topics required to be considered by the IACUC were openly discussed for all protocols, and many of the discussions were limited in their depth. PMID:26224439
Nuclear shell model code CRUNCHER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Resler, D.A.; Grimes, S.M.
1988-05-01
A new nuclear shell model code CRUNCHER, patterned after the code VLADIMIR, has been developed. While CRUNCHER and VLADIMIR employ the techniques of an uncoupled basis and the Lanczos process, improvements in the new code allow it to handle much larger problems than the previous code and to perform them more efficiently. Tests involving a moderately sized calculation indicate that CRUNCHER running on a SUN 3/260 workstation requires approximately one-half the central processing unit (CPU) time required by VLADIMIR running on a CRAY-1 supercomputer.
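The Lanczos process mentioned above can be illustrated compactly. The sketch below is not the CRUNCHER source: it tridiagonalizes a random symmetric matrix, standing in for a shell-model Hamiltonian in an uncoupled basis, and reads off the lowest Ritz value, the quantity of interest in ground-state calculations.

```python
# Sketch: plain Lanczos iteration (no reorthogonalization) for the
# lowest eigenvalue of a large symmetric matrix.
import numpy as np

def lanczos_ground_state(H, m=80):
    """Return the lowest Ritz value of symmetric H after m Lanczos steps."""
    n = H.shape[0]
    v = np.random.default_rng(0).standard_normal(n)
    v /= np.linalg.norm(v)                 # normalized starting vector
    v_prev = np.zeros(n)
    alphas, betas = [], []
    beta = 0.0
    for _ in range(m):
        w = H @ v - beta * v_prev          # three-term recurrence
        alpha = v @ w
        w -= alpha * v
        beta = np.linalg.norm(w)
        alphas.append(alpha)
        betas.append(beta)
        if beta < 1e-12:                   # invariant subspace reached
            break
        v_prev, v = v, w / beta
    T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
    return np.linalg.eigvalsh(T)[0]        # lowest Ritz value

H = np.random.default_rng(1).standard_normal((500, 500))
H = (H + H.T) / 2                          # symmetric stand-in "Hamiltonian"
print(lanczos_ground_state(H))             # approximates np.linalg.eigvalsh(H)[0]
```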
Continuation of research into language concepts for the mission support environment: Source code
NASA Technical Reports Server (NTRS)
Barton, Timothy J.; Ratner, Jeremiah M.
1991-01-01
Research into language concepts for the Mission Control Center is presented. The source code is provided as a file containing the routines that allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory, and it places as much of the code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farren Hunt
Idaho National Laboratory (INL) performed an Annual Effectiveness Review of the Integrated Safety Management System (ISMS), per 48 Code of Federal Regulations (CFR) 970.5223-1, "Integration of Environment, Safety and Health into Work Planning and Execution." The annual review assessed Integrated Safety Management (ISM) effectiveness, provided feedback to maintain system integrity, and identified target areas for focused improvements and assessments for fiscal year (FY) 2013. Results of the FY 2012 annual effectiveness review demonstrated that the INL's ISMS program was significantly strengthened. Actions implemented by the INL demonstrate that the overall Integrated Safety Management System is sound and ensures safe and successful performance of work while protecting workers, the public, and the environment. This report also provides several opportunities for improvement that will help further strengthen the ISM Program and the pursuit of safety excellence. Demonstrated leadership and commitment, continued surveillance, and dedicated resources have been instrumental in maturing a sound ISMS program. Based upon interviews with personnel, reviews of assurance activities, and analysis of ISMS process implementation, this effectiveness review concludes that ISM is institutionalized and is "Effective".